Science.gov

Sample records for model-based event detection

  1. A Topic Modeling Based Representation to Detect Tweet Locations: Example of the Event "Je Suis Charlie"

    NASA Astrophysics Data System (ADS)

    Morchid, M.; Josselin, D.; Portilla, Y.; Dufour, R.; Altman, E.; Linarès, G.

    2015-09-01

    Social networks have become a major actor in information propagation. Using the popular Twitter platform, mobile users post or relay messages from different locations. The tweet content, meaning and location show how an event, such as the bursty "JeSuisCharlie" event that happened in France in January 2015, is comprehended in different countries. This research aims at clustering the tweets according to the co-occurrence of their terms, including the country, and forecasting the probable country of a non-located tweet, knowing its content. First, we present the process of collecting a large quantity of data from the Twitter website. We finally have a set of 2,189 located tweets about "Charlie", from the 7th to the 14th of January. We describe an original method adapted from the Author-Topic (AT) model based on the Latent Dirichlet Allocation (LDA) method. We define a homogeneous space containing both lexical content (words) and spatial information (country). During a training process on part of the sample, we provide a set of clusters (topics) based on statistical relations between lexical and spatial terms. During a clustering task, we evaluate the method's effectiveness on the rest of the sample, reaching up to 95% correct country assignment. This shows that our model is pertinent for forecasting tweet location after a learning process.
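
    As a toy illustration of the underlying idea (not the authors' Author-Topic/LDA model), one can treat the country as just another term and score a non-located tweet against per-country word statistics learned from located tweets; all data below is invented:

```python
# Toy sketch: naive-Bayes-style country prediction from tweet words.
# Illustrative data only; NOT the paper's Author-Topic model.
from collections import Counter, defaultdict
import math

located = [
    ("charlie hebdo paris attentat", "FR"),
    ("je suis charlie solidarite paris", "FR"),
    ("charlie hebdo attack free speech", "US"),
    ("cartoonists killed free free press", "US"),
]

word_counts = defaultdict(Counter)   # country -> word frequencies
for text, country in located:
    word_counts[country].update(text.split())

def predict_country(text):
    """Score log P(words | country) with add-one smoothing; return best."""
    best, best_score = None, -math.inf
    for country, counts in word_counts.items():
        total, vocab = sum(counts.values()), len(counts)
        score = sum(math.log((counts[w] + 1) / (total + vocab))
                    for w in text.split())
        if score > best_score:
            best, best_score = country, score
    return best

print(predict_country("je suis charlie paris"))   # -> FR on this toy data
```

    On this tiny corpus the located tweets play the role of the training part of the sample, and the unlabeled tweet is assigned the country whose word statistics it best matches.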

  2. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequences via quantification method IV, and collect similar audit event sequences into the same groups based on cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  3. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
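
    A minimal sketch of the probability-based scoring idea, with details assumed rather than taken from the patent: events are scored by their negative log-likelihood under a density fitted to history, and a user-set threshold regulates the alert rate:

```python
# Sketch: anomaly scoring as negative log-likelihood under a fitted Gaussian.
# Assumed details, not the patented system; history values are invented.
import math
import statistics

history = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.4, 10.0]
mu = statistics.mean(history)
sigma = statistics.stdev(history)

def anomaly_score(x):
    """Negative log-likelihood under N(mu, sigma^2): rare events score high."""
    z = (x - mu) / sigma
    return 0.5 * z * z + math.log(sigma * math.sqrt(2 * math.pi))

THRESHOLD = 8.0   # raise for fewer alerts, lower for more ("regulatability")
for event in [10.05, 14.0]:
    print(event, anomaly_score(event) > THRESHOLD)
```

    Swapping the Gaussian for an empirical histogram covers the irregular-distribution case mentioned in the abstract.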

  4. Detection of solar events

    DOEpatents

    Fischbach, Ephraim; Jenkins, Jere

    2013-08-27

    A flux detection apparatus can include a radioactive sample having a decay rate capable of changing in response to interaction with a first particle or a field, and a detector associated with the radioactive sample. The detector is responsive to a second particle or radiation formed by decay of the radioactive sample. The rate of decay of the radioactive sample can be correlated to flux of the first particle or the field. Detection of the first particle or the field can provide an early warning for an impending solar event.

  5. Scintillation event energy measurement via a pulse model based iterative deconvolution method

    NASA Astrophysics Data System (ADS)

    Deng, Zhenzhou; Xie, Qingguo; Duan, Zhiwen; Xiao, Peng

    2013-11-01

    This work focuses on event energy measurement, a crucial task of scintillation detection systems. We modeled the scintillation detector as a linear system and treated the energy measurement as a deconvolution problem. We proposed a pulse model based iterative deconvolution (PMID) method, which can process pileup events without pileup detection and is adaptive to different signal pulse shapes. The proposed method was compared with a digital gated integrator (DGI) and digital delay-line clipping (DDLC) using real-world experimental data. For singles data, the energy resolution (ER) produced by PMID matched that of DGI. For pileups, the PMID method outperformed both DGI and DDLC in ER and counts recovery. These encouraging results suggest that the PMID method has great potential in applications like photon-counting systems and pulse height spectrometers, in which multiple-event pileups are common.
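
    The flavor of model-based pileup recovery can be sketched as follows (an illustrative iterative least-squares scheme, not the authors' PMID algorithm; the pulse shape, onsets and amplitudes are made up):

```python
# Sketch: recover amplitudes of two piled-up pulses with a known pulse
# template by alternately fitting each pulse and subtracting it from the
# residual. Illustrative only; not the paper's PMID method.
import math

def template(t, t0, tau=3.0):
    """One-sided exponential pulse shape starting at time t0."""
    return 0.0 if t < t0 else math.exp(-(t - t0) / tau)

T = range(0, 30)
true_amps, onsets = (5.0, 3.0), (2, 6)     # two overlapping events
signal = [true_amps[0] * template(t, onsets[0]) +
          true_amps[1] * template(t, onsets[1]) for t in T]

amps = [0.0, 0.0]
for _ in range(50):                        # iterative deconvolution loop
    for i, t0 in enumerate(onsets):
        other = [amps[1 - i] * template(t, onsets[1 - i]) for t in T]
        resid = [s - o for s, o in zip(signal, other)]
        shape = [template(t, t0) for t in T]
        # least-squares amplitude for this pulse against the residual
        amps[i] = (sum(r * h for r, h in zip(resid, shape)) /
                   sum(h * h for h in shape))

print([round(a, 3) for a in amps])   # converges to [5.0, 3.0]
```

    Because the signal lies exactly in the span of the two template shapes, the alternating fit converges to the true amplitudes; real data adds noise and unknown onsets.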

  6. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can be seamlessly embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
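
    The classical core this line of work builds on, Wald's sequential probability ratio test, can be sketched in a few lines (a Gaussian mean-shift example with assumed parameters):

```python
# Wald's sequential probability ratio test (SPRT): accumulate a per-sample
# log-likelihood ratio and stop as soon as it crosses a threshold set by
# the target error rates. Parameters here are assumed for illustration.
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    a = math.log(beta / (1 - alpha))        # lower (accept H0) threshold
    b = math.log((1 - beta) / alpha)        # upper (accept H1) threshold
    llr, n = 0.0, 0
    for n, x in enumerate(samples, 1):
        # Gaussian log-likelihood ratio contribution of one sample
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= b:
            return "H1", n
        if llr <= a:
            return "H0", n
    return "undecided", n

print(sprt([1.0] * 200))   # -> ('H1', 10): decides after only 10 samples
```

    A model-based sequential detector replaces the raw samples with innovations from a state-space (e.g. Kalman) processor, which is what lets the scheme handle non-stationary, physics-based problems.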

  7. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can be seamlessly embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  8. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  9. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of these are "true zeros", indicating that the drug-adverse event pair cannot occur; these are distinguished from the other zero counts, which simply indicate that the drug-adverse event pair has not occurred yet or has not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test (ZIP-LRT) method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, also called signals. The maximum likelihood estimates of the ZIP-LRT model parameters are obtained using the expectation-maximization algorithm. The ZIP-LRT is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed ZIP-LRT method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulation results show that the ZIP-LRT method performs similarly to the Poisson model based likelihood ratio test (LRT) method when the estimated percentage of true zeros in the database is small. Both the ZIP-LRT and LRT methods are applied to six selected drugs from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
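
    For intuition, the plain Poisson log-likelihood-ratio statistic that the zero-inflated version extends can be sketched as follows (a simplified one-sided form with hypothetical counts, not the paper's full test):

```python
# Sketch: one-sided Poisson log-likelihood-ratio statistic for a drug-event
# cell with observed count n and margin-based expected count E. Simplified
# for illustration; not the paper's zero-inflated test.
import math

def poisson_llr(n, E):
    """Log-LR of 'reporting rate above baseline' vs 'rate equals baseline'."""
    if n <= E:
        return 0.0            # one-sided: only elevated reporting can signal
    return n * math.log(n / E) - (n - E)

# hypothetical cell: 40 reports observed where the margins predict 12
print(round(poisson_llr(40, 12), 2))   # -> 20.16
```

    In practice the statistic is maximized over cells and its null distribution is obtained by simulation, which is how the LRT family controls the type I error.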

  10. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Given the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactor? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
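
    A minimal non-fuzzy sketch of the first design, a Luenberger observer used for sensor fault detection via its output residual (the scalar plant and gains are assumed for illustration, not taken from the paper):

```python
# Sketch: discrete-time Luenberger observer; the residual y - y_hat stays
# near zero until an additive sensor fault is injected. Plant and gain are
# assumed toy values, not from the paper.
A, B, C = 0.9, 0.1, 1.0        # scalar plant: x' = A x + B u, y = C x
L = 0.5                        # observer gain (A - L*C = 0.4, stable)

x, x_hat = 1.0, 0.0            # true state and observer estimate
residuals = []
for k in range(60):
    u = 1.0
    fault = 0.8 if k >= 40 else 0.0          # additive sensor fault at k=40
    y = C * x + fault
    y_hat = C * x_hat
    residuals.append(abs(y - y_hat))
    x = A * x + B * u
    x_hat = A * x_hat + B * u + L * (y - y_hat)

print(max(residuals[20:40]) < 0.1, residuals[40] > 0.5)   # -> True True
```

    The fuzzy versions in the paper blend several such local linear observers with Takagi-Sugeno membership weights; the residual-based fault logic is the same.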

  11. Model-based fault detection and diagnosis in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Carrasco, Rodrigo A.

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) observatory, with its 66 individual telescopes and other central equipment, generates a massive set of monitoring data every day, collecting information on the performance of a variety of critical and complex electrical, electronic and mechanical components. This data is crucial for most troubleshooting efforts performed by engineering teams. More than 5 years of accumulated data and expertise allow for a more systematic approach to fault detection and diagnosis. This paper presents model-based fault detection and diagnosis techniques to support corrective and predictive maintenance in a 24/7 minimum-downtime observatory.

  12. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to produce a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of that error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
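
    The persistence test described above can be sketched as follows (the thresholds are assumed, not taken from the patent):

```python
# Sketch: flag an anomaly only when the modeling-error significance stays
# above threshold for several consecutive samples. Thresholds are assumed.
def detect_anomaly(errors, sig_threshold=3.0, persistence_threshold=3):
    run = 0
    for k, e in enumerate(errors):
        run = run + 1 if abs(e) > sig_threshold else 0
        if run >= persistence_threshold:
            return k           # index where persistence is confirmed
    return None                # no sustained anomaly

errors = [0.2, 3.5, 0.1, 0.3, 3.6, 3.8, 4.1, 3.9]   # lone spike vs sustained
print(detect_anomaly(errors))   # -> 6 (third consecutive exceedance)
```

    The persistence requirement is what keeps the single transient spike at index 1 from raising a false structural alarm.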

  13. Acoustic Event Detection and Classification

    NASA Astrophysics Data System (ADS)

    Temko, Andrey; Nadeu, Climent; Macho, Dušan; Malkin, Robert; Zieger, Christian; Omologo, Maurizio

    The human activity that takes place in meeting rooms or classrooms is reflected in a rich variety of acoustic events (AE), produced either by the human body or by objects handled by humans, so the determination of both the identity of sounds and their position in time may help to detect and describe that human activity. Indeed, speech is usually the most informative sound, but other kinds of AEs may also carry useful information, for example, clapping or laughing inside a speech, a strong yawn in the middle of a lecture, a chair moving or a door slam when the meeting has just started. Additionally, detection and classification of sounds other than speech may be useful to enhance the robustness of speech technologies like automatic speech recognition.

  14. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and the unpredictable location of natural hazards, as well as the actual demands of hazards work, this article proposes an evaluation model for the remote sensing detection of natural hazards based on crowdsourcing. First, using the crowdsourcing model, and with the help of the Internet and the power of hundreds of millions of Internet users, this evaluation model provides visual interpretation of high-resolution remote sensing images of the hazard area and collects massive amounts of valuable disaster data. Second, the model adopts a dynamic-voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers. Third, it pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers. Lastly, it triggers the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, enables public participation and citizen science to finally be realized, and improves the accuracy and timeliness of hazard assessment results.

  15. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  16. Detectability of Discrete Event Systems with Dynamic Event Observation

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2009-01-01

    Our previous work considers detectability of discrete event systems, which is the problem of determining the current state and subsequent states of a system based on event observation. We assumed that event observation is static, that is, if an event is observable, then all its occurrences are observable. However, in practical systems such as sensor networks, event observation often needs to be dynamic, that is, occurrences of the same event may or may not be observable, depending on the state of the system. In this paper, we generalize static event observation into dynamic event observation and consider the detectability problem under dynamic event observation. We define four types of detectabilities. To check detectabilities, we construct the observer, which has exponential complexity. To reduce computational complexity, we can also construct a detector with polynomial complexity to check strong detectabilities. Dynamic event observation can be implemented in two possible ways: passive observation and active observation. For active observation, we discuss how to find minimal event observation policies that preserve the four types of detectabilities, respectively. PMID:20161618
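
    The observer construction mentioned above can be illustrated on a tiny hand-made automaton (not from the paper): the current-state estimate after an observed string is the subset of states consistent with it, and detection succeeds when that subset shrinks to a singleton:

```python
# Sketch: subset-of-states ("observer") estimate for a toy automaton.
# The automaton and event string are invented for illustration.
transitions = {                 # state -> {observable event -> next state}
    1: {"a": 2}, 2: {"a": 3, "b": 1}, 3: {"b": 2},
}

def observer_step(estimate, event):
    """Advance the subset-of-states estimate on one observed event."""
    return frozenset(transitions[s][event]
                     for s in estimate if event in transitions[s])

estimate = frozenset([1, 2, 3])          # initially: could be any state
for ev in ["a", "b", "b"]:
    estimate = observer_step(estimate, ev)
print(sorted(estimate))   # -> [1]: the current state is determined
```

    The full observer enumerates all reachable subsets, which is where the exponential complexity comes from; the polynomial detector avoids that enumeration.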

  17. Discrete event model-based simulation for train movement on a single-line railway

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ming; Li, Ke-Ping; Yang, Li-Xing

    2014-08-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement that takes the energy-saving factor into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains must be accelerated or braked frequently to control the headway distance, leading to more energy consumption.
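
    A generic discrete event simulation skeleton in the spirit of this approach (illustrative only; the paper's train dynamics and energy model are not reproduced):

```python
# Sketch: event-queue simulation where each event may schedule future
# events. Times and actions are invented toy values.
import heapq

events = []                     # priority queue keyed on event time
log = []

def schedule(t, train, action):
    heapq.heappush(events, (t, train, action))

# two trains departing a single line 10 minutes apart, 30-minute run time
for i, t0 in enumerate([0, 10]):
    schedule(t0, f"train{i}", "depart")

while events:
    t, train, action = heapq.heappop(events)
    log.append((t, train, action))
    if action == "depart":
        schedule(t + 30, train, "arrive")   # an event creates a future event
```

    Because the clock jumps straight from one event to the next instead of stepping through every time instant, a discrete event model typically needs far fewer iterations than a fixed-step simulation, which matches the CPU-time advantage the paper reports.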

  18. Event oriented dictionary learning for complex event detection.

    PubMed

    Yan, Yan; Yang, Yi; Meng, Deyu; Liu, Gaowen; Tong, Wei; Hauptmann, Alexander G; Sebe, Nicu

    2015-06-01

    Complex event detection is a retrieval task with the goal of finding videos of a particular event in a large-scale unconstrained Internet video archive, given example videos and text descriptions. Nowadays, different multimodal fusion schemes of low-level and high-level features are extensively investigated and evaluated for the complex event detection task. However, how to effectively select high-level, semantically meaningful concepts from a large pool to assist complex event detection is rarely studied in the literature. In this paper, we propose a novel strategy to automatically select semantically meaningful concepts for the event detection task, based on both the event-kit text descriptions and the concepts' high-level feature descriptions. Moreover, we introduce a novel event-oriented dictionary representation based on the selected semantic concepts. Toward this goal, we leverage training images (frames) of selected concepts from the semantic indexing dataset, with a pool of 346 concepts, in a novel supervised multitask lp-norm dictionary learning framework. Extensive experimental results on the TRECVID multimedia event detection dataset demonstrate the efficacy of our proposed method.

  19. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Conference on Artificial Intelligence. 1363-1368. Detroit, Michigan, August 89. Chu, Wei-Hai. "Generic Expert System Shell for Diagnostic Reasoning... Intelligence. 1324-1330. Detroit, Michigan, August 89. de Kleer, Johan and Brian C. Williams. "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97...Benjamin Kuipers. "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence. 1238

  20. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.

  1. Joint Attributes and Event Analysis for Multimedia Event Detection.

    PubMed

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  2. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-08-20

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches.

  3. A novel interacting multiple model based network intrusion detection scheme

    NASA Astrophysics Data System (ADS)

    Xin, Ruichi; Venkatasubramanian, Vijay; Leung, Henry

    2006-04-01

    In today's information age, information and network security are of primary importance to any organization. Network intrusion is a serious threat to the security of computers and data networks. In internet protocol (IP) based networks, intrusions originate in different kinds of packets/messages contained in the open system interconnection (OSI) layer 3 or higher layers. Network intrusion detection and prevention systems observe the layer 3 packets (or layer 4 to 7 messages) to screen for intrusions and security threats. Signature-based methods use a pre-existing database that documents intrusion patterns as perceived in the layer 3 to 7 protocol traffic, and match the incoming traffic against it for potential intrusion attacks. Alternately, network traffic data can be modeled and any large anomaly from the established traffic pattern can be detected as a network intrusion. The latter method, also known as anomaly-based detection, is gaining popularity for its versatility in learning new patterns and discovering new attacks. It is apparent that, for reliable performance, an accurate model of the network data needs to be established. In this paper, we illustrate, using collected data, that network traffic is seldom stationary. We propose the use of multiple models to accurately represent the traffic data. The improvement in reliability of the proposed model is verified by measuring the detection and false alarm rates on several datasets.

  4. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is key to many underwater computer vision tasks, such as object recognition, locating, and tracking. Considering the superb visual sensing ability of underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are more adaptive to underwater environments. However, the low accuracy rate and the absence of prior knowledge learning limit their adaptation in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing pattern of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several subblocks. The intensity information is extracted to establish a background model which can roughly identify the object and background regions. The texture feature of each pixel in the rough object region is further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives better performance. Compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results.

  5. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on an observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called a detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable to solving many practical problems. PMID:21691432

  6. Detecting unitary events without discretization of time.

    PubMed

    Grün, S; Diesmann, M; Grammont, F; Riehle, A; Aertsen, A

    1999-12-15

    In earlier studies we developed the 'Unitary Events' analysis (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.) to detect the presence of conspicuous spike coincidences in multiple single unit recordings and to evaluate their statistical significance. The method enabled us to study the relation between spike synchronization and behavioral events (Riehle A, Grün S, Diesmann M, Aertsen A. Spike synchronization and rate modulation differentially involved in motor cortical function. Science 1997;278:1950-1953.). There is recent experimental evidence that the timing accuracy of coincident spiking events, which might be relevant for higher brain function, may be in the range of 1-5 ms. To detect coincidences on that time scale, we sectioned the observation interval into short disjoint time slices ('bins'). Unitary Events analysis of this discretized process demonstrated that coincident events can indeed be reliably detected. However, the method loses sensitivity for higher temporal jitter of the events constituting the coincidences (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.). Here we present a new approach, the 'multiple shift' method (MS), which overcomes the need for binning and treats the data at their (original) high time resolution (typically 1 ms, or better). Technically, coincidences are detected by shifting the spike trains against each other over the range of allowed coincidence width and integrating the number of exact coincidences (at the time resolution of the data) over all shifts. We found that the new method enhances the sensitivity for coincidences with temporal jitter. Both methods are outlined and compared on the basis of their analytical description and their application on
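
    The multiple-shift idea can be sketched directly (invented spike times at 1 ms resolution, not the authors' data):

```python
# Sketch: count exact coincidences between two spike trains at every shift
# within the allowed coincidence width, as in the multiple-shift (MS) idea.
# Spike times are invented toy values at 1 ms resolution.
def ms_coincidences(train1, train2, max_shift):
    """Total exact coincidences over all shifts in [-max_shift, max_shift]."""
    s2 = set(train2)
    return sum(1
               for shift in range(-max_shift, max_shift + 1)
               for t in train1
               if t + shift in s2)

spikes_a = [5, 17, 42, 90]      # spike times (ms)
spikes_b = [6, 18, 70, 91]      # fires ~1 ms after train A on three events
print(ms_coincidences(spikes_a, spikes_b, max_shift=2))   # -> 3
```

    Because no binning is involved, a near-coincidence is never lost to a bin boundary; the statistical evaluation against the expected count under independence is the part the full method adds.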

  7. Detecting Extreme Events in Gridded Climate Data

    SciTech Connect

    Ramachandra, Bharathkumar; Gadiraju, Krishna; Vatsavai, Raju; Kaiser, Dale Patrick; Karnowski, Thomas Paul

    2016-01-01

    Detecting and tracking extreme events in gridded climatological data is a challenging problem on several fronts: algorithms, scalability, and I/O. Successful detection of these events will give climate scientists an alternate view of the behavior of different climatological variables, leading to enhanced scientific understanding of the impacts of events such as heat and cold waves, and on a larger scale, the El Niño Southern Oscillation. Recent advances in computing power and research in data sciences enabled us to look at this problem with a different perspective from what was previously possible. In this paper we present our computationally efficient algorithms for anomalous cluster detection on climate change big data. We provide results on detection and tracking of surface temperature and geopotential height anomalies, a trend analysis, and a study of relationships between the variables. We also identify the limitations of our approaches, future directions for research and alternate approaches.
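
    As a minimal illustration of the anomalous-cluster-detection step, one can threshold a 2-D anomaly grid and group the exceedances into 4-connected clusters. The threshold rule and connectivity choice are illustrative assumptions, not the paper's algorithm:

```python
def anomalous_clusters(grid, threshold):
    """Label 4-connected clusters of grid cells whose anomaly value exceeds
    `threshold`. Returns a list of clusters, each a list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and (r, c) not in seen:
                # Flood-fill one cluster starting from this exceedance.
                stack, cluster = [(r, c)], []
                seen.add((r, c))
                while stack:
                    i, j = stack.pop()
                    cluster.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] > threshold
                                and (ni, nj) not in seen):
                            seen.add((ni, nj))
                            stack.append((ni, nj))
                clusters.append(cluster)
    return clusters
```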

  8. Certification Aspects in Critical Embedded Software Development with Model Based Techniques: Detection of Unintended Functions

    NASA Astrophysics Data System (ADS)

    Atencia Yepez, A.; Autrán Cerqueira, J.; Urueña, S.; Jurado, R.

    2012-01-01

    This paper, developed under contract with the European Aviation Safety Agency (EASA), analyses in detail the certification implications that the application of model-level verification and validation techniques may have in the aeronautic industry. In particular, this paper focuses on the problem of detecting unintended functions by applying Model Coverage Criteria at model level. This point is significantly important for the future extensive use of Model Based approaches in safety critical software, since the uncertainty in the system performance introduced by the unintended functions, which may also lead to unacceptable hazardous or catastrophic events, prevents the system from being compliant with certification requirements. The paper provides a definition and a categorization of unintended functions and gives some relevant examples to assess the efficiency of model-coverage techniques in the detection of UF. The paper explains how this analysis is supported by a methodology based on the study of sources for introducing unintended functions. Finally, the feasibility of using model-level verification techniques to support the software certification process is analysed.

  9. Detection of goal events in soccer videos

    NASA Astrophysics Data System (ADS)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2004-12-01

    In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) event candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
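
    The HMM scoring behind the event-candidate step can be sketched with the forward algorithm. A discrete observation alphabet is a simplifying assumption here; the paper scores continuous MFCC/ASP feature vectors:

```python
import math

def hmm_loglik(obs, start, trans, emit):
    """Forward-algorithm log-likelihood of a discrete observation sequence
    under an HMM with initial probabilities `start`, transition matrix
    `trans`, and emission matrix `emit` (rows = states)."""
    n = len(start)
    # Initialize with the first observation.
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        # Standard forward recursion: sum over predecessor states.
        alpha = [emit[s][o] * sum(alpha[r] * trans[r][s] for r in range(n))
                 for s in range(n)]
    return math.log(sum(alpha))
```

    A candidate detector would compare such log-likelihoods across event-class models (e.g., "goal" vs. "background") and pick the best-scoring class.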

  11. A Bayesian Hidden Markov Model-based approach for anomaly detection in electronic systems

    NASA Astrophysics Data System (ADS)

    Dorj, E.; Chen, C.; Pecht, M.

    Early detection of anomalies in any system or component prevents impending failures and enhances performance and availability. The complex architecture of electronics, the interdependency of component functionalities, and the miniaturization of most electronic systems make it difficult to detect and analyze anomalous behaviors. A Hidden Markov Model-based classification technique determines unobservable hidden behaviors of complex and remotely inaccessible electronic systems using observable signals. This paper presents a data-driven approach for anomaly detection in electronic systems based on a Bayesian Hidden Markov Model classification technique. The posterior parameters of the Hidden Markov Models are estimated using the conjugate prior method. An application of the developed Bayesian Hidden Markov Model-based anomaly detection approach is presented for detecting anomalous behavior in Insulated Gate Bipolar Transistors using experimental data. The detection results illustrate that the developed anomaly detection approach can help detect anomalous behaviors in electronic systems, which can help prevent system downtime and catastrophic failures.

  12. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  13. Model-based approaches to deal with detectability: a comment on Hutto (2016)

    USGS Publications Warehouse

    Marques, Tiago A.; Thomas, Len; Kéry, Marc; Buckland, Steve T.; Borchers, David L.; Rexstad, Eric; Fewster, Rachel M.; MacKenzie, Darryl I.; Royle, Andy; Guillera-Arroita, Gurutzeta; Handel, Colleen M.; Pavlacky, David C.; Camp, Richard J.

    2017-01-01

    In a recent paper, Hutto (2016a) challenges the need to account for detectability when interpreting data from point counts. A number of issues with model-based approaches to deal with detectability are presented, and an alternative suggested: surveying an area around each point over which detectability is assumed certain. The article contains a number of false claims and errors of logic, and we address these here. We provide suggestions about appropriate uses of distance sampling and occupancy modeling, arising from an intersection of design- and model-based inference.

  14. On event-based optical flow detection

    PubMed Central

    Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko

    2015-01-01

    Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter-based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
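
    A toy reduction of the plane-fitting idea to one spatial dimension: fit t = a*x + c to (x, t) event pairs by least squares, so the edge speed is dx/dt = 1/a. This is a simplified sketch, not one of the detectors analyzed in the paper:

```python
def event_speed_1d(events):
    """Estimate 1-D edge speed from (x, t) event pairs by fitting the line
    t = a*x + c with ordinary least squares; speed is 1/a."""
    n = len(events)
    sx = sum(x for x, _ in events)
    st = sum(t for _, t in events)
    sxx = sum(x * x for x, _ in events)
    sxt = sum(x * t for x, t in events)
    # Closed-form least-squares slope of t against x.
    a = (n * sxt - sx * st) / (n * sxx - sx * sx)
    return 1.0 / a
```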

  15. Subnoise detection of a fast random event.

    PubMed

    Ataie, V; Esman, D; Kuo, B P-P; Alic, N; Radic, S

    2015-12-11

    Observation of random, nonrepetitive phenomena is of critical importance in astronomy, spectroscopy, biology, and remote sensing. Heralded by weak signals, hidden in noise, they pose basic detection challenges. In contrast to repetitive waveforms, a single-instance signal cannot be separated from noise through averaging. Here, we show that a fast, randomly occurring event can be detected and extracted from a noisy background without conventional averaging. An isolated 80-picosecond pulse was received with confidence level exceeding 99%, even when accompanied by noise. Our detector relies on instantaneous spectral cloning and a single-step, coherent field processor. The ability to extract fast, subnoise events is expected to increase detection sensitivity in multiple disciplines. Additionally, the new spectral-cloning receiver can potentially intercept communication signals that are presently considered secure. Copyright © 2015, American Association for the Advancement of Science.

  16. Detection and recognition of indoor smoking events

    NASA Astrophysics Data System (ADS)

    Bien, Tse-Lun; Lin, Chang Hong

    2013-03-01

    Smoking in public indoor spaces has become prohibited in many countries, since it not only affects the health of the people nearby but also increases the risk of fire outbreaks. This paper proposes a novel scheme to automatically detect and recognize smoking events by using existing surveillance cameras. The main idea of the proposed method is to detect human smoking events by recognizing their actions. In this scheme, human pose estimation is introduced to analyze human actions from their poses. The human pose estimation method segments the head and both hands from the body by using a skin color detection method. However, skin color methods may fail under insufficient lighting conditions; therefore, lighting compensation is applied to make the skin color detection more accurate. Because body parts may be covered by shadows, which can cause the pose estimation to fail, a Kalman filter is applied to track the missed body parts. After that, we evaluate probability features of the hands approaching the head. A support vector machine (SVM) is applied to learn and recognize smoking events from the probability features. To analyze the performance of the proposed method, datasets recorded from surveillance camera views in indoor environments are tested. The experimental results show the effectiveness of the proposed method, with an accuracy rate of 83.33%.

  17. A methodology for detecting routing events in discrete flow networks.

    SciTech Connect

    Garcia, H. E.; Yoo, T.; Nuclear Technology

    2004-01-01

    A theoretical framework for formulating and implementing model-based monitoring of discrete flow networks is discussed. Possible flows of items are described as the sequence of discrete-event (DE) traces. Each trace defines the DE sequence(s) that are triggered when an entity follows a given flow-path and visits tracking locations distributed within the monitored system. Given the set of possible discrete flows, a possible-behavior model - an interacting set of automata - is constructed, where each automaton models the discrete flow of items at each tracking location. Event labels or symbols contain all the information required to unambiguously distinguish each discrete flow. Within the possible behavior, there is a special sub-behavior whose occurrence is required to be detected. The special behavior may be specified by the occurrence of routing events, such as faults. These intermittent or non-persistent events may occur repeatedly. An observation mask is then defined, characterizing the actual observation configuration available for collecting item tracking data. The analysis task is then to determine whether this observation configuration is capable of detecting the identified special behavior. The assessment is accomplished by evaluating several observability notions, such as detectability and diagnosability. If the corresponding property is satisfied, associated formal observers are constructed to perform the monitoring task at hand. The synthesis of an optimal observation mask may also be conducted to suggest an appropriate observation configuration guaranteeing the detection of the special events and to construct associated monitoring agents. The proposed framework, modeling methodology, and supporting techniques for discrete flow networks monitoring are presented and illustrated with an example.

  18. Model Based Analysis of Clonal Developments Allows for Early Detection of Monoclonal Conversion and Leukemia

    PubMed Central

    Thielecke, Lars; Glauche, Ingmar

    2016-01-01

    The availability of several methods to unambiguously mark individual cells has strongly fostered the understanding of clonal developments in hematopoiesis and other stem cell driven regenerative tissues. While cellular barcoding is the method of choice for experimental studies, patients that underwent gene therapy carry a unique insertional mark within the transplanted cells originating from the integration of the retroviral vector. Close monitoring of such patients allows accessing their clonal dynamics; however, the early detection of events that predict monoclonal conversion and potentially the onset of leukemia is beneficial for treatment. We developed a simple mathematical model of a self-stabilizing hematopoietic stem cell population to generate a wide range of possible clonal developments, reproducing typical, experimentally and clinically observed scenarios. We use the resulting model scenarios to suggest and test a set of statistical measures that should allow for an interpretation and classification of relevant clonal dynamics. Apart from the assessment of several established diversity indices, we suggest a measure that quantifies the extent to which the increase in the size of one clone is attributed to the total loss in the size of all other clones. By evaluating the change in relative clone sizes between consecutive measurements, the suggested measure, referred to as maximum relative clonal expansion (mRCE), proves to be highly sensitive in the detection of rapidly expanding cell clones prior to their dominant manifestation. This predictive potential places the mRCE as a suitable means for the early recognition of leukemogenesis, especially in gene therapy patients that are closely monitored. Our model based approach illustrates how simulation studies can actively support the design and evaluation of preclinical strategies for the analysis and risk evaluation of clonal developments. PMID:27764218
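
    One possible reading of the mRCE measure, hedged as an interpretation of the description above (the paper gives the exact definition): the largest gain in relative clone size between two consecutive measurements.

```python
def mrce(prev_counts, curr_counts):
    """Maximum relative clonal expansion (interpretive sketch): the largest
    increase in a clone's relative size (count / total) between two
    consecutive measurements of the same set of clones."""
    prev_total = sum(prev_counts)
    curr_total = sum(curr_counts)
    gains = [c / curr_total - p / prev_total
             for p, c in zip(prev_counts, curr_counts)]
    return max(gains)
```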

  19. Phase-Space Detection of Cyber Events

    SciTech Connect

    Hernandez Jimenez, Jarilyn M; Ferber, Aaron E; Prowell, Stacy J; Hively, Lee M

    2015-01-01

    Energy Delivery Systems (EDS) are a network of processes that produce, transfer and distribute energy. EDS are increasingly dependent on networked computing assets, as are many Industrial Control Systems. Consequently, cyber-attacks pose a real and pertinent threat, as evidenced by Stuxnet, Shamoon and Dragonfly. Hence, there is a critical need for novel methods to detect, prevent, and mitigate effects of such attacks. To detect cyber-attacks in EDS, we developed a framework for gathering and analyzing timing data that involves establishing a baseline execution profile and then capturing the effect of perturbations in the state from injecting various malware. The data analysis was based on nonlinear dynamics and graph theory to improve detection of anomalous events in cyber applications. The goal was the extraction of changing dynamics or anomalous activity in the underlying computer system. Takens' theorem in nonlinear dynamics allows reconstruction of topologically invariant, time-delay-embedding states from the computer data in a sufficiently high-dimensional space. The resultant dynamical states were nodes, and the state-to-state transitions were links in a mathematical graph. Alternatively, sequential tabulation of executing instructions provides the nodes with corresponding instruction-to-instruction links. Graph theorems guarantee graph-invariant measures to quantify the dynamical changes in the running applications. Results showed a successful detection of cyber events.
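
    The state-graph construction can be sketched as follows. In practice the raw timing measurements would first be quantized into symbols, which this toy version omits:

```python
def delay_embedding_graph(series, dim, delay):
    """Build a graph from a time series: nodes are time-delay-embedding
    states (Takens-style tuples of lagged samples), links are the observed
    state-to-state transitions. Graph-invariant measures on the result can
    then quantify dynamical change between runs."""
    states = [tuple(series[i + k * delay] for k in range(dim))
              for i in range(len(series) - (dim - 1) * delay)]
    nodes = set(states)
    links = set(zip(states, states[1:]))
    return nodes, links
```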

  20. Model-based detection of synthetic bat echolocation calls using an energy threshold detector for initialization.

    PubMed

    Skowronski, Mark D; Fenton, M Brock

    2008-05-01

    Detection of echolocation calls is fundamental to quantitative analysis of bat acoustic signals. Automated methods of detection reduce the subjectivity of hand labeling of calls and speed up the detection process in an accurate and repeatable manner. A model-based detector was initialized using a baseline energy threshold detector, removing the need for hand labels to train the model, and shown to be superior to the baseline detector using synthetic calls in two experiments: (1) an artificial environment and (2) a field playback setting. Synthetic calls using a piecewise exponential frequency modulation function from five hypothetical species were employed to control the signal-to-noise ratio (SNR) in each experiment and to provide an absolute ground truth to judge detector performance. The model-based detector outperformed the baseline detector by 2.5 dB SNR in the artificial environment and 1.5 dB SNR in the field playback setting. Atmospheric absorption was measured for the synthetic calls, and the 1.5 dB improvement increased the effective detection radius by between 1 and 7 m, depending on species. The results demonstrate that hand labels are not necessary for training detection models and that model-based detectors significantly increase the range of detection for a recording system.
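
    The baseline energy threshold detector used for initialization can be sketched as follows; the frame length and threshold values are illustrative assumptions:

```python
def energy_threshold_detect(signal, frame_len, threshold):
    """Flag the start index of every non-overlapping frame whose mean
    squared amplitude exceeds `threshold` -- the kind of baseline energy
    detector used above to bootstrap a model-based detector."""
    hits = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        if energy > threshold:
            hits.append(start)
    return hits
```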

  1. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    PubMed

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal processing based FD and model-based FD. The former develops algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed the signal processing based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulics simulation models can outperform the signal processing based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
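
    A minimal sketch of the model-based, multi-site idea: standardize each station's residual against the simulation's expected value, then combine the residuals jointly across all stations. The per-station noise level and the chi-square-style combination rule are illustrative assumptions, not the paper's EDS:

```python
def multi_site_residual_alarm(observed, simulated, sigma, z_thresh):
    """Model-based check across stations: compare each station's reading
    with the simulation, standardize by an assumed noise level `sigma`,
    and raise an alarm when the mean squared residual exceeds z_thresh**2.
    Returns (alarm, joint_score)."""
    residuals = [(o - s) / sigma for o, s in zip(observed, simulated)]
    joint_score = sum(r * r for r in residuals) / len(residuals)
    return joint_score > z_thresh ** 2, joint_score
```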

  2. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in the ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or patch the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach for an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairing or changes to the ARP. A DES model is built for the LAN under both a normal and a compromised (i.e., spoofed request/response) situation based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic in the LAN. Following that, a DES detector is built to determine from observed ARP-related events whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
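
    A toy illustration of the verdict logic after active probing (the paper's DES models and detector are far richer): a single consistent reply for an IP marks the pair genuine, while conflicting replies mark spoofing. The (ip, mac) event format is an assumption for this sketch:

```python
def arp_des_verdict(events):
    """Classify each probed IP from a stream of (ip, mac) ARP reply events:
    one distinct MAC -> 'genuine'; conflicting MACs for the same IP ->
    'spoofed' (more than one host answered the probe)."""
    replies = {}
    for ip, mac in events:
        replies.setdefault(ip, set()).add(mac)
    return {ip: ("spoofed" if len(macs) > 1 else "genuine")
            for ip, macs in replies.items()}
```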

  3. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C) which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2 degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two hour segment from October 1993. The output at the correct gridpoint was at least 33% larger than at adjacent grid points, and the output at the correct gridpoint at the correct origin time was more than 500% larger than the output at the same gridpoint immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
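
    The matrix formulation can be sketched directly: form C = M × D once, then evaluate each grid point as a sum over its own path through C. The matrices below are small illustrative stand-ins for real Master Image and data matrices:

```python
def wceds_detector(master_image, data):
    """Form the correlation matrix C as the full product of a Master Image
    matrix M and a data matrix D, and return it together with a function
    that sums C over a grid point's summation path (list of (i, j) entries)."""
    rows, inner, cols = len(master_image), len(data), len(data[0])
    C = [[sum(master_image[i][k] * data[k][j] for k in range(inner))
          for j in range(cols)] for i in range(rows)]

    def detector_value(path):
        # Each grid point follows its own path of correlations through C.
        return sum(C[i][j] for i, j in path)

    return C, detector_value
```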

  4. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding complicating the transport physics significantly. The development of a model-based sequential Bayesian processor that captures the underlying transport physics, including scattering, offers a physics-based approach to attack this challenging problem. It is shown that this processor can be used to develop an effective detection technique.

  5. Model-Based Detection of Radioactive Contraband for Harbor Defense Incorporating Compton Scattering Physics

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-03-02

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Photon emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding complicating the transport physics significantly. This problem becomes especially important when ships are intercepted by U.S. Coast Guard harbor patrols searching for contraband. The development of a sequential model-based processor that captures both the underlying transport physics of gamma-ray emissions, including Compton scattering, and the measurement of photon energies offers a physics-based approach to attack this challenging problem. The inclusion of a basic radionuclide representation of absorbed/scattered photons at a given energy, along with interarrival times, is used to extract the physics information available from the noisy measurements of the portable radiation detection systems used to interdict contraband. It is shown that this physics representation can incorporate scattering physics, leading to an 'extended' model-based structure that can be used to develop an effective sequential detection technique. The resulting model-based processor is shown to perform quite well based on data obtained from a controlled experiment.
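
    One hedged sketch of sequential detection on photon interarrival times: under Poisson counting, interarrivals are exponential, so a running log-likelihood ratio of "source present" versus "background only" can be thresholded after each photon. The actual processor above also models photon energies and Compton scattering, which this sketch omits; rates and threshold are illustrative:

```python
import math

def sequential_llr_detect(interarrivals, bg_rate, src_rate, threshold):
    """Sequential log-likelihood-ratio test on exponential interarrival
    times: accumulate log f(dt; src_rate) - log f(dt; bg_rate) per photon
    and declare detection once the sum crosses `threshold`.
    Returns (detected, index_of_decision_or_end)."""
    llr = 0.0
    for i, dt in enumerate(interarrivals):
        # log[src_rate * exp(-src_rate*dt)] - log[bg_rate * exp(-bg_rate*dt)]
        llr += math.log(src_rate / bg_rate) - (src_rate - bg_rate) * dt
        if llr >= threshold:
            return True, i
    return False, len(interarrivals)
```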

  6. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  8. Prediction models for early risk detection of cardiovascular event.

    PubMed

    Purwanto; Eswaran, Chikkannan; Logeswaran, Rajasvaran; Abdul Rahman, Abdul Rashid

    2012-04-01

    Cardiovascular disease (CVD) is the major cause of death globally. More people die of CVDs each year than from any other disease. Over 80% of CVD deaths occur in low- and middle-income countries and occur almost equally in males and females. In this paper, different computational models based on Bayesian Networks, Multilayer Perceptron, Radial Basis Function and Logistic Regression methods are presented for early risk detection of cardiovascular events. A total of 929 (626 male and 303 female) heart attack data records are used to construct the models. The models are tested using combined as well as separate male and female data. Among the models used, it is found that the Multilayer Perceptron model yields the best accuracy result.
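
    A minimal logistic-regression scorer of the kind compared in the paper. The feature order, weights, and bias below are illustrative assumptions, not the model fitted to the 929-record dataset:

```python
import math

def cvd_risk(features, weights, bias):
    """Logistic-regression risk score: sigmoid of the weighted feature sum.
    Returns a probability-like value in (0, 1); a real model would learn
    `weights` and `bias` from training data."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```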

  9. Implementation of a model based fault detection and diagnosis technique for actuation faults of the SSME

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1991-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the Space Shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the Space Shuttle Main Engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  10. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    NASA Technical Reports Server (NTRS)

    Walker, M.; Figueroa, F.

    2015-01-01

The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. An associated loss of pressure control also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere, involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.
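
A minimal sketch of the correlation idea: with an isolation valve commanded closed, the downstream pressure should hold steady, so a persistent negative pressure trend across the valve is flagged as suspected leak-through. The threshold, sample traces, and units are illustrative assumptions, not values from the KSC testbed.

```python
import numpy as np

def leak_through_suspected(pressure, dt=1.0, slope_threshold=-0.05):
    """Flag leak-through if the fitted pressure slope falls below the threshold."""
    t = np.arange(len(pressure)) * dt
    slope = np.polyfit(t, pressure, 1)[0]  # linear trend of the pressure trace
    return bool(slope < slope_threshold)

rng = np.random.default_rng(1)
sealed = 100.0 + 0.01 * rng.normal(size=60)                         # healthy valve
leaking = 100.0 - 0.2 * np.arange(60) + 0.01 * rng.normal(size=60)  # pressure decay
```

Fault isolation to a specific valve would repeat this test per isolation valve, using the domain model to decide which pressure sensors bracket it.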

  11. Model-based fault detection of blade pitch system in floating wind turbines

    NASA Astrophysics Data System (ADS)

    Cho, S.; Gao, Z.; Moan, T.

    2016-09-01

This paper presents a model-based scheme for fault detection of a blade pitch system in floating wind turbines. The blade pitch system is one of the most critical components due to its effect on the operational safety and the dynamics of wind turbines. Faults in this system should be detected at an early stage to prevent failures. To detect faults of blade pitch actuators and sensors, an appropriate observer should be designed to estimate the states of the system. Residuals are generated by a Kalman filter, and a threshold based on H∞ optimization and linear matrix inequalities (LMIs) is used for residual evaluation. The proposed method is demonstrated in a case study involving bias and fixed-output faults in pitch sensors and stuck faults in pitch actuators. The simulation results show that the proposed method detects different realistic fault scenarios of wind turbines under stochastic external winds.
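
The residual-generation step can be sketched with a scalar Kalman filter tracking a constant pitch angle through a noisy sensor; a bias fault injected halfway produces an innovation that exceeds a simple threshold. The paper's H∞/LMI threshold design is replaced here by a fixed multiple of the sensor noise, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, r = 200, 1e-5, 0.01                    # steps, process var, sensor noise var
meas = 5.0 + rng.normal(0.0, np.sqrt(r), n)  # noisy pitch-angle measurements (deg)
meas[100:] += 1.0                            # bias fault injected in the sensor

x, P = 0.0, 1.0                              # state estimate and covariance
residuals = []
for z in meas:
    P += q                                   # predict (random-walk model)
    residuals.append(abs(z - x))             # innovation magnitude (residual)
    K = P / (P + r)                          # Kalman gain
    x += K * (z - x)                         # update
    P *= 1.0 - K

residuals = np.array(residuals)
threshold = 5.0 * np.sqrt(r)                 # crude stand-in for the H-inf threshold
```

Residual evaluation in the paper would additionally distinguish sensor from actuator faults via the observer structure.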

  12. A model-based approach for detection of objects in low resolution passive millimeter wave images

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Tang, Yuan-Liang; Devadiga, Sadashiva

    1993-01-01

A model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions is described. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as an airport model and the approximate position and orientation of the aircraft are available. These data are exploited to guide the model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. Analytical expressions were also derived for the accuracy of the camera position estimate obtained by detecting the positions of known objects in the image.

  13. Model-based estimation of measures of association for time-to-event outcomes

    PubMed Central

    2014-01-01

Background Hazard ratios are ubiquitously used in time-to-event applications to quantify adjusted covariate effects. Although hazard ratios are invaluable for hypothesis testing, other adjusted measures of association, both relative and absolute, should be provided to fully appreciate study results. The corrected group prognosis method is generally used to estimate the absolute risk reduction and the number needed to treat for categorical covariates. Methods The goal of this paper is to present transformation models for time-to-event outcomes to obtain, directly from estimated coefficients, the measures of association widely used in biostatistics, together with their confidence intervals. Pseudo-values are used for practical estimation of transformation models. Results Using the regression model estimated through pseudo-values with suitable link functions, relative risks, risk differences and the number needed to treat are obtained together with their confidence intervals. One example based on literature data and one original application to the study of prognostic factors in primary retroperitoneal soft tissue sarcomas are presented. A simulation study is used to show some properties of the different estimation methods. Conclusions Clinically useful measures of treatment or exposure effect are widely available in epidemiology. When time-to-event outcomes are present, the analysis is generally performed by resorting to predicted values from the Cox regression model. It is now possible to use more general regression models, adopting suitable link functions and pseudo-values for estimation, to obtain alternative measures of effect directly from regression coefficients, together with their confidence intervals. This may be especially useful when, in the presence of time-dependent covariate effects, it is not straightforward to specify the correct, if any, time-dependent functional form. The method can easily be implemented with standard software. PMID:25106903

  14. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
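
The piecewise linear idea can be caricatured with a one-dimensional trim table: the model's predicted output is interpolated between steady-state trim points, and the sensed-minus-predicted residual is compared against a band. The trim values and band below are invented, not engine data.

```python
import numpy as np

trim_speed = np.array([0.0, 0.5, 1.0])   # normalized shaft-speed trim points
trim_thrust = np.array([0.0, 0.4, 1.0])  # corresponding nominal thrust (invented)

def predicted_thrust(speed):
    """Piecewise linear interpolation between steady-state trim points."""
    return float(np.interp(speed, trim_speed, trim_thrust))

def anomalous(speed, sensed_thrust, band=0.05):
    """Flag an anomaly when the residual leaves the nominal band."""
    return abs(sensed_thrust - predicted_thrust(speed)) > band
```

The paper's architecture additionally carries dynamic state-space matrices at each trim point and updates the trim table by curve fitting against nominal measurement data.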

  15. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  16. A model-based framework for the detection of spiculated masses on mammography

    SciTech Connect

    Sampat, Mehul P.; Bovik, Alan C.; Whitman, Gary J.; Markey, Mia K.

    2008-05-15

The detection of lesions on mammography is a repetitive and fatiguing task. Thus, computer-aided detection systems have been developed to aid radiologists. The detection accuracy of current systems is much higher for clusters of microcalcifications than for spiculated masses. In this article, the authors present a new model-based framework for the detection of spiculated masses. The authors have invented a new class of linear filters, spiculated lesion filters, for the detection of converging lines or spiculations. These filters are highly specific narrowband filters, which are designed to match the expected structures of spiculated masses. As a part of this algorithm, the authors have also invented a novel technique to enhance spicules on mammograms. This entails filtering in the Radon domain. They have also developed models to reduce the false positives due to normal linear structures. A key contribution of this work is that the parameters of the detection algorithm are based on measurements of physical properties of spiculated masses. The results of the detection algorithm are presented in the form of free-response receiver operating characteristic curves on images from the Mammographic Image Analysis Society and Digital Database for Screening Mammography databases.

  17. Model-based fault detection and identification with online aerodynamic model structure selection

    NASA Astrophysics Data System (ADS)

    Lombaerts, T.

    2013-12-01

    This publication describes a recursive algorithm for the approximation of time-varying nonlinear aerodynamic models by means of a joint adaptive selection of the model structure and parameter estimation. This procedure is called adaptive recursive orthogonal least squares (AROLS) and is an extension and modification of the previously developed ROLS procedure. This algorithm is particularly useful for model-based fault detection and identification (FDI) of aerospace systems. After the failure, a completely new aerodynamic model can be elaborated recursively with respect to structure as well as parameter values. The performance of the identification algorithm is demonstrated on a simulation data set.

  18. Relevance popularity: A term event model based feature selection scheme for text classification

    PubMed Central

    Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency of a given term appearing in each document has not been fully investigated, even though it is a promising feature to produce accurate classifications. In this paper, we propose a new feature selection scheme based on a term event Multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing inner parameters by their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments with two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods. PMID:28379986
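
A generic stand-in for term scoring under a multinomial naive Bayes term-event model: rank terms by the log-ratio of their smoothed class-conditional frequencies, which uses within-document term counts rather than document frequency alone. This is an illustrative score, not the paper's exact relevance-popularity measure, and the toy corpora are invented.

```python
import math
from collections import Counter

def term_scores(pos_docs, neg_docs, alpha=1.0):
    """Log-ratio of Laplace-smoothed class-conditional term frequencies."""
    pos = Counter(t for d in pos_docs for t in d)
    neg = Counter(t for d in neg_docs for t in d)
    vocab = set(pos) | set(neg)
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    return {t: math.log((pos[t] + alpha) / (n_pos + alpha * len(vocab)))
             - math.log((neg[t] + alpha) / (n_neg + alpha * len(vocab)))
            for t in vocab}

docs_pos = [["goal", "match", "team"], ["team", "win"]]        # toy sports class
docs_neg = [["stock", "market"], ["market", "crash", "team"]]  # toy finance class
scores = term_scores(docs_pos, docs_neg)
```

Terms with large positive or negative scores are the most class-discriminative and would survive feature selection.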

  19. Relevance popularity: A term event model based feature selection scheme for text classification.

    PubMed

    Feng, Guozhong; An, Baiguo; Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency of a given term appearing in each document has not been fully investigated, even though it is a promising feature to produce accurate classifications. In this paper, we propose a new feature selection scheme based on a term event Multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing inner parameters by their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments with two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods.

  20. Detecting seismic events using Benford's Law

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Gallart, Josep; Ruiz, Mario

    2015-04-01

Benford's Law (BL) states that the distribution of first significant digits is not uniform but follows a logarithmic frequency distribution. Even though a remarkably wide range of natural and socioeconomic data sets, from stock market values to quantum phase transitions, fit this peculiar law, conformity to it has seen few scientific applications, being used mainly as a test to pinpoint anomalous or fraudulent data. We developed a procedure to detect the arrival of seismic waves based on the degree of conformity of the amplitude values in the raw seismic trace to the BL. The signal is divided into time windows of appropriate length and the fit of the first-digit distribution to the BL is checked in each time window using a conformity estimator. We document that both teleseismic and local earthquakes can be clearly identified with this procedure and we compare its performance with the classical STA/LTA approach. Moreover, we show that the conformity of the seismic record to the BL does not depend on the amplitude of the incoming series, as events with very different amplitudes result in quite similar degrees of BL fitting. On the other hand, we show that natural or man-made quasi-monochromatic seismic signals, surface wave trains or engine-generated vibrations can be identified through their very low BL estimator values, when appropriate interval lengths are used. Therefore, we conclude that the degree of conformity of a seismic signal with the BL is primarily dependent on the frequency content of that signal.
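
The windowed conformity check can be sketched as follows: extract the first significant digit of each amplitude sample in a window and measure the deviation of the empirical digit frequencies from the Benford frequencies log10(1 + 1/d). Mean absolute deviation is used here for simplicity; the authors' estimator may differ.

```python
import math
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """First significant digit of a nonzero number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_deviation(window):
    """Mean absolute deviation of first-digit frequencies from Benford's Law."""
    vals = [v for v in window if v != 0]
    counts = Counter(first_digit(v) for v in vals)
    return sum(abs(counts.get(d, 0) / len(vals) - BENFORD[d])
               for d in range(1, 10)) / 9
```

Broadband earthquake amplitudes span several orders of magnitude within a window and so score a low deviation, whereas a quasi-monochromatic signal concentrates its first digits and scores high, consistent with the paper's observation that conformity tracks frequency content rather than amplitude.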

  1. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

The automated and cost-effective detection of buildings in ultra-high spatial resolution imagery is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that can extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification deliver the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based reconstruction step, through which scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating detection rates at object level of 88% regarding correctness and above 75% regarding detection completeness.

  2. Observability analysis for model-based fault detection and sensor selection in induction motors

    NASA Astrophysics Data System (ADS)

    Nakhaeinejad, Mohsen; Bryant, Michael D.

    2011-07-01

Sensors in different types and configurations provide information on the dynamics of a system. For a specific task, the question is whether the measurements carry enough information, or whether the sensor configuration can be changed to improve performance or reduce costs. Observability analysis may answer these questions. This paper presents a general algorithm of nonlinear observability analysis with application to model-based diagnostics and sensor selection in three-phase induction motors. A bond graph model of the motor is developed and verified with experiments. A nonlinear observability matrix based on Lie derivatives is obtained from the state equations. An observability index based on the singular value decomposition of the observability matrix is obtained. Singular values and singular vectors are used to identify the most and least observable configurations of sensors and parameters. A complex-step derivative technique is used in the calculation of Jacobians to improve the computational performance of the observability analysis. The proposed algorithm can be applied to any nonlinear system to select the best configuration of sensors for model-based diagnostics or observer-based control, or to determine the level of sensor redundancy. Observability analysis on induction motors provides various sensor configurations with corresponding observability indices. Results show the redundancy levels for different sensors, and provide a sensor selection guideline for model-based diagnostics and observer-based controllers. The results can also be used for sensor fault detection and to improve the reliability of the system by increasing the redundancy level in measurements.

  3. A Unified Framework for Event Summarization and Rare Event Detection from Multiple Views.

    PubMed

    Kwon, Junseok; Lee, Kyoung Mu

    2015-09-01

    A novel approach for event summarization and rare event detection is proposed. Unlike conventional methods that deal with event summarization and rare event detection independently, our method solves them in a single framework by transforming them into a graph editing problem. In our approach, a video is represented by a graph, each node of which indicates an event obtained by segmenting the video spatially and temporally. The edges between nodes describe the relationship between events. Based on the degree of relations, edges have different weights. After learning the graph structure, our method finds subgraphs that represent event summarization and rare events in the video by editing the graph, that is, merging its subgraphs or pruning its edges. The graph is edited to minimize a predefined energy model with the Markov Chain Monte Carlo (MCMC) method. The energy model consists of several parameters that represent the causality, frequency, and significance of events. We design a specific energy model that uses these parameters to satisfy each objective of event summarization and rare event detection. The proposed method is extended to obtain event summarization and rare event detection results across multiple videos captured from multiple views. For this purpose, the proposed method independently learns and edits each graph of individual videos for event summarization or rare event detection. Then, the method matches the extracted multiple graphs to each other, and constructs a single composite graph that represents event summarization or rare events from multiple views. Experimental results show that the proposed approach accurately summarizes multiple videos in a fully unsupervised manner. Moreover, the experiments demonstrate that the approach is advantageous in detecting rare transition of events.

  4. Comparison of chiller models for use in model-based fault detection

    SciTech Connect

    Sreedharan, Priya; Haves, Philip

    2001-06-07

Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Factors that are considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression chillers. Three different models were studied: the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools™, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older, centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
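
The practical benefit of a model that is linear in its parameters, as noted for the Gordon-Ng model, is that calibration reduces to one closed-form least-squares solve. The regressors below are generic placeholders (not the actual Gordon-Ng regressor functions of evaporator and condenser temperatures), and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([
    np.ones(n),              # intercept
    rng.uniform(5, 12, n),   # e.g. chilled-water supply temperature (hypothetical)
    rng.uniform(28, 35, n),  # e.g. condenser-water temperature (hypothetical)
])
true_beta = np.array([0.2, 0.03, 0.01])     # invented "chiller" parameters
y = X @ true_beta + rng.normal(0, 1e-4, n)  # simulated performance data

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # one-shot parameter estimation
```

Linearity also makes parameter uncertainty easy to quantify from the residual covariance, which is the robustness advantage cited for the Gordon-Ng model.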

  5. Failure analysis for model-based organ segmentation using outlier detection

    NASA Astrophysics Data System (ADS)

    Saalbach, Axel; Wächter Stehle, Irina; Lorenz, Cristian; Weese, Jürgen

    2014-03-01

In recent years, Model-Based Segmentation (MBS) techniques have been used in a broad range of medical applications. In clinical practice, such techniques are increasingly employed for diagnostic purposes and treatment decisions. However, it is not guaranteed that a segmentation algorithm will converge towards the desired solution. In specific situations, such as the presence of rare anatomical variants (which cannot be represented by the model) or images of extremely low quality, a meaningful segmentation might not be feasible. At the same time, an automated estimation of the segmentation reliability is commonly not available. In this paper we present an approach for the identification of segmentation failures using concepts from the field of outlier detection. The approach is validated on a comprehensive set of Computed Tomography Angiography (CTA) images by means of Receiver Operating Characteristic (ROC) analysis. Encouraging results in terms of an Area Under the ROC Curve (AUC) of up to 0.965 were achieved.

  6. Robust event detection scheme for complex scenes in video surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Erkang; Xu, Yi; Yang, Xiaokang; Zhang, Wenjun

    2011-07-01

Event detection for video surveillance is a difficult task due to many challenges: cluttered background, illumination variations, scale variations, occlusions among people, etc. We propose an effective and efficient event detection scheme for such complex situations. Moving shadows due to illumination are tackled with a segmentation method with shadow detection, and scale variations are handled using the CamShift guided particle filter tracking algorithm. For event modeling, hidden Markov models are employed. The proposed scheme also reduces the overall computational cost by combining two human detection algorithms and using tracking information to aid human detection. Experimental results on TRECVid event detection evaluation demonstrate the efficacy of the proposed scheme. It is robust, especially to moving shadows and scale variations. Employing the scheme, we achieved the best run results for two events in the TRECVid benchmarking evaluation.

  7. A sparse model based detection of copy number variations from exome sequencing data

    PubMed Central

    Duan, Junbo; Wan, Mingxi; Deng, Hong-Wen; Wang, Yu-Ping

    2016-01-01

Goal: Whole-exome sequencing provides a more cost-effective way than whole-genome sequencing for detecting genetic variants such as copy number variations (CNVs). Although a number of approaches have been proposed to detect CNVs from whole-genome sequencing, a direct adoption of these approaches to whole-exome sequencing will often fail because exons are separately located along a genome. Therefore, an appropriate method is needed to target the specific features of exome sequencing data. Methods: In this paper, a novel sparse model based method is proposed to discover CNVs from multiple exome sequencing data. First, exome sequencing data are represented with a penalized matrix approximation, and technical variability and random sequencing errors are assumed to follow a generalized Gaussian distribution. Second, an iteratively re-weighted least squares algorithm is used to estimate the solution. Results: The method is tested and validated on both synthetic and real data, and compared with other approaches including CoNIFER, XHMM and cn.MOPS. The tests demonstrate that the proposed method outperforms the other approaches. Conclusion: The proposed sparse model can detect CNVs from exome sequencing data with high power and precision. Significance: The sparse model can target the specific features of exome sequencing data. The software codes are freely available at http://www.tulane.edu/wyp/software/ExonCNV.m PMID:26258935
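
The estimation step can be sketched with a generic iteratively re-weighted least squares (IRLS) loop for a sparse linear model: entries with small magnitude receive large penalty weights and are driven toward zero. Dimensions, the penalty weight, and the data are synthetic stand-ins for the paper's penalized read-depth matrix approximation.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 20))                # synthetic design matrix
x_true = np.zeros(20)
x_true[[3, 11]] = [2.0, -1.5]                # sparse CNV-like signal
b = A @ x_true + 0.01 * rng.normal(size=40)  # noisy observations

x = np.linalg.lstsq(A, b, rcond=None)[0]     # unpenalized initialization
lam = 0.1
for _ in range(30):                          # IRLS iterations
    w = 1.0 / (np.abs(x) + 1e-8)             # small entries -> large penalty weights
    x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
```

Each re-weighted ridge solve approximates an l1-type penalty, so the iterate converges to a sparse solution whose support marks the candidate CNVs.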

  8. 3D model-based detection and tracking for space autonomous and uncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Shang, Yang; Zhang, Yueqiang; Liu, Haibo

    2015-10-01

    In order to fully navigate using a vision sensor, a 3D edge model based detection and tracking technique was developed. Firstly, we proposed a target detection strategy over a sequence of several images from the 3D model to initialize the tracking. The overall purpose of such approach is to robustly match each image with the model views of the target. Thus we designed a line segment detection and matching method based on the multi-scale space technology. Experiments on real images showed that our method is highly robust under various image changes. Secondly, we proposed a method based on 3D particle filter (PF) coupled with M-estimation to track and estimate the pose of the target efficiently. In the proposed approach, a similarity observation model was designed according to a new distance function of line segments. Then, based on the tracking results of PF, the pose was optimized using M-estimation. Experiments indicated that the proposed method can effectively track and accurately estimate the pose of freely moving target in unconstrained environment.

  9. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events.

  10. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  11. Correlation between human detection accuracy and observer model-based image quality metrics in computed tomography

    PubMed Central

    Solomon, Justin; Samei, Ehsan

    2016-01-01

    Abstract. The purpose of this study was to compare computed tomography (CT) low-contrast detectability from human readers with observer model-based surrogates of image quality. A phantom with a range of low-contrast signals (five contrasts, three sizes) was imaged on a state-of-the-art CT scanner (Siemens Force). Images were reconstructed using filtered back projection and advanced modeled iterative reconstruction and were assessed by 11 readers using a two-alternative forced choice method. Concurrently, contrast-to-noise ratio (CNR), area-weighted CNR (CNRA), and observer model-based metrics were estimated, including nonprewhitening (NPW) matched filter, NPW with eye filter (NPWE), NPW with internal noise (NPWi), NPW with an eye filter and internal noise (NPWEi), channelized Hotelling observer (CHO), and CHO with internal noise (CHOi). The correlation coefficients (Pearson and Spearman), linear discriminator error, E, and magnitude of confidence intervals, |CI95%|, were used to determine correlation, proper characterization of the reconstruction algorithms, and model precision, respectively. Pearson (Spearman) correlation was 0.36 (0.33), 0.83 (0.84), 0.84 (0.86), 0.86 (0.88), 0.86 (0.91), 0.88 (0.90), 0.85 (0.89), and 0.87 (0.84); E was 0.25, 0.15, 0.2, 0.25, 0.3, 0.25, 0.4, and 0.45; and |CI95%| was 2.84×10^−3, 5.29×10^−3, 4.91×10^−3, 4.55×10^−3, 2.16×10^−3, 1.24×10^−3, 4.58×10^−2, and 7.95×10^−2 for CNR, CNRA, NPW, NPWE, NPWi, NPWEi, CHO, and CHOi, respectively. PMID:27704032

  12. CHIRP-Like Signals: Estimation, Detection and Processing A Sequential Model-Based Approach

    SciTech Connect

    Candy, J. V.

    2016-08-04

    Chirp signals have evolved primarily from radar/sonar signal processing applications, specifically attempting to estimate the location of a target in a surveillance/tracking volume. The chirp, which is essentially a sinusoidal signal whose phase changes instantaneously at each time sample, has an interesting property in that its correlation approximates an impulse function. It is well-known that a matched-filter detector in radar/sonar estimates the target range by cross-correlating a replica of the transmitted chirp with the measurement data reflected from the target back to the radar/sonar receiver, yielding a maximum peak corresponding to the echo time and therefore enabling the desired range estimate. In this application, we perform the same operation as a radar or sonar system; that is, we transmit a “chirp-like pulse” into the target medium and attempt first to detect its presence and second to estimate its location or range. Our problem is complicated by the presence of disturbance signals from surrounding broadcast stations as well as extraneous sources of interference in our frequency bands and, of course, the ever-present random noise from instrumentation. First, we discuss the chirp signal itself and illustrate its inherent properties, and then develop a model-based processing scheme enabling both the detection and estimation of the signal from noisy measurement data.
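    The matched-filter idea described in this record can be sketched numerically: cross-correlating a noisy, delayed copy of a chirp with a replica of the transmitted pulse yields a correlation peak at the echo delay (a minimal illustration with invented signal parameters, not the report's sequential model-based processor).

```python
import numpy as np

# Synthesize a linear chirp: instantaneous frequency sweeps f0 -> f1 over the pulse.
fs = 1000.0                      # sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)    # 0.5 s pulse
f0, f1 = 50.0, 200.0
phase = 2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t**2)
chirp = np.cos(phase)

# Received signal: the chirp delayed by 300 samples, attenuated, buried in noise.
delay = 300
rx = np.zeros(2000)
rx[delay:delay + chirp.size] += 0.5 * chirp
rx += 0.1 * np.random.default_rng(0).standard_normal(rx.size)

# Matched filter: cross-correlate the measurement with the transmitted replica.
corr = np.correlate(rx, chirp, mode="valid")
est_delay = int(np.argmax(np.abs(corr)))
print(est_delay)                 # correlation peak lands at the true echo delay
```

Because the chirp's autocorrelation approximates an impulse, the peak location directly gives the echo time (and hence range) even at this low signal-to-noise ratio.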

  13. Model-based approach to the detection and classification of mines in sidescan sonar.

    PubMed

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-10

    This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis.

  14. Detecting plastic events in emulsions simulations

    NASA Astrophysics Data System (ADS)

    Lulli, Matteo; Bernaschi, Massimo; Sbragaglia, Mauro

    2016-11-01

    Emulsions are complex systems formed by a number of non-coalescing droplets dispersed in a solvent, leading to non-trivial effects in the overall flowing dynamics. Such systems possess a yield stress below which an elastic response to an external forcing occurs, while above the yield stress the system flows as a non-Newtonian fluid, i.e. the stress is not proportional to the shear. In the solid-like regime the network of droplet interfaces stores the energy coming from the work exerted by an external forcing, which can be used to move the droplets in a non-reversible way, i.e. causing plastic events. The Kinetic-Elasto-Plastic (KEP) theory is an effective theory describing some features of the flowing regime, relating the rate of plastic events to a scalar field called the fluidity, f = γ̇/σ, i.e. the inverse of an effective viscosity. Boundary conditions have a non-trivial role not captured by the KEP description. In this contribution we compare numerical results against experiments concerning the Poiseuille flow of emulsions in microchannels with complex boundary geometries. Using an efficient computational tool we show non-trivial results on plastic events for different realizations of the rough boundaries. The research leading to these results has received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013)/ERC Grant Agreement no. [279004].

  15. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength at the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper presents a detailed account, with experimental data, of how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the classifier assigns geo-events within the regions where events occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks was able to detect and classify subsurface events. PMID:23202191
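    The minimum distance classifier mentioned in this record can be sketched as follows: assign a classification window's feature vector to the geo-event class with the nearest class mean, which coincides with the Bayes-optimal rule under equal priors and identical isotropic covariances. The class names and feature values below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical per-class mean feature vectors (e.g., RSS-variation statistics
# over a classification window), learned from labeled training windows.
class_means = {
    "water_intrusion": np.array([-6.0, -4.5, -5.2]),
    "density_change":  np.array([-1.0, -1.5, -0.8]),
    "no_event":        np.array([ 0.1,  0.0, -0.1]),
}

def classify_window(window):
    """Assign the window to the class whose mean is nearest in Euclidean distance.

    With equal priors and identical isotropic covariances, this minimum-distance
    rule is equivalent to the Bayes decision rule.
    """
    return min(class_means, key=lambda c: np.linalg.norm(window - class_means[c]))

print(classify_window(np.array([-5.5, -4.0, -5.0])))  # -> water_intrusion
```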

  16. Saliency-based abnormal event detection in crowded scenes

    NASA Astrophysics Data System (ADS)

    Shi, Yanjiao; Liu, Yunxiang; Zhang, Qing; Yi, Yugen; Li, Wenju

    2016-11-01

    Abnormal event detection plays a critical role in intelligent video surveillance, and detection in crowded scenes is a challenging but more practical task. We present an abnormal event detection method for crowded video. Region-wise modeling is proposed to address the inconsistent detected motion of the same object due to different depths of field. Compared to traditional block-wise modeling, the region-wise method not only greatly reduces the number of models to be built but also enriches the samples for training the normal-events model. To reduce the computational burden and make region-based anomaly detection feasible, a saliency detection technique is adopted in this paper. By identifying the salient parts of the image sequences, the irrelevant blocks are ignored, which removes disturbance and further improves detection performance. Experiments on the benchmark dataset and comparisons with state-of-the-art algorithms validate the advantages of the proposed method.

  17. Modeling Concept Dependencies for Event Detection

    DTIC Science & Technology

    2014-04-04

    An MRF (Markov random field), an undirected graphical model commonly used in machine learning, is used to model concept dependencies. Example events include (E014) Repairing an appliance and (E015) Working on a sewing project. The EC collection has 2,062 videos, each of which is relevant to an event. The DevT... [remainder of the record, including a per-event results table, is garbled in the extracted text]

  18. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.

  19. [Detecting gene-gene/environment interactions by model-based multifactor dimensionality reduction].

    PubMed

    Fan, Wei; Shen, Chao; Guo, Zhirong

    2015-11-01

    This paper introduces a method called model-based multifactor dimensionality reduction (MB-MDR), first proposed by Calle et al., which can be applied to detect gene-gene or gene-environment interactions in genetic studies. The basic principle and characteristics of MB-MDR, as well as its operation in the R program, are briefly summarized, and the detailed procedure of MB-MDR is illustrated with an example. MB-MDR shares its basic principle with classical MDR: it merges multi-locus genotypes into a one-dimensional construct and can be used in studies with small sample sizes. However, there are some differences between MB-MDR and classical MDR. First, MB-MDR has higher statistical power than MDR and other MDR extensions in the presence of different types of noise, owing to the different way genotype cells are merged. Second, compared with MDR, it can handle both binary and quantitative traits and can adjust for marginal effects of factors and for confounders. MB-MDR could be a useful method in the analysis of gene-gene/environment interactions.

  20. High-Resolution Ultrasound Imaging Using Model-Based Iterative Reconstruction for Canister Degradation Detection

    SciTech Connect

    Chatzidakis, Stylianos; Jarrell, Joshua J; Scaglione, John M

    2017-01-01

    The inspection of the dry storage canisters that house spent nuclear fuel is an important issue facing the nuclear industry; currently, there are limited options available to provide for even minimal inspections. An issue of concern is stress corrosion cracking (SCC) in austenitic stainless steel canisters. SCC is difficult to predict and exhibits small crack opening displacements on the order of 15–30 µm. Nondestructive examination (NDE) of such microscopic cracks is especially challenging, and it may be possible to miss SCC during inspections. The coarse grain microstructure at the heat-affected zone reduces the achievable sensitivity of conventional ultrasound techniques. At Oak Ridge National Laboratory, a tomographic approach is under development to improve SCC detection using ultrasound guided waves and model-based iterative reconstruction (MBIR). Ultrasound guided waves propagate parallel to the physical boundaries of the surface and allow for rapid inspection of a large area from a single probe location. MBIR is a novel, effective probabilistic imaging tool that offers higher precision and better image quality than current reconstruction techniques. This paper analyzes the canister environment, stainless steel microstructure, and SCC characteristics. The end goal is to demonstrate the feasibility of an NDE system based on ultrasonic guided waves and MBIR for canister degradation detection and to produce radar-like images of the canister surface with significantly improved image quality. The proposed methodology can potentially reduce human radiation exposure, result in lower operational costs, and provide a methodology that can be used to verify canister integrity in situ during extended storage.

  1. A physical model-based approach to detecting sky in photographic images.

    PubMed

    Luo, Jiebo; Etz, Stephen P

    2002-01-01

    Sky is among the most important subject matter frequently seen in photographic images. We propose a model-based approach consisting of color classification, region extraction, and physics-motivated sky signature validation. First, the color classification is performed by a multilayer backpropagation neural network trained in a bootstrapping fashion to generate a belief map of sky color. Next, the region extraction algorithm automatically determines an appropriate threshold for the sky color belief map and extracts connected components. Finally, the sky signature validation algorithm determines the orientation of a candidate sky region, classifies one-dimensional (1-D) traces within the region based on a physics-motivated model, and computes the sky belief of the region by the percentage of traces that fit the physics-based sky trace model. A small-scale, yet rigorous test has been conducted to evaluate the algorithm performance. With approximately half of the images containing blue sky regions, the detection rate is 96% with a false positive rate of 2% on a per image basis.

  2. Event Detection using Twitter: A Spatio-Temporal Approach

    PubMed Central

    Cheng, Tao; Wicks, Thomas

    2014-01-01

    Background Every day, around 400 million tweets are sent worldwide, which has become a rich source for detecting, monitoring and analysing news stories and special (disaster) events. Existing research within this field follows key words attributed to an event, monitoring temporal changes in word usage. However, this method requires prior knowledge of the event in order to know which words to follow, and does not guarantee that the words chosen will be the most appropriate to monitor. Methods This paper suggests an alternative methodology for event detection using space-time scan statistics (STSS). This technique looks for clusters within the dataset across both space and time, regardless of tweet content. It is expected that clusters of tweets will emerge during spatio-temporally relevant events, as people will tweet more than expected in order to describe the event and spread information. The special event used as a case study is the 2013 London helicopter crash. Results and Conclusion A spatio-temporally significant cluster is found relating to the London helicopter crash. Although the cluster only remains significant for a relatively short time, it is rich in information, such as important key words and photographs. The method also detects other special events such as football matches, as well as train and flight delays from Twitter data. These findings demonstrate that STSS is an effective approach to analysing Twitter data for event detection. PMID:24893168

  3. Efficient method for events detection in phonocardiographic signals

    NASA Astrophysics Data System (ADS)

    Martinez-Alajarin, Juan; Ruiz-Merino, Ramon

    2005-06-01

    The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer the patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist the physician at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs, and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this method usually does not allow the detection of the main sounds when additional sounds and murmurs exist, or it may join several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima peaks in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of the murmurs is detected, which aids in differentiating diseases that can occur at the same temporal localization. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
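    The envelope-peak detection step can be sketched on synthetic data: rectify and smooth the signal to obtain an amplitude envelope, find its relative maxima, and merge nearby maxima into single events. This is a minimal sketch with invented parameters, not the authors' implementation.

```python
import numpy as np

fs = 1000                              # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)

# Toy phonocardiogram: two "heart sound" bursts (S1, S2) plus mild noise.
def burst(center, width=0.04, freq=60.0):
    return np.exp(-((t - center) / width) ** 2) * np.sin(2 * np.pi * freq * t)

pcg = burst(0.20) + 0.7 * burst(0.55)
pcg += 0.02 * np.random.default_rng(1).standard_normal(t.size)

# Amplitude envelope: rectify, then smooth with a 50 ms moving average.
win = 50
env = np.convolve(np.abs(pcg), np.ones(win) / win, mode="same")

# Relative maxima of the envelope above a small floor -> candidate events.
floor = 0.1 * env.max()
peaks = [i for i in range(1, env.size - 1)
         if env[i] > env[i - 1] and env[i] >= env[i + 1] and env[i] > floor]

# Merge maxima closer than 100 ms into one event, keeping the strongest.
events = []
for i in peaks:
    if events and i - events[-1] < 100:
        if env[i] > env[events[-1]]:
            events[-1] = i
    else:
        events.append(i)
print([e / fs for e in events])        # event times near 0.20 s and 0.55 s
```

The merge step is what keeps murmur-induced ripples in the envelope from splitting one sound into several detections.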

  4. Video Event Detection Framework on Large-Scale Video Data

    ERIC Educational Resources Information Center

    Park, Dong-Jun

    2011-01-01

    Detection of events and actions in video entails substantial processing of very large, even open-ended, video streams. Video data present a unique challenge for the information retrieval community because properly representing video events is challenging. We propose a novel approach to analyze temporal aspects of video data. We consider video data…

  5. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that post-processes a CSV file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective, and in general met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The
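    The GMM/EM fitting step mentioned in this record can be sketched for one-dimensional data (a toy two-component fit with invented data, not the multivariate RETS implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy sensor readings: a nominal cluster near 0 and an anomalous cluster near 5.
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 0.5, 100)])

# EM for a two-component 1-D Gaussian mixture.
mu = np.array([x.min(), x.max()])      # crude initialization
var = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibility of each component for each point.
    dens = pi / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means, and variances.
    nk = resp.sum(axis=0)
    pi = nk / x.size
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
print(np.round(np.sort(mu), 1))        # recovered component means near 0 and 5
```

Points with low likelihood under the fitted mixture would then be candidate anomalies; the abstract's approach additionally reasons over time with a linear dynamic system.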

  6. Semantic Concept Discovery for Large Scale Zero Shot Event Detection

    DTIC Science & Technology

    2015-07-25

    Approved for public release; distribution is unlimited. Semantic Concept Discovery for Large-Scale Zero-Shot Event Detection. Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213-3815; U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: zero shot event detection, semantic concept discovery. [Report documentation page fields garbled in the extracted text]

  7. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-12-15

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
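    The barycenter computation in this record can be sketched as picking the node that minimizes the total shortest-path distance to all other nodes of the weighted graph (one common definition of a graph barycenter; the edge weights below are invented, and the paper's exact weighting scheme may differ):

```python
import numpy as np

INF = np.inf
# Hypothetical 5-node deviation graph: symmetric edge weights reflect the
# extent of sensory-data deviation between neighboring sensor nodes.
w = np.array([[0,   2,   INF, INF, INF],
              [2,   0,   3,   INF, 2],
              [INF, 3,   0,   1,   INF],
              [INF, INF, 1,   0,   INF],
              [INF, 2,   INF, INF, 0]], dtype=float)

# Floyd-Warshall all-pairs shortest paths.
d = w.copy()
for k in range(d.shape[0]):
    d = np.minimum(d, d[:, k:k + 1] + d[k:k + 1, :])

# Barycenter: node with minimum total distance to every other node,
# taken here as the likely event source.
barycenter = int(np.argmin(d.sum(axis=1)))
print(barycenter)
```

In this toy graph node 1 sits centrally (it touches three edges), so it comes out as the barycenter.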

  8. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  9. Respiratory Event Detection by a Positive Airway Pressure Device

    PubMed Central

    Berry, Richard B.; Kushida, Clete A.; Kryger, Meir H.; Soto-Calderon, Haideliza; Staley, Bethany; Kuna, Samuel T.

    2012-01-01

    Study Objectives: Compare automatic event detection (AED) of respiratory events using a positive airway pressure (PAP) device with manual scoring of polysomnography (PSG) during PAP treatment of obstructive sleep apnea (OSA). Design: Prospective PSGs of patients using a PAP device. Setting: Six academic and private sleep disorders centers. Patients: A total of 148 PSGs from 115 participants with OSA (apnea-hypopnea index [AHI] ≥ 15 events/hr) were analyzed. Interventions: A signal generated by the PAP device identifying the AED of respiratory events based on airflow was recorded during PSG. Measurements and Results: The PSGs were manually scored without visualization of the AED signal and scoring of a hypopnea required a ≥ 4% oxygen desaturation. The apnea index (AI), hypopnea index (HI), and AHI by manual score and PAP AED were compared. A customized computer program compared individual events by manual scoring and AED to determine the true positive, false positive, false negative, or true negative events and found a sensitivity of 0.58 and a specificity of 0.98. The AHI, AI, and HI by the two methods were highly correlated. Bland-Altman analysis showed better agreement for AI than HI. Using a manually scored AHI of ≥ 10 events/hr to denote inadequate treatment, an AED AHI ≥ 10 events/hr had a sensitivity of 0.58 and a specificity of 0.94. Conclusions: An AHI < 10 events/hr by PAP AED is usually associated with good treatment efficacy. Differences between manually scored and AED events were primarily due to different criteria for hypopnea detection. Citation: Berry RB; Kushida CA; Kryger MH; Soto-Calderon H; Staley B; Kuna ST. Respiratory event detection by a positive airway pressure device. SLEEP 2012;35(3):361-367. PMID:22379242

  10. Bayesian-network-based soccer video event detection and retrieval

    NASA Astrophysics Data System (ADS)

    Sun, Xinghua; Jin, Guoying; Huang, Mei; Xu, Guangyou

    2003-09-01

    This paper presents an event-based soccer video retrieval method, in which the scoring event is detected with a Bayesian network from six kinds of cue information: gate, face, audio, texture, caption, and text. The topology of the Bayesian network is predefined by hand according to domain knowledge, and the probability distributions are learned in the case of known structure and full observability. The resulting event probability from the Bayesian network is used as the feature vector to perform video retrieval. Experiments show that the true and false detection ratios for the scoring event are about 90% and 16.67%, respectively, and that event-based video retrieval is superior to retrieval based on low-level features in terms of human visual perception.

  11. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    PubMed Central

    Liu, Changyu; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches. PMID:25147840

  12. Abnormal events detection in crowded scenes by trajectory cluster

    NASA Astrophysics Data System (ADS)

    Zhou, Shifu; Zhang, Zhijiang; Zeng, Dan; Shen, Wei

    2015-02-01

    Abnormal event detection in crowded scenes has been a challenge due to the volatility of the definitions of both normality and abnormality, the small number of pixels on the target, appearance ambiguity resulting from dense packing, and severe inter-object occlusions. A novel framework is proposed for the detection of unusual events in crowded scenes using trajectories produced by moving pedestrians, based on the intuition that the motion patterns of usual behaviors are similar to those of group activity, whereas unusual behaviors are not. First, spectral clustering is used to group trajectories with similar spatial patterns. Different trajectory clusters represent different activities. Then, unusual trajectories can be detected using these patterns. Furthermore, the behavior of a moving pedestrian can be characterized by comparing its direction with these patterns, such as moving in the opposite direction of the group or traversing the group. Experimental results indicated that the proposed algorithm can reliably locate abnormal events in crowded scenes.
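    The direction comparison described in this record can be sketched with cosine similarity between a trajectory's net displacement and the cluster's dominant direction (a minimal illustration with invented trajectories and threshold, not the authors' full spectral-clustering pipeline):

```python
import numpy as np

def mean_direction(traj):
    """Unit vector of a trajectory's net displacement (traj: (T, 2) points)."""
    d = traj[-1] - traj[0]
    return d / np.linalg.norm(d)

# Cluster pattern: a group of pedestrians all walking in the +x direction.
group = [np.column_stack([np.linspace(0, 10, 20), np.full(20, float(y))])
         for y in range(5)]
pattern = np.mean([mean_direction(tr) for tr in group], axis=0)
pattern /= np.linalg.norm(pattern)

def is_abnormal(traj, cos_thresh=0.0):
    """Flag trajectories whose direction opposes or traverses the group pattern."""
    return float(mean_direction(traj) @ pattern) < cos_thresh

against = np.column_stack([np.linspace(10, 0, 20), np.full(20, 2.0)])    # opposite way
with_flow = np.column_stack([np.linspace(0, 10, 20), np.full(20, 2.0)])  # with the group
print(is_abnormal(against), is_abnormal(with_flow))  # True False
```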

  13. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance is playing a more and more important role in people's social lives. Real-time alerting of threatening events and searching for interesting content in large volumes of stored video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and easily checked rules enable the framework to work in real time. Future work is also discussed.
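The frame-differencing step for foreground detection can be sketched in a few lines; the 8x8 frames and threshold value here are toy assumptions, not parameters from the paper:

```python
import numpy as np

def detect_foreground(prev_frame, curr_frame, threshold=25):
    """Flag pixels whose absolute intensity change exceeds a threshold."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# A static 8x8 background; a bright 2x2 "object" appears in the next frame.
prev_frame = np.full((8, 8), 100, dtype=np.uint8)
curr_frame = prev_frame.copy()
curr_frame[2:4, 2:4] = 200

mask = detect_foreground(prev_frame, curr_frame)
print(int(mask.sum()))  # 4 foreground pixels
```

In the full pipeline this mask would be computed after motion compensation, then passed to the HOG classifiers and the mean-shift tracker.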

  14. Data mining for signal detection of adverse event safety data.

    PubMed

    Chen, Hung-Chia; Tsong, Yi; Chen, James J

    2013-01-01

    The Adverse Event Reporting System (AERS) is the primary database designed to support the Food and Drug Administration (FDA) postmarketing safety surveillance program for all approved drugs and therapeutic biologic products. Most current disproportionality analysis focuses on the detection of potential adverse events (AE) involving a single drug and a single AE only. In this paper, we present a data mining biclustering technique based on the singular value decomposition to extract local regions of association for a safety study. The analysis yields a collection of biclusters, each representing an association between a set of drugs and the corresponding set of adverse events. The significance of each bicluster can be tested using disproportionality analysis, and individual drug-event combinations can be tested further. A safety data set consisting of 193 drugs with 8453 adverse events is analyzed as an illustration.
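The idea of extracting a bicluster from the SVD of a drug-by-event count matrix, then checking it with a disproportionality statistic, can be sketched as follows. The count matrix, the leading-singular-vector thresholding rule, and the use of the proportional reporting ratio (PRR) are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

# Hypothetical drug-by-adverse-event report counts (rows: drugs, cols: AEs).
counts = np.array([
    [30.0,  2.0,  1.0],
    [28.0,  3.0,  2.0],
    [ 1.0, 25.0, 24.0],
    [ 2.0, 27.0, 26.0],
])

# The leading singular vectors highlight a block (bicluster) of drugs and AEs
# that are reported together more often than the rest of the table.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
drug_scores, ae_scores = np.abs(U[:, 0]), np.abs(Vt[0, :])
bicluster_drugs = np.where(drug_scores > drug_scores.mean())[0]
bicluster_aes = np.where(ae_scores > ae_scores.mean())[0]

# Disproportionality check for one drug-event pair via the PRR.
def prr(n11, n1_, n_1, n):
    """PRR = (n11 / n1_) / ((n_1 - n11) / (n - n1_))."""
    return (n11 / n1_) / ((n_1 - n11) / (n - n1_))

n = counts.sum()
pair = prr(counts[0, 0], counts[0].sum(), counts[:, 0].sum(), n)
print(bicluster_drugs, bicluster_aes, round(pair, 2))
```

On this toy table the leading component picks out the block of drugs 2-3 with events 1-2, and the tested pair has PRR well above 1.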

  15. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    PubMed

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2017-05-01

    Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, semantic representation has been credited for its promising performance and its desirable ability to support human-understandable reasoning. To generate a semantic representation, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences between these archives, the resulting representations presumably have different predictive capabilities for a given event. Nevertheless, little work is available on assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analysis method. At the source level, our method learns weights for the semantic representations attained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly valuable in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  16. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.

    2015-12-22

    A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring the relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit; generating a warning in the event the measured relative humidity is outside the defined limit; determining whether the change in the measured relative humidity is less than the defined change threshold for the given space; and generating an alarm in the event the change is greater than the defined change threshold.
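The warning/alarm logic described in the claim can be sketched as a simple two-level check; the band limits and change threshold below are placeholder values, not figures from the patent:

```python
def check_cooling(rh, prev_rh, rh_limit=(30.0, 60.0), change_threshold=5.0):
    """Return (warning, alarm) flags for one relative-humidity reading.

    warning: the reading is outside the allowed band.
    alarm:   the reading changed faster than the allowed rate.
    """
    warning = not (rh_limit[0] <= rh <= rh_limit[1])
    alarm = abs(rh - prev_rh) > change_threshold
    return warning, alarm

print(check_cooling(45.0, 44.0))  # in band, slow change -> (False, False)
print(check_cooling(72.0, 45.0))  # out of band, fast change -> (True, True)
```

In practice the two flags map to the patent's warning (limit violated) and alarm (change threshold exceeded) outputs.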

  17. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.

    2015-06-30

    A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring the relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit; generating a warning in the event the measured relative humidity is outside the defined limit; determining whether the change in the measured relative humidity is less than the defined change threshold for the given space; and generating an alarm in the event the change is greater than the defined change threshold.

  18. Respiratory event detection by a positive airway pressure device.

    PubMed

    Berry, Richard B; Kushida, Clete A; Kryger, Meir H; Soto-Calderon, Haideliza; Staley, Bethany; Kuna, Samuel T

    2012-03-01

    Compare automatic event detection (AED) of respiratory events using a positive airway pressure (PAP) device with manual scoring of polysomnography (PSG) during PAP treatment of obstructive sleep apnea (OSA). Prospective PSGs of patients using a PAP device. Six academic and private sleep disorders centers. A total of 148 PSGs from 115 participants with OSA (apnea-hypopnea index [AHI] ≥ 15 events/hr) were analyzed. A signal generated by the PAP device identifying the AED of respiratory events based on airflow was recorded during PSG. The PSGs were manually scored without visualization of the AED signal and scoring of a hypopnea required a ≥ 4% oxygen desaturation. The apnea index (AI), hypopnea index (HI), and AHI by manual score and PAP AED were compared. A customized computer program compared individual events by manual scoring and AED to determine the true positive, false positive, false negative, or true negative events and found a sensitivity of 0.58 and a specificity of 0.98. The AHI, AI, and HI by the two methods were highly correlated. Bland-Altman analysis showed better agreement for AI than HI. Using a manually scored AHI of ≥ 10 events/hr to denote inadequate treatment, an AED AHI ≥ 10 events/hr had a sensitivity of 0.58 and a specificity of 0.94. An AHI < 10 events/hr by PAP AED is usually associated with good treatment efficacy. Differences between manually scored and AED events were primarily due to different criteria for hypopnea detection.
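The event-level agreement metrics reported above reduce to the standard confusion-matrix definitions. The counts below are hypothetical, chosen only to reproduce the reported sensitivity and specificity:

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Event-level agreement between automatic detection and manual scoring."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical true/false positive and negative event counts.
sens, spec = sensitivity_specificity(tp=58, fp=2, fn=42, tn=98)
print(round(sens, 2), round(spec, 2))  # 0.58 0.98
```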

  19. Human Rights Event Detection from Heterogeneous Social Media Graphs.

    PubMed

    Chen, Feng; Neill, Daniel B

    2015-03-01

    Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.

  20. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
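The descriptor-plus-one-class-classifier idea can be sketched with a magnitude-weighted histogram of flow orientations and scikit-learn's OneClassSVM. The synthetic flow fields, bin count, and SVM parameters are illustrative assumptions, not the paper's configuration (which also uses kernel PCA):

```python
import numpy as np
from sklearn.svm import OneClassSVM

def flow_orientation_histogram(flow, bins=8):
    """Histogram of optical-flow orientations, weighted by flow magnitude."""
    angles = np.arctan2(flow[..., 1], flow[..., 0]).ravel()
    mags = np.linalg.norm(flow, axis=-1).ravel()
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi),
                           weights=mags)
    return hist / (hist.sum() + 1e-9)

rng = np.random.default_rng(1)
# Normal frames: flow mostly points right; abnormal frame: flow is reversed.
normal = [rng.normal([1.0, 0.0], 0.1, size=(16, 16, 2)) for _ in range(50)]
abnormal = rng.normal([-1.0, 0.0], 0.1, size=(16, 16, 2))

# Learn the "normal" region of descriptor space from normal frames only.
X_train = np.array([flow_orientation_histogram(f) for f in normal])
model = OneClassSVM(nu=0.1, gamma="scale").fit(X_train)

print(model.predict([flow_orientation_histogram(abnormal)]))  # [-1] = abnormal
```

Frames whose descriptors fall outside the learned region are flagged as abnormal events.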

  1. Context-aware event detection smartphone application for first responders

    NASA Astrophysics Data System (ADS)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms like Twitter, Facebook, etc., has provided seamless sharing of information (as chat, video, and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information about events happening in their immediate vicinity in real time. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). Further, when appropriately employed, this real-time data can support the detection of localized events like fires, accidents, and shootings as they unfold, and pinpoint individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that can provide an augmented reality view of the detected events in a given (localized) geographical location and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that deals with data challenges such as identifying and aggregating important events, analyzing and correlating the events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.

  2. Automatic Prosodic Event Detection Using Acoustic, Lexical, and Syntactic Evidence

    PubMed Central

    Ananthakrishnan, Sankaranarayanan; Narayanan, Shrikanth S.

    2008-01-01

    With the advent of prosody annotation standards such as tones and break indices (ToBI), speech technologists and linguists alike have been interested in automatically detecting prosodic events in speech. This is because the prosodic tier provides an additional layer of information over the short-term segment-level features and lexical representation of an utterance. As the prosody of an utterance is closely tied to its syntactic and semantic content in addition to its lexical content, knowledge of the prosodic events within and across utterances can assist spoken language applications such as automatic speech recognition and translation. On the other hand, corpora annotated with prosodic events are useful for building natural-sounding speech synthesizers. In this paper, we build an automatic detector and classifier for prosodic events in American English, based on their acoustic, lexical, and syntactic correlates. Following previous work in this area, we focus on accent (prominence, or “stress”) and prosodic phrase boundary detection at the syllable level. Our experiments achieved a performance rate of 86.75% agreement on the accent detection task, and 91.61% agreement on the phrase boundary detection task on the Boston University Radio News Corpus. PMID:19122857

  3. Implementation of a model based fault detection and diagnosis for actuation faults of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1992-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the space shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the space shuttle main engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  4. On Identifiability of Bias-Type Actuator-Sensor Faults in Multiple-Model-Based Fault Detection and Identification

    NASA Technical Reports Server (NTRS)

    Joshi, Suresh M.

    2012-01-01

    This paper explores a class of multiple-model-based fault detection and identification (FDI) methods for bias-type faults in actuators and sensors. These methods employ banks of Kalman-Bucy filters to detect the faults, determine the fault pattern, and estimate the fault values, wherein each Kalman-Bucy filter is tuned to a different failure pattern. Necessary and sufficient conditions are presented for identifiability of actuator faults, sensor faults, and simultaneous actuator and sensor faults. It is shown that FDI of simultaneous actuator and sensor faults is not possible using these methods when all sensors have biases.

  5. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety of urban areas, public parks, airplanes, hospitals, schools, and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture, and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. Moreover, the developed method is capable of detecting small smoking events of uncertain actions with various cigarette sizes, colors, and shapes.

  6. Aseismic events in Southern California: Detection with InSAR

    NASA Astrophysics Data System (ADS)

    Lohman, R. B.; McGuire, J. J.; Lundgren, P.

    2007-05-01

    Aseismic slow slip events are usually studied using data types that have a dense temporal sampling rate, such as continuous GPS or tremor analysis using seismic data. However, even the sparser temporal coverage of InSAR data can further our understanding of these events in three significant ways. First, in areas where aseismic transients have been detected on geodetic arrays, InSAR may be able to provide a spatially denser image of the extent and magnitude of deformation. Second, InSAR observations are complementary to GPS because of their differing sensitivities to horizontal and vertical motions. Third, in areas with no ground-based geodetic instrumentation, InSAR can be used in survey mode to detect deformation signals that are not associated with any observed seismicity. The temporal constraints on such signals may not be tight enough to allow for dynamical models of how aseismic transients occur, but InSAR-only detections can improve our understanding of the spatial extent of these types of events and can also identify key areas for future instrumentation and observation. Here, I summarize some of the contributions of InSAR observations of slow slip events, including data spanning the 2005 Obsidian Buttes swarm in the Salton Trough, CA, and InSAR time-series results for the Salton Trough using both traditional interferometry and the persistent scatterer method.

  7. A model-based approach for detection of objects in low resolution passive-millimeter wave images

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1993-01-01

    We describe a model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere; but, their spatial resolution is very low. However, additional data such as airport model and approximate position and orientation of aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  8. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogiono, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of an Integrated Vehicle Health Management (IVHM) system for space vehicles has become a great concern. Smart sensor networks are one of the promising technologies attracting a lot of attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to check whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared to simpler methods.

  9. Model-based detection of white matter in optical coherence tomography data.

    PubMed

    Gasca, Fernando; Ramrath, Lukas

    2007-01-01

    A method for white matter detection in Optical Coherence Tomography A-scans is presented. A Kalman filter is used to obtain a slope-change estimate of the intensity signal. The estimate is subsequently analyzed by a spike detection algorithm and then evaluated by a neural network binary classifier (perceptron). The capability of the proposed method is shown through the quantitative evaluation of simulated A-scans. The method was also applied to data obtained from a rat's brain in vitro. Results show that the developed algorithm identifies fewer false positives than two other spike detection methods, thus enhancing the robustness and quality of detection.
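The Kalman-filter slope estimation step can be sketched with a standard constant-velocity (level + slope) filter on a synthetic A-scan; the noise parameters, threshold, and signal shape are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def kalman_slope(signal, q=1e-2, r=0.5):
    """Track (level, slope) of a 1-D signal with a constant-velocity Kalman filter."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (level += slope)
    H = np.array([[1.0, 0.0]])               # only the level is observed
    Q, R = q * np.eye(2), np.array([[r]])    # process / measurement noise
    x, P = np.array([[signal[0]], [0.0]]), np.eye(2)
    slopes = []
    for z in signal:
        x, P = F @ x, F @ P @ F.T + Q                    # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)            # update
        P = (np.eye(2) - K @ H) @ P
        slopes.append(x[1, 0])
    return np.array(slopes)

# Synthetic A-scan: flat intensity followed by a steep rise (a boundary).
signal = np.concatenate([np.full(50, 1.0), 1.0 + 0.5 * np.arange(50)])
slopes = kalman_slope(signal)

# A simple spike detector on the slope estimate marks the boundary region.
boundary = int(np.argmax(slopes > 0.25))
print(boundary)
```

In the paper this slope estimate feeds a spike detector and then a perceptron classifier; here a fixed threshold stands in for both.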

  10. Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes

    NASA Astrophysics Data System (ADS)

    Rosenfeld, Wenjamin; Burchardt, Daniel; Garthoff, Robert; Redeker, Kai; Ortegel, Norbert; Rau, Markus; Weinfurter, Harald

    2017-07-01

    An experimental test of Bell's inequality allows ruling out any local-realistic description of nature by measuring correlations between distant systems. While such tests are conceptually simple, there are strict requirements concerning the detection efficiency of the involved measurements, as well as the enforcement of spacelike separation between the measurement events. Only very recently could both loopholes be closed simultaneously. Here we present a statistically significant, event-ready Bell test based on combining heralded entanglement of atoms separated by 398 m with fast and efficient measurements of the atomic spin states closing essential loopholes. We obtain a violation with S =2.221 ±0.033 (compared to the maximal value of 2 achievable with models based on local hidden variables) which allows us to refute the hypothesis of local realism with a significance level P <2.57 ×10-9.
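The quoted S value can be put in context by computing the ideal quantum-mechanical CHSH value for a singlet state, which upper-bounds the experimentally observed 2.221 and exceeds the local-realistic bound of 2:

```python
import numpy as np

def correlation(theta_a, theta_b):
    """Singlet-state correlation E(a, b) = -cos(theta_a - theta_b)."""
    return -np.cos(theta_a - theta_b)

# CHSH combination S = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| with the
# standard optimal measurement angles.
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = abs(correlation(a, b) - correlation(a, b2)
        + correlation(a2, b) + correlation(a2, b2))
print(round(S, 3))  # 2.828 = 2*sqrt(2), above the local-realistic bound of 2
```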

  11. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  12. PMU Data Event Detection: A User Guide for Power Engineers

    SciTech Connect

    Allen, A.; Singh, M.; Muljadi, E.; Santoso, S.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
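One common first-pass screen for events in PMU data is to flag samples that deviate sharply from a trailing robust baseline. The sketch below (in Python rather than the guide's MATLAB, with made-up noise levels and thresholds) illustrates the idea, not the report's actual algorithms:

```python
import numpy as np

def detect_events(freq, window=30, k=8.0):
    """Flag samples deviating more than k robust standard deviations from a
    trailing median -- a simple first-pass event screen for PMU data."""
    events = np.zeros(len(freq), dtype=bool)
    for i in range(window, len(freq)):
        ref = freq[i - window:i]
        med = np.median(ref)
        sigma = 1.4826 * np.median(np.abs(ref - med)) + 1e-9  # MAD-based std
        events[i] = abs(freq[i] - med) > k * sigma
    return events

# Synthetic 60 Hz frequency trace with a sudden dip (e.g. a generator trip).
rng = np.random.default_rng(2)
freq = 60.0 + rng.normal(0.0, 0.001, 300)
freq[150:160] -= 0.05

events = detect_events(freq)
print(np.where(events)[0][:5])
```

The dip at sample 150 stands far outside the robust noise band and is flagged immediately.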

  13. Detecting modification of biomedical events using a deep parsing approach.

    PubMed

    Mackinlay, Andrew; Martinez, David; Baldwin, Timothy

    2012-04-30

    This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
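The shallow feature extraction (a bag-of-words window around the trigger word) and Maximum Entropy classification can be sketched with scikit-learn, whose LogisticRegression is a MaxEnt model. The toy sentences, trigger positions, and labels below are invented for illustration; the real system adds deep-parser features on top:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

def window_features(tokens, trigger_index, width=3):
    """Bag-of-words drawn from a sliding window around the event trigger."""
    lo = max(0, trigger_index - width)
    hi = min(len(tokens), trigger_index + width + 1)
    return " ".join(tokens[lo:hi])

# Toy examples: (tokens, trigger position, label); 1 = negated event, 0 = not.
examples = [
    ("inhibition of IkappaBalpha phosphorylation".split(), 3, 1),
    ("no STAT3 activation was observed".split(), 2, 1),
    ("failure of MAPK signalling".split(), 3, 1),
    ("we observed strong ERK phosphorylation".split(), 4, 0),
    ("results show clear JNK activation".split(), 4, 0),
    ("robust p38 signalling was detected".split(), 2, 0),
]
docs = [window_features(toks, i) for toks, i, _ in examples]
labels = [y for _, _, y in examples]

vec = CountVectorizer().fit(docs)
clf = LogisticRegression().fit(vec.transform(docs), labels)  # MaxEnt model

test_neg = window_features("inhibition of JNK activation".split(), 3)
test_pos = window_features("we observed strong ERK activation".split(), 4)
print(clf.predict(vec.transform([test_neg, test_pos])))
```

Cue words like "inhibition" dominate the window features, so the unseen negated mention is classified as modified.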

  14. Detecting modification of biomedical events using a deep parsing approach

    PubMed Central

    2012-01-01

    Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089

  15. Adaptive Model-Based Mine Detection/Localization using Noisy Laser Doppler Vibration Measurements

    SciTech Connect

    Sullivan, E J; Xiang, N; Candy, J V

    2009-04-06

    The acoustic detection of buried mines is hampered by the fact that, at the frequencies required for useful penetration, the energy is quickly absorbed by the ground. A recent approach that avoids this problem is to excite the ground with high-level, low-frequency sound, which excites low-frequency resonances in the mine. These resonances cause a low-level vibration on the surface that can be detected by a laser Doppler vibrometer. This paper presents a method of quickly and efficiently detecting these vibrations by sensing a change in the statistics of the signal when the mine is present. Results based on real data are shown.
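Detecting "a change in the statistics of the signal" can be illustrated with a simple variance-ratio test between a background window and the most recent window; the synthetic signals, window size, and ratio threshold are assumptions for illustration, not the paper's detector:

```python
import numpy as np

def statistics_changed(x, window=200, ratio=2.0):
    """Compare the variance of the most recent window against a reference
    (background) window; a large jump signals a change in signal statistics."""
    ref = np.var(x[:window])
    recent = np.var(x[-window:])
    return recent / ref > ratio

rng = np.random.default_rng(3)
background = rng.normal(0.0, 1.0, 400)  # ground response alone
# Low-frequency resonance superimposed when a buried object responds.
resonance = rng.normal(0.0, 1.0, 400) + 2.0 * np.sin(0.3 * np.arange(400))

no_mine = statistics_changed(np.concatenate([background, background]))
mine = statistics_changed(np.concatenate([background, resonance]))
print(no_mine, mine)
```

The added resonance roughly triples the window variance, tripping the detector, while two background segments do not.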

  16. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event.

    PubMed

    Donner, Simon D; Knutson, Thomas R; Oppenheimer, Michael

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, we use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 degrees C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO(2) levels may still represent an additional long-term threat to corals.

  17. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event

    SciTech Connect

    Donner, S.D.; Knutson, T.R.; Oppenheimer, M.

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, the authors use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5°C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO2 levels may still represent an additional long-term threat to corals.

  18. Gait Event Detection during Stair Walking Using a Rate Gyroscope

    PubMed Central

    Formento, Paola Catalfamo; Acevedo, Ruben; Ghoussayni, Salim; Ewins, David

    2014-01-01

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. These applications often require detection of the initial contact (IC) of the foot with the floor and/or final contact or foot off (FO) from the floor during outdoor walking. Previous investigations have reported the use of a single gyroscope placed on the shank for detection of IC and FO on level ground and incline walking. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects ascending and descending a set of stairs. Performance was compared with a reference pressure measurement system. The absolute mean difference between the gyroscope and the reference was less than 45 ms for IC and better than 135 ms for FO for both activities. Detection success was over 93%. These results provide preliminary evidence supporting the use of a gyroscope for gait event detection when walking up and down stairs. PMID:24651724

  19. Gait event detection during stair walking using a rate gyroscope.

    PubMed

    Formento, Paola Catalfamo; Acevedo, Ruben; Ghoussayni, Salim; Ewins, David

    2014-03-19

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. These applications often require detection of the initial contact (IC) of the foot with the floor and/or final contact or foot off (FO) from the floor during outdoor walking. Previous investigations have reported the use of a single gyroscope placed on the shank for detection of IC and FO on level ground and incline walking. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects ascending and descending a set of stairs. Performance was compared with a reference pressure measurement system. The absolute mean difference between the gyroscope and the reference was less than 45 ms for IC and better than 135 ms for FO for both activities. Detection success was over 93%. These results provide preliminary evidence supporting the use of a gyroscope for gait event detection when walking up and down stairs.
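The abstract does not give the authors' detection rules, but the idea can be illustrated with a minimal sketch: treat each mid-swing positive peak in the shank angular-velocity signal as a stride anchor, and take the flanking local minima as foot off (before the peak) and initial contact (after it). The threshold, peak heuristics, and units below are illustrative assumptions, not the paper's algorithm.

```python
def detect_gait_events(gyro, fs, swing_thresh=1.0):
    """Detect initial contact (IC) and foot off (FO) event times (s)
    from shank angular velocity (rad/s), sampled at fs (Hz).
    Heuristic: IC = first local minimum after each mid-swing peak,
    FO = last local minimum before it. Values are illustrative."""
    events = {"IC": [], "FO": []}
    for i in range(1, len(gyro) - 1):
        # mid-swing peak: a local maximum above the swing threshold
        if gyro[i] > swing_thresh and gyro[i] >= gyro[i-1] and gyro[i] > gyro[i+1]:
            j = i + 1                       # walk forward to the next trough -> IC
            while j + 1 < len(gyro) and gyro[j+1] < gyro[j]:
                j += 1
            events["IC"].append(j / fs)
            k = i - 1                       # walk backward to the previous trough -> FO
            while k - 1 >= 0 and gyro[k-1] < gyro[k]:
                k -= 1
            events["FO"].append(k / fs)
    return events
```

On real stair-walking data the signal would first be low-pass filtered and the threshold tuned per subject; this sketch only shows the peak/trough logic.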

  20. Detection of transient events in the presence of background noise.

    PubMed

    Grange, Wilfried; Haas, Philippe; Wild, Andreas; Lieb, Michael Andreas; Calame, Michel; Hegner, Martin; Hecht, Bert

    2008-06-12

    We describe a method to detect and count transient burstlike signals in the presence of significant stationary noise. To discriminate a transient signal from the background noise, an optimum threshold is determined using an iterative algorithm that yields the probability distribution of the background noise. Knowledge of the probability distribution of the noise then allows the determination of the number of transient events with a quantifiable error rate (false positives). We apply the method, which does not rely on the choice of free parameters, to the detection and counting of transient single-molecule fluorescence events in the presence of strong background noise. The method will be of importance in various ultrasensitive sensing applications.
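A simplified version of the iterative-threshold idea can be sketched as follows: repeatedly estimate the background mean and standard deviation after discarding samples above the current threshold, then count contiguous supra-threshold runs as events. The convergence rule and the factor k = 3 are assumptions for illustration, not the paper's exact algorithm.

```python
import statistics

def iterative_threshold(signal, k=3.0, max_iter=50):
    """Estimate a detection threshold from the background noise by
    iteratively discarding samples above mean + k*sigma until the
    threshold stops changing. k = 3 is an illustrative choice."""
    data = list(signal)
    thresh = max(data)
    for _ in range(max_iter):
        mu = statistics.fmean(data)
        sigma = statistics.pstdev(data)
        new_thresh = mu + k * sigma
        kept = [x for x in signal if x <= new_thresh]
        if abs(new_thresh - thresh) < 1e-12:
            break
        thresh, data = new_thresh, kept
    return thresh

def count_events(signal, thresh):
    """Count contiguous runs of samples above the threshold as events."""
    n, above = 0, False
    for x in signal:
        if x > thresh and not above:
            n += 1
        above = x > thresh
    return n
```

Because the threshold is derived from the estimated noise distribution, the false-positive rate per sample is quantifiable from the assumed noise model, which is the property the abstract emphasizes.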

  1. Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully-automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses as a basis regional array processing with fixed, carefully calibrated, site-specific parameters in conjunction with improved automatic phase onset time estimation. We have in parallel developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered using the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  2. Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China

    NASA Astrophysics Data System (ADS)

    Sheng, M.; Chu, R.; Wei, Z.

    2016-12-01

    On landslides, slope movement and fracturing of the rock mass often produce microearthquakes, which are recorded as weak signals on seismographs. By monitoring these microseismic events, the temporal and spatial distribution of unstable regions, as well as the impact of external factors on them, can be understood and analyzed. The microseismic method can provide information from inside the landslide, supplementing geodetic methods for monitoring the movement of the landslide surface; compared to drilling on the landslide, it is also more economical and safe. The Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming since the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months at intervals of 200-500 meters. First, we used regional earthquakes for time correction of the seismometers, to eliminate the influence of inaccurate GPS clocks and the subsurface structure beneath the stations. Owing to the low velocity of the loose medium, the travel-time difference of microseismic events across the landslide reaches up to 5 s. Based on travel times and waveform characteristics, we identified many microseismic events and converted them into envelopes to use as templates; we then used a sliding-window cross-correlation technique on the waveform envelopes to detect further microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on their locations, most occurred at the front of the landslide, while the others were located at the back end. The bottom and top of the landslide accumulated considerable energy and deformed strongly, radiating waves that could be recorded at all stations; the bottom, with more events, appeared particularly active.
In addition, there were many smaller events in the middle part of the landslide where released
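The sliding-window envelope cross-correlation described above might look like this in outline. The envelope definition, window normalization, and detection threshold are assumptions, since the abstract does not specify them.

```python
import numpy as np

def envelope(x, win=5):
    """Crude amplitude envelope: moving average of |x|.
    (The paper's exact envelope definition is not given; assumption.)"""
    k = np.ones(win) / win
    return np.convolve(np.abs(x), k, mode="same")

def match_template(trace, template, threshold=0.8):
    """Sliding-window normalized cross-correlation between the envelope
    of a continuous trace and a template event's envelope; returns
    sample offsets where the correlation exceeds the threshold."""
    e_trace, e_tmpl = envelope(trace), envelope(template)
    e_tmpl = (e_tmpl - e_tmpl.mean()) / (e_tmpl.std() + 1e-12)
    n = len(e_tmpl)
    hits = []
    for i in range(len(e_trace) - n + 1):
        w = e_trace[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        cc = float(np.dot(w, e_tmpl)) / n     # correlation coefficient in [-1, 1]
        if cc > threshold:
            hits.append(i)
    return hits
```

Correlating envelopes rather than raw waveforms makes the detector tolerant of small waveform differences between repeating events, which suits the emergent, scattered signals on a landslide.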

  3. A Framework for Automated Spine and Vertebrae Interpolation-Based Detection and Model-Based Segmentation.

    PubMed

    Korez, Robert; Ibragimov, Bulat; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2015-08-01

    Automated and semi-automated detection and segmentation of spinal and vertebral structures from computed tomography (CT) images is a challenging task due to a relatively high degree of anatomical complexity, presence of unclear boundaries and articulation of vertebrae with each other, as well as due to insufficient image spatial resolution, partial volume effects, presence of image artifacts, intensity variations and low signal-to-noise ratio. In this paper, we describe a novel framework for automated spine and vertebrae detection and segmentation from 3-D CT images. A novel optimization technique based on interpolation theory is applied to detect the location of the whole spine in the 3-D image and, using the obtained location of the whole spine, to further detect the location of individual vertebrae within the spinal column. The obtained vertebra detection results represent a robust and accurate initialization for the subsequent segmentation of individual vertebrae, which is performed by an improved shape-constrained deformable model approach. The framework was evaluated on two publicly available CT spine image databases of 50 lumbar and 170 thoracolumbar vertebrae. Quantitative comparison against corresponding reference vertebra segmentations yielded an overall mean centroid-to-centroid distance of 1.1 mm and Dice coefficient of 83.6% for vertebra detection, and an overall mean symmetric surface distance of 0.3 mm and Dice coefficient of 94.6% for vertebra segmentation. The results indicate that by applying the proposed automated detection and segmentation framework, vertebrae can be successfully detected and accurately segmented in 3-D from CT spine images.

  4. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.
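One plausible shape for an LDA-based scorer (a sketch, not the paper's system): fit a topic model on tokenized audit logs from normal traffic, then flag logs whose topic mixture sits far from the normal baseline. The log tokens, topic count, and distance measure below are all hypothetical illustrations.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical audit-log "documents": each is a sequence of event tokens.
# A real system would tokenize parsed network-log fields instead.
normal_logs = [
    "login read read logout",
    "login read write read logout",
    "login read read read logout",
] * 10
suspect_log = "login upload upload upload upload external"

vec = CountVectorizer(token_pattern=r"\S+")
X = vec.fit_transform(normal_logs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
baseline = lda.transform(X).mean(axis=0)          # average normal topic mixture

def topic_score(doc):
    """Anomaly score: L1 distance between a log's topic mixture and the
    normal baseline (0 = typical, 2 = maximally unusual)."""
    theta = lda.transform(vec.transform([doc]))[0]
    return float(np.abs(theta - baseline).sum())

scores = [topic_score(d) for d in normal_logs[:3]] + [topic_score(suspect_log)]
```

Because the score is continuous rather than a rule match, it supports the risk-based, evolving deployment the abstract describes.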

  5. Recognition and defect detection of dot-matrix text via variation-model based learning

    NASA Astrophysics Data System (ADS)

    Ohyama, Wataru; Suzuki, Koushi; Wakabayashi, Tetsushi

    2017-03-01

    An algorithm for recognition and defect detection of dot-matrix text printed on products is proposed. Extraction and recognition of dot-matrix text involve several difficulties not present in standard camera-based OCR: the appearance of dot-matrix characters is corrupted and broken by illumination, complex background texture, and other standard characters printed on product packages. We propose a dot-matrix text extraction and recognition method which does not require any user interaction. The method employs detected corner-point locations and classification scores. Evaluation on 250 images shows that recall and precision of extraction are 78.60% and 76.03%, respectively. Recognition accuracy on correctly extracted characters is 94.43%. Detecting printing defects in dot-matrix text is also important on the production line, to avoid releasing defectively printed products. We also propose a detection method for printing defects in dot-matrix characters. The method constructs a feature vector whose elements are the classification scores of each character class and employs a support vector machine to classify four types of printing defect. The detection accuracy of the proposed method is 96.68%.

  6. ARX model-based gearbox fault detection and localization under varying load conditions

    NASA Astrophysics Data System (ADS)

    Yang, Ming; Makis, Viliam

    2010-11-01

    The development of the fault detection schemes for gearbox systems has received considerable attention in recent years. Both time series modeling and feature extraction based on wavelet methods have been considered, mostly under constant load. Constant load assumption implies that changes in vibration data are caused only by deterioration of the gearbox. However, most real gearbox systems operate under varying load and speed which affect the vibration signature of the system and in general make it difficult to recognize the occurrence of an impending fault. This paper presents a novel approach to detect and localize the gear failure occurrence for a gearbox operating under varying load conditions. First, residual signal is calculated using an autoregressive model with exogenous variables (ARX) fitted to the time-synchronously averaged (TSA) vibration data and filtered TSA envelopes when the gearbox operated under various load conditions in the healthy state. The gear of interest is divided into several sections so that each section includes the same number of adjacent teeth. Then, the fault detection and localization indicator is calculated by applying F-test to the residual signal of the ARX model. The proposed fault detection scheme indicates not only when the gear fault occurs, but also in which section of the gear. Finally, the performance of the fault detection scheme is checked using full lifetime vibration data obtained from the gearbox operating from a new condition to a breakdown under varying load.
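The residual-based indicator described above can be sketched as follows. For brevity this uses a first-order ARX model and a plain residual-variance ratio in place of the paper's model order, TSA preprocessing, and formal F-test; the simulated signals and fault spikes are illustrative.

```python
import numpy as np

def fit_arx(y, u):
    """Least-squares ARX(1,1) fit: y[t] = a*y[t-1] + b*u[t-1] + e[t]."""
    X = np.c_[y[:-1], u[:-1]]
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta

def residual_var(y, u, theta):
    """Variance of the one-step-ahead prediction residuals."""
    e = y[1:] - np.c_[y[:-1], u[:-1]] @ theta
    return float(e.var())

# Simulated vibration data under varying load (seeded for reproducibility)
rng = np.random.default_rng(0)
u = rng.normal(size=600)                      # exogenous load signal
y = np.zeros(600)
for t in range(1, 600):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1] + 0.01 * rng.normal()

theta = fit_arx(y[:300], u[:300])             # train on healthy baseline
base = residual_var(y[:300], u[:300], theta)

# Indicator per gear section: residual-variance ratio against baseline
ratio_healthy = residual_var(y[300:450], u[300:450], theta) / base
y_fault = y[450:].copy()
y_fault[::25] += 0.5                          # injected tooth-impact spikes
ratio_fault = residual_var(y_fault, u[450:], theta) / base
```

Computing the ratio separately for residual segments aligned with groups of adjacent teeth is what localizes the fault to a gear section, as the abstract describes.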

  7. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    PubMed

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers.

  8. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    PubMed Central

    Seeja, K. R.; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers. PMID:25302317
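The per-customer pattern mining and matching step might be sketched like this. Brute-force enumeration stands in for a real Apriori/FP-growth miner, and the toy attribute values are hypothetical; the matching rule (fraction of pattern itemsets contained in the incoming transaction) is one plausible reading of the abstract.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support=0.5, max_len=2):
    """Enumerate attribute-value itemsets appearing in at least
    min_support of a customer's transactions (brute force; a real
    system would use Apriori or FP-growth)."""
    counts = Counter()
    for t in transactions:
        items = sorted(t.items())
        for r in range(1, max_len + 1):
            for combo in combinations(items, r):
                counts[combo] += 1
    n = len(transactions)
    return {s for s, c in counts.items() if c / n >= min_support}

def matching_score(transaction, itemsets):
    """Fraction of pattern itemsets contained in the transaction."""
    if not itemsets:
        return 0.0
    items = set(transaction.items())
    return sum(set(s) <= items for s in itemsets) / len(itemsets)

def classify(transaction, legal_sets, fraud_sets):
    """Label by whichever pattern set the transaction matches better
    (ties default to legal)."""
    return ("fraud" if matching_score(transaction, fraud_sets)
            > matching_score(transaction, legal_sets) else "legal")
```

Treating every attribute equally, as the loop over all attribute-value pairs does, mirrors the abstract's handling of anonymized columns.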

  9. Moving object detection using a background modeling based on entropy theory and quad-tree decomposition

    NASA Astrophysics Data System (ADS)

    Elharrouss, Omar; Moujahid, Driss; Elkah, Samah; Tairi, Hamid

    2016-11-01

    An algorithm for moving object detection using a background subtraction approach is proposed. We generate the background model by combining quad-tree decomposition with entropy theory. In general, many background subtraction approaches are sensitive to sudden illumination changes in the scene and cannot update the background image accordingly. The proposed background modeling approach addresses the illumination change problem. After performing background subtraction based on the proposed background model, moving targets can be accurately detected in each frame of the image sequence. To achieve high accuracy in motion detection, the binary motion mask is computed by the proposed threshold function. Experimental analysis based on statistical measurements demonstrates the efficiency of the proposed method in terms of quality and quantity; it also substantially outperforms existing methods in perceptual evaluation.
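For orientation, the background-subtraction pipeline can be reduced to two steps: maintain a background estimate and threshold the per-pixel deviation into a binary motion mask. The running-average model and global threshold below are a common baseline standing in for the paper's quad-tree/entropy model and its threshold function.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background update (baseline stand-in for the
    paper's quad-tree + entropy background model)."""
    return (1 - alpha) * bg + alpha * frame

def motion_mask(bg, frame, k=2.5):
    """Binary motion mask: pixels whose absolute deviation from the
    background exceeds k times the mean deviation of the frame."""
    diff = np.abs(frame - bg)
    thresh = k * diff.mean() + 1e-9
    return diff > thresh
```

The incremental update is what lets the model track gradual illumination change; the paper's contribution is making that update robust to sudden change as well.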

  10. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    SciTech Connect

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-01

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  11. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings

    SciTech Connect

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-26

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  12. Model-Based Design of Tree WSNs for Decentralized Detection

    PubMed Central

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  13. Spatial event cluster detection using an approximate normal distribution.

    PubMed

    Torabi, Mahmoud; Rosychuk, Rhonda J

    2008-12-12

    In geographic surveillance of disease, areas with large numbers of disease cases are to be identified so that investigations of the causes of high disease rates can be pursued. Areas with high rates are called disease clusters and statistical cluster detection tests are used to identify geographic areas with higher disease rates than expected by chance alone. Typically cluster detection tests are applied to incident or prevalent cases of disease, but surveillance of disease-related events, where an individual may have multiple events, may also be of interest. Previously, a compound Poisson approach that detects clusters of events by testing individual areas that may be combined with their neighbours has been proposed. However, the relevant probabilities from the compound Poisson distribution are obtained from a recursion relation that can be cumbersome if the number of events are large or analyses by strata are performed. We propose a simpler approach that uses an approximate normal distribution. This method is very easy to implement and is applicable to situations where the population sizes are large and the population distribution by important strata may differ by area. We demonstrate the approach on pediatric self-inflicted injury presentations to emergency departments and compare the results for probabilities based on the recursion and the normal approach. We also implement a Monte Carlo simulation to study the performance of the proposed approach. In a self-inflicted injury data example, the normal approach identifies twelve out of thirteen of the same clusters as the compound Poisson approach, noting that the compound Poisson method detects twelve significant clusters in total. Through simulation studies, the normal approach well approximates the compound Poisson approach for a variety of different population sizes and case and event thresholds. A drawback of the compound Poisson approach is that the relevant probabilities must be determined through a
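The computational appeal of the normal approximation is that an area's excess-event test reduces to a closed-form tail probability instead of a recursion. A sketch (the variance inflation by a single dispersion factor is an illustrative simplification of the compound-Poisson variance, not the paper's exact formula):

```python
from math import sqrt, erf

def area_pvalue(observed, population, rate, dispersion=1.0):
    """One-sided p-value for an area's event count under a normal
    approximation: mean = population * rate; the variance is inflated
    by a dispersion factor to mimic individuals contributing multiple
    events (compound-Poisson behaviour). Illustrative only."""
    mean = population * rate
    var = dispersion * mean
    z = (observed - mean) / sqrt(var)
    return 0.5 * (1 - erf(z / sqrt(2)))   # upper-tail normal probability
```

Because only a mean and variance per area (or per stratum) are needed, the test scales easily to stratified analyses where the recursion becomes cumbersome.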

  14. Application of Kalman Filtering Techniques for Microseismic Event Detection

    NASA Astrophysics Data System (ADS)

    Baziw, E.; Weir-Jones, I.

    Microseismic monitoring systems are generally installed in areas of induced seismicity caused by human activity. Induced seismicity results from changes in the state of stress which may occur as a result of excavation within the rock mass in mining (i.e., rockbursts), and changes in hydrostatic pressures and rock temperatures (e.g., during fluid injection or extraction) in oil exploitation, dam construction or fluid disposal. Microseismic monitoring systems determine event locations and important source parameters such as attenuation, seismic moment, source radius, static stress drop, peak particle velocity and seismic energy. An essential part of the operation of a microseismic monitoring system is the reliable detection of microseismic events. In the absence of reliable, automated picking techniques, operators rely upon manual picking. This is time-consuming, costly and, in the presence of background noise, very prone to error. The techniques described in this paper not only permit the reliable identification of events in cluttered signal environments, but have also enabled the authors to develop reliable automated event-picking procedures. This opens the way to use microseismic monitoring as a cost-effective production/operations procedure. It has been the experience of the authors that in certain noisy environments, the seismic monitoring system may trigger on and subsequently acquire substantial quantities of erroneous data, due to the high energy content of the ambient noise. Digital filtering techniques need to be applied to the microseismic data so that the ambient noise is removed and event detection simplified. The monitoring of seismic acoustic emissions is a continuous, real-time process, so it is desirable to implement digital filters that can be designed in the time domain and run in real time, such as the Kalman filter. This paper presents a real-time Kalman filter which removes the statistically describable background noise from the recorded
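The core of a real-time Kalman denoiser is a short predict/update loop per sample. A minimal scalar sketch, treating the underlying signal as a random walk observed in stationary noise (the process and measurement variances q and r are illustrative tuning values, not the paper's):

```python
def kalman_denoise(z, q=1e-4, r=1e-2):
    """Scalar Kalman filter: state x is the noise-free signal modeled
    as a random walk with process variance q, observed with
    measurement-noise variance r. Runs sample-by-sample (real time)."""
    x, p = z[0], 1.0          # initial state estimate and its variance
    out = []
    for meas in z:
        p += q                # predict: random-walk uncertainty grows
        k = p / (p + r)       # Kalman gain
        x += k * (meas - x)   # update with the new measurement
        p *= (1 - k)
        out.append(x)
    return out
```

A production filter for seismic traces would use a vector state (e.g., signal plus its derivative) and noise statistics estimated from pre-event windows, but the recursion has the same shape.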

  15. Automatic Detection of Student Mental Models Based on Natural Language Student Input during Metacognitive Skill Training

    ERIC Educational Resources Information Center

    Lintean, Mihai; Rus, Vasile; Azevedo, Roger

    2012-01-01

    This article describes the problem of detecting the student mental models, i.e. students' knowledge states, during the self-regulatory activity of prior knowledge activation in MetaTutor, an intelligent tutoring system that teaches students self-regulation skills while learning complex science topics. The article presents several approaches to…

  17. Integrating event detection system operation characteristics into sensor placement optimization.

    SciTech Connect

    Hart, William Eugene; McKenna, Sean Andrew; Phillips, Cynthia Ann; Murray, Regan Elizabeth; Hart, David Blaine

    2010-05-01

    We consider the problem of placing sensors in a municipal water network when we can choose both the location of sensors and the sensitivity and specificity of the contamination warning system. Sensor stations in a municipal water distribution network continuously send sensor output information to a centralized computing facility, and event detection systems at the control center determine when to signal an anomaly worthy of response. Although most sensor placement research has assumed perfect anomaly detection, signal analysis software has parameters that control the tradeoff between false alarms and false negatives. We describe a nonlinear sensor placement formulation, which we heuristically optimize with a linear approximation that can be solved as a mixed-integer linear program. We report the results of initial experiments on a real network and discuss tradeoffs between early detection of contamination incidents, and control of false alarms.
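The paper solves a mixed-integer linear program; as a point of comparison, the interaction between placement and imperfect detection can be illustrated with a greedy heuristic, where each sensor fires with a sensitivity p (the knob that trades false negatives against false alarms). The coverage data and p are hypothetical.

```python
def expected_detections(chosen, coverage, p):
    """Expected number of contamination scenarios detected, assuming
    each covering sensor fires independently with sensitivity p."""
    scenarios = set().union(*coverage.values())
    total = 0.0
    for e in scenarios:
        k = sum(e in coverage[s] for s in chosen)
        total += 1.0 - (1.0 - p) ** k
    return total

def greedy_placement(coverage, budget, p=0.9):
    """Greedy heuristic (not the paper's MILP): coverage[s] is the set
    of scenarios observable from candidate location s; pick the
    location with the largest marginal gain, budget times."""
    chosen = []
    for _ in range(budget):
        remaining = sorted(s for s in coverage if s not in chosen)
        best = max(remaining,
                   key=lambda s: expected_detections(chosen + [s], coverage, p))
        chosen.append(best)
    return chosen
```

Lowering p models a detection system tuned for fewer false alarms; the optimizer then naturally favors redundant coverage, which is the coupling the abstract highlights.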

  18. Machine learning for the automatic detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Fisher, Wendy D.

    In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insights to our real-world problems, and evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, results consistently show a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising. The best performance is achieved with the Haar wavelets. We achieve up to 97
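The multivariate Gaussian model mentioned in Chapter 3 reduces to fitting a mean and covariance on normal data and thresholding the Mahalanobis distance of new samples. A self-contained sketch on synthetic 2-D features (the data, threshold percentile, and dimensionality are illustrative):

```python
import numpy as np

def fit_gaussian(X):
    """Fit a multivariate Gaussian to normal-operation feature vectors."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, cov_inv

def anomaly_score(x, mu, cov_inv):
    """Squared Mahalanobis distance; large values indicate anomalies."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Seeded synthetic "normal" sensor features (2-D for illustration)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
mu, cov_inv = fit_gaussian(X)
train_scores = [anomaly_score(x, mu, cov_inv) for x in X]
threshold = float(np.percentile(train_scores, 99))   # flag top 1% as anomalous
```

In the dissertation's workflow the feature vectors would come from denoised (e.g., wavelet-filtered) seismic or spectrum measurements rather than raw samples.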

  19. Detection of red tide events in the Ariake Sound, Japan

    NASA Astrophysics Data System (ADS)

    Ishizaka, Joji

    2003-05-01

    High resolution SeaWiFS data was used to detect a red tide event occurred in the Ariake Sound, Japan, in winter of 2000 to 2001. The area is small embayment surrounding by tidal flat, and it is known as one of the most productive areas in coast of Japan. The red tide event damaged to seaweed (Nori) culture, and the relation to the reclamation at the Isahaya Bay in the Sound has been discussed. SeaWiFS chlorophyll data showed the red tide started early December 2000, from the Isahaya Bay, although direct relationship to the reclamation was not clear. The red tide persisted to the end of February. Monthly average of SeaWiFS data from May 1998 to December 2001 indicated that the chlorophyll increased twice a year, early summer and fall after the rain. The red tide event was part of the fall bloom which started later and continued longer than other years. Ocean color is useful to detect the red tide; however, it is required to improve the algorithms to accurately estimate chlorophyll in high turbid water and to discriminate toxic flagellates.

  20. Model Based Determination of Detection Limits for Proton Transfer Reaction Mass Spectrometer

    NASA Astrophysics Data System (ADS)

    Amann, Anton; Schwarz, Konrad; Wimmer, Gejza; Witkovský, Viktor

    2010-01-01

    Proton Transfer Reaction Mass Spectrometry (PTR-MS) is a chemical ionization mass spectrometric technique which allows to measure trace gases as, for example, in exhaled human breath. The quantification of compounds at low concentrations is desirable for medical diagnostics. Typically, an increase of measuring accuracy can be achieved if the duration of the measuring process is extended. For real time measurements the time windows for measurement are relatively short, in order to get a good time resolution (e.g. with breath-to-breath resolution during exercise on a stationary bicycle). Determination of statistical detection limits is typically based on calibration measurements, but this approach is limited, especially for very low concentrations. To overcome this problem, a calculation of limit of quantification (LOQ) and limit of detection (LOD), respectively, based on a theoretical model of the measurement process is outlined.
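For context, the conventional calibration-based limits that the paper's model-based approach is meant to replace are usually computed from the blank-signal scatter and the calibration slope (the 3.3 and 10 factors are the widely used ICH-style conventions, not values from this paper):

```python
def detection_limits(sigma_blank, slope):
    """Conventional calibration-based limits:
    LOD = 3.3 * sigma_blank / slope, LOQ = 10 * sigma_blank / slope.
    sigma_blank: standard deviation of the blank signal;
    slope: calibration sensitivity (signal per unit concentration)."""
    lod = 3.3 * sigma_blank / slope
    loq = 10.0 * sigma_blank / slope
    return lod, loq
```

These formulas break down at very low concentrations and short integration times, which is exactly the regime where a model of the PTR-MS counting statistics becomes preferable.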

  1. A travel time forecasting model based on change-point detection method

    NASA Astrophysics Data System (ADS)

    LI, Shupeng; GUANG, Xiaoping; QIAN, Yongsheng; ZENG, Junwei

    2017-06-01

    Travel time parameters obtained from road traffic sensor data play an important role in traffic management practice. In this paper, a travel time forecasting model based on a change-point detection method is proposed for urban road traffic sensor data. A first-order differencing operation is used to preprocess the actual loop data; a change-point detection algorithm is designed to classify the long sequence of travel time data into several patterns; a travel time forecasting model is then established based on the autoregressive integrated moving average (ARIMA) model. In computer simulations, different control parameters are chosen for the adaptive change-point search over the travel time series, which is divided into several sections of similar state. A linear weight function is then used to fit the travel time sequence and to forecast travel time. The results show that the model has high accuracy in travel time forecasting.
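    The pipeline described above can be sketched minimally: first-order differencing, a simple threshold-based change-point search on the differenced series, and a naive one-step forecast fitted to the last homogeneous segment. The real paper fits a full ARIMA model; the AR(1) fit, threshold, and travel-time values here are illustrative stand-ins.

```python
def change_points(series, threshold):
    """Indices where the first-order difference jumps by more than threshold."""
    diffs = [series[i + 1] - series[i] for i in range(len(series) - 1)]
    return [i + 1 for i, d in enumerate(diffs) if abs(d) > threshold]

def ar1_forecast(segment):
    """Fit x[t] = a * x[t-1] by least squares and forecast one step ahead."""
    xs, ys = segment[:-1], segment[1:]
    a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return a * segment[-1]

travel_times = [60, 61, 59, 60, 90, 92, 91, 93]  # hypothetical travel times (s)
cps = change_points(travel_times, threshold=10)  # detects the congestion onset
last_segment = travel_times[cps[-1]:] if cps else travel_times
forecast = ar1_forecast(last_segment)            # forecast within current regime
```

    Splitting the series at change points before fitting is what keeps the forecaster from averaging across incompatible traffic states.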

  2. High Probabilities of Planet Detection during Microlensing Events.

    NASA Astrophysics Data System (ADS)

    Peale, S. J.

    2000-10-01

    The averaged probability of detecting a planetary companion of a lensing star during a gravitational microlensing event toward the Galactic center, when the planet-lens mass ratio is 0.001, is shown to have a maximum exceeding 20% for a distribution of source-lens impact parameters determined by the efficiency of event detection, and a maximum exceeding 10% for a uniform distribution of impact parameters. The probability varies as the square root of the planet-lens mass ratio. A planet is assumed detectable if the perturbation of the light curve exceeds 2/(S/N) for a significant number of data points, where S/N is the signal-to-noise ratio for the photometry of the source. The probability peaks at a planetary semimajor axis a that is close to the mean Einstein ring radius of the lenses, about 2 AU along the line of sight, and remains significant for 0.6 <= a <= 10 AU. The low value of the mean Einstein ring radius results from the dominance of M stars in the mass function of the lenses. The probability is averaged over the distribution of the projected position of the planet onto the lens plane, over the lens mass function, over the distribution of impact parameters, over the distribution of lenses along the line of sight to the source star, over the I band luminosity function of the sources adjusted for the source distance, and over the source distribution along the line of sight. If two or more parameters of the lensing event are known, such as the I magnitude of the source and the impact parameter, the averages over these parameters can be omitted and the probability of detection determined for that particular event. The calculated probabilities behave as expected with variations in the line of sight, the mass function of the lenses, the extinction and distance to and magnitude of the source, and with a more demanding detection criterion. The relatively high values of the probabilities are robust to plausible variations in the assumptions.
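    The square-root scaling stated in the abstract makes it a one-liner to extrapolate from the quoted anchor point (roughly 20% at mass ratio q = 0.001, for the efficiency-weighted impact-parameter distribution) to other mass ratios. The anchor values are taken from the abstract; the extrapolation itself is just the stated scaling law.

```python
import math

def detection_probability(q, p_ref=0.20, q_ref=1e-3):
    """Detection probability scaled as sqrt(q) from a reference point."""
    return p_ref * math.sqrt(q / q_ref)

p_jupiter_like = detection_probability(1e-3)   # the quoted ~20% maximum
p_ten_x_lighter = detection_probability(1e-4)  # sqrt(0.1) * 0.20, about 6.3%
```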

  3. Automated adverse event detection collaborative: electronic adverse event identification, classification, and corrective actions across academic pediatric institutions.

    PubMed

    Stockwell, David C; Kirkendall, Eric; Muething, Stephen E; Kloppenborg, Elizabeth; Vinodrao, Hima; Jacobs, Brian R

    2013-12-01

    Historically, the gold standard for detecting medical errors has been the voluntary incident reporting system. Voluntary reporting rates significantly underestimate the number of actual adverse events in any given organization. The electronic health record (EHR) contains clinical and administrative data that may indicate the occurrence of an adverse event and can be used to detect adverse events that may otherwise remain unrecognized. Automated adverse event detection has been shown to be efficient and cost effective in the hospital setting. The Automated Adverse Event Detection Collaborative (AAEDC) is a group of academic pediatric organizations working to identify optimal electronic methods of adverse event detection. The Collaborative seeks to aggregate and analyze data around adverse events as well as identify and share specific intervention strategies to reduce the rate of such events, ultimately to deliver higher quality and safer care. The objective of this study is to describe the process of automated adverse event detection, report early results from the Collaborative, identify commonalities and notable differences between 2 organizations, and suggest future directions for the Collaborative. In this retrospective observational study, the implementation and use of an automated adverse event detection system was compared between 2 academic children's hospitals participating in the AAEDC: Children's National Medical Center and Cincinnati Children's Hospital Medical Center. Both organizations use the EHR to identify potential adverse events as designated by specific electronic data triggers. After gathering the electronic data, a clinical investigator at each hospital manually examined the patient record to determine whether an adverse event had occurred, whether the event was preventable, and the level of harm involved. The Automated Adverse Event Detection Collaborative data from the 2 organizations between July 2006 and October 2010 were analyzed.

  4. Towards perception awareness: Perceptual event detection for Brain computer interfaces.

    PubMed

    Nejati, Hossein; Tsourides, Kleovoulos; Pomponiu, Victor; Ehrenberg, Evan C; Ngai-Man Cheung; Sinha, Pawan

    2015-08-01

    Brain computer interface (BCI) technology is becoming increasingly popular in many domains such as entertainment, mental state analysis, and rehabilitation. For robust performance in these domains, detecting perceptual events would be a vital ability, enabling the system to adapt to and act on the basis of the user's perception of the environment. Here we present a framework to automatically mine the spatiotemporal characteristics of a given perceptual event. As this "signature" is derived directly from the subject's neural behavior, it can serve as a representation of the subject's perception of the targeted scenario, which in turn allows a BCI system to gain a new level of context awareness: perception awareness. As a proof of concept, we show the application of the proposed framework to MEG signal recordings from a face perception study, and present the resulting temporal and spatial characteristics of the derived neural signature, as well as its compatibility with the neuroscientific literature on face perception.

  5. Complex Event Detection in Pedestrian Groups from UAVs

    NASA Astrophysics Data System (ADS)

    Burkert, F.; Butenuth, M.

    2012-07-01

    We present a new hierarchical event detection approach for highly complex scenarios in pedestrian groups on the basis of airborne image sequences from UAVs. Related work on event detection for pedestrians is capable of learning and analyzing recurring motion paths to detect abnormal paths and of analyzing the type of motion interaction between pairs of pedestrians. However, these approaches can only describe basic motion and fail at the analysis of pedestrian groups with complex behavior. We overcome the limitations of the related work by using a dynamic pedestrian graph of a scene which contains basic pairwise pedestrian motion interaction labels in the first layer. In the second layer, pedestrian groups are analyzed based on the dynamic pedestrian graph in order to get higher-level information about group behavior. This is done by a heuristic assignment of predefined scenarios out of a model library to the data. The assignment is based on the motion interaction labels, on dynamic group motion parameters and on a set of subgraph features. Experimental results are shown based on a new UAV dataset which contains group motion of different complexity levels.

  6. Endmember detection in marine environment with oil spill event

    NASA Astrophysics Data System (ADS)

    Andreou, Charoula; Karathanassi, Vassilia

    2011-11-01

    Oil spill events are a crucial environmental issue. Detection of oil spills is important for both oil exploration and environmental protection. In this paper, hyperspectral remote sensing is investigated for the detection of oil spills and the discrimination of different oil types. Spectral signatures of different oil types are very useful, since they may serve as endmembers in unmixing and classification models. Towards this direction, an oil spectral library was compiled from spectral measurements of artificial oil spills as well as of look-alikes in a marine environment. Samples of four different oil types were used: two crude oils, one marine residual fuel oil, and one light petroleum product. Look-alikes comprise sea water, river discharges, shallow water, and water with algae. Spectral measurements were acquired with the GER1500 spectro-radiometer. Moreover, the oil and look-alike spectral signatures were examined to determine whether they can serve as endmembers; this was accomplished by testing their linear independence. After that, synthetic hyperspectral images based on the oil spectral library were created. Several simplex-based endmember algorithms, such as the sequential maximum angle convex cone (SMACC), vertex component analysis (VCA), N-finder algorithm (N-FINDR), and automatic target generation process (ATGP), were applied to the synthetic images in order to evaluate their effectiveness in detecting oil spill events arising from different oil types. Results showed that different types of oil spills with various thicknesses can be extracted as endmembers.

  7. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decreasing gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations to the radiative transfer equation are not optimal for this problem. To address this gastric cancer specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. Radiosity theory, integrated with the diffusion equation to form the hybrid light transport model, is used to describe light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem that incorporates an l1 norm based regularization term to reveal the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital mouse based simulation, with a reconstruction error of less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The primary result demonstrates the ability of the novel BLT reconstruction algorithm to detect early gastric cancer.

  8. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
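    The decision logic described above can be sketched as a residual test: a physics-based model predicts the expected behavior, the residual between measurement and prediction is normalized by the measurement-noise standard deviation, and simple rules discriminate measurement noise from anomalous loss. The thresholds, dB values, and three-way classification below are hypothetical illustrations, not the deployed system's rules.

```python
def classify_span(measured_loss_db, predicted_loss_db, noise_sigma_db,
                  noise_limit=3.0, anomaly_limit=6.0):
    """Compare measured optical span loss against a physics-model prediction."""
    z = (measured_loss_db - predicted_loss_db) / noise_sigma_db
    if abs(z) < noise_limit:
        return "normal"            # within expected measurement noise
    if z >= anomaly_limit:
        return "anomalous loss"    # far more loss than the physics predicts
    return "investigate"           # in between: flag for closer inspection

assert classify_span(10.2, 10.0, 0.1) == "normal"
assert classify_span(11.0, 10.0, 0.1) == "anomalous loss"
```

    The value of the physics model is that the "predicted" baseline tracks legitimate dynamic changes (e.g. Raman pump adjustments), so the residual stays small except under genuine faults.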

  9. Model-based detection of heart rate turbulence using mean shape information.

    PubMed

    Smith, Danny; Solem, Kristian; Laguna, Pablo; Martínez, Juan Pablo; Sörnmo, Leif

    2010-02-01

    A generalized likelihood ratio test (GLRT) statistic is proposed for detection of heart rate turbulence (HRT), where a set of Karhunen-Loève basis functions models HRT. The detector structure is based on the extended integral pulse frequency modulation model that accounts for the presence of ectopic beats and HRT. This new test statistic takes a priori information regarding HRT shape into account, whereas our previously presented GLRT detector relied solely on the energy contained in the signal subspace. The spectral relationship between heart rate variability (HRV) and HRT is investigated for the purpose of modeling the HRV "noise" present during the turbulence period; the results suggest that the white noise assumption is feasible to pursue. The performance was studied for both simulated and real data, with results showing that the new GLRT detector is superior to the original one as well as to the commonly used parameter turbulence slope (TS) on both types of data. Averaging ten ventricular ectopic beats, the estimated detection probability of the new detector, the previous detector, and TS were found to be 0.83, 0.35, and 0.41, respectively, when the false alarm probability was held fixed at 0.1.
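    A toy version of the subspace energy detector the new statistic improves on: project the observed sequence onto a small set of orthonormal basis vectors modeling the expected turbulence shape, and compare the captured energy, scaled by the noise variance, to a threshold. The trivial two-vector basis below is a stand-in for the paper's Karhunen-Loève basis, and all signals are invented.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def energy_statistic(y, basis, noise_var):
    """Energy of y captured by an orthonormal basis, normalized by noise power."""
    energy = sum(dot(y, b) ** 2 for b in basis)
    return energy / noise_var

basis = [[0.5, 0.5, 0.5, 0.5], [0.5, 0.5, -0.5, -0.5]]  # orthonormal, length 4
signal = [1.0, 1.0, -1.0, -1.0]   # lies in the subspace: large statistic
noise = [1.0, -1.0, 1.0, -1.0]    # orthogonal to it: statistic is zero
t_signal = energy_statistic(signal, basis, noise_var=0.5)
t_noise = energy_statistic(noise, basis, noise_var=0.5)
```

    The paper's refinement is to weight this projection with a priori shape information rather than treating all subspace energy equally.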

  10. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can push forward the boundaries of our understanding of the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Often there are not enough human resources to analyze the video; the strain on human attention needed to analyze video demands an automated way to assist in the analysis. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  11. Measuring target detection performance in paradigms with high event rates.

    PubMed

    Bendixen, Alexandra; Andersen, Søren K

    2013-05-01

    Combining behavioral and neurophysiological measurements inevitably implies mutual constraints, such as when the neurophysiological measurement requires fast-paced stimulus presentation and hence the attribution of a behavioral response to a particular preceding stimulus becomes ambiguous. We develop and test a method for validly assessing behavioral detection performance in spite of this ambiguity. We examine four approaches taken in the literature to treat such situations. We analytically derive a new variant of computing the classical parameters of signal detection theory, hit and false alarm rates, adapted to fast-paced paradigms. Each of the previous approaches shows specific shortcomings (susceptibility towards response window choice, biased estimates of behavioral detection performance). Superior performance of our new approach is demonstrated for both simulated and empirical behavioral data. Further evidence is provided by reliable correspondence between behavioral performance and the N2b component as an electrophysiological indicator of target detection. The appropriateness of our approach is substantiated by both theoretical and empirical arguments. We demonstrate an easy-to-implement solution for measuring target detection performance independent of the rate of event presentation. Thus overcoming the measurement bias of previous approaches, our method will help to clarify the behavioral relevance of different measures of cortical activation. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
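    The attribution problem the paper addresses can be sketched concretely: with fast-paced stimuli, each button press must be credited to at most one preceding target inside a response window before hit and false-alarm counts can be formed. The greedy windowed matching below is a simplified stand-in for the paper's adapted signal-detection computation; the window bounds and event times are invented.

```python
def score(targets, responses, window=(0.1, 1.0)):
    """Greedily match each response to the latest unmatched target in window.

    targets, responses: event onset times in seconds, ascending.
    Returns (hits, misses, false_alarms).
    """
    lo, hi = window
    hits, false_alarms = 0, 0
    used = set()
    for r in responses:
        candidates = [t for t in targets
                      if t not in used and lo <= r - t <= hi]
        if candidates:
            used.add(max(candidates))  # credit the most recent eligible target
            hits += 1
        else:
            false_alarms += 1
    misses = len(targets) - hits
    return hits, misses, false_alarms

h, m, fa = score(targets=[1.0, 3.0, 5.0], responses=[1.4, 2.0, 5.3])
```

    Hit and false-alarm rates then follow by dividing by the number of targets and of non-target events, which is where the choice of window stops biasing the estimate.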

  12. Data mining to generate adverse drug events detection rules.

    PubMed

    Chazard, Emmanuel; Ficheur, Grégoire; Bernonville, Stéphanie; Luyckx, Michel; Beuscart, Régis

    2011-11-01

    Adverse drug events (ADEs) are a public health issue. Their detection usually relies on voluntary reporting or medical chart reviews. The objective of this paper is to automatically detect cases of ADEs by data mining. 115,447 complete past hospital stays were extracted from six French, Danish, and Bulgarian hospitals using a common data model including diagnoses, drug administrations, laboratory results, and free-text records. Different kinds of outcomes are traced, and supervised rule induction methods (decision trees and association rules) are used to discover ADE detection rules, with respect to time constraints. The rules are then filtered, validated, and reorganized by a committee of experts. The rules are described in a rule repository, and several statistics are automatically computed in every medical department, such as the confidence, relative risk, and median delay of outcome appearance. 236 validated ADE-detection rules were discovered; they enable the detection of 27 different kinds of outcomes. The rules use a varying number of conditions related to laboratory results, diseases, drug administration, and demographics. Some rules involve innovative conditions, such as drug discontinuations.
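    The per-rule statistics named above (confidence and relative risk) come straight from a 2x2 contingency table of hospital stays: stays matching the rule's conditions versus the rest, crossed with outcome present versus absent. The counts in the example are made up for illustration.

```python
def rule_stats(exposed_with, exposed_without, unexposed_with, unexposed_without):
    """Confidence and relative risk of a candidate detection rule.

    'exposed' = stays matching the rule's conditions; 'with' = outcome occurred.
    """
    confidence = exposed_with / (exposed_with + exposed_without)
    risk_unexposed = unexposed_with / (unexposed_with + unexposed_without)
    relative_risk = confidence / risk_unexposed
    return confidence, relative_risk

# e.g. 30 of 100 stays matching a rule show the outcome, vs. 50 of 1000 others
conf, rr = rule_stats(30, 70, 50, 950)
```

    A rule survives expert review only if both numbers are high enough to be clinically meaningful, which is what the filtering step in the paper screens for.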

  13. Use of sonification in the detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Cole, Robert J.; Kruesi, Heidi; Greene, Herbert; Monahan, Ganesh; Hall, David L.

    2012-06-01

    In this paper, we describe the construction of a soundtrack that fuses stock market data with information taken from tweets. This soundtrack, or auditory display, presents the numerical and text data in such a way that anomalous events may be readily detected, even by untrained listeners. The soundtrack generation is flexible, allowing an individual listener to create a unique audio mix from the available information sources. Properly constructed, the display exploits the auditory system's sensitivities to periodicities, to dynamic changes, and to patterns. This type of display could be valuable in environments that demand high levels of situational awareness based on multiple sources of incoming information.
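    The core of such an auditory display is a mapping from data values to sound parameters so that anomalies become audible. A minimal pitch-mapping sketch in that spirit: each incoming value is normalized and mapped linearly to a frequency, so sudden jumps in the data become pitch jumps. The frequency range, clamping, and price series are arbitrary choices for illustration, not the paper's design.

```python
def to_frequency(value, lo, hi, f_min=220.0, f_max=880.0):
    """Linearly map value in [lo, hi] to a frequency in [f_min, f_max] Hz."""
    frac = (value - lo) / (hi - lo)
    frac = min(max(frac, 0.0), 1.0)          # clamp out-of-range values
    return f_min + frac * (f_max - f_min)

prices = [100, 101, 100, 130]                # a jump a listener would hear
freqs = [to_frequency(p, lo=90, hi=140) for p in prices]
```

    Layering several such mappings (e.g. tweet volume to loudness, price to pitch) is what lets a listener fuse the sources into one mix, as the abstract describes.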

  14. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.

  15. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  16. Takagi-Sugeno fuzzy-model-based fault detection for networked control systems with Markov delays.

    PubMed

    Zheng, Ying; Fang, Huajing; Wang, Hua O

    2006-08-01

    A Takagi-Sugeno (T-S) model is employed to represent a networked control system (NCS) with different network-induced delays. Compared with existing NCS modeling methods, this approach does not require knowledge of the exact values of the network-induced delays. Instead, it addresses situations involving all possible network-induced delays. Moreover, this approach also handles data-packet loss. As an application of the T-S-based modeling method, a parity-equation approach and a fuzzy-observer-based approach for fault detection of an NCS were developed. An example of a two-link inverted pendulum is used to illustrate the utility and viability of the proposed approaches.
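    The Takagi-Sugeno idea used above can be sketched in a few lines: the global model output is a membership-weighted blend of local linear models, one per fuzzy rule. Here the premise variable is the network-induced delay, with two triangular membership functions; the gains, delay supports, and rules are invented for illustration and are not the paper's model.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ts_output(delay, state):
    # Rule 1: "delay is small" -> local model x' = 0.9 * x
    # Rule 2: "delay is large" -> local model x' = 0.5 * x
    w1 = triangular(delay, -0.1, 0.0, 0.2)
    w2 = triangular(delay, 0.0, 0.2, 0.3)
    return (w1 * 0.9 * state + w2 * 0.5 * state) / (w1 + w2)

x_next = ts_output(delay=0.1, state=1.0)  # halfway: blends the two local models
```

    Because the blend covers all delays in the supports, no exact delay value is ever needed, which is the modeling advantage the abstract claims.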

  17. [Establishment and Improvement of Portable X-Ray Fluorescence Spectrometer Detection Model Based on Wavelet Transform].

    PubMed

    Li, Fang; Wang, Ji-hua; Lu, An-xiang; Han, Ping

    2015-04-01

    The concentrations of Cr, Cu, Zn, As and Pb in soil were tested by a portable X-ray fluorescence spectrometer. Each sample was tested 3 times; then, after using a wavelet threshold noise filtering method to denoise and smooth the spectra, a standard curve for each heavy metal was established from the standard values of heavy metals in soil and the corresponding counts, taken as the average of the 3 processed spectra. The signal-to-noise ratio (SNR), mean square error (MSE), and information entropy (H) were used to assess the denoising effects of the wavelet threshold noise filtering method and to determine the best wavelet basis and wavelet decomposition level. Some samples with different concentrations and H3BO3 (blank) were chosen for retesting to verify the instrument's stability. The results show that the best denoising result was obtained with the coif3 wavelet basis at decomposition level 3. The determination coefficient (R2) range of the instrument is 0.990-0.996, indicating a high degree of linearity between the heavy metal contents in soil and each X-ray fluorescence spectral characteristic peak intensity within the instrument's measurement range (0-1,500 mg · kg(-1)). After retesting and calculation, the results indicate that all detection limits of the instrument are below the national soil standards. The accuracy of the model has been effectively improved by the practical application of the wavelet transform to the establishment and improvement of the X-ray fluorescence spectrometer detection model, and the instrument also shows good precision. Thus the instrument can be applied to on-site rapid screening of heavy metals in contaminated soil.
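    The wavelet threshold denoising step can be illustrated with a single-level Haar transform and soft thresholding, as a minimal stand-in for the coif3/level-3 filtering used in the paper (a proper coiflet transform needs a wavelet library such as PyWavelets). The spectrum is split into approximation and detail coefficients, small detail coefficients are shrunk toward zero, and the signal is reconstructed. The example signal and threshold are invented.

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar decomposition, soft-threshold details, reconstruct."""
    assert len(signal) % 2 == 0
    s2 = math.sqrt(2.0)
    half = len(signal) // 2
    approx = [(signal[2*i] + signal[2*i+1]) / s2 for i in range(half)]
    detail = [(signal[2*i] - signal[2*i+1]) / s2 for i in range(half)]
    # soft thresholding: shrink each detail coefficient by the threshold
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out

smooth = haar_denoise([10.0, 10.2, 10.1, 9.9, 50.0, 50.3, 50.1, 49.8], 0.5)
```

    Small fluctuations are flattened while the large step between the two plateaus survives, which is the property that preserves spectral peaks while suppressing counting noise.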

  18. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling ("SBM") was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or "QCP") which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows the use of predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which computer simulations of a slowly evolving cardiac tamponade produced a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced 420 minutes into the simulation sequence. The ability of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP.

  19. Enhancement of autoregressive model based gear tooth fault detection technique by the use of minimum entropy deconvolution filter

    NASA Astrophysics Data System (ADS)

    Endo, H.; Randall, R. B.

    2007-02-01

    This paper proposes the use of the minimum entropy deconvolution (MED) technique to enhance the ability of the existing autoregressive (AR) model based filtering technique to detect localised faults in gears. The AR filter technique has been proven superior to the traditionally used residual analysis technique for detecting localised gear tooth faults. The AR filter technique is based on subtracting a regular gearmesh signal, as represented by the toothmesh harmonics and immediately adjacent sidebands, from the spectrum of a signal from one gear obtained by the synchronous signal averaging technique (SSAT). The existing AR filter technique performs well but is based on autocorrelation measurements and is thus insensitive to phase relationships, which can be used to differentiate noise from impulses. The MED technique can make use of the phase information by means of the higher-order statistical (HOS) characteristics of the signal, in particular the kurtosis, to enhance the ability to detect emerging gear tooth faults. The experimental results presented in this paper validate the superior performance of the combined AR and MED filtering techniques in detecting spalls and tooth fillet cracks in gears.
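    The MED filter chooses its coefficients to maximize kurtosis, the higher-order statistic that separates impulsive fault signatures from Gaussian-like noise. Kurtosis itself is straightforward to compute; the two toy signals below just show that an impulse train scores far higher than a constant-amplitude oscillation. The signals are invented for illustration.

```python
def kurtosis(x):
    """Normalized fourth moment m4 / m2^2 (no excess subtraction)."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m4 / m2 ** 2

impulsive = [0.0] * 15 + [10.0]   # one large spike: high kurtosis
steady = [1.0, -1.0] * 8          # constant-amplitude oscillation
k_imp = kurtosis(impulsive)
k_steady = kurtosis(steady)       # = 1, the minimum possible value
```

    MED searches for the inverse filter whose output maximizes this quantity, thereby sharpening the fault impulses that the AR residual contains but spreads out.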

  20. Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains.

    PubMed

    Pillow, Jonathan W; Ahmadian, Yashar; Paninski, Liam

    2011-01-01

    One of the central problems in systems neuroscience is to understand how neural spike trains convey sensory information. Decoding methods, which provide an explicit means for reading out the information contained in neural spike responses, offer a powerful set of tools for studying the neural coding problem. Here we develop several decoding methods based on point-process neural encoding models, or forward models that predict spike responses to stimuli. These models have concave log-likelihood functions, which allow efficient maximum-likelihood model fitting and stimulus decoding. We present several applications of the encoding model framework to the problem of decoding stimulus information from population spike responses: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus, the most probable stimulus to have generated an observed single- or multiple-neuron spike train response, given some prior distribution over the stimulus; (2) a gaussian approximation to the posterior stimulus distribution that can be used to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the spike trains emitted by a neural population; and (4) a framework for the detection of change-point times (the time at which the stimulus undergoes a change in mean or variance) by marginalizing over the posterior stimulus distribution. We provide several examples illustrating the performance of these estimators with simulated and real neural data.
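    Item (1) above, MAP decoding, reduces to a tiny search when the stimulus set is discrete and the neurons are modeled as independent Poisson units: each candidate stimulus predicts a firing rate per neuron, and the decoder picks the stimulus maximizing log prior plus Poisson log likelihood of the observed spike counts. The rates, counts, and prior below are invented for illustration; the paper's setting uses continuous stimuli and richer point-process models.

```python
import math

def poisson_loglik(count, rate):
    """Log P(count | Poisson(rate))."""
    return count * math.log(rate) - rate - math.lgamma(count + 1)

def map_decode(spike_counts, rate_table, log_prior):
    """Return the stimulus maximizing log prior + log likelihood."""
    best, best_score = None, -math.inf
    for stim, rates in rate_table.items():
        score = log_prior[stim] + sum(
            poisson_loglik(k, r) for k, r in zip(spike_counts, rates))
        if score > best_score:
            best, best_score = stim, score
    return best

rate_table = {"left": [8.0, 2.0], "right": [2.0, 8.0]}  # mean spike counts/bin
log_prior = {"left": math.log(0.5), "right": math.log(0.5)}
stim_hat = map_decode([7, 1], rate_table, log_prior)
```

    With concave log likelihoods, the same maximization over a continuous stimulus is a convex problem, which is what makes the paper's MAP estimate tractable.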

  1. Detectability of GW150914-like events by gravitational microlensing

    NASA Astrophysics Data System (ADS)

    Eilbott, Daniel; Riley, Alexander; Cohn, Jonathan; Kesden, Michael H.; King, Lindsay J.

    2017-01-01

    The recent discovery of gravitational waves from stellar-mass binary black holes (BBHs) provided direct evidence of the existence of these systems. These BBHs would have gravitational microlensing signatures that are, due to their large masses and small separations, distinct from single-lens signals. We apply Bayesian statistics to examine the distinguishability of BBH microlensing events from single-lens events under ideal observing conditions, using modern photometric and astrometric capabilities. Given one year of ideal observations, a source star at the Galactic center, a GW150914-like BBH lens (total mass 65 M⊙, mass ratio 0.8) at half that distance, and an impact parameter of 0.4 Einstein radii, we find that BBH separations down to 0.00634 Einstein radii are detectable, below the limit of 0.00716 Einstein radii at which the BBH would merge within the age of the universe. We encourage analyses of LSST data to search for similar modulation in all long-duration events, providing a new channel for the discovery of short-period BBHs in our Galaxy.

  2. Understanding pharmacist decision making for adverse drug event (ADE) detection.

    PubMed

    Phansalkar, Shobha; Hoffman, Jennifer M; Hurdle, John F; Patel, Vimla L

    2009-04-01

    Manual chart review is an effective but expensive method for adverse drug event (ADE) detection. Building an expert system capable of mimicking the human expert's decision pathway to deduce the occurrence of an ADE can improve efficiency and lower cost. As a first step toward building such an expert system, this study explores pharmacists' decision-making processes for ADE detection. Think-aloud procedures were used to elicit verbalizations as pharmacists read through ADE case scenarios. Two types of information were extracted: first, pharmacists' decision-making strategies regarding ADEs, and second, pharmacists' unmet information needs for ADE detection. Verbal protocols were recorded and analysed qualitatively to extract ADE information signals. Inter-reviewer agreement for classification of ADE information signals was calculated using Cohen's kappa. We extracted a total of 110 information signals, of which 73% consisted of information interpreted by the pharmacists from the case scenario, and only about half (53%, n = 32) of the information signals were considered relevant for the detection of the ADEs. Excellent reliability was demonstrated between the reviewers for classifying signals. Fifty information signals regarding unmet information needs were extracted and grouped into themes based on the type of missing information. Pharmacists used a forward-reasoning approach to make implicit deductions and validate hypotheses about possible ADEs. Verbal protocols also indicated that pharmacists' unmet information needs occurred frequently. Developing alerting systems that adequately meet pharmacists' needs will enhance their ability to reduce preventable ADEs, thus improving patient safety.
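
Cohen's kappa, used in the study above to quantify inter-reviewer agreement, corrects raw agreement for the agreement expected by chance. A minimal sketch with hypothetical reviewer labels (the counts below are not from the study):

```python
def cohens_kappa(a, b):
    """Inter-rater agreement corrected for chance, for two raters' label lists."""
    labels = set(a) | set(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                     # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical classifications of 8 signals by two reviewers (1 = ADE, 0 = not)
r1 = [1, 1, 1, 1, 0, 0, 0, 0]
r2 = [1, 1, 1, 0, 0, 0, 0, 1]
kappa = cohens_kappa(r1, r2)   # 0.5: moderate agreement beyond chance
```

Identical label lists yield kappa = 1.0, and agreement no better than chance yields 0.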

  3. Increased SERS detection efficiency for characterizing rare events in flow.

    PubMed

    Jacobs, Kevin T; Schultz, Zachary D

    2015-08-18

    Improved surface-enhanced Raman scattering (SERS) measurements of a flowing aqueous sample are accomplished by combining line focus optics with sheath-flow SERS detection. The straightforward introduction of a cylindrical lens into the optical path of the Raman excitation laser increases the efficiency of SERS detection and the reproducibility of SERS signals at low concentrations. The width of the line focus is matched to the width of the sample capillary from which the analyte elutes under hydrodynamic focusing conditions, allowing for increased collection across the SERS substrate while maintaining the power density below the damage threshold at any specific point. We show that a 4× increase in power spread across the line increases the signal-to-noise ratio by a factor of 2 for a variety of analytes, such as rhodamine 6G, amino acids, and lipid vesicles, without any detectable photodamage. COMSOL simulations and Raman maps elucidate the hydrodynamic focusing properties of the flow cell, providing a clearer picture of the confinement effects at the surface where the sample exits the capillary. The lipid vesicle results suggest that the combination of hydrodynamic focusing and increased optical collection enables the reproducible detection of rare events, in this case individual lipid vesicles.

  4. An evaluation of three signal-detection algorithms using a highly inclusive reference event database.

    PubMed

    Hochberg, Alan M; Hauben, Manfred; Pearson, Ronald K; O'Hara, Donald J; Reisinger, Stephanie J; Goldsmith, David I; Gould, A Lawrence; Madigan, David

    2009-01-01

    Pharmacovigilance data-mining algorithms (DMAs) are known to generate significant numbers of false-positive signals of disproportionate reporting (SDRs), under various standards for defining the terms 'true positive' and 'false positive'. Our objectives were to construct a highly inclusive reference event database of reported adverse events for a limited set of drugs; to utilize that database to evaluate three DMAs for their overall yield of scientifically supported adverse drug effects, with an emphasis on ascertaining false-positive rates as defined by matching to the database; and to assess the overlap among SDRs detected by the various DMAs. A sample of 35 drugs approved by the US FDA between 2000 and 2004 was selected, including three drugs added to cover therapeutic categories not included in the original sample. We compiled a reference event database of adverse event information for these drugs from historical and current US prescribing information, from peer-reviewed literature covering 1999 through March 2006, from regulatory actions announced by the FDA, and from adverse event listings in the British National Formulary. Every adverse event mentioned in these sources was entered into the database, even those with minimal evidence for causality. To provide some selectivity regarding causality, each entry was assigned a level of evidence based on the source of the information, using rules developed by the authors. Using the FDA adverse event reporting system data for 2002 through 2005, SDRs were identified for each drug using three DMAs: an urn-model based algorithm, the Gamma Poisson Shrinker (GPS) and the proportional reporting ratio (PRR), using previously published signalling thresholds. The absolute number and fraction of SDRs matching the reference event database at each level of evidence were determined for each report source and data-mining method. Overlap of the SDR lists among the various methods and report sources was tabulated as well. The GPS algorithm had the lowest
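
The PRR referenced above is computed directly from a 2 x 2 drug-event contingency table. A minimal sketch with hypothetical counts; the PRR >= 2, chi-square >= 4, at least 3 reports rule shown here is one commonly published signalling threshold, and the study's exact thresholds may differ:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 drug-event table.
    a: target drug & event   b: target drug, other events
    c: other drugs & event   d: other drugs, other events"""
    return (a / (a + b)) / (c / (c + d))

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the same 2x2 table."""
    n = a + b + c + d
    rows, cols, obs = (a + b, c + d), (a + c, b + d), ((a, b), (c, d))
    return sum((obs[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i in range(2) for j in range(2))

def prr_signal(a, b, c, d):
    # a commonly published threshold: PRR >= 2, chi-square >= 4, and >= 3 reports
    return prr(a, b, c, d) >= 2 and chi2_2x2(a, b, c, d) >= 4 and a >= 3

example = prr(10, 90, 100, 9900)   # 10.0: event reported 10x more often for the drug
```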

  5. Automated Detection Method of Slow Slip Events in Southwest Japan

    NASA Astrophysics Data System (ADS)

    Kimura, T.; Hirose, H.; Obara, K.; Kimura, H.

    2010-12-01

    In the Nankai subduction zone, southwest Japan, various types of slow earthquakes have been detected using dense seismic and geodetic observation networks such as Hi-net, operated by the National Research Institute for Earth Science and Disaster Prevention. Short-term slow slip events (SSEs), which last for several days, are detected as crustal deformation by borehole tiltmeters and strainmeters, and are usually accompanied by seismic slow earthquakes such as nonvolcanic deep low-frequency tremor. These coupled phenomena are called episodic tremor and slip (ETS). In previous studies of ETS events in southwest Japan, short-term SSEs were identified manually by consulting the seismic tremor data. However, in order to clarify the relationship between geodetic SSEs and seismic tremor objectively, an SSE detection method independent of the tremor data is necessary. In this study, we develop a new automated method that identifies signals caused by SSEs and estimates the source model using ground tilt data. Our method is composed of two phases: estimation of the SSE model and identification of the SSE. In the model estimation phase, we assume that an SSE must occur in the analyzed time period, and that the observed ground tilt contains a response to an SSE, a background linear trend, random-walk noise, and white noise. An SSE is modeled as uniform slip on a rectangular fault with a time-invariant slip rate. We estimate an optimum source model of the possible SSE using the Kalman filter for linear parameters such as total slip, and a grid-search method for nonlinear parameters such as fault location, origin time, and duration. In the identification phase, another model is estimated from the same tilt data under the assumption that no SSE occurs. The tilt changes modeled as an SSE in the estimation phase are evaluated by comparing the models with and without an SSE on the basis of AIC. A robustness test is then carried out before the model is identified as an SSE. We applied the automated
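
The AIC-based comparison of models with and without an SSE can be sketched on synthetic data. Everything below (the step-plus-trend toy signal, the grid of candidate onsets, the least-squares AIC form) is an illustrative simplification of the paper's Kalman-filter and grid-search machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100.0)
# Synthetic tilt record: linear background trend plus a step-like transient
# (a crude stand-in for an SSE signal; all numbers here are illustrative)
y = 0.01 * t + 0.5 * (t > 60) + rng.normal(scale=0.1, size=t.size)

def rss(X):
    """Residual sum of squares after a least-squares fit of design matrix X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

def aic(rss_val, n, k):
    # least-squares AIC up to an additive constant: n ln(RSS/n) + 2k
    return n * np.log(rss_val / n) + 2 * k

X0 = np.column_stack([np.ones_like(t), t])           # model without SSE: trend only
rss0 = rss(X0)
# Model with SSE: grid-search the onset of an added step term
rss1 = min(rss(np.column_stack([X0, (t > c).astype(float)])) for c in range(10, 90))
prefer_sse = aic(rss1, t.size, 3) < aic(rss0, t.size, 2)   # lower AIC wins
```

The extra parameter is accepted only when the fit improvement outweighs the 2k complexity penalty, which is the same trade-off the paper's identification phase makes.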

  6. An approach to model-based fault detection in industrial measurement systems with application to engine test benches

    NASA Astrophysics Data System (ADS)

    Angelov, P.; Giglio, V.; Guardiola, C.; Lughofer, E.; Luján, J. M.

    2006-07-01

    An approach to fault detection (FD) in industrial measurement systems is proposed in this paper, including an identification strategy for early detection of the appearance of a fault. The approach is model based: nominal models are used to represent the fault-free state of the on-line measured process. The approach is also suitable for off-line FD. The framework combining FD with isolation and correction (FDIC) is outlined. The proposed approach is characterized by automatic threshold determination, the ability to analyse local properties of the models, and aggregation of different fault-detection statements. The nominal models are built using data-driven and hybrid approaches, combining first-principles models with on-line data-driven techniques, while remaining transparent and interpretable. This novel approach is then verified on a number of real and simulated data sets from car engine test benches (both gasoline, an Alfa Romeo JTS, and diesel, a Caterpillar). It is demonstrated that the approach works effectively in real industrial measurement systems with high-dimensional data in both on-line and off-line modes.
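
A minimal sketch of residual-based fault detection with automatic threshold determination, assuming a known nominal model. The MAD-based robust noise estimate, the fault-free warm-up window, and all constants are illustrative choices, not the paper's method:

```python
import numpy as np

def detect_faults(measured, nominal, k=4.0, warmup=50):
    """Flag samples whose residual exceeds an automatically derived threshold.

    The threshold is k robust-sigmas of the residual over an assumed
    fault-free warm-up window; sigma is estimated via the median absolute
    deviation (MAD) so that an early fault cannot inflate the threshold.
    """
    r = measured - nominal
    sigma = 1.4826 * np.median(np.abs(r[:warmup] - np.median(r[:warmup])))
    return np.abs(r) > k * sigma

rng = np.random.default_rng(2)
nominal = np.sin(np.linspace(0, 6, 300))                     # nominal (fault-free) model output
measured = nominal + rng.normal(scale=0.05, size=300)        # sensor noise
measured[200:] += 0.5                                        # injected sensor offset fault
flags = detect_faults(measured, nominal)
```

With a 4-sigma threshold the false-alarm rate on the fault-free segment is essentially zero, while the injected offset is flagged almost everywhere.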

  7. A Pulse-type Hardware Level Difference Detection Model Based on Sound Source Localization Mechanism in Barn Owl

    NASA Astrophysics Data System (ADS)

    Sakurai, Tsubasa; Sekine, Yoshifumi

    Auditory information processing is very important in darkness, where visual information is extremely limited. Barn owls have excellent auditory information processing capabilities: they can localize a sound source to an accuracy of better than two degrees in both the vertical and horizontal directions. When performing sound source localization, barn owls use the interaural time difference for localization in the horizontal plane and the interaural level difference for localization in the vertical plane. We are constructing a two-dimensional sound source localization model using pulse-type hardware neuron models based on the sound source localization mechanism of the barn owl, with a view toward engineering applications. In this paper, we propose a pulse-type hardware model for level difference detection based on this mechanism. First, we discuss the response characteristics of the mathematical model for level difference detection. Next, we discuss the response characteristics of the hardware model. As a result, we show that the proposed model can be used as a sound source localization model for the vertical direction.

  8. Pulmonary Nodule Detection Model Based on SVM and CT Image Feature-Level Fusion with Rough Sets

    PubMed Central

    Lu, Huiling; Zhang, Junjie; Shi, Hongbin

    2016-01-01

    In order to improve the detection accuracy of pulmonary nodules in CT images, and to address two problems with existing pulmonary nodule detection models, namely unreasonable feature structure and loose feature representation, a pulmonary nodule detection algorithm is proposed based on SVM and CT image feature-level fusion with rough sets. Firstly, CT images of pulmonary nodules are analyzed, and 42-dimensional feature components are extracted, including six new 3-dimensional features proposed in this paper along with other 2-dimensional and 3-dimensional features. Secondly, these features are reduced five times with rough sets based on feature-level fusion. Thirdly, a grid optimization model is used to optimize the kernel function of the support vector machine (SVM), which is used as a classifier to identify pulmonary nodules. Finally, lung CT images of 70 patients with pulmonary nodules are collected as the original samples, which are used to verify the effectiveness and stability of the proposed model in four groups of comparative experiments. The experimental results show that the effectiveness and stability of the proposed model based on rough set feature-level fusion are improved to some degree. PMID:27722173

  9. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is to extract the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness, and much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we are faced with the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because we must balance selection criteria that recover the largest number of objects of interest against the resulting contamination rate. We discuss considerations and techniques for developing classification schemes.

  10. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be characterised beforehand. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, obtained by considering large power output gradients evaluated over different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of ramp-up and ramp-down intensity are obtained for the case of a wind farm located in Spain.
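
The idea of a continuous ramp-intensity index evaluated over several time scales can be sketched as follows. This simplified, gradient-based stand-in (with illustrative scales) mimics the spirit of the ramp function, not its exact wavelet construction:

```python
import numpy as np

def ramp_function(power, scales=(1, 2, 4, 8)):
    """Continuous ramp-intensity index: mean absolute per-step power change
    evaluated over several lags (scales, in time steps)."""
    p = np.asarray(power, dtype=float)
    out = np.zeros_like(p)
    for s in scales:
        d = np.zeros_like(p)
        d[s:] = p[s:] - p[:-s]          # power change over a lag of s steps
        out += np.abs(d) / s            # normalize to a per-step gradient
    return out / len(scales)

# Toy normalized power series: flat, a 10-step ramp up, then flat again
power = np.concatenate([np.full(50, 0.2), np.linspace(0.2, 0.9, 10), np.full(50, 0.9)])
ramp = ramp_function(power)
```

Unlike a binary ramp/non-ramp label, the index grades each time step by ramp intensity: it is zero on the flat segments and peaks during the transition.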

  11. Revealing cell cycle control by combining model-based detection of periodic expression with novel cis-regulatory descriptors

    PubMed Central

    Andersson, Claes R; Hvidsten, Torgeir R; Isaksson, Anders; Gustafsson, Mats G; Komorowski, Jan

    2007-01-01

    Background We address the issue of explaining the presence or absence of phase-specific transcription in budding yeast cultures under different conditions. To this end we use a model-based detector of gene expression periodicity to divide genes into classes depending on their behavior in experiments using different synchronization methods. While computational inference of gene regulatory circuits typically relies on expression similarity (clustering) in order to find classes of potentially co-regulated genes, this method instead takes advantage of known time profile signatures related to the studied process. Results We explain the regulatory mechanisms of the inferred periodic classes with cis-regulatory descriptors that combine upstream sequence motifs with experimentally determined binding of transcription factors. By systematic statistical analysis we show that periodic classes are best explained by combinations of descriptors rather than single descriptors, and that different combinations correspond to periodic expression in different classes. We also find evidence for additive regulation, in that the combinations of cis-regulatory descriptors associated with genes periodically expressed in fewer conditions are frequently subsets of the combinations associated with genes periodically expressed in more conditions. Finally, we demonstrate that our approach retrieves combinations that are more specific towards known cell-cycle related regulators than the frequently used clustering approach. Conclusion The results illustrate how a model-based approach to expression analysis may be particularly well suited to detect biologically relevant mechanisms. Our new approach makes it possible to provide more refined hypotheses about regulatory mechanisms of the cell cycle, and it can easily be adjusted to reveal regulation of other, non-periodic, cellular processes. PMID:17939860

  12. Detecting Tidal Disruption Events (TDEs) with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Zhang, S.; Osborne, J.; O'Brien, P.; Watson, M.; Fraser, G.

    2014-07-01

    Stars are tidally disrupted and accreted when they approach supermassive black holes (SMBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible, including (1) TDE rate measurements as a function of host galaxy type, (2) an assessment of the population of IMBHs, and (3) new probes of general relativity and accretion processes. Here, we present the proposed X-ray mission Einstein Probe, which aims at detecting TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60 deg x 60 deg, or ~1 sr), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry an X-ray telescope based on the same micro-pore optics, with a smaller field of view, for follow-up observations. It will be capable of issuing public transient alerts rapidly.

  13. Communication of ALS Patients by Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic Lateral Sclerosis (ALS) patients are unable to successfully communicate their desires, although their mental capacity is the same as that of non-affected persons. Therefore, the authors focus on the Event-Related Potential (ERP), which is elicited most strongly by target visual and auditory stimuli. P300 is one component of the ERP: a positive potential elicited when the subject focuses attention on stimuli that appear infrequently. In this paper, the authors focused on the P200 and N200 components, in addition to P300, for their great improvement in the rate of correct judgment in the target word-specific experiment. The authors therefore propose an algorithm that specifies target words by detecting these three components. Ten healthy subjects and an ALS patient underwent an experiment in which a target word out of five words was specified by this algorithm. The rates of correct judgment in nine of the ten healthy subjects were more than 90.0%, with the highest rate being 99.7%. The highest rate for the ALS patient was 100.0%. Through these results, the authors found that ALS patients may be able to communicate with surrounding persons through the detection of ERP components (P200, N200 and P300).

  14. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.

  15. Balloon-Borne Infrasound Detection of Energetic Bolide Events

    NASA Astrophysics Data System (ADS)

    Young, Eliot F.; Ballard, Courtney; Klein, Viliam; Bowman, Daniel; Boslough, Mark

    2016-10-01

    Infrasound is usually defined as sound waves below 20 Hz, the nominal limit of human hearing. Infrasound waves propagate over vast distances through the Earth's atmosphere: the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization) has 48 installed infrasound-sensing stations around the world to detect nuclear detonations and other disturbances. In February 2013, several CTBTO infrasound stations detected infrasound signals from a large bolide that exploded over Chelyabinsk, Russia. Some stations recorded signals that had circumnavigated the Earth, over a day after the original event. The goal of this project is to improve upon the sensitivity of the CTBTO network by putting microphones on small, long-duration super-pressure balloons, with the overarching aim of studying the small end of the NEO population by using the Earth's atmosphere as a witness plate. A balloon-borne infrasound sensor is expected to have two advantages over ground-based stations: a lack of wind noise and a concentration of infrasound energy in the "stratospheric duct" between roughly 5-50 km altitude. To test these advantages, we have built a small balloon payload with five calibrated microphones. We plan to fly this payload on a NASA high-altitude balloon from Ft Sumner, NM in August 2016. We have arranged for three large explosions to take place in Socorro, NM while the balloon is aloft to assess the sensitivity of balloon-borne vs. ground-based infrasound sensors. We will report on the results from this test flight and the prospects for detecting/characterizing small bolides in the stratosphere.

  16. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  17. Automatic detection of volcano-seismic events by modeling state and event duration in hidden Markov models

    NASA Astrophysics Data System (ADS)

    Bhatti, Sohail Masood; Khan, Muhammad Salman; Wuth, Jorge; Huenupan, Fernando; Curilem, Millaray; Franco, Luis; Yoma, Nestor Becerra

    2016-09-01

    In this paper we propose an automatic volcano event detection system based on a Hidden Markov Model (HMM) with state and event duration models. Since different volcanic events have different durations, the state and whole-event durations learnt from the training data are enforced on the corresponding state and event duration models within the HMM. Seismic signals from the Llaima volcano are used to train the system. Two types of events are employed in this study: Long Period (LP) and Volcano-Tectonic (VT). Experiments show that standard HMMs can detect the volcano events with high accuracy but generate false positives. The results presented in this paper show that the incorporation of duration modeling can reduce the false positive rate in event detection by as much as 31% with a true positive accuracy of 94%. Further evaluation of the false positives indicates that the false alarms generated by the system were mostly potential events according to the signal-to-noise ratio criteria recommended by a volcano expert.
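
A minimal sketch of the detection idea: standard Viterbi decoding over a two-state HMM, followed by a crude minimum-duration filter standing in for the paper's explicit state/event duration models. The Gaussian emissions, transition matrix, and minimum length are all illustrative assumptions:

```python
import numpy as np

def viterbi(logA, logB, logpi):
    """Most likely state path given log transition, emission, and initial probs."""
    T, N = logB.shape
    delta = logpi + logB[0]
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA        # scores[i, j]: best path ending in i, then i -> j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

def enforce_min_duration(path, event_state=1, min_len=5):
    """Relabel event runs shorter than a learned minimum duration as background
    (a crude post-hoc stand-in for explicit duration modeling)."""
    path, t = list(path), 0
    while t < len(path):
        if path[t] == event_state:
            run = t
            while run < len(path) and path[run] == event_state:
                run += 1
            if run - t < min_len:
                path[t:run] = [0] * (run - t)
            t = run
        else:
            t += 1
    return path

# Synthetic 1-D feature trace: background around 0, an LP-like event around 3
rng = np.random.default_rng(3)
obs = np.concatenate([rng.normal(0, 1, 40), rng.normal(3, 1, 20), rng.normal(0, 1, 40)])
logB = np.column_stack([-0.5 * (obs - 0) ** 2, -0.5 * (obs - 3) ** 2])  # unnormalized Gaussian log-likelihoods
logA = np.log([[0.95, 0.05], [0.05, 0.95]])
path = enforce_min_duration(viterbi(logA, logB, np.log([0.99, 0.01])))
```

Suppressing short event runs is exactly the mechanism by which duration constraints trade a few borderline detections for a lower false-positive rate.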

  18. Detection of Local/Regional Events in Kuwait Using Next-Generation Detection Algorithms

    SciTech Connect

    Gok, M. Rengin; Al-Jerri, Farra; Dodge, Douglas; Al-Enezi, Abdullah; Hauk, Terri; Mellors, R.

    2014-12-10

    Seismic networks around the world use conventional triggering algorithms to detect seismic signals in order to locate local/regional seismic events. The Kuwait National Seismological Network (KNSN) of the Kuwait Institute for Scientific Research (KISR) operates seven broad-band and short-period three-component stations in Kuwait. The network is equipped with Nanometrics digitizers and uses Antelope and Guralp acquisition software for processing and archiving the data. In this study, we selected 10 days of archived, hourly-segmented continuous data from five stations (Figure 1) and 250 days of continuous recording at MIB. For the temporary deployment, our selection criterion was based on KNSN catalog intensity for the period of time over which we test the method. An autonomous event detection and clustering framework is employed to build a more complete catalog for this short period of time. The goal is to illustrate the effectiveness of the technique and to pursue the framework over a longer period of time.

  19. A system for the model based emergency detection and communication for the telerehabilitation training of cardiopulmonary patients.

    PubMed

    Helmer, Axel; Kretschmer, Friedrich; Deparade, Riana; Song, Bianying; Meis, Markus; Hein, Andreas; Marschollek, Michael; Tegtbur, Uwe

    2012-01-01

    Cardiopulmonary diseases affect millions of people and cause high costs in health care systems worldwide. Patients should perform regular endurance exercises to stabilize their health state and prevent further impairment. However, patients are often uncertain about the level of intensity at which they should exercise in their current condition. The cost of continuously monitoring these training sessions in clinics is high, and it additionally requires the patient to travel to a clinic for each single session. Performing the rehabilitation training at home can raise compliance and reduce costs. To ensure safe telerehabilitation training and to enable patients to control their performance and health state, detection of abnormal events during training is a critical prerequisite. Therefore, we created a model that predicts the heart rate of cardiopulmonary patients and that can be used to detect and avoid abnormal health states. To enable external feedback and an immediate reaction in case of a critical situation, the patient should be able to configure the system to communicate warnings and emergency events to clinical and non-clinical actors. To fulfill this task, we coupled a personal health record (PHR) with a new component that extends classic home emergency systems. The PHR is also used for defining a training schedule that makes use of the predictive HR model. We used statistical methods to evaluate the prediction model and found that our prediction error of 3.2 heartbeats per minute is precise enough to enable detection of critical states. The concept for the communication of alerts was evaluated through focus group interviews with domain experts, who judged that it fulfills the needs of potential users.

  20. Adverse drug events and medication errors: detection and classification methods.

    PubMed

    Morimoto, T; Gandhi, T K; Seger, A C; Hsieh, T C; Bates, D W

    2004-08-01

    Investigating the incidence, type, and preventability of adverse drug events (ADEs) and medication errors is crucial to improving the quality of health care delivery. ADEs, potential ADEs, and medication errors can be collected by extraction from practice data, solicitation of incidents from health professionals, and patient surveys. Practice data include charts, laboratory and prescription data, and administrative databases, and can be reviewed manually or screened by computer systems to identify signals. Research nurses, pharmacists, or research assistants review these signals, and those that are likely to represent an ADE or medication error are presented to reviewers who independently categorize them into ADEs, potential ADEs, medication errors, or exclusions. These incidents are also classified according to preventability, ameliorability, disability, severity, stage, and responsible person. These classifications, as well as the initial selection of incidents, have been evaluated for agreement between reviewers, and the level of agreement found ranged from satisfactory to excellent (kappa = 0.32-0.98). The method of ADE and medication error detection and classification described is feasible and has good reliability. It can be used in various clinical settings to measure and improve medication safety.

  1. Visual traffic surveillance framework: classification to event detection

    NASA Astrophysics Data System (ADS)

    Ambardekar, Amol; Nicolescu, Mircea; Bebis, George; Nicolescu, Monica

    2013-10-01

    Visual traffic surveillance using computer vision techniques can be noninvasive, automated, and cost effective. Traffic surveillance systems with the ability to detect, count, and classify vehicles can be employed in gathering traffic statistics and achieving better traffic control in intelligent transportation systems. However, vehicle classification poses a difficult problem, as vehicles have high intraclass variation and relatively low interclass variation. Five different object recognition techniques are investigated: principal component analysis (PCA) + difference from vehicle space, PCA + difference in vehicle space, PCA + support vector machine, linear discriminant analysis, and constellation-based modeling, applied to the problem of vehicle classification. Three of the techniques that performed well were incorporated into a unified traffic surveillance system for online classification of vehicles, which uses tracking results to improve the classification accuracy. To evaluate the accuracy of the system, 31 min of traffic video containing a multilane traffic intersection was processed. It was possible to achieve classification accuracy as high as 90.49% while classifying correctly tracked vehicles into four classes: cars, SUVs/vans, pickup trucks, and buses/semis. While processing a video, our system also recorded important traffic parameters, such as a vehicle's appearance, speed, and trajectory. This information was later used in a search assistant tool to find interesting traffic events.

  2. Using REDItools to Detect RNA Editing Events in NGS Datasets.

    PubMed

    Picardi, Ernesto; D'Erchia, Anna Maria; Montalvo, Antonio; Pesole, Graziano

    2015-03-09

    RNA editing is a post-transcriptional/co-transcriptional molecular phenomenon whereby a genetic message is modified from the corresponding DNA template by means of substitutions, insertions, and/or deletions. It occurs in a variety of organisms and different cellular locations through evolutionarily and biochemically unrelated proteins. RNA editing has a plethora of biological effects, including the modulation of alternative splicing and fine-tuning of gene expression. RNA editing events arising from base substitutions can be detected on a genomic scale by NGS technologies through the REDItools package, an ad hoc suite of Python scripts for studying RNA editing using RNA-Seq and DNA-Seq data, or RNA-Seq data alone. REDItools implements effective filters to minimize biases due to sequencing errors, mapping errors, and SNPs. The package is freely available at the Google Code repository (http://code.google.com/p/reditools/) and released under the MIT license. In the present unit we show three basic protocols corresponding to three main REDItools scripts.

  3. Signal detection to identify serious adverse events (neuropsychiatric events) in travelers taking mefloquine for chemoprophylaxis of malaria

    PubMed Central

    Naing, Cho; Aung, Kyan; Ahmed, Syed Imran; Mak, Joon Wah

    2012-01-01

    Background For all medications, there is a trade-off between benefits and potential for harm. For patient safety, it is important to detect drug-event combinations and analyze them by appropriate statistical methods. Mefloquine is used as chemoprophylaxis for travelers going to regions with known chloroquine-resistant Plasmodium falciparum malaria. As such, there is a concern about serious adverse events associated with mefloquine chemoprophylaxis. The objective of the present study was to assess whether any signal would be detected for the serious adverse events of mefloquine, based on data in clinicoepidemiological studies. Materials and methods We extracted data on adverse events related to mefloquine chemoprophylaxis from two published datasets. Disproportionality in the reporting of adverse events, such as neuropsychiatric events versus other adverse events, was presented in a 2 × 2 contingency table. The reporting odds ratio (ROR) and corresponding 95% confidence interval (CI) data-mining algorithm was applied for signal detection. Safety signals are considered significant when the ROR estimate and the lower limit of the corresponding 95% CI are ≥2. Results Two datasets addressing adverse events of mefloquine chemoprophylaxis (one from a published article and one from a Cochrane systematic review) were included for analysis. The ROR was 1.58 (95% CI: 1.49–1.68) based on published data in the selected article, and 1.195 (95% CI: 0.94–1.44) based on data in the selected Cochrane review. Overall, in both datasets, the lower limits of the 95% CI of the ROR were less than 2. Conclusion Based on available data, the findings suggest that signals for serious neuropsychiatric adverse events were not detected for mefloquine. Further studies are needed to substantiate this. PMID:22936859
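    The disproportionality criterion described above is straightforward to compute. A minimal sketch, with hypothetical counts (not the study's data) filling the 2 × 2 table:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR from a 2x2 contingency table:
    a = target event reported with the drug,  b = other events with the drug,
    c = target event with comparators,        d = other events with comparators."""
    ror = (a / b) / (c / d)
    # Wald 95% CI on the log scale
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

def is_signal(a, b, c, d):
    """Signal criterion used in the study: ROR >= 2 and lower 95% CI bound >= 2."""
    ror, lower, _ = reporting_odds_ratio(a, b, c, d)
    return ror >= 2 and lower >= 2

# Hypothetical counts for illustration only:
ror, lo, hi = reporting_odds_ratio(20, 80, 10, 160)
print(round(ror, 2), round(lo, 2))  # 4.0 1.79
```

With these illustrative counts the ROR is 4.0 but the lower CI bound falls below 2, so no signal would be declared under the study's criterion.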

  4. Optimal detection of burst events in gravitational wave interferometric observatories

    NASA Astrophysics Data System (ADS)

    Viceré, Andrea

    2002-09-01

    We consider the problem of detecting a burst signal of unknown shape in the data from gravitational wave interferometric detectors. We introduce a statistic which generalizes the excess power statistic proposed first by Flanagan and Hughes, and then extended by Anderson et al. to the multiple detector case. The statistic that we propose is shown to be optimal for an arbitrary noise spectral characteristic, under the two hypotheses that the noise is Gaussian, albeit colored, and that the prior for the signal is uniform. The statistic derivation is based on the assumption that a signal affects only N|| samples in the data stream, but that no other information is a priori available, and that the value of the signal at each sample can be arbitrary. This is the main difference from previous works, where different assumptions were made, such as a signal distribution uniform with respect to the metric induced by the (inverse) noise correlation matrix. The two choices are equivalent if the noise is white, and in that limit the two statistics do indeed coincide. In the general case, we believe that the statistic we propose may be more appropriate, because it does not reflect the characteristics of the noise affecting the detector on the supposed distribution of the gravitational wave signal. Moreover, we show that the proposed statistic can be easily implemented in its exact form, combining standard time-series analysis tools which can be efficiently implemented. We generalize this version of an excess power statistic to the multiple detector case, considering first a noise uncorrelated among the different instruments, and then including the effect of correlated noise. We discuss exact and approximate forms of the statistic; the choice depends on the characteristics of the noise and on the assumed length of the burst event. As an example, we show the sensitivity of the network of interferometers to a δ-function burst.
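    The flavor of an excess power search is easy to illustrate: on whitened data, sum the squared amplitudes over a sliding window of N samples and compare with a chi-square-derived threshold. A toy sketch of this idea (not the paper's exact colored-noise statistic; the data and threshold below are made up):

```python
def excess_power(x, window, threshold):
    """Sliding-window excess power on (assumed whitened) data: the summed
    squared amplitude over `window` samples is compared with a threshold
    chosen from the chi-square distribution with `window` degrees of freedom."""
    hits = []
    for i in range(len(x) - window + 1):
        energy = sum(v * v for v in x[i:i + window])
        if energy > threshold:
            hits.append(i)
    return hits

# Unit-variance-noise stand-in plus a burst at samples 8-11
data = [0.1, -0.2, 0.1, 0.0, -0.1, 0.2, 0.1, -0.1,
        3.0, -2.5, 2.8, -3.1, 0.1, 0.0, -0.2, 0.1]
print(excess_power(data, 4, 10.0))  # [6, 7, 8, 9, 10]
```

Only the window positions overlapping the burst exceed the threshold, independent of the burst's shape, which is the point of the statistic.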

  5. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g. neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation, and due to very low signal rates a TPC with the largest possible active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering, and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity, as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and the mass can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant, and the effect improves with density. The liquid phase TPC can obtain a high density at low pressure, which results in very good self-shielding and a compact installation with lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, and lower energy resolution.

  6. A Prediction Model Based on Biomarkers and Clinical Characteristics for Detection of Lung Cancer in Pulmonary Nodules.

    PubMed

    Ma, Jie; Guarnera, Maria A; Zhou, Wenxian; Fang, HongBin; Jiang, Feng

    2017-02-01

    Early detection of lung cancer by low-dose computed tomography (LDCT) can reduce mortality. However, LDCT increases the number of indeterminate pulmonary nodules (PNs), of which 95% are ultimately false positives. Modalities for specifically distinguishing between malignant and benign PNs are urgently needed. We previously identified a panel of peripheral blood mononucleated cell (PBMC) miRNA (miRs-19b-3p and -29b-3p) biomarkers for lung cancer. This study aimed to evaluate the efficacy of integrating biomarkers with the clinical and radiological characteristics of smokers for differentiating malignant from benign PNs. We analyzed expression of the 2 miRNAs (miRs-19b-3p and -29b-3p) in PBMCs of a training set of 137 individuals with PNs. We used multivariate logistic regression analysis to develop a prediction model based on the biomarkers, radiographic features of PNs, and clinical characteristics of smokers for identifying malignant PNs. The performance of the prediction model was validated in a testing set of 111 subjects with PNs. A prediction model comprising the two biomarkers, spiculation of PNs, and smoking pack-years was developed that had an area under the receiver operating characteristic curve of 0.91 for distinguishing malignant from benign PNs. The prediction model yielded higher sensitivity (80.3% vs 72.6%) and specificity (89.4% vs 81.9%) compared with the biomarkers used alone (all P<.05). The performance of the prediction model for malignant PNs was confirmed in the validation set. We have for the first time demonstrated that the integration of biomarkers with clinical and radiological characteristics can efficiently identify lung cancer among indeterminate PNs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Transient Evoked Potential in a Critical Event Detection Task.

    DTIC Science & Technology

    1984-02-01

    Indexed excerpt: events which elicit a P300 are more likely to be remembered than events which do not invoke a P300 (15:507-510). Cited works include "Vigilance and Discrimination: A Reassessment," Science, 164:326-328, 1969, and Fabiani, Monica and others, "Individual Differences in the von Restorff..."

  8. Detection Method of Three Events to Window and Key Using Light Sensor for Crime Prevention

    NASA Astrophysics Data System (ADS)

    Yamawaki, Akira; Katakami, Takayuki; Kitazono, Yuhki; Serikawa, Seiichi

    The three events to the window and the key that occur when a thief attempts to intrude into a house are conventionally detected by different sensors. This paper proposes a method for detecting the three events using a simple light sensor consisting of an infrared LED and a photodiode. In the experiments, the light sensor shows distinct response tendencies from which each event can be detected. This indicates that our proposal can realize a sensor module more efficiently than using separate sensors.

  9. Migration Based Event Detection and Automatic P- and S-Phase Picking in Hengill, Southwest Iceland

    NASA Astrophysics Data System (ADS)

    Wagner, F.; Tryggvason, A.; Gudmundsson, O.; Roberts, R.; Bodvarsson, R.; Fehler, M.

    2015-12-01

    Automatic detection of seismic events is a complicated process. Common procedures depend on the detection of seismic phases (e.g. P and S) in single-trace analyses and their correct association with locatable point sources. The event detection threshold is thus directly related to the single-trace detection threshold. Highly sensitive phase detectors detect low signal-to-noise ratio (S/N) phases but also produce a low percentage of locatable events. Short inter-event times of only a few seconds, not uncommon during seismic or volcanic crises, complicate any event association algorithm. We present an event detection algorithm based on seismic migration of trace attributes into an a-priori three-dimensional (3D) velocity model and evaluate its capacity as an automatic detector compared to conventional methods. Detecting events using seismic migration removes the need for phase association. The event detector runs on a stack of time-shifted traces, which increases S/N and thus allows for a low detection threshold. Detected events come with an origin time and a location estimate, enabling a focused trace analysis, including P- and S-phase recognition, to discard false detections and build a basis for accurate automatic phase picking. We apply the migration-based detection algorithm to data from a semi-permanent seismic network at Hengill, an active volcanic region with several geothermal production sites in southwest Iceland. The network includes 26 stations with inter-station distances down to 5 km. Results show a high success rate compared to the manually picked catalogue (up to 90% detected). New detections that were missed by the standard detection routine show a generally good ratio of true to false alarms, i.e. most of the new events are locatable.
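    The core of migration-based detection, stacking traces back-shifted by predicted travel times over a candidate source location, can be sketched as follows (a simplified illustration with made-up traces and travel times, not the authors' implementation):

```python
def migrate_and_detect(traces, travel_times, threshold):
    """Stack station traces back-shifted by predicted travel times for one
    candidate source grid node; return candidate origin-time samples whose
    stacked amplitude exceeds the detection threshold. `traces` is a list of
    equal-length lists (e.g. STA/LTA characteristic functions), `travel_times`
    the per-station moveout in samples for this node."""
    n = len(traces[0])
    stack = [0.0] * n
    for trace, tt in zip(traces, travel_times):
        for t0 in range(n):
            t_arr = t0 + tt          # predicted arrival for origin time t0
            if t_arr < n:
                stack[t0] += trace[t_arr]
    stack = [s / len(traces) for s in stack]  # stacking raises S/N of coherent energy
    return [t for t, s in enumerate(stack) if s > threshold]

# Three stations, noise-free impulses consistent with origin time 10
tr = [[0.0] * 50 for _ in range(3)]
tts = [4, 7, 12]
for trace, tt in zip(tr, tts):
    trace[10 + tt] = 1.0
print(migrate_and_detect(tr, tts, 0.9))  # [10]
```

Because the traces only stack coherently at the true origin time, no phase association step is needed, which is the property the abstract exploits.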

  10. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how can regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations be efficiently utilized to detect water quality contamination events? This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has largely been limited to a single sensor station location and dataset. Single-station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes.

  11. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Feeding the event detection model the same information displayed to the experimental subjects allows comparison of the model's performance with that of the subjects.
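    The model's central assumption, that the observer converts displayed observations into event probabilities, can be illustrated with a one-feature Gaussian discriminant (all parameter values here are illustrative assumptions, not from the paper):

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def event_probability(x, mu_event, mu_normal, sigma, prior_event=0.5):
    """Posterior probability that an event occurred, given a displayed
    observation x, under a one-feature Gaussian discriminant model."""
    p_e = gaussian_pdf(x, mu_event, sigma) * prior_event
    p_n = gaussian_pdf(x, mu_normal, sigma) * (1 - prior_event)
    return p_e / (p_e + p_n)

# Process reading near the normal mean vs. near the event mean:
print(round(event_probability(0.0, 1.0, 0.0, 0.5), 3))  # 0.119
print(round(event_probability(1.0, 1.0, 0.0, 0.5), 3))  # 0.881
```

A yes/no decision rule then thresholds this posterior, mirroring the subjects' judgments in the experiment.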

  12. Improving the prospects for detecting extrasolar planets in gravitational microlensing events in 2002

    NASA Astrophysics Data System (ADS)

    Bond, I. A.; Abe, F.; Dodd, R. J.; Hearnshaw, J. B.; Kilmartin, P. M.; Masuda, K.; Matsubara, Y.; Muraki, Y.; Noda, S.; Petterson, O. K. L.; Rattenbury, N. J.; Reid, M.; Saito, To.; Saito, Y.; Sako, T.; Skuljan, J.; Sullivan, D. J.; Sumi, T.; Wilkinson, S.; Yamada, R.; Yanagisawa, T.; Yock, P. C. M.

    2002-03-01

    Gravitational microlensing events of high magnification have been shown to be promising targets for detecting extrasolar planets. However, only a few events of high magnification have been found using conventional survey techniques. Here we demonstrate that high-magnification events can be readily found in microlensing surveys using a strategy that combines high-frequency sampling of target fields with on-line difference imaging analysis. We present 10 microlensing events with peak magnifications greater than 40 that were detected in real-time towards the Galactic bulge during 2001 by the Microlensing Observations in Astrophysics (MOA) project. We show that Earth-mass planets can be detected in future events such as these through intensive follow-up observations around the event peaks. We report this result with urgency as a similar number of such events are expected in 2002.
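    For context, the peak magnification of a point-source point-lens event is set by the minimum impact parameter u (in Einstein radii) via the standard relation A(u) = (u² + 2)/(u·√(u² + 4)); reaching the abstract's A > 40 requires a very close source-lens passage:

```python
import math

def magnification(u):
    """Point-source point-lens magnification for impact parameter u
    (in units of the Einstein radius)."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

# A peak magnification of about 40 corresponds to u ~ 0.025 Einstein radii:
print(round(magnification(0.025), 1))  # 40.0
```

Magnification grows roughly as 1/u for small u, which is why such high-magnification events are rare in conventional surveys.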

  13. Detection of severe weather events using new remote sensing methods

    NASA Astrophysics Data System (ADS)

    Choy, S.; Wang, C.; Zhang, K.; Kuleshov, Y.

    2012-04-01

    The potential of using ground- and space-based Global Positioning System (GPS) observations for studying severe weather events is presented using the March 2010 Melbourne storm as a case study. This event generated record rainfall and flash flooding across the state of Victoria, Australia. The Victorian GPSnet is the only state-wide and densest ground-based GPS infrastructure in Australia. This provides a unique opportunity to study the spatial and temporal variations in precipitable water vapour (PWV) as the storm passed over the network. The results show strong spatial and temporal correlation between variations of the ground-based GPS-PWV estimates and the thunderstorm passage. This finding demonstrates that the ground-based GPS techniques can supplement conventional meteorological observations in studying, monitoring, and potentially predicting severe weather events. The advantage of using ground-based GPS technique is that it is capable of providing continuous observation of the storm passage with high temporal resolution; while the spatial resolution of the distribution of water vapour is dependent on the geographical location and density of the GPS stations. The results from the space-based GPS radio occultation (RO) technique, on the other hand, are not as robust. Although GPS RO can capture the dynamics of the atmosphere with high vertical resolution, its limited geographical coverage in a local region and its temporal resolution over a short period of time raise an important question about its potential for monitoring severe weather events, particularly local thunderstorms which have a relatively short life-span. GPS RO technique will be more suitable for long-term climatology studies over a large area. It is anticipated that the findings from this study will encourage further research into using GPS meteorology technique for monitoring and forecasting severe weather events in the Australian region.

  14. Detecting adverse events for patient safety research: a review of current methodologies.

    PubMed

    Murff, Harvey J; Patel, Vimla L; Hripcsak, George; Bates, David W

    2003-01-01

    Promoting patient safety is a national priority. To evaluate interventions for reducing medical errors and adverse events, effective methods for detecting such events are required. This paper reviews the current methodologies for detection of adverse events and discusses their relative advantages and limitations. It also presents a cognitive framework for error monitoring and detection. While manual chart review has been considered the "gold standard" for identifying adverse events in many patient safety studies, this methodology is expensive and imperfect. Investigators have developed, or are currently evaluating, several electronic methods that can detect adverse events using coded data, free-text clinical narratives, or a combination of techniques. Advances in these systems will greatly facilitate our ability to monitor adverse events and promote patient safety research. But these systems will perform optimally only if we improve our understanding of the fundamental nature of errors and the ways in which the human mind can naturally, but erroneously, contribute to the problems that we observe.

  15. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.

  16. Detection of upper airway status and respiratory events by a current generation positive airway pressure device.

    PubMed

    Li, Qing Yun; Berry, Richard B; Goetting, Mark G; Staley, Bethany; Soto-Calderon, Haideliza; Tsai, Sheila C; Jasko, Jeffrey G; Pack, Allan I; Kuna, Samuel T

    2015-04-01

    Study objectives: To compare a positive airway pressure (PAP) device's detection of respiratory events and airway status during device-detected apneas with events scored on simultaneous polysomnography (PSG). Design: Prospective PSGs of patients with sleep apnea using a new-generation PAP device. Setting: Four clinical and academic sleep centers. Patients: Forty-five patients with obstructive sleep apnea (OSA) and complex sleep apnea (Comp SA) performed a PSG on PAP levels adjusted to induce respiratory events. Interventions: None. Measurements and results: PAP device data identifying the type of respiratory event and whether the airway during a device-detected apnea was open or obstructed were compared to time-synced, manually scored respiratory events on the simultaneous PSG recording. Intraclass correlation coefficients between device-detected and PSG-scored events were 0.854 for the apnea-hypopnea index (AHI), 0.783 for the apnea index, 0.252 for the hypopnea index, and 0.098 for the respiratory event-related arousals index. At a device AHI (AHIFlow) of 10 events/h, the area under the receiver operating characteristic curve was 0.98, with sensitivity 0.92 and specificity 0.84. AHIFlow tended to overestimate AHI on PSG at values less than 10 events/h. The device detected that the airway was obstructed in 87.4% of manually scored obstructive apneas. Of the device-detected apneas with a clear airway, a minority (15.8%) were manually scored as obstructive apneas. Conclusions: A device-detected apnea-hypopnea index (AHIFlow) < 10 events/h on a positive airway pressure device is strong evidence of good treatment efficacy. Device-detected airway status agrees closely with the presumed airway status during polysomnography-scored events, but should not be equated with a specific type of respiratory event. © 2015 Associated Professional Sleep Societies, LLC.
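    The screening comparison of device AHI against PSG AHI reduces to a 2 × 2 classification at the chosen cut-off. A minimal sketch with invented (AHIFlow, PSG AHI) pairs, not the study's data:

```python
def screening_performance(pairs, threshold=10.0):
    """Sensitivity and specificity of the device AHI (AHIFlow) against
    PSG-scored AHI at a common cut-off. `pairs` is a list of
    (ahi_flow, ahi_psg) tuples."""
    tp = sum(1 for f, p in pairs if f >= threshold and p >= threshold)
    fn = sum(1 for f, p in pairs if f < threshold and p >= threshold)
    tn = sum(1 for f, p in pairs if f < threshold and p < threshold)
    fp = sum(1 for f, p in pairs if f >= threshold and p < threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative pairs only (the study reports sensitivity 0.92, specificity 0.84):
data = [(32, 35), (14, 12), (11, 8), (9, 11), (4, 3), (6, 5)]
sens, spec = screening_performance(data)
print(round(sens, 2), round(spec, 2))  # 0.67 0.67
```

The receiver operating characteristic curve in the abstract is what results from sweeping `threshold` over its range.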

  17. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application

    PubMed Central

    Mur, Angel; Dormido, Raquel; Vega, Jesús; Duro, Natividad; Dormido-Canto, Sebastian

    2016-01-01

    In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method. To this end, an example of the performance of the algorithm in detecting artifacts is shown. The results show that although both methods obtain similar classification performance, the proposed method detects events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window within which an optimal detection and characterization of events is found. The detection of events can be applied in real time. PMID:27120605

  18. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    Fragments of the report's Simple Event Correlator rule set, with regular expressions taken from the SANS whitepaper "... Detecting Attacks on Web Applications from Log Files": rules of the form "type=Single continue=TakeNext ptype=RegExp ..." look for image tags in web logs, and a shellcmd action invokes "/home/user/sec-2.5.3/common/syslogclient" to forward a synthetic alert of the form "$2|$1|xss detected in image tag: $3" along with the raw log.

  19. Supervised machine learning on a network scale: application to seismic event classification and detection

    NASA Astrophysics Data System (ADS)

    Reynen, Andrew; Audet, Pascal

    2017-09-01

    A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as the attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with few false alarms, allowing for a larger automatic event catalogue with a high degree of trust.
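    A minimal stand-in for this kind of feature-based event classification is a nearest-centroid rule over (polarization, dominant frequency) vectors; the paper uses regression, and all feature values below are made up:

```python
import math

def nearest_centroid(features, centroids):
    """Classify a feature vector (e.g. polarization, dominant frequency in Hz)
    by the nearest class centroid in feature space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Hypothetical class centroids learned from a small labeled catalogue:
centroids = {
    "earthquake": (0.8, 5.0),   # high polarization, lower dominant frequency
    "blast":      (0.3, 15.0),  # lower polarization, higher dominant frequency
}
print(nearest_centroid((0.7, 6.0), centroids))   # earthquake
print(nearest_centroid((0.4, 14.0), centroids))  # blast
```

The appeal of such generalized features is exactly what the abstract notes: they tolerate an approximate velocity model because they are computed over long time spans.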

  20. Contamination event detection using multiple types of conventional water quality sensors in source water.

    PubMed

    Liu, Shuming; Che, Han; Smith, Kate; Chen, Lei

    2014-08-01

    Early warning systems are often used to detect deliberate and accidental contamination events in a water system. Conventional methods normally detect a contamination event by comparing the predicted and observed water quality values from one sensor. This paper proposes a new method for event detection that explores the correlative relationships between multiple types of conventional water quality sensors. The performance of the proposed method was evaluated using data from contaminant injection experiments in a laboratory. Results from these experiments demonstrated the correlative responses of multiple types of sensors. It was observed that the proposed method could detect a contamination event 9 minutes after the introduction of lead nitrate solution with a concentration of 0.01 mg/L. The proposed method employs three parameters, whose impact on detection performance was also analyzed. The initial analysis showed that the correlative response is contaminant-specific, which implies that it can be utilized not only for contamination detection, but also for contaminant identification.
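    A simplified sketch of exploiting correlative multi-sensor responses: declare an event only when several different sensor types deviate from baseline at the same time step. The streams, baselines, and tolerances below are invented, and the paper's actual three-parameter method differs:

```python
def correlated_event(readings, baselines, tolerances, min_sensors=2):
    """Flag time steps at which at least `min_sensors` different sensor types
    deviate from their baselines simultaneously.
    readings: {sensor: [values...]}; baselines/tolerances: {sensor: float}."""
    n = len(next(iter(readings.values())))
    alarms = []
    for t in range(n):
        deviating = [s for s, vals in readings.items()
                     if abs(vals[t] - baselines[s]) > tolerances[s]]
        if len(deviating) >= min_sensors:
            alarms.append(t)
    return alarms

# Hypothetical pH / chlorine / turbidity streams; injection at t = 3
readings = {
    "pH":        [7.0, 7.0, 7.1, 6.4, 6.3, 6.5],
    "chlorine":  [0.8, 0.8, 0.7, 0.3, 0.2, 0.3],
    "turbidity": [1.0, 1.1, 1.0, 1.9, 2.1, 2.0],
}
baselines  = {"pH": 7.0, "chlorine": 0.8, "turbidity": 1.0}
tolerances = {"pH": 0.3, "chlorine": 0.3, "turbidity": 0.5}
print(correlated_event(readings, baselines, tolerances))  # [3, 4, 5]
```

Requiring concurrent deviation across sensor types is what suppresses the single-sensor false alarms the abstract criticizes.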

  1. Model-Based Multifactor Dimensionality Reduction to detect epistasis for quantitative traits in the presence of error-free and noisy data.

    PubMed

    Mahachie John, Jestinah M; Van Lishout, François; Van Steen, Kristel

    2011-06-01

    Detecting gene-gene interactions, or epistasis, in studies of human complex diseases is a major challenge in epidemiology. To address this problem, several methods have been developed, mainly in the context of data dimensionality reduction. One of these methods, Model-Based Multifactor Dimensionality Reduction, has so far mainly been applied to case-control studies. In this study, we evaluate the power of Model-Based Multifactor Dimensionality Reduction for quantitative traits to detect gene-gene interactions (epistasis) in the presence of error-free and noisy data. The sources of error considered are genotyping errors, missing genotypes, phenotypic mixtures, and genetic heterogeneity. Our simulation study encompasses a variety of settings with varying minor allele frequencies and genetic variances for different epistasis models. On each simulated data set, we performed Model-Based Multifactor Dimensionality Reduction in two ways: with and without adjustment for the main effects of (known) functional SNPs. In line with their binary trait counterparts, our simulations show that power is lowest in the presence of phenotypic mixtures or genetic heterogeneity compared to scenarios with missing genotypes or genotyping errors. In addition, empirical power estimates are reduced even further with main effects corrections, but at the same time, false-positive percentages are reduced as well. In conclusion, phenotypic mixtures and genetic heterogeneity remain challenging for epistasis detection, and careful thought must be given to the way important lower-order effects are accounted for in the analysis.

  2. Toward automatic detection and prevention of adverse drug events.

    PubMed

    Leroy, Nicolas; Chazard, Emmanuel; Beuscart, Régis; Beuscart-Zephir, Marie Catherine

    2009-01-01

    Adverse Drug Events (ADEs) due to medication errors and human factors are a major public health issue. They endanger patient safety and cause considerable extra healthcare costs. The European project PSIP (Patient Safety through Intelligent Procedures in medication) aims to identify and prevent ADEs. Data mining of structured hospital databases will give a list of observed ADEs with frequencies and probabilities, thereby giving a better understanding of potential risks. The main objective of the project is to develop innovative knowledge based on the mining results and to deliver it to professionals and patients, in the form of alerts and decision support functions, as contextualized knowledge fitting the local risk parameters.

  3. Power System Extreme Event Detection: The VulnerabilityFrontier

    SciTech Connect

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We present the usefulness of this analysis with a complete analysis of a 30-bus system, and present results for larger systems.

  4. Data-mining-based detection of adverse drug events.

    PubMed

    Chazard, Emmanuel; Preda, Cristian; Merlin, Béatrice; Ficheur, Grégoire; Beuscart, Régis

    2009-01-01

    Every year, adverse drug events (ADEs) are known to be responsible for 98,000 deaths in the USA. Classical methods rely on report statements, expert knowledge, and staff-operated record review. One of our objectives, in the PSIP project framework, is to use data mining (e.g., decision trees) to electronically identify situations leading to risk of ADEs. 10,500 hospitalization records from Denmark and France were used. 500 rules were automatically obtained, which are currently being validated by experts. A decision support system to prevent ADEs is then to be developed. The article examines a decision tree and the rules in the field of vitamin K antagonists.

  5. Nonthreshold-based event detection for 3D environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches to event detection are mainly based on predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds, but rather by some complex pattern in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events by matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach in detecting events of complex phenomena from real-life records.
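    The nonthreshold idea, matching a whole data map against a spatiotemporal event template rather than thresholding single attributes, can be sketched as follows (the template and values are hypothetical):

```python
import math

def pattern_distance(data_map, template):
    """Normalized Euclidean distance between an observed sensor data map and
    an event template (both flat lists over the same grid)."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(data_map, template)))
    return d / math.sqrt(len(data_map))

def detect_complex_event(data_map, template, threshold=0.5):
    """Declare the event when the whole map matches the template closely
    enough, instead of checking any single attribute against a threshold."""
    return pattern_distance(data_map, template) < threshold

template = [0.1, 0.4, 0.9, 0.4, 0.1]  # e.g. spatial concentration profile of a leak
print(detect_complex_event([0.2, 0.5, 0.8, 0.3, 0.1], template))  # True
print(detect_complex_event([0.9, 0.9, 0.9, 0.9, 0.9], template))  # False
```

Note that the second map would trip a naive per-node threshold everywhere, yet it does not match the leak's spatial pattern, which is the distinction the abstract draws.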

  6. Neuro-evolutionary event detection technique for downhole microseismic surveys

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam; Salehi, Iraj

    2016-01-01

    Recent years have seen a significant increase in borehole microseismic data acquisition programs associated with unconventional reservoir developments such as hydraulic fracturing programs for shale oil and gas. The data so acquired is used for hydraulic fracture monitoring and diagnostics and therefore, the quality of the data in terms of resolution and accuracy has a significant impact on its value to the industry. Borehole microseismic data acquired in such environments typically suffer from propagation effects due to the presence of thin interbedded shale layers as well as noise and interference effects. Moreover, acquisition geometry has significant impact on detectability across portions of the sensor array. Our work focuses on developing robust first arrival detection and pick selection workflow for both P and S waves specifically designed for such environments. We introduce a novel workflow for refinement of picks with immunity towards significant noise artifacts and applicability over data with very low signal-to-noise ratio provided some accurate picks have already been made. This workflow utilizes multi-step hybrid detection and classification routine which makes use of a neural network based autopicker for initial picking and an evolutionary algorithm for pick refinement. We highlight the results from an actual field case study including multiple examples demonstrating immunity towards noise and compare the effectiveness of the workflow with two contemporary autopicking routines without the application of the shared detection/refinement procedure. Finally, we use a windowed waveform cross-correlation based uncertainty estimation method for potential quality control purposes. While the workflow was developed to work with the neural network based autopicker, it can be used with any other traditional autopicker and provides significant improvements in pick detection across seismic gathers.

  7. Spatial-temporal event detection in climate parameter imagery.

    SciTech Connect

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
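
A minimal sketch of the regression-based anomaly detection idea: fit a per-pixel linear trend over time and flag residuals that exceed a k-sigma bound. This is a toy stand-in for the statistical-parametric-mapping approach, not the authors' implementation; the names and the 3-sigma rule are assumptions.

```python
import numpy as np

def regression_anomalies(series, k=3.0):
    # Fit a linear trend to each pixel's time series (columns of `series`)
    # and flag time steps whose residual exceeds k standard deviations.
    t = np.arange(series.shape[0])
    A = np.vstack([t, np.ones_like(t)]).T            # design matrix [t, 1]
    coef, *_ = np.linalg.lstsq(A, series, rcond=None)
    resid = series - A @ coef
    z = resid / (resid.std(axis=0, keepdims=True) + 1e-12)
    return np.abs(z) > k

rng = np.random.default_rng(1)
series = 0.01 * np.arange(50)[:, None] + rng.normal(0.0, 0.05, (50, 4))
series[30, 2] += 1.0                                 # inject one anomaly
mask = regression_anomalies(series)
```

The trend removal is what lets a genuine anomaly stand out against a slowly drifting background such as a seasonal NDVI signal.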

  8. Detecting impacts of extreme events with ecological in situ monitoring networks

    NASA Astrophysics Data System (ADS)

    Mahecha, Miguel D.; Gans, Fabian; Sippel, Sebastian; Donges, Jonathan F.; Kaminski, Thomas; Metzger, Stefan; Migliavacca, Mirco; Papale, Dario; Rammig, Anja; Zscheischler, Jakob

    2017-09-01

    Extreme hydrometeorological conditions typically impact ecophysiological processes on land. Satellite-based observations of the terrestrial biosphere provide an important reference for detecting and describing the spatiotemporal development of such events. However, in-depth investigations of ecological processes during extreme events require additional in situ observations. The question is whether the density of existing ecological in situ networks is sufficient for analysing the impact of extreme events, and what the expected event detection rates of ecological in situ networks of a given size are. To assess these issues, we build a baseline of extreme reductions in the fraction of absorbed photosynthetically active radiation (FAPAR), identified by a new event detection method tailored to identify extremes of regional relevance. We then investigate the event detection success rates of hypothetical networks of varying sizes. Our results show that large extremes can be reliably detected with relatively small networks, but also reveal a linear decay of detection probabilities towards smaller extreme events in log-log space. For instance, networks with ≈ 100 randomly placed sites in Europe yield a ≥ 90 % chance of detecting the eight largest (typically very large) extreme events, but only a ≥ 50 % chance of capturing the 39 largest events. These findings are consistent with probability-theoretic considerations, but the slopes of the decay rates deviate due to temporal autocorrelation and the exact implementation of the extreme event detection algorithm. Using the examples of AmeriFlux and NEON, we then investigate to what degree ecological in situ networks can capture extreme events of a given size. Consistent with our theoretical considerations, we find that today's systematically designed networks (i.e. NEON) reliably detect the largest extremes, but that the extreme event detection rates are not higher than would be achieved by randomly designed networks. Spatio

  9. Simple Movement and Detection in Discrete Event Simulation

    DTIC Science & Technology

    2005-12-01

    with a description of uniform linear motion in the following section. We will then consider the simplest kind of sensing, the “cookie-cutter.” A... cookie-cutter sensor sees everything that is within its range R, and must be notified at the precise time a target enters its range. In a time-step... simulation, cookie-cutter detection is very easy. Simply compute the distance between the sensor and the target at each time step. If the target is

  10. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long range (power-law) correlations and non-Gaussian distribution of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. Actually, the local flux-gradient relationship is greatly preferred due to a clearer physical meaning, allowing one to perform direct comparisons with experimental data, and, especially, due to smaller computational costs in numerical models. In particular, the linearity of this relationship allows one to define a transport coefficient (e.g., turbulent diffusivity) and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model is strongly dependent on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  11. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    SciTech Connect

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.

  12. Multi-instance dictionary learning for detecting abnormal events in surveillance videos.

    PubMed

    Huo, Jing; Gao, Yang; Yang, Wanqi; Yin, Hujun

    2014-05-01

    In this paper, a novel method termed Multi-Instance Dictionary Learning (MIDL) is presented for detecting abnormal events in crowded video scenes. With respect to multi-instance learning, each event (video clip) in videos is modeled as a bag containing several sub-events (local observations); while each sub-event is regarded as an instance. The MIDL jointly learns a dictionary for sparse representations of sub-events (instances) and multi-instance classifiers for classifying events into normal or abnormal. We further adopt three different multi-instance models, yielding the Max-Pooling-based MIDL (MP-MIDL), Instance-based MIDL (Inst-MIDL) and Bag-based MIDL (Bag-MIDL), for detecting both global and local abnormalities. The MP-MIDL classifies observed events by using bag features extracted via max-pooling over sparse representations. The Inst-MIDL and Bag-MIDL classify observed events by the predicted values of corresponding instances. The proposed MIDL is evaluated and compared with the state-of-the-art methods for abnormal event detection on the UMN (for global abnormalities) and the UCSD (for local abnormalities) datasets and results show that the proposed MP-MIDL and Bag-MIDL achieve either comparable or improved detection performances. The proposed MIDL method is also compared with other multi-instance learning methods on the task and superior results are obtained by the MP-MIDL scheme.
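
The MP-MIDL bag feature — max-pooling over sparse representations of the instances inside one event — can be illustrated with a toy sparse coder. A few ISTA iterations over a fixed dictionary stand in for the joint dictionary learning described above; the names, the identity dictionary, and all parameters are assumptions.

```python
import numpy as np

def sparse_codes(X, D, lam=0.1, lr=0.1, iters=200):
    # Toy ISTA sparse coding: represent each instance (row of X) as a
    # sparse combination of dictionary atoms (rows of D).
    Z = np.zeros((X.shape[0], D.shape[0]))
    for _ in range(iters):
        Z = Z - lr * (Z @ D - X) @ D.T       # gradient step on ||ZD - X||^2 / 2
        Z = np.sign(Z) * np.maximum(np.abs(Z) - lr * lam, 0.0)  # soft-threshold
    return Z

def bag_feature(instances, D):
    # MP-MIDL-style bag descriptor: max-pool the (absolute) sparse codes
    # of the sub-event instances belonging to one event (video clip).
    return np.abs(sparse_codes(instances, D)).max(axis=0)

D = np.eye(4)                                # trivial orthonormal dictionary
bag = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.0, 2.0, 0.0, 0.0]])       # two instances in one event
f = bag_feature(bag, D)
```

The pooled vector summarizes which atoms fire anywhere in the clip; a bag-level classifier (normal vs. abnormal) would then be trained on such descriptors.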

  13. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
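
A hedged sketch of the network-level correlation idea: correlate each station's record against an empirical template and stack the per-station correlation traces, so weak but coherent events rise above single-station noise. This is not the authors' implementation; the white-noise template, the stacking by simple averaging, and the threshold are assumptions.

```python
import numpy as np

def station_cc(trace, template):
    # Sliding normalized cross-correlation of one station's continuous
    # record against its empirical template waveform.
    n = len(template)
    tpl = (template - template.mean()) / (template.std() + 1e-12)
    out = np.empty(len(trace) - n + 1)
    for i in range(len(out)):
        w = trace[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[i] = float((w * tpl).mean())
    return out

def network_detect(traces, templates, thresh=0.6):
    # Network-level detection: average the per-station correlation traces
    # so coherent arrivals stack constructively, then threshold the stack.
    stack = np.mean([station_cc(tr, tp) for tr, tp in zip(traces, templates)],
                    axis=0)
    return np.flatnonzero(stack >= thresh), stack

rng = np.random.default_rng(2)
template = rng.normal(0.0, 1.0, 50)          # stand-in empirical waveform
traces = rng.normal(0.0, 0.2, size=(3, 500))
traces[:, 200:250] += template               # same event seen at 3 stations
dets, stack = network_detect(traces, [template] * 3)
```

Because detection, association, and location all ride on the stacked correlation peak, no phase picks or travel-time curves enter the loop.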

  14. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it accounts for the various properties that reflect an event's status, the composite event is more consistent with the objective world, which makes its study more realistic. In this paper, we analyze the characteristics of the composite event; we then propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. In the case that the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The composite event judgment mechanism based on fuzzy decisions retains the advantages of fuzzy-logic-based algorithms; moreover, it does not need the support of a huge rule base and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690

  15. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analysis conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular than event-time measurements), and incorporating the time factor through a time decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts of contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
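
The two roles of the SVM weights described above — balancing the rare event class against the abundant normal class, and applying a time decay coefficient — can be sketched as a sample-weight computation whose output would be handed to a weighted-SVM trainer. The `decay` parameter here is a hypothetical knob; in the study all parameters are set by data-driven optimization.

```python
import numpy as np

def svm_sample_weights(labels, times, decay=0.05):
    # (1) Class balancing: each class's weight is inversely proportional
    #     to its size, blurring the normal-vs-event size difference.
    # (2) Time decay: exponential down-weighting of older observations.
    labels = np.asarray(labels)
    times = np.asarray(times, dtype=float)
    class_w = {c: labels.size / (2.0 * np.sum(labels == c))
               for c in np.unique(labels)}
    time_w = np.exp(-decay * (times.max() - times))
    return np.array([class_w[c] for c in labels]) * time_w

labels = [0] * 8 + [1] * 2            # event-time measurements are rare
times = np.arange(10)                 # oldest -> newest
w = svm_sample_weights(labels, times)
```

The resulting vector plays the role of per-sample weights (e.g. a `sample_weight` argument in common SVM implementations), so a recent event measurement dominates an old normal one.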

  16. Find your manners: how do infants detect the invariant manner of motion in dynamic events?

    PubMed

    Pruden, Shannon M; Göksun, Tilbe; Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta M

    2012-01-01

    To learn motion verbs, infants must be sensitive to the specific event features lexicalized in their language. One event feature important for the acquisition of English motion verbs is the manner of motion. This article examines when and how infants detect manners of motion across variations in the figure's path. Experiment 1 shows that 13- to 15-month-olds (N = 30) can detect an invariant manner of motion when the figure's path changes. Experiment 2 reveals that reducing the complexity of the events, by dampening the figure's path, helps 10- to 12-month-olds (N = 19) detect the invariant manner. These findings suggest that: (a) infants notice event features lexicalized in English motion verbs, and (b) attention to manner can be promoted by reducing event complexity. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  17. Object-Oriented Query Language For Events Detection From Images Sequences

    NASA Astrophysics Data System (ADS)

    Ganea, Ion Eugen

    2015-09-01

    This paper presents a method to represent the events extracted from image sequences and the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and the robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, the final goal of this research.

  18. Minimal elastographic modeling of breast cancer for model based tumor detection in a digital image elasto tomography (DIET) system

    NASA Astrophysics Data System (ADS)

    Lotz, Thomas F.; Muller, Natalie; Hann, Christopher E.; Chase, J. Geoffrey

    2011-03-01

    Digital Image Elasto Tomography (DIET) is a non-invasive breast cancer screening technology that images the surface motion of a breast under harmonic mechanical actuation. A new approach capturing the dynamics and characteristics of tumor behavior is presented. A simple mechanical model of the breast is used to identify a transfer function relating the input harmonic actuation to the output surface displacements using imaging data of a silicone phantom. Areas of higher stiffness cause significant changes in damping and resonant frequencies as seen in the resulting Bode plots. A case study on a healthy and a tumor silicone breast phantom shows the potential for this model-based method to clearly distinguish cancerous from healthy tissue as well as correctly predict the tumor position.

  19. MCD for detection of event-based landslides

    NASA Astrophysics Data System (ADS)

    Mondini, A. C.; Chang, K.; Guzzetti, F.

    2011-12-01

    Landslides play an important role in the landscape evolution of mountainous terrain. They also present a socioeconomic problem in terms of risk for people and properties. Landslide inventory maps are not available for many areas affected by slope instabilities, resulting in a lack of primary information for the comprehension of the phenomenon, evaluation of relative landslide statistics, and civil protection operations on large scales. Traditional methods for the preparation of landslide inventory maps are based on the geomorphological interpretation of stereoscopic aerial photography and field surveys. These methods are expensive and time consuming. The exploitation of new remote sensing data, in particular very high resolution (VHR) satellite images, and new dedicated methods present an alternative to the traditional methods and are at the forefront of modern landslide research. Recent studies have shown the possibility of producing accurate landslide maps, reducing the time and resources required for their compilation and systematic update. This paper presents the Multiple Change Detection (MCD) technique, a new method that has shown promising results in landslide mapping. Through supervised or unsupervised classifiers, MCD combines different algorithms of change detection metrics, such as change in Normalized Differential Vegetation Index, spectral angle, principal component analysis, and independent component analysis, and applies them to a multi-temporal set of VHR satellite images to distinguish new landslides from stable areas. MCD has been applied with success in different geographical areas and with different satellite images, suggesting it is a reliable and robust technique. The technique can distinguish old from new landslides and capture runout features. Results of these case studies will be presented at the conference.
Also to be presented are new developments of MCD involving the introduction of a priori information on landslide susceptibility within

  20. Detection of Upper Airway Status and Respiratory Events by a Current Generation Positive Airway Pressure Device

    PubMed Central

    Li, Qing Yun; Berry, Richard B.; Goetting, Mark G.; Staley, Bethany; Soto-Calderon, Haideliza; Tsai, Sheila C.; Jasko, Jeffrey G.; Pack, Allan I.; Kuna, Samuel T.

    2015-01-01

    Study Objectives: To compare a positive airway pressure (PAP) device's detection of respiratory events and airway status during device-detected apneas with events scored on simultaneous polysomnography (PSG). Design: Prospective PSGs of patients with sleep apnea using a new-generation PAP device. Settings: Four clinical and academic sleep centers. Patients: Forty-five patients with obstructive sleep apnea (OSA) and complex sleep apnea (Comp SA) performed a PSG on PAP levels adjusted to induce respiratory events. Interventions: None. Measurements and Results: PAP device data identifying the type of respiratory event and whether the airway during a device-detected apnea was open or obstructed were compared to time-synced, manually scored respiratory events on simultaneous PSG recording. Intraclass correlation coefficients between device-detected and PSG scored events were 0.854 for apnea-hypopnea index (AHI), 0.783 for apnea index, 0.252 for hypopnea index, and 0.098 for respiratory event-related arousals index. At a device AHI (AHIFlow) of 10 events/h, area under the receiver operating characteristic curve was 0.98, with sensitivity 0.92 and specificity 0.84. AHIFlow tended to overestimate AHI on PSG at values less than 10 events/h. The device detected that the airway was obstructed in 87.4% of manually scored obstructive apneas. Of the device-detected apneas with clear airway, a minority (15.8%) were manually scored as obstructive apneas. Conclusions: A device-detected apnea-hypopnea index (AHIFlow) < 10 events/h on a positive airway pressure device is strong evidence of good treatment efficacy. Device-detected airway status agrees closely with the presumed airway status during polysomnography scored events, but should not be equated with a specific type of respiratory event. Citation: Li QY, Berry RB, Goetting MG, Staley B, Soto-Calderon H, Tsai SC, Jasko JG, Pack AI, Kuna ST. 
Detection of upper airway status and respiratory events by a current generation positive

  1. An adaptive fault-tolerant event detection scheme for wireless sensor networks.

    PubMed

    Yim, Sung-Jib; Choi, Yoon-Hwa

    2010-01-01

    In this paper, we present an adaptive fault-tolerant event detection scheme for wireless sensor networks. Each sensor node detects an event locally in a distributed manner by using the sensor readings of its neighboring nodes. Confidence levels of sensor nodes are used to dynamically adjust the threshold for decision making, resulting in consistent performance even with increasing number of faulty nodes. In addition, the scheme employs a moving average filter to tolerate most transient faults in sensor readings, reducing the effective fault probability. Only three bits of data are exchanged to reduce the communication overhead in detecting events. Simulation results show that event detection accuracy and false alarm rate are kept very high and low, respectively, even in the case where 50% of the sensor nodes are faulty.
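
Two ingredients of the scheme above — a moving average filter that tolerates transient faults in sensor readings, and a confidence-weighted local decision over neighbors' votes — might look like this in minimal form. The vote encoding and fixed threshold are assumptions, not the paper's exact three-bit protocol or adaptive threshold rule.

```python
import numpy as np

def smooth(readings, k=5):
    # Moving-average filter over a node's recent readings; short transient
    # faults are averaged out, lowering the effective fault probability.
    return np.convolve(readings, np.ones(k) / k, mode="valid")

def local_decision(votes, confidences, theta=0.5):
    # Confidence-weighted majority over neighbors' 1-bit event votes:
    # nodes with low confidence (e.g. previously faulty) count for less,
    # keeping the decision consistent as faulty nodes accumulate.
    votes = np.asarray(votes, dtype=float)
    c = np.asarray(confidences, dtype=float)
    return float(votes @ c) / (c.sum() + 1e-12) >= theta

readings = np.zeros(12)
readings[4] = 5.0                     # one transient spike
smoothed = smooth(readings)
event = local_decision([1, 1, 1, 0], [1.0, 1.0, 1.0, 0.2])
```

In the sketch the spike of 5.0 is flattened to at most 1.0 by the filter, and the low-confidence dissenting neighbor barely moves the weighted vote.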

  2. Global Health Security: Building Capacities for Early Event Detection, Epidemiologic Workforce, and Laboratory Response.

    PubMed

    Balajee, S Arunmozhi; Arthur, Ray; Mounts, Anthony W

    The Global Health Security Agenda (GHSA) was launched in February 2014 to bring countries with limited capacity into compliance with the International Health Regulations (IHR) (2005). Recent international public health events, such as the appearance of Middle Eastern respiratory syndrome coronavirus and the reappearance of Ebola in West Africa, have highlighted the importance of early detection of disease events and the interconnectedness of countries. Surveillance systems that allow early detection and recognition of signal events, a public health infrastructure that allows rapid notification and information sharing within countries and across borders, a trained epidemiologic workforce, and a laboratory network that can respond appropriately and rapidly are emerging as critical components of an early warning and response system. This article focuses on 3 aspects of the GHSA that will lead to improved capacities for the detection and response to outbreaks as required by the IHR: (1) early detection and reporting of events, (2) laboratory capacity, and (3) a trained epidemiologic workforce.

  3. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
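
The dynamic query expansion step can be illustrated with a toy co-occurrence expander: starting from seed terms, each round adds the terms that co-occur most often with the current query in matching tweets. This omits the tweet-graph construction and the spatial scan statistics of the full approach; all names and the toy corpus are assumptions.

```python
from collections import Counter

def expand_query(seed_terms, tweets, rounds=2, top_k=3):
    # Toy dynamic query expansion: in each round, add the terms that
    # co-occur most frequently with the current query terms.
    terms = set(seed_terms)
    for _ in range(rounds):
        counts = Counter()
        for tweet in tweets:
            tokens = set(tweet.lower().split())
            if tokens & terms:               # tweet matches current query
                counts.update(tokens - terms)
        terms |= {t for t, _ in counts.most_common(top_k)}
    return terms

tweets = ["protest march downtown",
          "march police",
          "police teargas square",
          "cats are cute"]
expanded = expand_query({"protest"}, tweets)
```

Note how "police" only enters in the second round, via the first-round term "march" — the iterative expansion is what lets domain vocabulary grow beyond the seed.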

  4. Event-related complexity analysis and its application in the detection of facial attractiveness.

    PubMed

    Deng, Zhidong; Zhang, Zimu

    2014-11-01

    In this study, an event-related complexity (ERC) analysis method is proposed and used to explore the neural correlates of facial attractiveness detection in the context of a cognitive experiment. The ERC method gives a quantitative index for measuring the diverse brain activation properties that represent the neural correlates of event-related responses. This analysis reveals distinct effects of facial attractiveness processing and also provides further information that could not have been achieved from event-related potential alone.

  5. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2017-01-01

    The geographical area where a seismic event of magnitude M ≥ Mt is detected by a seismic station network, for a defined probability, is derived from a station probability of detection estimated as a function of epicentral distance. The latter is determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. To simulate the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. We attempt to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.

  6. Find Your Manners: How Do Infants Detect the Invariant Manner of Motion in Dynamic Events?

    ERIC Educational Resources Information Center

    Pruden, Shannon M.; Goksun, Tilbe; Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta M.

    2012-01-01

    To learn motion verbs, infants must be sensitive to the specific event features lexicalized in their language. One event feature important for the acquisition of English motion verbs is the manner of motion. This article examines when and how infants detect manners of motion across variations in the figure's path. Experiment 1 shows that 13- to…

  7. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    PubMed

    Lee, Seong-Hun

    2014-11-01

    There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, the multiplex polymerase chain reaction (PCR) and DNA chip have been developed as simultaneous detection methods for several biotech crops' events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola respectively along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for four crops' 12 events: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can be available for screening, gene-specific and event-specific analysis of biotech crops as efficient detection methods by saving on workload and time. © 2014 Society of Chemical Industry.

  8. Systematic detection of seismic events at Mount St. Helens with an ultra-dense array

    NASA Astrophysics Data System (ADS)

    Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.

    2016-12-01

    During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog is -0.5 and 1.1, respectively. Our detected catalog shows several intense swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring the seismic velocity changes at Mount St. Helens using the coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
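
The relative-magnitude calibration via a principal component fit to waveform amplitude ratios can be sketched as follows: treat the paired template/detected samples as a 2-D point cloud and read the amplitude ratio off the first principal component. A minimal illustration under assumed inputs, not the authors' code.

```python
import numpy as np

def relative_magnitude(template_wf, detected_wf):
    # Principal-component fit of the amplitude ratio: project the paired
    # samples onto the first PC of their 2x2 covariance; the PC slope is
    # the amplitude ratio that best explains both waveforms jointly.
    X = np.vstack([template_wf, detected_wf])
    cov = X @ X.T / X.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc = eigvecs[:, np.argmax(eigvals)]
    ratio = abs(pc[1] / pc[0])
    return float(np.log10(ratio))       # ~1 magnitude unit per factor of 10

t = np.arange(200)
template = np.sin(2 * np.pi * t / 25.0) * np.exp(-t / 80.0)
detected = 10.0 * template              # same source, 10x the amplitude
dmag = relative_magnitude(template, detected)
```

Unlike a simple peak-amplitude ratio, the PC fit uses every sample of both waveforms, which makes it robust when either trace is noisy.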

  9. Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems

    DTIC Science & Technology

    2009-03-01

directly addressed in this exposition; such methods include particle filtering, genetic algorithms, neural networks, and intelligent agents. A simple... Dependence – Criticality of Application – Numerous and Diverse Data Sources – Network Topology – Event Detection Algorithms • Typical Event Detection... algorithms • Neural networks • Intelligent agents • Composite Methods • Those methods that combine techniques within a category or

  10. Probabilistic approaches to fault detection in networked discrete event systems.

    PubMed

    Athanasopoulou, Eleftheria; Hadjicostis, Christoforos N

    2005-09-01

In this paper, we consider distributed systems that can be modeled as finite state machines with known behavior under fault-free conditions, and we study the detection of a general class of faults that manifest themselves as permanent changes in the next-state transition functionality of the system. This scenario could arise in a variety of situations encountered in communication networks, including faults that occur due to design or implementation errors during the execution of communication protocols. In our approach, fault diagnosis is performed by an external observer/diagnoser that functions as a finite state machine and which has access to the input sequence applied to the system but has only limited access to the system state or output. In particular, we assume that the observer/diagnoser is only able to obtain partial information regarding the state of the given system at intermittent time intervals that are determined by certain synchronizing conditions between the system and the observer/diagnoser. By adopting a probabilistic framework, we analyze ways to optimally choose these synchronizing conditions and develop adaptive strategies that achieve a low probability of aliasing, i.e., a low probability that the external observer/diagnoser incorrectly declares the system as fault-free. An application of these ideas in the context of protocol testing/classification is provided as an example.
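The observer/diagnoser idea, including aliasing, can be illustrated with a toy finite state machine. Everything below is hypothetical (the 4-state machines, the parity "partial observation", and the sync points are invented for illustration): the observer replays the input sequence on the fault-free model and compares partial state observations at the sync points; a fault whose effect happens to match the model's observation at every sync point is exactly the aliasing case the paper tries to make improbable.

```python
# Hypothetical 4-state machine: next_state[state] = (on input 0, on input 1).
FAULT_FREE = {0: (1, 2), 1: (3, 0), 2: (0, 3), 3: (2, 1)}

def run(machine, inputs, state=0):
    """Apply an input sequence and return the visited state trace."""
    trace = [state]
    for bit in inputs:
        state = machine[state][bit]
        trace.append(state)
    return trace

def diagnose(system_trace, inputs, sync_points, observe=lambda s: s % 2):
    """Replay the inputs on the fault-free model and compare partial
    observations (here: state parity, a stand-in for limited state access)
    at the given synchronizing points."""
    model_trace = run(FAULT_FREE, inputs)
    for k in sync_points:
        if observe(system_trace[k]) != observe(model_trace[k]):
            return "faulty"
    return "fault-free"  # may be aliasing if the fault hid at every sync point
```

With a corrupted transition, one choice of sync points catches the fault while another aliases, which is why the paper optimizes the synchronizing conditions probabilistically.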

  11. Comparison of the STA/LTA and power spectral density methods for microseismic event detection

    NASA Astrophysics Data System (ADS)

    Vaezi, Yoones; Van der Baan, Mirko

    2015-12-01

Robust event detection and picking is a prerequisite for reliable (micro-)seismic interpretations. Detection of weak events is a common challenge among the various available event detection algorithms. In this paper we compare the performance of two event detection methods: the short-term average/long-term average (STA/LTA) method, which is the most commonly used technique in industry, and a newly introduced method based on power spectral density (PSD) measurements. We applied both techniques to a 1-hr-long segment of the vertical component of raw continuous data recorded by a borehole geophone in a hydraulic fracturing experiment. The PSD technique outperforms the STA/LTA technique by detecting a higher number of weak events while keeping the number of false alarms at a reasonable level. The time-frequency representations obtained through the PSD method can also help define a more suitable bandpass filter, which is usually required for the STA/LTA method. The method thus offers much promise for automated event detection in industrial, local, regional and global seismological data sets.
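For readers unfamiliar with the industry-standard baseline, STA/LTA can be sketched directly: compute a characteristic function (squared amplitude is a common choice, assumed here), take the ratio of a short trailing-window average to a long preceding-window average, and trigger when the ratio crosses an "on" threshold. Window lengths and thresholds below are illustrative; in practice they are tuned to the sampling rate and noise level, which is precisely the weak-event sensitivity problem the PSD method addresses.

```python
def sta_lta(trace, nsta, nlta):
    """STA/LTA ratio on a squared-amplitude characteristic function.
    Returns one ratio per sample where both windows are fully populated."""
    cf = [x * x for x in trace]
    ratios = []
    for i in range(nlta, len(cf) - nsta + 1):
        sta = sum(cf[i:i + nsta]) / nsta       # short window starting at i
        lta = sum(cf[i - nlta:i]) / nlta       # long window preceding i
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def trigger_onsets(ratios, on=4.0):
    """Indices where the ratio first crosses the 'on' threshold."""
    return [i for i, r in enumerate(ratios)
            if r >= on and (i == 0 or ratios[i - 1] < on)]
```

A weak event barely raises the STA above the noise-dominated LTA, so the ratio stays under any safe threshold; that is the failure mode motivating the PSD comparison in this paper.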

  12. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current approaches to event detection utilize a variety of methods, including statistical, heuristic, machine learning, and optimization techniques. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators; unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The fusion of the individual indicator probabilities, a step left out of focus in many existing event detection models, is confirmed to be a crucial part of the system whose performance can be improved by modelling it with a discrete choice model. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies.

  13. Detecting cardiac events - state-of-the-art.

    PubMed

    Collinson, Paul

    2015-11-01

Cardiac biomarker measurement currently addresses two key questions in patient management: the differential diagnosis of chest pain and the differential diagnosis of the patient with breathlessness. There are currently three major themes in the strategies for the differential diagnosis of chest pain. The first is to undertake troponin measurement in patients selected to be at lower risk, hence to have a low prior probability of disease. The second is the introduction of high-sensitivity cardiac troponin (hs cTn) assays into routine clinical use with measurement at 0 and 3 h from admission. Two other approaches that utilize the diagnostic characteristics of these assays have also been suggested. The first is to use the limit of detection or limit of blank of the assay as the diagnostic discriminant. The second approach is to use the low imprecision of the assay within the reference interval and combine a discriminant value with an absolute rate of change (delta value). The third is the use of additional biomarkers to allow early discharge from the emergency department. The concept is to measure high-sensitivity cardiac troponin plus the extra marker on admission. The role of measurement of B-type natriuretic peptide or its N-terminal prohormone, N-terminal pro-B-type natriuretic peptide, has been accepted and incorporated into guidelines for chronic heart failure for some time. More recently, guidelines for acute heart failure have also come to recommend a single measurement of B-type natriuretic peptide or N-terminal pro-B-type natriuretic peptide in people presenting with new suspected acute heart failure.

  14. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection corresponding to approximately 1-10 copies of the rice haploid genome. In this quantitative PCR assay, the squared regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify five GM rice events.

  15. Exploiting semantics for sensor re-calibration in event detection systems

    NASA Astrophysics Data System (ADS)

    Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini

    2008-01-01

Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it is still a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames, which cannot be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas using an appliance as an example--coffee pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.
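The time-sequence semantic for the coffee-pot example can be made concrete: between refills, the level can only decrease, so small upward drifts in the detector's output are physically impossible and indicate a miscalibrated model. The sketch below is a hypothetical rendering of that idea (the tolerance, refill-jump cutoff, and violation budget are invented); the paper's actual trigger logic is not reproduced here.

```python
def needs_recalibration(levels, tolerance=0.05, max_violations=3):
    """Time-sequence semantic for a coffee-pot level detector: between refills
    the level can only decrease. A refill shows up as a large upward jump;
    small upward drifts violate the semantic and are counted as detector
    errors. Too many violations means the detection model needs relearning."""
    violations = 0
    for prev, cur in zip(levels, levels[1:]):
        rise = cur - prev
        if tolerance < rise < 0.5:   # upward drift too small to be a refill
            violations += 1
    return violations >= max_violations
```

The point of the paper is exactly this inversion: the semantic layer does not improve the vision algorithm directly, it tells the system *when* the vision model has drifted and must be relearned.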

  16. Novel Use of Matched Filtering for Synaptic Event Detection and Extraction

    PubMed Central

    Shi, Yulin; Nenadic, Zoran; Xu, Xiangmin

    2010-01-01

Efficient and dependable methods for detection and measurement of synaptic events are important for studies of synaptic physiology and neuronal circuit connectivity. Published detection algorithms based on amplitude thresholding and fixed or scaled template comparisons are of limited utility for signals with variable amplitudes and superimposed events with complex waveforms, and are therefore not applicable to detection of evoked synaptic events in photostimulation and similar experimental situations. Here we report a novel technique that combines a bank of approximate matched filters with detection and estimation theory to automatically detect and extract photostimulation-evoked excitatory postsynaptic currents (EPSCs) from individually recorded neurons in cortical circuit mapping experiments. The sensitivity and specificity of the method were evaluated on both simulated and experimental data, with performance comparable to that of visual event detection performed by human operators. This new technique was applied to quantify and compare the EPSCs obtained from excitatory pyramidal cells and fast-spiking interneurons. In addition, our technique has been further applied to the detection and analysis of inhibitory postsynaptic current (IPSC) responses. Given the general-purpose nature of our matched filtering and signal recognition algorithms, we expect that our technique can be appropriately modified and applied to detect and extract other types of electrophysiological and optical imaging signals. PMID:21124805

  17. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

Bt brinjal event EE-1 with the cry1Ac gene, expressing an insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional and real-time PCR methods to characterize event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying the cry1Ac transgene, Cauliflower Mosaic Virus (CaMV) 35S promoter, nopaline synthase (nos) terminator, aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, a construct-specific PCR, targeting an approximately 1.8 kb region of the inserted gene construct comprising the CaMV 35S promoter and cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The method performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and postrelease monitoring, but also address consumer concerns and the resolution of legal disputes.

  18. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    NASA Astrophysics Data System (ADS)

    Ziegler, Abra E.

The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data, and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is the consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per-station basis to be more or less sensitive so as to produce consistent agreement of detections in the sensor's neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was
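The neighborhood-consistency idea that drives AST can be sketched as a two-step loop: score each sensor by the fraction of its detections corroborated by a neighbor, then nudge its trigger threshold up (desensitize) when it fires alone too often. This is a hypothetical rendering of the metric described in the abstract, not Sandia's algorithm; the coincidence window, target score, and step size are invented for illustration.

```python
def neighborhood_consistency(detections, neighbors, window=2):
    """Fraction of each sensor's detections corroborated by at least one
    neighbor within +/- `window` samples. `detections` maps sensor -> sorted
    detection times; `neighbors` maps sensor -> list of neighbor sensors."""
    scores = {}
    for s, times in detections.items():
        if not times:
            scores[s] = 1.0
            continue
        hits = sum(1 for t in times
                   if any(abs(t - u) <= window
                          for n in neighbors[s] for u in detections.get(n, [])))
        scores[s] = hits / len(times)
    return scores

def adjust_threshold(threshold, score, target=0.8, step=0.1):
    """Reinforcement-style update: desensitize a sensor whose detections are
    rarely corroborated; sensitize one that is fully consistent."""
    if score < target:
        return threshold + step   # raise threshold -> fewer lone detections
    return max(step, threshold - step)
```

Iterating this per station is what lets the array stay sensitive in quiet intervals while automatically backing off on channels contaminated by electromagnetic noise.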

  19. A Decision Mixture Model-Based Method for Inshore Ship Detection Using High-Resolution Remote Sensing Images

    PubMed Central

    Bi, Fukun; Chen, Jing; Zhuang, Yin; Bian, Mingming; Zhang, Qingjun

    2017-01-01

With the rapid development of optical remote sensing satellites, ship detection and identification based on large-scale remote sensing images has become a significant maritime research topic. Compared with traditional ocean-going vessel detection, inshore ship detection has received increasing attention in harbor dynamic surveillance and maritime management. However, because the harbor environment is complex and the gray-level information and texture features of docked ships and their connected dock regions are indistinguishable, most popular detection methods are limited by their calculation efficiency and detection accuracy. In this paper, a novel hierarchical method that combines an efficient candidate scanning strategy and an accurate candidate identification mixture model is presented for inshore ship detection in complex harbor areas. First, in the candidate region extraction phase, an omnidirectional intersected two-dimension scanning (OITDS) strategy is designed to rapidly extract candidate regions from the land-water segmented images. In the candidate region identification phase, a decision mixture model (DMM) is proposed to identify real ships from candidate objects. Specifically, to improve robustness to the diversity of ships, a deformable part model (DPM) was employed to train a key part sub-model and a whole ship sub-model. Furthermore, to improve the identification accuracy, a surrounding correlation context sub-model is built. Finally, to increase the accuracy of candidate region identification, these three sub-models are integrated into the proposed DMM. Experiments were performed on numerous large-scale harbor remote sensing images, and the results showed that the proposed method has high detection accuracy and rapid computational efficiency. PMID:28640236

  20. A Decision Mixture Model-Based Method for Inshore Ship Detection Using High-Resolution Remote Sensing Images.

    PubMed

    Bi, Fukun; Chen, Jing; Zhuang, Yin; Bian, Mingming; Zhang, Qingjun

    2017-06-22

With the rapid development of optical remote sensing satellites, ship detection and identification based on large-scale remote sensing images has become a significant maritime research topic. Compared with traditional ocean-going vessel detection, inshore ship detection has received increasing attention in harbor dynamic surveillance and maritime management. However, because the harbor environment is complex and the gray-level information and texture features of docked ships and their connected dock regions are indistinguishable, most popular detection methods are limited by their calculation efficiency and detection accuracy. In this paper, a novel hierarchical method that combines an efficient candidate scanning strategy and an accurate candidate identification mixture model is presented for inshore ship detection in complex harbor areas. First, in the candidate region extraction phase, an omnidirectional intersected two-dimension scanning (OITDS) strategy is designed to rapidly extract candidate regions from the land-water segmented images. In the candidate region identification phase, a decision mixture model (DMM) is proposed to identify real ships from candidate objects. Specifically, to improve robustness to the diversity of ships, a deformable part model (DPM) was employed to train a key part sub-model and a whole ship sub-model. Furthermore, to improve the identification accuracy, a surrounding correlation context sub-model is built. Finally, to increase the accuracy of candidate region identification, these three sub-models are integrated into the proposed DMM. Experiments were performed on numerous large-scale harbor remote sensing images, and the results showed that the proposed method has high detection accuracy and rapid computational efficiency.

  1. Supervisory control design based on hybrid systems and fuzzy events detection. Application to an oxichlorination reactor.

    PubMed

    Altamiranda, Edmary; Torres, Horacio; Colina, Eliezer; Chacón, Edgar

    2002-10-01

    This paper presents a supervisory control scheme based on hybrid systems theory and fuzzy events detection. The fuzzy event detector is a linguistic model, which synthesizes complex relations between process variables and process events incorporating experts' knowledge about the process operation. This kind of detection allows the anticipation of appropriate control actions, which depend upon the selected membership functions used to characterize the process under scrutiny. The proposed supervisory control scheme was successfully implemented for an oxichlorination reactor in a vinyl monomer plant. This implementation has allowed improvement of reactor stability and reduction of raw material consumption.

  2. Real-time detection and classification of anomalous events in streaming data

    DOEpatents

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
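The patent describes a plurality of anomaly detectors that flag low-probability events without fixing a particular algorithm. A minimal frequency-based scorer (hypothetical, with Laplace smoothing) illustrates the core idea: an event type's anomalousness is the negative log of its probability under the counts seen so far in the stream, so rare traffic patterns score high and common ones score near zero.

```python
import math
from collections import Counter

class AnomalyScorer:
    """Streaming scorer: score an event by the negative log of its estimated
    probability under the event-type counts seen so far (Laplace-smoothed),
    then fold the event into the counts."""
    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def score(self, event):
        # +1 smoothing so never-seen event types get a finite, high score.
        p = (self.counts[event] + 1) / (self.total + len(self.counts) + 1)
        self.counts[event] += 1
        self.total += 1
        return -math.log(p)
```

In the patented system a score like this would feed the separate anomalous-vs-malicious classification stage; here it only demonstrates the "identify low probability events" step.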

  3. Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.

    PubMed

    Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng

    2016-12-08

This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control systems (NCSs) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve efficient utilization of the communication network bandwidth. Both the sensor-to-control station and the control station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous design of the FDF and controller.

  4. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data.

    PubMed

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2016-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems.
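The first step the abstract describes, converting a numeric time series into a time-interval sequence of temporal abstractions, can be sketched directly. The state names and thresholds below are illustrative (the paper's abstraction alphabet is richer, including trend abstractions); the key operation is merging consecutive samples with the same abstraction into one interval.

```python
def abstract_intervals(values, low, high):
    """Convert a numeric time series into a time-interval sequence of value
    abstractions: (state, start_index, end_index) tuples, with runs of equal
    consecutive states merged into a single interval."""
    def state(v):
        return "low" if v < low else "high" if v > high else "normal"
    intervals = []
    for i, v in enumerate(values):
        s = state(v)
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], i)   # extend current interval
        else:
            intervals.append((s, i, i))                # open a new interval
    return intervals
```

Pattern mining then operates on these intervals with temporal relations (e.g. "high glucose *before* low potassium"), growing candidate patterns backward in time as described above.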

  5. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800

  6. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    NASA Astrophysics Data System (ADS)

    Alese Semmens, Kathryn; Ramage, Joan; Bartsch, Annett; Liston, Glen E.

    2013-03-01

High latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter, preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite-detected events were validated with ground-station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and to weather station data. All melt events coincided with above-freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change.
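Passive-microwave melt detection of the kind used with AMSR-E is typically built on two signals: a high brightness temperature (Tb) and a large diurnal amplitude variation (DAV) between ascending and descending passes, since wet snow is radiometrically warm by day and refreezes at night. The sketch below uses illustrative thresholds; operational values are calibrated per sensor and region, and this paper's exact algorithm is not reproduced.

```python
def detect_melt_events(day_tb, night_tb, tb_thresh=246.0, dav_thresh=10.0):
    """Flag candidate melt days from twice-daily brightness temperatures (K):
    melt is declared when Tb is high AND the diurnal amplitude variation
    (DAV = |day pass - night pass|) is large. Thresholds are illustrative."""
    events = []
    for i, (d, n) in enumerate(zip(day_tb, night_tb)):
        if max(d, n) >= tb_thresh and abs(d - n) >= dav_thresh:
            events.append(i)
    return events
```

Requiring both conditions is what separates true daytime melt (warm and refreezing nightly) from persistently warm conditions or sensor noise.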

  7. A model-based information sharing protocol for profile Hidden Markov Models used for HIV-1 recombination detection.

    PubMed

    Bulla, Ingo; Schultz, Anne-Kathrin; Chesneau, Christophe; Mark, Tanya; Serea, Florin

    2014-06-19

    In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatical tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, i.e., HIV-1 subtypes for which only a small number of sequences is known. To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regards to the biological characteristics of HI viruses. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning.

  8. A model-based information sharing protocol for profile Hidden Markov Models used for HIV-1 recombination detection

    PubMed Central

    2014-01-01

    Background In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatical tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, i.e., HIV-1 subtypes for which only a small number of sequences is known. Results To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regards to the biological characteristics of HI viruses. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. Conclusions We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning. PMID:24946781

  9. Using machine learning to detect events in eye-tracking data.

    PubMed

    Zemblys, Raimondas; Niehorster, Diederick C; Komogortsev, Oleg; Holmqvist, Kenneth

    2017-02-23

Event detection is a challenging stage in eye movement data analysis. A major drawback of current event detection methods is that parameters have to be adjusted based on eye movement data quality. Here we show that a fully automated classification of raw gaze samples as belonging to fixations, saccades, or other oculomotor events can be achieved using a machine-learning approach. Any already manually or algorithmically detected events can be used to train a classifier to produce similar classification of other data without the need for a user to set parameters. In this study, we explore the application of the random forest machine-learning technique to the detection of fixations, saccades, and post-saccadic oscillations (PSOs). To show the practical utility of the proposed method in applications that employ eye movement classification algorithms, we provide an example in which the method is employed in an eye-movement-driven biometric application. We conclude that machine-learning techniques lead to superior detection compared to current state-of-the-art event detection algorithms and can reach the performance of manual coding.
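To see what the trained classifier replaces, the classical hand-tuned baseline can be sketched: the velocity-threshold (I-VT) algorithm labels each gaze sample a saccade if its point-to-point speed exceeds a threshold, otherwise a fixation. The threshold (in degrees per second here) is exactly the data-quality-dependent parameter the abstract criticizes; the machine-learning approach learns the decision from labeled samples instead.

```python
def classify_ivt(x, y, sample_rate, velocity_threshold=30.0):
    """Classical velocity-threshold (I-VT) baseline: label each gaze sample
    'saccade' if the point-to-point speed (position units * Hz) exceeds the
    threshold, else 'fixation'. The first sample defaults to 'fixation'."""
    labels = ["fixation"]
    for i in range(1, len(x)):
        dist = ((x[i] - x[i - 1]) ** 2 + (y[i] - y[i - 1]) ** 2) ** 0.5
        speed = dist * sample_rate
        labels.append("saccade" if speed > velocity_threshold else "fixation")
    return labels
```

Note that I-VT cannot represent PSOs at all, one reason the paper's multi-class random forest outperforms threshold-based algorithms.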

  10. Event Detection and Visualization of Ocean Eddies based on SSH and Velocity Field

    NASA Astrophysics Data System (ADS)

    Matsuoka, Daisuke; Araki, Fumiaki; Inoue, Yumi; Sasaki, Hideharu

    2016-04-01

    Numerical studies of ocean eddies have progressed using high-resolution ocean general circulation models. In order to understand ocean eddies from simulation results containing large volumes of information, it is necessary to visualize not only the distribution of eddies at each time step, but also events or phenomena involving eddies. However, previous methods cannot precisely detect eddies, especially during events such as amalgamation and bifurcation. In the present study, we propose a new approach to eddy detection, tracking, and event visualization based on sea surface height (SSH) and the velocity field. The proposed method detects eddy regions as well as stream and current regions, and classifies detected eddies into several types. By tracking the time-varying changes of classified eddies, it is possible to detect not only eddy events such as amalgamation and bifurcation but also interactions between eddies and ocean currents. By visualizing the detected eddies and events, we succeeded in creating a movie that enables us to intuitively understand the regions of interest.

  11. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in microelectromechanical systems technology, digital electronics, and wireless communications have enabled the development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  12. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    PubMed Central

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance. The formal modeling results demonstrated that adults differed significantly from the 7-year-olds and 10-year-olds on both the prospective component and the retrospective component of the task. The 7-year-olds and 10-year-olds differed only in the ability to recognize prospective memory target events. The prospective memory task imposed a cost to ongoing activities in all three age groups. PMID:20053020

  13. Development of a Physical Model-Based Algorithm for the Detection of Single-Nucleotide Substitutions by Using Tiling Microarrays

    PubMed Central

    Ono, Naoaki; Suzuki, Shingo; Furusawa, Chikara; Shimizu, Hiroshi; Yomo, Tetsuya

    2013-01-01

    High-density DNA microarrays are useful tools for analyzing sequence changes in DNA samples. Although microarray analysis provides informative signals from a large number of probes, the analysis and interpretation of these signals have certain inherent limitations, namely, the complex dependency of signals on probe sequences and the existence of false signals arising from non-specific binding between probe and target. In this study, we have developed a novel algorithm to detect single-base substitutions using microarray data based on a thermodynamic model of hybridization. We modified the thermodynamic model by introducing a penalty for mismatches that represents the effects of substitutions on hybridization affinity. This penalty results in significantly higher detection accuracy than other methods, indicating that the incorporation of hybridization free energy can improve the analysis of sequence variants using microarray data. PMID:23382915

  14. Field testing of component-level model-based fault detection methods for mixing boxes and VAV fan systems

    SciTech Connect

    Xu, Peng; Haves, Philip

    2002-05-16

    An automated fault detection and diagnosis tool for HVAC systems is being developed, based on an integrated, life-cycle approach to commissioning and performance monitoring. The tool uses component-level HVAC equipment models implemented in the SPARK equation-based simulation environment. The models are configured using design information and component manufacturers' data and then fine-tuned to match the actual performance of the equipment by using data measured during functional tests of the sort used in commissioning. This paper presents the results of field tests of mixing box and VAV fan system models in an experimental facility and a commercial office building. The models were found to be capable of representing the performance of correctly operating mixing box and VAV fan systems and detecting several types of incorrect operation.

  15. Model-based waveform design for optimal detection: A multi-objective approach to dealing with incomplete a priori knowledge.

    PubMed

    Hamschin, Brandon M; Loughlin, Patrick J

    2015-11-01

    This work considers the design of optimal, energy-constrained transmit signals for active sensing when the designer has incomplete or uncertain knowledge of the target and/or environment. The mathematical formulation is that of a multi-objective optimization problem, wherein one can incorporate a plurality of potential target, interference, or clutter models and in doing so take advantage of the wide range of results in the literature related to modeling each. It is shown, via simulation, that when the objective function of the optimization problem is chosen to maximize the minimum (i.e., maxmin) probability of detection among all possible model combinations, the optimal waveforms obtained are advantageous. The advantage arises because the maxmin waveforms judiciously allocate energy to spectral regions where each of the target models responds strongly and each of the environmental models causes minimal degradation of detection performance. In particular, improved detection performance is shown compared to linear frequency modulated transmit signals and to signals designed with the wrong target spectrum assumed. Additionally, it is shown that the maxmin design yields performance comparable to an optimal design matched to the correct target/environmental model. Finally, it is proven that the maxmin problem formulation is convex.
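
    The maxmin idea can be illustrated on a toy problem. The sketch below splits unit transmit energy between two spectral bins to maximize the worst-case detection score across candidate target models; a grid search stands in for the paper's convex optimization, and all names and numbers are hypothetical:

```python
def maxmin_allocation(models, steps=100):
    """Split unit energy between two bins to maximize the worst-case score."""
    best_e, best_score = 0.0, float("-inf")
    for s in range(steps + 1):
        e = s / steps                    # energy in bin 0; the rest in bin 1
        worst = min(e * t0 + (1 - e) * t1 for t0, t1 in models)
        if worst > best_score:
            best_e, best_score = e, worst
    return best_e, best_score

# Two candidate target models with opposite spectral responses.
models = [(0.9, 0.1), (0.2, 0.8)]
e0, score = maxmin_allocation(models)
```

    With these two opposed models, the maxmin solution hedges by splitting energy evenly, whereas a waveform matched to the wrong model would leave the other scenario with a score of only 0.1-0.2.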

  16. Optimizing Biosurveillance Systems that Use Threshold-based Event Detection Methods

    DTIC Science & Technology

    2009-06-01

    Ronald D. Fricker, Jr. and David Banschbach. We describe a methodology for optimizing a threshold detection-based biosurveillance system. The goal is to maximize the system-wide probability of... Using this approach, public health officials can "tune" their biosurveillance systems to optimally detect various threats, thereby allowing
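
    A minimal sketch of threshold tuning in this spirit (not the report's method): set each site's alarm threshold at the (1 - alpha) empirical quantile of its historical baseline counts, so the per-period false-alarm probability is roughly alpha. The data and function name are invented for illustration:

```python
def tune_threshold(baseline_counts, alpha=0.05):
    """Alarm threshold = (1 - alpha) empirical quantile of baseline counts."""
    ordered = sorted(baseline_counts)
    idx = min(len(ordered) - 1, int((1 - alpha) * len(ordered)))
    return ordered[idx]

# Daily syndromic counts at one site during a quiet baseline period.
history = [3, 5, 4, 6, 2, 5, 7, 4, 3, 5, 6, 4, 5, 3, 4, 6, 5, 4, 5, 5]
t = tune_threshold(history)
alarm = 12 > t   # a day with 12 cases would trigger an alarm
```

    Lowering alpha raises the threshold, trading sensitivity for fewer false alarms; the report's contribution is choosing these trade-offs jointly across sites to maximize system-wide detection probability.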

  17. A systematic review to evaluate the accuracy of electronic adverse drug event detection.

    PubMed

    Forster, Alan J; Jennings, Alison; Chow, Claire; Leeder, Ciera; van Walraven, Carl

    2012-01-01

    Adverse drug events (ADEs), defined as adverse patient outcomes caused by medications, are common and difficult to detect. Electronic detection of ADEs is a promising method to identify ADEs. We performed this systematic review to characterize established electronic detection systems and their accuracy. We identified studies evaluating electronic ADE detection from the MEDLINE and EMBASE databases. We included studies if they contained original data and involved detection of electronic triggers using information systems. We abstracted data regarding rule characteristics including type, accuracy, and rationale. Forty-eight studies met our inclusion criteria. Twenty-four (50%) studies reported rule accuracy but only 9 (18.8%) utilized a proper gold standard (chart review in all patients). Rule accuracy was variable and often poor (range of sensitivity: 40%-94%; specificity: 1.4%-89.8%; positive predictive value: 0.9%-64%). Five (10.4%) studies derived or used detection rules that were defined by clinical need or the underlying ADE prevalence. Detection rules in 8 (16.7%) studies detected specific types of ADEs. Several factors led to inaccurate ADE detection algorithms, including immature underlying information systems, non-standard event definitions, and variable methods for detection rule validation. Few ADE detection algorithms considered clinical priorities. To enhance the utility of electronic detection systems, there is a need to systematically address these factors.

  18. Cooperative object tracking and composite event detection with wireless embedded smart cameras.

    PubMed

    Wang, Youlu; Velipasalar, Senem; Casares, Mauricio

    2010-10-01

    Embedded smart cameras have limited processing power, memory, energy, and bandwidth. Thus, many system- and algorithm-wise challenges remain to be addressed to have operational, battery-powered wireless smart-camera networks. We present a wireless embedded smart-camera system for cooperative object tracking and detection of composite events spanning multiple camera views. Each camera is a CITRIC mote consisting of a camera board and wireless mote. Lightweight and robust foreground detection and tracking algorithms are implemented on the camera boards. Cameras exchange small-sized data wirelessly in a peer-to-peer manner. Instead of transferring or saving every frame or trajectory, events of interest are detected. Simpler events are combined in a time sequence to define semantically higher-level events. Event complexity can be increased by increasing the number of primitives and/or number of camera views they span. Examples of consistently tracking objects across different cameras, updating location of occluded/lost objects from other cameras, and detecting composite events spanning two or three camera views, are presented. All the processing is performed on camera boards. Operating current plots of smart cameras, obtained when performing different tasks, are also presented. Power consumption is analyzed based upon these measurements.

  19. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  20. A model-based fault-detection and prediction scheme for nonlinear multivariable discrete-time systems with asymptotic stability guarantees.

    PubMed

    Thumati, Balaje T; Jagannathan, S

    2010-03-01

    In this paper, a novel, unified model-based fault-detection and prediction (FDP) scheme is developed for nonlinear multiple-input-multiple-output (MIMO) discrete-time systems. The proposed scheme addresses both state and output faults by considering separate time profiles. The faults, which could be incipient or abrupt, are modeled using input and output signals of the system. The fault-detection (FD) scheme comprises an online approximator in discrete time (OLAD) with a robust adaptive term. An output residual is generated by comparing the FD estimator output with that of the measured system output. A fault is detected when this output residual exceeds a predefined threshold. Upon detecting the fault, the robust adaptive terms and the OLADs are initiated, wherein the OLAD approximates the unknown fault dynamics online while the robust adaptive terms help in ensuring asymptotic stability of the FD design. Using the OLAD outputs, a fault diagnosis scheme is introduced. A stable parameter update law is developed not only to tune the OLAD parameters but also to estimate the time to failure (TTF), which is considered as a first step for prognostics. The asymptotic stability of the FDP scheme enhances the detection and TTF accuracy. The effectiveness of the proposed approach is demonstrated using a fourth-order MIMO satellite system.
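
    The residual-thresholding step described above can be sketched as follows. Here a fixed nominal model stands in for the OLAD estimator, and all names and numbers are illustrative:

```python
def detect_fault(measured, predicted, threshold):
    """Return the index of the first sample whose output residual exceeds threshold, else None."""
    for k, (y, y_hat) in enumerate(zip(measured, predicted)):
        if abs(y - y_hat) > threshold:
            return k
    return None

nominal = [1.0, 1.0, 1.0, 1.0, 1.0]      # output of the (here: trivial) FD estimator
sensor = [1.02, 0.98, 1.01, 1.35, 1.40]  # measured output; abrupt fault from sample 3
fault_at = detect_fault(sensor, nominal, threshold=0.2)
```

    In the paper's scheme, crossing the threshold is what triggers the OLAD and robust adaptive terms to start learning the fault dynamics online.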

  1. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    SciTech Connect

    Grimm, Lars J.; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-03-15

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprised of bilateral mediolateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739, 95% Confidence Interval: 0.543–0.680, p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  2. Adaptive hidden Markov model-based online learning framework for bearing faulty detection and performance degradation monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Jianbo

    2017-01-01

    This study proposes an adaptive-learning-based method for machine fault detection and health degradation monitoring. The kernel of the proposed method is an "evolving" model that uses an unsupervised online learning scheme, in which an adaptive hidden Markov model (AHMM) is used for online learning of the dynamic health changes of machines over their full life. A statistical index is developed for recognizing new health states in the machines. Those new health states are then described online by adding new hidden states to the AHMM. Furthermore, health degradation in machines is quantified online by an AHMM-based health index (HI) that measures the similarity between two density distributions that describe the historic and current health states, respectively. When necessary, the proposed method characterizes the distinct operating modes of the machine and can learn online both abrupt and gradual health changes. Our method overcomes some drawbacks of HIs based on fixed monitoring models constructed in the offline phase (e.g., relatively low comprehensibility and applicability). Results from its application in a bearing life test reveal that the proposed method is effective in online detection and adaptive assessment of machine health degradation. This study provides a useful guide for developing a condition-based maintenance (CBM) system that uses an online learning method without considerable human intervention.
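
    As a hedged illustration of a distribution-similarity health index: the paper's HI is built on AHMM state densities, but the same idea can be shown with the Bhattacharyya coefficient between a discretized healthy baseline and the current feature distribution. The histograms and names below are invented:

```python
from math import sqrt

def health_index(p, q):
    """1.0 when the current distribution matches the healthy baseline; tends to 0 as they diverge."""
    return sum(sqrt(pi * qi) for pi, qi in zip(p, q))

baseline = [0.7, 0.2, 0.1]   # vibration-amplitude histogram when healthy
current = [0.1, 0.2, 0.7]    # mass shifted toward high amplitudes: degraded
hi = health_index(baseline, current)
```

    Tracking such an index over time, and re-estimating the "current" distribution online, is what lets the method quantify gradual degradation without a fixed offline model.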

  3. Position-gram - A Visual Method for Detecting Transient Events in Continuous GPS Time Series

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Wdowinski, S.

    2008-12-01

    Continuous Global Positioning System (CGPS) time series provide excellent observations for detecting crustal deformation at various length and time scales. With the increasing precision and length of the time series, new modes of deformation, such as slow slip events and sub-continental scale changes in crustal velocities, can be detected. However, non-tectonic surface movements and measurement noise limit our ability to detect and quantify tectonically induced transient deformation. Two common methods for reducing the noise level in CGPS time series, spatial filtering and periodic seasonal fitting, significantly improve the secular tectonic signal, but fail when transient deformation events are embedded in the time series. We developed a new visually based method for detecting transient events in CGPS time series. The development was inspired by wavelet analysis presentations that use color to present quantitative information about relationships between the time and frequency domains. Here we explore the relationship between the time and space domains. The displacement information is color coded according to a spline fit of each time series. This 3-D information (time, space, and displacement in color) allows easy detection of spatio-temporal patterns, which can serve as indicators of transient deformation events. We tested the new method with CGPS time series from three regions with different spatial scales: the Pacific Northwest, Southern California, and the entire continental US. The Pacific Northwest study confirmed that our proposed methodology is capable of detecting transient events and mapping their lateral distribution. The Southern California study detected a new transient event near the intersection of the San Andreas and San Jacinto faults, far from any known creeping fault segments. Finally, the continental-scale analysis revealed regionally correlated crustal movements in the Basin and Range and California that are uncorrelated with sites in the eastern US. Such signal

  4. Adverse event detection in drug development: recommendations and obligations beyond phase 3.

    PubMed

    Berlin, Jesse A; Glasser, Susan C; Ellenberg, Susan S

    2008-08-01

    Premarketing studies of drugs, although large enough to demonstrate efficacy and detect common adverse events, cannot reliably detect an increased incidence of rare adverse events or events with significant latency. For most drugs, only about 500 to 3000 participants are studied, for relatively short durations, before a drug is marketed. Systems for assessment of postmarketing adverse events include spontaneous reports, computerized claims or medical record databases, and formal postmarketing studies. We briefly review the strengths and limitations of each. Postmarketing surveillance is essential for developing a full understanding of the balance between benefits and adverse effects. More work is needed in analysis of data from spontaneous reports of adverse effects and automated databases, design of ad hoc studies, and design of economically feasible large randomized studies.

  5. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    PubMed Central

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening in the UK. This study

  6. Model-based linkage analysis with imprinting for quantitative traits: ignoring imprinting effects can severely jeopardize detection of linkage.

    PubMed

    Sung, Yun Ju; Rao, D C

    2008-07-01

    Genes with imprinting (parent-of-origin) effects are expressed differently depending on whether they are inherited from the mother or the father. Some genes for development and behavior in mammals are known to be imprinted. We developed parametric linkage analysis that accounts for imprinting effects for continuous traits, implementing it in MORGAN. To study the effect of misspecifying imprinting on linkage analysis, we simulated eight markers over a 35 cM region with phenotypes where imprinting contributes 0, 25, 50, and 75% of the variance of a quantitative trait locus (QTL) effect and analyzed them under all four models. Multipoint lod scores were computed and maximized over the same 35 cM region. Our most important finding is the dramatic lod score improvement under the correct imprinting model over the no-imprinting model. For data with minor QTL allele frequency 0.05, the correct model provided the highest lod scores, with maximum expected lod scores over 4 in all settings. Ignoring imprinting provided the lowest lod scores, with maximum expected lod scores between -9.9 and 2.4. In the extreme scenario, cases with max lod ≥ 3 from the correct imprinting model and max lod ≤ -2 from the no-imprinting model occurred in 86% of replications. Models with misspecified imprinting produced lod scores intermediate between those with correct imprinting and with no imprinting. The effects of model misspecification were less pronounced for singlepoint analysis. Our multipoint results illustrate that ignoring true imprinting severely impairs detection of linkage and erroneously excludes genomic regions (with max lod < -2), whereas accounting for it can substantially improve linkage detection. (c) 2008 Wiley-Liss, Inc.

  7. Generalized enrichment analysis improves the detection of adverse drug events from the biomedical literature.

    PubMed

    Winnenburg, Rainer; Shah, Nigam H

    2016-06-23

    Identification of associations between marketed drugs and adverse events from the biomedical literature assists drug safety monitoring efforts. Assessing the significance of such literature-derived associations and determining the granularity at which they should be captured remains a challenge. Here, we assess how defining a selection of adverse event terms from MeSH, based on information content, can improve the detection of adverse events for drugs and drug classes. We analyze a set of 105,354 candidate drug adverse event pairs extracted from article indexes in MEDLINE. First, we harmonize extracted adverse event terms by aggregating them into higher-level MeSH terms based on the terms' information content. Then, we determine statistical enrichment of adverse events associated with drug and drug classes using a conditional hypergeometric test that adjusts for dependencies among associated terms. We compare our results with methods based on disproportionality analysis (proportional reporting ratio, PRR) and quantify the improvement in signal detection with our generalized enrichment analysis (GEA) approach using a gold standard of drug-adverse event associations spanning 174 drugs and four events. For single drugs, the best GEA method (Precision: .92/Recall: .71/F1-measure: .80) outperforms the best PRR based method (.69/.69/.69) on all four adverse event outcomes in our gold standard. For drug classes, our GEA performs similarly (.85/.69/.74) when increasing the level of abstraction for adverse event terms. Finally, on examining the 1609 individual drugs in our MEDLINE set, which map to chemical substances in ATC, we find signals for 1379 drugs (10,122 unique adverse event associations) on applying GEA with p < 0.005. We present an approach based on generalized enrichment analysis that can be used to detect associations between drugs, drug classes and adverse events at a given level of granularity, at the same time correcting for known dependencies among
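
    The enrichment step can be sketched with a plain hypergeometric test (the paper uses a conditional variant that adjusts for dependencies among MeSH terms; the counts below are invented for illustration):

```python
from math import comb

def hypergeom_sf(k, M, n, N):
    """P(X >= k) for X ~ Hypergeometric(M population, n successes, N draws)."""
    return sum(comb(n, x) * comb(M - n, N - x)
               for x in range(k, min(n, N) + 1)) / comb(M, N)

# 10,000 indexed articles, 50 mention the adverse event overall; of the 40
# articles indexed with one drug, 8 also mention the event.
p = hypergeom_sf(8, 10_000, 50, 40)
```

    Under the null, the drug's articles would mention the event about 0.2 times on average, so observing 8 mentions yields a vanishingly small p-value and the pair would pass the paper's p < 0.005 signal threshold.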

  8. Face Recognition and Event Detection in Video: An Overview of PROVE-IT Projects

    DTIC Science & Technology

    2014-07-01

    Ronald D. Fricker, Jr. Contract Report DRDC-RDDC 2014-C, July 2014. PROVE-IT (FRiV) Pilot... technologies enabling extraction of information from video footage, with BIOM401 focused on face recognition and BTS402 focused on event detection in video... based on video information as events occur, or extract information and intelligence from the vast amounts of collected video footage. Overview of

  9. Model-based analysis supports interglacial refugia over long-dispersal events in the diversification of two South American cactus species.

    PubMed

    Perez, M F; Bonatelli, I A S; Moraes, E M; Carstens, B C

    2016-06-01

    Pilosocereus machrisii and P. aurisetus are cactus species within the P. aurisetus complex, a group of eight cacti that are restricted to rocky habitats within the Neotropical savannas of eastern South America. Previous studies have suggested that diversification within this complex was driven by distributional fragmentation, isolation leading to allopatric differentiation, and secondary contact among divergent lineages. These events have been associated with Quaternary climatic cycles, leading to the hypothesis that the xerophytic vegetation patches which presently harbor these populations operate as refugia during the current interglacial. However, owing to limitations of the standard phylogeographic approaches used in these studies, this hypothesis was not explicitly tested. Here we use Approximate Bayesian Computation to refine the previous inferences and test the role of different events in the diversification of two species within P. aurisetus group. We used molecular data from chloroplast DNA and simple sequence repeats loci of P. machrisii and P. aurisetus, the two species with broadest distribution in the complex, in order to test if the diversification in each species was driven mostly by vicariance or by long-dispersal events. We found that both species were affected primarily by vicariance, with a refuge model as the most likely scenario for P. aurisetus and a soft vicariance scenario most probable for P. machrisii. These results emphasize the importance of distributional fragmentation in these species, and add support to the hypothesis of long-term isolation in interglacial refugia previously proposed for the P. aurisetus species complex diversification.

  10. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data

  11. Detection of bubble nucleation event in superheated drop detector by the pressure sensor

    NASA Astrophysics Data System (ADS)

    Das, Mala; Biswas, Nilanjan

    2017-01-01

    Superheated drop detectors, consisting of drops of superheated liquid suspended in a polymer or gel matrix, are in great demand, mainly because of their insensitivity to β-particles and γ-rays and also because of their low cost. A bubble nucleation event is detected by measuring the acoustic shock wave released during the nucleation process. The present work demonstrates the detection of bubble nucleation events by using a pressure sensor. The associated measurement circuits are described in this article. The detection of events is verified by measuring the events with an acoustic sensor. The measurement was performed using drops of various sizes to study the effect of drop size on the pressure recovery time. The probability of detecting events increased for larger superheated drops and a smaller volume of air in contact with the gel matrix. An exponential decay fit to the pressure sensor signals shows the dead time for pressure recovery of such a drop detector to be a few microseconds.

  12. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
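The clustering-plus-threshold idea can be sketched in a few lines. This is a minimal illustration only, assuming a simple k-means clustering step and a distance-based outlier rule; the paper's actual algorithm and features are not specified here, and the data below are synthetic stand-ins.

```python
import numpy as np

def kmeans(X, k, iters=25, seed=0):
    """Minimal Lloyd's k-means: returns cluster centers fitted to X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def nearest_center_dist(X, centers):
    """Distance from each point to its nearest cluster center."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1)

# Toy "satellite" feature vectors: two normal regimes plus one far outlier.
rng = np.random.default_rng(1)
normal = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(10, 1, (100, 2))])
centers = kmeans(normal, k=2)

# Learn a threshold from the normal data; new points beyond it are flagged.
train_d = nearest_center_dist(normal, centers)
threshold = train_d.mean() + 3 * train_d.std()

new_points = np.array([[0.5, -0.2], [10.3, 9.8], [40.0, 40.0]])
flags = nearest_center_dist(new_points, centers) > threshold
```

In this sketch, points whose distance to every learned cluster exceeds the threshold are candidate outliers; grouping flagged points in space and time would be the next step toward "events" rather than isolated outliers.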

  13. Automated detection of apnea/hypopnea events in healthy children polysomnograms: preliminary results.

    PubMed

    Held, Claudio M; Causa, Leonardo; Jaillet, Fabrice; Chamorro, Rodrigo; Garrido, Marcelo; Algarin, Cecilia; Peirano, Patricio

    2013-01-01

    A methodology to detect sleep apnea/hypopnea events in the respiratory signals of polysomnographic recordings is presented. It applies empirical mode decomposition (EMD), Hilbert-Huang transform (HHT), fuzzy logic and signal preprocessing techniques for feature extraction, expert criteria and context analysis. EMD, HHT and fuzzy logic are used for artifact detection and preliminary detection of respiration signal zones with significant variations in the amplitude of the signal; feature extraction, expert criteria and context analysis are used to characterize and validate the respiratory events. An annotated database of 30 all-night polysomnographic recordings, acquired from 30 healthy ten-year-old children, was divided into a training set of 15 recordings (485 sleep apnea/hypopnea events), a validation set of five recordings (109 sleep apnea/hypopnea events), and a testing set of ten recordings (281 sleep apnea/hypopnea events). The overall detection performance on the testing data set was 89.7% sensitivity and 16.3% false-positive rate. The next step is to include discrimination among apneas, hypopneas and respiratory pauses.

  14. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    DOEpatents

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  15. Visual sensor based abnormal event detection with moving shadow removal in home healthcare applications.

    PubMed

    Lee, Young-Sook; Chung, Wan-Young

    2012-01-01

    Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking, moving cast shadows can be misclassified as parts of objects or as moving objects themselves, so shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Visual sensor-based abnormal event detection using shape feature variation and 3-D trajectories is presented to overcome the low fall detection rate. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward, backward, and sideways falls, from normal activities.

  16. A Model-Based Personalized Cancer Screening Strategy for Detecting Early-Stage Tumors Using Blood-Borne Biomarkers.

    PubMed

    Hori, Sharon Seiko; Lutz, Amelie M; Paulmurugan, Ramasamy; Gambhir, Sanjiv Sam

    2017-05-15

    An effective cancer blood biomarker screening strategy must distinguish aggressive from nonaggressive tumors at an early, intervenable time. However, for blood-based strategies to be useful, the quantity of biomarker shed into the blood and its relationship to tumor growth or progression must be validated. To study how blood biomarker levels correlate with early-stage viable tumor growth in a mouse model of human cancer, we monitored early tumor growth of engineered human ovarian cancer cells (A2780) implanted orthotopically into nude mice. Biomarker shedding was monitored by serial blood sampling, whereas tumor viability and volume were monitored by bioluminescence imaging and ultrasound imaging. From these metrics, we developed a mathematical model of cancer biomarker kinetics that accounts for biomarker shedding from tumor and healthy cells, biomarker entry into vasculature, biomarker elimination from plasma, and subject-specific tumor growth. We validated the model in a separate set of mice in which subject-specific tumor growth rates were accurately predicted. To illustrate clinical translation of this strategy, we allometrically scaled model parameters from mouse to human and used parameters for PSA shedding and prostate cancer. In this manner, we found that blood biomarker sampling data alone were capable of enabling the detection and discrimination of simulated aggressive (2-month tumor doubling time) and nonaggressive (18-month tumor doubling time) tumors as early as 7.2 months and 8.9 years before clinical imaging, respectively. Our model and screening strategy offer broad applicability to any solid cancer and its associated shed biomarkers, thereby allowing a distinction between aggressive and nonaggressive tumors using blood biomarker sampling data alone. Cancer Res; 77(10); 2570-84. ©2017 American Association for Cancer Research.

  17. Correlation Between Ceres' Water Vapor Detections and Energetic Solar Proton Events

    NASA Astrophysics Data System (ADS)

    Villarreal, Michaela N.; Russell, Christopher T.; Luhmann, Janet G.; Thompson, William T.; Prettyman, Thomas H.; A'Hearn, Michael; Kueppers, Michael; O'Rourke, Laurence

    2017-04-01

    Ceres was expected to be, and Dawn has confirmed, an ice-rich body. Prior to the Dawn mission, several attempts were made to detect exospheric water using terrestrial spacecraft around the closest approach between Earth and Ceres. These attempts show the exosphere to be time varying. While it has been proposed that sublimation controls the presence of the exosphere, there is no correlation between Ceres' heliocentric distance and the positive detections or the magnitude of the signal. Recently, Dawn twice indirectly sensed the presence of an exosphere through energetic electrons reflected at Ceres' bow shock, a shock created through the exosphere's interaction with the solar wind. Both events were preceded by large solar proton events. This is important because water ice can be sputtered by these very energetic protons, whose flux is highly variable. A solar proton event could produce a transient atmosphere that would last on the order of a week before disappearing. We analyze the correlation between the observed production rates and the energetic proton flux preceding each observation using space-based measurements near 1 AU. We conclude that solar proton events occurred in conjunction with positive detections and were absent during negative detections. Since Dawn has seen the same correlation and has not detected evidence for active plumes, optically or thermally, we conclude that the variability of solar energetic protons explains the transient behavior of Ceres' water exosphere.

  18. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We prove the efficiency of a new method for the detection of crucial events that might have useful applications to the war against terrorism. This has to do with the search for rare but significant events, a theme of research that has been made of extreme importance by the tragedy of September 11. This method is applied here to defining the statistics of seismic main-shocks, as done in cond-mat/0212529. The emphasis here is more on the conceptual issues behind the results obtained in cond-mat/0212529 than on geophysics. This discussion suggests that the method has a wider range of validity. We support this general discussion with a dynamic model originally proposed in cond-mat/0107597 for purposes different from geophysical applications. However, it is a case where the crucial events to detect are under our control, thereby making it possible for us to check the accuracy of the method of detection of invisible and crucial events that we propose here for a general purpose, including the war against terrorism. For this model an analytical treatment has been recently found [cond-mat/0209038], supporting the claims that we make in this paper for the accuracy of the method of detection. For the reader's convenience, the results on the seismic fluctuations are suitably reviewed, and discussed in the light of the more general perspective of this paper. We also review the model for seismic fluctuations, proposed in the earlier work of cond-mat/0212529. This model shares with the model of cond-mat/0107597 the property that the crucial events are embedded in a sea of secondary events, but it allows us to reveal with accuracy the statistics of the crucial events for different mathematical reasons.

  19. Using Atmospheric Circulation Patterns to Detect and Attribute Changes in the Risk of Extreme Climate Events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Horton, D. E.; Singh, D.; Swain, D. L.; Touma, D. E.; Mankin, J. S.

    2015-12-01

    Because of the high cost of extreme events and the growing evidence that global warming is likely to alter the statistical distribution of climate variables, detection and attribution of changes in the probability of extreme climate events has become a pressing topic for the scientific community, elected officials, and the public. While most of the emphasis has thus far focused on analyzing the climate variable of interest (most often temperature or precipitation, but also flooding and drought), there is an emerging emphasis on applying detection and attribution analysis techniques to the underlying physical causes of individual extreme events. This approach is promising in part because the underlying physical causes (such as atmospheric circulation patterns) can in some cases be more accurately represented in climate models than the more proximal climate variable (such as precipitation). In addition, and more scientifically critical, is the fact that the most extreme events result from a rare combination of interacting causes, often referred to as "ingredients". Rare events will therefore always have a strong influence of "natural" variability. Analyzing the underlying physical mechanisms can therefore help to test whether there have been changes in the probability of the constituent conditions of an individual event, or whether the co-occurrence of causal conditions cannot be distinguished from random chance. This presentation will review approaches to applying detection/attribution analysis to the underlying physical causes of extreme events (including both "thermodynamic" and "dynamic" causes), and provide a number of case studies, including the role of frequency of atmospheric circulation patterns in the probability of hot, cold, wet and dry events.

  20. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance (MED) and linear prediction filter (LPF) methods are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
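A Euclidean-distance detector of the kind compared above can be illustrated in a few lines. This is a minimal sketch, not the authors' implementation: the water-quality parameter names, baseline statistics, and threshold rule below are all assumptions for illustration.

```python
import numpy as np

def med_alarm(obs, baseline_mean, baseline_std, k=5.0):
    """Flag an observation whose Euclidean distance from the baseline mean
    (per-parameter standardized) exceeds k."""
    z = (obs - baseline_mean) / baseline_std
    return np.linalg.norm(z) > k

# Hypothetical clean baseline: chlorine, turbidity, conductivity readings.
rng = np.random.default_rng(0)
clean = rng.normal([1.0, 0.3, 500.0], [0.05, 0.02, 10.0], size=(500, 3))
mu, sd = clean.mean(axis=0), clean.std(axis=0)

normal_reading = np.array([1.02, 0.31, 503.0])
spike_reading = np.array([0.40, 1.50, 560.0])  # sudden contamination-like spike

alarms = [med_alarm(x, mu, sd) for x in (normal_reading, spike_reading)]
```

As the abstract notes, such a detector responds well to sudden spike-like deviations, which is exactly why it can miss real incidents whose signatures drift gradually rather than jump.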

  1. CASH: a constructing comprehensive splice site method for detecting alternative splicing events.

    PubMed

    Wu, Wenwu; Zong, Jie; Wei, Ning; Cheng, Jian; Zhou, Xuexia; Cheng, Yuanming; Chen, Dai; Guo, Qinghua; Zhang, Bo; Feng, Ying

    2017-04-06

    RNA-sequencing (RNA-seq) can generate millions of reads to provide clues for analyzing novel or abnormal alternative splicing (AS) events in cells. However, current methods for exploring AS events are still far from satisfactory. Here, we present Comprehensive AS Hunting (CASH), which constructs comprehensive splice sites, including known and novel AS sites in cells, and identifies differential AS events between cells. We demonstrated the versatility of CASH on RNA-seq data from a wide range of species, as well as on simulated in silico data, validated the advantages of CASH over other AS predictors, and exhibited novel differential AS events. Moreover, we used CASH to identify SRSF10-regulated AS events and investigated the evolution of SRSF10-regulated splicing. The results showed that SRSF10-regulated splicing events are highly evolvable from chickens and mice to humans. However, the SRSF10-regulated splicing model was observed to be immutable: SRSF10 binding to the cassette exon promotes exon inclusion, while binding to the downstream exon induces exon skipping. Altogether, CASH can significantly improve the detection of AS events and facilitate the study of AS regulation and function in cells; the SRSF10 data are the first to demonstrate flexibility in SRSF10-regulated splicing events but immutability of the SRSF10-regulated splicing model, producing opposite AS outcomes in vertebrates. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Detection of and response to mid-ocean ridge magmatic events: Implications for the subsurface biosphere

    NASA Astrophysics Data System (ADS)

    Cowen, James P.; Baker, Edward T.; Embley, Robert W.

    Magmatic events are unpredictable dynamic processes that are integral to the evolution of mid-ocean ridges. Dikes and lava flows develop rapidly and instantly alter the local hydrothermal flow regime, initiating dramatic changes in hydrothermal discharge at the seafloor, and triggering geochemical and microbiological changes within the shallow crust, at the seafloor and within the overlying water column. Despite considerable logistical difficulties, real-time remote detection capabilities (SOSUS) along limited regions of the MOR system have allowed investigators to rapidly respond to significant seismic events. There have been more than 20 documented examples of seafloor volcanic/tectonic events, at both isolated volcanoes and mid-ocean ridges, but only a few of these have led to significant response efforts. The most rapid and thorough response efforts have been to the 1991 9° N EPR event and several events (1986, 1993, 1996, 1998, 2001) on the Juan de Fuca and Gorda Ridges. Together, these "SOSUS-directed" responses plus the few serendipitous encounters have led to important discoveries (e.g., event plumes; `snow-blower' vents) and provided basic new constraints on presently immature models of submarine magmatic-hydrothermal systems (e.g., intrusive/extrusive diking; event plume formation; subsurface hydrothermal communities). The event response community has gained valuable experience in learning how to exploit these opportunities for scientific observation and is currently poised to continue such studies with increased speed and efficiency. However, our understanding of these geophysical, chemical and biological processes is only in its infancy.

  3. Developing assessment system for wireless capsule endoscopy videos based on event detection

    NASA Astrophysics Data System (ADS)

    Chen, Ying-ju; Yasen, Wisam; Lee, Jeongkyu; Lee, Dongha; Kim, Yongho

    2009-02-01

    Along with the advance of wireless and miniature camera technology, Wireless Capsule Endoscopy (WCE), the combination of both, enables a physician to examine a patient's digestive system without performing a surgical procedure. Although WCE is a technical breakthrough that allows physicians to visualize the entire small bowel noninvasively, viewing the video takes 1-2 hours. This is very time consuming for the gastroenterologist: not only does it limit the wide application of this technology, it also incurs considerable cost. It is therefore important to automate the process so that medical clinicians can focus only on events of interest. As an extension of our previous work characterizing the motility of the digestive tract in WCE videos, we propose a new assessment system for energy based event detection (EG-EBD) to classify the events in WCE videos. For the system, we first extract general features of a WCE video that can characterize the intestinal contractions in digestive organs. Then, the event boundaries are identified using a High Frequency Content (HFC) function, and the segments are classified into WCE events using special features. In this system, we focus on entering the duodenum, entering the cecum, and active bleeding. The assessment system can be easily extended to discover more WCE events, such as detailed organ segmentation and more diseases, by introducing new special features. In addition, the system provides a score for every WCE image for each event. Using the event scores, the system helps a specialist speed up the diagnosis process.

  4. No nitrate spikes detectable in several polar ice cores following the largest known solar events

    NASA Astrophysics Data System (ADS)

    Mekhaldi, Florian; McConnell, Joseph R.; Adolphi, Florian; Arienzo, Monica; Chellman, Nathan J.; Maselli, Olivia; Sigl, Michael; Muscheler, Raimund

    2017-04-01

    Solar energetic particle (SEP) events are a genuine and recognized threat to our modern society, which increasingly relies on satellites and technological infrastructure. However, knowledge of the frequency and of the upper limit of the intensity of major solar storms is largely limited by the relatively short period of direct observation. In an effort to extend the observation period, and because atmospheric ionization induced by solar particles can lead to the production of odd nitrogen, spikes in the nitrate content of ice cores have been tentatively used to reconstruct both the occurrence and intensity of past SEP events. Yet the reliability of nitrate as such a proxy has long been debated. This is partly due to differing chemistry-climate model outputs, equivocal detection of nitrate spikes in single ice cores for single events, and possible alternative sources to explain nitrate spikes in ice cores. Here we present nitrate measurements from several Antarctic and Greenland ice cores for time periods covering the largest known solar events. More specifically, we use new highly-resolved nitrate and biomass burning proxy species data (e.g. black carbon) from continuous flow analysis following the largest known solar events from the paleo record - the SEP events of 775 and 994 AD. We also consider the historical Carrington event of 1859 as well as contemporary events from the past 60 years which were observed by satellites. In doing so, we show that (i) there are no reproducible nitrate spikes in Greenland and Antarctic ice cores following any of these major events and that (ii) most nitrate spikes found in ice cores are related to biomass burning plumes. Our analysis thus suggests that ice-core nitrate data are not a reliable proxy for atmospheric ionization by SEP events. In light of our results, we advocate that nitrate spikes so far identified from single ice cores should not be used to assess the intensity and occurrence rate of extreme solar events.

  5. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    DTIC Science & Technology

    2016-04-25

    as the attack hides in plain sight. In this thesis, we present USBeSafe as a first-of-its-kind machine learning-based anomaly detection framework... learning techniques, specifically one-class support vector machines, to create an offline USB event anomaly detection system that serves as the basis...
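The core technique named in this record, one-class SVM novelty detection, can be sketched with scikit-learn. This is a minimal illustration, not the thesis's system: the synthetic feature vectors below stand in for real USB event features, whose extraction is not shown.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Synthetic stand-ins for benign USB event features (e.g. inter-arrival
# times, transfer-type counts); real feature extraction is not shown.
benign = rng.normal(0.0, 1.0, size=(300, 4))

# Train only on benign traffic; nu bounds the fraction of training
# points treated as outliers.
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(benign)

new_events = np.vstack([rng.normal(0.0, 1.0, size=(5, 4)),  # benign-like
                        np.full((1, 4), 8.0)])              # far outside training support
pred = clf.predict(new_events)  # +1 = normal, -1 = anomaly
```

Training on benign traffic only is what lets the detector flag a masquerading device: an attack that mimics a keyboard still produces event statistics outside the learned region of "normal" behavior.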

  6. Improving Magnitude Detection Thresholds Using Multi-Station Multi-Event, and Multi-Phase Methods

    DTIC Science & Technology

    2008-07-31

    applied to different tectonic settings and for what percentage of the seismicity. 111 million correlations were performed on Lg-waves for the events in... a significant detection spike. Figure 24 shows an example of an aftershock (spike at 2400 samples) detected after a mainshock (spike at 1500... false alarms in 36 days for an SNR of 0.32. The significant result of this study is that a correlation detector has more than an order of magnitude

  7. Solar Demon: near real-time solar eruptive event detection on SDO/AIA images

    NASA Astrophysics Data System (ADS)

    Kraaikamp, Emil; Verbeeck, Cis

    Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings and EUV waves, as well as solar flares, in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.

  8. Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.

    2014-12-01

    The well-studied Wenchuan aftershock sequence triggered by the May 12, 2008, Ms 8.0 mainshock offers an ideal test case for evaluating the effectiveness of waveform correlation coupled with double difference relocation for detecting and locating events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then take the detections and relocate them using the double difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one with waveform correlation to find even more events. We benchmark our results against previously published relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events with precision at the few-hundred-meter and even tens-of-meters level.

  9. Effect of parameters in moving average method for event detection enhancement using phase sensitive OTDR

    NASA Astrophysics Data System (ADS)

    Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum

    2017-04-01

    We analyze the relations among the parameters of the moving average method to enhance the event detectability of a phase sensitive optical time domain reflectometer (OTDR). If external events have a unique vibration frequency, the control parameters of the moving average method should be optimized to detect these events efficiently. A phase sensitive OTDR was implemented with a pulsed light source composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier, and a fiber Bragg grating filter, together with a light-receiving part comprising a photo-detector and a high-speed data acquisition system. The moving average method operates with three control parameters: the total number of raw traces, M; the number of averaged traces, N; and the step size of the moving window, n. The raw traces were obtained with the phase sensitive OTDR while sound signals were generated by a speaker. Using these trace data, the relations among the control parameters were analyzed. The results show that, if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
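The roles of M, N, and n can be made concrete with a small sketch: from M raw traces, each output trace averages N consecutive raw traces, and the window advances by n traces per step. This is a minimal numpy illustration under those assumptions, not the authors' implementation, and the trace data here are pure noise.

```python
import numpy as np

def moving_average_traces(traces, N, n):
    """traces: (M, L) array of M raw phase-OTDR traces of L samples each.
    Averages N consecutive traces, stepping the window by n traces."""
    M = traces.shape[0]
    starts = range(0, M - N + 1, n)
    return np.stack([traces[i:i + N].mean(axis=0) for i in starts])

# M = 100 raw traces of 64 samples each (noise only, for illustration).
rng = np.random.default_rng(0)
raw = rng.normal(size=(100, 64))
avg = moving_average_traces(raw, N=10, n=5)
```

Averaging N traces suppresses uncorrelated noise by roughly a factor of sqrt(N), but it also low-pass filters the vibration signal along the trace index, which is why N and n must be tuned to the event's vibration frequency as the abstract describes.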

  10. Detecting Continuity Violations in Infancy: A New Account and New Evidence from Covering and Tube Events

    ERIC Educational Resources Information Center

    Wang, S.h.; Baillargeon, R.; Paterson, S.

    2005-01-01

    Recent research on infants' responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age e.g. Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants' reasoning about when objects should and should not be occluded. Cognitive Psychology 39, 116-157; Hespos, S.…

  11. A Blind Segmentation Approach to Acoustic Event Detection Based on I Vector

    DTIC Science & Technology

    2013-08-25


  13. Real-Time Event Detection for Monitoring Natural and Source Waterways - Sacramento, CA

    EPA Science Inventory

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitori...

  14. Detection and identification of multiple genetically modified events using DNA insert fingerprinting.

    PubMed

    Raymond, Philippe; Gendron, Louis; Khalf, Moustafa; Paul, Sylvianne; Dibley, Kim L; Bhat, Somanath; Xie, Vicki R D; Partis, Lina; Moreau, Marie-Eve; Dollard, Cheryl; Coté, Marie-José; Laberge, Serge; Emslie, Kerry R

    2010-03-01

    Current screening and event-specific polymerase chain reaction (PCR) assays for the detection and identification of genetically modified organisms (GMOs) in samples of unknown composition or for the detection of non-regulated GMOs have limitations, and alternative approaches are required. A transgenic DNA fingerprinting methodology using restriction enzyme digestion, adaptor ligation, and nested PCR was developed where individual GMOs are distinguished by the characteristic fingerprint pattern of the fragments generated. The inter-laboratory reproducibility of the amplified fragment sizes using different capillary electrophoresis platforms was compared, and reproducible patterns were obtained with an average difference in fragment size of 2.4 bp. DNA insert fingerprints for 12 different maize events, including two maize hybrids and one soy event, were generated that reflected the composition of the transgenic DNA constructs. Once produced, the fingerprint profiles were added to a database which can be readily exchanged and shared between laboratories. This approach should facilitate the process of GMO identification and characterization.

  15. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (the unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.
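The predictive state-assignment step, modeling each event state as a multidimensional Gaussian and assigning a new iEEG feature vector to the most likely state, can be sketched as below. This is a simplified illustration with synthetic features; the paper's Bayesian nonparametric discovery of the states themselves is not reproduced.

```python
import numpy as np

def fit_state(X):
    """Fit a multivariate Gaussian (mean, regularized covariance) to the
    feature vectors X belonging to one event state."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return mu, cov

def log_likelihood(x, mu, cov):
    """Log-density of x under the Gaussian (mu, cov)."""
    d = len(mu)
    diff = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

def assign_state(x, states):
    """Index of the state whose Gaussian gives x the highest likelihood."""
    return int(np.argmax([log_likelihood(x, mu, cov) for mu, cov in states]))

# Two toy states: "interictal" features near 0, "seizure-onset" features near 5.
rng = np.random.default_rng(0)
states = [fit_state(rng.normal(0.0, 1.0, (200, 3))),
          fit_state(rng.normal(5.0, 1.0, (200, 3)))]

label = assign_state(np.array([4.8, 5.2, 5.1]), states)
```

Only transitions into states fitted on seizure-onset data would trigger stimulation; feature vectors that match an interictal-burst state are absorbed by that state's Gaussian instead of raising a false positive.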

  16. Adverse drug reactions – examples of detection of rare events using databases

    PubMed Central

    Chan, Esther W; Liu, Kirin Q L; Chui, Celine S L; Sing, Chor-Wing; Wong, Lisa Y L; Wong, Ian C K

    2015-01-01

    It is recognised that randomised controlled trials are not feasible for capturing rare adverse events. There is an increasing trend towards observational research methodologies using large population-based health databases. These databases offer more scope for adequate sample sizes, allowing for comprehensive patient characterisation and assessment of the associated factors. While direct causality cannot be established and confounders cannot be ignored, databases present an opportunity to explore and quantify rare events. The use of databases for the detection of rare adverse events in the following conditions, sudden death associated with attention deficit hyperactivity disorder (ADHD) treatment, retinal detachment associated with the use of fluoroquinolones and toxic epidermal necrolysis associated with drug exposure, are discussed as examples. In general, rare adverse events tend to have immediate and important clinical implications and may be life-threatening. An understanding of the causative factors is therefore important, in addition to the research methodologies and database platforms that enable the undertaking of the research. PMID:25060360

  17. Low time resolution analysis of polar ice cores cannot detect impulsive nitrate events

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; Melott, A. L.; Laird, C. M.

    2014-12-01

    Ice cores are archives of climate change and possibly large solar proton events (SPEs). Wolff et al. (2012) used a single event, a nitrate peak in the GISP2-H core, which McCracken et al. (2001a) temporally associated with the poorly quantified 1859 Carrington event, to discredit SPE-produced, impulsive nitrate deposition in polar ice. This is not the ideal test case. We critique the Wolff et al. analysis and demonstrate that the data they used cannot detect impulsive nitrate events because of resolution limitations. We suggest reexamination of the top of the Greenland ice sheet at key intervals over the last two millennia, with attention to fine resolution and replicate sampling of multiple species. This will allow further insight into polar depositional processes on a subseasonal scale, including atmospheric sources, transport mechanisms to the ice sheet, postdepositional interactions, and a potential SPE association.

  18. Micro seismic event detection based on neural networks in the Groningen area, The Netherlands

    NASA Astrophysics Data System (ADS)

    Paap, Bob; van Maanen, Peter-Paul; Carpentier, Stefan; Meekes, Sjef

    2017-04-01

    Over the past decades, the Groningen gas field has increasingly been affected by induced earthquakes resulting from gas production. The seismic monitoring network at Groningen has been densified in order to acquire more accurate information regarding the onset and origin of seismic events, resulting in increasing amounts of seismic data. Although traditional automated event detection techniques are generally successful in detecting events from continuous data, their performance degrades at lower signal-to-noise ratios, and the seismologists needed to review detections are often in limited supply. Besides the recent expansion of the Groningen seismic network, additional new seismic networks have been deployed at several geothermal and CO2 storage fields. The data stream coming from these networks has sparked specific interest in neural networks for automated classification and interpretation. Here we explore the feasibility of neural networks in classifying the occurrence of seismic events. For this purpose a three-layered feedforward neural network was trained using public data related to a seismic event in the Groningen gas field obtained from the Royal Netherlands Meteorological Institute (KNMI) data portal. The first arrival times determined by KNMI for a subset of the station data were used to determine the arrival times for the other station data. Different derivatives, using different frequency sub-band and STA/LTA settings, were used as input. Based on these data, the network's parameters were then optimized to predict arrival times accurately. Although this study is still ongoing, we anticipate that our approach can significantly increase performance compared with detection methods usually applied to the Groningen gas field. This will clear the way for future real-time microseismic event classification.
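
    One of the input derivatives mentioned above, the STA/LTA ratio, can be sketched as follows. The window lengths and the toy trace are illustrative assumptions, not the settings used for the Groningen data.

```python
def sta_lta(signal, nsta, nlta):
    """Short-term over long-term average of absolute amplitude.

    Returns one ratio per sample once both windows are full; earlier
    samples get a ratio of 0.0.
    """
    out = [0.0] * len(signal)
    for i in range(nlta - 1, len(signal)):
        sta = sum(abs(s) for s in signal[i - nsta + 1 : i + 1]) / nsta
        lta = sum(abs(s) for s in signal[i - nlta + 1 : i + 1]) / nlta
        out[i] = sta / lta if lta > 0 else 0.0
    return out

# Quiet background followed by a burst: the ratio rises sharply at the onset.
trace = [0.1] * 40 + [2.0] * 5 + [0.1] * 5
ratios = sta_lta(trace, nsta=3, nlta=20)
print(max(ratios) > 3.0)  # → True
```

    Computed over several frequency sub-bands, such ratios form a compact feature vector that a small feedforward network can use to classify event onsets.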

  19. Post-market surveillance to detect adverse events associated with Melody® valve implantation.

    PubMed

    Hill, Kevin D; Goldstein, Bryan H; Angtuaco, Michael J; Chu, Patricia Y; Fleming, Gregory A

    2017-08-01

    The aim of this study was to describe previously unrecognised or under-recognised adverse events associated with Melody® valve implantation. In rare diseases and conditions, it is typically not feasible to conduct large-scale safety trials before drug or device approval. Therefore, post-market surveillance mechanisms are necessary to detect rare but potentially serious adverse events. We reviewed the United States Food and Drug Administration's Manufacturer and User Facility Device Experience (MAUDE) database and conducted a structured literature review to evaluate adverse events associated with on- and off-label Melody® valve implantation. Adverse events were compared with those described in the prospective Investigational Device Exemption and Post-Market Approval Melody® transcatheter pulmonary valve trials. We identified 631 adverse events associated with "on-label" Melody® valve implants and 84 adverse events associated with "off-label" implants. The most frequent "on-label" adverse events were similar to those described in the prospective trials including stent fracture (n=210) and endocarditis (n=104). Previously unrecognised or under-recognised adverse events included stent fragment embolisation (n=5), device erosion (n=4), immediate post-implant severe valvar insufficiency (n=2), and late coronary compression (n=2 cases at 5 days and 3 months after implantation). Under-recognised adverse events associated with off-label implantation included early valve failure due to insufficiency when implanted in the tricuspid position (n=7) and embolisation with percutaneous implantation in the mitral position (n=5). Post-market passive surveillance does not demonstrate a high frequency of previously unrecognised serious adverse events with "on-label" Melody® valve implantation. Further study is needed to evaluate safety of "off-label" uses.

  20. Pre-trained D-CNN models for detecting complex events in unconstrained videos

    NASA Astrophysics Data System (ADS)

    Robinson, Joseph P.; Fu, Yun

    2016-05-01

    Rapid event detection faces an urgent need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by pre-existing methods being complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective, and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis of different Convolutional Neural Networks (CNNs), as stand-alone and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architectures, training data, and number of outputs. For each CNN, we sample frames at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some combinations to be complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
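
    The best-performing pooling scheme (max-pool frame features within shots, then average-pool across shots) can be sketched as follows. The `video_descriptor` helper and the toy frame features are hypothetical stand-ins for per-frame CNN outputs.

```python
def video_descriptor(frame_feats, shot_bounds):
    """Build a video-level descriptor from per-frame feature vectors.

    frame_feats: list of equal-length feature vectors, one per frame.
    shot_bounds: list of (start, end) frame-index pairs, end exclusive.
    """
    shots = []
    for start, end in shot_bounds:
        cols = zip(*frame_feats[start:end])
        shots.append([max(c) for c in cols])  # max-pool within the shot
    cols = zip(*shots)
    return [sum(c) / len(shots) for c in cols]  # average-pool across shots

feats = [[1.0, 0.0], [3.0, 1.0], [0.0, 5.0], [2.0, 2.0]]
# Two shots: frames 0-1 and frames 2-3.
print(video_descriptor(feats, [(0, 2), (2, 4)]))  # → [2.5, 3.0]
```

    The resulting fixed-length descriptor can then be fed to the one-vs-rest SVMs regardless of video length or shot count.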

  1. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation.
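
    The core of matched filtering, normalized cross-correlation of a parent template against continuous data, can be sketched for a single channel as follows. The published method stacks correlations over many channels and components, so this is only an illustrative single-channel reduction with a hypothetical threshold.

```python
import math

def normalized_xcorr(data, template):
    """Normalized cross-correlation of a template slid over continuous data."""
    n = len(template)
    t_mean = sum(template) / n
    t = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t))
    out = []
    for i in range(len(data) - n + 1):
        seg = data[i : i + n]
        s_mean = sum(seg) / n
        s = [x - s_mean for x in seg]
        s_norm = math.sqrt(sum(x * x for x in s))
        num = sum(a * b for a, b in zip(s, t))
        out.append(num / (s_norm * t_norm) if s_norm * t_norm > 0 else 0.0)
    return out

def detect(data, template, threshold=0.8):
    """Indices where the correlation coefficient exceeds the threshold."""
    cc = normalized_xcorr(data, template)
    return [i for i, c in enumerate(cc) if c >= threshold]

template = [0.0, 1.0, -1.0, 0.5]
data = [0.0] * 10 + [0.1, 1.1, -0.9, 0.6] + [0.0] * 10
print(detect(data, template))  # → [10]
```

    A 'child' event is declared where the (stacked) correlation peaks; its relative location and magnitude are then derived from the parent, as described above.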

  2. Feature selection of seismic waveforms for long period event detection at Cotopaxi Volcano

    NASA Astrophysics Data System (ADS)

    Lara-Cueva, R. A.; Benítez, D. S.; Carrera, E. V.; Ruiz, M.; Rojo-Álvarez, J. L.

    2016-04-01

    Volcano Early Warning Systems (VEWS) have become a research topic aimed at saving human lives and preventing material losses. In this setting, event detection criteria based on classification using machine learning techniques have proven useful, and a number of systems have been proposed in the literature. However, to the best of our knowledge, no comprehensive and principled study has been conducted to compare the influence of the many different sets of possible features that have been used as input spaces in previous works. We present an automatic recognition system for volcano seismicity, considering feature extraction, event classification, and subsequent event detection, in order to reduce the processing time as a first step towards a high-reliability real-time automatic detection system. We compiled and extracted a comprehensive set of temporal, moving average, spectral, and scale-domain features for separating long period seismic events from background noise. We benchmarked two common kinds of feature selection techniques, namely, filter (mutual information and statistical dependence) and embedded (cross-validation and pruning), each of them using suitable classification algorithms such as k Nearest Neighbors (k-NN) and Decision Trees (DT). We applied this approach to the seismicity recorded at Cotopaxi Volcano in Ecuador during 2009 and 2010. The best results were obtained by using a 15 s segmentation window, a feature matrix in the frequency domain, and a DT classifier, yielding 99% detection accuracy and sensitivity. Selected features and their interpretation were consistent among different input spaces, in simple terms of amplitude and spectral content. Our study provides the framework for an event detection system with high accuracy and reduced computational requirements.
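
    A filter-type selection criterion such as mutual information can be sketched as follows; the discretized toy features here are illustrative, not the Cotopaxi feature set. Features that share more information with the event/noise label rank higher and are kept.

```python
import math
from collections import Counter

def mutual_information(feature, labels):
    """Mutual information (in bits) between a discretized feature and labels."""
    n = len(labels)
    pxy = Counter(zip(feature, labels))
    px = Counter(feature)
    py = Counter(labels)
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) )
        mi += p * math.log2(c * n / (px[x] * py[y]))
    return mi

labels      = [0, 0, 0, 0, 1, 1, 1, 1]
informative = [0, 0, 0, 0, 1, 1, 1, 1]  # perfectly separates the classes
noisy       = [0, 1, 0, 1, 0, 1, 0, 1]  # independent of the label
print(mutual_information(informative, labels))  # → 1.0
print(mutual_information(noisy, labels))        # → 0.0
```

    Ranking features this way before training a k-NN or DT classifier reduces the input dimension and the processing time, in the spirit of the filter methods benchmarked above.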

  3. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
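
    The residual-based detection idea can be sketched as a simple persistence test on model-versus-measurement residuals. This hedged stand-in omits the Extended Kalman Filter, multiple-hypothesis testing, and neural networks of the actual task; the threshold and toy sensor trace are illustrative assumptions.

```python
def detect_fault(measured, predicted, threshold, persistence):
    """Declare a fault when the residual exceeds the threshold for
    `persistence` consecutive samples; return the declaring sample index."""
    run = 0
    for k, (m, p) in enumerate(zip(measured, predicted)):
        run = run + 1 if abs(m - p) > threshold else 0
        if run >= persistence:
            return k
    return None

model = [100.0] * 10                          # healthy model prediction
sensor = [100.1, 99.8, 100.2] + [104.0] * 7   # sensor bias fault from sample 3
print(detect_fault(sensor, model, threshold=2.0, persistence=3))  # → 5
```

    Requiring persistence guards against declaring a fault on a single noisy residual, at the cost of a small detection delay; a real system would also compare residual patterns against several fault hypotheses to isolate the fault type.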

  4. Visual and Real-Time Event-Specific Loop-Mediated Isothermal Amplification Based Detection Assays for Bt Cotton Events MON531 and MON15985.

    PubMed

    Randhawa, Gurinder Jit; Chhabra, Rashmi; Bhoge, Rajesh K; Singh, Monika

    2015-01-01

    Bt cotton events MON531 and MON15985 are authorized for commercial cultivation in more than 18 countries. In India, four Bt cotton events have been commercialized; more than 95% of the total area under genetically modified (GM) cotton cultivation comprises events MON531 and MON15985. The present study reports the development of efficient event-specific visual and real-time loop-mediated isothermal amplification (LAMP) assays for detection and identification of cotton events MON531 and MON15985. The efficiency of the LAMP assays was compared with conventional and real-time PCR assays. The real-time LAMP assay was found to be the most time-efficient and sensitive, detecting as few as two target copies within 35 min. The developed real-time LAMP assays, when combined with an efficient DNA extraction kit/protocol, may facilitate onsite GM detection to check the authenticity of Bt cotton seeds.

  5. Model-based fault detection and isolation for intermittently active faults with application to motion-based thruster fault detection and isolation for spacecraft

    NASA Technical Reports Server (NTRS)

    Wilson, Edward (Inventor)

    2008-01-01

    The present invention is a method for detecting and isolating fault modes in a system having a model describing its behavior and regularly sampled measurements. The models are used to calculate past and present deviations from measurements that would result with no faults present, as well as with one or more potential fault modes present. Algorithms that calculate and store these deviations, along with memory of when said faults, if present, would have an effect on the said actual measurements, are used to detect when a fault is present. Related algorithms are used to exonerate false fault modes and finally to isolate the true fault mode. This invention is presented with application to detection and isolation of thruster faults for a thruster-controlled spacecraft. As a supporting aspect of the invention, a novel, effective, and efficient filtering method for estimating the derivative of a noisy signal is presented.
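
    A generic smoothed-derivative estimator (not the patented filter, whose details the abstract does not give) can be sketched as follows, assuming moving-average smoothing followed by a central difference; the window length and sample spacing are illustrative.

```python
def smoothed_derivative(signal, dt, window=5):
    """Estimate d(signal)/dt: moving-average smooth, then central difference."""
    half = window // 2
    smooth = [
        sum(signal[max(0, i - half) : i + half + 1])
        / len(signal[max(0, i - half) : i + half + 1])
        for i in range(len(signal))
    ]
    deriv = [0.0] * len(signal)
    for i in range(1, len(signal) - 1):
        deriv[i] = (smooth[i + 1] - smooth[i - 1]) / (2 * dt)
    return deriv

# Ramp of slope 2 with small alternating noise: interior estimates stay near 2.
sig = [2.0 * i + (0.05 if i % 2 else -0.05) for i in range(20)]
d = smoothed_derivative(sig, dt=1.0)
print(all(abs(x - 2.0) < 0.2 for x in d[4:16]))  # → True
```

    Differentiation amplifies noise, so some smoothing before (or combined with) the difference is essential when residuals of the derivative feed a fault detector.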

  6. A time-frequency approach for event detection in non-intrusive load monitoring

    NASA Astrophysics Data System (ADS)

    Jin, Yuanwei; Tebekaemi, Eniye; Berges, Mario; Soibelman, Lucio

    2011-06-01

    Non-intrusive load monitoring is an emerging signal processing and analysis technology that aims to identify individual appliances in residential or commercial buildings, or to diagnose shipboard electro-mechanical systems, through continuous monitoring of changes in the on/off status of various loads. In this paper, we develop a joint time-frequency approach for appliance event detection based on the time-varying power signals obtained from the measured aggregated current and voltage waveforms. The short-time Fourier transform is performed to obtain the spectral components of the non-stationary aggregated power signals of appliances. The proposed event detector utilizes a goodness-of-fit Chi-squared test for detecting load activities using the calculated average power, followed by a change point detector for estimating the change point of the transient signals using the first harmonic component of the power signals. Unlike conventional detectors such as the generalized likelihood ratio test, the proposed event detector allows a closed-form calculation of the decision threshold and provides a guideline for choosing the size of the detection data window, thus eliminating the need for extensive training to determine the detection threshold while providing robust detection performance against dynamic load activities. Using real-world power data collected in two residential building testbeds, we demonstrate the superior performance of the proposed algorithm compared to the conventional generalized likelihood ratio detector.
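
    The goodness-of-fit detection idea can be sketched as follows; the statistic, window size, and fixed threshold here are simplified illustrative assumptions, not the paper's closed-form design.

```python
def gof_statistic(window, ref_mean):
    """Chi-squared-style statistic: squared deviations from the steady-state
    reference power, scaled by the reference."""
    return sum((x - ref_mean) ** 2 / ref_mean for x in window)

def detect_events(power, ref_mean, win=4, threshold=10.0):
    """Window start indices where the statistic rejects the steady state."""
    hits = []
    for i in range(len(power) - win + 1):
        if gof_statistic(power[i : i + win], ref_mean) > threshold:
            hits.append(i)
    return hits

# Steady 100 W draw, then an appliance switches on (+60 W) at sample 8.
power = [100.0] * 8 + [160.0] * 6
events = detect_events(power, ref_mean=100.0)
print(events[0])  # → 5
```

    The first flagged window is the one whose tail reaches the switching transient; a change point detector would then refine the exact event time within it.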

  7. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection

    PubMed Central

    Olson, Sarah H.; Benedum, Corey M.; Mekaru, Sumiko R.; Preston, Nicholas D.; Mazet, Jonna A.K.; Joly, Damien O.

    2015-01-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  8. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    PubMed

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well-placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.

  9. EO/IR satellite constellations for the early detection and tracking of collision events

    NASA Astrophysics Data System (ADS)

    Zatezalo, A.; El-Fallah, A.; Mahler, R.; Mehra, R. K.; Pham, K.

    2010-04-01

    The detection and tracking of collision events involving existing Low Earth Orbit (LEO) Resident Space Objects (RSOs) is becoming increasingly important as LEO traffic volume grows and is anticipated to increase even further in the near future. Changes in velocity that can lead to a collision are hard to detect early, before the collision happens. Several collision events can happen at the same time, and continuous monitoring of the LEO orbit is necessary in order to determine and implement collision avoidance strategies. We present a simulation of a constellation system consisting of multiple platforms carrying EO/IR sensors for the detection of such collisions. The presented simulation encompasses the full complexity of changes in LEO trajectories that can collide with currently operating satellites. An efficient multitarget filter with information-theoretic multisensor management is implemented and evaluated on different constellations.

  10. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.

    PubMed

    Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S

    2015-08-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data.

  11. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. Different subfields of information technology have also attracted much attention from researchers and practitioners seeking to design systems that can detect the key members actually responsible for such events. In this paper, we present a novel method to predict key players in a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies, including two publicly available datasets and one local network. PMID:25136674
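
    Centrality measures of the kind fed to such a hybrid classifier can be sketched with degree centrality on a toy network; the network and the `degree_centrality` helper below are hypothetical, not data or code from the study.

```python
def degree_centrality(edges):
    """Fraction of the other nodes each node is directly connected to."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {node: d / (n - 1) for node, d in deg.items()}

# A star network: the hub touches every other node, so it scores highest.
edges = [("hub", "a"), ("hub", "b"), ("hub", "c"), ("hub", "d")]
cent = degree_centrality(edges)
print(max(cent, key=cent.get))  # → hub
```

    In practice several measures (degree, betweenness, closeness, eigenvector) are computed per node and stacked into a feature vector, which the classifier then maps to a key-player label.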

  12. Event Detection and Location of Earthquakes Using the Cascadia Initiative Dataset

    NASA Astrophysics Data System (ADS)

    Morton, E.; Bilek, S. L.; Rowe, C. A.

    2015-12-01

    The Cascadia subduction zone (CSZ) produces a range of slip behavior along the plate boundary megathrust, from great earthquakes to episodic slow slip and tremor (ETS). Unlike other subduction zones that produce great earthquakes and ETS, the CSZ is notable for the lack of small and moderate magnitude earthquakes recorded. The seismogenic zone extent is currently estimated to be primarily offshore, thus the lack of observed small, interplate earthquakes may be partially due to the use of only land seismometers. The Cascadia Initiative (CI) community seismic experiment seeks to address this issue by including ocean bottom seismometers (OBS) deployed directly over the locked seismogenic zone, in addition to land seismometers. We use these seismic data to explore whether small magnitude earthquakes are occurring on the plate interface, but have gone undetected by the land-based seismic networks. We select a subset of small magnitude (M0.1-3.7) earthquakes from existing earthquake catalogs, based on land seismic data, whose preliminary hypocentral locations suggest they may have occurred on the plate interface. We window the waveforms on CI OBS and land seismometers around the phase arrival times for these earthquakes to generate templates for subspace detection, which allows for additional flexibility over traditional matched filter detection methods. Here we present event detections from the first year of CI deployment and preliminary locations for the detected events. Initial results of scanning the first year of the CI deployment using one cluster of template events, located near a previously identified subducted seamount, include 473 detections on OBS station M08A (~61.6 km offshore) and 710 detections on OBS station J25A (~44.8 km northeast of M08A). Ongoing efforts include detection using additional OBS stations along the margin, as well as determining locations of clusters detected in the first year of deployment.

  13. Event-specific quantitative detection of nine genetically modified maizes using one novel standard reference molecule.

    PubMed

    Yang, Litao; Guo, Jinchao; Pan, Aihu; Zhang, Haibo; Zhang, Kewei; Wang, Zhengming; Zhang, Dabing

    2007-01-10

    With the development of genetically modified organism (GMO) detection techniques, the polymerase chain reaction (PCR) has become the mainstay for GMO detection, and real-time PCR is the most effective and important method for GMO quantification. An event-specific detection strategy based on the unique and specific integration junction sequences between the host plant genome DNA and the integrated gene has been developed for its high specificity. This study establishes event-specific detection methods for TC1507 and CBH351 maizes. In addition, the event-specific TaqMan real-time PCR detection methods for another seven GM maize events (Bt11, Bt176, GA21, MON810, MON863, NK603, and T25) were systematically optimized and developed. In these PCR assays, the fluorescent quencher, TAMRA, was attached to an internal T base of the probe to improve the intensity of the fluorescent signal. To overcome the difficulties in obtaining certified reference materials for these GM maizes, one novel standard reference molecule containing all nine specific integration junction sequences of these GM maizes and the maize endogenous reference gene, zSSIIb, was constructed and used for quantitative analysis. The limits of detection of these methods were 20 copies for the different GM maizes, the limits of quantitation were about 20 copies, and the dynamic ranges for quantification were from 0.05 to 100% in 100 ng of DNA template. Furthermore, nine groups of mixed maize samples of these nine GM maize events were quantitatively analyzed to evaluate the accuracy and precision. The accuracy expressed as bias varied from 0.67 to 28.00% for the nine tested groups of GM maize samples, and the precision expressed as relative standard deviation was from 0.83 to 26.20%. All of this indicated that the established event-specific real-time PCR detection systems and the reference molecule in this study are suitable for the identification and quantification of these GM maize events.

  14. Long-Duration Neutron Production in Solar Eruptive Events Detected with the MESSENGER Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Feldman, W. C.; Lawrence, D. J.; Vestrand, W. T.; Peplowski, P. N.

    2014-12-01

    Nine long-duration neutron solar eruptive events (SEEs) between 31 December 2007 and 16 March 2013 appear to be excellent candidates for detection of fast neutrons from the Sun by the MESSENGER Neutron Spectrometer (NS). One event (on 4 June 2011) is the cleanest example, because it was not accompanied by energetic ions at MESSENGER having energies greater than 50±10 MeV/nuc. The purpose of this study is to assemble a set of conditions common to all events that can help identify the physical conditions at their origin. We classified the nine events into four categories: (1) those having tight magnetic connection to the Sun as well as to spacecraft at 1 AU that can separately measure the energetic proton, alpha particle, and electron flux spectra, (2) those with sufficiently close connection that the energetic flux spectra can be compared, (3) those that have only marginal connections, and (4) those that are also seen at Earth. Four events fall into category (1), three into category (2), two into category (3), and parts of four events overlapped neutron events also seen by the scintillation FIBer solar neutron telescope (FIB) detector placed on the International Space Station in 2009. Seven of the nine events that have either tight or marginal magnetic connection have alpha particle abundances less than 2%. For each event, we modeled expected fast neutron count rates from the 1 AU ion spectrum, a process that accounts for the transport of the neutrons through the spacecraft to the NS. The ratios of measured to predicted fast-neutron counts range between 2.0 and 12.1.

  15. Assessment of a Multiple Model Based Parametric Method for Output-Only Vibration-Based Damage Detection for a Population of Like Structures

    NASA Astrophysics Data System (ADS)

    Vamvoudakis-Stefanou, Kyriakos J.; Sakellariou, John S.; Fassois, Spilios D.

    2015-07-01

    This study focuses on the problem of vibration-based damage detection for a population of like structures. Although nominally identical, like structures exhibit variability in their characteristics due to variability in materials and manufacturing. This inevitably leads to variability in the dynamics, which may be so significant as to mask deviations due to damage. Damage detection via conventional vibration-based methods, using a common threshold in the decision-making mechanism, thus becomes highly challenging. The study presents a detailed assessment of a recently introduced Multiple Model (MM) based AutoRegressive (AR) model parameter method aimed at addressing this problem. The assessment is based on a large number of experimental test/inspection cases using composite beams damaged via impact, as well as comparisons with the corresponding conventional (single model based) method. The results confirm significant improvement over the method's conventional counterpart. A sensitivity analysis additionally indicates that the method is relatively insensitive to the model order, but sensitive to the specific beams selected as baseline (training) ones; in fact, their selection may lead to excellent results.

  16. Automatic microseismic event detection by band-limited phase-only correlation

    NASA Astrophysics Data System (ADS)

    Wu, Shaojiang; Wang, Yibo; Zhan, Yi; Chang, Xu

    2016-12-01

    Identification and detection of microseismic events is a significant issue in source location and source-mechanism analysis. The number of records is notably large, especially in real-time monitoring, and because the majority of microseismic events are weak and sparse, automatic algorithms are indispensable. In this study, we introduce an effective method for identifying and detecting microseismic events by judging whether a P-wave phase exists in a local segment of a single three-component microseismic record. The judging algorithm consists primarily of the following key steps: 1) transform the waveform time series into time-varying spectral representations using the S-transform; 2) calculate the similarity of the frequency content in the time-frequency domain using the phase-only correlation function; and 3) identify the P-phase by combined analysis between any two components. The proposed algorithm is compared to a similar approach using cross-correlation in the time domain between any two components, and is then tested with synthetic and real field-recorded microseismic datasets. The results indicate that the proposed algorithm is able to distinguish similar and dissimilar waveforms, even for low signal-to-noise ratio and emergent events, which is important for accurate and rapid selection of microseismic events from a large number of records. The method can also be applied to other geophysical analyses based on waveform data.
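    The phase-only correlation step in entry 16 can be illustrated with a minimal sketch: the cross-power spectrum of two signals is normalized to unit magnitude so that only phase information contributes, producing a sharp peak when the waveforms are similar. This is a generic Python illustration, not the authors' implementation, and it operates on raw time series rather than on S-transform output:

```python
import numpy as np

def phase_only_correlation(x, y):
    """Phase-only correlation (POC) of two equal-length signals.
    The cross-power spectrum is normalized to unit magnitude so that
    only phase differences contribute; similar waveforms give a sharp
    peak near zero lag."""
    cross = np.fft.fft(x) * np.conj(np.fft.fft(y))
    cross /= np.abs(cross) + 1e-12  # keep phase, discard amplitude
    return np.real(np.fft.ifft(cross))

rng = np.random.default_rng(0)
s = rng.standard_normal(256)
poc_same = phase_only_correlation(s, s)                          # identical waveforms
poc_diff = phase_only_correlation(s, rng.standard_normal(256))   # unrelated noise
print(round(poc_same.max(), 3), int(poc_same.argmax()))  # ~1.0 at lag 0
```

    For identical inputs the normalized cross-power spectrum is flat, so its inverse FFT is a unit impulse at zero lag; unrelated noise produces no comparable peak, which is what makes the measure useful for weak, emergent events.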

  17. Effect of amplitude criteria on operating characteristic of detection for OSAH events with oxygen saturation.

    PubMed

    Lee, Y K; Bister, M; Salleh, Y M; Blanchfield, P

    2007-01-01

    The effect of amplitude criteria on the operating characteristics of algorithms for detecting OSAH events based on the analysis of oxygen saturation alone is investigated. The objective is to establish that there exists an oxygen desaturation level that leverages these algorithms to be more sensitive or more specific, irrespective of differences in detection mechanism and database (a first such attempt). Linear classification of algorithms from previous studies showed that a drop in oxygen saturation of 3% or less makes a detection algorithm more sensitive, while a drop of 4% or more makes it more specific. Results from two algorithms developed here also supported this. This finding explains the contradictions cited in the performance of algorithms from different authors, which cast doubt on their detection ability. It could lead to the establishment of standard oxygen desaturation levels for screening and diagnosis of moderate/severe OSA, thus providing a more credible basis for comparing automated detection algorithms or even clinical tests.
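    The amplitude criterion discussed in entry 17 amounts to a single threshold on the depth of an oxygen desaturation. A minimal sketch, assuming a simple running-maximum baseline (the paper's algorithms are more elaborate and are not reproduced here); the window length and thresholds are illustrative:

```python
def count_desaturation_events(spo2, drop_threshold=3.0, baseline_window=10):
    """Count desaturation events: samples where SpO2 falls at least
    `drop_threshold` percentage points below the running maximum of the
    preceding `baseline_window` samples. A lower threshold flags more
    candidates (more sensitive); a higher one flags fewer (more specific)."""
    events = 0
    in_event = False
    for i in range(baseline_window, len(spo2)):
        baseline = max(spo2[i - baseline_window:i])
        if baseline - spo2[i] >= drop_threshold:
            if not in_event:       # count each contiguous drop once
                events += 1
                in_event = True
        else:
            in_event = False
    return events

# Synthetic trace: stable 97% with two transient drops to 93%.
trace = [97]*20 + [93]*5 + [97]*20 + [93]*5 + [97]*10
n_sensitive = count_desaturation_events(trace, drop_threshold=3.0)
n_specific = count_desaturation_events(trace, drop_threshold=5.0)
print(n_sensitive, n_specific)  # 2 0
```

    The same trace yields two events under the 3% criterion and none under a 5% criterion, mirroring the sensitivity/specificity trade-off the entry describes.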

  18. Detection of generic differential RNA processing events from RNA-seq data

    PubMed Central

    Tran, Van Du T; Souiai, Oussema; Romero-Barrios, Natali; Crespi, Martin; Gautheret, Daniel

    2016-01-01

    RNA-seq data analysis has revealed abundant alternative splicing in eukaryotic mRNAs. However, splicing is only one of many processing events that transcripts may undergo during their lifetime. We present here RNAprof (RNA profile analysis), a program for the detection of differential processing events from the comparison of RNA-seq experiments. RNAprof implements a specific gene-level normalization procedure and compares RNA-seq coverage profiles at nucleotide resolution to detect regions of significant coverage differences, independently of splice sites or other gene features. We used RNAprof to analyze the effect of alternative-splicing regulators NSRa and NSRb on the Arabidopsis thaliana transcriptome. A number of intron retention events and alternative transcript structures were specifically detected by RNAprof and confirmed by qRT-PCR. Further tests using a public Mus musculus RNA-seq dataset and comparisons with other RNA isoform predictors showed that RNAprof uniquely identified sets of highly significant processing events as well as other relevant library-specific differences in RNA-seq profiles. This highlights an important layer of variation that remains undetected by current protocols for RNA-seq analysis. PMID:26849165

  19. Detection of generic differential RNA processing events from RNA-seq data.

    PubMed

    Tran, Van Du T; Souiai, Oussema; Romero-Barrios, Natali; Crespi, Martin; Gautheret, Daniel

    2016-01-01

    RNA-seq data analysis has revealed abundant alternative splicing in eukaryotic mRNAs. However, splicing is only one of many processing events that transcripts may undergo during their lifetime. We present here RNAprof (RNA profile analysis), a program for the detection of differential processing events from the comparison of RNA-seq experiments. RNAprof implements a specific gene-level normalization procedure and compares RNA-seq coverage profiles at nucleotide resolution to detect regions of significant coverage differences, independently of splice sites or other gene features. We used RNAprof to analyze the effect of alternative-splicing regulators NSRa and NSRb on the Arabidopsis thaliana transcriptome. A number of intron retention events and alternative transcript structures were specifically detected by RNAprof and confirmed by qRT-PCR. Further tests using a public Mus musculus RNA-seq dataset and comparisons with other RNA isoform predictors showed that RNAprof uniquely identified sets of highly significant processing events as well as other relevant library-specific differences in RNA-seq profiles. This highlights an important layer of variation that remains undetected by current protocols for RNA-seq analysis.

  20. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database

    PubMed Central

    2016-01-01

    We conducted pharmacovigilance data mining for the β-lactam antibiotic amoxicillin and compared the adverse events (AEs) with the drug labels of 9 countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, covering December 1988 to June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events, including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms, were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the 9 countries. These signals should be followed by evaluation of causal association, clinical significance, and preventability. PMID:27510377

  1. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database.

    PubMed

    Soukavong, Mick; Kim, Jungmee; Park, Kyounghoon; Yang, Bo Ram; Lee, Joongyub; Jin, Xue Mei; Park, Byung Joo

    2016-09-01

    We conducted pharmacovigilance data mining for the β-lactam antibiotic amoxicillin and compared the adverse events (AEs) with the drug labels of 9 countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, covering December 1988 to June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events, including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms, were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the 9 countries. These signals should be followed by evaluation of causal association, clinical significance, and preventability.
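    The PRR and ROR indices used in the two entries above are simple disproportionality measures computed from a 2x2 table of spontaneous reports. A sketch with fabricated counts (not the KAERS figures); in practice, signal criteria also involve minimum report counts and confidence bounds:

```python
def disproportionality(a, b, c, d):
    """PRR and ROR from a 2x2 drug-AE report table:
         a: drug of interest & AE of interest
         b: drug of interest & all other AEs
         c: all other drugs & AE of interest
         d: all other drugs & all other AEs"""
    prr = (a / (a + b)) / (c / (c + d))   # proportional reporting ratio
    ror = (a * d) / (b * c)               # reporting odds ratio
    return prr, ror

# Fabricated counts for illustration (not from the KAERS study):
prr, ror = disproportionality(a=20, b=2893, c=500, d=189097)
print(round(prr, 2), round(ror, 2))  # 2.6 2.61
```

    Both indices compare how often the AE is reported with the drug of interest against its reporting rate with all other drugs; values well above 1 suggest a disproportionate association worth evaluating.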

  2. Detections of Planets in Binaries Through the Channel of Chang-Refsdal Gravitational Lensing Events

    NASA Astrophysics Data System (ADS)

    Han, Cheongho; Shin, In-Gu; Jung, Youn Kil

    2017-02-01

    Chang-Refsdal (C-R) lensing, which refers to the gravitational lensing of a point mass perturbed by a constant external shear, provides a good approximation in describing lensing behaviors of either a very wide or a very close binary lens. C-R lensing events, which are identified by short-term anomalies near the peak of high-magnification lensing light curves, are routinely detected from lensing surveys, but not much attention is paid to them. In this paper, we point out that C-R lensing events provide an important channel to detect planets in binaries, both in close and wide binary systems. Detecting planets through the C-R lensing event channel is possible because the planet-induced perturbation occurs in the same region of the C-R lensing-induced anomaly and thus the existence of the planet can be identified by the additional deviation in the central perturbation. By presenting the analysis of the actually observed C-R lensing event OGLE-2015-BLG-1319, we demonstrate that dense and high-precision coverage of a C-R lensing-induced perturbation can provide a strong constraint on the existence of a planet in a wide range of planet parameters. The sample of an increased number of microlensing planets in binary systems will provide important observational constraints in giving shape to the details of planet formation, which have been restricted to the case of single stars to date.

  3. A Heuristic Indication and Warning Staging Model for Detection and Assessment of Biological Events

    PubMed Central

    Wilson, James M.; Polyak, Marat G.; Blake, Jane W.; Collmann, Jeff

    2008-01-01

    Objective This paper presents a model designed to enable rapid detection and assessment of biological threats that may require swift intervention by the international public health community. Design We utilized Strauss’ grounded theory to develop an expanded model of social disruption due to biological events based on retrospective and prospective case studies. We then applied this model to the temporal domain and propose a heuristic staging model, the Wilson–Collmann Scale, for assessing biological event evolution. Measurements We retrospectively and manually examined hard copy archival local media reports in the native vernacular for three biological events associated with substantial social disruption. The model was then tested prospectively through media harvesting based on keywords corresponding to the model parameters. Results Our heuristic staging model provides valuable information about the features of a biological event that can be used to determine the level of concern warranted, such as whether the pathogen in question is responding to established public health disease control measures, including the use of antimicrobials or vaccines; whether the public health and medical infrastructure of the country involved is adequate to mount the necessary response; whether the country’s officials are providing an appropriate level of information to international public health authorities; and whether the event poses an international threat. The approach is applicable for monitoring open-source (public-domain) media for indications and warnings of such events, and specifically for markers of the social disruption that commonly occurs as these events unfold. These indications and warnings can then be used as the basis for staging the biological threat in the same manner that the United States National Weather Service currently uses storm warning models (such as the Saffir-Simpson Hurricane Scale) to detect and assess threatening weather conditions. Conclusion

  4. Exploring the Limits of Waveform Correlation Event Detection as Applied to Three Earthquake Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Young, C. J.; Carr, D.; Resor, M.; Duffey, S.

    2009-12-01

    Swarms of earthquakes and/or aftershock sequences can dramatically increase the level of seismicity in a region for a period of time lasting from days to months, depending on the swarm or sequence. Such occurrences can provide a large amount of useful information to seismologists. For those who monitor seismic events for possible nuclear explosions, however, these swarms/sequences are a nuisance. In an explosion monitoring system, each event must be treated as a possible nuclear test until it can be proven, to a high degree of confidence, not to be. Seismic events recorded by the same station with highly correlated waveforms almost certainly have a similar location and source type, so clusters of events within a swarm can quickly be identified as earthquakes. We have developed a number of tools that can be used to exploit the high degree of waveform similarity expected to be associated with swarms/sequences. Dendro Tool measures correlations between known events. The Waveform Correlation Detector is intended to act as a detector, finding events in raw data which correlate with known events. The Self Scanner is used to find all correlated segments within a raw data stream and does not require an event library. All three techniques together provide an opportunity to study the similarities of events in an aftershock sequence in different ways. To comprehensively characterize the benefits and limits of waveform correlation techniques, we studied 3 aftershock sequences, using our 3 tools, at multiple stations. We explored the effects of station distance and event magnitudes on correlation results. Lastly, we show the reduction in detection threshold and analyst workload offered by waveform correlation techniques compared to STA/LTA based detection. We analyzed 4 days of data from each aftershock sequence using all three methods. Most known events clustered in a similar manner across the toolsets. Up to 25% of catalogued events were found to be a member of a cluster. 
In

  5. Detecting Adverse Drug Events in Discharge Summaries Using Variations on the Simple Bayes Model

    PubMed Central

    Visweswaran, Shyam; Hanbury, Paul; Saul, Melissa; Cooper, Gregory F.

    2003-01-01

    Detection and prevention of adverse events and, in particular, adverse drug events (ADEs), is an important problem in health care today. We describe the implementation and evaluation of four variations on the simple Bayes model for identifying ADE-related discharge summaries. Our results show that these probabilistic techniques achieve an ROC curve area of up to 0.77 in correctly determining which patient cases should be assigned an ADE-related ICD-9-CM code. These results suggest a potential for these techniques to contribute to the development of an automated system that helps identify ADEs, as a step toward further understanding and preventing them. PMID:14728261

  6. Wave-induced burst precipitation events detected with a digital ionosonde

    SciTech Connect

    Jarvis, M.J.; Smith, A.J.; Berkey, F.T.; Carpenter, D.L.

    1990-01-01

    Initial results are presented from two methods whereby burst precipitation events in the lower ionosphere, almost certainly induced by VLF wave-particle interactions in the magnetosphere, have been detected using a ground-based digital ionosonde. In the first method, HF echoes are received above the critical frequency of the surrounding plasma; particle energies and the location and extent of the plasma enhancement may be deduced. In the second method, a rapid decrease in the phase of ionospheric echoes is observed due to refractive index changes along the echo path; particle energies, the duration of the precipitation event and the precipitation energy flux can be estimated.

  7. Detecting adverse drug events in discharge summaries using variations on the simple Bayes model.

    PubMed

    Visweswaran, Shyam; Hanbury, Paul; Saul, Melissa; Cooper, Gregory F

    2003-01-01

    Detection and prevention of adverse events and, in particular, adverse drug events (ADEs), is an important problem in health care today. We describe the implementation and evaluation of four variations on the simple Bayes model for identifying ADE-related discharge summaries. Our results show that these probabilistic techniques achieve an ROC curve area of up to 0.77 in correctly determining which patient cases should be assigned an ADE-related ICD-9-CM code. These results suggest a potential for these techniques to contribute to the development of an automated system that helps identify ADEs, as a step toward further understanding and preventing them.
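    The simple Bayes model used in the two entries above scores a document by combining class priors with per-word likelihoods under an independence assumption. A minimal bag-of-words sketch with fabricated discharge-summary snippets (the study's features and its four model variants are not reproduced here):

```python
import math
from collections import Counter

def train_simple_bayes(docs, labels):
    """Fit a simple (naive) Bayes bag-of-words model with add-one
    smoothing; returns class priors, per-class word likelihoods, vocab."""
    classes = sorted(set(labels))
    priors = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: Counter() for c in classes}
    vocab = set()
    for doc, lab in zip(docs, labels):
        words = doc.lower().split()
        counts[lab].update(words)
        vocab.update(words)
    likelihood = {
        c: {w: (counts[c][w] + 1) / (sum(counts[c].values()) + len(vocab))
            for w in vocab}
        for c in classes
    }
    return priors, likelihood, vocab

def classify(doc, priors, likelihood, vocab):
    """Return the class with the highest posterior log-score."""
    scores = {}
    for c in priors:
        score = math.log(priors[c])
        for w in doc.lower().split():
            if w in vocab:  # ignore out-of-vocabulary words
                score += math.log(likelihood[c][w])
        scores[c] = score
    return max(scores, key=scores.get)

# Fabricated snippets standing in for discharge-summary text:
docs = ["rash after penicillin dose", "fever responded to antibiotics",
        "nausea and dizziness after new drug", "routine follow up visit"]
labels = ["ade", "no_ade", "ade", "no_ade"]
priors, likelihood, vocab = train_simple_bayes(docs, labels)
pred_ade = classify("rash after drug", priors, likelihood, vocab)
pred_none = classify("routine follow up", priors, likelihood, vocab)
print(pred_ade, pred_none)  # ade no_ade
```

    Working in log space avoids underflow from multiplying many small probabilities, and add-one smoothing keeps unseen word-class pairs from zeroing out a score.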

  8. Composite Event Specification and Detection for Supporting Active Capability in an OODBMS: Semantics Architecture and Implementation.

    DTIC Science & Technology

    1995-03-01

    [Excerpt garbled in the source OCR: it contains pseudocode for an activate-operator-node procedure used in composite-event detection (recent context), covering parameter propagation along a node's outgoing edges and the signalling rule for an AND(E1, E2) operator node, together with Figure 6, "Detection of X in recent mode".]

  9. Detection of seismic events triggered by P-waves from the 2011 Tohoku-Oki earthquake

    NASA Astrophysics Data System (ADS)

    Miyazawa, Masatoshi

    2012-12-01

    Large-amplitude surface waves from the 2011 Tohoku-Oki earthquake triggered many seismic events across Japan, while the smaller amplitude P-wave triggering remains unclear. A spectral method was used to detect seismic events triggered by the first arriving P-waves over Japan. This method uses a reference event to correct for source and propagation effects, so that the local response near the station can be examined in detail. P-wave triggering was found in the regions where triggered non-volcanic tremor (NVT) has been observed, and some seismic and volcanic regions. The triggering strain due to P-waves is of the order of 10⁻⁸ to 10⁻⁷, which is 1 to 2 orders of magnitude smaller than the triggering strain necessary for the surface wave triggering. In the regions of NVT, the triggered event was not identified with slow events, but with other seismic events such as tectonic earthquakes. The sequence of triggering in the regions started with P-wave arrivals. The subsequent surface waves contributed to triggering of NVT, possibly together with slow slip, which resulted in the large amplitude of the NVT.

  10. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-07-13

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles.

  11. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle’s speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles. PMID:27420073

  12. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  13. CTBT infrasound network performance to detect the 2013 Russian fireball event

    DOE PAGES

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; ...

    2015-03-18

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  14. CTBT infrasound network performance to detect the 2013 Russian fireball event

    SciTech Connect

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick

    2015-03-18

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  15. Microfluidic Arrayed Lab-On-A-Chip for Electrochemical Capacitive Detection of DNA Hybridization Events.

    PubMed

    Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza

    2017-01-01

    A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high-throughput analysis of microliter-volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in in-field environmental monitoring.

  16. An algorithm to detect low incidence arrhythmic events in electrocardiographic records from ambulatory patients.

    PubMed

    Hungenahally, S K; Willis, R J

    1994-11-01

    An algorithm was devised to detect low incidence arrhythmic events in electrocardiograms obtained during ambulatory monitoring. The algorithm incorporated baseline correction and R wave detection. The RR interval was used to identify tachycardia, bradycardia, and premature ventricular beats. Only a few beats before and after the arrhythmic event were stored. The software was evaluated on a prototype hardware system which consisted of an Intel 86/30 single board computer with a suitable analog pre-processor and an analog to digital converter. The algorithm was used to determine the incidence and type of arrhythmia in records from an ambulatory electrocardiogram (ECG) database and from a cardiac exercise laboratory. These results were compared to annotations on the records which were assumed to be correct. Standard criteria used previously to evaluate algorithms designed for arrhythmia detection were sensitivity, specificity, and diagnostic accuracy. Sensitivities ranging from 77 to 100%, specificities from 94 to 100%, and diagnostic accuracies from 92 to 100% were obtained on the different data sets. These results compare favourably with published results based on more elaborate algorithms. By circumventing the need to make a continuous record of the ECG, the algorithm could form the basis for a compact monitoring device for the detection of arrhythmic events which are so infrequent that standard 24-h Holter monitoring is insufficient.
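    The RR-interval rules in entry 16 (tachycardia, bradycardia, premature beats) can be sketched as a few timing tests against a running mean. The thresholds below are illustrative placeholders, not the paper's calibrated values:

```python
def classify_beats(rr_intervals, tachy_rr=0.6, brady_rr=1.0, pvc_ratio=0.8):
    """Label each RR interval (seconds) by simple timing rules:
      - premature beat: RR shorter than `pvc_ratio` times the running mean
      - tachycardia:    RR shorter than `tachy_rr`
      - bradycardia:    RR longer than `brady_rr`
    All thresholds here are hypothetical examples."""
    labels = []
    mean_rr = rr_intervals[0]
    for rr in rr_intervals:
        if rr < pvc_ratio * mean_rr:
            labels.append("premature")
        elif rr < tachy_rr:
            labels.append("tachycardia")
        elif rr > brady_rr:
            labels.append("bradycardia")
        else:
            labels.append("normal")
        mean_rr = 0.9 * mean_rr + 0.1 * rr  # exponential running mean
    return labels

rr = [0.8, 0.8, 0.5, 0.8, 1.2]  # synthetic RR series (seconds)
beat_labels = classify_beats(rr)
print(beat_labels)
```

    Storing only the few beats around each flagged interval, as the entry describes, is what lets such a device run for long periods without recording the full continuous ECG.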

  17. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and the events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing for classifying the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al., 2015, the advantages of using SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. It is expected to create a flexible and easily adjustable SVM method that can be applied in different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquake, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support
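    Entry 17 trains an SVM to separate earthquakes from quarry blasts. As a hedged stand-in for a full SVM solver (the usual choice is an SMO-type solver, as in libsvm), the sketch below trains a linear SVM with the simpler Pegasos subgradient method on fabricated two-feature events:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM with the Pegasos subgradient method.
    X: (n, d) feature matrix; y: labels in {-1, +1}.
    A lightweight stand-in for the SMO solvers used in SVM packages."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)                 # decaying step size
            if y[i] * (X[i] @ w) < 1:             # margin violated: hinge-loss step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                                 # margin satisfied: shrink only
                w = (1 - eta * lam) * w
    return w

# Fabricated two-feature events, e.g. (log energy ratio, spectral measure):
X = np.array([[1.0, 2.0], [1.5, 1.8], [2.0, 2.2],        # +1: "earthquake"
              [-1.0, -1.5], [-1.2, -2.0], [-2.0, -1.0]]) # -1: "quarry blast"
y = np.array([1, 1, 1, -1, -1, -1])
w = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
print(acc)  # 1.0
```

    Real discriminants for earthquake/explosion classification would use many waveform-derived features and a kernelized SVM; the point here is only the max-margin training loop.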

  18. Estimation of streamflow response to wildfire and salvage logging in a snow-dominated catchment using a model-based change detection approach

    NASA Astrophysics Data System (ADS)

    Moore, R. D.; Mahrlein, M.; Chuang, Y. C. M.

    2016-12-01

    Forest cover changes associated with natural disturbance and forest management can have significant influences on the magnitude and timing of streamflow. This study quantified the effect of a wildfire that burned over 60% of the catchment of Fishtrap Creek in the southern interior of British Columbia in August 2003. Fishtrap Creek has been gauged from 1970 to present. The catchment drains 158 km2 at the gauging station and has a snow-dominated hydrologic regime. In 2006, about one-third of the burned area was salvage logged. A semi-distributed hydrologic model was calibrated and tested using the pre-fire streamflow data. Simulated daily streamflow based on the "best" parameter set, and assuming pre-fire forest cover, was used as a "virtual" control in a paired-catchment analysis. Each year was divided into 73 five-day periods (pentads), and separate pre-fire regressions were fit for each of the 73 pentad time series. This approach avoids issues with autocorrelation and can address seasonally varying model bias. Statistically significant increases in streamflow were detected in late winter and through the month of April, with no evidence for increased peak flows, which is inferred to reflect a de-synchronization of snowmelt between disturbed and undisturbed areas of the catchment. The results of the model-based change detection are consistent with statistical analyses using climatic variables as covariates, but have the advantage of providing more temporal detail. However, the power of the change detection can be limited by insufficiently long records of streamflow and driving weather variables for both the pre- and post-fire periods and model structural errors (e.g., an inability to reproduce winter baseflow). An interesting side result of the study was the identification of parameter uncertainty associated with uncertainty regarding forest cover during the calibration period.

  19. Event Detection: A Clinical Notification Service on a Health Information Exchange Platform

    PubMed Central

    Moore, Thomas; Shapiro, Jason S.; Doles, Luke; Calman, Neil; Camhi, Eli; Check, Thomas; Onyile, Arit; Kuperman, Gilad

    2012-01-01

    Notifying ambulatory providers when their patients visit the hospital is a simple concept but potentially a powerful tool for improving care coordination. A health information exchange (HIE) can provide automatic notifications to its members by building services on top of their existing infrastructure. NYCLIX, Inc., a functioning HIE in New York City, has developed a system that detects hospital admissions, discharges and emergency department visits and notifies their providers. The system has been in use since November 2010. Out of 63,305 patients enrolled 6,913 (11%) had one or more events in the study period and on average there were 238 events per day. While event notifications have a clinical value, their use also involves non-clinical care coordination; new workflows should be designed to incorporate a broader care team in their use. This paper describes the user requirements for the notification system, system design, current status, lessons learned and future directions. PMID:23304336

  20. Use of a clinical event monitor to prevent and detect medication errors.

    PubMed Central

    Payne, T. H.; Savarino, J.; Marshall, R.; Hoey, C. T.

    2000-01-01

    Errors in health care facilities are common and often unrecognized. We have used our clinical event monitor to prevent and detect medication errors by scrutinizing electronic messages sent to it when any medication order is written in our facility. A growing collection of medication safety rules covering dose limit errors, laboratory monitoring, and other topics may be applied to each medication order message to provide an additional layer of protection beyond existing order checks, reminders, and alerts available within our computer-based record system. During a typical day the event monitor receives 4802 messages, of which 4719 pertain to medication orders. We have found the clinical event monitor to be a valuable tool for clinicians and quality management groups charged with improving medication safety. PMID:11079962

  1. Detection of Severe Rain on Snow events using passive microwave remote sensing

    NASA Astrophysics Data System (ADS)

    Grenfell, T. C.; Putkonen, J.

    2007-12-01

    Severe wintertime rain-on-snow (ROS) events create a strong ice layer or layers in the snow on arctic tundra that act as a barrier to ungulate grazing. These events are linked with large-scale ungulate herd declines via starvation and reduced calf production rate when the animals are unable to penetrate through the resulting ice layer. ROS events also produce considerable perturbation in the mean wintertime soil temperature beneath the snow pack. ROS is a sporadic but well-known and significant phenomenon that is currently very poorly documented. Characterization of the distribution and occurrence of severe rain-on-snow events is based only on anecdotal evidence, indirect observations of carcasses found adjacent to iced snow packs, and irregular detection by a sparse observational weather network. We have analyzed in detail a particular well-identified ROS event that took place on Banks Island in early October 2003 that resulted in the death of 20,000 musk oxen. We make use of multifrequency passive microwave imagery from the special sensing microwave imager satellite sensor suite (SSM/I) in conjunction with a strong-fluctuation-theory (SFT) emissivity model. We show that a combination of time series analysis and cluster analysis based on microwave spectral gradients and polarization ratios provides a means to detect the stages of the ROS event resulting from the modification of the vertical structure of the snow pack, specifically wetting the snow, the accumulation of liquid water at the base of the snow during the rain event, and the subsequent modification of the snowpack after refreezing. SFT model analysis provides quantitative confirmation of our interpretation of the evolution of the microwave properties of the snowpack as a result of the ROS event. In particular, in addition to the grain coarsening due to destructive metamorphism, we detect the presence of the internal water and ice layers, directly identifying the physical properties producing the

  2. Real-time gait event detection for lower limb amputees using a single wearable sensor.

    PubMed

    Maqbool, H F; Husman, M A B; Awad, M I; Abouhossein, A; Mehryar, P; Iqbal, N; Dehghani-Sanij, A A

    2016-08-01

    This paper presents a rule-based real-time gait event/phase detection system (R-GEDS) using a shank mounted inertial measurement unit (IMU) for lower limb amputees during the level ground walking. Development of the algorithm is based on the shank angular velocity in the sagittal plane and linear acceleration signal in the shank longitudinal direction. System performance was evaluated with four control subjects (CS) and one transfemoral amputee (TFA) and the results were validated with four FlexiForce footswitches (FSW). The results showed a data latency for initial contact (IC) and toe off (TO) within a range of ± 40 ms for both CS and TFA. A delay of about 3.7 ± 62 ms for a foot-flat start (FFS) and an early detection of -9.4 ± 66 ms for heel-off (HO) was found for CS. Prosthetic side showed an early detection of -105 ± 95 ms for FFS whereas intact side showed a delay of 141 ±73 ms for HO. The difference in the kinematics of the TFA and CS is one of the potential reasons for high variations in the time difference. Overall, detection accuracy was 99.78% for all the events in both groups. Based on the validated results, the proposed system can be used to accurately detect the temporal gait events in real-time that leads to the detection of gait phase system and therefore, can be utilized in gait analysis applications and the control of lower limb prostheses.

  3. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very bene¯cial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The ¯rst step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. It is analyzed only regions where movement is occurring, ignoring in°uence of pixels from regions where there is no movement, to segment moving objects. Regions where movement is occurring are identi¯ed using °ow detection via sparse frame analysis. During the tracking process the objects are classi¯ed into two categories: Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classi¯ed it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event suspicion of object(s) theft from inside a parked vehicle at spot X by a person" and results show that the use of °ow detection increases the recognition of this suspicious event from 78:57% to 92:85%.

  4. Assessment and validation of a simple automated method for the detection of gait events and intervals.

    PubMed

    Ghoussayni, Salim; Stevens, Christopher; Durham, Sally; Ewins, David

    2004-12-01

    A simple and rapid automatic method for detection of gait events at the foot could speed up and possibly increase the repeatability of gait analysis and evaluations of treatments for pathological gaits. The aim of this study was to compare and validate a kinematic-based algorithm used in the detection of four gait events, heel contact, heel rise, toe contact and toe off. Force platform data is often used to obtain start and end of contact phases, but not usually heel rise and toe contact events. For this purpose synchronised kinematic, kinetic and video data were captured from 12 healthy adult subjects walking both barefoot and shod at slow and normal self-selected speeds. The data were used to determine the gait events using three methods: force, visual inspection and algorithm methods. Ninety percent of all timings given by the algorithm were within one frame (16.7 ms) when compared to visual inspection. There were no statistically significant differences between the visual and algorithm timings. For both heel and toe contact the differences between the three methods were within 1.5 frames, whereas for heel rise and toe off the differences between the force on one side and the visual and algorithm on the other were higher and more varied (up to 175 ms). In addition, the algorithm method provided the duration of three intervals, heel contact to toe contact, toe contact to heel rise and heel rise to toe off, which are not readily available from force platform data. The ability to automatically and reliably detect the timings of these four gait events and three intervals using kinematic data alone is an asset to clinical gait analysis.

  5. Infrasound's capability to detect and characterise volcanic events, from local to regional scale.

    NASA Astrophysics Data System (ADS)

    Taisne, Benoit; Perttu, Anna

    2017-04-01

    Local infrasound and seismic networks have been successfully used for identification and quantification of explosions at single volcanoes. However the February, 2014 eruption of Kelud volcano, Indonesia, destroyed most of the local monitoring network. The use of remote seismic and infrasound sensors proved to be essential in the reconstruction of the eruptive sequence. The first recorded explosive event, with relatively weak seismic and infrasonic signature, was followed by a 2 hour sustained signal detected as far away as 11,000 km by infrasound sensors and up to 2,300 km away by seismometers. The volcanic intensity derived from these observations places the 2014 Kelud eruption between the intensity of the 1980 Mount St. Helens and the 1991 Pinatubo eruptions. The use of remote seismic stations and infrasound arrays in deriving valuable information about the onset, evolution, and intensity of volcanic eruptions is clear from the Kelud example. After this eruption the Singapore Infrasound Array became operational. This array, along with the other regional infrasound arrays which are part of the International Monitoring System, have recorded events from fireballs and regional volcanoes. The detection capability of this network for any specific volcanic event is not only dependent on the amplitude of the source, but also the propagation effects, noise level at each station, and characteristics of the regional persistent noise sources (like the microbarum). Combining the spatial and seasonal characteristics of this noise, within the same frequency band as significant eruptive events, with the probability of such events to occur, gives us a comprehensive understanding of detection capability for any of the 750 active or potentially active volcanoes in Southeast Asia.

  6. Residual Events during Use of CPAP: Prevalence, Predictors, and Detection Accuracy

    PubMed Central

    Reiter, Joel; Zleik, Bashar; Bazalakova, Mihaela; Mehta, Pankaj; Thomas, Robert Joseph

    2016-01-01

    Study Objectives: To assess the frequency, severity, and determinants of residual respiratory events during continuous positive airway therapy (CPAP) for obstructive sleep apnea (OSA) as determined by device output. Methods: Subjects were consecutive OSA patients at an American Academy of Sleep Medicine accredited multidisciplinary sleep center. Inclusion criteria included CPAP use for a minimum of 3 months, and a minimum nightly use of 4 hours. Compliance metrics and waveform data from 217 subjects were analyzed retrospectively. Events were scored manually when there was a clear reduction of amplitude (≥ 30%) or flow-limitation with 2–3 larger recovery breaths. Automatically detected versus manually scored events were subjected to statistical analyses included Bland-Altman plots, correlation coefficients, and logistic regression exploring predictors of residual events. Results: The mean patient age was 54.7 ± 14.2 years; 63% were males. All patients had a primary diagnosis of obstructive sleep apnea, 26% defined as complex sleep apnea. Residual flow measurement based apnea-hypopnea index (AHIFLOW) > 5, 10, and 15/h was seen in 32.3%, 9.7%, and 1.8% vs. 60.8%, 23%, and 7.8% of subjects based on automated vs. manual scoring of waveform data. Automatically detected versus manually scored average AHIFLOW was 4.4 ± 3.8 vs. 7.3 ± 5.1 per hour. In a logistic regression analysis, the only predictors for a manual AHIFLOW > 5/h were the absolute central apnea index (CAI), (odds ratio [OR]: 1.5, p: 0.01, CI: 1.1–2.0), or using a CAI threshold of 5/h of sleep (OR: 5.0, p: < 0.001, CI: 2.2–13.8). For AHIFLOW > 10/h, the OR was 1.14, p: 0.03 (CI: 1.1–1.3) per every CAI unit of 1/hour. Conclusions: Residual respiratory events are common during CPAP treatment, may be missed by automated device detection and predicted by a high central apnea index on the baseline diagnostic study. Direct visualization of flow data is generally available and improves detection

  7. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of a new small Bi-spectral InfraRed Detection (BIRD) satellite is detection and quantitative analysis of high-temperature events like fires and volcanoes. An absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and the radiative energy release. Examples are given of detection and analysis of wild and coal seam fires, of volcanic activity as well as of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on ground, had an area of 12m2 at daytime and 4m2 at night.

  8. Using Structured Telephone Follow-up Assessments to Improve Suicide-related Adverse Event Detection

    PubMed Central

    Arias, Sarah A.; Zhang, Zi; Hillerns, Carla; Sullivan, Ashley F.; Boudreaux, Edwin D.; Miller, Ivan; Camargo, Carlos A.

    2014-01-01

    Adverse event (AE) detection and reporting practices were compared during the first phase of the Emergency Department Safety Assessment and Follow-up Evaluation (ED-SAFE), a suicide intervention study. Data were collected using a combination of chart reviews and structured telephone follow-up assessments post-enrollment. Beyond chart reviews, structured telephone follow-up assessments identified 45% of the total AEs in our study. Notably, detection of suicide attempts significantly varied by approach with 53 (18%) detected by chart review, 173 (59%) by structured telephone follow-up assessments, and 69 (23%) marked as duplicates. Findings provide support for utilizing multiple methods for more robust AE detection in suicide research. PMID:24588679

  9. Model-Based Iterative Reconstruction Versus Adaptive Statistical Iterative Reconstruction and Filtered Back Projection in Liver 64-MDCT: Focal Lesion Detection, Lesion Conspicuity, and Image Noise

    PubMed Central

    Shuman, William P.; Green, Doug E.; Busey, Janet M.; Kolokythas, Orpheus; Mitsumori, Lee M.; Koprowicz, Kent M.; Thibault, Jean-Baptiste; Hsieh, Jiang; Alessio, Adam M.; Choi, Eunice; Kinahan, Paul E.

    2017-01-01

    OBJECTIVE The purpose of this study is to compare three CT image reconstruction algorithms for liver lesion detection and appearance, subjective lesion conspicuity, and measured noise. MATERIALS AND METHODS Thirty-six patients with known liver lesions were scanned with a routine clinical three-phase CT protocol using a weight-based noise index of 30 or 36. Image data from each phase were reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR). Randomized images were presented to two independent blinded reviewers to detect and categorize the appearance of lesions and to score lesion conspicuity. Lesion size, lesion density (in Hounsfield units), adjacent liver density (in Hounsfield units), and image noise were measured. Two different unblinded truth readers established the number, appearance, and location of lesions. RESULTS Fifty-one focal lesions were detected by truth readers. For blinded reviewers compared with truth readers, there was no difference for lesion detection among the reconstruction algorithms. Lesion appearance was statistically the same among the three reconstructions. Although one reviewer scored lesions as being more conspicuous with MBIR, the other scored them the same. There was significantly less background noise in air with MBIR (mean [± SD], 2.1 ± 1.4 HU) than with ASIR (8.9 ± 1.9 HU; p < 0.001) or FBP (10.6 ± 2.6 HU; p < 0.001). Mean lesion contrast-to-noise ratio was statistically significantly higher for MBIR (34.4 ± 29.1) than for ASIR (6.5 ± 4.9; p < 0.001) or FBP (6.3 ± 6.0; p < 0.001). CONCLUSION In routine-dose clinical CT of the liver, MBIR resulted in comparable lesion detection, lesion characterization, and subjective lesion conspicuity, but significantly lower background noise and higher contrast-to-noise ratio compared with ASIR or FBP. This finding suggests that further investigation of the use of MBIR to enable dose

  10. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use

    PubMed Central

    Moghaddam, Athena K.; Yuen, Hiu Kim; Archambault, Philippe S.; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user’s driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real-time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on incline plane, and rolling across multiple types obstacles). The data was processed using time-series analysis and data mining techniques, to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speed, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user’s PW driving behavior. PMID:27170879

  11. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993

  12. Nanoporous niobium oxide for label-free detection of DNA hybridization events.

    PubMed

    Choi, Jinsub; Lim, Jae Hoon; Rho, Sangchul; Jahng, Deokjin; Lee, Jaeyoung; Kim, Kyung Ja

    2008-01-15

    We found that DNA probes can be immobilized on anodically prepared porous niobium oxide without a chemical modification of both the DNA probes and the substrate. By using the porous niobium oxide with a positive surface charge, DNA hybridization events are detected on the basis of the blue-shift of a maximum absorption peak in UV-vis-NIR spectroscopy. The blue-shift is ascribed to the change of surface charge upon single- or double-stranded DNA. The method does not require a label and shows high sensitivity with the detection limit of the concentration of 1nM.

  13. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded higher possibility of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate.

  14. An integrated graphic data display improves detection and identification of critical events during anesthesia.

    PubMed

    Michels, P; Gravenstein, D; Westenskow, D R

    1997-07-01

    To show that an integrated graphic data display can shorten the time taken to detect and correctly identify critical events during anesthesia. We developed a graphic display which presents 30 anesthesia-related physiologic variables as shapes and colors, rather than traditional digits and waveforms. To evaluate the new display, we produced four critical events on a computer-based anesthesia simulator and asked two groups of five anesthesiologists to identify the events as quickly as possible. One group observed the new display while the other group viewed a traditional cardiovascular monitor with digital and waveform displays. The group which observed the integrated graphic display saw changes caused by inadequate paralysis 2.4 min sooner, and changes caused by a cuff leak 3.1 min sooner than those observing the traditional display. The integrated display group correctly identified the reason for the change 2.8 min sooner for inadequate paralysis, 3.1 min sooner for cuff leak and 3.1 min sooner for bleeding. These differences were all statistically significant. The results show that some simulated critical events are detected and correctly identified sooner, when an anesthesiologist views an integrated graphic display, rather than a traditional digital/waveform monitor.

  15. A Macro-Observation Scheme for Abnormal Event Detection in Daily-Life Video Sequences

    NASA Astrophysics Data System (ADS)

    Chiu, Wei-Yao; Tsai, Du-Ming

    2010-12-01

    We propose a macro-observation scheme for abnormal event detection in daily life. The proposed macro-observation representation records the time-space energy of motions of all moving objects in a scene without segmenting individual object parts. The energy history of each pixel in the scene is instantly updated with exponential weights without explicitly specifying the duration of each activity. Since possible activities in daily life are numerous and distinct from each other and not all abnormal events can be foreseen, images from a video sequence that spans sufficient repetition of normal day-to-day activities are first randomly sampled. A constrained clustering model is proposed to partition the sampled images into groups. The new observed event that has distinct distance from any of the cluster centroids is then classified as an anomaly. The proposed method has been evaluated in daily work of a laboratory and BEHAVE benchmark dataset. The experimental results reveal that it can well detect abnormal events such as burglary and fighting as long as they last for a sufficient duration of time. The proposed method can be used as a support system for the scene that requires full time monitoring personnel.

  16. A Method for Automated Detection of Usability Problems from Client User Interface Events

    PubMed Central

    Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.

    2005-01-01

    Think-aloud usability analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determining to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and manually computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121

  17. Visual change detection: event-related potentials are dependent on stimulus location in humans.

    PubMed

    Czigler, István; Balázs, László; Pató, Lívia G

    2004-07-08

    Infrequent colored patterns within sequences of patterns of frequent color elicited a posterior negative event-related potential component only in case of lower half-field stimulation. This negativity in the 140-200 ms latency range is considered as a correlate of automatic visual change detection (visual mismatch negativity, vMMN). Retinotopic prestriate visual areas are suggested to be the generating loci of vMMN.

  18. AGILE Detection of a Candidate Gamma-Ray Precursor to the ICECUBE-160731 Neutrino Event

    NASA Astrophysics Data System (ADS)

    Lucarelli, F.; Pittori, C.; Verrecchia, F.; Donnarumma, I.; Tavani, M.; Bulgarelli, A.; Giuliani, A.; Antonelli, L. A.; Caraveo, P.; Cattaneo, P. W.; Colafrancesco, S.; Longo, F.; Mereghetti, S.; Morselli, A.; Pacciani, L.; Piano, G.; Pellizzoni, A.; Pilia, M.; Rappoldi, A.; Trois, A.; Vercellone, S.

    2017-09-01

    On 2016 July 31 the ICECUBE collaboration reported the detection of a high-energy starting event induced by an astrophysical neutrino. Here, we report on a search for a gamma-ray counterpart to the ICECUBE-160731 event, made with the AGILE satellite. No detection was found spanning the time interval of ±1 ks around the neutrino event time T 0 using the AGILE “burst search” system. Looking for a possible gamma-ray precursor in the results of the AGILE-GRID automatic Quick Look procedure over predefined 48-hr time bins, we found an excess above 100 MeV between 1 and 2 days before T 0, which is positionally consistent with the ICECUBE error circle, that has a post-trial significance of about 4σ . A refined data analysis of this excess confirms, a posteriori, the automatic detection. The new AGILE transient source, named AGL J1418+0008, thus stands as a possible ICECUBE-160731 gamma-ray precursor. No other space missions nor ground observatories have reported any detection of transient emission consistent with the ICECUBE event. We show that Fermi-LAT had a low exposure for the ICECUBE region during the AGILE gamma-ray transient. Based on an extensive search for cataloged sources within the error regions of ICECUBE-160731 and AGL J1418+0008, we find a possible common counterpart showing some of the key features associated with the high-energy peaked BL Lac (HBL) class of blazars. Further investigations on the nature of this source using dedicated SWIFT ToO data are presented.

  19. Event Detection for Hydrothermal Plumes: A case study at Grotto Vent

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.

    2012-12-01

    Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique modified from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among the many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri-Net-based detection algorithm that identifies deviations from the specified response. Initially, we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional changes varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however, inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. Still, considerable complexity overlies the "normal" tidal response pattern (the data has a dominant frequency of
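The Petri-Net detector itself encodes expert-defined feature states and transitions; as a drastically simplified stand-in for one such transition, the "unusually large and rapid change in direction" rule reduces to a threshold on successive orientation samples. The threshold value below is a made-up example, not one from the study:

```python
def rapid_change_events(orientation_deg, max_step_deg=15.0):
    """Flag sample indices where plume orientation jumps by more than
    max_step_deg between consecutive samples. This is a stand-in for a
    single Petri-Net transition; the threshold is hypothetical."""
    return [i for i in range(1, len(orientation_deg))
            if abs(orientation_deg[i] - orientation_deg[i - 1]) > max_step_deg]

events = rapid_change_events([0.0, 2.0, 4.0, 30.0, 32.0])
```

In practice, samples adjacent to data gaps would need to be excluded, since the abstract notes that gaps and large temporal spacing can produce artifact detections.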

  20. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence.

    PubMed

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while, in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  1. Signal Detection of Imipenem Compared to Other Drugs from Korea Adverse Event Reporting System Database.

    PubMed

    Park, Kyounghoon; Soukavong, Mick; Kim, Jungmee; Kwon, Kyoung Eun; Jin, Xue Mei; Lee, Joongyub; Yang, Bo Ram; Park, Byung Joo

    2017-05-01

    To detect signals of adverse drug events after imipenem treatment using the Korea Institute of Drug Safety & Risk Management-Korea adverse event reporting system database (KIDS-KD). We performed data mining using KIDS-KD, which was constructed from adverse event (AE) reports spontaneously submitted between December 1988 and June 2014. We detected signals by calculating the proportional reporting ratio, reporting odds ratio, and information component of imipenem, defining a signal as any AE that satisfied all three indices. The signals were compared with the drug labels of nine countries. There were 807,582 spontaneous AE reports in the KIDS-KD. Among those, 192,510 AEs were related to antibiotics, and 3,382 reports were associated with imipenem. The most common imipenem-associated AE was drug eruption, reported 353 times. We calculated the signals by comparing imipenem against all other antibiotics and against all other drugs; 58 and 53 signals, respectively, satisfied all three indices. We compared the drug labelling information of nine countries (the USA, the UK, Japan, Italy, Switzerland, Germany, France, Canada, and South Korea) and discovered that the following signals were currently not included in drug labels: hypokalemia, cardiac arrest, cardiac failure, Parkinson's syndrome, myocardial infarction, and prostate enlargement. Hypokalemia was an additional signal relative to all other antibiotics; the other signals did not differ between the comparisons with all other antibiotics and with all other drugs. We detected new signals that were not listed on the drug labels of nine countries. However, further pharmacoepidemiologic research is needed to evaluate the causality of these signals.
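The three disproportionality indices named above have standard pharmacovigilance definitions; a minimal sketch follows. The report counts and screening thresholds here are illustrative assumptions, not values from the study:

```python
import math

def disproportionality(a, b, c, d):
    """Signal indices from a 2x2 contingency table of spontaneous reports:
    a = drug of interest with the AE,  b = drug without the AE,
    c = other drugs with the AE,       d = other drugs without the AE."""
    n = a + b + c + d
    prr = (a / (a + b)) / (c / (c + d))          # proportional reporting ratio
    ror = (a * d) / (b * c)                      # reporting odds ratio
    ic = math.log2(a * n / ((a + b) * (a + c)))  # basic information component
    return prr, ror, ic

# Illustrative counts only; a common screening convention is
# PRR >= 2, ROR > 1, IC > 0, with a "signal" satisfying all three.
prr, ror, ic = disproportionality(20, 80, 100, 9800)
```

A stricter variant uses the lower confidence bound of each index rather than the point estimate; the abstract does not specify which cutoffs were applied.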

  2. Early detection of cell activation events by means of attenuated total reflection Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Filfili, Chadi; Hilliard, Julia K.; Ward, John A.; Unil Perera, A. G.

    2014-06-01

    Activation of Jurkat T-cells in culture following treatment with anti-CD3 (Cluster of Differentiation 3) antibody is detectable by interrogating the treated T-cells using the Attenuated Total Reflection-Fourier Transform Infrared (ATR-FTIR) Spectroscopy technique. Cell activation was detected within 75 min after the cells encountered specific immunoglobulin molecules. Spectral markers noted following ligation of the CD3 receptor with anti-CD3 antibody provide proof-of-concept that ATR-FTIR spectroscopy is a sensitive measure of molecular events subsequent to cells interacting with anti-CD3 Immunoglobulin G. The resultant ligation of the CD3 receptor results in the initiation of well-defined, specific signaling pathways that parallel the measurable molecular events detected using ATR-FTIR. A paired t-test with post-hoc Bonferroni corrections for multiple comparisons resulted in the identification of statistically significant spectral markers (p < 0.02) at 1367 and 1358 cm-1. Together, these data demonstrate that early treatment-specific cellular events can be measured by ATR-FTIR and that this technique can be used to identify specific agents via the responses of the cell biosensor at different time points postexposure.

  3. Real-time gait event detection for normal subjects from lower trunk accelerations.

    PubMed

    González, Rafael C; López, Antonio M; Rodriguez-Uría, Javier; Alvarez, Diego; Alvarez, Juan C

    2010-03-01

    In this paper we report on a novel algorithm for the real-time detection and timing of initial (IC) and final contact (FC) gait events. We process the vertical and antero-posterior accelerations registered at the lower trunk (L3 vertebra). The algorithm is based on a set of heuristic rules extracted from a set of 1719 steps. An independent experiment was conducted to compare the results of our algorithms with those obtained from a Digimax force platform. The results show small deviations from times of occurrence of events recorded from the platform (13+/-35 ms for IC and 9+/-54 ms for FC). Results for the FC timing are especially relevant in this field, as no previous work has addressed its temporal location through the processing of lower trunk accelerations. The delay in the real-time detection of the IC is 117+/-39 ms and 34+/-72 ms for the FC, improving previously reported results for real-time detection of events from lower trunk accelerations.

  4. Adverse event detection (AED) system for continuously monitoring and evaluating structural health status

    NASA Astrophysics Data System (ADS)

    Yun, Jinsik; Ha, Dong Sam; Inman, Daniel J.; Owen, Robert B.

    2011-03-01

    Structural damage for spacecraft is mainly due to impacts such as collision of meteorites or space debris. We present a structural health monitoring (SHM) system for space applications, named Adverse Event Detection (AED), which integrates an acoustic sensor, an impedance-based SHM system, and a Lamb wave SHM system. With these three health-monitoring methods in place, we can determine the presence, location, and severity of damage. An acoustic sensor continuously monitors acoustic events, while the impedance-based and Lamb wave SHM systems are in sleep mode. If an acoustic sensor detects an impact, it activates the impedance-based SHM. The impedance-based system determines if the impact incurred damage. When damage is detected, it activates the Lamb wave SHM system to determine the severity and location of the damage. Further, since an acoustic sensor dissipates much less power than the two SHM systems and the two systems are activated only when there is an acoustic event, our system reduces overall power dissipation significantly. Our prototype system demonstrates the feasibility of the proposed concept.

  5. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    DOEpatents

    De Geronimo, Gianluigi; Bolotnikov, Aleksey E.; Carini, Gabriella

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, cathode electrode, collecting grid, and non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 Volts is applied to the cathode electrode, a voltage greater than that of the cathode is applied to the non-collecting grid, and a voltage greater than that of the non-collecting grid is applied to the collecting grid. The signals from the collecting grid and the non-collecting grid are summed and subtracted, creating a sum and a difference, respectively; the difference is divided by the sum, creating a ratio. A gain coefficient is determined for each depth (the distance between the ionizing event and the collecting grid). The depth-corrected energy of an ionizing event is then the difference between the collecting-grid and non-collecting-grid signals multiplied by the corresponding gain coefficient, and the depth of the event can be determined from the ratio.
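The arithmetic described above can be sketched directly. The linear gain model used below is a hypothetical stand-in for the per-depth calibration coefficients the patent describes; the amplitudes are arbitrary example values:

```python
def depth_corrected_energy(s_collect, s_noncollect, gain_of_ratio):
    """Depth-corrected event energy from co-planar grid signals.

    s_collect, s_noncollect: pulse amplitudes from the collecting and
    non-collecting grids. gain_of_ratio: calibration function mapping the
    depth-encoding ratio to a gain coefficient (device-specific)."""
    diff = s_collect - s_noncollect   # carries the event energy
    total = s_collect + s_noncollect
    ratio = diff / total              # encodes interaction depth
    return gain_of_ratio(ratio) * diff, ratio

# Toy linear calibration -- purely illustrative, not from the patent.
toy_gain = lambda r: 1.0 + 0.1 * r
energy, ratio = depth_corrected_energy(100.0, 20.0, toy_gain)
```

In a real device, `gain_of_ratio` would be built from measured gain coefficients at known depths, e.g. via interpolation over a calibration table.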

  6. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    NASA Astrophysics Data System (ADS)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations borne out by the automatic tests, which show an overall overlap improvement of 11% (meaning that the missed-events rate is cut by 42%), hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  7. Methods for detecting seismic events at known locations using NORESS (Norwegian Experimental Seismic System) data

    SciTech Connect

    Lee, D.O.; Stearns, S.D.; Wayland, J.R. Jr.

    1989-03-01

    The difficulty of detecting, locating, and identifying low-magnitude seismic events has been an ongoing problem. In this note, we describe processing methods that help us to find low-magnitude seismic events. A series of algorithms with beamforming has been developed and has proven effective in helping to discover very low yield events. The beamforming technique consists of determining the array element time delays for the specific source region using previously established events. This allows us to concentrate the array on the specified source location. Examples of this type of analysis are provided. In the analysis of seismic data one may be in possession of other information, e.g., newspaper reports of an earthquake. Using this information to concentrate the search for an event will often identify an otherwise overlooked signal. The algorithms for this type of search are incorporated into computer software that includes capabilities for plotting, spectral and signal-to-noise estimation, direction finding, and other functions. 2 refs., 25 figs.
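The beamforming technique described above is, at its core, delay-and-sum: undo each array element's source-specific arrival delay, then stack, so signals from the target region add coherently while noise averages down. A minimal sketch on synthetic impulse data (the delays and trace values are illustrative; real delays would be calibrated from previously established events, as the note describes):

```python
import numpy as np

def delay_and_sum(traces, delays):
    """Steer an array at a known source region: shift each element's trace
    back by its arrival delay (in samples), then average the stack."""
    beam = np.zeros(traces.shape[1])
    for trace, d in zip(traces, delays):
        beam += np.roll(trace, -d)   # circular shift; fine away from edges
    return beam / len(traces)

# Synthetic example: an impulse arrives at each element with its own delay.
delays = [0, 3, 5]
traces = np.zeros((3, 32))
for k, d in enumerate(delays):
    traces[k, 10 + d] = 1.0
beam = delay_and_sum(traces, delays)   # the impulses align at sample 10
```

Signals from other locations, whose delays do not match, stay misaligned after the shift and are suppressed by the averaging, which is what concentrates the array on the chosen source region.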

  8. Event-Triggered Fault Detection Filter Design for a Continuous-Time Networked Control System.

    PubMed

    Wang, Yu-Long; Shi, Peng; Lim, Cheng-Chew; Liu, Yuan

    2016-12-01

    This paper studies the problem of event-triggered fault detection filter (FDF) and controller coordinated design for a continuous-time networked control system (NCS) with biased sensor faults. By considering sensor-to-FDF network-induced delays and packet dropouts, which do not impose a constraint on the event-triggering mechanism, and by proposing an event-triggering mechanism based simultaneously on the network bandwidth utilization ratio and the fault occurrence probability, a new closed-loop model for the considered NCS is established. Based on the established model, the event-triggered H ∞ performance analysis and the FDF and controller coordinated design are presented. A combined mutually exclusive distribution and Wirtinger-based integral inequality approach is proposed for the first time to deal with integral inequalities for products of vectors; this approach is proved to be less conservative than the existing Wirtinger-based integral inequality approach. The designed FDF and controller can guarantee the sensitivity of the residual signal to faults and the robustness of the NCS to external disturbances. The simulation results verify the effectiveness of the proposed event-triggering mechanism and of the FDF and controller coordinated design.

  9. Detecting adverse events in surgery: comparing events detected by the Veterans Health Administration Surgical Quality Improvement Program and the Patient Safety Indicators.

    PubMed

    Mull, Hillary J; Borzecki, Ann M; Loveland, Susan; Hickson, Kathleen; Chen, Qi; MacDonald, Sally; Shin, Marlena H; Cevasco, Marisa; Itani, Kamal M F; Rosen, Amy K

    2014-04-01

    The Patient Safety Indicators (PSIs) use administrative data to screen for select adverse events (AEs). In this study, VA Surgical Quality Improvement Program (VASQIP) chart review data were used as the gold standard to measure the criterion validity of 5 surgical PSIs. Independent chart review was also used to determine reasons for PSI errors. The sensitivity, specificity, and positive predictive value of PSI software version 4.1a were calculated among Veterans Health Administration hospitalizations (2003-2007) reviewed by VASQIP (n = 268,771). Nurses re-reviewed a sample of hospitalizations for which PSI and VASQIP AE detection disagreed. Sensitivities ranged from 31% to 68%, specificities from 99.1% to 99.8%, and positive predictive values from 31% to 72%. Reviewers found that coding errors accounted for some PSI-VASQIP disagreement; some disagreement was also the result of differences in AE definitions. These results suggest that the PSIs have moderate criterion validity; however, some surgical PSIs detect different AEs than VASQIP. Future research should explore using both methods to evaluate surgical quality. Published by Elsevier Inc.
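The validity measures reported above come from a 2x2 comparison of PSI flags against the gold-standard chart review. A short sketch, using illustrative counts rather than the study's data:

```python
def criterion_validity(tp, fp, fn, tn):
    """Criterion validity of a screen (e.g. a PSI) against a gold standard
    (e.g. VASQIP chart review), from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # flagged AEs among true AEs
    specificity = tn / (tn + fp)   # unflagged among truly AE-free
    ppv = tp / (tp + fp)           # true AEs among all flagged
    return sensitivity, specificity, ppv

# Illustrative counts only:
sens, spec, ppv = criterion_validity(50, 10, 50, 9890)
```

The pattern in the abstract (high specificity, modest sensitivity and PPV) is typical when the screened event is rare: even a few false positives erode PPV, while the large true-negative pool keeps specificity near 1.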

  10. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    PubMed

    O'Shea, Jesse

    2017-05-01

    Internet access and usage have changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates, and circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet the innovations and functionality of these systems can change rapidly. The aim was to update the current state of knowledge on event-based Internet biosurveillance systems by identifying all systems and their current functionality, to aid decision makers in deciding whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, also including grey literature and other publication types. 50 event-based Internet systems, described in 99 articles, were identified, and 15 attributes were extracted for each system. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, cataloguing all event-based Internet biosurveillance systems. By doing so, future researchers will be able to use this review as a library for referencing systems, with hopes of learning, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an

  11. Syndromic Surveillance Based on Emergency Visits: A Reactive Tool for Unusual Events Detection

    PubMed Central

    Vilain, Pascal; Bourdé, Arnaud; Cassou, Pierre-Jean Marianne dit; Jacques-Antoine, Yves; Morbidelli, Philippe; Filleul, Laurent

    2013-01-01

    Objective To show with examples that a syndromic surveillance system can be a reactive tool for public health surveillance. Introduction Health events such as the 2003 heat wave showed the need for public health surveillance in France to evolve. Thus, the French Institute for Public Health Surveillance has developed syndromic surveillance systems based on several information sources, such as emergency departments (1). In Reunion Island, the chikungunya outbreak of 2005–2006 and then the influenza pandemic of 2009 contributed to the implementation and development of this surveillance system (2–3). In past years, this tool has made it possible to follow and measure the impact of seasonal epidemics. Nevertheless, its usefulness for the detection of minor unusual events had yet to be demonstrated. Methods In Reunion Island, the syndromic surveillance system is based on the activity of six emergency departments. Two types of indicators are constructed from the collected data: - Qualitative indicators for the alert (every visit whose diagnosis relates to a notifiable disease or potential epidemic disease); - Quantitative indicators for epidemic/cluster detection (number of visits based on syndromic groupings). Daily and weekly analyses are carried out. A decision algorithm is used to validate each signal and to organize an epidemiological investigation if necessary. Results Each year, about 150,000 visits are registered in the six emergency departments, i.e., 415 consultations per day on average. Several small-scale unusual health events were detected early. In August 2011, the surveillance system detected the first autochthonous cases of measles a few days before this notifiable disease was reported to health authorities (Figure 1). In January 2012, emergency department data made it possible to validate a signal of viral meningitis, to detect a cluster in the west of the island, and to follow its trend. In June 2012, a family foodborne illness

  12. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warning can be issued upon the detection of a seismic event which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal to noise ratio for the tsunami signal; which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework in which a tsunamigenic signal is detected via the scan of a network of hydrodynamic stations with water level sensing is performed. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the hydrodynamic stations sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal at the open ocean itself conserves its birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis. 
In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded

  13. Group localisation and unsupervised detection and classification of basic crowd behaviour events for surveillance applications

    NASA Astrophysics Data System (ADS)

    Roubtsova, Nadejda S.; de With, Peter H. N.

    2013-02-01

    Technology for monitoring crowd behaviour is in demand for surveillance and security applications. The trend in research is to tackle detection of complex crowd behaviour events (panic, fight, evacuation, etc.) directly using machine learning techniques. In this paper, we present a contrary, bottom-up approach seeking basic group information: (1) instantaneous location and (2) the merge, split and lateral slide-by events - the three basic motion patterns comprising any crowd behaviour. The focus on such generic group information makes our algorithm suitable as a building block in a variety of surveillance systems, possibly integrated with static content analysis solutions. Our feature extraction framework has optical flow at its core. The framework is universal, being motion-based rather than object-detection-based, and generates a large variety of motion-blob-characterising features useful for an array of classification problems. Motion-based characterisation is performed on a group as an atomic whole and not by means of superposition of individual human motions. Within that feature space, our classification system makes decisions based on heuristic rules and thresholds, without machine learning. Our system performs well on group localisation, consistently generating contours around both moving and halted groups. The visual output of our periodical group localisation is equivalent to tracking, and the group contour accuracy ranges from adequate to exceptionally good. The system successfully detects and classifies within our merge/split/slide-by event space in surveillance-type video sequences, differing in resolution, scale, quality and motion content. Quantitatively, its performance is characterised by a good recall: 83% on detection and 71% on combined detection and classification.

  14. BioSense: implementation of a National Early Event Detection and Situational Awareness System.

    PubMed

    Bradley, Colleen A; Rolka, H; Walker, D; Loonsk, J

    2005-08-26

    BioSense is a CDC initiative to support enhanced early detection, quantification, and localization of possible biologic terrorism attacks and other events of public health concern on a national level. The goals of the BioSense initiative are to advance early detection by providing the standards, infrastructure, and data acquisition for near real-time reporting, analytic evaluation and implementation, and early event detection support for state and local public health officials. BioSense collects and analyzes Department of Defense and Department of Veterans Affairs ambulatory clinical diagnoses and procedures and Laboratory Corporation of America laboratory-test orders. The application summarizes and presents analytical results and data visualizations by source, day, and syndrome for each ZIP code, state, and metropolitan area through maps, graphs, and tables. An initial proof of a concept evaluation project was conducted before the system was made available to state and local users in April 2004. User recruitment involved identifying and training BioSense administrators and users from state and local health departments. User support has been an essential component of the implementation and enhancement process. CDC initiated the BioIntelligence Center (BIC) in June 2004 to conduct internal monitoring of BioSense national data daily. BIC staff have supported state and local system monitoring, conducted data anomaly inquiries, and communicated with state and local public health officials. Substantial investments will be made in providing regional, state, and local data for early event detection and situational awareness, test beds for data and algorithm evaluation, detection algorithm development, and data management technologies, while maintaining the focus on state and local public health needs.

  15. Detecting paralinguistic events in audio stream using context in features and probabilistic decisions☆

    PubMed Central

    Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth

    2017-01-01

    Non-verbal communication involves encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g. prosody) or visual (e.g. gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g. laughters, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughters are associated with affective expressions while fillers (e.g. um, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal events detection system focusing on the detection of laughter and fillers. We extend our system presented during the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and, (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an Area Under the Receiver Operating Characteristics curve of 95.3% for detecting laughters and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications towards a better system design. PMID:28713197
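Incorporating local context at the raw-feature level, as in (i) above, is commonly done by stacking each frame's feature vector with those of its neighbours before classification. A generic sketch of that idea; the window width and toy feature matrix are assumptions, not the paper's settings:

```python
import numpy as np

def add_context(frames, width=2):
    """Concatenate each frame's features with those of its +/-width
    neighbours; edge frames are padded by repeating the boundary frame.

    frames: array of shape (n_frames, n_features).
    Returns shape (n_frames, n_features * (2 * width + 1))."""
    padded = np.pad(frames, ((width, width), (0, 0)), mode='edge')
    n = len(frames)
    return np.hstack([padded[i:i + n] for i in range(2 * width + 1)])

frames = np.arange(6.0).reshape(3, 2)   # 3 frames, 2 features each
ctx = add_context(frames, width=1)      # shape (3, 6)
```

Context at the decision level, option (ii), would instead operate on the per-frame output probabilities, e.g. by smoothing or applying heuristic rules across neighbouring frames as the abstract describes.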

  16. Detection of pulmonary embolism on computed tomography: improvement using a model-based iterative reconstruction algorithm compared with filtered back projection and iterative reconstruction algorithms.

    PubMed

    Kligerman, Seth; Lahiji, Kian; Weihe, Elizabeth; Lin, Cheng Tin; Terpenning, Silanath; Jeudy, Jean; Frazier, Annie; Pugatch, Robert; Galvin, Jeffrey R; Mittal, Deepika; Kothari, Kunal; White, Charles S

    2015-01-01

    The purpose of the study was to determine whether a model-based iterative reconstruction (MBIR) technique improves diagnostic confidence and detection of pulmonary embolism (PE) compared with hybrid iterative reconstruction (HIR) and filtered back projection (FBP) reconstructions in patients undergoing computed tomography pulmonary angiography. The study was approved by our institutional review board. Fifty patients underwent computed tomography pulmonary angiography at 100 kV using standard departmental protocols. Twenty-two of 50 patients had studies positive for PE. All 50 studies were reconstructed using FBP, HIR, and MBIR. After image randomization, 5 thoracic radiologists and 2 thoracic radiology fellows graded each study on a scale of 1 (very poor) to 5 (ideal) in 4 subjective categories: diagnostic confidence, noise, pulmonary artery enhancement, and plastic appearance. Readers assessed each study for the presence of PE. Parametric and nonparametric data were analyzed with repeated measures and Friedman analysis of variance, respectively. For the 154 positive studies (7 readers × 22 positive studies), pooled sensitivity for detection of PE was 76% (117/154), 78.6% (121/154), and 82.5% (127/154) using FBP, HIR, and MBIR, respectively. PE detection was significantly higher using MBIR compared with FBP (P = 0.016) and HIR (P = 0.046). Because of a nonsignificant increase in false-positive studies using HIR and MBIR, accuracy with MBIR (88.6%), HIR (87.1%), and FBP (87.7%) was similar. Compared with FBP, MBIR led to a significant subjective increase in diagnostic confidence, noise, and enhancement in 6/7, 6/7, and 7/7 readers, respectively. Compared with HIR, MBIR led to a significant subjective increase in diagnostic confidence, noise, and enhancement in 5/7, 5/7, and 7/7 readers, respectively. MBIR led to a subjective increase in plastic appearance in all 7 readers compared with both FBP and HIR. MBIR led to significant increase in PE detection compared with FBP and HIR

  17. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  18. Data mining in the US Vaccine Adverse Event Reporting System (VAERS): early detection of intussusception and other events after rotavirus vaccination.

    PubMed

    Niu, M T; Erwin, D E; Braun, M M

    2001-09-14

    The Vaccine Adverse Event Reporting System (VAERS) is the US passive surveillance system monitoring vaccine safety. A major limitation of VAERS is the lack of denominator data (number of doses of administered vaccine), an element necessary for calculating reporting rates. Empirical Bayesian data mining, a data analysis method, utilizes the number of events reported for each vaccine and statistically screens the database for higher-than-expected vaccine-event combinations, signaling a potential vaccine-associated event. This is the first study of data mining in VAERS designed to test the utility of this method to retrospectively detect a known side effect of vaccination: intussusception following rotavirus (RV) vaccine. From October 1998 to December 1999, 112 cases of intussusception were reported. The data mining method was able to detect a signal for RV-intussusception in February 1999, when only four cases had been reported. These results demonstrate the utility of data mining to detect significant vaccine-associated events at an early date. Data mining appears to be an efficient and effective computer-based program that may enhance early detection of adverse events in passive surveillance systems.

  19. Detecting continuity violations in infancy: a new account and new evidence from covering and tube events

    PubMed Central

    Wang, Su-hua; Baillargeon, Renée; Paterson, Sarah

    2012-01-01

    Recent research on infants’ responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age (e.g. Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants’ reasoning about when objects should and should not be occluded. Cognitive Psychology 39, 116–157; Hespos, S. J., & Baillargeon, R. (2001). Knowledge about containment events in very young infants. Cognition 78, 207–245; Luo, Y., & Baillargeon, R. (in press). When the ordinary seems unexpected: Evidence for rule-based reasoning in young infants. Cognition; Wilcox, T., Nadel, L., & Rosser, R. (1996). Location memory in healthy preterm and full-term infants. Infant Behavior & Development 19, 309–323), others are not detected until much later (e.g. Baillargeon, R., & DeVos, J. (1991). Object permanence in young infants: Further evidence. Child Development 62, 1227–1246; Hespos, S. J., & Baillargeon, R. (2001). Infants’ knowledge about occlusion and containment events: A surprising discrepancy. Psychological Science 12, 140–147; Luo, Y., & Baillargeon, R. (2004). Infants’ reasoning about events involving transparent occluders and containers. Manuscript in preparation; Wilcox, T. (1999). Object individuation: Infants’ use of shape, size, pattern, and color. Cognition 72, 125–166). The present research focused on events involving covers or tubes, and brought to light additional examples of early and late successes in infants’ ability to detect continuity violations. In Experiment 1, 2.5- to 3-month-old infants were surprised (1) when a cover was lowered over an object, slid to the right, and lifted to reveal no object; and (2) when a cover was lowered over an object, slid behind the left half of a screen, lifted above the screen, moved to the right, lowered behind the right half of the screen, slid past the screen, and finally lifted to reveal the object. In Experiments 2 and 3, 9- and 11-month-old infants were not

  20. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events using the Global Land Data Assimilation System (GLDAS) data set.
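
    The SPRT named above decides sequentially between a "normal" and an "anomalous" hypothesis as observations arrive, stopping as soon as the accumulated evidence crosses a threshold. A minimal sketch for a Gaussian mean-shift test; the distributions, thresholds, and parameter names are illustrative assumptions, not details of the project:

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on Gaussian samples:
    H0 (normal, mean mu0) versus H1 (anomaly, mean mu1)."""
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for i, x in enumerate(samples):
        # log-likelihood ratio increment for one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "anomaly", i
        if llr <= lower:
            return "normal", i
    return "undecided", len(samples) - 1
```

    Because the test stops at the first threshold crossing, a strong anomaly can be declared after very few time steps, which is what makes the method attractive for scanning long hydrological time series.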

  1. A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging

    PubMed Central

    Boulanger, Jérôme; Gidon, Alexandre; Kervran, Charles; Salamero, Jean

    2010-01-01

    Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy become key issues to decipher the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches expected to distinguish sudden appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performances of two statistical control procedures and compare the proposed approach to a frame difference approach using the same controls on a benchmark of synthetic image sequences. We have then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling trans-membrane protein, the Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentration arising in image sequences from a data base provided by two different microscopy modalities, wide field (WF) video microscopy using maximum intensity projection along the axial direction and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected on the pilot biological model. PMID:20976222

  2. Seismic Event Identification Using Scanning Detection: A Comparison of Denoising and Classification Methods

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; MacCarthy, J. K.; Giudicepietro, F.

    2005-12-01

    Automatic detection and classification methods are increasingly important in observatory operations, as the volume and rate of incoming data exceed the capacity of human analysis staff to process the data in near-real-time. We explore the success of scanning detection for similar event identification in a variety of seismic waveform catalogs. Several waveform pre-processing methods are applied to previously recorded events, which are scanned through triggered and continuous waveform catalogs to determine the success and false alarm rates for detections of repeating signals. Pre-processing approaches include adaptive cross-coherency filtering, adaptive auto-associative neural network filtering, discrete wavelet packet decomposition, and linear predictive coding, as well as suites of standard bandpass filters. Classification/detection methods for the various pre-processed signals are applied to investigate the robustness of the individual and combined approaches. The classifiers as applied to the processed waveforms include dendrogram-based clustering and neural network classifiers. We will present findings for the various combinations of methods as applied to tectonic earthquakes, mine blasts and volcanic seismicity.

  3. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs were approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, has been introduced by Google to support large-scale data intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
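
    The PRR itself is a disproportionality statistic over a 2×2 table of report counts, and counting drug-event pairs decomposes naturally into a map step (emit one key per co-occurrence) and a reduce step (count keys). A toy in-memory sketch of that decomposition, not the authors' distributed implementation; the report structure is an assumption:

```python
from collections import Counter

def map_phase(reports):
    """Map step: emit one (drug, event) key per co-occurrence in a report."""
    for report in reports:
        for drug in report["drugs"]:
            for event in report["events"]:
                yield (drug, event)

def reduce_phase(pairs):
    """Reduce step: count occurrences of each (drug, event) key."""
    return Counter(pairs)

def prr(a, b, c, d):
    """Proportional Reporting Ratio from a 2x2 table:
    a: reports with drug and event; b: drug, other events;
    c: other drugs, event; d: other drugs, other events."""
    return (a / (a + b)) / (c / (c + d))
```

    In a real MapReduce job, the map and reduce phases would run on separate workers over report shards; the PRR formula is then applied to the aggregated counts.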

  4. The temporal reliability of sound modulates visual detection: an event-related potential study.

    PubMed

    Li, Qi; Wu, Yan; Yang, Jingjing; Wu, Jinglong; Touge, Tetsuo

    2015-01-01

    Utilizing the high temporal resolution of event-related potentials (ERPs), we examined the effects of temporal reliability of sounds on visual detection. Significantly faster reaction times to visual target stimuli were observed when reliable temporal information was provided by a task-irrelevant auditory stimulus. Three main ERP components related to the effects of auditory temporal reliability were found: the first at 180-240 ms over a wide central area, the second at 300-400 ms over an anterior area, and the third at 300-380 ms over bilateral temporal areas. Our results support the hypothesis that temporal reliability affects visual detection and indicate that auditory facilitation of visual detection is partly due to spread of attention and thus results from implicit temporal linking of auditory and visual information at a relatively late processing stage.

  5. Bio-inspired WSN architecture: event detection and localization in a fault-tolerant WSN

    NASA Astrophysics Data System (ADS)

    Alayev, Yosef; Damarla, Thyagaraju

    2009-05-01

    One can think of the human body as a sensory network. In particular, skin has several neurons that provide the sense of touch with different sensitivities, and neurons for communicating the sensory signals to the brain. Even though skin might occasionally experience some lacerations, it performs remarkably well (fault tolerant) with the failure of some sensors. One of the challenges in collaborative wireless sensor networks (WSN) is fault-tolerant detection and localization of targets. In this paper we present a biologically inspired architecture model for WSN. Diagnosis of sensors in the WSN model presented here is derived from the concept of the immune system. We present an architecture for WSN for detection and localization of multiple targets inspired by the human nervous system. We show that the advantages of such bio-inspired networks are reduced data for communication, real-time self-diagnosis of faulty sensors, and the ability to localize events. We present the results of our algorithms on simulation data.

  6. Fault detection and isolation in manufacturing systems with an identified discrete event model

    NASA Astrophysics Data System (ADS)

    Roth, Matthias; Schneider, Stefan; Lesage, Jean-Jacques; Litz, Lothar

    2012-10-01

    In this article a generic method for fault detection and isolation (FDI) in manufacturing systems considered as discrete event systems (DES) is presented. The method uses an identified model of the closed loop of plant and controller, built on the basis of observed fault-free system behaviour. An identification algorithm known from the literature is used to determine the fault detection model in the form of a non-deterministic automaton. New results on how to parameterise this algorithm are reported. To assess the fault detection capability of an identified automaton, probabilistic measures are proposed. For fault isolation, the concept of residuals adapted for DES is used by defining appropriate set operations representing generic fault symptoms. The method is applied to a case study system.
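
    The core idea, learning the fault-free closed-loop behaviour and then flagging any observed transition that falls outside it, can be sketched with an order-1 event model. The paper's identification algorithm and residual definitions are more elaborate; this is only an illustrative reduction:

```python
def identify(sequences):
    """Learn a non-deterministic transition relation from fault-free
    event sequences; the state here is simply the previous event."""
    allowed = {}
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            allowed.setdefault(prev, set()).add(nxt)
    return allowed

def detect(allowed, observed):
    """Return the index of the first event whose transition was never
    seen during identification (a fault symptom), or None if fault-free."""
    for i, (prev, nxt) in enumerate(zip(observed, observed[1:])):
        if nxt not in allowed.get(prev, set()):
            return i + 1
    return None
```

    Fault isolation would then compare which expected events are missing and which unexpected events occurred (the residual sets), rather than stopping at the first mismatch.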

  7. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    newborn infants [5] as well as the monitoring of fatigue in prolonged driving simulations [6]. In many of these settings, the experiments may last...Automatic burst detection for the EEG of the preterm infant . Physiol Meas 32: 1623. doi:10.1088/0967–3334/32/10/010. 6. Chin-Teng Lin, Che-Jui Chang, Bor

  8. Event Detection Using Mobile Phone Mass GPS Data and Their Reliability Verification by DMSP/OLS Night Light Images

    NASA Astrophysics Data System (ADS)

    Akiyama, Yuki; Ueyama, Satoshi; Shibasaki, Ryosuke; Adachi, Ryuichiro

    2016-06-01

    In this study, we developed a method to detect sudden population concentration on a certain day and area, that is, an "Event," all over Japan in 2012 using mass GPS data provided from mobile phone users. First, stay locations of all phone users were detected using existing methods. Second, areas and days where Events occurred were detected by aggregation of mass stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influences of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development of urban planning, disaster prevention, and transportation management.
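
    The aggregation step can be sketched as snapping stay locations to grid cells and flagging day/cell counts that stand far above that cell's typical level. The grid size, z-score rule, and threshold below are illustrative assumptions, not the paper's exact criteria:

```python
from collections import Counter
from statistics import mean, pstdev

def grid_cell(lat, lon, cell_deg=0.01):
    """Snap a stay location to a grid cell (~1 km at mid-latitudes)."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def detect_events(daily_stays, z=3.0):
    """daily_stays: {day: [(lat, lon), ...]}. Flag (day, cell) pairs whose
    visitor count exceeds the cell's mean by z standard deviations."""
    counts = {day: Counter(grid_cell(*p) for p in pts)
              for day, pts in daily_stays.items()}
    cells = {c for day_counts in counts.values() for c in day_counts}
    events = []
    for cell in cells:
        series = [counts[day].get(cell, 0) for day in counts]
        mu, sd = mean(series), pstdev(series)
        for day in counts:
            if sd > 0 and counts[day].get(cell, 0) > mu + z * sd:
                events.append((day, cell))
    return events
```

    Removing recurring Events, as the paper does, would correspond to dropping cells that exceed the threshold on many days of the year.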

  9. [Detection of adverse events in hospitalized adult patients by using the Global Trigger Tool method].

    PubMed

    Guzmán-Ruiz, O; Ruiz-López, P; Gómez-Cámara, A; Ramírez-Martín, M

    2015-01-01

    To identify and characterize adverse events (AE) in an Internal Medicine Department of a district hospital using an extension of the Global Trigger Tool (GTT), analyzing the diagnostic validity of the tool. An observational, analytical, descriptive and retrospective study was conducted on clinical charts from 2013 in an Internal Medicine Department in order to detect AE through the identification of 'triggers' (an event often related to an AE). The 'triggers' and AE were located by systematic review of the clinical documentation. The AE were characterized after they were identified. A total of 149 AE were detected in 291 clinical charts during 2013, of which 75.3% were detected directly by the tool, while the rest were not associated with a trigger. The percentage of charts that had at least one AE was 35.4%. The most frequent AE found was pressure ulcer (12%), followed by delirium, constipation, nosocomial respiratory infection and altered level of consciousness due to drugs. Almost half (47.6%) of the AE were related to drug use, and 32.2% of all AE were considered preventable. The tool demonstrated a sensitivity of 91.3% (95%CI: 88.9-93.2) and a specificity of 32.5% (95%CI: 29.9-35.1). It had a positive predictive value of 42.5% (95%CI: 40.1-45.1) and a negative predictive value of 87.1% (95%CI: 83.8-89.9). The tool used in this study is valid, useful and reproducible for the detection of AE. It also serves to determine rates of injury and to observe their progression over time. A high frequency of both AE and preventable events was observed in this study. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.

  10. An evaluation of generalized likelihood Ratio Outlier Detection to identification of seismic events in Western China

    SciTech Connect

    Taylor, S.R.; Hartse, H.E.

    1996-09-24

    The Generalized Likelihood Ratio Outlier Detection Technique for seismic event identification is evaluated using synthetic test data and frequency-dependent Pg/Lg measurements from western China. For most seismic stations that are to be part of the proposed International Monitoring System for the Comprehensive Test Ban Treaty, there will be few or no nuclear explosions in the magnitude range of interest (e.g. mb < 4) on which to base an event-identification system using traditional classification techniques. Outlier detection is a reasonable alternative approach to the seismic discrimination problem when no calibration explosions are available. Distance-corrected Pg/Lg data in seven different frequency bands ranging from 0.5 to 8 Hz from the Chinese Digital Seismic Station WMQ are used to evaluate the technique. The data are collected from 157 known earthquakes, 215 unknown events (presumed earthquakes and possibly some industrial explosions), and 18 known nuclear explosions (1 from the Chinese Lop Nor test site and 17 from the East Kazakh test site). A feature selection technique is used to find the best combination of discriminants to use for outlier detection. Good discrimination performance is found by combining a low-frequency (0.5 to 1 Hz) Pg/Lg ratio with high-frequency ratios (e.g. 2 to 4 and 4 to 8 Hz). Although the low-frequency ratio does not discriminate between earthquakes and nuclear explosions well by itself, it can be effectively combined with the high-frequency discriminants. Based on the tests with real and synthetic data, the outlier detection technique appears to be an effective approach to seismic monitoring in uncalibrated regions.

  11. Exupery volcano fast response system - The event detection and waveform classification system

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    Volcanic eruptions are often preceded by seismic activity which can be used to quantify the volcanic activity, since the number and the size of certain types of seismic events usually increase before periods of volcanic crisis. The implementation of an automatic detection and classification system for seismic signals of volcanic origin allows not only for the processing of large amounts of data in short time, but also provides consistent and time-invariant results. Here, we have developed a system based upon a combination of different methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely used in active volcano monitoring observatories worldwide. Among those software modules are classical trigger algorithms like STA/LTA and cross-correlation master-event matching, which is also used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters, which are also used to quantify the volcanic activity. Most automatic classification systems need a sufficiently large pre-classified data set for training the system. However, in the case of a volcanic crisis we are often confronted with a lack of training data due to insufficient prior observations: data acquisition might have been carried out with different equipment at a small number of sites, and in an imminent crisis there might be no time for the time-consuming and tedious process of preparing a training data set. For this reason we have developed a novel seismic event spotting technique in order to be less dependent on the existence of previously acquired data bases of event classes. One main goal is therefore to provide observatory staff with a robust event classification based on a minimum number of reference waveforms. By using a "learning-while-recording" approach we are allowing for the fast build-up of a
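
    The STA/LTA trigger named above compares a short-term average of signal amplitude to a long-term average and declares a detection when the ratio jumps. A minimal sketch; window lengths and threshold are typical illustrative values, not Earthworm's configuration:

```python
def sta_lta(trace, sta_len, lta_len):
    """Short-term / long-term average ratio of |amplitude|, computed at
    each sample once the long-term window is full (causal windows)."""
    env = [abs(x) for x in trace]
    ratios = []
    for i in range(lta_len, len(env)):
        sta = sum(env[i - sta_len:i]) / sta_len
        lta = sum(env[i - lta_len:i]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def trigger(trace, sta_len=5, lta_len=50, threshold=4.0):
    """Return the first sample index whose STA/LTA exceeds the threshold."""
    for k, r in enumerate(sta_lta(trace, sta_len, lta_len)):
        if r >= threshold:
            return k + lta_len
    return None
```

    A production implementation would recompute the averages recursively instead of summing each window, and would add a de-trigger threshold to mark the end of the event.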

  12. Gait event detection for use in FES rehabilitation by radial and tangential foot accelerations.

    PubMed

    Rueterbories, Jan; Spaich, Erika G; Andersen, Ole K

    2014-04-01

    Gait rehabilitation by Functional Electrical Stimulation (FES) requires a reliable trigger signal to start the stimulations. This could be obtained by a simple switch under the heel or by means of an inertial sensor system. This study provides an algorithm to detect gait events in differential acceleration signals of the foot. The key feature of differential measurements is that they compensate for the impact of gravity. The real-time detection capability of a rule-based algorithm in healthy and hemiparetic individuals was investigated. Detection accuracy and precision compared to signals from foot switches were evaluated. The algorithm detected curve features of the vectorial sum of radial and tangential accelerations and mapped those to discrete gait states. The results showed detection rates for healthy and hemiparetic gait ranging from 84.2% to 108.5%. The sensitivity was between 0.81 and 1, and the specificity between 0.85 and 1, depending on gait phase and group of subjects. The algorithm detected gait phase changes earlier than the reference. Differential acceleration signals combined with the proposed algorithm have the potential to be implemented in a future FES system.
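
    The mapping from the vectorial sum of the two acceleration channels to discrete gait states can be sketched as a small state machine with hysteresis. The thresholds and two-state reduction below are illustrative assumptions, not the calibrated rules of the paper:

```python
import math

def gait_phases(a_radial, a_tangential, swing_thr=2.0, stance_thr=0.5):
    """Classify each sample as 'stance' or 'swing' from the magnitude of
    the differential foot acceleration, with two-threshold hysteresis."""
    state, states = "stance", []
    for ar, at in zip(a_radial, a_tangential):
        mag = math.hypot(ar, at)  # vectorial sum of the two channels
        if state == "stance" and mag > swing_thr:
            state = "swing"       # strong acceleration: foot lifts off
        elif state == "swing" and mag < stance_thr:
            state = "stance"      # quiet signal: foot back on the ground
        states.append(state)
    return states
```

    The hysteresis gap between the two thresholds prevents the state from chattering when the magnitude hovers near a single cut-off; a state change would serve as the FES trigger.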

  13. Clinical outcome of subchromosomal events detected by whole‐genome noninvasive prenatal testing

    PubMed Central

    Helgeson, J.; Wardrop, J.; Boomer, T.; Almasri, E.; Paxton, W. B.; Saldivar, J. S.; Dharajiya, N.; Monroe, T. J.; Farkas, D. H.; Grosu, D. S.

    2015-01-01

    Abstract Objective A novel algorithm to identify fetal microdeletion events in maternal plasma has been developed and used in clinical laboratory‐based noninvasive prenatal testing. We used this approach to identify the subchromosomal events 5pdel, 22q11del, 15qdel, 1p36del, 4pdel, 11qdel, and 8qdel in routine testing. We describe the clinical outcomes of those samples identified with these subchromosomal events. Methods Blood samples from high‐risk pregnant women submitted for noninvasive prenatal testing were analyzed using low coverage whole genome massively parallel sequencing. Sequencing data were analyzed using a novel algorithm to detect trisomies and microdeletions. Results In testing 175 393 samples, 55 subchromosomal deletions were reported. The overall positive predictive value for each subchromosomal aberration ranged from 60% to 100% for cases with diagnostic and clinical follow‐up information. The total false positive rate was 0.0017% for confirmed false positive results; the false negative rate and sensitivity were not conclusively determined. Conclusion Noninvasive testing can be expanded into the detection of subchromosomal copy number variations, while maintaining overall high test specificity. In the current setting, our results demonstrate high positive predictive values for testing of rare subchromosomal deletions. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons Ltd. PMID:26088833

  14. Piezoelectric energy-harvesting power source and event detection sensors for gun-fired munitions

    NASA Astrophysics Data System (ADS)

    Rastegar, Jahangir; Feng, Dake; Pereira, Carlos M.

    2015-05-01

    This paper presents a review of piezoelectric-based energy harvesting devices and their charge collection electronics for use in the very harsh environment of gun-fired munitions. A number of novel classes of such energy harvesting power sources have been developed for gun-fired munitions and similar applications, including those with integrated safety and firing setback event detection electronics and logic circuitry. The power sources are designed to harvest energy from the firing acceleration and from vibratory motions during the flight. As an example, we present the application of the developed piezoelectric-based energy harvesting devices with event detection circuitry to the development of self-powered initiators with full no-fire safety circuitry for protection against accidental drops, transportation vibration, and other similar low-amplitude accelerations and/or high-amplitude but short-duration acceleration events. The design allows the use of a very small piezoelectric element, thereby allowing such devices to be highly miniaturized. These devices can be readily hardened to withstand very high-G firing setback accelerations in excess of 100,000 G and the harsh firing environment. The design of prototypes and testing under realistic conditions are presented.

  15. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    NASA Astrophysics Data System (ADS)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

    Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure that the moment tensor solutions are similar to those of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
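
    A match filter slides a template event over the continuous record and declares a detection wherever the normalized cross-correlation exceeds a threshold. A minimal single-channel sketch; the study works with multi-station waveform stacks, and the threshold and toy signals here are illustrative:

```python
import math

def norm_xcorr(template, trace):
    """Normalized cross-correlation of a template against a longer trace,
    one coefficient per window position (demeaned, unit-normalized)."""
    n = len(template)
    tm = sum(template) / n
    t0 = [t - tm for t in template]
    tn = math.sqrt(sum(v * v for v in t0))
    out = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        wm = sum(w) / n
        w0 = [x - wm for x in w]
        wn = math.sqrt(sum(v * v for v in w0))
        out.append(sum(a * b for a, b in zip(t0, w0)) / (tn * wn)
                   if tn > 0 and wn > 0 else 0.0)
    return out

def match_filter(template, trace, threshold=0.8):
    """Indices where the template correlates above the detection threshold."""
    cc = norm_xcorr(template, trace)
    return [i for i, c in enumerate(cc) if c >= threshold]
```

    Because the correlation is amplitude-normalized, a detection at a given lag indicates a waveform of the same shape as the template regardless of its size, which is why repeated detections suggest the same asperity rupturing more than once.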

  16. Detecting regular sound changes in linguistics as events of concerted evolution

    SciTech Connect

    Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  18. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROVs) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplant the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution and often destroy abundant gelatinous animals, resulting in species undersampling. Camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50 m to 4000 m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames, i.e., frames that do not contain any "interesting" events, is developed by detecting whether an interesting candidate object for an animal is present in a given sequence of underwater video. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are
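The tracking step described above can be sketched with a textbook constant-velocity linear Kalman filter; the matrices, noise levels, and synthetic measurements below are illustrative assumptions, not the MBARI system's actual parameters:

```python
import numpy as np

dt = 1.0                                   # one frame per step
F = np.array([[1, 0, dt, 0],               # state transition: position + velocity*dt
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only position (x, y) is observed
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 0.01                       # process noise covariance (assumed)
R = np.eye(2) * 1.0                        # measurement noise covariance (assumed)

x = np.zeros(4)                            # state: [x, y, vx, vy]
P = np.eye(4) * 10.0                       # initial state covariance

def kalman_step(x, P, z):
    """One predict/update cycle for a measured candidate location z = (x, y)."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    innov = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ innov
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Track a candidate drifting one pixel per frame in x at constant y.
for t in range(20):
    z = np.array([float(t), 5.0])
    x, P = kalman_step(x, P, z)
```

After 20 frames the state estimate has locked onto the target's position and its one-pixel-per-frame velocity.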

  19. A detailed view on Model-Based Multifactor Dimensionality Reduction for detecting gene-gene interactions in case-control data in the absence and presence of noise

    PubMed Central

    CATTAERT, TOM; CALLE, M. LUZ; DUDEK, SCOTT M.; MAHACHIE JOHN, JESTINAH M.; VAN LISHOUT, FRANÇOIS; URREA, VICTOR; RITCHIE, MARYLYN D.; VAN STEEN, KRISTEL

    2010-01-01

    Analyzing the combined effects of genes and/or environmental factors on the development of complex diseases is a great challenge from both the statistical and computational perspective, even using a relatively small number of genetic and non-genetic exposures. Several data mining methods have been proposed for interaction analysis, among them the Multifactor Dimensionality Reduction (MDR) method, which has proven its utility in a variety of theoretical and practical settings. Model-Based Multifactor Dimensionality Reduction (MB-MDR), a relatively new MDR-based technique that is able to unify the best of both the non-parametric and parametric worlds, was developed to address some of the remaining concerns that go along with an MDR analysis. These include the restriction to univariate, dichotomous traits, the absence of flexible ways to adjust for lower-order effects and important confounders, and the difficulty of highlighting epistasis effects when too many multi-locus genotype cells are pooled into two new genotype groups. Whereas the true value of MB-MDR can only reveal itself through extensive applications of the method in a variety of real-life scenarios, here we investigate the empirical power of MB-MDR to detect gene-gene interactions in the absence of any noise and in the presence of genotyping error, missing data, phenocopy, and genetic heterogeneity. For the considered simulation settings, we show that the power is generally higher for MB-MDR than for MDR, in particular in the presence of genetic heterogeneity, phenocopy, or low minor allele frequencies. PMID:21158747

  20. Detecting consciousness in a total locked-in syndrome: an active event-related paradigm.

    PubMed

    Schnakers, Caroline; Perrin, Fabien; Schabus, Manuel; Hustinx, Roland; Majerus, Steve; Moonen, Gustave; Boly, Melanie; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurelie; Laureys, Steven

    2009-08-01

    Total locked-in syndrome is characterized by tetraplegia, anarthria and paralysis of eye motility. In this study, consciousness was detected, using an active event-related paradigm, in a 21-year-old woman who presented with total locked-in syndrome after a basilar artery thrombosis (49 days post-injury). The patient was presented with sequences of names containing her own name and other names, and was instructed to count her own name or to count another target name. As in 4 age- and gender-matched healthy controls, the P3 response recorded for the voluntarily counted own name was larger than that recorded during passive listening. This P3 response was observed 14 days before the first behavioral signs of consciousness. This study shows that our active event-related paradigm allowed us to identify voluntary brain activity in a patient who would behaviorally be diagnosed as comatose.

  1. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data for the hydrology and other point-time-series-oriented communities (i.e., accessing archived time-step array data as point-time series), "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the data as archived, rearranged into spatio-temporal matrices that allow easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case of our project, we are leveraging existing software to explore the application of the data-cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events using data from the Tropical Rainfall Measuring Mission (TRMM).
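SVM-based anomaly classification on a point-time series can be sketched with a one-class SVM over sliding windows of a synthetic precipitation series; all values, window sizes, and parameters below are hypothetical stand-ins, not TRMM data or the project's actual configuration:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)

# Synthetic daily precipitation "data rod": gamma-distributed values with
# three injected extreme events (hypothetical, not TRMM data).
rain = rng.gamma(shape=2.0, scale=2.0, size=365)
rain[[50, 180, 300]] = [60.0, 75.0, 90.0]

# Feature vectors: short sliding windows, so the model sees local context.
win = 5
X = np.lib.stride_tricks.sliding_window_view(rain, win)

# One-class SVM learns the envelope of "normal" windows; windows outside
# it are labeled -1 and score low on the decision function.
clf = OneClassSVM(nu=0.02, kernel="rbf", gamma="scale").fit(X)
labels = clf.predict(X)                 # -1 = anomalous window
scores = clf.decision_function(X)       # lower = more anomalous
anomaly_idx = np.where(labels == -1)[0]
```

Windows overlapping the injected spikes score far below typical windows, which is the behavior an extreme-event detector exploits.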

  2. Graph clustering for weapon discharge event detection and tracking in infrared imagery using deep features

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, Sreyasee Das; Talukder, Ashit

    2017-05-01

    This paper addresses the problem of detecting and tracking weapon discharge events in an infrared imagery collection. While most of the prior work in related domains exploits the vast amount of complementary information available from both visible-band (EO) and infrared (IR) images (or video sequences), we handle the problem of recognizing human pose and activity detection exclusively in thermal (IR) images or videos. The task is primarily two-fold: 1) locating the individual in the scene from IR imagery, and 2) identifying the correct pose of the human individual (i.e. presence or absence of weapon discharge activity or intent). An efficient graph-based shortlisting strategy for identifying candidate regions of interest in the IR image utilizes both image saliency and mutual similarities from the initial list of the top-scored proposals of a given query frame, which ensures improved performance for both detection and recognition simultaneously and reduces false alarms. The proposed search strategy offers an efficient feature extraction scheme that can capture the maximum amount of object structural information by defining a region-based deep shape descriptor representing each object of interest present in the scene. Therefore, our solution is capable of handling the fundamental incompleteness of IR imagery, for which conventional deep features optimized on the natural color images in ImageNet are not quite suitable. Our preliminary experiments on the OSU weapon dataset demonstrate significant success in automated recognition of weapon discharge events from IR imagery.
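The shortlisting idea of combining per-proposal saliency with mutual similarity between proposals might be sketched as follows; the feature vectors, saliency scores, and the 50/50 weighting are invented for illustration and are not the paper's actual descriptor or scoring rule:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical deep-feature vectors for 12 top-scored region proposals in
# one IR frame; the first four cover the same object, so their features
# share a common direction. Saliency scores are likewise invented.
feats = rng.normal(size=(12, 64))
feats[:4] += 2.0
saliency = rng.uniform(0.2, 1.0, size=12)

# Cosine-similarity graph between proposals.
unit = feats / np.linalg.norm(feats, axis=1, keepdims=True)
sim = unit @ unit.T
np.fill_diagonal(sim, 0.0)

# Score each proposal by its own saliency plus the "support" it receives
# from mutually similar proposals, then keep a shortlist.
support = sim.mean(axis=1)
score = 0.5 * saliency + 0.5 * support
shortlist = np.argsort(score)[::-1][:3]
```

Proposals covering the same object reinforce one another through the similarity graph, which is what suppresses isolated false alarms.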

  3. Foot contact event detection using kinematic data in cerebral palsy children and normal adults gait.

    PubMed

    Desailly, Eric; Daniel, Yepremian; Sardain, Philippe; Lacouture, Patrick

    2009-01-01

    Initial contact (IC) and toe off (TO) times are essential measurements in the analysis of temporal gait parameters, especially in cerebral palsy (CP) gait analysis. A new gait event detection algorithm, called the high pass algorithm (HPA), has been developed and is discussed in this paper. Kinematics of markers on the heel and metatarsal are used. Their forward components are high-pass filtered to amplify the contact discontinuities, so the local extrema of the processed signal correspond to IC and TO. The accuracy and precision of HPA are compared with the gold standard of foot contact event detection, that is, force plate measurements. Furthermore, HPA is compared with two other kinematic methods. This study was conducted on 20 CP children and on eight normal adults. For normal subjects all the methods performed equally well. True errors in HPA (mean ± standard deviation) were found to be 1 ± 23 ms for IC and 2 ± 25 ms for TO in CP children. These results were significantly (p<0.05) more accurate and precise than those obtained using the other algorithms. Moreover, in the case of pathological gaits, the other methods are not suitable for IC detection when IC is flatfoot or forefoot. In conclusion, the HPA is a simple and robust algorithm, which performs equally well for adults and actually performs better when applied to the gait of CP children. It is therefore recommended as the method of choice.
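The core of the approach, high-pass filtering the forward marker trajectory and reading events off local extrema, can be sketched as below; the cutoff frequency, filter order, sampling rate, and synthetic trajectory are assumptions for illustration, not the published HPA parameters:

```python
import numpy as np
from scipy.signal import butter, filtfilt, argrelextrema

fs = 100.0                                # motion-capture rate (assumed), Hz
t = np.arange(0.0, 10.0, 1.0 / fs)

# Synthetic forward (walking-direction) heel-marker position: steady
# progression plus a 1 Hz gait-like oscillation (illustrative only).
forward = 1.2 * t + 0.05 * np.sin(2.0 * np.pi * 1.0 * t)

# High-pass filter the forward component to suppress the steady
# progression and amplify contact-related oscillation; gait events then
# appear as local extrema of the filtered signal.
b, a = butter(2, 0.5 / (fs / 2.0), btype="highpass")
hp = filtfilt(b, a, forward)

ic_candidates = argrelextrema(hp, np.greater)[0]  # local maxima
to_candidates = argrelextrema(hp, np.less)[0]     # local minima
```

On this 1 Hz synthetic gait the candidate events recur roughly once per second, i.e. about every 100 samples.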

  4. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    PubMed

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  5. Femtomolar detection of single mismatches by discriminant analysis of DNA hybridization events using gold nanoparticles.

    PubMed

    Ma, Xingyi; Sim, Sang Jun

    2013-03-21

    Although DNA-based nanosensors have been demonstrated for the quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for a deeper understanding of DNA-mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combine in a suitable orientation following hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single-mismatched (MMT) and perfectly matched (PMT) DNA were different. We therefore obtained an optimized salt concentration that allowed discrimination of MMT from PMT without stringent control of temperature or pH. The results indicate this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.

  6. Detecting tidal disruption events of massive black holes in normal galaxies with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Ling, Z.-X.; Zhao, D. H.; Zhang, S.-N.; Osborne, J. P.; O'Brien, P.; Willingale, R.; Lapington, J.

    2016-02-01

    Stars are tidally disrupted and accreted when they approach massive black holes (MBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible. Here, we present the proposed Einstein Probe mission, which is a dedicated time-domain soft X-ray all-sky monitor aiming at detecting X-ray transients including TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60° × 60°), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry a more sensitive telescope for X-ray follow-ups, and will be capable of issuing public transient alerts rapidly. Einstein Probe is expected to revolutionise the field of TDE research by detecting several tens to hundreds of events per year from the early phase of flares, many with long-term, well sampled lightcurves.

  7. The HAWC Real-time Flare Monitor for Rapid Detection of Transient Events

    NASA Astrophysics Data System (ADS)

    Abeysekara, A. U.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Avila Rojas, D.; Ayala Solares, H. A.; Barber, A. S.; Bautista-Elivar, N.; Becerra Gonzalez, J.; Becerril, A.; Belmont-Moreno, E.; BenZvi, S. Y.; Bernal, A.; Braun, J.; Brisbois, C.; Caballero-Mora, K. S.; Capistrán, T.; Carramiñana, A.; Casanova, S.; Castillo, M.; Cotti, U.; Cotzomi, J.; Coutiño de León, S.; De la Fuente, E.; De León, C.; Díaz-Vélez, J. C.; Dingus, B. L.; DuVernois, M. A.; Ellsworth, R. W.; Engel, K.; Fiorino, D. W.; Fraija, N.; García-González, J. A.; Garfias, F.; Gerhardt, M.; González, M. M.; González Muñoz, A.; Goodman, J. A.; Hampel-Arias, Z.; Harding, J. P.; Hernandez, S.; Hernandez-Almada, A.; Hona, B.; Hui, C. M.; Hüntemeyer, P.; Iriarte, A.; Jardin-Blicq, A.; Joshi, V.; Kaufmann, S.; Kieda, D.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; León Vargas, H.; Linnemann, J. T.; Longinotti, A. L.; López-Cámara, D.; López-Coto, R.; Raya, G. Luis; Luna-García, R.; Malone, K.; Marinelli, S. S.; Martinez, O.; Martinez-Castellanos, I.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Nisa, M. U.; Noriega-Papaqui, R.; Pelayo, R.; Pérez-Pérez, E. G.; Pretz, J.; Ren, Z.; Rho, C. D.; Rivière, C.; Rosa-González, D.; Rosenberg, M.; Ruiz-Velasco, E.; Salazar, H.; Salesa Greus, F.; Sandoval, A.; Schneider, M.; Schoorlemmer, H.; Sinnis, G.; Smith, A. J.; Springer, R. W.; Surajbali, P.; Taboada, I.; Tibolla, O.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Vianello, G.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Younk, P. W.; Zepeda, A.; Zhou, H.

    2017-07-01

    We present the development of a real-time flare monitor for the High Altitude Water Cherenkov (HAWC) observatory. The flare monitor has been fully operational since 2017 January and is designed to detect very high energy (VHE; E ≳ 100 GeV) transient events from blazars on timescales lasting from 2 minutes to 10 hr in order to facilitate multiwavelength and multimessenger studies. These flares provide information for investigations into the mechanisms that power the blazars’ relativistic jets and accelerate particles within them, and they may also serve as probes of the populations of particles and fields in intergalactic space. To date, the detection of blazar flares in the VHE range has relied primarily on pointed observations by imaging atmospheric Cherenkov telescopes. The recently completed HAWC observatory offers the opportunity to study VHE flares in survey mode, scanning two-thirds of the entire sky every day with a field of view of ∼1.8 steradians. In this work, we report on the sensitivity of the HAWC real-time flare monitor and demonstrate its capabilities via the detection of three high-confidence VHE events in the blazars Markarian 421 and Markarian 501.

  8. Advanced Clinical Decision Support for Vaccine Adverse Event Detection and Reporting.

    PubMed

    Baker, Meghan A; Kaelber, David C; Bar-Shain, David S; Moro, Pedro L; Zambarano, Bob; Mazza, Megan; Garcia, Crystal; Henry, Adam; Platt, Richard; Klompas, Michael

    2015-09-15

    Reporting of adverse events (AEs) following vaccination can help identify rare or unexpected complications of immunizations and aid in characterizing potential vaccine safety signals. We developed an open-source, generalizable clinical decision support system called Electronic Support for Public Health-Vaccine Adverse Event Reporting System (ESP-VAERS) to assist clinicians with AE detection and reporting. ESP-VAERS monitors patients' electronic health records for new diagnoses, changes in laboratory values, and new allergies following vaccinations. When suggestive events are found, ESP-VAERS sends the patient's clinician a secure electronic message with an invitation to affirm or refute the message, add comments, and submit an automated, prepopulated electronic report to VAERS. High-probability AEs are reported automatically if the clinician does not respond. We implemented ESP-VAERS in December 2012 throughout the MetroHealth System, an integrated healthcare system in Ohio. We queried the VAERS database to determine MetroHealth's baseline reporting rates from January 2009 to March 2012 and then assessed changes in reporting rates with ESP-VAERS. In the 8 months following implementation, 91 622 vaccinations were given. ESP-VAERS sent 1385 messages to responsible clinicians describing potential AEs. Clinicians opened 1304 (94.2%) messages, responded to 209 (15.1%), and confirmed 16 for transmission to VAERS. An additional 16 high-probability AEs were sent automatically. Reported events included seizure, pleural effusion, and lymphocytopenia. The odds of a VAERS report submission during the implementation period were 30.2 (95% confidence interval, 9.52-95.5) times greater than the odds during the comparable preimplementation period. An open-source, electronic health record-based clinical decision support system can increase AE detection and reporting rates in VAERS. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society
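The odds-ratio-with-confidence-interval comparison reported above follows standard 2x2 methodology; a sketch with hypothetical counts (the abstract does not give the raw cell counts) using the Woolf log-OR interval:

```python
import math

# Hypothetical 2x2 counts: VAERS reports vs. non-reporting vaccinations
# in the implementation and pre-implementation periods (illustrative
# numbers only; not the study's actual counts).
a, b = 32, 91622 - 32          # implementation: reports, non-reports
c, d = 3, 85000 - 3            # pre-implementation (denominator assumed)

odds_ratio = (a * d) / (b * c)

# Woolf (log) 95% confidence interval for an odds ratio.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
log_or = math.log(odds_ratio)
ci_low = math.exp(log_or - 1.96 * se_log_or)
ci_high = math.exp(log_or + 1.96 * se_log_or)
```

The wide interval reported in the abstract (9.52-95.5 around 30.2) is typical when one cell of the table is small, as the `1/c` term dominates the standard error.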

  9. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    PubMed

    Cron, Andrew; Gouttefangeas, Cécile; Frelinger, Jacob; Lin, Lin; Singh, Satwinder K; Britten, Cedrik M; Welters, Marij J P; van der Burg, Sjoerd H; West, Mike; Chan, Cliburn

    2013-01-01

    Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a consistent labeling
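The single-sample building block, a (truncated) Dirichlet process Gaussian mixture, can be sketched with scikit-learn's `BayesianGaussianMixture` on synthetic two-marker data containing a rare subset; this illustrates only the base model, not the hierarchical HDPGMM extension, and all data and settings are invented:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(7)

# Synthetic two-marker "cytometry" sample: an abundant population plus a
# rare, well-separated subset at ~0.5% frequency (a toy stand-in for
# antigen-specific T cells; real data have more events and dimensions).
major = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(2000, 2))
rare = rng.normal(loc=[8.0, 8.0], scale=0.3, size=(10, 2))
X = np.vstack([major, rare])

# Truncated Dirichlet-process Gaussian mixture: unused components are
# shrunk away by the stick-breaking prior, so n_components is only an
# upper bound on the number of inferred subsets.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(X)
labels = dpgmm.predict(X)

rare_labels = set(labels[-10:])          # component(s) holding the rare subset
```

The rare spiked-in events end up in their own small-weight component rather than being absorbed by the bulk population.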

  10. Hierarchical Modeling for Rare Event Detection and Cell Subset Alignment across Flow Cytometry Samples

    PubMed Central

    Cron, Andrew; Gouttefangeas, Cécile; Frelinger, Jacob; Lin, Lin; Singh, Satwinder K.; Britten, Cedrik M.; Welters, Marij J. P.; van der Burg, Sjoerd H.; West, Mike; Chan, Cliburn

    2013-01-01

    Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a consistent labeling

  11. Automated detection and analysis of depolarization events in human cardiomyocytes using MaDEC.

    PubMed

    Szymanska, Agnieszka F; Heylman, Christopher; Datta, Rupsa; Gratton, Enrico; Nenadic, Zoran

    2016-08-01

    Optical imaging-based methods for assessing the membrane electrophysiology of in vitro human cardiac cells allow for non-invasive temporal assessment of the effect of drugs and other stimuli. Automated methods for detecting and analyzing the depolarization events (DEs) in image-based data allow quantitative assessment of these different treatments. In this study, we use 2-photon microscopy of fluorescent voltage-sensitive dyes (VSDs) to capture the membrane voltage of actively beating human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). We built custom, freely available MATLAB software, called MaDEC, to detect, quantify, and compare DEs of hiPS-CMs treated with the β-adrenergic drugs propranolol and isoproterenol. The efficacy of our software is quantified by comparing detection results against manual DE detection by expert analysts, and by comparing DE analysis results to known drug-induced electrophysiological effects. The software accurately detected DEs with true positive rates of 98-100% and false positive rates of 1-2%, at signal-to-noise ratios (SNRs) of 5 and above. The MaDEC software was also able to distinguish control DEs from drug-treated DEs both immediately and 10 min after drug administration.
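Depolarization-event detection of this kind can be sketched as prominence-based peak picking on a synthetic fluorescence trace; MaDEC's actual pipeline is more elaborate, and the sampling rate, event shape, noise level, and thresholds here are assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(3)

fs = 200.0                                 # frames per second (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)

# Toy VSD fluorescence trace: baseline noise plus sharp depolarization
# upstrokes at ~1 Hz (synthetic stand-in for an hiPS-CM recording).
trace = 0.05 * rng.normal(size=t.size)
event_times = np.arange(0.5, 10.0, 1.0)
for et in event_times:
    trace += np.exp(-((t - et) ** 2) / (2.0 * 0.02 ** 2))

# Detect DEs as prominent peaks separated by a refractory interval.
peaks, props = find_peaks(trace, prominence=0.5, distance=int(0.3 * fs))
detected_rate = len(peaks) / 10.0          # detected events per second
```

At this SNR all ten simulated events are recovered with no false positives; at lower SNR the prominence threshold trades true positives against false alarms, the trade-off the paper quantifies.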

  12. BIRD detection and analysis of high-temperature events: first results

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2003-03-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite, which was put into a 570 km circular sun-synchronous orbit on 22 October 2001, is the detection and quantitative analysis of high-temperature events (HTEs) like fires and volcanoes. A unique feature of the BIRD mid- and thermal-infrared channels is real-time adjustment of their integration time, which allows HTE observation without sensor saturation while preserving a good radiometric resolution of 0.1-0.2 K for pixels at normal temperatures. This makes it possible (a) to improve false alarm rejection and (b) to estimate HTE temperature, area and radiative energy release. Due to its higher spatial resolution, BIRD can detect HTEs an order of magnitude smaller than those detectable by AVHRR and MODIS. The smallest verified fire detected in the BIRD data had an area of ~12 m². The first BIRD HTE detection and analysis results are presented, including bush fires in Australia, forest fires in Russia, coal seam fires in China, and time-varying thermal activity at Etna.

  13. The Evaluation of a Pulmonary Display to Detect Adverse Respiratory Events Using High Resolution Human Simulator

    PubMed Central

    Wachter, S. Blake; Johnson, Ken; Albert, Robert; Syroid, Noah; Drews, Frank; Westenskow, Dwayne

    2006-01-01

    Objective: The authors developed a picture-graphics display for pulmonary function to present typical respiratory data used in perioperative and intensive care environments. The display utilizes color, shape and emergent alerting to highlight abnormal pulmonary physiology. The display serves as an adjunct to traditional operating room displays and monitors. Design: To evaluate the prototype, nineteen clinician volunteers each managed four adverse respiratory events and one normal event using a high-resolution patient simulator which included the new displays (intervention subjects) and traditional displays (control subjects). Between-group comparisons included (i) time to diagnosis and treatment for each adverse respiratory event; (ii) the number of unnecessary treatments during the normal scenario; and (iii) self-reported workload estimates while managing study events. Measurements: Two expert anesthesiologists reviewed video-taped transcriptions of the volunteers to determine time to treat and time to diagnosis. Time values were then compared between groups using a Mann-Whitney U test. Estimated workload for both groups was assessed using the NASA-TLX and compared between groups using an ANOVA. P-values < 0.05 were considered significant. Results: Clinician volunteers detected and treated obstructed endotracheal tubes and intrinsic PEEP problems faster with graphical rather than conventional displays (p < 0.05). During the normal scenario simulation, 3 clinicians using the graphical display, and 5 clinicians using the conventional display, gave unnecessary treatments. Clinician volunteers reported significantly lower subjective workloads using the graphical display for the obstructed endotracheal tube scenario (p < 0.001) and the intrinsic PEEP scenario (p < 0.03). Conclusion: The authors conclude that the graphical pulmonary display may serve as a useful adjunct to traditional displays in identifying adverse respiratory events. PMID:16929038
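The between-group time-to-treat comparison uses a Mann-Whitney U test; a sketch with invented timing data (not the study's measurements) shows the mechanics:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Invented time-to-treatment values (seconds) for one adverse respiratory
# event scenario, graphical vs. conventional display (not study data).
graphical = np.array([42, 55, 38, 61, 47, 52, 44, 58, 40, 49])
conventional = np.array([75, 88, 69, 95, 81, 77, 101, 84, 72, 90])

# Two-sided Mann-Whitney U test, a rank-based comparison that makes no
# normality assumption about the timing distributions.
stat, p_value = mannwhitneyu(graphical, conventional, alternative="two-sided")
faster_with_graphics = p_value < 0.05 and graphical.mean() < conventional.mean()
```

The rank-based test suits small samples of reaction times, which are rarely normally distributed.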

  14. Implementing Dense Arrays of Single-Channel Seismic Recorders to Detect Global Teleseism Events

    NASA Astrophysics Data System (ADS)

    O'Rourke, C. T.; Sheehan, A. F.; Yang, Z.; Harder, S. H.; Miller, K. C.; Worthington, L. L.

    2010-12-01

    Single-channel Reftek RT125A (Texan) seismic recorders have frequently been used to detect active-source seismic events (blasts). However, they have been thought to be inefficient for recording global seismicity over long time scales because of short battery life (days), single-channel capability, and the typically high frequency response of the associated geophones. The goal of this research is to determine just how effective Texan recorders can be for passive teleseism detection. An array of >800 Reftek RT125A recorders and geophones (OYO Geospace 4.5 Hz) was recently deployed across the Bighorn Mountain Range in northern Wyoming as part of the EarthScope Bighorn Arch Seismic Experiment (BASE) Flexible Array experiment. In addition, 38 broadband and 172 short-period 3-component seismic stations were interspersed in the array, which we hope will aid in the detection of S-wave arrivals. Each RT125A was swapped every 4 days to replace batteries and offload data, allowing for a continuous data record over 15 days and potentially recording several global events per day. Because the array is densely spaced (~1 km) we expect to be able to image the subsurface with higher precision than was possible with wider arrays. Preliminary results have shown that M~5.0 events were discernible up to 8,000 km distance, and several M>6.0 events were observable from any distance. For example, the P, PP and S waves from a magnitude 6.4 earthquake in the Aleutian Islands, AK (12:58 GMT Aug 4 2010, depth 27 km) can be seen clearly across the entire array. Despite the fact that the geophones are 4.5 Hz instruments, quality signals up to 3 s are observed for teleseisms with no special processing. Regional and local events are also well recorded, including a magnitude 4.8 earthquake near Jackson Hole, Wyoming, associated aftershocks of the Jackson Hole earthquake, and over 200 local mine blasts. The mine blasts were large enough to be recorded by the USArray Array Network Facility (ANF), requiring usable
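Picking teleseismic arrivals on single-channel records like these is commonly done with an STA/LTA trigger; a minimal sketch on a synthetic trace follows (the abstract does not specify the experiment's actual detection workflow, so the windows, threshold, and signal model are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(11)

fs = 50.0                              # sampling rate, Hz (assumed)
t = np.arange(0, 120, 1 / fs)          # 2 minutes of data

# Synthetic single-channel record: unit background noise plus a decaying
# teleseismic arrival at t = 60 s (purely illustrative).
trace = rng.normal(scale=1.0, size=t.size)
after = t >= 60.0
trace[after] += 4.0 * np.sin(2 * np.pi * 1.0 * t[after]) * np.exp(-(t[after] - 60.0) / 10.0)

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Short-term / long-term average ratio on squared amplitudes."""
    sq = x ** 2
    n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
    csum = np.cumsum(np.concatenate(([0.0], sq)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    m = min(len(sta), len(lta))        # align both windows at their end sample
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

ratio = sta_lta(trace, fs)
offset = t.size - ratio.size           # ratio[j] ends at sample offset + j
trigger_idx = int(np.argmax(ratio > 4.0))
trigger_time = (offset + trigger_idx) / fs
```

The ratio stays near 1 on noise and jumps shortly after the arrival, so the trigger fires within a fraction of a second of the true onset.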

  15. Improving Infrasound Signal Detection and Event Location in the Western US Using Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Dannemann, F. K.; Park, J.; Marcillo, O. E.; Blom, P. S.; Stump, B. W.; Hayward, C.

    2016-12-01

    Data from five infrasound arrays in the western US, jointly operated by the University of Utah Seismograph Stations and Southern Methodist University, are used to test a database-centric processing pipeline, InfraPy, for automated event detection, association, and location. Infrasonic array data from a one-year period (January 1, 2012 to December 31, 2012) are used. This study focuses on the identification and location of 53 ground-truth verified events produced by near-surface military explosions at the Utah Test and Training Range (UTTR). Signals are detected using an adaptive F-detector, which accounts for correlated and uncorrelated time-varying noise in order to reduce false detections due to the presence of coherent noise. Variations in detection azimuth and correlation are found to be consistent with seasonal changes in atmospheric winds. The Bayesian infrasonic source location (BISL) method is used to produce source location and time credibility contours based on posterior probability density functions. Updates to the previous BISL methodology include the application of celerity range and azimuth deviation distributions in order to accurately account for the spatial and temporal variability of infrasound propagation through the atmosphere. These priors are estimated by ray tracing through Ground-to-Space (G2S) atmospheric models as a function of season and time of day, using historic atmospheric characterizations from 2007 to 2013. Of the 53 events, 31 are successfully located using the InfraPy pipeline. Confidence contour areas for maximum a posteriori event locations produce error estimates that are reduced by a maximum of 98%, and by an average of 25%, from location estimates utilizing a simple time-independent uniform atmosphere. We compare real-time ray tracing results with the statistical atmospheric priors used in this study to examine large time differences between known origin times and estimated origin times that might be due to the misidentification of
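
    The F-detector underlying this pipeline compares coherent beam power across the array channels with the residual (incoherent) power; for noise alone the statistic sits near 1, while a coherent arrival drives it far above 1. A minimal fixed-window sketch of the core statistic (the adaptive noise tracking of the actual detector is omitted, and the function name is ours):

```python
import numpy as np

def f_statistic(x):
    """Array F-statistic for one window.

    x: (n_channels, n_samples) array of time-aligned traces
    (any beam-steering delays are assumed already applied).
    For uncorrelated noise E[F] is roughly 1; a coherent
    arrival drives F far above 1.
    """
    n = x.shape[0]
    beam = x.mean(axis=0)                          # delay-and-sum beam
    beam_power = np.sum(beam ** 2)                 # coherent power
    residual = np.sum((x - beam) ** 2) / (n - 1)   # incoherent power estimate
    return n * beam_power / residual
```

    A detection would be declared when F exceeds a threshold set from the F-distribution of the noise-only case.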

  16. Predicting Negative Events: Using Post-discharge Data to Detect High-Risk Patients

    PubMed Central

    Sulieman, Lina; Fabbri, Daniel; Wang, Fei; Hu, Jianying; Malin, Bradley A

    2016-01-01

    Predicting negative outcomes, such as readmission or death, and detecting high-risk patients are important yet challenging problems in medical informatics. Various models have been proposed to detect high-risk patients; however, the state of the art relies on patient information collected before or at the time of discharge to predict future outcomes. In this paper, we investigate the effect of including data generated post discharge to predict negative outcomes. Specifically, we focus on two types of patients admitted to the Vanderbilt University Medical Center between 2010 and 2013: i) those with an acute event (704 hip fracture patients) and ii) those with chronic problems (5,250 congestive heart failure (CHF) patients). We show that the post-discharge model improved the AUC of the LACE index, a standard readmission scoring function, by 20-30%. Moreover, the new model resulted in AUCs higher by 15-27% for hip fracture and 10-12% for CHF compared to standard models. PMID:28269914
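
    For reference, the LACE index used as the baseline combines Length of stay, Acuity of admission, Charlson comorbidity score, and Emergency-department visits in the prior six months into a 0-19 score. A sketch using the commonly published point bands (the bands below are an assumption drawn from the general LACE literature, not quoted from this paper):

```python
def lace_index(length_of_stay_days, acute_admission, charlson_score, ed_visits_6mo):
    """LACE readmission-risk score (0-19, higher = riskier).

    Point bands follow the commonly published LACE mapping;
    verify against your local implementation before use.
    """
    if length_of_stay_days < 1:
        l = 0
    elif length_of_stay_days <= 3:
        l = length_of_stay_days          # 1, 2, or 3 points
    elif length_of_stay_days <= 6:
        l = 4
    elif length_of_stay_days <= 13:
        l = 5
    else:
        l = 7
    a = 3 if acute_admission else 0      # emergent/acute admission
    c = charlson_score if charlson_score <= 3 else 5
    e = min(ed_visits_6mo, 4)            # ED visits, capped at 4
    return l + a + c + e
```

    The paper's contribution is that features observed after discharge outperform any such discharge-time score.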

  17. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
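
    The original swinging door algorithm segments a series by growing a segment until no single slope can keep every point within a tolerance ε of a line from the segment start; ε is the parameter the optimization targets. A minimal sketch (the restart convention at a door crossing is one common choice, not necessarily the paper's):

```python
def swinging_door(t, y, eps):
    """Segment (t, y) with the swinging door algorithm.

    Returns indices of segment boundaries. A segment [s, j] is kept
    while some slope m satisfies |y[s] + m*(t[j]-t[s]) - y[j]| <= eps
    for every point j in the segment.
    """
    bounds = [0]
    s = 0
    lo, hi = float("-inf"), float("inf")        # feasible slope interval (the "doors")
    for j in range(1, len(y)):
        dt = t[j] - t[s]
        lo = max(lo, (y[j] - y[s] - eps) / dt)  # lower door swings up
        hi = min(hi, (y[j] - y[s] + eps) / dt)  # upper door swings down
        if lo > hi:                             # doors crossed: close the segment
            s = j - 1
            bounds.append(s)
            dt = t[j] - t[s]
            lo = (y[j] - y[s] - eps) / dt
            hi = (y[j] - y[s] + eps) / dt
    bounds.append(len(y) - 1)
    return bounds
```

    The paper's dynamic-programming step would then merge adjacent segments of similar slope into significant ramps.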

  18. Label-Free Detection of Single Living Bacteria via Electrochemical Collision Event

    PubMed Central

    Lee, Ji Young; Kim, Byung-Kwon; Kang, Mijeong; Park, Jun Hui

    2016-01-01

    We detected single living bacterial cells on an ultramicroelectrode (UME) using a single-particle collision method and optical microscopic methods. The number of collision events involving the bacterial cells indicated in current-time (i-t) curves corresponds to the number of bacterial cells (i.e., Escherichia coli) on the UME surface, as observed visually. Simulations were performed to determine the theoretical current response (75 pA) and frequency (0.47 pM−1 s−1) of single Escherichia coli collisions. The experimental current response (83 pA) and frequency (0.26 pM−1 s−1) were on the same order of magnitude as the theoretical values. This single-particle collision approach facilitates detecting living bacteria and determining their concentration in solution and could be widely applied to studying other bacteria and biomolecules. PMID:27435527

  19. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270

  20. Increasing cognitive load to facilitate lie detection: the benefit of recalling an event in reverse order.

    PubMed

    Vrij, Aldert; Mann, Samantha A; Fisher, Ronald P; Leal, Sharon; Milne, Rebecca; Bull, Ray

    2008-06-01

    In two experiments, we tested the hypotheses that (a) the difference between liars and truth tellers will be greater when interviewees report their stories in reverse order than in chronological order, and (b) instructing interviewees to recall their stories in reverse order will facilitate detecting deception. In Experiment 1, 80 mock suspects told the truth or lied about a staged event and did or did not report their stories in reverse order. The reverse order interviews contained many more cues to deceit than the control interviews. In Experiment 2, 55 police officers watched a selection of the videotaped interviews of Experiment 1 and made veracity judgements. Requesting suspects to convey their stories in reverse order improved police observers' ability to detect deception and did not result in a response bias.

  1. Gait event detection on level ground and incline walking using a rate gyroscope.

    PubMed

    Catalfamo, Paola; Ghoussayni, Salim; Ewins, David

    2010-01-01

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. Accurate determination of the Initial Contact of the foot with the floor (IC) and the final contact or Foot Off (FO) on different terrains is important. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects walking outdoors on level ground, and up and down an incline. Performance was compared with a reference pressure measurement system. The mean difference between the gyroscope and the reference was less than -25 ms for IC and less than 75 ms for FO for all terrains. Detection success was over 98%. These results provide preliminary evidence supporting the use of the gyroscope for gait event detection on inclines as well as level walking.
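
    The shank angular-velocity rule used by detectors of this kind is roughly: the sagittal-plane signal shows a large positive swing-phase peak, with IC at the first local minimum after it and FO at the last local minimum before it. A toy sketch of that rule (the threshold, sign convention, and function name are assumptions, not the paper's calibrated values):

```python
def gait_events(omega, swing_threshold=2.0):
    """Detect (FO, IC) index pairs from shank angular-velocity samples.

    Mid-swing is taken as a local maximum above swing_threshold;
    FO is the last local minimum before it and IC the first local
    minimum after it (indices into omega, or None if absent).
    """
    n = len(omega)
    peaks = [i for i in range(1, n - 1)
             if omega[i] > swing_threshold
             and omega[i] >= omega[i - 1] and omega[i] >= omega[i + 1]]
    minima = [i for i in range(1, n - 1)
              if omega[i] <= omega[i - 1] and omega[i] <= omega[i + 1]]
    events = []
    for p in peaks:
        fo = max((m for m in minima if m < p), default=None)
        ic = min((m for m in minima if m > p), default=None)
        events.append((fo, ic))
    return events
```

    Real signals would first be low-pass filtered so the local extrema correspond to gait phases rather than noise.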

  2. The early bird catches the term: combining twitter and news data for event detection and situational awareness.

    PubMed

    Thapen, Nicholas; Simmie, Donal; Hankin, Chris

    2016-10-07

    Twitter updates now represent an enormous stream of information originating from a wide variety of formal and informal sources, much of which is relevant to real-world events. They can therefore be highly useful for event detection and situational awareness applications. In this paper we apply customised filtering techniques to existing bio-surveillance algorithms to detect localised spikes in Twitter activity, showing that these correspond to real events with a high level of confidence. We then develop a methodology to automatically summarise these events, both by providing the tweets which best describe the event and by linking to highly relevant news articles. This news linkage is accomplished by identifying terms occurring more frequently in the event tweets than in a baseline of activity for the area concerned, and using these to search for news. We apply our methods to outbreaks of illness and events strongly affecting sentiment and are able to detect events verifiable by third party sources and produce high quality summaries. This study demonstrates linking event detection from Twitter with relevant online news to provide situational awareness. This builds on the existing studies that focus on Twitter alone, showing that integrating information from multiple online sources can produce useful analysis.
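
    The news-linkage step ranks terms that are over-represented in the event tweets relative to a baseline for the same area. A minimal sketch using a smoothed frequency ratio (the paper's exact scoring function may differ; the function name is ours):

```python
from collections import Counter

def overrepresented_terms(event_docs, baseline_docs, k=5, alpha=1.0):
    """Rank terms by smoothed relative frequency: event vs. baseline.

    alpha is an additive-smoothing constant so terms unseen in the
    baseline do not divide by zero.
    """
    ev = Counter(w for d in event_docs for w in d.lower().split())
    bl = Counter(w for d in baseline_docs for w in d.lower().split())
    ev_total = sum(ev.values())
    bl_total = sum(bl.values())

    def score(w):
        p_ev = (ev[w] + alpha) / (ev_total + alpha)
        p_bl = (bl[w] + alpha) / (bl_total + alpha)
        return p_ev / p_bl

    return sorted(ev, key=score, reverse=True)[:k]
```

    The top-ranked terms would then be submitted as a news-search query to retrieve articles describing the event.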

  3. Automatic detection of whole night snoring events using non-contact microphone.

    PubMed

    Dafna, Eliran; Tarasiuk, Ariel; Zigel, Yaniv

    2013-01-01

    Although awareness of sleep disorders is increasing, limited information is available on whole night detection of snoring. Our study aimed to develop and validate a robust, high performance, and sensitive whole-night snore detector based on non-contact technology. Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. Sixty-seven subjects (age 52.5 ± 13.5 years, BMI 30.8 ± 4.7 kg/m(2), m/f 40/27) referred for PSG for obstructive sleep apnea diagnoses were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental). A feature selection process was applied to select the most discriminative features extracted from time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4% based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2% with sensitivity of 98.0% (snore as a snore) and specificity of 98.3% (noise as noise). Audio-based features extracted from time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients.

  4. Automatic Detection of Whole Night Snoring Events Using Non-Contact Microphone

    PubMed Central

    Dafna, Eliran; Tarasiuk, Ariel; Zigel, Yaniv

    2013-01-01

    Objective Although awareness of sleep disorders is increasing, limited information is available on whole night detection of snoring. Our study aimed to develop and validate a robust, high performance, and sensitive whole-night snore detector based on non-contact technology. Design Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. Patients Sixty-seven subjects (age 52.5±13.5 years, BMI 30.8±4.7 kg/m2, m/f 40/27) referred for PSG for obstructive sleep apnea diagnoses were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. Measurements and Results To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental). A feature selection process was applied to select the most discriminative features extracted from time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4% based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2% with sensitivity of 98.0% (snore as a snore) and specificity of 98.3% (noise as noise). Conclusions Audio-based features extracted from time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients. PMID:24391903

  5. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    SciTech Connect

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-04-20

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both higher cadence monitoring and higher photometric accuracy than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using KMTNet. From this study, we find that EWCP events occur with a frequency of >50% in the case of ≲100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giants, because KMTNet, with its constant exposure time, saturates easily around the peaks of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of planet mass, will be detected in the near future.

  6. Endpoint visual detection of three genetically modified rice events by loop-mediated isothermal amplification.

    PubMed

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-11-07

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%−0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  7. Endpoint Visual Detection of Three Genetically Modified Rice Events by Loop-Mediated Isothermal Amplification

    PubMed Central

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-01-01

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%–0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops. PMID:23203072

  8. Development of electrochemical biosensor for detection of pathogenic microorganism in Asian dust events.

    PubMed

    Yoo, Min-Sang; Shin, Minguk; Kim, Younghun; Jang, Min; Choi, Yoon-E; Park, Si Jae; Choi, Jonghoon; Lee, Jinyoung; Park, Chulhwan

    2017-05-01

    We developed a single-walled carbon nanotube (SWCNT)-based electrochemical biosensor for the detection of Bacillus subtilis, one of the microorganisms observed in Asian dust events, which causes respiratory diseases such as asthma and pneumonia. The SWCNTs act as a transducer, converting the biological antigen/antibody reaction into an electrical signal, while 1-pyrenebutanoic acid succinimidyl ester (1-PBSE) and anti-B. subtilis serve as a chemical linker and an acceptor, respectively, for the adhesion of the target microorganism. The detection range (10(2)-10(10) CFU/mL) and the detection limit (10(2) CFU/mL) of the developed biosensor were identified, and the response time was 10 min. In a specificity test, the developed biosensor captured the largest amount of target B. subtilis compared with the other tested microorganisms (Staphylococcus aureus, Flavobacterium psychrolimnae, and Aquabacterium commune). In addition, target B. subtilis detected by the developed biosensor was observed by scanning electron microscope (SEM) analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Pinda: a web service for detection and analysis of intraspecies gene duplication events.

    PubMed

    Kontopoulos, Dimitrios-Georgios; Glykos, Nicholas M

    2013-09-01

    We present Pinda, a Web service for the detection and analysis of possible duplications of a given protein or DNA sequence within a source species. Pinda fully automates the whole gene duplication detection procedure, from performing the initial similarity searches, to generating the multiple sequence alignments and the corresponding phylogenetic trees, to bootstrapping the trees and producing a Z-score-based list of duplication candidates for the input sequence. Pinda has been cross-validated using an extensive set of known and bibliographically characterized duplication events. The service facilitates the automatic and dependable identification of gene duplication events, using some of the most successful bioinformatics software to perform an extensive analysis protocol. Pinda will prove of use for the analysis of newly discovered genes and proteins, thus also assisting the study of recently sequenced genomes. The service's location is http://orion.mbg.duth.gr/Pinda. The source code is freely available via https://github.com/dgkontopoulos/Pinda/. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
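
    The final step of such a pipeline reduces to flagging similarity scores that are unusually high relative to the overall score distribution. A minimal Z-score sketch (Pinda's actual statistic is computed over bootstrapped phylogenetic trees, which this toy version does not reproduce; the function name is ours):

```python
def duplication_candidates(scores, z_cut=2.0):
    """Return sequence names whose similarity score is a Z-score outlier.

    scores: mapping of candidate name -> similarity score.
    Uses the population standard deviation of all scores.
    """
    vals = list(scores.values())
    mu = sum(vals) / len(vals)
    sd = (sum((v - mu) ** 2 for v in vals) / len(vals)) ** 0.5
    return [name for name, v in scores.items() if (v - mu) / sd > z_cut]
```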

  10. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    PubMed Central

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-01-01

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
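
    As rough intuition for the LS-OC-SVM decision function, a batch (offline) least-squares analogue fits a kernel expansion that maps training samples to 1 and scores new samples with the same expansion; low scores flag abnormal frames. This is a simplified sketch under that assumption, not the paper's online or sparse algorithm, and all names are ours:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class LSOneClass:
    """Batch least-squares one-class model (offline analogue).

    Fits f(x) = sum_i alpha_i k(x, x_i) with f(x_i) ~ 1 on training
    data; f(x) near 0 far from the training set flags an anomaly.
    """
    def __init__(self, gamma=1.0, lam=1e-2):
        self.gamma, self.lam = gamma, lam

    def fit(self, X):
        self.X = np.asarray(X, float)
        K = rbf_kernel(self.X, self.X, self.gamma)
        n = len(self.X)
        # Regularized least squares: (K + lam*I) alpha = 1
        self.alpha = np.linalg.solve(K + self.lam * np.eye(n), np.ones(n))
        return self

    def score(self, X):
        return rbf_kernel(np.asarray(X, float), self.X, self.gamma) @ self.alpha
```

    In the paper, each video frame's covariance-matrix descriptor would play the role of x, and the model is updated online as new normal frames arrive.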

  11. AKSED: adaptive knowledge-based system for event detection using collaborative unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Wang, X. Sean; Lee, Byung Suk; Sadjadi, Firooz

    2006-05-01

    Advances in sensor technology and image processing have made it possible to equip unmanned aerial vehicles (UAVs) with economical, high-resolution, energy-efficient sensors. Despite these improvements, current UAVs lack autonomous and collaborative operation capabilities, due to limited bandwidth and limited on-board image processing abilities. The situation, however, is changing. In the next generation of UAVs, much image processing can be carried out onboard, and the communication bandwidth problem will improve. More importantly, with more processing power, collaborative operations among a team of autonomous UAVs can provide more intelligent event detection capabilities. In this paper, we present ideas for developing a system that enables target recognition through collaborative operation of autonomous UAVs. UAVs are configured in three stages: manufacturing, mission planning, and deployment. Different sets of information are needed at different stages, and the resulting outcome is an optimized event detection code deployed onto a UAV. The envisioned system architecture and the contemplated methodology, together with problems to be addressed, are presented.

  12. A Simple and Robust Event-Detection Algorithm for Single-Cell Impedance Cytometry.

    PubMed

    Caselli, Federica; Bisegna, Paolo

    2016-02-01

    Microfluidic impedance cytometry is emerging as a powerful label-free technique for the characterization of single biological cells. In order to increase the sensitivity and the specificity of the technique, suited digital signal processing methods are required to extract meaningful information from measured impedance data. In this study, a simple and robust event-detection algorithm for impedance cytometry is presented. Since a differential measuring scheme is generally adopted, the signal recorded when a cell passes through the sensing region of the device exhibits a typical odd-symmetric pattern. This feature is exploited twice by the proposed algorithm: first, a preliminary segmentation, based on the correlation of the data stream with the simplest odd-symmetric template, is performed; then, the quality of detected events is established by evaluating their E2O index, that is, a measure of the ratio between their even and odd parts. A thorough performance analysis is reported, showing the robustness of the algorithm with respect to parameter choice and noise level. In terms of sensitivity and positive predictive value, an overall performance of 94.9% and 98.5%, respectively, was achieved on two datasets relevant to microfluidic chips with very different characteristics, considering three noise levels. The present algorithm can foster the role of impedance cytometry in single-cell analysis, which is the new frontier in "Omics."
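
    The E2O index has a direct implementation: split the candidate window into its even and odd parts about the window centre and take the ratio of their energies. A clean differential event is dominated by its odd part, so a low E2O marks a plausible event. A sketch (function name ours):

```python
import numpy as np

def e2o_index(w):
    """Even-to-odd energy ratio of a window about its centre.

    Differential impedance events are approximately odd-symmetric,
    so a low E2O indicates a plausible event; noise or baseline
    drift raises it. A perfectly even window returns inf.
    """
    w = np.asarray(w, float)
    rev = w[::-1]
    even = 0.5 * (w + rev)   # symmetric component
    odd = 0.5 * (w - rev)    # antisymmetric component
    return np.sum(even ** 2) / np.sum(odd ** 2)
```

    In the full algorithm, candidate windows come from correlating the stream with the simplest odd-symmetric template, and E2O then accepts or rejects each candidate.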

  13. Automatic Event Detection and Characterization of solar events with IRIS, SDO/AIA and Hi-C

    NASA Astrophysics Data System (ADS)

    Alexander, Caroline; Fayock, Brian; Winebarger, Amy

    2016-05-01

    Dynamic, low-lying loops with peak temperatures <1 MK are observed throughout the solar transition region. These loops can be observed in SDO/AIA data due to some lower temperature spectral lines in the passbands, but have not been studied in great detail. We have developed a technique to automatically identify events (i.e., brightenings) on a pixel-by-pixel basis applying a set of selection criteria. The pixels are then grouped according to their proximity in space and relative progression of the event. This method allows us to characterize their overall lifetime and the rate at which these events occur. Our current progress includes identification of these groups of events in IRIS data, determination of their existence in AIA data, and characterization based on a comparison between the two. This technique has also been used on Hi-C data in preparation for the rocket re-flight in July 2016. Results on the success of this technique at identifying real structures and sources of heating will be shown.
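
    A stripped-down version of pixel-wise event detection with spatial grouping: flag pixels that exceed their own temporal mean by k standard deviations at any time, then group the flagged pixels into 4-connected components. The selection criteria here are deliberately simpler than the ones used in the study, and all names are ours:

```python
import numpy as np
from collections import deque

def detect_events(cube, k=3.0):
    """Flag brightenings in a (time, y, x) data cube and group them.

    A pixel is flagged at times where it exceeds its own temporal mean
    by k standard deviations; flagged pixels are then grouped into
    4-connected spatial components (lists of (y, x) coordinates).
    """
    mu = cube.mean(axis=0)
    sigma = cube.std(axis=0) + 1e-12          # avoid division issues for flat pixels
    flagged = (cube > mu + k * sigma).any(axis=0)
    groups, seen = [], np.zeros_like(flagged, bool)
    for y, x in zip(*np.nonzero(flagged)):
        if seen[y, x]:
            continue
        comp, queue = [], deque([(y, x)])
        seen[y, x] = True
        while queue:                           # BFS over 4-neighbours
            cy, cx = queue.popleft()
            comp.append((cy, cx))
            for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                if (0 <= ny < flagged.shape[0] and 0 <= nx < flagged.shape[1]
                        and flagged[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        groups.append(comp)
    return groups
```

    The study additionally groups events by their relative progression in time, which this spatial-only sketch omits.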

  14. The Waveform Correlation Event Detection System project, Phase II: Testing with the IDC primary network

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Moore, S.G.

    1998-04-01

    Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.

  15. Slip-Related Changes in Plantar Pressure Distribution, and Parameters for Early Detection of Slip Events

    PubMed Central

    Choi, Seungyoung; Cho, Hyungpil; Kang, Boram; Lee, Dong Hun; Kim, Mi Jung

    2015-01-01

    Objective To investigate differences in plantar pressure distribution between a normal gait and unpredictable slip events to predict the initiation of the slipping process. Methods Eleven male participants were enrolled. Subjects walked onto a wooden tile, and two layers of oily vinyl sheet were placed on the expected spot of the 4th step to induce a slip. An insole pressure-measuring system was used to monitor plantar pressure distribution. This system measured plantar pressure in four regions (the toes, metatarsal head, arch, and heel) for three events: the step during normal gait; the recovered step, when the subject recovered from a slip; and the uncorrected, harmful slipped step. Four variables were analyzed: peak pressure (PP), contact time (CT), the pressure-time integral (PTI), and the instant of peak pressure (IPP). Results The plantar pressure pattern in the heel was unique, as compared with other parts of the sole. In the heel, PP, CT, and PTI values were high in slipped and recovered steps compared with normal steps. The IPP differed markedly among the three steps. The IPPs in the heel for the three events were, in descending order (from latest to earliest), slipped, recovered, and normal steps, whereas in the other regions the order was normal, recovered, and slipped steps. Finally, the metatarsal head-to-heel IPP ratios for the normal, recovered, and slipped steps were 6.1±2.9, 3.1±3.0, and 2.2±2.5, respectively. Conclusion A distinctive plantar pressure pattern in the heel might be useful for early detection of a slip event to prevent slip-related injuries. PMID:26798603
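
    The four variables have direct definitions over a single step's pressure-time series; a sketch assuming uniformly sampled insole data (the function name and the rectangle-rule integral are our simplifications):

```python
def step_metrics(pressure, dt, contact_threshold=0.0):
    """PP, CT, PTI, and IPP from one region's pressure samples.

    pressure: uniformly sampled pressure values for a single step.
    dt: sampling interval in seconds.
    Returns (peak_pressure, contact_time, pressure_time_integral,
    instant_of_peak_pressure), with times in seconds from step onset.
    """
    pp = max(pressure)                                          # peak pressure
    ct = sum(1 for p in pressure if p > contact_threshold) * dt  # contact time
    pti = sum(pressure) * dt                                    # pressure-time integral
    ipp = pressure.index(pp) * dt                               # instant of peak pressure
    return pp, ct, pti, ipp
```

    The study's metatarsal-head-to-heel IPP ratio is then just the IPP of one region divided by that of the heel.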

  16. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE PAGES

    Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.; ...

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  17. Detecting regular sound changes in linguistics as events of concerted evolution.

    PubMed

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Unreported seismic events found far off-shore Mexico using full-waveform, cross-correlation detection method.

    NASA Astrophysics Data System (ADS)

Solano, Ericka Alinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli

    2015-04-01

A large subset of seismic events do not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that had clear arrivals on only three stations and was therefore not located and reported by the Mexican National Seismological Service (SSN). The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A very rough estimate places these two events on the portion of the East Pacific Rise around 9°N. These two events are detected despite their distance from the search area, owing to favorable move-out across the SSN network. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.

  19. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    PubMed Central

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983
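The abstract above reports its predictors as odds ratios with 95% confidence intervals from multivariate logistic regression. As an illustrative aside (not the authors' code), a logistic-regression coefficient and its standard error convert to that reporting form as follows; `beta` and `se` are hypothetical inputs:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval (the format in
    which the study reports its predictors)."""
    return (math.exp(beta),            # point estimate OR = e^beta
            math.exp(beta - z * se),   # lower CI bound
            math.exp(beta + z * se))   # upper CI bound
```

For example, a coefficient of 0 (no effect) maps to an OR of 1.0 with a CI that straddles 1.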

  20. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    NASA Astrophysics Data System (ADS)

    Wu, Chunhung

    2016-04-01

Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, the landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) methods, in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfall in the Chishan river watershed exceeded the 200-year-return-period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km2, equal to a landslide ratio of 4.1%. The main landslide types, based on Varnes' (1978) classification, are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are the dense landslide distribution and the large share of downslope landslide areas owing to headward erosion and bank erosion during the flooding processes. The downslope landslide area in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than the upslope landslide area. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been proven to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall

  1. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

In March 2005, U.S. authorities informed the European Commission of the inadvertent release of the unauthorized maize GM event Bt10 into their market and, subsequently, the grain channel. In the United States, measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union, an embargo on maize gluten and brewer's grain imports was implemented unless the absence of Bt10 was certified with a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out by both the company Syngenta, who produced the event, and the European Commission Joint Research Centre, who validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were found tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose.

  2. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086

  3. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
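The two preceding records detect TO and HS events from peaks of the acceleration jerk signal. A minimal sketch of the jerk-plus-peak-heuristic idea, with an assumed sampling rate `fs` and threshold (both illustrative, not the paper's tuned parameters or its time-frequency stage):

```python
import numpy as np

def detect_jerk_peaks(accel, fs, threshold):
    """Toy gait-event detector: differentiate acceleration to get a jerk
    estimate, then mark local maxima above a threshold as candidate events.

    accel: 1-D acceleration samples; fs: sampling rate (Hz);
    threshold: minimum jerk value for a candidate peak.
    Returns sample indices of candidate gait events.
    """
    accel = np.asarray(accel, dtype=float)
    jerk = np.gradient(accel) * fs   # finite-difference jerk estimate
    peaks = []
    for i in range(1, len(jerk) - 1):
        # strict comparison on the right avoids double-counting plateaus
        if jerk[i] > threshold and jerk[i] >= jerk[i - 1] and jerk[i] > jerk[i + 1]:
            peaks.append(i)
    return peaks
```

A real implementation would add the paper's time-frequency analysis to adapt the threshold per terrain; here it is a fixed input.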

  4. Automatic Detection of Swallowing Events by Acoustical Means for Applications of Monitoring of Ingestive Behavior

    PubMed Central

    Sazonov, Edward S.; Makeyev, Oleksandr; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Edward L.; Neuman, Michael R.

    2010-01-01

    Our understanding of etiology of obesity and overweight is incomplete due to lack of objective and accurate methods for Monitoring of Ingestive Behavior (MIB) in the free living population. Our research has shown that frequency of swallowing may serve as a predictor for detecting food intake, differentiating liquids and solids, and estimating ingested mass. This paper proposes and compares two methods of acoustical swallowing detection from sounds contaminated by motion artifacts, speech and external noise. Methods based on mel-scale Fourier spectrum, wavelet packets, and support vector machines are studied considering the effects of epoch size, level of decomposition and lagging on classification accuracy. The methodology was tested on a large dataset (64.5 hours with a total of 9,966 swallows) collected from 20 human subjects with various degrees of adiposity. Average weighted epoch recognition accuracy for intra-visit individual models was 96.8% which resulted in 84.7% average weighted accuracy in detection of swallowing events. These results suggest high efficiency of the proposed methodology in separation of swallowing sounds from artifacts that originate from respiration, intrinsic speech, head movements, food ingestion, and ambient noise. The recognition accuracy was not related to body mass index, suggesting that the methodology is suitable for obese individuals. PMID:19789095
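The swallowing-detection features above are built on the mel-scale Fourier spectrum. The mel mapping itself is a standard recipe; a sketch of band edges equally spaced on the mel scale (the band count and frequency range are illustrative assumptions):

```python
import math

def hz_to_mel(f):
    """Standard mel-scale mapping of a frequency in Hz."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    """Inverse of hz_to_mel."""
    return 700.0 * (10 ** (m / 2595.0) - 1.0)

def mel_band_edges(f_lo, f_hi, n_bands):
    """Edges of n_bands triangular filters, equally spaced on the mel
    scale between f_lo and f_hi (n_bands + 2 edge frequencies in Hz)."""
    m_lo, m_hi = hz_to_mel(f_lo), hz_to_mel(f_hi)
    mels = [m_lo + i * (m_hi - m_lo) / (n_bands + 1) for i in range(n_bands + 2)]
    return [mel_to_hz(m) for m in mels]
```

Band energies computed over these edges would then feed the support vector machine classifier the abstract describes.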

  5. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-05

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to enhance the state of the art in SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.
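The first stage of OpSDA is the swinging door algorithm's piecewise-linear segmentation. A compact sketch of that stage, assuming a deviation tolerance `epsilon`; the follow-on dynamic-programming merge of segments into ramps is not shown:

```python
def swinging_door(times, values, epsilon):
    """Segment a time series into piecewise-linear sections (swinging door).

    epsilon is the allowed deviation from each linear section, in the same
    units as values. Returns the indices of segment endpoints.
    """
    if len(times) < 2:
        return list(range(len(times)))
    anchors = [0]
    a = 0                        # index of the current anchor point
    slope_hi = float("inf")      # tightest upper "door" slope so far
    slope_lo = float("-inf")     # tightest lower "door" slope so far
    for i in range(1, len(times)):
        dt = times[i] - times[a]
        slope_hi = min(slope_hi, (values[i] + epsilon - values[a]) / dt)
        slope_lo = max(slope_lo, (values[i] - epsilon - values[a]) / dt)
        if slope_lo > slope_hi:  # the doors have crossed: close the segment
            a = i - 1
            anchors.append(a)
            dt = times[i] - times[a]
            slope_hi = (values[i] + epsilon - values[a]) / dt
            slope_lo = (values[i] - epsilon - values[a]) / dt
    anchors.append(len(times) - 1)
    return anchors
```

On a signal that ramps and then flattens, the breakpoint lands where the slope changes, which is exactly what makes the segments usable as ramp candidates.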

  6. The WISE Detection of an Infrared Echo in Tidal Disruption Event ASASSN-14li

    NASA Astrophysics Data System (ADS)

    Jiang, Ning; Dou, Liming; Wang, Tinggui; Yang, Chenwei; Lyu, Jianwei; Zhou, Hongyan

    2016-09-01

    We report the detection of a significant infrared variability of the nearest tidal disruption event (TDE) ASASSN-14li using Wide-field Infrared Survey Explorer and newly released Near-Earth Object WISE Reactivation data. In comparison with the quiescent state, the infrared flux is brightened by 0.12 and 0.16 mag in the W1 (3.4 μm) and W2 (4.6 μm) bands at 36 days after the optical discovery (or ˜110 days after the peak disruption date). The flux excess is still detectable ˜170 days later. Assuming that the flare-like infrared emission is from the dust around the black hole, its blackbody temperature is estimated to be ˜2.1 × 103 K, slightly higher than the dust sublimation temperature, indicating that the dust is likely located close to the dust sublimation radius. The equilibrium between the heating and radiation of the dust claims a bolometric luminosity of ˜1043-1045 erg s-1, comparable with the observed peak luminosity. This result has for the first time confirmed the detection of infrared emission from the dust echoes of TDEs.

  7. Rare-event detection and process control for a biomedical application

    NASA Astrophysics Data System (ADS)

    Kegelmeyer, Laura N.

    1990-05-01

Medical researchers are seeking a method for detecting chromosomal abnormalities in unborn children without requiring invasive procedures such as amniocentesis. Software has been developed to utilize a light microscope to detect fetal cells that occur with very low frequency in a sample of maternal blood. This rare-event detection involves dividing a microscope slide containing a maternal blood sample into as many as 40,000 fields, automatically focusing on each field-of-view, and searching for fetal cells. Size and shape information is obtained by calculating a figure of merit through various binary operations and is used to discriminate fetal cells from noise and artifacts. Once the rare fetal cells are located, the slide is automatically rescanned to count the total number of cells on the slide. Binary operations and image processing hardware are used as much as possible to reduce the total amount of time to analyze one slide. Current runtime for scoring one full slide is about four hours, with motorized stage movement and focusing being the speed-limiting factors. Fetal cells occurring with a frequency of less than 1 in 200,000 maternal cells have been consistently found with this system.

  8. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

Solar power ramp events (SPREs) are those that significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to enhance the state of the art in SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.

  9. Information Systems Developments to Detect and Analyze Chemotherapy-associated Adverse Drug Events

    PubMed Central

    Weiner, Mark G.; Livshits, Alice; Carozzoni, Carol; McMenamin, Erin; Gibson, Gene; Loren, Alison W.; Hennessy, Sean

    2002-01-01

A difficult balance exists in the use of cancer chemotherapy in which the cytotoxic medicine must act on the cancer without causing neutropenic fever, a condition that is caused by over-suppression of the immune system. An improved understanding of dosing strategies as well as the use of medications to support the immune system has helped to reduce the likelihood of an admission for neutropenic fever following cancer chemotherapy. Therefore, as with any drug therapy, chemotherapy administration that is temporally associated with an unexpected hospitalization for neutropenia is an adverse drug event (ADE). Analogous to other informatics research to monitor and address the occurrence of ADEs, this work develops and validates the information systems infrastructure necessary to detect the occurrence of, and analyze the factors contributing to, chemotherapy-associated ADEs.

  10. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    NASA Astrophysics Data System (ADS)

    Shimojo, M.

    2012-08-01

    The high spatial and time resolution data obtained by the telescopes aboard Hinode revealed the new interesting dynamics in solar atmosphere. In order to detect such events and estimate the velocity of dynamics automatically, we examined the estimation methods of the optical flow based on the OpenCV that is the computer vision library. We applied the methods to the prominence eruption observed by NoRH, and the polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. It indicates that the optical flow estimation methods in the OpenCV library are very useful to analyze the solar phenomena.
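OpenCV ships dense optical-flow estimators (e.g. `cv2.calcOpticalFlowFarneback`) of the kind the abstract evaluates. To show the underlying idea without the library, here is a single-window Lucas-Kanade least-squares step; it assumes small, smooth motion and one dominant flow vector, and is far simpler than OpenCV's implementations:

```python
import numpy as np

def lucas_kanade_flow(f0, f1):
    """Estimate one global (u, v) flow vector between two small frames
    using the Lucas-Kanade least-squares step over the whole window.
    """
    f0 = np.asarray(f0, dtype=float)
    f1 = np.asarray(f1, dtype=float)
    Ix = np.gradient(f0, axis=1)   # spatial derivative along columns
    Iy = np.gradient(f0, axis=0)   # spatial derivative along rows
    It = f1 - f0                   # temporal derivative
    # Solve Ix*u + Iy*v = -It in the least-squares sense over all pixels.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

For a linear intensity ramp shifted one pixel to the right, the recovered flow is (1, 0), matching the brightness-constancy constraint.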

  11. Event detection and localization for small mobile robots using reservoir computing.

    PubMed

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system operating at the edge of stability, in which only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation, and is extended to robot localization tasks based solely on a few low-range, high-noise distance sensors. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated both in a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
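A minimal echo-state-network sketch of the RC recipe described above: a fixed random reservoir scaled to run near the edge of stability, with only a linear readout trained by ridge-regularized linear regression. The sizes, spectral radius, and the toy next-sample prediction task are illustrative assumptions, not the paper's robot setup:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 50
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))           # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))             # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))              # spectral radius 0.9

def run_reservoir(u):
    """Drive the fixed random reservoir with input sequence u; return states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout, by ridge regression (the RC recipe).
u_train = np.sin(np.arange(300) * 0.1)
y_train = np.sin((np.arange(300) + 1) * 0.1)           # predict the next sample
X = run_reservoir(u_train)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y_train)
pred = X @ W_out
```

Everything except `W_out` stays random and untrained, which is what makes RC training cheap enough for small robots.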

  12. DADA: Data Assimilation for the Detection and Attribution of Weather and Climate-related Events

    NASA Astrophysics Data System (ADS)

    Hannart, Alexis; Bocquet, Marc; Carrassi, Alberto; Ghil, Michael; Naveau, Philippe; Pulido, Manuel; Ruiz, Juan; Tandeo, Pierre

    2015-04-01

We describe a new approach allowing for near-real-time, systematic causal attribution of weather and climate-related events. The method is purposely designed to allow its operability at meteorological centers by synergizing causal attribution with data treatments that are routinely performed when numerically forecasting the weather, thereby taking advantage of their powerful computational and observational capacity. Namely, we show that causal attribution can be obtained as a by-product of the so-called data assimilation procedures that are run on a daily basis to update the meteorological model with new atmospheric observations. We explain the theoretical rationale of this approach and sketch the most prominent features of a "data assimilation-based detection and attribution" (DADA) procedure. The proposal is illustrated in the context of the 3-variable Lorenz model. Finally, we lay out several practical and theoretical research questions that need to be addressed to make the proposal readily operational within weather forecasting centers.
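The illustration in this abstract uses the 3-variable Lorenz model. For reference, a standard RK4 integration of that system; the classic parameter values are shown and the step size is an arbitrary choice, not taken from the paper:

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One RK4 step of the 3-variable Lorenz model."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x),
                         x * (rho - z) - y,
                         x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate a short trajectory from an arbitrary initial condition.
s = np.array([1.0, 1.0, 1.0])
traj = [s]
for _ in range(1000):
    s = lorenz63_step(s)
    traj.append(s)
traj = np.array(traj)
```

In a DADA-style experiment, synthetic observations of such a trajectory would be assimilated under competing "worlds" to compare their likelihoods.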

  13. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
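One of the simplest of the computer algorithms alluded to above is the proportional reporting ratio (PRR), a disproportionality measure over a 2x2 table of report counts. A minimal sketch; the screening threshold mentioned in the comment is a common convention in the field, not a value from this article:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for one drug-event pair.

    a: reports with the drug and the event    b: the drug, other events
    c: other drugs with the event             d: other drugs, other events
    Screens often flag pairs with PRR >= 2 (thresholds vary by programme).
    """
    return (a / (a + b)) / (c / (c + d))
```

For example, 20 event reports out of 100 for a drug, against 100 out of 9,900 for all other drugs, gives a PRR of 19.8, a strong disproportionality signal.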

  14. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    DTIC Science & Technology

    2004-01-01

(No abstract indexed; the record text consists of figure-caption fragments: the ROC curve of the "EWMA vector" classifier for the λ value of 0.04, based on the CHAID algorithm (Figure 5-7), and observations of the event intensity, k(i), from the training and testing data using the event intensity method (Figure 1-2).)

  15. Detecting binarity of GW150914-like lenses in gravitational microlensing events

    NASA Astrophysics Data System (ADS)

    Kesden, Michael; Eilbott, Daniel; Riley, Alexander; Cohn, Jonathan; King, Lindsay

    2017-01-01

The recent discovery of gravitational waves from stellar-mass binary black holes (BBHs) provided direct evidence of the existence of these systems. These BBHs would have gravitational microlensing signatures that are, due to their large masses and small separations, distinct from single-lens signals. We apply Bayesian statistics to examine the distinguishability of BBH microlensing events from single-lens events under ideal observing conditions, using modern photometric and astrometric capabilities. Given one year of ideal observations, a source star at the Galactic center, a GW150914-like BBH lens (total mass 65 solar masses, mass ratio 0.8) at half that distance, and an impact parameter of 0.4 Einstein radii, we find that BBHs with separations down to 0.00634 Einstein radii are detectable, marginally below the separation at which such systems would merge due to gravitational radiation within the age of the Universe. Supported by Alfred P. Sloan Foundation Grant No. RG-2015-65299 and NSF Grant No. PHY-1607031.

  16. Applying a New Event Detection Algorithm to an Ocean Bottom Seismometer Dataset Recorded Offshore Southern California

    NASA Astrophysics Data System (ADS)

    Bishop, J.; Kohler, M. D.; Bunn, J.; Chandy, K. M.

    2015-12-01

    A number of active southern California offshore faults are capable of M>6 earthquakes, and the only permanent Southern California Seismic Network stations that can contribute to ongoing, small-magnitude earthquake detection and location are those located on the coastline and islands. To obtain a more detailed picture of the seismicity of the region, an array of 34 ocean bottom seismometers (OBSs) was deployed to record continuous waveform data off the coast of Southern California for 12 months (2010-2011) as part of the ALBACORE (Asthenospheric and Lithospheric Broadband Architecture from the California Offshore Region Experiment) project. To obtain a local event catalog based on OBS data, we make use of a newly developed data processing platform based on Python. The data processing procedure comprises a multi-step analysis that starts with the identification of significant signals above the time-adjusted noise floor for each sensor. This is followed by a time-dependent statistical estimate of the likelihood of an earthquake based on the aggregated signals in the array. For periods with elevated event likelihood, an adaptive grid-fitting procedure is used that yields candidate earthquake hypocenters with confidence estimates that best match the observed sensor signals. The results are validated with synthetic travel times and manual picks. Using results from ALBACORE, we have created a more complete view of active faulting in the California Borderland.
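A toy version of the first two stages of the pipeline described above: per-station triggering against a noise-floor estimate, then array-level agreement. The median-based noise floor and the specific thresholds are assumptions for illustration, not the project's actual statistics:

```python
import numpy as np

def network_detection(traces, k_noise=3.0, min_stations=5):
    """Flag samples where a station exceeds its noise-floor estimate,
    then declare candidate events where enough stations agree.

    traces: (n_stations, n_samples) array of amplitude data.
    Returns a boolean array marking candidate-event samples.
    """
    traces = np.abs(np.asarray(traces, dtype=float))
    noise = np.median(traces, axis=1, keepdims=True)   # per-station noise floor
    flagged = traces > k_noise * noise                 # per-station triggers
    return flagged.sum(axis=0) >= min_stations         # array-level agreement
```

The real procedure uses a time-adjusted noise floor and a statistical event likelihood rather than a fixed station count, followed by grid-fitting for hypocenters.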

  17. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Banerjee, Torsha

The work proposes a scheme for Polynomial-based Event Region Detection (PERD) for WSNs. When a single event occurs, a child node of the tree sends a Flagged Polynomial (FP) to its parent if the readings it approximates fall outside the data range defining the existing phenomenon. After the aggregation process is over, the root, holding the two polynomials P and FP, can be queried for FP (approximating the new event region) instead of flooding the whole network. For multiple such events, instead of computing a polynomial for each new event, areas with the same data range are combined by the corresponding tree nodes and the aggregated coefficients are passed on. Results reveal that a new event can be detected by PERD while the detection error remains constant and below a threshold of 10%. As node density increases, the accuracy and delay of event detection remain almost constant, making PERD highly scalable. Whenever an event occurs in a WSN, data is generated by nearby sensors, and relaying the data to the base station (BS) makes sensors closer to the BS run out of energy at a much faster rate than sensors in other parts of the network. This gives rise to an unequal distribution of residual energy in the network and makes sensors with lower remaining energy die much faster than others. We propose a scheme for enhancing network lifetime using mobile cluster heads (CHs) in a WSN. To keep the remaining energy more evenly distributed, some energy-rich nodes are designated as CHs, which move in a controlled manner towards sensors rich in energy and data. This eliminates the multihop transmission required by static sensors and thus increases the overall lifetime of the WSN. We combine the ideas of clustering and mobile CHs to first form clusters of static sensor nodes; a collaborative strategy among the CHs further increases the lifetime of the network. The time taken to transmit data to the BS is reduced further by making the CHs follow a connectivity strategy that always maintains a connected path to the BS.
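The range-based flagging at the heart of PERD reduces to comparing readings against the data range of the known phenomenon; a minimal sketch (function names and the region representation are hypothetical, and the polynomial approximation of readings described in the abstract is omitted):

```python
def flag_new_event(readings, known_range):
    """Return readings that fall outside the data range defining the
    existing phenomenon; a child node would report these upward as a
    separate Flagged Polynomial (FP) approximation."""
    lo, hi = known_range
    return [r for r in readings if r < lo or r > hi]

def merge_regions(regions):
    """Combine event areas that share the same data range, so only one
    aggregate per range is passed up the tree (hypothetical helper)."""
    merged = {}
    for data_range, area in regions:
        merged.setdefault(data_range, []).append(area)
    return merged
```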

  18. Using the AHRQ PSIs to Detect Post-Discharge Adverse Events in the Veterans Health Administration

    PubMed Central

    Mull, Hillary J.; Borzecki, Ann M.; Chen, Qi; Shin, Marlena H.; Rosen, Amy K.

    2015-01-01

Background: Patient Safety Indicators (PSIs) use inpatient administrative data to flag cases with potentially preventable adverse events (AEs) attributable to hospital care. We explored how many AEs the PSIs identified in the 30 days post-discharge. Methods: We ran the PSI software (version 3.1a) on VA 2003–2007 administrative data for ten recently validated PSIs. Among PSI-eligible index hospitalizations not flagged with an AE, we evaluated how many AEs occurred within 1–14 and 15–30 days post-discharge using inpatient and outpatient administrative data. Results: Considering all PSI-eligible index hospitalizations, we identified 11,141 post-discharge AEs, compared to 40,578 inpatient-flagged AEs. More than 60% of post-discharge AEs were detected within 14 days of discharge. The majority of post-discharge AEs were decubitus ulcers and postoperative pulmonary embolisms or deep vein thromboses. Conclusions: Extending PSI algorithms to the post-discharge period may provide a more complete picture of hospital quality. Future work should use chart review to validate post-discharge PSI events. PMID:23939485
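The 1–14 and 15–30 day windows used above reduce to simple date arithmetic; a hedged sketch (the function name and return labels are illustrative, not part of the AHRQ PSI software's interface):

```python
from datetime import date

def postdischarge_window(discharge, ae_date):
    """Classify an adverse event by days elapsed since discharge,
    mirroring the 1-14 and 15-30 day windows used in the study."""
    days = (ae_date - discharge).days
    if 1 <= days <= 14:
        return "1-14 days"
    if 15 <= days <= 30:
        return "15-30 days"
    return None  # outside the 30-day post-discharge period
```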

  19. First neutrino event detection with nuclear emulsion at J-PARC neutrino beamline

    NASA Astrophysics Data System (ADS)

    Fukuda, T.; Aoki, S.; Cao, S.; Chikuma, N.; Fukuzawa, Y.; Gonin, M.; Hayashino, T.; Hayato, Y.; Hiramoto, A.; Hosomi, F.; Ishiguro, K.; Iori, S.; Inoh, T.; Kawahara, H.; Kim, H.; Kitagawa, N.; Koga, T.; Komatani, R.; Komatsu, M.; Matsushita, A.; Mikado, S.; Minamino, A.; Mizusawa, H.; Morishima, K.; Matsuo, T.; Matsumoto, T.; Morimoto, Y.; Morishita, M.; Nakamura, K.; Nakamura, M.; Nakamura, Y.; Naganawa, N.; Nakano, T.; Nakaya, T.; Nakatsuka, Y.; Nishio, A.; Ogawa, S.; Oshima, H.; Quilain, B.; Rokujo, H.; Sato, O.; Seiya, Y.; Shibuya, H.; Shiraishi, T.; Suzuki, Y.; Tada, S.; Takahashi, S.; Yamada, K.; Yoshimoto, M.; Yokoyama, M.

    2017-06-01

Precise neutrino-nucleus interaction measurements in the sub-multi-GeV region are important to reduce the systematic uncertainty in future neutrino oscillation experiments. Furthermore, an excess of ν_e interactions, which has been interpreted as possible evidence for a sterile neutrino, has been observed in this energy region. The nuclear emulsion technique can measure all the final-state particles with a low energy threshold for a variety of targets (Fe, C, H₂O, and so on). Its sub-μm position resolution allows measurements of the ν_e cross-section with good electron/gamma separation capability. We started a new experiment at J-PARC to study sub-multi-GeV neutrino interactions by introducing the nuclear emulsion technique. The J-PARC T60 experiment has been implemented as a first step in this project. Systematic neutrino event analysis with full scanning data in the nuclear emulsion detector was performed for the first time. The first neutrino event detection and its analysis are described in this paper.

  20. Detecting binarity of GW150914-like lenses in gravitational microlensing events

    NASA Astrophysics Data System (ADS)

    Eilbott, Daniel H.; Riley, Alexander H.; Cohn, Jonathan H.; Kesden, Michael; King, Lindsay J.

    2017-05-01

    The recent discovery of gravitational waves (GWs) from stellar-mass binary black holes (BBHs) provided direct evidence of the existence of these systems. BBH lenses would have gravitational microlensing signatures that are distinct from single-lens signals. We apply Bayesian statistics to examine the distinguishability of BBH microlensing events from single-lens events under ideal observing conditions, using the photometric capabilities of the Korean Microlensing Telescope Network. Given one year of observations, a source star at the Galactic Centre, a GW150914-like BBH lens (total mass 65 M⊙, mass ratio 0.8) at half that distance and an impact parameter of 0.4 Einstein radii, we find that binarity is detectable for BBHs with separations down to 0.0250 Einstein radii, which is nearly 3.5 times greater than the maximum separation for which such BBHs would merge within the age of the Universe. Microlensing searches are thus sensitive to more widely separated BBHs than GW searches, perhaps allowing the discovery of BBH populations produced in different channels of binary formation.
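The single-lens alternative against which binarity is tested follows the standard Paczyński light curve; a minimal sketch of that baseline model and a goodness-of-fit measure (the binary-lens light-curve computation and the Bayesian model-comparison machinery of the paper are omitted, and names are illustrative):

```python
import math

def point_lens_magnification(t, t0, tE, u0):
    """Standard point-lens magnification A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)),
    where u(t) is the lens-source separation in Einstein radii, t0 the time
    of closest approach, tE the Einstein crossing time, u0 the impact parameter."""
    u = math.hypot(u0, (t - t0) / tE)
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def chi_squared(model, data, sigma):
    """Goodness of fit; comparing this between single-lens and binary-lens
    models is the starting point for distinguishing the two hypotheses."""
    return sum(((m - d) / sigma) ** 2 for m, d in zip(model, data))
```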

  1. Optimization of a quantitative signal detection algorithm for spontaneous reports of adverse events post immunization.

    PubMed

    Van Holle, Lionel; Bauchau, Vincent

    2013-05-01

To optimize the efficiency of signal detection, we maximized the proportion of true positive (TP) signals among signals detected by a disproportionality algorithm. We compared 176 combinations of stratification factors (sex (S), age (A), region (R) and year of report (Y)) and cut-off values of a Multi-Item Gamma Poisson Shrinker (MGPS) algorithm. Spontaneous adverse event reports for eight vaccines from the GlaxoSmithKline Biologicals safety database were used. Defining events listed in the Product Information as a proxy for true safety signals, we compared each algorithm's performance in terms of positive predictive value (PPV). For each vaccine, each algorithm was ranked according to PPV; the median rank and overall PPV were computed across vaccines. For a standard cut-off of 2, the optimal stratification factors differed by vaccine and led to a set of algorithms with a median rank of 34.5 (PPV = 0.22; 34 TP). Keeping the original SARY stratification led to optimal cut-offs that differed by vaccine and a set of algorithms with a median rank of 1.75 (PPV = 0.20; 142 TP). The optimal combination of cut-off and stratification led to different algorithms by vaccine, with a median rank of 1 (PPV = 0.19; 139 TP). The best single algorithm parameterization across vaccines was 0.8-SARY (cut-off-stratification), with a median rank of 3 (PPV = 0.20; 195 TP). The original 2-SARY was one of the worst algorithms, with a median rank of 150.75 (PPV = 0.13; 8 TP). Within the scope of this study, a single MGPS algorithm across vaccines with the original full stratification but a lowered cut-off provided a major performance improvement. Copyright © 2012 John Wiley & Sons, Ltd.
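The quantities involved can be sketched in a few lines: the unshrunk observed-to-expected reporting ratio that MGPS shrinks toward 1 with a gamma-Poisson prior (the shrinkage itself is not reproduced here), and the PPV used to rank parameterizations. Function names are illustrative:

```python
def relative_reporting_ratio(n_both, n_event, n_vaccine, n_total):
    """Observed count of an event-vaccine pair divided by the count
    expected under independence. MGPS applies empirical-Bayes
    shrinkage to this ratio, which this sketch omits."""
    expected = n_event * n_vaccine / n_total
    return n_both / expected

def positive_predictive_value(true_positives, detected_signals):
    """Fraction of detected signals that are true positives: the
    criterion used to rank algorithm parameterizations."""
    return true_positives / detected_signals if detected_signals else 0.0
```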

  2. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2016-12-01

Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection; however, low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper presents a detection method for low-SNR microseismic events based on multi-scale permutation entropy and a support vector machine. First, a signal-feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal's permutation entropy. Second, a detection model for low-SNR microseismic events based on the least-squares support vector machine is built by computing the multi-scale permutation entropy of the collected vibration signals and constructing a feature vector set. Finally, a comparative analysis of the microseismic events and noise signals in the experiment shows that their differing characteristics are fully captured by multi-scale permutation entropy. The detection model combined with the support vector machine, which offers high classification accuracy and fast real-time performance, can meet the requirements of online, real-time extraction of microseismic events.
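The feature-extraction step can be sketched in pure Python: coarse-grain the signal at each scale, count ordinal patterns, and normalise the resulting entropy. This is a minimal illustration (the least-squares SVM that consumes the feature vectors is omitted, and parameter choices are illustrative):

```python
import itertools
import math

def permutation_entropy(signal, order=3, delay=1):
    """Normalised permutation entropy: count ordinal patterns of
    embedded windows and return Shannon entropy / log(order!)."""
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = signal[i:i + order * delay:delay]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(math.factorial(order))  # in [0, 1]

def multiscale_pe(signal, scales=(1, 2, 3), order=3):
    """Coarse-grain the signal by averaging non-overlapping windows of
    length s, then compute PE at each scale; the resulting vector is
    the feature set fed to the classifier."""
    features = []
    for s in scales:
        coarse = [sum(signal[i:i + s]) / s
                  for i in range(0, len(signal) - s + 1, s)]
        features.append(permutation_entropy(coarse, order))
    return features
```

A strictly monotonic signal yields a single ordinal pattern and therefore zero entropy, while an irregular signal approaches 1, which is the property the scale factor study exploits.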