Sample records for complex event detection

  1. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    PubMed Central

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
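
    The paper's two-test recipe can be sketched with scipy on a hypothetical 2×3 read-count table (rows: two samples; columns: poly(A) sites, proximal to distal). The counts and the Cochran-Armitage-style trend-test implementation below are illustrative, not the authors' pipeline; the example table shows a symmetric shift toward the middle site, a "complex" pattern with no net 3'-UTR length change.

```python
import numpy as np
from scipy.stats import chi2_contingency, norm

# Hypothetical read counts for one gene: rows = samples (normal, tumor),
# columns = poly(A) sites ordered proximal to distal along the 3'-UTR.
counts = np.array([[100,  50, 100],
                   [ 50, 150,  50]])

# Independence test: detects ANY change in site usage between samples.
chi2, p_indep, dof, _ = chi2_contingency(counts)

# Linear trend test (Cochran-Armitage style): sensitive only to a monotonic
# shift toward proximal or distal sites, i.e., a simple 3'-UTR length change.
def linear_trend_test(table, scores=None):
    table = np.asarray(table, dtype=float)
    if scores is None:
        scores = np.arange(table.shape[1])        # site positions as scores
    n = table.sum()
    row = table.sum(axis=1)
    col = table.sum(axis=0)
    t = (table[1] * scores).sum()                 # observed score total, row 2
    e = row[1] * (col * scores).sum() / n         # expected under no trend
    var = (row[0] * row[1] / (n - 1)) * (
        (col * scores**2).sum() / n - ((col * scores).sum() / n) ** 2)
    z = (t - e) / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                 # two-sided p-value

z, p_trend = linear_trend_test(counts)
print(f"independence p = {p_indep:.3g}, trend p = {p_trend:.3g}")
# Significant by independence but not by trend: a candidate complex
# switching event in the paper's classification.
```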

  2. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
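
    The first step the abstract describes, converting a raw numeric series into a sequence of time-interval "state" abstractions, can be sketched directly. The thresholds and state names below are hypothetical, not taken from the paper.

```python
# Temporal abstraction: map each reading to a state and merge consecutive
# readings in the same state into (state, start, end) intervals, the input
# representation on which temporal patterns are then mined backward in time.
def abstract_series(samples, low=4.0, high=7.0):
    """samples: list of (time, value); returns merged (state, start, end) intervals."""
    def state(v):
        return "low" if v < low else ("high" if v > high else "normal")
    intervals = []
    for t, v in samples:
        s = state(v)
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], t)   # extend current interval
        else:
            intervals.append((s, t, t))                # open a new interval
    return intervals

# Hypothetical blood-glucose-like readings as (hour, value) pairs.
readings = [(0, 5.1), (1, 5.4), (2, 7.8), (3, 8.2), (4, 6.0), (5, 3.2)]
print(abstract_series(readings))
# → [('normal', 0, 1), ('high', 2, 3), ('normal', 4, 4), ('low', 5, 5)]
```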

  3. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events, identified via the behaviour of IT systems, using Complex Event Processing. Furthermore, we map these events onto transition systems to monitor crucial clinical processes in real time, preventing and detecting erroneous situations.

  4. The analysis of a complex fire event using multispaceborne observations

    NASA Astrophysics Data System (ADS)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict zone in the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. Detection of the event by multiple sensors enabled a detailed characterization of the fires and comparison with different observational data.

  5. Complex Dynamic Scene Perception: Effects of Attentional Set on Perceiving Single and Multiple Event Types

    ERIC Educational Resources Information Center

    Sanocki, Thomas; Sulman, Noah

    2013-01-01

    Three experiments measured the efficiency of monitoring complex scenes composed of changing objects, or events. All events lasted about 4 s, but in a given block of trials, could be of a single type (single task) or of multiple types (multitask, with a total of four event types). Overall accuracy of detecting target events amid distractors was…

  6. Adaptive Self-Tuning Networks

    NASA Astrophysics Data System (ADS)

    Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.

    2015-12-01

    The quality of automatic detections from seismic sensor networks depends on a large number of data processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings. Yet, achieving superior automatic detection of seismic events is closely tied to these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historic data. AST learns to test the raw signal against all event settings and automatically self-tunes to an emerging event in real time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections. Reducing false alarms early in the seismic processing pipeline will have a significant impact on this goal. Applicable both to boosting the performance of existing sensors and to deploying new ones, this system provides an important new method to automatically tune complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible by manual tuning, with much less time and effort devoted to the tuning process. With ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
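
    The abstract describes learning detector parameters from historic, ground-truthed data. As a much simpler stand-in for the paper's neuro-dynamic programming (not the AST algorithm itself), the sketch below tunes a single hypothetical detection threshold by epsilon-greedy search, scoring each candidate by F1 against labeled events.

```python
import random

# Simplified stand-in for learned parameter tuning: epsilon-greedy search
# over candidate thresholds, rewarded by F1 score on historic labeled data.
# All scores, labels and candidate values below are synthetic.
def f1(threshold, scores, labels):
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def tune_threshold(scores, labels, candidates, steps=200, eps=0.2, seed=0):
    rng = random.Random(seed)
    value = {c: 0.0 for c in candidates}           # running value estimates
    for _ in range(steps):
        c = (rng.choice(candidates) if rng.random() < eps
             else max(candidates, key=value.get))  # explore vs exploit
        value[c] = f1(c, scores, labels)           # deterministic reward here
    return max(candidates, key=value.get)

# Synthetic detector scores: true events (label True) tend to score higher.
scores = [0.2, 0.9, 0.4, 1.5, 0.3, 2.0, 0.1, 1.2]
labels = [False, True, False, True, False, True, False, True]
best = tune_threshold(scores, labels, candidates=[0.5, 1.0, 1.5, 2.0])
print("best threshold:", best)
```

    In this toy setup a threshold of 0.5 separates events from non-events perfectly; real seismic tuning involves many interacting parameters and noisy rewards, which is what motivates the reinforcement-learning formulation.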

  7. Solid-state nanopore detection of protein complexes: applications in healthcare and protein kinetics.

    PubMed

    Freedman, Kevin J; Bastian, Arangassery R; Chaiken, Irwin; Kim, Min Jun

    2013-03-11

    Protein conjugation provides a unique look into many biological phenomena and has been used for decades for molecular recognition purposes. In this study, the use of solid-state nanopores for the detection of gp120-associated complexes is investigated. These complexes exhibit monovalent and multivalent binding to anti-gp120 antibody monomers and dimers. In order to investigate the feasibility of many practical applications related to nanopores, detection of specific protein complexes is attempted within a heterogeneous protein sample, and the role of voltage on complexed proteins is researched. It is found that the electric field within the pore can result in unbinding of a freely translocating protein complex within the transient event durations measured experimentally. The strong dependence of the unbinding time on voltage can be used to improve the detection capability of the nanopore system by adding an additional level of specificity that can be probed. These data provide a strong framework for future protein-specific detection schemes, which are shown to be feasible in the realm of a 'real-world' sample and an automated multidimensional method of detecting events. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993

  9. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectability: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on the observer, whose construction is of exponential complexity, while the new algorithms are based on a new automaton called the detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory of detectability of discrete event systems becomes more applicable to solving many practical problems. PMID:21691432
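
    The notion of detectability rests on propagating a state estimate through observed events: the system is detected once the estimate shrinks to a singleton. A minimal sketch on a toy nondeterministic automaton (this is the basic observer step, not the paper's polynomial detector construction; the automaton and event names are invented):

```python
# State estimation for a partially observed discrete event system:
# close the estimate under unobservable events, then step it through
# each observable event. Detectability holds when the estimate becomes
# a singleton set.
def unobservable_reach(states, trans, unobs):
    reach = set(states)
    frontier = list(states)
    while frontier:
        q = frontier.pop()
        for (src, ev), dsts in trans.items():
            if src == q and ev in unobs:
                for d in dsts:
                    if d not in reach:
                        reach.add(d)
                        frontier.append(d)
    return reach

def estimate(initial, obs_seq, trans, unobs):
    est = unobservable_reach(initial, trans, unobs)
    for ev in obs_seq:
        nxt = set()
        for q in est:
            nxt |= set(trans.get((q, ev), ()))
        est = unobservable_reach(nxt, trans, unobs)
    return est

# Toy nondeterministic system: 'u' is unobservable, 'a' and 'b' observable.
trans = {(0, 'u'): [1, 2], (1, 'a'): [1], (2, 'a'): [3], (3, 'b'): [3]}
unobs = {'u'}
print(estimate({0}, ['a'], trans, unobs))       # estimate still ambiguous
print(estimate({0}, ['a', 'b'], trans, unobs))  # singleton: state determined
```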

  10. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8×W1) ≥ W2 ≥ (2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
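
    The two-moving-averages idea can be sketched in a few lines: a short average at the event scale (W1) and a long one at the cycle scale (W2), with samples where the short average exceeds the long one marked as candidate event blocks. The window sizes and synthetic signal below are illustrative, not the paper's tuned values.

```python
import numpy as np

# Sketch of the TERMA core: blocks of interest are runs where the
# event-scale moving average rises above the cycle-scale moving average.
# Windows satisfy the abstract's constraint 8*W1 >= W2 >= 2*W1.
def terma_blocks(x, w1=3, w2=9):
    assert 2 * w1 <= w2 <= 8 * w1, "window sizes must satisfy 8*W1 >= W2 >= 2*W1"
    kernel = lambda w: np.ones(w) / w
    ma_event = np.convolve(x, kernel(w1), mode="same")   # first moving average
    ma_cycle = np.convolve(x, kernel(w2), mode="same")   # second moving average
    mask = ma_event > ma_cycle                           # blocks of interest
    # Collapse the boolean mask into (start, end) index pairs.
    blocks, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            blocks.append((start, i - 1)); start = None
    if start is not None:
        blocks.append((start, len(x) - 1))
    return blocks

# Synthetic signal: flat baseline with one peak-shaped event around index 10.
sig = np.zeros(20); sig[9:12] = [1.0, 3.0, 1.0]
print(terma_blocks(sig))     # → [(9, 11)]
```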

  11. Pre-trained D-CNN models for detecting complex events in unconstrained videos

    NASA Astrophysics Data System (ADS)

    Robinson, Joseph P.; Fu, Yun

    2016-05-01

    Rapid event detection faces an emergent need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by pre-existing methods being complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis of different Convolutional Neural Networks (CNNs), both stand-alone and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architecture, training data, and number of outputs. For each CNN, we use frames sampled at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some interesting facts, with some combinations proving complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
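
    The best-performing fusion described above (max-pool frame features within each shot, then average-pool across shots) is easy to sketch. The random features below stand in for real per-frame CNN activations; shot boundaries are assumed given.

```python
import numpy as np

# Video-level descriptor: max-pool frame features within pre-detected shot
# boundaries, then average-pool the shot vectors. Frame features here are
# random stand-ins for pre-trained CNN activations sampled at 1 fps.
def video_descriptor(frame_feats, shot_bounds):
    """frame_feats: (n_frames, dim); shot_bounds: [(start, end), ...] inclusive."""
    shot_vecs = [frame_feats[s:e + 1].max(axis=0) for s, e in shot_bounds]
    return np.mean(shot_vecs, axis=0)              # average-pool over shots

rng = np.random.default_rng(0)
feats = rng.random((10, 4))                        # 10 frames, 4-dim features
desc = video_descriptor(feats, [(0, 4), (5, 9)])   # two shots
print(desc.shape)                                  # (4,)
```

    The resulting fixed-length vector is what would feed the one-vs-rest SVM for each event.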

  12. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852

  13. Association rule mining in the US Vaccine Adverse Event Reporting System (VAERS).

    PubMed

    Wei, Lai; Scott, John

    2015-09-01

    Spontaneous adverse event reporting systems are critical tools for monitoring the safety of licensed medical products. Commonly used signal detection algorithms identify disproportionate product-adverse event pairs and may not be sensitive to more complex potential signals. We sought to develop a computationally tractable multivariate data-mining approach to identify product-multiple adverse event associations. We describe an application of stepwise association rule mining (Step-ARM) to detect potential vaccine-symptom group associations in the US Vaccine Adverse Event Reporting System. Step-ARM identifies strong associations between one vaccine and one or more adverse events. To reduce the number of redundant association rules found by Step-ARM, we also propose a clustering method for the post-processing of association rules. In sample applications to a trivalent intradermal inactivated influenza virus vaccine and to measles, mumps, rubella, and varicella (MMRV) vaccine and in simulation studies, we find that Step-ARM can detect a variety of medically coherent potential vaccine-symptom group signals efficiently. In the MMRV example, Step-ARM appears to outperform univariate methods in detecting a known safety signal. Our approach is sensitive to potentially complex signals, which may be particularly important when monitoring novel medical countermeasure products such as pandemic influenza vaccines. The post-processing clustering algorithm improves the applicability of the approach as a screening method to identify patterns that may merit further investigation. Copyright © 2015 John Wiley & Sons, Ltd.
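
    The flavor of one-vaccine-to-many-symptoms rules can be illustrated with generic support/confidence mining over toy reports. This is not the Step-ARM algorithm itself, and all vaccine names, symptoms and thresholds below are hypothetical.

```python
from itertools import combinations

# Toy spontaneous reports: each links one vaccine to a set of symptoms.
reports = [
    {"vaccine": "V1", "symptoms": {"fever", "rash"}},
    {"vaccine": "V1", "symptoms": {"fever", "rash", "fatigue"}},
    {"vaccine": "V1", "symptoms": {"headache"}},
    {"vaccine": "V2", "symptoms": {"fever"}},
]

def rules_for(vaccine, reports, min_conf=0.5, max_size=2):
    """Return symptom sets whose confidence P(symptoms | vaccine) >= min_conf."""
    mine = [r["symptoms"] for r in reports if r["vaccine"] == vaccine]
    all_symptoms = set().union(*(r["symptoms"] for r in reports))
    out = {}
    for k in range(1, max_size + 1):
        for combo in combinations(sorted(all_symptoms), k):
            support = sum(1 for s in mine if set(combo) <= s)
            conf = support / len(mine)
            if conf >= min_conf:
                out[combo] = conf
    return out

print(rules_for("V1", reports))
# ('fever',), ('rash',) and ('fever', 'rash') each reach confidence 2/3
```

    Methods like Step-ARM add a stepwise search and post-processing (the clustering described above) to keep the rule set from exploding with redundant variants of the same association.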

  14. Temporal Structure and Complexity Affect Audio-Visual Correspondence Detection

    PubMed Central

    Denison, Rachel N.; Driver, Jon; Ruff, Christian C.

    2013-01-01

    Synchrony between events in different senses has long been considered the critical temporal cue for multisensory integration. Here, using rapid streams of auditory and visual events, we demonstrate how humans can use temporal structure (rather than mere temporal coincidence) to detect multisensory relatedness. We find psychophysically that participants can detect matching auditory and visual streams via shared temporal structure for crossmodal lags of up to 200 ms. Performance on this task reproduced features of past findings based on explicit timing judgments but did not show any special advantage for perfectly synchronous streams. Importantly, the complexity of temporal patterns influences sensitivity to correspondence. Stochastic, irregular streams – with richer temporal pattern information – led to higher audio-visual matching sensitivity than predictable, rhythmic streams. Our results reveal that temporal structure and its complexity are key determinants for human detection of audio-visual correspondence. The distinctive emphasis of our new paradigms on temporal patterning could be useful for studying special populations with suspected abnormalities in audio-visual temporal perception and multisensory integration. PMID:23346067

  15. EzyAmp signal amplification cascade enables isothermal detection of nucleic acid and protein targets.

    PubMed

    Linardy, Evelyn M; Erskine, Simon M; Lima, Nicole E; Lonergan, Tina; Mokany, Elisa; Todd, Alison V

    2016-01-15

    Advancements in molecular biology have improved the ability to characterize disease-related nucleic acids and proteins. Recently, there has been an increasing desire for tests that can be performed outside of centralised laboratories. This study describes a novel isothermal signal amplification cascade called EzyAmp (enzymatic signal amplification) that is being developed for detection of targets at the point of care. EzyAmp exploits the ability of some restriction endonucleases to cleave substrates containing nicks within their recognition sites. EzyAmp uses two oligonucleotide duplexes (partial complexes 1 and 2) which are initially cleavage-resistant as they lack a complete recognition site. The recognition site of partial complex 1 can be completed by hybridization of a triggering oligonucleotide (Driver Fragment 1) that is generated by a target-specific initiation event. Binding of Driver Fragment 1 generates a completed complex 1, which, upon cleavage, releases Driver Fragment 2. In turn, binding of Driver Fragment 2 to partial complex 2 creates completed complex 2, which when cleaved releases additional Driver Fragment 1. Each cleavage event separates fluorophore-quencher pairs, resulting in an increase in fluorescence. At this stage a cascade of signal production becomes independent of further target-specific initiation events. This study demonstrated that the EzyAmp cascade can facilitate detection and quantification of nucleic acid targets with sensitivity down to aM concentrations. Further, the same cascade detected VEGF protein with a sensitivity of 20 nM, showing that this universal method for amplifying signal may be linked to the detection of different types of analytes in an isothermal format. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Impact Detection for Characterization of Complex Multiphase Flows

    NASA Astrophysics Data System (ADS)

    Chan, Wai Hong Ronald; Urzay, Javier; Mani, Ali; Moin, Parviz

    2016-11-01

    Multiphase flows often involve a wide range of impact events, such as liquid droplets impinging on a liquid pool or gas bubbles coalescing in a liquid medium. These events contribute to a myriad of large-scale phenomena, including breaking waves on ocean surfaces. As impacts between surfaces necessarily occur at isolated points, numerical simulations of impact events will require the resolution of molecular scales near the impact points for accurate modeling. This can be prohibitively expensive unless subgrid impact and breakup models are formulated to capture the effects of the interactions. The first step in a large-eddy simulation (LES) based computational methodology for complex multiphase flows like air-sea interactions requires effective detection of these impact events. The starting point of this work is a collision detection algorithm for structured grids on a coupled level set / volume of fluid (CLSVOF) solver adapted from an earlier algorithm for cloth animations that triangulates the interface with the marching cubes method. We explore the extension of collision detection to a geometric VOF solver and to unstructured grids. Supported by ONR/A*STAR. Agency of Science, Technology and Research, Singapore; Office of Naval Research, USA.

  17. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials.

    PubMed

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Incorrectly specifying the template outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. 
Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
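
    The core of the template-comparison idea is a sliding normalized cross-correlation: the best-matching lag gives the event latency, and a least-squares scale fit against the template gives its magnitude. The sketch below uses synthetic signals and illustrative parameters, not the authors' recordings or code.

```python
import numpy as np

# Slide a template along the recording, score each lag by normalized
# cross-correlation, and report latency (best lag), similarity score,
# and magnitude (least-squares scale of the template onto the segment).
def detect_event(signal, template):
    t = template - template.mean()
    best_lag, best_score = 0, -np.inf
    for lag in range(len(signal) - len(template) + 1):
        seg = signal[lag:lag + len(template)]
        s = seg - seg.mean()
        denom = np.linalg.norm(s) * np.linalg.norm(t)
        score = (s @ t) / denom if denom else 0.0
        if score > best_score:
            best_lag, best_score = lag, score
    seg = signal[best_lag:best_lag + len(template)]
    scale = (seg @ template) / (template @ template)   # magnitude estimate
    return best_lag, best_score, scale

template = np.array([0.0, 1.0, 2.0, 1.0, 0.0, -1.0, 0.0])
signal = np.zeros(40)
signal[15:22] = 1.5 * template                          # event at latency 15
signal += 0.05 * np.random.default_rng(1).standard_normal(40)  # noise
lag, score, scale = detect_event(signal, template)
print(lag, round(score, 2), round(scale, 2))
```

    Because the score is normalized, an imperfect template still localizes the event well, which mirrors the abstract's finding that amplitude estimates survive template misspecification better than latency estimates do.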

  18. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials

    PubMed Central

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Incorrectly specifying the template outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. 
Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291

  19. Security Event Recognition for Visual Surveillance

    NASA Astrophysics Data System (ADS)

    Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.

    2017-05-01

    With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are in demand across many practical applications. This paper proposes a novel, effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time. If anyone moves an object, that person is verified as its owner or not; if not, the event is further analyzed and classified into one of two scenes: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for more complex tasks. For comparison, experiments are also carried out on benchmark databases for the abandoned-luggage detection task. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.

  20. Automated X-ray Flare Detection with GOES, 2003-2017: The Where of the Flare Catalog and Early Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Loftus, K.; Saar, S. H.

    2017-12-01

    NOAA's Space Weather Prediction Center publishes the current definitive public soft X-ray flare catalog, derived using data from the X-ray Sensor (XRS) on the Geostationary Operational Environmental Satellites (GOES) series. However, this flare list has shortcomings for use in scientific analysis. Its detection algorithm has drawbacks (missing smaller flux events and poorly characterizing complex ones), and its event timing is imprecise (peak and end times are frequently marked incorrectly, and hence peak fluxes are underestimated). It also lacks explicit and regular spatial location data. We present a new database, "The Where of the Flare" catalog, which improves upon the precision of NOAA's current version, with more consistent and accurate spatial locations, timings, and peak fluxes. Our catalog also offers several new parameters per flare (e.g. background flux, integrated flux). We use data from the GOES Solar X-ray Imager (SXI) for spatial flare locating. Our detection algorithm is more sensitive to smaller flux events close to the background level and more precisely marks flare start/peak/end times so that integrated flux can be accurately calculated. It also decomposes complex events (with multiple overlapping flares) by constituent peaks. The catalog dates from the operation of the first SXI instrument in 2003 until the present. We give an overview of the detection algorithm's design, review the catalog's features, and discuss preliminary statistical analyses of light curve morphology, complex event decomposition, and integrated flux distribution. The Where of the Flare catalog will be useful in studying X-ray flare statistics and correlating X-ray flare properties with other observations. This work was supported by Contract #8100002705 from Lockheed-Martin to SAO in support of the science of NASA's IRIS mission.
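
    The catalog's detection step can be shown in miniature: estimate a slowly varying background, flag samples that rise a factor above it, and mark each event's start, peak and end so integrated flux can be computed. The running-median background, thresholds and toy light curve below are illustrative stand-ins, not the catalog's actual algorithm settings.

```python
import numpy as np

# Minimal flare detection on a 1-D light curve: a running-median background
# (robust to short flares), a multiplicative threshold, and per-event
# start/peak/end marking with background-subtracted integrated flux.
def detect_flares(flux, bg_window=11, factor=1.5):
    pad = bg_window // 2
    padded = np.pad(flux, pad, mode="edge")
    bg = np.array([np.median(padded[i:i + bg_window]) for i in range(len(flux))])
    above = flux > factor * bg
    events = []
    i = 0
    while i < len(flux):
        if above[i]:
            start = i
            while i < len(flux) and above[i]:
                i += 1
            end = i - 1
            peak = start + int(np.argmax(flux[start:end + 1]))
            integrated = float(np.sum(flux[start:end + 1] - bg[start:end + 1]))
            events.append({"start": start, "peak": peak, "end": end,
                           "integrated": integrated})
        else:
            i += 1
    return events

flux = np.full(60, 1.0)
flux[30:36] += np.array([2.0, 5.0, 8.0, 4.0, 2.0, 1.0])   # one synthetic flare
flares = detect_flares(flux)
print(flares)
```

    Decomposing complex events with overlapping flares, as the catalog does, would additionally split each above-threshold run at its interior local minima.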

  21. Discrepancy detection in the retrieval-enhanced suggestibility paradigm.

    PubMed

    Butler, Brendon Jerome; Loftus, Elizabeth F

    2018-04-01

    Retrieval-enhanced suggestibility (RES) refers to the finding that immediately recalling the details of a witnessed event can increase susceptibility to later misinformation. In three experiments, we sought to gain a deeper understanding of the role that retrieval plays in the RES paradigm. Consistent with past research, initial testing did increase susceptibility to misinformation - but only for those who failed to detect discrepancies between the original event and the post-event misinformation. In all three experiments, subjects who retrospectively detected discrepancies in the post-event narratives were more resistant to misinformation than those who did not. In Experiments 2 and 3, having subjects concurrently assess the consistency of the misinformation narratives negated the RES effect. Interestingly, in Experiments 2 and 3, subjects who had retrieval practice and detected discrepancies were more likely to endorse misinformation than control subjects who detected discrepancies. These results call attention to limiting conditions of the RES effect and highlight the complex relationship between retrieval practice, discrepancy detection, and misinformation endorsement.

  22. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it accounts for the various properties that reflect an event's status, the composite event is more consistent with the objective world, making its study more realistic. In this paper, we analyze the characteristics of composite events; we then propose a criterion to determine the area of a composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. When the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on event determination. The composite event judgment mechanism based on fuzzy decisions retains the advantages of fuzzy-logic-based algorithms; moreover, it does not need the support of a huge rule base, and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690
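
    The fuzzy-decision idea for a composite event can be illustrated with generic fuzzy logic (this is not the paper's τ-GAS algorithm): each sensed attribute gets a trapezoidal membership degree, and the degrees are fused with min as a fuzzy AND. The attributes and thresholds below are hypothetical.

```python
# Fuzzy composite-event decision: per-attribute trapezoidal memberships
# fused with min (fuzzy AND), then a crisp threshold on the fused degree.
def trapezoid(x, a, b, c, d):
    """Membership rising over [a,b], flat over [b,c], falling over [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def fire_event_degree(temp_c, smoke_ppm):
    hot = trapezoid(temp_c, 40, 60, 150, 200)       # hypothetical thresholds
    smoky = trapezoid(smoke_ppm, 50, 120, 500, 800)
    return min(hot, smoky)                          # fuzzy AND of both cues

degree = fire_event_degree(temp_c=70, smoke_ppm=130)
print(degree)             # both cues fully satisfied → 1.0
decision = degree >= 0.5  # crisp threshold on the fused degree
```

    Unlike a rule-base approach, this needs only one membership function per attribute, which is the "no huge rule base" advantage the abstract points to.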

  3. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is distinguishing such events from other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic and biological data. (3) Software development. I proposed the final outcome to be a suite of software tools implementing the mathematical models and the algorithms developed.

  4. Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data

    PubMed Central

    Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan

    2015-01-01

    With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
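
The detection system itself is not specified in the abstract; as a minimal illustration of flagging both extreme increases and decreases in per-site activity, one could score each site's current count against its own history (the site names, counts, and threshold below are hypothetical, not the authors' Rwanda pipeline).

```python
import statistics

def anomaly_scores(history, current):
    """z-score of the current count against each site's historical counts."""
    scores = {}
    for site, counts in history.items():
        mu = statistics.mean(counts)
        sigma = statistics.pstdev(counts) or 1.0  # guard against zero variance
        scores[site] = (current[site] - mu) / sigma
    return scores

def flag_anomalies(history, current, z_thresh=3.0):
    """Flag sites whose behaviour deviates strongly in either direction,
    capturing both surges and drops in calling or movement activity."""
    return {s: z for s, z in anomaly_scores(history, current).items()
            if abs(z) >= z_thresh}
```

A real system would additionally model daily and weekly seasonality before scoring, since holidays as well as disasters perturb the baseline.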

  5. Spontaneous brain activity as a source of ideal 1/f noise

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Menicucci, Danilo; Bedini, Remo; Fronzoni, Leone; Gemignani, Angelo; Grigolini, Paolo; West, Bruce J.; Paradisi, Paolo

    2009-12-01

    We study the electroencephalograms (EEGs) of 30 awake subjects with closed eyes, using a recently proposed technique of analysis that detects punctual events signaling rapid transitions between different metastable states. After single-EEG-channel event detection, we study the global properties of events occurring simultaneously at two or more electrodes, termed coincidences. We convert the coincidences into a diffusion process with three distinct rules that can yield the same μ only if the coincidences are driven by a renewal process. We establish that the time interval between two consecutive renewal events driving the coincidences has a waiting-time distribution with inverse power-law index μ ≈ 2, corresponding to ideal 1/f noise. We argue that this discovery, shared by all subjects in our study, supports the conviction that 1/f noise is an optimal communication channel for complex networks, as in art or language, and may therefore be the channel through which the brain influences, and is influenced by, complex processes.

  6. Homologous Recombination—Experimental Systems, Analysis and Significance

    PubMed Central

    Kuzminov, Andrei

    2014-01-01

    Homologous recombination is the most complex of all recombination events that shape genomes and produce material for evolution. Homologous recombination events are exchanges between DNA molecules in lengthy regions of shared identity, catalyzed by a group of dedicated enzymes. There is a variety of experimental systems in E. coli and Salmonella to detect homologous recombination events of several different kinds. Genetic analysis of homologous recombination reveals three separate phases of this process: pre-synapsis (the early phase), synapsis (homologous strand exchange) and post-synapsis (the late phase). In E. coli, there are at least two independent pathways of the early phase and at least two independent pathways of the late phase. All this complexity is incongruent with the originally ascribed role of homologous recombination as an accelerator of genome evolution: there is simply not enough duplication and repetition in enterobacterial genomes for homologous recombination to have a detectable evolutionary role, and therefore not enough selection to maintain such complexity. At the same time, the mechanisms of homologous recombination are uniquely suited for repair of complex DNA lesions called chromosomal lesions. In fact, the two major classes of chromosomal lesions are recognized and processed by the two individual pathways of the early phase of homologous recombination. It follows, therefore, that homologous recombination events are occasional reflections of continual recombinational repair, made possible in cases of natural or artificial genome redundancy. PMID:26442506

  7. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers shorter verification times when analyzing spectra with low total counts, especially for complex radionuclide compositions. In this paper, a simulation platform implementing the sequential Bayesian methodology was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
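
Candy's sequential Bayesian processor is considerably more elaborate (it operates on event-mode photon sequences), but the core idea of accumulating evidence observation-by-observation until a decision threshold is crossed can be illustrated with a Wald-style sequential test on Poisson counts; the rates and error levels below are assumptions for illustration, not the paper's detector parameters.

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test for Poisson counts:
    background rate lam0 vs source-present rate lam1.
    Returns ('source' | 'background' | 'undecided', steps used)."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for i, k in enumerate(counts, 1):
        # log-likelihood ratio of one Poisson observation (factorials cancel)
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "source", i
        if llr <= lower:
            return "background", i
    return "undecided", len(counts)
```

The sequential form is what shortens verification time: for strongly source-like or strongly background-like data the test terminates after very few intervals instead of waiting for a fixed measurement period.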

  8. Assessing the continuum of event-based biosurveillance through an operational lens.

    PubMed

    Corley, Courtney D; Lancaster, Mary J; Brigantic, Robert T; Chung, James S; Walters, Ronald A; Arthur, Ray R; Bruckner-Lea, Cynthia J; Calapristi, Augustin; Dowling, Glenn; Hartley, David M; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K; McKenzie, Taylor; Nelson, Noele P; Olsen, Jennifer; Pancerella, Carmen; Quitugua, Teresa N; Reed, Jeremy Todd; Thomas, Carla S

    2012-03-01

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have a significant impact on the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance technologies is unclear. This article frames the continuum of event-based biosurveillance systems (which fuse media reports from the internet), models (i.e., computational models that forecast disease occurrence), and constructs (i.e., descriptive analytical reports) through an operational lens (i.e., the aspects and attributes associated with operational considerations in the development, testing, and validation of event-based biosurveillance methods and models and their use in an operational environment). A workshop was held in 2010 to scientifically identify, develop, and vet a set of attributes for event-based biosurveillance. Subject matter experts were invited from 7 federal government agencies and 6 academic institutions pursuing research in biosurveillance event detection. We describe 8 attribute families for the characterization of event-based biosurveillance: event, readiness, operational aspects, geographic coverage, population coverage, input data, output, and cost. Ultimately, the analyses provide a framework for characterizing the broad scope, complexity, and issues germane to event-based biosurveillance in an operational environment.

  9. Tackle and impact detection in elite Australian football using wearable microsensor technology.

    PubMed

    Gastin, Paul B; McLean, Owen C; Breed, Ray V P; Spittle, Michael

    2014-01-01

    The effectiveness of a wearable microsensor device (MinimaxX(TM) S4, Catapult Innovations, Melbourne, VIC, Australia) to automatically detect tackles and impact events in elite Australian football (AF) was assessed during four matches. Video observation was used as the criterion measure. A total of 352 tackles were observed, with 78% correctly detected as tackles by the manufacturer's software. Tackles against (i.e. tackled by an opponent) were more accurately detected than tackles made (90% v 66%). Of the 77 tackles that were not detected at all, the majority (74%) were categorised as low-intensity. In contrast, a total of 1510 "tackle" events were detected, with only 18% of these verified as tackles. A further 57% were from contested ball situations involving player contact. The remaining 25% were in general play where no contact was evident; these were significantly lower in peak Player Load™ than those involving player contact (P < 0.01). The tackle detection algorithm, developed primarily for rugby, was not suitable for tackle detection in AF. The underlying sensor data may have the potential to detect a range of events within contact sports such as AF, yet to do so is a complex task and requires sophisticated sport and event-specific algorithms.

  10. Slow Earthquakes in the Microseism Frequency Band (0.1-1.0 Hz) off Kii Peninsula, Japan

    NASA Astrophysics Data System (ADS)

    Kaneko, Lisa; Ide, Satoshi; Nakano, Masaru

    2018-03-01

    It is difficult to detect the signal of slow deformation in the 0.1-1.0 Hz frequency band between tectonic tremors and very low frequency events, where microseism noise is dominant. Here we provide the first evidence of slow earthquakes in this microseism band, observed by the DONET1 ocean bottom seismometer network, after an Mw 5.8 earthquake off Kii Peninsula, Japan, on 1 April 2016. The signals in the microseism band were accompanied by signals from active tremors, very low frequency events, and slow slip events that radiated from the shallow plate interface. We report the detection and locations of events across five frequency bands, including the microseism band. The locations and timing of the events estimated in the different frequency bands are similar, suggesting that these signals radiated from a common source. The observed variations in detectability for each band highlight the complexity of the slow earthquake process.

  11. Patterns of precipitation and soil moisture extremes in Texas, US: A complex network analysis

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Xia, Youlong; Caldwell, Todd G.; Hao, Zengchao

    2018-02-01

    Understanding of the spatial and temporal dynamics of extreme precipitation not only improves prediction skills, but also helps to prioritize hazard mitigation efforts. This study seeks to enhance the understanding of spatiotemporal covariation patterns embedded in precipitation (P) and soil moisture (SM) by using an event-based, complex-network-theoretic approach. Event concurrences are quantified using a nonparametric event synchronization measure, and spatial patterns of hydroclimate variables are analyzed using several network measures and a community detection algorithm. SM-P coupling is examined using a directional event coincidence analysis measure that takes the order of event occurrences into account. The complex network approach is demonstrated for Texas, US, a region possessing a rich set of hydroclimate features that is frequented by catastrophic flooding. Gridded daily observed P data and simulated SM data are used to create complex networks of P and SM extremes. The uncovered high degree centrality regions and community structures are qualitatively in agreement with the overall existing knowledge of hydroclimate extremes in the study region. Our analyses provide new visual insights into the propagation, connectivity, and synchronicity of P extremes, as well as the SM-P coupling, in this flood-prone region, and can readily serve as a basis for event-driven predictive analytics for other regions.
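
The study uses a nonparametric event synchronization measure with an adaptive time lag; a simplified fixed-window variant (our own reduction, which omits the adaptive lag and the half-weight for simultaneous events) conveys the idea, with the directional count also serving as a crude analogue of directional event coincidence analysis.

```python
import math

def coincidences(a, b, tau):
    """Number of events in `a` occurring within `tau` AFTER an event in `b`
    (a directional count: how often b-events precede a-events)."""
    return sum(1 for t in a if any(0 < t - s <= tau for s in b))

def event_synchronization(a, b, tau):
    """Symmetric synchronization strength Q in [0, 1] between two
    non-empty event-time series (simplified fixed-window variant)."""
    c_ab, c_ba = coincidences(a, b, tau), coincidences(b, a, tau)
    return (c_ab + c_ba) / math.sqrt(len(a) * len(b))
```

Computing Q for every pair of grid cells yields the weighted network on which degree centrality and community detection are then run.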

  12. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  13. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  14. Ricin toxicokinetics and its sensitive detection in mouse sera or feces using immuno-PCR

    USDA-ARS?s Scientific Manuscript database

    Ricin (also called RCA-II or RCA60), one of the most potent toxins and documented bioweapons, is derived from castor beans of Ricinus communis. Several in vitro methods have been designed for ricin detection in complex food matrices in the event of intentional contamination. Recently, a novel Immuno...

  15. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field

    PubMed Central

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peak, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes’ coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672
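
The algorithm itself is decentralized and event-driven, but its basic primitive, classifying a node purely from qualitative comparisons with its neighbours and without any coordinate information, can be sketched in a centralized form (the graph and scalar values below are hypothetical; saddle/pass detection needs more neighbourhood structure and is omitted).

```python
def classify_node(values, neighbors, node):
    """Classify a node as 'peak', 'pit', or 'regular' using only
    qualitative comparisons with its neighbours (no coordinates)."""
    v = values[node]
    nb_vals = [values[n] for n in neighbors[node]]
    if all(v > x for x in nb_vals):
        return "peak"
    if all(v < x for x in nb_vals):
        return "pit"
    return "regular"
```

In the decentralized setting each node evaluates exactly this test from messages exchanged with its radio neighbours, which is why overall communication stays O(n), while events such as appearance or movement of a critical point correspond to this classification changing between rounds.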

  16. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field.

    PubMed

    Jeong, Myeong-Hun; Duckham, Matt

    2015-08-28

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peak, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes' coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks.

  17. Event detection and localization for small mobile robots using reservoir computing.

    PubMed

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
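
A minimal echo-state-style sketch of the RC principle described above (a fixed random recurrent network whose states feed a linear readout trained by least squares) is shown below; the reservoir size, weight scaling, and the 1-step-memory task are illustrative choices of ours, not the authors' robot setup.

```python
import math
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def run_reservoir(inputs, n_res=20, seed=1):
    """Drive a fixed, randomly created recurrent tanh network with a scalar
    input stream; the recurrent weights are never trained."""
    rng = random.Random(seed)
    w_in = [rng.uniform(-0.5, 0.5) for _ in range(n_res)]
    w = [[rng.uniform(-0.5, 0.5) for _ in range(n_res)] for _ in range(n_res)]
    scale = 0.9 / max(sum(abs(v) for v in row) for row in w)  # keep dynamics stable
    x = [0.0] * n_res
    states = []
    for u in inputs:
        x = [math.tanh(w_in[i] * u + scale * sum(w[i][j] * x[j] for j in range(n_res)))
             for i in range(n_res)]
        states.append([1.0] + x)  # leading 1.0 acts as a bias feature
    return states

def train_readout(states, targets, ridge=1e-8):
    """Ridge-regularized least-squares readout: the only trained component."""
    n = len(states[0])
    A = [[sum(s[i] * s[j] for s in states) + (ridge if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(s[i] * t for s, t in zip(states, targets)) for i in range(n)]
    return solve(A, b)
```

Training only the static linear readout is what makes RC cheap, and the reservoir's fading memory of past inputs is what lets a robot infer location from a stream of low-range, noisy distance readings.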

  18. Electrochemical DNA biosensor for detection of porcine oligonucleotides using a ruthenium(II) complex as a redox intercalator label

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halid, Nurul Izni Abdullah; Hasbullah, Siti Aishah; Heng, Lee Yook

    2014-09-03

    A DNA biosensor for the detection of porcine oligonucleotides, based on electrochemical transduction of the interaction between porcine DNA and a redox-active complex, is described. A ruthenium(II) complex, [Ru(bpy)₂(PIP)]²⁺ (bpy = 2,2′-bipyridine, PIP = 2-phenylimidazo[4,5-f][1,10-phenanthroline]), was synthesized as a DNA label and characterized by ¹H NMR and mass spectrometry. The study was carried out by covalent immobilization of aminated porcine DNA probe sequences on a screen-printed electrode (SPE) modified with succinimide-acrylic microspheres, with [Ru(bpy)₂(PIP)]²⁺ used as an electrochemical redox intercalator label to detect the DNA hybridization event. Electrochemical detection was performed by cyclic voltammetry (CV) and differential pulse voltammetry (DPV) over the potential range in which the ruthenium(II) complex is active. The results indicate that the interaction of [Ru(bpy)₂(PIP)]²⁺ with hybridized complementary DNA gives a higher response than with single-stranded or mismatched complementary DNA.

  19. Method and apparatus for distinguishing actual sparse events from sparse event false alarms

    DOEpatents

    Spalding, Richard E.; Grotbeck, Carter L.

    2000-01-01

    Remote sensing method and apparatus wherein sparse optical events are distinguished from false events. "Ghost" images of actual optical phenomena are generated using an optical beam splitter and optics configured to direct the split beams to a single sensor or segmented sensor. True optical signals are distinguished from false signals or noise based on whether the ghost image is present or absent. The invention obviates the need for dual-sensor systems to effect a false-target detection capability, thus significantly reducing system complexity and cost.

  20. Detection of timescales in evolving complex systems

    PubMed Central

    Darst, Richard K.; Granell, Clara; Arenas, Alex; Gómez, Sergio; Saramäki, Jari; Fortunato, Santo

    2016-01-01

    Most complex systems are intrinsically dynamic in nature. The evolution of a dynamic complex system is typically represented as a sequence of snapshots, where each snapshot describes the configuration of the system at a particular instant of time. This is often done by using constant intervals but a better approach would be to define dynamic intervals that match the evolution of the system’s configuration. To this end, we propose a method that aims at detecting evolutionary changes in the configuration of a complex system, and generates intervals accordingly. We show that evolutionary timescales can be identified by looking for peaks in the similarity between the sets of events on consecutive time intervals of data. Tests on simple toy models reveal that the technique is able to detect evolutionary timescales of time-varying data both when the evolution is smooth as well as when it changes sharply. This is further corroborated by analyses of several real datasets. Our method is scalable to extremely large datasets and is computationally efficient. This allows a quick, parameter-free detection of multiple timescales in the evolution of a complex system. PMID:28004820
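
The method's core signal, the similarity between event sets on consecutive time intervals, can be sketched with Jaccard similarity; the dip-based boundary rule and threshold below are our own simplification for illustration, not the authors' parameter-free algorithm.

```python
def jaccard(a, b):
    """Similarity of two event sets, in [0, 1]."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def interval_boundaries(event_sets, threshold=0.5):
    """Mark snapshot indices where consecutive event sets diverge sharply,
    yielding dynamic intervals instead of constant-width ones."""
    return [i for i in range(1, len(event_sets))
            if jaccard(event_sets[i - 1], event_sets[i]) < threshold]
```

Snapshots between two detected boundaries share most of their events, so the spans between boundaries approximate the evolutionary timescales of the system.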

  1. Adverse drug events with hyperkalaemia during inpatient stays: evaluation of an automated method for retrospective detection in hospital databases

    PubMed Central

    2014-01-01

    Background Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. Methods We used a set of complex detection rules to take account of the patient’s clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules’ analytical quality was evaluated for ADEs. Results In terms of recall, 89.5% of ADEs with hyperkalaemia “with or without an abnormal symptom” were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. Conclusions The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases. PMID:25212108

  2. Adverse drug events with hyperkalaemia during inpatient stays: evaluation of an automated method for retrospective detection in hospital databases.

    PubMed

    Ficheur, Grégoire; Chazard, Emmanuel; Beuscart, Jean-Baptiste; Merlin, Béatrice; Luyckx, Michel; Beuscart, Régis

    2014-09-12

    Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules' analytical quality was evaluated for ADEs. In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases.
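
The study's rule base encodes far more clinical and biological context, but the flavour of a context-sensitive, chronology-aware detection rule can be sketched as follows; the 5.5 mmol/L cutoff is a common clinical definition of hyperkalaemia, and the drug list is an illustrative assumption rather than the paper's actual rules.

```python
HYPERKALAEMIA_MMOL_L = 5.5  # common clinical threshold (illustrative here)

def detect_ade(stay):
    """stay: {'drugs': [(time, name)], 'labs': [(time, potassium_mmol_l)]}.
    Flag a stay when a potassium-raising drug was administered BEFORE a
    hyperkalaemic lab value, i.e. the chronology supports causality."""
    k_raising = {"spironolactone", "potassium chloride", "ramipril"}  # illustrative
    exposures = [t for t, name in stay["drugs"] if name in k_raising]
    for t_lab, k in stay["labs"]:
        if k >= HYPERKALAEMIA_MMOL_L and any(t_drug < t_lab for t_drug in exposures):
            return True
    return False
```

Requiring the exposure to precede the abnormal value is what distinguishes a candidate ADE from a mere co-occurrence, and it is this kind of chronological constraint that improves precision over naive value-threshold screening.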

  3. Optimised padlock probe ligation and microarray detection of multiple (non-authorised) GMOs in a single reaction

    PubMed Central

    Prins, Theo W; van Dijk, Jeroen P; Beenen, Henriek G; Van Hoef, AM Angeline; Voorhuijzen, Marleen M; Schoen, Cor D; Aarts, Henk JM; Kok, Esther J

    2008-01-01

    Background To maintain EU GMO regulations, producers of new GM crop varieties need to supply an event-specific method for the new variety. As a result, methods are nowadays available for EU-authorised genetically modified organisms (GMOs), but only to a limited extent for EU-non-authorised GMOs (NAGs). In the last decade the diversity of genetically modified (GM) ingredients in food and feed has increased significantly. As a result of this increase, GMO laboratories currently need to apply many different methods to establish the potential presence of NAGs in raw materials and complex derived products. Results In this paper we present an innovative method for detecting (approved) GMOs as well as the potential presence of NAGs in complex DNA samples containing different crop species. An optimised protocol has been developed for padlock probe ligation in combination with microarray detection (PPLMD) that can easily be scaled up. Linear padlock probes targeted against GMO-events, -elements and -species have been developed that can hybridise to their genomic target DNA and are visualised using microarray hybridisation. In a tenplex PPLMD experiment, different genomic targets in Roundup-Ready soya, MON1445 cotton and Bt176 maize were detected down to at least 1%. In single experiments, the targets were detected down to 0.1%, i.e. comparable to standard qPCR. Conclusion Compared to currently available methods this is a significant step forward towards multiplex detection in complex raw materials and derived products. It is shown that the PPLMD approach is suitable for large-scale detection of GMOs in real-life samples and provides the possibility to detect and/or identify NAGs that would otherwise remain undetected. PMID:19055784

  4. Optimised padlock probe ligation and microarray detection of multiple (non-authorised) GMOs in a single reaction.

    PubMed

    Prins, Theo W; van Dijk, Jeroen P; Beenen, Henriek G; Van Hoef, Am Angeline; Voorhuijzen, Marleen M; Schoen, Cor D; Aarts, Henk J M; Kok, Esther J

    2008-12-04

    To maintain EU GMO regulations, producers of new GM crop varieties need to supply an event-specific method for the new variety. As a result, methods are nowadays available for EU-authorised genetically modified organisms (GMOs), but only to a limited extent for EU-non-authorised GMOs (NAGs). In the last decade the diversity of genetically modified (GM) ingredients in food and feed has increased significantly. As a result of this increase, GMO laboratories currently need to apply many different methods to establish the potential presence of NAGs in raw materials and complex derived products. In this paper we present an innovative method for detecting (approved) GMOs as well as the potential presence of NAGs in complex DNA samples containing different crop species. An optimised protocol has been developed for padlock probe ligation in combination with microarray detection (PPLMD) that can easily be scaled up. Linear padlock probes targeted against GMO-events, -elements and -species have been developed that can hybridise to their genomic target DNA and are visualised using microarray hybridisation. In a tenplex PPLMD experiment, different genomic targets in Roundup-Ready soya, MON1445 cotton and Bt176 maize were detected down to at least 1%. In single experiments, the targets were detected down to 0.1%, i.e. comparable to standard qPCR. Compared to currently available methods this is a significant step forward towards multiplex detection in complex raw materials and derived products. It is shown that the PPLMD approach is suitable for large-scale detection of GMOs in real-life samples and provides the possibility to detect and/or identify NAGs that would otherwise remain undetected.

  5. Boosting-Based Optimization as a Generic Framework for Novelty and Fraud Detection in Complex Strategies

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, Valeriy V.; Kovbasinskaya, Maria; Monina, Maria

    2008-11-01

    Novelty detection is a highly desirable additional feature of any practical classification or forecasting system. Novelty and rare-pattern detection is the main objective in applications such as fault/abnormality discovery in complex technical and biological systems, fraud detection, and risk management in the financial and insurance industries. Although many interdisciplinary approaches for rare-event modeling and novelty detection have been proposed, significant data incompleteness, inherent to the problem, makes it difficult to find a universal solution. An even more challenging and much less formalized problem is novelty detection in complex strategies and models, where practical performance criteria are usually multi-objective and the best state-of-the-art solution is often not known due to the complexity of the task and/or the proprietary nature of the application area. For example, it is much more difficult to detect a series of small insider-trading or other illegal transactions mixed with valid operations and distributed over a long time period according to a well-designed strategy than a single, large fraudulent transaction. Recently proposed boosting-based optimization was shown to be an effective generic tool for discovering stable multi-component strategies/models from existing parsimonious base strategies/models in financial and other applications. Here we outline how the same framework can be used for novelty and fraud detection in complex strategies and models.

  6. Lifting Events in RDF from Interactions with Annotated Web Pages

    NASA Astrophysics Data System (ADS)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages, which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen, e.g., when a user interacts with a Web page. Events are consumed in some meaningful way, e.g., for monitoring or to trigger actions such as responses. So that receiving parties can understand events, e.g., comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients, helping to make the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. We also show how the contents of events are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and thereby supporting advertisement providers in responding with more relevant advertisements.

  7. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., an expected normal pattern or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
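    The clustering-based, label-free detection step can be approximated with a simple distance-based outlier rule: flag points whose distance to their k-th nearest neighbour is far above typical. The sketch below is a minimal illustration of that idea, not the authors' algorithm; the thresholds and the synthetic two-cluster data are assumptions:

```python
import numpy as np

def knn_outliers(X, k=5, z=4.0):
    """Unsupervised outlier flags: distance to the k-th nearest neighbour,
    thresholded at mean + z standard deviations of that score."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    D.sort(axis=1)
    score = D[:, k]                 # D[:, 0] is the point's distance to itself
    return score > score.mean() + z * score.std()

# synthetic "satellite pixels": two dense clusters plus two injected anomalies
rng = np.random.default_rng(1)
normal = np.vstack([rng.normal(0, 0.3, (200, 2)),
                    rng.normal(5, 0.3, (200, 2))])
X = np.vstack([normal, [[20.0, 20.0], [-15.0, 12.0]]])
flags = knn_outliers(X)
print(flags.sum())                  # the two injected points are flagged
```

In a spatio-temporal setting, flagged points that are adjacent in space and time would then be grouped into candidate events and ranked, e.g. by outlier count.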

  8. An automated approach towards detecting complex behaviours in deep brain oscillations.

    PubMed

    Mace, Michael; Yousif, Nada; Naushahi, Mohammad; Abdullah-Al-Mamun, Khondaker; Wang, Shouyan; Nandi, Dipankar; Vaidyanathan, Ravi

    2014-03-15

    Extracting event-related potentials (ERPs) from neurological rhythms is of fundamental importance in neuroscience research. Standard ERP techniques typically require the associated ERP waveform to have low variance and to be shape- and latency-invariant, and they require many repeated trials. Additionally, the non-ERP part of the signal needs to be sampled from an uncorrelated Gaussian process. This limits such methods to quantifying simple behaviours and movements, and only when multi-trial datasets are available. We introduce a method for automatically detecting events associated with complex or large-scale behaviours, where the ERP need not conform to the aforementioned requirements. The algorithm is based on the calculation of a detection contour and an adaptive threshold. These are combined using logical operations to produce a binary signal indicating the presence (or absence) of an event, with the associated detection parameters tuned using a multi-objective genetic algorithm. To validate the proposed methodology, deep brain signals were recorded from implanted electrodes in patients with Parkinson's disease as they participated in a large movement-based behavioural paradigm. The experiment involved bilateral recordings of local field potentials from the sub-thalamic nucleus (STN) and pedunculopontine nucleus (PPN) during an orientation task. After tuning, the algorithm extracts events with training-set sensitivities and specificities of [87.5 ± 6.5, 76.7 ± 12.8, 90.0 ± 4.1] and [92.6 ± 6.3, 86.0 ± 9.0, 29.8 ± 12.3] (mean ± 1 std) for the three subjects, averaged across the four neural sites. Furthermore, the methodology has potential for real-time applications, as only a single-trial ERP is required. Copyright © 2013 Elsevier B.V. All rights reserved.
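    The detector's structure, a detection contour compared against an adaptive threshold and combined into a binary signal, can be sketched as follows. The window lengths, the threshold multiplier, and the synthetic single-trial signal are illustrative assumptions, and the multi-objective genetic-algorithm tuning is omitted:

```python
import numpy as np

def binary_detection(x, fs, contour_win=0.2, base_win=1.0, k=3.0):
    """Detection contour (short-window energy) vs. adaptive threshold
    (long-window baseline energy scaled by k); output is a binary signal."""
    n = max(1, int(contour_win * fs))
    m = max(1, int(base_win * fs))
    contour = np.convolve(x ** 2, np.ones(n) / n, mode="same")
    baseline = np.convolve(x ** 2, np.ones(m) / m, mode="same")
    return contour > k * baseline      # logical comparison -> binary signal

fs = 1000
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(3 * fs)            # background activity
t = np.arange(100) / fs
x[1500:1600] += np.sin(2 * np.pi * 20 * t)       # single-trial "event"
mask = binary_detection(x, fs)                   # True where an event is present
```

In the paper the free parameters of the contour and threshold are what the genetic algorithm tunes against the sensitivity/specificity objectives.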

  9. Helmet-mounted acoustic array for hostile fire detection and localization in an urban environment

    NASA Astrophysics Data System (ADS)

    Scanlon, Michael V.

    2008-04-01

    The detection and localization of hostile weapons fire has been demonstrated successfully with acoustic sensor arrays on unattended ground sensors (UGS), ground vehicles, and unmanned aerial vehicles (UAVs). Some of the more mature systems have demonstrated significant capabilities and provide direct support to ongoing counter-sniper operations. The Army Research Laboratory (ARL) is conducting research and development on a helmet-mounted system to acoustically detect and localize small-arms fire and other events such as RPGs, mortars, and explosions, as well as other non-transient signatures. Since today's soldier is increasingly being asked to take on more reconnaissance, surveillance, and target acquisition (RSTA) functions, sensor augmentation enables him to become a mobile, networked sensor node on the complex and dynamic battlefield. Having a body-worn capability to detect and localize events that pose an immediate danger to the soldiers around him can significantly enhance their survivability and lethality, and enable him to provide and use situational-awareness clues on the networked battlefield. This paper addresses some of the difficulties encountered by an acoustic system in an urban environment. Complex reverberation, multipath, diffraction, and signature masking by building structures make this a very harsh environment for robust detection and classification of shockwaves and muzzle blasts. Multifunctional acoustic detection arrays can provide persistent surveillance and enhanced situational awareness for every soldier.

  10. Measurement of the {sup 12}C({alpha},{gamma}){sup 16}O reaction at TRIAC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makii, H.; Miyatake, H.; Wakabayashi, Y.

    2012-11-12

    We have measured the {gamma}-ray angular distribution of the {sup 12}C({alpha},{gamma}){sup 16}O reaction at TRIAC (Tokai Radioactive Ion Accelerator Complex) to accurately determine the E1 and E2 cross sections. In this experiment, we used high-efficiency anti-Compton NaI(Tl) spectrometers to detect {gamma}-rays from the reaction with a large S/N ratio, and intense pulsed {alpha}-beams to discriminate true events from background events due to neutrons from the {sup 13}C({alpha},n){sup 16}O reaction with a time-of-flight (TOF) method. We succeeded in removing the background events due to neutrons and clearly detected {gamma}-rays from the {sup 12}C({alpha},{gamma}){sup 16}O reaction with high statistics.

  11. Long-period Seismicity at the Napoleonville Salt Dome: Implications for Local Seismic Monitoring of Underground Hydrocarbon Storage Caverns

    NASA Astrophysics Data System (ADS)

    Dreger, D. S.; Ford, S. R.; Nayak, A.

    2015-12-01

    The formation of a large sinkhole at the Napoleonville salt dome, Assumption Parish, Louisiana, in August 2012 was accompanied by a rich sequence of complex seismic events, including long-period (LP) events that were recorded 11 km away at Transportable Array station 544A in White Castle, Louisiana. The LP events have relatively little energy at short periods, which makes them difficult to detect using standard high-frequency power detectors, and the majority of the energy that reaches the station is peaked near 0.4 Hz. Analysis of the local records reveals that the onset of the 0.4 Hz signal coincides with the S-wave arrival; it may therefore be a shaking-induced resonance in a fluid-filled cavern. We created a low-frequency (0.1-0.6 Hz) power detector (short-term average / long-term average) that operated on all three components of the broadband instrument, since considerable energy was detected on the horizontal components. The detections from the power detector were then used as templates in three-channel correlation detectors, increasing the number of detections by a little more than a factor of two, to nearly 3000. The rate of LP events is approximately one event every other day at the beginning of recording in March 2011. Around 2 May 2012 the rate changes to approximately 7 events per day, and it increases to 25 events per day at the beginning of July 2012. Finally, in the days leading up to the sinkhole formation there are approximately 200 LP events per day. The analysis of these events could aid in the development of local seismic monitoring methods for underground industrial storage caverns. Prepared by LLNL under Contract DE-AC52-07NA27344.
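    A causal STA/LTA power detector of the kind described can be sketched in a few lines of Python. The window lengths, trigger ratio, sampling rate, and synthetic 0.4 Hz event below are illustrative assumptions; a real implementation would first band-pass the records to 0.1-0.6 Hz and run on all three components:

```python
import numpy as np

def sta_lta(x, sta_n, lta_n):
    """Causal STA/LTA ratio: a short window of recent power divided by
    the long window of power immediately preceding it."""
    p = np.concatenate([[0.0], np.cumsum(x ** 2)])   # running energy
    r = np.zeros(x.size)
    for i in range(sta_n + lta_n, x.size):
        sta = (p[i] - p[i - sta_n]) / sta_n
        lta = (p[i - sta_n] - p[i - sta_n - lta_n]) / lta_n
        r[i] = sta / max(lta, 1e-12)
    return r

fs = 50                                    # Hz (assumed sample rate)
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(60 * fs)     # one minute of background noise
onset = 30 * fs
t = np.arange(5 * fs) / fs
x[onset:onset + 5 * fs] += np.sin(2 * np.pi * 0.4 * t)   # 0.4 Hz LP "event"
ratio = sta_lta(x, sta_n=int(2.5 * fs), lta_n=10 * fs)
triggers = ratio > 5.0                     # detection flags
```

Segments flagged this way are what would then serve as templates for the correlation detectors.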

  12. Conversion events in gene clusters

    PubMed Central

    2011-01-01

    Background Gene clusters containing multiple similar genomic regions in close proximity are of great interest for biomedical studies because of their associations with inherited diseases. However, such regions are difficult to analyze due to their structural complexity and their complicated evolutionary histories, reflecting a variety of large-scale mutational events. In particular, conversion events can mislead inferences about the relationships among these regions, as traced by traditional methods such as construction of phylogenetic trees or multi-species alignments. Results To correct the distorted information generated by such methods, we have developed an automated pipeline called CHAP (Cluster History Analysis Package) for detecting conversion events. We used this pipeline to analyze the conversion events that affected two well-studied gene clusters (α-globin and β-globin) and three gene clusters for which comparative sequence data were generated from seven primate species: CCL (chemokine ligand), IFN (interferon), and CYP2abf (part of cytochrome P450 family 2). CHAP is freely available at http://www.bx.psu.edu/miller_lab. Conclusions These studies reveal the value of characterizing conversion events in the context of studying gene clusters in complex genomes. PMID:21798034

  13. Characterizing Micro- and Macro-Scale Seismicity from Bayou Corne, Louisiana

    NASA Astrophysics Data System (ADS)

    Baig, A. M.; Urbancic, T.; Karimi, S.

    2013-12-01

    The initiation of felt seismicity in Bayou Corne, Louisiana, coupled with other phenomena detected by residents of the nearby housing development, prompted a call to install a broadband seismic network to monitor subsurface deformation. The initial deployment was in place to characterize the deformation contemporaneous with the formation of a sinkhole located in close proximity to a salt dome. Seismic events generated during this period followed swarm-like behaviour, with moment magnitudes culminating around Mw 2.5. However, the seismic data recorded during this sequence suffer from poor signal-to-noise ratios, onsets that are very difficult to pick, and the presence of a significant amount of energy arriving later in the waveforms. Efforts to understand the complexity of these waveforms are ongoing and involve the complexities inherent in recording in a highly attenuating swamp overlying a complex three-dimensional structure with the strong material-property contrast of the salt dome. In order to understand the event character, as well as to locally lower the completeness threshold of the sequence, a downhole array of 15 Hz sensors was deployed in a newly drilled well around the salt dome. Although the deployment lasted a little over a month, over 1000 events were detected, down to moment magnitude Mw -3. Waveform quality tended to be excellent, with very distinct P- and S-wave arrivals observable across the array for most events. The highest-magnitude events were also seen on the surface network, which allowed us to observe the complexities introduced by site effects while overcoming the saturation effects on the higher-frequency downhole geophones. This hybrid downhole and surface array illustrates how a full picture of subsurface deformation is only made possible by combining high-frequency downhole instrumentation, to see the microseismicity, with a broadband array, to accurately characterize the source parameters of the larger-magnitude events. Our presentation focuses on investigating this deformation, characterizing the scaling behaviour and other source processes by taking advantage of the wide band afforded to us through the deployment.

  14. The subclonal complexity of STIL-TAL1+ T-cell acute lymphoblastic leukaemia.

    PubMed

    Furness, Caroline L; Mansur, Marcela B; Weston, Victoria J; Ermini, Luca; van Delft, Frederik W; Jenkinson, Sarah; Gale, Rosemary; Harrison, Christine J; Pombo-de-Oliveira, Maria S; Sanchez-Martin, Marta; Ferrando, Adolfo A; Kearns, Pamela; Titley, Ian; Ford, Anthony M; Potter, Nicola E; Greaves, Mel

    2018-03-20

    Single-cell genetics were used to interrogate clonal complexity and the sequence of mutational events in STIL-TAL1+ T-ALL. Single-cell multicolour FISH was used to demonstrate that the earliest detectable leukaemia subclone contained the STIL-TAL1 fusion and copy number loss of 9p21.3 (CDKN2A/CDKN2B locus), with other copy number alterations including loss of PTEN occurring as secondary subclonal events. In three cases, multiplex qPCR and phylogenetic analysis were used to produce branching evolutionary trees recapitulating the snapshot history of T-ALL evolution in this leukaemia subtype, which confirmed that mutations in key T-ALL drivers, including NOTCH1 and PTEN, were subclonal and reiterative in distinct subclones. Xenografting confirmed that self-renewing or propagating cells were genetically diverse. These data suggest that the STIL-TAL1 fusion is a likely founder or truncal event. Therapies targeting the TAL1 auto-regulatory complex are worthy of further investigation in T-ALL.

  15. Electro-optical muzzle flash detection

    NASA Astrophysics Data System (ADS)

    Krieg, Jürgen; Eisele, Christian; Seiffer, Dirk

    2016-10-01

    Localizing a shooter in a complex scenario is a difficult task. Acoustic sensors can be used to detect blast waves. Radar technology permits detection of the projectile. A third method is to detect the muzzle flash using electro-optical devices. Detection of muzzle flash events is possible with focal plane arrays, line and single element detectors. In this paper, we will show that the detection of a muzzle flash works well in the shortwave infrared spectral range. Important for the acceptance of an operational warning system in daily use is a very low false alarm rate. Using data from a detector with a high sampling rate the temporal signature of a potential muzzle flash event can be analyzed and the false alarm rate can be reduced. Another important issue is the realization of an omnidirectional view required on an operational level. It will be shown that a combination of single element detectors and simple optics in an appropriate configuration is a capable solution.

  16. Changing scenes: memory for naturalistic events following change blindness.

    PubMed

    Mäntylä, Timo; Sundström, Anna

    2004-11-01

    Research on scene perception indicates that viewers often fail to detect large changes to scene regions when these changes occur during a visual disruption such as a saccade or a movie cut. In two experiments, we examined whether this relative inability to detect changes would produce systematic biases in event memory. In Experiment 1, participants decided whether two successively presented images were the same or different, followed by a memory task in which they recalled the content of the viewed scene. In Experiment 2, participants viewed a short video in which an actor carried out a series of daily activities, and central scene attributes were changed during a movie cut. A high degree of change blindness was observed in both experiments, and these effects were related to scene complexity (Experiment 1) and level of retrieval support (Experiment 2). Most important, participants reported the changed, rather than the initial, event attributes following a failure of change detection. These findings suggest that attentional limitations during encoding contribute to biases in episodic memory.

  17. Description and detection of burst events in turbulent flows

    NASA Astrophysics Data System (ADS)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
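    The first two steps, delay-embedding the signal into a phase-space trajectory and estimating state-transition probabilities, can be illustrated with a minimal sketch. The energy quantisation below is a crude stand-in for the clustering and graph-community step, and the toy burst signal and parameters are assumptions:

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding: each row is one point on the phase-space trajectory."""
    n = x.size - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def transition_matrix(states, k):
    """Row-normalised state-transition probabilities."""
    T = np.zeros((k, k))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1.0
    return T / np.maximum(T.sum(axis=1, keepdims=True), 1.0)

# toy signal: smooth background interrupted by one high-amplitude "burst"
t = np.linspace(0, 20, 2000)
x = np.sin(t) + (np.abs(t - 10) < 1) * 3 * np.sin(20 * t)
traj = embed(x, dim=3, tau=5)
# crude stand-in for clustering: quantise trajectory energy into k states
energy = (traj ** 2).sum(axis=1)
k = 4
states = np.minimum((k * energy / energy.max()).astype(int), k - 1)
T = transition_matrix(states, k)
# the quiet state is highly persistent; burst states are rare and transient
```

Low-persistence states with transitions into high-energy states play the role of the burst precursors described in the abstract.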

  18. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision-analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  19. Automatic optical detection and classification of marine animals around MHK converters using machine vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunton, Steven

    Optical systems provide valuable information for evaluating interactions and associations between organisms and MHK energy converters, and for capturing potentially rare encounters between marine organisms and MHK devices. The deluge of optical data from cabled monitoring packages makes expert review time-consuming and expensive. We propose algorithms and a processing framework to automatically extract events of interest from underwater video. The open-source software framework consists of background subtraction, filtering, feature extraction and hierarchical classification algorithms. This classification pipeline was validated on real-world data collected with an experimental underwater monitoring package. An event detection rate of 100% was achieved using robust principal components analysis (RPCA), Fourier feature extraction and a support vector machine (SVM) binary classifier. The detected events were then further classified into more complex classes: algae | invertebrate | vertebrate, one species | multiple species of fish, and interest rank. Greater than 80% accuracy was achieved using a combination of machine learning techniques.
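    The front of such a pipeline, background subtraction feeding a per-frame event flag, can be sketched with a running-average background model. This model is an assumption standing in for the RPCA stage, and the thresholds and synthetic frames are likewise hypothetical:

```python
import numpy as np

def detect_events(frames, alpha=0.05, pix_thresh=0.2, frac_thresh=0.01):
    """Running-average background model; a frame is flagged as an 'event'
    when enough pixels deviate from the background estimate."""
    bg = frames[0].astype(float)
    flags = []
    for f in frames:
        fg = np.abs(f - bg) > pix_thresh        # foreground mask
        flags.append(fg.mean() > frac_thresh)   # enough moving pixels?
        bg = (1 - alpha) * bg + alpha * f       # slowly adapt the background
    return np.array(flags)

rng = np.random.default_rng(0)
frames = rng.normal(0.5, 0.01, (50, 32, 32))    # static scene + sensor noise
frames[20:25, 10:20, 10:20] += 0.5              # a bright "fish" passes through
flags = detect_events(frames)                   # True on frames 20-24
```

In the full pipeline, frames flagged this way would then be featurised (e.g., Fourier features) and passed to the SVM classifier.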

  20. Single-trial detection of visual evoked potentials by common spatial patterns and wavelet filtering for brain-computer interface.

    PubMed

    Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo

    2013-01-01

    Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most single-trial ERP detection methods were developed for offline EEG analysis and thus have high computational complexity and need manual operations. Therefore, they are not applicable to practical BCI systems, which require a low-complexity and automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) to improve the signal-to-noise ratio (SNR) of visual evoked potentials (VEPs), which can lead to a single-trial ERP-based BCI.
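    A common formulation of CSP, which this work combines with wavelet filtering, whitens by the composite covariance and takes the extreme eigenvectors of one class's whitened covariance as spatial filters. The sketch below uses toy two-class, four-channel data; all parameters are illustrative:

```python
import numpy as np

def csp_filters(X1, X2, n_filters=2):
    """Common spatial patterns: whiten by the composite covariance, then
    eigendecompose class 1's whitened covariance; eigenvectors at both
    ends of the spectrum give the most discriminative spatial filters."""
    cov = lambda X: np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)
    C1, C2 = cov(X1), cov(X2)
    d, U = np.linalg.eigh(C1 + C2)
    P = U @ np.diag(d ** -0.5) @ U.T          # whitening transform
    d2, V = np.linalg.eigh(P @ C1 @ P.T)      # eigenvalues ascending
    W = V.T @ P                               # rows are spatial filters
    half = n_filters // 2
    return np.vstack([W[:half], W[-(n_filters - half):]])

# toy EEG: channel 0 carries class-1 activity, channel 1 carries class-2
rng = np.random.default_rng(0)
def trials(active_ch, n=30, ch=4, T=200):
    out = []
    for _ in range(n):
        x = 0.1 * rng.standard_normal((ch, T))
        x[active_ch] += rng.standard_normal(T)
        out.append(x)
    return out

W = csp_filters(trials(0), trials(1))   # W[0] isolates class 2, W[1] class 1
```

The log-variance of each spatially filtered trial would then serve as the feature fed to a classifier, with wavelet filtering applied before or after the spatial projection.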

  1. A signal detection method for temporal variation of adverse effect with vaccine adverse event reporting system data.

    PubMed

    Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong

    2017-07-05

    Identifying safety signals by manual review of individual reports in large surveillance databases is time-consuming, and such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data-mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. It may therefore be expected that reporting rates of adverse events following flu vaccines (the number of reports for a specific vaccine-event combination divided by the number of reports for all vaccine-event combinations) vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events between 1990 and 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with type of vaccine, group of adverse events, and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. We showed that it had high statistical power to detect variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates over the years suggest potential safety issues due to changes in the vaccines, which require further investigation. In summary, we developed a statistical model to detect safety signals arising from heterogeneity of the reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power, and the temporal trend of the reporting rate may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigation.
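    As a simpler stand-in for the proposed random effects model, a chi-square test of homogeneous reporting rates across years captures the basic idea of testing one vaccine-event combination for year-to-year heterogeneity. The counts below are hypothetical:

```python
import numpy as np

def heterogeneity_chi2(event_counts, total_counts):
    """Chi-square test of homogeneous reporting rates across years
    (a simplified stand-in for the paper's random effects model)."""
    events = np.asarray(event_counts, float)
    totals = np.asarray(total_counts, float)
    pooled = events.sum() / totals.sum()          # pooled reporting rate
    expected = pooled * totals
    chi2 = ((events - expected) ** 2 / (expected * (1 - pooled))).sum()
    return chi2, len(events) - 1                  # statistic, degrees of freedom

# vaccine-event reports per year vs. all reports per year (hypothetical)
events = [12, 15, 11, 14, 60]                     # last year: sharp rise
totals = [1000, 1100, 950, 1050, 1000]
chi2, df = heterogeneity_chi2(events, totals)
print(chi2 > 9.49)   # exceeds the 5% critical value for df=4 -> heterogeneous
```

A significant statistic flags the combination for the kind of follow-up investigation the abstract describes; the random effects model additionally quantifies the between-year variance.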

  2. Pulse Detecting Genetic Circuit – A New Design Approach

    PubMed Central

    Inniss, Mara; Iba, Hitoshi; Way, Jeffrey C.

    2016-01-01

    A robust cellular counter could enable synthetic biologists to design complex circuits with diverse behaviors. The existing synthetic-biological counters, responsive to the beginning of the pulse, are sensitive to the pulse duration. Here we present a pulse detecting circuit that responds only at the falling edge of a pulse, analogous to negative-edge-triggered electric circuits. As biological events do not follow precise timing, use of such a pulse detector would enable the design of robust asynchronous counters which can count the completion of events. This transcription-based pulse detecting circuit depends on the interaction of two co-expressed lambdoid phage-derived proteins: the first is unstable and inhibits the regulatory activity of the second, stable protein. At the end of the pulse the unstable inhibitor protein disappears from the cell and the second protein triggers the recording of the event completion. Using stochastic simulation we showed that the proposed design can detect the completion of the pulse irrespective of the pulse duration. In our simulation we also showed that by fusing the pulse detector with a phage lambda memory element we can construct a counter which can be extended to count larger numbers. The proposed design principle is a new control mechanism for synthetic biology which can be integrated into different circuits for identifying the completion of an event. PMID:27907045

  3. Pulse Detecting Genetic Circuit - A New Design Approach.

    PubMed

    Noman, Nasimul; Inniss, Mara; Iba, Hitoshi; Way, Jeffrey C

    2016-01-01

    A robust cellular counter could enable synthetic biologists to design complex circuits with diverse behaviors. The existing synthetic-biological counters, responsive to the beginning of the pulse, are sensitive to the pulse duration. Here we present a pulse detecting circuit that responds only at the falling edge of a pulse, analogous to negative-edge-triggered electric circuits. As biological events do not follow precise timing, use of such a pulse detector would enable the design of robust asynchronous counters which can count the completion of events. This transcription-based pulse detecting circuit depends on the interaction of two co-expressed lambdoid phage-derived proteins: the first is unstable and inhibits the regulatory activity of the second, stable protein. At the end of the pulse the unstable inhibitor protein disappears from the cell and the second protein triggers the recording of the event completion. Using stochastic simulation we showed that the proposed design can detect the completion of the pulse irrespective of the pulse duration. In our simulation we also showed that by fusing the pulse detector with a phage lambda memory element we can construct a counter which can be extended to count larger numbers. The proposed design principle is a new control mechanism for synthetic biology which can be integrated into different circuits for identifying the completion of an event.
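    The falling-edge logic, an unstable inhibitor and a stable effector produced together, with the output gated on "effector present and inhibitor gone", can be illustrated with a deterministic toy model. The authors used stochastic simulation; the rate constants and thresholds below are hypothetical:

```python
def simulate(pulse_on, pulse_off, t_end=100.0, dt=0.01):
    """Deterministic toy model of the falling-edge pulse detector: an input
    pulse drives production of both an unstable inhibitor and a stable
    effector; the reporter fires only once the inhibitor has decayed while
    the effector remains, i.e. at the end of the pulse."""
    k_prod, deg_inhib, deg_eff = 1.0, 1.0, 0.01   # hypothetical rate constants
    inhib = eff = 0.0
    fired_at = None
    for step in range(int(t_end / dt)):           # forward-Euler integration
        t = step * dt
        drive = k_prod if pulse_on <= t < pulse_off else 0.0
        inhib += (drive - deg_inhib * inhib) * dt
        eff += (drive - deg_eff * eff) * dt
        if fired_at is None and eff > 0.5 and inhib < 0.1:
            fired_at = t
    return fired_at

print(simulate(10, 30))   # fires shortly after the pulse ends at t = 30
```

Because firing always trails the falling edge by roughly the inhibitor's decay time, the output is insensitive to how long the pulse lasted, which is the property the paper's asynchronous counter relies on.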

  4. Complex Event Detection via Multi Source Video Attributes (Open Access)

    DTIC Science & Technology

    2013-10-03

    Denote the representations in H as Ṽi (i = 1, …, m), where Ṽi ∈ R^(di×ni), di is the dimension and ni is the number of videos. Suppose the semantic labels are Ai ∈ R^(ni×ci) (i = 1, …, m), where ci is the number of classes; we propose the following regression loss: min_{Qi} Σ_{i=1}^{m} ||Ṽi^T Qi − Ai||_F^2, (1) where Qi ∈ R^(di×ci). We then map the multiple features of the complex event videos into H and denote the resulting representations as X̃i (i = 1, …, m), X̃i ∈ R^(di×n), where n is the number of videos.

  5. Towards cross-lingual alerting for bursty epidemic events.

    PubMed

    Collier, Nigel

    2011-10-06

    Online news reports are increasingly becoming a source for event-based early warning systems that detect natural disasters. Harnessing the massive volume of information available from multilingual newswire presents as many challenges as opportunities, due to the patterns of reporting complex spatio-temporal events. In this article we study the problem of utilising correlated event reports across languages. We track the evolution of 16 disease outbreaks using 5 temporal aberration detection algorithms on text-mined events classified according to disease and outbreak country. Using ProMED reports as a silver standard, comparative analysis of news data for 13 languages over a 129-day trial period showed improved sensitivity, F1 and timeliness across most models using cross-lingual events. We report a detailed case-study analysis for cholera in Angola in 2010, which highlights the challenges faced in correlating news events with the silver standard. The results show that automated health surveillance using multilingual text mining has the potential to turn low-value news into high-value alerts if informed choices are used to govern the selection of models and data sources. An implementation of the C2 alerting algorithm using multilingual news is available at the BioCaster portal http://born.nii.ac.jp/?page=globalroundup.
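    The C2 algorithm mentioned above is one of the classic EARS temporal aberration detectors: each day's event count is compared with the mean and standard deviation of a short baseline window that ends a couple of days earlier. A minimal sketch, in which the window lengths, the standard-deviation floor, and the toy outbreak counts are assumptions:

```python
def c2_alerts(counts, baseline=7, lag=2, threshold=3.0):
    """EARS C2-style aberration detection: flag day t when its count exceeds
    the baseline mean by more than `threshold` baseline standard deviations,
    using a `baseline`-day window that ends `lag` days before t."""
    alerts = []
    for t in range(baseline + lag, len(counts)):
        window = counts[t - lag - baseline : t - lag]
        mean = sum(window) / baseline
        var = sum((c - mean) ** 2 for c in window) / baseline
        std = max(var ** 0.5, 0.5)        # floor the std for sparse counts
        if (counts[t] - mean) / std > threshold:
            alerts.append(t)
    return alerts

# daily text-mined case reports for one disease/country (hypothetical)
daily = [2, 3, 1, 2, 4, 2, 3, 2, 1, 3, 2, 2, 18, 25, 3]  # days 12-13: outbreak
print(c2_alerts(daily))   # -> [12, 13]
```

The lag keeps the start of an outbreak out of its own baseline, so a ramping signal still triggers.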

  6. Monitoring with head-mounted displays: performance and safety in a full-scale simulator and part-task trainer.

    PubMed

    Liu, David; Jenkins, Simon A; Sanderson, Penelope M; Watson, Marcus O; Leane, Terrence; Kruys, Amanda; Russell, W John

    2009-10-01

    Head-mounted displays (HMDs) can help anesthesiologists with intraoperative monitoring by keeping patients' vital signs within view at all times, even while the anesthesiologist is busy performing procedures or unable to see the monitor. The anesthesia literature suggests that there are advantages of HMD use, but research into head-up displays in the cockpit suggests that HMDs may exacerbate inattentional blindness (a tendency for users to miss unexpected but salient events in the field of view) and may introduce perceptual issues relating to focal depth. We investigated these issues in two simulator-based experiments. Experiment 1 investigated whether wearing a HMD would affect how quickly anesthesiologists detect events, and whether the focus setting of the HMD (near or far) makes any difference. Twelve anesthesiologists provided anesthesia in three naturalistic scenarios within a simulated operating theater environment. There were 24 different events that occurred either on the patient monitor or in the operating room. Experiment 2 investigated whether anesthesiologists physically constrained by performing a procedure would detect patient-related events faster with a HMD than without. Twelve anesthesiologists performed a complex simulated clinical task on a part-task endoscopic dexterity trainer while monitoring the simulated patient's vital signs. All participants experienced four different events within each of two scenarios. Experiment 1 showed that neither wearing the HMD nor adjusting the focus setting reduced participants' ability to detect events (the number of events detected and time to detect events). In general, participants spent more time looking toward the patient and less time toward the anesthesia machine when they wore the HMD than when they used standard monitoring alone. Participants reported that they preferred the near focus setting. 
Experiment 2 showed that participants detected two of four events faster with the HMD, but one event more slowly with the HMD. Participants turned to look toward the anesthesia machine significantly less often when using the HMD. When using the HMD, participants reported that they were less busy, monitoring was easier, and they believed they were faster at detecting abnormal changes. The HMD helped anesthesiologists detect events when physically constrained, but not when physically unconstrained. Although there was no conclusive evidence of the worsened inattentional blindness found in aviation, the perceptual properties of the HMD appear to influence whether events are detected. Anesthesiologists wearing HMDs should self-adjust the focus to minimize eyestrain and should be aware that some changes may not attract their attention. Future areas of research include developing principles for the design of HMDs, evaluating other types of HMDs, and evaluating the HMD in clinical contexts.

  7. Optical switches for remote and noninvasive control of cell signaling.

    PubMed

    Gorostiza, Pau; Isacoff, Ehud Y

    2008-10-17

    Although the identity and interactions of signaling proteins have been studied in great detail, the complexity of signaling networks cannot be fully understood without elucidating the timing and location of activity of individual proteins. To do this, one needs a means for detecting and controlling specific signaling events. An attractive approach is to use light, both to report on and control signaling proteins in cells, because light can probe cells in real time with minimal damage. Although optical detection of signaling events has been successful for some time, the development of the means for optical control has accelerated only recently. Of particular interest is the development of chemically engineered proteins that are directly sensitive to light.

  8. Ambulatory REACT: real-time seizure detection with a DSP microprocessor.

    PubMed

    McEvoy, Robert P; Faul, Stephen; Marnane, William P

    2010-01-01

    REACT (Real-Time EEG Analysis for event deteCTion) is a Support Vector Machine based technology which, in recent years, has been successfully applied to the problem of automated seizure detection in both adults and neonates. This paper describes the implementation of REACT on a commercial DSP microprocessor; the Analog Devices Blackfin®. The primary aim of this work is to develop a prototype system for use in ambulatory or in-ward automated EEG analysis. Furthermore, the complexity of the various stages of the REACT algorithm on the Blackfin processor is analysed; in particular the EEG feature extraction stages. This hardware profile is used to select a reduced, platform-aware feature set, in order to evaluate the seizure classification accuracy of a lower-complexity, lower-power REACT system.

  9. Detecting modification of biomedical events using a deep parsing approach.

    PubMed

    Mackinlay, Andrew; Martinez, David; Baldwin, Timothy

    2012-04-30

    This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
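
    The shallow features the abstract describes (bag-of-words over a sliding context window of 3-4 tokens on either side of the trigger word) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the tokenization, the L/R side-prefixing, and the example sentence are assumptions.

```python
from collections import Counter

def context_window_features(tokens, trigger_idx, window=3):
    """Shallow bag-of-words features: up to `window` tokens on either side of
    the event trigger word, prefixed by side (L/R) so a learner such as a
    Maximum Entropy classifier can tell left context from right context."""
    left = tokens[max(0, trigger_idx - window):trigger_idx]
    right = tokens[trigger_idx + 1:trigger_idx + 1 + window]
    feats = Counter()
    for tok in left:
        feats["L:" + tok.lower()] += 1
    for tok in right:
        feats["R:" + tok.lower()] += 1
    return feats

# "inhibition" appearing in the left context is a strong cue for negation.
sent = "we observed inhibition of IkappaBalpha phosphorylation in vitro".split()
print(context_window_features(sent, sent.index("phosphorylation")))
```

    In the paper these shallow features are combined with deep-parser features before training; here only the windowing step is shown.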

  10. Characterization of Large Structural Genetic Mosaicism in Human Autosomes

    PubMed Central

    Machiela, Mitchell J.; Zhou, Weiyin; Sampson, Joshua N.; Dean, Michael C.; Jacobs, Kevin B.; Black, Amanda; Brinton, Louise A.; Chang, I-Shou; Chen, Chu; Chen, Constance; Chen, Kexin; Cook, Linda S.; Crous Bou, Marta; De Vivo, Immaculata; Doherty, Jennifer; Friedenreich, Christine M.; Gaudet, Mia M.; Haiman, Christopher A.; Hankinson, Susan E.; Hartge, Patricia; Henderson, Brian E.; Hong, Yun-Chul; Hosgood, H. Dean; Hsiung, Chao A.; Hu, Wei; Hunter, David J.; Jessop, Lea; Kim, Hee Nam; Kim, Yeul Hong; Kim, Young Tae; Klein, Robert; Kraft, Peter; Lan, Qing; Lin, Dongxin; Liu, Jianjun; Le Marchand, Loic; Liang, Xiaolin; Lissowska, Jolanta; Lu, Lingeng; Magliocco, Anthony M.; Matsuo, Keitaro; Olson, Sara H.; Orlow, Irene; Park, Jae Yong; Pooler, Loreall; Prescott, Jennifer; Rastogi, Radhai; Risch, Harvey A.; Schumacher, Fredrick; Seow, Adeline; Setiawan, Veronica Wendy; Shen, Hongbing; Sheng, Xin; Shin, Min-Ho; Shu, Xiao-Ou; VanDen Berg, David; Wang, Jiu-Cun; Wentzensen, Nicolas; Wong, Maria Pik; Wu, Chen; Wu, Tangchun; Wu, Yi-Long; Xia, Lucy; Yang, Hannah P.; Yang, Pan-Chyr; Zheng, Wei; Zhou, Baosen; Abnet, Christian C.; Albanes, Demetrius; Aldrich, Melinda C.; Amos, Christopher; Amundadottir, Laufey T.; Berndt, Sonja I.; Blot, William J.; Bock, Cathryn H.; Bracci, Paige M.; Burdett, Laurie; Buring, Julie E.; Butler, Mary A.; Carreón, Tania; Chatterjee, Nilanjan; Chung, Charles C.; Cook, Michael B.; Cullen, Michael; Davis, Faith G.; Ding, Ti; Duell, Eric J.; Epstein, Caroline G.; Fan, Jin-Hu; Figueroa, Jonine D.; Fraumeni, Joseph F.; Freedman, Neal D.; Fuchs, Charles S.; Gao, Yu-Tang; Gapstur, Susan M.; Patiño-Garcia, Ana; Garcia-Closas, Montserrat; Gaziano, J. 
Michael; Giles, Graham G.; Gillanders, Elizabeth M.; Giovannucci, Edward L.; Goldin, Lynn; Goldstein, Alisa M.; Greene, Mark H.; Hallmans, Goran; Harris, Curtis C.; Henriksson, Roger; Holly, Elizabeth A.; Hoover, Robert N.; Hu, Nan; Hutchinson, Amy; Jenab, Mazda; Johansen, Christoffer; Khaw, Kay-Tee; Koh, Woon-Puay; Kolonel, Laurence N.; Kooperberg, Charles; Krogh, Vittorio; Kurtz, Robert C.; LaCroix, Andrea; Landgren, Annelie; Landi, Maria Teresa; Li, Donghui; Liao, Linda M.; Malats, Nuria; McGlynn, Katherine A.; McNeill, Lorna H.; McWilliams, Robert R.; Melin, Beatrice S.; Mirabello, Lisa; Peplonska, Beata; Peters, Ulrike; Petersen, Gloria M.; Prokunina-Olsson, Ludmila; Purdue, Mark; Qiao, You-Lin; Rabe, Kari G.; Rajaraman, Preetha; Real, Francisco X.; Riboli, Elio; Rodríguez-Santiago, Benjamín; Rothman, Nathaniel; Ruder, Avima M.; Savage, Sharon A.; Schwartz, Ann G.; Schwartz, Kendra L.; Sesso, Howard D.; Severi, Gianluca; Silverman, Debra T.; Spitz, Margaret R.; Stevens, Victoria L.; Stolzenberg-Solomon, Rachael; Stram, Daniel; Tang, Ze-Zhong; Taylor, Philip R.; Teras, Lauren R.; Tobias, Geoffrey S.; Viswanathan, Kala; Wacholder, Sholom; Wang, Zhaoming; Weinstein, Stephanie J.; Wheeler, William; White, Emily; Wiencke, John K.; Wolpin, Brian M.; Wu, Xifeng; Wunder, Jay S.; Yu, Kai; Zanetti, Krista A.; Zeleniuch-Jacquotte, Anne; Ziegler, Regina G.; de Andrade, Mariza; Barnes, Kathleen C.; Beaty, Terri H.; Bierut, Laura J.; Desch, Karl C.; Doheny, Kimberly F.; Feenstra, Bjarke; Ginsburg, David; Heit, John A.; Kang, Jae H.; Laurie, Cecilia A.; Li, Jun Z.; Lowe, William L.; Marazita, Mary L.; Melbye, Mads; Mirel, Daniel B.; Murray, Jeffrey C.; Nelson, Sarah C.; Pasquale, Louis R.; Rice, Kenneth; Wiggs, Janey L.; Wise, Anastasia; Tucker, Margaret; Pérez-Jurado, Luis A.; Laurie, Cathy C.; Caporaso, Neil E.; Yeager, Meredith; Chanock, Stephen J.

    2015-01-01

    Analyses of genome-wide association study (GWAS) data have revealed that detectable genetic mosaicism involving large (>2 Mb) structural autosomal alterations occurs in a fraction of individuals. We present results for a set of 24,849 genotyped individuals (total GWAS set II [TGSII]) in whom 341 large autosomal abnormalities were observed in 168 (0.68%) individuals. Merging data from the new TGSII set with data from two prior reports (the Gene-Environment Association Studies and the total GWAS set I) generated a large dataset of 127,179 individuals; we then conducted a meta-analysis to investigate the patterns of detectable autosomal mosaicism (n = 1,315 events in 925 [0.73%] individuals). Restricting to events >2 Mb in size, we observed an increase in event frequency as event size decreased. The combined results underscore that the rate of detectable mosaicism increases with age (p value = 5.5 × 10^-31) and is higher in men (p value = 0.002) but lower in participants of African ancestry (p value = 0.003). In a subset of 47 individuals from whom serial samples were collected up to 6 years apart, complex changes were noted over time and showed an overall increase in the proportion of mosaic cells as age increased. Our large combined sample allowed for a unique ability to characterize detectable genetic mosaicism involving large structural events and strengthens the emerging evidence of non-random erosion of the genome in the aging population. PMID:25748358

  11. A Search for Meteoroid Lunar Impact Generated Electromagnetic Pulses

    NASA Astrophysics Data System (ADS)

    Kesaraju, Saiveena; Mathews, John D.; Vierinen, Juha; Perillat, Phil; Meisel, David D.

    2016-11-01

    Lunar white light flashes associated with meteoroid impacts are now regularly observed using modest optical instrumentation. In this paper, we hypothesize that the developing, optically-dense hot ejecta cloud associated with these hypervelocity impacts also produces an associated complex plasma component that rapidly evolves, resulting in a highly-transient electromagnetic pulse (EMP) in the VHF/UHF spectral region. Discovery of the characteristics and event frequency of impact EMPs would prove interesting to meteoroid flux and complex plasma physics studies, especially if EMPs from the same event are detected from at least two locations on the Earth with relative delays appropriate to the propagation paths. We describe a prototype observational search for meteoroid lunar-impact EMPs, conducted in May 2014 using simultaneous, overlapping-band, UHF radio observations at the Arecibo (AO; Puerto Rico) and Haystack (HO; Massachusetts, USA) Observatories. Monostatic/bistatic lunar radar imaging observations were also performed with HO transmitting and HO/AO receiving to confirm tracking, the net delay, and the pointing/timing ephemeris at both observatories. Signal analysis was performed using time-frequency signal processing techniques. Although we did not conclusively identify EMP returns, this search detected possible EMPs, and we have confirmed the search paradigm and established the sensitivity of the AO-HO system in detecting the hypothesized events. We have also characterized the difficult radio-frequency interference environment surrounding these UHF observations. We discuss the wide range of terrestrial-origin, Moon-bounce signals that were observed, which additionally validate the observational technique. Further observations are contemplated.

  12. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
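
    The paper's point, that covariance isolates correlated detection outcomes while uncorrelated Poisson background cancels on average, can be illustrated with a toy simulation. All rates, counts, and the seed below are invented for illustration; this is not the paper's formalism, only a numerical sanity check of the underlying idea.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler; fine for the small rates here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def covariance(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

rng = random.Random(42)
lam_true, lam_bg = 0.2, 1.0   # pair-production rate and per-detector background rate
ne, ni = [], []
for _ in range(50_000):       # one entry per extraction shot
    pairs = poisson(lam_true, rng)            # each event yields one e- AND one ion
    ne.append(pairs + poisson(lam_bg, rng))   # electron counts: signal + background
    ni.append(pairs + poisson(lam_bg, rng))   # ion counts: signal + background

# cov(Ne, Ni) = Var(pairs) = lam_true: the independent backgrounds cancel,
# whereas raw coincidence counting would include false coincidences.
print(covariance(ne, ni))
```

    The estimate converges to the correlated-pair rate (0.2 here) as the number of shots grows, which is why covariance mapping tolerates backgrounds that defeat plain coincidence counting.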

  13. New early warning system for gravity-driven ruptures based on codetection of acoustic signal

    NASA Astrophysics Data System (ADS)

    Faillettaz, J.

    2016-12-01

    Gravity-driven rupture phenomena in natural media - e.g. landslides, rockfalls, snow or ice avalanches - represent an important class of natural hazards in mountainous regions. To protect the population against such events, a timely evacuation often constitutes the only effective way to secure the potentially endangered area. However, reliable prediction of the imminence of such failure events remains challenging due to the nonlinear and complex nature of geological material failure, compounded by inherent heterogeneity, unknown initial mechanical state, and complex load application (rainfall, temperature, etc.). Here, a simple method for real-time early warning that considers both the heterogeneity of natural media and the attenuation characteristics of acoustic emissions is proposed. This new method capitalizes on codetection of elastic waves emanating from microcracks by multiple, spatially separated sensors. Codetection of an event is taken as a surrogate for large event size, with more frequent codetected events (i.e., events detected concurrently on more than one sensor) marking the imminence of catastrophic failure. A simple numerical model based on a Fiber Bundle Model, considering signal attenuation and hypothetical arrays of sensors, confirms the early warning potential of the codetection principle. Results suggest that although the statistical properties of attenuated signal amplitudes could lead to misleading results, monitoring the emergence of large events announcing impending failure is possible even with attenuated signals, depending on sensor network geometry and detection threshold. Preliminary application of the proposed method to acoustic emissions during failure of snow samples has confirmed the potential use of codetection as an indicator of imminent failure at lab scale. The applicability of such a simple and inexpensive early warning system is now being investigated at a larger scale (hillslope). First results of such a pilot field experiment are presented and analysed.
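
    The codetection bookkeeping itself is simple: cluster time-stamped detections and count the clusters seen by more than one sensor. The sketch below is an illustration of that counting step only (the sensor names, timestamps, and coincidence window are hypothetical), not the authors' Fiber Bundle Model simulation.

```python
def codetections(sensor_times, window):
    """Count events detected concurrently on more than one sensor.

    sensor_times maps a sensor id to a sorted list of detection timestamps
    (seconds). Detections on different sensors falling within `window`
    seconds of the first hit of a cluster are treated as one physical event
    seen more than once, i.e. a codetection."""
    hits = sorted((t, s) for s, times in sensor_times.items() for t in times)
    count, i = 0, 0
    while i < len(hits):
        j, sensors = i, {hits[i][1]}
        while j + 1 < len(hits) and hits[j + 1][0] - hits[i][0] <= window:
            j += 1
            sensors.add(hits[j][1])
        if len(sensors) > 1:   # small, attenuated events rarely reach 2 sensors
            count += 1
        i = j + 1
    return count

# Two sensors hear the same rupture 10 ms apart; the other hits are isolated.
print(codetections({"A": [0.0, 5.0], "B": [0.01, 9.0]}, window=0.05))  # 1
```

    A rising codetection count over time would then serve as the early-warning indicator described in the abstract.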

  14. Online track detection in triggerless mode for INO

    NASA Astrophysics Data System (ADS)

    Jain, A.; Padmini, S.; Joseph, A. N.; Mahesh, P.; Preetha, N.; Behere, A.; Sikder, S. S.; Majumder, G.; Behera, S. P.

    2018-03-01

    The India-based Neutrino Observatory (INO) is a proposed particle physics research project to study atmospheric neutrinos. The INO Iron Calorimeter (ICAL) will consist of 28,800 detectors with 3.6 million electronic channels, each expected to activate at a 100 Hz singles rate, producing data at a rate of 3 GBps. The collected data contain a few real hits generated by muon tracks; the rest are noise-induced spurious hits. The estimated reduction factor after filtering the data of interest from the generated data is of the order of 10^3. This makes trigger generation critical for efficient data collection and storage. A trigger is generated by detecting coincidences across multiple channels that satisfy the trigger criteria within a small window of 200 ns in the trigger region. As the probability of neutrino interaction is very low, the track detection algorithm has to be efficient and fast enough to process 5 × 10^6 event candidates/s without introducing significant dead time, so that not a single neutrino event is missed. A hardware-based trigger system is presently proposed for online track detection given the stringent timing requirements. Though such a trigger system can be designed to scale, the many hardware devices and interconnections make it a complex and expensive solution with limited flexibility. A software-based track detection approach working on the hit information offers an elegant alternative, with the possibility of varying the trigger criteria to select various potentially interesting physics events. An event selection approach for an alternative triggerless readout scheme has been developed. The algorithm is mathematically simple, robust, and parallelizable. It has been validated by detecting simulated muon events at energies in the range of 1-10 GeV with 100% efficiency at a processing rate of 60 μs/event on a 16-core machine. The algorithm and the result of a proof-of-concept for its faster implementation over multiple cores are presented. The paper also discusses harnessing the computing capabilities of a multi-core computing farm, thereby optimizing the number of nodes required for the proposed system.
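
    The coincidence condition described above (multiple distinct channels firing within a 200 ns window) can be sketched with a two-pointer sweep over time-sorted hits. The hit list and the three-channel threshold below are illustrative assumptions, not the actual ICAL trigger criteria.

```python
from collections import Counter

def find_coincidences(hits, window_ns=200, min_channels=3):
    """Report windows in which at least `min_channels` distinct channels fire
    within `window_ns`. `hits` is a list of (time_ns, channel) tuples.
    Two-pointer sweep: O(n log n) for the sort, O(n) for the scan."""
    hits = sorted(hits)
    in_window = Counter()        # channel -> hit count currently inside window
    triggers, lo = [], 0
    for t, ch in hits:
        in_window[ch] += 1
        while t - hits[lo][0] > window_ns:   # drop hits older than the window
            old = hits[lo][1]
            in_window[old] -= 1
            if in_window[old] == 0:
                del in_window[old]
            lo += 1
        if len(in_window) >= min_channels:
            triggers.append((hits[lo][0], t))
    return triggers

# Three channels fire within 120 ns; the lone later hit does not trigger.
print(find_coincidences([(0, 1), (50, 2), (120, 3), (5000, 1)]))  # [(0, 120)]
```

    In a triggerless readout, a scan like this would run in software over the streamed hits, so the coincidence criteria can be changed without touching hardware.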

  15. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    PubMed Central

    Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab

    2017-01-01

    As genetically modified (GM) crops gain attention globally, their approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as the polymerase chain reaction (PCR) and the enzyme-linked immunosorbent assay (ELISA) are routinely employed for DNA- and protein-based quantification, respectively. Although PCR and ELISA are convenient and productive, more advanced technologies that allow high-throughput detection and quantification of GM events are needed as ever more complex GMOs are produced. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR, and next-generation sequencing are therefore promising for their accuracy and precise detection of transgenic content. The present article is a brief comparative study of these detection techniques on the basis of their advent, feasibility, accuracy, and cost-effectiveness. For these emerging technologies, however, detection of specific events, contamination by different events, and determination of fusion as well as stacked gene proteins remain critical issues to be addressed in the future. PMID:29085378

  16. Markov logic network based complex event detection under uncertainty

    NASA Astrophysics Data System (ADS)

    Lu, Jingyang; Jia, Bin; Chen, Genshe; Chen, Hua-mei; Sullivan, Nichole; Pham, Khanh; Blasch, Erik

    2018-05-01

    In a cognitive reasoning system, the four-stage Observe-Orient-Decide-Act (OODA) reasoning loop is of interest. The OODA loop is essential for situational awareness, especially in heterogeneous data fusion. Cognitive reasoning for making decisions can take advantage of different formats of information, such as symbolic observations, various real-world sensor readings, or the relationships between intelligent modalities. A Markov Logic Network (MLN) provides a mathematically sound technique for representing and fusing data at multiple levels of abstraction, and across multiple intelligent sensors, to conduct complex decision-making tasks. In this paper, a scenario about vehicle interaction is investigated, in which uncertainty is taken into consideration because no systematic approach can perfectly characterize the complex event scenario. MLNs are applied to the terrestrial domain, where the dynamic features and relationships among vehicles are captured through multiple sensors and information sources while accounting for data uncertainty.

  17. Recurrence-plot-based measures of complexity and their application to heart-rate-variability data.

    PubMed

    Marwan, Norbert; Wessel, Niels; Meyerfeldt, Udo; Schirdewan, Alexander; Kurths, Jürgen

    2002-08-01

    Knowledge of transitions between regular, laminar, or chaotic behaviors is essential to understand the underlying mechanisms behind complex systems. While linear approaches are often insufficient to describe such processes, many nonlinear methods require rather long observation times. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart-rate-variability data. For the logistic map these measures enable us not only to detect transitions between chaotic and periodic states, but also to identify laminar states, i.e., chaos-chaos transitions. The traditional recurrence quantification analysis fails to detect the latter transitions. Applying our measures to the heart-rate-variability data, we are able to detect and quantify the laminar phases before a life-threatening cardiac arrhythmia occurs, thereby facilitating a prediction of such an event. Our findings could be of importance for the therapy of malignant cardiac arrhythmias.
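
    As a minimal sketch of the vertical-structure idea, one of the measures based on vertical lines, the fraction of recurrence points lying in vertical lines of a minimum length (often called laminarity), can be computed from a thresholded recurrence matrix. The scalar series, threshold, and minimum line length below are illustrative, and the full method also includes other measures and embedding of the series.

```python
def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a scalar series: R[i][j] = 1 when the
    states at times i and j are closer than the threshold eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def laminarity(R, v_min=2):
    """Fraction of recurrence points that belong to vertical lines of length
    >= v_min. Vertical structures mark laminar (slowly changing) states."""
    n = len(R)
    total = in_lines = 0
    for j in range(n):
        run = 0
        for i in range(n + 1):           # the extra step flushes the final run
            v = R[i][j] if i < n else 0
            if v:
                run += 1
                total += 1
            else:
                if run >= v_min:
                    in_lines += run
                run = 0
    return in_lines / total if total else 0.0

# A constant (fully laminar) segment scores 1.0; a fast alternation scores 0.0.
print(laminarity(recurrence_matrix([1.0] * 5, 0.0)))             # 1.0
print(laminarity(recurrence_matrix([0.0, 1.0, 0.0, 1.0], 0.1)))  # 0.0
```

    Diagonal-line measures used in traditional recurrence quantification analysis miss chaos-chaos transitions precisely because they ignore these vertical structures.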

  18. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    PubMed Central

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-01-01

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629

  19. Online least squares one-class support vector machines-based abnormal visual event detection.

    PubMed

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-12-12

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method.
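
    The core idea, fitting a description of normal frames in a regularized least squares sense and flagging frames whose score deviates from the target, can be sketched in a stripped-down batch linear form. This is only an illustration of the "regress normal descriptors toward 1" principle: the paper's LS-OC-SVM is kernelized and updated online, and the 2-D feature vectors, regularizer, and tolerance below are invented.

```python
def fit_ls_one_class(X, lam=1e-3):
    """Toy linear least-squares one-class model on 2-D features:
    solve (X^T X + lam*I) w = X^T 1 so that w.x is approximately 1 for
    normal training data. Solved in closed form for the 2x2 case."""
    sxx = sum(x0 * x0 for x0, _ in X) + lam
    sxy = sum(x0 * x1 for x0, x1 in X)
    syy = sum(x1 * x1 for _, x1 in X) + lam
    b0 = sum(x0 for x0, _ in X)
    b1 = sum(x1 for _, x1 in X)
    det = sxx * syy - sxy * sxy
    return ((syy * b0 - sxy * b1) / det, (sxx * b1 - sxy * b0) / det)

def is_abnormal(w, x, tol=0.5):
    """Flag a frame whose score w.x strays too far from the target value 1."""
    return abs(w[0] * x[0] + w[1] * x[1] - 1.0) > tol

# 'Normal' frame descriptors cluster near (1, 2); outliers score far from 1.
w = fit_ls_one_class([(1.0, 2.0), (1.1, 1.9), (0.9, 2.1), (1.0, 2.05)])
print(is_abnormal(w, (1.0, 2.0)))   # False
print(is_abnormal(w, (4.0, 8.0)))   # True
```

    In the paper each video frame is summarized by a covariance matrix descriptor rather than a raw 2-D point, and the sparse online variant bounds model growth with a coherence criterion.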

  20. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  1. Method for enhancing single-trial P300 detection by introducing the complexity degree of image information in rapid serial visual presentation tasks

    PubMed Central

    Lin, Zhimin; Zeng, Ying; Tong, Li; Zhang, Hangming; Zhang, Chi

    2017-01-01

    The application of electroencephalogram (EEG) signals generated while humans view images is a new thrust in image retrieval technology. A P300 component is induced in the EEG when subjects see a point of interest in a target image under the rapid serial visual presentation (RSVP) experimental paradigm. We detected the single-trial P300 component to determine whether a subject was interested in an image. In practice, the latency and amplitude of the P300 component may vary with experimental parameters such as target probability and stimulus semantics. We therefore proposed a novel method, the Target Recognition using Image Complexity Priori (TRICP) algorithm, in which image information is introduced into the calculation of the interest score in the RSVP paradigm. The method combines information from the image and the EEG to enhance the accuracy of single-trial P300 detection over traditional single-trial P300 detection algorithms. We defined an image complexity parameter based on features from the different layers of a convolutional neural network (CNN). We used the TRICP algorithm to compute the complexity of each image, quantified the effect of images of different complexity on the P300 components, and trained specialized classifiers according to image complexity. We compared TRICP with the HDCA algorithm. Results show that TRICP performs significantly better than HDCA (Wilcoxon signed-rank test, p<0.05). Thus, the proposed method can be used in other visual-task-related single-trial event-related potential detection. PMID:29283998

  2. iMSRC: converting a standard automated microscope into an intelligent screening platform.

    PubMed

    Carro, Angel; Perez-Martinez, Manuel; Soriano, Joaquim; Pisano, David G; Megias, Diego

    2015-05-27

    Microscopy in the context of biomedical research is demanding new tools to automatically detect and capture objects of interest. The few extant packages addressing this need, however, have enjoyed limited uptake due to complexity of use and installation. To overcome these drawbacks, we developed iMSRC, which combines ease of use and installation with high flexibility and enables applications such as rare event detection and high-resolution tissue sample screening, saving time and resources.

  3. Sleep spindle and K-complex detection using tunable Q-factor wavelet transform and morphological component analysis

    PubMed Central

    Lajnef, Tarek; Chaibi, Sahbi; Eichenlaub, Jean-Baptiste; Ruby, Perrine M.; Aguera, Pierre-Emmanuel; Samet, Mounir; Kachouri, Abdennaceur; Jerbi, Karim

    2015-01-01

    A novel framework for joint detection of sleep spindles and K-complex events, two hallmarks of sleep stage S2, is proposed. Sleep electroencephalography (EEG) signals are split into oscillatory (spindles) and transient (K-complex) components. This decomposition is conveniently achieved by applying morphological component analysis (MCA) to a sparse representation of EEG segments obtained by the recently introduced discrete tunable Q-factor wavelet transform (TQWT). Tuning the Q-factor provides a convenient and elegant tool to naturally decompose the signal into an oscillatory and a transient component. The actual detection step relies on thresholding (i) the transient component to reveal K-complexes and (ii) the time-frequency representation of the oscillatory component to identify sleep spindles. Optimal thresholds are derived from ROC-like curves (sensitivity vs. FDR) on training sets and the performance of the method is assessed on test data sets. We assessed the performance of our method using full-night sleep EEG data we collected from 14 participants. In comparison to visual scoring (Expert 1), the proposed method detected spindles with a sensitivity of 83.18% and false discovery rate (FDR) of 39%, while K-complexes were detected with a sensitivity of 81.57% and an FDR of 29.54%. Similar performances were obtained when using a second expert as benchmark. In addition, when the TQWT and MCA steps were excluded from the pipeline the detection sensitivities dropped to 70% for spindles and to 76.97% for K-complexes, while the FDR rose to 43.62% and 49.09%, respectively. Finally, we also evaluated the performance of the proposed method on a set of publicly available sleep EEG recordings. Overall, the results we obtained suggest that the TQWT-MCA method may be a valuable alternative to existing spindle and K-complex detection methods. Paths for improvements and further validations with large-scale standard open-access benchmarking data sets are discussed.
PMID:26283943

  4. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
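
    The PRR statistic being parallelized is itself a simple ratio computed per drug-event pair from a 2x2 contingency table of reports. The counts below are invented for illustration; in the MapReduce version the map phase would emit per-pair counts that a reduce phase aggregates into such tables before applying this formula.

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio from a 2x2 contingency table of reports:
        a: drug of interest AND event of interest
        b: drug of interest, all other events
        c: all other drugs AND event of interest
        d: all other drugs, all other events
    PRR = [a / (a + b)] / [c / (c + d)]."""
    return (a / (a + b)) / (c / (c + d))

# The event appears in 10% of the drug's reports vs. ~1% elsewhere,
# giving PRR ≈ 9.9; PRR >= 2 is part of a commonly used signal criterion.
print(prr(10, 90, 100, 9800))
```

    The computation is embarrassingly parallel across drug-event pairs, which is what makes it a natural fit for MapReduce.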

  5. Transient Events in Archival Very Large Array Observations of the Galactic Center

    NASA Astrophysics Data System (ADS)

    Chiti, Anirudh; Chatterjee, Shami; Wharton, Robert; Cordes, James; Lazio, T. Joseph W.; Kaplan, David L.; Bower, Geoffrey C.; Croft, Steve

    2016-12-01

    The Galactic center has some of the highest stellar densities in the Galaxy and a range of interstellar scattering properties, which may aid in the detection of new radio-selected transient events. Here, we describe a search for radio transients in the Galactic center, using over 200 hr of archival data from the Very Large Array at 5 and 8.4 GHz. Every observation of Sgr A* from 1985 to 2005 has been searched using an automated processing and detection pipeline sensitive to transients with timescales between 30 s and 5 minutes, with a typical detection threshold of ˜100 mJy. Eight possible candidates pass tests to filter false positives from radio-frequency interference, calibration errors, and imaging artifacts. Two events are identified as promising candidates based on the smoothness of their light curves. Despite the high quality of their light curves, these detections remain suspect due to evidence of incomplete subtraction of the complex structure in the Galactic center, and the apparent contingency of one detection on the reduction routines. Events of this intensity (˜100 mJy) and duration (˜100 s) are not obviously associated with known astrophysical sources, and no counterparts are found in data at other wavelengths. We consider potential sources, including Galactic center pulsars, dwarf stars, sources like GCRT J1745-3009, and bursts from X-ray binaries. None can fully explain the observed transients, suggesting either a new astrophysical source or a subtle imaging artifact. More sensitive multiwavelength studies are necessary to characterize these events, which, if real, occur at a rate of 14 (+32/−12) hr^-1 deg^-2 in the Galactic center.

  6. Time-gated flow cytometry: an ultra-high selectivity method to recover ultra-rare-event μ-targets in high-background biosamples

    NASA Astrophysics Data System (ADS)

    Jin, Dayong; Piper, James A.; Leif, Robert C.; Yang, Sean; Ferrari, Belinda C.; Yuan, Jingli; Wang, Guilan; Vallarino, Lidia M.; Williams, John W.

    2009-03-01

    A fundamental problem for rare-event cell analysis is auto-fluorescence from nontarget particles and cells. Time-gated flow cytometry is based on temporal-domain discrimination of long-lifetime (>1 μs) luminescence-stained cells and can render all nontarget cells and particles invisible. We aim to further evaluate the technique, focusing on the detection of ultra-rare-event 5-μm calibration beads in environmental water samples with a high dirt background. Europium-labeled 5-μm calibration beads with improved luminescence homogeneity and reduced aggregation were evaluated using the prototype UV-LED-excited time-gated luminescence (TGL) flow cytometer (FCM). A BD FACSAria flow cytometer was used to accurately sort a very low number of beads (<100 events), which were then spiked into concentrated samples of environmental water. The use of europium-labeled beads permitted the demonstration of specific detection rates of 100%+/-30% and 91%+/-3% with 10 and 100 target beads, respectively, mixed with over one million nontarget autofluorescent background particles. Under the same conditions, a conventional FCM was unable to recover rare-event fluorescein isothiocyanate (FITC) calibration beads. Preliminary results on Giardia detection are also reported. We have demonstrated the scientific value of lanthanide-complex biolabels in flow cytometry. This approach may augment the current method that uses multifluorescence-channel flow cytometry gating.

  7. Detecting modification of biomedical events using a deep parsing approach

    PubMed Central

    2012-01-01

    Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
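The shallow bag-of-words features over a small window around the trigger word can be sketched directly. The feature-naming scheme below (L:/R: prefixes to mark which side of the trigger a token falls on) is an assumption for illustration, not the paper's exact encoding:

```python
def window_features(tokens, trigger_idx, width=3):
    """Bag-of-words features from a sliding window of `width` tokens on
    either side of the event trigger word (L:/R: marks the side)."""
    feats = {}
    lo = max(0, trigger_idx - width)
    hi = min(len(tokens), trigger_idx + width + 1)
    for i in range(lo, hi):
        if i == trigger_idx:
            continue
        side = "L" if i < trigger_idx else "R"
        feats[side + ":" + tokens[i].lower()] = 1
    return feats

sent = "analysis of IkappaBalpha phosphorylation was not observed".split()
feats = window_features(sent, sent.index("phosphorylation"))
```

Feature dictionaries of this shape feed straightforwardly into a Maximum Entropy (logistic regression) learner; the deep-parser features would be added as further keys.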

  8. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    NASA Astrophysics Data System (ADS)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes Factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that effectiveness varies widely among these tests. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved to be relatively ineffective in reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious.
We find no strong evidence that the background rate of large events worldwide has increased in recent years.
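The model-comparison idea (one background rate vs. a rate change, scored by an information criterion) can be illustrated on a declustered catalogue of event times. Below is a minimal sketch using AIC for a homogeneous vs. piecewise-homogeneous Poisson process, with additive likelihood constants dropped; it is not the authors' implementation, and all names are illustrative.

```python
import math

def poisson_loglik(n, duration):
    """Maximized log-likelihood of a homogeneous Poisson process with
    n events over `duration` (additive constants dropped)."""
    if n == 0 or duration <= 0:
        return 0.0
    rate = n / duration
    return n * math.log(rate) - rate * duration

def aic_change_point(times, total_time):
    """Score a one-rate model (1 parameter) against the best two-rate
    model (2 rates + 1 change point = 3 parameters) using AIC.
    Returns (best AIC, change point or None if the one-rate model wins)."""
    n = len(times)
    best = (2 * 1 - 2 * poisson_loglik(n, total_time), None)
    for i in range(1, n):
        tc = times[i]  # candidate change point at an event time
        ll = poisson_loglik(i, tc) + poisson_loglik(n - i, total_time - tc)
        if 2 * 3 - 2 * ll < best[0]:
            best = (2 * 3 - 2 * ll, tc)
    return best

# sparse background (3 events in 30 time units) followed by a burst (10 in 10)
times = [10.0, 20.0, 30.0] + [30.5 + k for k in range(10)]
aic, change = aic_change_point(times, 40.0)
```

The penalty structure generalizes to BIC or, with priors, to the Bayes factor; declustering would precede this step on a real catalogue.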

  9. Identifying Typhoon Tracks based on Event Synchronization derived Spatially Embedded Climate Networks

    NASA Astrophysics Data System (ADS)

    Ozturk, Ugur; Marwan, Norbert; Kurths, Jürgen

    2017-04-01

    Complex networks are commonly used for investigating the spatiotemporal dynamics of complex systems, e.g. extreme rainfall. Directed networks in particular are very effective tools for identifying climatic patterns on spatially embedded networks. They can capture the network flux, and thus the principal dynamics of spreading significant phenomena. Network measures, such as network divergence, reveal the source-receptor relations of directed networks. However, how to capture fast-evolving atmospheric events, i.e. typhoons, remains a challenge. In this study, we propose a new technique, namely Radial Ranks, to detect the general pattern of typhoons' forward direction based on the strength parameter of event synchronization over Japan. We suggest subsetting a circular zone of high correlation around the selected grid point based on the strength parameter. Radial sums of the strength parameter along vectors within this zone, the radial ranks, are measured for potential directions, which allows us to trace the network flux over long distances. We also employed the delay parameter of event synchronization to identify and separate the individual behaviors of frontal storms and typhoons.
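The strength and delay parameters of event synchronization can be sketched for two event-time series. This simplified version uses a fixed coincidence window tau instead of the adaptive window of the full event-synchronization method, so it is illustrative only; all names are assumptions.

```python
import math

def event_sync(tx, ty, tau):
    """Simplified event synchronization between two event-time series:
    counts of near-coincident pairs within a fixed window tau,
    normalized by sqrt(nx * ny)."""
    c_xy = sum(1 for a in tx for b in ty if 0 < a - b <= tau)  # y precedes x
    c_yx = sum(1 for a in tx for b in ty if 0 < b - a <= tau)  # x precedes y
    n = math.sqrt(len(tx) * len(ty))
    strength = (c_xy + c_yx) / n  # symmetric strength (Q)
    delay = (c_xy - c_yx) / n     # directionality (q); sign gives direction
    return strength, delay

# events in x consistently precede events in y by one time unit
s, q = event_sync([1.0, 5.0, 9.0], [2.0, 6.0, 10.0], 2.0)
```

Computed pairwise over grid points, the strength values define the weighted network on which measures such as the radial sums operate.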

  10. Measuring Cognitive and Metacognitive Regulatory Processes during Hypermedia Learning: Issues and Challenges

    ERIC Educational Resources Information Center

    Azevedo, Roger; Moos, Daniel C.; Johnson, Amy M.; Chauncey, Amber D.

    2010-01-01

    Self-regulated learning (SRL) with hypermedia environments involves a complex cycle of temporally unfolding cognitive and metacognitive processes that impact students' learning. We present several methodological issues related to treating SRL as an event and strengths and challenges of using online trace methodologies to detect, trace, model, and…

  11. Characterization of large structural genetic mosaicism in human autosomes.

    PubMed

    Machiela, Mitchell J; Zhou, Weiyin; Sampson, Joshua N; Dean, Michael C; Jacobs, Kevin B; Black, Amanda; Brinton, Louise A; Chang, I-Shou; Chen, Chu; Chen, Constance; Chen, Kexin; Cook, Linda S; Crous Bou, Marta; De Vivo, Immaculata; Doherty, Jennifer; Friedenreich, Christine M; Gaudet, Mia M; Haiman, Christopher A; Hankinson, Susan E; Hartge, Patricia; Henderson, Brian E; Hong, Yun-Chul; Hosgood, H Dean; Hsiung, Chao A; Hu, Wei; Hunter, David J; Jessop, Lea; Kim, Hee Nam; Kim, Yeul Hong; Kim, Young Tae; Klein, Robert; Kraft, Peter; Lan, Qing; Lin, Dongxin; Liu, Jianjun; Le Marchand, Loic; Liang, Xiaolin; Lissowska, Jolanta; Lu, Lingeng; Magliocco, Anthony M; Matsuo, Keitaro; Olson, Sara H; Orlow, Irene; Park, Jae Yong; Pooler, Loreall; Prescott, Jennifer; Rastogi, Radhai; Risch, Harvey A; Schumacher, Fredrick; Seow, Adeline; Setiawan, Veronica Wendy; Shen, Hongbing; Sheng, Xin; Shin, Min-Ho; Shu, Xiao-Ou; VanDen Berg, David; Wang, Jiu-Cun; Wentzensen, Nicolas; Wong, Maria Pik; Wu, Chen; Wu, Tangchun; Wu, Yi-Long; Xia, Lucy; Yang, Hannah P; Yang, Pan-Chyr; Zheng, Wei; Zhou, Baosen; Abnet, Christian C; Albanes, Demetrius; Aldrich, Melinda C; Amos, Christopher; Amundadottir, Laufey T; Berndt, Sonja I; Blot, William J; Bock, Cathryn H; Bracci, Paige M; Burdett, Laurie; Buring, Julie E; Butler, Mary A; Carreón, Tania; Chatterjee, Nilanjan; Chung, Charles C; Cook, Michael B; Cullen, Michael; Davis, Faith G; Ding, Ti; Duell, Eric J; Epstein, Caroline G; Fan, Jin-Hu; Figueroa, Jonine D; Fraumeni, Joseph F; Freedman, Neal D; Fuchs, Charles S; Gao, Yu-Tang; Gapstur, Susan M; Patiño-Garcia, Ana; Garcia-Closas, Montserrat; Gaziano, J Michael; Giles, Graham G; Gillanders, Elizabeth M; Giovannucci, Edward L; Goldin, Lynn; Goldstein, Alisa M; Greene, Mark H; Hallmans, Goran; Harris, Curtis C; Henriksson, Roger; Holly, Elizabeth A; Hoover, Robert N; Hu, Nan; Hutchinson, Amy; Jenab, Mazda; Johansen, Christoffer; Khaw, Kay-Tee; Koh, Woon-Puay; Kolonel, Laurence N; Kooperberg, 
Charles; Krogh, Vittorio; Kurtz, Robert C; LaCroix, Andrea; Landgren, Annelie; Landi, Maria Teresa; Li, Donghui; Liao, Linda M; Malats, Nuria; McGlynn, Katherine A; McNeill, Lorna H; McWilliams, Robert R; Melin, Beatrice S; Mirabello, Lisa; Peplonska, Beata; Peters, Ulrike; Petersen, Gloria M; Prokunina-Olsson, Ludmila; Purdue, Mark; Qiao, You-Lin; Rabe, Kari G; Rajaraman, Preetha; Real, Francisco X; Riboli, Elio; Rodríguez-Santiago, Benjamín; Rothman, Nathaniel; Ruder, Avima M; Savage, Sharon A; Schwartz, Ann G; Schwartz, Kendra L; Sesso, Howard D; Severi, Gianluca; Silverman, Debra T; Spitz, Margaret R; Stevens, Victoria L; Stolzenberg-Solomon, Rachael; Stram, Daniel; Tang, Ze-Zhong; Taylor, Philip R; Teras, Lauren R; Tobias, Geoffrey S; Viswanathan, Kala; Wacholder, Sholom; Wang, Zhaoming; Weinstein, Stephanie J; Wheeler, William; White, Emily; Wiencke, John K; Wolpin, Brian M; Wu, Xifeng; Wunder, Jay S; Yu, Kai; Zanetti, Krista A; Zeleniuch-Jacquotte, Anne; Ziegler, Regina G; de Andrade, Mariza; Barnes, Kathleen C; Beaty, Terri H; Bierut, Laura J; Desch, Karl C; Doheny, Kimberly F; Feenstra, Bjarke; Ginsburg, David; Heit, John A; Kang, Jae H; Laurie, Cecilia A; Li, Jun Z; Lowe, William L; Marazita, Mary L; Melbye, Mads; Mirel, Daniel B; Murray, Jeffrey C; Nelson, Sarah C; Pasquale, Louis R; Rice, Kenneth; Wiggs, Janey L; Wise, Anastasia; Tucker, Margaret; Pérez-Jurado, Luis A; Laurie, Cathy C; Caporaso, Neil E; Yeager, Meredith; Chanock, Stephen J

    2015-03-05

    Analyses of genome-wide association study (GWAS) data have revealed that detectable genetic mosaicism involving large (>2 Mb) structural autosomal alterations occurs in a fraction of individuals. We present results for a set of 24,849 genotyped individuals (total GWAS set II [TGSII]) in whom 341 large autosomal abnormalities were observed in 168 (0.68%) individuals. Merging data from the new TGSII set with data from two prior reports (the Gene-Environment Association Studies and the total GWAS set I) generated a large dataset of 127,179 individuals; we then conducted a meta-analysis to investigate the patterns of detectable autosomal mosaicism (n = 1,315 events in 925 [0.73%] individuals). Restricting to events >2 Mb in size, we observed an increase in event frequency as event size decreased. The combined results underscore that the rate of detectable mosaicism increases with age (p value = 5.5 × 10(-31)) and is higher in men (p value = 0.002) but lower in participants of African ancestry (p value = 0.003). In a subset of 47 individuals from whom serial samples were collected up to 6 years apart, complex changes were noted over time and showed an overall increase in the proportion of mosaic cells as age increased. Our large combined sample allowed for a unique ability to characterize detectable genetic mosaicism involving large structural events and strengthens the emerging evidence of non-random erosion of the genome in the aging population. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  12. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting

    PubMed Central

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-01-01

    Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, while they may not be accompanied by their corresponding main effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected with a predictive value in observational studies with time-to-event outcomes. PMID:29453930

  13. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting.

    PubMed

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-02-17

    Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, while they may not be accompanied by their corresponding main effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected with a predictive value in observational studies with time-to-event outcomes.

  14. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. 
The resulting systems will be able to extract bio-events with attached polarities from textual documents, which can serve as the foundation for more elaborate systems that are able to detect mutually contradicting bio-events. PMID:23323936

  15. iMSRC: converting a standard automated microscope into an intelligent screening platform

    PubMed Central

    Carro, Angel; Perez-Martinez, Manuel; Soriano, Joaquim; Pisano, David G.; Megias, Diego

    2015-01-01

    Microscopy in the context of biomedical research is demanding new tools to automatically detect and capture objects of interest. The few extant packages addressing this need, however, have enjoyed limited uptake due to complexity of use and installation. To overcome these drawbacks, we developed iMSRC, which combines ease of use and installation with high flexibility and enables applications such as rare event detection and high-resolution tissue sample screening, saving time and resources. PMID:26015081

  16. Early events in xenograft development from the human embryonic stem cell line HS181--resemblance with an initial multiple epiblast formation.

    PubMed

    Gertow, Karin; Cedervall, Jessica; Jamil, Seema; Ali, Rouknuddin; Imreh, Marta P; Gulyas, Miklos; Sandstedt, Bengt; Ahrlund-Richter, Lars

    2011-01-01

    Xenografting is widely used for assessing the in vivo pluripotency of human stem cell populations. Here, we report on early to late events in the development of mature experimental teratoma from a well-characterized human embryonic stem cell (HESC) line, HS181. The results show an embryonic process that becomes increasingly chaotic. Active proliferation of the stem-cell-derived cellular progeny was detected as early as day 5, characterized by the appearance of multiple sites of engraftment, with structures of single or pseudostratified columnar epithelium surrounding small cavities. The striking histological resemblance to developing embryonic ectoderm and the formation of epiblast-like structures were supported by the expression of the markers OCT4, NANOG, SSEA-4 and KLF4, but a lack of REX1. The early neural marker NESTIN was uniformly expressed, while markers linked to gastrulation, such as BMP-4, NODAL or BRACHYURY, were not detected. Thus, observations on day 5 indicated differentiation comparable to the earliest transient cell populations in human post-implantation development. Confirming and expanding on previous findings from HS181 xenografts, these early events were followed by an increasingly chaotic development, culminating in the formation of a benign teratoma with complex embryonic components. In the mature HS181 teratomas not all types of organs/tissues were detected, indicating restricted differentiation and a lack of adequate spatial developmental cues during further teratoma formation. Uniquely, a kinetic alignment of rare complex structures was made to human embryos at diagnosed gestation stages, showing minor kinetic deviations between the HS181 teratomas and their human counterparts.

  17. Vertically Integrated Seismological Analysis II : Inference

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, [π(x′)q(x | x′)] / [π(x)q(x′ | x)]). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events.
Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or human analysts.
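The M-H acceptance rule quoted above reduces, for a symmetric proposal q, to a comparison of target densities alone. A minimal random-walk Metropolis sketch on a one-dimensional toy target (illustrative only; the seismic model's birth/death/split/merge moves are far richer and require the full ratio with proposal densities):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler: with a symmetric proposal the
    acceptance probability reduces to min(1, pi(x') / pi(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)  # symmetric proposal q(x' | x)
        lpp = log_target(xp)
        if rng.random() < math.exp(min(0.0, lpp - lp)):
            x, lp = xp, lpp            # accept the candidate
        samples.append(x)              # otherwise keep the current state
    return samples

# toy target: standard normal, log-density up to an additive constant
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
```

Working in log densities, as here, is the standard way to keep the acceptance ratio numerically stable for highly peaked posteriors.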

  18. Neuronal chronometry of target detection: fusion of hemodynamic and event-related potential data.

    PubMed

    Calhoun, V D; Adali, T; Pearlson, G D; Kiehl, K A

    2006-04-01

    Event-related potential (ERP) studies of the brain's response to infrequent, target (oddball) stimuli elicit a sequence of physiological events, the most prominent and well-studied being a complex, the P300 (or P3), peaking approximately 300 ms post-stimulus for simple stimuli and slightly later for more complex stimuli. Localization of the neural generators of the human oddball response remains challenging due to the lack of a single imaging technique with good spatial and temporal resolution. Here, we use independent component analyses to fuse ERP and fMRI modalities in order to examine the dynamics of the auditory oddball response with high spatiotemporal resolution across the entire brain. Initial activations in auditory and motor planning regions are followed by auditory association cortex and motor execution regions. The P3 response is associated with brainstem, temporal lobe, and medial frontal activity and finally a late temporal lobe "evaluative" response. We show that fusing imaging modalities with different advantages can provide new information about the brain.

  19. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.

  20. Meet Spinky: An Open-Source Spindle and K-Complex Detection Toolbox Validated on the Open-Access Montreal Archive of Sleep Studies (MASS).

    PubMed

    Lajnef, Tarek; O'Reilly, Christian; Combrisson, Etienne; Chaibi, Sahbi; Eichenlaub, Jean-Baptiste; Ruby, Perrine M; Aguera, Pierre-Emmanuel; Samet, Mounir; Kachouri, Abdennaceur; Frenette, Sonia; Carrier, Julie; Jerbi, Karim

    2017-01-01

    Sleep spindles and K-complexes are among the most prominent micro-events observed in electroencephalographic (EEG) recordings during sleep. These EEG microstructures are thought to be hallmarks of sleep-related cognitive processes. Although tedious and time-consuming, their identification and quantification is important for sleep studies in both healthy subjects and patients with sleep disorders. Therefore, procedures for automatic detection of spindles and K-complexes could provide valuable assistance to researchers and clinicians in the field. Recently, we proposed a framework for joint spindle and K-complex detection (Lajnef et al., 2015a) based on a Tunable Q-factor Wavelet Transform (TQWT; Selesnick, 2011a) and morphological component analysis (MCA). Using a wide range of performance metrics, the present article provides critical validation and benchmarking of the proposed approach by applying it to open-access EEG data from the Montreal Archive of Sleep Studies (MASS; O'Reilly et al., 2014). Importantly, the obtained scores were compared to alternative methods that were previously tested on the same database. With respect to spindle detection, our method achieved higher performance than most of the alternative methods. This was corroborated with statistical tests that took into account both sensitivity and precision (i.e., the Matthews correlation coefficient (MCC), F1, and Cohen's κ). Our proposed method has been made available to the community via an open-source tool named Spinky (for spindle and K-complex detection). Thanks to a GUI implementation and access to Matlab and Python resources, Spinky is expected to contribute to an open-science approach that will enhance replicability and reliable comparisons of classifier performances for the detection of sleep EEG microstructure in both healthy and patient populations.

  1. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use.

    PubMed

    Pineau, Joelle; Moghaddam, Athena K; Yuen, Hiu Kim; Archambault, Philippe S; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user's driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real-time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an incline plane, and rolling across multiple types of obstacles). The data was processed using time-series analysis and data mining techniques, to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user's PW driving behavior.
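Of the four feature families compared, time-domain characterization is the simplest to illustrate: per-window summary statistics computed directly from the acceleration samples. The specific statistics below are common choices, assumed for illustration rather than taken from the paper:

```python
import math

def time_domain_features(window):
    """Time-domain characterization of one acceleration window:
    mean, RMS, peak-to-peak amplitude, and mean-crossing count."""
    n = len(window)
    mean = sum(window) / n
    rms = math.sqrt(sum(x * x for x in window) / n)
    p2p = max(window) - min(window)
    crossings = sum(1 for a, b in zip(window, window[1:])
                    if (a - mean) * (b - mean) < 0)
    return {"mean": mean, "rms": rms, "p2p": p2p, "crossings": crossings}

fv = time_domain_features([0.5, 1.5, -0.5, -1.5])
```

Feature vectors like this, computed per axis and per window, would then be fed to a classifier to separate safe from unsafe events.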

  2. Time difference of arrival to blast localization of potential chemical/biological event on the move

    NASA Astrophysics Data System (ADS)

    Morcos, Amir; Desai, Sachi; Peltzer, Brian; Hohil, Myron E.

    2007-10-01

By integrating a sensor suite able to discriminate potential chemical/biological (CB) events from high-explosive (HE) events, using a standalone acoustic sensor with a Time Difference of Arrival (TDOA) algorithm, we developed a cueing mechanism for more power-intensive and range-limited sensing techniques. The event detection algorithm first localizes a blast event using TDOA and then provides further information about the event: whether it is a launch or an impact, and whether it is CB or HE. This added information is passed to a range-limited chemical sensing system that exploits spectroscopy to determine the contents of the chemical event. The main innovation of this sensor suite is that it provides this information on the move, so the chemical sensor has adequate time to determine the contents of the event from a safe stand-off distance. The CB/HE discrimination algorithm exploits acoustic sensors to provide early detection and identification of CB attacks. Distinct characteristics arise within the different airburst signatures because HE warheads emphasize concussive and shrapnel effects, while CB warheads are designed to disperse their contents over large areas and therefore employ a slower burning, less intense explosive to mix and spread their contents. The differences are characterized by variations in the corresponding peak pressure and rise time of the blast, in the ratio of positive pressure amplitude to negative amplitude, and in the overall duration of the resulting waveform. The discrete wavelet transform (DWT) is used to extract the predominant components of these characteristics from airburst signatures at ranges exceeding 3 km. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients and the higher-frequency details found within different levels of the multiresolution decomposition. The development of an adaptive noise floor for early event detection helps minimize the false alarm rate and increases confidence in deciding whether an event is a blast or background noise. Integrating these algorithms with the TDOA algorithm yields a suite that can give early warning and a highly reliable look direction at a large stand-off distance from a moving vehicle, determining whether a candidate blast event is CB and, if so, the composition of the resulting cloud.
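For the far-field geometry underlying a TDOA look direction, the bearing for a single sensor pair follows from the arrival-time difference alone. A minimal sketch (the paper's actual algorithm, using a full acoustic array, is not reproduced here):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_tdoa(dt, mic_spacing):
    """Far-field bearing (radians from broadside) of an acoustic event for a
    two-sensor pair: theta = arcsin(c * dt / d), where dt is the arrival-time
    difference and d the sensor spacing. Geometry sketch only."""
    x = SPEED_OF_SOUND * dt / mic_spacing
    if abs(x) > 1.0:
        raise ValueError("inconsistent delay: implied path exceeds sensor spacing")
    return math.asin(x)
```

For a 0.5 m pair, a source 30 degrees off broadside produces a delay of about 0.73 ms, which the function inverts back to the bearing.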

  3. Analysis of the QRS complex for apnea-bradycardia characterization in preterm infants

    PubMed Central

    Altuve, Miguel; Carrault, Guy; Cruz, Julio; Beuchée, Alain; Pladys, Patrick; Hernandez, Alfredo I.

    2009-01-01

This work presents an analysis of the information content of new features derived from the electrocardiogram (ECG) for the characterization of apnea-bradycardia events in preterm infants. Automatic beat detection and segmentation methods have been adapted to the ECG signals of preterm infants through the application of two evolutionary algorithms. ECG data acquired from 32 preterm infants with persistent apnea-bradycardia were used for quantitative evaluation. The adaptation procedure led to improved sensitivity and positive predictive value, and a reduced jitter, for the detection of the R-wave, QRS onset, QRS offset, and iso-electric level. Additionally, time series representing the RR interval, R-wave amplitude and QRS duration were automatically extracted for periods at rest, and before, during and after apnea-bradycardia episodes. Significant variations (p<0.05) were observed for all time series when comparing the difference between values at rest and values just before the bradycardia event with the difference between values at rest and values during the bradycardia event. These results reveal changes in the R-wave amplitude and QRS duration, appearing at the onset and termination of apnea-bradycardia episodes, which could be potentially useful for the early detection and characterization of these episodes. PMID:19963984

  4. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  5. A State-Space Approach to Optimal Level-Crossing Prediction for Linear Gaussian Processes

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

In many complex engineered systems, the ability to give an alarm prior to impending critical events is of great importance. These critical events may have varying degrees of severity, and in fact they may occur during normal system operation. In this article, we investigate approximations to theoretically optimal methods of designing alarm systems for the prediction of level-crossings by a zero-mean stationary linear dynamic system driven by Gaussian noise. An optimal alarm system is designed to elicit the fewest false alarms for a fixed detection probability. This work introduces the use of Kalman filtering in tandem with the optimal level-crossing problem. It is shown that there is a negligible loss in overall accuracy when using approximations to the theoretically optimal predictor, with the advantage of greatly reduced computational complexity.
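As a toy illustration of the alarm-design idea, assume the monitored signal is a scalar AR(1) Gaussian process; the one-step-ahead predictor then gives a closed-form crossing probability to compare against a design threshold. This is a simplified stand-in for the Kalman-predictor-based alarm in the paper, with made-up parameters:

```python
import math

def gaussian_tail(x):
    """P(Z > x) for a standard normal Z."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def alarm(x_now, level, phi, sigma_w, p_crit):
    """One-step-ahead alarm for an AR(1) process x[k+1] = phi*x[k] + w,
    w ~ N(0, sigma_w^2): fire when the predicted probability of exceeding
    the critical level is above p_crit."""
    mean_next = phi * x_now                          # predicted mean
    p_cross = gaussian_tail((level - mean_next) / sigma_w)
    return p_cross > p_crit, p_cross
```

An optimal design would choose `p_crit` to minimize false alarms at a fixed detection probability; here it is simply a free parameter.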

  6. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.

    2017-12-01

Modern seismic networks present a number of challenges; perhaps most notable are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure, and the addition or removal of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  7. EEG signatures accompanying auditory figure-ground segregation

    PubMed Central

    Tóth, Brigitta; Kocsis, Zsuzsanna; Háden, Gábor P.; Szerafin, Ágnes; Shinn-Cunningham, Barbara; Winkler, István

    2017-01-01

    In everyday acoustic scenes, figure-ground segregation typically requires one to group together sound elements over both time and frequency. Electroencephalogram was recorded while listeners detected repeating tonal complexes composed of a random set of pure tones within stimuli consisting of randomly varying tonal elements. The repeating pattern was perceived as a figure over the randomly changing background. It was found that detection performance improved both as the number of pure tones making up each repeated complex (figure coherence) increased, and as the number of repeated complexes (duration) increased – i.e., detection was easier when either the spectral or temporal structure of the figure was enhanced. Figure detection was accompanied by the elicitation of the object related negativity (ORN) and the P400 event-related potentials (ERPs), which have been previously shown to be evoked by the presence of two concurrent sounds. Both ERP components had generators within and outside of auditory cortex. The amplitudes of the ORN and the P400 increased with both figure coherence and figure duration. However, only the P400 amplitude correlated with detection performance. These results suggest that 1) the ORN and P400 reflect processes involved in detecting the emergence of a new auditory object in the presence of other concurrent auditory objects; 2) the ORN corresponds to the likelihood of the presence of two or more concurrent sound objects, whereas the P400 reflects the perceptual recognition of the presence of multiple auditory objects and/or preparation for reporting the detection of a target object. PMID:27421185

  8. Correlation between DNA ploidy, metaphase high-resolution comparative genomic hybridization results and clinical outcome of synovial sarcoma

    PubMed Central

    2011-01-01

Background Although synovial sarcoma is the third most commonly occurring mesenchymal tumor in young adults, usually with a highly aggressive clinical course, remarkable differences can be seen in clinical outcome. According to comparative genomic hybridization (CGH) data published in the literature, simple and complex karyotypes correlate with prognosis and clinical outcome. In contrast, the connection between DNA ploidy and clinical course is controversial. The aim of this study was to apply a fine-tuned interpretation of our DNA ploidy results and to compare these with metaphase high-resolution CGH (HR-CGH) results. Methods DNA ploidy was determined on Feulgen-stained smears in 56 synovial sarcoma cases by image cytometry; follow-up was available in 46 cases (average: 78 months). In 9 cases HR-CGH analysis was also available. Results Ten cases were found to be DNA-aneuploid and 46 DNA-diploid by image cytometry. With fine-tuning of the diploid cases according to 5c-exceeding events (single-cell aneuploidy), 33 cases were "simple-diploid" (without 5c-exceeding events) and 13 cases were "complex-diploid", containing 5c-exceeding events (any number). Aneuploid tumors contained large numbers of genetic alterations, with a summed gain of at least 2 chromosomes (A-, B- or C-group) detected by HR-CGH. In the "simple-diploid" cases no or few genetic alterations could be detected, whereas in the "complex-diploid" samples numerous aberrations (three or more) could be found. Conclusions Our results show a correlation between DNA ploidy, fine-tuned DNA ploidy and the HR-CGH results. Furthermore, we found a significant correlation between the different ploidy groups and the clinical outcome (p < 0.05). PMID:22053830

  9. Biomolecule recognition using piezoresistive nanomechanical force probes

    NASA Astrophysics Data System (ADS)

    Tosolini, Giordano; Scarponi, Filippo; Cannistraro, Salvatore; Bausells, Joan

    2013-06-01

Highly sensitive sensors are one of the enabling technologies for biomarker detection in early-stage diagnosis of pathologies. We have developed a self-sensing nanomechanical force probe able to detect the unbinding of single pairs of biomolecular partners in nearly physiological conditions. Embedding a piezoresistive transducer into a nanomechanical cantilever enabled high force-measurement capability with sub-10-pN resolution. Here, we present the design, microfabrication, optimization, and complete characterization of the sensor. The exceptional electromechanical performance obtained allowed us to detect specific biorecognition events underlying biotin-avidin complex formation, by integrating the sensor into a commercial atomic force microscope.

  10. Manananggal - a novel viewer for alternative splicing events.

    PubMed

    Barann, Matthias; Zimmer, Ralf; Birzele, Fabian

    2017-02-21

Alternative splicing is an important cellular mechanism that can be analyzed by RNA sequencing. However, identification of splicing events in an automated fashion is error-prone. Thus, further validation is required to select reliable instances of alternative splicing events (ASEs). There are only a few tools specifically designed for interactive inspection of ASEs, and available visualization approaches can be significantly improved. Here, we present Manananggal, an application specifically designed for the identification of splicing events in next generation sequencing data. Manananggal includes a web application for visual inspection and a command line tool that allows for ASE detection. We compare the sashimi plots available in the IGV Viewer, the DEXSeq splicing plots and SpliceSeq to the Manananggal interface and discuss the advantages and drawbacks of these tools. We show that sashimi plots (such as those used by the IGV Viewer and SpliceSeq) offer a practical solution for simple ASEs, but also exhibit shortcomings for highly complex genes. Manananggal is an interactive web application that offers functions specifically tailored to the identification of alternative splicing events that other tools are lacking. The ability to select a subset of isoforms allows an easier interpretation of complex alternative splicing events. In contrast to SpliceSeq and the DEXSeq splicing plot, Manananggal does not obscure the gene structure by showing full transcript models, which makes it easier to determine which isoforms are expressed and which are not.

  11. High Energy Wide Area Blunt Impact on Composite Aircraft Structures

    NASA Astrophysics Data System (ADS)

    DeFrancisci, Gabriela K.

    The largest source of damage to commercial aircraft is caused by accidental contact with ground service equipment (GSE). The cylindrical bumper typically found on GSE distributes the impact load over a large contact area, possibly spanning multiple internal structural elements (frame bays) of a stiffened-skin fuselage. This type of impact can lead to damage that is widespread and difficult to detect visually. To address this problem, monolithic composite panels of various size and complexity have been modeled and tested quasi-statically and dynamically. The experimental observations have established that detectability is dependent on the impact location and immediately-adjacent internal structure of the panel, as well as the impactor geometry and total deformation of the panel. A methodology to model and predict damage caused by wide area blunt impact events was established, which was then applied to more general cases that were not tested in order to better understand the nature of this type of impact event and how it relates to the final damage state and visual detectability.

  12. Surveillance of ground vehicles for airport security

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Wang, Zhonghai; Shen, Dan; Ling, Haibin; Chen, Genshe

    2014-06-01

Future surveillance systems will operate in complex and cluttered environments that require systems engineering solutions for applications such as airport ground surface management. In this paper, we highlight the use of an L1 video tracker for monitoring activities at an airport. We present methods of information fusion, entity detection, and activity analysis using airport videos for runway detection and airport terminal events. For coordinated airport security, automated ground surveillance enhances efficient and safe maneuvers for aircraft, unmanned air vehicles (UAVs) and unmanned ground vehicles (UGVs) operating within airport environments.

  13. Evaluation of Earthquake Detection Performance in Terms of Quality and Speed in SEISCOMP3 Using New Modules Qceval, Npeval and Sceval

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.

    2017-12-01

The geometry of seismic monitoring networks, site conditions and data availability as well as monitoring targets and strategies typically impose trade-offs between data quality, earthquake detection sensitivity, false detections and alert times. Network detection capabilities typically change with alteration of the seismic noise level by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance, including qceval, npeval and sceval. qceval is a module which analyzes waveform quality parameters in real-time and deactivates and reactivates data streams for automatic processing based on waveform quality thresholds. For example, thresholds can be defined for latency, delay, timing quality, spike and gap counts, and rms. As changes in the automatic processing have a direct influence on detection quality and speed, another tool called "npeval" was designed to calculate in real-time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry. The effective network geometry is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real-time as soon as the effective network geometry changes. Yet another new tool, "sceval", is an automatic module which classifies located seismic events (Origins) in real-time. sceval evaluates the spatial distribution of the stations contributing to an Origin. It confirms or rejects the status of Origins, adds comments or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in where the end user can customize event types. This unique identification of real and fake events in earthquake catalogues makes it possible to lower network detection thresholds. In real-time monitoring situations operators can limit the processing to events with unclassified Origins, reducing their workload. Classified Origins can be treated specifically by other procedures. These modules have been calibrated and fully tested on several complex seismic monitoring networks in Indonesia and northern Chile.
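The gating logic of a qceval-style quality filter can be sketched as a per-stream threshold check. The metric names and limits below are illustrative assumptions, not actual SeisComP3 configuration keys:

```python
def active_streams(metrics, thresholds):
    """Decide which waveform streams stay enabled for automatic processing:
    a stream is deactivated when any quality metric exceeds its configured
    limit. Metric names and limits are illustrative only."""
    enabled = {}
    for stream, m in metrics.items():
        enabled[stream] = all(m.get(k, 0) <= lim for k, lim in thresholds.items())
    return enabled
```

In a real-time setting this check would be re-evaluated as new quality parameters arrive, so streams are reactivated automatically once their metrics recover.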

  14. Label-Free Nanopore Biosensor for Rapid and Highly Sensitive Cocaine Detection in Complex Biological Fluids.

    PubMed

    Rauf, Sana; Zhang, Ling; Ali, Asghar; Liu, Yang; Li, Jinghong

    2017-02-24

Detection of very low amounts of illicit drugs such as cocaine in clinical fluids like serum continues to be important for many areas in the fight against drug trafficking. Herein, we constructed a label-free nanopore biosensor for rapid and highly sensitive detection of cocaine in human serum and saliva samples based on a target-induced strand-release strategy. In this bioassay, an aptamer for cocaine was prehybridized with a short complementary DNA. Owing to cocaine's specific binding with the aptamer, the short DNA strand was displaced from the aptamer, and translocation of this output DNA through an α-hemolysin nanopore generated distinct spike-like current blockages. When plotted on a double-logarithmic scale, a linear relationship between target cocaine concentration and output DNA event frequency was obtained over a wide concentration range from 50 nM to 100 μM of cocaine, with a limit of detection down to 50 nM. In addition, this aptamer-based sensing method was successfully applied to cocaine detection in complex biological fluids such as human saliva and serum samples with great selectivity. Simple preparation, low cost, rapidity, label-free operation, and real-sample detection are the motivating factors for practical application of the proposed biosensor.
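The double-logarithmic calibration described above amounts to fitting a line to log(event frequency) versus log(concentration) and inverting it for unknown samples. A sketch with made-up numbers (the slope and intercept are assumptions, not the paper's fit):

```python
import math

def fit_loglog(concs, freqs):
    """Least-squares line in log-log space: log10(f) = a*log10(c) + b."""
    xs = [math.log10(c) for c in concs]
    ys = [math.log10(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def estimate_conc(freq, a, b):
    """Invert the calibration to estimate concentration from an observed
    event frequency."""
    return 10 ** ((math.log10(freq) - b) / a)
```

Given calibration points spanning the reported 50 nM to 100 µM range, the fitted line lets an unknown sample's concentration be read off from its measured event frequency.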

  15. Horizontal gene transfer in an acid mine drainage microbial community.

    PubMed

    Guo, Jiangtao; Wang, Qi; Wang, Xiaoqi; Wang, Fumeng; Yao, Jinxian; Zhu, Huaiqiu

    2015-07-04

Horizontal gene transfer (HGT) has been widely identified in complete prokaryotic genomes. However, the roles of HGT among members of a microbial community and in evolution remain largely unknown. With the emergence of metagenomics, it is now feasible, though nontrivial, to investigate such horizontal flow of genetic material among members of a microbial community from the natural environment. Because of the lack of suitable methods for metagenomic gene-transfer detection, microorganisms from a low-complexity acid mine drainage (AMD) community with near-complete genomes were used to detect possible gene transfer events and to suggest their biological significance. Using the annotation of coding regions by current tools, a phylogenetic approach, and an approximately unbiased test, we found that HGTs in AMD organisms are not rare, and we predicted 119 putative transferred genes. Among them, 14 HGT events were determined to be transfers among the AMD members. Further analysis of the 14 transferred genes revealed that the HGT events affected the functional evolution of archaea and bacteria in AMD, and probably shaped the community structure, such as the dominance of G-plasma among the archaea in AMD through HGT. Our study provides novel insight into HGT events among microorganisms in natural communities. The interconnectedness between HGT and community evolution is essential to understanding microbial community formation and development.

  16. Assessment of imprinting- and genetic variation-dependent monoallelic expression using reciprocal allele descendants between human family trios.

    PubMed

    Chuang, Trees-Juen; Tseng, Yu-Hsiang; Chen, Chia-Ying; Wang, Yi-Da

    2017-08-01

Genomic imprinting is an important epigenetic process that silences one of the parentally-inherited alleles of a gene and thereby exhibits allelic-specific expression (ASE). Detection of human imprinting events is hampered by the infeasibility of the reciprocal mating system in humans and the removal of ASE events arising from non-imprinting factors. Here, we describe a pipeline with the pattern of reciprocal allele descendants (RADs) through genotyping and transcriptome sequencing data across independent parent-offspring trios to discriminate between varied types of ASE (e.g., imprinting, genetic variation-dependent ASE, and random monoallelic expression (RME)). We show that the vast majority of ASE events are due to sequence-dependent genetic variants, which are evolutionarily conserved and may themselves play a cis-regulatory role. In particular, 74% of non-RAD ASE events, even though they exhibit ASE biases toward the same parentally-inherited allele across different individuals, are derived from genetic variation but not imprinting. We further show that the RME effect may affect the effectiveness of the population-based method for detecting imprinting events and that our pipeline can help to distinguish between these two ASE types. Taken together, this study provides a good indicator for categorization of different types of ASE, opening up this widespread and complex mechanism for comprehensive characterization.

  17. Targeting safety improvements through identification of incident origination and detection in a near-miss incident learning system.

    PubMed

    Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing

    2016-05-01

Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but have not examined where they are detected, nor applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (from no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%).
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, average NMRI of all events = 1.5), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, the therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are of a type that the barrier is not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest impact portions of the workflow. The most severe near-miss events tend to originate during simulation, and are most often detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
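The aggregation described above (average NMRI by origination and detection category, plus the fraction caught at safety barriers) can be sketched directly. The field names are assumptions for illustration, not the study's actual schema:

```python
from collections import defaultdict

def summarize(incidents):
    """Average near-miss risk index (NMRI, 0-4) by origination and detection
    workflow category, and the fraction of incidents caught at a safety
    barrier. Each incident is a dict with illustrative keys."""
    by_orig, by_det = defaultdict(list), defaultdict(list)
    caught = 0
    for inc in incidents:
        by_orig[inc["originated"]].append(inc["nmri"])
        by_det[inc["detected"]].append(inc["nmri"])
        caught += inc["at_barrier"]
    avg = lambda d: {k: sum(v) / len(v) for k, v in d.items()}
    return avg(by_orig), avg(by_det), caught / len(incidents)
```

Run over a full incident log, such a summary reproduces the kind of per-category averages reported in the abstract (e.g. origination during imaging for RT planning being the most severe on average).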

  18. Short template switch events explain mutation clusters in the human genome.

    PubMed

    Löytynoja, Ari; Goldman, Nick

    2017-06-01

    Resequencing efforts are uncovering the extent of genetic variation in humans and provide data to study the evolutionary processes shaping our genome. One recurring puzzle in both intra- and inter-species studies is the high frequency of complex mutations comprising multiple nearby base substitutions or insertion-deletions. We devised a generalized mutation model of template switching during replication that extends existing models of genome rearrangement and used this to study the role of template switch events in the origin of short mutation clusters. Applied to the human genome, our model detects thousands of template switch events during the evolution of human and chimp from their common ancestor and hundreds of events between two independently sequenced human genomes. Although many of these are consistent with a template switch mechanism previously proposed for bacteria, our model also identifies new types of mutations that create short inversions, some flanked by paired inverted repeats. The local template switch process can create numerous complex mutation patterns, including hairpin loop structures, and explains multinucleotide mutations and compensatory substitutions without invoking positive selection, speculative mechanisms, or implausible coincidence. Clustered sequence differences are challenging for current mapping and variant calling methods, and we show that many erroneous variant annotations exist in human reference data. Local template switch events may have been neglected as an explanation for complex mutations because of biases in commonly used analyses. Incorporation of our model into reference-based analysis pipelines and comparisons of de novo assembled genomes will lead to improved understanding of genome variation and evolution. © 2017 Löytynoja and Goldman; Published by Cold Spring Harbor Laboratory Press.
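The simplest inverted-switch signature, a mutation cluster that is exactly the reverse complement of the local reference, can be tested directly. This is a toy reduction of the paper's generalized template-switch model (which also handles switches to nearby, non-identical template positions):

```python
def revcomp(seq):
    """Reverse complement of a DNA string."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def explained_by_inversion(ref, alt, start, end):
    """True if the cluster of differences in alt[start:end] is exactly the
    reverse complement of the reference segment: the signature of a local
    template switch to the opposite strand."""
    return alt[start:end] == revcomp(ref[start:end])
```

A cluster of apparent substitutions that passes this check is a single inversion event, not several independent point mutations, which is the core reinterpretation the paper argues for.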

  19. Calculated concentrations of any radionuclide deposited on the ground by release from underground nuclear detonations, tests of nuclear rockets, and tests of nuclear ramjet engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hicks, H.G.

    1981-11-01

    This report presents calculated gamma radiation exposure rates and ground deposition of related radionuclides resulting from three types of event that deposited detectable radioactivity outside the Nevada Test Site complex, namely, underground nuclear detonations, tests of nuclear rocket engines and tests of nuclear ramjet engines.

  20. Topological data analyses and machine learning for detection, classification and characterization of atmospheric rivers

    NASA Astrophysics Data System (ADS)

    Muszynski, G.; Kashinath, K.; Wehner, M. F.; Prabhat, M.; Kurlin, V.

    2017-12-01

We investigate novel approaches to detecting, classifying and characterizing extreme weather events, such as atmospheric rivers (ARs), in large high-dimensional climate datasets. ARs are narrow filaments of concentrated water vapour in the atmosphere that bring much of the precipitation in many mid-latitude regions. The precipitation associated with ARs is also responsible for major flooding events in many coastal regions of the world, including the west coast of the United States and western Europe. In this study we combine ideas from Topological Data Analysis (TDA) with Machine Learning (ML) for detecting, classifying and characterizing extreme weather events, like ARs. TDA is a new field that sits at the interface between topology and computer science, studying "shape", the hidden topological structure in raw data. It has been applied successfully in many areas of applied science, including complex networks, signal processing and image recognition. Using TDA we provide ARs with a shape characteristic as a new feature descriptor for the task of AR classification. In particular, we track the change in topology in precipitable water (integrated water vapour) fields using the Union-Find algorithm. We use the generated feature descriptors with ML classifiers to establish the reliability and classification performance of our approach. We utilize the parallel toolkit for extreme climate events analysis (TECA: Petascale Pattern Recognition for Climate Science, Prabhat et al., Computer Analysis of Images and Patterns, 2015) for comparison (it is assumed that events identified by TECA are ground truth). Preliminary results indicate that our approach brings new insight into the study of ARs and provides quantitative information about the relevance of topological feature descriptors in analyses of large climate datasets. We illustrate this method on climate model output and NCEP reanalysis datasets.
Further, our method outperforms existing methods on detection and classification of ARs. This work illustrates that TDA combined with ML may provide a uniquely powerful approach for detection, classification and characterization of extreme weather phenomena.
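As a minimal sketch of the topological tracking step described above, the following toy counts connected components of a thresholded 2D field with a Union-Find structure; sweeping the threshold yields a simple shape descriptor. The synthetic field, the thresholds, and the 4-connectivity choice are illustrative assumptions, not the authors' TECA-scale pipeline:

```python
import numpy as np

class UnionFind:
    """Disjoint-set structure with path compression and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def count_components(field, threshold):
    """Number of 4-connected components where field > threshold."""
    rows, cols = field.shape
    mask = field > threshold
    uf = UnionFind(rows * cols)
    for i in range(rows):
        for j in range(cols):
            if not mask[i, j]:
                continue
            if i + 1 < rows and mask[i + 1, j]:
                uf.union(i * cols + j, (i + 1) * cols + j)
            if j + 1 < cols and mask[i, j + 1]:
                uf.union(i * cols + j, i * cols + (j + 1))
    roots = {uf.find(i * cols + j)
             for i in range(rows) for j in range(cols) if mask[i, j]}
    return len(roots)

# Sweeping the threshold yields a component-count profile, a simple
# topological feature descriptor for the field.
field = np.zeros((5, 5))
field[0, 0:3] = 2.0   # one "filament"
field[4, 3:5] = 2.0   # a second, disconnected "filament"
profile = [count_components(field, t) for t in (1.0, 3.0)]
print(profile)  # [2, 0]
```

In an AR application the descriptor would be computed over many thresholds of the precipitable water field and fed to a classifier as a feature vector.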

  1. New Frontiers in Characterization of Sub-Catalog Microseismicity: Utilizing Inter-Event Waveform Cross Correlation for Estimating Precise Locations, Magnitudes, and Focal Mechanisms of Tiny Earthquakes

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Shelly, D. R.; Hardebeck, J.; Hill, D. P.

    2017-12-01

Microseismicity often conveys the most direct information about active processes in the earth's subsurface. However, routine network processing typically leaves most earthquakes uncharacterized. These "sub-catalog" events can provide critical clues to ongoing processes in the source region. To address this issue, we have developed waveform-based processing that leverages the existing routine catalog of earthquakes to detect and characterize "sub-catalog" events (those absent from routine catalogs). By correlating waveforms of cataloged events with the continuous data stream, we (1) identify events with similar waveform signatures in the continuous data across multiple stations, (2) precisely measure relative time lags across these stations for both P- and S-wave time windows, and (3) estimate the relative polarity between events from the sign of the peak absolute correlation and its height above the secondary peak. When combined, these inter-event comparisons yield robust measurements that enable sensitive event detection, relative relocation, and relative magnitude estimation. The most recent addition, focal mechanisms derived from correlation-based relative polarities, addresses a significant shortcoming in microseismicity analyses (see Shelly et al., JGR, 2016). Depending on the application, we can characterize 2-10 times as many events as are included in the initial catalog. This technique is particularly well suited to compact zones of active seismicity such as seismic swarms. Application to a 2014 swarm in Long Valley Caldera, California, illuminates complex patterns of faulting that would otherwise have remained obscured. The prevalence of such features in other environments remains an important, as yet unresolved, question.
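The correlation-based lag and polarity measurements described above can be sketched as follows on synthetic pulse waveforms; the normalization and the simple "height above the secondary peak" quality measure are simplified assumptions, not the authors' exact processing:

```python
import numpy as np

def relative_lag_and_polarity(wave_a, wave_b):
    """Cross-correlate two event waveforms recorded at one station.

    Returns the lag (in samples) of the peak |correlation|, the sign of
    that peak (the relative polarity estimate), and the height of the
    peak above the next-largest value as a crude quality measure.
    """
    a = wave_a - wave_a.mean()
    b = wave_b - wave_b.mean()
    a = a / (np.linalg.norm(a) or 1.0)
    b = b / (np.linalg.norm(b) or 1.0)
    cc = np.correlate(a, b, mode="full")
    k = int(np.argmax(np.abs(cc)))
    lag = k - (len(b) - 1)          # lag index per np.correlate's 'full' mode
    polarity = int(np.sign(cc[k]))
    others = np.delete(np.abs(cc), k)
    quality = float(np.abs(cc[k]) - others.max())
    return lag, polarity, quality

# Synthetic check: wave_b is wave_a delayed by 5 samples and flipped in sign.
t = np.linspace(0.0, 1.0, 200)
pulse = np.exp(-((t - 0.3) ** 2) / 0.001)
wave_a = pulse
wave_b = -np.roll(pulse, 5)
lag, polarity, quality = relative_lag_and_polarity(wave_a, wave_b)
print(lag, polarity)  # -5 -1
```

In practice such pairwise measurements would be accumulated over many station/event pairs and inverted jointly for relative locations, magnitudes, and polarities.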

  2. Effects of Voice Harmonic Complexity on ERP Responses to Pitch-Shifted Auditory Feedback

    PubMed Central

    Behroozmand, Roozbeh; Korzyukov, Oleg; Larson, Charles R.

    2011-01-01

Objective: The present study investigated the neural mechanisms of voice pitch control for different levels of harmonic complexity in the auditory feedback. Methods: Event-related potentials (ERPs) were recorded in response to +200 cents pitch perturbations in the auditory feedback of self-produced natural human vocalizations, complex tones and pure tones during active vocalization and passive listening conditions. Results: During active vocal production, ERP amplitudes were largest in response to pitch shifts in the natural voice, moderately large for non-voice complex stimuli and smallest for pure tones. During passive listening, by contrast, neural responses were equally large for pitch shifts in voice and non-voice complex stimuli but still larger than those for pure tones. Conclusions: These findings suggest that pitch change detection is facilitated for spectrally rich sounds, such as the natural human voice and non-voice complex stimuli, compared with pure tones. The vocalization-induced increase in neural responses to voice feedback suggests that sensory processing of naturally produced complex sounds such as the human voice is enhanced by motor-driven mechanisms (e.g. efference copies) during vocal production. Significance: This enhancement may enable the audio-vocal system to more effectively detect and correct vocal errors in the feedback of natural human vocalizations, maintaining the intended vocal output for speaking. PMID:21719346

  3. Network Catastrophe: Self-Organized Patterns Reveal both the Instability and the Structure of Complex Networks

    PubMed Central

    Moon, Hankyu; Lu, Tsai-Ching

    2015-01-01

Critical events in society or biological systems can be understood as large-scale self-emergent phenomena due to deteriorating stability. We often observe peculiar patterns preceding these events, which poses the question of how to interpret the self-organized patterns to learn more about the imminent crisis. We start with a very general description of interacting populations giving rise to large-scale emergent behaviors that constitute critical events. Then we pose a key question: is there a quantifiable relation between the network of interactions and the emergent patterns? Our investigation leads to a fundamental understanding of how to: (1) detect the system's transition based on the principal mode of the pattern dynamics; and (2) identify its evolving structure based on the observed patterns. The main finding of this study is that while the pattern is distorted by the network of interactions, its principal mode is invariant to the distortion even when the network constantly evolves. Our analyses of real-world markets show common self-organized behavior near critical transitions, such as housing market collapses and stock market crashes; detection of critical events before they are in full effect is therefore possible. PMID:25822423

  4. Network Catastrophe: Self-Organized Patterns Reveal both the Instability and the Structure of Complex Networks

    NASA Astrophysics Data System (ADS)

    Moon, Hankyu; Lu, Tsai-Ching

    2015-03-01

Critical events in society or biological systems can be understood as large-scale self-emergent phenomena due to deteriorating stability. We often observe peculiar patterns preceding these events, which poses the question of how to interpret the self-organized patterns to learn more about the imminent crisis. We start with a very general description of interacting populations giving rise to large-scale emergent behaviors that constitute critical events. Then we pose a key question: is there a quantifiable relation between the network of interactions and the emergent patterns? Our investigation leads to a fundamental understanding of how to: (1) detect the system's transition based on the principal mode of the pattern dynamics; and (2) identify its evolving structure based on the observed patterns. The main finding of this study is that while the pattern is distorted by the network of interactions, its principal mode is invariant to the distortion even when the network constantly evolves. Our analyses of real-world markets show common self-organized behavior near critical transitions, such as housing market collapses and stock market crashes; detection of critical events before they are in full effect is therefore possible.

  5. Fluid Intelligence Predicts Novel Rule Implementation in a Distributed Frontoparietal Control Network.

    PubMed

    Tschentscher, Nadja; Mitchell, Daniel; Duncan, John

    2017-05-03

    Fluid intelligence has been associated with a distributed cognitive control or multiple-demand (MD) network, comprising regions of lateral frontal, insular, dorsomedial frontal, and parietal cortex. Human fluid intelligence is also intimately linked to task complexity, and the process of solving complex problems in a sequence of simpler, more focused parts. Here, a complex target detection task included multiple independent rules, applied one at a time in successive task epochs. Although only one rule was applied at a time, increasing task complexity (i.e., the number of rules) impaired performance in participants of lower fluid intelligence. Accompanying this loss of performance was reduced response to rule-critical events across the distributed MD network. The results link fluid intelligence and MD function to a process of attentional focus on the successive parts of complex behavior. SIGNIFICANCE STATEMENT Fluid intelligence is intimately linked to the ability to structure complex problems in a sequence of simpler, more focused parts. We examine the basis for this link in the functions of a distributed frontoparietal or multiple-demand (MD) network. With increased task complexity, participants of lower fluid intelligence showed reduced responses to task-critical events. Reduced responses in the MD system were accompanied by impaired behavioral performance. Low fluid intelligence is linked to poor foregrounding of task-critical information across a distributed MD system. Copyright © 2017 Tschentscher et al.

  6. Flight deck disturbance management: a simulator study of diagnosis and recovery from breakdowns in pilot-automation coordination.

    PubMed

    Nikolic, Mark I; Sarter, Nadine B

    2007-08-01

    To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.

  7. Expert and crowd-sourced validation of an individualized sleep spindle detection method employing complex demodulation and individualized normalization

    PubMed Central

    Ray, Laura B.; Sockeel, Stéphane; Soon, Melissa; Bore, Arnaud; Myhr, Ayako; Stojanoski, Bobby; Cusack, Rhodri; Owen, Adrian M.; Doyon, Julien; Fogel, Stuart M.

    2015-01-01

    A spindle detection method was developed that: (1) extracts the signal of interest (i.e., spindle-related phasic changes in sigma) relative to ongoing “background” sigma activity using complex demodulation, (2) accounts for variations of spindle characteristics across the night, scalp derivations and between individuals, and (3) employs a minimum number of sometimes arbitrary, user-defined parameters. Complex demodulation was used to extract instantaneous power in the spindle band. To account for intra- and inter-individual differences, the signal was z-score transformed using a 60 s sliding window, per channel, over the course of the recording. Spindle events were detected with a z-score threshold corresponding to a low probability (e.g., 99th percentile). Spindle characteristics, such as amplitude, duration and oscillatory frequency, were derived for each individual spindle following detection, which permits spindles to be subsequently and flexibly categorized as slow or fast spindles from a single detection pass. Spindles were automatically detected in 15 young healthy subjects. Two experts manually identified spindles from C3 during Stage 2 sleep, from each recording; one employing conventional guidelines, and the other, identifying spindles with the aid of a sigma (11–16 Hz) filtered channel. These spindles were then compared between raters and to the automated detection to identify the presence of true positives, true negatives, false positives and false negatives. This method of automated spindle detection resolves or avoids many of the limitations that complicate automated spindle detection, and performs well compared to a group of non-experts, and importantly, has good external validity with respect to the extant literature in terms of the characteristics of automatically detected spindles. PMID:26441604
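A minimal sketch of the detection pipeline described above (complex demodulation followed by a sliding-window z-score threshold) is shown below. The moving-average low-pass, the 13.5 Hz centre frequency, and the z = 2.33 cut-off (roughly a one-sided 99th percentile under normality) are illustrative assumptions, not the published implementation:

```python
import numpy as np

def complex_demodulate(signal, fs, f0, smooth_s=0.25):
    """Instantaneous power in a band centred on f0 via complex demodulation.

    The signal is heterodyned to baseband by a complex exponential at f0,
    then smoothed here with a moving average (a stand-in for a proper
    low-pass filter).
    """
    t = np.arange(len(signal)) / fs
    baseband = signal * np.exp(-2j * np.pi * f0 * t)
    w = max(1, int(smooth_s * fs))
    kernel = np.ones(w) / w
    smooth = np.convolve(baseband, kernel, mode="same")
    return np.abs(smooth) ** 2

def detect_events(power, fs, window_s=60.0, z_thresh=2.33):
    """Flag samples whose band power exceeds a z-score threshold computed
    in a sliding window, approximating a per-recording adaptive criterion."""
    half = int(window_s * fs / 2)
    flags = np.zeros(len(power), dtype=bool)
    for i in range(len(power)):
        lo, hi = max(0, i - half), min(len(power), i + half)
        mu, sd = power[lo:hi].mean(), power[lo:hi].std()
        if sd > 0 and (power[i] - mu) / sd > z_thresh:
            flags[i] = True
    return flags

# Synthetic demo: background noise with a 1 s burst of 13 Hz activity
# (a stand-in "spindle") starting at t = 30 s.
fs = 100
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(fs * 60)
t_burst = np.arange(fs) / fs
x[30 * fs:31 * fs] += np.sin(2 * np.pi * 13 * t_burst)
power = complex_demodulate(x, fs, f0=13.5)
flags = detect_events(power, fs)
print(flags[30 * fs:31 * fs].any())  # True
```

A real detector would additionally enforce a minimum event duration and derive per-spindle amplitude, duration, and oscillatory frequency after detection, as the method describes.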

  8. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced database management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL), which is capable of detecting and acting upon complex patterns of events that are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
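The kind of temporally constrained event pattern EPL expresses (e.g. a suspicious sequence of bank transactions within a time limit) can be illustrated with a toy matcher. This is a hedged Python sketch of the idea only, not EPL's CLIPS-based rule syntax:

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str
    time: float  # seconds

def matches_sequence(events, pattern, max_span):
    """Return True if `pattern` (a list of event kinds) occurs as an
    in-order subsequence of the time-ordered `events` within `max_span`
    seconds of the first matched event."""
    for start, first in enumerate(events):
        if first.kind != pattern[0]:
            continue
        i, deadline = 1, first.time + max_span
        for ev in events[start + 1:]:
            if ev.time > deadline:
                break
            if ev.kind == pattern[i]:
                i += 1
                if i == len(pattern):
                    return True
    return False

# Toy transaction log: two withdrawals followed by a transfer.
log = [Event("withdraw", 0), Event("withdraw", 30),
       Event("deposit", 90), Event("withdraw", 200),
       Event("transfer", 550)]
print(matches_sequence(log, ["withdraw", "withdraw", "transfer"], 600))  # True
print(matches_sequence(log, ["withdraw", "withdraw", "transfer"], 100))  # False
```

EPL's goals additionally support negation, disjunction and repetition of sub-events, which a full matcher would handle with a richer pattern representation than this flat list.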

  9. A real time QRS detection using delay-coordinate mapping for the microcontroller implementation.

    PubMed

    Lee, Jeong-Whan; Kim, Kyeong-Seop; Lee, Bongsoo; Lee, Byungchae; Lee, Myoung-Ho

    2002-01-01

In this article, we propose a new algorithm for real-time detection of QRS complexes in ECG signals, using the characteristics of phase portraits reconstructed by delay-coordinate mapping utilizing lag rotundity. In reconstructing the phase portrait, the mapping parameters (time delay and mapping dimension) play important roles in shaping the portraits drawn in the new space. Experimentally, the optimal mapping time delay for detection of QRS complexes turned out to be 20 ms. To explore the meaning of this time delay and the proper mapping dimension, we applied fill factor, mutual information, and autocorrelation function algorithms that are generally used to analyze the chaotic characteristics of sampled signals. These results showed that the performance of our proposed algorithm relies mainly on a geometrical property, the area of the reconstructed phase portrait. As a practical application, we used our algorithm in the design of a small cardiac event recorder. This system records the patient's ECG and R-R intervals for 1 h to investigate the HRV characteristics of patients with vasovagal syncope symptoms. For evaluation, we implemented our algorithm in C and applied it to the 48 subjects of the MIT/BIH arrhythmia database. Our proposed algorithm achieved a 99.58% detection rate of QRS complexes.
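The delay-coordinate mapping itself is straightforward. The sketch below (a synthetic signal, and a simple off-diagonal distance as a crude stand-in for the portrait-area property the authors describe) illustrates why a steep QRS-like deflection stands out in the reconstructed space while slow baseline activity hugs the identity line:

```python
import numpy as np

def delay_embed(x, delay):
    """Two-dimensional delay-coordinate mapping: points (x[n], x[n+delay])."""
    return np.column_stack((x[:-delay], x[delay:]))

def portrait_spread(x, fs, delay_ms=20, win_ms=100):
    """Per-sample spread of the phase portrait away from the identity line
    x[n] = x[n+delay], tracked with a trailing moving maximum. A slow wave
    stays near the line; a steep QRS-like deflection swings far from it."""
    delay = max(1, int(fs * delay_ms / 1000))
    pts = delay_embed(x, delay)
    dist = np.abs(pts[:, 0] - pts[:, 1]) / np.sqrt(2)
    win = max(1, int(fs * win_ms / 1000))
    return np.array([dist[max(0, i - win):i + 1].max()
                     for i in range(len(dist))])

# Synthetic demo (not real ECG): a slow 1 Hz wave with a sharp spike
# standing in for a QRS complex at t = 0.5 s.
fs = 250
t = np.arange(fs) / fs
x = 0.1 * np.sin(2 * np.pi * 1 * t)
x[int(0.5 * fs)] += 1.0
spread = portrait_spread(x, fs)
qrs_index = int(np.argmax(spread))
print(abs(qrs_index - int(0.5 * fs)) <= int(0.1 * fs))  # True
```

The 20 ms delay matches the time scale of the QRS upstroke, which is why the authors found it optimal for separating QRS geometry from slower waves.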

  10. Cardiac mitochondrial matrix and respiratory complex protein phosphorylation

    PubMed Central

    Covian, Raul

    2012-01-01

    It has become appreciated over the last several years that protein phosphorylation within the cardiac mitochondrial matrix and respiratory complexes is extensive. Given the importance of oxidative phosphorylation and the balance of energy metabolism in the heart, the potential regulatory effect of these classical signaling events on mitochondrial function is of interest. However, the functional impact of protein phosphorylation and the kinase/phosphatase system responsible for it are relatively unknown. Exceptions include the well-characterized pyruvate dehydrogenase and branched chain α-ketoacid dehydrogenase regulatory system. The first task of this review is to update the current status of protein phosphorylation detection primarily in the matrix and evaluate evidence linking these events with enzymatic function or protein processing. To manage the scope of this effort, we have focused on the pathways involved in energy metabolism. The high sensitivity of modern methods of detecting protein phosphorylation and the low specificity of many kinases suggests that detection of protein phosphorylation sites without information on the mole fraction of phosphorylation is difficult to interpret, especially in metabolic enzymes, and is likely irrelevant to function. However, several systems including protein translocation, adenine nucleotide translocase, cytochrome c, and complex IV protein phosphorylation have been well correlated with enzymatic function along with the classical dehydrogenase systems. The second task is to review the current understanding of the kinase/phosphatase system within the matrix. Though it is clear that protein phosphorylation occurs within the matrix, based on 32P incorporation and quantitative mass spectrometry measures, the kinase/phosphatase system responsible for this process is ill-defined. 
An argument is presented that remnants of the much more labile bacterial protein phosphoryl transfer system may be present in the matrix and that the evaluation of this possibility will require the application of approaches developed for bacterial cell signaling to the mitochondria. PMID:22886415

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, Avrey; Nyflot, Matthew J.; Ermoian, Ralph P.

Purpose: Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. Methods: From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (from no potential harm to potentially critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers that most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Results: Near-miss incidents most commonly originated within treatment planning (33%).
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0; average NMRI of all events = 1.5), specifically during documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist's chart check, and the review of portal images; however, most incidents that pass through a particular safety barrier are of a type that the barrier was not designed to capture. Conclusions: Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.

  12. Cloud-based serviced-orientated data systems for ocean observational data - an example from the coral reef community

    NASA Astrophysics Data System (ADS)

    Bainbridge, S.

    2012-04-01

The advent of new observing systems, such as sensor networks, has dramatically increased our ability to collect marine data; the issue now is not data drought but data deluge. The challenge is to extract data representing events of interest from the background data, that is, to deliver information and potentially knowledge from an increasingly large store of base observations. Given that each potential user will have a different definition of 'interesting', often itself defined by other events and data, systems need to deliver information or knowledge in a form and context defined by the user. This paper reports on a series of coral reef sensor networks set up under the Coral Reef Environmental Observation Network (CREON). CREON is a community-of-interest group deploying coral reef sensor networks with the goal of increasing capacity in coral reef observation, especially in developing areas. Issues such as coral bleaching, terrestrial runoff, human impacts and climate change are affecting reefs, with one assessment indicating that a quarter of the world's reefs are severely degraded and another quarter is under immediate threat. Increasing our ability to collect scientifically valid observations is fundamental to understanding these systems and ultimately to preserving and sustaining them. A cloud-based data management system was used to store the base sensor data from each agency involved, using service-based agents to push the data from individual field sensors to the cloud. The system supports a range of service-based outputs such as on-line graphs, a smart-phone application and simple event detection. A more complex event detection system was written that takes input from the cloud services and outputs natural-language 'tweets' to Twitter as events occur. It therefore becomes possible to distil the entire data set down to a series of Twitter entries that interested parties can subscribe to.
The next step is to allow users to define their own events and to deliver results, in context, to their preferred medium. The paper contrasts what has been achieved within a small community with well-defined issues against what it would take to build equivalent systems holding a wide range of cross-community observational data and addressing a wider range of potential issues. The roles of discoverability, quality control, uncertainty, conformity and metadata are investigated, along with a brief discussion of existing and emerging standards in this area. The elements of such a system are described, along with the role of modelling and scenario tools in delivering higher-level outputs that link what may have already occurred (event detection) with what may potentially occur (scenarios). The development of service-based, cloud-hosted open data systems, coupled with complex event detection delivering through social media and other channels and linked into model and scenario systems, represents one vision for delivering value from the increasing store of ocean observations, most of which lie unknown, unused and unloved.
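The 'simple event detection' layer feeding the Twitter channel can be imagined as little more than threshold agents over the observation stream. The sketch below is purely illustrative; the site name, variable names and thresholds are invented, not CREON's actual services:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    site: str
    variable: str
    value: float
    time: str

def detect_events(obs_stream, thresholds):
    """Minimal event-detection agent: compare each incoming observation
    against a per-variable threshold and emit a natural-language message
    for any exceedance, in the spirit of a 'tweet on event' output."""
    messages = []
    for ob in obs_stream:
        limit = thresholds.get(ob.variable)
        if limit is not None and ob.value > limit:
            messages.append(
                f"{ob.time} {ob.site}: {ob.variable} reached "
                f"{ob.value:.1f} (threshold {limit:.1f})")
    return messages

# Hypothetical observations from a reef station.
stream = [Observation("Davies Reef", "water_temp_C", 29.2, "2012-02-01T14:00"),
          Observation("Davies Reef", "water_temp_C", 31.4, "2012-02-01T15:00")]
msgs = detect_events(stream, {"water_temp_C": 30.0})
print(msgs)
```

User-defined events, as proposed in the abstract, would amount to letting subscribers register their own threshold (or more complex) predicates and delivery channels.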

  13. A Carbon Nanotube Reporter of miRNA Hybridization Events In Vivo

    PubMed Central

    Harvey, Jackson D.; Jena, Prakrit V.; Baker, Hanan A.; Zerze, Gül H.; Williams, Ryan M.; Galassi, Thomas V.; Roxbury, Daniel; Mittal, Jeetain

    2017-01-01

    MicroRNAs and other small oligonucleotides in biofluids are promising disease biomarkers, yet conventional assays require complex processing steps that are unsuitable for point-of-care testing or for implantable or wearable sensors. Single-walled carbon nanotubes are an ideal material for implantable sensors, owing to their emission in the near-infrared spectral region, photostability and exquisite sensitivity. Here, we report an engineered carbon-nanotube-based sensor capable of real-time optical quantification of hybridization events of microRNA and other oligonucleotides. The mechanism of the sensor arises from competitive effects between displacement of both oligonucleotide charge groups and water from the nanotube surface, which result in a solvatochromism-like response. The sensor, which allows for detection via single-molecule sensor elements and for multiplexing by using multiple nanotube chiralities, can monitor toehold-based strand-displacement events, which reverse the sensor response and regenerate the sensor complex. We also show that the sensor functions in whole urine and serum, and can non-invasively measure DNA and microRNA after implantation in live mice. PMID:28845337

  14. A Carbon Nanotube Reporter of miRNA Hybridization Events In Vivo.

    PubMed

    Harvey, Jackson D; Jena, Prakrit V; Baker, Hanan A; Zerze, Gül H; Williams, Ryan M; Galassi, Thomas V; Roxbury, Daniel; Mittal, Jeetain; Heller, Daniel A

    2017-01-01

    MicroRNAs and other small oligonucleotides in biofluids are promising disease biomarkers, yet conventional assays require complex processing steps that are unsuitable for point-of-care testing or for implantable or wearable sensors. Single-walled carbon nanotubes are an ideal material for implantable sensors, owing to their emission in the near-infrared spectral region, photostability and exquisite sensitivity. Here, we report an engineered carbon-nanotube-based sensor capable of real-time optical quantification of hybridization events of microRNA and other oligonucleotides. The mechanism of the sensor arises from competitive effects between displacement of both oligonucleotide charge groups and water from the nanotube surface, which result in a solvatochromism-like response. The sensor, which allows for detection via single-molecule sensor elements and for multiplexing by using multiple nanotube chiralities, can monitor toehold-based strand-displacement events, which reverse the sensor response and regenerate the sensor complex. We also show that the sensor functions in whole urine and serum, and can non-invasively measure DNA and microRNA after implantation in live mice.

  15. Applying complex networks to evaluate precipitation patterns over South America

    NASA Astrophysics Data System (ADS)

    Ciemer, Catrin; Boers, Niklas; Barbosa, Henrique; Kurths, Jürgen; Rammig, Anja

    2016-04-01

The climate of South America exhibits pronounced differences between the wet and the dry season, which are accompanied by specific synoptic events such as changes in the location of the South American Low Level Jet (SALLJ) and the establishment of the South American Convergence Zone (SACZ). The onset of these events can be related to the presence of typical large-scale precipitation patterns over South America, as previous studies have shown [1,2]. The application of complex network methods to precipitation data has recently received increased scientific attention for the special case of extreme events, because such methods make it possible to analyze the spatiotemporal correlation structure as well as possible teleconnections of these events [3,4]. In these approaches the correlation between precipitation datasets is calculated by means of Event Synchronization, which restricts their applicability to extreme precipitation events. In this work, we propose a method that is able to consider not only extreme precipitation but complete time series. A direct application of standard similarity measures to precipitation time series is impossible due to their intricate statistical properties, such as the large number of zeros. We therefore introduced and evaluated a suitable modification of Pearson's correlation coefficient to construct spatial correlation networks of precipitation. By analyzing the characteristics of spatial correlation networks constructed on the basis of this new measure, we are able to determine coherent areas of similar precipitation patterns, spot teleconnections between correlated areas, and detect central regions for precipitation correlation. By analyzing the change of the network over the year [5], we are also able to determine local and global changes in precipitation correlation patterns. Additionally, global network characteristics such as network connectivity indicate the beginning and end of the wet and dry seasons. In order to identify large-scale synoptic events such as the SACZ and the SALLJ onset, detecting changes of correlation over time between certain regions is of significant relevance. [1] Nieto-Ferreira et al., Quarterly Journal of the Royal Meteorological Society (2011) [2] Vera et al., Bulletin of the American Meteorological Society (2006) [3] Quiroga et al., Physical Review E (2002) [4] Boers et al., Nature Communications (2014) [5] Radebach et al., Physical Review E (2013)
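To see why a modified similarity measure is needed for zero-inflated rain series, the sketch below separates wet/dry co-occurrence from amount correlation on jointly wet days. This is an illustrative stand-in only, not the modified Pearson coefficient introduced in the study:

```python
import numpy as np

def wet_day_similarity(p1, p2, wet=0.1):
    """Illustrative two-part similarity for zero-inflated rain series:
    (a) phi coefficient of wet/dry co-occurrence (Pearson on binaries),
    (b) Pearson correlation of amounts on jointly wet days.
    A stand-in showing the zero-inflation problem, NOT the study's measure."""
    w1, w2 = p1 > wet, p2 > wet
    phi = np.corrcoef(w1.astype(float), w2.astype(float))[0, 1]
    both = w1 & w2
    amount = (np.corrcoef(p1[both], p2[both])[0, 1]
              if both.sum() > 2 else np.nan)
    return phi, amount

# Two synthetic daily rain series sharing a wet/dry season pattern.
rng = np.random.default_rng(1)
n = 365
wet_days = rng.random(n) < 0.3
a = np.where(wet_days, rng.gamma(2.0, 5.0, n), 0.0)
b = np.where(wet_days, 0.5 * a + rng.gamma(2.0, 2.0, n), 0.0)
phi, amount = wet_day_similarity(a, b)
print(phi > 0.9, amount > 0.3)
```

A plain Pearson coefficient on the raw series would conflate these two signals, with the many shared zeros inflating the apparent correlation regardless of how the amounts co-vary.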

  16. Classification of Meteorological Influences Surrounding Extreme Precipitation Events in the United States using the MERRA-2 Reanalysis

    NASA Technical Reports Server (NTRS)

    Collow, Allie Marquardt; Bosilovich, Mike; Ullrich, Paul; Hoeck, Ian

    2017-01-01

    Extreme precipitation events can have a large impact on society through flooding that can result in property destruction, crop losses, economic losses, the spread of water-borne diseases, and fatalities. Observations indicate a statistically significant increase in extreme precipitation events over the past 15 years in the Northeastern United States, and other localized regions of the country have been crippled by record flooding events, for example the flooding that occurred in the Southeast United States in association with Hurricane Matthew in October 2016. Extreme precipitation events in the United States can be caused by various meteorological influences such as extratropical cyclones, tropical cyclones, mesoscale convective complexes, general air mass thunderstorms, upslope flow, fronts, and the North American Monsoon. Reanalyses, such as the Modern Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), have become a pivotal tool for studying the meteorology surrounding extreme precipitation events. Using days classified as extreme precipitation events based on a combination of observational gauge and radar data, two classification techniques are applied to MERRA-2 atmospheric data to gather additional information that can be used to determine how events have changed over time. The first is self-organizing maps, an artificial neural network technique that uses unsupervised learning to cluster like patterns; the second is an automated detection technique that searches for atmospheric characteristics that define a meteorological phenomenon. For example, the automated detection for tropical cyclones searches for a defined area of suppressed sea level pressure alongside thickness anomalies aloft, indicating the presence of a warm core. These techniques are employed for extreme precipitation events in preselected regions that were chosen based on an analysis of the climatology of precipitation.
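
    The first of the two classification techniques, a self-organizing map (SOM), can be illustrated with a minimal sketch: nodes on a small grid compete for each input pattern, and the winning node and its grid neighbours are pulled toward that pattern, so similar atmospheric patterns cluster onto nearby nodes. The grid size, learning-rate schedule, and toy four-dimensional pattern vectors below are arbitrary illustrative choices, not the study's MERRA-2 configuration.

```python
import numpy as np

def train_som(samples, grid=(3, 3), epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organizing map: clusters pattern vectors onto a 2-D grid."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    w = rng.normal(size=(n_nodes, samples.shape[1]))          # node weight vectors
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)  # node grid positions
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighbourhood width
        for x in rng.permutation(samples):
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distance to BMU
            h = np.exp(-d2 / (2.0 * sigma ** 2))               # neighbourhood kernel
            w += lr * h[:, None] * (x - w)                     # pull nodes toward x
    return w, coords

def map_to_node(samples, w):
    """Assign each sample to its best-matching node (cluster label)."""
    return np.argmin(((samples[:, None, :] - w[None]) ** 2).sum(-1), axis=1)
```

    Mapping each event day's pattern vector to its best-matching node then groups days with similar circulation features.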

  17. A habituation based approach for detection of visual changes in surveillance camera

    NASA Astrophysics Data System (ADS)

    Sha'abani, M. N. A. H.; Adan, N. F.; Sabani, M. S. M.; Abdullah, F.; Nadira, J. H. S.; Yasin, M. S. M.

    2017-09-01

    This paper investigates a habituation-based approach to detecting visual changes with video surveillance systems in a passive environment. Various techniques have been introduced for dynamic environments, such as motion detection, object classification, and behaviour analysis. In a passive environment, however, most of the scenes recorded by the surveillance system are normal, so running a complex analysis at all times is computationally expensive, especially at high video resolutions. Thus, a mechanism of attention is required, in which the system responds only to abnormal events. This paper proposes a novelty detection mechanism for detecting visual changes and a habituation-based approach to measure the level of novelty. The objective of the paper is to investigate the feasibility of the habituation-based approach in detecting visual changes. Experimental results show that the approach is able to accurately detect the presence of novelty as deviations from the learned knowledge.
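
    The habituation mechanism can be caricatured in a few lines: stimuli close to something already in memory draw a progressively weaker response, while a sufficiently novel stimulus elicits a full response. This is a generic sketch of the idea only; the paper's actual habituation model, distance measure, and visual features are not specified here.

```python
class HabituationDetector:
    """Toy habituation-based novelty score: responses to familiar inputs decay,
    unfamiliar inputs elicit a full response (a sketch, not the paper's model)."""

    def __init__(self, decay=0.5, radius=1.0):
        self.decay = decay      # how fast the response to a familiar stimulus fades
        self.radius = radius    # distance below which inputs count as "the same"
        self.memory = []        # [(feature_vector, current_response_strength)]

    def respond(self, x):
        for i, (m, strength) in enumerate(self.memory):
            dist = sum((a - b) ** 2 for a, b in zip(m, x)) ** 0.5
            if dist < self.radius:
                self.memory[i] = (m, strength * self.decay)  # habituate further
                return strength                              # weakened response
        self.memory.append((x, 1.0 * self.decay))            # novel: remember it
        return 1.0                                           # full response
```

    A surveillance loop would only invoke expensive analysis when `respond` returns a value above some attention threshold.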

  18. Analysis of Deep Long-Period Subglacial Seismicity in Marie Byrd Land, Antarctica

    NASA Astrophysics Data System (ADS)

    McMahon, N. D.; Aster, R. C.; Myers, E. K.; Lough, A. C.

    2017-12-01

    We utilize subspace detection methodology to extend the detection and analysis of deep, long-period seismic activity associated with the subglacial and lower-crust magmatic complex beneath the Executive Committee Range volcanoes of Marie Byrd Land (Lough et al., 2013). The Marie Byrd Land (MBL) volcanic province is a remote continental region that is almost completely covered by the West Antarctic Ice Sheet (WAIS). The southern extent of Marie Byrd Land lies within the West Antarctic Rift System (WARS), which includes the volcanic Executive Committee Range. Lough et al. noted that seismic stations in the POLENET/ANET seismic network detected two swarms of seismic activity during 2010 and 2011. These events have been interpreted as deep, long-period (DLP) earthquakes based on their depth (25-40 km), tectonic context, and low-frequency spectra. The DLP events in MBL lie beneath an inferred volcanic edifice that is visible in ice-penetrating radar images via subglacial topography and intraglacial ash deposits, and have been interpreted as a present location of Moho-proximal magmatic activity. The magmatic swarm activity in MBL provides a promising target for advanced subspace detection, and for the temporal, spatial, and event-size analysis of an extensive DLP earthquake swarm using a remote and sparse seismographic network. We utilized a catalog of 1370 traditionally identified DLP events to construct subspace detectors for the nine nearest stations using two years of data spanning 2010-2011. Via subspace detection we increase the number of observable detections more than 70-fold at the highest signal-to-noise station while decreasing the overall minimum magnitude of completeness. In addition to the two previously identified swarms during early 2010 and early 2011, we find sustained activity throughout the two years of study, including several previously unidentified periods of heightened activity. These events have a very high Gutenberg-Richter b-value (>2.0). We also note evidence of continuing seismicity through 2015 by examining data from the small number of longer-running POLENET stations in the region.
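
    The subspace-detection step referred to above can be sketched generically: aligned, normalized template waveforms are stacked and decomposed by SVD, and the detection statistic for each sliding window of continuous data is the fraction of window energy captured by the retained basis. The rank, window length, and synthetic wavelet below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def build_subspace(templates, rank=2):
    """Orthonormal basis spanning the dominant variability of aligned,
    amplitude-normalized template waveforms (via SVD)."""
    X = np.array([t / np.linalg.norm(t) for t in templates]).T  # (nsamp, ntemplates)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank]

def detection_statistic(trace, U):
    """Fraction of energy in each sliding window that lies in the template
    subspace; values near 1 indicate a waveform resembling the templates."""
    n = U.shape[0]
    stats = np.empty(len(trace) - n + 1)
    for i in range(len(stats)):
        w = trace[i:i + n]
        energy = float(np.dot(w, w))
        stats[i] = 0.0 if energy == 0.0 else float(np.sum((U.T @ w) ** 2)) / energy
    return stats
```

    Thresholding the statistic yields new detections of events whose waveforms are too weak or too variable for single-template matched filtering.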

  19. Detection of protease activity in cells and animals.

    PubMed

    Verdoes, Martijn; Verhelst, Steven H L

    2016-01-01

    Proteases are involved in a wide variety of biologically and medically important events. They are entangled in a complex network of processes that regulate their activity, which makes their study intriguing, but challenging. For comprehensive understanding of protease biology and effective drug discovery, it is therefore essential to study proteases in models that are close to their complex native environments such as live cells or whole organisms. Protease activity can be detected by reporter substrates and activity-based probes, but not all of these reagents are suitable for intracellular or in vivo use. This review focuses on the detection of proteases in cells and in vivo. We summarize the use of probes and substrates as molecular tools, discuss strategies to deliver these tools inside cells, and describe sophisticated read-out techniques such as mass spectrometry and various imaging applications. This article is part of a Special Issue entitled: Physiological Enzymology and Protein Functions. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Broadband seismology and the detection and verification of underground nuclear explosions

    NASA Astrophysics Data System (ADS)

    Tinker, Mark Andrew

    1997-10-01

    On September 24, 1996, President Clinton signed the Comprehensive Test Ban Treaty (CTBT), which bans the testing of all nuclear weapons, thereby limiting their future development. Seismology is the primary tool used for the detection and identification of underground explosions and thus will play a key role in monitoring a CTBT. The detection and identification of low-yield explosions requires seismic stations at regional distances (<1500 km). However, because the regional wavefield propagates within the extremely heterogeneous crustal waveguide, the seismic waveforms are also very complicated. It is therefore necessary to have a solid understanding of how the phases used in regional discriminants develop within different tectonic regimes. Thus, the development of the seismic phases Pn and Lg, which compose the seismic discriminant Pn/Lg, within the western U.S. is evaluated using data from the Non-Proliferation Experiment. The most fundamental discriminant is event location, as 90% of all seismic sources occur too deep within the earth to be man-made. France resumed its nuclear testing program after a four-year moratorium and conducted six tests during a five-month period starting in September of 1995. Using teleseismic data, a joint hypocenter determination algorithm was used to determine the hypocenters of these six explosions. One of the most important problems in monitoring a CTBT is the detection and location of small seismic events. Although seismic arrays have become the central tool for event detection, in the context of a global monitoring treaty there will be some dependence on sparse regional networks of three-component broadband seismic stations to detect low-yield explosions. However, the full power of the data, namely phases other than P and S, has not been utilized. Therefore, the information in the surface wavetrain is used to improve the locations of small seismic events recorded on a sparse network in Bolivia.
    Finally, as a discrimination example in a complex region, P to S ratios are used to determine source parameters of the Mw 8.3 deep Bolivia earthquake.

  1. Applicability of major histocompatibility complex DRB1 alleles as markers to detect vertebrate hybridization: a case study from Iberian ibex × domestic goat in southern Spain

    PubMed Central

    2012-01-01

    Background Hybridization between closely related wild and domestic species is of great concern because it can alter the evolutionary integrity of the affected populations. The high allelic variability of Major Histocompatibility Complex (MHC) loci usually excludes them from being used in studies to detect hybridization events. However, if a) the parental species do not share alleles, and b) one of the parental species possesses an exceptionally low number of alleles (to facilitate analysis), then even MHC loci have the potential to detect hybrids. Results By genotyping exon 2 of the MHC class II DRB1 locus, we were able to detect hybridization between domestic goats (Capra hircus) and free-ranging Iberian ibex (Capra pyrenaica hispanica) by molecular means. Conclusions This is the first documentation of a Capra pyrenaica × Capra hircus hybridization, which presented us with the opportunity to test the applicability of MHC loci as a new, simple, cost-effective, and time-saving approach to detect hybridization between wild species and their domesticated relatives, thus adding value to the role of MHC genes in animal conservation and management. PMID:23006678

  2. EEG signatures accompanying auditory figure-ground segregation.

    PubMed

    Tóth, Brigitta; Kocsis, Zsuzsanna; Háden, Gábor P; Szerafin, Ágnes; Shinn-Cunningham, Barbara G; Winkler, István

    2016-11-01

    In everyday acoustic scenes, figure-ground segregation typically requires one to group together sound elements over both time and frequency. Electroencephalogram was recorded while listeners detected repeating tonal complexes composed of a random set of pure tones within stimuli consisting of randomly varying tonal elements. The repeating pattern was perceived as a figure over the randomly changing background. It was found that detection performance improved both as the number of pure tones making up each repeated complex (figure coherence) increased, and as the number of repeated complexes (duration) increased - i.e., detection was easier when either the spectral or temporal structure of the figure was enhanced. Figure detection was accompanied by the elicitation of the object related negativity (ORN) and the P400 event-related potentials (ERPs), which have been previously shown to be evoked by the presence of two concurrent sounds. Both ERP components had generators within and outside of auditory cortex. The amplitudes of the ORN and the P400 increased with both figure coherence and figure duration. However, only the P400 amplitude correlated with detection performance. These results suggest that 1) the ORN and P400 reflect processes involved in detecting the emergence of a new auditory object in the presence of other concurrent auditory objects; 2) the ORN corresponds to the likelihood of the presence of two or more concurrent sound objects, whereas the P400 reflects the perceptual recognition of the presence of multiple auditory objects and/or preparation for reporting the detection of a target object. Copyright © 2016. Published by Elsevier Inc.

  3. Past and future detector arrays for complete event reconstruction in heavy-ion reactions

    NASA Astrophysics Data System (ADS)

    Cardella, G.; Acosta, L.; Auditore, L.; Boiano, C.; Castoldi, A.; D'Andrea, M.; De Filippo, E.; Dell'Aquila, D.; De Luca, S.; Fichera, F.; Giudice, N.; Gnoffo, B.; Grimaldi, A.; Guazzoni, C.; Lanzalone, G.; Librizzi, F.; Lombardo, I.; Maiolino, C.; Maffesanti, S.; Martorana, N. S.; Norella, S.; Pagano, A.; Pagano, E. V.; Papa, M.; Parsani, T.; Passaro, G.; Pirrone, S.; Politi, G.; Previdi, F.; Quattrocchi, L.; Rizzo, F.; Russotto, P.; Saccà, G.; Salemi, G.; Sciliberto, D.; Trifirò, A.; Trimarchi, M.; Vigilante, M.

    2017-11-01

    Increasingly complex and complete detector arrays have been developed in the last two decades, or are in an advanced design stage, in different laboratories. Such arrays are necessary to fully characterize nuclear reactions induced by stable and exotic beams. The need for the simultaneous detection of charged particles, γ-rays, and/or neutrons has been stressed in many fields of nuclear structure and reaction dynamics, with particular attention to the improvement of both angular and energy resolution. Some examples of detection systems adapted to various energy ranges are discussed. Emphasis is given to the possible upgrade of relatively old 4π detectors with new electronics and new detection methods.

  4. Fall Detection Using Smartphone Audio Features.

    PubMed

    Cheffena, Michael

    2016-07-01

    An automated fall detection system based on smartphone audio features is developed. The spectrogram, mel-frequency cepstral coefficients (MFCCs), linear predictive coding (LPC), and matching pursuit (MP) features of different fall and no-fall sound events are extracted from experimental data. Based on the extracted audio features, four different machine learning classifiers, the k-nearest neighbor classifier (k-NN), support vector machine (SVM), least squares method (LSM), and artificial neural network (ANN), are investigated for distinguishing between fall and no-fall events. For each audio feature, the performance of each classifier in terms of sensitivity, specificity, accuracy, and computational complexity is evaluated. The best performance is achieved using spectrogram features with the ANN classifier, with sensitivity, specificity, and accuracy all above 98%. The classifier also has acceptable computational requirements for training and testing. The system is applicable in home environments where the phone is placed in the vicinity of the user.
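
    The simplest of the feature/classifier pairings above (a spectrogram-style feature with a k-NN vote) can be sketched as follows; the frame sizes, synthetic "fall" and "no-fall" sounds, and training set are toy assumptions, not the paper's recorded data or tuned settings.

```python
import numpy as np

def spectrogram_features(signal, frame=256, hop=128):
    """Average log-magnitude spectrum over windowed frames -- a crude
    stand-in for the spectrogram features used for audio classification."""
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame + 1, hop)]
    mags = [np.abs(np.fft.rfft(f * np.hanning(frame))) for f in frames]
    return np.log1p(np.mean(mags, axis=0))

def knn_classify(x, train_feats, train_labels, k=3):
    """Plain k-nearest-neighbor vote in feature space."""
    dists = np.linalg.norm(np.asarray(train_feats) - x, axis=1)
    nearest = [train_labels[i] for i in np.argsort(dists)[:k]]
    return max(set(nearest), key=nearest.count)
```

    In practice the labeled training features would come from recorded fall and no-fall sound events rather than synthetic tones.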

  5. A Fuzzy Reasoning Design for Fault Detection and Diagnosis of a Computer-Controlled System

    PubMed Central

    Ting, Y.; Lu, W.B.; Chen, C.H.; Wang, G.K.

    2008-01-01

    A Fuzzy Reasoning and Verification Petri Nets (FRVPNs) model is established for an error detection and diagnosis mechanism (EDDM) applied to a complex fault-tolerant PC-controlled system. The inference accuracy is improved through the hierarchical design of a two-level fuzzy rule decision tree (FRDT) and a Petri nets (PNs) technique that transforms the fuzzy rules into the FRVPNs model. Several simulation examples of the assumed failure events were carried out using the FRVPNs and the Mamdani fuzzy method with MATLAB tools. The reasoning performance of the developed FRVPNs was verified by comparing the inference outcome to that of the Mamdani method. Both methods result in the same conclusions. Thus, the present study demonstrates that the proposed FRVPNs model is able to achieve the purpose of reasoning and, furthermore, of determining the failure event of the monitored application program. PMID:19255619
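
    The Petri-net machinery underlying the FRVPNs can be reduced to its core: a transition fires when every one of its input places holds a token, consuming the inputs and producing tokens in its output places. The sketch below shows only this plain place/transition mechanism, with hypothetical place names; the paper's FRVPNs additionally attach fuzzy truth degrees to the reasoning, which are omitted here.

```python
class PetriNet:
    """Minimal place/transition net (no fuzzy truth values)."""

    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1              # consume one token per input place
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1  # produce output tokens
        return True
```

    A diagnosis rule such as "sensor fault AND watchdog timeout implies failure detected" becomes one transition whose firing deposits a token in the conclusion place.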

  6. A longitudinal cohort study of malaria exposure and changing serostatus in a malaria endemic area of rural Tanzania.

    PubMed

    Simmons, Ryan A; Mboera, Leonard; Miranda, Marie Lynn; Morris, Alison; Stresman, Gillian; Turner, Elizabeth L; Kramer, Randall; Drakeley, Chris; O'Meara, Wendy P

    2017-08-02

    Measurements of anti-malarial antibodies are increasingly used as a proxy of transmission intensity. Most serological surveys are based on the use of cross-sectional data that, when age-stratified, approximates historical patterns of transmission within a population. Comparatively few studies leverage longitudinal data to explicitly relate individual infection events with subsequent antibody responses. The occurrence of seroconversion and seroreversion events for two Plasmodium falciparum asexual stage antigens (MSP-1 and AMA-1) was examined using three annual measurements of 691 individuals from a cohort of individuals in a malaria-endemic area of rural east-central Tanzania. Mixed-effect logistic regression models were employed to determine factors associated with changes in serostatus over time. While the expected population-level relationship between seroprevalence and disease incidence was observed, on an individual level the relationship between individual infections and the antibody response was complex. MSP-1 antibody responses were more dynamic in response to the occurrence and resolution of infection events than AMA-1, while the latter was more correlated with consecutive infections. The MSP-1 antibody response to an observed infection seemed to decay faster over time than the corresponding AMA-1 response. Surprisingly, there was no evidence of an age effect on the occurrence of a conversion or reversion event. While the population-level results concur with previously published sero-epidemiological surveys, the individual-level results highlight the more complex relationship between detected infections and antibody dynamics than can be analysed using cross-sectional data. The longitudinal analysis of serological data may provide a powerful tool for teasing apart the complex relationship between infection events and the corresponding immune response, thereby improving the ability to rapidly assess the success or failure of malaria control programmes.

  7. HOMOLOGOUS SOLAR EVENTS ON 2011 JANUARY 27: BUILD-UP AND PROPAGATION IN A COMPLEX CORONAL ENVIRONMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pick, M.; Démoulin, P.; Zucca, P.

    2016-05-20

    In spite of the wealth of imaging observations at extreme-ultraviolet (EUV), X-ray, and radio wavelengths, there are still relatively few cases where all of the imagery is available to study the full development of a coronal mass ejection (CME) event and its associated shock. The aim of this study is to contribute to the understanding of the role of the coronal environment in the development of CMEs and the formation of shocks, and their propagation. We have analyzed the interactions of a couple of homologous CME events with ambient coronal structures. Both events were launched in a direction far from the local vertical, and exhibited a radical change in their direction of propagation during their progression from the low corona into higher altitudes. Observations at EUV wavelengths from the Atmospheric Imaging Assembly instrument on board the Solar Dynamics Observatory were used to track the events in the low corona. The development of the events at higher altitudes was followed by the white-light coronagraphs on board the Solar and Heliospheric Observatory. Radio emissions produced during the development of the events were well recorded by the Nançay solar instruments. Thanks to their detection of accelerated electrons, the radio observations are an important complement to the EUV imaging. They allowed us to characterize the development of the associated shocks, and helped to unveil the physical processes behind the complex interactions between the CMEs and the ambient medium (e.g., compression, reconnection).

  8. The Dangers of Failure Masking in Fault-Tolerant Software: Aspects of a Recent In-Flight Upset Event

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2007-01-01

    On 1 August 2005, a Boeing Company 777-200 aircraft, operating on an international passenger flight from Australia to Malaysia, was involved in a significant upset event while flying on autopilot. The Australian Transport Safety Bureau's investigation into the event discovered that an anomaly existed in the component software hierarchy that allowed inputs from a known faulty accelerometer to be processed by the air data inertial reference unit (ADIRU) and used by the primary flight computer, autopilot and other aircraft systems. This anomaly had existed in original ADIRU software, and had not been detected in the testing and certification process for the unit. This paper describes the software aspects of the incident in detail, and suggests possible implications concerning complex, safety-critical, fault-tolerant software.

  9. Evidence for seismic activity mainly constituted of hybrid events at Cayambe volcano, Ecuador. Interpretation in an iced-domes volcano context

    NASA Astrophysics Data System (ADS)

    Guillier, Bertrand; Chatelain, Jean-Luc

    2006-06-01

    The high activity level of hybrid events (HE) detected beneath the Cayambe volcano since 1989 has been investigated more thoroughly with data from a temporary array. The unusual HE spectral content allows separating a high-frequency signal riding on a low-frequency one, with a probable single source. HEs are interpreted as high-frequency VT events, produced by the interaction between magmatic heat and an underground water system fed by thaw water from the summit glacier, which triggers simultaneous low-frequency fluid resonance in the highly fractured adjacent medium. Pure VTs are interpreted as 'aborted' HEs, probably occurring in the oldest and coldest part of the volcano complex. To cite this article: B. Guillier, J.-L. Chatelain, C. R. Geoscience 338 (2006).

  10. Getting the right blood to the right patient: the contribution of near-miss event reporting and barrier analysis.

    PubMed

    Kaplan, H S

    2005-11-01

    Safety and reliability in blood transfusion are not static, but are dynamic non-events. Since performance deviations continually occur in complex systems, their detection and correction must be accomplished over and over again. Non-conformance must be detected early enough to allow for recovery or mitigation. Near-miss events afford early detection of possible system weaknesses and provide an early chance at correction. National event reporting systems, both voluntary and involuntary, have begun to include near-miss reporting in their classification schemes, raising awareness for their detection. MERS-TM is a voluntary safety reporting initiative in transfusion. Currently 22 hospitals submit reports anonymously to a central database which supports analysis of a hospital's own data and that of an aggregate database. The system encourages reporting of near-miss events, where the patient is protected from receiving an unsuitable or incorrect blood component due to a planned or unplanned recovery step. MERS-TM data suggest approximately 90% of events are near-misses, with 10% caught after issue but before transfusion. Near-miss reporting may increase total reports ten-fold. The ratio of near-misses to events with harm is 339:1, consistent with other industries' ratio of 300:1, which has been proposed as a measure of reporting in event reporting systems. Use of a risk matrix and an event's relation to protective barriers allows prioritization of these events. Near-misses recovered by planned barriers occur ten times more frequently than unplanned recoveries. A bedside check of the patient's identity against that on the blood component is an essential, final barrier. How the typical two-person check is performed is critical. Even properly done, this check is ineffective against sampling and testing errors. Blood testing at the bedside just prior to transfusion minimizes the risk of such upstream events.
However, even with simple and well designed devices, training may be a critical issue. Sample errors account for more than half of reported events. The most dangerous miscollection is a blood sample passing acceptance with no previous patient results for comparison. Bar code labels or collection of a second sample may counter this upstream vulnerability. Further upstream barriers have been proposed to counter the precariousness of urgent blood sample collection in a changing unstable situation. One, a linking device, allows safer labeling of tubes away from the bedside, the second, a forcing function, prevents omission of critical patient identification steps. Errors in the blood bank itself account for 15% of errors with a high potential severity. In one such event, a component incorrectly issued, but safely detected prior to transfusion, focused attention on multitasking's contribution to laboratory error. In sum, use of near-miss information, by enhancing barriers supporting error prevention and mitigation, increases our capacity to get the right blood to the right patient.

  11. Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.

    2012-03-28

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to the use of EBB in an operational environment.

  12. Imaging the Subduction Plate Interface Using Low-Frequency Earthquakes

    NASA Astrophysics Data System (ADS)

    Plourde, A. P.; Bostock, M. G.

    2015-12-01

    Low-frequency earthquakes (LFEs) in subduction zones are commonly thought to represent slip on the plate interface. They have also been observed to lie near or within a zone of low shear-wave velocity, which is modelled as fluid-rich upper oceanic crust. Due to relatively large depth uncertainties in the absolute hypocenters of most LFE families, their location relative to an independently imaged subducting plate and, consequently, the nature of the plate boundary at depths between 30 and 45 km have not been precisely determined. For a selection of LFE families in northern Washington, we measure variations in the arrival times of individual LFE detections using multi-channel cross-correlation, incorporating both arrivals at the same station from different events (cross-detection data) and arrivals from the same event at different stations (cross-station data). Employing HypoDD, these times are used to generate relative locations for individual LFE detections. After creating templates from spatial subgroups of detections, network cross-correlation techniques will be used to search for new detections in neighbouring areas, thereby expanding the local catalogue and enabling further subdivision. By combining the source "arrays" and the receiver arrays from the Array of Arrays experiment, we plan to interrogate plate boundary structure using migration of scattered waves from the subduction complex, as previously documented beneath southern Vancouver Island.
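
    The basic primitive behind the multi-channel cross-correlation measurements is finding the lag at which two waveforms align best; HypoDD then inverts many such differential times for relative locations. A minimal single-pair sketch (synthetic pulses, not LFE data):

```python
import numpy as np

def cc_lag(a, b):
    """Sample lag that best aligns waveform b with waveform a, taken at the
    peak of the full cross-correlation (negative lag => b is delayed)."""
    c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return int(np.argmax(c)) - (len(b) - 1)
```

    Repeating this for every station/event pair yields the differential arrival times fed to the relative-relocation inversion.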

  13. Endpoint visual detection of three genetically modified rice events by loop-mediated isothermal amplification.

    PubMed

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-11-07

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%-0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  14. Effects of voice harmonic complexity on ERP responses to pitch-shifted auditory feedback.

    PubMed

    Behroozmand, Roozbeh; Korzyukov, Oleg; Larson, Charles R

    2011-12-01

    The present study investigated the neural mechanisms of voice pitch control for different levels of harmonic complexity in the auditory feedback. Event-related potentials (ERPs) were recorded in response to +200 cents pitch perturbations in the auditory feedback of self-produced natural human vocalizations, complex and pure tone stimuli during active vocalization and passive listening conditions. During active vocal production, ERP amplitudes were largest in response to pitch shifts in the natural voice, moderately large for non-voice complex stimuli and smallest for the pure tones. However, during passive listening, neural responses were equally large for pitch shifts in voice and non-voice complex stimuli but still larger than that for pure tones. These findings suggest that pitch change detection is facilitated for spectrally rich sounds such as natural human voice and non-voice complex stimuli compared with pure tones. Vocalization-induced increase in neural responses for voice feedback suggests that sensory processing of naturally-produced complex sounds such as human voice is enhanced by means of motor-driven mechanisms (e.g. efference copies) during vocal production. This enhancement may enable the audio-vocal system to more effectively detect and correct for vocal errors in the feedback of natural human vocalizations to maintain an intended vocal output for speaking. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
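
    For reference, a pitch perturbation expressed in cents maps to a fixed frequency ratio applied to every harmonic of the voice or complex tone; a quick worked example (the 220 Hz fundamental is an arbitrary illustration):

```python
def cents_to_ratio(cents):
    """Frequency ratio of a pitch shift given in cents (1200 cents = 1 octave)."""
    return 2.0 ** (cents / 1200.0)

# A +200 cent perturbation scales every harmonic by ~1.122 (two semitones up):
f0 = 220.0                                   # illustrative fundamental, Hz
harmonics = [f0 * k for k in (1, 2, 3)]      # 220, 440, 660 Hz
shifted = [f * cents_to_ratio(200) for f in harmonics]
```

    Because the shift is multiplicative, the harmonic structure of the complex is preserved, only translated upward in log-frequency.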

  15. The Arabidopsis THO/TREX component TEX1 functionally interacts with MOS11 and modulates mRNA export and alternative splicing events.

    PubMed

    Sørensen, Brian B; Ehrnsberger, Hans F; Esposito, Silvia; Pfab, Alexander; Bruckmann, Astrid; Hauptmann, Judith; Meister, Gunter; Merkl, Rainer; Schubert, Thomas; Längst, Gernot; Melzer, Michael; Grasser, Marion; Grasser, Klaus D

    2017-02-01

    We identify proteins that associate with the THO core complex, and show that the TEX1 and MOS11 components functionally interact, affecting mRNA export and splicing as well as plant development. TREX (TRanscription-EXport) is a multiprotein complex that plays a central role in the coordination of synthesis, processing and nuclear export of mRNAs. Using targeted proteomics, we identified proteins that associate with the THO core complex of Arabidopsis TREX. In addition to the RNA helicase UAP56 and the mRNA export factors ALY2-4 and MOS11, we detected interactions with the mRNA export complex TREX-2 and multiple spliceosomal components. Plants defective in the THO component TEX1 or in the mRNA export factor MOS11 (orthologue of human CIP29) are mildly affected. However, tex1 mos11 double-mutant plants show marked defects in vegetative and reproductive development. In tex1 plants, the levels of tasiRNAs are reduced, while miR173 levels are decreased in mos11 mutants. In nuclei of mos11 cells, increased mRNA accumulation was observed, while no mRNA export defect was detected in tex1 cells. Nevertheless, in tex1 mos11 double mutants, the mRNA export defect was clearly enhanced relative to mos11. The subnuclear distribution of TEX1 substantially overlaps with that of splicing-related SR proteins, and in tex1 plants the ratio of certain alternative splicing events is altered. Our results demonstrate that Arabidopsis TEX1 and MOS11 are involved in distinct steps of the biogenesis of mRNAs and small RNAs, and that they interact in some of these steps but act independently in others.

  16. Event Detection for Hydrothermal Plumes: A case study at Grotto Vent

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.

    2012-12-01

    Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores the use of activity detection, a technique adapted from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among the many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional change varies. The present Petri Net detects unusually large and rapid changes in the direction or amount of bending; however, inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing.
    Still, considerable complexity overlies the "normal" tidal response pattern (the data have a dominant frequency of ~12.9 hours). We are in the process of defining several events of particular scientific interest: 1) transient behavioral changes associated with atmospheric storms, earthquakes, or volcanic intrusions or eruptions; 2) mutual interaction of neighboring plumes on each other's behavior; and 3) rapid shifts in plume direction that indicate the presence of unusual currents or changes in currents. We will query the existing data to see whether these relationships are ever observed, as well as test our understanding of the "normal" pattern of response to tidal currents. Figure 1. Arrows indicate plume orientation at a given time (time axis in days after 9/29/10) and stars indicate times when orientation changes rapidly.
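
    The Petri Net detector itself is not reproduced in this record; as a rough illustration of the underlying idea (feature states plus transitions, with transitions suppressed across data gaps), a minimal sketch might look like the following, where the rate threshold and gap tolerance are purely hypothetical values, not the study's parameters:

```python
def detect_rapid_changes(times, orientations, max_rate_deg_per_hr=10.0,
                         max_gap_hr=2.0):
    """Flag (time, rate) pairs where plume orientation changes unusually fast.

    Transitions across samples separated by more than max_gap_hr are not
    fired, since apparent jumps there are likely artifacts of gaps in the
    record rather than real events.
    """
    events = []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        if dt <= 0 or dt > max_gap_hr:   # data gap: do not fire a transition
            continue
        rate = abs(orientations[i] - orientations[i - 1]) / dt  # deg per hour
        if rate > max_rate_deg_per_hr:   # "rapid change" state reached
            events.append((times[i], rate))
    return events
```

    Guarding each transition with a gap check is one simple way to avoid the gap-induced artifacts mentioned in the abstract.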

  17. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  18. A General theory of Signal Integration for Fault-Tolerant Dynamic Distributed Sensor Networks

    DTIC Science & Technology

    1993-10-01

    related to a) the architecture and fault-tolerance of the distributed sensor network, b) the proper synchronisation of sensor signals, c) the...Computational complexities of the problem of distributed detection. 5) Issues related to recording of events and synchronization in distributed sensor...Intervals for Synchronization in Real Time Distributed Systems", Submitted to Electronic Encyclopedia. 3. V. G. Hegde and S. S. Iyengar "Efficient

  19. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
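
    The record does not spell out the wavenumber analysis methods; a common building block is the frequency-wavenumber (f-k) spectrum obtained from a 2D FFT of the space-time wavefield, in which trapped or converted modes show up as new wavenumber ridges. A minimal sketch, with the grid spacings chosen arbitrarily for illustration:

```python
import numpy as np

def fk_spectrum(wavefield, dt, dx):
    """Frequency-wavenumber magnitude spectrum of a space-time wavefield.

    wavefield: (n_t, n_x) array sampled every dt seconds and dx meters.
    Returns (freqs, wavenumbers, |FK|); extra wavenumber content localized
    in space would indicate trapped waves, e.g. in a delamination region.
    """
    n_t, n_x = wavefield.shape
    fk = np.abs(np.fft.fft2(wavefield))
    freqs = np.fft.fftfreq(n_t, d=dt)        # temporal frequencies (Hz)
    ks = np.fft.fftfreq(n_x, d=dx)           # spatial frequencies (1/m)
    return freqs, ks, fk

# Synthetic single-mode wave: 10 cycles in time, 5 cycles in space.
m, n = np.meshgrid(np.arange(128), np.arange(64), indexing="ij")
u = np.sin(2 * np.pi * (10 * m / 128 + 5 * n / 64))
freqs, ks, fk = fk_spectrum(u, dt=1e-6, dx=1e-3)
```

    For the synthetic wave, the energy concentrates at the (frequency bin 10, wavenumber bin 5) component, as expected for a single propagating mode.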

  20. Domain-general neural correlates of dependency formation: Using complex tones to simulate language.

    PubMed

    Brilmayer, Ingmar; Sassenhagen, Jona; Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2017-08-01

    There is an ongoing debate whether the P600 event-related potential component following syntactic anomalies reflects syntactic processes per se, or if it is an instance of the P300, a domain-general ERP component associated with attention and cognitive reorientation. A direct comparison of both components is challenging because of the huge discrepancy in experimental designs and stimulus choice between language and 'classic' P300 experiments. In the present study, we develop a new approach to mimic the interplay of sequential position as well as categorical and relational information in natural language syntax (word category and agreement) in a non-linguistic target detection paradigm using musical instruments. Participants were instructed to (covertly) detect target tones which were defined by instrument change and pitch rise between subsequent tones at the last two positions of four-tone sequences. We analysed the EEG using event-related averaging and time-frequency decomposition. Our results show striking similarities to results obtained from linguistic experiments. We found a P300 that showed sensitivity to sequential position and a late positivity sensitive to stimulus type and position. A time-frequency decomposition revealed significant effects of sequential position on the theta band and a significant influence of stimulus type on the delta band. Our results suggest that the detection of non-linguistic targets defined via complex feature conjunctions in the present study and the detection of syntactic anomalies share the same underlying processes: attentional shift and memory based matching processes that act upon multi-feature conjunctions. We discuss the results as supporting domain-general accounts of the P600 during natural language comprehension. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Early Detection of Human Epileptic Seizures Based on Intracortical Local Field Potentials

    PubMed Central

    Park, Yun S.; Hochberg, Leigh R.; Eskandar, Emad N.; Cash, Sydney S.; Truccolo, Wilson

    2014-01-01

    The unpredictability of recurring seizures dramatically impacts the quality of life and autonomy of people with epilepsy. Reliable early seizure detection could open new therapeutic possibilities and thus substantially improve quality of life and autonomy. Though many seizure detection studies have shown the potential of scalp electroencephalogram (EEG) and intracranial EEG (iEEG) signals, reliable early detection of human seizures remains elusive in practice. Here, we examined the use of intracortical local field potentials (LFPs) recorded from 4×4-mm2 96-microelectrode arrays (MEAs) for early detection of human epileptic seizures. We adopted a framework consisting of (1) sampling of intracortical LFPs; (2) denoising of LFPs with the Kalman filter; (3) spectral power estimation in specific frequency bands using 1-sec moving time windows; (4) extraction of statistical features, such as the mean, variance, and Fano factor (calculated across channels) of the power in each frequency band; and (5) cost-sensitive support vector machine (SVM) classification of ictal and interictal samples. We tested the framework on a one-participant dataset, including 4 seizures and the corresponding interictal recordings preceding each seizure. The participant was a 52-year-old woman suffering from complex partial seizures. LFPs were recorded from an MEA implanted in the participant’s left middle temporal gyrus. In this participant, spectral power in 0.3–10 Hz, 20–55 Hz, and 125–250 Hz changed significantly between ictal and interictal epochs. The examined seizure detection framework provided an event-wise sensitivity of 100% (4/4), only one 20-sec-long false positive event in interictal recordings (likely an undetected subclinical event upon further visual inspection), and a detection latency of 4.35 ± 2.21 sec (mean ± std) with respect to iEEG-identified seizure onsets. 
These preliminary results indicate that intracortical MEA recordings may provide key signals to quickly and reliably detect human seizures. PMID:24663490
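
    Steps (3) and (4) of the framework (band-power estimation in 1-sec windows, then cross-channel statistics including the Fano factor) can be sketched as follows; the window length, band edges, and array shapes are illustrative assumptions rather than the study's exact settings:

```python
import numpy as np

def band_power(lfp, fs, bands, win_sec=1.0):
    """Per-window spectral power in each frequency band.

    lfp: (n_channels, n_samples) array; fs: sampling rate in Hz;
    bands: dict name -> (low_hz, high_hz).
    Returns dict name -> (n_channels, n_windows) power array.
    """
    win = int(win_sec * fs)
    n_win = lfp.shape[1] // win
    segs = lfp[:, :n_win * win].reshape(lfp.shape[0], n_win, win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(segs, axis=2)) ** 2 / win
    return {name: psd[:, :, (freqs >= lo) & (freqs < hi)].sum(axis=2)
            for name, (lo, hi) in bands.items()}

def window_features(power):
    """Mean, variance, and Fano factor of band power across channels."""
    mean = power.mean(axis=0)
    var = power.var(axis=0)
    fano = var / np.maximum(mean, 1e-12)   # variance-to-mean ratio
    return np.stack([mean, var, fano], axis=0)   # shape (3, n_windows)
```

    The resulting per-window feature vectors would then feed a cost-sensitive classifier as in step (5), e.g. an SVM with asymmetric class weights so that missed ictal windows are penalized more heavily than false alarms.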

  2. Transcription-Coupled Repair and Complex Biology.

    PubMed

    Portman, James R; Strick, Terence R

    2018-05-04

    All active living organisms mitigate DNA damage via DNA repair, and the so-called nucleotide excision repair (NER) pathway represents a major functional part of the cell's DNA repair repertoire [1]. In this pathway, the damaged strand of DNA is incised and removed before being resynthesized. This form of DNA repair requires a multitude of proteins working in a complex choreography. Repair thus typically involves detection of a DNA lesion; validation of that detection event; search for an appropriate incision site and subsequent DNA incision; DNA unwinding/removal; and DNA resynthesis and religation. These activities are ultimately the result of molecules randomly diffusing, bumping into each other, and acting in succession. It is also true, however, that repair components are often assembled into functional complexes, which may be more efficient or regular in their mode of action. Studying DNA repair complexes for their mechanisms of assembly, action, and disassembly can help address fundamental questions such as whether DNA repair pathways are branched or linear; whether, for instance, they tolerate fluctuations in the numbers of components; and, more broadly, how search processes between macromolecules take place or can be enhanced. Copyright © 2018. Published by Elsevier Ltd.

  3. Classification of Error Related Brain Activity in an Auditory Identification Task with Conditions of Varying Complexity

    NASA Astrophysics Data System (ADS)

    Kakkos, I.; Gkiatis, K.; Bromis, K.; Asvestas, P. A.; Karanasiou, I. S.; Ventouras, E. M.; Matsopoulos, G. K.

    2017-11-01

    The detection of an error is the cognitive evaluation of an action outcome that is considered undesired or mismatches an expected response. Brain activity during monitoring of correct and incorrect responses elicits Event Related Potentials (ERPs) revealing complex cerebral responses to deviant sensory stimuli. Development of accurate error detection systems is of great importance both for practical applications and for investigating the complex neural mechanisms of decision making. In this study, data are used from an audio identification experiment that was implemented with two levels of complexity in order to investigate neurophysiological error processing mechanisms in actors and observers. To examine and analyse how erroneous sensory information is processed at each level of complexity, we employ Support Vector Machine (SVM) classifiers with various learning methods and kernels using characteristic ERP time-windowed features. For dimensionality reduction and to remove redundant features, we implement a feature selection framework based on Sequential Forward Selection (SFS). The proposed method provided high accuracy in identifying correct and incorrect responses both for actors and for observers, with mean accuracies of 93% and 91% respectively. Additionally, computational time was reduced and the effects of the nesting problem usually occurring in SFS of large feature sets were alleviated.
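
    The SFS wrapper is a generic greedy procedure: starting from an empty subset, it repeatedly adds the single feature that most improves a scoring function. A compact sketch is below, with a least-squares R² scorer standing in for the paper's SVM classification accuracy (an assumption for illustration only):

```python
import numpy as np

def sequential_forward_selection(X, y, score_fn, k):
    """Greedy SFS: add, one at a time, the feature whose inclusion
    maximizes score_fn on the currently selected subset."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        scored = [(score_fn(X[:, selected + [f]], y), f) for f in remaining]
        best_score, best_feat = max(scored)
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

def r2_score_fn(Xs, y):
    """Coefficient of determination of a least-squares fit, used here as a
    cheap stand-in for a classifier's cross-validated accuracy."""
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return 1.0 - resid.var() / y.var()
```

    In a real pipeline the scorer would wrap the SVM's cross-validation accuracy; the greedy loop is unchanged.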

  4. Analysis of the Seismic Activity During the Preparatory Phase of the Mw 8.2 Iquique Earthquake, Chile 2014

    NASA Astrophysics Data System (ADS)

    Aden-Antoniow, F.; Satriano, C.; Poiata, N.; Bernard, P.; Vilotte, J. P.; Aissaoui, E. M.; Ruiz, S.; Schurr, B.; Sobiesiak, M.

    2015-12-01

    The 2014 Iquique seismic crisis, culminating in the Mw 8.2 Iquique earthquake (Chile) on 1 April 2014 and its largest aftershock (Mw 7.7) on 3 April, highlighted a complex unlocking of the North Chile subduction interface. Indeed, in the months preceding this event, at least three large seismic clusters were observed: in July 2013, in January 2014 and in March 2014. Their location and final migration towards the mainshock rupture area is the main motivation of this work. We built a new, more complete catalogue for the period December 2013 to March 2014 in Northern Chile, using a new automated array method for earthquake detection and location [Poiata et al. 2015]. With the data-set provided by the IPOC and ILN networks, we detected an average of 8000 events per month, forty times more than the catalogue produced by the Centro Sismologico Nacional de Chile. The new catalogue decreases the magnitude of completeness by more than two units, from 3.3 to 1.2. We observe two shallow clusters offshore of the cities of Iquique and Pisagua in January 2014, and a strong one covering the rupture zone of the Mw 8.2 mainshock in March. A spatiotemporal statistical analysis of these three clusters allows us to better characterize the whole preparatory phase. We interpret our results in light of the location, timing and energy of several aseismic slip events, evidenced by Boudin et al. [AGU 2014], which coincide with the seismic clusters. We propose that the preparatory phase of the Iquique earthquake consists of a complex interplay of seismic and aseismic slip along the subduction surface. Furthermore, our analysis raises new questions regarding the complex slip during the Mw 7.7 aftershock and the spatial variation of the effective coupling along the subduction interface, imaged by GPS studies, suggesting new research directions that will be outlined.

  5. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Input to the event detection model of the information displayed to the experimental subjects allows comparison of the model's performance with that of the subjects.
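
    The record describes the model only at a high level: discriminant analysis turns displayed values into a probability that an event has occurred. Under the simplest equal-variance Gaussian assumption (an illustration, not the 1978 model's exact form), that posterior can be sketched as:

```python
import math

def event_probability(x, mu_event, mu_normal, var, prior_event=0.5):
    """Posterior P(event | x) for a single displayed feature under
    equal-variance Gaussian class models, the simplest
    discriminant-analysis setting."""
    def log_score(mu, prior):
        return -((x - mu) ** 2) / (2 * var) + math.log(prior)

    le = log_score(mu_event, prior_event)
    ln = log_score(mu_normal, 1 - prior_event)
    m = max(le, ln)                       # softmax over the two log scores
    pe, pn = math.exp(le - m), math.exp(ln - m)
    return pe / (pe + pn)
```

    A yes/no decision then follows by thresholding this probability, with the threshold reflecting how the monitor trades off misses against false alarms.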

  6. Information spread of emergency events: path searching on social networks.

    PubMed

    Dai, Weihui; Hu, Hongzhi; Wu, Tunan; Dai, Yonghui

    2014-01-01

    Emergency events have attracted the global attention of governments and the public, and they can easily trigger a series of serious social problems if their dissemination is not supervised effectively. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to complex and fast information spread patterns for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can find the shortest spread paths efficiently, which may help guide and control the information dissemination of emergency events for early warning.
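
    The improved IBF algorithm itself is not specified in the record; assuming it is a Bellman-Ford-style single-source search (a guess from the name), the underlying key-path computation on a weighted spread graph can be sketched as:

```python
import math

def shortest_spread_paths(n, edges, source):
    """Bellman-Ford single-source shortest paths on a weighted social graph.

    edges: list of (u, v, w) with w the 'spread cost' of information passing
    from user u to user v. Returns (dist, pred); following pred back from a
    target node recovers the key spread path.
    """
    dist = [math.inf] * n
    pred = [None] * n
    dist[source] = 0.0
    for _ in range(n - 1):                 # relax all edges up to n-1 times
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v], pred[v] = dist[u] + w, u
                changed = True
        if not changed:                    # early exit once distances settle
            break
    return dist, pred
```

    On social-network graphs with non-negative spread costs, Dijkstra's algorithm would be the faster standard choice; Bellman-Ford is shown because it matches the "BF" reading of the acronym.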

  7. Implicit Binding of Facial Features During Change Blindness

    PubMed Central

    Lyyra, Pessi; Mäkelä, Hanna; Hietanen, Jari K.; Astikainen, Piia

    2014-01-01

    Change blindness refers to the inability to detect visual changes if introduced together with an eye-movement, blink, flash of light, or with distracting stimuli. Evidence of implicit detection of changed visual features during change blindness has been reported in a number of studies using both behavioral and neurophysiological measurements. However, it is not known whether implicit detection occurs only at the level of single features or whether complex organizations of features can be implicitly detected as well. We tested this in adult humans using intact and scrambled versions of schematic faces as stimuli in a change blindness paradigm while recording event-related potentials (ERPs). An enlargement of the face-sensitive N170 ERP component was observed at the right temporal electrode site to changes from scrambled to intact faces, even if the participants were not consciously able to report such changes (change blindness). Similarly, the disintegration of an intact face to scrambled features resulted in attenuated N170 responses during change blindness. Other ERP deflections were modulated by changes, but unlike the N170 component, they were indifferent to the direction of the change. The bidirectional modulation of the N170 component during change blindness suggests that implicit change detection can also occur at the level of complex features in the case of facial stimuli. PMID:24498165

  9. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warning can be issued upon the detection of a seismic event which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal to noise ratio for the tsunami signal; which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework in which a tsunamigenic signal is detected via the scan of a network of hydrodynamic stations with water level sensing is performed. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the hydrodynamic stations sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal at the open ocean itself conserves its birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis. 
    In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been demonstrated using open data from NOAA for a recorded tsunami event in the Pacific Ocean. The new approach will be tested in the future on other oceanic regions, including the Mediterranean Sea and North East Atlantic Ocean zones. Both authors acknowledge that this research is conducted under the TRIDEC IP FP7 project[1], which involves the development of a system of systems for collaborative, complex and critical decision-support in evolving crises. [1] TRIDEC IP ICT-2009.4.3 Intelligent Information Management Project Reference: 258723. http://www.tridec-online.eu/home
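
    The re-identification step is described only conceptually; one plausible primitive (an assumption, not the paper's stated method) is normalized cross-correlation between de-tided water-level records at successive stations, which both estimates the propagation lag and scores how well the signature is conserved:

```python
import numpy as np

def best_match_lag(sig_a, sig_b):
    """Lag (in samples) at which sig_b best matches sig_a, by normalized
    cross-correlation. A peak value near 1 supports re-identifying the same
    tsunamigenic signature at a downstream station."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    corr = np.correlate(b, a, mode="full") / len(a)
    lag = int(corr.argmax()) - (len(a) - 1)
    return lag, corr.max()
```

    A low peak correlation at every candidate lag would suggest a locally generated background signal (e.g. wind waves) rather than a propagating tsunami.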

  10. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows relevant patterns and sequences of events to be identified over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
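
    As a toy illustration of the CEP side only (not of the Viatra implementation or its DSL), matching an ordered pattern of model-change event types over a stream, optionally within a time window, might look like this:

```python
def match_sequence(stream, pattern, within=None):
    """Minimal CEP sketch: report (start_time, end_time) for each completed
    occurrence of `pattern` (an ordered list of event types) within the
    stream of (timestamp, event_type) pairs. Non-matching events in between
    are skipped; `within` optionally bounds a match's total duration."""
    matches, partial = [], []          # partial: (next_index, start_time)
    for ts, etype in stream:
        advanced = []
        for idx, start in partial:
            if within is not None and ts - start > within:
                continue               # partial match expired: drop it
            if etype == pattern[idx]:
                if idx + 1 == len(pattern):
                    matches.append((start, ts))
                else:
                    advanced.append((idx + 1, start))
            else:
                advanced.append((idx, start))
        if etype == pattern[0]:        # any matching event may start a match
            if len(pattern) == 1:
                matches.append((ts, ts))
            else:
                advanced.append((1, ts))
        partial = advanced
    return matches
```

    A production CEP engine compiles such patterns into automata and handles negation, conjunction and sliding windows; the sketch shows only the core sequencing idea.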

  11. Verification of Disarmament or Limitation of Armaments: Instruments, Negotiations, Proposals

    DTIC Science & Technology

    1992-05-01

    explosions and may complicate the process of detection. An even greater difficulty faced by seismologists is the ambient background of seismic "noise...suspected event would be a complex operation. It would consist of surveys of the area of the presumed nuclear explosion in order to measure ambient ...Draft Resolution to the OAS General Assembly, June 1991 and OAS Resolution "Cooperacion para la seguridad en el hemisferio. Limitacion de la

  12. A Cooperative Communication System for the Advancement of Safe, Effective, and Efficient Patient Care

    DTIC Science & Technology

    2015-09-01

    graphical display to promote acute change detection in ICU patients. International Journal of Medical Informatics, 81(12), 842-851. Brooke, J. (1996...patient history timeline. In this instance, the window describes the event causing injury (an oil rig explosion) and diagnosis upon admission Figures A...solutions to overcome them. Despite years of effort in medical informatics, a gap remains between the complexities of the clinical work setting and the

  13. Monitoring and Identifying in Real time Critical Patients Events.

    PubMed

    Chavez Mora, Emma

    2014-01-01

    Nowadays, pervasive health care monitoring environments, like business activity monitoring environments, gather information from a variety of data sources. This introduces new challenges, however, because of the use of body and wireless sensors and other nontraditional operational and transactional sources, which make health data more difficult to monitor. Decision making in this environment is typically complex and unstructured, as clinical work is essentially interpretative, multitasking, collaborative, distributed and reactive. Thus, the health care arena requires real-time data management in areas such as patient monitoring, detection of adverse events and adaptive responses to operational failures. This research presents a new architecture that enables real-time patient data management through the use of intelligent data sources.

  14. Causal simulation and sensor planning in predictive monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1989-01-01

    Two issues are addressed which arise in the task of detecting anomalous behavior in complex systems with numerous sensor channels: how to adjust alarm thresholds dynamically, within the changing operating context of the system, and how to utilize sensors selectively, so that nominal operation can be verified reliably without processing a prohibitive amount of sensor data. The approach involves simulation of a causal model of the system, which provides information on expected sensor values, and on dependencies between predicted events, useful in assessing the relative importance of events so that sensor resources can be allocated effectively. The potential applicability of this work to the execution monitoring of robot task plans is briefly discussed.

  15. Event Detection Using Mobile Phone Mass GPS Data and Their Reliability Verification by DMSP/OLS Night Light Images

    NASA Astrophysics Data System (ADS)

    Yuki, Akiyama; Satoshi, Ueyama; Ryosuke, Shibasaki; Adachi, Ryuichiro

    2016-06-01

    In this study, we developed a method to detect sudden population concentrations on a certain day and in a certain area, that is, "Events," all over Japan in 2012, using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, the areas and days where Events occurred were detected by aggregating the mass of stay locations into 1-km-square grid polygons. Finally, the proposed method could detect the Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
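
    The aggregation-and-thresholding idea can be sketched as follows; the median baseline and the factor-of-five threshold are illustrative assumptions, not the paper's exact criterion for removing year-round recurring Events:

```python
from collections import Counter, defaultdict

def detect_events(stays, factor=5.0, min_days=30):
    """Flag (day, grid_cell, count) triples whose visitor count far exceeds
    the cell's typical daily count over the observation period.

    stays: iterable of (day, cell) pairs, one per detected stay location,
    with cell a 1-km grid identifier. Using a median baseline per cell
    discounts places that are busy continuously throughout the year.
    """
    per_day = Counter(stays)                         # (day, cell) -> count
    per_cell = defaultdict(list)
    for (day, cell), n in per_day.items():
        per_cell[cell].append(n)
    events = []
    for (day, cell), n in per_day.items():
        counts = per_cell[cell]
        if len(counts) < min_days:
            continue                                 # too few days to baseline
        baseline = sorted(counts)[len(counts) // 2]  # median daily count
        if n > factor * baseline:
            events.append((day, cell, n))
    return events
```

    With real data, counts would come from the stay-location detection step, and the baseline could additionally be seasonally adjusted.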

  16. Event attribution using data assimilation in an intermediate complexity atmospheric model

    NASA Astrophysics Data System (ADS)

    Metref, Sammy; Hannart, Alexis; Ruiz, Juan; Carrassi, Alberto; Bocquet, Marc; Ghil, Michael

    2016-04-01

    A new approach, coined DADA (Data Assimilation for Detection and Attribution) has been recently introduced by Hannart et al. 2015, and is potentially useful for near real time, systematic causal attribution of weather and climate-related events The method is purposely designed to allow its operability at meteorological centers by synergizing causal attribution with Data Assimilation (DA) methods usually designed to deal with large nonlinear models. In Hannart et al. 2015, the DADA proposal is illustrated in the context of a low-order nonlinear model (forced three-variable Lorenz model) that is of course not realistic to represent the events considered. As a continuation of this stream of work, we therefore propose an implementation of the DADA approach in a realistic intermediate complexity atmospheric model (ICTP AGCM, nicknamed SPEEDY). The SPEEDY model is based on a spectral dynamical core developed at the Geophysical Fluid Dynamics Laboratory (see Held and Suarez 1994). It is a hydrostatic, r-coordinate, spectral-transform model in the vorticity-divergence form described by Bourke (1974). A synthetic dataset of observations of an extreme precipitation event over Southeastern South America is extracted from a long SPEEDY simulation under present climatic conditions (i.e. factual conditions). Then, following the DADA approach, observations of this event are assimilated twice in the SPEEDY model: first in the factual configuration of the model and second under its counterfactual, pre-industrial configuration. We show that attribution can be performed based on the likelihood ratio as in Hannart et al. 2015, but we further extend this result by showing that the likelihood can be split in space, time and variables in order to help identify the specific physical features of the event that bear the causal signature. References: Hannart A., A. Carrassi, M. Bocquet, M. Ghil, P. Naveau, M. Pulido, J. Ruiz, P. 
    Tandeo (2015): DADA: Data assimilation for the detection and attribution of weather and climate-related events. Climatic Change (in press). Held I. M. and M. J. Suarez (1994): A Proposal for the Intercomparison of the Dynamical Cores of Atmospheric General Circulation Models. Bull. Amer. Meteor. Soc., 75, 1825-1830. Bourke W. (1974): A multi-level spectral model. I. Formulation and hemispheric integrations. Mon. Wea. Rev., 102, 687-701.
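    The likelihood-ratio step at the heart of DADA-style attribution can be sketched in a few lines of Python. The log-evidence values below are invented for illustration; in the method described, the corresponding quantities come from the factual and counterfactual data-assimilation runs.

```python
import numpy as np

def attribution_scores(log_evidence_factual, log_evidence_counterfactual):
    """Likelihood ratio p1/p0 and fraction of attributable risk FAR = 1 - p0/p1."""
    log_ratio = log_evidence_factual - log_evidence_counterfactual
    ratio = np.exp(log_ratio)
    far = 1.0 - 1.0 / ratio
    return ratio, far

# Hypothetical log-evidences of the observed event under the factual and
# counterfactual model configurations
ratio, far = attribution_scores(-120.3, -125.9)
print(f"likelihood ratio = {ratio:.1f}, FAR = {far:.4f}")
```

    A ratio well above 1 (equivalently, FAR close to 1) indicates that the event is far more likely under factual (present-day) conditions than under the pre-industrial counterfactual.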

  17. Video mining using combinations of unsupervised and supervised learning techniques

    NASA Astrophysics Data System (ADS)

    Divakaran, Ajay; Miyahara, Koji; Peker, Kadir A.; Radhakrishnan, Regunathan; Xiong, Ziyou

    2003-12-01

    We discuss the meaning and significance of the video mining problem, and present our work on some aspects of video mining. A simple definition of video mining is unsupervised discovery of patterns in audio-visual content. Such purely unsupervised discovery is readily applicable to video surveillance as well as to consumer video browsing applications. We interpret video mining as content-adaptive or "blind" content processing, in which the first stage is content characterization and the second stage is event discovery based on the characterization obtained in stage 1. We discuss the target applications and find that purely unsupervised approaches are too computationally complex to be implemented on our product platform. We then describe various combinations of unsupervised and supervised learning techniques that help discover patterns that are useful to the end-user of the application. We target consumer video browsing applications such as commercial message detection, sports highlights extraction, etc. We employ both audio and video features. We find that supervised audio classification combined with unsupervised unusual event discovery enables accurate supervised detection of desired events. Our techniques are computationally simple and robust to common variations in production styles, etc.
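    As a rough illustration of the combination described (not the authors' actual system), one can fit a simple model per labeled audio class and flag frames that no class explains well as candidate unusual events. The class parameters, threshold, and data below are all invented.

```python
import numpy as np

# Minimal sketch: a Gaussian per labeled audio class (supervised step),
# then flag frames that fit no class well (unsupervised discovery step).
# Class means/sigmas, the -10 threshold, and the data are all invented.

rng = np.random.default_rng(0)
classes = {"speech": (0.0, 1.0), "cheering": (5.0, 1.0)}            # toy 1-D models
frames = np.concatenate([rng.normal(0.0, 1.0, 200), [12.0, 13.5]])  # 2 outliers

def log_lik(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Supervised step: best class log-likelihood per frame
best = np.max([log_lik(frames, mu, s) for mu, s in classes.values()], axis=0)
# Unsupervised step: frames explained by no class are candidate unusual events
unusual = np.where(best < -10.0)[0]
print(unusual)  # indices of the two injected outlier frames
```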

  18. Detection of infectious disease outbreaks in twenty-two fragile states, 2000-2010: a systematic review

    PubMed Central

    2011-01-01

    Fragile states are home to a sixth of the world's population, and their populations are particularly vulnerable to infectious disease outbreaks. Timely surveillance and control are essential to minimise the impact of these outbreaks, but little evidence is published about the effectiveness of existing surveillance systems. We did a systematic review of the circumstances (mode) of detection of outbreaks occurring in 22 fragile states in the decade 2000-2010 (i.e. all states consistently meeting fragility criteria during the timeframe of the review), as well as time lags from onset to detection of these outbreaks, and from detection to further events in their timeline. The aim of this review was to enhance the evidence base for implementing infectious disease surveillance in these complex, resource-constrained settings, and to assess the relative importance of different routes whereby outbreak detection occurs. We identified 61 reports concerning 38 outbreaks. Twenty of these were detected by existing surveillance systems, but 10 detections occurred following formal notifications by participating health facilities rather than data analysis. A further 15 outbreaks were detected by informal notifications, including rumours. There were long delays from onset to detection (median 29 days) and from detection to further events (investigation, confirmation, declaration, control). Existing surveillance systems yielded the shortest detection delays when linked to reduced barriers to health care and frequent analysis and reporting of incidence data. Epidemic surveillance and control appear to be insufficiently timely in fragile states, and need to be strengthened. Greater reliance on formal and informal notifications is warranted. Outbreak reports should be more standardised and enable monitoring of surveillance systems' effectiveness. PMID:21861869

  19. Multi-Station Broad Regional Event Detection Using Waveform Correlation

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.

    2013-12-01

    Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
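    A single-station version of waveform-correlation detection can be sketched as follows; the data are synthetic and the 0.8 threshold is illustrative, not a value from the study.

```python
import numpy as np

# Slide a template over continuous data and declare a detection wherever the
# normalized cross-correlation exceeds a threshold.

def correlation_detect(trace, template, threshold=0.8):
    t = template - template.mean()
    t /= np.linalg.norm(t)                     # unit-norm, zero-mean template
    n = len(t)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n] - trace[i:i + n].mean()
        norm = np.linalg.norm(w)
        cc[i] = (w @ t) / norm if norm > 0 else 0.0
    return np.where(cc > threshold)[0], cc

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 6 * np.pi, 50))   # "master event" waveform
trace = 0.1 * rng.normal(size=500)                 # continuous noisy record
trace[300:350] += 0.5 * template                   # buried repeat of the event
picks, cc = correlation_detect(trace, template)
print(picks)  # detections cluster at the hidden event near sample 300
```

    Multi-station confirmation, as in the study, would then require consistent picks for the same template across several stations.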

  20. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the current most challenging topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines multiple sensor stations data with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work is aimed at decreasing this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Searching for the optimal drought index and timescale combination to detect drought: a case study from the lower Jinsha River basin, China

    NASA Astrophysics Data System (ADS)

    Fluixá-Sanmartín, Javier; Pan, Deng; Fischer, Luzia; Orlowsky, Boris; García-Hernández, Javier; Jordan, Frédéric; Haemmig, Christoph; Zhang, Fangwei; Xu, Jijun

    2018-02-01

    Drought indices based on precipitation are commonly used to identify and characterize droughts. Due to the general complexity of droughts, the comparison of index-identified events with droughts at different levels of the complete system, including soil humidity or river discharges, relies typically on model simulations of the latter, entailing potentially significant uncertainties. The present study explores the potential of using precipitation-based indices to reproduce observed droughts in the lower part of the Jinsha River basin (JRB), proposing an innovative approach for a catchment-wide drought detection and characterization. Two indicators, namely the Overall Drought Extension (ODE) and the Overall Drought Indicator (ODI), have been defined. These indicators aim at identifying and characterizing drought events on the basin scale, using results from four meteorological drought indices (standardized precipitation index, SPI; rainfall anomaly index, RAI; percent of normal precipitation, PN; deciles, DEC) calculated at different locations of the basin and for different timescales. Collected historical information on drought events is used to contrast results obtained with the indicators. This method has been successfully applied to the lower Jinsha River basin in China, a region prone to frequent and severe droughts. Historical drought events that occurred from 1960 to 2014 have been compiled and cataloged from different sources, in a challenging process. The analysis of the indicators shows a good agreement with the recorded historical drought events on the basin scale. It has been found that the timescale that best reproduces observed events across all the indices is the 6-month timescale.
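    The idea of a precipitation index computed at a 6-month timescale can be illustrated with a simplified SPI-like z-score. The operational SPI first fits a gamma distribution and maps to normal quantiles; this sketch deliberately omits that step, and all data are synthetic.

```python
import numpy as np

# Simplified SPI-like index: a z-score of 6-month precipitation accumulations.

def spi_like(monthly_precip, window=6):
    sums = np.convolve(monthly_precip, np.ones(window), mode="valid")
    return (sums - sums.mean()) / sums.std()

rng = np.random.default_rng(42)
precip = rng.gamma(shape=2.0, scale=50.0, size=120)  # 10 synthetic years
precip[60:72] *= 0.2                                 # impose a 1-year drought
index = spi_like(precip)
print("driest 6-month window starts at month", int(np.argmin(index)))
```

    Strongly negative index values flag drought periods; in the study, several such indices (SPI, RAI, PN, DEC) at several timescales are aggregated into the basin-wide ODE and ODI indicators.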

  2. Real-time measurements, rare events and photon economics

    NASA Astrophysics Data System (ADS)

    Jalali, B.; Solli, D. R.; Goda, K.; Tsia, K.; Ropers, C.

    2010-07-01

    Rogue events, otherwise known as outliers or black swans, are singular, rare events that carry dramatic impact. They appear in seemingly unconnected systems in the form of oceanic rogue waves, stock market crashes, evolution, and communication systems. Attempts to understand the underlying dynamics of such complex systems that lead to spectacular and often cataclysmic outcomes have been frustrated by the scarcity of events, resulting in insufficient statistical data, and by the inability to perform experiments under controlled conditions. Extreme rare events also occur in the ultrafast physical sciences, where it is possible to collect large data sets, even for rare events, in a short time period. The knowledge gained from observing rare events in ultrafast systems may provide valuable insight into extreme value phenomena that occur over a much slower timescale and that have a closer connection with human experience. One solution is a real-time ultrafast instrument that is capable of capturing singular and randomly occurring non-repetitive events. The time stretch technology developed during the past 13 years is providing a powerful toolbox for reaching this goal. This paper reviews this technology and discusses its use in capturing rogue events in electronic signals, spectroscopy, and imaging. We show an example in nonlinear optics where it was possible to capture rare and random solitons whose unusual statistical distribution resembles that observed in financial markets. The ability to observe the true spectrum of each event in real time has led to important insight in understanding the underlying process, which in turn has made it possible to control soliton generation, leading to improvement in the coherence of supercontinuum light. We also show a new class of fast imagers which are being considered for early detection of cancer because of their potential ability to detect rare diseased cells (so-called rogue cells) in a large population of healthy cells.

  3. Rapid extraction of auditory feature contingencies.

    PubMed

    Bendixen, Alexandra; Prinz, Wolfgang; Horváth, János; Trujillo-Barreto, Nelson J; Schröger, Erich

    2008-07-01

    Contingent relations between sensory events render the environment predictable and thus facilitate adaptive behavior. The human capacity to detect such relations has been comprehensively demonstrated in paradigms in which contingency rules were task-relevant or in which they applied to motor behavior. The extent to which contingencies can also be extracted from events that are unrelated to the current goals of the organism has remained largely unclear. The present study addressed the emergence of contingency-related effects for behaviorally irrelevant auditory stimuli and the cortical areas involved in the processing of such contingency rules. Contingent relations between different features of temporally separate events were embedded in a new dynamic protocol. Participants were presented with the auditory stimulus sequences while their attention was captured by a video. The mismatch negativity (MMN) component of the event-related brain potential (ERP) was employed as an electrophysiological correlate of contingency detection. MMN generators were localized by means of scalp current density (SCD) and primary current density (PCD) analyses with variable resolution electromagnetic tomography (VARETA). Results show that task-irrelevant contingencies can be extracted from about fifteen to twenty successive events conforming to the contingent relation. Topographic and tomographic analyses reveal the involvement of the auditory cortex in the processing of contingency violations. The present data provide evidence for the rapid encoding of complex extrapolative relations in sensory areas. This capacity is of fundamental importance for the organism in its attempt to model the sensory environment outside the focus of attention.

  4. Automatic detection of recoil-proton tracks and background rejection criteria in liquid scintillator-micro-capillary-array fast neutron spectrometer

    NASA Astrophysics Data System (ADS)

    Mor, Ilan; Vartsky, David; Dangendorf, Volker; Tittelmeier, Kai; Weierganz, Mathias; Goldberg, Mark Benjamin; Bar, Doron; Brandis, Michal

    2018-06-01

    We describe an analysis procedure for automatic unambiguous detection of fast-neutron-induced recoil proton tracks in a micro-capillary array filled with organic liquid scintillator. The detector is viewed by an intensified CCD camera. This imaging neutron detector possesses the capability to perform high position-resolution (few tens of μm), energy-dispersive transmission-imaging using ns-pulsed beams. However, when operated with CW or DC beams, it also features medium-quality spectroscopic capabilities for incident neutrons in the energy range 2-20 MeV. In addition to the recoil proton events which display a continuous extended track structure, the raw images exhibit complex ion-tracks from nuclear interactions of fast-neutrons in the scintillator, capillaries quartz-matrix and CCD. Moreover, as expected, one also observes a multitude of isolated scintillation spots of varying intensity (henceforth denoted "blobs") that originate from several different sources, such as: fragmented proton tracks, gamma-rays, heavy-ion reactions as well as events and noise that occur in the image-intensifier and CCD. In order to identify the continuous-track recoil proton events and distinguish them from all these background events, a rapid, computerized and automatic track-recognition-procedure was developed. Based on an appropriately weighted analysis of track parameters such as: length, width, area and overall light intensity, the method is capable of distinguishing a single continuous-track recoil proton from typically surrounding several thousands of background events that are found in each CCD frame.
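    A toy version of such a weighted track-parameter score might look as follows; the weights, feature scales and the 0.7 threshold are invented for illustration, and the actual procedure described is considerably more elaborate.

```python
# Combine length, width, area and overall light intensity of a candidate
# track into one weighted score and threshold it. All constants are invented.

def track_score(length, width, area, intensity, w=(0.4, 0.2, 0.2, 0.2)):
    # Normalize each feature to a rough "proton-track-like" scale, capped at 1
    feats = (length / 100.0, 5.0 / max(width, 1e-6),
             area / 300.0, intensity / 1e4)
    return sum(wi * min(f, 1.0) for wi, f in zip(w, feats))

# A long, narrow, bright candidate scores high; a compact blob scores low
is_track = track_score(length=120, width=4, area=350, intensity=2e4) > 0.7
is_blob_track = track_score(length=8, width=6, area=40, intensity=3e3) > 0.7
print(is_track, is_blob_track)  # True False
```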

  5. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events.

  6. Dynamic analysis of heartbeat rate signals of epileptics using multidimensional phase space reconstruction approach

    NASA Astrophysics Data System (ADS)

    Su, Zhi-Yuan; Wu, Tzuyin; Yang, Po-Hua; Wang, Yeng-Tseng

    2008-04-01

    The heartbeat rate signal provides an invaluable means of assessing the sympathetic-parasympathetic balance of the human autonomic nervous system and thus represents an ideal diagnostic mechanism for detecting a variety of disorders such as epilepsy, cardiac disease and so forth. The current study analyses the dynamics of the heartbeat rate signal of known epilepsy sufferers in order to obtain a detailed understanding of the heart rate pattern during a seizure event. In the proposed approach, the ECG signals are converted into heartbeat rate signals and the embedology theorem is then used to construct the corresponding multidimensional phase space. The dynamics of the heartbeat rate signal are then analyzed before, during and after an epileptic seizure by examining the maximum Lyapunov exponent and the correlation dimension of the attractors in the reconstructed phase space. In general, the results reveal that the heartbeat rate signal transits from an aperiodic, highly-complex behaviour before an epileptic seizure to a low dimensional chaotic motion during the seizure event. Following the seizure, the signal trajectories return to a highly-complex state, and the complex signal patterns associated with normal physiological conditions reappear.
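    The phase-space reconstruction step that precedes the Lyapunov-exponent and correlation-dimension analysis can be sketched with a standard time-delay embedding (Takens' theorem); the delay and embedding dimension below are illustrative, not the study's values.

```python
import numpy as np

# Time-delay embedding: rebuild a multidimensional phase space from a
# scalar time series such as a heartbeat-rate signal.

def delay_embed(x, dim=3, tau=5):
    """Return an (N, dim) array of delay vectors [x_t, x_{t+tau}, ...]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Surrogate heartbeat-rate signal: a slow oscillation plus noise
rr = np.sin(0.1 * np.arange(1000)) \
     + 0.05 * np.random.default_rng(7).normal(size=1000)
phase_space = delay_embed(rr, dim=3, tau=5)
print(phase_space.shape)  # (990, 3)
```

    The maximum Lyapunov exponent and correlation dimension are then estimated from the trajectories of these delay vectors in the reconstructed space.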

  7. Characterization of brightness and stoichiometry of bright particles by flow-fluorescence fluctuation spectroscopy.

    PubMed

    Johnson, Jolene; Chen, Yan; Mueller, Joachim D

    2010-11-03

    Characterization of bright particles at low concentrations by fluorescence fluctuation spectroscopy (FFS) is challenging, because the event rate of particle detection is low and fluorescence background contributes significantly to the measured signal. It is straightforward to increase the event rate by flow, but the high background continues to be problematic for fluorescence correlation spectroscopy. Here, we characterize the use of photon-counting histogram analysis in the presence of flow. We demonstrate that a photon-counting histogram efficiently separates the particle signal from the background and faithfully determines the brightness and concentration of particles independent of flow speed, as long as undersampling is avoided. Brightness provides a measure of the number of fluorescently labeled proteins within a complex and has been used to determine stoichiometry of protein complexes in vivo and in vitro. We apply flow-FFS to determine the stoichiometry of the group specific antigen protein within viral-like particles of the human immunodeficiency virus type-1 from the brightness. Our results demonstrate that flow-FFS is a sensitive method for the characterization of complex macromolecular particles at low concentrations. Copyright © 2010 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  8. Distributed Events in Sentinel: Design and Implementation of a Global Event Detector

    DTIC Science & Technology

    1999-01-01

    local event detector and a global event detector to detect events. The global event detector in this case plays the role of a message sending/receiving ... significant in this case. The system performance will decrease with an increase in the number of applications involved in global event detection. Yet from a ... Figure 8: A global event tree (2) 1. Global composite event is detected at the GED. In this case, the whole global composite event tree is sent to the

  9. State-of-the-Art Fusion-Finder Algorithms Sensitivity and Specificity

    PubMed Central

    Carrara, Matteo; Beccuti, Marco; Lazzarato, Fulvio; Cavallo, Federica; Cordero, Francesca; Donatelli, Susanna; Calogero, Raffaele A.

    2013-01-01

    Background. Gene fusions arising from chromosomal translocations have been implicated in cancer. RNA-seq has the potential to discover such rearrangements generating functional proteins (chimera/fusion). Recently, many methods for chimera detection have been published. However, the specificity and sensitivity of those tools were not extensively investigated in a comparative way. Results. We tested eight fusion-detection tools (FusionHunter, FusionMap, FusionFinder, MapSplice, deFuse, Bellerophontes, ChimeraScan, and TopHat-fusion) to detect fusion events using synthetic and real datasets encompassing chimeras. A comparison run only on synthetic data can generate misleading results, since we found no counterpart for many of its findings in the real dataset. Furthermore, most tools report a very high number of false positive chimeras. In particular, the most sensitive tool, ChimeraScan, reports a large number of false positives that we were able to significantly reduce by devising and applying two filters to remove fusions not supported by fusion junction-spanning reads or encompassing large intronic regions. Conclusions. The discordant results obtained using synthetic and real datasets suggest that synthetic datasets encompassing fusion events may not fully capture the complexity of an RNA-seq experiment. Moreover, fusion detection tools are still limited in sensitivity or specificity; thus, there is space for further improvement in the fusion-finder algorithms. PMID:23555082
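    The two filters described for reducing ChimeraScan-style false positives can be paraphrased as a simple post-processing step; the field names and thresholds here are invented for illustration.

```python
# Drop fusion candidates with no junction-spanning reads, and those whose
# breakpoints span a very large intronic region. Field names, the example
# calls, and the 100 kb cutoff are hypothetical.

calls = [
    {"name": "A-B", "spanning_reads": 7, "intron_gap": 1_200},
    {"name": "C-D", "spanning_reads": 0, "intron_gap": 800},      # no support
    {"name": "E-F", "spanning_reads": 3, "intron_gap": 900_000},  # huge gap
]

MAX_GAP = 100_000
kept = [c for c in calls
        if c["spanning_reads"] > 0 and c["intron_gap"] <= MAX_GAP]
print([c["name"] for c in kept])  # only "A-B" survives
```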

  10. ShatterProof: operational detection and quantification of chromothripsis.

    PubMed

    Govind, Shaylan K; Zia, Amin; Hennings-Yeomans, Pablo H; Watson, John D; Fraser, Michael; Anghel, Catalina; Wyatt, Alexander W; van der Kwast, Theodorus; Collins, Colin C; McPherson, John D; Bristow, Robert G; Boutros, Paul C

    2014-03-19

    Chromothripsis, a newly discovered type of complex genomic rearrangement, has been implicated in the evolution of several types of cancers. To date, it has been described in bone cancer, SHH-medulloblastoma and acute myeloid leukemia, amongst others; however, there are still no formal or automated methods for detecting or annotating it in high throughput sequencing data. As such, findings of chromothripsis are difficult to compare and many cases likely escape detection altogether. We introduce ShatterProof, a software tool for detecting and quantifying chromothriptic events. ShatterProof takes structural variation calls (translocations, copy-number variations, short insertions and loss of heterozygosity) produced by any algorithm and, using an operational definition of chromothripsis, performs robust statistical tests to accurately predict the presence and location of chromothriptic events. Validation of our tool was conducted using clinical data sets, including matched normal and prostate cancer samples, in addition to the colorectal cancer and SCLC data sets used in the original description of chromothripsis. ShatterProof is computationally efficient, having low memory requirements and near-linear computation time. This allows it to become a standard component of sequencing analysis pipelines, enabling researchers to routinely and accurately assess samples for chromothripsis. Source code and documentation can be found at http://search.cpan.org/~sgovind/Shatterproof.

  11. Detection of Ludic Patterns in Two Triadic Motor Games and Differences in Decision Complexity

    PubMed Central

    Aguilar, Miguel Pic; Navarro-Adelantado, Vicente; Jonsson, Gudberg K.

    2018-01-01

    The triad is a particular structure in which an ambivalent social relationship takes place. This work is focused on the search for behavioral regularities in the practice of motor games in triads, which is a little-known field. For the detection of behavioral patterns not visible to the naked eye, we use Theme. A chasing-game model with rules was followed, in two different structures (A↔B↔C↔A and A → B → C → A), on four class groups (two for each structure) totalling 84 secondary school students aged 12 and 13: 37 girls (44%) and 47 boys (56%). The aim was to examine whether the players' behavior, in relation to the triad structure, matches any ludic behavior patterns. An observational methodology was applied, with a nomothetic, punctual and multidimensional design. The intra- and inter-observer correlation coefficients and the generalizability theory ensured the quality of the data. A mixed behavioral role system was used (four criteria and 15 categories), and the pattern detection software Theme was applied to detect temporal regularities in the order of event occurrences. The results show that the time location of motor responses in triad games was not random. In the “maze” game we detected more complex ludic patterns than in the “three fields” game, which might be explained by means of structural determinants such as circulation. This research points out the decisional complexity in motor games, and it confirms the differences among triads from the point of view of motor communication. PMID:29354084

  12. Real-time Automatic Detectors of P and S Waves Using Singular Values Decomposition

    NASA Astrophysics Data System (ADS)

    Kurzon, I.; Vernon, F.; Rosenberger, A.; Ben-Zion, Y.

    2013-12-01

    We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three-component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm, with the modification that we filter the waveforms prior to the SVD and then apply SNR (signal-to-noise ratio) detectors for picking the P and S arrivals on the new filtered, SVD-separated channels. A recent deployment in the San Jacinto Fault Zone area provides a very dense seismic network that allows us to test the detection algorithm in diverse settings, such as: events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm (e.g., rays propagating within the fault and recorded on linear arrays crossing the fault). We have found that a Butterworth band-pass filter of 2-30 Hz, with four poles at each of the corner frequencies, shows the best performance across a large variety of events and stations within the SJFZ. Using the SVD detectors we obtain a similar number of P and S picks, which is rarely seen with ordinary SNR detectors. For the actual real-time operation of the ANZA and SJFZ real-time seismic networks, the above filter (2-30 Hz) also shows very impressive performance, tested on many events and several aftershock sequences in the region, from the MW 5.2 of June 2005, through the MW 5.4 of July 2010, to the MW 4.7 of March 2013. Here we show the results of testing the detectors on the most complex and intense aftershock sequence, that of the MW 5.2 of June 2005, in which in the very first hour there were ~4 events a minute. 
    This aftershock sequence was thoroughly reviewed by several analysts, who identified 294 events in the first hour, located in a condensed cluster around the main shock. We used this hour of events to fine-tune the automatic SVD detection, association and location of the real-time system, reaching 37% automatic identification and location of events with a minimum of 10 stations per event; all events fell within the same condensed cluster, and there were no false events or large offsets in their locations. An ordinary SNR detector did not exceed 11% success, with a minimum of 8 stations per event, 2 false events and a wider spread of events (not within the reviewed cluster). One of the main advantages of the SVD detectors for real-time operations is the actual separation of the P and S components, thereby significantly reducing the noise of picks detected by ordinary SNR detectors. The new method has been applied to a significant number of events within the SJFZ over the past 8 years, and is now in the final stage of real-time implementation at UCSD for the ANZA and SJFZ networks, tuned for automatic detection and location of local events.
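    The core SVD separation idea can be illustrated on a synthetic three-component window: the leading right-singular vector recovers the dominant rectilinear (P-like) particle-motion direction. The data and noise level below are invented; the real-time algorithm of Rosenberger (2010) performs this iteratively.

```python
import numpy as np

# SVD polarization analysis of a synthetic three-component seismogram window.
# s[1]/s[0] measures the departure from purely rectilinear motion.

rng = np.random.default_rng(3)
direction = np.array([0.6, 0.64, 0.48])        # true unit polarization vector
motion = rng.normal(size=200)                  # rectilinear source motion
window = np.outer(motion, direction) + 0.05 * rng.normal(size=(200, 3))

u, s, vt = np.linalg.svd(window, full_matrices=False)
p_axis = vt[0]                                 # estimated polarization axis
rectilinearity = 1.0 - s[1] / s[0]             # near 1 for P-like motion
print(np.round(np.abs(p_axis), 2), round(float(rectilinearity), 2))
```

    Projecting the window onto the leading axis (and onto its orthogonal complement) yields the separated P-like and S-like channels on which SNR pickers can then operate.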

  13. Non-stationary least-squares complex decomposition for microseismic noise attenuation

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2018-06-01

    Microseismic data processing and imaging are crucial for subsurface real-time monitoring during the hydraulic fracturing process. Unlike active-source seismic events or large-scale earthquake events, a microseismic event is usually of very small magnitude, which makes its detection challenging. The biggest challenge with microseismic data is its low signal-to-noise ratio. Because of the small energy difference between the effective microseismic signal and the ambient noise, the effective signals are usually buried in strong random noise. I propose a microseismic denoising algorithm based on decomposing a microseismic trace into an ensemble of components using least-squares inversion. Based on the predictive property of useful microseismic events along the time direction, the random noise can be filtered out via least-squares fitting of multiple damped exponential components. The method is flexible and almost automated, since the only parameter that needs to be defined is the decomposition number. I use synthetic and real data examples to demonstrate the potential of the algorithm in processing complicated microseismic data sets.
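    A minimal sketch of this style of decomposition, assuming a small fixed dictionary of damped exponential (damped-cosine/sine) components fitted by least squares; all frequencies, damping rates and noise levels are invented, and the actual algorithm estimates its components rather than fixing them on a grid.

```python
import numpy as np

# Denoising by least-squares fitting of damped exponential components:
# solve for the amplitudes of a dictionary of damped cosines/sines and
# take the fitted trace as the denoised signal.

t = np.linspace(0.0, 1.0, 400)
clean = np.exp(-3 * t) * np.cos(2 * np.pi * 12 * t)   # synthetic microseism
noisy = clean + 0.3 * np.random.default_rng(5).normal(size=t.size)

freqs, damps = [8, 10, 12, 14], [1, 3, 5]
atoms = [np.exp(-d * t) * fn(2 * np.pi * f * t)
         for f in freqs for d in damps for fn in (np.cos, np.sin)]
A = np.column_stack(atoms)                            # 400 x 24 dictionary
coef, *_ = np.linalg.lstsq(A, noisy, rcond=None)
denoised = A @ coef

gain = np.linalg.norm(noisy - clean) / np.linalg.norm(denoised - clean)
print(f"residual error shrinks by a factor of {gain:.1f}")
```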

  14. Vertically Integrated Seismological Analysis I : Modeling

    NASA Astrophysics Data System (ADS)

    Russell, S.; Arora, N. S.; Jordan, M. I.; Sudderth, E.

    2009-12-01

    As part of its CTBT verification efforts, the International Data Centre (IDC) analyzes seismic and other signals collected from hundreds of stations around the world. Current processing at the IDC proceeds in a series of pipelined stages. From station processing to network processing, each decision is made on the basis of local information. This has the advantage of efficiency, and simplifies the structure of software implementations. However, this approach may reduce accuracy in the detection and phase classification of arrivals, association of detections to hypothesized events, and localization of small-magnitude events. In our work, we approach such detection and association problems as ones of probabilistic inference. In simple terms, let X be a random variable ranging over all possible collections of events, with each event defined by time, location, magnitude, and type (natural or man-made). Let Y range over all possible waveform signal recordings at all detection stations. Then Pθ(X) describes a parameterized generative prior over events, and Pφ(Y | X) describes how the signal is propagated and measured (including travel time, selective absorption and scattering, noise, artifacts, sensor bias, sensor failures, etc.). Given observed recordings Y = y, we are interested in the posterior P(X | Y = y), and perhaps in the value of X that maximizes it, i.e., the most likely explanation for all the sensor readings. As detailed below, an additional focus of our work is to robustly learn appropriate model parameters θ and φ from historical data. The primary advantage we expect is that decisions about arrivals, phase classifications, and associations are made with the benefit of all available evidence, not just the local signal or predefined recipes. 
Important phenomena—such as the successful detection of sub-threshold signals, correction of phase classifications using arrival information at other stations, and removal of false events based on the absence of signals—should all fall out of our probabilistic framework without the need for special processing rules. In our baseline model, natural events occur according to a spatially inhomogeneous Poisson process. Complex events (swarms and aftershocks) may then be captured via temporally inhomogeneous extensions. Man-made events have a uniform probability of occurring anywhere on the earth, with a tendency to occur closer to the surface. Phases are modelled via their amplitude, frequency distribution, and origin. In the simplest case, transmission times are characterized via the one-dimensional IASPEI-91 model, accounting for model errors with Gaussian uncertainty. Such homogeneous, approximate physical models can be further refined via historical data and previously developed corrections. Signal measurements are captured by station-specific models, based on sensor types and geometries, local frequency absorption characteristics, and time-varying noise models. At the conference, we expect to be able to quantitatively demonstrate the advantages of our approach, at least for simulated data. When reporting their findings, such systems can easily flag low-confidence events, unexplained arrivals, and ambiguous classifications to focus the efforts of expert analysts.
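    In miniature, the posterior computation described above reduces to Bayes' rule. A toy sketch with a single binary event variable and Gaussian amplitude likelihoods (all numbers are invented for illustration, not IDC model parameters):

```python
import numpy as np

# Toy version of the generative formulation: X is a single binary
# "event occurred" variable and Y a vector of station amplitude readings,
# modelled as unit-variance Gaussian noise plus a mean shift when X = 1.

prior_event = 0.01          # P(X = 1), an assumed prior
signal, noise_sd = 2.0, 1.0

def posterior_event(y):
    """P(X = 1 | Y = y) by direct application of Bayes' rule."""
    def gauss(y, mu):
        z = (y - mu) / noise_sd
        return np.exp(-0.5 * z ** 2) / (noise_sd * np.sqrt(2 * np.pi))
    lik1 = np.prod(gauss(y, signal))          # P(y | event)
    lik0 = np.prod(gauss(y, 0.0))             # P(y | no event)
    num = lik1 * prior_event
    return num / (num + lik0 * (1 - prior_event))

consistent = np.array([2.1, 2.4, 2.2])   # coherent arrivals at 3 stations
quiet = np.array([0.1, -0.4, 0.2])       # background noise only
p_event = posterior_event(consistent)
p_quiet = posterior_event(quiet)
```

    Coherent evidence at several stations overwhelms the small prior, while quiet stations drive the posterior below it, which is the "benefit of all available evidence" the authors describe.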

  15. Self-similarity Clustering Event Detection Based on Triggers Guidance

    NASA Astrophysics Data System (ADS)

    Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan

    Traditional methods of Event Detection and Characterization (EDC) treat event detection as a classification problem, using words as samples to train a classifier. This can leave the classifier's positive and negative samples imbalanced, and it suffers from data sparseness when the corpus is small. Instead of classifying events with words as samples, this paper clusters events when judging event types. It uses self-similarity to converge on the value of K in the K-means algorithm under the guidance of event triggers, optimizing the clustering algorithm. Then, combining named entities with their relative position information, the method further determines the precise type of each event. The new method avoids the dependence on event templates found in traditional methods, and its detection results can be used directly in automatic text summarization, text retrieval, and topic detection and tracking.
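    The clustering step rests on standard K-means plus a rule for settling on K. A compact sketch, using deterministic farthest-first seeding and a simple inertia-drop stopping rule as a stand-in for the paper's self-similarity criterion (data and thresholds are illustrative):

```python
import numpy as np

def init_centers(X, k):
    """Deterministic farthest-first seeding."""
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    return np.array(centers)

def kmeans(X, k, iters=50):
    centers = init_centers(X, k)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    inertia = ((X - centers[labels]) ** 2).sum()
    return labels, inertia

def choose_k(X, k_max=6, drop=0.5):
    """Grow K until inertia stops dropping sharply (a stand-in criterion)."""
    prev = kmeans(X, 1)[1]
    for k in range(2, k_max + 1):
        cur = kmeans(X, k)[1]
        if cur > drop * prev:      # little gained by one more cluster
            return k - 1
        prev = cur
    return k_max

# Two well-separated toy "event" clusters in a 2-D feature space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),
               rng.normal([10, 10], 0.3, (40, 2))])
best_k = choose_k(X)
```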

  16. A slow earthquake sequence on the San Andreas fault

    USGS Publications Warehouse

    Linde, A.T.; Gladwin, M.T.; Johnston, M.J.S.; Gwyther, R.L.; Bilham, R.G.

    1996-01-01

    Earthquakes typically release stored strain energy on timescales of the order of seconds, limited by the velocity of sound in rock. Over the past 20 years, observations and laboratory experiments have indicated that rupture can also occur more slowly, with durations up to hours. Such events may be important in earthquake nucleation and in accounting for the excess of plate convergence over seismic slip in subduction zones. The detection of events with larger timescales requires near-field deformation measurements. In December 1992, two borehole strainmeters close to the San Andreas fault in California recorded a slow strain event of about a week in duration, and we show here that the strain changes were produced by a slow earthquake sequence (equivalent magnitude 4.8) with complexity similar to that of regular earthquakes. The largest earthquakes associated with these slow events were small (local magnitude 3.7) and contributed negligible strain release. The importance of slow earthquakes in the seismogenic process remains an open question, but these observations extend the observed timescale for slow events by two orders of magnitude.

  17. FusionAnalyser: a new graphical, event-driven tool for fusion rearrangements discovery

    PubMed Central

    Piazza, Rocco; Pirola, Alessandra; Spinelli, Roberta; Valletta, Simona; Redaelli, Sara; Magistroni, Vera; Gambacorti-Passerini, Carlo

    2012-01-01

    Gene fusions are common driver events in leukaemias and solid tumours; here we present FusionAnalyser, a tool dedicated to the identification of driver fusion rearrangements in human cancer through the analysis of paired-end high-throughput transcriptome sequencing data. We initially tested FusionAnalyser by using a set of in silico randomly generated sequencing data from 20 known human translocations occurring in cancer and subsequently using transcriptome data from three chronic and three acute myeloid leukaemia samples. In all cases our tool was invariably able to detect the presence of the correct driver fusion event(s) with high specificity. In one of the acute myeloid leukaemia samples, FusionAnalyser identified a novel, cryptic, in-frame ETS2–ERG fusion. A fully event-driven graphical interface and a flexible filtering system allow complex analyses to be run in the absence of any a priori programming or scripting knowledge. Therefore, we propose FusionAnalyser as an efficient and robust graphical tool for the identification of functional rearrangements in the context of high-throughput transcriptome sequencing data. PMID:22570408

  18. FusionAnalyser: a new graphical, event-driven tool for fusion rearrangements discovery.

    PubMed

    Piazza, Rocco; Pirola, Alessandra; Spinelli, Roberta; Valletta, Simona; Redaelli, Sara; Magistroni, Vera; Gambacorti-Passerini, Carlo

    2012-09-01

    Gene fusions are common driver events in leukaemias and solid tumours; here we present FusionAnalyser, a tool dedicated to the identification of driver fusion rearrangements in human cancer through the analysis of paired-end high-throughput transcriptome sequencing data. We initially tested FusionAnalyser by using a set of in silico randomly generated sequencing data from 20 known human translocations occurring in cancer and subsequently using transcriptome data from three chronic and three acute myeloid leukaemia samples. In all cases our tool was invariably able to detect the presence of the correct driver fusion event(s) with high specificity. In one of the acute myeloid leukaemia samples, FusionAnalyser identified a novel, cryptic, in-frame ETS2-ERG fusion. A fully event-driven graphical interface and a flexible filtering system allow complex analyses to be run in the absence of any a priori programming or scripting knowledge. Therefore, we propose FusionAnalyser as an efficient and robust graphical tool for the identification of functional rearrangements in the context of high-throughput transcriptome sequencing data.

  19. Mineralogical Diversity and Geology of Humboldt Crater Derived Using Moon Mineralogy Mapper Data.

    PubMed

    Martinot, M; Besse, S; Flahaut, J; Quantin-Nataf, C; Lozac'h, L; van Westrenen, W

    2018-02-01

    Moon Mineralogy Mapper (M3) spectroscopic data and high-resolution imagery data sets were used to study the mineralogy and geology of the 207 km diameter Humboldt crater. Analyses of M3 data, using a custom-made method for M3 spectra continuum removal and spectral parameters calculation, reveal multiple pure crystalline plagioclase detections within the Humboldt crater central peak complex, hinting at its crustal origin. However, olivine, spinel, and glass are observed in the crater walls and rims, suggesting these minerals derive from shallower levels than the plagioclase of the central peak complex. High-calcium pyroxenes are detected in association with volcanic deposits emplaced on the crater's floor. Geologic mapping was performed, and the age of Humboldt crater's units was estimated from crater counts. Results suggest that volcanic activity within this floor-fractured crater spanned over a billion years. The felsic mineralogy of the central peak complex region, which presumably excavated deeper material, and the shallow mafic minerals (olivine and spinel) detected in Humboldt crater walls and rim are not in accordance with the general view of the structure of the lunar crust. Our observations can be explained by the presence of a mafic pluton emplaced in the anorthositic crust prior to the Humboldt-forming impact event. Alternatively, the excavation of Australe basin ejecta could explain the observed mineralogical detections. This highlights the importance of detailed combined mineralogical and geological remote sensing studies to assess the heterogeneity of the lunar crust.

  20. Mineralogical Diversity and Geology of Humboldt Crater Derived Using Moon Mineralogy Mapper Data

    NASA Astrophysics Data System (ADS)

    Martinot, M.; Besse, S.; Flahaut, J.; Quantin-Nataf, C.; Lozac'h, L.; van Westrenen, W.

    2018-02-01

    Moon Mineralogy Mapper (M3) spectroscopic data and high-resolution imagery data sets were used to study the mineralogy and geology of the 207 km diameter Humboldt crater. Analyses of M3 data, using a custom-made method for M3 spectra continuum removal and spectral parameters calculation, reveal multiple pure crystalline plagioclase detections within the Humboldt crater central peak complex, hinting at its crustal origin. However, olivine, spinel, and glass are observed in the crater walls and rims, suggesting these minerals derive from shallower levels than the plagioclase of the central peak complex. High-calcium pyroxenes are detected in association with volcanic deposits emplaced on the crater's floor. Geologic mapping was performed, and the age of Humboldt crater's units was estimated from crater counts. Results suggest that volcanic activity within this floor-fractured crater spanned over a billion years. The felsic mineralogy of the central peak complex region, which presumably excavated deeper material, and the shallow mafic minerals (olivine and spinel) detected in Humboldt crater walls and rim are not in accordance with the general view of the structure of the lunar crust. Our observations can be explained by the presence of a mafic pluton emplaced in the anorthositic crust prior to the Humboldt-forming impact event. Alternatively, the excavation of Australe basin ejecta could explain the observed mineralogical detections. This highlights the importance of detailed combined mineralogical and geological remote sensing studies to assess the heterogeneity of the lunar crust.

  1. Cardio-hepatic risk assessment by CMR imaging in liver transplant candidates.

    PubMed

    Reddy, Sahadev T; Thai, Ngoc L; Oliva, Jose; Tom, Kusum B; Dishart, Michael K; Doyle, Mark; Yamrozik, June A; Williams, Ronald B; Shah, Moneal; Wani, Adil; Singh, Anil; Maheswary, Rishi; Biederman, Robert W W

    2018-03-02

    The preoperative workup of orthotopic liver transplantation (OLT) patients is particularly complex given the need for multiple imaging modalities. We recently demonstrated in our proof-of-concept study the value of a one-stop-shop approach using cardiovascular MRI (CMR) to address this complex problem. However, this approach requires further validation in a larger cohort, as detection of hepatocellular carcinoma (HCC) as well as cardiovascular risk assessment is critically important in these patients. We hypothesized that coronary risk assessment and HCC detectability are acceptable using the one-stop-shop CMR approach. In this observational study, patients underwent CMR evaluation including cardiac function, stress CMR, thoracoabdominal MRA, and abdominal MRI on a standard MRI scanner in one examination. Over 8 years, 252 OLT candidates underwent evaluation in the cardiac MRI suite. The completion rates for the segments of the CMR examination were 99% for function, 95% for stress CMR, 93% for LGE viability, 85% for liver MRI, and 87% for MRA. A negative CMR stress examination had 100% CAD event-free survival at 12 months. A total of 63 (29%) patients proceeded to OLT. Explant pathology confirmed detection/exclusion of HCC. This study further defines the population suitable for the one-stop-shop CMR concept for preoperative evaluation of OLT candidates, providing a road map for integrated testing in this complex patient population for evaluation of cardiac risk and detection of HCC lesions. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Genomic Data Quality Impacts Automated Detection of Lateral Gene Transfer in Fungi

    PubMed Central

    Dupont, Pierre-Yves; Cox, Murray P.

    2017-01-01

    Lateral gene transfer (LGT, also known as horizontal gene transfer), an atypical mechanism of transferring genes between species, has almost become the default explanation for genes that display an unexpected composition or phylogeny. Numerous methods of detecting LGT events all rely on two fundamental strategies: primary structure composition or gene tree/species tree comparisons. Discouragingly, the results of these different approaches rarely coincide. With the wealth of genome data now available, detection of laterally transferred genes is increasingly being attempted in large uncurated eukaryotic datasets. However, detection methods depend greatly on the quality of the underlying genomic data, which are typically complex for eukaryotes. Furthermore, given the automated nature of genomic data collection, it is typically impractical to manually verify all protein or gene models, orthology predictions, and multiple sequence alignments, requiring researchers to accept a substantial margin of error in their datasets. Using a test case comprising plant-associated genomes across the fungal kingdom, this study reveals that composition- and phylogeny-based methods have little statistical power to detect laterally transferred genes. In particular, phylogenetic methods reveal extreme levels of topological variation in fungal gene trees, the vast majority of which show departures from the canonical species tree. Therefore, it is inherently challenging to detect LGT events in typical eukaryotic genomes. This finding is in striking contrast to the large number of claims for laterally transferred genes in eukaryotic species that routinely appear in the literature, and questions how many of these proposed examples are statistically well supported. PMID:28235827

  3. [Study on the timeliness of detection and reporting on public health emergency events in China].

    PubMed

    Li, Ke-Li; Feng, Zi-Jian; Ni, Da-Xin

    2009-03-01

    To analyze the timeliness of detection and reporting of public health emergency events, and to explore effective strategies for improving the relevant capacities, we conducted a retrospective survey of 3275 emergency events reported through the Public Health Emergency Events Surveillance System from 2005 to the first half of 2006. A uniform self-administered questionnaire, developed by county Centers for Disease Control and Prevention, was used to collect data on the detection and reporting of the events. For communicable disease events, the median time interval between the occurrence of the first case and the detection of the event was 6 days (P25 = 2, P75 = 13). For food poisoning events and clusters of disease with unknown origin, the medians were 3 hours (P25, P75 = 16) and 1 day (P25 = 0, P75 = 5), respectively. 71.54% of the events were reported by the discoverers within 2 hours after detection. In general, the time intervals between the occurrence, detection, and reporting of the events differed according to the category of event. The timeliness of detection and reporting could be improved dramatically if the definitions of events were made more reasonable and accessible according to their characteristics, together with improved training programs for healthcare staff and teachers.
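    The reported figures are medians with interquartile bounds, which are straightforward to reproduce from a list of delays. A sketch with invented detection delays (in days), not the survey's data:

```python
import numpy as np

# Hypothetical delays from first case to event detection, in days.
delays = np.array([1, 2, 2, 4, 6, 6, 7, 9, 13, 13, 20, 30])
p25, med, p75 = np.percentile(delays, [25, 50, 75])
# Reported in the same "median (P25, P75)" style as the abstract:
summary = f"{med:g} days (P25 = {p25:g}, P75 = {p75:g})"
```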

  4. Real-time optimizations for integrated smart network camera

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois

    2005-02-01

    We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC-based electronic card for image processing, and network capabilities. The application detects events of interest in visual scenes, highlights alarms and computes statistics. The system also produces meta-data information that can be shared among other cameras in a network. We describe the requirements of such a system and then show how its design is optimized to process and compress video in real time. Indeed, typical video-surveillance algorithms such as background differencing, tracking, and event detection must be highly optimized and simplified to run on this hardware. To achieve a good match between hardware and software in this light embedded system, the software management is written on top of the Java-based middleware specification established by the OSGi alliance. We can easily integrate software and hardware in complex environments thanks to the Java Real-Time specification for the virtual machine and service-oriented Java network specifications (such as RMI and Jini). Finally, we report some outcomes and typical case studies for such a camera, such as counter-flow detection.
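    Background differencing, the first algorithm named above, can be sketched in a few lines: keep a running-average background and flag pixels that deviate beyond a threshold. Parameter values here are illustrative assumptions, not the camera's.

```python
import numpy as np

def update_and_detect(frame, background, alpha=0.05, thresh=25):
    """Return (foreground mask, updated background).

    alpha controls how fast the background adapts; thresh is the
    per-pixel intensity deviation that counts as foreground."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = diff > thresh
    new_bg = (1 - alpha) * background + alpha * frame   # exponential update
    return mask, new_bg

background = np.full((4, 4), 100.0)   # uniform grey background
frame = background.copy()
frame[1:3, 1:3] = 200                 # a bright moving object
mask, background = update_and_detect(frame, background)
```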

  5. Predicting Atmospheric Releases from the September 3, 2017 North Korean Event

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Simpson, M. D.; Glascoe, L. G.

    2017-12-01

    Underground nuclear explosions produce radionuclides that can be vented to the atmosphere and transported to International Monitoring System (IMS) measurement stations. Although a positive atmospheric detection from North Korea's declared test on September 3, 2017 has not been reported at any IMS station through early October, atmospheric transport models can predict when and where detections may arise and provide valuable information to optimize air collection strategies. We present predictive atmospheric transport simulations initiated in the early days after the event. Wind fields were simulated with the Weather Research and Forecast model and used to transport air tracers from an ensemble of releases in the FLEXPART dispersion model. If early venting had occurred, the simulations suggested that detections were possible at the IMS station in Takasaki, Japan. On-going and future research efforts associated with nuclear testing are focused on quantifying meteorological uncertainty, simulating releases in complex terrain, and developing new statistical methods for source attribution. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and is released as LLNL-ABS-740341.

  6. A real-time detector system for precise timing of audiovisual stimuli.

    PubMed

    Henelius, Andreas; Jagadeesan, Sharman; Huotilainen, Minna

    2012-01-01

    The successful recording of neurophysiologic signals, such as event-related potentials (ERPs) or event-related magnetic fields (ERFs), relies on precise information of stimulus presentation times. We have developed an accurate and flexible audiovisual sensor solution operating in real-time for on-line use in both auditory and visual ERP and ERF paradigms. The sensor functions independently of the used audio or video stimulus presentation tools or signal acquisition system. The sensor solution consists of two independent sensors; one for sound and one for light. The microcontroller-based audio sensor incorporates a novel approach to the detection of natural sounds such as multipart audio stimuli, using an adjustable dead time. This aids in producing exact markers for complex auditory stimuli and reduces the number of false detections. The analog photosensor circuit detects changes in light intensity on the screen and produces a marker for changes exceeding a threshold. The microcontroller software for the audio sensor is free and open source, allowing other researchers to customise the sensor for use in specific auditory ERP/ERF paradigms. The hardware schematics and software for the audiovisual sensor are freely available from the webpage of the authors' lab.
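    The adjustable dead time works by suppressing re-triggering for a fixed window after each marker, so a multipart sound yields a single onset marker. A sketch (threshold and window lengths are invented, not the firmware's):

```python
def detect_onsets(samples, thresh, dead):
    """Emit one marker per threshold crossing, then ignore further
    crossings for `dead` samples (the adjustable dead time)."""
    markers, quiet_until = [], 0
    for i, s in enumerate(samples):
        if i >= quiet_until and abs(s) > thresh:
            markers.append(i)
            quiet_until = i + dead   # suppress retriggering
    return markers

# A two-burst stimulus: with a long dead time it is one event,
# with a short dead time it is two.
sig = [0, 0, 9, 8, 0, 0, 7, 9, 0, 0]
one = detect_onsets(sig, thresh=5, dead=8)
two = detect_onsets(sig, thresh=5, dead=3)
```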

  7. Improving patient safety by optimizing the use of nursing human resources.

    PubMed

    Rochefort, Christian M; Buckeridge, David L; Abrahamowicz, Michal

    2015-06-14

    Recent ecological studies have suggested that inadequate nurse staffing may contribute to the incidence of adverse events in acute care hospitals. However, longitudinal studies are needed to further examine these associations and to identify the staffing patterns that are of greatest risk. The aims of this study are to determine if (a) nurse staffing levels are associated with an increased risk of adverse events, (b) the risk of adverse events in relationship to nurse staffing levels is modified by the complexity of patient requirements, and (c) optimal nurse staffing levels can be established. A dynamic cohort of all adult medical, surgical, and intensive care unit patients admitted between 2010 and 2015 to a Canadian academic health center will be followed during the inpatient and 7-day post-discharge period to assess the occurrence and frequency of adverse events in relationship to antecedent nurse staffing levels. Four potentially preventable adverse events will be measured: (a) hospital-acquired pneumonia, (b) ventilator-associated pneumonia, (c) venous thromboembolism, and (d) in-hospital fall. These events were selected for their high incidence, morbidity and mortality rates, and because they are hypothesized to be related to nurse staffing levels. Adverse events will be ascertained from electronic health record data using validated automated detection algorithms. Patient exposure to nurse staffing will be measured on every shift of the hospitalization using electronic payroll records. To examine the association between nurse staffing levels and the risk of adverse events, four Cox proportional hazards regression models will be used (one for each adverse event), while adjusting for patient characteristics and risk factors of adverse event occurrence. 
To determine if the association between nurse staffing levels and the occurrence of adverse events is modified by the complexity of patient requirements, interaction terms will be included in the regression models, and their significance assessed. To assess for the presence of optimal nurse staffing levels, flexible nonlinear spline functions will be fitted. This study will likely generate evidence-based information that will assist managers in making the most effective use of scarce nursing resources and in identifying staffing patterns that minimize the risk of adverse events.

  8. Information Spread of Emergency Events: Path Searching on Social Networks

    PubMed Central

    Hu, Hongzhi; Wu, Tunan

    2014-01-01

    Emergency events have attracted the attention of governments and the public worldwide, and they can easily trigger a series of serious social problems if their dissemination is not supervised effectively. In the Internet world, people communicate with each other and form various virtual communities based on social networks, leading to a complex and fast information-spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally finds the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning. PMID:24600323
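    The abstract does not spell out the improved IBF algorithm, but the key-path search it accelerates is at heart a shortest-path computation over a weighted spread graph. A standard Dijkstra sketch on a toy network (node names and weights are invented):

```python
import heapq

def dijkstra(graph, src):
    """Shortest distances from src over a weighted adjacency-list graph."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Edge weights could encode, say, expected relay delay between communities.
spread = {
    "source": [("blogA", 1.0), ("forum", 4.0)],
    "blogA":  [("forum", 1.5), ("newsB", 3.0)],
    "forum":  [("newsB", 1.0)],
}
dist = dijkstra(spread, "source")
```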

  9. Energetic protons from a disappearing solar filament

    NASA Technical Reports Server (NTRS)

    Kahler, S. W.; Cliver, E. W.; Cane, H. V.; Mcguire, R. E.; Stone, R. G.; Sheeley, N. R., Jr.

    1985-01-01

    A solar energetic (E > 50 MeV) particle (SEP) event observed at 1 AU began at about 1500 UT on 1981 December 5. This event was associated with a fast coronal mass ejection observed with the Solwind coronagraph on the P78-1 satellite. No metric type II or type IV burst was observed, but a weak interplanetary type II burst was observed with the low-frequency radio experiment on the International Sun-Earth Explorer-3 satellite. The mass ejection was associated with the eruption of a large solar quiescent filament which lay well away from any active regions. The eruption resulted in an H-alpha double-ribbon structure which straddled the magnetic inversion line. No impulsive phase was obvious in either the H-alpha or the microwave observations. This event indicates that neither a detectable impulsive phase nor a strong or complex magnetic field is necessary for the production of energetic ions.

  10. Discovering Coseismic Traveling Ionospheric Disturbances Generated by the 2016 Kaikoura Earthquake

    NASA Astrophysics Data System (ADS)

    Li, J. D.; Rude, C. M.; Gowanlock, M.; Pankratius, V.

    2017-12-01

    Geophysical events and hazards, such as earthquakes, tsunamis, and volcanoes, have been shown to generate traveling ionospheric disturbances (TIDs). These disturbances can be measured by means of Total Electron Content fluctuations obtained from a network of multifrequency GPS receivers in the MIT Haystack Observatory Madrigal database. Analyzing the response of the ionosphere to such hazards enhances our understanding of natural phenomena and augments our large-scale monitoring capabilities in conjunction with other ground-based sensors. However, it is currently challenging for human investigators to spot and characterize such signatures, or even to determine whether a geophysical event has actually occurred, because the ionosphere can be noisy, with multiple phenomena taking place at the same time. This work therefore explores a systematic pipeline for the ex-post discovery and characterization of TIDs. Our technique starts by geolocating the event and gathering the corresponding data, then checks for potentially conflicting TID sources, and processes the raw total electron content data to generate differential measurements. A Kolmogorov-Smirnov test is applied to evaluate the statistical significance of detected deviations in the differential measurements. We present results from the successful application of this pipeline to the 2016 7.8 Mw Kaikoura earthquake, which occurred in New Zealand on November 13th. We detect a coseismic TID occurring 8 minutes after the earthquake and propagating towards the equator at 1050 m/s, with a 0.22 peak-to-peak TECu amplitude. Furthermore, the observed waveform exhibits more complex behavior than the expected N-wave for a coseismic TID, which potentially results from the complex multi-fault structure of the earthquake. We acknowledge support from NSF ACI1442997 (PI Pankratius), NASA AISTNNX15AG84G (PI Pankratius), NSF AGS-1343967 (PI Pankratius), and NSF AGS-1242204 (PI Erickson).
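    The deviation test can be sketched as a two-sample Kolmogorov-Smirnov statistic comparing differential measurements from a post-event window against a quiet reference window. The data below are synthetic stand-ins, not Madrigal TEC values:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample KS statistic: max distance between empirical CDFs."""
    allv = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), allv, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), allv, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(1)
quiet = rng.normal(0.0, 0.05, 400)        # background TEC fluctuations
disturbed = rng.normal(0.15, 0.05, 400)   # shifted by a wave-like TID
d_same = ks_statistic(quiet, rng.normal(0.0, 0.05, 400))
d_event = ks_statistic(quiet, disturbed)
```

    A large statistic for the post-event window (compared against the KS critical value for the sample sizes) flags a significant deviation.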

  11. Label-free detection of DNA hybridization using carbon nanotube network field-effect transistors

    NASA Astrophysics Data System (ADS)

    Star, Alexander; Tu, Eugene; Niemann, Joseph; Gabriel, Jean-Christophe P.; Joiner, C. Steve; Valcke, Christian

    2006-01-01

    We report carbon nanotube network field-effect transistors (NTNFETs) that function as selective detectors of DNA immobilization and hybridization. NTNFETs with immobilized synthetic oligonucleotides have been shown to specifically recognize target DNA sequences, including H63D single-nucleotide polymorphism (SNP) discrimination in the HFE gene, responsible for hereditary hemochromatosis. The electronic responses of NTNFETs upon single-stranded DNA immobilization and subsequent DNA hybridization events were confirmed by using fluorescence-labeled oligonucleotides and then were further explored for label-free DNA detection at picomolar to micromolar concentrations. We have also observed a strong effect of DNA counterions on the electronic response, thus suggesting a charge-based mechanism of DNA detection using NTNFET devices. Implementation of label-free electronic detection assays using NTNFETs constitutes an important step toward low-cost, low-complexity, highly sensitive and accurate molecular diagnostics.

  12. Detection of Maillard reaction products by a coupled HPLC-Fraction collector technique and FTIR characterization of Cu(II)-complexation with the isolated species

    NASA Astrophysics Data System (ADS)

    Ioannou, Aristos; Daskalakis, Vangelis; Varotsis, Constantinos

    2017-08-01

    The isolation of reaction products of asparagine with reducing sugars at alkaline pH and high temperature has been probed by high-performance liquid chromatography (HPLC) coupled with a Fraction Collector. The UV-vis and FTIR spectra of the isolated Maillard reaction products showed structure-sensitive changes, as reflected by deamination events and the formation of asparagine-saccharide conjugates. The initial reaction species of the Asn-Gluc reaction were also characterized by Density Functional Theory (DFT) methods. Evidence for Cu(II) metal-ion complexation with the Maillard reaction products is supported by UV-vis and FTIR spectroscopy.

  13. Comparison of Glaucoma Progression Detection by Optical Coherence Tomography and Visual Field.

    PubMed

    Zhang, Xinbo; Dastiridou, Anna; Francis, Brian A; Tan, Ou; Varma, Rohit; Greenfield, David S; Schuman, Joel S; Huang, David

    2017-12-01

    To compare longitudinal glaucoma progression detection using optical coherence tomography (OCT) and visual field (VF). Validity assessment. We analyzed subjects with more than 4 semi-annual follow-up visits (every 6 months) in the multicenter Advanced Imaging for Glaucoma Study. Fourier-domain optical coherence tomography (OCT) was used to map the thickness of the peripapillary retinal nerve fiber layer (NFL) and ganglion cell complex (GCC). OCT-based progression detection was defined as a significant negative trend for either NFL or GCC. VF progression was reached if either the event or trend analysis reached significance. The analysis included 356 glaucoma suspect/preperimetric glaucoma (GS/PPG) eyes and 153 perimetric glaucoma (PG) eyes. Follow-up length was 54.1 ± 16.2 months for GS/PPG eyes and 56.7 ± 16.0 for PG eyes. Progression was detected in 62.1% of PG eyes and 59.8% of GS/PPG eyes by OCT, significantly (P < .001) more than the detection rate of 41.8% and 27.3% by VF. In severity-stratified analysis of PG eyes, OCT had significantly higher detection rate than VF in mild PG (63.1% vs. 38.7%, P < .001), but not in moderate and advanced PG. The rate of NFL thinning slowed dramatically in advanced PG, but GCC thinning rate remained relatively steady and allowed good progression detection even in advanced disease. The Kaplan-Meier time-to-event analyses showed that OCT detected progression earlier than VF in both PG and GS/PPG groups. OCT is more sensitive than VF for the detection of progression in early glaucoma. While the utility of NFL declines in advanced glaucoma, GCC remains a sensitive progression detector from early to advanced stages. Copyright © 2017 Elsevier Inc. All rights reserved.
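    Trend-based progression detection of the kind described here amounts to testing whether the slope of a thickness series is significantly negative. A sketch with synthetic semi-annual NFL thicknesses (micrometers) and an illustrative t-statistic cutoff; none of the numbers come from the study:

```python
import numpy as np

def negative_trend(y, t_cut=-2.0):
    """True if the least-squares slope's t-statistic falls below t_cut."""
    x = np.arange(len(y), dtype=float)   # visit index (every 6 months)
    n = len(y)
    slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    resid = y - (y.mean() + slope * (x - x.mean()))
    # Standard error of the slope from the residual variance.
    se = np.sqrt((resid @ resid) / (n - 2) / (np.var(x) * n))
    return slope / se < t_cut

nfl_stable = np.array([98.0, 97.8, 98.3, 97.9, 98.1, 98.0])
nfl_thinning = np.array([98.0, 96.9, 95.8, 95.1, 94.2, 93.0])
```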

  14. Cognitive complexity of the medical record is a risk factor for major adverse events.

    PubMed

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
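    The abstract does not give the Complexity Ruler's exact formula. One hedged stand-in for "bits of data in the record" is the compressed size of the record text, which discounts boilerplate repeated verbatim; the two records below are invented examples, not real charts:

```python
import zlib

def ccmr_bits(record_text: str) -> int:
    """Proxy for record information content: zlib-compressed size in bits.

    This is an illustrative stand-in, not the Complexity Ruler itself."""
    return 8 * len(zlib.compress(record_text.encode("utf-8"), 9))

simple = "Routine visit. Well child. No medications.\n" * 3
complex_rec = (
    "Admitted for staged cardiac repair; lines: PICC, arterial. "
    "Meds: milrinone 0.5 mcg/kg/min, furosemide 1 mg/kg q6h. "
    "Consults: cardiology, nephrology, nutrition. "
    "Events: reintubation day 3; dialysis started day 5.\n"
) * 3
```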

  15. EGF receptor lysosomal degradation is delayed in the cells stimulated with EGF-Quantum dot bioconjugate but earlier key events of endocytic degradative pathway are similar to that of native EGF

    PubMed Central

    Leontieva, Ekaterina A.; Kornilova, Elena S.

    2017-01-01

    Quantum dots (QDs) complexed to ligands that recognize internalizing surface receptors are an attractive tool for live-cell imaging of ligand-receptor complex behavior and for specific tracking of cells of interest. However, conjugation of a quasi-multivalent, large QD particle to a monovalent, small growth factor like EGF, which binds its tyrosine-kinase receptor, may affect key endocytic events tightly coupled to signaling. Here, by means of confocal microscopy, we have addressed the key endocytic events of the lysosomal degradative pathway stimulated by native EGF or the EGF-QD bioconjugate. We have demonstrated that the decrease in endosome number, the increase in mean endosome integrated density, and the pattern of EEA1 co-localization with EGF-EGFR complexes at early stages of endocytosis were similar for both the native and QD-conjugated ligands. In both cases enlarged hollow endosomes appeared after wortmannin treatment. This indicates that early endosomal fusions and their maturation proceed similarly for both ligands. EGF-QD and native EGF accumulated similarly in the juxtanuclear region, and live-cell imaging of endosome motion revealed the behavior described elsewhere for microtubule-facilitated motility. Finally, EGF-QD and the receptor were found in lysosomes. However, degradation of the receptor part of the QD-EGF-EGFR complex was delayed compared with native EGF, but not inhibited, while QD fluorescence was detected in lysosomes even after 24 hours. Importantly, both ligands behaved similarly in HeLa and A549 cells. We conclude that during endocytosis EGF-QD behaves as a neutral marker for the degradative pathway up to the lysosomal stage and can also be used as a long-term cell marker. PMID:28574831

  16. Deviance detection based on regularity encoding along the auditory hierarchy: electrophysiological evidence in humans.

    PubMed

    Escera, Carles; Leung, Sumie; Grimm, Sabine

    2014-07-01

    Detection of changes in the acoustic environment is critical for survival, as it prevents missing potentially relevant events outside the focus of attention. In humans, deviance detection based on acoustic regularity encoding has been associated with a brain response derived from the human EEG, the mismatch negativity (MMN) auditory evoked potential, peaking at about 100-200 ms from deviance onset. On the basis of its long latency and cerebral generators, both regularity encoding and deviance detection have been assumed to be cortical processes. Yet, intracellular, extracellular, single-unit and local-field potential recordings in rats and cats have shown much earlier (circa 20-30 ms) and hierarchically lower (primary auditory cortex, medial geniculate body, inferior colliculus) deviance-related responses. Here, we review the recent evidence obtained with the complex auditory brainstem response (cABR), the middle latency response (MLR) and magnetoencephalography (MEG) demonstrating that human auditory deviance detection based on regularity encoding, rather than on refractoriness, occurs at latencies and in neural networks comparable to those revealed in animals. Specifically, encoding of simple acoustic-feature regularities and detection of corresponding deviance, such as an infrequent change in frequency or location, occur in the latency range of the MLR, in separate auditory cortical regions from those generating the MMN, and even at the level of the human auditory brainstem. In contrast, violations of more complex regularities, such as those defined by the alternation of two different tones or by feature conjunctions (i.e., frequency and location), fail to elicit MLR correlates but elicit sizable MMNs.
Altogether, these findings support the emerging view that deviance detection is a basic principle of the functional organization of the auditory system, and that regularity encoding and deviance detection are organized in ascending levels of complexity along the auditory pathway, extending from the brainstem up to higher-order areas of the cerebral cortex.

  17. Perceived synchrony for realistic and dynamic audiovisual events.

    PubMed

    Eg, Ragnhild; Behne, Dawn M

    2015-01-01

    In well-controlled laboratory experiments, researchers have found that humans can perceive delays between auditory and visual signals as short as 20 ms. Conversely, other experiments have shown that humans can tolerate audiovisual asynchrony that exceeds 200 ms. This seeming contradiction in human temporal sensitivity can be attributed to a number of factors such as experimental approaches and precedence of the asynchronous signals, along with the nature, duration, location, complexity and repetitiveness of the audiovisual stimuli, and even individual differences. In order to better understand how temporal integration of audiovisual events occurs in the real world, we need to close the gap between the experimental setting and the complex setting of everyday life. With this work, we aimed to contribute one brick to the bridge that will close this gap. We compared perceived synchrony for long-running and eventful audiovisual sequences to shorter sequences that contain a single audiovisual event, for three types of content: action, music, and speech. The resulting windows of temporal integration showed that participants were better at detecting asynchrony for the longer stimuli, possibly because the long-running sequences contain multiple corresponding events that offer audiovisual timing cues. Moreover, the points of subjective simultaneity differ between content types, suggesting that the nature of a visual scene could influence the temporal perception of events. An expected outcome from this type of experiment was the rich variation among participants' distributions and the derived points of subjective simultaneity. Hence, the designs of similar experiments call for more participants than traditional psychophysical studies. Heeding this caution, we conclude that existing theories on multisensory perception are ready to be tested on more natural and representative stimuli.
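    A common way to derive the measures discussed here, the point of subjective simultaneity (PSS) and the window of temporal integration, is to fit a symmetric curve to the proportion of "in sync" judgments across audiovisual offsets. The sketch below fits a Gaussian to fabricated data and is not the authors' analysis code:

    ```python
    # Illustrative PSS estimation: fit a Gaussian to synchrony-judgment rates.
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(soa, peak, pss, width):
        """Proportion of 'synchronous' responses as a function of AV offset (ms)."""
        return peak * np.exp(-((soa - pss) ** 2) / (2 * width ** 2))

    # Fabricated data: audio-lead offsets are negative, visual-lead positive.
    soas = np.array([-300, -200, -100, 0, 100, 200, 300])
    p_sync = np.array([0.10, 0.35, 0.80, 0.95, 0.90, 0.55, 0.20])

    (peak, pss, width), _ = curve_fit(gaussian, soas, p_sync, p0=[1.0, 0.0, 100.0])
    print(f"PSS ~ {pss:.0f} ms, integration window ~ {2 * abs(width):.0f} ms")
    ```

    The asymmetry in the fabricated data (greater tolerance for visual-lead offsets) mirrors the typical finding that the PSS is shifted toward vision leading audio.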

  18. Perceived synchrony for realistic and dynamic audiovisual events

    PubMed Central

    Eg, Ragnhild; Behne, Dawn M.

    2015-01-01

    In well-controlled laboratory experiments, researchers have found that humans can perceive delays between auditory and visual signals as short as 20 ms. Conversely, other experiments have shown that humans can tolerate audiovisual asynchrony that exceeds 200 ms. This seeming contradiction in human temporal sensitivity can be attributed to a number of factors such as experimental approaches and precedence of the asynchronous signals, along with the nature, duration, location, complexity and repetitiveness of the audiovisual stimuli, and even individual differences. In order to better understand how temporal integration of audiovisual events occurs in the real world, we need to close the gap between the experimental setting and the complex setting of everyday life. With this work, we aimed to contribute one brick to the bridge that will close this gap. We compared perceived synchrony for long-running and eventful audiovisual sequences to shorter sequences that contain a single audiovisual event, for three types of content: action, music, and speech. The resulting windows of temporal integration showed that participants were better at detecting asynchrony for the longer stimuli, possibly because the long-running sequences contain multiple corresponding events that offer audiovisual timing cues. Moreover, the points of subjective simultaneity differ between content types, suggesting that the nature of a visual scene could influence the temporal perception of events. An expected outcome from this type of experiment was the rich variation among participants' distributions and the derived points of subjective simultaneity. Hence, the designs of similar experiments call for more participants than traditional psychophysical studies. Heeding this caution, we conclude that existing theories on multisensory perception are ready to be tested on more natural and representative stimuli. PMID:26082738

  19. Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics

    NASA Astrophysics Data System (ADS)

    Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu

    2007-11-01

    In this paper we present a novel framework for automatic event recognition and abnormal-behavior detection with attribute grammars by learning scene semantics. The framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and abnormal behaviors that violate scene semantics or event grammar rules are detected. This method yields an approach to understanding video scenes; furthermore, with this prior knowledge, the accuracy of abnormal event detection increases.

  20. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-11-05

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of variations in radio signal strength at the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The WSiN concept for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion, using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After an event is detected, the classifier labels geo-events within the event-occurring region, called the classification window. The proposed method is evaluated with a water-leakage experiment measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks detected and classified subsurface events.
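    The two-step scheme described above (detect an event, then classify the window of readings by minimum distance) can be sketched as follows. The class-mean vectors, thresholds, and RSSI numbers are invented for illustration and are not the paper's calibrated values:

    ```python
    # Sketch of a window-based minimum distance classifier for subsurface events.
    import numpy as np

    CLASS_MEANS = {  # hypothetical mean RSSI deviation per node (dB) for each geo-event
        "water_intrusion": np.array([-8.0, -7.5, -8.2]),
        "density_change":  np.array([-3.0, -2.5, -3.2]),
        "relative_motion": np.array([-1.0, -6.0, -1.5]),
    }

    def detect_event(window, baseline, threshold_db=2.0):
        """Step 1: an event occurs when the window's mean deviates from baseline."""
        return np.linalg.norm(window.mean(axis=0) - baseline) > threshold_db

    def classify_event(window, baseline):
        """Step 2: assign the class whose mean vector is nearest (Euclidean) to the deviation."""
        deviation = window.mean(axis=0) - baseline
        return min(CLASS_MEANS, key=lambda c: np.linalg.norm(deviation - CLASS_MEANS[c]))

    baseline = np.array([-40.0, -42.0, -41.0])            # calibrated RSSI per node (dBm)
    window = baseline + np.array([[-7.9, -7.2, -8.5],     # readings inside the
                                  [-8.1, -7.8, -8.0]])    # classification window
    if detect_event(window, baseline):
        print(classify_event(window, baseline))           # → water_intrusion
    ```

    A Bayesian decision-theoretic version would weight the distances by class priors and noise covariance; with equal priors and isotropic noise it reduces to this minimum-distance rule.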

  1. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of variations in radio signal strength at the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The WSiN concept for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion, using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After an event is detected, the classifier labels geo-events within the event-occurring region, called the classification window. The proposed method is evaluated with a water-leakage experiment measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks detected and classified subsurface events. PMID:23202191

  2. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    PubMed

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, but not by impedance abnormalities. Our aim was to compare the usefulness of arrhythmic events with that of conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center in Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients have been followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic event 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none had experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  3. The Connection between the Complexity of Perception of an Event and Judging Decisions in a Complex Situation

    ERIC Educational Resources Information Center

    Rauchberger, Nirit; Kaniel, Shlomo; Gross, Zehavit

    2017-01-01

    This study examines the process of judging complex real-life events in Israel: the disengagement from Gush Katif, Rabin's assassination and the Second Lebanon War. The process of judging is based on Weiner's attribution model, (Weiner, 2000, 2006); however, due to the complexity of the events studied, variables were added to characterize the…

  4. Saliency Detection as a Reactive Process: Unexpected Sensory Events Evoke Corticomuscular Coupling

    PubMed Central

    Kilintari, Marina; Srinivasan, Mandayam; Haggard, Patrick

    2018-01-01

    Survival in a fast-changing environment requires animals not only to detect unexpected sensory events, but also to react. In humans, these salient sensory events generate large electrocortical responses, which have been traditionally interpreted within the sensory domain. Here we describe a basic physiological mechanism coupling saliency-related cortical responses with motor output. In four experiments conducted on 70 healthy participants, we show that salient substartle sensory stimuli modulate isometric force exertion by human participants, and that this modulation is tightly coupled with electrocortical activity elicited by the same stimuli. We obtained four main results. First, the force modulation follows a complex triphasic pattern consisting of alternating decreases and increases of force, time-locked to stimulus onset. Second, this modulation occurs regardless of the sensory modality of the eliciting stimulus. Third, the magnitude of the force modulation is predicted by the amplitude of the electrocortical activity elicited by the same stimuli. Fourth, both neural and motor effects are not reflexive but depend on contextual factors. Together, these results indicate that sudden environmental stimuli have an immediate effect on motor processing, through a tight corticomuscular coupling. These observations suggest that saliency detection is not merely perceptive but reactive, preparing the animal for subsequent appropriate actions. SIGNIFICANCE STATEMENT Salient events occurring in the environment, regardless of their modalities, elicit large electrical brain responses, dominated by a widespread “vertex” negative-positive potential. This response is the largest synchronization of neural activity that can be recorded from a healthy human being. Current interpretations assume that this vertex potential reflects sensory processes. 
Contrary to this general assumption, we show that the vertex potential is strongly coupled with a modulation of muscular activity that follows the same pattern. Both the vertex potential and its motor effects are not reflexive but strongly depend on contextual factors. These results reconceptualize the significance of these evoked electrocortical responses, suggesting that saliency detection is not merely perceptive but reactive, preparing the animal for subsequent appropriate actions. PMID:29378865

  5. Leveraging Social Norms to Improve Leak Resolution Outcomes Across Meter Classes:

    NASA Astrophysics Data System (ADS)

    Holleran, W.

    2016-12-01

    Over the past decade, utilities, governments, businesses, and nonprofits have come to realize that more than just financial considerations and information drive behavior. Social and psychological factors also play a significant role in shaping consumers' decisions and behaviors around resource use. Stakeholders have consequently turned their interest to behavioral science, a multidisciplinary field that draws from psychology, sociology, public health, and behavioral economics to explain the complex mechanisms that shape human behavior. When used strategically, behavioral science holds the potential to drive down resource use, drive up profits, and generate measurable gains in conservation and efficiency. WaterSmart will present on how the water sector can employ behavioral science to nudge residential rate-payers to use water more efficiently and help them save money. Utilities can use behavioral science to influence people's reaction to leaks. Roughly 5% of Single Family Residential (SFR) metered water use can be attributed to leaks, and this share potentially skews even higher for MultiFamily (MF) and Commercial accounts, where leak flow can get lost in the noise of daily consumption. Existing leak detection algorithms in the market are not sophisticated enough to detect leaks for an MF or Commercial property. Leveraging data from utilities on known leak events at MF and Commercial buildings allowed WaterSmart to train a machine learning model to identify key features in the load shape and accurately detect these types of water use events. The model outputs a leak amount and a confidence level for each irregular usage event, and it also incorporates recorded user feedback on the type of leak event and the accuracy of the alert. By combining this model with social norms messaging, WaterSmart has been able to improve water demand management for MF and Commercial properties.
Experiences from leak detection and resolution in the SFR space will also be discussed.
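    As background to the load-shape analysis described above, a classic single-family heuristic is that a continuous leak shows up as a floor of nonzero flow during hours when legitimate use should drop to zero. The sketch below is a generic illustration of that heuristic, not WaterSmart's trained model:

    ```python
    # Minimal continuous-leak screen from hourly interval meter data.
    def estimate_leak_gph(hourly_gallons, night_hours=(2, 3, 4)):
        """Estimate a continuous leak as the minimum overnight flow (gal/hour)."""
        return min(hourly_gallons[h] for h in night_hours)

    def leak_alert(hourly_gallons, threshold_gph=1.0):
        """Return the estimated leak rate if it exceeds the alert threshold, else 0."""
        leak = estimate_leak_gph(hourly_gallons)
        return leak if leak >= threshold_gph else 0.0

    # Hypothetical single-family day: a steady 2 gal/h leak under normal use.
    day = [2.0] * 24
    day[7] += 30    # morning showers
    day[19] += 25   # evening use
    print(leak_alert(day))  # → 2.0
    ```

    The abstract's point is that this kind of simple floor-based screen breaks down for multifamily and commercial meters, where round-the-clock legitimate use masks the leak floor, motivating a learned model over richer load-shape features.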

  6. Satellite microwave observations of a storm complex: A comparative analysis

    NASA Technical Reports Server (NTRS)

    Martin, D. W.

    1985-01-01

    The hypothesis that cold events correspond to a particular stage in a class of thunderstorms was tested. That class comprises storms whose updrafts (1) are strong, broad, and moist, and (2) extend well above the freezing level. Condition (1) implies strong mesoscale forcing. Condition (2) implies a tall updraft or a relatively low freezing level. Such storms should have big, intense radar echoes and cold, fast-growing anvils. The thunderstorm events were analyzed with radar, rain gauge, and GOES infrared observations. Radar was the starting point for detection and definition of the hypothesized thunderstorms. The radar signature is compared to the signature of the storm in rain gauge observations, satellite infrared images, and satellite microwave images.

  7. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  8. Geologic events coupled with Pleistocene climatic oscillations drove genetic variation of Omei treefrog (Rhacophorus omeimontis) in southern China.

    PubMed

    Li, Jun; Zhao, Mian; Wei, Shichao; Luo, Zhenhua; Wu, Hua

    2015-12-21

    Pleistocene climatic oscillations and historical geological events may both influence current patterns of genetic variation, and the species in southern China that faced unique climatic and topographical events have complex evolutionary histories. However, the relative contributions of climatic oscillations and geographical events to the genetic variation of these species remain undetermined. To investigate patterns of genetic variation and to test hypotheses about the factors that shaped the distribution of this genetic variation in species of southern China, mitochondrial genes (cytochrome b and NADH dehydrogenase subunit 2) and nine microsatellite loci of the Omei tree frog (Rhacophorus omeimontis) were amplified in this study. The genetic diversity in the populations of R. omeimontis was high. The phylogenetic trees reconstructed from the mitochondrial DNA (mtDNA) haplotypes and the Bayesian genetic clustering analysis based on microsatellite data both revealed that all populations were divided into three lineages (SC, HG and YN). The two most recent splitting events among the lineages coincided with recent geological events (including the intense uplift of the Qinghai-Tibet Plateau [QTP] and the subsequent movements of the Yun-Gui Plateau [YGP]) and the Pleistocene glaciations. Significant expansion signals were not detected in mismatch analyses or neutrality tests, and the effective population size of each lineage was stable during the Pleistocene. Based on the results of this study, complex geological events (the recent dramatic uplift of the QTP and the subsequent movements of the YGP) and the Pleistocene glaciations were apparent drivers of the rapid divergence of the R. omeimontis lineages. Each diverged lineage survived in situ with limited gene exchange, and the stable demographics of the lineages indicate that the Pleistocene climatic oscillations were inconsequential for this species. The analysis of genetic variation in populations of R. omeimontis contributes to the understanding of the effects of changes in climate and of geographical events on the dynamic development of contemporary patterns of genetic variation in the species of southern China.

  9. Characterization of fusion genes and the significantly expressed fusion isoforms in breast cancer by hybrid sequencing

    PubMed Central

    Weirather, Jason L.; Afshar, Pegah Tootoonchi; Clark, Tyson A.; Tseng, Elizabeth; Powers, Linda S.; Underwood, Jason G.; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai

    2015-01-01

    We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from the MCF-7 breast cancer cells. Compared with the existing tools, IDP-fusion detects fusion genes at higher precision and a very low false positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. PMID:26040699

  10. Long-term neurocognitive outcome and auditory event-related potentials after complex febrile seizures in children.

    PubMed

    Tsai, Min-Lan; Hung, Kun-Long; Tsan, Ying-Ying; Tung, William Tao-Hsin

    2015-06-01

    Whether prolonged or complex febrile seizures (FS) produce long-term injury to the hippocampus is a critical question concerning the neurocognitive outcome of these seizures. Long-term event-related evoked potential (ERP) recording from the scalp is a noninvasive technique reflecting the sensory and cognitive processes associated with attention tasks. This study aimed to investigate the long-term outcome of neurocognitive and attention functions and to evaluate auditory event-related potentials in children who have experienced complex FS in comparison with other types of FS. One hundred and forty-seven children aged more than 6 years who had experienced complex FS, simple single FS, simple recurrent FS, or afebrile seizures (AFS) after FS, as well as age-matched healthy controls, were enrolled. Patients were evaluated with Wechsler Intelligence Scale for Children (WISC; Chinese WISC-IV) scores, behavior test scores (Chinese version of Conners' continuous performance test, CPT II V.5), and behavior rating scales. Auditory ERPs were recorded in each patient. Patients who had experienced complex FS exhibited significantly lower full-scale intelligence quotient (FSIQ), perceptual reasoning index, and working memory index scores than did the control group but did not show significant differences in CPT scores, behavior rating scales, or ERP latencies and amplitude compared with the other groups with FS. We found a significant decrease in the FSIQ and four indices of the WISC-IV, higher behavior rating scales, a trend of increased CPT II scores, and significantly delayed P300 latency and reduced P300 amplitude in the patients with AFS after FS. We conclude that there is an effect on cognitive function in children who have experienced complex FS and patients who developed AFS after FS. The results indicated that the WISC-IV is more sensitive in detecting cognitive abnormality than ERP. 
Cognition impairment, including perceptual reasoning and working memory defects, was identified in patients with prolonged, multiple, or focal FS. These results may have implications for the pathogenesis of complex FS. Further comprehensive psychological evaluation and educational programs are suggested. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    NASA Astrophysics Data System (ADS)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in similar regions and with similar focal mechanisms to those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. 
The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May 2016. The second area of interest is the Gulf of California, where two swarms took place during July and September of 2015. We show that we are able to detect previously unreported, non-impulsive events and recommend that this method be used together with more traditional template-matching methods to maximize the number of detected events.
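    The detection-function idea (correlate a template with continuous records, stack over stations, and threshold relative to the noise level) can be illustrated with a toy matched filter. The waveform, noise level, and threshold factor below are synthetic assumptions, not the authors' strain-Green's-tensor pipeline:

    ```python
    # Toy matched-filter detection function: stack per-station correlations
    # of a known waveform with noisy continuous traces and threshold the stack.
    import numpy as np

    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, 6 * np.pi, 120)) * np.hanning(120)

    def detection_function(traces, template):
        """Stack cross-correlations of the template with each station's trace."""
        stack = np.zeros(len(traces[0]) - len(template) + 1)
        for tr in traces:
            stack += np.correlate(tr, template, mode="valid")
        return stack

    # Bury the same waveform at sample 700 in noisy traces from three "stations".
    traces = []
    for _ in range(3):
        tr = 0.05 * rng.standard_normal(2000)
        tr[700:820] += template
        traces.append(tr)

    det = detection_function(traces, template)
    threshold = 8 * np.median(np.abs(det))  # crude noise-relative threshold
    print(int(np.argmax(det)), bool(det.max() > threshold))  # peak near sample 700
    ```

    Stacking is what makes the approach robust for weak, non-impulsive signals: incoherent noise partially cancels across stations while the signal correlation adds constructively.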

  12. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE PAGES

    Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.; ...

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.
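The notion of regular sound change in this abstract — one phoneme changing to the same other phoneme across many words at once — can be illustrated with a toy tally over aligned cognate pairs. This is only a sketch of the concept; the paper's actual method is a phylogenetic statistical model, and the data and thresholds here are illustrative:

```python
from collections import Counter

def sound_change_counts(aligned_pairs):
    """Tally phoneme substitutions across aligned ancestor/descendant
    word pairs (hypothetical toy data, not the paper's Turkic corpus)."""
    counts = Counter()
    for ancestor, descendant in aligned_pairs:
        for a, d in zip(ancestor, descendant):
            if a != d:
                counts[(a, d)] += 1
    return counts

def regular_changes(counts, min_words=3):
    """A substitution recurring across many words in the lexicon is a
    candidate regular (concerted) change; one-off substitutions are
    treated as sporadic."""
    return {pair: n for pair, n in counts.items() if n >= min_words}
```

A substitution such as p→f appearing in most of the lexicon at once is exactly the concerted signal the model exploits, as opposed to independent per-site drift.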

  13. Detecting regular sound changes in linguistics as events of concerted evolution.

    PubMed

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  15. Current and Developing Technologies for Monitoring Agents of Bioterrorism and Biowarfare

    PubMed Central

    Lim, Daniel V.; Simpson, Joyce M.; Kearns, Elizabeth A.; Kramer, Marianne F.

    2005-01-01

    Recent events have made public health officials acutely aware of the importance of rapidly and accurately detecting acts of bioterrorism. Because bioterrorism is difficult to predict or prevent, reliable platforms to rapidly detect and identify biothreat agents are important to minimize the spread of these agents and to protect the public health. These platforms must not only be sensitive and specific, but must also be able to accurately detect a variety of pathogens, including modified or previously uncharacterized agents, directly from complex sample matrices. Various commercial tests utilizing biochemical, immunological, nucleic acid, and bioluminescence procedures are currently available to identify biological threat agents. Newer tests have also been developed to identify such agents using aptamers, biochips, evanescent wave biosensors, cantilevers, living cells, and other innovative technologies. This review describes these current and developing technologies and considers challenges to rapid, accurate detection of biothreat agents. Although there is no ideal platform, many of these technologies have proved invaluable for the detection and identification of biothreat agents. PMID:16223949

  16. Hiding earthquakes from scrupulous monitoring eyes of dense local seismic networks

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Ishii, M.; Kiser, E.

    2012-12-01

    Accurate and complete cataloguing of aftershocks is essential for a variety of purposes, including the estimation of the mainshock rupture area, the identification of seismic gaps, and seismic hazard assessment. However, immediately following large earthquakes, the seismograms recorded by local networks are noisy, with energy arriving from hundreds of aftershocks, in addition to different seismic phases interfering with one another. This causes deterioration in the performance of detection and location of earthquakes using conventional methods such as the S-P approach. This is demonstrated by results of back-projection analysis of teleseismic data showing that a significant number of events went undetected by the Japan Meteorological Agency within the first twenty-four hours after the Mw 9.0 Tohoku-oki, Japan, earthquake. The spatial distribution of the hidden events is not arbitrary. Most of these earthquakes are located close to the trench, while some are located at the outer rise. Furthermore, there is a relatively sharp trench-parallel boundary separating the detected and undetected events. We investigate the cause of these hidden earthquakes using forward modeling. The calculation of raypaths for various source locations and takeoff angles with the "shooting" method suggests that this phenomenon is a consequence of the complexities associated with the subducting slab. Laterally varying velocity structure defocuses the seismic energy from shallow earthquakes located near the trench and makes the observation of P and S arrivals difficult at stations situated on mainland Japan. Full waveform simulations confirm these results. Our forward calculations also show that the probability of detection is sensitive to the depth of the event. Shallower events near the trench are more difficult to detect than deeper earthquakes that are located inside the subducting plate, for which the shadow-zone effect diminishes.
The modeling effort is expanded to include three-dimensional structure in velocity and intrinsic attenuation to evaluate possible laterally varying patterns. Our study suggests that the phenomenon of hidden earthquakes could be present at other regions around the world with active subductions. Considering that many of these subduction zones are not as well monitored as Japan, the number of missed events, especially after large earthquakes, could be significant. The results of this work can help to identify "blind spots" of present seismic networks, and can contribute to improving monitoring activities.

  17. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
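The weighted-SSA idea underlying this abstract — bias the reaction propensities so rare trajectories are sampled more often, then correct each sampled path with its likelihood ratio — can be sketched for a simple birth-death process. This is a generic importance-sampled SSA with a fixed bias factor, not the adaptive ABSIS scheme itself; all parameter values are illustrative:

```python
import math
import random

def rare_event_prob(gamma=1.3, target=15, t_max=5.0, x0=10,
                    k_birth=1.0, k_death=0.1, n_paths=10000, seed=1):
    """Weighted SSA estimate of P(population reaches `target` before
    t_max) in a birth-death process. gamma > 1 inflates the birth
    propensity; each step is corrected by the likelihood ratio
    (a_j / b_j) * exp(-(a0 - b0) * tau), and a censored final
    interval by the survival-probability ratio."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, t, w = x0, 0.0, 1.0
        while x < target:
            a = (k_birth, k_death * x)      # true propensities
            b = (gamma * a[0], a[1])        # biased propensities
            a0, b0 = sum(a), sum(b)
            tau = -math.log(1.0 - rng.random()) / b0
            if t + tau >= t_max:
                w *= math.exp(-(a0 - b0) * (t_max - t))
                break
            j = 0 if rng.random() * b0 < b[0] else 1
            w *= (a[j] / b[j]) * math.exp(-(a0 - b0) * tau)
            t += tau
            x += 1 if j == 0 else -1
        if x >= target:
            total += w
    return total / n_paths
```

With gamma = 1 the weights stay at 1 and the estimator reduces to a plain SSA Monte Carlo estimate; ABSIS goes further by choosing the bias adaptively from look-ahead short paths.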

  18. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  19. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  20. Contribution of Infrasound to IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick

    2016-04-01

    Until 2003 two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Examples of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. Presence of an infrasound detection associated to an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help to improve location of small events with observations on infrasound stations.

  1. Final Technical Report. Project Boeing SGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, Thomas E.

    Boeing and its partner, PJM Interconnection, teamed to bring advanced “defense-grade” technologies for cyber security to the US regional power grid through demonstration in PJM’s energy management environment. Under this cooperative project with the Department of Energy, Boeing and PJM have developed and demonstrated a host of technologies specifically tailored to the needs of PJM and the electric sector as a whole. The team has demonstrated to the energy industry a combination of processes, techniques and technologies that have been successfully implemented in the commercial, defense, and intelligence communities to identify, mitigate and continuously monitor the cyber security of critical systems. Guided by the results of a Cyber Security Risk-Based Assessment completed in Phase I, the Boeing-PJM team has completed multiple iterations through the Phase II Development and Phase III Deployment phases. Multiple cyber security solutions have been completed across a variety of controls including: Application Security, Enhanced Malware Detection, Security Incident and Event Management (SIEM) Optimization, Continuous Vulnerability Monitoring, SCADA Monitoring/Intrusion Detection, Operational Resiliency, Cyber Range simulations and hands on cyber security personnel training. All of the developed and demonstrated solutions are suitable for replication across the electric sector and/or the energy sector as a whole.
Benefits identified include: improved malware and intrusion detection capability on critical SCADA networks, including behavioral-based alerts resulting in improved zero-day threat protection; an improved Security Incident and Event Management system resulting in better threat visibility, thus increasing the likelihood of detecting a serious event; improved malware detection and zero-day threat response capability; improved ability to systematically evaluate and secure in-house and vendor-sourced software applications; improved ability to continuously monitor and maintain secure configuration of network devices, resulting in reduced vulnerabilities for potential exploitation; improved overall cyber security situational awareness through the integration of multiple discrete security technologies into a single cyber security reporting console; improved ability to maintain the resiliency of critical systems in the face of a targeted cyber attack or other significant event; and improved ability to model complex networks for penetration testing and advanced training of cyber security personnel.

  2. The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.

    PubMed

    Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric

    2016-09-01

    The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review.
Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
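A minimal sketch of the kind of automatable check the study recommends — verifying that plan parameters match the prescription — might look like the following. The dictionary fields are hypothetical stand-ins for values a real system would read from DICOM-RT records:

```python
def check_plan_against_prescription(plan, rx):
    """Compare basic plan parameters against the prescription and
    return a list of discrepancies (empty list = check passed).
    Field names are hypothetical; a real tool would read DICOM-RT."""
    issues = []
    if plan["fractions"] != rx["fractions"]:
        issues.append("fraction count mismatch")
    if abs(plan["total_dose_gy"] - rx["total_dose_gy"]) > 1e-6:
        issues.append("total dose mismatch")
    per_fraction = plan["total_dose_gy"] / plan["fractions"]
    if abs(per_fraction - rx["dose_per_fraction_gy"]) > 0.01:
        issues.append("dose per fraction mismatch")
    return issues
```

Returning a list of named discrepancies, rather than a pass/fail flag, mirrors how such automated checks feed a reviewer's worklist.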

  3. Public health situation awareness: toward a semantic approach

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Richesson, Rachel L.; Turley, James P.; Zhang, Jiajie; Smith, Jack W.

    2004-04-01

    We propose a knowledge-based public health situation awareness system. The basis for this system is an explicit representation of public health situation awareness concepts and their interrelationships. This representation is based upon the users' (public health decision makers) cognitive model of the world, and optimized towards the efficacy of performance and relevance to the public health situation awareness processes and tasks. In our approach, explicit domain knowledge is the foundation for interpretation of public health data, as opposed to conventional systems in which statistical methods are the essence of the processes. Objectives: To develop a prototype knowledge-based system for public health situation awareness and to demonstrate the utility of knowledge-intensive approaches in integration of heterogeneous information, eliminating the effects of incomplete and poor-quality surveillance data, uncertainty in syndrome and aberration detection, and visualization of complex information structures in public health surveillance settings, particularly in the context of bioterrorism (BT) preparedness. The system employs the Resource Description Framework (RDF) and additional layers of more expressive languages to explicate the knowledge of domain experts into machine-interpretable and computable problem-solving modules that can then guide users and computer systems in sifting through the most "relevant" data for syndrome and outbreak detection and investigation of the root cause of the event. The Center for Biosecurity and Public Health Informatics Research is developing a prototype knowledge-based system around influenza, which has complex natural disease patterns, many public health implications, and is a potential agent for bioterrorism.
The preliminary data from this effort may demonstrate superior performance in information integration, syndrome and aberration detection, information access through information visualization, and cross-domain investigation of the root causes of public health events.

  4. Embedded security system for multi-modal surveillance in a railway carriage

    NASA Astrophysics Data System (ADS)

    Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry

    2015-10-01

    Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio events detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent events detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events is not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer's theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
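The unsupervised audio branch described above — model normal ambience, then flag frames that deviate from the trained model — can be sketched with a single Gaussian over acoustic features, a simplified stand-in for the paper's clustered GMMs; all thresholds and feature shapes here are illustrative:

```python
import numpy as np

def _log_lik(x, mu, var):
    """Diagonal-Gaussian log-likelihood of feature frames."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var,
                         axis=-1)

def train_ambience_model(frames, k=3.0):
    """Fit a single Gaussian to feature frames of normal ambience and
    derive a log-likelihood alarm threshold from the training data."""
    mu = frames.mean(axis=0)
    var = frames.var(axis=0) + 1e-6
    ll = _log_lik(frames, mu, var)
    return mu, var, ll.mean() - k * ll.std()

def unusual_frames(frames, mu, var, thr):
    """Indices of frames whose likelihood under the 'normal' model
    falls below the trained threshold: candidate abnormal events."""
    return np.flatnonzero(_log_lik(frames, mu, var) < thr)
```

In the paper each ambience cluster gets its own GMM; the deviation-from-trained-model logic is the same, just taken over the best-scoring cluster.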

  5. Developmental changes in distinguishing concurrent auditory objects.

    PubMed

    Alain, Claude; Theunissen, Eef L; Chevalier, Hélène; Batty, Magali; Taylor, Margot J

    2003-04-01

    Children have considerable difficulties in identifying speech in noise. In the present study, we examined age-related differences in central auditory functions that are crucial for parsing co-occurring auditory events using behavioral and event-related brain potential measures. Seventeen pre-adolescent children and 17 adults were presented with complex sounds containing multiple harmonics, one of which could be 'mistuned' so that it was no longer an integer multiple of the fundamental. Both children and adults were more likely to report hearing the mistuned harmonic as a separate sound with an increase in mistuning. However, children were less sensitive in detecting mistuning across all levels as revealed by lower d' scores than adults. The perception of two concurrent auditory events was accompanied by a negative wave that peaked at about 160 ms after sound onset. In both age groups, the negative wave, referred to as the 'object-related negativity' (ORN), increased in amplitude with mistuning. The ORN was larger in children than in adults despite a lower d' score. Together, the behavioral and electrophysiological results suggest that concurrent sound segregation is probably adult-like in pre-adolescent children, but that children are inefficient in processing the information following the detection of mistuning. These findings also suggest that processes involved in distinguishing concurrent auditory objects continue to mature during adolescence.
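The sensitivity index d' reported above is computed from hit and false-alarm rates. A small helper is shown below, with a standard half-trial correction for rates of exactly 0 or 1 (an assumption, since the abstract does not state which correction was used):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    Rates of exactly 0 or 1 are pulled in by half a trial, a common
    correction that keeps the z-transform finite."""
    z = NormalDist().inv_cdf
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    h = min(max(hits / n_signal, 0.5 / n_signal), 1 - 0.5 / n_signal)
    f = min(max(false_alarms / n_noise, 0.5 / n_noise),
            1 - 0.5 / n_noise)
    return z(h) - z(f)
```

Lower d' at every mistuning level, as reported for the children here, means poorer discrimination of mistuned from tuned trials independent of response bias.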

  6. LobeFinder: A Convex Hull-Based Method for Quantitative Boundary Analyses of Lobed Plant Cells

    PubMed Central

    Wu, Tzu-Ching; Belteton, Samuel A.; Szymanski, Daniel B.; Umulis, David M.

    2016-01-01

    Dicot leaves are composed of a heterogeneous mosaic of jigsaw puzzle piece-shaped pavement cells that vary greatly in size and the complexity of their shape. Given the importance of the epidermis and this particular cell type for leaf expansion, there is a strong need to understand how pavement cells morph from a simple polyhedral shape into highly lobed and interdigitated cells. At present, it is still unclear how and when the patterns of lobing are initiated in pavement cells, and one major technological bottleneck to addressing the problem is the lack of a robust and objective methodology to identify and track lobing events during the transition from simple cell geometry to lobed cells. We developed a convex hull-based algorithm termed LobeFinder to identify lobes, quantify geometric properties, and create a useful graphical output of cell coordinates for further analysis. The algorithm was validated against manually curated images of pavement cells of widely varying sizes and shapes. The ability to objectively count and detect new lobe initiation events provides an improved quantitative framework to analyze mutant phenotypes, detect symmetry-breaking events in time-lapse image data, and quantify the time-dependent correlation between cell shape change and intracellular factors that may play a role in the morphogenesis process. PMID:27288363
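    A convex hull-based lobe count of the kind LobeFinder performs can be illustrated on a synthetic lobed boundary; the five-bump polar curve and the run-counting heuristic below are stand-ins for illustration, not the validated algorithm itself:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic lobed "cell" boundary: a polar curve with 5 bumps.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r = 1.0 + 0.3 * np.sin(5 * theta)
boundary = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# Convex hull of the boundary: lobe tips lie on the hull,
# indentations between lobes fall inside it.
hull_vertices = set(ConvexHull(boundary).vertices)
on_hull = np.array([i in hull_vertices for i in range(len(boundary))])

# Count contiguous runs of hull-contact points around the boundary;
# each run corresponds to one lobe (circular transition count).
transitions = np.sum(on_hull & ~np.roll(on_hull, 1))
print(transitions)
```

    Real pavement-cell outlines are noisier than this toy curve, which is why the published method adds geometric criteria beyond simple hull contact.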

  7. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters, including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, the novel algorithms, and results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.
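    The multiple kernel idea, combining a kernel over continuous measurements with a kernel over discrete event sequences before one-class anomaly scoring, can be sketched roughly as follows; the fixed kernel weight, the toy kernels, and the one-class SVM are illustrative assumptions (the paper learns the combination and uses its own algorithms):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)

# Toy "flights": each has a continuous feature vector and a discrete
# event-code sequence (both hypothetical stand-ins).
X_cont = rng.normal(0.0, 1.0, size=(60, 5))
X_disc = rng.integers(0, 4, size=(60, 8))

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian kernel on the continuous measurements.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def match_kernel(A, B):
    # Fraction of positions where the discrete sequences agree.
    return (A[:, None, :] == B[None, :, :]).mean(-1)

# Multiple kernel learning in its simplest form: a convex combination
# of the per-source kernels (here a fixed weight; the paper learns it).
w = 0.6
K = w * rbf_kernel(X_cont, X_cont) + (1 - w) * match_kernel(X_disc, X_disc)

# One-class anomaly scoring on the combined Gram matrix.
ocsvm = OneClassSVM(kernel="precomputed", nu=0.1).fit(K)
scores = ocsvm.decision_function(K)
print(scores.shape)
```

    The key design point is that heterogeneous data sources never need a common feature space: each contributes its own similarity measure, and only the kernels are combined.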

  8. Most HIV Type 1 Non-B Infections in the Spanish Cohort of Antiretroviral Treatment-Naïve HIV-Infected Patients (CoRIS) Are Due to Recombinant Viruses

    PubMed Central

    Yebra, Gonzalo; de Mulder, Miguel; Martín, Leticia; Rodríguez, Carmen; Labarga, Pablo; Viciana, Isabel; Berenguer, Juan; Alemán, María Remedios; Pineda, Juan Antonio; García, Federico

    2012-01-01

    HIV-1 group M is classified into 9 subtypes, as well as recombinants favored by coinfection and superinfection events with different variants. Although HIV-1 subtype B is predominant in Europe, intersubtype recombinants are increasing in prevalence and complexity. In this study, phylogenetic analyses of pol sequences were performed to detect the HIV-1 circulating and unique recombinant forms (CRFs and URFs, respectively) in a Spanish cohort of antiretroviral treatment-naïve HIV-infected patients included in the Research Network on HIV/AIDS (CoRIS). Bootscanning and other methods were used to define complex recombinants not assigned to any subtype or CRF. A total of 670 available HIV-1 pol sequences from different patients were collected, of which 588 (87.8%) were assigned to HIV-1 subtype B and 82 (12.2%) to HIV-1 non-B variants. Recombinants caused the majority (71.9%) of HIV-1 non-B infections and were found in 8.8% of CoRIS patients. Eleven URFs (accounting for 13.4% of HIV-1 non-B infections), presenting complex mosaic patterns, were detected. Among them, 10 harbored subtype B fragments. Four of the 11 URFs were found in Spanish natives. A cluster of three B/CRF02_AG recombinants was detected. We conclude that complex variants, including unique recombinant forms, are being introduced into Spain through both immigrants and natives. An increase in the frequency of mosaic viruses, reflecting the increasing heterogeneity of the HIV epidemic in our country, is expected. PMID:22162552

  9. Most HIV type 1 non-B infections in the Spanish cohort of antiretroviral treatment-naïve HIV-infected patients (CoRIS) are due to recombinant viruses.

    PubMed

    Yebra, Gonzalo; de Mulder, Miguel; Martín, Leticia; Rodríguez, Carmen; Labarga, Pablo; Viciana, Isabel; Berenguer, Juan; Alemán, María Remedios; Pineda, Juan Antonio; García, Federico; Holguín, Africa

    2012-02-01

    HIV-1 group M is classified into 9 subtypes, as well as recombinants favored by coinfection and superinfection events with different variants. Although HIV-1 subtype B is predominant in Europe, intersubtype recombinants are increasing in prevalence and complexity. In this study, phylogenetic analyses of pol sequences were performed to detect the HIV-1 circulating and unique recombinant forms (CRFs and URFs, respectively) in a Spanish cohort of antiretroviral treatment-naïve HIV-infected patients included in the Research Network on HIV/AIDS (CoRIS). Bootscanning and other methods were used to define complex recombinants not assigned to any subtype or CRF. A total of 670 available HIV-1 pol sequences from different patients were collected, of which 588 (87.8%) were assigned to HIV-1 subtype B and 82 (12.2%) to HIV-1 non-B variants. Recombinants caused the majority (71.9%) of HIV-1 non-B infections and were found in 8.8% of CoRIS patients. Eleven URFs (accounting for 13.4% of HIV-1 non-B infections), presenting complex mosaic patterns, were detected. Among them, 10 harbored subtype B fragments. Four of the 11 URFs were found in Spanish natives. A cluster of three B/CRF02_AG recombinants was detected. We conclude that complex variants, including unique recombinant forms, are being introduced into Spain through both immigrants and natives. An increase in the frequency of mosaic viruses, reflecting the increasing heterogeneity of the HIV epidemic in our country, is expected.

  10. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling

    PubMed Central

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle: to solve the puzzle and understand the event, we must identify all of its sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time, and often miss sub-events, or pieces, in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. To identify sub-events more comprehensively and accurately, the framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event content contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle. PMID:29107976
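    A burst detection step of the kind applied to each divided stream can be sketched with a simple trailing-window heuristic; the window length, the threshold rule, and the example counts below are illustrative assumptions, not the STRIM framework's actual algorithm:

```python
from collections import deque

def detect_bursts(counts, window=5, k=3.0):
    """Flag time bins whose count exceeds mean + k*std of a trailing window."""
    history = deque(maxlen=window)
    bursts = []
    for t, c in enumerate(counts):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            # The +1 guards against a near-zero baseline std.
            if c > mean + k * var ** 0.5 + 1:
                bursts.append(t)
        history.append(c)
    return bursts

# Per-minute message counts for one divided stream; the spike at index 7
# would be reported as a candidate sub-event.
print(detect_bursts([4, 5, 4, 6, 5, 5, 4, 40, 5, 4]))  # → [7]
```

    In a real pipeline the flagged bins would then be passed to the content-feature extraction and recombination stages described above.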

  11. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    PubMed

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle: to solve the puzzle and understand the event, we must identify all of its sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time, and often miss sub-events, or pieces, in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. To identify sub-events more comprehensively and accurately, the framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event content contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle.

  12. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals Vernon...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed...As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  13. Improving Focal Depth Estimates: Studies of Depth Phase Detection at Regional Distances

    NASA Astrophysics Data System (ADS)

    Stroujkova, A.; Reiter, D. T.; Shumway, R. H.

    2006-12-01

    The accurate estimation of the depth of small, regionally recorded events continues to be an important and difficult explosion-monitoring research problem. Depth phases (free-surface reflections) are the primary tool that seismologists use to constrain the depth of a seismic event. When depth phases from an event are detected, an accurate source depth is easily found by using the delay times of the depth phases relative to the P wave and a velocity profile near the source. Cepstral techniques, including cepstral F-statistics, represent a class of methods designed for depth-phase detection and identification; however, they offer only a moderate level of success at epicentral distances less than 15°. This is due to complexities in the Pn coda, which can lead to numerous false detections in addition to the true phase detection. Therefore, cepstral methods cannot be used independently to reliably identify depth phases. Other evidence, such as apparent velocities, amplitudes, and frequency content, must be used to confirm whether the phase is truly a depth phase. In this study we used a variety of array methods to estimate apparent phase velocities and arrival azimuths, including beam-forming, semblance analysis, MUltiple SIgnal Classification (MUSIC) (e.g., Schmidt, 1979), and cross-correlation (e.g., Cansi, 1995; Tibuleac and Herrin, 1997). To facilitate the processing and comparison of results, we developed a MATLAB-based processing tool, which allows application of all of these techniques (i.e., augmented cepstral processing) in a single environment. The main objective of this research was to combine the results of three focal-depth estimation techniques and their associated standard errors into a statistically valid unified depth estimate. The three techniques include: 1. Direct focal depth estimate from the depth-phase arrival times picked via augmented cepstral processing. 2. Hypocenter location from direct and surface-reflected arrivals observed on sparse networks of regional stations using a Grid-search, Multiple-Event Location method (GMEL; Rodi and Toksöz, 2000; 2001). 3. Surface-wave dispersion inversion for event depth and focal mechanism (Herrmann and Ammon, 2002). To validate our approach and provide quality control for our solutions, we applied the techniques to moderate-sized events (mb between 4.5 and 6.0) with known focal mechanisms. We illustrate the techniques using events observed at regional distances from the KSAR (Wonju, South Korea) teleseismic array and other nearby broadband three-component stations. Our results indicate that the techniques can produce excellent agreement between the various depth estimates. In addition, combining the techniques into a "unified" estimate greatly reduced location errors and improved the robustness of the solution, even if results from the individual methods yielded large standard errors.
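    The core cepstral idea behind depth-phase detection, that a delayed surface reflection shows up as a peak in the cepstrum at its delay time, can be sketched on a toy trace; the synthetic wavelet, sampling rate, and peak-search bounds below are illustrative assumptions, not the study's actual processing:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 100.0                       # samples per second
t = np.arange(0, 10, 1 / fs)

# Toy seismogram: a decaying wavelet (P) plus a delayed, scaled copy
# standing in for the depth phase (pP) 1.5 s later, in light noise.
wavelet = np.exp(-3 * t) * np.sin(2 * np.pi * 4 * t)
delay = int(1.5 * fs)
trace = wavelet + 0.6 * np.roll(wavelet, delay) + 0.001 * rng.normal(size=t.size)

# Real cepstrum: an echo appears as a peak at its delay (quefrency).
spectrum = np.abs(np.fft.rfft(trace))
cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))

# Search for the echo peak away from the low-quefrency region, which
# is dominated by the smooth source/wavelet spectrum.
q = np.argmax(cepstrum[50:trace.size // 2]) + 50
print(q / fs)   # estimated pP - P delay in seconds
```

    On real Pn coda this simple picture breaks down, which is why the abstract stresses that cepstral peaks need corroboration from apparent velocities, amplitudes, and frequency content.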

  14. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
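    The flavor of CERA's declarative patterns, combining point events with conjunction, disjunction, and negation, might be sketched with simple predicate combinators; these function names and the example pattern are hypothetical illustrations, not CERA's actual language:

```python
# Patterns are predicates over a collection of timestamped point events,
# composable with conjunction, disjunction, and negation.

def point(name):
    # Matches if an event with this name occurs in the input window.
    return lambda events: any(n == name for n, _ in events)

def both(p, q):          # conjunction
    return lambda events: p(events) and q(events)

def either(p, q):        # disjunction
    return lambda events: p(events) or q(events)

def absent(p):           # negation
    return lambda events: not p(events)

# "Pressure drop together with a valve command, but no shutdown signal"
# -- the kind of anomalous condition a monitor might alert on.
anomaly = both(both(point("pressure_drop"), point("valve_open")),
               absent(point("shutdown")))

window = [("valve_open", 3.1), ("pressure_drop", 3.4)]
print(anomaly(window))   # True for this input window
```

    Because patterns are just composable values, richer patterns can be built recursively from simpler ones, which mirrors the recursive pattern-building the abstract describes.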

  15. Cognitive Complexity of the Medical Record Is a Risk Factor for Major Adverse Events

    PubMed Central

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Context: Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood because “patient complexity” has been difficult to quantify. Objective: We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. Design: The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. Main Outcome Measures: The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Results: Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. Conclusions: CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time. PMID:24626065

  16. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
    The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: (1) it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation); (2) it has a formalized framework that uses information from non-detecting stations; (3) it has a formalized framework that uses source information, in particular the spectral characteristics of events of interest; (4) it is entirely model-based, i.e., it does not rely on a priori information, which is particularly important for nuclear monitoring; and (5) it does not rely on individualized signal detection thresholds: it is the network solution that matters.
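    The Bayesian combination step, fusing per-station conditional probabilities (including contributions from non-detecting stations) into a network-level event decision, can be sketched as follows; the prior, the likelihood-ratio values, and the independence assumption across stations are illustrative, not ProbDet's actual formulation:

```python
import math

def network_posterior(likelihood_ratios, prior=1e-3):
    """Posterior probability of an event given per-station likelihood
    ratios P(data_i | event) / P(data_i | noise), assuming the stations
    are conditionally independent."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    return 1 / (1 + math.exp(-log_odds))

# Three stations see something event-like, two see nothing unusual;
# the non-detecting stations still contribute, since ratios below 1
# pull the network odds down rather than being ignored.
lrs = [50.0, 30.0, 12.0, 0.6, 0.8]
print(round(network_posterior(lrs), 4))
```

    Working in log-odds makes the network combination a simple sum, so adding or removing a station never requires re-deriving the rest of the computation.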

  17. Optimizing collection of adverse event data in cancer clinical trials supporting supplemental indications.

    PubMed

    Kaiser, Lee D; Melemed, Allen S; Preston, Alaknanda J; Chaudri Ross, Hilary A; Niedzwiecki, Donna; Fyfe, Gwendolyn A; Gough, Jacqueline M; Bushnell, William D; Stephens, Cynthia L; Mace, M Kelsey; Abrams, Jeffrey S; Schilsky, Richard L

    2010-12-01

    Although much is known about the safety of an anticancer agent at the time of initial marketing approval, sponsors customarily collect comprehensive safety data for studies that support supplemental indications. This adds significant cost and complexity to the study but may not provide useful new information. The main purpose of this analysis was to assess the amount of safety and concomitant medication data collected to determine a more optimal approach in the collection of these data when used in support of supplemental applications. Following a prospectively developed statistical analysis plan, we reanalyzed safety data from eight previously completed prospective randomized trials. A total of 107,884 adverse events and 136,608 concomitant medication records were reviewed for the analysis. Of these, four grade 1 to 2 and nine grade 3 and higher events were identified as drug effects that were not included in the previously established safety profiles and could potentially have been missed using subsampling. These events were frequently detected in subsamples of 400 patients or larger. Furthermore, none of the concomitant medication records contributed to labeling changes for the supplemental indications. Our study found that applying the optimized methodologic approach, described herein, has a high probability of detecting new drug safety signals. Focusing data collection on signals that cause physicians to modify or discontinue treatment ensures that safety issues of the highest concern for patients and regulators are captured and has significant potential to relieve strain on the clinical trials system.

  18. Molecular characterization of Plum pox virus Rec isolates from Russia suggests a new insight into evolution of the strain.

    PubMed

    Chirkov, Sergei; Ivanov, Peter; Sheveleva, Anna; Kudryavtseva, Anna; Mitrofanova, Irina

    2018-04-01

    Field isolates of Plum pox virus (PPV) belonging to the strain Rec have been found for the first time in Russia. Full-size genomes of the isolates K28 and Kisl-1pl, from myrobalan and plum, respectively, were sequenced on the 454 platform. Analysis of all known PPV-Rec complete genomes using the Recombination Detection Program (RDP4) revealed yet another recombination event in the 5'-terminal region. This event was detected by seven algorithms implemented in the RDP4, with statistically significant P values, and supported by a phylogenetic analysis with a bootstrap value of 87%. A putative PPV-M-derived segment, encompassing the C-terminus of the P1 gene and approximately two-thirds of the HcPro gene, is bordered by breakpoints at positions 760-940 and 1838-1964, depending on the recombinant isolate. The predicted 5'-distal breakpoint for the isolate Valjevka is located at position 2804. The Dideron (strain D) and SK68 (strain M) isolates were inferred as the major and minor parents, respectively. The finding of another recombination event suggests a more complex evolutionary history of PPV-Rec than previously assumed. Perhaps the first recombination event led to the formation of a PPV-D variant harboring the PPV-M-derived fragment within the 5'-proximal part of the genome. Subsequent recombination of its descendant with PPV-M in the 3'-proximal genomic region resulted in the emergence of the evolutionarily successful strain Rec.

  19. Results from the MACHO Galactic Pixel Lensing Search

    NASA Astrophysics Data System (ADS)

    Drake, Andrew J.; Minniti, Dante; Alcock, Charles; Allsman, Robyn A.; Alves, David; Axelrod, Tim S.; Becker, Andrew C.; Bennett, David; Cook, Kem H.; Freeman, Ken C.; Griest, Kim; Lehner, Matt; Marshall, Stuart; Peterson, Bruce; Pratt, Mark; Quinn, Peter; Rodgers, Alex; Stubbs, Chris; Sutherland, Will; Tomaney, Austin; Vandehei, Thor; Welch, Doug L.

    The MACHO, EROS, OGLE, and AGAPE collaborations have been studying the nature of the Galactic halo for a number of years using microlensing events. The MACHO group undertakes observations of the LMC, SMC, and Galactic Bulge, monitoring the light curves of millions of stars to detect microlensing. Most of these fields are crowded to the extent that all the monitored stars are blended. Such crowding makes accurate photometry difficult. We apply the new technique of Difference Image Analysis (DIA) to archival data to improve the photometry and increase both the detection sensitivity and the effective search area. The application of this technique also allows us to detect so-called `pixel lensing' events. These are microlensing events where the source star is only detectable during lensing. Detecting such events will greatly increase the number of detected microlensing events. We present a light curve demonstrating the detection of a pixel lensing event with this technique.

  20. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics, such as the material failure of a component, in a manner that significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detecting the occurrence of the triggering event, which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.

  1. An Assistive Technology System that Provides Personalized Dressing Support for People Living with Dementia: Capability Study.

    PubMed

    Burleson, Winslow; Lozano, Cecil; Ravishankar, Vijay; Lee, Jisoo; Mahoney, Diane

    2018-05-01

    Individuals living with advancing stages of dementia (persons with dementia, PWDs) or other cognitive disorders do not have the luxury of remembering how to perform basic day-to-day activities, which in turn makes them increasingly dependent on the assistance of caregivers. Dressing is one of the most common and stressful activities provided by caregivers because of its complexity and the privacy challenges posed during the process. In preparation for in-home trials with PWDs, the aim of this study was to develop and evaluate a prototype intelligent system, the DRESS prototype, to assess its ability to provide automated assistance with dressing that can afford independence and privacy to individual PWDs and potentially provide additional freedom to their caregivers (family members and professionals). This laboratory study evaluated the DRESS prototype's capacity to detect dressing events. These events were engaged in by 11 healthy participants simulating common correct and incorrect dressing scenarios. The events ranged from donning a shirt and pants inside out or backwards to partial dressing, typical issues that challenge a PWD and their caregivers. A set of expected detections for correct dressing was prepared via video analysis of all participants' dressing behaviors. In the initial phases of donning either shirts or pants, the DRESS prototype missed only 4 out of 388 expected detections. The prototype's ability to recognize other missing detections varied across conditions. There were also some unexpected detections, such as detection of the inside of a shirt as it was being put on. Throughout the study, detection of dressing events was adversely affected by the relatively smaller effective size of the markers at greater distances. Although the DRESS prototype incorrectly identified 10 of 22 cases for shirts, the prototype performed significantly better for pants, incorrectly identifying only 5 of 22 cases. Further analyses identified opportunities to improve the DRESS prototype's reliability, including increasing the size of the markers, minimizing garment folding or occlusions, and optimally positioning participants with respect to the DRESS prototype. This study demonstrates the ability to detect clothing orientation and position and to infer the current state of dressing using a combination of sensors, intelligent software, and barcode tracking. With the improvements identified by this study, the DRESS prototype has the potential to provide a viable option for automated dressing support that assists PWDs in maintaining their independence and privacy, while potentially providing their caregivers with much-needed respite. ©Winslow Burleson, Cecil Lozano, Vijay Ravishankar, Jisoo Lee, Diane Mahoney. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 01.05.2018.

  2. An Assistive Technology System that Provides Personalized Dressing Support for People Living with Dementia: Capability Study

    PubMed Central

    Lozano, Cecil; Ravishankar, Vijay; Lee, Jisoo; Mahoney, Diane

    2018-01-01

    Background Individuals living with advancing stages of dementia (persons with dementia, PWDs) or other cognitive disorders do not have the luxury of remembering how to perform basic day-to-day activities, which in turn makes them increasingly dependent on the assistance of caregivers. Dressing is one of the most common and stressful activities provided by caregivers because of its complexity and privacy challenges posed during the process. Objective In preparation for in-home trials with PWDs, the aim of this study was to develop and evaluate a prototype intelligent system, the DRESS prototype, to assess its ability to provide automated assistance with dressing that can afford independence and privacy to individual PWDs and potentially provide additional freedom to their caregivers (family members and professionals). Methods This laboratory study evaluated the DRESS prototype’s capacity to detect dressing events. These events were engaged in by 11 healthy participants simulating common correct and incorrect dressing scenarios. The events ranged from donning a shirt and pants inside out or backwards to partial dressing—typical issues that challenge a PWD and their caregivers. Results A set of expected detections for correct dressing was prepared via video analysis of all participants’ dressing behaviors. In the initial phases of donning either shirts or pants, the DRESS prototype missed only 4 out of 388 expected detections. The prototype’s ability to recognize other missing detections varied across conditions. There were also some unexpected detections such as detection of the inside of a shirt as it was being put on. Throughout the study, detection of dressing events was adversely affected by the relatively smaller effective size of the markers at greater distances. Although the DRESS prototype incorrectly identified 10 of 22 cases for shirts, the prototype preformed significantly better for pants, incorrectly identifying only 5 of 22 cases. 
Further analyses identified opportunities to improve the DRESS prototype’s reliability, including increasing the size of markers, minimizing garment folding or occlusions, and optimizing the positioning of participants with respect to the DRESS prototype. Conclusions This study demonstrates the ability to detect clothing orientation and position and to infer the current state of dressing using a combination of sensors, intelligent software, and barcode tracking. With the improvements identified by this study, the DRESS prototype has the potential to become a viable option for automated dressing support, helping PWDs maintain their independence and privacy while potentially providing their caregivers with much-needed respite. PMID:29716885

  3. The Complex Admixture History and Recent Southern Origins of Siberian Populations

    PubMed Central

    Pugach, Irina; Matveev, Rostislav; Spitsyn, Viktor; Makarov, Sergey; Novgorodov, Innokentiy; Osakovsky, Vladimir; Stoneking, Mark; Pakendorf, Brigitte

    2016-01-01

    Although Siberia was inhabited by modern humans at an early stage, there is still debate over whether it remained habitable during the extreme cold of the Last Glacial Maximum or whether it was subsequently repopulated by peoples with recent shared ancestry. Previous studies of the genetic history of Siberian populations were hampered by the extensive admixture that appears to have taken place among these populations, because commonly used methods assume a tree-like population history and at most single admixture events. Here we analyze geogenetic maps and use other approaches to distinguish the effects of shared ancestry from prehistoric migrations and contact, and develop a new method based on the covariance of ancestry components, to investigate the potentially complex admixture history. We furthermore adapt a previously devised method of admixture dating for use with multiple events of gene flow, and apply these methods to whole-genome genotype data from over 500 individuals belonging to 20 different Siberian ethnolinguistic groups. The results of these analyses indicate that there have been multiple layers of admixture detectable in most of the Siberian populations, with considerable differences in the admixture histories of individual populations. Furthermore, most of the populations of Siberia included here, even those settled far to the north, appear to have a southern origin, with the northward expansions of different populations possibly being driven partly by the advent of pastoralism, especially reindeer domestication. These newly developed methods to analyze multiple admixture events should aid in the investigation of similarly complex population histories elsewhere. PMID:26993256
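The covariance-of-ancestry-components idea can be illustrated with a small sketch. This is not the authors' method (which addresses many populations, multiple admixture layers, and dating); it only shows, on a toy ancestry-proportion (Q) matrix, how two components that replace each other across individuals produce a negative covariance. All values and variable names below are hypothetical.

```python
# Sketch: covariance of ancestry components across individuals.
# Rows of Q are individuals, columns are ancestry components
# (proportions summing to 1). The numbers are illustrative only.

def ancestry_covariance(q):
    """Return the K x K covariance matrix of ancestry components."""
    n = len(q)
    k = len(q[0])
    means = [sum(row[j] for row in q) / n for j in range(k)]
    cov = [[0.0] * k for _ in range(k)]
    for row in q:
        for i in range(k):
            for j in range(k):
                cov[i][j] += (row[i] - means[i]) * (row[j] - means[j])
    return [[c / (n - 1) for c in r] for r in cov]

# Toy Q matrix: 4 individuals, 3 components. Components 1 and 2 trade
# off against each other; component 3 is constant.
Q = [
    [0.70, 0.20, 0.10],
    [0.50, 0.40, 0.10],
    [0.30, 0.60, 0.10],
    [0.10, 0.80, 0.10],
]
cov = ancestry_covariance(Q)
# cov[0][1] is negative (the two components substitute for each other);
# cov[0][2] is ~0 (component 3 does not vary).
```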

  4. Bimolecular fluorescence complementation: lighting up seven transmembrane domain receptor signalling networks

    PubMed Central

    Rose, Rachel H; Briddon, Stephen J; Holliday, Nicholas D

    2010-01-01

    There is increasing complexity in the organization of seven transmembrane domain (7TM) receptor signalling pathways, and in the ability of their ligands to modulate and direct this signalling. Underlying these events is a network of protein interactions between the 7TM receptors themselves and associated effectors, such as G proteins and β-arrestins. Bimolecular fluorescence complementation, or BiFC, is a technique capable of detecting these protein–protein events essential for 7TM receptor function. Fluorescent proteins, such as those from Aequorea victoria, are split into two non-fluorescent halves, which then tag the proteins under study. On association, these fragments refold and regenerate a mature fluorescent protein, producing a BiFC signal indicative of complex formation. Here, we review the experimental criteria for successful application of BiFC, considered in the context of 7TM receptor signalling events such as receptor dimerization, G protein and β-arrestin signalling. The advantages and limitations of BiFC imaging are compared with alternative resonance energy transfer techniques. We show that the essential simplicity of the fluorescent BiFC measurement allows high-content and advanced imaging applications, and that it can probe more complex multi-protein interactions alone or in combination with resonance energy transfer. These capabilities suggest that BiFC techniques will become ever more useful in the analysis of ligand and 7TM receptor pharmacology at the molecular level of protein–protein interactions. This article is part of a themed section on Imaging in Pharmacology. To view the editorial for this themed section visit http://dx.doi.org/10.1111/j.1476-5381.2010.00685.x PMID:20015298

  5. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim, E-mail: sjchung@kasi.re.kr, E-mail: leecu@kasi.re.kr, E-mail: koojr@kasi.re.kr

    2014-04-20

Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars, because KMTNet saturates easily around the peak of such events owing to its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of planet mass, will be detected in the near future.
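The fractional deviation the abstract refers to can be computed against the standard single-lens (Paczynski) magnification, A(u) = (u² + 2) / (u √(u² + 4)). The sketch below only illustrates that definition, not the KMTNet detection-efficiency analysis; the 2% threshold comes from the abstract, while the example magnifications and separation are made up.

```python
import math

def paczynski_magnification(u):
    """Single-lens magnification at source-lens separation u (in Einstein radii)."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def fractional_deviation(a_obs, u):
    """Fractional deviation of an observed magnification from the single-lens value."""
    a0 = paczynski_magnification(u)
    return (a_obs - a0) / a0

def is_ewcp(a_obs, u, threshold=0.02):
    """True if the central perturbation is 'extremely weak' (within the threshold)."""
    return abs(fractional_deviation(a_obs, u)) <= threshold

# Near the peak of a high-magnification event (u = 0.01) the single-lens
# magnification is about 100; an observed value of 101 is a ~1% deviation,
# inside the EWCP regime, while 110 (a ~10% deviation) is not.
```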

  6. Teleconnection Locator: TeleLoc

    NASA Astrophysics Data System (ADS)

    Bowen, M. K.; Duffy, D.

    2016-12-01

Extreme climate events, such as tropical storms, droughts, and floods, have an enormous impact on all aspects of society. Being able to detect the causes of such events on a global scale is paramount to being able to predict when and where these events will occur. These teleconnections, where a small change in a closed, complex system creates drastic disturbances elsewhere in the system, are generally represented by an index, one of the most famous being the El Nino Southern Oscillation (ENSO). However, due to the enormous scale and complexity of climate data and the technical challenges it poses, it is hypothesized that many of these teleconnections have yet to be discovered. TeleLoc (Teleconnection Locator) is a machine-learning framework combining a number of techniques for finding correlations between weather trends and extreme climate events. The current focus is on connecting global trends with tropical cyclones. A combination of two data sets, the International Best Track Archive for Climate Stewardship (IBTrACS) and the Modern-Era Retrospective analysis for Research and Applications (MERRA2), is being utilized. PostGIS is used for raw data storage, and a Python API has been developed as the core of the framework. Cyclones are first clustered using a combination of Symbolic Aggregate ApproXimation (which allows for a symbolic, sequential representation of the various time-series variables of interest) and DBSCAN. This serves to break the events into subcategories, which alleviates the computational load for the next step. Events which are clustered together (those with similar characteristics) are then compared, using Association Rule Mining, against global climate variables of interest in the period leading up to the event; these variables are also converted to a symbolic form.
Results will be shown where cyclones have been clustered, specifically in the West Pacific storm basin, as well as the global variable symbolic subsections with a high support that have been singled out for analysis.
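The Symbolic Aggregate ApproXimation step of a pipeline like this can be sketched compactly. The code below is a minimal, illustrative SAX (z-normalize, piecewise aggregate approximation, then Gaussian-breakpoint letters), not the TeleLoc implementation; segment counts and alphabet size are assumptions.

```python
import math

# Minimal SAX sketch: z-normalize a series, average it into equal-width
# segments (PAA), and map each segment mean to a letter using
# standard-normal breakpoints.

BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}  # N(0,1) quantiles

def sax(series, n_segments, alphabet_size=3):
    mean = sum(series) / len(series)
    var = sum((x - mean) ** 2 for x in series) / len(series)
    std = math.sqrt(var) or 1.0          # guard against a constant series
    z = [(x - mean) / std for x in series]
    seg_len = len(z) / n_segments
    paa = []
    for i in range(n_segments):
        chunk = z[int(i * seg_len): int((i + 1) * seg_len)]
        paa.append(sum(chunk) / len(chunk))
    cuts = BREAKPOINTS[alphabet_size]
    return "".join(chr(ord("a") + sum(v > c for c in cuts)) for v in paa)

# A steadily rising series climbs through the alphabet: "abc".
word = sax([1, 2, 3, 4, 5, 6, 7, 8, 9], 3)
```

In a pipeline like the one described, these words would then feed a clustering step such as DBSCAN and, later, association rule mining over the symbolized climate variables.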

  7. The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopan, Olga; Zeng, Jing; Novak, Avrey

Purpose: The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. Methods: This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but “potentially detectable” by the physics review, and (3) events “not detectable” by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Results: Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review.
Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Conclusions: Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
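The kind of automation the study suggests, such as the plan-versus-prescription check, can be sketched as a simple field-by-field comparison. This is a hypothetical illustration only: the field names and values are invented, and a clinical system would pull both records from the treatment planning and record-and-verify databases rather than from literal dictionaries.

```python
# Illustrative sketch of one automatable pretreatment check: verifying
# that plan parameters match the prescription. All field names and
# values are hypothetical.

def check_plan_against_prescription(plan, prescription):
    """Return a list of (field, plan_value, prescribed_value) mismatches."""
    mismatches = []
    for field, prescribed in prescription.items():
        if plan.get(field) != prescribed:
            mismatches.append((field, plan.get(field), prescribed))
    return mismatches

prescription = {"dose_per_fraction_cGy": 200, "fractions": 30, "technique": "VMAT"}
plan = {"dose_per_fraction_cGy": 200, "fractions": 28, "technique": "VMAT"}

errors = check_plan_against_prescription(plan, prescription)
# One mismatch: the plan specifies 28 fractions against a prescribed 30.
```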

  8. Fast low-level light pulses from the night sky observed with the SKYFLASH program

    NASA Astrophysics Data System (ADS)

    Winckler, J. R.; Franz, R. C.; Nemzek, R. J.

    1993-05-01

This paper presents further discussion of and new data on fast subvisual increases in the luminosity of the night sky described in our previous papers. A detailed technical description of the simple telescopic photometers used in the project SKYFLASH and their mode of operation including the detection of polarized Rayleigh-scattered flashes is provided. Distant lightning storms account for many of the events, and the complex relations between short and long luminous pulses with and without sferics are shown by examples from a new computerized data system, supplemented by two low-light-level TV cameras. Of particular interest are the previously observed 'long' events having a slow rise and fall, 20-ms duration, and showing small polarization and no coincident sferic. A group of such events on September 22-23, during the invasion of U.S. coasts by Hurricane Hugo, is discussed in detail. The recently observed 'plume' cloud-top-to-stratosphere lightning event is suggested as a possible source type for these flashes. An alternative source may be exploding meteors, recently identified during SKYFLASH observations by low-light-level television techniques as the origin of some sky-wide flash events described herein.

  9. Fast low-level light pulses from the night sky observed with the SKYFLASH program

    NASA Technical Reports Server (NTRS)

    Winckler, J. R.; Franz, R. C.; Nemzek, R. J.

    1993-01-01

This paper presents further discussion of and new data on fast subvisual increases in the luminosity of the night sky described in our previous papers. A detailed technical description of the simple telescopic photometers used in the project SKYFLASH and their mode of operation including the detection of polarized Rayleigh-scattered flashes is provided. Distant lightning storms account for many of the events, and the complex relations between short and long luminous pulses with and without sferics are shown by examples from a new computerized data system, supplemented by two low-light-level TV cameras. Of particular interest are the previously observed 'long' events having a slow rise and fall, 20-ms duration, and showing small polarization and no coincident sferic. A group of such events on September 22-23, during the invasion of U.S. coasts by Hurricane Hugo, is discussed in detail. The recently observed 'plume' cloud-top-to-stratosphere lightning event is suggested as a possible source type for these flashes. An alternative source may be exploding meteors, recently identified during SKYFLASH observations by low-light-level television techniques as the origin of some sky-wide flash events described herein.

  10. Supervised machine learning on a network scale: application to seismic event classification and detection

    NASA Astrophysics Data System (ADS)

    Reynen, Andrew; Audet, Pascal

    2017-09-01

A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as the attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States, is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with a low false-alarm rate, allowing for a larger automatic event catalogue with a high degree of trust.
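The blast-versus-earthquake step can be caricatured with a tiny classifier. The sketch below trains a one-feature logistic regression on a synthetic "dominant frequency" attribute; it is not the paper's regression (which uses polarization and frequency-content attributes over a real network), and all features, labels, and hyperparameters are invented for illustration.

```python
import math
import random

# Sketch: classify events (0 = blast, 1 = earthquake) from one synthetic
# frequency-content attribute with logistic regression trained by SGD.

def sigmoid(z):
    z = max(-60.0, min(60.0, z))       # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

def predict(w, b, x):
    return sigmoid(w * x + b) >= 0.5

# Synthetic attribute: blasts concentrated at higher dominant frequency.
random.seed(0)
blasts = [8.0 + random.random() for _ in range(20)]   # label 0
quakes = [2.0 + random.random() for _ in range(20)]   # label 1
xs, ys = blasts + quakes, [0] * 20 + [1] * 20
w, b = train_logistic(xs, ys)
accuracy = sum(predict(w, b, x) == y for x, y in zip(xs, ys)) / len(xs)
# The two synthetic classes are separable, so training accuracy is high.
```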

  11. Video Traffic Analysis for Abnormal Event Detection

    DOT National Transportation Integrated Search

    2010-01-01

    We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...

  12. Video traffic analysis for abnormal event detection.

    DOT National Transportation Integrated Search

    2010-01-01

We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deplo...

  13. JPL Big Data Technologies for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Dayton L.; D'Addario, L. R.; De Jong, E. M.; Mattmann, C. A.; Rebbapragada, U. D.; Thompson, D. R.; Wagstaff, K.

    2014-04-01

    During the past three years the Jet Propulsion Laboratory has been working on several technologies to deal with big data challenges facing next-generation radio arrays, among other applications. This program has focused on the following four areas: 1) We are investigating high-level ASIC architectures that reduce power consumption for cross-correlation of data from large interferometer arrays by one to two orders of magnitude. The cost of operations for the Square Kilometre Array (SKA), which may be dominated by the cost of power for data processing, is a serious concern. A large improvement in correlator power efficiency could have a major positive impact. 2) Data-adaptive algorithms (machine learning) for real-time detection and classification of fast transient signals in high volume data streams are being developed and demonstrated. Studies of the dynamic universe, particularly searches for fast (<< 1 second) transient events, require that data be analyzed rapidly and with robust RFI rejection. JPL, in collaboration with the International Center for Radio Astronomy Research in Australia, has developed a fast transient search system for eventual deployment on ASKAP. In addition, a real-time transient detection experiment is now running continuously and commensally on NRAO's Very Long Baseline Array. 3) Scalable frameworks for data archiving, mining, and distribution are being applied to radio astronomy. A set of powerful open-source Object Oriented Data Technology (OODT) tools is now available through Apache. OODT was developed at JPL for Earth science data archives, but it is proving to be useful for radio astronomy, planetary science, health care, Earth climate, and other large-scale archives. 4) We are creating automated, event-driven data visualization tools that can be used to extract information from a wide range of complex data sets. 
Visualization of complex data can be improved through algorithms that detect events or features of interest and autonomously generate images or video to display those features. This work has been carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  14. An Intelligent Man-Machine Interface—Multi-Robot Control Adapted for Task Engagement Based on Single-Trial Detectability of P300

    PubMed Central

    Kirchner, Elsa A.; Kim, Su K.; Tabie, Marc; Wöhrle, Hendrik; Maurus, Michael; Kirchner, Frank

    2016-01-01

Advanced man-machine interfaces (MMIs) are being developed for teleoperating robots in remote, difficult-to-access places. Such MMIs make use of a virtual environment and can therefore let the operator immerse him- or herself in the environment of the robot. In this paper, we present our developed MMI for multi-robot control. Our MMI can adapt to changes in task load and task engagement online. By applying our approach of embedded Brain Reading, we improve user support and efficiency of interaction. The level of task engagement was inferred from the single-trial detectability of P300-related brain activity that was naturally evoked during interaction. With our approach no secondary task is needed to measure task load. It is based on research results on the single-stimulus paradigm, the distribution of brain resources and its effect on the P300 event-related component. It further considers effects of the modulation caused by a delayed reaction time on the P300 component evoked by complex responses to task-relevant messages. We validate our concept using single-trial based machine learning analysis, analysis of averaged event-related potentials, and behavioral analysis. As main results we show (1) a significant improvement in the runtime needed to perform the interaction tasks compared to a setting in which all subjects could easily perform the tasks. We show that (2) the single-trial detectability of the event-related potential P300 can be used to measure the changes in task load and task engagement during complex interaction, while also being sensitive to the level of experience of the operator, and (3) can be used to adapt the MMI individually to the different needs of users without increasing total workload. Our online adaptation of the proposed MMI is based on a continuous supervision of the operator's cognitive resources by means of embedded Brain Reading.
Operators with different qualifications or capabilities receive only as many tasks as they can perform to avoid mental overload as well as mental underload. PMID:27445742

  15. Waves associated to COMPLEX EVENTS observed by STEREO

    NASA Astrophysics Data System (ADS)

    Siu Tapia, A. L.; Blanco-Cano, X.; Kajdic, P.; Aguilar-Rodriguez, E.; Russell, C. T.; Jian, L. K.; Luhmann, J. G.

    2012-12-01

Complex Events are formed by two or more large-scale solar wind structures which interact in space. Typical cases are interactions of: (i) a Magnetic Cloud/Interplanetary Coronal Mass Ejection (MC/ICME) with another MC/ICME transient; and (ii) an ICME followed by a Stream Interaction Region (SIR). Complex Events are of importance for space weather studies, and studying them can enhance our understanding of collisionless plasma physics. Some of these structures can produce or enhance southward magnetic fields, a key factor in geomagnetic storm generation. Using data from the STEREO mission during the years 2006-2011, we found 17 Complex Events preceded by a shock wave. We use magnetic field and plasma data to study the micro-scale structure of the shocks, and the waves associated with these shocks and within Complex Event structures. To determine wave characteristics we compute power spectra and perform Minimum Variance Analysis. We also use PLASTIC WAP proton data to study foreshock extensions and the relationship between Complex Events and particle acceleration to suprathermal energies.
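Minimum Variance Analysis reduces to an eigendecomposition of the covariance matrix of the measured field components: the eigenvector with the smallest eigenvalue estimates the minimum-variance direction (for example, a wave propagation direction or boundary normal). The sketch below shows that core step on a synthetic field that varies only in the x-y plane, so the minimum-variance direction should come out along z; it is not the STEREO analysis itself.

```python
import numpy as np

def minimum_variance_direction(b):
    """b: (N, 3) array of field samples. Return the unit minimum-variance vector."""
    cov = np.cov(b, rowvar=False)            # 3x3 covariance of B components
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # eigenvector of smallest eigenvalue

# Synthetic field: rotation in the x-y plane plus a constant z component,
# so all the variance lies in x and y.
t = np.linspace(0.0, 4.0 * np.pi, 200)
b = np.column_stack([np.cos(t), 0.5 * np.sin(t), 0.05 * np.ones_like(t)])
n = minimum_variance_direction(b)
# n is, up to sign, the z axis: [0, 0, +/-1].
```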

  16. Modeling Concept Dependencies for Event Detection

    DTIC Science & Technology

    2014-04-04

Gaussian Mixture Model (GMM). Jiang et al. [8] provide a summary of experiments for TRECVID MED 2010. They employ low-level features such as SIFT and...event detection literature. Ballan et al. [2] present a method to introduce temporal information for video event detection with a BoW (bag-of-words...approach. Zhou et al. [24] study video event detection by encoding a video with a set of bag of SIFT feature vectors and describe the distribution with a

  17. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR could achieve detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
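The arithmetic behind chamber-based absolute quantitation is the standard Poisson correction: from the fraction of positive chambers k/n, the mean copies per chamber is λ = −ln(1 − k/n), and GMO content follows as the ratio of the foreign-target λ to the reference-gene λ. The sketch below illustrates that textbook calculation; the chamber counts are invented, and the paper's actual assay details (primers, thresholds) are not modeled.

```python
import math

# Sketch of absolute quantitation in chamber-based digital PCR.
# Poisson correction: lambda = -ln(1 - positives/total) copies per chamber.

def copies_per_chamber(positives, total):
    return -math.log(1.0 - positives / total)

def gmo_ratio(foreign_pos, reference_pos, total):
    """Ratio of foreign-fragment to reference-gene concentration in one duplex run."""
    return copies_per_chamber(foreign_pos, total) / copies_per_chamber(reference_pos, total)

# Illustrative duplex run: 20,000 chambers, 95 positive for the
# event-specific target and 9,000 positive for the reference gene.
ratio = gmo_ratio(95, 9000, 20000)
# ratio is roughly 0.008, i.e. about 0.8% GMO content.
```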

  18. Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures

    NASA Astrophysics Data System (ADS)

    Bergen, Karianne J.; Beroza, Gregory C.

    2018-03-01

New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network: event-pair extraction, pairwise pseudo-association, and event resolution together form a post-processing pipeline that combines single-station similarity measures (e.g. the FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
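The pairwise pseudo-association idea can be sketched in a heavily simplified form: each station reports event pairs as (t1, t2) detection times, and a pair is promoted to a candidate when multiple stations report both times within a tolerance, with no move-out modeling. This toy version is not the FAST pipeline (which works on sparse similarity matrices and includes event-pair extraction and resolution stages); station names, times, and the tolerance are invented.

```python
# Simplified sketch of pairwise pseudo-association over a network.
# station_pairs: dict mapping station -> list of (t1, t2) event-pair times.

def pseudo_associate(station_pairs, tol=2.0, min_stations=2):
    """Promote event pairs supported by at least min_stations stations."""
    candidates = []
    stations = list(station_pairs)
    base = stations[0]
    for (t1, t2) in station_pairs[base]:
        support = 1
        for other in stations[1:]:
            if any(abs(u1 - t1) <= tol and abs(u2 - t2) <= tol
                   for (u1, u2) in station_pairs[other]):
                support += 1
        if support >= min_stations:
            candidates.append((t1, t2, support))
    return candidates

detections = {
    "ST01": [(100.0, 350.0), (500.0, 720.0)],
    "ST02": [(101.2, 351.1)],               # matches the first ST01 pair
    "ST03": [(99.5, 349.8), (610.0, 900.0)],
}
events = pseudo_associate(detections)
# Only the (100, 350) pair is seen consistently at multiple stations;
# the (500, 720) pair at ST01 alone is discarded as likely correlated noise.
```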

  19. Volcanic activity and satellite-detected thermal anomalies at Central American volcanoes

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1973-01-01

The author has identified the following significant results. A large nuee ardente eruption occurred at Santiaguito volcano, within the test area, on 16 September 1973. Through a system of local observers, the eruption has been described, reported to the international scientific community, the extent of the affected area mapped, and the new ash sampled. A more extensive report on this event will be prepared. The eruption is an excellent example of the kind of volcanic situation in which satellite thermal imagery might be useful. The Santiaguito dome is a complex mass with a whole series of historically active vents. Its location makes access difficult, yet its activity is of great concern to large agricultural populations who live downslope. Santiaguito has produced a number of large eruptions with little apparent warning. In the earlier ground survey large thermal anomalies were identified at Santiaguito. There is no way of knowing whether satellite monitoring could have detected changes in thermal anomaly patterns related to this recent event, but the position of thermal anomalies on Santiaguito and any changes in their character would be relevant information.

  20. Aptamer-conjugated live human immune cell based biosensors for the accurate detection of C-reactive protein

    NASA Astrophysics Data System (ADS)

    Hwang, Jangsun; Seo, Youngmin; Jo, Yeonho; Son, Jaewoo; Choi, Jonghoon

    2016-10-01

    C-reactive protein (CRP) is a pentameric protein that is present in the bloodstream during inflammatory events, e.g., liver failure, leukemia, and/or bacterial infection. The level of CRP indicates the progress and prognosis of certain diseases; it is therefore necessary to measure CRP levels in the blood accurately. The normal concentration of CRP is reported to be 1-3 mg/L. Inflammatory events increase the level of CRP by up to 500 times; accordingly, CRP is a biomarker of acute inflammatory disease. In this study, we demonstrated the preparation of DNA aptamer-conjugated peripheral blood mononuclear cells (Apt-PBMCs) that specifically capture human CRP. Live PBMCs functionalized with aptamers could detect different levels of human CRP by producing immune complexes with reporter antibody. The binding behavior of Apt-PBMCs toward highly concentrated CRP sites was also investigated. The immune responses of Apt-PBMCs were evaluated by measuring TNF-alpha secretion after stimulating the PBMCs with lipopolysaccharides. In summary, engineered Apt-PBMCs have potential applications as live cell based biosensors and for in vitro tracing of CRP secretion sites.

  1. Sensing qualitative events to control manipulation

    NASA Astrophysics Data System (ADS)

    Pook, Polly K.; Ballard, Dana H.

    1992-11-01

Dexterous robotic hands have numerous sensors distributed over a flexible high-degree-of-freedom framework. Control of these hands often relies on a detailed task description that is either specified a priori or computed on-line from sensory feedback. Such controllers are complex and may use unnecessary precision. In contrast, one can incorporate plan cues that provide a contextual backdrop in order to simplify the control task. To demonstrate, a Utah/MIT dexterous hand mounted on a Puma 760 arm flips a plastic egg, using the finger tendon tensions as the sole control signal. The completion of each subtask, such as picking up the spatula, finding the pan, and sliding the spatula under the egg, is detected by sensing tension states. The strategy depends on the task context but does not require precise positioning knowledge. We term this qualitative manipulation to draw a parallel with qualitative vision strategies. The approach is to design closed-loop programs that detect significant events to control manipulation but ignore inessential details. The strategy is generalized by analyzing the robot state dynamics during teleoperated hand actions to reveal the essential features that control each action.

  2. Brain Signals of Face Processing as Revealed by Event-Related Potentials

    PubMed Central

    Olivares, Ela I.; Iglesias, Jaime; Saavedra, Cristina; Trujillo-Barreto, Nelson J.; Valdés-Sosa, Mitchell

    2015-01-01

    We analyze the functional significance of different event-related potentials (ERPs) as electrophysiological indices of face perception and face recognition, according to cognitive and neurofunctional models of face processing. Initially, the processing of faces seems to be supported by early extrastriate occipital cortices and revealed by modulations of the occipital P1. This early response is thought to reflect the detection of certain primary structural aspects indicating the presence grosso modo of a face within the visual field. The posterior-temporal N170 is more sensitive to the detection of faces as complex-structured stimuli and, therefore, to the presence of their distinctive organizational characteristics prior to within-category identification. In turn, the relatively late and probably more rostrally generated N250r and N400-like responses might respectively indicate processes of access and retrieval of face-related information, which is stored in long-term memory (LTM). New methods of analysis of electrophysiological and neuroanatomical data, namely, dynamic causal modeling, single-trial and time-frequency analyses, are highly recommended to advance knowledge of the brain mechanisms underlying face processing. PMID:26160999

  3. GPS, Earthquakes, the Ionosphere, and the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Calais, Eric; Minster, J. Bernard

    1998-01-01

    Sources such as atmospheric or buried explosions and shallow earthquakes producing strong vertical ground displacements are known to produce infrasonic pressure waves in the atmosphere. Because of the coupling between neutral particles and electrons at ionospheric altitudes, these acoustic waves induce variations of the ionospheric electron density. The Global Positioning System provides a way of directly measuring the Total Electron Content in the ionosphere and, therefore, of detecting such perturbations in the upper atmosphere. In this work, we demonstrate the capabilities of the GPS technique to detect ionospheric perturbations caused by the January 17, 1994, Mw = 6.7 Northridge earthquake and the STS-58 Space Shuttle ascent. In both cases, we observe a perturbation of the ionospheric electron density lasting for about 30 min, with periods less than 10 min. The perturbation is complex and shows two sub-events separated by about 15 min. The phase velocities and waveform characteristics of the two sub-events lead us to interpret the first arrival as the direct propagation of a free wave, followed by oscillatory guided waves propagating along horizontal atmospheric interfaces at 120 km altitude and below.

  4. A novel method to accurately locate and count large numbers of steps by photobleaching

    PubMed Central

    Tsekouras, Konstantinos; Custer, Thomas C.; Jashnsaz, Hossein; Walter, Nils G.; Pressé, Steve

    2016-01-01

    Photobleaching event counting is a single-molecule fluorescence technique that is increasingly being used to determine the stoichiometry of protein and RNA complexes composed of many subunits in vivo as well as in vitro. By tagging protein or RNA subunits with fluorophores, activating them, and subsequently observing as the fluorophores photobleach, one obtains information on the number of subunits in a complex. The noise properties in a photobleaching time trace depend on the number of active fluorescent subunits. Thus, as fluorophores stochastically photobleach, noise properties of the time trace change stochastically, and these varying noise properties have created a challenge in identifying photobleaching steps in a time trace. Although photobleaching steps are often detected by eye, this method only works for high individual fluorophore emission signal-to-noise ratios and small numbers of fluorophores. With filtering methods or currently available algorithms, it is possible to reliably identify photobleaching steps for up to 20–30 fluorophores and signal-to-noise ratios down to ∼1. Here we present a new Bayesian method of counting steps in photobleaching time traces that takes into account stochastic noise variation in addition to complications such as overlapping photobleaching events that may arise from fluorophore interactions, as well as on-off blinking. Our method is capable of detecting ≥50 photobleaching steps even for signal-to-noise ratios as low as 0.1, can find 500 or more steps for more favorable noise profiles, and is computationally inexpensive. PMID:27654946
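To make the counting task concrete, here is a minimal sketch (emphatically not the authors' Bayesian method, which handles stochastic noise, blinking, and overlapping events): on a clean synthetic trace, a photobleaching step is simply a downward jump of roughly one fluorophore's intensity.

```python
def count_steps(trace, step_size=1.0, tol=0.5):
    """Indices where the intensity drops by roughly one `step_size`."""
    steps = []
    for i in range(1, len(trace)):
        drop = trace[i - 1] - trace[i]
        if drop > step_size - tol:      # a fluorophore bleached here
            steps.append(i)
    return steps

# Synthetic trace: 3 fluorophores bleaching at t = 4, 9, 14.
trace = [3.0] * 4 + [2.0] * 5 + [1.0] * 5 + [0.0] * 4
print(count_steps(trace))  # -> [4, 9, 14]
```

On real traces this naive detector fails exactly where the abstract says: the noise floor changes with each bleach, so a fixed tolerance either over- or under-counts, which is what motivates the noise-aware Bayesian treatment.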

  5. Introduction to State Estimation of High-Rate System Dynamics.

    PubMed

    Hong, Jonathan; Laflamme, Simon; Dodson, Jacob; Joyce, Bryan

    2018-01-13

    Engineering systems experiencing high-rate dynamic events, including airbags, debris detection, and active blast protection systems, could benefit from real-time observability for enhanced performance. However, the task of high-rate state estimation is challenging, in particular for real-time applications where the rate of the observer's convergence needs to be in the microsecond range. This paper identifies the challenges of state estimation of high-rate systems and discusses the fundamental characteristics of high-rate systems. A survey of applications and methods for estimators that have the potential to produce accurate estimations for a complex system experiencing highly dynamic events is presented. It is argued that adaptive observers are important to this research. In particular, adaptive data-driven observers are advantageous due to their adaptability and lack of dependence on the system model.
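As a concrete instance of the adaptive, data-driven observers the survey favors, the sketch below runs recursive least squares (one standard adaptive estimator, chosen here for illustration) on a toy first-order system; the model and numbers are invented, not from the paper.

```python
def rls_scalar(xs, lam=1.0):
    """Estimate a in x[k+1] = a * x[k] recursively from a trajectory."""
    a_hat, p = 0.0, 1e6          # initial guess and (large) covariance
    for k in range(len(xs) - 1):
        phi = xs[k]
        k_gain = p * phi / (lam + phi * p * phi)
        a_hat += k_gain * (xs[k + 1] - phi * a_hat)   # correct by residual
        p = (p - k_gain * phi * p) / lam              # shrink covariance
    return a_hat

xs = [1.0]
for _ in range(20):
    xs.append(0.9 * xs[-1])      # true dynamics: a = 0.9
print(round(rls_scalar(xs), 3))  # -> 0.9
```

Each update costs a constant number of arithmetic operations, which is the property that makes this family of observers plausible at the microsecond convergence rates the paper targets.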

  6. nES GEMMA Analysis of Lectins and Their Interactions with Glycoproteins - Separation, Detection, and Sampling of Noncovalent Biospecific Complexes

    NASA Astrophysics Data System (ADS)

    Engel, Nicole Y.; Weiss, Victor U.; Marchetti-Deschmann, Martina; Allmaier, Günter

    2017-01-01

    In order to better understand biological events, lectin-glycoprotein interactions are of interest. The possibility to gather more information than the mere positive or negative response for interactions brought mass spectrometry into the center of many research fields. The presented work shows the potential of a nano-electrospray gas-phase electrophoretic mobility molecular analyzer (nES GEMMA) to detect weak, noncovalent, biospecific interactions besides still unbound glycoproteins and unreacted lectins without prior liquid phase separation. First results for Sambucus nigra agglutinin, concanavalin A, and wheat germ agglutinin and their retained noncovalent interactions with glycoproteins in the gas phase are presented. Electrophoretic mobility diameters (EMDs) were obtained by nES GEMMA for all interaction partners correlating very well with molecular masses determined by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) of the individual molecules. Moreover, EMDs measured for the lectin-glycoprotein complexes were in good accordance with theoretically calculated mass values. Special focus was laid on complex formation for different lectin concentrations and binding specificities to evaluate the method with respect to results obtained in the liquid phase. The latter was addressed by capillary electrophoresis on-a-chip (CE-on-a-chip). Of exceptional interest was the fact that the formed complexes could be sampled according to their size onto nitrocellulose membranes after gas-phase separation. Subsequent immunological investigation further proved that the collected complex actually retained its native structure throughout nES GEMMA analysis and sampling.

  7. Empirical support for the definition of a complex trauma event in children and adolescents.

    PubMed

    Wamser-Nanney, Rachel; Vandenberg, Brian R

    2013-12-01

    Complex trauma events have been defined as chronic, interpersonal traumas that begin early in life (Cook, Blaustein, Spinazzola, & van der Kolk, 2003). The complex trauma definition has been examined in adults, as indicated by the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) field trial; however, this research was lacking in child populations. The symptom presentations of complexly traumatized children were contrasted with those exposed to other, less severe trauma ecologies that met 1 or 2 features of the complex trauma definition. Included in this study were 346 treatment-seeking children and adolescents (ages 3–18 years) who had experienced a traumatic event. Results indicated that child survivors of complex trauma presented with higher levels of generalized behavior problems and trauma-related symptoms than those who experienced (a) acute noninterpersonal trauma, (b) chronic interpersonal trauma that begins later in life, and (c) acute interpersonal trauma. Greater levels of behavioral problems were observed in children exposed to complex trauma as compared to those who experienced a traumatic event that begins early in life. These results provide support for the complex trauma event definition and suggest the need for a complex trauma diagnostic construct for children and adolescents.

  8. Molecular evidence of hybridization in sympatric populations of the Enantia jethys complex (Lepidoptera: Pieridae).

    PubMed

    Jasso-Martínez, Jovana M; Machkour-M'Rabet, Salima; Vila, Roger; Rodríguez-Arnaiz, Rosario; Castañeda-Sortibrán, América Nitxin

    2018-01-01

    Hybridization events are frequently demonstrated in natural butterfly populations. One interesting butterfly complex species is the Enantia jethys complex that has been studied for over a century; many debates exist regarding the species composition of this complex. Currently, three species that live sympatrically in the Gulf slope of Mexico (Enantia jethys, E. mazai, and E. albania) are recognized in this complex (based on morphological and molecular studies). Where these species live in sympatry, some cases of interspecific mating have been observed, suggesting hybridization events. Considering this, we employed a multilocus approach (analyses of mitochondrial and nuclear sequences: COI, RpS5, and Wg; and nuclear dominant markers: inter-simple sequence repeats (ISSRs)) to study hybridization in sympatric populations from Veracruz, Mexico. Genetic diversity parameters were determined for all molecular markers, and species identification was assessed by different methods such as analyses of molecular variance (AMOVA), clustering, principal coordinate analysis (PCoA), gene flow, and PhiPT parameters. ISSR molecular markers were used for a more profound study of the hybridization process. Although species of the Enantia jethys complex have a low dispersal capacity, we observed high genetic diversity, probably reflecting a high density of individuals locally. ISSR markers provided evidence of a contemporary hybridization process, detecting a high number of hybrids (from 17% to 53%) with significant differences in genetic diversity. Furthermore, a directional pattern of hybridization was observed from E. albania to other species. Phylogenetic study through DNA sequencing confirmed the existence of three clades corresponding to the three species previously recognized by morphological and molecular studies. This study underlines the importance of assessing hybridization in evolutionary studies, by tracing the lineage separation process that leads to the origin of new species. Our research demonstrates that hybridization processes have a high occurrence in natural populations.

  9. LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.

    2016-08-04

    The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and data, it also is exposing previously unreachable infrastructure to a worldwide set of actors. Existing solutions for intrusion detection and prevention that are signature-focused typically seek to detect anomalous and/or malicious activity for the sake of preventing or mitigating negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g. rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Similar to cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context. So we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model based approach to recognizing anomalous events in network traffic and present the design of this approach of bio-inspired and statistical models working in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented along with a plugin to support real-world cyber operations.
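The contrast between static and behavioral indicators can be illustrated with a toy example (hosts, thresholds, and traffic counts are invented; this is not LINEBACKER's model): a fixed tripwire fires only on absolute bursts, while a per-host baseline of mean plus three standard deviations of that host's own history adapts to context.

```python
from statistics import mean, stdev

def static_alarm(count, limit=1000):
    """Static indicator: one fixed tripwire for every host."""
    return count > limit

def behavioral_alarm(count, history):
    """Behavioral indicator: alarm relative to this host's own baseline."""
    return count > mean(history) + 3 * stdev(history)

busy_host_history = [900, 950, 1020, 980, 1000]   # normally chatty
quiet_host_history = [5, 8, 6, 7, 4]              # normally silent

# A 300-connection surge from the quiet host:
print(static_alarm(300), behavioral_alarm(300, quiet_host_history))
# -> False True
```

The static rule misses the quiet host's roughly 50x surge because 300 never crosses the global limit, which is the kind of context-blindness the behavioral emphasis is meant to fix.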

  10. Phenotypic screening for developmental neurotoxicity ...

    EPA Pesticide Factsheets

    There are large numbers of environmental chemicals with little or no available information on their toxicity, including developmental neurotoxicity. Because of the resource-intensive nature of traditional animal tests, high-throughput (HTP) methods that can rapidly evaluate chemicals for the potential to affect the developing brain are being explored. Typically, HTP screening uses biochemical and molecular assays to detect the interaction of a chemical with a known target or molecular initiating event (e.g., the mechanism of action). For developmental neurotoxicity, however, the mechanism(s) is often unknown. Thus, we have developed assays for detecting chemical effects on the key events of neurodevelopment at the cellular level (e.g., proliferation, differentiation, neurite growth, synaptogenesis, network formation). Cell-based assays provide a test system at a level of biological complexity that encompasses many potential neurotoxic mechanisms. For example, phenotypic assessment of neurite outgrowth at the cellular level can detect chemicals that target kinases, ion channels, or esterases at the molecular level. The results from cell-based assays can be placed in a conceptual framework using an Adverse Outcome Pathway (AOP) which links molecular, cellular, and organ level effects with apical measures of developmental neurotoxicity. Testing a wide range of concentrations allows for the distinction between selective effects on neurodevelopmental and non-specific

  11. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
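The fused detect-associate-locate step can be caricatured in a few lines (a toy stand-in for the authors' empirical-stack correlation, with invented grid nodes and waveforms): score each candidate source location by correlating the observed multi-station record against that node's pre-built stack, with no phase picks and no travel-time model.

```python
def ncorr(a, b):
    """Zero-lag normalized correlation of two equal-length traces."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def locate(observed, stacks):
    """Return (best grid node, score): the stack most like the data."""
    return max(((node, ncorr(observed, s)) for node, s in stacks.items()),
               key=lambda t: t[1])

# Toy empirical stacks for two grid nodes (concatenated station traces).
stacks = {"node_A": [0, 1, 4, 1, 0, 0], "node_B": [0, 0, 1, 4, 1, 0]}
observed = [0, 1, 4, 1, 0, 0]            # matches node_A's stack
print(locate(observed, stacks)[0])       # -> node_A
```

Because the score uses the full waveform rather than a handful of picked arrival times, weak events that never produce clean picks can still accumulate correlation, which is the source of the sensitivity gain the abstract reports.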

  12. Developing an Automated Machine Learning Marine Oil Spill Detection System with Synthetic Aperture Radar

    NASA Astrophysics Data System (ADS)

    Pinales, J. C.; Graber, H. C.; Hargrove, J. T.; Caruso, M. J.

    2016-02-01

    Previous studies have demonstrated the ability to detect and classify marine hydrocarbon films with spaceborne synthetic aperture radar (SAR) imagery. The dampening effects of hydrocarbon discharges on small surface capillary-gravity waves renders the ocean surface "radar dark" compared with the standard wind-borne ocean surfaces. Given the scope and impact of events like the Deepwater Horizon oil spill, the need for improved, automated and expedient monitoring of hydrocarbon-related marine anomalies has become a pressing and complex issue for governments and the extraction industry. The research presented here describes the development, training, and utilization of an algorithm that detects marine oil spills in an automated, semi-supervised manner, utilizing X-, C-, or L-band SAR data as the primary input. Ancillary datasets include related radar-borne variables (incidence angle, etc.), environmental data (wind speed, etc.) and textural descriptors. Shapefiles produced by an experienced human-analyst served as targets (validation) during the training portion of the investigation. Training and testing datasets were chosen for development and assessment of algorithm effectiveness as well as optimal conditions for oil detection in SAR data. The algorithm detects oil spills by following a 3-step methodology: object detection, feature extraction, and classification. Previous oil spill detection and classification methodologies such as machine learning algorithms, artificial neural networks (ANN), and multivariate classification methods like partial least squares-discriminant analysis (PLS-DA) are evaluated and compared. Statistical, transform, and model-based image texture techniques, commonly used for object mapping directly or as inputs for more complex methodologies, are explored to determine optimal textures for an oil spill detection system. The influence of the ancillary variables is explored, with a particular focus on the role of strong vs. weak wind forcing.
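The 3-step methodology named above (object detection, feature extraction, classification) can be sketched on a tiny synthetic "image"; the thresholds, features, and rule-based classifier are placeholders for illustration, not the trained system.

```python
def detect_objects(img, dark_thresh=0.3):
    """Step 1: flag 'radar dark' pixels as candidate oil objects."""
    return [[px < dark_thresh for px in row] for row in img]

def extract_features(img, mask):
    """Step 2: simple area and dark-vs-scene contrast descriptors."""
    vals = [px for row, mrow in zip(img, mask)
            for px, m in zip(row, mrow) if m]
    area = len(vals)
    scene_mean = sum(px for row in img for px in row) / sum(len(r) for r in img)
    contrast = scene_mean - (sum(vals) / area if area else 0.0)
    return {"area": area, "contrast": contrast}

def classify(feats, min_area=3, min_contrast=0.2):
    """Step 3: placeholder rule in lieu of an ANN / PLS-DA classifier."""
    return feats["area"] >= min_area and feats["contrast"] >= min_contrast

img = [[0.8, 0.8, 0.8, 0.8],
       [0.8, 0.1, 0.1, 0.8],
       [0.8, 0.1, 0.1, 0.8]]     # a dark slick in a bright sea
mask = detect_objects(img)
print(classify(extract_features(img, mask)))  # -> True
```

In the real system each placeholder is replaced by the components the abstract lists: texture descriptors and ancillary variables feed the features, and a trained classifier (ANN or PLS-DA) replaces the threshold rule.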

  13. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
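The peak-heuristic core of the approach can be sketched as follows (the real algorithm adds time-frequency analysis and terrain-aware rules; the acceleration trace and threshold here are synthetic): differentiate acceleration to get jerk, then flag local maxima above a threshold as candidate gait events.

```python
def jerk(acc, dt=0.01):
    """Finite-difference jerk (time derivative of acceleration)."""
    return [(b - a) / dt for a, b in zip(acc, acc[1:])]

def peak_events(sig, thresh):
    """Indices of local maxima of `sig` exceeding `thresh`."""
    return [i for i in range(1, len(sig) - 1)
            if sig[i] > thresh and sig[i - 1] < sig[i] >= sig[i + 1]]

# Synthetic accelerometer trace with two foot-impact transients.
acc = [0, 0, 0.1, 1.5, 0.2, 0, 0, 0.1, 1.4, 0.1, 0]
j = jerk(acc)
print(peak_events(j, thresh=50))  # -> [2, 7]
```

Because each sample is inspected once against its immediate neighbors, the detector runs in constant time per sample, which is what makes a real-time drop-foot or prosthesis controller feasible.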

  14. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086

  15. Systematic detection of seismic events at Mount St. Helens with an ultra-dense array

    NASA Astrophysics Data System (ADS)

    Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.

    2016-12-01

    During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us with an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intensive swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring the seismic velocity changes at Mount St. Helens using the coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
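The matched-filter step has a simple single-channel core, sketched below with toy numbers (the study applies it across 900 geophones on a GPU cluster): slide a template event along the continuous trace and report windows whose normalized cross-correlation exceeds a threshold. Note the score is amplitude-invariant, so a weaker repeat of the template still correlates at 1.0.

```python
def xcorr_detect(data, tmpl, thresh=0.9):
    """Windows of `data` whose normalized correlation with `tmpl` >= thresh."""
    m = len(tmpl)
    t_norm = sum(x * x for x in tmpl) ** 0.5
    hits = []
    for i in range(len(data) - m + 1):
        win = data[i:i + m]
        w_norm = sum(x * x for x in win) ** 0.5
        if w_norm == 0:
            continue                      # skip dead (all-zero) windows
        cc = sum(a * b for a, b in zip(win, tmpl)) / (w_norm * t_norm)
        if cc >= thresh:
            hits.append((i, round(cc, 3)))
    return hits

tmpl = [0.0, 1.0, -1.0, 0.5]
# Continuous trace: the template at t=5 and a half-amplitude repeat at t=14.
data = [0.0] * 5 + tmpl + [0.0] * 5 + [x * 0.5 for x in tmpl] + [0.0] * 3
print(xcorr_detect(data, tmpl))  # -> [(5, 1.0), (14, 1.0)]
```

That amplitude invariance is also why the study needs a separate relative-magnitude calibration (their principal-component amplitude-ratio fit): the correlation detector alone says nothing about event size.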

  16. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated from historical data. The evaluation reported here used semi-synthetic data to evaluate the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of the observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy-to-apply, computationally efficient, and performs as well as or better than a standard frequentist method.
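The core posterior computation can be illustrated with a deliberately simplified model (Poisson counts, a handful of elevated-rate hypotheses, illustrative rates and priors; the paper's semi-informative prior and closed-form derivation are richer than this): compare a baseline hypothesis against an "outbreak raises the rate" hypothesis, exactly as a decision analysis would consume it.

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def outbreak_posterior(count, baseline, factors=(1.5, 2.0, 3.0),
                       p_outbreak=0.05):
    """P(outbreak | count) under elevated-rate hypotheses `factors`."""
    p_h0 = (1 - p_outbreak) * poisson_pmf(count, baseline)
    # Spread the outbreak prior uniformly over the elevation factors.
    p_h1 = sum(p_outbreak / len(factors) * poisson_pmf(count, f * baseline)
               for f in factors)
    return p_h1 / (p_h0 + p_h1)

# Baseline of 10 respiratory chief complaints/day (illustrative number):
print(outbreak_posterior(10, baseline=10.0))   # count near baseline: low
print(outbreak_posterior(30, baseline=10.0))   # elevated count: near 1
```

The evaluation loops over one count per day, so the cost is linear in the number of observed events, mirroring the complexity claim in the abstract.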

  17. Variations of trends of indicators describing complex systems: Change of scaling precursory to extreme events

    NASA Astrophysics Data System (ADS)

    Keilis-Borok, V. I.; Soloviev, A. A.

    2010-09-01

    Socioeconomic and natural complex systems persistently generate extreme events also known as disasters, crises, or critical transitions. Here we analyze patterns of background activity preceding extreme events in four complex systems: economic recessions, surges in homicides in a megacity, magnetic storms, and strong earthquakes. We use as a starting point the indicators describing the system's behavior and identify changes in an indicator's trend. Those changes constitute our background events (BEs). We demonstrate a premonitory pattern common to all four systems considered: relatively large-magnitude BEs become more frequent before an extreme event. A premonitory change of scaling has been found in various models and observations. Here we demonstrate this change in scaling of uniformly defined BEs in four real complex systems, their enormous differences notwithstanding.

  18. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    DOE PAGES

    Delorey, Andrew A.; van der Elst, Nicholas J.; Johnson, Paul Allan

    2016-12-28

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding the fault's stress state, poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest successes in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified and thus we developed a technique to improve the identification of very small magnitude events. We identify events applying a method known as inter-station seismic coherence where we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  20. Sleep-Related Orgasms in a 57-Year-Old Woman: A Case Report.

    PubMed

    Irfan, Muna; Schenck, Carlos H

    2018-01-15

    We report a case of problematic spontaneous orgasms during sleep in a 57-year-old woman who also complained of hypnic jerks and symptoms of exploding head syndrome. To our knowledge, this is the first case report in the English language literature of problematic spontaneous orgasms during sleep. She had a complex medical and psychiatric history, and was taking oxycontin, venlafaxine, amitriptyline, and lurasidone. Prolonged video electroencephalogram monitoring did not record any ictal or interictal electroencephalogram discharges, and nocturnal video polysomnography monitoring did not record any behavioral or orgasmic event. Periodic limb movement index was zero events/h. Severe central sleep apnea was detected with apnea-hypopnea index = 130 events/h, but she could not tolerate positive airway pressure titration. Sleep architecture was disturbed, with 96.4% of sleep spent in stage N2 sleep. Bedtime clonazepam therapy (1.5 mg) was effective in suppressing the sleep-related orgasms and hypnic jerks. © 2018 American Academy of Sleep Medicine

  1. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    USGS Publications Warehouse

    Delorey, Andrew; Van Der Elst, Nicholas; Johnson, Paul

    2017-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding the fault's stress state, poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest successes in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified and thus we developed a technique to improve the identification of very small magnitude events. We identify events applying a method known as inter-station seismic coherence where we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  2. Revealing hidden clonal complexity in Mycobacterium tuberculosis infection by qualitative and quantitative improvement of sampling.

    PubMed

    Pérez-Lago, L; Palacios, J J; Herranz, M; Ruiz Serrano, M J; Bouza, E; García-de-Viedma, D

    2015-02-01

    The analysis of microevolution events, their functional relevance and their impact on molecular epidemiology strategies constitutes one of the most challenging aspects of the study of clonal complexity in infection by Mycobacterium tuberculosis. In this study, we retrospectively evaluated whether two improved sampling schemes could provide access to the clonal complexity that is undetected by the current standards (analysis of one isolate from one sputum). In 48 patients, we evaluated mycobacterial interspersed repetitive unit-variable number tandem repeat analysis of M. tuberculosis isolates cultured from bronchial aspirate (BAS) or bronchoalveolar lavage (BAL) specimens and, in another 16 cases, the analysis of a larger number of isolates from independent sputum samples. Analysis of the isolates from BAS/BAL specimens revealed clonal complexity in a very high proportion of cases (5/48); in most of these cases, complexity was not detected when the isolates from sputum samples were analysed. Systematic analysis of isolates from multiple sputum samples also improved the detection of clonal complexity. We found coexisting clonal variants in two of 16 cases that would have gone undetected in the analysis of the isolate from a single sputum specimen. Our results suggest that analysis of isolates from BAS/BAL specimens is highly efficient for recording the true clonal composition of M. tuberculosis in the lungs. When these samples are not available, we recommend increasing the number of isolates from independent sputum specimens, because they might not harbour the same pool of bacteria. Our data suggest that the degree of clonal complexity in tuberculosis has been underestimated because of the deficiencies inherent in a simplified procedure. Copyright © 2014 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  3. Automatic Detection and Classification of Audio Events for Road Surveillance Applications.

    PubMed

    Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine

    2018-06-06

    This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, research has shown several visual surveillance systems that have been proposed for road monitoring to detect accidents with an aim to improve safety procedures in emergency cases. However, the visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate over methods that use individual temporal and spectral features.
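The entry combines time-domain, frequency-domain, and joint time-frequency features. As a toy illustration of the first two families only (the paper's distinguishing features come from quadratic time-frequency distributions, which this sketch does not implement), a dependency-free feature pair might look like:

```python
import math

def zero_crossing_rate(x):
    """Time-domain feature: fraction of adjacent sample pairs whose sign differs."""
    return sum((a >= 0) != (b >= 0) for a, b in zip(x, x[1:])) / (len(x) - 1)

def spectral_centroid(x, fs):
    """Frequency-domain feature: magnitude-weighted mean frequency,
    computed with a naive O(n^2) DFT to stay dependency-free."""
    n = len(x)
    mags, freqs = [], []
    for k in range(n // 2):
        re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = -sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mags.append(math.hypot(re, im))
        freqs.append(k * fs / n)
    total = sum(mags)
    return sum(f * m for f, m in zip(freqs, mags)) / total if total else 0.0

# A pure 5 Hz tone sampled at 64 Hz has its spectral centroid near 5 Hz.
tone = [math.sin(2 * math.pi * 5 * i / 64) for i in range(64)]
features = (zero_crossing_rate(tone), spectral_centroid(tone, fs=64))
```

In a real detector, such per-frame descriptors would be concatenated into a feature vector and fed to a classifier; the function names here are illustrative, not the paper's.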

  4. Detecting earthquakes over a seismic network using single-station similarity measures

    NASA Astrophysics Data System (ADS)

    Bergen, Karianne J.; Beroza, Gregory C.

    2018-06-01

    New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is sensitive and maintains a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
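The pairwise idea can be sketched minimally (illustrative only; the actual FAST pipeline operates on sparse similarity matrices and a more careful association model): keep a candidate event pair from one station when enough stations report a pair with similar times, without modeling inter-station moveout.

```python
def pseudo_associate(station_pairs, min_stations=2, tol=3.0):
    """Toy pairwise pseudo-association: keep a candidate event pair (t1, t2)
    from one station when at least min_stations stations (including the
    originating one) report a pair with similar times, without modeling
    the expected inter-station moveout.

    station_pairs: one list of (t1, t2) detection-pair times per station."""
    kept = []
    for i, pairs in enumerate(station_pairs):
        for t1, t2 in pairs:
            support = 1 + sum(  # the originating station supports its own pair
                any(abs(t1 - u1) <= tol and abs(t2 - u2) <= tol for u1, u2 in other)
                for j, other in enumerate(station_pairs) if j != i
            )
            if support >= min_stations:
                kept.append((t1, t2))
    return kept

# Two stations see a similar pair near t=10/t=50; a lone pair at one station is dropped.
detections = [[(10.0, 50.0)], [(11.0, 51.0)], [(200.0, 300.0)]]
candidates = pseudo_associate(detections)
```

The function name and parameters are assumptions for illustration; the paper's event-pair extraction and event resolution stages are omitted.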

  5. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence

    PubMed Central

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images. PMID:26869966

  6. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence.

    PubMed

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  7. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  8. NAIMA as a solution for future GMO diagnostics challenges.

    PubMed

    Dobnik, David; Morisset, Dany; Gruden, Kristina

    2010-03-01

    In the field of genetically modified organism (GMO) diagnostics, real-time PCR has been the method of choice for target detection and quantification in most laboratories. Despite its numerous advantages, however, the lack of a true multiplexing option may render real-time PCR less practical in the face of future GMO detection challenges such as the multiplicity and increasing complexity of new transgenic events, as well as the repeated occurrence of unauthorized GMOs on the market. In this context, we recently reported the development of a novel multiplex quantitative DNA-based target amplification method, named NASBA implemented microarray analysis (NAIMA), which is suitable for sensitive, specific and quantitative detection of GMOs on a microarray. In this article, the performance of NAIMA is compared with that of real-time PCR, the focus being their performances in view of the upcoming challenge to detect/quantify an increasing number of possible GMOs at a sustainable cost and affordable staff effort. Finally, we present our conclusions concerning the applicability of NAIMA for future use in GMO diagnostics.

  9. Event detection in an assisted living environment.

    PubMed

    Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin

    2011-01-01

    This paper presents the design of a wireless event detection and in-building location awareness system. The system architecture is based on using a body-worn sensor to detect events such as falls where they occur in an assisted living environment. This process involves developing event detection algorithms and transmitting such events wirelessly to an in-house network based on the 802.15.4 protocol. The network would then generate alerts both in the assisted living facility and remotely to an offsite monitoring facility. The focus of this paper is on the design of the system architecture and the compliance challenges in applying this technology.

  10. Large-N Nodal Seismic Deployment at Mount St Helens

    NASA Astrophysics Data System (ADS)

    Hansen, S. M.; Schmandt, B.; Vidale, J. E.; Creager, K. C.; Levander, A.; Kiser, E.; Barklage, M.; Hollis, D.

    2014-12-01

    In late July of 2014 over 900 autonomous short period seismometers were deployed within 12 km of the summit crater at Mount St Helens. In concert with the larger iMUSH experiment, these data constitute the largest seismic interrogation of an active volcano ever conducted. The array was deployed along the road and trail system of the national volcanic monument and adjacent regions with an average station spacing of 250 meters and included several station clusters with increased sampling density. The 10 Hz geophones recorded the vertical-component wavefield continuously at a 250 Hz sampling rate over a period of approximately two weeks. During the recording time, the Pacific Northwest Seismic Network detected ~65 earthquakes within the array footprint ranging in magnitude from -0.9 to 1.1, the majority of which were located beneath the crater at less than 10 km depth. In addition to the natural seismicity, 23 explosion sources from the iMUSH active source experiment were recorded, several of which exceeded magnitude 2. Preliminary results for this project will include an expanded event catalog, as the array should significantly reduce the detection threshold. The sheer number of instruments allows for stacking of station clusters, producing high signal-to-noise beam traces which can be used for event triggering and for creating waveform templates to measure relative travel-times across the array via cross-correlation. The ability of the array to estimate focal mechanisms from event radiation patterns and delineate complex path effects will also be investigated. The density and azimuthal coverage provided by this array offer an excellent opportunity to investigate short-wavelength variations of the seismic wavefield in a complex geologic environment.
Previous seismic tomography results suggest the presence of a shallow magma chamber at 1-3 km depth near the region of shallow seismicity as evidenced by a P-wave low-velocity anomaly of at least -5.5% [Waite and Moran, 2009]. The proximity of the array as well as the event distribution make it possible to investigate wavefield distortion and scattering due to the potential magma chamber, including S-wave blockage as has been observed in other systems.
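The cluster-stacking step described in this entry — averaging aligned traces so coherent signal survives while incoherent noise cancels — can be sketched minimally. This is a zero-delay stack under the assumption that traces are already aligned; a real beam would first apply per-station time shifts:

```python
def stack_traces(traces):
    """Zero-delay linear stack of equal-length traces from a station cluster.
    Coherent energy adds in phase; uncorrelated noise is averaged down,
    raising the signal-to-noise ratio of the resulting beam trace."""
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]

# Three noisy observations of the same underlying two-sample signal.
beam = stack_traces([[1.1, -0.8], [0.9, -1.2], [1.0, -1.0]])
```

Beam traces built this way could then serve as higher-quality inputs for event triggering or template extraction, as the abstract describes.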

  11. Asynchronous ripple oscillations between left and right hippocampi during slow-wave sleep

    PubMed Central

    Villalobos, Claudio

    2017-01-01

    Spatial memory, among many other brain processes, shows hemispheric lateralization. Most of the published evidence suggests that the right hippocampus plays a leading role in the manipulation of spatial information. Concurrently in the hippocampus, memory consolidation during sleep periods is one of the key steps in the formation of newly acquired spatial memory traces. One of the most characteristic oscillatory patterns in the hippocampus are sharp-wave ripple (SWR) complexes. Within this complex, fast-field oscillations or ripples have been demonstrated to be instrumental in the memory consolidation process. Since these ripples are relevant for the consolidation of memory traces associated with spatial navigation, and this process appears to be lateralized, we hypothesize that ripple events between both hippocampi would exhibit different temporal dynamics. We tested this idea by using a modified "split-hyperdrive" that allows us to record simultaneous LFPs from both right and left hippocampi of Sprague-Dawley rats during sleep. We detected individual events and found that during sleep periods these ripples exhibited different occurrence patterns between hemispheres. Most ripple events were synchronous between intra- rather than inter-hemispherical recordings, suggesting that ripples in the hippocampus are independently generated and locally propagated within a specific hemisphere. In this study, we propose the ripples’ lack of synchrony between left and right hippocampi as the putative physiological mechanism underlying lateralization of spatial memory. PMID:28158285

  12. Asynchronous ripple oscillations between left and right hippocampi during slow-wave sleep.

    PubMed

    Villalobos, Claudio; Maldonado, Pedro E; Valdés, José L

    2017-01-01

    Spatial memory, among many other brain processes, shows hemispheric lateralization. Most of the published evidence suggests that the right hippocampus plays a leading role in the manipulation of spatial information. Concurrently in the hippocampus, memory consolidation during sleep periods is one of the key steps in the formation of newly acquired spatial memory traces. One of the most characteristic oscillatory patterns in the hippocampus are sharp-wave ripple (SWR) complexes. Within this complex, fast-field oscillations or ripples have been demonstrated to be instrumental in the memory consolidation process. Since these ripples are relevant for the consolidation of memory traces associated with spatial navigation, and this process appears to be lateralized, we hypothesize that ripple events between both hippocampi would exhibit different temporal dynamics. We tested this idea by using a modified "split-hyperdrive" that allows us to record simultaneous LFPs from both right and left hippocampi of Sprague-Dawley rats during sleep. We detected individual events and found that during sleep periods these ripples exhibited different occurrence patterns between hemispheres. Most ripple events were synchronous between intra- rather than inter-hemispherical recordings, suggesting that ripples in the hippocampus are independently generated and locally propagated within a specific hemisphere. In this study, we propose the ripples' lack of synchrony between left and right hippocampi as the putative physiological mechanism underlying lateralization of spatial memory.

  13. Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.

    PubMed

    Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng

    2016-12-08

    This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control system (NCS) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve the efficient utilization of the communication network bandwidth. Both the sensor-to-control station and the control station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous design of FDF and controller.
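The central idea — transmit over the network only when the measurement has drifted enough, and adjust the trigger when a fault looks likely — can be sketched as follows. The relative-error rule is a standard device in the event-triggered NCS literature, but the adaptation law and all names here are illustrative assumptions, not the paper's design:

```python
def should_transmit(x, x_last, sigma):
    """Relative event-triggering rule: release a new sample to the network
    only when its deviation from the last transmitted sample exceeds
    sigma times the current state norm."""
    err = sum((a - b) ** 2 for a, b in zip(x, x_last)) ** 0.5
    norm = sum(a * a for a in x) ** 0.5
    return err > sigma * norm

def adapt_sigma(sigma, fault_suspected, lo=0.01, hi=0.5, factor=0.8):
    """Illustrative adaptation: shrink sigma (transmit more often, feeding the
    fault detection filter with fresher data) while a fault is suspected;
    relax it otherwise to save network bandwidth."""
    return max(lo, sigma * factor) if fault_suspected else min(hi, sigma / factor)
```

A smaller sigma means more transmissions and faster fault detection; a larger sigma conserves bandwidth, which is the trade-off the paper's mechanism balances.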

  14. Complex Event Processing for Content-Based Text, Image, and Video Retrieval

    DTIC Science & Technology

    2016-06-01

    ARL-TR-7705 ● JUNE 2016 ● US Army Research Laboratory ● Complex Event Processing for Content-Based Text, Image, and Video Retrieval

  15. Nanostructured silver fabric as a free-standing NanoZyme for colorimetric detection of glucose in urine.

    PubMed

    Karim, Md N; Anderson, Samuel R; Singh, Sanjay; Ramanathan, Rajesh; Bansal, Vipul

    2018-07-01

    Enzyme-mimicking catalytic nanoparticles, more commonly known as NanoZymes, have been at the forefront for the development of new sensing platforms for the detection of a range of molecules. Although solution-based NanoZymes have shown promise in glucose detection, the ability to immobilize NanoZymes on highly absorbent surfaces, particularly on free-standing substrates that can be feasibly exposed and removed from the reaction medium, can offer significant benefits for a range of biosensing and catalysis applications. This work, for the first time, shows the ability of Ag nanoparticles embedded within the 3D matrix of a cotton fabric to act as a free-standing peroxidase-mimic NanoZyme for the rapid detection of glucose in complex biological fluids such as urine. The use of cotton fabric as a template not only allows a high number of catalytically active sites to participate in the enzyme-mimic catalytic reaction; the absorbent property of the cotton fibres also helps in rapid absorption of biological molecules such as glucose during the sensing event. This, in turn, brings the target molecule of interest into close proximity to the NanoZyme catalyst, enabling accurate detection of glucose in urine. Additionally, the ability to extract the free-standing cotton fabric-supported NanoZyme following the reaction overcomes the issue of potential interference from colloidal nanoparticles during the assay. Based on these unique characteristics, nanostructured silver fabrics offer remarkable promise for the detection of glucose and other biomolecules in complex biological and environmental fluids. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    NASA Astrophysics Data System (ADS)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce manpower needs and to assure a constant high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. This project combines technologies from different disciplines; in particular, it leverages an event-driven architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. 
All components work in a loosely coupled, event-based architecture, with a message broker centralizing all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data to provide real-time feedback to human experts, who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as an automated monitoring assistant for the ATLAS data taking infrastructure.
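As a toy illustration of the kind of correlation a CEP engine performs over a message stream (the real AAL system evaluates expert-defined queries; this sliding-window burst rule and all names are invented stand-ins):

```python
from collections import deque

class BurstDetector:
    """Toy CEP-style correlation: flag a component when it emits at least
    `threshold` error messages within a sliding `window`-second interval,
    i.e. the meaningful signal is in the aggregate, not any single message."""

    def __init__(self, window=10.0, threshold=3):
        self.window, self.threshold = window, threshold
        self.events = {}  # component name -> deque of recent event times

    def on_error(self, component, t):
        q = self.events.setdefault(component, deque())
        q.append(t)
        while q and t - q[0] > self.window:  # drop events outside the window
            q.popleft()
        return len(q) >= self.threshold      # True -> raise an operator alert
```

A front-end would subscribe to the alerts this returns, mirroring the engine/front-end split the abstract describes.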

  17. Perceiving goals and actions in individuals with autism spectrum disorders.

    PubMed

    Zalla, Tiziana; Labruyère, Nelly; Georgieff, Nicolas

    2013-10-01

    In the present study, we investigated the ability to parse familiar sequences of action into meaningful events in young individuals with autism spectrum disorders (ASDs), as compared to young individuals with typical development (TD) and young individuals with moderate mental retardation or learning disabilities (MLDs). While viewing two videotaped movies, participants were requested to detect the boundary transitions between component events at both fine and coarse levels of the action hierarchical structure. Overall, reduced accuracy for event detection was found in participants with ASDs, relative to participants with TD, at both levels of action segmentation. The performance was, however, equally diminished in participants with ASDs and MLDs under coarse-grained segmentation, suggesting that difficulties in detecting fine-grained events in ASDs cannot be explained by general intellectual dysfunction. Reduced accuracy for event detection was related to diminished event recall, memory for event sequence and Theory of Mind abilities. We hypothesized that difficulties with event detection result from a deficit disrupting the on-line processing of kinematic features and physical changes of dynamic human actions. An impairment at the earlier stages of the event encoding process might contribute to deficits in episodic memory and social functioning in individuals with ASDs.

  18. Station Set Residual: Event Classification Using Historical Distribution of Observing Stations

    NASA Astrophysics Data System (ADS)

    Procopio, Mike; Lewis, Jennifer; Young, Chris

    2010-05-01

    Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and with its survival into the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event and the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during the review process.
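The abstract notes the residual "can be quantified in many ways"; one minimal sketch scores the mismatch between observed and historically expected station sets with a symmetric-difference measure (an illustrative choice, not necessarily the score used at the IDC):

```python
def station_set_residual(observed, expected):
    """Score the mismatch between the stations that detected a hypothesized
    event and the stations historically expected to detect similar events.
    0.0 is a perfect match; 1.0 means no overlap at all (Jaccard distance)."""
    observed, expected = set(observed), set(expected)
    union = observed | expected
    if not union:
        return 0.0
    return 1.0 - len(observed & expected) / len(union)

# A missing "expected" station and an extra "unexpected" one both raise the score.
perfect = station_set_residual({"A", "B", "C"}, {"A", "B", "C"})  # 0.0
suspect = station_set_residual({"A", "B", "X"}, {"A", "B", "C"})  # 0.5
```

Hypothesized events whose residual exceeds a calibrated threshold could then be flagged for rejection, in the screening spirit the abstract proposes.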

  19. Controlling extreme events on complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed.

  20. Radiation detector device for rejecting and excluding incomplete charge collection events

    DOEpatents

    Bolotnikov, Aleksey E.; De Geronimo, Gianluigi; Vernon, Emerson; Yang, Ge; Camarda, Giuseppe; Cui, Yonggang; Hossain, Anwar; Kim, Ki Hyun; James, Ralph B.

    2016-05-10

    A radiation detector device is provided that is capable of distinguishing between full charge collection (FCC) events and incomplete charge collection (ICC) events based upon a correlation value comparison algorithm that compares correlation values calculated for individually sensed radiation detection events with a calibrated FCC event correlation function. The calibrated FCC event correlation function serves as a reference curve utilized by a correlation value comparison algorithm to determine whether a sensed radiation detection event fits the profile of the FCC event correlation function within the noise tolerances of the radiation detector device. If the radiation detection event is determined to be an ICC event, then the spectrum for the ICC event is rejected and excluded from inclusion in the radiation detector device spectral analyses. The radiation detector device also can calculate a performance factor to determine the efficacy of distinguishing between FCC and ICC events.
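The accept/reject step described above can be illustrated with a toy version, assuming the calibrated FCC correlation function is available as a callable and the detector noise enters as a tolerance band (all names here are illustrative, not from the patent):

```python
def classify_event(signal_amplitude, correlation_value, fcc_curve, tolerance):
    """Accept a sensed event as full charge collection (FCC) only when its
    measured correlation value lies within the noise tolerance of the
    calibrated FCC correlation function; otherwise reject it as incomplete
    charge collection (ICC) and exclude it from the spectrum."""
    expected = fcc_curve(signal_amplitude)
    return "FCC" if abs(correlation_value - expected) <= tolerance else "ICC"

# Toy calibration: correlation assumed to grow linearly with amplitude.
fcc_curve = lambda amplitude: 0.9 * amplitude
accepted = classify_event(1.0, 0.88, fcc_curve, tolerance=0.05)  # "FCC"
rejected = classify_event(1.0, 0.60, fcc_curve, tolerance=0.05)  # "ICC"
```

Counting how often events near the curve are misclassified would give a crude analogue of the performance factor the patent mentions.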

  1. Complex reassortment events of unusual G9P[4] rotavirus strains in India between 2011 and 2013.

    PubMed

    Doan, Yen Hai; Suzuki, Yoshiyuki; Fujii, Yoshiki; Haga, Kei; Fujimoto, Akira; Takai-Todaka, Reiko; Someya, Yuichi; Nayak, Mukti K; Mukherjee, Anupam; Imamura, Daisuke; Shinoda, Sumio; Chawla-Sarkar, Mamta; Katayama, Kazuhiko

    2017-10-01

    Rotavirus A (RVA) is the predominant etiological agent of acute gastroenteritis in young children worldwide. Recently, unusual G9P[4] rotavirus strains have emerged with high prevalence in many countries. Such intergenogroup reassortant strains highlight the ongoing spread of unusual rotavirus strains throughout Asia. This study was undertaken to determine the whole genomes of eleven unusual G9P[4] strains detected in India during 2011-2013, and to compare them with other human and animal global RVAs to understand the exact origin of the unusual G9P[4] strains circulating in India and other countries worldwide. Of these 11 RVAs, four G9P[4] strains were double reassortants, with the G9-VP7 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N2-T2-E6-H2). The other strains showed a complex genetic constellation, likely derived from a triple reassortment event involving the G9-VP7, N1-NSP2 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N1-T2-E6-H2). Presumably, these unusual G9P[4] strains were generated through several reassortment events between contemporary co-circulating human rotavirus strains. Moreover, the point mutation S291L in VP6, located at the interaction site between the inner and outer capsid proteins, may be important in the rapid spread of this unusual strain. The complex reassortment events within the G9P[4] strains may be related to the high prevalence of mixed infections in India, as reported in this study and previous studies. Copyright © 2017. Published by Elsevier B.V.

  2. Socio-Technical Systems Analysis in Health Care: A Research Agenda

    PubMed Central

    Bass, Ellen; Bellandi, Tommaso; Gurses, Ayse; Hallbeck, Susan; Mollo, Vanina

    2012-01-01

    Given the complexity of health care and the ‘people’ nature of healthcare work and delivery, STSA (Sociotechnical Systems Analysis) research is needed to address the numerous quality of care problems observed across the world. This paper describes open STSA research areas, including workload management, physical, cognitive and macroergonomic issues of medical devices and health information technologies, STSA in transitions of care, STSA of patient-centered care, risk management and patient safety management, resilience, and feedback loops between event detection, reporting and analysis and system redesign. PMID:22611480

  3. Abnormal global and local event detection in compressive sensing domain

    NASA Astrophysics Data System (ADS)

    Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem

    2018-05-01

    Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop movement representation models that remain effective and robust for both global and local abnormal event detection despite factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can both locate abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to efficiently represent movement features based on optical flow. Abnormal event detection is then formulated as a one-class classification task, learned solely from normal training samples. Experiments demonstrate that our algorithm performs well against state-of-the-art methods on benchmark abnormal event detection datasets.
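
As a rough illustration of the pipeline this abstract describes (compress a movement feature with a sparse measurement matrix, then flag outliers with a one-class model trained only on normal samples), here is a minimal sketch. The random {+1, -1, 0} matrix and the distance-threshold classifier are simplified stand-ins for the paper's designed matrix and one-class classifier:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_measurement_matrix(m, n, density=0.1):
    """Random sparse {+1, -1, 0} measurement matrix (m << n)."""
    mask = rng.random((m, n)) < density
    signs = rng.choice([-1.0, 1.0], size=(m, n))
    return mask * signs

def fit_normal_model(compressed_train):
    """One-class model learned from normal data only:
    mean of compressed features plus a distance threshold."""
    mu = compressed_train.mean(axis=0)
    d = np.linalg.norm(compressed_train - mu, axis=1)
    return mu, d.mean() + 3.0 * d.std()

def is_abnormal(feature, phi, mu, threshold):
    """Flag a (hypothetical optical-flow) feature whose compressed
    representation lies far from the normal distribution."""
    y = phi @ feature  # compressed movement feature
    return np.linalg.norm(y - mu) > threshold
```
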

  4. A fuzzy Petri-net-based mode identification algorithm for fault diagnosis of complex systems

    NASA Astrophysics Data System (ADS)

    Propes, Nicholas C.; Vachtsevanos, George

    2003-08-01

    Complex dynamical systems such as aircraft, manufacturing systems, chillers, motor vehicles, and submarines exhibit both continuous and event-driven dynamics. These systems pass through several discrete operating modes from startup to shutdown. For example, a shipboard system may be operating at half load or full load, or may be at start-up or shutdown. Of particular interest are extreme or "shock" operating conditions, which tend to severely impact fault diagnosis or the progression of a fault leading to a failure. Fault conditions are strongly dependent on the operating mode. Therefore, it is essential in any diagnostic/prognostic architecture that the operating mode be identified as accurately as possible, so that functions such as feature extraction, diagnostics, and prognostics can be correlated with the predominant operating conditions. This paper introduces a mode identification methodology that incorporates both time- and event-driven information about the process. A fuzzy Petri net is used to represent the possible successive mode transitions and to detect events, from processed sensor signals, signifying a mode change. The operating mode is initialized and verified by analysis of the time-driven dynamics through a fuzzy logic classifier. An evidence combiner module combines the results from the fuzzy Petri net and the fuzzy logic classifier to determine the mode. Unlike most event-driven mode identifiers, this architecture provides automatic mode initialization through the fuzzy logic classifier and robustness through the combining of evidence from the two algorithms. The mode identification methodology is applied to an AC plant typically found as a component of a shipboard system.

  5. Analysis of different device-based intrathoracic impedance vectors for detection of heart failure events (from the Detect Fluid Early from Intrathoracic Impedance Monitoring study).

    PubMed

    Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B

    2014-10-15

    Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart failure events. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Matched Filter Detection of Microseismicity at Ngatamariki and Rotokawa Geothermal Fields, Central North Island, New Zealand

    NASA Astrophysics Data System (ADS)

    Hopp, C. J.; Savage, M. K.; Townend, J.; Sherburn, S.

    2016-12-01

    Monitoring patterns in local microseismicity gives clues to the existence and location of subsurface structures. In the context of a geothermal reservoir, subsurface structures often indicate areas of high permeability and are vitally important in understanding fluid flow within the geothermal resource. Detecting and locating microseismic events within an area of power generation, however, is often challenging due to high levels of noise associated with nearby power plant infrastructure. In this situation, matched filter detection improves drastically upon standard earthquake detection techniques, specifically when events are likely induced by fluid injection and are therefore near-repeating. Using an earthquake catalog of 637 events that occurred between 1 January and 18 November 2015 as our initial dataset, we implemented a matched filtering routine for the Mighty River Power (MRP) geothermal fields at Rotokawa and Ngatamariki, central North Island, New Zealand. We detected nearly 21,000 additional events across both geothermal fields, a roughly 30-fold increase from the original catalog. On average, each of the 637 template events detected 45 additional events throughout the study period, with a maximum of 359 additional detections for a single template. Cumulative detection rates for all template events do not, in general, mimic large-scale changes in injection rates within the fields; however, we do see indications of an increase in detection rate associated with power plant shutdown at Ngatamariki. Locations of detected events follow established patterns of historic seismicity at both Ngatamariki and Rotokawa. One large cluster of events persists in the southeastern portion of Rotokawa and is likely bounded to the northwest by a known fault dividing the injection and production sections of the field. Two distinct clusters of microseismicity occur in the north and south of Ngatamariki, the latter appearing to coincide with a structure dividing the production zone from the southern injection zone.
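
The matched filtering used here slides each template waveform over the continuous record and declares a detection where the normalized cross-correlation exceeds a threshold; a multiple of the median absolute deviation (MAD) of the correlation trace is a common choice in such studies, though the abstract does not state the exact threshold used. A minimal sketch:

```python
import numpy as np

def normalized_cc(template, data):
    """Normalized cross-correlation of a template slid along continuous data."""
    nt = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(data) - nt + 1)
    for i in range(len(out)):
        w = data[i:i + nt]
        s = w.std()
        out[i] = (t * (w - w.mean())).sum() / (nt * s) if s > 0 else 0.0
    return out

def detect(template, data, mad_mult=8.0):
    """Return sample indices where the correlation trace exceeds
    mad_mult * MAD (an assumed threshold rule), plus the trace itself."""
    cc = normalized_cc(template, data)
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > mad_mult * mad), cc
```

In practice each of the 637 catalog events would supply one template per station/channel, and per-channel correlation traces would be stacked before thresholding.
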

  7. Sources of Infrasound events listed in IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick

    2017-04-01

    Until 2003, two waveform technologies, seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. At the beginning of 2010 the infrasound technology was reintroduced to IDC operations, and it has since contributed to both the automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network include fireballs (e.g. the Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile, 2015) and large surface explosions (e.g. Tianjin, China, 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy, 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations as well. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain better source locations. The role of IDC analysts is to verify and improve the locations of events detected by the automatic system and to add events that were missed in the automatic process. Open-source materials may help to identify the nature of some events. Well-recorded examples may be added to the Reference Infrasound Event Database to aid the analysis process. This presentation will provide examples of events generated by different sources that were included in the IDC bulletins.

  8. Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems

    DTIC Science & Technology

    2009-03-01

    using the composite event detection method [Kerman, Jiang, Blumberg, and Buttrey, 2009]. Although the techniques and utility of the ... aforementioned method have been clearly demonstrated, there is still much work and research to be conducted within the realm of event detection. This ... detection methods. The paragraphs that follow summarize the discoveries of and lessons learned by multiple researchers and authors over many

  9. Exploiting semantics for sensor re-calibration in event detection systems

    NASA Astrophysics Data System (ADS)

    Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini

    2008-01-01

    Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it remains a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing alone, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames and cannot be "seen" directly by image processing. In this work we demonstrate that time-sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas using an appliance as an example (coffee-pot level detection based on video data) to show that semantics can guide the re-calibration of the detection model. This work exploits time-sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.

  10. Complexity analysis of human physiological signals based on case studies

    NASA Astrophysics Data System (ADS)

    Angelova, Maia; Holloway, Philip; Ellis, Jason

    2015-04-01

    This work focuses on methods for investigating physiological time series based on complexity analysis. It is part of a wider programme to determine non-invasive markers for healthy ageing. We consider two case studies investigated with actigraphy: (a) sleep and its alterations in insomnia, and (b) ageing effects on mobility patterns. Using these case studies, we illustrate the application of fractal analysis to the investigation of regulation patterns and control, and of changes in physiological function. In the first case study, fractal analysis techniques were implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia in comparison with healthy controls. The aim was to investigate whether complexity analysis can detect the onset of adverse health-related events. The subjects with acute insomnia displayed significantly higher levels of complexity, possibly a result of too much activity in the underlying regulatory systems. The second case study considered mobility patterns during night time and their variation with age. It showed that complexity metrics can identify changes in physiological function with ageing. Both studies demonstrated that complexity analysis can be used to investigate markers of health, disease and healthy ageing.
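
The abstract does not name the specific fractal technique; detrended fluctuation analysis (DFA) is a standard choice for actigraphy correlations, so a first-order DFA sketch is shown here purely as an illustration. White noise yields a scaling exponent near 0.5, while more persistent (smoother) signals yield larger exponents:

```python
import numpy as np

def dfa_alpha(x, scales):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha from the slope of log F(n) versus log n."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)           # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))        # fluctuation F(n)
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]
```
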

  11. Gene fusion analysis in the battle against the African endemic sleeping sickness.

    PubMed

    Trimpalis, Philip; Koumandou, Vassiliki Lila; Pliakou, Evangelia; Anagnou, Nicholas P; Kossida, Sophia

    2013-01-01

    The protozoan Trypanosoma brucei causes African trypanosomiasis, or sleeping sickness, in humans, which can be lethal if untreated. Most available pharmacological treatments for the disease have severe side effects. The purpose of this analysis was to detect novel protein-protein interactions (PPIs) vital for the parasite, which could lead to the development of drugs that block these specific interactions. In this work, Domain Fusion Analysis (the Rosetta Stone method) was used to identify novel PPIs by comparing T. brucei to 19 organisms covering all major lineages of the tree of life. Overall, 49 possible protein-protein interactions were detected and classified based on (a) statistical significance (BLAST e-value, domain length, etc.), (b) their involvement in crucial metabolic pathways, and (c) their evolutionary history, particularly focusing on whether a protein pair is split in T. brucei and fused in the human host. We also evaluated fusion events involving hypothetical proteins, and suggest possible molecular functions or involvement in certain biological processes. This work has produced valuable results that could be further studied through structural biology or other experimental approaches to validate the protein-protein interactions proposed here. The evolutionary analysis of the proteins involved showed that gene fusion and gene fission events can happen in all organisms, while some protein domains are more prone to fusion and fission events and present complex evolutionary patterns.

  12. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines multi-scale permutation entropy and a support vector machine to detect low-SNR microseismic events. First, a signal feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal's permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropies of the collected vibration signals and constructing a feature vector set from them. Finally, a comparative analysis of microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed using multi-scale permutation entropy. The detection model, combined with the support vector machine's high classification accuracy and fast real-time performance, can meet the requirements of online, real-time extraction of microseismic events.
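
Permutation entropy, the core feature here, is the Shannon entropy of ordinal patterns in a signal; the multi-scale variant computes it on coarse-grained copies of the series. A minimal sketch (the order and scale choices are illustrative, and the paper's least squares SVM classifier is not reproduced here):

```python
import math

def permutation_entropy(x, order=3, normalize=True):
    """Shannon entropy of ordinal patterns of length `order`; normalized
    to [0, 1] by the maximum entropy log(order!)."""
    counts = {}
    for i in range(len(x) - order + 1):
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order)) if normalize else h

def coarse_grain(x, scale):
    """Coarse-graining for multi-scale analysis: non-overlapping means."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def multiscale_pe(x, order=3, scales=(1, 2, 3, 4, 5)):
    """Feature vector of permutation entropies over several scales,
    suitable as SVM input."""
    return [permutation_entropy(coarse_grain(x, s), order) for s in scales]
```

A monotonic ramp has a single ordinal pattern (entropy 0), while noise uses all patterns (entropy near 1), which is what lets the feature separate events from noise.
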

  13. Effects of Cross-axis Wind Jet Events on the Northern Red Sea Circulation

    NASA Astrophysics Data System (ADS)

    Menezes, V. V.; Bower, A. S.; Farrar, J. T.

    2016-12-01

    Despite its small size, the Red Sea has a complex circulation: boundary currents on both sides of the basin, a meridional overturning circulation, water mass formation in the northern part, and intense eddy activity. This complex pattern is driven by strong air-sea interactions. The Red Sea has one of the largest evaporation rates of the global oceans (2 m/yr) and an intricate, seasonally varying wind pattern. The winds blowing over the Northern Red Sea (NRS, north of 20N) are predominantly southeastward along the main axis all year round; in the southern part, they reverse seasonally due to the monsoonal regime. Although the winds are mostly along-axis in the NRS, several works have shown that during boreal winter the winds sometimes blow in a cross-axis direction. The westward winds from Saudi Arabia bring relatively cold, dry air and dust from the desert, enhancing heat loss and evaporation over the Red Sea. These wind-jet events may contribute to increased eddy activity and act as a trigger for water mass formation. Despite this, our knowledge of the cross-axis winds and their effect on the NRS circulation is still incipient. In the present work we analyze 10 years of QuikSCAT scatterometer winds and altimetric sea surface height anomalies, together with 2 years of mooring data, to characterize the westward wind-jet events and their impacts on the circulation. We show that the cross-axis winds are indeed an important component of the wind regime, explaining 11% of the wind variability of the NRS (well described by a second EOF mode). The westward events occur predominantly in winter, preferentially in January (about 15 events in 10 years), and have a mean duration of 4-5 days, with a maximum of 12 days (north of 22N). There are around 6 events per year, but in 2002-2003 and 2007-2008 twice as many events were detected. The westward wind events are found to strongly modify the wind stress curl, causing a distinct positive/negative curl pattern along the main axis. This pattern enhances the eddy activity and impacts the NRS circulation.

  14. Using natural archives to detect climate and environmental tipping points in the Earth System

    NASA Astrophysics Data System (ADS)

    Thomas, Zoë A.

    2016-11-01

    'Tipping points' in the Earth system are characterised by a nonlinear response to gradual forcing, and may have severe and wide-ranging impacts. Many abrupt events result from simple underlying system dynamics termed 'critical transitions' or 'bifurcations'. One of the best ways to identify and potentially predict threshold behaviour in the climate system is through analysis of natural ('palaeo') archives. Specifically, on the approach to a tipping point, early warning signals can be detected as characteristic fluctuations in a time series as a system loses stability. Testing whether these early warning signals can be detected in highly complex real systems is a key challenge, since much work is either theoretical or tested only with simple models. This is particularly problematic in palaeoclimate and palaeoenvironmental records with low-resolution, non-equidistant data, which can limit accurate analysis. Here, a range of different datasets is examined to explore generic rules that can be used to detect such dramatic events. A number of key criteria are identified as necessary for the reliable identification of early warning signals in natural archives, most crucially the need for a low-noise record of sufficient length, resolution and accuracy. A deeper understanding of the underlying system dynamics is required to inform the development of more robust system-specific indicators, or to indicate the temporal resolution required, given a known forcing. This review demonstrates that time-series precursors from natural archives provide a powerful means of forewarning of tipping points within the Earth system.
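
One widely used early warning signal of this kind is a sustained rise in lag-1 autocorrelation computed in a sliding window ('critical slowing down' as the system loses stability). A minimal sketch of that indicator follows; the window length and the slope summary are illustrative choices, not prescriptions from the review:

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a (demeaned) window."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return (x[:-1] * x[1:]).sum() / (x * x).sum()

def early_warning_trend(series, window):
    """Sliding-window lag-1 autocorrelation; a sustained rise (positive
    trend) is a classic precursor of an approaching critical transition."""
    ac = [lag1_autocorr(series[i:i + window])
          for i in range(len(series) - window + 1)]
    t = np.arange(len(ac))
    slope = np.polyfit(t, ac, 1)[0]   # simple trend summary of the indicator
    return np.array(ac), slope
```

In a real palaeorecord the series would first be detrended and interpolated to equidistant sampling, which is exactly where the low-resolution, non-equidistant data problem noted above bites.
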

  15. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trend in these time series can be used to monitor effusive activity: (a) gradual variations in radiance reveal steady flow-field extension and tube development; (b) discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these correlate with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events through simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatic alerting of major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronologies.
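
The spike criterion described here (radiance elevated more than two standard deviations above the mean of the time series) can be sketched directly:

```python
def radiance_spikes(radiance, n_sigma=2.0):
    """Indices where a radiance time series exceeds mean + n_sigma * std,
    i.e. the simple spike-alerting rule described in the abstract."""
    m = sum(radiance) / len(radiance)
    var = sum((r - m) ** 2 for r in radiance) / len(radiance)
    threshold = m + n_sigma * var ** 0.5
    return [i for i, r in enumerate(radiance) if r > threshold]
```

Each returned index would then be checked against video and ground-observer reports for confirmation, as the abstract describes.
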

  16. Detecting cell division of Pseudomonas aeruginosa bacteria from bright-field microscopy images with hidden conditional random fields.

    PubMed

    Ong, Lee-Ling S; Xinghua Zhang; Kundukad, Binu; Dauwels, Justin; Doyle, Patrick; Asada, H Harry

    2016-08-01

    An approach to automatically detecting bacterial cell division with temporal models is presented. To understand how bacteria migrate and proliferate to form complex multicellular behaviours such as biofilms, it is desirable to track individual bacteria and detect cell division events. Unlike eukaryotic cells, prokaryotic cells such as bacteria lack distinctive features, making division difficult to detect in a single image frame. Furthermore, bacteria may detach, migrate close to other bacteria, and orientate themselves at an angle to the horizontal plane. Our system trains a hidden conditional random field (HCRF) model from tracked and aligned bacterial division sequences. The HCRF model classifies a set of image frames as division or otherwise. The performance of our HCRF model is compared with that of a hidden Markov model (HMM). The results show that the HCRF classifier outperforms the HMM classifier. From 2D bright-field microscopy data, it is a challenge to separate individual bacteria and associate observations with tracks. Automatic detection of sequences containing bacterial division will improve tracking accuracy.

  17. A high-throughput method for GMO multi-detection using a microfluidic dynamic array.

    PubMed

    Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J

    2014-02-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods to enforce labelling requirements for genetically modified organisms (GMOs). The application of standard real-time PCR will become increasingly costly as the number of GMOs potentially present in an individual sample grows. The present work reports the results of an innovative approach to analysing genetically modified crops by DNA-based methods: the use of a microfluidic dynamic array as a high-throughput multi-detection system. To evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested gave the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work establishes the Fluidigm system as a suitable and promising platform for GMO multi-detection.

  18. Detection of complex cyber attacks

    NASA Astrophysics Data System (ADS)

    Gregorio-de Souza, Ian; Berk, Vincent H.; Giani, Annarita; Bakos, George; Bates, Marion; Cybenko, George; Madory, Doug

    2006-05-01

    One significant drawback of currently available security products is their inability to correlate diverse sensor input. For instance, by using only network intrusion detection data, a root kit installed through a weak username-password combination may go unnoticed. Similarly, an administrator may never make the link between deteriorating response times from the database server and an attacker exfiltrating trusted data if these facts aren't presented together. Current Security Information Management Systems (SIMS) can collect and represent diverse data but lack sufficient correlation algorithms. By using a Process Query System (PQS), we were able to quickly bring together data flowing from many sources, including NIDS, HIDS, server logs, CPU load and memory usage, etc. We constructed PQS models that describe the dynamic behavior of complicated attacks and failures, allowing us to detect and differentiate simultaneous sophisticated attacks on a target network. In this paper, we discuss the benefits of implementing such a multistage cyber attack detection system using PQS. We focus on how data from multiple sources can be combined and used to detect and track comprehensive network security events that go unnoticed using conventional tools.

  19. affy2sv: an R package to pre-process Affymetrix CytoScan HD and 750K arrays for SNP, CNV, inversion and mosaicism calling.

    PubMed

    Hernandez-Ferrer, Carles; Quintela Garcia, Ines; Danielski, Katharina; Carracedo, Ángel; Pérez-Jurado, Luis A; González, Juan R

    2015-05-20

    The well-known Genome-Wide Association Studies (GWAS) have led to many scientific discoveries using SNP data. Even so, they have not been able to explain the full heritability of complex diseases. Now other structural variants, such as copy number variants or DNA inversions, either germ-line or in mosaicism events, are being studied. We present the R package affy2sv to pre-process Affymetrix CytoScan HD/750K arrays (and also Genome-Wide SNP 5.0/6.0 and Axiom) for structural variant studies. We illustrate the capabilities of affy2sv using two complete pipelines on real data: the first performing a GWAS and a mosaic alteration detection study, the other detecting CNVs and performing inversion calling. Both examples presented in the article show how affy2sv can be used as part of more complex pipelines aimed at analyzing Affymetrix SNP array data in genetic association studies where different types of structural variants are considered.

  20. Time-resolved vibrational spectroscopy detects protein-based intermediates in the photosynthetic oxygen-evolving cycle.

    PubMed

    Barry, Bridgette A; Cooper, Ian B; De Riso, Antonio; Brewer, Scott H; Vu, Dung M; Dyer, R Brian

    2006-05-09

    Photosynthetic oxygen production by photosystem II (PSII) is responsible for the maintenance of aerobic life on Earth. The production of oxygen occurs at the PSII oxygen-evolving complex (OEC), which contains a tetranuclear manganese (Mn) cluster. Photo-induced electron transfer events in the reaction center lead to the accumulation of oxidizing equivalents on the OEC. Four sequential photooxidation reactions are required for oxygen production. The oxidizing complex cycles among five oxidation states, called the S(n) states, where n refers to the number of oxidizing equivalents stored. Oxygen release occurs during the S(3)-to-S(0) transition from an unstable intermediate known as the S(4) state. In this report, we present data providing evidence for the production of an intermediate during each S-state transition. These protein-derived intermediates are produced on the microsecond to millisecond time scale and are detected by time-resolved vibrational spectroscopy on the microsecond time scale. Our results suggest that a protein-derived conformational change or proton transfer reaction precedes the Mn redox reactions during the S(2)-to-S(3) and S(3)-to-S(0) transitions.

  1. The use of Matlab for colour fuzzy representation of multichannel EEG short time spectra.

    PubMed

    Bigan, C; Strungaru, R

    1998-01-01

    In recent years, much EEG research effort has been directed toward intelligent methods for the automatic analysis of data from multichannel EEG recordings. However, the applications reported have focused on single specific tasks, such as detection of one particular "event" in the EEG signal: spikes, sleep spindles, epileptic seizures, K complexes, alpha or other rhythms, or even artefacts. The aim of this paper is to present a complex system able to represent the dynamic changes in the frequency components of each EEG channel. This representation uses colour as a powerful means to show a single chosen frequency range from the shortest epoch of signal that can be processed with the conventional Short Time Fast Fourier Transform (STFFT) method.

  2. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
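
    The processing chain described above (bandpass filter, gain control, envelope, shift-and-stack along predicted Rayleigh-wave travel times) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the group velocity, filter order, and gain-control scheme are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def stack_envelopes(traces, dists_km, dt, v_kms=3.7, band_hz=(1/70, 1/35)):
    """Shift-and-stack detector for Rayleigh waves (illustrative sketch).

    traces:   (n_stations, n_samples) vertical-component records
    dists_km: distances from a trial source location to each station
    v_kms:    assumed Rayleigh group velocity (an assumption, not from the paper)
    band_hz:  passband corresponding to 35-70 s periods
    """
    nyq = 0.5 / dt
    b, a = butter(2, [band_hz[0] / nyq, band_hz[1] / nyq], btype="band")
    stack = np.zeros(traces.shape[1])
    for tr, d in zip(traces, dists_km):
        env = np.abs(hilbert(filtfilt(b, a, tr)))   # bandpass + envelope
        env /= env.max() + 1e-12                    # crude automatic gain control
        shift = int(round(d / v_kms / dt))          # predicted Rayleigh delay
        stack[: len(env) - shift] += env[shift:]    # align back to origin time
    return stack / len(traces)                      # peaks mark candidate events
```

    Temporal peaks of the returned stack, computed over a grid of trial locations, correspond to the spatial/temporal detection function described in the abstract.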

  3. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  4. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  5. Process connectivity reveals ecohydrologic sensitivity to drought and rainfall pulses

    NASA Astrophysics Data System (ADS)

    Goodwell, A. E.; Kumar, P.

    2017-12-01

    Ecohydrologic fluxes within atmosphere, canopy and soil systems exhibit complex and joint variability. This complexity arises from direct and indirect forcing and feedback interactions that can cause fluctuations to propagate between water, energy, and nutrient fluxes at various time scales. When an ecosystem is perturbed in the form of a single storm event, an accumulating drought, or changes in climate and land cover, this aspect of joint variability may dictate the responsiveness and resilience of the entire system. A characterization of the time-dependent and multivariate connectivity between processes, fluxes, and states is necessary to identify and understand these aspects of ecohydrologic systems. We construct Temporal Information Partitioning Networks (TIPNets), based on information theory measures, to identify time-dependencies between variables measured at flux towers along elevation and climate gradients in relation to their responses to moisture-related perturbations. Along a flux tower transect in the Reynolds Creek Critical Zone Observatory (CZO) in Idaho, we detect a significant network response to a large 2015 dry season rainfall event that enhances microbial respiration and latent heat fluxes. At a transect in the Southern Sierra CZO in California, we explore network properties in relation to drought responses from 2011 to 2015. We find that both high and low elevation sites exhibit decreased connectivity between atmospheric and soil variables and latent heat fluxes, but the higher elevation site is less sensitive to this altered connectivity in terms of average monthly heat fluxes. Through a novel approach to gauge the responsiveness of ecosystem fluxes to shifts in connectivity, this study aids our understanding of ecohydrologic sensitivity to short-term rainfall events and longer-term droughts. This study is relevant to ecosystem resilience under a changing climate, and can lead to a greater understanding of shifting behaviors in many types of complex systems.
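
    TIPNets are built from information-theoretic measures of time-dependency between variables. As a minimal stand-in for such measures, a histogram-based estimate of lagged mutual information between two series can be sketched as follows (the actual TIPNet construction partitions information into unique, redundant, and synergistic components, which this sketch omits):

```python
import numpy as np

def lagged_mi(x, y, lag=1, bins=8):
    """Histogram estimate (in nats) of the mutual information between
    x_{t-lag} and y_t, a crude proxy for the time-dependency measures
    used to build connectivity networks."""
    xs, ys = x[:-lag], y[lag:]
    pxy, _, _ = np.histogram2d(xs, ys, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)    # marginal of the lagged source
    py = pxy.sum(axis=0, keepdims=True)    # marginal of the target
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

    Computing this for every variable pair and lag, and keeping statistically significant values, yields a directed weighted network whose edge changes can then be tracked through a storm or drought period.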

  6. Brain, music, and non-Poisson renewal processes

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ignaccolo, Massimiliano; Rider, Mark S.; Ross, Mary J.; Winsor, Phil; Grigolini, Paolo

    2007-06-01

    In this paper we show that both music composition and brain function, as revealed by the electroencephalogram (EEG) analysis, are renewal non-Poisson processes living in the nonergodic dominion. To reach this important conclusion we process the data with the minimum spanning tree method, so as to detect significant events, thereby building a sequence of times, which is the time series to analyze. Then we show that in both cases, EEG and music composition, these significant events are the signature of a non-Poisson renewal process. This conclusion is reached using a technique of statistical analysis recently developed by our group, the aging experiment (AE). First, we find that in both cases the distances between two consecutive events are described by nonexponential histograms, thereby proving the non-Poisson nature of these processes. The corresponding survival probabilities Ψ(t) are well fitted by stretched exponentials, Ψ(t) ∝ exp(-(γt)^α), with 0.5 < α < 1. The second step rests on the adoption of AE, which shows that these are renewal processes. We show that the stretched exponential, due to its renewal character, is the emerging tip of an iceberg whose underwater part has slow tails with an inverse power law structure with power index μ = 1 + α. Adopting the AE procedure we find that both EEG and music composition yield μ < 2. On the basis of the recently discovered complexity matching effect, according to which a complex system S with μ_S < 2 responds only to a complex driving signal P with μ_P ≤ μ_S, we conclude that the results of our analysis may explain the influence of music on the human brain.
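
    The stretched-exponential survival fit described above can be illustrated by fitting Ψ(t) = exp(-(γt)^α) to an empirical survival curve of waiting times. This is a minimal least-squares sketch; the paper's exact fitting procedure is not specified here and the estimator below is an assumption:

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_stretched_exp(waits):
    """Fit the empirical survival probability of inter-event waiting times
    to Psi(t) = exp(-(g*t)**a). Returns (g, a)."""
    t = np.sort(np.asarray(waits))
    # empirical survival: fraction of waiting times longer than t
    psi = 1.0 - np.arange(1, len(t) + 1) / (len(t) + 1.0)
    model = lambda t, g, a: np.exp(-(g * t) ** a)
    (g, a), _ = curve_fit(model, t, psi, p0=(1.0 / t.mean(), 0.8),
                          bounds=([1e-6, 0.1], [np.inf, 2.0]))
    return g, a
```

    With the fitted α in hand, the abstract's relation μ = 1 + α gives the power index used in the complexity-matching argument.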

  7. Development of a homogeneous assay format for p53 antibodies using fluorescence correlation spectroscopy

    NASA Astrophysics Data System (ADS)

    Neuweiler, Hannes; Scheffler, Silvia; Sauer, Markus

    2005-08-01

    The development of reliable methods for the detection of minute amounts of antibodies directly in homogeneous solution represents one of the major tasks in the current research field of molecular diagnostics. We demonstrate the potential of fluorescence correlation spectroscopy (FCS) in combination with quenched peptide-based fluorescence probes for sensitive detection of p53 antibodies directly in homogeneous solution. Single tryptophan (Trp) residues in the sequences of short, synthetic peptide epitopes of the human p53 protein efficiently quench the fluorescence of an oxazine fluorophore attached to the amino-terminal ends of the peptides. The fluorescence quenching mechanism is thought to be a photoinduced electron transfer reaction from Trp to the dye, enabled by the formation of intramolecular complexes between dye and Trp. Specific recognition of the epitope by the antibody confines the conformational flexibility of the peptide; consequently, complex formation between dye and Trp is abolished and fluorescence is recovered. Using FCS, antibody binding can be monitored through two parameters simultaneously: both the diffusional mobility of the peptide and the quenching amplitude induced by its conformational flexibility change significantly upon antibody binding. Our data demonstrate that FCS in combination with fluorescence-quenched peptide epitopes opens new possibilities for the reliable detection of antibody binding events in homogeneous solution.

  8. Comparison of outliers and novelty detection to identify ionospheric TEC irregularities during geomagnetic storm and substorm

    NASA Astrophysics Data System (ADS)

    Pattisahusiwa, Asis; Houw Liong, The; Purqon, Acep

    2016-08-01

    In this study, we compare two learning mechanisms, outlier detection and novelty detection, for identifying ionospheric TEC disturbances caused by the November 2004 geomagnetic storm and the January 2005 substorm. Both mechanisms are implemented with the ν-SVR learning algorithm, a regression version of SVM. Our results show that both mechanisms are quite accurate in learning the TEC data. However, novelty detection is more accurate than outlier detection in extracting anomalies related to geomagnetic events: the anomalies found by outlier detection are mostly related to the trend of the data, while those found by novelty detection are associated with geomagnetic events. Novelty detection also shows evidence of LSTIDs during geomagnetic events.
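
    As a toy illustration of the novelty-detection mechanism (train only on quiet-time data, then flag what deviates from it), a one-class SVM can be used. Note the paper itself uses ν-SVR regression rather than a one-class model, and the data below are synthetic stand-ins, not TEC measurements:

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic stand-ins for TEC values (not the paper's data):
rng = np.random.default_rng(1)
quiet = rng.normal(20.0, 1.0, size=(500, 1))   # quiet-time background
storm = rng.normal(30.0, 1.0, size=(20, 1))    # disturbed values

# Novelty detection: fit on quiet data only; predict() returns -1
# for samples that do not belong to the learned distribution.
nov = OneClassSVM(nu=0.05, gamma="scale").fit(quiet)
storm_flags = nov.predict(storm)
```

    The contrast with outlier detection is in the training set: an outlier detector is fitted on all data (storm samples included), so strong trends can dominate what it flags, matching the comparison reported in the abstract.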

  9. Fluid-Faulting Interactions Examined Through Massive Waveform-Based Analyses of Earthquake Swarms in Volcanic and Tectonic Settings: Mammoth Mountain, Long Valley, Lassen, and Fillmore, California Swarms, 2014-2015

    NASA Astrophysics Data System (ADS)

    Shelly, D. R.; Ellsworth, W. L.; Prejean, S. G.; Hill, D. P.; Hardebeck, J.; Hsieh, P. A.

    2015-12-01

    Earthquake swarms, sequences of sustained seismicity, convey active subsurface processes that sometimes precede larger tectonic or volcanic episodes. Their extended activity and spatiotemporal migration can often be attributed to fluid pressure transients as migrating crustal fluids (typically water and CO2) interact with subsurface structures. Although the swarms analyzed here are interpreted to be natural in origin, the mechanisms of seismic activation likely mirror those observed for earthquakes induced by industrial fluid injection. Here, we use massive-scale waveform correlation to detect and precisely locate 3-10 times as many earthquakes as included in routine catalogs for recent (2014-2015) swarms beneath Mammoth Mountain, Long Valley Caldera, Lassen Volcanic Center, and Fillmore areas of California, USA. These enhanced catalogs, with location precision as good as a few meters, reveal signatures of fluid-faulting interactions, such as systematic migration, fault-valve behavior, and fracture mesh structures, not resolved in routine catalogs. We extend this analysis to characterize source mechanism similarity even for very small newly detected events using relative P and S polarity estimates. This information complements precise locations to define fault complexities that would otherwise be invisible. In particular, although swarms often consist of groups of highly similar events, some swarms contain a population of outliers with different slip and/or fault orientations. These events highlight the complexity of fluid-faulting interactions. Despite their different settings, the four swarms analyzed here share many similarities, including pronounced hypocenter migration suggestive of a fluid pressure trigger. This includes the July 2015 Fillmore swarm, which, unlike the others, occurred outside of an obvious volcanic zone. Nevertheless, it exhibited systematic westward and downdip migration on a ~1x1.5 km low-angle, NW-dipping reverse fault at midcrustal depth.

  10. Complex landslides in the Trans-Mexican Volcanic Belt - a case study in the State of Veracruz

    NASA Astrophysics Data System (ADS)

    Wilde, M.; Terhorst, B.; Schwindt, D.; Rodriguez Elizarrarás, S. R.; Morales Barrera, W. V.; Bücker, M.; Flores Orozco, A.; García García, E.; Pita de la Paz, C.

    2017-12-01

    The State of Veracruz (Mexico) is highly affected by landslides, and detailed studies of their triggering factors and process dynamics are therefore required. Profound insights are essential for further hazard assessments and the compilation of susceptibility maps. Exemplary landslide sites were investigated in order to determine the characteristic features of specific regions. In the Chiconquiaco Mountain Range, numerous damaging landslide events occurred in 2013, and our case study concerns a deep-seated landslide originating from this slide-intensive year. The main scientific focus is on the reconstruction of the landslide's geometry and its process dynamics; surface and subsurface analyses therefore form the basis of a multimethodological approach. For the surface analysis, aerial photographs were collected by an unmanned aerial vehicle (UAV) to generate a 3D model with the Structure from Motion (SfM) work routine. Ground control points (GCPs) were used to ensure the geometric accuracy of the model. The obtained DEM of the 2013 slide mass, together with an elevation model representing the topographic situation before the event (year 2011), was used to detect surface changes. The data enabled determination of the most affected areas as well as areas characterized by secondary movements, and the volume of the slide mass could be calculated. Geophysical methods, namely electrical resistivity tomography (ERT) and seismic refraction tomography (SRT), were applied for the subsurface analysis. Differences in subsurface composition and density allowed separation of the slide mass from the underlying unit. Most relevant for our study is the detection of an earlier landslide, which leads to the assumption that the 2013 landslide event corresponds to a reactivation process. This multimethodological approach enables a far-reaching visualization of complex landslides and strongly supports the reconstruction of their interior structures and process dynamics.

  11. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.

  12. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications

    PubMed Central

    2018-01-01

    Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events. PMID:29614060

  13. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications.

    PubMed

    Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just

    2018-04-03

    Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.

  14. A Zn(2+)-responsive highly sensitive fluorescent probe and 1D coordination polymer based on a coumarin platform.

    PubMed

    Kumar, Virendra; Kumar, Ajit; Diwan, Uzra; Upadhyay, K K

    2013-09-28

    A coumarin-based Schiff base (receptor 1) exhibited fluorescence enhancement selectively with Zn(2+) at the nanomolar level in near-aqueous medium (EtOH-H2O; 1:1, v/v). The response was instantaneous, with a detection limit of 3.26 × 10(-9) M. The sensing event is thought to involve a combined effect of intramolecular charge transfer (ICT), chelation-enhanced fluorescence (CHEF) and C=N isomerization mechanisms. Various spectroscopic methods, viz. IR, UV-visible, fluorescence and NMR, in association with single-crystal XRD studies, were used for thorough investigation of the structure of receptor 1 as well as of the sensing event. The Zn(2+) complex of receptor 1 exhibited a well-defined 1D chain coordination polymer framework in its single-crystal XRD structure.
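
    A detection limit like the one reported is commonly estimated with the standard 3σ/slope formula from blank replicates and a linear fluorescence calibration. Whether the authors used exactly this estimator is an assumption; the sketch below simply shows the formula on hypothetical numbers:

```python
import numpy as np

def detection_limit(blank_signals, conc, signal):
    """3*sigma/slope limit of detection from blank replicates and a linear
    calibration curve (the standard IUPAC-style formula)."""
    slope = np.polyfit(conc, signal, 1)[0]        # calibration sensitivity
    return 3.0 * np.std(blank_signals, ddof=1) / slope
```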

  15. Introduction to State Estimation of High-Rate System Dynamics

    PubMed Central

    Dodson, Jacob; Joyce, Bryan

    2018-01-01

    Engineering systems experiencing high-rate dynamic events, including airbags, debris detection, and active blast protection systems, could benefit from real-time observability for enhanced performance. However, the task of high-rate state estimation is challenging, in particular for real-time applications where the rate of the observer’s convergence needs to be in the microsecond range. This paper identifies the challenges of state estimation of high-rate systems and discusses the fundamental characteristics of high-rate systems. A survey of applications and methods for estimators that have the potential to produce accurate estimations for a complex system experiencing highly dynamic events is presented. It is argued that adaptive observers are important to this research. In particular, adaptive data-driven observers are advantageous due to their adaptability and lack of dependence on the system model. PMID:29342855

  16. Case study on complex sporadic E layers observed by GPS radio occultations

    NASA Astrophysics Data System (ADS)

    Yue, X.; Schreiner, W. S.; Zeng, Z.; Kuo, Y.-H.; Xue, X.

    2015-01-01

    The occurrence of sporadic E (Es) layers has long been a hot scientific topic. GNSS (global navigation satellite system)-based radio occultation (RO) has proven to be a powerful technique for detecting Es layers globally. In this paper, we focus on cases of complex Es layers in the RO data from multiple missions processed at UCAR/CDAAC (the University Corporation for Atmospheric Research (UCAR) Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) Data Analysis and Archive Center (CDAAC)). We first show examples of multiple Es layers occurring within a single RO event. Based on comparisons between colocated simultaneous RO events and between RO and lidar observations, we conclude that some of these cases do manifest genuine multiple Es layer structures. We then show a case of Es occurring over a broad region during a certain time interval; this result is validated by independent ionosonde observations. It is possible to explain these complex Es structures using the popular wind shear theory. As more RO data become available, we could map the global Es occurrence routinely in the near future, and further statistical studies will enhance our understanding of the Es mechanism. Understanding Es should benefit both Es-based long-distance communication and accurate neutral RO retrievals.

  17. Efficient method for events detection in phonocardiographic signals

    NASA Astrophysics Data System (ADS)

    Martinez-Alajarin, Juan; Ruiz-Merino, Ramon

    2005-06-01

    The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer a patient to a cardiologist. To improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist the physician at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this method often fails to detect the main sounds when additional sounds and murmurs exist, or it may merge several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of the murmurs is detected, which aids in differentiating diseases that can occur at the same temporal localization. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
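
    Detecting relative maxima of the amplitude envelope, rather than applying a single global threshold, can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm; the smoothing window, minimum separation, and prominence parameters are assumptions:

```python
import numpy as np
from scipy.signal import hilbert, find_peaks

def pcg_events(x, fs, min_sep=0.05, prom=0.1):
    """Locate candidate phonocardiogram events as relative maxima of the
    amplitude envelope. Returns (event times in s, envelope heights)."""
    env = np.abs(hilbert(x))
    # light smoothing so ringing inside one sound does not split its peak
    w = max(1, int(0.02 * fs))
    env = np.convolve(env, np.ones(w) / w, mode="same")
    env /= env.max() + 1e-12
    peaks, _ = find_peaks(env, distance=int(min_sep * fs), prominence=prom)
    return peaks / fs, env[peaks]
```

    Because a prominence criterion is relative to the local envelope, a soft murmur between two loud sounds is still reported as a separate event instead of being merged or missed, which is the failure mode of a single global threshold described above.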

  18. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular measurements than event-time measurements), and incorporating the time factor through a time-decay coefficient that ascribes higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection.
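
    The two weighting ideas, class weights that blur the size imbalance and an exponential time-decay on sample importance, can be sketched with a generic SVM implementation. The data, decay constant, and kernel below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Imbalanced stand-in data: many "normal" readings, few "event" readings,
# ordered in time (the event readings are the most recent samples here).
X = np.vstack([rng.normal(0.0, 1.0, size=(400, 3)),
               rng.normal(2.5, 1.0, size=(25, 3))])
y = np.array([0] * 400 + [1] * 25)

# Class weights blur the class-size difference; an exponential time-decay
# sample weight ascribes higher importance to recent observations.
age = np.arange(len(X))[::-1]              # 0 = most recent sample
sample_w = np.exp(-age / 500.0)
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y, sample_weight=sample_w)
```

    In the paper's pipeline, the per-time-step classifications produced this way feed a subsequent sequence analysis that decides whether a run of outliers constitutes a contamination event.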

  19. Alternative splicing regulated by butyrate in bovine epithelial cells.

    PubMed

    Wu, Sitao; Li, Congjun; Huang, Wen; Li, Weizhong; Li, Robert W

    2012-01-01

    As a signaling molecule and an inhibitor of histone deacetylases (HDACs), butyrate exerts its impact on a broad range of biological processes, such as apoptosis and cell proliferation, in addition to its critical role in energy metabolism in ruminants. This study examined the effect of butyrate on alternative splicing in bovine epithelial cells using RNA-seq technology. Junction reads account for 11.28 and 12.32% of total mapped reads in the butyrate-treated (BT) and control (CT) groups, respectively. The 201,326 potential splicing junctions detected were supported by ≥3 junction reads. Approximately 94% of these junctions conformed to the consensus sequence (GT/AG), while ~3% were GC/AG junctions; no AT/AC junctions were observed. A total of 2,834 exon skipping events, each supported by a minimum of 3 junction reads, were detected. At least 7 genes whose mRNA expression was significantly affected by butyrate also had exon skipping events differentially regulated by butyrate. Furthermore, COL5A3, which was induced 310-fold by butyrate (FDR < 0.001) at the gene level, had a significantly higher number of junction reads mapped to Exon#8 (donor) and Exon#11 (acceptor) in BT. This event has the potential to produce a COL5A3 mRNA isoform missing 2 of the 69 exons. In addition, 216 differentially expressed transcript isoforms regulated by butyrate were detected; for example, Isoform 1 of ORC1 was strongly repressed by butyrate while Isoform 2 remained unchanged. Butyrate physically binds to and inhibits all zinc-dependent HDACs except HDAC6 and HDAC10. Our results provide evidence that butyrate also regulates the deacetylase activities of classical HDACs via its transcriptional control. Moreover, thirteen gene fusion events differentially affected by butyrate were identified. Our results provide a snapshot of the complex transcriptome dynamics regulated by butyrate, which will facilitate our understanding of the biological effects of butyrate and other HDAC inhibitors.
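
    Classifying a splice junction by its donor/acceptor dinucleotides (the GT/AG consensus versus the rarer GC/AG) reduces to extracting the first and last two intronic bases. A minimal sketch, using hypothetical 0-based intron coordinates rather than the study's actual pipeline:

```python
def junction_motif(genome, intron_start, intron_end):
    """Donor/acceptor dinucleotides of an intron spanning the 0-based
    half-open interval [intron_start, intron_end); 'GT/AG' is canonical."""
    donor = genome[intron_start:intron_start + 2]
    acceptor = genome[intron_end - 2:intron_end]
    return donor + "/" + acceptor
```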

  20. In Vivo Assessment of Protease Dynamics in Cutaneous Wound Healing by Degradomics Analysis of Porcine Wound Exudates*

    PubMed Central

    Sabino, Fabio; Hermes, Olivia; Egli, Fabian E.; Kockmann, Tobias; Schlage, Pascal; Croizat, Pierre; Kizhakkedathu, Jayachandran N.; Smola, Hans; auf dem Keller, Ulrich

    2015-01-01

    Proteases control complex tissue responses by modulating inflammation, cell proliferation and migration, and matrix remodeling. All these processes are orchestrated in cutaneous wound healing to restore the skin's barrier function upon injury. Altered protease activity has been implicated in the pathogenesis of healing impairments, and proteases are important targets in diagnosis and therapy of this pathology. Global assessment of proteolysis at critical turning points after injury will define crucial events in acute healing that might be disturbed in healing disorders. As optimal biospecimens, wound exudates contain an ideal proteome to detect extracellular proteolytic events, are noninvasively accessible, and can be collected at multiple time points along the healing process from the same wound in the clinics. In this study, we applied multiplexed Terminal Amine Isotopic Labeling of Substrates (TAILS) to globally assess proteolysis in early phases of cutaneous wound healing. By quantitative analysis of proteins and protein N termini in wound fluids from a clinically relevant pig wound model, we identified more than 650 proteins and discerned major healing phases through distinctive abundance clustering of markers of inflammation, granulation tissue formation, and re-epithelialization. TAILS revealed a high degree of proteolysis at all time points after injury by detecting almost 1300 N-terminal peptides in ∼450 proteins. Quantitative positional proteomics mapped pivotal interdependent processing events in the blood coagulation and complement cascades, temporally discerned clotting and fibrinolysis during the healing process, and detected processing of complement C3 at distinct time points after wounding and by different proteases. Exploiting data on primary cleavage specificities, we related candidate proteases to cleavage events and revealed processing of the integrin adapter protein kindlin-3 by caspase-3, generating new hypotheses for protease-substrate relations in the healing skin wound in vivo. The data have been deposited to the ProteomeXchange Consortium with identifier PXD001198. PMID:25516628

  1. Passive (Micro-) Seismic Event Detection by Identifying Embedded "Event" Anomalies Within Statistically Describable Background Noise

    NASA Astrophysics Data System (ADS)

    Baziw, Erick; Verbeek, Gerald

    2012-12-01

    Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal-to-noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and a PSM system with poor event detection can therefore easily acquire terabytes of useless data because it fails to identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
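
    The SEED™ algorithm itself is proprietary, but the general idea of recursive filtering for signal enhancement followed by threshold-based detection can be sketched as follows (an illustrative stand-in, not the published algorithm; the smoothing constant, threshold factor, and warm-up length are assumptions):

```python
import math

# Illustrative sketch, NOT the proprietary SEED(TM) algorithm: a recursive
# exponential filter enhances the energy envelope of a noisy trace, and an
# event is declared where the envelope exceeds a multiple of the background
# level estimated from the warm-up samples.

def enhance(trace, alpha=0.1):
    """Recursively smooth the squared signal into a crude energy envelope."""
    env, out = 0.0, []
    for x in trace:
        env = (1 - alpha) * env + alpha * x * x
        out.append(env)
    return out

def detect_events(trace, alpha=0.1, k=5.0, warmup=50):
    """Indices whose envelope exceeds k times the background level."""
    env = enhance(trace, alpha)
    background = sum(env[:warmup]) / warmup
    return [i for i, e in enumerate(env) if i >= warmup and e > k * background]

# Demo: weak oscillatory background with a stronger burst at samples 100-119.
trace = [0.1 * math.sin(0.3 * i) for i in range(300)]
for i in range(100, 120):
    trace[i] = 2.0
events = detect_events(trace)
```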

  2. Development of a database and processing method for detecting hematotoxicity adverse drug events.

    PubMed

    Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi

    2015-01-01

    Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events when detection relies on recognition by individual doctors. We developed a system that detects hematotoxicity adverse events from blood test results recorded in an electronic medical record system. The blood test results were graded based on the Common Terminology Criteria for Adverse Events (CTCAE), and changes in the results (Up, Down, Flat) were assessed according to the variation in grade. The changes in the blood test results and the injection data were stored in a database. By comparing the date of injection with the start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grade 3 or 4) involving WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rates of occurrence of a decreased WBC count, increased ALT level and increased creatinine level were 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
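
    A minimal sketch of the described pipeline follows. The grading thresholds are simplified examples, not the official CTCAE cut-offs, and the attribution rule (a worsening that starts on or after an injection) is an assumption:

```python
# Illustrative sketch of the pipeline: grade each blood test result with
# toy CTCAE-style cut-offs, derive the direction of grade change, and flag
# serious worsenings whose onset follows a drug administration date.

def grade_wbc(value):
    """Toy CTCAE-style grade for a WBC count in 10^9 cells/L (illustrative)."""
    if value < 1.0: return 4
    if value < 2.0: return 3
    if value < 3.0: return 2
    if value < 4.0: return 1
    return 0

def classify_changes(results):
    """results: [(day, value)]; returns [(day, grade, direction)]."""
    out, prev = [], None
    for day, value in results:
        g = grade_wbc(value)
        d = "Flat" if prev is None or g == prev else ("Up" if g > prev else "Down")
        out.append((day, g, d))
        prev = g
    return out

def adverse_events(results, injection_days, min_grade=3):
    """Serious worsenings (grade >= min_grade) on or after an injection."""
    return [(day, g) for day, g, d in classify_changes(results)
            if g >= min_grade and d == "Up"
            and any(inj <= day for inj in injection_days)]
```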

  3. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    PubMed

    Lee, Seong-Hun

    2014-11-01

    About 80 biotech crop events have been approved through safety assessment in Korea. They are controlled by the genetically modified organism (GMO) and living modified organism (LMO) labeling systems, and DNA-based detection methods have been used as efficient scientific management tools. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. Its specificity was confirmed and its sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. Its specificity was confirmed and its sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip are efficient detection methods available for screening, gene-specific and event-specific analysis of biotech crops, saving both workload and time. © 2014 Society of Chemical Industry.

  4. Characterization of fusion genes and the significantly expressed fusion isoforms in breast cancer by hybrid sequencing.

    PubMed

    Weirather, Jason L; Afshar, Pegah Tootoonchi; Clark, Tyson A; Tseng, Elizabeth; Powers, Linda S; Underwood, Jason G; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai

    2015-10-15

    We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites, and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from MCF-7 breast cancer cells. Compared with existing tools, IDP-fusion detects fusion genes with higher precision and a very low false-positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Jiang, Huaiguang; Tan, Jin

    This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. A wavelet-based event detection and location approach is then used to detect and locate an event, which serves as the trigger for network reconfiguration. With the detected information, the system is reconfigured using a hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
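
    The wavelet-based trigger can be illustrated with a single-level Haar transform (a simplification of the paper's approach; the threshold factor `k` and the median-based noise scale are assumptions):

```python
# Illustrative sketch of a wavelet-based event trigger: a single-level Haar
# transform stands in for the paper's wavelet analysis. Abrupt events
# produce large detail coefficients, which serve as the reconfiguration
# trigger.

def haar_details(x):
    """Single-level Haar detail coefficients of a sequence (even length)."""
    return [(x[2 * i] - x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]

def locate_event(x, k=10.0):
    """Start index of the pair holding the dominant detail coefficient if it
    exceeds k times the median absolute coefficient, else None."""
    d = [abs(c) for c in haar_details(x)]
    med = sorted(d)[len(d) // 2] or 1e-12
    i = max(range(len(d)), key=d.__getitem__)
    return 2 * i if d[i] > k * med else None

# Demo: a smooth ramp, and the same ramp with a step change at sample 61.
smooth = [0.01 * i for i in range(100)]
stepped = [v + (5.0 if i >= 61 else 0.0) for i, v in enumerate(smooth)]
```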

  6. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach, rather than a simple dilution field of regard approach, that allows xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
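
    The combination of detections and non-detections by Bayes' rule can be sketched over a discrete grid of candidate locations (the two-location grid and detection probabilities below are invented for illustration; the authors' actual model couples this to ATM backtrack fields):

```python
# Illustrative sketch (not the authors' model): candidate source locations
# are updated by Bayes' rule using both detections and non-detections.
# p_det[s][loc] is an assumed probability that station s detects a release
# originating at loc.

def posterior(prior, p_det, observed):
    """prior: {loc: p}; p_det: {station: {loc: p}}; observed: {station: bool}."""
    post = dict(prior)
    for station, seen in observed.items():
        for loc in post:
            p = p_det[station][loc]
            post[loc] *= p if seen else (1.0 - p)
    z = sum(post.values())
    return {loc: v / z for loc, v in post.items()}

# Demo: a xenon detection at station s1 shifts the posterior toward A.
prior = {"A": 0.5, "B": 0.5}
p_det = {"s1": {"A": 0.9, "B": 0.1}}
post = posterior(prior, p_det, {"s1": True})
```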

  7. Real-time distributed fiber optic sensor for security systems: Performance, event classification and nuisance mitigation

    NASA Astrophysics Data System (ADS)

    Mahmoud, Seedahmed S.; Visagathilagar, Yuvaraja; Katsifolis, Jim

    2012-09-01

    The success of any perimeter intrusion detection system depends on three important performance parameters: the probability of detection (POD), the nuisance alarm rate (NAR), and the false alarm rate (FAR). The most fundamental parameter, POD, is normally related to a number of factors such as the event of interest, the sensitivity of the sensor, the installation quality of the system, and the reliability of the sensing equipment. The suppression of nuisance alarms without degrading sensitivity in fiber optic intrusion detection systems is key to maintaining acceptable performance. Signal processing algorithms that maintain the POD and eliminate nuisance alarms are crucial for achieving this. In this paper, a robust event classification system using supervised neural networks together with a level crossings (LCs) based feature extraction algorithm is presented for the detection and recognition of intrusion and non-intrusion events in a fence-based fiber-optic intrusion detection system. A level crossings algorithm is also used with a dynamic threshold to suppress torrential rain-induced nuisance alarms in a fence system. Results show that rain-induced nuisance alarms can be suppressed for rainfall rates in excess of 100 mm/hr with the simultaneous detection of intrusion events. The use of a level crossing based detection and novel classification algorithm is also presented for a buried pipeline fiber optic intrusion detection system for the suppression of nuisance events and discrimination of intrusion events. The sensor employed for both types of systems is a distributed bidirectional fiber-optic Mach-Zehnder (MZ) interferometer.
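
    The level-crossings feature extraction described above can be sketched as counting upward crossings of a set of amplitude levels over a signal window (the levels and window are invented; the paper pairs such features with a supervised neural network classifier, not shown):

```python
# Illustrative sketch of level-crossings (LC) feature extraction: count
# upward crossings of several amplitude levels over a window. The counts
# form a compact feature vector for a classifier; a dynamic threshold can
# be emulated by scaling the levels with a running noise estimate.

def level_crossings(window, levels):
    """Number of upward crossings of each level in `levels`."""
    return [sum(1 for a, b in zip(window, window[1:]) if a <= lv < b)
            for lv in levels]
```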

  8. Optimising 4-D surface change detection: an approach for capturing rockfall magnitude-frequency

    NASA Astrophysics Data System (ADS)

    Williams, Jack G.; Rosser, Nick J.; Hardy, Richard J.; Brain, Matthew J.; Afana, Ashraf A.

    2018-02-01

    We present a monitoring technique tailored to analysing change from near-continuously collected, high-resolution 3-D data. Our aim is to fully characterise geomorphological change typified by an event magnitude-frequency relationship that adheres to an inverse power law or similar. While recent advances in monitoring have enabled changes in volume across more than 7 orders of magnitude to be captured, event frequency is commonly assumed to be interchangeable with the time-averaged event numbers between successive surveys. Where events coincide or coalesce, or where the mechanisms driving change are not spatially independent, apparent event frequency must be partially determined by survey interval. The data reported have been obtained from a permanently installed terrestrial laser scanner, which permits an increased frequency of surveys. Surveying from a single position raises challenges, given the single viewpoint onto a complex surface and the need for computational efficiency associated with handling a large time series of 3-D data. A workflow is presented that optimises the detection of change by filtering and aligning scans to improve repeatability. An adaptation of the M3C2 algorithm is used to detect 3-D change while overcoming data inconsistencies between scans. Individual rockfall geometries are then extracted and the associated volumetric errors modelled. The utility of this approach is demonstrated using a dataset of ~9 × 10³ surveys acquired at ~1 h intervals over 10 months. The magnitude-frequency distribution of rockfall volumes generated is shown to be sensitive to monitoring frequency. Using a 1 h interval between surveys, rather than 30 days, the volume contribution from small (< 0.1 m³) rockfalls increases from 67 % to 98 % of the total, and the number of individual rockfalls observed increases by over 3 orders of magnitude.
    High-frequency monitoring therefore holds considerable implications for magnitude-frequency derivatives, such as hazard return intervals and erosion rates. As such, while high-frequency monitoring has the potential to describe short-term controls on geomorphological change and more realistic magnitude-frequency relationships, the assessment of longer-term erosion rates may be better suited to less frequent data collection with lower cumulative errors.
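
    The sensitivity of apparent event frequency to survey interval can be illustrated with a toy coalescence model (the merge-by-location rule and the event list are assumptions for illustration only):

```python
# Toy illustration of why survey interval matters: rockfalls that occur at
# the same location between two successive surveys are observed as one
# coalesced event whose volume is the sum, depressing the apparent
# frequency of small events.
from collections import defaultdict

def observed_events(events, interval):
    """events: (time, location, volume) triples; returns the sorted volumes
    that a survey every `interval` time units would record."""
    merged = defaultdict(float)
    for t, loc, vol in events:
        merged[(int(t // interval), loc)] += vol
    return sorted(merged.values())

# Three real rockfalls; a slower survey cadence sees only two larger events.
falls = [(0.2, "a", 0.05), (0.9, "a", 0.05), (1.2, "b", 0.1)]
```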

  9. Detecting event-related changes in organizational networks using optimized neural network models.

    PubMed

    Li, Ze; Sun, Duoyong; Zhu, Renqi; Lin, Zihan

    2017-01-01

    Changes in an organization's external behavior are caused by its internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks can be used to efficiently monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers this correlation and can be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes can be especially useful for providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods in two cases. The results suggested that the proposed method can identify organizational events based on the correlation between organizational networks and events, and that it not only has higher precision but is also more robust than previously used techniques.
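
    A toy stand-in for the optimization step follows (not the authors' BPNN: here PSO fits a single logistic neuron on invented "event"/"no event" feature vectors; swarm size, inertia, and acceleration constants are common textbook values):

```python
# Toy stand-in for the paper's GA/PSO-optimized BPNNs: Particle Swarm
# Optimization fits the weights of a single logistic neuron separating
# invented "no event" (label 0) from "event" (label 1) feature vectors.
import math, random

def predict(w, x):
    """Logistic neuron: w[0] is the bias, w[1:] the input weights."""
    s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    s = max(-60.0, min(60.0, s))  # clamp to avoid exp overflow
    return 1.0 / (1.0 + math.exp(-s))

def loss(w, data):
    """Sum of squared errors over (features, label) pairs."""
    return sum((predict(w, x) - y) ** 2 for x, y in data)

def pso(data, dim, n=20, iters=60, seed=0):
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    best = [p[:] for p in pos]                         # per-particle best
    gbest = min(best, key=lambda w: loss(w, data))[:]  # swarm best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rnd.random() * (best[i][d] - pos[i][d])
                             + 1.5 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if loss(pos[i], data) < loss(best[i], data):
                best[i] = pos[i][:]
                if loss(best[i], data) < loss(gbest, data):
                    gbest = best[i][:]
    return gbest

# Demo: two clusters of 2-D feature vectors.
data = [([0.0, 0.0], 0), ([0.1, 0.2], 0), ([0.9, 0.8], 1), ([1.0, 1.0], 1)]
w = pso(data, dim=3)
```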

  10. Detecting event-related changes in organizational networks using optimized neural network models

    PubMed Central

    Sun, Duoyong; Zhu, Renqi; Lin, Zihan

    2017-01-01

    Changes in an organization's external behavior are caused by its internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks can be used to efficiently monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers this correlation and can be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes can be especially useful for providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods in two cases. The results suggested that the proposed method can identify organizational events based on the correlation between organizational networks and events, and that it not only has higher precision but is also more robust than previously used techniques. PMID:29190799

  11. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current approaches to event detection utilize a variety of methods, including statistical, heuristic, machine learning, and optimization techniques. Several existing event detection systems share a common feature: alarms are obtained separately for each of the water quality indicators, and these single alarms are usually unified by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented discrete choice model, estimated by maximum likelihood, to integrate the single alarms. The discrete choice model is calibrated jointly with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is often overlooked in existing event detection models, is confirmed to be a crucial part of the system whose performance can be improved by an explicit discrete choice model. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
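
    The fusion step can be sketched as a binary logit model over the single-indicator alarm scores (the weights and bias below are assumed for illustration; in the paper they are estimated by maximum likelihood and jointly calibrated with a genetic algorithm):

```python
# Illustrative sketch of the fusion idea: a logit (discrete choice) model
# maps per-indicator alarm scores to a single event probability instead of
# OR-ing binary alarms with a heuristic.
import math

def event_probability(scores, weights, bias):
    """scores: alarm scores for each water quality indicator."""
    z = bias + sum(w * s for w, s in zip(weights, scores))
    return 1.0 / (1.0 + math.exp(-z))

# Demo with assumed weights for three indicators (e.g. chlorine, pH,
# turbidity): quiet scores give a low event probability, strong
# simultaneous alarms a high one.
quiet = event_probability([0.0, 0.0, 0.0], [2.0, 2.0, 2.0], -3.0)
alarm = event_probability([1.0, 1.0, 1.0], [2.0, 2.0, 2.0], -3.0)
```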

  12. Plasmaspheric Plumes Observed by the CLUSTER and IMAGE Spacecraft

    NASA Technical Reports Server (NTRS)

    Fung, S. F.; Benson, R. F.; Garcia, L. N.; Adrian, M. L.; Sandel, B.; Goldstein, M. L.

    2008-01-01

    Global IMAGE/EUV observations have revealed complex changes in plasmaspheric structures as the plasmasphere responds to geomagnetic activity while remaining under varying degrees of influence by co-rotation, depending on the radial distance. The complex plasmaspheric dynamics, with different scales of variability, are clearly far from being well understood. There is now renewed interest in the plasmasphere due to its apparent connections with the development of the ring current and radiation belt, and with the loss of ionospheric plasmas. Early in the mission, the Cluster spacecraft only crossed the plasmapause (L ≈ 4) occasionally and made measurements of the outer plasmasphere and plasmaspheric drainage plumes. The study by Darrouzet et al. [2006] provided detailed analyses of in situ Cluster observations and IMAGE EUV observations of three plasmaspheric plumes detected in April-June 2002. Within the next couple of years, the Cluster orbit will change, causing perigee to migrate to lower altitudes and thus providing excellent opportunities to obtain more detailed measurements of the plasmasphere. In this paper, we report our analyses of the earlier Cluster-IMAGE events, incorporating the different perspectives provided by the IMAGE Radio Plasma Imager (RPI) observations. We will discuss our new understanding of the structure and dynamics of the Cluster-IMAGE events.

  13. A multidisciplinary weight of evidence approach for environmental risk assessment at the Costa Concordia wreck: Integrative indices from Mussel Watch.

    PubMed

    Regoli, Francesco; Pellegrini, David; Cicero, Anna Maria; Nigro, Marco; Benedetti, Maura; Gorbi, Stefania; Fattorini, Daniele; D'Errico, Giuseppe; Di Carlo, Marta; Nardi, Alessandro; Gaion, Andrea; Scuderi, Alice; Giuliani, Silvia; Romanelli, Giulia; Berto, Daniela; Trabucco, Benedetta; Guidi, Patrizia; Bernardeschi, Margherita; Scarcelli, Vittoria; Frenzilli, Giada

    2014-05-01

    A complex framework of chemical, biological and oceanographic activities was activated immediately after the Costa Concordia shipwreck to assess possible contamination events and the environmental impact during both emergency and wreck removal operations. In the present paper, we describe the results obtained with caged mussels, Mytilus galloprovincialis, chosen as bioindicator organisms to detect variations in bioavailability and the early onset of molecular and cellular effects (biomarkers). Seven translocation experiments were carried out during the first year after the incident, with organisms deployed at 2 depths in 3 different sites. After 4-6 weeks, tissue concentrations were measured for the main classes of potentially released chemicals (trace metals, polycyclic aromatic hydrocarbons, volatile and aliphatic hydrocarbons, polychlorinated biphenyls, halogenated pesticides, organotin compounds, brominated flame retardants, anionic surfactants); a wide battery of biomarkers covered responses indicative of exposure, detoxification, oxidative stress, cell damage and genotoxic effects. Results excluded serious contamination events or a consistent increase in environmental pollution, although some episodic spills with reversible effects were detected. Data were elaborated within a quantitative weight of evidence (WOE) model which provided synthetic hazard indices for each typology of data before their overall integration into an environmental risk index, which generally ranged from slight to moderate. The proposed WOE model was confirmed to be a useful tool for summarizing large and complex datasets in integrative indices and for simplifying the interpretation for stakeholders and decision makers, thus supporting a more comprehensive process of "site-oriented" management decisions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    NASA Astrophysics Data System (ADS)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. 
    The performance of the AST algorithm was quantitatively evaluated under a variety of noise conditions, and seismic detections identified using AST were compared to ancillary injection data. During a period of CO2 injection in a well near the monitoring array, 82% of seismic events were accurately detected, 13% of events were missed, and 5% of detections were determined to be false. Additionally, seismic risk was evaluated from the stress field and faulting regime at FWU to determine the likelihood that pressure perturbations would trigger slip on previously mapped faults. Faults oriented NW-SE were identified as requiring the smallest pore pressure changes to trigger slip; faults oriented N-S may also be reactivated, although this is less likely.
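
    For reference, the classical STA/LTA trigger that AST adapts can be sketched as follows (window lengths and threshold are illustrative; AST's contribution, not shown here, is tuning such parameters per station for consistency with neighboring sensors):

```python
# Illustrative sketch of the classical STA/LTA trigger: the ratio of
# short-term to long-term average signal energy rises sharply at an event
# onset. Window lengths and threshold are assumptions.

def sta_lta(trace, ns=5, nl=50):
    """STA/LTA ratio per sample (zero until the long window is filled)."""
    e = [x * x for x in trace]
    out = []
    for i in range(len(e)):
        if i < nl:
            out.append(0.0)
            continue
        sta = sum(e[i - ns:i]) / ns
        lta = sum(e[i - nl:i]) / nl
        out.append(sta / lta if lta > 0 else 0.0)
    return out

def trigger(trace, threshold=4.0, **kw):
    """Index of the first sample whose ratio reaches the threshold."""
    r = sta_lta(trace, **kw)
    return next((i for i, v in enumerate(r) if v >= threshold), None)

# Demo: low-level background with an emergent event at sample 100.
trace = [0.1] * 100 + [1.0] * 20
```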

  15. Spatio-temporal Event Classification using Time-series Kernel based Structured Sparsity

    PubMed Central

    Jeni, László A.; Lőrincz, András; Szabó, Zoltán; Cohn, Jeffrey F.; Kanade, Takeo

    2016-01-01

    In many behavioral domains, such as facial expression and gesture, sparse structure is prevalent. This sparsity would be well suited for event detection but for one problem. Features typically are confounded by alignment error in space and time. As a consequence, high-dimensional representations such as SIFT and Gabor features have been favored despite their much greater computational cost and potential loss of information. We propose a Kernel Structured Sparsity (KSS) method that can handle both the temporal alignment problem and the structured sparse reconstruction within a common framework, and it can rely on simple features. We characterize spatio-temporal events as time-series of motion patterns and by utilizing time-series kernels we apply standard structured-sparse coding techniques to tackle this important problem. We evaluated the KSS method using both gesture and facial expression datasets that include spontaneous behavior and differ in degree of difficulty and type of ground truth coding. KSS outperformed both sparse and non-sparse methods that utilize complex image features and their temporal extensions. In the case of early facial event classification KSS had 10% higher accuracy as measured by F1 score over kernel SVM methods. PMID:27830214
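
    The time-series-kernel ingredient can be sketched with a dynamic time warping (DTW) distance turned into a similarity (a common construction; the paper's exact kernel and the `gamma` scale here are not taken from the source):

```python
# Illustrative sketch: dynamic time warping (DTW) compares motion-pattern
# series that are misaligned in time, and exp(-d / gamma) converts the
# distance into a kernel-style similarity.
import math

def dtw(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def ts_similarity(a, b, gamma=1.0):
    """Kernel-style similarity in (0, 1]; 1.0 for identical series."""
    return math.exp(-dtw(a, b) / gamma)
```

    Note that DTW absorbs temporal stretching: a series compared with a time-dilated copy of itself still has distance zero.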

  16. Sexual Initiation and Complex Recent Polydrug Use Patterns Among Male Sex Workers in Vietnam: A Preliminary Epidemiological Trajectory.

    PubMed

    Yu, Gary; Goldsamt, Lloyd A; Clatts, Michael C; Giang, Lê Minh

    2016-05-01

    Little is known about the age of onset of sexual and drug risk and their association with complex patterns of recent drug use among male sex workers (MSWs) in a developing country such as Vietnam. The aim of this study was to determine whether latent class analysis (LCA) would aid in the detection of current individual and polydrug use combinations and predict how different trajectories of sexual and drug initiation contribute to different patterns of current illicit drug use. Data were collected from a cross-sectional survey administered to young MSWs between 2010 and 2011 in Vietnam (N = 710). LCA clustered participants into recent drug use groups, incorporating both the specific types and the overall count of different drugs used. Men reported drug use within a 1-month period from an 11-item drug use list. LCA identified three distinct drug use classes: (1) alcohol use, (2) alcohol and tobacco use, and (3) high polydrug use. The current drug use classes are associated with sex worker status, housing stability, income level, educational attainment, marital status, sexual identity, and sexual preferences. High levels of drug use are strongly associated with recent sex work, a lack of recent stable housing, above-median income, more than a high school education, a lower likelihood of currently being in school, and a higher likelihood of non-homosexual preferences and heterosexual partners. An event history analysis approach (time-event displays) examined the timing of the age of onset of drug and sexual risks. Early ages of drug and sexual initiation are seen for all three classes. High current drug users show earlier onset of these risks, which are significantly delayed for moderate and low current drug users. LCA incorporating an overall count of different drugs detected three distinct current drug use classes.
    The data illustrate the complexity of drug-use factors that must be accounted for, both in advancing our epidemiological understanding of drug use and in using drug and sexual risk initiation data to predict current drug use subtypes among high-risk populations.

  17. Initial Evaluation of Signal-Based Bayesian Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Russell, S.

    2016-12-01

    We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.

  18. Boolean Logic Tree of Label-Free Dual-Signal Electrochemical Aptasensor System for Biosensing, Three-State Logic Computation, and Keypad Lock Security Operation.

    PubMed

    Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing

    2017-09-19

    The most serious unsolved problems of molecular logic computing are how to connect molecular events in complex systems into a usable device with specific functions, and how to selectively control branchy logic processes within cascading logic systems. This report demonstrates that a Boolean logic tree can be utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) for developing a dual-signal electrochemical evolution aptasensor system with good resettability for amplified detection of thrombin, controllable and selectable three-state logic computation, and keypad lock security operation. The aptasensor system combines the merits of a DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin, with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using these chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented, allowing the output ports to assume a high-impedance (Z) state in addition to the 0 and 1 logic levels and thereby effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integrated circuits for application in biomedical engineering, intelligent sensing, and control.

  19. Implementation of a portable device for real-time ECG signal analysis.

    PubMed

    Jeon, Taegyun; Kim, Byoungho; Jeon, Moongu; Lee, Byung-Geun

    2014-12-10

    Cardiac disease is one of the main causes of mortality. Therefore, detecting the symptoms of cardiac disease as early as possible is important for increasing the patient's chances of survival. In this study, a compact and effective architecture for detecting atrial fibrillation (AFib) and myocardial ischemia is proposed. We developed a portable device using this architecture, which allows real-time electrocardiogram (ECG) signal acquisition and analysis for cardiac diseases. The noisy ECG signal was preprocessed by an analog front-end consisting of analog filters and amplifiers before being converted into digital data. The analog front-end was minimized, to reduce the size of the device and its power consumption, by implementing some of its functions as digital filters realized in software. With the ECG data, we detected QRS complexes based on wavelet analysis and extracted features describing morphological shape and regularity using an ARM processor. A classifier for cardiac disease was constructed from features extracted from a training dataset using support vector machines; the classifier then categorized the ECG data into normal beats, AFib, and myocardial ischemia. The portable ECG device was implemented, and successfully acquired and processed ECG signals. Its performance was verified by comparing the processed ECG data with high-quality ECG data from a public cardiac database. Because of the reduced computational complexity, the ARM processor was able to process up to a thousand samples per second, allowing real-time acquisition and diagnosis of heart disease. Experimental results showed that the device classified AFib and ischemia with a sensitivity of 95.1% and a specificity of 95.9%. Current home care and telemedicine systems have a separate device and diagnostic service system, which results in additional time and cost. Our proposed portable ECG device provides the captured ECG data and suspect waveforms to identify sporadic and chronic cardiac events. The device has been built and evaluated for high signal quality, low computational complexity, and accurate detection.
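    The QRS-detection step described in this record (wavelet analysis followed by classification) can be illustrated with a much simpler moving-window energy detector in the Pan-Tompkins style. This is a sketch under assumed parameters (window length, threshold ratio, refractory period), not the device's actual firmware:

```python
# Simplified QRS detector: differentiate, square, integrate over a moving
# window, then threshold. Illustrative stand-in only -- the paper uses
# wavelet analysis on an ARM processor; all parameters here are assumptions.

def detect_qrs(ecg, fs, window_ms=150, threshold_ratio=0.5):
    """Return sample indices of detected QRS complexes."""
    # Derivative emphasises the steep QRS slopes.
    deriv = [ecg[i + 1] - ecg[i] for i in range(len(ecg) - 1)]
    squared = [d * d for d in deriv]
    # Moving-window integration smooths the squared slope signal.
    win = max(1, int(fs * window_ms / 1000))
    integrated = [sum(squared[max(0, i - win):i + 1]) / win
                  for i in range(len(squared))]
    thresh = threshold_ratio * max(integrated)
    refractory = int(0.2 * fs)          # ignore re-triggers within 200 ms
    peaks, last = [], -refractory
    for i, v in enumerate(integrated):
        if v > thresh and i - last > refractory:
            peaks.append(i)
            last = i
    return peaks

# Synthetic ECG: flat baseline with sharp spikes once per second at 250 Hz.
fs = 250
ecg = [0.0] * (4 * fs)
for beat in (fs, 2 * fs, 3 * fs):
    ecg[beat] = 1.0
print(detect_qrs(ecg, fs))   # three detections, one near each spike
```

    On a real recording, the signal would first pass through the analog and digital filtering stages described above before any such detector runs.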

  20. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
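    The dynamic query expansion idea (iteratively growing a seed vocabulary from terms that co-occur in matched tweets) can be sketched on a toy corpus. The co-occurrence scoring, cutoff, and whitespace tokenization below are illustrative assumptions, not the paper's algorithm:

```python
# Toy dynamic query expansion: start from seed terms, retrieve tweets that
# contain any current term, and add terms that co-occur in enough matched
# tweets. Scoring and cutoff are illustrative assumptions.

def expand_query(tweets, seeds, min_count=2, iterations=3):
    terms = set(seeds)
    for _ in range(iterations):
        matched = [set(t.lower().split()) for t in tweets
                   if terms & set(t.lower().split())]
        counts = {}
        for words in matched:
            for w in words - terms:
                counts[w] = counts.get(w, 0) + 1
        new_terms = {w for w, c in counts.items() if c >= min_count}
        if not new_terms:
            break
        terms |= new_terms
    return terms

tweets = [
    "protest downtown today",
    "huge protest near plaza",
    "plaza protest turning violent",
    "crowds gather at plaza",
    "crowds swelling downtown",
    "soccer match tonight",
]
print(sorted(expand_query(tweets, {"protest"})))   # ['plaza', 'protest']
```

    Note that the unrelated tweet never matches, so its vocabulary is never pulled into the expanded query; the full method additionally builds a graph over the matched tweets and applies the anomaly-detection step described above.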

  1. Artificial Neural Network applied to lightning flashes

    NASA Astrophysics Data System (ADS)

    Gin, R. B.; Guedes, D.; Bianchi, R.

    2013-05-01

    The development of video cameras has enabled scientists to study the behavior of lightning discharges with greater precision. The main goal of this project is to create a system able to detect images of lightning discharges stored in videos and classify them using an Artificial Neural Network (ANN), implemented in C with OpenCV libraries. The developed system can be split into two modules: a detection module and a classification module. The detection module uses OpenCV's computer vision libraries and image processing techniques to detect whether there are significant differences between frames in a sequence, indicating that something, still unclassified, has occurred. Whenever there is a significant difference between two consecutive frames, two main algorithms are used to analyze the frame image: a brightness algorithm and a shape algorithm. These algorithms detect both the shape and the brightness of the event, removing irrelevant events such as birds, and determine the exact position of relevant events, allowing the system to track them over time. The classification module uses a neural network to classify the relevant events as horizontal or vertical lightning, saves the event's images, and computes its number of discharges. The neural network was implemented using the backpropagation algorithm and was trained with 42 training images containing 57 lightning events (one image can contain more than one lightning event). The ANN was tested with one to five hidden layers, with up to 50 neurons each. The best configuration, a single layer of 20 neurons, achieved a success rate of 95% (33 test images with 42 events were used in this phase). This configuration was implemented in the developed system to analyze 20 video files containing 63 lightning discharges that had previously been detected manually. Results showed that all the lightning discharges were detected, many irrelevant events were discarded, and each event's number of discharges was correctly computed. The neural network used in this project achieved a success rate of 90%. The videos used in this experiment were acquired by seven video cameras installed in São Bernardo do Campo, Brazil, which continuously recorded lightning events during the summer. The cameras were arranged to cover 360°, recording all data at a time resolution of 33 ms. During this period, several convective storms were recorded.
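    The detection module's frame-differencing test can be sketched in pure Python (the real system uses C and OpenCV). The function name and both thresholds below are illustrative assumptions:

```python
# Frame differencing: flag a frame pair as a candidate event when enough
# pixels change brightness between consecutive frames. Pure-Python stand-in
# for the OpenCV pipeline; thresholds are illustrative assumptions.

def significant_change(prev, curr, pixel_delta=30, changed_fraction=0.01):
    """prev/curr: equally sized 2D lists of grayscale values (0-255)."""
    changed = sum(
        1
        for row_p, row_c in zip(prev, curr)
        for p, c in zip(row_p, row_c)
        if abs(c - p) > pixel_delta
    )
    total = len(prev) * len(prev[0])
    return changed / total > changed_fraction

dark = [[10] * 100 for _ in range(100)]
flash = [row[:] for row in dark]
for r in range(30, 70):            # a bright vertical streak appears
    for c in range(45, 55):
        flash[r][c] = 255
print(significant_change(dark, dark))    # False: nothing changed
print(significant_change(dark, flash))   # True: 4% of pixels changed
```

    The brightness and shape algorithms described above would then run only on frame pairs flagged by a test of this kind.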

  2. The impact of symptomatic mild traumatic brain injury on complex everyday activities and the link with alterations in cerebral functioning: Exploratory case studies.

    PubMed

    Bottari, Carolina; Gosselin, Nadia; Chen, Jen-Kai; Ptito, Alain

    2017-07-01

    The objective of the study was to explore the neurophysiological correlates of altered functional independence using functional magnetic resonance imaging (fMRI) and event-related potentials (ERP) after a mild traumatic brain injury (mTBI). The participants were three individuals with symptomatic mTBI (3.9 ± 3.6 months post-mTBI) and 12 healthy controls. The main measures used were the Instrumental Activities of Daily Living (IADL) Profile observation-based assessment; a visual externally ordered working memory task combined with ERP and fMRI recordings; neuropsychological tests; post-concussion symptom questionnaires; and the Activities of Daily Living (ADL) Profile interview. Compared to healthy controls, all three patients had difficulty with a real-world complex budgeting activity due to deficits in planning, ineffective strategy use, and/or a prolonged time to detect and correct errors. Reduced activations in the right mid-dorsolateral prefrontal cortex on fMRI, as well as abnormal frontal or parietal ERP components, occurred alongside these deficits. The results of this exploratory study suggest that reduced independence in complex everyday activities in symptomatic mTBI may be at least partly explained by a decrease in brain activation in the prefrontal cortex, abnormal ERP, or slower reaction times on working memory tasks. The study presents an initial attempt at combining neuroscience research with ecological real-world evaluation to further our understanding of the difficulties in complex everyday activities experienced by individuals with mTBI.

  3. Constitutional Chromoanagenesis of Distal 13q in a Young Adult with Recurrent Strokes.

    PubMed

    Burnside, Rachel D; Harris, April; Speyer, Darrow; Burgin, W Scott; Rose, David Z; Sanchez-Valle, Amarilis

    2016-01-01

    Constitutional chromoanagenesis events, which include chromoanasynthesis and chromothripsis and result in highly complex rearrangements, have been reported for only a few individuals. While rare, these phenomena have likely been underestimated in a constitutional setting as technologies that can accurately detect such complexity are relatively new to the mature field of clinical cytogenetics. G-banding is not likely to accurately identify chromoanasynthesis or chromothripsis, since the banding patterns of chromosomes are likely to be misidentified or oversimplified due to a much lower resolution. We describe a patient who was initially referred for cytogenetic testing as a child for speech delay. As a young adult, he was referred again for recurrent strokes. Chromosome analysis was performed, and the rearrangement resembled a simple duplication of 13q32q34. However, SNP microarray analysis showed a complex pattern of copy number gains and a loss consistent with chromoanasynthesis involving distal 13q (13q32.1q34). This report emphasizes the value of performing microarray analysis for individuals with abnormal or complex chromosome rearrangements. © 2016 S. Karger AG, Basel.

  4. Fighting detection using interaction energy force

    NASA Astrophysics Data System (ADS)

    Wateosot, Chonthisa; Suvonvorn, Nikom

    2017-02-01

    Fighting detection is an important issue in security, aimed at preventing criminal or undesirable events in public places. Many studies have applied computer vision techniques to detect specific events in crowded scenes. In this paper we focus on fighting detection using a social-based Interaction Energy Force (IEF). The method uses low-level features without object extraction and tracking. The interaction force is modeled using the magnitude and direction of optical flows. A fighting factor is derived from this model to detect fighting events with a thresholding method. An energy map of the interaction force is also presented to identify the corresponding events. The evaluation is performed using the NUS-HGA and BEHAVE datasets. The results show high accuracy under a variety of conditions.
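    The thresholding step can be caricatured as follows. The abstract does not give the IEF formulation, so the pairwise "opposing motion" energy below is purely an assumption for illustration, not the paper's model:

```python
# Toy interaction-energy measure over optical-flow vectors: pairs of flow
# vectors moving against each other contribute energy, and a frame is
# flagged as "fighting" when the total exceeds a threshold. Illustrative
# stand-in; the paper's actual IEF model differs.

def interaction_energy(flows):
    """flows: list of (vx, vy) optical-flow vectors for moving regions."""
    energy = 0.0
    for i in range(len(flows)):
        for j in range(i + 1, len(flows)):
            (ax, ay), (bx, by) = flows[i], flows[j]
            dot = ax * bx + ay * by
            if dot < 0:                 # opposing directions
                energy += abs(dot)      # weighted by both magnitudes
    return energy

def is_fighting(flows, threshold=5.0):
    return interaction_energy(flows) > threshold

calm = [(1.0, 0.0), (1.1, 0.1), (0.9, -0.1)]      # crowd drifting together
brawl = [(3.0, 0.0), (-3.0, 0.5), (2.5, -2.5)]    # fast opposing motions
print(is_fighting(calm))    # False
print(is_fighting(brawl))   # True
```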

  5. Aptamer-based microfluidic beads array sensor for simultaneous detection of multiple analytes employing multienzyme-linked nanoparticle amplification and quantum dots labels.

    PubMed

    Zhang, He; Hu, Xinjiang; Fu, Xin

    2014-07-15

    This study reports the development of an aptamer-mediated microfluidic beads-based sensor for the detection and quantification of multiple analytes using multienzyme-linked nanoparticle amplification and quantum dot labels. Adenosine and cocaine were selected as model analytes to validate the assay design, which is based on strand displacement induced by target-aptamer complex formation. Microbeads functionalized with the aptamers and modified electron-rich proteins were arrayed within a microfluidic channel and were connected with horseradish peroxidase (HRP)- and capture-DNA-probe-derivatized gold nanoparticles (AuNPs) via hybridization. The conformational transition of the aptamer induced by target-aptamer complex formation displaces the functionalized AuNPs and decreases the fluorescence signal of the microbeads. In this approach, the increased number of HRP binding events on each nanosphere and the enhanced mass transport inherent to microfluidics are combined to enhance detection sensitivity. Based on this dual signal amplification strategy, the developed aptamer-based microfluidic bead array sensor could detect as little as 0.1 pM adenosine and 0.5 pM cocaine, a 500-fold improvement in the detection limit for adenosine compared to the off-chip test. The results proved that the microfluidic method is a rapid and efficient system for aptamer-based target assays (adenosine, 0.1 pM; cocaine, 0.5 pM), requiring only minimal (microliter) reagent volumes. This work demonstrates the successful application of an aptamer-based microfluidic beads array sensor for the detection of important molecules in biomedical fields. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. A data fusion approach to indications and warnings of terrorist attacks

    NASA Astrophysics Data System (ADS)

    McDaniel, David; Schaefer, Gregory

    2014-05-01

    Indications and Warning (I&W) of terrorist attacks, particularly IED attacks, require detection of networks of agents and patterns of behavior. Social network analysis tries to detect a network; activity analysis tries to detect anomalous activities. This work builds on both to detect elements of an activity model of terrorist attack activity: the agents, resources, networks, and behaviors. The activity model is expressed as RDF triple statements whose tuple positions are elements or subsets of a formal ontology for activity models. The advantage of a model is that its elements are interdependent, so evidence for or against one element influences the others, producing a multiplier effect. The advantage of the formality is that detection can occur hierarchically, that is, at different levels of abstraction. Model matching is expressed as a likelihood ratio between input text and the model triples, designed to be analogous to the track-correlation likelihood ratios common in JDL fusion level 1. This required the development of a semantic distance metric for positive and null hypotheses as well as for complex objects. The metric uses the Web 1T (one-terabyte) database of one- to five-gram frequencies for priors. This scale requires big-data technologies, so a Hadoop cluster is used in conjunction with OpenNLP natural-language processing and Mahout clustering software. Distributed data-fusion MapReduce jobs distribute parts of the data fusion problem to the Hadoop nodes. For this initial testing, open-source models and text inputs of complexity similar to terrorist events were used as surrogates for the intended counter-terrorist application.
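    The likelihood-ratio idea (score evidence text against the model versus a null hypothesis built from n-gram priors) can be caricatured with toy counts. The corpus, counts, and smoothing here are all invented for illustration; the real system draws its priors from the Web 1T n-gram frequencies:

```python
import math

# Toy log-likelihood ratio for model matching: probability of observing the
# evidence terms under an activity model versus under background n-gram
# priors. Counts and smoothing are invented for illustration.

background = {"truck": 500, "rental": 300, "fertilizer": 40, "meeting": 800}
model_terms = {"truck": 0.3, "rental": 0.2, "fertilizer": 0.4, "meeting": 0.1}
total_bg = sum(background.values())

def log_likelihood_ratio(evidence):
    """Sum of per-term log P(term | model) - log P(term | background)."""
    llr = 0.0
    for term in evidence:
        p_model = model_terms.get(term, 1e-6)
        # Add-one smoothing over the background counts.
        p_null = (background.get(term, 0) + 1) / (total_bg + len(background))
        llr += math.log(p_model) - math.log(p_null)
    return llr

# Rare-in-background terms that the model expects score positively;
# a common background term on its own scores negatively.
print(log_likelihood_ratio(["fertilizer", "rental", "truck"]) > 0)  # True
print(log_likelihood_ratio(["meeting"]) > 0)                        # False
```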

  7. Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets

    NASA Astrophysics Data System (ADS)

    Bazzica, A.; van Gemert, J. C.; Liem, C. C. S.; Hanjalic, A.

    2017-05-01

    Acoustic events often have a visual counterpart. Knowledge of visual information can aid the understanding of complex auditory scenes, even when only a stereo mixdown is available in the audio domain, e.g. identifying which musicians are playing in large musical ensembles. In this paper, we consider a vision-based approach to note onset detection. As a case study we focus on challenging, real-world clarinetist videos and carry out preliminary experiments on a 3D convolutional neural network based on multiple streams and purposely avoiding temporal pooling. We release an audiovisual dataset with 4.5 hours of clarinetist videos together with cleaned annotations, which include about 36,000 onsets and the coordinates of a number of salient points and regions of interest. By performing several training trials on our dataset, we learned that the problem is challenging. We found that the CNN model is highly sensitive to the optimization algorithm and hyper-parameters, and that treating the problem as binary classification may prevent the joint optimization of precision and recall. To encourage further research, we publicly share our dataset, annotations, and all models, and detail the issues we came across during our preliminary experiments.

  8. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  9. Electrochemical sensor for multiplex screening of genetically modified DNA: identification of biotech crops by logic-based biomolecular analysis.

    PubMed

    Liao, Wei-Ching; Chuang, Min-Chieh; Ho, Ja-An Annie

    2013-12-15

    Genetic modification (GM), one of the modern biomolecular engineering technologies, has been deemed a profitable strategy to fight global starvation. Yet rapid and reliable analytical methods for evaluating the quality and potential risk of the resulting GM products are lacking. We herein present a biomolecular analytical system constructed from distinct biochemical activities to expedite the computational detection of genetically modified organisms (GMOs). The computational mechanism provides an alternative to the complex procedures commonly involved in the screening of GMOs. Because the bioanalytical system is capable of processing promoter, coding, and species genes, affirmative interpretations succeed in identifying a specified GM event in both electrochemical and optical fashions. The biomolecular computational assay detects genetically modified DNA at sub-nanomolar levels and remains interference-free in the presence of abundant coexisting non-GM DNA. Furthermore, the bioanalytical system operates in an array format for multiplex screening of multiple GM events. Such a biomolecular computational assay and biosensor holds great promise for rapid, cost-effective, and high-fidelity screening of GMOs. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Review of complex networks application in hydroclimatic extremes with an implementation to characterize spatio-temporal drought propagation in continental USA

    NASA Astrophysics Data System (ADS)

    Konapala, Goutam; Mishra, Ashok

    2017-12-01

    The quantification of spatio-temporal hydroclimatic extreme events is key to water resources planning, disaster mitigation, and preparing a climate-resilient society. However, quantification of these extreme events has always been a great challenge, further compounded by climate variability and change. Recently, complex network theory has been applied in the earth science community to investigate spatial connections among hydrologic fluxes (e.g., rainfall and streamflow) in the water cycle, but its applications to hydroclimatic extreme events remain limited. This article provides an overview of complex networks and extreme events, the event synchronization method, the construction of networks, their statistical significance, and the associated network evaluation metrics. For illustration, we apply the complex network approach to study the spatio-temporal evolution of droughts in the continental USA (CONUS). A different drought threshold leads to a different set of drought events, with different socio-economic implications; it is therefore interesting to explore the role of thresholds in the spatio-temporal evolution of drought through network analysis. In this study, the long-term (1900-2016) Palmer drought severity index (PDSI) was selected for spatio-temporal drought analysis using three network-based metrics (strength, direction, and distance). The results indicate that drought events propagate differently at the different thresholds associated with the initiation of drought events. The direction metric indicates that onsets of mild drought events usually propagate in a more spatially clustered and uniform manner than onsets of moderate droughts. The distance metric shows that drought events propagate over longer distances in the western part of CONUS than in the eastern part. We believe that the network-aided metrics utilized in this study can be an important tool in advancing our knowledge of drought propagation as well as of other hydroclimatic extreme events. Although drought propagation is investigated here using the network approach, process-based (physical) approaches are essential to further understand the dynamics of hydroclimatic extreme events.
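    The event synchronization step used to build network edges can be sketched as counting near-coincident event onsets in two binary series, normalized by the event counts. This simplified symmetric version assumes a fixed time lag `tau` rather than the adaptive lag of the original Quiroga-style method:

```python
import math

# Simplified event synchronization between two binary event series: count
# pairs of events occurring within a fixed lag tau, normalized by
# sqrt(n1 * n2). The original method uses an adaptive, asymmetric lag.

def event_sync(series_a, series_b, tau=2):
    times_a = [t for t, v in enumerate(series_a) if v]
    times_b = [t for t, v in enumerate(series_b) if v]
    if not times_a or not times_b:
        return 0.0
    count = sum(1 for ta in times_a for tb in times_b if abs(ta - tb) <= tau)
    return count / math.sqrt(len(times_a) * len(times_b))

drought_a = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]   # onsets at t = 1, 4, 8
drought_b = [0, 0, 1, 0, 1, 0, 0, 0, 0, 1]   # onsets at t = 2, 4, 9
print(event_sync(drought_a, drought_b, tau=1))   # 1.0: fully synchronized
print(event_sync(drought_a, [0] * 10))           # 0.0: no shared events
```

    In a network setting, a score of this kind would be computed between every pair of grid cells and thresholded for significance to define the edges over which strength, direction, and distance are measured.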

  11. Single Versus Multiple Events Error Potential Detection in a BCI-Controlled Car Game With Continuous and Discrete Feedback.

    PubMed

    Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R

    2016-03-01

    This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
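    The ME combination can be sketched as averaging per-event error probabilities from the single-event classifier and comparing the mean to a decision threshold. The classifier outputs below are made-up numbers, not values from the study:

```python
# Multiple-events (ME) combination: average the single-event classifier's
# error probabilities over all events in one MI trial, and flag the trial
# as erroneous when the mean crosses a threshold. Probabilities here are
# invented; a real system would take them from an ErrP classifier.

def trial_is_erroneous(event_error_probs, threshold=0.5):
    """event_error_probs: per-event P(error) from a single-event classifier."""
    mean = sum(event_error_probs) / len(event_error_probs)
    return mean > threshold

# A noisy single-event classifier: individual events are ambiguous, but
# averaging across the event sequence stabilizes the trial-level decision.
correct_trial = [0.35, 0.55, 0.30, 0.45]   # mean 0.4125 -> keep trial
error_trial = [0.65, 0.45, 0.70, 0.60]     # mean 0.60   -> discard/repeat
print(trial_is_erroneous(correct_trial))   # False
print(trial_is_erroneous(error_trial))     # True
```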

  12. Unreported seismic events found far off-shore Mexico using full-waveform, cross-correlation detection method.

    NASA Astrophysics Data System (ADS)

    Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli

    2015-04-01

    A large subset of seismic events do not have impulsive arrivals: low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip of the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One was an impulsive earthquake in the Ometepec area that had clear arrivals on only three stations and was therefore not located and reported by the SSN. The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A rough estimate places these two events on the portion of the East Pacific Rise around 9°N. They are detected despite their distance from the search area owing to favorable move-out across the network of the Mexican National Seismological Service (SSN). We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.

  13. Detection of cough signals in continuous audio recordings using hidden Markov models.

    PubMed

    Matos, Sergio; Birring, Surinder S; Pavord, Ian D; Evans, David H

    2006-06-01

    Cough is a common symptom of many respiratory diseases. The evaluation of its intensity and frequency of occurrence could provide valuable clinical information in the assessment of patients with chronic cough. In this paper we propose the use of hidden Markov models (HMMs) to automatically detect cough sounds from continuous ambulatory recordings. The recording system consists of a digital sound recorder and a microphone attached to the patient's chest. The recognition algorithm follows a keyword-spotting approach, with cough sounds representing the keywords. It was trained on 821 min selected from 10 ambulatory recordings, including 2473 manually labeled cough events, and tested on a database of nine recordings from separate patients with a total recording time of 3060 min and comprising 2155 cough events. The average detection rate was 82% at a false alarm rate of seven events/h, when considering only events above an energy threshold relative to each recording's average energy. These results suggest that HMMs can be applied to the detection of cough sounds from ambulatory patients. A postprocessing stage to perform a more detailed analysis on the detected events is under development, and could allow the rejection of some of the incorrectly detected events.
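    The energy gating mentioned in the results (keeping only detected events above a threshold relative to the recording's average energy) can be sketched as follows; the window indices and the ratio are illustrative assumptions:

```python
# Energy gating of detected events: keep only candidate events whose mean
# energy exceeds a multiple of the whole recording's average energy, as in
# the evaluation above. The ratio and windows are illustrative.

def average_energy(samples):
    return sum(s * s for s in samples) / len(samples)

def gate_events(recording, events, ratio=2.0):
    """events: list of (start, end) sample index pairs for HMM detections."""
    ref = average_energy(recording)
    return [(s, e) for (s, e) in events
            if average_energy(recording[s:e]) > ratio * ref]

# Quiet recording with one loud burst (a cough) and one faint one (noise).
recording = [0.1] * 100
for i in range(40, 50):
    recording[i] = 1.0       # loud candidate
for i in range(70, 80):
    recording[i] = 0.15      # faint candidate
events = [(40, 50), (70, 80)]
print(gate_events(recording, events))   # only the loud event survives
```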

  14. A new moonquake catalog from Apollo 17 geophone data

    NASA Astrophysics Data System (ADS)

    Dimech, Jesse-Lee; Knapmeyer-Endrun, Brigitte; Weber, Renee

    2017-04-01

    New lunar seismic events have been detected in geophone data from the Apollo 17 Lunar Seismic Profiling Experiment (LSPE). This dataset is already known to contain an abundance of thermal seismic events, and potentially some meteorite impacts, but prior to this study only 26 days of LSPE "listening mode" data had been analysed. This new analysis incorporates additional listening-mode data collected between August 1976 and April 1977; to the authors' knowledge, these 8 months of data have not previously been used to detect moonquake events. The geophones in question are situated adjacent to the Apollo 17 site in the Taurus-Littrow valley, about 5.5 km east of the Lee-Lincoln scarp and between the North and South Massifs; any of these features is a potential seismic source. We have used an event-detection and classification technique based on hidden Markov models to automatically detect and categorize seismic signals, in order to objectively generate a seismic event catalog. To date, 2.5 months of the 8-month listening-mode dataset have been processed, totaling 14,338 detections. Of these, 672 detections (classification "n1") have a sharp onset with a steep rise time, suggesting they occur close to the recording geophone. These events almost all occur in association with lunar sunrise, over a span of 1-2 days. One possibility is that they originate from the nearby Apollo 17 lunar lander due to rapid heating at sunrise. A further 10,004 detections (classification "d1") show strong diurnal periodicity, with detections increasing during the lunar day and peaking at sunset; these probably represent thermal events from the lunar regolith immediately surrounding the Apollo 17 landing site. The final 3662 detections (classification "d2") have emergent onsets and relatively long durations. These detections have peaks associated with lunar sunrise and sunset, but also sometimes peak at seemingly random times. Their source mechanism has not yet been investigated. It is possible that many of these are misclassified d1/n1 events, and further QC work needs to be undertaken; but it is also possible that many represent more distant thermal moonquakes, e.g. from the North and South Massifs, or even the ridge adjacent to the Lee-Lincoln scarp. The unknown event spikes will be the subject of closer inspection once the HMM technique has been refined.

  15. Modeling Events in the Lower Imperial Valley Basin

    NASA Astrophysics Data System (ADS)

    Tian, X.; Wei, S.; Zhan, Z.; Fielding, E. J.; Helmberger, D. V.

    2010-12-01

    The Imperial Valley below the US-Mexico border has few seismic stations but many significant earthquakes. Many of these events, such as the recent El Mayor-Cucapah event, have complex mechanisms involving a mixture of strike-slip and normal slip patterns, with more than 30 aftershocks of magnitude over 4.5 to date. Unfortunately, many earthquake records from the southern Imperial Valley display a great deal of complexity, i.e., strong Rayleigh-wave multipathing and extended codas. In short, regional recordings in the US are too complex to easily separate source properties from complex propagation. Fortunately, the Dec 30 foreshock (Mw = 5.9) has excellent teleseismic and regional recordings and, moreover, was observed with InSAR. We use this simple strike-slip event to calibrate paths. In particular, we are finding record segments involving Pnl (including depth phases) and some surface waves (mostly Love waves) that appear well behaved, i.e., that can be approximated by synthetics from 1D local models, with events modeled using the Cut-and-Paste (CAP) routine. Simple events can then be identified along with path calibration, and modeling of the more complicated paths can be started from the known mechanisms. We will report on both the aftershocks and historic events.

  16. Shifting Quaternary migration patterns in the Bahamian archipelago: Evidence from the Zamia pumila complex at the northern limits of the Caribbean island biodiversity hotspot.

    PubMed

    Salas-Leiva, Dayana E; Meerow, Alan W; Calonje, Michael; Francisco-Ortega, Javier; Griffith, M Patrick; Nakamura, Kyoko; Sánchez, Vanessa; Knowles, Lindy; Knowles, David

    2017-05-01

    The Bahamas archipelago is formed by young, tectonically stable carbonate banks that harbor direct geological evidence of global ice-volume changes. We sought to detect signatures of major changes in gene flow patterns and to reconstruct the phylogeographic history of the monophyletic Zamia pumila complex across the Bahamas. Nuclear molecular markers with both high and low mutation rates were used to capture signatures at two different time scales and to test several gene flow and demographic hypotheses. Single-copy nuclear genes unveiled apparent ancestral admixture on Andros, suggesting a significant role for this island as the main hub of diversity in the archipelago. We detected demographic and spatial expansion of the Zamia pumila complex on both paleo-provinces around the Piacenzian (Pliocene)/Gelasian (Pleistocene) boundary. Populations showed signatures of different migration models operating at two different times. Populations on Long Island (Z. lucayana) may represent either a secondary colonization of the Bahamas by Zamia or a rapid, early divergence of at least one population on the Bahamas. Despite changes in migration patterns with global climate, expected heterozygosity with both marker systems remains within the range reported for cycads, but with significantly increased inbreeding detected by the microsatellites. This finding is likely associated with reduced gene flow between and within paleo-provinces, accompanied by genetic drift, as rising seas enforced isolation. Our study highlights the importance of the maintenance of the predominant direction of genetic exchange and the role of overseas dispersal among the islands during climate oscillations. © 2017 Botanical Society of America.

  17. An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.

    PubMed

    Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D

    2016-05-01

    Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. 
    In particular, it is shown that evaluating them in the source domain, obtained after applying a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on PhysioNet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.

  18. Exploring the Appropriate Drought Index in a Humid Tropical Area with Complex Terrain

    NASA Astrophysics Data System (ADS)

    Lee, C. H.; Chen, W. T.; Lo, M. H.; Chu, J. L.; Chen, Y. J.; Chen, Y. M.

    2017-12-01

    The goal of the present study is to identify the most appropriate index for monitoring droughts in Taiwan, an extremely humid region with steep terrain. Three drought indices were calculated from in situ high-resolution rainfall observations and compared: the Standardized Precipitation Index (SPI), the self-calibrating Palmer Drought Severity Index (sc-PDSI), and the Standardized Precipitation Evapotranspiration Index (SPEI). In Taiwan, average precipitation is around 2500 mm per year, six times the global average. However, because of the complex topography and the uneven distribution of rainfall throughout the year, abundant wet-season rainfall is mostly lost as runoff. Severe droughts occur approximately once per decade, while moderate droughts occur every 2 years. Earlier studies indicated that the SPI is limited in describing drought events because, unlike the sc-PDSI, it does not take the temperature effect into account. In addition, the SPEI, which incorporates the Penman-Monteith potential evapotranspiration (PET_pm), is also considered in the present study. Atmospheric water demand increases with temperature, which is reflected in PET_pm. To calculate the three drought indices, we will use the monthly average temperature to calculate PET_pm and the monthly accumulated precipitation from automatic weather stations of the Central Weather Bureau. All detected droughts are evaluated against the dataset of historical drought records in Taiwan. We first explore whether temperature is an important factor in the occurrence of droughts in Taiwan; in addition to severe droughts, we expect the SPEI and sc-PDSI to detect more moderate droughts. Second, we survey the performance of the three drought indices in detecting droughts in Taiwan. Because the soil water model used in the sc-PDSI does not consider the effect of steep terrain, and because the SPI considers only monthly precipitation, we expect the SPEI to be the more appropriate index for monitoring drought events in Taiwan.
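    The SPI compared above is, at its core, a probability transform of accumulated precipitation onto a standard normal scale. A minimal stdlib-only sketch using an empirical CDF follows; operational SPI typically fits a gamma distribution to the precipitation record first, so this is an illustrative simplification, and all names and data below are hypothetical, not from the study:

```python
from statistics import NormalDist

def spi(precip_history, current_total):
    """Empirical Standardized Precipitation Index.

    Ranks the current accumulated precipitation against a historical
    sample for the same accumulation window, then maps the empirical
    cumulative probability onto a standard normal quantile.
    """
    n = len(precip_history)
    rank = sum(1 for p in precip_history if p <= current_total)
    # Weibull plotting position; the rank-0 case keeps the probability
    # strictly inside (0, 1) so the normal quantile stays finite.
    prob = rank / (n + 1) if rank > 0 else 1 / (2 * (n + 1))
    return NormalDist().inv_cdf(prob)

# Hypothetical annual precipitation totals (mm), roughly Taiwan-like.
history = [2100, 2400, 2600, 1900, 2800, 2500, 2300, 2700, 2200, 3000]
print(round(spi(history, 1800), 2))  # below every observed year: strongly negative (drought)
print(round(spi(history, 2500), 2))  # near the median: close to zero
```

Negative SPI values indicate drier-than-normal conditions; the SPEI applies the same transform to precipitation minus potential evapotranspiration, which is how temperature enters.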

  19. Detecting and characterizing coal mine related seismicity in the Western U.S. using subspace methods

    NASA Astrophysics Data System (ADS)

    Chambers, Derrick J. A.; Koper, Keith D.; Pankow, Kristine L.; McCarter, Michael K.

    2015-11-01

    We present an approach for subspace detection of small seismic events that includes methods for estimating magnitudes and associating detections from multiple stations into unique events. The process is used to identify mining related seismicity from a surface coal mine and an underground coal mining district, both located in the Western U.S. Using a blasting log and a locally derived seismic catalogue as ground truth, we assess detector performance in terms of verified detections, false positives and failed detections. We are able to correctly identify over 95 per cent of the surface coal mine blasts and about 33 per cent of the events from the underground mining district, while keeping the number of potential false positives relatively low by requiring all detections to occur on two stations. We find that most of the potential false detections for the underground coal district are genuine events missed by the local seismic network, demonstrating the usefulness of regional subspace detectors in augmenting local catalogues. We note a trade-off in detection performance between stations at smaller source-receiver distances, which have increased signal-to-noise ratio, and stations at larger distances, which have greater waveform similarity. We also explore the increased detection capabilities of a single higher dimension subspace detector, compared to multiple lower dimension detectors, in identifying events that can be described as linear combinations of training events. We find, in our data set, that such an advantage can be significant, justifying the use of a subspace detection scheme over conventional correlation methods.
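    A subspace detector of the kind described above projects incoming data windows onto an orthonormal basis spanning the training events and thresholds the fraction of window energy captured. The NumPy sketch below illustrates the idea with an SVD-derived basis; the function names, the dimension choice, and the synthetic waveforms are illustrative, not the authors' implementation:

```python
import numpy as np

def build_subspace(training_waveforms, dim):
    """Orthonormal basis spanning aligned training waveforms.

    Columns of U for the `dim` largest singular values span the signal
    subspace; a higher `dim` captures more linear combinations of the
    training events, as discussed in the abstract.
    """
    X = np.array(training_waveforms, dtype=float).T  # samples x events
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]                                # samples x dim

def energy_capture(U, window):
    """Fraction of the window's energy captured by the subspace (0..1)."""
    w = np.asarray(window, dtype=float)
    proj = U.T @ w
    return float(proj @ proj) / float(w @ w)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
template = np.sin(12 * np.pi * t) * np.exp(-3 * t)
# Training set: scaled copies of the template plus a little noise.
training = [a * template + 0.05 * rng.standard_normal(t.size) for a in (1.0, 0.7, 1.3)]
U = build_subspace(training, dim=1)

signal_window = 0.9 * template + 0.1 * rng.standard_normal(t.size)
noise_window = rng.standard_normal(t.size)
print(energy_capture(U, signal_window))  # high: window lies near the subspace
print(energy_capture(U, noise_window))   # low: pure noise projects weakly
```

A detection is declared when the statistic exceeds a threshold; requiring coincident detections on two stations, as in the abstract, then suppresses false positives.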

  20. Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †

    PubMed Central

    Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang

    2017-01-01

    Time-series data for multiple water quality parameters are obtained from water sensor networks deployed in the agricultural water supply network. When pollution occurs, the accurate and efficient detection of, and warning about, contamination events, so as to prevent the pollution from spreading, is one of the most important issues. In order to comprehensively reduce event detection deviation, a spatial-temporal event detection approach using multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach comprises three parts. First, M-STED adopts a Rule K algorithm to select backbone nodes as the nodes of a connected dominating set (CDS) and to forward the sensed data for the multiple water parameters. Second, the state of each backbone node at the current timestamp is determined with back-propagation neural network models and sequential Bayesian analysis. Third, a spatial model based on Bayesian networks is established to estimate the state of the backbones at the next timestamp and to trace the "outlier" node through its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate with M-STED is more than 80% and the false detection rate is lower than 9%. The M-STED approach can improve the detection rate by about 40% and reduce the false alarm rate by about 45% compared with S-STED, an event detection algorithm that uses a single water parameter. Moreover, the proposed M-STED exhibits better performance in terms of detection delay and scalability. PMID:29207535
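    The per-node state determination in the second part combines classifier outputs with sequential Bayesian updating. A minimal sketch of such an update follows; the prior and the alarm rates are purely illustrative assumptions, not values from the paper:

```python
def sequential_bayes(observations, prior=0.01, p_alarm_given_event=0.9,
                     p_alarm_given_normal=0.05):
    """Update P(contamination) after each binary 'outlier' observation.

    Each observation is True when the per-node classifier (a
    back-propagation neural network in the paper) flags the current
    water-quality reading as anomalous.
    """
    posterior = prior
    for flagged in observations:
        if flagged:
            like_event, like_normal = p_alarm_given_event, p_alarm_given_normal
        else:
            like_event, like_normal = 1 - p_alarm_given_event, 1 - p_alarm_given_normal
        num = like_event * posterior
        posterior = num / (num + like_normal * (1 - posterior))
    return posterior

# A single alarm barely moves a low prior; a run of alarms is decisive.
print(sequential_bayes([True]))              # modest posterior
print(sequential_bayes([True, True, True]))  # near-certain contamination
```

This is why sequential analysis suppresses false alarms from isolated noisy readings: evidence must accumulate across timestamps before a node's state flips.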

  1. Mastery in Goal Scoring, T-Pattern Detection, and Polar Coordinate Analysis of Motor Skills Used by Lionel Messi and Cristiano Ronaldo

    PubMed Central

    Castañer, Marta; Barreira, Daniel; Camerino, Oleguer; Anguera, M. Teresa; Fernandes, Tiago; Hileno, Raúl

    2017-01-01

    Research in soccer has traditionally given more weight to players' technical and tactical skills, but few studies have analyzed the motor skills that underpin specific motor actions. The objective of this study was to investigate the style of play of the world's top soccer players, Cristiano Ronaldo and Lionel Messi, and how they use their motor skills in attacking actions that result in a goal. We used and improved the easy-to-use observation instrument (OSMOS-soccer player) with 9 criteria, each one expanded to build 50 categories. Associations between these categories were investigated by T-pattern detection and polar coordinate analysis. T-pattern analysis detects temporal structures of complex behavioral sequences composed of simpler or directly distinguishable events within specified observation periods (time point series). Polar coordinate analysis involves the application of a complex procedure to provide a vector map of interrelated behaviors obtained from prospective and retrospective sequential analysis. The T-patterns showed that for both players the combined criteria were mainly between the different aspects of motor skills, namely the use of lower limbs, contact with the ball using the outside of the foot, locomotion, body orientation with respect to the opponent goal line, and the criteria of technical actions and the right midfield. Polar coordinate analysis detected significant associations between the same criteria included in the T-patterns as well as the criteria of turning the body, numerical equality with no pressure, and relative numerical superiority. PMID:28553245

  2. The DISC Quotient

    NASA Astrophysics Data System (ADS)

    Elliott, John R.; Baxter, Stephen

    2012-09-01

    D.I.S.C.: Decipherment Impact of a Signal's Content. The authors present a numerical method to characterise the significance of the receipt of a complex and potentially decipherable signal from extraterrestrial intelligence (ETI). The purpose of the scale is to facilitate the public communication of work on any such claimed signal, as such work proceeds, and to assist in its discussion and interpretation. Building on the rationale of a "position" paper, this paper examines the proposed DISC quotient and develops the algorithmic steps and constituent measures that form this post-detection strategy for information dissemination, based on prior work on message detection and decipherment. As argued, we require a robust and incremental strategy to disseminate timely, accurate and meaningful information to the scientific community and the general public in the event that we receive an "alien" signal that displays decipherable information. This post-detection strategy is to serve as a stepwise algorithm for a logical approach to information extraction and as a vehicle for sequential information dissemination, to manage societal impact. The "DISC quotient", which is based on the signal-analysis processing stages, includes factors based on the signal's data quantity, structure, affinity to known human languages, and likely decipherment times. Comparisons with human and other phenomena are included as a guide to assessing likely societal impact. It is submitted that the development, refinement and implementation of DISC as an integral strategy, during the complex processes involved in post-detection and decipherment, is essential if we wish to minimize disruption and optimize dissemination.

  3. Mastery in Goal Scoring, T-Pattern Detection, and Polar Coordinate Analysis of Motor Skills Used by Lionel Messi and Cristiano Ronaldo.

    PubMed

    Castañer, Marta; Barreira, Daniel; Camerino, Oleguer; Anguera, M Teresa; Fernandes, Tiago; Hileno, Raúl

    2017-01-01

    Research in soccer has traditionally given more weight to players' technical and tactical skills, but few studies have analyzed the motor skills that underpin specific motor actions. The objective of this study was to investigate the style of play of the world's top soccer players, Cristiano Ronaldo and Lionel Messi, and how they use their motor skills in attacking actions that result in a goal. We used and improved the easy-to-use observation instrument (OSMOS-soccer player) with 9 criteria, each one expanded to build 50 categories. Associations between these categories were investigated by T-pattern detection and polar coordinate analysis. T-pattern analysis detects temporal structures of complex behavioral sequences composed of simpler or directly distinguishable events within specified observation periods (time point series). Polar coordinate analysis involves the application of a complex procedure to provide a vector map of interrelated behaviors obtained from prospective and retrospective sequential analysis. The T-patterns showed that for both players the combined criteria were mainly between the different aspects of motor skills, namely the use of lower limbs, contact with the ball using the outside of the foot, locomotion, body orientation with respect to the opponent goal line, and the criteria of technical actions and the right midfield. Polar coordinate analysis detected significant associations between the same criteria included in the T-patterns as well as the criteria of turning the body, numerical equality with no pressure, and relative numerical superiority.

  4. Production and condensation of organic gases in the atmosphere of Titan

    NASA Technical Reports Server (NTRS)

    Sagan, C.; Thompson, W. R.

    1982-01-01

    The rates and altitudes for the dissociation of atmospheric constituents on Titan are calculated for solar ultraviolet radiation, the solar wind, Saturn magnetospheric particles, the Saturn co-rotating plasma, and cosmic rays. Laboratory experiments show that a variety of simple gas phase organic molecules and more complex organic solids called tholins are produced by such irradiations of simulated Titanian atmospheres. Except for ultraviolet wavelengths longward of the methane photodissociation continuum, most dissociation events occur between about 3100 and 3600 km altitude, corresponding well to the region of EUV opacity detected by Voyager. For a wide variety of simple to moderately complex organic gases in the Titanian atmosphere, condensation occurs below the top of the main cloud deck at about 2825 km. It is proposed that such condensates, beginning with CH4 at about 2615 km, comprise the principal mass of the Titan clouds. There is a distinct tendency for the atmosphere of Titan to act as a fractional distillation device, molecules of greater complexity condensing out at higher altitudes.

  5. Porous polycarbene-bearing membrane actuator for ultrasensitive weak-acid detection and real-time chemical reaction monitoring.

    PubMed

    Sun, Jian-Ke; Zhang, Weiyi; Guterman, Ryan; Lin, Hui-Juan; Yuan, Jiayin

    2018-04-30

    Soft actuators that integrate ultrasensitivity with the capability to interact simultaneously with multiple stimuli throughout an entire event demand a high level of structural complexity, adaptability, and/or multi-responsiveness, which is a great challenge. Here, we develop a porous polycarbene-bearing membrane actuator built up from ionic complexation between a poly(ionic liquid) and trimesic acid (TA). The actuator features two concurrent structural gradients, i.e., an electrostatic complexation (EC) degree and a density distribution of a carbene-NH3 adduct (CNA), along the membrane cross-section. The membrane actuator shows the highest sensitivity among state-of-the-art soft proton actuators toward acetic acid, down to the 10^-6 mol L^-1 (M) level in aqueous media. Through competing actuation of the two gradients, it is capable of monitoring an entire process of proton-involved chemical reactions comprising multiple stimuli and operational steps. The present achievement constitutes a significant step toward real-life application of soft actuators in chemical sensing and reaction technology.

  6. Label-Free Discovery Array Platform for the Characterization of Glycan Binding Proteins and Glycoproteins.

    PubMed

    Gray, Christopher J; Sánchez-Ruíz, Antonio; Šardzíková, Ivana; Ahmed, Yassir A; Miller, Rebecca L; Reyes Martinez, Juana E; Pallister, Edward; Huang, Kun; Both, Peter; Hartmann, Mirja; Roberts, Hannah N; Šardzík, Robert; Mandal, Santanu; Turnbull, Jerry E; Eyers, Claire E; Flitsch, Sabine L

    2017-04-18

    The identification of carbohydrate-protein interactions is central to our understanding of the roles of cell-surface carbohydrates (the glycocalyx), fundamental for cell-recognition events. Therefore, there is a need for fast high-throughput biochemical tools to capture the complexity of these biological interactions. Here, we describe a rapid method for qualitative label-free detection of carbohydrate-protein interactions on arrays of simple synthetic glycans, more complex natural glycosaminoglycans (GAG), and lectins/carbohydrate binding proteins using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. The platform can unequivocally identify proteins that are captured from either purified or complex sample mixtures, including biofluids. Identification of proteins bound to the functionalized array is achieved by analyzing either the intact protein mass or, after on-chip proteolytic digestion, the peptide mass fingerprint and/or tandem mass spectrometry of selected peptides, which can yield highly diagnostic sequence information. The platform described here should be a valuable addition to the limited analytical toolbox that is currently available for glycomics.

  7. Selected control events and reporting odds ratio in signal detection methodology.

    PubMed

    Ooba, Nobuhiro; Kubota, Kiyoshi

    2010-11-01

    Our objective was to determine whether the reporting odds ratio (ROR) computed using "control events" can detect signals hidden behind striking reports on one or more particular events. We used data from 956 drug use investigations (DUIs) conducted between 1970 and 1998 in Japan and from domestic spontaneous reports (SRs) between 1998 and 2008. The event terms in the DUIs were converted to preferred terms in the Medical Dictionary for Regulatory Activities (MedDRA). We calculated the incidence proportion for various events and selected 20 "control events" with a relatively constant incidence proportion across DUIs that were also reported regularly to the spontaneous reporting system. A "signal" was generated for a drug-event combination when the lower limit of the 95% confidence interval of the ROR exceeded 1. We also compared the ROR in SRs with the relative risk (RR) in DUIs. The "control events" accounted for 18.2% of all reports. The ROR using "control events" may detect some hidden signals for a drug whose proportion of "control events" is lower than the average. The median of the ratios of the ROR using "control events" to the RR was around unity, indicating that the "control events" roughly represented the exposure distribution, though the range of the ratios was so wide that an individual ROR should not be regarded as an estimate of the RR. The use of the ROR with "control events" may provide an adjunct to traditional signal detection methods for finding a signal hidden behind some major events. Copyright © 2010 John Wiley & Sons, Ltd.
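    The signal criterion used here, that the lower limit of the 95% confidence interval of the ROR exceed 1, can be reproduced from a 2x2 table of report counts. A minimal sketch with illustrative counts follows; the standard Woolf (log-odds) interval is assumed, and the numbers are invented for demonstration:

```python
import math

def ror_signal(a, b, c, d, z=1.96):
    """Reporting odds ratio from a 2x2 table of spontaneous reports.

    a: target event, target drug      b: control events, target drug
    c: target event, other drugs      d: control events, other drugs
    Returns (ROR, lower, upper); a 'signal' in the sense of the abstract
    is generated when the lower limit exceeds 1.
    """
    ror = (a * d) / (b * c)
    # Woolf standard error of log(ROR), assuming all four cells nonzero.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - z * se)
    upper = math.exp(math.log(ror) + z * se)
    return ror, lower, upper

ror, lo, hi = ror_signal(a=30, b=120, c=200, d=4000)
print(f"ROR={ror:.2f}, 95% CI=({lo:.2f}, {hi:.2f}), signal={lo > 1}")
```

Using the pooled "control events" counts as b and d is what lets the ROR stand in for the exposure distribution, per the abstract's argument.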

  8. Radionuclide data analysis in connection of DPRK event in May 2009

    NASA Astrophysics Data System (ADS)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, no traces linked to the DPRK event were found. After three weeks on high alert, the PTS resumed its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All data coming from the particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds for the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for their sites owing to legitimate, non-nuclear-test-related activities. Therefore, a set of hypotheses was used to test whether a detection was consistent with the event time and location through atmospheric transport modelling. The consistency of the event timing and the isotopic ratios was also used in the evaluation work. As a result, it was concluded that had even 1/1000 of the noble gases from a nuclear detonation leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections in verifying the nuclear traces of possible tests.

  9. Molecular toolbox for the identification of unknown genetically modified organisms.

    PubMed

    Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc

    2010-03-01

    Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. 
Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.

  10. Type III-L Solar Radio Bursts and Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Duffin, R. T.; White, S. M.; Ray, P. S.; Kaiser, M. L.

    2015-09-01

    A radio-selected sample of fast drift radio bursts with complex structure occurring after the impulsive phase of the associated flare (“Type III-L bursts”) is identified by inspection of radio dynamic spectra from 1 to 180 MHz for over 300 large flares in 2001. An operational definition that takes into account previous work on these radio bursts starting from samples of solar energetic particle (SEP) events is applied to the data, and 66 Type III-L bursts are found in the sample. In order to determine whether the presence of these radio bursts can be used to predict the occurrence of SEP events, we also develop a catalog of all SEP proton events in 2001 using data from the ERNE detector on the SOHO satellite. 68 SEP events are found, for 48 of which we can identify a solar source and hence look for associated Type III-L emission. We confirm previous work that found that most (76% in our sample) of the solar sources of SEP events exhibit radio emission of this type. However, the correlation in the opposite direction is not as strong: starting from a radio-selected sample of Type III-L events, around 64% of the bursts that occur at longitudes magnetically well-connected to the Earth, and hence favorable for detection of SEPs, are associated with SEP events. The degree of association increases when the events have durations over 10 minutes at 1 MHz, but in general Type III-L bursts do not perform any better than Type II bursts in our sample as predictors of SEP events. A comparison of Type III-L timing with the arrival of near-relativistic electrons at the ACE spacecraft is not inconsistent with a common source for the accelerated electrons in both phenomena.

  11. Fractal analysis of the ambulation pattern of Japanese quail.

    PubMed

    Kembro, J M; Perillo, M A; Pury, P A; Satterlee, D G; Marín, R H

    2009-03-01

    1. The study examined the practicality and usefulness of fractal analyses in evaluating the temporal organisation of avian ambulatory behaviour, using female Japanese quail in their home boxes as the model system. To induce two locomotion activity levels, we tested half of the birds without disturbance (Unstimulated) and the other half with food scattered on the floor of the home box after 3 h of feeder withdrawal (Stimulated). 2. Ambulatory activity was recorded for 40 min at a resolution of 1 s and evaluated by: (1) detrended fluctuation analysis (DFA), (2) the frequency distribution of the duration of the walking or non-walking events (FDD-W or FDD-NW, respectively), and (3) the transition probabilities between walking/non-walking states. Conventional measures of total time spent walking and average duration of the walking/non-walking events were also employed. 3. DFA showed a decreased value of the self-similarity parameter (alpha; indicative of a more complex ambulatory pattern) in Stimulated birds compared with their Unstimulated counterparts. The FDD-NW showed a more negative scaling factor in Stimulated than in Unstimulated birds. Stimulated birds also had more transitions between non-walking and walking states, consistent with stimulated exploratory activity. No differences were found between groups in the FDD-W, in the percentage of total time spent walking, or in the average duration of the walking events. 4. The temporal walking pattern of female Japanese quail has a fractal structure, and its organisation and complexity are altered when birds are stimulated to explore. The fractal analyses detected differences between the Unstimulated and Stimulated groups that went undetected by the traditional measurements of the percentage of total time spent walking and the duration of the walking events, suggesting their usefulness as a tool for behavioural studies.
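    The DFA measure in point 2 estimates the self-similarity parameter alpha by integrating the series, linearly detrending fixed-size windows, and fitting the log-log scaling of the residual fluctuations. A compact NumPy sketch follows; the function name, window sizes, and synthetic test series are illustrative, not the authors' implementation:

```python
import numpy as np

def dfa_alpha(series, window_sizes):
    """Self-similarity exponent alpha via detrended fluctuation analysis."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated, mean-removed series
    log_n, log_f = [], []
    for n in window_sizes:
        n_windows = len(profile) // n
        fluct = []
        for i in range(n_windows):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)     # linear detrend per window
            fluct.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        log_n.append(np.log(n))
        log_f.append(0.5 * np.log(np.mean(fluct)))  # log of RMS fluctuation
    slope, _ = np.polyfit(log_n, log_f, 1)     # alpha is the log-log slope
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(4000)              # uncorrelated noise: alpha near 0.5
walk = np.cumsum(rng.standard_normal(4000))    # random walk: alpha near 1.5
sizes = [16, 32, 64, 128, 256]
print(round(dfa_alpha(white, sizes), 2))
print(round(dfa_alpha(walk, sizes), 2))
```

Lower alpha indicates a less persistent, more complex pattern, matching the decreased alpha reported for the Stimulated birds.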

  12. Effects of complex life cycles on genetic diversity: cyclical parthenogenesis.

    PubMed

    Rouger, R; Reichel, K; Malrieu, F; Masson, J P; Stoeckel, S

    2016-11-01

    Neutral patterns of population genetic diversity in species with complex life cycles are difficult to anticipate. Cyclical parthenogenesis (CP), in which organisms undergo several rounds of clonal reproduction followed by a sexual event, is one such life cycle. Many species, including crop pests (aphids), human parasites (trematodes) or models used in evolutionary science (Daphnia), are cyclical parthenogens. It is therefore crucial to understand the impact of such a life cycle on neutral genetic diversity. In this paper, we describe distributions of genetic diversity under conditions of CP with various clonal phase lengths. Using a Markov chain model of CP for a single locus and individual-based simulations for two loci, our analysis first demonstrates that strong departures from full sexuality are observed after only a few generations of clonality. The convergence towards predictions made under conditions of full clonality during the clonal phase depends on the balance between mutations and genetic drift. Second, the sexual event of CP usually resets the genetic diversity at a single locus towards predictions made under full sexuality. However, this single recombination event is insufficient to reshuffle gametic phases towards full-sexuality predictions. Finally, for similar levels of clonality, CP and acyclic partial clonality (wherein a fixed proportion of individuals are clonally produced within each generation) differentially affect the distribution of genetic diversity. Overall, this work provides solid predictions of neutral genetic diversity that may serve as a null model in detecting the action of common evolutionary or demographic processes in cyclical parthenogens (for example, selection or bottlenecks).

  13. iss053e238931

    NASA Image and Video Library

    2017-11-22

    iss053e238931 (Nov. 22, 2017) --- Flight Engineer Alexander Misurkin from Roscosmos works with the JPL Electronic Nose (ENose) experiment in the Zvezda service module. ENose is a full-time, continuously operating event monitor designed to detect air contamination from spills and leaks in the crew habitat of the International Space Station. It fills the long-standing gap between onboard alarms and complex analytical instruments. ENose provides rapid, early identification and quantification of atmospheric changes caused by chemical species to which it has been trained. ENose can also be used to monitor cleanup processes after a leak or a spill.

  14. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.

    2015-04-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can provide numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year data deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.

  15. Fine-Scale Event Location and Error Analysis in NET-VISA

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2016-12-01

    NET-VISA is a generative probabilistic model for the occurrence of seismic, hydroacoustic, and atmospheric events and for the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that prior scientific knowledge and historical data are easy to incorporate into the probabilistic model. The model accounts for both detections and misdetections when forming events, which allows it to make more accurate event hypotheses. It has been evaluated continuously since 2012, and each year it reduces the number of missed events by roughly 60% without increasing the false event rate, as compared with the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.

  16. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-12-15

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean and form underwater wireless sensor networks for monitoring vast and dynamic underwater environments. When events may have occurred, their coverage should be detected accurately and potential event sources should be determined so that prompt and proper responses can be enacted. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of a possible event corresponds to a set of neighboring sensor nodes whose sensory data deviate collectively from the normal sensing range. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to the sink node(s). Once sensory data are collected at the sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices correspond to sensor nodes and the weights on the edges reflect the extent to which sensory data deviate from the normal sensing range. Event sources are then determined as the barycenters of this graph. Experimental results show that our technique is more energy-efficient, especially when the network topology is relatively steady.
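    The graph-based source determination can be sketched as follows, under the assumption (not spelled out in the abstract) that a barycenter is the node minimizing the sum of weighted shortest-path distances to all other nodes, and that the deviation graph is connected:

```python
import heapq

def shortest_path_lengths(graph, src):
    """Dijkstra over a dict-of-dicts {node: {neighbor: weight}}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def barycenter(graph):
    """Node minimizing the sum of weighted shortest-path distances;
    one plausible reading of the paper's event-source criterion."""
    best, best_cost = None, float("inf")
    for node in graph:
        cost = sum(shortest_path_lengths(graph, node).values())
        if cost < best_cost:
            best, best_cost = node, cost
    return best

# Hypothetical three-node deviation graph: b sits between a and c.
net = {"a": {"b": 1.0}, "b": {"a": 1.0, "c": 1.0}, "c": {"b": 1.0}}
src = barycenter(net)
```

On this toy path graph the middle node is returned, as its total distance to the others is smallest.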

  17. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean and form underwater wireless sensor networks for monitoring vast and dynamic underwater environments. When events may have occurred, their coverage should be detected accurately and potential event sources should be determined so that prompt and proper responses can be enacted. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of a possible event corresponds to a set of neighboring sensor nodes whose sensory data deviate collectively from the normal sensing range. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to the sink node(s). Once sensory data are collected at the sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices correspond to sensor nodes and the weights on the edges reflect the extent to which sensory data deviate from the normal sensing range. Event sources are then determined as the barycenters of this graph. Experimental results show that our technique is more energy-efficient, especially when the network topology is relatively steady. PMID:26694394

  18. Detecting NEO Impacts using the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Brown, Peter G.; Dube, Kimberlee; Silber, Elizabeth

    2014-11-01

    As part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty, an International Monitoring System (IMS) consisting of seismic, hydroacoustic, infrasound, and radionuclide technologies has been deployed globally, beginning in the late 1990s. The infrasound sub-network of the IMS consists of 47 active stations as of mid-2014. These microbarograph arrays detect coherent infrasonic signals from a range of sources including volcanoes, man-made explosions, and bolides. Bolide detections from IMS stations have been reported since ~2000, but with the maturation of the network over the last several years the rate of detections has increased substantially. Presently the IMS performs semi-automated, near-real-time global event identification on timescales of 6-12 hours, as well as analyst-verified event identification with time lags of several weeks. Here we report on infrasound events identified by the IMS between 2010 and 2014 which are likely bolide impacts. Identification in this context refers to an event being included in one of the event bulletins issued by the IMS. In this untargeted study we find that the IMS globally identifies approximately 16 events per year which are likely bolide impacts. Using fireball data from US Government sensor detections released since the beginning of 2014 (as given at http://neo.jpl.nasa.gov/fireballs/ ), we find in a complementary targeted survey that the current IMS system is able to identify ~25% of fireballs with E > 0.1 kT energy. Using all 16 US Government sensor fireballs listed as of July 31, 2014, we are able to detect infrasound from 75% of these events on at least one IMS station. The high ratio of detection to identification is a product of the stricter criteria adopted by the IMS for inclusion in an event bulletin as compared to simple station detection. We discuss comparisons between infrasound-estimated energies, based on amplitudes and periods, and the estimates provided by US Government sensors. Specific impact events of interest will be discussed, as well as the utility of the global IMS infrasound system for locating and timing future NEAs detected prior to impact.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borges, Raymond Charles; Beaver, Justin M; Buckner, Mark A

    Power system disturbances are inherently complex and can be attributed to a wide range of sources, including both natural and man-made events. Currently, power system operators are relied on heavily to make decisions regarding the causes of experienced disturbances and the appropriate course of action in response. In the case of cyber-attacks against a power system, human judgment is less certain, since there is an overt attempt to disguise the attack and deceive the operators as to the true state of the system. To support the human decision maker, we explore the viability of machine learning as a means for discriminating among types of power system disturbances, and we focus specifically on detecting cyber-attacks, where deception is a core tenet of the event. We evaluate various machine learning methods as disturbance discriminators and discuss the practical implications of deploying machine learning systems as an enhancement to existing power system architectures.
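    As a hedged illustration of the approach: the paper's features come from real power-system measurements, whereas the synthetic Gaussian features, class separation, and choice of a random-forest discriminator below are all stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for measurement features: class 0 = natural fault,
# class 1 = cyber-attack with a deliberately shifted signature.
X0 = rng.normal(0.0, 1.0, size=(200, 4))
X1 = rng.normal(2.5, 1.0, size=(200, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr)
accuracy = clf.score(Xte, yte)
```

With well-separated synthetic classes the discriminator is near-perfect; the interesting engineering question in the paper is how well this holds up when the attacker actively mimics natural disturbances.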

  20. Does the Type of Event Influence How User Interactions Evolve on Twitter?

    PubMed Central

    del Val, Elena; Rebollo, Miguel; Botti, Vicente

    2015-01-01

    The number of people using on-line social networks as a new way of communication is continually increasing. The messages that a user writes in these networks and his/her interactions with other users leave a digital trace that is recorded. Thanks to this fact and to the use of network theory, the analysis of messages, user interactions, and the complex structures that emerge is greatly facilitated. In addition, information generated in on-line social networks is temporally labeled, which makes it possible to go a step further and analyze the dynamics of interaction patterns. In this article, we present an analysis of the evolution of the user interactions that take place around television, socio-political, conference, and keynote events on Twitter. Interactions have been modeled as networks annotated with time markers. We study changes in the structural properties at both the network level and the node level. As a result of this analysis, we have detected patterns of network evolution and common structural features, as well as differences among the events. PMID:25961305
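    A minimal sketch of this kind of snapshot analysis, assuming interactions arrive as (timestamp, user, user) tuples and using networkx; the window size and the two metrics chosen (density and giant-component size) are illustrative, not the paper's exact measures:

```python
import networkx as nx

def snapshot_metrics(timed_edges, window):
    """Split time-stamped interactions into fixed windows and compute
    simple structural properties per snapshot.

    timed_edges: iterable of (t, u, v) interaction records.
    Returns one dict per non-empty window, in time order.
    """
    if not timed_edges:
        return []
    t0 = min(t for t, _, _ in timed_edges)
    buckets = {}
    for t, u, v in timed_edges:
        buckets.setdefault(int((t - t0) // window), []).append((u, v))
    out = []
    for idx in sorted(buckets):
        g = nx.Graph(buckets[idx])
        giant = max(nx.connected_components(g), key=len)
        out.append({"window": idx,
                    "density": nx.density(g),
                    "giant": len(giant)})
    return out

# Hypothetical trace: a short burst of chatter, then an isolated exchange.
metrics = snapshot_metrics([(0, "a", "b"), (1, "b", "c"), (10, "x", "y")],
                           window=5)
```

Tracking such per-window metrics over an event's lifetime is what reveals the evolution patterns the abstract refers to.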

  1. Good vibrations: tactile feedback in support of attention allocation and human-automation coordination in event-driven domains.

    PubMed

    Sklar, A E; Sarter, N B

    1999-12-01

    Observed breakdowns in human-machine communication can be explained, in part, by the nature of current automation feedback, which relies heavily on focal visual attention. Such feedback is not well suited for capturing attention in case of unexpected changes and events or for supporting the parallel processing of large amounts of data in complex domains. As suggested by multiple-resource theory, one possible solution to this problem is to distribute information across various sensory modalities. A simulator study was conducted to compare the effectiveness of visual, tactile, and redundant visual and tactile cues for indicating unexpected changes in the status of an automated cockpit system. Both tactile conditions resulted in higher detection rates for, and faster response times to, uncommanded mode transitions. Tactile feedback did not interfere with, nor was its effectiveness affected by, the performance of concurrent visual tasks. The observed improvement in task-sharing performance indicates that the introduction of tactile feedback is a promising avenue toward better supporting human-machine communication in event-driven, information-rich domains.

  2. Compound Event Barrier Coverage in Wireless Sensor Networks under Multi-Constraint Conditions.

    PubMed

    Zhuang, Yaoming; Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi

    2016-12-24

    Monitoring compound events through barrier coverage is an important problem in wireless sensor networks (WSNs). Compound event barrier coverage (CEBC) is a novel coverage problem: unlike traditional barrier coverage, its data come from different types of sensors, and it is subject to multiple constraints under the complex conditions of real-world applications. The main objective of this paper is to design an efficient algorithm for such complex conditions that can combine the compound event confidence. To this end, a multiplier method based on an active-set strategy (ASMP) is proposed to handle the multiple constraints in compound event barrier coverage. The algorithm can calculate the coverage ratio efficiently and allocate sensor resources reasonably. By simplifying complex problems, it reduces the computational load of the network and improves network efficiency. The simulation results demonstrate that the proposed algorithm is more effective and efficient than existing methods, especially in the allocation of sensor resources.
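    The constrained allocation problem can be sketched with a generic solver. SLSQP here stands in for the paper's active-set multiplier method (ASMP), and the concave confidence surrogate and budget constraint are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def allocate(weights, budget):
    """Allocate a sensing budget across sensor types to maximize a
    concave compound-confidence surrogate sum(w_i * log(1 + x_i)),
    subject to sum(x_i) <= budget and x_i >= 0.

    SLSQP is used as a generic stand-in for the ASMP method.
    """
    n = len(weights)
    obj = lambda x: -np.sum(weights * np.log1p(x))
    cons = [{"type": "ineq", "fun": lambda x: budget - np.sum(x)}]
    bounds = [(0.0, budget)] * n
    res = minimize(obj, np.full(n, budget / n), method="SLSQP",
                   bounds=bounds, constraints=cons)
    return res.x

# Three equally weighted sensor types sharing a budget of 3 units:
weights = np.array([1.0, 1.0, 1.0])
x = allocate(weights, 3.0)
```

With equal weights the concave objective yields an equal split; unequal weights would shift resources toward the sensor types contributing more confidence.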

  3. Compound Event Barrier Coverage in Wireless Sensor Networks under Multi-Constraint Conditions

    PubMed Central

    Zhuang, Yaoming; Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi

    2016-01-01

    Monitoring compound events through barrier coverage is an important problem in wireless sensor networks (WSNs). Compound event barrier coverage (CEBC) is a novel coverage problem: unlike traditional barrier coverage, its data come from different types of sensors, and it is subject to multiple constraints under the complex conditions of real-world applications. The main objective of this paper is to design an efficient algorithm for such complex conditions that can combine the compound event confidence. To this end, a multiplier method based on an active-set strategy (ASMP) is proposed to handle the multiple constraints in compound event barrier coverage. The algorithm can calculate the coverage ratio efficiently and allocate sensor resources reasonably. By simplifying complex problems, it reduces the computational load of the network and improves network efficiency. The simulation results demonstrate that the proposed algorithm is more effective and efficient than existing methods, especially in the allocation of sensor resources. PMID:28029118

  4. Pedestrian Detection by Laser Scanning and Depth Imagery

    NASA Astrophysics Data System (ADS)

    Barsi, A.; Lovas, T.; Molnar, B.; Somogyi, A.; Igazvolgyi, Z.

    2016-06-01

    Pedestrian flow is much less regulated and controlled than vehicle traffic. Estimating flow parameters would support many safety, security, and commercial applications. This paper discusses a method for acquiring information on pedestrian movements without disturbing or changing their motion. A profile laser scanner and a depth camera were applied to capture the geometry of moving people as time series. Procedures have been developed to derive complex flow parameters, such as count, volume, walking direction, and velocity, from laser-scanned point clouds. Since no images of pedestrians' faces are captured, no privacy issues are raised. The paper includes an accuracy analysis of the estimated parameters using video footage as reference. Thanks to the dense point clouds, a detailed geometric analysis was conducted to obtain the height and shoulder width of pedestrians and to detect whether luggage was carried. The derived parameters support safety (e.g., detecting critical pedestrian density in mass events), security (e.g., detecting prohibited baggage in endangered areas), and commercial applications (e.g., counting pedestrians at all entrances/exits of a shopping mall).

  5. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified, event-based decision-analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  6. FORTE Compact Intra-cloud Discharge Detection parameterized by Peak Current

    NASA Astrophysics Data System (ADS)

    Heavner, M. J.; Suszcynsky, D. M.; Jacobson, A. R.; Heavner, B. D.; Smith, D. A.

    2002-12-01

    The Los Alamos Sferic Array (EDOT) recorded over 3.7 million lightning-related fast electric field change records during April 1 - August 31 of 2001 and 2002. The events were detected by three or more stations, allowing differential-time-of-arrival location determination. The waveforms are characterized by estimated peak current as well as by event type. Narrow Bipolar Events (NBEs), the VLF/LF signature of Compact Intra-cloud Discharges (CIDs), are generally isolated pulses with identifiable ionospheric reflections, permitting determination of event source altitudes. We briefly review the EDOT characterization of events. The FORTE satellite observes Trans-Ionospheric Pulse Pairs (TIPPs), the VHF satellite signature of CIDs. The subset of coincident EDOT and FORTE CID observations is compared with the total EDOT CID database to characterize the VHF detection efficiency for CIDs. The NBE polarity and altitude are also examined in the context of FORTE TIPP detection. The parameter-dependent detection efficiencies are extrapolated from the FORTE orbit to GPS orbit in support of the V-GLASS effort (GPS-based global detection of lightning).
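    Differential-time-of-arrival location from three or more stations can be sketched as a nonlinear least-squares problem. The station geometry below is hypothetical, and straight-line propagation at the speed of light is assumed (appropriate for sferics, unlike seismic phases):

```python
import numpy as np
from scipy.optimize import least_squares

C = 3.0e5  # km/s: sferics propagate at roughly the speed of light

def locate_tdoa(stations, arrival_times, x0):
    """Least-squares 2-D source location (km) from differential times
    of arrival.  Residuals are expressed in km: predicted minus
    observed path-length differences relative to the first station."""
    stations = np.asarray(stations, float)
    t = np.asarray(arrival_times, float)

    def resid(p):
        d = np.linalg.norm(stations - p, axis=1)
        return (d - d[0]) - C * (t - t[0])

    return least_squares(resid, np.asarray(x0, float)).x

# Hypothetical geometry: four stations, true source at (10, 20) km.
stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true_src = np.array([10.0, 20.0])
times = np.linalg.norm(np.asarray(stations) - true_src, axis=1) / C
est = locate_tdoa(stations, times, x0=(50.0, 50.0))
```

With noise-free synthetic arrival times the solver recovers the source; in practice timing errors and station geometry set the achievable location accuracy.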

  7. The Study of the Cosmic Gamma-Emission Nonstationary Fluxes Characteristics by the AVS-F Apparatus Data

    NASA Astrophysics Data System (ADS)

    Kotov, Yu. D.; Arkhangelskaja, I. V.; Arkhangelsky, A. I.; Kuznetsov, S. N.; Glyanenko, A. S.; Kalmykov, P. A.; Amandzholova, D. B.; Samoylenko, V. T.; Yurov, V. N.; Pavlov, A. V.; Chervyakova, O. I.; Afonina, I. V.

    The AVS-F apparatus (Russian abbreviation for Amplitude-Time Spectrometry of the Sun) is intended for studying the hard X-ray and gamma-ray emission characteristics of solar flares and for the search and detection of gamma-ray bursts (GRBs). At present, over 1,100 events with durations of more than 2 s and with no coordinate relation to the Earth radiation belts or the South Atlantic Anomaly have been separated in a preliminary analysis of the AVS-F experiment database. About 68% of the identified events were associated with quasistationary equatorial precipitations: 15-30% count-rate increases in the low-energy gamma band of the AVS-F apparatus over its average value (obtained by approximating these parts with polynomials), discovered on some equatorial segments in the geographic latitude range from -25° up to +30°. Several short events with durations of 1-16 ms, associated with terrestrial gamma-ray flashes, were registered during the experiment. These events were detected above powerful thunderstorm formations. Solar flares of classes stronger than M1.0 according to the GOES classification made up about 7% of the detected events. The hard X-ray and γ-emission of solar flares was mainly observed during the rise or maximum phases of the emission in the soft X-ray band, according to data from the detectors on board the GOES series satellites, and the duration of its registration was shorter than in the soft X-ray bands. According to the preliminary data analysis, gamma-emission with energies over 10 MeV was registered during 12% of the observed flares. Emission in the energy band E > 100 keV was registered during over 60 faint solar flares (of B and C classes according to the GOES classification), and in several of them γ-quanta with energies up to several tens of MeV were observed. Several spectral line complexes were observed in the low-energy gamma range in the spectra of some solar flares stronger than M1.0. 
    The registered spectral features corresponded to α-lines, the annihilation line, nuclear lines, and the neutron-capture line on ¹H (2.223 MeV). In the spectrum of the January 20, 2005 solar flare, a feature in the range of 15-21 MeV was detected for the first time. It can be associated with lines at 15.11 MeV (¹²C + ¹⁶O) or 20.58 MeV (from neutron radiative capture on ³He), or with their combination. Several e-dominant flares without any gamma lines in their energy spectra were also identified; all detected faint solar flares were e-dominant according to the preliminary data analysis. Fine structure with characteristic timescales of 30-160 s was observed at the 99% significance level in the temporal profiles of some solar flares stronger than M1.0 in the low-energy gamma band, in the energy ranges corresponding to the identified spectral features or to the whole gamma-band energy boundaries. According to the results of the preliminary analysis, during the flare of January 20, 2005, fine structure with timescales from 7 ms to 35 ms was detected at the 99% confidence level in the energy range of 0.1-20 MeV. Fine structure with characteristic timescales of 50-110 s was also observed in the temporal profiles of several faint events. About 3% of the identified events were gamma-ray bursts. During some bursts high-energy gamma-emission was observed, for example Emax = 147 ± 3 MeV for GRB050525.

  8. 135Xe measurements with a two-element CZT-based radioxenon detector for nuclear explosion monitoring.

    PubMed

    Ranjbar, Lily; Farsoni, Abi T; Becker, Eric M

    2017-04-01

    Measurement of elevated concentrations of xenon radioisotopes (131mXe, 133mXe, 133Xe, and 135Xe) in the atmosphere has been shown to be a very powerful method for verifying whether or not a detected explosion is nuclear in nature. These isotopes are among the few with enough mobility and with half-lives long enough to make their detection at long distances realistic. Existing radioxenon detection systems used by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) suffer from problems such as complexity, high maintenance requirements, and memory effect. To study the response of CdZnTe (CZT) detectors to xenon radioisotopes and to investigate whether they can mitigate these issues, a prototype detector utilizing two coplanar CZT detectors was built and tested at Oregon State University. The system measures xenon radioisotopes with the beta-gamma coincidence technique by detecting coincident events between the two detectors. In this paper, we introduce the detector design and report our measurement results with radioactive lab sources and with 135Xe produced in the OSU TRIGA reactor. The Minimum Detectable Concentration (MDC) for 135Xe was calculated to be 1.47 ± 0.05 mBq/m³.
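    In its simplest form, the beta-gamma coincidence idea reduces to pairing timestamps from the two CZT detectors that fall within a short window. The sketch below assumes sorted timestamp lists, and the window length is an arbitrary illustrative value, not the prototype's actual setting:

```python
def coincidences(beta_times, gamma_times, window=1e-6):
    """Pair beta and gamma timestamps (seconds, sorted ascending) that
    fall within a coincidence window of each other.

    A two-pointer sweep: for each beta event, advance past gamma events
    that are too early, then collect every gamma event inside the window.
    """
    pairs = []
    j = 0
    for tb in beta_times:
        while j < len(gamma_times) and gamma_times[j] < tb - window:
            j += 1
        k = j
        while k < len(gamma_times) and gamma_times[k] <= tb + window:
            pairs.append((tb, gamma_times[k]))
            k += 1
    return pairs

# Hypothetical event streams: one true coincidence, one unmatched gamma.
pairs = coincidences([1.0, 2.0], [1.0000005, 3.0])
```

Real systems additionally gate on deposited energy in each detector to select the isotope-specific beta-gamma signatures; the time gate above is only the first step.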

  9. Multiscale-Driven approach to detecting change in Synthetic Aperture Radar (SAR) imagery

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Ajadi, O. A.; Meyer, F. J.; Myers, A.; Logan, T. A.; Arnoult, K., Jr.

    2017-12-01

    Detecting changes between Synthetic Aperture Radar (SAR) images can be a useful but challenging exercise. SAR with its all-weather capabilities can be an important resource in identifying and estimating the expanse of events such as flooding, river ice breakup, earthquake damage, oil spills, and forest growth, as it can overcome shortcomings of optical methods related to cloud cover. However, detecting change in SAR imagery can be impeded by many factors including speckle, complex scattering responses, low temporal sampling, and difficulty delineating boundaries. In this presentation we use a change detection method based on a multiscale-driven approach. By using information at different resolution levels, we attempt to obtain more accurate change detection maps in both heterogeneous and homogeneous regions. Integrated within the processing flow are processes that 1) improve classification performance by combining Expectation-Maximization algorithms with mathematical morphology, 2) achieve high accuracy in preserving boundaries using measurement level fusion techniques, and 3) combine modern non-local filtering and 2D-discrete stationary wavelet transform to provide robustness against noise. This multiscale-driven approach to change detection has recently been incorporated into the Alaska Satellite Facility (ASF) Hybrid Pluggable Processing Pipeline (HyP3) using radiometrically terrain corrected SAR images. Examples primarily from natural hazards are presented to illustrate the capabilities and limitations of the change detection method.
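    As a much-simplified stand-in for the multiscale pipeline, the core log-ratio operator with a global threshold can be sketched as follows (without the EM classification, fusion, and wavelet-filtering steps the authors describe; the threshold rule is an assumption):

```python
import numpy as np

def log_ratio_change(img1, img2, k=2.0):
    """Binary change map from two co-registered SAR intensity images.

    The absolute log-ratio suppresses multiplicative speckle's mean
    effect; pixels beyond mean + k*std of the log-ratio are flagged.
    img1, img2: arrays of positive backscatter intensities.
    """
    lr = np.abs(np.log(img2 + 1e-9) - np.log(img1 + 1e-9))
    thresh = lr.mean() + k * lr.std()
    return lr > thresh

# Toy scene: identical images except one strongly changed pixel.
before = np.ones((8, 8))
after = before.copy()
after[4, 4] = 50.0
mask = log_ratio_change(before, after)
```

The multiscale approach in the paper improves on this kind of global threshold precisely where it fails: heterogeneous scenes, noisy boundaries, and changes at different spatial scales.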

  10. A fluorescence method for detection of DNA and DNA methylation based on graphene oxide and restriction endonuclease HpaII.

    PubMed

    Wei, Wei; Gao, Chunyan; Xiong, Yanxiang; Zhang, Yuanjian; Liu, Songqin; Pu, Yuepu

    2015-01-01

    DNA methylation plays an important role in many biological events and is associated with various diseases. Most traditional methods for the detection of DNA methylation are based on the complex and expensive bisulfite method. In this paper, we report a novel fluorescence method to detect DNA and DNA methylation based on graphene oxide (GO) and the restriction endonuclease HpaII. A carefully designed probe DNA labeled with 5-carboxyfluorescein (FAM), together with an optimized GO concentration, keeps the probe/target DNA adsorbed on the GO. After cleavage by HpaII, the labeled FAM is released from the GO surface and its fluorescence recovers, which can be used to detect DNA over the linear range 50 pM-50 nM with a detection limit of 43 pM. DNA methylation induced by methyltransferase (MTase) or other chemical reagents prevents HpaII from recognizing and cleaving the specific site; as a result, the fluorescence cannot recover. The fluorescence recovery efficiency is closely related to the DNA methylation level, which can be used to quantify DNA methylation by comparison with the fluorescence in the presence of intact target DNA. The method for detection of DNA and DNA methylation is simple, reliable, and accurate.
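    A detection limit such as the 43 pM quoted above is conventionally derived from a linear calibration as 3·σ(blank)/slope (the common 3-sigma convention; the paper's exact procedure may differ). A minimal sketch of that computation, with hypothetical calibration values:

```python
import numpy as np

def detection_limit(conc, signal, blank_sd):
    """Limit of detection from a linear calibration curve.

    conc, signal: calibration concentrations and measured responses.
    blank_sd: standard deviation of repeated blank measurements.
    Returns 3 * sigma_blank / slope (3-sigma convention).
    """
    slope, _ = np.polyfit(conc, signal, 1)
    return 3.0 * blank_sd / slope

# Hypothetical calibration: perfectly linear response, slope 2 per nM.
conc = np.array([0.0, 1.0, 2.0, 3.0])   # nM
sig = 2.0 * conc                         # arbitrary fluorescence units
lod = detection_limit(conc, sig, blank_sd=0.2)
```

A steeper calibration slope or quieter blanks both push the detection limit lower, which is why signal-recovery efficiency matters for the assay's sensitivity.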

  11. The Development of Narrative Productivity, Syntactic Complexity, Referential Cohesion and Event Content in Four- to Eight-Year-Old Finnish Children

    ERIC Educational Resources Information Center

    Mäkinen, Leena; Loukusa, Soile; Nieminen, Lea; Leinonen, Eeva; Kunnari, Sari

    2014-01-01

    This study focuses on the development of narrative structure and the relationship between narrative productivity and event content. A total of 172 Finnish children aged between four and eight participated. Their picture-elicited narrations were analysed for productivity, syntactic complexity, referential cohesion and event content. Each measure…

  12. How a Country-Wide Seismological Network Can Improve Understanding of Seismicity and Seismic Hazard -- The Example of Bhutan

    NASA Astrophysics Data System (ADS)

    Hetényi, G.; Diehl, T.; Singer, J.; Kissling, E. H.; Clinton, J. F.; Wiemer, S.

    2015-12-01

    The Eastern Himalayas are home to a seemingly complex seismo-tectonic evolution. The rate of instrumental seismicity is lower than the average along the orogen and there is no record of large historical events, but both paleoseismology and GPS studies point to potentially large (M>8) earthquakes. Owing to the lack of a permanent seismic monitoring system in the area, our current level of understanding is insufficient to create a reliable quantitative seismic hazard model for the region. Existing maps are based on questionable hypotheses and show major inconsistencies when compared with each other. Here we present results on national and regional scales from a 38-station broadband seismological network we operated for almost 2 years in the Kingdom of Bhutan. A thorough, state-of-the-art analysis of local and regional earthquakes builds a comprehensive catalogue that reveals significantly (2 to 3 orders of magnitude) more events than detected from global networks. The seismotectonic analysis reveals new patterns of seismic activity as well as striking differences over relatively short distances within the Himalayas, only partly explained by surface observations such as geology. We compare a priori and a posteriori (BMC) magnitude-of-completeness maps and show that our network was able to detect all felt events during its operation. Some of these events could be felt at surprisingly large distances. Based on our experiment and experience, we draft the pillars on which a permanent seismological observatory for Bhutan could be constructed. Such a continuous monitoring system of seismic activity could then lead to a reliable quantitative seismic hazard model for Bhutan and surrounding regions, and serve as a base to improve building codes and general preparedness.

  13. Seismic Excitation of the Ross Ice Shelf by Whillans Ice Stream Stick-Slip Events

    NASA Astrophysics Data System (ADS)

    Wiens, D.; Pratt, M. J.; Aster, R. C.; Nyblade, A.; Bromirski, P. D.; Stephen, R. A.; Gerstoft, P.; Diez, A.; Cai, C.; Anthony, R. E.; Shore, P.

    2015-12-01

    Rapid variations in the flow rate of upstream glaciers and ice streams may cause significant deformation of ice shelves. The Whillans Ice Stream (WIS) represents an extreme example of rapid variations in velocity, with motions near the grounding line consisting almost entirely of once or twice-daily stick-slip events with a displacement of up to 0.7 m (Winberry et al, 2014). Here we report observations of compressional waves from the WIS slip events propagating hundreds of kilometers across the Ross Ice Shelf (RIS) detected by broadband seismographs deployed on the ice shelf. The WIS slip events consist of rapid basal slip concentrated at three high friction regions (often termed sticky-spots or asperities) within a period of about 25 minutes (Pratt et al, 2014). Compressional displacement pulses from the second and third sticky spots are detected across the entire RIS up to about 600 km away from the source. The largest pulse results from the third sticky spot, located along the northwestern grounding line of the WIS. Propagation velocities across the ice shelf are significantly slower than the P wave velocity in ice, as the long period displacement pulse is also sensitive to velocities of the water and sediments beneath the ice shelf. Particle motions are, to the limit of resolution, entirely within the horizontal plane and roughly radial with respect to the WIS sticky-spots, but show significant complexity, presumably due to differences in ice velocity, thickness, and the thickness of water and sediment beneath. Study of this phenomenon should lead to greater understanding of how the ice shelf responds to sudden forcing around the periphery.

  14. Object-Oriented Query Language For Events Detection From Images Sequences

    NASA Astrophysics Data System (ADS)

    Ganea, Ion Eugen

    2015-09-01

    In this paper, a method to represent the events extracted from image sequences is presented, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, the final goal of this research.

  15. LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team

    2011-01-01

The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on the research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare-class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early-onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, including results from LSST ISSC team members on the EB (Eclipsing Binary) Factory, environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.

  16. SAMuS: Service-Oriented Architecture for Multisensor Surveillance in Smart Homes

    PubMed Central

    Van de Walle, Rik

    2014-01-01

The design of a service-oriented architecture for multisensor surveillance in smart homes is presented as an integrated solution enabling automatic deployment, dynamic selection, and composition of sensors. Sensors are implemented as Web-connected devices with a uniform Web API. RESTdesc is used to describe the sensors, and a novel solution is presented to automatically compose Web APIs, which can be applied with existing Semantic Web reasoners. We evaluated the solution by building a smart Kinect sensor that is able to dynamically switch between IR and RGB and to optimize person detection by incorporating feedback from pressure sensors, demonstrating collaboration among sensors to enhance detection of complex events. The performance results show that the platform scales to many Web APIs, as composition time remains limited to a few hundred milliseconds in almost all cases. PMID:24778579

  17. Future Expansion of the Lightning Surveillance System at the Kennedy Space Center and the Cape Canaveral Air Force Station, Florida, USA

    NASA Technical Reports Server (NTRS)

    Mata, C. T.; Wilson, J. G.

    2012-01-01

The NASA Kennedy Space Center (KSC) and the Air Force Eastern Range (ER) use data from two cloud-to-ground (CG) lightning detection networks, the Cloud-to-Ground Lightning Surveillance System (CGLSS) and the U.S. National Lightning Detection Network (NLDN), and a volumetric mapping array, the Lightning Detection and Ranging II (LDAR II) system. These systems are used to monitor and characterize lightning that is potentially hazardous to launch or ground operations and hardware. These systems are not perfect, and both have documented missed lightning events when compared with the existing lightning surveillance system at Launch Complex 39B (LC39B). Because of this finding, NASA plans to install a lightning surveillance system around each of the active launch pads, sharing site locations and triggering capabilities when possible. This paper shows how the existing lightning surveillance system at LC39B performed in 2011, as well as the plan for expansion around all active pads.

  18. TU-G-BRD-01: Quantifying the Effectiveness of the Physics Pre-Treatment Plan Review for Detecting Errors in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopan, O; Novak, A; Zeng, J

Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure, the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness, and more hard data are needed. The purpose of this study was to quantify the effectiveness of the physics review with the goal of improving it. Methods: This study analyzed 315 "potentially serious" near-miss incidents within an institutional incident learning system collected over a two-year period. Of these, 139 originated prior to the physics review and were found at the review or afterward. Incidents were classified as events that: 1) were detected by the physics review, 2) could have been detected (but were not), and 3) could not have been detected. Category 1 and 2 events were further classified by which specific check (within the physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review, although 42/73 (58%) were not actually detected. 45/73 (62%) errors originated in treatment planning, making the physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying the isocenter (7/73). Software-based plan-checking systems were evaluated and found to have a potential effectiveness of 40%. Given current data structures, software implementations of some tests, such as the isocenter verification check, would be challenging. Conclusion: Physics plan review is a key safety measure and can detect the majority of reported events. However, a majority of events that potentially could have been detected were not detected in this study, indicating the need to improve the performance of the physics review.

  19. Improving the Detectability of the Catalan Seismic Network for Local Seismic Activity Monitoring

    NASA Astrophysics Data System (ADS)

    Jara, Jose Antonio; Frontera, Tànit; Batlló, Josep; Goula, Xavier

    2016-04-01

The seismic survey of the territory of Catalonia is mainly performed by the regional seismic network operated by the Cartographic and Geologic Institute of Catalonia (ICGC). After successive deployments and upgrades, the current network consists of 16 permanent stations equipped with three-component broadband seismometers (STS2, STS2.5, CMG3ESP and CMG3T), 24-bit digitizers (Nanometrics Trident) and VSAT telemetry. Data are continuously sent in real time via the Hispasat 1D satellite to the ICGC datacenter in Barcelona. Additionally, data from 10 stations in neighboring areas (Spain, France and Andorra) have been received continuously since 2011 via Internet or VSAT, contributing both to detecting and to locating events affecting the region. More than 300 local events with Ml ≥ 0.7 have been detected and located in the region each year. Nevertheless, small-magnitude earthquakes, especially those located in the south and south-west of Catalonia, may still go undetected by the automatic detection system (DAS), which is based on Earthworm (USGS). Thus, in order to improve the detection and characterization of these missed events, one or two new stations should be installed. Before deciding where to install these new stations, the performance of each existing station is evaluated as the fraction of events detected using the station's records, relative to the total number of catalogued events that occurred during the station's operation time from January 1, 2011 to December 31, 2014. These evaluations allow us to build an Event Detection Probability Map (EDPM), a tool required to simulate the EDPMs resulting from different network topology scenarios depending on where the new stations are sited, and essential to the decision-making process for increasing and optimizing the event detection probability of the seismic network.
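    The station evaluation described above (fraction of catalogued events detected per station, combined into a network-level detection probability) can be sketched as follows. This is a hedged illustration with invented numbers, not the ICGC's actual EDPM methodology; in particular, the assumption of independent detections across stations is a simplification.

```python
from itertools import combinations

def station_detection_probability(detected_events, catalogue_events):
    """Fraction of catalogued events a station detected during its operation time."""
    if catalogue_events == 0:
        return 0.0
    return detected_events / catalogue_events

def network_detection_probability(station_probs, min_stations=3):
    """Probability that at least `min_stations` stations detect an event,
    assuming independent detections (a simplifying assumption)."""
    n = len(station_probs)
    prob = 0.0
    for k in range(min_stations, n + 1):          # exactly k detections
        for subset in combinations(range(n), k):
            p = 1.0
            for i in range(n):
                p *= station_probs[i] if i in subset else 1 - station_probs[i]
            prob += p
    return prob
```

    Simulating a candidate network topology then amounts to re-evaluating `network_detection_probability` with the hypothetical new stations' probabilities included.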

  20. Alternatives for Laboratory Measurement of Aerosol Samples from the International Monitoring System of the CTBT

    NASA Astrophysics Data System (ADS)

    Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.

    2013-12-01

The aerosol samples taken from CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) of a detection has led to a desire for multi-station or multi-time-period detections. These would be connected through the concept of 'event formation', analogous to event formation in seismic event analysis. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis in a more sensitive laboratory to gain a substantially lower MDC and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost-effective means of enhancing laboratory sensitivity.

  1. Quantifying full phenological event distributions reveals simultaneous advances, temporal stability and delays in spring and autumn migration timing in long-distance migratory birds.

    PubMed

    Miles, Will T S; Bolton, Mark; Davis, Peter; Dennis, Roy; Broad, Roger; Robertson, Iain; Riddiford, Nick J; Harvey, Paul V; Riddington, Roger; Shaw, Deryk N; Parnaby, David; Reid, Jane M

    2017-04-01

    Phenological changes in key seasonally expressed life-history traits occurring across periods of climatic and environmental change can cause temporal mismatches between interacting species, and thereby impact population and community dynamics. However, studies quantifying long-term phenological changes have commonly only measured variation occurring in spring, measured as the first or mean dates on which focal traits or events were observed. Few studies have considered seasonally paired events spanning spring and autumn or tested the key assumption that single convenient metrics accurately capture entire event distributions. We used 60 years (1955-2014) of daily bird migration census data from Fair Isle, Scotland, to comprehensively quantify the degree to which the full distributions of spring and autumn migration timing of 13 species of long-distance migratory bird changed across a period of substantial climatic and environmental change. In most species, mean spring and autumn migration dates changed little. However, the early migration phase (≤10th percentile date) commonly got earlier, while the late migration phase (≥90th percentile date) commonly got later. Consequently, species' total migration durations typically lengthened across years. Spring and autumn migration phenologies were not consistently correlated within or between years within species and hence were not tightly coupled. Furthermore, different metrics quantifying different aspects of migration phenology within seasons were not strongly cross-correlated, meaning that no single metric adequately described the full pattern of phenological change. These analyses therefore reveal complex patterns of simultaneous advancement, temporal stability and delay in spring and autumn migration phenologies, altering species' life-history structures. 
Additionally, they demonstrate that this complexity is only revealed if multiple metrics encompassing entire seasonal event distributions, rather than single metrics, are used to quantify phenological change. Existing evidence of long-term phenological changes detected using only one or two metrics should consequently be interpreted cautiously because divergent changes occurring simultaneously could potentially have remained undetected. © 2016 John Wiley & Sons Ltd.
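    The multi-metric approach advocated above can be illustrated with a small sketch that derives first-arrival, mean, and 10th/90th-percentile passage dates, plus migration duration, from a daily count series. The day numbers and counts are invented for illustration; the study's actual census processing is more involved.

```python
def percentile_date(days, counts, q):
    """Day on which the cumulative count first reaches fraction q of the total."""
    total = sum(counts)
    cum = 0
    for day, c in zip(days, counts):
        cum += c
        if cum >= q * total:
            return day
    return days[-1]

def phenology_metrics(days, counts):
    """Several phenology metrics from one daily migration count series;
    no single one of these captures the full event distribution."""
    total = sum(counts)
    p10 = percentile_date(days, counts, 0.10)
    p90 = percentile_date(days, counts, 0.90)
    return {
        "first": next(d for d, c in zip(days, counts) if c > 0),
        "mean": sum(d * c for d, c in zip(days, counts)) / total,
        "p10": p10,
        "p90": p90,
        "duration": p90 - p10,   # lengthens if early phase advances and late phase delays
    }
```

    Tracking only `first` or `mean` across years would miss exactly the pattern the study reports: `p10` getting earlier and `p90` getting later while `mean` barely moves.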

  2. Mechanistic insights into the luminescent sensing of organophosphorus chemical warfare agents and simulants using trivalent lanthanide complexes.

    PubMed

    Dennison, Genevieve H; Johnston, Martin R

    2015-04-20

    Organophosphorus chemical warfare agents (OP CWAs) are potent acetylcholinesterase inhibitors that can cause incapacitation and death within minutes of exposure, and furthermore are largely undetectable by the human senses. Fast, efficient, sensitive and selective detection of these compounds is therefore critical to minimise exposure. Traditional molecular-based sensing approaches have exploited the chemical reactivity of the OP CWAs, whereas more recently supramolecular-based approaches using non-covalent interactions have gained momentum. This is due, in part, to the potential development of sensors with second-generation properties, such as reversibility and multifunction capabilities. Supramolecular sensors also offer opportunities for incorporation of metal ions allowing for the exploitation of their unique properties. In particular, trivalent lanthanide ions are being increasingly used in the OP CWA sensing event and their use in supramolecular sensors is discussed in this Minireview. We focus on the fundamental interactions of simple lanthanide systems with OP CWAs and simulants, along with the development of more elaborate and complex systems including those containing nanotubes, polymers and gold nanoparticles. Whilst literature investigations into lanthanide-based OP CWA detection systems are relatively scarce, their unique and versatile properties provide a promising platform for the development of more efficient and complex sensing systems into the future. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural water and the environment. This study addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measurements from 2004 to 2011. Storm-event-based data series (716 rainfall events and 521 runoff events) are derived from the measured continuous time series. The Mann-Kendall test is applied to these event-based series for trend detection. No trend is found in rainfall, while an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in the trend detection results lies in data availability: a lack of events due to missing data leads to dramatically increased uncertainty, whereas measurement uncertainty in the time series plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
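    The Mann-Kendall test used for trend detection here can be sketched in a few lines (normal approximation, without the tie correction a production implementation would include):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation
    z-score (no tie correction in this sketch)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

    A |z| above 1.96 rejects the no-trend null at the 5% level; applied to the yearly event runoff volumes above, a positive significant z would correspond to the increasing runoff trend reported.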

  4. Adaptive Waveform Correlation Detectors for Arrays: Algorithms for Autonomous Calibration

    DTIC Science & Technology

    2007-09-01

March 17, 2005. The seismic signals from both master and detected events are followed by infrasound arrivals. Note the long duration of the ... correlation coefficient traces with a significant array gain. A detected event that is co-located with the master event will record the same time-difference ... estimating the detection threshold reduction for a range of highly repeating seismic sources using arrays of different configurations and at different ...

  5. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    NASA Astrophysics Data System (ADS)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in spike-train synchrony analysis in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not, without the need to select an additional parameter in the form of a maximally tolerable delay between these events. This is conceptually justified in the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes.
In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two events to be considered potentially related. Both measures are then used to generate climate networks from parts of the satellite-based TRMM precipitation data set at daily resolution covering the Indian and East Asian monsoon domains, respectively, thereby reanalysing previously published results. The obtained spatial patterns of degree densities and local clustering coefficients exhibit marked differences between both similarity measures. Specifically, we demonstrate that there exists a strong relationship between the fraction of extremes occurring at subsequent days and the degree density in the event synchronization based networks, suggesting that the spatial patterns obtained using this approach are strongly affected by the presence of serial dependencies between events. Given that a manual selection of the maximally tolerable delay between two events can be guided by a priori climatological knowledge and even used for systematic testing of different hypotheses on climatic processes underlying the emergence of spatio-temporal patterns of extreme precipitation, our results provide evidence that event coincidence rates are a more appropriate statistical characteristic for similarity assessment and network construction for climate extremes, while results based on event synchronization need to be interpreted with great caution.
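    The contrast drawn above hinges on event coincidence rates taking an explicit, user-chosen maximum delay. A minimal sketch of one common symmetric-window variant, assuming sorted lists of event times (a simplified illustration, not the exact definition used in the study):

```python
from bisect import bisect_left

def event_coincidence_rate(events_a, events_b, delta_t):
    """Fraction of events in `events_a` with at least one event in the
    sorted list `events_b` within +/- delta_t, the manually chosen
    maximum tolerable delay."""
    if not events_a:
        return 0.0
    hits = 0
    for t in events_a:
        # First event in events_b at or after t - delta_t.
        i = bisect_left(events_b, t - delta_t)
        if i < len(events_b) and events_b[i] <= t + delta_t:
            hits += 1
    return hits / len(events_a)
```

    The explicit `delta_t` is precisely what lets climatological knowledge (e.g. typical synoptic time scales) guide the analysis, instead of the implicit, waiting-time-dependent window of event synchronization.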

  6. Exploring the unbinding of Leishmania (L.) amazonensis CPB derived-epitopes from H2 MHC class I proteins.

    PubMed

    Brandt, Artur M L; Batista, Paulo Ricardo; Souza-Silva, Franklin; Alves, Carlos Roberto; Caffarena, Ernesto Raul

    2016-04-01

New strategies to control Leishmania disease demand an extensive knowledge about several aspects of infection including the understanding of its molecular events. In murine models, cysteine proteinase B from Leishmania amazonensis promotes regulation of immune response, and fragments from its C-terminus extension (cyspep) can play a decisive role in the host-parasite interaction. The interaction between cyspep-derived peptides and major histocompatibility complex (MHC) proteins is a crucial factor in Leishmania infections. Seven cyspep-derived peptides, previously identified as capable of interacting with H-2 (murine) MHC class I proteins, were studied in this work. We established a protocol to simulate the unbinding of these peptides from the cleft of H-2 receptors. From the simulations, we estimated the corresponding free energy of dissociation (ΔGd) and described the molecular events that occur during the exit of peptides from the cleft. To test the reliability of this method, we first applied it to a calibration set of four crystallographic MHC/peptide complexes. Next, we explored the unbinding of the seven complexes mentioned above. Results were consistent with ΔGd values obtained from surface plasmon resonance (SPR) experiments. We also identified some of the primary interactions between peptides and H-2 receptors, and we detected three regions of influence for the interaction. This pattern was systematically observed for the peptides and helped determine a minimum distance for the real interaction between peptides and H-2 proteins occurring at ∼25 Å. © 2016 Wiley Periodicals, Inc.

  7. Visual Sensor Based Abnormal Event Detection with Moving Shadow Removal in Home Healthcare Applications

    PubMed Central

    Lee, Young-Sook; Chung, Wan-Young

    2012-01-01

Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving-object detection and tracking, moving cast shadows can be misclassified as part of objects or as moving objects, so shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Abnormal event detection based on visual sensors, using shape-feature variation and 3-D trajectories, is presented to overcome low fall-detection rates. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward falls, backward falls, and sideways falls, from normal activities. PMID:22368486

  8. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

Recent advances in microelectromechanical systems (MEMS) technology, digital electronics, and wireless communications have enabled the development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications, where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  9. Method of controlling cyclic variation in engine combustion

    DOEpatents

    Davis, L.I. Jr.; Daw, C.S.; Feldkamp, L.A.; Hoard, J.W.; Yuan, F.; Connolly, F.T.

    1999-07-13

    Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling. 27 figs.

  10. Method of controlling cyclic variation in engine combustion

    DOEpatents

    Davis, Jr., Leighton Ira; Daw, Charles Stuart; Feldkamp, Lee Albert; Hoard, John William; Yuan, Fumin; Connolly, Francis Thomas

    1999-01-01

    Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling.
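    The control loop described in these two patent records can be caricatured in a few lines. The gains and the linear anti-correlation prediction below are invented for illustration and are not the patented implementation:

```python
def next_control(accel_k, avg_accel, base_pulse_width,
                 anti_corr_gain=0.5, correction_gain=0.1, actuator_gain=0.01):
    """Sketch of one control step for cylinder i: predict the acceleration
    at combustion event k+1, adjust the target by a term inversely
    proportional to the cylinder's average output, and derive a control
    output (e.g. fuel pulse width). All gains are illustrative."""
    # Predict the next event via anti-correlation with the current one.
    predicted_next = -anti_corr_gain * accel_k
    # Correction term inversely proportional to the average output.
    correction = correction_gain / avg_accel if avg_accel else 0.0
    target = predicted_next + correction
    # Linear actuator model mapping the needed change to a pulse width.
    return base_pulse_width + actuator_gain * (target - accel_k)
```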

  11. Does Leisure Time as a Stress Coping Resource Increase Affective Complexity? Applying the Dynamic Model of Affect (DMA).

    PubMed

    Qian, Xinyi Lisa; Yarnal, Careen M; Almeida, David M

    2013-01-01

Affective complexity, a manifestation of psychological well-being, refers to the relative independence between positive and negative affect (PA, NA). According to the Dynamic Model of Affect (DMA), stressful situations lead to a strongly inverse PA-NA relationship, reducing affective complexity. Meanwhile, positive events can sustain affective complexity by restoring PA-NA independence. Leisure, one type of positive event, has been identified as a coping resource. This study used the DMA to assess whether leisure time helps restore affective complexity on stressful days. We found that on days with more leisure time than usual, an individual experienced a less negative PA-NA relationship after daily stressful events. The finding demonstrates the value of leisure time as a coping resource and the DMA's contribution to coping research.

  12. Quantitative detection of pathogens in centrifugal microfluidic disks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Chung-Yan; Schaff, Ulrich Y.; Sommer, Gregory Jon

    A system and methods for detection of a nucleic acid including forming a plurality of nucleic acid detection complexes are described, each of the complexes including a nucleic acid analyte, a detection agent and a functionalized probe. The method further including binding the nucleic acid detection complexes to a plurality of functionalized particles in a fluid sample and separating the functionalized particles having the nucleic acid detection complexes bound thereto from the fluid sample using a density media. The nucleic acid analyte is detected by detecting the detection agent.

  13. Identification and removal of low-complexity sites in allele-specific analysis of ChIP-seq data.

    PubMed

    Waszak, Sebastian M; Kilpinen, Helena; Gschwind, Andreas R; Orioli, Andrea; Raghav, Sunil K; Witwicki, Robert M; Migliavacca, Eugenia; Yurovsky, Alisa; Lappalainen, Tuuli; Hernandez, Nouria; Reymond, Alexandre; Dermitzakis, Emmanouil T; Deplancke, Bart

    2014-01-15

High-throughput sequencing technologies enable the genome-wide analysis of the impact of genetic variation on molecular phenotypes at unprecedented resolution. However, although powerful, these technologies can also introduce unexpected artifacts. We investigated the impact of library amplification bias on the identification of allele-specific (AS) molecular events from high-throughput sequencing data derived from chromatin immunoprecipitation assays (ChIP-seq). Putative AS DNA binding activity for RNA polymerase II was determined using ChIP-seq data derived from lymphoblastoid cell lines of two parent-daughter trios. We found that, at high sequencing depth, many significant AS binding sites suffered from an amplification bias, as evidenced by a larger number of clonal reads representing one of the two alleles. To alleviate this bias, we devised an amplification-bias detection strategy, which filters out sites with low read complexity and sites featuring a significant excess of clonal reads. This method will be useful for AS analyses involving ChIP-seq and other functional sequencing assays. The R package absfilter for library clonality simulations and detection of amplification-biased sites is available from http://updepla1srv1.epfl.ch/waszaks/absfilter
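    One simple way to implement the second filtering criterion (a significant excess of clonal reads at a site) is an upper-tail binomial test against an expected clonality rate. The rate and significance threshold below are illustrative assumptions, not the actual absfilter criteria:

```python
import math

def binom_sf(k, n, p):
    """Upper tail P(X >= k) for X ~ Binomial(n, p), computed directly."""
    return sum(
        math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1)
    )

def is_amplification_biased(clonal_reads, total_reads,
                            expected_rate=0.05, alpha=0.01):
    """Flag a site whose clonal-read count is improbably high under an
    assumed background clonality rate (illustrative parameters)."""
    return binom_sf(clonal_reads, total_reads, expected_rate) < alpha
```

    Sites flagged this way would be excluded before calling allele-specific binding, so that PCR duplicates of one allele are not mistaken for genuine AS signal.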

  14. Human behavior recognition using a context-free grammar

    NASA Astrophysics Data System (ADS)

    Rosani, Andrea; Conci, Nicola; De Natale, Francesco G. B.

    2014-05-01

    Automatic recognition of human activities and behaviors is still a challenging problem for many reasons, including limited accuracy of the data acquired by sensing devices, high variability of human behaviors, and gap between visual appearance and scene semantics. Symbolic approaches can significantly simplify the analysis and turn raw data into chains of meaningful patterns. This allows getting rid of most of the clutter produced by low-level processing operations, embedding significant contextual information into the data, as well as using simple syntactic approaches to perform the matching between incoming sequences and models. We propose a symbolic approach to learn and detect complex activities through the sequences of atomic actions. Compared to previous methods based on context-free grammars, we introduce several important novelties, such as the capability to learn actions based on both positive and negative samples, the possibility of efficiently retraining the system in the presence of misclassified or unrecognized events, and the use of a parsing procedure that allows correct detection of the activities also when they are concatenated and/or nested one with each other. An experimental validation on three datasets with different characteristics demonstrates the robustness of the approach in classifying complex human behaviors.
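    The idea of matching atomic-action sequences against activity productions can be shown with a toy context-free grammar. The grammar, action names, and matcher below are invented for illustration; the paper's learned grammars and parser are more sophisticated.

```python
# Nonterminals map to lists of productions; any other symbol is a
# terminal atomic action. All names here are hypothetical.
GRAMMAR = {
    "ACTIVITY": [["ENTER", "WORK", "LEAVE"]],
    "ENTER": [["open_door", "walk_in"]],
    "WORK": [["sit", "type"], ["sit", "read"]],
    "LEAVE": [["stand", "walk_out"]],
}

def matches(symbol, actions, pos=0):
    """Return the set of positions reachable after deriving `symbol`
    starting at `pos` in the action sequence."""
    if symbol not in GRAMMAR:  # terminal: must equal the next action
        ok = pos < len(actions) and actions[pos] == symbol
        return {pos + 1} if ok else set()
    ends = set()
    for production in GRAMMAR[symbol]:
        positions = {pos}
        for sym in production:  # thread reachable positions through the production
            positions = {e for p in positions for e in matches(sym, actions, p)}
        ends |= positions
    return ends

def recognizes(actions):
    """True if the whole action sequence derives from ACTIVITY."""
    return len(actions) in matches("ACTIVITY", actions)
```

    Because the matcher tracks sets of reachable positions, alternative productions (e.g. the two WORK variants) are explored without backtracking bookkeeping.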

  15. Simulating Operation of a Complex Sensor Network

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Clare, Loren; Woo, Simon

    2008-01-01

    Simulation Tool for ASCTA Microsensor Network Architecture (STAMiNA) ["ASCTA" denotes the Advanced Sensors Collaborative Technology Alliance.] is a computer program for evaluating conceptual sensor networks deployed over terrain to provide military situational awareness. This or a similar program is needed because of the complexity of interactions among such diverse phenomena as sensing and communication portions of a network, deployment of sensor nodes, effects of terrain, data-fusion algorithms, and threat characteristics. STAMiNA is built upon a commercial network-simulator engine, with extensions to include both sensing and communication models in a discrete-event simulation environment. Users can define (1) a mission environment, including terrain features; (2) objects to be sensed; (3) placements and modalities of sensors, abilities of sensors to sense objects of various types, and sensor false alarm rates; (4) trajectories of threatening objects; (5) means of dissemination and fusion of data; and (6) various network configurations. By use of STAMiNA, one can simulate detection of targets through sensing, dissemination of information by various wireless communication subsystems under various scenarios, and fusion of information, incorporating such metrics as target-detection probabilities, false-alarm rates, and communication loads, and capturing effects of terrain and threat.
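    The discrete-event core of such a simulator can be sketched with a priority queue of timestamped sensor detections and a simple time-window fusion rule. The sensors, times, and fusion window below are invented, and real tools like the one described add terrain, communication, and threat models on top:

```python
import heapq

def simulate(detection_times, fusion_window=2.0):
    """Process per-sensor detection events in time order and fuse
    detections whose times fall within `fusion_window` of a cluster's
    first event. Returns the fused clusters as lists of sensor names."""
    queue = [(t, sensor) for sensor, times in detection_times.items()
             for t in times]
    heapq.heapify(queue)                      # event list keyed by time
    fused, current = [], []
    while queue:
        t, sensor = heapq.heappop(queue)      # next event in time order
        if current and t - current[0][0] > fusion_window:
            fused.append([s for _, s in current])
            current = []
        current.append((t, sensor))
    if current:
        fused.append([s for _, s in current])
    return fused
```

    Metrics such as detection probability or false-alarm rate would then be computed over the fused clusters against the simulated ground truth.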

  16. Mitochondrial impairment contributes to cocaine-induced cardiac dysfunction: Prevention by the targeted antioxidant MitoQ.

    PubMed

    Vergeade, Aurélia; Mulder, Paul; Vendeville-Dehaudt, Cathy; Estour, François; Fortin, Dominique; Ventura-Clapier, Renée; Thuillez, Christian; Monteil, Christelle

    2010-09-01

    The goal of this study was to assess mitochondrial function and ROS production in an experimental model of cocaine-induced cardiac dysfunction. We hypothesized that cocaine abuse may lead to altered mitochondrial function that in turn may cause left ventricular dysfunction. Seven days of cocaine administration to rats led to an increased oxygen consumption detected in cardiac fibers, specifically through complex I and complex III. ROS levels were increased, specifically in interfibrillar mitochondria. In parallel there was a decrease in ATP synthesis, whereas no difference was observed in subsarcolemmal mitochondria. This uncoupling effect on oxidative phosphorylation was not detectable after short-term exposure to cocaine, suggesting that these mitochondrial abnormalities were a late rather than a primary event in the pathological response to cocaine. MitoQ, a mitochondrial-targeted antioxidant, was shown to completely prevent these mitochondrial abnormalities as well as cardiac dysfunction characterized here by a diastolic dysfunction studied with a conductance catheter to obtain pressure-volume data. Taken together, these results extend previous studies and demonstrate that cocaine-induced cardiac dysfunction may be due to a mitochondrial defect. Copyright 2010 Elsevier Inc. All rights reserved.

  17. History of on-orbit satellite fragmentations

    NASA Technical Reports Server (NTRS)

    Nauer, David J.

    1992-01-01

    Since the first serious satellite fragmentation occurred in Jun. 1961, and instantaneously increased the total Earth satellite population by more than 400 percent, the issue of space operations within the finite region of space around the Earth has been the subject of increasing interest and concern. The prolific satellite fragmentations of the 1970's and the marked increase in the number of fragmentations in the 1980's served to widen international research into the characteristics and consequences of such events. Plans for large, manned space stations in the next decade and beyond demand a better understanding of the hazards of the dynamic Earth satellite population. The contribution of satellite fragmentations to the growth of the Earth satellite population is complex and varied. The majority of detectable fragmentation debris have already fallen out of orbit, and the effects of 40 percent of all fragmentations have completely disappeared. In this volume, satellite fragmentations are categorized by their assessed nature and to a lesser degree by their effect on the near-Earth space environment. A satellite breakup is the usually destructive disassociation of an orbital payload, rocket body, or structure, often with a wide range of ejecta velocities. A satellite breakup may be accidental or the result of intentional actions, e.g., due to a propulsion system malfunction or a space weapons test, respectively. An anomalous event is the unplanned separation, usually at low velocity, of one or more detectable objects from a satellite which remains essentially intact. Anomalous events can be caused by material deterioration of items such as thermal blankets, protective shields, or solar panels. As a general rule, a satellite breakup will produce considerably more debris, both trackable and non-trackable, than an anomalous event. 
From one perspective, satellite breakups may be viewed as a measure of the effects of man's activity on the environment, while anomalous events may be a measure of the effects of the environment on man-made objects.

  18. History of on-orbit satellite fragmentations

    NASA Astrophysics Data System (ADS)

    Nauer, David J.

    1992-07-01

    Since the first serious satellite fragmentation occurred in Jun. 1961, and instantaneously increased the total Earth satellite population by more than 400 percent, the issue of space operations within the finite region of space around the Earth has been the subject of increasing interest and concern. The prolific satellite fragmentations of the 1970's and the marked increase in the number of fragmentations in the 1980's served to widen international research into the characteristics and consequences of such events. Plans for large, manned space stations in the next decade and beyond demand a better understanding of the hazards of the dynamic Earth satellite population. The contribution of satellite fragmentations to the growth of the Earth satellite population is complex and varied. The majority of detectable fragmentation debris have already fallen out of orbit, and the effects of 40 percent of all fragmentations have completely disappeared. In this volume, satellite fragmentations are categorized by their assessed nature and to a lesser degree by their effect on the near-Earth space environment. A satellite breakup is the usually destructive disassociation of an orbital payload, rocket body, or structure, often with a wide range of ejecta velocities. A satellite breakup may be accidental or the result of intentional actions, e.g., due to a propulsion system malfunction or a space weapons test, respectively. An anomalous event is the unplanned separation, usually at low velocity, of one or more detectable objects from a satellite which remains essentially intact. Anomalous events can be caused by material deterioration of items such as thermal blankets, protective shields, or solar panels. As a general rule, a satellite breakup will produce considerably more debris, both trackable and non-trackable, than an anomalous event. 
From one perspective, satellite breakups may be viewed as a measure of the effects of man's activity on the environment, while anomalous events may be a measure of the effects of the environment on man-made objects.

  19. Global Profiling and Molecular Characterization of Alternative Splicing Events Misregulated in Lung Cancer

    PubMed Central

    Misquitta-Ali, Christine M.; Cheng, Edith; O'Hanlon, Dave; Liu, Ni; McGlade, C. Jane; Tsao, Ming Sound; Blencowe, Benjamin J.

    2011-01-01

    Alternative splicing (AS) is a widespread mechanism underlying the generation of proteomic and regulatory complexity. However, which of the myriad of human AS events play important roles in disease is largely unknown. To identify frequently occurring AS events in lung cancer, we used AS microarray profiling and reverse transcription-PCR (RT-PCR) assays to survey patient-matched normal and adenocarcinoma tumor tissues from the lungs of 29 individuals diagnosed with non-small cell lung cancer (NSCLC). Of 5,183 profiled alternative exons, four displayed tumor-associated changes in the majority of the patients. These events affected transcripts from the VEGFA, MACF1, APP, and NUMB genes. Similar AS changes were detected in NUMB and APP transcripts in primary breast and colon tumors. Tumor-associated increases in NUMB exon 9 inclusion correlated with reduced levels of NUMB protein expression and activation of the Notch signaling pathway, an event that has been linked to tumorigenesis. Moreover, short hairpin RNA (shRNA) knockdown of NUMB followed by isoform-specific rescue revealed that expression of the exon 9-skipped (nontumor) isoform represses Notch target gene activation whereas expression of the exon 9-included (tumor) isoform lacks this activity and is capable of promoting cell proliferation. The results thus reveal widespread AS changes in NSCLC that impact cell signaling in a manner that likely contributes to tumorigenesis. PMID:21041478

  20. Global profiling and molecular characterization of alternative splicing events misregulated in lung cancer.

    PubMed

    Misquitta-Ali, Christine M; Cheng, Edith; O'Hanlon, Dave; Liu, Ni; McGlade, C Jane; Tsao, Ming Sound; Blencowe, Benjamin J

    2011-01-01

    Alternative splicing (AS) is a widespread mechanism underlying the generation of proteomic and regulatory complexity. However, which of the myriad of human AS events play important roles in disease is largely unknown. To identify frequently occurring AS events in lung cancer, we used AS microarray profiling and reverse transcription-PCR (RT-PCR) assays to survey patient-matched normal and adenocarcinoma tumor tissues from the lungs of 29 individuals diagnosed with non-small cell lung cancer (NSCLC). Of 5,183 profiled alternative exons, four displayed tumor-associated changes in the majority of the patients. These events affected transcripts from the VEGFA, MACF1, APP, and NUMB genes. Similar AS changes were detected in NUMB and APP transcripts in primary breast and colon tumors. Tumor-associated increases in NUMB exon 9 inclusion correlated with reduced levels of NUMB protein expression and activation of the Notch signaling pathway, an event that has been linked to tumorigenesis. Moreover, short hairpin RNA (shRNA) knockdown of NUMB followed by isoform-specific rescue revealed that expression of the exon 9-skipped (nontumor) isoform represses Notch target gene activation whereas expression of the exon 9-included (tumor) isoform lacks this activity and is capable of promoting cell proliferation. The results thus reveal widespread AS changes in NSCLC that impact cell signaling in a manner that likely contributes to tumorigenesis.

  1. Advancing Porous Silicon Biosensor Technology for Use in Clinical Diagnostics

    NASA Astrophysics Data System (ADS)

    Bonanno, Lisa Marie

    Inexpensive and robust analytical techniques for detecting molecular recognition events are in great demand in healthcare, food safety, and environmental monitoring. Despite vast research in this area, challenges remain to develop practical biomolecular platforms that meet the rigorous demands of real-world applications. This includes maintaining low-cost devices that are sensitive and specific in complex test specimens, are stable after storage, have short assay time, and possess minimal complexity of instrumentation for readout. Nanostructured porous silicon (PSi) material has been identified as an ideal candidate towards achieving these goals, and the past decade has seen diverse proof-of-principle studies developing optical-based sensing techniques. In Part 1 of this thesis, the impact of surface chemistry and PSi morphology on detection sensitivity of target molecules is investigated. Initial proof-of-concept that PSi devices facilitate detection of protein in whole blood is demonstrated. This work highlights the importance of material stability and blocking chemistry for sensor use in real-world biological samples. In addition, the intrinsic filtering capability of the 3-D PSi morphology is shown to be an advantage in complex solutions, such as whole blood. Ultimately, this initial work identified a need to improve the detection sensitivity of the PSi biosensor technique to facilitate clinical diagnostic use over relevant target concentration ranges. The second part of this thesis builds upon the sensitivity challenges highlighted in the first part: development of a surface-bound competitive inhibition immunoassay improved detection sensitivity for small-molecular-weight targets (opiates) over a relevant clinical concentration range. In addition, optimization of the assay protocol addressed issues of maintaining sensor stability after storage. 
    Performance of the developed assay (specificity and sensitivity) was then validated in a blind clinical study that screened real patient urine samples (n=70) for opiates in collaboration with the Strong Memorial Hospital Clinical Toxicology Laboratory. PSi sensor results showed improved clinical specificity over current commercial opiate immunoassay techniques and therefore identified potential for a reduction in false-negative and false-positive screening results. Here, we demonstrate for the first time successful clinical capability of a PSi sensor to detect opiates as a model target in real-world patient samples. The final part of this thesis explores novel sensor designs to leverage the tunable optical properties of PSi photonic devices and facilitate colorimetric readout of molecular recognition events by the unaided eye. Such a design is ideal for uncomplicated diagnostic screening at point-of-care, as no instrumentation is needed for result readout. The photonic PSi transducers were integrated with target-analyte-responsive hydrogels (TRAP-gels) that, upon exposure to a target solution, swell and dissolve, inducing material property changes that are optically detected by the incorporated PSi transducer. This strategy extends target detection throughout the 3-D internal volume of the PSi, improving upon current techniques that limit detection to the 2-D surface area of the PSi. Work to achieve this approach involved design of TRAP-gel networks, polymer synthesis and characterization techniques, and optical characterization of the hybrid hydrogel-PSi material sensor. Successful implementation of a hybrid sensor design was exhibited for a model chemical target (reducing agent), in which a visual colorimetric change from red to green was observed for above-threshold exposure to the chemical target. 
    In addition, initial proof-of-concept of an opiate-responsive TRAP-gel is also demonstrated, in which cross-links are formed by antibody-antigen interactions and exposure to opiates induces bulk gel dissolution.

  2. CHRONOLOGY OF MAJOR METAMORPHIC EVENTS IN THE SOUTHEASTERN UNITED STATES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, L.E.; Kulp, J.L.; Eckelmann, F.D.

    1959-10-01

    Potassium-argon and rubidium-strontium age measurements have been made on a variety of granites, pegmatites, gneisses and schists which comprise the plutonic-metamorphic complex of the Piedmont and Blue Ridge of the southeastern United States. Large portions of the area appear to have been metamorphosed initially at approximately the same time as the Grenville Province, i.e., about 900 to 1100 m.y. ago. Superimposed on this older metamorphic province was a major orogenic event culminating at about 350 m.y. with widespread recrystallization of existing rocks and intrusion of pegmatites in the Spruce Pine, Franklin-Sylva and Bryson City Districts, and granites in the Virginia and North Carolina Piedmont. There is strong evidence of an additional metamorphic epoch between 350 and 1000 m.y., but its effects have been largely obliterated by the 350 m.y. event. In western North Carolina a transition of apparent ages from 355 to 890 m.y. occurs in the same rock unit (Cranberry gneiss) over a distance of about 10 miles across the strike of the border of the 350 m.y. event. In the southeastern Piedmont of Georgia and South Carolina a younger metamorphic event or events can be detected, producing rocks of apparent age ranging from 230 to 310 m.y. The time of these orogenies is compared with those in the Central and Northern Appalachians. Evidence is accumulating that the Holmes' time scale will have to be considerably lengthened. (auth)

  3. Developing Fluorescence Sensor Systems for Early Detection of Nitrification Events in Chloraminated Drinking Water Distribution Systems

    EPA Science Inventory

    Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...

  4. Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data

    NASA Astrophysics Data System (ADS)

    Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.

    2017-12-01

    Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, flood, and drought. In particular, the copula has been a useful tool for multivariate probability models because it imposes no limitation on the choice of marginal distributions. A time series of rainfall data can be partitioned into rainfall events using an inter-event time definition (IETD), with each event characterized by a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently in the context of climate change, and trend detection is important for determining whether the data are nonstationary. In this study, trend detection and nonstationary bivariate frequency analysis were performed on the depth and duration of rainfall events. Hourly records of more than 30 years from 62 Korea Meteorological Administration (KMA) stations were used, and the suitability of the nonstationary copula for rainfall events was examined by goodness-of-fit tests.
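    The IETD-based event separation the abstract relies on is straightforward to sketch. The function below is a generic illustration, not the authors' code; the 6-hour IETD and the toy series are assumptions:

```python
def ietd_events(hourly_rain, ietd=6):
    """Split an hourly rainfall series into independent events: a dry spell
    of at least `ietd` hours ends the current event. Returns
    (duration_hours, depth) pairs. The IETD value is illustrative."""
    events, start, dry = [], None, 0
    for i, r in enumerate(hourly_rain):
        if r > 0:
            if start is None:
                start = i          # a new event begins at the first wet hour
            end, dry = i, 0
        elif start is not None:
            dry += 1
            if dry >= ietd:        # dry spell long enough: close the event
                events.append((end - start + 1,
                               sum(hourly_rain[start:end + 1])))
                start = None
    if start is not None:          # close a trailing open event
        events.append((end - start + 1, sum(hourly_rain[start:end + 1])))
    return events

series = [0, 2, 3, 0, 1, 0, 0, 0, 0, 0, 0, 4, 0]
print(ietd_events(series))  # [(4, 6), (1, 4)]
```

    Each (duration, depth) pair is then a sample for the bivariate (copula-based) frequency analysis.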

  5. Influence of generalized complexity of a musical event on subjective time estimation.

    PubMed

    Bueno, José Lino Oliveira; Firmino, Erico Artioli; Engelman, Arno

    2002-04-01

    This study examined the variations in the apparent duration of music events produced by differences in their generalized compositional complexity. Stimuli were the first 90 sec. of Gustav Mahler's 3rd Movement of Symphony No. 2 (low complexity) and the first 90 sec. of Luciano Bério's 3rd Movement of Symphony for Eight Voices and Orchestra (high complexity). Bério's symphony is another "reading" of Mahler's. On the compositional base of Mahler's symphony, Bério explored complexity in several musical elements--temporal (i.e., rhythm), nontemporal (i.e., pitch, orchestral and vocal timbre, texture, density), and verbal (i.e., text, words, phonemes). These two somewhat differently filled durations were reproduced by 10 women and 6 men with a stopwatch under the prospective paradigm. Analysis showed that the more generalized complexity of the musical event was followed by greater subjective estimation of the duration of this 90-sec. symphonic excerpt.

  6. A novel method to accurately locate and count large numbers of steps by photobleaching.

    PubMed

    Tsekouras, Konstantinos; Custer, Thomas C; Jashnsaz, Hossein; Walter, Nils G; Pressé, Steve

    2016-11-07

    Photobleaching event counting is a single-molecule fluorescence technique that is increasingly being used to determine the stoichiometry of protein and RNA complexes composed of many subunits in vivo as well as in vitro. By tagging protein or RNA subunits with fluorophores, activating them, and subsequently observing as the fluorophores photobleach, one obtains information on the number of subunits in a complex. The noise properties in a photobleaching time trace depend on the number of active fluorescent subunits. Thus, as fluorophores stochastically photobleach, noise properties of the time trace change stochastically, and these varying noise properties have created a challenge in identifying photobleaching steps in a time trace. Although photobleaching steps are often detected by eye, this method only works for high individual fluorophore emission signal-to-noise ratios and small numbers of fluorophores. With filtering methods or currently available algorithms, it is possible to reliably identify photobleaching steps for up to 20-30 fluorophores and signal-to-noise ratios down to ∼1. Here we present a new Bayesian method of counting steps in photobleaching time traces that takes into account stochastic noise variation in addition to complications such as overlapping photobleaching events that may arise from fluorophore interactions, as well as on-off blinking. Our method is capable of detecting ≥50 photobleaching steps even for signal-to-noise ratios as low as 0.1, can find up to ≥500 steps for more favorable noise profiles, and is computationally inexpensive. © 2016 Tsekouras et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
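    The paper's Bayesian method is not reproduced here, but the underlying task (counting downward intensity jumps in a noisy trace) can be illustrated with a naive baseline. The window size, threshold, and synthetic trace are arbitrary illustrative choices:

```python
def count_steps(trace, window=5, threshold=0.5):
    """Naive photobleaching-step counter (NOT the Bayesian method of the
    paper): average the trace over non-overlapping windows, then count
    downward jumps larger than `threshold` between adjacent window means."""
    means = [sum(trace[i:i + window]) / window
             for i in range(0, len(trace) - window + 1, window)]
    return sum(1 for a, b in zip(means, means[1:]) if a - b > threshold)

# Synthetic 3-fluorophore trace: intensity drops 3 -> 2 -> 1 -> 0 in blocks.
trace = [3.0] * 5 + [2.0] * 5 + [1.0] * 5 + [0.0] * 5
print(count_steps(trace))  # 3
```

    Such fixed-threshold counting is exactly what breaks down at low signal-to-noise ratios and large fluorophore numbers, which is the regime the paper's stochastic-noise model is designed for.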

  7. Reciprocal Modulation of Cognitive and Emotional Aspects in Pianistic Performances

    PubMed Central

    Higuchi, Marcia K. Kodama; Fornari, José; Del Ben, Cristina M.; Graeff, Frederico G.; Leite, João Pereira

    2011-01-01

    Background High level piano performance requires complex integration of perceptual, motor, cognitive and emotive skills. Observations in psychology and neuroscience studies have suggested reciprocal inhibitory modulation of cognition by emotion and of emotion by cognition. However, it is still unclear how cognitive states may influence pianistic performance. The aim of the present study is to verify the influence of cognitive and affective attention on piano performances. Methods and Findings Nine pianists were instructed to play the same piece of music, first focusing only on cognitive aspects of musical structure (cognitive performances), and second, paying attention solely to affective aspects (affective performances). Audio files from the pianistic performances were examined using a computational model that retrieves nine specific musical features (descriptors) – loudness, articulation, brightness, harmonic complexity, event detection, key clarity, mode detection, pulse clarity and repetition. In addition, the number of volunteers' errors in the recording sessions was counted. Comments from pianists about their thoughts during performances were also evaluated. The analyses of the audio files via the musical descriptors indicated that the affective performances have more agogics, legatos, and piano phrasing, and a lower perceived event density when compared to the cognitive ones. Error analysis demonstrated that volunteers misplayed more left-hand notes in the cognitive performances than in the affective ones. Volunteers also played more wrong notes in affective than in cognitive performances. These results correspond to the volunteers' comments that in the affective performances, the cognitive aspects of piano execution are inhibited, whereas in the cognitive performances, the expressiveness is inhibited. 
Conclusions Therefore, the present results indicate that attention to the emotional aspects of performance enhances expressiveness, but constrains cognitive and motor skills in the piano execution. In contrast, attention to the cognitive aspects may constrain the expressivity and automatism of piano performances. PMID:21931716

  8. Non-linear regime shifts in Holocene Asian monsoon variability: potential impacts on cultural change and migratory patterns

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Donner, R. V.; Marwan, N.; Breitenbach, S. F. M.; Rehfeld, K.; Kurths, J.

    2015-05-01

    The Asian monsoon system is an important tipping element in Earth's climate with a large impact on human societies in the past and present. In light of the potentially severe impacts of present and future anthropogenic climate change on Asian hydrology, it is vital to understand the forcing mechanisms of past climatic regime shifts in the Asian monsoon domain. Here we use novel recurrence network analysis techniques for detecting episodes with pronounced non-linear changes in Holocene Asian monsoon dynamics recorded in speleothems from caves distributed throughout the major branches of the Asian monsoon system. A newly developed multi-proxy methodology explicitly considers dating uncertainties with the COPRA (COnstructing Proxy Records from Age models) approach and allows for detection of continental-scale regime shifts in the complexity of monsoon dynamics. Several epochs are characterised by non-linear regime shifts in Asian monsoon variability, including the periods around 8.5-7.9, 5.7-5.0, 4.1-3.7, and 3.0-2.4 ka BP. The timing of these regime shifts is consistent with known episodes of Holocene rapid climate change (RCC) and high-latitude Bond events. Additionally, we observe a previously rarely reported non-linear regime shift around 7.3 ka BP, a timing that matches the typical 1.0-1.5 ky return intervals of Bond events. A detailed review of previously suggested links between Holocene climatic changes in the Asian monsoon domain and the archaeological record indicates that, in addition to previously considered longer-term changes in mean monsoon intensity and other climatic parameters, regime shifts in monsoon complexity might have played an important role as drivers of migration, pronounced cultural changes, and the collapse of ancient human societies.
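    Recurrence network analysis starts from a recurrence matrix over the (proxy) time series; network measures computed on that matrix (e.g. transitivity) then track changes in dynamical complexity. A minimal sketch of the matrix construction, with a toy series and threshold of my own choosing:

```python
def recurrence_matrix(series, eps):
    """Binary recurrence matrix underlying recurrence network analysis:
    entry (i, j) is 1 when states i and j are closer than eps. Here the
    'state' is a scalar proxy value; real analyses typically use
    time-delay embedded state vectors."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

proxy = [0.1, 0.2, 0.15, 0.8, 0.85, 0.9]   # toy proxy record
R = recurrence_matrix(proxy, eps=0.2)
# Two clusters of mutually recurrent states (a "regime shift" between them):
print(R[0])  # [1, 1, 1, 0, 0, 0]
```

    Interpreting R as the adjacency matrix of a network, a shift in its structure along a sliding window is the kind of signal the study uses to date non-linear regime changes.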

  9. Insertable cardiac event recorder in detection of atrial fibrillation after cryptogenic stroke: an audit report.

    PubMed

    Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas

    2013-07-01

    Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder in the detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke and eligibility for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients, an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that insertion of a cardiac event recorder was eligible in approximately one third of patients with cryptogenic stroke and detected new AF in approximately one quarter of these patients.

  10. Signaling communication events in a computer network

    DOEpatents

    Bender, Carl A.; DiNicola, Paul D.; Gildea, Kevin J.; Govindaraju, Rama K.; Kim, Chulho; Mirza, Jamshed H.; Shah, Gautam H.; Nieplocha, Jaroslaw

    2000-01-01

    A method, apparatus and program product for detecting a communication event in a distributed parallel data processing system in which a message is sent from an origin to a target. A low-level application programming interface (LAPI) is provided which has an operation for associating a counter with a communication event to be detected. The LAPI increments the counter upon the occurrence of the communication event. The number in the counter is monitored, and when the number increases, the event is detected. A completion counter in the origin is associated with the completion of a message being sent from the origin to the target. When the message is completed, LAPI increments the completion counter such that monitoring the completion counter detects the completion of the message. The completion counter may be used to insure that a first message has been sent from the origin to the target and completed before a second message is sent.
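    LAPI is a low-level C interface, but the counter pattern the patent describes (the communication layer increments a counter when an event occurs; a waiter monitors the count) can be sketched generically. The class and names below are illustrative, not part of LAPI:

```python
import threading

class CompletionCounter:
    """Counter-based event detection in the style described above: the
    communication layer increments the counter on each event, and a waiter
    blocks until the count reaches a target value."""
    def __init__(self):
        self._count = 0
        self._cond = threading.Condition()

    def increment(self):          # called by the "communication layer"
        with self._cond:
            self._count += 1
            self._cond.notify_all()

    def wait_for(self, target):   # e.g. a sender waiting for completion
        with self._cond:
            self._cond.wait_for(lambda: self._count >= target)
            return self._count

counter = CompletionCounter()
t = threading.Thread(target=counter.increment)  # stands in for the comm layer
t.start()
n = counter.wait_for(1)   # blocks until the first completion is signaled
t.join()
print(n)  # 1
```

    Waiting on the completion counter before sending a second message gives the ordering guarantee described in the last sentence of the abstract.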

  11. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. 
The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a desktop computer at the time of the detections. The continuously updating map displays geolocated tweets arriving after the detection and plots epicenters of recent earthquakes. When available, seismograms from nearby stations are displayed as an additional form of verification. A time series of tweets-per-minute is also shown to illustrate the volume of tweets being generated for the detected event. Future additions are being investigated to provide a more in-depth characterization of the seismic events based on an analysis of tweet text and content from other social media sources.
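    The detector's core idea (flag a sudden jump in per-minute "earthquake" tweet counts relative to a trailing baseline) can be sketched as follows. The window length, ratio, and absolute floor are invented for illustration and are not the USGS thresholds:

```python
from collections import deque

def detect_spikes(counts, baseline_len=10, factor=5, min_count=10):
    """Toy keyword-rate spike detector: flag minute i when its tweet count
    exceeds `factor` times the trailing-baseline mean and an absolute floor.
    All thresholds are illustrative."""
    baseline = deque(maxlen=baseline_len)   # last `baseline_len` minutes
    alerts = []
    for i, c in enumerate(counts):
        mean = sum(baseline) / len(baseline) if baseline else 0.0
        if len(baseline) == baseline_len and c >= min_count and c > factor * mean:
            alerts.append(i)
        baseline.append(c)
    return alerts

# 1 "earthquake" tweet/minute of background chatter, then a burst at minute 12.
counts = [1] * 12 + [40, 25, 3]
print(detect_spikes(counts))  # [12, 13]
```

    Because only tweet arrival times are needed, a detector like this can fire well before seismic solutions are available in sparsely instrumented regions, which is the speed advantage the abstract highlights.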

  12. Vegetation responses to abrupt climatic changes during the Last Interglacial Complex (Marine Isotope Stage 5) at Tenaghi Philippon, NE Greece

    NASA Astrophysics Data System (ADS)

    Milner, A. M.; Roucoux, K. H.; Collier, R. E. L.; Müller, U. C.; Pross, J.; Tzedakis, P. C.

    2016-12-01

    The discovery that climate variability during the Last Glacial shifted rapidly between climate states has intensified efforts to understand the distribution, timing and impact of abrupt climate change under a wide range of boundary conditions. As a contribution to this, we investigate the nature of abrupt environmental changes in terrestrial settings of the Mediterranean region during the Last Interglacial Complex (Marine Isotope Stage [MIS] 5) and explore the relationships of these changes to high-latitude climate events. We present a new, temporally highly resolved (mean: 170 years) pollen record for the Last Interglacial Complex from Tenaghi Philippon, north-east Greece. The new pollen record, which spans the interval from 130,000 to 65,000 years ago, forms part of an exceptionally long polleniferous sediment archive covering the last 1.35 million years. The pollen data reveal an interglacial followed by alternating forest and steppe phases representing the interstadials and stadials of the Early Glacial. Superimposed on these millennial-scale changes is evidence of persistent sub-millennial-scale variability. We identify ten high-amplitude abrupt events in the pollen record, characterised by rapid contractions of closed forest to an open steppe environment and interpreted to indicate major changes in moisture availability and temperature. The contractions in forest cover on millennial timescales appear associated with cooling events in the Mediterranean Sea, North Atlantic and Greenland regions, linked to the Dansgaard-Oeschger (DO) cycles of the Early Glacial. On sub-millennial timescales, the pattern of changes in forest cover at Tenaghi Philippon displays a structure similar to the pattern of short-lived precursor and rebound-type events detected in the Greenland ice-core record. Our findings indicate that persistent, high-amplitude environmental variability occurred throughout the Early Glacial, on both millennial and sub-millennial timescales. 
Furthermore, the similarity of the pattern of change between Tenaghi Philippon and Greenland on sub-millennial timescales suggests that teleconnections between the high latitudes and the Mediterranean region operate on sub-millennial timescales and that some terrestrial archives, such as Tenaghi Philippon, are particularly sensitive recorders of these abrupt climate changes.

  13. Cartan invariants and event horizon detection

    NASA Astrophysics Data System (ADS)

    Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.

    2018-04-01

    We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.

  14. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
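The stepping-and-bracketing scheme summarised above can be sketched as follows; the sinusoidal event function is a stand-in for real eclipse geometry, and the step size and tolerance are illustrative choices, not values from the paper:

```python
import math

def eclipse_event(t):
    """Hypothetical smooth event function: positive in sunlight,
    negative in shadow (a simple sinusoid stands in for the geometry)."""
    return math.cos(0.001 * t) - 0.5

def bracket_roots(f, t0, t1, step):
    """Stepping phase: walk the interval and record sign changes."""
    brackets = []
    a, fa = t0, f(t0)
    t = t0 + step
    while t <= t1:
        fb = f(t)
        if fa * fb < 0:           # sign change: a root lies in (a, t)
            brackets.append((a, t))
        a, fa = t, fb
        t += step
    return brackets

def bisect(f, a, b, tol=1e-9):
    """Bracketing phase: refine one bracket to the root by bisection."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:          # root stays in the left half
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

brackets = bracket_roots(eclipse_event, 0.0, 10000.0, 100.0)
roots = [bisect(eclipse_event, a, b) for a, b in brackets]
```

Brent's method or a Newton iteration could replace bisection once a bracket is found; the overall structure (coarse stepping, then refinement) is the same.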

  16. Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation

    NASA Astrophysics Data System (ADS)

    Drasco, Steve; Flanagan, Éanna É.

    2002-12-01

    We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.

  17. Automated Sensor Tuning for Seismic Event Detection at a Carbon Capture, Utilization, and Storage Site, Farnsworth Unit, Ochiltree County, Texas

    NASA Astrophysics Data System (ADS)

    Ziegler, A.; Balch, R. S.; Knox, H. A.; Van Wijk, J. W.; Draelos, T.; Peterson, M. G.

    2016-12-01

We present results (e.g. seismic detections and STA/LTA detection parameters) from a continuous downhole seismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project. Specifically, we evaluate data from a passive vertical monitoring array consisting of 16 levels of 3-component 15Hz geophones installed in the field and continuously recording since January 2014. This detection database is directly compared to ancillary data (i.e. wellbore pressure) to determine if there is any relationship between seismic observables and CO2 injection and pressure maintenance in the field. Of particular interest is detection of relatively low-amplitude signals constituting long-period long-duration (LPLD) events that may be associated with slow shear-slip analogous to low frequency tectonic tremor. While this category of seismic event provides great insight into dynamic behavior of the pressurized subsurface, it is inherently difficult to detect. To automatically detect seismic events using effective data processing parameters, an automated sensor tuning (AST) algorithm developed by Sandia National Laboratories is being utilized. AST exploits ideas from neuro-dynamic programming (reinforcement learning) to automatically self-tune and determine optimal detection parameter settings. AST adapts in near real-time to changing conditions and automatically self-tunes a signal detector to identify (detect) only signals from events of interest, leading to a reduction in the number of missed legitimate event detections and the number of false event detections. Funding for this project is provided by the U.S. Department of Energy's (DOE) National Energy Technology Laboratory (NETL) through the Southwest Regional Partnership on Carbon Sequestration (SWP) under Award No. DE-FC26-05NT42591. Additional support has been provided by site operator Chaparral Energy, L.L.C. and Schlumberger Carbon Services.
Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
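A minimal STA/LTA trigger, the kind of detector whose settings AST would tune automatically, can be sketched in a few lines. The window lengths and the trigger threshold of 4 below are exactly the sort of hand-picked parameters the AST algorithm is meant to replace; all values here are illustrative:

```python
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    """Ratio of short-term to long-term average signal energy.
    Returns an array the same length as signal (zeros where the
    long-term window is not yet full)."""
    energy = signal ** 2
    csum = np.cumsum(energy)
    ratio = np.zeros_like(signal, dtype=float)
    for i in range(n_lta, len(signal)):
        sta = (csum[i] - csum[i - n_sta]) / n_sta
        lta = (csum[i] - csum[i - n_lta]) / n_lta
        ratio[i] = sta / lta if lta > 0 else 0.0
    return ratio

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 2000)
# synthetic seismic arrival: a short high-amplitude oscillatory burst
trace[1200:1260] += 8.0 * np.sin(np.linspace(0, 12 * np.pi, 60))
ratio = sta_lta(trace, n_sta=20, n_lta=200)
triggered = np.flatnonzero(ratio > 4.0)   # declare detections above threshold
```

On this synthetic trace the ratio stays near 1 in noise and jumps well above the threshold within a few samples of the burst onset.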

  18. Unraveling multiple changes in complex climate time series using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias

    2016-04-01

Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic patterns of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are combined into a proxy for a posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that is capable of indicating multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to an environmental time series (about 100 years) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from the ODP sites 659, 721/722 and 967 interpreted as climate indicators of the African region of the Plio-Pleistocene period (about 5 Ma).
The detailed inference unravels multiple transitions underlying the indirect climate observations coinciding with established global climate events.
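For the simplest case the abstract describes, a single shift in central tendency, a Bayesian posterior over the change location can be sketched as follows. Gaussian noise with known sigma, a flat prior on the location, and profiling out the segment means are simplifying assumptions, not the authors' kernel-based formulation:

```python
import numpy as np

def change_point_posterior(x, sigma=1.0):
    """Posterior over the location of a single mean shift, assuming
    Gaussian noise with known sigma and a flat prior on the location.
    For each candidate split k the two segment means are replaced by
    their empirical values (a common simplification)."""
    n = len(x)
    log_post = np.full(n, -np.inf)
    for k in range(2, n - 2):
        left, right = x[:k], x[k:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        log_post[k] = -sse / (2.0 * sigma ** 2)
    log_post -= log_post.max()        # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 120), rng.normal(2, 1, 80)])
post = change_point_posterior(x)
k_hat = int(np.argmax(post))          # MAP estimate of the change location
```

The full width of the posterior, not just the MAP location, conveys how well the data constrain the transition, which is the spirit of the approach described above.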

  19. FISH analysis in the derivation of a 12, 15, 21 complex chromosomal rearrangement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, C.K.; Muscolino, D.; Baird, N.

Cytogenetic analysis was performed for a couple referred for recurrent pregnancy loss. Routine GTG banded studies revealed a 46,XY karyotype for the husband, but in the woman, an apparently balanced complex rearrangement involving chromosomes 12, 15, and 21 was detected. The 46,XX,t(12;15)(q13.3;q23),t(12;21)(q21;q11.2) karyotype is the consequence of 2 translocation events resulting in 3 rearranged chromosomes: (1) a derivative 12 arising from the exchange of the short arms of 12 and 21; (2) a derivative chromosome 15 consisting of segments of the long arms of chromosomes 12 and 15; and (3) a complex derivative chromosome 21 which includes the short arm and centromere of 21, and portions of the long arms of both chromosomes 12 and 15. Because the 12;21 translocation occurred at the centromeric region on both chromosomes, it was not possible to cytogenetically differentiate the derivative chromosomes 12 and 21. To clarify this issue, fluorescence in situ hybridization (FISH) was performed utilizing a 13/21 alpha-satellite probe. The location of the FITC signal clearly indicated a chromosome 21 centromere present on the derivative containing portions of all three chromosomes. A family history of spontaneous fetal losses suggested the possibility of a familial translocation. However, the likelihood of transmission of such a complex set of translocations is low, leading to the hypothesis that only one of the translocations was inherited with the second a de novo event in this individual. Karyotype analysis of both parents revealed no cytogenetic anomalies. Therefore, the extremely unusual occurrence of two independent translocations involving 3 chromosomes arose de novo in this patient.

  20. Patterns of technical error among surgical malpractice claims: an analysis of strategies to prevent injury to surgical patients.

    PubMed

    Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A

    2007-11-01

To identify the most prevalent patterns of technical errors in surgery, and evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%), and occurred in routine, rather than index, operations (84%). Patient-related complexities, including emergencies, difficult or unexpected anatomy, and previous surgery, contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors.
Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.

  1. Towards a global flood detection system using social media

    NASA Astrophysics Data System (ADS)

    de Bruijn, Jens; de Moel, Hans; Jongman, Brenden; Aerts, Jeroen

    2017-04-01

It is widely recognized that an early warning is critical in improving international disaster response. Analysis of social media in real-time can provide valuable information about an event or help to detect unexpected events. For successful and reliable detection systems that work globally, it is important that sufficient data is available and that the algorithm works both in data-rich and data-poor environments. In this study, both a new geotagging system and a multi-level event detection system for flood hazards were developed using Twitter data. Geotagging algorithms that regard one tweet as a single document are well-studied. However, no algorithms exist that combine several sequential tweets mentioning keywords regarding a specific event type. Within the time frame of an event, multiple users use event-related keywords that refer to the same place name. This notion allows us to treat several sequential tweets posted in the last 24 hours as one document. For all these tweets, we collect a series of spatial indicators given in the tweet metadata and extract additional topological indicators from the text. Using these indicators, we can reduce ambiguity and thus better estimate what locations are tweeted about. Using these localized tweets, Bayesian change-point analysis is used to find significant increases of tweets mentioning countries, provinces or towns. In data-poor environments detection of events on a country level is possible, while in other, data-rich, environments detection on a city level is achieved. Additionally, on a city level we analyse the spatial dependence of mentioned places. If multiple places within a limited spatial extent are mentioned, detection confidence increases. We run the algorithm using 2 years of Twitter data with flood-related keywords in 13 major languages and validate against a flood event database.
We find that the geotagging algorithm yields significantly more data than previously developed algorithms and successfully deals with ambiguous place names. In addition, we show that our detection system can both quickly and reliably detect floods, even in countries where data is scarce, while achieving high detail in countries where more data is available.
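The spike-detection idea applied to localized tweet counts can be illustrated with a simpler stand-in: a Poisson tail test that flags hours whose count is implausible given the rate estimated from the preceding day. The window length and significance level are illustrative, and this is not the Bayesian change-point analysis used in the study:

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the lower-tail sum."""
    p, cdf = math.exp(-lam), 0.0
    for i in range(k):
        cdf += p
        p *= lam / (i + 1)
    return max(0.0, 1.0 - cdf)

def detect_spike(counts, window=24, alpha=1e-4):
    """Flag hours whose tweet count is implausibly high relative to the
    Poisson rate estimated from the preceding `window` hours."""
    alerts = []
    for i in range(window, len(counts)):
        lam = max(sum(counts[i - window:i]) / window, 0.1)
        if poisson_sf(counts[i], lam) < alpha:
            alerts.append(i)
    return alerts

# hourly counts of tweets mentioning one town: a quiet day, then a flood
counts = [2, 1, 3, 2, 0, 1, 2, 3, 1, 2, 2, 1, 0, 2, 3, 1, 2, 1, 2, 3, 2, 1, 2, 2,
          2, 1, 40, 55, 48, 3, 2]
alerts = detect_spike(counts)
```

Hours 26-28 are flagged against a baseline of roughly 1.7 tweets per hour, while ordinary fluctuations are not.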

  2. An islanding detection methodology combining decision trees and Sandia frequency shift for inverter-based distributed generations

    DOE PAGES

    Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...

    2017-07-14

Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on the Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios, eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
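The passive, decision-tree stage can be sketched with scikit-learn on synthetic voltage-derived features. The two features, their distributions, and the tree depth are all illustrative assumptions, not the attributes used in the paper:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 200
# Hypothetical features extracted from local voltage measurements:
# [frequency deviation (Hz), rate of change of frequency (Hz/s)]
grid   = np.column_stack([rng.normal(0.0, 0.02, n), rng.normal(0.0, 0.05, n)])
island = np.column_stack([rng.normal(0.6, 0.20, n), rng.normal(1.0, 0.30, n)])
X = np.vstack([grid, island])
y = np.array([0] * n + [1] * n)   # 0 = other system event, 1 = islanding

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
acc = clf.score(X, y)
```

In a combined scheme like the one described, samples falling in low-confidence leaves of the tree would be the ones handed off to the active (Sandia frequency shift) stage.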

  4. 22nd Annual Logistics Conference and Exhibition

    DTIC Science & Technology

    2006-04-20

Prognostics & Health Management at GE. Dr. Piero P. Bonissone, Industrial AI Lab, GE Global Research. [Extracted slide fragments; recoverable topics include anomaly detection model selection and results, failure-mode histograms, anomaly detection from event-log data, and diagnostics/prognostics for failure monitoring and assessment in tactical C4ISR (sense and respond).]

  5. Contrast-enhanced ultrasonography vs B-mode ultrasound for visualization of intima-media thickness and detection of plaques in human carotid arteries.

    PubMed

    Shah, Benoy N; Chahal, Navtej S; Kooner, Jaspal S; Senior, Roxy

    2017-05-01

Carotid intima-media thickness (IMT) and plaque are recognized markers of increased risk for cerebrovascular events. Accurate visualization of the IMT and plaques is dependent upon image quality. Ultrasound contrast agents improve image quality during echocardiography; this study assessed whether contrast-enhanced ultrasound (CEUS) improves carotid IMT visualization and plaque detection in an asymptomatic population. Individuals free from known cardiovascular disease, enrolled in a community study, underwent B-mode and CEUS carotid imaging. Each carotid artery was divided into 10 segments (far and near walls of the proximal, mid and distal segments of the common carotid artery, the carotid bulb, and internal carotid artery). Visualization of the IMT complex and plaque assessments was made during both B-mode and CEUS imaging for all enrolled subjects, a total of 175 individuals (mean age 65±9 years). Visualization of the IMT was significantly improved during CEUS compared with B-mode imaging, in both near and far walls of the carotid arteries (% IMT visualization during B-mode vs CEUS imaging: 61% vs 94% and 66% vs 95% for right and left carotid arteries, respectively, P<.001 for both). Additionally, a greater number of plaques were detected during CEUS imaging compared with B-mode imaging (367 plaques vs 350 plaques, P=.02). Contrast-enhanced ultrasound improves visualization of the intima-media complex, in both near and far walls, of the common and internal carotid arteries and permits greater detection of carotid plaques. Further studies are required to determine whether there is incremental clinical and prognostic benefit related to superior plaque detection by CEUS. © 2017, Wiley Periodicals, Inc.

  6. Solar Demon: near real-time solar eruptive event detection on SDO/AIA images

    NASA Astrophysics Data System (ADS)

    Kraaikamp, Emil; Verbeeck, Cis

    Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, as well as solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.
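A toy version of dimming detection, one of the phenomena Solar Demon tracks, can be sketched as a base-difference threshold on an image sequence. The data, the 40% drop criterion, and the pixel-count cutoff are all synthetic and illustrative; this is not the Solar Demon algorithm:

```python
import numpy as np

def detect_dimming(frames, drop=0.4, min_pixels=20):
    """Flag frames in which at least `min_pixels` pixels have lost more
    than a fraction `drop` of their intensity relative to the first
    (pre-event) frame."""
    base = frames[0].astype(float)
    events = []
    for i, frame in enumerate(frames[1:], start=1):
        diff = frame.astype(float) - base
        dimmed = diff < -drop * base       # strong intensity decrease
        if dimmed.sum() >= min_pixels:
            events.append(i)
    return events

rng = np.random.default_rng(3)
quiet_sun = rng.uniform(100.0, 200.0, (64, 64))   # synthetic EUV intensity map
frames = [quiet_sun + rng.normal(0.0, 2.0, (64, 64)) for _ in range(5)]
frames[3][20:30, 20:30] *= 0.3                    # synthetic coronal dimming
events = detect_dimming(np.array(frames))
```

A production pipeline would additionally track the dimming region's growth, area, and light curve over time, as the abstract describes.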

  7. Monitoring Chewing and Eating in Free-Living Using Smart Eyeglasses.

    PubMed

    Zhang, Rui; Amft, Oliver

    2018-01-01

We propose to 3-D-print personally fitted, regular-look smart eyeglasses frames equipped with bilateral electromyography recording to monitor temporalis muscles' activity for automatic dietary monitoring. Personal fitting ensures supported electrode-skin contact at the temple ear-bend and temple-end positions. We evaluated the smart monitoring eyeglasses during in-lab and free-living studies of food chewing and eating event detection with ten participants. The in-lab study was designed to explore three natural food hardness levels and determine parameters of an energy-based chewing cycle detection. Our free-living study investigated whether chewing monitoring and eating event detection using smart eyeglasses are feasible in free-living conditions. An eating event detection algorithm was developed to determine intake activities based on the estimated chewing rate. Results showed an average food hardness classification accuracy of 94% and chewing cycle detection precision and recall above 90% for the in-lab study and above 77% for the free-living study covering 122 hours of recordings. Eating detection revealed 44 eating events with an average accuracy above 95%. We conclude that smart eyeglasses are suitable for monitoring chewing and eating events in free-living and could even provide further insights into the wearer's natural chewing patterns.
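An energy-based chewing cycle detection of the kind mentioned above can be sketched as rectify, smooth, and threshold on a synthetic EMG trace. The smoothing window, threshold, and simulated chewing rate are illustrative, not the parameters determined in the study:

```python
import numpy as np

def chewing_cycles(emg, fs, threshold):
    """Count chewing cycles as bursts of smoothed signal energy above a
    threshold. emg: raw EMG trace; fs: sampling rate in Hz."""
    energy = emg ** 2
    win = int(0.1 * fs)                 # 100 ms smoothing window
    smooth = np.convolve(energy, np.ones(win) / win, mode="same")
    active = smooth > threshold
    # each rising edge (inactive -> active) is one chewing cycle
    return int(np.sum(~active[:-1] & active[1:]))

fs = 1000
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(4)
emg = rng.normal(0, 0.05, t.size)        # baseline EMG noise
# simulate chewing at 1.5 Hz: 300 ms bursts of higher-amplitude activity
for start in np.arange(0.2, 4.8, 1 / 1.5):
    i = int(start * fs)
    emg[i:i + 300] += rng.normal(0, 0.6, 300)

cycles = chewing_cycles(emg, fs, threshold=0.05)
```

The estimated chewing rate (cycles per second of active chewing) is what the eating event detector described above would consume downstream.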

  8. Aiding the Detection of QRS Complex in ECG Signals by Detecting S Peaks Independently.

    PubMed

    Sabherwal, Pooja; Singh, Latika; Agrawal, Monika

    2018-03-30

In this paper, a novel algorithm for the accurate detection of the QRS complex, combining independent detections of R and S peaks through a fusion algorithm, is proposed. R peak detection has been extensively studied and is widely used to detect the QRS complex, whereas S peaks, which are also part of the QRS complex, can be detected independently to aid QRS detection. We suggest a method to first estimate S peaks from the raw ECG signal and then use them to aid the detection of the QRS complex. The amplitude of the S peak is weaker than that of the corresponding R peak, which is traditionally used for QRS detection; therefore, an appropriate digital filter is designed to enhance the S peaks. These enhanced S peaks are then detected by adaptive thresholding. The algorithm is validated on all the signals of the MIT-BIH arrhythmia database and the noise stress database taken from physionet.org, and performs reasonably well even for signals highly corrupted by noise. Its performance is confirmed by a sensitivity and positive predictivity of 99.99% and a detection accuracy of 99.98% for QRS complex detection. The numbers of false positives and false negatives have been reduced to 80 and 42, respectively, against 98 and 84 for the best results previously reported.
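The enhance-then-threshold idea can be sketched with a band-pass filter and an adaptive height threshold on the inverted trace. The 10-30 Hz band, the half-of-maximum threshold, and the synthetic ECG below are illustrative stand-ins for the paper's filter design:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_s_peaks(ecg, fs):
    """Band-pass the trace to emphasise the sharp S deflection, then run
    an adaptive threshold over the inverted signal (S peaks are negative
    deflections)."""
    b, a = butter(2, [10 / (fs / 2), 30 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    inverted = -filtered
    thresh = 0.5 * np.max(inverted)            # simple adaptive threshold
    peaks, _ = find_peaks(inverted, height=thresh, distance=int(0.3 * fs))
    return peaks

fs = 360
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(5)
ecg = 0.05 * rng.normal(size=t.size)           # baseline noise
beat_times = np.arange(0.5, 4.6, 0.8)          # one beat every 0.8 s
for bt in beat_times:
    ecg += 1.0 * np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2))          # R wave
    ecg += -0.4 * np.exp(-((t - bt - 0.04) ** 2) / (2 * 0.008 ** 2)) # S wave
s_peaks = detect_s_peaks(ecg, fs)
```

A fusion step would then cross-check these candidates against independently detected R peaks before declaring a QRS complex.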

  9. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    NASA Astrophysics Data System (ADS)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure moment tensor solutions are similar to that of the respective template events. Our ability to successfully employ a match filter method to VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
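A minimal match-filter detector of the kind described, a template slid over the continuous record with detections declared where the normalised correlation exceeds a multiple of its median absolute deviation, can be sketched as follows. The 8-MAD threshold and the synthetic wavelet are illustrative:

```python
import numpy as np

def match_filter(data, template, threshold=8.0):
    """Slide a normalised cross-correlation of the template over the
    record; declare detections where it exceeds `threshold` times the
    median absolute deviation of the correlation trace."""
    n = len(template)
    tpl = (template - template.mean()) / template.std()
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        win = data[i:i + n]
        s = win.std()
        cc[i] = np.dot(tpl, win - win.mean()) / (n * s) if s > 0 else 0.0
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > threshold * mad), cc

rng = np.random.default_rng(6)
# 90 s VLFE-like wavelet at 1 Hz sampling (0.03 Hz oscillation)
template = np.sin(2 * np.pi * 0.03 * np.arange(90))
data = rng.normal(0, 0.3, 3000)
for onset in (500, 1700):                 # two repeating events
    data[onset:onset + 90] += template
detections, cc = match_filter(data, template)
```

Recovering both onsets with the same template is the sense in which successful match filtering implies repeating events, as the abstract argues.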

  10. Automatic processing of induced events in the geothermal reservoirs Landau and Insheim, Germany

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Küperkoch, Ludger; Meier, Thomas

    2016-04-01

Induced events can pose a risk to local infrastructure that needs to be understood and evaluated. They also represent an opportunity to learn more about reservoir behavior and characteristics. Prior to the analysis, the waveform data must be processed consistently and accurately to avoid erroneous interpretations. In the framework of the MAGS2 project, an automatic off-line event detection and a phase onset time determination algorithm are applied to induced seismic events in the geothermal systems in Landau and Insheim, Germany. The off-line detection algorithm is based on cross-correlation of continuous data from the local seismic network with master events. It distinguishes between events from different reservoirs as well as within individual reservoirs, and provides location and magnitude estimates. Data from 2007 to 2014 are processed and compared with other detections using the SeisComp3 cross-correlation detector and an STA/LTA detector. The detected events are analyzed for spatial and temporal clustering, and the number of events is compared with existing detection lists. The automatic phase picking algorithm combines an AR-AIC approach with a cost function to find precise P1- and S1-phase onset times which can be used for localization and tomography studies. 800 induced events are processed, yielding 5000 P1- and 6000 S1-picks. The phase onset times show high precision, with mean residuals relative to manual phase picks of 0 s (P1) to 0.04 s (S1) and standard deviations below ±0.05 s. The resulting automatic picks are used to relocate a selected number of events to evaluate their influence on location precision.
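The AIC part of an AR-AIC picker can be sketched directly: the AIC curve of a two-segment noise/signal model is minimised near the phase onset. This omits the AR modelling and the cost function of the actual algorithm:

```python
import numpy as np

def aic_pick(trace):
    """Akaike-information-criterion onset picker: the curve
    AIC(k) = k*log(var(x[:k])) + (n-k)*log(var(x[k:]))
    is minimised at the transition from noise to signal."""
    n = len(trace)
    aic = np.full(n, np.inf)
    for k in range(10, n - 10):          # keep both variances well-defined
        v1 = np.var(trace[:k])
        v2 = np.var(trace[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(7)
trace = np.concatenate([rng.normal(0, 0.1, 400),    # pre-onset noise
                        rng.normal(0, 1.0, 200)])   # P-wave coda
onset = aic_pick(trace)
```

In practice the picker is run on a short window around a coarse detection (e.g. from the cross-correlation detector above), not on the whole record.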

  11. A substitution method to improve completeness of events documentation in anesthesia records.

    PubMed

    Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis

    2015-12-01

AIMS (anesthesia information management systems) are optimized to find and display data and curves for one specific intervention, but not for retrospective analysis of a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture, and (2) the completeness of documentation. To address the architectural problem, data warehouses were developed to provide an architecture suitable for analysis; the completeness of documentation, however, remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitute, defined as the nearest documented event. As an example, we focused on the automatic detection of the start and end of the anesthesia procedure when these events were not documented by the clinicians. We applied our method to a set of records in order to evaluate (1) the event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. In addition, we increased data completeness by 21.1% (from 80.3% to 97.2% of the total database) for the start and end of anesthesia events. This method appears efficient for replacing missing "start and end of anesthesia" events. It could also be used to replace other missing time events in this particular data warehouse as well as in other kinds of data warehouses.
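The substitution principle, replacing a missing event with the nearest reliably documented one plus a learned offset, can be sketched as follows. The event names and the -2 minute offset are hypothetical illustrations, not rules from the paper:

```python
from datetime import datetime, timedelta

def fill_missing_event(record, target, substitute, offset):
    """If `target` is undocumented, estimate it from the nearest reliably
    documented event `substitute` plus an offset learned from records
    where both events are present; mark the value as imputed."""
    if record.get(target) is None and record.get(substitute) is not None:
        record[target] = record[substitute] + offset
        record[target + "_imputed"] = True
    return record

record = {
    "start_of_anesthesia": None,                          # not documented
    "first_drug_administration": datetime(2012, 3, 1, 8, 12),
}
# Hypothetical median offset between first drug administration and
# documented starts, estimated from complete records: -2 minutes.
filled = fill_missing_event(record, "start_of_anesthesia",
                            "first_drug_administration",
                            timedelta(minutes=-2))
```

Keeping the imputation flag alongside the value lets downstream analyses distinguish documented from substituted timestamps.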

  12. On-Die Sensors for Transient Events

    NASA Astrophysics Data System (ADS)

    Suchak, Mihir Vimal

Failures caused by transient electromagnetic events like Electrostatic Discharge (ESD) are a major concern for embedded systems. The component often failing is an integrated circuit (IC), and determining which IC is affected in a multi-device system is a challenging task. Debugging errors often requires sophisticated lab setups in which various parts of the system, which might not be easily accessible, are intentionally disturbed and probed. Opening the system and adding probes may change its response to the transient event, which further compounds the problem. On-die transient event sensors were developed that require relatively little die area, making them inexpensive; they consume negligible static current and do not interfere with normal operation of the IC. When a transient event affects the IC, these circuits can be used to determine the pin involved and the level of the event, allowing the user to debug system-level transient events without modifying the system. The circuit and detection scheme design has been completed and verified in simulations within the Cadence Virtuoso environment. Simulations accounted for the impact of the ESD protection circuits, parasitics from the I/O pin, package and I/O ring, and included a model of an ESD gun to test the circuit's response to an ESD pulse as specified in IEC 61000-4-2. Multiple detection schemes are proposed. The final detection scheme consists of an event detector and a level sensor. The event detector latches on the presence of an event at a pad to determine on which pin the event occurred. The level sensor generates a current proportional to the level of the event; this current is converted to a voltage and digitized at the A/D converter to be read by the microprocessor. The detection scheme shows good performance in simulations when checked against process variations and different kinds of events.

  13. Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System

    NASA Astrophysics Data System (ADS)

    Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.

    2012-12-01

    The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison material from satellites and small bodies into space. The most notable examples are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect such events in visible images, enabling future outer-planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. Interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest-neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of the other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface because the high-magnitude gradient of a plume or jet stands out against the small, randomly oriented gradients of the texture. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that can be distinguished from the background surface and surface features of a body. Dozens of distinct features are identifiable under a variety of viewing conditions, and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of the images, the ability to differentiate an event from artifacts, and the variety of event appearances in the user-classified images. Other important factors are the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets that feed the overall plume at the south pole of Enceladus). A notable strength of this method is its ability to detect events that do not extend beyond the limb of a planetary body or that lie adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
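
    The gradient-concentration idea above can be sketched without SIFT itself. In this minimal numpy-only illustration, interest points are pixels whose local gradient magnitude is unusually large, and each candidate is labelled by a majority vote over nearby user-labelled points; the thresholds and the toy image are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def interest_points(img, k=3.0):
    """Pixels with unusually large local gradient magnitude (row, col)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    thresh = mag.mean() + k * mag.std()   # "large" = k sigma above the mean
    return np.argwhere(mag > thresh)

def majority_vote(points, labelled, labels, radius=5.0):
    """Label each point by the majority label of labelled points within radius."""
    out = []
    for p in points:
        d = np.linalg.norm(labelled - p, axis=1)
        near = labels[d <= radius]
        out.append(int(round(near.mean())) if near.size else 0)
    return out

# Tiny synthetic frame: a flat "surface" with one bright "plume" pixel.
img = np.zeros((20, 20))
img[5, 5] = 100.0
pts = interest_points(img)   # four high-gradient pixels ring the bright spot
```

    Because a real plume or jet produces a cluster of such points while a textured surface produces only scattered, sub-threshold gradients, the vote over neighbouring points favors features marked by multiple interest points, as in the abstract.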

  14. Improvement of the Error-detection Mechanism in Adults with Dyslexia Following Reading Acceleration Training.

    PubMed

    Horowitz-Kraus, Tzipi

    2016-05-01

    The error-detection mechanism aids in preventing error repetition during a given task. Electroencephalography demonstrates that error detection involves two event-related potential components: error-related and correct-response negativities (ERN and CRN, respectively). Dyslexia is characterized by slow, inaccurate reading. In particular, individuals with dyslexia have a less active error-detection mechanism during reading than typical readers. In the current study, we examined whether a reading training programme could improve the ability to recognize words automatically (lexical representations) in adults with dyslexia, thereby resulting in more efficient error detection during reading. Behavioural and electrophysiological measures were obtained using a lexical decision task before and after participants trained with the reading acceleration programme. ERN amplitudes were smaller in individuals with dyslexia than in typical readers before training but increased following training, as did behavioural reading scores. Differences between the pre-training and post-training ERN and CRN components were larger in individuals with dyslexia than in typical readers. The error-detection mechanism, as represented by the ERN/CRN complex, might therefore serve as a biomarker for dyslexia and be used to evaluate the effectiveness of reading intervention programmes. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Flare Characteristics from X-ray Light Curves

    NASA Astrophysics Data System (ADS)

    Gryciuk, M.; Siarkowski, M.; Sylwester, J.; Gburek, S.; Podgorski, P.; Kepa, A.; Sylwester, B.; Mrozek, T.

    2017-06-01

    A new methodology is given to determine basic parameters of flares from their X-ray light curves. Algorithms are developed from the analysis of small X-ray flares occurring during the deep solar minimum of 2009, between Solar Cycles 23 and 24, observed by the Polish Solar Photometer in X-rays (SphinX) on the Complex Orbital Observations Near-Earth of Activity of the Sun-Photon (CORONAS-Photon) spacecraft. One is a semi-automatic flare detection procedure that gives start, peak, and end times for single ("elementary") flare events under the assumption that the light curve is a simple convolution of Gaussian and exponential decay functions. More complex flares with multiple peaks can generally be described by a sum of such elementary flares. Flare time profiles in the two energy ranges of SphinX (1.16 - 1.51 keV, 1.51 - 15 keV) are used to derive temperature and emission measure as a function of time during each flare. The result is a comprehensive catalogue, the SphinX Flare Catalogue, which contains 1600 flares or flare-like events and is made available for general use. The methods described here can be applied to observations made by the Geosynchronous Operational Environmental Satellites (GOES), the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI), and other broad-band spectrometers.
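
    The elementary-flare assumption above, a Gaussian convolved with an exponential decay, has the closed form of an exponentially modified Gaussian, and multi-peak flares are sums of such profiles. The sketch below illustrates that model; the parameter values are arbitrary examples, not fitted to SphinX data.

```python
from math import erfc, exp, sqrt

def elementary_flare(t, amp, mu, sigma, tau):
    """Gaussian (center mu, width sigma) convolved with an exponential decay
    (timescale tau): the exponentially modified Gaussian, scaled so that the
    profile integrates to amp (total counts)."""
    arg = sigma**2 / (2 * tau**2) - (t - mu) / tau
    z = (mu + sigma**2 / tau - t) / (sigma * sqrt(2))
    return amp / (2 * tau) * exp(arg) * erfc(z)

def multi_flare(t, components):
    """Complex flare: a sum of elementary flares (amp, mu, sigma, tau)."""
    return sum(elementary_flare(t, *c) for c in components)

# Illustrative two-peak light curve sampled every 5 s (times in seconds).
components = [(1e4, 100, 20, 80), (5e3, 300, 15, 60)]
light_curve = [multi_flare(t, components) for t in range(0, 600, 5)]
```

    A fitting procedure in this spirit would adjust (amp, mu, sigma, tau) per elementary flare until the summed profile matches the observed light curve, then read off start, peak, and end times from the fitted components.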

  16. Transposon mutagenesis of Xylella fastidiosa by electroporation of Tn5 synaptic complexes.

    PubMed

    Guilhabert, M R; Hoffman, L M; Mills, D A; Kirkpatrick, B C

    2001-06-01

    Pierce's disease, a lethal disease of grapevine, is caused by Xylella fastidiosa, a gram-negative, xylem-limited bacterium that is transmitted from plant to plant by xylem-feeding insects. Strains of X. fastidiosa also have been associated with diseases that cause tremendous losses in many other economically important plants, including citrus. Although the complete genome sequence of X. fastidiosa has recently been determined, the inability to transform or produce transposon mutants of X. fastidiosa has been a major impediment to understanding pathogen-, plant-, and insect-vector interactions. We evaluated the ability of four different suicide vectors carrying either Tn5 or Tn10 transposons as well as a preformed Tn5 transposase-transposon synaptic complex (transposome) to transpose X. fastidiosa. The four suicide vectors failed to produce any detectable transposition events. Electroporation of transposomes, however, yielded 6 × 10^3 and 4 × 10^3 Tn5 mutants per µg of DNA in two different grapevine strains of X. fastidiosa. Molecular analysis showed that the transposition insertions were single, independent, stable events. Sequence analysis of the Tn5 insertion sites indicated that the transpositions occur randomly in the X. fastidiosa genome. Transposome-mediated mutagenesis should facilitate the identification of X. fastidiosa genes that mediate plant pathogenicity and insect transmission.

  17. Origin, radiation, dispersion and allopatric hybridization in the chub Leuciscus cephalus.

    PubMed

    Durand, J D; Unlü, E; Doadrio, I; Pipoyan, S; Templeton, A R

    2000-08-22

    The phylogenetic relationships of 492 chub (Leuciscus cephalus) belonging to 89 populations across the species' range were assessed using 600 base pairs of cytochrome b. Furthermore, nine species belonging to the L. cephalus complex were also analysed (over the whole cytochrome b) in order to test potential allopatric hybridization with L. cephalus sensu stricto (i.e. the chub). Our results show that the chub includes four highly divergent lineages descending from a quick radiation that took place three million years ago. The geographical distribution of these lineages and results of the nested clade analysis indicated that the chub may have originated from Mesopotamia. Chub radiation probably occurred during an important vicariant event such as the isolation of numerous Turkish river systems, a consequence of the uplift of the Anatolian Plateau (formerly covered by a broad inland lake). Dispersion of these lineages arose from the changes in the European hydrographic network and, thus, the chub and endemic species of the L. cephalus complex met by secondary contacts. Our results show several patterns of introgression, from Leuciscus lepidus fully introgressed by chub mitochondrial DNA to Leuciscus borysthenicus where no introgression at all was detected. We assume that these hybridization events might constitute an important evolutionary process for the settlement of the chub in new environments in the Mediterranean area.

  18. Origin, radiation, dispersion and allopatric hybridization in the chub Leuciscus cephalus.

    PubMed Central

    Durand, J D; Unlü, E; Doadrio, I; Pipoyan, S; Templeton, A R

    2000-01-01

    The phylogenetic relationships of 492 chub (Leuciscus cephalus) belonging to 89 populations across the species' range were assessed using 600 base pairs of cytochrome b. Furthermore, nine species belonging to the L. cephalus complex were also analysed (over the whole cytochrome b) in order to test potential allopatric hybridization with L. cephalus sensu stricto (i.e. the chub). Our results show that the chub includes four highly divergent lineages descending from a quick radiation that took place three million years ago. The geographical distribution of these lineages and results of the nested clade analysis indicated that the chub may have originated from Mesopotamia. Chub radiation probably occurred during an important vicariant event such as the isolation of numerous Turkish river systems, a consequence of the uplift of the Anatolian Plateau (formerly covered by a broad inland lake). Dispersion of these lineages arose from the changes in the European hydrographic network and, thus, the chub and endemic species of the L. cephalus complex met by secondary contacts. Our results show several patterns of introgression, from Leuciscus lepidus fully introgressed by chub mitochondrial DNA to Leuciscus borysthenicus where no introgression at all was detected. We assume that these hybridization events might constitute an important evolutionary process for the settlement of the chub in new environments in the Mediterranean area. PMID:11467433

  19. Precambrian evolution of the Salalah Crystalline Basement from structural analysis and 40Ar/39Ar geochronology

    NASA Astrophysics Data System (ADS)

    Al-Doukhi, Hanadi Abulateef

    The Salalah Crystalline Basement (SCB) is the largest Precambrian exposure in Oman, located on the southern margin of the Arabian Plate at the Arabian Sea shore. This work used remote sensing, detailed structural analysis, and 40Ar/39Ar age dating of ten samples to establish the Precambrian evolution of the SCB, focusing on its central and southwestern parts. The SCB evolved through four deformational events that shaped its final architecture: (1) a folding and thrusting event that emplaced the Sadh complex atop the Juffa complex, forming a possibly N-verging nappe structure; (2) a regional folding event around SE- and SW-plunging axes that deformed the regional fabric developed during nappe emplacement and produced map-scale SE- and SW-plunging antiforms, shaping the complexes into a semi-dome structure; (3) a strike-slip shearing event that produced a conjugate set of NE-trending sinistral and NW-trending dextral strike-slip shear zones; and (4) a localized SE-directed gravitational collapse manifested by top-to-the-southeast kinematic indicators. Deformation within the SCB might have ceased by 752.2+/-2.7 Ma, as indicated by the age of an undeformed granite. The 40Ar/39Ar thermochronology of samples collected throughout the SCB complexes shows a single cooling event between about 800 and 760 Ma. This cooling could have been accomplished by crustal exhumation during regional collapse following the prolonged contractional deformation of the SCB, making the SCB a possible metamorphic core complex.

  20. The benefits of flexible team interaction during crises.

    PubMed

    Stachowski, Alicia A; Kaplan, Seth A; Waller, Mary J

    2009-11-01

    Organizations increasingly rely on teams to respond to crises. While research on team effectiveness during nonroutine events is growing, naturalistic studies examining team behaviors during crises are relatively scarce. Furthermore, the relevant literature offers competing theoretical rationales concerning effective team response to crises. In this article, the authors investigate whether high- versus average-performing teams can be distinguished on the basis of the number and complexity of their interaction patterns. Using behavioral observation methodology, the authors coded the discrete verbal and nonverbal behaviors of 14 nuclear power plant control room crews as they responded to a simulated crisis. Pattern detection software revealed systematic differences among crews in their patterns of interaction. Mean comparisons and discriminant function analysis indicated that higher performing crews exhibited fewer, shorter, and less complex interaction patterns. These results illustrate the limitations of standardized response patterns and highlight the importance of team adaptability. Implications for future research and for team training are included.
