Subsurface event detection and classification using Wireless Signal Networks.
Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T
2012-11-05
Subsurface environment sensing and monitoring applications, such as the detection of water intrusion or a landslide, which can significantly change the physical properties of the host soil, can be accomplished using a novel concept: Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations in radio signal strength among the distributed underground sensor nodes of a WSiN to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated variations in wireless signal strength can be used as indicators to sense changes in the subsurface environment. The WSiN concept for subsurface event detection is evaluated with applications such as the detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as their main indicator, we propose a window-based minimum distance classifier grounded in Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After event detection, the classifier labels geo-events within the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191
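The two-step scheme in this abstract (threshold-based event detection over calibrated signal strength, then minimum-distance classification within the detected window) can be sketched as below. All names, thresholds, and class means are illustrative assumptions, not values from the paper; with equal priors and identical isotropic covariances, the Bayesian decision rule reduces to the minimum-distance rule used here.

```python
import numpy as np

def detect_event(rss, baseline, threshold=3.0):
    """Step 1: indices where |rss - baseline| exceeds a calibrated threshold (hypothetical rule)."""
    return np.flatnonzero(np.abs(np.asarray(rss) - baseline) > threshold)

def classify_window(window, class_means):
    """Step 2: assign the window's (mean, std) deviation feature to the nearest class mean."""
    feature = np.array([np.mean(window), np.std(window)])
    dists = {label: np.linalg.norm(feature - np.asarray(mu))
             for label, mu in class_means.items()}
    return min(dists, key=dists.get)
```

For example, with a calibrated dry-soil baseline of -60 dBm, a series that drops to around -69 dBm would be detected at the drop and, given made-up class means for "water_intrusion" and "density_change", labeled as the former.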
A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks
Zhang, Shukui; Chen, Hao; Zhu, Qiaoming
2014-01-01
Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers multiple properties that reflect an event's status, the composite event is more consistent with the physical world, which makes its study more realistic. In this paper, we analyze the characteristics of composite events, propose a criterion to determine the area of a composite event, and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. When the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on event determination. The composite event judgment mechanism based on fuzzy decisions retains the advantages of fuzzy-logic-based algorithms; moreover, it does not need the support of a huge rule base, and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690
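A minimal sketch of what a fuzzy-decision composite-event check can look like: attribute readings are mapped to membership degrees and combined with a fuzzy AND (minimum t-norm). The fire-like attributes, ramp shapes, and 0.5 decision threshold are all assumptions for illustration, not the paper's actual rules.

```python
def ramp_membership(x, lo, hi):
    """Piecewise-linear membership: 0 below lo, 1 above hi, linear in between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def composite_event_degree(temperature, smoke):
    """Fuzzy AND (minimum t-norm) over the attribute memberships."""
    mu_t = ramp_membership(temperature, 40.0, 70.0)   # degrees Celsius (assumed)
    mu_s = ramp_membership(smoke, 0.2, 0.6)           # normalized density (assumed)
    return min(mu_t, mu_s)

def is_composite_event(temperature, smoke, threshold=0.5):
    return composite_event_degree(temperature, smoke) >= threshold
```

Because the decision is a degree in [0, 1] rather than a rule-base lookup, no large rule base is needed, which mirrors the complexity advantage claimed in the abstract.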
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on the digital PCR required pretreatment steps. Meanwhile, singleplex detection could not meet the demand of the absolute quantitation of GMO events that is based on the ratio of foreign fragments and reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD is 0.1%. Additionally, we found that duplex digital PCR could achieve the detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicated that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
Generalized Detectability for Discrete Event Systems
Shu, Shaolong; Lin, Feng
2011-01-01
In our previous work, we investigated the detectability of discrete event systems, defined as the ability to determine the current and subsequent states of a system from observations. For different applications, we defined four types of detectability: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic to nondeterministic systems; such a generalization is necessary because many systems need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on an observer, whose construction has exponential complexity, while the new algorithms are based on a new automaton called a detector. (3) We extend detectability to D-detectability: while detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only that certain pairs of states be distinguished. With these extensions, the theory of detectability of discrete event systems becomes more applicable to practical problems. PMID:21691432
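The notion of detectability above rests on current-state estimation: propagating the set of states the system could be in along an observed event sequence. A minimal sketch, with a made-up nondeterministic automaton (not one from the paper); detectability holds along a run when the estimate eventually stays a singleton.

```python
def step_estimate(states, event, delta):
    """One observation step: all states reachable from `states` via `event`."""
    return frozenset(s2 for s in states for s2 in delta.get((s, event), ()))

def estimate_trajectory(initial, observation, delta):
    """Sequence of current-state estimates along an observed event string."""
    est = [frozenset(initial)]
    for e in observation:
        est.append(step_estimate(est[-1], e, delta))
    return est

# Nondeterministic transitions: (state, event) -> tuple of possible next states.
delta = {
    (0, "a"): (1, 2),
    (1, "b"): (3,),
    (2, "b"): (3,),
    (3, "a"): (3,),
}
```

On this example, observing "a" leaves the estimate as {1, 2}, but after "b" it collapses to {3} and stays there, so the state becomes (and remains) known. The detector construction in the paper avoids enumerating all such subsets, which is what makes the polynomial check possible.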
Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.
Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng
2016-12-08
This paper studies the problem of fault detection based on an adaptively adjusted event-triggering mechanism for a class of discrete-time networked control systems (NCSs), with applications to aircraft dynamics. By taking into account the progress of fault occurrence detection and the probability of fault occurrence, and by introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve efficient utilization of communication network bandwidth. Both the sensor-to-control-station and control-station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented, and a new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous FDF and controller design.
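A hedged sketch of the adaptive event-triggering idea: a sample is released to the network only when it deviates enough from the last released sample, and the triggering parameter shrinks as the fault-detection residual grows, so more data is transmitted while a fault is suspected. The quadratic trigger form and the adaptation law below are illustrative, not the paper's exact conditions.

```python
def should_transmit(x, x_last, sigma):
    """Relative-error event trigger: release x when ||x - x_last||^2 > sigma * ||x||^2."""
    err = sum((a - b) ** 2 for a, b in zip(x, x_last))
    return err > sigma * sum(a * a for a in x)

def adapt_sigma(sigma, residual, sigma_min=0.01, sigma_max=0.5, gain=0.1):
    """Shrink sigma as the fault-detection residual grows; clamp to [sigma_min, sigma_max]."""
    return min(sigma_max, max(sigma_min, sigma / (1.0 + gain * residual)))
```

A smaller sigma means the trigger fires more often, trading bandwidth for faster fault detection; a larger sigma saves bandwidth during nominal operation.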
Event detection in an assisted living environment.
Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin
2011-01-01
This paper presents the design of a wireless event detection and in-building location awareness system. The system's architecture is based on a body-worn sensor that detects events, such as falls, where they occur in an assisted living environment. This involves developing event detection algorithms and transmitting detected events wirelessly to an in-house network based on the 802.15.4 protocol. The network then generates alerts both in the assisted living facility and remotely at an offsite monitoring facility. The focus of this paper is the design of the system architecture and the compliance challenges in applying this technology.
Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-03-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
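The core observation behind pairwise pseudo-association can be sketched loosely: each station reports event pairs (t1, t2) of similar waveforms, and the same pair seen at another station is shifted by an unknown travel time, but the differential time t2 - t1 is preserved, so pairs can be associated across stations without modeling the move-out. The greedy grouping and the tolerances below are illustrative simplifications, not the FAST pipeline's actual implementation.

```python
def pseudo_associate(station_pairs, dt_tol=1.0, t_tol=30.0):
    """station_pairs: {station: [(t1, t2), ...]}. Greedily group (station, pair)
    entries whose differential times t2 - t1 agree within dt_tol and whose
    absolute times lie within a t_tol move-out window (both tolerances assumed)."""
    entries = [(st, p) for st, pairs in station_pairs.items() for p in pairs]
    groups = []
    for st, (t1, t2) in entries:
        for g in groups:
            _, (u1, u2) = g[0]
            if abs((t2 - t1) - (u2 - u1)) <= dt_tol and abs(t1 - u1) <= t_tol:
                g.append((st, (t1, t2)))
                break
        else:
            groups.append([(st, (t1, t2))])
    return groups
```

Pairs supported by multiple stations (large groups) become candidate events; isolated pairs are more likely correlated-noise false detections, which is how the network extension keeps the false detection rate low.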
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection; however, low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper applies permutation entropy and a support vector machine to detect low-SNR microseismic events. First, a signal-feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal's permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropy for the collected vibration signals and constructing a feature vector set. Finally, a comparative analysis of microseismic events and noise signals in the experiment shows that their differing characteristics can be fully expressed using multi-scale permutation entropy. The detection model, combined with the support vector machine's high classification accuracy and fast real-time operation, can meet the requirements of online, real-time extraction of microseismic events.
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio events detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent events detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events is not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer's theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. A wavelet-based event detection and location approach then detects and locates the event, acting as a trigger for network reconfiguration. With the detected information, the system is reconfigured using a hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
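A rough sketch of wavelet-based event detection on a measurement stream: single-level Haar detail coefficients highlight abrupt changes, and coefficients exceeding a robust threshold mark the event location. The Haar choice and the k times median-absolute-deviation threshold are assumptions for illustration, not the paper's specific wavelet or rule.

```python
def haar_detail(signal):
    """Level-1 Haar detail coefficients of an even-length signal."""
    return [(signal[2*i] - signal[2*i + 1]) / 2 ** 0.5
            for i in range(len(signal) // 2)]

def detect_events(signal, k=5.0):
    """Indices (in coefficient space) where |detail| exceeds k times the
    median absolute coefficient (a simple robust noise estimate)."""
    d = haar_detail(signal)
    med = sorted(abs(c) for c in d)[len(d) // 2]
    threshold = k * max(med, 1e-12)   # guard against an all-flat signal
    return [i for i, c in enumerate(d) if abs(c) > threshold]
```

The coefficient index localizes the disturbance in time, which is the property the reconfiguration trigger relies on.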
Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...
2017-07-14
Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
Lee, Young-Sook; Chung, Wan-Young
2012-01-01
Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track, and recognize objects in the scene. However, in moving-object detection and tracking, moving cast shadows can be misclassified as part of objects or as moving objects, so shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed, and abnormal event detection based on variation in shape features and 3-D trajectories is presented to overcome low fall detection rates. The experimental results showed that the success rate of detecting abnormal events was 97%, with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward falls, backward falls, and falls to the side, from normal activities. PMID:22368486
Initial Evaluation of Signal-Based Bayesian Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Russell, S.
2016-12-01
We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with an attribute grammar, by learning scene semantics. The framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and abnormal behaviors that disobey scene semantics or event grammar rules are detected. This method yields an approach to understanding video scenes; furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high-dynamic-range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods over plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed, and experimentally validated. Furthermore, a stage of surround normalization is incorporated; together with the filtering, this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion-related activations. PMID:25941470
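The plane-fitting idea mentioned above can be sketched concretely: events from a moving edge form a plane in (x, y, t) space, so fitting t = a*x + b*y + c and inverting the time gradient recovers the local velocity. The synthetic noise-free data and plain least squares below are simplifying assumptions; a real AER stream would need robust (e.g. iterative) fitting.

```python
import numpy as np

def fit_event_plane(events):
    """events: iterable of (x, y, t). Least-squares fit of t = a*x + b*y + c."""
    ev = np.asarray(events, dtype=float)
    A = np.column_stack([ev[:, 0], ev[:, 1], np.ones(len(ev))])
    (a, b, c), *_ = np.linalg.lstsq(A, ev[:, 2], rcond=None)
    return a, b, c

def flow_from_plane(a, b):
    """Velocity consistent with the time gradient (a, b): v = (a, b) / (a^2 + b^2)."""
    g2 = a * a + b * b
    return a / g2, b / g2
```

For an edge moving at 2 px/s along x, events satisfy t = x/2, so the fit returns a = 0.5, b = 0 and the recovered flow is (2, 0).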
NASA Astrophysics Data System (ADS)
Mahmoud, Seedahmed S.; Visagathilagar, Yuvaraja; Katsifolis, Jim
2012-09-01
The success of any perimeter intrusion detection system depends on three important performance parameters: the probability of detection (POD), the nuisance alarm rate (NAR), and the false alarm rate (FAR). The most fundamental parameter, POD, is normally related to a number of factors such as the event of interest, the sensitivity of the sensor, the installation quality of the system, and the reliability of the sensing equipment. Suppressing nuisance alarms without degrading sensitivity is key to maintaining acceptable performance in fiber-optic intrusion detection systems, and signal processing algorithms that maintain the POD while eliminating nuisance alarms are crucial for achieving this. In this paper, a robust event classification system using supervised neural networks together with a level-crossings (LC)-based feature extraction algorithm is presented for the detection and recognition of intrusion and non-intrusion events in a fence-based fiber-optic intrusion detection system. A level-crossings algorithm with a dynamic threshold is also used to suppress torrential-rain-induced nuisance alarms in a fence system. Results show that rain-induced nuisance alarms can be suppressed for rainfall rates in excess of 100 mm/hr while simultaneously detecting intrusion events. A level-crossing-based detection and novel classification algorithm is also presented for a buried-pipeline fiber-optic intrusion detection system, for the suppression of nuisance events and discrimination of intrusion events. The sensor employed for both types of systems is a distributed bidirectional fiber-optic Mach-Zehnder (MZ) interferometer.
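A level-crossings feature with a dynamic threshold can be sketched as follows: count how often the signal crosses a level, and let the level adapt to a running noise estimate so that sustained broadband activity (such as heavy rain) raises the level instead of raising alarms. The window-based counting and the k-sigma rule are illustrative assumptions, not the paper's tuned parameters.

```python
def level_crossings(window, level):
    """Count sign changes of (sample - level) across the window."""
    signs = [s > level for s in window]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def dynamic_level(window, k=3.0):
    """Mean + k standard deviations of the window as the crossing level,
    so the level tracks the current noise floor."""
    n = len(window)
    mean = sum(window) / n
    var = sum((s - mean) ** 2 for s in window) / n
    return mean + k * var ** 0.5
```

Per-window crossing counts at several levels form the feature vector fed to the supervised classifier, while the adaptive level provides the rain suppression.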
Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just
2018-01-01
Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When monitoring systems based on wireless sensor networks are deployed, the sensing and transmission configurations of sensor nodes may be adjusted to exploit the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, people spontaneously post information in social media about events they are observing, and such information may be mined and processed for the detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and to assign sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events. PMID:29614060
OGLE-2017-BLG-1130: The First Binary Gravitational Microlens Detected from Spitzer Only
NASA Astrophysics Data System (ADS)
Wang, Tianshu; Calchi Novati, S.; Udalski, A.; Gould, A.; Mao, Shude; Zang, W.; Beichman, C.; Bryden, G.; Carey, S.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Yee, J. C.; Spitzer Team; Mróz, P.; Poleski, R.; Skowron, J.; Szymański, M. K.; Soszyński, I.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Albrow, M. D.; Chung, S.-J.; Han, C.; Hwang, K.-H.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Zhu, W.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Lee, C.-U.; Lee, D.-J.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration
2018-06-01
We analyze the binary gravitational microlensing event OGLE-2017-BLG-1130 (mass ratio q ∼ 0.45), the first published case in which the binary anomaly was detected only by the Spitzer Space Telescope. This event provides strong evidence that some binary signals can be missed by observations from the ground alone but detected by Spitzer. We therefore invert the normal procedure, first finding the lens parameters by fitting the space-based data and then measuring the microlensing parallax using ground-based observations. We also show that the normal four-fold space-based degeneracy in the single-lens case can become a weak eight-fold degeneracy in binary-lens events. Although this degeneracy is resolved in event OGLE-2017-BLG-1130, it might persist in other events.
Self-similarity Clustering Event Detection Based on Triggers Guidance
NASA Astrophysics Data System (ADS)
Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan
The traditional method of Event Detection and Characterization (EDC) treats event detection as a classification problem. It uses words as samples to train a classifier, which can lead to an imbalance between the classifier's positive and negative samples. This method also suffers from data sparseness when the corpus is small. Rather than classifying events with words as samples, this paper clusters events when judging event types. It adopts self-similarity, under the guidance of event triggers, to converge on the value of K in the K-means algorithm, and optimizes the clustering algorithm. Then, combining named entities and their comparative position information, the new method further pinpoints the event type. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can be used in automatic text summarization, text retrieval, and topic detection and tracking.
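As a rough illustration of the minimum-distance assignment step implied above, the sketch below assigns candidate events to clusters seeded by event-trigger centroids (K equals the number of trigger types). The numeric feature vectors and function names are hypothetical stand-ins; the paper's actual features are textual.

```python
def assign_by_trigger_centroids(vectors, trigger_centroids):
    """Assign each candidate event vector to its nearest trigger-seeded
    centroid (minimum squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(trigger_centroids)),
                key=lambda k: dist2(v, trigger_centroids[k]))
            for v in vectors]
```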
Commonality of drug-associated adverse events detected by 4 commonly used data mining algorithms.
Sakaeda, Toshiyuki; Kadoyama, Kaori; Minami, Keiko; Okuno, Yasushi
2014-01-01
Data mining algorithms have been developed for the quantitative detection of drug-associated adverse events (signals) from a large database on spontaneously reported adverse events. In the present study, the commonality of signals detected by 4 commonly used data mining algorithms was examined. A total of 2,231,029 reports were retrieved from the public release of the US Food and Drug Administration Adverse Event Reporting System database between 2004 and 2009. The deletion of duplicated submissions and revision of arbitrary drug names resulted in a reduction in the number of reports to 1,644,220. Associations with adverse events were analyzed for 16 unrelated drugs, using the proportional reporting ratio (PRR), reporting odds ratio (ROR), information component (IC), and empirical Bayes geometric mean (EBGM). All EBGM-based signals were included in the PRR-based signals as well as IC- or ROR-based ones, and PRR- and IC-based signals were included in ROR-based ones. The PRR scores of PRR-based signals were significantly larger for 15 of 16 drugs when adverse events were also detected as signals by the EBGM method, as were the IC scores of IC-based signals for all drugs; however, no such effect was observed in the ROR scores of ROR-based signals. The EBGM method was the most conservative among the 4 methods examined, which suggested its better suitability for pharmacoepidemiological studies. Further examinations should be performed on the reproducibility of clinical observations, especially for EBGM-based signals.
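Three of the four disproportionality measures named above (PRR, ROR, and a simplified IC point estimate) can be sketched from a 2x2 contingency table of report counts. The counts below are made up, and EBGM is omitted because it requires fitting an empirical Bayes prior over the whole database.

```python
import math

def disproportionality(a, b, c, d):
    """Disproportionality measures from a 2x2 report table.
    a: reports with the drug and the event; b: drug, other events;
    c: other drugs, the event; d: other drugs, other events."""
    prr = (a / (a + b)) / (c / (c + d))   # proportional reporting ratio
    ror = (a * d) / (b * c)               # reporting odds ratio
    n = a + b + c + d
    expected = (a + b) * (a + c) / n      # expected count under independence
    ic = math.log2(a / expected)          # information component (point estimate)
    return prr, ror, ic
```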
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to solve many problems in climate science and geoscience using massive observed and modeled datasets. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain performance that overshadows previous handcrafted, expert-based methods. The issue, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two deep neural network models: (1) Convolutional Neural Networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super-resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes: detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super-resolution model reconstructs the resolution of the input to the localization CNNs. We present the best-performing network using the pixel recursive super-resolution model, which synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. This approach not only dramatically reduces human effort, but also suggests the possibility of reducing the computing cost required for downscaling to increase the resolution of data.
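The resolution-enhancement step can be illustrated, under heavy simplification, by plain bilinear upsampling of a coarse 2-D grid. This is only a stand-in for intuition; the abstract's model is a learned pixel recursive super-resolution network, not an interpolator.

```python
def upsample_bilinear(grid, factor):
    """Bilinearly upsample a 2-D grid (list of rows) by an integer factor."""
    h, w = len(grid), len(grid[0])
    out = []
    for i in range(h * factor):
        y = i / factor
        y0 = min(int(y), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = y - y0
        row = []
        for j in range(w * factor):
            x = j / factor
            x0 = min(int(x), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = x - x0
            top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
            bot = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```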
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events based on recognition by individual doctors. We developed a system that can be used to detect hematotoxicity adverse events according to blood test results recorded in an electronic medical record system. The blood test results were graded based on Common Terminology Criteria for Adverse Events (CTCAE) and changes in the blood test results (Up, Down, Flat) were assessed according to the variation in the grade. The changes in the blood test and injection data were stored in a database. By comparing the date of injection and start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grades 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rate of occurrence of a decreased WBC count, increased ALT level and increased creatinine level was 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
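The grade-change classification (Up, Down, Flat) and the date join against injection records described above can be sketched as follows. The tuple layout and helper names are hypothetical; CTCAE grading itself is table-driven and omitted here.

```python
from datetime import date

def classify_change(prev_grade, cur_grade):
    """Direction of a CTCAE grade transition between two blood tests."""
    if cur_grade > prev_grade:
        return "Up"
    if cur_grade < prev_grade:
        return "Down"
    return "Flat"

def drug_related_events(injection_date, changes):
    """changes: (start_date, end_date, direction, peak_grade) tuples.
    Keep upward changes reaching CTCAE grade >= 3 that start on or after
    the injection date; a simplified stand-in for the paper's date join."""
    return [c for c in changes
            if c[2] == "Up" and c[3] >= 3 and c[0] >= injection_date]
```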
NASA Astrophysics Data System (ADS)
Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.
2012-12-01
The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. 
The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a desktop computer at the time of the detections. The continuously updating map displays geolocated tweets arriving after the detection and plots epicenters of recent earthquakes. When available, seismograms from nearby stations are displayed as an additional form of verification. A time series of tweets-per-minute is also shown to illustrate the volume of tweets being generated for the detected event. Future additions are being investigated to provide a more in-depth characterization of the seismic events based on an analysis of tweet text and content from other social media sources.
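The tweet-rate detector described above can be caricatured as a trailing-window spike test on tweets-per-minute counts. The thresholds, window length, and function name below are invented for illustration and are not the USGS TED parameters.

```python
from collections import deque

def spike_detector(counts, window=10, factor=5.0, min_count=20):
    """Flag minute indices where the tweet count exceeds factor times
    the trailing-window mean (and an absolute floor, to suppress noise)."""
    history = deque(maxlen=window)
    alerts = []
    for t, c in enumerate(counts):
        if len(history) == window:
            baseline = sum(history) / window
            if c >= min_count and c > factor * max(baseline, 1.0):
                alerts.append(t)
        history.append(c)
    return alerts
```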
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim, E-mail: sjchung@kasi.re.kr, E-mail: leecu@kasi.re.kr, E-mail: koojr@kasi.re.kr
2014-04-20
Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲2%). For confident detection of planets in EWCP events, it is necessary to have both high-cadence monitoring and photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that EWCP events occur with a frequency of >50% in the case of ≲100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giants, because KMTNet, with its constant exposure time, easily saturates around the peak of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets, and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of planet mass, will be detected in the near future.
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none are geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, part of the seventh framework programme (FP7) of the European Commission. The criteria for selecting the vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g., Micromedex), and summaries of product characteristics. Classification into positive control (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC, and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Multi-Station Broad Regional Event Detection Using Waveform Correlation
NASA Astrophysics Data System (ADS)
Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.
2013-12-01
Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
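A minimal single-station, single-template version of the waveform-correlation detection described above: slide the template over the continuous trace and report the best normalized correlation and its lag. The traces are toy lists; real systems run thousands of templates over years of data in a distributed fashion.

```python
import math

def normalized_xcorr(template, data):
    """Return (max normalized cross-correlation, lag) of template in data."""
    m = len(template)
    tmean = sum(template) / m
    t0 = [x - tmean for x in template]
    tnorm = math.sqrt(sum(x * x for x in t0))
    best, best_lag = -1.0, -1
    for lag in range(len(data) - m + 1):
        seg = data[lag:lag + m]
        smean = sum(seg) / m
        s0 = [x - smean for x in seg]
        snorm = math.sqrt(sum(x * x for x in s0))
        if snorm == 0:          # skip flat (all-zero) segments
            continue
        cc = sum(a * b for a, b in zip(t0, s0)) / (tnorm * snorm)
        if cc > best:
            best, best_lag = cc, lag
    return best, best_lag
```

A detection is declared when the maximum correlation exceeds a station-specific threshold; comparing magnitudes then amounts to comparing amplitudes of the matched waveforms.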
Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre
2015-11-15
The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater to natural water and environment. This study mainly addresses long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
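The Mann-Kendall trend test applied above can be sketched via its S statistic, the sum of signs of all pairwise differences: S > 0 suggests an increasing trend, S < 0 a decreasing one. This shows the point statistic only; declaring significance additionally requires the variance and Z-score computation, omitted here.

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic over an event-based data series."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            # sign of the pairwise difference series[j] - series[i]
            s += (series[j] > series[i]) - (series[j] < series[i])
    return s
```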
Human visual system-based smoking event detection
NASA Astrophysics Data System (ADS)
Odetallah, Amjad D.; Agaian, Sos S.
2012-06-01
Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact in raising the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new scheme of a system for detecting human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events involving uncertain actions with various cigarette sizes, colors, and shapes.
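The motion-detection front end can be sketched as running-average background subtraction on grayscale frames. This is a generic stand-in, not the paper's exact pipeline; frames are represented as flat pixel lists for brevity.

```python
def background_subtract(frames, alpha=0.1, thresh=0.5):
    """Maintain a running-average background model and return a
    per-frame foreground mask (True = pixel deviates from background)."""
    bg = list(frames[0])
    masks = []
    for frame in frames:
        mask = [abs(p - b) > thresh for p, b in zip(frame, bg)]
        # blend the current frame into the background model
        bg = [(1 - alpha) * b + alpha * p for b, p in zip(bg, frame)]
        masks.append(mask)
    return masks
```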
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
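A toy version of the jerk-plus-peak-heuristic idea above: finite-difference jerk from an acceleration trace, then a simple local-maximum rule. The paper's time-frequency analysis and terrain-specific heuristics are omitted, and the threshold is an assumption.

```python
def jerk(accel, dt):
    """Finite-difference jerk (time derivative of acceleration)."""
    return [(a2 - a1) / dt for a1, a2 in zip(accel, accel[1:])]

def find_peaks(signal, threshold):
    """Indices of local maxima above threshold (toy peak heuristic)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] >= signal[i - 1]
            and signal[i] > signal[i + 1]]
```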
Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.
Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse
2017-03-24
Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
Network hydraulics inclusion in water quality event detection using multiple sensor stations data.
Oliker, Nurit; Ostfeld, Avi
2015-09-01
Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulic insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating an understanding of local and spatial hydraulic data into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is the incorporation of hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
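The spatial check that suppresses single-station false alarms can be caricatured as a multi-sensor vote over z-scored readings. The threshold, vote count, and function name are invented; the paper uses hydraulic simulation of mutual sensor influences, not plain voting.

```python
def multi_sensor_alarm(readings, baseline, sigma, z_thresh=3.0, min_sensors=2):
    """Raise an event only when at least min_sensors stations deviate
    jointly from their baselines, reducing single-station false positives."""
    flagged = [abs(r - baseline[i]) / sigma[i] > z_thresh
               for i, r in enumerate(readings)]
    return sum(flagged) >= min_sensors
```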
Full-waveform detection of non-impulsive seismic events based on time-reversal methods
NASA Astrophysics Data System (ADS)
Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya
2017-12-01
We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function that depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events, and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in regions similar to, and with focal mechanisms similar to, those events. The method is not specific to any particular way of calculating the synthetic seismograms, so complicated structural models can be used. This is particularly beneficial for intermediate-size events registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported.
The first study area is between Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May 2016. The second area of interest is the Gulf of California where two swarms took place during July and September of 2015. We show that we are able to detect previously non-reported, non-impulsive events and recommend that this method be used together with more traditional template matching methods to maximize the number of detected events.
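A skeletal version of the detection function described above: stack per-station correlation traces sample by sample, then threshold the stack relative to an average noise level. The traces are toy lists, and the specific mean-absolute-level threshold rule is an assumption for illustration.

```python
def detection_function(correlations):
    """Stack per-station correlation traces sample by sample."""
    return [sum(samples) for samples in zip(*correlations)]

def detect(stack, factor=3.0):
    """Sample indices where the stacked amplitude exceeds factor times
    the mean absolute level of the stack (a noise-relative threshold)."""
    level = sum(abs(x) for x in stack) / len(stack)
    return [i for i, x in enumerate(stack) if x > factor * level]
```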
Detecting event-related changes in organizational networks using optimized neural network models.
Li, Ze; Sun, Duoyong; Zhu, Renqi; Lin, Zihan
2017-01-01
Organizational external behavior changes are caused by the internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks could efficiently be used to monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers the correlation and could be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes could be effectively useful in providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve a higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods using two cases. The results suggested that the proposed method could identify organizational events based on a correlation between the organizational networks and events. The results also suggested that the proposed method not only has a higher precision but also has a better robustness than the previously used techniques.
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactively query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
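A minimal unsupervised outlier sketch in the spirit of the clustering-based algorithm above: flag points far from a single centroid relative to the mean distance. The distance factor is invented, and the actual algorithm additionally handles spatio-temporal context, event grouping, ranking, and missing data.

```python
def centroid(points):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(points)
    return tuple(sum(p[k] for p in points) / n for k in range(len(points[0])))

def anomalies(points, factor=2.0):
    """Indices of points farther than factor times the mean distance
    from the centroid; no labels or expected pattern required."""
    c = centroid(points)
    dists = [sum((p[k] - c[k]) ** 2 for k in range(len(c))) ** 0.5
             for p in points]
    mean_d = sum(dists) / len(dists)
    return [i for i, d in enumerate(dists) if d > factor * mean_d]
```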
Label-free DNA biosensor based on resistance change of platinum nanoparticles assemblies.
Skotadis, Evangelos; Voutyras, Konstantinos; Chatzipetrou, Marianneza; Tsekenis, Georgios; Patsiouras, Lampros; Madianos, Leonidas; Chatzandroulis, Stavros; Zergioti, Ioanna; Tsoukalas, Dimitris
2016-07-15
A novel nanoparticle-based biosensor for the fast and simple detection of DNA hybridization events is presented. The sensor utilizes hybridized DNA's charge transport properties, combining them with metallic nanoparticle networks that act as nano-gapped electrodes. DNA hybridization events can be detected by a significant reduction in the sensor's resistance due to the conductive bridging offered by hybridized DNA. By modifying the nanoparticle surface coverage, which can be controlled experimentally as a function of deposition time, and the structural properties of the electrodes, an optimized biosensor for the in situ detection of DNA hybridization events is ultimately fabricated. The fabricated biosensor exhibits a wide response range, covering four orders of magnitude, a limit of detection of 1 nM, and can detect a single base pair mismatch between probe and complementary DNA. Copyright © 2016 Elsevier B.V. All rights reserved.
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...
2016-01-01
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter
2015-01-01
The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are among the most relevant ADEs in internal medicine and geriatric wards. For each patient, ADE risk scores for all four conditions are calculated, compared against a threshold, and flagged for monitoring or reporting. A ward-based cockpit view summarizes the results.
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
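The core of the ME method, combining and averaging single-event classification results to judge a whole trial, can be sketched compactly. This toy is not the authors' trained ErrP classifier: the scores and the 0.5 threshold are hypothetical, and only the averaging idea is illustrated.

```python
# Toy sketch of the multiple-events (ME) idea: average single-event (SE)
# error scores over a motor-imagery trial; flag the trial as erroneous
# only if the mean crosses a threshold, tolerating noisy SE decisions.
def trial_is_erroneous(se_scores, threshold=0.5):
    """True if the averaged SE error score marks the MI trial as erroneous."""
    return sum(se_scores) / len(se_scores) > threshold

# A noisy SE detector: one false alarm in a correct trial and one miss in
# an erroneous trial; averaging still yields the right trial-level call.
correct_trial = [0.2, 0.6, 0.3, 0.4]   # mean 0.375 -> trial kept
error_trial = [0.7, 0.4, 0.8, 0.6]     # mean 0.625 -> trial discarded/repeated
```

This is why the abstract notes that feasible trial-level accuracies are reachable even with low single-event detection rates: averaging washes out individual misclassifications.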
NASA Astrophysics Data System (ADS)
Yun, Jinsik; Ha, Dong Sam; Inman, Daniel J.; Owen, Robert B.
2011-03-01
Structural damage to spacecraft is mainly due to impacts such as collisions with meteorites or space debris. We present a structural health monitoring (SHM) system for space applications, named Adverse Event Detection (AED), which integrates an acoustic sensor, an impedance-based SHM system, and a Lamb wave SHM system. With these three health-monitoring methods in place, we can determine the presence, location, and severity of damage. The acoustic sensor continuously monitors acoustic events, while the impedance-based and Lamb wave SHM systems are in sleep mode. If the acoustic sensor detects an impact, it activates the impedance-based SHM system, which determines whether the impact incurred damage. When damage is detected, it activates the Lamb wave SHM system to determine the severity and location of the damage. Further, since the acoustic sensor dissipates much less power than the two SHM systems, and the two systems are activated only when there is an acoustic event, our system reduces overall power dissipation significantly. Our prototype system demonstrates the feasibility of the proposed concept.
Accelerometer and Camera-Based Strategy for Improved Human Fall Detection.
Zerrouki, Nabil; Harrou, Fouzi; Sun, Ying; Houacine, Amrane
2016-12-01
In this paper, we address the problem of detecting human falls using anomaly detection. Detection and classification of falls are based on accelerometric data and variations in human silhouette shape. First, we use the exponentially weighted moving average (EWMA) monitoring scheme to detect a potential fall in the accelerometric data. We used the EWMA to identify features that correspond with a particular type of fall, allowing us to classify falls. Only features corresponding to detected falls were used in the classification phase. Using a subset of the original data to design the classification models minimizes training time and simplifies the models. Based on features corresponding to detected falls, we used the support vector machine (SVM) algorithm to distinguish between true falls and fall-like events. We apply this strategy to the publicly available fall detection databases from the University of Rzeszów. The results indicated that our strategy accurately detected and classified fall events, suggesting its potential application to early alert mechanisms in fall situations and its capability for classification of detected falls. Comparison of the classification results of the EWMA-based SVM classifier with those achieved using three commonly used machine learning classifiers, neural network, K-nearest neighbor, and naïve Bayes, showed our model to be superior.
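The EWMA detection stage lends itself to a compact sketch. The chart below is a generic EWMA control chart, not the paper's tuned detector: the smoothing weight, control-limit width, and accelerometer trace are invented, and in the real system alarms would feed the SVM classification stage.

```python
# Toy EWMA control chart: smooth the signal and raise an alarm whenever
# the EWMA statistic leaves its (steady-state) control limits.
def ewma_alarm(x, lam=0.3, L=3.0, mu0=0.0, sigma=1.0):
    """Return indices where the EWMA statistic exceeds the control limits."""
    z, alarms = mu0, []
    # steady-state control limit of the EWMA chart
    limit = L * sigma * (lam / (2 - lam)) ** 0.5
    for t, xt in enumerate(x):
        z = lam * xt + (1 - lam) * z
        if abs(z - mu0) > limit:
            alarms.append(t)
    return alarms

# Quiet accelerometer noise followed by a fall-like spike:
signal = [0.1, -0.2, 0.2, 0.0, 5.0, 4.5, 0.3]
```

Because the EWMA carries memory of recent samples, the alarm persists for a few samples after the spike, which is what allows a window of fall-related features to be handed to the classifier.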
Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash
2010-04-01
The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.
Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang
2011-01-01
This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise and errors during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the phase of blob tracking associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
Abnormal global and local event detection in compressive sensing domain
NASA Astrophysics Data System (ADS)
Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem
2018-05-01
Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection that can cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed. It can locate abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to efficiently represent movement features based on optical flow. Abnormality detection is then formulated as a one-class classification task, learning only from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark abnormal detection datasets against state-of-the-art methods.
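The compressive-sensing ingredient, projecting a high-dimensional flow feature through a sparse random measurement matrix, can be sketched generically. This is a standard very sparse random projection (Achlioptas/Li style), not the paper's specific matrix design, and all dimensions are made up.

```python
# Generic sketch: compress an optical-flow feature vector x into a few
# measurements y = M @ x using a very sparse random measurement matrix.
import random

def sparse_measurement_matrix(m, n, s=3, seed=0):
    """Each entry is +scale or -scale with probability 1/(2s) each,
    and 0 otherwise, so roughly (1 - 1/s) of entries are zero."""
    rng = random.Random(seed)
    scale = (s / m) ** 0.5
    matrix = []
    for _ in range(m):
        row = []
        for _ in range(n):
            r = rng.random()
            row.append(scale if r < 1 / (2 * s) else -scale if r < 1 / s else 0.0)
        matrix.append(row)
    return matrix

def project(matrix, x):
    """Compress feature vector x into len(matrix) measurements."""
    return [sum(mij * xj for mij, xj in zip(row, x)) for row in matrix]
```

The compressed measurements, rather than the raw flow field, would then be the input to the one-class classifier trained on normal samples.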
Monitoring Chewing and Eating in Free-Living Using Smart Eyeglasses.
Zhang, Rui; Amft, Oliver
2018-01-01
We propose to 3-D-print personally fitted, regular-look smart eyeglasses frames equipped with bilateral electromyography recording to monitor temporalis muscle activity for automatic dietary monitoring. Personal fitting ensures electrode-skin contact at the temple ear-bend and temple-end positions. We evaluated the smart monitoring eyeglasses in in-lab and free-living studies of food chewing and eating event detection with ten participants. The in-lab study was designed to explore three natural food hardness levels and determine parameters of an energy-based chewing cycle detection. Our free-living study investigated whether chewing monitoring and eating event detection using smart eyeglasses is feasible in free-living conditions. An eating event detection algorithm was developed to determine intake activities based on the estimated chewing rate. Results showed an average food hardness classification accuracy of 94% and chewing cycle detection precision and recall above 90% for the in-lab study and above 77% for the free-living study, covering 122 hours of recordings. Eating event detection identified the 44 eating events with an average accuracy above 95%. We conclude that smart eyeglasses are suitable for monitoring chewing and eating events in free-living conditions and could even provide further insights into the wearer's natural chewing patterns.
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, in here the domain anomaly detection methodology is applied to the problem of anomaly detection for a video annotation system.
Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation
NASA Astrophysics Data System (ADS)
Drasco, Steve; Flanagan, Éanna É.
2002-12-01
We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.
Model Based Verification of Cyber Range Event Environments
2015-12-10
Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA. ... apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment ... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error ...
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing an increasingly important role in social life. Real-time alerting of threatening events and searching for content of interest in large-scale stored video footage require a human operator to pay full attention to monitors for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach compensates for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved matching approach and easily checked rules enable the framework to work in real time. Future work is also discussed.
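Of the pipeline stages above, frame differencing is the simplest to sketch. The toy below works per pixel on small grayscale frames represented as lists of lists; the threshold and the frames are invented, and a real system would operate on camera images after background-motion compensation.

```python
# Toy frame differencing: mark a pixel as foreground (1) when its
# grayscale intensity changes by more than a threshold between frames.
def frame_difference(prev, curr, threshold=25):
    """Per-pixel foreground mask: 1 where |curr - prev| exceeds threshold."""
    return [[1 if abs(c - p) > threshold else 0
             for c, p in zip(crow, prow)]
            for crow, prow in zip(curr, prev)]

prev_frame = [[10, 10], [10, 10]]
curr_frame = [[10, 200], [10, 10]]   # one pixel changed sharply
```

The resulting mask is what downstream stages (HOG classification, mean-shift tracking) would consume as candidate foreground regions.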
Radiation detector device for rejecting and excluding incomplete charge collection events
Bolotnikov, Aleksey E.; De Geronimo, Gianluigi; Vernon, Emerson; Yang, Ge; Camarda, Giuseppe; Cui, Yonggang; Hossain, Anwar; Kim, Ki Hyun; James, Ralph B.
2016-05-10
A radiation detector device is provided that is capable of distinguishing between full charge collection (FCC) events and incomplete charge collection (ICC) events based upon a correlation value comparison algorithm that compares correlation values calculated for individually sensed radiation detection events with a calibrated FCC event correlation function. The calibrated FCC event correlation function serves as a reference curve utilized by a correlation value comparison algorithm to determine whether a sensed radiation detection event fits the profile of the FCC event correlation function within the noise tolerances of the radiation detector device. If the radiation detection event is determined to be an ICC event, then the spectrum for the ICC event is rejected and excluded from inclusion in the radiation detector device spectral analyses. The radiation detector device also can calculate a performance factor to determine the efficacy of distinguishing between FCC and ICC events.
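The correlation-value comparison can be illustrated with a toy: correlate each sensed event waveform against the calibrated FCC reference and reject events that fall outside the noise tolerance. The Pearson correlation measure, the 0.05 tolerance, and the waveforms below are illustrative assumptions, not the device's actual algorithm or calibration.

```python
# Toy FCC/ICC discrimination: accept an event as full charge collection
# (FCC) only if its correlation with a calibrated FCC reference waveform
# stays within a noise tolerance of perfect correlation.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def is_fcc(event_waveform, reference_waveform, tolerance=0.05):
    """True if the event fits the FCC reference profile within tolerance;
    otherwise treat it as incomplete charge collection (ICC) and reject."""
    return pearson(event_waveform, reference_waveform) >= 1.0 - tolerance
```

An ICC event whose charge signal saturates early correlates poorly with the linear FCC reference, so its spectrum would be excluded from the spectral analysis.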
Secure access control and large scale robust representation for online multimedia event detection.
Liu, Changyu; Lu, Bin; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, a secure access control issue and a large-scale robust representation issue arise when integrating traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.
Supervised Time Series Event Detector for Building Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-13
A machine learning based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data, and any data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects days with abnormal events, and diagnoses the cause of the events.
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
Fighting detection using interaction energy force
NASA Astrophysics Data System (ADS)
Wateosot, Chonthisa; Suvonvorn, Nikom
2017-02-01
Fighting detection is an important security issue aimed at preventing criminal or undesirable events in public places. Much research in computer vision has studied detecting specific events in crowded scenes. In this paper we focus on fighting detection using a social-based Interaction Energy Force (IEF). The method uses low-level features without object extraction and tracking. The interaction force is modeled using the magnitude and direction of optical flows. A fighting factor is derived from this model to detect fighting events by thresholding. An energy map of the interaction force is also presented to identify the corresponding events. The evaluation is performed using the NUSHGA and BEHAVE datasets. The results show high accuracy under various conditions.
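A toy rendering of the interaction-force idea: treat each region's optical flow as a vector (dx, dy) and let large, mutually opposing flows accumulate "energy", which is thresholded into a fighting decision. The pairwise form and the threshold are guesses at the flavor of IEF, not the authors' formulation.

```python
# Toy interaction-energy proxy: opposing optical-flow vectors contribute
# energy proportional to their magnitudes; coherent motion contributes none.
import math

def fighting_factor(flows):
    """Sum magnitude-weighted opposition over all pairs of flow vectors."""
    energy = 0.0
    for i, (dx1, dy1) in enumerate(flows):
        for dx2, dy2 in flows[i + 1:]:
            m1 = math.hypot(dx1, dy1)
            m2 = math.hypot(dx2, dy2)
            if m1 and m2:
                cos = (dx1 * dx2 + dy1 * dy2) / (m1 * m2)
                energy += m1 * m2 * max(0.0, -cos)  # only opposing pairs count
    return energy

def is_fighting(flows, threshold=5.0):
    """Threshold the fighting factor into a per-frame decision."""
    return fighting_factor(flows) > threshold
```

A crowd walking in one direction yields zero energy, while two groups rushing at each other yields a large factor, matching the intuition that fights show strong, conflicting motion without needing object tracking.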
A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS)
Rigi, Amin
2018-01-01
In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution, and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials, and weights to compare the precision and response time of the proposed approach. The proposed approach is validated using a high-speed camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that affected the results significantly. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential of the sensor being used for manipulation applications, especially in dynamic environments. PMID:29364190
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Huber, David J.; Martin, Kevin
2017-05-01
This paper describes a technique in which we improve upon the prior performance of the Rapid Serial Visual Presentation (RSVP) EEG paradigm for image classification through the insertion of visual attention distracters and overall sequence reordering based upon the expected ratio of rare to common "events" in the environment and operational context. Inserting distracter images maintains the ratio of common events to rare events at an ideal level, maximizing rare event detection via the P300 EEG response to the RSVP stimuli. The method has two steps: first, we compute the optimal number of distracters needed for an RSVP sequence based on the desired sequence length and expected number of targets and insert the distracters into the sequence; then we reorder the RSVP sequence to maximize P300 detection. We show that by reducing the ratio of target events to nontarget events using this method, we can allow RSVP sequences with more targets without sacrificing area under the ROC curve (azimuth).
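The first step, computing how many distracters to insert, is simple arithmetic. The sketch below assumes "ideal level" means targets make up at most a fixed fraction of the final sequence; the 10% default is a hypothetical figure, not a value from the paper.

```python
# Toy distracter budget: pad the RSVP sequence with distracters until
# targets form at most desired_rare_ratio of the presented images,
# keeping target appearances in the rare-event (P300-eliciting) regime.
import math

def num_distracters(n_targets, n_nontargets, desired_rare_ratio=0.1):
    """Number of distracter images to insert into the RSVP sequence."""
    total_needed = math.ceil(n_targets / desired_rare_ratio)
    return max(0, total_needed - n_targets - n_nontargets)
```

With 5 expected targets among 20 non-targets and a 10% rare-event budget, 25 distracters are needed; if the sequence is already long enough, none are inserted.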
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
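The sparse similarity graph described above, candidate events as vertices and similarities as weighted edges, can be grouped with a standard union-find pass. This is a generic connected-components sketch under an assumed similarity threshold, not the authors' full post-processing pipeline (which also scores and filters candidates).

```python
# Toy grouping of similarity-detector output: connect candidate events
# whose pairwise waveform similarity exceeds a threshold, then report
# the connected components as groups of similar (repeating) events.
def similar_event_groups(n, edges, min_similarity=0.8):
    """edges: (i, j, similarity) triples over candidate events 0..n-1."""
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i, j, w in edges:
        if w >= min_similarity:
            parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

Each resulting group is a cluster of mutually similar waveforms, a candidate earthquake family, to which a confidence score could then be assigned.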
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for radioactive material detection research. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra with low total counts, especially for complex radionuclide compositions. In this paper, a simulation platform implementing the sequential Bayesian methodology was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
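Candy's processor operates event by event on photon arrivals; as a simpler stand-in that conveys the sequential flavor, here is a Wald sequential probability ratio test on per-interval Poisson counts. The background and source rates and the error targets are hypothetical, and this is explicitly not the paper's Bayesian processor.

```python
# Toy sequential test: decide background-only vs. source-present from a
# stream of per-interval gamma counts, stopping as soon as the
# log-likelihood ratio crosses a decision boundary.
import math

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.01, beta=0.01):
    """Wald SPRT for Poisson counts.
    Returns ('source' | 'background' | 'undecided', intervals_used)."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for n, k in enumerate(counts, 1):
        # per-interval Poisson log-likelihood ratio (the k! terms cancel)
        llr += k * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr >= upper:
            return "source", n
        if llr <= lower:
            return "background", n
    return "undecided", len(counts)
```

The appeal mirrors the abstract's point: a strong source is declared after very few intervals, so verification time shrinks exactly when counts are informative.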
Automatic Detection and Classification of Audio Events for Road Surveillance Applications.
Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine
2018-06-06
This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents, with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy when compared against methods that use individual temporal and spectral features.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequences via quantification method IV, and group similar audit event sequences based on cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin
2006-12-01
We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity, or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating the times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.
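As a toy illustration of angular-velocity-based gait event detection (not the authors' validated algorithms: the swing threshold, sign conventions, and the trace below are invented), one can mark IC at the zero-crossing after a swing peak and EC at the rise back above zero:

```python
# Toy IC/EC detector on sagittal angular velocity (rad/s): a swing phase
# is a region above swing_threshold; initial contact (IC) is taken where
# the signal falls below zero after a swing, end contact (EC) where it
# rises above zero before the next swing.
def detect_gait_events(omega, swing_threshold=2.0):
    """Return a list of ('IC'|'EC', sample_index) events."""
    events = []
    in_swing = False
    for t in range(1, len(omega)):
        if omega[t] > swing_threshold:
            in_swing = True
        if in_swing and omega[t - 1] >= 0 > omega[t]:
            events.append(("IC", t))
            in_swing = False
        if not in_swing and omega[t - 1] <= 0 < omega[t]:
            events.append(("EC", t))
    return events
```

On a synthetic two-stride trace the detector alternates EC and IC, which is the event timing that foot switches would provide directly.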
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present an automatic extraction of goal events in soccer videos using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio content comprises three steps: 1) extraction of audio features from a video sequence, 2) event candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
Picking vs Waveform based detection and location methods for induced seismicity monitoring
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2017-04-01
Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas production, mining operations, and geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by many events with short inter-event times, whose seismic signatures may even overlap. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison in terms of performance between the waveform-based method LOKI and the pick-based detection and location methods SCAUTOLOC and SCANLOC, implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at local scale. Although recent applications have proved promising, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections to one or many potential earthquake sources. On the other hand, SCAUTOLOC is a more conventional method and is the basic tool for seismic event detection and location in SeisComP3.
This approach was specifically designed for regional and teleseismic applications, so its performance with microseismic data might be limited. We analyze the performance of the three methodologies on a synthetic dataset with realistic noise conditions as well as on the first hour of continuous waveform data, including the Ml 3.5 St. Gallen earthquake, recorded by a microseismic network deployed in the area. We finally compare the results obtained with all three methods against a manually revised catalogue.
Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection
Liu, Changyu; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, integrating traditional event detection algorithms into an online environment raises two issues: secure access control and large-scale robust representation. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches. PMID:25147840
Pick- and waveform-based techniques for real-time detection of induced seismicity
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2018-05-01
The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production or mining and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step; the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and we compare its performance with two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches with both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project in the vicinity of the city of St. Gallen, Switzerland.
Piezoelectric-based self-powered electronic adjustable impulse switches
NASA Astrophysics Data System (ADS)
Rastegar, Jahangir; Kwok, Philip
2018-03-01
Novel piezoelectric-based self-powered impulse detecting switches are presented. The switches are designed to detect shock loading events resulting in acceleration or deceleration above prescribed levels and durations. The prescribed acceleration level and duration thresholds are adjustable, and the switches are provided with false-trigger protection logic. The impulse switches are provided with electronic and logic circuitry to detect prescribed impulse events and reject events such as high-amplitude but short-duration shocks, as well as transportation vibration and similar low-amplitude, relatively long-duration events. They can be mounted directly onto electronic circuit boards, thereby significantly simplifying the electrical and electronic circuitry, simplifying the assembly process, reducing total cost, significantly reducing the occupied volume, and in some applications eliminating the need for physical wiring to and from the impulse switches. The design of prototypes and testing under realistic conditions are presented.
Method of controlling cyclic variation in engine combustion
Davis, L.I. Jr.; Daw, C.S.; Feldkamp, L.A.; Hoard, J.W.; Yuan, F.; Connolly, F.T.
1999-07-13
Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling. 27 figs.
Assessing the continuum of event-based biosurveillance through an operational lens.
Corley, Courtney D; Lancaster, Mary J; Brigantic, Robert T; Chung, James S; Walters, Ronald A; Arthur, Ray R; Bruckner-Lea, Cynthia J; Calapristi, Augustin; Dowling, Glenn; Hartley, David M; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K; McKenzie, Taylor; Nelson, Noele P; Olsen, Jennifer; Pancerella, Carmen; Quitugua, Teresa N; Reed, Jeremy Todd; Thomas, Carla S
2012-03-01
This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have a significant impact on the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance technologies is unclear. This article frames the continuum of event-based biosurveillance systems (that fuse media reports from the internet), models (ie, computational models that forecast disease occurrence), and constructs (ie, descriptive analytical reports) through an operational lens (ie, aspects and attributes associated with operational considerations in the development, testing, and validation of the event-based biosurveillance methods and models and their use in an operational environment). A workshop was held in 2010 to scientifically identify, develop, and vet a set of attributes for event-based biosurveillance. Subject matter experts were invited from 7 federal government agencies and 6 different academic institutions pursuing research in biosurveillance event detection. We describe 8 attribute families for the characterization of event-based biosurveillance: event, readiness, operational aspects, geographic coverage, population coverage, input data, output, and cost. Ultimately, the analyses provide a framework from which the broad scope, complexity, and relevant issues germane to event-based biosurveillance useful in an operational environment can be characterized.
Rapid Landslide Mapping by Means of Post-Event Polarimetric SAR Imagery
NASA Astrophysics Data System (ADS)
Plank, Simon; Martinis, Sandro; Twele, Andre
2016-08-01
Rapid mapping of landslides, quickly providing information about the extent of the affected area and the type and grade of damage, is crucial to enable fast crisis response. A review of the literature shows that most synthetic aperture radar (SAR) data-based landslide mapping procedures use change detection techniques. However, the required very high resolution (VHR) pre-event SAR imagery, acquired shortly before the landslide event, is commonly not available: due to limitations in onboard disk space and downlink transmission rates, modern VHR SAR missions do not systematically cover the entire world. We present a fast and robust procedure for mapping of landslides, based on change detection between freely available, systematically acquired pre-event optical data and post-event polarimetric SAR data.
Semantic Concept Discovery for Large Scale Zero Shot Event Detection
2015-07-25
sources and can be shared among many different events, including unseen ones. Based on this idea, events can be detected by inspecting the individual...2013]. Partial success along this vein has also been achieved in the zero-shot setting, e.g. [Habibian et al., 2014; Wu et al., 2014], but the...candle", "birthday cake" and "applauding". Since concepts are shared among many different classes (events) and each concept classifier can be trained
Exploiting semantics for sensor re-calibration in event detection systems
NASA Astrophysics Data System (ADS)
Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini
2008-01-01
Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it remains a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames, which cannot be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas by using an appliance as an example--coffee pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), or by the thief following the victim or interacting with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos that are to be observed.
Multi-Sensor Data Fusion Project
2000-02-28
seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The...between underwater explosions (H), underground sources, mostly earthquake-generated (7), and noise detections (N). The phases classified as H are the only...processing for infrasound sensors is most similar to seismic array processing, with the exception that the detections are based on a more sophisticated
Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video
Lee, Gil-beom; Lee, Myeong-jin; Lee, Woo-Kyung; Park, Joo-heon; Kim, Tae-Hwan
2017-01-01
Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos. PMID:28327515
Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H
2012-04-27
To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify the fragments that spanned the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the biases between the observed and true values of three samples were lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
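As a back-of-the-envelope check, the quoted copy numbers follow from the DNA mass per reaction; the haploid carnation genome mass used below (~0.63 pg) is an assumption chosen for illustration, not a value stated in the abstract:

```python
# Sanity check of the quoted copy numbers. The haploid carnation genome mass
# (~0.63 pg) is an assumed value for illustration only.
HAPLOID_GENOME_PG = 0.63

total_dna_ng = 100.0    # total carnation genomic DNA in the reaction
gmo_fraction = 0.0005   # 0.05% qualitative detection limit

gmo_dna_pg = total_dna_ng * 1000.0 * gmo_fraction  # ng -> pg
copies = gmo_dna_pg / HAPLOID_GENOME_PG
print(round(copies))  # about 79 haploid genome copies, as quoted
```

With this assumed genome mass, 0.05% of 100 ng is 50 pg of transgenic DNA, which reproduces the ~79 copies reported for the qualitative assay.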
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
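The shallow bag-of-words features described above (tokens in a small sliding window on either side of the event trigger word) can be sketched as follows; the function and feature names are illustrative, not taken from the authors' system:

```python
# Minimal sketch of sliding-window bag-of-words features around a trigger word.
def window_bow_features(tokens, trigger_idx, window=3):
    """Count tokens within +/-window positions of the trigger (trigger excluded)."""
    lo = max(0, trigger_idx - window)
    hi = min(len(tokens), trigger_idx + window + 1)
    feats = {}
    for i in range(lo, hi):
        if i == trigger_idx:
            continue
        key = "bow=" + tokens[i].lower()
        feats[key] = feats.get(key, 0) + 1
    return feats

sent = "analysis of IkappaBalpha phosphorylation was not observed here".split()
print(window_bow_features(sent, sent.index("phosphorylation")))
```

In the paper these shallow features are combined with deep-parser features before training the Maximum Entropy learner; here only the shallow side is shown.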
Ontology-based knowledge management for personalized adverse drug events detection.
Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue
2011-01-01
Since Adverse Drug Event (ADE) has become a leading cause of death around the world, there arises high demand for helping clinicians or patients to identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with the focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate the personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define the ADE ontology to uniformly manage the ADE knowledge from multiple sources. We take advantage of the rich semantics from the terminology SNOMED-CT and apply it to ADE detection via the semantic query and reasoning.
Detection of dominant flow and abnormal events in surveillance video
NASA Astrophysics Data System (ADS)
Kwak, Sooyeong; Byun, Hyeran
2011-02-01
We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not require detecting and tracking each moving object individually. The proposed algorithm identifies dominant flow without individual object tracking using a latent Dirichlet allocation model in crowded environments. It can also automatically detect and localize an abnormally moving object in real-life video. The performance tests are taken with several real-life databases, and their results show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The proposed algorithm can be applied to any situation in which abnormal directions or abnormal speeds are detected, regardless of direction.
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated, and its significance is used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when the outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
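The pipeline above (wavelet details, PCA approximation, Hotelling's T² outlier test) can be sketched on synthetic spectra. This is a minimal numpy-only illustration that substitutes a one-level Haar wavelet for the paper's coiflet and omits the sequential Bayesian alarm stage:

```python
import numpy as np

def haar_detail(x):
    # one-level Haar DWT detail coefficients (stand-in for the coiflet used in
    # the paper): they respond to abrupt changes along the wavelength axis
    x = x[: len(x) // 2 * 2]
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 0.01, size=(200, 64))  # clean synthetic UV spectra
event = baseline[:1].copy()
event[0, 25] += 1.5                               # sharp simulated contaminant peak

# PCA on the wavelet details of the baseline observations
D = np.array([haar_detail(s) for s in baseline])
mu = D.mean(axis=0)
_, S, Vt = np.linalg.svd(D - mu, full_matrices=False)
k = 5
P = Vt[:k].T                       # leading principal loadings
lam = (S[:k] ** 2) / (len(D) - 1)  # variances of the leading components

def t2(spectrum):
    # Hotelling's T^2 statistic of one spectrum in the k-dimensional PCA subspace
    score = (haar_detail(spectrum) - mu) @ P
    return float(np.sum(score ** 2 / lam))

scores = [t2(s) for s in baseline]
threshold = np.percentile(scores, 99)
print(t2(event[0]) > threshold)  # the contaminated spectrum is flagged
```

In the paper a contamination alarm is only raised when outliers persist over several consecutive observations; that sequential Bayesian step is not reproduced here.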
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
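The orientation-histogram descriptor at the heart of such methods can be sketched as follows (a minimal numpy illustration on a toy flow field, not the authors' implementation; bin count and magnitude threshold are assumptions):

```python
import numpy as np

def flow_orientation_histogram(u, v, bins=8, min_mag=1e-3):
    # Magnitude-weighted histogram of optical-flow orientations over a frame.
    mag = np.hypot(u, v)
    ang = np.arctan2(v, u)                 # orientation in [-pi, pi]
    mask = mag > min_mag                   # ignore near-static pixels
    hist, _ = np.histogram(ang[mask], bins=bins, range=(-np.pi, np.pi),
                           weights=mag[mask])
    total = hist.sum()
    return hist / total if total > 0 else hist

# toy flow field: the whole frame moves right except one small region moving up
u = np.ones((32, 32))
v = np.zeros((32, 32))
u[8:12, 8:12] = 0.0
v[8:12, 8:12] = 1.0
h = flow_orientation_histogram(u, v)
print(h.argmax())  # the dominant bin corresponds to rightward motion
```

In the paper, descriptors like this, computed during a period of normal behavior, train a one-class SVM (with kernel PCA) so that frames with unusual orientation statistics stand out.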
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Our algorithm also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and to a field surface passive data set recorded at a geothermal site.
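The conventional running-window energy-ratio (STA/LTA) detector that the authors extend can be sketched as follows. This numpy illustration stacks the per-station ratios rather than reproducing the authors' cross-correlation step, and all window lengths and amplitudes are hypothetical:

```python
import numpy as np

def sta_lta(trace, nsta=5, nlta=50):
    # short-term / long-term average energy ratio along one trace
    e = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(e)))
    out = np.zeros(len(e))
    for i in range(nlta, len(e) - nsta):
        sta = (csum[i + nsta] - csum[i]) / nsta       # energy ahead of i
        lta = (csum[i] - csum[i - nlta]) / nlta       # energy behind i
        out[i] = sta / lta if lta > 0 else 0.0
    return out

rng = np.random.default_rng(1)
n, onset = 600, 300
traces = rng.normal(0, 1, size=(4, n))                # 4 noisy stations
for tr in traces:                                     # same weak event on every station
    tr[onset:onset + 40] += 3.0 * np.sin(np.arange(40))

ratios = np.array([sta_lta(tr) for tr in traces])
# combine stations: an event common to all stations reinforces in the stack,
# while uncorrelated noise bursts average out (the paper instead cross-correlates
# the per-station energy ratios to measure this similarity)
stack = ratios.mean(axis=0)
peak = int(np.argmax(stack))
print(onset - 10 <= peak <= onset + 45)  # detection lands near the true onset
```

The distinction the abstract draws between regional and local events corresponds to whether the per-station ratios agree at all stations or only at a subset.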
Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model.
Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai
2017-02-08
Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences.
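The threshold-based pass over the multiphase fall model (free fall, then impact, then rest) can be sketched as follows; all thresholds and window lengths below are hypothetical values for illustration, not the paper's calibrated parameters:

```python
# Hypothetical thresholds on accelerometer magnitude, in g.
FREE_FALL_G, IMPACT_G, REST_G = 0.4, 2.5, 1.1

def detect_fall(mags, max_gap=20, rest_len=10):
    """Flag a free-fall sample followed by an impact within max_gap samples,
    followed by rest_len samples of near-1 g stillness."""
    for i, m in enumerate(mags):
        if m >= FREE_FALL_G:
            continue                      # not a free-fall candidate
        for j in range(i + 1, min(i + 1 + max_gap, len(mags))):
            if mags[j] >= IMPACT_G:
                after = mags[j + 1 : j + 1 + rest_len]
                if len(after) == rest_len and max(after) <= REST_G:
                    return True           # free fall -> impact -> rest
    return False

fall = [1.0] * 20 + [0.2] * 5 + [3.1] + [1.0] * 15        # fall-like pattern
walk = [1.0 + 0.2 * ((k % 4) - 1.5) for k in range(40)]   # rhythmic daily activity
print(detect_fall(fall), detect_fall(walk))  # True False
```

In the paper this threshold stage only proposes candidates; the knowledge-based stage then reasons over the multiphase model to resolve the variability and ambiguity the abstract describes.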
Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model
Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai
2017-01-01
Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences. PMID:28208694
Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes
Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun
2014-01-01
We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the low-level feature descriptors commonly used in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain the information of a single event. Second, the LNND descriptor is a compact representation and its dimensionality is typically much lower than that of a low-level feature descriptor. Therefore, not only can computation time and storage requirements be saved by using the LNND descriptor for the anomaly detection method with an offline training fashion, but the negative aspects caused by using a high-dimensional feature descriptor can also be avoided. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worthwhile to notice that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, but achieves comparable or even better performance. PMID:25105164
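The core idea of a nearest-neighbor-distance anomaly score can be sketched as follows; this simplified numpy illustration scores whole-event descriptors and omits the local spatio-temporal context that distinguishes the actual LNND descriptor:

```python
import numpy as np

def nn_distance_scores(train, test, k=3):
    # anomaly score = mean distance to the k nearest training descriptors;
    # events far from everything seen during training score high
    d = np.linalg.norm(test[:, None, :] - train[None, :, :], axis=2)
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

rng = np.random.default_rng(2)
normal = rng.normal(0, 1, size=(200, 4))            # descriptors of normal events
queries = np.vstack([rng.normal(0, 1, size=(1, 4)),  # a normal-looking event
                     np.full((1, 4), 6.0)])          # an anomalous event
s = nn_distance_scores(normal, queries)
print(s[1] > s[0])  # the anomalous event gets the larger score
```

In the paper, such distances are computed locally around each video event, so the resulting descriptor is itself compact and can feed an offline-trained detector.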
Thermal wake/vessel detection technique
Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM
2012-01-10
A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
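The grouping of flagged pixels into clusters can be sketched as a standard connected-component pass over the thermal anomaly mask. The mask below and the 4-connectivity choice are illustrative assumptions, and the subsequent shape analysis of each cluster is omitted.

```python
def cluster_flagged_pixels(mask):
    """Group 4-connected True pixels of a boolean mask (a list of rows)
    into clusters; returns a list of pixel-coordinate lists."""
    rows, cols = len(mask), len(mask[0])
    seen = set()
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                stack, cluster = [(r, c)], []
                seen.add((r, c))
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    cluster.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                clusters.append(cluster)
    return clusters

mask = [[False, True,  True,  False],
        [False, True,  False, False],
        [False, False, False, True]]
print([len(c) for c in cluster_flagged_pixels(mask)])  # → [3, 1]
```

Each returned cluster would then be passed to a shape test (e.g., elongation consistent with a wake) to decide whether it represents a possible vessel detection event.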
Concept and Analysis of a Satellite for Space-Based Radio Detection of Ultra-High Energy Cosmic Rays
NASA Astrophysics Data System (ADS)
Romero-Wolf, Andrew; Gorham, P.; Booth, J.; Chen, P.; Duren, R. M.; Liewer, K.; Nam, J.; Saltzberg, D.; Schoorlemmer, H.; Wissel, S.; Zairfian, P.
2014-01-01
We present a concept for on-orbit radio detection of ultra-high energy cosmic rays (UHECRs) that has the potential to provide collection rates of ~100 events per year for energies above 10^20 eV. The synoptic wideband orbiting radio detector (SWORD) mission's high event statistics at these energies, combined with the pointing capabilities of a space-borne antenna array, could enable charged particle astronomy. The detector concept is based on ANITA's successful detection of UHECRs, in which the geosynchrotron radio signal produced by the extended air shower is reflected off the Earth's surface and detected in flight.
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W.; Benz, H.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detection, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E
2007-01-01
To develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false-positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
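The base-hold idea can be sketched with a much simpler window model: the window mean below stands in for the paper's DCT/SSA subspace model, and the window length, threshold, and signal are illustrative assumptions.

```python
def base_hold_changepoints(signal, win=5, thresh=2.0):
    """Change-point detection on non-overlapping windows with a
    'base-hold' reference: while a change is in progress, the reference
    model is frozen, so a run of successive similar events (e.g. FHR
    decelerations) does not corrupt the baseline.  The window mean
    stands in for the paper's DCT/SSA subspace model."""
    ref = sum(signal[:win]) / win
    changes, in_change = [], False
    for i in range(win, len(signal) - win + 1, win):
        cur = sum(signal[i:i + win]) / win
        if abs(cur - ref) > thresh:
            if not in_change:          # onset of a new change
                changes.append(i)
                in_change = True       # base-hold: `ref` is not updated
        else:
            in_change = False
            ref = cur                  # track the signal when quiescent
    return changes

# A baseline of 10 with two successive dips (decelerations).
sig = [10]*10 + [4]*6 + [10]*6 + [4]*6 + [10]*10
print(base_hold_changepoints(sig))  # → [10, 20]
```

Without the hold, the reference would adapt to the first dip and the recovery itself would be flagged as a change; freezing the reference during an event is what lets closely spaced decelerations be detected individually.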
Christian, Kira A; Iuliano, A Danielle; Uyeki, Timothy M; Mintz, Eric D; Nichol, Stuart T; Rollin, Pierre; Staples, J Erin; Arthur, Ray R
To better track public health events in areas where the public health system is unable or unwilling to report the event to appropriate public health authorities, agencies can conduct event-based surveillance, which is defined as the organized collection, monitoring, assessment, and interpretation of unstructured information regarding public health events that may represent an acute risk to public health. The US Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Operations Center (GDDOC) was created in 2007 to serve as CDC's platform dedicated to conducting worldwide event-based surveillance, which is now highlighted as part of the "detect" element of the Global Health Security Agenda (GHSA). The GHSA works toward making the world safer and more secure from disease threats by building capacity to better "Prevent, Detect, and Respond" to those threats. The GDDOC monitors approximately 30 to 40 public health events each day. In this article, we describe the top threats to public health monitored during 2012 to 2016: avian influenza, cholera, Ebola virus disease, and the vector-borne diseases yellow fever, chikungunya virus, and Zika virus, with updates to the previously described threats from Middle East respiratory syndrome-coronavirus (MERS-CoV) and poliomyelitis.
Shang, Ying; Xu, Wentao; Wang, Yong; Xu, Yuancong; Huang, Kunlun
2017-12-15
This study describes a novel multiplex qualitative detection method using pyrosequencing. Based on the principle of universal-primer multiplex PCR, only one sequencing primer was employed to detect multiple targets. Samples containing three genetically modified (GM) crops in different proportions were used to validate the method. The dNTP dispensing order was designed based on the product sequences. Only 12 rounds of dNTP addition (ATCTGATCGACT), and as few as three rounds (CAT) under ideal conditions, were required to detect the GM events qualitatively, with a sensitivity as low as 1% of a mixture. Moreover, for mixtures, calculating signal values allowed the proportion of each GM event to be estimated. Based on these results, we conclude that the novel method not only enables detection but also allows semi-quantitative estimation of individual events.
Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
There are about 80 biotech crop events that have been approved through safety assessment in Korea. They are controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems, with DNA-based detection methods used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip are efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications.
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection:
- Minimizes the utilization of individual seismic phase detections; in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation.
- Has a formalized framework that utilizes information from non-detecting stations.
- Has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest.
- Is entirely model-based, i.e. does not rely on a priori event information (particularly important for nuclear monitoring).
- Does not rely on individualized signal detection thresholds; it is the network solution that matters.
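The Bayesian network combination described above can be sketched in a simplified form, with the source and attenuation models abstracted into per-station likelihood ratios. The prior and ratio values below are illustrative assumptions, not outputs of ProbDet.

```python
import math

def network_posterior(prior, station_likelihood_ratios):
    """Combine per-station evidence for an event hypothesis in log-odds
    form: posterior odds = prior odds x product of likelihood ratios
    P(data | event) / P(data | noise).  Stations with a ratio < 1
    (non-detecting stations) legitimately lower the posterior instead
    of being ignored, as they would be in pick-based association."""
    log_odds = math.log(prior / (1 - prior))
    for lr in station_likelihood_ratios:
        log_odds += math.log(lr)
    return 1 / (1 + math.exp(-log_odds))

prior = 0.01  # prior probability of an event in this space-time cell
print(network_posterior(prior, [30, 20, 15]))   # three detecting stations: high posterior
print(network_posterior(prior, [30, 0.5, 0.4])) # one detection, two quiet stations: low posterior
```

This is the sense in which the framework "utilizes information from non-detecting stations": a quiet station contributes a likelihood ratio below one rather than simply being absent from the solution.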
Pies, Ross E.
2016-03-29
A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier, based on the detection of disturbances within the optical fiber.
Systematic detection of seismic events at Mount St. Helens with an ultra-dense array
NASA Astrophysics Data System (ADS)
Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.
2016-12-01
During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intense swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring seismic velocity changes at Mount St. Helens using coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
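The matched-filter step can be sketched as normalized cross-correlation of a template against continuous data, with a detection declared wherever the correlation coefficient exceeds a threshold. The template, data, and threshold here are illustrative; real implementations stack correlations over many channels and use a noise-adaptive threshold.

```python
def normalized_cc(template, data):
    """Slide `template` along `data`; return the normalized
    cross-correlation coefficient at each lag (range -1..1)."""
    n = len(template)
    tm = sum(template) / n
    t0 = [t - tm for t in template]
    tnorm = sum(x * x for x in t0) ** 0.5
    out = []
    for lag in range(len(data) - n + 1):
        seg = data[lag:lag + n]
        sm = sum(seg) / n
        s0 = [s - sm for s in seg]
        snorm = sum(x * x for x in s0) ** 0.5 or 1e-12  # guard flat segments
        out.append(sum(a * b for a, b in zip(t0, s0)) / (tnorm * snorm))
    return out

def matched_filter_detect(template, data, thresh=0.8):
    """Return lags where the template matches the data above `thresh`."""
    return [lag for lag, c in enumerate(normalized_cc(template, data))
            if c >= thresh]

template = [0, 1, -2, 1, 0]
data = [0]*5 + [0, 2, -4, 2, 0] + [0]*5  # scaled copy of the template
print(matched_filter_detect(template, data))  # → [5]
```

Because the correlation is amplitude-normalized, the scaled copy in the data still correlates perfectly, which is why matched filters can find repeating events well below the power-detection threshold.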
Information-based self-organization of sensor nodes of a sensor network
Ko, Teresa H [Castro Valley, CA; Berry, Nina M [Tracy, CA
2011-09-20
A sensor node detects a plurality of information-based events. The sensor node determines whether at least one other sensor node is an information neighbor of the sensor node based on at least a portion of the plurality of information-based events. The information neighbor has an overlapping field of view with the sensor node. The sensor node sends at least one communication to the at least one other sensor node that is an information neighbor of the sensor node in response to at least one information-based event of the plurality of information-based events.
Detecting event-based prospective memory cues occurring within and outside the focus of attention.
Hicks, Jason L; Cook, Gabriel I; Marsh, Richard L
2005-01-01
Event-based prospective memory cues are environmental stimuli that are associated with a previously established intention to perform an activity. Such cues traditionally have been placed in materials that receive focal attention during an ongoing activity. This article reports a direct comparison of event-based cues that occurred either within the focus of attention or at the periphery of such attention. When the cue occurred outside focal attention, manipulating that cue changed event-based prospective memory. The identical manipulation had no effect on event-based responding if the cue occurred within focal attention. These results suggest that cue characteristics can compensate for attention being directed away from an aspect of an ongoing task that contains event-based prospective memory.
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle: to solve it and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real-time and often completely miss sub-events, or pieces, in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be carried out in real time, so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events.
The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle. PMID:29107976
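The burst detection step applied to each divided stream can be sketched with a simple trailing-window threshold. The window length, the multiplier k, and the message counts are illustrative assumptions, not the paper's algorithm.

```python
import statistics

def detect_bursts(counts, win=6, k=2.0):
    """Flag time bins whose message count exceeds the trailing-window
    mean by more than k standard deviations: a simple stand-in for the
    burst detector applied to each divided event stream."""
    bursts = []
    for i in range(win, len(counts)):
        hist = counts[i - win:i]
        mu = statistics.fmean(hist)
        sd = statistics.pstdev(hist) or 1.0  # guard constant history
        if counts[i] > mu + k * sd:
            bursts.append(i)
    return bursts

# Messages per time bin for one divided stream: two bursty sub-events.
counts = [5, 6, 5, 7, 6, 5, 40, 6, 5, 50, 6]
print(detect_bursts(counts))  # → [6, 9]
```

Each flagged bin would then have its content features extracted and recombined with those of other streams to summarize the corresponding sub-event.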
Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing
2005-11-30
Because of the genetically modified organism (GMO) labeling policies issued in many countries and areas, polymerase chain reaction (PCR) methods, including screening, gene-specific, construct-specific, and event-specific PCR detection methods, were developed to implement those policies and have become a mainstay of GMO detection. Event-specific PCR detection is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were tested using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems are reliable, sensitive, and accurate.
Real-time detection and classification of anomalous events in streaming data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.
2016-04-19
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
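Scoring streamed events by anomalousness can be sketched with a simple probabilistic model: score each event by its improbability under a model fitted to recent history. The categorical model, add-one smoothing, and event labels below are illustrative assumptions, not the patented system's detectors.

```python
import math
from collections import Counter

def anomaly_scores(events, history):
    """Score each event type by its improbability under a categorical
    model fitted to `history`: score = -log P(event).  Add-one
    smoothing gives unseen event types the highest (most anomalous)
    scores instead of infinite ones."""
    counts = Counter(history)
    total = len(history) + len(counts) + 1  # smoothed denominator
    def score(e):
        return -math.log((counts.get(e, 0) + 1) / total)
    return {e: score(e) for e in set(events)}

# Mostly-HTTP traffic history; rank new events, most anomalous first.
history = ["http"] * 90 + ["dns"] * 9 + ["ssh"]
scores = anomaly_scores(["http", "ssh", "telnet"], history)
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # → ['telnet', 'ssh', 'http']
```

In the described system the anomalousness score would then be combined with a separate classification step (e.g. malicious vs. benign) before the events are displayed to the user.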
NASA Astrophysics Data System (ADS)
Kim, W. Y.; Richards, P. G.
2017-12-01
At least four small seismic events were detected around the North Korean nuclear test site following the 3 September 2017 underground nuclear test (UNT). The magnitudes of these shocks range from 2.6 to 3.5. Based on their proximity to the 3 September UNT, these shocks may be considered aftershocks of the UNT. We assess the best method to classify these small events based on spectral amplitude ratios of regional P and S waves from the shocks. None of these shocks are classified as explosion-like based on P/S spectral amplitude ratios. We examine additional possible small seismic events around the North Korean test site by using seismic data from stations in southern Korea and northeastern China, including IMS seismic arrays, GSN stations, and regional network stations in the region.
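P/S spectral-ratio discrimination of the kind described can be sketched as follows. The frequency band, spectral amplitudes, and decision threshold are illustrative assumptions, not the values used in the study; the general premise that explosions tend toward higher high-frequency P/S ratios than earthquakes is the standard discriminant.

```python
import math

def log_ps_ratio(p_spectrum, s_spectrum):
    """Mean log10 P/S spectral amplitude ratio over matched frequency
    bins.  Explosions tend to show higher P/S than earthquakes at high
    frequencies."""
    ratios = [math.log10(p / s) for p, s in zip(p_spectrum, s_spectrum)]
    return sum(ratios) / len(ratios)

def classify(p_spectrum, s_spectrum, thresh=0.2):
    """Threshold discriminant; `thresh` is illustrative, not calibrated."""
    if log_ps_ratio(p_spectrum, s_spectrum) > thresh:
        return "explosion-like"
    return "earthquake-like"

# Hypothetical spectral amplitudes in matched high-frequency bins.
print(classify([4.0, 3.5, 3.0], [1.0, 1.0, 1.0]))  # explosion-like
print(classify([1.0, 0.9, 0.8], [2.0, 2.0, 2.0]))  # earthquake-like
```

An event whose mean log ratio falls below the threshold, as for the aftershocks described above, would not be classified as explosion-like.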
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodge, D. A.; Harris, D. B.
2016-03-15
Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework – a system which autonomously creates correlation detectors from event waveforms detected by power detectors – and reports observed performance on a network of arrays in terms of efficiency. We performed a large-scale test of dynamic correlation processors on an 11-terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges, approaching 70% for near-regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the number of correlators in an autonomous system can grow into the hundreds of thousands.
Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian
2013-07-01
Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time and thereby reduce time in hypoglycemia and avoid any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which the hypoglycemic event is not detected by the CGM. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data of 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Nondiscriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a Support Vector Machine model and the performance assessed by sample-based sensitivity and specificity and event-based sensitivity and number of false-positives. The best model was composed by using seven features and was able to detect 17 of 17 hypoglycemic events with one false-positive compared with 12 of 17 hypoglycemic events with zero false-positives for the CGM alone. Lead-time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
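A few of the CGM-signal features named above can be sketched directly: linear-regression slope, skewness, and kurtosis over one window. The window values are illustrative, not patient data, and the SEPCOR feature selection and Support Vector Machine stages are omitted.

```python
from statistics import fmean, pstdev

def cgm_features(window):
    """Features of the kind listed in the abstract, computed over one
    CGM window: linear-regression slope, skewness, and excess kurtosis."""
    n = len(window)
    xs = list(range(n))
    mx, my = fmean(xs), fmean(window)
    # Least-squares slope: cov(x, y) / var(x).
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, window)) \
        / sum((x - mx) ** 2 for x in xs)
    sd = pstdev(window)
    skew = sum((v - my) ** 3 for v in window) / (n * sd ** 3)
    kurt = sum((v - my) ** 4 for v in window) / (n * sd ** 4) - 3
    return slope, skew, kurt

# Steadily falling glucose (mmol/L), one sample per 5 minutes.
slope, skew, kurt = cgm_features([6.0, 5.6, 5.1, 4.7, 4.2, 3.8])
print(round(slope, 3))  # → -0.446 (negative trend: heading toward hypoglycemia)
```

A classifier fed such features can flag a downward trend before the CGM reading itself crosses the hypoglycemic threshold, which is the source of the 14-minute lead time reported above.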
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fludigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and a circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on the displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms.
While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that use of velocity thresholds for foot on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to on average 16 ms per condition), being greater than the effect on foot-on estimates or foot-off estimates in the forelimbs (up to on average ±7 ms per condition). PMID:26157641
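The threshold-based variant of the stride segmentation described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the validated algorithm: the function name, the single velocity threshold, and the synthetic signal are all hypothetical.

```python
import numpy as np

def detect_stride_events(vz, v_thresh=0.05):
    """Threshold-based stride event detection (illustrative sketch).

    vz       : vertical velocity of a dorsal hoof marker (m/s)
    v_thresh : |velocity| below which the hoof is taken as stationary

    Returns (foot_on, foot_off) sample indices: foot-on where |vz|
    drops below the threshold, foot-off where it rises above it again.
    """
    stationary = np.abs(vz) < v_thresh
    # Transitions: False -> True is foot-on, True -> False is foot-off.
    d = np.diff(stationary.astype(int))
    foot_on = np.where(d == 1)[0] + 1
    foot_off = np.where(d == -1)[0] + 1
    return foot_on, foot_off

# Synthetic example: hoof swings, rests on the ground, swings again.
vz = np.concatenate([np.full(50, 1.0), np.zeros(40), np.full(50, -1.0)])
on, off = detect_stride_events(vz)
```

An event-based variant would instead locate specific features (e.g. extrema or zero crossings) in the derived displacement, velocity or acceleration signals rather than applying a fixed threshold.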
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the present report, the events to be detected could also include natural phenomena of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
FORTE Compact Intra-cloud Discharge Detection parameterized by Peak Current
NASA Astrophysics Data System (ADS)
Heavner, M. J.; Suszcynsky, D. M.; Jacobson, A. R.; Heavner, B. D.; Smith, D. A.
2002-12-01
The Los Alamos Sferic Array (EDOT) has recorded over 3.7 million lightning-related fast electric field change data records during April 1 - August 31, 2001 and 2002. The events were detected by three or more stations, allowing for differential-time-of-arrival location determination. The waveforms are characterized with estimated peak currents as well as by event type. Narrow Bipolar Events (NBEs), the VLF/LF signature of Compact Intra-cloud Discharges (CIDs), are generally isolated pulses with identifiable ionospheric reflections, permitting determination of event source altitudes. We briefly review the EDOT characterization of events. The FORTE satellite observes Trans-Ionospheric Pulse Pairs (TIPPs, the VHF satellite signature of CIDs). The subset of coincident EDOT and FORTE CID observations are compared with the total EDOT CID database to characterize the VHF detection efficiency of CIDs. The NBE polarity and altitude are also examined in the context of FORTE TIPP detection. The parameter-dependent detection efficiencies are extrapolated from FORTE orbit to GPS orbit in support of the V-GLASS effort (GPS based global detection of lightning).
Detecting modification of biomedical events using a deep parsing approach
2012-01-01
Background: This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method: To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results: Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions: Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
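The shallow sliding-window features described above can be illustrated with a short sketch. The function name and feature-string format are hypothetical; the actual system combines such features with deep-parser features before training the Maximum Entropy learner.

```python
def window_features(tokens, trigger_idx, width=3):
    """Bag-of-words context features around an event trigger (sketch).

    Collects the tokens within `width` positions on either side of the
    trigger word, the kind of shallow feature fed to a MaxEnt learner.
    """
    lo = max(0, trigger_idx - width)
    hi = min(len(tokens), trigger_idx + width + 1)
    return {f"bow={t}" for i, t in enumerate(tokens[lo:hi], lo)
            if i != trigger_idx}
```

For the sentence "we observed no phosphorylation of IkappaBalpha in cells" with the trigger "phosphorylation", a width-2 window yields the context words on either side of the trigger as features.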
Dimension-based attention in visual short-term memory.
Pilling, Michael; Barrett, Doug J K
2016-07-01
We investigated how dimension-based attention influences visual short-term memory (VSTM). This was done through examining the effects of cueing a feature dimension in two perceptual comparison tasks (change detection and sameness detection). In both tasks, a memory array and a test array consisting of a number of colored shapes were presented successively, interleaved by a blank interstimulus interval (ISI). In Experiment 1 (change detection), the critical event was a feature change in one item across the memory and test arrays. In Experiment 2 (sameness detection), the critical event was the absence of a feature change in one item across the two arrays. Auditory cues indicated the feature dimension (color or shape) of the critical event with 80% validity; the cues were presented either prior to the memory array, during the ISI, or simultaneously with the test array. In Experiment 1, the cue validity influenced sensitivity only when the cue was given at the earliest position; in Experiment 2, the cue validity influenced sensitivity at all three cue positions. We attributed the greater effectiveness of top-down guidance by cues in the sameness detection task to the more active nature of the comparison process required to detect sameness events (Hyun, Woodman, Vogel, Hollingworth, & Luck, Journal of Experimental Psychology: Human Perception and Performance, 35, 1140-1160, 2009).
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. Over the last decades, various reaction rule and event processing approaches have been developed, for the most part advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.
Online track detection in triggerless mode for INO
NASA Astrophysics Data System (ADS)
Jain, A.; Padmini, S.; Joseph, A. N.; Mahesh, P.; Preetha, N.; Behere, A.; Sikder, S. S.; Majumder, G.; Behera, S. P.
2018-03-01
The India-based Neutrino Observatory (INO) is a proposed particle physics research project to study atmospheric neutrinos. The INO Iron Calorimeter (ICAL) will consist of 28,800 detectors having 3.6 million electronic channels, expected to activate at a 100 Hz singles rate, producing data at a rate of 3 GBps. The data collected contain a few real hits generated by muon tracks, the remainder being noise-induced spurious hits. The estimated reduction factor after filtering the data of interest out of the generated data is of the order of 10^3. This makes trigger generation critical for efficient data collection and storage. A trigger is generated by detecting coincidence across multiple channels satisfying the trigger criteria within a small window of 200 ns in the trigger region. As the probability of neutrino interaction is very low, the track detection algorithm has to be efficient and fast enough to process 5 × 10^6 event candidates/s without introducing significant dead time, so that not a single neutrino event is missed. A hardware-based trigger system is presently proposed for online track detection, considering the stringent timing requirements. Though the trigger system can be designed with scalability, the many hardware devices and interconnections make it a complex and expensive solution with limited flexibility. A software-based track detection approach working on the hit information offers an elegant solution, with the possibility of varying the trigger criteria to select various potentially interesting physics events. An event selection approach for an alternative triggerless readout scheme has been developed. The algorithm is mathematically simple, robust, and parallelizable. It has been validated by detecting simulated muon events with energies in the range of 1 GeV-10 GeV with 100% efficiency at a processing rate of 60 μs/event on a 16-core machine. The algorithm and the results of a proof-of-concept for its faster implementation over multiple cores are presented. 
The paper also discusses harnessing the computing capabilities of a multi-core computing farm, thereby optimizing the number of nodes required for the proposed system.
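A software coincidence search over sorted hit timestamps, in the spirit of the trigger criteria described above, can be sketched as follows. This is illustrative only; the actual INO event selection also uses hit positions to identify track-like patterns.

```python
def find_coincidences(hit_times, window=200.0, min_hits=3):
    """Sliding-window coincidence search (sketch of a triggerless
    event-candidate selection; parameter names are illustrative).

    hit_times : sorted hit timestamps in ns
    window    : coincidence window in ns (200 ns in the text)
    min_hits  : channels that must fire within the window

    Returns (start, end) index pairs of hit runs that fit within the
    window and contain at least `min_hits` hits.
    """
    events = []
    n = len(hit_times)
    for i in range(n):
        j = i
        # Extend the run while the next hit is still inside the window.
        while j + 1 < n and hit_times[j + 1] - hit_times[i] <= window:
            j += 1
        if j - i + 1 >= min_hits:
            events.append((i, j))
    return events
```

Because each channel hit is a single timestamp, this scan is trivially parallelizable by splitting the (sorted) hit stream into overlapping time slices, matching the multi-core processing the abstract reports.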
Video-Based Affect Detection in Noninteractive Learning Environments
ERIC Educational Resources Information Center
Chen, Yuxuan; Bosch, Nigel; D'Mello, Sidney
2015-01-01
The current paper explores possible solutions to the problem of detecting affective states from facial expressions during text/diagram comprehension, a context devoid of interactive events that can be used to infer affect. These data present an interesting challenge for face-based affect detection because likely locations of affective facial…
Characterization of GM events by insert knowledge adapted re-sequencing approaches
Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing
2013-01-01
Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events. PMID:24088728
Improved Detection of Local Earthquakes in the Vienna Basin (Austria), using Subspace Detectors
NASA Astrophysics Data System (ADS)
Apoloner, Maria-Theresia; Caffagni, Enrico; Bokelmann, Götz
2016-04-01
The Vienna Basin in Eastern Austria is densely populated and highly developed; it is also a region of low to moderate seismicity, yet the seismological network coverage is relatively sparse. This calls for improving our earthquake detection capability by testing new methods and enlarging the existing local earthquake catalogue. This contributes to imaging tectonic fault zones for a better understanding of seismic hazard, also through improved earthquake statistics (b-value, magnitude of completeness). Detection of low-magnitude earthquakes, or events whose highest amplitudes only slightly exceed the signal-to-noise ratio (SNR), may be possible using standard methods like the short-term over long-term average (STA/LTA). However, due to sparse network coverage and high background noise, such a technique may not detect all potentially recoverable events. Yet earthquakes originating from the same source region, relatively close to each other, should be characterized by similar seismic waveforms at a given station. This waveform similarity can be exploited using specific techniques such as correlation-template methods (also known as matched filtering) or subspace detection methods (based on subspace theory). Matching techniques require a reference or template event, usually characterized by high waveform coherence across the array receivers and high SNR, which is cross-correlated with the continuous data. Subspace detection methods, in contrast, avoid the need to define a single template event by using a subspace extracted from multiple events. This approach should in principle be more robust in detecting signals that exhibit strong variability (e.g. because of source or magnitude). In this study we scan the continuous data recorded in the Vienna Basin with a subspace detector to identify additional events. 
This will allow us to estimate the increase of the seismicity rate in the local earthquake catalogue, therefore providing an evaluation of network performance and efficiency of the method.
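The correlation-template (matched filtering) baseline that subspace detection generalizes can be sketched in a few lines. This is a single-channel illustration under stated assumptions; the threshold value and the synthetic data are not from the study, and real implementations stack correlations over stations of a network.

```python
import numpy as np

def matched_filter(data, template, cc_thresh=0.8):
    """Correlation-template (matched-filter) detection, a minimal sketch.

    Slides the template over the continuous record and reports the
    sample offsets where the Pearson correlation coefficient exceeds
    `cc_thresh`.
    """
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    hits = []
    for k in range(len(data) - m + 1):
        w = data[k:k + m]
        sd = w.std()
        if sd == 0:            # flat window: correlation undefined
            continue
        cc = float(np.sum(t * (w - w.mean())) / sd)
        if cc > cc_thresh:
            hits.append(k)
    return hits

# Embed the template in an otherwise quiet record.
template = np.sin(np.linspace(0, 2 * np.pi, 20))
data = np.zeros(100)
data[10:30] = template
```

A subspace detector replaces the single template with an orthonormal basis spanning several aligned event waveforms and thresholds the energy of the data projected onto that basis, which is why it tolerates waveform variability better than a single template.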
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information regarding: site signal detection thresholds, type of solution algorithm used, and range attenuation; to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
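The modeling technique can be illustrated with a discretized sketch: weight each peak-current bin of the PCD by whether enough sites would receive a signal above their thresholds. The 1/r attenuation law and all parameter values here are illustrative assumptions, not the paper's model.

```python
import numpy as np

def detection_efficiency(pcd_currents, pcd_probs, site_dists, site_thresh,
                         min_sites=3, atten=1.0):
    """Network detection efficiency at one location (illustrative sketch).

    pcd_currents, pcd_probs : discretized peak-current distribution
    site_dists  : distance from the flash location to each sensor (km)
    site_thresh : per-site signal detection threshold (arbitrary units)
    min_sites   : sites required for a time-of-arrival solution
    atten       : simple 1/r**atten range-attenuation exponent (assumed)

    For each candidate peak current, the received signal at a site is
    modeled as I / r**atten; the flash is counted as detected if at
    least `min_sites` sites exceed their thresholds. The efficiency is
    the PCD-weighted fraction of detectable currents.
    """
    de = 0.0
    dists = np.asarray(site_dists, dtype=float)
    for current, p in zip(pcd_currents, pcd_probs):
        signal = current / dists ** atten
        if np.sum(signal >= site_thresh) >= min_sites:
            de += p
    return de
```

Mapping this quantity over a grid of locations yields the detection-efficiency contours described in the abstract; summing the undetected probability mass instead gives the density of events going undetected.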
Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew
2013-01-01
Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods were developed for the execution of genetically modified organism (GMO) labeling policies, of which an event-specific PCR detection method based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. The event-specific PCR primers and TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these designed primers and probes. In qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of haploid soybean genomic DNA. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.
A spatial scan statistic for compound Poisson data.
Rosychuk, Rhonda J; Chang, Hsing-Ming
2013-12-20
The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
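For a single candidate window, the Poisson log-likelihood-ratio scan statistic underlying such tests takes a standard Kulldorff-style form, sketched below. This is a simplification: the paper's compound Poisson model additionally accounts for repeated events per individual.

```python
import math

def scan_llr(c, mu, C):
    """Poisson log-likelihood-ratio scan statistic for one candidate
    window (Kulldorff-style sketch).

    c  : observed events inside the window
    mu : expected events inside the window (scaled so the region-wide
         expectation equals the region-wide total C)
    C  : total observed events in the study region

    Returns 0 when the window is not elevated. The window maximizing
    this statistic over all candidate regions is the most likely
    cluster; its significance is then assessed by Monte Carlo.
    """
    if c <= mu:
        return 0.0
    return c * math.log(c / mu) + (C - c) * math.log((C - c) / (C - mu))
```

In the emergency-department application, the counts would be disease-related visits aggregated by geographic unit, with multiple visits per individual motivating the compound Poisson extension.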
Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B
2014-10-15
Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart failure events. 
Copyright © 2014 Elsevier Inc. All rights reserved.
Blowing snow detection from ground-based ceilometers: application to East Antarctica
NASA Astrophysics Data System (ADS)
Gossart, Alexandra; Souverijns, Niels; Gorodetskaya, Irina V.; Lhermitte, Stef; Lenaerts, Jan T. M.; Schween, Jan H.; Mangold, Alexander; Laffineur, Quentin; van Lipzig, Nicole P. M.
2017-12-01
Blowing snow impacts Antarctic ice sheet surface mass balance by snow redistribution and sublimation. However, numerical models poorly represent blowing snow processes, while direct observations are limited in space and time. Satellite retrieval of blowing snow is hindered by clouds and only the strongest events are considered. Here, we develop a blowing snow detection (BSD) algorithm for ground-based remote-sensing ceilometers in polar regions and apply it to ceilometers at Neumayer III and Princess Elisabeth (PE) stations, East Antarctica. The algorithm is able to detect (heavy) blowing snow layers reaching 30 m height. Results show that 78 % of the detected events are in agreement with visual observations at Neumayer III station. The BSD algorithm detects heavy blowing snow 36 % of the time at Neumayer (2011-2015) and 13 % at PE station (2010-2016). Blowing snow occurrence peaks during the austral winter and shows around 5 % interannual variability. The BSD algorithm is capable of detecting blowing snow both lifted from the ground and occurring during precipitation, which is an added value since results indicate that 92 % of the blowing snow is during synoptic events, often combined with precipitation. Analysis of atmospheric meteorological variables shows that blowing snow occurrence strongly depends on fresh snow availability in addition to wind speed. This finding challenges the commonly used parametrizations, where the threshold for snow particles to be lifted is a function of wind speed only. Blowing snow occurs predominantly during storms and overcast conditions, shortly after precipitation events, and can reach up to 1300 m a.g.l. in the case of heavy mixed events (precipitation and blowing snow together). These results suggest that synoptic conditions play an important role in generating blowing snow events and that fresh snow availability should be considered in determining the blowing snow onset.
Continuous robust sound event classification using time-frequency features and deep learning
McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high-performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478
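An energy-based event detection front end of the kind benchmarked above can be sketched as frame-energy thresholding. The frame length, hop and threshold values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def energy_segments(x, frame_len=256, hop=128, thresh_db=-30.0):
    """Energy-based event detection front end (illustrative sketch).

    Splits signal `x` into overlapping frames, computes per-frame log
    energy relative to the loudest frame, and returns the indices of
    frames above `thresh_db`. Consecutive active frames would then be
    grouped into one event and passed to the classifier.
    """
    frames = [x[i:i + frame_len]
              for i in range(0, len(x) - frame_len + 1, hop)]
    energy = np.array([np.sum(f ** 2) for f in frames])
    ref = energy.max()
    log_e = 10 * np.log10(np.maximum(energy, 1e-12) / ref)
    return np.where(log_e > thresh_db)[0]
```

The Bayesian-inspired front end the paper proposes replaces this fixed threshold with a probabilistic decision, which is more robust when noise raises the background energy floor.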
NASA Astrophysics Data System (ADS)
Pötzi, W.; Veronig, A. M.; Temmer, M.
2018-06-01
In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017, and all data back to 2014 were reprocessed using the new algorithm. In order to evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. In order to overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method. We divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The new algorithm reached a hit rate of 96% (just five events were missed) and a false-alarm ratio of 17%. This is a significant improvement, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7 ± 2.9 minutes to 1.3 ± 2.3 minutes. The flare start times, which had been systematically late by about 3 minutes as determined by the original algorithm, now match the visual inspection within −0.47 ± 4.10 minutes.
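The verification measures quoted above (hit rate, false-alarm ratio, true skill score, Heidke skill score) follow standard contingency-table definitions. A minimal sketch, with the function name and sample numbers chosen for illustration rather than taken from the KSO evaluation:

```python
def verification_scores(tp, fp, fn, tn):
    """Standard forecast-verification measures from a 2x2 contingency table."""
    hit_rate = tp / (tp + fn)            # probability of detection
    far = fp / (tp + fp)                 # false-alarm ratio
    tss = hit_rate - fp / (fp + tn)      # true skill score (Hanssen-Kuipers)
    # Heidke skill score: fraction of correct classifications beyond chance
    total = tp + fp + fn + tn
    expected = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / total
    hss = (tp + tn - expected) / (total - expected)
    return hit_rate, far, tss, hss
```

Note that the TSS and HSS coincide when the positive and negative classes are balanced, which is one motivation for the event-based split into equal numbers of flaring and quiet periods.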
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-10-15
To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHIauto) was calculated using both signals, and a respiratory disturbance index (RDIauto) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter-derived pulse wave signal and compared with manual scoring. Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m²) were included in the analysis. AHImanual (19.4 ± 18.5 events/h) correlated highly significantly with AHIauto (19.9 ± 16.5 events/h) and RDIauto (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001) with a mean difference of −0.5 ± 6.6 and −1.0 ± 6.1 events/h. The automatic analysis of AHIauto and RDIauto detected sleep apnea (cutoff AHImanual ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manual scoring (r = 0.87 and 0.95, p < 0.001) with mean differences of −4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing accurately. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep.
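The sensitivity/specificity figures above come from comparing the automatic index against manual scoring at a fixed cutoff (AHI ≥ 15 events/h). A schematic of that comparison, with hypothetical index values:

```python
def sens_spec(auto_idx, manual_idx, cutoff=15.0):
    """Sensitivity and specificity of an automatic apnea index against
    manually scored values, both dichotomized at the same cutoff."""
    pairs = list(zip(auto_idx, manual_idx))
    tp = sum(a >= cutoff and m >= cutoff for a, m in pairs)
    tn = sum(a < cutoff and m < cutoff for a, m in pairs)
    fn = sum(a < cutoff and m >= cutoff for a, m in pairs)
    fp = sum(a >= cutoff and m < cutoff for a, m in pairs)
    return tp / (tp + fn), tn / (tn + fp)
```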
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e., an event mode sequence (EMS), using a statistical approach that combines Bayesian inference with physics-model-based signal processing, in which a radionuclide is represented as a decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval discriminator on the energy amplitude and interarrival time, and the parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test then determines one of two threshold conditions, signifying that the EMS either is or is not identified as the target radionuclide; if not, the process repeats for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
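The decision logic described above is a sequential probability ratio test: accumulate per-event log-likelihood ratios until one of two thresholds is crossed. A minimal sketch; the thresholds and increments here are illustrative, whereas the actual system derives the per-photon likelihoods from the physics model:

```python
def sprt(llr_increments, log_a, log_b):
    """Sequential probability ratio test over a stream of per-photon
    log-likelihood-ratio increments. Returns the decision and the number
    of events consumed before a threshold was crossed."""
    s = 0.0
    for k, inc in enumerate(llr_increments, 1):
        s += inc
        if s >= log_a:
            return "target", k        # EMS identified as the target radionuclide
        if s <= log_b:
            return "not target", k
    return "undecided", len(llr_increments)
```

The appeal of the sequential form is that a confident decision can be reached after very few photon events, which matters for low-count measurements.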
Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.
Housh, Mashor; Ohar, Ziv
2017-03-01
The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulics simulation models, can outperform a signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
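The core of model-based FD is residual analysis: compare each station's readings with the values a physically based simulator predicts and raise an alarm when the discrepancy exceeds a tolerance. A minimal sketch under that assumption (names and threshold are hypothetical, not from the paper):

```python
def detect_faults(measured, simulated, threshold):
    """Model-based fault detection: flag stations whose measured series
    deviates from the simulated (expected) series beyond a threshold.
    measured/simulated: {station: [reading, ...]} over the same timestamps."""
    alarms = []
    for station, series in measured.items():
        residual = max(abs(m - s) for m, s in zip(series, simulated[station]))
        if residual > threshold:
            alarms.append(station)
    return alarms
```

A multi-site extension would evaluate the residuals of all stations jointly (e.g., requiring a hydraulically consistent pattern of deviations) rather than thresholding each station in isolation, which is the advantage the MSEDS exploits.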
Assessing the severity of sleep apnea syndrome based on ballistocardiogram
Zhou, Xingshe; Zhao, Weichao; Liu, Fan; Ni, Hongbo; Yu, Zhiwen
2017-01-01
Background Sleep Apnea Syndrome (SAS) is a common sleep-related breathing disorder, which affects about 4-7% of males and 2-4% of females worldwide. Different approaches have been adopted to diagnose SAS and measure its severity, including the gold standard Polysomnography (PSG) in the sleep study field as well as several alternative techniques such as single-channel ECG, pulse oximetry, and so on. However, many shortcomings still limit their generalization to the home environment. In this study, we aim to propose an efficient approach to automatically assess the severity of sleep apnea syndrome based on the ballistocardiogram (BCG) signal, which is non-intrusive and suitable for the home environment. Methods We develop an unobtrusive sleep monitoring system to capture the BCG signals, based on which we put forward a three-stage sleep apnea syndrome severity assessment framework, i.e., data preprocessing, sleep-related breathing event (SBE) detection, and sleep apnea syndrome severity evaluation. First, in the data preprocessing stage, to overcome the limits of BCG signals (e.g., low precision and reliability), we utilize wavelet decomposition to obtain the outline information of heartbeats, and apply an RR correction algorithm to handle missing or spurious RR intervals. Afterwards, in the event detection stage, we propose an automatic sleep-related breathing event detection algorithm named Physio_ICSS based on the iterative cumulative sums of squares (i.e., the ICSS algorithm), which is originally used to detect structural breakpoints in a time series.
In particular, to efficiently detect sleep-related breathing events in the obtained time series of RR intervals, the proposed algorithm not only explores the practical factors of sleep-related breathing events (e.g., the limit of lasting duration and possible occurrence sleep stages) but also overcomes the event segmentation issue of existing approaches (e.g., an equal-length segmentation method might divide one sleep-related breathing event into different fragments and lead to incorrect results). Finally, by fusing features extracted from multiple domains, we can identify sleep-related breathing events and assess the severity level of sleep apnea syndrome effectively. Conclusions Experimental results on 136 individuals of different sleep apnea syndrome severities validate the effectiveness of the proposed framework, with an accuracy of 94.12% (128/136). PMID:28445548
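The ICSS statistic underlying Physio_ICSS locates variance change points via the centered cumulative sum of squares, D_k = C_k/C_T − k/T. A minimal single-breakpoint sketch; the full iterative algorithm re-applies this test to sub-segments and adds the domain constraints described above:

```python
def icss_breakpoint(x):
    """Centered cumulative-sum-of-squares statistic D_k = C_k/C_T - k/T.
    Returns the index k (1-based) maximising |D_k| and the value there;
    a large |D_k| suggests a variance change point at k."""
    T = len(x)
    c = [0.0]
    for v in x:                      # C_k: cumulative sum of squares
        c.append(c[-1] + v * v)
    ct = c[-1]
    dk = [c[k] / ct - k / T for k in range(1, T + 1)]
    k_star = max(range(T), key=lambda k: abs(dk[k]))
    return k_star + 1, dk[k_star]
```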
Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †
Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang
2017-01-01
Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection of contamination events, and warning to prevent pollution from spreading, is one of the most important issues when pollution occurs. In order to comprehensively reduce the event detection deviation, a spatial-temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. The first part is that M-STED adopts a Rule K algorithm to select backbone nodes, which form a connected dominating set (CDS) and forward the sensed data of multiple water parameters. The second part is to determine the state of each backbone node with back propagation neural network models and sequential Bayesian analysis in the current timestamp. The third part is to establish a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and trace the "outlier" node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with the event detection with a single water parameter algorithm, S-STED. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability. PMID:29207535
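The per-node sequential Bayesian analysis mentioned above can be pictured as a running posterior update on the event hypothesis, where each timestamp contributes a likelihood ratio (in M-STED these would come from the back propagation neural network outputs; the numbers below are purely illustrative):

```python
def sequential_bayes(prior, likelihood_ratios):
    """Sequential Bayesian update of the probability that a node is observing
    a contamination event. Each element of likelihood_ratios is
    P(observation | event) / P(observation | no event) for one timestamp."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr                   # Bayes' rule in odds form
    return odds / (1.0 + odds)
```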
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.
Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric
2016-09-01
The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. 
Of these events, 125 passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record-and-verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have a potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. The pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of the errors that could potentially have been detected were not detected in this study, indicating the need to improve pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
Video-tracker trajectory analysis: who meets whom, when and where
NASA Astrophysics Data System (ADS)
Jäger, U.; Willersinn, D.
2010-04-01
Unveiling unusual or hostile events by observing manifold moving persons in a crowd is a challenging task for human operators, especially when sitting in front of monitor walls for hours. Typically, hostile events are rare. Thus, due to tiredness and negligence, the operator may miss important events. In such situations, an automatic alarming system is able to support the human operator. The system incorporates a processing chain consisting of (1) people tracking, (2) event detection, (3) data retrieval, and (4) display of relevant video sequences overlaid by highlighted regions of interest. In this paper we focus on the event detection stage of the processing chain mentioned above. In our case, the selected event of interest is the encounter of people. Although based on a rather simple trajectory analysis, this kind of event embodies great practical importance because it paves the way to answering the question "who meets whom, when and where". This, in turn, forms the basis for detecting potential situations where, e.g., money, weapons, or drugs are handed over from one person to another in crowded environments like railway stations, airports, or busy streets and places. The input to the trajectory analysis comes from a multi-object video-based tracking system developed at IOSB which is able to track multiple individuals within a crowd in real-time [1]. From this we calculate the inter-distances between all persons on a frame-to-frame basis. We use a sequence of simple rules based on the individuals' kinematics to detect the event mentioned above and to output the frame number, the persons' IDs from the tracker, and the pixel coordinates of the meeting position. Using this information, a data retrieval system may extract the corresponding part of the recorded video image sequence and finally allows for replaying the selected video clip with a highlighted region of interest to attract the operator's attention for further visual inspection.
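The frame-to-frame inter-distance rule described above can be sketched as follows. This is a simplification under stated assumptions (a meeting is just "two tracks stay within a distance threshold for a minimum number of consecutive frames"); the paper's actual rules also use the individuals' kinematics:

```python
import math

def detect_meetings(tracks, dist_thresh, min_frames):
    """Detect 'who meets whom, when and where' from per-frame positions.
    tracks: {person_id: [(x, y), ...]}, one position per frame.
    Reports (id_a, id_b, frame, midpoint) once two persons have stayed
    within dist_thresh for min_frames consecutive frames."""
    ids = sorted(tracks)
    meetings = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            run = 0
            for frame, (pa, pb) in enumerate(zip(tracks[a], tracks[b])):
                if math.hypot(pa[0] - pb[0], pa[1] - pb[1]) <= dist_thresh:
                    run += 1
                    if run == min_frames:    # encounter confirmed this frame
                        mid = ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)
                        meetings.append((a, b, frame, mid))
                else:
                    run = 0
    return meetings
```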
Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain
NASA Astrophysics Data System (ADS)
Krauß, Thomas; Fischer, Peter
2016-08-01
In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News coverage shows that the majority of occurring natural hazards are flood events, and many flood prediction systems have accordingly been developed. But most of these existing systems for deriving areas endangered by flooding events are based only on horizontal and vertical distances to existing rivers and lakes. Typically, such systems do not take into account dangers arising directly from heavy rain events. In a study we conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and analyze the results using the available insurance data.
Detecting Single-Nucleotide Substitutions Induced by Genome Editing.
Miyaoka, Yuichiro; Chan, Amanda H; Conklin, Bruce R
2016-08-01
The detection of genome editing is critical in evaluating genome-editing tools or conditions, but it is not an easy task to detect genome-editing events, especially single-nucleotide substitutions, without a surrogate marker. Here we introduce a procedure that significantly contributes to the advancement of genome-editing technologies. It uses droplet digital polymerase chain reaction (ddPCR) and allele-specific hydrolysis probes to detect single-nucleotide substitutions generated by genome editing (via homology-directed repair, or HDR). HDR events that introduce substitutions using donor DNA are generally infrequent, even with genome-editing tools, and the outcome is only one base pair difference in 3 billion base pairs of the human genome. This task is particularly difficult in induced pluripotent stem (iPS) cells, in which editing events can be very rare. Therefore, the technological advances described here have implications for therapeutic genome editing and experimental approaches to disease modeling with iPS cells. © 2016 Cold Spring Harbor Laboratory Press.
Detection of Epileptic Seizure Event and Onset Using EEG
Ahammad, Nabeel; Fathima, Thasneem; Joseph, Paul
2014-01-01
This study proposes a method of automatic detection of epileptic seizure events and onset using wavelet-based features and certain statistical features computed without wavelet decomposition. Normal and epileptic EEG signals were classified using a linear classifier. For seizure event detection, the Bonn University EEG database was used. Three types of EEG signals (EEG recorded from healthy volunteers with eyes open, from epilepsy patients in the epileptogenic zone during a seizure-free interval, and from epilepsy patients during epileptic seizures) were classified. Important features such as energy, entropy, standard deviation, maximum, minimum, and mean at different subbands were computed, and classification was done using a linear classifier. The performance of the classifier was determined in terms of specificity, sensitivity, and accuracy. The overall accuracy was 84.2%. In the case of seizure onset detection, the database used was the CHB-MIT scalp EEG database. Along with wavelet-based features, interquartile range (IQR) and mean absolute deviation (MAD) without wavelet decomposition were extracted. Latency was used to study the performance of seizure onset detection. The classifier gave a sensitivity of 98.5% with an average latency of 1.76 seconds. PMID:24616892
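Most of the statistical features listed above are straightforward to compute per EEG segment. A minimal sketch of an illustrative feature set (the IQR estimator here is a crude index-based one, not necessarily the interpolation the study used):

```python
import statistics

def segment_features(x):
    """Statistical features of one EEG segment, as used alongside
    wavelet sub-band features (illustrative selection)."""
    energy = sum(v * v for v in x)
    mean = statistics.fmean(x)
    std = statistics.pstdev(x)
    xs = sorted(x)
    n = len(xs)
    iqr = xs[(3 * n) // 4] - xs[n // 4]                # crude interquartile range
    mad = statistics.fmean(abs(v - mean) for v in x)   # mean absolute deviation
    return {"energy": energy, "mean": mean, "std": std,
            "max": max(x), "min": min(x), "iqr": iqr, "mad": mad}
```

Feature vectors of this kind, computed per sub-band after wavelet decomposition, are what the linear classifier is trained on.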
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-01-01
Study Objective: To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Methods: Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHIauto) was calculated using both signals, and a respiratory disturbance index (RDIauto) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter derived pulse wave signal and compared with manual scoring. Results: Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m2) were included in the analysis. AHImanual (19.4 ± 18.5 events/h) correlated highly significantly with AHIauto (19.9 ± 16.5 events/h) and RDIauto (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001) with a mean difference of −0.5 ± 6.6 and −1.0 ± 6.1 events/h. The automatic analysis of AHIauto and RDIauto detected sleep apnea (cutoff AHImanual ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manually scoring (r = 0.87 and 0.95, p < 0.001) with mean difference of −4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Conclusions: Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep. Citation: Sommermeyer D; Zou D; Grote L; Hedner J. 
Detection of sleep disordered breathing and its central/obstructive character using nasal cannula and finger pulse oximeter. J Clin Sleep Med 2012;8(5):527-533. PMID:23066364
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bame, D.
To determine if seismic signals at frequencies up to 50 Hz are useful for detecting events and discriminating between earthquakes and explosions, approximately 180 events from the three-component high-frequency seismic element (HFSE) installed at the center of the Norwegian Regional Seismic Array (NRSA) have been analyzed. The attenuation of high-frequency signals in Scandinavia varies with distance, azimuth, magnitude, and source effects. Most of the events were detected with the HFSE, although detections were better on the NRSA, where signal processing techniques were used. Based on a preliminary analysis, high-frequency data do not appear to be a useful discriminant in Scandinavia. 21 refs., 29 figs., 3 tabs.
NASA Astrophysics Data System (ADS)
Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli
2015-04-01
A large subset of seismic events does not have impulsive arrivals: low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that has clear arrivals on only three stations and was therefore not located and reported by the Mexican National Seismological Service (SSN). The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A very rough estimate places these two events in the portion of the East Pacific Rise around 9 N. These two events are detected despite their distance from the search area, owing to favorable move-out on the SSN network array. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
Call, Rosemary J.; Burlison, Jonathan D.; Robertson, Jennifer J.; Scott, Jeffrey R.; Baker, Donald K.; Rossi, Michael G.; Howard, Scott C.; Hoffman, James M.
2014-01-01
Objective To investigate the use of a trigger tool for adverse drug event (ADE) detection in a pediatric hospital specializing in oncology, hematology, and other catastrophic diseases. Study design A medication-based trigger tool package analyzed electronic health records from February 2009 to February 2013. Chart review determined whether an ADE precipitated the trigger. Severity was assigned to ADEs, and preventability was assessed. Preventable ADEs were compared with the hospital’s electronic voluntary event reporting system to identify whether these ADEs had been previously identified. The positive predictive values (PPVs) of the entire trigger tool and individual triggers were calculated to assess their accuracy to detect ADEs. Results Trigger occurrences (n=706) were detected in 390 patients from six medication triggers, 33 of which were ADEs (overall PPV = 16%). Hyaluronidase had the highest PPV (60%). Most ADEs were category E harm (temporary harm) per the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) index. One event was category H harm (intervention to sustain life). Naloxone was associated with the most grade 4 ADEs per the Common Terminology Criteria for Adverse Events (CTCAE) v4.03. Twenty-one (64%) ADEs were preventable; 3 of which were submitted via the voluntary reporting system. Conclusion Most of the medication-based triggers yielded low PPVs. Refining the triggers based on patients’ characteristics and medication usage patterns could increase the PPVs and make them more useful for quality improvement. To efficiently detect ADEs, triggers must be revised to reflect specialized pediatric patient populations such as hematology and oncology patients. PMID:24768254
Call, Rosemary J; Burlison, Jonathan D; Robertson, Jennifer J; Scott, Jeffrey R; Baker, Donald K; Rossi, Michael G; Howard, Scott C; Hoffman, James M
2014-09-01
To investigate the use of a trigger tool for the detection of adverse drug events (ADE) in a pediatric hospital specializing in oncology, hematology, and other catastrophic diseases. A medication-based trigger tool package analyzed electronic health records from February 2009 to February 2013. Chart review determined whether an ADE precipitated the trigger. Severity was assigned to ADEs, and preventability was assessed. Preventable ADEs were compared with the hospital's electronic voluntary event reporting system to identify whether these ADEs had been previously identified. The positive predictive values (PPVs) of the entire trigger tool and individual triggers were calculated to assess their accuracy to detect ADEs. Trigger occurrences (n = 706) were detected in 390 patients from 6 medication triggers, 33 of which were ADEs (overall PPV = 16%). Hyaluronidase had the greatest PPV (60%). Most ADEs were category E harm (temporary harm) per the National Coordinating Council for Medication Error Reporting and Prevention index. One event was category H harm (intervention to sustain life). Naloxone was associated with the most grade 4 ADEs per the Common Terminology Criteria for Adverse Events v4.03. Twenty-one (64%) ADEs were preventable, 3 of which were submitted via the voluntary reporting system. Most of the medication-based triggers yielded low PPVs. Refining the triggers based on patients' characteristics and medication usage patterns could increase the PPVs and make them more useful for quality improvement. To efficiently detect ADEs, triggers must be revised to reflect specialized pediatric patient populations such as hematology and oncology patients. Copyright © 2014 Elsevier Inc. All rights reserved.
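The per-trigger positive predictive values reported above are simply confirmed ADEs divided by trigger firings. A trivial sketch (the trigger names and counts are illustrative, not the study's data):

```python
def trigger_ppv(trigger_counts, ade_counts):
    """Positive predictive value per medication trigger:
    chart-review-confirmed ADEs / total trigger occurrences."""
    return {t: ade_counts.get(t, 0) / n for t, n in trigger_counts.items()}
```

Ranking triggers by this PPV is what identifies low-yield triggers as candidates for refinement against the patient population's characteristics and medication usage patterns.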
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
NASA Technical Reports Server (NTRS)
Turso, James A.; Lawrence, Charles; Litt, Jonathan S.
2007-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/ health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
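A wavelet-based feature of the kind described above tracks how energy distributes across scales: a short transient (FOD-like) event produces a jump in the fine-scale detail energy of the accelerometer frame. A minimal sketch using the Haar wavelet (the paper's specific wavelet, sensor placement, and feature definition are not reproduced here):

```python
def haar_step(x):
    """One level of the Haar wavelet transform: approximation and detail."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    return a, d

def detail_energies(signal, levels=3):
    """Per-level detail energies of a signal frame; a transient event
    concentrates energy in the fine-scale (early) levels."""
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(v * v for v in detail))
    return energies
```

Localizing the event in time then amounts to sliding this analysis over successive frames and flagging the frame where the detail energy departs from its noise baseline.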
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2014-12-01
The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events are significantly different from ones recorded on Earth, in terms of both signal shape and source processes. Thus they are a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, has the goal of investigating Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time windows within a few days after the receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes, we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (year of station installation, with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As only one weak shallow moonquake is covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%.
The algorithm enables us to classify events that were previously listed in the catalog without classification, and, through the combined use of long period and short period data, identify some unlisted local impacts as well as at least two yet unreported deep moonquakes.
Automatic identification of alpine mass movements based on seismic and infrasound signals
NASA Astrophysics Data System (ADS)
Schimmel, Andreas; Hübl, Johannes
2017-04-01
The automatic detection and identification of alpine mass movements such as debris flows, debris floods or landslides is of increasing importance for mitigation measures in the densely populated and intensively used alpine regions. Since these mass movement processes emit characteristic seismic and acoustic waves in the low frequency range, such events can be detected and identified from these signals. Several approaches for detection and warning systems based on seismic or infrasound signals have already been developed, but a combination of both methods, which can increase detection probability and reduce false alarms, is currently used very rarely and is therefore a promising basis for an automatic detection and identification system. This work presents an approach for a detection and identification system based on a combination of seismic and infrasound sensors, which can detect sediment-related mass movements from a remote location unaffected by the process. The system is based on one infrasound sensor and one geophone, installed co-located, and a microcontroller on which a specially designed detection algorithm runs, detecting mass movements in real time directly at the sensor site. Furthermore, this work attempts to extract more information from the seismic and infrasound spectra produced by different sediment-related mass movements, in order to identify the process type and estimate the magnitude of the event. The system is currently installed and tested at five test sites in Austria, two in Italy, one in Switzerland and one in Germany. This large number of test sites yields a database of very different events, which will be the basis for a new identification method for alpine mass movements. The tests show promising results, and the system thus provides an easy-to-install and inexpensive approach to a detection and warning system.
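The benefit of requiring coincident seismic and infrasound triggers can be sketched with a classic STA/LTA (short-term average over long-term average) detector. STA/LTA is a standard seismological trigger, not necessarily the algorithm this system uses; the window lengths, threshold, and synthetic data below are illustrative assumptions.

```python
def sta_lta(samples, sta_len=5, lta_len=20):
    """Short-term over long-term average energy ratio; the long-term
    window ends where the short-term window begins."""
    ratios = []
    for i in range(sta_len + lta_len, len(samples) + 1):
        sta = sum(x * x for x in samples[i - sta_len:i]) / sta_len
        lta = sum(x * x for x in samples[i - lta_len - sta_len:i - sta_len]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def coincidence_trigger(seismic, infrasound, threshold=4.0):
    """Declare an event only where BOTH channels exceed the threshold;
    requiring coincidence is what suppresses single-channel false alarms."""
    rs, ri = sta_lta(seismic), sta_lta(infrasound)
    return [i for i, (a, b) in enumerate(zip(rs, ri))
            if a > threshold and b > threshold]

# Synthetic data: quiet background, a burst on both channels at samples 50-59.
seis = [0.1] * 100
for n in range(50, 60):
    seis[n] = 5.0
infra = list(seis)
quiet = [0.1] * 100

print(bool(coincidence_trigger(seis, infra)))  # -> True (both channels see it)
print(bool(coincidence_trigger(seis, quiet)))  # -> False (one channel only)
```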
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
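A Euclidean-distance detector in the spirit of MED can be sketched as follows. The window sizes, threshold, and baseline choice are illustrative assumptions, not the paper's exact formulation; the point is that the detector reacts strongly to a sudden spike-like variation.

```python
import math

def med_detect(series, window=5, threshold=2.0):
    """Alarm when the Euclidean distance between the current window and
    the immediately preceding baseline window exceeds the threshold."""
    alarms = []
    for i in range(2 * window, len(series) + 1):
        baseline = series[i - 2 * window:i - window]
        current = series[i - window:i]
        if math.dist(baseline, current) > threshold:
            alarms.append(i)
    return alarms

# Stable water-quality readings with a spike-like variation at index 30.
readings = [1.0] * 40
readings[30] = 6.0
alarms = med_detect(readings)
print(alarms[0])  # -> 31, the first position where the spike enters the window
```

Note that in this toy version the alarm also persists while the spike sits in the baseline window; a production detector would handle that differently.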
Event-driven simulation in SELMON: An overview of EDSE
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.
1992-01-01
EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
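The event-consumption and event-creation mechanics of an event-driven simulator of this kind reduce to a priority queue of timestamped events, where consuming one event may schedule further events. The sketch below is a generic illustration, not the actual EDSE implementation; all names are invented.

```python
import heapq
import itertools

class EventSim:
    """Toy event-driven simulation core: timestamped events in a priority
    queue; consuming an event may create (schedule) new ones."""
    def __init__(self):
        self._queue = []
        self._counter = itertools.count()  # tie-breaker for equal times
        self.now = 0.0
        self.log = []

    def schedule(self, delay, name, action=None):
        heapq.heappush(self._queue,
                       (self.now + delay, next(self._counter), name, action))

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, name, action = heapq.heappop(self._queue)
            self.log.append((self.now, name))  # event consumption
            if action is not None:
                action(self)  # event creation during consumption

def pump_cycle(sim):
    sim.schedule(2.0, "pump_on", pump_cycle)  # a recurring predicted event

sim = EventSim()
sim.schedule(1.0, "pump_on", pump_cycle)
sim.run(until=6.0)
print([t for t, _ in sim.log])  # -> [1.0, 3.0, 5.0]
```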
Spawning behaviour of Allis shad Alosa alosa: new insights based on imaging sonar data.
Langkau, M C; Clavé, D; Schmidt, M B; Borcherding, J
2016-06-01
Spawning behaviour of Alosa alosa was observed by high resolution imaging sonar. Detected clouds of sexual products and micro bubbles served as a potential indicator of spawning activity. Peak spawning time was between 0130 and 0200 hours at night. Increasing detections over three consecutive nights were consistent with sounds of mating events (bulls) assessed in hearing surveys conducted in parallel to the hydroacoustic detection. In 70% of the analysed mating events there were no additional A. alosa joining the event, whilst 70% of the mating events showed one or two A. alosa leaving the cloud. In 31% of the analysed mating events, however, three or more A. alosa were leaving the clouds, indicating that matings are not restricted to a pair. Imaging sonar is suitable for monitoring spawning activity and behaviour of anadromous clupeids in their spawning habitats. © 2016 The Fisheries Society of the British Isles.
Iuliano, A. Danielle; Uyeki, Timothy M.; Mintz, Eric D.; Nichol, Stuart T.; Rollin, Pierre; Staples, J. Erin; Arthur, Ray R.
2017-01-01
To better track public health events in areas where the public health system is unable or unwilling to report the event to appropriate public health authorities, agencies can conduct event-based surveillance, which is defined as the organized collection, monitoring, assessment, and interpretation of unstructured information regarding public health events that may represent an acute risk to public health. The US Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Operations Center (GDDOC) was created in 2007 to serve as CDC's platform dedicated to conducting worldwide event-based surveillance, which is now highlighted as part of the “detect” element of the Global Health Security Agenda (GHSA). The GHSA works toward making the world safer and more secure from disease threats through building capacity to better “Prevent, Detect, and Respond” to those threats. The GDDOC monitors approximately 30 to 40 public health events each day. In this article, we describe the top threats to public health monitored during 2012 to 2016: avian influenza, cholera, Ebola virus disease, and the vector-borne diseases yellow fever, chikungunya virus, and Zika virus, with updates to the previously described threats from Middle East respiratory syndrome-coronavirus (MERS-CoV) and poliomyelitis. PMID:28805465
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, Olga; Zeng, Jing; Novak, Avrey
Purpose: The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. Methods: This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but “potentially detectable” by the physics review, and (3) events “not detectable” by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Results: Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review.
Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Conclusions: Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
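One of the checks the study singles out as automatable, verifying that the plan matches the prescription, amounts to a field-by-field comparison. The sketch below is a toy illustration; the field names and values are invented, not taken from any treatment planning system.

```python
def plan_vs_prescription(plan, prescription):
    """Return the fields where the treatment plan disagrees with the
    prescription; an empty list means the check passes."""
    return [field for field, rx_value in prescription.items()
            if plan.get(field) != rx_value]

# Invented example fields: dose per fraction, fraction count, technique.
rx = {"dose_per_fraction_Gy": 2.0, "fractions": 30, "technique": "IMRT"}
plan_ok = {"dose_per_fraction_Gy": 2.0, "fractions": 30, "technique": "IMRT"}
plan_bad = {"dose_per_fraction_Gy": 2.0, "fractions": 25, "technique": "IMRT"}

print(plan_vs_prescription(plan_ok, rx))   # -> []
print(plan_vs_prescription(plan_bad, rx))  # -> ['fractions']
```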
Barboza, Philippe; Vaillant, Laetitia; Le Strat, Yann; Hartley, David M.; Nelson, Noele P.; Mawudeku, Abla; Madoff, Lawrence C.; Linge, Jens P.; Collier, Nigel; Brownstein, John S.; Astagneau, Pascal
2014-01-01
Background Internet-based biosurveillance systems have been developed to detect health threats using information available on the Internet, but system performance has not been assessed relative to end-user needs and perspectives. Method and Findings Infectious disease events from the French Institute for Public Health Surveillance (InVS) weekly international epidemiological bulletin published in 2010 were used to construct the gold-standard official dataset. Data from six biosurveillance systems were used to detect raw signals (infectious disease events from informal Internet sources): Argus, BioCaster, GPHIN, HealthMap, MedISys and ProMED-mail. Crude detection rates (C-DR), crude sensitivity rates (C-Se) and intrinsic sensitivity rates (I-Se) were calculated from multivariable regressions to evaluate the systems’ performance (events detected compared to the gold-standard). 472 raw signals (Internet disease reports) related to the 86 events included in the gold-standard data set were retrieved from the six systems. 84 events were detected before their publication in the gold-standard. The type of sources utilised by the systems varied significantly (p < 0.001). I-Se varied significantly from 43% to 71% (p = 0.001) whereas other indicators were similar (C-DR: p = 0.20; C-Se: p = 0.13). I-Se was significantly associated with individual systems, types of system, languages, regions of occurrence, and types of infectious disease. Conversely, no statistical difference of C-DR was observed after adjustment for other variables. Conclusion Although differences could result from a biosurveillance system's conceptual design, findings suggest that the combined expertise amongst systems enhances early detection performance for detection of infectious diseases. While all systems showed similar early detection performance, systems including human moderation were found to have a 53% higher I-Se (p = 0.0001) after adjustment for other variables.
Overall, the use of moderation, sources, languages, regions of occurrence, and types of cases were found to influence system performance. PMID:24599062
Barboza, Philippe; Vaillant, Laetitia; Le Strat, Yann; Hartley, David M; Nelson, Noele P; Mawudeku, Abla; Madoff, Lawrence C; Linge, Jens P; Collier, Nigel; Brownstein, John S; Astagneau, Pascal
2014-01-01
Internet-based biosurveillance systems have been developed to detect health threats using information available on the Internet, but system performance has not been assessed relative to end-user needs and perspectives. Infectious disease events from the French Institute for Public Health Surveillance (InVS) weekly international epidemiological bulletin published in 2010 were used to construct the gold-standard official dataset. Data from six biosurveillance systems were used to detect raw signals (infectious disease events from informal Internet sources): Argus, BioCaster, GPHIN, HealthMap, MedISys and ProMED-mail. Crude detection rates (C-DR), crude sensitivity rates (C-Se) and intrinsic sensitivity rates (I-Se) were calculated from multivariable regressions to evaluate the systems' performance (events detected compared to the gold-standard). 472 raw signals (Internet disease reports) related to the 86 events included in the gold-standard data set were retrieved from the six systems. 84 events were detected before their publication in the gold-standard. The type of sources utilised by the systems varied significantly (p < 0.001). I-Se varied significantly from 43% to 71% (p = 0.001) whereas other indicators were similar (C-DR: p = 0.20; C-Se: p = 0.13). I-Se was significantly associated with individual systems, types of system, languages, regions of occurrence, and types of infectious disease. Conversely, no statistical difference of C-DR was observed after adjustment for other variables. Although differences could result from a biosurveillance system's conceptual design, findings suggest that the combined expertise amongst systems enhances early detection performance for detection of infectious diseases. While all systems showed similar early detection performance, systems including human moderation were found to have a 53% higher I-Se (p = 0.0001) after adjustment for other variables.
Overall, the use of moderation, sources, languages, regions of occurrence, and types of cases were found to influence system performance.
Masquerade Detection Using a Taxonomy-Based Multinomial Modeling Approach in UNIX Systems
2008-08-25
primarily the modeling of statistical features, such as the frequency of events, the duration of events, and the co-occurrence of multiple events. Once such behaviors are identified, features representing that behavior can be extracted while auditing the user's behavior. [Fragment; the report includes a taxonomy of Linux and UNIX commands and reports that the best performance is achieved when the features are extracted just from simple commands, tabulating hit rate and false positive rate for a one-class SVM using simple commands (frequency-based).]
LLNL Location and Detection Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S C; Harris, D B; Anderson, M L
2003-07-16
We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) develop network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins where emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that reference velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic.
Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to solving this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.
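The simplest member of the correlation-detector family is a single-template normalized cross-correlation; a subspace detector generalizes this to a set of basis waveforms. The sketch below is purely illustrative (template, stream, and threshold are invented) and is not the lagged subspace detector described above.

```python
import math

def normalized_correlation(template, stream):
    """Sliding normalized cross-correlation of a waveform template
    against a continuous data stream (values in [-1, 1])."""
    n = len(template)
    tm = sum(template) / n
    t0 = [x - tm for x in template]
    tnorm = math.sqrt(sum(x * x for x in t0))
    out = []
    for i in range(len(stream) - n + 1):
        seg = stream[i:i + n]
        sm = sum(seg) / n
        s0 = [x - sm for x in seg]
        snorm = math.sqrt(sum(x * x for x in s0))
        num = sum(a * b for a, b in zip(t0, s0))
        out.append(num / (tnorm * snorm) if tnorm * snorm > 0 else 0.0)
    return out

def detect(template, stream, threshold=0.9):
    """Indices where the stream matches the template waveform shape."""
    cc = normalized_correlation(template, stream)
    return [i for i, c in enumerate(cc) if c >= threshold]

# A scaled copy of the template buried in an otherwise quiet stream.
template = [0.0, 1.0, 0.0, -1.0, 0.0]
stream = [0.0] * 10 + [0.0, 2.0, 0.0, -2.0, 0.0] + [0.0] * 10
print(detect(template, stream))  # -> [10] (the match is amplitude-invariant)
```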
A Motion-Based Feature for Event-Based Pattern Recognition
Clady, Xavier; Maro, Jean-Matthieu; Barré, Sébastien; Benosman, Ryad B.
2017-01-01
This paper introduces an event-based luminance-free feature from the output of asynchronous event-based neuromorphic retinas. The feature consists in mapping the distribution of the optical flow along the contours of the moving objects in the visual scene into a matrix. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating “spiking” events that encode relative changes in pixels' illumination at high temporal resolutions. The optical flow is computed at each event, and is integrated locally or globally in a speed and direction coordinate frame based grid, using speed-tuned temporal kernels. The latter ensures that the resulting feature equitably represents the distribution of the normal motion along the current moving edges, whatever their respective dynamics. The usefulness and the generality of the proposed feature are demonstrated in pattern recognition applications: local corner detection and global gesture recognition. PMID:28101001
Improving the Detectability of the Catalan Seismic Network for Local Seismic Activity Monitoring
NASA Astrophysics Data System (ADS)
Jara, Jose Antonio; Frontera, Tànit; Batlló, Josep; Goula, Xavier
2016-04-01
The seismic survey of the territory of Catalonia is mainly performed by the regional seismic network operated by the Cartographic and Geologic Institute of Catalonia (ICGC). After successive deployments and upgrades, the current network consists of 16 permanent stations equipped with 3 component broadband seismometers (STS2, STS2.5, CMG3ESP and CMG3T), 24 bit digitizers (Nanometrics Trident) and VSAT telemetry. Data are continuously sent in real time via the Hispasat 1D satellite to the ICGC datacenter in Barcelona. Additionally, data from 10 stations of neighboring areas (Spain, France and Andorra) have been continuously received since 2011 via Internet or VSAT, contributing both to detect and to locate events affecting the region. More than 300 local events with Ml ≥ 0.7 have been detected and located in the region each year. Nevertheless, small-magnitude earthquakes, especially those located in the south and south-west of Catalonia, may still go undetected by the automatic detection system (DAS), based on Earthworm (USGS). Thus, in order to improve the detection and characterization of these missed events, one or two new stations should be installed. Before deciding where to install these new stations, the performance of each existing station is evaluated using the fraction of events detected in the station records, compared to the total number of events in the catalogue that occurred during the station operation time from January 1, 2011 to December 31, 2014. These evaluations allow us to build an Event Detection Probability Map (EDPM), a tool required to simulate the EDPMs resulting from different network topology scenarios depending on where the new stations are sited, and essential for the decision-making process to increase and optimize the event detection probability of the seismic network.
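Simulating how a candidate topology changes the network's detection probability can be reduced, under an independence assumption, to combining per-station detection probabilities. The enumeration below is a generic illustration, not the ICGC's EDPM method, and all the probabilities are invented.

```python
from itertools import product

def network_detection_prob(station_probs, k=1):
    """Probability that at least k stations detect an event, assuming
    independent stations with the given detection probabilities."""
    total = 0.0
    for outcome in product([0, 1], repeat=len(station_probs)):
        if sum(outcome) >= k:
            prob = 1.0
            for p, detected in zip(station_probs, outcome):
                prob *= p if detected else 1.0 - p
            total += prob
    return total

# Effect of adding a third station (60% detection probability) to a
# two-station network (70% and 80%) -- numbers are invented:
print(network_detection_prob([0.7, 0.8], k=1))       # about 0.94
print(network_detection_prob([0.7, 0.8, 0.6], k=1))  # about 0.976
```

For at-least-one detection this reduces to 1 minus the product of the miss probabilities; the enumeration form also covers the k-station association requirement used by real networks.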
Molecular toolbox for the identification of unknown genetically modified organisms.
Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc
2010-03-01
Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. 
Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.
Multimodal Sparse Coding for Event Detection
2015-10-13
classification tasks based on single modality. We present multimodal sparse coding for learning feature representations shared across multiple modalities...The shared representations are applied to multimedia event detection (MED) and evaluated in comparison to unimodal counterparts, as well as other...and video tracks from the same multimedia clip, we can force the two modalities to share a similar sparse representation whose benefit includes robust
NASA Astrophysics Data System (ADS)
Ghiami-Shamami, Fereshteh; Sabziparvar, Ali Akbar; Shinoda, Seirou
2018-06-01
The present study examined annual and seasonal trends in climate-based and location-based indices after detection of artificial change points and application of homogenization. Thirteen temperature and eight precipitation indices were generated at 27 meteorological stations over Iran during 1961-2012. The Mann-Kendall test and Sen's slope estimator were applied for trend detection. Results revealed that almost all indices based on minimum temperature followed warmer conditions. Indicators based on minimum temperature showed less consistency, with more cold and fewer warm events. Climate-based results for all extremes indicated the semi-arid climate had the most warming events. Moreover, based on location-based results, inland areas showed the most signs of warming. Indices based on precipitation exhibited a negative trend in warm seasons, with the most changes in coastal areas and inland, respectively. Results provided evidence of warming and drying since the 1990s. Changes in precipitation indices were much weaker and less spatially coherent. Summer was found to be the most sensitive season, in comparison with winter. For arid and semi-arid regions, increasing latitude was associated with fewer warm events, while increasing longitude was associated with more warming events. Overall, Iran is dominated by a significant increase in warm events, especially in minimum temperature-based (nighttime) indices. This result, in addition to fewer precipitation events, suggests a generally dryer regime for the future, which is more evident in the warm season of semi-arid sites. The results could provide beneficial references for water resources and eco-environmental policymakers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a systemcentric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
Event-Based Stereo Depth Estimation Using Belief Propagation.
Xie, Zhen; Chen, Shengyong; Orchard, Garrick
2017-01-01
Compared to standard frame-based cameras, biologically-inspired event-based sensors capture visual information with low latency and minimal redundancy. These event-based sensors are also far less prone to motion blur than traditional cameras, and still operate effectively in high dynamic range scenes. However, classical frame-based algorithms are not typically suitable for these event-based data, and new processing algorithms are required. This paper focuses on the problem of depth estimation from a stereo pair of event-based sensors. A fully event-based stereo depth estimation algorithm which relies on message passing is proposed. The algorithm not only considers the properties of a single event but also uses a Markov Random Field (MRF) to consider the constraints between nearby events, such as disparity uniqueness and depth continuity. The method is tested on five different scenes and compared to other state-of-the-art event-based stereo matching methods. The results show that the method detects more stereo matches than other methods, with each match having a higher accuracy. The method can operate in an event-driven manner where depths are reported for individual events as they are received, or the network can be queried at any time to generate a sparse depth frame which represents the current state of the network.
Day-time identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2013-10-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather that has significant impact on the environment, property and the population. A new method, the Hail Detection Tool (HDT), is described for identifying hail-bearing storms using multi-spectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the Convective Mask (CM) algorithm devised for detection of deep convection, and the second a Hail Detection algorithm (HD) for the identification of hail-bearing clouds among cumulonimbus systems detected by CM. Both CM and HD are based on logistic regression models trained with multi-spectral MSG data-sets comprising summer convective events in the middle Ebro Valley between 2006 and 2010, detected by the RGB visualization technique (CM) or the C-band weather radar system of the University of León. By means of the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HD is computed by exploiting a proper selection of MSG wavelengths or their combination. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret results of statistical models from a meteorological perspective, using a method based on these "ingredients." Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall Probability of Detection (POD) was 76.9% and the False Alarm Ratio 16.7%.
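The two-stage logistic cascade described above (a convective mask followed by a hail discriminator applied only to convective pixels) can be sketched as follows. The synthetic features and labels stand in for the actual MSG channel combinations, and the 0.5 probability cut-offs are illustrative assumptions, not the trained HDT models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "spectral" features standing in for MSG channel combinations
# (illustrative only; not the actual HDT predictors).
X = rng.normal(size=(500, 3))
is_convective = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # stage-1 label
is_hail = ((X[:, 2] > 0.5) & (is_convective == 1)).astype(int)   # stage-2 label

cm = LogisticRegression().fit(X, is_convective)                  # Convective Mask model
hd = LogisticRegression().fit(X[is_convective == 1], is_hail[is_convective == 1])

def hdt_predict(x, p_cm=0.5, p_hd=0.5):
    """Two-stage cascade: flag hail only inside pixels the CM stage accepts."""
    x = np.atleast_2d(x)
    convective = cm.predict_proba(x)[:, 1] >= p_cm
    hail = np.zeros(len(x), dtype=bool)
    if convective.any():
        hail[convective] = hd.predict_proba(x[convective])[:, 1] >= p_hd
    return convective, hail

conv, hail = hdt_predict(X)
print(conv.sum(), hail.sum())
```

The cascade guarantees by construction that every hail flag lies inside the convective mask, mirroring HDT's design of only searching for hail among detected cumulonimbus systems.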
Eventogram: A Visual Representation of Main Events in Biomedical Signals.
Elgendi, Mohamed
2016-09-22
Biomedical signals carry valuable physiological information, and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in processed signals, while the eventogram was able to visualize dominant events in signals in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
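A minimal sketch of the two event-related moving averages idea: a short (event-scale) average of signal energy is compared against a long (cycle-scale) average, and contiguous regions where the short average dominates become blocks of interest. The rectified-amplitude energy proxy and the window lengths here are illustrative assumptions, not the tuned eventogram parameters:

```python
import numpy as np

def event_blocks(signal, w_event=11, w_cycle=55):
    """Mark 'blocks of interest' where the short (event-scale) moving
    average of signal energy exceeds the long (cycle-scale) one."""
    energy = np.abs(signal)
    ma_event = np.convolve(energy, np.ones(w_event) / w_event, mode="same")
    ma_cycle = np.convolve(energy, np.ones(w_cycle) / w_cycle, mode="same")
    mask = ma_event > ma_cycle
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return []
    # split the true-sample indices into contiguous runs
    splits = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    return [(int(s[0]), int(s[-1])) for s in splits]

# Synthetic quasi-periodic signal: five 10-sample "beats" on a flat baseline.
sig = np.zeros(1000)
for c in range(100, 1000, 200):
    sig[c - 5:c + 5] = 1.0
blocks = event_blocks(sig)
print(blocks)   # one (start, end) block per beat
```

Each returned block carries both the duration and location of a dominant event, which is the information the eventogram visualizes.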
ERIC Educational Resources Information Center
Ball, B. Hunter; Brewer, Gene A.
2018-01-01
The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…
NASA Astrophysics Data System (ADS)
Kornhuber, K.; Petoukhov, V.; Petri, S.; Rahmstorf, S.; Coumou, D.
2017-09-01
Several recent northern hemisphere summer extremes have been linked to persistent high-amplitude wave patterns (e.g. heat waves in Europe 2003, Russia 2010 and the US 2011, floods in Pakistan 2010 and Europe 2013). Recently quasi-resonant amplification (QRA) was proposed as a mechanism that, when certain dynamical conditions are fulfilled, can lead to such high-amplitude wave events. Based on these resonance conditions, a detection scheme to scan reanalysis data for QRA events in boreal summer months was implemented. With this objective detection scheme we analyzed the occurrence and duration of QRA events and the associated atmospheric flow patterns in 1979-2015 reanalysis data. We detect a total of 178 events for waves 6, 7 and 8 and find that during roughly one-third of all high-amplitude events the QRA conditions were met for the respective waves. Our analysis reveals a significant shift for quasi-stationary waves 6 and 7 towards high amplitudes during QRA events, lagging first QRA detection by typically one week. The results provide further evidence for the validity of the QRA hypothesis and its important role in generating high-amplitude waves in boreal summer.
NASA Astrophysics Data System (ADS)
Ryan, E. M.; Brucker, L.; Forman, B. A.
2015-12-01
During the winter months, the occurrence of rain-on-snow (ROS) events can impact snow stratigraphy via generation of large-scale ice crusts, e.g., on or within the snowpack. The formation of such layers significantly alters the electromagnetic response of the snowpack, which can be witnessed using space-based microwave radiometers. In addition, ROS layers can hinder the ability of wildlife to burrow in the snow for vegetation, which limits their foraging capability. A prime example occurred on 23 October 2003 on Banks Island, Canada, where an ROS event is believed to have caused the deaths of over 20,000 musk oxen. Through the use of passive microwave remote sensing, ROS events can be detected by utilizing observed brightness temperatures (Tb) from AMSR-E. Tb observed at different microwave frequencies and polarizations depends on snow properties. A wet snowpack formed from an ROS event yields a larger Tb than a typical dry snowpack would. This phenomenon makes observed Tb useful when detecting ROS events. With the use of data retrieved from AMSR-E, in conjunction with observations from ground-based weather station networks, a database of estimated ROS events over the past twelve years was generated. Using this database, changes in measured Tb following the ROS events were also observed. This study adds to the growing knowledge of ROS events and has the potential to help inform passive microwave snow water equivalent (SWE) retrievals or snow cover properties in polar regions.
On-line early fault detection and diagnosis of municipal solid waste incinerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao Jinsong; Huang Jianchao; Sun Wei
A fault detection and diagnosis framework is proposed in this paper for early fault detection and diagnosis (FDD) of municipal solid waste incinerators (MSWIs) in order to improve the safety and continuity of production. In this framework, principal component analysis (PCA), one of the multivariate statistical technologies, is used for detecting abnormal events, while rule-based reasoning performs the fault diagnosis and consequence prediction, and also generates recommendations for fault mitigation once an abnormal event is detected. A software package, SWIFT, is developed based on the proposed framework, and has been applied in an actual industrial MSWI. The application shows that automated real-time abnormal situation management (ASM) of the MSWI can be achieved by using SWIFT, resulting in an industrially acceptable low rate of wrong diagnosis, which has improved the process continuity and environmental performance of the MSWI.
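The PCA monitoring step can be sketched with the usual Hotelling T² (within the retained subspace) and squared-prediction-error (SPE, residual) statistics. The four synthetic "process variables" and the correlation-breaking fault below are illustrative assumptions, not MSWI data:

```python
import numpy as np

def fit_pca_monitor(X_normal, n_comp=2):
    """Fit a PCA monitoring model on normal operating data."""
    mu, sigma = X_normal.mean(0), X_normal.std(0)
    Z = (X_normal - mu) / sigma
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                              # retained loadings
    lam = (S[:n_comp] ** 2) / (len(Z) - 1)         # retained variances
    return mu, sigma, P, lam

def t2_spe(x, mu, sigma, P, lam):
    """Hotelling T^2 (subspace) and SPE (residual) statistics of a sample."""
    z = (x - mu) / sigma
    t = P.T @ z
    resid = z - P @ t
    return float(np.sum(t ** 2 / lam)), float(resid @ resid)

rng = np.random.default_rng(1)
base = rng.normal(size=(300, 2))
# four "process variables": two highly correlated, two independent
X = np.column_stack([base[:, 0], base[:, 0] + 0.1 * rng.normal(size=300),
                     base[:, 1], rng.normal(size=300)])
model = fit_pca_monitor(X)

t2_n, spe_n = t2_spe(X[0], *model)
# a fault that breaks the learned correlation between the first two variables
t2_f, spe_f = t2_spe(X[0] + np.array([4.0, -4.0, 0.0, 0.0]), *model)
print(spe_n, spe_f)
```

In practice, control limits for T² and SPE are set from the normal-data distribution, and samples exceeding either limit are flagged as abnormal events for the rule-based diagnosis stage.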
Homaeinezhad, M R; Erfanianmoshiri-Nejad, M; Naseri, H
2014-01-01
The goal of this study is to introduce a simple, standard and safe procedure to detect and to delineate P and T waves of the electrocardiogram (ECG) signal in real conditions. The proposed method consists of four major steps: (1) a secure QRS detection and delineation algorithm, (2) a pattern recognition algorithm designed for distinguishing various ECG clusters which take place between consecutive R-waves, (3) extracting a template of the dominant events of each cluster waveform and (4) application of correlation analysis in order to automatically delineate the P- and T-waves in noisy conditions. The performance characteristics of the proposed P and T detection-delineation algorithm are evaluated on various ECG signals whose qualities are altered from the best to the worst cases based on random-walk noise theory. Also, the method is applied to the MIT-BIH Arrhythmia and the QT databases for comparing some parts of its performance characteristics with a number of P and T detection-delineation algorithms. The conducted evaluations indicate that in a signal with a low quality value of about 0.6, the proposed method detects the P and T events with sensitivity Se=85% and positive predictive value P+=89%, respectively. In addition, at the same quality, the average delineation errors associated with those ECG events are 45 and 63 ms, respectively. Stable delineation error, high detection accuracy and high noise tolerance were the most important aspects considered during development of the proposed method. © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Novak, A; Zeng, J
Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness, and more hard data are needed. The purpose of this study was to quantify the effectiveness of physics review with the goal of improving it. Methods: This study analyzed 315 “potentially serious” near-miss incidents within an institutional incident learning system collected over a two-year period. 139 of these originated prior to physics review and were found at the review or after. Incidents were classified as events that: 1) were detected by physics review, 2) could have been detected (but were not), and 3) could not have been detected. Category 1 and 2 events were classified by which specific check (within physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review, although 42/73 (58%) were not actually detected. 45/73 (62%) errors originated in treatment planning, making physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying isocenter (7/73). Software-based plan checking systems were evaluated and found to have a potential effectiveness of 40%. Given current data structures, software implementations of some tests, such as the isocenter verification check, would be challenging. Conclusion: Physics plan review is a key safety measure and can detect the majority of reported events.
However, a majority of the events that could have been detected were NOT detected in this study, indicating the need to improve the performance of physics review.
Event Classification and Identification Based on the Characteristic Ellipsoid of Phasor Measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Diao, Ruisheng; Makarov, Yuri V.
2011-09-23
In this paper, a method to classify and identify power system events based on the characteristic ellipsoid of phasor measurement is presented. The decision tree technique is used to perform the event classification and identification. Event types, event locations and clearance times are identified by decision trees based on the indices of the characteristic ellipsoid. A sufficiently large number of transient events were simulated on the New England 10-machine 39-bus system based on different system configurations. Transient simulations taking into account different event types, clearance times and various locations were conducted to simulate phasor measurements. Bus voltage magnitudes and recorded reactive and active power flows are used to build the characteristic ellipsoid. The volume, eccentricity, center and projection of the longest axis in the parameter space coordinates of the characteristic ellipsoids are used to classify and identify events. Results demonstrate that the characteristic ellipsoid and the decision tree are capable of detecting the event type, location, and clearance time with very high accuracy.
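The pipeline above can be sketched as: fit an ellipsoid (covariance) to a window of measurements, derive scalar indices from its eigenstructure, and classify those indices with a decision tree. The simulated windows and the particular eccentricity definition used here are illustrative assumptions, not the New England 39-bus simulations:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ellipsoid_indices(window):
    """Indices of the characteristic ellipsoid of a measurement window
    (rows = time samples, cols = measured channels): volume, eccentricity,
    and center. Eccentricity uses one common definition,
    sqrt(1 - lambda_min/lambda_max)."""
    cov = np.cov(window, rowvar=False)
    eig = np.clip(np.sort(np.linalg.eigvalsh(cov))[::-1], 1e-12, None)
    volume = float(np.sqrt(np.prod(eig)))      # proportional to ellipsoid volume
    ecc = float(np.sqrt(1.0 - eig[-1] / eig[0]))
    return [volume, ecc, *np.mean(window, axis=0)]

# Crude stand-in for transient simulations of two event types:
# the second class has larger measurement excursions.
rng = np.random.default_rng(2)
X, y = [], []
for label, scale in ((0, 1.0), (1, 3.0)):
    for _ in range(40):
        window = rng.normal(scale=scale, size=(50, 3))
        X.append(ellipsoid_indices(window))
        y.append(label)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.score(X, y))
```

Here the ellipsoid volume alone separates the two synthetic classes; in the paper the full index set (volume, eccentricity, center, longest-axis projection) feeds trees that also distinguish event location and clearance time.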
Sampled-data consensus in switching networks of integrators based on edge events
NASA Astrophysics Data System (ADS)
Xiao, Feng; Meng, Xiangyu; Chen, Tongwen
2015-02-01
This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise
NASA Astrophysics Data System (ADS)
Ziegler, Abra E.
The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. 
The performance of the AST algorithm was quantitatively evaluated during a variety of noise conditions, and seismic detections identified using AST were compared to ancillary injection data. During a period of CO2 injection in a well near the monitoring array, 82% of seismic events were accurately detected, 13% of events were missed, and 5% of detections were determined to be false. Additionally, seismic risk was evaluated from the stress field and faulting regime at FWU to determine the likelihood of pressure perturbations triggering slip on previously mapped faults. Faults oriented NW-SE were identified as requiring the smallest pore pressure changes to trigger slip; faults oriented N-S may also be reactivated, although this is less likely.
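The STA/LTA baseline that AST is compared against can be sketched as follows, assuming trailing windows over the squared trace and an illustrative trigger threshold of 5 (window lengths and threshold are the parameters AST would tune adaptively):

```python
import numpy as np

def sta_lta(trace, n_sta=20, n_lta=200):
    """Short-term-average / long-term-average ratio on the squared trace.
    Both windows trail the current sample; the ratio is returned only for
    samples where both windows are full (the first n_lta-1 are dropped)."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    return sta[-len(lta):] / np.maximum(lta, 1e-12)

rng = np.random.default_rng(3)
trace = rng.normal(scale=0.1, size=2000)
trace[1200:1260] += rng.normal(scale=1.0, size=60)   # injected "event"
ratio = sta_lta(trace)
triggers = np.flatnonzero(ratio > 5.0) + 199         # shift back to trace indices
print(triggers[0] if triggers.size else "no trigger")
```

As the abstract notes, a fixed threshold like this fails when noise levels change over time, which is the motivation for neighborhood-consistent, self-tuned detection parameters.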
NASA Astrophysics Data System (ADS)
Missif, Lial Raja; Kadhum, Mohammad M.
2017-09-01
Wireless Sensor Network (WSN) has been widely used for monitoring, where sensors are deployed to operate independently to sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range, which does not reflect the sensor reliability, event characteristics, and environment conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically in this paper to examine their adaptability and applicability to real environment applications. The numerical results of the experimental evaluation showed that the probabilistic sensing model provides accurate observation and detectability of an event, and that it can be utilized for different environment scenarios.
Detecting NEO Impacts using the International Monitoring System
NASA Astrophysics Data System (ADS)
Brown, Peter G.; Dube, Kimberlee; Silber, Elizabeth
2014-11-01
As part of the verification regime for the Comprehensive Nuclear Test Ban Treaty, an International Monitoring System (IMS) consisting of seismic, hydroacoustic, infrasound and radionuclide technologies has been globally deployed beginning in the late 1990s. The infrasound network sub-component of the IMS consists of 47 active stations as of mid-2014. These microbarograph arrays detect coherent infrasonic signals from a range of sources including volcanoes, man-made explosions and bolides. Bolide detections from IMS stations have been reported since ~2000, but with the maturation of the network over the last several years the rate of detections has increased substantially. Presently the IMS performs semi-automated near real-time global event identification on timescales of 6-12 hours as well as analyst-verified event identification having time lags of several weeks. Here we report on infrasound events identified by the IMS between 2010-2014 which are likely bolide impacts. Identification in this context refers to an event being included in one of the event bulletins issued by the IMS. In this untargeted study we find that the IMS globally identifies approximately 16 events per year which are likely bolide impacts. Using data released since the beginning of 2014 of US Government sensor detections of fireballs (as given at http://neo.jpl.nasa.gov/fireballs/ ), we find in a complementary targeted survey that the current IMS system is able to identify ~25% of fireballs with E > 0.1 kT energy. Using all 16 US Government sensor fireballs listed as of July 31, 2014, we are able to detect infrasound from 75% of these events on at least one IMS station. The high ratio of detection/identification is a product of the stricter criteria adopted by the IMS for inclusion in an event bulletin as compared to simple station detection. We discuss energy comparisons between infrasound-estimated energies based on amplitudes and periods and estimates provided by US Government sensors.
Specific impact events of interest will be discussed as well as the utility of the global IMS infrasound system for location and timing of future NEAs detected prior to impact.
Automatic processing of induced events in the geothermal reservoirs Landau and Insheim, Germany
NASA Astrophysics Data System (ADS)
Olbert, Kai; Küperkoch, Ludger; Meier, Thomas
2016-04-01
Induced events can pose a risk to local infrastructure and need to be understood and evaluated. They also represent a chance to learn more about reservoir behavior and characteristics. Prior to the analysis, the waveform data must be processed consistently and accurately to avoid erroneous interpretations. In the framework of the MAGS2 project, an automatic off-line event detection and a phase onset time determination algorithm are applied to induced seismic events in the geothermal systems in Landau and Insheim, Germany. The off-line detection algorithm is based on a cross-correlation of continuous data from the local seismic network with master events. It distinguishes events between different reservoirs and within the individual reservoirs, and it provides a location and magnitude estimation. Data from 2007 to 2014 are processed and compared with other detections using the SeisComp3 cross-correlation detector and a STA/LTA detector. The detected events are analyzed for spatial or temporal clustering, and the number of events is compared to existing detection lists. The automatic phase picking algorithm combines an AR-AIC approach with a cost function to find precise P1- and S1-phase onset times which can be used for localization and tomography studies. 800 induced events are processed, yielding 5000 P1- and 6000 S1-picks. The phase onset times show high precision, with mean residuals to manual phase picks of 0 s (P1) and 0.04 s (S1) and standard deviations below ±0.05 s. The resulting automatic picks are used to relocate a selected number of events to evaluate influences on the location precision.
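A minimal sketch of a master-event cross-correlation detector, assuming a normalized (Pearson) correlation slid over continuous data and a toy template; the real detector additionally distinguishes reservoirs and estimates location and magnitude:

```python
import numpy as np

def xcorr_detect(trace, template, threshold=0.7):
    """Slide a master-event template over continuous data; return offsets
    where the normalized cross-correlation exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(cc.size):
        seg = trace[i:i + n]
        s = seg.std()
        cc[i] = 0.0 if s == 0 else float(np.dot(t, (seg - seg.mean()) / s))
    return np.flatnonzero(cc > threshold), cc

rng = np.random.default_rng(4)
template = np.sin(2 * np.pi * np.arange(50) / 25) * np.hanning(50)  # toy master event
trace = rng.normal(scale=0.1, size=2000)
trace[500:550] += template                                          # hidden repeat
hits, cc = xcorr_detect(trace, template)
print(hits, int(np.argmax(cc)))
```

Because the correlation is amplitude-normalized, the detector finds repeats of the master event even when they are much smaller than the original, which is what makes template matching effective for low-magnitude induced seismicity.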
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
AIMS (anesthesia information management systems) are optimized to find and display data and curves for one specific intervention, but not for retrospective analysis of a large volume of interventions. Such systems present two main limitations: (1) a transactional database architecture and (2) incomplete documentation. To solve the architectural problem, data warehouses were developed to provide an architecture suitable for analysis. However, the completeness of documentation remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitute, defined as the nearest documented event. As an example, we focused on the automatic detection of the start and the end of the anesthesia procedure when these events were not documented by the clinicians. We applied our method to a set of records in order to evaluate (1) the event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. In addition, we increased data completeness by 21.1% (from 80.3 to 97.2% of the total database) for the start and end of anesthesia events. This method appears effective for replacing missing "start and end of anesthesia" events. It could also be used to replace other missing time events in this particular data warehouse as well as in other kinds of data warehouses.
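The substitution principle can be sketched as follows; the event names and the preference-ordered surrogate lists are hypothetical, for illustration only (the paper derives its substitution rules from the data):

```python
# Hypothetical event names and substitution rules, for illustration only.
SUBSTITUTIONS = {
    "start_anesthesia": ["induction_drug_given", "patient_in_room"],
    "end_anesthesia":   ["extubation", "patient_out_of_room"],
}

def fill_missing(record):
    """If a target event is undocumented, substitute the timestamp of the
    nearest documented surrogate event, in preference order."""
    filled = dict(record)
    for target, surrogates in SUBSTITUTIONS.items():
        if target not in filled:
            for surrogate in surrogates:
                if surrogate in filled:
                    filled[target] = filled[surrogate]
                    break
    return filled

# A record missing both the start and end of anesthesia; times are hours.
record = {"patient_in_room": 8.05, "induction_drug_given": 8.20, "extubation": 9.45}
filled = fill_missing(record)
print(filled)
```

The quality of such a rule is then measured, as in the paper, by the time difference between the substituted event and the true (manually documented) event on records where both exist.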
A new moonquake catalog from Apollo 17 geophone data
NASA Astrophysics Data System (ADS)
Dimech, Jesse-Lee; Knapmeyer-Endrun, Brigitte; Weber, Renee
2017-04-01
New lunar seismic events have been detected in geophone data from the Apollo 17 Lunar Seismic Profile Experiment (LSPE). This dataset is already known to contain an abundance of thermal seismic events, and potentially some meteorite impacts, but prior to this study only 26 days of LSPE "listening mode" data had been analysed. In this new analysis, additional listening mode data collected between August 1976 and April 1977 are incorporated. To the authors' knowledge these 8 months of data have not yet been used to detect seismic moonquake events. The geophones in question are situated adjacent to the Apollo 17 site in the Taurus-Littrow valley, about 5.5 km east of the Lee-Lincoln scarp, and between the North and South Massifs. Any of these features are potential seismic sources. We have used an event-detection and classification technique based on Hidden Markov Models (HMMs) to automatically detect and categorize seismic signals, in order to objectively generate a seismic event catalog. Currently, 2.5 months of the 8-month listening mode dataset have been processed, totaling 14,338 detections. Of these, 672 detections (classification "n1") have a sharp onset with a steep rise time, suggesting they occur close to the recording geophone. These events almost all occur in association with lunar sunrise over the span of 1-2 days. One possibility is that these events originate from the nearby Apollo 17 lunar lander due to rapid heating at sunrise. A further 10,004 detections (classification "d1") show strong diurnal periodicity, with detections increasing during the lunar day and reaching a peak at sunset, and therefore probably represent thermal events from the lunar regolith immediately surrounding the Apollo 17 landing site. The final 3662 detections (classification "d2") have emergent onsets and relatively long durations. These detections have peaks associated with lunar sunrise and sunset, but also sometimes have peaks at seemingly random times.
Their source mechanism has not yet been investigated. It's possible that many of these are misclassified d1/n1 events, and further QC work needs to be undertaken. But it is also possible that many of these represent more distant thermal moonquakes e.g. from the North and South massif, or even the ridge adjacent to the Lee-Lincoln scarp. The unknown event spikes will be the subject of closer inspection once the HMM technique has been refined.
Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina
Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo
2013-01-01
Background To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities.
Conclusions/Significance The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586
Digital disease detection: A systematic review of event-based internet biosurveillance systems.
O'Shea, Jesse
2017-05-01
Internet access and usage has changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates. They circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet innovations and the functionality of these systems can change rapidly. This review aims to update the current state of knowledge on event-based Internet biosurveillance systems by identifying all such systems and their current functionality, to aid decision makers in deciding whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, while also including grey literature and other publication types. 50 event-based Internet systems were identified, with 15 attributes extracted for each system, described in 99 articles. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, cataloguing all event-based Internet biosurveillance systems. By doing so, future researchers will be able to use this review as a library for referencing systems, with hopes of learning, building, and expanding Internet-based surveillance systems.
Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an additional, supplemental data source that yields a more comprehensive estimate of disease burden.
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high-throughput analysis of microliter-volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-responsive sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in in-field environmental monitoring.
NASA Technical Reports Server (NTRS)
Thompson, D. J.; Bertsch, D. L.; ONeal, R. H., Jr.
2005-01-01
During its nine-year lifetime, the Energetic Gamma Ray Experiment Telescope (EGRET) on the Compton Gamma Ray Observatory (CGRO) detected 1506 cosmic photons with measured energy E>10 GeV. Of this number, 187 are found within 1 deg of sources that are listed in the Third EGRET Catalog and were included in determining the detection likelihood, flux, and spectra of those sources. In particular, five detected EGRET pulsars are found to have events above 10 GeV, and together they account for 37 events. A pulsar not included in the Third EGRET Catalog has 2 events, both with the same phase and in one peak of the lower-energy gamma-ray light curve. Most of the remaining 1319 events appear to be diffuse Galactic and extragalactic radiation, based on the similarity of their spatial and energy distributions with the diffuse model and the E>100 MeV emission. No significant time clustering that would suggest a burst was detected.
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao
2006-12-01
We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
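The affinity-matrix step at the heart of the outlier detection described above can be sketched roughly as follows; the Gaussian affinity, the `sigma` bandwidth, and the use of generic feature vectors in place of the paper's statistical subsequence models are simplifying assumptions:

```python
import numpy as np

def outlier_scores(features, sigma=1.0):
    """Score subsequences as outliers from the dominant eigenvector of
    their pairwise affinity matrix (a simplified sketch; `features` is
    one feature vector per subsequence of the time series)."""
    # Pairwise squared distances between subsequence feature vectors.
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    # Gaussian affinity: similar subsequences get weights near 1.
    A = np.exp(-d2 / (2 * sigma ** 2))
    # eigh returns eigenvalues in ascending order, so the last column
    # is the dominant eigenvector; background members dominate it, so
    # small components indicate outliers.
    vals, vecs = np.linalg.eigh(A)
    v = np.abs(vecs[:, -1])
    return 1.0 - v / v.max()   # higher score = more outlier-like
```

A sparse "interesting" event far from the background cluster in feature space then receives a score close to 1, while usual events score near 0.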
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. 
These examples trace a broad range of algorithmic possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology.
Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram
2015-08-01
In this Letter, the authors present a unified framework for fall event detection and classification using cumulants extracted from acceleration (ACC) signals acquired with a single waist-mounted triaxial accelerometer. The main objective is to find representative cumulants and classifiers that effectively detect and classify different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm performs fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants, the signal magnitude vector and the signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features with an SVM classifier achieve detection and classification rates above 95%, as well as the lowest false alarm rate of 1.03%.
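The cumulant features named above can be computed from the central moments of a windowed signal; a minimal numpy sketch (the windowing and the use of the signal magnitude vector as the input series are assumptions, and the SVM stage is omitted):

```python
import numpy as np

def cumulant_features(acc, order=5):
    """Cumulant features for one window of triaxial acceleration `acc`
    (shape N x 3), via the standard moment-to-cumulant identities:
    k2 = m2, k3 = m3, k4 = m4 - 3 m2^2, k5 = m5 - 10 m3 m2."""
    smv = np.sqrt((acc ** 2).sum(axis=1))      # signal magnitude vector
    x = smv - smv.mean()                       # work with central moments
    m = {k: np.mean(x ** k) for k in range(2, 6)}
    k2 = m[2]                                  # variance
    k3 = m[3]
    k4 = m[4] - 3 * m[2] ** 2
    k5 = m[5] - 10 * m[3] * m[2]
    return np.array([k2, k3, k4, k5][: order - 1])
```

These per-window feature vectors would then be fed to the SVM at each level of the hierarchy.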
Investigating Montara platform oil spill accident by implementing RST-OIL approach.
NASA Astrophysics Data System (ADS)
Satriano, Valeria; Ciancia, Emanuele; Coviello, Irina; Di Polito, Carmine; Lacava, Teodosio; Pergola, Nicola; Tramutoli, Valerio
2016-04-01
Oil spills are among the most harmful events for marine ecosystems, and their timely detection is crucial for mitigation and management. The potential of satellite data for their detection and monitoring has been widely investigated. Traditional satellite techniques usually identify oil spill presence by applying a fixed threshold scheme only after the occurrence of an event, which makes them ill-suited for prompt identification. The Robust Satellite Technique (RST) approach, in its oil spill detection version (RST-OIL), is based on the comparison of the latest satellite acquisition with a previously characterized historical reference, and therefore allows the automatic and near-real-time detection of events. The technique has already been applied successfully to data from different sources (AVHRR, the Advanced Very High Resolution Radiometer, and MODIS, the Moderate Resolution Imaging Spectroradiometer), showing excellent performance in detecting oil spills in both day- and night-time conditions, with a high level of sensitivity (detection even of low-intensity events) and reliability (no false alarms on scene). In this paper, RST-OIL has been implemented on MODIS thermal infrared data for the analysis of the Montara platform (Timor Sea, Australia) oil spill disaster that occurred in August 2009. Preliminary achievements are presented and discussed.
A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging
Boulanger, Jérôme; Gidon, Alexandre; Kervrann, Charles; Salamero, Jean
2010-01-01
Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy have become key issues in deciphering the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches, designed to distinguish suddenly appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performance of two statistical control procedures and compare the proposed approach to a frame-difference approach using the same controls on a benchmark of synthetic image sequences. We then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling trans-membrane protein, Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentrations arising in image sequences from a database provided by two different microscopy modalities: wide-field (WF) video microscopy using maximum intensity projection along the axial direction, and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected on the pilot biological model. PMID:20976222
MEMS-based sensing and algorithm development for fall detection and gait analysis
NASA Astrophysics Data System (ADS)
Gupta, Piyush; Ramirez, Gabriel; Lie, Donald Y. C.; Dallas, Tim; Banister, Ron E.; Dentino, Andrew
2010-02-01
Falls by the elderly are highly detrimental to health, frequently resulting in injury, high medical costs, and even death. Using a MEMS-based sensing system, algorithms are being developed for detecting falls and monitoring the gait of elderly and disabled persons. In this study, wireless sensors using Zigbee protocols were incorporated into planar shoe insoles and a waist-mounted device. The insole contains four sensors to measure the pressure applied by the foot. A MEMS-based triaxial accelerometer is embedded in the insert, and a second one is utilized by the waist-mounted device. The primary fall detection algorithm is derived from the waist accelerometer. The differential acceleration is calculated from samples received in 1.5 s time intervals and quantified as an energy index, from which different gaits can be distinguished and fall events identified. Once a pre-determined index threshold is exceeded, the algorithm classifies an event as a fall or a stumble. The secondary algorithm is derived from frequency analysis techniques: wavelet transforms conducted on the waist accelerometer data. The insole pressure data are then used to highlight discrepancies in the transforms, providing more accurate data for classifying gait and/or detecting falls. The range of the transform amplitude in the fourth iteration of a Daubechies-6 transform was found sufficient to detect and classify fall events.
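The primary threshold rule described above can be sketched minimally as follows; the sampling rate `fs` and the threshold value are illustrative placeholders, not the study's calibrated parameters:

```python
import numpy as np

def classify_window(acc_mag, fs=50, thresh=3.0):
    """Flag a fall-like event when the differential acceleration over
    the most recent 1.5 s interval exceeds a threshold. `acc_mag` is
    the waist accelerometer magnitude in g; `fs` (Hz) and `thresh` (g)
    are assumed values for this sketch."""
    n = int(1.5 * fs)                        # samples per 1.5 s interval
    window = acc_mag[-n:]                    # most recent interval
    diff_acc = window.max() - window.min()   # differential acceleration
    return "fall_or_stumble" if diff_acc > thresh else "normal_gait"
```

A quiet signal near 1 g stays below the threshold, while a sudden spike crosses it and is flagged for the fall/stumble classification step.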
NASA Astrophysics Data System (ADS)
Shrestha, Sumeet; Kamehama, Hiroki; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Takeda, Ayaki; Tsuru, Takeshi Go; Arai, Yasuo
2015-08-01
This paper presents a low-noise, wide-dynamic-range pixel design for a high-energy particle detector in astronomical applications. A silicon-on-insulator (SOI) based detector is used for the detection of a wide energy range of high-energy particles (mainly X-rays). The sensor has a thin layer of SOI CMOS readout circuitry and a thick layer of high-resistivity detector vertically stacked in a single chip. Pixel circuits are divided into two parts: a signal sensing circuit and an event detection circuit. The event detection circuit, consisting of a comparator and logic circuits that detect the incidence of a high-energy particle, categorizes the incident photon into two energy groups using an appropriate energy threshold and generates a two-bit code for the event and its energy level. The code for the energy level is then used to select the gain of the in-pixel amplifier for the detected signal, providing high-dynamic-range signal measurement. The two-bit code for the event and energy level is scanned in the event scanning block, and only the signals from the hit pixels are read out. The variable-gain in-pixel amplifier uses a continuous integrator with integration-time control to realize the variable gain. The proposed design allows small-signal detection and a wide dynamic range thanks to the adaptive gain technique, together with a correlated double sampling (CDS) capability for kTC noise canceling of the charge detector.
Bridging the semantic gap in sports
NASA Astrophysics Data System (ADS)
Li, Baoxin; Errico, James; Pan, Hao; Sezan, M. Ibrahim
2003-01-01
One of the major challenges facing current media management systems and the related applications is the so-called "semantic gap" between the rich meaning that a user desires and the shallowness of the content descriptions that are automatically extracted from the media. In this paper, we address the problem of bridging this gap in the sports domain. We propose a general framework for indexing and summarizing sports broadcast programs. The framework is based on a high-level model of sports broadcast video using the concept of an event, defined according to domain-specific knowledge for different types of sports. Within this general framework, we develop automatic event detection algorithms that are based on automatic analysis of the visual and aural signals in the media. We have successfully applied the event detection algorithms to different types of sports including American football, baseball, Japanese sumo wrestling, and soccer. Event modeling and detection contribute to the reduction of the semantic gap by providing rudimentary semantic information obtained through media analysis. We further propose a novel approach, which makes use of independently generated rich textual metadata, to fill the gap completely through synchronization of the information-laden textual data with the basic event segments. An MPEG-7 compliant prototype browsing system has been implemented to demonstrate semantic retrieval and summarization of sports video.
Scott, J; Botsis, T; Ball, R
2014-01-01
Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
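The PRR thresholds compared in the study are computed from a standard 2×2 disproportionality table; a small sketch (the table layout is the conventional one for SRS data mining, not taken from the paper):

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a product-event 2x2 table:
    a = reports with product & event, b = product without event,
    c = other products with event, d = other products without event."""
    return (a / (a + b)) / (c / (c + d))

def is_signal(a, b, c, d, threshold=2.0):
    """Flag a potential safety signal when PRR exceeds the chosen
    threshold (2.0 or 3.0, the two values compared in the study)."""
    return prr(a, b, c, d) > threshold
```

Raising the threshold from 2.0 to 3.0 trades sensitivity for specificity, which is the operating-characteristic comparison the simulations support.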
Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus
2017-04-01
Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings of two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations. The evaluation used both event duration parameters and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of what algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple events, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transactions on Biomedical Engineering, 60(9):2484-2493, 2013) outperforms all algorithms in data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.
Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.
2016-06-07
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or another classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). It supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of the stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for a significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue.
Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
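The core detection step, cross-correlation of the continuous record against a parent waveform, can be sketched for a single channel as follows; the MFA itself stacks this over multichannel, multicomponent data and adds the location and magnitude steps, so this is only an illustrative reduction:

```python
import numpy as np

def match_template(trace, template):
    """Slide a parent (template) waveform along a continuous trace and
    return the lag with the highest normalised cross-correlation;
    peaks above a user-chosen threshold mark candidate child events."""
    # Pre-normalise the template so the inner product below equals the
    # Pearson correlation of template and segment.
    t = (template - template.mean()) / (template.std() * len(template))
    best_lag, best_cc = 0, -1.0
    for lag in range(len(trace) - len(template) + 1):
        seg = trace[lag:lag + len(template)]
        s = seg.std()
        cc = 0.0 if s == 0 else float(np.dot(t, (seg - seg.mean()) / s))
        if cc > best_cc:
            best_lag, best_cc = lag, cc
    return best_lag, best_cc
```

A child event buried in the trace is recovered at the correct lag with a correlation near 1, even though its absolute amplitude may differ from the parent's.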
Cheng, Nan; Shang, Ying; Xu, Yuancong; Zhang, Li; Luo, Yunbo; Huang, Kunlun; Xu, Wentao
2017-05-15
Stacked genetically modified organisms (GMOs) are becoming popular for their enhanced production efficiency and improved functional properties, and on-site detection of stacked GMOs is an urgent challenge. In this study, we developed a cascade system combining event-specific tag-labeled multiplex LAMP with a DNAzyme-lateral flow biosensor for reliable detection of stacked events (DP305423 × GTS 40-3-2). Three primer sets, both event-specific and soybean species-specific, were newly designed for the tag-labeled multiplex LAMP system. A trident-like lateral flow biosensor displayed amplified products simultaneously without cross contamination, and DNAzyme enhancement effectively improved the sensitivity. After optimization, the limit of detection was approximately 0.1% (w/w) for stacked GM soybean, which is sensitive enough to detect genetically modified content at the threshold values established by several countries for regulatory compliance. The entire detection process could be shortened to 120 min without any large-scale instrumentation. This method may be useful for the in-field detection of DP305423 × GTS 40-3-2 soybean on a single-kernel basis and for on-site screening of stacked GM soybean lines and individual parent GM soybean lines in highly processed foods.
NASA Astrophysics Data System (ADS)
Hamolli, L.; Hafizi, M.; Nucita, A. A.
2013-08-01
Free-floating planets (FFPs) have recently drawn special interest from the scientific community. Gravitational microlensing is so far the only method for investigating FFPs, including their spatial distribution function and mass function. In this paper, we examine the possibility that the future Euclid space-based observatory may allow the discovery of a substantial number of microlensing events caused by FFPs. Based on the latest results on the free-floating planet (FFP) mass function in the mass range [10^-5, 10^-2] M⊙, we calculate the optical depth towards the Galactic bulge as well as the expected microlensing rate, and find that Euclid may be able to detect hundreds to thousands of these events per month. Making use of a synthetic population, we also investigate the possibility of detecting the parallax effect in simulated microlensing events due to FFPs and find a significant efficiency for parallax detection, around 30%.
The Dependence of the Cerean Exosphere on Solar Energetic Particle Events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Villarreal, M. N.; Russell, C. T.; Luhmann, J. G.
2017-03-20
Observations from Earth-based ground and orbiting telescopes indicate that Ceres's exosphere has a time-varying water component. Evidence of a transient atmosphere was also detected by Dawn upon its arrival, inferred from the response of the Gamma Ray and Neutron Detector. That atmosphere appeared shortly after the passage of a large enhancement in the local flux of high-energy solar protons. Solar proton events have highly variable fluxes over a range of proton energies, from tens of keV to over 100 MeV, and are capable of sputtering water ice at or near the surface. Herein, we examine the fluxes of solar energetic protons measured during Earth-based attempts to detect water vapor and OH in Ceres's atmosphere. We find that the presence of the cerean exosphere is correlated with the inferred presence of solar energetic protons at Ceres, consistent with the event detected by Dawn.
Sengupta, Partha Pratim; Gloria, Jared N; Amato, Dahlia N; Amato, Douglas V; Patton, Derek L; Murali, Beddhu; Flynt, Alex S
2015-10-12
Detection of specific RNA or DNA molecules by hybridization to "probe" nucleic acids via complementary base-pairing is a powerful method for the analysis of biological systems. Here we describe a strategy for transducing hybridization events through modulating intrinsic properties of the electroconductive polymer polyaniline (PANI). When DNA-based probes electrostatically interact with PANI, its fluorescence is increased, a phenomenon that can be enhanced by UV irradiation. Hybridization of target nucleic acids results in dissociation of the probes, causing PANI fluorescence to return to basal levels. By monitoring the restoration of basal PANI fluorescence, as little as 10^-11 M (10 pM) of target oligonucleotides could be detected within 15 min of hybridization. Detection of complementary oligos was specific, with the introduction of a single mismatch failing to form a target-probe duplex that would dissociate from PANI. Furthermore, this approach is robust and capable of detecting specific RNAs in extracts from animals. This sensor system improves on previously reported strategies by transducing highly specific probe dissociation events through the intrinsic properties of a conducting polymer, without the need for additional labels.
Participation of the NDC Austria at the NDC Preparedness Exercise 2012
NASA Astrophysics Data System (ADS)
Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene
2013-04-01
NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to train the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the iodine isotopes I-131 and I-133 (both particulates), and the radioxenon isotopes Xe-133, Xe-133M, Xe-131M and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), the concentrations of all six isotopes that would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentration of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses:
• Atmospheric backtracking and data fusion to identify seismic candidate events,
• Seismic analysis of candidate events within the possible source region,
• Atmospheric transport modelling (forward mode) from identified candidate events, with comparison between "measured" and simulated concentrations based on certain release assumptions.
The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
Daytime identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2014-04-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather phenomena that have significant impacts on the environment, property and populations. A new method, the hail detection tool (HDT), is described for identifying hail-bearing storms using multispectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the convective mask (CM) algorithm devised for detection of deep convection, and the second a hail mask (HM) algorithm for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HM are based on logistic regression models trained with multispectral MSG data sets composed of summer convective events in the middle Ebro Valley (Spain) between 2006 and 2010, detected by the RGB (red-green-blue) visualization technique (CM) or the C-band weather radar system of the University of León (HM). With the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HM is computed by exploiting a proper selection of MSG wavelengths or their combinations. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to interpret the results of the statistical models physically, from a meteorological, ingredients-based perspective. Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall probability of detection was 76.9 % and the false alarm ratio 16.7 %.
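The HM step reduces to evaluating a logistic model on a vector of MSG-derived features. A minimal sketch, assuming hypothetical brightness-temperature-difference features and illustrative weights (not the trained HDT coefficients):

```python
import numpy as np

def hail_probability(features, coef, intercept):
    """Logistic-regression probability that a detected cumulonimbus
    bears hail, given MSG-derived feature values."""
    z = intercept + float(np.dot(coef, features))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical features, e.g. brightness-temperature differences (K)
features = np.array([-2.5, 1.3, -0.8])
coef = np.array([-0.9, 0.4, -0.6])   # illustrative weights
p_hail = hail_probability(features, coef, intercept=-1.0)
```

A storm pixel would then be flagged as hail-bearing when `p_hail` exceeds a chosen decision threshold.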
NASA Astrophysics Data System (ADS)
Befort, Daniel J.; Kruschke, Tim; Leckebusch, Gregor C.
2017-04-01
Tropical cyclones over East Asia have huge socio-economic impacts due to their strong wind fields and large rainfall amounts. The most severe events in particular are associated with huge economic losses; e.g., Typhoon Herb in 1996 is related to overall losses exceeding US$ 5 billion (Munich Re, 2016). In this study, an objective tracking algorithm is applied to JRA55 reanalysis data from 1979 to 2014 over the Western North Pacific. For this purpose, a purely wind-based algorithm, formerly used to identify extra-tropical wind storms, has been further developed. The algorithm is based on the exceedance of the local 98th percentile to define strong wind fields in gridded climate data. To be detected as a tropical cyclone candidate, the following criteria must be fulfilled: 1) the wind storm must exist for at least eight 6-hourly time steps and 2) the wind field must exceed a minimum size of 130,000 km² at each time step. The use of wind information is motivated by the focus on damage-related events; however, a pre-selection based on the affected region is necessary to remove events of an extra-tropical nature. Using IBTrACS best tracks for validation, it is found that about 62% of all detected tropical cyclone events in JRA55 reanalysis can be matched to an observed best track. As expected, the relative number of matched tracks increases with the wind intensity of the event, with a hit rate of about 98% for Violent Typhoons, above 90% for Very Strong Typhoons and about 75% for Typhoons. Overall these results are encouraging, as the parameters used to detect tropical cyclones in JRA55, e.g. the minimum area, are also suitable for detecting TCs in most CMIP5 simulations and will thus allow estimates of potential future changes.
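The two candidate criteria can be sketched as a simple track filter; the per-step exceedance areas below are made-up numbers for illustration:

```python
import numpy as np

MIN_STEPS = 8            # at least eight 6-hourly time steps
MIN_AREA_KM2 = 130_000   # minimum wind-field size per time step

def is_tc_candidate(areas_km2):
    """Check the two wind-storm criteria on a track, given the area
    (km^2) where wind exceeds the local 98th percentile at each
    6-hourly step. Illustrative sketch, not the full algorithm."""
    areas = np.asarray(areas_km2, dtype=float)
    return len(areas) >= MIN_STEPS and bool(np.all(areas >= MIN_AREA_KM2))

# A 9-step storm whose wind field always exceeds the size threshold:
track = [150_000, 160_000, 170_000, 180_000, 175_000,
         165_000, 155_000, 145_000, 140_000]
```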
NASA Astrophysics Data System (ADS)
Wu, Huijuan; Qian, Ya; Zhang, Wei; Li, Hanyu; Xie, Xin
2015-12-01
A real-time intelligent fiber-optic perimeter intrusion detection system (PIDS) based on a fiber Bragg grating (FBG) sensor network is presented in this paper. To distinguish the effects of different intrusion events, a novel real-time behavior impact classification method is proposed based on essential statistical characteristics of the signal profile in the time domain. The features are extracted by principal component analysis (PCA) and then used to identify the event with a K-nearest neighbor classifier. Simulation and field tests are both carried out to validate its effectiveness. The average identification rate (IR) for five sample signals in the simulation test is as high as 96.67%, and the recognition rate for eight typical signals in the field test reaches 96.52%, covering both fence-mounted and ground-buried sensing signals. In addition, a high detection rate (DR) and a low false alarm rate (FAR) can be obtained simultaneously based on autocorrelation characteristics analysis and a hierarchical detection and identification flow.
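The PCA-plus-KNN stage can be sketched with plain NumPy (PCA via SVD of the centred training matrix, then a majority vote over the k nearest neighbours); the toy feature vectors below stand in for the paper's statistical profile features:

```python
import numpy as np

def pca_knn_classify(train_X, train_y, test_X, n_components=2, k=3):
    """Project features onto the leading principal components, then
    label each test vector by majority vote of its k nearest
    training neighbours in the reduced space."""
    mu = train_X.mean(axis=0)
    # Principal axes from the SVD of the centred training matrix
    _, _, Vt = np.linalg.svd(train_X - mu, full_matrices=False)
    W = Vt[:n_components].T
    Ztr, Zte = (train_X - mu) @ W, (test_X - mu) @ W
    labels = []
    for z in Zte:
        d = np.linalg.norm(Ztr - z, axis=1)
        nearest = train_y[np.argsort(d)[:k]]
        labels.append(np.bincount(nearest).argmax())
    return np.array(labels)

# Toy data: two well-separated classes of 5-dimensional features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 5)),
               rng.normal(3.0, 0.3, size=(20, 5))])
y = np.array([0] * 20 + [1] * 20)
pred = pca_knn_classify(X, y, np.array([[0.1] * 5, [2.9] * 5]))
```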
Treml, Diana; Venturelli, Gustavo L; Brod, Fábio C A; Faria, Josias C; Arisi, Ana C M
2014-12-10
A genetically modified (GM) common bean event, namely Embrapa 5.1, resistant to the bean golden mosaic virus (BGMV), was approved for commercialization in Brazil. Brazilian regulation for genetically modified organism (GMO) labeling requires that any food containing more than 1% GMO be labeled. Event-specific polymerase chain reaction (PCR) has become the method of choice for GMO identification and quantitation because of its high specificity, based on the flanking sequence. This work reports the development of an event-specific assay, named FGM, for Embrapa 5.1 detection and quantitation using SYBR Green or a hydrolysis probe. The FGM assay specificity was tested against the Embrapa 2.3 event (a noncommercial GM common bean also resistant to BGMV), 46 non-GM common bean varieties, and other crop species including maize, GM maize, soybean, and GM soybean. The FGM assay showed high specificity for detecting the Embrapa 5.1 event. Standard curves for the FGM assay presented a mean efficiency of 95% and a limit of detection (LOD) of 100 genome copies in the presence of background DNA. The primers and probe developed are suitable for the detection and quantitation of Embrapa 5.1.
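The reported ~95% mean efficiency follows from the slope of the Cq-versus-log10(copy number) standard curve via the standard relation E = 10^(-1/slope) - 1. A minimal sketch (the slope value is illustrative):

```python
def amplification_efficiency(slope):
    """qPCR efficiency from a standard-curve slope:
    E = 10**(-1/slope) - 1; a slope of about -3.32 means
    perfect doubling per cycle (100% efficiency)."""
    return 10 ** (-1.0 / slope) - 1.0

# A slope near -3.45 corresponds to roughly 95% efficiency:
eff = amplification_efficiency(-3.45)
```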
Nuclear Explosion and Infrasound Event Resources of the SMDC Monitoring Research Program
2008-09-01
2008 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies. [Report excerpt] Figure 7: Dozens of detected infrasound signals from... investigate alternative detection schemes at the two infrasound arrays based on frequency-wavenumber (fk) processing and the F-statistic. The results of... infrasound signal-detection processing schemes.
Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David
2005-09-01
When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure" error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provide patient location data that are robust and can be compared with an "expected process" model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR" events. The system detected all "wrong-operating room (OR)" events, and all wrong-OR locations were correctly assigned within 0.50 ± 0.28 minutes (mean ± SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
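The expected-process comparison reduces to a rule check between the tracked and scheduled locations. A minimal sketch; the function and field names are hypothetical, not the deployed system's interface:

```python
def check_patient_location(tracked_or, scheduled_or):
    """Flag a wrong-OR event by comparing the real-time tracked
    location against the expected-process model; return an alert
    string for the paging system, or None when the locations agree."""
    if tracked_or != scheduled_or:
        return (f"ALERT: patient tracked in {tracked_or} "
                f"but scheduled for {scheduled_or}")
    return None

alert = check_patient_location("OR-7", "OR-3")   # would trigger a page
```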
O'Leary, Kevin J; Devisetty, Vikram K; Patel, Amitkumar R; Malkenson, David; Sama, Pradeep; Thompson, William K; Landler, Matthew P; Barnard, Cynthia; Williams, Mark V
2013-02-01
Research supports medical record review using screening triggers as the optimal method to detect hospital adverse events (AE), yet the method is labour-intensive. This study compared a traditional trigger tool with an enterprise data warehouse (EDW) based screening method to detect AEs. We created 51 automated queries based on 33 traditional triggers from prior research, and then applied them to 250 randomly selected medical patients hospitalised between 1 September 2009 and 31 August 2010. Two physicians each abstracted records from half the patients using a traditional trigger tool and then performed targeted abstractions for patients with positive EDW queries in the complementary half of the sample. A third physician confirmed presence of AEs and assessed preventability and severity. Traditional trigger tool and EDW based screening identified 54 (22%) and 53 (21%) patients with one or more AE. Overall, 140 (56%) patients had one or more positive EDW screens (total 366 positive screens). Of the 137 AEs detected by at least one method, 86 (63%) were detected by a traditional trigger tool, 97 (71%) by EDW based screening and 46 (34%) by both methods. Of the 11 total preventable AEs, 6 (55%) were detected by traditional trigger tool, 7 (64%) by EDW based screening and 2 (18%) by both methods. Of the 43 total serious AEs, 28 (65%) were detected by traditional trigger tool, 29 (67%) by EDW based screening and 14 (33%) by both. We found relatively poor agreement between traditional trigger tool and EDW based screening with only approximately a third of all AEs detected by both methods. A combination of complementary methods is the optimal approach to detecting AEs among hospitalised patients.
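The reported overlap figures are simple set arithmetic over the union of detected adverse events. A sketch with toy event identifiers:

```python
def detection_overlap(trigger_ids, edw_ids):
    """Percentage of all detected adverse events found by the trigger
    tool, by EDW screening, and by both methods (toy identifiers)."""
    trigger, edw = set(trigger_ids), set(edw_ids)
    total = len(trigger | edw)
    return {
        "trigger_pct": 100.0 * len(trigger) / total,
        "edw_pct": 100.0 * len(edw) / total,
        "both_pct": 100.0 * len(trigger & edw) / total,
    }

stats = detection_overlap({1, 2, 3, 4}, {3, 4, 5, 6})
```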
Automatic detection of snow avalanches in continuous seismic data using hidden Markov models
NASA Astrophysics Data System (ADS)
Heck, Matthias; Hammer, Conny; van Herwijnen, Alec; Schweizer, Jürg; Fäh, Donat
2018-01-01
Snow avalanches generate seismic signals like many other mass movements. Detection of avalanches by seismic monitoring is highly relevant for assessing avalanche danger. In contrast to other seismic events, signals generated by avalanches do not have a characteristic first arrival, nor is it possible to detect different wave phases. In addition, the moving-source character of avalanches increases the intricacy of the signals. Although it is possible to visually detect seismic signals produced by avalanches, reliable automatic detection methods for all types of avalanches do not exist yet. We therefore evaluate whether hidden Markov models (HMMs) are suitable for the automatic detection of avalanches in continuous seismic data. We analyzed data recorded during the winter season 2010 by a seismic array deployed in an avalanche starting zone above Davos, Switzerland. We re-evaluated a reference catalogue containing 385 events by grouping the events into seven probability classes. Since most of the data consist of noise, we first applied a simple amplitude threshold to reduce the amount of data. As the first classification results were unsatisfactory, we analyzed the temporal behavior of the seismic signals for the whole data set and found a high variability in the seismic signals. We therefore applied further post-processing steps to reduce the number of false alarms: defining a minimal duration for a detected event, implementing a voting-based approach and analyzing the coherence of the detected events. We obtained the best classification results for events detected by at least five sensors and with a minimal duration of 12 s. These processing steps made it possible to identify two periods of high avalanche activity, suggesting that HMMs are suitable for the automatic detection of avalanches in seismic data.
However, our results also showed that more sensitive sensors and more appropriate sensor locations are needed to improve the signal-to-noise ratio of the signals and therefore the classification.
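The post-processing criteria that gave the best results (at least five sensors, at least 12 s duration) can be sketched as a simple filter over candidate detections; the data structure is an assumption for illustration:

```python
def confirm_events(detections, min_sensors=5, min_duration_s=12.0):
    """Keep only candidate events detected by at least `min_sensors`
    sensors and lasting at least `min_duration_s` seconds.
    `detections` maps event id -> (n_sensors, duration_s)."""
    return [eid for eid, (n, dur) in detections.items()
            if n >= min_sensors and dur >= min_duration_s]

candidates = {"a": (6, 20.0), "b": (3, 30.0), "c": (7, 8.0)}
confirmed = confirm_events(candidates)
```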
ERIC Educational Resources Information Center
Marsh, Richard L.; Hicks, Jason L.; Cook, Gabriel I.
2005-01-01
In recent theories of event-based prospective memory, researchers have debated what degree of resources is necessary to identify a cue as related to a previously established intention. In order to simulate natural variations in attention, the authors manipulated effort toward an ongoing cognitive task in which intention-related cues were embedded…
DOT National Transportation Integrated Search
2017-02-01
This project collected and analyzed event based vehicle detection data from multiple technologies at four different sites across Oregon to provide guidance for deployment of non-invasive detection for use in adaptive control, as well as develop a tru...
Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.
Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S
2015-08-01
The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data.
Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery
NASA Astrophysics Data System (ADS)
Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.
2013-05-01
It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outdoor parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions with no movement, in order to segment the moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
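One common form of dynamic background subtraction that adapts to slow illumination changes is a running-average background model; a minimal NumPy sketch (the paper's exact update rule may differ):

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model: blend the new frame into the
    background so slow illumination changes are absorbed."""
    return (1.0 - alpha) * bg + alpha * frame

def moving_mask(bg, frame, thresh=25.0):
    """Flag pixels whose absolute difference from the background
    exceeds the threshold as belonging to moving objects."""
    return np.abs(frame.astype(float) - bg) > thresh

bg = np.zeros((4, 4))
frame = np.zeros((4, 4))
frame[1:3, 1:3] = 200.0   # a warm moving blob in the infrared image
mask = moving_mask(bg, frame)
```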
Bayesian Inference for Signal-Based Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.
2015-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http
Homaeinezhad, M R; Atyabi, S A; Daneshvar, E; Ghaffari, A; Tahmasebi, M
2010-12-01
The aim of this study is to describe a robust unified framework for segmentation of phonocardiogram (PCG) signal sounds based on false-alarm-probability (FAP) bounded segmentation of a properly calculated detection measure. To this end, the original PCG signal is first appropriately pre-processed, and then a fixed-sample-size sliding window is moved over the pre-processed signal. At each window position, the area under the excerpted segment is multiplied by its curve length to generate the Area Curve Length (ACL) metric, which is used as the segmentation decision statistic (DS). Afterwards, histogram parameters of the nonlinearly enhanced DS metric are used to regulate the α-level Neyman-Pearson classifier for FAP-bounded delineation of the PCG events. The proposed method was applied to all 85 records of the Nursing Student Heart Sounds database (NSHSDB), including stenosis, insufficiency, regurgitation, gallop, septal defect, split sound, rumble, murmur, click, friction rub and snap disorders, with different sampling frequencies. The method was also applied to records obtained from an electronic stethoscope board designed for this study, in the presence of high-level power-line noise and external disturbing sounds; no false positive (FP) or false negative (FN) errors were detected. High noise robustness, acceptable detection-segmentation accuracy of PCG events in various cardiac conditions, and the absence of any parameter dependency on the acquisition sampling frequency are the principal virtues of the proposed ACL-based PCG event detection-segmentation algorithm.
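The ACL decision statistic can be sketched in a few lines; the window length and the discrete area/length approximations below are illustrative choices, not the paper's tuned values:

```python
import numpy as np

def area_curve_length(signal, fs, win_s=0.05):
    """Sliding-window decision statistic: in each window, the area
    under the curve (sum of absolute samples) is multiplied by the
    curve length (sum of segment lengths between samples)."""
    w = max(2, int(win_s * fs))
    out = np.empty(len(signal) - w + 1)
    for i in range(len(out)):
        seg = signal[i:i + w]
        area = np.sum(np.abs(seg))
        length = np.sum(np.sqrt(1.0 + np.diff(seg) ** 2))
        out[i] = area * length
    return out

# Synthetic PCG-like trace: a 60 Hz burst between 0.4 s and 0.5 s
fs = 1000
t = np.arange(0, 1.0, 1.0 / fs)
pcg = np.where((t > 0.4) & (t < 0.5), np.sin(2 * np.pi * 60 * t), 0.02)
ds = area_curve_length(pcg, fs)
```

The DS peaks over the burst, so a Neyman-Pearson threshold on `ds` delineates the event.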
Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China
NASA Astrophysics Data System (ADS)
Sheng, M.; Chu, R.; Wei, Z.
2016-12-01
On a landslide, slope movement and fracturing of the rock mass often lead to microearthquakes, which are recorded as weak signals on seismographs. The temporal and spatial distribution of unstable regions, as well as the impact of external factors on them, can be understood and analyzed by monitoring these microseismic events. The microseismic method can provide information from inside the landslide, which can be used to supplement geodetic methods that monitor the movement of the landslide surface. Compared to drilling on a landslide, the microseismic method is also more economical and safer. The Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming since the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months at intervals of 200-500 meters. First, we used regional earthquakes for time correction of the seismometers, to eliminate the influence of inaccurate GPS clocks and of the subsurface structure beneath the stations. Due to the low velocity of the loose medium, the travel-time difference of microseismic events across the landslide reaches up to 5 s. Based on travel times and waveform characteristics, we found many microseismic events and converted them into envelopes to serve as templates; we then used a sliding-window cross-correlation technique based on the waveform envelopes to detect further microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on their locations, most of them lie at the front of the landslide while the others lie at the back end. The bottom and top of the landslide accumulated considerable energy and deformed substantially, so the radiated waves could be recorded by all stations; the bottom, with more events, appeared particularly active. In addition, many smaller events occurred in the middle part of the landslide; these released less energy, and the signals they generated could be recorded by only a few stations. Based on the distribution of these microseismic events, we found four unstable regions, which agree well with the deformed areas monitored by geodetic methods. The distribution of the microseismic events should be related to the internal structure and movement of the landslide.
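The envelope-based template matching can be sketched as normalised cross-correlation between a moving-RMS envelope of the trace and a template envelope; the synthetic trace and all parameter choices are illustrative:

```python
import numpy as np

def envelope(x, win=50):
    """Crude signal envelope: moving RMS over `win` samples."""
    return np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))

def detect_by_template(trace, template, threshold=0.8, win=50):
    """Slide the template envelope along the trace envelope and return
    the sample offsets where normalised cross-correlation exceeds the
    threshold."""
    env, tenv = envelope(trace, win), envelope(template, win)
    t0 = tenv - tenv.mean()
    n = len(tenv)
    cc = np.zeros(len(env) - n + 1)
    for i in range(len(cc)):
        s0 = env[i:i + n] - env[i:i + n].mean()
        denom = np.linalg.norm(t0) * np.linalg.norm(s0)
        cc[i] = (t0 @ s0) / denom if denom > 0 else 0.0
    return np.flatnonzero(cc > threshold)

# Synthetic trace: noise plus one buried copy of the template event
rng = np.random.default_rng(1)
event = np.sin(2 * np.pi * 5 * np.arange(200) / 1000) * np.hanning(200)
trace = 0.05 * rng.normal(size=2000)
trace[1200:1400] += event
hits = detect_by_template(trace, event)
```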
Perspectives of Cross-Correlation in Seismic Monitoring at the International Data Centre
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Zerbo, Lassina
2014-03-01
We demonstrate that several techniques based on waveform cross-correlation are able to significantly reduce the detection threshold of seismic sources worldwide and to improve the reliability of arrivals through more accurate estimation of their defining parameters. A master event and the events it can find using waveform cross-correlation at array stations of the International Monitoring System (IMS) have to be close. For the purposes of the International Data Centre (IDC), one can use the spatial closeness of the master and slave events to construct a new automatic processing pipeline: all qualified arrivals detected using cross-correlation are associated with events matching the current IDC event definition criteria (EDC) in a local association procedure. Considering the repeating character of global seismicity, more than 90 % of events in the reviewed event bulletin (REB) can be built by this automatic processing. Due to the reduced detection threshold, waveform cross-correlation may increase the number of valid REB events by a factor of 1.5-2.0. Therefore, the new pipeline may produce a more comprehensive bulletin than the current pipeline—the goal of seismic monitoring. The analysts' experience with the cross correlation event list (XSEL) shows that the workload of interactive processing might be reduced by a factor of two or even more. Since cross-correlation produces a comprehensive list of detections for a given master event, no additional arrivals from primary stations are expected to be associated with the XSEL events. The number of false alarms, relative to the number of events rejected from the standard event list 3 (SEL3) in the current interactive processing, can also be reduced by the use of several powerful filters. The principal filter is the difference between the arrival times of the master and newly built events at three or more primary stations, which should lie in a narrow range of a few seconds.
In this study, one event at a distance of about 2,000 km from the main shock was formed by three stations, with the stations and both events on the same great circle. Such spurious events are rejected by checking consistency between detections at stations at different back azimuths from the source region. Two additional effective pre-filters are f-k analysis and F prob based on correlation traces instead of original waveforms. Overall, waveform cross-correlation is able to improve the REB completeness, to reduce the workload related to IDC interactive analysis, and to provide a precise tool for quality check for both arrivals and events. Some major improvements in automatic and interactive processing achieved by cross-correlation are illustrated using an aftershock sequence from a large continental earthquake. Exploring this sequence, we describe schematically the next steps for the development of a processing pipeline parallel to the existing IDC one in order to improve the quality of the REB together with the reduction of the magnitude threshold.
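The principal arrival-time filter can be sketched directly; the station codes, times, and tolerance below are illustrative:

```python
def passes_time_filter(master_arrivals, new_arrivals, max_spread_s=4.0):
    """Screen a newly built event against its master event: arrival-time
    differences at three or more common primary stations must fall in a
    narrow range of a few seconds."""
    common = set(master_arrivals) & set(new_arrivals)
    if len(common) < 3:          # need three or more primary stations
        return False
    diffs = [new_arrivals[s] - master_arrivals[s] for s in common]
    return max(diffs) - min(diffs) <= max_spread_s

master = {"ASAR": 100.0, "WRA": 140.0, "CMAR": 180.0}
near = {"ASAR": 102.1, "WRA": 142.0, "CMAR": 181.9}   # consistent shifts
far = {"ASAR": 95.0, "WRA": 160.0, "CMAR": 170.0}     # inconsistent
```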
NASA Astrophysics Data System (ADS)
Helmboldt, J.; Park, J.; von Frese, R. R. B.; Grejner-Brzezinska, D. A.
2016-12-01
Traveling ionospheric disturbances (TIDs) are generated by various sources and are detectable by observing spatial and temporal changes of the electron content in the ionosphere. This study focused on detecting and analyzing TIDs generated by acoustic-gravity waves from man-made events, including underground nuclear explosions (UNEs), mine collapses, mine blasts, and large chemical explosions (LCEs), using Global Navigation Satellite System (GNSS) observations. We selected different types of events for case studies, covering two US and three North Korean UNEs, two large US mine collapses, three large US mine blasts, an LCE in northern China, and a second LCE at the Nevada Test Site. In most cases, we successfully detected the TIDs as array signatures from multiple nearby GNSS stations. The array-based TID signatures were found to yield event-appropriate TID propagation speeds ranging from a few hundred m/s to roughly a km/s. In addition, the event TID waveforms, propagation angles and directions were established. The TID waveforms, together with the maximum angle between each event and the ionospheric pierce point (IPP) of its TID with the longest travel distance from the source, may help differentiate UNEs and LCEs, but the uneven distribution of the observing GNSS stations complicates these results. Thus, further analysis is required of the utility of the apertures of event signatures in the ionosphere for discriminating these events. In general, the results of this study show the potential utility of GNSS observations for detecting and mapping the ionospheric signatures of high-energy anthropogenic explosions and subsurface collapses.
NASA Astrophysics Data System (ADS)
Zhang, C.; Wu, J.; Ma, Q.
2017-12-01
The environmental effect on the ionosphere caused by man-made power line emission (PLE) and power line harmonic radiation (PLHR) has become an increasing concern. Based on data from 6.5 operating years of the DEMETER satellite, by scanning the electric field power density time-frequency spectrograms, 133 PLHR events with central frequencies from 500 Hz to 4.5 kHz were detected in the near-Earth space above China. Among the 133 events, 129 have accompanying PLE events at the base power system frequency (50 Hz in China). The duration of every PLE event fully covers that of the corresponding PLHR event. Like PLHR, PLE also propagates in the whistler mode in the ionosphere. In two events detected in the conjugate region of the Australian NWC VLF transmitter, radiations with a line structure in the vicinity of 19.8 kHz are detected. There are 5 lines distributed from about 19.7 kHz to 19.9 kHz, in accordance with the frequency range of the NWC transmitted signals. The frequency spacing of the 5 lines is exactly 50 Hz and the bandwidth of each line is about 10 Hz. The electric field power density of the line-structure radiation is at the same level as the corresponding PLE, much higher than that of the PLHR. The line-structure radiations suggest possible modulation of VLF signals by PLE. Finally, the variation of ionospheric parameters measured by DEMETER in relation to PLHR is analyzed statistically. As the revisiting orbits of DEMETER pass over the same area with nearly no deviation and at the same time of day, for each PLHR event we check and average the parameters of 3 revisiting orbits before and after the event, respectively. Combined with the event orbit, the variations of these parameters can be obtained. There are five tendencies in total: no variation, ascending, descending, crest and trough. Only a few events show no variation. Though there are differences among the other four tendencies, none of the parameters shows a strong preference for any one of them. Crest and trough events are generally more numerous than ascending and descending events, especially for ion density and O+ ion percentage. The variations of the parameters show no preference with respect to latitude or day of year.
NASA Astrophysics Data System (ADS)
Earle, P. S.; Guy, M. R.; Smoczyk, G. M.; Horvath, S. R.; Jessica, T. S.; Bausch, D. B.
2014-12-01
The U.S. Geological Survey (USGS) operates a real-time system that detects earthquakes using only data from Twitter—a service for sending and reading public text-based messages of up to 140 characters. The detector algorithm scans for significant increases in tweets containing the word "earthquake" in several languages and sends internal alerts with the detection time, representative tweet texts, and the location of the population center where most of the tweets originated. It has been running in real-time for over two years and finds, on average, two or three felt events per day, with a false detection rate of 9%. The main benefit of the tweet-based detections is speed, with most detections occurring between 20 and 120 seconds after the earthquake origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. The detections have reasonable coverage of populated areas globally. The number of Twitter-based detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter-based detections are generally caused by widely felt events in populated urban areas that are of more immediate interest than those with no human impact. We will present a technical overview of the system and investigate the potential for rapid characterization of earthquake damage and effects using the 32 million "earthquake" tweets that the system has so far amassed. Initial results show potential for a correlation between characteristic responses and shaking level. For example, tweets containing the word "terremoto" were common following the MMI VII shaking produced by the April 1, 2014 M8.2 Iquique, Chile earthquake whereas a widely-tweeted deep-focus M5.2 north of Santiago, Chile on April 4, 2014 produced MMI VI shaking and almost exclusively "temblor" tweets. 
We are also investigating the use of other social media, such as Instagram, to obtain rapid images of earthquake-related damage. An Instagram search following the damaging M6.9 earthquake near the Mexico-Guatemala border on July 7, 2014 revealed half a dozen unconfirmed images of damage, the first posted 15 minutes after the event.
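The core of such a detector can be sketched as a short-term/long-term average ratio over per-minute tweet counts, in the spirit of seismic STA/LTA triggers; the window lengths and trigger ratio are illustrative, not the USGS system's actual parameters:

```python
from collections import deque

class TweetSpikeDetector:
    """Toy burst detector: compare the short-term average of
    'earthquake' tweet counts against the long-term average and
    trigger when the ratio exceeds a threshold."""
    def __init__(self, short_n=2, long_n=20, ratio=5.0):
        self.short = deque(maxlen=short_n)
        self.long = deque(maxlen=long_n)
        self.ratio = ratio

    def update(self, count):
        """Feed one per-minute count; return True on a detection."""
        self.long.append(count)
        self.short.append(count)
        lta = sum(self.long) / len(self.long)
        sta = sum(self.short) / len(self.short)
        return lta > 0 and sta / lta >= self.ratio

det = TweetSpikeDetector()
counts = [1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 40, 55]  # burst at the end
alarms = [det.update(c) for c in counts]
```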
Cashman, Patrick; Moberley, Sarah; Dalton, Craig; Stephenson, Jody; Elvidge, Elissa; Butler, Michelle; Durrheim, David N
2014-09-22
Vaxtracker is a web-based survey for active post-marketing surveillance of Adverse Events Following Immunisation. It is designed to efficiently monitor the safety of new vaccines by early signal detection of serious adverse events. The Vaxtracker system automates contact with the parents or carers of immunised children by email and/or SMS message to their smartphone. A hyperlink on the email and text messages links to a web-based survey exploring adverse events following the immunisation. The Vaxtracker concept was developed during 2011 (n=21), and piloted during the 2012 (n=200) and 2013 (n=477) influenza seasons for children receiving inactivated influenza vaccine (IIV) in the Hunter New England Local Health District, New South Wales, Australia. Survey results were reviewed by surveillance staff to detect any safety signals and compare adverse event frequencies among the different influenza vaccines administered. In 2012, 57% (n=113) of the 200 participants responded to the online survey, and 61% (290/477) in 2013. Vaxtracker appears to be an effective method for actively monitoring adverse events following influenza vaccination in children. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles
2007-01-01
Background: Over the past five years, in most hospitals in England and Wales, incident reporting has become well established, but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi-hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real-time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology: Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results: Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors, all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion: The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203
NASA Astrophysics Data System (ADS)
Han, Cheongho
2005-11-01
Currently, gravitational microlensing survey experiments toward the Galactic bulge field use two different methods of minimizing the blending effect for the accurate determination of the optical depth τ. One is measuring τ based on clump giant (CG) source stars, and the other is using "difference image analysis" (DIA) photometry to measure the unblended source flux variation. Despite the expectation that the two estimates should be the same assuming that blending is properly considered, the estimates based on CG stars systematically fall below the DIA results based on all events with source stars down to the detection limit. Prompted by this gap, we investigate the previously unconsidered effect of companion-associated events on τ determination. Although the image of a companion is blended with that of its primary star and thus not resolved, the event associated with the companion can be detected if the companion flux is highly magnified. Therefore, companions work effectively as source stars for microlensing, and thus neglecting them in the source star count could result in an incorrect τ estimate. By carrying out simulations based on the assumption that companions follow the same luminosity function as primary stars, we estimate that the contribution of the companion-associated events to the total event rate is ~5f_bi% for current surveys and can reach up to ~6f_bi% for future surveys monitoring fainter stars, where f_bi is the binary frequency. Therefore, we conclude that the companion-associated events comprise a nonnegligible fraction of all events. However, their contribution to the optical depth is not large enough to explain the systematic difference between the optical depth estimates based on the two different methods.
Acoustic emission analysis of tooth-composite interfacial debonding.
Cho, N Y; Ferracane, J L; Lee, I B
2013-01-01
This study detected tooth-composite interfacial debonding during composite restoration by means of acoustic emission (AE) analysis and investigated the effects of composite properties and adhesives on AE characteristics. The polymerization shrinkage, peak shrinkage rate, flexural modulus, and shrinkage stress of a methacrylate-based universal hybrid, a flowable, and a silorane-based composite were measured. Class I cavities on 49 extracted premolars were restored with 1 of the 3 composites and 1 of the following adhesives: 2 etch-and-rinse adhesives, 2 self-etch adhesives, and an adhesive for the silorane-based composite. AE analysis was done for 2,000 sec during light-curing. The silorane-based composite exhibited the lowest shrinkage (rate), the longest time to peak shrinkage rate, the lowest shrinkage stress, and the fewest AE events. AE events were detected immediately after the beginning of light-curing in most composite-adhesive combinations, but not until 40 sec after light-curing began for the silorane-based composite. AE events were concentrated at the initial stage of curing in self-etch adhesives compared with etch-and-rinse adhesives. Reducing the shrinkage (rate) of composites resulted in reduced shrinkage stress and less debonding, as evidenced by fewer AE events. AE is an effective technique for monitoring, in real time, the debonding kinetics at the tooth-composite interface.
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
2014-01-01
Background: To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. Results: To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of the GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5′-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. Conclusions: The developed quadruplex quantitative method could be used for quantitative detection of the two GM cotton events (GHB119 and T304-40) and the Cry2Ae gene ingredient in cotton-derived products. PMID:24884946
Detection of VHF lightning from GPS orbit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suszcynsky, D. M.
2003-01-01
Satellite-based VHF lightning detection is characterized at GPS orbit by using a VHF receiver system recently launched on the GPS SVN 54 satellite. Collected lightning triggers consist of Narrow Bipolar Events (80%) and strong negative return strokes (20%). The results are used to evaluate the performance of a future GPS-satellite-based VHF global lightning monitor.
Sakaeda, Toshiyuki; Kadoyama, Kaori; Okuno, Yasushi
2011-01-01
Adverse event reports (AERs) submitted to the US Food and Drug Administration (FDA) were reviewed to assess the muscular and renal adverse events induced by the administration of 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase inhibitors (statins) and to attempt to determine the rank-order of the association. After a revision of arbitrary drug names and the deletion of duplicated submissions, AERs involving pravastatin, simvastatin, atorvastatin, or rosuvastatin were analyzed. Authorized pharmacovigilance tools were used for quantitative detection of signals, i.e., drug-associated adverse events, including the proportional reporting ratio, the reporting odds ratio, the information component given by a Bayesian confidence propagation neural network, and the empirical Bayes geometric mean. Myalgia, rhabdomyolysis and an increase in creatine phosphokinase level were focused on as the muscular adverse events, and acute renal failure, non-acute renal failure, and an increase in blood creatinine level as the renal adverse events. Based on 1,644,220 AERs from 2004 to 2009, signals were detected for 4 statins with respect to myalgia, rhabdomyolysis, and an increase in creatine phosphokinase level, but these signals were stronger for rosuvastatin than pravastatin and atorvastatin. Signals were also detected for acute renal failure, though in the case of atorvastatin, the association was marginal, and furthermore, a signal was not detected for non-acute renal failure or for an increase in blood creatinine level. Data mining of the FDA's adverse event reporting system, AERS, is useful for examining statin-associated muscular and renal adverse events. The data strongly suggest the necessity of well-organized clinical studies with respect to statin-associated adverse events.
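The disproportionality measures named above (the proportional reporting ratio and the reporting odds ratio) are standard pharmacovigilance statistics computed from a 2x2 table of report counts. A minimal sketch follows; the signal thresholds and the Bayesian measures used in the study (information component, empirical Bayes geometric mean) are not reproduced here:

```python
def prr_ror(a, b, c, d):
    """Disproportionality measures from a 2x2 table of report counts:
        a: target drug & target event      b: target drug, other events
        c: other drugs & target event      d: other drugs, other events
    Returns (PRR, ROR)."""
    prr = (a / (a + b)) / (c / (c + d))  # proportional reporting ratio
    ror = (a * d) / (b * c)              # reporting odds ratio
    return prr, ror
```

For example, 20 target-event reports out of 100 for the drug of interest against 100 out of 9900 for all other drugs gives PRR = 19.8 and ROR = 24.5, which most screening rules would flag as a signal.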
Binary Microlensing Events from the MACHO Project
NASA Astrophysics Data System (ADS)
Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Baines, D.; Becker, A. C.; Bennett, D. P.; Bourke, A.; Brakel, A.; Cook, K. H.; Crook, B.; Crouch, A.; Dan, J.; Drake, A. J.; Fragile, P. C.; Freeman, K. C.; Gal-Yam, A.; Geha, M.; Gray, J.; Griest, K.; Gurtierrez, A.; Heller, A.; Howard, J.; Johnson, B. R.; Kaspi, S.; Keane, M.; Kovo, O.; Leach, C.; Leach, T.; Leibowitz, E. M.; Lehner, M. J.; Lipkin, Y.; Maoz, D.; Marshall, S. L.; McDowell, D.; McKeown, S.; Mendelson, H.; Messenger, B.; Minniti, D.; Nelson, C.; Peterson, B. A.; Popowski, P.; Pozza, E.; Purcell, P.; Pratt, M. R.; Quinn, J.; Quinn, P. J.; Rhie, S. H.; Rodgers, A. W.; Salmon, A.; Shemmer, O.; Stetson, P.; Stubbs, C. W.; Sutherland, W.; Thomson, S.; Tomaney, A.; Vandehei, T.; Walker, A.; Ward, K.; Wyper, G.
2000-09-01
We present the light curves of 21 gravitational microlensing events from the first six years of the MACHO Project gravitational microlensing survey that are likely examples of lensing by binary systems. These events were manually selected from a total sample of ~350 candidate microlensing events that were either detected by the MACHO Alert System or discovered through retrospective analyses of the MACHO database. At least 14 of these 21 events exhibit strong (caustic) features, and four of the events are well fit with lensing by large mass ratio (brown dwarf or planetary) systems, although these fits are not necessarily unique. The total binary event rate is roughly consistent with predictions based upon our knowledge of the properties of binary stars, but a precise comparison cannot be made without a determination of our binary lens event detection efficiency. Toward the Galactic bulge, we find a ratio of caustic crossing to noncaustic crossing binary lensing events of 12:4, excluding one event for which we present two fits. This suggests significant incompleteness in our ability to detect and characterize noncaustic crossing binary lensing. The distribution of mass ratios, N(q), for these binary lenses appears relatively flat. We are also able to reliably measure source-face crossing times in four of the bulge caustic crossing events, and recover from them a distribution of lens proper motions, masses, and distances consistent with a population of Galactic bulge lenses at a distance of 7+/-1 kpc. This analysis yields two systems with companions of ~0.05 Msolar.
Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth
2017-01-01
Non-verbal communication involves encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g. prosody) or visual (e.g. gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g. laughter, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughter is associated with affective expressions, while fillers (e.g. um, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal event detection system focusing on the detection of laughter and fillers. We extend our system presented during the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an Area Under the Receiver Operating Characteristic curve of 95.3% for detecting laughter and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications for better system design. PMID:28713197
Onsets of Solar Proton Events in Satellite and Ground Level Observations: A Comparison
NASA Astrophysics Data System (ADS)
He, Jing; Rodriguez, Juan V.
2018-03-01
The early detection of solar proton event onsets is essential for protecting humans and electronics in space, as well as passengers and crew at aviation altitudes. Two commonly compared methods for observing solar proton events that are sufficiently large and energetic to be detected on the ground through the creation of secondary radiation—known as ground level enhancements (GLEs)—are (1) a network of ground-based neutron monitors (NMs) and (2) satellite-based particle detectors. Until recently, owing to the different time resolution of the two data sets, it has not been feasible to compare these two types of observations using the same detection algorithm. This paper presents a comparison between the two observational platforms using newly processed >100 MeV 1 min count rates and fluxes from National Oceanic and Atmospheric Administration's Geostationary Operational Environmental Satellite (GOES) 8-12 satellites, and 1 min count rates from the Neutron Monitor Database. We applied the same detection algorithm to each data set (tuned to the different background noise levels of the instrument types). Seventeen SPEs with GLEs were studied: GLEs 55-70 from Solar Cycle 23 and GLE 71 from Solar Cycle 24. The median difference in the event detection times by GOES and NM data is 0 min, indicating no innate benefit in time of either system. The 10th, 25th, 75th, and 90th percentiles of the onset time differences (GOES minus NMs) are -7.2 min, -1.5 min, 2.5 min, and 4.2 min, respectively. This is in contrast to previous studies in which NM detections led GOES by 8 to 52 min without accounting for different alert protocols.
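The comparison above applies the same onset-detection algorithm, tuned to each instrument's background noise level, to GOES and neutron-monitor count rates. The paper's actual algorithm is not given in the abstract; a generic sketch of a background-tuned threshold-crossing detector, with the trailing-window length and the k-sigma threshold as assumed parameters, might be:

```python
from statistics import mean, stdev

def onset_time(rates, k=4.0, background=30):
    """Return the index of the first sample exceeding the trailing-background
    mean by k standard deviations, or None if no onset is found.
    `rates` is a sequence of 1 min count rates."""
    for i in range(background, len(rates)):
        bg = rates[i - background:i]      # trailing background window
        mu, sigma = mean(bg), stdev(bg)
        if sigma > 0 and rates[i] > mu + k * sigma:
            return i
    return None
```

Tuning k per instrument mirrors the abstract's point: the same detection logic can be shared across platforms while the noise model differs.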
Wang, Su-hua; Baillargeon, Renée
2009-01-01
As they observe or produce events, infants identify variables that help them predict outcomes in each category of events. How do infants identify a new variable? An explanation-based learning (EBL) account suggests three essential steps: (1) observing contrastive outcomes relevant to the variable; (2) discovering the conditions associated with these outcomes; and (3) generating an explanation for the condition-outcome regularity discovered. In Experiments 1–3, 9-month-old infants watched events designed to “teach” them the variable height in covering events. After watching these events, designed in accord with the EBL account, the infants detected a height violation in a covering event, three months earlier than they ordinarily would have. In Experiments 4–6, the “teaching” events were modified to remove one of the EBL steps, and the infants no longer detected the height violation. The present findings thus support the EBL account and help specify the processes by which infants acquire their physical knowledge. PMID:18177635
Ben Mansour, Khaireddine; Rezzoug, Nasser; Gorce, Philippe
2015-10-01
The purpose of this paper was to determine which types of inertial sensors and which advocated locations should be used for reliable and accurate gait event detection and temporal parameter assessment in normal adults. In addition, we aimed to remove the ambiguity found in the literature in the definition of the initial contact (IC) from the lumbar accelerometer. Acceleration and angular velocity data were gathered from the lumbar region and the distal edge of each shank. These data were evaluated against an instrumented treadmill and an optoelectronic system during five treadmill speed sessions. The lumbar accelerometer showed that the peak of the anteroposterior component was the most accurate for IC detection. Similarly, the valley that followed the peak of the vertical component was the most precise for terminal contact (TC) detection. Results based on ANOVA and Tukey tests showed that the set of inertial methods was suitable for temporal gait assessment and gait event detection in able-bodied subjects. For gait event detection, an exception was found with the shank accelerometer. The tool was suitable for temporal parameter assessment, despite the high root mean square error on the detection of IC (RMSE_IC) and TC (RMSE_TC). The shank gyroscope was found to be as accurate as the kinematic method, since the statistical tests revealed no significant difference between the two techniques for the RMSE of all gait events and temporal parameters. The lumbar and shank accelerometers were the most accurate alternative to the shank gyroscope for gait event detection and temporal parameter assessment, respectively. Copyright © 2015. Published by Elsevier B.V.
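The study above detects initial contact from the peak of the anteroposterior lumbar acceleration and derives temporal parameters from successive events. A toy sketch of peak picking and stride timing follows; the minimum peak height, the helper names, and the sampling rate are illustrative assumptions, not the paper's method:

```python
def local_maxima(signal, min_height):
    """Indices of samples exceeding both neighbors and min_height, e.g. peaks
    of anteroposterior lumbar acceleration as initial-contact candidates."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
            and signal[i] >= min_height]

def stride_times(peak_indices, fs):
    """Stride durations in seconds from consecutive initial contacts,
    given a sampling frequency fs in Hz."""
    return [(b - a) / fs for a, b in zip(peak_indices, peak_indices[1:])]
```

A real implementation would add a minimum inter-peak distance and low-pass filtering, but the event-to-parameter pipeline (detect events, then difference their timestamps) is the same.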
Liu, David; Jenkins, Simon A; Sanderson, Penelope M; Watson, Marcus O; Leane, Terrence; Kruys, Amanda; Russell, W John
2009-10-01
Head-mounted displays (HMDs) can help anesthesiologists with intraoperative monitoring by keeping patients' vital signs within view at all times, even while the anesthesiologist is busy performing procedures or unable to see the monitor. The anesthesia literature suggests that there are advantages of HMD use, but research into head-up displays in the cockpit suggests that HMDs may exacerbate inattentional blindness (a tendency for users to miss unexpected but salient events in the field of view) and may introduce perceptual issues relating to focal depth. We investigated these issues in two simulator-based experiments. Experiment 1 investigated whether wearing a HMD would affect how quickly anesthesiologists detect events, and whether the focus setting of the HMD (near or far) makes any difference. Twelve anesthesiologists provided anesthesia in three naturalistic scenarios within a simulated operating theater environment. There were 24 different events that occurred either on the patient monitor or in the operating room. Experiment 2 investigated whether anesthesiologists physically constrained by performing a procedure would detect patient-related events faster with a HMD than without. Twelve anesthesiologists performed a complex simulated clinical task on a part-task endoscopic dexterity trainer while monitoring the simulated patient's vital signs. All participants experienced four different events within each of two scenarios. Experiment 1 showed that neither wearing the HMD nor adjusting the focus setting reduced participants' ability to detect events (the number of events detected and time to detect events). In general, participants spent more time looking toward the patient and less time toward the anesthesia machine when they wore the HMD than when they used standard monitoring alone. Participants reported that they preferred the near focus setting. 
Experiment 2 showed that participants detected two of four events faster with the HMD, but one event more slowly with the HMD. Participants turned to look toward the anesthesia machine significantly less often when using the HMD. When using the HMD, participants reported that they were less busy, monitoring was easier, and they believed they were faster at detecting abnormal changes. The HMD helped anesthesiologists detect events when physically constrained, but not when physically unconstrained. Although there was no conclusive evidence of worsened inattentional blindness, found in aviation, the perceptual properties of the HMD display appear to influence whether events are detected. Anesthesiologists wearing HMDs should self-adjust the focus to minimize eyestrain and should be aware that some changes may not attract their attention. Future areas of research include developing principles for the design of HMDs, evaluating other types of HMDs, and evaluating the HMD in clinical contexts.
Lee, Jiyoung; Deininger, Rolf A
2010-05-01
Water distribution systems can be vulnerable to microbial contamination through cross-connections, wastewater backflow, the intrusion of soiled water after a loss of pressure resulting from an electricity blackout, natural disaster, or intentional contamination of the system in a bioterrorism event. The most urgent matter a water treatment utility would face in this situation is detecting the presence and extent of a contamination event in real time, so that immediate action can be taken to mitigate the problem. The currently approved microbiological detection methods are culture-based plate count methods, which require incubation time (1 to 7 days). This long period of time would not be useful for the protection of public health. This study was designed to simulate wastewater intrusion in a water distribution system. The objectives were 2-fold: (1) real-time detection of water contamination, and (2) investigation of the sustainability of drinking water systems to suppress the contamination with secondary disinfectant residuals (chlorine and chloramine). The events of drinking water contamination resulting from a wastewater addition were determined by a filtration-based luminescence assay. The water contamination was detected by the luminescence method within 5 minutes. The signal amplification attributed to wastewater contamination was clear: a 10^2-fold signal increase. After 1 hour, chlorinated water could inactivate 98.8% of the bacterial contaminant, while chloraminated water reduced it by 77.2%.
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegradation in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase detection sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.
Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment
2007-03-01
[Abstract not available; the record preserves only thesis front-matter fragments: Research Approach; Thesis Organization; Detection Distribution Sampling; Estimated Position Calculation; and Figure 31, "Particle Disqualification via Sanitization," noting results obtained by disqualifying a large number of particles.]
Closing the Loop in ICU Decision Support: Physiologic Event Detection, Alerts, and Documentation
Norris, Patrick R.; Dawant, Benoit M.
2002-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally care providers should be alerted only when events are clinically significant and there is opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts on the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users’ alphanumeric pagers, and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to potentially improve event definitions and enhance system utility.
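The ICU alerting work above wrestles with clinical significance: alerts should fire only for sustained, actionable deviations rather than transient spikes. One common heuristic (a hypothetical sketch, not the Simon project's actual event definitions for intracranial or cerebral perfusion pressure) is to require a threshold excursion to persist for a minimum duration:

```python
def sustained_events(values, threshold, min_duration):
    """Return (start, end) index pairs where `values` stays above `threshold`
    for at least `min_duration` consecutive samples; a persistence rule that
    suppresses brief spikes which are rarely actionable."""
    events, start = [], None
    for i, v in enumerate(values):
        if v > threshold and start is None:
            start = i                          # excursion begins
        elif v <= threshold and start is not None:
            if i - start >= min_duration:
                events.append((start, i))      # excursion long enough to alert
            start = None
    if start is not None and len(values) - start >= min_duration:
        events.append((start, len(values)))    # excursion still open at end
    return events
```

Capturing clinician documentation at alert time, as the paper does, is what lets thresholds and minimum durations like these be refined against actual therapeutic actions.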
VVV Survey Microlensing Events in the Galactic Center Region
NASA Astrophysics Data System (ADS)
Navarro, María Gabriela; Minniti, Dante; Contreras Ramos, Rodrigo
2017-12-01
We search for microlensing events in the highly reddened areas surrounding the Galactic center using near-IR observations from the VISTA Variables in the Vía Láctea Survey (VVV). We report the discovery of 182 new microlensing events, based on observations acquired between 2010 and 2015. We present the color-magnitude diagrams of the microlensing sources for the VVV tiles b332, b333, and b334, which were independently analyzed and show good qualitative agreement among themselves. We detect an excess of microlensing events in the central tile b333 in comparison with the other two tiles, suggesting that the microlensing optical depth keeps rising all the way to the Galactic center. We derive the Einstein radius crossing time for all of the observed events. The observed event timescales range from t_E = 5 to 200 days. The resulting timescale distribution shows a mean timescale of <t_E> = 30.91 days for the complete sample (N = 182 events), and <t_E> = 29.93 days if restricted to the red clump (RC) giant sources (N = 96 RC events). There are 20 long-timescale events (t_E ≥ 100 days) that suggest the presence of massive lenses (black holes) or disk-disk events. This work demonstrates that the VVV Survey is a powerful tool for detecting intermediate/long-timescale microlensing events in highly reddened areas, and it enables a number of future applications, from analyzing individual events to computing statistics for the inner Galactic mass and kinematic distributions, in aid of future ground- and space-based experiments.
NASA Astrophysics Data System (ADS)
Bainbridge, S.
2012-04-01
The advent of new observing systems, such as sensor networks, has dramatically increased our ability to collect marine data; the issue now is not data drought but data deluge. The challenge is to extract data representing events of interest from the background data, that is, to deliver information and potentially knowledge from an increasingly large store of base observations. Given that each potential user will have differing definitions of 'interesting', and that these are often defined by other events and data, systems need to deliver information or knowledge in a form and context defined by the user. This paper reports on a series of coral reef sensor networks set up under the Coral Reef Environmental Observation Network (CREON). CREON is a community-of-interest group deploying coral reef sensor networks with the goal of increasing capacity in coral reef observation, especially in developing areas. Issues such as coral bleaching, terrestrial runoff, human impacts and climate change are affecting reefs, with one assessment indicating that a quarter of the world's reefs are severely degraded and another quarter is under immediate threat. Increasing our ability to collect scientifically valid observations is fundamental to understanding these systems and ultimately to preserving and sustaining them. A cloud-based data management system was used to store the base sensor data from each agency involved, using service-based agents to push the data from individual field sensors to the cloud. The system supports a range of service-based outputs such as on-line graphs, a smartphone application and simple event detection. A more complex event detection system was written that takes input from the cloud services and outputs natural-language 'tweets' to Twitter as events occur. It therefore becomes possible to distil the entire data set down to a series of Twitter entries that interested parties can subscribe to.
The next step is to allow users to define their own events and to deliver results, in context, to their preferred medium. The paper contrasts what has been achieved within a small community with well-defined issues against what it would take to build equivalent systems holding a wide range of cross-community observational data and addressing a wider range of potential issues. The role of discoverability, quality control, uncertainty, conformity and metadata is investigated, along with a brief discussion of existing and emerging standards in this area. The elements of such a system are described, along with the role of modelling and scenario tools in delivering a higher level of outputs linking what may have already occurred (event detection) with what may potentially occur (scenarios). The development of service-based cloud computing open data systems, coupled with complex event detection systems delivering through social media and other channels, linked into model and scenario systems, represents one vision for delivering value from the increasing store of ocean observations, most of which lie unknown, unused and unloved.
Pipeline Processing with an Iterative, Context-Based Detection Model
2016-01-22
[Fragment of the report's list of figures: Figure 25, teleseismic paths from earthquakes in Myanmar to three North American arrays, with a path length to ILAR (the nearest array) of about 8950 kilometers; Figure 26, waveforms of a Myanmar calibration event (left) and target event (right) recorded at ILAR, using one Myanmar event (2007/05/16 08:56:16.0, Mw 6.3; 20.47°N 100.69°E) as a calibration for a second event occurring nearly 4 years later (2011/03/24 13:55).]
Single-Event Effect Testing of the Linear Technology LTC6103HMS8#PBF Current Sense Amplifier
NASA Technical Reports Server (NTRS)
Yau, Ka-Yen; Campola, Michael J.; Wilcox, Edward
2016-01-01
The LTC6103HMS8#PBF (henceforth abbreviated as LTC6103) current sense amplifier from Linear Technology was tested for both destructive and non-destructive single-event effects (SEE) using the heavy-ion cyclotron accelerator beam at the Lawrence Berkeley National Laboratory (LBNL) Berkeley Accelerator Space Effects (BASE) facility. During testing, the input voltages and output currents were monitored to detect single-event latch-up (SEL) and single-event transients (SETs).
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and must not miss events, they are tuned for high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
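The predictive state-assignment step described above, in which each dynamic event state is modeled as a multidimensional Gaussian and new data are assigned to the most likely state, can be sketched as follows. This is a minimal illustration only: the diagonal covariances, state labels, and feature vectors below are invented, not taken from the paper.

```python
import math

def gaussian_log_likelihood(x, mean, var):
    """Log-likelihood of vector x under an independent (diagonal) Gaussian."""
    ll = 0.0
    for xi, mu, v in zip(x, mean, var):
        ll += -0.5 * (math.log(2 * math.pi * v) + (xi - mu) ** 2 / v)
    return ll

def assign_state(x, states):
    """Assign a feature vector to the event state with the highest
    log-likelihood. `states` maps a state label to a (mean, variance)
    pair, one entry per learned dynamic event state."""
    return max(states, key=lambda s: gaussian_log_likelihood(x, *states[s]))

# Hypothetical two-state model: "interictal" vs. "seizure-onset" iEEG features.
states = {
    "interictal":    ([0.2, 0.1], [0.05, 0.05]),
    "seizure_onset": ([1.5, 2.0], [0.30, 0.40]),
}
print(assign_state([0.25, 0.05], states))  # a quiet window -> interictal
print(assign_state([1.40, 1.90], states))  # a high-energy window -> seizure_onset
```

In the paper's pipeline the states themselves are learned by a Bayesian nonparametric Markov switching process; here they are simply given.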
Lee, David; La Mura, Maurizio; Allnutt, Theo R; Powell, Wayne
2009-02-02
The most common method of GMO detection is based upon the amplification of GMO-specific DNA amplicons using the polymerase chain reaction (PCR). Here we have applied the loop-mediated isothermal amplification (LAMP) method to amplify GMO-related DNA sequences: 'internal' commonly used motifs for controlling transgene expression, and event-specific (plant-transgene) junctions. We have tested the specificity and sensitivity of the technique for use in GMO studies. Results show that detection of 0.01% GMO in equivalent background DNA was possible, and dilutions of template suggest that detection from single copies of the template may be possible using LAMP. This work shows that GMO detection can be carried out using LAMP for routine screening as well as for specific event detection. Moreover, the sensitivity and the ability to amplify targets even against a high background of DNA, demonstrated here, highlight the advantages of this isothermal amplification when applied to GMO detection.
APDS: the autonomous pathogen detection system.
Hindson, Benjamin J; Makarewicz, Anthony J; Setlur, Ujwal S; Henderer, Bruce D; McBride, Mary T; Dzenitis, John M
2005-04-15
We have developed and tested a fully autonomous pathogen detection system (APDS) capable of continuously monitoring the environment for airborne biological threat agents. The system was developed to provide early warning to civilians in the event of a bioterrorism incident and can be used at high-profile events for short-term, intensive monitoring, or in major public buildings or transportation nodes for long-term monitoring. The APDS is completely automated, offering continuous aerosol sampling, in-line sample preparation fluidics, multiplexed detection and identification immunoassays, and nucleic acid-based polymerase chain reaction (PCR) amplification and detection. Highly multiplexed antibody-based and duplex nucleic acid-based assays are combined to reduce false positives to a very low level, lower reagent costs, and significantly expand the detection capabilities of this biosensor. This article provides an overview of the current design and operation of the APDS. Certain sub-components of the APDS are described in detail, including the aerosol collector, the automated sample preparation module that performs multiplexed immunoassays with confirmatory PCR, and the data monitoring and communications system. Data obtained from an APDS that operated continuously for 7 days in a major U.S. transportation hub are reported.
Chiles, M.M.; Mihalczo, J.T.; Blakeman, E.D.
1987-02-27
A scintillation-based radiation detector for the combined detection of thermal neutrons, high-energy neutrons, and gamma rays in a single detecting unit. The detector consists of a pair of scintillators sandwiched together and optically coupled to the light-sensitive face of a photomultiplier tube. A light-tight, radiation-pervious housing is disposed about the scintillators and a portion of the photomultiplier tube to hold the arrangement in assembly, and provides a radiation window adjacent to the outer scintillator through which the radiation to be detected enters the detector. The outer scintillator is formed of a material in which scintillations are produced by thermal neutrons, and the inner scintillator is formed of a material in which scintillations are produced by high-energy neutrons and gamma rays. The light pulses produced by events detected in both scintillators are coupled to the photomultiplier tube, which produces a current pulse in response to each detected event. These current pulses may be processed in a conventional manner to produce a count-rate output indicative of the total detected radiation event count rate. Pulse discrimination techniques may be used to distinguish the different radiations and their energy distributions.
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service over the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods and will make GMO analysis much easier.
Analyzing and Identifying Teens' Stressful Periods and Stressor Events From a Microblog.
Li, Qi; Xue, Yuanyuan; Zhao, Liang; Jia, Jia; Feng, Ling
2017-09-01
Increased health problems among adolescents caused by psychological stress have aroused worldwide attention. Long-standing stress without targeted assistance and guidance negatively impacts the healthy growth of adolescents, threatening the future development of our society. So far, research has focused on detecting adolescent psychological stress revealed in individual posts on microblogs. However, beyond stressful moments, identifying teens' stressful periods and the stressor events that trigger each stressful period is more desirable for understanding stress from appearance to essence. In this paper, we define the problem of identifying teens' stressful periods and stressor events from the open social media microblog. Starting from a case study of adolescents' posting behaviors during stressful school events, we build a Poisson-based probability model of the correlation between stressor events and stressful posting behaviors through a series of posts on Tencent Weibo (referred to as the microblog throughout the paper). With the model, we discover teens' maximal stressful periods and further extract details of the possible stressor events that cause them. We generalize and present the extracted stressor events in a hierarchy based on common stress dimensions and event types. Taking 122 scheduled stressful study-related events in a high school as the ground truth, we tested the approach on 124 students' posts from January 1, 2012 to February 1, 2015 and obtained promising experimental results: (stressful periods: recall 0.761, precision 0.737, and F1-measure 0.734) and (top-3 stressor events: recall 0.763, precision 0.756, and F1-measure 0.759). The most prominent extracted stressor events are in the self-cognition domain, followed by the school-life domain. This conforms to the adolescent psychology finding that problems in school life are usually accompanied by teens' inner cognition problems. Compared with the state-of-the-art top-1 personal life event detection approach, our stressor event detection method is 13.72% higher in precision, 19.18% higher in recall, and 16.50% higher in F1-measure, demonstrating the effectiveness of the proposed framework.
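The Poisson-based idea above, linking stressor events to bursts of stressful posting behavior, can be sketched in its simplest form: flag days whose post count is improbably high under a Poisson null model. The paper's actual model is richer than this; the baseline rate, significance level, and daily counts below are all invented for illustration.

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), computed by direct summation
    of the lower tail (fine for small k)."""
    p_le = sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k))
    return 1.0 - p_le

def flag_stressful_days(daily_posts, baseline_rate, alpha=0.01):
    """Return indices of days whose stressful-post count is improbably
    high under a Poisson(baseline_rate) null model."""
    return [d for d, k in enumerate(daily_posts)
            if poisson_sf(k, baseline_rate) < alpha]

# Hypothetical daily stressful-post counts around an exam week.
counts = [1, 0, 2, 1, 9, 11, 8, 1, 0]
print(flag_stressful_days(counts, baseline_rate=1.2))  # -> [4, 5, 6]
```

Consecutive flagged days would then be merged into a candidate stressful period, within which stressor-event details are extracted.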
A novel CUSUM-based approach for event detection in smart metering
NASA Astrophysics Data System (ADS)
Zhu, Zhicheng; Zhang, Shuai; Wei, Zhiqiang; Yin, Bo; Huang, Xianqing
2018-03-01
Non-intrusive load monitoring (NILM) plays a significant role in raising consumer awareness of household electricity use, with the aim of reducing overall energy consumption. For monitoring low-power loads, many researchers have introduced CUSUM into NILM systems, since traditional event detection methods are not as effective as expected. Because the original CUSUM fails when a small shift stays below the threshold, we improve the test statistic so that the permissible deviation rises gradually as the data size increases. This paper proposes a novel event detection method and a corresponding criterion that can be used in NILM systems to recognize transient states and to assist the labelling task. Its performance has been tested in a real scenario in which eight different appliances are connected to the main line of electric power.
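The abstract does not give the improved test statistic, but the classic one-sided CUSUM it builds on is standard and can be sketched as follows. The drift and threshold parameters and the power trace are invented for illustration.

```python
def cusum_detect(samples, target_mean, drift=0.5, threshold=5.0):
    """Classic one-sided CUSUM: accumulate positive deviations from the
    target mean (minus an allowable drift) and report the sample index
    at which the cumulative sum first exceeds the threshold."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean) - drift)
        if s > threshold:
            return i  # change detected at this sample
    return None

# Hypothetical steady ~100 W load; an appliance switching on at index 5
# raises the level to ~104 W.
power = [100.1, 99.8, 100.2, 99.9, 100.0, 104.2, 104.0, 104.1, 103.9, 104.2]
print(cusum_detect(power, target_mean=100.0))  # -> 6
```

Note the limitation the paper addresses: a shift whose per-sample excess stays below `drift` would never accumulate, which motivates letting the permissible deviation grow with the data size.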
SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.
2013-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which IMS station processing failed to register any detection.
Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter
2018-01-01
Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understanding the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, even though they may not be accompanied by their corresponding main effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected, with predictive value, in observational studies with time-to-event outcomes. PMID:29453930
Evaluation of the LWVD Luminosity for Use in the Spectral-Based Volume Sensor Algorithms
2010-04-29
[Fragment of the report's abbreviation list and front matter: VMI, Vibro-Meter, Inc.; VS, Volume Sensor; VSCS, Volume Sensor Communications Specification; VSDS, Volume Sensor Detection Suite; VSNP, Volume Sensor Nodal Panel. Appendix A gives a complete listing of the SBVS EVENT parameters and the EVENT algorithm descriptions.]
Validation of an automated seizure detection algorithm for term neonates
Mathieson, Sean R.; Stevenson, Nathan J.; Low, Evonne; Marnane, William P.; Rennie, Janet M.; Temko, Andrey; Lightbody, Gordon; Boylan, Geraldine B.
2016-01-01
Objective The objective of this study was to validate the performance of a seizure detection algorithm (SDA) developed by our group on previously unseen, prolonged, unedited EEG recordings from 70 babies from two centres. Methods EEGs of 70 babies (35 seizure, 35 non-seizure) were annotated for seizures by experts as the gold standard. The SDA was tested on the EEGs at a range of sensitivity settings. Annotations from the expert and the SDA were compared using event- and epoch-based metrics. The effect of seizure duration on SDA performance was also analysed. Results Between sensitivity settings of 0.5 and 0.3, the algorithm achieved seizure detection rates of 52.6–75.0%, with false detection (FD) rates of 0.04–0.36 FD/h for event-based analysis, which was deemed acceptable in a clinical environment. Time-based comparison of expert and SDA annotations using Cohen's Kappa Index revealed a best-performing SDA threshold of 0.4 (Kappa 0.630). The SDA showed improved detection performance with longer seizures. Conclusion The SDA achieved promising performance and warrants further testing in a live clinical evaluation. Significance The SDA has the potential to improve seizure detection and provide a robust tool for comparing treatment regimens. PMID:26055336
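The time-based comparison above uses Cohen's kappa, which corrects raw per-epoch agreement for chance agreement. A minimal sketch for binary per-epoch labels follows; the example label sequences are invented, not from the study.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two binary annotation sequences, e.g. per-epoch
    seizure/no-seizure labels from an expert and a detection algorithm."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    pa1 = sum(rater_a) / n
    pb1 = sum(rater_b) / n
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

# Hypothetical per-epoch labels: expert vs. algorithm (1 = seizure epoch).
expert = [0, 0, 1, 1, 1, 0, 0, 0, 1, 0]
sda    = [0, 0, 1, 1, 0, 0, 0, 0, 1, 0]
print(round(cohens_kappa(expert, sda), 3))  # -> 0.783
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance, which is why it is preferred over raw accuracy when seizure epochs are rare.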
High infrasonic goniometry applied to the detection of a helicopter in a high activity environment
NASA Astrophysics Data System (ADS)
Chritin, Vincent; Van Lancker, Eric; Wellig, Peter; Ott, Beat
2016-10-01
A current concern of armasuisse is the feasibility of a fixed or mobile acoustic surveillance and recognition sensor network that would permanently monitor the noise immissions of a wide range of aerial activities, such as civil or military aviation, and other acoustic events such as transients and subsonic or sonic booms. This objective requires the ability to detect, localize and recognize a wide range of potential acoustic events of interest, possibly amid parasitic acoustic events (for example, natural and industrial events on the ground) and high background noise (for example, close to urban or high-activity areas). This article presents a general discussion of, and conclusions about, this problem, based on 20 years of experience totaling a dozen research programs and internal research projects by IAV, illustrated through one central experimental case study carried out within the framework of an armasuisse research program.
Observational evidence of predawn plasma bubble and its irregularity scales in Southeast Asia
NASA Astrophysics Data System (ADS)
Watthanasangmechai, K.; Tsunoda, R. T.; Yokoyama, T.; Ishii, M.; Tsugawa, T.
2016-12-01
This paper describes an event of deep plasma depletion simultaneously detected with GPS, a GNU Radio Beacon Receiver (GRBR), and in situ satellite measurements from DMSP F15. The event occurred on March 7, 2012 at 4:30 LT under geomagnetically quiet conditions. Such a sharp depletion at a plasma bubble wall detected at predawn is interesting but apparently rare; only one such event was found in the entire March 2012 dataset. The internal structure of the predawn plasma bubble was clearly captured by DMSP F15 and the ground-based GRBR. The envelope structure seen in the processed GPS-TEC appears as a cluster. The observed cluster is interpreted as the structure at the west wall of an upwelling of the large-scale wave structure, which accompanies fifty- and thousand-kilometer scales. This event is consistent with the plasma bubble structure simulated by the high-resolution bubble (HIRB) model.
Surface Management System Departure Event Data Analysis
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.
2010-01-01
This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly higher overall detection performance for runway departure events: the overall detection performance of SMS for push-back events is approximately 55%, while that for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.
Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System
NASA Astrophysics Data System (ADS)
Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.
2012-12-01
The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison materials from satellites and small bodies into space. The most notable examples of such events are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect those events in visible images. This technique will allow future outer planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. The interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface due to the higher magnitude gradient of the plume or jet when compared with the small, randomly oriented gradients of the textured surface. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that our method was able to distinguish from the background surface and surface features of a body. 
Dozens of distinct features are identifiable under a variety of viewing conditions and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of images, the ability to differentiate an event from artifacts, and the variety of event appearances in user-classified images. Other important factors include the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets which feed into the overall plume at the south pole of Enceladus). A notable strength of this method is the ability to detect events that do not extend beyond the limb of a planetary body or are adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
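The core idea above, flagging pixels where local gradient magnitude is concentrated and then classifying the clustered detections, can be sketched without any vision library. The toy image, threshold, and plume value below are invented; the actual pipeline uses SIFT keypoints and a nearest-neighbor classifier over user-classified training images.

```python
def gradient_magnitudes(img):
    """Central-difference gradient magnitude for a 2D list-of-lists image."""
    h, w = len(img), len(img[0])
    mags = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            mags[y][x] = (gx * gx + gy * gy) ** 0.5
    return mags

def interest_points(img, threshold):
    """Pixels whose gradient magnitude exceeds the threshold: candidate
    locations of a bright transient (plume/jet) against a dark background."""
    mags = gradient_magnitudes(img)
    return [(y, x) for y, row in enumerate(mags)
            for x, m in enumerate(row) if m > threshold]

# Tiny synthetic frame: dark background (0) with one bright plume pixel (9).
frame = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 9, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(interest_points(frame, threshold=3.0))  # ring of high-gradient pixels
```

A textured background produces many small, randomly oriented gradients that stay below the threshold, which is why an isolated plume stands out in this scheme.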
Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes
NASA Astrophysics Data System (ADS)
Rosenfeld, Wenjamin; Burchardt, Daniel; Garthoff, Robert; Redeker, Kai; Ortegel, Norbert; Rau, Markus; Weinfurter, Harald
2017-07-01
An experimental test of Bell's inequality allows ruling out any local-realistic description of nature by measuring correlations between distant systems. While such tests are conceptually simple, there are strict requirements concerning the detection efficiency of the involved measurements, as well as the enforcement of spacelike separation between the measurement events. Only very recently could both loopholes be closed simultaneously. Here we present a statistically significant, event-ready Bell test based on combining heralded entanglement of atoms separated by 398 m with fast and efficient measurements of the atomic spin states, closing the essential loopholes. We obtain a violation with S = 2.221 ± 0.033 (compared to the maximal value of 2 achievable with models based on local hidden variables), which allows us to refute the hypothesis of local realism with a significance level P < 2.57 × 10⁻⁹.
Boulila, Moncef
2010-06-01
To enhance the knowledge of recombination as an evolutionary process, 267 accessions retrieved from GenBank were investigated, all belonging to five economically important viruses infecting fruit crops (Plum pox, Apple chlorotic leaf spot, Apple mosaic, Prune dwarf, and Prunus necrotic ringspot viruses). Putative recombinational events were detected in the coat protein (CP)-encoding gene using RECCO and RDP version 3.31beta algorithms. Based on RECCO results, all five viruses were shown to contain potential recombination signals in the CP gene. Reconstructed trees with modified topologies were proposed. Furthermore, RECCO performed better than the RDP package in detecting recombination events and exhibiting their evolution rate along the sequences of the five viruses. RDP, however, provided the possible major and minor parents of the recombinants. Thus, the two methods should be considered complementary.
Automatic detection of lexical change: an auditory event-related potential study.
Muller-Gass, Alexandra; Roye, Anja; Kirmse, Ursula; Saupe, Katja; Jacobsen, Thomas; Schröger, Erich
2007-10-29
We investigated the detection of rare task-irrelevant changes in the lexical status of speech stimuli. Participants performed a nonlinguistic task on word and pseudoword stimuli that occurred, in separate conditions, rarely or frequently. Task performance for pseudowords was poorer than for words, suggesting unintentional lexical analysis. Furthermore, rare word and pseudoword changes had a similar effect on the event-related potentials, starting as early as 165 ms. This is the first demonstration of the automatic detection of a change in lexical status that is not based on a co-occurring acoustic change. We propose that, following lexical analysis of the incoming stimuli, a mental representation of the lexical regularity is formed and used as a template against which lexical change can be detected.
Developing assessment system for wireless capsule endoscopy videos based on event detection
NASA Astrophysics Data System (ADS)
Chen, Ying-ju; Yasen, Wisam; Lee, Jeongkyu; Lee, Dongha; Kim, Yongho
2009-02-01
Along with advances in wireless technology and miniature cameras, Wireless Capsule Endoscopy (WCE), the combination of both, enables a physician to examine a patient's digestive system without performing a surgical procedure. Although WCE is a technical breakthrough that allows physicians to visualize the entire small bowel noninvasively, viewing the video takes 1-2 hours. This is very time consuming for the gastroenterologist: it not only limits the wide application of the technology but also incurs considerable cost. It is therefore important to automate the process so that medical clinicians can focus only on events of interest. As an extension of our previous work characterizing the motility of the digestive tract in WCE videos, we propose a new assessment system for energy-based event detection (EG-EBD) to classify the events in WCE videos. The system first extracts general features of a WCE video that characterize the intestinal contractions in the digestive organs. Event boundaries are then identified using a High Frequency Content (HFC) function, and the segments are classified into WCE events using special features. Here we focus on entering the duodenum, entering the cecum, and active bleeding. The assessment system can easily be extended to discover more WCE events, such as detailed organ segmentation and additional diseases, by introducing new special features. In addition, the system provides a score for every WCE image for each event. Using the event scores, the system helps a specialist speed up the diagnosis process.
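The High Frequency Content function used above for boundary identification is a standard onset measure: each spectral bin's energy is weighted by its bin index, so sharp transients raise the value. A minimal sketch follows, using a naive DFT so it stays self-contained; the frame data, jump ratio, and boundary rule are our own illustration, not the paper's exact criterion.

```python
import cmath

def hfc(frame):
    """High Frequency Content of one signal frame: sum over DFT bins of
    bin index times squared magnitude, so high-frequency energy dominates."""
    n = len(frame)
    total = 0.0
    for k in range(n // 2):  # positive-frequency bins only
        xk = sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                 for t in range(n))
        total += k * abs(xk) ** 2
    return total

def event_boundaries(frames, ratio=2.0):
    """Flag frame indices where HFC jumps by more than `ratio` times the
    previous frame's HFC: a simple boundary/onset criterion."""
    values = [hfc(f) for f in frames]
    return [i for i in range(1, len(values))
            if values[i - 1] > 0 and values[i] / values[i - 1] > ratio]

# Hypothetical frames: two smooth frames, then one containing a sharp spike.
smooth = [0.0, 1.0, 0.0, -1.0] * 4
spiky = [0.0] * 15 + [8.0]
print(event_boundaries([smooth, smooth, spiky]))  # -> [2]
```

In practice an FFT would replace the quadratic-time DFT loop, and the flagged boundaries would delimit the segments passed to the event classifier.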
Sarntivijai, Sirarat; Xiang, Zuoshuang; Shedden, Kerby A.; Markel, Howard; Omenn, Gilbert S.; Athey, Brian D.; He, Yongqun
2012-01-01
Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding the adverse event (AE) profiles is a crucial step to identify serious AEs. Two different types of seasonal influenza vaccines have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). Different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on the data drawn from the CDC Vaccine Adverse Event Report System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using Proportional Reporting Ratio (PRR), Chi-square significance test, and base level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR>2, Chi-square score >4, and the number of cases >0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen out of 48 TIV-enriched AEs were related to neurological and muscular processing such as paralysis, movement disorders, and muscular weakness. In contrast, 15 out of 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There were evidences of two severe adverse events (Guillain-Barre Syndrome and paralysis) present in TIV. Although these severe adverse events were at low incidence rate, they were found to be more significantly enriched in TIV-vaccinated patients than LAIV-vaccinated patients. 
Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had a lower chance of inducing these two severe adverse events than TIV. In addition, our meta-analysis found that all previously reported positive correlations between GBS and influenza vaccine immunization were based on trivalent influenza vaccines rather than monovalent influenza vaccines. PMID:23209624
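As an illustration of the enrichment criteria quoted above (PRR > 2, Chi-square score > 4, case count > 0.2% of total reports), the following is a minimal sketch; the function names and the 2x2 table layout are assumptions for illustration, not part of CODAE itself:

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio for a 2x2 contingency table:
    a = reports with the vaccine and the AE, b = with the vaccine, other AEs,
    c = other vaccines with the AE, d = other vaccines, other AEs."""
    return (a / (a + b)) / (c / (c + d))

def chi_square(a, b, c, d):
    """Pearson chi-square statistic (1 df) for the same 2x2 table."""
    n = a + b + c + d
    stat = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        exp = row * col / n  # expected count under independence
        stat += (obs - exp) ** 2 / exp
    return stat

def is_enriched(a, b, c, d, total_reports):
    """Apply the thresholds quoted in the abstract: PRR > 2,
    chi-square > 4, and case count > 0.2% of all reports."""
    return (prr(a, b, c, d) > 2
            and chi_square(a, b, c, d) > 4
            and a > 0.002 * total_reports)
```

For example, a pair with 30 co-reports out of 1,000 vaccine reports against a background rate of 10 in 2,000 yields a PRR of 6 and passes all three filters.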
Surface Plasmon Resonance Label-Free Monitoring of Antibody Antigen Interactions in Real Time
ERIC Educational Resources Information Center
Kausaite, Asta; van Dijk, Martijn; Castrop, Jan; Ramanaviciene, Almira; Baltrus, John P.; Acaite, Juzefa; Ramanavicius, Arunas
2007-01-01
Detection of biologically active compounds is one of the most important topics in molecular biology and biochemistry. One of the most promising detection methods is based on the application of surface plasmon resonance for label-free detection of biologically active compounds. This method allows one to monitor binding events in real time without…
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, and collisions. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can push operators past their cognitive capacity and cause them to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
Hierarchical structure for audio-video based semantic classification of sports video sequences
NASA Astrophysics Data System (ADS)
Kolekar, M. H.; Sengupta, S.
2005-07-01
A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared to event classification in other games, that of cricket is very challenging and as yet unexplored. We have successfully solved the cricket video classification problem using a six-level hierarchical structure. The first level performs event detection based on audio energy and the Zero Crossing Rate (ZCR) of the short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP), with color or motion as the likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. Our proposed hierarchical structure can easily be applied to any other sport. Our results are very promising, and we have moved a step forward towards addressing semantic classification problems in general.
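The level-1 audio features mentioned above (short-time energy and ZCR) can be sketched as follows; the frame length, hop size, and thresholds are illustrative assumptions, not values from the paper:

```python
import numpy as np

def short_time_features(x, frame_len=1024, hop=512):
    """Short-time energy and zero-crossing rate per frame, the kind of
    level-1 features used for audio-based event detection."""
    energies, zcrs = [], []
    for start in range(0, len(x) - frame_len + 1, hop):
        frame = x[start:start + frame_len]
        energies.append(float(np.sum(frame ** 2)))
        # ZCR: fraction of adjacent sample pairs whose signs differ
        zcrs.append(float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:]))))
    return np.array(energies), np.array(zcrs)

def detect_events(x, energy_thresh, zcr_thresh, frame_len=1024, hop=512):
    """Flag frames whose energy and ZCR both exceed thresholds, e.g. a
    loud crowd response marking a candidate event."""
    e, z = short_time_features(x, frame_len, hop)
    return (e > energy_thresh) & (z > zcr_thresh)
```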
Three-dimensional, position-sensitive radiation detection
He, Zhong; Zhang, Feng
2010-04-06
Disclosed herein is a method of determining a characteristic of radiation detected by a radiation detector via a multiple-pixel event having a plurality of radiation interactions. The method includes determining a cathode-to-anode signal ratio for a selected interaction of the plurality of radiation interactions based on electron drift time data for the selected interaction, and determining the radiation characteristic for the multiple-pixel event based on both the cathode-to-anode signal ratio and the electron drift time data. In some embodiments, the method further includes determining a correction factor for the radiation characteristic based on an interaction depth of the plurality of radiation interactions, a lateral distance between the selected interaction and a further interaction of the plurality of radiation interactions, and the lateral positioning of the plurality of radiation interactions.
Bowden, Vanessa K; Loft, Shayne
2016-06-01
In 2 experiments we examined the impact of memory for prior events on conflict detection in simulated air traffic control under conditions where individuals proactively controlled aircraft and completed concurrent tasks. Individuals were faster to detect conflicts that had repeatedly been presented during training (positive transfer). Bayesian statistics indicated strong evidence for the null hypothesis that conflict detection was not impaired for events that resembled an aircraft pair that had repeatedly come close to conflicting during training. This is likely because aircraft altitude (the feature manipulated between training and test) was attended to by participants when proactively controlling aircraft. In contrast, a minor change to the relative position of a repeated nonconflicting aircraft pair moderately impaired conflict detection (negative transfer). There was strong evidence for the null hypothesis that positive transfer was not impacted by dividing participant attention, which suggests that part of the information retrieved regarding prior aircraft events was perceptual (the new aircraft pair "looked" like a conflict based on familiarity). These findings extend the effects previously reported by Loft, Humphreys, and Neal (2004), answering the recent strong and unanimous calls across the psychological science discipline to formally establish the robustness and generality of previously published effects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
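The fingerprint-based enhancement idea can be sketched with a Tanimoto coefficient over binary fingerprints; the `enhance_signal` helper and its threshold are hypothetical, illustrating only the notion of supporting an AERS signal with structural similarity:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient between two binary molecular
    fingerprints, represented here as sets of on-bit indices."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def enhance_signal(candidate_fp, known_ade_fps, threshold=0.6):
    """Hypothetical enhancement step: a drug-ADE signal mined from AERS
    gains support if the drug is structurally similar to any drug already
    known to cause the ADE."""
    sims = [tanimoto(candidate_fp, fp) for fp in known_ade_fps]
    return max(sims, default=0.0) >= threshold
```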
NASA Astrophysics Data System (ADS)
Meng, X.; Daniels, C.; Smith, E.; Peng, Z.; Chen, X.; Wagner, L. S.; Fischer, K. M.; Hawman, R. B.
2015-12-01
Since 2001, the number of M>3 earthquakes has increased significantly in the Central and Eastern United States (CEUS), likely due to waste-water injection; such events are known as "induced earthquakes" [Ellsworth, 2013]. Because induced earthquakes are driven by short-term external forcing, they may behave like earthquake swarms, which are not well characterized by branching point-process models such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. In this study we focus on the 02/15/2014 M4.1 South Carolina and the 06/16/2014 M4.3 Oklahoma earthquakes, which likely represent intraplate tectonic and induced events, respectively. For the South Carolina event, only one M3.0 aftershock is identified in the ANSS catalog, which may reflect a lack of low-magnitude events in that catalog. We apply a recently developed matched filter technique to detect earthquakes from 02/08/2014 to 02/22/2014 around the epicentral region. 15 seismic stations (from both permanent and temporary USArray networks) within 100 km of the mainshock are used for detection. The mainshock and aftershock are used as templates for the initial detection. Newly detected events are employed as new templates, and the detection procedure repeats until no new event can be added. Overall we have identified more than 10 events, including one foreshock that occurred ~11 min before the M4.1 mainshock. However, the number of aftershocks is still much smaller than predicted by the modified Båth's law. For the Oklahoma event, we use 1270 events from the ANSS catalog and 182 events from a relocated catalog as templates to scan through continuous recordings from 3 days before to 7 days after the mainshock. 12 seismic stations within the vicinity of the mainshock are included in the study.
After obtaining more complete catalogs for both sequences, we plan to compare the statistical parameters (e.g., b, a, K, and p values) between the two sequences, as well as their spatial-temporal migration pattern, which may shed light on the underlying physics of tectonic and induced earthquakes.
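A matched-filter detection of the kind described above can be sketched as a normalized cross-correlation of a template waveform against continuous data, with a MAD-based threshold; the 8-MAD level is a common choice in the matched-filter literature, not necessarily the one used in this study:

```python
import numpy as np

def normalized_cc(template, data):
    """Normalized cross-correlation of a waveform template slid along
    continuous data; values near 1 indicate a near-repeat of the template."""
    m = len(template)
    t = template - template.mean()
    t_norm = np.sqrt(np.sum(t ** 2))
    cc = np.empty(len(data) - m + 1)
    for i in range(len(cc)):
        w = data[i:i + m]
        w = w - w.mean()
        denom = t_norm * np.sqrt(np.sum(w ** 2))
        cc[i] = np.dot(t, w) / denom if denom > 0 else 0.0
    return cc

def detect(template, data, n_mad=8.0):
    """Declare detections where the CC trace exceeds n_mad median absolute
    deviations above its median, a common matched-filter criterion."""
    cc = normalized_cc(template, data)
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc > np.median(cc) + n_mad * mad)[0]
```

In iterative use, each newly detected waveform would be added to the template pool and the scan repeated, as the abstract describes.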
Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang
2014-01-01
The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and the other two methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some testing events, because SNPs in the primer/probe binding sites would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each testing event. This study provides a general P35S screening method with greater coverage than existing methods. PMID:25483893
Pacurariu, Alexandra C; Straus, Sabine M; Trifirò, Gianluca; Schuemie, Martijn J; Gini, Rosa; Herings, Ron; Mazzaglia, Giampiero; Picelli, Gino; Scotti, Lorenza; Pedersen, Lars; Arlett, Peter; van der Lei, Johan; Sturkenboom, Miriam C; Coloma, Preciosa M
2015-12-01
Spontaneous reporting systems (SRSs) remain the cornerstone of post-marketing drug safety surveillance despite their well-known limitations. Judicious use of other available data sources is essential to enable better detection, strengthening and validation of signals. In this study, we investigated the potential of electronic healthcare records (EHRs) to be used alongside an SRS as an independent system, with the aim of improving signal detection. A signal detection strategy, focused on a limited set of adverse events deemed important in pharmacovigilance, was performed retrospectively in two data sources, (1) the Exploring and Understanding Adverse Drug Reactions (EU-ADR) database network and (2) the EudraVigilance database, using data from 2000 to 2010. Five events were considered for analysis: (1) acute myocardial infarction (AMI); (2) bullous eruption; (3) hip fracture; (4) acute pancreatitis; and (5) upper gastrointestinal bleeding (UGIB). Potential signals identified in each system were verified using the current published literature. The complementarity of the two systems to detect signals was expressed as the percentage of the unilaterally identified signals out of the total number of confirmed signals. As a proxy for the associated costs, the number of signals that needed to be reviewed to detect one true signal (number needed to detect [NND]) was calculated. The relationship between the background frequency of the events and the capability of each system to detect signals was also investigated. The contribution of each system to signal detection appeared to be correlated with the background incidence of the events, being directly proportional to the incidence in EU-ADR and inversely proportional in EudraVigilance.
EudraVigilance was particularly valuable in identifying bullous eruption and acute pancreatitis (71% and 42% of signals, respectively, were correctly identified from the total pool of known associations), while EU-ADR was most useful in identifying hip fractures (60%). Both systems contributed reasonably well to the identification of signals related to UGIB (45% in EudraVigilance, 40% in EU-ADR) but only fairly for signals related to AMI (25% in EU-ADR, 20% in EudraVigilance). The costs associated with detection of signals were variable across events; however, it was often more costly to detect safety signals in EU-ADR than in EudraVigilance (median NND: 7 versus 5). An EHR-based system may have additional value for signal detection alongside already established systems, especially in the presence of adverse events with a high background incidence. While the SRS appeared to be more cost effective overall, for some events the costs associated with signal detection in the EHR might be justifiable.
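The NND proxy used above is simply the reciprocal of the positive predictive value, i.e. the number of reviewed signals divided by the number confirmed as true; a trivial sketch with illustrative numbers:

```python
def ppv(n_true, n_reviewed):
    """Positive predictive value: fraction of reviewed signals confirmed true."""
    return n_true / n_reviewed

def number_needed_to_detect(n_true, n_reviewed):
    """NND: signals that must be reviewed to find one true signal
    (the reciprocal of the PPV)."""
    return n_reviewed / n_true
```

For example, 5 confirmed signals out of 35 reviewed gives a PPV of 1/7 and an NND of 7, matching the shape (though not necessarily the provenance) of the median values quoted.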
Nadal, Anna; Coll, Anna; La Paz, Jose-Luis; Esteve, Teresa; Pla, Maria
2006-10-01
We present a novel multiplex PCR assay for simultaneous detection of multiple transgenic events in maize. Initially, five PCR primer pairs were included, specific to events Bt11, GA21, MON810, and NK603, and to the Zea mays L. endogenous gene (alcohol dehydrogenase). The event specificity was based on amplification of transgene/plant genome flanking regions, i.e., the same targets as for validated real-time PCR assays. These short and similarly sized amplicons were selected to achieve high and similar amplification efficiency for all targets; however, their unambiguous identification was a technical challenge. We achieved a clear distinction by a novel CGE approach that combined identification by size and color (CGE-SC). In one single step, all five targets were amplified and specifically labeled with three different fluorescent dyes. The assay was specific and displayed an LOD of 0.1% of each genetically modified organism (GMO). It was therefore adequate to fulfill legal thresholds such as those established in the European Union. Our CGE-SC-based strategy, in combination with an adequate labeling design, has the potential to simultaneously detect higher numbers of targets. As an example, we present the detection of up to eight targets in a single run. Multiplex PCR-CGE-SC only requires a conventional sequencer device and enables automation and high throughput. In addition, it proved to be transferable to a different laboratory. The number of authorized GMO events is growing, and the acreage of genetically modified (GM) varieties cultivated and commercialized worldwide is rapidly increasing. In this context, our multiplex PCR-CGE-SC can be suitable for screening GM contents in food.
A novel adaptive, real-time algorithm to detect gait events from wearable sensors.
Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona
2015-05-01
A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC) as the minimum of the flexion/extension angle, and the End Contact (EC) and Mid-Swing (MS) as the minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm's robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait training and/or assistive devices.
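The per-leg event rules can be sketched as extremum detection on the two shank signals; this offline sketch ignores the calibration, step-by-step update, and real-time constraints of the actual algorithm:

```python
import numpy as np

def local_minima(sig):
    """Indices of strict local minima of a 1-D signal."""
    return np.where((sig[1:-1] < sig[:-2]) & (sig[1:-1] < sig[2:]))[0] + 1

def gait_events(angle, angular_velocity):
    """Sketch of the per-leg rule set: Initial Contact at minima of the
    flexion/extension angle, End Contact at minima and Mid-Swing at maxima
    of the shank angular velocity."""
    ic = local_minima(angle)
    ec = local_minima(angular_velocity)
    ms = local_minima(-angular_velocity)  # maxima via negated signal
    return ic, ec, ms
```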
Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.
Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram
2017-02-01
In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of these are "true zeros," indicating that the drug-adverse event pair cannot occur; they are distinguished from the modeled zero counts, which simply indicate that the drug-adverse event pair has not occurred yet or has not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, also called signals. The maximum likelihood estimates of the model parameters are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
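The zero-inflated Poisson model underlying the test mixes structural "true zeros" with ordinary Poisson counts; a minimal sketch of its pmf and log-likelihood follows (the EM fitting and the likelihood ratio statistic themselves are omitted):

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: a zero arises either as a structural
    'true zero' (probability pi) or as an ordinary Poisson(lam) zero."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

def zip_loglik(counts, lam, pi):
    """Log-likelihood of observed cell counts under the ZIP model;
    maximizing this over (lam, pi), e.g. with EM, yields the MLEs that
    feed the likelihood ratio test."""
    return sum(math.log(zip_pmf(k, lam, pi)) for k in counts)
```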
A Survey of Insider Attack Detection Research
2008-08-25
modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through… forms of attack that have been reported. For example: • unauthorized extraction, duplication, or exfiltration… network level. Schultz pointed out that not one approach will work but solutions need to be based on multiple sensors to be able to find any combination
Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.
Martínez-Aquino, Andrés
2016-08-01
Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on the nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence indices and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, advances in phylogenetic biogeography that apply methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages implementing cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.
NASA Astrophysics Data System (ADS)
Radtke, J.; Sponner, J.; Jakobi, C.; Schneider, J.; Sommer, M.; Teichmann, T.; Ullrich, W.; Henniger, J.; Kormoll, T.
2018-01-01
Single photon detection applied to optically stimulated luminescence (OSL) dosimetry is a promising approach due to the low level of luminescence light and the known statistical behavior of single photon events. Time-resolved detection allows a variety of different and independent data analysis methods to be applied. Furthermore, amplitude-modulated stimulation impresses time and frequency information onto the OSL light and therefore allows for additional means of analysis. Given the impressed frequency information, Fourier transform algorithms or other digital filters can be used to separate the OSL signal from unwanted light or from events generated by other phenomena. This potentially lowers the detection limits of low-dose measurements and might improve the reproducibility and stability of the obtained data. In this work, an OSL system based on a single photon detector, a fast and accurate stimulation unit, and an FPGA is presented. Different analysis algorithms applied to the single photon data are discussed.
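The frequency-domain separation described above can be sketched as reading out the DFT bin at the stimulation frequency from binned photon counts; the binning rate, function name, and single-bin readout are assumptions for illustration, not the system's actual filter design:

```python
import numpy as np

def demodulate_counts(counts, fs, f_stim):
    """Estimate the amplitude of the photon-count component locked to the
    stimulation frequency f_stim, discriminating modulated OSL light from
    unmodulated background (a digital lock-in via the DFT bin nearest f_stim).
    counts: photon counts binned at sampling rate fs (Hz)."""
    n = len(counts)
    spectrum = np.fft.rfft(counts - np.mean(counts))  # remove DC background
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f_stim))  # bin nearest the stimulation frequency
    return 2.0 * np.abs(spectrum[k]) / n   # single-sided amplitude of that bin
```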
Austin, Peter C
2018-01-01
The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
Park, Jong-Uk; Lee, Hyo-Ki; Lee, Junghun; Urtnasan, Erdenebayar; Kim, Hojoong; Lee, Kyoung-Joung
2015-09-01
This study proposes a method of automatically classifying sleep apnea/hypopnea events based on sleep states and the severity of sleep-disordered breathing (SDB), using photoplethysmogram (PPG) and oxygen saturation (SpO2) signals acquired from a pulse oximeter. The PPG was used to classify sleep state, while the severity of SDB was estimated by detecting events of SpO2 oxygen desaturation. Furthermore, we classified sleep apnea/hypopnea events by applying different categorisations according to the severity of SDB, based on a support vector machine. The classification results showed sensitivities and positive predictive values of 74.2% and 87.5% for apnea, 87.5% and 63.4% for hypopnea, and 92.4% and 92.8% for apnea + hypopnea, respectively. These results represent outcomes comparable to or better than those of previous studies. In addition, our method reliably detected sleep apnea/hypopnea events without bias toward particular patient groups when applied to a variety of patient groups. Therefore, this method has the potential to diagnose SDB more reliably and conveniently using a pulse oximeter.
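Desaturation-event detection of the kind used to grade SDB severity can be sketched as a run-length rule on SpO2 samples below a baseline; the series-median baseline, 3-point drop, and minimum duration here are illustrative simplifications, not the paper's criteria:

```python
from statistics import median

def desaturation_events(spo2, drop=3.0, min_len=10):
    """Return (start, end) index pairs where SpO2 stays at least `drop`
    percentage points below its series median for at least `min_len`
    consecutive samples."""
    baseline = median(spo2)
    below = [s <= baseline - drop for s in spo2]
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i                      # run of low samples begins
        elif not b and start is not None:
            if i - start >= min_len:       # keep only sufficiently long runs
                events.append((start, i))
            start = None
    if start is not None and len(below) - start >= min_len:
        events.append((start, len(below)))
    return events
```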
Determining dark matter properties with a XENONnT/LZ signal and LHC Run 3 monojet searches
NASA Astrophysics Data System (ADS)
Baum, Sebastian; Catena, Riccardo; Conrad, Jan; Freese, Katherine; Krauss, Martin B.
2018-04-01
We develop a method to forecast the outcome of the LHC Run 3 based on the hypothetical detection of O(100) signal events at XENONnT. Our method relies on a systematic classification of renormalizable single-mediator models for dark matter-quark interactions and is valid for dark matter candidates of spin less than or equal to one. Applying our method to simulated data, we find that at the end of the LHC Run 3 only two mutually exclusive scenarios would be compatible with the detection of O(100) signal events at XENONnT. In the first scenario, the energy distribution of the signal events is featureless, as for canonical spin-independent interactions. In this case, if a monojet signal is detected at the LHC, dark matter must have spin 1/2 and interact with nucleons through a unique velocity-dependent operator. If a monojet signal is not detected, dark matter interacts with nucleons through canonical spin-independent interactions. In the second scenario, the spectral distribution of the signal events exhibits a bump at nonzero recoil energies. In this second case, a monojet signal can be detected at the LHC Run 3; dark matter must have spin 1/2 and interact with nucleons through a unique momentum-dependent operator. We therefore conclude that the observation of O(100) signal events at XENONnT combined with the detection, or the lack of detection, of a monojet signal at the LHC Run 3 would significantly narrow the range of possible dark matter-nucleon interactions. As argued above, it can also provide key information on the dark matter particle spin.
An iterative matching and locating technique for borehole microseismic monitoring
NASA Astrophysics Data System (ADS)
Chen, H.; Meng, X.; Niu, F.; Tang, Y.
2016-12-01
Microseismic monitoring has proven to be an effective and valuable technology for imaging hydraulic fracture geometry. The success of hydraulic fracturing monitoring relies on the detection and characterization (i.e., location and focal mechanism estimation) of the maximum number of induced microseismic events. All the events are important for quantifying the stimulated reservoir volume (SRV) and characterizing the newly created fracture network. Detecting and locating low-magnitude events, however, are notoriously difficult, particularly in a noisy production environment. Here we propose an iterative matching and locating technique (iMLT) to obtain the maximum detection of small events and the best determination of their locations from continuous data recorded by a single-azimuth downhole geophone array. Because the downhole array is located at a single azimuth, regular M&L using P-wave cross-correlation alone is unable to resolve the location of a matched event relative to the template event. We therefore introduce the polarization direction into the matching, which significantly improves the lateral resolution of the M&L method, based on numerical simulations with synthetic data. Our synthetic tests further indicate that the inclusion of S-wave cross-correlation data can help better constrain the focal depth of the matched events. We apply this method to a dataset recorded during hydraulic fracturing treatment of a pilot horizontal well within the shale play in southwest China. Our approach yields a more than fourfold increase in the number of located events compared with the original event catalog from traditional downhole processing.
Event Recognition for Contactless Activity Monitoring Using Phase-Modulated Continuous Wave Radar.
Forouzanfar, Mohamad; Mabrouk, Mohamed; Rajan, Sreeraman; Bolic, Miodrag; Dajani, Hilmi R; Groza, Voicu Z
2017-02-01
The use of remote sensing technologies such as radar is gaining popularity as a technique for contactless detection of physiological signals and analysis of human motion. This paper presents a methodology for classifying different events in a collection of phase-modulated continuous-wave radar returns. The primary application of interest is to monitor inmates, where the presence of human vital signs amidst different interferences needs to be identified. A comprehensive set of features is derived through time- and frequency-domain analyses of the radar returns. The Bhattacharyya distance is used to preselect the features with the highest class separability as candidate features for use in the classification process. Uncorrelated linear discriminant analysis is performed to decorrelate, denoise, and reduce the dimension of the candidate feature set. Linear and quadratic Bayesian classifiers are designed to distinguish breathing, different human motions, and nonhuman motions. The performance of these classifiers is evaluated on a pilot dataset of radar returns that contained different events including breathing, stopped breathing, simple human motions, and movement of a fan and water. Our proposed pattern classification system achieved accuracies of up to 93% in stationary subject detection, 90% in stopped-breathing detection, and 86% in interference detection. Our proposed radar pattern recognition system was able to accurately distinguish the predefined events amidst interferences. Besides inmate monitoring and suicide-attempt detection, this work can be extended to other radar applications such as home-based monitoring of elderly people, apnea detection, and home occupancy detection.
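Feature preselection by class separability can be sketched with the closed-form Bhattacharyya distance between two Gaussian-modeled feature distributions; modeling each class's 1-D feature samples as Gaussian is an assumption for illustration, not a claim about the paper's feature statistics:

```python
from math import log
from statistics import mean, pvariance

def bhattacharyya_gaussian(x1, x2):
    """Bhattacharyya distance between two classes, each modeled as a 1-D
    Gaussian fit to its feature samples; larger values mean better class
    separability, so features can be ranked and preselected by this score."""
    m1, m2 = mean(x1), mean(x2)
    v1, v2 = pvariance(x1), pvariance(x2)
    # first term penalizes variance mismatch, second term mean separation
    return (0.25 * log(0.25 * (v1 / v2 + v2 / v1 + 2))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))
```

Computing this score per feature across class pairs and keeping the top-ranked features mirrors the preselection step described in the abstract.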
van Holle, Lionel; Bauchau, Vincent
2014-01-01
Purpose: Disproportionality methods measure how unexpected the observed number of adverse events is. Time-to-onset (TTO) methods measure how unexpected the TTO distribution of a vaccine-event pair is compared with what is expected from other vaccines and events. Our purpose is to compare the performance associated with each method. Methods: For the disproportionality algorithms, we defined 336 combinations of stratification factors (sex, age, region and year) and threshold values of the multi-item gamma Poisson shrinker (MGPS). For the TTO algorithms, we defined 18 combinations of significance level and time windows. We used spontaneous reports of adverse events recorded for eight vaccines. The vaccine product labels were used as proxies for true safety signals. Algorithms were ranked according to their positive predictive value (PPV) for each vaccine separately; a median rank was attributed to each algorithm across vaccines. Results: The algorithm with the highest median rank was based on TTO with a significance level of 0.01 and a time window of 60 days after immunisation. It had an overall PPV 2.5 times higher than that of the highest-ranked MGPS algorithm (16th overall), which was fully stratified and had a threshold value of 0.8. A TTO algorithm with roughly the same sensitivity as the highest-ranked MGPS had better specificity but a longer time-to-detection. Conclusions: Within the scope of this study, the majority of the TTO algorithms presented a higher PPV than any MGPS algorithm. Considering the complementarity of TTO and disproportionality methods, a signal detection strategy combining them merits further investigation. PMID:24038719
A 20-year catalog comparing smooth and sharp estimates of slow slip events in Cascadia
NASA Astrophysics Data System (ADS)
Molitors Bergman, E. G.; Evans, E. L.; Loveless, J. P.
2017-12-01
Slow slip events (SSEs) are a form of aseismic strain release at subduction zones resulting in a temporary reversal in interseismic upper plate motion over a period of weeks, frequently accompanied in time and space by seismic tremor at the Cascadia subduction zone. Locating SSEs spatially along the subduction zone interface is essential to understanding the relationship between SSEs, earthquakes, and tremor and to assessing megathrust earthquake hazard. We apply an automated slope comparison-based detection algorithm to single continuously recording GPS stations to determine dates and surface displacement vectors of SSEs, then apply network-based filters to eliminate false detections. The main benefits of this algorithm are its ability to detect SSEs while they are occurring and to track the spatial migration of each event. We invert geodetic displacement fields for slip distributions on the subduction zone interface for SSEs between 1997 and 2017 using two estimation techniques: spatial smoothing and total variation regularization (TVR). Smoothing has been frequently used in determining the location of interseismic coupling, earthquake rupture, and SSE slip and yields spatially coherent but inherently blurred solutions. TVR yields compact, sharply bordered slip estimates of similar magnitude and along-strike extent to previously studied events, while fitting the constraining geodetic data as well as the corresponding smoothing-based solutions. Slip distributions estimated using TVR have up-dip limits that align well with down-dip limits of interseismic coupling on the plate interface and spatial extents that approximately correspond to the distribution of tremor concurrent with each event. TVR gives a unique view of slow slip distributions that can contribute to understanding of the physical properties that govern megathrust slip processes.
Impact Detection for Characterization of Complex Multiphase Flows
NASA Astrophysics Data System (ADS)
Chan, Wai Hong Ronald; Urzay, Javier; Mani, Ali; Moin, Parviz
2016-11-01
Multiphase flows often involve a wide range of impact events, such as liquid droplets impinging on a liquid pool or gas bubbles coalescing in a liquid medium. These events contribute to a myriad of large-scale phenomena, including breaking waves on ocean surfaces. As impacts between surfaces necessarily occur at isolated points, numerical simulations of impact events would require resolving molecular scales near the impact points for accurate modeling. This can be prohibitively expensive unless subgrid impact and breakup models are formulated to capture the effects of the interactions. The first step in a large-eddy simulation (LES) based computational methodology for complex multiphase flows such as air-sea interactions is the effective detection of these impact events. The starting point of this work is a collision detection algorithm for structured grids on a coupled level set/volume-of-fluid (CLSVOF) solver, adapted from an earlier cloth-animation algorithm that triangulates the interface with the marching cubes method. We explore the extension of collision detection to a geometric VOF solver and to unstructured grids. Supported by ONR/A*STAR (Agency for Science, Technology and Research, Singapore; Office of Naval Research, USA).
NASA Astrophysics Data System (ADS)
Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.
2017-12-01
The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes 4 components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacity for undergraduate and graduate education in Earth system and climate sciences, and related applications, to understand the basic principles and technology of real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.
The chordate proteome history database.
Levasseur, Anthony; Paganini, Julien; Dainat, Jacques; Thompson, Julie D; Poch, Olivier; Pontarotti, Pierre; Gouret, Philippe
2012-01-01
The chordate proteome history database (http://ioda.univ-provence.fr) comprises some 20,000 evolutionary analyses of proteins from chordate species. Our main objective was to characterize and study the evolutionary histories of the chordate proteome, and in particular to detect genomic events and perform automatic functional searches. Firstly, phylogenetic analyses based on high-quality multiple sequence alignments and a robust phylogenetic pipeline were performed for the whole protein and for each individual domain. Novel approaches were developed to identify orthologs/paralogs and to predict gene duplication/gain/loss events and the occurrence of new protein architectures (domain gains, losses and shuffling). These important genetic events were localized on the phylogenetic trees and on the genomic sequence. Secondly, the phylogenetic trees were enhanced by the creation of phylogroups, whereby groups of orthologous sequences created using OrthoMCL were corrected based on the phylogenetic trees; gene family size and gene gain/loss in a given lineage could be deduced from the phylogroups. For each ortholog group obtained from the phylogenetic or the phylogroup analysis, functional information and expression data can be retrieved. Database searches can be performed easily using biological objects (protein identifier, keyword or domain) but can also be based on events; e.g., domain exchange events can be retrieved. To our knowledge, this is the first database that links group clustering, phylogeny and automatic functional searches with the detection of important events occurring during genome evolution, such as the appearance of a new domain architecture.
A habituation based approach for detection of visual changes in surveillance camera
NASA Astrophysics Data System (ADS)
Sha'abani, M. N. A. H.; Adan, N. F.; Sabani, M. S. M.; Abdullah, F.; Nadira, J. H. S.; Yasin, M. S. M.
2017-09-01
This paper investigates a habituation based approach to detecting visual changes using video surveillance systems in a passive environment. Various techniques have been introduced for dynamic environments, such as motion detection, object classification and behaviour analysis. However, in a passive environment most of the scenes recorded by the surveillance system are normal, so running complex analysis continuously is computationally expensive, especially at high video resolutions. Thus, a mechanism of attention is required, in which the system responds only to abnormal events. This paper proposes a novelty detection mechanism for detecting visual changes and a habituation based approach to measure the level of novelty. The objective of the paper is to investigate the feasibility of the habituation based approach for detecting visual changes. Experimental results show that the approach is able to accurately detect the presence of novelty as deviations from the learned knowledge.
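Habituation is commonly modeled as a response that decays with repeated exposure to a familiar stimulus and recovers on novelty, gating whether the expensive analysis runs at all. A toy sketch of such a gate (parameters and names are illustrative, not the paper's model):

```python
class Habituator:
    """Response strength decays toward zero for repeated stimuli and
    recovers when the stimulus changes, gating expensive analysis."""

    def __init__(self, decay=0.5, recovery=1.0, threshold=0.2):
        self.decay, self.recovery, self.threshold = decay, recovery, threshold
        self.last = None
        self.response = recovery

    def observe(self, stimulus):
        if stimulus == self.last:
            self.response *= self.decay      # habituate to the familiar
        else:
            self.response = self.recovery    # dishabituate on novelty
        self.last = stimulus
        return self.response > self.threshold  # True -> run full analysis
```

Repeated identical scene labels quickly drop below the attention threshold, while any change restores full responsiveness.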
Britton, Jr., Charles L.; Wintenberg, Alan L.
1993-01-01
A radiation detection method and system for continuously correcting the quantization of detected charge during pulse pile-up conditions. Charge pulses from a radiation detector responsive to the energy of detected radiation events are converted, by means of a charge-sensitive preamplifier, to voltage pulses of predetermined shape whose peak amplitudes are proportional to the quantity of charge of each corresponding detected event. These peak amplitudes are sampled and stored sequentially in accordance with their respective times of occurrence. Based on the stored peak amplitudes and times of occurrence, a correction factor is generated which represents the fraction of a previous pulse's influence on the following pulse's peak amplitude. This correction factor is subtracted from the following pulse amplitude in a summing amplifier, whose output then represents the corrected charge quantity measurement.
Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems
NASA Astrophysics Data System (ADS)
Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul
Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.
JRC GMO-Matrix: a web application to support Genetically Modified Organisms detection strategies.
Angers-Loustau, Alexandre; Petrillo, Mauro; Bonfini, Laura; Gatto, Francesco; Rosa, Sabrina; Patak, Alexandre; Kreysa, Joachim
2014-12-30
The polymerase chain reaction (PCR) is the current state-of-the-art technique for DNA-based detection of Genetically Modified Organisms (GMOs). A typical control strategy starts by analyzing a sample for the presence of target sequences (GM elements) known to be present in many GMOs. Positive findings from this "screening" are then confirmed with GM event-specific test methods. Reliable knowledge of which GMOs are detected by combinations of GM-detection methods is thus crucial to minimize the verification effort. In this article, we describe a novel platform that links the information of two unique databases built and maintained by the European Union Reference Laboratory for Genetically Modified Food and Feed (EU-RL GMFF) at the Joint Research Centre (JRC) of the European Commission, one containing the sequence information of known GM events and the other validated PCR-based detection and identification methods. The new platform compiles in silico determinations of the detection of a wide range of GMOs by the available detection methods, using existing scripts that simulate PCR amplification and, when present, probe binding. The correctness of the information has been verified by comparing the in silico conclusions to experimental results for a subset of forty-nine GM events and six methods. The JRC GMO-Matrix is unique for its reliance on DNA sequence data and its flexibility in integrating novel GMOs and new detection methods. Users can mine the database using a set of web interfaces that thus provide valuable support to GMO control laboratories in planning and evaluating their GMO screening strategies. The platform is accessible at http://gmo-crl.jrc.ec.europa.eu/jrcgmomatrix/.
A model of human event detection in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1978-01-01
It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described, and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Providing the event detection model with the information displayed to the experimental subjects allows comparison of the model's performance with that of the subjects.
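The model's central step, turning displayed information into a probability that an event has occurred, can be illustrated with a two-class Gaussian discriminant (a simplified stand-in for the discriminant analysis the abstract describes; class names and priors are our own):

```python
import numpy as np

class EventProbabilityModel:
    """Two-class Gaussian discriminant: fit the mean and shared variance of a
    displayed feature under 'event' and 'no event', then return
    P(event | observation) via Bayes' rule."""

    def fit(self, x_event, x_noevent):
        self.m1, self.m0 = np.mean(x_event), np.mean(x_noevent)
        n1, n0 = len(x_event), len(x_noevent)
        # pooled (shared) variance across both classes
        self.var = (np.var(x_event) * n1 + np.var(x_noevent) * n0) / (n1 + n0)
        self.p1 = n1 / (n1 + n0)  # prior probability of an event
        return self

    def prob_event(self, x):
        def lik(m):  # unnormalized Gaussian likelihood
            return np.exp(-0.5 * (x - m) ** 2 / self.var)
        num = self.p1 * lik(self.m1)
        return num / (num + (1 - self.p1) * lik(self.m0))
```

An attention allocator could then rank processes by `prob_event` and look first at the process most likely to contain an event.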
A 300-mV 220-nW event-driven ADC with real-time QRS detection for wearable ECG sensors.
Zhang, Xiaoyang; Lian, Yong
2014-12-01
This paper presents an ultra-low-power event-driven analog-to-digital converter (ADC) with real-time QRS detection for wearable electrocardiogram (ECG) sensors in wireless body sensor network (WBSN) applications. Two QRS detection algorithms, pulse-triggered (PUT) and time-assisted PUT (t-PUT), are proposed based on the level-crossing events generated from the ADC. The PUT detector achieves 97.63% sensitivity and 97.33% positive prediction in simulation on the MIT-BIH Arrhythmia Database. The t-PUT improves the sensitivity and positive prediction to 97.76% and 98.59% respectively. Fabricated in 0.13 μm CMOS technology, the ADC with QRS detector consumes only 220 nW measured under 300 mV power supply, making it the first nanoWatt compact analog-to-information (A2I) converter with embedded QRS detector.
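A level-crossing converter emits an event whenever the input crosses one of a set of uniformly spaced quantization levels; the PUT-style detector then flags a QRS complex when events arrive in a rapid same-direction burst (a steep slope). A sketch of the event generation alone (our illustration, not the chip's actual logic):

```python
def level_crossing_events(signal, lsb=0.1):
    """Emit (sample_index, direction) pairs whenever the signal crosses a
    quantization level spaced `lsb` apart, as in a level-crossing ADC."""
    events = []
    level = round(signal[0] / lsb)  # current quantization level index
    for i, x in enumerate(signal[1:], start=1):
        while x >= (level + 1) * lsb:   # upward crossings
            level += 1
            events.append((i, +1))
        while x <= (level - 1) * lsb:   # downward crossings
            level -= 1
            events.append((i, -1))
    return events
```

A slowly varying baseline produces almost no events, which is what makes the event-driven scheme so power-efficient between QRS complexes.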
Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation
NASA Technical Reports Server (NTRS)
Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.
2016-01-01
A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
Zhou, Hanying; Homer, Margie L.; Shevade, Abhijit V.; Ryan, Margaret A.
2006-01-01
The Jet Propulsion Laboratory has recently developed and built an electronic nose (ENose) using a polymer-carbon composite sensing array. This ENose is designed to be used for air quality monitoring in an enclosed space, and is designed to detect, identify and quantify common contaminants at concentrations in the parts-per-million range. Its capabilities were demonstrated in an experiment aboard the National Aeronautics and Space Administration's Space Shuttle Flight STS-95. This paper describes a modified nonlinear least-squares based algorithm developed to analyze data taken by the ENose, and its performance for the identification and quantification of single gases and binary mixtures of twelve target analytes in clean air. Results from laboratory-controlled events demonstrate the effectiveness of the algorithm to identify and quantify a gas event if concentration exceeds the ENose detection threshold. Results from the flight test demonstrate that the algorithm correctly identifies and quantifies all registered events (planned or unplanned, as singles or mixtures) with no false positives and no inconsistencies with the logged events and the independent analysis of air samples.
Eggerth, Alphons; Modre-Osprian, Robert; Hayn, Dieter; Kastner, Peter; Pölzl, Gerhard; Schreier, Günter
2017-01-01
Automatic event detection is used in telemedicine-based heart failure disease management programs, supporting physicians and nurses in monitoring patients' health data. We analyse the performance of automatic event detection algorithms for predicting HF-related hospitalisations or diuretic dose increases. Rule-of-Thumb (RoT) and Moving Average Convergence Divergence (MACD) algorithms were applied to body weight data from 106 heart failure patients of the HerzMobil-Tirol disease management program. The evaluation criteria were based on the Youden index and ROC curves. Analysis of data from 1460 monitoring weeks with 54 events showed a maximum Youden index of 0.19 for MACD and RoT with a specificity > 0.90. Comparison of the two algorithms on real-world monitoring data showed similar results regarding total and limited AUC. An improvement in sensitivity might be possible by including additional health data (e.g. vital signs and self-reported well-being), because body weight variations are evidently not the only cause of HF-related hospitalisations or diuretic dose increases.
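An MACD-style detector compares a fast and a slow exponential moving average of the daily weight series and alarms when their divergence exceeds a limit. A sketch with illustrative window lengths and threshold (not the study's tuned parameters):

```python
def ema(series, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out, s = [], series[0]
    for x in series:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

def macd_alerts(weights, fast=3, slow=10, threshold=0.5):
    """Flag days where the fast EMA exceeds the slow EMA by `threshold` kg,
    suggesting rapid (e.g. fluid-related) weight gain."""
    f, s = ema(weights, fast), ema(weights, slow)
    return [i for i, (a, b) in enumerate(zip(f, s)) if a - b > threshold]
```

Sweeping `threshold` and computing sensitivity/specificity against observed events at each setting is what yields the ROC curves and Youden index the study reports.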
Efficient dynamic events discrimination technique for fiber distributed Brillouin sensors.
Galindez, Carlos A; Madruga, Francisco J; Lopez-Higuera, Jose M
2011-09-26
A technique to detect real-time variations of temperature or strain in Brillouin-based distributed fiber sensors is proposed and investigated in this paper. The technique is based on anomaly detection methods such as the RX algorithm. Detection and isolation of dynamic events from static ones are demonstrated by proper processing of the Brillouin gain values obtained using a standard BOTDA system. Results also suggest that better signal-to-noise ratio, dynamic range and spatial resolution can be obtained. For a pump pulse of 5 ns the spatial resolution is enhanced (from 0.541 m obtained by direct gain measurement to 0.418 m obtained with the technique presented here), since the analysis concentrates on the variation of the Brillouin gain and not only on averaging the signal over time. © 2011 Optical Society of America
Clark-Foos, Arlo; Brewer, Gene A; Marsh, Richard L; Meeks, J Thadeus; Cook, Gabriel I
2009-01-01
Event-based prospective memory tasks entail detecting cues or reminders in our environment related to previously established intentions. If they are detected at an opportune time, then the intention can be fulfilled. In Experiments 1a-1c, we gave people 3 different nonfocal intentions (e.g., respond to words denoting animals) and discovered that negatively valenced cues delivered the intention to mind less frequently than positively valenced cues. In Experiment 2, this effect was extended to valenced and neutral sentential contexts with convergent results that cues embedded in negatively valenced sentences evoked remembering the intention less often than in positive contexts. In addition, both classes of valence caused the intention to be forgotten more often than a more neutral context. We propose that valence has the ability to usurp attentional resources that otherwise would have supported successful prospective memory performance.
Hernández, Marta; Rodríguez-Lázaro, David; Zhang, David; Esteve, Teresa; Pla, Maria; Prat, Salomé
2005-05-04
The number of cultivated hectares and commercialized genetically modified organisms (GMOs) has increased exponentially in the past 9 years. Governments in many countries have established a policy of labeling all food and feed containing or produced by GMOs. Consequently, versatile, laboratory-transferable GMO detection methods are in increasing demand. Here, we describe a qualitative PCR-based multiplex method for simultaneous detection and identification of four genetically modified maize lines: Bt11, MON810, T25, and GA21. The described system is based on the use of five primers directed to specific sequences in these insertion events. Primers were used in a single optimized multiplex PCR reaction, and the sequences of the amplified fragments are reported. The assay allows amplification of the MON810 event from the 35S promoter to the hsp intron, yielding a 468 bp amplicon. Amplification of the Bt11 and T25 events from the 35S promoter to the PAT gene yielded two different amplicons of 280 and 177 bp, respectively, whereas amplification of the 5' flanking region of the GA21 event gave rise to an amplicon of 72 bp. These fragments are clearly distinguishable in agarose gels and have been reproduced successfully in a different laboratory. Hence, the proposed method comprises a rapid, simple, reliable, and sensitive (down to 0.05%) PCR-based assay, suitable for detection of these four GM maize lines in a single reaction.
Learning rational temporal eye movement strategies.
Hoppe, David; Rothkopf, Constantin A
2016-07-19
During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
Closing the loop in ICU decision support: physiologic event detection, alerts, and documentation.
Norris, P. R.; Dawant, B. M.
2001-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally, care providers should be alerted only when events are clinically significant and there is an opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and the effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts in the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users' alphanumeric pagers and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying the documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to potentially improve event definitions and enhance system utility. PMID:11825238
NASA Astrophysics Data System (ADS)
Nissen, Katrin; Ulbrich, Uwe
2016-04-01
An event-based detection algorithm for extreme precipitation is applied to a multi-model ensemble of regional climate model simulations. The algorithm determines the extent, location, duration and severity of extreme precipitation events. We assume that precipitation in excess of the local present-day 10-year return value will potentially exceed the capacity of the drainage systems that protect critical infrastructure elements. This assumption is based on legislation for the design of drainage systems which is in place in many European countries. Thus, events exceeding the local 10-year return value are detected. In this study we distinguish between sub-daily events (3-hourly) with high precipitation intensities and long-duration events (1-3 days) with high precipitation amounts. The climate change simulations investigated here were conducted within the EURO-CORDEX framework and have a horizontal resolution of approximately 12.5 km. The period 1971-2100, forced with observed and scenario (RCP 8.5 and RCP 4.5) greenhouse gas concentrations, was analysed. Examined are changes in event frequency, event duration and size. The simulations show an increase in the number of extreme precipitation events for the future climate period over most of the area, which is strongest in Northern Europe. The strength and statistical significance of the signal increase with increasing greenhouse gas concentrations. This work has been conducted within the EU project RAIN (Risk Analysis of Infrastructure Networks in response to extreme weather).
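Detection of this kind reduces to two steps: estimating the local 10-year return level and grouping consecutive threshold exceedances into events with a start, duration and peak. A simplified sketch (an empirical quantile of annual maxima stands in for the extreme-value fit such studies typically use):

```python
import numpy as np

def return_level(annual_maxima, return_period=10):
    """Empirical T-year return level: the quantile of annual-maximum
    precipitation exceeded on average once per `return_period` years."""
    q = 1.0 - 1.0 / return_period
    return np.quantile(np.asarray(annual_maxima), q)

def exceedance_events(precip, threshold):
    """Group consecutive time steps above the threshold into events,
    returning (start_index, duration, peak) tuples."""
    events, start = [], None
    for i, p in enumerate(precip):
        if p > threshold and start is None:
            start = i
        elif p <= threshold and start is not None:
            seg = precip[start:i]
            events.append((start, i - start, max(seg)))
            start = None
    if start is not None:
        events.append((start, len(precip) - start, max(precip[start:])))
    return events
```

Applying this per grid cell and counting events per climate period would give the frequency, duration and size changes the study examines.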
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchison, Janine R.; Erikson, Rebecca L.; Sheen, Allison M.
Rapid, cost-effective bacterial detection systems are needed to respond to potential biothreat events. Here we report the use of smartphone-based microscopy in combination with a simple microfluidic incubation device to detect 5000 Bacillus anthracis spores in 3 hours. This field-deployable approach is compatible with real-time PCR for secondary confirmation.
Fibre optic system for biochemical and microbiological sensing
NASA Astrophysics Data System (ADS)
Penwill, L. A.; Slater, J. H.; Hayes, N. W.; Tremlett, C. J.
2007-07-01
This poster will discuss state-of-the-art fibre optic sensors based on evanescent wave technology emphasising chemophotonic sensors for biochemical reactions and microbe detection. Devices based on antibody specificity and unique DNA sequences will be described. The development of simple sensor devices with disposable single use sensor probes will be illustrated with a view to providing cost effective field based or point of care analysis of major themes such as hospital acquired infections or bioterrorism events. This presentation will discuss the nature and detection thresholds required, the optical detection techniques investigated, results of sensor trials and the potential for wider commercial application.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess the sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation in detecting glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All eyes had glaucomatous discs and visual fields, and reliable visual fields were performed every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, Glaucoma/Guided Progression Analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity were 71% and 96% for GPA event analysis and 57% and 93% for GPA trend analysis, respectively. The 3 methods detected similar numbers of progressing cases. GPA event analysis and expert subjective assessment showed high agreement with each other and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
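The agreement and accuracy statistics reported here are standard; a compact sketch of Cohen's kappa and sensitivity/specificity for binary progression calls (our illustration, for two raters coding each eye as progressing = 1 or stable = 0):

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n              # observed agreement
    pe = (sum(a) / n) * (sum(b) / n) \
         + (1 - sum(a) / n) * (1 - sum(b) / n)              # chance agreement
    return (po - pe) / (1 - pe)

def sens_spec(pred, truth):
    """Sensitivity and specificity of binary predictions against a
    gold-standard labelling (here, expert opinion)."""
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    fp = sum(p and (not t) for p, t in zip(pred, truth))
    return tp / (tp + fn), tn / (tn + fp)
```

Treating the expert teams' consensus as `truth` and each GPA output as `pred` reproduces the kind of comparison the study performs.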
Flight Tests of the Turbulence Prediction and Warning System (TPAWS)
NASA Technical Reports Server (NTRS)
Hamilton, David W.; Proctor, Fred H.; Ahmad, Nashat N.
2012-01-01
Flight tests of the National Aeronautics and Space Administration's Turbulence Prediction And Warning System (TPAWS) were conducted in the Fall of 2000 and Spring of 2002. TPAWS is a radar-based airborne turbulence detection system. During twelve flights, NASA's B-757 tallied 53 encounters with convectively induced turbulence. Analysis of data collected during 49 encounters in the Spring of 2002 showed that the TPAWS Airborne Turbulence Detection System (ATDS) successfully detected 80% of the events at least 30 seconds prior to the encounter, achieving FAA recommended performance criteria. Details of the flights, the prevailing weather conditions, and each of the turbulence events are presented in this report. Sensor and environmental characterizations are also provided.
NASA Astrophysics Data System (ADS)
Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang
2009-09-01
Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historical signal database as well as the current measurement. This paper presents an approach to detect and predict critical events based on pattern recognition and discriminant analysis.
NASA Astrophysics Data System (ADS)
Akiyama, Yuki; Ueyama, Satoshi; Shibasaki, Ryosuke; Adachi, Ryuichiro
2016-06-01
In this study, we developed a method to detect a sudden population concentration on a certain day and in a certain area, that is, an "Event," across Japan in 2012, using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, areas and days where Events occurred were detected by aggregating mass stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated the reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night-light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
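The grid-aggregation step described above can be sketched as follows: snap each stay location to a roughly 1-km cell, count visitors per (day, cell), and flag counts far above the cell's own daily baseline. This is a simplified stand-in for the paper's method; the function names, the 0.01-degree cell size, and the threshold factor are illustrative assumptions, not the authors' parameters.

```python
from collections import Counter

def grid_cell(lat, lon, cell_deg=0.01):
    """Snap a coordinate to a grid cell (~1 km at mid-latitudes)."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def detect_events(stays, factor=3.0):
    """stays: list of (day, lat, lon) stay locations.
    Flag (day, cell) pairs whose visitor count exceeds `factor` times
    that cell's mean daily count across the whole period."""
    counts = Counter((day, grid_cell(lat, lon)) for day, lat, lon in stays)
    days = {day for day, _ in counts}
    cell_total = Counter()
    for (day, cell), n in counts.items():
        cell_total[cell] += n
    events = []
    for (day, cell), n in counts.items():
        baseline = cell_total[cell] / len(days)
        if n > factor * baseline:
            events.append((day, cell, n))
    return events

# Nine quiet days at one location, then a festival-like spike on day 9.
stays = [(day, 35.0, 139.0) for day in range(9)] + [(9, 35.0, 139.0)] * 30
events = detect_events(stays)
```

Removing year-round recurring Events, as the paper does, would correspond to an additional filter on cells flagged on many days.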
GMDD: a database of GMO detection methods
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-01-01
Background: Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is still needed. Results: The GMO Detection Method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service over the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which facilitates the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to the database, and newly submitted information is released soon after being checked. Conclusion: GMDD contains comprehensive information on GMO detection methods and will make GMO analysis much easier. PMID:18522755
A cyber-event correlation framework and metrics
NASA Astrophysics Data System (ADS)
Kang, Myong H.; Mayfield, Terry
2003-08-01
In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus future cyber-event correlation research. The framework, based on cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are categorically depicted as increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps, these richer contexts are achieved through analytical activities (situation assessment, and threat analysis & prediction). Category 1 correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 correlation clusters the same or similar events from multiple detectors located in close proximity and prioritizes them. Finally, events from different time periods and event sources at different locations/regions are correlated at Category 3 to recognize the relationships among different events; this is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber-asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.
Space-time clusters for early detection of grizzly bear predation.
Kermish-Wells, Joseph; Massolo, Alessandro; Stenhouse, Gordon B; Larsen, Terrence A; Musiani, Marco
2018-01-01
Accurate detection and classification of predation events is important to determine predation and consumption rates by predators. However, obtaining this information for large predators is constrained by the speed at which carcasses disappear and the cost of field data collection. To accurately detect predation events, researchers have used GPS collar technology combined with targeted site visits. However, kill sites are often investigated well after the predation event due to limited data retrieval options on GPS collars (VHF or UHF downloading) and to ensure crew safety when working with large predators. This can lead to missing information from small-prey (including young ungulates) kill sites due to scavenging and general site deterioration (e.g., vegetation growth). We used a space-time permutation scan statistic (STPSS) clustering method (SaTScan) to detect predation events of grizzly bears (Ursus arctos) fitted with satellite transmitting GPS collars. We used generalized linear mixed models to verify predation events and the size of carcasses using spatiotemporal characteristics as predictors. STPSS uses a probability model to compare expected cluster size (space and time) with the observed size. We applied this method retrospectively to data from 2006 to 2007 to compare our method to random GPS site selection. In 2013-2014, we applied our detection method to visit sites one week after their occupation. Both datasets were collected in the same study area. Our approach detected 23 of 27 predation sites verified by visiting 464 random grizzly bear locations in 2006-2007, 187 of which were within space-time clusters and 277 outside. Predation site detection increased by 2.75 times (54 predation events of 335 visited clusters) using 2013-2014 data. Our GLMMs showed that cluster size and duration predicted predation events and carcass size with high sensitivity (0.72 and 0.94, respectively).
Coupling GPS satellite technology with cluster detection software based on space-time probability models allows prompt visits to predation sites. This enables accurate identification of carcass size and increases fieldwork efficiency in predation studies.
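A much-simplified illustration of grouping GPS fixes in space and time follows. The study used SaTScan's space-time permutation scan statistic, which is probability-model based; the greedy grouping below, and the radius, window, and minimum-fix values, are hypothetical stand-ins chosen only to show the idea of a space-time cluster.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between (lat, lon, ...) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p[:2], *q[:2]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def spacetime_clusters(fixes, radius_m=100.0, window_h=24.0, min_fixes=6):
    """fixes: time-ordered list of (lat, lon, hour) GPS fixes.
    Greedily group fixes that stay within `radius_m` of the first fix of a
    cluster and within `window_h` hours of it; report dense groups, which
    may indicate a carcass site."""
    clusters, used = [], set()
    for i, seed in enumerate(fixes):
        if i in used:
            continue
        members = [i]
        for j in range(i + 1, len(fixes)):
            if j in used:
                continue
            if fixes[j][2] - seed[2] > window_h:
                break          # fixes are time-ordered
            if haversine_m(seed, fixes[j]) <= radius_m:
                members.append(j)
        if len(members) >= min_fixes:
            used.update(members)
            clusters.append(members)
    return clusters

# Eight hourly fixes at one site, then the bear moves on.
fixes = [(53.0, -118.0, h) for h in range(8)] \
      + [(53.0 + 0.1 * k, -118.0, 8 + k) for k in range(1, 4)]
clusters = spacetime_clusters(fixes)
```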
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize, and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise, as well as incorrect classification of arrivals, are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of using SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. We expect to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As authorized users, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g., earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical signal patterns of these events.
Moreover, comparing the performance of the support-vector network with various classical learning algorithms previously used in seismic detection and classification is an essential final step in analyzing the advantages and disadvantages of the model.
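As an illustration of the classifier at the core of the proposed method, a linear SVM can be trained with the Pegasos sub-gradient method on toy two-dimensional "event features". The real IMS pipeline, its kernel choice, and its feature set are not reproduced here; the feature clusters and every parameter below are assumptions.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM (hinge loss + L2 regularisation) with the
    Pegasos stochastic sub-gradient method. y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w) < 1:        # margin violated
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w
    return w

# Hypothetical standardised features (e.g. spectral ratio, complexity):
# earthquakes cluster in one corner, quarry blasts in the other.
rng = np.random.default_rng(1)
quakes = rng.normal([-1.0, -1.0], 0.3, size=(50, 2))
blasts = rng.normal([+1.0, +1.0], 0.3, size=(50, 2))
X = np.vstack([quakes, blasts])
X = np.hstack([X, np.ones((100, 1))])        # bias term folded into w
y = np.array([-1] * 50 + [+1] * 50)
w = pegasos_svm(X, y)
accuracy = float(np.mean(np.sign(X @ w) == y))
```

A practical system would use a kernel SVM over many waveform features; this sketch only shows the decision-boundary training.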
Ballari, Rajashekhar V; Martin, Asha; Gowda, Lalitha R
2013-01-01
Brinjal (eggplant) is an important vegetable crop, and major crop losses in brinjal are due to insect attack. Insect-resistant EE-1 brinjal has been developed and is awaiting approval for commercial release. Consumer health concerns and the implementation of international labelling legislation demand reliable analytical detection methods for genetically modified (GM) varieties. End-point and real-time polymerase chain reaction (PCR) methods were used to detect EE-1 brinjal. In end-point PCR, primer pairs specific to the 35S CaMV promoter, the NOS terminator, and the nptII gene common to other GM crops were used. Based on the revealed 3' transgene integration sequence, primers specific for the event EE-1 brinjal were designed. These primers were used for end-point single, multiplex, and SYBR-based real-time PCR. End-point single PCR showed that the designed primers were highly specific to event EE-1, with a sensitivity of 20 pg of genomic DNA, corresponding to 20 copies of the haploid EE-1 brinjal genome. The limits of detection and quantification for the SYBR-based real-time PCR assay were 10 and 100 copies, respectively. The prior development of detection methods for this important vegetable crop will facilitate compliance with any forthcoming labelling regulations. Copyright © 2012 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Hotokezaka, K.; Nissanke, S.; Hallinan, G.; Lazio, T. J. W.; Nakar, E.; Piran, T.
2016-11-01
Mergers of binary neutron stars and black hole-neutron star binaries produce gravitational-wave (GW) emission and outflows with significant kinetic energies. These outflows result in radio emission through synchrotron radiation. We explore the detectability of these synchrotron-generated radio signals through follow-up observations of GW merger events that lack a detection of electromagnetic counterparts at other wavelengths. We model radio light curves arising from (i) sub-relativistic merger ejecta and (ii) ultra-relativistic jets. The former produce radio remnants on timescales of a few years, and the latter produce γ-ray bursts in the direction of the jet and orphan radio afterglows extending over wider angles on timescales of weeks. Based on the derived light curves, we suggest an optimized survey at 1.4 GHz with five epochs separated by a logarithmic time interval. We estimate the detectability, by current and future radio facilities, of the radio counterparts of simulated GW merger events of the kind expected to be detected by advanced LIGO and Virgo. The detectable distances for these GW merger events could be as large as 1 Gpc. Around 20%-60% of the long-lasting radio remnants will be detectable in the case of a moderate kinetic energy of 3×10^50 erg and a circum-merger density of 0.1 cm^-3 or larger, while 5%-20% of the orphan radio afterglows with a kinetic energy of 10^48 erg will be detectable. The detection likelihood increases if one focuses on well-localizable GW events. We discuss the background noise due to the radio fluxes of host galaxies and false positives arising from extragalactic radio transients and variable active galactic nuclei, and we show that the quiet radio transient sky is of great advantage when searching for radio counterparts.
Real-Time Event Detection for Monitoring Natural and Source ...
The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is open-source event detection software developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted the analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events.
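CANARY's actual detection algorithms and configuration options are more sophisticated than any single rule, but the core idea, flagging departures of a sensor signal from its recent baseline, can be sketched with a trailing-window detector. The window length, threshold, and the synthetic conductivity trace below are illustrative assumptions, not CANARY parameters.

```python
import numpy as np

def detect_anomalies(signal, window=20, threshold=4.0):
    """Flag samples whose deviation from a trailing-window mean exceeds
    `threshold` trailing standard deviations (a crude stand-in for
    CANARY's multi-algorithm residual classification)."""
    flags = np.zeros(len(signal), dtype=bool)
    for t in range(window, len(signal)):
        past = signal[t - window:t]
        mu, sigma = past.mean(), past.std()
        sigma = max(sigma, 1e-6)             # guard against flat baselines
        flags[t] = abs(signal[t] - mu) > threshold * sigma
    return flags

# Synthetic conductivity trace with a spill-like step at t = 150.
rng = np.random.default_rng(0)
trace = rng.normal(500.0, 2.0, 300)
trace[150:] += 40.0                          # contamination-like event
flags = detect_anomalies(trace)
```

In a real deployment, rainfall-driven baseline shifts (noted above) would require either longer baselines or explicit operational inputs to avoid false alarms.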
NASA Astrophysics Data System (ADS)
Ellsworth, W. L.; Shelly, D. R.; Hardebeck, J.; Hill, D. P.
2017-12-01
Microseismicity often conveys the most direct information about active processes in the Earth's subsurface. However, routine network processing typically leaves most earthquakes uncharacterized. These "sub-catalog" events can provide critical clues to ongoing processes in the source region. To address this issue, we have developed waveform-based processing that leverages the existing routine catalog of earthquakes to detect and characterize "sub-catalog" events (those absent from routine catalogs). By correlating waveforms of cataloged events with the continuous data stream, we 1) identify events with similar waveform signatures in the continuous data across multiple stations, 2) precisely measure relative time lags across these stations for both P- and S-wave time windows, and 3) estimate the relative polarity between events from the sign of the peak absolute correlation value and its height above the secondary peak. When combined, these inter-event comparisons yield robust measurements, which enable sensitive event detection, relative relocation, and relative magnitude estimation. The most recent addition, focal mechanisms derived from correlation-based relative polarities, addresses a significant shortcoming in microseismicity analyses (see Shelly et al., JGR, 2016). Depending on the application, we can characterize 2-10 times as many events as are included in the initial catalog. This technique is particularly well suited to compact zones of active seismicity such as seismic swarms. Application to a 2014 swarm in Long Valley Caldera, California, illuminates complex patterns of faulting that would otherwise have remained obscured. The prevalence of such features in other environments remains an important, as yet unresolved, question.
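The first step, correlating a cataloged-event template against the continuous data stream, can be sketched with plain normalized cross-correlation. This is a simplified, single-station stand-in for the authors' multi-station processing; the threshold and synthetic wavelet are assumptions.

```python
import numpy as np

def xcorr_detect(template, stream, threshold=0.8):
    """Slide a catalog-event template along a continuous stream and
    return (lag, cc) for every normalised cross-correlation value
    above `threshold`."""
    n = len(template)
    t = template - template.mean()
    hits = []
    for lag in range(len(stream) - n + 1):
        seg = stream[lag:lag + n]
        s = seg - seg.mean()
        denom = np.sqrt((s ** 2).sum() * (t ** 2).sum())
        if denom == 0:
            continue
        cc = float((s * t).sum() / denom)
        if cc > threshold:
            hits.append((lag, cc))
    return hits

# Bury two copies of a synthetic tapered wavelet in noise.
rng = np.random.default_rng(2)
wavelet = np.sin(np.linspace(0, 6 * np.pi, 60)) * np.hanning(60)
stream = rng.normal(0, 0.1, 1000)
for onset in (200, 700):
    stream[onset:onset + 60] += wavelet
hits = xcorr_detect(wavelet, stream, threshold=0.8)
```

The authors additionally measure relative P/S lags and polarities across stations; those refinements build on exactly this kind of correlation measurement.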
The Event Detection and the Apparent Velocity Estimation Based on Computer Vision
NASA Astrophysics Data System (ADS)
Shimojo, M.
2012-08-01
The high spatial and temporal resolution data obtained by the telescopes aboard Hinode revealed new and interesting dynamics in the solar atmosphere. To detect such events and estimate their velocities automatically, we examined optical-flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and a polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods, indicating that the optical-flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
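The abstract refers to OpenCV's built-in optical-flow estimators; as a self-contained illustration of the underlying idea, a single-window Lucas-Kanade estimate can be written in a few lines of NumPy. The 40×40 Gaussian "blob" test pattern is an assumption for demonstration, not Hinode data.

```python
import numpy as np

def lucas_kanade(frame1, frame2):
    """Single-window Lucas-Kanade: solve the least-squares optical-flow
    equations Ix*vx + Iy*vy = -It over the whole patch."""
    Ix = np.gradient(frame1, axis=1)   # d/dx (columns)
    Iy = np.gradient(frame1, axis=0)   # d/dy (rows)
    It = frame2 - frame1
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    v, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return v                            # (vx, vy) in pixels/frame

# A Gaussian "blob" translated by one pixel in +x between frames.
y, x = np.mgrid[0:40, 0:40].astype(float)
f1 = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / (2 * 3.0 ** 2))
f2 = np.exp(-((x - 21) ** 2 + (y - 20) ** 2) / (2 * 3.0 ** 2))
vx, vy = lucas_kanade(f1, f2)
```

OpenCV's pyramidal implementations handle larger motions and per-feature windows; this sketch only recovers a single small translation.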
NASA Astrophysics Data System (ADS)
Takagi, R.; Obara, K.; Uchida, N.
2017-12-01
Understanding slow earthquake activity improves our knowledge of slip behavior in the brittle-ductile transition zone and of subduction processes, including megathrust earthquakes. In order to understand the overall picture of slow slip activity, it is important to build a comprehensive catalog of slow slip events (SSEs). Although short-term SSEs have been detected systematically from GNSS and tiltmeter records, the analysis of long-term slow slip events relies on individual slip inversions. We develop an algorithm to systematically detect long-term SSEs and estimate their source parameters using GNSS data. The algorithm is similar to GRiD-MT (Tsuruoka et al., 2009), which is a grid-based automatic determination of moment tensor solutions. Instead of fitting moment tensors to long-period seismic records, we estimate the parameters of a single rectangular fault to fit GNSS displacement time series. First, we construct a two-dimensional grid covering the possible locations of SSEs. Second, we estimate the best-fit parameters (length, width, slip, and rake) of the rectangular fault at each grid point by an iterative damped least squares method. Depth, strike, and dip are fixed on the plate boundary. A ramp function with a duration of 300 days is used to express the time evolution of the fault slip. Third, the grid point maximizing the variance reduction is selected as a candidate long-term SSE. We also search for the onset of the ramp function based on the grid search. We applied the method to GNSS data in southwest Japan to detect long-term SSEs in the Nankai subduction zone. With the current selection criteria, we found 13 events with Mw 6.2-6.9 in Hyuga-nada, the Bungo channel, and central Shikoku from 1998 to 2015, which include previously unreported events. A key finding is the along-strike migration of long-term SSEs from Hyuga-nada to the Bungo channel and from the Bungo channel to central Shikoku.
In particular, three successive events migrating northward in Hyuga-nada preceded the 2003 Bungo channel SSE, and one event in central Shikoku followed the 2003 SSE in the Bungo channel. The space-time dimensions of the possible along-strike migration are about 300 km in length and 6 years in time. Systematic detection with assumptions of various durations in the time evolution of SSEs may improve the picture of SSE activity and of possible interactions with neighboring SSEs.
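The per-grid-point fitting step, estimating a ramp-like transient by least squares with a grid search over onset time, can be sketched as follows. This uses one displacement component and a single free amplitude rather than the authors' full rectangular-fault parameterization; all values are synthetic assumptions.

```python
import numpy as np

def fit_ramp(t, d, duration=300.0):
    """Fit d(t) = a * ramp(t; onset, duration) + c, grid-searching the
    onset and solving for (a, c) by linear least squares; return the
    onset and variance reduction of the best fit."""
    def ramp(onset):
        return np.clip((t - onset) / duration, 0.0, 1.0)
    best = (None, -np.inf)
    for onset in t:
        G = np.stack([ramp(onset), np.ones_like(t)], axis=1)
        m, *_ = np.linalg.lstsq(G, d, rcond=None)
        res = d - G @ m
        vr = 1.0 - res @ res / ((d - d.mean()) @ (d - d.mean()))
        if vr > best[1]:
            best = (onset, vr)
    return best

# Synthetic east-component displacement: an SSE starting at day 1000.
rng = np.random.default_rng(3)
t = np.arange(0.0, 2000.0, 10.0)                      # days
true = 20.0 * np.clip((t - 1000.0) / 300.0, 0, 1)     # mm
d = true + rng.normal(0, 1.0, t.size)
onset, vr = fit_ramp(t, d)
```

The real method repeats such a fit at every spatial grid point (with fault geometry and elastic Green's functions) and keeps the grid maximizing variance reduction.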
Detection and localization capability of an urban seismic sinkhole monitoring network
NASA Astrophysics Data System (ADS)
Becker, Dirk; Dahm, Torsten; Schneider, Fabian
2017-04-01
Microseismic events linked to underground processes in sinkhole areas might serve as precursors to larger mass-dislocation or rupture events, which can cause felt ground shaking or even structural damage. To identify these weak and shallow events, a sensitive local seismic monitoring network is needed. In an urban environment, the performance of local monitoring networks is severely compromised by the high anthropogenic noise level. We study the detection and localization capability of such a network, which is already partly installed in the urban area of the city of Hamburg, Germany, within the joint project SIMULTAN (http://www.gfz-potsdam.de/en/section/near-surface-geophysics/projects/simultan/). SIMULTAN aims to monitor a known sinkhole structure and gain a better understanding of the underlying processes. The current network consists of six surface stations installed in the basements of private houses and underground structures of a research facility (DESY - Deutsches Elektronen-Synchrotron). Since the monitoring campaign started in 2015, no microseismic events have been unambiguously attributed to the sinkholes. To estimate the detection and location capability of the network, we calculate synthetic waveforms based on the locations and mechanisms of former events in the area. These waveforms are combined with the urban seismic noise recorded at the station sites. As detection algorithms, a simple STA/LTA trigger and a more sophisticated phase detector are used. While the STA/LTA detector delivers stable results and is able to detect events with a moment magnitude as low as 0.35 at a distance of 1.3 km from the source even under the present high-noise conditions, the phase detector is more sensitive but also less stable. It should be stressed that, due to the local near-surface wave-propagation conditions, the detections are generally performed on S- or surface waves and not on P-waves, which have significantly lower amplitudes.
Due to the often emergent onsets of the seismic phases of sinkhole events and the high noise conditions, the localization capability of the network is assessed by a stacking approach applied to characteristic waveforms (STA/LTA traces), in addition to traditional estimates based on travel-time uncertainties and network geometry. The effect on location accuracy of a vertical array of borehole sensors, as well as of a small-scale surface array, is also investigated. Because of the expected rather low-frequency character of the seismic signals, arrays with a small aperture (required by the close proximity to the source) exhibit considerable uncertainty in determining the azimuth of the incoming wavefront, but they can contribute to better constraining the event location. Future borehole stations, apart from significantly reducing the detection threshold, would also significantly reduce the location uncertainty. In addition, the synthetic data sets created for this study can be used to better constrain the magnitudes of the microseismic events by deriving attenuation relations for the surface waves of shallow events encountered in the sinkhole environment. This work has been funded by the German 'Geotechnologien' project SIMULTAN (BMBF03G0737A).
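The STA/LTA trigger mentioned above has a standard form: the ratio of a short-term average to a long-term average of signal energy, with a detection declared when the ratio crosses a threshold. A minimal sketch follows; the window lengths, the threshold of 4, and the synthetic trace are illustrative assumptions, not the study's settings.

```python
import numpy as np

def sta_lta(trace, nsta=20, nlta=200):
    """Delayed STA/LTA: short-term average of energy over the most recent
    `nsta` samples divided by the long-term average over the `nlta`
    samples immediately before the STA window."""
    sq = trace ** 2
    csum = np.concatenate(([0.0], np.cumsum(sq)))
    ratio = np.zeros(trace.size)
    for i in range(nsta + nlta, trace.size):
        sta = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
        lta = (csum[i + 1 - nsta] - csum[i + 1 - nsta - nlta]) / nlta
        ratio[i] = sta / max(lta, 1e-12)
    return ratio

# Noise with a weak transient starting at sample 600.
rng = np.random.default_rng(4)
trace = rng.normal(0, 1.0, 1000)
trace[600:640] += 4.0 * np.sin(np.linspace(0, 8 * np.pi, 40))
ratio = sta_lta(trace)
trigger_on = int(np.argmax(ratio > 4.0))
```

Production detectors (e.g. recursive STA/LTA variants) add de-triggering logic and per-station tuning; the ratio itself is the common core.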
Distributed Events in Sentinel: Design and Implementation of a Global Event Detector
1999-01-01
…a local event detector and a global event detector to detect events. The global event detector in this case plays the role of message sending/receiving… The system performance will decrease with an increase in the number of applications involved in global event detection… When a global composite event is detected at the GED, the whole global composite event tree is sent to the…
Representation of photon limited data in emission tomography using origin ensembles
NASA Astrophysics Data System (ADS)
Sitek, A.
2008-06-01
Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained from tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of the origins of detected events in three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g., a grid-based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a 10,000,000-event acquisition of the simulated data ranged from 0.1% to 34.8%, depending on object size and sampling density. The method was also applied to experimental data, and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to the representation and reconstruction of data obtained by photon-limited emission tomography measurements.
NASA Technical Reports Server (NTRS)
Webb, D. F.; Jackson, B. V.
1992-01-01
The zodiacal light photometers on the two Helios spacecraft have been used to detect and study mass ejections and other phenomena emanating from the sun and traversing the heliosphere within 1 AU. We have recently compiled a complete list of all of the significant white-light transient events detected by the 90-deg photometers on both Helios spacecraft. This is a preliminary report on the long-term frequency of occurrence of these events; it emphasizes newly processed data from Helios-1 from 1975 through 1982, viewed south of the ecliptic. With the large Helios photometer data base, we will be able to identify the fraction of the 90-deg events that are heliospheric CMEs and determine their characteristics.
QRS detection based ECG quality assessment.
Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter
2012-09-01
Although immediate feedback concerning ECG signal quality during recording is useful, up to now not much literature describing quality measures has been available. We have implemented and evaluated four ECG quality measures. The empty-lead criterion (A), spike-detection criterion (B), and lead-crossing-point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time were evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training set and 91.6% in the test set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. The computing time for measure D was almost five times higher than for the other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate enough for real-time feedback during ECG self-recordings, QRS-detection-based measures can further increase performance if sufficient computing power is available.
Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection
Vesperini, Fabio; Schuller, Björn
2017-01-01
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies have introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as an activation signal to detect novel events. There is no evidence of studies focused on comparing previous efforts to automatically recognize novel events from audio signals or giving a broad and in-depth evaluation of recurrent neural network-based autoencoders. The present contribution aims to consistently evaluate our recent novel approaches to fill this gap in the literature and to provide insight through extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches by up to an absolute improvement of 16.4% in average F-measure over the three databases. PMID:28182121
Perez, Miguel A; Sudweeks, Jeremy D; Sears, Edie; Antin, Jonathan; Lee, Suzanne; Hankey, Jonathan M; Dingus, Thomas A
2017-06-01
Understanding causal factors for traffic safety-critical events (e.g., crashes and near-crashes) is an important step in reducing their frequency and severity. Naturalistic driving data offers unparalleled insight into these factors, but requires identification of situations where crashes are present within large volumes of data. Sensitivity and specificity of these identification approaches are key to minimizing the resources required to validate candidate crash events. This investigation used data from the Second Strategic Highway Research Program Naturalistic Driving Study (SHRP 2 NDS) and the Canada Naturalistic Driving Study (CNDS) to develop and validate different kinematic thresholds that can be used to detect crash events. Results indicate that the sensitivity of many of these approaches can be quite low, but can be improved by selecting particular threshold levels based on detection performance. Additional improvements in these approaches are possible, and may involve leveraging combinations of different detection approaches, including advanced statistical techniques and artificial intelligence approaches, additional parameter modifications, and automation of validation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
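The kinematic-threshold idea, flagging candidate events whose peak deceleration exceeds a cutoff and scoring the cutoff by sensitivity and specificity against validated labels, can be sketched as follows. The deceleration distributions and the 0.45 g cutoff are hypothetical, not SHRP 2 or CNDS values.

```python
import numpy as np

def evaluate_threshold(decel_peaks, labels, threshold):
    """decel_peaks: peak longitudinal deceleration (g) per candidate event.
    labels: 1 for a validated crash/near-crash, 0 for normal driving.
    Returns (sensitivity, specificity) of flagging peaks >= threshold."""
    flagged = decel_peaks >= threshold
    tp = np.sum(flagged & (labels == 1))
    fn = np.sum(~flagged & (labels == 1))
    tn = np.sum(~flagged & (labels == 0))
    fp = np.sum(flagged & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical peak-deceleration samples: crash events tend higher.
rng = np.random.default_rng(5)
normal = rng.normal(0.25, 0.08, 500).clip(0)
crashes = rng.normal(0.65, 0.15, 50).clip(0)
peaks = np.concatenate([normal, crashes])
labels = np.concatenate([np.zeros(500, int), np.ones(50, int)])
sens, spec = evaluate_threshold(peaks, labels, threshold=0.45)
```

Sweeping the threshold traces out the sensitivity/specificity trade-off the abstract describes; combining several kinematic channels would be the natural next refinement.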
Motion camera based on a custom vision sensor and an FPGA architecture
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel
1998-09-01
A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC, which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics, and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation, and the FPGA architecture used in the motion camera system.
Evidence-based librarianship: what might we expect in the years ahead?
Eldredge, Jonathan D
2002-06-01
To predict the possible accomplishments of the Evidence-Based Librarianship (EBL) movement by the years 2005, 2010, 2015 and 2020. Predictive. The author draws upon recent events, relevant historical events and anecdotal accounts to detect evidence of predictable trends. The author develops a set of probable predictions for the development of EBL. Although incomplete evidence exists, some trends still seem discernible. By 2020, EBL will have become indistinguishable from mainstream health sciences librarianship/informatics practices.
Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan
2017-12-20
Biomedical event extraction is one of the most frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and arguments detection which can both be considered as classification problems. However, traditional state-of-the-art methods are based on support vector machine (SVM) with massive manually designed one-hot represented features, which require enormous work but lack semantic relation among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines context consisting of dependency-based word embedding, and task-based features represented in a distributed way as the input of deep learning models to train deep learning models. Finally, we used softmax classifier to label the example candidates. The experimental results on Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% in overall compared to the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the problems of semantic gap and dimension disaster from traditional one-hot representation methods. The promising results demonstrate that our proposed method is effective for biomedical event extraction.
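The final softmax labeling step described in the abstract above can be sketched as a dependency-free classifier over a feature vector. The weights, features, and label set here are illustrative assumptions, not the trained model from the paper:

```python
import math

def softmax(z):
    """Numerically stable softmax over a list of scores."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def classify(features, weights, biases, labels):
    """Score each candidate label as w.x + b and return the argmax
    label together with its softmax probability."""
    scores = [sum(w * x for w, x in zip(ws, features)) + b
              for ws, b in zip(weights, biases)]
    probs = softmax(scores)
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]
```

In the paper's setting, `features` would be the concatenated distributed representations (dependency-based word embeddings plus task-based features) for a trigger or argument candidate.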
NASA Astrophysics Data System (ADS)
Tremsin, A. S.; Vallerga, J. V.; McPhate, J. B.; Siegmund, O. H. W.
2015-07-01
Many high resolution event counting devices process one event at a time and cannot register simultaneous events. In this article a frame-based readout event counting detector consisting of a pair of Microchannel Plates and a quad Timepix readout is described. More than 10⁴ simultaneous events can be detected with a spatial resolution of 55 μm, while >10³ simultaneous events can be detected with <10 μm spatial resolution when event centroiding is implemented. The fast readout electronics is capable of processing >1200 frames/sec, while the global count rate of the detector can exceed 5×10⁸ particles/s when no timing information on every particle is required. For the first generation Timepix readout, the timing resolution is limited by the Timepix clock to 10-20 ns. Optimization of the MCP gain, rear field voltage and Timepix threshold levels is crucial for the device performance, and that is the main subject of this article. These devices can be very attractive for applications where photon/electron/ion/neutron counting with high spatial and temporal resolution is required, such as energy resolved neutron imaging, Time of Flight experiments in lidar applications, experiments on photoelectron spectroscopy and many others.
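The event-centroiding step mentioned in the abstract above can be sketched as a charge-weighted centroid over an event's pixel cluster. This is an illustrative routine, not the detector's actual readout code, and the cluster format is an assumption:

```python
def centroid(cluster):
    """Charge-weighted centroid of a pixel cluster.

    cluster: list of (x, y, counts) tuples making up one event
    footprint on the Timepix pixel grid. Returns a sub-pixel
    (x, y) position, which is how centroiding beats the raw
    pixel pitch."""
    total = sum(c for _, _, c in cluster)
    if total == 0:
        raise ValueError("empty cluster")
    cx = sum(x * c for x, _, c in cluster) / total
    cy = sum(y * c for _, y, c in cluster) / total
    return cx, cy
```

Multiplying the result by the pixel pitch (55 μm for Timepix) converts the centroid to a physical position.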
Method and apparatus for distinguishing actual sparse events from sparse event false alarms
Spalding, Richard E.; Grotbeck, Carter L.
2000-01-01
Remote sensing method and apparatus wherein sparse optical events are distinguished from false events. "Ghost" images of actual optical phenomena are generated using an optical beam splitter and optics configured to direct split beams to a single sensor or segmented sensor. True optical signals are distinguished from false signals or noise based on whether the ghost image is present or absent. The invention obviates the need for dual sensor systems to effect a false target detection capability, thus significantly reducing system complexity and cost.
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.
Lilly, Jonathan M
2017-04-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
Automatic Detection of Whole Night Snoring Events Using Non-Contact Microphone
Dafna, Eliran; Tarasiuk, Ariel; Zigel, Yaniv
2013-01-01
Objective Although awareness of sleep disorders is increasing, limited information is available on whole night detection of snoring. Our study aimed to develop and validate a robust, high performance, and sensitive whole-night snore detector based on non-contact technology. Design Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. Patients Sixty-seven subjects (age 52.5±13.5 years, BMI 30.8±4.7 kg/m2, m/f 40/27) referred for PSG for obstructive sleep apnea diagnoses were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. Measurements and Results To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental). A feature selection process was applied to select the most discriminative features extracted from time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4% based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2% with sensitivity of 98.0% (snore as a snore) and specificity of 98.3% (noise as noise). Conclusions Audio-based features extracted from time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients. PMID:24391903
Taking the CCDs to the ultimate performance for low threshold experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haro, Miguel; Moroni, Guillermo; Tiffenberg, Javier
2016-11-14
Scientific grade CCDs show attractive capabilities for the detection of particles with small energy deposition in matter. Their very low threshold of approximately 40 eV and their good spatial reconstruction of the event are key properties for currently running experiments: CONNIE and DAMIC. Both experiments can benefit from any increase of the detection efficiency of nuclear recoils at low energy. In this work we present two different approaches to increase this efficiency by increasing the SNR of events. The first one is based on the reduction of the readout noise of the device, which is the main contribution of uncertainty to the signal measurement. New studies on the electronic noise from the integrated output amplifier and the readout electronics will be presented, together with the result of a new configuration showing a lower limit on the readout noise that can be implemented on the current setup of the CCD based experiments. The second approach to increase the SNR of events at low energy is the study of the spatial conformation of nuclear recoil events at different depths in the active volume, through new effects that deviate from expected models based on non-interacting diffusion of electrons in the semiconductor.
Smith, Brian T; Coiro, Daniel J; Finson, Richard; Betz, Randal R; McCarthy, James
2002-03-01
Force-sensing resistors (FSRs) were used to detect the transitions between five main phases of gait for the control of electrical stimulation (ES) while walking with seven children with spastic diplegia, cerebral palsy. The FSR positions within each child's insoles were customized based on plantar pressure profiles determined using a pressure-sensitive membrane array (Tekscan Inc., Boston, MA). The FSRs were placed in the insoles so that pressure transitions coincided with an ipsilateral or contralateral gait event. The transitions between the following gait phases were determined: loading response, mid- and terminal stance, and pre- and initial swing. Following several months of walking on a regular basis with FSR-triggered intramuscular ES to the hip and knee extensors, hip abductors, and ankle dorsi and plantar flexors, the accuracy and reliability of the FSRs to detect gait phase transitions were evaluated. Accuracy was evaluated with four of the subjects by synchronizing the output of the FSR detection scheme with a VICON (Oxford Metrics, U.K.) motion analysis system, which was used as the gait event reference. While mean differences between each FSR-detected gait event and that of the standard (VICON) ranged from +35 ms (indicating that the FSR detection scheme recognized the event before it actually happened) to -55 ms (indicating that the FSR scheme recognized the event after it occurred), the difference data was widely distributed, which appeared to be due in part to both intrasubject (step-to-step) and intersubject variability. Terminal stance exhibited the largest mean difference and standard deviation, while initial swing exhibited the smallest deviation and preswing the smallest mean difference. To determine step-to-step reliability, all seven children walked on a level walkway for at least 50 steps. Of 642 steps, there were no detection errors in 94.5% of the steps. 
Of the steps that contained a detection error, 80% were due to the failure of the FSR signal to reach the programmed threshold level during the transition to loading response. Recovery from an error always occurred one to three steps later.
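The FSR threshold-crossing detection described in the abstract above can be sketched minimally as follows. The sample format is an assumption, and a clinical system would add hysteresis, per-child calibrated thresholds, and combination of multiple FSR channels per gait phase:

```python
def detect_transitions(samples, threshold):
    """Return (index, kind) pairs where an FSR signal crosses the
    threshold: 'load' on a rising crossing (foot contact increases
    pressure) and 'unload' on a falling crossing."""
    events = []
    above = samples[0] >= threshold
    for i, v in enumerate(samples[1:], start=1):
        now_above = v >= threshold
        if now_above and not above:
            events.append((i, "load"))
        elif above and not now_above:
            events.append((i, "unload"))
        above = now_above
    return events
```

The failure mode reported above (the FSR never reaching the programmed threshold during loading response) corresponds to a rising crossing that simply never fires in this scheme.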
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate several options of interest. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
Event Detection for Hydrothermal Plumes: A case study at Grotto Vent
NASA Astrophysics Data System (ADS)
Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.
2012-12-01
Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique modified from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional changes varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. 
Still, considerable complexity overlies the "normal" tidal response pattern (the data has a dominant frequency of ~12.9 hours). We are in the process of defining several events of particular scientific interest: 1) transient behavioral changes associated with atmospheric storms, earthquakes or volcanic intrusions or eruptions, 2) mutual interaction of neighboring plumes on each other's behavior, and 3) rapid shifts in plume direction that indicate the presence of unusual currents or changes in currents. We will query the existing data to see if these relationships are ever observed, as well as test our understanding of the "normal" pattern of response to tidal currents.
Figure 1. Arrows indicate plume orientation at a given time (time axis in days after 9/29/10) and stars indicate times when orientation changes rapidly.
NASA Technical Reports Server (NTRS)
Collow, Allie Marquardt; Bosilovich, Mike; Ullrich, Paul; Hoeck, Ian
2017-01-01
Extreme precipitation events can have a large impact on society through flooding that can result in property destruction, crop losses, economic losses, the spread of water-borne diseases, and fatalities. Observations indicate there has been a statistically significant increase in extreme precipitation events over the past 15 years in the Northeastern United States, and other localized regions of the country have been crippled by record flooding, for example the flooding that occurred in the Southeast United States associated with Hurricane Matthew in October 2016. Extreme precipitation events in the United States can be caused by various meteorological influences such as extratropical cyclones, tropical cyclones, mesoscale convective complexes, general air mass thunderstorms, upslope flow, fronts, and the North American Monsoon. Reanalyses, such as the Modern Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), have become a pivotal tool to study the meteorology surrounding extreme precipitation events. Using days classified as extreme precipitation events based on a combination of observational gauge and radar data, two techniques for the classification of these events are used to gather additional information that can be used to determine how events have changed over time, using atmospheric data from MERRA-2. The first is self-organizing maps, an artificial neural network that uses unsupervised learning to cluster like patterns; the second is an automated detection technique that searches for atmospheric characteristics that define a meteorological phenomenon. For example, the automated detection for tropical cyclones searches for a defined area of suppressed sea level pressure, alongside thickness anomalies aloft, indicating the presence of a warm core. These techniques are employed for extreme precipitation events in preselected regions that were chosen based on an analysis of the climatology of precipitation.
On the significance of future trends in flood frequencies
NASA Astrophysics Data System (ADS)
Bernhardt, M.; Schulz, K.; Wieder, O.
2015-12-01
Floods are a significant threat to alpine headwater catchments and to the forelands. The formation of significant flood events is often coupled to processes occurring in the alpine zone; rain-on-snow events are one example. The prediction of flood risks, or of trends in flood risk, is of major interest to people under direct threat, to policy and decision makers, and to insurance companies. Much research has been, and is being, done on detecting future trends in flood extremes and return periods. From a purely physical point of view, there is strong evidence that such trends exist. The central question, however, is whether trends in flood events or other extreme events can be detected statistically on the basis of the available data. This study investigates this question for different target parameters using long-term measurements.
Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET
NASA Astrophysics Data System (ADS)
Tetrault, M.-A.; Oliver, J. F.; Bergeron, M.; Lecomte, R.; Fontaine, R.
2010-02-01
Coincidence engines follow two main implementation approaches: timestamp-based systems and AND-gate based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, they have limited flexibility once assembled, and they are customized to fit a specific scanner's geometry. Timestamp-based systems are gathering more attention lately, especially with high channel count, fully digital systems. These new systems must however cope with high singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily use systems, a real time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp-based coincidence engine for the LabPET™, a small-animal PET scanner with up to 4608 individual readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted for any detector geometry or channel count, can be ported to newer, faster programmable devices, and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between a full processing mode for imaging protocols and a minimum processing mode to study different approaches for coincidence detection with offline software.
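The core of a timestamp-based engine like the one described above can be sketched as a scan over the time-sorted singles stream. The window value, channel labels, and the adjacent-pair simplification are illustrative assumptions; a real engine searches a full time window, handles multiple coincidences, and runs in programmable logic rather than software:

```python
def find_coincidences(singles, window):
    """Pair single events whose timestamps differ by at most `window`
    and that come from different channels.

    singles: list of (timestamp, channel) tuples. Only adjacent
    singles in time order are tested here, which suffices as a sketch
    when true coincidences dominate the stream."""
    singles = sorted(singles)
    pairs = []
    for j in range(1, len(singles)):
        t_prev, ch_prev = singles[j - 1]
        t_cur, ch_cur = singles[j]
        if t_cur - t_prev <= window and ch_prev != ch_cur:
            pairs.append(((t_prev, ch_prev), (t_cur, ch_cur)))
    return pairs
```

Because the engine works on timestamps rather than gate overlaps, the same code path serves any detector geometry or channel count, which is the flexibility argument the abstract makes.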
NASA Astrophysics Data System (ADS)
Aster, R. C.; McMahon, N. D.; Myers, E. K.; Lough, A. C.
2015-12-01
Lough et al. (2014) first detected deep sub-icecap magmatic events beneath the Executive Committee Range volcanoes of Marie Byrd Land. Here, we extend the identification and analysis of these events in space and time utilizing subspace detection. Subspace detectors provide a highly effective methodology for studying events within seismic swarms that have similar moment tensor and Green's function characteristics and are particularly effective for identifying low signal-to-noise events. Marie Byrd Land (MBL) is an extremely remote continental region that is nearly completely covered by the West Antarctic Ice Sheet (WAIS). The southern extent of Marie Byrd Land lies within the West Antarctic Rift System (WARS), which includes the volcanic Executive Committee Range (ECR). The ECR shows north-to-south progression of volcanism across the WARS during the Holocene. In 2013, the POLENET/ANET seismic data identified two swarms of seismic activity in 2010 and 2011. These events have been interpreted as deep, long-period (DLP) earthquakes based on depth (25-40 km) and low frequency content. The DLP events in MBL lie beneath an inferred sub-WAIS volcanic edifice imaged with ice penetrating radar and have been interpreted as a present location of magmatic intrusion. The magmatic swarm activity in MBL provides a promising target for advanced subspace detection and temporal, spatial, and event size analysis of an extensive deep long period earthquake swarm using a remote seismographic network. We utilized a catalog of 1,370 traditionally identified DLP events to construct subspace detectors for the six nearest stations and analyzed two years of data spanning 2010-2011. Association of these detections into events resulted in an approximate ten-fold increase in number of locatable earthquakes. 
In addition to the two previously identified swarms during early 2010 and early 2011, we find sustained activity throughout the two years of study that includes several previously unidentified periods of heightened activity. Correlation with large global earthquakes suggests that the DLP activity is not sensitive to remote teleseismic triggering.
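The subspace-detection statistic used in the study above can be sketched minimally as follows. Gram-Schmidt orthonormalization of template waveforms stands in for the SVD typically used to build the subspace in practice, and the template and window values in the test are illustrative:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def orthonormal_basis(templates):
    """Gram-Schmidt orthonormalization of template waveforms; the
    span of the result approximates the event subspace."""
    basis = []
    for t in templates:
        v = list(t)
        for b in basis:
            p = dot(v, b)
            v = [x - p * y for x, y in zip(v, b)]
        n = dot(v, v) ** 0.5
        if n > 1e-12:  # drop templates already spanned by the basis
            basis.append([x / n for x in v])
    return basis

def detection_statistic(window, basis):
    """Fraction of the window's energy captured by the subspace
    (0..1); a detection is declared when this exceeds a tuned
    threshold, which is how low-SNR swarm events are recovered."""
    energy = dot(window, window)
    if energy == 0:
        return 0.0
    return sum(dot(window, b) ** 2 for b in basis) / energy
```

Sliding `detection_statistic` along continuous data and associating above-threshold windows across stations is what produced the roughly ten-fold increase in locatable DLP events reported above.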
[Study on the timeliness of detection and reporting on public health emergency events in China].
Li, Ke-Li; Feng, Zi-Jian; Ni, Da-Xin
2009-03-01
To analyze the timeliness of detection and reporting of public health emergency events, and to explore effective strategies for improving capacity on these issues. We conducted a retrospective survey of 3275 emergency events reported through the Public Health Emergency Events Surveillance System from 2005 to the first half of 2006. A uniform self-administered questionnaire, developed by county Centers for Disease Control and Prevention, was used to collect data on the detection and reporting of the events. For communicable disease events, the median time interval between the occurrence of the first case and the detection of the event was 6 days (P25 = 2, P75 = 13). For food poisoning events and clusters of disease of unknown origin, the medians were 3 hours (P25, P75 = 16) and 1 day (P25 = 0, P75 = 5). 71.54% of the events were reported by the discoverers within 2 hours of detection. In general, the time intervals between the occurrence, detection, and reporting of events differed according to the category of event. The timeliness of detection and reporting of events could have been improved dramatically if the definitions of events, according to their characteristics, had been more reasonable and accessible, together with improved training programs for healthcare staff and teachers.
Overview and early results of the Global Lightning and Sprite Measurements mission
NASA Astrophysics Data System (ADS)
Sato, M.; Ushio, T.; Morimoto, T.; Kikuchi, M.; Kikuchi, H.; Adachi, T.; Suzuki, M.; Yamazaki, A.; Takahashi, Y.; Inan, U.; Linscott, I.; Ishida, R.; Sakamoto, Y.; Yoshida, K.; Hobara, Y.; Sano, T.; Abe, T.; Nakamura, M.; Oda, H.; Kawasaki, Z.-I.
2015-05-01
Global Lightning and Sprite Measurements on Japanese Experiment Module (JEM-GLIMS) is a space mission to conduct nadir observations of lightning discharges and transient luminous events (TLEs). The main objectives of this mission are to identify the horizontal distribution of TLEs and to solve the occurrence conditions determining the spatial distribution. JEM-GLIMS was successfully launched and started continuous nadir observations in 2012. The global distribution of the detected lightning events shows that most of the events occurred over continental regions in the local summer hemisphere. In some events, strong far-ultraviolet emissions have been simultaneously detected with N2 1P and 2P emissions by the spectrophotometers, which strongly suggests the occurrence of TLEs. In some of these events, no significant optical emission was measured by the narrowband filter camera, which suggests the occurrence of elves, not sprites. The VLF receiver also succeeded in detecting lightning whistlers, which show clear falling-tone frequency dispersion. Based on the optical data, the time delay from the detected lightning emission to the whistlers was identified as ~10 ms, which can be reasonably explained by wave propagation at the whistler group velocity. The VHF interferometer conducted spaceborne interferometric observations and succeeded in detecting VHF pulses. We observed that the VHF pulses are likely excited by lightning discharges, possibly related to in-cloud discharges, that were also measured with the JEM-GLIMS optical instruments. Thus, JEM-GLIMS provides the first full set of optical and electromagnetic data of lightning and TLEs obtained by nadir observations from space.
Foreign Object Damage Identification in Turbine Engines
NASA Technical Reports Server (NTRS)
Strack, William; Zhang, Desheng; Turso, James; Pavlik, William; Lopez, Isaac
2005-01-01
This report summarizes the collective work of a five-person team from different organizations examining the problem of detecting foreign object damage (FOD) events in turbofan engines from gas path thermodynamic and bearing accelerometer sensors, and determining the severity of damage to each component (diagnosis). Several detection and diagnostic approaches were investigated and a software tool (FODID) was developed to help researchers detect and diagnose FOD events. These approaches include (1) fan efficiency deviation computed from upstream and downstream temperature/pressure measurements, (2) gas path weighted least squares estimation of component health parameter deficiencies, (3) Kalman filter estimation of component health parameters, and (4) use of structural vibration signal processing to detect both large and small FOD events. The last three of these approaches require a significant amount of computation in conjunction with a physics-based analytic model of the underlying phenomenon: the NPSS thermodynamic cycle code for approaches 1 to 3, and the DyRoBeS reduced-order rotor dynamics code for approach 4. A potential application of the FODID software tool, in addition to its detection/diagnosis role, is using its sensitivity results to help identify the best types of sensors and their optimum locations within the gas path, and similarly for bearing accelerometers.
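Approach (2) above, weighted least squares estimation of health-parameter deficiencies, reduces in the single-parameter case to a one-line estimator. This scalar sketch is an illustration only: the report's method estimates multiple health parameters at once via full normal equations, and the residual, influence, and weight values below are assumptions, not NPSS outputs.

```python
def wls_health_estimate(residuals, influence, weights):
    """Weighted least-squares estimate of a single health-parameter
    deficiency delta (e.g., a fan efficiency shift) from gas-path
    sensor residuals, under the linear model r_i = H_i * delta,
    with weights w_i = 1 / sigma_i**2."""
    num = sum(w * h * r for w, h, r in zip(weights, influence, residuals))
    den = sum(w * h * h for w, h in zip(weights, influence))
    return num / den
```

A sudden jump in the estimated delta between successive engine snapshots is the kind of signature a FOD event would leave in this framework.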
Lin, Yin-Yan; Wu, Hau-Tieng; Hsu, Chi-An; Huang, Po-Chiun; Huang, Yuan-Hao; Lo, Yu-Lun
2016-12-07
Physiologically, the thoracic (THO) and abdominal (ABD) movement signals, captured using wearable piezo-electric bands, provide information about various types of apnea, including central sleep apnea (CSA) and obstructive sleep apnea (OSA). However, the use of piezo-electric wearables in detecting sleep apnea events has seldom been explored in the literature. This study explored the possibility of identifying sleep apnea events, including OSA and CSA, by solely analyzing one or both of the THO and ABD signals. An adaptive non-harmonic model was introduced to model the THO and ABD signals, which allows us to design features for sleep apnea events. To confirm the suitability of the extracted features, a support vector machine was applied to classify three categories: normal and hypopnea, OSA, and CSA. According to a database of 34 subjects, the overall classification accuracies were on average 75.9%±11.7% and 73.8%±4.4%, respectively, based on cross-validation. When the features determined from the THO and ABD signals were combined, the overall classification accuracy became 81.8%±9.4%. These features were applied for designing a state machine for online apnea event detection. Two event-by-event accuracy indices, S and I, were proposed for evaluating the performance of the state machine. For the same database, the S index was 84.01%±9.06%, and the I index was 77.21%±19.01%. The results indicate the considerable potential of applying the proposed algorithm to clinical examinations for both screening and homecare purposes.
REDO: RNA Editing Detection in Plant Organelles Based on Variant Calling Results.
Wu, Shuangyang; Liu, Wanfei; Aljohi, Hasan Awad; Alromaih, Sarah A; Alanazi, Ibrahim O; Lin, Qiang; Yu, Jun; Hu, Songnian
2018-05-01
RNA editing is a post-transcriptional or cotranscriptional process that changes the sequence of the precursor transcript by substitutions, insertions, or deletions. Almost all land plants undergo RNA editing in organelles (plastids and mitochondria). Although several software tools have been developed to identify RNA editing events, it remains a great challenge to distinguish true RNA editing events from genome variation, sequencing errors, and other factors. Here we introduce REDO, a comprehensive application tool for identifying RNA editing events in plant organelles based on variant call format files from RNA-sequencing data. REDO is a suite of Perl scripts that summarize the attributes of RNA editing events in figures and tables. REDO can also detect RNA editing events in multiple samples simultaneously and identify significantly differential proportions of RNA editing loci. Compared with similar tools such as REDItools, REDO runs faster with higher accuracy and specificity, at the cost of slightly lower sensitivity. Moreover, REDO annotates each RNA editing site in RNAs, whereas REDItools reports only possible RNA editing sites in the genome, which requires additional steps to obtain RNA editing profiles for RNAs. Overall, REDO can identify potential RNA editing sites easily and provides several functions such as detailed annotations, statistics, figures, and significantly differential proportions of RNA editing sites among different samples.
Twitter Seismology: Earthquake Monitoring and Response in a Social World
NASA Astrophysics Data System (ADS)
Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.
2011-12-01
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers.
This number of detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.
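The STA/LTA trigger described above can be sketched directly on a tweet-frequency time series. The window lengths and the threshold of 4.0 below are illustrative tuning choices, not the values used by the USGS.

```python
import numpy as np

def sta_lta(counts, n_sta, n_lta):
    """Classic Short-Term-Average / Long-Term-Average ratio.

    counts: 1-D array of tweets per interval; returns the ratio
    series (zero until the long window is full or when LTA is zero)."""
    counts = np.asarray(counts, dtype=float)
    ratio = np.zeros_like(counts)
    for i in range(n_lta, len(counts)):
        sta = counts[i - n_sta + 1 : i + 1].mean()
        lta = counts[i - n_lta + 1 : i + 1].mean()
        if lta > 0:
            ratio[i] = sta / lta
    return ratio

# Flat background of ~5 "earthquake" tweets/minute with a felt event
# producing a burst of 60 tweets/minute starting at minute 80.
counts = np.full(120, 5.0)
counts[80:90] = 60.0
r = sta_lta(counts, n_sta=3, n_lta=30)
detections = np.where(r > 4.0)[0]  # tuned trigger threshold
```

Raising the threshold trades missed events for fewer false triggers, exactly the tuning trade-off the abstract describes.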
Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.P.
2008-01-01
Remote sensing techniques have been shown to be effective for large-scale damage surveys after a hazardous event, in both near real-time and post-event analyses. This paper aims to compare the accuracy of common image processing techniques for detecting tornado damage tracks from Landsat TM data. We employed a direct change detection approach using two sets of images acquired before and after the tornado event to produce a principal component composite image and a set of image difference bands. Techniques in the comparison include supervised classification, unsupervised classification, and an object-oriented classification approach with a nearest neighbor classifier. Accuracy assessment is based on the Kappa coefficient calculated from error matrices which cross-tabulate correctly identified cells on the TM image against commission and omission errors in the result. Overall, the object-oriented approach exhibits the highest degree of accuracy in tornado damage detection. PCA and image differencing show comparable outcomes. While selected PCs can improve detection accuracy by 5 to 10%, the object-oriented approach performs significantly better, with 15-20% higher accuracy than the other two techniques. © 2008 by MDPI.
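The Kappa coefficient used for the accuracy assessment above is computed from the error (confusion) matrix; a compact implementation, with a hypothetical two-class damage/no-damage matrix for illustration:

```python
import numpy as np

def kappa(confusion):
    """Cohen's Kappa from an error matrix whose rows are reference
    classes and columns are mapped classes: agreement beyond chance."""
    m = np.asarray(confusion, dtype=float)
    n = m.sum()
    p_o = np.trace(m) / n                      # observed agreement
    p_e = (m.sum(0) * m.sum(1)).sum() / n**2   # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical 2-class matrix: damage vs. no-damage pixels.
cm = [[80, 20],
      [10, 90]]
k = kappa(cm)
```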
Ma, Xingyi; Sim, Sang Jun
2013-03-21
Even though DNA-based nanosensors have been demonstrated for quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for further understanding of DNA mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate the hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combined in a suitable orientation following the hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single mismatched (MMT) and perfectly matched DNA (PMT) were different. Therefore, we obtained an optimized salt concentration that allowed for discrimination of MMT from PMT without stringent control of temperature or pH. The results indicated this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.
Pulse Detecting Genetic Circuit – A New Design Approach
Noman, Nasimul; Inniss, Mara; Iba, Hitoshi; Way, Jeffrey C.
2016-01-01
A robust cellular counter could enable synthetic biologists to design complex circuits with diverse behaviors. The existing synthetic-biological counters, responsive to the beginning of the pulse, are sensitive to the pulse duration. Here we present a pulse detecting circuit that responds only at the falling edge of a pulse, analogous to negative edge triggered electric circuits. As biological events do not follow precise timing, use of such a pulse detector would enable the design of robust asynchronous counters which can count the completion of events. This transcription-based pulse detecting circuit depends on the interaction of two co-expressed lambdoid phage-derived proteins: the first is unstable and inhibits the regulatory activity of the second, stable protein. At the end of the pulse the unstable inhibitor protein disappears from the cell and the second protein triggers the recording of the event completion. Using stochastic simulation we showed that the proposed design can detect the completion of the pulse irrespective of the pulse duration. In our simulations we also showed that, by fusing the pulse detector with a phage lambda memory element, we can construct a counter which can be extended to count larger numbers. The proposed design principle is a new control mechanism for synthetic biology which can be integrated into different circuits for identifying the completion of an event. PMID:27907045
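The falling-edge logic can be captured in a deterministic toy model: an input pulse co-expresses an unstable inhibitor I and a stable regulator R, and the output fires only once I has decayed after the pulse ends. The degradation rates and thresholds below are illustrative, not fitted to the phage proteins, and the paper's analysis is stochastic rather than deterministic.

```python
def simulate(pulse_len, total=200, deg_fast=0.5, deg_slow=0.02):
    """Discrete-time toy model of the falling-edge pulse detector.

    While the pulse is on, both proteins are produced; I degrades
    fast, R slowly. The output is active only when R has accumulated
    and the inhibitor I has essentially vanished (pulse has ended)."""
    I = R = 0.0
    out = []
    for t in range(total):
        prod = 1.0 if t < pulse_len else 0.0
        I += prod - deg_fast * I
        R += prod - deg_slow * R
        out.append(1 if (R > 1.0 and I < 0.1) else 0)
    return out

short = simulate(pulse_len=20)
long_ = simulate(pulse_len=60)
fire_short = short.index(1)   # first firing time, short pulse
fire_long = long_.index(1)    # first firing time, long pulse
```

Both runs fire only after their pulse ends, with the same latency regardless of pulse duration, which is the duration-insensitivity the design targets.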
Machine intelligence-based decision-making (MIND) for automatic anomaly detection
NASA Astrophysics Data System (ADS)
Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas
2007-04-01
Any event deemed out-of-the-ordinary may be called an anomaly. Anomalies, by virtue of their definition, are events that occur spontaneously with no prior indication of their existence or appearance. The effects of anomalies are typically unknown until they actually occur, and these effects aggregate over time to produce a noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused it can be detected early and any consequence attributed to the specific anomaly. Substantial time and effort are required to back-track the cause of abnormal behavior and to recreate the event sequence leading to it. There is therefore a critical need to automatically detect anomalous behavior as and when it occurs, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control, where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.
NASA Astrophysics Data System (ADS)
Sato, Mitsuteru; Mihara, Masahiro; Ushio, Tomoo; Morimoto, Takeshi; Kikuchi, Hiroshi; Adachi, Toru; Suzuki, Makoto; Yamazaki, Atsushi; Takahashi, Yukihiro
2015-04-01
JEM-GLIMS has been conducting comprehensive nadir observations of lightning and TLEs using optical instruments and electromagnetic wave receivers since November 2012. Between November 20, 2012 and November 30, 2014, JEM-GLIMS succeeded in detecting 5,048 lightning events. A total of 567 of these 5,048 lightning events were TLEs, mostly elves. To identify sprite occurrences in the transient optical flash data, it is necessary to perform the following data analysis: (1) subtraction of the appropriately scaled wideband camera data from the narrowband camera data; (2) calculation of the intensity ratio between different spectrophotometer channels; and (3) estimation of the polarity and CMC of the parent CG discharges using ground-based ELF measurement data. From a synthetic comparison of these results, it is confirmed that JEM-GLIMS succeeded in detecting sprite events. The VHF receiver (VITF) onboard JEM-GLIMS uses two patch-type antennas separated by a 1.6-m interval and can detect VHF pulses emitted by lightning discharges in the 70-100 MHz frequency range. Using both an interferometric technique and a group delay technique, we can estimate the source locations of VHF pulses excited by lightning discharges. In the event detected at 06:41:15.68565 UT on June 12, 2014 over central North America, the sprite was displaced horizontally by 20 km from the peak location of the parent lightning emission. In this event, a total of 180 VHF pulses were simultaneously detected by VITF. Detailed analysis of these VHF pulse data shows that the majority of the source locations were placed near the area of the dim lightning emission, which may imply that the VHF pulses were associated with the in-cloud lightning current.
At the presentation, we will show detailed comparison between the spatiotemporal characteristics of sprite emission and source locations of VHF pulses excited by the parent lightning discharges of sprites.
NASA Astrophysics Data System (ADS)
Le Bras, R.; Rozhkov, M.; Bobrov, D.; Kitov, I. O.; Sanina, I.
2017-12-01
Association of weak seismic signals generated by low-magnitude aftershocks of the DPRK underground tests into event hypotheses represents a challenge for routine automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization, due to the relatively low station density of the International Monitoring System (IMS) seismic network. Since 2011, as an alternative, the IDC has been testing various prototype techniques of signal detection and event creation based on waveform cross correlation. Using signals measured by IMS seismic stations from DPRK explosions as waveform templates, the IDC detected several small (estimated mb between 2.2 and 3.6) seismic events after the two DPRK tests conducted on September 9, 2016 and September 3, 2017. The obtained detections were associated into reliable event hypotheses and then used to locate these events relative to the epicenters of the DPRK explosions. We observe high similarity of the detected signals with the corresponding waveform templates. The newly found signals also correlate well with each other. In addition, the signal-to-noise ratios (SNR) estimated from the traces of cross correlation coefficients increase with template length (from 5 s to 150 s), providing strong evidence of their spatial closeness, which allows interpreting them as explosion aftershocks. We estimated the relative magnitudes of all aftershocks using the ratio of RMS amplitudes of the master and slave signals in the cross correlation windows with the highest SNR. Additional waveform data from regional non-IMS stations MDJ and SEHB provide independent validation of these aftershock hypotheses. Since waveform templates from any single master event may be sub-optimal at some stations, we have also developed a method that jointly uses templates from the DPRK tests and the largest aftershocks to build more robust event hypotheses.
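The core of such template matching is a normalized cross-correlation scan of a continuous trace against a master-event waveform. A single-station sketch with synthetic data (the template shape, noise level, and 0.7 threshold are illustrative, not IDC processing parameters):

```python
import numpy as np

def xcorr_detect(trace, template, threshold=0.7):
    """Slide a waveform template over a continuous trace and return
    sample offsets where the normalized cross-correlation exceeds
    the threshold (a simplified single-station master-event detector)."""
    n = len(template)
    t = template - template.mean()
    t /= np.linalg.norm(t)
    ccs = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n] - trace[i:i + n].mean()
        norm = np.linalg.norm(w)
        ccs.append(np.dot(w, t) / norm if norm > 0 else 0.0)
    ccs = np.array(ccs)
    return np.where(ccs > threshold)[0], ccs

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * np.arange(100) / 20) * np.hanning(100)
trace = 0.1 * rng.normal(size=1000)
trace[400:500] += 0.5 * template        # buried "aftershock" at sample 400
hits, ccs = xcorr_detect(trace, template)
```

Longer templates average down incoherent noise, which is why the abstract's correlation-trace SNR grows with template length for co-located events.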
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.; Rowe, C. A.
2016-12-01
Unlike other subduction zones, the Cascadia subduction zone (CSZ) is notable for the absence of detected and located small- and moderate-magnitude interplate earthquakes, despite the presence of recurring episodic tremor and slip (ETS) downdip and evidence of prehistoric great earthquakes. Thermal and geodetic models indicate that the seismogenic zone lies primarily, if not entirely, offshore; therefore the perceived unusual seismic quiescence may be a consequence of seismic source locations relative to land-based seismometers. The Cascadia Initiative (CI) amphibious community seismic experiment includes ocean bottom seismometers (OBS) deployed directly above the presumed locked seismogenic zone. We use the CI dataset to search for small-magnitude interplate earthquakes previously undetected using the on-land sensors alone. We implement subspace detection to search for small earthquakes, building our subspace with template events from existing earthquake catalogs that appear to have occurred on the plate interface and windowing waveforms on CI OBS and land seismometers. Although our efforts will target the entire CSZ margin and the full 4-year CI deployment, here we focus on a previously identified cluster off the coast of Oregon, related to a subducting seamount. The first year of CI deployment yields 293 unique detections in this target area, with 86 well-located events. Thirty-two of these events occurred within the seamount cluster, and 13 events were located in another cluster to the northwest of the seamount. Events within the seamount cluster separate into those whose depths place them on the plate interface and a shallower set (~5 km depth). These event groups track together temporally and seem to agree with a model of seamount subduction in which extensive fracturing develops around the seamount, rather than stress concentrating at the seamount-plate boundary. During CI year 2, this target area yields more than 1,000 additional event detections.
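Subspace detection generalizes single-template matching: an orthonormal basis spanning a family of template waveforms is built by SVD, and the detector scans the data with the fraction of window energy captured by that subspace. A minimal sketch on synthetic waveforms (the subspace dimension and signal shapes are illustrative):

```python
import numpy as np

def subspace_detector(templates, data, dim=2):
    """Build a signal subspace from normalized templates via SVD and
    scan the data with c(i) = ||U^T w||^2 / ||w||^2 over sliding
    windows w; c near 1 means the window lies in the template family."""
    T = np.array([t / np.linalg.norm(t) for t in templates])
    U = np.linalg.svd(T.T, full_matrices=False)[0][:, :dim]
    n = T.shape[1]
    stat = np.zeros(len(data) - n + 1)
    for i in range(len(stat)):
        w = data[i:i + n]
        e = np.dot(w, w)
        if e > 0:
            p = U.T @ w
            stat[i] = np.dot(p, p) / e
    return stat

rng = np.random.default_rng(3)
base = np.sin(2 * np.pi * np.arange(80) / 16) * np.hanning(80)
templates = [base + 0.05 * rng.normal(size=80) for _ in range(4)]
data = 0.1 * rng.normal(size=600)
data[250:330] += 0.4 * base     # hidden repeat of the template family
stat = subspace_detector(templates, data)
peak = int(stat.argmax())
```

Unlike a single correlation template, the subspace tolerates moderate waveform variability across the event family, which suits repeating clusters like the one near the seamount.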
First Light Detected from Gravitational Wave Event on This Week @NASA – October 20, 2017
2017-10-20
For the first time, NASA scientists have detected light tied to a gravitational-wave event. The gravitational wave – caused by an explosive merger of two neutron stars, about 130 million light-years from Earth – produced a gamma-ray burst and a rarely seen flare-up called a "kilonova". The phenomenon was captured by our Fermi, Swift, Hubble, Chandra and Spitzer missions, along with dozens of NASA-funded ground-based observatories. Also, Trio of Station Spacewalks Completed, Fresh Findings from Cassini, and Test of SLS RS-25 Flight Engine!
Body-borne IED detection: NATO DAT#10 BELCOAST 09 demonstration results
NASA Astrophysics Data System (ADS)
Alexander, Naomi; Gómez, Ignacio; Ortega, Isabel; Fiore, Franco; Coman, Cristian
2010-04-01
Belgium leads the tenth initiative in the CNAD Programme of Work for Defense Against Terrorism (PoW DAT), dealing with Critical Infrastructure Protection (CIP). The BELCOAST 09 event, comprising a series of technology demonstrations, was organized to address the need for an event that brings together the operational, armaments, and technological communities in the field of CIP. A counter-terrorism scenario was created, in which a terrorist with a body-borne IED approaches the entrance of an installation, and a millimeter-wave imager's ability to detect IEDs was demonstrated. The results of this scenario-based demonstration are presented in this paper.
Patwardhan, Supriya; Dasari, Srikanth; Bhagavatula, Krishna; Mueller, Steffen; Deepak, Saligrama Adavigowda; Ghosh, Sudip; Basak, Sanjay
2015-01-01
An efficient PCR-based method to trace genetically modified food and feed products is in demand due to regulatory requirements and contamination issues in India. However, post-PCR detection with conventional methods has limited sensitivity in amplicon separation, which is crucial in multiplexing. This study aimed to develop a sensitive post-PCR detection method using PCR-chip capillary electrophoresis (PCR-CCE) to detect and identify specific genetically modified organisms in a genomic DNA mixture by targeting event-specific nucleotide sequences. Using the PCR-CCE approach, novel multiplex methods were developed to detect MON531 cotton, EH 92-527-1 potato, Bt176 maize, GT73 canola, or GA21 maize simultaneously when their genomic DNAs in mixtures were amplified using their primer mixture. The repeatability RSD (RSDr) of the peak migration time was 0.06 and 3.88% for MON531 and Bt176, respectively. The reproducibility RSD (RSDR) of the Cry1Ac peak ranged from 0.12 to 0.40% in the multiplex methods. The method was sensitive enough to resolve amplicons differing in size by as little as 4 bp. The PCR-CCE method is suitable for detecting multiple genetically modified events in a composite DNA sample by tagging their event-specific sequences.
Automatic arrival time detection for earthquakes based on Modified Laplacian of Gaussian filter
NASA Astrophysics Data System (ADS)
Saad, Omar M.; Shalaby, Ahmed; Samy, Lotfy; Sayed, Mohammed S.
2018-04-01
Precise identification of the onset time of an earthquake is essential for accurately computing the earthquake's location and other parameters used in building seismic catalogues. P-wave arrivals of weak events or micro-earthquakes cannot be precisely determined due to background noise. In this paper, we propose a novel approach based on a Modified Laplacian of Gaussian (MLoG) filter to detect the onset time even at very low signal-to-noise ratios (SNRs). The proposed algorithm utilizes a denoising-filter stage to smooth the background noise, employing the MLoG mask to filter the seismic data. Afterward, we apply a dual-threshold comparator to detect the onset time of the event. The results show that the proposed algorithm can accurately detect the onset time of micro-earthquakes at SNRs as low as -12 dB. The proposed algorithm achieves an onset time picking accuracy of 93% with a standard deviation error of 0.10 s on 407 field seismic waveforms. We also compare the results with the short-term average / long-term average (STA/LTA) algorithm and the Akaike Information Criterion (AIC), and the proposed algorithm outperforms both.
On the reliable use of satellite-derived surface water products for global flood monitoring
NASA Astrophysics Data System (ADS)
Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.
2015-12-01
Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at the global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood events reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
A Search for Neutrinos from Fast Radio Bursts with IceCube
NASA Astrophysics Data System (ADS)
Fahey, Samuel; Kheirandish, Ali; Vandenbroucke, Justin; Xu, Donglian
2017-08-01
We present a search for neutrinos in coincidence in time and direction with four fast radio bursts (FRBs) detected by the Parkes and Green Bank radio telescopes during the first year of operation of the complete IceCube Neutrino Observatory (2011 May through 2012 May). The neutrino sample consists of 138,322 muon neutrino candidate events, which are dominated by atmospheric neutrinos and atmospheric muons but also contain an astrophysical neutrino component. Considering only neutrinos detected on the same day as each FRB, zero IceCube events were found to be compatible with the FRB directions within the estimated 99% error radius of the neutrino directions. Based on the non-detection, we present the first upper limits on the neutrino fluence from FRBs.
Observations of Sprites and Elves Associated With Winter Thunderstorms in the Eastern Mediterranean
NASA Astrophysics Data System (ADS)
Ganot, M.; Yair, Y.; Price, C.; Ziv, B.; Sherez, Y.; Greenberg, E.; Devir, A.; Yaniv, R.; Bor, J.; Satori, G.
2006-12-01
The results of the 2005-2006 winter sprite campaign in Israel are reported. We conducted optical ground-based observations aiming to detect transient luminous events (TLEs) above winter thunderstorms in Israel and in the area over the Mediterranean Sea between Israel, Cyprus and Lebanon. We alternated between two observation sites: the Tel-Aviv University campus in central Tel-Aviv (32.5N, 34.5E) and the Wise astronomical observatory in the Negev desert, near Mitzpe-Ramon (30N, 34.5E). We used 2 WATEC cameras, mounted on a pan-and-tilt unit with GPS time-base and event-detection software (UFO-Capture). The system was remote-controlled via the Internet, and targets were chosen in real time based on lightning locations derived from a BOLTEK lightning detection system stationed in Tel-Aviv. Detailed weather forecasts and careful analysis of lightning probability allowed us to choose between the two observation sites. The optical campaign was accompanied by ELF and VLF electromagnetic measurements from the existing TAU array in southern Israel. During five separate winter storms (December 2005 through March 2006) we detected 31 events: 27 sprites (4 halo sprites) and 4 elves. Detection ranges varied from 250 to 450 km. Sprites were found to occur almost exclusively over the sea, in the height range 44-105 km. Most sprites were columnar, and the number of elements varied from 1 to 9, with lengths varying from 10 to 48 km. The average duration of sprites was ~43 ms. All TLEs were accompanied by distinct positive ELF transients, which were clearly identified by our ELF station in Mitzpe-Ramon and by the ELF station near Sopron, Hungary (range ~2500 km). Calculated charge moment values were 800-1870 C·km, with some events exceeding 2500 C·km. We employed different lightning location systems (Israel Electrical Company LPATS and TOGA, ZEUS global networks) to determine the ground location of the parent lightning and succeeded in geo-locating 7 events.
Based on weather radar and satellite images, it was found that most of the thunderclouds that produced sprites were isolated Cumulonimbus cells embedded within a matrix of lower rain clouds, associated with the cold sector of Cyprus lows. The relationship between the meteorological parameters, storm size, vertical cloud development and lightning properties, as well as a comparison with the properties of thunderstorms producing winter sprites in Japan, will be presented.
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
NASA Astrophysics Data System (ADS)
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases recorded by USArray. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform time-slowness analysis of the signals and correlate it with the time-slowness signature of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 2010 February 1. This event, which lasted one day, has a narrow frequency band around 0.1 Hz and occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by the interference between oceanic waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the South East Indian Ridge.
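The time-slowness analysis rests on plane-wave delay-and-sum beamforming: each trial slowness vector defines per-station delays, and beam power peaks at the slowness of a coherent arrival. A one-dimensional-array sketch with synthetic data (station geometry, slowness grid, and signal are invented for illustration):

```python
import numpy as np

def slowness_beam(waveforms, coords, slowness_grid, dt):
    """Plane-wave delay-and-sum beamforming: for each trial slowness
    vector s, advance each station trace by s . r and stack; the
    stacked beam power peaks at the slowness of a coherent arrival."""
    n_samp = waveforms.shape[1]
    powers = []
    for sx, sy in slowness_grid:
        beam = np.zeros(n_samp)
        for trace, (x, y) in zip(waveforms, coords):
            shift = int(round((sx * x + sy * y) / dt))
            beam += np.roll(trace, -shift)   # undo the plane-wave delay
        powers.append(np.sum(beam ** 2))
    return np.array(powers)

rng = np.random.default_rng(4)
dt = 0.1                                   # s per sample
sig = np.sin(2 * np.pi * np.arange(200) * dt / 2.0) * np.hanning(200)
coords = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0), (30.0, 0.0)]  # km
true_s = 0.05                              # s/km along x
waveforms = np.array([
    np.roll(sig, int(round(true_s * x / dt))) + 0.05 * rng.normal(size=200)
    for x, _ in coords])
grid = [(s, 0.0) for s in np.arange(0.0, 0.11, 0.01)]
p = slowness_beam(waveforms, coords, grid, dt)
best = grid[int(p.argmax())][0]
```

Matching the best-fitting slowness against the predicted slowness of candidate phases (e.g. PKPab at 150°) is what associates the beam with a phase and hence a source region.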
Rapid Disaster Analysis based on Remote Sensing: A Case Study about the Tohoku Tsunami Disaster 2011
NASA Astrophysics Data System (ADS)
Yang, C. H.; Soergel, U.; Lanaras, Ch.; Baltsavias, E.; Cho, K.; Remondino, F.; Wakabayashi, H.
2014-09-01
In this study, we present first results of RAPIDMAP, a project funded by the European Union in a framework aiming to foster the cooperation of European countries with Japan in R&D. The main objective of RAPIDMAP is to construct a Decision Support System (DSS) based on remote sensing data and WebGIS technologies, through which users can easily access real-time information assisting with disaster analysis. In this paper, we present a case study of the Tohoku Tsunami Disaster 2011. We address two approaches, namely change detection based on SAR data and co-registration of optical and SAR satellite images. With respect to SAR data, our efforts are subdivided into three parts: (1) initial coarse change detection for the entire area, (2) flood area detection, and (3) linear-feature change detection. The investigations are based on pre- and post-event TerraSAR-X images. In (1), the pre- and post-event TerraSAR-X images are accurately co-registered and radiometrically calibrated. The data are fused into a false-color image that provides a quick and rough overview of potential changes, which is useful for initial decision making and for identifying areas worth analysing further in more depth. However, a number of inevitable false alarms appear within the scene, caused by speckle, temporal decorrelation, co-registration inaccuracy, and so on. In (2), the post-event TerraSAR-X data are used to extract the flood area using thresholding and morphological approaches. The validated result indicates that using SAR data combined with suitable morphological approaches is a quick and effective way to detect the flood area. In addition to the SAR data, a false-color image composed of optical images is also used to detect the flood area for further exploration in this part. In (3), Curvelet filtering is applied to the difference image of the pre- and post-event TerraSAR-X images, not only to suppress false alarms from irregular features, but also to enhance the change signals of linear features (e.g.
buildings) in settlements. Afterwards, thresholding is exploited to extract the linear-feature changes. In rapid mapping of disasters, various sensors are often employed, including optical and SAR, since they provide complementary information. Such data need to be analyzed in an integrated fashion, and the results from each dataset should be integrated in a GIS with a common coordinate reference system. Thus, if no orthoimages can be generated, the images should be co-registered by matching common features. We present results of co-registration between optical (FORMOSAT-2) and TerraSAR-X images based on different matching methods, together with techniques for detecting and eliminating matching errors.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general-purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Processing circuitry for single channel radiation detector
NASA Technical Reports Server (NTRS)
Holland, Samuel D. (Inventor); Delaune, Paul B. (Inventor); Turner, Kathryn M. (Inventor)
2009-01-01
Processing circuitry is provided for a high voltage operated radiation detector. An event detector utilizes a comparator configured to produce an event signal based on a leading edge threshold value. A preferred event detector does not produce another event signal until a trailing edge threshold value is satisfied. The event signal can be utilized for counting the number of particle hits and also for controlling data collection operation for a peak detect circuit and timer. The leading edge threshold value is programmable such that it can be reprogrammed by a remote computer. A digital high voltage control is preferably operable to monitor and adjust high voltage for the detector.
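The leading/trailing-edge logic described above is a hysteresis comparator: an event fires on crossing the leading-edge threshold, and the detector re-arms only after the signal falls below the trailing-edge threshold. A software sketch of the counting behavior, with illustrative threshold values (the actual circuit's leading-edge threshold is programmable):

```python
def count_events(samples, leading=3.0, trailing=1.0):
    """Count pulses with hysteresis: an event fires on crossing the
    leading-edge threshold, and the detector re-arms only after the
    signal drops below the trailing-edge threshold, so one pulse
    cannot be counted twice."""
    armed = True
    events = 0
    for s in samples:
        if armed and s >= leading:
            events += 1
            armed = False          # hold off until trailing edge satisfied
        elif not armed and s <= trailing:
            armed = True           # re-arm for the next particle hit
    return events

# Two pulses; the ripple near 2.5 on the first pulse must not double-count
signal = [0, 1, 4, 2.5, 3.5, 2, 0.5, 0, 4.5, 1, 0]
print(count_events(signal))  # 2
```

Without the trailing-edge condition, the ripple at 3.5 in the first pulse would be counted as a second event.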
Pipeline oil fire detection with MODIS active fire products
NASA Astrophysics Data System (ADS)
Ogungbuyi, M. G.; Martinez, P.; Eckardt, F. D.
2017-12-01
We investigate 85,129 MODIS satellite active fire events from 2007 to 2015 in the Niger Delta of Nigeria. The region is the oil base of the Nigerian economy and the hub of oil exploration, where oil facilities (i.e. flowlines, flow stations, trunklines, oil wells and oil fields) are domiciled, and from where crude oil and refined products are transported to different Nigerian locations through a network of pipeline systems. Pipelines and other oil facilities are consistently susceptible to oil leaks due to operational or maintenance error, and to acts of deliberate sabotage of pipeline equipment, which often result in explosions and fire outbreaks. We used ground oil spill reports obtained from the National Oil Spill Detection and Response Agency (NOSDRA) database (see www.oilspillmonitor.ng) to validate the MODIS satellite data. The NOSDRA database shows an estimate of 10,000 spill events from 2007-2015. The spill events were filtered to include the largest spills by volume and events occurring only in the Niger Delta (i.e. 386 spills). By projecting both the MODIS fire and spill data as `input vector' layers with `Points' geometry, and the Nigerian pipeline networks as `from vector' layers with `LineString' geometry in a geographic information system, we extracted the MODIS events nearest the pipelines (i.e. 2192 events within a 1000 m distance) in a spatial vector analysis. The extraction distance is based on the global practice of the Right of Way (ROW) in pipeline management, which earmarks a 30 m strip of land along the pipeline. KML files of the extracted fires, viewed in Google Maps, confirmed that they originated from oil facilities, and land cover mapping confirmed the fire anomalies. The aim of the study is to propose near-real-time monitoring of spill events along pipeline routes using the 250 m spatial resolution of the MODIS active fire detection sensor when such spills are accompanied by fire events in the study location.
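The extraction step (keeping only fire detections within 1000 m of a pipeline) is a point-to-line buffer query. A self-contained sketch on projected metre coordinates, with made-up points standing in for the MODIS and pipeline layers:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to segment a-b (planar coords, metres)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def fires_near_pipeline(fires, pipeline, buffer_m=1000.0):
    """Keep fire points within buffer_m of any pipeline segment
    (the spatial-vector extraction step, on projected coordinates)."""
    segs = list(zip(pipeline[:-1], pipeline[1:]))
    return [f for f in fires
            if any(point_segment_dist(f, a, b) <= buffer_m for a, b in segs)]

pipeline = [(0, 0), (5000, 0), (5000, 5000)]        # a simple two-segment line
fires = [(100, 800), (2500, 1500), (5200, 2500)]     # fire points, metres
near = fires_near_pipeline(fires, pipeline)
print(near)
```

A production workflow would use a projected CRS and a spatial index (as the GIS `from vector` analysis in the abstract does), but the distance test is the same.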
The solar origins of two high-latitude interplanetary disturbances
NASA Technical Reports Server (NTRS)
Hudson, H. S.; Acton, L. W.; Alexander, D.; Harvey, K. L.; Kurokawa, H.; Kahler, S.; Lemen, J. R.
1995-01-01
Two extremely similar interplanetary forward/reverse shock events, with bidirectional electron streaming, were detected by Ulysses in 1994. Ground-based and Yohkoh/SXT observations show two strikingly different solar events that could be associated with them: an LDE flare on 20 Feb. 1994, and an extremely large-scale eruptive event on 14 April 1994. Both events resulted in geomagnetic storms and presumably were associated with coronal mass ejections. The sharply contrasting nature of these solar events argues against an energetic causal relationship between them and the bidirectional streaming events observed by Ulysses during its S polar passage. We suggest instead that, for each pair of events, a common solar trigger may have caused independent instabilities leading to the solar and interplanetary phenomena.
Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi
2018-03-01
Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, and not by impedance abnormalities. Our aim was to compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center at Okayama University Hospital, and all transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of the 15 patients with ICD lead failure, none experienced inappropriate therapy. RM can detect lead failure early, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.
Model-Based Fault Tolerant Control
NASA Technical Reports Server (NTRS)
Kumar, Aditya; Viassolo, Daniel
2008-01-01
The Model-Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
Towards cross-lingual alerting for bursty epidemic events.
Collier, Nigel
2011-10-06
Online news reports are increasingly becoming a source for event-based early warning systems that detect natural disasters. Harnessing the massive volume of information available from multilingual newswire presents as many challenges as opportunities, due to the patterns of reporting complex spatio-temporal events. In this article we study the problem of utilising correlated event reports across languages. We track the evolution of 16 disease outbreaks using 5 temporal aberration detection algorithms on text-mined events classified according to disease and outbreak country. Using ProMED reports as a silver standard, comparative analysis of news data for 13 languages over a 129-day trial period showed improved sensitivity, F1 and timeliness across most models using cross-lingual events. We report a detailed case study analysis for cholera in Angola in 2010 which highlights the challenges faced in correlating news events with the silver standard. The results show that automated health surveillance using multilingual text mining has the potential to turn low-value news into high-value alerts if informed choices are used to govern the selection of models and data sources. An implementation of the C2 alerting algorithm using multilingual news is available at the BioCaster portal http://born.nii.ac.jp/?page=globalroundup.
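The C2 algorithm mentioned at the end belongs to the EARS family of temporal aberration detectors. A sketch of its common form, comparing each day's event count with a 7-day baseline separated by a 2-day guard band (these parameters are the conventional EARS defaults, not necessarily BioCaster's exact configuration):

```python
from statistics import mean, stdev

def c2_alerts(counts, baseline=7, lag=2, threshold=3.0):
    """EARS-C2-style aberration detection: flag day t if its count
    exceeds mean + threshold * std of the baseline window ending
    lag days earlier, so an ongoing burst does not inflate the baseline."""
    alerts = []
    for t in range(baseline + lag, len(counts)):
        window = counts[t - lag - baseline:t - lag]
        mu = mean(window)
        sigma = max(stdev(window), 0.5)     # floor to avoid zero-variance blowups
        if counts[t] > mu + threshold * sigma:
            alerts.append(t)
    return alerts

# Flat background of ~2 reports/day, then a burst on days 12-13
daily = [2, 3, 2, 1, 2, 3, 2, 2, 1, 2, 2, 3, 12, 15, 3]
alerts = c2_alerts(daily)
print(alerts)
```

The guard band (lag) is what lets day 13 still alert: its baseline window ends at day 10, before the burst began.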
Critical Current Statistics of a Graphene-Based Josephson Junction Infrared Single Photon Detector
NASA Astrophysics Data System (ADS)
Walsh, Evan D.; Lee, Gil-Ho; Efetov, Dmitri K.; Heuck, Mikkel; Crossno, Jesse; Taniguchi, Takashi; Watanabe, Kenji; Ohki, Thomas A.; Kim, Philip; Englund, Dirk; Fong, Kin Chung
Graphene is a promising material for single photon detection due to its broadband absorption and exceptionally low specific heat. We present a photon detector using a graphene sheet as the weak link in a Josephson junction (JJ) to form a threshold detector for single infrared photons. Calculations show that such a device could experience temperature changes of a few hundred percent leading to sub-Hz dark count rates and internal efficiencies approaching unity. We have fabricated the graphene-based JJ (gJJ) detector and measure switching events that are consistent with single photon detection under illumination by an attenuated laser. We study the physical mechanism for these events through the critical current behavior of the gJJ as a function of incident photon flux.
Estimation of Temporal Gait Parameters Using a Wearable Microphone-Sensor-Based System
Wang, Cheng; Wang, Xiangdong; Long, Zhou; Yuan, Jing; Qian, Yueliang; Li, Jintao
2016-01-01
Most existing wearable gait analysis methods focus on the analysis of data obtained from inertial sensors. This paper proposes a novel, low-cost, wireless and wearable gait analysis system which uses microphone sensors to collect footstep sound signals during walking. To the best of our knowledge, this is the first time a microphone sensor has been used as a wearable gait analysis device. Based on this system, a gait analysis algorithm for estimating the temporal parameters of gait is presented. The algorithm fully uses the fusion of the two feet's footstep sound signals and includes three stages: footstep detection, heel-strike and toe-on event detection, and calculation of temporal gait parameters. Experimental results show that, with a total of 240 data sequences and 1732 steps collected using three different gait data collection strategies from 15 healthy subjects, the proposed system achieves an average 0.955 F1-measure for footstep detection, an average 94.52% accuracy rate for heel-strike detection and an average 94.25% accuracy rate for toe-on detection. Using these detection results, nine temporal gait parameters are calculated, and these parameters are consistent with their corresponding normal gait temporal parameters and with labeled-data calculation results. The results verify the effectiveness of our proposed system and algorithm for temporal gait parameter estimation. PMID:27999321
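The first stage, footstep detection from the sound signal, can be illustrated with a short-time-energy detector. The frame length, threshold rule, and synthetic waveform below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def detect_footsteps(signal, fs, frame_ms=50, k=3.0):
    """Footstep detection sketch: frame-wise short-time energy,
    thresholded at k times the median frame energy; consecutive
    frames above threshold merge into one footstep event."""
    frame = int(fs * frame_ms / 1000)
    n = len(signal) // frame
    energy = np.array([np.sum(signal[i*frame:(i+1)*frame]**2) for i in range(n)])
    above = energy > k * np.median(energy)
    steps = []
    in_step = False
    for i, hit in enumerate(above):
        if hit and not in_step:
            steps.append(i * frame / fs)     # onset time in seconds
        in_step = hit
    return steps

# Synthetic demo: low noise floor with two short bursts ("footsteps")
rng = np.random.default_rng(1)
fs = 1000
x = rng.normal(0, 0.05, 2 * fs)
x[300:360] += rng.normal(0, 1.0, 60)     # step near 0.30 s
x[1400:1460] += rng.normal(0, 1.0, 60)   # step near 1.40 s
steps = detect_footsteps(x, fs)
print(steps)
```

The detected onsets feed the later stages; stride and step times then follow as differences between consecutive onsets of the same and opposite feet.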
Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time
NASA Astrophysics Data System (ADS)
Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.
2017-12-01
Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-the-art data processing algorithms cannot meet the automation and quality standards of a near-real-time (NRT) system, quality-controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDF) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual-polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of (1) self-optimized multi-threshold classification, (2) over-detection removal using land-cover information and change detection, (3) under-detection compensation, and (4) machine-learning-based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the results from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (see Figure 1). Specifically, the over- and under-detections that are typically noted in automated methods are reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-the-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, and Sentinel. 
RAPID data can support many applications such as rapid assessment of damage losses and disaster alleviation/rescue at global scale.
Moon, Hankyu; Lu, Tsai-Ching
2015-01-01
Critical events in society or biological systems can be understood as large-scale self-emergent phenomena due to deteriorating stability. We often observe peculiar patterns preceding these events, posing the question of how to interpret the self-organized patterns to learn more about the imminent crisis. We start with a very general description of an interacting population giving rise to large-scale emergent behaviors that constitute critical events. Then we pose a key question: is there a quantifiable relation between the network of interactions and the emergent patterns? Our investigation leads to a fundamental understanding of how to: (1) detect the system's transition based on the principal mode of the pattern dynamics; and (2) identify its evolving structure based on the observed patterns. The main finding of this study is that while the pattern is distorted by the network of interactions, its principal mode is invariant to the distortion even when the network constantly evolves. Our analysis of real-world markets shows common self-organized behavior near critical transitions, such as housing market collapses and stock market crashes; thus detection of critical events before they are in full effect is possible. PMID:25822423
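The "principal mode of the pattern dynamics" is the leading eigenvector of the covariance of the observed patterns. A sketch of extracting it by power iteration; the snapshot model below is synthetic and purely illustrative:

```python
import numpy as np

def principal_mode(snapshots, iters=200):
    """Leading spatial mode of a sequence of pattern snapshots
    (rows = time, cols = sites), via power iteration on the
    covariance matrix of the mean-removed data."""
    X = snapshots - snapshots.mean(axis=0)
    C = X.T @ X / (len(X) - 1)
    v = np.ones(C.shape[1])
    for _ in range(iters):
        v = C @ v                  # amplify the dominant eigendirection
        v /= np.linalg.norm(v)
    return v

# Demo: noisy snapshots dominated by one fixed spatial pattern
rng = np.random.default_rng(2)
true_mode = np.array([1.0, 2.0, -1.0, 0.5])
true_mode /= np.linalg.norm(true_mode)
amps = rng.normal(0, 3.0, 500)
data = np.outer(amps, true_mode) + rng.normal(0, 0.1, (500, 4))
est = principal_mode(data)
print(np.abs(est @ true_mode))   # close to 1 when the mode is recovered
```

The invariance claim in the abstract is about this vector: the estimated mode tracks the true pattern even though every snapshot is noisy.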
Cheng, Li-Fang; Chen, Tung-Chien; Chen, Liang-Gee
2012-01-01
Most abnormal cardiac events, such as myocardial ischemia, acute myocardial infarction (AMI) and fatal arrhythmia, can be diagnosed through continuous electrocardiogram (ECG) analysis. According to recent clinical research, early detection and alarming of such cardiac events can reduce the time delay to the hospital, and the clinical outcomes of these individuals can be greatly improved. Therefore, it would be helpful to have a long-term ECG monitoring system with the ability to identify abnormal cardiac events and provide real-time warning for the users. The combination of a wireless body area sensor network (BASN) and an on-sensor ECG processor is a possible solution for this application. In this paper, we aim to design and implement a digital signal processor that is suitable for continuous ECG monitoring and alarming based on the continuous wavelet transform (CWT) through the proposed architectures, using both a programmable RISC processor and application-specific integrated circuits (ASIC) for performance optimization. According to the implementation results, the power consumption of the proposed processor integrated with an ASIC for CWT computation is only 79.4 mW. Compared with the single-RISC processor, a power reduction of about 91.6% is achieved.
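A single-scale CWT with a Ricker (Mexican-hat) wavelet already highlights QRS-like spikes, which is the kind of computation the ASIC accelerates. A numpy sketch; the wavelet choice, scale, and threshold are illustrative assumptions, not the paper's processor design:

```python
import numpy as np

def ricker(points, a):
    """Mexican-hat (Ricker) wavelet, a common kernel for spike-like shapes."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_peaks(x, scale, threshold):
    """One-scale CWT sketch: convolve with a Ricker wavelet and return
    sample indices that are local maxima above a coefficient threshold."""
    w = ricker(10 * scale + 1, scale)
    coeffs = np.convolve(x, w, mode='same')
    above = np.abs(coeffs) > threshold
    return [i for i in range(1, len(x) - 1)
            if above[i] and coeffs[i] >= coeffs[i - 1] and coeffs[i] > coeffs[i + 1]]

# Synthetic "ECG": two narrow spikes on a flat baseline
x = np.zeros(400)
x[100] = 1.0
x[300] = 1.0
peaks = cwt_peaks(x, scale=4, threshold=0.5)
print(peaks)
```

A full detector would combine several scales and an adaptive threshold; the point here is only that spike locations emerge as local maxima of the wavelet coefficients.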
Risk factors for hazardous events in olfactory-impaired patients.
Pence, Taylor S; Reiter, Evan R; DiNardo, Laurence J; Costanzo, Richard M
2014-10-01
Normal olfaction provides essential cues to allow early detection and avoidance of potentially hazardous situations. Thus, patients with impaired olfaction may be at increased risk of experiencing certain hazardous events such as cooking or house fires, delayed detection of gas leaks, and exposure to or ingestion of toxic substances. To identify risk factors and potential trends over time in olfactory-related hazardous events in patients with impaired olfactory function. Retrospective cohort study of 1047 patients presenting to a university smell and taste clinic between 1983 and 2013. A total of 704 patients had both clinical olfactory testing and a hazard interview and were studied. On the basis of olfactory function testing results, patients were categorized as normosmic (n = 161), mildly hyposmic (n = 99), moderately hyposmic (n = 93), severely hyposmic (n = 142), and anosmic (n = 209). Patient evaluation including interview, examination, and olfactory testing. Incidence of specific olfaction-related hazardous events (ie, burning pots and/or pans, starting a fire while cooking, inability to detect gas leaks, inability to detect smoke, and ingestion of toxic substances or spoiled foods) by degree of olfactory impairment. The incidence of having experienced any hazardous event progressively increased with degree of impairment: normosmic (18.0%), mildly hyposmic (22.2%), moderately hyposmic (31.2%), severely hyposmic (32.4%), and anosmic (39.2%). Over 3 decades there was no significant change in the overall incidence of hazardous events. 
Analysis of demographic data (age, sex, race, smoking status, and etiology) revealed significant differences in the incidence of hazardous events based on age (among 397 patients <65 years, 148 [37.3%] with hazardous event, vs 31 of 146 patients ≥65 years [21.3%]; P < .001), sex (among 278 women, 106 [38.1%] with hazardous event, vs 73 of 265 men [27.6%]; P = .009), and race (among 98 African Americans, 41 [41.8%] with hazardous event, vs 134 of 434 whites [30.9%]; P = .04). Increased level of olfactory impairment portends an increased risk of experiencing a hazardous event. Risk is further impacted by individuals' age, sex, and race. These results may assist health care practitioners in counseling patients on the risks associated with olfactory impairment.
Jupiter emission observed near 1 MHz
NASA Technical Reports Server (NTRS)
Brown, L. W.
1974-01-01
Emission from Jupiter has been observed by the IMP-6 spacecraft at 19 frequencies between 600 and 9900 kHz, covering the period from April 1971 to October 1972. The Jovian bursts were identified in the IMP-6 data through the phase of the observed modulated signal detected from the spinning dipole antenna. Initial data reduction has isolated 177 events over a span of 500 days. These events persisted over periods of between 1 and 60 min. Of these events, at least 48 occurred during times in which Jupiter emission was being observed at either 16.7 or 22.2 MHz by ground-based instruments of the Goddard Space Flight Center Jupiter monitoring system. Large bursts were detectable from 9900 kHz down to 600 kHz, while smaller bursts ranged down to 1030 kHz.
Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion
NASA Astrophysics Data System (ADS)
Witten, B.; Shragge, J. C.
2016-12-01
The recent increased focus on small-scale seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal and hydraulic fracturing for oil and gas recovery and for geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.
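The wave-equation approach above avoids picking entirely; the standard pick-based alternative it improves upon is a travel-time grid search. A sketch of that baseline with a constant-velocity model and synthetic picks (all numbers illustrative):

```python
import math

def locate(stations, arrivals, v, grid, t0_range):
    """Travel-time grid search: find the (x, y, t0) minimizing squared
    misfit between observed and predicted P arrivals in a constant-
    velocity model. (A pick-based stand-in for the pick-free
    wave-equation imaging described above.)"""
    best = None
    for x, y in grid:
        for t0 in t0_range:
            misfit = sum((ta - (t0 + math.hypot(x - sx, y - sy) / v)) ** 2
                         for (sx, sy), ta in zip(stations, arrivals))
            if best is None or misfit < best[0]:
                best = (misfit, x, y, t0)
    return best[1:]

stations = [(0, 0), (10, 0), (0, 10), (10, 10)]          # station coords, km
src, v, t0 = (4, 6), 5.0, 1.0                            # true source, km/s, s
arrivals = [t0 + math.hypot(src[0] - sx, src[1] - sy) / v for sx, sy in stations]
grid = [(x, y) for x in range(11) for y in range(11)]
t0s = [i * 0.5 for i in range(5)]
loc = locate(stations, arrivals, v, grid, t0s)
print(loc)
```

With noisy weak events, the picks themselves become unreliable, which is exactly the failure mode that motivates the pick-free, waveform-based joint location/inversion in the abstract.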
Precise genotyping and recombination detection of Enterovirus
2015-01-01
Enteroviruses (EV) with different genotypes cause diverse infectious diseases in humans and mammals. A correct EV typing result is crucial for effective medical treatment and disease control; however, the emergence of novel viral strains has impaired the performance of available diagnostic tools. Here, we present a web-based tool, named EVIDENCE (EnteroVirus In DEep conception, http://symbiont.iis.sinica.edu.tw/evidence), for EV genotyping and recombination detection. We introduce the idea of using mixed-ranking scores to evaluate the fitness of prototypes based on relatedness and on the genome regions of interest. Using phylogenetic methods, the most probable genotype is determined based on the closest neighbor among the selected references. To detect possible recombination events, EVIDENCE calculates the sequence distance and phylogenetic relationship among sequences in all sliding windows scanning over the whole genome. Detected recombination events are plotted in an interactive figure for viewing fine details. In addition, all EV sequences available in GenBank were collected and revised using the latest classification and nomenclature of EV in EVIDENCE. These sequences are built into the database and are retrieved in an indexed catalog, or can be searched for by keywords or by sequence similarity. EVIDENCE is the first web-based tool containing pipelines for both genotyping and recombination detection, with updated, built-in, and complete reference sequences to improve sensitivity and specificity. The use of EVIDENCE can accelerate genotype identification, aiding clinical diagnosis and enhancing our understanding of EV evolution. PMID:26678286
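The sliding-window scan for recombination can be illustrated with per-window distances to candidate references: a switch in the nearest reference between windows suggests a breakpoint. A toy sketch using Hamming distance in place of EVIDENCE's phylogenetic measures (sequences and window sizes are invented):

```python
def sliding_distance(query, references, window=100, step=50):
    """Sliding-window Hamming distance of an aligned query genome to
    each reference; a change in the nearest reference between windows
    marks a candidate recombination breakpoint."""
    nearest = []
    for start in range(0, len(query) - window + 1, step):
        dists = {name: sum(q != r for q, r in
                           zip(query[start:start + window],
                               ref[start:start + window]))
                 for name, ref in references.items()}
        nearest.append(min(dists, key=dists.get))
    return nearest

# Toy genomes: the query matches type A in its first half, type B after
a = "A" * 400
b = "B" * 400
query = "A" * 200 + "B" * 200
path = sliding_distance(query, {"EV-A": a, "EV-B": b})
print(path)
```

Real pipelines compute window-wise phylogenies rather than raw distances, but the breakpoint signal (the nearest-reference switch mid-genome) is the same.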
Ionospheric "Volcanology": Ionospheric Detection of Volcano Eruptions
NASA Astrophysics Data System (ADS)
Astafyeva, E.; Shults, K.; Lognonne, P. H.; Rakoto, V.
2016-12-01
It is known that volcano eruptions and explosions can generate acoustic and gravity waves. These neutral waves propagate further into the atmosphere and ionosphere, where they are detectable by atmospheric and ionospheric sounding tools. So far, the features of co-volcanic ionospheric perturbations are not yet well understood. The development of global and regional networks of ground-based GPS/GNSS receivers has opened a new era in the ionospheric detection of natural hazard events, including volcano eruptions. It is now known that eruptions with a volcanic explosivity index (VEI) greater than 2 can be detected in the ionosphere, especially in regions with dense GPS/GNSS-receiver coverage. The co-volcanic ionospheric disturbances are usually characterized as quasi-periodic oscillations. The Calbuco volcano, located in southern Chile, awoke in April 2015 after 43 years of inactivity. The first eruption began at 21:04 UT on 22 April 2015, preceded by only an hour-long period of volcano-tectonic activity. This first eruption lasted 90 minutes and generated a sub-Plinian (i.e. medium to large explosive event) gray ash plume that rose 15 km above the main crater. A larger second event on 23 April began at 04:00 UT (01:00 LT), lasted six hours, and also generated a sub-Plinian ash plume that rose higher than 15 km. The VEI was estimated to be 4 to 5 for these two events. In this work, we first study the ionospheric TEC response to the Calbuco volcano eruptions of April 2015 by using ground-based GNSS receivers located around the volcano. We analyze the spectral characteristics of the observed TEC variations and estimate the propagation speed of the co-volcanic ionospheric perturbations. We then proceed with normal-mode-summation-based modeling of the ionospheric TEC variations due to the Calbuco volcano eruptions. 
Finally, we attempt to localize the position of the volcano from the ionospheric measurements, and we also estimate the time of the beginning of the eruption.
Real time, TV-based, point-image quantizer and sorter
Case, Arthur L.; Davidson, Jackson B.
1976-01-01
A device is provided for improving the vertical resolution in a television-based, two-dimensional readout for radiation detection systems, such as are used to determine the location of light or nuclear radiation impinging on a target area viewed by a television camera, where it is desired to store data indicative of the centroid location of such images. In the example embodiment, impinging nuclear radiation detected in the form of a scintillation occurring in a crystal is stored as a charge image on a television camera tube target. The target is scanned in a raster, and the image position is stored according to a corresponding vertical scan number and a horizontal position number along the scan. To determine the centroid location of an image that may overlap a number of horizontal scan lines along the vertical axis of the raster, digital logic circuits are provided with at least four series-connected shift registers, each having 512 bit positions according to a selected 512 horizontal increments of resolution along a scan line. The registers are shifted by clock pulses at a rate of 512 pulses per scan line. When an image or a portion thereof is detected along a scan, its horizontal center location is determined, a bit is set at that position in the first shift register, and the bit is shifted through the registers one at a time for each horizontal scan. Each register is compared bit-by-bit with the preceding register to detect coincident set bit positions until the last scan line detecting a portion of the image is determined. Circuitry is provided to store the vertical center position of the event according to the number of shift registers through which the first detection of the event has been shifted. Interpolation circuitry determines whether the event centroid lies between adjacent scan lines and stores it in a vertical address accordingly. 
The horizontal location of the event is stored in a separate address memory.
National Earthquake Information Center Seismic Event Detections on Multiple Scales
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local scale), a large aftershock sequence (regional scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early warning. Seism. Res. Lett., 83, 531-540, doi: 10.1785/gssrl.83.3.531.
Description and detection of burst events in turbulent flows
NASA Astrophysics Data System (ADS)
Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.
2018-04-01
A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique uses an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistence. This latter information is used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), at varying levels of complexity.
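The state-transition step of this framework can be illustrated with a toy example. Assuming the embedded trajectory has already been clustered into discrete states, the empirical transition-probability matrix is a normalized count of consecutive state pairs (a sketch, not the authors' code; the four-state sequence is invented):

```python
import numpy as np

def transition_matrix(labels, n_states):
    """Empirical state-transition probabilities from a clustered
    phase-space trajectory (sequence of state labels)."""
    P = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        P[a, b] += 1.0
    rows = P.sum(axis=1, keepdims=True)
    return np.divide(P, rows, out=np.zeros_like(P), where=rows > 0)

# Toy trajectory: a persistent 0 -> 1 -> 2 cycle with one rare "burst" state 3.
labels = [0, 1, 2] * 30 + [3] + [0, 1, 2] * 5
P = transition_matrix(np.array(labels), 4)
print(P.round(2))  # row 2 carries a small probability of escaping to the burst state
```

Rare, weakly connected states in such a matrix are exactly the transitory precursors the clustering/graph-community step is designed to isolate.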
NASA Astrophysics Data System (ADS)
Howell, E. J.; Chan, M. L.; Chu, Q.; Jones, D. H.; Heng, I. S.; Lee, H.-M.; Blair, D.; Degallaix, J.; Regimbau, T.; Miao, H.; Zhao, C.; Hendry, M.; Coward, D.; Messenger, C.; Ju, L.; Zhu, Z.-H.
2018-03-01
The detection of black hole binary coalescence events by Advanced LIGO allows the science benefits of future detectors to be evaluated. In this paper, we report the science benefits of one or two 8 km arm length detectors based on the doubling of key parameters in an Advanced LIGO-type detector, combined with realizable enhancements. It is shown that the total detection rate for sources similar to those already detected would increase to ~10^3-10^5 per year. Within 0.4 Gpc, we find that around 10 of these events would be localizable to within ~10^-1 deg^2. This is sufficient to make unique associations, or to rule out a direct association, with the brightest galaxies in optical surveys (at r-band magnitudes of 17 or above), or, for deeper limits (down to r-band magnitude 20), to yield statistically significant associations. The combination of angular resolution and event rate would benefit precision testing of formation models, cosmic evolution, and cosmological studies.
Distributing entanglement and single photons through an intra-city, free-space quantum channel.
Resch, K; Lindenthal, M; Blauensteiner, B; Böhm, H; Fedrizzi, A; Kurtsiefer, C; Poppe, A; Schmitt-Manderbach, T; Taraba, M; Ursin, R; Walther, P; Weier, H; Weinfurter, H; Zeilinger, A
2005-01-10
We have distributed entangled photons directly through the atmosphere to a receiver station 7.8 km away over the city of Vienna, Austria, at night. Detection of one photon from our entangled pairs constitutes a triggered single-photon source at the sender. With no direct time-stable connection, the two stations found coincidence counts in the detection events by calculating the cross-correlation of locally recorded time stamps shared over a public internet channel. For this experiment, our quantum channel was maintained for a total of 40 minutes, during which time a coincidence lock found approximately 60,000 coincident detection events. The polarization correlations in those events yielded a Bell parameter S = 2.27 +/- 0.019, which violates the CHSH-Bell inequality by 14 standard deviations. This result is promising for entanglement-based free-space quantum communication in high-density urban areas. It is also encouraging for optical quantum communication between ground stations and satellites, since the length of our free-space link exceeds the atmospheric equivalent.
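The coincidence-finding step described above can be sketched numerically: bin each station's locally recorded time tags into a histogram and locate the cross-correlation peak, which reveals the relative clock offset without a shared time reference. This is a simplified illustration with invented numbers, not the experiment's actual software; the bin width, jitter, and lag range are assumptions:

```python
import numpy as np

def clock_offset(t_a, t_b, bin_w=1e-6, max_lag=500):
    """Estimate the relative clock offset between two detection-time
    series by cross-correlating their binned histograms."""
    t0 = min(t_a.min(), t_b.min())
    t1 = max(t_a.max(), t_b.max())
    edges = np.arange(t0, t1 + bin_w, bin_w)
    a, _ = np.histogram(t_a, edges)
    b, _ = np.histogram(t_b, edges)
    lags = np.arange(-max_lag, max_lag + 1)
    xc = [np.dot(a, np.roll(b, -lag)) for lag in lags]
    return lags[int(np.argmax(xc))] * bin_w

# Shared photon-pair events seen by both stations, with Bob's clock
# offset by 37 microseconds plus small timing jitter.
rng = np.random.default_rng(1)
t_a = np.sort(rng.uniform(0.0, 0.01, 2000))
t_b = t_a + 37e-6 + rng.normal(0.0, 2e-7, t_a.size)
print(clock_offset(t_a, t_b))  # close to the true 3.7e-05 s offset
```

Once the offset is known, a coincidence window around it selects the correlated pairs used for the Bell test.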
NASA Astrophysics Data System (ADS)
Kenefic, L.; Morton, E.; Bilek, S.
2017-12-01
It is well known that subduction zones create the largest earthquakes in the world, such as the magnitude 9.5 Chile earthquake in 1960 and the more recent magnitude 9.1 Japan earthquake in 2011, both of which are among the top five largest earthquakes ever recorded. However, off the coast of the Pacific Northwest region of the U.S., the Cascadia subduction zone (CSZ) remains relatively quiet, and modern seismic instruments have not recorded earthquakes of this size in the CSZ. The last great earthquake, a magnitude 8.7-9.2, occurred in 1700 and is constrained by written reports of the resultant tsunami in Japan and by the dating of a drowned forest in the U.S. Previous studies have suggested the margin is most likely segmented along strike. However, variations in frictional conditions in the CSZ fault zone are not well known. Geodetic modeling indicates that the locked seismogenic zone likely lies completely offshore, which may be too far from land seismometers to adequately detect related seismicity. Ocean bottom seismometers, as part of the Cascadia Initiative Amphibious Network, were installed directly above the inferred seismogenic zone, and we use them to better detect small interplate seismicity. Using the subspace detection method, this study aims to find new seismogenic zone earthquakes. Subspace detection uses multiple previously known event templates concurrently to scan through continuous seismic data. Template events that make up the subspace are chosen from events in existing catalogs that likely occurred along the plate interface. Corresponding waveforms are windowed on the nearby Cascadia Initiative ocean bottom seismometers and coastal land seismometers for scanning. The scan flags detections whose similarity to the template waveforms exceeds a predefined threshold. Detections are then visually examined to determine whether an event is present.
The presence of repeating event clusters can indicate persistent seismic patches, likely corresponding to areas of stronger coupling. This work will ultimately improve the understanding of CSZ fault zone heterogeneity. Preliminary results gathered indicate 96 possible new events between August 2, 2013 and July 1, 2014 for four target clusters off the coast of northern Oregon.
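The matched-filter idea underlying subspace detection can be sketched with a single template: slide it along the continuous record and flag windows whose normalized correlation exceeds a predefined threshold. The multi-template subspace projection used in the study generalizes this; the 0.7 threshold and the synthetic data here are illustrative assumptions:

```python
import numpy as np

def template_scan(template, data, threshold=0.7):
    """Normalized cross-correlation of one waveform template against
    continuous data; returns (sample index, correlation) detections."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        s = w.std()
        if s == 0:
            continue
        cc = float(np.dot(t, (w - w.mean()) / s)) / n
        if cc >= threshold:
            hits.append((i, cc))
    return hits

# Bury a copy of the template in noise and recover it.
rng = np.random.default_rng(2)
tmpl = np.sin(np.linspace(0, 6 * np.pi, 100))
data = 0.2 * rng.normal(size=1000)
data[300:400] += tmpl
hits = template_scan(tmpl, data)
print(hits)  # detections clustered at or near sample 300
```

Because the correlation is normalized, small repeating events well below the noise visibility of a single trace can still be pulled out, which is why the method suits low-magnitude interplate seismicity.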
NASA Astrophysics Data System (ADS)
Tang, Xiaojing
Fast and accurate monitoring of tropical forest disturbance is essential for understanding current patterns of deforestation as well as for helping eliminate illegal logging. This dissertation explores the use of data from different satellites for near real-time monitoring of forest disturbance in tropical forests, including development of new monitoring methods, development of new assessment methods, and assessment of the performance and operational readiness of existing methods. Current methods for accuracy assessment of remote sensing products do not address the priority of near real-time monitoring: detecting disturbance events as early as possible. I introduce a new assessment framework for near real-time products that focuses on the timing and the minimum detectable size of disturbance events. The new framework reveals the relationship between change detection accuracy and the time needed to identify events. In regions that are frequently cloudy, near real-time monitoring using data from a single sensor is difficult. This study extends the work of Xin et al. (2013) and develops a new time series method (Fusion2) based on fusion of Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data. Results from three test sites in the Amazon Basin show that Fusion2 can detect 44.4% of forest disturbance within 13 clear observations (82 days) after the initial disturbance. The smallest event detected by Fusion2 is 6.5 ha. Fusion2 also detects disturbance faster and has less commission error than more conventional methods. In a comparison of coarse-resolution sensors, MODIS Terra and Aqua combined provide faster and more accurate detection of disturbance events than VIIRS (Visible Infrared Imaging Radiometer Suite) or MODIS single-sensor data. The performance of near real-time monitoring using VIIRS is slightly worse than MODIS Terra but significantly better than MODIS Aqua.
New monitoring methods developed in this dissertation provide forest protection organizations with the capacity to monitor illegal logging events promptly. In the future, combining two Landsat and two Sentinel-2 satellites will provide global coverage at 30 m resolution every 4 days, and routine monitoring may become possible at high resolution. The methods and assessment framework developed in this dissertation are adaptable to newly available datasets.
Non Conventional Seismic Events Along the Himalayan Arc Detected in the Hi-Climb Dataset
NASA Astrophysics Data System (ADS)
Vergne, J.; Nàbĕlek, J. L.; Rivera, L.; Bollinger, L.; Burtin, A.
2008-12-01
From September 2002 to August 2005, more than 200 broadband seismic stations were operated across the Himalayan arc and the southern Tibetan plateau in the framework of the Hi-Climb project. Here, we take advantage of the high density of stations along the main profile to look for coherent seismic wave arrivals that cannot be attributed to ordinary tectonic events. An automatic detection algorithm is applied to the continuous data streams filtered between 1 and 10 Hz, followed by a visual inspection of all detections. We discovered about one hundred coherent signals that cannot be attributed to local, regional or teleseismic earthquakes and that are characterized by emergent arrivals and long durations ranging from one minute to several hours. Most of these non-conventional seismic events have a low signal-to-noise ratio and are thus only observed above 1 Hz, the frequency band where the seismic noise is lowest. However, a small subset of them are strong enough to be observed in a larger frequency band and show an enhancement of long periods compared to standard earthquakes. Based on the analysis of the relative amplitude measured at each station or, when possible, on the correlation of the low-frequency part of the signals, most of these events appear to be located along the High Himalayan range. However, because of their emergent character and the main orientation of the seismic profile, their longitude and depth remain poorly constrained. The origin of these non-conventional seismic events remains unresolved, but their seismic signature shares several characteristics with non-volcanic tremors, glacial earthquakes and/or debris avalanches. All of these phenomena may occur along the Himalayan range but were not seismically detected before. Here we discuss the pros and cons of each of these postulated candidates based on the analysis of the recorded waveforms and slip models.
An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.
Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein
2017-12-22
The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
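The segmentation schemes being compared can be sketched directly: FNSW and FOSW slice the accelerometer stream at fixed strides, while the event-triggered alternative anchors segments at the detected impact peak. This is a simplified sketch; the window sizes and the threshold are illustrative, not values from the paper:

```python
import numpy as np

def sliding_windows(signal, size, overlap=0.0):
    """FNSW (overlap=0) or FOSW (0 < overlap < 1) segmentation."""
    step = max(1, int(size * (1.0 - overlap)))
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def impact_aligned(signal, thresh, pre, post):
    """Event-triggered segmentation: anchor the pre-impact and
    post-impact segments at the acceleration peak, if large enough."""
    idx = int(np.argmax(np.abs(signal)))
    if abs(signal[idx]) < thresh:
        return None
    return signal[max(0, idx - pre):idx], signal[idx:idx + post]

x = np.arange(100, dtype=float)
print(len(sliding_windows(x, 20)))       # 5 non-overlapping windows
print(len(sliding_windows(x, 20, 0.5)))  # 9 half-overlapping windows
```

Because the fixed-stride windows fall wherever the stride dictates, a fall's impact can straddle a boundary; the impact-anchored variant is what lets stage-specific features stay in their own segments.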
A new type of tri-axial accelerometers with high dynamic range MEMS for earthquake early warning
NASA Astrophysics Data System (ADS)
Peng, Chaoyong; Chen, Yang; Chen, Quansheng; Yang, Jiansi; Wang, Hongti; Zhu, Xiaoyi; Xu, Zhiqiang; Zheng, Yu
2017-03-01
Earthquake Early Warning Systems (EEWS) have shown their efficiency for earthquake damage mitigation. With the progress of low-cost Micro Electro Mechanical Systems (MEMS), many types of MEMS-based accelerometers have been developed and widely used in deploying large-scale, dense seismic networks for EEWS. However, the noise performance of these commercially available MEMS is still insufficient for weak seismic signals, leading to large scatter in early-warning parameter estimation. In this study, we developed a new type of tri-axial accelerometer for EEWS based on high dynamic range, low-noise MEMS. It is a MEMS-integrated data logger with built-in seismological processing. The device is built on a custom-tailored Linux 2.6.27 operating system, and seismic events are detected automatically with the STA/LTA algorithm. When a seismic event is detected, peak ground parameters of all data components are calculated at an interval of 1 s, and τc-Pd values are evaluated using the initial 3 s of the P wave. These values are then organized as a trigger packet and actively sent to the processing center for combined event detection. The output data of all three components are calibrated to a sensitivity of 500 counts/(cm/s²). Several tests and a real field deployment were performed to assess the performance of this device. The results show that the dynamic range reaches 98 dB for the vertical component and 99 dB for the horizontal components, and the majority of bias temperature coefficients are lower than 200 μg/°C. In addition, the results of event detection and the real field deployment demonstrate its capabilities for EEWS and rapid intensity reporting.
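The STA/LTA trigger used on the device compares a short-term average of signal power to a long-term average; the ratio crossing a trigger threshold declares an event. A minimal sketch of the classic algorithm, with window lengths and threshold chosen for illustration rather than taken from the device's settings:

```python
import numpy as np

def sta_lta(x, nsta=20, nlta=200):
    """Ratio of short-term to long-term average power over trailing windows."""
    sq = np.asarray(x, dtype=float) ** 2
    r = np.zeros(len(sq))
    for i in range(nlta, len(sq)):
        lta = sq[i - nlta:i].mean()
        if lta > 0:
            r[i] = sq[i - nsta:i].mean() / lta
    return r

def first_trigger(x, on=4.0, nsta=20, nlta=200):
    """Index of the first sample whose STA/LTA ratio crosses the threshold."""
    above = np.flatnonzero(sta_lta(x, nsta, nlta) >= on)
    return int(above[0]) if above.size else -1

# Noise with a burst of ten-times-larger amplitude starting at sample 600.
rng = np.random.default_rng(3)
trace = rng.normal(0.0, 1.0, 1000)
trace[600:650] += rng.normal(0.0, 10.0, 50)
print(first_trigger(trace))  # triggers shortly after the onset at sample 600
```

A production picker would add a de-trigger threshold and run recursively per sample; the trailing-window loop above keeps the logic explicit.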
A physics investigation of deadtime losses in neutron counting at low rates with Cf252
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Louise G; Croft, Stephen
2009-01-01
252Cf spontaneous fission sources are used for the characterization of neutron counters and the determination of calibration parameters, including both neutron coincidence counting (NCC) and neutron multiplicity deadtime (DT) parameters. Even at low event rates, temporally correlated neutron counting using 252Cf suffers a deadtime effect. This means that, in contrast to counting a random neutron source (e.g., AmLi to a close approximation), DT losses do not vanish in the low-rate limit. This is because neutrons are emitted from spontaneous fission events in time-correlated 'bursts' and are detected over a short period commensurate with their lifetime in the detector (characterized by the system die-away time, τ). Thus, even when detected neutron events from different spontaneous fissions are unlikely to overlap in time, neutron events within the detected 'burst' are subject to intrinsic DT losses. Intrinsic DT losses for dilute Pu will be lower, since the multiplicity distribution is softer, but real items also experience self-multiplication, which can increase the 'size' of the bursts. Traditional NCC DT correction methods do not include the intrinsic (within-burst) losses. We have proposed new forms of the traditional NCC Singles and Doubles DT correction factors. In this work, we apply Monte Carlo neutron pulse train analysis to investigate the functional form of the deadtime correction factors for an updating deadtime. Modeling is based on a high-efficiency 3He neutron counter with short die-away time, representing an ideal 3He-based detection system. The physics of deadtime losses at low rates is explored and presented. It is observed that the new forms are applicable and offer more accurate correction than the traditional forms.
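The "updating" (paralyzable) deadtime modeled in the pulse-train analysis can be sketched simply: every arriving pulse, whether recorded or lost, restarts the dead period, which is why time-correlated bursts from a single fission suffer intrinsic losses even at low overall rates. An illustrative filter, not the authors' Monte Carlo code; the times and tau are invented:

```python
def apply_updating_deadtime(times, tau):
    """Keep a pulse only if at least tau has elapsed since the previous
    arrival; every arrival (kept or lost) retriggers the dead period."""
    kept, last = [], None
    for t in times:
        if last is None or t - last >= tau:
            kept.append(t)
        last = t  # paralyzable: even a lost pulse extends the deadtime
    return kept

# A correlated "burst" of three pulses followed by one isolated pulse.
print(apply_updating_deadtime([0.0, 0.3, 0.5, 5.0], tau=1.0))  # [0.0, 5.0]
```

Applied to a simulated pulse train, such a filter reproduces the key observation of the abstract: the burst loses members regardless of how sparse the fissions themselves are.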
An automated approach towards detecting complex behaviours in deep brain oscillations.
Mace, Michael; Yousif, Nada; Naushahi, Mohammad; Abdullah-Al-Mamun, Khondaker; Wang, Shouyan; Nandi, Dipankar; Vaidyanathan, Ravi
2014-03-15
Extracting event-related potentials (ERPs) from neurological rhythms is of fundamental importance in neuroscience research. Standard ERP techniques typically require the associated ERP waveform to have low variance, be shape and latency invariant and require many repeated trials. Additionally, the non-ERP part of the signal needs to be sampled from an uncorrelated Gaussian process. This limits methods of analysis to quantifying simple behaviours and movements only when multi-trial data-sets are available. We introduce a method for automatically detecting events associated with complex or large-scale behaviours, where the ERP need not conform to the aforementioned requirements. The algorithm is based on the calculation of a detection contour and adaptive threshold. These are combined using logical operations to produce a binary signal indicating the presence (or absence) of an event with the associated detection parameters tuned using a multi-objective genetic algorithm. To validate the proposed methodology, deep brain signals were recorded from implanted electrodes in patients with Parkinson's disease as they participated in a large movement-based behavioural paradigm. The experiment involved bilateral recordings of local field potentials from the sub-thalamic nucleus (STN) and pedunculopontine nucleus (PPN) during an orientation task. After tuning, the algorithm is able to extract events achieving training set sensitivities and specificities of [87.5 ± 6.5, 76.7 ± 12.8, 90.0 ± 4.1] and [92.6 ± 6.3, 86.0 ± 9.0, 29.8 ± 12.3] (mean ± 1 std) for the three subjects, averaged across the four neural sites. Furthermore, the methodology has the potential for utility in real-time applications as only a single-trial ERP is required. Copyright © 2013 Elsevier B.V. All rights reserved.
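The core detection logic described above, a detection contour compared against an adaptive threshold to yield a binary event signal, can be sketched for a single channel. The window length and the mean-plus-k-sigma rule are assumptions, and the paper's multi-objective genetic-algorithm tuning of the parameters is omitted:

```python
import numpy as np

def detect_events(x, win=100, k=4.0):
    """Binary event signal: detection contour (|x|) exceeding an
    adaptive threshold estimated from the recent past."""
    contour = np.abs(x)
    mask = np.zeros(len(x), dtype=bool)
    for i in range(len(x)):
        w = contour[max(0, i - win):i + 1]
        mask[i] = contour[i] > w.mean() + k * w.std()
    return mask

# Simulated LFP: low-level noise with one event-related deflection.
rng = np.random.default_rng(4)
lfp = 0.1 * rng.normal(size=1000)
lfp[500] += 5.0
mask = detect_events(lfp)
print(int(mask[500]), int(mask.sum()))
```

In the paper the contour, threshold, and logical combination are each parameterized and tuned jointly against sensitivity and specificity; this sketch fixes them by hand to show the mechanics.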
Big Data and the Global Public Health Intelligence Network (GPHIN)
Dion, M; AbdelMalik, P; Mawudeku, A
2015-01-01
Background: Globalization and the potential for rapid spread of emerging infectious diseases have heightened the need for ongoing surveillance and early detection. The Global Public Health Intelligence Network (GPHIN) was established to increase situational awareness and capacity for the early detection of emerging public health events. Objective: To describe how the GPHIN has used Big Data as an effective early detection technique for infectious disease outbreaks worldwide and to identify potential future directions for the GPHIN. Findings: Every day the GPHIN analyzes more than 20,000 online news reports (from over 30,000 sources) in nine languages worldwide. A web-based program aggregates data based on an algorithm that provides potential signals of emerging public health events, which are then reviewed by a multilingual, multidisciplinary team. An alert is sent out if a potential risk is identified. This process proved useful during the Severe Acute Respiratory Syndrome (SARS) outbreak and was adopted shortly after by a number of countries to meet new International Health Regulations that require each country to have the capacity for early detection and reporting. The GPHIN identified the early SARS outbreak in China, was credited with the first alert on MERS-CoV and has played a significant role in the monitoring of the Ebola outbreak in West Africa. Future developments are being considered to advance the GPHIN's capacity in light of other Big Data sources such as social media, and its analytical capacity in terms of algorithm development. Conclusion: The GPHIN's early adoption of Big Data has increased global capacity to detect international infectious disease outbreaks and other public health events. Integration of additional Big Data sources and advances in analytical capacity could further strengthen the GPHIN's capability for timely detection and early warning. PMID:29769954
The Explosive Universe with Gaia
NASA Astrophysics Data System (ADS)
Wyrzykowski, Łukasz; Hodgkin, Simon T.; Blagorodnova, Nadejda; Belokurov, Vasily
2014-01-01
The Gaia mission will observe the entire sky for 5 years, providing ultra-precise astrometric, photometric and spectroscopic measurements for a billion stars in the Galaxy. Hence, naturally, Gaia becomes an all-sky multi-epoch photometric survey, which will monitor and detect variability with millimag precision as well as new transient sources such as supernovae, novae, microlensing events, tidal disruption events and asteroids, among others. The Gaia data flow allows for quick detection of anomalies within 24-48 h of the observation. Such a near-real-time survey will be able to detect about 6000 supernovae brighter than 19 mag up to redshifts of z ~ 0.15. The on-board low-resolution (R ~ 100) spectrograph will allow for early and robust classification of transients and minimise the false-alert rate, even providing redshift estimates for supernovae. Gaia will also offer a unique possibility for detecting astrometric shifts in microlensing events, which, combined with Gaia's and ground-based photometry, will provide unique mass measurements of lenses, constraints on the dark matter content in the Milky Way and possible detections of free-floating black holes. Alerts from Gaia will be publicly available soon after each detection is verified and tested. First alerts are expected early in 2014 and will be used for ground-based verification. All facilities are invited to join the verification and follow-up effort. Alerts will be published on a web page, via Skyalert.org and via an emailing list. Each alert will contain coordinates, the Gaia light curve and low-resolution spectra, classification and cross-matching results. More information on the Gaia Science Alerts can be found here: http://www.ast.cam.ac.uk/ioa/wikis/gsawgwiki/ The full version of the poster is available here: http://www.ast.cam.ac.uk/ioa/wikis/gsawgwiki/images/1/13/GaiaAlertsPosterIAUS298.pdf
DOE Office of Scientific and Technical Information (OSTI.GOV)
González, L. X.; Valdés-Galicia, J. F.; Musalem, O.
2015-12-01
The X17.0 solar flare of 2005 September 7 released high-energy neutrons that were detected by the Solar Neutron Telescope (SNT) at Sierra Negra, Mexico. In three separate and independent studies of this solar neutron event, several of its unique characteristics were studied; in particular, a power-law energy spectrum was estimated. In this paper, we present an alternative analysis, based on improved numerical simulations of the detector using GEANT4 and a different technique for processing the SNT data. The results indicate that the spectral index that best fits the neutron flux is around 3, in agreement with previous works. Based on the numerically calculated neutron energy deposition in the SNT, we confirm that the detected neutrons might have reached an energy of 1 GeV, which implies that 10 GeV protons were probably produced; these could not be observed at Earth, as their parent flare was an east-limb event.
From TRMM to GPM: How well can heavy rainfall be detected from space?
NASA Astrophysics Data System (ADS)
Prakash, Satya; Mitra, Ashis K.; Pai, D. S.; AghaKouchak, Amir
2016-02-01
In this study, we investigate the capabilities of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and the recently released Integrated Multi-satellitE Retrievals for GPM (IMERG) in detecting and estimating heavy rainfall across India. First, the study analyzes TMPA data products over a 17-year period (1998-2014). While TMPA and reference gauge-based observations show similar mean monthly variations of conditional heavy rainfall events, the multi-satellite product systematically overestimates their inter-annual variations. Categorical as well as volumetric skill scores reveal that TMPA over-detects heavy rainfall events (above the 75th percentile of the reference data), but it shows reasonable performance in capturing the volume of heavy rain across the country. An initial assessment of the GPM-based multi-satellite IMERG precipitation estimates for the southwest monsoon season shows notable improvements over TMPA in capturing heavy rainfall over India. The recently released IMERG shows promising results that should help improve modeling of hydrological extremes (e.g., floods and landslides) using satellite observations.
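The categorical verification used here reduces to a 2x2 contingency table between satellite and gauge exceedances of the heavy-rain threshold; probability of detection (POD) and false alarm ratio (FAR) follow directly. A generic sketch with invented numbers, not the study's data or exact score set:

```python
import numpy as np

def categorical_scores(obs, sat, thresh):
    """POD and FAR for 'heavy rain' detection against gauge observations."""
    o, s = obs >= thresh, sat >= thresh
    hits = int(np.sum(o & s))
    misses = int(np.sum(o & ~s))
    false_alarms = int(np.sum(~o & s))
    pod = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    return pod, far

obs = np.array([10.0, 80.0, 90.0, 5.0, 70.0])   # gauge rainfall (mm)
sat = np.array([12.0, 85.0, 40.0, 60.0, 75.0])  # satellite estimate (mm)
print(categorical_scores(obs, sat, thresh=50.0))  # POD = 2/3, FAR = 1/3
```

Volumetric analogues weight each cell of the table by rain amount rather than event count, which is how the study separates "detecting events" from "capturing volume".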
Pre-trained D-CNN models for detecting complex events in unconstrained videos
NASA Astrophysics Data System (ADS)
Robinson, Joseph P.; Fu, Yun
2016-05-01
Rapid event detection faces an emergent need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by pre-existing methods being complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis of different Convolutional Neural Networks (CNNs), stand-alone and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architectures, training data, and number of outputs. For each CNN, we sample frames at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some interesting facts, with some combinations found to be complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
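The best-performing fusion scheme described above, max-pooling frame features within each shot and then average-pooling the shot descriptors, can be sketched in a few lines. The feature dimensions and shot boundaries are toy values, not anything from the paper:

```python
import numpy as np

def video_descriptor(frame_feats, shot_bounds):
    """Max-pool CNN frame features within each shot, then average-pool
    the shot descriptors into a single video-level vector."""
    shots = np.split(frame_feats, shot_bounds)
    pooled = [s.max(axis=0) for s in shots if len(s)]
    return np.mean(pooled, axis=0)

feats = np.arange(24, dtype=float).reshape(6, 4)  # 6 frames, 4-D features
print(video_descriptor(feats, [2, 4]))  # shots are frames [0:2], [2:4], [4:6]
```

Max-pooling inside a shot keeps the most salient response per dimension, while averaging across shots prevents any single shot from dominating the video-level descriptor fed to the SVMs.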
Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A
2017-11-01
Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long durations and random interstimulus intervals. The decoding scores were considerably lower for eye movements relative to the other movement types tested. The average success rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc.
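The decoding step, scoring a detected SPFM map against each normalized meta-map by summed overlap and taking the best match, can be sketched as follows. The tiny invented "maps" stand in for the voxelwise ALE meta-maps; labels and values are illustrative only:

```python
import numpy as np

def decode_event(event_map, meta_maps):
    """Return the label whose normalized meta-map overlaps the detected
    event map most strongly (summed voxelwise product)."""
    scores = {name: float(np.sum(event_map * (m / m.sum())))
              for name, m in meta_maps.items()}
    return max(scores, key=scores.get)

meta = {
    "right_fingers": np.array([0.0, 1.0, 3.0, 0.0]),
    "swallowing":    np.array([2.0, 0.0, 0.0, 2.0]),
}
event = np.array([0.1, 0.9, 2.5, 0.0])  # SPFM map resembling finger movement
print(decode_event(event, meta))  # right_fingers
```

Normalizing each meta-map before the overlap keeps labels with spatially extensive meta-maps from winning on size alone.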
Meteor studies in the framework of the JEM-EUSO program
NASA Astrophysics Data System (ADS)
Abdellaoui, G.; Abe, S.; Acheli, A.; Adams, J. H.; Ahmad, S.; Ahriche, A.; Albert, J.-N.; Allard, D.; Alonso, G.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Aouimeur, W.; Arai, Y.; Arsene, N.; Asano, K.; Attallah, R.; Attoui, H.; Ave Pernas, M.; Bacholle, S.; Bakiri, M.; Baragatti, P.; Barrillon, P.; Bartocci, S.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, A.; Belov, K.; Benadda, B.; Benmessai, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Bisconti, F.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Boudaoud, R.; Bozzo, E.; Briggs, M. S.; Bruno, A.; Caballero, K. S.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Capel, F.; Caramete, A.; Caramete, L.; Carlson, P.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellina, A.; Castellini, G.; Catalano, C.; Catalano, O.; Cellino, A.; Chikawa, M.; Chiritoi, G.; Christl, M. J.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Di Martino, M.; Djemil, T.; Djenas, S. A.; Dulucq, F.; Dupieux, M.; Dutan, I.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Eser, J.; Fang, K.; Fenu, F.; Fernández-González, S.; Fernández-Soriano, J.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Fouka, M.; Franceschi, A.; Franchini, S.; Fuglesang, C.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; García-Ortega, E.; Garipov, G.; Gascón, E.; Geary, J.; Gelmini, G.; Genci, J.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guehaz, R.; Guzmán, A.; Hachisu, Y.; Haiduc, M.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Hidber, W.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Isgrò, F.; Itow, Y.; Jammer, T.; Joven, E.; Judd, E. 
G.; Jung, A.; Jochum, J.; Kajino, F.; Kajino, T.; Kalli, S.; Kaneko, I.; Kang, D.; Kanouni, F.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Kedadra, A.; Khales, H.; Khrenov, B. A.; Kim, Jeong-Sook; Kim, Soon-Wook; Kim, Sug-Whan; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lahmar, H.; Lakhdari, F.; Larsson, O.; Lee, J.; Licandro, J.; Lim, H.; López Campano, L.; Maccarone, M. C.; Mackovjak, S.; Mahdi, M.; Maravilla, D.; Marcelli, L.; Marcos, J. L.; Marini, A.; Martens, K.; Martín, Y.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Matthews, J. N.; Mebarki, N.; Medina-Tanco, G.; Mehrad, L.; Mendoza, M. A.; Merino, A.; Mernik, T.; Meseguer, J.; Messaoud, S.; Micu, O.; Mimouni, J.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Nadji, B.; Nagano, M.; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Nardelli, A.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Painter, W.; Panasyuk, M. I.; Panico, B.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perdichizzi, M.; Pérez-Grande, I.; Perfetto, F.; Peter, T.; Picozza, P.; Pierog, T.; Pindado, S.; Piotrowski, L. W.; Piraino, S.; Placidi, L.; Plebaniak, Z.; Pliego, S.; Pollini, A.; Popescu, E. M.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Rabanal, J.; Radu, A. A.; Rahmani, M.; Reardon, P.; Reyes, M.; Rezazadeh, M.; Ricci, M.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez Cano, G.; Sagawa, H.; Sahnoune, Z.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sanchez, J. C.; Sánchez, J. 
L.; Santangelo, A.; Santiago Crúz, L.; Sanz-Andrés, A.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Sledd, J.; Słomińska, K.; Sobey, A.; Stan, I.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tahi, H.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Talai, M. C.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Traïche, M.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Vankova, G.; Vigorito, C.; Villaseñor, L.; Vlcek, B.; von Ballmoos, P.; Vrabel, M.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J., Jr.; Weber, M.; Weigand Muñoz, R.; Weindl, A.; Weiler, T. J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, S.; Young, R.; Zgura, I. S.; Zotov, M. Yu.; Zuccaro Marchi, A.
2017-09-01
We summarize the state of the art of a program of UV observations of meteor phenomena from space, a secondary objective of the JEM-EUSO international collaboration. Our preliminary analysis indicates that JEM-EUSO, taking advantage of its large FOV and good sensitivity, should be able to detect meteors down to an absolute magnitude close to 7. This means that JEM-EUSO should be able to record a statistically significant flux of meteors, including both sporadic ones and events produced by different meteor streams. Being unaffected by adverse weather conditions, JEM-EUSO can also be a very important facility for the detection of bright meteors and fireballs, as these events can be detected even in conditions of very high sky background. Moreover, for bright events that exhibit some persistence of the meteor train, preliminary simulations show that it should be possible to exploit the motion of the ISS itself and derive at least a rough 3D reconstruction of the meteor trajectory. The observing strategy developed to detect meteors may also be applied to the detection of nuclearites, exotic particles whose existence has been suggested by some theoretical investigations. Nuclearites are expected to move at higher velocities than meteoroids and to exhibit a wider range of possible trajectories, including particles moving upward after crossing the Earth. Some pilot studies, including the approved Mini-EUSO mission, a precursor of JEM-EUSO, are currently operational or in preparation. We are performing simulations to assess the performance of Mini-EUSO for meteor studies, and a few meteor events have already been detected using the ground-based facility EUSO-TA.
NASA Astrophysics Data System (ADS)
Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.; Thompson, David R.; Mattmann, Chris A.; Wagstaff, Kiri; Lazio, Joseph; Jones, Dayton
2015-01-01
“Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis. This is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of a software pipeline for metadata processing composed of both open source software components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.
Schwartz, Frank L; Vernier, Stanley J; Shubrook, Jay H; Marling, Cynthia R
2010-11-01
We have developed a prototypical case-based reasoning system to enhance management of patients with type 1 diabetes mellitus (T1DM). The system is capable of automatically analyzing large volumes of life events, self-monitoring of blood glucose readings, continuous glucose monitoring system results, and insulin pump data to detect clinical problems. In a preliminary study, manual entry of large volumes of life-event and other data was too burdensome for patients. In this study, life-event and pump data collection were automated, and then the system was reevaluated. Twenty-three adult T1DM patients on insulin pumps completed the five-week study. A usual daily schedule was entered into the database, and patients were only required to upload their insulin pump data to Medtronic's CareLink® Web site weekly. Situation assessment routines were run weekly for each participant to detect possible problems, and once the trial was completed, the case-retrieval module was tested. Using the situation assessment routines previously developed, the system found 295 possible problems. The enhanced system detected only 2.6 problems per patient per week compared to 4.9 problems per patient per week in the preliminary study (p=.017). Problems detected by the system were correctly identified in 97.9% of the cases, and 96.1% of these were clinically useful. With less life-event data, the system is unable to detect certain clinical problems and detects fewer problems overall. Additional work is needed to provide device/software interfaces that allow patients to provide this data quickly and conveniently. © 2010 Diabetes Technology Society.
Observation of Long Ionospheric Recoveries from Lightning-induced Electron Precipitation Events
NASA Astrophysics Data System (ADS)
Mohammadpour Salut, M.; Cohen, M.
2015-12-01
Lightning strokes induce nighttime disturbances in the lower ionosphere that can be detected through Very Low Frequency (VLF) remote sensing via at least two means: (1) direct heating and ionization, known as an Early event, and (2) triggered precipitation of energetic electrons from the radiation belts, known as Lightning-induced Electron Precipitation (LEP). For each, the ionospheric recovery time is typically a few minutes or less. A small class of Early events has been identified as having unusually long ionospheric recoveries (tens of minutes), with the underlying mechanism still in question. Our study shows for the first time that some LEP events also demonstrate unusually long recovery. The VLF events were detected by visual inspection of the recorded data in both the North-South and East-West magnetic fields. Data from the National Lightning Detection Network (NLDN) are used to determine the location and peak current of the lightning responsible for each lightning-associated VLF perturbation. LEP and Early VLF events are distinguished by measuring the time delay between the causative lightning discharges and the onset of all lightning-associated perturbations. LEP events typically possess an onset delay greater than ~200 ms following the causative lightning discharge, while the onset of Early VLF events is time-aligned (<20 ms) with the lightning return stroke. Nonducted LEP events are distinguished from ducted events based on the location of the causative lightning relative to the precipitation region. From 15 March to 20 April and 15 October to 15 November 2011, a total of 385 LEP events were observed at the Indiana, Montana, Colorado, and Oklahoma VLF sites on the NAA, NLK, and NML transmitter signals. Forty-six of these events exhibited a long recovery. The occurrence rate of ducted long-recovery LEP events was found to be higher than that of nonducted events.
Of the 46 long recovery LEP events, 33 events were induced by ducted whistlers, and 13 events were associated with nonducted obliquely propagating whistler waves. The occurrence of high peak current lightning strokes is a prerequisite for long recovery LEP events.
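The onset-delay criterion used above to separate the two event classes can be written down directly. The ~200 ms and <20 ms thresholds come from the text; the function name and the "ambiguous" bucket for intermediate delays are our own illustrative choices.

```python
def classify_vlf_event(onset_delay_ms):
    """Classify a lightning-associated VLF perturbation by the delay between
    the causative stroke and the perturbation onset. Thresholds follow the
    text (LEP: > ~200 ms; Early: time-aligned within ~20 ms); delays in
    between are left unclassified here."""
    if onset_delay_ms < 20:
        return "Early"
    if onset_delay_ms > 200:
        return "LEP"
    return "ambiguous"
```
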
Negated bio-events: analysis and identification
2013-01-01
Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. 
The resulting systems will be able to extract bio-events with attached polarities from textual documents, which can serve as the foundation for more elaborate systems that are able to detect mutually contradicting bio-events. PMID:23323936
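As a point of comparison for the machine learning framework described above, the simplest baseline for negated bio-event identification is a cue-window rule: mark a gold-standard event as negated when a negation cue occurs near its trigger token. The cue list and window size below are illustrative assumptions, not the paper's feature set.

```python
# Minimal cue-based baseline for negated bio-event detection. The paper's
# framework uses machine learning with richer features; this sketch only
# checks for a negation cue within a small token window of the event trigger.
NEGATION_CUES = {"not", "no", "failed", "unable", "absence", "lack", "without"}

def is_negated(tokens, trigger_index, window=4):
    """Return True if a negation cue appears within `window` tokens of the
    event trigger (window size is an illustrative choice)."""
    lo = max(0, trigger_index - window)
    hi = min(len(tokens), trigger_index + window + 1)
    return any(t.lower() in NEGATION_CUES for t in tokens[lo:hi])

tokens = "IL-2 did not induce STAT5 phosphorylation".split()
negated = is_negated(tokens, tokens.index("induce"))   # cue "not" is nearby
```
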
FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter
NASA Technical Reports Server (NTRS)
Gregory, Kyle; Hill, Joanne; Black, Kevin; Baumgartner, Wayne
2013-01-01
This technology enables detection and measurement of x-rays in an x-ray polarimeter using a field-programmable gate array (FPGA). The technology was developed for the Gravitational and Extreme Magnetism Small Explorer (GEMS) mission. It performs precision energy and timing measurements, as well as rejection of non-x-ray events. It enables the GEMS polarimeter to detect precisely when an event has taken place so that additional measurements can be made. The technology also enables this function to be performed in an FPGA using limited resources so that mass and power can be minimized while reliability for a space application is maximized and precise real-time operation is achieved. This design requires a low-noise, charge-sensitive preamplifier; a high-speed analog-to-digital converter (ADC); and an x-ray detector with a cathode terminal. It functions by computing a sum of differences for time-samples whose difference exceeds a programmable threshold. A state machine advances through states as a programmable number of consecutive samples exceeds or fails to exceed this threshold. The pulse height is recorded as the accumulated sum. The track length is also measured based on the time from the start to the end of accumulation. For track lengths longer than a certain length, the algorithm estimates the barycenter of charge deposit by comparing the accumulator value at the midpoint to the final accumulator value. The design also employs a number of techniques for rejecting background events. This innovation enables the function to be performed in space where it can operate autonomously with a rapid response time. This implementation combines advantages of computing system-based approaches with those of pure analog approaches. The result is an implementation that is highly reliable, performs in real-time, rejects background events, and consumes minimal power.
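The core sum-of-differences measurement can be illustrated in software. This sketch keeps only the essentials: differences above a programmable threshold are accumulated (pulse height), and the span of accumulating samples gives the track length. The consecutive-sample state machine, midpoint barycenter estimate, and background rejection of the real FPGA design are omitted, and the threshold value is an assumption.

```python
def detect_pulse(samples, diff_threshold=3):
    """Software sketch of the pulse measurement described above: accumulate
    sample-to-sample differences that exceed a programmable threshold.
    Pulse height = accumulated sum; track length = number of samples from
    the first to the last accumulating step."""
    acc = 0
    start = end = None
    for i in range(1, len(samples)):
        d = samples[i] - samples[i - 1]
        if d > diff_threshold:
            if start is None:
                start = i        # accumulation begins
            acc += d
            end = i              # accumulation (so far) ends here
    track_length = (end - start + 1) if start is not None else 0
    return acc, track_length

# A rising pulse on a flat baseline: height 30 ADC counts over 3 samples.
height, length = detect_pulse([0, 0, 10, 25, 30, 30, 30])
```
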
Heat waves in Senegal: detection, characterization and associated processes.
NASA Astrophysics Data System (ADS)
Gnacoussa Sambou, Marie Jeanne; Janicot, Serge; Badiane, Daouda; Pohl, Benjamin; Dieng, Abdou L.; Gaye, Amadou T.
2017-04-01
The atmospheric configuration and synoptic evolution of patterns associated with Senegalese heat waves (HWs) are examined over the period 1979-2014 using the Global Surface Summary of the Day (GSOD) observational database and the ERA-Interim reanalysis. Since there is no objective and uniform definition of HW events, threshold methods based on atmospheric variables such as daily maximum (Tmax) and minimum (Tmin) temperatures and daily mean apparent temperature (AT) are used for HW detection. Each criterion is related to a specific category of HW events: Tmax (warm day events), Tmin (warm night events) and AT (combining temperature and moisture). These definitions are used in order to characterize as well as possible the warm events over the Senegalese regions (oceanic versus continental). Statistics on the time evolution and spatial distribution of warm events are carried out over the two seasons of maximum temperature (March-May and October-November). For each season, a composite of HW events, as well as the most extended event over Senegal (as a case study), are analyzed using the usual atmospheric fields (sea level pressure, geopotential height, total column water content, wind components, 2m temperature). This study is part of the ACASIS project (https://acasis.locean-ipsl.upmc.fr/doku.php) on heat wave occurrences over the Sahel and their impact on health. Keywords: heat wave, Senegal, ACASIS.
Data Mining of the Public Version of the FDA Adverse Event Reporting System
Sakaeda, Toshiyuki; Tamon, Akiko; Kadoyama, Kaori; Okuno, Yasushi
2013-01-01
The US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS, formerly AERS) is a database that contains information on adverse event and medication error reports submitted to the FDA. Besides those from manufacturers, reports can be submitted by health care professionals and the public. The original system was started in 1969, but since the last major revision in 1997, reporting has markedly increased. Data mining algorithms have been developed for the quantitative detection of signals from such a large database, where a signal means a statistical association between a drug and an adverse event (a drug-associated adverse event). Such algorithms include the proportional reporting ratio (PRR), the reporting odds ratio (ROR), the information component (IC), and the empirical Bayes geometric mean (EBGM). A survey of our previous reports suggested that the ROR provided the highest number of signals and the EBGM the lowest. Additionally, an analysis of warfarin-, aspirin- and clopidogrel-associated adverse events suggested that all EBGM-based signals were included in the PRR-based signals, and also in the IC- or ROR-based ones, and that the PRR- and IC-based signals were included in the ROR-based ones. In this article, the latest information on this area is summarized for future pharmacoepidemiological studies and/or pharmacovigilance analyses. PMID:23794943
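Two of the measures named above, the PRR and the ROR, have standard textbook definitions over the 2×2 report-count table; the sketch below uses those standard pharmacovigilance formulas (the toy counts are illustrative, not FAERS data).

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table:
    a = target drug & target event, b = target drug & other events,
    c = other drugs & target event, d = other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

def ror(a, b, c, d):
    """Reporting odds ratio from the same table: (a/b) / (c/d) = ad / bc."""
    return (a * d) / (b * c)

# Illustrative counts: 20 reports pair the drug with the event of interest.
prr_value = prr(20, 80, 100, 9800)   # (20/100) / (100/9900) = 19.8
ror_value = ror(20, 80, 100, 9800)   # (20*9800) / (80*100) = 24.5
```

A signal is typically flagged when such a ratio (with its confidence interval) exceeds a predefined threshold; the thresholds themselves vary between studies.
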
Bruxism force detection by a piezoelectric film-based recording device in sleeping humans.
Baba, Kazuyoshi; Clark, Glenn T; Watanabe, Tatsutomi; Ohyama, Takashi
2003-01-01
To test the reliability and utility of a force-based bruxism detection system (Intra-Splint Force Detector [ISFD]) for multiple night recordings of forceful tooth-to-splint contacts in sleeping human subjects in their home environment. Bruxism-type forces, i.e., forceful tooth-to-splint contacts, during the night were recorded with this system in 12 subjects (6 bruxers and 6 controls) for 5 nights in their home environment; a laboratory-based nocturnal polysomnogram (NPSG) study was also performed on 1 of these subjects. All 12 subjects were able to use the device without substantial difficulty on a nightly basis. The bruxer group exhibited bruxism events of significantly longer duration than the control group (27 seconds/hour versus 7.4 seconds/hour, P < .01). An NPSG study performed on 1 subject revealed that, when the masseter muscle electromyogram (EMG) was used as a "gold standard," the ISFD had a sensitivity of 0.89. The correlation coefficient between the duration of events detected by the ISFD and the EMG was also 0.89. These results suggest that the ISFD is a system that can be used easily by the subjects and that has reasonable reliability for bruxism detection as reflected in forceful tooth-to-splint contacts during sleep.
Contribution of Infrasound to IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick
2016-04-01
Until 2003, two waveform technologies, i.e., seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to an unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010, the infrasound technology was reintroduced to IDC operations and has since contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Examples of REB events detected by the International Monitoring System (IMS) infrasound network include fireballs (e.g., the Bangkok fireball, 2015), volcanic eruptions (e.g., Calbuco, Chile, 2015) and large surface explosions (e.g., Tianjin, China, 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but are often detected at the infrasound stations as well. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of six years of infrasound data to IDC bulletins and provide examples of events recorded by the IMS infrasound network. Results of this study may help to improve the location of small events with observations at infrasound stations.
Automated detection of epileptic ripples in MEG using beamformer-based virtual sensors
NASA Astrophysics Data System (ADS)
Migliorelli, Carolina; Alonso, Joan F.; Romero, Sergio; Nowak, Rafał; Russi, Antonio; Mañanas, Miguel A.
2017-08-01
Objective. In epilepsy, high-frequency oscillations (HFOs) are closely linked to the seizure onset zone (SOZ). The detection of HFOs in the noninvasive signals from scalp electroencephalography (EEG) and magnetoencephalography (MEG) is still a challenging task. The aim of this study was to automate the detection of ripples in MEG signals by reducing the high-frequency noise using beamformer-based virtual sensors (VSs) and applying an automatic procedure for exploring the time-frequency content of the detected events. Approach. Two hundred seconds of MEG signal and simultaneous iEEG were selected from nine patients with refractory epilepsy. A two-stage algorithm was implemented. Firstly, beamforming was applied to the whole head to delimitate the region of interest (ROI) within a coarse grid of MEG-VS. Secondly, a beamformer using a finer grid in the ROI was computed. The automatic detection of ripples was performed using the time-frequency response provided by the Stockwell transform. Performance was evaluated through comparisons with simultaneous iEEG signals. Main results. ROIs were located within the seizure-generating lobes in the nine subjects. Precision and sensitivity values were 79.18% and 68.88%, respectively, considering iEEG-detected events as benchmarks. A higher number of ripples were detected inside the ROI compared to the same region in the contralateral lobe. Significance. The evaluation of interictal ripples using non-invasive techniques can help in the delimitation of the epileptogenic zone and guide placement of intracranial electrodes. This is the first study that automatically detects ripples in MEG in the time domain located within the clinically expected epileptic area taking into account the time-frequency characteristics of the events through the whole signal spectrum. The algorithm was tested against intracranial recordings, the current gold standard.
Further studies should explore this approach to enable the localization of noninvasively recorded HFOs to help during pre-surgical planning and to reduce the need for invasive diagnostics.
Transient Events in Archival Very Large Array Observations of the Galactic Center
NASA Astrophysics Data System (ADS)
Chiti, Anirudh; Chatterjee, Shami; Wharton, Robert; Cordes, James; Lazio, T. Joseph W.; Kaplan, David L.; Bower, Geoffrey C.; Croft, Steve
2016-12-01
The Galactic center has some of the highest stellar densities in the Galaxy and a range of interstellar scattering properties, which may aid in the detection of new radio-selected transient events. Here, we describe a search for radio transients in the Galactic center, using over 200 hr of archival data from the Very Large Array at 5 and 8.4 GHz. Every observation of Sgr A* from 1985 to 2005 has been searched using an automated processing and detection pipeline sensitive to transients with timescales between 30 s and 5 minutes with a typical detection threshold of ˜100 mJy. Eight possible candidates pass tests to filter false-positives from radio-frequency interference, calibration errors, and imaging artifacts. Two events are identified as promising candidates based on the smoothness of their light curves. Despite the high quality of their light curves, these detections remain suspect due to evidence of incomplete subtraction of the complex structure in the Galactic center, and apparent contingency of one detection on reduction routines. Events of this intensity (˜100 mJy) and duration (˜100 s) are not obviously associated with known astrophysical sources, and no counterparts are found in data at other wavelengths. We consider potential sources, including Galactic center pulsars, dwarf stars, sources like GCRT J1745-3009, and bursts from X-ray binaries. None can fully explain the observed transients, suggesting either a new astrophysical source or a subtle imaging artifact. More sensitive multiwavelength studies are necessary to characterize these events, which, if real, occur with a rate of 14^{+32}_{-12} hr^{-1} deg^{-2} in the Galactic center.
NASA Technical Reports Server (NTRS)
Jules, Kenol; Lin, Paul P.
2001-01-01
This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressor, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the events duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
Automatic Detection of Seizures with Applications
NASA Technical Reports Server (NTRS)
Olsen, Dale E.; Harris, John C.; Cutchis, Protagoras N.; Cristion, John A.; Lesser, Ronald P.; Webber, W. Robert S.
1993-01-01
There are an estimated two million people with epilepsy in the United States. Many of these people do not respond to anti-epileptic drug therapy. Two devices can be developed to assist in the treatment of epilepsy. The first is a microcomputer-based system designed to process massive amounts of electroencephalogram (EEG) data collected during long-term monitoring of patients for the purpose of diagnosing seizures, assessing the effectiveness of medical therapy, or selecting patients for epilepsy surgery. Such a device would select and display important EEG events. Currently many such events are missed. A second device could be implanted and would detect seizures and initiate therapy. Both of these devices require a reliable seizure detection algorithm. A new algorithm is described. It is believed to represent an improvement over existing seizure detection algorithms because better signal features were selected and better standardization methods were used.
Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko
2016-08-15
Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Vertically Integrated Seismological Analysis I : Modeling
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Jordan, M. I.; Sudderth, E.
2009-12-01
As part of its CTBT verification efforts, the International Data Centre (IDC) analyzes seismic and other signals collected from hundreds of stations around the world. Current processing at the IDC proceeds in a series of pipelined stages. From station processing to network processing, each decision is made on the basis of local information. This has the advantage of efficiency, and simplifies the structure of software implementations. However, this approach may reduce accuracy in the detection and phase classification of arrivals, association of detections to hypothesized events, and localization of small-magnitude events. In our work, we approach such detection and association problems as ones of probabilistic inference. In simple terms, let X be a random variable ranging over all possible collections of events, with each event defined by time, location, magnitude, and type (natural or man-made). Let Y range over all possible waveform signal recordings at all detection stations. Then Pθ(X) describes a parameterized generative prior over events, and Pφ(Y | X) describes how the signal is propagated and measured (including travel time, selective absorption and scattering, noise, artifacts, sensor bias, sensor failures, etc.). Given observed recordings Y = y, we are interested in the posterior P(X | Y = y), and perhaps in the value of X that maximizes it, i.e., the most likely explanation for all the sensor readings. As detailed below, an additional focus of our work is to robustly learn appropriate model parameters θ and φ from historical data. The primary advantage we expect is that decisions about arrivals, phase classifications, and associations are made with the benefit of all available evidence, not just the local signal or predefined recipes.
Important phenomena—such as the successful detection of sub-threshold signals, correction of phase classifications using arrival information at other stations, and removal of false events based on the absence of signals—should all fall out of our probabilistic framework without the need for special processing rules. In our baseline model, natural events occur according to a spatially inhomogeneous Poisson process. Complex events (swarms and aftershocks) may then be captured via temporally inhomogeneous extensions. Man-made events have a uniform probability of occurring anywhere on the earth, with a tendency to occur closer to the surface. Phases are modelled via their amplitude, frequency distribution, and origin. In the simplest case, transmission times are characterized via the one-dimensional IASPEI-91 model, accounting for model errors with Gaussian uncertainty. Such homogeneous, approximate physical models can be further refined via historical data and previously developed corrections. Signal measurements are captured by station-specific models, based on sensor types and geometries, local frequency absorption characteristics, and time-varying noise models. At the conference, we expect to be able to quantitatively demonstrate the advantages of our approach, at least for simulated data. When reporting their findings, such systems can easily flag low-confidence events, unexplained arrivals, and ambiguous classifications to focus the efforts of expert analysts.
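The generative setup above, a prior Pθ(X) over events and a likelihood Pφ(Y | X) over recordings, can be illustrated with a deliberately tiny single-station example. Everything numeric here (the flat noise likelihood, the fixed travel time, the candidate origin times, the priors) is a toy assumption standing in for the IASPEI-91 travel-time model and the station-specific models described in the abstract.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density, used as the travel-time error model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(y, travel_time=10.0, sigma=1.0, p_event=0.5,
              candidates=(0.0, 5.0, 20.0)):
    """P(X | Y = y) over 'none' plus a few candidate origin times, by Bayes'
    rule: prior times likelihood, normalized over all hypotheses."""
    prior_each = p_event / len(candidates)
    scores = {"none": (1 - p_event) * 0.01}  # flat likelihood: spurious pick
    for t0 in candidates:
        scores[t0] = prior_each * gaussian(y, t0 + travel_time, sigma)
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

post = posterior(y=15.2)            # arrival near origin time 5.0 + travel 10.0
map_hyp = max(post, key=post.get)   # the most likely explanation
```

The real system performs this inference jointly over all stations and all arrivals, which is what allows sub-threshold detections and false events to be handled within the same framework.
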
Besmer, Michael D.; Hammes, Frederik; Sigrist, Jürg A.; Ort, Christoph
2017-01-01
Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources.
This offers water utilities and public health authorities systematic ways to evaluate and optimize their current monitoring strategies. PMID:29213255
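The core trade-off between sampling interval and event-detection probability can be illustrated with a small Monte Carlo sketch. This is a generic illustration under stated assumptions (a single event of fixed duration, a fixed-interval sampling grid with random phase), not the study's actual simulation; the durations used are hypothetical.

```python
import random

def detection_probability(event_duration_h, sampling_interval_h,
                          trials=100_000, seed=1):
    """Monte Carlo estimate of the chance that fixed-interval sampling
    takes at least one sample during one event of the given duration,
    with a random phase offset between the grid and the event."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # random position of the first sample relative to the event start
        offset = rng.uniform(0, sampling_interval_h)
        # an event longer than the interval is always sampled; otherwise
        # it is hit only if a grid point falls inside its duration
        if (event_duration_h >= sampling_interval_h
                or offset <= event_duration_h):
            hits += 1
    return hits / trials
```

For events shorter than the sampling interval the hit probability is roughly duration/interval, which is why short contamination peaks routinely evade monthly grab samples.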
NASA Astrophysics Data System (ADS)
Do, T. D.; Pifer, A.; Chowdhury, Z.; Wahman, D.; Zhang, W.; Fairey, J.
2017-12-01
Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events that necessitate extensive flushing, resulting in the loss of billions of gallons of finished water. Biological techniques used to quantify the activity of nitrifying bacteria are impractical for real-time monitoring because they require significant laboratory efforts and/or lengthy incubation times. At present, DWU and CoH regularly rely on physicochemical parameters including total chlorine and monochloramine residual, and free ammonia, nitrite, and nitrate as indicators of nitrification, but these metrics lack specificity to nitrifying bacteria. To improve detection of nitrification in chloraminated drinking water distribution systems, we seek to develop a real-time fluorescence-based sensor system to detect the early onset of nitrification events by measuring the fluorescence of soluble microbial products (SMPs) specific to nitrifying bacteria. Preliminary data indicates that fluorescence-based metrics have the sensitivity to detect these SMPs in the early stages of nitrification, but several remaining challenges will be explored in this presentation. We will focus on benchtop and sensor results from ongoing batch and annular reactor experiments designed to (1) identify fluorescence wavelength pairs and data processing techniques suitable for measurement of SMPs from nitrification and (2) assess and correct potential interferences, such as those from monochloramine, pH, iron, nitrite, nitrate and humic substances. This work will serve as the basis for developing fluorescence sensor packages for full-scale testing and validation in the DWU and CoH systems. 
Findings from this research could be leveraged to identify nitrification events in their early stages, facilitating proactive interventions and decreasing the severity and frequency of nitrification episodes and water loss due to flushing.
Assessing autobiographical memory: the web-based autobiographical Implicit Association Test.
Verschuere, Bruno; Kleinberg, Bennett
2017-04-01
By assessing the association strength with TRUE and FALSE, the autobiographical Implicit Association Test (aIAT) [Sartori, G., Agosta, S., Zogmaister, C., Ferrara, S. D., & Castiello, U. (2008). How to accurately detect autobiographical events. Psychological Science, 19, 772-780. doi: 10.1111/j.1467-9280.2008.02156.x ] aims to determine which of two contrasting statements is true. To efficiently run well-powered aIAT experiments, we propose a web-based aIAT (web-aIAT). Experiment 1 (n = 522) is a web-based replication study of the first published aIAT study [Sartori, G., Agosta, S., Zogmaister, C., Ferrara, S. D., & Castiello, U. (2008). How to accurately detect autobiographical events. Psychological Science, 19, 772-780. doi: 10.1111/j.1467-9280.2008.02156.x ; Experiment 1]. We conclude that the replication was successful as the web-based aIAT could accurately detect which of two playing cards participants chose (AUC = .88; Hit rate = 81%). In Experiment 2 (n = 424), we investigated whether the use of affirmative versus negative sentences may partly explain the variability in aIAT accuracy findings. The aIAT could detect the chosen card when using affirmative (AUC = .90; Hit rate = 81%), but not when using negative sentences (AUC = .60; Hit rate = 53%). The web-based aIAT seems to be a valuable tool to facilitate aIAT research and may help to further identify moderators of the test's accuracy.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs, built on two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ = 0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ = -0.028), and between the DELTA and embedded feature-selection approaches was 88.1% (349 of 396, κ = -0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
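The pattern above, where raw agreement exceeds 88% while κ hovers near zero, is worth making concrete: when both methods rarely flag a signal, chance agreement is already very high. A minimal sketch of observed agreement and Cohen's kappa on a 2x2 table follows; the counts used in the usage note are hypothetical, chosen only to match the 360-of-396 agreement figure, not the study's actual cross-tabulation.

```python
def cohens_kappa(table):
    """Observed agreement and Cohen's kappa for a 2x2 table where
    table[i][j] = number of signals method A labeled i and method B
    labeled j (0 = not detected, 1 = detected)."""
    n = sum(sum(row) for row in table)
    p_obs = (table[0][0] + table[1][1]) / n
    # marginal probability that each method says "detected"
    a1 = (table[1][0] + table[1][1]) / n
    b1 = (table[0][1] + table[1][1]) / n
    # chance agreement under independent raters with these marginals
    p_exp = a1 * b1 + (1 - a1) * (1 - b1)
    return p_obs, (p_obs - p_exp) / (1 - p_exp)
```

For example, `cohens_kappa([[352, 18], [18, 8]])` reproduces the 90.9% raw agreement while yielding a kappa far below it, because 352 of the 396 cells are joint non-detections.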
Single Color Multiplexed ddPCR Copy Number Measurements and Single Nucleotide Variant Genotyping.
Wood-Bouwens, Christina M; Ji, Hanlee P
2018-01-01
Droplet digital PCR (ddPCR) allows for accurate quantification of genetic events such as copy number variation and single nucleotide variants. Probe-based assays represent the current "gold-standard" for detection and quantification of these genetic events. Here, we introduce a cost-effective single color ddPCR assay that allows for single genome resolution quantification of copy number and single nucleotide variation.
2013-04-24
DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals. The toolbox analyzes datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. As an illustration, the DETECT toolbox is applied to detecting signal artifacts found in continuous multi-channel EEG recordings.
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages (TERMA), detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. Results indicate that the window sizes for the two moving averages (W1 and W2) have to satisfy the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
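The two-moving-averages idea can be sketched as follows. This is an assumed simplification of the TERMA scheme, not the published algorithm: a short (event-scale) moving average is compared against a long (cycle-scale) one, and runs of samples where the short average dominates are merged into candidate events. Window sizes and the offset parameter beta are placeholders.

```python
def moving_average(x, w):
    # centered moving average with window w (edges truncated)
    half = w // 2
    return [sum(x[max(0, i - half): i + half + 1])
            / len(x[max(0, i - half): i + half + 1])
            for i in range(len(x))]

def terma_events(x, w1=3, w2=9, beta=0.0):
    """TERMA-style blocks of interest: flag samples where the short
    moving average exceeds the long one plus an offset, then merge
    consecutive flags into (start, end) events."""
    assert 2 * w1 <= w2 <= 8 * w1  # recommended window relation
    ma1, ma2 = moving_average(x, w1), moving_average(x, w2)
    mean_x = sum(x) / len(x)
    flags = [a > m2 + beta * mean_x for a, m2 in zip(ma1, ma2)]
    events, start = [], None
    for i, f in enumerate(flags + [False]):
        if f and start is None:
            start = i
        elif not f and start is not None:
            events.append((start, i - 1))
            start = None
    return events
```

Run on a flat signal with one peak, the detector returns a single block bracketing the peak; tuning w1 to the peak width and w2 to the event width is exactly what the stated inequality constrains.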
Use of sonification in the detection of anomalous events
NASA Astrophysics Data System (ADS)
Ballora, Mark; Cole, Robert J.; Kruesi, Heidi; Greene, Herbert; Monahan, Ganesh; Hall, David L.
2012-06-01
In this paper, we describe the construction of a soundtrack that fuses stock market data with information taken from tweets. This soundtrack, or auditory display, presents the numerical and text data in such a way that anomalous events may be readily detected, even by untrained listeners. The soundtrack generation is flexible, allowing an individual listener to create a unique audio mix from the available information sources. Properly constructed, the display exploits the auditory system's sensitivities to periodicities, to dynamic changes, and to patterns. This type of display could be valuable in environments that demand high levels of situational awareness based on multiple sources of incoming information.
Pennell, William E.; Sutton, Jr., Harry G.
1981-01-01
Method and apparatus for detecting failure in a welded connection, particularly applicable to not readily accessible welds, such as those joining components within the reactor vessel of a nuclear reactor system. A preselected tag gas is sealed within a chamber which extends through selected portions of the base metal and weld deposit. In the event of a failure, such as development of a crack extending from the chamber to an outer surface, the tag gas is released. The environment about the welded area is directed to an analyzer which, in the event of presence of the tag gas, evidences the failure. A trigger gas can be included with the tag gas to actuate the analyzer.
Phase-Space Detection of Cyber Events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez Jimenez, Jarilyn M; Ferber, Aaron E; Prowell, Stacy J
Energy Delivery Systems (EDS) are a network of processes that produce, transfer and distribute energy. EDS are increasingly dependent on networked computing assets, as are many Industrial Control Systems. Consequently, cyber-attacks pose a real and pertinent threat, as evidenced by Stuxnet, Shamoon and Dragonfly. Hence, there is a critical need for novel methods to detect, prevent, and mitigate effects of such attacks. To detect cyber-attacks in EDS, we developed a framework for gathering and analyzing timing data that involves establishing a baseline execution profile and then capturing the effect of perturbations in the state from injecting various malware. The data analysis was based on nonlinear dynamics and graph theory to improve detection of anomalous events in cyber applications. The goal was the extraction of changing dynamics or anomalous activity in the underlying computer system. Takens' theorem in nonlinear dynamics allows reconstruction of topologically invariant, time-delay-embedding states from the computer data in a sufficiently high-dimensional space. The resultant dynamical states were nodes, and the state-to-state transitions were links in a mathematical graph. Alternatively, sequential tabulation of executing instructions provides the nodes with corresponding instruction-to-instruction links. Graph theorems guarantee graph-invariant measures to quantify the dynamical changes in the running applications. Results showed a successful detection of cyber events.
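The state-to-graph pipeline can be sketched compactly. The code below is a generic illustration of the two steps named in the abstract, a Takens-style time-delay embedding followed by coarse-graining states into graph nodes with transition edges; it is not the authors' implementation, and the embedding dimension, delay, and bin count are arbitrary choices.

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Takens-style time-delay embedding of a scalar series into
    `dim`-dimensional state vectors with delay `tau`."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def transition_graph(states, n_bins=4):
    """Coarse-grain embedded states into discrete symbols (nodes) and
    count state-to-state transitions (weighted directed edges)."""
    lo, hi = states.min(), states.max()
    scaled = (states - lo) / (hi - lo + 1e-12) * n_bins
    symbols = [tuple(v) for v in
               np.clip(scaled.astype(int), 0, n_bins - 1)]
    edges = {}
    for a, b in zip(symbols, symbols[1:]):
        edges[(a, b)] = edges.get((a, b), 0) + 1
    return edges
```

Graph-level statistics on `edges` (node count, edge distribution) then serve as the invariant measures whose drift signals anomalous execution.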
Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab
2017-01-01
As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for the transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as PCR (polymerase chain reaction) and enzyme-linked immunosorbent assay (ELISA) were routinely employed for DNA- and protein-based quantification, respectively. Although these techniques (PCR and ELISA) are convenient and productive, there is a need for more advanced technologies that allow high-throughput detection and quantification of GM events, as the production of more complex GMOs is increasing day by day. Therefore, recent approaches like microarray, capillary gel electrophoresis, digital PCR and next-generation sequencing are more promising due to their accuracy and precise detection of transgenic contents. The present article is a brief comparative study of all such detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. However, detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins remain critical issues that these emerging technologies must address in the future. PMID:29085378
Event-based Sensing for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.
A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate.
In addition to low-bandwidth communications requirements, the low weight, low-power and high-speed make them ideally suitable to meeting the demanding challenges required by space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground and space-based SSA tasks.
Traffic Congestion Detection System through Connected Vehicles and Big Data
Cárdenas-Benítez, Néstor; Aquino-Santos, Raúl; Magaña-Espinoza, Pedro; Aguilar-Velazco, José; Edwards-Block, Arthur; Medina Cass, Aldo
2016-01-01
This article discusses the simulation and evaluation of a traffic congestion detection system which combines inter-vehicular communications, fixed roadside infrastructure and infrastructure-to-infrastructure connectivity and big data. The system discussed in this article permits drivers to identify traffic congestion and change their routes accordingly, thus reducing the total emissions of CO2 and decreasing travel time. This system monitors, processes and stores large amounts of data, which can detect traffic congestion in a precise way by means of a series of algorithms that reduces localized vehicular emission by rerouting vehicles. To simulate and evaluate the proposed system, a big data cluster was developed based on Cassandra, which was used in tandem with the OMNeT++ discrete event network simulator, coupled with the SUMO (Simulation of Urban MObility) traffic simulator and the Veins vehicular network framework. The results validate the efficiency of the traffic detection system and its positive impact in detecting, reporting and rerouting traffic when traffic events occur. PMID:27136548
Ho, Daniel W H; Sze, Karen M F; Ng, Irene O L
2015-08-28
Viral integration into the human genome upon infection is an important risk factor for various human malignancies. We developed a viral integration site detection tool, Virus-Clip, which makes use of information extracted from soft-clipped sequencing reads to identify the exact positions of the human and virus breakpoints of integration events. With initial read alignment to the virus reference genome and streamlined procedures, Virus-Clip delivers a simple, fast and memory-efficient solution to viral integration site detection. Moreover, it can automatically annotate the integration events with the corresponding affected human genes. Virus-Clip has been verified using whole-transcriptome sequencing data, and its detection was validated to have satisfactory sensitivity and specificity. Marked improvement in performance was observed compared to existing tools. It is applicable to versatile types of data including whole-genome sequencing, whole-transcriptome sequencing, and targeted sequencing. Virus-Clip is available at http://web.hku.hk/~dwhho/Virus-Clip.zip.
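The soft-clip idea can be sketched in a few lines. The code below is a generic illustration of how a soft-clipped SAM alignment pinpoints a breakpoint, not Virus-Clip's actual implementation: when a read aligns to the virus genome with a clipped prefix or suffix, the clip boundary is the virus-side breakpoint and the clipped subsequence is what gets re-mapped to the human genome. The function name and example coordinates are invented.

```python
import re

def softclip_breakpoints(pos, cigar, seq):
    """Locate soft clips in a SAM alignment (1-based `pos`). A leading
    clip marks a breakpoint at `pos`; a trailing clip marks one at the
    last reference base covered. Returns (side, breakpoint, clipped_seq)
    tuples; the clipped sequence is what would map to the other genome."""
    ops = [(int(n), op) for n, op in re.findall(r"(\d+)([MIDNSHP=X])", cigar)]
    out, ref, read = [], pos, 0
    for i, (ln, op) in enumerate(ops):
        if op == "S":
            if i == 0:
                out.append(("left", pos, seq[:ln]))
            else:
                out.append(("right", ref - 1, seq[read:read + ln]))
        if op in "M=XDN":   # operations that consume the reference
            ref += ln
        if op in "M=XIS":   # operations that consume the read
            read += ln
    return out
```

For instance, a read aligned at virus position 1000 with CIGAR `70M30S` places the virus-side breakpoint at base 1069, and the 30 clipped bases are the candidate human-side sequence.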
Baer, Atar; Elbert, Yevgeniy; Burkom, Howard S; Holtry, Rekha; Lombardo, Joseph S; Duchin, Jeffrey S
2011-03-01
We evaluated emergency department (ED) data, emergency medical services (EMS) data, and public utilities data for describing an outbreak of carbon monoxide (CO) poisoning following a windstorm. Syndromic ED data were matched against previously collected chart abstraction data. We ran detection algorithms on selected time series derived from all 3 data sources to identify health events associated with the CO poisoning outbreak. We used spatial and spatiotemporal scan statistics to identify geographic areas that were most heavily affected by the CO poisoning event. Of the 241 CO cases confirmed by chart review, 190 (78.8%) were identified in the syndromic surveillance data as exact matches. Records from the ED and EMS data detected an increase in CO-consistent syndromes after the storm. The ED data identified significant clusters of CO-consistent syndromes, including zip codes that had widespread power outages. Weak temporal gastrointestinal (GI) signals, possibly resulting from ingestion of food spoiled by lack of refrigeration, were detected in the ED data but not in the EMS data. Spatial clustering of GI-based groupings in the ED data was not detected. Data from this evaluation support the value of ED data for surveillance after natural disasters. Enhanced EMS data may be useful for monitoring a CO poisoning event, if these data are available to the health department promptly. ©2011 American Medical Association. All rights reserved.
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it on real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition.
Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess quality of response functions obtained through processing.
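A continuous wavelet transform of the kind used for this time-frequency decomposition can be sketched directly. The code below is a minimal, generic Morlet CWT by direct convolution plus a threshold detector on the scale-maximum envelope; it is an illustration of the general technique, not the authors' detection pipeline, and the scales, w0, and threshold are placeholder choices.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform of a real series with a Morlet
    wavelet, evaluated by direct convolution at each scale."""
    coeffs = np.empty((len(scales), len(x)), dtype=complex)
    for k, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        # complex oscillation under a Gaussian envelope, 1/sqrt(s) norm
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)
        coeffs[k] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return coeffs

def detect_events(coeffs, threshold):
    # flag time indices where any scale's |coefficient| exceeds threshold
    power = np.abs(coeffs).max(axis=0)
    return np.where(power > threshold)[0]
```

A transient burst buried in a flat background lights up a localized ridge in |coeffs| at the matching scale, which is the "specific signature" that makes each source type separable in the time-frequency plane.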
Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data
NASA Technical Reports Server (NTRS)
Rompala, John T.
2005-01-01
A ground based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
Twitter earthquake detection: Earthquake monitoring in a social world
Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.
2011-01-01
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
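The short-term-average/long-term-average trigger is simple enough to sketch in full. This is a textbook STA/LTA on a per-bin count series, consistent with the approach the abstract names but with window lengths and threshold chosen arbitrarily for illustration, not the USGS's tuned values.

```python
def sta_lta_detect(counts, sta_len=2, lta_len=60, threshold=5.0):
    """Short-term-average / long-term-average detector on a
    tweet-frequency time series (counts per time bin). Returns the
    bin indices where STA/LTA meets or exceeds `threshold`."""
    triggers = []
    for i in range(lta_len, len(counts)):
        # short window ending at the current bin (inclusive)
        sta = sum(counts[i - sta_len + 1: i + 1]) / sta_len
        # long window strictly before the current bin
        lta = sum(counts[i - lta_len: i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers
```

Because the LTA window excludes the current bin, a sudden burst of "earthquake" tweets lifts the STA immediately while the baseline lags, producing the fast (order-of-minutes) triggers described above.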
Candidate Binary Microlensing Events from the MACHO Project
NASA Astrophysics Data System (ADS)
Becker, A. C.; Alcock, C.; Allsman, R. A.; Alves, D. R.; Axelrod, T. S.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.; King, L. J.; Lehner, M. J.; Marshall, S. L.; Minniti, D.; Peterson, B. A.; Popowski, P.; Pratt, M. R.; Quinn, P. J.; Rodgers, A. W.; Stubbs, C. W.; Sutherland, W.; Tomaney, A.; Vandehei, T.; Welch, D. L.; Baines, D.; Brakel, A.; Crook, B.; Howard, J.; Leach, T.; McDowell, D.; McKeown, S.; Mitchell, J.; Moreland, J.; Pozza, E.; Purcell, P.; Ring, S.; Salmon, A.; Ward, K.; Wyper, G.; Heller, A.; Kaspi, S.; Kovo, O.; Maoz, D.; Retter, A.; Rhie, S. H.; Stetson, P.; Walker, A.; MACHO Collaboration
1998-12-01
We present the lightcurves of 22 gravitational microlensing events from the first six years of the MACHO Project gravitational microlensing survey which are likely examples of lensing by binary systems. These events were selected from a total sample of ~300 events which were either detected by the MACHO Alert System or discovered through retrospective analyses of the MACHO database. Many of these events appear to have undergone a caustic or cusp crossing, and 2 of the events are well fit with lensing by binary systems with large mass ratios, indicating secondary companions of approximately planetary mass. The event rate is roughly consistent with predictions based upon our knowledge of the properties of binary stars. The utility of binary lensing in helping to solve the Galactic dark matter problem is demonstrated with analyses of 3 binary microlensing events seen towards the Magellanic Clouds. Source star resolution during caustic crossings in 2 of these events allows us to estimate the location of the lensing systems, assuming each source is a single star and not a short period binary.
* MACHO LMC-9 appears to be a binary lensing event with a caustic crossing partially resolved in 2 observations. The resulting lens proper motion appears too small for a single source and LMC disk lens. However, it is considerably less likely to be a single source star and Galactic halo lens. We estimate the a priori probability of a short period binary source with a detectable binary character to be ~10%. If the source is also a binary, then we currently have no constraints on the lens location.
* The most recent of these events, MACHO 98-SMC-1, was detected in real-time. Follow-up observations by the MACHO/GMAN, PLANET, MPS, EROS and OGLE microlensing collaborations lead to the robust conclusion that the lens likely resides in the SMC.
A scalable multi-photon coincidence detector based on superconducting nanowires.
Zhu, Di; Zhao, Qing-Yuan; Choi, Hyeongrak; Lu, Tsung-Ju; Dane, Andrew E; Englund, Dirk; Berggren, Karl K
2018-06-04
Coincidence detection of single photons is crucial in numerous quantum technologies and usually requires multiple time-resolved single-photon detectors. However, the electronic readout becomes a major challenge when the measurement basis scales to large numbers of spatial modes. Here, we address this problem by introducing a two-terminal coincidence detector that enables scalable readout of an array of detector segments based on a superconducting nanowire microstrip transmission line. Exploiting timing logic, we demonstrate a sixteen-element detector that resolves all 136 possible single-photon and two-photon coincidence events. We further explore the pulse shapes of the detector output and resolve up to four-photon events in a four-element device, giving the detector photon-number-resolving capability. This new detector architecture and operating scheme will be particularly useful for multi-photon coincidence detection in large-scale photonic integrated circuits.
An Ultralow-Power Sleep Spindle Detection System on Chip.
Iranmanesh, Saam; Rodriguez-Villegas, Esther
2017-08-01
This paper describes a full system-on-chip to automatically detect sleep spindle events from scalp EEG signals. These events, which are known to play an important role in memory consolidation during sleep, are also characteristic of a number of neurological diseases. The operation of the system is based on a previously reported algorithm, which used the Teager energy operator together with the Spectral Edge Frequency (SEF50), achieving more than 70% sensitivity and 98% specificity. The algorithm was converted into a customized analog hardware implementation in order to achieve extremely low power consumption. Experimental results show that the system, which is fabricated in a 0.18 μm CMOS technology, is able to operate from a 1.25 V power supply consuming only 515 nW, with an accuracy comparable to that of its software counterpart.
The European Infrasound Bulletin
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Vergoz, Julien; Le Pichon, Alexis; Brachet, Nicolas; Blanc, Elisabeth; Kero, Johan; Liszka, Ludwik; Gibbons, Steven; Kvaerna, Tormod; Näsholm, Sven Peter; Marchetti, Emanuele; Ripepe, Maurizio; Smets, Pieter; Evers, Laslo; Ghica, Daniela; Ionescu, Constantin; Sindelarova, Tereza; Ben Horin, Yochai; Mialle, Pierrick
2018-05-01
The European Infrasound Bulletin highlights infrasound activity produced mostly by anthropogenic sources, recorded all over Europe and collected in the course of the ARISE and ARISE2 projects (Atmospheric dynamics Research InfraStructure in Europe). Data include high-frequency (> 0.7 Hz) infrasound detections at 24 European infrasound arrays from nine different national institutions, complemented with infrasound stations of the International Monitoring System for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Data were acquired during 16 years of operation (from 2000 to 2015) and processed to identify and locate ~48,000 infrasound events within Europe. The source locations of these events were derived by combining at least two corresponding station detections per event. Comparisons with ground-truth sources, e.g., Scandinavian mining activity, are provided, as well as comparisons with the CTBT Late Event Bulletin (LEB). Relocation is performed using ray-tracing methods to estimate celerity and back-azimuth corrections for source location, based on meteorological wind and temperature values for each event derived from European Centre for Medium-range Weather Forecasts (ECMWF) data. This study focuses on the analysis of repeating, man-made infrasound events (e.g., mining blasts and supersonic flights) and on the seasonal, weekly and diurnal variation of the infrasonic activity of sources in Europe. Comparison with previous studies shows that this study improves detection, association and location by increasing the station density and thus the number of events and determined source regions, enhancing the capability of the European infrasound station network to estimate the activity of anthropogenic infrasound sources more comprehensively.
Radon backgrounds in the DEAP-1 liquid-argon-based Dark Matter detector
NASA Astrophysics Data System (ADS)
Amaudruz, P.-A.; Batygov, M.; Beltran, B.; Boudjemline, K.; Boulay, M. G.; Cai, B.; Caldwell, T.; Chen, M.; Chouinard, R.; Cleveland, B. T.; Contreras, D.; Dering, K.; Duncan, F.; Ford, R.; Gagnon, R.; Giuliani, F.; Gold, M.; Golovko, V. V.; Gorel, P.; Graham, K.; Grant, D. R.; Hakobyan, R.; Hallin, A. L.; Harvey, P.; Hearns, C.; Jillings, C. J.; Kuźniak, M.; Lawson, I.; Li, O.; Lidgard, J.; Liimatainen, P.; Lippincott, W. H.; Mathew, R.; McDonald, A. B.; McElroy, T.; McFarlane, K.; McKinsey, D.; Muir, A.; Nantais, C.; Nicolics, K.; Nikkel, J.; Noble, T.; O'Dwyer, E.; Olsen, K. S.; Ouellet, C.; Pasuthip, P.; Pollmann, T.; Rau, W.; Retiere, F.; Ronquest, M.; Skensved, P.; Sonley, T.; Tang, J.; Vázquez-Jáuregui, E.; Veloce, L.; Ward, M.
2015-03-01
The DEAP-1 7 kg single-phase liquid argon scintillation detector was operated underground at SNOLAB in order to test the techniques and measure the backgrounds inherent to single-phase detection, in support of the DEAP-3600 Dark Matter detector. Backgrounds in DEAP are controlled through material selection, construction techniques, pulse shape discrimination, and event reconstruction. This report details the analysis of background events observed in three iterations of the DEAP-1 detector, and the measures taken to reduce them. The ²²²Rn decay rate in the liquid argon was measured to be between 16 and 26 μBq kg⁻¹. We found that the background spectrum near the region of interest for Dark Matter detection in the DEAP-1 detector can be described considering events from three sources: radon daughters decaying on the surface of the active volume, the expected rate of electromagnetic events misidentified as nuclear recoils due to inefficiencies in the pulse shape discrimination, and leakage of events from outside the fiducial volume due to imperfect position reconstruction. These backgrounds statistically account for all observed events, and they will be strongly reduced in the DEAP-3600 detector due to its higher light yield and simpler geometry.
DEAP-3600 Data Acquisition System
NASA Astrophysics Data System (ADS)
Lindner, Thomas
2015-12-01
DEAP-3600 is a dark matter experiment using liquid argon to detect Weakly Interacting Massive Particles (WIMPs). The DEAP-3600 Data Acquisition (DAQ) has been built using a combination of commercial and custom electronics, organized using the MIDAS framework. The DAQ system needs to suppress a high rate of background events from ³⁹Ar beta decays. This suppression is implemented using a combination of online firmware and software-based event filtering. We will report on progress commissioning the DAQ system, as well as the development of the web-based user interface.
Results from the MACHO Galactic Pixel Lensing Search
NASA Astrophysics Data System (ADS)
Drake, Andrew J.; Minniti, Dante; Alcock, Charles; Allsman, Robyn A.; Alves, David; Axelrod, Tim S.; Becker, Andrew C.; Bennett, David; Cook, Kem H.; Freeman, Ken C.; Griest, Kim; Lehner, Matt; Marshall, Stuart; Peterson, Bruce; Pratt, Mark; Quinn, Peter; Rodgers, Alex; Stubbs, Chris; Sutherland, Will; Tomaney, Austin; Vandehei, Thor; Welch, Doug L.
The MACHO, EROS, OGLE and AGAPE collaborations have been studying the nature of the Galactic halo for a number of years using microlensing events. The MACHO group undertakes observations of the LMC, SMC and Galactic Bulge, monitoring the light curves of millions of stars to detect microlensing. Most of these fields are crowded to the extent that all the monitored stars are blended. Such crowding makes accurate photometry difficult. We apply the new technique of Difference Image Analysis (DIA) to archival data to improve the photometry and increase both the detection sensitivity and effective search area. The application of this technique also allows us to detect so-called `pixel lensing' events. These are microlensing events in which the source star is only detectable during lensing. The detection of these events will allow us to substantially increase the number of detected microlensing events. We present a light curve demonstrating the detection of a pixel lensing event with this technique.
Kamphuis, C; Frank, E; Burke, J K; Verkerk, G A; Jago, J G
2013-01-01
The hypothesis was that sensors currently available on farm that monitor behavioral and physiological characteristics have potential for the detection of lameness in dairy cows. This was tested by applying additive logistic regression to variables derived from sensor data. Data were collected between November 2010 and June 2012 on 5 commercial pasture-based dairy farms. Sensor data from weigh scales (liveweight), pedometers (activity), and milk meters (milking order, unadjusted and adjusted milk yield in the first 2 min of milking, total milk yield, and milking duration) were collected at every milking from 4,904 cows. Lameness events were recorded by farmers who were trained in detecting lameness before the study commenced. A total of 318 lameness events affecting 292 cows were available for statistical analyses. For each lameness event, the lame cow's sensor data for a time period of 14 d before observation date were randomly matched by farm and date to 10 healthy cows (i.e., cows that were not lame and had no other health event recorded for the matched time period). Sensor data relating to the 14-d time periods were used for developing univariable (using one source of sensor data) and multivariable (using multiple sources of sensor data) models. Model development involved the use of additive logistic regression by applying the LogitBoost algorithm with a regression tree as base learner. The model's output was a probability estimate for lameness, given the sensor data collected during the 14-d time period. Models were validated using leave-one-farm-out cross-validation and, as a result of this validation, each cow in the data set (318 lame and 3,180 nonlame cows) received a probability estimate for lameness. Based on the area under the curve (AUC), results indicated that univariable models had low predictive potential, with the highest AUC values found for liveweight (AUC=0.66), activity (AUC=0.60), and milking order (AUC=0.65). 
Combining these 3 sensors improved AUC to 0.74. Detection performance of this combined model varied between farms, but it consistently and significantly outperformed univariable models across farms at a fixed specificity of 80%. Still, detection performance was not high enough to be implemented in practice on large, pasture-based dairy farms. Future research may improve performance by developing variables that are based on sensor data of liveweight, activity, and milking order but better describe changes in sensor data patterns when cows go lame.
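The AUC figures quoted above summarize how well each model's probability estimates rank lame above nonlame cows. As a minimal, self-contained illustration (a rank-statistic AUC on toy numbers, not the paper's LogitBoost model or its data), the metric can be computed as:

```python
import numpy as np

def auc_score(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U)
    statistic; assumes no tied scores for simplicity."""
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    ranks = scores.argsort().argsort() + 1          # 1-based ranks
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    rank_sum = ranks[labels == 1].sum()
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy data: 1 = lame, 0 = healthy; scores are model probabilities.
labels = [1, 1, 0, 0, 0]
scores = [0.9, 0.4, 0.5, 0.2, 0.1]
print(auc_score(labels, scores))  # 5 of 6 lame/healthy pairs ranked correctly -> 5/6
```

An AUC of 0.5 corresponds to random ranking, which puts the study's combined-model value of 0.74 in context.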
Kostal, Vratislav; Arriaga, Edgar A.
2011-01-01
Interactions between the cytoskeleton and mitochondria are essential for normal cellular function. An assessment of such interactions is commonly based on bulk analysis of mitochondrial and cytoskeletal markers present in a given sample, which assumes complete binding between these two organelle types. Such measurements are biased because they rarely account for non-bound 'free' subcellular species. Here we report on the use of capillary electrophoresis with dual laser-induced fluorescence detection (CE-LIF) to identify, classify, count and quantify properties of individual binding events of mitochondria and cytoskeleton. Mitochondria were fluorescently labeled with DsRed2, while F-actin, a major cytoskeletal component, was fluorescently labeled with Alexa488-phalloidin. In a typical subcellular fraction of L6 myoblasts, 79% of mitochondrial events did not have detectable levels of F-actin, while the rest had on average ~2 zeptomole F-actin, which theoretically represents a ~2.5-μm-long network of actin filaments per event. Trypsin treatment of L6 subcellular fractions prior to analysis decreased the fraction of mitochondrial events with detectable levels of F-actin, which is expected from digestion of cytoskeletal proteins on the surface of mitochondria. The electrophoretic mobility distributions of the individual events were also used to further distinguish cytoskeleton-bound from cytoskeleton-free mitochondrial events. The CE-LIF approach described here could be further developed to explore cytoskeleton interactions with other subcellular structures, the effects of cytoskeleton-destabilizing drugs, and the progression of viral infections.
NASA Technical Reports Server (NTRS)
Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)
2007-01-01
A method and apparatus for detecting and determining event characteristics, such as the material failure of a component, in a manner that significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detecting the occurrence of the triggering event, which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing
2016-05-01
Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (from no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%).
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, versus an average of 1.5 over all events), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, the therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are ones that the barrier was not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
Detection and quantification system for monitoring instruments
Dzenitis, John M [Danville, CA; Hertzog, Claudia K [Houston, TX; Makarewicz, Anthony J [Livermore, CA; Henderer, Bruce D [Livermore, CA; Riot, Vincent J [Oakland, CA
2008-08-12
A method of detecting real events by obtaining a set of recent signal results, calculating measures of the noise or variation based on the set of recent signal results, calculating an expected baseline value based on the set of recent signal results, determining sample deviation, calculating an allowable deviation by multiplying the sample deviation by a threshold factor, setting an alarm threshold from the baseline value plus or minus the allowable deviation, and determining whether the signal results exceed the alarm threshold.
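The detection steps enumerated in the claim (baseline from recent results, allowable deviation as a multiple of the sample deviation, alarm band around the baseline) can be paraphrased in a few lines of code. This is a sketch with illustrative names and an assumed threshold factor, not the patented implementation:

```python
import numpy as np

def alarm_threshold(recent, k=3.0):
    """Compute the alarm band from a set of recent signal results:
    baseline +/- (threshold factor k) * (sample standard deviation)."""
    recent = np.asarray(recent, dtype=float)
    baseline = recent.mean()           # expected baseline value
    sample_dev = recent.std(ddof=1)    # measure of noise/variation
    allowable = k * sample_dev         # allowable deviation
    return baseline - allowable, baseline + allowable

def is_event(signal, recent, k=3.0):
    """Flag a real event when the signal falls outside the alarm band."""
    lo, hi = alarm_threshold(recent, k)
    return signal < lo or signal > hi

recent = [10.1, 9.9, 10.0, 10.2, 9.8]
print(is_event(15.0, recent))  # large excursion -> True
print(is_event(10.1, recent))  # within the band -> False
```

With a threshold factor of 3, a reading is flagged only when it departs from the recent baseline by more than three sample standard deviations, which is the sense in which the method suppresses noise-driven false alarms.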
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the number of transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys; therefore, the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
NASA Astrophysics Data System (ADS)
Reynen, Andrew; Audet, Pascal
2017-09-01
A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with few false alarms, allowing for a larger automatic event catalogue with a high degree of trust.
Full On-Device Stay Points Detection in Smartphones for Location-Based Mobile Applications.
Pérez-Torres, Rafael; Torres-Huitzil, César; Galeana-Zapién, Hiram
2016-10-13
The tracking of frequently visited places, also known as stay points, is a critical feature in location-aware mobile applications as a way to adapt the information and services provided to smartphone users according to their moving patterns. Location-based applications usually employ the GPS receiver along with Wi-Fi hot-spots and cellular cell tower mechanisms for estimating user location. Typically, fine-grained GPS location data are collected by the smartphone and transferred to dedicated servers for trajectory analysis and stay points detection. Such a Mobile Cloud Computing approach has been successfully employed for extending smartphone battery lifetime by offloading computation costs, assuming that on-device stay points detection is prohibitive. In this article, we propose and validate the feasibility of an alternative event-driven mechanism for stay points detection that is executed fully on-device, and that provides higher energy savings by avoiding communication costs. Our solution is encapsulated in a sensing middleware for Android smartphones, where a stream of GPS location updates is collected in the background, supporting duty cycling schemes, and incrementally analyzed following an event-driven paradigm for stay points detection. To evaluate the performance of the proposed middleware, real-world experiments were conducted under different stress levels, validating its power efficiency when compared against a Mobile Cloud Computing oriented solution.
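A common incremental stay-point rule, in the spirit of the on-device analysis described above, reports a stay point when consecutive GPS fixes remain within a distance bound of an anchor fix for a minimum dwell time. The sketch below is illustrative; the thresholds and function names are assumptions, not taken from the paper's middleware:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def stay_points(track, d_max=200.0, t_min=600.0):
    """Scan (lat, lon, timestamp) fixes; emit (lat, lon, arrival, departure)
    whenever fixes stay within d_max metres of an anchor for >= t_min seconds."""
    points, i, n = [], 0, len(track)
    while i < n:
        j = i + 1
        while j < n and haversine_m(track[i][:2], track[j][:2]) <= d_max:
            j += 1
        if track[j - 1][2] - track[i][2] >= t_min:
            cluster = track[i:j]
            lat = sum(p[0] for p in cluster) / len(cluster)
            lon = sum(p[1] for p in cluster) / len(cluster)
            points.append((lat, lon, track[i][2], track[j - 1][2]))
        i = j
    return points

# A user lingers near (45.0, -75.0) for 15 minutes, then moves away.
track = [(45.0, -75.0, 0), (45.0001, -75.0001, 450), (45.0002, -75.0, 900),
         (45.1, -75.1, 1200)]
print(stay_points(track))
```

Because each fix is examined once as it arrives, this style of analysis fits the event-driven, on-device setting the paper argues for, with no trajectory upload required.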
Video Traffic Analysis for Abnormal Event Detection
DOT National Transportation Integrated Search
2010-01-01
We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...
NASA Astrophysics Data System (ADS)
Royer, J.-Y.; Chateau, R.; Dziak, R. P.; Bohnenstiehl, D. R.
2015-08-01
This paper presents the results from the Deflo-hydroacoustic experiment in the Southern Indian Ocean using three autonomous underwater hydrophones, complemented by two permanent hydroacoustic stations. The array monitored for 14 months, from November 2006 to December 2007, a 3000 × 3000 km wide area encompassing large segments of the three Indian spreading ridges that meet at the Indian Triple Junction. A catalogue of 11 105 acoustic events is derived from the recorded data, of which 55 per cent are located from three hydrophones, 38 per cent from four, 6 per cent from five and less than 1 per cent from six hydrophones. From a comparison with land-based seismic catalogues, the smallest detected earthquakes are mb 2.6 in size, the range of recorded magnitudes is about twice that of land-based networks and the number of detected events is 5-16 times larger. Seismicity patterns vary between the three spreading ridges, with activity mainly focused on transform faults along the fast-spreading Southeast Indian Ridge and more evenly distributed along spreading segments and transforms on the slow-spreading Central and ultra-slow-spreading Southwest Indian ridges; the Central Indian Ridge is the most active of the three, with an average of 1.9 events/100 km/month. Along the Sunda Trench, acoustic events mostly radiate from the inner wall of the trench and show a 200-km-long seismic gap between 2°S and the Equator. The array also detected more than 3600 cryogenic events, with different seasonal trends observed for events from the Antarctic margin compared to those from drifting icebergs at lower (up to 50°S) latitudes. Vocalizations of five species and subspecies of large baleen whales were also observed and exhibit clear seasonal variability. On the three autonomous hydrophones, whale vocalizations dominate sound levels in the 20-30 and 100 Hz frequency bands, whereas earthquakes and ice tremor are a dominant source of ambient sound at frequencies <20 Hz.
On comprehensive recovery of an aftershock sequence with cross correlation
NASA Astrophysics Data System (ADS)
Kitov, I.; Bobrov, D.; Coyne, J.; Turyomurugyendo, G.
2012-04-01
We have introduced cross correlation between seismic waveforms as a technique for signal detection and automatic event building at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization. The intuition behind signal detection is simple: small and mid-sized seismic events close in space should produce similar signals at the same seismic stations. Equivalently, these signals have to be characterized by a high cross correlation coefficient. For array stations with many individual sensors distributed over a large area, signals from events at distances beyond, say, 50 km are subject to destructive interference when cross correlated, due to changing time delays between various channels. Thus, any cross correlation coefficient above some predefined threshold can be considered a signature of a valid signal. With a dense grid of master events (spacing between adjacent masters of 20-50 km, corresponding to the statistically estimated correlation distance) with high-quality (signal-to-noise ratio above 10) template waveforms at primary array stations of the International Monitoring System, one can detect signals from, and then build, natural and manmade seismic events close to the master ones. The use of cross correlation allows detecting smaller signals (sometimes below noise level) than the current IDC detecting techniques provide. As a result it is possible to automatically build from 50% to 100% more valid seismic events than are included in the Reviewed Event Bulletin (REB). We have developed a tentative pipeline for automatic processing at the IDC. It includes three major stages. Firstly, we calculate the cross correlation coefficient for a given master and continuous waveforms at the same stations and carry out signal detection based on the statistical behavior of the signal-to-noise ratio of the cross correlation coefficient.
Secondly, a thorough screening is performed for all obtained signals using f-k analysis and F-statistics applied to the cross-correlation traces at individual channels of all included array stations. Thirdly, local (i.e. confined to the correlation distance around the master event) association of origin times of all qualified signals is fulfilled. These origin times are calculated from the arrival times of these signals, which are reduced to origin times using the travel times from the master event. An aftershock sequence of a mid-size earthquake is an ideal case to test cross correlation techniques for automatic event building. All events should be close to the mainshock and occur within several days. Here we analyse the aftershock sequence of an earthquake in the North Atlantic Ocean with mb(IDC)=4.79. The REB includes 38 events at distances less than 150 km from the mainshock. Our ultimate goal is to exercise the complete iterative procedure to find all possible aftershocks. We start with the mainshock and recover ten aftershocks with the largest number of stations to produce an initial set of master events with the highest-quality templates. Then we find all aftershocks in the REB and many additional events, which were not originally found by the IDC. Using all events found after the first iteration as master events, we find new events, which are also used in the next iteration. The iterative process stops when no new events can be found. In that sense the final set of aftershocks obtained with cross correlation is a comprehensive one.
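The first stage of the pipeline, scanning continuous data with a master-event template and flagging windows whose correlation coefficient exceeds a threshold, can be sketched for a single channel as follows. This is a toy illustration on synthetic data with an assumed threshold, not the IDC's multi-channel array implementation:

```python
import numpy as np

def sliding_cc(template, trace):
    """Normalized cross-correlation coefficient of a master-event
    template against every window of a continuous trace."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - m + 1)
    for k in range(len(cc)):
        w = trace[k:k + m]
        s = w.std()
        cc[k] = np.dot(t, (w - w.mean()) / s) / m if s > 0 else 0.0
    return cc

def detect(template, trace, threshold=0.7):
    """Window start indices whose correlation exceeds the threshold."""
    return np.flatnonzero(sliding_cc(template, trace) > threshold)

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 6 * np.pi, 100))   # master waveform
trace = 0.1 * rng.standard_normal(1000)             # continuous noise
trace[400:500] += template                          # buried repeat of the master
print(detect(template, trace))                      # detections cluster around sample 400
```

For a 100-sample window of pure noise the correlation coefficient has a standard deviation of roughly 0.1, so a 0.7 threshold corresponds to the kind of high statistical significance the pipeline relies on; a periodic template also produces weaker side-lobe detections one cycle away from the true alignment.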
Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors
Everding, Lukas; Conradt, Jörg
2018-01-01
In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling those tasks for traditional frame-based cameras, these have to deal with the fact that conventional cameras sample their environment with a fixed frequency. Most prominently, the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, as any information between the two frames is lost. We introduce a novel method to detect and track line structures in data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously without a periodic sampling rate and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment which is observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t-space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer; hence it is suitable for low-latency robotics.
The efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data from a DAVIS240C sensor for tracking and its frame data for ground-truth estimation. PMID:29515386
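The key geometric idea above — that events from a moving edge lie on a plane in x-y-t space — can be illustrated with a least-squares plane fit. This is a simplified sketch, not the published tracker: fitting t = a·x + b·y + c via 3×3 normal equations and using the residual to judge whether a cluster of events supports a line hypothesis.

```python
# Fit a plane t = a*x + b*y + c to DVS-style (x, y, t) events and report
# the RMS residual; a small residual supports the moving-line hypothesis.

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_event_plane(events):
    """events: list of (x, y, t). Returns (a, b, c) and the RMS residual."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y, t in events:
        v = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                A[i][j] += v[i] * v[j]   # accumulate normal equations
            rhs[i] += v[i] * t
    a, b, c = solve3(A, rhs)
    res = [t - (a * x + b * y + c) for x, y, t in events]
    rms = (sum(r * r for r in res) / len(res)) ** 0.5
    return (a, b, c), rms
```

Tracing the fitted plane parameters through time then yields a continuously updated line estimate, rather than one recomputed per frame.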
NASA Technical Reports Server (NTRS)
Trejo, Leonard J.; Shensa, Mark J.; Remington, Roger W. (Technical Monitor)
1998-01-01
This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance.
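The energy-compaction property invoked above is easy to see with the simplest DWT. This is a hedged illustration only — the study used decimated DWTs of real ERPs, not the single-level Haar transform below — but it shows why a regression on the largest-power coefficients can get by with fewer free parameters.

```python
# Single-level Haar DWT plus a selector for the highest-energy coefficients.
# The Haar transform is orthogonal, so total energy is preserved while most
# of it concentrates in a few coefficients for smooth signals.

def haar_dwt(signal):
    """One Haar DWT level: (approximation, detail) coefficient lists."""
    assert len(signal) % 2 == 0
    s2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    return approx, detail

def top_power_coeffs(coeffs, k):
    """Indices of the k highest-energy coefficients (energy compaction)."""
    return sorted(range(len(coeffs)), key=lambda i: -coeffs[i] ** 2)[:k]
```

For a locally smooth signal the detail coefficients are near zero, so a model fed only the top-power coefficients loses little information.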
Modeling Concept Dependencies for Event Detection
2014-04-04
Gaussian Mixture Model (GMM). Jiang et al. [8] provide a summary of experiments for TRECVID MED 2010. They employ low-level features such as SIFT and...event detection literature. Ballan et al. [2] present a method to introduce temporal information for video event detection with a BoW (bag-of-words...approach. Zhou et al. [24] study video event detection by encoding a video with a set of bag of SIFT feature vectors and describe the distribution with a
NASA Astrophysics Data System (ADS)
Jin, Dayong; Piper, James A.; Leif, Robert C.; Yang, Sean; Ferrari, Belinda C.; Yuan, Jingli; Wang, Guilan; Vallarino, Lidia M.; Williams, John W.
2009-03-01
A fundamental problem for rare-event cell analysis is auto-fluorescence from nontarget particles and cells. Time-gated flow cytometry is based on temporal-domain discrimination of cells stained with long-lifetime (>1 μs) luminescent labels and can render invisible all nontarget cells and particles. We aim to further evaluate the technique, focusing on detection of ultra-rare-event 5-μm calibration beads in environmental water samples containing dirt. Europium-labeled 5-μm calibration beads with improved luminescence homogeneity and reduced aggregation were evaluated using the prototype UV-LED-excited time-gated luminescence (TGL) flow cytometer (FCM). A BD FACSAria flow cytometer was used to accurately sort a very low number of beads (<100 events), which were then spiked into concentrated samples of environmental water. The use of europium-labeled beads permitted the demonstration of specific detection rates of 100%+/-30% and 91%+/-3% with 10 and 100 target beads, respectively, mixed with over one million nontarget autofluorescent background particles. Under the same conditions, a conventional FCM was unable to recover rare-event fluorescein isothiocyanate (FITC) calibration beads. Preliminary results on Giardia detection are also reported. We have demonstrated the scientific value of lanthanide-complex biolabels in flow cytometry. This approach may augment the current method that uses multifluorescence-channel flow cytometry gating.
The Crystal Zero Degree Detector at BESIII
NASA Astrophysics Data System (ADS)
Koch, L.; Denig, A.; Drexler, P.; Garillon, B.; Johansson, T.; Kühn, W.; Lange, S.; Lauth, W.; Liang, Y.; Marciniewski, P.; Rathmann, T.; Redmer, C.
2017-07-01
The BESIII experiment at the BEPCII electron-positron collider at IHEP (Beijing) is collecting data in the charm-τ mass region. Electron-positron collisions are a well-suited environment for the study of initial state radiation (ISR). However, the photons from ISR are strongly peaked towards small polar angles and are currently detected with limited efficiency. In order to increase the detection efficiency for ISR photons, we are developing small-size calorimeters to be placed in the very forward and backward regions. Each detector will consist of two 4×3 arrays of 1×1×14 cm3 LYSO crystals. A 1 cm gap separating the two arrays will reduce the contamination from background at very low angles. The scintillation light will be collected by silicon photomultipliers (SiPMs). The expected event rate in the MHz range requires flash ADCs recording the preamplified SiPM outputs. The digitized waveforms will be analyzed in real time, yielding data reduction and pile-up detection. This high-bandwidth data stream will be transmitted via optical fibers to FPGA-based hardware performing sub-event building, buffering, and event correlation with the BESIII trigger. The sub-events with a corresponding trigger will be sent to the BESIII event builder via TCP/IP. A single crystal equipped with a SiPM was instrumented as a prototype detector. Tests with radioactive sources were performed successfully.
Identification of P/S-wave successions for application in microseismicity
NASA Astrophysics Data System (ADS)
Deflandre, J.-P.; Dubesset, M.
1992-09-01
Interpretation of P/S-wave successions is used in induced or passive microseismicity. It makes the location of microseismic events possible when the triangulation technique cannot be used. To improve the reliability of the method, we propose a technique that identifies P/S-wave successions among recorded wave successions. Polarization software is used to verify the orthogonality between the P and S polarization axes. The polarization parameters are computed all along the 3-component acoustic signal. The algorithm then detects time windows within which the signal polarization axis is perpendicular to the polarization axis of the wave in the reference time window (representative of the P wave). The technique is demonstrated on a synthetic event, and three application cases are presented. The first corresponds to a calibration shot, for which the arrivals of perpendicularly polarized waves are correctly detected in spite of their moderate amplitude. The second example presents a microseismic event recorded during gas withdrawal from an underground gas storage reservoir. The last example is chosen as a counter-example, concerning a microseismic event recorded during a hydraulic fracturing job. The detection algorithm reveals that, in this case, the wave succession is not a P/S one. This implies that such an event must not be located by a method based on the interpretation of a P/S-wave succession, since no such succession is confirmed.
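The orthogonality test at the heart of the method above reduces to an angle check between two polarization axes. A minimal sketch follows (function names and the 15° tolerance are assumptions, not values from the paper); since a polarization axis has no preferred sign, the absolute dot product is used.

```python
# Check whether a window's polarization axis is perpendicular to the
# reference P-wave axis, marking it as a possible S-wave arrival.

import math

def axis_angle_deg(u, v):
    """Angle in degrees between two polarization axes (sign-insensitive)."""
    dot = abs(sum(a * b for a, b in zip(u, v)))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(min(1.0, dot / (nu * nv))))

def is_s_candidate(p_axis, axis, tol_deg=15.0):
    """True if the window axis is perpendicular to the P axis within tol."""
    return abs(axis_angle_deg(p_axis, axis) - 90.0) <= tol_deg
```

Sliding this test along the 3-component record flags the time windows in which a P/S succession is plausible.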
Measuring adverse events in helicopter emergency medical services: establishing content validity.
Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M
2014-01-01
We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
Falls event detection using triaxial accelerometry and barometric pressure measurement.
Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Celler, Branko G; Lovell, Nigel H
2009-01-01
A falls detection system, employing a Bluetooth-based wearable device, containing a triaxial accelerometer and a barometric pressure sensor, is described. The aim of this study is to evaluate the use of barometric pressure measurement, as a surrogate measure of altitude, to augment previously reported accelerometry-based falls detection algorithms. The accelerometry and barometric pressure signals obtained from the waist-mounted device are analyzed by a signal processing and classification algorithm to discriminate falls from activities of daily living. This falls detection algorithm has been compared to two existing algorithms which utilize accelerometry signals alone. A set of laboratory-based simulated falls, along with other tasks associated with activities of daily living (16 tests) were performed by 15 healthy volunteers (9 male and 6 female; age: 23.7 +/- 2.9 years; height: 1.74 +/- 0.11 m). The algorithm incorporating pressure information detected falls with the highest sensitivity (97.8%) and the highest specificity (96.7%).
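The sensor-fusion idea above — corroborating an acceleration spike with a barometric altitude change — can be sketched simply. The thresholds and window lengths below are made-up assumptions for illustration, not the published algorithm; note that a fall moves the sensor down, so ambient pressure rises slightly.

```python
# Flag a fall candidate when a high acceleration peak coincides with a
# sustained barometric pressure increase after the impact.

def detect_fall(accel_g, pressure_hpa, accel_thresh=2.5, dp_thresh=0.05):
    """accel_g: acceleration magnitudes (g); pressure_hpa: same-length trace.
    Returns True if an impact-like peak is followed by a pressure rise."""
    for i, a in enumerate(accel_g):
        if a >= accel_thresh:
            before = pressure_hpa[max(0, i - 5):i] or pressure_hpa[:1]
            after = pressure_hpa[i:i + 5]
            if sum(after) / len(after) - sum(before) / len(before) >= dp_thresh:
                return True
    return False
```

The pressure condition is what rejects high-acceleration activities of daily living (e.g. sitting down hard) that do not change the sensor's altitude.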
Development of 19F-NMR chemical shift detection of DNA B-Z equilibrium using 19F-NMR.
Nakamura, S; Yang, H; Hirata, C; Kersaudy, F; Fujimoto, K
2017-06-28
Various DNA conformational changes correlate with biological events. In particular, the DNA B-Z equilibrium shows a high correlation with translation and transcription. In this study, we developed a DNA probe containing 5-trifluoromethylcytidine or 5-trifluoromethylthymidine to detect the DNA B-Z equilibrium using 19F-NMR. This probe enabled the quantitative detection of B-, Z-, and ss-DNA based on 19F-NMR chemical shift changes.
NASA Technical Reports Server (NTRS)
Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.
2016-01-01
A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management, and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could also be used to estimate the surveillance and intent information quality required to achieve a desired CD performance as part of the design of a new separation assurance system.
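The CD function described above can be sketched as a look-ahead separation check. This toy assumes straight-line intent and 2-D horizontal geometry; the 5 nmi separation standard and 20-minute look-ahead are illustrative defaults, not values from the study.

```python
# Predict whether two aircraft, each with a position (nm) and a velocity
# intent (nm/min), will lose horizontal separation within a look-ahead window.

def predicts_los(p1, v1, p2, v2, sep_nm=5.0, lookahead_min=20.0, step=0.5):
    """p*, v*: 2-tuples. True if predicted separation drops below sep_nm."""
    t = 0.0
    while t <= lookahead_min:
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        if (dx * dx + dy * dy) ** 0.5 < sep_nm:
            return True
        t += step
    return False
```

Degrading surveillance quality (noisy p*) or intent quality (wrong v*) directly degrades this prediction, which is exactly the trade space the parametric analysis explores.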
Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira
2015-01-01
Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typically complex response of events dispersed over various latencies and the poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. The template for the algorithm was generated from responses evoked on the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, the algorithm outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor.
Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
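The template-comparison core summarized above can be sketched as a sliding normalized cross-correlation: the best lag gives the latency, and the amplitude ratio at that lag gives the magnitude. This is a simplified illustration, not the published algorithm.

```python
# Slide a template over a recording; score each lag with the normalized
# cross-correlation; report the best lag (latency), its score, and the
# amplitude ratio of the matched segment to the template (magnitude).

def best_match(trace, template):
    """Return (latency, score, magnitude) of the best template alignment."""
    best = (0, -1.0, 0.0)
    tnorm = sum(t * t for t in template) ** 0.5
    for lag in range(len(trace) - len(template) + 1):
        seg = trace[lag:lag + len(template)]
        snorm = sum(s * s for s in seg) ** 0.5
        if snorm == 0:
            continue   # silent segment: no meaningful correlation
        score = sum(s * t for s, t in zip(seg, template)) / (snorm * tnorm)
        if score > best[1]:
            best = (lag, score, snorm / tnorm)
    return best
```

Because the score is shape-based and the magnitude is computed separately, amplitude estimates stay reasonable even under template mismatch, while latency errors track the template's own timing, consistent with the behaviour reported above.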
LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.
2016-08-04
The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and data, it also is exposing previously unreachable infrastructure to a worldwide set of actors. Existing solutions for intrusion detection and prevention that are signature-focused typically seek to detect anomalous and/or malicious activity for the sake of preventing or mitigating negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g. rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Similar to cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context. So we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model based approach to recognizing anomalous events in network traffic and present the design of this approach of bio-inspired and statistical models working in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented along with a plugin to support real-world cyber operations.
LSTM-CRF | Informatics Technology for Cancer Research (ITCR)
LSTM-CRF uses Natural Language Processing methods to detect Adverse Drug Events, Drugname, Indication, and other medically relevant information in Electronic Health Records. It implements Recurrent Neural Networks with several CRF-based inference methods.
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach
Elgendi, Mohamed
2016-01-01
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages (“TERMA”), detects events in biomedical signals using event-related moving averages. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results recommend that the window sizes for the two moving averages (W1 and W2) follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
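The two-moving-average idea above can be sketched directly: a short average sized to the peak (W1) crossing above a long average sized to the whole event (W2) generates blocks of interest. This is a simplified single-pass illustration honoring the recommended inequality, not the full published framework.

```python
# TERMA-style block generation: events are the regions where the short
# event-related moving average exceeds the long one.

def moving_avg(x, w):
    """Centered moving average with edge clamping, window length w."""
    h = w // 2
    return [sum(x[max(0, i - h):i + h + 1]) / len(x[max(0, i - h):i + h + 1])
            for i in range(len(x))]

def terma_blocks(signal, w1=3, w2=9):
    """Return (start, end) index pairs where MA(W1) exceeds MA(W2)."""
    assert 2 * w1 <= w2 <= 8 * w1, "window sizes violate TERMA inequality"
    ma1, ma2 = moving_avg(signal, w1), moving_avg(signal, w2)
    blocks, start = [], None
    for i, (a, b) in enumerate(zip(ma1, ma2)):
        if a > b and start is None:
            start = i                     # block of interest opens
        elif a <= b and start is not None:
            blocks.append((start, i - 1))  # block closes
            start = None
    if start is not None:
        blocks.append((start, len(signal) - 1))
    return blocks
```

The peak within each returned block (e.g. its argmax) is then taken as the detected event, which is why the method needs no training and runs cheaply on wearables.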
Scenario design and basic analysis of the National Data Centre Preparedness Exercise 2013
NASA Astrophysics Data System (ADS)
Ross, Ole; Ceranna, Lars; Hartmann, Gernot; Gestermann, Nicolai; Bönneman, Christian
2014-05-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data are collected, processed, and technically analyzed at the International Data Center (IDC) of the CTBT Organization, National Data Centers (NDCs) provide interpretation and advice to their governments concerning suspicious detections in IMS data. NDC Preparedness Exercises (NPEs) dealing with fictitious treaty violations are performed regularly to practice the combined analysis of CTBT verification technologies and to foster the mutual exchange of information between NDCs and with the IDC. The NPE2010 and NPE2012 trigger scenarios were based on selected seismic events from the Reviewed Event Bulletin (REB) serving as the starting point for fictitious radionuclide dispersion. The main task was the identification of the original REB event and the discrimination between earthquakes and explosions as the source. The scenario design of NPE2013 differs from that of previous NPEs. The waveform event selection is not constrained to events in the REB. The exercise trigger is a combination of a tempo-spatial indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward atmospheric transport modelling based on a fictitious release. For the waveform event, the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" on the coast of the North Sea in Central Europe. The synthetic radionuclide detections start in Vienna (8 Sept., I-131) and Schauinsland (11 Sept., Xe-133) with rather low activity concentrations and are most prominent in Stockholm and Spitsbergen in mid-September 2013. Smaller concentrations in Asia follow later on.
The potential connection between the waveform and radionuclide evidence remains unclear. The verification task is to identify the waveform event and to investigate potential sources of the radionuclide findings. Finally, the potential connection between the sources and the CTBT relevance of the whole picture has to be evaluated. The overall question is whether requesting an On-Site Inspection in "Frisia" would be justified. The poster presents the NPE2013 scenario and gives a basic analysis of the initial situation concerning both waveform detections and atmospheric dispersion conditions in Central Europe in early September 2013. The full NPE2013 scenario will be presented at the NDC Workshop in mid-May 2014.
Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients
NASA Technical Reports Server (NTRS)
Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.
2011-01-01
We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
Systems and methods of detecting force and stress using tetrapod nanocrystal
Choi, Charina L.; Koski, Kristie J.; Sivasankar, Sanjeevi; Alivisatos, A. Paul
2013-08-20
Systems and methods of detecting force on the nanoscale including methods for detecting force using a tetrapod nanocrystal by exposing the tetrapod nanocrystal to light, which produces a luminescent response by the tetrapod nanocrystal. The method continues with detecting a difference in the luminescent response by the tetrapod nanocrystal relative to a base luminescent response that indicates a force between a first and second medium or stresses or strains experienced within a material. Such systems and methods find use with biological systems to measure forces in biological events or interactions.
Video content analysis of surgical procedures.
Loukas, Constantinos
2018-02-01
In addition to its therapeutic benefits, minimally invasive surgery offers the potential for video recording of the operation. The videos may be archived and used later for purposes such as cognitive training, skills assessment, and workflow analysis. Methods from the major field of video content analysis and representation are increasingly applied in the surgical domain. In this paper, we review recent developments and analyze future directions in the field of content-based video analysis of surgical operations. The reviewed literature was obtained from PubMed and Google Scholar searches on combinations of the following keywords: 'surgery', 'video', 'phase', 'task', 'skills', 'event', 'shot', 'analysis', 'retrieval', 'detection', 'classification', and 'recognition'. The collected articles were categorized and reviewed based on the technical goal sought, type of surgery performed, and structure of the operation. A total of 81 articles were included. The publication activity is constantly increasing; more than 50% of these articles were published in the last 3 years. Significant research has been performed for video task detection and retrieval in eye surgery. In endoscopic surgery, the research activity is more diverse: gesture/task classification, skills assessment, tool type recognition, and shot/event detection and retrieval. Recent works employ deep neural networks for phase and tool recognition as well as shot detection. Content-based video analysis of surgical operations is a rapidly expanding field. Several future prospects for research exist including, inter alia, shot boundary detection, keyframe extraction, video summarization, pattern discovery, and video annotation. The development of publicly available benchmark datasets to evaluate and compare task-specific algorithms is essential.
Increasing the Operational Value of Event Messages
NASA Technical Reports Server (NTRS)
Li, Zhenping; Savkli, Cetin; Smith, Dan
2003-01-01
Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one at a time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analysis, and data mining are being developed by Lockheed Martin, and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in the areas of problem detection and analysis, general status reporting, and real-time situational awareness.
Detecting and Locating Seismic Events Without Phase Picks or Velocity Models
NASA Astrophysics Data System (ADS)
Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.
2015-12-01
The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.
2015-01-01
“Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis. This is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of a software pipeline for metadata processing composed of both open source software components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novak, Avrey; Nyflot, Matthew J.; Ermoian, Ralph P.
Purpose: Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide their departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected or applied a near-miss risk index (NMRI) to gauge severity. Methods: From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned a NMRI score ranging from 0 to 4 reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow area events passed through before detection, and average NMRI scores were compared. Results: Near-miss incidents most commonly originated within treatment planning (33%).
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, average NMRI of all events = 1.5), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist’s chart check, and the review of portal images; however, most incidents that pass through a given safety barrier are of a type that barrier is not designed to capture. Conclusions: Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest impact portions of the workflow. The most severe near-miss events tend to originate during simulation, with the most severe near-miss events detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
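The per-category severity summary reported above is a grouped average of NMRI scores; a minimal sketch (the incident tuple format and category labels are hypothetical, not the study's data schema):

```python
from collections import defaultdict

def average_nmri_by_category(incidents):
    """Average near-miss risk index (NMRI, scored 0-4) per workflow
    category. `incidents` is an iterable of (category, nmri) pairs."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for category, nmri in incidents:
        sums[category] += nmri
        counts[category] += 1
    return {c: sums[c] / counts[c] for c in sums}
```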
Robust failure detection filters. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sanmartin, A. M.
1985-01-01
The robustness of detection filters applied to the detection of actuator failures on a free-free beam is analyzed. This analysis is based on computer simulation tests of the detection filters in the presence of different types of model mismatch, and on frequency response functions of the transfers corresponding to the model mismatch. The robustness of detection filters based on a model of the beam containing a large number of structural modes varied dramatically with the placement of some of the filter poles. The dynamics of these filters were very hard to analyze. The design of detection filters with a number of modes equal to the number of sensors was trivial. They can be configured to detect any number of actuator failure events. The dynamics of these filters were very easy to analyze and their robustness properties were much improved. A change of the output transformation allowed the filter to perform satisfactorily with realistic levels of model mismatch.
Coronal Fine Structure in Dynamic Events Observed by Hi-C
NASA Technical Reports Server (NTRS)
Winebarger, Amy; Schuler, Timothy
2013-01-01
The High-Resolution Coronal Imager (Hi-C) flew aboard a NASA sounding rocket on 2012 July 11 and captured roughly 345 s of high spatial and temporal resolution images of the solar corona in a narrowband 193 Angstrom channel. We have analyzed the fluctuations in intensity of Active Region 11520. We selected events based on a lifetime greater than 11 s (two Hi-C frames) and intensities greater than a threshold determined from the photon and readout noise. We compare the Hi-C events with those determined from AIA. We find that Hi-C detects shorter and smaller events than AIA. We also find that the intensity increase in the Hi-C events is approx. 3 times greater than the intensity increase in the AIA events. We conclude the events are related to linear substructure that is unresolved by AIA.
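The selection rule described, intensity above a noise threshold sustained for at least two consecutive frames, can be sketched as follows (the per-pixel light-curve representation and function name are assumptions for illustration):

```python
import numpy as np

def select_events(lightcurve, threshold, min_frames=2):
    """Return (start, end) index pairs (end exclusive) of contiguous runs
    where intensity exceeds `threshold` for >= `min_frames` frames."""
    above = lightcurve > threshold
    events, start = [], None
    for i, is_above in enumerate(above):
        if is_above and start is None:
            start = i                      # run begins
        elif not is_above and start is not None:
            if i - start >= min_frames:    # keep only long-enough runs
                events.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_frames:
        events.append((start, len(above)))  # run reaches the last frame
    return events
```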
NASA Astrophysics Data System (ADS)
Vasterling, Margarete; Wegler, Ulrich; Bruestle, Andrea; Becker, Jan
2016-04-01
Real time information on the locations and magnitudes of induced earthquakes is essential for response plans based on the magnitude frequency distribution. We developed and tested a real time cross-correlation detector focusing on induced microseismicity in deep geothermal reservoirs. The incoming seismological data are cross-correlated in real time with a set of known master events. We use the envelopes of the seismograms rather than the seismograms themselves to account for small changes in the source locations or in the focal mechanisms. Two different detection conditions are implemented: After first passing a single trace correlation condition, secondly a network correlation is calculated taking the amplitude information of the seismic network into account. The magnitude is estimated by using the respective ratio of the maximum amplitudes of the master event and the detected event. The detector is implemented as a real time tool and put into practice as a SeisComp3 module, an established open-source software package for seismological real time data handling and analysis. We validated the reliability and robustness of the detector by an offline playback test using four months of data from monitoring the power plant in Insheim (Upper Rhine Graben, SW Germany). Subsequently, in October 2013 the detector was installed as real time monitoring system within the project "MAGS2 - Microseismic Activity of Geothermal Systems". Master events from the two neighboring geothermal power plants in Insheim and Landau and two nearby quarries are defined. After detection, manual phase determination and event location are performed at the local seismological survey of the Geological Survey and Mining Authority of Rhineland-Palatinate. Until November 2015 the detector identified 454 events out of which 95% were assigned correctly to the respective source. 5% were misdetections caused by local tectonic events.
To evaluate the completeness of the automatically obtained catalogue, it is compared to the event catalogue of the Seismological Service of Southwestern Germany and to the events reported by the company tasked with seismic monitoring of the Insheim power plant. Events missed by the cross-correlation detector are generally very small. They are registered at too few stations to meet the detection criteria. Most of these small events were not locatable. The automatic catalogue has a magnitude of completeness around 0.0 and is significantly more detailed than the catalogue from standard processing of the Seismological Service of Southwestern Germany for this region. For events in the magnitude range of the master event the magnitude estimated from the amplitude ratio reproduces the local magnitude well. For weaker events there tends to be a small offset. Altogether, the developed real time cross correlation detector provides robust detections with reliable association of the events to the respective sources and valid magnitude estimates. Thus, it provides input parameters for the mitigation of seismic hazard by using response plans in real time.
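The core operations described, envelope computation, sliding normalized correlation of incoming data against a master-event envelope, and magnitude estimation from an amplitude ratio, might look roughly like this for a single trace (a simplified sketch, not the actual SeisComp3 module; the function names and threshold value are assumptions):

```python
import numpy as np
from scipy.signal import correlate, hilbert

def envelope(trace):
    """Amplitude envelope via the analytic signal."""
    return np.abs(hilbert(trace))

def detect(trace, master, threshold=0.7):
    """Slide the master-event envelope along the incoming trace; return
    sample offsets where normalized correlation exceeds `threshold`."""
    e_t, e_m = envelope(trace), envelope(master)
    cc = correlate(e_t, e_m, mode="valid")
    # Per-window energy of the trace envelope, for normalization.
    win = np.convolve(e_t ** 2, np.ones(len(e_m)), mode="valid")
    norm = np.sqrt(win * np.sum(e_m ** 2))
    cc = cc / np.where(norm > 0, norm, 1.0)
    return np.where(cc > threshold)[0]

def magnitude(detected_amp, master_amp, master_mag):
    """Magnitude from the maximum-amplitude ratio to the master event."""
    return master_mag + np.log10(detected_amp / master_amp)
```

The network condition in the paper then requires consistent single-trace detections and amplitude information across several stations before an event is declared.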
Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection
Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem
2013-01-01
The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
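A plain batch analogue of the least-squares one-class idea, regressing all training descriptors onto a common target value in a kernel feature space and thresholding the resulting score, can be sketched as follows (this omits the online updates and coherence-based sparsification that are central to the paper, and all parameter values are assumptions):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class BatchLSOCSVM:
    """Simplified batch least-squares one-class model: solve
    (K + I/C) alpha = 1 so training points map near a common target,
    then flag frames whose kernel score falls below the loosest
    training score as abnormal."""
    def __init__(self, C=10.0, gamma=0.5):
        self.C, self.gamma = C, gamma

    def fit(self, X):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        n = len(X)
        self.alpha = np.linalg.solve(K + np.eye(n) / self.C, np.ones(n))
        scores = K @ self.alpha
        self.threshold = scores.min()   # loosest score seen in training
        return self

    def predict(self, Xnew):
        """+1 = normal frame, -1 = abnormal frame."""
        scores = rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha
        return np.where(scores >= self.threshold, 1, -1)
```

In the paper each frame would be represented by its covariance matrix descriptor; here any fixed-length feature vector stands in for that descriptor.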
NASA Astrophysics Data System (ADS)
Fluixá-Sanmartín, Javier; Pan, Deng; Fischer, Luzia; Orlowsky, Boris; García-Hernández, Javier; Jordan, Frédéric; Haemmig, Christoph; Zhang, Fangwei; Xu, Jijun
2018-02-01
Drought indices based on precipitation are commonly used to identify and characterize droughts. Due to the general complexity of droughts, the comparison of index-identified events with droughts at different levels of the complete system, including soil humidity or river discharges, relies typically on model simulations of the latter, entailing potentially significant uncertainties. The present study explores the potential of using precipitation-based indices to reproduce observed droughts in the lower part of the Jinsha River basin (JRB), proposing an innovative approach for a catchment-wide drought detection and characterization. Two indicators, namely the Overall Drought Extension (ODE) and the Overall Drought Indicator (ODI), have been defined. These indicators aim at identifying and characterizing drought events on the basin scale, using results from four meteorological drought indices (standardized precipitation index, SPI; rainfall anomaly index, RAI; percent of normal precipitation, PN; deciles, DEC) calculated at different locations of the basin and for different timescales. Collected historical information on drought events is used to contrast results obtained with the indicators. This method has been successfully applied to the lower Jinsha River basin in China, a region prone to frequent and severe droughts. Historical drought events that occurred from 1960 to 2014 have been compiled and cataloged from different sources, a challenging process. The analysis of the indicators shows a good agreement with the recorded historical drought events on the basin scale. It has been found that the timescale that best reproduces observed events across all the indices is the 6-month timescale.
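Of the four indices, the SPI is the most widely used; its aggregation step at a given timescale can be sketched as follows (operational SPI fits a gamma distribution to the aggregated series before transforming to a standard normal; a plain z-score is substituted here for brevity, so this is an SPI-like index rather than the standard definition):

```python
import numpy as np

def spi_like(precip, timescale=6):
    """Simplified SPI: sum monthly precipitation over `timescale` months
    and standardize the rolling sums. Returns an array aligned with the
    input; the first `timescale` - 1 entries are NaN because no full
    aggregation window exists there."""
    p = np.asarray(precip, dtype=float)
    agg = np.convolve(p, np.ones(timescale), mode="valid")  # rolling sums
    z = (agg - agg.mean()) / agg.std()
    out = np.full(len(p), np.nan)
    out[timescale - 1:] = z
    return out
```

Negative values of such an index indicate drier-than-normal conditions; the 6-month timescale found best in the study corresponds to `timescale=6`.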