Surface Management System Departure Event Data Analysis
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.
2010-01-01
This paper presents a data analysis of Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display to Industry data source for SMS predictions.
A model of human event detection in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1978-01-01
It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Feeding the information displayed to the experimental subjects into the event detection model allows comparison of the model's performance with that of the subjects.
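The paper's core modeling step, generating P(event occurred | displayed information) via discriminant analysis, maps naturally onto modern tooling. Below is a minimal illustrative sketch using scikit-learn; the synthetic features, labels, and threshold are stand-ins invented for the example, not the study's data.

```python
# Sketch: modeling an operator's event-occurrence probabilities with
# discriminant analysis, loosely following the paper's idea. All data
# here is synthetic; in the study, the inputs were the process values
# displayed to the monitoring subjects.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic "displayed information": two features per process window,
# labeled 1 if an event (e.g., a mean shift) occurred in that window.
n = 500
features = rng.normal(size=(n, 2))
labels = (features[:, 0] + 0.5 * features[:, 1]
          + rng.normal(scale=0.5, size=n) > 1).astype(int)

model = LinearDiscriminantAnalysis()
model.fit(features, labels)

# The posterior P(event | displayed data) plays the role of the human's
# generated event probability in the paper's framework.
p_event = model.predict_proba(features[:5])[:, 1]
print(np.round(p_event, 3))
```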
Distributed Events in Sentinel: Design and Implementation of a Global Event Detector
1999-01-01
...local event detector and a global event detector to detect events. Global event detector in this case plays the role of a message sending/receiving than... significant in this case. The system performance will decrease with an increase in the number of applications involved in global event detection. Yet from a... Figure 8: A Global event tree (2) 1. Global composite event is detected at the GED. In this case, the whole global composite event tree is sent to the...
NASA Astrophysics Data System (ADS)
Yuki, Akiyama; Satoshi, Ueyama; Ryosuke, Shibasaki; Adachi, Ryuichiro
2016-06-01
In this study, we developed a method to detect sudden population concentrations on a certain day and in a certain area, that is, an "Event," across Japan in 2012, using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, the areas and days where Events occurred were detected by aggregating the mass of stay locations into 1-km-square grid polygons. Finally, the proposed method could detect the Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
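The aggregation step described above, stay points binned into 1-km grid cells per day and screened against each cell's typical level, can be sketched in a few lines. The following is a toy illustration assuming pandas; the column names, the factor of 3, and the data are all hypothetical.

```python
# Toy sketch of the paper's aggregation step: daily unique visitors per
# 1-km grid cell, with a cell/day flagged as an "Event" candidate when it
# far exceeds the cell's typical level. All names and values are invented.
import pandas as pd

stays = pd.DataFrame({
    "day":     ["2012-08-03"] * 2 + ["2012-08-04"] + ["2012-08-05"] * 8,
    "grid_id": ["G1"] * 11,
    "user_id": [1, 2, 1] + list(range(10, 18)),
})

# Daily unique-visitor counts per grid cell.
counts = (stays.groupby(["grid_id", "day"])["user_id"]
               .nunique().rename("visitors").reset_index())

# Flag days exceeding 3x the cell's median (recurring weekly peaks would
# be removed separately, as the paper does for regular year-round Events).
baseline = counts.groupby("grid_id")["visitors"].transform("median")
counts["is_event"] = counts["visitors"] > 3 * baseline
print(counts)   # the 2012-08-05 spike (8 visitors vs median 2) is flagged
```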
Multi-Station Broad Regional Event Detection Using Waveform Correlation
NASA Astrophysics Data System (ADS)
Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.
2013-12-01
Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
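The core per-station operation in such a system is sliding normalized cross-correlation of a template against continuous data. A minimal single-channel sketch follows (synthetic data and an assumed 0.7 threshold; a production system would use FFT-based correlation plus the distributed processing described above):

```python
# Sketch: single-channel template matching by normalized cross-correlation,
# the basic operation behind waveform-correlation event detection.
import numpy as np

def normalized_xcorr(template: np.ndarray, data: np.ndarray) -> np.ndarray:
    """Correlation coefficient of the template at each lag in the data."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    out = np.empty(len(data) - nt + 1)
    for i in range(len(out)):
        w = data[i:i + nt]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return out

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
data = rng.normal(scale=0.3, size=5000)
data[3000:3200] += template           # a buried repeat of the template event

cc = normalized_xcorr(template, data)
detections = np.flatnonzero(cc > 0.7)  # detection threshold (assumed)
print(detections[:5], cc.max().round(2))
```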
Network hydraulics inclusion in water quality event detection using multiple sensor stations data.
Oliker, Nurit; Ostfeld, Avi
2015-09-01
Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has largely been limited to a single sensor station location and its dataset. Single sensor station models are detached from network hydraulics insights and as a result can be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating an understanding of local and spatial hydraulic data into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes.
Subsurface event detection and classification using Wireless Signal Networks.
Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T
2012-11-05
Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as their main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks could detect and classify subsurface events. PMID:23202191
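As a rough illustration of the window-based minimum-distance idea, the sketch below averages signal-strength deviations over a classification window and assigns the nearest class prototype; all prototype values and readings are invented for the example.

```python
# Sketch: a window-based minimum-distance classifier in the spirit of the
# WSiN geo-event classifier: average signal-strength features over the
# detection window, then assign the class whose prototype is nearest.
import numpy as np

# Mean RSS-change prototypes (dB) per geo-event class, one value per node.
prototypes = {
    "water_intrusion": np.array([-6.0, -4.5, -1.0]),
    "density_change":  np.array([-2.0, -2.5, -2.0]),
    "no_event":        np.array([ 0.0,  0.2, -0.1]),
}

def classify_window(window: np.ndarray) -> str:
    """window: samples x nodes of RSS deviations within the event window."""
    feature = window.mean(axis=0)  # average deviation per sensor node
    return min(prototypes, key=lambda c: np.linalg.norm(feature - prototypes[c]))

window = np.array([[-5.8, -4.0, -0.7],
                   [-6.3, -4.9, -1.2]])
print(classify_window(window))  # -> water_intrusion
```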
[Study on the timeliness of detection and reporting on public health emergency events in China].
Li, Ke-Li; Feng, Zi-Jian; Ni, Da-Xin
2009-03-01
To analyze the timeliness of detection and reporting of public health emergency events, and to explore effective strategies for improving the related capacities, we conducted a retrospective survey of 3275 emergency events reported through the Public Health Emergency Events Surveillance System from 2005 to the first half of 2006. A uniform self-administered questionnaire, developed by county Centers for Disease Control and Prevention, was used to collect data, including information on the detection and reporting of the events. For communicable disease events, the median time interval between the occurrence of the first case and the detection of the event was 6 days (P25 = 2, P75 = 13). For food poisoning events and clusters of disease with unknown origin, the medians were 3 hours (P25, P75 = 16) and 1 day (P25 = 0, P75 = 5), respectively. 71.54% of the events were reported by the discoverers within 2 hours after detection. In general, the time intervals between the occurrence, detection and reporting of the events differed according to the categories of events. The timeliness of detection and reporting of events could be improved dramatically if the definitions of events, according to their characteristics, were made more reasonable and accessible, together with improved training programs for healthcare staff and teachers.
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. This framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
Full-waveform detection of non-impulsive seismic events based on time-reversal methods
NASA Astrophysics Data System (ADS)
Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya
2017-12-01
We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May 2016. The second area of interest is the Gulf of California, where two swarms took place during July and September of 2015. We show that we are able to detect previously unreported, non-impulsive events, and recommend that this method be used together with more traditional template-matching methods to maximize the number of detected events.
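The detection-function construction, per-station correlations stacked into a single scalar trace and thresholded against the noise level, can be illustrated compactly. The sketch below substitutes a toy waveform for the strain Green's tensor filter and uses an assumed MAD-based threshold:

```python
# Sketch: a scalar detection function built by stacking per-station
# correlations of a synthetic filter against continuous data, with
# detections declared relative to the noise level, as in the paper.
import numpy as np

def station_correlation(synthetic: np.ndarray, data: np.ndarray) -> np.ndarray:
    n = len(synthetic)
    return np.array([np.dot(synthetic, data[i:i + n])
                     for i in range(len(data) - n + 1)])

rng = np.random.default_rng(2)
synthetic = np.hanning(100)            # toy stand-in for the matched filter
streams = [rng.normal(scale=0.5, size=2000) for _ in range(4)]
for s in streams:
    s[1200:1300] += synthetic          # a common event across stations

# Stack the station correlations into the detection function.
detection_fn = np.sum([station_correlation(synthetic, s) for s in streams], axis=0)

# Detection: amplitude exceeds a multiple of the noise level (MAD-based).
noise = np.median(np.abs(detection_fn - np.median(detection_fn)))
events = np.flatnonzero(detection_fn > np.median(detection_fn) + 10 * noise)
print(events.min() if events.size else "no detection")   # near sample 1200
```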
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio events detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent events detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events is not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer's theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
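The unsupervised audio branch, modeling normal ambience and flagging deviations, is essentially GMM novelty detection. A minimal sketch with scikit-learn follows; the feature matrices and the 1st-percentile operating point are illustrative assumptions, not the system's actual configuration:

```python
# Sketch: unsupervised "unusual audio event" detection by modeling normal
# ambience with a Gaussian Mixture Model and flagging low-likelihood
# frames. The MFCC-like features are random stand-ins for real acoustics.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
ambience = rng.normal(size=(2000, 13))            # training frames: normal ambience
test = np.vstack([rng.normal(size=(50, 13)),
                  rng.normal(loc=4.0, size=(5, 13))])  # 5 anomalous frames

gmm = GaussianMixture(n_components=8, random_state=0).fit(ambience)

scores = gmm.score_samples(test)                  # log-likelihood per frame
threshold = np.percentile(gmm.score_samples(ambience), 1)  # assumed operating point
print(np.flatnonzero(scores < threshold))         # indices of "unusual" frames
```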
NASA Technical Reports Server (NTRS)
Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)
2007-01-01
A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
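In software terms, the triggering scheme amounts to: stay passive, and on a state change record only which element fired and when. The sketch below is purely a hypothetical illustration of that data-reduction idea (the patent's logic runs in a PLD, not in Python):

```python
# Sketch of the data-reduction idea: nothing is stored while passive; on a
# sensor element's state change, only (element id, timestamp) is recorded.
import time

def monitor(sensors, read_state, max_checks=200):
    log = []
    baseline = {s: read_state(s) for s in sensors}   # passive state
    for _ in range(max_checks):
        for s in sensors:
            state = read_state(s)
            if state != baseline[s]:                  # triggering event
                log.append((s, time.time()))          # which element, and when
                baseline[s] = state
    return log

# Demo with a fake sensor array: element 2 changes state partway through.
calls = {"n": 0}
def read_state(s):
    calls["n"] += 1
    return calls["n"] > 500 and s == 2

print(monitor(range(8), read_state))  # -> [(2, <timestamp>)]
```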
DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals
2013-04-24
...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed... As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and...
Contribution of Infrasound to IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick
2016-04-01
Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010, the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Examples of REB events detected by the International Monitoring System (IMS) infrasound network include fireballs (e.g. the Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile, 2015) and large surface explosions (e.g. Tianjin, China, 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help to improve the location of small events with observations on infrasound stations.
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012, Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e. it does not rely on a prioris, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds, because it is the network solution that matters.
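One plausible reading of the Bayesian combination step is a naive-Bayes update in odds space, which naturally uses the low probabilities from non-detecting stations. The sketch below is an assumption-laden toy, not the ProbDet implementation; the prior and probability traces are fabricated:

```python
# Sketch: combining per-station conditional probability traces into a
# network event probability with a naive-Bayes style update. In the real
# system, the traces come from source- and attenuation-model mappings.
import numpy as np

def network_probability(station_probs: np.ndarray, prior: float = 1e-4) -> np.ndarray:
    """station_probs: stations x time array of P(event | station data)."""
    # Convert each station's probability to a likelihood ratio, treat
    # stations as independent, and fold in the prior (Bayes' rule in odds).
    lr = station_probs / np.clip(1.0 - station_probs, 1e-12, None)
    odds = (prior / (1 - prior)) * np.prod(lr, axis=0)
    return odds / (1.0 + odds)

probs = np.full((5, 100), 0.05)      # quiet background at 5 stations
probs[:, 60] = 0.9                   # a consistent bump at t = 60
p = network_probability(probs)
print(p[59].round(6), p[60].round(6))  # background stays tiny; t=60 stands out
```

Note how a station reporting 0.05 actively pulls the network odds down, which is the formalized use of non-detecting stations mentioned above.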
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.
Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric
2016-09-01
The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
NASA Astrophysics Data System (ADS)
Reynen, Andrew; Audet, Pascal
2017-09-01
A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States, is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with a low false alarm rate, allowing for a larger automatic event catalogue with a high degree of trust.
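As a sketch of the classification half of this approach, the example below trains a standard supervised classifier on two generalized attributes (a polarization-like measure and a dominant frequency). The feature distributions are fabricated; the paper's actual attributes are extracted at predicted P and S arrival times:

```python
# Sketch: classifying blast vs earthquake from generalized waveform
# attributes (polarization, frequency content) with a standard learner.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# Columns: [rectilinearity, dominant frequency (Hz)] per event (synthetic).
quakes = np.column_stack([rng.normal(0.8, 0.1, 200), rng.normal(6.0, 1.5, 200)])
blasts = np.column_stack([rng.normal(0.5, 0.1, 200), rng.normal(2.5, 1.0, 200)])

X = np.vstack([quakes, blasts])
y = np.array([0] * 200 + [1] * 200)  # 0 = earthquake, 1 = blast

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.82, 5.5], [0.45, 2.0]]))  # -> [0 1]
```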
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
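A minimal sketch of the TERMA recipe, under the stated window inequality: a short event-related moving average compared against a longer cycle-related one generates blocks of interest, and each block's maximum is taken as the event peak. The demo signal and window sizes are illustrative, not the paper's tuned values:

```python
# Minimal TERMA-style peak detector: a short moving average (W1, tracking
# the event) is compared with a longer one (W2, tracking the cycle);
# blocks where the short average dominates are blocks of interest, and
# each block's maximum is the detected peak.
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def terma_peaks(signal, w1, w2):
    assert 2 * w1 <= w2 <= 8 * w1, "window inequality from the paper"
    interest = moving_average(signal, w1) > moving_average(signal, w2)
    peaks, start = [], None
    for i, flag in enumerate(interest):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= w1:                      # reject too-short blocks
                peaks.append(start + int(np.argmax(signal[start:i])))
            start = None
    if start is not None and len(signal) - start >= w1:
        peaks.append(start + int(np.argmax(signal[start:])))
    return peaks

t = np.linspace(0, 2, 500)
pulses = np.exp(-((t % 0.8) - 0.1) ** 2 / 0.0005)   # periodic sharp peaks
print(terma_peaks(pulses, w1=10, w2=40))            # ~[25, 225, 424]
```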
Modeling Concept Dependencies for Event Detection
2014-04-04
...Gaussian Mixture Model (GMM). Jiang et al. [8] provide a summary of experiments for TRECVID MED 2010. They employ low-level features such as SIFT and... event detection literature. Ballan et al. [2] present a method to introduce temporal information for video event detection with a BoW (bag-of-words)... approach. Zhou et al. [24] study video event detection by encoding a video with a set of bag-of-SIFT feature vectors and describe the distribution with a...
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time and often miss sub-events, or pieces of the broader event puzzle, entirely. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time, so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, brings us closer to solving the event puzzle. PMID:29107976
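The burst detection step that the stream division feeds can be sketched simply: per-bin counts compared against a rolling baseline. The counts, window, and threshold below are illustrative, not STRIM's actual parameters:

```python
# Sketch: a minimal burst detector over a divided stream: count items per
# time bin and flag bins spiking well above a rolling baseline.
import numpy as np

def detect_bursts(counts: np.ndarray, window: int = 5, k: float = 3.0) -> np.ndarray:
    """Return bin indices exceeding mean + k*std of the preceding window."""
    bursts = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        if counts[i] > base.mean() + k * base.std() + 1:
            bursts.append(i)
    return np.array(bursts)

counts = np.array([12, 10, 11, 13, 12, 11, 48, 50, 12, 11, 10, 30])
print(detect_bursts(counts))  # -> [6]: the burst onset at bin 6
```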
Results from the MACHO Galactic Pixel Lensing Search
NASA Astrophysics Data System (ADS)
Drake, Andrew J.; Minniti, Dante; Alcock, Charles; Allsman, Robyn A.; Alves, David; Axelrod, Tim S.; Becker, Andrew C.; Bennett, David; Cook, Kem H.; Freeman, Ken C.; Griest, Kim; Lehner, Matt; Marshall, Stuart; Peterson, Bruce; Pratt, Mark; Quinn, Peter; Rodgers, Alex; Stubbs, Chris; Sutherland, Will; Tomaney, Austin; Vandehei, Thor; Welch, Doug L.
The MACHO, EROS, OGLE and AGAPE collaborations have been studying the nature of the Galactic halo for a number of years using microlensing events. The MACHO group undertakes observations of the LMC, SMC and Galactic Bulge, monitoring the light curves of millions of stars to detect microlensing. Most of these fields are crowded to the extent that all the monitored stars are blended. Such crowding makes accurate photometry difficult. We apply the new technique of Difference Image Analysis (DIA) to archival data to improve the photometry and increase both the detection sensitivity and the effective search area. The application of this technique also allows us to detect so-called `pixel lensing' events. These are microlensing events where the source star is only detectable during lensing. The detection of these events allows a large increase in the number of detected microlensing events. We present a light curve demonstrating the detection of a pixel lensing event with this technique.
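The essence of DIA in this context can be sketched as subtracting a reference frame and searching the residual for a point source that appears only during lensing; real DIA also matches PSFs and flux scales between epochs. The images below are synthetic stand-ins:

```python
# Sketch: difference imaging for "pixel lensing": subtract a reference
# image from each epoch and flag residual brightening above the noise.
import numpy as np

rng = np.random.default_rng(6)
ref = 100 + rng.normal(scale=3, size=(64, 64))       # crowded-field stand-in

epochs = [ref + rng.normal(scale=3, size=ref.shape) for _ in range(5)]
yx = (40, 21)
epochs[3][yx] += 80                                   # lensed source brightens

for i, img in enumerate(epochs):
    diff = img - ref
    if diff.max() > 5 * diff.std():                   # residual above noise
        print(f"epoch {i}: candidate at {np.unravel_index(diff.argmax(), diff.shape)}")
```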
Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-03-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
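A toy version of the pairwise pseudo-association idea: single-station candidate times are paired whenever two stations agree within a generous network window (no moveout model), and overlapping pairs are merged into multi-station candidates. The times and the 3-second window are invented for illustration:

```python
# Sketch: pairwise pseudo-association over per-station detection times.
from itertools import combinations

station_detections = {
    "ST1": [12.0, 118.4, 530.2],
    "ST2": [13.1, 531.0],
    "ST3": [530.7, 880.0],
}
MAX_LAG = 3.0  # seconds; generous bound on inter-station arrival differences

# Pair detections across stations without modeling the expected moveout.
candidate_events = []
for (sa, ta_list), (sb, tb_list) in combinations(station_detections.items(), 2):
    for ta in ta_list:
        for tb in tb_list:
            if abs(ta - tb) <= MAX_LAG:
                candidate_events.append((min(ta, tb), {sa, sb}))

# Merge overlapping pairs into multi-station candidate events.
candidate_events.sort()
merged = []
for t, stations in candidate_events:
    if merged and t - merged[-1][0] <= MAX_LAG:
        merged[-1][1].update(stations)
    else:
        merged.append([t, set(stations)])
print(merged)  # ~12 s seen at ST1+ST2; ~530 s seen at ST1+ST2+ST3
```

Detections seen at only one station (118.4 s, 880.0 s) never form a pair and drop out, which is the multi-station confirmation effect described above.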
Real-Time Event Detection for Monitoring Natural and Source ...
The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software package that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events.
Generalized Detectability for Discrete Event Systems
Shu, Shaolong; Lin, Feng
2011-01-01
In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectability: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on an observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called a detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory of detectability of discrete event systems becomes more applicable for solving many practical problems. PMID:21691432
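The object underlying all of these definitions is the current-state estimate, propagated through a possibly nondeterministic transition relation after each observed event. A toy sketch of that update (the automaton is invented for illustration; the paper's observer and detector constructions build on exactly this operation):

```python
# Sketch: state-estimate update for a nondeterministic discrete event
# system. transitions: state -> event -> set of successor states.
transitions = {
    0: {"a": {1, 2}},
    1: {"b": {3}},
    2: {"b": {3}},
    3: {"a": {3}},
}

def update_estimate(states: set, event: str) -> set:
    return set().union(*(transitions.get(s, {}).get(event, set()) for s in states))

estimate = {0}                        # initial uncertainty: start state known
for ev in ["a", "b", "a"]:
    estimate = update_estimate(estimate, ev)
    print(ev, sorted(estimate))
# After "b" the estimate collapses to {3} and stays a singleton: from that
# point on the current state is known exactly, the detectability property.
```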
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim
2014-04-20
Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲2%). For confident detection of planets in EWCP events, it is necessary to have both high-cadence monitoring and photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars, because KMTNet saturates easily around the peaks of such events owing to its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be detected in the near future.
Detecting and Locating Seismic Events Without Phase Picks or Velocity Models
NASA Astrophysics Data System (ADS)
Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.
2015-12-01
The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
Ohyama, Junji; Watanabe, Katsumi
2016-01-01
We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while, in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images. PMID:26869966
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered preferable due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms achieve reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair ascent and stair descent terrains. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the algorithm is robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop foot correction devices and leg prostheses. PMID:27706086
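The general shape of the algorithm, differentiating acceleration into jerk and then picking prominent peaks, can be sketched as below; the paper's time-frequency step and peak heuristics are more involved, and the signal, sampling rate, and thresholds here are invented for illustration:

```python
# Sketch: derive a jerk signal from acceleration and pick its prominent
# peaks as candidate gait events. Data are synthetic; a real system would
# use shank or foot accelerometry.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / fs)
accel = np.sin(2 * np.pi * 1.0 * t) ** 7     # sharp once-per-cycle bursts ~ gait

jerk = np.gradient(accel, 1 / fs)            # time derivative of acceleration

# Candidate gait events: prominent jerk peaks separated by at least 0.4 s.
peaks, _ = find_peaks(jerk, prominence=0.5 * jerk.max(), distance=int(0.4 * fs))
print(peaks / fs)                            # event times in seconds
```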
Detecting earthquakes over a seismic network using single-station similarity measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-06-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network: event-pair extraction, pairwise pseudo-association, and event resolution, which together form a post-processing pipeline that combines single-station similarity measures (e.g. the FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is sensitive while maintaining a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
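Pairwise pseudo-association can be illustrated with a toy sketch. Because the travel time from a fixed source to a fixed station cancels when two detections of repeating events are differenced, the inter-detection interval is nearly station-independent. The following Python sketch exploits that observation; the tolerance and input times are hypothetical.

```python
import numpy as np

def pseudo_associate(station_times, tol=2.0):
    """Associate event pairs across stations without moveout modeling:
    the differential time between two detections of a repeating source
    is (nearly) station-independent, so matching inter-detection
    intervals across stations implies a common source. 'tol' (seconds)
    is an assumed tolerance."""
    interval_sets = []
    for times in station_times:
        t = np.sort(np.asarray(times))
        i, j = np.triu_indices(len(t), k=1)
        interval_sets.append(t[j] - t[i])        # all pairwise intervals
    candidates = []
    for dt in interval_sets[0]:
        # Keep an interval if another station saw (nearly) the same one.
        if any(np.any(np.abs(s - dt) < tol) for s in interval_sets[1:]):
            candidates.append(dt)
    return candidates

# Two stations see the same repeating source at different absolute times
# (different moveout) but with identical inter-event intervals.
print(pseudo_associate([[10.0, 110.0, 400.0], [13.5, 113.5, 250.0]]))
```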
Self-similarity Clustering Event Detection Based on Triggers Guidance
NASA Astrophysics Data System (ADS)
Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan
Traditional methods of Event Detection and Characterization (EDC) treat event detection as a classification problem, using words as samples to train a classifier, which can lead to an imbalance between the classifier's positive and negative samples. This approach also suffers from data sparseness when the corpus is small. Rather than classifying events with words as samples, this paper clusters events when judging event types. It uses self-similarity, under the guidance of event triggers, to converge on the value of K in the K-means algorithm, thereby optimizing the clustering. Then, combining named entities with their relative position information, the new method determines the precise type of each event. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can be used in automatic text summarization, text retrieval, and topic detection and tracking.
Systematic detection of seismic events at Mount St. Helens with an ultra-dense array
NASA Astrophysics Data System (ADS)
Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.
2016-12-01
During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intense swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring seismic velocity changes at Mount St. Helens using coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
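A matched filter detector of the kind used here can be sketched in a few lines: correlate a normalized template against the continuous data and declare detections where the correlation trace exceeds a robust threshold. The MAD-based threshold below is a common convention but an assumption on our part, not necessarily this study's choice.

```python
import numpy as np

def matched_filter(data, template, n_mad=8.0):
    """Slide a waveform template over continuous data and return indices
    where the normalized cross-correlation exceeds a robust threshold."""
    m = len(template)
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    # Sliding windows over the continuous data (requires numpy >= 1.20).
    win = np.lib.stride_tricks.sliding_window_view(data, m)
    win = win - win.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(win, axis=1)
    cc = win @ t / np.where(norms > 0, norms, 1.0)
    # Threshold: median of the CC trace plus n_mad median absolute deviations.
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > np.median(cc) + n_mad * mad), cc
```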
Radiation detector device for rejecting and excluding incomplete charge collection events
Bolotnikov, Aleksey E.; De Geronimo, Gianluigi; Vernon, Emerson; Yang, Ge; Camarda, Giuseppe; Cui, Yonggang; Hossain, Anwar; Kim, Ki Hyun; James, Ralph B.
2016-05-10
A radiation detector device is provided that is capable of distinguishing between full charge collection (FCC) events and incomplete charge collection (ICC) events based upon a correlation value comparison algorithm that compares correlation values calculated for individually sensed radiation detection events with a calibrated FCC event correlation function. The calibrated FCC event correlation function serves as a reference curve utilized by a correlation value comparison algorithm to determine whether a sensed radiation detection event fits the profile of the FCC event correlation function within the noise tolerances of the radiation detector device. If the radiation detection event is determined to be an ICC event, then the spectrum for the ICC event is rejected and excluded from inclusion in the radiation detector device spectral analyses. The radiation detector device also can calculate a performance factor to determine the efficacy of distinguishing between FCC and ICC events.
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps, and singleplex detection cannot meet the demand of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve absolute quantitation of different GMO events, and the LOQ and LOD indicate that the method is suitable for the routine detection and quantitation of GMO events.
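The quantitation logic rests on Poisson statistics of partition occupancy: the mean number of copies per partition is λ = -ln(negative/total). A minimal sketch, with a hypothetical partition volume and invented counts:

```python
import math

def dpcr_copies(n_negative, n_total, volume_nl=0.85):
    """Poisson estimate of target abundance from digital PCR partition
    counts (the per-partition volume here is an assumed value)."""
    lam = -math.log(n_negative / n_total)        # mean copies per partition
    return lam * n_total, lam / volume_nl        # total copies, copies/nL

# Duplex reaction: transgene and reference gene counted in the same run.
target_copies, _ = dpcr_copies(n_negative=17000, n_total=20000)
ref_copies, _ = dpcr_copies(n_negative=4000, n_total=20000)
# GMO content as the ratio of transgene to reference-gene copies.
print(f"GMO content: {100 * target_copies / ref_copies:.2f}%")
```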
Station Set Residual: Event Classification Using Historical Distribution of Observing Stations
NASA Astrophysics Data System (ADS)
Procopio, Mike; Lewis, Jennifer; Young, Chris
2010-05-01
Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and to its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event, versus the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
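One simple way to quantify such a residual (the abstract notes there are many) is to penalize missing stations by their historical detection frequency and unexpected stations by a fixed cost. A hypothetical sketch:

```python
def station_set_residual(observed, expected_freq, unexpected_penalty=1.0):
    """Score the mismatch between the observed detecting-station set and
    historical detection frequencies for similar nearby events. One of
    many possible quantifications; the weights are assumptions."""
    score = 0.0
    for stn, freq in expected_freq.items():
        if stn not in observed:
            score += freq                  # missing an expected station
    for stn in observed:
        if stn not in expected_freq:
            score += unexpected_penalty    # present but historically unseen
    return score

# Historically, similar events are seen at A (90%), B (80%), C (60%).
expected = {"A": 0.9, "B": 0.8, "C": 0.6}
print(station_set_residual({"A", "B", "C"}, expected))   # 0.0: plausible event
print(station_set_residual({"A", "X"}, expected))        # high: suspect event
```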
NASA Astrophysics Data System (ADS)
Hopp, C. J.; Savage, M. K.; Townend, J.; Sherburn, S.
2016-12-01
Monitoring patterns in local microseismicity gives clues to the existence and location of subsurface structures. In the context of a geothermal reservoir, subsurface structures often indicate areas of high permeability and are vitally important in understanding fluid flow within the geothermal resource. Detecting and locating microseismic events within an area of power generation, however, is often challenging due to high levels of noise associated with nearby power plant infrastructure. In this situation, matched filter detection improves drastically upon standard earthquake detection techniques, specifically when events are likely induced by fluid injection and are therefore near-repeating. Using an earthquake catalog of 637 events that occurred between 1 January and 18 November 2015 as our initial dataset, we implemented a matched filtering routine for the Mighty River Power (MRP) geothermal fields at Rotokawa and Ngatamariki, central North Island, New Zealand. We detected nearly 21,000 additional events across both geothermal fields, a roughly 30-fold increase over the original catalog. On average, each of the 637 template events detected 45 additional events throughout the study period, with a maximum of 359 additional detections for a single template. Cumulative detection rates for all template events, in general, do not mimic large-scale changes in injection rates within the fields; however, we do see indications of an increase in detection rate associated with power plant shutdown at Ngatamariki. Locations of detected events follow established patterns of historic seismicity at both Ngatamariki and Rotokawa. One large cluster of events persists in the southeastern portion of Rotokawa and is likely bounded to the northwest by a known fault dividing the injection and production sections of the field. Two distinct clusters of microseismicity occur in the north and south of Ngatamariki, the latter appearing to coincide with a structure dividing the production zone and the southern injection zone.
Exploiting semantics for sensor re-calibration in event detection systems
NASA Astrophysics Data System (ADS)
Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini
2008-01-01
Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it is still a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that lie beneath video frames and cannot be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas using an appliance as an example--coffee pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.
Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.
Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng
2016-12-08
This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control systems (NCSs) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and by introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve efficient utilization of the communication network bandwidth. Both the sensor-to-control-station and the control-station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented, and a new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous FDF and controller design.
Event detection in an assisted living environment.
Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin
2011-01-01
This paper presents the design of a wireless event detection and in-building location awareness system. The system architecture is based on a body-worn sensor that detects events such as falls where they occur in an assisted living environment. This process involves developing event detection algorithms and transmitting detected events wirelessly to an in-house network based on the 802.15.4 protocol. The network then generates alerts both in the assisted living facility and remotely to an offsite monitoring facility. The focus of this paper is on the design of the system architecture and the compliance challenges in applying this technology.
Abnormal global and local event detection in compressive sensing domain
NASA Astrophysics Data System (ADS)
Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem
2018-05-01
Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection that can cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can locate abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. The abnormality detection task is then formulated as a one-class classification problem that learns only from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark abnormal event detection datasets compared with state-of-the-art methods.
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines permutation entropy and a support vector machine to detect low-SNR microseismic events. First, a signal feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropy for the collected vibration signals and constructing a feature vector set. Finally, a comparative analysis of the microseismic events and noise signals in the experiment shows that multi-scale permutation entropy fully expresses the different characteristics of the two. The detection model combined with the support vector machine offers high classification accuracy and fast execution, and can meet the requirements of online, real-time extraction of microseismic events.
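Permutation entropy itself is compact to implement: embed the signal, map each embedded vector to its ordinal pattern, and compute the Shannon entropy of the pattern distribution. A minimal single-scale sketch (the multi-scale variant would first coarse-grain the signal by averaging non-overlapping windows of each scale length):

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized (Bandt-Pompe) permutation entropy of a 1-D signal."""
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        # Ordinal pattern: rank ordering of the embedded vector.
        pat = tuple(np.argsort(x[i:i + order * delay:delay]))
        patterns[pat] = patterns.get(pat, 0) + 1
    probs = np.array(list(patterns.values()), dtype=float) / n
    h = -np.sum(probs * np.log2(probs))
    return h / math.log2(math.factorial(order))   # normalize to [0, 1]

rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=2000)))            # noise: near 1
print(permutation_entropy(np.sin(np.linspace(0, 40, 2000)))) # signal: lower
```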
Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi
2018-03-01
Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure has often been identified only by arrhythmic events, not by impedance abnormalities. This study compared the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center at Okayama University Hospital, and all transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, and not by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none experienced inappropriate therapy. RM can detect lead failure early, before clinical adverse events. However, CIEDs often diagnose lead failure as mere arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful.
Automatic Detection and Classification of Audio Events for Road Surveillance Applications.
Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine
2018-06-06
This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents, with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy compared against methods that use individual temporal and spectral features.
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
Oliker, Nurit; Ostfeld, Avi
2014-03-15
This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers and a subsequent sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: balancing the difference in size between the two classes' data sets (as there are many more normal/regular measurements than event-time measurements), and incorporating the time factor through a time-decay coefficient that ascribes higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely automatic. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is notable for its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts of contamination event detection.
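The two roles of the SVM weights described above map naturally onto standard library hooks: class weighting to offset the rare-event imbalance, and per-sample weights for time decay. A sketch with synthetic data and an assumed decay rate (not the paper's calibrated values):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical multivariate water-quality measurements (rows = time steps).
X_normal = rng.normal(0.0, 1.0, size=(500, 4))
X_event = rng.normal(2.0, 1.0, size=(25, 4))    # rare contamination events
X = np.vstack([X_normal, X_event])
y = np.r_[np.zeros(500), np.ones(25)]

# Time-decay weights: recent observations count more (decay rate assumed).
age = np.arange(len(X))[::-1]                   # 0 = most recent sample
w = np.exp(-age / 200.0)

# class_weight='balanced' offsets the class-size difference, as in the abstract.
clf = SVC(kernel="rbf", class_weight="balanced")
clf.fit(X, y, sample_weight=w)
print(clf.predict(X[-5:]))                      # recent event-like samples
```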
Efficient method for events detection in phonocardiographic signals
NASA Astrophysics Data System (ADS)
Martinez-Alajarin, Juan; Ruiz-Merino, Ramon
2005-06-01
The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer the patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist the physician at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this method often fails to detect the main sounds when additional sounds and murmurs exist, or it may join several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of murmurs is detected, which aids in differentiating diseases that can occur at the same temporal location. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
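The envelope-maxima idea can be sketched directly: band-pass the signal, take the Hilbert envelope, smooth it, and pick relative maxima with simple spacing constraints. The band, smoothing window, and peak heuristics below are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def pcg_events(pcg, fs):
    """Candidate heart-sound events as relative maxima of the amplitude
    envelope; a sketch with assumed band and peak heuristics."""
    b, a = butter(4, [25 / (fs / 2), 150 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, pcg)
    env = np.abs(hilbert(filtered))               # amplitude envelope
    w = max(1, int(0.02 * fs))                    # ~20 ms moving average
    env = np.convolve(env, np.ones(w) / w, mode="same")
    # Relative maxima at least 100 ms apart, above a relative height.
    peaks, props = find_peaks(env, distance=int(0.1 * fs),
                              height=0.2 * env.max())
    return peaks / fs, props["peak_heights"]      # times (s) and amplitudes
```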
Perceiving goals and actions in individuals with autism spectrum disorders.
Zalla, Tiziana; Labruyère, Nelly; Georgieff, Nicolas
2013-10-01
In the present study, we investigated the ability to parse familiar sequences of action into meaningful events in young individuals with autism spectrum disorders (ASDs), as compared to young individuals with typical development (TD) and young individuals with moderate mental retardation or learning disabilities (MLDs). While viewing two videotaped movies, participants were requested to detect the boundary transitions between component events at both fine and coarse levels of the action hierarchical structure. Overall, reduced accuracy for event detection was found in participants with ASDs, relative to participants with TD, at both levels of action segmentation. Performance was, however, equally diminished in participants with ASDs and MLDs under coarse-grained segmentation, suggesting that difficulties in detecting fine-grained events in ASDs cannot be explained by a general intellectual dysfunction. Reduced accuracy for event detection was related to diminished event recall, memory for event sequence, and Theory of Mind abilities. We hypothesize that difficulties with event detection result from a deficit disrupting the on-line processing of kinematic features and physical changes of dynamic human actions. An impairment at the earlier stages of the event encoding process might contribute to deficits in episodic memory and social functioning in individuals with ASDs.
NASA Astrophysics Data System (ADS)
Baziw, Erick; Verbeek, Gerald
2012-12-01
Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal-to-noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data because it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
NASA Astrophysics Data System (ADS)
Mahmoud, Seedahmed S.; Visagathilagar, Yuvaraja; Katsifolis, Jim
2012-09-01
The success of any perimeter intrusion detection system depends on three important performance parameters: the probability of detection (POD), the nuisance alarm rate (NAR), and the false alarm rate (FAR). The most fundamental parameter, POD, is normally related to a number of factors such as the event of interest, the sensitivity of the sensor, the installation quality of the system, and the reliability of the sensing equipment. The suppression of nuisance alarms without degrading sensitivity in fiber optic intrusion detection systems is key to maintaining acceptable performance. Signal processing algorithms that maintain the POD and eliminate nuisance alarms are crucial for achieving this. In this paper, a robust event classification system using supervised neural networks together with a level crossings (LCs) based feature extraction algorithm is presented for the detection and recognition of intrusion and non-intrusion events in a fence-based fiber-optic intrusion detection system. A level crossings algorithm is also used with a dynamic threshold to suppress torrential rain-induced nuisance alarms in a fence system. Results show that rain-induced nuisance alarms can be suppressed for rainfall rates in excess of 100 mm/hr with the simultaneous detection of intrusion events. The use of a level crossing based detection and novel classification algorithm is also presented for a buried pipeline fiber optic intrusion detection system for the suppression of nuisance events and discrimination of intrusion events. The sensor employed for both types of systems is a distributed bidirectional fiber-optic Mach-Zehnder (MZ) interferometer.
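A level-crossings feature vector is straightforward to compute: count, for each of a set of amplitude levels, how often the signal crosses that level upward within a window. A sketch with assumed levels:

```python
import numpy as np

def level_crossing_features(x, levels):
    """Count upward crossings of each level; a sketch of an LC-based
    feature vector for event classification (levels are assumptions)."""
    x = np.asarray(x)
    feats = []
    for lv in levels:
        above = x >= lv
        # Upward crossing: sample below the level followed by one at/above it.
        feats.append(int(np.count_nonzero(~above[:-1] & above[1:])))
    return np.array(feats)

# Example: a burst produces more high-level crossings than pure noise.
rng = np.random.default_rng(1)
noise = rng.normal(0, 0.2, 2000)
burst = noise.copy()
burst[900:1100] += np.sin(np.linspace(0, 60, 200))
print(level_crossing_features(noise, [0.5, 1.0]))
print(level_crossing_features(burst, [0.5, 1.0]))
```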
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W.; Benz, H.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detection, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events through recognition by individual doctors. We developed a system that detects hematotoxicity adverse events from blood test results recorded in an electronic medical record system. The blood test results were graded based on the Common Terminology Criteria for Adverse Events (CTCAE), and changes in the results (Up, Down, Flat) were assessed according to the variation in grade. The changes in the blood test results and the injection data were stored in a database. By comparing the date of injection with the start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grades 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rates of occurrence of a decreased WBC count, increased ALT level and increased creatinine level were 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
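The grading-and-trend logic can be sketched as follows. The grade thresholds below are simplified placeholders (grade 1 is omitted, and real CTCAE cutoffs should be taken from the published criteria), and the drug and date values are invented:

```python
from datetime import date

def wbc_grade(value):
    """Simplified CTCAE-style grade for WBC count (10^3/uL); assumed cutoffs."""
    if value >= 3.0:
        return 0
    if value >= 2.0:
        return 2
    if value >= 1.0:
        return 3
    return 4

def grade_changes(results):
    """Label each step between consecutive results Up/Down/Flat by grade."""
    out = []
    for (d1, v1), (d2, v2) in zip(results, results[1:]):
        g1, g2 = wbc_grade(v1), wbc_grade(v2)
        trend = "Up" if g2 > g1 else "Down" if g2 < g1 else "Flat"
        out.append((d1, d2, trend, g2))
    return out

results = [(date(2015, 1, 5), 5.1), (date(2015, 1, 12), 1.8),
           (date(2015, 1, 19), 0.9), (date(2015, 1, 26), 4.2)]
injection = date(2015, 1, 6)
for d1, d2, trend, g in grade_changes(results):
    # A grade increase to >= 3 starting after the injection date is flagged
    # as a candidate serious adverse event for the designated drug.
    if trend == "Up" and g >= 3 and d1 >= injection:
        print("candidate adverse event:", d1, "->", d2, "grade", g)
```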
Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events
NASA Astrophysics Data System (ADS)
Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.
2008-12-01
To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70 s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions, such as the 2000 Miyakejima sequence near Japan, and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in three groups that are well separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
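The stacking step can be sketched for a single target location: shift each station's envelope by the predicted travel time and sum, so that a real event produces a coherent peak at its origin time. The constant group velocity below is an assumed simplification of the dispersion handling a real implementation would need:

```python
import numpy as np

def stack_envelopes(envelopes, dists_km, t0_grid, vel_kms=4.0, dt=1.0):
    """Stack station envelope functions along predicted Rayleigh-wave
    travel times for one target location; a sketch with an assumed
    constant group velocity and sample interval dt (s)."""
    nsamp = envelopes.shape[1]
    stack = np.zeros(len(t0_grid))
    for k, t0 in enumerate(t0_grid):
        for env, d in zip(envelopes, dists_km):
            idx = int(round((t0 + d / vel_kms) / dt))   # predicted arrival
            if 0 <= idx < nsamp:
                stack[k] += env[idx]
    return stack / len(envelopes)   # peaks suggest candidate origin times
```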
Multi-Detection Events, Probability Density Functions, and Reduced Location Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Schrom, Brian T.
2016-03-01
Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach, rather than a simple dilution field of regard approach, to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
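The Bayesian combination of detections and non-detections can be illustrated on a toy 1-D grid: each station contributes a likelihood factor of p(detect | location) or its complement. The detection-probability model below is entirely hypothetical:

```python
import numpy as np

# Toy 1-D grid of candidate source locations and two stations.
grid = np.linspace(0.0, 100.0, 201)        # candidate locations (km)
stations = [20.0, 70.0]                    # station positions (km)

def p_detect(dist_km):
    # Hypothetical detection-probability model: nearer sources are
    # more likely to produce a xenon detection.
    return np.clip(1.0 - dist_km / 120.0, 0.01, 0.99)

prior = np.ones_like(grid) / len(grid)     # uniform prior over the grid
observed = [True, False]                   # one detection, one non-detection

posterior = prior.copy()
for stn, det in zip(stations, observed):
    p = p_detect(np.abs(grid - stn))
    posterior *= p if det else (1.0 - p)   # Bayes update per station
posterior /= posterior.sum()
print("MAP location (km):", grid[np.argmax(posterior)])
```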
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...
2016-01-01
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
An integrated logit model for contamination event detection in water distribution systems.
Housh, Mashor; Ostfeld, Avi
2015-05-15
The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators, and unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The fusion of the individual indicator probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system, and modelling it with a discrete choice model improves performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies.
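The core fusion idea, integrating per-indicator alarm probabilities through a discrete choice (logit) model, can be sketched with a plain logistic regression; the indicator probabilities below are synthetic stand-ins for the single-indicator detectors:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
labels = rng.random(n) < 0.05              # true event time steps (rare)
# Hypothetical per-indicator alarm probabilities (e.g., pH, chlorine,
# turbidity detectors): informative but individually noisy.
probs = np.column_stack([
    np.clip(labels + rng.normal(0, 0.35, n), 0, 1) for _ in range(3)
])

# The logit model learns how to weight and combine the single alarms.
fusion = LogisticRegression().fit(probs, labels)
fused = fusion.predict_proba(probs)[:, 1]  # unified event probability
print("fused alarm rate:", np.mean(fused > 0.5))
```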
Detecting event-related changes in organizational networks using optimized neural network models.
Li, Ze; Sun, Duoyong; Zhu, Renqi; Lin, Zihan
2017-01-01
Organizational external behavior changes are caused by the internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks can efficiently be used to monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers this correlation and can be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes can be effective in providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods in two cases. The results suggested that the proposed method can identify organizational events based on the correlation between organizational networks and events, and that it offers not only higher precision but also better robustness than previously used techniques.
Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B
2014-10-15
Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart failure events.
Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling
Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren
2014-01-01
Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise
NASA Astrophysics Data System (ADS)
Ziegler, Abra E.
The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was quantitatively evaluated during a variety of noise conditions and seismic detections were identified using AST and compared to ancillary injection data. During a period of CO2 injection in a nearby well to the monitoring array, 82% of seismic events were accurately detected, 13% of events were missed, and 5% of detections were determined to be false. Additionally, seismic risk was evaluated from the stress field and faulting regime at FWU to determine the likelihood of pressure perturbations to trigger slip on previously mapped faults. Faults oriented NW-SE were identified as requiring the smallest pore pressure changes to trigger slip and faults oriented N-S will also potentially be reactivated although this is less likely.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
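Wavelet-based event detection of the kind referenced here is often built on thresholding detail coefficients against a robust noise estimate. A minimal sketch (wavelet choice, decomposition level, threshold factor, and the coefficient-to-sample mapping are assumptions):

```python
import numpy as np
import pywt

def wavelet_event_detect(signal, wavelet="db4", level=4, k=5.0):
    """Flag samples whose finest detail coefficients exceed k times a
    MAD-based noise estimate; a sketch of wavelet event detection."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    d1 = coeffs[-1]                              # finest detail coefficients
    sigma = np.median(np.abs(d1)) / 0.6745       # robust noise estimate
    hits = np.flatnonzero(np.abs(d1) > k * sigma)
    # Map detail-coefficient indices back to approximate sample positions.
    return hits * (len(signal) // len(d1))
```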
Initial Evaluation of Signal-Based Bayesian Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Russell, S.
2016-12-01
We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.
Artificial Neural Network applied to lightning flashes
NASA Astrophysics Data System (ADS)
Gin, R. B.; Guedes, D.; Bianchi, R.
2013-05-01
The development of video cameras has enabled scientists to study the behavior of lightning discharges with more precision. The main goal of this project is to create a system able to detect images of lightning discharges stored in videos and classify them using an Artificial Neural Network (ANN), implemented in the C language with OpenCV libraries. The developed system can be split into two modules: a detection module and a classification module. The detection module uses OpenCV's computer vision libraries and image processing techniques to detect significant differences between frames in a sequence, indicating that something, still not classified, occurred. Whenever there is a significant difference between two consecutive frames, two main algorithms are used to analyze the frame image: a brightness algorithm and a shape algorithm. These algorithms detect both the shape and brightness of the event, removing irrelevant events like birds, as well as determining the relevant event's exact position, allowing the system to track it over time. The classification module uses a neural network to classify the relevant events as horizontal or vertical lightning, saves the event's images, and calculates its number of discharges. The neural network was implemented using the backpropagation algorithm and was trained with 42 training images containing 57 lightning events (one image can have more than one lightning flash). The ANN was tested with one to five hidden layers, with up to 50 neurons each. The best configuration achieved a success rate of 95%, with one layer containing 20 neurons (33 test images with 42 events were used in this phase). This configuration was implemented in the developed system to analyze 20 video files containing 63 lightning discharges previously detected manually. Results showed that all the lightning discharges were detected, many irrelevant events were discarded, and the events' numbers of discharges were correctly computed. The neural network used in this project achieved a success rate of 90%. The videos used in this experiment were acquired by seven video cameras installed in São Bernardo do Campo, Brazil, that continuously recorded lightning events during the summer. The cameras were arranged in a 360° loop, recording all data at a time resolution of 33 ms. During this period, several convective storms were recorded.
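The detection module's frame-differencing-plus-brightness logic can be sketched with OpenCV's Python bindings (the original system is in C). The file name, pixel-count threshold, and brightness cutoff are all illustrative assumptions:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("storm.avi")        # hypothetical input video
ok, prev = cap.read()                      # assumes the video opens and
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)  # has at least one frame
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)         # change between consecutive frames
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:       # significant difference detected
        # Brightness heuristic: lightning saturates many pixels at once.
        if gray[mask > 0].mean() > 200:
            pts = np.argwhere(mask > 0)[:, ::-1].astype(np.int32)
            x, y, w, h = cv2.boundingRect(pts)   # event's position for tracking
            print(f"candidate flash at frame {frame_idx}: bbox={(x, y, w, h)}")
    prev = gray
cap.release()
```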
Detection of cough signals in continuous audio recordings using hidden Markov models.
Matos, Sergio; Birring, Surinder S; Pavord, Ian D; Evans, David H
2006-06-01
Cough is a common symptom of many respiratory diseases. The evaluation of its intensity and frequency of occurrence could provide valuable clinical information in the assessment of patients with chronic cough. In this paper we propose the use of hidden Markov models (HMMs) to automatically detect cough sounds from continuous ambulatory recordings. The recording system consists of a digital sound recorder and a microphone attached to the patient's chest. The recognition algorithm follows a keyword-spotting approach, with cough sounds representing the keywords. It was trained on 821 min selected from 10 ambulatory recordings, including 2473 manually labeled cough events, and tested on a database of nine recordings from separate patients with a total recording time of 3060 min and comprising 2155 cough events. The average detection rate was 82% at a false alarm rate of seven events/h, when considering only events above an energy threshold relative to each recording's average energy. These results suggest that HMMs can be applied to the detection of cough sounds from ambulatory patients. A postprocessing stage to perform a more detailed analysis on the detected events is under development, and could allow the rejection of some of the incorrectly detected events.
A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks
Zhang, Shukui; Chen, Hao; Zhu, Qiaoming
2014-01-01
Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers the various properties that reflect an event's status, the composite event is more consistent with the objective world, and its study is therefore more realistic. In this paper, we analyze the characteristics of the composite event; we then propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. In the case that the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The composite event judgment mechanism based on fuzzy decisions retains the advantages of fuzzy-logic-based algorithms; moreover, it does not need the support of a huge rule base, and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
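The core of the ME method, as described, is simple: average the single-event classifier outputs over a trial and threshold the mean. A minimal sketch, with illustrative probabilities and threshold:

```python
import numpy as np

def trial_is_erroneous(single_event_scores, threshold=0.5):
    """single_event_scores: per-event probabilities that an ErrP occurred."""
    return np.mean(single_event_scores) > threshold

# Even if individual detections are noisy, the averaged decision is more stable.
print(trial_is_erroneous([0.4, 0.7, 0.6, 0.8]))   # True: trial judged erroneous
```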
NASA Astrophysics Data System (ADS)
Pattisahusiwa, Asis; Houw Liong, The; Purqon, Acep
2016-08-01
In this study, we compare two learning mechanisms, outlier detection and novelty detection, in order to detect the ionospheric TEC disturbances caused by the November 2004 geomagnetic storm and the January 2005 substorm. The mechanisms are applied using the ν-SVR learning algorithm, a regression version of SVM. Our results show that both mechanisms are quite accurate in learning TEC data. However, novelty detection is more accurate than outlier detection in extracting anomalies related to geomagnetic events. The anomalies found by outlier detection are mostly related to trends in the data, while those found by novelty detection are associated with geomagnetic events. Novelty detection also shows evidence of LSTIDs during geomagnetic events.
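One way to read the regression-based mechanism is: fit a ν-SVR to the TEC series and flag samples whose residuals are unusually large. The sketch below uses scikit-learn's NuSVR on synthetic data with a 3-sigma rule; both the data and the rule are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.svm import NuSVR

t = np.linspace(0, 10, 200)[:, None]               # time (feature)
tec = np.sin(t).ravel() + 0.05 * np.random.default_rng(1).normal(size=200)
tec[150:155] += 1.0                                 # injected disturbance

model = NuSVR(nu=0.5, C=1.0).fit(t, tec)            # regression fit of the series
residuals = tec - model.predict(t)
anomalies = np.where(np.abs(residuals) > 3 * residuals.std())[0]
print("anomalous samples:", anomalies)              # expected: near samples 150-154
```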
Detecting and characterizing coal mine related seismicity in the Western U.S. using subspace methods
NASA Astrophysics Data System (ADS)
Chambers, Derrick J. A.; Koper, Keith D.; Pankow, Kristine L.; McCarter, Michael K.
2015-11-01
We present an approach for subspace detection of small seismic events that includes methods for estimating magnitudes and associating detections from multiple stations into unique events. The process is used to identify mining related seismicity from a surface coal mine and an underground coal mining district, both located in the Western U.S. Using a blasting log and a locally derived seismic catalogue as ground truth, we assess detector performance in terms of verified detections, false positives and failed detections. We are able to correctly identify over 95 per cent of the surface coal mine blasts and about 33 per cent of the events from the underground mining district, while keeping the number of potential false positives relatively low by requiring all detections to occur on two stations. We find that most of the potential false detections for the underground coal district are genuine events missed by the local seismic network, demonstrating the usefulness of regional subspace detectors in augmenting local catalogues. We note a trade-off in detection performance between stations at smaller source-receiver distances, which have increased signal-to-noise ratio, and stations at larger distances, which have greater waveform similarity. We also explore the increased detection capabilities of a single higher dimension subspace detector, compared to multiple lower dimension detectors, in identifying events that can be described as linear combinations of training events. We find, in our data set, that such an advantage can be significant, justifying the use of a subspace detection scheme over conventional correlation methods.
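A subspace detector in its simplest form builds an orthonormal basis from aligned training waveforms via the SVD and measures the fraction of a data window's energy captured by that basis. A minimal sketch on synthetic waveforms (the subspace dimension d = 2 and the data are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
base = rng.normal(size=400)                         # common waveform shape
training = np.array([base + 0.1 * rng.normal(size=400) for _ in range(5)])

# Orthonormal basis for the subspace spanned by the training waveforms
U, _, _ = np.linalg.svd(training.T, full_matrices=False)
basis = U[:, :2]                                    # keep a d = 2 subspace

def detection_statistic(window):
    """Fraction of the window's energy captured by the subspace (0..1)."""
    projected = basis @ (basis.T @ window)
    return projected @ projected / (window @ window)

print(detection_statistic(training[0]))             # near 1 for a matching event
print(detection_statistic(rng.normal(size=400)))    # near d/N for pure noise
```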
A new moonquake catalog from Apollo 17 geophone data
NASA Astrophysics Data System (ADS)
Dimech, Jesse-Lee; Knapmeyer-Endrun, Brigitte; Weber, Renee
2017-04-01
New lunar seismic events have been detected in geophone data from the Apollo 17 Lunar Seismic Profile Experiment (LSPE). This dataset is already known to contain an abundance of thermal seismic events, and potentially some meteorite impacts, but prior to this study only 26 days of LSPE "listening mode" data had been analysed. In this new analysis, additional listening-mode data collected between August 1976 and April 1977 are incorporated. To the authors' knowledge, these eight months of data have not previously been used to detect seismic moonquake events. The geophones in question are situated adjacent to the Apollo 17 site in the Taurus-Littrow valley, about 5.5 km east of the Lee-Lincoln scarp, and between the North and South Massifs. Any of these features are potential seismic sources. We have used an event-detection and classification technique based on hidden Markov models to automatically detect and categorize seismic signals, in order to objectively generate a seismic event catalog. Currently, 2.5 months of the 8-month listening-mode dataset have been processed, totaling 14,338 detections. Of these, 672 detections (classification "n1") have a sharp onset with a steep rise time, suggesting they occur close to the recording geophone. These events almost all occur in association with lunar sunrise, over a span of 1-2 days. One possibility is that they originate from the nearby Apollo 17 lunar lander due to rapid heating at sunrise. A further 10,004 detections (classification "d1") show strong diurnal periodicity, with detections increasing during the lunar day and reaching a peak at sunset; these probably represent thermal events from the lunar regolith immediately surrounding the Apollo 17 landing site. The final 3662 detections (classification "d2") have emergent onsets and relatively long durations. These detections have peaks associated with lunar sunrise and sunset, but also sometimes peak at seemingly random times. Their source mechanism has not yet been investigated. It is possible that many of them are misclassified d1/n1 events, and further QC work needs to be undertaken. But it is also possible that many represent more distant thermal moonquakes, e.g. from the North and South Massifs, or even the ridge adjacent to the Lee-Lincoln scarp. The unknown event spikes will be the subject of closer inspection once the HMM technique has been refined.
Video Traffic Analysis for Abnormal Event Detection
DOT National Transportation Integrated Search
2010-01-01
We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...
Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring
Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang
2017-01-01
Time-series data for multiple water quality parameters are obtained from the water sensor networks deployed in an agricultural water supply network. The accurate and efficient detection of, and warning about, contamination events to prevent pollution from spreading is one of the most important issues when pollution occurs. In order to comprehensively reduce event detection deviation, a spatial-temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) is proposed. The M-STED approach includes three parts. The first part adopts a Rule K algorithm to select backbone nodes as the nodes in the connected dominating set (CDS) and to forward the sensed data for the multiple water parameters. The second part determines the state of each backbone node with back-propagation neural network models and sequential Bayesian analysis in the current timestamp. The third part establishes a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and traces an "outlier" node through its neighborhood to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and that the false detection rate is lower than 9%. The M-STED approach improves the detection rate by about 40% and reduces the false alarm rate by about 45% compared with S-STED, an event detection algorithm using a single water parameter. Moreover, the proposed M-STED exhibits better performance in terms of detection delay and scalability. PMID:29207535
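The sequential Bayesian step in the second part can be illustrated by a recursive update of the probability that a node observes an event as readings arrive; the prior and likelihood values below are invented for the example.

```python
def bayes_update(prior, lik_event, lik_normal):
    """Posterior P(event) after one reading, by Bayes' rule."""
    numer = lik_event * prior
    return numer / (numer + lik_normal * (1.0 - prior))

p = 0.1                                    # prior probability of a contamination event
for abnormal in [True, True, False, True]: # successive readings at a backbone node
    p = bayes_update(p, 0.8, 0.2) if abnormal else bayes_update(p, 0.2, 0.8)
print(p, p > 0.5)                          # ~0.64: the node is flagged as "event"
```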
NASA Astrophysics Data System (ADS)
Solano, Ericka Alinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli
2015-04-01
A large subset of seismic events do not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that has clear arrivals on only three stations and was therefore not located and reported by the Mexican National Seismological Service (SSN). The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A rough estimate places these two events in the portion of the East Pacific Rise around 9° N. These two events are detected despite their distance from the search area, owing to favorable move-out on the SSN network array. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of "Fusing Symbolic and Numerical Diagnostic Computations" (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the cited companion NASA Tech Briefs article, the contemplated events to be detected would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the instant report, the events to be detected could also include natural phenomena of scientific interest. Hence, use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
SNIa detection in the SNLS photometric analysis using Morphological Component Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.
2015-04-01
Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images, with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible contamination of signal coordinates. We developed a subtracted-image stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of the objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects, while most spurious detections exhibit different shapes. A two-step procedure was necessary to obtain a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data, this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all of them faint. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.
Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.
Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming
2015-12-15
With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
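The final step, taking the barycenter of the weighted coverage graph, can be sketched with networkx, whose barycenter function returns the node(s) minimizing the summed shortest-path distance. The toy graph and the mapping from sensory-data deviation to edge weight are assumptions, not the paper's exact construction.

```python
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([            # edge weight: distance derived from deviations
    ("s1", "s2", 0.2), ("s2", "s3", 0.9),
    ("s2", "s4", 0.8), ("s3", "s4", 0.7),
])
print(nx.barycenter(G, weight="weight"))  # candidate event-source node(s)
```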
Molecular toolbox for the identification of unknown genetically modified organisms.
Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc
2010-03-01
Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.
Selected control events and reporting odds ratio in signal detection methodology.
Ooba, Nobuhiro; Kubota, Kiyoshi
2010-11-01
To determine whether the reporting odds ratio (ROR) using "control events" can detect signals hidden behind striking reports on one or more particular events. We used data from 956 drug use investigations (DUIs) conducted between 1970 and 1998 in Japan and from domestic spontaneous reports (SRs) between 1998 and 2008. The event terms in the DUIs were converted to the preferred terms in the Medical Dictionary for Regulatory Activities (MedDRA). We calculated the incidence proportion for various events and selected 20 "control events" with a relatively constant incidence proportion across DUIs that were also reported regularly to the spontaneous reporting system. A "signal" was generated for a drug-event combination when the lower limit of the 95% confidence interval of the ROR exceeded 1. We also compared the ROR in SRs with the relative risk (RR) in DUIs. The "control events" accounted for 18.2% of all reports. The ROR using "control events" may detect some hidden signals for a drug with a proportion of "control events" lower than the average. The median of the ratios of the ROR using "control events" to the RR was around unity, indicating that "control events" roughly represented the exposure distribution, though the range of the ratios was so wide that an individual ROR should not be regarded as an estimate of the RR. The use of the ROR with "control events" may provide an adjunct to traditional signal detection methods for finding a signal hidden behind some major events. Copyright © 2010 John Wiley & Sons, Ltd.
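The signal criterion described here (lower 95% CI limit of the ROR above 1) can be computed directly from a 2x2 table of reports; a minimal sketch with illustrative counts:

```python
import math

def ror_signal(a, b, c, d):
    """a, b: target drug with/without the event; c, d: all other drugs."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(ROR)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, (lower, upper), lower > 1.0          # signal if lower limit > 1

print(ror_signal(12, 88, 40, 1960))                  # illustrative counts only
```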
Fine-Scale Event Location and Error Analysis in NET-VISA
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2016-12-01
NET-VISA is a generative probabilistic model for the occurrence of seismic, hydroacoustic, and atmospheric events, and for the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and misdetections when forming events, which allows it to make more accurate event hypotheses. It has been evaluated continuously since 2012, and each year it achieves roughly a 60% reduction in the number of missed events without increasing the false event rate, as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analyses of recent important events.
Radionuclide data analysis in connection of DPRK event in May 2009
NASA Astrophysics Data System (ADS)
Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei
2010-05-01
The seismic event detected in the DPRK on 25 May 2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert, the PTS returned to its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for their sites, owing to legitimate activities unrelated to nuclear testing. Therefore, a set of hypotheses was used to test whether a detection was consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation. As a result, it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections in verifying the nuclear traces of possible tests.
NASA Astrophysics Data System (ADS)
Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.
2013-12-01
The aerosol samples taken from CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) of a detection has led to a desire for multi-station or multi-time-period detections. These would be connected through the concept of 'event formation', analogous to event formation in seismic analysis. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis by a more sensitive laboratory to obtain a substantially lower MDC and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost-effective means of enhancing laboratory sensitivity.
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
Fighting detection using interaction energy force
NASA Astrophysics Data System (ADS)
Wateosot, Chonthisa; Suvonvorn, Nikom
2017-02-01
Fighting detection is an important security issue aimed at preventing criminal or undesirable events in public places. Much research on computer vision techniques has addressed the detection of specific events in crowded scenes. In this paper we focus on fighting detection using a social-based Interaction Energy Force (IEF). The method uses low-level features, without object extraction or tracking. The interaction force is modeled using the magnitude and direction of optical flows. A fighting factor is derived from this model to detect fighting events with a thresholding method. An energy map of the interaction force is also presented to identify the corresponding events. The evaluation is performed using the NUSHGA and BEHAVE datasets. The results show high detection accuracy under a variety of conditions.
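A crude stand-in for the optical-flow ingredient of the interaction force is dense Farneback flow between consecutive frames, with the mean flow magnitude as an activity score. The frame files, Farneback parameters, and threshold below are assumptions, and the paper's IEF model is considerably more elaborate.

```python
import cv2

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense optical flow: per-pixel (dx, dy) motion between the two frames
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
magnitude, direction = cv2.cartToPolar(flow[..., 0], flow[..., 1])
if magnitude.mean() > 2.0:                                  # illustrative threshold
    print("high interaction energy: possible fighting event")
```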
Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre
2015-11-15
The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural waters and the environment. This study addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measurements from 2004 to 2011. Storm-event-based data series are derived from the measured continuous time series (716 rainfall events and 521 runoff events are available). The Mann-Kendall test is applied to these event-based data series for trend detection. No trend is found in rainfall, while an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to the growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in the trend detection results lies in the available data: a lack of events due to missing data leads to dramatically increased uncertainty in the results. In contrast, measurement uncertainty in the time-series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in the intra-event distribution of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
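For reference, the Mann-Kendall test used here is straightforward to implement. The sketch below (omitting the tie correction) computes the S statistic, its variance, and a two-sided p-value on a toy event series.

```python
import math
import numpy as np

def mann_kendall(x, alpha=0.05):
    """Mann-Kendall trend test without tie correction; returns (z, p, trend?)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / math.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return z, p, p < alpha

runoff = [10, 12, 9, 14, 15, 13, 17, 18, 16, 21]   # toy event-runoff series
print(mann_kendall(runoff))                        # positive z: upward trend
```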
Adaptive Waveform Correlation Detectors for Arrays: Algorithms for Autonomous Calibration
2007-09-01
March 17, 2005. The seismic signals from both master and detected events are followed by infrasound arrivals. Note the long duration of the ... correlation coefficient traces with a significant array gain. A detected event that is co-located with the master event will record the same time-difference ... estimating the detection threshold reduction for a range of highly repeating seismic sources using arrays of different configurations and at different
Method of controlling cyclic variation in engine combustion
Davis, Jr., Leighton Ira; Daw, Charles Stuart; Feldkamp, Lee Albert; Hoard, John William; Yuan, Fumin; Connolly, Francis Thomas
1999-01-01
Cyclic variation in the combustion of a lean-burning engine is reduced by detecting an engine combustion event output, such as torsional acceleration, in a cylinder (i) at a combustion event (k); using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1); modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i); and calculating a control output, such as fuel pulse width or spark timing, necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1), based on anti-correlation with the detected acceleration and spill-over effects from fueling.
Signaling communication events in a computer network
Bender, Carl A.; DiNicola, Paul D.; Gildea, Kevin J.; Govindaraju, Rama K.; Kim, Chulho; Mirza, Jamshed H.; Shah, Gautam H.; Nieplocha, Jaroslaw
2000-01-01
A method, apparatus, and program product for detecting a communication event in a distributed parallel data processing system in which a message is sent from an origin to a target. A low-level application programming interface (LAPI) is provided which has an operation for associating a counter with a communication event to be detected. The LAPI increments the counter upon the occurrence of the communication event. The number in the counter is monitored, and when the number increases, the event is detected. A completion counter in the origin is associated with the completion of a message being sent from the origin to the target. When the message is completed, LAPI increments the completion counter such that monitoring the completion counter detects the completion of the message. The completion counter may be used to ensure that a first message has been sent from the origin to the target and completed before a second message is sent.
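The counter-based detection scheme can be mimicked in any threaded program: an event handler increments a counter, and a waiter detects the event by observing the counter increase. The sketch below is a generic illustration in Python, not the LAPI interface itself.

```python
import threading

counter = 0
changed = threading.Condition()

def complete_message():
    """Increment the counter when the associated communication event occurs."""
    global counter
    with changed:
        counter += 1
        changed.notify_all()

def wait_for_completion(last_seen):
    """Detect the event by observing that the counter has increased."""
    with changed:
        while counter <= last_seen:
            changed.wait()
        return counter

threading.Timer(0.1, complete_message).start()  # simulate a message completing
print(wait_for_completion(0))                   # returns 1 once the event is seen
```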
NASA Technical Reports Server (NTRS)
Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.
2006-01-01
Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.
A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight
NASA Technical Reports Server (NTRS)
Parker, Joel J. K.; Hughes, Steven P.
2011-01-01
A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
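The stepping/bracketing idea maps directly onto standard root-finding: sample a continuous event function g(t), bracket sign changes, and refine each root. A minimal sketch using SciPy's Brent solver, with a made-up g(t) standing in for a real eclipse or line-of-sight function:

```python
import numpy as np
from scipy.optimize import brentq

def g(t):
    """Made-up event function: sign changes mark event boundaries
    (e.g., entering/leaving a shadow or losing station line of sight)."""
    return np.cos(0.3 * t) - 0.25

ts = np.arange(0.0, 20.0, 0.5)            # stepping phase: coarse samples
for a, b in zip(ts[:-1], ts[1:]):
    if g(a) * g(b) < 0:                   # bracketing phase: sign change found
        print("event boundary at t =", brentq(g, a, b))
```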
Detecting NEO Impacts using the International Monitoring System
NASA Astrophysics Data System (ADS)
Brown, Peter G.; Dube, Kimberlee; Silber, Elizabeth
2014-11-01
As part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty, an International Monitoring System (IMS) consisting of seismic, hydroacoustic, infrasound, and radionuclide technologies has been deployed globally beginning in the late 1990s. The infrasound sub-network of the IMS consists of 47 active stations as of mid-2014. These microbarograph arrays detect coherent infrasonic signals from a range of sources including volcanoes, man-made explosions, and bolides. Bolide detections from IMS stations have been reported since ~2000, but with the maturation of the network over the last several years the rate of detections has increased substantially. Presently the IMS performs semi-automated, near-real-time global event identification on timescales of 6-12 hours, as well as analyst-verified event identification with time lags of several weeks. Here we report on infrasound events identified by the IMS between 2010 and 2014 which are likely bolide impacts. Identification in this context means that an event is included in one of the event bulletins issued by the IMS. In this untargeted study we find that the IMS globally identifies approximately 16 likely bolide impacts per year. In a complementary targeted survey using US Government sensor detections of fireballs released since the beginning of 2014 (as given at http://neo.jpl.nasa.gov/fireballs/), we find that the current IMS system is able to identify ~25% of fireballs with energy E > 0.1 kT. Using all 16 US Government sensor fireballs listed as of July 31, 2014, we are able to detect infrasound from 75% of these events on at least one IMS station. The high ratio of detection to identification is a product of the stricter criteria adopted by the IMS for inclusion in an event bulletin, as compared to simple station detection. We discuss comparisons between infrasound-estimated energies based on amplitudes and periods and the estimates provided by US Government sensors. Specific impact events of interest will be discussed, as well as the utility of the global IMS infrasound system for location and timing of future NEAs detected prior to impact.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Novak, A; Zeng, J
Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure, the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness; more hard data are needed. The purpose of this study was to quantify the effectiveness of physics review with the goal of improving it. Methods: This study analyzed 315 "potentially serious" near-miss incidents within an institutional incident learning system collected over a two-year period. 139 of these originated prior to physics review and were found at the review or after. Incidents were classified as events that: 1) were detected by physics review, 2) could have been detected (but were not), and 3) could not have been detected. Category 1 and 2 events were further classified by which specific check (within physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review, although 42/73 (58%) were not actually detected. 45/73 (62%) of the errors originated in treatment planning, making physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying the isocenter (7/73). Software-based plan-checking systems were evaluated and found to have a potential effectiveness of 40%. Given current data structures, software implementations of some tests, such as the isocenter verification check, would be challenging. Conclusion: Physics plan review is a key safety measure and can detect the majority of reported events. However, a majority of events that potentially could have been detected were NOT detected in this study, indicating the need to improve the performance of physics review.
Object-Oriented Query Language For Events Detection From Images Sequences
NASA Astrophysics Data System (ADS)
Ganea, Ion Eugen
2015-09-01
This paper presents a method for representing events extracted from image sequences, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storage and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. Experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. As the final goal of the research, the language will be extended with specific syntax for constructing templates for abnormal events and for detecting incidents.
Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation
NASA Astrophysics Data System (ADS)
Drasco, Steve; Flanagan, Éanna É.
2002-12-01
We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.
NASA Astrophysics Data System (ADS)
Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.
2015-12-01
The quality of automatic detections from seismic sensor networks depends on a large number of data processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings, yet achieving superior automatic detection of seismic events is closely tied to these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historical data. AST learns to test the raw signal against all event settings and automatically self-tunes to an emerging event in real time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections; reducing false alarms early in the seismic pipeline processing will have a significant impact on this goal. Applicable both to boosting the performance of existing sensors and to new sensor deployments, this system provides an important new method for automatically tuning complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible with manual tuning, and with much less time and effort devoted to the tuning process. Using ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
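The kind of detector being tuned can be illustrated with a basic STA/LTA ratio, whose window lengths and trigger threshold are exactly the sort of parameters AST would learn. The sketch uses non-causal centered windows and synthetic data for simplicity; a real-time detector would use trailing windows.

```python
import numpy as np

def sta_lta(signal, nsta, nlta):
    """Short-term over long-term average of signal energy (centered windows)."""
    sq = np.asarray(signal, dtype=float) ** 2
    sta = np.convolve(sq, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(sq, np.ones(nlta) / nlta, mode="same")
    lta[lta == 0] = 1e-12                      # avoid division by zero
    return sta / lta

rng = np.random.default_rng(3)
trace = rng.normal(0, 1, 3000)
trace[1500:1600] += rng.normal(0, 8, 100)      # synthetic event
ratio = sta_lta(trace, nsta=50, nlta=500)      # window lengths: tunable parameters
print("first triggered samples:", np.where(ratio > 4.0)[0][:5])
```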
Towards a global flood detection system using social media
NASA Astrophysics Data System (ADS)
de Bruijn, Jens; de Moel, Hans; Jongman, Brenden; Aerts, Jeroen
2017-04-01
It is widely recognized that an early warning is critical in improving international disaster response. Analysis of social media in real-time can provide valuable information about an event or help to detect unexpected events. For successful and reliable detection systems that work globally, it is important that sufficient data is available and that the algorithm works both in data-rich and data-poor environments. In this study, both a new geotagging system and multi-level event detection system for flood hazards was developed using Twitter data. Geotagging algorithms that regard one tweet as a single document are well-studied. However, no algorithms exist that combine several sequential tweets mentioning keywords regarding a specific event type. Within the time frame of an event, multiple users use event related keywords that refer to the same place name. This notion allows us to treat several sequential tweets posted in the last 24 hours as one document. For all these tweets, we collect a series of spatial indicators given in the tweet metadata and extract additional topological indicators from the text. Using these indicators, we can reduce ambiguity and thus better estimate what locations are tweeted about. Using these localized tweets, Bayesian change-point analysis is used to find significant increases of tweets mentioning countries, provinces or towns. In data-poor environments detection of events on a country level is possible, while in other, data-rich, environments detection on a city level is achieved. Additionally, on a city-level we analyse the spatial dependence of mentioned places. If multiple places within a limited spatial extent are mentioned, detection confidence increases. We run the algorithm using 2 years of Twitter data with flood related keywords in 13 major languages and validate against a flood event database. We find that the geotagging algorithm yields significantly more data than previously developed algorithms and successfully deals with ambiguous place names. In addition, we show that our detection system can both quickly and reliably detect floods, even in countries where data is scarce, while achieving high detail in countries where more data is available.
Sources of Infrasound events listed in IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick
2017-04-01
Until 2003, two waveform technologies, seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. At the beginning of 2010, the infrasound technology was reintroduced to IDC operations and has since contributed to both the automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which can significantly improve event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g., the Bangkok fireball, 2015), volcanic eruptions (e.g., Calbuco, Chile, 2015), and large surface explosions (e.g., Tianjin, China, 2015). Quarry blasts (e.g., Zheleznogorsk) and large earthquakes (e.g., Italy, 2016) belong to events primarily recorded at seismic stations of the IMS network but are often also detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain a better source location. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events that were missed in the automatic process. Open-source materials may help to identify the nature of some events. Well-recorded examples may be added to the Reference Infrasound Event Database to aid the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.
NASA Astrophysics Data System (ADS)
Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.
2012-12-01
The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a desktop computer at the time of the detections. The continuously updating map displays geolocated tweets arriving after the detection and plots epicenters of recent earthquakes. When available, seismograms from nearby stations are displayed as an additional form of verification. A time series of tweets-per-minute is also shown to illustrate the volume of tweets being generated for the detected event. Future additions are being investigated to provide a more in-depth characterization of the seismic events based on an analysis of tweet text and content from other social media sources.
Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems
2009-03-01
using the composite event detection method [Kerman, Jiang, Blumberg , and Buttrey, 2009]. Although the techniques and utility of the...aforementioned method have been clearly demonstrated, there is still much work and research to be conducted within the realm of event detection. This...detection methods . The paragraphs that follow summarize the discoveries of and lessons learned by multiple researchers and authors over many
Automatic processing of induced events in the geothermal reservoirs Landau and Insheim, Germany
NASA Astrophysics Data System (ADS)
Olbert, Kai; Küperkoch, Ludger; Meier, Thomas
2016-04-01
Induced events can pose a risk to local infrastructure and need to be understood and evaluated. They also represent an opportunity to learn more about reservoir behavior and characteristics. Prior to analysis, the waveform data must be processed consistently and accurately to avoid erroneous interpretations. In the framework of the MAGS2 project, an automatic off-line event detection algorithm and a phase onset time determination algorithm are applied to induced seismic events in the geothermal systems in Landau and Insheim, Germany. The off-line detection algorithm is based on cross-correlating continuous data from the local seismic network with master events. It distinguishes between events from different reservoirs and within the individual reservoirs, and it provides location and magnitude estimates. Data from 2007 to 2014 are processed, and the results are compared with other detections from the SeisComp3 cross-correlation detector and an STA/LTA detector. The detected events are analyzed for spatial and temporal clustering, and the number of events is compared with the existing detection lists. The automatic phase picking algorithm combines an AR-AIC approach with a cost function to find precise P1- and S1-phase onset times, which can be used for localization and tomography studies. 800 induced events are processed, yielding 5000 P1- and 6000 S1-picks. The phase onset times show high precision, with mean residuals relative to manual picks of 0 s (P1) to 0.04 s (S1) and standard deviations below ±0.05 s. The resulting automatic picks are used to relocate a selected number of events in order to evaluate their influence on location precision.
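A master-event correlation detector of the kind described reduces to sliding normalized cross-correlation: the master waveform is compared against every window of the continuous record, and windows whose correlation exceeds a threshold are flagged. The synthetic data and the 0.7 threshold below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
master = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)  # master waveform
continuous = rng.normal(0, 0.2, 5000)
continuous[3000:3200] += master                 # a buried repeat of the master event

# Normalized (Pearson) correlation of the master against every 200-sample window
m = (master - master.mean()) / (master.std() * master.size)
windows = np.lib.stride_tricks.sliding_window_view(continuous, 200)
cc = np.array([np.dot(m, (w - w.mean()) / (w.std() + 1e-12)) for w in windows])
print("detections near sample:", np.where(cc > 0.7)[0])
```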
On-Die Sensors for Transient Events
NASA Astrophysics Data System (ADS)
Suchak, Mihir Vimal
Failures caused by transient electromagnetic events like electrostatic discharge (ESD) are a major concern for embedded systems. The component that fails is often an integrated circuit (IC). Determining which IC is affected in a multi-device system is a challenging task. Debugging such errors often requires sophisticated lab setups in which various parts of the system, which might not be easily accessible, are intentionally disturbed and probed. Opening the system and adding probes may change its response to the transient event, which further compounds the problem. On-die transient event sensors were developed that require relatively little die area, making them inexpensive; they consume negligible static current and do not interfere with normal operation of the IC. These circuits can be used to determine the pin involved and the level of the event when a transient event affects the IC, thus allowing the user to debug system-level transient events without modifying the system. The circuit and detection-scheme design has been completed and verified in simulations within the Cadence Virtuoso environment. Simulations accounted for the impact of the ESD protection circuits and parasitics from the I/O pin, package, and I/O ring, and included a model of an ESD gun to test the circuit's response to an ESD pulse as specified in IEC 61000-4-2. Multiple detection schemes are proposed. The final detection scheme consists of an event detector and a level sensor. The event detector latches on the presence of an event at a pad, to determine on which pin an event occurred. The level sensor generates a current proportional to the level of the event. This current is converted to a voltage and digitized at the A/D converter to be read by the microprocessor. The detection scheme shows good performance in simulations when checked against process variations and different kinds of events.
Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data
NASA Astrophysics Data System (ADS)
Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.
2017-12-01
Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, floods, and droughts. In particular, the copula has been a useful tool for multivariate probability modeling because it places no restrictions on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events using an inter-event time definition (IETD), and each rainfall event has a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently due to climate change, and trend detection is important for determining whether the data are nonstationary. In this study, trend detection and nonstationary bivariate frequency analysis are performed on the rainfall depth and duration of rainfall events. Hourly data recorded over more than 30 years at 62 Korea Meteorological Administration (KMA) stations are used, and the suitability of the nonstationary copula for rainfall events is examined by goodness-of-fit tests.
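The copula construction referred to here follows Sklar's theorem: the joint distribution of event depth and duration is assembled from arbitrary marginals through a copula, for example a Gumbel copula (shown below; the choice of the Gumbel family is illustrative, not the paper's stated model).

```latex
H(d,t) = C\bigl(F_D(d),\, G_T(t)\bigr), \qquad
C^{\mathrm{Gumbel}}_{\theta}(u,v) =
\exp\!\left[-\Bigl((-\ln u)^{\theta} + (-\ln v)^{\theta}\Bigr)^{1/\theta}\right],
\quad \theta \ge 1 .
```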
NASA Astrophysics Data System (ADS)
Ziegler, A.; Balch, R. S.; Knox, H. A.; Van Wijk, J. W.; Draelos, T.; Peterson, M. G.
2016-12-01
We present results (e.g. seismic detections and STA/LTA detection parameters) from a continuous downhole seismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project. Specifically, we evaluate data from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. This detection database is directly compared to ancillary data (i.e. wellbore pressure) to determine whether there is any relationship between seismic observables and CO2 injection and pressure maintenance in the field. Of particular interest is the detection of relatively low-amplitude signals constituting long-period long-duration (LPLD) events that may be associated with slow shear slip analogous to low-frequency tectonic tremor. While this category of seismic event provides great insight into the dynamic behavior of the pressurized subsurface, it is inherently difficult to detect. To automatically detect seismic events using effective data processing parameters, an automated sensor tuning (AST) algorithm developed by Sandia National Laboratories is being utilized. AST exploits ideas from neuro-dynamic programming (reinforcement learning) to automatically self-tune and determine optimal detection parameter settings. It adapts in near real time to changing conditions and automatically self-tunes a signal detector to identify (detect) only signals from events of interest, leading to a reduction in the number of missed legitimate event detections and the number of false event detections. Funding for this project is provided by the U.S. Department of Energy's (DOE) National Energy Technology Laboratory (NETL) through the Southwest Regional Partnership on Carbon Sequestration (SWP) under Award No. DE-FC26-05NT42591. Additional support has been provided by site operator Chaparral Energy, L.L.C. and Schlumberger Carbon Services. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Liu, David; Jenkins, Simon A; Sanderson, Penelope M; Watson, Marcus O; Leane, Terrence; Kruys, Amanda; Russell, W John
2009-10-01
Head-mounted displays (HMDs) can help anesthesiologists with intraoperative monitoring by keeping patients' vital signs within view at all times, even while the anesthesiologist is busy performing procedures or unable to see the monitor. The anesthesia literature suggests that there are advantages of HMD use, but research into head-up displays in the cockpit suggests that HMDs may exacerbate inattentional blindness (a tendency for users to miss unexpected but salient events in the field of view) and may introduce perceptual issues relating to focal depth. We investigated these issues in two simulator-based experiments. Experiment 1 investigated whether wearing a HMD would affect how quickly anesthesiologists detect events, and whether the focus setting of the HMD (near or far) makes any difference. Twelve anesthesiologists provided anesthesia in three naturalistic scenarios within a simulated operating theater environment. There were 24 different events that occurred either on the patient monitor or in the operating room. Experiment 2 investigated whether anesthesiologists physically constrained by performing a procedure would detect patient-related events faster with a HMD than without. Twelve anesthesiologists performed a complex simulated clinical task on a part-task endoscopic dexterity trainer while monitoring the simulated patient's vital signs. All participants experienced four different events within each of two scenarios. Experiment 1 showed that neither wearing the HMD nor adjusting the focus setting reduced participants' ability to detect events (the number of events detected and the time to detect events). In general, participants spent more time looking toward the patient and less time toward the anesthesia machine when they wore the HMD than when they used standard monitoring alone. Participants reported that they preferred the near focus setting. Experiment 2 showed that participants detected two of four events faster with the HMD, but one event more slowly with the HMD. Participants turned to look toward the anesthesia machine significantly less often when using the HMD. When using the HMD, participants reported that they were less busy, monitoring was easier, and they believed they were faster at detecting abnormal changes. The HMD helped anesthesiologists detect events when physically constrained, but not when physically unconstrained. Although there was no conclusive evidence of the worsened inattentional blindness found in aviation, the perceptual properties of the HMD appear to influence whether events are detected. Anesthesiologists wearing HMDs should self-adjust the focus to minimize eyestrain and should be aware that some changes may not attract their attention. Future areas of research include developing principles for the design of HMDs, evaluating other types of HMDs, and evaluating the HMD in clinical contexts.
Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas
2013-07-01
Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder for detecting AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke and eligibility for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that approximately one third of patients with cryptogenic stroke were eligible for insertion of a cardiac event recorder, which detected new AF in approximately one quarter of these patients.
NASA Astrophysics Data System (ADS)
Hutchison, A. A.; Ghosh, A.
2016-12-01
Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure moment tensor solutions are similar to those of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
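The core of a match-filter detector is normalized cross-correlation of a template against the continuous record with a data-adaptive threshold. A single-channel sketch of the general technique only; the study additionally stacks correlograms over an array and verifies moment tensors, which is not reproduced here, and the 8x MAD threshold is a common but illustrative choice:

```python
import numpy as np

def match_filter(data, template, thresh_mads=8.0):
    """Slide a standardized template along the trace; return indices
    where the correlation coefficient exceeds thresh_mads times the
    median absolute deviation of the correlogram."""
    data = np.asarray(data, float)
    template = np.asarray(template, float)
    nt = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(data) - nt + 1)
    for i in range(len(cc)):                  # O(N*nt); fine for a sketch
        w = data[i:i + nt]
        cc[i] = np.dot(t, w - w.mean()) / (nt * w.std() + 1e-12)
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > thresh_mads * mad), cc
```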
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
Anesthesia information management systems (AIMS) are optimized to find and display data and curves for one specific intervention, not for retrospective analysis over a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture and (2) the completeness of documentation. Data warehouses were developed to provide an architecture suitable for analysis and thus address the first limitation; the completeness of documentation, however, remains unsolved. In this paper, we describe a method for determining substitution rules that detect missing anesthesia events in an anesthesia record. The method is based on the principle that a missing event can be replaced by a substitute, defined as the nearest documented event. As an example, we focused on automatically detecting the start and end of the anesthesia procedure when these events were not documented by clinicians. We applied the method to a set of records to evaluate (1) the event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia, and we increased data completeness for these two events by 21.1% (from 80.3% to 97.2% of the total database). The method appears efficient for replacing missing "start and end of anesthesia" events and could also be used to replace other missing time events, both in this data warehouse and in others.
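The substitution principle itself is simple to express: when a target event is undocumented, take the timestamp of the nearest documented event known to co-occur with it. A sketch with hypothetical event names and a hard-coded preference list; the paper derives its rules from the data rather than fixing them in advance:

```python
# Hypothetical ordered substitutes for a missing "start of anesthesia"
# timestamp, nearest-in-time first.
SUBSTITUTES_FOR_START = ["induction", "intubation", "first_drug_given"]

def substituted_start(record):
    """record maps event name -> timestamp (or None if undocumented)."""
    if record.get("start_of_anesthesia") is not None:
        return record["start_of_anesthesia"]
    for name in SUBSTITUTES_FOR_START:
        if record.get(name) is not None:
            return record[name]          # nearest documented event wins
    return None                          # record remains incomplete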
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System
NASA Astrophysics Data System (ADS)
Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.
2012-12-01
The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison materials from satellites and small bodies into space. The most notable examples of such events are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect those events in visible images. This technique will allow future outer planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. The interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface due to the higher magnitude gradient of the plume or jet when compared with the small, randomly oriented gradients of the textured surface. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that our method was able to distinguish from the background surface and surface features of a body. Dozens of distinct features are identifiable under a variety of viewing conditions and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of images, the ability to differentiate an event from artifacts, and the variety of event appearances in user-classified images. Other important factors include the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets which feed into the overall plume at the south pole of Enceladus). A notable strength of this method is the ability to detect events that do not extend beyond the limb of a planetary body or are adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
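The gradient-based interest-point stage maps naturally onto an off-the-shelf SIFT detector. A sketch using OpenCV; the file name is illustrative, and the mission-specific nearest-neighbour classification against user-classified images is omitted:

```python
import cv2
import numpy as np

img = cv2.imread("io_limb_frame.png", cv2.IMREAD_GRAYSCALE)  # illustrative path
assert img is not None, "provide a raw or calibrated grayscale frame"

# Interest points where local gradient magnitude/orientation is distinctive;
# a low contrast threshold keeps faint plume-like features.
sift = cv2.SIFT_create(contrastThreshold=0.02)
keypoints, descriptors = sift.detectAndCompute(img, None)

# Candidate events: spatial clusters of several keypoints, since textured
# background tends to produce lone, randomly oriented responses.
pts = np.array([kp.pt for kp in keypoints])
```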
Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong
2017-07-05
Identifying safety signals by manual review of individual reports in large surveillance databases is time consuming; such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. Therefore, it may be expected that reporting rates of adverse events following flu vaccines (number of reports for a specific vaccine-event combination/number of reports for all vaccine-event combinations) may vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events between 1990 and 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with type of vaccine, group of adverse events, and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. We showed that it had high statistical power to detect the variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates over the years suggest potential safety issues due to changes in vaccines, which require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity of reporting rates of given vaccine-event combinations across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of the reporting rate across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigations.
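As a first-pass screen for the heterogeneity the authors model, one can test whether the reporting rate of a single vaccine-event combination is homogeneous across years with a plain chi-square test; the paper's random effects model is the more rigorous replacement for this fixed-effect sketch. The counts below are toy values:

```python
import numpy as np
from scipy.stats import chi2

# Per-year counts for one vaccine-event combination (toy data):
# k[i] = reports of this event, n[i] = all reports for the vaccine.
k = np.array([12, 30, 9, 41])
n = np.array([400, 950, 380, 1100])

p = k.sum() / n.sum()                        # pooled reporting rate
stat = np.sum((k - n * p) ** 2 / (n * p * (1 - p)))
pval = chi2.sf(stat, df=len(k) - 1)
print(f"heterogeneity across years: X2 = {stat:.2f}, p = {pval:.3f}")
```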
NASA Astrophysics Data System (ADS)
Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum
2017-04-01
We analyze the relations among the parameters of the moving average method to enhance the event detectability of a phase-sensitive optical time domain reflectometer (OTDR). If external events have a unique vibration frequency, the control parameters of the moving average method should be optimized to detect these events efficiently. A phase-sensitive OTDR was implemented with a pulsed light source, composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier, and a fiber Bragg grating filter, and a light receiving part, which has a photo-detector and a high speed data acquisition system. The moving average method operates with three control parameters: the total number of raw traces, M, the number of averaged traces, N, and the step size of moving, n. The raw traces were obtained by the phase-sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relation among the control parameters is analyzed. As a result, if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
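The three control parameters translate directly into array slicing. A minimal sketch, assuming raw traces of equal length stacked row-wise; the variable names follow the paper's M, N, and n:

```python
import numpy as np

def moving_average_traces(raw, N, n):
    """Average N consecutive traces, advancing the window by n traces;
    raw is an (M, L) array of M raw OTDR traces of L samples each."""
    M, _ = raw.shape
    starts = range(0, M - N + 1, n)
    return np.stack([raw[s:s + N].mean(axis=0) for s in starts])

raw = np.random.randn(1000, 2000)             # M = 1000 traces, 2000 samples
avg = moving_average_traces(raw, N=50, n=10)  # -> (96, 2000) averaged traces
```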
Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just
2018-04-03
Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.
Three-axis asymmetric radiation detector system
Martini, Mario Pierangelo; Gedcke, Dale A.; Raudorf, Thomas W.; Sangsingkeow, Pat
2000-01-01
A three-axis radiation detection system whose inner and outer electrodes are shaped and positioned so that the shortest path between any point on the inner electrode and the outer electrode has a distinct length; the rise time of a pulse derived from a detected radiation event can therefore uniquely define the azimuthal and radial position of that event. The outer electrode is divided into a plurality of segments along the longitudinal axis for locating the axial position of a radiation detection event occurring in the diode.
Method for detecting binding events using micro-X-ray fluorescence spectrometry
Warner, Benjamin P.; Havrilla, George J.; Mann, Grace
2010-12-28
Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.
Solar Demon: near real-time solar eruptive event detection on SDO/AIA images
NASA Astrophysics Data System (ADS)
Kraaikamp, Emil; Verbeeck, Cis
Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, as well as solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.
Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip are efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.
Supervised Time Series Event Detector for Building Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-13
A machine learning based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data and any data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects days with abnormal events, and diagnoses the cause of the events.
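The paper's modified nonlinear Bayesian SVM is not public, but the same profile-level screening can be illustrated with a standard one-class SVM over daily 24-hour consumption vectors; the data below are synthetic:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
profiles = rng.normal(10.0, 1.0, size=(200, 24))  # 200 days x 24 hourly values
profiles[5] += 6.0                                # plant one abnormal day

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(profiles)
abnormal_days = np.flatnonzero(model.predict(profiles) == -1)  # -1 = outlier
print(abnormal_days)                              # includes day 5
```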
NASA Astrophysics Data System (ADS)
Meng, Xiaobo; Chen, Haichao; Niu, Fenglin; Tang, Youcai; Yin, Chen; Wu, Furong
2018-02-01
We introduce an improved matching and locating technique to detect and locate microseismic events (-4 < ML < 0) associated with hydraulic fracturing treatment. We employ a set of representative master events to act as template waveforms and detect slave events that strongly resemble master events through stacking cross correlograms of both P and S waves between the template waveforms and the continuous records of the monitoring array. Moreover, the residual moveout in the cross correlograms across the array is used to locate slave events relative to the corresponding master event. In addition, P wave polarization constraint is applied to resolve the lateral extent of slave events in the case of unfavorable array configuration. We first demonstrate the detectability and location accuracy of the proposed approach with a pseudo-synthetic data set. Compared to the matched filter analysis, the proposed approach can significantly enhance detectability at low false alarm rate and yield robust location estimates of very low SNR events, particularly along the vertical direction. Then, we apply the method to a real microseismic data set acquired in the Weiyuan shale reservoir of China in November of 2014. The expanded microseismic catalog provides more easily interpretable spatiotemporal evolution of microseismicity, which is investigated in detail in a companion paper.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
Discrepancy detection in the retrieval-enhanced suggestibility paradigm.
Butler, Brendon Jerome; Loftus, Elizabeth F
2018-04-01
Retrieval-enhanced suggestibility (RES) refers to the finding that immediately recalling the details of a witnessed event can increase susceptibility to later misinformation. In three experiments, we sought to gain a deeper understanding of the role that retrieval plays in the RES paradigm. Consistent with past research, initial testing did increase susceptibility to misinformation - but only for those who failed to detect discrepancies between the original event and the post-event misinformation. In all three experiments, subjects who retrospectively detected discrepancies in the post-event narratives were more resistant to misinformation than those who did not. In Experiments 2 and 3, having subjects concurrently assess the consistency of the misinformation narratives negated the RES effect. Interestingly, in Experiments 2 and 3, subjects who had retrieval practice and detected discrepancies were more likely to endorse misinformation than control subjects who detected discrepancies. These results call attention to limiting conditions of the RES effect and highlight the complex relationship between retrieval practice, discrepancy detection, and misinformation endorsement.
Automatic detection of snow avalanches in continuous seismic data using hidden Markov models
NASA Astrophysics Data System (ADS)
Heck, Matthias; Hammer, Conny; van Herwijnen, Alec; Schweizer, Jürg; Fäh, Donat
2018-01-01
Snow avalanches generate seismic signals, as do many other mass movements. Detection of avalanches by seismic monitoring is highly relevant to assess avalanche danger. In contrast to other seismic events, signals generated by avalanches do not have a characteristic first arrival, nor is it possible to detect different wave phases. In addition, the moving source character of avalanches increases the intricacy of the signals. Although it is possible to visually detect seismic signals produced by avalanches, reliable automatic detection methods for all types of avalanches do not exist yet. We therefore evaluate whether hidden Markov models (HMMs) are suitable for the automatic detection of avalanches in continuous seismic data. We analyzed data recorded during the winter season 2010 by a seismic array deployed in an avalanche starting zone above Davos, Switzerland. We re-evaluated a reference catalogue containing 385 events by grouping the events into seven probability classes. Since most of the data consist of noise, we first applied a simple amplitude threshold to reduce the amount of data. As first classification results were unsatisfactory, we analyzed the temporal behavior of the seismic signals for the whole data set and found that there is a high variability in the seismic signals. We therefore applied further post-processing steps to reduce the number of false alarms by defining a minimal duration for the detected event, implementing a voting-based approach and analyzing the coherence of the detected events. We obtained the best classification results for events detected by at least five sensors and with a minimal duration of 12 s. These processing steps allowed us to identify two periods of high avalanche activity, suggesting that HMMs are suitable for the automatic detection of avalanches in seismic data. However, our results also showed that more sensitive sensors and more appropriate sensor locations are needed to improve the signal-to-noise ratio of the signals and therefore the classification.
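A minimal stand-in for the HMM stage, using the hmmlearn package on a synthetic one-dimensional feature stream (two hidden states: background and event) plus the study's minimum-duration filter; the real system works on multichannel seismic features, a seven-class catalogue, and array voting:

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 5000)
x[1000:1400] += rng.normal(0.0, 6.0, 400)     # two synthetic "avalanches"
x[3000:3600] += rng.normal(0.0, 6.0, 600)
X = np.abs(x).reshape(-1, 1)                  # crude amplitude feature

model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
model.fit(X)
states = model.predict(X)
event_state = int(np.argmax(model.means_))    # louder state = event

def min_duration_mask(states, event_state, min_len):
    """Keep only event-state runs of at least min_len samples
    (the study uses a 12 s minimum duration)."""
    keep = np.zeros(len(states), bool)
    i = 0
    while i < len(states):
        j = i
        while j < len(states) and states[j] == event_state:
            j += 1
        if j > i and j - i >= min_len:
            keep[i:j] = True
        i = j + 1 if j == i else j
    return keep

detections = min_duration_mask(states, event_state, min_len=600)
```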
Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...
2017-07-14
Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on the Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios, eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
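The passive stage's classifier step can be sketched with a standard decision tree over voltage-derived features. The data below are synthetic; real features would be quantities such as rate of change of frequency, voltage THD, and amplitude deviation at the DG terminal, and ambiguous outputs are what hand over to the active Sandia-frequency-shift stage:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                    # 3 synthetic features/event
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)  # 1 = islanding, 0 = other

clf = DecisionTreeClassifier(max_depth=4).fit(X, y)
proba = clf.predict_proba(X[:1])[0]
# Act only on confident classifications; otherwise trigger the active stage.
decision = ("trip" if proba[1] > 0.9
            else "stay" if proba[0] > 0.9
            else "activate_SFS")
```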
Lee, Young-Sook; Chung, Wan-Young
2012-01-01
Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Abnormal event detection based on a visual sensor, using shape feature variation and 3-D trajectory, is presented to overcome the low fall detection rate. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward falls, backward falls, and falls to the side, from normal activities. PMID:22368486
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities among the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Our algorithm also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
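The extension can be sketched in a few lines: compute the conventional short-term/long-term energy ratio per trace, then correlate the ratios across traces; ratios that rise together indicate an event common to the stations. The window lengths below are illustrative:

```python
import numpy as np

def energy_ratio(x, ns=50, nl=500):
    """Conventional running STA/LTA energy ratio of one trace."""
    sq = np.asarray(x, float) ** 2
    sta = np.convolve(sq, np.ones(ns) / ns, "valid")[nl - ns:]
    lta = np.convolve(sq, np.ones(nl) / nl, "valid")
    return sta / (lta + 1e-12)

def common_event_score(traces, ns=50, nl=500):
    """Mean pairwise correlation of per-trace energy ratios: values near 1
    suggest a regional event seen by all stations, while station-local
    bursts (or pure noise) correlate poorly."""
    R = np.vstack([energy_ratio(t, ns, nl) for t in traces])
    cc = np.corrcoef(R)
    return cc[np.triu_indices(len(traces), k=1)].mean()
```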
Cartan invariants and event horizon detection
NASA Astrophysics Data System (ADS)
Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.
2018-04-01
We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
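The descriptor stage can be illustrated with OpenCV's dense optical flow: compute flow between consecutive frames and histogram its orientations, dropping near-static pixels. A sketch with illustrative parameters; the one-class SVM / kernel PCA training on normal behavior is omitted:

```python
import cv2
import numpy as np

def flow_orientation_histogram(prev_gray, gray, bins=8, min_mag=0.5):
    """Normalized histogram of optical-flow orientations for a frame pair."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    ang = ang[mag > min_mag]                  # drop near-static pixels
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, 2 * np.pi))
    return hist / max(hist.sum(), 1)

prev = np.zeros((120, 160), np.uint8)
curr = np.zeros((120, 160), np.uint8)
prev[40:60, 40:60] = 255
curr[40:60, 48:68] = 255                      # block moved 8 px rightward
print(flow_orientation_histogram(prev, curr))
```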
Improving the Detectability of the Catalan Seismic Network for Local Seismic Activity Monitoring
NASA Astrophysics Data System (ADS)
Jara, Jose Antonio; Frontera, Tànit; Batlló, Josep; Goula, Xavier
2016-04-01
The seismic survey of the territory of Catalonia is mainly performed by the regional seismic network operated by the Cartographic and Geologic Institute of Catalonia (ICGC). After successive deployments and upgrades, the current network consists of 16 permanent stations equipped with 3-component broadband seismometers (STS2, STS2.5, CMG3ESP and CMG3T), 24-bit digitizers (Nanometrics Trident) and VSAT telemetry. Data are continuously sent in real time via the Hispasat 1D satellite to the ICGC datacenter in Barcelona. Additionally, data from 10 stations of neighboring areas (Spain, France and Andorra) have been continuously received since 2011 via Internet or VSAT, contributing both to detecting and to locating events affecting the region. More than 300 local events with Ml ≥ 0.7 are detected and located in the region each year. Nevertheless, small-magnitude earthquakes, especially those located in the south and south-west of Catalonia, may still go undetected by the automatic detection system (DAS), based on Earthworm (USGS). Thus, in order to improve the detection and characterization of these missed events, one or two new stations should be installed. Before deciding where to install these new stations, the performance of each existing station is evaluated from the fraction of catalogued events detected using the station records, relative to the total number of events in the catalogue that occurred during the station operation time from January 1, 2011 to December 31, 2014. These evaluations allow us to build an Event Detection Probability Map (EDPM), a tool required to simulate EDPMs resulting from different network topology scenarios depending on where the new stations are sited, and essential for the decision-making process of increasing and optimizing the event detection probability of the seismic network.
Tackle and impact detection in elite Australian football using wearable microsensor technology.
Gastin, Paul B; McLean, Owen C; Breed, Ray V P; Spittle, Michael
2014-01-01
The effectiveness of a wearable microsensor device (MinimaxX(TM) S4, Catapult Innovations, Melbourne, VIC, Australia) to automatically detect tackles and impact events in elite Australian football (AF) was assessed during four matches. Video observation was used as the criterion measure. A total of 352 tackles were observed, with 78% correctly detected as tackles by the manufacturer's software. Tackles against (i.e. tackled by an opponent) were more accurately detected than tackles made (90% v 66%). Of the 77 tackles that were not detected at all, the majority (74%) were categorised as low-intensity. In contrast, a total of 1510 "tackle" events were detected, with only 18% of these verified as tackles. A further 57% were from contested ball situations involving player contact. The remaining 25% were in general play where no contact was evident; these were significantly lower in peak Player Load™ than those involving player contact (P < 0.01). The tackle detection algorithm, developed primarily for rugby, was not suitable for tackle detection in AF. The underlying sensor data may have the potential to detect a range of events within contact sports such as AF, yet to do so is a complex task and requires sophisticated sport and event-specific algorithms.
Pies, Ross E.
2016-03-29
A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier based on the detection of disturbances within the optical fiber.
NASA Astrophysics Data System (ADS)
Shrestha, Sumeet; Kamehama, Hiroki; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Takeda, Ayaki; Tsuru, Takeshi Go; Arai, Yasuo
2015-08-01
This paper presents a low-noise wide-dynamic-range pixel design for a high-energy particle detector in astronomical applications. A silicon-on-insulator (SOI) based detector is used for the detection of a wide energy range of high-energy particles (mainly X-rays). The sensor has a thin layer of SOI CMOS readout circuitry and a thick layer of high-resistivity detector vertically stacked in a single chip. Pixel circuits are divided into two parts: a signal sensing circuit and an event detection circuit. The event detection circuit, consisting of a comparator and logic circuits that detect the incidence of a high-energy particle, categorizes the incident photon into two energy groups using an appropriate energy threshold and generates a two-bit code for the event and energy level. The code for the energy level is then used to select the gain of the in-pixel amplifier for the detected signal, providing a function of high-dynamic-range signal measurement. The two-bit code for the event and energy level is scanned in the event scanning block and the signals from the hit pixels only are read out. The variable-gain in-pixel amplifier uses a continuous integrator and integration-time control for the variable gain. The proposed design allows small-signal detection and a wide dynamic range thanks to the adaptive gain technique and the capability of correlated double sampling (CDS) for canceling the kTC noise of the charge detector.
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.
Event-Triggered Fault Detection of Nonlinear Networked Systems.
Li, Hongyi; Chen, Ziran; Wu, Ligang; Lam, Hak-Keung; Du, Haiping
2017-04-01
This paper investigates the problem of fault detection for nonlinear discrete-time networked systems under an event-triggered scheme. A polynomial fuzzy fault detection filter is designed to generate a residual signal and detect faults in the system. A novel polynomial event-triggered scheme is proposed to determine the transmission of the signal. A fault detection filter is designed to guarantee that the residual system is asymptotically stable and satisfies the desired performance. Polynomial approximated membership functions obtained by Taylor series are employed for filtering analysis. Furthermore, sufficient conditions are represented in terms of sum of squares (SOSs) and can be solved by SOS tools in MATLAB environment. A numerical example is provided to demonstrate the effectiveness of the proposed results.
Detection and attribution of climate extremes in the observed record
Easterling, David R.; Kunkel, Kenneth E.; Wehner, Michael F.; ...
2016-01-18
We present an overview of practices and challenges related to the detection and attribution of observed changes in climate extremes. Detection is the identification of a statistically significant change in the extreme values of a climate variable over some period of time. Issues in detection discussed include data quality, coverage, and completeness. Attribution takes that detection of a change and uses climate model simulations to evaluate whether a cause can be assigned to that change. Additionally, we discuss a newer field of attribution, event attribution, where individual extreme events are analyzed for the express purpose of assigning some measure of whether that event was directly influenced by anthropogenic forcing of the climate system.
Sjulson, Lucas; Miesenböck, Gero
2007-02-01
Optical imaging of physiological events in real time can yield insights into biological function that would be difficult to obtain by other experimental means. However, the detection of all-or-none events, such as action potentials or vesicle fusion events, in noisy single-trial data often requires a careful balance of tradeoffs. The analysis of such experiments, as well as the design of optical reporters and instrumentation for them, is aided by an understanding of the principles of signal detection. This review illustrates these principles, using as an example action potential recording with optical voltage reporters.
A spatial scan statistic for compound Poisson data.
Rosychuk, Rhonda J; Chang, Hsing-Ming
2013-12-20
The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
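For orientation, the classical (non-compound) Poisson scan statistic evaluates each candidate window by a log-likelihood ratio and takes the maximum over windows, with significance assessed by Monte Carlo replication. A sketch of that building block only; the paper's contribution is replacing the Poisson likelihood with a compound Poisson one so individuals can contribute repeated events, which is not shown here:

```python
import numpy as np

def poisson_scan_llr(c, e, C):
    """Kulldorff log-likelihood ratio for one candidate cluster window:
    c observed events inside, e expected inside, C total events."""
    if c <= e:
        return 0.0                     # only elevated-rate windows count
    return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

# Example: 30 events observed where 15 were expected, out of 200 total.
print(poisson_scan_llr(c=30, e=15.0, C=200))
```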
NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios
NASA Astrophysics Data System (ADS)
Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.
2012-04-01
For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) giving technical advice concerning CTBT verification to their governments. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of the NPE 2010 was on the radionuclide component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections, calculated beforehand in secret by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze the waveform signals of promising candidate events. The study shows one possible solution of NPE 2010, as performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the seismic events considered in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event at Black Thunder Mine, Wyoming, on 23 Oct at 21:15 UTC showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. A one-station infrasound localization algorithm yielded localization results comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, an additional confirmation that the event was correctly identified. The event analysis shown here for NPE 2010 serves as a successful example of data fusion between radionuclide detection technology supported by ATM, seismological methodology, and infrasound signal processing.
Automated Detection of Surgical Adverse Events from Retrospective Clinical Data
ERIC Educational Resources Information Center
Hu, Zhen
2017-01-01
The Detection of surgical adverse events has become increasingly important with the growing demand for quality improvement and public health surveillance with surgery. Event reporting is one of the key steps in determining the impact of postoperative complications from a variety of perspectives and is an integral component of improving…
Human visual system-based smoking event detection
NASA Astrophysics Data System (ADS)
Odetallah, Amjad D.; Agaian, Sos S.
2012-06-01
Human action analysis (e.g. smoking, eating, and phoning) is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact on raising the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new scheme for a system detecting human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events and uncertain actions with various cigarette sizes, colors, and shapes.
Method and apparatus for data sampling
Odell, Daniel M. C.
1994-01-01
A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high-speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high-speed sampling and digital conversion are performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provide for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
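The discrimination step amounts to keeping only samples inconsistent with baseline noise. A toy sketch with a simple k-sigma criterion; the patented discriminator is not specified in the abstract, so the rule below is purely illustrative:

```python
import numpy as np

def event_sample_indices(samples, k=5.0, quiet=1000):
    """Indices of digitized samples that likely belong to detector pulses:
    more than k noise-sigmas above a baseline estimated from a quiet
    stretch at the start of the record."""
    base = samples[:quiet].mean()
    sigma = samples[:quiet].std()
    return np.flatnonzero(samples > base + k * sigma)

rng = np.random.default_rng(3)
samples = rng.normal(0.0, 1.0, 20000)
samples[8000:8005] += 30.0                  # one fast detector pulse
print(event_sample_indices(samples))        # indices near 8000
```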
Wireless battery management control and monitoring system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zumstein, James M.; Chang, John T.; Farmer, Joseph C.
A battery management system using a sensor inside the battery. The sensor enables monitoring and detection of various events in the battery and transmission of a signal from the sensor through the battery casing to a control and data acquisition module by wireless transmission. The detection of threshold events in the battery enables remedial action to be taken to avoid catastrophic events.
Bates, Jonathan; Parzynski, Craig S; Dhruva, Sanket S; Coppi, Andreas; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Shaw, Richard E; Warner, Frederick; Krumholz, Harlan M; Ross, Joseph S
2018-06-12
To estimate medical device utilization needed to detect safety differences among implantable cardioverter-defibrillator (ICD) generator models and compare these estimates to utilization in practice. We conducted repeated sample size estimates to calculate the medical device utilization needed, systematically varying device-specific safety event rate ratios and significance levels while maintaining 80% power, testing 3 average adverse event rates (3.9, 6.1, and 12.6 events per 100 person-years) estimated from the American College of Cardiology's 2006 to 2010 National Cardiovascular Data Registry of ICDs. We then compared these estimates with actual medical device utilization. At significance level 0.05 and 80% power, 34% or fewer ICD models accrued sufficient utilization in practice to detect safety differences for rate ratios <1.15 and an average event rate of 12.6 events per 100 person-years. For average event rates of 3.9 and 12.6 events per 100 person-years, 30% and 50% of ICD models, respectively, accrued sufficient utilization for a rate ratio of 1.25, whereas 52% and 67% did for a rate ratio of 1.50. Because actual ICD utilization was not uniformly distributed across ICD models, the proportion of individuals receiving any ICD that accrued sufficient utilization in practice was 0% to 21%, 32% to 70%, and 67% to 84% for rate ratios of 1.05, 1.15, and 1.25, respectively, for the range of 3 average adverse event rates. Small safety differences among ICD generator models are unlikely to be detected through routine surveillance given current ICD utilization in practice, but large safety differences can be detected for most patients at anticipated average adverse event rates. Copyright © 2018 John Wiley & Sons, Ltd.
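The underlying calculation is a standard two-sample Poisson-rate power computation: the person-time needed per device model grows with the baseline event rate and shrinks rapidly as the detectable rate ratio grows. A generic normal-approximation sketch, not the paper's exact procedure:

```python
from scipy.stats import norm

def person_years_per_model(rate, ratio, alpha=0.05, power=0.80):
    """Approximate person-years per ICD model needed to detect a rate
    ratio `ratio` against a baseline `rate` (events per person-year),
    two-sided test, normal approximation for comparing two Poisson rates."""
    l1, l2 = rate, rate * ratio
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (za + zb) ** 2 * (l1 + l2) / (l1 - l2) ** 2

# e.g. baseline 12.6 events per 100 person-years, rate ratio 1.15:
print(round(person_years_per_model(0.126, 1.15)))   # ~5950 person-years
```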
Design and Deployment of a Pediatric Cardiac Arrest Surveillance System
Duval-Arnould, Jordan Michel; Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M.; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne
2018-01-01
Objective: We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. Materials and Methods: We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. Results: From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 requiring CPR. The PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3% of events. 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts. Performance data from both the defibrillator and bedside monitor increased annually (2013: 1%; 2014: 18%; 2015: 27%). Discussion: After deployment of this system, detection increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50-70% of PICU, NICU, and PEDS-ED events would have been missed. Conclusion: By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest with an associated increase in the proportion of captured objective performance data is possible. PMID:29854451
ON THE FERMI-GBM EVENT 0.4 s AFTER GW150914
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greiner, J.; Yu, H.-F.; Burgess, J. M.
In view of the recent report by Connaughton et al., we analyze continuous time-tagged event (TTE) data of the Fermi Gamma-ray Burst Monitor (GBM) around the time of the gravitational-wave event GW150914. We find that, after proper accounting for low-count statistics, the GBM transient event 0.4 s after GW150914 is likely not due to an astrophysical source, but is consistent with a background fluctuation, removing the tension between the INTEGRAL/ACS non-detection and GBM. Additionally, reanalysis of other short GRBs shows that without proper statistical modeling the fluence of faint events is over-predicted, as verified for some joint GBM–ACS detections of short GRBs. We detail the statistical procedure to correct these biases. As a result, faint short GRBs, verified by ACS detections, with significances in the broadband light curve even smaller than that of the GBM–GW150914 event are recovered as proper non-zero sources, while the GBM–GW150914 event is consistent with zero fluence.
Analyzing time-ordered event data with missed observations.
Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P
2017-09-01
A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated values. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and to spurious rate differences between sites that are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
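The mechanism the abstract describes can be illustrated with a small simulation: if each event is missed with probability p, observed intervals are sums of consecutive true intervals, and the naive mean interval is inflated by roughly 1/(1 - p). This toy simulation is not the authors' estimator, which works with the full interval density.

```python
# Hedged illustration of the missed-observation bias described above.
import numpy as np

rng = np.random.default_rng(1)
p_miss, true_mean = 0.3, 10.0                        # minutes between events
times = np.cumsum(rng.exponential(true_mean, 100_000))
observed = times[rng.random(times.size) >= p_miss]   # each event seen w.p. 0.7

naive = np.diff(observed).mean()
print(f"naive mean interval: {naive:.1f} (true {true_mean})")
print(f"corrected: {naive * (1 - p_miss):.1f}")      # multiply by detection prob.
```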
Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.
Stone, Scott A; Tata, Matthew S
2017-01-01
Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of the stimulus. Having access to related coincident visual and auditory information can help with spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and rendering them as localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible. PMID:28792518
Effects of shifts in the rate of repetitive stimulation on sustained attention
NASA Technical Reports Server (NTRS)
Krulewitz, J. E.; Warm, J. S.; Wohl, T. H.
1975-01-01
The effects of shifts in the rate of presentation of repetitive neutral events (background event rate) were studied in a visual vigilance task. Four groups of subjects experienced either a high (21 events/min) or a low (6 events/min) event rate for 20 min and then experienced either the same or the alternate event rate for an additional 40 min. The temporal occurrence of critical target signals was identical for all groups, irrespective of event rate. The density of critical signals was 12 signals/20 min. By the end of the session, shifts in event rate were associated with changes in performance which resembled contrast effects found in other experimental situations in which shift paradigms were used. Relative to constant event rate control conditions, a shift from a low to a high event rate depressed the probability of signal detections, while a shift in the opposite direction enhanced the probability of signal detections.
Could a multi-PeV neutrino event have as origin the internal shocks inside the GRB progenitor star?
NASA Astrophysics Data System (ADS)
Fraija, N.
2016-03-01
The IceCube Collaboration initially reported the detection of 37 extraterrestrial neutrinos in the TeV-PeV energy range. The reconstructed neutrino events were obtained during three consecutive years of data taking, from 2010 to 2013. Although these events have been argued to have an extragalactic origin, they have not been correlated with any known source. Recently, the IceCube Collaboration reported a neutrino-induced muon event with an energy of 2.6 ± 0.3 PeV, the highest-energy event ever detected. The reconstructed direction of this event, detected on June 11, 2014 (J2000.0: R.A. = 110.34°, Dec. = 11.48°), does not match any familiar source either. Long gamma-ray bursts (lGRBs) are usually associated with the core collapse of massive stars that drives relativistic collimated jets inside the stars, with high-energy neutrino production. These neutrinos have been linked to the 37 events previously detected by the IceCube experiment. In this work, we explore the conditions and parameter values under which the highest-energy neutrino recently detected could be generated by proton-photon and proton-hadron interactions at internal shocks inside the lGRB progenitor star and then detected by the IceCube experiment. Considering that internal shocks take place in a relativistic collimated jet whose (half) opening angle is θ0 ∼ 0.1, we found that lGRBs with total luminosity L ≲ 10^48 erg/s and internal shocks at the surface of progenitors such as Wolf-Rayet (WR) and blue supergiant (BSG) stars favor this multi-PeV neutrino production, although this neutrino could also be associated with L ∼ 10^50.5 (∼10^50) erg/s provided that the internal shocks occur at ∼10^9 (∼10^10.2) cm for a WR (BSG).
Real-time detection and classification of anomalous events in streaming data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.
2016-04-19
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
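A minimal sketch of the scoring idea, assuming a stand-in baseline model: events are scored by their negative log-probability under a distribution fitted to typical traffic, so low-probability events receive high anomaly scores, and a simple rule stands in for the classification stage. All models and thresholds here are illustrative, not the patent's.

```python
# Hedged sketch: anomaly score = improbability under a baseline model, then a
# toy rule classifies high-scoring events as "of interest".
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
baseline = rng.normal(0, 1, (5000, 2))            # typical traffic features
model = multivariate_normal(baseline.mean(0), np.cov(baseline.T))

def anomaly_score(event):
    return -model.logpdf(event)                   # low probability -> high score

stream = [np.array([0.1, -0.3]), np.array([6.0, 5.5])]
for ev in stream:
    s = anomaly_score(ev)
    print(ev, round(s, 1), "of interest" if s > 10 else "typical")
```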
Semantic Context Detection Using Audio Event Fusion
NASA Astrophysics Data System (ADS)
Chu, Wei-Ta; Cheng, Wen-Huang; Wu, Ja-Ling
2006-12-01
Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine (SVM)) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.
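A hedged sketch of the audio-event level of such a system, using the third-party hmmlearn package: one Gaussian HMM is trained per event class on feature sequences, and a new clip is assigned to the class whose model gives the highest log-likelihood. The toy features stand in for the paper's audio features.

```python
# Hedged sketch of HMM-based audio-event classification; data are placeholders.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(3)

def train_event_model(feature_seqs, n_states=4):
    X = np.vstack(feature_seqs)
    lengths = [len(s) for s in feature_seqs]
    m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    m.fit(X, lengths)
    return m

# Toy "MFCC-like" sequences for two event classes with different means.
gunshot = [rng.normal(2.0, 1.0, (80, 13)) for _ in range(10)]
engine = [rng.normal(-1.0, 1.0, (80, 13)) for _ in range(10)]
models = {"gunshot": train_event_model(gunshot), "engine": train_event_model(engine)}

clip = rng.normal(2.0, 1.0, (80, 13))             # unknown clip, gunshot-like
print(max(models, key=lambda k: models[k].score(clip)))   # -> "gunshot"
```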
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodge, D. A.; Harris, D. B.
2016-03-15
Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location, and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework, a system which autonomously creates correlation detectors from event waveforms detected by power detectors, and reports observed performance on a network of arrays in terms of efficiency. We performed a large-scale test of dynamic correlation processors on an 11 terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges, approaching 70% for near-regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the number of correlators in an autonomous system can grow into the hundreds of thousands.
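A minimal sketch of a single-channel correlation detector of the kind these systems bank in large numbers: a template event is slid along continuous data and a detection is declared wherever the normalized correlation exceeds a threshold. Waveforms and the threshold are illustrative assumptions.

```python
# Hedged sketch of a one-channel correlation (matched-waveform) detector.
import numpy as np

def correlate_detect(data, template, threshold=0.8):
    """Normalized cross-correlation of a template against continuous data."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(data) - n):
        w = data[i:i + n]
        cc = np.dot(t, (w - w.mean()) / (w.std() + 1e-12))
        if cc > threshold:
            detections.append((i, round(cc, 2)))
    return detections

rng = np.random.default_rng(4)
template = np.sin(np.linspace(0, 20, 200)) * np.hanning(200)   # stand-in event
stream = rng.normal(0, 0.2, 5000)
stream[3000:3200] += template                                   # buried repeat
print(correlate_detect(stream, template)[:3])                   # hits near i=3000
```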
Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference
NASA Astrophysics Data System (ADS)
Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.
2014-12-01
The well-studied Wenchuan aftershock sequence triggered by the May 12, 2008, Ms 8.0 mainshock offers an ideal test case for evaluating the effectiveness of using waveform correlation coupled with double-difference relocation to detect and locate events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then take the detections and relocate them using the double-difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one with waveform correlation to find even more events. We benchmark our results against previously published relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events with precision at the level of a few hundred meters, and even tens of meters.
NASA Astrophysics Data System (ADS)
Kornhuber, K.; Petoukhov, V.; Petri, S.; Rahmstorf, S.; Coumou, D.
2017-09-01
Several recent northern hemisphere summer extremes have been linked to persistent high-amplitude wave patterns (e.g., heat waves in Europe 2003, Russia 2010, and the US 2011; floods in Pakistan 2010 and Europe 2013). Recently, quasi-resonant amplification (QRA) was proposed as a mechanism that, when certain dynamical conditions are fulfilled, can lead to such high-amplitude wave events. Based on these resonance conditions, a detection scheme to scan reanalysis data for QRA events in boreal summer months was implemented. With this objective detection scheme we analyzed the occurrence and duration of QRA events and the associated atmospheric flow patterns in 1979-2015 reanalysis data. We detect a total of 178 events for waves 6, 7, and 8 and find that during roughly one-third of all high-amplitude events QRA conditions were met for the respective waves. Our analysis reveals a significant shift for quasi-stationary waves 6 and 7 towards high amplitudes during QRA events, lagging first QRA detection by typically one week. The results provide further evidence for the validity of the QRA hypothesis and its important role in generating high-amplitude waves in boreal summer.
Improved Detection of Local Earthquakes in the Vienna Basin (Austria), using Subspace Detectors
NASA Astrophysics Data System (ADS)
Apoloner, Maria-Theresia; Caffagni, Enrico; Bokelmann, Götz
2016-04-01
The Vienna Basin in Eastern Austria is densely populated and highly developed; it is also a region of low to moderate seismicity, yet the seismological network coverage is relatively sparse. This demands improving our earthquake detection capability by testing new methods and enlarging the existing local earthquake catalogue, which contributes to imaging tectonic fault zones and to better understanding seismic hazard, also through improved earthquake statistics (b-value, magnitude of completeness). Detection of low-magnitude earthquakes, or events whose largest amplitudes only slightly exceed the noise level, may be possible using standard methods like the short-term over long-term average (STA/LTA). However, due to sparse network coverage and high background noise, such a technique may not detect all potentially recoverable events. Yet earthquakes originating from the same source region, relatively close to each other, should produce similar seismic waveforms at a given station. This waveform similarity can be exploited by specific techniques such as correlation-template methods (also known as matched filtering) or subspace detection methods (based on subspace theory). Matching techniques require a reference or template event, usually characterized by high waveform coherence across the array receivers and high signal-to-noise ratio (SNR), which is cross-correlated with the continuous data. Subspace detection methods, by contrast, avoid the need to define a single template event and instead use a subspace extracted from multiple events. This approach should, in principle, be more robust in detecting signals that exhibit strong variability (e.g., because of source or magnitude). In this study we scan the continuous data recorded in the Vienna Basin with a subspace detector to identify additional events. This will allow us to estimate the increase of the seismicity rate in the local earthquake catalogue, providing an evaluation of network performance and of the efficiency of the method.
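A hedged sketch of the subspace idea under stated assumptions: a few aligned template events are reduced to an orthonormal basis by SVD, and the detection statistic for each sliding window is the fraction of window energy captured by that basis, so waveforms resembling any combination of the templates score highly.

```python
# Hedged sketch of a subspace detector; dimensions and data are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n = 200
base = np.sin(np.linspace(0, 15, n)) * np.hanning(n)
templates = np.array([base * a + rng.normal(0, 0.05, n) for a in (0.8, 1.0, 1.2)])

# Orthonormal basis spanning the template variability (keep d singular vectors).
d = 2
U = np.linalg.svd(templates.T, full_matrices=False)[0][:, :d]

def subspace_stat(window):
    w = window - window.mean()
    proj = U.T @ w
    return (proj @ proj) / (w @ w + 1e-12)        # energy fraction in subspace

stream = rng.normal(0, 0.2, 4000)
stream[2500:2700] += 1.1 * base                   # new event, similar waveform
stats = [subspace_stat(stream[i:i + n]) for i in range(len(stream) - n)]
print(int(np.argmax(stats)), round(max(stats), 2))   # peak near index 2500
```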
Development of an algorithm for automatic detection and rating of squeak and rattle events
NASA Astrophysics Data System (ADS)
Chandrika, Unnikrishnan Kuttan; Kim, Jay H.
2010-10-01
A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL), which approximates the human perception of a transient noise. At first, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components to take advantage of the high signal-to-noise ratio in the high frequency range. An S&R event is identified when the value of the PTL time history exceeds a detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and a minimum possibility of false alarms.
Vertically Integrated Seismological Analysis II : Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / [π(x)q(x′ | x)]). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, each of which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or by human analysts.
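The quoted accept/reject rule can be illustrated with a minimal Metropolis-Hastings sampler on a toy one-dimensional target rather than the seismic world-state model; with a symmetric proposal, the acceptance ratio reduces to π(x′)/π(x).

```python
# Hedged, minimal Metropolis-Hastings sketch of the rule quoted above.
import math
import random

def target(x):                       # unnormalized toy posterior pi(x)
    return math.exp(-0.5 * (x - 3.0) ** 2) + 0.5 * math.exp(-0.5 * (x + 2.0) ** 2)

random.seed(6)
x, samples = 0.0, []
for _ in range(50_000):
    xp = x + random.gauss(0.0, 1.0)  # symmetric proposal: q(x'|x) = q(x|x')
    alpha = min(1.0, target(xp) / target(x))
    if random.random() < alpha:      # accept the move with probability alpha
        x = xp
    samples.append(x)

print(sum(samples) / len(samples))   # close to the mixture mean (~1.33)
```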
A model of human decision making in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1982-01-01
Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
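A rough sketch of the modeling idea, using scikit-learn's linear discriminant analysis as a stand-in for the paper's formulation: displayed process features are mapped to an estimated probability that an event has occurred. The features and data are invented for illustration.

```python
# Hedged sketch: discriminant analysis producing P(event) from display features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
# Toy monitoring features (e.g., mean level and drift of a displayed process).
normal = rng.normal([0.0, 0.0], 0.5, (300, 2))
event = rng.normal([1.5, 1.0], 0.5, (300, 2))
X = np.vstack([normal, event])
y = np.array([0] * 300 + [1] * 300)

lda = LinearDiscriminantAnalysis().fit(X, y)
obs = np.array([[1.2, 0.8], [0.1, -0.2]])
print(lda.predict_proba(obs)[:, 1])   # P(event occurred) for each observation
```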
Pacurariu, Alexandra C; Straus, Sabine M; Trifirò, Gianluca; Schuemie, Martijn J; Gini, Rosa; Herings, Ron; Mazzaglia, Giampiero; Picelli, Gino; Scotti, Lorenza; Pedersen, Lars; Arlett, Peter; van der Lei, Johan; Sturkenboom, Miriam C; Coloma, Preciosa M
2015-12-01
Spontaneous reporting systems (SRSs) remain the cornerstone of post-marketing drug safety surveillance despite their well-known limitations. Judicious use of other available data sources is essential to enable better detection, strengthening and validation of signals. In this study, we investigated the potential of electronic healthcare records (EHRs) to be used alongside an SRS as an independent system, with the aim of improving signal detection. A signal detection strategy, focused on a limited set of adverse events deemed important in pharmacovigilance, was performed retrospectively in two data sources, (1) the Exploring and Understanding Adverse Drug Reactions (EU-ADR) database network and (2) the EudraVigilance database, using data between 2000 and 2010. Five events were considered for analysis: (1) acute myocardial infarction (AMI); (2) bullous eruption; (3) hip fracture; (4) acute pancreatitis; and (5) upper gastrointestinal bleeding (UGIB). Potential signals identified in each system were verified using the current published literature. The complementarity of the two systems to detect signals was expressed as the percentage of the unilaterally identified signals out of the total number of confirmed signals. As a proxy for the associated costs, the number of signals that needed to be reviewed to detect one true signal (number needed to detect [NND]) was calculated. The relationship between the background frequency of the events and the capability of each system to detect signals was also investigated. The contribution of each system to signal detection appeared to be correlated with the background incidence of the events, being directly proportional to the incidence in EU-ADR and inversely proportional in EudraVigilance. EudraVigilance was particularly valuable in identifying bullous eruption and acute pancreatitis (71 and 42 % of signals were correctly identified from the total pool of known associations, respectively), while EU-ADR was most useful in identifying hip fractures (60 %). Both systems contributed reasonably well to identification of signals related to UGIB (45 % in EudraVigilance, 40 % in EU-ADR) but only fairly for signals related to AMI (25 % in EU-ADR, 20 % in EudraVigilance). The costs associated with detection of signals were variable across events; however, it was often more costly to detect safety signals in EU-ADR than in EudraVigilance (median NNDs: 7 versus 5). An EHR-based system may have additional value for signal detection, alongside already established systems, especially in the presence of adverse events with a high background incidence. While the SRS appeared to be more cost effective overall, for some events the costs associated with signal detection in the EHR might be justifiable.
NASA Astrophysics Data System (ADS)
Wyer, P.; Zurek, B.
2017-12-01
Extensive additions to the Royal Dutch Meteorological Institute (KNMI) seismic monitoring network over recent years have yielded corresponding gains in detection of low magnitude seismicity induced by production of the Groningen gas field. A review of the weakest events in the seismic catalog demonstrates that waveforms from individual stations in the 30 x 35 km network area overlap sufficiently for normalized analytic envelopes to be constructively stacked without compensation for moveout, detection of individual station triggers or the need for more advanced approaches such as template matching. This observation opens the possibility of updating the historical catalog to current detection levels without having to implement more computationally expensive steps when reprocessing the legacy continuous data. A more consistent long term catalog would better constrain the frequency-size distribution (Gutenberg-Richter relationship) and provide a richer dataset for calibration of geomechanical and seismological models. To test the viability of a direct stacking approach, normalized waveform envelopes are partitioned by station into two discrete RMS stacks. Candidate seismic events are then identified as simultaneous STA/LTA triggers on both stacks. This partitioning has a minor impact on signal, but avoids the majority of false detections otherwise obtained on a single stack. Undesired detection of anthropogenic sources and earthquakes occurring outside the field can be further minimized by tuning the waveform frequency filters and trigger configuration. After minimal optimization, data from as few as 14 legacy stations are sufficient for robust automatic detection of known events approaching ML0 from the recent catalog. Ongoing work will determine residual false detection rates and whether previously unknown past events can be detected with sensitivities comparable to the modern KNMI catalog.
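A hedged sketch of the stacking detector described above, on synthetic data: per-station analytic envelopes (via the Hilbert transform) are normalized and summed without moveout correction, and an STA/LTA trigger is run on the stack. Station count, window lengths, and waveforms are illustrative.

```python
# Hedged sketch of envelope stacking plus an STA/LTA trigger; data are synthetic.
import numpy as np
from scipy.signal import hilbert

def sta_lta(x, ns, nl):
    """Ratio of the short-term average (last ns samples) to the preceding long-term average."""
    c = np.concatenate(([0.0], np.cumsum(np.abs(x))))
    out = np.zeros(len(x))
    for i in range(ns + nl, len(x) + 1):
        sta = (c[i] - c[i - ns]) / ns
        lta = (c[i - ns] - c[i - ns - nl]) / nl
        out[i - 1] = sta / (lta + 1e-12)
    return out

rng = np.random.default_rng(8)
n_sta, n = 14, 20_000
event = np.sin(np.linspace(0, 60, 400)) * np.hanning(400)
stack = np.zeros(n)
for _ in range(n_sta):
    trace = rng.normal(0, 1.0, n)
    trace[9000:9400] += 1.5 * event          # weak arrival, no moveout applied
    env = np.abs(hilbert(trace))             # analytic envelope
    stack += env / env.std()                 # normalize, then stack

ratio = sta_lta(stack, ns=100, nl=2000)
print(int(np.argmax(ratio)), round(float(ratio.max()), 2))  # trigger near 9000
```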
FORTE Compact Intra-cloud Discharge Detection parameterized by Peak Current
NASA Astrophysics Data System (ADS)
Heavner, M. J.; Suszcynsky, D. M.; Jacobson, A. R.; Heavner, B. D.; Smith, D. A.
2002-12-01
The Los Alamos Sferic Array (EDOT) has recorded over 3.7 million lightning-related fast electric field change data records during April 1 - August 31, 2001 and 2002. The events were detected by three or more stations, allowing for differential-time-of-arrival location determination. The waveforms are characterized with estimated peak currents as well as by event type. Narrow Bipolar Events (NBEs), the VLF/LF signature of Compact Intra-cloud Discharges (CIDs), are generally isolated pulses with identifiable ionospheric reflections, permitting determination of event source altitudes. We briefly review the EDOT characterization of events. The FORTE satellite observes Trans-Ionospheric Pulse Pairs (TIPPs, the VHF satellite signature of CIDs). The subset of coincident EDOT and FORTE CID observations are compared with the total EDOT CID database to characterize the VHF detection efficiency of CIDs. The NBE polarity and altitude are also examined in the context of FORTE TIPP detection. The parameter-dependent detection efficiencies are extrapolated from FORTE orbit to GPS orbit in support of the V-GLASS effort (GPS based global detection of lightning).
Bounds on the minimum number of recombination events in a sample history.
Myers, Simon R; Griffiths, Robert C
2003-01-01
Recombination is an important evolutionary factor in many organisms, including humans, and understanding its effects is an important task facing geneticists. Detecting past recombination events is thus important; this article introduces statistics that give a lower bound on the number of recombination events in the history of a sample, on the basis of the patterns of variation in the sample DNA. Such lower bounds are appropriate, since many recombination events in the history are typically undetectable, so the true number of historical recombinations is unobtainable. The statistics can be calculated quickly by computer and improve upon the earlier bound of Hudson and Kaplan (1985). A method is developed to combine bounds on local regions in the data to produce more powerful improved bounds. The method is flexible with respect to different models of recombination occurrence. The approach gives recombination event bounds between all pairs of sites, to help identify regions with more detectable recombinations, and these bounds can be viewed graphically. Under coalescent simulations, there is a substantial improvement over the earlier method (of up to a factor of 2) in the expected number of recombination events detected by one of the new minima, across a wide range of parameter values. The method is applied to data from a region within the lipoprotein lipase gene and the amount of detected recombination is substantially increased. Further, there is strong clustering of detected recombination events in an area near the center of the region. A program implementing these statistics, which was used for this article, is available from http://www.stats.ox.ac.uk/mathgen/programs.html. PMID:12586723
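For orientation, the sketch below implements the earlier Hudson and Kaplan (1985) style bound that this article improves upon: the four-gamete test marks site pairs requiring at least one recombination between them, and disjoint marked intervals are counted greedily. It is a simplified illustration, not the authors' new statistics.

```python
# Hedged sketch of the classic four-gamete lower bound on recombination events.
import numpy as np

def four_gamete_pairs(haps):
    """Return site pairs (i, j) showing all four gametes (0/1 binary data)."""
    n_sites = haps.shape[1]
    pairs = []
    for i in range(n_sites):
        for j in range(i + 1, n_sites):
            gametes = {tuple(g) for g in haps[:, [i, j]]}
            if len(gametes) == 4:
                pairs.append((i, j))
    return pairs

def rm_lower_bound(haps):
    """Greedy count of disjoint incompatible intervals (Hudson & Kaplan 1985 style)."""
    rm, last_end = 0, -1
    for i, j in sorted(four_gamete_pairs(haps), key=lambda p: p[1]):
        if i >= last_end:             # interval disjoint from those already chosen
            rm += 1
            last_end = j
    return rm

haps = np.array([[0, 0, 0, 1],
                 [0, 1, 1, 0],
                 [1, 0, 1, 1],
                 [1, 1, 0, 0]])
print(rm_lower_bound(haps))           # -> 3 recombinations at minimum here
```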
OGLE-2017-BLG-1130: The First Binary Gravitational Microlens Detected from Spitzer Only
NASA Astrophysics Data System (ADS)
Wang, Tianshu; Calchi Novati, S.; Udalski, A.; Gould, A.; Mao, Shude; Zang, W.; Beichman, C.; Bryden, G.; Carey, S.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Yee, J. C.; Spitzer Team; Mróz, P.; Poleski, R.; Skowron, J.; Szymański, M. K.; Soszyński, I.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Albrow, M. D.; Chung, S.-J.; Han, C.; Hwang, K.-H.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Zhu, W.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Lee, C.-U.; Lee, D.-J.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration
2018-06-01
We analyze the binary gravitational microlensing event OGLE-2017-BLG-1130 (mass ratio q ∼ 0.45), the first published case in which the binary anomaly was detected only by the Spitzer Space Telescope. This event provides strong evidence that some binary signals can be missed by observations from the ground alone but detected by Spitzer. We therefore invert the normal procedure, first finding the lens parameters by fitting the space-based data and then measuring the microlensing parallax using ground-based observations. We also show that the normal four-fold space-based degeneracy in the single-lens case can become a weak eight-fold degeneracy in binary-lens events. Although this degeneracy is resolved in event OGLE-2017-BLG-1130, it might persist in other events.
Observations of a hydrofracture induced earthquake sequence in Harrison County Ohio in 2014
NASA Astrophysics Data System (ADS)
Friberg, P. A.; Brudzinski, M. R.; Currie, B. S.; Skoumal, R.
2015-12-01
On October 7, 2014, a Mw 1.9 earthquake was detected and located using the IRIS Earthscope Transportable Array stations in Ohio. The earthquake was located at a depth of ~3 km near the interface of the Paleozoic sedimentary rocks with the crystalline Precambrian basement. The location is within a few kilometers laterally of a 2013 earthquake sequence that was linked to hydraulic fracturing (HF) operations on three wells in Harrison County (Friberg et al., 2014). Using the Mw 1.9 event as a template in a multi-component cross-correlation detector on station O53A, over 1000 matching detections were revealed between September 26 and October 17, 2014. These detections were all coincident in time with HF operations on 3 nearby (<1 km away) horizontally drilled wells (Tarbert 1H, 3H, and 5H) in the Utica formation (~2.4 km depth). The HF operations at two of the wells (1H and 5H) were coincident with the majority of the detected events. The final well stimulated in the series (3H) produced only about 20 identified events. In addition to the coincident timing with nearby HF operations, the time-clustered nature of the detections was similar to the 2013 sequence and to two other Ohio HF-induced sequences in 2014 (Skoumal et al., 2015). All of the other HF-induced earthquake sequences in Ohio were related to operations in the Utica formation. Interestingly, this sequence of earthquakes did not follow a simple Gutenberg-Richter magnitude-frequency relationship and was deficient in positive-magnitude events; the magnitude 1.9 was preceded by a magnitude 1.7, and only a half-dozen events were slightly above magnitude 0.0. The majority of the events detected were below magnitude 0.0, some as low as magnitude -2.0. While the majority of detections are too small to locate, high similarity in waveform character indicates they are spatially close to the magnitude 1.9 event. Furthermore, gradual shifts in P-phase arrivals relative to S phases indicate events are moving away from the station with progressive HF stages. Given the orientation of the wells relative to the station, the migration of events away from the station with progressive stages further supports this being an induced sequence.
NASA Astrophysics Data System (ADS)
Bartlow, N. M.
2017-12-01
Slow Earthquake Hunters is a new citizen science project to detect, catalog, and monitor slow slip events. Slow slip events, also called "slow earthquakes", occur when faults slip too slowly to generate significant seismic radiation. They typically take between a few days and over a year to occur, and are most often found on subduction zone plate interfaces. While not dangerous in and of themselves, recent evidence suggests that monitoring slow slip events is important for earthquake hazard assessment, as slow slip events have been known to trigger damaging "regular" earthquakes. Because they do not radiate seismically, slow slip events are detected with a variety of methods, most commonly continuous geodetic Global Positioning System (GPS) stations. There is now a wealth of GPS data in some regions that experience slow slip events, but a reliable automated method to detect them in GPS data remains elusive. This project aims to recruit human users to view GPS time series data, with some post-processing to highlight slow slip signals, and to flag slow slip events for further analysis by the scientific team. Slow Earthquake Hunters will begin with data from the Cascadia subduction zone, where geodetically detectable slow slip events with a duration of at least a few days recur at regular intervals. The project will then expand to other areas with slow slip events or other transient geodetic signals, including other subduction zones and areas with strike-slip faults. This project has not yet been rolled out to the public and is in a beta testing phase. This presentation will show results from an initial pilot group of student participants at the University of Missouri and solicit feedback for the future of Slow Earthquake Hunters.
The hard X-ray burst spectrometer event listing 1980-1987
NASA Technical Reports Server (NTRS)
Dennis, B. R.; Orwig, L. E.; Kiplinger, A. L.; Schwartz, R. A.; Gibson, B. R.; Kennard, G. S.; Tolbert, A. K.; Biesecker, D. A.; Labow, G. J.; Shaver, A.
1988-01-01
This event listing is a comprehensive reference for the hard X-ray bursts detected with the Hard X-ray Burst Spectrometer on the Solar Maximum Mission from the time of launch on 14 February 1980 to December 1987. Over 8600 X-ray events were detected in the energy range from 30 to approximately 600 keV, with the vast majority being solar flares. The listing includes the start time, peak time, duration, and peak rate of each event.
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to solve many problems in climate science and geoscience using massive-scale observed and modeled data. For extreme climate event detection, several models based on deep neural networks have been recently proposed and attain superior performance that overshadows all previous handcrafted, expert-based methods. The issue arising, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two models using deep neural networks: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel-recursive super-resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we have presented two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel-recursive super-resolution model reconstructs the resolution of the input to the localization CNN. We present a network, based on the pixel-recursive super-resolution model, that synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. This approach therefore not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process used to increase the resolution of data.
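A minimal sketch of the detection component under stated assumptions: a small CNN that classifies coarse gridded climate patches as event or no-event. The architecture, channel names, and patch size are placeholders, not the authors' networks.

```python
# Hedged sketch of a small CNN patch classifier for coarse climate fields.
import torch
import torch.nn as nn

class EventCNN(nn.Module):
    def __init__(self, n_channels=4):          # e.g., wind, pressure, humidity...
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, 2)    # event / no-event logits

    def forward(self, x):                       # x: (batch, channels, 32, 32)
        z = self.features(x)
        return self.head(z.flatten(1))

model = EventCNN()
patch = torch.randn(8, 4, 32, 32)               # batch of reanalysis patches
print(model(patch).shape)                       # -> torch.Size([8, 2])
```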
Gravitational wave signature of a mini creation event (MCE)
NASA Astrophysics Data System (ADS)
Dhurandhar, S. V.; Narlikar, J. V.
2018-07-01
In light of the recent discoveries of binary black hole events and one neutron star event by the advanced LIGO (aLIGO) and advanced Virgo (aVirgo) detectors, we propose a new astrophysical source, namely the mini creation event (MCE), as a possible source of gravitational waves (GW) to be detected by advanced detectors. The MCE is at the heart of the quasi-steady state cosmology (QSSC) and is not expected to occur in standard cosmology. Generically, the MCE is anisotropic, and we assume a Bianchi Type I model for its description. We compute its signature waveform and assume masses and distances analogous to the events detected. The striking feature of the waveform associated with this model of the MCE is that it depends on only one amplitude parameter and thus allows for simpler data analysis. By matched filtering the signal we find that, for a broad range of model parameters, the signal-to-noise ratio of the randomly oriented MCE is sufficiently high for a confident detection by aLIGO and aVirgo. We therefore propose the MCE as a viable astrophysical source of GW. The detection or non-detection of such a source also holds implications for QSSC, namely, whether it is a viable cosmology or not.
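A toy illustration of the matched filtering step invoked above, assuming white noise and a stand-in chirp-like template rather than the MCE waveform: the data are correlated against a unit-norm template and the peak of the output gives the detection statistic. Real GW searches whiten by the detector noise spectrum, which this sketch omits.

```python
# Hedged sketch of time-domain matched filtering under a white-noise assumption.
import numpy as np

rng = np.random.default_rng(9)
n = 4096
t = np.arange(512)
template = np.sin(2 * np.pi * t * (0.01 + 1e-5 * t))   # toy chirp, not the MCE
template /= np.linalg.norm(template)                   # unit-norm template

sigma = 1.0
data = rng.normal(0, sigma, n)
data[2000:2512] += 6.0 * template                      # injected signal, SNR ~ 6

snr = np.correlate(data, template, mode="valid") / sigma
print(int(np.argmax(snr)), round(float(snr.max()), 1)) # peak near sample 2000
```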
Nadal, Anna; Esteve, Teresa; Pla, Maria
2009-01-01
A multiplex polymerase chain reaction assay coupled to capillary gel electrophoresis for amplicon identification by size and color (multiplex PCR-CGE-SC) was developed for simultaneous detection of cotton species and 5 events of genetically modified (GM) cotton. Validated real-time-PCR reactions targeting Bollgard, Bollgard II, Roundup Ready, 3006-210-23, and 281-24-236 junction sequences, and the cotton reference gene acp1 were adapted to detect more than half of the European Union-approved individual or stacked GM cotton events in one reaction. The assay was fully specific (<1.7% of false classification rate), with limit of detection values of 0.1% for each event, which were also achieved with simulated mixtures at different relative percentages of targets. The assay was further combined with a second multiplex PCR-CGE-SC assay to allow simultaneous detection of 6 cotton and 5 maize targets (two endogenous genes and 9 GM events) in two multiplex PCRs and a single CGE, making the approach more economic. Besides allowing simultaneous detection of many targets with adequate specificity and sensitivity, the multiplex PCR-CGE-SC approach has high throughput and automation capabilities, while keeping a very simple protocol, e.g., amplification and labeling in one step. Thus, it is an easy and inexpensive tool for initial screening, to be complemented with quantitative assays if necessary.
CTBT infrasound network performance to detect the 2013 Russian fireball event
Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; ...
2015-03-18
The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as a global reference event. The investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none are geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, as part of the seventh framework programme (FP7) of the European Commission. The criteria for the selection of vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g., Micromedex), and summaries of product characteristics. Classification into positive control (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC, and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian
2013-07-01
Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time and thereby reduce time in hypoglycemia and avoid any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which the hypoglycemic event is not detected by the CGM. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data of 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Non-discriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a support vector machine model and the performance assessed by sample-based sensitivity and specificity and event-based sensitivity and number of false positives. The best model used seven features and was able to detect 17 of 17 hypoglycemic events with one false positive, compared with 12 of 17 hypoglycemic events with zero false positives for the CGM alone. Lead time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
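A hedged sketch of the classification approach, with simplified stand-ins for the paper's feature set, window length, and data: statistical features from a sliding CGM window (mean, trend slope, kurtosis, skewness) feed a support vector machine that outputs a hypoglycemia probability.

```python
# Hedged sketch of feature extraction + SVM hypoglycemia detection; the data
# simulator and window length are assumptions, not the study's protocol.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.svm import SVC

def window_features(cgm_window):
    slope = np.polyfit(np.arange(len(cgm_window)), cgm_window, 1)[0]
    return [cgm_window.mean(), slope, kurtosis(cgm_window), skew(cgm_window)]

rng = np.random.default_rng(10)
def simulate(falling):
    base = 5.5 - (np.linspace(0, 2.5, 30) if falling else 0.0)
    return base + rng.normal(0, 0.2, 30)          # 30-sample CGM window, mmol/L

X = np.array([window_features(simulate(f)) for f in [False] * 50 + [True] * 50])
y = np.array([0] * 50 + [1] * 50)
clf = SVC(kernel="rbf", probability=True).fit(X, y)
print(clf.predict_proba([window_features(simulate(True))])[0, 1])  # P(hypo)
```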
Spawning behaviour of Allis shad Alosa alosa: new insights based on imaging sonar data.
Langkau, M C; Clavé, D; Schmidt, M B; Borcherding, J
2016-06-01
Spawning behaviour of Alosa alosa was observed by high-resolution imaging sonar. Detected clouds of sexual products and micro-bubbles served as a potential indicator of spawning activity. Peak spawning time was between 0130 and 0200 hours at night. Increasing detections over three consecutive nights were consistent with sounds of mating events (bulls) assessed in hearing surveys conducted in parallel to the hydroacoustic detection. In 70% of the analysed mating events there were no additional A. alosa joining the event, whilst 70% of the mating events showed one or two A. alosa leaving the cloud. In 31% of the analysed mating events, however, three or more A. alosa were leaving the clouds, indicating that matings are not restricted to a pair. Imaging sonar is suitable for monitoring spawning activity and behaviour of anadromous clupeids in their spawning habitats. © 2016 The Fisheries Society of the British Isles.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on classical statistics, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the sequential Bayesian methodology was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, respectively represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high-throughput analysis of microliter-volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-responsive sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in in-field environmental monitoring.
Cheng, Nan; Shang, Ying; Xu, Yuancong; Zhang, Li; Luo, Yunbo; Huang, Kunlun; Xu, Wentao
2017-05-15
Stacked genetically modified organisms (GMO) are becoming popular for their enhanced production efficiency and improved functional properties, and on-site detection of stacked GMO is an urgent challenge to be solved. In this study, we developed a cascade system combining event-specific tag-labeled multiplex LAMP with a DNAzyme-lateral flow biosensor for reliable detection of stacked events (DP305423 × GTS 40-3-2). Three primer sets, both event-specific and soybean species-specific, were newly designed for the tag-labeled multiplex LAMP system. A trident-like lateral flow biosensor displayed amplified products simultaneously without cross contamination, and DNAzyme enhancement effectively improved the sensitivity. After optimization, the limit of detection was approximately 0.1% (w/w) for stacked GM soybean, which is sensitive enough to detect genetically modified content at the threshold values established by several countries for regulatory compliance. The entire detection process could be shortened to 120 min without any large-scale instrumentation. This method may be useful for the in-field detection of DP305423 × GTS 40-3-2 soybean on a single-kernel basis and for on-site screening of stacked GM soybean lines and individual parent GM soybean lines in highly processed foods. Copyright © 2017 Elsevier B.V. All rights reserved.
Prins, Theo W; Scholtens, Ingrid M J; Bak, Arno W; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Laurensse, Emile J; Kok, Esther J
2016-12-15
During routine monitoring for GMOs in food in the Netherlands, papaya-containing food supplements were found positive for the genetically modified (GM) elements P-35S and T-nos. The goal of this study was to identify the unknown and EU-unauthorised GM papaya event(s). A screening strategy was applied using additional GM screening elements, including a newly developed PRSV coat protein PCR. The detected PRSV coat protein PCR product was sequenced, and the nucleotide sequence showed identity to PRSV YK strains indigenous to China and Taiwan. The GM events 16-0-1 and 18-2-4 could be identified by amplifying and sequencing event-specific sequences. Further analyses showed that both papaya event 16-0-1 and event 18-2-4 were transformed with the same construct. For use in routine analysis, derived TaqMan qPCR methods for events 16-0-1 and 18-2-4 were developed. Event 16-0-1 was detected in all samples tested, whereas event 18-2-4 was detected in one sample. This study presents a strategy for combining information from different sources (literature, patent databases) with novel sequence data to identify unknown GM papaya events. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance plays an increasingly important role in public life. Real-time alerting of threatening events and searching for content of interest in large volumes of stored video footage require a human operator to pay full attention to monitors for long periods. This labor-intensive mode of operation limits the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach compensates for background motion in real time; frame differencing detects the foreground; HOG-based classifiers categorize foreground objects as people or cars; and mean-shift tracks the recognized objects. Events are then detected against predefined rules. The maturity of these algorithms ensures the robustness of the framework, while the improved matching approach and the easily checked rules enable real-time operation. Directions for future work are also discussed.
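A hedged sketch of such a pipeline using standard OpenCV calls; the file name and thresholds are illustrative, global phase correlation stands in for the paper's key-point-based background motion compensation, and the mean-shift tracking and rule-based event stages are omitted for brevity:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("surveillance.avi")          # hypothetical input clip
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # estimate a global shift between frames, then difference the aligned images
    (dx, dy), _ = cv2.phaseCorrelate(prev_gray.astype(np.float32),
                                     gray.astype(np.float32))
    M = np.float32([[1, 0, dx], [0, 1, dy]])
    aligned = cv2.warpAffine(prev_gray, M, (gray.shape[1], gray.shape[0]))
    fg_mask = cv2.threshold(cv2.absdiff(gray, aligned), 25, 255,
                            cv2.THRESH_BINARY)[1]
    if fg_mask.mean() > 2.0:                        # enough foreground motion
        rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in rects:                  # mark detected people
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
```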
Semiautomated tremor detection using a combined cross-correlation and neural network approach
Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.
2013-01-01
Despite observations of tectonic tremor in many locations around the globe, the emergent phase arrivals, low-amplitude waveforms, and variable event durations make automatic detection a nontrivial task. In this study, we employ a new method to identify tremor in large data sets using a semiautomated technique. The method first reduces the data volume with an envelope cross-correlation technique, followed by a Self-Organizing Map (SOM) algorithm to identify and classify event types. The method detects tremor in an automated fashion after calibrating for a specific data set, hence we refer to it as being "semiautomated". We apply the semiautomated detection algorithm to a newly acquired data set of waveforms from a temporary deployment of 13 seismometers near Cholame, California, from May 2010 to July 2011. We manually identify tremor events in a 3-week-long test data set and compare to the SOM output and find a detection accuracy of 79.5%. Detection accuracy improves with increasing signal-to-noise ratios and number of available stations. We find detection completeness of 96% for tremor events with signal-to-noise ratios above 3 and optimal results when data from at least 10 stations are available. We compare the SOM algorithm to the envelope correlation method of Wech and Creager and find the SOM performs significantly better, at least for the data set examined here. Using the SOM algorithm, we detect 2606 tremor events with a cumulative signal duration of nearly 55 h during the 13-month deployment. Overall, the SOM algorithm is shown to be a flexible new method that utilizes characteristics of the waveforms to identify tremor from noise or other seismic signals.
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high-dynamic-range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter-based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
Sakaeda, Toshiyuki; Kadoyama, Kaori; Okuno, Yasushi
2011-01-01
Adverse event reports (AERs) submitted to the US Food and Drug Administration (FDA) were reviewed to assess the muscular and renal adverse events induced by the administration of 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase inhibitors (statins) and to attempt to determine the rank-order of the association. After a revision of arbitrary drug names and the deletion of duplicated submissions, AERs involving pravastatin, simvastatin, atorvastatin, or rosuvastatin were analyzed. Authorized pharmacovigilance tools were used for quantitative detection of signals, i.e., drug-associated adverse events, including the proportional reporting ratio, the reporting odds ratio, the information component given by a Bayesian confidence propagation neural network, and the empirical Bayes geometric mean. The muscular adverse events examined were myalgia, rhabdomyolysis, and an increase in creatine phosphokinase level; the renal adverse events were acute renal failure, non-acute renal failure, and an increase in blood creatinine level. Based on 1,644,220 AERs from 2004 to 2009, signals were detected for all 4 statins with respect to myalgia, rhabdomyolysis, and an increase in creatine phosphokinase level, but these signals were stronger for rosuvastatin than for pravastatin and atorvastatin. Signals were also detected for acute renal failure, though in the case of atorvastatin the association was marginal; no signal was detected for non-acute renal failure or for an increase in blood creatinine level. Data mining of the FDA's adverse event reporting system, AERS, is useful for examining statin-associated muscular and renal adverse events. The data strongly suggest the necessity of well-organized clinical studies of statin-associated adverse events.
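Two of the named disproportionality measures, the proportional reporting ratio (PRR) and the reporting odds ratio (ROR), reduce to simple arithmetic on a 2x2 table of report counts. A hedged sketch with made-up counts; the signal thresholds used in practice (e.g. PRR above 2 with a minimum case count) are not applied here:

```python
# Disproportionality measures from a 2x2 contingency table of report counts.
# The counts in the example are invented for illustration.
import numpy as np

def prr_ror(a, b, c, d):
    """a: drug & event, b: drug & other events, c: other drugs & event,
    d: other drugs & other events (all are report counts)."""
    prr = (a / (a + b)) / (c / (c + d))          # proportional reporting ratio
    ror = (a / b) / (c / d)                      # reporting odds ratio
    se_log_ror = np.sqrt(1/a + 1/b + 1/c + 1/d)  # for a 95% CI on the ROR
    ci = (np.exp(np.log(ror) - 1.96 * se_log_ror),
          np.exp(np.log(ror) + 1.96 * se_log_ror))
    return prr, ror, ci

# toy counts: 120 rhabdomyolysis reports for one statin, 9880 other reports, etc.
print(prr_ror(120, 9880, 1500, 1_600_000))
```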
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
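The predictive state-assignment step can be illustrated with a toy example: each learned event state is a multivariate Gaussian over a feature vector, and new data are assigned to the highest-likelihood state. The states, features, and parameters below are invented for the sketch and are not the paper's learned model:

```python
# Toy predictive state assignment: pick the event state whose Gaussian
# gives the new feature vector the highest log-density.
import numpy as np
from scipy.stats import multivariate_normal

# hypothetical learned states: (mean, covariance) per dynamic event state
states = {
    "interictal": (np.array([0.0, 0.0]), np.eye(2)),
    "burst":      (np.array([2.0, 0.5]), np.diag([0.5, 0.5])),
    "seizure":    (np.array([4.0, 3.0]), np.diag([1.0, 1.5])),
}

def assign_state(x):
    """Return the state under which x has the highest Gaussian log-density."""
    return max(states,
               key=lambda s: multivariate_normal.logpdf(x, *states[s]))

print(assign_state(np.array([3.8, 2.7])))   # -> 'seizure' for this toy vector
```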
NASA Astrophysics Data System (ADS)
Helmboldt, J.; Park, J.; von Frese, R. R. B.; Grejner-Brzezinska, D. A.
2016-12-01
Traveling ionospheric disturbances (TIDs) are generated by various sources and are detectable by observing spatial and temporal changes of electron content in the ionosphere. This study focused on detecting and analyzing TIDs generated by acoustic-gravity waves from man-made events, including underground nuclear explosions (UNEs), mine collapses, mine blasts, and large chemical explosions (LCEs), using the Global Navigation Satellite System (GNSS). We selected different event types for case study, covering two US and three North Korean UNEs, two large US mine collapses, three large US mine blasts, a LCE in northern China, and a second LCE at the Nevada Test Site. In most cases, we successfully detected the TIDs as array signatures across multiple nearby GNSS stations. The array-based TID signatures yielded event-appropriate propagation speeds ranging from a few hundred m/s to roughly a km/s. In addition, the event TID waveforms, propagation angles, and directions were established. The TID waveforms and the maximum angle between each event and the ionospheric pierce point (IPP) of its TID with the longest travel distance from the source may help differentiate UNEs and LCEs, but the uneven distribution of the observing GNSS stations complicates these results. Thus, further analysis is required of the utility of the apertures of event signatures in the ionosphere for discriminating these events. In general, the results of this study show the potential utility of GNSS observations for detecting and mapping the ionospheric signatures of large-energy anthropogenic explosions and subsurface collapses.
Induced Seismicity in Greeley, CO: The Effects of Pore Pressure on Seismic Wave Character
NASA Astrophysics Data System (ADS)
Bogolub, K. R.; Holmes, R.; Sheehan, A. F.; Brown, M. R. M.
2017-12-01
Since 2013, a series of injection-induced earthquakes has occurred near Greeley, Colorado, including a Mw 3.2 event in June 2014. With induced seismicity on the rise, it is important to understand injection-induced earthquakes to improve mitigation efforts. In this research, we analyzed seismograms from a local seismic network to determine whether there are any notable differences in seismic waveforms resulting from changes in pore pressure due to wastewater injection. Catalogued earthquake events from January-June 2017 that were clearly visible on 4 or more stations in the network were used as template events in a subspace detector. Since the template events were constructed using seismograms from a single event, the subspace detector operated similarly to a matched filter, and detections had waveforms very similar to the template event. These detections ultimately helped us identify similar earthquakes, providing better-located events for comparison. The detections were then examined and located using a 1D local velocity model. While many of the detections were already catalogued events, we also identified >20 new events using this detector. Any two events that were matched by the detector, collocated within the error ellipses of both events, and at least a month apart temporally were classified as "event pairs". One challenge of this method is that most of the collocated earthquakes occurred in a very narrow time window, indicating that the events tend to cluster both spatially and temporally. However, we were able to examine an event pair that fit our spatial proximity criteria and were several months apart (March 3, 2017 and May 8, 2017). We present an examination of propagation velocity and frequency content for these two events specifically, to assess whether transient changes in pore pressure had any observable influence on these characteristics. Our preliminary results indicate a slight difference in lag time between P-wave and S-wave arrivals (slightly greater lag time for the March event) and in frequency content (slightly higher dominant frequencies for the March event). However, more work needs to be done to refine our earthquake locations so we can determine whether these observations are caused by a transient change in velocity structure, a difference in the locations of the two events, or some other mechanism.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-01-01
Study Objective: To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Methods: Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHIauto) was calculated using both signals, and a respiratory disturbance index (RDIauto) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter derived pulse wave signal and compared with manual scoring. Results: Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m2) were included in the analysis. AHImanual (19.4 ± 18.5 events/h) correlated highly significantly with AHIauto (19.9 ± 16.5 events/h) and RDIauto (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001) with a mean difference of −0.5 ± 6.6 and −1.0 ± 6.1 events/h. The automatic analysis of AHIauto and RDIauto detected sleep apnea (cutoff AHImanual ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manual scoring (r = 0.87 and 0.95, p < 0.001) with mean difference of −4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Conclusions: Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep. Citation: Sommermeyer D; Zou D; Grote L; Hedner J. Detection of sleep disordered breathing and its central/obstructive character using nasal cannula and finger pulse oximeter. J Clin Sleep Med 2012;8(5):527-533. PMID:23066364
Event identification for KM3NeT/ARCA
NASA Astrophysics Data System (ADS)
Heid, Thomas; KM3NeT Collaboration
2017-09-01
KM3NeT is a large research infrastructure consisting of a network of deep-sea neutrino telescopes. KM3NeT/ARCA will be the instrument detecting high-energy neutrinos with energies above 100 TeV. This instrument offers a new opportunity to observe the neutrino sky with very high angular resolution and thereby detect neutrino point sources. Furthermore, it will be possible to probe the flavour composition of neutrino fluxes, and hence production mechanisms, with so-far unreached precision. Neutrinos produce different event topologies in the detector according to their flavour, interaction channel, and deposited energy. Machine-learning algorithms are able to learn features of these topologies to discriminate between them. Previous analyses considered only two event types, namely the shower and track topologies. With good timing resolution and precise reconstruction algorithms it is possible to separate events into more types, for example the double-bang topology produced by tau neutrinos. The final goal is to distinguish all three neutrino flavours as far as possible. To this end, the KM3NeT collaboration uses deep neural networks trained with Monte Carlo events of all neutrino types. This contribution shows the ability of KM3NeT/ARCA to classify events into more than two neutrino event topologies. Furthermore, the borders between detectable classes are shown, such as the minimum distance the tau has to travel before decaying into a tau neutrino in order to be detected as a double-bang event.
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.; Horton, D. E.; Singh, D.; Swain, D. L.; Touma, D. E.; Mankin, J. S.
2015-12-01
Because of the high cost of extreme events and the growing evidence that global warming is likely to alter the statistical distribution of climate variables, detection and attribution of changes in the probability of extreme climate events have become a pressing topic for the scientific community, elected officials, and the public. While most of the emphasis has thus far focused on analyzing the climate variable of interest (most often temperature or precipitation, but also flooding and drought), there is an emerging emphasis on applying detection and attribution analysis techniques to the underlying physical causes of individual extreme events. This approach is promising in part because the underlying physical causes (such as atmospheric circulation patterns) can in some cases be more accurately represented in climate models than the more proximal climate variable (such as precipitation). In addition, and more scientifically critical, is the fact that the most extreme events result from a rare combination of interacting causes, often referred to as "ingredients". Rare events will therefore always be strongly influenced by "natural" variability. Analyzing the underlying physical mechanisms can therefore help to test whether there have been changes in the probability of the constituent conditions of an individual event, or whether the co-occurrence of causal conditions cannot be distinguished from random chance. This presentation will review approaches to applying detection/attribution analysis to the underlying physical causes of extreme events (including both "thermodynamic" and "dynamic" causes), and provide a number of case studies, including the role of the frequency of atmospheric circulation patterns in the probability of hot, cold, wet, and dry events.
Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus
2017-04-01
Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings by two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations, and used both event duration parameters and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of which algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple event types, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transactions on Biomedical Engineering, 60(9):2484-2493, 2013) outperforms all algorithms in data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.
NASA Astrophysics Data System (ADS)
Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros
2017-10-01
The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in changes of |0.4| and |0.18| mGal in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.
Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter
2015-01-01
The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are among the most relevant ADEs in internal medicine and geriatric wards. For each patient, ADE risk scores for all four conditions are calculated and compared against a threshold to judge whether the case should merely be monitored or actively reported. A ward-based cockpit view summarizes the results.
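A toy illustration of trigger-based risk scoring of this kind; the lab thresholds, drug triggers, and scoring scheme are invented for the sketch and do not reproduce iMedication's actual rules:

```python
# Hypothetical trigger logic for the four conditions named above.
# Each condition scores one point per lab trigger and one per drug trigger.
def ade_risk_scores(labs, meds):
    return {
        "hyperkalemia": (labs.get("potassium", 0) > 5.5) + ("ACE inhibitor" in meds),
        "hyponatremia": (labs.get("sodium", 999) < 130) + ("thiazide" in meds),
        "renal_failure": (labs.get("creatinine", 0) > 2.0) + ("NSAID" in meds),
        "over_anticoagulation": (labs.get("INR", 0) > 4.5) + ("warfarin" in meds),
    }

THRESHOLD = 2  # report when both a lab trigger and a drug trigger fire
patient = ade_risk_scores({"potassium": 5.9, "INR": 5.1}, {"warfarin"})
for condition, score in patient.items():
    status = "report" if score >= THRESHOLD else "monitor"
    print(f"{condition}: score={score} -> {status}")
```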
The hard X-ray burst spectrometer event listing, 1980 - 1985
NASA Technical Reports Server (NTRS)
Dennis, B. R.; Orwig, L. E.; Kiplinger, A. L.; Gibson, B. R.; Kennard, G. S.; Tolbert, A. K.
1985-01-01
This event listing is a comprehensive reference for the hard X-ray bursts detected with the Hard X-Ray Burst Spectrometer on the Solar Maximum Mission from the time of launch on February 14, 1980 to September 1985. Over 8000 X-ray events were detected in the energy range from 30 to approx. 500 keV with the vast majority being solar flares. The listing includes the start time, peak time, duration and peak rate of each event.
The complete Hard X Ray Burst Spectrometer event list, 1980-1989
NASA Technical Reports Server (NTRS)
Dennis, B. R.; Orwig, L. E.; Kennard, G. S.; Labow, G. J.; Schwartz, R. A.; Shaver, A. R.; Tolbert, A. K.
1991-01-01
This event list is a comprehensive reference for all Hard X ray bursts detected with the Hard X Ray Burst Spectrometer on the Solar Maximum Mission from the time of launch on Feb. 14, 1980 to the end of the mission in Dec. 1989. Some 12,776 events were detected in the energy range 30 to 600 keV with the vast majority being solar flares. This list includes the start time, peak time, duration, and peak rate of each event.
LLNL Location and Detection Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S C; Harris, D B; Anderson, M L
2003-07-16
We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) developed network coverage criteria for assessing the accuracy of event locations determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins where the emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that reference velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station in the European Arctic. Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.
NASA Astrophysics Data System (ADS)
Meng, X.; Daniels, C.; Smith, E.; Peng, Z.; Chen, X.; Wagner, L. S.; Fischer, K. M.; Hawman, R. B.
2015-12-01
Since 2001, the number of M>3 earthquakes has increased significantly in the Central and Eastern United States (CEUS), likely due to waste-water injection; such events are known as "induced earthquakes" [Ellsworth, 2013]. Induced earthquakes are driven by short-term external forcing and hence may behave like earthquake swarms, which are not well characterized by branching point-process models such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. In this study we focus on the 02/15/2014 M4.1 South Carolina and the 06/16/2014 M4.3 Oklahoma earthquakes, which likely represent intraplate tectonic and induced events, respectively. For the South Carolina event, only one M3.0 aftershock is listed in the ANSS catalog, which may reflect a lack of low-magnitude events in that catalog. We apply a recently developed matched filter technique to detect earthquakes from 02/08/2014 to 02/22/2014 around the epicentral region. 15 seismic stations (both permanent and temporary USArray networks) within 100 km of the mainshock are used for detection. The mainshock and aftershock are used as templates for the initial detection. Newly detected events are employed as new templates, and the same detection procedure repeats until no new event can be added. Overall we have identified more than 10 events, including one foreshock that occurred ~11 min before the M4.1 mainshock. However, the number of aftershocks is still far smaller than predicted by the modified Bath's law. For the Oklahoma event, we use 1270 events from the ANSS catalog and 182 events from a relocated catalog as templates to scan through continuous recordings from 3 days before to 7 days after the mainshock. 12 seismic stations in the vicinity of the mainshock are included in the study. After obtaining more complete catalogs for both sequences, we plan to compare the statistical parameters (e.g., b, a, K, and p values) between the two sequences, as well as their spatial-temporal migration patterns, which may shed light on the underlying physics of tectonic and induced earthquakes.
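The core of such a matched-filter scan is a normalized cross-correlation of a template against continuous data. A numpy-only sketch, assuming a single channel and a fixed correlation threshold (production detectors typically stack correlations over a network and set the threshold from the day's correlation statistics, e.g. a multiple of the MAD):

```python
# Slide a template over continuous data; detections are samples where the
# normalized cross-correlation exceeds a threshold. Data here are synthetic.
import numpy as np

def matched_filter(data, template, threshold=0.7):
    n = len(template)
    t = (template - template.mean()) / template.std()   # standardized template
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = (t * (w - w.mean())).sum() / (n * w.std())
    return np.where(cc > threshold)[0], cc

rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 0.3, 5000)
data[1200:1400] += template            # bury one "event" in the noise
picks, cc = matched_filter(data, template)
print("detections near sample:", picks[:5])
```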
Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.
Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse
2017-03-24
Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
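A hedged sketch of what a threshold-based detector of this kind can look like for two of the four events (IC and TO) from marker kinematics alone; the signals, thresholds, and logic are illustrative assumptions, not the paper's algorithms, which additionally use a biomechanical model:

```python
# Pick initial contact (IC) and toe off (TO) from heel/toe marker heights and
# heel vertical velocity. The 2 cm height threshold is an invented example.
import numpy as np

def detect_gait_events(heel_z, toe_z, fs, height_thresh=0.02):
    """Return sample indices of IC and TO from heel/toe heights (m) at fs Hz."""
    heel_vel = np.gradient(heel_z) * fs
    ic, to = [], []
    for i in range(1, len(heel_z)):
        # IC: heel drops below the height threshold while moving downward
        if heel_z[i] < height_thresh <= heel_z[i - 1] and heel_vel[i] < 0:
            ic.append(i)
        # TO: toe rises above the threshold after having been below it
        if toe_z[i] > height_thresh >= toe_z[i - 1]:
            to.append(i)
    return np.array(ic), np.array(to)

# toy trace: a heel bouncing once per second at 100 Hz sampling
fs = 100.0
t = np.arange(0, 5, 1 / fs)
heel = 0.05 * np.abs(np.sin(np.pi * t))
ic, to = detect_gait_events(heel, heel, fs)
print("IC samples:", ic[:3], "TO samples:", to[:3])
```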
Episodic inflation events at Akutan Volcano, Alaska, during 2005-2017
NASA Astrophysics Data System (ADS)
Ji, Kang Hyeun; Yun, Sang-Ho; Rim, Hyoungrea
2017-08-01
Detection of weak volcano deformation helps constrain the characteristics of eruption cycles. We have developed a signal detection technique, called the Targeted Projection Operator (TPO), to monitor surface deformation with Global Positioning System (GPS) data. We applied the TPO to GPS data collected at Akutan Volcano from June 2005 to March 2017 and detected four inflation events, in 2008, 2011, 2014, and 2016, with inflation rates of about 8-22 mm/yr above the background trend at the near-source site AV13. Numerical modeling suggests that the events are driven by closely located sources, or a single source, in a shallow magma chamber at a depth of about 4 km. The inflation events suggest that magma has episodically accumulated in a shallow magma chamber.
Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal patterns or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events by an "interestingness" measure (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events in two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
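A minimal sketch of the clustering-based idea, assuming scikit-learn: cluster the samples, then flag points unusually far from their assigned centroid as outliers. The cluster count and the 99th-percentile cutoff are illustrative choices, not the framework's actual parameters:

```python
# Unsupervised outlier flagging via k-means distances. Data are synthetic:
# a dense bulk plus a small, displaced "anomalous event".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (990, 2)),      # bulk of the data
               rng.normal(8, 1, (10, 2))])      # a small anomalous event

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
outliers = np.where(dist > np.percentile(dist, 99))[0]
print(f"{len(outliers)} candidate outliers flagged")
```

Grouping flagged outliers that are adjacent in space and time, as the framework does, is what distinguishes a meaningful "event" from isolated noisy samples.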
Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A
2017-02-01
High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
Gaertner, James P; Garres, Tiffany; Becker, Jesse C; Jimenez, Maria L; Forstner, Michael R J; Hahn, Dittmar
2009-03-01
Sediments and water from the spring and slough arm of Spring Lake, the pristine headwaters of the San Marcos River, Texas, were analyzed for Salmonellae by culture and molecular techniques before and after three major precipitation events, each with intermediate dry periods. Polymerase chain reaction (PCR)-assisted analyses of enrichment cultures detected Salmonellae in samples after all three precipitation events, but failed to detect them immediately prior to the rainfall events. Detection among individual locations differed with respect to the precipitation event analyzed, and the strains isolated were highly variable with respect to serovars. These results demonstrate that rainwater-associated effects, most likely surface runoff, provide an avenue for short-term pollution of aquatic systems with Salmonellae that do not, however, appear to establish long-term in either water or sediments.
Human Rights Event Detection from Heterogeneous Social Media Graphs.
Chen, Feng; Neill, Daniel B
2015-03-01
Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present automatic extraction of goal events in soccer videos using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio content comprises three steps: 1) extraction of audio features from a video sequence, 2) detection of candidate highlight events based on the extracted features and a Hidden Markov Model (HMM), and 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method against the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources, totaling seven hours of soccer games and eight gigabytes of data. One of the five soccer games is used as training data (e.g., announcers' excited speech, ambient audience noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
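The first step, audio feature extraction, is straightforward with standard tooling. A hedged sketch using librosa, which is an assumption of this note rather than the authors' implementation; the file name is hypothetical, and the HMM candidate-detection and goal-selection stages are not shown:

```python
# MFCC front end for audio-based highlight detection. Only the feature
# extraction step is sketched; classification would consume these frames.
import librosa

y, sr = librosa.load("soccer_match.wav", sr=16000)   # hypothetical soundtrack
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # 13 coefficients per frame
print(mfcc.shape)  # (13, n_frames): one column per short analysis frame
```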
Relationship between ultrasonically detected phasic antral contractions and antral pressure.
Hveem, K; Sun, W M; Hebbard, G; Horowitz, M; Doran, S; Dent, J
2001-07-01
The relationships between gastric wall motion and intraluminal pressure are believed to be major determinants of flows within and from the stomach. Gastric antral wall motion and intraluminal pressures were monitored in five healthy subjects by concurrent antropyloroduodenal manometry and transabdominal ultrasound for 60 min after subjects drank 500 ml of clear soup. We found that 99% of antral contractions detected by ultrasound were propagated aborally, and 68% of contractions became lumen occlusive at the site of the ultrasound marker. Of the 203 contractions detected by ultrasound, 53% were associated with pressure events in the manometric reference channel; 86% of contractions had corresponding pressure events detectable somewhere in the antrum. Contractions that occluded the lumen were more likely to be associated with a pressure event in the manometric reference channel (P < 0.01) and to be of greater amplitude (P < 0.01) than non-lumen-occlusive contractions. We conclude that heterogeneous pressure event patterns in the antrum occur despite a stereotyped pattern of contraction propagation seen on ultrasound. Lumen occlusion is more likely to be associated with higher peak antral pressure events.
An Alternative Explanation for "Step-Like" Early VLF Event
NASA Astrophysics Data System (ADS)
Moore, R. C.
2016-12-01
A newly deployed array of VLF receivers along the East Coast of the United States is ideally suited for detecting VLF scattering from lightning-induced disturbances to the lower ionosphere. The array was deployed in May 2016, and one VLF receiver was deployed only 20 km from the NAA transmitter (24.0 kHz) in Cutler, Maine. The phase of the NAA signal at this closest site varies significantly with time, due simply to the impedance match of the transmitter varying with time. Additionally, both the amplitude and phase exhibit periods of rapid shifts that could possibly explain at least some "step-like" VLF scattering events. Here, we distinguish between "step-like" VLF scattering events and other events in that "step-like" events are typically not closely associated with a detected causative lightning flash and tend to exhibit little or no recovery to ambient conditions after the event onset. We present an analysis of VLF observations from the East Coast array that demonstrates interesting examples of step-like VLF events far from the transmitter that are associated with step-like events very close to the transmitter. We conclude that step-like VLF events should be treated with caution unless definitively associated with a causative lightning flash and/or detected using observations of multiple transmitter signals.
Joint Attributes and Event Analysis for Multimedia Event Detection.
Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G
2017-06-15
Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that relates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
Binary Microlensing Events from the MACHO Project
NASA Astrophysics Data System (ADS)
Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Baines, D.; Becker, A. C.; Bennett, D. P.; Bourke, A.; Brakel, A.; Cook, K. H.; Crook, B.; Crouch, A.; Dan, J.; Drake, A. J.; Fragile, P. C.; Freeman, K. C.; Gal-Yam, A.; Geha, M.; Gray, J.; Griest, K.; Gurtierrez, A.; Heller, A.; Howard, J.; Johnson, B. R.; Kaspi, S.; Keane, M.; Kovo, O.; Leach, C.; Leach, T.; Leibowitz, E. M.; Lehner, M. J.; Lipkin, Y.; Maoz, D.; Marshall, S. L.; McDowell, D.; McKeown, S.; Mendelson, H.; Messenger, B.; Minniti, D.; Nelson, C.; Peterson, B. A.; Popowski, P.; Pozza, E.; Purcell, P.; Pratt, M. R.; Quinn, J.; Quinn, P. J.; Rhie, S. H.; Rodgers, A. W.; Salmon, A.; Shemmer, O.; Stetson, P.; Stubbs, C. W.; Sutherland, W.; Thomson, S.; Tomaney, A.; Vandehei, T.; Walker, A.; Ward, K.; Wyper, G.
2000-09-01
We present the light curves of 21 gravitational microlensing events from the first six years of the MACHO Project gravitational microlensing survey that are likely examples of lensing by binary systems. These events were manually selected from a total sample of ~350 candidate microlensing events that were either detected by the MACHO Alert System or discovered through retrospective analyses of the MACHO database. At least 14 of these 21 events exhibit strong (caustic) features, and four of the events are well fit with lensing by large mass ratio (brown dwarf or planetary) systems, although these fits are not necessarily unique. The total binary event rate is roughly consistent with predictions based upon our knowledge of the properties of binary stars, but a precise comparison cannot be made without a determination of our binary lens event detection efficiency. Toward the Galactic bulge, we find a ratio of caustic crossing to noncaustic crossing binary lensing events of 12:4, excluding one event for which we present two fits. This suggests significant incompleteness in our ability to detect and characterize noncaustic crossing binary lensing. The distribution of mass ratios, N(q), for these binary lenses appears relatively flat. We are also able to reliably measure source-face crossing times in four of the bulge caustic crossing events, and recover from them a distribution of lens proper motions, masses, and distances consistent with a population of Galactic bulge lenses at a distance of 7+/-1 kpc. This analysis yields two systems with companions of ~0.05 Msolar.
Treml, Diana; Venturelli, Gustavo L; Brod, Fábio C A; Faria, Josias C; Arisi, Ana C M
2014-12-10
A genetically modified (GM) common bean event, namely Embrapa 5.1, resistant to the bean golden mosaic virus (BGMV), was approved for commercialization in Brazil. Brazilian regulation for genetically modified organism (GMO) labeling requires that any food containing more than 1% GMO be labeled. The event-specific polymerase chain reaction (PCR) method has been the primary trend for GMO identification and quantitation because of its high specificity based on the flanking sequence. This work reports the development of an event-specific assay, named FGM, for Embrapa 5.1 detection and quantitation by use of SYBR Green or hydrolysis probe. The FGM assay specificity was tested for Embrapa 2.3 event (a noncommercial GM common bean also resistant to BGMV), 46 non-GM common bean varieties, and other crop species including maize, GM maize, soybean, and GM soybean. The FGM assay showed high specificity to detect the Embrapa 5.1 event. Standard curves for the FGM assay presented a mean efficiency of 95% and a limit of detection (LOD) of 100 genome copies in the presence of background DNA. The primers and probe developed are suitable for the detection and quantitation of Embrapa 5.1.
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2014-12-01
The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events differ significantly from ones recorded on Earth, in terms of both signal shape and source processes. Thus they are a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, aims to investigate Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time windows within a few days after the receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes, we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (year of station installation, with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As only one weak shallow moonquake is covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%. The algorithm enables us to classify events that were previously listed in the catalog without classification and, through the combined use of long-period and short-period data, to identify some unlisted local impacts as well as at least two previously unreported deep moonquakes.
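A minimal sketch of such single-station HMM classification, assuming the hmmlearn package and precomputed per-frame features (for example, log-energy in a few frequency bands); model sizes and training settings are illustrative, not those of the study:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed dependency

def train_prototype(example_features, n_states=5):
    """Fit a small HMM to the feature sequence of ONE example event waveform."""
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=25)
    model.fit(example_features)            # shape: (n_frames, n_feature_dims)
    return model

def classify_segment(segment, models):
    """Score a candidate segment against every class model (e.g. impact, deep
    moonquake, shallow moonquake, noise) and pick the most likely class."""
    scores = {name: m.score(segment) for name, m in models.items()}
    return max(scores, key=scores.get)
```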
22nd Annual Logistics Conference and Exhibition
2006-04-20
[Presentation slides; only fragments are recoverable.] Prognostics & Health Management at GE, Dr. Piero P. Bonissone, Industrial AI Lab, GE Global Research. Recoverable topics: selection of detection models, anomaly-detection results, failure-mode histograms, anomaly detection from event-log data, and diagnostics/prognostics for failure monitoring and assessment in tactical C4ISR sense-and-respond applications.
NASA Astrophysics Data System (ADS)
Dreger, D. S.; Ford, S. R.; Nayak, A.
2015-12-01
The formation of a large sinkhole at the Napoleonville salt dome, Assumption Parish, Louisiana, in August 2012 was accompanied by a rich sequence of complex seismic events, including long-period (LP) events that were recorded 11 km away at Transportable Array station 544A in White Castle, Louisiana. The LP events have relatively little energy at short periods, which makes them difficult to detect using standard high-frequency power detectors, and the majority of energy that reaches the station is peaked near 0.4 Hz. The analysis of the local records reveals that the onset of the 0.4 Hz signals coincides with the S-wave arrival; the signal may therefore be a shaking-induced resonance in a fluid-filled cavern. We created a low-frequency (0.1-0.6 Hz) power detector (short-term average / long-term average) that operated on all three components of the broadband instrument, since considerable energy was detected on the horizontal components. The detections from the power detector were then used as templates in three-channel correlation detectors, thereby increasing the number of detections by a little more than a factor of two, to nearly 3000. The rate of LP events is approximately one event every other day at the beginning of recording in March 2011. Around 2 May 2012 the rate changes to approximately 7 events per day and then increases to 25 events per day at the beginning of July 2012. Finally, in the days leading up to the sinkhole formation there are approximately 200 LP events per day. The analysis of these events could aid in the development of local seismic monitoring methods for underground industrial storage caverns. Prepared by LLNL under Contract DE-AC52-07NA27344.
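The low-frequency power detector can be sketched as follows, assuming numpy/scipy; the filter order, window lengths, and trigger threshold are illustrative choices, not the study's parameters (and centered moving averages stand in for trailing STA/LTA windows in this sketch):

```python
import numpy as np
from scipy.signal import butter, sosfilt

def lp_power_detector(trace, fs, n_sta, n_lta, threshold=4.0):
    """Band-limited (0.1-0.6 Hz) STA/LTA detector for one component.

    Returns sample indices where the STA/LTA ratio first crosses the threshold.
    """
    sos = butter(4, [0.1, 0.6], btype="bandpass", fs=fs, output="sos")
    power = sosfilt(sos, trace) ** 2                        # instantaneous band power
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
    ratio = sta / np.maximum(lta, 1e-20)
    onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold)) + 1
    return onsets
```

In the study the detector ran on all three components; a simple combination is to trigger when any component crosses the threshold.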
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. It also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
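For reference, the Euclidean distance (MED) idea reduces to comparing each incoming multi-parameter water-quality sample with a baseline estimated from recent history; a simplified sketch with an arbitrary threshold, standing in for the evaluated method:

```python
import numpy as np

def med_detect(history, sample, threshold):
    """Flag a contamination event when the Euclidean distance between the new
    sample and the baseline mean of recent samples exceeds a threshold.

    history: (n_samples, n_params) recent water-quality vectors
    sample:  (n_params,) incoming measurement
    """
    baseline = history.mean(axis=0)
    distance = float(np.linalg.norm(sample - baseline))
    return distance > threshold, distance
```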
Monitoring Chewing and Eating in Free-Living Using Smart Eyeglasses.
Zhang, Rui; Amft, Oliver
2018-01-01
We propose 3-D-printing personally fitted, regular-looking smart eyeglasses frames equipped with bilateral electromyography recording to monitor temporalis muscle activity for automatic dietary monitoring. Personal fitting supports electrode-skin contact at the temple ear-bend and temple-end positions. We evaluated the smart monitoring eyeglasses during in-lab and free-living studies of food chewing and eating event detection with ten participants. The in-lab study was designed to explore three natural food hardness levels and determine parameters of an energy-based chewing cycle detection. Our free-living study investigated whether chewing monitoring and eating event detection using smart eyeglasses are feasible in free living. An eating event detection algorithm was developed to determine intake activities based on the estimated chewing rate. Results showed an average food hardness classification accuracy of 94% and chewing cycle detection precision and recall above 90% for the in-lab study and above 77% for the free-living study covering 122 hours of recordings. Eating event detection identified the 44 eating events with an average accuracy above 95%. We conclude that smart eyeglasses are suitable for monitoring chewing and eating events in free living and could even provide further insights into the wearer's natural chewing patterns.
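A minimal sketch of an energy-based chewing cycle detector on one EMG channel, assuming numpy/scipy; the band limits, smoothing window, and thresholds are illustrative, not the parameters determined in the in-lab study:

```python
import numpy as np
from scipy.signal import butter, sosfilt, find_peaks

def chewing_cycles(emg, fs, distance_s=0.4, k=3.0):
    """Band-pass the EMG, square it for instantaneous energy, smooth, and pick
    peaks at least distance_s apart whose energy exceeds k times the median."""
    sos = butter(4, [20.0, 200.0], btype="bandpass", fs=fs, output="sos")
    energy = sosfilt(sos, emg) ** 2
    win = max(1, int(0.05 * fs))
    smooth = np.convolve(energy, np.ones(win) / win, mode="same")
    peaks, _ = find_peaks(smooth, height=k * np.median(smooth),
                          distance=int(distance_s * fs))
    return peaks  # sample indices of chewing cycle peaks; rate = len(peaks)/duration
```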
Detecting Seismic Events Using a Supervised Hidden Markov Model
NASA Astrophysics Data System (ADS)
Burks, L.; Forrest, R.; Ray, J.; Young, C.
2017-12-01
We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which the transition from noise to signal occurs. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No. SAND2017-8154 A.
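A two-state HMM over a discrete trigger stream can be decoded with the standard Viterbi recursion; the sketch below uses illustrative probabilities (not the trained values from this study), and 0-to-1 transitions in the returned path mark detected onsets:

```python
import numpy as np

def viterbi_binary(obs, p0, p_trans, p_emit):
    """Most likely state path (0 = noise, 1 = signal) for a 0/1 trigger stream.

    p0[s]: initial state probabilities; p_trans[s, s2]: transition matrix;
    p_emit[s, o]: probability of observing o in state s. Example (illustrative):
    p0 = np.array([0.99, 0.01]); p_trans = np.array([[0.99, 0.01], [0.05, 0.95]]);
    p_emit = np.array([[0.9, 0.1], [0.2, 0.8]]).
    """
    n = len(obs)
    logd = np.zeros((n, 2))
    back = np.zeros((n, 2), dtype=int)
    logd[0] = np.log(p0) + np.log(p_emit[:, obs[0]])
    for t in range(1, n):
        for s in (0, 1):
            cand = logd[t - 1] + np.log(p_trans[:, s])
            back[t, s] = int(np.argmax(cand))
            logd[t, s] = cand[back[t, s]] + np.log(p_emit[s, obs[t]])
    path = [int(np.argmax(logd[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```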
Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform
NASA Astrophysics Data System (ADS)
Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George
2016-08-01
In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed considering the requirements coming from relevant stakeholders. The pilot focuses on the integration in a Big Data platform of data coming from remote and social sensing. The information on land changes coming from the Copernicus Sentinel 1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed social media and news agency monitoring as well as suitable mechanisms to detect and geo-annotate the related events.
Label-free DNA biosensor based on resistance change of platinum nanoparticles assemblies.
Skotadis, Evangelos; Voutyras, Konstantinos; Chatzipetrou, Marianneza; Tsekenis, Georgios; Patsiouras, Lampros; Madianos, Leonidas; Chatzandroulis, Stavros; Zergioti, Ioanna; Tsoukalas, Dimitris
2016-07-15
A novel nanoparticle-based biosensor for the fast and simple detection of DNA hybridization events is presented. The sensor utilizes hybridized DNA's charge transport properties, combining them with metallic nanoparticle networks that act as nano-gapped electrodes. The DNA hybridization events can be detected by a significant reduction in the sensor's resistance due to the conductive bridging offered by hybridized DNA. By modifying the nanoparticle surface coverage, which can be controlled experimentally as a function of deposition time, and the structural properties of the electrodes, an optimized biosensor for the in situ detection of DNA hybridization events is ultimately fabricated. The fabricated biosensor exhibits a wide response range, covering four orders of magnitude, has a limit of detection of 1 nM, and can detect a single base-pair mismatch between probe and complementary DNA. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Abe, K.; Bronner, C.; Pronost, G.; Hayato, Y.; Ikeda, M.; Iyogi, K.; Kameda, J.; Kato, Y.; Kishimoto, Y.; Marti, Ll.; Miura, M.; Moriyama, S.; Nakahata, M.; Nakano, Y.; Nakayama, S.; Okajima, Y.; Orii, A.; Sekiya, H.; Shiozawa, M.; Sonoda, Y.; Takeda, A.; Takenaka, A.; Tanaka, H.; Tasaka, S.; Tomura, T.; Akutsu, R.; Kajita, T.; Kaneyuki, K.; Nishimura, Y.; Okumura, K.; Tsui, K. M.; Labarga, L.; Fernandez, P.; Blaszczyk, F. d. M.; Gustafson, J.; Kachulis, C.; Kearns, E.; Raaf, J. L.; Stone, J. L.; Sulak, L. R.; Berkman, S.; Tobayama, S.; Goldhaber, M.; Elnimr, M.; Kropp, W. R.; Mine, S.; Locke, S.; Weatherly, P.; Smy, M. B.; Sobel, H. W.; Takhistov, V.; Ganezer, K. S.; Hill, J.; Kim, J. Y.; Lim, I. T.; Park, R. G.; Himmel, A.; Li, Z.; O'Sullivan, E.; Scholberg, K.; Walter, C. W.; Ishizuka, T.; Nakamura, T.; Jang, J. S.; Choi, K.; Learned, J. G.; Matsuno, S.; Smith, S. N.; Amey, J.; Litchfield, R. P.; Ma, W. Y.; Uchida, Y.; Wascko, M. O.; Cao, S.; Friend, M.; Hasegawa, T.; Ishida, T.; Ishii, T.; Kobayashi, T.; Nakadaira, T.; Nakamura, K.; Oyama, Y.; Sakashita, K.; Sekiguchi, T.; Tsukamoto, T.; Abe, KE.; Hasegawa, M.; Suzuki, A. T.; Takeuchi, Y.; Yano, T.; Cao, S. V.; Hayashino, T.; Hiraki, T.; Hirota, S.; Huang, K.; Jiang, M.; Minamino, A.; Nakamura, KE.; Nakaya, T.; Quilain, B.; Patel, N. D.; Wendell, R. A.; Anthony, L. H. V.; McCauley, N.; Pritchard, A.; Fukuda, Y.; Itow, Y.; Murase, M.; Muto, F.; Mijakowski, P.; Frankiewicz, K.; Jung, C. K.; Li, X.; Palomino, J. L.; Santucci, G.; Vilela, C.; Wilking, M. J.; Yanagisawa, C.; Ito, S.; Fukuda, D.; Ishino, H.; Kibayashi, A.; Koshio, Y.; Nagata, H.; Sakuda, M.; Xu, C.; Kuno, Y.; Wark, D.; Di Lodovico, F.; Richards, B.; Tacik, R.; Kim, S. B.; Cole, A.; Thompson, L.; Okazawa, H.; Choi, Y.; Ito, K.; Nishijima, K.; Koshiba, M.; Totsuka, Y.; Suda, Y.; Yokoyama, M.; Calland, R. G.; Hartz, M.; Martens, K.; Simpson, C.; Suzuki, Y.; Vagins, M. R.; Hamabe, D.; Kuze, M.; Yoshida, T.; Ishitsuka, M.; Martin, J. F.; Nantais, C. M.; Tanaka, H. A.; Konaka, A.; Chen, S.; Wan, L.; Zhang, Y.; Minamino, A.; Wilkes, R. J.; Super-Kamiokande Collaboration
2017-12-01
We present the results of a search in the Super-Kamiokande (SK) detector for excesses of neutrinos with energies above a few GeV that are in the direction of the track events reported in IceCube. Data from all SK phases (SK-I through SK-IV) were used, spanning a period from 1996 April to 2016 April and corresponding to an exposure of 225 kiloton-years. We considered the 14 IceCube track events from a data set with 1347 livetime days taken from 2010 to 2014. We use Poisson counting to determine if there is an excess of neutrinos detected in SK in a 10° search cone (5° for the highest energy data set) around the reconstructed direction of the IceCube event. No significant excess was found in any of the search directions we examined. We also looked for coincidences with a recently reported IceCube multiplet event. No events were detected within a ±500 s time window around the first detected event, and no significant excess was seen from that direction over the lifetime of SK.
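The Poisson counting test used in such searches has a compact form; a minimal sketch assuming scipy, with the background expectation taken as an illustrative input (in practice it would come from off-source data):

```python
from scipy.stats import poisson

def excess_p_value(n_observed, n_expected_background):
    """P-value for observing at least n_observed events in the search cone,
    given a Poisson-distributed background expectation."""
    return poisson.sf(n_observed - 1, n_expected_background)

# Example: 3 events seen where 1.2 background events are expected.
print(excess_p_value(3, 1.2))
```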
Multi-Sensor Data Fusion Project
2000-02-28
[Report excerpt; only fragments of the abstract are recoverable.] The project extends a seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. Detections are classified into underwater explosions (H), underground, mostly earthquake-generated, sources (T), and noise detections (N); the phases classified as H are the only [...]. Processing for infrasound sensors is most similar to seismic array processing, with the exception that the detections are based on a more sophisticated [...].
Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram
2015-08-01
In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
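Assuming per-window cumulant features have already been extracted from the ACC signal, the first-level fall detector reduces to a binary SVM; a scikit-learn sketch with illustrative hyperparameters, not the ones tuned in the Letter:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_fall_detector(X, y):
    """X: (n_windows, n_features) cumulant features; y: 1 = fall, 0 = non-fall."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X, y)
    return clf  # clf.predict(new_windows) yields fall / non-fall decisions
```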
Investigating Montara platform oil spill accident by implementing RST-OIL approach.
NASA Astrophysics Data System (ADS)
Satriano, Valeria; Ciancia, Emanuele; Coviello, Irina; Di Polito, Carmine; Lacava, Teodosio; Pergola, Nicola; Tramutoli, Valerio
2016-04-01
Oil spills represent one of the most harmful events for marine ecosystems, and their timely detection is crucial for mitigation and management. The potential of satellite data for their detection and monitoring has been largely investigated. Traditional satellite techniques usually identify oil spill presence by applying a fixed threshold scheme only after the occurrence of an event, which makes them poorly suited for prompt identification. The Robust Satellite Technique (RST) approach, in its oil spill detection version (RST-OIL), is based on the comparison of the latest satellite acquisition with previously identified historical reference values, and therefore allows the automatic and near-real-time detection of events. The technique has already been successfully applied to data from different sources (AVHRR, the Advanced Very High Resolution Radiometer, and MODIS, the Moderate Resolution Imaging Spectroradiometer), showing excellent performance in detecting oil spills during both day- and night-time conditions, with a high level of sensitivity (detection even of low-intensity events) and reliability (no false alarms on scene). In this paper, RST-OIL has been implemented on MODIS thermal infrared data for the analysis of the Montara platform (Timor Sea, Australia) oil spill disaster that occurred in August 2009. Preliminary achievements are presented and discussed.
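The RST family of techniques flags anomalies by comparing each pixel with its multi-year reference fields; a minimal sketch of such a pixel-wise index, assuming precomputed historical mean and standard deviation grids (the exact RST-OIL index definition may differ):

```python
import numpy as np

def rst_index(image, hist_mean, hist_std, k=2.0):
    """Pixel-wise RST-style change index: deviation of the current acquisition
    from its pixel-by-pixel historical mean, in units of historical std.

    Pixels with |index| > k are flagged as anomalous (threshold illustrative).
    """
    index = (image - hist_mean) / np.maximum(hist_std, 1e-12)
    return index, np.abs(index) > k
```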
Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles
2007-01-01
Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203
NASA Astrophysics Data System (ADS)
Pötzi, W.; Veronig, A. M.; Temmer, M.
2018-06-01
In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017, and all data back to 2014 were reprocessed using the new algorithm. To evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. To overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method: we divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The hit rate for the new algorithm reached 96% (just five events were missed), with a false-alarm ratio of 17%. This is a significant improvement, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7 ± 2.9 minutes to 1.3 ± 2.3 minutes. The flare start times, which had been systematically late by about 3 minutes with the original algorithm, now match the visual inspection within -0.47 ± 4.10 minutes.
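These scores follow from a standard 2x2 contingency table of detected versus observed events; a minimal sketch of the usual definitions:

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Forecast-verification measures from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                    # hit rate / probability of detection
    far = b / (a + b)                    # false-alarm ratio
    tss = pod - b / (b + d)              # true skill score (Hanssen-Kuipers)
    hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, tss, hss
```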
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Working in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al., 2015, the advantages of using an SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. We expect to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support-vector network to various classical learning algorithms used before in seismic detection and classification is an essential final step in analyzing the advantages and disadvantages of the model.
Search for long distance correlations between extensive air showers detected by the EEE network
NASA Astrophysics Data System (ADS)
Abbrescia, M.; Baldini, L.; Baldini Ferroli, R.; Batignani, G.; Battaglieri, M.; Boi, S.; Bossini, E.; Carnesecchi, F.; Chiavassa, A.; Cicalo, C.; Cifarelli, L.; Coccetti, F.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Fabbri, F. L.; Frolov, V.; Galeotti, P.; Garbini, M.; Gemme, G.; Gnesi, I.; Grazzi, S.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Mandaglio, G.; Maragoto Rodriguez, O.; Maron, G.; Mazziotta, M. N.; Miozzi, S.; Nania, R.; Noferini, F.; Nozzoli, F.; Palmonari, F.; Panareo, M.; Panetta, M. P.; Paoletti, R.; Park, W.; Perasso, L.; Pilo, F.; Piragino, G.; Pisano, S.; Riggi, F.; Righini, G. C.; Ripoli, C.; Sartorelli, G.; Scapparone, E.; Schioppa, M.; Scribano, A.; Selvi, M.; Serci, S.; Squarcia, S.; Taiuti, M.; Terreni, G.; Trifirò, A.; Trimarchi, M.; Vistoli, M. C.; Votano, L.; Williams, M. C. S.; Zheng, L.; Zichichi, A.; Zuyeuski, R.
2018-02-01
A search for long-distance correlations between individual Extensive Air Showers (EAS) detected by pairs of MRPC telescopes of the Extreme Energy Events (EEE) network was carried out; the purpose of this work is to search for an anomaly in these events. A dataset obtained from all 45 possible pairs among 10 EEE cluster sites (hosting at least two telescopes), located at relative distances between 86 and 1200 km, was analyzed, corresponding to an overall time exposure of 3968 days. To estimate the possible event excess with respect to the spurious rate, the number of coincidence events was extracted as a function of the time difference between the arrival of the showers at the two sites, from ±10 s down to the smallest time interval where events are still observed. The analysis took into account both the time and orientation correlation between the showers detected by the telescope pairs. A few candidate events with unusually small time difference and angular distance were observed, with a p-value appreciably smaller than a confidence level of 0.05.
The analysis of a complex fire event using multispaceborne observations
NASA Astrophysics Data System (ADS)
Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil
2018-04-01
This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison between different observational datasets.
Secure access control and large scale robust representation for online multimedia event detection.
Liu, Changyu; Lu, Bin; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, secure access control and large-scale robust representation become issues when integrating traditional event detection algorithms into an online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for access control in dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Huber, David J.; Martin, Kevin
2017-05-01
This paper describes a technique in which we improve upon the prior performance of the Rapid Serial Visual Presentation (RSVP) EEG paradigm for image classification through the insertion of visual attention distracters and overall sequence reordering based upon the expected ratio of rare to common "events" in the environment and operational context. Inserting distracter images maintains the ratio of common events to rare events at an ideal level, maximizing rare event detection via the P300 EEG response to the RSVP stimuli. The method has two steps: first, we compute the optimal number of distracters needed for an RSVP stimulus based on the desired sequence length and expected number of targets, and insert the distracters into the RSVP sequence; then we reorder the RSVP sequence to maximize P300 detection. We show that by reducing the ratio of target events to nontarget events using this method, we can allow RSVP sequences with more targets without sacrificing area under the ROC curve (Az).
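The first step is simple arithmetic; a minimal sketch, where the ideal target fraction is an assumed illustrative parameter rather than a value from the paper:

```python
def n_distracters(n_targets, n_nontargets, ideal_target_fraction=0.1):
    """Distracter images needed so targets make up the ideal fraction of the
    final RSVP sequence: solve n_targets / (total + extra) = fraction."""
    total = n_targets + n_nontargets
    extra = n_targets / ideal_target_fraction - total
    return max(0, int(round(extra)))

# Example: 6 targets among 30 images at a 10% ideal fraction -> 30 distracters.
print(n_distracters(6, 24))
```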
Method for early detection of cooling-loss events
Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.
2015-06-30
A method for the early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit; generating a warning in the event the measured relative humidity is outside the defined limit; determining whether the change in measured relative humidity is less than the defined change threshold; and generating an alarm in the event the change is greater than the defined change threshold.
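The claimed logic reduces to two checks per measurement cycle; a minimal sketch with illustrative limits (the patent text gives no numeric values):

```python
def cooling_loss_check(rh, prev_rh, rh_limits=(30.0, 60.0), change_threshold=5.0):
    """Return (warning, alarm) for one relative-humidity measurement.

    warning: RH is outside the defined limits for the space.
    alarm:   RH change since the last measurement exceeds the change threshold.
    """
    warning = not (rh_limits[0] <= rh <= rh_limits[1])
    alarm = abs(rh - prev_rh) > change_threshold
    return warning, alarm
```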
Dimension-based attention in visual short-term memory.
Pilling, Michael; Barrett, Doug J K
2016-07-01
We investigated how dimension-based attention influences visual short-term memory (VSTM). This was done through examining the effects of cueing a feature dimension in two perceptual comparison tasks (change detection and sameness detection). In both tasks, a memory array and a test array consisting of a number of colored shapes were presented successively, interleaved by a blank interstimulus interval (ISI). In Experiment 1 (change detection), the critical event was a feature change in one item across the memory and test arrays. In Experiment 2 (sameness detection), the critical event was the absence of a feature change in one item across the two arrays. Auditory cues indicated the feature dimension (color or shape) of the critical event with 80% validity; the cues were presented either prior to the memory array, during the ISI, or simultaneously with the test array. In Experiment 1, the cue validity influenced sensitivity only when the cue was given at the earliest position; in Experiment 2, the cue validity influenced sensitivity at all three cue positions. We attributed the greater effectiveness of top-down guidance by cues in the sameness detection task to the more active nature of the comparison process required to detect sameness events (Hyun, Woodman, Vogel, Hollingworth, & Luck, Journal of Experimental Psychology: Human Perception and Performance, 35, 1140-1160, 2009).
Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Darren
2004-07-01
MatSeis's infrasound analysis tool, Infra Tool, uses frequency-slowness processing to deconstruct array data into three outputs per processing step: correlation, azimuth and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns in the signal-processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value; together, these two conditions describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file lists the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency-slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg) and trace velocity (km/s). As an example, a ground-truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array locations, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window closely correlated with the theoretical stratospheric arrival time. Further testing will be required to tune detection threshold parameters for different types of infrasound events.
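A least-squares stand-in for the Inverse Slope test, assuming numpy arrays of per-step azimuth and correlation values; the thresholds are illustrative, and the actual implementation also weighs the azimuth variance:

```python
import numpy as np

def flat_azimuth_detection(times, azimuth, correlation,
                           slope_tol=2.0, corr_min=0.5, n=5):
    """Flag windows of n consecutive processing steps whose azimuth trend is
    near-flat while the correlation stays above background."""
    hits = []
    for i in range(len(azimuth) - n):
        slope = np.polyfit(times[i:i + n], azimuth[i:i + n], 1)[0]
        if abs(slope) < slope_tol and correlation[i:i + n].min() > corr_min:
            hits.append(i)  # index of the first step in a detected window
    return hits
```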
Twitter Seismology: Earthquake Monitoring and Response in a Social World
NASA Astrophysics Data System (ADS)
Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.
2011-12-01
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers, caused by a space-shuttle landing and "The Great California ShakeOut". This number of detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.
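A minimal sketch of such a detector on a per-minute keyword tweet-count series; the window lengths and threshold are illustrative, not the USGS-tuned values:

```python
import numpy as np

def tweet_sta_lta(counts, n_sta=2, n_lta=60, threshold=8.0):
    """Short-Term-Average / Long-Term-Average trigger on a tweet-count series."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for t in range(n_lta, len(counts)):
        sta = counts[t - n_sta:t].mean()
        lta = counts[t - n_lta:t].mean()
        if lta > 0 and sta / lta >= threshold:
            triggers.append(t)   # minute index of a candidate felt event
    return triggers
```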
Determining dark matter properties with a XENONnT/LZ signal and LHC Run 3 monojet searches
NASA Astrophysics Data System (ADS)
Baum, Sebastian; Catena, Riccardo; Conrad, Jan; Freese, Katherine; Krauss, Martin B.
2018-04-01
We develop a method to forecast the outcome of the LHC Run 3 based on the hypothetical detection of O(100) signal events at XENONnT. Our method relies on a systematic classification of renormalizable single-mediator models for dark matter-quark interactions and is valid for dark matter candidates of spin less than or equal to one. Applying our method to simulated data, we find that at the end of the LHC Run 3 only two mutually exclusive scenarios would be compatible with the detection of O(100) signal events at XENONnT. In the first scenario, the energy distribution of the signal events is featureless, as for canonical spin-independent interactions. In this case, if a monojet signal is detected at the LHC, dark matter must have spin 1/2 and interact with nucleons through a unique velocity-dependent operator. If a monojet signal is not detected, dark matter interacts with nucleons through canonical spin-independent interactions. In a second scenario, the spectral distribution of the signal events exhibits a bump at nonzero recoil energies. In this second case, a monojet signal can be detected at the LHC Run 3; dark matter must have spin 1/2 and interact with nucleons through a unique momentum-dependent operator. We therefore conclude that the observation of O(100) signal events at XENONnT combined with the detection, or the lack of detection, of a monojet signal at the LHC Run 3 would significantly narrow the range of possible dark matter-nucleon interactions. As we argued above, it can also provide key information on the dark matter particle spin.
NASA Technical Reports Server (NTRS)
Thompson, D. J.; Bertsch, D. L.; ONeal, R. H., Jr.
2005-01-01
During its nine-year lifetime, the Energetic Gamma Ray Experiment Telescope (EGRET) on the Compton Gamma Ray Observatory (CGRO) detected 1506 cosmic photons with measured energy E > 10 GeV. Of this number, 187 are found within 1 deg of sources that are listed in the Third EGRET Catalog and were included in determining the detection likelihood, flux, and spectra of those sources. In particular, five detected EGRET pulsars are found to have events above 10 GeV, and together they account for 37 events. A pulsar not included in the Third EGRET Catalog has 2 events, both with the same phase and in one peak of the lower-energy gamma-ray light curve. Most of the remaining 1319 events appear to be diffuse Galactic and extragalactic radiation, based on the similarity of their spatial and energy distributions to those of the diffuse model and of the E > 100 MeV emission. No significant time clustering that would suggest a burst was detected.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become ubiquitous. This has resulted in the growth of illicit events, and consequently computer and network security has become an essential concern in any computing environment. Many methodologies have been created to identify such events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a time frame appropriate for the application.
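As a simplified stand-in for the statistical method, flows can be aggregated per time bin and bins that deviate strongly from the baseline flagged; the threshold here is illustrative:

```python
import numpy as np

def flow_anomalies(flows_per_minute, z_threshold=3.0):
    """Flag minutes whose NetFlow record count deviates strongly from the
    baseline, using a simple z-score over the observation window."""
    x = np.asarray(flows_per_minute, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = (x - mu) / max(sigma, 1e-12)
    return np.flatnonzero(np.abs(z) > z_threshold)  # indices of anomalous bins
```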
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic information (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for the ATLS is the detection of seismic events in a long-term, continuously recorded time series. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) on the seismic record. In this framework, an anomalous signal (a seismic event) is treated as a change point in the time series (the seismic record): because of the occurrence of the event, the statistical model of the signal changes in its neighborhood. In other words, SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, and is therefore an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for online processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of present statistics on future data, which makes SDAR robust for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we employ SDAR on a synthetic model and on Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
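A much-simplified AR(1) rendering of the SDAR idea, for illustration only: parameters are updated with a discounting rate r and each sample is scored by its negative log-likelihood under the current model, so peaks in the score mark candidate change points:

```python
import numpy as np

def sdar_scores(x, r=0.02):
    """Online change-point scores from a discounted AR(1) model (simplified).

    r is the discounting rate; larger r forgets the past faster.
    """
    x = np.asarray(x, dtype=float)
    mu, var, a = x[0], 1.0, 0.0
    scores = np.zeros(len(x))
    for t in range(1, len(x)):
        pred = mu + a * (x[t - 1] - mu)          # one-step AR(1) prediction
        err = x[t] - pred
        scores[t] = 0.5 * (np.log(2 * np.pi * var) + err ** 2 / var)
        # sequentially discounted updates of mean, AR coefficient and variance
        mu = (1 - r) * mu + r * x[t]
        denom = (x[t - 1] - mu) ** 2 + 1e-12
        a = (1 - r) * a + r * np.clip((x[t] - mu) * (x[t - 1] - mu) / denom, -1, 1)
        var = (1 - r) * var + r * err ** 2
    return scores  # peaks mark candidate change points (event onsets)
```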
Prevalence and test characteristics of national health safety network ventilator-associated events.
Lilly, Craig M; Landry, Karen E; Sood, Rahul N; Dunnington, Cheryl H; Ellison, Richard T; Bagley, Peter H; Baker, Stephen P; Cody, Shawn; Irwin, Richard S
2014-09-01
The primary aim of the study was to measure the test characteristics of the National Health Safety Network ventilator-associated event/ventilator-associated condition constructs for detecting ventilator-associated pneumonia. Its secondary aims were to report the clinical features of patients with National Health Safety Network ventilator-associated event/ventilator-associated condition, to measure the costs of surveillance, and to assess its susceptibility to manipulation. Design: prospective cohort study. Setting: two inpatient campuses of an academic medical center. Patients: 8,408 mechanically ventilated adults discharged from an ICU. Interventions: none. The National Health Safety Network ventilator-associated event/ventilator-associated condition constructs detected less than a third of ventilator-associated pneumonia cases, with a sensitivity of 0.325 and a positive predictive value of 0.07. Most National Health Safety Network ventilator-associated event/ventilator-associated condition cases (93%) did not have ventilator-associated pneumonia or other hospital-acquired complications; 71% met the definition for acute respiratory distress syndrome. Similarly, most patients with National Health Safety Network probable ventilator-associated pneumonia did not have ventilator-associated pneumonia because radiographic criteria were not met. National Health Safety Network ventilator-associated event/ventilator-associated condition rates were reduced 93% by an unsophisticated manipulation of ventilator management protocols. The National Health Safety Network ventilator-associated event/ventilator-associated condition constructs failed to detect many patients who had ventilator-associated pneumonia, detected many cases that did not have a hospital complication, and were susceptible to manipulation. National Health Safety Network ventilator-associated event/ventilator-associated condition surveillance did not perform as well as ventilator-associated pneumonia surveillance and had several undesirable characteristics.
Perspectives of Cross-Correlation in Seismic Monitoring at the International Data Centre
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Zerbo, Lassina
2014-03-01
We demonstrate that several techniques based on waveform cross-correlation are able to significantly reduce the detection threshold of seismic sources worldwide and to improve the reliability of arrivals by a more accurate estimation of their defining parameters. A master event and the events it can find using waveform cross-correlation at array stations of the International Monitoring System (IMS) have to be close. For the purposes of the International Data Centre (IDC), one can use the spatial closeness of the master and slave events in order to construct a new automatic processing pipeline: all qualified arrivals detected using cross-correlation are associated with events matching the current IDC event definition criteria (EDC) in a local association procedure. Considering the repeating character of global seismicity, more than 90% of events in the reviewed event bulletin (REB) can be built in this automatic processing. Due to the reduced detection threshold, waveform cross-correlation may increase the number of valid REB events by a factor of 1.5-2.0. Therefore, the new pipeline may produce a more comprehensive bulletin than the current pipeline, which is the goal of seismic monitoring. The analysts' experience with the cross correlation event list (XSEL) shows that the workload of interactive processing might be reduced by a factor of two or even more. Since cross-correlation produces a comprehensive list of detections for a given master event, no additional arrivals from primary stations are expected to be associated with the XSEL events. The number of false alarms, relative to the number of events rejected from the standard event list 3 (SEL3) in the current interactive processing, can also be reduced by the use of several powerful filters. The principal filter is the difference between the arrival times of the master and newly built events at three or more primary stations, which should lie in a narrow range of a few seconds. In this study, one event at a distance of about 2,000 km from the main shock was formed by three stations, with the stations and both events on the same great circle. Such spurious events are rejected by checking consistency between detections at stations at different back azimuths from the source region. Two additional effective pre-filters are f-k analysis and F prob based on correlation traces instead of original waveforms. Overall, waveform cross-correlation is able to improve the REB completeness, to reduce the workload related to IDC interactive analysis, and to provide a precise tool for quality checks of both arrivals and events. Some major improvements in automatic and interactive processing achieved by cross-correlation are illustrated using an aftershock sequence from a large continental earthquake. Exploring this sequence, we describe schematically the next steps for the development of a processing pipeline parallel to the existing IDC one in order to improve the quality of the REB together with the reduction of the magnitude threshold.
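At its core, the detection stage slides a master-event template over continuous data and keeps windows whose normalized cross-correlation coefficient exceeds a threshold; a minimal single-channel sketch (the IDC pipeline additionally stacks array channels and applies the screening described above; production code would use FFT-based correlation for speed):

```python
import numpy as np

def correlation_detector(template, trace, threshold=0.7):
    """Return (start_index, cc) for windows whose normalized cross-correlation
    coefficient with the master-event template exceeds the threshold."""
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    hits = []
    for i in range(len(trace) - m):
        w = trace[i:i + m]
        cc = np.sum(t * (w - w.mean())) / max(w.std(), 1e-20)
        if cc >= threshold:
            hits.append((i, float(cc)))
    return hits
```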
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
Passive acoustic monitoring to detect spawning in large-bodied catostomids
Straight, Carrie A.; Freeman, Byron J.; Freeman, Mary C.
2014-01-01
Documenting timing, locations, and intensity of spawning can provide valuable information for conservation and management of imperiled fishes. However, deep, turbid or turbulent water, or occurrence of spawning at night, can severely limit direct observations. We have developed and tested the use of passive acoustics to detect distinctive acoustic signatures associated with spawning events of two large-bodied catostomid species (River Redhorse Moxostoma carinatum and Robust Redhorse Moxostoma robustum) in river systems in north Georgia. We deployed a hydrophone with a recording unit at four different locations on four different dates when we could both record and observe spawning activity. Recordings captured 494 spawning events that we acoustically characterized using dominant frequency, 95% frequency, relative power, and duration. We similarly characterized 46 randomly selected ambient river noises. Dominant frequency did not differ between redhorse species and ranged from 172.3 to 14,987.1 Hz. Duration of spawning events ranged from 0.65 to 11.07 s, River Redhorse having longer durations than Robust Redhorse. Observed spawning events had significantly higher dominant and 95% frequencies than ambient river noises. We additionally tested software designed to automate acoustic detection. The automated detection configurations correctly identified 80–82% of known spawning events and falsely identified spawns 6–7% of the time when none occurred. These rates were combined over all recordings; rates were more variable among individual recordings. Longer spawning events were more likely to be detected. Combined with sufficient visual observations to ascertain species identities and to estimate detection error rates, passive acoustic recording provides a useful tool to study spawning frequency of large-bodied fishes that displace gravel during egg deposition, including several species of imperiled catostomids.
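The spectral characterization used here (dominant frequency and 95% frequency) can be computed from a windowed power spectrum; a minimal sketch assuming a numpy environment:

```python
import numpy as np

def spectral_features(x, fs):
    """Dominant frequency and 95% (spectral edge) frequency of one recording."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    dominant = freqs[np.argmax(spec)]
    cumulative = np.cumsum(spec) / spec.sum()
    edge95 = freqs[np.searchsorted(cumulative, 0.95)]
    return dominant, edge95
```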
Blowing snow detection from ground-based ceilometers: application to East Antarctica
NASA Astrophysics Data System (ADS)
Gossart, Alexandra; Souverijns, Niels; Gorodetskaya, Irina V.; Lhermitte, Stef; Lenaerts, Jan T. M.; Schween, Jan H.; Mangold, Alexander; Laffineur, Quentin; van Lipzig, Nicole P. M.
2017-12-01
Blowing snow impacts Antarctic ice sheet surface mass balance by snow redistribution and sublimation. However, numerical models poorly represent blowing snow processes, while direct observations are limited in space and time. Satellite retrieval of blowing snow is hindered by clouds and only the strongest events are considered. Here, we develop a blowing snow detection (BSD) algorithm for ground-based remote-sensing ceilometers in polar regions and apply it to ceilometers at Neumayer III and Princess Elisabeth (PE) stations, East Antarctica. The algorithm is able to detect (heavy) blowing snow layers reaching 30 m height. Results show that 78% of the detected events are in agreement with visual observations at Neumayer III station. The BSD algorithm detects heavy blowing snow 36% of the time at Neumayer (2011-2015) and 13% at PE station (2010-2016). Blowing snow occurrence peaks during the austral winter and shows around 5% interannual variability. The BSD algorithm is capable of detecting blowing snow both lifted from the ground and occurring during precipitation, which is an added value since results indicate that 92% of the blowing snow is during synoptic events, often combined with precipitation. Analysis of atmospheric meteorological variables shows that blowing snow occurrence strongly depends on fresh snow availability in addition to wind speed. This finding challenges the commonly used parametrizations, where the threshold for snow particles to be lifted is a function of wind speed only. Blowing snow occurs predominantly during storms and overcast conditions, shortly after precipitation events, and can reach up to 1300 m a.g.l. in the case of heavy mixed events (precipitation and blowing snow together). These results suggest that synoptic conditions play an important role in generating blowing snow events and that fresh snow availability should be considered in determining the blowing snow onset.
Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash
2010-04-01
The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.
Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin
2013-04-15
There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process on a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage, all adverse events found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between the two expert review teams in the detection of adverse events.
Smalling, Kelly L.; Orlando, James L.
2011-01-01
Water and sediment (bed and suspended) were collected from January 2008 through October 2009 from 12 sites in 3 of the largest watersheds along California's Central Coast (Pajaro, Salinas, and Santa Maria Rivers) and analyzed for a suite of pesticides by the U.S. Geological Survey. Water samples were collected in each watershed from the estuaries and major tributaries during 4 storm events and 11 dry season sampling events in 2008 and 2009. Bed sediments were collected from depositional zones at the tributary sampling sites three times over the course of the study. Suspended sediment samples were collected from the major tributaries during the four storm events and in the tributaries and estuaries during three dry season sampling events in 2009. Water samples were analyzed for 68 pesticides using gas chromatography/mass spectrometry. A total of 38 pesticides were detected in 144 water samples, and 13 pesticides were detected in more than half the samples collected over the course of the study. Dissolved pesticide concentrations ranged from below their method detection limits to 36,000 nanograms per liter (boscalid). The most frequently detected pesticides in water from all the watersheds were azoxystrobin, boscalid, chlorpyrifos, DCPA, diazinon, oxyfluorfen, prometryn, and propyzamide, which were found in more than 80 percent of the samples. On average, detection frequencies and concentrations were higher in samples collected during winter storm events compared to the summer dry season. With the exception of the fungicide myclobutanil, the Santa Maria estuary watershed exhibited higher pesticide detection frequencies than the Pajaro and Salinas watersheds. Bed and suspended sediment samples were analyzed for 55 pesticides using accelerated solvent extraction, gel permeation chromatography for sulfur removal, and carbon/alumina stacked solid-phase extraction cartridges to remove interfering sediment matrices. In bed sediment samples, 17 pesticides were detected including pyrethroid and organophosphate (OP) insecticides, p,p'-DDT and its degradates, as well as several herbicides. The only pesticides detected more than half the time were p,p'-DDD, p,p'-DDE, and p,p'-DDT. Maximum pesticide concentrations ranged from less than their respective method detection limits to 234 micrograms per kilogram (p,p'-DDE). Four pyrethroids (bifenthrin, λ-cyhalothrin, permethrin, and τ-fluvalinate) were detected in bed sediment samples, though concentrations were relatively low (less than 10 micrograms per kilogram). The greatest number of pesticides was detected in samples collected from Lower Orcutt Creek, the major tributary to the Santa Maria estuary. In suspended sediment samples, 19 pesticides were detected, and maximum concentrations ranged from less than the method detection limits to 549 micrograms per kilogram (chlorpyrifos). The most frequently detected pesticides were p,p'-DDE (49 percent), p,p'-DDT (38 percent), and chlorpyrifos (32 percent). During storm events, 19 pesticides were detected in suspended sediment samples compared to 10 detected during the dry season. Pesticide concentrations commonly were higher in suspended sediments during storm events than during the dry season, as well.
On comprehensive recovery of an aftershock sequence with cross correlation
NASA Astrophysics Data System (ADS)
Kitov, I.; Bobrov, D.; Coyne, J.; Turyomurugyendo, G.
2012-04-01
We have introduced cross correlation between seismic waveforms as a technique for signal detection and automatic event building at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization. The intuition behind signal detection is simple - small and mid-sized seismic events close in space should produce similar signals at the same seismic stations. Equivalently, these signals have to be characterized by a high cross correlation coefficient. For array stations with many individual sensors distributed over a large area, signals from events at distances beyond, say, 50 km, are subject to destructive interference when cross correlated due to changing time delays between various channels. Thus, any cross correlation coefficient above some predefined threshold can be considered as a signature of a valid signal. With a dense grid of master events (spacing between adjacent masters between 20 km and 50 km corresponds to the statistically estimated correlation distance) with high quality (signal-to-noise ratio above 10) template waveforms at primary array stations of the International Monitoring System one can detect signals from and then build natural and manmade seismic events close to the master ones. The use of cross correlation allows the detection of smaller signals (sometimes below the noise level) than the current IDC detection techniques provide. As a result, it is possible to automatically build from 50% to 100% more valid seismic events than included in the Reviewed Event Bulletin (REB). We have developed a tentative pipeline for automatic processing at the IDC. It includes three major stages. Firstly, we calculate the cross correlation coefficient for a given master and continuous waveforms at the same stations and carry out signal detection based on the statistical behavior of the signal-to-noise ratio of the cross correlation coefficient. Secondly, a thorough screening is performed for all obtained signals using f-k analysis and F-statistics as applied to the cross-correlation traces at individual channels of all included array stations. Thirdly, local (i.e. confined to the correlation distance around the master event) association of origin times of all qualified signals is fulfilled. These origin times are calculated from the arrival times of these signals, which are reduced to the origin times by the travel times from the master event. An aftershock sequence of a mid-size earthquake is an ideal case to test cross correlation techniques for automatic event building. All events should be close to the mainshock and occur within several days. Here we analyse the aftershock sequence of an earthquake in the North Atlantic Ocean with mb(IDC)=4.79. The REB includes 38 events at distances less than 150 km from the mainshock. Our ultimate goal is to exercise the complete iterative procedure to find all possible aftershocks. We start with the mainshock and recover ten aftershocks with the largest number of stations to produce an initial set of master events with the highest quality templates. Then we find all aftershocks in the REB and many additional events, which were not originally found by the IDC. Using all events found after the first iteration as master events we find new events, which are also used in the next iteration. The iterative process stops when no new events can be found. In that sense the final set of aftershocks obtained with cross correlation is a comprehensive one.
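As a rough illustration of the master-event approach described above, the following Python sketch slides a normalized cross-correlation of a template against a continuous trace and flags lags where the correlation stands out against its own noise level. The synthetic data, threshold, and noise proxy are illustrative assumptions, not the IDC pipeline, which operates per-channel across IMS array stations.

```python
import numpy as np

def sliding_normalized_cc(template: np.ndarray, trace: np.ndarray) -> np.ndarray:
    """Normalized cross-correlation of a master template against a longer trace."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return cc

def detect(template, trace, snr_threshold=8.0):
    """Flag lags where the CC trace stands out against its own noise level."""
    cc = sliding_normalized_cc(template, trace)
    noise = np.median(np.abs(cc)) + 1e-12        # robust noise-level proxy
    snr = np.abs(cc) / noise
    return np.flatnonzero(snr > snr_threshold), cc

rng = np.random.default_rng(0)
master = rng.standard_normal(200)                # template from a master event
stream = rng.standard_normal(5000)
stream[3000:3200] += 0.7 * master                # buried repeat of the master
hits, cc = detect(master, stream)
print(hits[:5])                                  # detections near sample 3000
```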
Event detection for car park entries by video-surveillance
NASA Astrophysics Data System (ADS)
Coquin, Didier; Tailland, Johan; Cintract, Michel
2007-10-01
Intelligent surveillance has become an important research issue due to the high cost and low efficiency of human supervisors, and machine intelligence is required to provide a solution for automated event detection. In this paper we describe a real-time system that has been used for detecting car park entries, using an adaptive background learning algorithm and two indicators representing activity and identity to overcome the difficulty of tracking objects.
Sudden Event Recognition: A Survey
Suriani, Nor Surayahani; Hussain, Aini; Zulkifley, Mohd Asyraf
2013-01-01
Event recognition is one of the most active research areas in video surveillance fields. Advancement in event recognition systems mainly aims to provide convenience, safety and an efficient lifestyle for humanity. A precise, accurate and robust approach is necessary to enable event recognition systems to respond to sudden changes in various uncontrolled environments, such as the case of an emergency, physical threat and a fire or bomb alert. The performance of sudden event recognition systems depends heavily on the accuracy of low level processing, like detection, recognition, tracking and machine learning algorithms. This survey aims to detect and characterize a sudden event, which is a subset of an abnormal event in several video surveillance applications. This paper discusses the following in detail: (1) the importance of a sudden event over a general anomalous event; (2) frameworks used in sudden event recognition; (3) the requirements and comparative studies of a sudden event recognition system and (4) various decision-making approaches for sudden event recognition. The advantages and drawbacks of using 3D images from multiple cameras for real-time application are also discussed. The paper concludes with suggestions for future research directions in sudden event recognition. PMID:23921828
Automatic detection of lift-off and touch-down of a pick-up walker using 3D kinematics.
Grootveld, L; Thies, S B; Ogden, D; Howard, D; Kenney, L P J
2014-02-01
Walking aids have been associated with falls and it is believed that incorrect use limits their usefulness. Measures are therefore needed that characterize their stable use and the classification of key events in walking aid movement is the first step in their development. This study presents an automated algorithm for detection of lift-off (LO) and touch-down (TD) events of a pick-up walker. For algorithm design and initial testing, a single user performed trials for which the four individual walker feet lifted off the ground and touched down again in various sequences, and for different amounts of frame loading (Dataset_1). For further validation, ten healthy young subjects walked with the pick-up walker on flat ground (Dataset_2a) and on a narrow beam (Dataset_2b), to challenge balance. One 88-year-old walking frame user was also assessed. Kinematic data were collected with a 3D optoelectronic camera system. The algorithm detected over 93% of events (Dataset_1), and 95% and 92% in Dataset_2a and b, respectively. Of the various LO/TD sequences, those associated with natural progression resulted in up to 100% correctly identified events. For the 88-year-old walking frame user, 96% of LO events and 93% of TD events were detected, demonstrating the potential of the approach. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
Revisiting the 1988 Pluto Occultation
NASA Astrophysics Data System (ADS)
Bosh, Amanda S.; Dunham, Edward W.; Young, Leslie A.; Slivan, Steve; Barba née Cordella, Linda L.; Millis, Robert L.; Wasserman, Lawrence H.; Nye, Ralph
2015-11-01
In 1988, Pluto's atmosphere was surmised to exist because of the surface ices that had been detected through spectroscopy, but it had not yet been directly detected in a definitive manner. The key to making such a detection was the stellar occultation method, used so successfully for the discovery of the Uranian rings in 1977 (Elliot et al. 1989; Millis et al. 1993) and before that for studies of the atmospheres of other planets. On 9 June 1988, Pluto occulted a star, with its shadow falling over the South Pacific Ocean region. One team of observers recorded this event from the Kuiper Airborne Observatory, while other teams captured the event from various locations in Australia and New Zealand. Preceding this event, extensive astrometric observations of Pluto and the star were collected in order to refine the prediction. We will recount the investigations that led up to this important Pluto occultation, discuss the unexpected atmospheric results, and compare the 1988 event to the recent 2015 event whose shadow followed a similar track through New Zealand and Australia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bame, D.
To determine if seismic signals at frequencies up to 50 Hz are useful for detecting events and discriminating between earthquakes and explosions, approximately 180 events from the three-component high-frequency seismic element (HFSE) installed at the center of the Norwegian Regional Seismic Array (NRSA) have been analyzed. The attenuation of high-frequency signals in Scandinavia varies with distance, azimuth, magnitude, and source effects. Most of the events were detected with HFSE, although detections were better on the NRSA where signal processing techniques were used. Based on a preliminary analysis, high-frequency data do not appear to be a useful discriminant in Scandinavia. 21 refs., 29 figs., 3 tabs.
Association rule mining in the US Vaccine Adverse Event Reporting System (VAERS).
Wei, Lai; Scott, John
2015-09-01
Spontaneous adverse event reporting systems are critical tools for monitoring the safety of licensed medical products. Commonly used signal detection algorithms identify disproportionate product-adverse event pairs and may not be sensitive to more complex potential signals. We sought to develop a computationally tractable multivariate data-mining approach to identify product-multiple adverse event associations. We describe an application of stepwise association rule mining (Step-ARM) to detect potential vaccine-symptom group associations in the US Vaccine Adverse Event Reporting System. Step-ARM identifies strong associations between one vaccine and one or more adverse events. To reduce the number of redundant association rules found by Step-ARM, we also propose a clustering method for the post-processing of association rules. In sample applications to a trivalent intradermal inactivated influenza virus vaccine and to measles, mumps, rubella, and varicella (MMRV) vaccine and in simulation studies, we find that Step-ARM can detect a variety of medically coherent potential vaccine-symptom group signals efficiently. In the MMRV example, Step-ARM appears to outperform univariate methods in detecting a known safety signal. Our approach is sensitive to potentially complex signals, which may be particularly important when monitoring novel medical countermeasure products such as pandemic influenza vaccines. The post-processing clustering algorithm improves the applicability of the approach as a screening method to identify patterns that may merit further investigation. Copyright © 2015 John Wiley & Sons, Ltd.
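A toy sketch in the spirit of mining vaccine-to-symptom-group rules by support and confidence follows; it is not the Step-ARM algorithm itself, and the report records, symptom names, and thresholds are all hypothetical.

```python
from itertools import combinations

reports = [  # (vaccine, set of reported symptoms) -- toy data
    ("MMRV", {"fever", "rash"}),
    ("MMRV", {"fever", "rash", "seizure"}),
    ("MMRV", {"fever"}),
    ("FLU",  {"soreness"}),
    ("FLU",  {"fever", "soreness"}),
]

def mine_rules(reports, vaccine, max_len=2, min_support=0.3, min_conf=0.5):
    """Return vaccine -> symptom-set rules passing support/confidence cuts."""
    n = len(reports)
    vax_reports = [s for v, s in reports if v == vaccine]
    symptoms = sorted(set().union(*[s for _, s in reports]))
    rules = []
    for k in range(1, max_len + 1):
        for combo in combinations(symptoms, k):
            joint = sum(1 for s in vax_reports if set(combo) <= s)
            if joint == 0:
                continue
            support = joint / n                     # P(vaccine and symptoms)
            confidence = joint / len(vax_reports)   # P(symptoms | vaccine)
            if support >= min_support and confidence >= min_conf:
                rules.append((combo, round(support, 2), round(confidence, 2)))
    return rules

print(mine_rules(reports, "MMRV"))   # e.g. fever, rash, and {fever, rash} rules
```

A post-processing step like the clustering described above would then merge rules whose symptom sets overlap heavily, reducing redundancy before review.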
Solar Neutrino flare detection in Hyperkamiokande and SK
NASA Astrophysics Data System (ADS)
Fargion, Daniele
2016-07-01
The possible construction and near-term operation of a megaton-scale neutrino detector at HyperKamiokande, together with the gadolinium-liquid upgrade of the older SK detector, might open the way to future detection of electron neutrinos and antineutrinos from the largest solar flares (pion traces at tens of MeV). The multiwavelength detection of X-ray, gamma-ray, and neutrino events might offer a deep view of such solar acceleration and of the neutrino flavor mix along the flight path. The possible near-future discovery of such events would open a third neutrino astronomy window, after the rarest SN 1987A and persistent solar nuclear signals.
NASA Astrophysics Data System (ADS)
Li, C.; Li, Z.; Peng, Z.; Zhang, C.; Nakata, N.
2017-12-01
Oklahoma has experienced an abrupt increase in induced seismicity in the last decade. An important way to fully understand seismic activities in Oklahoma is to obtain more complete earthquake catalogs and detect different types of seismic events. The IRIS Community Wavefield Demonstration Experiment was deployed near Enid, Oklahoma in the summer of 2016. The dataset from this ultra-dense array provides an excellent opportunity for detecting microseismicity in that region with wavefield approaches. Here we examine continuous waveforms recorded by 3 seismic lines using local coherence for ultra-dense arrays (Li et al., 2017), which is a measure of the cross-correlation of the waveform at each station with those at nearby stations. So far we have detected more than 5,000 events from 06/22/2016 to 07/20/2016, and the majority of them are not listed in the regional catalog of Oklahoma or global catalogs, indicating that they are local events. We also identify 15-20 long-period long-duration events, some of them lasting for more than 500 s. Such events have been found at major plate-boundary faults (also known as deep tectonic tremor), as well as during hydraulic fracturing, slow-moving landslides and glaciers. Our next step is to locate these possible tremor-like events with their relative arrival times across the array and compare their occurrence times with solid-earth tides and injection histories to better understand their driving mechanisms.
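A minimal sketch of the local-coherence idea follows: for each station, take the peak normalized cross-correlation of its trace with each nearby station and average over neighbors. The array geometry, lag range, and synthetic data are assumptions for illustration; the normalization over the full window is approximate at nonzero lags.

```python
import numpy as np

def max_norm_xcorr(a: np.ndarray, b: np.ndarray, max_lag: int) -> float:
    """Peak normalized cross-correlation of two traces within +/- max_lag."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            c = np.dot(a[lag:], b[:len(b) - lag])
        else:
            c = np.dot(a[:lag], b[-lag:])
        best = max(best, abs(c))
    return best

def local_coherence(traces: np.ndarray, neighbors: dict, max_lag: int = 20):
    """traces: (n_stations, n_samples); neighbors: station -> nearby stations."""
    return {s: np.mean([max_norm_xcorr(traces[s], traces[j], max_lag)
                        for j in neighbors[s]])
            for s in neighbors}

rng = np.random.default_rng(1)
sig = rng.standard_normal(500)                       # common local-event signal
traces = rng.standard_normal((3, 500)) * 0.5 + sig   # signal + station noise
print(local_coherence(traces, {0: [1, 2], 1: [0, 2], 2: [0, 1]}))
```

Windows whose coherence rises well above the background level would then be flagged as candidate local events.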
Analyzing the Possibility of Dynamic Earthquake Triggering in Socorro, New Mexico
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.
2011-12-01
The release of energy during an earthquake changes the stress state and seismicity both locally and remotely. Far-field stress changes can lead to triggered earthquakes coinciding with the arrival of the surface waves. This dynamic triggering is found to occur in a variety of tectonic settings, but in particular magmatic regions. Here we test whether the Socorro Magma Body region in central New Mexico hosts triggered seismicity. Preliminary inspection of continuous network data in central New Mexico suggested a local triggered event with the passage of surface waves from an MW 6.9 event in 2009. For a more comprehensive view, we examine data from 379 earthquakes of MW ≥ 6.0 from January 15, 2008 to March 13, 2010 recorded on the EarthScope USArray Transportable Array stations located within New Mexico, which provide denser coverage for better detectability. Waveforms from twenty EarthScope stations were windowed around the time of the large event, high-pass filtered at 5 Hz to remove low frequency signals and analyzed to detect high frequency triggered local earthquakes. For each possible trigger detected, waveforms from nine short-period stations in the Socorro Seismic Network were added to aid in locating the events. In the time period analyzed, twelve triggered events were detected. Only one of these events, on August 30, 2009, corresponded to the arrival of surface waves, occurring about a minute after their arrival. The majority of the triggered events occur well after the arrival of the surface waves, indicating that they are either independent of the main shock or the result of delayed dynamic triggering. Delayed dynamic triggering can occur hours or days after the passage of surface waves and is marked by an increase in seismicity relative to background. Only one of the events, on September 18, 2009, occurred within the Socorro Magma Body area. The rest of these events occur spread throughout New Mexico. The widely spread distribution of possibly triggered events and the low ratio of triggers to main shocks indicate that the rifted magmatic region above the Socorro Magma Body is not particularly susceptible to dynamic triggering from remote main shocks. The lack of direct correspondence to a seismic phase can mean that the detected events may be independent (not triggered events), or the result of delayed dynamic triggering. A comparison to randomly chosen waveforms within the time period as background will reveal if the possible events are a result of delayed dynamic triggering or part of the background.
Desjardin, Marie; Roman, Sabine; des Varannes, Stanislas Bruley; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Mion, François
2013-01-01
Background Pharyngeal pH probes and pH-impedance catheters have been developed for the diagnosis of laryngo-pharyngeal reflux. Objective To determine the reliability of pharyngeal pH alone for the detection of pharyngeal reflux events. Methods 24-h pH-impedance recordings performed in 45 healthy subjects with a bifurcated probe for detection of pharyngeal and oesophageal reflux events were reviewed. Pharyngeal pH drops to below 4 and 5 were analysed for the simultaneous occurrence of pharyngeal reflux, gastro-oesophageal reflux, and swallows, according to impedance patterns. Results Only 7.0% of pharyngeal pH drops to below 5 identified with impedance corresponded to pharyngeal reflux, while 92.6% were related to swallows and 10.2 and 13.3% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Of pharyngeal pH drops to below 4, 13.2% were related to pharyngeal reflux, 87.5% were related to swallows, and 18.1 and 21.5% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Conclusions This study demonstrates that pharyngeal pH alone is not reliable for the detection of pharyngeal reflux and that adding distal oesophageal pH analysis is not helpful. The only reliable analysis should take into account impedance patterns demonstrating the presence of pharyngeal reflux event preceded by a distal and proximal reflux event within the oesophagus. PMID:24917995
Swallow, Khena M; Jiang, Yuhong V
2010-04-01
Recent work on event perception suggests that perceptual processing increases when events change. An important question is how such changes influence the way other information is processed, particularly during dual-task performance. In this study, participants monitored a long series of distractor items for an occasional target as they simultaneously encoded unrelated background scenes. The appearance of an occasional target could have two opposite effects on the secondary task: It could draw attention away from the second task, or, as a change in the ongoing event, it could improve secondary task performance. Results were consistent with the second possibility. Memory for scenes presented simultaneously with the targets was better than memory for scenes that preceded or followed the targets. This effect was observed when the primary detection task involved visual feature oddball detection, auditory oddball detection, and visual color-shape conjunction detection. It was eliminated when the detection task was omitted, and when it required an arbitrary response mapping. The appearance of occasional, task-relevant events appears to trigger a temporal orienting response that facilitates processing of concurrently attended information (Attentional Boost Effect). Copyright 2009 Elsevier B.V. All rights reserved.
Silas, Reshma; Tibballs, James
2010-12-01
Little is known of the incidence of adverse events in the paediatric intensive care unit (PICU). Perceived incidence may be dependent on data-collection methods. To determine the incidence of adverse events by voluntary reporting and systematic enquiry. Adverse events in PICU were recorded contemporaneously by systematic enquiry with bedside nurses and attending doctors, and compared with data submitted voluntarily to the hospital's quality and safety unit. Events were classified as insignificant, minor, moderate, major and catastrophic or lethal, and assigned origins as medical/surgical diagnosis or management, medical/surgical procedures, medication or miscellaneous. Among 740 patients, 524 adverse events (mean 0.71 per patient) occurred in 193 patients (26.1%). Systematic enquiry detected 405 (80%) among 165 patients; these were classified by one investigator as insignificant, 30 (7%); minor, 100 (25%); moderate, 160 (37%); major, 103 (25%); and catastrophic, 12 (3%). The coefficient of agreement (kappa) of severity between the two investigators was 0.82 (95% CI 0.78-0.87). Voluntary reporting detected 166 (32%) adverse events among 100 patients, of which 119 were undetected by systematic reporting. Forty-nine events (9%) were detected by both methods. The number and severity of events reported by the two methods were significantly different (p<0.0001). Voluntary reporting, mainly by nurses, did not capture major, severe or catastrophic events related to medical/surgical diagnosis or management. Neither voluntary reporting nor systematic enquiry captures all adverse events. While the two methods both capture some events, systematic reporting captures serious events, while voluntary reporting captures mainly insignificant and minor events.
Accelerometer and Camera-Based Strategy for Improved Human Fall Detection.
Zerrouki, Nabil; Harrou, Fouzi; Sun, Ying; Houacine, Amrane
2016-12-01
In this paper, we address the problem of detecting human falls using anomaly detection. Detection and classification of falls are based on accelerometric data and variations in human silhouette shape. First, we use the exponentially weighted moving average (EWMA) monitoring scheme to detect a potential fall in the accelerometric data. We used an EWMA to identify features that correspond with a particular type of fall allowing us to classify falls. Only features corresponding with detected falls were used in the classification phase. A benefit of using a subset of the original data to design classification models is that it minimizes training time and simplifies models. Based on features corresponding to detected falls, we used the support vector machine (SVM) algorithm to distinguish between true falls and fall-like events. We apply this strategy to the publicly available fall detection databases of the University of Rzeszów. Results indicated that our strategy accurately detected and classified fall events, suggesting its potential application to early alert mechanisms in the event of fall situations and its capability for classification of detected falls. Comparison of the classification results using the EWMA-based SVM classifier method with those achieved using three commonly used machine learning classifiers, neural network, K-nearest neighbor and naïve Bayes, proved our model superior.
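A minimal EWMA control-chart sketch for flagging candidate falls in an acceleration-magnitude stream follows. The smoothing factor, control-limit width, baseline length, and synthetic data are illustrative choices, not the paper's parameters; in the strategy above, features from the flagged segments would then feed the SVM stage.

```python
import numpy as np

def ewma_alarms(x: np.ndarray, lam: float = 0.2, L: float = 3.0, baseline: int = 200):
    """Indices where the EWMA of the signal leaves its control limits."""
    mu, sigma = x[:baseline].mean(), x[:baseline].std()   # fall-free baseline
    limit = L * sigma * np.sqrt(lam / (2 - lam))          # steady-state EWMA sigma
    z, alarms = mu, []
    for i, xi in enumerate(x):
        z = lam * xi + (1 - lam) * z
        if abs(z - mu) > limit:
            alarms.append(i)
    return alarms

rng = np.random.default_rng(2)
accel = rng.normal(1.0, 0.05, 600)      # ~1 g magnitude at rest
accel[300:310] += 2.0                   # impact-like excursion
print(ewma_alarms(accel)[:5])           # alarms start near sample 300
```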
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
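A small sketch of the graph-based post-processing idea follows: candidate events become vertices, pairwise waveform similarities become weighted edges, weak edges are pruned, connected components define event groups, and a simple mean-similarity score is attached to each candidate. The pair list, pruning threshold, and scoring rule are hypothetical, not the authors' pipeline.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

pairs = [(0, 1, 0.92), (1, 2, 0.88), (3, 4, 0.81), (0, 5, 0.15)]
n = 6
keep = [(i, j, w) for i, j, w in pairs if w >= 0.5]   # prune weak edges
rows = [i for i, j, _ in keep] + [j for i, j, _ in keep]
cols = [j for i, j, _ in keep] + [i for i, j, _ in keep]
data = [w for *_, w in keep] * 2                      # symmetric graph
graph = coo_matrix((data, (rows, cols)), shape=(n, n))

n_groups, labels = connected_components(graph, directed=False)
print(n_groups, labels)   # clusters {0,1,2} and {3,4}; vertex 5 is a singleton

# A simple confidence score per candidate: mean similarity to its neighbors.
score, deg = np.zeros(n), np.zeros(n)
for i, j, w in keep:
    score[i] += w; score[j] += w
    deg[i] += 1;  deg[j] += 1
print(np.divide(score, deg, out=np.zeros(n), where=deg > 0))
```

Low-scoring singletons would be the natural first candidates for false-detection removal.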
Space-time clusters for early detection of grizzly bear predation.
Kermish-Wells, Joseph; Massolo, Alessandro; Stenhouse, Gordon B; Larsen, Terrence A; Musiani, Marco
2018-01-01
Accurate detection and classification of predation events is important to determine predation and consumption rates by predators. However, obtaining this information for large predators is constrained by the speed at which carcasses disappear and the cost of field data collection. To accurately detect predation events, researchers have used GPS collar technology combined with targeted site visits. However, kill sites are often investigated well after the predation event due to limited data retrieval options on GPS collars (VHF or UHF downloading) and to ensure crew safety when working with large predators. This can lead to missing information from small-prey (including young ungulates) kill sites due to scavenging and general site deterioration (e.g., vegetation growth). We used a space-time permutation scan statistic (STPSS) clustering method (SaTScan) to detect predation events of grizzly bears (Ursus arctos) fitted with satellite transmitting GPS collars. We used generalized linear mixed models to verify predation events and the size of carcasses using spatiotemporal characteristics as predictors. STPSS uses a probability model to compare expected cluster size (space and time) with the observed size. We applied this method retrospectively to data from 2006 to 2007 to compare our method to random GPS site selection. In 2013-2014, we applied our detection method to visit sites one week after their occupation. Both datasets were collected in the same study area. Our approach detected 23 of 27 predation sites verified by visiting 464 random grizzly bear locations in 2006-2007, 187 of which were within space-time clusters and 277 outside. Predation site detection increased by 2.75 times (54 predation events of 335 visited clusters) using 2013-2014 data. Our GLMMs showed that cluster size and duration predicted predation events and carcass size with high sensitivity (0.72 and 0.94, respectively). Coupling GPS satellite technology with clusters using a program based on space-time probability models allows for prompt visits to predation sites. This enables accurate identification of the carcass size and increases fieldwork efficiency in predation studies.
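A simplified space-time scan in the spirit of the STPSS approach follows: count GPS fixes inside candidate space-time cylinders and compare the observed count with the expectation under uniform use of the time range. This is an observed/expected ratio, not SaTScan's likelihood-based statistic, and the radii, window lengths, and synthetic data are illustrative.

```python
import numpy as np

def scan_clusters(xy, t, radius, window, min_ratio=3.0):
    """Return (center_index, observed, expected) for over-dense cylinders."""
    total_span = t.max() - t.min() + 1e-9
    out = []
    for c in range(len(t)):
        near = np.hypot(xy[:, 0] - xy[c, 0], xy[:, 1] - xy[c, 1]) <= radius
        intime = np.abs(t - t[c]) <= window / 2
        obs = int(np.sum(near & intime))
        # expected fixes if spatial neighbours were spread evenly in time
        exp = near.sum() * (window / total_span)
        if exp > 0 and obs / exp >= min_ratio and obs >= 5:
            out.append((c, obs, round(exp, 2)))
    return out

rng = np.random.default_rng(3)
xy = rng.uniform(0, 1000, (300, 2))              # fix locations (metres)
t = rng.uniform(0, 72, 300)                      # fix times (hours)
xy[:20] = rng.normal([500, 500], 10, (20, 2))    # carcass-like cluster
t[:20] = rng.normal(24, 1, 20)
print(scan_clusters(xy, t, radius=50, window=6)) # flags the planted cluster
```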
Automatic fall detection using wearable biomedical signal measurement terminal.
Nguyen, Thuy-Trang; Cho, Myeong-Chan; Lee, Tae-Soo
2009-01-01
In our study, we developed a mobile waist-mounted device which can monitor the subject's acceleration signal, detect fall events in real time with high accuracy, and automatically send an emergency message to a remote server via a CDMA module. When a fall event happens, the system also generates an alarm sound at 50 Hz to alert other people until the subject can sit up or stand up. A Kionix KXM52-1050 tri-axial accelerometer and a Bellwave BSM856 CDMA standalone modem were used to detect and manage fall events. We used not only a simple threshold algorithm but also several supporting methods to increase the accuracy of our system (nearly 100% in a laboratory environment). Timely fall detection can prevent regrettable deaths due to the long-lie effect and therefore increases the independence of elderly people in an unsupervised living environment.
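A minimal threshold-style fall detector over tri-axial accelerometer data follows, in the spirit of the device above: a free-fall dip in acceleration magnitude followed shortly by an impact spike. The thresholds (in g), sample rate, and synthetic data are illustrative assumptions, not the device's firmware.

```python
import numpy as np

def detect_falls(ax, ay, az, fs=50, low_g=0.4, high_g=2.5, max_gap_s=0.5):
    """Flag a fall when free-fall (|a| < low_g) is followed soon by impact."""
    mag = np.sqrt(ax**2 + ay**2 + az**2)
    gap = int(max_gap_s * fs)
    falls = []
    for i in np.flatnonzero(mag < low_g):            # free-fall candidates
        window = mag[i:i + gap]
        if window.size and window.max() > high_g:    # impact shortly after
            falls.append(i)
    return falls

fs = 50
t = np.arange(0, 10, 1 / fs)
ax, ay = np.zeros_like(t), np.zeros_like(t)
az = np.ones_like(t)                                 # ~1 g at rest
az[250:260] = 0.1                                    # brief free-fall
az[262] = 3.2                                        # impact spike
print(detect_falls(ax, ay, az, fs))                  # indices near sample 250
```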
Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi
2016-06-01
Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors. © 2016. Published by The Company of Biologists Ltd.
Testing the Rapid Detection Capabilities of the Quake-Catcher Network
NASA Astrophysics Data System (ADS)
Chung, A. I.; Cochran, E.; Yildirim, B.; Christensen, C. M.; Kaiser, A. E.; Lawrence, J. F.
2013-12-01
The Quake-Catcher Network (QCN) is a versatile network of MEMS accelerometers that are used in combination with distributed volunteer computing to detect earthquakes around the world. Using a dense network of QCN stations installed in Christchurch, New Zealand after the 2010 M7.1 Darfield earthquake, hundreds of events in the Christchurch area were detected and rapidly characterized. When the M6.3 Christchurch event occurred on 21 February 2011, QCN sensors recorded the event and calculated its magnitude, location, and created a map of estimated shaking intensity within 7 seconds of the earthquake origin time. Successive iterations improved the calculations and, within 24 seconds of the earthquake, magnitude and location values were calculated that were comparable to those provided by GeoNet. We have rigorously tested numerous methods to create a working magnitude scaling relationship. In this presentation, we show a drastic improvement in the magnitude estimates using the maximum acceleration at the time of the first trigger and updated ground accelerations from one to three seconds after the initial trigger. 75% of the events rapidly detected and characterized by QCN are within 0.5 magnitude units of the official GeoNet reported magnitude values, with 95% of the events within 1 magnitude unit. We also test the QCN detection algorithms using higher quality data from the SCSN network in Southern California. We examine a dataset of M5 and larger earthquakes that occurred since 1995. We present the performance of the QCN algorithms for this dataset, including time to detection as well as location and magnitude accuracy.
Khandelwal, Siddhartha; Wickstrom, Nicholas
2016-12-01
Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.
Detection and prevention of medication misadventures in general practice.
Tam, Ka Wae Tammy; Kwok, Kon Hung; Fan, Yuen Man Cecilia; Tsui, Kwok Biu; Ng, Kwok Keung; Ho, King Yip Anthony; Lau, Kam Tong; Chan, Yuk Chun; Tse, Ching Wan Charmaine; Lau, Cheuk Man
2008-06-01
Adverse drug events are leading categories of iatrogenic patient injury. Development of preventive strategies for general practice setting depends on effective detection of events. The aim of the study is to compare the strengths and weaknesses of voluntary reporting, chart review and patient survey in measuring medication misadventures in general practice and to analyze the events by severity and preventability, drug groups and patients' and doctors' characteristics, for the formulation of preventive strategies. In the 2-month study period, we applied voluntary reporting, chart review and patient survey to collect data related to medication misadventures and compared their detection rate. The chart review demonstrated the highest yield for detecting overall medication misadventures (2.03% medication orders), followed by patient survey (1.46% medication orders) and voluntary reporting (0.52% medication orders). Chart review and patient survey were better than voluntary reporting in uncovering preventable adverse drug events. However, voluntary reporting was pivotal in capturing sentinel events. Beta-blocker, diuretic, angiotensin-converting enzyme inhibitor, aspirin and non-steroidal anti-inflammatory drugs had caused 82.0% of all adverse drug events. These events were more common with advanced age of patients, greater number of consultation problems and prescribed drug items. Additional resources implicated were minimal. We suggested a complementary approach using chart review and voluntary reporting in measuring and monitoring medication misadventures in general practice. Close monitoring of the events was necessary for older patients, multiple medical problems and poly-pharmacy and for patients using beta-blocker, diuretic, angiotensin-converting enzyme inhibitor, aspirin or non-steroidal anti-inflammatory drugs on a long-term basis.
Smith, Brian T; Coiro, Daniel J; Finson, Richard; Betz, Randal R; McCarthy, James
2002-03-01
Force-sensing resistors (FSRs) were used to detect the transitions between five main phases of gait for the control of electrical stimulation (ES) while walking with seven children with spastic diplegia, cerebral palsy. The FSR positions within each child's insoles were customized based on plantar pressure profiles determined using a pressure-sensitive membrane array (Tekscan Inc., Boston, MA). The FSRs were placed in the insoles so that pressure transitions coincided with an ipsilateral or contralateral gait event. The transitions between the following gait phases were determined: loading response, mid- and terminal stance, and pre- and initial swing. Following several months of walking on a regular basis with FSR-triggered intramuscular ES to the hip and knee extensors, hip abductors, and ankle dorsi and plantar flexors, the accuracy and reliability of the FSRs to detect gait phase transitions were evaluated. Accuracy was evaluated with four of the subjects by synchronizing the output of the FSR detection scheme with a VICON (Oxford Metrics, U.K.) motion analysis system, which was used as the gait event reference. While mean differences between each FSR-detected gait event and that of the standard (VICON) ranged from +35 ms (indicating that the FSR detection scheme recognized the event before it actually happened) to -55 ms (indicating that the FSR scheme recognized the event after it occurred), the difference data was widely distributed, which appeared to be due in part to both intrasubject (step-to-step) and intersubject variability. Terminal stance exhibited the largest mean difference and standard deviation, while initial swing exhibited the smallest deviation and preswing the smallest mean difference. To determine step-to-step reliability, all seven children walked on a level walkway for at least 50 steps. Of 642 steps, there were no detection errors in 94.5% of the steps. Of the steps that contained a detection error, 80% were due to the failure of the FSR signal to reach the programmed threshold level during the transition to loading response. Recovery from an error always occurred one to three steps later.
Slow earthquakes in microseism frequency band (0.1-2 Hz) off the Kii peninsula
NASA Astrophysics Data System (ADS)
Kaneko, L.; Ide, S.; Nakano, M.
2017-12-01
Slow earthquakes are divided into deep tectonic tremors, very low frequency (VLF) events, and slow slip events (SSE), each of which is observed in a different frequency band. Tremors are observed above 2 Hz, and VLF signals are visible mainly in 0.01-0.05 Hz. It was generally very difficult to find signals of slow underground deformation at frequencies between them, i.e., 0.1-2 Hz, where microseism noise is dominant. However, after a Mw 5.9 plate boundary earthquake off the Kii peninsula on April 1st, 2016, sufficiently large signals have been observed in the microseism band, accompanied by signals from active tremors, VLFEs, and SSEs, by the ocean bottom seismometer network DONET maintained by JAMSTEC and NIED. This is the first observation of slow earthquakes in the microseism frequency band. Here we report the detection and location of events in this band, and compare them with the spatial and temporal distributions of ordinary tectonic tremors above 2 Hz and VLF events. We used continuous records of 20 broadband seismometers of DONET from April 1st to 12th. We detected events by calculating arrival time differences between stations using an envelope correlation method of Ide (2010). Unlike ordinary applications, we repeated analyses for seismograms bandpass-filtered in four separated frequency bands, 0.1-1, 1-2, 2-4, and 4-8 Hz. For each band, we successfully detected events and determined their hypocenter locations. Many VLF events have also been detected in this region in the frequency band of 0.03-0.05 Hz, with locations and focal mechanisms determined using the method of Nakano et al. (2008). In the 0.1-1 Hz microseism band, hypocenters were determined mainly on April 10th, when microseism noise is small and signal amplitudes are quite large. In several time windows, events were detected in all four bands and located within the 2-sigma error ellipses, with similar source time functions. Sometimes, events were detected in two or three bands, suggesting wide variations in wave radiation at different frequencies. Although the location errors are not always small enough to confirm the collocation of sources, due to uncertainty in structure, we can confirm that seismic waves are radiated in the microseism band from slow earthquakes, which are considered a continuous, broadband, and complicated phenomenon.
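A rough sketch of envelope correlation for emergent signals of this kind follows: smoothed Hilbert envelopes at two stations are cross-correlated to estimate the inter-station delay. The smoothing length, sample rate, and toy tremor burst are assumptions for illustration, not the cited method's exact processing.

```python
import numpy as np
from scipy.signal import hilbert

def envelope(x, smooth=50):
    """Smoothed amplitude envelope of a trace."""
    env = np.abs(hilbert(x))
    kernel = np.ones(smooth) / smooth
    return np.convolve(env, kernel, mode="same")

def envelope_delay(x, y, fs):
    """Inter-station delay (s) from the peak of the envelope correlation."""
    ex, ey = envelope(x), envelope(y)
    ex -= ex.mean(); ey -= ey.mean()
    cc = np.correlate(ex, ey, mode="full")
    return (np.argmax(cc) - (len(ey) - 1)) / fs

fs = 20.0
rng = np.random.default_rng(4)
burst = rng.standard_normal(200) * np.hanning(200)   # tremor-like burst
x = rng.standard_normal(2000) * 0.1
y = rng.standard_normal(2000) * 0.1
x[800:1000] += burst
y[860:1060] += burst                                  # arrives 3 s later at y
print(envelope_delay(x, y, fs))                       # ~ -3.0 under this sign convention
```

Delays measured across many station pairs would then be inverted for a hypocenter, as in the study above.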
Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P
Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registration of Discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. This was an observational and descriptive retrospective study of patients admitted to general surgery of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of "Global Trigger Tool" methodology, as well as the MBDS data registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which these could have been avoided. The area under the receiver operating characteristic (ROC) curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
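A sketch of an AUC comparison in the spirit of the study above follows: AUC for each tool via the Mann-Whitney statistic with the Hanley-McNeil standard error, and a z-test treating the two AUCs as independent (a simplification; the correlated-sample version applies the Hanley-McNeil correction for paired designs). The scores and group sizes are toy data.

```python
import numpy as np
from scipy.stats import norm

def auc_and_se(scores_pos, scores_neg):
    """AUC via pairwise wins, with the Hanley-McNeil standard error."""
    n1, n2 = len(scores_pos), len(scores_neg)
    wins = sum((p > q) + 0.5 * (p == q) for p in scores_pos for q in scores_neg)
    a = wins / (n1 * n2)
    q1, q2 = a / (2 - a), 2 * a * a / (1 + a)
    se = np.sqrt((a * (1 - a) + (n1 - 1) * (q1 - a * a)
                  + (n2 - 1) * (q2 - a * a)) / (n1 * n2))
    return a, se

rng = np.random.default_rng(5)
pos_tt, neg_tt = rng.normal(2.0, 1, 100), rng.normal(0, 1, 200)  # TT-like scores
pos_db, neg_db = rng.normal(0.6, 1, 100), rng.normal(0, 1, 200)  # MBDS-like scores
a1, se1 = auc_and_se(pos_tt, neg_tt)
a2, se2 = auc_and_se(pos_db, neg_db)
z = (a1 - a2) / np.sqrt(se1**2 + se2**2)
print(round(a1, 2), round(a2, 2), round(2 * norm.sf(abs(z)), 4))
```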
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Confidential Clinician-reported Surveillance of Adverse Events Among Medical Inpatients
Weingart, Saul N; Ship, Amy N; Aronson, Mark D
2000-01-01
BACKGROUND Although iatrogenic injury poses a significant risk to hospitalized patients, detection of adverse events (AEs) is costly and difficult. METHODS The authors developed a confidential reporting method for detecting AEs on a medicine unit of a teaching hospital. Adverse events were defined as patient injuries. Potential adverse events (PAEs) represented errors that could have, but did not result in harm. Investigators interviewed house officers during morning rounds and by e-mail, asking them to identify obstacles to high quality care and iatrogenic injuries. They compared house officer reports with hospital incident reports and patients' medical records. A multivariate regression model identified correlates of reporting. RESULTS One hundred ten events occurred, affecting 84 patients. Queries by e-mail (incidence rate ratio [IRR] = 0.16; 95% confidence interval [95% CI], 0.05 to 0.49) and on days when house officers rotated to a new service (IRR = 0.12; 95% CI, 0.02 to 0.91) resulted in fewer reports. The most commonly reported process of care problems were inadequate evaluation of the patient (16.4%), failure to monitor or follow up (12.7%), and failure of the laboratory to perform a test (12.7%). Respondents identified 29 (26.4%) AEs, 52 (47.3%) PAEs, and 29 (26.4%) other house officer-identified quality problems. An AE occurred in 2.6% of admissions. The hospital incident reporting system detected only one house officer-reported event. Chart review corroborated 72.9% of events. CONCLUSIONS House officers detect many AEs among inpatients. Confidential peer interviews of front-line providers are a promising method for identifying medical errors and substandard quality. PMID:10940133
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
An iterative matching and locating technique for borehole microseismic monitoring
NASA Astrophysics Data System (ADS)
Chen, H.; Meng, X.; Niu, F.; Tang, Y.
2016-12-01
Microseismic monitoring has been proven to be an effective and valuable technology to image hydraulic fracture geometry. The success of hydraulic fracturing monitoring relies on the detection and characterization (i.e., location and focal mechanism estimation) of a maximum number of induced microseismic events. All the events are important to quantify the stimulated reservoir volume (SRV) and characterize the newly created fracture network. Detecting and locating low magnitude events, however, are notoriously difficult, particularly in a noisy production environment. Here we propose an iterative matching and locating technique (iMLT) to obtain a maximum detection of small events and the best determination of their locations from continuous data recorded by a single-azimuth downhole geophone array. Because the downhole array spans only a single azimuth, regular M&L using P-wave cross-correlation alone is not able to resolve the location of a matched event relative to the template event. We thus introduce the polarization direction into the matching, which significantly improves the lateral resolution of the M&L method, based on numerical simulations with synthetic data. Our synthetic tests further indicate that the inclusion of S-wave cross-correlation data can help better constrain the focal depth of the matched events. We apply this method to a dataset recorded during hydraulic fracturing treatment of a pilot horizontal well within the shale play in southwest China. Our approach yields a more than fourfold increase in the number of located events, compared with the original event catalog from traditional downhole processing.
Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang
2011-01-01
This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise and errors during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the phase of blob tracking associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
Sampled-data consensus in switching networks of integrators based on edge events
NASA Astrophysics Data System (ADS)
Xiao, Feng; Meng, Xiangyu; Chen, Tongwen
2015-02-01
This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
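A toy simulation of edge-event-driven consensus for single integrators on a fixed undirected graph follows: an edge's sampled pair is refreshed only when either endpoint's state has drifted since the last edge event, and the control law uses sampled data only. The trigger rule, threshold, and gains are illustrative, not the paper's exact edge-event-detecting rules.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3)]          # path graph on 4 agents
x = np.array([1.0, -2.0, 3.0, 0.5])
sampled = {e: (x[e[0]], x[e[1]]) for e in edges}   # last sampled pair per edge
dt, threshold = 0.01, 0.05

for step in range(2000):
    u = np.zeros_like(x)
    for (i, j) in edges:
        si, sj = sampled[(i, j)]
        # edge event: refresh the sample when either end has drifted too far
        if abs(x[i] - si) > threshold or abs(x[j] - sj) > threshold:
            sampled[(i, j)] = (x[i], x[j])
            si, sj = x[i], x[j]
        u[i] += sj - si                   # control uses sampled data only
        u[j] += si - sj
    x = x + dt * u

print(x)   # states settle near the preserved initial average, 0.625
```

Because each edge contributes equal and opposite control terms, the state average is conserved, so event-triggered sampling trades communication for a bounded neighborhood around exact consensus.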
NASA Astrophysics Data System (ADS)
Aster, R. C.; McMahon, N. D.; Myers, E. K.; Lough, A. C.
2015-12-01
Lough et al. (2014) first detected deep sub-icecap magmatic events beneath the Executive Committee Range volcanoes of Marie Byrd Land. Here, we extend the identification and analysis of these events in space and time utilizing subspace detection. Subspace detectors provide a highly effective methodology for studying events within seismic swarms that have similar moment tensor and Green's function characteristics and are particularly effective for identifying low signal-to-noise events. Marie Byrd Land (MBL) is an extremely remote continental region that is nearly completely covered by the West Antarctic Ice Sheet (WAIS). The southern extent of Marie Byrd Land lies within the West Antarctic Rift System (WARS), which includes the volcanic Executive Committee Range (ECR). The ECR shows north-to-south progression of volcanism across the WARS during the Holocene. In 2013, the POLENET/ANET seismic data identified two swarms of seismic activity in 2010 and 2011. These events have been interpreted as deep, long-period (DLP) earthquakes based on depth (25-40 km) and low frequency content. The DLP events in MBL lie beneath an inferred sub-WAIS volcanic edifice imaged with ice penetrating radar and have been interpreted as a present location of magmatic intrusion. The magmatic swarm activity in MBL provides a promising target for advanced subspace detection and temporal, spatial, and event size analysis of an extensive deep long period earthquake swarm using a remote seismographic network. We utilized a catalog of 1,370 traditionally identified DLP events to construct subspace detectors for the six nearest stations and analyzed two years of data spanning 2010-2011. Association of these detections into events resulted in an approximate ten-fold increase in number of locatable earthquakes. In addition to the two previously identified swarms during early 2010 and early 2011, we find sustained activity throughout the two years of study that includes several previously unidentified periods of heightened activity. Correlation with large global earthquakes suggests that the DLP activity is not sensitive to remote teleseismic triggering.
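A minimal subspace-detector sketch in the spirit of the study above follows: build an orthonormal signal subspace from aligned template waveforms via SVD, slide it along continuous data, and flag windows whose projected energy fraction exceeds a threshold. The subspace rank, threshold, and synthetic swarm data are illustrative assumptions.

```python
import numpy as np

def build_subspace(templates: np.ndarray, rank: int) -> np.ndarray:
    """(n_templates, n_samples) aligned waveforms -> (n_samples, rank) basis."""
    u, s, vt = np.linalg.svd(templates, full_matrices=False)
    return vt[:rank].T                       # leading right singular vectors

def scan(subspace: np.ndarray, trace: np.ndarray, threshold: float = 0.7):
    """Indices whose window energy is mostly captured by the signal subspace."""
    n = subspace.shape[0]
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        e = np.dot(w, w)
        if e == 0:
            continue
        proj = subspace.T @ w
        if np.dot(proj, proj) / e > threshold:   # energy-capture statistic
            hits.append(i)
    return hits

rng = np.random.default_rng(6)
event = rng.standard_normal(150)
templates = event + 0.2 * rng.standard_normal((8, 150))  # similar swarm events
basis = build_subspace(templates, rank=3)
stream = 0.3 * rng.standard_normal(4000)
stream[2500:2650] += event                   # low-SNR repeat in the noise
print(scan(basis, stream)[:3])               # detection near sample 2500
```

Because the basis spans the common variability of the swarm, such a detector catches low signal-to-noise repeats that single-template correlation can miss, which is the property exploited above.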
Photon Counting Imaging with an Electron-Bombarded Pixel Image Sensor
Hirvonen, Liisa M.; Suhling, Klaus
2016-01-01
Electron-bombarded pixel image sensors, where a single photoelectron is accelerated directly into a CCD or CMOS sensor, allow wide-field imaging at extremely low light levels as they are sensitive enough to detect single photons. This technology allows the detection of up to hundreds or thousands of photon events per frame, depending on the sensor size, and photon event centroiding can be employed to recover resolution lost in the detection process. Unlike photon events from electron-multiplying sensors, the photon events from electron-bombarded sensors have a narrow, acceleration-voltage-dependent pulse height distribution. Thus a gain voltage sweep during exposure in an electron-bombarded sensor could allow photon arrival time determination from the pulse height with sub-frame exposure time resolution. We give a brief overview of our work with electron-bombarded pixel image sensor technology and recent developments in this field for single photon counting imaging, and examples of some applications. PMID:27136556
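A minimal Python sketch of the photon event centroiding mentioned above: local maxima above a threshold are refined to sub-pixel positions by center of mass. The threshold and the 3x3 window are assumptions for illustration, not values from the paper.

    import numpy as np

    def centroid_events(frame, threshold, win=3):
        """Find photon events as thresholded local maxima, then refine
        each position with a center-of-mass centroid (sub-pixel)."""
        half = win // 2
        f = frame.astype(float)
        events = []
        for y in range(half, f.shape[0] - half):
            for x in range(half, f.shape[1] - half):
                patch = f[y - half:y + half + 1, x - half:x + half + 1]
                if f[y, x] >= threshold and f[y, x] == patch.max():
                    total = patch.sum()
                    dy, dx = np.mgrid[-half:half + 1, -half:half + 1]
                    events.append((y + (dy * patch).sum() / total,
                                   x + (dx * patch).sum() / total))
        return events

Centroiding in this way is what recovers the resolution lost when a single photoelectron spreads its charge over several pixels.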
Simultaneous beta and gamma spectroscopy
Farsoni, Abdollah T.; Hamby, David M.
2010-03-23
A phoswich radiation detector for simultaneous spectroscopy of beta rays and gamma rays includes three scintillators with different decay time characteristics. Two of the three scintillators are used for beta detection and the third scintillator is used for gamma detection. A pulse induced by an interaction of radiation with the detector is digitally analyzed to classify the type of event as beta, gamma, or unknown. A pulse is classified as a beta event if the pulse originated from just the first scintillator alone or from just the first and the second scintillator. A pulse from just the third scintillator is recorded as a gamma event. Other pulses are rejected as unknown events.
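A toy Python illustration of digital pulse classification by decay time, using a tail-to-total charge ratio as the discriminant. This is a common pulse-shape-discrimination stand-in, not the patent's actual analysis; the window length and acceptance bands are hypothetical.

    import numpy as np

    def classify_pulse(samples, dt_s, fast_ns=50, beta_band=(0.0, 0.3)):
        """Toy pulse-shape discrimination: fast scintillators deposit most
        charge early (small tail ratio), slow ones late (large tail ratio).
        dt_s is the digitizer sample interval in seconds."""
        s = np.asarray(samples, float)
        i_fast = max(1, int(fast_ns * 1e-9 / dt_s))
        total = s.sum()
        if total <= 0:
            return "unknown"
        tail_ratio = s[i_fast:].sum() / total
        if tail_ratio <= beta_band[1]:
            return "beta"    # mostly fast light: first/second scintillator
        if tail_ratio > 0.6:
            return "gamma"   # mostly slow light: third scintillator
        return "unknown"     # ambiguous pulses are rejected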
Semantic Concept Discovery for Large Scale Zero Shot Event Detection
2015-07-25
sources and can be shared among many different events, including unseen ones. Based on this idea, events can be detected by inspecting the individual... 2013]. Partial success along this vein has also been achieved in the zero-shot setting, e.g. [Habibian et al., 2014; Wu et al., 2014], but the... “candle”, “birthday cake” and “applauding”. Since concepts are shared among many different classes (events) and each concept classifier can be trained
Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing
2016-05-01
Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (from no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, of the 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers that most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%). However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, versus an average of 1.5 across all events), specifically during the documentation of patient positioning and localization. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are of types that the barrier is not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
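The severity aggregation described above reduces to a simple group-by; here is a hypothetical Python/pandas sketch with invented column names and toy rows, purely to show the shape of the computation.

    import pandas as pd

    # Hypothetical incident log: one row per near-miss event, with the
    # workflow category where it originated/was detected and its NMRI.
    incidents = pd.DataFrame({
        "origin":   ["treatment planning", "imaging for RT planning",
                     "treatment planning", "treatment delivery"],
        "detected": ["treatment delivery", "on-treatment QM",
                     "pretreatment plan review", "treatment delivery"],
        "nmri":     [1, 2, 1, 3],
    })

    # Average severity by where incidents originate vs. where they surface.
    print(incidents.groupby("origin")["nmri"].mean())
    print(incidents.groupby("detected")["nmri"].mean())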
Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection
Liu, Changyu; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches. PMID:25147840
Chiles, M.M.; Mihalczo, J.T.; Blakeman, E.D.
1987-02-27
A scintillation-based radiation detector for the combined detection of thermal neutrons, high-energy neutrons, and gamma rays in a single detecting unit. The detector consists of a pair of scintillators sandwiched together and optically coupled to the light-sensitive face of a photomultiplier tube. A light-tight, radiation-pervious housing is disposed about the scintillators and a portion of the photomultiplier tube to hold the arrangement in assembly and to provide a radiation window adjacent to the outer scintillator through which the radiation to be detected enters the detector. The outer scintillator is formed of a material in which scintillations are produced by thermal neutrons, and the inner scintillator is formed of a material in which scintillations are produced by high-energy neutrons and gamma rays. The light pulses produced by events detected in both scintillators are coupled to the photomultiplier tube, which produces a current pulse in response to each detected event. These current pulses may be processed in a conventional manner to produce a count rate output indicative of the total detected radiation event count rate. Pulse discrimination techniques may be used to distinguish the different radiations and their energy distribution.
Online track detection in triggerless mode for INO
NASA Astrophysics Data System (ADS)
Jain, A.; Padmini, S.; Joseph, A. N.; Mahesh, P.; Preetha, N.; Behere, A.; Sikder, S. S.; Majumder, G.; Behera, S. P.
2018-03-01
The India-based Neutrino Observatory (INO) is a proposed particle physics research project to study atmospheric neutrinos. The INO Iron Calorimeter (ICAL) will consist of 28,800 detectors having 3.6 million electronic channels, expected to activate with a 100 Hz singles rate and producing data at a rate of 3 GBps. The collected data contain a few real hits generated by muon tracks, with the remainder being noise-induced spurious hits. The estimated reduction factor after filtering out the data of interest from the generated data is of the order of 10^3. This makes trigger generation critical for efficient data collection and storage. A trigger is generated by detecting coincidence across multiple channels satisfying the trigger criteria, within a small window of 200 ns in the trigger region. As the probability of neutrino interaction is very low, the track detection algorithm has to be efficient and fast enough to process 5 × 10^6 event candidates/s without introducing significant dead time, so that not even a single neutrino event is missed. A hardware-based trigger system is presently proposed for online track detection, considering the stringent timing requirements. Though the trigger system can be designed with scalability, the many hardware devices and interconnections make it a complex and expensive solution with limited flexibility. A software-based track detection approach working on the hit information offers an elegant solution with the possibility of varying trigger criteria for selecting various potentially interesting physics events. An event selection approach for an alternative triggerless readout scheme has been developed. The algorithm is mathematically simple, robust, and parallelizable. It has been validated by detecting simulated muon events at energies in the range of 1-10 GeV with 100% efficiency at a processing rate of 60 μs/event on a 16-core machine. The algorithm and the result of a proof-of-concept for its faster implementation over multiple cores are presented. The paper also discusses harnessing the computing capabilities of a multi-core computing farm, thereby optimizing the number of nodes required for the proposed system.
Söth-Hansen, Malene; Witt, Christoffer Tobias; Rasmussen, Mathis; Kristensen, Jens; Gerdes, Christian; Nielsen, Jens Cosedis
2018-05-24
Remote monitoring (RM) is an established technology integrated into routine follow-up of patients with implantable cardioverter-defibrillators (ICD). Current RM systems differ according to transmission frequency and alert definition. We aimed to compare the time difference between detection and acknowledgement of clinically relevant events between four RM systems. We analyzed the time delay between detection of ventricular arrhythmic and technical events by the ICD and acknowledgement by hospital staff in 1,802 consecutive patients followed with RM during September 2014 - August 2016. Devices from Biotronik (BIO, n=374), Boston Scientific (BSC, n=196), Medtronic (MDT, n=468) and St Jude Medical (SJM, n=764) were included. We identified all events from RM webpages and their acknowledgement with RM or at in-clinic follow-up. Events occurring during weekends were excluded. We included 3,472 events. The proportion of events acknowledged within 24 hours was 72%, 23%, 18% and 65% with BIO, BSC, MDT and SJM, respectively, with median times of 13, 222, 163 and 18 hours from detection to acknowledgement (p<0.001 for both comparisons between manufacturers). Including only events transmitted as alerts by RM, 72%, 68%, 61% and 65% for BIO, BSC, MDT and SJM, respectively, were acknowledged within 24 hours. Variation in time to acknowledgement of ventricular tachyarrhythmia episodes not treated with shock therapy was the primary cause of the difference between manufacturers. Significant and clinically relevant differences in time delay from event detection to acknowledgement exist between RM systems. Varying definitions of which events RM transmits as alerts are important for the differences observed. Copyright © 2018. Published by Elsevier Inc.
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930
Detection of visual events along the apparent motion trace in patients with paranoid schizophrenia.
Sanders, Lia Lira Olivier; Muckli, Lars; de Millas, Walter; Lautenschlager, Marion; Heinz, Andreas; Kathmann, Norbert; Sterzer, Philipp
2012-07-30
Dysfunctional prediction in sensory processing has been suggested as a possible causal mechanism in the development of delusions in patients with schizophrenia. Previous studies in healthy subjects have shown that while the perception of apparent motion can mask visual events along the illusory motion trace, such motion masking is reduced when events are spatio-temporally compatible with the illusion, and, therefore, predictable. Here we tested the hypothesis that this specific detection advantage for predictable target stimuli on the apparent motion trace is reduced in patients with paranoid schizophrenia. Our data show that, although target detection along the illusory motion trace is generally impaired, both patients and healthy control participants detect predictable targets more often than unpredictable targets. Patients had a stronger motion masking effect when compared to controls. However, patients showed the same advantage in the detection of predictable targets as healthy control subjects. Our findings reveal stronger motion masking but intact prediction of visual events along the apparent motion trace in patients with paranoid schizophrenia and suggest that the sensory prediction mechanism underlying apparent motion is not impaired in paranoid schizophrenia. Copyright © 2012. Published by Elsevier Ireland Ltd.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
Event Recognition for Contactless Activity Monitoring Using Phase-Modulated Continuous Wave Radar.
Forouzanfar, Mohamad; Mabrouk, Mohamed; Rajan, Sreeraman; Bolic, Miodrag; Dajani, Hilmi R; Groza, Voicu Z
2017-02-01
The use of remote sensing technologies such as radar is gaining popularity as a technique for contactless detection of physiological signals and analysis of human motion. This paper presents a methodology for classifying different events in a collection of phase-modulated continuous wave radar returns. The primary application of interest is to monitor inmates, where the presence of human vital signs amidst different interferences needs to be identified. A comprehensive set of features is derived through time and frequency domain analyses of the radar returns. The Bhattacharyya distance is used to preselect the features with the highest class separability as the candidate features for use in the classification process. Uncorrelated linear discriminant analysis is performed to decorrelate, denoise, and reduce the dimension of the candidate feature set. Linear and quadratic Bayesian classifiers are designed to distinguish breathing, different human motions, and nonhuman motions. The performance of these classifiers is evaluated on a pilot dataset of radar returns that contained different events including breathing, stopped breathing, simple human motions, and movement of a fan and water. Our proposed pattern classification system achieved accuracies of up to 93% in stationary subject detection, 90% in stopped-breathing detection, and 86% in interference detection. Our proposed radar pattern recognition system was able to accurately distinguish the predefined events amidst interferences. Besides inmate monitoring and suicide attempt detection, this approach can be extended to other radar applications such as home-based monitoring of elderly people, apnea detection, and home occupancy detection.
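As context for the feature-preselection step, here is a Python sketch of the Bhattacharyya distance between two classes under a univariate Gaussian assumption; ranking feature columns by this distance and keeping the top k mirrors the described preselection. This is a generic formulation, not the authors' code.

    import numpy as np

    def bhattacharyya_gauss(a, b):
        """Bhattacharyya distance between two feature samples under a
        univariate Gaussian model (larger = better class separability)."""
        m1, v1 = a.mean(), a.var() + 1e-12
        m2, v2 = b.mean(), b.var() + 1e-12
        return (0.25 * (m1 - m2) ** 2 / (v1 + v2)
                + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2))))

    def preselect(features_a, features_b, k):
        """Rank features (columns) by separability between the two
        classes' sample matrices and keep the indices of the top k."""
        d = [bhattacharyya_gauss(features_a[:, j], features_b[:, j])
             for j in range(features_a.shape[1])]
        return np.argsort(d)[::-1][:k]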
Barboza, Philippe; Vaillant, Laetitia; Le Strat, Yann; Hartley, David M.; Nelson, Noele P.; Mawudeku, Abla; Madoff, Lawrence C.; Linge, Jens P.; Collier, Nigel; Brownstein, John S.; Astagneau, Pascal
2014-01-01
Background: Internet-based biosurveillance systems have been developed to detect health threats using information available on the Internet, but system performance has not been assessed relative to end-user needs and perspectives. Method and Findings: Infectious disease events from the French Institute for Public Health Surveillance (InVS) weekly international epidemiological bulletin published in 2010 were used to construct the gold-standard official dataset. Data from six biosurveillance systems were used to detect raw signals (infectious disease events from informal Internet sources): Argus, BioCaster, GPHIN, HealthMap, MedISys and ProMED-mail. Crude detection rates (C-DR), crude sensitivity rates (C-Se) and intrinsic sensitivity rates (I-Se) were calculated from multivariable regressions to evaluate the systems’ performance (events detected compared to the gold-standard). A total of 472 raw signals (Internet disease reports) related to the 86 events included in the gold-standard data set were retrieved from the six systems. Of these, 84 events were detected before their publication in the gold-standard. The type of sources utilised by the systems varied significantly (p<0.001). I-Se varied significantly from 43% to 71% (p = 0.001) whereas other indicators were similar (C-DR: p = 0.20; C-Se: p = 0.13). I-Se was significantly associated with individual systems, types of system, languages, regions of occurrence, and types of infectious disease. Conversely, no statistical difference in C-DR was observed after adjustment for other variables. Conclusion: Although differences could result from a biosurveillance system's conceptual design, the findings suggest that the combined expertise amongst systems enhances early detection performance for infectious diseases. While all systems showed similar early detection performance, systems including human moderation were found to have a 53% higher I-Se (p = 0.0001) after adjustment for other variables. Overall, the use of moderation, sources, languages, regions of occurrence, and types of cases were found to influence system performance. PMID:24599062
Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video
Lee, Gil-beom; Lee, Myeong-jin; Lee, Woo-Kyung; Park, Joo-heon; Kim, Tae-Hwan
2017-01-01
Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos. PMID:28327515
Continuous robust sound event classification using time-frequency features and deep learning
McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high-performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478
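A minimal Python sketch of an energy-based event detection front end of the kind benchmarked above: frame the signal, compute short-time energy, and flag frames above a multiple of the noise floor. The 20 ms frame length and the threshold factor are assumptions, not the paper's settings.

    import numpy as np

    def energy_segments(signal, sr, frame_ms=20, factor=3.0):
        """Return a boolean mask marking frames likely to contain a
        sound event, based on short-time energy thresholding."""
        n = int(sr * frame_ms / 1000)
        frames = signal[:len(signal) // n * n].reshape(-1, n)
        energy = (frames.astype(float) ** 2).sum(axis=1)
        threshold = factor * np.median(energy)  # robust noise-floor estimate
        return energy > threshold

Frames flagged by such a front end would be passed to the downstream classifier; the paper's contribution is in part how much better its Bayesian-inspired front end performs than this simple energy gate.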
NASA Astrophysics Data System (ADS)
Diniakos, R. S.; Bilek, S. L.; Rowe, C. A.; Draganov, D.
2015-12-01
The subduction of the Nazca Plate beneath the South American Plate along Chile has led to some of the largest earthquakes recorded on modern seismic instrumentation. These include the 1960 M 9.5 Valdivia, 2010 M 8.8 Maule, and 2014 M 8.1 Iquique earthquakes. Slip heterogeneity for both the 2010 and 2014 earthquakes has been noted in various studies. In order to explore spatial variations both in the continuing aftershocks of the 2010 event and in seismicity to the north near Iquique prior to the 2014 earthquake, relative to the high-slip regions, we are expanding the catalog of small earthquakes using template matching algorithms to find other small earthquakes in the region. We start with an earthquake catalog developed from regional and local array data; these events provide the templates used to search through waveform data from a temporary seismic array in Malargue, Argentina, located ~300 km east of the Maule region, which operated in 2012. Our template events are first identified on the array stations, and we use a 10-s window around the P-wave arrival as the template. We then use a waveform cross-correlation algorithm to compare the template with day-long seismograms from Malargue stations. The newly detected events are then located using the HYPOINVERSE2000 program. Initial results for 103 templates on 19 of the array stations show that we find 275 new events, an average of roughly three new events per template. For these preliminary results, events from the Maule region appear to provide the most new detections, with an average of ten new events per template. We will present locations for the detected events and compare them to patterns of high slip along the 2010 rupture zone of the M 8.8 Maule earthquake and the 2014 M 8.1 Iquique event.
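For illustration, a simplified Python/numpy sketch of the cross-correlation step: normalized correlation of one template against a continuous trace, thresholded to declare candidate detections. The demeaning here is global rather than per-window, and the 0.7 threshold is an assumption, so treat this as a sketch rather than a production detector.

    import numpy as np

    def normalized_xcorr(template, data):
        """Approximate normalized cross-correlation of a template
        against a long trace; values near 1 flag candidate repeats."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        d = data - data.mean()
        cc = np.correlate(d, t, mode="valid")
        # normalize by each window's RMS amplitude
        csum = np.cumsum(np.insert(d ** 2, 0, 0.0))
        win_rms = np.sqrt((csum[n:] - csum[:-n]) / n) + 1e-12
        return cc / win_rms

    # hypothetical usage with a 0.7 detection threshold:
    # hits = np.where(normalized_xcorr(template, day_trace) > 0.7)[0]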
NASA Astrophysics Data System (ADS)
Yun, Jinsik; Ha, Dong Sam; Inman, Daniel J.; Owen, Robert B.
2011-03-01
Structural damage to spacecraft is mainly caused by impacts, such as collisions with meteorites or space debris. We present a structural health monitoring (SHM) system for space applications, named Adverse Event Detection (AED), which integrates an acoustic sensor, an impedance-based SHM system, and a Lamb wave SHM system. With these three health-monitoring methods in place, we can determine the presence, location, and severity of damage. An acoustic sensor continuously monitors acoustic events, while the impedance-based and Lamb wave SHM systems are in sleep mode. If the acoustic sensor detects an impact, it activates the impedance-based SHM system, which determines whether the impact incurred damage. When damage is detected, it activates the Lamb wave SHM system to determine the severity and location of the damage. Further, since an acoustic sensor dissipates much less power than the two SHM systems and the two systems are activated only when there is an acoustic event, our system reduces overall power dissipation significantly. Our prototype system demonstrates the feasibility of the proposed concept.
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
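A small stand-in example in Python with scikit-learn shows the idea: a layered feed-forward network regresses a simulated signal on event inputs, with early stopping on a held-out split guarding against fitting autocorrelated noise. The data, architecture, and parameters are invented for illustration and are not the paper's setup.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    events = rng.random((500, 3))                 # event inputs per scan
    weights = np.array([1.0, -0.5, 0.3])          # hidden "true" mapping
    signal = np.tanh(events @ weights) + 0.1 * rng.standard_normal(500)

    # fMRI signal as the supervised target; early stopping monitors a
    # validation split so training halts before overfitting noise.
    ann = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                       validation_fraction=0.2, max_iter=2000,
                       random_state=0)
    ann.fit(events, signal)
    print(ann.score(events, signal))              # R^2 of the fit

Because the network is a universal approximator, the same pipeline can pick up non-linear and parametrically modulated responses without specifying their shape in advance, which is the advantage the abstract describes.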
Detecting Noisy Events Using Waveform Cross-Correlation at Superarrays of Seismic Stations
NASA Astrophysics Data System (ADS)
von Seggern, D. H.; Tibuleac, I. M.
2007-12-01
Cross-correlation using master events, followed by stacking of the correlation series, has been shown to dramatically improve detection thresholds of small-to-medium seismic arrays. With the goal of lowering the detection threshold, determining relative magnitudes or moments, and characterizing sources by empirical Green's functions, we extend the cross-correlation methodology to include "superarrays" of seismic stations. The superarray concept naturally brings further benefits over conventional arrays and single stations due to the fact that many distances and azimuths can be sampled. This extension is straightforward given the ease with which regional or global data from various stations or arrays can be currently accessed and combined into a single database. We demonstrate the capability of superarrays to detect and analyze events which lie below the detection threshold. This is aided by applying an F-statistic detector to the superarray cross-correlation stack and its components. Our first example illustrates the use of a superarray consisting of the Southern Great Basin Digital Seismic Network, a small-aperture array (NVAR) in Mina, Nevada, and the Earthscope Transportable Array to detect events in California-Nevada areas. In our second example, we use a combination of small-to-medium arrays and single stations to study the rupture of the great Sumatra earthquake of 26 December 2004 and to detect its early aftershocks. The locations and times of "detected" events are confirmed using a frequency-wavenumber method at the small-to-medium arrays. We propose that ad hoc superarrays can be used in many studies where conventional approaches previously used only single arrays or groups of single stations. The availability of near-real-time data from many networks and of archived data from, for instance, IRIS makes possible the easy assembly of superarrays. Furthermore, the continued improvement of seismic data availability and the continued growth in the number of world-wide seismic sensors will increasingly make superarrays an attractive choice for many studies.
Autonomous Science on the EO-1 Mission
NASA Technical Reports Server (NTRS)
Chien, S.; Sherwood, R.; Tran, D.; Castano, R.; Cichy, B.; Davies, A.; Rabideau, G.; Tang, N.; Burl, M.; Mandl, D.;
2003-01-01
In mid-2003, we will fly software to detect science events that will drive autonomous scene selection on board the New Millennium Earth Observing 1 (EO-1) spacecraft. This software will demonstrate the potential for future space missions to use onboard decision-making to detect science events and respond autonomously to capture short-lived science events and to downlink only the highest value science data.
Fischbach, Ephraim; Jenkins, Jere
2013-08-27
A flux detection apparatus can include a radioactive sample having a decay rate capable of changing in response to interaction with a first particle or a field, and a detector associated with the radioactive sample. The detector is responsive to a second particle or radiation formed by decay of the radioactive sample. The rate of decay of the radioactive sample can be correlated to flux of the first particle or the field. Detection of the first particle or the field can provide an early warning for an impending solar event.
NASA Astrophysics Data System (ADS)
Sato, Mitsuteru; Mihara, Masahiro; Ushio, Tomoo; Morimoto, Takeshi; Kikuchi, Hiroshi; Adachi, Toru; Suzuki, Makoto; Yamazaki, Atsushi; Takahashi, Yukihiro
2015-04-01
JEM-GLIMS has been conducting comprehensive nadir observations of lightning and TLEs using optical instruments and electromagnetic wave receivers since November 2012. For the period between November 20, 2012 and November 30, 2014, JEM-GLIMS succeeded in detecting 5,048 lightning events. A total of 567 of these 5,048 lightning events were TLEs, mostly elves. To identify sprite occurrences from the transient optical flash data, it is necessary to perform the following data analysis: (1) a subtraction of the appropriately scaled wideband camera data from the narrowband camera data; (2) a calculation of the intensity ratio between different spectrophotometer channels; and (3) an estimation of the polarization and CMC for the parent CG discharges using ground-based ELF measurement data. From a synthetic comparison of these results, it is confirmed that JEM-GLIMS succeeded in detecting sprite events. The VHF receiver (VITF) onboard JEM-GLIMS uses two patch-type antennas separated by a 1.6-m interval and can detect VHF pulses emitted by lightning discharges in the 70-100 MHz frequency range. Using both an interferometric technique and a group delay technique, we can estimate the source locations of VHF pulses excited by lightning discharges. In the event detected at 06:41:15.68565 UT on June 12, 2014 over central North America, the sprite was displaced horizontally by 20 km from the peak location of the parent lightning emission. In this event, a total of 180 VHF pulses were simultaneously detected by VITF. From detailed analysis of these VHF pulse data, it is found that the majority of the source locations were placed near the area of the dim lightning emission, which may imply that the VHF pulses were associated with the in-cloud lightning current. At the presentation, we will show a detailed comparison between the spatiotemporal characteristics of the sprite emission and the source locations of VHF pulses excited by the parent lightning discharges of sprites.
NASA Astrophysics Data System (ADS)
Hotokezaka, K.; Nissanke, S.; Hallinan, G.; Lazio, T. J. W.; Nakar, E.; Piran, T.
2016-11-01
Mergers of binary neutron stars and black hole-neutron star binaries produce gravitational-wave (GW) emission and outflows with significant kinetic energies. These outflows result in radio emission through synchrotron radiation. We explore the detectability of these synchrotron-generated radio signals by follow-up observations of GW merger events lacking a detection of electromagnetic counterparts in other wavelengths. We model radio light curves arising from (I) sub-relativistic merger ejecta and (II) ultra-relativistic jets. The former produce radio remnants on timescales of a few years and the latter produce γ-ray bursts in the direction of the jet and orphan radio afterglows extending over wider angles on timescales of weeks. Based on the derived light curves, we suggest an optimized survey at 1.4 GHz with five epochs separated by a logarithmic time interval. We estimate the detectability of the radio counterparts of simulated GW-merger events to be detected by advanced LIGO and Virgo by current and future radio facilities. The detectable distances for these GW merger events could be as high as 1 Gpc. Around 20%-60% of the long-lasting radio remnants will be detectable in the case of a moderate kinetic energy of 3 × 10^50 erg and a circum-merger density of 0.1 cm^-3 or larger, while 5%-20% of the orphan radio afterglows with kinetic energy of 10^48 erg will be detectable. The detection likelihood increases if one focuses on the well-localizable GW events. We discuss the background noise due to radio fluxes of host galaxies and false positives arising from extragalactic radio transients and variable active galactic nuclei, and we show that the quiet radio transient sky is of great advantage when searching for the radio counterparts.
Surprise-Induced Blindness: A Stimulus-Driven Attentional Limit to Conscious Perception
ERIC Educational Resources Information Center
Asplund, Christopher L.; Todd, J. Jay; Snyder, A. P.; Gilbert, Christopher M.; Marois, Rene
2010-01-01
The cost of attending to a visual event can be the failure to consciously detect other events. This processing limitation is well illustrated by the attentional blink paradigm, in which searching for and attending to a target presented in a rapid serial visual presentation stream of distractors can impair one's ability to detect a second target…
A computer aided treatment event recognition system in radiation therapy.
Xia, Junyi; Mart, Christopher; Bayouth, John
2014-01-01
To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5-month period (July 2012-November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks, with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. All treatment records from the 5-month analysis period were evaluated using all the checks incorporated into CATERS after the training period. A total of 553 events were detected as exceptions, although none of them had significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position override (3.2%), extra cone beam imaging (1.85%), and significant couch position deviation (1.31%). A significant couch deviation was defined as one where the couch vertical position exceeded two times the standard deviation of all couch vertical positions, or the couch lateral/longitudinal position exceeded three times the standard deviation of all couch lateral and longitudinal positions. On average, the application takes about 1 s per patient when executed on either a desktop computer or a mobile device. CATERS offers an effective tool to detect and report treatment events. Automation and rapid processing enable electronic record interrogation daily, alerting the medical physicist to deviations potentially days before the weekly check. The output of CATERS could also be utilized as an important input to failure mode and effects analysis.
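The couch-deviation rule quoted above can be expressed compactly; here is a hypothetical Python/numpy sketch, assuming deviations are measured from the mean position (the abstract does not state the reference point).

    import numpy as np

    def flag_couch_deviations(vert, lat, lng):
        """Flag fractions where couch vertical deviates by more than 2
        standard deviations, or lateral/longitudinal by more than 3,
        from their respective means across all fractions."""
        vert, lat, lng = map(np.asarray, (vert, lat, lng))
        dev = lambda x, k: np.abs(x - x.mean()) > k * x.std()
        return dev(vert, 2) | dev(lat, 3) | dev(lng, 3)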
Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang
2014-01-01
The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two other methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some testing events, because SNPs in the binding sites of the primer/probe would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each testing event. This study provides a general P35S screening method, with greater coverage than existing methods. PMID:25483893
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background: Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective: To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results: The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion: The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
ERIC Educational Resources Information Center
Taft, Laritza M.
2010-01-01
In its report "To Err is Human", the Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…
Effects of Audio-Visual Integration on the Detection of Masked Speech and Non-Speech Sounds
ERIC Educational Resources Information Center
Eramudugolla, Ranmalee; Henderson, Rachel; Mattingley, Jason B.
2011-01-01
Integration of simultaneous auditory and visual information about an event can enhance our ability to detect that event. This is particularly evident in the perception of speech, where the articulatory gestures of the speaker's lips and face can significantly improve the listener's detection and identification of the message, especially when that…
Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier
Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus
2014-01-01
National security has gained vital importance due to increasing number of suspicious and terrorist events across the globe. Use of different subfields of information technology has also gained much attraction of researchers and practitioners to design systems which can detect main members which are actually responsible for such kind of events. In this paper, we present a novel method to predict key players from a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies to destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674
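A sketch of the feature-extraction stage in Python using networkx: standard centrality measures computed per node, which a downstream classifier (the paper's hybrid classifier is not reproduced here) would consume. The z-score ranking at the end is illustrative only, and the karate-club graph is a stand-in for a covert network.

    import networkx as nx
    import numpy as np

    def centrality_features(G):
        """Per-node feature vectors built from standard centralities."""
        measures = [nx.degree_centrality(G),
                    nx.betweenness_centrality(G),
                    nx.closeness_centrality(G),
                    nx.eigenvector_centrality(G, max_iter=1000)]
        return {n: np.array([m[n] for m in measures]) for n in G.nodes}

    G = nx.karate_club_graph()          # stand-in for a covert network
    feats = centrality_features(G)
    X = np.array([feats[n] for n in G.nodes])
    z = (X - X.mean(0)) / X.std(0)      # z-score each centrality column
    # crude key-player ranking by summed standardized centralities:
    print(sorted(zip(z.sum(1), G.nodes), reverse=True)[:3])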
The Third Swift Burst Alert Telescope Gamma-Ray Burst Catalog
NASA Astrophysics Data System (ADS)
Lien, Amy; Sakamoto, Takanori; Barthelmy, Scott D.; Baumgartner, Wayne H.; Cannizzo, John K.; Chen, Kevin; Collins, Nicholas R.; Cummings, Jay R.; Gehrels, Neil; Krimm, Hans A.; Markwardt, Craig B.; Palmer, David M.; Stamatikos, Michael; Troja, Eleonora; Ukwatta, T. N.
2016-09-01
To date, the Burst Alert Telescope (BAT) onboard Swift has detected ˜1000 gamma-ray bursts (GRBs), of which ˜360 GRBs have redshift measurements, ranging from z = 0.03 to z = 9.38. We present the analyses of the BAT-detected GRBs for the past ˜11 years up through GRB 151027B. We report summaries of both the temporal and spectral analyses of the GRB characteristics using event data (i.e., data for each photon within approximately 250 s before and 950 s after the BAT trigger time), and discuss the instrumental sensitivity and selection effects of GRB detections. We also explore the GRB properties with redshift when possible. The result summaries and data products are available at http://swift.gsfc.nasa.gov/results/batgrbcat/index.html. In addition, we perform searches for GRB emissions before or after the event data using the BAT survey data. We estimate the false detection rate to be only one false detection in this sample. There are 15 ultra-long GRBs (˜2% of the BAT GRBs) in this search with confirmed emission beyond ˜1000 s of event data, and only two GRBs (GRB 100316D and GRB 101024A) with detections in the survey data prior to the starting of event data.
Closing the Loop in ICU Decision Support: Physiologic Event Detection, Alerts, and Documentation
Norris, Patrick R.; Dawant, Benoit M.
2002-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally care providers should be alerted only when events are clinically significant and there is opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts on the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users’ alphanumeric pagers, and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to potentially improve event definitions and enhance system utility.
Slow Earthquakes in the Microseism Frequency Band (0.1-1.0 Hz) off Kii Peninsula, Japan
NASA Astrophysics Data System (ADS)
Kaneko, Lisa; Ide, Satoshi; Nakano, Masaru
2018-03-01
It is difficult to detect the signal of slow deformation in the 0.1-1.0 Hz frequency band between tectonic tremors and very low frequency events, where microseism noise is dominant. Here we provide the first evidence of slow earthquakes in this microseism band, observed by the DONET1 ocean bottom seismometer network, after an Mw 5.8 earthquake off Kii Peninsula, Japan, on 1 April 2016. The signals in the microseism band were accompanied by signals from active tremors, very low frequency events, and slow slip events that radiated from the shallow plate interface. We report the detection and locations of events across five frequency bands, including the microseism band. The locations and timing of the events estimated in the different frequency bands are similar, suggesting that these signals radiated from a common source. The observed variations in detectability for each band highlight the complexity of the slow earthquake process.
Wang, Su-hua; Baillargeon, Renée
2009-01-01
As they observe or produce events, infants identify variables that help them predict outcomes in each category of events. How do infants identify a new variable? An explanation-based learning (EBL) account suggests three essential steps: (1) observing contrastive outcomes relevant to the variable; (2) discovering the conditions associated with these outcomes; and (3) generating an explanation for the condition-outcome regularity discovered. In Experiments 1–3, 9-month-old infants watched events designed to “teach” them the variable height in covering events. After watching these events, designed in accord with the EBL account, the infants detected a height violation in a covering event, three months earlier than they ordinarily would have. In Experiments 4–6, the “teaching” events were modified to remove one of the EBL steps, and the infants no longer detected the height violation. The present findings thus support the EBL account and help specify the processes by which infants acquire their physical knowledge. PMID:18177635
Risk factors for hazardous events in olfactory-impaired patients.
Pence, Taylor S; Reiter, Evan R; DiNardo, Laurence J; Costanzo, Richard M
2014-10-01
Normal olfaction provides essential cues to allow early detection and avoidance of potentially hazardous situations. Thus, patients with impaired olfaction may be at increased risk of experiencing certain hazardous events such as cooking or house fires, delayed detection of gas leaks, and exposure to or ingestion of toxic substances. To identify risk factors and potential trends over time in olfactory-related hazardous events in patients with impaired olfactory function. Retrospective cohort study of 1047 patients presenting to a university smell and taste clinic between 1983 and 2013. A total of 704 patients had both clinical olfactory testing and a hazard interview and were studied. On the basis of olfactory function testing results, patients were categorized as normosmic (n = 161), mildly hyposmic (n = 99), moderately hyposmic (n = 93), severely hyposmic (n = 142), and anosmic (n = 209). Patient evaluation including interview, examination, and olfactory testing. Incidence of specific olfaction-related hazardous events (ie, burning pots and/or pans, starting a fire while cooking, inability to detect gas leaks, inability to detect smoke, and ingestion of toxic substances or spoiled foods) by degree of olfactory impairment. The incidence of having experienced any hazardous event progressively increased with degree of impairment: normosmic (18.0%), mildly hyposmic (22.2%), moderately hyposmic (31.2%), severely hyposmic (32.4%), and anosmic (39.2%). Over 3 decades there was no significant change in the overall incidence of hazardous events. Analysis of demographic data (age, sex, race, smoking status, and etiology) revealed significant differences in the incidence of hazardous events based on age (among 397 patients <65 years, 148 [37.3%] with hazardous event, vs 31 of 146 patients ≥65 years [21.3%]; P < .001), sex (among 278 women, 106 [38.1%] with hazardous event, vs 73 of 265 men [27.6%]; P = .009), and race (among 98 African Americans, 41 [41.8%] with hazardous event, vs 134 of 434 whites [30.9%]; P = .04). Increased level of olfactory impairment portends an increased risk of experiencing a hazardous event. Risk is further impacted by individuals' age, sex, and race. These results may assist health care practitioners in counseling patients on the risks associated with olfactory impairment.
Detection of dominant flow and abnormal events in surveillance video
NASA Astrophysics Data System (ADS)
Kwak, Sooyeong; Byun, Hyeran
2011-02-01
We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method; it is a feature-based approach, so it does not track each moving object individually. The algorithm identifies dominant flow in crowded environments, without individual object tracking, using a latent Dirichlet allocation model. It can also automatically detect and localize an abnormally moving object in real-life video. Performance tests were conducted on several real-life databases, and their results show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The algorithm can be applied to any situation in which objects moving in abnormal directions or at abnormal speeds must be detected.
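The core of such a feature-based approach can be sketched in a few lines. The following minimal Python sketch (clip size, vocabulary size, and threshold are illustrative assumptions, not the authors' implementation) quantizes optical-flow vectors into motion "words", fits a latent Dirichlet allocation model to per-clip histograms to capture dominant flow, and flags clips the model finds improbable:

```python
# Sketch of LDA-based dominant-flow modeling; flow vectors are assumed to be
# pre-extracted, and the random data below is a stand-in for real clips.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

def quantize_flow(angles, speeds, n_dir=8, n_spd=4, spd_max=10.0):
    """Map (angle, speed) pairs to discrete motion 'words'."""
    d = (angles % (2 * np.pi)) // (2 * np.pi / n_dir)
    s = np.clip(speeds // (spd_max / n_spd), 0, n_spd - 1)
    return (d * n_spd + s).astype(int)

rng = np.random.default_rng(0)
train_docs = rng.poisson(3.0, size=(200, 32))   # rows = motion-word histograms

lda = LatentDirichletAllocation(n_components=5, random_state=0)
lda.fit(train_docs)                              # topics ~ dominant flow patterns

test_doc = rng.poisson(3.0, size=(1, 32))
score = lda.score(test_doc) / test_doc.sum()     # per-word log-likelihood
is_abnormal = score < -4.0                       # illustrative threshold
```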
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.; Rowe, C. A.
2016-12-01
Unlike other subduction zones, the Cascadia subduction zone (CSZ) is notable for the absence of detected and located small and moderate magnitude interplate earthquakes, despite the presence of recurring episodic tremor and slip (ETS) downdip and evidence of pre-historic great earthquakes. Thermal and geodetic models indicate that the seismogenic zone exists primarily, if not entirely, offshore; therefore the perceived unusual seismic quiescence may be a consequence of seismic source location in relation to land-based seismometers. The Cascadia Initiative (CI) amphibious community seismic experiment includes ocean bottom seismometers (OBS) deployed directly above the presumed locked seismogenic zone. We use the CI dataset to search for small magnitude interplate earthquakes previously undetected using the on-land sensors alone. We implement subspace detection to search for small earthquakes. We build our subspace with template events from existing earthquake catalogs that appear to have occurred on the plate interface, windowing waveforms on CI OBS and land seismometers. Although our efforts will target the entire CSZ margin and full 4-year CI deployment, here we focus on a previously identified cluster off the coast of Oregon, related to a subducting seamount. During the first year of CI deployment, this target area yields 293 unique detections with 86 well-located events. Thirty-two of these events occurred within the seamount cluster, and 13 events were located in another cluster to the northwest of the seamount. Events within the seamount cluster are separated into those whose depths place them on the plate interface, and a shallower set (~5 km depth). These separate event groups track together temporally, and seem to agree with a model of seamount subduction that creates extensive fracturing around the seamount, rather than stress concentrated at the seamount-plate boundary. During CI year 2, this target area yields >1000 additional event detections.
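The projection statistic at the heart of a subspace detector can be illustrated compactly. The sketch below (single channel; rank, window length, and threshold are illustrative assumptions, not values from this study) builds an orthonormal basis from aligned templates via SVD and flags windows whose energy is mostly captured by that basis:

```python
# Minimal single-channel subspace detector, assuming templates are already
# aligned and of equal length.
import numpy as np

def build_subspace(templates, dim):
    """SVD of stacked, normalized templates -> orthonormal basis of rank `dim`."""
    X = np.vstack([t / np.linalg.norm(t) for t in templates]).T  # (nsamp, ntmpl)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]

def detect(data, U, threshold=0.6):
    """Slide a window over `data`; flag windows whose energy fraction captured
    by the subspace, c = |U^T x|^2 / |x|^2, exceeds the threshold."""
    n = U.shape[0]
    picks = []
    for i in range(len(data) - n):
        x = data[i:i + n]
        e = x @ x
        if e == 0.0:
            continue
        c = np.sum((U.T @ x) ** 2) / e
        if c > threshold:
            picks.append((i, c))
    return picks
```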
Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong
2017-07-01
One novel standard reference plasmid, pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as of the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection corresponding to approximately 1-10 copies of the rice haploid genome. In this quantitative PCR assay, the squared regression coefficients (R²) ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method, and the results suggest it could be used routinely to identify these five GM rice events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novak, Avrey; Nyflot, Matthew J.; Ermoian, Ralph P.
Purpose: Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. Methods: From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4 reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, of the 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Results: Near-miss incidents most commonly originated within treatment planning (33%). However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, average NMRI of all events = 1.5), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, the therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are not of a type that the barrier is designed to capture. Conclusions: Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation, with the most severe near-miss events detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
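The aggregation step described in the Methods (average NMRI by origination and detection category) is a simple group-by operation. A toy re-creation with pandas (column names and data are hypothetical, not from the study):

```python
# Toy incident-learning aggregation: where do incidents arise, where are they
# caught, and how severe are they on average?
import pandas as pd

incidents = pd.DataFrame({
    "origin":   ["treatment planning", "imaging for RT planning", "treatment delivery"],
    "detected": ["treatment delivery", "treatment delivery", "on-treatment QM"],
    "nmri":     [1, 3, 2],   # 0 (no potential harm) .. 4 (potential critical harm)
})

print(incidents.groupby("origin")["nmri"].agg(["count", "mean"]))
print(incidents.groupby("detected")["nmri"].agg(["count", "mean"]))
```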
The hard X-ray burst spectrometer event listing 1980, 1981 and 1982
NASA Technical Reports Server (NTRS)
Dennis, B. R.; Frost, K. J.; Orwig, L. E.; Kiplinger, A.; Dennis, H. E.; Gibson, B. R.; Kennard, G. S.; Tolbert, A. K.
1983-01-01
A comprehensive reference is provided for the hard X-ray bursts detected with the Hard X-Ray Burst Spectrometer on the Solar Maximum Mission from the time of launch on February 14, 1980, to March 1983. Over 6300 X-ray events were detected in the energy range from 30 to approximately 500 keV, with the vast majority being solar flares. The listing includes the start time, peak time, duration, and peak rate of each event.
NASA Astrophysics Data System (ADS)
Vasterling, Margarete; Wegler, Ulrich; Bruestle, Andrea; Becker, Jan
2016-04-01
Real time information on the locations and magnitudes of induced earthquakes is essential for response plans based on the magnitude frequency distribution. We developed and tested a real-time cross-correlation detector focusing on induced microseismicity in deep geothermal reservoirs. The incoming seismological data are cross-correlated in real time with a set of known master events. We use the envelopes of the seismograms rather than the seismograms themselves to account for small changes in the source locations or in the focal mechanisms. Two detection conditions are implemented: after an event first passes a single-trace correlation condition, a network correlation is calculated, taking the amplitude information of the seismic network into account. The magnitude is estimated using the ratio of the maximum amplitudes of the master event and the detected event. The detector is implemented as a real-time tool and put into practice as a SeisComp3 module, an established open-source software package for seismological real-time data handling and analysis. We validated the reliability and robustness of the detector by an offline playback test using four months of data from monitoring of the geothermal power plant in Insheim (Upper Rhine Graben, SW Germany). Subsequently, in October 2013 the detector was installed as a real-time monitoring system within the project "MAGS2 - Microseismic Activity of Geothermal Systems". Master events from the two neighboring geothermal power plants in Insheim and Landau and two nearby quarries are defined. After detection, manual phase determination and event location are performed at the local seismological survey of the Geological Survey and Mining Authority of Rhineland-Palatinate. By November 2015 the detector had identified 454 events, of which 95% were assigned correctly to the respective source; the remaining 5% were misdetections caused by local tectonic events. To evaluate the completeness of the automatically obtained catalogue, it is compared to the event catalogue of the Seismological Service of Southwestern Germany and to the events reported by the company tasked with seismic monitoring of the Insheim power plant. Events missed by the cross-correlation detector are generally very small. They are registered at too few stations to meet the detection criteria. Most of these small events were not locatable. The automatic catalogue has a magnitude of completeness around 0.0 and is significantly more detailed than the catalogue from standard processing of the Seismological Service of Southwestern Germany for this region. For events in the magnitude range of the master event, the magnitude estimated from the amplitude ratio reproduces the local magnitude well. For weaker events there tends to be a small offset. Altogether, the developed real-time cross-correlation detector provides robust detections with reliable association of the events to the respective sources and valid magnitude estimates. Thus, it provides input parameters for the mitigation of seismic hazard by using response plans in real time.
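The two-stage envelope cross-correlation logic and the amplitude-ratio magnitude estimate can be sketched as follows (data are assumed to be pre-filtered NumPy arrays, one per station; thresholds, the master magnitude, and the simplified global normalization are illustrative assumptions, not the SeisComp3 module's internals):

```python
import numpy as np
from scipy.signal import hilbert

def envelope(tr):
    return np.abs(hilbert(tr))

def xcorr_max(a, b):
    """Max cross-correlation of envelope b sliding over longer envelope a
    (global, not sliding, normalization for brevity)."""
    b = (b - b.mean()) / (b.std() * len(b))
    a = (a - a.mean()) / a.std()
    return np.correlate(a, b, mode="valid").max()

def detect(streams, masters, single_thr=0.7, network_thr=0.6, m_master=1.2):
    """streams/masters: dicts station -> trace. Returns (detected, magnitude)."""
    cc = {s: xcorr_max(envelope(streams[s]), envelope(masters[s])) for s in masters}
    if max(cc.values()) < single_thr:              # stage 1: single-trace condition
        return False, None
    if np.mean(list(cc.values())) < network_thr:   # stage 2: network condition
        return False, None
    # Relative magnitude from the ratio of maximum amplitudes (best station).
    s0 = max(cc, key=cc.get)
    ratio = np.abs(streams[s0]).max() / np.abs(masters[s0]).max()
    return True, m_master + np.log10(ratio)
```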
Automatic near-real-time detection of CMEs in Mauna Loa K-Cor coronagraph images
NASA Astrophysics Data System (ADS)
Thompson, W. T.; St Cyr, O. C.; Burkepile, J.; Posner, A.
2017-12-01
A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with an estimate of their speed, in near-real-time using images of the linearly polarized white-light solar corona taken by the K-Cor telescope at the Mauna Loa Solar Observatory (MLSO). The algorithm used is a variation on the Solar Eruptive Event Detection System (SEEDS) developed at George Mason University. The algorithm was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days which the MLSO website marked as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% in 2014-2015 when solar activity was high, down to as low as 20-30% in 2017 when activity was low. The difference in effectiveness with solar cycle is attributed to the difference in relative prevalence of strong CMEs between active and quiet periods. There were also twelve false detections during this time period, leading to an average false detection rate of 8.6% on any given day. However, half of the false detections were clustered into two short periods of a few days each when special conditions prevailed to increase the false detection rate. The K-Cor data were also compared with major Solar Energetic Particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft where K-Cor was observing during the relevant time period. The K-Cor CME detection algorithm successfully generated alerts for two of these events, with lead times of 1-3 hours before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width of the CME in position angle.
Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing
2005-11-30
To support the genetically modified organism (GMO) labeling policies issued in many countries and regions, polymerase chain reaction (PCR) methods (screening, gene-specific, construct-specific, and event-specific) have been developed for the execution of these policies and have become a mainstay of GMO detection. Event-specific PCR detection is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems were reliable, sensitive, and accurate.
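The quantification step in such event-specific TaqMan assays typically converts Ct values to copy numbers via a standard curve, then expresses GM content as the ratio of event copies to endogenous-gene copies. A minimal sketch (standard-curve parameters and Ct values are hypothetical; a slope of -3.32 corresponds to roughly 100% amplification efficiency):

```python
import numpy as np

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

ct_event, ct_endogenous = 28.4, 24.1
gm_percent = 100 * copies_from_ct(ct_event) / copies_from_ct(ct_endogenous)
print(f"GM content: {gm_percent:.2f}%")
```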
MOLECULAR DIAGNOSTICS - ANOTHER PIECE IN THE ENVIRONMENTAL PUZZLE
Molecular biology offers sensitive and expedient tools for the detection of exposure to environmental stressors. Molecular approaches provide the means for detection of the "first cellular event(s)" in response to environmental changes-specifically, immediate changes in gene expr...
Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.
Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S
2015-08-01
The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data.
NASA Astrophysics Data System (ADS)
Vasterling, Margarete; Wegler, Ulrich; Becker, Jan; Brüstle, Andrea; Bischoff, Monika
2017-01-01
We develop and test a real-time envelope cross-correlation detector for use in seismic response plans to mitigate the hazard of induced seismicity. The incoming seismological data are cross-correlated in real time with a set of previously recorded master events. For robustness against small changes in the earthquake source locations or in the focal mechanisms, we cross-correlate the envelopes of the seismograms rather than the seismograms themselves. Two sequenced detection conditions are implemented: after passing a single-trace cross-correlation condition, a network cross-correlation is calculated taking amplitude ratios between stations into account. Besides detection of an earthquake and its assignment to the respective reservoir, real-time magnitudes are important for seismic response plans. We estimate the magnitudes of induced microseismicity using the relative amplitudes between the master event and the detected event. The real-time detector is implemented as a SeisComP3 module. We carry out offline and online performance tests using seismic monitoring data from the Insheim and Landau geothermal power plants (Upper Rhine Graben, Germany), also including blasts from a nearby quarry. The comparison of the automatic real-time catalogue with a manually processed catalogue shows that, with the implemented parameters, events are always correctly assigned to the respective reservoir (4 km distance between reservoirs) or the quarry (8 km and 10 km distance, respectively, from the reservoirs). The real-time catalogue achieves a magnitude of completeness around 0.0. Four per cent of the events assigned to the Insheim reservoir and zero per cent of the Landau events are misdetections. All wrong detections are local tectonic events, whereas none are caused by seismic noise.
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
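The stacked multichannel cross-correlation at the core of matched filtering can be written compactly. The sketch below (channels and templates are assumed to share a sampling rate; the MAD threshold factor is an illustrative assumption, not the MFA's published setting) computes a properly normalized cross-correlation per channel, stacks the traces, and picks samples exceeding a robust noise threshold:

```python
import numpy as np

def ncc(data, tmpl):
    """Normalized cross-correlation of one channel (valid mode)."""
    n = len(tmpl)
    t = (tmpl - tmpl.mean()) / (tmpl.std() * n)
    out = np.correlate(data, t, mode="valid")
    # Sliding mean/variance of the data for proper local normalization.
    c1 = np.cumsum(np.insert(data, 0, 0.0))
    c2 = np.cumsum(np.insert(data ** 2, 0, 0.0))
    mean = (c1[n:] - c1[:-n]) / n
    var = (c2[n:] - c2[:-n]) / n - mean ** 2
    return out / np.sqrt(np.maximum(var, 1e-20))

def detect_children(channels, templates, k=8.0):
    """Stack per-channel NCC; flag samples exceeding k * MAD of the stack."""
    stack = np.mean([ncc(d, t) for d, t in zip(channels, templates)], axis=0)
    mad = np.median(np.abs(stack - np.median(stack)))
    return np.where(stack > k * mad)[0], stack
```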
Expanding the 2011 Prague, OK Event Catalog: Detections, Relocations, and Stress Drop Estimates
NASA Astrophysics Data System (ADS)
Clerc, F.; Cochran, E. S.; Dougherty, S. L.; Keranen, K. M.; Harrington, R. M.
2016-12-01
The Mw 5.6 earthquake occurring on 6 Nov. 2011, near Prague, OK, is thought to have been triggered by a Mw 4.8 foreshock, which was likely induced by fluid injection into local wastewater disposal wells [Keranen et al., 2013; Sumy et al., 2014]. Previous stress drop estimates for the sequence have suggested values lower than those for most Central and Eastern U.S. tectonic events of similar magnitudes [Hough, 2014; Sun & Hartzell, 2014; Sumy & Neighbors et al., 2016]. Better stress drop estimates allow more realistic assessment of seismic hazard and more effective regulation of wastewater injection. More reliable estimates of source properties may help to differentiate induced events from natural ones. Using data from local and regional networks, we perform event detections, relocations, and stress drop calculations of the Prague aftershock sequence. We use the Match & Locate method, a variation on the matched-filter method which detects events of lower magnitudes by stacking cross-correlograms from different stations [Zhang & Wen, 2013; 2015], in order to create a more complete catalog from 6 Nov to 31 Dec 2011. We then relocate the detected events using the HypoDD double-difference algorithm. Using our enhanced catalog and relocations, we examine the seismicity distribution for evidence of migration and investigate implications for triggering mechanisms. To account for path and site effects, we calculate stress drops using the Empirical Green's Function (EGF) spectral ratio method, beginning with 2730 previously relocated events. We determine whether there is a correlation between the stress drop magnitudes and the spatial and temporal distribution of events, including depth, position relative to existing faults, and proximity to injection wells. Finally, we consider the range of stress drop values and scaling with respect to event magnitudes within the context of previously published work for the Prague sequence as well as other induced and natural sequences.
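Once a corner frequency has been estimated from an EGF spectral ratio, the stress drop typically follows from the standard Brune source model; the relation below is that generic formula, not necessarily the authors' exact parameterization, and the numerical values are placeholders:

```python
# Brune-model stress drop: source radius r = k * beta / fc, then
# delta_sigma = 7 * M0 / (16 * r^3), all in SI units.
import numpy as np

def brune_stress_drop(m0, fc, beta=3500.0, k=0.37):
    r = k * beta / fc                      # source radius, m
    return 7.0 * m0 / (16.0 * r ** 3)      # stress drop, Pa

m0 = 10 ** (1.5 * 3.0 + 9.1)               # seismic moment of an Mw 3.0 event, N*m
print(f"{brune_stress_drop(m0, fc=5.0) / 1e6:.2f} MPa")
```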
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warburton, William K.; Hennig, Wolfgang G.
A method and apparatus for measuring the concentrations of radioxenon isotopes in a gaseous sample wherein the sample cell is surrounded by N sub-detectors that are sensitive to both electrons and to photons from radioxenon decays. Signal processing electronics are provided that can detect events within the sub-detectors, measure their energies, determine whether they arise from electrons or photons, and detect coincidences between events within the same or different sub-detectors. The energies of detected two or three event coincidences are recorded as points in associated two or three-dimensional histograms. Counts within regions of interest in the histograms are then used to compute estimates of the radioxenon isotope concentrations. The method achieves lower backgrounds and lower minimum detectable concentrations by using smaller detector crystals, eliminating interference between double and triple coincidence decay branches, and segregating double coincidences within the same sub-detector from those occurring between different sub-detectors.
Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.
2016-06-07
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
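The two ideas named here, low-probability scoring and a regulatable alert rate, can be illustrated with a short sketch (distribution choice, feature, and target alert rate are illustrative assumptions, not the patented system's design):

```python
# Probability-based anomaly scoring with a user-regulatable alert rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
baseline = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)  # e.g., bytes per flow

# Fit a density to historical events; anomalousness = -log p(x).
shape, loc, scale = stats.lognorm.fit(baseline, floc=0)
def score(x):
    return -stats.lognorm.logpdf(x, shape, loc, scale)

# Regulatability: derive the threshold from a target alert rate (0.1% here).
threshold = np.quantile(score(baseline), 0.999)
new_events = rng.lognormal(3.0, 0.5, size=5)
alerts = new_events[score(new_events) > threshold]
```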
Time difference of arrival to blast localization of potential chemical/biological event on the move
NASA Astrophysics Data System (ADS)
Morcos, Amir; Desai, Sachi; Peltzer, Brian; Hohil, Myron E.
2007-10-01
By integrating a sensor suite with the ability to discriminate potential chemical/biological (CB) events from high-explosive (HE) events, employing a standalone acoustic sensor with a Time Difference of Arrival (TDOA) algorithm, we developed a cueing mechanism for more power-intensive and range-limited sensing techniques. The event detection algorithm locates a blast event using TDOA and then provides further information about the event: whether it is a launch or an impact, and whether it is CB or HE. This added information is provided to a range-limited chemical sensing system that exploits spectroscopy to determine the contents of the chemical event. The main innovation of this sensor suite is that the system provides this information on the move, while the chemical sensor has adequate time to determine the contents of the event from a safe stand-off distance. The CB/HE discrimination algorithm exploits acoustic sensors to provide early detection and identification of CB attacks. Distinct characteristics arise within the different airburst signatures because HE warheads emphasize concussive and shrapnel effects, while CB warheads are designed to disperse their contents over large areas, therefore employing a slower-burning, less intense explosive to mix and spread their contents. These differences are characterized by variations in the corresponding peak pressure and rise time of the blast, differences in the ratio of positive pressure amplitude to negative amplitude, and variations in the overall duration of the resulting waveform. The discrete wavelet transform (DWT) is used to extract the predominant components of these characteristics from airburst signatures at ranges exceeding 3 km. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients and the higher-frequency details found within different levels of the multiresolution decomposition. The development of an adaptive noise floor for early event detection helps minimize the false alarm rate and increase confidence in whether a candidate event is a blast event or background noise. Integrating these algorithms with the TDOA algorithm yields a suite that can give early warning detection and a highly reliable look direction from a great stand-off distance, allowing a moving vehicle to determine whether a candidate blast event is CB and, if so, the composition of the resulting cloud.
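The wavelet-feature idea (energy distribution across multiresolution levels fed to a feed-forward network) can be sketched as follows; the wavelet family, decomposition depth, and synthetic training data are illustrative assumptions, not the authors' configuration:

```python
# DWT energy features for blast-signature discrimination, with a small
# feed-forward classifier; random data stands in for segmented waveforms.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_features(sig, wavelet="db4", level=5):
    """Relative energy of wavelet coefficients at each decomposition level."""
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    e = np.array([np.sum(c ** 2) for c in coeffs])
    return e / e.sum()

rng = np.random.default_rng(0)
X = np.array([dwt_features(rng.standard_normal(1024)) for _ in range(40)])
y = rng.integers(0, 2, size=40)                 # 0 = HE, 1 = CB (toy labels)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```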
Pacheco Coello, Ricardo; Pestana Justo, Jorge; Factos Mendoza, Andrés; Santos Ordoñez, Efrén
2017-12-20
In Ecuador, food products must be labeled if their transgenic content exceeds 0.9% of the whole product. For the detection of genetically modified organisms (GMOs), three DNA extraction methods were tested on 35 food products commercialized in Ecuador. Samples with positive amplification of endogenous genes were screened for the presence of the Cauliflower mosaic virus 35S promoter (P35S) and the nopaline synthase terminator (Tnos). TaqMan™ probes were used to determine the transgenic content of the GTS 40-3-2 and MON810 events through quantitative PCR (qPCR). Twenty-six processed food samples were positive for P35S alone and eight samples for both Tnos and P35S. Absolute qPCR results indicated that eleven samples were positive for the GTS 40-3-2 specific event and two for the MON810 specific event. A total of nine samples exceeded the allowed threshold of transgenic content in the whole food product for the GTS 40-3-2 and MON810 events. Different food products may require different DNA extraction protocols for GMO detection through PCR. Among the three methods tested, the DNeasy mericon food kit DNA extraction method yielded the highest proportion of amplified endogenous genes through PCR. Finally, event-specific GMOs were detected in food products in Ecuador.
Alsep data processing: How we processed Apollo Lunar Seismic Data
NASA Technical Reports Server (NTRS)
Latham, G. V.; Nakamura, Y.; Dorman, H. J.
1979-01-01
The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10⁸ bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.
NASA Astrophysics Data System (ADS)
Tang, Xiaojing
Fast and accurate monitoring of tropical forest disturbance is essential for understanding current patterns of deforestation as well as helping eliminate illegal logging. This dissertation explores the use of data from different satellites for near real-time monitoring of forest disturbance in tropical forests, including: development of new monitoring methods; development of new assessment methods; and assessment of the performance and operational readiness of existing methods. Current methods for accuracy assessment of remote sensing products do not address the central priority of near real-time monitoring: detecting disturbance events as early as possible. I introduce a new assessment framework for near real-time products that focuses on the timing and the minimum detectable size of disturbance events. The new framework reveals the relationship between change detection accuracy and the time needed to identify events. In regions that are frequently cloudy, near real-time monitoring using data from a single sensor is difficult. This study extends the work by Xin et al. (2013) and develops a new time series method (Fusion2) based on fusion of Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data. Results for three test sites in the Amazon Basin show that Fusion2 can detect 44.4% of the forest disturbance within 13 clear observations (82 days) after the initial disturbance. The smallest event detected by Fusion2 is 6.5 ha. Also, Fusion2 detects disturbance faster and has less commission error than more conventional methods. In a comparison of coarse resolution sensors, the combination of MODIS Terra and Aqua provides faster and more accurate detection of disturbance events than VIIRS (Visible Infrared Imaging Radiometer Suite) and MODIS single sensor data. The performance of near real-time monitoring using VIIRS is slightly worse than MODIS Terra but significantly better than MODIS Aqua. New monitoring methods developed in this dissertation provide forest protection organizations the capacity to monitor illegal logging events promptly. In the future, combining two Landsat and two Sentinel-2 satellites will provide global coverage at 30 m resolution every 4 days, and routine monitoring may be possible at high resolution. The methods and assessment framework developed in this dissertation are adaptable to newly available datasets.
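Many near real-time disturbance algorithms of this family flag a pixel when several consecutive observations fall significantly below the reflectance trajectory expected for stable forest. The following is a generic sketch of that consecutive-anomaly rule (the exact Fusion2 logic is not given in this abstract; window length and z-score are illustrative):

```python
import numpy as np

def flag_disturbance(obs, expected, sigma, k=3, z=2.0):
    """Return the onset index of the first run of k observations more than
    z * sigma below the expected stable-forest trajectory, else None."""
    anomalous = (expected - obs) > z * sigma
    run = 0
    for i, a in enumerate(anomalous):
        run = run + 1 if a else 0
        if run == k:
            return i - k + 1
    return None
```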
NASA Astrophysics Data System (ADS)
Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.
2017-12-01
Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection; along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and acquiring data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016. Over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also make location estimates, potentially in near real time. In this presentation, we review the data recorded for these two events recorded at Stanford and in Mexico. We compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. While many such fiber-optic networks are already in place, new arrays can be created on demand, using existing fiber-optic telecom cables, for specific monitoring situations such as recording aftershocks of a large earthquake or monitoring induced seismicity.
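A common first stage for detecting earthquakes on dense DAS arrays like these is a per-channel STA/LTA trigger combined with a coincidence requirement across neighboring channels. A minimal sketch (window lengths, threshold, and channel count are illustrative assumptions, not details of the Stanford or pipeline systems):

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """Ratio of short- to long-term average power, with both trailing
    windows ending at the same sample."""
    csum = np.cumsum(np.insert(x ** 2, 0, 0.0))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    return sta[nlta - nsta:] / np.maximum(lta, 1e-20)

def network_trigger(das, nsta=50, nlta=1000, thr=4.0, min_channels=20):
    """das: (n_channels, n_samples) strain-rate array. Declare an event when
    enough channels exceed the STA/LTA threshold (coincidence trigger)."""
    n_fired = sum(sta_lta(tr, nsta, nlta).max() > thr for tr in das)
    return n_fired >= min_channels
```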
Hilsden, Robert J; Dube, Catherine; Heitman, Steven J; Bridges, Ronald; McGregor, S Elizabeth; Rostom, Alaa
2015-11-01
Although several quality indicators of colonoscopy have been defined, quality assurance activities should be directed at the measurement of quality indicators that are predictive of key screening colonoscopy outcomes. The goal of this study was to examine the association among established quality indicators and the detection of screen-relevant lesions (SRLs), adverse events, and postcolonoscopy cancers. Historical cohort study. Canadian colorectal cancer screening center. A total of 18,456 asymptomatic men and women ages 40 to 74, at either average risk or increased risk for colorectal cancer because of a family history, who underwent a screening colonoscopy from 2008 to 2010. Using univariate and multivariate analyses, we explored the association among procedural quality indicators and 3 colonoscopy outcomes: detection of SRLs, adverse events, and postcolonoscopy cancers. The crude rates of SRLs, adverse events, and postcolonoscopy cancers were 240, 6.44, and .54 per 1000 colonoscopies, respectively. Several indicators, including endoscopist withdrawal time (OR, 1.3; 95% CI, 1.2-1.4) and cecal intubation rate (OR, 13.9; 95% CI, 1.9-96.9), were associated with the detection of SRLs. No quality indicator was associated with the risk of adverse events. Endoscopist average withdrawal time over 6 minutes (OR, .12; 95% CI, .002-.85) and SRL detection rate over 20% (OR, .17; 95% CI, .03-.74) were associated with a reduced risk of postcolonoscopy cancers. Single-center study. Quality assurance programs should prioritize the measurement of endoscopist average withdrawal time and adenoma (SRL) detection rate.
Commonality of drug-associated adverse events detected by 4 commonly used data mining algorithms.
Sakaeda, Toshiyuki; Kadoyama, Kaori; Minami, Keiko; Okuno, Yasushi
2014-01-01
Data mining algorithms have been developed for the quantitative detection of drug-associated adverse events (signals) from a large database on spontaneously reported adverse events. In the present study, the commonality of signals detected by 4 commonly used data mining algorithms was examined. A total of 2,231,029 reports were retrieved from the public release of the US Food and Drug Administration Adverse Event Reporting System database between 2004 and 2009. The deletion of duplicated submissions and revision of arbitrary drug names resulted in a reduction in the number of reports to 1,644,220. Associations with adverse events were analyzed for 16 unrelated drugs, using the proportional reporting ratio (PRR), reporting odds ratio (ROR), information component (IC), and empirical Bayes geometric mean (EBGM). All EBGM-based signals were included in the PRR-based signals as well as IC- or ROR-based ones, and PRR- and IC-based signals were included in ROR-based ones. The PRR scores of PRR-based signals were significantly larger for 15 of 16 drugs when adverse events were also detected as signals by the EBGM method, as were the IC scores of IC-based signals for all drugs; however, no such effect was observed in the ROR scores of ROR-based signals. The EBGM method was the most conservative among the 4 methods examined, which suggested its better suitability for pharmacoepidemiological studies. Further examinations should be performed on the reproducibility of clinical observations, especially for EBGM-based signals.
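Three of the four measures compared here have closed forms over a 2x2 contingency table; the EBGM shrinkage step is more involved and is omitted below. A sketch with made-up counts (a = reports with both drug and event, b = drug without the event, c = event without the drug, d = neither):

```python
import numpy as np

def disproportionality(a, b, c, d):
    n = a + b + c + d
    prr = (a / (a + b)) / (c / (c + d))          # proportional reporting ratio
    ror = (a * d) / (b * c)                      # reporting odds ratio
    ic = np.log2(a * n / ((a + b) * (a + c)))    # information component (point est.)
    return prr, ror, ic

print(disproportionality(a=20, b=980, c=200, d=98_800))
```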
Learning rational temporal eye movement strategies.
Hoppe, David; Rothkopf, Constantin A
2016-07-19
During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
Ontology-based knowledge management for personalized adverse drug events detection.
Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue
2011-01-01
Since Adverse Drug Events (ADEs) have become a leading cause of death around the world, there is high demand for tools that help clinicians or patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, focusing on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define the ADE ontology to uniformly manage the ADE knowledge from multiple sources. We take advantage of the rich semantics of the terminology SNOMED-CT and apply it to ADE detection via semantic query and reasoning.
Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina
Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo
2013-01-01
Background To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology To detect localized shigellosis outbreaks timely, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities. Conclusions/Significance The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586
Negated bio-events: analysis and identification
2013-01-01
Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The resulting systems will be able to extract bio-events with attached polarities from textual documents, which can serve as the foundation for more elaborate systems that are able to detect mutually contradicting bio-events. PMID:23323936
Automatic Near-Real-Time Detection of CMEs in Mauna Loa K-Cor Coronagraph Images
NASA Astrophysics Data System (ADS)
Thompson, W. T.; St. Cyr, O. C.; Burkepile, J. T.; Posner, A.
2017-10-01
A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with speed estimates, in near-real time using linearly polarized white-light solar coronal images from the Mauna Loa Solar Observatory K-Cor telescope. Ground observations in the low corona can warn of CMEs well before they appear in space coronagraphs. The algorithm used is a variation on the Solar Eruptive Event Detection System developed at George Mason University. It was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days identified as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% when solar activity was high down to as low as 20-30% when activity was low. The difference in effectiveness with solar cycle is attributed to the relative prevalence of strong CMEs between active and quiet periods. There were also 12 false detections, leading to an average false detection rate of 8.6%. The K-Cor data were also compared with major solar energetic particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft when K-Cor was observing during the relevant time period. The algorithm successfully generated alerts for two of these events, with lead times of 1-3 h before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width in position angle.
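The running-difference idea behind this SEEDS-style detection can be made concrete with a toy sketch on polar-unwrapped coronagraph frames (axis conventions, thresholds, and scales below are assumptions for illustration, not K-Cor or SEEDS specifics):

```python
import numpy as np

def leading_edge(frames, thresh=3.0):
    """frames: (time, radius, position_angle). For each running-difference
    frame, return the largest radius index with pixels brightening above
    `thresh` sigma, or -1 if none (no candidate CME front)."""
    diff = np.diff(frames, axis=0)
    sigma = diff.std(axis=(1, 2), keepdims=True)
    edges = []
    for d, s in zip(diff, sigma):
        rows = np.where((d > thresh * s).any(axis=1))[0]
        edges.append(rows.max() if rows.size else -1)
    return np.array(edges)

def speed_estimate(edges, km_per_pixel, dt):
    """Linear fit of front height (km) vs. time (s) over detected frames."""
    t = np.where(edges >= 0)[0]
    if t.size < 2:
        return None
    return np.polyfit(t * dt, edges[t] * km_per_pixel, 1)[0]  # km/s
```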
High-speed event detector for embedded nanopore bio-systems.
Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie
2015-08-01
Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
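A standard building block for this kind of discrete-event extraction is a two-threshold (hysteresis) detector on a lightly smoothed current trace, which resists the high-frequency noise the abstract mentions. A minimal sketch (thresholds and filter length are illustrative, not the paper's method):

```python
import numpy as np

def detect_events(current, baseline, enter_frac=0.8, exit_frac=0.9, smooth=5):
    """Flag blockade events: start when the smoothed current drops below
    enter_frac * baseline, end when it recovers above exit_frac * baseline."""
    x = np.convolve(current, np.ones(smooth) / smooth, mode="same")
    events, start = [], None
    for i, v in enumerate(x):
        if start is None and v < enter_frac * baseline:
            start = i
        elif start is not None and v > exit_frac * baseline:
            events.append((start, i))          # (onset, end) sample indices
            start = None
    return events
```

The gap between the entry and exit thresholds prevents noise near a single threshold from fragmenting one blockade into many spurious events.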
Bowden, Vanessa K; Loft, Shayne
2016-06-01
In 2 experiments we examined the impact of memory for prior events on conflict detection in simulated air traffic control under conditions where individuals proactively controlled aircraft and completed concurrent tasks. Individuals were faster to detect conflicts that had repeatedly been presented during training (positive transfer). Bayesian statistics indicated strong evidence for the null hypothesis that conflict detection was not impaired for events that resembled an aircraft pair that had repeatedly come close to conflicting during training. This is likely because aircraft altitude (the feature manipulated between training and test) was attended to by participants when proactively controlling aircraft. In contrast, a minor change to the relative position of a repeated nonconflicting aircraft pair moderately impaired conflict detection (negative transfer). There was strong evidence for the null hypothesis that positive transfer was not impacted by dividing participant attention, which suggests that part of the information retrieved regarding prior aircraft events was perceptual (the new aircraft pair "looked" like a conflict based on familiarity). These findings extend the effects previously reported by Loft, Humphreys, and Neal (2004), answering the recent strong and unanimous calls across the psychological science discipline to formally establish the robustness and generality of previously published effects.
Rapid Landslide Mapping by Means of Post-Event Polarimetric SAR Imagery
NASA Astrophysics Data System (ADS)
Plank, Simon; Martinis, Sandro; Twele, Andre
2016-08-01
Rapid mapping of landslides, quickly providing information about the extent of the affected area and the type and grade of damage, is crucial to enable fast crisis response. A review of the literature shows that most synthetic aperture radar (SAR) data-based landslide mapping procedures use change detection techniques. However, the required very high resolution (VHR) pre-event SAR imagery, acquired shortly before the landslide event, is commonly not available. Due to limitations in onboard disk space and downlink transmission rates, modern VHR SAR missions do not systematically cover the entire world. We present a fast and robust procedure for mapping of landslides, based on change detection between freely available, systematically acquired pre-event optical data and post-event polarimetric SAR data.
Day-time identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2013-10-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather that has significant impact on the environment, property, and the population. A new method, the Hail Detection Tool (HDT), is described for identifying hail-bearing storms using multi-spectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the Convective Mask (CM) algorithm, devised for the detection of deep convection, and the second is a Hail Detection (HD) algorithm for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HD are based on logistic regression models trained with multi-spectral MSG data sets comprising summer convective events in the middle Ebro Valley between 2006 and 2010, detected by the RGB visualization technique (CM) or the C-band weather radar system of the University of León (HD). By means of the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HD is computed by exploiting a proper selection of MSG wavelengths or their combinations. A number of cloud physical properties (liquid water path, optical thickness, and effective cloud drop radius) were used to physically interpret the results of the statistical models from a meteorological perspective, using a method based on these "ingredients." Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall Probability of Detection (POD) was 76.9% and the False Alarm Ratio 16.7%.
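The logistic-regression stage and the POD/FAR evaluation can be reproduced in miniature; the features and data below are synthetic stand-ins for MSG channel values and their combinations, not the HDT training set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))            # e.g., brightness temperatures / differences
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.5, 500) > 0.8).astype(int)

model = LogisticRegression().fit(X, y)   # P(hail | channels)
pred = model.predict(X)                  # (evaluated on training data; toy only)

hits = np.sum((pred == 1) & (y == 1))
misses = np.sum((pred == 0) & (y == 1))
false_alarms = np.sum((pred == 1) & (y == 0))
pod = hits / (hits + misses)                  # Probability of Detection
far = false_alarms / (hits + false_alarms)    # False Alarm Ratio
```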
An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.
Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K
2007-08-01
To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution (EBGM05)) were generated for product-event pairs. Overall (1993-2004 data, EBGM05 ≥ 2, individual terms) results of signal detection using DA compared to standard methods were sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
Context-aware event detection smartphone application for first responders
NASA Astrophysics Data System (ADS)
Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.
2013-05-01
The rise of social networking platforms like Twitter and Facebook has enabled seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information about events happening in their immediate vicinity in real time. The human-centric data generated in this "human-as-sensor" approach is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing in traditional machine-centric sensors, besides a high redundancy factor (the same data arriving through multiple users). When appropriately employed, this real-time data can support the detection of localized events like fires, accidents and shootings as they unfold, and pinpoint the individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective and timely decisions to protect property and life. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that provide an augmented-reality view of the detected events in a given geographical location and an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that addresses challenges such as identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. The smartphone application, with its backend data processing workflow, has been successfully field tested with live user-generated feeds.
TE-Tracker: systematic identification of transposition events through whole-genome resequencing.
Gilly, Arthur; Etcheverry, Mathilde; Madoui, Mohammed-Amin; Guy, Julie; Quadrana, Leandro; Alberti, Adriana; Martin, Antoine; Heitkam, Tony; Engelen, Stefan; Labadie, Karine; Le Pen, Jeremie; Wincker, Patrick; Colot, Vincent; Aury, Jean-Marc
2014-11-19
Transposable elements (TEs) are DNA sequences that are able to move from their location in the genome by cutting or copying themselves to another locus. As such, they are increasingly recognized as impacting all aspects of genome function. With the dramatic reduction in the cost of DNA sequencing, it is now possible to resequence whole genomes in order to systematically characterize novel TE mobilization in a particular individual. However, this task is made difficult by the inherently repetitive nature of TE sequences, which in some eukaryotes compose over half of the genome sequence. Currently, only a few software tools dedicated to the detection of TE mobilization using next-generation sequencing are described in the literature. They often target specific TEs for which annotation is available, and are only able to identify families of closely related TEs, rather than individual elements. We present TE-Tracker, a general and accurate computational method for the de novo detection of germline TE mobilization from resequenced genomes, as well as the identification of both their source and destination sequences. We compare our method with the two classes of existing software: specialized TE-detection tools and generic structural variant (SV) detection tools. We show that TE-Tracker, while working independently of any prior annotation, bridges the gap between these two approaches in terms of detection power. Indeed, its positive predictive value (PPV) is comparable to that of dedicated TE software while its sensitivity is typical of a generic SV detection tool. TE-Tracker demonstrates the benefit of adopting an annotation-independent, de novo approach for the detection of TE mobilization events. We use TE-Tracker to provide a comprehensive view of transposition events induced by loss of DNA methylation in Arabidopsis. TE-Tracker is freely available at http://www.genoscope.cns.fr/TE-Tracker. We show that TE-Tracker accurately detects both the source and destination of novel transposition events in resequenced genomes. Moreover, TE-Tracker is able to detect all potential donor sequences for a given insertion, and can identify the correct one among them. Furthermore, TE-Tracker produces significantly fewer false positives than common SV detection programs, thus greatly facilitating the detection and analysis of TE mobilization events.
Thermal wake/vessel detection technique
Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM
2012-01-10
A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
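A loose sketch of the mask-cluster-shape pipeline described in this abstract, assuming a 2-D array of per-pixel thermal values; the window size, anomaly threshold and elongation test are illustrative choices, not the patented values.

```python
# Thermal anomaly mask -> contiguous clusters -> shape screening.
import numpy as np
from scipy import ndimage

def detect_wake_clusters(thermal, window=15, k=2.0, min_pixels=10):
    local_mean = ndimage.uniform_filter(thermal, size=window)
    local_sq = ndimage.uniform_filter(thermal**2, size=window)
    local_std = np.sqrt(np.maximum(local_sq - local_mean**2, 1e-12))
    mask = thermal > local_mean + k * local_std      # thermal anomaly mask
    labels, n = ndimage.label(mask)                  # group contiguous flagged pixels
    clusters = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(ys) < min_pixels:
            continue
        h = ys.max() - ys.min() + 1
        w = xs.max() - xs.min() + 1
        # crude shape test: wakes are elongated, so keep high aspect ratios
        if max(h, w) / min(h, w) > 3:
            clusters.append((ys.mean(), xs.mean()))
    return clusters

rng = np.random.default_rng(0)
img = rng.normal(285.0, 0.2, (100, 100))   # synthetic sea-surface temperatures (K)
img[40:42, 10:60] += 3.0                   # warm, elongated wake-like streak
print(detect_wake_clusters(img))           # one elongated cluster near row 40
```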
Impact Detection for Characterization of Complex Multiphase Flows
NASA Astrophysics Data System (ADS)
Chan, Wai Hong Ronald; Urzay, Javier; Mani, Ali; Moin, Parviz
2016-11-01
Multiphase flows often involve a wide range of impact events, such as liquid droplets impinging on a liquid pool or gas bubbles coalescing in a liquid medium. These events contribute to a myriad of large-scale phenomena, including breaking waves on ocean surfaces. As impacts between surfaces necessarily occur at isolated points, numerical simulations of impact events would require the resolution of molecular scales near the impact points for accurate modeling. This can be prohibitively expensive unless subgrid impact and breakup models are formulated to capture the effects of the interactions. The first step in a large-eddy simulation (LES) based computational methodology for complex multiphase flows like air-sea interactions requires effective detection of these impact events. The starting point of this work is a collision detection algorithm for structured grids on a coupled level set / volume of fluid (CLSVOF) solver, adapted from an earlier algorithm for cloth animation that triangulates the interface with the marching cubes method. We explore the extension of collision detection to a geometric VOF solver and to unstructured grids. Supported by the Office of Naval Research (USA) and A*STAR (Agency for Science, Technology and Research, Singapore).
Homaeinezhad, M R; Erfanianmoshiri-Nejad, M; Naseri, H
2014-01-01
The goal of this study is to introduce a simple, standard and safe procedure to detect and delineate the P and T waves of the electrocardiogram (ECG) signal in real conditions. The proposed method consists of four major steps: (1) a secure QRS detection and delineation algorithm, (2) a pattern recognition algorithm designed to distinguish the various ECG clusters which take place between consecutive R-waves, (3) extraction of a template of the dominant events of each cluster waveform and (4) application of correlation analysis in order to automatically delineate the P- and T-waves in noisy conditions. The performance characteristics of the proposed P and T detection-delineation algorithm are evaluated against various ECG signals whose quality is altered from the best to the worst case based on random-walk noise theory. The method is also applied to the MIT-BIH Arrhythmia and QT databases to compare parts of its performance characteristics with a number of P and T detection-delineation algorithms. The conducted evaluations indicate that in a signal with a low quality value of about 0.6, the proposed method detects the P and T events with a sensitivity of Se=85% and a positive predictive value of P+=89%. In addition, at the same quality, the average delineation errors associated with those ECG events are 45 and 63 ms, respectively. Stable delineation error, high detection accuracy and high noise tolerance were the most important aspects considered during development of the proposed method. © 2013 Elsevier Ltd. All rights reserved.
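The sketch below illustrates only step (4): locating a wave by sliding normalized cross-correlation of an event template over an inter-R segment. The signal, template and sampling rate are synthetic placeholders, not the paper's data.

```python
# Template matching by sliding normalized cross-correlation.
import numpy as np

def correlate_template(segment, template):
    m = len(template)
    tz = (template - template.mean()) / (template.std() + 1e-12)
    best_lag, best_r = 0, -np.inf
    for lag in range(len(segment) - m + 1):
        w = segment[lag:lag + m]
        r = np.dot((w - w.mean()) / (w.std() + 1e-12), tz) / m
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r  # onset sample of best match, correlation value

fs = 250
t = np.arange(0, 0.2, 1 / fs)
p_template = np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))   # Gaussian "P wave"
segment = np.concatenate([np.zeros(50), p_template, np.zeros(50)])
print(correlate_template(segment, p_template))             # ~ (50, 1.0)
```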
NASA Astrophysics Data System (ADS)
Smith, T.; Arce, A. C.; Ji, C.
2016-12-01
The waveform cross-correlation technique is widely used to improve the detection of small-magnitude events induced by hydraulic fracturing. However, when events are detected, assigning a reliable magnitude is a challenging task, especially considering their small signal amplitude and the high background noise during injections. In this study, we adopt the Match & Locate algorithm (M&L, Zhang and Wen, 2015) to analyze seven hours of continuous seismic observations from a hydraulic fracturing experiment in Central California. The site of the stimulated region is only 300-400 m away from a 16-receiver vertical-borehole array which spans 230 m. The sampling rate is 4000 Hz. Both the injection sites and the borehole array are more than 1.7 km below the surface. This dataset has previously been studied by an industry group, producing a catalog of 1134 events with moment magnitudes (Mw) ranging from -3.1 to -0.9. In this study, we select 202 events from this catalog with high signal-to-noise ratios to use as templates. Our M&L analysis produces a new catalog that contains 2119 events, 10 times the number of templates and about twice the size of the original catalog. Using these two catalogs, we investigate the relationship between the moment magnitude difference (ΔMW) and the local magnitude difference (ΔML) of each detected event and its corresponding template event. ΔML is computed using the peak amplitude ratio between the detected and template event for each channel. Our analysis yields an empirical relationship of ΔMW = 0.64-0.65 ΔML with an R² of 0.99. This coefficient near 2/3 suggests that the information on the event's corner frequency is entirely lost (Hanks and Boore, 1984). The cause might not be unique: it implies either that Earth's attenuation at this depth range (>1.7 km) is significant, or that the 4000 Hz sampling rate is not sufficient. This relationship is crucial for estimating the b-value of the microseismicity induced by hydraulic fracturing experiments. The analysis using the M&L catalog with the relation ΔMW = 2/3 ΔML results in a normal b-value of 1.1, whereas a smaller b-value of 0.88 is obtained if ΔMW = ΔML.
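As a numerical illustration of the reported relation (slope near 2/3), the sketch below computes per-event ΔML from log10 peak-amplitude ratios and recovers the slope by least squares on synthetic data; nothing here reproduces the study's actual measurements.

```python
# Relative local magnitude from amplitude ratios, and slope recovery.
import numpy as np

def delta_ml(detected_peaks, template_peaks):
    """Per-event dML: median over channels of log10 peak-amplitude ratios."""
    return np.median(np.log10(np.asarray(detected_peaks) /
                              np.asarray(template_peaks)))

print(delta_ml([2.0, 1.8, 2.2], [1.0, 1.0, 1.0]))       # ~0.3 magnitude units

rng = np.random.default_rng(1)
dml = rng.uniform(-1.5, 0.5, size=200)                  # synthetic dML values
dmw = 0.65 * dml + rng.normal(scale=0.02, size=200)     # slope ~2/3, per the abstract
slope = np.dot(dml, dmw) / np.dot(dml, dml)             # least-squares slope through origin
print(round(slope, 2))                                  # ~0.65
```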
National Earthquake Information Center Seismic Event Detections on Multiple Scales
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require using both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local scale), a large aftershock sequence (regional scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early warning, Seism. Res. Lett., 83, 531-540, doi: 10.1785/gssrl.83.3.531.
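For flavor, here is a toy kurtosis-style picker in the spirit of the cited Baillard et al. (2014) approach, not NEIC's production code: the pick is placed near the steepest rise of a sliding-window kurtosis. The window length and test signal are arbitrary.

```python
# Toy kurtosis picker: onset near the steepest rise of windowed kurtosis.
import numpy as np
from scipy.stats import kurtosis

def kurtosis_pick(trace, win=100):
    k = np.array([kurtosis(trace[i - win:i]) for i in range(win, len(trace))])
    return win + int(np.argmax(np.diff(k)))   # sample of steepest kurtosis rise

rng = np.random.default_rng(0)
noise = rng.normal(scale=1.0, size=1000)
trace = np.concatenate([noise, rng.normal(scale=8.0, size=300)])  # onset at 1000
print(kurtosis_pick(trace))                   # pick lands near sample 1000
```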
Observation of Long Ionospheric Recoveries from Lightning-induced Electron Precipitation Events
NASA Astrophysics Data System (ADS)
Mohammadpour Salut, M.; Cohen, M.
2015-12-01
Lightning strokes induce lower-ionospheric nighttime disturbances which can be detected through Very Low Frequency (VLF) remote sensing via at least two means: (1) direct heating and ionization, known as an Early event, and (2) triggered precipitation of energetic electrons from the radiation belts, known as Lightning-induced Electron Precipitation (LEP). For each, the ionospheric recovery time is typically a few minutes or less. A small class of Early events has been identified as having unusually long ionospheric recoveries (tens of minutes), with the underlying mechanism still in question. Our study shows for the first time that some LEP events also exhibit unusually long recoveries. The VLF events were detected by visual inspection of the recorded data in both the North-South and East-West magnetic fields. Data from the National Lightning Detection Network (NLDN) are used to determine the location and peak current of the lightning responsible for each lightning-associated VLF perturbation. LEP and Early VLF events are distinguished by measuring the time delay between the causative lightning discharges and the onset of all lightning-associated perturbations. LEP events typically possess an onset delay greater than ~200 ms following the causative lightning discharge, while the onset of Early VLF events is time-aligned (<20 ms) with the lightning return stroke. Nonducted LEP events are distinguished from ducted events based on the location of the causative lightning relative to the precipitation region. From 15 March to 20 April and 15 October to 15 November 2011, a total of 385 LEP events were observed at Indiana, Montana, Colorado and Oklahoma VLF sites on the NAA, NLK and NML transmitter signals. Forty-six of these events exhibited a long recovery. The occurrence rate of ducted long-recovery LEP events was found to be higher than that of nonducted events: of the 46 long-recovery LEP events, 33 were induced by ducted whistlers, and 13 were associated with nonducted obliquely propagating whistler waves. The occurrence of high peak current lightning strokes is a prerequisite for long-recovery LEP events.
A novel adaptive, real-time algorithm to detect gait events from wearable sensors.
Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona
2015-05-01
A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as the minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm's robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait training and/or assistive devices.
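A minimal sketch of the event definitions given in this abstract — IC at minima of the shank flexion/extension angle, EC and MS at minima/maxima of the angular velocity — on synthetic signals; the published calibration and step-by-step update stages are omitted.

```python
# Gait events as extrema of angle and angular velocity.
import numpy as np
from scipy.signal import find_peaks

fs = 100
t = np.arange(0, 10, 1 / fs)
angle = np.sin(2 * np.pi * 1.0 * t)        # stand-in flexion/extension angle
gyro = np.gradient(angle, 1 / fs)          # stand-in angular velocity

ic, _ = find_peaks(-angle, distance=fs // 2)   # Initial Contact: angle minima
ec, _ = find_peaks(-gyro, distance=fs // 2)    # End Contact: velocity minima
ms, _ = find_peaks(gyro, distance=fs // 2)     # Mid-Swing: velocity maxima
print(ic[:3], ec[:3], ms[:3])
```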
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection Method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by detection strategy (screen-, gene-, construct-, and event-specific), and provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized, at the highest level, into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
Detection Thresholds of Falling Snow From Satellite-Borne Active and Passive Sensors
NASA Technical Reports Server (NTRS)
Skofronick-Jackson, Gail M.; Johnson, Benjamin T.; Munchak, S. Joseph
2013-01-01
There is increased interest in detecting and estimating the amount of falling snow reaching the Earth's surface in order to fully capture the global atmospheric water cycle. An initial step toward global spaceborne falling-snow algorithms for current and future missions includes determining the thresholds of detection for various active and passive sensor channel configurations and falling-snow events over land surfaces and lakes. In this paper, cloud resolving model simulations of lake-effect and synoptic snow events were used to determine the minimum amount of snow (threshold) that could be detected by the following instruments: the W-band radar of CloudSat, the Global Precipitation Measurement (GPM) Dual-Frequency Precipitation Radar (DPR) Ku- and Ka-bands, and the GPM Microwave Imager. Eleven different nonspherical snowflake shapes were used in the analysis. Notable results include the following: 1) the W-band radar has detection thresholds more than an order of magnitude lower than the future GPM radars; 2) the cloud structure macrophysics influences the thresholds of detection for passive channels (e.g., snow events with larger ice water paths and thicker clouds are easier to detect); 3) the snowflake microphysics (mainly shape and density) plays a large role in the detection threshold for active and passive instruments; 4) with reasonable assumptions, the passive 166-GHz channel has detection threshold values comparable to those of the GPM DPR Ku- and Ka-band radars, with approximately 0.05 g m^-3 detected at the surface, or an approximately 0.5-1.0 mm h^-1 melted snow rate. This paper provides information on the light snowfall events missed by the sensors and not captured in global estimates.
NASA Astrophysics Data System (ADS)
Ryan, E. M.; Brucker, L.; Forman, B. A.
2015-12-01
During the winter months, the occurrence of rain-on-snow (ROS) events can impact snow stratigraphy via the generation of large-scale ice crusts on or within the snowpack. The formation of such layers significantly alters the electromagnetic response of the snowpack, which can be witnessed using space-based microwave radiometers. In addition, ROS layers can hinder the ability of wildlife to burrow through the snow for vegetation, which limits their foraging capability. A prime example occurred on 23 October 2003 on Banks Island, Canada, where an ROS event is believed to have caused the deaths of over 20,000 musk oxen. Through the use of passive microwave remote sensing, ROS events can be detected by utilizing observed brightness temperatures (Tb) from AMSR-E. Tb observed at different microwave frequencies and polarizations depends on snow properties. A wet snowpack formed from an ROS event yields a larger Tb than a typical dry snowpack would, which makes observed Tb useful for detecting ROS events. Using data retrieved from AMSR-E, in conjunction with observations from ground-based weather station networks, a database of estimated ROS events over the past twelve years was generated. Using this database, changes in measured Tb following the ROS events were also observed. This study adds to the growing knowledge of ROS events and has the potential to help inform passive microwave snow water equivalent (SWE) retrievals or snow cover properties in polar regions.
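Purely as an illustration of the stated physics (a wet snowpack raising observed Tb), the sketch below flags days whose Tb exceeds a trailing dry baseline; the window length and 15 K threshold are invented, not the study's criteria.

```python
# Flag anomalous Tb rises against a trailing baseline (illustrative only).
import numpy as np

def flag_ros_days(tb, window=10, threshold=15.0):
    """Return day indices where Tb exceeds the trailing mean by `threshold` K."""
    flags = []
    for i in range(window, len(tb)):
        baseline = np.mean(tb[i - window:i])
        if tb[i] - baseline > threshold:
            flags.append(i)
    return flags

tb = np.full(60, 230.0)   # synthetic dry-snow Tb series (K)
tb[40] = 252.0            # abrupt rise, e.g. a rain-on-snow day
print(flag_ros_days(tb))  # [40]
```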
ATLAS EventIndex general dataflow and monitoring infrastructure
NASA Astrophysics Data System (ADS)
Fernández Casaní, Á.; Barberis, D.; Favareto, A.; García Montoro, C.; González de la Hoz, S.; Hřivnáč, J.; Prokoshin, F.; Salt, J.; Sánchez, J.; Többicke, R.; Yuan, R.; ATLAS Collaboration
2017-10-01
The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure at CERN. A subset of this information is copied to an Oracle relational database for fast dataset discovery, event picking, cross-checks with other ATLAS systems and checks for event duplication. The system design and its optimization serve event picking from requests of a few events up to scales of tens of thousands of events, and in addition, data consistency checks are performed for large production campaigns. Detecting duplicate events within the scope of physics collections has recently arisen as an important use case. This paper describes the general architecture of the project and the data flow and operation issues, which are addressed by recent developments to improve the throughput of the overall system. In this direction, the data collection system is reducing its usage of the messaging infrastructure to overcome the performance shortcomings detected during production peaks; an object storage approach is used instead to convey the event index information, with messages signaling their location and status. Recent changes in the Producer/Consumer architecture are also presented in detail, as well as the monitoring infrastructure.
High speed point derivative microseismic detector
Uhl, J.E.; Warpinski, N.R.; Whetten, E.B.
1998-06-30
A high speed microseismic event detector constructed in accordance with the present invention uses a point derivative comb to quickly and accurately detect microseismic events. Compressional and shear waves impinging upon microseismic receiver stations disposed to collect waves are converted into digital data and analyzed using a point derivative comb including assurance of quiet periods prior to declaration of microseismic events. If a sufficient number of quiet periods have passed, the square of a two point derivative of the incoming digital signal is compared to a trip level threshold exceeding the determined noise level to declare a valid trial event. The squaring of the derivative emphasizes the differences between noise and signal, and the valid event is preferably declared when the trip threshold has been exceeded over a temporal comb width to realize a comb over a given time period. Once a trial event has been declared, the event is verified through a spatial comb, which applies the temporal event comb to additional stations. The detector according to the present invention quickly and accurately detects initial compressional waves indicative of a microseismic event which typically exceed the ambient cultural noise level by a small amount, and distinguishes the waves from subsequent larger amplitude shear waves. 9 figs.
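The patent's detector logic can be sketched roughly as follows: square a two-point derivative, require a run of quiet samples, then declare a trial event when the exceedance persists across a temporal comb. Every constant below is illustrative rather than the patented value, and the comb is simplified to a majority test over a short window.

```python
# Squared two-point derivative with quiet-period and comb checks.
import numpy as np

def point_derivative_detect(x, trip, quiet_len=200, comb_len=20):
    d2 = np.diff(x) ** 2                      # squared two-point derivative
    quiet = 0
    for i in range(len(d2) - comb_len):
        if d2[i] < trip:
            quiet += 1
            continue
        # trip level exceeded: accept only after enough quiet samples, and
        # only if the exceedance persists over most of the comb width
        if quiet >= quiet_len and np.mean(d2[i:i + comb_len] > trip) >= 0.5:
            return i                          # sample index of trial event
        quiet = 0
    return None

rng = np.random.default_rng(2)
trace = rng.normal(scale=0.1, size=5000)
trace[3000:3100] += 2 * np.sin(np.arange(100))    # emulated impulsive arrival
print(point_derivative_detect(trace, trip=0.5))   # ~3000
```

The squaring step mirrors the patent's rationale: it widens the gap between noise-level and signal-level derivatives before thresholding.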
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
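The shallow half of this system can be sketched as bag-of-words features from a ±3-token window around the trigger word fed to a maximum-entropy (logistic regression) classifier; the sentences, labels and trigger indices below are invented placeholders, and the deep-parser features are not reproduced.

```python
# Windowed bag-of-words features + maximum-entropy classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

def window_around(tokens, trigger_idx, width=3):
    lo, hi = max(0, trigger_idx - width), trigger_idx + width + 1
    return " ".join(tokens[lo:hi])

examples = [
    ("analysis of IkappaBalpha phosphorylation".split(), 3, "speculated"),
    ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negated"),
    ("we observed IkappaBalpha phosphorylation".split(), 3, "asserted"),
]
texts = [window_around(toks, i) for toks, i, _ in examples]
labels = [lab for _, _, lab in examples]

vec = CountVectorizer().fit(texts)
clf = LogisticRegression(max_iter=1000).fit(vec.transform(texts), labels)
query = window_around("blocking of XYZ phosphorylation".split(), 3)
print(clf.predict(vec.transform([query])))
```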
Integration of launch/impact discrimination algorithm with the UTAMS platform
NASA Astrophysics Data System (ADS)
Desai, Sachi; Morcos, Amir; Tenney, Stephen; Mays, Brian
2008-04-01
An acoustic array, integrated with an algorithm to discriminate potential Launch (LA) or Impact (IM) events, was augmented by employing the Launch Impact Discrimination (LID) algorithm for mortar events. We developed an added situational-awareness capability to determine whether a localized event is a mortar launch or a mortar impact at safe standoff distances. The algorithm utilizes a discrete wavelet transform to exploit higher harmonic components of various sub-bands of the acoustic signature. Additional features are extracted in the frequency domain, exploiting harmonic components generated by the nature of the event, e.g., supersonic shrapnel components at impact. These features are then fed to a neural network to provide a high level of confidence for discrimination and classification. The ability to discriminate between these events is of great interest on the battlefield, providing more information and contributing to a common picture of situational awareness. The algorithms exploit the acoustic sensor array to provide detection and identification of IM/LA events at extended ranges. The integration of this algorithm with the acoustic sensor array for mortar detection provides an early-warning detection system, giving field commanders greater battlefield information. This paper describes the integration of the algorithm with a candidate sensor and the resulting field tests.
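A sketch of the wavelet feature step described above: decompose an acoustic record into sub-bands with a discrete wavelet transform (via the PyWavelets package) and use per-band energies as classifier features. The wavelet family, decomposition depth and test signals are assumptions, not the fielded system's settings.

```python
# Sub-band energies from a discrete wavelet transform as features.
import numpy as np
import pywt

def subband_energies(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs]  # approx + detail energies

rng = np.random.default_rng(0)
launch_like = np.sin(2 * np.pi * 30 * np.arange(0, 1, 1 / 4096))  # tonal
impact_like = rng.normal(size=4096)  # broadband, shrapnel-like content
print(subband_energies(launch_like))
print(subband_energies(impact_like))
```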
Possible Evidence for an Event Horizon in Cyg XR-1
NASA Technical Reports Server (NTRS)
Dolan, Joseph F.; Fisher, Richard R. (Technical Monitor)
2001-01-01
The X-ray emitting component in the Cyg XR-1/HDE226868 system is a leading candidate for identification as a stellar-mass black hole. The positive identification of a black hole as predicted by general relativity requires the detection of an event horizon surrounding the point singularity. One signature of such an event horizon would be the existence of dying pulse trains emitted by material spiraling into the event horizon from the last stable orbit around the black hole. We observed the Cyg XR-1 system at three different epochs in a 1400-3000 Å bandpass with 0.1 ms time resolution using the Hubble Space Telescope's High Speed Photometer. Repeated excursions of the detected flux by more than three standard deviations above the mean are present in the UV flux, with FWHM of 1-10 ms. If any of these excursions are pulses of radiation produced in the system (and not just stochastic variability associated with the Poisson distribution of detected photon arrival times), then this short a timescale requires that the pulses originate in the accretion disk around Cyg XR-1. Two series of pulses with characteristics similar to those expected from dying pulse trains were detected in three hours of observation.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation for detecting glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and underwent reliable visual field tests every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity; event analysis showed 83% sensitivity, and trend analysis had 66% sensitivity.
Video-tracker trajectory analysis: who meets whom, when and where
NASA Astrophysics Data System (ADS)
Jäger, U.; Willersinn, D.
2010-04-01
Unveiling unusual or hostile events by observing manifold moving persons in a crowd is a challenging task for human operators, especially when sitting in front of monitor walls for hours. Typically, hostile events are rare. Thus, due to tiredness and negligence, the operator may miss important events. In such situations, an automatic alarming system is able to support the human operator. The system incorporates a processing chain consisting of (1) people tracking, (2) event detection, (3) data retrieval, and (4) display of the relevant video sequence overlaid by highlighted regions of interest. In this paper we focus on the event detection stage of the processing chain mentioned above. In our case, the selected event of interest is the encounter of people. Although based on a rather simple trajectory analysis, this kind of event embodies great practical importance because it paves the way to answering the question "who meets whom, when and where". This, in turn, forms the basis for detecting potential situations where, e.g., money, weapons, or drugs are handed over from one person to another in crowded environments such as railway stations, airports, or busy streets and squares. The input to the trajectory analysis comes from a multi-object video-based tracking system developed at IOSB which is able to track multiple individuals within a crowd in real time [1]. From this we calculate the inter-distances between all persons on a frame-to-frame basis. We use a sequence of simple rules based on the individuals' kinematics to detect the event mentioned above and to output the frame number, the persons' IDs from the tracker and the pixel coordinates of the meeting position. Using this information, a data retrieval system may extract the corresponding part of the recorded video image sequence and finally allows for replaying the selected video clip with a highlighted region of interest to attract the operator's attention for further visual inspection.
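The rule-based encounter test can be sketched as follows: two track IDs "meet" when their inter-distance stays below a radius for a minimum number of frames while both move slowly. The thresholds and synthetic tracks are illustrative, not IOSB's actual rules.

```python
# Pairwise inter-distance rules for "who meets whom, when and where".
import numpy as np

def detect_meetings(tracks, radius=1.5, min_frames=10, max_speed=0.3):
    """tracks: dict id -> (T, 2) array of per-frame positions (same T)."""
    ids = sorted(tracks)
    events = []
    for a in range(len(ids)):
        for b in range(a + 1, len(ids)):
            pa, pb = tracks[ids[a]], tracks[ids[b]]
            close = np.linalg.norm(pa - pb, axis=1) < radius
            slow = (np.linalg.norm(np.diff(pa, axis=0), axis=1) < max_speed) & \
                   (np.linalg.norm(np.diff(pb, axis=0), axis=1) < max_speed)
            ok = close[1:] & slow
            run = 0
            for f, flag in enumerate(ok, start=1):
                run = run + 1 if flag else 0
                if run == min_frames:
                    events.append((ids[a], ids[b], f, tuple(pa[f])))
                    break
    return events  # (id1, id2, frame, position) per encounter

stand = np.zeros((100, 2))                     # person standing still
walker = np.linspace([10, 0], [0, 0], 100)     # person walking toward them
print(detect_meetings({"A": stand, "B": walker}))
```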
Automatic near-real-time detection of CMEs in Mauna Loa K-Cor coronagraph images
NASA Astrophysics Data System (ADS)
Thompson, William T.; St. Cyr, Orville Chris; Burkepile, Joan; Posner, Arik
2017-08-01
A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with an estimate of their speed, in near-real-time using images of the linearly polarized white-light solar corona taken by the K-Cor telescope at the Mauna Loa Solar Observatory (MLSO). The algorithm used is a variation on the Solar Eruptive Event Detection System (SEEDS) developed at George Mason University. The algorithm was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days that the MLSO website marked as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% in 2014-2015, when solar activity was high, down to as low as 20-30% in 2017, when activity was low. The difference in effectiveness with solar cycle is attributed to the difference in the relative prevalence of strong CMEs between active and quiet periods. There were also twelve false detections during this time period, leading to an average false detection rate of 8.6% on any given day. However, half of the false detections were clustered into two short periods of a few days each, when special conditions prevailed that increased the false detection rate. The K-Cor data were also compared with major Solar Energetic Particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft while K-Cor was observing during the relevant time period. The K-Cor CME detection algorithm successfully generated alerts for two of these events, with lead times of 1-3 hours before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width of the CME in position angle.
NASA Astrophysics Data System (ADS)
Kenefic, L.; Morton, E.; Bilek, S.
2017-12-01
It is well known that subduction zones create the largest earthquakes in the world, like the magnitude 9.5 Chile earthquake in 1960 or the more recent magnitude 9.1 Japan earthquake in 2011, both of which are among the top five largest earthquakes ever recorded. However, off the coast of the Pacific Northwest region of the U.S., the Cascadia subduction zone (CSZ) remains relatively quiet, and modern seismic instruments have not recorded earthquakes of this size in the CSZ. The last great earthquake, of magnitude 8.7-9.2, occurred in 1700 and is constrained by written reports of the resultant tsunami in Japan and by dating of a drowned forest in the U.S. Previous studies have suggested the margin is most likely segmented along strike; however, variations in frictional conditions in the CSZ fault zone are not well known. Geodetic modeling indicates that the locked seismogenic zone is likely completely offshore, which may be too far from land seismometers to adequately detect related seismicity. Ocean bottom seismometers, as part of the Cascadia Initiative Amphibious Network, were installed directly above the inferred seismogenic zone, which we use to better detect small interplate seismicity. Using the subspace detection method, this study seeks to find new seismogenic-zone earthquakes. The subspace detection method uses multiple previously known event templates concurrently to scan through continuous seismic data. Template events that make up the subspace are chosen from events in existing catalogs that likely occurred along the plate interface. The corresponding waveforms are windowed on the nearby Cascadia Initiative ocean bottom seismometers and coastal land seismometers for scanning. Detections returned by the scan are similar to the template waveforms at or above a predefined threshold, and are then visually examined to determine whether an event is present. The presence of repeating event clusters can indicate persistent seismic patches, likely corresponding to areas of stronger coupling. This work will ultimately improve the understanding of CSZ fault zone heterogeneity. Preliminary results indicate 96 possible new events between August 2, 2013 and July 1, 2014 for four target clusters off the coast of northern Oregon.
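A compact sketch of generic subspace detection, not this study's implementation: build an orthonormal basis from normalized template waveforms via SVD and slide a projection-energy statistic over continuous data. Dimensions, threshold and test data are toy-sized.

```python
# Subspace detection: fraction of window energy captured by template basis.
import numpy as np

def subspace_detector(templates, data, rank=2, threshold=0.7):
    X = np.array([t / np.linalg.norm(t) for t in templates])   # (n_templ, m)
    U = np.linalg.svd(X.T, full_matrices=False)[0][:, :rank]   # basis (m, rank)
    m = X.shape[1]
    detections = []
    for i in range(len(data) - m):
        w = data[i:i + m]
        stat = np.sum((U.T @ w) ** 2) / (np.dot(w, w) + 1e-12)
        if stat > threshold:
            detections.append((i, float(stat)))
    return detections  # (sample, detection statistic in [0, 1])

rng = np.random.default_rng(3)
tmpl = np.sin(np.linspace(0, 12 * np.pi, 200)) * np.hanning(200)
data = rng.normal(scale=0.05, size=2000)
data[1200:1400] += 0.5 * tmpl                  # buried template-like event
print(subspace_detector([tmpl, np.roll(tmpl, 1)], data)[:1])  # near sample 1200
```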
Automatic detection of freezing of gait events in patients with Parkinson's disease.
Tripoliti, Evanthia E; Tzallas, Alexandros T; Tsipouras, Markos G; Rigas, George; Bougia, Panagiota; Leontiou, Michael; Konitsiotis, Spiros; Chondrogiorgi, Maria; Tsouli, Sofia; Fotiadis, Dimitrios I
2013-04-01
The aim of this study is to detect freezing of gait (FoG) events in patients suffering from Parkinson's disease (PD) using signals received from wearable sensors (six accelerometers and two gyroscopes) placed on the patients' bodies. For this purpose, an automated methodology has been developed which consists of four stages. In the first stage, missing values due to signal loss or degradation are replaced; then (second stage) low-frequency components of the raw signal are removed. In the third stage, the entropy of the raw signal is calculated. Finally (fourth stage), four classification algorithms were tested (Naïve Bayes, Random Forests, Decision Trees and Random Tree) in order to detect the FoG events. The methodology has been evaluated using several different configurations of sensors in order to arrive at the set of sensors which produces optimal FoG episode detection. Signals were recorded from five healthy subjects, five patients with PD who presented the symptom of FoG, and six patients with PD who did not present FoG events. The signals included 93 FoG events with a total duration of 405.6 s. The results indicate that the proposed methodology is able to detect FoG events with 81.94% sensitivity, 98.74% specificity, 96.11% accuracy and a 98.6% area under the curve (AUC) using the signals from all sensors and the Random Forests classification algorithm. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
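A skeletal reduction of the four-stage pipeline to its entropy feature (stage 3) and a Random Forests classifier (stage 4) on windowed data; the signals, labels and the second feature are synthetic placeholders.

```python
# Windowed entropy feature + Random Forests classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_entropy(x, bins=16):
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return float(-np.sum(p * np.log2(p)))   # Shannon entropy of the window

rng = np.random.default_rng(4)
walk = [rng.normal(size=128) for _ in range(50)]             # normal gait windows
fog = [rng.normal(scale=0.2, size=128) for _ in range(50)]   # low-variance FoG stand-in
X = [[window_entropy(w), float(np.std(w))] for w in walk + fog]
y = [0] * 50 + [1] * 50
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
test = rng.normal(scale=0.2, size=128)
print(clf.predict([[window_entropy(test), float(np.std(test))]]))  # [1]
```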
A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS)
Rigi, Amin
2018-01-01
In this paper, a novel approach to detecting incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The approach is validated using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, with correspondingly high noise levels that affected the results significantly. Nevertheless, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential for the sensor to be used in manipulation applications, especially in dynamic environments. PMID:29364190
Christian, Kira A; Iuliano, A Danielle; Uyeki, Timothy M; Mintz, Eric D; Nichol, Stuart T; Rollin, Pierre; Staples, J Erin; Arthur, Ray R
To better track public health events in areas where the public health system is unable or unwilling to report the event to appropriate public health authorities, agencies can conduct event-based surveillance, defined as the organized collection, monitoring, assessment, and interpretation of unstructured information regarding public health events that may represent an acute risk to public health. The US Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Operations Center (GDDOC) was created in 2007 to serve as CDC's platform dedicated to conducting worldwide event-based surveillance, which is now highlighted as part of the "detect" element of the Global Health Security Agenda (GHSA). The GHSA works toward making the world safer and more secure from disease threats through building capacity to better "Prevent, Detect, and Respond" to those threats. The GDDOC monitors approximately 30 to 40 public health events each day. In this article, we describe the top threats to public health monitored during 2012 to 2016: avian influenza, cholera, Ebola virus disease, and the vector-borne diseases yellow fever, chikungunya virus, and Zika virus, with updates to the previously described threats from Middle East respiratory syndrome coronavirus (MERS-CoV) and poliomyelitis.
Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew
2013-01-01
Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720
Multimodal Event Detection in Twitter Hashtag Networks
Yilmaz, Yasin; Hero, Alfred O.
2016-07-01
In this study, event detection in a multimodal Twitter dataset is considered. We treat the hashtags in the dataset as instances with two modes: text and geolocation features. The text feature consists of a bag-of-words representation. The geolocation feature consists of geotags (i.e., geographical coordinates) of the tweets. Fusing the multimodal data, we aim to detect, in terms of topic and geolocation, the interesting events and the associated hashtags. To this end, a generative latent variable model is assumed, and a generalized expectation-maximization (EM) algorithm is derived to learn the model parameters. The proposed method is computationally efficient and lends itself to big datasets. Lastly, experimental results on a Twitter dataset from August 2014 show the efficacy of the proposed method.
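A toy version of such a generative model — a latent event class emitting a bag of words (multinomial) and a geotag (isotropic 2-D Gaussian), learned with EM. The model details are simplified guesses for illustration, not the paper's exact formulation.

```python
# EM for a two-modality (text + geolocation) mixture model.
import numpy as np

def em_multimodal(counts, geo, k=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    n, v = counts.shape
    pi = np.full(k, 1.0 / k)                      # class priors
    theta = rng.dirichlet(np.ones(v), size=k)     # per-class word distributions
    mu = geo[rng.choice(n, k, replace=False)]     # per-class Gaussian means
    var = np.full(k, 1.0)                         # isotropic variances
    for _ in range(iters):
        # E-step: log-responsibilities combine both modalities
        log_r = np.log(pi) + counts @ np.log(theta).T
        d2 = ((geo[:, None, :] - mu[None]) ** 2).sum(-1)
        log_r += -0.5 * d2 / var - np.log(2 * np.pi * var)
        r = np.exp(log_r - log_r.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)
        # M-step: update priors, word distributions, means, variances
        pi = r.mean(0)
        theta = (r.T @ counts) + 1e-3
        theta /= theta.sum(1, keepdims=True)
        mu = (r.T @ geo) / r.sum(0)[:, None]
        d2 = ((geo[:, None, :] - mu[None]) ** 2).sum(-1)
        var = np.maximum((r * d2).sum(0) / (2 * r.sum(0)), 1e-3)
    return pi, theta, mu, r

rng = np.random.default_rng(1)
geo = np.vstack([rng.normal([0, 0], 0.5, (40, 2)),
                 rng.normal([5, 5], 0.5, (40, 2))])
counts = np.vstack([rng.multinomial(20, [0.7, 0.1, 0.1, 0.1], 40),
                    rng.multinomial(20, [0.1, 0.1, 0.1, 0.7], 40)])
pi, theta, mu, r = em_multimodal(counts, geo)
print(np.round(mu, 1))   # recovers the two event locations
```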
Ben Mansour, Khaireddine; Rezzoug, Nasser; Gorce, Philippe
2015-10-01
The purpose of this paper was to determine which types of inertial sensors, and which of the advocated locations, should be used for reliable and accurate gait event detection and temporal parameter assessment in normal adults. In addition, we aimed to remove the ambiguity found in the literature regarding the definition of the initial contact (IC) from the lumbar accelerometer. Acceleration and angular velocity data were gathered from the lumbar region and the distal edge of each shank. These data were evaluated against an instrumented treadmill and an optoelectronic system during five treadmill-speed sessions. The lumbar accelerometer showed that the peak of the anteroposterior component was the most accurate for IC detection. Similarly, the valley that followed the peak of the vertical component was the most precise for terminal contact (TC) detection. Results based on ANOVA and Tukey tests showed that the set of inertial methods was suitable for temporal gait assessment and gait event detection in able-bodied subjects. For gait event detection, an exception was found with the shank accelerometer: the tool was suitable for temporal parameter assessment, despite the high root mean square error in the detection of IC (RMSEIC) and TC (RMSETC). The shank gyroscope was found to be as accurate as the kinematic method, since the statistical tests revealed no significant difference between the two techniques in the RMSE of all gait events and temporal parameters. The lumbar and shank accelerometers were the most accurate alternatives to the shank gyroscope for gait event detection and temporal parameter assessment, respectively. Copyright © 2015. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Le Bras, R.; Rozhkov, M.; Bobrov, D.; Kitov, I. O.; Sanina, I.
2017-12-01
Association of weak seismic signals generated by low-magnitude aftershocks of the DPRK underground tests into event hypotheses represents a challenge for routine automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization, due to the relatively low station density of the International Monitoring System (IMS) seismic network. Since 2011, as an alternative, the IDC has been testing various prototype techniques of signal detection and event creation based on waveform cross-correlation. Using signals measured by seismic stations of the IMS from DPRK explosions as waveform templates, the IDC detected several small (estimated mb between 2.2 and 3.6) seismic events after the two DPRK tests conducted on September 9, 2016 and September 3, 2017. The obtained detections were associated with reliable event hypotheses and then used to locate these events relative to the epicenters of the DPRK explosions. We observe high similarity of the detected signals with the corresponding waveform templates. The newly found signals also correlate well among themselves. In addition, the values of the signal-to-noise ratios (SNR) estimated using the traces of cross-correlation coefficients increase with template length (from 5 s to 150 s), providing strong evidence in favour of their spatial closeness, which allows interpreting them as explosion aftershocks. We estimated the relative magnitudes of all aftershocks using the ratio of RMS amplitudes of the master and slave signals in the cross-correlation windows characterized by the highest SNR. Additional waveform data from the regional non-IMS stations MDJ and SEHB provide independent validation of these aftershock hypotheses. Since waveform templates from any single master event may be suboptimal at some stations, we have also developed a method of jointly using templates from the DPRK tests and the biggest aftershocks to build more robust event hypotheses.
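Both ingredients — detection on a normalized cross-correlation trace and a relative magnitude from the RMS amplitude ratio in the correlation window — can be sketched as below. The MAD-based SNR threshold, the log10 conversion of the RMS ratio to magnitude units, and the test data are assumptions, not the IDC's settings.

```python
# Cross-correlation detection + relative magnitude from RMS amplitude ratio.
import numpy as np

def cc_trace(data, tmpl):
    m = len(tmpl)
    tz = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-12)
    out = np.empty(len(data) - m)
    for i in range(len(out)):
        w = data[i:i + m]
        out[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), tz) / m
    return out

def detect_and_size(data, tmpl, snr_min=8.0):
    cc = cc_trace(data, tmpl)
    mad = np.median(np.abs(cc - np.median(cc)))   # robust noise scale of CC trace
    events = []
    for i in np.flatnonzero(cc > snr_min * mad):
        w = data[i:i + len(tmpl)]
        dmag = np.log10(np.sqrt(np.mean(w**2)) / np.sqrt(np.mean(tmpl**2)))
        events.append((int(i), float(cc[i]), float(dmag)))
    return events  # (sample, CC value, relative magnitude)

rng = np.random.default_rng(5)
tmpl = np.sin(np.linspace(0, 10 * np.pi, 300)) * np.hanning(300)
data = rng.normal(scale=0.02, size=4000)
data[2500:2800] += 0.1 * tmpl                  # buried aftershock-like signal
print(detect_and_size(data, tmpl)[:3])         # detections near sample 2500
```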
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and a circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that the use of velocity thresholds for foot-on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to 16 ms per condition on average), greater than the effect on foot-on estimates or foot-off estimates in the forelimbs (up to ±7 ms per condition on average). PMID:26157641
ERIC Educational Resources Information Center
Sanocki, Thomas; Sulman, Noah
2013-01-01
Three experiments measured the efficiency of monitoring complex scenes composed of changing objects, or events. All events lasted about 4 s, but in a given block of trials, could be of a single type (single task) or of multiple types (multitask, with a total of four event types). Overall accuracy of detecting target events amid distractors was…
A research using hybrid RBF/Elman neural networks for intrusion detection system secure model
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Wang, Zhu; Yu, Haining
2009-10-01
A hybrid RBF/Elman neural network model that can be employed for both anomaly detection and misuse detection is presented in this paper. IDSs using the hybrid neural network can effectively detect temporally dispersed and collaborative attacks because of the network's memory of past events. The RBF network is employed as a real-time pattern classifier, and the Elman network is employed to retain the memory of past events. IDSs using the hybrid neural network were evaluated against the intrusion detection evaluation data sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA). Experimental results are presented as ROC curves. The experiments show that IDSs using this hybrid neural network improve the detection rate and effectively decrease the false-positive rate.
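How an Elman context layer can supply memory alongside an RBF similarity layer can be shown with a toy forward pass. The wiring below (Elman hidden state fed into Gaussian RBF units) and all sizes are our assumptions for illustration; the paper's architecture and training procedure are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

class ElmanRBF:
    """Toy forward pass: an Elman context layer feeding an RBF classifier."""
    def __init__(self, n_in, n_hidden, centers, gamma=1.0):
        self.Wx = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.Wc = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.context = np.zeros(n_hidden)      # memory of past events
        self.centers = centers                 # RBF prototype patterns
        self.gamma = gamma

    def step(self, x):
        # Elman recurrence: hidden state depends on input and prior context
        h = np.tanh(self.Wx @ x + self.Wc @ self.context)
        self.context = h
        # RBF layer: similarity of the hidden state to stored prototypes
        d2 = ((self.centers - h) ** 2).sum(axis=1)
        return np.exp(-self.gamma * d2)        # one score per prototype

# Usage: scores over a sequence of feature vectors (e.g., connection records)
net = ElmanRBF(n_in=8, n_hidden=16, centers=rng.normal(size=(3, 16)))
for x in rng.normal(size=(5, 8)):
    print(net.step(x))                         # anomaly/misuse scores
```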
ERIC Educational Resources Information Center
Wang, S.h.; Baillargeon, R.; Paterson, S.
2005-01-01
Recent research on infants' responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age (e.g., Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants' reasoning about when objects should and should not be occluded. Cognitive Psychology, 39, 116-157; Hespos, S.…
Probing the DPRK nuclear test-site to low magnitude using seismic pattern detectors
NASA Astrophysics Data System (ADS)
Kvaerna, T.; Gibbons, S. J.; Mykkeltveit, S.
2017-12-01
Six declared nuclear explosions at North Korea's Punggye-ri test-site between October 2006 and September 2017 were detected seismically at both regional and teleseismic distances. The similarity of body-wave signals from explosion to explosion allows us to locate these events relative to each other with high accuracy. Greater uncertainty in the relative time measurements for the most recent test, on 3 September 2017, results in greater uncertainty in the relative location estimate for this event, although it appears to have taken place below optimal overburden close to the peak of Mount Mantap. A number of smaller events, detected mainly at regional distances, have been identified as being at, or very close to, the test site. Due to waveform differences and the available station coverage, a simple double-difference relative location is often not possible. In addition to the apparent collapse event some 8 minutes after the declared nuclear test, small seismic events have been detected on 25 May 2014, 11 September 2016, 23 September 2017, and 12 October 2017. The signals from these events differ significantly from those of the declared nuclear tests, with far weaker Pn and far stronger Lg phases. Multi-channel correlation analysis and empirical matched field processing allow us to categorize these weaker seismic events with far greater confidence than classical waveform analysis allows.
Concept and Analysis of a Satellite for Space-Based Radio Detection of Ultra-High Energy Cosmic Rays
NASA Astrophysics Data System (ADS)
Romero-Wolf, Andrew; Gorham, P.; Booth, J.; Chen, P.; Duren, R. M.; Liewer, K.; Nam, J.; Saltzberg, D.; Schoorlemmer, H.; Wissel, S.; Zairfian, P.
2014-01-01
We present a concept for on-orbit radio detection of ultra-high energy cosmic rays (UHECRs) that has the potential to provide collection rates of ~100 events per year for energies above 10^20 eV. The Synoptic Wideband Orbiting Radio Detector (SWORD) mission's high event statistics at these energies, combined with the pointing capabilities of a space-borne antenna array, could enable charged particle astronomy. The detector concept is based on ANITA's successful detection of UHECRs, where the geosynchrotron radio signal produced by the extended air shower is reflected off the Earth's surface and detected in flight.
NASA Technical Reports Server (NTRS)
Silber, E. A.; Brown, P. G.; Le Pinchon, A.
2011-01-01
In the morning hours of October 8, 2009, a bright object entered Earth's atmosphere over South Sulawesi, Indonesia. The bolide disintegrated above the ground, generating stratospheric infrasound returns that were detected by infrasonic stations of the global International Monitoring System (IMS) network of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) at distances up to 17,500 km. Here we present instrumental recordings and preliminary results for this extraordinary event. Using the infrasonic period-yield relations originally derived for atmospheric nuclear detonations, we find the most probable source energy for this bolide to be 70 ± 20 kt TNT equivalent explosive yield. A unique aspect of this event is that it was apparently detected by infrasound only. Global events of such magnitude are expected only once per decade and can be used to calibrate infrasonic location and propagation tools on a global scale, and to evaluate energy-yield formulas and event timing.
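The period-yield approach mentioned here can be illustrated with a commonly cited AFTAC-style relation from the bolide infrasound literature; the exact coefficients and validity range used in the paper may differ, so treat this as a sketch rather than the authors' formula.

```python
import math

def yield_from_period(period_s):
    """Energy (kt TNT) from the dominant infrasound period via a widely
    quoted AFTAC-style relation, log10(W/2) = 3.34*log10(P) - 2.58
    (W in kt, P in s, valid roughly for W/2 <= 100 kt). Coefficients
    recalled from the bolide literature; treat as illustrative."""
    return 2.0 * 10 ** (3.34 * math.log10(period_s) - 2.58)

# Under this relation a ~17 s dominant period maps to roughly 70 kt:
print(round(yield_from_period(17.2), 1))
```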
The rate of transient beta frequency events predicts behavior across tasks and species
Law, Robert; Tsutsui, Shawn; Moore, Christopher I; Jones, Stephanie R
2017-01-01
Beta oscillations (15-29 Hz) are among the most prominent signatures of brain activity. Beta power is predictive of healthy and abnormal behaviors, including perception, attention and motor action. In non-averaged signals, beta can emerge as transient high-power 'events'. As such, functionally relevant differences in averaged power across time and trials can reflect changes in event number, power, duration, and/or frequency span. We show that functionally relevant differences in averaged beta power in primary somatosensory neocortex reflect a difference in the number of high-power beta events per trial, i.e. event rate. Further, beta events occurring close to the stimulus were more likely to impair perception. These results are consistent across detection and attention tasks in human magnetoencephalography, and in local field potentials from mice performing a detection task. These results imply that an increased propensity of beta events predicts the failure to effectively transmit information through specific neocortical representations. PMID:29106374
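Counting transient high-power events in single-trial data, as described, reduces to thresholding a band-limited power envelope. In the sketch below the band edges follow the abstract, but the Hilbert-envelope estimator and the 6x-median threshold are our illustrative choices, not the published event definition.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_event_rate(trial, fs, band=(15, 29), k=6.0):
    """Count transient high-power beta events in a single trial.

    Band-pass the signal, take the Hilbert envelope power, and call an
    'event' any contiguous run exceeding k times the median power."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    power = np.abs(hilbert(filtfilt(b, a, trial))) ** 2
    above = power > k * np.median(power)
    # rising edges of the boolean mask mark event onsets
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1)
    return len(onsets), onsets / fs
```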
Kaplan, H S
2005-11-01
Safety and reliability in blood transfusion are not static, but are dynamic non-events. Since performance deviations continually occur in complex systems, their detection and correction must be accomplished over and over again. Non-conformance must be detected early enough to allow for recovery or mitigation. Near-miss events afford early detection of possible system weaknesses and provide an early chance at correction. National event reporting systems, both voluntary and involuntary, have begun to include near-miss reporting in their classification schemes, raising awareness for their detection. MERS-TM is a voluntary safety reporting initiative in transfusion. Currently 22 hospitals submit reports anonymously to a central database which supports analysis of a hospital's own data and that of an aggregate database. The system encourages reporting of near-miss events, where the patient is protected from receiving an unsuitable or incorrect blood component by a planned or unplanned recovery step. MERS-TM data suggest approximately 90% of events are near-misses, with 10% caught after issue but before transfusion. Near-miss reporting may increase total reports ten-fold. The ratio of near-misses to events with harm is 339:1, consistent with other industries' ratio of 300:1, which has been proposed as a measure of reporting in event reporting systems. Use of a risk matrix and an event's relation to protective barriers allow prioritization of these events. Near-misses recovered by planned barriers occur ten times more frequently than unplanned recoveries. A bedside check of the patient's identity against that on the blood component is an essential final barrier. How the typical two-person check is performed is critical. Even when properly done, this check is ineffective against sampling and testing errors. Blood testing at the bedside just prior to transfusion minimizes the risk of such upstream events. However, even with simple and well-designed devices, training may be a critical issue. Sample errors account for more than half of reported events. The most dangerous miscollection is a blood sample passing acceptance with no previous patient results for comparison. Bar-code labels or collection of a second sample may counter this upstream vulnerability. Further upstream barriers have been proposed to counter the precariousness of urgent blood sample collection in a changing, unstable situation. One, a linking device, allows safer labeling of tubes away from the bedside; the second, a forcing function, prevents omission of critical patient identification steps. Errors in the blood bank itself account for 15% of errors, with a high potential severity. In one such event, a component incorrectly issued but safely detected prior to transfusion focused attention on multitasking's contribution to laboratory error. In sum, use of near-miss information, by enhancing barriers supporting error prevention and mitigation, increases our capacity to get the right blood to the right patient.
NASA Astrophysics Data System (ADS)
Touati, Sarah; Naylor, Mark; Main, Ian
2016-02-01
The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that effectiveness varies widely among these tests. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved relatively ineffective at reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large events worldwide has increased in recent years.
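For the Poisson case, the Bayes factor comparison between a constant-rate model and a single change point has a closed form under a conjugate Gamma prior. The sketch below is a minimal illustration of that comparison; the priors, binning, and uniform change-point prior are our assumptions, not the paper's setup.

```python
import numpy as np
from scipy.special import gammaln

def log_marginal_poisson(counts, a=1.0, b=1.0):
    """Log marginal likelihood of i.i.d. Poisson counts under a
    Gamma(a, b) prior on the rate (conjugate, closed form)."""
    S, k = counts.sum(), len(counts)
    return (a * np.log(b) - gammaln(a) + gammaln(a + S)
            - (a + S) * np.log(b + k) - gammaln(counts + 1).sum())

def log_bayes_factor(counts):
    """Change-point model (M1) vs constant-rate model (M0), with a
    uniform prior over the change-point position."""
    lm0 = log_marginal_poisson(counts)
    lms = [log_marginal_poisson(counts[:t]) + log_marginal_poisson(counts[t:])
           for t in range(1, len(counts))]
    lm1 = np.logaddexp.reduce(lms) - np.log(len(lms))
    return lm1 - lm0

# Synthetic test: the rate doubles halfway through the catalogue
rng = np.random.default_rng(1)
x = np.concatenate([rng.poisson(2, 50), rng.poisson(4, 50)])
print(log_bayes_factor(x))   # positive values favour a rate change
```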
Detection and localization capability of an urban seismic sinkhole monitoring network
NASA Astrophysics Data System (ADS)
Becker, Dirk; Dahm, Torsten; Schneider, Fabian
2017-04-01
Microseismic events linked to underground processes in sinkhole areas might serve as precursors to larger mass-dislocation or rupture events that can cause felt ground shaking or even structural damage. To identify these weak and shallow events, a sensitive local seismic monitoring network is needed. In an urban environment, the performance of local monitoring networks is severely compromised by the high anthropogenic noise level. We study the detection and localization capability of such a network, which is already partly installed in the urban area of the city of Hamburg, Germany, within the joint project SIMULTAN (http://www.gfz-potsdam.de/en/section/near-surface-geophysics/projects/simultan/). SIMULTAN aims to monitor a known sinkhole structure and gain a better understanding of the underlying processes. The current network consists of six surface stations installed in the basements of private houses and in underground structures of a research facility (DESY - Deutsches Elektronen Synchrotron). Since the monitoring campaign started in 2015, no microseismic events could be unambiguously attributed to the sinkholes. To estimate the detection and location capability of the network, we calculate synthetic waveforms based on the locations and mechanisms of former events in the area. These waveforms are combined with the urban seismic noise recorded at the station sites. As detection algorithms, a simple STA/LTA trigger and a more sophisticated phase detector are used. The STA/LTA detector delivers stable results and detects events with a moment magnitude as low as 0.35 at a distance of 1.3 km from the source, even under the present high-noise conditions; the phase detector is more sensitive but less stable. It should be stressed that, due to the local near-surface conditions of wave propagation, detections are generally made on S- or surface waves rather than on P-waves, which have significantly lower amplitude. Because of the often emergent onsets of seismic phases from sinkhole events and the high noise level, the localization capability of the network is assessed with a stacking approach applied to characteristic waveforms (STA/LTA traces), in addition to traditional estimates based on travel-time uncertainties and network geometry. The effect of a vertical array of borehole sensors, as well as of a small-scale surface array, on the location accuracy is also investigated. Because the expected seismic signals have a rather low-frequency character, arrays with small apertures (imposed by the required close proximity to the source) exhibit considerable uncertainty in the determined azimuth of the incoming wavefront, but they can help to better constrain the event location. Future borehole stations, apart from significantly reducing the detection threshold, would also significantly reduce the location uncertainty. In addition, the synthetic data sets created for this study can be used to better constrain the magnitudes of the microseismic events by deriving attenuation relations for the surface waves of shallow events encountered in the sinkhole environment. This work has been funded by the German 'Geotechnologien' project SIMULTAN (BMBF03G0737A).
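The STA/LTA trigger named here is a standard detector and easy to sketch; the window lengths and threshold below are illustrative defaults, not the study's tuning.

```python
import numpy as np

def sta_lta(x, fs, sta_win=0.5, lta_win=10.0, threshold=4.0):
    """Classic STA/LTA trigger on a single trace (illustrative parameters).

    Returns approximate sample indices where the short-term / long-term
    average of signal energy first exceeds the threshold."""
    e = x.astype(float) ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[nl + ns:] - csum[nl:-ns]) / ns            # window after the LTA
    lta = (csum[nl:-ns] - csum[:-(nl + ns)]) / nl
    ratio = sta / np.maximum(lta, 1e-12)
    onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold))
    return onsets + nl + 1                                 # offset back to trace samples
```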
Li, Yunji; Wu, QingE; Peng, Li
2018-01-23
In this paper, a synthesized design of a fault-detection filter and a fault estimator is considered for a class of discrete-time stochastic systems in the framework of an event-triggered transmission scheme subject to unknown disturbances and deception attacks. A random variable obeying the Bernoulli distribution is employed to characterize the randomly occurring deception attacks. To obtain a fault-detection residual that is sensitive only to faults while robust to disturbances, a coordinate transformation approach is exploited. This approach transforms the considered system into two subsystems, with the unknown disturbances removed from one of them. The gain of the fault-detection filter is derived by minimizing an upper bound on the filter error covariance. Meanwhile, system faults can be reconstructed by the remote fault estimator. A recursive approach is developed to obtain the fault estimator gains and to guarantee the fault estimator's performance. Furthermore, a corresponding event-triggered sensor data transmission scheme is presented to improve the working life of the wireless sensor node when measurement information is transmitted aperiodically. Finally, a scaled version of an industrial system consisting of a local PC, a remote estimator and a wireless sensor node is used to experimentally evaluate the proposed theoretical results. In particular, a novel fault-alarming strategy is proposed so that the real-time capability of fault detection is guaranteed when the event condition is triggered.
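The event-triggered transmission idea (send a measurement only when it deviates sufficiently from the last transmitted value) can be sketched in a few lines. The send-on-delta condition and threshold below are simplifications of our own; the paper's trigger condition is more elaborate.

```python
import numpy as np

def event_triggered_stream(measurements, sigma=0.5):
    """Send-on-delta style trigger: the sensor transmits only when the new
    measurement deviates enough from the last transmitted one.
    sigma is an illustrative threshold, not the paper's condition."""
    sent, last = [], None
    for k, y in enumerate(measurements):
        if last is None or np.linalg.norm(y - last) ** 2 > sigma:
            sent.append((k, y))    # transmit over the wireless link
            last = y               # remote estimator updates its copy
    return sent

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=(100, 2)), axis=0)   # synthetic sensor signal
print(f"{len(event_triggered_stream(y))} of {len(y)} samples transmitted")
```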
System level latchup mitigation for single event and transient radiation effects on electronics
Kimbrough, Joseph Robert; Colella, Nicholas John
1997-01-01
A "blink" technique, analogous to a person blinking at a flash of bright light, is provided for mitigating the effects of single event current latchup and prompt pulse destructive radiation on a micro-electronic circuit. The system includes event detection circuitry, power dump logic circuitry, and energy limiting measures with autonomous recovery. The event detection circuitry includes ionizing radiation pulse detection means for detecting a pulse of ionizing radiation and for providing at an output terminal thereof a detection signal indicative of the detection of a pulse of ionizing radiation. The current sensing circuitry is coupled to the power bus for determining an occurrence of excess current through the power bus caused by ionizing radiation or by ion-induced destructive latchup of a semiconductor device. The power dump circuitry includes power dump logic circuitry having a first input terminal connected to the output terminal of the ionizing radiation pulse detection circuitry and having a second input terminal connected to the output terminal of the current sensing circuitry. The power dump logic circuitry provides an output signal to the input terminal of the circuitry for opening the power bus and the circuitry for shorting the power bus to a ground potential to remove power from the power bus. The energy limiting circuitry with autonomous recovery includes circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential. The circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential includes a series FET and a shunt FET. The invention provides for self-contained sensing for latchup, first removal of power to protect latched components, and autonomous recovery to enable transparent operation of other system elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrer, Brandon Robinson
2011-09-01
Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies: events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There is currently no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets from both audio and imagery data.
Detection and Classification of Motor Vehicle Noise in a Forested Landscape
NASA Astrophysics Data System (ADS)
Brown, Casey L.; Reed, Sarah E.; Dietz, Matthew S.; Fristrup, Kurt M.
2013-11-01
Noise emanating from human activity has become a common addition to natural soundscapes and has the potential to harm wildlife and erode human enjoyment of nature. In particular, motor vehicles traveling along roads and trails produce high levels of both chronic and intermittent noise, eliciting varied responses from a wide range of animal species. Anthropogenic noise is especially conspicuous in natural areas where ambient background sound levels are low. In this article, we present an acoustic method to detect and analyze motor vehicle noise. Our approach uses inexpensive consumer products to record sound, sound analysis software to automatically detect sound events within continuous recordings and measure their acoustic properties, and statistical classification methods to categorize sound events. We describe an application of this approach to detect motor vehicle noise on paved, gravel, and natural-surface roads, and off-road vehicle trails in 36 sites distributed throughout a national forest in the Sierra Nevada, CA, USA. These low-cost, unobtrusive methods can be used by scientists and managers to detect anthropogenic noise events for many potential applications, including ecological research, transportation and recreation planning, and natural resource management.
Stewart, C M; Newlands, S D; Perachio, A A
2004-12-01
Rapid and accurate discrimination of single units from extracellular recordings is a fundamental process for the analysis and interpretation of electrophysiological recordings. We present an algorithm that performs detection, characterization, discrimination, and analysis of action potentials from extracellular recording sessions. The program was written entirely in LabVIEW (National Instruments) and requires no external hardware devices or a priori information about action potential shapes. Waveform events are detected by scanning the digital record for voltages that exceed a user-adjustable trigger. Detected events are characterized to determine nine different time and voltage levels for each event. Various algebraic combinations of these waveform features are used as axis choices for 2-D Cartesian plots of events. The user selects axis choices that generate distinct clusters. Multiple clusters may be defined as action potentials by manually drawing boundaries of arbitrary shape. Events defined as action potentials are validated by visual inspection of overlain waveforms. Stimulus-response relationships may be identified by selecting any recorded channel for comparison to continuous and average cycle histograms of binned unit data. The algorithm includes novel aspects of feature analysis and acquisition, including higher acquisition rates for electrophysiological data compared to other channels. The program confirms that electrophysiological data may be discriminated with high speed and efficiency using algebraic combinations of waveform features derived from high-speed digital records.
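The detection-and-characterization stage described (trigger crossing, then time/voltage features per event) translates naturally into a few lines of array code. The feature set and window length below are invented stand-ins for the paper's nine features, and the sketch is in Python rather than LabVIEW.

```python
import numpy as np

def detect_waveform_events(trace, fs, trigger, win_ms=2.0):
    """Detect threshold crossings and extract simple waveform features,
    in the spirit of the described discriminator (feature set is ours)."""
    n = int(win_ms * 1e-3 * fs)
    idx = np.flatnonzero((trace[1:] > trigger) & (trace[:-1] <= trigger))
    events = []
    for i in idx:
        if i + n > len(trace):
            break
        w = trace[i:i + n]
        events.append({
            "t": i / fs,
            "peak": w.max(),                    # positive peak voltage
            "trough": w.min(),                  # negative peak voltage
            "peak_t": w.argmax() / fs,          # time to peak within window
            "width": (w > trigger).sum() / fs,  # time above trigger
        })
    return events

# 2-D cluster plots would then use algebraic combinations of these
# features, e.g. (peak - trough) vs. time-to-peak.
```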
Long-term changes of the glacial seismicity: case study from Spitsbergen
NASA Astrophysics Data System (ADS)
Gajek, Wojciech; Trojanowski, Jacek; Malinowski, Michał
2016-04-01
Changes in the global temperature balance have proved to have a major impact on the cryosphere, and retreating glaciers have become a symbol of the warming climate. Our study focuses on year-to-year changes in glacier-generated seismicity. We processed 7 years of continuous seismological data recorded by the HSP broadband station located in the proximity of Hansbreen glacier (Hornsund, southern Spitsbergen), obtaining the distribution of seismic activity between 2008 and 2014. We developed a new fuzzy logic algorithm to distinguish between glacier- and non-glacier-origin events. The algorithm takes into account the frequency content of the seismic signal and the energy flow in a given time interval. Our research reveals that the number of detected glacier-origin events has doubled over the last two years. The annual event distribution correlates well with temperature and precipitation curves, illustrating the characteristic year-long behaviour of glacier seismic activity. To further support our observations, we analysed a 5-year distribution of glacier-origin tremors detected in the vicinity of the Kronebreen glacier using the KBS broadband station located in Ny-Ålesund (western Spitsbergen). We observe a steady increase in the number of events detected each year, although less pronounced than for the Hornsund dataset.
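A fuzzy-logic discriminant of the kind described combines membership functions over signal attributes. The attributes, membership ranges, and min-combination below are entirely our own illustration; the study's actual rule base is not given in this abstract.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def glacier_event_score(dom_freq_hz, energy_duration_s):
    """Fuzzy score that an event is glacier-generated. The membership
    ranges are invented for illustration, not the study's tuning."""
    mf_freq = tri(dom_freq_hz, 1.0, 5.0, 15.0)       # mid-band dominant energy
    mf_dur = tri(energy_duration_s, 0.5, 3.0, 20.0)  # short, impulsive energy flow
    return min(mf_freq, mf_dur)                      # fuzzy AND (minimum)

print(glacier_event_score(6.0, 2.0))   # high score -> likely glacier-origin
```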
Closing the loop in ICU decision support: physiologic event detection, alerts, and documentation.
Norris, P. R.; Dawant, B. M.
2001-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally, care providers should be alerted only when events are clinically significant and there is an opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and the effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts on the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users' alphanumeric pagers and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2,280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying the documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to improve event definitions and enhance system utility. PMID:11825238
The Swiss-Army-Knife Approach to the Nearly Automatic Analysis for Microearthquake Sequences.
NASA Astrophysics Data System (ADS)
Kraft, T.; Simon, V.; Tormann, T.; Diehl, T.; Herrmann, M.
2017-12-01
Many Swiss earthquake sequences have been studied using relative location techniques, which often made it possible to constrain the active fault planes and shed light on the tectonic processes that drove the seismicity. Yet, in the majority of cases the number of located earthquakes was too small to infer the details of the space-time evolution of the sequences, or their statistical properties. It has therefore mostly been impossible to resolve clear patterns in the seismicity of individual sequences, which are needed to improve our understanding of the mechanisms behind them. Here we present a nearly automatic workflow that combines well-established seismological analysis techniques and significantly improves the completeness of detected and located earthquakes of a sequence. We start from the manually timed routine catalog of the Swiss Seismological Service (SED), which contains the larger events of a sequence. From these well-analyzed earthquakes we dynamically assemble a template set and perform a matched-filter analysis on the station with the best SNR for the sequence and a recording history of at least 10-15 years, our typical analysis period. This usually allows us to detect events several orders of magnitude below the SED catalog detection threshold. The waveform similarity of the events is then further exploited to derive accurate and consistent magnitudes. The enhanced catalog is then analyzed statistically to derive high-resolution time-lines of the a- and b-values and, consequently, the occurrence probability of larger events. Many of the detected events are strong enough to be located using double differences. No further manual interaction is needed; we simply time-shift the arrival-time pattern of the detecting template to the associated detection. Waveform similarity assures a good approximation of the expected arrival times, which we use to calculate event-pair arrival-time differences by cross correlation. After an SNR and cycle-skipping quality check, these are fed directly into hypoDD. Using this procedure we usually increase the number of well-relocated events by a factor of 2-5. We demonstrate the successful application of the workflow on natural sequences in Switzerland and present first results of the advanced analysis that was possible with the enhanced catalogs.
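The cross-correlation differential times that feed hypoDD can be measured as in the sketch below; the parabolic sub-sample refinement is a common trick, and the SNR/cycle-skipping checks mentioned in the abstract would sit on top of it.

```python
import numpy as np

def cc_delay(a, b, fs):
    """Arrival-time difference between two similar waveforms from the peak
    of their cross-correlation, refined to sub-sample precision with a
    parabolic fit (quality checks omitted for brevity)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    k = cc.argmax()
    if 0 < k < len(cc) - 1:                   # parabolic interpolation
        y0, y1, y2 = cc[k - 1], cc[k], cc[k + 1]
        k = k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    lag = k - (len(b) - 1)                    # remove zero-lag offset
    return lag / fs                           # delay in seconds

# These delays, measured for many event pairs at common stations, are the
# differential times fed to a double-difference relocator such as hypoDD.
```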
ElarmS Earthquake Early Warning System 2016 Performance and New Research
NASA Astrophysics Data System (ADS)
Chung, A. I.; Allen, R. M.; Hellweg, M.; Henson, I. H.; Neuhauser, D. S.
2016-12-01
The ElarmS earthquake early warning system has been detecting earthquakes throughout California since 2007. It is one of the algorithms that contributes to the West Coast ShakeAlert, a prototype earthquake early warning system being developed for the US West Coast. ElarmS is also running in the Pacific Northwest and, in test mode, in Israel, Chile, Turkey, and Peru. We summarize the performance of the ElarmS system over the past year and review some of the more problematic events that the system has encountered. During the first half of 2016 (2016-01-01 through 2016-07-21), ElarmS successfully alerted on all events with ANSS catalog magnitudes M>3 in the Los Angeles area. The mean alert time for these 9 events was just 4.84 seconds. In the San Francisco Bay Area, ElarmS detected 26 events with ANSS catalog magnitudes M>3, with a mean alert time of 9.12 seconds. Alert times are longer in the Bay Area than in the Los Angeles area because of the sparser station network in the Bay Area. Seven Bay Area events were not detected by ElarmS; these occurred in areas with less dense station coverage. In addition, ElarmS sent alerts for 13 of the 16 moderately sized (ANSS catalog magnitudes M>4) events that occurred throughout the state of California. One of the missed events was a M4.5 that occurred far offshore in the northernmost part of the state; the other two occurred inland in regions with sparse station coverage. Over the past year, we have worked towards the implementation of a new filter-bank teleseismic filter algorithm, which we will discuss. Apart from teleseismic events, a significant cause of false alerts and severely mislocated events is spurious triggers being associated with triggers from a real earthquake. Here, we address new approaches to filtering out such problematic triggers.
Prospects for the Detection of Fast Radio Bursts with the Murchison Widefield Array
NASA Astrophysics Data System (ADS)
Trott, Cathryn M.; Tingay, Steven J.; Wayth, Randall B.
2013-10-01
Fast radio bursts (FRBs) are short-timescale (<1 s) astrophysical radio signals, presumed to be a signature of cataclysmic events of extragalactic origin. The discovery of six high-redshift events at ~1400 MHz from the Parkes radio telescope suggests that FRBs may occur at a high rate across the sky. The Murchison Widefield Array (MWA) operates at low radio frequencies (80-300 MHz) and is expected to detect FRBs due to its large collecting area (~2500 m²) and wide field-of-view (FOV, ~1000 deg² at ν = 200 MHz). We compute the expected number of FRB detections for the MWA assuming a source population consistent with the reported detections. Our formalism properly accounts for the frequency dependence of the antenna primary beam, the MWA system temperature, and the unknown spectral index of the source population, for three modes of FRB detection: coherent, incoherent, and fast imaging. We find that the MWA's sensitivity and large FOV combine to provide the expectation of multiple detectable events per week in all modes, potentially making it an excellent high-time-resolution science instrument. Deviations of the expected number of detections from actual results will provide a strong constraint on the assumptions made for the underlying source population and intervening plasma distribution.
Searching for tidal disruption events at an unexplored wavelength
NASA Astrophysics Data System (ADS)
Soler, S.; Webb, N.; Saxton, R.
2017-10-01
When a star approaches too close to a black hole, the star can be torn apart by the gravitational forces and approximately half the matter falls towards the black hole, causing the luminosity to increase by several orders of magnitude. Such an event is known as a tidal disruption event (TDE). These events can help us locate black holes which would otherwise be too faint to be detected and help us understand the mass function of these objects. To date only a small sample of candidate TDEs have been detected (~65), either in the optical or in soft X-rays. However, four TDEs have been observed with hard X-ray spectra. In order to determine if these hard TDEs are the result of a different mechanism to those detected at lower energy, we search for similar events in the 3XMM catalogue. Using spectral and timing characteristics determined from the hard TDEs and cross-correlating 3XMM with other catalogues, we have developed a methodology with which to identify new hard TDEs. In this poster we describe the characteristics used to search for previously undiscovered hard TDEs and present the results of this search and the resulting constraints on the central mechanism in TDEs.
Reaction times to weak test lights. [psychophysics biological model
NASA Technical Reports Server (NTRS)
Wandell, B. A.; Ahumada, P.; Welsh, D.
1984-01-01
Maloney and Wandell (1984) describe a model of the response of a single visual channel to weak test lights. The initial channel response is a linearly filtered version of the stimulus. The filter output is randomly sampled over time. Each time a sample occurs, there is some probability, increasing with the magnitude of the sampled response, that a discrete detection event is generated. Maloney and Wandell derive the statistics of the detection events. This paper tests the hypothesis that reaction-time responses to the presence of a weak test light are initiated at the first detection event. This makes it possible to extend the application of the model to lights that are slightly above threshold but still within the linear operating range of the visual system. A parameter-free prediction of the Maloney and Wandell model for lights detected by this statistic is tested. The data are in agreement with the prediction.
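The first-detection-event assumption lends itself to a direct Monte Carlo check. The sampling rate, detection-probability gain, and motor delay below are invented constants; only the structure (random sampling, first detection event triggers the response) follows the model described.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_rt(intensity, rate_hz=50.0, gain=0.2, t_max=2.0, motor_delay=0.2):
    """Simulate reaction time as the time of the first detection event.

    Samples occur as a Poisson process; each sample yields a detection
    event with probability increasing with the response, here simply
    proportional to stimulus intensity. All constants are illustrative."""
    t = 0.0
    while t < t_max:
        t += rng.exponential(1.0 / rate_hz)            # next random sample
        if rng.random() < min(1.0, gain * intensity):  # detection event?
            return t + motor_delay                     # RT starts at 1st event
    return np.nan                                      # miss: no event in time

rts = [simulate_rt(1.5) for _ in range(1000)]
print(np.nanmean(rts))   # weaker lights -> fewer events -> longer mean RT
```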
TED: a novel man portable infrared detection and situation awareness system
NASA Astrophysics Data System (ADS)
Tidhar, Gil; Manor, Ran
2007-04-01
Infrared Search and Track (IRST) and threat warning systems are used in vehicle-mounted or fixed land positions. Migrating this technology to man-portable applications has proved difficult due to the tight constraints on power consumption, dimensions, and weight, and due to the high video-rate requirements. In this report we provide design details of a novel transient event detection (TED) system capable of detecting blasts and gunshot events in a very wide field of view while used by an operator in motion.
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the pickpocket following the victim, or by interaction with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false-positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos to be observed.
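Rule-based classification of single-track actions reduces to simple statistics over the track geometry. The thresholds and the four rules below are invented for illustration; the paper's feature set and rules are richer.

```python
import numpy as np

def classify_track(xy, fs, run_speed=2.5, loiter_radius=1.5, loiter_time=10.0):
    """Rule-based label for one pedestrian track (thresholds invented).

    xy: (N, 2) positions in metres sampled at fs Hz."""
    v = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fs   # speed per step
    if np.median(v) > run_speed:
        return "run"
    if np.median(v) < 0.2:
        return "stop"
    # loiter: stays within a small radius for a long time while moving
    if (np.linalg.norm(xy - xy.mean(axis=0), axis=1).max() < loiter_radius
            and len(xy) / fs > loiter_time):
        return "loiter"
    return "walk"
```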
Real-Time Event Detection for Monitoring Natural and Source Waterways - Sacramento, CA
The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitori...
Introduction of the ASGARD Code
NASA Technical Reports Server (NTRS)
Bethge, Christian; Winebarger, Amy; Tiwari, Sanjiv; Fayock, Brian
2017-01-01
ASGARD stands for 'Automated Selection and Grouping of events in AIA Regional Data'. The code is a refinement of the event detection method in Ugarte-Urra & Warren (2014). It is intended to automatically detect and group brightenings ('events') in the AIA EUV channels, to record event parameters, and to find related events over multiple channels. Ultimately, the goal is to automatically determine heating and cooling timescales in the corona and to significantly increase statistics in this respect. The code is written in IDL and requires the SolarSoft library. It is parallelized and can run with multiple CPUs. Input files are regions of interest (ROIs) in time series of AIA images from the JSOC cutout service (http://jsoc.stanford.edu/ajax/exportdata.html). The ROIs need to be tracked, co-registered, and limited in time (typically 12 hours).
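A much-simplified version of detect-and-group on an ROI cube can be written with connected-component labelling. This is our stand-in for the grouping step, not the ASGARD code (which is IDL and parallelized), and the threshold rule is invented.

```python
import numpy as np
from scipy import ndimage

def detect_brightenings(cube, k=3.0):
    """Find and group transient brightenings in an ROI image cube.

    cube: (time, y, x) array. A pixel is 'bright' when it exceeds its own
    temporal median by k standard deviations; connected-component labelling
    then groups touching bright pixels in space and time into events."""
    med = np.median(cube, axis=0)
    sig = np.maximum(cube.std(axis=0), 1e-12)
    mask = cube > med + k * sig
    labels, n = ndimage.label(mask)            # group in (t, y, x)
    events = ndimage.find_objects(labels)
    return [{"t": s[0], "y": s[1], "x": s[2]} for s in events], n
```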
Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network
NASA Astrophysics Data System (ADS)
Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey
2010-05-01
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications; it is typically composed of 4 to 9 sensors, with a 1 to 3 km aperture geometry. At the end of 2000, only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed, and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false-alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system. Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications such as the monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with the seismic and hydroacoustic technologies. The arrival phases detected on the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies only recently (in early 2010) became part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These infrasound-generating sources may be natural or man-made. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain-associated waves.
Only a small fraction of events meet the event definition criteria considering the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
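The PMCC consistency idea (pairwise correlation delays that must close around sensor triplets for a genuine plane-wave arrival) can be illustrated minimally. The functions below are our simplification of the method and ignore the time-frequency tiling that PMCC actually applies.

```python
import numpy as np

def pairwise_delay(a, b):
    """Inter-sensor delay (in samples) from the cross-correlation peak."""
    cc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return cc.argmax() - (len(b) - 1)

def closure_residual(x1, x2, x3):
    """PMCC-style consistency check on a sensor triplet: for a coherent
    plane wave the three pairwise delays must close, t12 + t23 + t31 ~ 0.
    Large residuals indicate uncorrelated noise rather than a signal."""
    t12 = pairwise_delay(x1, x2)
    t23 = pairwise_delay(x2, x3)
    t31 = pairwise_delay(x3, x1)
    return abs(t12 + t23 + t31)
```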
Detection and analysis of high-temperature events in the BIRD mission
NASA Astrophysics Data System (ADS)
Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang
2005-01-01
The primary mission objective of a new small Bi-spectral InfraRed Detection (BIRD) satellite is detection and quantitative analysis of high-temperature events like fires and volcanoes. An absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and the radiative energy release. Examples are given of detection and analysis of wild and coal seam fires, of volcanic activity as well as of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on ground, had an area of 12 m² at daytime and 4 m² at night.
NASA Astrophysics Data System (ADS)
Mateo, Mario
1994-01-01
Three teams of astronomers believe they have independently found evidence for dark matter in our galaxy. A brief history of the search for dark matter is presented. The use of microlensing-event observation for spotting dark matter is described. The equipment required to observe microlensing events and three groups working on dark matter detection are discussed. The three groups are the Massive Compact Halo Objects (MACHO) Project team, the Experience de Recherche d'Objets Sombres (EROS) team, and the Optical Gravitational Lensing Experiment (OGLE) team. The first apparent detections of microlensing events by the three teams are briefly reported.
Candidate Binary Microlensing Events from the MACHO Project
NASA Astrophysics Data System (ADS)
Becker, A. C.; Alcock, C.; Allsman, R. A.; Alves, D. R.; Axelrod, T. S.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.; King, L. J.; Lehner, M. J.; Marshall, S. L.; Minniti, D.; Peterson, B. A.; Popowski, P.; Pratt, M. R.; Quinn, P. J.; Rodgers, A. W.; Stubbs, C. W.; Sutherland, W.; Tomaney, A.; Vandehei, T.; Welch, D. L.; Baines, D.; Brakel, A.; Crook, B.; Howard, J.; Leach, T.; McDowell, D.; McKeown, S.; Mitchell, J.; Moreland, J.; Pozza, E.; Purcell, P.; Ring, S.; Salmon, A.; Ward, K.; Wyper, G.; Heller, A.; Kaspi, S.; Kovo, O.; Maoz, D.; Retter, A.; Rhie, S. H.; Stetson, P.; Walker, A.; MACHO Collaboration
1998-12-01
We present the lightcurves of 22 gravitational microlensing events from the first six years of the MACHO Project gravitational microlensing survey which are likely examples of lensing by binary systems. These events were selected from a total sample of ~300 events which were either detected by the MACHO Alert System or discovered through retrospective analyses of the MACHO database. Many of these events appear to have undergone a caustic or cusp crossing, and 2 of the events are well fit by lensing by binary systems with large mass ratios, indicating secondary companions of approximately planetary mass. The event rate is roughly consistent with predictions based upon our knowledge of the properties of binary stars. The utility of binary lensing in helping to solve the Galactic dark matter problem is demonstrated with analyses of 3 binary microlensing events seen towards the Magellanic Clouds. Source star resolution during caustic crossings in 2 of these events allows us to estimate the location of the lensing systems, assuming each source is a single star and not a short-period binary. * MACHO LMC-9 appears to be a binary lensing event with a caustic crossing partially resolved in 2 observations. The resulting lens proper motion appears too small for a single source and LMC disk lens. However, it is considerably less likely to be a single source star and Galactic halo lens. We estimate the a priori probability of a short-period binary source with a detectable binary character to be ~10%. If the source is also a binary, then we currently have no constraints on the lens location. * The most recent of these events, MACHO 98-SMC-1, was detected in real-time. Follow-up observations by the MACHO/GMAN, PLANET, MPS, EROS and OGLE microlensing collaborations led to the robust conclusion that the lens likely resides in the SMC.
Decrease the Number of Glovebox Glove Breaches and Failures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurtle, Jackie C.
2013-12-24
Los Alamos National Laboratory (LANL) is committed to the protection of workers, the public, and the environment while performing work, and uses gloveboxes as engineered controls to protect workers from exposure to hazardous materials during plutonium operations. Glovebox gloves are a weak link in the engineered controls and are a major cause of radiation contamination events, which can result in potential worker exposure and localized contamination, making operational areas off-limits and putting programmatic work on hold. Each day of lost opportunity at Technical Area (TA) 55, Plutonium Facility (PF) 4 is estimated at $1.36 million. Between July 2011 and June 2013, TA-55-PF-4 had 65 glovebox glove breaches and failures, an average of 2.7 per month. The glovebox work follows the five-step safety process promoted at LANL, with a decision diamond interjected for whether or not a glove breach or failure event occurred in the course of performing glovebox work. In the event that no glove breach or failure is detected, there is an additional decision for whether or not contamination is detected. In the event that contamination is detected, the possibility of a glove breach or failure event is revisited.
Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin
2006-12-01
We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating the times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
NASA Astrophysics Data System (ADS)
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases to USArray data. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform time-slowness analysis of the signals and correlate it with the time-slowness equivalent of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 2010 February 1. This event, which lasted one day, has a narrow frequency band around 0.1 Hz, and it occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by interference between ocean waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the South East Indian Ridge.
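Time-slowness analysis rests on delay-and-sum beamforming, which is compact to sketch. The station coordinates, the wrap-around alignment via np.roll, and the power metric below are simplifications of our own.

```python
import numpy as np

def delay_and_sum(traces, coords, fs, slowness_grid):
    """Time-slowness beamforming: shift-and-stack array traces for each
    candidate horizontal slowness vector and return the beam power.

    traces: (n_sta, n_samp); coords: (n_sta, 2) offsets in km;
    slowness_grid: iterable of (sx, sy) pairs in s/km."""
    n_sta, n_samp = traces.shape
    power = []
    for sx, sy in slowness_grid:
        delays = coords @ np.array([sx, sy])       # seconds, per station
        beam = np.zeros(n_samp)
        for k in range(n_sta):
            shift = int(round(delays[k] * fs))
            beam += np.roll(traces[k], -shift)     # align (wrap-around is a
        power.append(np.mean((beam / n_sta) ** 2))  # sketch-level shortcut)
    return np.array(power)  # the peak locates the arrival's slowness/back-azimuth
```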
Tan, Francisca M.; Caballero-Gaudes, César; Mullinger, Karen J.; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L.; Francis, Susan T.; Gowland, Penny A.
2017-01-01
Most fMRI studies map task-driven brain activity using a block or event-related paradigm. Sparse Paradigm Free Mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of Activation Likelihood Estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the Sensorimotor Network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long durations and random inter-stimulus intervals. The decoding scores were considerably lower for eye movements relative to the other movement types tested. The average success rate for short and long motor events was 77 ± 13% and 74 ± 16% respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with sensitivity ranging between 55 and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this paper discusses methodological implications and improvements to increase the decoding performance. PMID:28815863
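The decoding step (summed overlap between a normalized event map and each meta-map) is essentially a dot product per label. The normalization choice below is our assumption; the ALE thresholding that defines the meta-maps is not reproduced.

```python
import numpy as np

def decoding_scores(spfm_map, meta_maps):
    """Score an SPFM event map against ALE meta-maps by summed overlap.

    spfm_map: 1-D array of voxel statistics for one detected event;
    meta_maps: dict of label -> ALE map on the same voxel grid. Maps are
    L2-normalised so scores are comparable across labels (our choice)."""
    s = spfm_map / (np.linalg.norm(spfm_map) + 1e-12)
    scores = {}
    for label, m in meta_maps.items():
        mm = m / (np.linalg.norm(m) + 1e-12)
        scores[label] = float(np.sum(s * mm))   # summation overlap
    return max(scores, key=scores.get), scores
```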
Detection of Epileptic Seizure Event and Onset Using EEG
Ahammad, Nabeel; Fathima, Thasneem; Joseph, Paul
2014-01-01
This study proposes a method for the automatic detection of epileptic seizure events and onset using wavelet-based features and certain statistical features obtained without wavelet decomposition. Normal and epileptic EEG signals were classified using a linear classifier. For seizure event detection, the Bonn University EEG database was used. Three types of EEG signals (recorded from healthy volunteers with eyes open, from epilepsy patients in the epileptogenic zone during a seizure-free interval, and from epilepsy patients during epileptic seizures) were classified. Features such as energy, entropy, standard deviation, maximum, minimum, and mean at different subbands were computed, and classification was done using the linear classifier. The performance of the classifier was determined in terms of specificity, sensitivity, and accuracy. The overall accuracy was 84.2%. For seizure onset detection, the CHB-MIT scalp EEG database was used. Along with the wavelet-based features, the interquartile range (IQR) and mean absolute deviation (MAD) without wavelet decomposition were extracted. Latency was used to study the performance of seizure onset detection. The classifier gave a sensitivity of 98.5% with an average latency of 1.76 seconds. PMID:24616892
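Subband features of the kind listed can be computed with a discrete wavelet transform and fed to a linear classifier. The sketch below assumes the PyWavelets and scikit-learn libraries; the wavelet choice, decomposition level, and exact feature list are our guesses at a setup consistent with the abstract.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def wavelet_features(epoch, wavelet="db4", level=5):
    """Energy, entropy and simple statistics per wavelet subband
    (feature list modelled loosely on the abstract)."""
    feats = []
    for c in pywt.wavedec(epoch, wavelet, level=level):
        e = c ** 2
        p = e / (e.sum() + 1e-12)
        feats += [e.sum(),                          # subband energy
                  -(p * np.log(p + 1e-12)).sum(),   # subband entropy
                  c.std(), c.max(), c.min(), c.mean()]
    return np.array(feats)

# Usage with labelled EEG epochs (1-D arrays of equal length):
# X = np.array([wavelet_features(ep) for ep in epochs])
# clf = LinearDiscriminantAnalysis().fit(X, labels)   # a linear classifier
```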
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra from the captured and fused features. Hotelling's T² statistic is calculated, and its significance is used to detect outliers. A contamination alarm is triggered by sequential Bayesian analysis when outliers appear continuously across several observations. The effectiveness of the proposed procedure is tested online using a pilot-scale setup and experimental data.
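The detection chain lends itself to a compact sketch: wavelet-transform each spectrum, project onto a few principal components learned from clean-water baselines, and score new observations with Hotelling's T². Everything below (the coif3 wavelet, five components, the 99% chi-squared control limit, and the random placeholder data) is an illustrative assumption, not the authors' calibrated settings.

    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from scipy.stats import chi2

    def dwt_features(spectrum, wavelet="coif3", level=3):
        return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

    baseline = np.random.randn(200, 256)   # stand-in for clean-water UV spectra
    W = np.vstack([dwt_features(s) for s in baseline])
    pca = PCA(n_components=5).fit(W)

    def t2_score(spectrum):
        """Hotelling's T^2 of one observation in the retained subspace."""
        z = pca.transform(dwt_features(spectrum)[None, :])[0]
        return np.sum(z ** 2 / pca.explained_variance_)

    limit = chi2.ppf(0.99, df=5)           # approximate control limit
    is_outlier = t2_score(np.random.randn(256)) > limit

An alarm would then be raised only after outliers persist over several consecutive observations, mirroring the sequential Bayesian step described above.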
Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo
2005-06-01
Investigations of the validity of labeling of genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods on foreign-made processed foods made from corn and potato, purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods: MON810 was detected in 11 of the 12 samples, and Bt11 in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the qualitatively GM-positive maize samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%; for this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one sample of 21 potato-processed foods, although GM potatoes were not detected in any sample.
Conjugate LEP Events at Palmer Station, Antarctica: Hemisphere-Dependent Timing
NASA Astrophysics Data System (ADS)
Kim, D.; Moore, R. C.
2016-12-01
During March 2015, a large number of lightning-induced electron precipitation (LEP) events were simultaneously observed using very low frequency (VLF) receivers in both the northern and southern hemispheres. After removing overlapping and unclear (or not well-defined) events, 22 conjugate LEP events remain and are used to statistically analyze the hemispheric dependence of LEP onset time. LEP events were detected in the northern hemisphere using the VLF remote sensing method by tracking the NAA transmitter signal (24.0 kHz, Cutler, Maine) at Tuscaloosa, Alabama. In the southern hemisphere, the NPM transmitter signal (21.4 kHz, Lualualei, Hawaii) is tracked at Palmer Station, Antarctica. In each case, the GLD360 dataset from Vaisala is used to determine the hemisphere of the causative lightning flash, and this is compared with the hemisphere in which the LEP event is detected first. Onset times and onset durations can, however, be calculated using a number of different methods. In this paper, we compare and contrast the onset times and durations calculated with several such methods, each applied to the same 22 conjugate LEP events.
High infrasonic goniometry applied to the detection of a helicopter in a high activity environment
NASA Astrophysics Data System (ADS)
Chritin, Vincent; Van Lancker, Eric; Wellig, Peter; Ott, Beat
2016-10-01
A current concern of armasuisse is the feasibility of a fixed or mobile acoustic surveillance and recognition sensor network allowing permanent monitoring of the noise immissions of a wide range of aerial activities, such as civil or military aviation, and of other acoustic events such as transients and subsonic or sonic booms. This objective requires the ability to detect, localize, and recognize a wide range of potential acoustic events of interest, possibly in the presence of parasitic acoustic events (for example, natural and industrial events on the ground) and high background noise (for example, close to urban or high-activity areas). This article presents a general discussion and conclusions on this problem, based on 20 years of experience totaling a dozen research programs or internal investigations by IAV, illustrated by one central experimental case study carried out within the framework of an armasuisse research program.
Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng
2015-01-01
Switching between different alternative polyadenylation (APA) sites plays an important role in the fine-tuning of gene expression. New technologies for 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length; in theory, the linear trend test is only effective in detecting such simple changes. We classify the switching events into four patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for about one quarter of all switching events observed between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both tests in practice: first, the independence test is used to detect APA site switching; second, the linear trend test is invoked to identify simple switching events; and third, complex switching events are identified as those that pass independence testing but fail linear trend testing. PMID:25875641
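The contrast between the two tests can be reproduced on a toy 2 × k table of 3'-end read counts per APA site. The counts below are invented so that usage in sample B shifts toward the middle site, a complex switch that the independence test flags but a linear trend test misses (SciPy for the chi-squared test, the trend test coded directly).

    import numpy as np
    from scipy.stats import chi2, chi2_contingency

    counts = np.array([[120,  40, 130],   # sample A: reads per APA site
                       [ 60, 150,  70]])  # sample B (proximal -> distal)

    chi2_ind, p_ind, _, _ = chi2_contingency(counts)  # independence test

    # Cochran-Armitage-style linear trend test with site scores 0, 1, 2
    s = np.arange(counts.shape[1], dtype=float)
    n = counts.sum()
    row, col = counts.sum(axis=1), counts.sum(axis=0)
    num = n * (counts[0] * s).sum() - row[0] * (col * s).sum()
    var = row[0] * row[1] * (n * (col * s**2).sum() - (col * s).sum()**2) / n
    p_trend = chi2.sf(num**2 / var, df=1)

    print(p_ind, p_trend)  # small p_ind with large p_trend -> complex switch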
Hata, Akihiko; Katayama, Hiroyuki; Kojima, Keisuke; Sano, Shoichi; Kasuga, Ikuro; Kitajima, Masaaki; Furumai, Hiroaki
2014-01-15
Rainfall events can introduce large amounts of microbial contaminants, including human enteric viruses, into surface water through intermittent discharges from combined sewer overflows (CSOs). The present study aimed to investigate the effect of rainfall events on viral loads in surface waters impacted by CSOs and the reliability of molecular methods for detection of enteric viruses. The reliability of virus detection in the samples was assessed using process controls for the virus concentration, nucleic acid extraction, and reverse transcription (RT)-quantitative PCR (qPCR) steps, which allowed accurate estimation of virus detection efficiencies. Recovery efficiencies of poliovirus in river water samples collected during rainfall events (<10%) were lower than those during dry weather conditions (>10%). The log10-transformed virus concentration efficiency was negatively correlated with suspended solids concentration (r² = 0.86), which increased significantly during rainfall events. Efficiencies of the DNA extraction and qPCR steps, determined with adenovirus type 5 and a primer-sharing control, respectively, were lower in dry weather. However, no clear relationship was observed between organic water quality parameters and the efficiencies of these two steps. Observed concentrations of indigenous enteric adenoviruses, GII noroviruses, enteroviruses, and Aichi viruses increased during rainfall events even though the virus concentration efficiency was presumed to be lower than in dry weather. The present study highlights the importance of using appropriate process controls to accurately evaluate the concentrations of waterborne enteric viruses in natural waters impacted by wastewater discharge, stormwater, and CSOs. © 2013.
Detection and interpretation of seismoacoustic events at German infrasound stations
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Koch, Karl; Ceranna, Lars
2016-04-01
Three infrasound arrays with collocated or nearby seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources, such as meteoroids as well as industrial and mining activity, generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources, such as earthquakes and explosions, generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events and helps characterize seismic events by distinguishing man-made from natural origins. Seismoacoustic studies also help improve the modelling of infrasound propagation and ducting in the atmosphere and allow the portion of energy coupled into the ground and into the air by seismoacoustic sources to be quantified. An overview of different seismoacoustic sources and their detection by German infrasound stations is presented within this study, together with some conclusions on the benefit of a combined seismoacoustic analysis.
Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain
NASA Astrophysics Data System (ADS)
Krauß, Thomas; Fischer, Peter
2016-08-01
In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News reports show that the majority of natural hazards that occur are flood events, and many flood prediction systems have accordingly been developed. Most existing systems for deriving areas endangered by flooding, however, are based only on horizontal and vertical distances to existing rivers and lakes; they typically do not take into account dangers arising directly from heavy rain. In a study we conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classifying digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and evaluate the results using the available insurance data.
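The abstract does not spell out the three classification methods, but a plausible building block for heavy-rain hazard mapping from a DEM alone is to flag local depressions where rainwater would pond. The sketch below uses the standard fill-minus-DEM recipe via morphological reconstruction (scikit-image), with an invented terrain and threshold; it is a generic technique, not the authors' classifier.

    import numpy as np
    from skimage.morphology import reconstruction

    def ponding_depth(dem):
        """Water depth each cell could hold once local depressions fill."""
        seed = dem.copy()
        seed[1:-1, 1:-1] = dem.max()   # flood the interior, keep the border
        filled = reconstruction(seed, dem, method="erosion")
        return filled - dem            # > 0 inside depressions (sinks)

    dem = np.random.rand(100, 100)           # placeholder elevation grid
    endangered = ponding_depth(dem) > 0.05   # threshold is illustrative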
Comparison of Event Detection Methods for Centralized Sensor Networks
NASA Technical Reports Server (NTRS)
Sauvageon, Julien; Agogino, Alice M.; Farhang, Ali; Tumer, Irem Y.
2006-01-01
The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a major concern, and smart sensor networks are one of the promising technologies attracting considerable attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared with regard to their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to assess whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared to simpler methods.
Seismic Characterization of the Newberry and Cooper Basin EGS Sites
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.
2015-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS sites: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog; MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Abe, K.; Haga, K.; Hayato, Y.; Ikeda, M.; Iyogi, K.; Kameda, J.; Kishimoto, Y.; Miura, M.; Moriyama, S.; Nakahata, M.; Nakajima, T.; Nakano, Y.; Nakayama, S.; Orii, A.; Sekiya, H.; Shiozawa, M.; Takeda, A.; Tanaka, H.; Tasaka, S.; Tomura, T.; Akutsu, R.; Kajita, T.; Kaneyuki, K.; Nishimura, Y.; Richard, E.; Okumura, K.; Labarga, L.; Fernandez, P.; Blaszczyk, F. d. M.; Gustafson, J.; Kachulis, C.; Kearns, E.; Raaf, J. L.; Stone, J. L.; Sulak, L. R.; Berkman, S.; Nantais, C. M.; Tobayama, S.; Goldhaber, M.; Kropp, W. R.; Mine, S.; Weatherly, P.; Smy, M. B.; Sobel, H. W.; Takhistov, V.; Ganezer, K. S.; Hartfiel, B. L.; Hill, J.; Hong, N.; Kim, J. Y.; Lim, I. T.; Park, R. G.; Himmel, A.; Li, Z.; O’Sullivan, E.; Scholberg, K.; Walter, C. W.; Ishizuka, T.; Nakamura, T.; Jang, J. S.; Choi, K.; Learned, J. G.; Matsuno, S.; Smith, S. N.; Friend, M.; Hasegawa, T.; Ishida, T.; Ishii, T.; Kobayashi, T.; Nakadaira, T.; Nakamura, K.; Oyama, Y.; Sakashita, K.; Sekiguchi, T.; Tsukamoto, T.; Suzuki, A. T.; Takeuchi, Y.; Yano, T.; Cao, S. V.; Hiraki, T.; Hirota, S.; Huang, K.; Jiang, M.; Minamino, A.; Nakaya, T.; Patel, N. D.; Wendell, R. A.; Suzuki, K.; Fukuda, Y.; Itow, Y.; Suzuki, T.; Mijakowski, P.; Frankiewicz, K.; Hignight, J.; Imber, J.; Jung, C. K.; Li, X.; Palomino, J. L.; Santucci, G.; Wilking, M. J.; Yanagisawa, C.; Fukuda, D.; Ishino, H.; Kayano, T.; Kibayashi, A.; Koshio, Y.; Mori, T.; Sakuda, M.; Xu, C.; Kuno, Y.; Tacik, R.; Kim, S. B.; Okazawa, H.; Choi, Y.; Nishijima, K.; Koshiba, M.; Totsuka, Y.; Suda, Y.; Yokoyama, M.; Bronner, C.; Calland, R. G.; Hartz, M.; Martens, K.; Marti, Ll.; Suzuki, Y.; Vagins, M. R.; Martin, J. F.; Tanaka, H. A.; Konaka, A.; Chen, S.; Wan, L.; Zhang, Y.; Wilkes, R. J.; The Super-Kamiokande Collaboration
2016-10-01
We report the results from a search in Super-Kamiokande for neutrino signals coincident with the first detected gravitational-wave events, GW150914 and GW151226, as well as LVT151012, using a neutrino energy range from 3.5 MeV to 100 PeV. We searched for coincident neutrino events within a time window of ±500 s around each gravitational-wave detection time. Four neutrino candidates were found for GW150914, and no candidates were found for GW151226. The neutrino candidates are consistent with the expected background events. We calculated the 90% confidence level upper limits on the combined neutrino fluence for both gravitational-wave events, which depend on event energy and topology. Considering the upward-going muon data set (1.6 GeV–100 PeV), the neutrino fluence limit for each gravitational-wave event is 14–37 (19–50) cm⁻² for muon neutrinos (muon antineutrinos), depending on the zenith angle of the event. In the other data sets, the combined fluence limits for both gravitational-wave events range from 2.4 × 10⁴ to 7.0 × 10⁹ cm⁻².
The detection and analysis of point processes in biological signals
NASA Technical Reports Server (NTRS)
Anderson, D. J.; Correia, M. J.
1977-01-01
A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
Wang, Jingbo; Templeton, Dennise C.; Harris, David B.
2015-07-30
Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that can then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2 and M0.8. The increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, aiding the long-term sustainability and monitoring of managed underground reservoirs.
Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes
Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun
2014-01-01
We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the low-level feature descriptors commonly used in previous work, the LNND descriptor has two major advantages. First, it efficiently incorporates spatial and temporal contextual information around a video event, which is important for detecting anomalous interactions among multiple events, whereas most existing feature descriptors only contain information about a single event. Second, the LNND descriptor is a compact representation whose dimensionality is typically much lower than that of low-level feature descriptors. Therefore, not only are computation time and storage requirements reduced by using the LNND descriptor in anomaly detection methods with offline training, but the drawbacks of high-dimensional feature descriptors are also avoided. We validate the effectiveness of the LNND descriptor through extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worth noting that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, yet achieves comparable or even better performance. PMID:25105164
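Stripped of the spatio-temporal context that makes the LNND descriptor distinctive, its core scoring idea reduces to a distance-to-normal-data measure, which the following hedged sketch illustrates with scikit-learn; feature extraction from video events is assumed to happen upstream.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    normal_feats = np.random.randn(5000, 32)   # descriptors of normal events
    nn = NearestNeighbors(n_neighbors=5).fit(normal_feats)

    def anomaly_score(event_feat):
        """Mean distance to the k nearest normal descriptors."""
        dist, _ = nn.kneighbors(event_feat[None, :])
        return dist.mean()                     # large distance => anomalous

    score = anomaly_score(np.random.randn(32))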
NASA Astrophysics Data System (ADS)
Sugioka, H.; Suyehiro, K.; Shinohara, M.
2009-12-01
Hydroacoustic monitoring by the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification system utilizes hydrophone stations and seismic stations, called T-phase stations, for worldwide detection. Signals of natural origin include those from earthquakes, submarine volcanic eruptions, and whale calls; artificial sources include non-nuclear explosions and air-gun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify their sources. As a number of seafloor cable networks are operated offshore the Japanese islands, largely facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure sensors, hydrophones, and seismic sensors) may be used to verify and increase the capability of the IMS. We use these data to compare selected event parameters with those of the Pacific IMS stations over the period from 2004 to the present. These anomalous examples, together with dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for detection, location, and characterization of anomalous signals. A seafloor cable network composed of three hydrophones and six seismometers, together with a temporary dense seismic array, detected and located hydroacoustic events offshore the Japanese islands on 12 March 2008, as had been reported by the IMS. We detected not only the hydroacoustic waves reverberating between the sea surface and the sea bottom but also the seismic waves traveling through the crust in association with the events. The determined source of the seismic waves is nearly coincident with that of the hydroacoustic waves, suggesting that the seismic waves are converted very close to the origin of the hydroacoustic source. On 16 March 2009 we detected signals very similar to those associated with the 12 March 2008 event.
2010-09-01
MULTIPLE-ARRAY DETECTION, ASSOCIATION AND LOCATION OF INFRASOUND AND SEISMO-ACOUSTIC EVENTS – UTILIZATION OF GROUND TRUTH INFORMATION. Stephen J. ... seismic and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the initial year of the project, automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location ...
NASA Technical Reports Server (NTRS)
Deutschman, W. A. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Detection of short-lived events has continued. Forest fires, oil spills, vegetation damage, volcanoes, storm ridges, earthquakes, and floods have been detected and analyzed.
Detections of Planets in Binaries Through the Channel of Chang–Refsdal Gravitational Lensing Events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Cheongho; Shin, In-Gu; Jung, Youn Kil
Chang–Refsdal (C–R) lensing, which refers to the gravitational lensing of a point mass perturbed by a constant external shear, provides a good approximation in describing the lensing behavior of either a very wide or a very close binary lens. C–R lensing events, which are identified by short-term anomalies near the peak of high-magnification lensing light curves, are routinely detected in lensing surveys, but not much attention has been paid to them. In this paper, we point out that C–R lensing events provide an important channel to detect planets in binaries, both in close and in wide binary systems. Detecting planets through the C–R lensing event channel is possible because the planet-induced perturbation occurs in the same region as the C–R lensing-induced anomaly, and thus the existence of the planet can be identified by the additional deviation in the central perturbation. By presenting an analysis of the actually observed C–R lensing event OGLE-2015-BLG-1319, we demonstrate that dense and high-precision coverage of a C–R lensing-induced perturbation can provide a strong constraint on the existence of a planet over a wide range of planet parameters. An enlarged sample of microlensing planets in binary systems will provide important observational constraints for shaping the details of planet formation theory, which has so far been restricted to the case of single stars.
Meteor studies in the framework of the JEM-EUSO program
NASA Astrophysics Data System (ADS)
Abdellaoui, G.; Abe, S.; Acheli, A.; Adams, J. H.; Ahmad, S.; Ahriche, A.; Albert, J.-N.; Allard, D.; Alonso, G.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Aouimeur, W.; Arai, Y.; Arsene, N.; Asano, K.; Attallah, R.; Attoui, H.; Ave Pernas, M.; Bacholle, S.; Bakiri, M.; Baragatti, P.; Barrillon, P.; Bartocci, S.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, A.; Belov, K.; Benadda, B.; Benmessai, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Bisconti, F.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Boudaoud, R.; Bozzo, E.; Briggs, M. S.; Bruno, A.; Caballero, K. S.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Capel, F.; Caramete, A.; Caramete, L.; Carlson, P.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellina, A.; Castellini, G.; Catalano, C.; Catalano, O.; Cellino, A.; Chikawa, M.; Chiritoi, G.; Christl, M. J.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Di Martino, M.; Djemil, T.; Djenas, S. A.; Dulucq, F.; Dupieux, M.; Dutan, I.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Eser, J.; Fang, K.; Fenu, F.; Fernández-González, S.; Fernández-Soriano, J.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Fouka, M.; Franceschi, A.; Franchini, S.; Fuglesang, C.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; García-Ortega, E.; Garipov, G.; Gascón, E.; Geary, J.; Gelmini, G.; Genci, J.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guehaz, R.; Guzmán, A.; Hachisu, Y.; Haiduc, M.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Hidber, W.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Isgrò, F.; Itow, Y.; Jammer, T.; Joven, E.; Judd, E. G.; Jung, A.; Jochum, J.; Kajino, F.; Kajino, T.; Kalli, S.; Kaneko, I.; Kang, D.; Kanouni, F.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Kedadra, A.; Khales, H.; Khrenov, B. A.; Kim, Jeong-Sook; Kim, Soon-Wook; Kim, Sug-Whan; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lahmar, H.; Lakhdari, F.; Larsson, O.; Lee, J.; Licandro, J.; Lim, H.; López Campano, L.; Maccarone, M. C.; Mackovjak, S.; Mahdi, M.; Maravilla, D.; Marcelli, L.; Marcos, J. L.; Marini, A.; Martens, K.; Martín, Y.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Matthews, J. N.; Mebarki, N.; Medina-Tanco, G.; Mehrad, L.; Mendoza, M. A.; Merino, A.; Mernik, T.; Meseguer, J.; Messaoud, S.; Micu, O.; Mimouni, J.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Nadji, B.; Nagano, M.; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Nardelli, A.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Painter, W.; Panasyuk, M. I.; Panico, B.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perdichizzi, M.; Pérez-Grande, I.; Perfetto, F.; Peter, T.; Picozza, P.; Pierog, T.; Pindado, S.; Piotrowski, L. W.; Piraino, S.; Placidi, L.; Plebaniak, Z.; Pliego, S.; Pollini, A.; Popescu, E. M.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Rabanal, J.; Radu, A. A.; Rahmani, M.; Reardon, P.; Reyes, M.; Rezazadeh, M.; Ricci, M.; Rodríguez Frías, M. 
D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez Cano, G.; Sagawa, H.; Sahnoune, Z.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sanchez, J. C.; Sánchez, J. L.; Santangelo, A.; Santiago Crúz, L.; Sanz-Andrés, A.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Sledd, J.; Słomińska, K.; Sobey, A.; Stan, I.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tahi, H.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Talai, M. C.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Traïche, M.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Vankova, G.; Vigorito, C.; Villaseñor, L.; Vlcek, B.; von Ballmoos, P.; Vrabel, M.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J., Jr.; Weber, M.; Weigand Muñoz, R.; Weindl, A.; Weiler, T. J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, S.; Young, R.; Zgura, I. S.; Zotov, M. Yu.; Zuccaro Marchi, A.
2017-09-01
We summarize the state of the art of a program of UV observations from space of meteor phenomena, a secondary objective of the JEM-EUSO international collaboration. Our preliminary analysis indicates that JEM-EUSO, taking advantage of its large FOV and good sensitivity, should be able to detect meteors down to an absolute magnitude close to 7. This means that JEM-EUSO should be able to record a statistically significant flux of meteors, including both sporadic ones and events produced by different meteor streams. Being unaffected by adverse weather conditions, JEM-EUSO can also be a very important facility for the detection of bright meteors and fireballs, as these events can be detected even in conditions of very high sky background. Moreover, for bright events exhibiting some persistence of the meteor train, preliminary simulations show that it should be possible to exploit the motion of the ISS itself and derive at least a rough 3D reconstruction of the meteor trajectory. The observing strategy developed to detect meteors may also be applied to the detection of nuclearites, exotic particles whose existence has been suggested by some theoretical investigations. Nuclearites are expected to move at higher velocities than meteoroids and to exhibit a wider range of possible trajectories, including particles moving upward after crossing the Earth. Some pilot studies, including the approved Mini-EUSO mission, a precursor of JEM-EUSO, are currently operational or in preparation. We are performing simulations to assess the performance of Mini-EUSO for meteor studies, while a few meteor events have already been detected using the ground-based facility EUSO-TA.
Using waveform cross correlation for automatic recovery of aftershock sequences
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail
2017-04-01
Aftershock sequences of the largest earthquakes are difficult to recover. There can be several hundred mid-sized aftershocks per hour, within a few hundred km of each other, recorded by the same stations. Moreover, these events generate thousands of reflected/refracted phases with azimuth and slowness close to those of the P-waves. Aftershock sequences with thousands of events therefore represent a major challenge for automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Standard methods of detection and phase association do not use all the information contained in the signals. As a result, wrong association of first and later phases, both regular and site-specific, produces an enormous number of wrong event hypotheses and destroys valid event hypotheses in automatic IDC processing. In turn, IDC analysts have to reject false hypotheses and recreate valid ones, wasting precious human resources. At the current level of IDC catalogue completeness, the method of waveform cross correlation (WCC) can resolve most detection and association problems by fully utilizing the similarity of waveforms generated by aftershocks. Array seismic stations of the International Monitoring System (IMS) can enhance the performance of the WCC method: they reduce station-specific detection thresholds, allow accurate estimation of signal attributes, including relative magnitude, and effectively suppress irrelevant arrivals. We have developed and tested a prototype of an aftershock tool matching all IDC processing requirements and merged it with the current IDC pipeline. This tool includes the creation of master events consisting of real or synthetic waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC traces into event hypotheses; building of events matching the IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross correlation standard event list (XSEL) is a starting point for interactive analysis with standard tools. We present selected results for the biggest earthquakes, such as Sumatra 2004 and Tohoku 2011, as well as for several smaller events with hundreds of aftershocks. The sensitivity and resolution of the aftershock tool are demonstrated on the example of an mb = 2.2 aftershock found after the 9 September 2016 DPRK test.
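At its core, the WCC step slides a master-event template along the continuous record and declares a detection wherever the normalized correlation coefficient exceeds a threshold. The sketch below shows that kernel for a single channel; real IDC/IMS processing works on arrays and many channels and adds conflict resolution, and the 0.7 threshold is purely illustrative.

    import numpy as np

    def cc_detect(trace, template, threshold=0.7):
        """Return (sample index, CC value) where normalized CC > threshold."""
        m = len(template)
        t = (template - template.mean()) / (template.std() * m)
        detections = []
        for i in range(len(trace) - m + 1):
            w = trace[i:i + m]
            s = w.std()
            if s == 0:
                continue
            cc = np.dot(t, (w - w.mean()) / s)   # normalized CC in [-1, 1]
            if cc > threshold:
                detections.append((i, cc))
        return detections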
Characterization of GM events by insert knowledge adapted re-sequencing approaches
Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing
2013-01-01
Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions found compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well. PMID:24088728
Transient Events in Archival Very Large Array Observations of the Galactic Center
NASA Astrophysics Data System (ADS)
Chiti, Anirudh; Chatterjee, Shami; Wharton, Robert; Cordes, James; Lazio, T. Joseph W.; Kaplan, David L.; Bower, Geoffrey C.; Croft, Steve
2016-12-01
The Galactic center has some of the highest stellar densities in the Galaxy and a range of interstellar scattering properties, which may aid in the detection of new radio-selected transient events. Here, we describe a search for radio transients in the Galactic center using over 200 hr of archival data from the Very Large Array at 5 and 8.4 GHz. Every observation of Sgr A* from 1985 to 2005 has been searched using an automated processing and detection pipeline sensitive to transients with timescales between 30 s and 5 minutes, with a typical detection threshold of ˜100 mJy. Eight possible candidates pass tests designed to filter out false positives from radio-frequency interference, calibration errors, and imaging artifacts. Two events are identified as promising candidates based on the smoothness of their light curves. Despite the high quality of their light curves, these detections remain suspect due to evidence of incomplete subtraction of the complex structure in the Galactic center and the apparent contingency of one detection on the reduction routines. Events of this intensity (˜100 mJy) and duration (˜100 s) are not obviously associated with known astrophysical sources, and no counterparts are found in data at other wavelengths. We consider potential sources, including Galactic center pulsars, dwarf stars, sources like GCRT J1745-3009, and bursts from X-ray binaries. None can fully explain the observed transients, suggesting either a new astrophysical source or a subtle imaging artifact. More sensitive multiwavelength studies are necessary to characterize these events, which, if real, occur at a rate of 14 (+32/−12) hr⁻¹ deg⁻² in the Galactic center.
Singh, Monika; Bhoge, Rajesh K; Randhawa, Gurinderjit
2018-04-20
Background: Confirming the integrity of seed samples in powdered form is important prior to conducting a genetically modified organism (GMO) test. Rapid onsite methods may provide a technological solution for checking genetically modified (GM) events at ports of entry. In India, Bt cotton is the commercialized GM crop, with four approved GM events; however, 59 GM events have been approved globally. GMO screening is required to test for authorized GM events. The identity and amplifiability of test samples can be ensured by first employing endogenous genes as an internal control. Objective: A rapid onsite detection method was developed for an endogenous reference gene of cotton, stearoyl acyl carrier protein desaturase (Sad1), employing visual and real-time loop-mediated isothermal amplification (LAMP). Methods: The assays were performed at a constant temperature of 63°C for 30 min for visual LAMP and 62°C for 40 min for real-time LAMP. Positive amplification was visualized as a change in color from orange to green on addition of SYBR® Green, or detected as real-time amplification curves. Results: The specificity of the LAMP assays was confirmed using a set of 10 samples. The LOD for visual LAMP was down to 0.1%, detecting 40 target copies, and for real-time LAMP down to 0.05%, detecting 20 target copies. Conclusions: The developed methods can be used to confirm the integrity of seed powder prior to conducting a GMO test for specific GM events of cotton. Highlights: LAMP assays for the endogenous Sad1 gene of cotton have been developed for use as an internal control for onsite GMO testing in cotton.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dudleson, B.; Arnold, M.; McCann, D.
Rapid detection of unexpected drilling events requires continuous monitoring of drilling parameters. A major R&D program by a drilling contractor has led to the introduction of a computerized monitoring system on its offshore rigs. The system includes advanced color graphics displays and new smart alarms to help both contractor and operator personnel detect and observe drilling events before they would normally be apparent with conventional rig instrumentation. This article describes a module of this monitoring system that uses expert system technology to detect the earliest stages of drillstring washouts. Field results demonstrate the effectiveness of the smart alarm incorporated in the system. Early detection allows the driller to react before a twist-off results in expensive fishing operations.
Gravitational Microlensing Events as a Target for the SETI project
NASA Astrophysics Data System (ADS)
Rahvar, Sohrab
2016-09-01
The detection of signals from a possible extrasolar technological civilization is one of the most challenging efforts of science. In this work, we propose using natural telescopes made of single or binary gravitational lensing systems to magnify the leakage of electromagnetic signals from a remote planet that harbors Extraterrestrial Intelligent (ETI) technology. Currently, gravitational microlensing surveys monitor a large area of the Galactic bulge to search for microlensing events, finding more than 2000 events per year. These lenses are capable of playing the role of natural telescopes and, in some instances, can magnify radio-band signals from planets orbiting the source stars of gravitational microlensing systems. Assuming that the frequencies of electromagnetic waves used for telecommunication by ETIs are similar to ours, we propose follow-up observation of microlensing events with radio telescopes such as the Square Kilometre Array (SKA), the Low Frequency Demonstrators, and the Mileura Wide-Field Array. Amplifying signals from the leakage of broadcasting by an Earth-like civilization would allow us to detect such a civilization as far away as the center of the Milky Way galaxy. Our analysis shows that in binary microlensing systems the probability of amplification of signals from ETIs is higher than in single-lens microlensing events. Finally, we propose using the target-of-opportunity mode for follow-up observations of binary microlensing events with the SKA as a new observational program for searching for ETIs. Using optimistic values for the factors of the Drake equation yields a detection rate of about one event per year.
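For scale, the standard point-lens magnification formula (a textbook result, not specific to this paper) shows how strongly such a natural telescope can amplify a signal: A(u) = (u² + 2) / (u √(u² + 4)), where u is the lens-source separation in Einstein radii.

    import numpy as np

    def magnification(u):
        """Point-lens magnification; u in units of the Einstein radius."""
        return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

    print(magnification(0.01))   # a close approach, u = 0.01, gives ~100x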
Spitzer Observes Neutron Star Collision
2017-10-16
NASA's Spitzer Space Telescope has provisionally detected the faint afterglow of the explosive merger of two neutron stars in the galaxy NGC 4993. The event, labeled GW170817, was initially detected in gravitational waves and gamma rays. Subsequent observations by dozens of telescopes have monitored its afterglow across the entire spectrum of light. The event is located about 130 million light-years from Earth. Spitzer's observation on September 29, 2017, came late in the game, just over 6 weeks after the event was first seen. But if this weak detection is verified, it will play an important role in helping astronomers understand how many of the heaviest elements in the periodic table are created in explosive neutron star mergers. The left panel is a color composite of the 3.6 and 4.5 micron channels of the Spitzer IRAC instrument, rendered in cyan and red. The center panel is a median-filtered color composite showing a faint red dot at the known location of the event. The right panel shows the residual 4.5 micron data after subtracting out the light of the galaxy using an archival image that predates the event. An annotated version is at https://photojournal.jpl.nasa.gov/catalog/PIA21910
Strategies for automatic processing of large aftershock sequences
NASA Astrophysics Data System (ADS)
Kvaerna, T.; Gibbons, S. J.
2017-12-01
Aftershock sequences following major earthquakes present great challenges to seismic bulletin generation. The analyst resources needed to locate events increase with increased event numbers as the quality of underlying, fully automatic, event lists deteriorates. While current pipelines, designed a generation ago, are usually limited to single passes over the raw data, modern systems also allow multiple passes. Processing the raw data from each station currently generates parametric data streams that are later subject to phase-association algorithms which form event hypotheses. We consider a major earthquake scenario and propose to define a region of likely aftershock activity in which we will detect and accurately locate events using a separate, specially targeted, semi-automatic process. This effort may use either pattern detectors or more general algorithms that cover wider source regions without requiring waveform similarity. An iterative procedure to generate automatic bulletins would incorporate all the aftershock event hypotheses generated by the auxiliary process, and filter all phases from these events from the original detection lists prior to a new iteration of the global phase-association algorithm.
Detecting modification of biomedical events using a deep parsing approach
2012-01-01
Background: This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data come from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method: To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results: Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions: Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
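A hedged sketch of the shallow baseline is easy to write down: bag-of-words features from a ±3-token window around each trigger word, fed to a maximum entropy (logistic regression) classifier via scikit-learn. The training triples below are invented, and the deep-parser features are omitted.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def window(tokens, trigger_idx, k=3):
        """Tokens within k positions of the event trigger word."""
        lo, hi = max(0, trigger_idx - k), trigger_idx + k + 1
        return " ".join(tokens[lo:hi])

    examples = [  # (tokens, trigger index, label) -- invented data
        ("analysis of IkappaBalpha phosphorylation".split(), 3, "speculated"),
        ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negated"),
        ("IkappaBalpha phosphorylation was observed".split(), 1, "none")]
    X = [window(t, i) for t, i, _ in examples]
    y = [label for _, _, label in examples]

    maxent = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
    maxent.fit(X, y)
    print(maxent.predict([window("no phosphorylation was detected".split(), 1)]))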
Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion
NASA Astrophysics Data System (ADS)
Witten, B.; Shragge, J. C.
2016-12-01
The recent increased focus on small-magnitude seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal, hydraulic fracturing for oil and gas recovery, and geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for the increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. With large-N arrays, however, we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.
Pre-trained D-CNN models for detecting complex events in unconstrained videos
NASA Astrophysics Data System (ADS)
Robinson, Joseph P.; Fu, Yun
2016-05-01
Rapid event detection faces an emergent need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by the fact that pre-existing methods are complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective, and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis of different Convolutional Neural Networks (CNNs), both stand-alone and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architecture, training data, and number of outputs. For each CNN, we use frames sampled at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some interesting facts, with some combinations found to be complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
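The best-performing pooling scheme, max-pool within predetermined shot boundaries and then average-pool across shots, takes only a few lines; the frame features and shot boundaries below are placeholders standing in for a pre-trained CNN and a shot detector upstream.

    import numpy as np

    def video_descriptor(frame_feats, shot_starts):
        """frame_feats: (n_frames, d); shot_starts: first frame of each shot."""
        shots = np.split(frame_feats, shot_starts[1:])     # split at shot starts
        pooled = [s.max(axis=0) for s in shots if len(s)]  # max-pool per shot
        return np.mean(pooled, axis=0)                     # average-pool across shots

    feats = np.random.randn(300, 4096)   # e.g. 1-fps CNN features (placeholder)
    desc = video_descriptor(feats, np.array([0, 120, 210]))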
Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field
Jeong, Myeong-Hun; Duckham, Matt
2015-01-01
This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits, and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement, and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes’ coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance, and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672
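The flavor of the local, coordinate-free test each node can run is sketched below: a node compares its value with those of its graph neighbors, and a pass is signaled when the higher/lower pattern alternates at least four times around the neighbor cycle. This is a generic surface-network recipe consistent with the events defined above; it assumes each node knows a cyclic ordering of its neighbors, whereas the paper's algorithm derives the needed qualitative information without coordinates.

    def classify_node(value, neighbor_values_in_cycle):
        """Label a node peak/pit/pass from its neighbors' values only."""
        higher = [v > value for v in neighbor_values_in_cycle]
        if not any(higher):
            return "peak"
        if all(higher):
            return "pit"
        # count +/- alternations around the closed neighbor cycle
        changes = sum(higher[i] != higher[i - 1] for i in range(len(higher)))
        return "pass" if changes >= 4 else "regular"

    print(classify_node(5.0, [6.1, 4.2, 6.3, 4.0]))   # -> "pass"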
Following the geomagnetic activity: events on September and October (1999)
NASA Astrophysics Data System (ADS)
Blanco, J. J.; Hidalgo, M. A.; Rodríguez-Pacheco, J.; Medina, J.; Sequeiros, J.; Nieves-Chinchilla, T.
2006-12-01
On 21-22 October 1999 a very intense geomagnetic storm (Dst index: -237 nT) was detected. This event was associated with a high speed stream (HSS) and an interplanetary coronal mass ejection. Before and after this event, the interplanetary magnetic field showed an inversion, probably associated with Heliospheric Current Sheet (HCS) crossings. One month earlier (21-22 September) a strong geomagnetic storm (Dst index: -164 nT) was detected, and the solar wind conditions were similar to those observed in October, i.e., a magnetic cloud, an HSS, and HCS crossings. Nevertheless, the October event was stronger than the September one. We have compared both events in an attempt to clarify what caused the difference between them. This work has been supported by the Spanish Comisión Interministerial de Ciencia y Tecnología (CICYT), grants ESP2005-07290-C02-01 and ESP2006-08459, and by Madrid Autonomous Community / University of Alcala grant CAM-UAH 2005/007.
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
NASA Astrophysics Data System (ADS)
Gibbons, Steven J.; Kværna, Tormod; Harris, David B.; Dodge, Douglas A.
2016-04-01
Aftershock sequences following very large earthquakes present enormous challenges to near-realtime generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase association algorithms and a significant deterioration in the quality of underlying fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams which are then scanned by a phase association algorithm to form event hypotheses. We consider the scenario where a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located using a separate specially targeted semi-automatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid search algorithm which may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove over half of the original detections which could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Further reductions in the number of detections in the parametric data streams are likely using correlation and subspace detectors and/or empirical matched field processing.
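The phase-stripping step described above amounts to a simple filter: drop any detection whose arrival time is explained, within a tolerance, by a predicted phase from an already-located aftershock. A hedged Python sketch, with travel_time standing in for whichever travel-time predictor the pipeline uses:

    def filter_detections(detections, aftershocks, travel_time, tol=2.0):
        """detections: (station, time) pairs; aftershocks: (origin_time, hypocenter).
        Keep only detections not explained by any located aftershock."""
        kept = []
        for sta, t in detections:
            explained = any(abs(t - (t0 + travel_time(hypo, sta))) < tol
                            for t0, hypo in aftershocks)
            if not explained:
                kept.append((sta, t))
        return kept

The surviving detections would then be passed to a fresh iteration of the global phase-association algorithm.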
Picking vs Waveform based detection and location methods for induced seismicity monitoring
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2017-04-01
Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas operations, mining, and geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by a great many events with short inter-event times, which can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison, in terms of performance, between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at local scale. Although recent applications have proved promising, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections to one or many potential earthquake sources. SCAUTOLOC, on the other hand, is a more "conventional" method and is the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for regional and teleseismic applications, so its performance with microseismic data might be limited. We analyze the performance of the three methodologies on a synthetic dataset with realistic noise conditions as well as on the first hour of continuous waveform data, including the Ml 3.5 St. Gallen earthquake, recorded by a microseismic network deployed in the area. We finally compare the results obtained with all three methods against a manually revised catalogue.
Directional Antineutrino Detection
NASA Astrophysics Data System (ADS)
Safdi, Benjamin R.; Suerfu, Burkhant
2015-02-01
We propose the first event-by-event directional antineutrino detector using inverse beta decay (IBD) interactions on hydrogen, with potential applications including monitoring for nuclear nonproliferation, spatially mapping geoneutrinos, characterizing the diffuse supernova neutrino background and searching for new physics in the neutrino sector. The detector consists of adjacent and separated target and capture scintillator planes. IBD events take place in the target layers, which are thin enough to allow the neutrons to escape without scattering elastically. The neutrons are detected in the thicker boron-loaded capture layers. The location of the IBD event and the momentum of the positron are determined by tracking the positron's trajectory through the detector. Our design is a straightforward modification of existing antineutrino detectors; a prototype could be built with existing technology.
Automated sleep scoring and sleep apnea detection in children
NASA Astrophysics Data System (ADS)
Baraglia, David P.; Berryman, Matthew J.; Coussens, Scott W.; Pamula, Yvonne; Kennedy, Declan; Martin, A. James; Abbott, Derek
2005-12-01
This paper investigates the automated detection of a patient's breathing rate and heart rate from their skin conductivity, as well as sleep stage scoring and breathing event detection from their EEG. The software developed for these tasks is tested on data sets obtained from the sleep disorders unit at the Adelaide Women's and Children's Hospital. The sleep scoring and breathing event detection tasks used neural networks to achieve signal classification. The Fourier transform and the Higuchi fractal dimension were used to extract features for input to the neural network. The filtered skin conductivity appeared visually to bear a similarity to the breathing and heart rate signals, but a more detailed evaluation showed the relation was not consistent. Sleep stage classification was achieved with an accuracy of around 65%, with some stages being accurately scored and others poorly scored. The two breathing events, hypopnea and apnea, were scored with varying degrees of accuracy, with the highest scores being around 75% and 30%.
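The Higuchi fractal dimension mentioned above is straightforward to compute from a raw EEG window. A hedged Python sketch of the standard Higuchi construction; the kmax parameter and implementation details are ours, not necessarily those used in the study:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal x (numpy array).

    For each scale k, average the normalized curve length over the k
    possible starting offsets, then fit log L(k) against log(1/k);
    the slope of the fit is the fractal dimension.
    """
    n = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        lk = 0.0
        for m in range(k):
            idx = np.arange(m, n, k)              # subsampled curve
            if len(idx) < 2:
                continue
            d = np.abs(np.diff(x[idx])).sum()     # raw curve length
            norm = (n - 1) / ((len(idx) - 1) * k) # Higuchi normalization
            lk += d * norm / k
        lengths.append(lk / k)                    # average over offsets
    ks = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope
```

Features like this one and Fourier band powers would then be stacked into the input vector of the classifying neural network.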
GMDD: a database of GMO detection methods
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-01-01
Background Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. Results The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to the database, and newly submitted information is released soon after being checked. Conclusion GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier. PMID:18522755
Piezoelectric-based self-powered electronic adjustable impulse switches
NASA Astrophysics Data System (ADS)
Rastegar, Jahangir; Kwok, Philip
2018-03-01
Novel piezoelectric-based self-powered impulse-detecting switches are presented. The switches are designed to detect shock loading events resulting in acceleration or deceleration above prescribed levels and durations. The prescribed acceleration level and duration thresholds are adjustable, and the switches are provided with false-trigger protection logic. Electronic and logic circuitry detects prescribed impulse events while rejecting others, such as high-amplitude but short-duration shocks, and transportation vibration and similar low-amplitude, relatively long-duration events. The switches can be mounted directly onto electronic circuit boards, thereby significantly simplifying the electrical and electronic circuitry, simplifying assembly and reducing total cost, significantly reducing the occupied volume, and, in some applications, eliminating the need for physical wiring to and from the impulse switches. The design of prototypes and testing under realistic conditions are presented.
Sprites and Early ionospheric VLF perturbations
NASA Astrophysics Data System (ADS)
Haldoupis, Christos; Amvrosiadi, Nino; Cotts, Ben; van der Velde, Oscar; Chanrion, Olivier; Neubert, Torsten
2010-05-01
Past studies have shown a correlation between sprites and early VLF perturbations, but the reported correlation varies widely from ~50% to 100%. The present study resolves these large discrepancies by analyzing several case studies of sprite and narrowband VLF observations in which multiple transmitter-receiver VLF links with great circle paths (GCPs) passing near a sprite-producing thunderstorm were available. In this setup, the multiple links act in a complementary way that makes the detection of early VLF perturbations much more probable compared to a single VLF link, which can miss several of them, a fact that was overlooked in past studies. The evidence shows that sprites are accompanied by early VLF perturbations in a one-to-one correspondence. This implies that the sprite generation mechanism may also cause sub-ionospheric conductivity disturbances that produce early VLF events. However, the one-to-one "sprite to early" event relationship, if viewed conversely as "early to sprite", appears not to be always reciprocal, because the number of early events detected in some cases was considerably larger than the number of sprites. Since the great majority of the early events not accompanied by sprites were caused by positive cloud-to-ground (+CG) lightning discharges, it is possible that sprites or sprite halos were concurrently present in these events as well but were missed by the sprite-watch detection system. Resolving this possibility will require more studies using highly sensitive optical systems capable of detecting weaker sprites, sprite halos and elves.
Characterization of Large Structural Genetic Mosaicism in Human Autosomes
Machiela, Mitchell J.; Zhou, Weiyin; Sampson, Joshua N.; Dean, Michael C.; Jacobs, Kevin B.; Black, Amanda; Brinton, Louise A.; Chang, I-Shou; Chen, Chu; Chen, Constance; Chen, Kexin; Cook, Linda S.; Crous Bou, Marta; De Vivo, Immaculata; Doherty, Jennifer; Friedenreich, Christine M.; Gaudet, Mia M.; Haiman, Christopher A.; Hankinson, Susan E.; Hartge, Patricia; Henderson, Brian E.; Hong, Yun-Chul; Hosgood, H. Dean; Hsiung, Chao A.; Hu, Wei; Hunter, David J.; Jessop, Lea; Kim, Hee Nam; Kim, Yeul Hong; Kim, Young Tae; Klein, Robert; Kraft, Peter; Lan, Qing; Lin, Dongxin; Liu, Jianjun; Le Marchand, Loic; Liang, Xiaolin; Lissowska, Jolanta; Lu, Lingeng; Magliocco, Anthony M.; Matsuo, Keitaro; Olson, Sara H.; Orlow, Irene; Park, Jae Yong; Pooler, Loreall; Prescott, Jennifer; Rastogi, Radhai; Risch, Harvey A.; Schumacher, Fredrick; Seow, Adeline; Setiawan, Veronica Wendy; Shen, Hongbing; Sheng, Xin; Shin, Min-Ho; Shu, Xiao-Ou; VanDen Berg, David; Wang, Jiu-Cun; Wentzensen, Nicolas; Wong, Maria Pik; Wu, Chen; Wu, Tangchun; Wu, Yi-Long; Xia, Lucy; Yang, Hannah P.; Yang, Pan-Chyr; Zheng, Wei; Zhou, Baosen; Abnet, Christian C.; Albanes, Demetrius; Aldrich, Melinda C.; Amos, Christopher; Amundadottir, Laufey T.; Berndt, Sonja I.; Blot, William J.; Bock, Cathryn H.; Bracci, Paige M.; Burdett, Laurie; Buring, Julie E.; Butler, Mary A.; Carreón, Tania; Chatterjee, Nilanjan; Chung, Charles C.; Cook, Michael B.; Cullen, Michael; Davis, Faith G.; Ding, Ti; Duell, Eric J.; Epstein, Caroline G.; Fan, Jin-Hu; Figueroa, Jonine D.; Fraumeni, Joseph F.; Freedman, Neal D.; Fuchs, Charles S.; Gao, Yu-Tang; Gapstur, Susan M.; Patiño-Garcia, Ana; Garcia-Closas, Montserrat; Gaziano, J. Michael; Giles, Graham G.; Gillanders, Elizabeth M.; Giovannucci, Edward L.; Goldin, Lynn; Goldstein, Alisa M.; Greene, Mark H.; Hallmans, Goran; Harris, Curtis C.; Henriksson, Roger; Holly, Elizabeth A.; Hoover, Robert N.; Hu, Nan; Hutchinson, Amy; Jenab, Mazda; Johansen, Christoffer; Khaw, Kay-Tee; Koh, Woon-Puay; Kolonel, Laurence N.; Kooperberg, Charles; Krogh, Vittorio; Kurtz, Robert C.; LaCroix, Andrea; Landgren, Annelie; Landi, Maria Teresa; Li, Donghui; Liao, Linda M.; Malats, Nuria; McGlynn, Katherine A.; McNeill, Lorna H.; McWilliams, Robert R.; Melin, Beatrice S.; Mirabello, Lisa; Peplonska, Beata; Peters, Ulrike; Petersen, Gloria M.; Prokunina-Olsson, Ludmila; Purdue, Mark; Qiao, You-Lin; Rabe, Kari G.; Rajaraman, Preetha; Real, Francisco X.; Riboli, Elio; Rodríguez-Santiago, Benjamín; Rothman, Nathaniel; Ruder, Avima M.; Savage, Sharon A.; Schwartz, Ann G.; Schwartz, Kendra L.; Sesso, Howard D.; Severi, Gianluca; Silverman, Debra T.; Spitz, Margaret R.; Stevens, Victoria L.; Stolzenberg-Solomon, Rachael; Stram, Daniel; Tang, Ze-Zhong; Taylor, Philip R.; Teras, Lauren R.; Tobias, Geoffrey S.; Viswanathan, Kala; Wacholder, Sholom; Wang, Zhaoming; Weinstein, Stephanie J.; Wheeler, William; White, Emily; Wiencke, John K.; Wolpin, Brian M.; Wu, Xifeng; Wunder, Jay S.; Yu, Kai; Zanetti, Krista A.; Zeleniuch-Jacquotte, Anne; Ziegler, Regina G.; de Andrade, Mariza; Barnes, Kathleen C.; Beaty, Terri H.; Bierut, Laura J.; Desch, Karl C.; Doheny, Kimberly F.; Feenstra, Bjarke; Ginsburg, David; Heit, John A.; Kang, Jae H.; Laurie, Cecilia A.; Li, Jun Z.; Lowe, William L.; Marazita, Mary L.; Melbye, Mads; Mirel, Daniel B.; Murray, Jeffrey C.; Nelson, Sarah C.; Pasquale, Louis R.; Rice, Kenneth; Wiggs, Janey L.; Wise, Anastasia; Tucker, Margaret; Pérez-Jurado, Luis A.; Laurie, Cathy C.; Caporaso, Neil E.; Yeager, Meredith; Chanock, Stephen J.
2015-01-01
Analyses of genome-wide association study (GWAS) data have revealed that detectable genetic mosaicism involving large (>2 Mb) structural autosomal alterations occurs in a fraction of individuals. We present results for a set of 24,849 genotyped individuals (total GWAS set II [TGSII]) in whom 341 large autosomal abnormalities were observed in 168 (0.68%) individuals. Merging data from the new TGSII set with data from two prior reports (the Gene-Environment Association Studies and the total GWAS set I) generated a large dataset of 127,179 individuals; we then conducted a meta-analysis to investigate the patterns of detectable autosomal mosaicism (n = 1,315 events in 925 [0.73%] individuals). Restricting to events >2 Mb in size, we observed an increase in event frequency as event size decreased. The combined results underscore that the rate of detectable mosaicism increases with age (p value = 5.5 × 10^-31) and is higher in men (p value = 0.002) but lower in participants of African ancestry (p value = 0.003). In a subset of 47 individuals from whom serial samples were collected up to 6 years apart, complex changes were noted over time and showed an overall increase in the proportion of mosaic cells as age increased. Our large combined sample allowed for a unique ability to characterize detectable genetic mosaicism involving large structural events and strengthens the emerging evidence of non-random erosion of the genome in the aging population. PMID:25748358
Popok, David W; West, Christopher R; Hubli, Michele; Currie, Katharine D; Krassioukov, Andrei V
2017-02-01
Cardiovascular disease is one of the leading causes of morbidity and mortality in the spinal cord injury (SCI) population. SCI may disrupt autonomic cardiovascular homeostasis, which can lead to persistent hypotension, irregular diurnal rhythmicity, and the development of autonomic dysreflexia (AD). There is currently no software available in the clinical setting to perform automated detection and evaluation of cardiovascular autonomic dysfunction from 24 h ambulatory blood pressure monitoring (ABPM) recordings. The objective of this study is to compare the efficacy of a novel 24 h ABPM Autonomic Dysfunction Detection Software against manual detection, and to use the software to demonstrate the relationships between level of injury and the degree of autonomic cardiovascular impairment in a large cohort of individuals with SCI. A total of 46 individuals with cervical (group 1, n = 37) or high thoracic (group 2, n = 9) SCI participated in the study. Outcome measures included the frequency and severity of AD, the frequency of hypotensive events, and diurnal variations in blood pressure and heart rate. There was good agreement between the software and manual detection of AD events (Bland-Altman limits of agreement = ±1.458 events). Individuals with cervical SCI presented with more frequent (p = 0.0043) and more severe (p = 0.0343) AD than those with high thoracic SCI, and exhibited higher systolic and diastolic blood pressure during the night and lower heart rate during the day. In conclusion, our ABPM AD Detection Software was as effective as manual detection in detecting the frequency and severity of AD and hypotensive events, suggesting that this software can be used in the clinical setting to expedite ABPM analyses.
High speed point derivative microseismic detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uhl, J.E.; Warpinski, N.R.; Whetten, E.B.
A high speed microseismic event detector constructed in accordance with the present invention uses a point derivative comb to quickly and accurately detect microseismic events. Compressional and shear waves impinging upon microseismic receiver stations disposed to collect waves are converted into digital data and analyzed using a point derivative comb, including assurance of quiet periods prior to declaration of microseismic events. If a sufficient number of quiet periods have passed, the square of a two-point derivative of the incoming digital signal is compared to a trip level threshold exceeding the determined noise level to declare a valid trial event. The squaring of the derivative emphasizes the differences between noise and signal, and a valid event is preferably declared when the trip threshold has been exceeded over a temporal comb width, realizing a comb over a given time period. Once a trial event has been declared, the event is verified through a spatial comb, which applies the temporal event comb to additional stations. The detector according to the present invention quickly and accurately detects initial compressional waves indicative of a microseismic event, which typically exceed the ambient cultural noise level by a small amount, and distinguishes them from subsequent larger-amplitude shear waves. 9 figs.
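Stripped of the patent prose, the temporal detection logic could look roughly as follows. This is our reading of the description, with an illustrative comb criterion (a minimum number of threshold crossings within the comb width) standing in for the patented construction:

```python
import numpy as np

def point_derivative_detector(x, trip_level, comb_width, quiet_len, min_teeth=3):
    """Sketch of a squared two-point-derivative event detector.

    A trial event is declared at sample i when (1) the preceding quiet_len
    samples stayed below the trip level (assured quiet period) and (2) the
    squared two-point derivative exceeds the trip level at least min_teeth
    times within the following comb_width samples.
    """
    d2 = np.diff(x) ** 2                      # squaring separates signal from noise
    hot = d2 > trip_level
    events = []
    i = quiet_len
    while i < len(hot) - comb_width:
        quiet = not hot[i - quiet_len:i].any()
        if quiet and hot[i] and hot[i:i + comb_width].sum() >= min_teeth:
            events.append(i)
            i += comb_width                   # skip past the declared event
        else:
            i += 1
    return events
```

The spatial comb of the patent would then re-run the same temporal test on neighboring stations before confirming the trial event.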
Pulse Detecting Genetic Circuit - A New Design Approach.
Noman, Nasimul; Inniss, Mara; Iba, Hitoshi; Way, Jeffrey C
2016-01-01
A robust cellular counter could enable synthetic biologists to design complex circuits with diverse behaviors. The existing synthetic-biological counters, responsive to the beginning of the pulse, are sensitive to the pulse duration. Here we present a pulse detecting circuit that responds only at the falling edge of a pulse-analogous to negative edge triggered electric circuits. As biological events do not follow precise timing, use of such a pulse detector would enable the design of robust asynchronous counters which can count the completion of events. This transcription-based pulse detecting circuit depends on the interaction of two co-expressed lambdoid phage-derived proteins: the first is unstable and inhibits the regulatory activity of the second, stable protein. At the end of the pulse the unstable inhibitor protein disappears from the cell and the second protein triggers the recording of the event completion. Using stochastic simulation we showed that the proposed design can detect the completion of the pulse irrespective to the pulse duration. In our simulation we also showed that fusing the pulse detector with a phage lambda memory element we can construct a counter which can be extended to count larger numbers. The proposed design principle is a new control mechanism for synthetic biology which can be integrated in different circuits for identifying the completion of an event.
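The duration-insensitive, falling-edge behavior described above can be reproduced with a toy deterministic model. The paper uses stochastic simulation of the phage-derived proteins; the rates, thresholds, and simple Euler integration below are illustrative assumptions only:

```python
def simulate_pulse_detector(pulse_on, pulse_off, t_end=600.0, dt=0.1):
    """Toy ODE model of the falling-edge pulse detector: an unstable
    inhibitor A and a stable regulator B are co-expressed during the
    pulse; the output fires only when B is high while A has decayed,
    i.e. shortly after the pulse ends, regardless of pulse duration.
    """
    k_syn, deg_a, deg_b = 1.0, 0.2, 0.005   # A degrades ~40x faster than B
    a = b = 0.0
    fired_at = None
    for step in range(int(t_end / dt)):
        t = step * dt
        s = k_syn if pulse_on <= t < pulse_off else 0.0
        a += (s - deg_a * a) * dt            # unstable inhibitor
        b += (s - deg_b * b) * dt            # stable regulator
        if fired_at is None and b > 2.0 and a < 0.5:
            fired_at = t                     # falling edge detected
    return fired_at

# Both a 50 s and a 200 s pulse trigger the detector shortly after their
# respective falling edges, illustrating the duration-insensitivity.
```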
A novel CUSUM-based approach for event detection in smart metering
NASA Astrophysics Data System (ADS)
Zhu, Zhicheng; Zhang, Shuai; Wei, Zhiqiang; Yin, Bo; Huang, Xianqing
2018-03-01
Non-intrusive load monitoring (NILM) plays a significant role in raising consumer awareness of household electricity use and thereby reducing overall energy consumption. For monitoring low-power loads, many researchers have introduced CUSUM into NILM systems, since the traditional event detection method is not as effective as expected. Because the original CUSUM has limitations when a small shift remains below the threshold, we improve the test statistic by allowing the permissible deviation to rise gradually as the data size increases. This paper proposes a novel event detection method and a corresponding criterion that can be used in NILM systems to recognize transient states and to assist the labelling task. Its performance has been tested in a real scenario in which eight different appliances are connected to the main line of electric power.
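A hedged sketch of a one-sided CUSUM edge detector whose allowance term grows with the number of accumulated samples, in the spirit of the modification described above; the paper's exact statistic and parameter values may differ:

```python
def adaptive_cusum_events(power, k0, h, growth=0.01):
    """One-sided CUSUM over a power signal, with an allowance that grows
    with the number of samples since the last reset, so that slow drift
    is tolerated while genuine switching edges still trip the detector.
    """
    g, n, mu, events = 0.0, 0, power[0], []
    for t, x in enumerate(power[1:], start=1):
        n += 1
        k = k0 * (1.0 + growth * n)        # allowance grows with sample count
        g = max(0.0, g + x - mu - k)       # accumulate positive deviations
        if g > h:                          # appliance on/off edge declared
            events.append(t)
            g, n, mu = 0.0, 0, x           # reset statistic and reference
        else:
            mu = mu + (x - mu) / (n + 1)   # running mean of current segment
    return events
```

A mirrored statistic for negative shifts would catch switch-off events; only the positive side is shown here for brevity.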
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
NASA Technical Reports Server (NTRS)
Turso, James A.; Lawrence, Charles; Litt, Jonathan S.
2007-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
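A minimal sketch of a wavelet-based vibration feature of the kind described in these two reports, using the PyWavelets package; the wavelet family, decomposition depth, and energy feature are illustrative choices, not those of the authors:

```python
import numpy as np
import pywt

def wavelet_feature(accel, wavelet="db4", level=4):
    """Decompose an accelerometer window (1-D numpy array) with a discrete
    wavelet transform and return the energy in each detail band. A sudden
    FOD impact appears as a jump in the fine-scale detail energies even
    under heavy process and sensor noise.
    """
    coeffs = pywt.wavedec(accel, wavelet, level=level)
    # coeffs[0] is the coarse approximation; coeffs[1:] are detail bands
    return np.array([np.sum(c ** 2) for c in coeffs[1:]])

# Sliding this feature over the signal and thresholding its change in time
# localizes the event; fused with Kalman-filter health estimates it could
# support the information-fusion scheme described above.
```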
Local Explosion Monitoring using Rg
NASA Astrophysics Data System (ADS)
O'Rourke, C. T.; Baker, G. E.
2016-12-01
Rg is the high-frequency fundamental-mode Rayleigh wave, which is only excited by near-surface events. As such, an Rg detection indicates that a seismic source is shallow, generally less than a few km depending on the velocity structure, and so likely man-made. Conversely, the absence of Rg can indicate that the source is deeper and so likely naturally occurring. We have developed a new automated method of detecting Rg arrivals from various explosion sources at local distances, and a process for estimating the likelihood that a source is not shallow when no Rg is detected. Our Rg detection method scans the spectrogram of a seismic signal for a characteristic frequency peak. We test this on the Bighorn Arch Seismic Experiment data, which include earthquakes, active-source explosions in boreholes, and mining explosions recorded on a dense network that spans the Bighorn Mountains and Powder River Basin. The Rg passbands used were 0.4-0.8 Hz for mining blasts and 0.8-1.2 Hz for borehole shots. We successfully detect Rg across the full network for most mining blasts, and the lower-yield shots are detectable out to 50 km. We achieve a <1% false-positive rate for the small-magnitude earthquakes in the region. Rg detections on known non-shallow earthquake seismograms indicate that they are largely due to windowing leakage at very close distances or occasionally to cultural noise. We compare our results to existing methods that use cross-correlation to detect retrograde motion of the surface waves. Our method shows more complete detection across the network, especially in the Powder River Basin, where Rg exhibits prograde motion that does not trigger the existing detector. We also estimate the likelihood that Rg would have been detected from a surface source, based on the measured P amplitude. For example, an event with a large P wave and no detectable Rg would have a high probability of being a deeper event, whereas we cannot confidently determine whether an event with a small P wave and no Rg detection is shallow or not. These results allow us to detect Rg arrivals, which indicate a shallow source, and to use the absence of Rg to estimate the likelihood that a source in a calibrated region is not shallow enough to be man-made.
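The spectrogram-scanning idea can be sketched briefly. The detection statistic below (band power versus its median, with an SNR-style cutoff) is a stand-in for the authors' actual criterion:

```python
import numpy as np
from scipy.signal import spectrogram

def detect_rg(trace, fs, band=(0.4, 0.8), snr_min=3.0):
    """Scan a spectrogram for a characteristic Rg peak in the given
    passband (0.4-0.8 Hz for mining blasts in the abstract above).

    Returns True if any time slice shows in-band power exceeding snr_min
    times the median in-band power, an illustrative detection rule.
    """
    f, t, Sxx = spectrogram(trace, fs=fs, nperseg=int(20 * fs))
    in_band = (f >= band[0]) & (f <= band[1])
    band_power = Sxx[in_band].mean(axis=0)   # mean in-band power per slice
    noise = np.median(band_power)
    return bool((band_power > snr_min * noise).any())
```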
Schmidt; Fiorentino; Arkin; Laude
2000-08-01
A method for direct and continuous detection of ion motion during different perturbation events of the Fourier transform ion cyclotron resonance (FTICR) experiment is demonstrated. The modifications necessary to convert an ordinary FTICR cell into one capable of performing simultaneous excitation/detection (SED) using a capacitive network are outlined. With these modifications, a 200-fold reduction in the detection of the coupled excitation signal is achieved. This allows the unique ability not only to observe the response to the perturbation but also to observe the perturbation event itself. SED is used successfully to monitor the ion cyclotron transient during single-frequency excitation, remeasurement and exciter-excite experiments.
NASA Astrophysics Data System (ADS)
Tremsin, A. S.; Vallerga, J. V.; McPhate, J. B.; Siegmund, O. H. W.
2015-07-01
Many high resolution event counting devices process one event at a time and cannot register simultaneous events. In this article a frame-based readout event counting detector consisting of a pair of Microchannel Plates and a quad Timepix readout is described. More than 10^4 simultaneous events can be detected with a spatial resolution of 55 μm, while >10^3 simultaneous events can be detected with <10 μm spatial resolution when event centroiding is implemented. The fast readout electronics is capable of processing >1200 frames/sec, while the global count rate of the detector can exceed 5×10^8 particles/s when no timing information on every particle is required. For the first generation Timepix readout, the timing resolution is limited by the Timepix clock to 10-20 ns. Optimization of the MCP gain, rear field voltage and Timepix threshold levels is crucial for the device performance, and that is the main subject of this article. These devices can be very attractive for applications where photon/electron/ion/neutron counting with high spatial and temporal resolution is required, such as energy-resolved neutron imaging, time-of-flight experiments in lidar applications, experiments on photoelectron spectroscopy and many others.
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD, together with information on site signal detection thresholds, the type of solution algorithm used, and range attenuation, to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, the modeling technique gives an estimate of the number, strength, and distribution of events going undetected; this approach leads to a variety of event density contour maps and is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented, and a new method for producing an analytical representation of the empirical PCD is introduced.
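For one grid cell, the technique amounts to integrating the PCD over the peak-current classes that would satisfy the network's solution rule. A hedged sketch, with an illustrative 1/r attenuation model and a four-site solution rule standing in for the paper's parameters:

```python
def detection_efficiency(pcd_amps, pcd_probs, site_ranges_km, site_thresh,
                         atten=lambda amp, r: amp * (100.0 / r)):
    """Probability that a random flash at one grid cell yields a solution.

    pcd_amps/pcd_probs: discretized peak-current distribution (kA bins and
    their probabilities, summing to 1). site_ranges_km/site_thresh: ranges
    to each detector and their signal thresholds. atten: illustrative 1/r
    range-attenuation model; the paper's model may differ.
    """
    eff = 0.0
    for amp, p in zip(pcd_amps, pcd_probs):
        signals = [atten(amp, r) for r in site_ranges_km]
        n_detect = sum(s >= th for s, th in zip(signals, site_thresh))
        if n_detect >= 4:        # assumed solution rule: >= 4 participating sites
            eff += p             # this current class is detectable here
    return eff
```

Evaluating this over a grid of cells produces the detection-efficiency contours; summing p over the undetectable classes instead gives the undetected-event density maps mentioned above.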
Detection and attribution of extreme weather disasters
NASA Astrophysics Data System (ADS)
Huggel, Christian; Stone, Dáithí; Hansen, Gerrit
2014-05-01
Single disasters related to extreme weather events have caused loss and damage on the order of tens of billions of US dollars over the past years. Recent disasters have fueled the debate about whether and to what extent these events are related to climate change. In international climate negotiations disaster loss and damage is now high on the agenda, and related policy mechanisms have been discussed or are being implemented. In view of funding allocation and effective risk reduction strategies, detection of extreme weather events and disasters and their attribution to climate change is a key issue. Different avenues have so far been taken to address detection and attribution in this context. The physical climate sciences have developed approaches in which, among others, variables that are reasonably sampled over climatically relevant time periods and related to the meteorological characteristics of the extreme event are examined. Trends in these variables (e.g. air or sea surface temperatures) are compared between observations and climate simulations with and without anthropogenic forcing. Generally, progress has been made in recent years in attributing changes in the chance of some single extreme weather events to anthropogenic climate change, but important challenges remain. A different line of research is primarily concerned with losses related to extreme weather events over time, using disaster databases. A growing consensus is that the increase in asset values and in exposure are the main drivers of the strong increase in economic losses over the past several decades, and only a limited number of studies have found trends consistent with expectations from climate change. Here we propose a better integration of existing lines of research in detection and attribution of extreme weather events and disasters by applying a risk framework. Risk is thereby defined as a function of the probability of occurrence of an extreme weather event and the associated consequences, with consequences being a function of the intensity of the physical weather event, the exposure and value of assets, and vulnerabilities. We have examined selected major extreme events and disasters, including superstorm Sandy in 2012, the Pakistan floods and the Russian heat wave in 2010, the 2010 floods in Colombia and the 2011 floods in Australia. We systematically analyzed to what extent (anthropogenic) climate change may have contributed to the intensity and frequency of each event, along with changes in the other risk variables, to eventually reach a more comprehensive understanding of the relative role of climate change in recent loss and damage from extreme weather events.
Participation of the NDC Austria at the NDC Preparedness Exercise 2012
NASA Astrophysics Data System (ADS)
Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene
2013-04-01
NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to train the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the Iodine isotopes I-131 and I-133 (both particulates), and the Radioxenon Isotopes Xe-133, Xe-133M, Xe-131M and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), concentrations of all these six isotopes which would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentration of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • Atmospheric backtracking and data fusion to identify seismic candidate events, • Seismic analysis of candidate events within the possible source region, • Atmospheric transport modelling (forward mode) from identified candidate events, comparison between "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
Daytime identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2014-04-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather phenomena that have significant impact on the environment, property and populations. A new method, the hail detection tool (HDT), is described for identifying hail-bearing storms using multispectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the convective mask (CM) algorithm devised for detection of deep convection, and the second a hail mask (HM) algorithm for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HM are based on logistic regression models trained with multispectral MSG data sets comprising summer convective events in the middle Ebro Valley (Spain) between 2006 and 2010, detected by the RGB (red-green-blue) visualization technique (CM) or the C-band weather radar system of the University of León (HM). Through the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HM is computed from a proper selection of MSG wavelengths or their combinations. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to interpret the results of the statistical models from a meteorological perspective, following an ingredients-based method. Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall probability of detection was 76.9% and the false alarm ratio 16.7%.
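The two-phase cascade of logistic models can be sketched as follows; the feature matrices, the 0.5 CM cutoff, and the use of scikit-learn are our illustrative assumptions, not details from the paper:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_hdt(X_cm, y_cm, X_hm, y_hm):
    """Two-stage HDT-style cascade: a convective-mask (CM) model followed
    by a hail-mask (HM) model trained only on convective cases.

    X_*: matrices of MSG channel features (brightness temperatures and
    channel differences); y_cm marks deep convection, y_hm marks
    radar-confirmed hail among convective pixels.
    """
    cm = LogisticRegression(max_iter=1000).fit(X_cm, y_cm)
    hm = LogisticRegression(max_iter=1000).fit(X_hm, y_hm)

    def hail_probability(X):
        # Apply HM only where CM flags cumulonimbus, mirroring the cascade.
        p_cb = cm.predict_proba(X)[:, 1]
        p_hail = hm.predict_proba(X)[:, 1]
        return np.where(p_cb > 0.5, p_hail, 0.0)

    return hail_probability
```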
A novel real-time health monitoring system for unmanned vehicles
NASA Astrophysics Data System (ADS)
Zhang, David C.; Ouyang, Lien; Qing, Peter; Li, Irene
2008-04-01
Real-time monitoring of the status of in-service structures such as unmanned vehicles can provide invaluable information for detecting damage in time, so that the vehicles can be maintained and repaired promptly. One typical cause of damage to unmanned vehicles is impact, from bumping into obstacles or being hit by objects such as hostile fire. This paper introduces a novel impact event sensing system that can detect the location of impact events and their force-time histories. The system consists of a piezoelectric sensor network, the hardware platform and the analysis software. The customized battery-powered impact event sensing system supports up to 64-channel parallel data acquisition. It features an innovative low-power hardware trigger circuit that monitors all 64 channels simultaneously. The system is in sleep mode most of the time; when an impact event happens, it wakes up within microseconds and detects the impact location and the corresponding force-time history. The system can be combined with the SMART sensing system to further evaluate impact damage severity.
Big Data solution for CTBT monitoring: CEA-IDC joint global cross correlation project
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Bell, Randy; Brachet, Nicolas; Gaillard, Pierre; Kitov, Ivan; Rozhkov, Mikhail
2014-05-01
Waveform cross-correlation, when applied to historical datasets of seismic records, provides dramatic improvements in detection, location, and magnitude estimation of natural and manmade seismic events. With correlation techniques, the amplitude threshold of signal detection can be reduced globally by a factor of 2 to 3 relative to the currently standard beamforming and STA/LTA detectors. The gain in sensitivity corresponds to a body-wave magnitude reduction of 0.3 to 0.4 units and doubles the number of events meeting high quality requirements (e.g., detected by three or more seismic stations of the International Monitoring System (IMS)). This gain is crucial for seismic monitoring under the Comprehensive Nuclear-Test-Ban Treaty. The International Data Centre (IDC) dataset includes more than 450,000 seismic events, tens of millions of raw detections, and continuous seismic data from the primary IMS stations since 2000. This high-quality dataset is a natural candidate for an extensive cross-correlation study and the basis of further enhancements in monitoring capabilities; without this historical dataset recorded by the permanent IMS Seismic Network, such improvements would not be feasible. However, due to the mismatch between the volume of data and the performance of standard information technology infrastructure, it is impossible to process all the data within a tolerable elapsed time. To tackle this "Big Data" problem, the CEA/DASE is part of the French project "DataScale". One objective is to reanalyze 10 years of waveform data from the IMS network with the cross-correlation technique, thanks to a dedicated High Performance Computing (HPC) infrastructure operated by the Centre de Calcul Recherche et Technologie (CCRT) at the CEA of Bruyères-le-Châtel. Within 2 years we plan to enhance the detection and phase association algorithms (also using machine learning and automatic classification) and process about 30 terabytes of data provided by the IDC to update the world seismicity map. From the new events and those in the IDC Reviewed Event Bulletin, we will automatically create various sets of master-event templates that will be used for global event location by the CTBTO and CEA.
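The core template-matching operation behind such a cross-correlation study is simple to state. A minimal sketch; the 0.6 threshold and brute-force sliding window are illustrative, and production systems would use FFT-based correlation and network stacking instead:

```python
import numpy as np

def correlation_detector(master, stream, threshold=0.6):
    """Slide a master-event template over continuous data and flag windows
    whose normalized correlation coefficient exceeds a threshold. The
    matched-filter gain of the template is what lowers the detection
    threshold relative to STA/LTA processing.
    """
    m = (master - master.mean()) / master.std()
    n = len(m)
    detections = []
    for i in range(len(stream) - n):
        w = stream[i:i + n]
        s = w.std()
        if s == 0:
            continue
        cc = np.dot(m, (w - w.mean()) / s) / n   # normalized cross-correlation
        if cc > threshold:
            detections.append((i, cc))
    return detections
```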
Call, Rosemary J.; Burlison, Jonathan D.; Robertson, Jennifer J.; Scott, Jeffrey R.; Baker, Donald K.; Rossi, Michael G.; Howard, Scott C.; Hoffman, James M.
2014-01-01
Objective To investigate the use of a trigger tool for adverse drug event (ADE) detection in a pediatric hospital specializing in oncology, hematology, and other catastrophic diseases. Study design A medication-based trigger tool package analyzed electronic health records from February 2009 to February 2013. Chart review determined whether an ADE precipitated the trigger. Severity was assigned to ADEs, and preventability was assessed. Preventable ADEs were compared with the hospital’s electronic voluntary event reporting system to identify whether these ADEs had been previously identified. The positive predictive values (PPVs) of the entire trigger tool and individual triggers were calculated to assess their accuracy to detect ADEs. Results Trigger occurrences (n=706) were detected in 390 patients from six medication triggers, 33 of which were ADEs (overall PPV = 16%). Hyaluronidase had the highest PPV (60%). Most ADEs were category E harm (temporary harm) per the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) index. One event was category H harm (intervention to sustain life). Naloxone was associated with the most grade 4 ADEs per the Common Terminology Criteria for Adverse Events (CTCAE) v4.03. Twenty-one (64%) ADEs were preventable; 3 of which were submitted via the voluntary reporting system. Conclusion Most of the medication-based triggers yielded low PPVs. Refining the triggers based on patients’ characteristics and medication usage patterns could increase the PPVs and make them more useful for quality improvement. To efficiently detect ADEs, triggers must be revised to reflect specialized pediatric patient populations such as hematology and oncology patients. PMID:24768254
Bayesian Monitoring Systems for the CTBT: Historical Development and New Results
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Moore, D.
2016-12-01
A project at Berkeley, begun in 2009 in collaboration with CTBTO and more recently with LLNL, has reformulated the global seismic monitoring problem in a Bayesian framework. A first-generation system, NETVISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as a Monte Carlo inference algorithm. The probabilistic model allows for seamless integration of various disparate sources of information, including negative information (the absence of detections). Working from arrivals extracted by traditional station processing from International Monitoring System (IMS) data, NETVISA achieves a reduction of around 60% in the number of missed events compared with the currently deployed network processing system. It also finds many events that are missed by the human analysts who postprocess the IMS output. Recent improvements include the integration of models for infrasound and hydroacoustic detections and a global depth model for natural seismicity trained from ISC data. NETVISA is now fully compatible with the CTBTO operating environment. A second-generation model called SIGVISA extends NETVISA's generative model all the way from events to raw signal data, avoiding the error-prone bottom-up detection phase of station processing. SIGVISA's model automatically captures the phenomena underlying existing detection and location techniques such as multilateration, waveform correlation matching, and double-differencing, and integrates them into a global inference process that also (like NETVISA) handles de novo events. Initial results for the Western US in early 2008 (when the transportable US Array was operating) show that SIGVISA finds, from IMS data only, more than twice the number of events recorded in the CTBTO Late Event Bulletin (LEB). For mb 1.0-2.5, the ratio is more than 10; put another way, for this data set, SIGVISA lowers the detection threshold by roughly one magnitude compared to the LEB. The broader message of this work is that probabilistic inference based on a vertically integrated generative model that directly expresses geophysical knowledge can be a much more effective approach for interpreting scientific data than the traditional bottom-up processing pipeline.
Schwartz, Frank L; Vernier, Stanley J; Shubrook, Jay H; Marling, Cynthia R
2010-11-01
We have developed a prototypical case-based reasoning system to enhance management of patients with type 1 diabetes mellitus (T1DM). The system is capable of automatically analyzing large volumes of life events, self-monitoring of blood glucose readings, continuous glucose monitoring system results, and insulin pump data to detect clinical problems. In a preliminary study, manual entry of large volumes of life-event and other data was too burdensome for patients. In this study, life-event and pump data collection were automated, and then the system was reevaluated. Twenty-three adult T1DM patients on insulin pumps completed the five-week study. A usual daily schedule was entered into the database, and patients were only required to upload their insulin pump data to Medtronic's CareLink® Web site weekly. Situation assessment routines were run weekly for each participant to detect possible problems, and once the trial was completed, the case-retrieval module was tested. Using the situation assessment routines previously developed, the system found 295 possible problems. The enhanced system detected only 2.6 problems per patient per week compared to 4.9 problems per patient per week in the preliminary study (p=.017). Problems detected by the system were correctly identified in 97.9% of the cases, and 96.1% of these were clinically useful. With less life-event data, the system is unable to detect certain clinical problems and detects fewer problems overall. Additional work is needed to provide device/software interfaces that allow patients to provide this data quickly and conveniently. © 2010 Diabetes Technology Society.
Bolide Airbursts as a Seismic Source for the 2018 Mars InSight Mission
NASA Astrophysics Data System (ADS)
Stevanović, J.; Teanby, N. A.; Wookey, J.; Selby, N.; Daubar, I. J.; Vaubaillon, J.; Garcia, R.
2017-10-01
In 2018, NASA will launch InSight, a single-station suite of geophysical instruments designed to characterise the martian interior. We investigate the seismo-acoustic signal generated by a bolide entering the martian atmosphere and exploding in a terminal airburst, and assess this phenomenon as a potential observable for the SEIS seismic payload. Terrestrial analogue data from four recent events are used to identify diagnostic airburst characteristics in both the time and frequency domains. In order to estimate a potential number of detectable events for InSight, we first model the impactor source population from observations made on the Earth, scaled for planetary radius, entry velocity and source density. We go on to calculate a range of potential airbursts from the larger incident impactor population. We estimate there to be ~1000 events of this nature per year on Mars. To then derive a detectable number of airbursts for InSight, we scale this number according to atmospheric attenuation, air-to-ground coupling inefficiencies, and instrument capability for SEIS. We predict between 10-200 detectable events per year for InSight.
Gravitational Wave Detection of Compact Binaries Through Multivariate Analysis
NASA Astrophysics Data System (ADS)
Atallah, Dany Victor; Dorrington, Iain; Sutton, Patrick
2017-01-01
The first detection of gravitational waves (GW), GW150914, produced by a binary black hole merger, has ushered in the era of GW astronomy. The detection technique used to find GW150914 considered only a fraction of the information available describing the candidate event: mainly the detector signal-to-noise ratios and chi-squared values. In hopes of greatly increasing detection rates, we want to take advantage of all the information available about candidate events. We employ a technique called Multivariate Analysis (MVA) to improve LIGO's sensitivity to GW signals. MVA techniques are efficient ways to scan high-dimensional data spaces for signal/noise classification. Our goal is to use MVA to classify compact-object binary coalescence (CBC) events composed of any combination of black holes and neutron stars. CBC waveforms are modeled through numerical relativity, and templates of the modeled waveforms are used to search for CBCs and quantify candidate events. Different MVA pipelines are under investigation for CBC signals and un-modelled signals, with promising results. One such MVA pipeline, used for the un-modelled search, can theoretically analyze far more data than the MVA pipelines currently explored for CBCs, potentially making a more powerful classifier. In principle, this extra information could improve the sensitivity to GW signals. We will present the results from our efforts to adapt an MVA pipeline used in the un-modelled search to classify candidate events from the CBC search.
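As an illustration of the MVA idea, a candidate-event classifier can be trained on the full multivariate description of triggers rather than on SNR and chi-squared thresholds alone. A hedged sketch, with a random forest standing in for whichever MVA the pipelines actually use; the feature set is an assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_mva(features_signal, features_noise):
    """Train a signal/noise classifier for CBC candidate events.

    Each row collects a candidate's multivariate description (per-detector
    SNRs, chi-squared values, coincidence time offsets, ...); rows from
    simulated signal injections are labeled 1, background triggers 0.
    """
    X = np.vstack([features_signal, features_noise])
    y = np.concatenate([np.ones(len(features_signal)),
                        np.zeros(len(features_noise))])
    clf = RandomForestClassifier(n_estimators=200).fit(X, y)
    # clf.predict_proba(candidates)[:, 1] then ranks candidates by signal
    # likelihood instead of thresholding SNR and chi-squared separately.
    return clf
```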
Detecting Single-Nucleotide Substitutions Induced by Genome Editing.
Miyaoka, Yuichiro; Chan, Amanda H; Conklin, Bruce R
2016-08-01
The detection of genome editing is critical in evaluating genome-editing tools or conditions, but it is not an easy task to detect genome-editing events, especially single-nucleotide substitutions, without a surrogate marker. Here we introduce a procedure that significantly contributes to the advancement of genome-editing technologies. It uses droplet digital polymerase chain reaction (ddPCR) and allele-specific hydrolysis probes to detect single-nucleotide substitutions generated by genome editing (via homology-directed repair, or HDR). HDR events that introduce substitutions using donor DNA are generally infrequent, even with genome-editing tools, and the outcome is only one base-pair difference in the 3 billion base pairs of the human genome. The task is particularly difficult in induced pluripotent stem (iPS) cells, in which editing events can be very rare. Therefore, the technological advances described here have implications for therapeutic genome editing and experimental approaches to disease modeling with iPS cells. © 2016 Cold Spring Harbor Laboratory Press.
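The quantitative step behind ddPCR is Poisson statistics on droplet occupancy. A hedged sketch of how a rare HDR allele fraction could be estimated from droplet counts; probe chemistry and ambiguous-droplet ("rain") handling are outside its scope:

```python
import math

def ddpcr_fraction(pos_edit, pos_wt, total_droplets):
    """Estimate the edited-allele fraction from ddPCR droplet counts.

    Template molecules load into droplets approximately as a Poisson
    process, so the per-droplet concentration of each allele is
    lambda = -ln(1 - k/N) for k positive droplets out of N. The edited
    fraction is then lambda_edit / (lambda_edit + lambda_wt).
    """
    lam_edit = -math.log(1 - pos_edit / total_droplets)
    lam_wt = -math.log(1 - pos_wt / total_droplets)
    return lam_edit / (lam_edit + lam_wt)

# e.g. 15 edit-probe-positive droplets out of 20,000, against 9,000
# wild-type-positive droplets, gives an HDR allele fraction of ~0.13%.
```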
Real-time Automatic Detectors of P and S Waves Using Singular Values Decomposition
NASA Astrophysics Data System (ADS)
Kurzon, I.; Vernon, F.; Rosenberger, A.; Ben-Zion, Y.
2013-12-01
We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three-component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm with the modification that we filter the waveforms prior to the SVD, and then apply SNR (signal-to-noise ratio) detectors for picking the P and S arrivals on the new filtered, SVD-separated channels. A recent deployment in the San Jacinto Fault Zone (SJFZ) area provides a very dense seismic network that allows us to test the detection algorithm in diverse settings, such as events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm (e.g., rays propagating within the fault and recorded on linear arrays crossing the fault). We have found that a Butterworth band-pass filter of 2-30 Hz, with four poles at each corner frequency, performs best across a large variety of events and stations within the SJFZ. Using the SVD detectors we obtain a similar number of P and S picks, which is rarely achieved with ordinary SNR detectors. The same filter also performs very well in the actual real-time operation of the ANZA and SJFZ real-time seismic networks, tested on many events and several aftershock sequences in the region, from the MW 5.2 of June 2005 through the MW 5.4 of July 2010 to the MW 4.7 of March 2013. Here we show the results of testing the detectors on the most complex and intense aftershock sequence, the MW 5.2 of June 2005, in which events occurred at a rate of ~4 per minute during the very first hour. This aftershock sequence was thoroughly reviewed by several analysts, who identified 294 events in the first hour, located in a condensed cluster around the main shock. We used this hour of events to fine-tune the automatic SVD detection, association and location of the real-time system, reaching 37% automatic identification and location of events with a minimum of 10 stations per event; all events fall within the same condensed cluster, and there are no false events or large offsets in their locations. An ordinary SNR detector did not exceed an 11% success rate with a minimum of 8 stations per event, produced 2 false events, and yielded a wider spread of events (not within the reviewed cluster). One of the main advantages of the SVD detectors for real-time operations is the actual separation between the P and S components, which significantly reduces the noise in picks compared to ordinary SNR detectors. The new method has been applied to a significant number of events within the SJFZ over the past 8 years and is now in the final stage of real-time implementation at UCSD for the ANZA and SJFZ networks, tuned for automatic detection and location of local events.
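An offline approximation of the filter-then-SVD separation step can be sketched as follows. Rosenberger (2010) uses a real-time iterative SVD, whereas this illustration applies a batch SVD to sliding windows; the window length is an assumption:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def svd_separate(z, n, e, fs, win=2.0):
    """Band-pass 2-30 Hz (as in the abstract), then project each window of
    the three-component record onto its first singular vector (P-like,
    rectilinear motion) and the residual (S-like), yielding channels on
    which plain SNR pickers can run.
    """
    sos = butter(4, [2.0, 30.0], btype="bandpass", fs=fs, output="sos")
    X = np.vstack([sosfilt(sos, c) for c in (z, n, e)])   # 3 x nsamp
    w = int(win * fs)
    p_chan = np.zeros(X.shape[1])
    s_chan = np.zeros(X.shape[1])
    for i in range(0, X.shape[1] - w, w):
        seg = X[:, i:i + w]
        U, _, _ = np.linalg.svd(seg, full_matrices=False)
        u1 = U[:, 0]                      # dominant polarization direction
        p = u1 @ seg                      # P-like component
        p_chan[i:i + w] = p
        s_chan[i:i + w] = np.linalg.norm(seg - np.outer(u1, p), axis=0)
    return p_chan, s_chan
```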
Analyzing and Identifying Teens' Stressful Periods and Stressor Events From a Microblog.
Li, Qi; Xue, Yuanyuan; Zhao, Liang; Jia, Jia; Feng, Ling
2017-09-01
Increased health problems among adolescents caused by psychological stress have aroused worldwide attention. Long-standing stress without targeted assistance and guidance negatively impacts the healthy growth of adolescents, threatening the future development of our society. So far, research has focused on detecting adolescent psychological stress as revealed in individual microblog posts. However, beyond stressful moments, identifying teens' stressful periods and the stressor events that trigger each stressful period is more desirable for understanding the stress from appearance to essence. In this paper, we define the problem of identifying teens' stressful periods and stressor events from the open social media microblog. Starting from a case study of adolescents' posting behaviors during stressful school events, we build a Poisson-based probability model for the correlation between stressor events and stressful posting behaviors through a series of posts on Tencent Weibo (referred to as the microblog throughout the paper). With the model, we discover teens' maximal stressful periods and further extract details of possible stressor events that cause the stressful periods. We generalize and present the extracted stressor events in a hierarchy based on common stress dimensions and event types. Taking 122 scheduled stressful study-related events in a high school as the ground truth, we test the approach on 124 students' posts from January 1, 2012 to February 1, 2015 and obtain promising experimental results: (stressful periods: recall 0.761, precision 0.737, and F1-measure 0.734) and (top-3 stressor events: recall 0.763, precision 0.756, and F1-measure 0.759). The most prominent stressor events extracted are in the self-cognition domain, followed by the school-life domain. This conforms to findings in adolescent psychology that problems in school life are usually accompanied by teens' inner cognition problems. Compared with the state-of-the-art top-1 personal life event detection approach, our stressor event detection method is 13.72% higher in precision, 19.18% higher in recall, and 16.50% higher in F1-measure, demonstrating the effectiveness of the proposed framework.
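The Poisson-based correlation model lends itself to a simple per-day surprise score. A hedged sketch; the paper's model is richer, and the baseline rate and the merging of flagged days into maximal periods are left here as assumptions:

```python
from scipy.stats import poisson

def stressful_period_score(post_counts, baseline_rate):
    """How surprising is each day's volume of stress-revealing posts under
    a baseline Poisson rate estimated from calm periods?

    Returns per-day tail probabilities P(X >= count); consecutive
    low-probability days can then be merged into maximal stressful periods.
    """
    return [poisson.sf(c - 1, baseline_rate) for c in post_counts]

# Example: with a baseline of 0.5 stressful posts/day, a day with 4 such
# posts has P(X >= 4) ~= 0.0018, flagging it as part of a stressful period.
```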
Multimodal Sparse Coding for Event Detection
2015-10-13
classification tasks based on single modality. We present multimodal sparse coding for learning feature representations shared across multiple modalities...The shared representations are applied to multimedia event detection (MED) and evaluated in comparison to unimodal counterparts, as well as other...and video tracks from the same multimedia clip, we can force the two modalities to share a similar sparse representation whose benefit includes robust