Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio events detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent events detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events is not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer's theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
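A minimal sketch of the unsupervised audio branch described above, assuming a GMM ambience model over MFCC-like frame features; the placeholder data, component count, and percentile threshold are illustrative and are not the authors' implementation.

```python
# Sketch: fit a Gaussian Mixture Model on acoustic features of the normal
# ambience, then flag frames whose log-likelihood falls below a low
# percentile of the training scores. Random vectors stand in for real
# frame features (e.g., MFCCs).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ambience_feats = rng.normal(0.0, 1.0, size=(5000, 13))        # placeholder MFCC-like frames

gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(ambience_feats)

# Detection threshold: a low percentile of the normal-ambience likelihoods.
threshold = np.percentile(gmm.score_samples(ambience_feats), 1.0)

test_feats = np.vstack([rng.normal(0.0, 1.0, size=(200, 13)),
                        rng.normal(4.0, 1.0, size=(20, 13))])  # intrusion-like frames
unusual = gmm.score_samples(test_feats) < threshold
print("unusual frames:", int(unusual.sum()), "of", len(unusual))
```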
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to solve many problems in climate science and geoscience using massive observed and modeled datasets. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain superior performance that overshadows all previous handcrafted, expert-based methods. The issue, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two deep neural network models: (1) Convolutional Neural Networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model reconstructs the resolution of the input to the localization CNNs. We present the best networks using the pixel recursive super resolution model, which synthesize details of tropical cyclones in ground truth data while enhancing their resolution. Therefore, this approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process used to increase data resolution.
Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.
Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng
2016-12-08
This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control system (NCS) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve the efficient utilization of the communication network bandwidth. Both the sensor-to-control station and the control station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism, and the simultaneous design of FDF and controller.
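The rule below is a common relative-threshold sketch of the kind of event-triggering mechanism described above; the adaptation law (tightening the threshold when a fault is suspected so more samples are released to the network) and all constants are illustrative assumptions, not the paper's exact design.

```python
# Sketch of an event-triggered transmission rule with an adaptively
# adjusted triggering parameter sigma_k. The adaptation law here is illustrative.
import numpy as np

def should_transmit(x_k, x_last_sent, sigma_k):
    """Relative-error trigger: send when the deviation from the last
    transmitted sample exceeds sigma_k times the current signal energy."""
    e = x_k - x_last_sent
    return float(e @ e) > sigma_k * float(x_k @ x_k)

def adapt_sigma(sigma_k, fault_suspected, sigma_min=0.01, sigma_max=0.5, rate=0.8):
    """Shrink sigma (transmit more often) when a fault is suspected,
    relax it back toward sigma_max otherwise."""
    sigma_k = sigma_k * rate if fault_suspected else sigma_k / rate
    return float(np.clip(sigma_k, sigma_min, sigma_max))

rng = np.random.default_rng(0)
x_sent, sigma = np.zeros(3), 0.2
for k in range(50):
    x = rng.normal(size=3) * (3.0 if 20 <= k < 30 else 1.0)   # "fault" burst
    if should_transmit(x, x_sent, sigma):
        x_sent = x                                            # sample released to the network
    sigma = adapt_sigma(sigma, fault_suspected=(x @ x) > 9.0)
```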
Network hydraulics inclusion in water quality event detection using multiple sensor stations data.
Oliker, Nurit; Ostfeld, Avi
2015-09-01
Event detection is one of the current most challenging topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines multiple sensor stations data with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work is aimed at decreasing this limitation through integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort through discovering events with lower signatures by exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Model Based Verification of Cyber Range Event Environments
2015-12-10
Damodaran, Suresh K. (MIT Lincoln Laboratory, Lexington, MA, USA)
... apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment ... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error ...
Initial Evaluation of Signal-Based Bayesian Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Russell, S.
2016-12-01
We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means for providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low SNR microseismic signals often cannot effectively be detected by routine methods. To solve this problem, this paper presents permutation entropy and a support vector machine to detect low SNR microseismic events. First, an extraction method of signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, the detection model of low SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation for the collected vibration signals, constructing a feature vector set of signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which has the features of high classification accuracy and fast real-time algorithms, can meet the requirements of online, real-time extractions of microseismic events.
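A minimal sketch of the feature pipeline described above: permutation entropy is computed on coarse-grained copies of the signal, and the resulting multi-scale vector feeds an SVM (a standard SVC stands in for the least-squares SVM used in the paper). Embedding dimension, scales, and the synthetic records are illustrative.

```python
# Sketch: multi-scale permutation entropy (MPE) features plus an SVM for
# separating microseismic events from noise.
import math
from itertools import permutations

import numpy as np
from sklearn.svm import SVC

def permutation_entropy(x, m=4, tau=1):
    """Normalized permutation entropy of order m."""
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - (m - 1) * tau):
        counts[tuple(np.argsort(x[i:i + m * tau:tau]))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(m)))

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def mpe_features(x, scales=range(1, 6), m=4):
    return np.array([permutation_entropy(coarse_grain(x, s), m=m) for s in scales])

rng = np.random.default_rng(1)
noise = [rng.normal(size=2000) for _ in range(20)]
events = [np.sin(np.linspace(0, 60, 2000)) + 0.3 * rng.normal(size=2000) for _ in range(20)]
X = np.vstack([mpe_features(x) for x in noise + events])
y = np.array([0] * 20 + [1] * 20)                     # 0 = noise, 1 = microseismic event
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```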
Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-03-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Detecting event-related changes in organizational networks using optimized neural network models.
Li, Ze; Sun, Duoyong; Zhu, Renqi; Lin, Zihan
2017-01-01
An organization's external behavior changes are caused by its internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks can be used to efficiently monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers this correlation and can be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes can be effective in providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods using two cases. The results suggested that the proposed method could identify organizational events based on a correlation between the organizational networks and events. The results also suggested that the proposed method not only has higher precision but also better robustness than the previously used techniques.
Detecting earthquakes over a seismic network using single-station similarity measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-06-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
Accelerometer and Camera-Based Strategy for Improved Human Fall Detection.
Zerrouki, Nabil; Harrou, Fouzi; Sun, Ying; Houacine, Amrane
2016-12-01
In this paper, we address the problem of detecting human falls using anomaly detection. Detection and classification of falls are based on accelerometric data and variations in human silhouette shape. First, we use the exponentially weighted moving average (EWMA) monitoring scheme to detect a potential fall in the accelerometric data. We used the EWMA to identify features that correspond with a particular type of fall, allowing us to classify falls. Only features corresponding with detected falls were used in the classification phase. Using a subset of the original data to design classification models minimizes training time and simplifies the models. Based on features corresponding to detected falls, we used the support vector machine (SVM) algorithm to distinguish between true falls and fall-like events. We apply this strategy to the publicly available fall detection databases from the University of Rzeszów. Results indicated that our strategy accurately detected and classified fall events, suggesting its potential application to early alert mechanisms in the event of fall situations and its capability for classification of detected falls. Comparison of the classification results using the EWMA-based SVM classifier method with those achieved using three commonly used machine learning classifiers, neural network, K-nearest neighbor and naïve Bayes, proved our model superior.
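A minimal sketch of the EWMA monitoring stage on an accelerometer magnitude signal, with control limits estimated from fall-free reference data; the smoothing constant, limit width, and synthetic signals are illustrative, and the subsequent SVM classification stage is omitted.

```python
# Sketch of EWMA-based change monitoring: samples whose EWMA leaves the
# control band flag a potential fall.
import numpy as np

def ewma_alarms(signal, mu0, sigma0, lam=0.2, L=3.0):
    """Indices where the EWMA of the signal leaves the +/- L*sigma control band."""
    sd = sigma0 * np.sqrt(lam / (2.0 - lam))   # steady-state EWMA standard deviation
    z, alarms = mu0, []
    for t, x in enumerate(signal):
        z = lam * x + (1 - lam) * z
        if abs(z - mu0) > L * sd:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(0)
baseline = rng.normal(1.0, 0.05, 2000)                 # ~1 g magnitude while standing/walking
mu0, sigma0 = baseline.mean(), baseline.std()
test = np.concatenate([rng.normal(1.0, 0.05, 500),
                       rng.normal(2.5, 0.4, 20),       # impact-like burst
                       rng.normal(1.0, 0.05, 200)])
alarms = ewma_alarms(test, mu0, sigma0)
print("first potential-fall sample:", alarms[0] if alarms else None)
```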
Generalized Detectability for Discrete Event Systems
Shu, Shaolong; Lin, Feng
2011-01-01
In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432
Event-driven simulation in SELMON: An overview of EDSE
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.
1992-01-01
EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
Secure access control and large scale robust representation for online multimedia event detection.
Liu, Changyu; Lu, Bin; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches.
Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.
Housh, Mashor; Ohar, Ziv
2017-03-01
The Fault Detection (FD) Problem in control theory concerns of monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for the FD: Signal processing based FD and Model-based FD. The former concerns of developing algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real-system to analyze the discrepancy between sensors' readings and expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed the signal processing based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes a physically based water quality and hydraulics simulation models, can outperform the signal processing based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (higher true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fighting detection using interaction energy force
NASA Astrophysics Data System (ADS)
Wateosot, Chonthisa; Suvonvorn, Nikom
2017-02-01
Fighting detection is an important security issue, aimed at preventing criminal or undesirable events in public places. Many computer vision studies have addressed detecting specific events in crowded scenes. In this paper we focus on fighting detection using a social-based Interaction Energy Force (IEF). The method uses low-level features without object extraction and tracking. The interaction force is modeled using the magnitude and direction of optical flows. A fighting factor is developed under this model to detect fighting events using a thresholding method. An energy map of the interaction force is also presented to identify the corresponding events. The evaluation is performed using the NUSHGA and BEHAVE datasets. The results show high detection accuracy under various conditions.
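A hedged sketch of a per-frame fighting score from dense optical flow: the score combines flow magnitude with the dispersion of flow directions and is thresholded per frame. The score definition, threshold, and file name are illustrative stand-ins for the paper's IEF model.

```python
# Sketch: per-frame "fighting factor" from dense optical flow; erratic,
# opposing motions yield high directional dispersion and a high score.
import cv2
import numpy as np

def fighting_factor(prev_gray, curr_gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # circular variance of flow directions: ~0 for coherent motion, ~1 for erratic motion
    dispersion = 1.0 - float(np.abs(np.mean(np.exp(1j * ang))))
    return float(mag.mean()) * dispersion

cap = cv2.VideoCapture("corridor_cam.avi")     # hypothetical surveillance clip
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if ok else None
THRESHOLD = 0.8                                # illustrative; tune on training clips
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if fighting_factor(prev, curr) > THRESHOLD:
        print("possible fighting event")
    prev = curr
cap.release()
```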
Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian
2013-07-01
Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time and thereby reduce time in hypoglycemia and avoid any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which the hypoglycemic event is not detected by the CGM. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data of 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Nondiscriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a Support Vector Machine model and the performance assessed by sample-based sensitivity and specificity and event-based sensitivity and number of false-positives. The best model was composed by using seven features and was able to detect 17 of 17 hypoglycemic events with one false-positive compared with 12 of 17 hypoglycemic events with zero false-positives for the CGM alone. Lead-time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
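A minimal sketch of the feature construction and classifier described above: regression slope, kurtosis, and skewness over sliding CGM windows plus a time-since-insulin feature, fed to an SVM. The window lengths, the synthetic trace, and the omission of the SEPCOR/forward-selection step are illustrative simplifications.

```python
# Sketch of window-based CGM features + SVM for sample-based hypoglycemia detection.
import numpy as np
from scipy.stats import kurtosis, skew, linregress
from sklearn.svm import SVC

def window_features(cgm, t_idx, minutes_since_insulin, windows=(15, 30, 60)):
    """Features at sample index t_idx, assuming 5-min CGM sampling."""
    feats = [minutes_since_insulin]
    for w in windows:
        seg = cgm[max(0, t_idx - w // 5):t_idx + 1]
        x = np.arange(len(seg))
        feats += [linregress(x, seg).slope, kurtosis(seg), skew(seg)]
    return np.array(feats)

rng = np.random.default_rng(0)
# Placeholder CGM trace (mmol/L) drifting toward hypoglycemia
cgm = 6.0 - 0.015 * np.arange(300) + rng.normal(0, 0.1, 300)
X = np.array([window_features(cgm, t, minutes_since_insulin=60.0)
              for t in range(20, len(cgm))])
y = (cgm[20:] < 3.9).astype(int)               # 1 = hypoglycemic sample
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print("training accuracy:", clf.score(X, y))
```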
Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.
Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse
2017-03-24
Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
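A minimal sketch of threshold-based detection of two of the four events (IC and TO) from vertical heel and toe marker trajectories; the thresholds and toy trajectories are illustrative, and the paper's biomechanical model is not reproduced.

```python
# Sketch: threshold-based initial contact (IC) and toe off (TO) detection
# from marker heights. Heights in metres; thresholds illustrative.
import numpy as np

def detect_ic_to(heel_z, toe_z, fs=100.0, z_contact=0.03):
    """IC: heel drops below z_contact with downward velocity.
    TO: toe rises above z_contact."""
    heel_v = np.gradient(heel_z) * fs
    ic = [k for k in range(1, len(heel_z))
          if heel_z[k - 1] >= z_contact > heel_z[k] and heel_v[k] < 0]
    to = [k for k in range(1, len(toe_z))
          if toe_z[k - 1] < z_contact <= toe_z[k]]
    return ic, to

t = np.arange(0.0, 5.0, 0.01)                            # 100 Hz toy marker trajectories
heel_z = 0.05 * np.clip(np.sin(2 * np.pi * t), 0.0, None)
toe_z = 0.05 * np.clip(np.sin(2 * np.pi * t - 0.8), 0.0, None)
ic, to = detect_ic_to(heel_z, toe_z)
print("IC samples:", ic[:3], "TO samples:", to[:3])
```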
Assessing the continuum of event-based biosurveillance through an operational lens.
Corley, Courtney D; Lancaster, Mary J; Brigantic, Robert T; Chung, James S; Walters, Ronald A; Arthur, Ray R; Bruckner-Lea, Cynthia J; Calapristi, Augustin; Dowling, Glenn; Hartley, David M; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K; McKenzie, Taylor; Nelson, Noele P; Olsen, Jennifer; Pancerella, Carmen; Quitugua, Teresa N; Reed, Jeremy Todd; Thomas, Carla S
2012-03-01
This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have a significant impact on the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance technologies is unclear. This article frames the continuum of event-based biosurveillance systems (that fuse media reports from the internet), models (ie, computational that forecast disease occurrence), and constructs (ie, descriptive analytical reports) through an operational lens (ie, aspects and attributes associated with operational considerations in the development, testing, and validation of the event-based biosurveillance methods and models and their use in an operational environment). A workshop was held in 2010 to scientifically identify, develop, and vet a set of attributes for event-based biosurveillance. Subject matter experts were invited from 7 federal government agencies and 6 different academic institutions pursuing research in biosurveillance event detection. We describe 8 attribute families for the characterization of event-based biosurveillance: event, readiness, operational aspects, geographic coverage, population coverage, input data, output, and cost. Ultimately, the analyses provide a framework from which the broad scope, complexity, and relevant issues germane to event-based biosurveillance useful in an operational environment can be characterized.
Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E
2007-01-01
To develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false-positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
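A simplified sketch of the base-hold idea: a reference model of a sliding window (here just its leading DCT coefficients) is compared with the current window, a large distance flags a change, and the reference is held fixed while the change persists rather than being rolled forward. Window length, coefficient count, and threshold are illustrative stand-ins for the paper's SSA/DCT decomposition.

```python
# Sketch of "base-hold" change-point detection on an FHR-like trace.
import numpy as np
from scipy.fft import dct

def leading_dct(window, n_coef=8):
    return dct(window, norm="ortho")[:n_coef]

def base_hold_detect(signal, win=64, step=8, n_coef=8, thresh=3.0):
    change_points, ref = [], leading_dct(signal[:win], n_coef)
    in_change = False
    for start in range(step, len(signal) - win, step):
        cur = leading_dct(signal[start:start + win], n_coef)
        changed = np.linalg.norm(cur - ref) > thresh
        if changed and not in_change:
            change_points.append(start)
        if not changed:
            ref = cur          # roll the reference forward only during normal periods
        in_change = changed    # base-hold: keep the reference while the change persists
    return change_points

rng = np.random.default_rng(0)
fhr = np.concatenate([140 + rng.normal(0, 1, 400),
                      120 + rng.normal(0, 1, 120),     # deceleration-like dip
                      140 + rng.normal(0, 1, 400)])
print("change points at samples:", base_hold_detect(fhr))
```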
Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew
2013-01-01
Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720
NASA Astrophysics Data System (ADS)
Missif, Lial Raja; Kadhum, Mohammad M.
2017-09-01
Wireless Sensor Networks (WSNs) have been widely used for monitoring, where sensors are deployed to operate independently and sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range, which does not reflect sensor reliability, event characteristics, and environmental conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically in this paper to examine their adaptability and applicability to real environmental applications. The numerical results of the experimental evaluation have shown that the probabilistic sensing model provides accurate observation and detectability of an event, and it can be utilized in different environmental scenarios.
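A sketch of a standard Elfes-style probabilistic sensing model of the kind examined above: detection is certain within an inner radius, impossible beyond an outer radius, and decays exponentially in between; the network detection probability is then the complement of all sensors missing the event. Radii, decay rate, and the sensor layout are illustrative.

```python
# Sketch: probabilistic sensing model and the resulting network detection
# probability at a given event location.
import numpy as np

def p_detect(distance, r_inner=5.0, r_outer=20.0, alpha=0.3):
    """Elfes-style model: certain inside r_inner, zero beyond r_outer,
    exponential decay in between."""
    d = np.asarray(distance, dtype=float)
    decay = np.exp(-alpha * (d - r_inner))
    return np.where(d <= r_inner, 1.0, np.where(d >= r_outer, 0.0, decay))

def network_detection_probability(event_xy, sensors_xy):
    d = np.linalg.norm(np.asarray(sensors_xy) - np.asarray(event_xy), axis=1)
    return 1.0 - float(np.prod(1.0 - p_detect(d)))   # at least one sensor detects

sensors = [(0, 0), (15, 0), (0, 15), (15, 15)]
print(network_detection_probability((7.5, 7.5), sensors))
```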
Masquerade Detection Using a Taxonomy-Based Multinomial Modeling Approach in UNIX Systems
2008-08-25
primarily the modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events ... are identified, we can extract features representing such behavior while auditing the user's behavior. Figure 1: Taxonomy of Linux and Unix ... achieved when the features are extracted just from simple commands. [Table fragment] Method | Hit Rate | False Positive Rate: ocSVM using simple cmds (freq.-based ...
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
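A minimal sketch of a wavelet-based vibration feature: detail-band energy over sliding windows of an accelerometer signal, with a robust threshold flagging candidate FOD events. The wavelet family, window sizes, and synthetic impact are illustrative, and the Kalman-filter fusion stage is omitted.

```python
# Sketch: wavelet detail-band energy as a FOD-event indicator with a
# median/MAD threshold.
import numpy as np
import pywt

def detail_energy(window, wavelet="db4", level=3):
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return sum(float(np.sum(c ** 2)) for c in coeffs[1:])    # detail bands only

def fod_candidates(signal, win=256, step=64, k=10.0):
    starts = range(0, len(signal) - win, step)
    energies = np.array([detail_energy(signal[s:s + win]) for s in starts])
    baseline = np.median(energies)
    mad = np.median(np.abs(energies - baseline)) + 1e-12
    return [s for s, e in zip(starts, energies) if e > baseline + k * mad]

rng = np.random.default_rng(0)
accel = rng.normal(0, 1, 8192)
accel[4000:4050] += 20 * np.sin(np.linspace(0, 25, 50))      # impact-like transient
print("candidate FOD windows start at samples:", fod_candidates(accel))
```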
NASA Technical Reports Server (NTRS)
Turso, James A.; Lawrence, Charles; Litt, Jonathan S.
2007-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/ health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
Exploiting semantics for sensor re-calibration in event detection systems
NASA Astrophysics Data System (ADS)
Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini
2008-01-01
Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it is still a hard problem and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames, which can not be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas by using an appliance as an example--Coffee Pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required to automatically relearn a new detection model for the newly evolved system state and to resume monitoring with a higher rate of accuracy.
SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.
2013-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIGVISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...
2016-01-01
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
A model of human event detection in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1978-01-01
It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Input to the event detection model of the information displayed to the experimental subjects allows comparison of the model's performance with the performance of the subjects.
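A minimal sketch of the model's central step, using discriminant analysis to map displayed process samples to an event probability per monitored process; the window features, simulated data, and training setup are illustrative assumptions.

```python
# Sketch: discriminant analysis producing P(event) per monitored process
# from a short window of displayed samples.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def displayed_window(n, event):
    """Last 10 displayed samples of one process; an event adds a mean shift."""
    return rng.normal(0.5 if event else 0.0, 1.0, size=(n, 10))

X = np.vstack([displayed_window(300, False), displayed_window(300, True)])
y = np.array([0] * 300 + [1] * 300)
lda = LinearDiscriminantAnalysis().fit(X, y)

# At run time: P(event) generated for each of three monitored processes
current = np.vstack([displayed_window(1, False),
                     displayed_window(1, True),
                     displayed_window(1, True)])
print("P(event) per process:", np.round(lda.predict_proba(current)[:, 1], 3))
```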
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: Minimizes the utilization of individual seismic phase detections - in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation, Has a formalized framework that utilizes information from non-detecting stations, Has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest, Is entirely model-based, i.e. does not rely on a priori's - particularly important for nuclear monitoring, Does not rely on individualized signal detection thresholds - it's the network solution that matters.
Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.
Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram
2017-02-01
In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts; some of them are "true zeros" indicating that the drug-adverse event pairs cannot occur, and these are distinguished from the other zero counts, which are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation and maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
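A simplified numerical sketch of a zero-inflated Poisson likelihood ratio test for a set of drug-adverse-event cells: the ZIP log-likelihood is maximized with the relative reporting rate fixed at 1 (null) and left free (alternative), and twice the difference is the test statistic. Direct numerical optimization stands in for the paper's EM algorithm, and the counts are illustrative.

```python
# Sketch: zero-inflated Poisson likelihood ratio test. n = observed count,
# E = expected count under independence, r = relative reporting rate.
import numpy as np
from scipy import optimize, stats

def zip_loglik(n, mu, pi):
    """log P(n) under ZIP with Poisson mean mu and zero-inflation probability pi."""
    if n == 0:
        return np.log(pi + (1 - pi) * np.exp(-mu))
    return np.log(1 - pi) + stats.poisson.logpmf(n, mu)

def fit_zip(counts, expected, fix_r=None):
    """Maximize the ZIP log-likelihood over (r, pi); optionally fix r."""
    def nll(params):
        r = fix_r if fix_r is not None else params[0]
        pi = params[-1]
        return -sum(zip_loglik(n, r * e, pi) for n, e in zip(counts, expected))
    x0 = [0.1] if fix_r is not None else [1.5, 0.1]
    bounds = ([(1e-6, 1 - 1e-6)] if fix_r is not None
              else [(1e-6, 50.0), (1e-6, 1 - 1e-6)])
    res = optimize.minimize(nll, x0, bounds=bounds, method="L-BFGS-B")
    return -res.fun

counts = [0, 0, 3, 12, 0, 1]                    # illustrative reported counts
expected = [0.5, 1.2, 2.0, 3.0, 0.8, 1.1]       # expected counts under independence
ll_alt = fit_zip(counts, expected)              # r free
ll_null = fit_zip(counts, expected, fix_r=1.0)  # r = 1, i.e. no signal
print("LRT statistic:", round(2 * (ll_alt - ll_null), 3))
```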
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2014-12-01
The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events are significantly different from ones recorded on Earth, in terms of both signal shape and source processes. Thus they are a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, has the goal to investigate Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full range data will only be available by requesting specific time-windows within a few days after the receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (year of station installation with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large amounts, but often show only very weak signals, are detected with less certainty (~70%). As there is only one weak shallow moonquake covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%. The algorithm enables us to classify events that were previously listed in the catalog without classification, and, through the combined use of long period and short period data, identify some unlisted local impacts as well as at least two yet unreported deep moonquakes.
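A minimal sketch of single-station detection and classification with one Gaussian HMM per event class (plus a background-noise model) scored against sliding windows of a feature stream; training each model on a single prototype, the synthetic features, and the window length are illustrative simplifications, and hmmlearn is assumed as the HMM library.

```python
# Sketch: one Gaussian HMM per class scored against sliding windows of a
# 1-D feature stream (e.g., a log envelope).
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_model(prototype_features, n_states=4):
    m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=20)
    m.fit(prototype_features)
    return m

rng = np.random.default_rng(0)
# Placeholder prototype feature sequences, shape (n_samples, n_features)
prototypes = {
    "impact":         np.abs(rng.normal(3.0, 1.0, (200, 1))),
    "deep_moonquake": np.abs(rng.normal(1.0, 0.5, (200, 1))),
    "noise":          np.abs(rng.normal(0.2, 0.1, (200, 1))),
}
models = {name: train_model(feats) for name, feats in prototypes.items()}

stream = np.abs(rng.normal(0.2, 0.1, (2000, 1)))
stream[800:1000] += 2.8                          # impact-like interval
win = 200
for start in range(0, len(stream) - win, win):
    seg = stream[start:start + win]
    best = max(models, key=lambda k: models[k].score(seg))
    if best != "noise":
        print(f"samples {start}-{start + win}: detected {best}")
```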
LLNL Location and Detection Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S C; Harris, D B; Anderson, M L
2003-07-16
We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) develop network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins where emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that the reference velocity-model and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic. Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to solving this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.
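A minimal sketch of a subspace correlation detector: an orthonormal signal subspace is estimated by SVD from aligned waveforms of previous blasts, and continuous data are scanned for windows whose energy is largely captured by projection onto that subspace. Template alignment, subspace dimension, and the detection threshold are illustrative.

```python
# Sketch: subspace (SVD) correlation detector for repeating mine blasts.
import numpy as np

def build_subspace(templates, dim=3):
    """Orthonormal basis spanning the dominant variation of aligned templates."""
    X = np.vstack([t / np.linalg.norm(t) for t in templates])   # rows = templates
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[:dim]                                             # (dim, n_samples)

def scan(stream, basis, threshold=0.7):
    n = basis.shape[1]
    detections = []
    for start in range(0, len(stream) - n, n // 4):
        w = stream[start:start + n]
        energy = float(w @ w)
        if energy == 0.0:
            continue
        proj = basis @ w
        if float(proj @ proj) / energy > threshold:   # fraction of energy captured
            detections.append(start)
    return detections

rng = np.random.default_rng(0)
master = np.sin(np.linspace(0, 30, 400)) * np.hanning(400)
templates = [master + 0.1 * rng.normal(size=400) for _ in range(8)]
stream = 0.1 * rng.normal(size=5000)
stream[2000:2400] += 0.8 * master                     # buried repeat of the blast
print("detections at samples:", scan(stream, build_subspace(templates)))
```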
Abnormal global and local event detection in compressive sensing domain
NASA Astrophysics Data System (ADS)
Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem
2018-05-01
Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection in order to cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can both localize abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. The anomaly detection task is then formulated as a one-class classification problem, learned solely from normal training samples. Experiments demonstrate that our algorithm performs well on the benchmark abnormal detection datasets against state-of-the-art methods.
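A minimal sketch of the compressive-sensing plus one-class-classification pipeline: per-frame optical-flow features are compressed with a sparse random measurement matrix and a one-class SVM is trained on normal frames only. The feature vectors are placeholders; the specific measurement matrix and classifier of the paper may differ.

```python
import numpy as np
from sklearn.random_projection import SparseRandomProjection
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_features = rng.normal(size=(500, 2048))   # per-frame optical-flow histograms (placeholder)
test_features = rng.normal(size=(100, 2048))

proj = SparseRandomProjection(n_components=128, random_state=0)
Z_train = proj.fit_transform(normal_features)     # compressed measurements
Z_test = proj.transform(test_features)

detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(Z_train)
labels = detector.predict(Z_test)                  # +1 normal, -1 abnormal frame
```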
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information regarding site signal detection thresholds, the type of solution algorithm used, and range attenuation to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
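The following sketch illustrates the general mechanics of such a calculation under simplified assumptions: a flash of a given peak current is "seen" by a site if its range-attenuated signal exceeds the site threshold, a flash is located if enough sites see it, and the PCD weights the result. The 1/r attenuation, thresholds and PCD shape are placeholders, not the paper's calibrated models.

```python
import numpy as np

def detection_efficiency(point, sites, thresholds, currents, pcd_weights,
                         min_sites=3):
    """Probability that a flash at `point` yields a solution, under the PCD."""
    ranges = np.linalg.norm(sites - point, axis=1)            # km
    # simple 1/r range attenuation of the peak signal (placeholder model)
    signal = currents[:, None] / np.maximum(ranges, 1.0)[None, :]
    n_detecting = (signal >= thresholds[None, :]).sum(axis=1)
    solved = n_detecting >= min_sites
    return np.sum(pcd_weights[solved]) / np.sum(pcd_weights)

# Usage: empirical PCD given as (peak current, relative frequency) pairs.
currents = np.linspace(5, 200, 100)                            # kA
pcd_weights = np.exp(-currents / 30.0)                         # placeholder PCD shape
sites = np.array([[0, 0], [100, 0], [0, 100], [100, 100]])     # km
thresholds = np.array([0.5, 0.5, 0.8, 0.8])                    # signal units
eff = detection_efficiency(np.array([250.0, 250.0]), sites, thresholds,
                           currents, pcd_weights)
```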
Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain
NASA Astrophysics Data System (ADS)
Krauß, Thomas; Fischer, Peter
2016-08-01
In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News reports show that the majority of occurring natural hazards are flood events, and many flood prediction systems have already been developed. However, most of these existing systems for deriving areas endangered by flooding are based only on horizontal and vertical distances to existing rivers and lakes. Typically, such systems do not take into account dangers arising directly from heavy rain events. In a study conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and analyze the results using the available insurance data.
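As one very simple illustration of a DEM-only indicator (not the classification methods evaluated in the paper), the sketch below flags cells lying in local depressions of the elevation model, where rainwater would tend to pond. The neighbourhood size and depth threshold are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def depression_mask(dem, neighbourhood=9, depth_threshold=0.5):
    """Flag cells at least `depth_threshold` metres below the mean elevation
    of their surrounding neighbourhood."""
    local_mean = uniform_filter(dem, size=neighbourhood, mode="nearest")
    return (local_mean - dem) > depth_threshold

# Usage with a synthetic DEM containing a gentle slope and a shallow basin.
y, x = np.mgrid[0:200, 0:200]
dem = 0.01 * x - 2.0 * np.exp(-((x - 100) ** 2 + (y - 100) ** 2) / 400.0)
endangered = depression_mask(dem, neighbourhood=15, depth_threshold=0.3)
```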
Austin, Peter C
2018-01-01
The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model.
Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai
2017-02-08
Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences.
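A minimal sketch of the threshold-based front end of a multiphase fall model: a free-fall dip in acceleration magnitude, followed within a short interval by an impact peak, followed by a quiet rest phase. All thresholds and window lengths are illustrative, not the values tuned in the study.

```python
import numpy as np

G = 9.81

def detect_fall(acc_mag, fs, free_fall_thr=0.6 * G, impact_thr=3.0 * G,
                rest_band=0.25 * G, max_gap_s=1.0, rest_window_s=2.0):
    """acc_mag: 1-D acceleration magnitude (m/s^2) sampled at fs Hz."""
    gap = int(max_gap_s * fs)
    rest = int(rest_window_s * fs)
    for i in np.where(acc_mag < free_fall_thr)[0]:             # free-fall phase
        window = acc_mag[i:i + gap]
        if window.size and window.max() >= impact_thr:         # impact phase
            j = i + int(window.argmax())
            after = acc_mag[j + gap:j + gap + rest]             # rest phase
            if after.size and np.all(np.abs(after - G) < rest_band):
                return True, i
    return False, None

# Usage: a synthetic trace with free fall, an impact spike, then lying still.
fs = 100
trace = np.concatenate([np.full(200, G), np.full(50, 0.3 * G),
                        np.full(5, 5.0 * G), np.full(400, G)])
print(detect_fall(trace, fs))
```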
A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging
Boulanger, Jérôme; Gidon, Alexandre; Kervran, Charles; Salamero, Jean
2010-01-01
Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy become key issues to decipher the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches expected to distinguish sudden appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performances of two statistical control procedures and compare the proposed approach to a frame difference approach using the same controls on a benchmark of synthetic image sequences. We have then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling trans-membrane protein, the Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentration arising in image sequences from a data base provided by two different microscopy modalities, wide field (WF) video microscopy using maximum intensity projection along the axial direction and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected on the pilot biological model. PMID:20976222
Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter
2018-01-01
Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such interactions among variables associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction-effects between variables are uncovered, while they may not be accompanied with their corresponding main-effects, and may not be detected by standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using a RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected with a predictive value in observational studies with time-to-event outcomes. PMID:29453930
Detection of dominant flow and abnormal events in surveillance video
NASA Astrophysics Data System (ADS)
Kwak, Sooyeong; Byun, Hyeran
2011-02-01
We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a kind of feature-based approach, so it does not track each moving object individually. The proposed algorithm identifies dominant flow without individual object tracking using a latent Dirichlet allocation model in crowded environments. It can also automatically detect and localize an abnormally moving object in real-life video. Performance tests were carried out on several real-life databases, and their results show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The proposed algorithm can be applied to any situation in which objects moving in abnormal directions or at abnormal speeds are to be detected.
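A minimal sketch of the dominant-flow idea under simplified assumptions: per-clip optical-flow directions are quantized into "visual words", a latent Dirichlet allocation model is fitted on normal clips, and clips whose word histograms are poorly explained by the learned topics are flagged. The feature counts, topic number and threshold are placeholders.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_words = 64                                                  # quantized (direction, location) bins
normal_counts = rng.poisson(lam=5.0, size=(300, n_words))     # word counts per normal clip
test_counts = rng.poisson(lam=5.0, size=(20, n_words))

lda = LatentDirichletAllocation(n_components=8, random_state=0).fit(normal_counts)

def per_word_loglik(counts):
    """Average approximate log-likelihood per flow word for each clip."""
    return np.array([lda.score(c[None, :]) / max(c.sum(), 1) for c in counts])

scores = per_word_loglik(test_counts)
threshold = np.percentile(per_word_loglik(normal_counts), 5)
abnormal_clips = np.where(scores < threshold)[0]
```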
Bayesian Inference for Signal-Based Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.
2015-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http
ERIC Educational Resources Information Center
Ball, B. Hunter; Brewer, Gene A.
2018-01-01
The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
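A minimal sketch of the event-state idea under simplified assumptions: per-window iEEG feature vectors are assigned to a small set of Gaussian states learned from training data, and a detection is raised only when states deemed onset-specific persist for several consecutive windows. The features, fixed state count and persistence rule stand in for the Bayesian nonparametric model of the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train_features = rng.normal(size=(2000, 6))     # e.g. per-window band powers (placeholder)
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(train_features)

# Suppose states 3 and 4 were found (from labelled seizures) to be onset-specific.
onset_states = {3, 4}

def detect_onset(features, min_consecutive=3):
    states = gmm.predict(features)
    run = 0
    for t, s in enumerate(states):
        run = run + 1 if s in onset_states else 0
        if run >= min_consecutive:
            return t                              # index of detection
    return None

print(detect_onset(rng.normal(size=(50, 6))))
```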
NASA Technical Reports Server (NTRS)
Trejo, Leonard J.; Shensa, Mark J.; Remington, Roger W. (Technical Monitor)
1998-01-01
This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance.
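A minimal sketch of the DWT-regression idea, assuming the PyWavelets package: ERPs are expanded with a discrete wavelet transform, the coefficients with highest power across trials are retained, and a linear regression maps them to a performance score. The wavelet, level, number of retained coefficients and synthetic data are assumptions for illustration only.

```python
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

def dwt_matrix(erps, wavelet="db4", level=4):
    """Row-wise concatenated DWT coefficients for a set of equal-length ERPs."""
    return np.vstack([np.concatenate(pywt.wavedec(e, wavelet, level=level))
                      for e in erps])

rng = np.random.default_rng(0)
erps = rng.normal(size=(200, 256))               # single-trial ERPs (placeholder)
performance = rng.normal(size=200)               # composite detection score (placeholder)

coeff = dwt_matrix(erps)
keep = np.argsort(coeff.var(axis=0))[::-1][:16]  # highest-power coefficients across trials
X = coeff[:, keep]

model = LinearRegression().fit(X, performance)
predicted = model.predict(X)
```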
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive detection. Compared with the commonly adopted detection methods built on classical statistical theory, the sequential Bayesian approach offers the advantage of shorter verification time when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the sequential Bayesian methodology was developed. Event sequences of γ-rays consistent with the true parameters of a LaBr3(Ce) detector were generated with an event-sequence generator based on Monte Carlo sampling theory in order to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, respectively, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
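In the same spirit, the sketch below shows a sequential probability ratio test on photon interarrival times: each new event updates a log-likelihood ratio between a "source present" rate and a background-only rate, and the sequence stops when either threshold is crossed. The rates, error levels and simulated event sequence are illustrative, not the paper's processor.

```python
import numpy as np

def sequential_test(interarrivals, rate_source, rate_bg, alpha=1e-3, beta=1e-3):
    """interarrivals: seconds between successive detected photons."""
    upper = np.log((1 - beta) / alpha)          # accept "source present"
    lower = np.log(beta / (1 - alpha))          # accept "background only"
    llr = 0.0
    for k, dt in enumerate(interarrivals, start=1):
        # exponential interarrival likelihoods under each rate hypothesis
        llr += (np.log(rate_source) - rate_source * dt) \
             - (np.log(rate_bg) - rate_bg * dt)
        if llr >= upper:
            return "source", k
        if llr <= lower:
            return "background", k
    return "undecided", len(interarrivals)

# Usage: simulated event sequence from a weak source over background.
rng = np.random.default_rng(0)
dts = rng.exponential(1.0 / 12.0, size=200)     # true total rate 12 counts per second
print(sequential_test(dts, rate_source=12.0, rate_bg=8.0))
```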
Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video
Lee, Gil-beom; Lee, Myeong-jin; Lee, Woo-Kyung; Park, Joo-heon; Kim, Tae-Hwan
2017-01-01
Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos. PMID:28327515
Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †
Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang
2017-01-01
Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection and warning of contamination events to prevent pollution from spreading is one of the most important issues when pollution occurs. In order to comprehensively reduce the event detection deviation, a spatial-temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. The first part is that M-STED adopts a Rule K algorithm to select backbone nodes as the nodes in the connected dominating set (CDS), and forward the sensed data of multiple water parameters. The second part is to determine the state of each backbone node with back propagation neural network models and sequential Bayesian analysis in the current timestamp. The third part is to establish a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and trace the "outlier" node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with the event detection with a single water parameter algorithm, S-STED. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability. PMID:29207535
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) event candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
Oliker, Nurit; Ostfeld, Avi
2014-03-15
This study describes a decision support system that generates alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement of contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular measurements than event time measurements), and incorporating the time factor through a time decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomic. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts of contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
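A minimal sketch of the two weighting ideas in a generic SVM, assuming scikit-learn: class weights offset the imbalance between normal and event samples, and per-sample time-decay weights favour recent observations. The feature vectors, event rate and decay constant are placeholders, not the calibrated model of the study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 6))                     # multivariate water-quality vectors (placeholder)
y = (rng.random(n) < 0.03).astype(int)          # rare "event" class
t = np.arange(n)                                # time index of each measurement

decay = 0.001
sample_weight = np.exp(-decay * (t.max() - t))  # newer samples weigh more

clf = SVC(kernel="rbf", class_weight="balanced", gamma="scale")
clf.fit(X, y, sample_weight=sample_weight)

is_event = clf.predict(X[-1:])                  # classify the latest measurement
```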
Day-time identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2013-10-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather that has significant impact on the environment, property and the population. A new method, the Hail Detection Tool (HDT), is described for identifying hail-bearing storms using multi-spectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the Convective Mask (CM) algorithm devised for detection of deep convection, and the second a Hail Detection algorithm (HD) for the identification of hail-bearing clouds among cumulonimbus systems detected by CM. Both CM and HD are based on logistic regression models trained with multi-spectral MSG data sets comprised of summer convective events in the middle Ebro Valley between 2006 and 2010, detected by the RGB visualization technique (CM) or the C-band weather radar system of the University of León. By means of the logistic regression approach, the probabilities of identifying a cumulonimbus event with CM or a hail event with HD are computed by exploiting a proper selection of MSG wavelengths or their combination. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret results of the statistical models from a meteorological perspective, using a method based on these "ingredients." Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall Probability of Detection (POD) was 76.9% and the False Alarm Ratio 16.7%.
Full-waveform detection of non-impulsive seismic events based on time-reversal methods
NASA Astrophysics Data System (ADS)
Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya
2017-12-01
We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time-windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in regions and with focal mechanisms similar to those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May 2016. The second area of interest is the Gulf of California, where two swarms took place during July and September of 2015. We show that we are able to detect previously unreported, non-impulsive events and recommend that this method be used together with more traditional template matching methods to maximize the number of detected events.
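The sketch below illustrates the stacked-correlation detection function in a drastically simplified setting: a single synthetic template per station (standing in for the strain Green's tensor convolved with a source), correlations stacked over the network with no moveout, and detections declared where the stack exceeds a multiple of a median-absolute-deviation noise estimate. All inputs and thresholds are synthetic assumptions.

```python
import numpy as np
from scipy.signal import correlate

def detection_function(traces, templates):
    """traces, templates: lists of equal-length 1-D arrays, one pair per station."""
    stack = None
    for tr, tmpl in zip(traces, templates):
        c = correlate(tr, tmpl, mode="valid")
        c /= (np.linalg.norm(tmpl) * np.std(tr) * len(tmpl) + 1e-20)  # rough normalization
        stack = c if stack is None else stack + c
    return stack / len(traces)

def detect(stack, n_mad=8.0):
    noise = 1.4826 * np.median(np.abs(stack - np.median(stack)))
    return np.where(stack > n_mad * noise)[0]

rng = np.random.default_rng(0)
tmpl = np.sin(np.linspace(0, 10 * np.pi, 400)) * np.hanning(400)
traces = []
for _ in range(5):
    tr = rng.normal(scale=0.5, size=20000)
    tr[7000:7400] += tmpl                       # buried event at sample 7000
    traces.append(tr)
detections = detect(detection_function(traces, [tmpl] * 5))
```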
Vertically Integrated Seismological Analysis I : Modeling
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Jordan, M. I.; Sudderth, E.
2009-12-01
As part of its CTBT verification efforts, the International Data Centre (IDC) analyzes seismic and other signals collected from hundreds of stations around the world. Current processing at the IDC proceeds in a series of pipelined stages. From station processing to network processing, each decision is made on the basis of local information. This has the advantage of efficiency, and simplifies the structure of software implementations. However, this approach may reduce accuracy in the detection and phase classification of arrivals, association of detections to hypothesized events, and localization of small-magnitude events. In our work, we approach such detection and association problems as ones of probabilistic inference. In simple terms, let X be a random variable ranging over all possible collections of events, with each event defined by time, location, magnitude, and type (natural or man-made). Let Y range over all possible waveform signal recordings at all detection stations. Then Pθ(X) describes a parameterized generative prior over events, and Pφ(Y | X) describes how the signal is propagated and measured (including travel time, selective absorption and scattering, noise, artifacts, sensor bias, sensor failures, etc.). Given observed recordings Y = y, we are interested in the posterior P(X | Y = y), and perhaps in the value of X that maximizes it, i.e., the most likely explanation for all the sensor readings. As detailed below, an additional focus of our work is to robustly learn appropriate model parameters θ and φ from historical data. The primary advantage we expect is that decisions about arrivals, phase classifications, and associations are made with the benefit of all available evidence, not just the local signal or predefined recipes. Important phenomena, such as the successful detection of sub-threshold signals, correction of phase classifications using arrival information at other stations, and removal of false events based on the absence of signals, should all fall out of our probabilistic framework without the need for special processing rules. In our baseline model, natural events occur according to a spatially inhomogeneous Poisson process. Complex events (swarms and aftershocks) may then be captured via temporally inhomogeneous extensions. Man-made events have a uniform probability of occurring anywhere on the earth, with a tendency to occur closer to the surface. Phases are modelled via their amplitude, frequency distribution, and origin. In the simplest case, transmission times are characterized via the one-dimensional IASPEI-91 model, accounting for model errors with Gaussian uncertainty. Such homogeneous, approximate physical models can be further refined via historical data and previously developed corrections. Signal measurements are captured by station-specific models, based on sensor types and geometries, local frequency absorption characteristics, and time-varying noise models. At the conference, we expect to be able to quantitatively demonstrate the advantages of our approach, at least for simulated data. When reporting their findings, such systems can easily flag low-confidence events, unexplained arrivals, and ambiguous classifications to focus the efforts of expert analysts.
Bridging the semantic gap in sports
NASA Astrophysics Data System (ADS)
Li, Baoxin; Errico, James; Pan, Hao; Sezan, M. Ibrahim
2003-01-01
One of the major challenges facing current media management systems and the related applications is the so-called "semantic gap" between the rich meaning that a user desires and the shallowness of the content descriptions that are automatically extracted from the media. In this paper, we address the problem of bridging this gap in the sports domain. We propose a general framework for indexing and summarizing sports broadcast programs. The framework is based on a high-level model of sports broadcast video using the concept of an event, defined according to domain-specific knowledge for different types of sports. Within this general framework, we develop automatic event detection algorithms that are based on automatic analysis of the visual and aural signals in the media. We have successfully applied the event detection algorithms to different types of sports including American football, baseball, Japanese sumo wrestling, and soccer. Event modeling and detection contribute to the reduction of the semantic gap by providing rudimentary semantic information obtained through media analysis. We further propose a novel approach, which makes use of independently generated rich textual metadata, to fill the gap completely through synchronization of the information-laden textual data with the basic event segments. An MPEG-7 compliant prototype browsing system has been implemented to demonstrate semantic retrieval and summarization of sports video.
A spatial scan statistic for compound Poisson data.
Rosychuk, Rhonda J; Chang, Hsing-Ming
2013-12-20
The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection
Vesperini, Fabio; Schuller, Björn
2017-01-01
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as an activation signal to detect novel events. There are no studies so far that compare previous efforts to automatically recognize novel events from audio signals and give a broad and in-depth evaluation of recurrent neural network-based autoencoders. The present contribution aims to consistently evaluate our recent novel approaches to fill this gap in the literature and provide insight through extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches up to an absolute improvement of 16.4% average F-measure over the three databases. PMID:28182121
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
Robust failure detection filters. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sanmartin, A. M.
1985-01-01
The robustness of detection filters applied to the detection of actuator failures on a free-free beam is analyzed. This analysis is based on computer simulation tests of the detection filters in the presence of different types of model mismatch, and on frequency response functions of the transfers corresponding to the model mismatch. The robustness of detection filters based on a model of the beam containing a large number of structural modes varied dramatically with the placement of some of the filter poles. The dynamics of these filters were very hard to analyze. The design of detection filters with a number of modes equal to the number of sensors was trivial. They can be configured to detect any number of actuator failure events. The dynamics of these filters were very easy to analyze and their robustness properties were much improved. A change of the output transformation allowed the filter to perform satisfactorily with realistic levels of model mismatch.
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing built on the representation of a radionuclide as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence interval condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not, and if not, then repeating the process for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Rastaetter, L.; Kuznetsova, M.; Singer, H.; Balch, C.; Weimer, D.; Toth, G.; Ridley, A.; Gombosi, T.; Wiltberger, M.;
2013-01-01
In this paper we continue the community-wide rigorous modern space weather model validation efforts carried out within the GEM, CEDAR and SHINE programs. In this particular effort, in coordination among the Community Coordinated Modeling Center (CCMC), the NOAA Space Weather Prediction Center (SWPC), modelers, and the science community, we focus on studying the models' capability to reproduce observed ground magnetic field fluctuations, which are closely related to the geomagnetically induced current phenomenon. One of the primary motivations of the work is to support NOAA SWPC in their selection of the next numerical model that will be transitioned into operations. Six geomagnetic events and 12 geomagnetic observatories were selected for validation. While modeled and observed magnetic field time series are available for all 12 stations, the primary metrics analysis is based on six stations that were selected to represent high-latitude and mid-latitude locations. Event-based analysis and the corresponding contingency tables were built for each event and each station. The elements in the contingency table were then used to calculate the Probability of Detection (POD), Probability of False Detection (POFD) and Heidke Skill Score (HSS) for rigorous quantification of the models' performance. In this paper the summary results of the metrics analyses are reported in terms of POD, POFD and HSS. More detailed analyses can be carried out using the event-by-event contingency tables provided as an online appendix. An online interface built at CCMC and described in the supporting information is also available for more detailed time series analyses.
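For reference, the three skill metrics named above follow directly from a 2x2 contingency table; the sketch below computes them from hits, false alarms, misses and correct negatives, with an illustrative table as usage example.

```python
def pod(a, b, c, d):
    return a / (a + c)                     # Probability of Detection

def pofd(a, b, c, d):
    return b / (b + d)                     # Probability of False Detection

def hss(a, b, c, d):
    num = 2.0 * (a * d - b * c)            # Heidke Skill Score
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return num / den

# Usage: 20 hits (a), 5 false alarms (b), 8 misses (c), 60 correct negatives (d).
a, b, c, d = 20, 5, 8, 60
print(pod(a, b, c, d), pofd(a, b, c, d), hss(a, b, c, d))
```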
Learning rational temporal eye movement strategies.
Hoppe, David; Rothkopf, Constantin A
2016-07-19
During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
Subsurface event detection and classification using Wireless Signal Networks.
Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T
2012-11-05
Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data on how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator of geo-events, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. Upon event detection, the window-based classifier classifies geo-events on the event-occurring regions, which are called a classification window. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
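A minimal sketch of a generic minimum-distance classifier over a classification window: signal-strength change vectors from the nodes in the window are averaged per training event class, and a new window is assigned to the class with the closest mean vector. The class names, node count and data are illustrative, not the paper's calibrated classifier.

```python
import numpy as np

def train_class_means(windows, labels):
    """windows: (n_examples, n_nodes) RSSI-change vectors; labels: class per example."""
    return {c: windows[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify_window(window, class_means):
    dists = {c: np.linalg.norm(window - m) for c, m in class_means.items()}
    return min(dists, key=dists.get)

# Usage with two hypothetical geo-event classes measured on four nodes.
rng = np.random.default_rng(0)
train = np.vstack([rng.normal(-6, 1, size=(20, 4)),     # "water intrusion" examples
                   rng.normal(+3, 1, size=(20, 4))])    # "relative motion" examples
labels = np.array(["water"] * 20 + ["motion"] * 20)
means = train_class_means(train, labels)
print(classify_window(rng.normal(-5.5, 1, size=4), means))
```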
Kamphuis, C; Frank, E; Burke, J K; Verkerk, G A; Jago, J G
2013-01-01
The hypothesis was that sensors currently available on farm that monitor behavioral and physiological characteristics have potential for the detection of lameness in dairy cows. This was tested by applying additive logistic regression to variables derived from sensor data. Data were collected between November 2010 and June 2012 on 5 commercial pasture-based dairy farms. Sensor data from weigh scales (liveweight), pedometers (activity), and milk meters (milking order, unadjusted and adjusted milk yield in the first 2 min of milking, total milk yield, and milking duration) were collected at every milking from 4,904 cows. Lameness events were recorded by farmers who were trained in detecting lameness before the study commenced. A total of 318 lameness events affecting 292 cows were available for statistical analyses. For each lameness event, the lame cow's sensor data for a time period of 14 d before observation date were randomly matched by farm and date to 10 healthy cows (i.e., cows that were not lame and had no other health event recorded for the matched time period). Sensor data relating to the 14-d time periods were used for developing univariable (using one source of sensor data) and multivariable (using multiple sources of sensor data) models. Model development involved the use of additive logistic regression by applying the LogitBoost algorithm with a regression tree as base learner. The model's output was a probability estimate for lameness, given the sensor data collected during the 14-d time period. Models were validated using leave-one-farm-out cross-validation and, as a result of this validation, each cow in the data set (318 lame and 3,180 nonlame cows) received a probability estimate for lameness. Based on the area under the curve (AUC), results indicated that univariable models had low predictive potential, with the highest AUC values found for liveweight (AUC=0.66), activity (AUC=0.60), and milking order (AUC=0.65). Combining these 3 sensors improved AUC to 0.74. Detection performance of this combined model varied between farms but it consistently and significantly outperformed univariable models across farms at a fixed specificity of 80%. Still, detection performance was not high enough to be implemented in practice on large, pasture-based dairy farms. Future research may improve performance by developing variables based on sensor data of liveweight, activity, and milking order, but that better describe changes in sensor data patterns when cows go lame. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
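The sketch below illustrates the validation scheme with a stand-in learner: scikit-learn has no LogitBoost, so gradient boosting with shallow trees is used purely to demonstrate leave-one-farm-out cross-validation and AUC scoring on pooled out-of-fold probabilities. The features, lameness rate and farm labels are placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1500
X = rng.normal(size=(n, 3))                     # liveweight, activity, milking-order features
y = (rng.random(n) < 0.1).astype(int)           # lame vs non-lame (placeholder labels)
farm = rng.integers(0, 5, size=n)               # farm identifier per cow

probs = np.zeros(n)
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=farm):
    clf = GradientBoostingClassifier(max_depth=2, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    probs[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

print("cross-validated AUC:", roc_auc_score(y, probs))
```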
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing for classifying the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of using SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g., earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support-vector network to various classical learning algorithms used previously in seismic detection and classification is an essential final step to analyze the advantages and disadvantages of the model.
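A minimal sketch of the supervised SVM classification step, assuming each event has already been reduced to a small feature vector (e.g., spectral ratios and amplitude measures); the features and labels below are synthetic placeholders, not IMS data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy feature matrix: placeholder per-event waveform features;
# labels: 0 = earthquake, 1 = quarry blast.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(1.2, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

# Standardize features, then fit an RBF-kernel SVM with probability outputs.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))
clf.fit(X, y)
print(clf.predict_proba(X[:3]))
```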
Daytime identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2014-04-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather phenomena that have significant impact on the environment, property and populations. A new method, the hail detection tool (HDT), is described for identifying hail-bearing storms using multispectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the convective mask (CM) algorithm devised for detection of deep convection, and the second a hail mask algorithm (HM) for the identification of hail-bearing clouds among cumulonimbus systems detected by CM. Both CM and HM are based on logistic regression models trained with multispectral MSG data sets comprising summer convective events in the middle Ebro Valley (Spain) between 2006 and 2010, detected by the RGB (red-green-blue) visualization technique (CM) or the C-band weather radar system of the University of León. By means of the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HM is computed by exploiting a proper selection of MSG wavelengths or their combination. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret the results of the statistical models from a meteorological perspective, using a method based on these "ingredients". Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall probability of detection was 76.9 % and the false alarm ratio 16.7 %.
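The two-phase logistic-regression design (CM followed by HM) can be illustrated with a short sketch; the predictors stand in for MSG channel combinations, and the verification scores are the usual probability of detection (POD) and false alarm ratio (FAR). Everything here is synthetic and only meant to show the cascading structure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stage 1 (CM): convective vs non-convective pixels from placeholder predictors.
# Stage 2 (HM): hail vs no-hail, applied only to pixels flagged by CM.
rng = np.random.default_rng(2)
X = rng.normal(0, 1, (1000, 5))          # stand-ins for brightness temperatures / differences
is_conv = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 1000)) > 0.5
is_hail = is_conv & ((X[:, 2] - X[:, 3] + rng.normal(0, 0.5, 1000)) > 0.8)

cm = LogisticRegression().fit(X, is_conv)
hm = LogisticRegression().fit(X[is_conv], is_hail[is_conv])

conv_pred = cm.predict(X)
hail_pred = np.zeros(len(X), dtype=bool)
hail_pred[conv_pred] = hm.predict(X[conv_pred])

hits = np.sum(hail_pred & is_hail)
misses = np.sum(~hail_pred & is_hail)
false_alarms = np.sum(hail_pred & ~is_hail)
print("POD:", hits / (hits + misses), "FAR:", false_alarms / (hits + false_alarms))
```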
Model-Based Fault Tolerant Control
NASA Technical Reports Server (NTRS)
Kumar, Aditya; Viassolo, Daniel
2008-01-01
The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in the presence of these faults. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
An integrated logit model for contamination event detection in water distribution systems.
Housh, Mashor; Ostfeld, Avi
2015-05-15
The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with other components of the event detection system framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which receives little attention in many existing event detection system models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
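A rough sketch of the fusion idea, assuming each water-quality indicator already produces its own event probability: a binary logit (discrete choice) model fitted by maximum likelihood combines the single-indicator outputs into one alarm probability. The joint calibration by genetic algorithms described in the paper is omitted, and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# p_ind[:, j]: event probability from the detector of indicator j
# (e.g. chlorine, turbidity, conductivity); y: 1 = true contamination event.
rng = np.random.default_rng(3)
y = rng.binomial(1, 0.05, 2000)
p_ind = np.clip(0.1 + 0.6 * y[:, None] + rng.normal(0, 0.2, (2000, 3)), 0.0, 1.0)

# Binary logit fusion fitted by maximum likelihood on the indicator log-odds.
logit_inputs = np.log(p_ind / (1 - p_ind + 1e-9) + 1e-9)
fusion = LogisticRegression().fit(logit_inputs, y)
p_event = fusion.predict_proba(logit_inputs)[:, 1]
alarm = p_event > 0.5
print("false positives:", int(np.sum(alarm & (y == 0))))
```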
Detecting Single-Nucleotide Substitutions Induced by Genome Editing.
Miyaoka, Yuichiro; Chan, Amanda H; Conklin, Bruce R
2016-08-01
The detection of genome editing is critical in evaluating genome-editing tools or conditions, but it is not easy to detect genome-editing events, especially single-nucleotide substitutions, without a surrogate marker. Here we introduce a procedure that significantly contributes to the advancement of genome-editing technologies. It uses droplet digital polymerase chain reaction (ddPCR) and allele-specific hydrolysis probes to detect single-nucleotide substitutions generated by genome editing (via homology-directed repair, or HDR). HDR events that introduce substitutions using donor DNA are generally infrequent, even with genome-editing tools, and the outcome is only a one-base-pair difference in the 3 billion base pairs of the human genome. This task is particularly difficult in induced pluripotent stem (iPS) cells, in which editing events can be very rare. Therefore, the technological advances described here have implications for therapeutic genome editing and experimental approaches to disease modeling with iPS cells. © 2016 Cold Spring Harbor Laboratory Press.
Sampled-data consensus in switching networks of integrators based on edge events
NASA Astrophysics Data System (ADS)
Xiao, Feng; Meng, Xiangyu; Chen, Tongwen
2015-02-01
This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
Participation of the NDC Austria at the NDC Preparedness Exercise 2012
NASA Astrophysics Data System (ADS)
Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene
2013-04-01
NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to train the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the Iodine isotopes I-131 and I-133 (both particulates), and the Radioxenon Isotopes Xe-133, Xe-133M, Xe-131M and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), concentrations of all these six isotopes which would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentration of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • Atmospheric backtracking and data fusion to identify seismic candidate events, • Seismic analysis of candidate events within the possible source region, • Atmospheric transport modelling (forward mode) from identified candidate events, comparison between "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
Modeling Concept Dependencies for Event Detection
2014-04-04
Gaussian Mixture Model (GMM). Jiang et al. [8] provide a summary of experiments for TRECVID MED 2010. They employ low-level features such as SIFT and ... event detection literature. Ballan et al. [2] present a method to introduce temporal information for video event detection with a BoW (bag-of-words) ... approach. Zhou et al. [24] study video event detection by encoding a video with a set of bag-of-SIFT feature vectors and describe the distribution with a
Pipeline Processing with an Iterative, Context-Based Detection Model
2016-01-22
[Excerpt from the report's list of figures] Figure 25: Teleseismic paths from earthquakes in Myanmar to three North American arrays. The path length to ILAR (the nearest array) is about 8950 ... kilometers. Figure 26: Waveforms of Myanmar calibration event (left) and target event (right), recorded at ILAR ... one Myanmar event (2007 5/16 8:56:16.0, Mw 6.3; 20.47°N 100.69°E) as a calibration for a second event occurring nearly 4 years later (2011 3/24 13:55
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegradation in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.
Fine-Scale Event Location and Error Analysis in NET-VISA
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2016-12-01
NET-VISA is a generative probabilistic model for the occurrence of seismic, hydro, and atmospheric events, and the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and mis-detections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and each year it has achieved roughly a 60% reduction in the number of missed events without increasing the false event rate, as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.
Impact Detection for Characterization of Complex Multiphase Flows
NASA Astrophysics Data System (ADS)
Chan, Wai Hong Ronald; Urzay, Javier; Mani, Ali; Moin, Parviz
2016-11-01
Multiphase flows often involve a wide range of impact events, such as liquid droplets impinging on a liquid pool or gas bubbles coalescing in a liquid medium. These events contribute to a myriad of large-scale phenomena, including breaking waves on ocean surfaces. As impacts between surfaces necessarily occur at isolated points, numerical simulations of impact events will require the resolution of molecular scales near the impact points for accurate modeling. This can be prohibitively expensive unless subgrid impact and breakup models are formulated to capture the effects of the interactions. The first step in a large-eddy simulation (LES) based computational methodology for complex multiphase flows like air-sea interactions requires effective detection of these impact events. The starting point of this work is a collision detection algorithm for structured grids on a coupled level set / volume of fluid (CLSVOF) solver adapted from an earlier algorithm for cloth animations that triangulates the interface with the marching cubes method. We explore the extension of collision detection to a geometric VOF solver and to unstructured grids. Supported by ONR/A*STAR. Agency of Science, Technology and Research, Singapore; Office of Naval Research, USA.
Deformed shell model study of event rates for WIMP-73Ge scattering
NASA Astrophysics Data System (ADS)
Sahu, R.; Kota, V. K. B.
2017-12-01
The event detection rates for the Weakly Interacting Massive Particles (WIMP) (a dark matter candidate) are calculated with 73Ge as the detector. The calculations are performed within the deformed shell model (DSM) based on Hartree-Fock states. First, the energy levels and magnetic moment for the ground state and two low-lying positive parity states for this nucleus are calculated and compared with experiment. The agreement is quite satisfactory. Then the nuclear wave functions are used to investigate the elastic and inelastic scattering of WIMP from 73Ge; inelastic scattering, especially for the 9/2+ → 5/2+ transition, is studied for the first time. The nuclear structure factors which are independent of supersymmetric model are also calculated as a function of WIMP mass. The event rates are calculated for a given set of nucleonic current parameters. The calculation shows that 73Ge is a good detector for detecting dark matter.
A Survey of Insider Attack Detection Research
2008-08-25
modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through ... forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration ... network level. Schultz pointed out that not one approach will work but solutions need to be based on multiple sensors to be able to find any combination
Automatic detection of snow avalanches in continuous seismic data using hidden Markov models
NASA Astrophysics Data System (ADS)
Heck, Matthias; Hammer, Conny; van Herwijnen, Alec; Schweizer, Jürg; Fäh, Donat
2018-01-01
Snow avalanches generate seismic signals as many other mass movements. Detection of avalanches by seismic monitoring is highly relevant to assess avalanche danger. In contrast to other seismic events, signals generated by avalanches do not have a characteristic first arrival nor is it possible to detect different wave phases. In addition, the moving source character of avalanches increases the intricacy of the signals. Although it is possible to visually detect seismic signals produced by avalanches, reliable automatic detection methods for all types of avalanches do not exist yet. We therefore evaluate whether hidden Markov models (HMMs) are suitable for the automatic detection of avalanches in continuous seismic data. We analyzed data recorded during the winter season 2010 by a seismic array deployed in an avalanche starting zone above Davos, Switzerland. We re-evaluated a reference catalogue containing 385 events by grouping the events in seven probability classes. Since most of the data consist of noise, we first applied a simple amplitude threshold to reduce the amount of data. As first classification results were unsatisfying, we analyzed the temporal behavior of the seismic signals for the whole data set and found that there is a high variability in the seismic signals. We therefore applied further post-processing steps to reduce the number of false alarms by defining a minimal duration for the detected event, implementing a voting-based approach and analyzing the coherence of the detected events. We obtained the best classification results for events detected by at least five sensors and with a minimal duration of 12 s. These processing steps allowed identifying two periods of high avalanche activity, suggesting that HMMs are suitable for the automatic detection of avalanches in seismic data. However, our results also showed that more sensitive sensors and more appropriate sensor locations are needed to improve the signal-to-noise ratio of the signals and therefore the classification.
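A compact sketch of the HMM-based detection with post-processing, using the third-party hmmlearn package as a stand-in for the authors' classifier: a two-state Gaussian HMM is decoded on a feature time series, and only detections longer than a minimum duration (12 s in the paper) are kept. The feature, sampling rate, and event/noise state assignment are assumptions for illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # third-party package, assumed available

def detect_events(features, min_duration_s=12, fs=1.0):
    """Decode a 2-state HMM (noise vs event) on a feature time series and
    keep only detections longer than the minimum duration."""
    model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
    model.fit(features)
    states = model.predict(features)
    # Assume the state with the larger mean feature value is the "event" state.
    event_state = int(np.argmax(model.means_[:, 0]))
    events, start = [], None
    for i, s in enumerate(np.append(states, -1)):
        if s == event_state and start is None:
            start = i
        elif s != event_state and start is not None:
            if (i - start) / fs >= min_duration_s:
                events.append((start, i))
            start = None
    return events

# Usage with a placeholder 1-D envelope feature sampled at 1 Hz.
rng = np.random.default_rng(4)
env = np.concatenate([rng.normal(1, 0.1, 300), rng.normal(4, 0.3, 20), rng.normal(1, 0.1, 300)])
print(detect_events(env.reshape(-1, 1), fs=1.0))
```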
Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection
Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem
2013-01-01
The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
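The core idea, training a one-class model on covariance descriptors of normal frames only, can be sketched with scikit-learn's batch OneClassSVM as a stand-in for the online least-squares variant proposed in the paper; the motion features below are synthetic placeholders.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def frame_descriptor(flow):
    """Covariance-matrix descriptor of per-pixel motion features, flattened.
    `flow` is an (n_pixels, n_features) array of motion measurements."""
    return np.cov(flow, rowvar=False).ravel()

rng = np.random.default_rng(5)
normal_frames = [frame_descriptor(rng.normal(0, 1, (500, 4))) for _ in range(200)]
test_frame = frame_descriptor(rng.normal(0, 3, (500, 4)))   # unusually large motion

# Batch one-class SVM trained on normal frames only (stand-in for the
# online least-squares variant described in the paper).
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_frames)
print("abnormal" if ocsvm.predict([test_frame])[0] == -1 else "normal")
```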
Space-time clusters for early detection of grizzly bear predation.
Kermish-Wells, Joseph; Massolo, Alessandro; Stenhouse, Gordon B; Larsen, Terrence A; Musiani, Marco
2018-01-01
Accurate detection and classification of predation events is important to determine predation and consumption rates by predators. However, obtaining this information for large predators is constrained by the speed at which carcasses disappear and the cost of field data collection. To accurately detect predation events, researchers have used GPS collar technology combined with targeted site visits. However, kill sites are often investigated well after the predation event due to limited data retrieval options on GPS collars (VHF or UHF downloading) and to ensure crew safety when working with large predators. This can lead to missing information from small-prey (including young ungulates) kill sites due to scavenging and general site deterioration (e.g., vegetation growth). We used a space-time permutation scan statistic (STPSS) clustering method (SaTScan) to detect predation events of grizzly bears ( Ursus arctos ) fitted with satellite transmitting GPS collars. We used generalized linear mixed models to verify predation events and the size of carcasses using spatiotemporal characteristics as predictors. STPSS uses a probability model to compare expected cluster size (space and time) with the observed size. We applied this method retrospectively to data from 2006 to 2007 to compare our method to random GPS site selection. In 2013-2014, we applied our detection method to visit sites one week after their occupation. Both datasets were collected in the same study area. Our approach detected 23 of 27 predation sites verified by visiting 464 random grizzly bear locations in 2006-2007, 187 of which were within space-time clusters and 277 outside. Predation site detection increased by 2.75 times (54 predation events of 335 visited clusters) using 2013-2014 data. Our GLMMs showed that cluster size and duration predicted predation events and carcass size with high sensitivity (0.72 and 0.94, respectively). Coupling GPS satellite technology with clusters using a program based on space-time probability models allows for prompt visits to predation sites. This enables accurate identification of the carcass size and increases fieldwork efficiency in predation studies.
Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan
2017-12-20
Biomedical event extraction is one of the frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and argument detection, which can both be considered classification problems. However, traditional state-of-the-art methods are based on support vector machines (SVMs) with massive, manually designed one-hot features, which require enormous work but lack semantic relations among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines contextual features, consisting of dependency-based word embeddings, with task-based features represented in a distributed way, and uses them as the input for training deep learning models. Finally, we use a softmax classifier to label the example candidates. The experimental results on the Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% overall, compared to the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the semantic gap and dimensionality problems of traditional one-hot representation methods. The promising results demonstrate that our proposed method is effective for biomedical event extraction.
Zhang, Xiaopu; Lin, Jun; Chen, Zubin; Sun, Feng; Zhu, Xi; Fang, Gengfa
2018-06-05
Microseismic monitoring is one of the most critical technologies for hydraulic fracturing in oil and gas production. Detecting events accurately and efficiently poses two major challenges. One is achieving high accuracy despite a poor signal-to-noise ratio (SNR); the other concerns real-time data transmission. Taking these challenges into consideration, an edge-computing-based platform, namely Edge-to-Center LearnReduce, is presented in this work. The platform consists of a data center with many edge components. At the data center, a neural network model combining a convolutional neural network (CNN) and long short-term memory (LSTM) is designed and trained using previously obtained data. Once the model is fully trained, it is sent to the edge components for event detection and data reduction. At each edge component, a probabilistic inference is added to the neural network model to improve its accuracy. Finally, the reduced data is delivered to the data center. Based on experimental results, the proposed approach achieved high detection accuracy (over 96%) while reducing the transmitted data volume by about 90% on a microseismic monitoring system. These results show that the platform can simultaneously improve the accuracy and efficiency of microseismic monitoring.
NASA Technical Reports Server (NTRS)
Thompson, D. J.; Bertsch, D. L.; ONeal, R. H., Jr.
2005-01-01
During its nine-year lifetime, the Energetic Gamma Ray Experiment Telescope (EGRET) on the Compton Gamma Ray Observatory (CGRO) detected 1506 cosmic photons with measured energy E>10 GeV. Of this number, 187 are found within 1 deg of sources that are listed in the Third EGRET Catalog and were included in determining the detection likelihood, flux, and spectra of those sources. In particular, five detected EGRET pulsars are found to have events above 10 GeV, and together they account for 37 events. A pulsar not included in the Third EGRET Catalog has 2 events, both with the same phase and in one peak of the lower-energy gamma-ray light curve. Most of the remaining 1319 events appear to be diffuse Galactic and extragalactic radiation, based on the similarity of their spatial and energy distributions with the diffuse model and with the E>100 MeV emission. No significant time clustering that would suggest a burst was detected.
LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.
2016-08-04
The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and data, it also is exposing previously unreachable infrastructure to a worldwide set of actors. Existing solutions for intrusion detection and prevention that are signature-focused typically seek to detect anomalous and/or malicious activity for the sake of preventing or mitigating negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g. rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Similar to cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context. So we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model based approach to recognizing anomalous events in network traffic and present the design of this approach of bio-inspired and statistical models working in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented along with a plugin to support real-world cyber operations.
A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks
Zhang, Shukui; Chen, Hao; Zhu, Qiaoming
2014-01-01
Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers multiple properties that reflect an event's status, a composite event is more consistent with the objective world, which makes research on composite events more practically relevant. In this paper, we analyze the characteristics of composite events; we then propose a criterion to determine the area of a composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of some data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. When the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on event determination. The fuzzy-decision-based composite event judgment mechanism retains the advantages of fuzzy-logic-based algorithms; moreover, it does not require a huge rule base and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690
Pre-trained D-CNN models for detecting complex events in unconstrained videos
NASA Astrophysics Data System (ADS)
Robinson, Joseph P.; Fu, Yun
2016-05-01
Rapid event detection faces an emerging need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by pre-existing methods being complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis on different Convolutional Neural Networks (CNNs), as stand-alone and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architecture, training data, and number of outputs. For each CNN, we use frames sampled at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some interesting facts, finding some combinations to be complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
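A small sketch of the video-level descriptor construction and one-vs-rest SVM training described above: CNN frame features are max-pooled within assumed shot boundaries, the shot descriptors are average-pooled, and a linear SVM is trained per event class. Feature dimensions, shot boundaries, and labels are placeholders.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.multiclass import OneVsRestClassifier

def video_descriptor(frame_feats, shot_boundaries):
    """Max-pool CNN frame features within each shot, then average-pool the
    shot descriptors into a single video-level vector."""
    shots = np.split(frame_feats, shot_boundaries)
    return np.mean([shot.max(axis=0) for shot in shots if len(shot)], axis=0)

# Placeholder data: 10 videos, 1-fps CNN features of dimension 512.
rng = np.random.default_rng(6)
videos = [rng.normal(0, 1, (rng.integers(30, 90), 512)) for _ in range(10)]
labels = rng.integers(0, 3, 10)                       # 3 hypothetical event classes
X = np.vstack([video_descriptor(v, [10, 20]) for v in videos])

clf = OneVsRestClassifier(LinearSVC()).fit(X, labels)
print(clf.predict(X[:2]))
```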
Semantic Context Detection Using Audio Event Fusion
NASA Astrophysics Data System (ADS)
Chu, Wei-Ta; Cheng, Wen-Huang; Wu, Ja-Ling
2006-12-01
Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine (SVM)) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.
A model of human decision making in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1982-01-01
Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
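As a sketch of the discriminant-analysis component, assuming each monitored process yields a small feature vector per sample: a linear discriminant model produces the posterior probability that an event has occurred, which can then drive the attention-allocation decision. The features and the attend-to-most-probable rule are illustrative simplifications.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Training data: samples of a monitored process, labelled as
# "no event" (0) or "event occurred" (1); features are placeholder readings.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

lda = LinearDiscriminantAnalysis().fit(X, y)

# For each process, the posterior probability of an event can drive the
# attention-allocation decision (attend to the most probable event first).
current_obs = np.array([[1.8, 2.1], [0.1, -0.2], [2.5, 1.9]])
p_event = lda.predict_proba(current_obs)[:, 1]
print("attend to process", int(np.argmax(p_event)), "p =", p_event.round(2))
```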
Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil
2010-01-01
We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced and lessons learned in implementing the approach.
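A toy illustration of qualitative event-based fault isolation, not the competition entry itself: observed measurements are compared with model-predicted values, reduced to qualitative deviations (+, -, 0), and matched against a hypothetical fault-signature table; sensors that have not yet deviated are treated as consistent with any signature.

```python
# Hypothetical fault signature table: for each fault, the qualitative
# deviation (+ above prediction, - below, 0 nominal) expected in each sensor.
SIGNATURES = {
    "battery_degraded": {"voltage": "-", "current": "0", "temperature": "+"},
    "relay_stuck_open": {"voltage": "-", "current": "-", "temperature": "0"},
    "sensor_bias":      {"voltage": "+", "current": "0", "temperature": "0"},
}

def qualitative_deviation(measured, predicted, tol=0.05):
    """Reduce a residual to a qualitative symbol using a relative tolerance."""
    if measured > predicted * (1 + tol):
        return "+"
    if measured < predicted * (1 - tol):
        return "-"
    return "0"

def isolate(measured, predicted):
    """Return the faults whose predicted deviation pattern matches the
    observed pattern (sensors still at nominal are treated as undecided)."""
    observed = {k: qualitative_deviation(measured[k], predicted[k]) for k in measured}
    return [f for f, sig in SIGNATURES.items()
            if all(observed[s] in (sig[s], "0") for s in sig)]

print(isolate({"voltage": 22.0, "current": 4.9, "temperature": 41.0},
              {"voltage": 24.0, "current": 5.0, "temperature": 35.0}))
```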
Taking the CCDs to the ultimate performance for low threshold experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haro, Miguel; Moroni, Guillermo; Tiffenberg, Javier
2016-11-14
Scientific grade CCDs show attractive capabilities for the detection of particles with small energy deposition in matter. Their very low threshold of approximately 40 eV and their good spatial reconstruction of the event are key properties for currently running experiments: CONNIE and DAMIC. Both experiments can benefit from any increase in the detection efficiency of nuclear recoils at low energy. In this work we present two different approaches to increase this efficiency by increasing the SNR of events. The first one is based on the reduction of the readout noise of the device, which is the main contribution of uncertainty to the signal measurement. New studies on the electronic noise from the integrated output amplifier and the readout electronics will be presented, together with results of a new configuration showing a lower limit on the readout noise, which can be implemented on the current setup of the CCD-based experiments. A second approach to increase the SNR of low-energy events, also presented here, is the study of the spatial conformation of nuclear recoil events at different depths in the active volume, based on new effects that differ from the expected non-interacting diffusion model of electrons in the semiconductor.
A Neural Network based Early Earthquake Warning model in the California region
NASA Astrophysics Data System (ADS)
Xiao, H.; MacAyeal, D. R.
2016-12-01
Early Earthquake Warning systems could reduce loss of life and other economic impacts resulting from natural disasters or man-made calamities. Current systems could be further enhanced by neural network methods. A 3-layer neural network model combined with an onsite method was deployed in this paper to improve the recognition time and detection time for large-scale earthquakes. The 3-layer neural network early earthquake warning model adopted a vector feature design for sample events that occurred within a 150 km radius of the epicenters. The dataset used in this paper contained both destructive events and small-scale events. All data were extracted from the IRIS database to properly train the model. In the training process, the backpropagation algorithm was used to adjust the weight matrices and bias matrices during each iteration. The information in all three channels of the seismometers served as the source for this model. Designed tests indicated that the model could correctly identify the scale of approximately 90 percent of the events, and the early detection could provide informative evidence for public authorities to make further decisions. This indicates that neural network models have the potential to strengthen current early warning systems, since the onsite method may greatly reduce the response time and save more lives in such disasters.
NASA Astrophysics Data System (ADS)
Ragettli, S.; Zhou, J.; Wang, H.; Liu, C.
2017-12-01
Flash floods in small mountain catchments are one of the most frequent causes of loss of life and property from natural hazards in China. Hydrological models can be a useful tool for the anticipation of these events and the issuing of timely warnings. Since sub-daily streamflow information is unavailable for most small basins in China, one of the main challenges is finding appropriate parameter values for simulating flash floods in ungauged catchments. In this study, we use decision tree learning to explore parameter set transferability between different catchments. For this purpose, the physically-based, semi-distributed rainfall-runoff model PRMS-OMS is set up for 35 catchments in ten Chinese provinces. Hourly data from more than 800 storm runoff events are used to calibrate the model and evaluate the performance of parameter set transfers between catchments. For each catchment, 58 catchment attributes are extracted from several data sets available for the whole of China. We then use a data mining technique (decision tree learning) to identify catchment similarities that can be related to good transfer performance. Finally, we use the splitting rules of decision trees for finding suitable donor catchments for ungauged target catchments. We show that decision tree learning makes optimal use of the information content of available catchment descriptors and outperforms regionalization based on a conventional measure of physiographic-climatic similarity by 15%-20%. Similar performance can be achieved with a regionalization method based on spatial proximity, but decision trees offer flexible rules for selecting suitable donor catchments, without relying on the vicinity of gauged catchments. This flexibility makes the method particularly suitable for implementation in sparsely gauged environments. We evaluate the probability of detecting flood events exceeding a given return period, considering measured discharge and PRMS-OMS simulated flows with regionalized parameters. Overall, the probability of detection of an event with a return period of 10 years is 62%, and 44% of all 10-year flood peaks can be detected with a timing error of 2 hours or less. These results indicate that the modeling system can provide useful information about the timing and magnitude of flood events at ungauged sites.
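A minimal sketch of the decision-tree regionalization idea, under the assumption that each training row describes a donor-target catchment pair by attribute differences and a binary label for whether the parameter transfer was acceptable; the attribute names and data are invented. The learned splitting rules can then rank candidate donors for an ungauged target.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row describes a (donor, target) catchment pair through attribute
# differences (placeholder attributes); the label says whether transferring
# the donor's calibrated parameters gave acceptable flood simulations.
rng = np.random.default_rng(8)
attr_diff = rng.normal(0, 1, (300, 4))       # e.g. differences in area, slope, soil, climate indices
good_transfer = (np.abs(attr_diff[:, 0]) + np.abs(attr_diff[:, 2]) +
                 rng.normal(0, 0.3, 300)) < 1.2

tree = DecisionTreeClassifier(max_depth=3).fit(attr_diff, good_transfer)
print(export_text(tree, feature_names=["d_area", "d_slope", "d_soil", "d_aridity"]))

# For an ungauged target, candidate donors are ranked by the tree's
# predicted probability of a successful parameter transfer.
candidates = rng.normal(0, 1, (5, 4))
print(tree.predict_proba(candidates)[:, 1])
```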
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose-response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point of care or for in-field environmental monitoring.
NASA Astrophysics Data System (ADS)
Nissen, Katrin; Ulbrich, Uwe
2016-04-01
An event based detection algorithm for extreme precipitation is applied to a multi-model ensemble of regional climate model simulations. The algorithm determines extent, location, duration and severity of extreme precipitation events. We assume that precipitation in excess of the local present-day 10-year return value will potentially exceed the capacity of the drainage systems that protect critical infrastructure elements. This assumption is based on legislation for the design of drainage systems which is in place in many European countries. Thus, events exceeding the local 10-year return value are detected. In this study we distinguish between sub-daily events (3 hourly) with high precipitation intensities and long-duration events (1-3 days) with high precipitation amounts. The climate change simulations investigated here were conducted within the EURO-CORDEX framework and exhibit a horizontal resolution of approximately 12.5 km. The period between 1971-2100 forced with observed and scenario (RCP 8.5 and RCP 4.5) greenhouse gas concentrations was analysed. Examined are changes in event frequency, event duration and size. The simulations show an increase in the number of extreme precipitation events for the future climate period over most of the area, which is strongest in Northern Europe. Strength and statistical significance of the signal increase with increasing greenhouse gas concentrations. This work has been conducted within the EU project RAIN (Risk Analysis of Infrastructure Networks in response to extreme weather).
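To illustrate the thresholding step, the sketch below estimates a local 10-year return value from annual maxima with a GEV fit and then groups consecutive exceedances into events with start, end, and peak; this is a single-grid-cell simplification (the spatial extent and multi-duration aspects of the algorithm are omitted) and all data are synthetic.

```python
import numpy as np
from scipy.stats import genextreme

def return_level(annual_maxima, return_period=10):
    """10-year return value from a GEV fit to annual maxima (one grid cell)."""
    shape, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)

def detect_events(precip, threshold):
    """Group consecutive time steps above the local return value into events
    and report (start, end, peak) for each."""
    above = precip > threshold
    events, start = [], None
    for t, flag in enumerate(np.append(above, False)):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            events.append((start, t - 1, float(precip[start:t].max())))
            start = None
    return events

rng = np.random.default_rng(9)
annual_max = rng.gumbel(30, 8, 40)              # placeholder annual maxima (mm / 3 h)
series = rng.gamma(2, 4, 500)                   # placeholder 3-hourly precipitation
series[200:203] = [55, 70, 60]                  # injected extreme episode
thr = return_level(annual_max)
print("threshold:", round(thr, 1), "events:", detect_events(series, thr))
```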
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
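The absolute-quantitation arithmetic behind duplex droplet digital PCR can be shown in a few lines: the fraction of positive droplets is Poisson-corrected to copies per droplet, and the GMO content is the ratio of event-specific to reference-gene copies measured in the same well. The droplet counts below are illustrative only.

```python
import numpy as np

def copies_per_droplet(positive, total):
    """Poisson correction: mean target copies per droplet from the fraction
    of positive droplets."""
    return -np.log(1.0 - positive / total)

def gmo_content(pos_event, pos_reference, total):
    """GMO content as the ratio of event-specific to reference-gene copies,
    both measured in the same duplex reaction."""
    return copies_per_droplet(pos_event, total) / copies_per_droplet(pos_reference, total)

# Usage with illustrative droplet counts from one duplex ddPCR well.
print(round(100 * gmo_content(pos_event=95, pos_reference=9200, total=18000), 2), "%")
```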
Blowing snow detection from ground-based ceilometers: application to East Antarctica
NASA Astrophysics Data System (ADS)
Gossart, Alexandra; Souverijns, Niels; Gorodetskaya, Irina V.; Lhermitte, Stef; Lenaerts, Jan T. M.; Schween, Jan H.; Mangold, Alexander; Laffineur, Quentin; van Lipzig, Nicole P. M.
2017-12-01
Blowing snow impacts Antarctic ice sheet surface mass balance by snow redistribution and sublimation. However, numerical models poorly represent blowing snow processes, while direct observations are limited in space and time. Satellite retrieval of blowing snow is hindered by clouds and only the strongest events are considered. Here, we develop a blowing snow detection (BSD) algorithm for ground-based remote-sensing ceilometers in polar regions and apply it to ceilometers at Neumayer III and Princess Elisabeth (PE) stations, East Antarctica. The algorithm is able to detect (heavy) blowing snow layers reaching 30 m height. Results show that 78 % of the detected events are in agreement with visual observations at Neumayer III station. The BSD algorithm detects heavy blowing snow 36 % of the time at Neumayer (2011-2015) and 13 % at PE station (2010-2016). Blowing snow occurrence peaks during the austral winter and shows around 5 % interannual variability. The BSD algorithm is capable of detecting blowing snow both lifted from the ground and occurring during precipitation, which is an added value since results indicate that 92 % of the blowing snow is during synoptic events, often combined with precipitation. Analysis of atmospheric meteorological variables shows that blowing snow occurrence strongly depends on fresh snow availability in addition to wind speed. This finding challenges the commonly used parametrizations, where the threshold for snow particles to be lifted is a function of wind speed only. Blowing snow occurs predominantly during storms and overcast conditions, shortly after precipitation events, and can reach up to 1300 m a. g. l. in the case of heavy mixed events (precipitation and blowing snow together). These results suggest that synoptic conditions play an important role in generating blowing snow events and that fresh snow availability should be considered in determining the blowing snow onset.
Multilevel analysis of sports video sequences
NASA Astrophysics Data System (ADS)
Han, Jungong; Farin, Dirk; de With, Peter H. N.
2006-01-01
We propose a fully automatic and flexible framework for analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. Our framework can analyze a tennis video sequence at three levels, which provides a broad range of different analysis results. The proposed framework includes novel pixel-level and object-level tennis video processing algorithms, such as moving-player detection that takes both the color and the court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events, like service, base-line rally and net-approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game, (2) the moving trajectory and real speed of each player, as well as the relative position between the player and the court, (3) the semantic event segments in a game. The proposed framework is flexible in choosing the level of analysis that is desired. It is effective because the framework makes use of several visual cues obtained from the real-world domain to model important events like service, thereby increasing the accuracy of the scene-level analysis. The paper presents attractive experimental results highlighting the system efficiency and analysis capabilities.
Towards cross-lingual alerting for bursty epidemic events.
Collier, Nigel
2011-10-06
Online news reports are increasingly becoming a source for event-based early warning systems that detect natural disasters. Harnessing the massive volume of information available from multilingual newswire presents as many challenges as opportunities due to the patterns of reporting complex spatio-temporal events. In this article we study the problem of utilising correlated event reports across languages. We track the evolution of 16 disease outbreaks using 5 temporal aberration detection algorithms on text-mined events classified according to disease and outbreak country. Using ProMED reports as a silver standard, comparative analysis of news data for 13 languages over a 129-day trial period showed improved sensitivity, F1 and timeliness across most models using cross-lingual events. We report a detailed case study analysis for Cholera in Angola 2010 which highlights the challenges faced in correlating news events with the silver standard. The results show that automated health surveillance using multilingual text mining has the potential to turn low value news into high value alerts if informed choices are used to govern the selection of models and data sources. An implementation of the C2 alerting algorithm using multilingual news is available at the BioCaster portal http://born.nii.ac.jp/?page=globalroundup.
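For context, one common formulation of the C2 aberration-detection algorithm mentioned above flags a day whose count exceeds the mean of a 7-day baseline (separated by a 2-day guard band) by three baseline standard deviations; the exact settings used in the BioCaster implementation may differ, and the counts below are invented.

```python
import numpy as np

def c2_alerts(counts, baseline=7, lag=2, threshold=3.0):
    """EARS C2-style aberration detection: flag day t when its count exceeds
    the baseline mean by `threshold` baseline standard deviations, using a
    7-day baseline separated from t by a 2-day guard band."""
    alerts = []
    for t in range(baseline + lag, len(counts)):
        window = counts[t - lag - baseline:t - lag]
        mu, sigma = np.mean(window), max(np.std(window, ddof=1), 1e-6)
        if (counts[t] - mu) / sigma > threshold:
            alerts.append(t)
    return alerts

# Daily counts of text-mined outbreak reports for one disease/country pair.
daily = np.array([2, 3, 1, 2, 4, 2, 3, 2, 1, 3, 2, 2, 3, 14, 18, 4, 3])
print(c2_alerts(daily))   # expect alerts at the spike near the end
```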
Symbolic Processing Combined with Model-Based Reasoning
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.
A neural network model of causative actions.
Lee-Hand, Jeremy; Knott, Alistair
2015-01-01
A common idea in models of action representation is that actions are represented in terms of their perceptual effects (see e.g., Prinz, 1997; Hommel et al., 2001; Sahin et al., 2007; Umiltà et al., 2008; Hommel, 2013). In this paper we extend existing models of effect-based action representations to account for a novel distinction. Some actions bring about effects that are independent events in their own right: for instance, if John smashes a cup, he brings about the event of the cup smashing. Other actions do not bring about such effects. For instance, if John grabs a cup, this action does not cause the cup to "do" anything: a grab action has well-defined perceptual effects, but these are not registered by the perceptual system that detects independent events involving external objects in the world. In our model, effect-based actions are implemented in several distinct neural circuits, which are organized into a hierarchy based on the complexity of their associated perceptual effects. The circuit at the top of this hierarchy is responsible for actions that bring about independently perceivable events. This circuit receives input from the perceptual module that recognizes arbitrary events taking place in the world, and learns movements that reliably cause such events. We assess our model against existing experimental observations about effect-based motor representations, and make some novel experimental predictions. We also consider the possibility that the "causative actions" circuit in our model can be identified with a motor pathway reported in other work, specializing in "functional" actions on manipulable tools (Bub et al., 2008; Binkofski and Buxbaum, 2013).
Modeling and Detection of Ice Particle Accretion in Aircraft Engine Compression Systems
NASA Technical Reports Server (NTRS)
May, Ryan D.; Simon, Donald L.; Guo, Ten-Huei
2012-01-01
The accretion of ice particles in the core of commercial aircraft engines has been an ongoing aviation safety challenge. While no accidents have resulted from this phenomenon to date, numerous engine power loss events ranging from uneventful recoveries to forced landings have been recorded. As a first step to enabling mitigation strategies during ice accretion, a detection scheme must be developed that is capable of being implemented on board modern engines. In this paper, a simple detection scheme is developed and tested using a realistic engine simulation with approximate ice accretion models based on data from a compressor design tool. These accretion models are implemented as modified Low Pressure Compressor maps and have the capability to shift engine performance based on a specified level of ice blockage. Based on results from this model, it is possible to detect the accretion of ice in the engine core by observing shifts in the typical sensed engine outputs. Results are presented in which, for a 0.1 percent false positive rate, a true positive detection rate of 98 percent is achieved.
Bidirectional RNN for Medical Event Detection in Electronic Health Records.
Jagannatha, Abhyuday N; Yu, Hong
2016-06-01
Sequence labeling for extraction of medical events and their attributes from unstructured text in Electronic Health Record (EHR) notes is a key step towards semantic understanding of EHRs. It has important applications in health informatics including pharmacovigilance and drug surveillance. The state-of-the-art supervised machine learning models in this domain are based on Conditional Random Fields (CRFs) with features calculated from fixed context windows. In this application, we explored recurrent neural network frameworks and show that they significantly outperformed the CRF models.
Mining patterns in persistent surveillance systems with smart query and visual analytics
NASA Astrophysics Data System (ADS)
Habibi, Mohammad S.; Shirkhodaie, Amir
2013-05-01
In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps analysts take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. The need for identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require a method for their filtration before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from semantically annotated messages generated by the sensors. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a Temporal-Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) estimated via Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. In this paper, we present a new visual analytic tool for testing and evaluation of group activities detected under this control scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
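The core similarity-matching step mentioned above is classic dynamic time warping. A minimal Python version, shown only as a reference for the general technique (the paper's Temporal-DTW and GMM details differ), is sketched here; variable names are illustrative.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two feature sequences
    a (length n) and b (length m); each row is a per-frame attribute vector."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```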
Prediction of Intensity Change Subsequent to Concentric Eyewall Events
NASA Astrophysics Data System (ADS)
Mauk, Rachel Grant
Concentric eyewall events have been documented numerous times in intense tropical cyclones over the last two decades. During a concentric eyewall event, an outer (secondary) eyewall forms around the inner (primary) eyewall. Improved instrumentation on aircraft and satellites greatly increases the likelihood of detecting an event. Despite the increased ability to detect such events, forecasts of intensity changes during and after these events remain poor. When concentric eyewall events occur near land, accurate intensity change predictions are especially critical to ensure proper emergency preparations and staging of recovery assets. A nineteen-year (1997-2015) database of concentric eyewall events is developed by analyzing microwave satellite imagery, aircraft- and land-based radar, and other published documents. Events are identified in both the North Atlantic and eastern North Pacific basins. TCs are categorized as single (1 event), serial (≥ 2 events), and super-serial (≥ 3 events). Key findings here include distinct spatial patterns for single and serial Atlantic TCs, a broad seasonal distribution for eastern North Pacific TCs, and apparent ENSO-related variability in both basins. The intensity change subsequent to the concentric eyewall event is calculated from the HURDAT2 database at time points relative to the start and to the end of the event. Intensity change is then categorized as Weaken (≤ -10 kt), Maintain (±5 kt), and Strengthen (≥ 10 kt). Environmental conditions in which each event occurred are analyzed based on the SHIPS diagnostic files. Oceanic, dynamic, thermodynamic, and TC status predictors are selected for testing in a multiple discriminant analysis procedure to determine which variables successfully discriminate the intensity change category and the occurrence of additional concentric eyewall events. Intensity models are created for 12 h, 24 h, 36 h, and 48 h after the concentric eyewall events end. Leave-one-out cross validation is performed on each set of discriminators to generate classifications, which are then compared to observations. For each model, the top combinations achieve 80-95% overall accuracy in classifying TCs based on the environmental characteristics, although Maintain systems are frequently misclassified. The third part of this dissertation employs the Weather Research and Forecasting model to further investigate concentric eyewall events. Two serial Atlantic concentric eyewall cases (Katrina 2005 and Wilma 2005) are selected from the original study set, and WRF simulations are performed using several model designs. Despite strong evidence from multiple sources that serial concentric eyewalls formed in both hurricanes, the WRF simulations did not produce identifiable concentric eyewall structures for Katrina, and only transient structures for Wilma. Possible reasons for the lack of concentric eyewall formation are discussed, including model resolution, microphysics, and data sources.
NASA Astrophysics Data System (ADS)
Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.
2015-12-01
Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure, are routinely used to alleviate climate model deficiencies. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in i) climatological variables and ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme based on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
Analyzing and Identifying Teens' Stressful Periods and Stressor Events From a Microblog.
Li, Qi; Xue, Yuanyuan; Zhao, Liang; Jia, Jia; Feng, Ling
2017-09-01
Increased health problems among adolescents caused by psychological stress have aroused worldwide attention. Long-standing stress without targeted assistance and guidance negatively impacts the healthy growth of adolescents, threatening the future development of our society. So far, research has focused on detecting adolescent psychological stress as revealed by individual posts on microblogs. However, beyond stressful moments, identifying teens' stressful periods and the stressor events that trigger each stressful period is more desirable to understand the stress from appearance to essence. In this paper, we define the problem of identifying teens' stressful periods and stressor events from the open social media microblog. Starting from a case study of adolescents' posting behaviors during stressful school events, we build a Poisson-based probability model for the correlation between stressor events and stressful posting behaviors through a series of posts on Tencent Weibo (referred to as the microblog throughout the paper). With the model, we discover teens' maximal stressful periods and further extract details of possible stressor events that cause the stressful periods. We generalize and present the extracted stressor events in a hierarchy based on common stress dimensions and event types. Taking 122 scheduled stressful study-related events in a high school as the ground truth, we test the approach on 124 students' posts from January 1, 2012 to February 1, 2015 and obtain promising experimental results: (stressful periods: recall 0.761, precision 0.737, and F1-measure 0.734) and (top-3 stressor events: recall 0.763, precision 0.756, and F1-measure 0.759). The most prominent stressor events extracted are in the self-cognition domain, followed by the school life domain. This conforms to the adolescent psychological investigation result that problems in school life are usually accompanied by teens' inner cognition problems. Compared with the state-of-the-art top-1 personal life event detection approach, our stressor event detection method is 13.72% higher in precision, 19.18% higher in recall, and 16.50% higher in F1-measure, demonstrating the effectiveness of our proposed framework.
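To convey the flavor of a Poisson-based scoring of posting bursts (a simplified stand-in for the paper's probability model, with hypothetical names), a short Python sketch:

```python
from scipy.stats import poisson

def stressful_period_score(post_counts, baseline_rate):
    """Score each day by how unlikely its stress-related post count is under
    a Poisson baseline fitted to the teen's ordinary posting rate; large
    scores mark candidate stressful periods. Illustrative only."""
    # -log P(X >= c) under the baseline; sf(c-1) = P(X >= c)
    return [-poisson.logsf(c - 1, baseline_rate) for c in post_counts]
```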
Event Detection for Hydrothermal Plumes: A case study at Grotto Vent
NASA Astrophysics Data System (ADS)
Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.
2012-12-01
Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique adapted from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among the many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional change varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however, inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. Still, considerable complexity overlies the "normal" tidal response pattern (the data have a dominant frequency of ~12.9 hours). We are in the process of defining several events of particular scientific interest: 1) transient behavioral changes associated with atmospheric storms, earthquakes, or volcanic intrusions or eruptions, 2) mutual interaction of neighboring plumes on each other's behavior, and 3) rapid shifts in plume direction that indicate the presence of unusual currents or changes in currents. We will query the existing data to see if these relationships are ever observed, as well as test our understanding of the "normal" pattern of response to tidal currents. Figure 1. Arrows indicate plume orientation at a given time (time axis in days after 9/29/10); stars indicate times when orientation changes rapidly.
Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.
Soleimani, Hossein; Hensman, James; Saria, Suchi
2017-08-21
Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
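The abstain-or-decide idea described above can be illustrated with a tiny cost-based rule. This is not the paper's derived optimal policy; the costs and confidence band below are placeholder assumptions.

```python
def decide(p_event, cost_miss=10.0, cost_false_alarm=1.0, band=(0.2, 0.8)):
    """Map an estimated event probability to 'alert', 'no_alert', or 'abstain'.
    Abstain inside a low-confidence band; otherwise alert when the expected
    cost of missing the event exceeds that of a false alarm. Illustrative."""
    lo, hi = band
    if lo < p_event < hi:
        return "abstain"
    return "alert" if p_event * cost_miss > (1 - p_event) * cost_false_alarm else "no_alert"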
Comprehensive Assessment of Models and Events based on Library tools (CAMEL)
NASA Astrophysics Data System (ADS)
Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.
2017-12-01
At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
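The categorical skill scores named above are standard quantities computed from a 2x2 model-vs-observation contingency table. A small Python sketch (not CCMC's implementation; function and variable names are illustrative):

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Probability of Detection, Probability of False Detection, and Heidke
    Skill Score from a 2x2 contingency table of forecast vs. observed events."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    n = h + m + f + c
    pod = h / (h + m)                               # probability of detection
    pofd = f / (f + c)                              # probability of false detection
    expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n   # chance agreement
    heidke = (h + c - expected) / (n - expected)    # Heidke skill score
    return {"POD": pod, "POFD": pofd, "HSS": heidke}
```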
Lin, Yin-Yan; Wu, Hau-Tieng; Hsu, Chi-An; Huang, Po-Chiun; Huang, Yuan-Hao; Lo, Yu-Lun
2016-12-07
Physiologically, the thoracic (THO) and abdominal (ABD) movement signals, captured using wearable piezo-electric bands, provide information about various types of apnea, including central sleep apnea (CSA) and obstructive sleep apnea (OSA). However, the use of piezo-electric wearables in detecting sleep apnea events has been seldom explored in the literature. This study explored the possibility of identifying sleep apnea events, including OSA and CSA, by solely analyzing one or both of the THO and ABD signals. An adaptive non-harmonic model was introduced to model the THO and ABD signals, which allows us to design features for sleep apnea events. To confirm the suitability of the extracted features, a support vector machine was applied to classify three categories: normal and hypopnea, OSA, and CSA. According to a database of 34 subjects, the overall classification accuracies were on average 75.9%±11.7% and 73.8%±4.4%, respectively, based on cross validation. When the features determined from the THO and ABD signals were combined, the overall classification accuracy became 81.8%±9.4%. These features were applied for designing a state machine for online apnea event detection. Two event-by-event accuracy indices, S and I, were proposed for evaluating the performance of the state machine. For the same database, the S index was 84.01%±9.06%, and the I index was 77.21%±19.01%. The results indicate the considerable potential of applying the proposed algorithm to clinical examinations for both screening and homecare purposes.
NASA Astrophysics Data System (ADS)
Meng, X.; Daniels, C.; Smith, E.; Peng, Z.; Chen, X.; Wagner, L. S.; Fischer, K. M.; Hawman, R. B.
2015-12-01
Since 2001, the number of M>3 earthquakes has increased significantly in the Central and Eastern United States (CEUS), likely due to waste-water injection; these are also known as "induced earthquakes" [Ellsworth, 2013]. Induced earthquakes are driven by short-term external forcing and hence may behave like earthquake swarms, which are not well characterized by branching point-process models such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. In this study we focus on the 02/15/2014 M4.1 South Carolina and the 06/16/2014 M4.3 Oklahoma earthquakes, which likely represent intraplate tectonic and induced events, respectively. For the South Carolina event, only one M3.0 aftershock is identified in the ANSS catalog, which may be caused by a lack of low-magnitude events in this catalog. We apply a recently developed matched filter technique to detect earthquakes from 02/08/2014 to 02/22/2014 around the epicentral region. 15 seismic stations (from both permanent and temporary USArray networks) within 100 km of the mainshock are used for detection. The mainshock and aftershock are used as templates for the initial detection. Newly detected events are employed as new templates, and the same detection procedure repeats until no new event can be added. Overall we have identified more than 10 events, including one foreshock that occurred ~11 min before the M4.1 mainshock. However, the number of aftershocks is still much lower than predicted by the modified Bath's law. For the Oklahoma event, we use 1270 events from the ANSS catalog and 182 events from a relocated catalog as templates to scan through continuous recordings from 3 days before to 7 days after the mainshock. 12 seismic stations in the vicinity of the mainshock are included in the study. After obtaining more complete catalogs for both sequences, we plan to compare the statistical parameters (e.g., b, a, K, and p values) between the two sequences, as well as their spatial-temporal migration patterns, which may shed light on the underlying physics of tectonic and induced earthquakes.
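The single-channel core of a matched-filter detector is a sliding normalized cross-correlation of a template waveform against continuous data, with detections declared above a noise-scaled threshold. The Python sketch below illustrates that idea only; real implementations stack correlations over many stations and channels, and the threshold is an assumption.

```python
import numpy as np

def matched_filter(continuous, template, threshold=8.0):
    """Slide a template over continuous data and flag samples where the
    normalized cross-correlation exceeds `threshold` times the median
    absolute deviation (MAD) of the correlation trace. Single-trace sketch."""
    nt = len(template)
    t = template - template.mean()
    t = t / (np.linalg.norm(t) + 1e-12)
    cc = np.empty(len(continuous) - nt + 1)
    for i in range(len(cc)):
        w = continuous[i:i + nt]
        w = w - w.mean()
        cc[i] = np.dot(w, t) / (np.linalg.norm(w) + 1e-12)
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc > threshold * mad)[0], cc   # detection indices, CC trace
```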
Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David
2005-09-01
When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure" error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provides patient location data that are robust and can be compared with an "expected process" model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR" events. The system detected all "wrong-operating room (OR)" events, and all "wrong-OR" locations were correctly assigned within 0.50 ± 0.28 minutes (mean ± SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
Observational evidence of predawn plasma bubble and its irregularity scales in Southeast Asia
NASA Astrophysics Data System (ADS)
Watthanasangmechai, K.; Tsunoda, R. T.; Yokoyama, T.; Ishii, M.; Tsugawa, T.
2016-12-01
This paper describes an event of deep plasma depletion simultaneously detected with GPS, a GNU Radio Beacon Receiver (GRBR), and in situ satellite measurements from DMSP F15. The event occurred on March 7, 2012, at 04:30 LT under geomagnetically quiet conditions. Such a sharp depletion at the plasma bubble wall detected at predawn is an interesting but apparently rare event; only one such event is found in the entire March 2012 dataset. The inside structure of the predawn plasma bubble was clearly captured by DMSP F15 and the ground-based GRBR. The envelope structure seen in the processed GPS-TEC appears as a cluster. The observed cluster is interpreted as structure at the west wall of an upwelling of the large-scale wave structure, which is accompanied by fifty- and thousand-km scales. This event is consistent with the plasma bubble structure simulated by the high-resolution bubble (HIRB) model.
Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes
NASA Astrophysics Data System (ADS)
Rosenfeld, Wenjamin; Burchardt, Daniel; Garthoff, Robert; Redeker, Kai; Ortegel, Norbert; Rau, Markus; Weinfurter, Harald
2017-07-01
An experimental test of Bell's inequality allows ruling out any local-realistic description of nature by measuring correlations between distant systems. While such tests are conceptually simple, there are strict requirements concerning the detection efficiency of the involved measurements, as well as the enforcement of spacelike separation between the measurement events. Only very recently could both loopholes be closed simultaneously. Here we present a statistically significant, event-ready Bell test based on combining heralded entanglement of atoms separated by 398 m with fast and efficient measurements of the atomic spin states closing essential loopholes. We obtain a violation with S = 2.221 ± 0.033 (compared to the maximal value of 2 achievable with models based on local hidden variables), which allows us to refute the hypothesis of local realism with a significance level P < 2.57 × 10⁻⁹.
Hierarchical structure for audio-video based semantic classification of sports video sequences
NASA Astrophysics Data System (ADS)
Kolekar, M. H.; Sengupta, S.
2005-07-01
A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared to event classification in other games, that of cricket is very challenging and as yet unexplored. We have successfully solved the cricket video classification problem using a six-level hierarchical structure. The first level performs event detection based on audio energy and the Zero Crossing Rate (ZCR) of the short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP), with color or motion as the likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. Our proposed hierarchical structure can easily be applied to any other sport. Our results are very promising and we have moved a step forward towards addressing semantic classification problems in general.
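The first-level audio features mentioned above (short-time energy and zero-crossing rate) are simple to compute. A hedged Python sketch with illustrative frame sizes, not the authors' exact parameters:

```python
import numpy as np

def frame_energy_zcr(signal, frame_len=1024, hop=512):
    """Short-time energy and zero-crossing rate per frame of an audio signal."""
    energies, zcrs = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energies.append(float(np.sum(frame ** 2)))               # frame energy
        zcrs.append(float(np.mean(np.abs(np.diff(np.sign(frame))) > 0)))  # sign changes per sample
    return np.array(energies), np.array(zcrs)
```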
Event detection in an assisted living environment.
Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin
2011-01-01
This paper presents the design of a wireless event detection and in-building location awareness system. The system architecture is based on using a body-worn sensor to detect events such as falls where they occur in an assisted living environment. This process involves developing event detection algorithms and transmitting detected events wirelessly to an in-house network based on the 802.15.4 protocol. The network would then generate alerts both in the assisted living facility and remotely to an offsite monitoring facility. The focus of this paper is on the design of the system architecture and the compliance challenges in applying this technology.
Detecting Social Desirability Bias Using Factor Mixture Models
ERIC Educational Resources Information Center
Leite, Walter L.; Cooper, Lou Ann
2010-01-01
Based on the conceptualization that social desirability bias (SDB) is a discrete event resulting from an interaction between a scale's items, the testing situation, and the respondent's latent trait on a social desirability factor, we present a method that makes use of factor mixture models to identify which examinees are most likely to provide…
NASA Astrophysics Data System (ADS)
Gavrishchaka, Valeriy V.; Kovbasinskaya, Maria; Monina, Maria
2008-11-01
Novelty detection is a very desirable additional feature of any practical classification or forecasting system. Novelty and rare pattern detection is the main objective in such applications as fault/abnormality discovery in complex technical and biological systems, fraud detection, and risk management in the financial and insurance industries. Although many interdisciplinary approaches for rare event modeling and novelty detection have been proposed, significant data incompleteness due to the nature of the problem makes it difficult to find a universal solution. An even more challenging and much less formalized problem is novelty detection in complex strategies and models, where practical performance criteria are usually multi-objective and the best state-of-the-art solution is often not known due to the complexity of the task and/or the proprietary nature of the application area. For example, it is much more difficult to detect a series of small insider trading or other illegal transactions mixed with valid operations and distributed over a long time period according to a well-designed strategy than a single, large fraudulent transaction. Recently proposed boosting-based optimization was shown to be an effective generic tool for the discovery of stable multi-component strategies/models from existing parsimonious base strategies/models in financial and other applications. Here we outline how the same framework can be used for novelty and fraud detection in complex strategies and models.
Discovering Event Structure in Continuous Narrative Perception and Memory.
Baldassano, Christopher; Chen, Janice; Zadbood, Asieh; Pillow, Jonathan W; Hasson, Uri; Norman, Kenneth A
2017-08-02
During realistic, continuous perception, humans automatically segment experiences into discrete events. Using a novel model of cortical event dynamics, we investigate how cortical structures generate event representations during narrative perception and how these events are stored to and retrieved from memory. Our data-driven approach allows us to detect event boundaries as shifts between stable patterns of brain activity without relying on stimulus annotations and reveals a nested hierarchy from short events in sensory regions to long events in high-order areas (including angular gyrus and posterior medial cortex), which represent abstract, multimodal situation models. High-order event boundaries are coupled to increases in hippocampal activity, which predict pattern reinstatement during later free recall. These areas also show evidence of anticipatory reinstatement as subjects listen to a familiar narrative. Based on these results, we propose that brain activity is naturally structured into nested events, which form the basis of long-term memory representations. Copyright © 2017 Elsevier Inc. All rights reserved.
High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.
Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue
2010-11-13
Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when the drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
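For orientation, the PRR for a (drug, event) pair is (a/(a+b)) / (c/(c+d)), where a and b count reports for the drug with and without the event, and c and d count reports for all other drugs. The single-process Python sketch below mimics the map and reduce steps locally; it is a toy stand-in for the paper's MapReduce job, with illustrative names and a simplified one-pair-per-report assumption.

```python
from collections import defaultdict

def map_reports(reports):
    """Map step: emit ((drug, event), 1) for each report line."""
    for drug, event in reports:
        yield (drug, event), 1

def reduce_counts(mapped):
    """Reduce step: aggregate counts per (drug, event) key."""
    counts = defaultdict(int)
    for key, v in mapped:
        counts[key] += v
    return counts

def prr(counts, drug, event):
    """Proportional Reporting Ratio from the aggregated counts."""
    a = counts[(drug, event)]
    b = sum(v for (d, e), v in counts.items() if d == drug and e != event)
    c = sum(v for (d, e), v in counts.items() if d != drug and e == event)
    d_ = sum(v for (d, e), v in counts.items() if d != drug and e != event)
    return (a / (a + b)) / (c / (c + d_))

# usage: prr(reduce_counts(map_reports(report_pairs)), "drugX", "eventY")
```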
NASA Astrophysics Data System (ADS)
Ginsberg, Mark D.; Smith, Eddy D.; VanBlaricum, Vicki; Hock, Vincent F.; Kroll, Dan; Russell, Kevin J.
2010-04-01
Both real events and models have proven that drinking water systems are vulnerable to deliberate and/or accidental contamination. Additionally, homeland security initiatives and modeling efforts have determined that it is relatively easy to orchestrate the contamination of potable water supplies. Such contamination can be accomplished with classic and non-traditional chemical agents, toxic industrial chemicals (TICs), and/or toxic industrial materials (TIMs). Subsequent research and testing has developed a proven network for detection and response to these threats. The method uses off-the-shelf, broad-spectrum analytical instruments coupled with advanced interpretive algorithms. The system detects and characterizes any backflow events involving toxic contaminants by employing unique chemical signature (fingerprint) response data. This instrumentation has been certified by the Office of Homeland Security for detecting deliberate and/or accidental contamination of critical water infrastructure. The system involves integration of several mature technologies (sensors, SCADA, dynamic models, and the HACH HST Guardian Blue instrumentation) into a complete, real-time management system that can also be used to address other water distribution concerns, such as corrosion. This paper summarizes the reasons and results for installing such a distribution-based detection and protection system.
Bates, Jonathan; Parzynski, Craig S; Dhruva, Sanket S; Coppi, Andreas; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Shaw, Richard E; Warner, Frederick; Krumholz, Harlan M; Ross, Joseph S
2018-06-12
To estimate the medical device utilization needed to detect safety differences among implantable cardioverter-defibrillator (ICD) generator models and compare these estimates to utilization in practice. We conducted repeated sample size estimates to calculate the medical device utilization needed, systematically varying device-specific safety event rate ratios and significance levels while maintaining 80% power, testing 3 average adverse event rates (3.9, 6.1, and 12.6 events per 100 person-years) estimated from the American College of Cardiology's 2006 to 2010 National Cardiovascular Data Registry of ICDs. We then compared these estimates with actual medical device utilization. At significance level 0.05 and 80% power, 34% or fewer ICD models accrued sufficient utilization in practice to detect safety differences for rate ratios <1.15 and an average event rate of 12.6 events per 100 person-years. For average event rates of 3.9 and 12.6 events per 100 person-years, 30% and 50% of ICD models, respectively, accrued sufficient utilization for a rate ratio of 1.25, whereas 52% and 67% did so for a rate ratio of 1.50. Because actual ICD utilization was not uniformly distributed across ICD models, the proportion of individuals receiving any ICD that accrued sufficient utilization in practice was 0% to 21%, 32% to 70%, and 67% to 84% for rate ratios of 1.05, 1.15, and 1.25, respectively, for the range of 3 average adverse event rates. Small safety differences among ICD generator models are unlikely to be detected through routine surveillance given current ICD utilization in practice, but large safety differences can be detected for most patients at anticipated average adverse event rates. Copyright © 2018 John Wiley & Sons, Ltd.
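A generic two-sample Poisson calculation gives a feel for the exposure needed to detect a given rate ratio. The sketch below uses the common normal approximation for the log rate ratio (Var(log RR) ≈ 1/E1 + 1/E2, with E the expected event count in each arm); it is not the paper's exact repeated sample-size procedure, and the function name and defaults are assumptions.

```python
from math import log
from scipy.stats import norm

def person_years_needed(rate_ratio, base_rate, alpha=0.05, power=0.80):
    """Approximate person-years per group needed to detect a rate ratio
    between two Poisson event rates with equal exposure per group.
    base_rate is events per person-year (e.g. 0.126 for 12.6 per 100 py)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    r1, r2 = base_rate, base_rate * rate_ratio
    return z ** 2 * (1 / r1 + 1 / r2) / log(rate_ratio) ** 2

# e.g. person_years_needed(1.25, 0.126) for a 25% rate increase at 12.6/100 py
```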
Event attribution using data assimilation in an intermediate complexity atmospheric model
NASA Astrophysics Data System (ADS)
Metref, Sammy; Hannart, Alexis; Ruiz, Juan; Carrassi, Alberto; Bocquet, Marc; Ghil, Michael
2016-04-01
A new approach, coined DADA (Data Assimilation for Detection and Attribution), has been recently introduced by Hannart et al. 2015 and is potentially useful for near-real-time, systematic causal attribution of weather and climate-related events. The method is purposely designed to allow its operability at meteorological centers by synergizing causal attribution with Data Assimilation (DA) methods usually designed to deal with large nonlinear models. In Hannart et al. 2015, the DADA proposal is illustrated in the context of a low-order nonlinear model (forced three-variable Lorenz model) that is of course not realistic enough to represent the events considered. As a continuation of this stream of work, we therefore propose an implementation of the DADA approach in a realistic intermediate complexity atmospheric model (ICTP AGCM, nicknamed SPEEDY). The SPEEDY model is based on a spectral dynamical core developed at the Geophysical Fluid Dynamics Laboratory (see Held and Suarez 1994). It is a hydrostatic, σ-coordinate, spectral-transform model in the vorticity-divergence form described by Bourke (1974). A synthetic dataset of observations of an extreme precipitation event over Southeastern South America is extracted from a long SPEEDY simulation under present climatic conditions (i.e. factual conditions). Then, following the DADA approach, observations of this event are assimilated twice in the SPEEDY model: first in the factual configuration of the model and second under its counterfactual, pre-industrial configuration. We show that attribution can be performed based on the likelihood ratio as in Hannart et al. 2015, but we further extend this result by showing that the likelihood can be split in space, time and variables in order to help identify the specific physical features of the event that bear the causal signature. References: Hannart A., A. Carrassi, M. Bocquet, M. Ghil, P. Naveau, M. Pulido, J. Ruiz, P. Tandeo (2015) DADA: Data assimilation for the detection and attribution of weather and climate-related events, Climatic Change, (in press). Held I. M. and M. J. Suarez, (1994): A Proposal for the Intercomparison of the Dynamical Cores of Atmospheric General Circulation Models. Bull. Amer. Meteor. Soc., 75, 1825-1830. Bourke W. (1972): A multi-level spectral model. I. Formulation and hemispheric integrations. Mon. Wea. Rev., 102, 687-701.
Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong
2013-01-01
As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the Hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. Similar to Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the Hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
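To illustrate the general mechanics (hypergeometric null likelihood per candidate zone plus a Monte Carlo significance test), the Python toy below scans only single regions rather than circular windows of many sizes; it is not the authors' statistic, and all names and defaults are assumptions.

```python
import numpy as np
from scipy.stats import hypergeom

def scan_hypergeometric(cases, population, n_sim=999, seed=0):
    """Toy scan: with C total cases among N people, the cases in a zone of
    size n_z follow Hypergeometric(N, C, n_z) under the null. The most
    unusual zone minimizes the null log-likelihood; significance comes from
    Monte Carlo randomization of cases over the population."""
    rng = np.random.default_rng(seed)
    cases, population = np.asarray(cases), np.asarray(population)
    N, C = int(population.sum()), int(cases.sum())

    def statistic(c):
        ll = hypergeom.logpmf(c, N, C, population)   # null log-likelihood per zone
        return float(ll.min()), int(np.argmin(ll))

    obs_stat, obs_zone = statistic(cases)
    sims = [statistic(rng.multivariate_hypergeometric(population, C))[0]
            for _ in range(n_sim)]
    p_value = (1 + sum(s <= obs_stat for s in sims)) / (n_sim + 1)
    return obs_zone, p_value
```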
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao
2006-12-01
We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and follow-up decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive follow-up measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled follow-up decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
Syndromic surveillance system based on near real-time cattle mortality monitoring.
Torres, G; Ciaravino, V; Ascaso, S; Flores, V; Romero, L; Simón, F
2015-05-01
Early detection of an infectious disease incursion will minimize the impact of outbreaks in livestock. Syndromic surveillance based on the analysis of readily available data can enhance traditional surveillance systems and allow veterinary authorities to react in a timely manner. This study was based on monitoring the number of cattle carcasses sent for rendering in the veterinary unit of Talavera de la Reina (Spain). The aim was to develop a system to detect deviations from expected values which would signal unexpected health events. Historical weekly collected dead cattle (WCDC) time series, stabilized by the Box-Cox transformation and adjusted by the least squares method, were used to build a univariate cyclic regression model based on a Fourier transformation. Three different models, according to type of production system, were built to estimate the baseline expected number of WCDC. Two types of risk signals were generated: point risk signals, when the observed value was greater than the upper 95% confidence interval of the expected baseline, and cumulative risk signals, generated by a modified cumulative sum algorithm when the cumulative sum of reported deaths was above the cumulative sum of expected deaths. Data from 2011 were used to prospectively validate the model, generating seven risk signals. None of them were correlated with infectious disease events, but some coincided in time with very high climatic temperatures recorded in the region. The harvest effect was also observed during the first week of the study year. Establishing appropriate risk signal thresholds is a limiting factor of predictive models; thresholds need to be adjusted based on experience gained during the use of the models. To increase the sensitivity and specificity of the predictions, epidemiological interpretation of non-specific risk signals should be complemented by other sources of information. The methodology developed in this study can enhance other existing early detection surveillance systems. Syndromic surveillance based on mortality monitoring can reduce the detection time for certain disease outbreaks associated with mild mortality only detected at regional level. The methodology can be adapted to monitor other parameters routinely collected at farm level which can be influenced by communicable diseases. Copyright © 2015 Elsevier B.V. All rights reserved.
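The two signal types described above can be illustrated with a seasonal least-squares baseline plus a one-sided CUSUM on the excess mortality. This Python sketch is a simplified stand-in for the paper's Box-Cox-stabilized cyclic regression and modified CUSUM; parameters and names are illustrative.

```python
import numpy as np

def weekly_baseline_and_cusum(counts, period=52, k=0.5):
    """Fit a Fourier (seasonal) baseline to weekly carcass counts by least
    squares, then run a one-sided CUSUM on the standardized excess."""
    counts = np.asarray(counts, dtype=float)
    t = np.arange(len(counts))
    X = np.column_stack([np.ones_like(t, dtype=float),
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    beta, *_ = np.linalg.lstsq(X, counts, rcond=None)
    expected = X @ beta
    resid = counts - expected
    sigma = resid.std()
    cusum = np.zeros(len(counts))
    for i in range(1, len(counts)):
        cusum[i] = max(0.0, cusum[i - 1] + resid[i] / sigma - k)  # cumulative risk
    point_signals = counts > expected + 1.96 * sigma              # pointwise risk
    return expected, cusum, point_signals
```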
Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling
Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren
2014-01-01
Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
Hierarchical Context Modeling for Video Event Recognition.
Wang, Xiaoyang; Ji, Qiang
2016-10-11
Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduce a context-augmented video event recognition approach. Specifically, we explicitly capture different types of contexts from three levels: the image level, the semantic level, and the prior level. At the image level, we introduce two types of contextual features, appearance context features and interaction context features, to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on a deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts: scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at the different levels. Through the hierarchical context model, contexts at different levels jointly contribute to the event recognition. We evaluate the hierarchical context model for event recognition on benchmark surveillance video datasets. Results show that incorporating contexts at each level can improve event recognition performance, and jointly integrating the three levels of contexts through our hierarchical model achieves the best performance.
Detecting and Locating Seismic Events Without Phase Picks or Velocity Models
NASA Astrophysics Data System (ADS)
Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.
2015-12-01
The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...
2017-07-14
Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of their drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on the Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios, eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
Lee, Young-Sook; Chung, Wan-Young
2012-01-01
Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step for developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Abnormal event detection based on visual sensors, using shape feature variation and 3-D trajectory, is presented to overcome the low fall detection rate. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward falls, backward falls, and sideways falls, from normal activities. PMID:22368486
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.; Rowe, C. A.
2016-12-01
Unlike other subduction zones, the Cascadia subduction zone (CSZ) is notable for the absence of detected and located small and moderate magnitude interplate earthquakes, despite the presence of recurring episodic tremor and slip (ETS) downdip and evidence of pre-historic great earthquakes. Thermal and geodetic models indicate that the seismogenic zone exists primarily, if not entirely, offshore; therefore the perceived unusual seismic quiescence may be a consequence of seismic source location in relation to land-based seismometers. The Cascadia Initiative (CI) amphibious community seismic experiment includes ocean bottom seismometers (OBS) deployed directly above the presumed locked seismogenic zone. We use the CI dataset to search for small magnitude interplate earthquakes previously undetected using the on-land sensors alone. We implement subspace detection to search for small earthquakes. We build our subspace with template events from existing earthquake catalogs that appear to have occurred on the plate interface, windowing waveforms on CI OBS and land seismometers. Although our efforts will target the entire CSZ margin and full 4-year CI deployment, here we focus on a previously identified cluster off the coast of Oregon, related to a subducting seamount. During the first year of CI deployment, this target area yields 293 unique detections with 86 well-located events. Thirty-two of these events occurred within the seamount cluster, and 13 events were located in another cluster to the northwest of the seamount. Events within the seamount cluster are separated into those whose depths place them on the plate interface, and a shallower set (~5 km depth). These separate event groups track together temporally, and seem to agree with a model of seamount subduction that creates extensive fracturing around the seamount, rather than stress concentrated at the seamount-plate boundary. During CI year 2, this target area yields >1000 additional event detections.
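In subspace detection, a low-dimensional basis built from aligned template waveforms replaces a single template, and detections are declared where a sliding window's energy is largely captured by that basis. The single-channel Python sketch below illustrates the idea; it is not the authors' multi-station implementation, and the dimension and names are assumptions.

```python
import numpy as np

def build_subspace(templates, dim=3):
    """Orthonormal basis spanning the dominant variation of aligned template
    waveforms (one template per row), via SVD."""
    T = np.asarray(templates, dtype=float)
    T = T - T.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(T, full_matrices=False)
    return vt[:dim]                                  # shape (dim, n_samples)

def subspace_statistic(continuous, basis):
    """Fraction of energy in each sliding window captured by the subspace;
    values near 1 indicate a waveform similar to the template family."""
    nt = basis.shape[1]
    stats = np.empty(len(continuous) - nt + 1)
    for i in range(len(stats)):
        w = continuous[i:i + nt]
        w = w - w.mean()
        proj = basis @ w
        stats[i] = np.dot(proj, proj) / (np.dot(w, w) + 1e-12)
    return stats
```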
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
Lawhern, Vernon; Hairston, W. David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169
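DETECT itself is a MATLAB toolbox; as a language-agnostic illustration of the same windowed train-then-scan workflow, the Python sketch below trains a simple discriminant classifier on labelled feature windows and then flags event intervals in a continuous recording. The features, classifier, window sizes, and injected artifact are hypothetical stand-ins for the toolbox's user-supplied choices.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def window_features(x, win, step):
    """Simple per-window features (mean, std, peak-to-peak) for one channel."""
    starts = np.arange(0, len(x) - win + 1, step)
    return starts, np.array([[x[s:s + win].mean(),
                              x[s:s + win].std(),
                              np.ptp(x[s:s + win])] for s in starts])

# Hypothetical training data: labelled windows of "baseline" vs "artifact".
rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, (200, 3))
artifact = rng.normal(0, 1, (200, 3)) + [2.0, 1.5, 4.0]
X = np.vstack([baseline, artifact])
y = np.array([0] * 200 + [1] * 200)
clf = LinearDiscriminantAnalysis().fit(X, y)

# Scan a continuous recording and report windows labelled as events.
signal = rng.normal(0, 1, 5000)
signal[2000:2200] += 5.0                      # injected artifact
starts, feats = window_features(signal, win=200, step=50)
labels = clf.predict(feats)
events = [(s, s + 200) for s, lab in zip(starts, labels) if lab == 1]
```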
NASA Astrophysics Data System (ADS)
Bainbridge, S.
2012-04-01
The advent of new observing systems, such as sensor networks, has dramatically increased our ability to collect marine data; the issue now is not data drought but data deluge. The challenge now is to extract data representing events of interest from the background data, that is, how to deliver information and potentially knowledge from an increasingly large store of base observations. Given that each potential user will have differing definitions of 'interesting' and that this is often defined by other events and data, systems need to deliver information or knowledge in a form and context defined by the user. This paper reports on a series of coral reef sensor networks set up under the Coral Reef Environmental Observation Network (CREON). CREON is a community of interest group deploying coral reef sensor networks with the goal of increasing capacity in coral reef observation, especially in developing areas. Issues such as coral bleaching, terrestrial runoff, human impacts and climate change are impacting reefs, with one assessment indicating that a quarter of the world's reefs are severely degraded and another quarter is under immediate threat. Increasing our ability to collect scientifically valid observations is fundamental to understanding these systems and ultimately to preserving and sustaining them. A cloud based data management system was used to store the base sensor data from each agency involved, using service based agents to push the data from individual field sensors to the cloud. The system supports a range of service based outputs such as on-line graphs, a smart-phone application and simple event detection. A more complex event detection system was written that takes input from the cloud services and outputs natural language 'tweets' to Twitter as events occur. It therefore becomes possible to distil the entire data set down to a series of Twitter entries that interested parties can subscribe to. The next step is to allow users to define their own events and to deliver results, in context, to their preferred medium. The paper contrasts what has been achieved within a small community with well defined issues with what it would take to build equivalent systems to hold a wide range of cross community observational data addressing a wider range of potential issues. The roles of discoverability, quality control, uncertainty, conformity and metadata are investigated, along with a brief discussion of existing and emerging standards in this area. The elements of such a system are described, along with the role of modelling and scenario tools in delivering a higher level of outputs linking what may have already occurred (event detection) with what may potentially occur (scenarios). The development of service based cloud computing open data systems coupled with complex event detection systems delivering through social media and other channels linked into model and scenario systems represents one vision for delivering value from the increasing store of ocean observations, most of which lie unknown, unused and unloved.
NASA Astrophysics Data System (ADS)
Dannemann, F. K.; Park, J.; Marcillo, O. E.; Blom, P. S.; Stump, B. W.; Hayward, C.
2016-12-01
Data from five infrasound arrays in the western US jointly operated by University of Utah Seismograph Station and Southern Methodist University are used to test a database-centric processing pipeline, InfraPy, for automated event detection, association and location. Infrasonic array data from a one-year time period (January 1, 2012 to December 31, 2012) are used. This study focuses on the identification and location of 53 ground-truth verified events produced from near surface military explosions at the Utah Test and Training Range (UTTR). Signals are detected using an adaptive F-detector, which accounts for correlated and uncorrelated time-varying noise in order to reduce false detections due to the presence of coherent noise. Variations in detection azimuth and correlation are found to be consistent with seasonal changes in atmospheric winds. The Bayesian infrasonic source location (BISL) method is used to produce source location and time credibility contours based on posterior probability density functions. Updates to the previous BISL methodology include the application of celerity range and azimuth deviation distributions in order to accurately account for the spatial and temporal variability of infrasound propagation through the atmosphere. These priors are estimated by ray tracing through Ground-to-Space (G2S) atmospheric models as a function of season and time of day using historic atmospheric characterizations from 2007 to 2013. Out of the 53 events, 31 are successfully located using the InfraPy pipeline. Confidence contour areas for maximum a posteriori event locations produce error estimates that are reduced by a maximum of 98% and an average of 25% relative to location estimates utilizing a simple time-independent uniform atmosphere. We compare real-time ray tracing results with the statistical atmospheric priors used in this study to examine large time differences between known origin times and estimated origin times that might be due to the misidentification of infrasonic phases. This work provides an opportunity to improve atmospheric model predictions by understanding atmospheric variability at the station level.
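For intuition on the detection stage, the sketch below computes the classical array F statistic (beam power over residual power) in sliding windows on channels already aligned toward a trial back-azimuth; under uncorrelated noise the statistic is near 1, and a coherent arrival pushes it above a threshold. The adaptive correction for correlated, time-varying noise used in InfraPy is omitted, and the window length and threshold are illustrative assumptions.

```python
import numpy as np

def array_f_statistic(x):
    """Fisher detection statistic for an N-channel array window (sketch).

    x : array (n_channels, n_samples), channels already time-aligned
        (delay-and-sum steered) toward a trial back-azimuth/trace velocity.
    Under uncorrelated noise the statistic is ~1; a coherent signal across
    the array drives it well above 1.
    """
    n, _ = x.shape
    beam = x.mean(axis=0)
    residual = ((x - beam) ** 2).sum()
    return n * (n - 1) * (beam ** 2).sum() / residual

def detect(x, win, step, threshold=4.0):
    """Slide a window over the aligned array data and flag high-F windows."""
    hits = []
    for s in range(0, x.shape[1] - win + 1, step):
        f = array_f_statistic(x[:, s:s + win])
        if f > threshold:
            hits.append((s, f))
    return hits
```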
NASA Astrophysics Data System (ADS)
Schweier, C.; Markus, M.; Steinle, E.
2004-04-01
Catastrophic events like strong earthquakes can cause large losses of life and economic value. An increase in the efficiency of reconnaissance techniques could help to reduce the losses in life, as many victims die after and not during the event. A basic prerequisite to improve the rescue teams' work is an improved planning of the measures. This can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, as most losses occur there. In this paper the approach for a damage analysis of buildings will be presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e. digital surface models (DSM) acquired through scanning of an area with pulsed laser light. To date, there are no laserscanning-derived DSMs available to the authors that were taken of areas that suffered damage from earthquakes. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper two different methodologies used for simulating the data will be presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially so that they appear to have suffered serious damage. Then, a laserscanning data set is simulated based on these models, which can be compared with real laserscanning data acquired of the buildings (in their intact state). The other approach is to use measurements of actual damaged buildings and simulate their intact state. It is possible to model the geometrical structure of these damaged buildings based on digital photography taken after the event by evaluating the images with photogrammetrical methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
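To make the pre-/post-event model comparison concrete, the sketch below differences two DSM grids over one building footprint and applies coarse thresholds to label the change. The threshold values and the three damage classes are purely illustrative assumptions, not the classification scheme of the paper.

```python
import numpy as np

def classify_damage(dsm_pre, dsm_post, footprint_mask, drop_threshold=1.5):
    """Compare pre- and post-event digital surface models over one building (sketch).

    dsm_pre, dsm_post : 2-D arrays of surface heights on the same grid (metres)
    footprint_mask    : boolean array marking the building footprint
    drop_threshold    : height loss (m) treated as significant change (illustrative)
    """
    diff = dsm_post - dsm_pre
    cells = diff[footprint_mask]
    dropped = (cells < -drop_threshold).mean()          # fraction of footprint lowered
    mean_drop = -cells[cells < -drop_threshold].mean() if dropped > 0 else 0.0

    # Very coarse damage-type heuristic (illustrative thresholds only).
    if dropped < 0.1:
        return "intact", dropped, mean_drop
    if dropped > 0.8:
        return "total collapse", dropped, mean_drop
    return "partial collapse", dropped, mean_drop
```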
Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina
Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo
2013-01-01
Background: To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology: To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings: In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities. Conclusions/Significance: The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586
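As a rough sketch of the space-time permutation scan idea, the code below evaluates, for each laboratory, recent time windows of isolate counts against their expectation under a permutation null that fixes weekly and laboratory totals, and returns a Monte Carlo p-value for the most anomalous cylinder. It scans single laboratories only and omits the spatial aggregation of neighbouring sites, recurrence intervals, and taxonomic stratification of the real WHONET-SaTScan analysis; all parameters are illustrative.

```python
import numpy as np

def stp_scan(counts, max_window=4, n_perm=499, rng=np.random.default_rng(1)):
    """Simplified space-time permutation scan (sketch).

    counts : integer array (n_locations, n_weeks) of isolate counts per lab/week.
    """
    def best_llr(c):
        total = c.sum()
        row = c.sum(axis=1, keepdims=True)
        col = c.sum(axis=0, keepdims=True)
        best = 0.0
        for w in range(1, max_window + 1):
            obs = c[:, -w:].sum(axis=1)
            exp = (row[:, 0] * col[0, -w:].sum()) / total
            with np.errstate(divide="ignore", invalid="ignore"):
                llr = np.where(obs > exp,
                               obs * np.log(obs / exp) +
                               (total - obs) * np.log((total - obs) / (total - exp)),
                               0.0)
            best = max(best, float(np.nanmax(llr)))
        return best

    observed = best_llr(counts)
    # Permutation null: reshuffle the week label of every individual case,
    # keeping location and week marginal totals fixed.
    locs = np.repeat(np.arange(counts.shape[0]), counts.sum(axis=1))
    weeks = np.concatenate([np.repeat(np.arange(counts.shape[1]), counts[i])
                            for i in range(counts.shape[0])])
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(weeks)
        c = np.zeros_like(counts)
        np.add.at(c, (locs, perm), 1)
        if best_llr(c) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)
```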
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.; Jones, Scott M.
2017-01-01
Aircraft flying in regions of high ice crystal concentrations are susceptible to the buildup of ice within the compression system of their gas turbine engines. This ice buildup can restrict engine airflow and cause an uncommanded loss of thrust, also known as engine rollback, which poses a potential safety hazard. The aviation community is conducting research to understand this phenomenon, and to identify avoidance and mitigation strategies to address the concern. To support this research, a dynamic turbofan engine model has been created to enable the development and evaluation of engine icing detection and control-based mitigation strategies. This model captures the dynamic engine response due to high ice water ingestion and the buildup of ice blockage in the engine's low pressure compressor. It includes a fuel control system allowing engine closed-loop control effects during engine icing events to be emulated. The model also includes bleed air valve and horsepower extraction actuators that, when modulated, change overall engine operating performance. This system-level model has been developed and compared against test data acquired from an aircraft turbofan engine undergoing engine icing studies in an altitude test facility and also against outputs from the manufacturer's customer deck. This paper will describe the model and show results of its dynamic response under open-loop and closed-loop control operating scenarios in the presence of ice blockage buildup compared against engine test cell data. Planned follow-on use of the model for the development and evaluation of icing detection and control-based mitigation strategies will also be discussed. The intent is to combine the model and control mitigation logic with an engine icing risk calculation tool capable of predicting the risk of engine icing based on current operating conditions. Upon detection of an operating region of risk for engine icing events, the control mitigation logic will seek to change the engine's operating point to a region of lower risk through the modulation of available control actuators while maintaining the desired engine thrust output. Follow-on work will assess the feasibility and effectiveness of such control-based mitigation strategies.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ = 0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ = -0.028), and between the DELTA and embedded feature selection approaches was 88.1% (349 of 396, κ = -0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
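A minimal sketch of the propensity-score step common to all three approaches, with a logistic model standing in for either the expert-specified (PS-SME) or machine-learning-selected (PS-ML) specification: device assignment is modelled from baseline covariates and inverse-probability weights are used to compare adjusted outcome risks between the surveyed device and its comparators. Variable names and the weighting estimator are illustrative assumptions, not the study's exact method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ps_adjusted_rates(X, device, event):
    """Propensity-score sketch for device safety-signal screening.

    X      : array (n_patients, n_covariates) of baseline covariates
    device : 1/0 indicator for receiving the ICD model under surveillance
    event  : 1/0 indicator for the safety outcome during follow-up
    """
    # Propensity of receiving the surveyed device given covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, device).predict_proba(X)[:, 1]
    # Inverse-probability-of-treatment weights.
    w = np.where(device == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    risk_device = np.average(event[device == 1], weights=w[device == 1])
    risk_comparator = np.average(event[device == 0], weights=w[device == 0])
    return risk_device, risk_comparator, risk_device - risk_comparator
```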
Analyzing depression tendency of web posts using an event-driven depression tendency warning model.
Tung, Chiaming; Lu, Wenhsiang
2016-01-01
The Internet has become a platform to express individual moods/feelings of daily life, where authors share their thoughts in web blogs, micro-blogs, forums, bulletin board systems or other media. In this work, we investigate text-mining technology to analyze and predict the depression tendency of web posts. In this paper, we defined depression factors, which include negative events, negative emotions, symptoms, and negative thoughts from web posts. We proposed an enhanced event extraction (E3) method to automatically extract negative event terms. In addition, we also proposed an event-driven depression tendency warning (EDDTW) model to predict the depression tendency of web bloggers or post authors by analyzing their posted articles. We compare the performance among the proposed EDDTW model, the negative emotion evaluation (NEE) model, and the Diagnostic and Statistical Manual of Mental Disorders (DSM)-based depression tendency evaluation method. The EDDTW model obtains the best recall rate and F-measure at 0.668 and 0.624, respectively, while the DSM-based method achieves the best precision rate of 0.666. The main reason is that our enhanced event extraction method can increase recall rate by enlarging the negative event lexicon at the expense of precision. Our EDDTW model can also be used to track the change or trend of depression tendency for each post author. The depression tendency trend can help doctors to diagnose and even track depression of web post authors more efficiently. This paper presents an E3 method to automatically extract negative event terms in web posts. We also proposed a new EDDTW model to predict the depression tendency of web posts and possibly help bloggers or post authors to detect major depressive disorder early. Copyright © 2015 Elsevier B.V. All rights reserved.
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
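To illustrate the first step of the pipeline described above, converting a time series into time-interval sequences of temporal abstractions, the sketch below maps a numeric series onto LOW/NORMAL/HIGH states and merges consecutive identical states into intervals. The thresholds and example values are illustrative, and the subsequent backward pattern construction and Minimal Predictive Recent Temporal Patterns selection are not shown.

```python
import numpy as np

def temporal_abstraction(times, values, low, high):
    """Convert one lab-value series into state intervals (sketch).

    Each measurement is abstracted into a state (LOW/NORMAL/HIGH) and runs of
    the same state are merged into (state, start, end) intervals.  Thresholds
    `low` and `high` are illustrative inputs, not values from the paper.
    """
    states = np.where(values < low, "LOW",
                      np.where(values > high, "HIGH", "NORMAL"))
    intervals = []
    start = 0
    for i in range(1, len(states) + 1):
        if i == len(states) or states[i] != states[start]:
            intervals.append((states[start], times[start], times[i - 1]))
            start = i
    return intervals

# Example: a creatinine-like series abstracted into intervals.
t = np.array([0, 1, 2, 3, 4, 5])
v = np.array([0.9, 1.0, 1.8, 2.1, 2.0, 1.1])
print(temporal_abstraction(t, v, low=0.6, high=1.3))
# [('NORMAL', 0, 1), ('HIGH', 2, 4), ('NORMAL', 5, 5)]
```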
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. This framework combines learning scene semantics by trajectory analysis and constructing attribute grammar-based event representation. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter-based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
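As an illustration of the plane-fitting family of methods discussed above, the sketch below fits a plane t = a*x + b*y + c to a small spatio-temporal neighbourhood of address events and converts the gradient of the event-time surface into a local velocity estimate. Event selection, polarity handling, and the filter-based detector proposed in the paper are not covered; this is only the basic least-squares step.

```python
import numpy as np

def flow_from_events(events):
    """Local plane fit to an event cloud for optical-flow estimation (sketch).

    events : array (n, 3) of (x, y, t) for a small spatio-temporal
             neighbourhood of address events of the same polarity.
    Fits t = a*x + b*y + c by least squares; the gradient (a, b) of the
    event-time surface has magnitude 1/|v|, so v = (a, b) / (a**2 + b**2).
    """
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
    g2 = a * a + b * b
    if g2 == 0:
        return np.zeros(2)            # no measurable motion in this patch
    return np.array([a, b]) / g2      # velocity in pixels per time unit
```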
A model for anomaly classification in intrusion detection systems
NASA Astrophysics Data System (ADS)
Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.
2015-09-01
Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often have no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly based IDS. The main goal is either to classify the detected anomalies into well-defined attack taxonomies or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.
Diagnosis of delay-deadline failures in real time discrete event models.
Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha
2007-10-01
In this paper a method for fault detection and diagnosis (FDD) of real-time systems has been developed. A modeling framework termed the real time discrete event system (RTDES) model is presented and a mechanism for FDD of the same has been developed. The use of the RTDES framework for FDD is an extension of the work reported in the discrete event system (DES) literature, which is based on finite state machines (FSM). FDD of RTDES models is suited to real-time systems because of their capability of representing timing faults leading to failures in terms of erroneous delays and deadlines, which FSM-based ones cannot address. The concept of measurement restriction of variables is introduced for RTDES and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and the procedure of constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.
Compressor stability management
NASA Astrophysics Data System (ADS)
Dhingra, Manuj
Dynamic compressors are susceptible to aerodynamic instabilities while operating at low mass flow rates. These instabilities, rotating stall and surge, are detrimental to engine life and operational safety, and are thus undesirable. In order to prevent stability problems, a passive technique, involving fuel flow scheduling, is currently employed on gas turbines. The passive nature of this technique necessitates conservative stability margins, compromising performance and/or efficiency. In the past, model based active control has been proposed to enable reduction of margin requirements. However, available compressor stability models do not predict the different stall inception patterns, making model based control techniques practically infeasible. This research presents active stability management as a viable alternative. In particular, a limit detection and avoidance approach has been used to maintain the system free of instabilities. Simulations show significant improvements in the dynamic response of a gas turbine engine with this approach. A novel technique has been developed to enable real-time detection of stability limits in axial compressors. It employs a correlation measure to quantify the chaos in the rotor tip region. Analysis of data from four axial compressors shows that the value of the correlation measure decreases as compressor loading is increased. Moreover, sharp drops in this measure have been found to be relevant for stability limit detection. The significance of these drops can be captured by tracking events generated by the downward crossing of a selected threshold level. It has been observed that the average number of events increases as the stability limit is approached in all the compressors studied. These events appear to be randomly distributed in time. A stochastic model for the time between consecutive events has been developed and incorporated in an engine simulation. The simulation has been used to highlight the importance of the threshold level to successful stability management. The compressor stability management concepts have also been experimentally demonstrated on a laboratory axial compressor rig. The fundamental nature of correlation measure has opened avenues for its application besides limit detection. The applications presented include stage load matching in a multi-stage compressor and monitoring the aerodynamic health of rotor blades.
NASA Astrophysics Data System (ADS)
Mahmoud, Seedahmed S.; Visagathilagar, Yuvaraja; Katsifolis, Jim
2012-09-01
The success of any perimeter intrusion detection system depends on three important performance parameters: the probability of detection (POD), the nuisance alarm rate (NAR), and the false alarm rate (FAR). The most fundamental parameter, POD, is normally related to a number of factors such as the event of interest, the sensitivity of the sensor, the installation quality of the system, and the reliability of the sensing equipment. The suppression of nuisance alarms without degrading sensitivity in fiber optic intrusion detection systems is key to maintaining acceptable performance. Signal processing algorithms that maintain the POD and eliminate nuisance alarms are crucial for achieving this. In this paper, a robust event classification system using supervised neural networks together with a level crossings (LCs) based feature extraction algorithm is presented for the detection and recognition of intrusion and non-intrusion events in a fence-based fiber-optic intrusion detection system. A level crossings algorithm is also used with a dynamic threshold to suppress torrential rain-induced nuisance alarms in a fence system. Results show that rain-induced nuisance alarms can be suppressed for rainfall rates in excess of 100 mm/hr with the simultaneous detection of intrusion events. The use of a level crossing based detection and novel classification algorithm is also presented for a buried pipeline fiber optic intrusion detection system for the suppression of nuisance events and discrimination of intrusion events. The sensor employed for both types of systems is a distributed bidirectional fiber-optic Mach-Zehnder (MZ) interferometer.
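A minimal sketch of the level-crossings feature extraction described above: for each of a set of amplitude levels, the number of crossings within a signal window is counted, and the resulting vector can be fed to a supervised classifier to separate intrusion from nuisance events. The levels, window lengths, and synthetic signals below are illustrative assumptions; the dynamic thresholding used for rain suppression and the neural-network classifier are not reproduced.

```python
import numpy as np

def level_crossing_features(x, levels):
    """Level-crossings (LC) feature vector for one signal window (sketch).

    Counts, for each threshold level, how often the signal crosses that level
    (upward or downward); the count vector is a compact feature for a
    supervised classifier.
    """
    return np.array([np.count_nonzero(np.diff(np.signbit(x - lv).astype(np.int8)))
                     for lv in levels])

# Example: a quiet window vs. a window containing a large transient.
rng = np.random.default_rng(0)
quiet = 0.1 * rng.standard_normal(2000)
event = quiet.copy()
event[800:1200] += np.sin(np.linspace(0, 40 * np.pi, 400))   # hypothetical intrusion burst
levels = np.linspace(-0.8, 0.8, 9)
print(level_crossing_features(quiet, levels))
print(level_crossing_features(event, levels))   # far more crossings at the outer levels
```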
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
Detection of epileptic seizure in EEG signals using linear least squares preprocessing.
Roshan Zamir, Z
2016-09-01
An epileptic seizure is a transient event of abnormal excessive neuronal discharge in the brain. This unwanted event can be obstructed by detection of electrical changes in the brain that happen before the seizure takes place. The automatic detection of seizures is necessary since the visual screening of EEG recordings is a time consuming task and requires experts to improve the diagnosis. Much of the prior research in detection of seizures has been developed based on artificial neural networks, genetic programming, and wavelet transforms. Although the highest achieved accuracy for classification is 100%, there are drawbacks, such as the existence of unbalanced datasets and the lack of investigation into performance consistency. To address these, four linear least squares-based preprocessing models are proposed to extract key features of an EEG signal in order to detect seizures. The first two models are newly developed. The original signal (EEG) is approximated by a sinusoidal curve. Its amplitude is formed by a polynomial function and compared with the predeveloped spline function. Different statistical measures, namely classification accuracy, true positive and negative rates, false positive and negative rates and precision, are utilised to assess the performance of the proposed models. These metrics are derived from confusion matrices obtained from classifiers. Different classifiers are used over the original dataset and the set of extracted features. The proposed models significantly reduce the dimension of the classification problem and the computational time while the classification accuracy is improved in most cases. The first and third models are promising feature extraction methods with the classification accuracy of 100%. Logistic, LazyIB1, LazyIB5, and J48 are the best classifiers. Their true positive and negative rates are 1 while false positive and negative rates are 0 and the corresponding precision values are 1. Numerical results suggest that these models are robust and efficient for detecting epileptic seizure. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
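To make the "sinusoid with polynomial amplitude" idea concrete, the sketch below fits, by ordinary linear least squares, a model in which a window of EEG samples is approximated by a sinusoid whose amplitude varies as a low-order polynomial; the fitted coefficients and residual serve as features. The frequency, polynomial degree, and feature choice are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def sinusoid_poly_features(x, omega, degree=2):
    """Linear least-squares fit of a polynomial-amplitude sinusoid (sketch).

    Approximates a window of EEG samples x[n] by
        x[n] ~ sum_k a_k * n**k * sin(omega*n) + b_k * n**k * cos(omega*n),
    which is linear in (a_k, b_k) once omega is fixed.  The coefficients and
    the residual norm are returned as features for a seizure classifier;
    omega and degree are illustrative choices.
    """
    n = np.arange(len(x))
    cols = [n ** k * np.sin(omega * n) for k in range(degree + 1)]
    cols += [n ** k * np.cos(omega * n) for k in range(degree + 1)]
    A = np.column_stack(cols)
    coef, residual, *_ = np.linalg.lstsq(A, x, rcond=None)
    rss = residual[0] if residual.size else np.sum((A @ coef - x) ** 2)
    return np.concatenate([coef, [rss]])
```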
Song, X X; Zhao, Q; Tao, T; Zhou, C M; Diwan, V K; Xu, B
2018-05-30
Records of absenteeism from primary schools are valuable data for infectious disease surveillance. However, the analysis of absenteeism is complicated by the data features of clustering at zero, non-independence and overdispersion. This study aimed to generate an appropriate model to handle the absenteeism data collected in a European Commission-granted project for infectious disease surveillance in rural China and to evaluate the validity and timeliness of the resulting model for early warnings of infectious disease outbreaks. Four steps were taken: (1) building a 'well-fitting' model with the zero-inflated Poisson model with random effects (ZIP-RE) using the absenteeism data from the first implementation year; (2) applying the resulting model to predict the 'expected' number of absenteeism events in the second implementation year; (3) computing the differences between the observations and the expected values (O-E values) to generate an alternative series of data; (4) evaluating the early warning validity and timeliness of the observational data and model-based O-E values via the EARS-3C algorithms with regard to the detection of real cluster events. The results indicate that ZIP-RE and its corresponding O-E values could improve the detection of aberrations, reduce false-positive signals and are applicable to zero-inflated data.
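A minimal sketch of the ZIP-plus-O-E idea, assuming no covariates and omitting the school-level random effects of the actual ZIP-RE model: a zero-inflated Poisson likelihood is maximized on year-1 counts, the fitted mixture mean provides the year-2 expectations, and the observed-minus-expected series is what would be passed to the EARS-3C algorithms. All data and parameter values below are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson (sketch, no covariates).

    params = (logit of structural-zero probability, log of Poisson mean).
    """
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    lam = np.exp(params[1])
    logp_pois = -lam + y * np.log(lam) - gammaln(y + 1)
    ll = np.where(y == 0,
                  np.log(pi + (1 - pi) * np.exp(-lam)),
                  np.log(1 - pi) + logp_pois)
    return -ll.sum()

# Fit on year-1 weekly absenteeism counts, then form O-E values for year 2.
rng = np.random.default_rng(0)
year1 = np.where(rng.random(52) < 0.4, 0, rng.poisson(3.0, 52))
year2 = np.where(rng.random(52) < 0.4, 0, rng.poisson(3.0, 52))
year2[30:33] += 8                                   # injected outbreak-like cluster

res = minimize(zip_negloglik, x0=[0.0, 1.0], args=(year1,), method="Nelder-Mead")
pi_hat, lam_hat = 1 / (1 + np.exp(-res.x[0])), np.exp(res.x[1])
expected = (1 - pi_hat) * lam_hat                   # model-based weekly expectation
o_minus_e = year2 - expected                        # series passed to the EARS-3C algorithms
```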
A new moonquake catalog from Apollo 17 geophone data
NASA Astrophysics Data System (ADS)
Dimech, Jesse-Lee; Knapmeyer-Endrun, Brigitte; Weber, Renee
2017-04-01
New lunar seismic events have been detected on geophone data from the Apollo 17 Lunar Seismic Profile Experiment (LSPE). This dataset is already known to contain an abundance of thermal seismic events, and potentially some meteorite impacts, but prior to this study only 26 days of LSPE "listening mode" data had been analysed. In this new analysis, additional listening mode data collected between August 1976 and April 1977 are incorporated. To the authors' knowledge these 8 months of data have not yet been used to detect seismic moonquake events. The geophones in question are situated adjacent to the Apollo 17 site in the Taurus-Littrow valley, about 5.5 km east of the Lee-Lincoln scarp, and between the North and South Massifs. Any of these features are potential seismic sources. We have used an event-detection and classification technique based on Hidden Markov Models to automatically detect and categorize seismic signals, in order to objectively generate a seismic event catalog. Currently, 2.5 months of the 8-month listening mode dataset have been processed, totaling 14,338 detections. Of these, 672 detections (classification "n1") have a sharp onset with a steep risetime suggesting they occur close to the recording geophone. These events almost all occur in association with lunar sunrise over the span of 1-2 days. One possibility is that these events originate from the nearby Apollo 17 lunar lander due to rapid heating at sunrise. A further 10,004 detections (classification "d1") show strong diurnal periodicity, with detections increasing during the lunar day and reaching a peak at sunset, and therefore probably represent thermal events from the lunar regolith immediately surrounding the Apollo 17 landing site. The final 3662 detections (classification "d2") have emergent onsets and relatively long durations. These detections have peaks associated with lunar sunrise and sunset, but also sometimes have peaks at seemingly random times. Their source mechanism has not yet been investigated. It is possible that many of these are misclassified d1/n1 events, and further QC work needs to be undertaken, but it is also possible that many of them represent more distant thermal moonquakes, e.g., from the North and South Massifs, or even the ridge adjacent to the Lee-Lincoln scarp. The unknown event spikes will be the subject of closer inspection once the HMM technique has been refined.
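As a sketch of the HMM-based classification step, assuming the hmmlearn package is available and that detected events have already been converted into short feature-frame sequences (e.g. windowed energies), the code below trains one Gaussian HMM per event class (n1, d1, d2) and assigns a new detection to the class whose model scores it highest. The feature choices, state counts, and the detection/segmentation stage are assumptions, not details taken from the study.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_class_models(sequences_by_class, n_states=3):
    """Train one Gaussian HMM per event class (sketch).

    sequences_by_class : dict mapping class label ("n1", "d1", "d2", ...) to a
    list of feature sequences, each of shape (n_frames, n_features), e.g.
    short-term energy frames cut from the geophone records.
    """
    models = {}
    for label, seqs in sequences_by_class.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        models[label] = GaussianHMM(n_components=n_states,
                                    covariance_type="diag",
                                    n_iter=50).fit(X, lengths)
    return models

def classify(models, seq):
    """Assign a detected event to the class whose HMM gives the highest likelihood."""
    return max(models, key=lambda label: models[label].score(seq))
```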
Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)
Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing
2016-01-01
The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Only certain particular hazardous traffic events have been studied in previous studies, which were mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of the prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles. PMID:27420073
Capability of detecting ultraviolet counterparts of gravitational waves with GLUV
NASA Astrophysics Data System (ADS)
Ridden-Harper, Ryan; Tucker, B. E.; Sharp, R.; Gilbert, J.; Petkovic, M.
2017-12-01
With the discovery of gravitational waves (GWs), attention has turned towards detecting counterparts to these sources. In discussions on counterpart signatures and multimessenger follow-up strategies to the GW detections, ultraviolet (UV) signatures have largely been neglected, due to UV facilities being limited to SWIFT, which lacks high-cadence UV survey capabilities. In this paper, we examine the UV signatures from merger models for the major GW sources, highlighting the need for further modelling, while presenting requirements and a design for an effective UV survey telescope. Using the u′-band models as an analogue, we find that a UV survey telescope requires a limiting magnitude of m_u′(AB) ≈ 24 to fully complement the aLIGO range and sky localization. We show that a network of small, balloon-based UV telescopes with a primary mirror diameter of 30 cm could be capable of covering the aLIGO detection distance from ∼60 to 100 per cent for BNS events and ∼40 per cent for black hole-neutron star events. The sensitivity of UV emission to initial conditions suggests that a UV survey telescope would provide a unique data set, which can act as an effective diagnostic to discriminate between models.
Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason
2014-05-01
The Vaccine Safety Datalink project captures electronic health record data including vaccinations and medically attended adverse events on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of adverse event outcomes for the self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypothesis-generating (screening) studies. We also reanalyzed four previously identified signals in a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; and (ii) the newly proposed models reduced both the rates of false positive and false negative signals. In reanalyses of four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significances were similar to those using conventional models and including only medical record review confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Detection of cough signals in continuous audio recordings using hidden Markov models.
Matos, Sergio; Birring, Surinder S; Pavord, Ian D; Evans, David H
2006-06-01
Cough is a common symptom of many respiratory diseases. The evaluation of its intensity and frequency of occurrence could provide valuable clinical information in the assessment of patients with chronic cough. In this paper we propose the use of hidden Markov models (HMMs) to automatically detect cough sounds from continuous ambulatory recordings. The recording system consists of a digital sound recorder and a microphone attached to the patient's chest. The recognition algorithm follows a keyword-spotting approach, with cough sounds representing the keywords. It was trained on 821 min selected from 10 ambulatory recordings, including 2473 manually labeled cough events, and tested on a database of nine recordings from separate patients with a total recording time of 3060 min and comprising 2155 cough events. The average detection rate was 82% at a false alarm rate of seven events/h, when considering only events above an energy threshold relative to each recording's average energy. These results suggest that HMMs can be applied to the detection of cough sounds from ambulatory patients. A postprocessing stage to perform a more detailed analysis on the detected events is under development, and could allow the rejection of some of the incorrectly detected events.
Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just
2018-04-03
Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.
OGLE-2017-BLG-1130: The First Binary Gravitational Microlens Detected from Spitzer Only
NASA Astrophysics Data System (ADS)
Wang, Tianshu; Calchi Novati, S.; Udalski, A.; Gould, A.; Mao, Shude; Zang, W.; Beichman, C.; Bryden, G.; Carey, S.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Yee, J. C.; Spitzer Team; Mróz, P.; Poleski, R.; Skowron, J.; Szymański, M. K.; Soszyński, I.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Albrow, M. D.; Chung, S.-J.; Han, C.; Hwang, K.-H.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Zhu, W.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Lee, C.-U.; Lee, D.-J.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration
2018-06-01
We analyze the binary gravitational microlensing event OGLE-2017-BLG-1130 (mass ratio q ∼ 0.45), the first published case in which the binary anomaly was detected only by the Spitzer Space Telescope. This event provides strong evidence that some binary signals can be missed by observations from the ground alone but detected by Spitzer. We therefore invert the normal procedure, first finding the lens parameters by fitting the space-based data and then measuring the microlensing parallax using ground-based observations. We also show that the normal four-fold space-based degeneracy in the single-lens case can become a weak eight-fold degeneracy in binary-lens events. Although this degeneracy is resolved in event OGLE-2017-BLG-1130, it might persist in other events.
Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.
Martínez-Aquino, Andrés
2016-08-01
Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetical inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.
Calibration methodology for proportional counters applied to yield measurements of a neutron burst.
Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo
2014-01-01
This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
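A simplified numerical sketch of the charge-integration idea described above, assuming the counter has been calibrated in pulse mode so that the distribution of single-event charges is known: the number of detected neutrons is estimated as the integrated burst charge divided by the mean single-event charge, with a rough uncertainty combining counting statistics and the spread of the calibration distribution. The statistical model in the paper is more detailed; the variable names and the error formula here are illustrative.

```python
import numpy as np

def burst_yield(q_burst, single_event_charges, efficiency):
    """Estimate the number of detected neutrons in a burst (sketch).

    q_burst              : total charge integrated at the counter output during
                           the burst (same units as the calibration charges)
    single_event_charges : charges of individually resolved events recorded
                           during pulse-mode calibration
    efficiency           : absolute detection efficiency of the moderated counter
    """
    q_mean = single_event_charges.mean()
    q_rel = single_event_charges.std(ddof=1) / q_mean
    n_detected = q_burst / q_mean
    # Counting term plus the uncertainty of the calibrated mean charge (simplified).
    sigma_n = np.sqrt(n_detected +
                      (n_detected * q_rel) ** 2 / len(single_event_charges))
    yield_total = n_detected / efficiency
    return n_detected, sigma_n, yield_total
```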
NASA Astrophysics Data System (ADS)
Salafia, Om Sharan; Colpi, Monica; Branchesi, Marica; Chassande-Mottin, Eric; Ghirlanda, Giancarlo; Ghisellini, Gabriele; Vergani, Susanna D.
2017-09-01
The electromagnetic (EM) follow-up of a gravitational-wave (GW) event requires scanning a wide sky region, defined by the so-called “skymap,” to detect and identify a transient counterpart. We propose a novel method that exploits the information encoded in the GW signal to construct a “detectability map,” which represents the time-dependent (“when”) probability of detecting the transient at each position of the skymap (“where”). Focusing on the case of a neutron star binary inspiral, we model the associated short gamma-ray burst afterglow and macronova emission using the probability distributions of binary parameters (sky position, distance, orbit inclination, mass ratio) extracted from the GW signal as inputs. The resulting family of possible light curves is the basis for constructing the detectability map. As a practical example, we apply the method to a simulated GW signal produced by a neutron star merger at 75 Mpc whose localization uncertainty is very large (∼1500 deg²). We construct observing strategies for optical, infrared, and radio facilities based on the detectability maps, taking VST, VISTA, and MeerKAT as prototypes. Assuming limiting fluxes of r ∼ 24.5, J ∼ 22.4 (AB magnitudes), and 500 μJy (1.4 GHz) for ∼1000 s of exposure each, the afterglow and macronova emissions are successfully detected with a minimum observing time of 7, 15, and 5 hr respectively.
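A minimal sketch of how one entry of such a detectability map could be evaluated, assuming a family of model light curves (apparent magnitude versus time) has already been sampled from the GW posterior for one skymap position: the detection probability at each epoch is simply the fraction of sampled light curves brighter than the survey limiting magnitude. The function and variable names are illustrative; the paper's construction also folds in the sky-position probability itself.

```python
import numpy as np

def detectability(light_curves, times, limiting_mag):
    """Time-dependent detection probability at one skymap position (sketch).

    light_curves : array (n_samples, n_times) of apparent magnitudes, one row
                   per counterpart model drawn from the GW posterior
                   (distance, inclination, mass ratio) for this sky position
    times        : array (n_times,) of times since merger
    limiting_mag : survey limiting magnitude for the adopted exposure

    Returns, for each epoch, the fraction of sampled light curves brighter
    than the limit (smaller magnitude = brighter).
    """
    detectable = light_curves <= limiting_mag
    return times, detectable.mean(axis=0)
```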
Shen, Yanna; Cooper, Gregory F
2012-09-01
This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
A methodology to select a wire insulation for use in habitable spacecraft.
Paulos, T; Apostolakis, G
1998-08-01
This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario, from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be quantitatively compared. However, these models do not exist. In this paper, a methodology is developed that can be used to select the wire insulation best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process, the simplifying assumptions, the criteria selected, and the data used in the analysis, Tefzel is better than Teflon for use in a habitable spacecraft.
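For readers unfamiliar with the Analytic Hierarchy Process step mentioned above, a minimal sketch follows: priority weights are obtained as the normalized principal eigenvector of a pairwise-comparison matrix. The comparison values in the example are hypothetical and only illustrate the mechanics, not the study's judgments or data.

import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    taken as the normalized principal eigenvector."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# Hypothetical example: one criterion, insulation A judged 3x preferable to B.
print(ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]]))  # -> [0.75, 0.25]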
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foxall, W; Vincent, P; Walter, W
1999-07-23
We have previously presented simple elastic deformation modeling results for three classes of seismic events of concern in monitoring the CTBT: underground explosions, mine collapses, and earthquakes. Those results explored the theoretical detectability of each event type using synthetic aperture radar interferometry (InSAR) based on commercially available satellite data. In those studies we identified and compared the characteristics of synthetic interferograms that distinguish each event type, as well as the ability of the interferograms to constrain source parameters. These idealized modeling results, together with preliminary analysis of InSAR data for the 1995 mb 5.2 Solvay mine collapse in southwestern Wyoming, suggested that InSAR data used in conjunction with regional seismic monitoring holds great potential for CTBT discrimination and seismic source analysis, as well as providing accurate ground truth parameters for regional calibration events. In this paper we further examine the detectability and ''discriminating'' power of InSAR by presenting results from InSAR data processing, analysis and modeling of the surface deformation signals associated with underground explosions. Specifically, we present results of a detailed study of coseismic and postseismic surface deformation signals associated with underground nuclear and chemical explosion tests at the Nevada Test Site (NTS). Several interferograms were formed from raw ERS-1/2 radar data covering different time spans and epochs beginning just prior to the last U.S. nuclear tests in 1992 and ending in 1996. These interferograms have yielded information about the nature and duration of the source processes that produced the surface deformations associated with these events. A critical result of this study is that significant post-event surface deformation associated with underground nuclear explosions detonated at depths in excess of 600 meters can be detected using differential radar interferometry. An immediate implication of this finding is that underground nuclear explosions may not need to be captured coseismically by radar images acquired before and after an event in order to be detectable. This has obvious advantages in CTBT monitoring since suspect seismic events, which usually can be located within a 100 km by 100 km area of an ERS-1/2 satellite frame by established seismic methods, can be imaged after the event has been identified and located by existing regional seismic networks. Key Words: InSAR, SLC images, interferogram, synthetic interferogram, ERS-1/2 frame, phase unwrapping, DEM, coseismic, postseismic, source parameters.
NASA Technical Reports Server (NTRS)
Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian
2014-01-01
A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
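The daily Nash-Sutcliffe coefficient quoted above follows the standard definition; a short sketch (an assumed formulation for illustration, not the GFMS code) is given below.

import numpy as np

def nash_sutcliffe(simulated, observed):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); values above
    zero mean the simulation outperforms the observed mean as a predictor."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)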
Infrasound ray tracing models for real events
NASA Astrophysics Data System (ADS)
Averbuch, Gil; Applbaum, David; Price, Colin; Ben Horin, Yochai
2015-04-01
Ray tracing models for infrasound propagation require two atmospheric parameters: the speed of sound profile and the wind profile. The usage of global atmospheric models for the speed of sound and wind profiles raises a fundamental question: can these models provide accurate results for modeling real events that have been detected by the infrasound arrays? Moreover, can these models provide accurate results for events that occurred during extreme weather conditions? We use 2D and 3D ray tracing models based on a modified Hamiltonian for a moving medium. Radiosonde measurements enable us to update the first 20 km of both the speed of sound and wind profiles. The 2009 and 2011 Sayarim calibration experiments in Israel served as test cases for the models. In order to answer the question regarding the accuracy of the model during extreme weather conditions, we simulate infrasound sprite signals that were detected by the infrasound array at Mt. Meron, Israel. The results from modeling the Sayarim experiments provided sufficient insight to conclude that ray tracing modeling can provide accurate results for real events that occurred during fair weather conditions. We conclude that the time delay in the model of the 2009 experiment is due to a lack of accuracy in the wind and speed of sound profiles; perturbed profiles provide accurate results. Earlier arrivals in 2011 are a result of the assumption that the earth is flat (no topography) and of the use of local radiosonde measurements for the entire model. Using local radiosonde measurements only for part of the model and neglecting them elsewhere prevents the early arrivals. We were able to determine which sprite was detected by the infrasound array, as well as to provide a range for the sprite's height or for its most energetic part. Even though atmospheric wind has a strong influence on infrasound wave propagation, our estimation is that for high-altitude sources, extreme weather in the troposphere below has low impact on the trajectories of the waves.
Self-similarity Clustering Event Detection Based on Triggers Guidance
NASA Astrophysics Data System (ADS)
Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan
Traditional methods of Event Detection and Characterization (EDC) treat event detection as a classification problem. They use words as samples to train a classifier, which can lead to an imbalance between positive and negative samples. In addition, this approach suffers from data sparseness when the corpus is small. Instead of classifying events with words as samples, this paper clusters events when judging event types. It uses self-similarity to converge on the value of K in the K-means algorithm under the guidance of event triggers, and optimizes the clustering algorithm. Then, by combining named entities with their relative position information, the new method further determines the precise type of each event. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can readily be used in automatic text summarization, text retrieval, and topic detection and tracking.
Description and detection of burst events in turbulent flows
NASA Astrophysics Data System (ADS)
Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.
2018-04-01
A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
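A minimal sketch of the first step (delay embedding followed by clustering into states and estimation of state-transition probabilities) is given below. It is an assumed realization for illustration, not the authors' implementation; the cluster count and embedding parameters are placeholders.

import numpy as np
from sklearn.cluster import KMeans

def transition_matrix(signal, dim=3, lag=1, n_states=8, seed=0):
    """Embed a scalar time series, label each delay vector with a cluster
    ('state'), and estimate the row-stochastic state-transition matrix."""
    x = np.asarray(signal, dtype=float)
    emb = np.column_stack([x[i * lag: len(x) - (dim - 1 - i) * lag]
                           for i in range(dim)])
    labels = KMeans(n_clusters=n_states, n_init=10,
                    random_state=seed).fit_predict(emb)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1
    return counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)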
NASA Astrophysics Data System (ADS)
Howell, E. J.; Chan, M. L.; Chu, Q.; Jones, D. H.; Heng, I. S.; Lee, H.-M.; Blair, D.; Degallaix, J.; Regimbau, T.; Miao, H.; Zhao, C.; Hendry, M.; Coward, D.; Messenger, C.; Ju, L.; Zhu, Z.-H.
2018-03-01
The detection of black hole binary coalescence events by Advanced LIGO allows the science benefits of future detectors to be evaluated. In this paper, we report the science benefits of one or two 8 km arm length detectors based on the doubling of key parameters in an Advanced LIGO-type detector, combined with realizable enhancements. It is shown that the total detection rate for sources similar to those already detected would increase to ~10^3 to 10^5 per year. Within 0.4 Gpc, we find that around 10 of these events would be localizable to within ~0.1 deg². This is sufficient to make unique associations, or to rule out a direct association, with the brightest galaxies in optical surveys (at r-band magnitudes of 17 or above), or, for deeper limits (down to r-band magnitudes of 20), to yield statistically significant associations. The combination of angular resolution and event rate would benefit precision testing of formation models, cosmic evolution, and cosmological studies.
The magnetic sense and its use in long-distance navigation by animals.
Walker, Michael M; Dennis, Todd E; Kirschvink, Joseph L
2002-12-01
True navigation by animals is likely to depend on events occurring in the individual cells that detect magnetic fields. Minimum thresholds of detection, perception and 'interpretation' of magnetic field stimuli must be met if animals are to use a magnetic sense to navigate. Recent technological advances in animal tracking devices now make it possible to test predictions from models of navigation based on the use of variations in magnetic intensity.
From TRMM to GPM: How well can heavy rainfall be detected from space?
NASA Astrophysics Data System (ADS)
Prakash, Satya; Mitra, Ashis K.; Pai, D. S.; AghaKouchak, Amir
2016-02-01
In this study, we investigate the capabilities of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and the recently released Integrated Multi-satellitE Retrievals for GPM (IMERG) in detecting and estimating heavy rainfall across India. First, the study analyzes TMPA data products over a 17-year period (1998-2014). While TMPA and reference gauge-based observations show similar mean monthly variations of conditional heavy rainfall events, the multi-satellite product systematically overestimates its inter-annual variations. Categorical as well as volumetric skill scores reveal that TMPA over-detects heavy rainfall events (above 75th percentile of reference data), but it shows reasonable performance in capturing the volume of heavy rain across the country. An initial assessment of the GPM-based multi-satellite IMERG precipitation estimates for the southwest monsoon season shows notable improvements over TMPA in capturing heavy rainfall over India. The recently released IMERG shows promising results to help improve modeling of hydrological extremes (e.g., floods and landslides) using satellite observations.
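A sketch of the categorical-score setup implied above (an assumption about the event definition, not the study's code): heavy-rain "events" are values above the 75th percentile of the gauge reference, and the satellite product is scored against them. Function and variable names are placeholders.

import numpy as np

def heavy_rain_skill(satellite, gauge, pct=75.0):
    """Probability of detection (POD) and false alarm ratio (FAR) for 'heavy'
    rainfall, with the event threshold taken from the reference distribution."""
    sat = np.asarray(satellite, dtype=float)
    ref = np.asarray(gauge, dtype=float)
    thr = np.percentile(ref, pct)
    sat_ev, ref_ev = sat > thr, ref > thr
    hits = np.sum(sat_ev & ref_ev)
    misses = np.sum(~sat_ev & ref_ev)
    false_alarms = np.sum(sat_ev & ~ref_ev)
    return {"POD": hits / (hits + misses),
            "FAR": false_alarms / (hits + false_alarms)}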
Foreign Object Damage Identification in Turbine Engines
NASA Technical Reports Server (NTRS)
Strack, William; Zhang, Desheng; Turso, James; Pavlik, William; Lopez, Isaac
2005-01-01
This report summarizes the collective work of a five-person team from different organizations examining the problem of detecting foreign object damage (FOD) events in turbofan engines from gas path thermodynamic and bearing accelerometer sensors, and determining the severity of damage to each component (diagnosis). Several detection and diagnostic approaches were investigated and a software tool (FODID) was developed to assist researchers in detecting and diagnosing FOD events. These approaches include (1) fan efficiency deviation computed from upstream and downstream temperature/pressure measurements, (2) gas path weighted least squares estimation of component health parameter deficiencies, (3) Kalman filter estimation of component health parameters, and (4) use of structural vibration signal processing to detect both large and small FOD events. The last three of these approaches require a significant amount of computation in conjunction with a physics-based analytic model of the underlying phenomenon: the NPSS thermodynamic cycle code for approaches 1 to 3, and the DyRoBeS reduced-order rotor dynamics code for approach 4. A potential application of the FODID software tool, in addition to its detection/diagnosis role, is using its sensitivity results to help identify the best types of sensors and their optimum locations within the gas path, and similarly for bearing accelerometers.
Commonality of drug-associated adverse events detected by 4 commonly used data mining algorithms.
Sakaeda, Toshiyuki; Kadoyama, Kaori; Minami, Keiko; Okuno, Yasushi
2014-01-01
Data mining algorithms have been developed for the quantitative detection of drug-associated adverse events (signals) from a large database on spontaneously reported adverse events. In the present study, the commonality of signals detected by 4 commonly used data mining algorithms was examined. A total of 2,231,029 reports were retrieved from the public release of the US Food and Drug Administration Adverse Event Reporting System database between 2004 and 2009. The deletion of duplicated submissions and revision of arbitrary drug names resulted in a reduction in the number of reports to 1,644,220. Associations with adverse events were analyzed for 16 unrelated drugs, using the proportional reporting ratio (PRR), reporting odds ratio (ROR), information component (IC), and empirical Bayes geometric mean (EBGM). All EBGM-based signals were included in the PRR-based signals as well as IC- or ROR-based ones, and PRR- and IC-based signals were included in ROR-based ones. The PRR scores of PRR-based signals were significantly larger for 15 of 16 drugs when adverse events were also detected as signals by the EBGM method, as were the IC scores of IC-based signals for all drugs; however, no such effect was observed in the ROR scores of ROR-based signals. The EBGM method was the most conservative among the 4 methods examined, which suggested its better suitability for pharmacoepidemiological studies. Further examinations should be performed on the reproducibility of clinical observations, especially for EBGM-based signals.
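Two of the four measures named above (PRR and ROR) reduce to simple ratios on the usual 2x2 report-count table; IC and EBGM additionally involve Bayesian shrinkage and are not reproduced here. The sketch below uses illustrative counts, not data from the study.

def prr_ror(a, b, c, d):
    """a: reports with the drug and the event; b: the drug, other events;
    c: other drugs, the event; d: other drugs, other events."""
    prr = (a / (a + b)) / (c / (c + d))  # proportional reporting ratio
    ror = (a * d) / (b * c)              # reporting odds ratio
    return prr, ror

# Hypothetical counts for one drug-event pair.
print(prr_ror(a=40, b=9600, c=160, d=150000))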
Near Real-Time Event Detection & Prediction Using Intelligent Software Agents
2006-03-01
value was 0.06743. Multiple autoregressive integrated moving average (ARIMA) models were then built to see if the raw data, differenced data, or ... slight improvement. The best adjusted R² value was found to be 0.1814. Successful results were not expected from linear or ARIMA-based modelling ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakhleh, Luay
I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events based on recognition by individual doctors. We developed a system that can be used to detect hematotoxicity adverse events according to blood test results recorded in an electronic medical record system. The blood test results were graded based on Common Terminology Criteria for Adverse Events (CTCAE) and changes in the blood test results (Up, Down, Flat) were assessed according to the variation in the grade. The changes in the blood test and injection data were stored in a database. By comparing the date of injection and start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grades 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rate of occurrence of a decreased WBC count, increased ALT level and increased creatinine level was 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
NASA Astrophysics Data System (ADS)
Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.
2012-12-01
The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a desktop computer at the time of the detections. The continuously updating map displays geolocated tweets arriving after the detection and plots epicenters of recent earthquakes. When available, seismograms from nearby stations are displayed as an additional form of verification. A time series of tweets-per-minute is also shown to illustrate the volume of tweets being generated for the detected event. Future additions are being investigated to provide a more in-depth characterization of the seismic events based on an analysis of tweet text and content from other social media sources.
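The core of such a detector can be as simple as comparing a short-term tweet rate against the long-term background rate. The sketch below is an assumed short-term-average/long-term-average variant for illustration, not the USGS TED algorithm, and its thresholds are placeholders.

import numpy as np

def tweet_spike_detector(counts_per_min, short=2, long=60,
                         factor=5.0, min_count=10):
    """counts_per_min: tweets per minute matching the keyword.
    Returns the minutes at which a candidate felt event is declared."""
    detections = []
    for t in range(long, len(counts_per_min)):
        sta = np.mean(counts_per_min[t - short:t])  # short-term average
        lta = np.mean(counts_per_min[t - long:t])   # long-term background
        if sta >= max(factor * lta, min_count):
            detections.append(t)
    return detections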
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim, E-mail: sjchung@kasi.re.kr, E-mail: leecu@kasi.re.kr, E-mail: koojr@kasi.re.kr
2014-04-20
Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars, because KMTNet is easily saturated around the peak of such events owing to its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets, and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be detected in significant numbers in the near future.
The Bayesian Approach to Association
NASA Astrophysics Data System (ADS)
Arora, N. S.
2017-12-01
The Bayesian approach to association focuses mainly on quantifying the physics of the domain. In the case of seismic association, for instance, let X be the set of all significant events (above some threshold) and their attributes, such as location, time, and magnitude; Y1 be the set of detections that are caused by significant events and their attributes, such as seismic phase, arrival time, amplitude, etc.; Y2 be the set of detections that are not caused by significant events; and finally Y be the set of observed detections. We then define the joint distribution P(X, Y1, Y2, Y) = P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2), where the last term simply states that Y1 and Y2 form a partitioning of Y. Given the above joint distribution, the inference problem is simply to find the X, Y1, and Y2 that maximize the posterior probability P(X, Y1, Y2 | Y), which reduces to maximizing P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2). In this expression, P(X) captures our prior belief about event locations. P(Y1 | X) captures notions of travel time and residual error distributions as well as detection and mis-detection probabilities, while P(Y2) captures the false detection rate of the seismic network. The elegance of this approach is that all of the assumptions are stated clearly in the models for P(X), P(Y1 | X) and P(Y2); the implementation of the inference is merely a by-product of this model. In contrast, some other methods such as GA hide a number of assumptions in the implementation details of the inference, such as the so-called "driver cells." The other important aspect of this approach is that all seismic knowledge, including knowledge from other domains such as infrasound and hydroacoustic, can be included in the same model, so we do not need to separately account for misdetections or merge seismic and infrasound events as a separate step. Finally, it should be noted that the objective of automatic association is to simplify the job of the humans who publish seismic bulletins based on this output. The error metric for association should accordingly count errors such as missed events much higher than spurious events, because the former require more work from humans. Furthermore, the error rate needs to be weighted more heavily during periods of high seismicity, such as an aftershock sequence, when the human effort tends to increase.
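The scoring side of this formulation can be written compactly in log space. The sketch below only illustrates the factorization P(X) P(Y1 | X) P(Y2); the three component models are left as user-supplied callables standing in for the priors, travel-time/amplitude likelihoods, and false-detection model named above, and all names are placeholders. Inference then searches over partitions of Y to maximize this score.

def association_log_score(events, assoc_dets, false_dets,
                          log_p_event, log_p_det_given_event, log_p_false):
    """events: event hypotheses X; assoc_dets: (event_index, detection)
    pairs forming Y1; false_dets: detections labelled as noise (Y2)."""
    score = sum(log_p_event(ev) for ev in events)            # log P(X)
    score += sum(log_p_det_given_event(events[i], det)       # log P(Y1 | X)
                 for i, det in assoc_dets)
    score += sum(log_p_false(det) for det in false_dets)     # log P(Y2)
    return score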
NASA Astrophysics Data System (ADS)
Taghavi, F.; Owlad, E.; Ackerman, S. A.
2017-03-01
South-west Asia, including the Middle East, is one of the regions most prone to dust storm events, and in recent years there has been an increase in the occurrence of these environmental and meteorological phenomena. Remote sensing is a practical method for detecting and also characterising these events. In this study, two dust enhancement algorithms were used to investigate the behaviour of dust events using satellite data, to compare the results with numerical model output and other satellite products, and finally to validate them with in-situ measurements. The results show that the thermal infrared algorithm enhances dust more accurately. The aerosol optical depth from MODIS and the output of the Dust Regional Atmospheric Model (DREAM8b) are used for comparing the results. Ground-based observations from synoptic stations and sun photometers are used for validating the satellite products. To find the transport direction, the locations of the dust sources, and the synoptic situations during these events, model outputs (HYSPLIT and NCEP/NCAR) are presented. Comparing the results with synoptic maps and the model outputs showed that using enhancement algorithms is a more reliable way to enhance dust than any other MODIS products or model outputs.
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none are geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, as part of the seventh framework programme (FP7) of the European Commission. Criteria for the selection of vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g. Micromedex), and summaries of product characteristics. Classification into positive (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Multi-Detection Events, Probability Density Functions, and Reduced Location Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Schrom, Brian T.
2016-03-01
Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
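A toy grid version of the Bayesian combination described above (assumed detection models and grid, not the authors' system) multiplies, for each candidate source cell, the detection probability for stations that detected and the non-detection probability for those that did not. Names and array shapes are placeholders.

import numpy as np

def location_posterior(prior, p_detect_grid, detected):
    """prior: (n_cells,) prior probability of the source in each grid cell.
    p_detect_grid: (n_stations, n_cells) probability that each station would
    detect a release originating in each cell (e.g., from ATM backtracking).
    detected: (n_stations,) booleans; True for a detection."""
    prior = np.asarray(prior, dtype=float)
    likelihood = np.ones_like(prior)
    for p_i, hit in zip(p_detect_grid, detected):
        likelihood *= p_i if hit else (1.0 - p_i)
    posterior = prior * likelihood
    return posterior / posterior.sum()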
Global Infrasound Association Based on Probabilistic Clutter Categorization
NASA Astrophysics Data System (ADS)
Arora, Nimar; Mialle, Pierrick
2016-04-01
The IDC advances its methods and continuously improves its automatic system for the infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine the signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
22nd Annual Logistics Conference and Exhibition
2006-04-20
Presentation slides: "Prognostics & Health Management at GE," Dr. Piero P. Bonissone, Industrial AI Lab, GE Global Research. Topics include anomaly detection from event-log data, failure-mode histograms, and diagnostics/prognostics for failure monitoring and assessment in tactical C4ISR (sense and respond).
NASA Astrophysics Data System (ADS)
Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen
2017-04-01
Extreme rainfall events and the resulting flash floods led to massive devastation in Germany during spring 2016. The study presented aims at the development of an early warning system that allows the simulation and assessment of negative effects on infrastructure from radar-based heavy rainfall predictions, which serve as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agriculturally used landscapes. In a first step, documented historical events were analyzed with respect to the agreement between measured radar rainfall and large-scale erosion risk maps. A second step focused on small-scale erosion monitoring via UAV of the source areas of heavy flooding events and on a model reconstruction of the processes involved. In all examples, damage was caused to local infrastructure. Both analyses are promising for detecting runoff- and sediment-delivering areas even at high temporal and spatial resolution. Results prove the important role of late-covering crops such as maize, sugar beet or potatoes in runoff generation. While, e.g., winter wheat positively affects extensive runoff generation on undulating landscapes, massive soil loss and thus muddy flows are observed and depicted in the model results. Future research aims at large-scale model parameterization and application in real time, uncertainty estimation of the precipitation forecast, and interface development.
NASA Astrophysics Data System (ADS)
Himemoto, Yoshiaki; Taruya, Atsushi
2017-07-01
After the first direct detection of gravitational waves (GW), detection of the stochastic background of GWs is an important next step, and the first GW event suggests that it is within the reach of the second-generation ground-based GW detectors. Such a GW signal is typically tiny and can be detected by cross-correlating the data from two spatially separated detectors if the detector noise is uncorrelated. It has been advocated, however, that the global magnetic fields in the Earth-ionosphere cavity produce environmental disturbances at low-frequency bands, known as Schumann resonances, which potentially couple with GW detectors. In this paper, we present a simple analytical model to estimate their impact on the detection of stochastic GWs. The model crucially depends on the geometry of the detector pair through the directional coupling, and we investigate the basic properties of the correlated magnetic noise based on the analytic expressions. The model reproduces the major trend of the recently measured global correlation between the GW detectors via magnetometers, and the estimated values of the impact of correlated noise also match those obtained from the measurement. Finally, we discuss the implications for the detection of stochastic GWs with upcoming detectors, including KAGRA and LIGO India. The model suggests that the LIGO Hanford-Virgo and Virgo-KAGRA pairs are possibly less sensitive to the correlated noise and can achieve a better sensitivity to the stochastic GW signal in the most pessimistic case.
Global Infrasound Association Based on Probabilistic Clutter Categorization
NASA Astrophysics Data System (ADS)
Arora, N. S.; Mialle, P.
2015-12-01
The IDC collects waveforms from a global network of infrasound sensors maintained by the IMS, and automatically detects signal onsets and associates them to form event hypotheses. However, a large number of signal onsets are due to local clutter sources such as microbaroms (from standing waves in the oceans), waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NET-VISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydro-acoustic and infrasound processing built on a unified probabilistic framework. Notes: The attached figure shows all the unassociated arrivals detected at IMS station I09BR for 2012, distributed by azimuth and center frequency (the title displays the bandwidth of the kernel density estimate along the azimuth and frequency dimensions). This plot shows multiple microbarom sources as well as other sources of infrasound clutter. A diverse clutter field such as this one is quite common for most IMS infrasound stations, and it highlights the dangers of forming events without due consideration of this source of noise. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NET-VISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
NASA Astrophysics Data System (ADS)
Zhou, Cong; Chase, J. Geoffrey; Rodgers, Geoffrey W.; Xu, Chao
2017-02-01
The model-free hysteresis loop analysis (HLA) method for structural health monitoring (SHM) has significant advantages over traditional model-based SHM methods, which require a suitable baseline model to represent the actual system response. This paper provides a unique validation against both an experimental reinforced concrete (RC) building and a calibrated numerical model to delineate the capability of the model-free HLA method and the adaptive least mean squares (LMS) model-based method in detecting, localizing and quantifying damage that may not be visible or observable in the overall structural response. Results clearly show the model-free HLA method is capable of adapting to changes in how structures transfer load or demand across structural elements over time and over multiple events of different size. The adaptive LMS model-based method, however, indicated a wider spread of lesser damage across time and stories when the baseline model is not well defined. Finally, the two algorithms are tested on a simpler steel structure with typical hysteretic behaviour to quantify the impact of mismatch between the baseline model used for identification and the actual response. The overall results highlight the need for model-based methods to have an appropriate model that can capture the observed response in order to yield accurate results, even in small events where the structure remains linear.
Gravitational wave signature of a mini creation event (MCE)
NASA Astrophysics Data System (ADS)
Dhurandhar, S. V.; Narlikar, J. V.
2018-07-01
In light of the recent discoveries of binary black hole events and one neutron star event by the advanced LIGO (aLIGO) and advanced Virgo (aVirgo) detectors, we propose a new astrophysical source, namely the mini creation event (MCE), as a possible source of gravitational waves (GW) to be detected by advanced detectors. The MCE is at the heart of the quasi steady state cosmology (QSSC) and is not expected to occur in standard cosmology. Generically, the MCE is anisotropic and we assume a Bianchi Type I model for its description. We compute its signature waveform and assume masses and distances analogous to those of the events detected. The striking feature of the waveform associated with this model of the MCE is that it depends only on one amplitude parameter and thus allows for simpler data analysis. By matched filtering the signal we find that, for a broad range of model parameters, the signal-to-noise ratio of the randomly oriented MCE is sufficiently high for a confident detection by aLIGO and aVirgo. We therefore propose the MCE as a viable astrophysical source of GW. The detection or non-detection of such a source also holds implications for QSSC, namely, whether it is a viable cosmology or not.
Tracking Vessels to Illegal Pollutant Discharges Using Multisource Vessel Information
NASA Astrophysics Data System (ADS)
Busler, J.; Wehn, H.; Woodhouse, L.
2015-04-01
Illegal discharge of bilge waters is a significant source of oil and other environmental pollutants in Canadian and international waters. Imaging satellites are commonly used to monitor large areas to detect oily discharges from vessels, off-shore platforms and other sources. While remotely sensed imagery provides a snapshot useful for detecting a spill or the presence of vessels in the vicinity, it is difficult to directly associate a vessel with an observed spill unless the vessel is observed while the discharge is occurring. The situation becomes more challenging with increased vessel traffic, as multiple vessels may be associated with a spill event. By combining multiple sources of vessel location data, such as Automated Information Systems (AIS), Long Range Identification and Tracking (LRIT) and SAR-based ship detection, with spill detections and drift models, we have created a system that associates detected spill events with vessels in the area using a probabilistic model that intersects vessel tracks and spill drift trajectories in both time and space. Working with the Canadian Space Agency and the Canadian Ice Service's Integrated Satellite Tracking of Pollution (ISTOP) program, we use spills observed in Canadian waters to demonstrate the investigative value of augmenting spill detections with temporally sequenced vessel and spill tracking information.
Determining dark matter properties with a XENONnT/LZ signal and LHC Run 3 monojet searches
NASA Astrophysics Data System (ADS)
Baum, Sebastian; Catena, Riccardo; Conrad, Jan; Freese, Katherine; Krauss, Martin B.
2018-04-01
We develop a method to forecast the outcome of the LHC Run 3 based on the hypothetical detection of O (100 ) signal events at XENONnT. Our method relies on a systematic classification of renormalizable single-mediator models for dark matter-quark interactions and is valid for dark matter candidates of spin less than or equal to one. Applying our method to simulated data, we find that at the end of the LHC Run 3 only two mutually exclusive scenarios would be compatible with the detection of O (100 ) signal events at XENONnT. In the first scenario, the energy distribution of the signal events is featureless, as for canonical spin-independent interactions. In this case, if a monojet signal is detected at the LHC, dark matter must have spin 1 /2 and interact with nucleons through a unique velocity-dependent operator. If a monojet signal is not detected, dark matter interacts with nucleons through canonical spin-independent interactions. In a second scenario, the spectral distribution of the signal events exhibits a bump at nonzero recoil energies. In this second case, a monojet signal can be detected at the LHC Run 3; dark matter must have spin 1 /2 and interact with nucleons through a unique momentum-dependent operator. We therefore conclude that the observation of O (100 ) signal events at XENONnT combined with the detection, or the lack of detection, of a monojet signal at the LHC Run 3 would significantly narrow the range of possible dark matter-nucleon interactions. As we argued above, it can also provide key information on the dark matter particle spin.
Multi-Station Broad Regional Event Detection Using Waveform Correlation
NASA Astrophysics Data System (ADS)
Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.
2013-12-01
Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
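At its core, the per-station detector slides each template over the continuous trace and flags windows whose normalized cross-correlation exceeds a threshold. The sketch below is an assumed single-template, single-channel version for illustration (the threshold value is a placeholder), not the authors' distributed system.

import numpy as np

def correlation_detections(trace, template, threshold=0.7):
    """Return (sample_index, correlation) pairs where the normalized
    cross-correlation between the template and a trace window exceeds
    the threshold."""
    trace = np.asarray(trace, dtype=float)
    template = np.asarray(template, dtype=float)
    n = len(template)
    t_norm = (template - template.mean()) / (template.std() * np.sqrt(n))
    detections = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        s = w.std()
        if s == 0:
            continue
        cc = np.dot(t_norm, (w - w.mean()) / (s * np.sqrt(n)))
        if cc >= threshold:
            detections.append((i, cc))
    return detections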
Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre
2015-11-15
The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater to natural water and environment. This study mainly addresses long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
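The Mann-Kendall test applied above has a standard closed form; a short sketch (without the tie correction, which the study's implementation may include) follows.

import numpy as np
from scipy.stats import norm

def mann_kendall(series):
    """Return (S, Z, two-sided p-value) for a monotonic-trend test."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))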
Machine intelligence-based decision-making (MIND) for automatic anomaly detection
NASA Astrophysics Data System (ADS)
Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas
2007-04-01
Any event deemed as being out-of-the-ordinary may be called an anomaly. Anomalies by virtue of their definition are events that occur spontaneously with no prior indication of their existence or appearance. Effects of anomalies are typically unknown until they actually occur, and their effects aggregate in time to show noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused such behavior can be detected early, and any consequence attributed to the specific anomaly. Substantial time and effort is required to back-track the cause for abnormal behavior and to recreate the event sequence leading to abnormal behavior. There is a critical need therefore to automatically detect anomalous behavior as and when they may occur, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.
An active monitoring method for flood events
NASA Astrophysics Data System (ADS)
Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya
2018-07-01
Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 is conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
Human visual system-based smoking event detection
NASA Astrophysics Data System (ADS)
Odetallah, Amjad D.; Agaian, Sos S.
2012-06-01
Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact on raising the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new scheme of a system for detecting human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. Moreover, the developed method is capable of detecting small smoking events involving uncertain actions and various cigarette sizes, colors, and shapes.
New early warning system for gravity-driven ruptures based on codetection of acoustic signal
NASA Astrophysics Data System (ADS)
Faillettaz, J.
2016-12-01
Gravity-driven rupture phenomena in natural media (e.g. landslides, rockfalls, snow or ice avalanches) represent an important class of natural hazards in mountainous regions. To protect the population against such events, a timely evacuation often constitutes the only effective way to secure the potentially endangered area. However, reliable prediction of the imminence of such failure events remains challenging due to the nonlinear and complex nature of geological material failure, hampered by inherent heterogeneity, unknown initial mechanical state, and complex load application (rainfall, temperature, etc.). Here, a simple method for real-time early warning that considers both the heterogeneity of natural media and the characteristics of acoustic emission attenuation is proposed. This new method capitalizes on the codetection of elastic waves emanating from microcracks by multiple, spatially separated sensors. Event codetection is considered a surrogate for large event size, with more frequent codetected events (i.e., detected concurrently on more than one sensor) marking the imminence of catastrophic failure. A simple numerical model, based on a fiber bundle model and considering signal attenuation and hypothetical arrays of sensors, confirms the early warning potential of codetection principles. Results suggest that although the statistical properties of attenuated signal amplitudes could lead to misleading results, monitoring the emergence of large events announcing impending failure is possible even with attenuated signals, depending on sensor network geometry and detection threshold. Preliminary application of the proposed method to acoustic emissions during failure of snow samples has confirmed the potential use of codetection as an indicator of imminent failure at laboratory scale. The applicability of such a simple and inexpensive early warning system is now being investigated at a larger scale (hillslope). First results of such a pilot field experiment are presented and analysed.
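A possible realization of the codetection criterion (an assumption for illustration, not the authors' implementation) simply counts picks that occur on at least two distinct sensors within a short coincidence window; the rate of such codetections then serves as the early-warning indicator. The window length and sensor count are placeholders.

def codetection_times(pick_times_per_sensor, window=0.01, min_sensors=2):
    """pick_times_per_sensor: one iterable of pick times (s) per sensor.
    Returns pick times at which at least min_sensors distinct sensors
    triggered within the coincidence window."""
    tagged = sorted((t, s) for s, picks in enumerate(pick_times_per_sensor)
                    for t in picks)
    codetections, j = [], 0
    for i, (t0, _) in enumerate(tagged):
        while j < len(tagged) and tagged[j][0] <= t0 + window:
            j += 1
        if len({s for _, s in tagged[i:j]}) >= min_sensors:
            codetections.append(t0)
    return codetections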
NASA Astrophysics Data System (ADS)
Gilmanshin, I. R.; Kirpichnikov, A. P.
2017-09-01
A study of the operating algorithm of the early detection module for excessive losses shows that it can be modeled using absorbing Markov chains. Of particular interest are the probabilistic characteristics of the module's operating algorithm, which are examined in order to relate the reliability indicators of individual elements, or the probabilities of occurrence of certain events, to the likelihood of transmitting reliable information. The relations identified during the analysis make it possible to set threshold reliability characteristics for the system components.
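A hedged illustration of the absorbing-Markov-chain machinery the abstract refers to (the states and transition probabilities below are invented for the example, not taken from the paper): the fundamental matrix gives the expected number of steps spent in transient states, and the absorption matrix gives the probability of ending in each absorbing state, e.g., "reliable report delivered" versus "information lost".

```python
import numpy as np

# Hypothetical 4-state chain: states 0-1 are transient (measuring, transmitting),
# states 2-3 are absorbing (reliable report delivered, information lost).
P = np.array([
    [0.10, 0.60, 0.25, 0.05],   # measuring
    [0.00, 0.20, 0.70, 0.10],   # transmitting
    [0.00, 0.00, 1.00, 0.00],   # reliable report (absorbing)
    [0.00, 0.00, 0.00, 1.00],   # information lost (absorbing)
])

Q = P[:2, :2]                          # transient -> transient block
R = P[:2, 2:]                          # transient -> absorbing block
N = np.linalg.inv(np.eye(2) - Q)       # fundamental matrix: expected visits
B = N @ R                              # absorption probabilities per start state

print("Expected steps before absorption:", N.sum(axis=1))
print("P(reliable report | start in 'measuring'):", B[0, 0])
```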
Efficient hemodynamic event detection utilizing relational databases and wavelet analysis
NASA Technical Reports Server (NTRS)
Saeed, M.; Mark, R. G.
2001-01-01
Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
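An illustrative sketch of the underlying idea, not the authors' schema (the synthetic trend, wavelet choice, and event rule are assumptions): a discrete wavelet transform compresses a hemodynamic trend into a small set of coefficients that can be stored as a compact, multi-scale descriptor and queried for large deviations.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(0)
trend = 90 + np.cumsum(rng.normal(0, 0.3, 1024))      # synthetic arterial pressure trend

coeffs = pywt.wavedec(trend, wavelet="db4", level=5)   # multi-level DWT
descriptor = np.concatenate(coeffs)                    # flattened coefficients to store

# Crude "event" query: large detail coefficients at the coarsest scale flag
# sustained hemodynamic changes relative to the fine-scale noise level.
coarse_details = coeffs[1]
event_flag = np.any(np.abs(coarse_details) > 5 * np.std(coeffs[-1]))
print(len(descriptor), event_flag)
```

In a relational setting, rows keyed by (patient, parameter, scale) holding such coefficients would allow range queries over time scales without expensive table joins, which is the design goal the abstract describes.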
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered preferable due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed in previous studies to detect toe off (TO) and heel strike (HS) gait events. While these algorithms achieve reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair-ascent and stair-descent terrains. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by the determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the algorithm is robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop foot correction devices and leg prostheses.
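A minimal sketch of the jerk-and-peak idea (the synthetic signal, threshold, and minimum peak spacing are assumptions, not the paper's tuned heuristics): differentiate the acceleration to obtain jerk, then pick prominent jerk peaks as candidate gait events.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                    # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
acc = np.sin(2 * np.pi * 1.0 * t) + 0.01 * np.random.randn(t.size)  # toy shank acceleration

jerk = np.gradient(acc, 1 / fs)               # time derivative of acceleration
peaks, _ = find_peaks(jerk, height=3.0, distance=int(0.5 * fs))

candidate_hs_times = t[peaks]                 # candidate heel-strike instants
print(candidate_hs_times[:5])
```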
Crespo, Andrea; Álvarez, Daniel; Kheirandish-Gozal, Leila; Gutiérrez-Tobal, Gonzalo C; Cerezo-Hernández, Ana; Gozal, David; Hornero, Roberto; Del Campo, Félix
2018-02-16
A variety of statistical models based on overnight oximetry have been proposed to simplify the detection of children with suspected obstructive sleep apnea syndrome (OSAS). Despite the usefulness reported, additional thorough comparative analyses are required. This study was aimed at assessing common binary classification models from oximetry for the detection of childhood OSAS. Overnight oximetry recordings from 176 children referred for clinical suspicion of OSAS were acquired during in-lab polysomnography. Several training and test datasets were randomly composed by means of bootstrapping for model optimization and independent validation. For every child, blood oxygen saturation (SpO2) was parameterized by means of 17 features. A fast correlation-based filter (FCBF) was applied to search for the optimum features. The discriminatory power of three statistical pattern recognition algorithms was assessed: linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and logistic regression (LR). The performance of each automated model was evaluated for the three common diagnostic polysomnographic cutoffs in pediatric OSAS: 1, 3, and 5 events/h. The best screening performances emerged using the 1 event/h cutoff for mild-to-severe childhood OSAS. LR achieved 84.3% accuracy (95% CI 76.8-91.5%) and 0.89 AUC (95% CI 0.83-0.94), while QDA reached 96.5% PPV (95% CI 90.3-100%) and 0.91 AUC (95% CI 0.85-0.96). Moreover, LR and QDA reached diagnostic accuracies of 82.7% (95% CI 75.0-89.6%) and 82.1% (95% CI 73.8-89.5%), respectively, for a cutoff of 5 events/h. Automated analysis of overnight oximetry may be used to develop reliable and accurate screening tools for childhood OSAS.
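A hedged sketch of the comparison the study describes (synthetic features and labels stand in for the real oximetry data; the single bootstrap split below only illustrates the resampling idea): LDA, QDA, and logistic regression are trained on a bootstrap sample and scored on the out-of-bag children.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

rng = np.random.default_rng(1)
X = rng.normal(size=(176, 17))                       # 17 SpO2 features per child (synthetic)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=176)) > 0   # label for a chosen AHI cutoff

idx = resample(np.arange(len(y)), replace=True, random_state=0)   # bootstrap training set
oob = np.setdiff1d(np.arange(len(y)), idx)                        # out-of-bag validation set

for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis(),
              LogisticRegression(max_iter=1000)):
    model.fit(X[idx], y[idx])
    auc = roc_auc_score(y[oob], model.predict_proba(X[oob])[:, 1])
    print(type(model).__name__, round(auc, 3))
```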
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Andeen, K.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Argüelles, C.; Auffenberg, J.; Axani, S.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blot, S.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Burgman, A.; Carver, T.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cross, R.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dujmovic, H.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Eller, P.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Franckowiak, A.; Friedman, E.; Fuchs, T.; Gaisser, T. K.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Giang, W.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Grant, D.; Griffith, Z.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, B.; Hansmann, T.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Hoshina, K.; Huang, F.; Huber, M.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kemp, J.; Kheirandish, A.; Kim, M.; Kintscher, T.; Kiryluk, J.; Kittler, T.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, M.; Krückl, G.; Krüger, C.; Kunnen, J.; Kunwar, S.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lauber, F.; Lennarz, D.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mancina, S.; Mandelartz, M.; Maruyama, R.; Mase, K.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Mohrmann, L.; Montaruli, T.; Moulai, M.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Penek, Ö.; Pepper, J. A.; Pérez de los Heros, C.; Pieloth, D.; Pinat, E.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relethford, B.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Rysewyk, D.; Sabbatini, L.; Sanchez Herrera, S. E.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Satalecka, K.; Schimp, M.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schumacher, L.; Seckel, D.; Seunarine, S.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stanev, T.; Stasik, A.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. 
W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Tenholt, F.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Rossem, M.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Weiss, M. J.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wickmann, S.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wolf, M.; Wood, T. R.; Woolsey, E.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.; IceCube Collaboration
2016-12-01
We report constraints on the sources of ultrahigh-energy cosmic rays (UHECRs) above 10^9 GeV, based on an analysis of seven years of IceCube data. This analysis efficiently selects very high-energy neutrino-induced events with deposited energies from 5×10^5 GeV to above 10^11 GeV. Two neutrino-induced events with estimated deposited energies of (2.6±0.3)×10^6 GeV, the highest neutrino energy observed so far, and (7.7±2.0)×10^5 GeV were detected. The atmospheric background-only hypothesis of detecting these events is rejected at 3.6σ. The hypothesis that the observed events are of cosmogenic origin is also rejected at >99% CL because of the limited deposited energy and the nonobservation of events at higher energy, while their observation is consistent with an astrophysical origin. Our limits on cosmogenic neutrino fluxes disfavor UHECR sources having a cosmological evolution stronger than the star formation rate, e.g., active galactic nuclei and γ-ray bursts, assuming proton-dominated UHECRs. Constraints on UHECR sources including mixed and heavy UHECR compositions are obtained for models of neutrino production within UHECR sources. Our limit disfavors a significant part of the parameter space for active galactic nuclei and newborn pulsar models. These limits on ultrahigh-energy neutrino flux models are the most stringent to date.
A risk-based multi-objective model for optimal placement of sensors in water distribution system
NASA Astrophysics Data System (ADS)
Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein
2018-02-01
In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for the optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. CVaR considers the uncertainties of contamination injection in the form of a probability distribution function and captures low-probability extreme events, i.e., the extreme losses occurring in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses of contamination injection (through the CVaR of affected population and detection time) and also to minimize the two other main criteria for the optimal placement of sensors: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi-Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the importance of each criterion on the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests six sensors whose distribution approximately covers all regions of the WDS. The optimal values of the CVaR of affected population, detection time, and probability of undetected events for the best solution are 17,055 persons, 31 min, and 0.045%, respectively. The results obtained for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme losses in a WDS.
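A minimal CVaR sketch under stated assumptions (the lognormal loss samples are illustrative stand-ins for the affected-population distribution, not the WDS model): CVaR at level alpha is the mean loss over the worst (1 - alpha) tail, here estimated from Monte Carlo samples.

```python
import numpy as np

rng = np.random.default_rng(42)
losses = rng.lognormal(mean=8.0, sigma=1.0, size=100_000)  # affected persons (synthetic)

alpha = 0.95
var = np.quantile(losses, alpha)             # Value at Risk: tail threshold
cvar = losses[losses >= var].mean()          # Conditional Value at Risk: mean tail loss

print(f"VaR({alpha}) = {var:.0f}, CVaR({alpha}) = {cvar:.0f}")
```

In the optimization described above, such a CVaR estimate (evaluated per candidate sensor layout) is what enters the objective vector alongside detection time, probability of undetected events, and cost.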
Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets
NASA Astrophysics Data System (ADS)
Bazzica, A.; van Gemert, J. C.; Liem, C. C. S.; Hanjalic, A.
2017-05-01
Acoustic events often have a visual counterpart. Knowledge of visual information can aid the understanding of complex auditory scenes, even when only a stereo mixdown is available in the audio domain, e.g., identifying which musicians are playing in large musical ensembles. In this paper, we consider a vision-based approach to note onset detection. As a case study we focus on challenging, real-world clarinetist videos and carry out preliminary experiments with a 3D convolutional neural network based on multiple streams that purposely avoids temporal pooling. We release an audiovisual dataset with 4.5 hours of clarinetist videos together with cleaned annotations, which include about 36,000 onsets and the coordinates of a number of salient points and regions of interest. By performing several training trials on our dataset, we learned that the problem is challenging. We found that the CNN model is highly sensitive to the optimization algorithm and hyper-parameters, and that treating the problem as binary classification may prevent the joint optimization of precision and recall. To encourage further research, we publicly share our dataset, annotations, and all models, and detail the issues we came across during our preliminary experiments.
NASA Astrophysics Data System (ADS)
Talukder, A.; Panangadan, A. V.; Blumberg, A. F.; Herrington, T.; Georgas, N.
2008-12-01
The New York Harbor Observation and Prediction System (NYHOPS) is a real-time, estuarine and coastal ocean observing and modeling system for the New York Harbor and surrounding waters. Real-time measurements from in-situ mobile and stationary sensors in the NYHOPS networks are assimilated into marine forecasts in order to reduce the discrepancy with ground truth. The forecasts are obtained from the ECOMSED hydrodynamic model, a shallow water derivative of the Princeton Ocean Model. Currently, all sensors in the NYHOPS system are operated in a fixed mode with uniform sampling rates. This technology infusion effort demonstrates the use of Model Predictive Control (MPC) to autonomously adapt the operation of both mobile and stationary sensors in response to changing events that are automatically detected from the ECOMSED forecasts. The controller focuses sensing resources on those regions that are expected to be impacted by the detected events. The MPC approach involves formulating the calculation of optimal sensor parameters as a constrained multi-objective optimization problem. We have developed an objective function that takes into account the spatiotemporal relationship between the in-situ sensor locations and the locations of events detected by the model. Simulation experiments were carried out using data collected during a freshwater flooding event. The location of the resulting freshwater plume was calculated from the corresponding model forecasts and was used by the MPC controller to derive control parameters for the sensing assets. The operational parameters that are controlled include the sampling rates of stationary sensors, the paths of unmanned underwater vehicles (UUVs), and the data transfer routes between sensors and the central modeling computer. The simulation experiments show that MPC-based sensor control reduces the RMS error in the forecast by a factor of 380% compared to uniform sampling. The paths of multiple UUVs were simultaneously calculated such that measurements from on-board sensors would lead to the maximal reduction in forecast error after data assimilation. The MPC controller also reduces the consumption of system resources such as the energy expended in sampling and wireless communication. The MPC-based control approach can be generalized to accept data from remote sensing satellites. This will enable in-situ sensors to be regulated using forecasts generated by assimilating local high-resolution in-situ measurements with wide-area observations from remote sensing satellites.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Background: Machine learning methods may complement traditional analytic methods for medical device surveillance. Methods and results: Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs, using two propensity score (PS) models: one specified by subject-matter experts (PS-SME) and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for the three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ=-0.028), and between the DELTA and embedded feature-selection approaches was 88.1% (349 of 396, κ=-0.042). Conclusion: Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance. PMID:28860874
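A hedged, illustrative sketch of the propensity-score idea shared by the three approaches (synthetic data and inverse-probability weighting stand in for the registry data and for the study's exact adjustment and signal-detection rules): the probability of receiving a given device model is estimated from baseline covariates, and adverse-event rates are then compared on the propensity-adjusted scale.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(11)
X = rng.normal(size=(5000, 10))                         # baseline covariates (synthetic)
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # receipt of the device of interest
event = rng.binomial(1, 0.15 + 0.02 * treated)          # adverse-event indicator

# Propensity score: probability of receiving the device given covariates.
ps = LogisticRegressionCV(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# One common use of the score: inverse-probability-weighted event rates.
rate_t = np.average(event[treated == 1], weights=1 / ps[treated == 1])
rate_c = np.average(event[treated == 0], weights=1 / (1 - ps[treated == 0]))
print("adjusted rate difference:", rate_t - rate_c)
```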
Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory
ERIC Educational Resources Information Center
Agres, Kat; Abdallah, Samer; Pearce, Marcus
2018-01-01
A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different…
21st Century Changes in Precipitation Extremes Based on Resolved Atmospheric Patterns
NASA Astrophysics Data System (ADS)
Gao, X.; Schlosser, C. A.; O'Gorman, P. A.; Monier, E.
2014-12-01
Global warming is expected to alter the frequency and/or magnitude of extreme precipitation events. Such changes could have substantial ecological, economic, and sociological consequences. However, climate models in general do not correctly reproduce the frequency distribution of precipitation, especially at the regional scale. In this study, a validated analogue method is employed to diagnose potential future shifts in the probability of extreme precipitation over the United States under global warming. The method is based on the use of resolved large-scale meteorological conditions (i.e., flow features and moisture supply) to detect the occurrence of extreme precipitation. The CMIP5 multi-model projections have been compiled for two radiative forcing scenarios (Representative Concentration Pathways 4.5 and 8.5). We further analyze the accompanying circulation features and their changes that may be responsible for shifts in extreme precipitation in response to a changed climate. The application of such an analogue method to detect other types of hazard events, e.g., landslides, is also explored. The results from this study may guide hazardous weather watches and help society develop adaptive strategies for preventing catastrophic losses.
Comparative study of predicted and experimentally detected interplanetary shocks
NASA Astrophysics Data System (ADS)
Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.
2002-03-01
We compare the real-time space weather predictions of shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and by the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2), to a real-time analysis of plasma and field data from ACE. The comparison is made using an algorithm developed on the basis of wavelet data analysis and an MHD identification procedure. The shock parameters are estimated for selected "candidate events". An automated Web-based interface periodically processes the solar wind observations made by ACE at L1. Near-real-time results, as well as an archive of the registered events of interest, are available on a specially developed web site. A number of events are considered. These studies are essential for the validation of real-time space weather forecasts made from solar data.
Modeling Tool for Decision Support during Early Days of an Anthrax Event.
Rainisch, Gabriel; Meltzer, Martin I; Shadomy, Sean; Bower, William A; Hupert, Nathaniel
2017-01-01
Health officials lack field-implementable tools for forecasting the effects that a large-scale release of Bacillus anthracis spores would have on public health and hospitals. We created a modeling tool (combining inhalational anthrax caseload projections based on initial case reports, effects of variable postexposure prophylaxis campaigns, and healthcare facility surge capacity requirements) to project hospitalizations and casualties from a newly detected inhalation anthrax event, and we examined the consequences of intervention choices. With only 3 days of case counts, the model can predict final attack sizes for simulated Sverdlovsk-like events (1979 USSR) with sufficient accuracy for decision making and confirms the value of early postexposure prophylaxis initiation. According to a baseline scenario, hospital treatment volume peaks 15 days after exposure, deaths peak earlier (day 5), and recovery peaks later (day 23). This tool gives public health, hospital, and emergency planners scenario-specific information for developing quantitative response plans for this threat.
Parkison, Steven A.; Carlson, Jay D.; Chaudoin, Tammy R.; Hoke, Traci A.; Schenk, A. Katrin; Goulding, Evan H.; Pérez, Lance C.; Bonasera, Stephen J.
2016-01-01
Inexpensive, high-throughput, low maintenance systems for precise temporal and spatial measurement of mouse home cage behavior (including movement, feeding, and drinking) are required to evaluate products from large scale pharmaceutical design and genetic lesion programs. These measurements are also required to interpret results from more focused behavioral assays. We describe the design and validation of a highly-scalable, reliable mouse home cage behavioral monitoring system modeled on a previously described, one-of-a-kind system [1]. Mouse position was determined by solving static equilibrium equations describing the force and torques acting on the system strain gauges; feeding events were detected by a photobeam across the food hopper, and drinking events were detected by a capacitive lick sensor. Validation studies show excellent agreement between mouse position and drinking events measured by the system compared with video-based observation – a gold standard in neuroscience. PMID:23366406
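A sketch of the static-equilibrium idea under stated assumptions (the gauge geometry and readings below are made up for illustration): with the cage resting on four strain gauges at known positions, the force-weighted centroid of the baseline-subtracted gauge readings gives the animal's planar position, which is exactly the torque balance about each horizontal axis.

```python
import numpy as np

gauge_xy = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.2], [0.3, 0.2]])  # gauge positions (m)
tare = np.array([2.1, 2.0, 2.2, 2.1])        # empty-cage readings (N)
reading = np.array([2.2, 2.3, 2.25, 2.15])   # readings with the mouse present (N)

f = reading - tare                            # force attributable to the mouse
position = (gauge_xy * f[:, None]).sum(axis=0) / f.sum()   # torque balance / force sum
print("Estimated mouse position (m):", position)
```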
NASA Astrophysics Data System (ADS)
Heleno, S.; Matias, M.; Pina, P.; Sousa, A. J.
2015-09-01
A method for semi-automatic landslide detection, with the ability to separate source and run-out areas, is presented in this paper. It combines object-based image analysis and a Support Vector Machine classifier applied to a GeoEye-1 multispectral image, sensed 3 days after the major damaging landslide event that occurred on Madeira island (20 February 2010), together with a pre-event LIDAR Digital Elevation Model. The testing is developed in a 15 km² study area, where 95% of the landslide scars are detected by this supervised approach. The classifier presents a good performance in the delineation of the overall landslide area. In addition, fair results are achieved in the separation of the source from the run-out landslide areas, although on less illuminated slopes this discrimination is less effective than on sunnier, east-facing slopes.
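A hedged sketch of the object-based classification stage (the features and labels are synthetic stand-ins; real inputs would be per-object spectral, textural, and DEM-derived descriptors): a scaled RBF-kernel SVM separates background, landslide source, and run-out objects.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))            # per-object features (e.g., NDVI, brightness, slope, texture)
y = rng.integers(0, 3, size=500)         # 0 = background, 1 = source, 2 = run-out

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
```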
Progression of regional grey matter atrophy in multiple sclerosis.
Eshaghi, Arman; Marinescu, Razvan V; Young, Alexandra L; Firth, Nicholas C; Prados, Ferran; Jorge Cardoso, M; Tur, Carmen; De Angelis, Floriana; Cawley, Niamh; Brownlee, Wallace J; De Stefano, Nicola; Laura Stromillo, M; Battaglini, Marco; Ruggieri, Serena; Gasperini, Claudio; Filippi, Massimo; Rocca, Maria A; Rovira, Alex; Sastre-Garriga, Jaume; Geurts, Jeroen J G; Vrenken, Hugo; Wottschel, Viktor; Leurs, Cyra E; Uitdehaag, Bernard; Pirpamer, Lukas; Enzinger, Christian; Ourselin, Sebastien; Gandini Wheeler-Kingshott, Claudia A; Chard, Declan; Thompson, Alan J; Barkhof, Frederik; Alexander, Daniel C; Ciccarelli, Olga
2018-06-01
See Stankoff and Louapre (doi:10.1093/brain/awy114) for a scientific commentary on this article. Grey matter atrophy is present from the earliest stages of multiple sclerosis, but its temporal ordering is poorly understood. We aimed to determine the sequence in which grey matter regions become atrophic in multiple sclerosis and its association with disability accumulation. In this longitudinal study, we included 1417 subjects: 253 with clinically isolated syndrome, 708 with relapsing-remitting multiple sclerosis, 128 with secondary-progressive multiple sclerosis, 125 with primary-progressive multiple sclerosis, and 203 healthy control subjects from seven European centres. Subjects underwent repeated MRI (total number of scans 3604); the mean follow-up for patients was 2.41 years (standard deviation = 1.97). Disability was scored using the Expanded Disability Status Scale. We calculated the volume of brain grey matter regions and brainstem using an unbiased within-subject template and used an established data-driven event-based model to determine the sequence of occurrence of atrophy and its uncertainty. We assigned each subject to a specific event-based model stage, based on the number of their atrophic regions. Linear mixed-effects models were used to explore associations between the rate of increase in event-based model stages, and T2 lesion load, disease-modifying treatments, comorbidity, disease duration and disability accumulation. The first regions to become atrophic in patients with clinically isolated syndrome and relapse-onset multiple sclerosis were the posterior cingulate cortex and precuneus, followed by the middle cingulate cortex, brainstem and thalamus. A similar sequence of atrophy was detected in primary-progressive multiple sclerosis with the involvement of the thalamus, cuneus, precuneus, and pallidum, followed by the brainstem and posterior cingulate cortex. The cerebellum, caudate and putamen showed early atrophy in relapse-onset multiple sclerosis and late atrophy in primary-progressive multiple sclerosis. Patients with secondary-progressive multiple sclerosis showed the highest event-based model stage (the highest number of atrophic regions, P < 0.001) at the study entry. All multiple sclerosis phenotypes, but clinically isolated syndrome, showed a faster rate of increase in the event-based model stage than healthy controls. T2 lesion load and disease duration in all patients were associated with increased event-based model stage, but no effects of disease-modifying treatments and comorbidity on event-based model stage were observed. The annualized rate of event-based model stage was associated with the disability accumulation in relapsing-remitting multiple sclerosis, independent of disease duration (P < 0.0001). The data-driven staging of atrophy progression in a large multiple sclerosis sample demonstrates that grey matter atrophy spreads to involve more regions over time. The sequence in which regions become atrophic is reasonably consistent across multiple sclerosis phenotypes. The spread of atrophy was associated with disease duration and with disability accumulation over time in relapsing-remitting multiple sclerosis.
Reaction times to weak test lights. [psychophysics biological model
NASA Technical Reports Server (NTRS)
Wandell, B. A.; Ahumada, P.; Welsh, D.
1984-01-01
Maloney and Wandell (1984) describe a model of the response of a single visual channel to weak test lights. The initial channel response is a linearly filtered version of the stimulus. The filter output is randomly sampled over time. Each time a sample occurs, there is some probability - increasing with the magnitude of the sampled response - that a discrete detection event is generated. Maloney and Wandell derive the statistics of the detection events. In this paper we test the hypothesis that reaction-time responses to the presence of a weak test light are initiated at the first detection event. This makes it possible to extend the application of the model to lights that are slightly above threshold, but still within the linear operating range of the visual system. A parameter-free prediction of the model proposed by Maloney and Wandell for lights detected by this statistic is tested. The data are in agreement with the prediction.
Vertically Integrated Seismological Analysis II : Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / [π(x)q(x′ | x)]). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, each of which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then align with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or by human analysts.
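A minimal Metropolis-Hastings sketch for a one-dimensional target, included only to make the acceptance rule concrete (a toy stand-in for the posterior over worlds; the real sampler proposes birth, death, split, merge, and swap moves over event hypotheses):

```python
import numpy as np

def log_target(x):
    return -0.5 * (x - 3.0) ** 2            # unnormalised log pi(x), toy example

rng = np.random.default_rng(0)
x, chain = 0.0, []
for _ in range(10_000):
    x_prop = x + rng.normal(0, 1.0)         # symmetric proposal q(x'|x)
    log_alpha = log_target(x_prop) - log_target(x)   # q terms cancel for symmetric q
    if np.log(rng.uniform()) < log_alpha:   # accept with probability min(1, alpha)
        x = x_prop
    chain.append(x)

print("posterior mean estimate:", np.mean(chain[2000:]))
```

For asymmetric moves such as birth/death or split/merge, the ratio q(x | x′)/q(x′ | x) no longer cancels and must be included in the acceptance probability, which is exactly what the model-derived acceptance rules described above provide.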
Liu, An-An; Li, Kang; Kanade, Takeo
2012-02-01
We propose a semi-Markov model trained in a max-margin learning framework for mitosis event segmentation in large-scale time-lapse phase contrast microscopy image sequences of stem cell populations. Our method consists of three steps. First, we apply a constrained optimization based microscopy image segmentation method that exploits phase contrast optics to extract candidate subsequences in the input image sequence that contains mitosis events. Then, we apply a max-margin hidden conditional random field (MM-HCRF) classifier learned from human-annotated mitotic and nonmitotic sequences to classify each candidate subsequence as a mitosis or not. Finally, a max-margin semi-Markov model (MM-SMM) trained on manually-segmented mitotic sequences is utilized to reinforce the mitosis classification results, and to further segment each mitosis into four predefined temporal stages. The proposed method outperforms the event-detection CRF model recently reported by Huh as well as several other competing methods in very challenging image sequences of multipolar-shaped C3H10T1/2 mesenchymal stem cells. For mitosis detection, an overall precision of 95.8% and a recall of 88.1% were achieved. For mitosis segmentation, the mean and standard deviation for the localization errors of the start and end points of all mitosis stages were well below 1 and 2 frames, respectively. In particular, an overall temporal location error of 0.73 ± 1.29 frames was achieved for locating daughter cell birth events.
NASA Astrophysics Data System (ADS)
Hotokezaka, K.; Nissanke, S.; Hallinan, G.; Lazio, T. J. W.; Nakar, E.; Piran, T.
2016-11-01
Mergers of binary neutron stars and black hole-neutron star binaries produce gravitational-wave (GW) emission and outflows with significant kinetic energies. These outflows result in radio emission through synchrotron radiation. We explore the detectability of these synchrotron-generated radio signals through follow-up observations of GW merger events lacking a detection of electromagnetic counterparts at other wavelengths. We model radio light curves arising from (I) sub-relativistic merger ejecta and (II) ultra-relativistic jets. The former produce radio remnants on timescales of a few years, and the latter produce γ-ray bursts in the direction of the jet and orphan radio afterglows extending over wider angles on timescales of weeks. Based on the derived light curves, we suggest an optimized survey at 1.4 GHz with five epochs separated by a logarithmic time interval. We estimate the detectability, with current and future radio facilities, of the radio counterparts of simulated GW merger events of the kind expected to be detected by advanced LIGO and Virgo. The detectable distances for these GW merger events could be as high as 1 Gpc. Around 20%-60% of the long-lasting radio remnants will be detectable in the case of a moderate kinetic energy of 3×10^50 erg and a circum-merger density of 0.1 cm^-3 or larger, while 5%-20% of the orphan radio afterglows with kinetic energy of 10^48 erg will be detectable. The detection likelihood increases if one focuses on well-localizable GW events. We discuss the background noise due to the radio fluxes of host galaxies and false positives arising from extragalactic radio transients and variable active galactic nuclei, and we show that the quiet radio transient sky is of great advantage when searching for the radio counterparts.
Multi-Level Anomaly Detection on Time-Varying Graph Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A; Collins, John P; Ferragut, Erik M
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
OGLE-III Microlensing Events and the Structure of the Galactic Bulge
NASA Astrophysics Data System (ADS)
Wyrzykowski, Łukasz; Rynkiewicz, Alicja E.; Skowron, Jan; Kozłowski, Szymon; Udalski, Andrzej; Szymański, Michał K.; Kubiak, Marcin; Soszyński, Igor; Pietrzyński, Grzegorz; Poleski, Radosław; Pietrukowicz, Paweł; Pawlak, Michał
2015-01-01
We present and study the largest and most comprehensive catalog of microlensing events ever constructed. The sample of standard microlensing events comprises 3718 unique events from 2001-2009 with 1409 events that had not been detected before in real-time by the Early Warning System of the Optical Gravitational Lensing Experiment. The search pipeline uses machine learning algorithms to help find rare phenomena among 150 million objects and to derive the detection efficiency. Applications of the catalog can be numerous, from analyzing individual events to large statistical studies of the Galactic mass, kinematics distributions, and planetary abundances. We derive maps of the mean Einstein ring crossing time of events spanning 31 deg² toward the Galactic center and compare the observed distributions with the most recent models. We find good agreement within the observed region and we see the signature of the tilt of the bar in the microlensing data. However, the asymmetry of the mean timescales seems to rise more steeply than predicted, indicating either a somewhat different orientation of the bar or a larger bar width. The map of events with sources in the Galactic bulge shows a dependence of the mean timescale on the Galactic latitude, signaling an increasing contribution from disk lenses closer to the plane relative to the height of the disk. Our data present a perfect set for comparing and enhancing new models of the central parts of the Milky Way and creating a three-dimensional picture of the Galaxy. Based on observations obtained with the 1.3 m Warsaw telescope at the Las Campanas Observatory of the Carnegie Institution for Science.
Detecting aseismic strain transients from seismicity data
Llenos, A.L.; McGuire, J.J.
2011-01-01
Aseismic deformation transients such as fluid flow, magma migration, and slow slip can trigger changes in seismicity rate. We present a method that can detect these seismicity rate variations and utilize these anomalies to constrain the underlying variations in stressing rate. Because ordinary aftershock sequences often obscure changes in the background seismicity caused by aseismic processes, we combine the stochastic Epidemic Type Aftershock Sequence (ETAS) model, which describes aftershock sequences well, and the physically based rate- and state-dependent friction seismicity model into a single seismicity rate model that captures both aftershock activity and changes in background seismicity rate. We implement this model in a data assimilation algorithm that inverts seismicity catalogs to estimate space-time variations in stressing rate. We evaluate the method using a synthetic catalog, and then apply it to a catalog of M ≥ 1.5 events that occurred in the Salton Trough from 1990 to 2009. We validate our stressing rate estimates by comparing them to estimates from a geodetically derived slip model for a large creep event on the Obsidian Buttes fault. The results demonstrate that our approach can identify large aseismic deformation transients in a multidecade-long earthquake catalog and roughly constrain the absolute magnitude of the stressing rate transients. Our method can therefore provide a way to detect aseismic transients in regions where geodetic resolution in space or time is poor. Copyright 2011 by the American Geophysical Union.
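A hedged sketch of an ETAS conditional intensity, the aftershock-rate building block of the combined model (the parameter values are illustrative, not the fitted Salton Trough values): the rate at time t is a background term mu plus Omori-law contributions from all earlier events, weighted by their magnitudes.

```python
import numpy as np

def etas_rate(t, event_times, event_mags, mu=0.2, K=0.02, c=0.01,
              p=1.1, alpha=1.0, m0=1.5):
    """Conditional intensity lambda(t) of an ETAS process (illustrative parameters)."""
    past = event_times < t
    dt = t - event_times[past]
    trig = K * np.exp(alpha * (event_mags[past] - m0)) / (dt + c) ** p
    return mu + trig.sum()

times = np.array([1.0, 1.5, 4.0])     # days since catalog start
mags = np.array([3.2, 2.1, 4.0])
print(etas_rate(5.0, times, mags))    # expected events per day at t = 5
```

In the method described above, departures of the observed rate from what this aftershock term can explain are what the rate-and-state component converts into estimates of aseismic stressing-rate transients.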
NASA Astrophysics Data System (ADS)
Brown, P. G.; Edwards, W. N.; Revelle, D. O.; Spurny, P.
2007-04-01
Four very high-velocity and high-altitude meteors (a Leonid, two Perseids and a high-speed sporadic fireball) have been unambiguously detected at the ground both optically, using precision all-sky cameras, and acoustically, via infrasound and seismic signals. Infrasound arriving from altitudes of over 100 km is not very common, but has previously been observed for re-entering spacecraft. This, however, is to our knowledge the first reported detection of such high-altitude infrasound unambiguously from meteors. These fragile meteoroids were found to generate acoustic waves at source heights ranging from 80 to 110 km, with most acoustic energy being generated near the lowest heights. Time residuals between the observed acoustic onsets and model predictions based on ray-tracing points along the photographically determined trajectories indicate that the upper winds given by the UK meteorological office (UKMO) model systematically produce lower residuals for first arrivals than those from the Naval Research Laboratory Horizontal Wind Model (HWM). Average source energies for three of the four events from acoustic data alone are found to be in the range of 2×10^8-10^9 J. One event, EN010803, had unusually favorable geometry for acoustic detection at the ground and therefore has the smallest photometric source energy (10^-5 kt; 6×10^7 J) of any meteor detected infrasonically. When compared to the total optical radiation recorded by film, the results for the three events produce equivalent integral panchromatic luminous efficiencies of 3-7%, within a factor of two of the values proposed by Ceplecha and McCrosky [1976. Fireball end heights - a diagnostic for the structure of meteoric material. Journal of Geophysical Research 81, 6257-6275] for the velocity range (55-70 km s^-1) appropriate to our events. Application of these findings to meteor showers in general suggests that the Geminid shower should be the most prolific producer of infrasound-detectable meteors at the ground of all the major showers, with one Geminid fireball producing detectable infrasound from a given location every ~400 h of observation.
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal patterns or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on user-defined "interestingness" (e.g., rareness or total number of outliers), and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
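A minimal sketch of one clustering-based detection scheme (not the project's algorithm; the data, number of clusters, and percentile cutoff are assumptions): fit clusters to feature vectors from satellite grid cells and flag points far from every cluster centre as outliers; spatio-temporally contiguous outliers would then be grouped into candidate events.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (5000, 3)),       # normal observations (synthetic)
               rng.normal(6, 0.5, (20, 3))])      # injected anomalies

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
dist = np.min(km.transform(X), axis=1)            # distance to the nearest cluster centre
outliers = np.where(dist > np.percentile(dist, 99.5))[0]
print("flagged points:", len(outliers))
```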
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
NASA Astrophysics Data System (ADS)
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases to USArray data. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform a time-slowness analysis of the signals and correlate it with the time-slowness signatures of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 2010 February 1. This event, which lasted one day, has a narrow frequency band around 0.1 Hz and occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by the interference between oceanic waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the South East Indian Ridge.
A model of seismic coda arrivals to suppress spurious events.
NASA Astrophysics Data System (ADS)
Arora, N.; Russell, S.
2012-04-01
We describe a model of coda arrivals which has been added to NET-VISA (Network processing Vertically Integrated Seismic Analysis), our probabilistic generative model of seismic events, their transmission, and their detection on a global seismic network. The scattered energy that follows a seismic phase arrival tends to deceive typical STA/LTA-based arrival-picking software into believing that a real seismic phase has been detected. These coda arrivals, which tend to follow all seismic phases, cause most network processing software, including NET-VISA, to believe that multiple events have taken place. It is not a simple matter of ignoring closely spaced arrivals, since arrivals from multiple events can indeed overlap. The current practice in NET-VISA of pruning events within a small space-time neighborhood of a larger event works reasonably well, but it may mask real events produced in an aftershock sequence. Our new model allows any seismic arrival, even a coda arrival, to trigger a subsequent coda arrival. The probability of such a triggered arrival depends on the amplitude of the triggering arrival, although real seismic phases are more likely to generate such coda arrivals. Real seismic phases also tend to generate coda arrivals with more strongly correlated parameters, for example azimuth and slowness. However, the SNR (signal-to-noise ratio) of a coda arrival immediately following a phase arrival tends to be lower because of the nature of the SNR calculation. We have calibrated our model on historical statistics of such triggered arrivals, and our inference accounts for them while searching for the best explanation of the seismic events, their association to the arrivals, and the coda arrivals. We have tested our new model on one week of global seismic data spanning March 22, 2009 to March 29, 2009. Our model was trained on two and a half months of data from April 5, 2009 to June 20, 2009. We use the LEB bulletin produced by the IDC (International Data Center) as the ground truth and compute the precision (percentage of reported events which are true) and recall (percentage of true events which are reported). The existing model has a precision of 32.2 and a recall of 88.6, which changes to a precision of 50.7 and a recall of 88.5 after pruning. The new model has a precision of 56.8 and a recall of 86.9 without any pruning, and the corresponding precision-recall curve is dramatically improved. In contrast, the current automated bulletin at the IDC, SEL3, has a precision of 46.2 and a recall of 69.7.
Label-free DNA biosensor based on resistance change of platinum nanoparticles assemblies.
Skotadis, Evangelos; Voutyras, Konstantinos; Chatzipetrou, Marianneza; Tsekenis, Georgios; Patsiouras, Lampros; Madianos, Leonidas; Chatzandroulis, Stavros; Zergioti, Ioanna; Tsoukalas, Dimitris
2016-07-15
A novel nanoparticle-based biosensor for the fast and simple detection of DNA hybridization events is presented. The sensor utilizes hybridized DNA's charge-transport properties, combining them with metallic nanoparticle networks that act as nano-gapped electrodes. DNA hybridization events can be detected by a significant reduction in the sensor's resistance due to the conductive bridging offered by hybridized DNA. By modifying the nanoparticle surface coverage, which can be controlled experimentally as a function of deposition time, and the structural properties of the electrodes, an optimized biosensor for the in situ detection of DNA hybridization events is ultimately fabricated. The fabricated biosensor exhibits a wide response range, covering four orders of magnitude, has a limit of detection of 1 nM, and can detect a single base-pair mismatch between probe and complementary DNA. Copyright © 2016 Elsevier B.V. All rights reserved.
Extending TOPS: Ontology-driven Anomaly Detection and Analysis System
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Michaelis, A.
2010-12-01
The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and the implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies, which are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early in the processing, either through failure to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided by the OWLIM component integrated with Sesame.
A hidden Markov model for decoding and the analysis of replay in spike trains.
Box, Marc; Jones, Matt W; Whiteley, Nick
2016-12-01
We present a hidden Markov model that describes variation in an animal's position associated with varying levels of activity in action potential spike trains of individual place cell neurons. The model incorporates a coarse-graining of position, which we find to be a more parsimonious description of the system than other models. We use a sequential Monte Carlo algorithm for Bayesian inference of model parameters, including the state space dimension, and we explain how to estimate position from spike train observations (decoding). We obtain greater accuracy than other methods under conditions of high temporal resolution and small neuronal sample size. We also present a novel, model-based approach to the study of replay: the expression of spike train activity related to behaviour during times of motionlessness or sleep, thought to be integral to the consolidation of long-term memories. We demonstrate how we can detect the time, information content and compression rate of replay events in simulated and real hippocampal data recorded from rats in two different environments, and we verify the correlation between the times of detected replay events and those of sharp wave/ripples in the local field potential.
Threshold Monitoring Maps for Under-Water Explosions
NASA Astrophysics Data System (ADS)
Arora, N. S.
2014-12-01
Hydro-acoustic energy in the 1-100 Hz range from under-water explosions can easily spread for thousands of miles due to the unique properties of the deep sound channel. This channel, also known as the SOFAR channel, exists almost everywhere in the earth's oceans where the water is at least 1500 m deep. Once the energy is trapped in this channel it spreads out cylindrically, and hence experiences very little loss, as long as there is an unblocked path from source to receiver. Other losses, such as absorption due to chemicals in the ocean (mainly boric acid and magnesium sulphate), are also quite minimal at these low frequencies. It is not surprising, then, that the International Monitoring System (IMS) maintains a global network of hydrophone stations listening in this particular frequency range. The overall objective of our work is to build a probabilistic model to detect and locate under-water explosions using the IMS network. A number of critical pieces of this model, such as travel time predictions, are already well known. We are extending the existing knowledge base by building the remaining pieces, most crucially the models for transmission losses and detection probabilities. With a complete model for detecting under-water explosions, we are able to combine it with our existing model for seismic events, NET-VISA. In the conference we will present threshold monitoring maps for explosions in the earth's oceans. Our premise is that explosive sources release an unknown fraction of their total energy into the SOFAR channel, and this trapped energy determines their detection probability at each of the IMS hydrophone stations. Our threshold monitoring maps compute the minimum amount of energy at each location that must be released into the deep sound channel such that there is a ninety percent probability that at least two of the IMS stations detect the event. We will also present results of our effort to detect and locate hydro-acoustic events. In particular, we will show results from a recent under-water volcanic eruption at the Ahyi Seamount (April-May 2014), and compare our work with the current processing, both automated and human, at the IDC.
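The threshold monitoring criterion described above (the minimum source energy such that at least two IMS stations detect the event with 90% probability) can be sketched as follows, assuming independent stations and a hypothetical per-station detection-probability model `detect_prob`; this is only an illustration of the criterion, not the actual hydroacoustic monitoring code.

```python
import numpy as np

def prob_at_least_two(p):
    """P(at least 2 of N independent stations detect), given per-station
    detection probabilities p (array-like of length N)."""
    p = np.asarray(p, dtype=float)
    p_none = np.prod(1.0 - p)
    # P(exactly one station detects)
    p_one = sum(p[i] * np.prod(np.delete(1.0 - p, i)) for i in range(len(p)))
    return 1.0 - p_none - p_one

def threshold_energy(detect_prob, energies, target=0.9):
    """Smallest source energy (from an ascending grid of candidate energies)
    whose probability of detection at >=2 stations reaches the target.
    detect_prob(E) must return per-station detection probabilities for one
    grid location (a hypothetical transmission-loss model)."""
    for e in energies:
        if prob_at_least_two(detect_prob(e)) >= target:
            return e
    return np.inf
```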
Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter
2015-01-01
The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are among the most relevant ADEs in internal medicine and geriatric wards. For each patient, ADE risk scores for all four conditions are calculated, compared against a threshold, and used to decide whether the case should be monitored or reported. A ward-based cockpit view summarizes the results.
Gravitational Wave Detection of Compact Binaries Through Multivariate Analysis
NASA Astrophysics Data System (ADS)
Atallah, Dany Victor; Dorrington, Iain; Sutton, Patrick
2017-01-01
The first detection of gravitational waves (GW), GW150914, produced by a binary black hole merger, has ushered in the era of GW astronomy. The detection technique used to find GW150914 considered only a fraction of the information available describing the candidate event: mainly the detector signal-to-noise ratios and chi-squared values. In hopes of greatly increasing detection rates, we want to take advantage of all the information available about candidate events. We employ a technique called Multivariate Analysis (MVA) to improve LIGO's sensitivity to GW signals. MVA techniques are efficient ways to scan high-dimensional data spaces for signal/noise classification. Our goal is to use MVA to classify compact-object binary coalescence (CBC) events composed of any combination of black holes and neutron stars. CBC waveforms are modeled through numerical relativity. Templates of the modeled waveforms are used to search for CBCs and quantify candidate events. Different MVA pipelines are under investigation to look for CBC signals and un-modelled signals, with promising results. One such MVA pipeline, used for the un-modelled search, can theoretically analyze far more data than the MVA pipelines currently explored for CBCs, potentially making it a more powerful classifier. In principle, this extra information could improve the sensitivity to GW signals. We will present the results from our efforts to adapt an MVA pipeline used in the un-modelled search to classify candidate events from the CBC search.
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple-event (ME) analysis to increase the accuracy of error detection. In a BCI-driven car game based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed the ME method, combined and averaged the classification results of single events (SEs) to determine the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed to complete the game. The findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
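A minimal sketch of the multiple-events idea described above: single-event classifier outputs within one MI trial are averaged and compared against a threshold to decide whether the trial should be discarded. The scores and the threshold value are invented placeholders, not values from the study.

```python
import numpy as np

def trial_is_erroneous(single_event_scores, threshold=0.5):
    """Multiple-event (ME) style decision: average the single-event (SE)
    classifier outputs collected during one MI trial and compare the mean
    against a decision threshold.  Scores near 1 indicate a probable error
    potential (ErrP); the threshold here is a placeholder value."""
    return float(np.mean(single_event_scores)) >= threshold

# Example: three events in a trial, two flagged as probable errors
print(trial_is_erroneous([0.7, 0.8, 0.2]))  # True -> discard and repeat the trial
```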
Measuring adverse events in helicopter emergency medical services: establishing content validity.
Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M
2014-01-01
We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
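The content validity index used above is, at the item level, simply the proportion of expert raters who judge an item relevant; a minimal sketch, assuming the usual 4-point relevance scale, is given below.

```python
def item_cvi(ratings, relevant_levels=(3, 4)):
    """Item-level content validity index: the proportion of expert raters
    who judge an item 'relevant' on a 4-point scale (ratings of 3 or 4
    conventionally count as relevant)."""
    relevant = sum(1 for r in ratings if r in relevant_levels)
    return relevant / len(ratings)

# Ten hypothetical experts rating one trigger-tool item
print(item_cvi([4, 4, 3, 4, 3, 4, 2, 4, 3, 4]))  # 0.9
```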
ERIC Educational Resources Information Center
Taft, Laritza M.
2010-01-01
In its report "To Err is Human," the Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…
NASA Astrophysics Data System (ADS)
Yun, Jinsik; Ha, Dong Sam; Inman, Daniel J.; Owen, Robert B.
2011-03-01
Structural damage to spacecraft is mainly due to impacts such as collisions with meteorites or space debris. We present a structural health monitoring (SHM) system for space applications, named Adverse Event Detection (AED), which integrates an acoustic sensor, an impedance-based SHM system, and a Lamb-wave SHM system. With these three health-monitoring methods in place, we can determine the presence, location, and severity of damage. The acoustic sensor continuously monitors acoustic events while the impedance-based and Lamb-wave SHM systems are in sleep mode. If the acoustic sensor detects an impact, it activates the impedance-based SHM system, which determines whether the impact caused damage. When damage is detected, it activates the Lamb-wave SHM system to determine the severity and location of the damage. Further, since the acoustic sensor dissipates much less power than the two SHM systems, and the two systems are activated only when there is an acoustic event, our system reduces overall power dissipation significantly. Our prototype system demonstrates the feasibility of the proposed concept.
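The staged wake-up logic described above can be sketched as a small state machine; the stage names are hypothetical and the sensor interfaces are omitted, so this illustrates only the control flow, not the AED firmware.

```python
from enum import Enum, auto

class Stage(Enum):
    ACOUSTIC_WATCH = auto()   # low-power acoustic monitoring
    IMPEDANCE_CHECK = auto()  # impedance-based SHM active
    LAMB_WAVE_SCAN = auto()   # Lamb-wave SHM active

def aed_step(stage, acoustic_event=False, damage_found=False):
    """One step of the staged wake-up logic: the acoustic sensor gates the
    impedance-based SHM, which in turn gates the Lamb-wave SHM."""
    if stage is Stage.ACOUSTIC_WATCH and acoustic_event:
        return Stage.IMPEDANCE_CHECK
    if stage is Stage.IMPEDANCE_CHECK:
        return Stage.LAMB_WAVE_SCAN if damage_found else Stage.ACOUSTIC_WATCH
    if stage is Stage.LAMB_WAVE_SCAN:
        return Stage.ACOUSTIC_WATCH  # after localizing damage, return to watch
    return stage
```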
A Probabilistic Model of Global-Scale Seismology with Veith-Clawson Amplitude Corrections
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2013-12-01
We present a probabilistic generative model of global-scale seismology, NET-VISA, that is designed to address the event detection and location problem of seismic monitoring. The model is based on a standard Bayesian framework with prior probabilities for event generation and propagation as well as likelihoods of detection and arrival (or onset) parameters. The model is supplemented with a greedy search algorithm that iteratively improves the predicted bulletin with respect to the posterior probability. Our prior model incorporates both seismic theory and empirical observations as appropriate. For instance, we use empirical observations for the expected rates of earthquakes at each point on the earth, while we use the Gutenberg-Richter law for the expected magnitude distribution of these earthquakes. In this work, we describe an extension of our model in which we include the Veith-Clawson (1972) amplitude decline curves in our empirically calibrated arrival amplitude model. While this change does not alter the overall event-detection results, we have chosen to keep the Veith-Clawson curves since they are more seismically accurate. We also describe a recent change to our search algorithm, whereby we now consider multiple hypotheses when we encounter a series of closely spaced arrivals that could be explained by either a single event or multiple co-located events. This change has led to a sharp improvement in our results on large aftershock sequences. We use the analyst-curated LEB bulletin or the REB bulletin, which is the published product of the IDC, as a reference and measure the overlap (percentage of reference events that are matched) and inconsistency (percentage of test bulletin events that don't match anything in the reference) of a one-to-one matching between the test and the reference bulletins. Comparing NET-VISA with SEL3, which is produced by the existing GA software, for the whole of 2009, the results show that NET-VISA, which is restricted to using arrivals with a 6-hour lag (in order to be comparable to SEL3), reduces the number of missed events by a factor of 2.5 while simultaneously reducing the rate of spurious events. Further, these "spurious" NET-VISA events in fact include many real events that are missed by the human analysts. When we compare the NET-VISA events with arrivals from at least 3 stations (to be comparable to LEB) against NEIC events (in the ISC catalog) over the continental United States, as well as NNC events over Central Asia, we find that NET-VISA identifies 1.5 to 2 times the number of events that the IDC analysts find. Most of these additional events are in the 2-4 mb or ML range. Our experiments also confirm that NET-VISA accurately located each of the recent nuclear explosions to within 5 km of the LEB location. For large aftershock sequences, NET-VISA has been shown to be very efficient as well as accurate. For example, on the Tohoku sequence (March 10-14, 2011), NET-VISA (running time 2.57 days) had an overlap of 82.7% with LEB and an inconsistency of 26.8%, versus SEL3's overlap of 71.9% and inconsistency of 40%.
Homaeinezhad, M R; Sabetian, P; Feizollahi, A; Ghaffari, A; Rahmani, R
2012-02-01
The major focus of this study is to present a performance accuracy assessment framework based on mathematical modelling of cardiac system multiple measurement signals. Three mathematical algebraic subroutines with simple structural functions for synthetic generation of the synchronously triggered electrocardiogram (ECG), phonocardiogram (PCG) and arterial blood pressure (ABP) signals are described. In the case of ECG signals, normal and abnormal PQRST cycles in complicated conditions such as fascicular ventricular tachycardia, rate dependent conduction block and acute Q-wave infarctions of inferior and anterolateral walls can be simulated. Also, continuous ABP waveform with corresponding individual events such as systolic, diastolic and dicrotic pressures with normal or abnormal morphologies can be generated by another part of the model. In addition, the mathematical synthetic PCG framework is able to generate the S4-S1-S2-S3 cycles in normal and in cardiac disorder conditions such as stenosis, insufficiency, regurgitation and gallop. In the PCG model, the amplitude and frequency content (5-700 Hz) of each sound and variation patterns can be specified. The three proposed models were implemented to generate artificial signals with varies abnormality types and signal-to-noise ratios (SNR), for quantitative detection-delineation performance assessment of several ECG, PCG and ABP individual event detectors designed based on the Hilbert transform, discrete wavelet transform, geometric features such as area curve length (ACLM), the multiple higher order moments (MHOM) metric, and the principal components analysed geometric index (PCAGI). For each method the detection-delineation operating characteristics were obtained automatically in terms of sensitivity, positive predictivity and delineation (segmentation) error rms and checked by the cardiologist. The Matlab m-file script of the synthetic ECG, ABP and PCG signal generators are available in the Appendix.
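A minimal sketch of how detection-delineation operating characteristics of the kind mentioned above (sensitivity, positive predictivity and RMS delineation error) could be computed for one detector against synthetic ground-truth onsets; the matching tolerance is an invented placeholder, not a value from the study.

```python
import numpy as np

def detection_delineation_metrics(true_onsets, detected_onsets, tolerance=0.05):
    """Sensitivity, positive predictivity and RMS delineation error of an
    event detector, computed against ground-truth onset times (seconds)."""
    true_onsets = np.asarray(true_onsets, float)
    errors, used = [], set()
    tp = 0
    for d in detected_onsets:
        diffs = np.abs(true_onsets - d)
        j = int(np.argmin(diffs))
        if diffs[j] <= tolerance and j not in used:
            used.add(j)
            tp += 1
            errors.append(true_onsets[j] - d)  # delineation error of this match
    fn = len(true_onsets) - tp                 # missed events
    fp = len(detected_onsets) - tp             # false detections
    sensitivity = tp / (tp + fn)
    positive_predictivity = tp / (tp + fp)
    rms_error = float(np.sqrt(np.mean(np.square(errors)))) if errors else float("nan")
    return sensitivity, positive_predictivity, rms_error
```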
Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash
2010-04-01
The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.
Modeling urban flood risk territories for Riga city
NASA Astrophysics Data System (ADS)
Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.
2012-04-01
Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-driven water set-up in the southern part of the Gulf of Riga (storm event), (2) water level increases caused by discharge maxima of the River Daugava (spring snow-melting event) and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al., 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of the flooding scenarios caused by rain and snow-melting events of different return periods nowadays, in the near future (2021-2050) and in the far future (2071-2100), taking into account the projections of climate change, (3) the estimation of groundwater levels for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow-melt flooding events, (5) the calculation of rain and snow-melting flood events with different return periods, and (6) the mapping of the potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e. rain events) were analysed for a 35-year-long period. Annual maxima of precipitation intensity for events of different duration (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction of snow-cover thickness were analysed for a 27-year-long period. Snow-thawing periods were detected and maxima of snow-melting intensity for events of different duration (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event type for nowadays, the near future and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on Gumbel extreme value analysis. Hydrological modelling driven by temperature and precipitation data series from regional climate models was used to evaluate rain event maxima in the future periods. The usage of climate model data in hydrological models causes systematic errors; therefore a bias correction method (Sennikovs, Bethers, 2009) was applied to determine the future rainfall intensities. A SWMM model was built for the urban area. Objects of hydraulic importance (manifolds, penstocks, ditches, pumping stations, weirs, wells, catchment sub-basins, etc.) were included in the model. Both a pure stormwater sewage system and a mixed rainwater/household sewage system exist in Riga. The sewage system, with a wastewater load proportional to population density, was taken into account and calibrated. The model system was calibrated for a real rain event against the time series of water flux into the sewage treatment plant of Riga. A high-resolution (~1.5 points per square meter) digital terrain map was used as the basis for the finite element mesh for the geospatial mapping of the results of the hydraulic calculations. The main results of the study are (1) detection of the hot spots in densely populated urban areas; (2) identification of the weak links of the land-drainage (melioration) and sewage systems; (3) mapping of the groundwater rise caused mainly by snow melting. References: A. Piliksere, A. Valainis, J. Seņņikovs (2011), A flood risk assessment for Riga city taking account of climate changes, EGU, Vienna, Austria. EPA (2004), Storm water management model. User's manual version 5.0. US Environmental Protection Agency. J. Sennikovs, U. Bethers (2009), Statistical downscaling method of regional climate model results for hydrological modelling. 18th World IMACS/MODSIM Congress, Cairns, Australia.
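For illustration, the Gumbel-based return-level estimation mentioned above can be sketched as follows with SciPy; the synthetic annual maxima and parameter values are invented and are not the Riga data.

```python
import numpy as np
from scipy.stats import gumbel_r

def gumbel_return_levels(annual_maxima, return_periods=(5, 10, 20, 50, 100, 200)):
    """Fit a Gumbel distribution to a series of annual maxima (e.g. maximum
    rain or snow-melt intensity per year) and compute the event magnitude
    expected once every T years."""
    loc, scale = gumbel_r.fit(annual_maxima)
    return {T: gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
            for T in return_periods}

# Example with synthetic annual maxima of daily rainfall (mm)
rng = np.random.default_rng(0)
maxima = rng.gumbel(40.0, 8.0, size=35)
print(gumbel_return_levels(maxima))
```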
NASA Astrophysics Data System (ADS)
Hetényi, G.; Diehl, T.; Singer, J.; Kissling, E. H.; Clinton, J. F.; Wiemer, S.
2015-12-01
The Eastern Himalayas exhibit a seemingly complex seismo-tectonic evolution. The rate of instrumental seismicity is lower than the average along the orogen and there is no record of large historical events, yet both paleoseismology and GPS studies point to potentially large (M>8) earthquakes. Due to the lack of a permanent seismic monitoring system in the area, our current level of understanding is insufficient to create a reliable quantitative seismic hazard model for the region. Existing maps are based on questionable hypotheses and show major inconsistencies when compared to each other. Here we present results on national and regional scales from a 38-station broadband seismological network we operated for almost 2 years in the Kingdom of Bhutan. A thorough, state-of-the-art analysis of local and regional earthquakes builds a comprehensive catalogue that reveals significantly (2-to-3 orders of magnitude) more events than detected from global networks. The seismotectonic analysis reveals new patterns of seismic activity as well as striking differences over relatively short distances within the Himalayas, only partly explained by surface observations such as geology. We compare a priori and a posteriori (BMC) magnitude of completeness maps and show that our network was able to detect all felt events during its operation. Some of these events could be felt at surprisingly large distances. Based on our experiment and experience, we draft the pillars on which a permanent seismological observatory for Bhutan could be constructed. Such a continuous monitoring system of seismic activity could then lead to a reliable quantitative seismic hazard model for Bhutan and the surrounding regions, and serve as a basis for improving building codes and general preparedness.
Social Media as Seismic Networks for the Earthquake Damage Assessment
NASA Astrophysics Data System (ADS)
Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.
2014-12-01
The growing popularity of online platforms based on user-generated content is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real-life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social media emergency management, in order to obtain a very fast, but still reliable, assessment of the dimension of the emergency to be faced. First, we designed and implemented a decision support system for the detection and damage assessment of earthquakes. Our system exploits the messages shared in real time on Twitter. In the detection phase, data mining and natural language processing techniques are first adopted to select meaningful and comprehensive sets of tweets. A burst detection algorithm is then applied in order to promptly identify emerging seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to official Italian INGV reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5, with a precision of 75% and a recall of 81.82%. We then focused our attention on the damage assessment phase, investigating the possibility of exploiting social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets exploited to compute our earthquake features, and on data for more than 7,000 globally distributed earthquakes, acquired semi-automatically from USGS, serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time and linguistic. We ran diagnostic tests and simulations on the generated models to assess their significance and avoid overfitting. Overall, the results show a correlation between the messages shared on social media and intensity estimations based on online survey data (CDI).
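As an illustration of a burst-detection step of the kind described above, a minimal sliding-window sketch is given below; the window length and threshold factor are invented placeholders and do not reflect the algorithm actually used by the SoS-Social Sensing system.

```python
import numpy as np

def detect_bursts(tweet_counts, window=12, k=3.0):
    """Flag time bins whose count of candidate earthquake tweets exceeds the
    mean of the preceding window by k standard deviations."""
    counts = np.asarray(tweet_counts, float)
    bursts = []
    for t in range(window, len(counts)):
        past = counts[t - window:t]
        mu, sigma = past.mean(), past.std() + 1e-9  # avoid division by zero
        if counts[t] > mu + k * sigma:
            bursts.append(t)
    return bursts
```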
NASA Astrophysics Data System (ADS)
Wyard, Coraline; Fettweis, Xavier
2016-04-01
As a consequence of climate change, several studies have concluded that winter flood occurrence could increase in the future in many rivers of northern and western Europe in response to an increase in extreme precipitation events. This study aims to determine whether trends in the extreme hydroclimatic events that generate floods can already be detected over the last century. In particular, we focus on the Ourthe River (southeast of Belgium), which is one of the main tributaries of the Meuse River, with a catchment area of 3500 km². In this river, most floods occur during winter and about 50% of them are due to rainfall events associated with the melting of the snow that covers the Ardennes during winter. In this study, hydroclimatic conditions favorable to flooding were reconstructed over the 20th century using the regional climate model MAR ("Modèle Atmosphérique Régional") forced by the following reanalyses: ERA-20C, ERA-Interim and NCEP/NCAR-v1. The use of the MAR model makes it possible to compute precipitation, snow depth and run-off resulting from precipitation events and snow melting in any part of the Ourthe river catchment area. Therefore, extreme hydroclimatic events, namely extreme run-off events, which could potentially generate floods, can be reconstructed using the MAR model. For validation, the MAR results were compared to weather-station-based data. A trend analysis was then performed in order to study the evolution of conditions favorable to flooding in the Ourthe River catchment. The results show that the MAR model allows the detection of more than 95% of the hydroclimatic conditions which effectively generated observed floods in the Ourthe River over the 1974-2014 period. Conditions favorable to flooding present a negative trend over the last 50 years as a result of a decrease in snow accumulation and in extreme precipitation events. However, the significance of these trends depends on the reanalysis used to force the regional climate model as well as on the length of the time series.
Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang
2011-01-01
This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise and errors during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the phase of blob tracking associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
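A minimal OpenCV sketch of the blob-extraction stage described above, with Otsu's method standing in for the paper's automatic multilevel histogram thresholding; the minimum-area filter is an invented placeholder.

```python
import cv2

def extract_touch_blobs(gray_frame, min_area=20):
    """Extract candidate touch-blob centroids from an 8-bit grayscale
    infrared frame: automatic thresholding followed by connected-component
    analysis, with small components discarded as noise."""
    _, binary = cv2.threshold(gray_frame, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    blobs = []
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            blobs.append(tuple(centroids[i]))
    return blobs
```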
Bayesian Monitoring Systems for the CTBT: Historical Development and New Results
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Moore, D.
2016-12-01
A project at Berkeley, begun in 2009 in collaboration with the CTBTO and more recently with LLNL, has reformulated the global seismic monitoring problem in a Bayesian framework. A first-generation system, NETVISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as a Monte Carlo inference algorithm. The probabilistic model allows for seamless integration of various disparate sources of information, including negative information (the absence of detections). Working from arrivals extracted by traditional station processing from International Monitoring System (IMS) data, NETVISA achieves a reduction of around 60% in the number of missed events compared with the currently deployed network processing system. It also finds many events that are missed by the human analysts who postprocess the IMS output. Recent improvements include the integration of models for infrasound and hydroacoustic detections and a global depth model for natural seismicity trained from ISC data. NETVISA is now fully compatible with the CTBTO operating environment. A second-generation model called SIGVISA extends NETVISA's generative model all the way from events to raw signal data, avoiding the error-prone bottom-up detection phase of station processing. SIGVISA's model automatically captures the phenomena underlying existing detection and location techniques such as multilateration, waveform correlation matching, and double-differencing, and integrates them into a global inference process that also (like NETVISA) handles de novo events. Initial results for the Western US in early 2008 (when the transportable US Array was operating) show that SIGVISA finds, from IMS data only, more than twice the number of events recorded in the CTBTO Late Event Bulletin (LEB). For mb 1.0-2.5, the ratio is more than 10; put another way, for this data set, SIGVISA lowers the detection threshold by roughly one magnitude compared to the LEB. The broader message of this work is that probabilistic inference based on a vertically integrated generative model that directly expresses geophysical knowledge can be a much more effective approach for interpreting scientific data than the traditional bottom-up processing pipeline.
Quantifying (dis)agreement between direct detection experiments in a halo-independent way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feldstein, Brian; Kahlhoefer, Felix, E-mail: brian.feldstein@physics.ox.ac.uk, E-mail: felix.kahlhoefer@physics.ox.ac.uk
We propose an improved method to study recent and near-future dark matter direct detection experiments with small numbers of observed events. Our method determines in a quantitative and halo-independent way whether the experiments point towards a consistent dark matter signal and identifies the best-fit dark matter parameters. To achieve true halo independence, we apply a recently developed method based on finding the velocity distribution that best describes a given set of data. For a quantitative global analysis we construct a likelihood function suitable for small numbers of events, which allows us to determine the best-fit particle physics properties of dark matter considering all experiments simultaneously. Based on this likelihood function we propose a new test statistic that quantifies how well the proposed model fits the data and how large the tension between different direct detection experiments is. We perform Monte Carlo simulations in order to determine the probability distribution function of this test statistic and to calculate the p-value for both the dark matter hypothesis and the background-only hypothesis.
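The Monte Carlo p-value calculation mentioned above amounts to counting how often pseudo-experiments, generated under the hypothesis being tested, produce a test statistic at least as extreme as the observed one; a minimal sketch with a user-supplied (hypothetical) simulator is given below.

```python
import numpy as np

def monte_carlo_p_value(observed_statistic, simulate_statistic, n_trials=10000, rng=None):
    """Empirical p-value: the fraction of Monte Carlo pseudo-experiments
    whose test statistic is at least as extreme as the observed one.
    simulate_statistic(rng) is a user-supplied generator of one
    pseudo-experiment's statistic under the tested hypothesis."""
    rng = rng or np.random.default_rng()
    sims = np.array([simulate_statistic(rng) for _ in range(n_trials)])
    return float(np.mean(sims >= observed_statistic))
```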
Modeling surface backgrounds from radon progeny plate-out
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumpilly, G.; Guiseppe, V. E.; Snyder, N.
2013-08-08
The next generation low-background detectors operating deep underground aim for unprecedented low levels of radioactive backgrounds. The surface deposition and subsequent implantation of radon progeny in detector materials will be a source of energetic background events. We investigate Monte Carlo and model-based simulations to understand the surface implantation profile of radon progeny. Depending on the material and region of interest of a rare event search, these partial energy depositions can be problematic. Motivated by the use of Ge crystals for the detection of neutrinoless double-beta decay, we wish to understand the detector response of surface backgrounds from radon progeny. We look at the simulation of surface decays using a validated implantation distribution based on nuclear recoils and a realistic surface texture. Results of the simulations and measured α spectra are presented.
Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F
2017-01-01
Background: Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. Objective: This study aimed to examine the feasibility of detecting smoking behavior using smartwatches. The second aim of this study was to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. Methods: A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, for which they logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. Results: In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model was successful in detecting 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from the participants who did not adhere to study protocols, the true positive detection rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, resulting in a 2.8% false positive rate. Conclusions: Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self-reported behaviors, which may not allow for targeted data collection and communications around smoking events. PMID:29237580
Monitoring Chewing and Eating in Free-Living Using Smart Eyeglasses.
Zhang, Rui; Amft, Oliver
2018-01-01
We propose to 3-D-print personally fitted, regular-looking smart eyeglasses frames equipped with bilateral electromyography recording to monitor the activity of the temporalis muscles for automatic dietary monitoring. Personal fitting supports electrode-skin contact at the temple ear-bend and temple-end positions. We evaluated the smart monitoring eyeglasses in in-lab and free-living studies of food chewing and eating event detection with ten participants. The in-lab study was designed to explore three natural food hardness levels and to determine the parameters of an energy-based chewing cycle detection. Our free-living study investigated whether chewing monitoring and eating event detection using smart eyeglasses are feasible in free-living conditions. An eating event detection algorithm was developed to determine intake activities based on the estimated chewing rate. Results showed an average food hardness classification accuracy of 94%, and chewing cycle detection precision and recall above 90% for the in-lab study and above 77% for the free-living study, covering 122 hours of recordings. Eating detection revealed the 44 eating events with an average accuracy above 95%. We conclude that smart eyeglasses are suitable for monitoring chewing and eating events in free-living conditions and could even provide further insights into the wearer's natural chewing patterns.
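A toy sketch of an energy-based chewing-cycle detector of the kind described above: window-wise EMG signal energy is compared against an adaptive threshold, and runs of high-energy windows are counted as cycles. The window length and threshold factor are invented placeholders, not the calibrated values from the study.

```python
import numpy as np

def detect_chewing_cycles(emg, fs, window_s=0.25, energy_factor=3.0):
    """Count chewing cycles in an EMG trace sampled at fs Hz by thresholding
    the per-window signal energy against a multiple of the median energy."""
    emg = np.asarray(emg, float)
    n = int(window_s * fs)
    energies = np.array([np.sum(emg[i:i + n] ** 2)
                         for i in range(0, len(emg) - n, n)])
    threshold = energy_factor * np.median(energies)
    active = energies > threshold
    # count rising edges of the active mask as chewing cycles
    return int(np.sum(np.diff(active.astype(int)) == 1))
```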
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies, and we suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.
Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation
NASA Astrophysics Data System (ADS)
Drasco, Steve; Flanagan, Éanna É.
2002-12-01
We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.
Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.
2012-03-28
This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to EBB useful in an operational environment.
Updated Model of the Solar Energetic Proton Environment in Space
NASA Astrophysics Data System (ADS)
Jiggens, Piers; Heynderickx, Daniel; Sandberg, Ingmar; Truscott, Pete; Raukunen, Osku; Vainio, Rami
2018-05-01
The Solar Accumulated and Peak Proton and Heavy Ion Radiation Environment (SAPPHIRE) model provides environment specification outputs for all aspects of the Solar Energetic Particle (SEP) environment. The model is based upon a thoroughly cleaned and carefully processed data set. Herein the evolution of the solar proton model is discussed with comparisons to other models and data. This paper discusses the construction of the underlying data set, the modelling methodology, optimisation of fitted flux distributions and extrapolation of model outputs to cover a range of proton energies from 0.1 MeV to 1 GeV. The model provides outputs in terms of mission cumulative fluence, maximum event fluence and peak flux for both solar maximum and solar minimum periods. A new method for describing maximum event fluence and peak flux outputs in terms of 1-in-x-year SPEs is also described. SAPPHIRE proton model outputs are compared with previous models including CREME96, ESP-PSYCHIC and the JPL model. Low energy outputs are compared to SEP data from ACE/EPAM whilst high energy outputs are compared to a new model based on GLEs detected by Neutron Monitors (NMs).
Microseismic Velocity Imaging of the Fracturing Zone
NASA Astrophysics Data System (ADS)
Zhang, H.; Chen, Y.
2015-12-01
Hydraulic fracturing of low-permeability reservoirs can induce microseismic events during fracture development. For this reason, microseismic monitoring using sensors at the surface or in boreholes has been widely used to delineate fracture spatial distribution and to understand fracturing mechanisms. It is often the case that the stimulated reservoir volume (SRV) is determined solely on the basis of microseismic locations. However, it is known that some fracture development stages are associated with long-period long-duration events rather than microseismic events. In addition, because microseismic events are essentially weak and there are different sources of noise during monitoring, some microseismic events cannot be detected and thus located. Therefore, the estimation of the SRV is biased if it is determined solely by microseismic locations. In the presence of fluids and fractures, the seismic velocity of reservoir layers is decreased. Based on this fact, we have developed a near-real-time seismic velocity tomography method to characterize velocity changes associated with the fracturing process. The method is based on a double-difference seismic tomography algorithm and images the fracturing zone where microseismic events occur by using differential arrival times from microseismic event pairs. To take into account the varying data distribution for different fracking stages, the method solves the velocity model in the wavelet domain so that different scales of model features can be obtained according to the data distribution. We have applied this real-time tomography method both to acoustic emission data from a lab experiment and to microseismic data from a downhole microseismic monitoring project for a shale gas hydraulic fracturing treatment. The tomography results from the lab data clearly show the velocity changes associated with different rock fracturing stages. For the field data application, the results show that microseismic events are located in low-velocity anomalies. By combining low-velocity anomalies with microseismic events, we can better estimate the SRV.
Short template switch events explain mutation clusters in the human genome.
Löytynoja, Ari; Goldman, Nick
2017-06-01
Resequencing efforts are uncovering the extent of genetic variation in humans and provide data to study the evolutionary processes shaping our genome. One recurring puzzle in both intra- and inter-species studies is the high frequency of complex mutations comprising multiple nearby base substitutions or insertion-deletions. We devised a generalized mutation model of template switching during replication that extends existing models of genome rearrangement and used this to study the role of template switch events in the origin of short mutation clusters. Applied to the human genome, our model detects thousands of template switch events during the evolution of human and chimp from their common ancestor and hundreds of events between two independently sequenced human genomes. Although many of these are consistent with a template switch mechanism previously proposed for bacteria, our model also identifies new types of mutations that create short inversions, some flanked by paired inverted repeats. The local template switch process can create numerous complex mutation patterns, including hairpin loop structures, and explains multinucleotide mutations and compensatory substitutions without invoking positive selection, speculative mechanisms, or implausible coincidence. Clustered sequence differences are challenging for current mapping and variant calling methods, and we show that many erroneous variant annotations exist in human reference data. Local template switch events may have been neglected as an explanation for complex mutations because of biases in commonly used analyses. Incorporation of our model into reference-based analysis pipelines and comparisons of de novo assembled genomes will lead to improved understanding of genome variation and evolution. © 2017 Löytynoja and Goldman; Published by Cold Spring Harbor Laboratory Press.
Aartsen, M G; Abraham, K; Ackermann, M; Adams, J; Aguilar, J A; Ahlers, M; Ahrens, M; Altmann, D; Andeen, K; Anderson, T; Ansseau, I; Anton, G; Archinger, M; Argüelles, C; Auffenberg, J; Axani, S; Bai, X; Barwick, S W; Baum, V; Bay, R; Beatty, J J; Becker Tjus, J; Becker, K-H; BenZvi, S; Berghaus, P; Berley, D; Bernardini, E; Bernhard, A; Besson, D Z; Binder, G; Bindig, D; Bissok, M; Blaufuss, E; Blot, S; Bohm, C; Börner, M; Bos, F; Bose, D; Böser, S; Botner, O; Braun, J; Brayeur, L; Bretz, H-P; Burgman, A; Carver, T; Casier, M; Cheung, E; Chirkin, D; Christov, A; Clark, K; Classen, L; Coenders, S; Collin, G H; Conrad, J M; Cowen, D F; Cross, R; Day, M; de André, J P A M; De Clercq, C; Del Pino Rosendo, E; Dembinski, H; De Ridder, S; Desiati, P; de Vries, K D; de Wasseige, G; de With, M; DeYoung, T; Díaz-Vélez, J C; di Lorenzo, V; Dujmovic, H; Dumm, J P; Dunkman, M; Eberhardt, B; Ehrhardt, T; Eichmann, B; Eller, P; Euler, S; Evenson, P A; Fahey, S; Fazely, A R; Feintzeig, J; Felde, J; Filimonov, K; Finley, C; Flis, S; Fösig, C-C; Franckowiak, A; Friedman, E; Fuchs, T; Gaisser, T K; Gallagher, J; Gerhardt, L; Ghorbani, K; Giang, W; Gladstone, L; Glagla, M; Glüsenkamp, T; Goldschmidt, A; Golup, G; Gonzalez, J G; Grant, D; Griffith, Z; Haack, C; Haj Ismail, A; Hallgren, A; Halzen, F; Hansen, E; Hansmann, B; Hansmann, T; Hanson, K; Hebecker, D; Heereman, D; Helbing, K; Hellauer, R; Hickford, S; Hignight, J; Hill, G C; Hoffman, K D; Hoffmann, R; Holzapfel, K; Hoshina, K; Huang, F; Huber, M; Hultqvist, K; In, S; Ishihara, A; Jacobi, E; Japaridze, G S; Jeong, M; Jero, K; Jones, B J P; Jurkovic, M; Kappes, A; Karg, T; Karle, A; Katz, U; Kauer, M; Keivani, A; Kelley, J L; Kemp, J; Kheirandish, A; Kim, M; Kintscher, T; Kiryluk, J; Kittler, T; Klein, S R; Kohnen, G; Koirala, R; Kolanoski, H; Konietz, R; Köpke, L; Kopper, C; Kopper, S; Koskinen, D J; Kowalski, M; Krings, K; Kroll, M; Krückl, G; Krüger, C; Kunnen, J; Kunwar, S; Kurahashi, N; Kuwabara, T; Labare, M; Lanfranchi, J L; Larson, M J; Lauber, F; Lennarz, D; Lesiak-Bzdak, M; Leuermann, M; Leuner, J; Lu, L; Lünemann, J; Madsen, J; Maggi, G; Mahn, K B M; Mancina, S; Mandelartz, M; Maruyama, R; Mase, K; Maunu, R; McNally, F; Meagher, K; Medici, M; Meier, M; Meli, A; Menne, T; Merino, G; Meures, T; Miarecki, S; Mohrmann, L; Montaruli, T; Moulai, M; Nahnhauer, R; Naumann, U; Neer, G; Niederhausen, H; Nowicki, S C; Nygren, D R; Obertacke Pollmann, A; Olivas, A; O'Murchadha, A; Palczewski, T; Pandya, H; Pankova, D V; Penek, Ö; Pepper, J A; Pérez de Los Heros, C; Pieloth, D; Pinat, E; Price, P B; Przybylski, G T; Quinnan, M; Raab, C; Rädel, L; Rameez, M; Rawlins, K; Reimann, R; Relethford, B; Relich, M; Resconi, E; Rhode, W; Richman, M; Riedel, B; Robertson, S; Rongen, M; Rott, C; Ruhe, T; Ryckbosch, D; Rysewyk, D; Sabbatini, L; Sanchez Herrera, S E; Sandrock, A; Sandroos, J; Sarkar, S; Satalecka, K; Schimp, M; Schlunder, P; Schmidt, T; Schoenen, S; Schöneberg, S; Schumacher, L; Seckel, D; Seunarine, S; Soldin, D; Song, M; Spiczak, G M; Spiering, C; Stahlberg, M; Stanev, T; Stasik, A; Steuer, A; Stezelberger, T; Stokstad, R G; Stößl, A; Ström, R; Strotjohann, N L; Sullivan, G W; Sutherland, M; Taavola, H; Taboada, I; Tatar, J; Tenholt, F; Ter-Antonyan, S; Terliuk, A; Tešić, G; Tilav, S; Toale, P A; Tobin, M N; Toscano, S; Tosi, D; Tselengidou, M; Turcati, A; Unger, E; Usner, M; Vandenbroucke, J; van Eijndhoven, N; Vanheule, S; van Rossem, M; van Santen, J; Veenkamp, J; Vehring, M; Voge, M; Vraeghe, M; Walck, C; Wallace, A; Wallraff, M; Wandkowsky, N; 
Weaver, Ch; Weiss, M J; Wendt, C; Westerhoff, S; Whelan, B J; Wickmann, S; Wiebe, K; Wiebusch, C H; Wille, L; Williams, D R; Wills, L; Wolf, M; Wood, T R; Woolsey, E; Woschnagg, K; Xu, D L; Xu, X W; Xu, Y; Yanez, J P; Yodh, G; Yoshida, S; Zoll, M
2016-12-09
We report constraints on the sources of ultrahigh-energy cosmic rays (UHECRs) above 10^9 GeV, based on an analysis of seven years of IceCube data. This analysis efficiently selects very high-energy neutrino-induced events which have deposited energies from 5×10^5 GeV to above 10^11 GeV. Two neutrino-induced events with an estimated deposited energy of (2.6±0.3)×10^6 GeV, the highest neutrino energy observed so far, and (7.7±2.0)×10^5 GeV were detected. The atmospheric background-only hypothesis of detecting these events is rejected at 3.6σ. The hypothesis that the observed events are of cosmogenic origin is also rejected at >99% CL because of the limited deposited energy and the nonobservation of events at higher energy, while their observation is consistent with an astrophysical origin. Our limits on cosmogenic neutrino fluxes disfavor the UHECR sources having a cosmological evolution stronger than the star formation rate, e.g., active galactic nuclei and γ-ray bursts, assuming proton-dominated UHECRs. Constraints on UHECR sources including mixed and heavy UHECR compositions are obtained for models of neutrino production within UHECR sources. Our limit disfavors a significant part of the parameter space for active galactic nuclei and new-born pulsar models. These limits on ultrahigh-energy neutrino flux models are the most stringent to date.
Coronary artery calcium distributions in older persons in the AGES-Reykjavik study
Gudmundsson, Elias Freyr; Gudnason, Vilmundur; Sigurdsson, Sigurdur; Launer, Lenore J.; Harris, Tamara B.; Aspelund, Thor
2013-01-01
Coronary Artery Calcium (CAC) is a sign of advanced atherosclerosis and an independent risk factor for cardiac events. Here, we describe CAC distributions in an unselected aged population and compare modelling methods to characterize the CAC distribution. CAC is difficult to model because it has a skewed and zero-inflated distribution with over-dispersion. Data are from the AGES-Reykjavik sample, a large population-based study (2002-2006) in Iceland of 5,764 persons aged 66-96 years. Linear regressions using logarithmic and Box-Cox transformations on CAC+1, quantile regression and a Zero-Inflated Negative Binomial model (ZINB) were applied. Methods were compared visually and with the PRESS statistic, R2 and the number of detected associations with concurrently measured variables. There were pronounced differences in CAC according to sex, age, history of coronary events and presence of plaque in the carotid artery. Associations with conventional coronary artery disease (CAD) risk factors varied between the sexes. The ZINB model provided the best results with respect to the PRESS statistic, R2, and the predicted proportion of zero scores. The ZINB model detected similar numbers of associations as the linear regression on ln(CAC+1), and usually with the same risk factors. PMID:22990371
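The PRESS statistic used above for model comparison is the sum of squared leave-one-out prediction errors; for an ordinary least-squares fit it can be computed without refitting, as in the following sketch (illustrative only, not the AGES-Reykjavik analysis code).

```python
import numpy as np

def press_statistic(X, y):
    """PRESS statistic for an ordinary least-squares fit: the sum of squared
    leave-one-out prediction errors, computed via the hat-matrix shortcut
    e_i / (1 - h_ii) rather than refitting n times."""
    X = np.column_stack([np.ones(len(y)), X])     # add intercept column
    hat = X @ np.linalg.pinv(X.T @ X) @ X.T       # hat (projection) matrix
    residuals = y - hat @ y
    leverage = np.diag(hat)
    return float(np.sum((residuals / (1.0 - leverage)) ** 2))
```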
A Review of Recent Advances in Research on Extreme Heat Events
NASA Technical Reports Server (NTRS)
Horton, Radley M.; Mankin, Justin S.; Lesk, Corey; Coffel, Ethan; Raymond, Colin
2016-01-01
Reviewing recent literature, we report that changes in extreme heat event characteristics such as magnitude, frequency, and duration are highly sensitive to changes in mean global-scale warming. Numerous studies have detected significant changes in the observed occurrence of extreme heat events, irrespective of how such events are defined. Further, a number of these studies have attributed present-day changes in the risk of individual heat events and the documented global-scale increase in such events to anthropogenic-driven warming. Advances in process-based studies of heat events have focused on the proximate land-atmosphere interactions through soil moisture anomalies, and changes in occurrence of the underlying atmospheric circulation associated with heat events in the mid-latitudes. While evidence for a number of hypotheses remains limited, climate change nevertheless points to tail risks of possible changes in heat extremes that could exceed estimates generated from model outputs of mean temperature. We also explore risks associated with compound extreme events and nonlinear impacts associated with extreme heat.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing an increasingly important role in social life. Real-time alerting of threatening events and searching for content of interest in large volumes of stored video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode of operation limits the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach compensates for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the underlying algorithms guarantees the robustness of the framework, and the improved matching approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
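The processing chain sketched in this abstract (motion compensation, frame differencing, HOG classification, tracking) can be approximated with off-the-shelf OpenCV components. The sketch below assumes a static background and an illustrative video file name, and omits the paper's key-point-based compensation, mean-shift tracking, and rule engine.

```python
# A minimal sketch of the detection front end with OpenCV; file name,
# thresholds, and the static-background assumption are illustrative.
import cv2

cap = cv2.VideoCapture("surveillance.avi")       # hypothetical input
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Foreground from simple frame differencing
    diff = cv2.absdiff(gray, prev_gray)
    _, fg = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev_gray = gray

    # Run the (comparatively slow) HOG people detector only when motion is present
    if cv2.countNonZero(fg) > 500:
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("events", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
```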
Radiation detector device for rejecting and excluding incomplete charge collection events
Bolotnikov, Aleksey E.; De Geronimo, Gianluigi; Vernon, Emerson; Yang, Ge; Camarda, Giuseppe; Cui, Yonggang; Hossain, Anwar; Kim, Ki Hyun; James, Ralph B.
2016-05-10
A radiation detector device is provided that is capable of distinguishing between full charge collection (FCC) events and incomplete charge collection (ICC) events based upon a correlation value comparison algorithm that compares correlation values calculated for individually sensed radiation detection events with a calibrated FCC event correlation function. The calibrated FCC event correlation function serves as a reference curve used by the correlation value comparison algorithm to determine whether a sensed radiation detection event fits the profile of the FCC event correlation function within the noise tolerances of the radiation detector device. If a radiation detection event is determined to be an ICC event, its spectrum is excluded from the device's spectral analyses. The radiation detector device can also calculate a performance factor to quantify the efficacy of distinguishing between FCC and ICC events.
A physics investigation of deadtime losses in neutron counting at low rates with Cf252
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Louise G; Croft, Stephen
2009-01-01
²⁵²Cf spontaneous fission sources are used for the characterization of neutron counters and the determination of calibration parameters, including both neutron coincidence counting (NCC) and neutron multiplicity deadtime (DT) parameters. Even at low event rates, temporally correlated neutron counting using ²⁵²Cf suffers a deadtime effect, meaning that, in contrast to counting a random neutron source (e.g., AmLi to a close approximation), DT losses do not vanish in the low-rate limit. This is because neutrons are emitted from spontaneous fission events in time-correlated 'bursts' and are detected over a short period commensurate with their lifetime in the detector (characterized by the system die-away time, τ). Thus, even when detected neutron events from different spontaneous fissions are unlikely to overlap in time, neutron events within the detected 'burst' are subject to intrinsic DT losses. Intrinsic DT losses for dilute Pu will be lower since the multiplicity distribution is softer, but real items also experience self-multiplication, which can increase the 'size' of the bursts. Traditional NCC DT correction methods do not include the intrinsic (within-burst) losses. We have proposed new forms of the traditional NCC Singles and Doubles DT correction factors. In this work, we apply Monte Carlo neutron pulse train analysis to investigate the functional form of the deadtime correction factors for an updating deadtime. Modeling is based on a high-efficiency ³He neutron counter with a short die-away time, representing an ideal ³He-based detection system. The physics of deadtime losses at low rates is explored and presented. It is observed that the new forms are applicable and offer more accurate correction than the traditional forms.
NASA Astrophysics Data System (ADS)
Managave, S. R.; Jani, R. A.; Narayana Rao, T.; Sunilkumar, K.; Satheeshkumar, S.; Ramesh, R.
2016-08-01
Evaporation of rain is known to contribute water vapor, a potent greenhouse gas, to the atmosphere. Stable oxygen and hydrogen isotopic compositions (δ18O and δD, respectively) of precipitation, usually measured and presented as values integrated over rain events or as monthly means, are important tools for detecting evaporation effects. The slope ~8 of the linear relationship between such time-averaged values of δD and δ18O (called the meteoric water line) is widely accepted as proof of condensation under isotopic equilibrium and of the absence of evaporation of rain during atmospheric fall. Here, through a simultaneous investigation of the isotopic and drop size distributions of seventeen rain events sampled on an intra-event scale at Gadanki (13.5°N, 79.2°E), southern India, we demonstrate that the evaporation effects, not evident in the time-averaged data, are significantly manifested in the sub-samples of individual rain events. We detect this through (1) slopes significantly less than 8 for the δD-δ18O relation on the intra-event scale and (2) significant positive correlations between deuterium excess (d-excess = δD − 8·δ18O; lower values in rain indicate evaporation) and the mass-weighted mean diameter of the raindrops (D_m). An estimated ~44 % of rain is influenced by evaporation. This study also reveals a signature of isotopic equilibration of rain with the cloud base vapor, a process important for modeling the isotopic composition of precipitation. d-excess values of rain are modified by the post-condensation processes, and the present approach offers a way to identify the d-excess values least affected by such processes. Isotope-enabled global circulation models could be improved by incorporating intra-event isotopic data and raindrop-size-dependent isotopic effects.
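The two intra-event diagnostics used here, the δD-δ18O slope and the correlation between d-excess and the mass-weighted mean drop diameter, reduce to a few lines of NumPy/SciPy. The values below are placeholders chosen only to mimic an evaporation-affected event, not the Gadanki measurements.

```python
# A small numerical sketch of the two diagnostics; arrays are placeholders.
import numpy as np
from scipy.stats import pearsonr

d18o = np.array([-2.1, -3.4, -4.0, -5.2, -6.1])   # per mil, sub-samples of one event
dD   = np.array([-12.0, -21.0, -25.0, -33.0, -39.0])
d_m  = np.array([1.1, 1.2, 1.4, 1.6, 1.9])        # mass-weighted mean drop diameter (mm)

slope, intercept = np.polyfit(d18o, dD, 1)        # slope < 8 suggests sub-cloud evaporation
d_excess = dD - 8.0 * d18o
r, p = pearsonr(d_excess, d_m)                    # positive r: evaporation signature

print(f"intra-event slope = {slope:.2f}")
print(f"corr(d-excess, D_m) = {r:.2f} (p = {p:.3f})")
```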
Multi-model data fusion to improve an early warning system for hypo-/hyperglycemic events.
Botwey, Ransford Henry; Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G
2014-01-01
Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on i) Dempster-Shafer Evidential Theory (DST), ii) Genetic Algorithms (GA), and iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared to the cARX and RNN models, and a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.
2016-01-01
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
Supervised Time Series Event Detector for Building Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-13
A machine learning-based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data, and any data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects the days with abnormal events, and diagnoses the cause of the events.
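A minimal sketch of this kind of day-level detector is shown below; a plain RBF-kernel SVM from scikit-learn stands in for the modified nonlinear Bayesian SVM described in the record, and the daily load profiles and labels are synthetic.

```python
# A minimal day-level event detector on synthetic labelled daily load profiles.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_days, n_intervals = 365, 96                 # hypothetical 15-min metering

profiles = rng.normal(50, 5, (n_days, n_intervals))
labels = np.zeros(n_days, dtype=int)
rare = rng.choice(n_days, 12, replace=False)  # a few rare-event days
profiles[rare] += rng.normal(30, 5, (12, n_intervals))
labels[rare] = 1

clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", class_weight="balanced", probability=True))
clf.fit(profiles[:300], labels[:300])

scores = clf.predict_proba(profiles[300:])[:, 1]
print("flagged days:", np.where(scores > 0.5)[0] + 300)
```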
NASA Astrophysics Data System (ADS)
Reynen, Andrew; Audet, Pascal
2017-09-01
A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as inputs to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as the attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with a low false alarm rate, allowing for a larger automatic event catalogue with a high degree of trust.
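A minimal sketch of the classification step is given below, assuming each detection has already been reduced to the two generalized attributes (polarization and frequency content); the feature values are synthetic and a logistic regression stands in for the paper's regression-based classifier.

```python
# Two-attribute blast/earthquake classification on synthetic feature values.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# columns: [rectilinearity (0-1), dominant frequency (Hz)] -- synthetic
earthquakes = np.column_stack([rng.uniform(0.6, 1.0, 200), rng.uniform(2, 10, 200)])
blasts      = np.column_stack([rng.uniform(0.1, 0.6, 200), rng.uniform(8, 25, 200)])

X = np.vstack([earthquakes, blasts])
y = np.r_[np.zeros(200), np.ones(200)]        # 0 = earthquake, 1 = blast

clf = LogisticRegression()
print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```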
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
Real-time prediction of the occurrence of GLE events
NASA Astrophysics Data System (ADS)
Núñez, Marlon; Reyes-Santiago, Pedro J.; Malandraki, Olga E.
2017-07-01
A tool for predicting the occurrence of Ground Level Enhancement (GLE) events using the UMASEP scheme is presented. This real-time tool, called HESPERIA UMASEP-500, is based on the detection of the magnetic connection, along which protons arrive in the near-Earth environment, by estimating the lag correlation between the time derivatives of 1 min soft X-ray flux (SXR) and 1 min near-Earth proton fluxes observed by the GOES satellites. Unlike current GLE warning systems, this tool can predict GLE events before the detection by any neutron monitor (NM) station. The prediction performance measured for the period from 1986 to 2016 is presented for two consecutive periods, because of their notable difference in performance. For the 2000-2016 period, this prediction tool obtained a probability of detection (POD) of 53.8% (7 of 13 GLE events), a false alarm ratio (FAR) of 30.0%, and average warning times (AWT) of 8 min with respect to the first NM station's alert and 15 min to the GLE Alert Plus's warning. We have tested the model by replacing the GOES proton data with SOHO/EPHIN proton data, and the results are similar in terms of POD, FAR, and AWT for the same period. The paper also presents a comparison with a GLE warning system.
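The core quantity behind the scheme, the lag correlation between the time derivatives of the 1-min soft X-ray flux and the 1-min proton flux, can be sketched as follows; the series are synthetic placeholders for GOES data and the lag search range is illustrative.

```python
# Lag-correlation sketch between derivatives of two 1-min flux series.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(600)                                      # 600 one-minute samples
sxr = np.exp(-0.5 * ((t - 200) / 15.0) ** 2) + 0.01 * rng.random(600)
protons = np.roll(sxr, 12) + 0.01 * rng.random(600)     # protons lag by ~12 min

d_sxr, d_p = np.gradient(sxr), np.gradient(protons)

def lag_corr(x, y, lag):
    """Pearson correlation of x against y shifted by `lag` samples."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

lags = np.arange(0, 60)
corrs = [lag_corr(d_sxr, d_p, l) for l in lags]
best = lags[int(np.argmax(corrs))]
print(f"best lag = {best} min, correlation = {max(corrs):.2f}")
```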
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.
2016-12-01
There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated based on flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, to improve these models both traditional methods of flood prediction from physically based models as well as data-driven techniques, such as machine learning, require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has only mapped 5% of these floods. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
Detecting misinformation and knowledge conflicts in relational data
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian
2014-06-01
Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commanders' information requests, integrate facts into summaries about current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open source data processing, where there is a much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against the data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modifications to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In the experimental study with the SYNCOIN dataset, our algorithms achieved perfect conflict detection in an ideal situation (no missing data) while producing 82% recall and 90% precision in a realistic noise situation (15% of attributes missing).
Ionospheric "Volcanology": Ionospheric Detection of Volcano Eruptions
NASA Astrophysics Data System (ADS)
Astafyeva, E.; Shults, K.; Lognonne, P. H.; Rakoto, V.
2016-12-01
It is known that volcano eruptions and explosions can generate acoustic and gravity waves. These neutral waves further propagate into the atmosphere and ionosphere, where they are detectable by atmospheric and ionospheric sounding tools. The features of co-volcanic ionospheric perturbations are not yet well understood. The development of the global and regional networks of ground-based GPS/GNSS receivers has opened a new era in the ionospheric detection of natural hazard events, including volcano eruptions. It is now known that eruptions with a volcanic explosivity index (VEI) of more than 2 can be detected in the ionosphere, especially in regions with dense GPS/GNSS-receiver coverage. The co-volcanic ionospheric disturbances are usually characterized as quasi-periodic oscillations. The Calbuco volcano, located in southern Chile, awoke in April 2015 after 43 years of inactivity. The first eruption began at 21:04UT on 22 April 2015, preceded by only an hour-long period of volcano-tectonic activity. This first eruption lasted 90 minutes and generated a sub-Plinian (i.e., medium to large explosive event) gray ash plume that rose 15 km above the main crater. A larger second event on 23 April began at 04:00UT (01:00LT), lasted six hours, and also generated a sub-Plinian ash plume that rose higher than 15 km. The VEI was estimated to be 4 to 5 for these two events. In this work, we first study the ionospheric TEC response to the Calbuco volcano eruptions of April 2015 by using ground-based GNSS receivers located around the volcano. We analyze the spectral characteristics of the observed TEC variations and we estimate the propagation speed of the co-volcanic ionospheric perturbations. We further proceed with modeling of the ionospheric TEC variations due to the Calbuco volcano eruptions, based on the normal mode summation technique. Finally, we attempt to localize the position of the volcano from the ionospheric measurements, and we also estimate the time of the beginning of the eruption.
Object-Oriented Query Language For Events Detection From Images Sequences
NASA Astrophysics Data System (ADS)
Ganea, Ion Eugen
2015-09-01
This paper presents a method to represent events extracted from image sequences and the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of the query results. The experiments were performed on image sequences from the sport domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, as the final goal of the research.
An image-based model of brain volume biomarker changes in Huntington's disease.
Wijeratne, Peter A; Young, Alexandra L; Oxtoby, Neil P; Marinescu, Razvan V; Firth, Nicholas C; Johnson, Eileanoir B; Mohan, Amrita; Sampaio, Cristina; Scahill, Rachael I; Tabrizi, Sarah J; Alexander, Daniel C
2018-05-01
Determining the sequence in which Huntington's disease biomarkers become abnormal can provide important insights into the disease progression and a quantitative tool for patient stratification. Here, we construct and present a uniquely fine-grained model of temporal progression of Huntington's disease from premanifest through to manifest stages. We employ a probabilistic event-based model to determine the sequence of appearance of atrophy in brain volumes, learned from structural MRI in the Track-HD study, as well as to estimate the uncertainty in the ordering. We use longitudinal and phenotypic data to demonstrate the utility of the patient staging system that the resulting model provides. The model recovers the following order of detectable changes in brain region volumes: putamen, caudate, pallidum, insula white matter, nonventricular cerebrospinal fluid, amygdala, optic chiasm, third ventricle, posterior insula, and basal forebrain. This ordering is mostly preserved even under cross-validation of the uncertainty in the event sequence. Longitudinal analysis performed using 6 years of follow-up data from baseline confirms efficacy of the model, as subjects consistently move to later stages with time, and significant correlations are observed between the estimated stages and nonimaging phenotypic markers. We used a data-driven method to provide new insight into Huntington's disease progression as well as new power to stage and predict conversion. Our results highlight the potential of disease progression models, such as the event-based model, to provide new insight into Huntington's disease progression and to support fine-grained patient stratification for future precision medicine in Huntington's disease.
A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS)
Rigi, Amin
2018-01-01
In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The proposed approach is validated by using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that affected the results significantly. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show the sensor's high potential for use in manipulation applications, especially in dynamic environments. PMID:29364190
Non Conventional Seismic Events Along the Himalayan Arc Detected in the Hi-Climb Dataset
NASA Astrophysics Data System (ADS)
Vergne, J.; Nàbĕlek, J. L.; Rivera, L.; Bollinger, L.; Burtin, A.
2008-12-01
From September 2002 to August 2005, more than 200 broadband seismic stations were operated across the Himalayan arc and the southern Tibetan plateau in the framework of the Hi-Climb project. Here, we take advantage of the high density of stations along the main profile to look for coherent seismic wave arrivals that cannot be attributed to ordinary tectonic events. An automatic detection algorithm is applied to the continuous data streams filtered between 1 and 10 Hz, followed by a visual inspection of all detections. We discovered about one hundred coherent signals that cannot be attributed to local, regional or teleseismic earthquakes and which are characterized by emergent arrivals and long durations ranging from one minute to several hours. Most of these non-conventional seismic events have a low signal-to-noise ratio and are thus only observed above 1 Hz, in the frequency band where the seismic noise is the lowest. However, a small subset of them are strong enough to be observed in a larger frequency band and show an enhancement of long periods compared to standard earthquakes. Based on the analysis of the relative amplitude measured at each station or, when possible, on the correlation of the low-frequency part of the signals, most of these events appear to be located along the High Himalayan range. However, because of their emergent character and the main orientation of the seismic profile, their longitude and depth remain poorly constrained. The origin of these non-conventional seismic events remains unresolved, but their seismic signature shares several characteristics with non-volcanic tremors, glacial earthquakes and/or debris avalanches. All these phenomena may occur along the Himalayan range but were not seismically detected before. Here we discuss the pros and cons of each of these postulated candidates based on the analysis of the recorded waveforms and slip models.
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Huber, David J.; Martin, Kevin
2017-05-01
This paper describes a technique in which we improve upon the prior performance of the Rapid Serial Visual Presentation (RSVP) EEG paradigm for image classification through the insertion of visual attention distracters and overall sequence reordering based upon the expected ratio of rare to common "events" in the environment and operational context. Inserting distracter images maintains the ratio of common events to rare events at an ideal level, maximizing rare-event detection via the P300 EEG response to the RSVP stimuli. The method has two steps: first, we compute the optimal number of distracters needed for an RSVP sequence based on the desired sequence length and expected number of targets and insert the distracters into the RSVP sequence, and then we reorder the RSVP sequence to maximize P300 detection. We show that by reducing the ratio of target events to nontarget events using this method, we can allow RSVP sequences with more targets without sacrificing area under the ROC curve (Az).
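The first step, choosing how many distracters to insert for a desired rare-to-common ratio, is simple arithmetic; the sketch below uses an illustrative 10% design ratio, which is an assumption rather than a value taken from the paper.

```python
# How many distracter images to add so targets / total stays at or below
# a chosen design ratio (the 10% figure is illustrative).
def distracters_needed(n_images, n_targets, target_ratio=0.10):
    """Number of distracters to add so that targets / total <= target_ratio."""
    required_total = n_targets / target_ratio
    return max(0, int(round(required_total - n_images)))

n_add = distracters_needed(n_images=100, n_targets=18, target_ratio=0.10)
print(f"insert {n_add} distracters")   # 18 / 0.10 = 180 images -> add 80
```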
He, Jian; Bai, Shuang; Wang, Xiaoyi
2017-06-16
Falls are one of the main health risks among the elderly. A fall detection system based on inertial sensors can automatically detect a fall event and alert a caregiver for immediate assistance, so as to reduce injuries caused by falls. Nevertheless, most inertial-sensor-based fall detection technologies have focused on the accuracy of detection while neglecting the quantization noise caused by the inertial sensor. In this paper, an activity model based on tri-axial acceleration and gyroscope data is proposed, and the difference between activities of daily living (ADLs) and falls is analyzed. Meanwhile, a Kalman filter is proposed to preprocess the raw data so as to reduce noise. A sliding window and a Bayes network classifier are introduced to develop a wearable fall detection system, which is composed of a wearable motion sensor and a smart phone. The experiments show that the proposed system distinguishes simulated falls from ADLs with a high accuracy of 95.67%, while sensitivity and specificity are 99.0% and 95.0%, respectively. Furthermore, the smart phone can issue an alarm to caregivers so as to provide timely and accurate help for the elderly, as soon as the system detects a fall.
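A minimal sketch of the preprocessing and detection chain is shown below: a one-dimensional Kalman filter smooths the acceleration magnitude and a sliding-window peak rule stands in for the paper's Bayes network classifier; the sampling rate, noise parameters, and thresholds are illustrative.

```python
# Kalman smoothing of acceleration magnitude plus a windowed peak rule.
import numpy as np

def kalman_1d(z, q=1e-3, r=0.05):
    """Constant-level Kalman filter over a 1-D measurement series z."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for i, zi in enumerate(z):
        p += q                      # predict step (process noise)
        k = p / (p + r)             # Kalman gain
        x += k * (zi - x)           # measurement update
        p *= 1.0 - k
        out[i] = x
    return out

rng = np.random.default_rng(4)
fs = 50                                              # sampling rate (Hz)
acc = 1.0 + 0.05 * rng.standard_normal(10 * fs)      # ~1 g at rest, in g
acc[250:265] = 4.0                                   # short fall-like impact

smooth = kalman_1d(acc)

# Non-overlapping 1-s windows; the real system feeds window features to a
# Bayes network classifier, here a simple peak threshold stands in for it.
win = fs
suspect = [i for i in range(0, len(smooth) - win + 1, win)
           if smooth[i:i + win].max() > 2.5]
print("fall suspected in windows starting at samples:", suspect)
```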
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Bruce T.
2015-12-11
Problem: The overall goal of this proposal is to detect observed seasonal-mean precipitation variations and extreme event occurrences over the United States. Detection, e.g. the process of demonstrating that an observed change in climate is unusual, first requires some means of estimating the range of internal variability absent any external drivers. Ideally, the internal variability would be derived from the observations themselves, however generally the observed variability is a confluence of both internal variability and variability in response to external drivers. Further, numerical climate models—the standard tool for detection studies—have their own estimates of intrinsic variability, which may differ substantially from that found in the observed system as well as other model systems. These problems are further compounded for weather and climate extremes, which as singular events are particularly ill-suited for detection studies because of their infrequent occurrence, limited spatial range, and underestimation within global and even regional numerical models. Rationale: As a basis for this research we will show how stochastic daily-precipitation models—models in which the simulated interannual-to-multidecadal precipitation variance is purely the result of the random evolution of daily precipitation events within a given time period—can be used to address many of these issues simultaneously. Through the novel application of these well-established models, we can first estimate the changes/trends in various means and extremes that can occur even with fixed daily-precipitation characteristics, e.g. that can occur simply as a result of the stochastic evolution of daily weather events within a given climate. Detection of a change in the observed climate—either naturally or anthropogenically forced—can then be defined as any change relative to this stochastic variability, e.g. as changes/trends in the means and extremes that could only have occurred through a change in the underlying climate. As such, this method is capable of detecting “hot spot” regions—as well as “flare ups” within the hot spot regions—that have experienced interannual to multi-decadal scale variations and trends in seasonal-mean precipitation and extreme events. Further by applying the same methods to numerical climate models we can discern the fidelity of the current-generation climate models in representing detectability within the observed climate system. In this way, we can objectively determine the utility of these model systems for performing detection studies of historical and future climate change.
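A minimal sketch of the stochastic daily-precipitation idea is given below: a two-state Markov chain for wet/dry occurrence plus gamma-distributed wet-day amounts builds a null distribution of seasonal totals against which an observed total can be judged; all parameter values are illustrative.

```python
# Markov-chain occurrence + gamma amounts as a null model of seasonal totals.
import numpy as np

rng = np.random.default_rng(5)
p_wd, p_ww = 0.25, 0.55          # P(wet | dry yesterday), P(wet | wet yesterday)
shape, scale = 0.8, 8.0          # gamma parameters for wet-day totals (mm)
ndays, nseasons = 90, 10000

def seasonal_total():
    wet, total = False, 0.0
    for _ in range(ndays):
        wet = rng.random() < (p_ww if wet else p_wd)
        if wet:
            total += rng.gamma(shape, scale)
    return total

null = np.array([seasonal_total() for _ in range(nseasons)])
obs = 320.0                       # a hypothetical observed seasonal total
pct = (null < obs).mean() * 100
print(f"observed total sits at the {pct:.1f}th percentile of stochastic variability")
```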
Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data
NASA Technical Reports Server (NTRS)
Rompala, John T.
2005-01-01
A ground based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
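The graph view of similarity-detector output described above can be sketched with NetworkX: candidate detections become vertices, similarities above a threshold become weighted edges, and connected components give candidate event families; the similarity matrix below is random apart from one planted cluster.

```python
# Similarity graph from detector output, grouped by connected components.
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
n = 40
sim = rng.uniform(0.0, 0.4, (n, n))
sim[:6, :6] = rng.uniform(0.8, 1.0, (6, 6))      # planted repeating-event family
sim = np.triu(sim, 1)                            # keep upper triangle only

g = nx.Graph()
g.add_nodes_from(range(n))
thr = 0.7
for i, j in zip(*np.nonzero(sim > thr)):
    g.add_edge(int(i), int(j), weight=float(sim[i, j]))

for group in nx.connected_components(g):
    if len(group) > 1:
        # score a candidate family by its mean internal similarity
        pairs = [(i, j) for i in group for j in group if i < j and g.has_edge(i, j)]
        conf = np.mean([g[i][j]["weight"] for i, j in pairs])
        print(sorted(group), f"confidence {conf:.2f}")
```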
Ku, Hao-Hsiang
2015-01-01
Nowadays, people can easily use a smartphone to get wanted information and requested services. Hence, this study designs and proposes a Golf Swing Injury Detection and Evaluation open service platform with an Ontology-oriented clustering case-based reasoning mechanism, called GoSIDE, based on Arduino and the Open Service Gateway initiative (OSGi). GoSIDE is a three-tier architecture, which is composed of Mobile Users, Application Servers and a Cloud-based Digital Convergence Server. A mobile user is with a smartphone and Kinect sensors to detect the user's golf swing actions and to interact with iDTV. An application server is with the Intelligent Golf Swing Posture Analysis Model (iGoSPAM) to check a user's golf swing actions and to alert the user when erroneous actions are detected. The Cloud-based Digital Convergence Server is with Ontology-oriented Clustering Case-based Reasoning (CBR) for Quality of Experience (OCC4QoE), which is designed to provide QoE services by QoE-based Ontology strategies, rules and events for this user. Furthermore, GoSIDE will automatically trigger OCC4QoE and deliver popular rules for a new user. Experimental results illustrate that GoSIDE can provide appropriate detections for golfers. Finally, GoSIDE can be a reference model for researchers and engineers.
NASA Astrophysics Data System (ADS)
Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry
2016-04-01
Possibilities of associating information from the four pillars constituting the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) for a long time. Based on a concept of overlaying waveform events with the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype of a Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate the various time scales, a dynamic product evolving while the results of the different technologies are being processed and compiled is envisioned. The product would be available through the Secure Web Portal (SWP). In this presentation we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.
NASA Astrophysics Data System (ADS)
Occhipinti, G.; Bablet, A.; Makela, J. J.
2015-12-01
The detection of tsunami-related internal gravity waves (IGWtsuna) by airglow cameras has recently been validated by observation (Makela et al., 2011) and modeling (Occhipinti et al., 2011) in the case of the Tohoku event (11 March 2011, Mw 9.0). The airglow measures the photon emission at 630 nm, indirectly linked to the plasma density of O2+ (Link & Cogger, 1988), and is commonly used to detect transient events in the ionosphere (Kelley et al., 2002, Makela et al., 2009, Miller et al., 2009). The modeling of the IGWtsuna clearly reproduced the pattern of the airglow measurement observed over Hawaii, and the comparison between the observation and the modeling allows the wave form to be recognized and explains the IGWtsuna arriving before the tsunami wavefront at sea level (Occhipinti et al., 2011). Approaching the Hawaiian archipelago, the tsunami propagation is slowed down (reduction of the sea depth); the IGWtsuna, propagating in the atmosphere/ionosphere, instead conserves its speed. In this work, we present the modeling of the new airglow observation following the Queen Charlotte event (27 October 2012, Mw 7.8) that has recently been detected, proving that the technique can be generalized to smaller events. Additionally, the effect of the wind on the IGWtsuna, already evoked in the past, is included in the modeling to better reproduce the airglow observations. All ref. here @ www.ipgp.fr/~ninto
Spectral characteristics of VLF sferics associated with TGFs
NASA Astrophysics Data System (ADS)
Mezentsev, Andrew; Lehtinen, Nikolai; Ostgaard, Nikolai; Perez-Invernon, Javier; Cummer, Steven
2017-04-01
A detailed analysis of RHESSI TGFs is performed in association with WWLLN sources and VLF sferics recorded at Duke University. The analysis of the TGF-WWLLN matches allowed us to evaluate RHESSI clock systematic offsets [1], which allows a more precise timing analysis comparing TGF data with the VLF sferics recorded at Duke University. In this work we analyzed the energy spectra of 35 VLF sferics, which were identified as candidates to be emitted by the TGF source, based on the simultaneity and location coincidence between the TGF and radio sources. Twenty events have WWLLN detections, which provide a reliable source location of the event. For the other 15 events several selection criteria were used: the source location should be consistent with the simultaneity of the TGF and VLF sferic within ±200 μs uncertainty; it should lie within the azimuthal ±4° cone defined by the ratio of the radial and azimuthal magnetic field components of the VLF sferic; it should lie within an 800 km circle around the RHESSI foot-point; and it should lie within a cluster of current lightning activity validated by WWLLN (or any other lightning detection network). The energy spectra of the 35 VLF sferics related to TGFs were analyzed in the context of the TGF radio emission model developed in [2]. The proposed model represents a TGF at source as a sequence of Np seeding pulses of energetic particles which develop into runaway avalanches in the strong ambient field. These relativistic electrons ionize air along their propagation path, which results in secondary currents of low-energy electrons and light ions in the ambient electric field. These secondary currents produce radio emissions that can be detected by ground-based sensors. The proposed model allows us to express the TGF source current moment energy spectrum using the T50 TGF duration measured by RHESSI. This gives the opportunity to establish and validate empirically the functional link between the satellite measurements and radio recordings of TGFs. Distances from the analyzed TGF sources to the Duke VLF receiver range from 2000 to 4000 km. This involves the consideration of the propagation effects in the Earth-ionosphere wave guide (EIWG). The EIWG transfer function was calculated for each event using the full wave propagation method. Thus, the modeled energy spectrum of the TGF source current moment can be transformed into how it would look for the Duke VLF receiver. Comparative analysis of the energy spectra of modeled TGF radio emission and associated VLF sferics for the 20 events with WWLLN-confirmed locations and the 15 events without WWLLN detection shows that 31 of these 35 events exhibit a good fit between the modeled and observed spectra, with only 4 exceptions that look inconsistent with the proposed model. The second cutoff frequency fB together with the number of avalanches Np defines the shape of the observed energy spectrum of the sferic emitted by a TGF. The multiplicity of the TGF serves as another important discriminative factor that shows the consistency between the modeled and observed spectra. The results show that the number of avalanches Np should be relatively small, of the order of 30-300, to make the modeled TGF radio emission consistent with the observed VLF sferics.
These small values of Np give an argument in favor of the leader model of the TGF production, and also might refer to streamers in the streamer zone of the leader tip, as candidates, producing initial seeding pulses that develop into RREAs, generating a TGF. [1]. Mezentsev, A., Østgaard, N., Gjesteland, T., Albrechtsen, K., Lehtinen, N., Marisaldi, M., Smith, D., and Cummer, S. (2016), Radio emissions from double RHESSI TGFs, J. Geophys. Res., 121, doi:10.1002/2016JD025111 [2]. Dwyer, J. R., and S. A. Cummer (2013), Radio emissions from terrestrial gamma ray flashes, J. Geophys. Res., 118, doi:10.1002/jgra.50188.
A research using hybrid RBF/Elman neural networks for intrusion detection system secure model
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Wang, Zhu; Yu, Haining
2009-10-01
A hybrid RBF/Elman neural network model that can be employed for both anomaly detection and misuse detection is presented in this paper. IDSs using the hybrid neural network can detect temporally dispersed and collaborative attacks effectively because of the network's memory of past events. The RBF network is employed as a real-time pattern classifier, and the Elman network is employed to restore the memory of past events. The IDSs using the hybrid neural network are evaluated against the intrusion detection evaluation data sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA). Experimental results are presented as ROC curves. Experiments show that the IDSs using this hybrid neural network improve the detection rate and decrease the false positive rate effectively.
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term sustained time series record. Considering that the time-varying Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point Detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) can be regarded as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means that SDAR aims to find the statistical irregularities of the record through CPD. There are three advantages of SDAR. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection; it is therefore an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of the present statistic value on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to prove the feasibility and advantage of our method.
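A simplified single-channel sketch of the SDAR idea follows: AR(1) statistics are updated online with a discounting factor and the anomaly score is the negative log-likelihood of each new sample under the current model; this is a toy illustration, not the Tomakomai implementation.

```python
# Toy sequentially discounting AR(1) learning with an online anomaly score.
import numpy as np

def sdar_scores(x, r=0.02):
    mu, c0, c1, sigma2 = x[0], 1.0, 0.0, 1.0
    scores = np.zeros_like(x)
    for t in range(1, len(x)):
        # one-step AR(1) prediction from the current discounted statistics
        a = c1 / c0 if c0 > 1e-12 else 0.0
        pred = mu + a * (x[t - 1] - mu)
        err = x[t] - pred
        scores[t] = 0.5 * (np.log(2 * np.pi * sigma2) + err**2 / sigma2)
        # discounted updates of mean, covariances and innovation variance
        mu = (1 - r) * mu + r * x[t]
        c0 = (1 - r) * c0 + r * (x[t] - mu) ** 2
        c1 = (1 - r) * c1 + r * (x[t] - mu) * (x[t - 1] - mu)
        sigma2 = (1 - r) * sigma2 + r * err**2
    return scores

rng = np.random.default_rng(7)
record = rng.normal(0, 1, 3000)
record[1500:1520] += 6.0 * np.sin(np.linspace(0, 6 * np.pi, 20))   # small "event"
s = sdar_scores(record)
print("peak score index:", int(np.argmax(s)))    # expected near sample 1500
```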
NASA Astrophysics Data System (ADS)
Heleno, Sandra; Matias, Magda; Pina, Pedro; Sousa, António Jorge
2016-04-01
A method for semiautomated landslide detection and mapping, with the ability to separate source and run-out areas, is presented in this paper. It combines object-based image analysis and a support vector machine classifier and is tested using a GeoEye-1 multispectral image, sensed 3 days after a major damaging landslide event that occurred on Madeira Island (20 February 2010), and a pre-event lidar digital terrain model. The testing is developed in a 15 km2 wide study area, where 95 % of the landslide scars are detected by this supervised approach. The classifier presents a good performance in the delineation of the overall landslide area, with commission errors below 26 % and omission errors below 24 %. In addition, fair results are achieved in the separation of the source from the run-out landslide areas, although on less illuminated slopes this discrimination is less effective than on sunnier, east-facing slopes.
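A minimal sketch of the classification stage is given below, assuming image objects have already been segmented and reduced to a few per-object features; the feature names and values are illustrative, not the actual GeoEye-1/lidar attributes.

```python
# SVM classification of synthetic image objects into source, run-out, and background.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(8)

def objects(n, brightness, ndvi, slope):
    return np.column_stack([rng.normal(brightness, 10, n),
                            rng.normal(ndvi, 0.05, n),
                            rng.normal(slope, 5, n)])

X = np.vstack([objects(300, 120, 0.10, 30),    # landslide source areas
               objects(300, 100, 0.15, 12),    # run-out deposits
               objects(600, 60, 0.60, 20)])    # undisturbed vegetation
y = np.r_[np.zeros(300), np.ones(300), np.full(600, 2)]

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
print("object classification accuracy:", clf.score(Xte, yte))
```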
Wu, Jian-Xing; Huang, Ping-Tzan; Li, Chien-Ming
2018-01-01
Blood leakage and blood loss are serious life-threatening complications occurring during dialysis therapy. These events have been of concern to both healthcare givers and patients. More than 40% of adult blood volume can be lost in just a few minutes, resulting in morbidities and mortality. The authors intend to propose the design of a warning tool for the detection of blood leakage/blood loss during dialysis therapy based on fog computing with an array of photocell sensors and a heteroassociative memory (HAM) model. Photocell sensors are arranged in an array on a flexible substrate to detect blood leakage via the resistance changes with illumination in the visible spectrum of 500–700 nm. The HAM model is implemented to design a virtual alarm unit using electricity changes in an embedded system. The proposed warning tool can indicate the risk level in both end-sensing units and remote monitor devices via a wireless network and fog/cloud computing. The animal experimental results (pig blood) demonstrate the feasibility. PMID:29515815
NASA Astrophysics Data System (ADS)
Záhlava, J.; Němec, F.; Santolík, O.; Kolmašová, I.; Parrot, M.; Rodger, C. J.
2015-11-01
We present results of a systematic study of unusual very low frequency (VLF) radio events with a reduced intensity observed in the frequency-time spectrograms measured by the low-orbiting Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions (DEMETER) spacecraft. They occur exclusively on the nightside. During these events, the intensity of fractional hop whistlers at specific frequencies is significantly reduced. These frequencies are usually above about 3.4 kHz (second Earth-ionosphere waveguide cutoff frequency), but about 20% of events extend down to about 1.7 kHz (first Earth-ionosphere waveguide cutoff frequency). The frequencies of a reduced intensity vary smoothly with time. We have inspected 6.5 years of DEMETER data, and we identified in total 1601 such events. We present a simple model of the event formation based on the wave propagation in the Earth-ionosphere waveguide. We apply the model to two selected events, and we demonstrate that the model is able to reproduce both the minimum frequencies of the events and their approximate frequency-time shapes. The overall geographic distribution of the events is shifted by about 3000 km westward and slightly southward with respect to the areas with high long-term average lightning activity. We demonstrate that this shift is related to the specific DEMETER orbit, and we suggest its qualitative explanation by the east-west asymmetry of the wave propagation in the Earth-ionosphere waveguide.
Automatic Detection and Classification of Audio Events for Road Surveillance Applications.
Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine
2018-06-06
This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy when compared against methods that use individual temporal and spectral features.
Lyapunov-Based Sensor Failure Detection And Recovery For The Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Haralambous, Michael G.
2001-01-01
Livingstone, a model-based AI software system, is planned for use in the autonomous fault diagnosis, reconfiguration, and control of the oxygen-producing reverse water gas shift (RWGS) process test-bed located in the Applied Chemistry Laboratory at KSC. In this report the RWGS process is first briefly described and an overview of Livingstone is given. Next, a Lyapunov-based approach for detecting and recovering from sensor failures, differing significantly from that used by Livingstone, is presented. In this new method, models used are in terms of the defining differential equations of system components, thus differing from the qualitative, static models used by Livingstone. An easily computed scalar inequality constraint, expressed in terms of sensed system variables, is used to determine the existence of sensor failures. In the event of sensor failure, an observer/estimator is used for determining which sensors have failed. The theory underlying the new approach is developed. Finally, a recommendation is made to use the Lyapunov-based approach to complement the capability of Livingstone and to use this combination in the RWGS process.
LYAPUNOV-Based Sensor Failure Detection and Recovery for the Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Haralambous, Michael G.
2002-01-01
Livingstone, a model-based AI software system, is planned for use in the autonomous fault diagnosis, reconfiguration, and control of the oxygen-producing reverse water gas shift (RWGS) process test-bed located in the Applied Chemistry Laboratory at KSC. In this report the RWGS process is first briefly described and an overview of Livingstone is given. Next, a Lyapunov-based approach for detecting and recovering from sensor failures, differing significantly from that used by Livingstone, is presented. In this new method, models used are in terms of the defining differential equations of system components, thus differing from the qualitative, static models used by Livingstone. An easily computed scalar inequality constraint, expressed in terms of sensed system variables, is used to determine the existence of sensor failures. In the event of sensor failure, an observer/estimator is used for determining which sensors have failed. The theory underlying the new approach is developed. Finally, a recommendation is made to use the Lyapunov-based approach to complement the capability of Livingstone and to use this combination in the RWGS process.
NASA Astrophysics Data System (ADS)
Kenefic, L.; Morton, E.; Bilek, S.
2017-12-01
It is well known that subduction zones create the largest earthquakes in the world, like the magnitude 9.5 Chile earthquake in 1960 or the more recent magnitude 9.1 Japan earthquake in 2011, both of which are among the top five largest earthquakes ever recorded. However, off the coast of the Pacific Northwest region of the U.S., the Cascadia subduction zone (CSZ) remains relatively quiet, and modern seismic instruments have not recorded earthquakes of this size in the CSZ. The last great earthquake, a magnitude 8.7-9.2, occurred in 1700 and is constrained by written reports of the resultant tsunami in Japan and by dating of a drowned forest in the U.S. Previous studies have suggested the margin is most likely segmented along-strike. However, variations in frictional conditions in the CSZ fault zone are not well known. Geodetic modeling indicates that the locked seismogenic zone is likely completely offshore, which may be too far from land seismometers to adequately detect related seismicity. Ocean bottom seismometers, as part of the Cascadia Initiative Amphibious Network, were installed directly above the inferred seismogenic zone, which we use to better detect small interplate seismicity. Using the subspace detection method, this study seeks to find new seismogenic zone earthquakes. The subspace detection method uses multiple previously known event templates concurrently to scan through continuous seismic data. Template events that make up the subspace are chosen from events in existing catalogs that likely occurred along the plate interface. Corresponding waveforms are windowed on the nearby Cascadia Initiative ocean bottom seismometers and coastal land seismometers for scanning. Detections found by the scan are similar to the template waveforms above a predefined threshold. Detections are then visually examined to determine if an event is present. The presence of repeating event clusters can indicate persistent seismic patches, likely corresponding to areas of stronger coupling. This work will ultimately improve the understanding of CSZ fault zone heterogeneity. Preliminary results indicate 96 possible new events between August 2, 2013 and July 1, 2014 for four target clusters off the coast of northern Oregon.
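A minimal sketch of the subspace-detection idea, assuming templates are aligned, fixed-length waveform windows: an orthonormal basis is built from the templates via SVD, and a sliding window is declared a detection when the fraction of its energy captured by the subspace exceeds a threshold. Function names and the threshold are illustrative, not the catalog-building workflow used in the study.

```python
import numpy as np

def build_subspace(templates, rank):
    """Orthonormal basis spanning the dominant directions of aligned templates."""
    T = np.vstack([t / np.linalg.norm(t) for t in templates])  # rows = templates
    _, _, vt = np.linalg.svd(T, full_matrices=False)
    return vt[:rank]                                            # (rank, nsamp) basis

def subspace_scan(data, basis, threshold=0.7):
    """Slide over continuous data; return sample indices where the detection
    statistic (fraction of window energy lying in the subspace) exceeds threshold."""
    nsamp = basis.shape[1]
    hits = []
    for i in range(len(data) - nsamp):
        w = data[i:i + nsamp]
        energy = w @ w
        if energy == 0:
            continue
        stat = np.sum((basis @ w) ** 2) / energy
        if stat > threshold:
            hits.append(i)
    return hits
```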
Multimodal Event Detection in Twitter Hashtag Networks
Yilmaz, Yasin; Hero, Alfred O.
2016-07-01
In this study, event detection in a multimodal Twitter dataset is considered. We treat the hashtags in the dataset as instances with two modes: text and geolocation features. The text feature consists of a bag-of-words representation. The geolocation feature consists of geotags (i.e., geographical coordinates) of the tweets. Fusing the multimodal data we aim to detect, in terms of topic and geolocation, the interesting events and the associated hashtags. To this end, a generative latent variable model is assumed, and a generalized expectation-maximization (EM) algorithm is derived to learn the model parameters. The proposed method is computationally efficient, and lends itself to big datasets. Lastly, experimental results on a Twitter dataset from August 2014 show the efficacy of the proposed method.
Near Real-Time Optimal Prediction of Adverse Events in Aviation Data
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander; Das, Santanu
2010-01-01
The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we demonstrate how to recast the anomaly prediction problem into a form whose solution is accessible as a level-crossing prediction problem. The level-crossing prediction problem has an elegant, optimal, yet untested solution under certain technical constraints, and only when the appropriate modeling assumptions are made. As such, we will thoroughly investigate the resilience of these modeling assumptions, and show how they affect final performance. Finally, the predictive capability of this method will be assessed by quantitative means, using both validation and test data containing anomalies or adverse events from real aviation data sets that have previously been identified as operationally significant by domain experts. It will be shown that the formulation proposed yields a lower false alarm rate on average than competing methods based on similarly advanced concepts, and a higher correct detection rate than a standard method based upon exceedances that is commonly used for prediction.
Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin
2006-12-01
We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity, or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.
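A minimal sketch of one common signal-based alternative of this kind, assuming a shank sagittal angular-velocity signal: mid-swing appears as a prominent positive peak, and IC and EC are picked as the nearest negative peaks after and before it. The peak-picking parameters are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def gait_events_from_shank_gyro(omega, fs):
    """Rough IC/EC timing (s) from shank sagittal angular velocity (rad/s)."""
    # Mid-swing: large positive peaks, at most one per ~0.6 s.
    swing_peaks, _ = find_peaks(omega, height=1.0, distance=int(0.6 * fs))
    # IC / EC candidates: negative peaks of the signal.
    neg_peaks, _ = find_peaks(-omega, height=0.2)
    ic, ec = [], []
    for p in swing_peaks:
        after = neg_peaks[neg_peaks > p]
        before = neg_peaks[neg_peaks < p]
        if after.size:
            ic.append(after[0] / fs)    # first negative peak after mid-swing ~ IC
        if before.size:
            ec.append(before[-1] / fs)  # last negative peak before mid-swing ~ EC
    return ic, ec
```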
Picking vs Waveform based detection and location methods for induced seismicity monitoring
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2017-04-01
Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas production, mining operations, and geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by many events with short inter-event times, which can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification, and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison in terms of performance between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at the local scale. Although recent applications have proved promising, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections to one or many potential earthquake sources. On the other hand, SCAUTOLOC is a more "conventional" method and is the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for regional and teleseismic applications, thus its performance with microseismic data might be limited. We analyze the performance of the three methodologies for a synthetic dataset with realistic noise conditions as well as for the first hour of continuous waveform data, including the Ml 3.5 St. Gallen earthquake, recorded by a microseismic network deployed in the area. We finally compare the results obtained with all three methods against a manually revised catalogue.
Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data
NASA Astrophysics Data System (ADS)
Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.
2017-12-01
Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, flood, and drought. In particular, the copula has been used as a useful tool for multivariate probability modeling because it places no restriction on the choice of marginal distributions. A time series of rainfall can be partitioned into rainfall events using an inter-event time definition (IETD), and each rainfall event is characterized by a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently due to climate change, and trend detection for rainfall events is important to determine whether the data are nonstationary. Using the rainfall depth and duration of each rainfall event, trend detection and nonstationary bivariate frequency analysis are performed in this study. Over 30 years of hourly data recorded at 62 Korea Meteorological Administration (KMA) stations are used, and the suitability of the nonstationary copula for rainfall events is examined with a goodness-of-fit test.
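A minimal sketch of separating an hourly rainfall series into events with an inter-event time definition and extracting the depth/duration pair used in the bivariate analysis; the 6 h IETD and the wet-hour threshold are illustrative assumptions.

```python
import numpy as np

def rainfall_events(hourly_rain, ietd_hours=6, wet_threshold=0.1):
    """Split an hourly rainfall series (mm) into events separated by at least
    `ietd_hours` consecutive dry hours; return (depth, duration) per event."""
    events, start, dry = [], None, 0
    for t, r in enumerate(hourly_rain):
        if r > wet_threshold:
            if start is None:
                start = t          # a new event begins
            dry = 0
        elif start is not None:
            dry += 1
            if dry >= ietd_hours:  # dry spell long enough: close the event
                end = t - dry + 1
                events.append((float(np.sum(hourly_rain[start:end])), end - start))
                start, dry = None, 0
    if start is not None:          # close a trailing event at the end of the record
        end = len(hourly_rain) - dry
        events.append((float(np.sum(hourly_rain[start:end])), end - start))
    return events
```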
Bioluminescence-based system for rapid detection of natural transformation.
Santala, Ville; Karp, Matti; Santala, Suvi
2016-07-01
Horizontal gene transfer plays a significant role in bacterial evolution and has major clinical importance. Thus, it is vital to understand the mechanisms and kinetics of genetic transformations. Natural transformation is the driving mechanism for horizontal gene transfer in diverse genera of bacteria. Our study introduces a simple and rapid method for the investigation of natural transformation. This highly sensitive system allows the detection of a transformation event directly from a bacterial population without any separation step or selection of cells. The system is based on the bacterial luciferase operon from Photorhabdus luminescens. The studied molecular tools consist of the functional modules luxCDE and luxAB, which involve a replicative plasmid and an integrative gene cassette. A well-established host for bacterial genetic investigations, Acinetobacter baylyi ADP1, is used as the model bacterium. We show that natural transformation followed by homologous recombination or plasmid recircularization can be readily detected in both actively growing and static biofilm-like cultures, including very rare transformation events. The system allows the detection of natural transformation within 1 h of introducing sample DNA into the culture. The introduced method provides a convenient means to study the kinetics of natural transformation under variable conditions and perturbations.
Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity
Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.
2015-01-01
Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951
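For reference, the item-level Content Validity Index is commonly computed as the proportion of experts rating an item as relevant (3 or 4 on a 4-point scale); with the 10-member panel described above, an item endorsed by 9 experts would score as below, using this standard definition (the paper may use a variant):

```latex
\mathrm{I\text{-}CVI} \;=\; \frac{\#\{\text{experts rating the item 3 or 4}\}}{N_{\text{experts}}}
\;=\; \frac{9}{10} \;=\; 0.90 .
```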
Propulsion Health Monitoring for Enhanced Safety
NASA Technical Reports Server (NTRS)
Butz, Mark G.; Rodriguez, Hector M.
2003-01-01
This report presents the results of the NASA contract Propulsion System Health Management for Enhanced Safety performed by General Electric Aircraft Engines (GE AE), General Electric Global Research (GE GR), and Pennsylvania State University Applied Research Laboratory (PSU ARL) under the NASA Aviation Safety Program. This activity supports the overall goal of enhanced civil aviation safety through a reduction in the occurrence of safety-significant propulsion system malfunctions. Specific objectives are to develop and demonstrate vibration diagnostics techniques for the on-line detection of turbine rotor disk cracks, and model-based fault tolerant control techniques for the prevention and mitigation of in-flight engine shutdown, surge/stall, and flameout events. The disk crack detection work was performed by GE GR which focused on a radial-mode vibration monitoring technique, and PSU ARL which focused on a torsional-mode vibration monitoring technique. GE AE performed the Model-Based Fault Tolerant Control work which focused on the development of analytical techniques for detecting, isolating, and accommodating gas-path faults.
Polepalli Ramesh, Balaji; Belknap, Steven M; Li, Zuofeng; Frid, Nadya; West, Dennis P
2014-01-01
Background The Food and Drug Administration's (FDA) Adverse Event Reporting System (FAERS) is a repository of spontaneously-reported adverse drug events (ADEs) for FDA-approved prescription drugs. FAERS reports include both structured reports and unstructured narratives. The narratives often include essential information for evaluation of the severity, causality, and description of ADEs that is not present in the structured data. The timely identification of unknown toxicities of prescription drugs is an important, unsolved problem. Objective The objective of this study was to develop an annotated corpus of FAERS narratives and a biomedical named entity tagger to automatically identify ADE-related information in the FAERS narratives. Methods We developed an annotation guideline and annotated medication information and adverse event related entities in 122 FAERS narratives comprising approximately 23,000 word tokens. A named entity tagger using supervised machine learning approaches was built for detecting medication information and adverse event entities using various categories of features. Results The annotated corpus had an agreement of over 0.9 Cohen's kappa for medication and adverse event entities. The best performing tagger achieves an overall performance of 0.73 F1 score for detection of medication, adverse event and other named entities. Conclusions In this study, we developed an annotated corpus of FAERS narratives and machine learning based models for automatically extracting medication and adverse event information from the FAERS narratives. Our study is an important step towards enriching the FAERS data for postmarketing pharmacovigilance. PMID:25600332
Pick- and waveform-based techniques for real-time detection of induced seismicity
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2018-05-01
The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production or mining and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step, while the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and we compare its performance with two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches with both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project in the vicinity of the city of St. Gallen, Switzerland.
Piezoelectric-based self-powered electronic adjustable impulse switches
NASA Astrophysics Data System (ADS)
Rastegar, Jahangir; Kwok, Philip
2018-03-01
Novel piezoelectric-based self-powered impulse detecting switches are presented. The switches are designed to detect shock loading events resulting in acceleration or deceleration above prescribed levels and durations. The prescribed acceleration level and duration thresholds are adjustable. They are provided with false trigger protection logic. The impulse switches are provided with electronic and logic circuitry to detect prescribed impulse events and reject events such as high amplitude but short duration shocks, and transportation vibration and similar low amplitude and relatively long duration events. They can be mounted directly onto electronics circuit boards, thereby significantly simplifying the electrical and electronic circuitry, simplifying the assembly process and total cost, significantly reducing the occupied volume, and in some applications eliminating the need for physical wiring to and from the impulse switches. The design of prototypes and testing under realistic conditions are presented.
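A minimal sketch of the amplitude-plus-duration discrimination logic described above, assuming a sampled acceleration signal: an event is accepted only if acceleration stays above the prescribed level for at least the prescribed duration, so short shocks and low-amplitude vibration are rejected. The threshold values are illustrative assumptions.

```python
def impulse_switch(accel_samples, fs, level_g=500.0, min_duration_ms=1.0):
    """Return True if |acceleration| exceeds `level_g` continuously for at least
    `min_duration_ms`; shorter spikes and low-level vibration are ignored."""
    needed = int(min_duration_ms * 1e-3 * fs)   # required consecutive samples
    run = 0
    for a in accel_samples:
        run = run + 1 if abs(a) >= level_g else 0
        if run >= needed:
            return True     # prescribed impulse detected
    return False
```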
Method of controlling cyclic variation in engine combustion
Davis, L.I. Jr.; Daw, C.S.; Feldkamp, L.A.; Hoard, J.W.; Yuan, F.; Connolly, F.T.
1999-07-13
Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling. 27 figs.
Method of controlling cyclic variation in engine combustion
Davis, Jr., Leighton Ira; Daw, Charles Stuart; Feldkamp, Lee Albert; Hoard, John William; Yuan, Fumin; Connolly, Francis Thomas
1999-01-01
Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling.
A multidisciplinary approach to trace Asian dust storms from source to sink
NASA Astrophysics Data System (ADS)
Yan, Yan; Sun, Youbin; Ma, Long; Long, Xin
2015-03-01
Tracing the source of dust storms (DS) in mega-cities of northern China currently suffers ambiguities arising from different approaches, including source-sink proxy comparison, air mass back trajectory modeling, and satellite image monitoring. By integrating the advantages of all three methods, we present a multidisciplinary approach to trace the provenance of dust fall in Xi'an during the spring season (March to May) of 2012. We collected daily dust fall to calculate dust flux variation, and detected eight DS events with remarkably high flux values based on meteorological comparison and an extreme detection algorithm. By combining MODIS images and accompanying real-time air mass back trajectories, we attribute four of them as natural DS events and the other four as anthropogenic DS events, suggesting the importance of natural and anthropogenic processes in supplying long-range transported dust. The primary sources of these DS events were constrained to three possible areas, including the northern Chinese deserts, Taklimakan desert, and Gurbantunggut desert. Proxy comparisons based upon the quartz crystallinity index and oxygen isotope further confirmed the source-to-sink linkage between the natural DS events in Xi'an and the dust emissions from the northern Chinese deserts. The integration of geochemical and meteorological tracing approaches favors the dominant contribution of short-distance transportation of modern dust fall on the Chinese Loess Plateau. Our study shows that the multidisciplinary approach could permit better source identification of modern dust and should be applied properly for tracing the provenance fluctuations of geological dust deposits.
NASA Astrophysics Data System (ADS)
Fluixá-Sanmartín, Javier; Pan, Deng; Fischer, Luzia; Orlowsky, Boris; García-Hernández, Javier; Jordan, Frédéric; Haemmig, Christoph; Zhang, Fangwei; Xu, Jijun
2018-02-01
Drought indices based on precipitation are commonly used to identify and characterize droughts. Due to the general complexity of droughts, the comparison of index-identified events with droughts at different levels of the complete system, including soil humidity or river discharges, relies typically on model simulations of the latter, entailing potentially significant uncertainties. The present study explores the potential of using precipitation-based indices to reproduce observed droughts in the lower part of the Jinsha River basin (JRB), proposing an innovative approach for a catchment-wide drought detection and characterization. Two indicators, namely the Overall Drought Extension (ODE) and the Overall Drought Indicator (ODI), have been defined. These indicators aim at identifying and characterizing drought events on the basin scale, using results from four meteorological drought indices (standardized precipitation index, SPI; rainfall anomaly index, RAI; percent of normal precipitation, PN; deciles, DEC) calculated at different locations of the basin and for different timescales. Collected historical information on drought events is used to contrast results obtained with the indicators. This method has been successfully applied to the lower Jinsha River basin in China, a region prone to frequent and severe droughts. Historical drought events that occurred from 1960 to 2014 have been compiled and cataloged from different sources, in a challenging process. The analysis of the indicators shows a good agreement with the recorded historical drought events on the basin scale. It has been found that the timescale that best reproduces observed events across all the indices is the 6-month timescale.
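A minimal sketch of computing the standardized precipitation index (one of the four indices listed above) at a chosen timescale: aggregate precipitation over a rolling window, fit a gamma distribution to the aggregated totals, and map their cumulative probabilities to standard-normal quantiles. The zero-precipitation handling and fitting details are simplified relative to operational SPI implementations.

```python
import numpy as np
from scipy import stats

def spi(monthly_precip, scale=6):
    """Simplified SPI at a given timescale (months) for a 1-D precipitation series."""
    p = np.asarray(monthly_precip, dtype=float)
    agg = np.convolve(p, np.ones(scale), mode="valid")     # rolling sums over `scale` months
    a, _, b = stats.gamma.fit(agg[agg > 0], floc=0)        # gamma fit, location fixed at 0
    cdf = stats.gamma.cdf(agg, a, loc=0, scale=b)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))    # standard-normal quantiles
```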
Rapid Landslide Mapping by Means of Post-Event Polarimetric SAR Imagery
NASA Astrophysics Data System (ADS)
Plank, Simon; Martinis, Sandro; Twele, Andre
2016-08-01
Rapid mapping of landslides, quickly providing information about the extent of the affected area and the type and grade of damage, is crucial to enable fast crisis response. A review of the literature shows that most synthetic aperture radar (SAR) data-based landslide mapping procedures use change detection techniques. However, the required very high resolution (VHR) pre-event SAR imagery, acquired shortly before the landslide event, is commonly not available. Due to limitations in onboard disk space and downlink transmission rates, modern VHR SAR missions do not systematically cover the entire world. We present a fast and robust procedure for mapping of landslides, based on change detection between freely available and systematically acquired pre-event optical and post-event polarimetric SAR data.
Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini
2013-01-01
Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182
Semantic Concept Discovery for Large Scale Zero Shot Event Detection
2015-07-25
sources and can be shared among many different events, including unseen ones. Based on this idea, events can be detected by inspecting the individual ... 2013]. Partial success along this vein has also been achieved in the zero-shot setting, e.g. [Habibian et al., 2014; Wu et al., 2014], but the ... “candle”, “birthday cake” and “applauding”. Since concepts are shared among many different classes (events) and each concept classifier can be trained
Stability of model-based event-triggered control systems: a separation property
NASA Astrophysics Data System (ADS)
Hao, Fei; Yu, Hao
2017-04-01
To save communication resources, this paper investigates model-based event-triggered control systems. Two main problems are considered. One is, for a given plant and model, to design event conditions that guarantee the stability of the system. The other is to consider the effect of the model matrices on stability. The results show that the closed-loop systems can be asymptotically stabilised with any model matrices in compact sets if the parameters in the event conditions are within the designed ranges. Then, a separation property of model-based event-triggered control is proposed. Namely, the design of the controller gain and the event condition can be separated from the selection of the model matrices. Based on this property, an adaptation mechanism is introduced to the model-based event-triggered control systems, which can further improve the sampling performance. Finally, a numerical example is given to show the efficiency and feasibility of the developed results.
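A minimal sketch of a model-based event-triggered loop, assuming a standard relative-threshold event condition (transmit the measurement only when the model/plant mismatch grows beyond a fraction of the state norm); the plant, model matrices, gain, and sigma are illustrative and the paper's exact event condition may differ.

```python
import numpy as np

# Illustrative discrete-time plant x+ = A x + B u and a (mismatched) model Am, Bm.
A  = np.array([[1.0, 0.1], [0.0, 0.90]]); B  = np.array([[0.0], [0.1]])
Am = np.array([[1.0, 0.1], [0.0, 0.85]]); Bm = B.copy()
K  = np.array([[0.5, 2.0]])               # assumed stabilising state-feedback gain
sigma = 0.2                               # relative event-triggering threshold

x  = np.array([[1.0], [0.0]])             # plant state
xm = x.copy()                             # model state used by the controller
events = 0
for k in range(100):
    u = -K @ xm                           # control computed from the model state
    x = A @ x + B @ u                     # plant update
    xm = Am @ xm + Bm @ u                 # model update between events
    if np.linalg.norm(x - xm) > sigma * np.linalg.norm(x):
        xm = x.copy()                     # event: transmit measurement, reset model
        events += 1
print(events, np.linalg.norm(x))          # number of transmissions, final state norm
```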
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), when the offender follows the victim, or when he or she interacts with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos that are to be observed.
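A minimal sketch of the rule-based single-track classification step, assuming a track given as timestamped positions in metres; speed and displacement statistics are thresholded to label walk/run/stop/loiter. The thresholds are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

def classify_track(xy, t):
    """Label a single track (N x 2 positions in metres, timestamps in seconds)."""
    v = np.linalg.norm(np.diff(xy, axis=0), axis=1) / np.diff(t)  # speeds (m/s)
    mean_v = v.mean()
    net_disp = np.linalg.norm(xy[-1] - xy[0])
    if mean_v < 0.2:
        return "stop"
    if mean_v > 2.5:
        return "run"
    # Moving for a while but ending close to where it started: loitering.
    if net_disp < 2.0 and (t[-1] - t[0]) > 30.0:
        return "loiter"
    return "walk"
```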
NASA Astrophysics Data System (ADS)
Butkowski, Łukasz; Vogel, Vladimir; Schlarb, Holger; Szabatin, Jerzy
2017-06-01
The driving engine of the superconducting accelerator of the European X-ray free electron laser (XFEL) is a set of 27 radio frequency (RF) stations. Each of the underground RF stations consists of a multibeam horizontal klystron that can provide up to 10 MW of power at 1.3 GHz. Klystrons are sensitive devices with a limited lifetime and a high mean time between failures. In real operation, the lifetime of the tube can be significantly reduced because of failures. The special fast protection klystron lifetime management (KLM) system has been developed to minimize the influence of service conditions on the lifetime of klystrons. The main task of this system is to detect all events which can destroy the tube as quickly as possible, and switch off the driving RF signal or the high voltage. Detection of events is based on a comparison of the value of the real signal obtained at the system output with the value estimated on the basis of a high-power RF amplifier model and input signals. The KLM system has been realized in field-programmable gate array (FPGA) and implemented in XFEL. Implementation is based on the standard low-level RF micro telecommunications computing architecture (MTCA.4 or xTCA). The main part of the paper focuses on an estimation of the klystron model and the implementation of KLM in FPGA. The results of the performance of the KLM system will also be presented.
Detection and attribution of climate extremes in the observed record
Easterling, David R.; Kunkel, Kenneth E.; Wehner, Michael F.; ...
2016-01-18
We present an overview of practices and challenges related to the detection and attribution of observed changes in climate extremes. Detection is the identification of a statistically significant change in the extreme values of a climate variable over some period of time. Issues in detection discussed include data quality, coverage, and completeness. Attribution takes that detection of a change and uses climate model simulations to evaluate whether a cause can be assigned to that change. Additionally, we discuss a newer field of attribution, event attribution, where individual extreme events are analyzed for the express purpose of assigning some measure of whether that event was directly influenced by anthropogenic forcing of the climate system.
NASA Astrophysics Data System (ADS)
Tsai, F.; Lai, J. S.; Chiang, S. H.
2015-12-01
Landslides are frequently triggered by typhoons and earthquakes in Taiwan, causing serious economic losses and human casualties. Remotely sensed images and geo-spatial data consisting of land-cover and environmental information have been widely used for producing landslide inventories and causative factors for slope stability analysis. Landslide susceptibility, on the other hand, can represent the spatial likelihood of landslide occurrence and is an important basis for landslide risk assessment. As multi-temporal satellite images become popular and affordable, they are commonly used to generate landslide inventories for subsequent analysis. However, it is usually difficult to distinguish different landslide sub-regions (scarp, debris flow, deposition, etc.) directly from remote sensing imagery. Consequently, the landslide extents extracted by image-based visual interpretation and automatic detection may contain many depositions that reduce the fidelity of the landslide susceptibility model. This study developed an empirical thresholding scheme based on terrain characteristics for eliminating depositions from detected landslide areas to improve landslide susceptibility modeling. In this study, a Bayesian network classifier is utilized to build a landslide susceptibility model and to predict subsequent rainfall-induced shallow landslides in the Shimen reservoir watershed located in northern Taiwan. Eleven causative factors are considered, including terrain slope, aspect, curvature, elevation, geology, land-use, NDVI, soil, and distance to fault, river and road. Landslide areas detected using satellite images acquired before and after eight typhoons between 2004 and 2008 are collected as the main inventory for training and verification. In the analysis, previous landslide events are used as training data to predict the samples of the next event. The results are then compared with recorded landslide areas in the inventory to evaluate the accuracy. Experimental results demonstrate that the accuracies of landslide susceptibility analysis in all sequential predictions improved significantly after eliminating landslide depositions.
A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events
NASA Astrophysics Data System (ADS)
Laurenza, M.; Alberti, T.; Cliver, E. W.
2018-04-01
The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.
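For clarity, the quoted skill scores follow the usual contingency-table definitions; with the counts reported above for the ≥S2 predictions:

```latex
\mathrm{POD} = \frac{\text{hits}}{\text{hits} + \text{misses}} = \frac{41}{55} \approx 0.75,
\qquad
\mathrm{FAR} = \frac{\text{false alarms}}{\text{hits} + \text{false alarms}} = \frac{13}{54} \approx 0.24 .
```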
Multi-Sensor Data Fusion Project
2000-02-28
seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The ... between underwater explosions (H), underground sources, mostly earthquake-generated (7), and noise detections (N). The phases classified as H are the only ... processing for infrasound sensors is most similar to seismic array processing with the exception that the detections are based on a more sophisticated
Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus
2010-01-01
Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
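A minimal sketch of the retrospective threshold-crossing detection the passage describes, assuming a leaky integrate-and-fire neuron integrated on a fixed time grid: after each step the membrane potential is simply compared with the threshold, which is algorithmically much simpler than predicting future crossings. Parameters are illustrative.

```python
import numpy as np

def lif_time_driven(input_current, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Grid-based (time-driven) LIF simulation; spike times found retrospectively."""
    v, spikes = 0.0, []
    for i, I in enumerate(input_current):
        v += dt * (-v / tau + I)      # forward-Euler membrane update
        if v >= v_th:                 # threshold crossing detected after the step
            spikes.append(i * dt)
            v = v_reset
    return spikes

# e.g. constant drive: lif_time_driven(np.full(1000, 0.15))
```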
Detecting Seismic Events Using a Supervised Hidden Markov Model
NASA Astrophysics Data System (ADS)
Burks, L.; Forrest, R.; Ray, J.; Young, C.
2017-12-01
We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which a transition occurs from noise to signal. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No.: SAND2017-8154 A
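For reference, a minimal sketch of the classical STA/LTA trigger used above both as the baseline detector and as an input feature for the HMM; the window lengths and trigger ratio are illustrative assumptions.

```python
import numpy as np

def sta_lta_trigger(trace, fs, sta_sec=1.0, lta_sec=30.0, ratio=4.0):
    """Return sample indices where the short-term / long-term average ratio of the
    squared trace exceeds `ratio` (a classical single-channel trigger)."""
    x2 = np.asarray(trace, dtype=float) ** 2
    nsta, nlta = int(sta_sec * fs), int(lta_sec * fs)
    sta = np.convolve(x2, np.ones(nsta) / nsta, mode="same")  # short-term average
    lta = np.convolve(x2, np.ones(nlta) / nlta, mode="same")  # long-term average
    return np.where(sta / (lta + 1e-20) > ratio)[0]
```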
Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, L.G.; Norman, P.I.; Leadbeater, T.W.
Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event by event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
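A minimal sketch of applying the two ideal dead-time models mentioned above to a simulated pulse train: in the non-paralysable model a pulse arriving within the dead time is simply lost, while in the paralysable (extending) model it is lost and also restarts the dead period. The Poisson source rate and dead-time value are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, t_dead, t_total = 1e5, 1e-6, 1.0            # 100 kcps source, 1 us dead time
times = np.cumsum(rng.exponential(1.0 / rate, int(rate * t_total)))

def nonparalysable(times, tau):
    kept, t_last = 0, -np.inf
    for t in times:
        if t - t_last >= tau:                     # counted; dead time starts here
            kept += 1
            t_last = t
    return kept

def paralysable(times, tau):
    kept, t_prev = 0, -np.inf
    for t in times:
        if t - t_prev >= tau:                     # counted only if no pulse within tau
            kept += 1
        t_prev = t                                # every pulse extends the dead period
    return kept

print(len(times), nonparalysable(times, t_dead), paralysable(times, t_dead))
```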
Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H
2012-04-27
To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the bias between the observed and true values of three samples was lower than the acceptance criterion (<25%) of the GMO detection method. These results indicate that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
Single- and Dual-Process Models of Biased Contingency Detection.
Vadillo, Miguel A; Blanco, Fernando; Yarritu, Ion; Matute, Helena
2016-01-01
Decades of research in causal and contingency learning show that people's estimations of the degree of contingency between two events are easily biased by the relative probabilities of those two events. If two events co-occur frequently, then people tend to overestimate the strength of the contingency between them. Traditionally, these biases have been explained in terms of relatively simple single-process models of learning and reasoning. However, more recently some authors have found that these biases do not appear in all dependent variables and have proposed dual-process models to explain these dissociations between variables. In the present paper we review the evidence for dissociations supporting dual-process models and we point out important shortcomings of this literature. Some dissociations seem to be difficult to replicate or poorly generalizable and others can be attributed to methodological artifacts. Overall, we conclude that support for dual-process models of biased contingency detection is scarce and inconclusive.
NASA Astrophysics Data System (ADS)
Versini, P.-A.; Gaume, E.; Andrieu, H.
2010-04-01
This paper presents an initial prototype of a distributed hydrological model used to map possible road inundations in a region frequently exposed to severe flash floods: the Gard region (South of France). The prototype has been tested in a pseudo real-time mode on five recent flash flood events for which actual road inundations have been inventoried. The results are promising: close to 100% probability of detection of actual inundations, with inundations detected before they were reported by the road management field teams and a false alarm ratio not exceeding 30%. This specific case study differs from the standard applications of rainfall-runoff models to produce flood forecasts, which focus on a single or a limited number of gauged river cross sections. It illustrates that, despite their lack of accuracy, hydro-meteorological forecasts based on rainfall-runoff models, especially distributed models, contain valuable information for flood event management. The possible consequences of landslides, debris flows and local erosion processes, sometimes associated with flash floods, were not considered at this stage of development of the prototype. They are limited in the Gard region but should be taken into account in future developments of the approach to implement it efficiently in other areas more exposed to these phenomena, such as the Alpine area.
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
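A minimal sketch of the shallow part of the feature set described above: bag-of-words features from a small window around each event trigger, fed to a maximum-entropy (logistic regression) classifier. The window width follows the text (3-4 tokens); the tiny training set is, of course, illustrative.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def window_features(tokens, trigger_idx, width=3):
    """Bag-of-words features in a +/- `width` token window around the trigger."""
    lo, hi = max(0, trigger_idx - width), min(len(tokens), trigger_idx + width + 1)
    feats = {f"bow={tokens[i].lower()}": 1 for i in range(lo, hi) if i != trigger_idx}
    feats["trigger=" + tokens[trigger_idx].lower()] = 1
    return feats

# Illustrative training instances: (tokenised sentence, trigger index, label).
data = [("analysis of IkappaBalpha phosphorylation".split(), 3, "speculated"),
        ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negated"),
        ("IkappaBalpha phosphorylation was observed".split(), 1, "asserted")]

vec = DictVectorizer()
X = vec.fit_transform([window_features(toks, i) for toks, i, _ in data])
y = [label for _, _, label in data]
clf = LogisticRegression(max_iter=1000).fit(X, y)   # MaxEnt-style classifier
```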
NASA Astrophysics Data System (ADS)
CHU, Q.; Xu, Z.; Zhuo, L.; Han, D.
2016-12-01
Increased requirements for interaction between different disciplines and ready access to a numerical weather forecasting system featuring portability and extensibility have contributed to the growth of downstream WRF model users in recent years. For these users, a knowledge base organized by representative events would be very helpful, because the determination of model settings is one of the most important steps in WRF. However, such a process is generally time-consuming, even on a high-performance computing platform. We therefore propose a sharable lookup table of proper WRF domain settings and corresponding procedures, based on a representative torrential rainfall event in Beijing, China. It has been found that the drift of WRF simulations away from the input lateral boundary conditions can be significantly reduced by adjusting the domain settings. Among all the impact factors, the placement of the nested domain affects not only the moving speed and angle of the storm center, but also the location and extent of the heavy-rain belt, which can only be detected with adjusted spatial resolutions. Spin-up time is also considered in the model settings and is demonstrated to have the most obvious influence on the accuracy of the simulations; the spatial distributions of precipitation vary widely, with the amount of heavy rain ranging from -30% to 58% among the experiments. After following all the procedures, the variations of domain settings have minimal effect on the modeling and show the best correlation (larger than 0.65) with fusion observations. The model settings, including a domain size covering the greater Beijing area, a 1:5:5 downscaling ratio, 57 vertical levels with a model top of 50 hPa, and a 60 h spin-up time, are found suitable for predicting similar convective torrential rainfall events in the Beijing area. We hope that the procedure for building the community WRF knowledge base presented in this paper will be helpful to peer researchers and operational communities by saving them from repeating each other's work. More importantly, results from studying different events and locations could enrich this community knowledge base to benefit WRF users around the world.
Chen, Gong; Qi, Peng; Guo, Zhao; Yu, Haoyong
2017-06-01
In the field of gait rehabilitation robotics, achieving human-robot synchronization is very important. In this paper, a novel human-robot synchronization method using gait event information is proposed. This method includes two steps. First, seven gait events in one gait cycle are detected in real time with a hidden Markov model; second, an adaptive oscillator is utilized to estimate the stride percentage of human gait using any one of the gait events. Synchronous reference trajectories for the robot are then generated with the estimated stride percentage. This method is based on a bioinspired adaptive oscillator, which is a mathematical tool, first proposed to explain the phenomenon of synchronous flashing among fireflies. The proposed synchronization method is implemented in a portable knee-ankle-foot robot and tested in 15 healthy subjects. This method has the advantages of simple structure, flexible selection of gait events, and fast adaptation. Gait event is the only information needed, and hence the performance of synchronization holds when an abnormal gait pattern is involved. The results of the experiments reveal that our approach is efficient in achieving human-robot synchronization and feasible for rehabilitation robotics application.
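A minimal sketch of the second step, assuming a simple phase-locked estimator rather than the paper's exact adaptive oscillator: the oscillator advances its phase at the current stride-frequency estimate and, whenever a chosen gait event is detected, corrects phase and frequency toward that event's nominal position in the cycle. Gains and the nominal event phase are illustrative assumptions.

```python
class StridePhaseEstimator:
    """Estimate stride percentage from discrete gait events (e.g. heel strikes)."""
    def __init__(self, f0=1.0, k_phi=0.5, k_f=0.1):
        self.phase, self.freq = 0.0, f0     # phase in [0, 1), frequency in Hz
        self.k_phi, self.k_f = k_phi, k_f   # phase / frequency correction gains

    def step(self, dt, event=False, event_phase=0.0):
        self.phase = (self.phase + self.freq * dt) % 1.0
        if event:                           # detected gait event with known nominal phase
            err = (event_phase - self.phase + 0.5) % 1.0 - 0.5   # wrapped phase error
            self.phase = (self.phase + self.k_phi * err) % 1.0
            self.freq += self.k_f * err     # slowly adapt the stride frequency
        return 100.0 * self.phase           # stride percentage for the robot trajectory
```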
Bongiorno, Christian; Miccichè, Salvatore; Mantegna, Rosario N
2017-01-01
We present an agent based model of the Air Traffic Management socio-technical complex system aiming at modeling the interactions between aircraft and air traffic controllers at a tactical level. The core of the model is given by the conflict detection and resolution module and by the directs module. Directs are flight shortcuts that are given by air controllers to speed up the passage of an aircraft within a certain airspace and therefore to facilitate airline operations. Conflicts between flight trajectories can occur for two main reasons: either the planning of the flight trajectory was not sufficiently detailed to rule out all potential conflicts or unforeseen events during the flight require modifications of the flight plan that can conflict with other flight trajectories. Our model performs a local conflict detection and resolution procedure. Once a flight trajectory has been made conflict-free, the model searches for possible improvements of the system efficiency by issuing directs. We give an example of model calibration based on real data. We then provide an illustration of the capability of our model in generating scenario simulations able to give insights about the air traffic management system. We show that the calibrated model is able to reproduce the existence of a geographical localization of air traffic controllers' operations. Finally, we use the model to investigate the relationship between directs and conflict resolutions (i) in the presence of perfect forecast ability of controllers, and (ii) in the presence of some degree of uncertainty in flight trajectory forecast.
Bongiorno, Christian; Mantegna, Rosario N.
2017-01-01
We present an agent based model of the Air Traffic Management socio-technical complex system aiming at modeling the interactions between aircraft and air traffic controllers at a tactical level. The core of the model is given by the conflict detection and resolution module and by the directs module. Directs are flight shortcuts that are given by air controllers to speed up the passage of an aircraft within a certain airspace and therefore to facilitate airline operations. Conflicts between flight trajectories can occur for two main reasons: either the planning of the flight trajectory was not sufficiently detailed to rule out all potential conflicts or unforeseen events during the flight require modifications of the flight plan that can conflict with other flight trajectories. Our model performs a local conflict detection and resolution procedure. Once a flight trajectory has been made conflict-free, the model searches for possible improvements of the system efficiency by issuing directs. We give an example of model calibration based on real data. We then provide an illustration of the capability of our model in generating scenario simulations able to give insights about the air traffic management system. We show that the calibrated model is able to reproduce the existence of a geographical localization of air traffic controllers’ operations. Finally, we use the model to investigate the relationship between directs and conflict resolutions (i) in the presence of perfect forecast ability of controllers, and (ii) in the presence of some degree of uncertainty in flight trajectory forecast. PMID:28419160
Ontology-based knowledge management for personalized adverse drug events detection.
Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue
2011-01-01
Since adverse drug events (ADEs) have become a leading cause of death around the world, there is a high demand for tools that help clinicians and patients identify possible hazards of drug effects. Motivated by this, we present a personalized ADE detection system, focusing on ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during detection. Specifically, we define an ADE ontology to uniformly manage ADE knowledge from multiple sources, and we exploit the rich semantics of the SNOMED-CT terminology for ADE detection via semantic query and reasoning.
NASA Astrophysics Data System (ADS)
Morton, E.; Bilek, S. L.; Rowe, C. A.
2017-12-01
Understanding the spatial extent and behavior of the interplate contact in the Cascadia Subduction Zone (CSZ) may prove pivotal to preparation for future great earthquakes, such as the M9 event of 1700. Current and historic seismic catalogs are limited by their short duration, given the recurrence rate of great earthquakes, and by their rather high magnitude of completeness for the interplate seismic zone, which lies far offshore of land-based networks. This issue is addressed via the 2011-2015 Cascadia Initiative (CI) amphibious seismic array deployment, which combined coastal land seismometers with more than 60 ocean-bottom seismometers (OBS) situated directly above the presumed plate interface. We search the CI dataset for small, previously undetected interplate earthquakes to identify seismic patches on the megathrust, using an automated subspace detection method. Our subspace comprises eigenvectors derived from CI OBS and on-land waveforms extracted for existing catalog events that appear to have occurred on the plate interface. Previous work focused on analysis of two repeating event clusters off the coast of Oregon spanning all 4 years of deployment. Here we extend those results, applying detection and location analysis to the entire CSZ margin during the first year of CI deployment, with more than 200 new events detected in the central portion of the margin. Template events used for subspace scanning primarily occurred beneath the land surface along the coast, at the downdip edge of modeled high slip patches for the 1700 event, with most concentrated at the northwestern edge of the Olympic Peninsula.
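As an illustration of the subspace-detection idea described above, the sketch below builds an orthonormal basis from a few aligned template waveforms via the SVD and scans a continuous trace, using the fraction of window energy captured by that basis as the detection statistic. All waveforms, the subspace dimension and the threshold are synthetic assumptions, not values from the study.

```python
# Sketch of subspace detection on a single synthetic trace: the basis is built
# from aligned template waveforms, and the statistic is the fraction of window
# energy captured by the subspace. Dimensions and thresholds are illustrative.
import numpy as np

def build_subspace(templates, dim=2):
    """templates: (n_templates, n_samples) aligned waveforms."""
    T = templates - templates.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(T.T, full_matrices=False)   # columns span the sample space
    return U[:, :dim]                                    # orthonormal basis (n_samples, dim)

def subspace_detector(trace, basis, threshold=0.6):
    n = basis.shape[0]
    stats = np.zeros(len(trace) - n + 1)
    for i in range(len(stats)):
        w = trace[i:i + n] - trace[i:i + n].mean()
        energy = np.dot(w, w)
        proj = basis.T @ w
        stats[i] = np.dot(proj, proj) / energy if energy > 0 else 0.0
    return stats, np.flatnonzero(stats > threshold)

rng = np.random.default_rng(0)
wavelet = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
templates = np.vstack([wavelet + 0.1 * rng.standard_normal(100) for _ in range(5)])
basis = build_subspace(templates, dim=2)

trace = 0.2 * rng.standard_normal(2000)
trace[1200:1300] += wavelet                              # one hidden event
stats, hits = subspace_detector(trace, basis)
print("peak statistic", round(float(stats.max()), 2), "at sample", int(stats.argmax()))
```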
Boxwala, Aziz A; Kim, Jihoon; Grillo, Janice M; Ohno-Machado, Lucila
2011-01-01
To determine whether statistical and machine-learning methods, when applied to electronic health record (EHR) access data, could help identify suspicious (ie, potentially inappropriate) access to EHRs. From EHR access logs and other organizational data collected over a 2-month period, the authors extracted 26 features likely to be useful in detecting suspicious accesses. Selected events were marked as either suspicious or appropriate by privacy officers, and served as the gold standard set for model evaluation. The authors trained logistic regression (LR) and support vector machine (SVM) models on 10-fold cross-validation sets of 1291 labeled events. The authors evaluated the sensitivity of final models on an external set of 58 events that were identified as truly inappropriate and investigated independently from this study using standard operating procedures. The area under the receiver operating characteristic curve of the models on the whole data set of 1291 events was 0.91 for LR, and 0.95 for SVM. The sensitivity of the baseline model on this set was 0.8. When the final models were evaluated on the set of 58 investigated events, all of which were determined as truly inappropriate, the sensitivity was 0 for the baseline method, 0.76 for LR, and 0.79 for SVM. The LR and SVM models may not generalize because of interinstitutional differences in organizational structures, applications, and workflows. Nevertheless, our approach for constructing the models using statistical and machine-learning techniques can be generalized. An important limitation is the relatively small sample used for the training set due to the effort required for its construction. The results suggest that statistical and machine-learning methods can play an important role in helping privacy officers detect suspicious accesses to EHRs.
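A hedged sketch of the modeling step follows: logistic regression and a support vector machine evaluated with 10-fold cross-validation on labeled access events, using scikit-learn. The feature matrix and labels are synthetic stand-ins; the study's 26 engineered features are not reproduced here.

```python
# Sketch of the modeling step with synthetic stand-in data: 1291 labeled events,
# 26 features, logistic regression and an SVM scored by 10-fold cross-validated AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.standard_normal((1291, 26))                      # 26 access-log features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(1291) > 1.5).astype(int)  # "suspicious"

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, model in [("LR", LogisticRegression(max_iter=1000)), ("SVM", SVC(kernel="rbf"))]:
    pipe = make_pipeline(StandardScaler(), model)
    auc = cross_val_score(pipe, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: mean cross-validated AUC = {auc.mean():.2f}")
```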
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated and its significance used to detect outliers. A contamination alarm is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
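The following sketch illustrates the described chain on synthetic ultraviolet spectra: a coiflet discrete wavelet transform of each spectrum (using PyWavelets, assumed available), principal component analysis fitted on baseline spectra, Hotelling's T² against a chi-square control limit, and an alarm after several consecutive outliers as a simplification of the sequential Bayesian step. All data, wavelet settings and thresholds are illustrative assumptions.

```python
# Sketch of the detection chain on synthetic UV spectra: coiflet DWT features,
# PCA fitted on baseline spectra, Hotelling's T^2 against a chi-square limit,
# and an alarm after three consecutive outliers (a simplification of the
# sequential Bayesian step). Requires PyWavelets and scikit-learn.
import numpy as np
import pywt
from scipy.stats import chi2
from sklearn.decomposition import PCA

def dwt_features(spectrum, wavelet="coif2", level=3):
    return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

rng = np.random.default_rng(1)
wl = np.linspace(200, 400, 256)                                   # wavelength grid (nm)
baseline = [np.exp(-(wl - 280) ** 2 / 800) + 0.01 * rng.standard_normal(wl.size)
            for _ in range(200)]                                   # normal ambience spectra
features = np.array([dwt_features(s) for s in baseline])
pca = PCA(n_components=5).fit(features)
limit = chi2.ppf(0.99, df=5)                                       # 99% control limit

def is_outlier(spectrum):
    z = pca.transform(dwt_features(spectrum)[None, :])
    t2 = np.sum(z ** 2 / pca.explained_variance_, axis=1)          # Hotelling's T^2
    return t2[0] > limit

consecutive, alarm = 0, False
for k in range(10):                                                # streaming observations
    spec = np.exp(-(wl - 280) ** 2 / 800) + 0.01 * rng.standard_normal(wl.size)
    if k >= 5:
        spec += 0.2 * np.exp(-(wl - 330) ** 2 / 200)               # simulated contaminant peak
    consecutive = consecutive + 1 if is_outlier(spec) else 0
    if consecutive >= 3:
        alarm = True
print("contamination alarm:", alarm)
```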
Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.
Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon
2016-01-01
Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.
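A minimal Monte Carlo sketch of the comparison follows: coincidence counts between pairs of independent spike trains whose inter-spike intervals are either exponential (Poisson) or gamma distributed (a refractory-like renewal process), showing how the auto-structure changes the spread of the coincidence distribution. Rates, window and gamma shape are illustrative, and this is not the paper's exact non-renewal model.

```python
# Monte Carlo sketch: coincidence counts between independent spike trains with
# exponential (Poisson) versus gamma (refractory-like) inter-spike intervals.
import numpy as np

def spike_train(rate, duration, shape, rng):
    """Renewal train with gamma ISIs; shape=1 reproduces a Poisson process."""
    isis = rng.gamma(shape, 1.0 / (rate * shape), size=int(rate * duration * 3))
    t = np.cumsum(isis)
    return t[t < duration]

def coincidences(t1, t2, window=0.005):
    """Spikes of train 1 with at least one train-2 spike within +/- window [s]."""
    idx = np.searchsorted(t2, t1)
    left = np.abs(t1 - t2[np.clip(idx - 1, 0, len(t2) - 1)]) <= window
    right = np.abs(t1 - t2[np.clip(idx, 0, len(t2) - 1)]) <= window
    return int(np.sum(left | right))

rng = np.random.default_rng(0)
for shape, label in [(1.0, "Poisson"), (4.0, "gamma (more regular)")]:
    counts = [coincidences(spike_train(20, 10, shape, rng),
                           spike_train(20, 10, shape, rng)) for _ in range(2000)]
    print(f"{label:22s} mean={np.mean(counts):5.1f}  std={np.std(counts):4.2f}")
```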
Feasibility of blocking detection in observations from radio occultation
NASA Astrophysics Data System (ADS)
Brunner, Lukas; Steiner, Andrea Karin; Scherllin-Pirscher, Barbara; Jury, Martin
2015-04-01
Blocking describes an atmospheric situation in which the climatological westerly flow at mid latitudes is weakened or reversed. This is caused by a persistent high pressure system which can be stationary for several days to weeks. In the Northern Hemisphere blocking preferentially occurs over the Atlantic/European and Pacific regions. In recent years blocking has been under close scientific investigation due to its effect on weather extremes, triggering heat waves in summer and cold spells in winter. So far, the scientific literature has mainly focused on the investigation of blocking in reanalysis and global climate model data sets. However, blocking is underestimated in most climate models due to the small-scale processes involved in its evolution. For the detection of blocking, the most commonly applied methods are based on the computation of meridional geopotential height gradients at the 500 hPa level. Therefore measurements with adequate vertical, horizontal, and temporal resolution and coverage are required. We use an observational data set based on Global Positioning System (GPS) Radio Occultation (RO) measurements fulfilling these requirements. RO is a relatively new, satellite-based remote sensing technique delivering profiles of atmospheric parameters such as geopotential height, pressure, and temperature. It is characterized by favorable properties such as long-term stability, global coverage, and high vertical resolution. Our data set is based on the most recent WEGC RO retrieval. Here we report on a feasibility study for blocking detection and analysis in RO data for two exemplary blocking events: the blocking over Russia in summer 2010 and the blocking over Greenland in late winter 2013. For these two events about 700 RO measurements per day are available in the Northern Hemisphere. We will show that the measurement density and quality of RO observations are favorable for blocking analysis and can therefore contribute to blocking research.
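For concreteness, a minimal gradient-based blocking test in the style of the Tibaldi-Molteni index is sketched below for a single longitude of 500 hPa geopotential height; the RO-specific processing is omitted and the height profile is synthetic.

```python
# Gradient-based blocking test (Tibaldi-Molteni style) at a single longitude of
# 500 hPa geopotential height; the synthetic profile and thresholds are illustrative.
import numpy as np

def blocked(z500, lats, phi0=60.0, phi_s=40.0, phi_n=80.0):
    """Blocked if the southern gradient reverses (GHGS > 0) and the northern
    gradient is strongly negative (GHGN < -10 m per degree latitude)."""
    i0, i_s, i_n = (int(np.argmin(np.abs(lats - p))) for p in (phi0, phi_s, phi_n))
    ghgs = (z500[i0] - z500[i_s]) / (lats[i0] - lats[i_s])
    ghgn = (z500[i_n] - z500[i0]) / (lats[i_n] - lats[i0])
    return ghgs > 0.0 and ghgn < -10.0

lats = np.arange(30.0, 85.0, 2.5)
z500 = 5600.0 - 8.0 * (lats - 30.0)                         # climatological poleward decrease
z500 = z500 + 250.0 * np.exp(-((lats - 60.0) / 6.0) ** 2)   # blocking anticyclone near 60 N
print("blocking detected:", blocked(z500, lats))
```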
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor combined with a classification method. The histogram of optical flow orientation descriptor is detailed as a means of describing the movement information of the global video frame or of the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
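A hedged sketch of the core idea follows: a magnitude-weighted histogram of optical flow orientations per frame and a one-class support vector machine trained only on normal frames (the kernel PCA step is omitted). The flow fields below are synthetic arrays standing in for real optical flow estimates.

```python
# Sketch: magnitude-weighted histogram of optical flow orientations per frame and
# a one-class SVM trained on normal frames only; flow fields here are synthetic.
import numpy as np
from sklearn.svm import OneClassSVM

def hof_descriptor(flow_u, flow_v, n_bins=8, mag_threshold=0.5):
    mag = np.hypot(flow_u, flow_v)
    ang = np.arctan2(flow_v, flow_u) % (2 * np.pi)
    mask = mag > mag_threshold
    hist, _ = np.histogram(ang[mask], bins=n_bins, range=(0, 2 * np.pi), weights=mag[mask])
    total = hist.sum()
    return hist / total if total > 0 else hist

rng = np.random.default_rng(3)
def normal_flow():    # coherent rightward motion plus small noise
    return rng.normal(1.0, 0.2, (60, 80)), rng.normal(0.0, 0.2, (60, 80))
def abnormal_flow():  # fast erratic motion in all directions
    return rng.normal(0.0, 3.0, (60, 80)), rng.normal(0.0, 3.0, (60, 80))

X_train = np.array([hof_descriptor(*normal_flow()) for _ in range(200)])
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)

X_test = np.array([hof_descriptor(*normal_flow()) for _ in range(5)] +
                  [hof_descriptor(*abnormal_flow()) for _ in range(5)])
print("predictions (+1 normal, -1 abnormal):", clf.predict(X_test))
```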
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
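The sketch below illustrates the two stages on synthetic traces: a classic STA/LTA energy ratio per trace, followed by a measure of similarity of those ratio series across traces (here a windowed mean pairwise correlation) so that weak but coherent arrivals stand out. Window lengths, rates and the wavelet are illustrative assumptions, not the study's parameters.

```python
# Sketch: STA/LTA energy ratios per trace, then the mean pairwise correlation of
# those ratio series in sliding windows as a cross-trace similarity measure.
import numpy as np

def sta_lta(trace, nsta=20, nlta=200):
    energy = trace ** 2
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)

def coherence_series(ratios, win=100, step=50):
    """Mean pairwise correlation of the STA/LTA series across traces per window."""
    n_tr, n = ratios.shape
    centers, coh = [], []
    for start in range(0, n - win, step):
        seg = ratios[:, start:start + win]
        c = np.corrcoef(seg)[np.triu_indices(n_tr, k=1)]
        centers.append(start + win // 2)
        coh.append(c.mean())
    return np.array(centers), np.array(coh)

rng = np.random.default_rng(7)
wavelet = np.sin(2 * np.pi * np.arange(40) / 10.0) * np.hanning(40)
traces = []
for k in range(6):                                   # six surface receivers
    tr = rng.standard_normal(3000)
    tr[1500 + 3 * k:1540 + 3 * k] += 1.2 * wavelet   # weak event with small moveout
    traces.append(tr)

ratios = np.array([sta_lta(tr) for tr in traces])
centers, coh = coherence_series(ratios)
print("most coherent window centered near sample:", int(centers[np.argmax(coh)]))
```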
Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes
Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun
2014-01-01
We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the commonly used low-level feature descriptors in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain the information of a single event. Second, the LNND descriptor is a compact representation and its dimensionality is typically much lower than that of low-level feature descriptors. Therefore, not only can computation time and storage requirements be saved by using the LNND descriptor for anomaly detection methods with an offline training fashion, but the negative aspects caused by using high-dimensional feature descriptors can also be avoided. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worth noticing that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, yet achieves comparable or even better performance. PMID:25105164
Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S
2018-04-01
The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete events microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++), and more efficient algorithms, along with hardware advances, have increased program efficiency permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.
Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F; Valafar, Homayoun
2017-12-13
Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. This study aimed to examine the feasibility of detecting smoking behavior using smartwatches. The second aim of this study was to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, for which they logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model was successful in detecting 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from the participants that did not adhere to study protocols, the true positive detection rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, resulting in a 2.8% false positive rate. Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self-reported behaviors and may not allow for targeted data collection and communications around smoking events.
Thermal wake/vessel detection technique
Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM
2012-01-10
A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
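A hedged sketch of the described pipeline follows: flag pixels that are anomalously cold relative to their local neighborhood, group contiguous flagged pixels with scipy.ndimage, and screen each cluster with a simple elongation test standing in for the patent's shape analysis. The synthetic scene and all thresholds are illustrative assumptions.

```python
# Sketch of the pipeline: local thermal anomaly mask, connected-component grouping
# with scipy.ndimage, and a simple elongation test as a stand-in for wake shape
# analysis. The sea-surface scene and thresholds are synthetic assumptions.
import numpy as np
from scipy import ndimage

def thermal_anomaly_mask(thermal, box=51, k=3.0):
    """Flag pixels more than k local standard deviations below the local mean."""
    local_mean = ndimage.uniform_filter(thermal, size=box)
    local_sq = ndimage.uniform_filter(thermal ** 2, size=box)
    local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 1e-12))
    return thermal < local_mean - k * local_std

def candidate_wakes(mask, min_pixels=20, min_elongation=3.0):
    labels, _ = ndimage.label(mask)
    events = []
    for region in ndimage.find_objects(labels):
        ys, xs = region
        h, w = ys.stop - ys.start, xs.stop - xs.start
        area = int((labels[region] > 0).sum())          # approximate cluster size
        elongation = max(h, w) / max(min(h, w), 1)
        if area >= min_pixels and elongation >= min_elongation:
            events.append(region)
    return events

rng = np.random.default_rng(5)
sst = 290.0 + 0.05 * rng.standard_normal((200, 200))    # sea-surface temperature [K]
sst[100:103, 40:140] -= 1.0                             # elongated cool wake
print("possible vessel detection events:", len(candidate_wakes(thermal_anomaly_mask(sst))))
```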
Concept and Analysis of a Satellite for Space-Based Radio Detection of Ultra-High Energy Cosmic Rays
NASA Astrophysics Data System (ADS)
Romero-Wolf, Andrew; Gorham, P.; Booth, J.; Chen, P.; Duren, R. M.; Liewer, K.; Nam, J.; Saltzberg, D.; Schoorlemmer, H.; Wissel, S.; Zairfian, P.
2014-01-01
We present a concept for on-orbit radio detection of ultra-high energy cosmic rays (UHECRs) that has the potential to provide collection rates of ~100 events per year for energies above 10^20 eV. The synoptic wideband orbiting radio detector (SWORD) mission's high event statistics at these energies, combined with the pointing capabilities of a space-borne antenna array, could enable charged particle astronomy. The detector concept is based on ANITA's successful detection of UHECRs, in which the geosynchrotron radio signal produced by the extensive air shower is reflected off the Earth's surface and detected in flight.
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W.; Benz, H.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social media-based event detections, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
Measuring pesticides in surface waters - continuous versus event-based sampling design
NASA Astrophysics Data System (ADS)
Eyring, J.; Bach, M.; Frede, H.-G.
2009-04-01
Monitoring pesticides in surface waters is still a work- and cost-intensive procedure. Therefore, studies are normally carried out with a low monitoring frequency or with only a small selection of substances to be analyzed. In this case, it is not possible to picture the high temporal variability of pesticide concentrations, which depends on application dates, weather conditions, cropping seasons and other factors. In 2007 the Institute of Landscape Ecology and Resource Management at Giessen University implemented a monitoring program during two pesticide application periods, aiming to produce a detailed dataset of pesticide concentrations for a wide range of substances that would also be suitable for the evaluation of catchment-scale pesticide exposure models. The Weida catchment in Thuringia (Eastern Germany) was selected as study area due to the availability of detailed pesticide application data for this region. The samples were taken from the river Weida at the gauge Zeulenroda, where it flows into a drinking water reservoir. The catchment area is 102 km², 67% of which is in agricultural use, the main crops being winter wheat, maize, winter barley and winter rape. Dominant soil texture classes are loamy sand and loamy silt. About one third of the agricultural area is drained. The sampling was carried out in cooperation with the water supply agency of Thuringia (Fernwasserversorgung Thueringen). The sample analysis was done by the Institute of Environmental Research at Dortmund University. Two sampling schemes were carried out using two automatic samplers: continuous sampling with composite samples bottled two times per week, and event-based sampling triggered by a discharge threshold. 53 samples from continuous sampling were collected. 19 discharge events were sampled with 45 individual samples (one to six per event). 34 pesticides and two metabolites were analyzed. 21 compounds were detected, nine of which had concentrations above the drinking water limit (0.1 µg/l). Pesticide loads were calculated separately from continuous and event-based samples. Only three pesticides dominated the total load: the herbicides metazachlor, terbuthylazine and quinmerac, together accounting for 75% of the total load. This result seems plausible given that these three substances are applied in the highest amounts in the Weida catchment. The highest loads of single pesticides were observed during or shortly after their application period, mostly accompanied by larger discharge events. They can be explained as surface runoff and drainage inputs from treated fields, since spray-drift inputs would be detected during the application periods without dependency on discharge events, and inputs from point sources are usually independent of discharge as well. Annual loads calculated from continuous samples were mostly higher than those from event-based samples because they represent a much longer time period. On the other hand, the highest concentrations were found in the event-based samples; in many cases they were double the maximum concentrations of the continuous samples. The monitoring study presented shows that different sampling strategies lead to different results and can answer different questions. If the intention is to detect maximum concentrations caused by surface runoff or drainage inputs, e.g. to assess the resulting risk to the aquatic community, the event-based sampling method can be recommended.
If one is rather interested in calculating annual pesticide loads and assessing which fractions of applied amounts finally enter the surface water network, continuous sampling is advisable. The dataset of continuous and event-based pesticide concentrations offers the possibility to evaluate and improve pesticide exposure models at the catchment scale. Further work is scheduled on this issue.
Emergence of self and other in perception and action: an event-control approach.
Jordan, J Scott
2003-12-01
The present paper analyzes the regularities referred to via the concept 'self.' This is important, for cognitive science traditionally models the self as a cognitive mediator between perceptual inputs and behavioral outputs. This leads to the assertion that the self causes action. Recent findings in social psychology indicate this is not the case and, as a consequence, certain cognitive scientists model the self as being epiphenomenal. In contrast, the present paper proposes an alternative approach (i.e., the event-control approach) that is based on recently discovered regularities between perception and action. Specifically, these regularities indicate that perception and action planning utilize common neural resources. This leads to a coupling of perception, planning, and action in which the first two constitute aspects of a single system (i.e., the distal-event system) that is able to pre-specify and detect distal events. This distal-event system is then coupled with action (i.e., effector-control systems) in a constraining, as opposed to 'causal' manner. This model has implications for how we conceptualize the manner in which one infers the intentions of another, anticipates the intentions of another, and possibly even experiences another. In conclusion, it is argued that it may be possible to map the concept 'self' onto the regularities referred to in the event-control model, not in order to reify 'the self' as a causal mechanism, but to demonstrate its status as a useful concept that refers to regularities that are part of the natural order.
Christian, Kira A; Iuliano, A Danielle; Uyeki, Timothy M; Mintz, Eric D; Nichol, Stuart T; Rollin, Pierre; Staples, J Erin; Arthur, Ray R
To better track public health events in areas where the public health system is unable or unwilling to report the event to appropriate public health authorities, agencies can conduct event-based surveillance, which is defined as the organized collection, monitoring, assessment, and interpretation of unstructured information regarding public health events that may represent an acute risk to public health. The US Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Operations Center (GDDOC) was created in 2007 to serve as CDC's platform dedicated to conducting worldwide event-based surveillance, which is now highlighted as part of the "detect" element of the Global Health Security Agenda (GHSA). The GHSA works toward making the world safer and more secure from disease threats by building capacity to better "Prevent, Detect, and Respond" to those threats. The GDDOC monitors approximately 30 to 40 public health events each day. In this article, we describe the top threats to public health monitored during 2012 to 2016: avian influenza, cholera, Ebola virus disease, and the vector-borne diseases yellow fever, chikungunya virus, and Zika virus, with updates to the previously described threats from Middle East respiratory syndrome-coronavirus (MERS-CoV) and poliomyelitis.
Hydrological Retrospective of floods and droughts: Case study in the Amazon
NASA Astrophysics Data System (ADS)
Wongchuig Correa, Sly; Cauduro Dias de Paiva, Rodrigo; Carlo Espinoza Villar, Jhan; Collischonn, Walter
2017-04-01
Recent studies have reported an increase in the intensity and frequency of hydrological extreme events in many regions of the Amazon basin over recent decades; events such as seasonal floods and droughts have had a significant impact on human and natural systems. Recently, methodologies such as climate reanalysis have been developed to build a coherent record of the climate system. Following this notion, this research develops a methodology called Hydrological Retrospective (HR), which essentially runs large rainfall datasets through hydrological models in order to build a record of past hydrology, enabling the analysis of past floods and droughts. We applied the methodology to the Amazon basin, using eight large precipitation datasets (more than 30 years) as input to a large-scale hydrological and hydrodynamic model (MGB-IPH). HR products were then validated against several in situ discharge gauges spread throughout the Amazon basin, with a focus on maximum and minimum events. For the HR products that performed best according to performance metrics, we assessed the skill of HR in detecting floods and droughts relative to in situ observations. Furthermore, a statistical trend analysis was performed on the intensity of seasonal floods and droughts over the whole Amazon basin. Results indicate that the best-performing HR products represented well most past extreme events registered by in situ observations and were also consistent with many events reported in the literature; we therefore consider it viable to use some large precipitation datasets, mainly climate reanalyses based on land-surface components and merged products, to represent past regional hydrology and seasonal hydrological extreme events. In addition, an increasing trend in intensity was found for maximum annual discharges (related to floods) in northwestern regions and for minimum annual discharges (related to droughts) in central-southern regions of the Amazon basin, features previously detected by other researchers. For the basin as a whole, we estimated an upward trend in maximum annual discharges of the Amazon River. Given the global coverage of rainfall datasets, HR could be used to understand the occurrence of past extreme events in many regions and thereby better anticipate future hydrological behavior and its impacts on society.
Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong
2017-07-05
Identifying safety signals by manual review of individual reports in large surveillance databases is time consuming, and such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. Therefore, it may be expected that reporting rates of adverse events following flu vaccines (number of reports for a specific vaccine-event combination/number of reports for all vaccine-event combinations) may vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events between 1990 and 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with types of vaccine, groups of adverse events and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. We showed that it had high statistical power to detect the variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates across years suggested potential safety issues due to changes in vaccines, which require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity of reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of reporting rates across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigations.
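As a simplified illustration of the year-to-year comparison, the sketch below applies a plain chi-square homogeneity test to the reporting rate of one vaccine-event combination across years; the paper's random effects model is not reproduced, and the counts are synthetic.

```python
# Simplified year-to-year comparison: chi-square homogeneity test of the reporting
# rate of one vaccine-event combination across reporting years (synthetic counts;
# not the random effects model proposed in the paper).
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(11)
years = np.arange(1990, 2014)
total_reports = rng.integers(2000, 5000, size=years.size)   # all reports per year
rate = np.where(years < 2005, 0.010, 0.018)                 # reporting rate shifts in 2005
event_reports = rng.binomial(total_reports, rate)           # reports for one combination

table = np.vstack([event_reports, total_reports - event_reports])
stat, p_value, dof, _ = chi2_contingency(table)
print(f"chi-square = {stat:.1f}, dof = {dof}, p = {p_value:.2e}")
print("heterogeneous across years" if p_value < 0.01 else "no evidence of heterogeneity")
```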
Shang, Ying; Xu, Wentao; Wang, Yong; Xu, Yuancong; Huang, Kunlun
2017-12-15
This study described a novel multiplex qualitative detection method using pyrosequencing. Based on the principle of universal primer multiplex PCR, only one sequencing primer was employed to detect the multiple targets. Samples containing three genetically modified (GM) crops in different proportions were used to validate the method. The dNTP dispensing order was designed based on the product sequences. Only 12 rounds (ATCTGATCGACT) of dNTP addition, and often as few as three rounds (CAT) under ideal conditions, were required to detect the GM events qualitatively, and the sensitivity was as low as 1% of a mixture. However, when considering a mixture, calculating the signal values allowed the proportion of each GM event to be estimated. Based on these results, we concluded that our novel method not only achieved qualitative detection but also allowed semi-quantitative estimation of individual events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, Chantell Lynne-Marie
Traditional nuclear materials accounting does not work well for safeguards when applied to pyroprocessing. Alternate methods such as Signature Based Safeguards (SBS) are being investigated. The goal of SBS is real-time/near-real-time detection of anomalous events in the pyroprocessing facility as they could indicate loss of special nuclear material. In high-throughput reprocessing facilities, metric tons of separated material are processed that must be accounted for. Even with very low uncertainties of accountancy measurements (<0.1%) the uncertainty of the material balances is still greater than the desired level. Novel contributions of this work are as follows: (1) significant enhancement of SBS development for the salt cleanup process by creating a new gas sparging process model, selecting sensors to monitor normal operation, identifying safeguards-significant off-normal scenarios, and simulating those off-normal events and generating sensor output; (2) further enhancement of SBS development for the electrorefiner by simulating off-normal events caused by changes in salt concentration and identifying which conditions lead to Pu and Cm not tracking throughout the rest of the system; and (3) new contribution in applying statistical techniques to analyze the signatures gained from these two models to help draw real-time conclusions on anomalous events.
Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for the four crops' 12 events: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can serve as efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time.
Pies, Ross E.
2016-03-29
A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier based on the detection of disturbances within the optical fiber.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petiteau, Antoine; Babak, Stanislav; Sesana, Alberto
Gravitational wave (GW) signals from coalescing massive black hole (MBH) binaries could be used as standard sirens to measure cosmological parameters. The future space-based GW observatory Laser Interferometer Space Antenna (LISA) will detect up to a hundred of those events, providing very accurate measurements of their luminosity distances. To constrain the cosmological parameters, we also need to measure the redshift of the galaxy (or cluster of galaxies) hosting the merger. This requires the identification of a distinctive electromagnetic event associated with the binary coalescence. However, putative electromagnetic signatures may be too weak to be observed. Instead, we study here the possibility of constraining the cosmological parameters by enforcing statistical consistency between all the possible hosts detected within the measurement error box of a few dozen of low-redshift (z < 3) events. We construct MBH populations using merger tree realizations of the dark matter hierarchy in a ΛCDM universe, and we use data from the Millennium simulation to model the galaxy distribution in the LISA error box. We show that, assuming that all the other cosmological parameters are known, the parameter w describing the dark energy equation of state can be constrained to a 4%-8% level (2σ error), competitive with current uncertainties obtained by type Ia supernovae measurements, providing an independent test of our cosmological model.
A Method of Synchrophasor Technology for Detecting and Analyzing Cyber-Attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCann, Roy; Al-Sarray, Muthanna
Studying cybersecurity events and analyzing their impacts encourage planners and operators to develop innovative approaches for preventing attacks in order to avoid outages and other disruptions. This work considers two parts in security studies: detecting an integrity attack and examining its effects on power system generators. The detection was conducted by employing synchrophasor technology to provide authentication of ACG commands based on observed system operating characteristics. The examination of an attack is completed via a detailed simulation of a modified IEEE 68-bus benchmark model to show the associated power system dynamic response. The results of the simulation are discussed for assessing the impacts of cyber threats.
Failure detection and correction for turbofan engines
NASA Technical Reports Server (NTRS)
Corley, R. C.; Spang, H. A., III
1977-01-01
In this paper, a failure detection and correction strategy for turbofan engines is discussed. This strategy allows continuing control of the engines in the event of a sensor failure. An extended Kalman filter is used to provide the best estimate of the state of the engine based on currently available sensor outputs. Should a sensor failure occur, the control is based on the best estimate rather than the sensor output. The extended Kalman filter consists essentially of two parts: a nonlinear model of the engine and update logic which causes the model to track the actual engine. Details on the model and update logic are presented. To allow implementation, approximations are made to the feedback gain matrix which result in a single feedback matrix suitable for use over the entire flight envelope. The effect of these approximations on stability and response is discussed. Results from a detailed nonlinear simulation indicate that good control can be maintained even under multiple failures.
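The following sketch conveys the idea with a plain linear Kalman filter rather than the paper's extended Kalman filter on a nonlinear engine model: two redundant sensors are monitored through their innovations, a sensor with an abnormally large innovation is flagged as failed, and the update then relies on the remaining healthy sensor. System matrices, noise levels and the 4-sigma threshold are illustrative assumptions.

```python
# Linear Kalman-filter sketch of innovation-based sensor failure detection with
# fallback to the state estimate (the paper uses an extended Kalman filter on a
# nonlinear engine model; matrices, noise levels and thresholds are illustrative).
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.95]])        # simple two-state plant
H = np.array([[1.0, 0.0], [1.0, 0.0]])         # two redundant sensors on state 1
Q = 0.001 * np.eye(2)
R = np.diag([0.05, 0.05])

rng = np.random.default_rng(2)
x_true = np.array([0.0, 1.0])
x_est, P = np.zeros(2), np.eye(2)
failed = [False, False]

for k in range(200):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.normal(0.0, np.sqrt(np.diag(R)))
    if k >= 100:
        z[1] += 5.0                            # sensor 2 develops a large bias

    x_est = A @ x_est                          # predict
    P = A @ P @ A.T + Q
    innov = z - H @ x_est                      # innovation test per sensor
    S = H @ P @ H.T + R
    for i in range(2):
        if abs(innov[i]) > 4.0 * np.sqrt(S[i, i]):
            failed[i] = True
    healthy = [i for i in range(2) if not failed[i]]
    if healthy:                                # update with healthy sensors only
        Hk, Rk = H[healthy], R[np.ix_(healthy, healthy)]
        K = P @ Hk.T @ np.linalg.inv(Hk @ P @ Hk.T + Rk)
        x_est = x_est + K @ (z[healthy] - Hk @ x_est)
        P = (np.eye(2) - K @ Hk) @ P

print("sensors flagged as failed:", failed)    # sensor 2 should be flagged
print(f"final estimate {x_est[0]:.2f} vs true state {x_true[0]:.2f}")
```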
Real-time surveillance for abnormal events: the case of influenza outbreaks.
Rao, Yao; McCabe, Brendan
2016-06-15
This paper introduces a method of surveillance using deviations from probabilistic forecasts. Realised observations are compared with probabilistic forecasts, and the "deviation" metric is based on low probability events. If an alert is declared, the algorithm continues to monitor until an all-clear is announced. Specifically, this article addresses the problem of syndromic surveillance for influenza (flu) with the intention of detecting outbreaks, due to new strains of viruses, over and above the normal seasonal pattern. The syndrome is hospital admissions for flu-like illness, and hence, the data are low counts. In accordance with the count properties of the observations, an integer-valued autoregressive process is used to model flu occurrences. Monte Carlo evidence suggests the method works well in stylised but somewhat realistic situations. An application to real flu data indicates that the ideas may have promise. The model estimated on a short run of training data did not declare false alarms when used with new observations deemed in control, ex post. The model easily detected the 2009 H1N1 outbreak.
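A minimal sketch of count surveillance by probabilistic forecast deviation follows: a one-step-ahead predictive distribution from an INAR(1) model (binomial thinning of the previous count plus Poisson innovations) and an alert whenever the observed count falls in a low-probability upper tail. The parameters are assumed known for illustration rather than estimated from training data as in the paper.

```python
# Count surveillance sketch: one-step-ahead predictive tail probability under an
# INAR(1) model (binomial thinning plus Poisson innovations); an alert is raised
# when the observed count is improbably large. Parameters are assumed known.
import numpy as np
from scipy.stats import binom, poisson

def inar1_tail_prob(y_new, y_prev, alpha, lam):
    """P(Y_t >= y_new | Y_{t-1} = y_prev) under INAR(1)."""
    prob = 0.0
    for k in range(y_prev + 1):                        # survivors of binomial thinning
        prob += binom.pmf(k, y_prev, alpha) * poisson.sf(y_new - k - 1, lam)
    return prob

rng = np.random.default_rng(4)
alpha, lam = 0.4, 3.0                                  # thinning probability, innovation mean
y, alerts = 5, []
for t in range(60):
    innov = rng.poisson(lam) + (15 if t == 45 else 0)  # outbreak injected at t = 45
    y_new = rng.binomial(y, alpha) + innov
    if inar1_tail_prob(y_new, y, alpha, lam) < 0.01:
        alerts.append(t)
    y = y_new
print("alerts raised at time steps:", alerts)
```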
Whited, John D; Datta, Santanu K; Aiello, Lloyd M; Aiello, Lloyd P; Cavallerano, Jerry D; Conlin, Paul R; Horton, Mark B; Vigersky, Robert A; Poropatich, Ronald K; Challa, Pratap; Darkins, Adam W; Bursell, Sven-Erik
2005-12-01
The objective of this study was to compare, using a 12-month time frame, the cost-effectiveness of a non-mydriatic digital tele-ophthalmology system (Joslin Vision Network) versus traditional clinic-based ophthalmoscopy examinations with pupil dilation to detect proliferative diabetic retinopathy and its consequences. Decision analysis techniques, including Monte Carlo simulation, were used to model the use of the Joslin Vision Network versus conventional clinic-based ophthalmoscopy among the entire diabetic populations served by the Indian Health Service, the Department of Veterans Affairs, and the active duty Department of Defense. The economic perspective analyzed was that of each federal agency. Data sources for costs and outcomes included the published literature, epidemiologic data, administrative data, market prices, and expert opinion. Outcome measures included the number of true positive cases of proliferative diabetic retinopathy detected, the number of patients treated with panretinal laser photocoagulation, and the number of cases of severe vision loss averted. In the base-case analyses, the Joslin Vision Network was the dominant strategy in all but two of the nine modeled scenarios, meaning that it was both less costly and more effective. In the active duty Department of Defense population, the Joslin Vision Network would be more effective but cost an extra 1,618 dollars per additional patient treated with panretinal laser photocoagulation and an additional 13,748 dollars per severe vision loss event averted. Based on our economic model, the Joslin Vision Network has the potential to be more effective than clinic-based ophthalmoscopy for detecting proliferative diabetic retinopathy and averting cases of severe vision loss, and may do so at lower cost.
Systematic detection of seismic events at Mount St. Helens with an ultra-dense array
NASA Astrophysics Data System (ADS)
Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.
2016-12-01
During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intensive swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring the seismic velocity changes at Mount St. Helens using the coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
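For illustration, a single-channel matched-filter sketch is given below: normalized cross-correlation of one template against continuous data with a MAD-style detection threshold. Network-scale matched filtering, as used in the study, stacks normalized correlations over many stations and channels; the template, noise level and threshold here are synthetic assumptions.

```python
# Single-channel matched-filter sketch: normalized cross-correlation of a template
# against continuous data, with detections above a MAD-style threshold. Network
# matched filtering stacks such correlations over stations and channels.
import numpy as np

def normalized_xcorr(template, data):
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        std = w.std()
        cc[i] = np.dot(t, w - w.mean()) / std if std > 0 else 0.0
    return cc

rng = np.random.default_rng(9)
template = np.sin(2 * np.pi * np.arange(200) / 25.0) * np.hanning(200)
data = 0.5 * rng.standard_normal(20000)
for onset in (4000, 12000):                                   # two repeating events
    data[onset:onset + 200] += 0.8 * template

cc = normalized_xcorr(template, data)
threshold = 8 * np.median(np.abs(cc))
detections = np.flatnonzero(cc > threshold)
print("detections near samples:", int(detections.min()), "and", int(detections.max()),
      "| peak correlation:", round(float(cc.max()), 2))
```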
NASA Astrophysics Data System (ADS)
Touati, Sarah; Naylor, Mark; Main, Ian
2016-02-01
The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that the effectiveness of these tests varies widely. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved to be relatively ineffective at reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large events worldwide has increased in recent years.
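A compact stand-in for the model-comparison idea is sketched below: a constant-rate Poisson model versus a one-change-point Poisson model compared with information criteria (rather than the Bayes factor) on synthetic annual counts of large events.

```python
# Constant-rate Poisson model versus one-change-point Poisson model, compared by
# AIC on synthetic annual counts of large events (an information-criterion
# stand-in for the Bayes-factor comparison discussed above).
import numpy as np
from scipy.stats import poisson

def loglik_constant(counts):
    return poisson.logpmf(counts, counts.mean()).sum(), 1          # (log-likelihood, params)

def loglik_changepoint(counts):
    best, best_cp = -np.inf, None
    for cp in range(2, len(counts) - 2):
        ll = (poisson.logpmf(counts[:cp], counts[:cp].mean()).sum()
              + poisson.logpmf(counts[cp:], counts[cp:].mean()).sum())
        if ll > best:
            best, best_cp = ll, cp
    return best, 3, best_cp                                        # two rates + change point

rng = np.random.default_rng(8)
counts = np.concatenate([rng.poisson(1.0, 80), rng.poisson(2.5, 20)])  # rate increase at year 80

ll0, k0 = loglik_constant(counts)
ll1, k1, cp = loglik_changepoint(counts)
aic0, aic1 = 2 * k0 - 2 * ll0, 2 * k1 - 2 * ll1
print(f"constant-rate AIC = {aic0:.1f}, change-point AIC = {aic1:.1f} (change at index {cp})")
print("rate change supported" if aic1 + 2 < aic0 else "no strong evidence of a rate change")
```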
NASA Technical Reports Server (NTRS)
Bedka, Kristopher M.; Dworak, Richard; Brunner, Jason; Feltz, Wayne
2012-01-01
Two satellite infrared-based overshooting convective cloud-top (OT) detection methods have recently been described in the literature: 1) the 11-μm infrared window channel texture (IRW-texture) method, which uses IRW channel brightness temperature (BT) spatial gradients and thresholds, and 2) the water vapor minus IRW BT difference (WV-IRW BTD). While both methods show good performance in published case study examples, it is important to quantitatively validate these methods relative to overshooting top events across the globe. Unfortunately, no overshooting top database currently exists that could be used in such a study. This study examines National Aeronautics and Space Administration CloudSat Cloud Profiling Radar data to develop an OT detection validation database that is used to evaluate the IRW-texture and WV-IRW BTD OT detection methods. CloudSat data were manually examined over a 1.5-yr period to identify cases in which the cloud top penetrates above the tropopause height defined by a numerical weather prediction model and the surrounding cirrus anvil cloud top, producing 111 confirmed overshooting top events. When applied to Moderate Resolution Imaging Spectroradiometer (MODIS)-based Geostationary Operational Environmental Satellite-R Series (GOES-R) Advanced Baseline Imager proxy data, the IRW-texture (WV-IRW BTD) method offered a 76% (96%) probability of OT detection (POD) and 16% (81%) false-alarm ratio. Case study examples show that WV-IRW BTD > 0 K identifies much of the deep convective cloud top, while the IRW-texture method focuses only on regions with a spatial scale near that of commonly observed OTs. The POD decreases by 20% when IRW-texture is applied to current geostationary imager data, highlighting the importance of imager spatial resolution for observing and detecting OT regions.
Information-based self-organization of sensor nodes of a sensor network
Ko, Teresa H [Castro Valley, CA; Berry, Nina M [Tracy, CA
2011-09-20
A sensor node detects a plurality of information-based events. The sensor node determines whether at least one other sensor node is an information neighbor of the sensor node based on at least a portion of the plurality of information-based events. The information neighbor has an overlapping field of view with the sensor node. The sensor node sends at least one communication to the at least one other sensor node that is an information neighbor of the sensor node in response to at least one information-based event of the plurality of information-based events.
NASA Astrophysics Data System (ADS)
Ragettli, S.; Zhou, J.; Wang, H.; Liu, C.; Guo, L.
2017-12-01
Flash floods in small mountain catchments are one of the most frequent causes of loss of life and property from natural hazards in China. Hydrological models can be a useful tool for the anticipation of these events and the issuing of timely warnings. One of the main challenges of setting up such a system is finding appropriate model parameter values for ungauged catchments. Previous studies have shown that the transfer of parameter sets from hydrologically similar gauged catchments is one of the best performing regionalization methods. However, a remaining key issue is the identification of suitable descriptors of similarity. In this study, we use decision tree learning to explore parameter set transferability in the full space of catchment descriptors. For this purpose, a semi-distributed rainfall-runoff model is set up for 35 catchments in ten Chinese provinces. Hourly runoff data from a total of 858 storm events are used to calibrate the model and to evaluate the performance of parameter set transfers between catchments. We then present a novel technique that uses the splitting rules of classification and regression trees (CART) for finding suitable donor catchments for ungauged target catchments. The ability of the model to detect flood events in assumed ungauged catchments is evaluated in a series of leave-one-out tests. We show that CART analysis increases the probability of detecting 10-year flood events by up to 20% in comparison with a conventional measure of physiographic-climatic similarity. Decision tree learning can outperform other regionalization approaches because it generates rules that optimally consider spatial proximity and physical similarity. Spatial proximity can be used as a selection criterion but is skipped in the case where no similar gauged catchments are in the vicinity. We conclude that the CART regionalization concept is particularly suitable for implementation in sparsely gauged and topographically complex environments where a proximity-based regionalization concept is not applicable.
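A hedged sketch of the regionalization idea follows: a decision tree trained on catchment descriptors to predict whether parameter transfer works, whose splitting rules then delimit the pool of donor catchments for an ungauged target. Descriptors, labels and data below are synthetic stand-ins for the study's setup.

```python
# Decision-tree regionalization sketch: learn which descriptor combinations make
# parameter transfer work, print the splitting rules, and pick donors that share
# the target's leaf. Descriptors, labels and thresholds are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(6)
n = 200
descriptors = np.column_stack([
    rng.uniform(50, 3000, n),      # catchment area [km2]
    rng.uniform(200, 2000, n),     # mean annual precipitation [mm]
    rng.uniform(0.05, 0.6, n),     # mean slope [-]
])
# Assume transfers work when climate and slope are close to mid-range values.
transfer_ok = ((np.abs(descriptors[:, 1] - 1100) < 400) &
               (np.abs(descriptors[:, 2] - 0.3) < 0.15)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=10, random_state=0)
tree.fit(descriptors, transfer_ok)
print(export_text(tree, feature_names=["area_km2", "precip_mm", "slope"]))

# Candidate donors for an ungauged target: gauged catchments in the same leaf
# for which transfer is known to work.
target = np.array([[800.0, 1000.0, 0.35]])
leaf = tree.apply(target)[0]
donors = np.flatnonzero((tree.apply(descriptors) == leaf) & (transfer_ok == 1))
print("candidate donor catchments (indices):", donors[:10])
```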
Automatic violence detection in digital movies
NASA Astrophysics Data System (ADS)
Fischer, Stephan
1996-11-01
Research on computer-based recognition of violence is scant. We are working on the automatic recognition of violence in digital movies, a first step towards the goal of a computer-assisted system capable of protecting children against TV programs containing a great deal of violence. In the video domain, collision detection and model-mapping to locate human figures are run, while the creation and comparison of fingerprints to find certain events are run in the audio domain. This article centers on the recognition of fist-fights in the video domain and on the recognition of shots, explosions and cries in the audio domain.
Detecting event-based prospective memory cues occurring within and outside the focus of attention.
Hicks, Jason L; Cook, Gabriel I; Marsh, Richard L
2005-01-01
Event-based prospective memory cues are environmental stimuli that are associated with a previously established intention to perform an activity. Such cues traditionally have been placed in materials that receive focal attention during an ongoing activity. This article reports a direct comparison of event-based cues that occurred either within the focus of attention or at the periphery of such attention. When the cue occurred outside focal attention, manipulating that cue changed event-based prospective memory. The identical manipulation had no effect on event-based responding if the cue occurred within focal attention. These results suggest that cue characteristics can compensate for attention being directed away from an aspect of an ongoing task that contains event-based prospective memory.
Characterizing super-spreading in microblog: An epidemic-based information propagation model
NASA Astrophysics Data System (ADS)
Liu, Yu; Wang, Bai; Wu, Bin; Shang, Suiming; Zhang, Yunlei; Shi, Chuan
2016-12-01
As microblogging services become more prominent in the everyday life of users on Online Social Networks (OSNs), hot topics and breaking news can gain attention faster than ever before, giving rise to so-called "super-spreading events". In the information diffusion process of these super-spreading events, messages are passed on from one user to another and numerous individuals are influenced by a relatively small portion of users, a.k.a. super-spreaders. An awareness of super-spreading phenomena and an understanding of the patterns of wide-ranging information propagation benefit several social media data mining tasks, such as hot topic detection, prediction of information propagation, and harmful information monitoring and intervention. Since super-spreading in information diffusion and in the spread of a contagious disease are analogous, in this study we build a parameterized model, the SAIR model, based on well-known epidemic models to characterize the super-spreading phenomenon in tweet information propagation driven by super-spreaders. To model information diffusion, empirical observations on a real-world Weibo dataset are first carried out. Both a steady-state analysis of the equilibrium and a validation of the proposed model on the real-world Weibo dataset are conducted. The case study that validates the proposed model shows that the SAIR model is much more promising than the conventional SIR model in characterizing a super-spreading event of information propagation. In addition, numerical simulations are carried out and discussed to reveal how sensitively the parameters affect the information propagation process.
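As a rough illustration of an SAIR-style compartment formulation, here is a minimal ODE sketch assuming S (susceptible), A (super-spreader), I (ordinary spreader), and R (recovered/immune) compartments; the actual compartments, transitions, and parameter values of the paper's model may differ.

```python
# Minimal SAIR-style ODE sketch for information spread; rates are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def sair(t, y, beta_a, beta_i, p_super, gamma_a, gamma_i):
    s, a, i, r = y
    new_infections = (beta_a * a + beta_i * i) * s   # super-spreaders transmit faster
    ds = -new_infections
    da = p_super * new_infections - gamma_a * a      # small fraction become super-spreaders
    di = (1 - p_super) * new_infections - gamma_i * i
    dr = gamma_a * a + gamma_i * i
    return [ds, da, di, dr]

sol = solve_ivp(sair, (0, 50), [0.999, 0.0005, 0.0005, 0.0],
                args=(2.5, 0.5, 0.05, 0.3, 0.3), dense_output=True)
t = np.linspace(0, 50, 6)
print(np.round(sol.sol(t).T, 4))   # columns: S, A, I, R over time
```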
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time and often miss sub-events, or pieces, in the broader event puzzle entirely. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be carried out in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle. PMID:29107976
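The stream-division and burst-detection step can be sketched with a simple baseline method: bin the incoming microblog stream into fixed windows and flag windows whose counts exceed a moving mean plus a multiple of the standard deviation. The window length, threshold, and synthetic stream below are illustrative; the framework's actual burst detector may differ.

```python
# Baseline burst detection over a binned message stream; parameters are illustrative.
import numpy as np

def detect_bursts(timestamps, window_s=60, history=30, k=3.0):
    """Return start times of windows whose counts exceed mean + k*std of recent windows."""
    t = np.sort(np.asarray(timestamps, dtype=float))
    edges = np.arange(t[0], t[-1] + window_s, window_s)
    counts, _ = np.histogram(t, bins=edges)
    bursts = []
    for i in range(history, len(counts)):
        base = counts[i - history:i]
        if counts[i] > base.mean() + k * base.std(ddof=1):
            bursts.append(edges[i])
    return bursts

rng = np.random.default_rng(2)
stream = np.cumsum(rng.exponential(1.0, 5000))                    # background messages
stream = np.concatenate([stream, rng.uniform(3000, 3060, 400)])   # injected sub-event
print("burst windows start at:", detect_bursts(stream)[:3])
```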
A hypothetical universal model of cerebellar function: reconsideration of the current dogma.
Magal, Ari
2013-10-01
The cerebellum is commonly studied in the context of the classical eyeblink conditioning model, which attributes an adaptive motor function to cerebellar learning processes. This model of cerebellar function has quite a few shortcomings and may in fact be somewhat deficient in explaining the myriad functions attributed to the cerebellum, functions ranging from motor sequencing to emotion and cognition. The involvement of the cerebellum in these motor and non-motor functions has been demonstrated in both animals and humans in electrophysiological, behavioral, tracing, functional neuroimaging, and PET studies, as well as in clinical human case studies. A closer look at the cerebellum's evolutionary origin provides a clue to its underlying purpose as a tool which evolved to aid predation rather than as a tool for protection. Based upon this evidence, an alternative model of cerebellar function is proposed, one which might more comprehensively account both for the cerebellum's involvement in a myriad of motor, affective, and cognitive functions and for the relative simplicity and ubiquitous repetitiveness of its circuitry. This alternative model suggests that the cerebellum has the ability to detect coincidences of events, be they sensory, motor, affective, or cognitive in nature, and, after having learned to associate these, it can then trigger (or "mirror") these events after having temporally adjusted their onset based on positive/negative reinforcement. The model also provides for the cerebellum's direction of the proper and uninterrupted sequence of events resulting from this learning through the inhibition of efferent structures (as demonstrated in our lab).
Simulation Of A Photofission-Based Cargo Interrogation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, Michael; Gozani, Tsahi; Stevenson, John
A comprehensive model has been developed to characterize and optimize the detection of Bremsstrahlung x-ray induced fission signatures from nuclear materials hidden in cargo containers. An effective active interrogation system should not only induce a large number of fission events but also efficiently detect their signatures. The proposed scanning system utilizes a 9-MV commercially available linear accelerator and the detection of strong fission signals, i.e., delayed gamma rays and prompt neutrons. Because the scanning system is complex and the cargo containers are large and often highly attenuating, the simulation method segments the model into several physical steps, representing each change of radiation particle. Each approximation is carried out separately, resulting in a major reduction in computational time and a significant improvement in tally statistics. The model investigates the effect of various cargo types, densities and distributions on the fission rate and detection rate. Hydrogenous and metallic cargos, homogeneous and heterogeneous, as well as various locations of the nuclear material inside the cargo container were studied. We will show that for the photofission-based interrogation system simulation, the final results are not only in good agreement with a full, single-step simulation but also with experimental results, further validating the full-system simulation.
Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing
2005-11-30
Because of the genetically modified organisms (GMOs) labeling policies issued in many countries and areas, polymerase chain reaction (PCR) methods were developed for the execution of GMO labeling policies, such as screening, gene specific, construct specific, and event specific PCR detection methods, which have become a mainstay of GMOs detection. The event specific PCR detection method is the primary trend in GMOs detection because of its high specificity based on the flanking sequence of the exogenous integrant. This genetically modified maize, MON863, contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced-PCR, and the specific PCR primers and TaqMan probe were designed based upon the revealed 5'-integration junction sequence; the conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA for one reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were detected using the established real-time PCR systems, and the ideal results indicated that the established event specific real-time PCR detection systems were reliable, sensitive, and accurate.
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
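The timing-window idea in this control paradigm can be sketched in a few lines: the controller records an expected confirmation deadline for each command and raises a fault if the window closes without a confirming sensor event. This is a hand-rolled illustration with invented command names, not the DEVS formalism itself.

```python
# Toy event-based controller: commands must be confirmed within a time window.
import heapq

class EventBasedController:
    def __init__(self, expected_window):
        self.expected_window = expected_window   # command -> max confirmation delay (s)
        self.pending = []                        # (deadline, command) min-heap

    def issue(self, command, now):
        heapq.heappush(self.pending, (now + self.expected_window[command], command))

    def confirm(self, command, now):
        self.pending = [(d, c) for d, c in self.pending if c != command]
        heapq.heapify(self.pending)

    def check(self, now):
        """Return commands whose confirmation window has expired."""
        faults = []
        while self.pending and self.pending[0][0] < now:
            faults.append(heapq.heappop(self.pending)[1])
        return faults

ctrl = EventBasedController({"close_gripper": 0.5, "move_arm": 2.0})
ctrl.issue("close_gripper", now=0.0)
ctrl.issue("move_arm", now=0.0)
ctrl.confirm("close_gripper", now=0.3)
print("faults at t=2.5:", ctrl.check(now=2.5))   # move_arm missed its window
```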
Absolute auditory threshold: testing the absolute.
Heil, Peter; Matysiak, Artur
2017-11-02
The mechanisms underlying the detection of sounds in quiet, one of the simplest tasks for auditory systems, are debated. Several models proposed to explain the threshold for sounds in quiet and its dependence on sound parameters include a minimum sound intensity ('hard threshold'), below which sound has no effect on the ear. Also, many models are based on the assumption that threshold is mediated by integration of a neural response proportional to sound intensity. Here, we test these ideas. Using an adaptive forced choice procedure, we obtained thresholds of 95 normal-hearing human ears for 18 tones (3.125 kHz carrier) in quiet, each with a different temporal amplitude envelope. Grand-mean thresholds and standard deviations were well described by a probabilistic model according to which sensory events are generated by a Poisson point process with a low rate in the absence, and higher, time-varying rates in the presence, of stimulation. The subject actively evaluates the process and bases the decision on the number of events observed. The sound-driven rate of events is proportional to the temporal amplitude envelope of the bandpass-filtered sound raised to an exponent. We find no evidence for a hard threshold: When the model is extended to include such a threshold, the fit does not improve. Furthermore, we find an exponent of 3, consistent with our previous studies and further challenging models that are based on the assumption of the integration of a neural response that, at threshold sound levels, is directly proportional to sound amplitude or intensity. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
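A minimal numerical sketch of the Poisson-process account described above: sensory events occur at a spontaneous rate plus a driven rate proportional to the amplitude envelope raised to an exponent of 3, and a "detection" here is simplified to observing at least a fixed criterion number of events. All numerical values, and the fixed-count criterion itself, are illustrative assumptions rather than the fitted model.

```python
# Detection probability under a simple Poisson event-counting model.
import math
import numpy as np

def detection_probability(envelope, dt, level_scale, spontaneous_rate=10.0,
                          exponent=3.0, criterion=3):
    """P(observing at least `criterion` Poisson events during the stimulus)."""
    rate = spontaneous_rate + level_scale * envelope ** exponent   # events per second
    lam = float(np.sum(rate) * dt)                                 # expected event count
    p_below = sum(math.exp(-lam) * lam ** k / math.factorial(k) for k in range(criterion))
    return 1.0 - p_below

# A 3.125 kHz tone is represented here only by its amplitude envelope (arbitrary units).
t = np.linspace(0, 0.3, 3000)
envelope = np.where((t > 0.05) & (t < 0.25), 1.0, 0.0)   # 200 ms rectangular envelope
for level in (0.1, 0.2, 0.4):
    print(level, round(detection_probability(level * envelope, t[1] - t[0], 400.0), 3))
```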
Automated aortic calcification detection in low-dose chest CT images
NASA Astrophysics Data System (ADS)
Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.
2014-03-01
The extent of aortic calcification has been shown to be a risk indicator for vascular events including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose noncontrast, non-ECG gated, chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. Then the aortic surface location is detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered, then an elevated threshold of 160 Hounsfield units (HU) is used within the aorta mask region to reduce the effect of noise in low-dose scans, and finally non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered as true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score. The automated mass and volume score is respectively 98.46% and 98.28% correlated with the reference mass and volume score.
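The scoring step can be illustrated with a simplified per-slice sketch: voxels inside the aorta mask above the elevated 160 HU threshold are grouped into lesions and given an Agatston-style area-times-density-weight score. The pixel area, the toy CT slice, and the omission of multi-slice bookkeeping are assumptions for illustration.

```python
# Simplified Agatston-like scoring of aortic calcification within a mask.
import numpy as np
from scipy import ndimage

def density_weight(max_hu):
    # Conventional Agatston weighting bands.
    return 1 if max_hu < 200 else 2 if max_hu < 300 else 3 if max_hu < 400 else 4

def agatston_like_score(ct_slice_hu, aorta_mask, pixel_area_mm2=0.5, threshold_hu=160):
    candidates = (ct_slice_hu >= threshold_hu) & aorta_mask
    labels, n = ndimage.label(candidates)
    score = 0.0
    for lesion in range(1, n + 1):
        region = labels == lesion
        score += region.sum() * pixel_area_mm2 * density_weight(ct_slice_hu[region].max())
    return score

rng = np.random.default_rng(3)
ct = rng.normal(40, 20, (128, 128))                 # synthetic soft-tissue slice in HU
mask = np.zeros_like(ct, dtype=bool); mask[40:90, 40:90] = True
ct[60:63, 60:63] = 350                              # synthetic calcified plaque
print("slice score:", agatston_like_score(ct, mask))
```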
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
A Detailed Picture of the (93) Minerva Triple System
NASA Astrophysics Data System (ADS)
Marchis, F.; Descamps, P.; Dalba, P.; Enriquez, J. E.; Durech, J.; Emery, J. P.; Berthier, J.; Vachier, F.; Merlbourne, J.; Stockton, A. N.; Fassnacht, C. D.; Dupuy, T. J.
2011-10-01
We developed an orbital model of the satellites of (93) Minerva based on Keck II AO observations recorded in 2009 and a mutual event between one moon and the primary detected in March 2010. Using new lightcurves we found an approximated ellipsoid shape model for the primary. With a reanalysis of the IRAS data, we derived a preliminary bulk density of 1.5±0.2 g/cc. We will present a detailed analysis of the system, including a 3D shape model of the 93 Minerva primary derived by combining our AO observations, lightcurve, and stellar occultations.
Real-time detection and classification of anomalous events in streaming data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.
2016-04-19
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
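A toy sketch of scoring streaming events by anomalousness in the spirit described above: each event's feature vector is scored by its negative log-probability under a Gaussian model of recent history, so rarer events receive higher scores. The features, the Gaussian assumption, and the follow-on maliciousness classification are placeholders, not the patented system's detectors.

```python
# Streaming anomaly scoring against a rolling Gaussian model of recent events.
import numpy as np
from collections import deque

class AnomalyScorer:
    def __init__(self, history=1000):
        self.history = deque(maxlen=history)

    def score(self, event_features):
        x = np.asarray(event_features, dtype=float)
        if len(self.history) < 30:          # not enough history yet
            self.history.append(x)
            return 0.0
        h = np.array(self.history)
        mean, std = h.mean(axis=0), h.std(axis=0) + 1e-6
        log_p = -0.5 * np.sum(((x - mean) / std) ** 2 + np.log(2 * np.pi * std ** 2))
        self.history.append(x)
        return -log_p                        # higher = more anomalous

scorer = AnomalyScorer()
rng = np.random.default_rng(4)
for _ in range(200):
    scorer.score(rng.normal(0, 1, 3))        # typical traffic features
print("typical:", round(scorer.score(rng.normal(0, 1, 3)), 1))
print("atypical:", round(scorer.score(np.array([8.0, -7.0, 9.0])), 1))
```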
NASA Astrophysics Data System (ADS)
Kim, W. Y.; Richards, P. G.
2017-12-01
At least four small seismic events were detected around the North Korean nuclear test site following the 3 September 2017 underground nuclear test. The magnitudes of these shocks range from 2.6 to 3.5. Based on their proximity to the September 3 UNT, these shocks may be considered aftershocks of the UNT. We assess the best method to classify these small events based on spectral amplitude ratios of regional P and S waves from the shocks. None of these shocks are classified as explosion-like based on P/S spectral amplitude ratios. We examine additional possible small seismic events around the North Korean test site by using seismic data from stations in southern Korea and northeastern China including IMS seismic arrays, GSN stations, and regional network stations in the region.
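The P/S spectral-ratio screen mentioned above can be sketched as follows: spectral amplitudes are measured in a common band for the P- and S-wave windows, and their ratio is compared against a discrimination threshold, with explosion-like events tending toward higher P/S. The band edges, threshold, and synthetic waveforms are illustrative only.

```python
# Toy P/S spectral amplitude ratio discriminant; band and threshold are placeholders.
import numpy as np

def band_amplitude(waveform, fs, fmin, fmax):
    spec = np.abs(np.fft.rfft(waveform * np.hanning(len(waveform))))
    freqs = np.fft.rfftfreq(len(waveform), 1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    return spec[band].mean()

def classify_event(p_window, s_window, fs=100.0, fmin=6.0, fmax=12.0, threshold=1.0):
    ratio = band_amplitude(p_window, fs, fmin, fmax) / band_amplitude(s_window, fs, fmin, fmax)
    return ratio, ("explosion-like" if ratio > threshold else "earthquake-like")

rng = np.random.default_rng(5)
p = rng.normal(0, 1.0, 1024)     # synthetic P window
s = rng.normal(0, 2.0, 1024)     # synthetic S window, stronger as for an earthquake
print(classify_event(p, s))
```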
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodge, D. A.; Harris, D. B.
2016-03-15
Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework, a system which autonomously creates correlation detectors from event waveforms detected by power detectors, and reports observed performance on a network of arrays in terms of efficiency. We performed a large scale test of dynamic correlation processors on an 11 terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges and approaches 70% for near regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the numbers of correlators in an autonomous system can grow into the hundreds of thousands.
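A compact sketch of a waveform correlation detector of the kind described above: a template taken from a previously detected event is slid over continuous data and a detection is declared wherever the normalized correlation exceeds a threshold. The threshold, template, and synthetic data are illustrative.

```python
# Normalized cross-correlation detector with a single event template.
import numpy as np

def correlation_detections(data, template, threshold=0.7):
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        window = data[i:i + n]
        cc[i] = np.dot(t, (window - window.mean()) / (window.std() + 1e-12))
    return np.where(cc > threshold)[0], cc

rng = np.random.default_rng(6)
template = np.sin(2 * np.pi * 2.0 * np.linspace(0, 2, 200)) * np.hanning(200)
data = rng.normal(0, 0.3, 5000)
data[1200:1400] += template                 # buried repeat of the template event
idx, cc = correlation_detections(data, template)
print("detections near sample:", idx[:3], "peak cc:", round(cc.max(), 2))
```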
NASA Astrophysics Data System (ADS)
de Vries, A. J.; Ouwersloot, H. G.; Feldstein, S. B.; Riemer, M.; El Kenawy, A. M.; McCabe, M. F.; Lelieveld, J.
2018-01-01
Extreme precipitation events in the otherwise arid Middle East can cause flooding with dramatic socioeconomic impacts. Most of these events are associated with tropical-extratropical interactions, whereby a stratospheric potential vorticity (PV) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based on the combination of these two larger-scale meteorological features. The general motivation for this approach is that precipitation is often poorly simulated in relatively coarse weather and climate models, whereas the synoptic-scale circulation is much better represented. The algorithm is applied to ERA-Interim reanalysis data (1979-2015) and detects 90% (83%) of the 99th (97.5th) percentile of extreme precipitation days in the region of interest. Our results show that stratospheric PV intrusions and IVT structures are intimately connected to extreme precipitation intensity and seasonality. The farther south a stratospheric PV intrusion reaches, the larger the IVT magnitude, and the longer the duration of their combined occurrence, the more extreme the precipitation. Our algorithm detects a large fraction of the climatological rainfall amounts (40-70%), heavy precipitation days (50-80%), and the top 10 extreme precipitation days (60-90%) at many sites in southern Israel and the northern and western parts of Saudi Arabia. This identification method provides a new tool for future work to disentangle teleconnections, assess medium-range predictability, and improve understanding of climatic changes of extreme precipitation in the Middle East and elsewhere.
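A schematic sketch of the object-based combination described above: one mask marks a stratospheric PV intrusion (here simply PV above 2 PVU) and the other marks strong vertically integrated water vapor transport (IVT above a threshold); labelled objects from the two fields that overlap are kept as a combined event object. The thresholds, grid, and toy fields are assumptions, not the paper's criteria.

```python
# Toy object-based combination of a PV-intrusion mask and an IVT mask.
import numpy as np
from scipy import ndimage

def combined_event_mask(pv, ivt, pv_thresh=2.0, ivt_thresh=300.0):
    pv_objects, _ = ndimage.label(pv > pv_thresh)
    ivt_objects, _ = ndimage.label(ivt > ivt_thresh)
    overlap = (pv_objects > 0) & (ivt_objects > 0)
    # Keep whole objects from either field that touch the overlap region.
    keep_pv = np.isin(pv_objects, np.unique(pv_objects[overlap]))
    keep_ivt = np.isin(ivt_objects, np.unique(ivt_objects[overlap]))
    return keep_pv | keep_ivt

rng = np.random.default_rng(7)
pv = rng.normal(0.5, 0.3, (60, 90));  pv[20:35, 40:55] = 3.0      # synthetic PV intrusion
ivt = rng.normal(100, 40, (60, 90)); ivt[25:45, 45:60] = 450.0    # synthetic IVT incursion
print("event pixels:", combined_event_mask(pv, ivt).sum())
```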
ON THE FERMI -GBM EVENT 0.4 s AFTER GW150914
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greiner, J.; Yu, H.-F.; Burgess, J. M.
In view of the recent report by Connaughton et al., we analyze continuous time-tagged event (TTE) data of the Fermi Gamma-ray Burst Monitor (GBM) around the time of the gravitational-wave event GW150914. We find that after proper accounting for low-count statistics, the GBM transient event at 0.4 s after GW150914 is likely not due to an astrophysical source, but consistent with a background fluctuation, removing the tension between the INTEGRAL/ACS non-detection and GBM. Additionally, reanalysis of other short GRBs shows that without proper statistical modeling the fluence of faint events is over-predicted, as verified for some joint GBM-ACS detections of short GRBs. We detail the statistical procedure to correct these biases. As a result, faint short GRBs, verified by ACS detections, with significances in the broadband light curve even smaller than that of the GBM-GW150914 event are recovered as proper non-zero sources, while the GBM-GW150914 event is consistent with zero fluence.
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Pombo, Nuno; Garcia, Nuno; Bousson, Kouamana
2017-03-01
Sleep apnea syndrome (SAS), which can significantly decrease quality of life, is associated with major health risks such as increased cardiovascular disease, sudden death, depression, irritability, hypertension, and learning difficulties. Thus, it is relevant and timely to present a systematic review describing significant applications in the framework of computational intelligence-based SAS detection, including its performance, beneficial and challenging effects, and modeling for decision-making on multiple scenarios. This study aims to systematically review the literature on systems for the detection and/or prediction of apnea events using a classification model. Forty-five included studies revealed a combination of classification techniques for the diagnosis of apnea, such as threshold-based (14.75%) and machine learning (ML) models (85.25%). In addition, the ML models, clustered in a mind map, include neural networks (44.26%), regression (4.91%), instance-based (11.47%), Bayesian algorithms (1.63%), reinforcement learning (4.91%), dimensionality reduction (8.19%), ensemble learning (6.55%), and decision trees (3.27%). A classification model should be auto-adaptive and should not depend on external human action. In addition, the accuracy of the classification models is related to effective feature selection. New high-quality studies based on randomized controlled trials and validation of models using large and multiple data samples are recommended. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and in a circle at walk and trot, which exclusively rely on a single, dorsal hoof marker. The advantage of such marker placement is the robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on displacement, velocity or acceleration signals of the dorsal hoof marker depending on the algorithm using (i) defined thresholds associated with derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot on and foot off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that use of velocity thresholds for foot on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to on average 16 ms per condition), being greater than the effect on foot-on estimates or foot-off estimates in the forelimbs (up to on average ±7 ms per condition). PMID:26157641
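The threshold-based variant described above can be sketched very simply: estimate the hoof marker's speed, call a sample "stance" when the speed is below a threshold, and take stance onsets and offsets as foot-on and foot-off. The sampling rate, threshold, and synthetic hoof trajectory below are illustrative assumptions.

```python
# Speed-threshold stride event detection from a single hoof marker trajectory.
import numpy as np

def stride_events(marker_xyz, fs, speed_threshold=0.2):
    velocity = np.gradient(marker_xyz, 1.0 / fs, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    stance = speed < speed_threshold
    changes = np.diff(stance.astype(int))
    foot_on = np.where(changes == 1)[0] / fs     # swing -> stance transitions
    foot_off = np.where(changes == -1)[0] / fs   # stance -> swing transitions
    return foot_on, foot_off

fs = 200.0
t = np.arange(0, 2.0, 1.0 / fs)
# Toy hoof path: the marker moves during swing (~40% of the stride) and is still in stance.
swing = (np.mod(t, 0.7) < 0.28).astype(float)
x = np.cumsum(swing) / fs * 3.0
traj = np.column_stack([x, np.zeros_like(x), 0.1 * swing])
on, off = stride_events(traj, fs)
print("foot-on times:", np.round(on, 2), "foot-off times:", np.round(off, 2))
```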
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, which is a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
FORTE Compact Intra-cloud Discharge Detection parameterized by Peak Current
NASA Astrophysics Data System (ADS)
Heavner, M. J.; Suszcynsky, D. M.; Jacobson, A. R.; Heavner, B. D.; Smith, D. A.
2002-12-01
The Los Alamos Sferic Array (EDOT) has recorded over 3.7 million lightning-related fast electric field change data records during April 1 - August 31, 2001 and 2002. The events were detected by three or more stations, allowing for differential-time-of-arrival location determination. The waveforms are characterized with estimated peak currents as well as by event type. Narrow Bipolar Events (NBEs), the VLF/LF signature of Compact Intra-cloud Discharges (CIDs), are generally isolated pulses with identifiable ionospheric reflections, permitting determination of event source altitudes. We briefly review the EDOT characterization of events. The FORTE satellite observes Trans-Ionospheric Pulse Pairs (TIPPs, the VHF satellite signature of CIDs). The subset of coincident EDOT and FORTE CID observations are compared with the total EDOT CID database to characterize the VHF detection efficiency of CIDs. The NBE polarity and altitude are also examined in the context of FORTE TIPP detection. The parameter-dependent detection efficiencies are extrapolated from FORTE orbit to GPS orbit in support of the V-GLASS effort (GPS based global detection of lightning).
Radionuclide data analysis in connection of DPRK event in May 2009
NASA Astrophysics Data System (ADS)
Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei
2010-05-01
The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, no traces linked to the DPRK event were found. After three weeks of high alert, the PTS returned to its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All data coming from particulate and noble gas stations were evaluated daily, some samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical of a site owing to legitimate, non-nuclear-test-related activities. Therefore, a set of hypotheses was used, together with atmospheric transport modelling, to check whether a detection is consistent with the event time and location. The consistency of event timing and isotopic ratios was also used in the evaluation work. As a result, it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no problem detecting it. This case also showed the importance of on-site inspections to verify the nuclear traces of possible tests.
NASA Technical Reports Server (NTRS)
Culpepper, William X.; ONeill, Pat; Nicholson, Leonard L.
2000-01-01
An intranuclear cascade and evaporation model has been adapted to estimate the LET spectrum generated during testing with 200 MeV protons. The model-generated heavy ion LET spectrum is compared to the heavy ion LET spectrum seen on orbit. This comparison is the basis for predicting single event failure rates from heavy ions using results from a single proton test. Of equal importance, this spectral comparison also establishes an estimate of the risk of encountering a failure mode on orbit that was not detected during proton testing. Verification of the general results of the model is presented based on experiments, individual part test results, and flight data. Acceptance of this model and its estimate of remaining risk opens the hardware verification philosophy to the consideration of radiation testing with high energy protons at the board and box level instead of the more standard method of individual part testing with low energy heavy ions.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many places. This has resulted in the growth of illicit events, and therefore computer and network security has become an essential concern in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment in a timely manner for the application.
Urtnasan, Erdenebayar; Park, Jong-Uk; Joo, Eun-Yeon; Lee, Kyoung-Joung
2018-04-23
In this study, we propose a method for the automated detection of obstructive sleep apnea (OSA) from a single-lead electrocardiogram (ECG) using a convolutional neural network (CNN). A CNN model was designed with six optimized convolution layers including activation, pooling, and dropout layers. One-dimensional (1D) convolution, rectified linear units (ReLU), and max pooling were applied to the convolution, activation, and pooling layers, respectively. For training and evaluation of the CNN model, a single-lead ECG dataset was collected from 82 subjects with OSA and was divided into training (including data from 63 patients with 34,281 events) and testing (including data from 19 patients with 8571 events) datasets. Using this CNN model, a precision of 0.99, a recall of 0.99, and an F1-score of 0.99 were attained with the training dataset; these values were all 0.96 when the CNN was applied to the testing dataset. These results show that the proposed CNN model can be used to detect OSA accurately on the basis of a single-lead ECG. Ultimately, this CNN model may be used as a screening tool for those suspected to suffer from OSA.
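As a condensed sketch of the kind of network described above: six 1D convolution blocks, each with ReLU activation, max pooling, and dropout, followed by a linear classifier over per-segment ECG windows. The filter counts, kernel size, segment length, and dropout rate are assumptions; the abstract fixes only the overall layer structure.

```python
# Sketch of a six-block 1D CNN for per-segment OSA detection from single-lead ECG.
import torch
import torch.nn as nn

class OSAConvNet(nn.Module):
    def __init__(self, n_samples=6000, channels=(16, 16, 32, 32, 64, 64)):
        super().__init__()
        layers, in_ch = [], 1
        for out_ch in channels:
            layers += [nn.Conv1d(in_ch, out_ch, kernel_size=7, padding=3),
                       nn.ReLU(), nn.MaxPool1d(2), nn.Dropout(0.25)]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Linear(in_ch * (n_samples // 2 ** len(channels)), 2)

    def forward(self, x):                      # x: (batch, 1, n_samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))   # logits for normal vs. apnea

model = OSAConvNet()
ecg_segment = torch.randn(4, 1, 6000)          # e.g. 60 s of ECG at 100 Hz
print(model(ecg_segment).shape)                # torch.Size([4, 2])
```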
NASA Technical Reports Server (NTRS)
Blackburn, L.; Briggs, M. S.; Camp, J.; Christensen, N.; Connaughton, V.; Jenke, P.; Remillard, R. A.; Veitch, J.
2015-01-01
We present two different search methods for electromagnetic counterparts to gravitational-wave (GW) events from ground-based detectors using archival NASA high-energy data from the Fermi Gamma-ray Burst Monitor (GBM) and RXTE All-sky Monitor (ASM) instruments. To demonstrate the methods, we use a limited number of representative GW background noise events produced by a search for binary neutron star coalescence over the last two months of the LIGO-Virgo S6/VSR3 joint science run. Time and sky location provided by the GW data trigger a targeted search in the high-energy photon data. We use two custom pipelines: one to search for prompt gamma-ray counterparts in GBM, and the other to search for a variety of X-ray afterglow model signals in ASM. We measure the efficiency of the joint pipelines to weak gamma-ray burst counterparts, and a family of model X-ray afterglows. By requiring a detectable signal in either electromagnetic instrument coincident with a GW event, we are able to reject a large majority of GW candidates. This reduces the signal-to-noise ratio of the loudest surviving GW background event by around 15-20 percent.
Explosion Source Location Study Using Collocated Acoustic and Seismic Networks in Israel
NASA Astrophysics Data System (ADS)
Pinsky, V.; Gitterman, Y.; Arrowsmith, S.; Ben-Horin, Y.
2013-12-01
We explore a joint analysis of seismic and infrasonic signals for improved automatic monitoring of small local/regional events, such as construction and quarry blasts, military chemical explosions, and sonic booms, using collocated seismic and infrasonic networks recently built in Israel (ISIN) in the framework of a project sponsored by the Bi-national USA-Israel Science Foundation (BSF). The general target is to create an automatic system which will provide detection, location and identification of explosions in real time or near real time. At the moment the network comprises 15 stations hosting a microphone and a seismometer (or accelerometer), operated by the Geophysical Institute of Israel (GII), plus two infrasonic arrays operated by the National Data Center, Soreq: IOB in the South (Negev desert) and IMA in the North of Israel (Upper Galilee), collocated with the IMS seismic array MMAI. The study utilizes a ground-truth database of numerous Rotem phosphate quarry blasts, a number of controlled explosions for the demolition of outdated ammunition, and experimental surface explosions for structure-protection research at the Sayarim Military Range. A special event, comprising four military explosions in a neighboring country, which produced both strong seismic waves (up to 400 km) and infrasound waves (up to 300 km), is also analyzed. For all of these events the ground-truth coordinates and/or the results of seismic location by the Israel Seismic Network (ISN) have been provided. For automatic event detection and phase picking we tested a new recursive picker based on a statistically optimal detector. The results were compared to the manual picks. Several location techniques have been tested using the ground-truth event recordings, and the preliminary results have been compared to the ground-truth locations: 1) a number of events have been located by the intersection of azimuths estimated using the wide-band F-K analysis technique applied to the infrasonic phases of the two distant arrays; 2) a standard robust grid-search location procedure based on phase picks and a constant celerity per phase (tropospheric or stratospheric) was applied; 3) a joint coordinate grid-search procedure using array waveforms and phase picks was tested; and 4) the Bayesian Infrasonic Source Localization (BISL) method, incorporating semi-empirical model-based prior information, was modified for an array+network configuration and applied to the ground-truth events. For this purpose we accumulated data from previous observations of air-to-ground infrasonic phases to compute station-specific ground-truth Celerity-Range Histograms (ssgtCRH) and/or model-based CRH (mbCRH), which substantially improve the location results. For building the mbCRH, local meteorological data and ray-tracing modeling in three available azimuth ranges, accounting for seasonal variations of wind directivity (quadrants North: 315-45, South: 135-225, East: 45-135), have been used.
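The constant-celerity grid-search location mentioned in item 2 can be sketched compactly: candidate source positions on a horizontal grid are scored by the misfit between observed arrival times and times predicted from range divided by celerity, with the unknown origin time removed by demeaning the residuals. The station geometry, celerity value, and synthetic picks are illustrative.

```python
# Constant-celerity grid-search location from infrasonic phase picks (toy example).
import numpy as np

def locate(grid_x, grid_y, stations, picks, celerity=0.30):   # km and km/s
    best, best_misfit = None, np.inf
    for x in grid_x:
        for y in grid_y:
            ranges = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
            predicted = ranges / celerity
            # The unknown origin time is a common offset; remove it by demeaning.
            residuals = picks - predicted
            misfit = np.std(residuals)
            if misfit < best_misfit:
                best, best_misfit = (x, y), misfit
    return best, best_misfit

stations = np.array([[0.0, 0.0], [120.0, 10.0], [60.0, 90.0], [150.0, 140.0]])
true_source, origin_time = np.array([80.0, 40.0]), 50.0
picks = origin_time + np.hypot(*(stations - true_source).T) / 0.30
grid = np.arange(0.0, 160.0, 2.0)
print(locate(grid, grid, stations, picks))
```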
A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community
NASA Astrophysics Data System (ADS)
Merchant, B. J.; Chael, E. P.; Young, C. J.
2013-12-01
Network simulations have long been used to assess the ability of monitoring networks to detect events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: it is coded in a modern, multi-platform language; it exploits modern computing hardware (e.g., multi-core processors); it incorporates monitoring technologies other than seismic; and it includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first. Seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, it can hopefully provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
Temporal monitoring of vessels activity using day/night band in Suomi NPP on South China Sea
NASA Astrophysics Data System (ADS)
Yamaguchi, Takashi; Asanuma, Ichio; Park, Jong Geol; Mackin, Kenneth J.; Mittleman, John
2017-05-01
In this research, we focus on vessel detection using day/night band (DNB) satellite imagery from Suomi NPP in order to monitor changes in vessel activity in the South China Sea region. In this paper, we consider the relation between temporal changes in vessel activity and events in the maritime environment, based on vessel traffic density estimation using DNB. DNB is a moderate-resolution (350-700 m) imager, but it can detect the lights of fishing boats at night on a daily basis. The advantage of DNB is continuous monitoring over a wide area compared with other vessel detection and locating systems. However, DNB is strongly affected by cloud cover and lunar reflection. Therefore, we additionally used the brightness temperature at 3.7 μm (BT3.7) for cloud information. In our previous research, we constructed an empirical vessel detection model based on DNB contrast and an estimate of cloud conditions using BT3.7. Moreover, we proposed a vessel traffic density estimation method based on this empirical model. In this paper, we construct temporal density estimation maps over the South China Sea and the East China Sea in order to extract knowledge from changes in vessel activity.
Sequence of pathogenic events in cynomolgus macaques infected with aerosolized monkeypox virus.
Tree, J A; Hall, G; Pearson, G; Rayner, E; Graham, V A; Steeds, K; Bewley, K R; Hatch, G J; Dennis, M; Taylor, I; Roberts, A D; Funnell, S G P; Vipond, J
2015-04-01
To evaluate new vaccines when human efficacy studies are not possible, the FDA's "Animal Rule" requires well-characterized models of infection. Thus, in the present study, the early pathogenic events of monkeypox infection in nonhuman primates, a surrogate for variola virus infection, were characterized. Cynomolgus macaques were exposed to aerosolized monkeypox virus (10^5 PFU). Clinical observations, viral loads, immune responses, and pathological changes were examined on days 2, 4, 6, 8, 10, and 12 postchallenge. Viral DNA (vDNA) was detected in the lungs on day 2 postchallenge, and viral antigen was detected, by immunostaining, in the epithelium of bronchi, bronchioles, and alveolar walls. Lesions comprised rare foci of dysplastic and sloughed cells in respiratory bronchioles. By day 4, vDNA was detected in the throat, tonsil, and spleen, and monkeypox antigen was detected in the lung, hilar and submandibular lymph nodes, spleen, and colon. Lung lesions comprised focal epithelial necrosis and inflammation. Body temperature peaked on day 6, pox lesions appeared on the skin, and lesions, with positive immunostaining, were present in the lung, tonsil, spleen, lymph nodes, and colon. By day 8, vDNA was present in 9/13 tissues. Blood concentrations of interleukin 1ra (IL-1ra), IL-6, and gamma interferon (IFN-γ) increased markedly. By day 10, circulating IgG antibody concentrations increased, and on day 12, animals showed early signs of recovery. These results define early events occurring in an inhalational macaque monkeypox infection model, supporting its use as a surrogate model for human smallpox. Bioterrorism poses a major threat to public health, as the deliberate release of infectious agents, such as smallpox or a related virus, monkeypox, would have catastrophic consequences. The development and testing of new medical countermeasures, e.g., vaccines, are thus priorities; however, tests for efficacy in humans cannot be performed because it would be unethical and field trials are not feasible. To overcome this, the FDA may grant marketing approval of a new product based upon the "Animal Rule," in which interventions are tested for efficacy in well-characterized animal models. Monkeypox virus infection of nonhuman primates (NHPs) presents a potential surrogate disease model for smallpox. Previously, the later stages of monkeypox infection were defined, but the early course of infection remains unstudied. Here, the early pathogenic events of inhalational monkeypox infection in NHPs were characterized, and the results support the use of this surrogate model for testing human smallpox interventions. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Detecting modification of biomedical events using a deep parsing approach
2012-01-01
Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
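The shallow-feature half of the classifier described above lends itself to a compact illustration. The sketch below builds bag-of-words features from a small window around an event trigger word and trains a maximum-entropy model (multinomial logistic regression in scikit-learn as a stand-in for the Maximum Entropy learner); the sentences, trigger positions and labels are illustrative assumptions, not the BioNLP 2009 data, and the deep-parser features are omitted.

```python
# Sketch: bag-of-words features in a +/-3-token window around an event trigger,
# classified with a maximum-entropy (multinomial logistic regression) model.
# Sentences, trigger indices and labels below are illustrative, not BioNLP'09 data.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def window_features(tokens, trigger_idx, width=3):
    """Bag-of-words features from a small context window around the trigger."""
    feats = {"trigger=" + tokens[trigger_idx].lower(): 1}
    lo, hi = max(0, trigger_idx - width), min(len(tokens), trigger_idx + width + 1)
    for i in range(lo, hi):
        if i != trigger_idx:
            feats["bow=" + tokens[i].lower()] = 1
    return feats

# (tokens, index of the trigger word, label in {none, speculation, negation})
examples = [
    ("analysis of IkappaBalpha phosphorylation".split(), 3, "speculation"),
    ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negation"),
    ("IkappaBalpha phosphorylation was observed".split(), 1, "none"),
]

vec = DictVectorizer()
X = vec.fit_transform(window_features(t, i) for t, i, _ in examples)
y = [label for _, _, label in examples]

clf = LogisticRegression(max_iter=1000).fit(X, y)  # MaxEnt == multinomial logistic regression
print(clf.predict(vec.transform([window_features("analysis of STAT3 binding".split(), 3)])))
```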
LAN attack detection using Discrete Event Systems.
Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar
2011-01-01
Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make the IP-MAC pairing static, modify the existing ARP, or require patching the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairing or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing; however, this probing adds extra ARP traffic to the LAN. Following that, a DES detector is built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine or spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM) and denial-of-service attacks. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
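The active-probing idea behind the DES detector can be summarized in a few lines of decision logic. The sketch below is only an abstraction of that idea: send_probe is a hypothetical callback standing in for real ARP request/response handling, the simulated network is made up, and the DES model itself is not reproduced.

```python
# Sketch of the active-probing idea behind the DES-based LAN IDS described above:
# when an unverified IP-MAC pair is seen, probe the IP and compare the replies.
# send_probe() is a hypothetical callback that would emit a real ARP request and
# return the MAC addresses that answered; here it is simulated with a dict.
def classify_pair(ip, mac, verified, send_probe):
    """Return 'genuine', 'spoofed' or 'unknown' for an observed IP-MAC pair."""
    if verified.get(ip) == mac:
        return "genuine"                      # already verified, no extra traffic
    replies = set(send_probe(ip))             # MACs answering an ARP probe for ip
    if len(replies) == 1 and mac in replies:
        verified[ip] = mac                    # single consistent answer -> genuine
        return "genuine"
    if len(replies) > 1 or (replies and mac not in replies):
        return "spoofed"                      # conflicting answers -> spoofing
    return "unknown"                          # no reply; defer the decision

# Simulated probe: 10.0.0.5 is being spoofed, so two different MACs answer.
fake_net = {"10.0.0.5": ["aa:aa:aa:aa:aa:aa", "bb:bb:bb:bb:bb:bb"],
            "10.0.0.7": ["cc:cc:cc:cc:cc:cc"]}
probe = lambda ip: fake_net.get(ip, [])

table = {}
print(classify_pair("10.0.0.7", "cc:cc:cc:cc:cc:cc", table, probe))  # genuine
print(classify_pair("10.0.0.5", "aa:aa:aa:aa:aa:aa", table, probe))  # spoofed
```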
Dimension-based attention in visual short-term memory.
Pilling, Michael; Barrett, Doug J K
2016-07-01
We investigated how dimension-based attention influences visual short-term memory (VSTM). This was done through examining the effects of cueing a feature dimension in two perceptual comparison tasks (change detection and sameness detection). In both tasks, a memory array and a test array consisting of a number of colored shapes were presented successively, interleaved by a blank interstimulus interval (ISI). In Experiment 1 (change detection), the critical event was a feature change in one item across the memory and test arrays. In Experiment 2 (sameness detection), the critical event was the absence of a feature change in one item across the two arrays. Auditory cues indicated the feature dimension (color or shape) of the critical event with 80 % validity; the cues were presented either prior to the memory array, during the ISI, or simultaneously with the test array. In Experiment 1, the cue validity influenced sensitivity only when the cue was given at the earliest position; in Experiment 2, the cue validity influenced sensitivity at all three cue positions. We attributed the greater effectiveness of top-down guidance by cues in the sameness detection task to the more active nature of the comparison process required to detect sameness events (Hyun, Woodman, Vogel, Hollingworth, & Luck, Journal of Experimental Psychology: Human Perception and Performance, 35; 1140-1160, 2009).
The Effect of Right Colon Retroflexion on Adenoma Detection: A Systematic Review and Meta-analysis.
Cohen, Jonah; Grunwald, Douglas; Grossberg, Laurie B; Sawhney, Mandeep S
2017-10-01
Although colonoscopy with polypectomy can prevent up to 80% of colorectal cancers, a significant adenoma miss rate still exists, particularly in the right colon. Previous studies addressing right colon retroflexion have revealed discordant evidence regarding the benefit of this maneuver on adenoma detection with concomitant concerns about safety and rates of maneuver success. In this meta-analysis, we sought to determine the effect of right colon retroflexion on improving adenoma detection compared with conventional colonoscopy without retroflexion, as well as determine the rates of retroflexion maneuver success and adverse events. Multiple databases including MEDLINE, Embase, and Web of Science were searched for studies on right colon retroflexion and its impact on adenoma detection compared with conventional colonoscopy. Pooled analyses of adenoma detection and retroflexion success were based on mixed-effects and random-effects models with heterogeneity analyses. Eight studies met the inclusion criteria (N=3660). The primary analysis comparing colonoscopy with right-sided retroflexion versus conventional colonoscopy to determine the per-adenoma miss rate in the right colon was 16.9% (95% confidence interval, 12.5%-22.5%). The overall rate of successful retroflexion was 91.9% (95% confidence interval, 86%-95%) and rate of adverse events was 0.03%. Colonoscopy with right-sided retroflexion significantly increases the detection of adenomas in the right colon compared with conventional colonoscopy with a high rate of maneuver success and small risk of adverse events. Thus, reexamination of the right colon in retroflexed view should be strongly considered in future standard of care colonoscopy guidelines for quality improvement in colon cancer prevention.
None of the above: A Bayesian account of the detection of novel categories.
Navarro, Daniel J; Kemp, Charles
2017-10-01
Every time we encounter a new object, action, or event, there is some chance that we will need to assign it to a novel category. We describe and evaluate a class of probabilistic models that detect when an object belongs to a category that has not previously been encountered. The models incorporate a prior distribution that is influenced by the distribution of previous objects among categories, and we present 2 experiments that demonstrate that people are also sensitive to this distributional information. Two additional experiments confirm that distributional information is combined with similarity when both sources of information are available. We compare our approach to previous models of unsupervised categorization and to several heuristic-based models, and find that a hierarchical Bayesian approach provides the best account of our data. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Forward modeling transient brightenings and microflares around an active region observed with Hi-C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobelski, Adam R.; McKenzie, David E., E-mail: kobelski@solar.physics.montana.edu
Small-scale flare-like brightenings around active regions are among the smallest and most fundamental of energetic transient events in the corona, providing a testbed for models of heating and active region dynamics. In a previous study, we modeled a large collection of these microflares observed with Hinode/X-Ray Telescope (XRT) using EBTEL and found that they required multiple heating events, but could not distinguish between multiple heating events on a single strand, or multiple strands each experiencing a single heating event. We present here a similar study, but with extreme-ultraviolet data of Active Region 11520 from the High Resolution Coronal Imager (Hi-C) sounding rocket. Hi-C provides an order of magnitude improvement to the spatial resolution of XRT, and a cooler temperature sensitivity, which combine to provide significant improvements to our ability to detect and model microflare activity around active regions. We have found that at the spatial resolution of Hi-C (≈0.3 arcsec), the events occur much more frequently than expected (57 events detected, only 1 or 2 expected), and are most likely made from strands of the order of 100 km wide, each of which is impulsively heated with multiple heating events. These findings tend to support bursty reconnection as the cause of the energy release responsible for the brightenings.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
NASA Astrophysics Data System (ADS)
Kobayashi, Takashi; Komoda, Norihisa
Traditional business process design methods, of which the use case is the most typical, offer no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer’s experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission- critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
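The upstream/downstream traversal that drives root cause and impact analysis over a causal-directed graph can be sketched generically. The graph below is an illustrative toy, not the NASA SSC fault model, and the event-detection logic that triggers the walk is abstracted away.

```python
# Sketch of root-cause and impact analysis over a causal directed graph:
# edges point from cause to effect; a detected anomaly triggers an upstream
# walk (candidate root causes) and a downstream walk (predicted impacts).
from collections import deque

edges = {                      # cause -> effects (illustrative only)
    "valve_stuck": ["low_flow"],
    "sensor_drift": ["low_flow_reading"],
    "low_flow": ["low_flow_reading", "chamber_pressure_drop"],
    "chamber_pressure_drop": ["thrust_anomaly"],
}

def neighbors(graph, node, upstream):
    if upstream:  # invert the edges on the fly
        return [c for c, effs in graph.items() if node in effs]
    return graph.get(node, [])

def walk(graph, start, upstream):
    """Breadth-first reachability, excluding the start node itself."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in neighbors(graph, node, upstream):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

event = "low_flow_reading"                      # anomaly flagged by detection logic
print("possible root causes:", walk(edges, event, upstream=True))
print("predicted impacts:  ", walk(edges, event, upstream=False))
```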
Ricci, Renato Pietro; Morichelli, Loredana; D'Onofrio, Antonio; Calò, Leonardo; Vaccari, Diego; Zanotto, Gabriele; Curnis, Antonio; Buja, Gianfranco; Rovai, Nicola; Gargaro, Alessio
2013-01-01
Aims The HomeGuide Registry was a prospective study (NCT01459874), implementing a model for remote monitoring of cardiac implantable electronic devices (CIEDs) in daily clinical practice, to estimate effectiveness in major cardiovascular event detection and management. Methods and results The workflow for remote monitoring [Biotronik Home Monitoring (HM)] was based on primary nursing: each patient was assigned to an expert nurse for management and to a responsible physician for medical decisions. In-person visits were scheduled once a year. Seventy-five Italian sites enrolled 1650 patients [27% pacemakers, 27% single-chamber implantable cardioverter defibrillators (ICDs), 22% dual-chamber ICDs, 24% ICDs with cardiac resynchronization therapy]. Population resembled the expected characteristics of CIED patients. During a 20 ± 13 month follow-up, 2471 independently adjudicated events were collected in 838 patients (51%): 2033 (82%) were detected during HM sessions; 438 (18%) during in-person visits. Sixty were classified as false-positive, with generalized estimating equation-adjusted sensitivity and positive predictive value of 84.3% [confidence interval (CI), 82.5–86.0%] and 97.4% (CI, 96.5–98.2%), respectively. Overall, 95% of asymptomatic and 73% of actionable events were detected during HM sessions. Median reaction time was 3 days [interquartile range (IQR), 1–14 days]. Generalized estimating equation-adjusted incremental utility, calculated according to four properties of major clinical interest, was in favour of the HM sessions: +0.56 (CI, 0.53–0.58%), P < 0.0001. Resource consumption: 3364 HM sessions performed (76% by nurses), median committed monthly manpower of 55.5 (IQR, 22.0–107.0) min × health personnel/100 patients. Conclusion Home Monitoring was highly effective in detecting and managing clinical events in CIED patients in daily practice with remarkably low manpower and resource consumption. PMID:23362021
Comparison of the landslide susceptibility models in Taipei Water Source Domain, Taiwan
NASA Astrophysics Data System (ADS)
WU, C. Y.; Yeh, Y. C.; Chou, T. H.
2017-12-01
The Taipei Water Source Domain, located to the southeast of the Taipei metropolis, is the main source of water in this region. Recently, downstream turbidity has often soared significantly during typhoon periods because of upstream landslides. Landslide susceptibility should be analysed to assess the zones influenced by different rainfall events and to ensure the ability of this domain to supply sufficient, high-quality water. Generally, landslide susceptibility models can be established based on either a long-term landslide inventory or a specified landslide event. In some areas there is no long-term landslide inventory, so event-based landslide susceptibility models are widely established. However, inventory-based and event-based landslide susceptibility models may result in dissimilar susceptibility maps for the same area. The purposes of this study were therefore to compare the landslide susceptibility maps derived from the inventory-based and event-based models, and to interpret how to select a representative event to be included in the susceptibility model. The landslide inventories from Typhoon Tim in July 1994 and Typhoon Soudelor in August 2015 were collected and used to establish the inventory-based landslide susceptibility model. The landslides caused by Typhoon Nari and the corresponding rainfall data were used to establish the event-based model. The results indicated that the high-susceptibility slope units were located in the middle and upstream Nan-Shih Stream basin.
MODELING THE AFTERGLOW OF THE POSSIBLE FERMI -GBM EVENT ASSOCIATED WITH GW150914
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morsony, Brian J.; Workman, Jared C.; Ryan, Dominic M., E-mail: morsony@astro.umd.edu
2016-07-10
We model the possible afterglow of the Fermi Gamma-ray Burst Monitor (GBM) event associated with LIGO detection GW150914, under the assumption that the gamma-rays are produced by a short GRB-like relativistic outflow. We model GW150914-GBM as both a weak, on-axis short GRB and a normal short GRB seen far off-axis. Given the large uncertainty in the position of GW150914, we determine that the best chance of finding the afterglow is with ASKAP or possibly the Murchison Widefield Array (MWA), with the flux from an off-axis short GRB reaching 0.2–4 mJy (0.12–16 mJy) at 150 MHz (863.5 MHz) by 1–12 months after the initial event. At low frequencies, the source would evolve from a hard to soft spectrum over several months. The radio afterglow would be detectable for several months to years after it peaks, meaning the afterglow may still be detectable and increasing in brightness NOW (2016 mid-July). With a localization from the MWA or ASKAP, the afterglow would be detectable at higher radio frequencies with the ATCA and in X-rays with Chandra or XMM.
Predictive modeling of structured electronic health records for adverse drug event detection.
Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik
2015-01-01
The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and combined. We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two.
Predictive modeling of structured electronic health records for adverse drug event detection
2015-01-01
Background The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Methods Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Results Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and combined. Conclusions We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two. PMID:26606038
Hennenfent, Andrew; DelVento, Vito; Davies-Cole, John; Johnson-Clarke, Fern
2017-03-01
To enhance the early detection of emerging infectious diseases and bioterrorism events using companion animal-based surveillance. Washington, DC, small animal veterinary facilities (n=17) were surveyed to determine interest in conducting infectious disease surveillance. Using these results, an online electronic reporting system was developed and launched in August 2015 to monitor rates of canine influenza, canine leptospirosis, antibiotic-resistant infections, canine parvovirus, and syndromic disease trends. Nine of the 10 facilities that responded expressed interest in conducting surveillance. In September 2015, 17 canine parvovirus cases were reported. In response, a campaign encouraging regular veterinary preventative care was launched and featured on local media platforms. Additionally, during the system's first year of operation it detected 5 canine leptospirosis cases and 2 antibiotic-resistant infections. No canine influenza cases were reported and syndromic surveillance compliance varied, peaking during National Special Security Events. Small animal veterinarians and the general public are interested in companion animal disease surveillance. The system described can serve as a model for establishing similar systems to monitor disease trends of public health importance in pet populations and enhance biosurveillance capabilities. Copyright © 2017 Elsevier B.V. All rights reserved.
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
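To make the flavor of such estimators concrete, the sketch below computes an unweighted average log odds ratio with a 0.5 continuity correction and runs a simple parametric bootstrap homogeneity check. It is a minimal stand-in under assumed 2×2 counts, not the authors' exact estimators or their coronary artery disease dataset.

```python
# Sketch: unweighted average log odds ratio across studies (0.5 continuity
# correction for zero cells, common with rare events) plus a parametric
# bootstrap check for heterogeneity of the treatment effect. Counts are made up.
import numpy as np

rng = np.random.default_rng(0)
# One row per study: (events_treated, n_treated, events_control, n_control)
studies = np.array([[1, 200, 0, 200],
                    [0, 150, 2, 150],
                    [3, 300, 1, 300],
                    [1, 100, 1, 100]], dtype=float)

def log_or(a, n1, c, n0):
    """Log odds ratio with a 0.5 continuity correction on every cell."""
    b, d = n1 - a, n0 - c
    a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return np.log((a * d) / (b * c))

def unweighted_effect(tbl):
    ors = np.array([log_or(*row) for row in tbl])
    return ors.mean(), ors.var(ddof=1)        # simple average + spread across studies

obs_mean, obs_var = unweighted_effect(studies)

# Parametric bootstrap under H0: one common odds ratio, study-specific baseline risks.
boot_var = []
for _ in range(2000):
    sim = studies.copy()
    for i, (_, n1, c, n0) in enumerate(studies):
        p0 = (c + 0.5) / (n0 + 1.0)                   # control-arm risk
        odds1 = np.exp(obs_mean) * p0 / (1.0 - p0)    # treated odds under a common OR
        p1 = odds1 / (1.0 + odds1)
        sim[i, 0] = rng.binomial(int(n1), p1)
        sim[i, 2] = rng.binomial(int(n0), p0)
    boot_var.append(unweighted_effect(sim)[1])

p_het = float(np.mean(np.array(boot_var) >= obs_var))
print(f"unweighted average log OR = {obs_mean:.3f}, bootstrap heterogeneity p ~ {p_het:.3f}")
```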
Landslide Detection in the Carlyon Beach, WA Peninsula: Analysis Of High Resolution DEMs
NASA Astrophysics Data System (ADS)
Fayne, J.; Tran, C.; Mora, O. E.
2017-12-01
Landslides are geological events caused by slope instability and degradation, leading to the sliding of large masses of rock and soil down a mountain or hillside. These events are influenced by topography, geology, weather and human activity, and can cause extensive damage to the environment and infrastructure, such as the destruction of transportation networks, homes, and businesses. It is therefore imperative to detect early-warning signs of landslide hazards as a means of mitigation and disaster prevention. Traditional landslide surveillance consists of field mapping, but the process is expensive and time consuming. This study uses Light Detection and Ranging (LiDAR)-derived Digital Elevation Models (DEMs) together with k-means clustering and Gaussian Mixture Models (GMMs) to analyze surface roughness and extract spatial features and patterns of landslides and landslide-prone areas. The methodology, based on several feature extractors, employs an unsupervised classifier on the Carlyon Beach Peninsula in the state of Washington to attempt to identify slide-prone terrain. When compared with the independently compiled landslide inventory map, the proposed algorithm correctly classifies up to 87% of the terrain. These results suggest that the proposed methods and LiDAR-derived DEMs can provide important surface information and be used as efficient tools for digital terrain analysis to create accurate landslide maps.
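A minimal version of the roughness-plus-clustering step can be sketched as follows: the local standard deviation of elevation serves as a roughness proxy and a Gaussian Mixture Model clusters the cells. The DEM here is synthetic and the single feature is far simpler than the study's extractors.

```python
# Sketch: surface roughness from a DEM as the local standard deviation of
# elevation, then unsupervised clustering of cells with a Gaussian Mixture Model.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
dem = np.cumsum(rng.normal(size=(200, 200)), axis=0)        # synthetic elevation surface
dem[80:120, 80:120] += rng.normal(scale=5, size=(40, 40))   # a rough, "slide-like" patch

def local_std(z, size=5):
    """Moving-window standard deviation (roughness proxy)."""
    mean = uniform_filter(z, size)
    mean_sq = uniform_filter(z * z, size)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

rough = local_std(dem).ravel().reshape(-1, 1)
gmm = GaussianMixture(n_components=2, random_state=0).fit(rough)
labels = gmm.predict(rough).reshape(dem.shape)

# Treat the higher-roughness component as "potential landslide terrain".
slide_class = int(np.argmax(gmm.means_.ravel()))
print("fraction flagged as rough terrain:", float(np.mean(labels == slide_class)))
```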
Online track detection in triggerless mode for INO
NASA Astrophysics Data System (ADS)
Jain, A.; Padmini, S.; Joseph, A. N.; Mahesh, P.; Preetha, N.; Behere, A.; Sikder, S. S.; Majumder, G.; Behera, S. P.
2018-03-01
The India-based Neutrino Observatory (INO) is a proposed particle physics research project to study atmospheric neutrinos. The INO Iron Calorimeter (ICAL) will consist of 28,800 detectors with 3.6 million electronic channels expected to fire at a singles rate of 100 Hz, producing data at a rate of 3 GBps. The data collected contain a few real hits generated by muon tracks, the remainder being noise-induced spurious hits. The estimated reduction factor after filtering out the data of interest from the generated data is of the order of 10^3. This makes trigger generation critical for efficient data collection and storage. A trigger is generated by detecting coincidences across multiple channels satisfying the trigger criteria within a small window of 200 ns in the trigger region. As the probability of neutrino interaction is very low, the track detection algorithm has to be efficient and fast enough to process 5 × 10^6 event candidates/s without introducing significant dead time, so that not even a single neutrino event is missed. A hardware-based trigger system is presently proposed for online track detection, considering the stringent timing requirements. Though the trigger system can be designed to scale, the large number of hardware devices and interconnections makes it a complex and expensive solution with limited flexibility. A software-based track detection approach working on the hit information offers an elegant solution with the possibility of varying the trigger criteria for selecting various potentially interesting physics events. An event selection approach for an alternative triggerless readout scheme has been developed. The algorithm is mathematically simple, robust and parallelizable. It has been validated by detecting simulated muon events at energies in the range of 1-10 GeV with 100% efficiency at a processing rate of 60 μs/event on a 16-core machine. The algorithm and the result of a proof-of-concept for its faster implementation over multiple cores are presented. The paper also discusses harnessing the computing capabilities of a multi-core computing farm, thereby optimizing the number of nodes required for the proposed system.
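One way to picture a triggerless, software-based event selection of this kind is sketched below: hits are grouped within a sliding coincidence window, and a group is accepted when its positions are consistent with a straight track. The window length, thresholds and toy hit geometry are illustrative assumptions, not the ICAL design values or the published algorithm.

```python
# Sketch of a triggerless event-selection step: hits (time, layer, strip) are
# grouped inside a small coincidence window, and a group is kept if its hit
# positions lie close to a straight line (a crude muon-track hypothesis).
import numpy as np

def select_tracks(hits, window_ns=200.0, min_hits=5, max_resid=1.5):
    """hits: array of (time_ns, layer, strip) rows, assumed time-sorted."""
    events, start = [], 0
    for i in range(len(hits)):
        while hits[i, 0] - hits[start, 0] > window_ns:
            start += 1
        group = hits[start:i + 1]
        if len(group) >= min_hits:
            layer, strip = group[:, 1], group[:, 2]
            slope, intercept = np.polyfit(layer, strip, 1)        # straight-line fit
            resid = np.sqrt(np.mean((strip - (slope * layer + intercept)) ** 2))
            if resid < max_resid:
                events.append((hits[start, 0], len(group)))
                start = i + 1                                     # consume the group
    return events

rng = np.random.default_rng(2)
noise = np.column_stack([np.sort(rng.uniform(0, 1e6, 300)),       # random noise hits
                         rng.integers(0, 150, 300),
                         rng.integers(0, 64, 300)])
track = np.column_stack([5e5 + rng.uniform(0, 150, 8),            # 8 aligned hits
                         np.arange(10, 18),
                         10 + 0.8 * np.arange(8) + rng.normal(0, 0.3, 8)])
hits = np.vstack([noise, track])
hits = hits[np.argsort(hits[:, 0])]
print("selected track candidates:", select_tracks(hits))
```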
Observing atmospheric blocking with GPS radio occultation - one decade of measurements
NASA Astrophysics Data System (ADS)
Brunner, Lukas; Steiner, Andrea
2017-04-01
Atmospheric blocking has received a lot of attention in recent years due to its impact on mid-latitude circulation and subsequently on weather extremes such as cold and warm spells. So far blocking studies have been based mainly on re-analysis data or model output. However, it has been shown that blocking frequency exhibits considerable inter-model spread in current climate models. Here we use one decade (2006 to 2016) of satellite-based observations from GPS radio occultation (RO) to analyze blocking in RO data building on work by Brunner et al. (2016). Daily fields on a 2.5°×2.5° longitude-latitude grid are calculated by applying an adequate gridding strategy to the RO measurements. For blocking detection we use a standard blocking detection algorithm based on 500 hPa geopotential height (GPH) gradients. We investigate vertically resolved atmospheric variables such as GPH, temperature, and water vapor before, during, and after blocking events to increase process understanding. Moreover, utilizing the coverage of the RO data set, we investigate global blocking frequencies. The main blocking regions in the northern and southern hemisphere are identified and the (vertical) atmospheric structure linked to blocking events is compared. Finally, an inter-comparison of results from RO data to different re-analyses, such as ERA-Interim, MERRA 2, and JRA-55, is presented. Brunner, L., A. K. Steiner, B. Scherllin-Pirscher, and M. W. Jury (2016): Exploring atmospheric blocking with GPS radio occultation observations. Atmos. Chem. Phys., 16, 4593-4604, doi:10.5194/acp-16-4593-2016.
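A standard 500 hPa geopotential-height gradient blocking index of the Tibaldi–Molteni type can be written compactly. The sketch below uses a synthetic field and nominal latitudes and thresholds, which may differ from the exact configuration used in the study.

```python
# Sketch of a standard 500 hPa geopotential-height (GPH) gradient blocking index:
# a longitude is "blocked" when the equatorward gradient reverses (GHGS > 0)
# while a strong westerly gradient persists poleward (GHGN below a threshold).
import numpy as np

def blocked_longitudes(z500, lats, lat_n=80.0, lat_0=60.0, lat_s=40.0, ghgn_thresh=-10.0):
    """z500: GPH [m] on a (lat, lon) grid; returns a boolean mask over longitude."""
    i_n = np.argmin(np.abs(lats - lat_n))
    i_0 = np.argmin(np.abs(lats - lat_0))
    i_s = np.argmin(np.abs(lats - lat_s))
    ghgs = (z500[i_0] - z500[i_s]) / (lat_0 - lat_s)   # m per degree latitude
    ghgn = (z500[i_n] - z500[i_0]) / (lat_n - lat_0)
    return (ghgs > 0.0) & (ghgn < ghgn_thresh)

lats = np.arange(30.0, 87.5, 2.5)
lons = np.arange(0.0, 360.0, 2.5)
# Zonal-mean field (height decreasing poleward) plus a blocking-like ridge near 20E.
z500 = 5600.0 - 6.0 * (lats[:, None] - 30.0) + 250.0 * np.exp(
    -(((lons[None, :] - 20.0) / 15.0) ** 2) - (((lats[:, None] - 62.0) / 8.0) ** 2))

mask = blocked_longitudes(z500, lats)
print("blocked longitudes [deg E]:", lons[mask])
```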
Video-Based Affect Detection in Noninteractive Learning Environments
ERIC Educational Resources Information Center
Chen, Yuxuan; Bosch, Nigel; D'Mello, Sidney
2015-01-01
The current paper explores possible solutions to the problem of detecting affective states from facial expressions during text/diagram comprehension, a context devoid of interactive events that can be used to infer affect. These data present an interesting challenge for face-based affect detection because likely locations of affective facial…
Helmer, Axel; Kretschmer, Friedrich; Deparade, Riana; Song, Bianying; Meis, Markus; Hein, Andreas; Marschollek, Michael; Tegtbur, Uwe
2012-01-01
Cardiopulmonary diseases affect millions of people and cause high costs in health care systems worldwide. Patients should perform regular endurance exercises to stabilize their health state and prevent further impairment. However, patients are often uncertain about the level of intensity they should exercise in their current condition. The cost of continuous monitoring for these training sessions in clinics is high and additionally requires the patient to travel to a clinic for each single session. Performing the rehabilitation training at home can raise compliance and reduce costs. To ensure safe telerehabilitation training and to enable patients to control their performance and health state, detection of abnormal events during training is a critical prerequisite. Therefore, we created a model that predicts the heart rate of cardiopulmonary patients and that can be used to detect and avoid abnormal health states. To enable external feedback and an immediate reaction in case of a critical situation, the patient should have the possibility to configure the system to communicate warnings and emergency events to clinical and non-clinical actors. To fulfill this task, we coupled a personal health record (PHR) with a new component that extends the classic home emergency systems. The PHR is also used for a training schedule definition that makes use of the predictive HR model. We used statistical methods to evaluate the prediction model and found that our prediction error of 3.2 heart beats per minute is precise enough to enable a detection of critical states. The concept for the communication of alerts was evaluated through focus group interviews with domain experts who judged that it fulfills the needs of potential users.
Trinh, T; Ishida, K; Kavvas, M L; Ercan, A; Carr, K
2017-05-15
Along with socioeconomic development and population increase, natural disasters around the world have recently increased awareness of the harmful impacts they cause. Among natural disasters, drought is of great interest to scientists due to the extraordinary diversity of its severity and duration. Motivated by the development of a potential approach to investigate possible future droughts in a probabilistic framework based on climate change projections, a methodology that considers thirteen future climate projections based on four emission scenarios to characterize droughts is presented. The proposed approach uses a regional climate model coupled with a physically based hydrology model (Watershed Environmental Hydrology Hydro-Climate Model; WEHY-HCM) to generate thirteen equally likely future water supply projections. The water supply projections were compared to the current water demand for the detection of drought events and the estimation of drought properties. The procedure was applied to the Shasta Dam watershed to analyze drought conditions at the watershed outlet, Shasta Dam. The results suggest increasing water scarcity at Shasta Dam, with more severe and longer future drought events in some future scenarios. An important advantage of the proposed approach to the probabilistic analysis of future droughts is that it provides the drought properties for the 100-year and 200-year return periods without resorting to any extrapolation of the frequency curve. Copyright © 2017 Elsevier B.V. All rights reserved.
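The supply-versus-demand comparison for drought detection reduces to finding runs of deficit months and accumulating their deficits. The sketch below illustrates this on made-up numbers rather than the WEHY-HCM projections for Shasta Dam.

```python
# Sketch: detect drought events from a projected monthly water-supply series by
# comparison with demand. An event is a run of consecutive deficit months; its
# duration is the run length and its severity the accumulated deficit.
import numpy as np

def drought_events(supply, demand):
    deficit = demand - supply
    events, start = [], None
    for t, d in enumerate(deficit):
        if d > 0 and start is None:
            start = t
        elif d <= 0 and start is not None:
            events.append({"start": start, "duration": t - start,
                           "severity": float(deficit[start:t].sum())})
            start = None
    if start is not None:
        events.append({"start": start, "duration": len(deficit) - start,
                       "severity": float(deficit[start:].sum())})
    return events

rng = np.random.default_rng(3)
months = 120
demand = np.full(months, 100.0)                               # constant demand (illustrative)
supply = 110 + 25 * np.sin(2 * np.pi * np.arange(months) / 12) \
         + rng.normal(0, 15, months)                          # one supply projection

for ev in drought_events(supply, demand):
    print(f"month {ev['start']:3d}: duration {ev['duration']:2d} months, "
          f"severity {ev['severity']:.1f}")
```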
Characterization of GM events by insert knowledge adapted re-sequencing approaches
Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing
2013-01-01
Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are only incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events. PMID:24088728
Characterization of GM events by insert knowledge adapted re-sequencing approaches.
Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing
2013-10-03
Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are only incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events.
Improved Detection of Local Earthquakes in the Vienna Basin (Austria), using Subspace Detectors
NASA Astrophysics Data System (ADS)
Apoloner, Maria-Theresia; Caffagni, Enrico; Bokelmann, Götz
2016-04-01
The Vienna Basin in Eastern Austria is densely populated and highly-developed; it is also a region of low to moderate seismicity, yet the seismological network coverage is relatively sparse. This demands improving our capability of earthquake detection by testing new methods, enlarging the existing local earthquake catalogue. This contributes to imaging tectonic fault zones for better understanding seismic hazard, also through improved earthquake statistics (b-value, magnitude of completeness). Detection of low-magnitude earthquakes or events for which the highest amplitudes slightly exceed the signal-to-noise-ratio (SNR), may be possible by using standard methods like the short-term over long-term average (STA/LTA). However, due to sparse network coverage and high background noise, such a technique may not detect all potentially recoverable events. Yet, earthquakes originating from the same source region and relatively close to each other, should be characterized by similarity in seismic waveforms, at a given station. Therefore, waveform similarity can be exploited by using specific techniques such as correlation-template based (also known as matched filtering) or subspace detection methods (based on the subspace theory). Matching techniques basically require a reference or template event, usually characterized by high waveform coherence in the array receivers, and high SNR, which is cross-correlated with the continuous data. Instead, subspace detection methods overcome in principle the necessity of defining template events as single events, but use a subspace extracted from multiple events. This approach theoretically should be more robust in detecting signals that exhibit a strong variability (e.g. because of source or magnitude). In this study we scan the continuous data recorded in the Vienna Basin with a subspace detector to identify additional events. This will allow us to estimate the increase of the seismicity rate in the local earthquake catalogue, therefore providing an evaluation of network performance and efficiency of the method.
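The correlation-template (matched-filter) component of such detectors is easy to sketch: slide a normalized template over the continuous record and flag windows whose correlation exceeds a threshold. A subspace detector generalizes this by projecting each window onto an SVD basis built from several templates; only the single-template case is shown below, on synthetic data with an illustrative threshold.

```python
# Sketch of correlation-template (matched-filter) detection: normalized
# cross-correlation of a template event against continuous data, thresholded.
import numpy as np

def matched_filter(cont, template, threshold=0.6):
    n = len(template)
    t = (template - template.mean()) / (template.std() * np.sqrt(n))
    detections = []
    for i in range(len(cont) - n + 1):
        w = cont[i:i + n]
        w = (w - w.mean()) / (w.std() * np.sqrt(n) + 1e-12)
        cc = float(np.dot(t, w))                 # correlation coefficient in [-1, 1]
        if cc > threshold:
            detections.append((i, cc))
    return detections

rng = np.random.default_rng(4)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
cont = rng.normal(0, 0.3, 5000)
cont[1200:1300] += 0.8 * template                # hidden low-SNR repeat of the event
cont[3500:3600] += 1.5 * template                # stronger repeat

print(matched_filter(cont, template)[:5])
```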
Episodic inflation events at Akutan Volcano, Alaska, during 2005-2017
NASA Astrophysics Data System (ADS)
Ji, Kang Hyeun; Yun, Sang-Ho; Rim, Hyoungrea
2017-08-01
Detection of weak volcano deformation helps constrain characteristics of eruption cycles. We have developed a signal detection technique, called the Targeted Projection Operator (TPO), to monitor surface deformation with Global Positioning System (GPS) data. We have applied the TPO to GPS data collected at Akutan Volcano from June 2005 to March 2017 and detected four inflation events that occurred in 2008, 2011, 2014, and 2016 with inflation rates of about 8-22 mm/yr above the background trend at a near-source site AV13. Numerical modeling suggests that the events should be driven by closely located sources or a single source in a shallow magma chamber at a depth of about 4 km. The inflation events suggest that magma has episodically accumulated in a shallow magma chamber.
A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines
NASA Technical Reports Server (NTRS)
Turso, James A.; Litt, Jonathan S.
2004-01-01
A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), providing both gas path and rotor dynamic structural response, and is suitable for rapid-prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
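The evidence-combination step can be illustrated with Dempster's rule over the two-hypothesis frame {FOD, NO_FOD}. The masses below are illustrative outputs of the gas-path and accelerometer fuzzy-inference stages, not engine data; Yager's variant, mentioned in the abstract, would assign the conflict mass to the full frame instead of renormalizing.

```python
# Sketch: Dempster's rule of combination for two basic probability assignments
# (bpa) over the frame {FOD, NO_FOD}.
from itertools import product

FRAME = frozenset({"FOD", "NO_FOD"})

def combine(m1, m2):
    """Dempster's rule: m(C) ~ sum of m1(A)*m2(B) over A & B = C, renormalized."""
    out, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            out[c] = out.get(c, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1.0 - conflict) for k, v in out.items()}, conflict

gas_path = {frozenset({"FOD"}): 0.6, frozenset({"NO_FOD"}): 0.1, FRAME: 0.3}
accel    = {frozenset({"FOD"}): 0.7, frozenset({"NO_FOD"}): 0.2, FRAME: 0.1}

fused, k = combine(gas_path, accel)
print("conflict:", round(k, 3))
for focal, mass in fused.items():
    print(sorted(focal), round(mass, 3))
```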
Scenario design and basic analysis of the National Data Centre Preparedness Exercise 2013
NASA Astrophysics Data System (ADS)
Ross, Ole; Ceranna, Lars; Hartmann, Gernot; Gestermann, Nicolai; Bönneman, Christian
2014-05-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data is collected, processed and technically analyzed in the International Data Center (IDC) of the CTBT-Organization, National Data Centers (NDC) provide interpretation and advice to their government concerning suspicious detections occurring in IMS data. NDC Preparedness Exercises (NPE) are regularly performed dealing with fictitious treaty violations to practice the combined analysis of CTBT verification technologies and for the mutual exchange of information between NDC and also with the IDC. The NPE2010 and NPE2012 trigger scenarios were based on selected seismic events from the Reviewed Event Bulletin (REB) serving as starting point for fictitious Radionuclide dispersion. The main task was the identification of the original REB event and the discrimination between earthquakes and explosions as source. The scenario design of NPE2013 differs from those of previous NPEs. The waveform event selection is not constrained to events in the REB. The exercise trigger is a combination of a tempo-spatial indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" at the Coast of the North Sea in Central Europe. The synthetic radionuclide detections start in Vienna (8 Sept, I-131) and Schauinsland (11 Sept, Xe-133) with rather low activity concentrations and are most prominent in Stockholm and Spitsbergen mid of September 2013. Smaller concentrations in Asia follow later on. The potential connection between the waveform and radionuclide evidence remains unclear. The verification task is to identify the waveform event and to investigate potential sources of the radionuclide findings. Finally the potential conjunction between the sources and the CTBT-relevance of the whole picture has to be evaluated. The overall question is whether requesting an On-Site-Inspection in "Frisia" would be justified. The poster presents the NPE2013 scenario and gives a basic analysis of the initial situation concerning both waveform detections and atmospheric dispersion conditions in Central Europe in early September 2013. The full NPE2013 scenario will be presented at the NDC Workshop mid of May 2014.
NASA Astrophysics Data System (ADS)
Lutsch, E.; Conway, S. A.; Strong, K.; Jones, D. B. A.; Drummond, J. R.; Ortega, I.; Hannigan, J. W.; Makarova, M.; Notholt, J.; Blumenstock, T.; Sussmann, R.; Mahieu, E.; Kasai, Y.; Clerbaux, C.
2017-12-01
We present a multi-year time series of the total columns of carbon monoxide (CO), hydrogen cyanide (HCN) and ethane (C2H6) obtained by Fourier Transform Infrared (FTIR) spectrometer measurements at nine sites. Six are high-latitude sites: Eureka, Nunavut; Ny Alesund, Norway; Thule, Greenland; Kiruna, Sweden; Poker Flat, Alaska; and St. Petersburg, Russia. Three are mid-latitude sites: Zugspitze, Germany; Jungfraujoch, Switzerland; and Toronto, Ontario. For each site, the inter-annual trends and seasonal variabilities of the CO total column time series are accounted for, allowing ambient concentrations to be determined. Enhancements above ambient levels are then used to identify possible wildfire pollution events. Since the abundance of each trace gas species emitted in a wildfire event is specific to the type of vegetation burned and the burning phase, correlations of CO to the other long-lived wildfire tracers HCN and C2H6 allow for further confirmation of the detection of wildfire pollution. Back-trajectories from HYSPLIT and FLEXPART as well as fire detections from the Moderate Resolution Imaging Spectroradiometer (MODIS) allow the source regions of the detected enhancements to be determined, while satellite observations of CO from the Measurement of Pollution in the Troposphere (MOPITT) and Infrared Atmospheric Sounding Interferometer (IASI) instruments can be used to track the transport of the smoke plume. Differences in travel times between sites allow the ageing of biomass burning plumes to be determined, providing a means to infer the physical and chemical processes affecting the loss of each species during transport. Comparisons of ground-based FTIR measurements to GEOS-Chem chemical transport model results are used to investigate these processes, evaluate wildfire emission inventories and infer the influence of wildfire emissions on the Arctic.
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods were developed for the execution of genetically modified organism (GMO) labeling policies, of which an event-specific PCR detection method based on the flanking sequence of exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. The event-specific PCR primers and TaqMan probe were designed based upon the revealed 5' flanking sequence, and the qualitative and quantitative PCR assays were established employing these designed primers and probes. In qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA corresponding to 10 copies of haploid soybean genomic DNA. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were in-house validated by five researchers, and the validated results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivates.
Semi-supervised anomaly detection - towards model-independent searches of new physics
NASA Astrophysics Data System (ADS)
Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu
2012-06-01
Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is a lot more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
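The core idea, a background density fixed from background-only data plus one additional Gaussian fitted to the observations, can be sketched with a small EM loop in one dimension. The data are synthetic, the multi-start initialization is a pragmatic choice to avoid local optima, and none of this reproduces the paper's multivariate setup.

```python
# Sketch of the semi-supervised idea: fit a GMM to background-only data, then
# fit the observed sample as a mixture of that fixed background plus one extra
# Gaussian (fraction alpha, mean mu, width sigma) via EM. A clearly non-zero
# alpha in a narrow region indicates an anomalous excess.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
background_train = np.concatenate([rng.normal(0, 1, 4000), rng.normal(4, 1.5, 2000)])
observed = np.concatenate([rng.normal(0, 1, 2000), rng.normal(4, 1.5, 1000),
                           rng.normal(7.5, 0.3, 150)])          # hidden "signal" bump

bg = GaussianMixture(n_components=2, random_state=0).fit(background_train[:, None])
bg_dens = np.exp(bg.score_samples(observed[:, None]))           # fixed background density

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def fit_signal(x, bg_dens, mu0, n_iter=100):
    """EM for p(x) = alpha*N(x; mu, sigma) + (1 - alpha)*p_bg(x), background fixed."""
    alpha, mu, sigma = 0.05, mu0, np.std(x) / 4
    for _ in range(n_iter):
        sig = gauss(x, mu, sigma)
        resp = alpha * sig / (alpha * sig + (1 - alpha) * bg_dens)
        alpha = resp.mean()
        mu = np.sum(resp * x) / (resp.sum() + 1e-12)
        sigma = np.sqrt(np.sum(resp * (x - mu) ** 2) / (resp.sum() + 1e-12)) + 1e-6
    loglik = np.sum(np.log(alpha * gauss(x, mu, sigma) + (1 - alpha) * bg_dens))
    return loglik, alpha, mu, sigma

# Several starting locations guard against local optima; keep the best likelihood.
starts = np.quantile(observed, np.linspace(0.05, 0.99, 8))
best = max(fit_signal(observed, bg_dens, m) for m in starts)
print("best fit: loglik=%.1f  alpha=%.3f  mu=%.2f  sigma=%.2f" % best)
```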
DNS study on bursting and intermittency in late boundary layer transition
NASA Astrophysics Data System (ADS)
Wang, YiQian; Liu, ChaoQun
2017-11-01
Experimental and numerical investigations have suggested the existence of a strong correlation between the passage of coherent structures and events of bursting and intermittency. However, a detailed cause-and-effect study on the subject is rarely found in the literature due to the complexity and the nonlinear multiscale nature of turbulent flows. The primary goal of this research is to explore the motion and evolution of coherent structures during late transition, whose structure is much more ordered than that of fully developed turbulence, and their relationship with events of bursting and intermittency based on a verified high-order direct numerical simulation (DNS). The computation was carried out on a flat plate at Reynolds number 1000 (based on the inflow displacement thickness) with an inflow Mach number 0.5. It is concluded that bursting and intermittency detected by stationary sensors in a transitional boundary layer actually result from the passage and development of vortical structures, and it would be more rational to design transitional turbulence models based on modelling the moving vortical structures rather than the statistical features and experimental experiences.
Single- and Dual-Process Models of Biased Contingency Detection
2016-01-01
Abstract. Decades of research in causal and contingency learning show that people’s estimations of the degree of contingency between two events are easily biased by the relative probabilities of those two events. If two events co-occur frequently, then people tend to overestimate the strength of the contingency between them. Traditionally, these biases have been explained in terms of relatively simple single-process models of learning and reasoning. However, more recently some authors have found that these biases do not appear in all dependent variables and have proposed dual-process models to explain these dissociations between variables. In the present paper we review the evidence for dissociations supporting dual-process models and we point out important shortcomings of this literature. Some dissociations seem to be difficult to replicate or poorly generalizable and others can be attributed to methodological artifacts. Overall, we conclude that support for dual-process models of biased contingency detection is scarce and inconclusive. PMID:27025532
An integrated framework for detecting suspicious behaviors in video surveillance
NASA Astrophysics Data System (ADS)
Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi
2014-03-01
In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems installed in public places such as railway stations, airports, and shopping malls. In particular, suspicious loitering, unattended objects left behind, and the exchange of suspicious objects between persons are common security concerns in airports and other transit scenarios. Detecting them involves understanding the scene and events, analyzing human movements, recognizing controllable objects, and observing the effect of human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method, and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple-background modeling technique to detect moving objects. Velocity and distance measures are then computed as the high-level motion features of interest. By integrating the computed features with the first-passage-time probabilities of the embedded Markov chain, suspicious behaviors in video surveillance are analyzed to detect loitering persons, objects left behind, and human interactions such as fighting. The proposed framework has been tested using standard public datasets and our own video surveillance scenarios.
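The first-passage-time ingredient can be illustrated on a small embedded Markov chain over scene zones: a trajectory whose empirical chain gives a low probability of reaching the exit zone within a typical number of steps is a loitering candidate. The zones and transition matrix below are illustrative only, not the framework's actual model.

```python
# Sketch: first-passage probabilities from an embedded Markov chain over scene
# zones, as a building block for loitering detection.
import numpy as np

# States: 0=entrance, 1=platform, 2=kiosk area, 3=exit (target)
P = np.array([[0.2, 0.5, 0.2, 0.1],
              [0.1, 0.4, 0.3, 0.2],
              [0.1, 0.5, 0.3, 0.1],
              [0.0, 0.0, 0.0, 1.0]])

def first_passage_within(P, start, target, n_steps):
    """P(first visit to `target` happens within n_steps, starting from `start`)."""
    Q = P.copy()
    Q[target] = 0.0
    Q[target, target] = 1.0          # make the target absorbing
    dist = np.zeros(len(P)); dist[start] = 1.0
    for _ in range(n_steps):
        dist = dist @ Q
    return dist[target]

for zone, name in [(0, "entrance"), (2, "kiosk area")]:
    p = first_passage_within(P, zone, target=3, n_steps=10)
    print(f"P(reach exit within 10 steps | start={name}) = {p:.2f}")
# A track whose estimated chain yields a much lower value is flagged as loitering.
```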
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, J.; Park, H.; Han, C.
2015-05-01
We reanalyze microlensing events in the published list of anomalous events that were observed from the Optical Gravitational Lensing Experiment (OGLE) lensing survey conducted during the 2004–2008 period. In order to check the existence of possible degenerate solutions and extract extra information, we conduct analyses based on combined data from other survey and follow-up observations and consider higher-order effects. Among the analyzed events, we present analyses of eight events for which either new solutions are identified or additional information is obtained. We find that five events previously interpreted as binary-source events are better interpreted by binary-lens models. These events include OGLE-2006-BLG-238, OGLE-2007-BLG-159, OGLE-2007-BLG-491, OGLE-2008-BLG-143, and OGLE-2008-BLG-210. With additional data covering caustic crossings, we detect finite-source effects for six events including OGLE-2006-BLG-215, OGLE-2006-BLG-238, OGLE-2006-BLG-450, OGLE-2008-BLG-143, OGLE-2008-BLG-210, and OGLE-2008-BLG-513. Among them, we are able to measure the Einstein radii of three events for which multi-band data are available. These events are OGLE-2006-BLG-238, OGLE-2008-BLG-210, and OGLE-2008-BLG-513. For OGLE-2008-BLG-143, we detect higher-order effects induced by the changes of the observer’s position caused by the orbital motion of the Earth around the Sun. In addition, we present degenerate solutions resulting from the known close/wide or ecliptic degeneracy. Finally, we note that the masses of the binary companions of the lenses of OGLE-2006-BLG-450 and OGLE-2008-BLG-210 are in the brown-dwarf regime.
Guzmán Ruiz, Óscar; Pérez Lázaro, Juan José; Ruiz López, Pedro
To characterise the performance of the triggers used in the detection of adverse events (AE) in hospitalised adult patients and to define a simplified panel of triggers to facilitate the detection of AE. Cross-sectional study of charts of patients from an internal medicine service to detect AE through systematic review of the charts and identification of triggers (clinical events often related to AE), determining whether an AE occurred in the context in which the trigger appeared. Once an AE was detected, the triggers that detected it were characterised. Logistic regression was applied to select the triggers with the greatest AE detection capability. A total of 291 charts were reviewed, with a total of 562 triggers in 103 patients, of which 163 were involved in detecting an AE. The triggers that detected the most AE were "A.1. Pressure ulcer" (9.82%), "B.5. Laxative or enema" (8.59%), "A.8. Agitation" (8.59%), "A.9. Over-sedation" (7.98%), "A.7. Haemorrhage" (6.75%) and "B.4. Antipsychotic" (6.75%). A simplified model was obtained using logistic regression, and included the variable "Number of drugs" and the triggers "Over-sedation", "Urinary catheterisation", "Readmission in 30 days", "Laxative or enema" and "Abrupt medication stop". This model showed a probability of 81% of correctly classifying charts with or without AE (p <0.001; 95% confidence interval: 0.763-0.871). A high number of triggers were associated with AE. The simplified model is capable of detecting a large number of AE with a minimum of elements. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
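As a rough illustration of how such a simplified trigger panel can be fitted, the sketch below trains a logistic regression on chart-level indicator variables. The data are synthetic and the variable list merely mirrors the triggers named above; the coefficients and probabilities it produces have no relation to the study's results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_charts = 300

# Synthetic chart-level predictors mirroring the simplified panel (0/1 trigger flags
# plus number of drugs); the real study fitted the model to the reviewed charts.
X = np.column_stack([
    rng.integers(0, 2, n_charts),    # over-sedation
    rng.integers(0, 2, n_charts),    # urinary catheterisation
    rng.integers(0, 2, n_charts),    # readmission within 30 days
    rng.integers(0, 2, n_charts),    # laxative or enema
    rng.integers(0, 2, n_charts),    # abrupt medication stop
    rng.integers(0, 15, n_charts),   # number of drugs
])
# Synthetic adverse-event labels; in the study these come from chart review.
y = (X[:, 0] + X[:, 3] + 0.2 * X[:, 5] + rng.normal(0, 1, n_charts) > 2).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients:", model.coef_.round(2))
print("P(AE) for the first chart:", model.predict_proba(X[:1])[0, 1].round(3))
```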
Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B
2014-10-15
Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart failure events. Copyright © 2014 Elsevier Inc. All rights reserved.
Toutouzas, Konstantinos; Benetos, Georgios; Koutagiar, Iosif; Barampoutis, Nikolaos; Mitropoulou, Fotini; Davlouros, Periklis; Sfikakis, Petros P; Alexopoulos, Dimitrios; Stefanadis, Christodoulos; Siores, Elias; Tousoulis, Dimitris
2017-07-01
Limited prospective data have been reported regarding the impact of carotid inflammation on cardiovascular events in patients with coronary artery disease (CAD). Microwave radiometry (MWR) is a noninvasive, simple method that has been used for evaluation of carotid artery temperature which, when increased, predicts 'inflamed' plaques with vulnerable characteristics. We prospectively tested the hypothesis that increased carotid artery temperature predicts future cerebro- and cardiovascular events in patients with CAD. Consecutive patients from 3 centers, with documented CAD by coronary angiography, were studied. In both carotid arteries, common carotid intima-media thickness and plaque thickness were evaluated by ultrasound. Temperature difference (ΔT), measured by MWR, was defined as the maximal temperature along the carotid artery minus the minimum; ΔT ≥0.90 °C was assigned as high. Major cardiovascular events (MACE: death, stroke, myocardial infarction or revascularization) were recorded during the following year. In total, 250 patients were studied; of them 40 patients (16%) had high ΔT values in both carotid arteries. MACEs occurred in 30% of patients having bilateral high ΔT versus 3.8% in the remaining patients (p<0.001). Bilateral high ΔT was independently associated with an increased one-year MACE rate (HR = 6.32, 95% CI 2.42-16.53, p<0.001, by a multivariate Cox regression hazards model). The addition of ΔT information to a baseline model based on cardiovascular risk factors and extent of CAD significantly increased the prognostic value of the model (c-statistic increase from 0.744 to 0.845, p dif = 0.05). CONCLUSIONS: Carotid inflammation, detected by MWR, has an incremental prognostic value in patients with documented CAD. Copyright © 2017 Elsevier B.V. All rights reserved.
Seismological investigation of the National Data Centre Preparedness Exercise 2013
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Hartmann, Gernot; Ross, J. Ole; Ceranna, Lars
2015-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions conducted on Earth - underground, underwater or in the atmosphere. The verification regime of the CTBT is designed to detect any treaty violation. While the data of the International Monitoring System (IMS) are collected, processed and technically analyzed at the International Data Centre (IDC) of the CTBT-Organization, National Data Centres (NDC) of the member states provide interpretation and advice to their governments concerning suspicious detections. The NDC Preparedness Exercises (NPE) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies. These exercises help to evaluate, for example, the effectiveness of analysis procedures applied at NDCs and the quality, completeness and usefulness of IDC products. The exercise trigger of NPE2013 is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" at the coast of the North Sea in Central Europe. The potential connection between the waveform and radionuclide evidence remains unclear for exercise participants. The verification task was to identify the waveform event and to investigate potential sources of the radionuclide findings. The final question was whether the findings are CTBT relevant and justify a request for an On-Site Inspection (OSI) in "Frisia". The seismic event was not included in the Reviewed Event Bulletin (REB) of the IDC. The available detections from the closest seismic IMS stations lead to an epicenter accuracy of about 24 km, which is not sufficient to specify the 1000 km² inspection area in case of an OSI. With use of data from local stations and adjusted velocity models the epicenter accuracy could be improved to less than 2 km, which demonstrates the crucial role of national technical means for verification tasks. The seismic NPE2013 event could be identified as induced by natural gas production in the source region. Similar waveforms and spectral characteristics comparable to those of a set of events in the same region are clear indications. The scenario of a possible treaty violation at the location of the seismic NPE2013 event could thus be disproved.
Final Technical Report. Project Boeing SGS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bell, Thomas E.
Boeing and its partner, PJM Interconnection, teamed to bring advanced “defense-grade” technologies for cyber security to the US regional power grid through demonstration in PJM’s energy management environment. Under this cooperative project with the Department of Energy, Boeing and PJM have developed and demonstrated a host of technologies specifically tailored to the needs of PJM and the electric sector as a whole. The team has demonstrated to the energy industry a combination of processes, techniques and technologies that have been successfully implemented in the commercial, defense, and intelligence communities to identify, mitigate and continuously monitor the cyber security of critical systems. Guided by the results of a Cyber Security Risk-Based Assessment completed in Phase I, the Boeing-PJM team has completed multiple iterations through the Phase II Development and Phase III Deployment phases. Multiple cyber security solutions have been completed across a variety of controls including: Application Security, Enhanced Malware Detection, Security Incident and Event Management (SIEM) Optimization, Continuous Vulnerability Monitoring, SCADA Monitoring/Intrusion Detection, Operational Resiliency, Cyber Range simulations and hands-on cyber security personnel training. All of the developed and demonstrated solutions are suitable for replication across the electric sector and/or the energy sector as a whole. Benefits identified include: improved malware and intrusion detection capability on critical SCADA networks, including behavioral-based alerts, resulting in improved zero-day threat protection; improved Security Incident and Event Management resulting in better threat visibility, thus increasing the likelihood of detecting a serious event; improved malware detection and zero-day threat response capability; improved ability to systematically evaluate and secure in-house and vendor-sourced software applications; improved ability to continuously monitor and maintain secure configuration of network devices, resulting in reduced vulnerabilities for potential exploitation; improved overall cyber security situational awareness through the integration of multiple discrete security technologies into a single cyber security reporting console; improved ability to maintain the resiliency of critical systems in the face of a targeted cyber attack or other significant event; and improved ability to model complex networks for penetration testing and advanced training of cyber security personnel.
NASA Astrophysics Data System (ADS)
Sobolev, Yu. G.; Penionzhkevich, Yu. E.; Borcea, C.; Demekhina, N. A.; Eshanov, A. G.; Ivanov, M. P.; Kabdrakhimova, G. D.; Kabyshev, A. M.; Kugler, A.; Kuterbekov, K. A.; Lukyanov, K. V.; Maj, A.; Maslov, V. A.; Negret, A.; Skobelev, N. K.; Testov, D.; Trzaska, W. H.; Voskobojnik, E. I.; Zemlyanaya, E. V.
2015-06-01
Total reaction cross section excitation functions σR(E) were measured for 6He secondary beam particles on 181Ta, 59Co, natSi and 9Be targets in a wide energy range by a direct and model-independent method. This experimental method was based on the prompt n-γ 4π technique applied in event-by-event mode. A high-efficiency CsI(Tl) γ-spectrometer was used for the detection of reaction products (prompt γ-quanta and neutrons) accompanying each reaction event. Using the ACCULINNA fragment separator, the 6He fragments (produced by an 11B primary beam on a 9Be target) were separated and transported to the n-γ shielded experimental cave at FLNR JINR. The measured total reaction cross section data σR(E) for the above-mentioned reactions are compared with a theoretical calculation based on the optical potential with the real part having the double-folding form.
Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events
NASA Technical Reports Server (NTRS)
Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix
2005-01-01
To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid response observations from high resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate and image phenomena such as wildfires, volcanoes, floods and ice breakup with high-resolution instruments. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based artificial-intelligence components.
NASA Astrophysics Data System (ADS)
Guo, Jingnan; Zeitlin, Cary; Wimmer-Schweingruber, Robert F.; McDole, Thoren; Kühl, Patrick; Appel, Jan C.; Matthiä, Daniel; Krauss, Johannes; Köhler, Jan
2018-01-01
For future human missions to Mars, it is important to study the surface radiation environment during extreme and elevated conditions. In the long term, it is mainly galactic cosmic rays (GCRs) modulated by solar activity that contribute to the radiation on the surface of Mars, but intense solar energetic particle (SEP) events may induce acute health effects. Such events may enhance the radiation level significantly and should be detected as immediately as possible to prevent severe damage to humans and equipment. However, the energetic particle environment on the Martian surface is significantly different from that in deep space due to the influence of the Martian atmosphere. Depending on the intensity and shape of the original solar particle spectra, as well as particle types, the surface spectra may induce entirely different radiation effects. In order to give immediate and accurate alerts while avoiding unnecessary ones, it is important to model and well understand the atmospheric effect on the incoming SEPs, including both protons and helium ions. In this paper, we have developed a generalized approach to quickly model the surface response of any given incoming proton/helium ion spectrum and have applied it to a set of historical large solar events, thus providing insights into the possible variety of surface radiation environments that may be induced during SEP events. Based on the statistical study of more than 30 significant solar events, we have obtained an empirical model for estimating the surface dose rate directly from the intensities of a power-law SEP spectrum.
Continuous robust sound event classification using time-frequency features and deep learning.
McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478
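The energy-based event detection front end mentioned above is not specified in detail in the abstract; the following sketch shows a generic version of the idea, segmenting a continuous recording by thresholding per-frame log energy relative to the loudest frame. Frame size, hop, and threshold are illustrative assumptions, and the Bayesian-inspired front end proposed in the paper is not reproduced here.

```python
import numpy as np

def energy_segments(x, fs, frame_ms=25.0, hop_ms=10.0, rel_threshold_db=-35.0):
    """Split a continuous recording into candidate sound events by framing the
    signal and keeping frames whose log energy is within rel_threshold_db of the
    loudest frame.  Returns (start_sample, end_sample) pairs."""
    frame = int(fs * frame_ms / 1000)
    hop = int(fs * hop_ms / 1000)
    n_frames = 1 + max(0, (len(x) - frame) // hop)
    energy_db = np.array([
        10.0 * np.log10(np.mean(x[i * hop:i * hop + frame] ** 2) + 1e-12)
        for i in range(n_frames)
    ])
    active = energy_db > energy_db.max() + rel_threshold_db
    segments, start = [], None
    for i, flag in enumerate(np.append(active, False)):   # sentinel closes the last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start * hop, (i - 1) * hop + frame))
            start = None
    return segments

# Hypothetical recording: silence, a burst of "sound", then silence again (fs = 16 kHz)
rng = np.random.default_rng(0)
x = np.concatenate([1e-4 * rng.normal(size=8000),
                    0.5 * rng.normal(size=4000),
                    1e-4 * rng.normal(size=8000)])
print(energy_segments(x, fs=16000))
```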
NASA Astrophysics Data System (ADS)
Nakamura, Takashi; Ando, Masaki; Kinugawa, Tomoya; Nakano, Hiroyuki; Eda, Kazunari; Sato, Shuichi; Musha, Mitsuru; Akutsu, Tomotada; Tanaka, Takahiro; Seto, Naoki; Kanda, Nobuyuki; Itoh, Yousuke
2016-09-01
Pre-DECIGO (DECihertz laser Interferometer Gravitational wave Observatory) consists of three spacecraft arranged in an equilateral triangle with 100 km arm lengths orbiting 2000 km above the surface of the earth. It is hoped that the launch date will be in the late 2020s. Pre-DECIGO has one clear target: binary black holes (BBHs) like GW150914 and GW151226. Pre-DECIGO can detect ~30 M⊙-30 M⊙ BBH mergers like GW150914 up to redshift z~30. The cumulative event rate is ~1.8×10 events yr⁻¹ in the Pop III origin model of BBHs like GW150914, and it saturates at z~10, while in the primordial BBH (PBBH) model, the cumulative event rate is ~3×10 events yr⁻¹ at z=30 even if only 0.1% of the dark matter consists of PBHs, and it is still increasing at z=30. In the Pop I/II model of GW150914-like BBHs, the cumulative event rate is (3-10)×10 events yr⁻¹ and it saturates at z~6. We present the requirements on orbit accuracy, drag-free techniques, laser power, frequency stability, and interferometer test mass. For BBHs like GW150914 at 1 Gpc (z~0.2), SNR~90 is achieved with the definition of Pre-DECIGO in the 0.01-100 Hz band. Since for z≫1 the characteristic strain amplitude h for a fixed frequency band weakly depends on z as z, ~10% of BBHs near face-on have SNR > 5 (7) even at z~30 (10). Pre-DECIGO can measure the mass spectrum and the z-dependence of the merger rate to distinguish various models of BBHs like GW150914, such as Pop III BBH, Pop II BBH, and PBBH scenarios. Pre-DECIGO can also predict the direction of BBHs at z=0.1 with an accuracy of ~0.3 deg and a merging time accuracy of ~1 s at about a day before the merger so that ground-based GW detectors further developed at that time as well as electromagnetic follow-up observations can prepare for the detection of merger in advance, like a solar eclipse. For intermediate mass BBHs such as ~640 M⊙-640 M⊙ at a large redshift z>10, the quasinormal mode frequency after the merger can be within the Pre-DECIGO band so that the ringing tail can also be detectable to confirm the Einstein theory of general relativity with SNR~35.
Reasoning over genetic variance information in cause-and-effect models of neurodegenerative diseases
Naz, Mufassra; Kodamullil, Alpha Tom
2016-01-01
The work we present here is based on the recent extension of the syntax of the Biological Expression Language (BEL), which now allows for the representation of genetic variation information in cause-and-effect models. In our article, we describe how genetic variation information can be used to identify candidate disease mechanisms in diseases with complex aetiology such as Alzheimer’s disease and Parkinson’s disease. In those diseases, we have to assume that many genetic variants contribute moderately to the overall dysregulation that, in the case of neurodegenerative diseases, has such a long incubation time until the first clinical symptoms are detectable. Owing to the multilevel nature of dysregulation events, systems biomedicine modelling approaches need to combine mechanistic information from various levels, including gene expression, microRNA (miRNA) expression, protein–protein interaction, genetic variation and pathway. OpenBEL, the open source version of BEL, has recently been extended to match this requirement, and we demonstrate in our article how candidate mechanisms for early dysregulation events in Alzheimer’s disease can be identified based on an integrative mining approach that identifies ‘chains of causation’ that include single nucleotide polymorphism information in BEL models. PMID:26249223
Bogani, Patrizia; Spiriti, Maria Michela; Lazzarano, Stefano; Arcangeli, Annarosa; Buiatti, Marcello; Minunni, Maria
2011-11-01
The World Anti-Doping Agency fears the use of gene doping to enhance athletic performances. Thus, a bioanalytical approach based on end-point PCR for detecting markers of transgenesis traceability was developed. A few sequences from two different vectors were selected using an animal model and traced in different tissues and at different times. In particular, the enhanced green fluorescent protein gene and a construct-specific new marker were targeted in the analysis. To make the developed detection approach open to future routine doping analysis, matrices such as urine and tears, as well as blood, were also tested. This study will have an impact on evaluating the traceability of vector transgenes for the detection of a gene doping event by non-invasive sampling.
Biolayer modeling and optimization for the SPARROW biosensor
NASA Astrophysics Data System (ADS)
Feng, Ke
2007-12-01
Biosensor direct detection of molecular binding events is of significant interest in applications from molecular screening for cancer drug design to bioagent detection for homeland security and defense. The Stacked Planar Affinity Regulated Resonant Optical Waveguide (SPARROW) structure based on coupled waveguides was recently developed to achieve increased sensitivity within a fieldable biosensor device configuration. Under ideal operating conditions, modification of the effective propagation constant of the structure's sensing waveguide through selective attachment of specific targets to probes on the waveguide surface results in a change in the coupling characteristics of the guide over a specifically designed interaction length with the analyte. Monitoring the relative power in each waveguide after interaction enables 'recognition' of those targets which have selectively bound to the surface. However, fabrication tolerances, waveguide interface roughness, biolayer surface roughness and biolayer partial coverage have an effect on biosensor behavior and achievable limit of detection (LOD). In addition to these influences which play a role in device optimization, the influence of the spatially random surface loading of molecular binding events has to be considered, especially for low surface coverage. In this dissertation an analytic model is established for the SPARROW biosensor which accounts for these nonidealities with which the design of the biosensor can be guided and optimized. For the idealized case of uniform waveguide transducer layers and biolayer, both theoretical simulation (analytical expression) and computer simulation (numerical calculation) are completed. For the nonideal case of an inhomogeneous transducer with nonideal waveguide and biolayer surfaces, device output power is affected by such physical influences as surface scattering, coupling length, absorption, and percent coverage of binding events. Using grating and perturbation techniques we explore the influence of imperfect surfaces and random surface loading on scattering loss and coupling length. Results provide a range of achievable limits of detection in the SPARROW device for a given target size, surface loading, and detectable optical power.
NASA Astrophysics Data System (ADS)
Pötzi, W.; Veronig, A. M.; Temmer, M.
2018-06-01
In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017. All data back to 2014 were reprocessed using the new algorithm. In order to evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. In order to overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method. We divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The hit rate for the new algorithm reached 96% (just five events were missed), with a false-alarm ratio of 17%. This is a significant improvement of the algorithm, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7 ± 2.9 minutes to 1.3 ± 2.3 minutes. The flare start times, which had been systematically late by about 3 minutes as determined by the original algorithm, now match visual inspection within -0.47 ± 4.10 minutes.
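The verification measures quoted here (hit rate, false-alarm ratio, true skill score, Heidke skill score) all derive from a 2x2 contingency table of detected versus observed events. The sketch below computes them with the standard formulas; the counts passed in are illustrative only, since the abstract reports rates rather than the full table.

```python
def verification_scores(hits, false_alarms, misses, correct_negatives):
    """Scores derived from a 2x2 contingency table of detections vs. observed events."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                       # hit rate / probability of detection
    far = b / (a + b)                       # false-alarm ratio
    pofd = b / (b + d)                      # probability of false detection
    tss = pod - pofd                        # true skill score (Hanssen-Kuipers)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))  # Heidke skill score
    return {"POD": pod, "FAR": far, "TSS": tss, "HSS": hss}

# Illustrative counts only; the abstract does not give the full contingency table.
print(verification_scores(hits=124, false_alarms=25, misses=5, correct_negatives=175))
```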
NASA Astrophysics Data System (ADS)
Mróz, Przemek; Poleski, Radosław
2018-04-01
We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of a young and old stellar population in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with the total stellar mass of the SMC of about 1.0×10⁹ M⊙ under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised.
Phenology satellite experiment
NASA Technical Reports Server (NTRS)
Dethier, B. E. (Principal Investigator)
1973-01-01
The author has identified the following significant results. The detection of a phenological event (the Brown Wave, i.e., vegetation senescence) for specific forest and crop types using ERTS-1 imagery is described. Data handling techniques including computer analysis and photointerpretation procedures are explained. Computer analysis of multispectral scanner digital tapes in all bands was used to give the relative changes of spectral reflectance with time of forests and specified crops. These data were obtained for a number of the twenty-four sites located within four north-south corridors across the United States. Analysis of ground observation photography and ERTS-1 imagery for sites in the Appalachian Corridor and Mississippi Valley Corridor indicates that the recession of vegetation development can be detected very well. Tentative conclusions are that specific phenological events such as crop maturity or leaf fall can be mapped for specific sites and possibly for different regions. Preliminary analysis based on a number of samples in mixed deciduous hardwood stands indicates that as senescence proceeds both the rate of change and differences in color among species can be detected. The results to date show the feasibility of the development and refinement of phenoclimatic models.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-10-15
To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHI(auto)) was calculated using both signals, and a respiratory disturbance index (RDI(auto)) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter-derived pulse wave signal and compared with manual scoring. Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m²) were included in the analysis. AHI(manual) (19.4 ± 18.5 events/h) correlated highly significantly with AHI(auto) (19.9 ± 16.5 events/h) and RDI(auto) (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001), with a mean difference of -0.5 ± 6.6 and -1.0 ± 6.1 events/h. The automatic analysis of AHI(auto) and RDI(auto) detected sleep apnea (cutoff AHI(manual) ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manual scoring (r = 0.87 and 0.95, p < 0.001), with mean differences of -4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep.
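The apnea-hypopnea index used throughout this abstract is simply the number of scored respiratory events per hour of recording. The sketch below computes it and maps it to the conventional clinical severity bands (5/15/30 events/h); those thresholds are general conventions and the example numbers are not taken from this study.

```python
def apnea_hypopnea_index(n_events, recording_hours):
    """AHI: number of scored apnea/hypopnea events per hour of recording."""
    return n_events / recording_hours

def severity_band(ahi):
    """Conventional clinical AHI bands (not values from this study)."""
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"

ahi = apnea_hypopnea_index(n_events=139, recording_hours=7.2)
print(round(ahi, 1), severity_band(ahi))   # 19.3 moderate
```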
New Fermi-LAT event reconstruction reveals more high-energy gamma rays from gamma-ray bursts
Atwood, W. B.; Baldini, L.; Bregeon, J.; ...
2013-08-19
Here, based on the experience gained during the four and a half years of the mission, the Fermi-LAT Collaboration has undertaken a comprehensive revision of the event-level analysis going under the name of Pass 8. Although it is not yet finalized, we can test the improvements in the new event reconstruction with the special case of the prompt phase of bright gamma-ray bursts (GRBs), where the signal-to-noise ratio is large enough that loose selection cuts are sufficient to identify gamma rays associated with the source. Using the new event reconstruction, we have re-analyzed 10 GRBs previously detected by the Large Area Telescope (LAT) for which an X-ray/optical follow-up was possible and found four new gamma rays with energies greater than 10 GeV in addition to the seven previously known. Among these four is a 27.4 GeV gamma ray from GRB 080916C, which has a redshift of 4.35, thus making it the gamma ray with the highest intrinsic energy (~147 GeV) detected from a GRB. We present here the salient aspects of the new event reconstruction and discuss the scientific implications of these new high-energy gamma rays, such as constraining extragalactic background light models, Lorentz invariance violation tests, the prompt emission mechanism, and the bulk Lorentz factor of the emitting region.
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2006-01-01
Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…
Assessing the severity of sleep apnea syndrome based on ballistocardiogram
Zhou, Xingshe; Zhao, Weichao; Liu, Fan; Ni, Hongbo; Yu, Zhiwen
2017-01-01
Background: Sleep Apnea Syndrome (SAS) is a common sleep-related breathing disorder, which affects about 4-7% of males and 2-4% of females worldwide. Different approaches have been adopted to diagnose SAS and measure its severity, including the gold standard polysomnography (PSG) in the sleep study field as well as several alternative techniques such as single-channel ECG and pulse oximetry. However, many shortcomings still limit their generalization to the home environment. In this study, we aim to propose an efficient approach to automatically assess the severity of sleep apnea syndrome based on the ballistocardiogram (BCG) signal, which is non-intrusive and suitable for the home environment. Methods: We develop an unobtrusive sleep monitoring system to capture the BCG signals, based on which we put forward a three-stage sleep apnea syndrome severity assessment framework, i.e., data preprocessing, sleep-related breathing event (SBE) detection, and sleep apnea syndrome severity evaluation. First, in the data preprocessing stage, to overcome the limits of BCG signals (e.g., low precision and reliability), we utilize wavelet decomposition to obtain the outline information of heartbeats, and apply an RR correction algorithm to handle missing or spurious RR intervals. Afterwards, in the event detection stage, we propose an automatic sleep-related breathing event detection algorithm named Physio_ICSS based on the iterative cumulative sums of squares (i.e., the ICSS algorithm), which was originally used to detect structural breakpoints in a time series. In particular, to efficiently detect sleep-related breathing events in the obtained time series of RR intervals, the proposed algorithm not only explores the practical factors of sleep-related breathing events (e.g., the limit of lasting duration and possible occurrence sleep stages) but also overcomes the event segmentation issue of existing approaches (e.g., an equal-length segmentation method might divide one sleep-related breathing event into different fragments and lead to incorrect results). Finally, by fusing features extracted from multiple domains, we can identify sleep-related breathing events and assess the severity level of sleep apnea syndrome effectively. Conclusions: Experimental results on 136 individuals of different sleep apnea syndrome severities validate the effectiveness of the proposed framework, with an accuracy of 94.12% (128/136). PMID:28445548
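The full Physio_ICSS algorithm, with its physiological constraints on event duration and sleep stage, cannot be reproduced from the abstract alone. The sketch below shows only its statistical core: the centered cumulative-sums-of-squares statistic of Inclán and Tiao applied to an RR-interval series, with a single (non-iterated) breakpoint test. The critical value of about 1.36 is the commonly quoted asymptotic 95% level, and the RR data are synthetic.

```python
import numpy as np

def centered_cusum_of_squares(x):
    """D_k statistic of Inclan & Tiao's ICSS test for variance change points."""
    x = np.asarray(x, dtype=float)
    c = np.cumsum(x ** 2)
    k = np.arange(1, len(x) + 1)
    return c / c[-1] - k / len(x)

def detect_variance_breakpoint(x, crit=1.36):
    """Index of the most likely variance change point, or None if the test
    statistic sqrt(T/2)*max|D_k| stays below the (approximate) 95% critical value."""
    d = centered_cusum_of_squares(x - np.mean(x))
    stat = np.sqrt(len(x) / 2.0) * np.abs(d)
    k = int(np.argmax(stat))
    return k if stat[k] > crit else None

# Hypothetical RR-interval series (seconds): stable breathing followed by a disturbed segment
rng = np.random.default_rng(1)
rr = np.concatenate([rng.normal(0.85, 0.02, 200), rng.normal(0.85, 0.10, 60)])
print(detect_variance_breakpoint(rr))
```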
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.
Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric
2016-09-01
The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
Video-tracker trajectory analysis: who meets whom, when and where
NASA Astrophysics Data System (ADS)
Jäger, U.; Willersinn, D.
2010-04-01
Unveiling unusual or hostile events by observing manifold moving persons in a crowd is a challenging task for human operators, especially when sitting in front of monitor walls for hours. Typically, hostile events are rare. Thus, due to tiredness and negligence the operator may miss important events. In such situations, an automatic alarming system is able to support the human operator. The system incorporates a processing chain consisting of (1) people tracking, (2) event detection, (3) data retrieval, and (4) display of relevant video sequences overlaid by highlighted regions of interest. In this paper we focus on the event detection stage of the processing chain mentioned above. In our case, the selected event of interest is the encounter of people. Although based on a rather simple trajectory analysis, this kind of event embodies great practical importance because it paves the way to answering the question "who meets whom, when and where". This, in turn, forms the basis for detecting potential situations where, e.g., money, weapons, or drugs are handed over from one person to another in crowded environments like railway stations, airports, or busy streets and places. The input to the trajectory analysis comes from a multi-object video-based tracking system developed at IOSB which is able to track multiple individuals within a crowd in real time [1]. From this we calculate the inter-distances between all persons on a frame-to-frame basis. We use a sequence of simple rules based on the individuals' kinematics to detect the event mentioned above and to output the frame number, the persons' IDs from the tracker, and the pixel coordinates of the meeting position. Using this information, a data retrieval system may extract the corresponding part of the recorded video image sequence and finally allows for replaying the selected video clip with a highlighted region of interest to attract the operator's attention for further visual inspection.
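The paper's kinematic rule set is not given in the abstract, but the core of such an encounter detector can be sketched as a distance-and-duration test over pairs of tracks. The thresholds and the toy trajectories below are assumptions for illustration only.

```python
import numpy as np

def detect_encounters(tracks, dist_thresh=1.0, min_frames=25):
    """tracks: dict person_id -> (n_frames, 2) array of positions on a common
    frame grid (NaN rows where the person is not visible).  Returns tuples
    (id_a, id_b, first_frame, last_frame) for pairs that stay closer than
    dist_thresh for at least min_frames consecutive frames."""
    events = []
    ids = sorted(tracks)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            close = np.linalg.norm(tracks[a] - tracks[b], axis=1) < dist_thresh
            start = None
            for f, flag in enumerate(np.append(close, False)):  # sentinel closes open runs
                if flag and start is None:
                    start = f
                elif not flag and start is not None:
                    if f - start >= min_frames:
                        events.append((a, b, start, f - 1))
                    start = None
    return events

# Two hypothetical tracks that walk toward each other and stay close for ~20 frames
t = np.linspace(0, 1, 100)[:, None]
tracks = {1: np.hstack([t * 10, np.zeros((100, 1))]),
          2: np.hstack([10 - t * 10, np.zeros((100, 1))])}
print(detect_encounters(tracks, dist_thresh=2.0, min_frames=10))
```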
Detectability of galactic supernova neutrinos coherently scattered on xenon nuclei in XMASS
NASA Astrophysics Data System (ADS)
Abe, K.; Hiraide, K.; Ichimura, K.; Kishimoto, Y.; Kobayashi, K.; Kobayashi, M.; Moriyama, S.; Nakagawa, K.; Nakahata, M.; Norita, T.; Ogawa, H.; Sekiya, H.; Takachio, O.; Takeda, A.; Yamashita, M.; Yang, B. S.; Kim, N. Y.; Kim, Y. D.; Tasaka, S.; Liu, J.; Martens, K.; Suzuki, Y.; Fujita, R.; Hosokawa, K.; Miuchi, K.; Oka, N.; Onishi, Y.; Takeuchi, Y.; Kim, Y. H.; Lee, J. S.; Lee, K. B.; Lee, M. K.; Fukuda, Y.; Itow, Y.; Kegasa, R.; Kobayashi, K.; Masuda, K.; Takiya, H.; Uchida, H.; Nishijima, K.; Fujii, K.; Murayama, I.; Nakamura, S.; Xmass Collaboration
2017-03-01
Coherent elastic neutrino-nucleus scattering (CEvNS) plays a crucial role in the final evolution of stars. Its detection would be of importance in astroparticle physics. Among all available neutrino sources, galactic supernovae give the highest neutrino flux in the MeV range. Among all liquid xenon dark matter experiments, XMASS has the largest sensitive volume and light yield. The possibility to detect a galactic supernova via the CEvNS process on xenon nuclei in the current XMASS detector was investigated. The total number of events integrated over about 18 s after the explosion of a supernova 10 kpc away from the Earth was expected to be from 3.5 to 21.1, depending on the supernova model used to predict the neutrino flux, while the number of background events in the same time window was measured to be negligible. All of this leads to a very high possibility of detecting CEvNS experimentally for the first time using the combination of galactic supernovae and the XMASS detector. In case of a supernova explosion as close as Betelgeuse, the total number of observable events can be more than ∼10⁴, making it possible to distinguish different supernova models by examining the evolution of the neutrino event rate in XMASS.
Managed traffic evacuation using distributed sensor processing
NASA Astrophysics Data System (ADS)
Ramuhalli, Pradeep; Biswas, Subir
2005-05-01
This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.
The Geostationary Lightning Mapper: Its Performance and Calibration
NASA Astrophysics Data System (ADS)
Christian, H. J., Jr.
2015-12-01
The Geostationary Lightning Mapper (GLM) has been developed to be an operational instrument on the GOES-R series of spacecraft. The GLM is a unique instrument, unlike other meteorological instruments, both in how it operates and in the information content that it provides. Instrumentally, it is an event detector, rather than an imager. While processing almost a billion pixels per second with 14 bits of resolution, the event detection process reduces the required telemetry bandwidth by almost a factor of 10⁵, thus keeping the telemetry requirements modest and enabling efficient ground processing that leads to rapid data distribution to operational users. The GLM was designed to detect about 90 percent of the total lightning flashes within its almost hemispherical field of view. Based on laboratory calibration, we expect the on-orbit detection efficiency to be closer to 85%, making it the highest performing, large-area-coverage total lightning detector. It has a number of unique design features that will enable it to have near-uniform spatial resolution over most of its field of view and to operate with minimal impact on performance during solar eclipses. The GLM has no dedicated on-orbit calibration system, thus the ground-based calibration provides the basis for the predicted radiometric performance. A number of problems were encountered during the calibration of Flight Model 1. The issues arose from GLM design features including its wide field of view, fast lens, the narrow-band interference filters located in both object and collimated space, and the fact that the GLM is inherently an event detector, yet the calibration procedures required calibration of both images and events. The GLM calibration techniques were based on those developed for the Lightning Imaging Sensor calibration, but there are enough differences between the sensors that the initial GLM calibration suggested it is significantly more sensitive than its design parameters. The calibration discrepancies have been resolved and will be discussed. Absolute calibration will be verified on-orbit using vicarious cloud reflections. In addition to details of the GLM calibration, the presentation will address the unique design of the GLM, its features, capabilities and performance.
NASA Astrophysics Data System (ADS)
Nair, U. S.; Keiser, K.; Wu, Y.; Maskey, M.; Berendes, D.; Glass, P.; Dhakal, A.; Christopher, S. A.
2012-12-01
The Alabama Forestry Commission (AFC) is responsible for wildfire control and also prescribed burn management in the state of Alabama. Visibility and air quality degradation resulting from smoke are two pieces of information that are crucial for this activity. Currently the tools available to AFC are the dispersion index provided by the National Weather Service and surface smoke concentrations. The former provides broad guidance for prescribed burning activities but does not provide specific information regarding smoke transport, areas affected, and quantification of air quality and visibility degradation. While the NOAA operational air quality guidance includes surface smoke concentrations from existing fire events, it does not account for contributions from background aerosols, which are important for the southeastern region including Alabama. Also lacking is the quantification of visibility. The University of Alabama in Huntsville has developed a state-of-the-art integrated modeling system to address these concerns. This system is based on the Community Multiscale Air Quality (CMAQ) modeling system, which ingests satellite-derived smoke emissions and assimilates NASA MODIS-derived aerosol optical thickness. In addition, this operational modeling system also simulates the impact of potential prescribed burn events based on location information derived from the AFC prescribed burn permit database. A Lagrangian model is used to simulate smoke plumes for the prescribed burn requests. The combined air quality and visibility degradation resulting from these smoke plumes and background aerosols is computed and the information is made available through a web-based decision support system utilizing open-source GIS components. This system provides information regarding intersections between highways and other critical facilities such as old age homes, hospitals and schools. The system also includes satellite-detected fire locations and other satellite-derived datasets relevant for fire and smoke management.
Detection of Epileptic Seizure Event and Onset Using EEG
Ahammad, Nabeel; Fathima, Thasneem; Joseph, Paul
2014-01-01
This study proposes a method for automatic detection of epileptic seizure events and onset using wavelet-based features and certain statistical features without wavelet decomposition. Normal and epileptic EEG signals were classified using a linear classifier. For seizure event detection, the Bonn University EEG database has been used. Three types of EEG signals (EEG signals recorded from healthy volunteers with eyes open, epilepsy patients in the epileptogenic zone during a seizure-free interval, and epilepsy patients during epileptic seizures) were classified. Important features such as energy, entropy, standard deviation, maximum, minimum, and mean at different subbands were computed and classification was done using a linear classifier. The performance of the classifier was determined in terms of specificity, sensitivity, and accuracy. The overall accuracy was 84.2%. In the case of seizure onset detection, the database used is the CHB-MIT scalp EEG database. Along with wavelet-based features, interquartile range (IQR) and mean absolute deviation (MAD) without wavelet decomposition were extracted. Latency was used to study the performance of seizure onset detection. The classifier gave a sensitivity of 98.5% with an average latency of 1.76 seconds. PMID:24616892
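As a rough illustration of the subband features listed above (energy, entropy, standard deviation, maximum, minimum, and mean), the sketch below extracts them from a single EEG segment using PyWavelets. The wavelet family, decomposition level, and the random test segment are assumptions; the abstract does not state the values used in the study.

```python
import numpy as np
import pywt  # PyWavelets, an assumed dependency

def subband_features(signal, wavelet="db4", level=4):
    """Energy, Shannon entropy, standard deviation, max, min and mean of each
    DWT subband (approximation plus `level` detail bands)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:
        p = c ** 2 / (np.sum(c ** 2) + 1e-12)        # normalised energy distribution
        entropy = -np.sum(p * np.log2(p + 1e-12))
        feats.extend([np.sum(c ** 2), entropy, np.std(c), np.max(c), np.min(c), np.mean(c)])
    return np.array(feats)

# Hypothetical one-second single-channel EEG segment (the Bonn data are sampled at 173.61 Hz)
rng = np.random.default_rng(0)
segment = rng.normal(size=174)
print(subband_features(segment).shape)   # (level + 1) * 6 = 30 features
```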
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-01-01
Study Objective: To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Methods: Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHIauto) was calculated using both signals, and a respiratory disturbance index (RDIauto) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter-derived pulse wave signal and compared with manual scoring. Results: Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m²) were included in the analysis. AHImanual (19.4 ± 18.5 events/h) correlated highly significantly with AHIauto (19.9 ± 16.5 events/h) and RDIauto (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001), with a mean difference of −0.5 ± 6.6 and −1.0 ± 6.1 events/h. The automatic analysis of AHIauto and RDIauto detected sleep apnea (cutoff AHImanual ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manual scoring (r = 0.87 and 0.95, p < 0.001), with mean differences of −4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Conclusions: Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep. Citation: Sommermeyer D; Zou D; Grote L; Hedner J. Detection of sleep disordered breathing and its central/obstructive character using nasal cannula and finger pulse oximeter. J Clin Sleep Med 2012;8(5):527-533. PMID:23066364
NASA Astrophysics Data System (ADS)
Wu, Fan; Cui, Xiaopeng; Zhang, Da-Lin
2018-06-01
Nowcasting short-duration (i.e., <6 h) rainfall (SDR) events is examined using total [i.e., cloud-to-ground (CG) and intra-cloud (IC)] lightning observations over the Beijing Metropolitan Region (BMR) during the warm seasons of 2006-2007. A total of 928 moderate and 554 intense SDR events, i.e., with respective hourly rainfall rates (HRR) of 10-20 and ≥20 mm h⁻¹, are utilized to identify sharp increases in rainfall and lightning flash rates, termed rainfall and lightning jumps, respectively. By optimizing the parameters in a lightning jump and a rainfall jump algorithm, their different jump intensity grades are verified for the above two categories of SDR events. Then, their corresponding graded nowcast-warning models are developed for the moderate and intense SDR events, respectively, with a low-grade warning for hitting more SDR events and a high-grade warning for reducing false alarms. Any issued warning in the nowcast-warning models is designed to last for 2 h after the occurrence of a lightning jump. It is demonstrated that the low-grade warnings can have a probability of detection (POD) of 67.8% (87.0%) and the high-grade warnings a false-alarm ratio (FAR) of 27.0% (22.2%) for the moderate (intense) SDR events, with an averaged lead time of 36.7 (52.0) min. The nowcast-warning models are further validated using three typical heavy-rain-producing storms that are independent from those used to develop the models. Results show that the nowcast-warning models can provide encouraging early warnings for the associated SDR events from the regional to meso-γ scales, indicating that they have a great potential for application to other regions where high-resolution total lightning observations are available.
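The graded jump algorithms themselves are not described in the abstract. The sketch below illustrates the general shape of a lightning-jump test in the spirit of the widely used 2-sigma framework: flag a time step when the flash-rate tendency exceeds a multiple of the standard deviation of its recent history. All parameters and the example counts are illustrative assumptions, not the optimized values from this study.

```python
import numpy as np

def lightning_jumps(flash_counts, dt_minutes=2.0, sigma_level=2.0, history=5):
    """Flag time steps where the flash-rate tendency exceeds `sigma_level`
    standard deviations of its recent history (a simplified 2-sigma-style test)."""
    rate = np.asarray(flash_counts, dtype=float) / dt_minutes    # flashes per minute
    dfrdt = np.diff(rate) / dt_minutes                           # rate of change
    jumps = []
    for i in range(history, len(dfrdt)):
        sigma = np.std(dfrdt[i - history:i])
        if sigma > 0 and dfrdt[i] > sigma_level * sigma:
            jumps.append(i + 1)     # index into the original flash_counts series
    return jumps

# Hypothetical 2-minute total-lightning counts for a single storm cell
counts = [3, 4, 4, 5, 6, 5, 7, 25, 40, 38, 30, 22]
print(lightning_jumps(counts))
```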
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bame, D.
To determine if seismic signals at frequencies up to 50 Hz are useful for detecting events and discriminating between earthquakes and explosions, approximately 180 events from the three-component high-frequency seismic element (HFSE) installed at the center of the Norwegian Regional Seismic Array (NRSA) have been analyzed. The attenuation of high-frequency signals in Scandinavia varies with distance, azimuth, magnitude, and source effects. Most of the events were detected with HFSE, although detections were better on the NRSA where signal processing techniques were used. Based on a preliminary analysis, high-frequency data do not appear to be a useful discriminant in Scandinavia. 21 refs., 29 figs., 3 tabs.
Parameterization of synoptic weather systems in the South Atlantic Bight for modeling applications
NASA Astrophysics Data System (ADS)
Wu, Xiaodong; Voulgaris, George; Kumar, Nirnimesh
2017-10-01
An event-based, long-term, climatological analysis is presented that allows the creation of atmospheric forcing for the coastal ocean that preserves both frequency of occurrence and event time history. An algorithm is developed that identifies individual storm events (cold fronts, warm fronts, and tropical storms) from meteorological records. The algorithm has been applied to a location along the South Atlantic Bight, off South Carolina, an area prone to cyclogenesis occurrence and passages of atmospheric fronts. Comparison against daily weather maps confirms that the algorithm is efficient in identifying cold fronts and warm fronts, while the identification of tropical storms is less successful. The average state of the storm events and their variability are represented by the temporal evolution of atmospheric pressure, air temperature, wind velocity, and wave directional spectral energy. The use of uncorrected algorithm-detected events provides climatologies that show little deviation from those derived using corrected events. The effectiveness of this analysis method is further verified by numerically simulating the wave conditions driven by the characteristic wind forcing and comparing the results with the wave climatology that corresponds to each storm type. A high level of consistency found in the comparison indicates that this analysis method can be used for accurately characterizing event-based oceanic processes and long-term storm-induced morphodynamic processes on wind-dominated coasts.
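A minimal sketch of how frontal passages might be flagged from a meteorological record, in the spirit of the event-identification algorithm described above; the pressure-drop and wind-shift criteria, thresholds, and variable names are illustrative assumptions rather than the paper's algorithm.

```python
import numpy as np

def flag_front_passages(pressure_hpa, wind_dir_deg, window=8,
                        min_pressure_drop=3.0, min_dir_shift=60.0):
    """Flag candidate frontal passages where sea-level pressure falls by more
    than `min_pressure_drop` hPa over `window` hourly samples and the wind
    direction shifts by more than `min_dir_shift` degrees (illustrative criteria)."""
    p = np.asarray(pressure_hpa, dtype=float)
    d = np.asarray(wind_dir_deg, dtype=float)
    flags = []
    for i in range(window, len(p)):
        dp = p[i - window] - p[i]
        dd = (d[i] - d[i - window] + 180.0) % 360.0 - 180.0   # signed angular change
        if dp >= min_pressure_drop and abs(dd) >= min_dir_shift:
            flags.append(i)
    return flags

# Synthetic 24-h hourly record with a pressure fall and a wind shift mid-series
hours = np.arange(24)
pressure = 1015 - 5 * np.clip(hours - 10, 0, 8) / 8.0
wind_dir = np.where(hours < 14, 200.0, 300.0)
print(flag_front_passages(pressure, wind_dir))
```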
NASA Astrophysics Data System (ADS)
Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli
2015-04-01
A large subset of seismic events do not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremors, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that only has clear arrivals on three stations and was therefore not located and reported by the Mexican National Seismological Service (SSN). The other two events were previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A very rough estimate places these two events in the portion of the East Pacific Rise around 9 N. These two events are detected despite their distance from the search area, due to favorable move-out on the array of the SSN network. We are expanding the study area to the EPR and to a larger period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
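The time-reversal, full-waveform detector itself is described in the cited submission; as a rough stand-in, the sketch below flags non-impulsive events by cross-correlating a continuous trace with a synthetic template and thresholding the normalized score. The function name, threshold, and synthetic data are assumptions, not the method of the paper.

```python
import numpy as np

def matched_filter_detect(trace, template, threshold=8.0):
    """Detect emergent (non-impulsive) events by cross-correlating a continuous
    trace with a synthetic template and flagging samples where the correlation
    exceeds `threshold` median absolute deviations (illustrative)."""
    trace = np.asarray(trace, dtype=float)
    template = (template - template.mean()) / template.std()
    cc = np.correlate(trace - trace.mean(), template, mode="valid")
    mad = np.median(np.abs(cc - np.median(cc))) + 1e-12
    score = (cc - np.median(cc)) / mad
    return np.flatnonzero(score > threshold)

# Synthetic example: a low-frequency wavelet buried in noise near sample 500
rng = np.random.default_rng(0)
t = np.arange(200)
template = np.sin(2 * np.pi * t / 100.0) * np.hanning(200)   # emergent, non-impulsive shape
trace = rng.normal(0, 0.3, 2000)
trace[500:700] += template
print(matched_filter_detect(trace, template)[:5])
```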
Comparative evaluation of urban storm water quality models
NASA Astrophysics Data System (ADS)
Vaze, J.; Chiew, Francis H. S.
2003-10-01
The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
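As a minimal illustration of the regression-equation approach compared in this study, the sketch below fits a log-linear relation between event pollutant load and rainfall intensity/runoff rate; the functional form, variable names, and synthetic calibration data are assumptions, not the models evaluated by the authors.

```python
import numpy as np

def fit_event_load_regression(rain_intensity, runoff_rate, loads):
    """Least-squares fit of log(load) = a + b*log(intensity) + c*log(runoff),
    a common power-law form for event pollutant loads (illustrative)."""
    X = np.column_stack([
        np.ones(len(loads)),
        np.log(rain_intensity),
        np.log(runoff_rate),
    ])
    coeffs, *_ = np.linalg.lstsq(X, np.log(loads), rcond=None)
    return coeffs                                   # a, b, c

def predict_event_load(coeffs, rain_intensity, runoff_rate):
    a, b, c = coeffs
    return np.exp(a + b * np.log(rain_intensity) + c * np.log(runoff_rate))

# Synthetic calibration events (intensity in mm/h, runoff in L/s, load in kg)
intensity = np.array([5.0, 12.0, 20.0, 35.0])
runoff = np.array([10.0, 40.0, 75.0, 150.0])
load = np.array([0.4, 1.8, 3.1, 6.9])
coeffs = fit_event_load_regression(intensity, runoff, load)
print(predict_event_load(coeffs, 25.0, 90.0))
```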
Call, Rosemary J.; Burlison, Jonathan D.; Robertson, Jennifer J.; Scott, Jeffrey R.; Baker, Donald K.; Rossi, Michael G.; Howard, Scott C.; Hoffman, James M.
2014-01-01
Objective: To investigate the use of a trigger tool for adverse drug event (ADE) detection in a pediatric hospital specializing in oncology, hematology, and other catastrophic diseases. Study design: A medication-based trigger tool package analyzed electronic health records from February 2009 to February 2013. Chart review determined whether an ADE precipitated the trigger. Severity was assigned to ADEs, and preventability was assessed. Preventable ADEs were compared with the hospital's electronic voluntary event reporting system to identify whether these ADEs had been previously identified. The positive predictive values (PPVs) of the entire trigger tool and individual triggers were calculated to assess their accuracy in detecting ADEs. Results: Trigger occurrences (n=706) were detected in 390 patients from six medication triggers, 33 of which were ADEs (overall PPV = 16%). Hyaluronidase had the highest PPV (60%). Most ADEs were category E harm (temporary harm) per the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) index. One event was category H harm (intervention to sustain life). Naloxone was associated with the most grade 4 ADEs per the Common Terminology Criteria for Adverse Events (CTCAE) v4.03. Twenty-one (64%) ADEs were preventable, 3 of which were submitted via the voluntary reporting system. Conclusion: Most of the medication-based triggers yielded low PPVs. Refining the triggers based on patients' characteristics and medication usage patterns could increase the PPVs and make them more useful for quality improvement. To efficiently detect ADEs, triggers must be revised to reflect specialized pediatric patient populations such as hematology and oncology patients. PMID:24768254
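The positive predictive values reported above reduce to confirmed ADEs divided by trigger occurrences; a short sketch of that bookkeeping follows, with hypothetical trigger counts rather than the study's data.

```python
def trigger_ppv(trigger_counts):
    """PPV per trigger and overall: confirmed ADEs / trigger occurrences.
    `trigger_counts` maps trigger name -> (occurrences, confirmed_ades)."""
    per_trigger = {
        name: (ades / occurrences if occurrences else 0.0)
        for name, (occurrences, ades) in trigger_counts.items()
    }
    total_occ = sum(o for o, _ in trigger_counts.values())
    total_ades = sum(a for _, a in trigger_counts.values())
    overall = total_ades / total_occ if total_occ else 0.0
    return per_trigger, overall

# Hypothetical counts, not the study's data
counts = {"hyaluronidase": (10, 6), "naloxone": (120, 15), "flumazenil": (40, 3)}
per_trigger, overall = trigger_ppv(counts)
print(per_trigger, round(overall, 3))
```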
NASA Astrophysics Data System (ADS)
Iles, R. H. A.; Jones, J. B. L.; Taylor, G. C.; Blake, J. B.; Bentley, R. D.; Hunter, R.; Harra, L. K.; Coates, A. J.
2004-11-01
We investigate the circumstances required for aircrew and passengers to experience an increased radiation exposure rate from a solar energetic particle (SEP) event occurring during a flight. The effects of the 14 July 2000 National Oceanic and Atmospheric Administration S3 class SEP event are examined using ground-based and satellite measurements together with coincident measurements made using a tissue equivalent proportional counter (TEPC) on board a Virgin Atlantic Airways flight from London Heathrow to Hong Kong. In this paper we present the first measurements made during a SEP event using a TEPC at flight altitudes. Our results indicate that there were no increased radiation levels detected during the flight due to the SEPs, but the measurements agreed well with the CARI-6 model calculations made using a heliocentric potential value derived immediately prior to the SEP event. In addition, a prolonged increase in the >85 MeV particle flux is observed for up to 2 days after the SEP onset by the SAMPEX spacecraft at latitudes >55°.
NASA Astrophysics Data System (ADS)
Hansell, Richard Allen, Jr.
The radiative effects of dust aerosol on our climate system have yet to be fully understood and remain a topic of contemporary research. To investigate these effects, detection/retrieval methods for dust events over major dust outbreak and transport areas have been developed using satellite and ground-based approaches. To this end, both the shortwave and longwave surface radiative forcing of dust aerosol were investigated. The ground-based remote sensing approach uses Atmospheric Emitted Radiance Interferometer (AERI) brightness temperature spectra to detect mineral dust events and to retrieve their properties. Taking advantage of the high spectral resolution of the AERI instrument, absorptive differences in prescribed thermal IR window sub-band channels were exploited to differentiate dust from cirrus clouds. AERI data collected during the UAE2 at Al-Ain, UAE, were employed for dust retrieval. Assuming a specified dust composition model a priori and using the light scattering programs of the T-matrix and finite difference time domain methods for oblate spheroids and hexagonal plates, respectively, dust optical depths have been retrieved and compared to those inferred from a collocated and coincident AERONET sun-photometer dataset. The retrieved optical depths were then used to determine the dust longwave surface forcing during the UAE2. Likewise, dust shortwave surface forcing is investigated employing a differential technique from previous field studies. The satellite-based approach uses MODIS thermal infrared brightness temperature window data for the simultaneous detection/separation of mineral dust and cirrus clouds. Based on the spectral variability of dust emissivity at the 3.75, 8.6, 11, and 12 μm wavelengths, the D*-parameter, BTD-slope, and BTD3-11 tests are combined to identify dust and cirrus. MODIS data for three dust-laden scenes have been analyzed to demonstrate the effectiveness of this detection/separation method. Detected daytime dust and cloud coverage for the Persian Gulf case compare reasonably well with those from the "Deep Blue" algorithm developed at NASA-GSFC. The nighttime dust/cloud detection for the cases surrounding Cape Verde and Niger, West Africa, has been validated by comparing to coincident and collocated ground-based micro-pulse lidar measurements.
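The D*-parameter and BTD tests mentioned above rest on thresholding brightness temperature differences between thermal IR window channels; the sketch below applies a simple split-window (11-12 μm) threshold per pixel, with the threshold values and synthetic pixels being illustrative assumptions rather than the retrieval described in this work.

```python
import numpy as np

def classify_dust_cirrus(bt_11, bt_12, dust_thresh=-0.5, cirrus_thresh=2.0):
    """Rough per-pixel split of dust vs. cirrus from the 11/12 um split-window
    brightness temperatures (K): mineral dust tends to make BT11-BT12 negative,
    while cirrus makes it strongly positive. Thresholds are illustrative only."""
    btd = np.asarray(bt_11, dtype=float) - np.asarray(bt_12, dtype=float)
    labels = np.full(btd.shape, "clear", dtype=object)
    labels[btd <= dust_thresh] = "dust"
    labels[btd >= cirrus_thresh] = "cirrus"
    return labels

# Three synthetic pixels: dust-like, cirrus-like, clear (values in K)
print(classify_dust_cirrus([290.0, 245.0, 294.0], [291.0, 242.0, 294.0]))
```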
Jiang, Ling; Qian, Jing; Yang, Xingwang; Yan, Yuting; Liu, Qian; Wang, Kan; Wang, Kun
2014-01-02
An amplified electrochemical impedimetric aptasensor for ochratoxin A (OTA) was developed with picomolar sensitivity. A facile route to fabricate gold nanoparticles covalently bound to reduced graphene oxide (AuNPs-rGO) resulted in a large number of well-dispersed AuNPs on graphene sheets with abundant binding sites for DNA, since a single rGO sheet and each AuNP can be loaded with hundreds of DNA strands. A sandwich-model aptasensor was fabricated in which thiolated capture DNA immobilized on a gold electrode captured the aptamer; the sensing interface was then incubated with OTA at a desired concentration, followed by AuNPs-rGO-functionalized reporter DNA hybridized with the residual aptamers. By exploiting the AuNPs-rGO as an excellent signal amplification platform, a single hybridization event between aptamer and reporter DNA was translated into more than 10^7 redox events, leading to a substantial increase in charge-transfer resistance (Rct) by ~7 orders of magnitude compared with that of the free aptamer-modified electrode. The designed aptasensor showed a decreasing Rct response with increasing OTA concentration over a wide range of 1 pg mL-1 to 50 ng mL-1 and could detect an extremely low OTA concentration of 0.3 pg mL-1 (0.74 pM), which is much lower than that of most other existing impedimetric aptasensors. The signal amplification platform presented here would provide a promising model for aptamer-based detection with a direct impedimetric method. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
2013-08-01
A scientific session of the Physical Sciences Division of the Russian Academy of Sciences (RAS), titled "Near-Earth space hazards and their detection", was held on 27 March 2013 at the conference hall of the Lebedev Physical Institute, RAS. The agenda posted on the website of the Physical Sciences Division, RAS, http://www.gpad.ac.ru, included the following reports: (1) Emel'yanenko V V, Shustov B M (Institute of Astronomy, RAS, Moscow) "The Chelyabinsk event and the asteroid-comet hazard"; (2) Chugai N N (Institute of Astronomy, RAS, Moscow) "A physical model of the Chelyabinsk event"; (3) Lipunov V M (Lomonosov Moscow State University, Sternberg Astronomical Institute, Moscow) "MASTER global network of optical monitoring"; (4) Beskin G M (Special Astrophysical Observatory, RAS, Arkhyz, Karachai-Cirkassian Republic) "Wide-field optical monitoring systems with subsecond time resolution for the detection and study of cosmic threats". The expanded papers based on oral reports 1 and 4 are given below. • The Chelyabinsk event and the asteroid-comet hazard, V V Emel'yanenko, B M Shustov, Physics-Uspekhi, 2013, Volume 56, Number 8, Pages 833-836 • Wide-field subsecond temporal resolution optical monitoring systems for the detection and study of cosmic hazards, G M Beskin, S V Karpov, V L Plokhotnichenko, S F Bondar, A V Perkov, E A Ivanov, E V Katkova, V V Sasyuk, A Shearer, Physics-Uspekhi, 2013, Volume 56, Number 8, Pages 836-842
Shaw, Tyler H; Funke, Matthew E; Dillard, Michael; Funke, Gregory J; Warm, Joel S; Parasuraman, Raja
2013-08-01
Transcranial Doppler sonography was used to measure cerebral blood flow velocity (CBFV) in the right and left cerebral hemispheres during the performance of a 50-min visual vigilance session. Observers monitored a simulated flight of unmanned aerial vehicles for cases in which one of the vehicles was flying in an inappropriate direction relative to its cohorts. Two types of vigilance tasks were employed: a traditional task in which observers made button press ("go") responses to critical signals, and a modification of the traditional task called the Sustained Attention to Response Task (SART) in which "go" responses acknowledged nonsignal events and response withholding ("no-go") signified signal detection. Signal detections and global CBFV scores declined over time. In addition, fine-grained event-related analyses revealed that the detection of signals was accompanied by an elevation of CBFV that was not present with missed signals. As was the case with the global scores, the magnitude of the transient CBFV increments associated with signal detection also declined over time, and these findings were independent of task type. The results support the view of CBFV as an index of the cognitive evaluation of stimulus significance, and a resource model of vigilance in which the need for continuous attention produces a depletion of information-processing assets that are not replenished as the task progresses. Further, temporal declines in the magnitude of event-related CBFV in response to critical signals only is evidence that the decrement function in vigilance is due to attentional processing and not specific task elements such as the required response format. Copyright © 2013. Published by Elsevier Inc.
An impact of environmental changes on flows in the reach scale under a range of climatic conditions
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Romanowicz, Renata J.
2016-04-01
The present paper combines the detection of changes in flow regime at cross-sections along the Middle River Vistula reach with the identification of their causes, using different methods. Two main experimental set-ups (designs) have been applied to study the changes: a moving three-year window and a low- and high-flow event-based approach. In the first experiment, a Stochastic Transfer Function (STF) model and a quantile-based statistical analysis of flow patterns were compared. These two methods are based on the analysis of changes in the STF model parameters and of standardised differences of flow quantile values. In the second experiment, in addition to the STF-based approach, a 1-D distributed model, MIKE11, was applied. The first step of the procedure used in the study is to define the river reaches that have recorded information on land use and water management changes. The second task is to perform the moving window analysis of standardised differences of flow quantiles and the moving window optimisation of the STF model for flow routing. The third step consists of an optimisation of the STF and MIKE11 models for high- and low-flow events. The final step is to analyse the results and relate the standardised quantile changes and model parameter changes to historical land use changes and water management practices. Results indicate that both models give a consistent assessment of changes in the channel for medium and high flows. Acknowledgements: This research was supported by the Institute of Geophysics Polish Academy of Sciences through the Young Scientist Grant no. 3b/IGF PAN/2015.
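A minimal sketch of the moving-window quantile comparison described above: flow quantiles are computed in successive three-year windows and their differences are standardised against a reference window. The window length, quantile set, standardisation, and synthetic flow record are assumptions for illustration, not the study's procedure.

```python
import numpy as np

def moving_window_quantile_changes(daily_flow, window_days=3 * 365,
                                   quantiles=(0.1, 0.5, 0.9)):
    """Standardised differences of flow quantiles between each moving
    three-year window and the first (reference) window (illustrative)."""
    q = np.asarray(quantiles)
    flows = np.asarray(daily_flow, dtype=float)
    ref = np.quantile(flows[:window_days], q)
    changes = []
    for start in range(0, len(flows) - window_days + 1, 365):   # shift by one year
        win = flows[start:start + window_days]
        changes.append((np.quantile(win, q) - ref) / ref)       # standardised by reference quantile
    return np.array(changes)

# Synthetic 9-year daily flow record with a gradual upward trend
rng = np.random.default_rng(1)
flow = rng.gamma(shape=2.0, scale=50.0, size=9 * 365) * np.linspace(1.0, 1.3, 9 * 365)
print(moving_window_quantile_changes(flow).round(2))
```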