Science.gov

Sample records for model-based event detection

  1. Identification of new events in Apollo 16 lunar seismic data by Hidden Markov Model-based event detection and classification

    NASA Astrophysics Data System (ADS)

    Knapmeyer-Endrun, Brigitte; Hammer, Conny

    2015-10-01

Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, containing the only confirmed events yet recorded on any extraterrestrial body, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes, and performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or the deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to data from the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. Moreover, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
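The single-example training and likelihood-based classification described above can be illustrated with a minimal discrete hidden Markov model classifier: one HMM per event class, scored with the forward algorithm, with a likelihood floor for flagging sequences as a new event type. This is only a hedged sketch, not the authors' implementation: the toy models, the two-symbol alphabet, and the floor value are invented for illustration, and real seismic traces would first be reduced to a feature or symbol sequence.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (pi: initial probs, A: transition matrix, B: emission matrix)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(1, len(obs)):
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]          # rescale to avoid underflow
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    loglik += math.log(sum(alpha))
    return loglik

def classify(obs, models, floor=-50.0):
    """Pick the event class whose HMM gives the highest likelihood;
    report a new event type when every model scores below the floor."""
    scores = {name: forward_loglik(obs, *m) for name, m in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > floor else "new event type"

# Two toy models: a class favoring symbol 0 and a class favoring symbol 1.
models = {
    "noise": ([0.9, 0.1], [[0.95, 0.05], [0.1, 0.9]], [[0.9, 0.1], [0.2, 0.8]]),
    "quake": ([0.5, 0.5], [[0.6, 0.4], [0.3, 0.7]], [[0.2, 0.8], [0.1, 0.9]]),
}
print(classify([0, 0, 0, 1, 0, 0], models))
```

In practice the per-class models would be estimated from the single training waveforms (e.g. by Baum-Welch), and the floor would be tuned on background noise.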

  2. A Topic Modeling Based Representation to Detect Tweet Locations. Example of the Event "Je Suis Charlie"

    NASA Astrophysics Data System (ADS)

    Morchid, M.; Josselin, D.; Portilla, Y.; Dufour, R.; Altman, E.; Linarès, G.

    2015-09-01

Social networks have become a major actor in information propagation. On the popular Twitter platform, mobile users post or relay messages from different locations. The tweet content, meaning, and location show how an event, such as the bursty "JeSuisCharlie" that happened in France in January 2015, is comprehended in different countries. This research aims at clustering tweets according to the co-occurrence of their terms, including the country, and forecasting the probable country of a non-located tweet from its content. First, we present the process of collecting a large quantity of data from the Twitter website, yielding a set of 2,189 located tweets about "Charlie" from the 7th to the 14th of January. We then describe an original method adapted from the Author-Topic (AT) model, itself based on Latent Dirichlet Allocation (LDA). We define a homogeneous space containing both lexical content (words) and spatial information (country). During a training process on part of the sample, we derive a set of clusters (topics) based on statistical relations between lexical and spatial terms. We then evaluate the method's effectiveness on the rest of the sample, where it reaches up to 95% correct assignment, showing that the model can predict a tweet's location after a learning process.
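The AT/LDA pipeline itself needs a topic-modeling library, but the core idea of linking lexical content to a country label can be sketched with a much simpler stand-in: an add-one-smoothed naive Bayes classifier over words. This is an illustrative simplification, not the paper's method, and the mini-corpus and country codes below are hypothetical.

```python
import math
from collections import Counter, defaultdict

def train(tweets):
    """tweets: list of (country, text). Build per-country word counts."""
    counts, totals, prior = defaultdict(Counter), Counter(), Counter()
    for country, text in tweets:
        words = text.lower().split()
        counts[country].update(words)
        totals[country] += len(words)
        prior[country] += 1
    return counts, totals, prior

def predict(text, model, alpha=1.0):
    """Most probable country for an unlocated tweet (add-one smoothing)."""
    counts, totals, prior = model
    vocab = {w for c in counts.values() for w in c}
    best, best_lp = None, float("-inf")
    for country in counts:
        lp = math.log(prior[country])
        for w in text.lower().split():
            lp += math.log((counts[country][w] + alpha)
                           / (totals[country] + alpha * len(vocab)))
        if lp > best_lp:
            best, best_lp = country, lp
    return best

# Hypothetical mini-corpus of located tweets.
tweets = [("FR", "je suis charlie paris"), ("FR", "charlie hebdo paris"),
          ("US", "charlie solidarity new york"), ("US", "new york rally charlie")]
model = train(tweets)
print(predict("rally in new york", model))
```

The AT model goes further by learning shared latent topics over words and countries rather than conditioning on the country directly.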

  3. Applying a Hidden Markov Model-Based Event Detection and Classification Algorithm to Apollo Lunar Seismic Data

    NASA Astrophysics Data System (ADS)

    Knapmeyer-Endrun, B.; Hammer, C.

    2014-12-01

The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events differ significantly from events recorded on Earth, in terms of both signal shape and source processes, and are thus a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, has the goal of investigating Mars' internal structure by deploying a seismometer on the surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time windows within a few days after receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (the year of station installation, with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As only one weak shallow moonquake is covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%. The algorithm enables us to classify events that were previously listed in the catalog without classification and, through the combined use of long-period and short-period data, to identify some unlisted local impacts as well as at least two yet unreported

  4. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV and group similar audit event sequences together based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in realistic cases where normal and attack activities are intermingled.

  5. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or another classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
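The two ideas in this record, scoring events by how improbable they are and letting the user regulate the false-alert rate, can be sketched with a simple Gaussian model: score each event by its negative log-density, and derive the alert threshold from a historical quantile. A minimal sketch under an assumed Gaussian distribution; the traffic numbers are invented.

```python
import math

def fit_gaussian(xs):
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, max(var, 1e-12)

def anomaly_score(x, mu, var):
    """Negative log-density: low-probability events get high scores."""
    return 0.5 * math.log(2 * math.pi * var) + (x - mu) ** 2 / (2 * var)

def threshold_for_rate(history, mu, var, alert_rate):
    """Pick a score threshold so that roughly `alert_rate` of historical
    events would have alerted -- the user-tunable regulatability knob."""
    scores = sorted(anomaly_score(x, mu, var) for x in history)
    k = max(0, min(len(scores) - 1, int((1 - alert_rate) * len(scores))))
    return scores[k]

history = [10.0, 10.5, 9.8, 10.2, 9.9, 10.1, 10.3, 9.7]   # e.g. flows/sec
mu, var = fit_gaussian(history)
thr = threshold_for_rate(history, mu, var, alert_rate=0.1)
print(anomaly_score(25.0, mu, var) > thr)   # a burst scores as anomalous
```

For the irregular or discrete distributions mentioned in the record, the same negative-log-probability scoring applies with the Gaussian density swapped out.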

  6. Detection of solar events

    SciTech Connect

    Fischbach, Ephraim; Jenkins, Jere

    2013-08-27

    A flux detection apparatus can include a radioactive sample having a decay rate capable of changing in response to interaction with a first particle or a field, and a detector associated with the radioactive sample. The detector is responsive to a second particle or radiation formed by decay of the radioactive sample. The rate of decay of the radioactive sample can be correlated to flux of the first particle or the field. Detection of the first particle or the field can provide an early warning for an impending solar event.

  7. Model-Based Signal Processing: Correlation Detection With Synthetic Seismograms

    SciTech Connect

    Rodgers, A; Harris, D; Pasyanos, M; Blair, S; Matt, R

    2006-08-30

Recent applications of correlation methods to seismological problems illustrate the power of coherent signal processing applied to seismic waveforms. Examples of these applications include detection of low-amplitude signals buried in ambient noise and cross-correlation of sets of waveforms to form event clusters and accurately measure delay times for event relocation and/or earth structure. These methods rely on the exploitation of the similarity of individual waveforms and have been successfully applied to large sets of empirical observations. However, in cases with little or no empirical event data, such as aseismic regions or exotic event types, correlation methods with observed seismograms will not be possible due to the lack of previously observed similar waveforms. This study uses model-based signals computed for three-dimensional (3D) Earth models to form the basis for correlation detection. Synthetic seismograms are computed for fully 3D models estimated with the Markov Chain Monte Carlo (MCMC) method. MCMC uses stochastic sampling to fit multiple seismological data sets. Rather than estimating a single "optimal" model, MCMC results in a suite of models that sample the model space and incorporate uncertainty through the variability of the models. The variability reflects our ignorance of Earth structure, due to limited resolution and to data and modeling errors, and produces variability in the seismic waveform response. Model-based signals are combined using a subspace method: the synthetic signals are decomposed into an orthogonal basis by singular value decomposition (SVD), and the observed waveforms are represented with a linear combination of the subset of eigenvectors (signals) associated with the most significant eigenvalues. We have demonstrated the method by modeling long-period (80-10 seconds) regional seismograms for a moderate (M≈5) earthquake near the China-North Korea border. Synthetic seismograms are computed with the Spectral Element Method.
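The subspace detection step described above can be sketched in miniature: build an orthonormal basis spanning a suite of model-based template signals, then use the fraction of an observed waveform's energy captured by projection onto that subspace as the detection statistic. This is a hedged sketch: it uses Gram-Schmidt orthonormalization as a lightweight stand-in for the SVD basis, and the template and test signals are invented.

```python
def gram_schmidt(templates):
    """Orthonormal basis spanning the model-based signals
    (a stand-in for keeping the leading SVD singular vectors)."""
    basis = []
    for t in templates:
        v = list(t)
        for b in basis:
            c = sum(x * y for x, y in zip(v, b))
            v = [x - c * y for x, y in zip(v, b)]
        n = sum(x * x for x in v) ** 0.5
        if n > 1e-10:
            basis.append([x / n for x in v])
    return basis

def subspace_stat(signal, basis):
    """Fraction of signal energy captured by the template subspace;
    near 1 means the waveform is well explained by the model suite."""
    energy = sum(x * x for x in signal)
    captured = sum(sum(x * y for x, y in zip(signal, b)) ** 2 for b in basis)
    return captured / energy

# Hypothetical synthetic seismograms from two slightly different Earth models.
t1 = [0.0, 1.0, 0.5, -0.5, 0.0, 0.2]
t2 = [0.0, 0.9, 0.6, -0.4, 0.1, 0.2]
basis = gram_schmidt([t1, t2])
event = [0.0, 0.95, 0.55, -0.45, 0.05, 0.2]   # close to the model suite
noise = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
print(subspace_stat(event, basis), subspace_stat(noise, basis))
```

Keeping only the vectors associated with the largest singular values, as the study does, additionally discards template variability that is indistinguishable from noise.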

  8. Scintillation event energy measurement via a pulse model based iterative deconvolution method

    NASA Astrophysics Data System (ADS)

    Deng, Zhenzhou; Xie, Qingguo; Duan, Zhiwen; Xiao, Peng

    2013-11-01

This work focuses on event energy measurement, a crucial task of scintillation detection systems. We modeled the scintillation detector as a linear system and treated the energy measurement as a deconvolution problem. We proposed a pulse model based iterative deconvolution (PMID) method, which can process pileup events without pileup detection and is adaptive to different signal pulse shapes. The proposed method was compared with a digital gated integrator (DGI) and digital delay-line clipping (DDLC) using real-world experimental data. For singles data, the energy resolution (ER) produced by PMID matched that of DGI. For pileups, the PMID method outperformed both DGI and DDLC in ER and counts recovery. The encouraging results suggest that the PMID method has great potential in applications such as photon-counting systems and pulse height spectrometers, in which multiple-event pileups are common.
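The idea of recovering individual event amplitudes from a pileup by deconvolving against a known pulse model can be illustrated with a greedy, matching-pursuit-style loop: repeatedly find the best-matching shifted pulse, record its amplitude, and subtract it from the residual. This is a simplified stand-in for the paper's PMID method, the pulse shape and event amplitudes are invented, and overlapping pulses make the greedy amplitude estimates only approximate.

```python
def deconvolve(signal, pulse, n_events):
    """Greedy iterative deconvolution: repeatedly locate the best-matching
    shifted pulse, record its amplitude, and subtract it from the residual."""
    residual = list(signal)
    pnorm = sum(p * p for p in pulse)
    events = []
    for _ in range(n_events):
        best_t, best_a = 0, 0.0
        for t in range(len(signal) - len(pulse) + 1):
            # Least-squares amplitude of the pulse template at shift t.
            a = sum(residual[t + i] * pulse[i] for i in range(len(pulse))) / pnorm
            if abs(a) > abs(best_a):
                best_t, best_a = t, a
        for i in range(len(pulse)):
            residual[best_t + i] -= best_a * pulse[i]
        events.append((best_t, best_a))
    return events

# Hypothetical pulse shape and a pileup of two overlapping pulses.
pulse = [0.2, 1.0, 0.6, 0.3, 0.1]
signal = [0.0] * 12
for t0, amp in [(2, 5.0), (4, 3.0)]:          # true events
    for i, p in enumerate(pulse):
        signal[t0 + i] += amp * p
events = deconvolve(signal, pulse, n_events=2)
print(sorted(events))
```

A refinement step that jointly re-fits all amplitudes at the found positions would tighten the estimates for heavily overlapped events.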

  9. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, together with the concurrent enabling technologies of digital computer systems and sequential processors. This theory, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
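The sequential framework that this chapter builds on can be illustrated with Wald's classic sequential probability ratio test (SPRT) for a Gaussian mean shift: accumulate the log-likelihood ratio sample by sample and stop at the first boundary crossing. A minimal sketch; the hypothesized means, noise level, and data are invented, and the chapter's model-based processors would supply the likelihoods for the state-space case.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for a Gaussian mean shift:
    stop at the first boundary crossing (boundaries from the target
    false-alarm rate alpha and miss rate beta)."""
    upper = math.log((1 - beta) / alpha)      # accept H1 (signal present)
    lower = math.log(beta / (1 - alpha))      # accept H0 (noise only)
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Per-sample Gaussian log-likelihood ratio log p1(x)/p0(x).
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# Noise-only data should settle on H0; a constant offset should trigger H1.
print(sprt([0.1, -0.2, 0.0, 0.1, -0.1, 0.2], mu0=0.0, mu1=1.0, sigma=0.5))
print(sprt([1.1, 0.9, 1.0, 1.2, 0.8, 1.0], mu0=0.0, mu1=1.0, sigma=0.5))
```

The returned sample count shows the appeal of sequential tests: decisions typically arrive well before a fixed-sample-size test of the same error rates would.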

  10. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, together with the concurrent enabling technologies of digital computer systems and sequential processors. This theory, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  11. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  12. Adaptive, Model-Based Monitoring and Threat Detection

    NASA Astrophysics Data System (ADS)

    Valdes, Alfonso; Skinner, Keith

    2002-09-01

    We explore the suitability of model-based probabilistic techniques, such as Bayes networks, to the field of intrusion detection and alert report correlation. We describe a network intrusion detection system (IDS) using Bayes inference, wherein the knowledge base is encoded not as rules but as conditional probability relations between observables and hypotheses of normal and malicious usage. The same high-performance Bayes inference library was employed in a component of the Mission-Based Correlation effort, using an initial knowledge base that adaptively learns the security administrator's preference for alert priority and rank. Another major effort demonstrated probabilistic techniques in heterogeneous sensor correlation. We provide results for simulated attack data, live traffic, and the CyberPanel Grand Challenge Problem. Our results establish that model-based probabilistic techniques are an important complementary capability to signature-based methods in detection and correlation.

  13. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactors? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. PMID:26521723

  14. An improved intrusion detection model based on paraconsistent logic

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Zhang, Huanguo; Wang, Lina; Yang, Min

    2005-02-01

A major difficulty of current intrusion detection models is that the attack set cannot be separated thoroughly from the normal set. On the basis of paraconsistent logic, an improved intrusion detection model is proposed to solve this problem. We give a proof that the detection model is trivial and discuss the reasons for false alerts. A parallel paraconsistent detection algorithm is presented to develop detection technology based on our model. An experiment using network connection data, which is commonly used to evaluate intrusion detection methods, illustrates the performance of the model. We use a one-class support vector machine (SVM) to train our profiles and a support vector clustering (SVC) algorithm to update our detection profiles. Results of the experiment indicate that the detection system based on our model can deal with uncertain events and reduce false alerts.

  15. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
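The pipeline in this record (modeling error, statistical significance, persistence, threshold) can be sketched directly. This is a hedged illustration of the general idea, not the patented method: the z-score significance test, the persistence counter, and all numbers are invented for the example.

```python
def detect_anomaly(measurements, expected, sigma, z_limit=3.0, persist_limit=3):
    """Flag a structural anomaly only when the modeling error stays
    statistically significant for several consecutive samples.
    `sigma` is the assumed spread of the modeling error."""
    persistence = 0
    for meas, exp in zip(measurements, expected):
        error = meas - exp                     # modeling error signal
        significant = abs(error) / sigma > z_limit
        persistence = persistence + 1 if significant else 0
        if persistence >= persist_limit:       # sustained, not a glitch
            return True
    return False

expected = [1.0] * 8                           # model-predicted response
healthy  = [1.02, 0.98, 1.01, 0.97, 1.03, 1.0, 0.99, 1.01]
spike    = [1.0, 1.0, 1.9, 1.0, 1.0, 1.0, 1.0, 1.0]       # transient: no alarm
damaged  = [1.0, 1.0, 1.9, 1.8, 1.9, 1.8, 1.9, 1.8]       # persistent: alarm
for series in (healthy, spike, damaged):
    print(detect_anomaly(series, expected, sigma=0.05))
```

The persistence requirement is what separates a genuine structural change from a single noisy sample that happens to exceed the significance limit.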

  16. Model-based approach to real-time target detection

    NASA Astrophysics Data System (ADS)

    Hackett, Jay K.; Gold, Ed V.; Long, Daniel T.; Cloud, Eugene L.; Duvoisin, Herbert A.

    1992-09-01

Land mine detection and extraction from infrared (IR) scenes using real-time parallel processing is of significant interest to ground-based infantry. The mine detection algorithms consist of several sub-processes that progress from raw input IR imagery to feature-based mine nominations. Image enhancement is applied first; this consists of noise and sensor artifact removal. Edge grouping is used to determine the boundaries of objects. The generalized Hough transform, tuned to the land mine signature, acts as a model-based matched nomination filter. Once an object is found, the model is used to guide the labeling of each pixel as background, object, or object boundary. Using these labels to identify object regions, feature primitives are extracted in a high-speed parallel processor. A feature-based screener then compares each object's feature primitives to acceptable values and rejects all objects that do not resemble mines. This operation greatly reduces the number of objects that must be passed from the real-time parallel processor to the classifier. We discuss details of this model-based approach, including results from actual IR field test imagery.

  17. Event rates for WIMP detection

    SciTech Connect

    Vergados, J. D.; Moustakidis, Ch. C.; Oikonomou, V.

    2006-11-28

The event rates for the direct detection of dark matter for various types of WIMPs are presented. In addition to the neutralino of SUSY models, we consider other candidates (exotic scalars as well as particles in Kaluza-Klein and technicolor theories) with masses in the TeV region. One then finds reasonable branching ratios to excited states. Thus the detection of the WIMP can be achieved not only by recoil measurements but also by measuring the de-excitation γ-rays.

  18. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

Remote sensing technology provides a new method for the detection, early warning, mitigation, and relief of natural hazards. Given the suddenness and unpredictable location of natural hazards, as well as the practical demands of hazard work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. First, using the crowdsourcing model and the power of hundreds of millions of Internet users, the evaluation model provides visual interpretation of high-resolution remote sensing images of the hazard area and collects massive amounts of valuable disaster data. Second, it adopts a dynamic-voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers. Third, it pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers. Lastly, it activates the corresponding expert system according to the forecast results. This model breaks the boundaries between geographic information professionals and the public, makes public participation and citizen science a reality, and improves the accuracy and timeliness of hazard assessment results.

  19. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest. Thus knowledge of lightning strike characteristics over this expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  20. Probabilistic model-based approach for heart beat detection.

    PubMed

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity. PMID:27480267

  1. GPU Accelerated Event Detection Algorithm

    Energy Science and Technology Software Center (ESTSC)

    2011-05-25

Smart grid applications require new algorithmic approaches as well as parallel formulations. One of the critical components is the prediction of changes and detection of anomalies within the power grid. State-of-the-art algorithms are not suited to the demands of streaming data analysis: (i) event detection algorithms are needed that can scale with the size of the data; (ii) algorithms are needed that not only handle the multidimensional nature of the data, but also model both spatial and temporal dependencies in the data, which, for the most part, are highly nonlinear; (iii) algorithms are needed that can operate in an online fashion on streaming data. The GAEDA code is a new online anomaly detection technique that takes into account the spatial, temporal, and multidimensional aspects of the data set. The basic idea behind the proposed approach is (a) to convert a multi-dimensional sequence into a univariate time series that captures the changes between successive windows extracted from the original sequence, using singular value decomposition (SVD), and then (b) to apply known anomaly detection techniques for univariate time series. A key challenge for the proposed approach is to make the algorithm scalable to huge datasets by adopting techniques from perturbation theory and incremental SVD analysis. We use recent advances in tensor decomposition techniques, which reduce computational complexity, to monitor the change between successive windows and detect anomalies in the same manner as described above. We therefore propose to develop parallel solutions on many-core systems such as GPUs, because these algorithms involve many numerical operations and are highly data-parallelizable.
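Step (a) above, reducing a multi-dimensional stream to a univariate change series between successive windows, can be sketched in pure Python: compute each window's dominant direction (the top right singular vector, obtained here by power iteration as a lightweight stand-in for a full SVD) and score each pair of adjacent windows by how much that direction rotates. This is an illustrative sketch, not the GAEDA code, and the 2-D sensor stream is invented.

```python
def top_singular_vector(X, iters=100):
    """Dominant right singular vector of window X via power iteration
    on X^T X (a lightweight stand-in for a full SVD)."""
    d = len(X[0])
    v = [1.0 / d ** 0.5] * d
    for _ in range(iters):
        Xv = [sum(row[j] * v[j] for j in range(d)) for row in X]
        w = [sum(X[i][j] * Xv[i] for i in range(len(X))) for j in range(d)]
        n = sum(x * x for x in w) ** 0.5
        v = [x / n for x in w]
    return v

def change_series(data, win):
    """Univariate score per window pair: 1 - |cos angle| between the
    dominant directions of successive windows; spikes mark anomalies."""
    scores = []
    for s in range(0, len(data) - 2 * win + 1, win):
        v1 = top_singular_vector(data[s:s + win])
        v2 = top_singular_vector(data[s + win:s + 2 * win])
        cos = abs(sum(a * b for a, b in zip(v1, v2)))
        scores.append(1.0 - cos)
    return scores

# Hypothetical 2-D sensor stream: steady, then the dominant axis flips.
steady = [[1.0, 0.1], [0.9, 0.1], [1.1, 0.05], [1.0, 0.08]]
shifted = [[0.1, 1.0], [0.05, 0.9], [0.1, 1.1], [0.08, 1.0]]
scores = change_series(steady * 3 + shifted, win=4)
print(scores)
```

Step (b) would then run any univariate anomaly detector (e.g. a threshold on these scores) over the resulting series.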

  2. Detecting Adverse Events Using Information Technology

    PubMed Central

    Bates, David W.; Evans, R. Scott; Murff, Harvey; Stetson, Peter D.; Pizziferri, Lisa; Hripcsak, George

    2003-01-01

    Context: Although patient safety is a major problem, most health care organizations rely on spontaneous reporting, which detects only a small minority of adverse events. As a result, problems with safety have remained hidden. Chart review can detect adverse events in research settings, but it is too expensive for routine use. Information technology techniques can detect some adverse events in a timely and cost-effective way, in some cases early enough to prevent patient harm. Objective: To review methodologies of detecting adverse events using information technology, reports of studies that used these techniques to detect adverse events, and study results for specific types of adverse events. Design: Structured review. Methodology: English-language studies that reported using information technology to detect adverse events were identified using standard techniques. Only studies that contained original data were included. Main Outcome Measures: Adverse events, with specific focus on nosocomial infections, adverse drug events, and injurious falls. Results: Tools such as event monitoring and natural language processing can inexpensively detect certain types of adverse events in clinical databases. These approaches already work well for some types of adverse events, including adverse drug events and nosocomial infections, and are in routine use in a few hospitals. In addition, it appears likely that these techniques will be adaptable in ways that allow detection of a broad array of adverse events, especially as more medical information becomes computerized. Conclusion: Computerized detection of adverse events will soon be practical on a widespread basis. PMID:12595401

  3. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.

  4. Crowd Event Detection on Optical Flow Manifolds.

    PubMed

    Rao, Aravinda S; Gubbi, Jayavardhana; Marusic, Slaven; Palaniswami, Marimuthu

    2016-07-01

    Analyzing crowd events in a video is key to understanding the behavioral characteristics of people (humans). Detecting crowd events in videos is challenging because of articulated human movements and occlusions. The aim of this paper is to detect the events in a probabilistic framework for automatically interpreting the visual crowd behavior. In this paper, crowd event detection and classification in optical flow manifolds (OFMs) are addressed. A new algorithm to detect walking and running events has been proposed, which uses optical flow vector lengths in OFMs. Furthermore, a new algorithm to detect merging and splitting events has been proposed, which uses Riemannian connections in the optical flow bundle (OFB). The longest vector from the OFB provides a key feature for distinguishing walking and running events. Using a Riemannian connection, the optical flow vectors are parallel transported to localize the crowd groups. The geodesic lengths among the groups provide a criterion for merging and splitting events. Dispersion and evacuation events are jointly modeled from the walking/running and merging/splitting events. Our results show that the proposed approach delivers a comparable model to detect crowd events. Using the performance evaluation of tracking and surveillance 2009 dataset, the proposed method is shown to produce the best results in merging, splitting, and dispersion events, and comparable results in walking, running, and evacuation events when compared with other methods. PMID:26219100

  5. Event oriented dictionary learning for complex event detection.

    PubMed

    Yan, Yan; Yang, Yi; Meng, Deyu; Liu, Gaowen; Tong, Wei; Hauptmann, Alexander G; Sebe, Nicu

    2015-06-01

    Complex event detection is a retrieval task with the goal of finding videos of a particular event in a large-scale unconstrained Internet video archive, given example videos and text descriptions. Nowadays, different multimodal fusion schemes of low-level and high-level features are extensively investigated and evaluated for the complex event detection task. However, how to effectively select semantically meaningful high-level concepts from a large pool to assist complex event detection is rarely studied in the literature. In this paper, we propose a novel strategy to automatically select semantically meaningful concepts for the event detection task based on both the event-kit text descriptions and the concepts' high-level feature descriptions. Moreover, we introduce a novel event-oriented dictionary representation based on the selected semantic concepts. Toward this goal, we leverage training images (frames) of the selected concepts from the semantic indexing dataset, with a pool of 346 concepts, in a novel supervised multitask lp-norm dictionary learning framework. Extensive experimental results on the TRECVID multimedia event detection dataset demonstrate the efficacy of our proposed method. PMID:25794390

  6. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.
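
The sequential likelihood detection scheme described above can be sketched, in highly simplified form, as a Wald sequential probability ratio test on scalar innovations; the Gaussian hypotheses and error rates below are illustrative assumptions, not values from the report.

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test on Gaussian samples:
    H0 (normal environment, mean mu0) vs. H1 (changed, mean mu1).
    Returns (decision, number of samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Incremental Gaussian log-likelihood ratio for one sample.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "change", n
        if llr <= lower:
            return "normal", n
    return "undecided", len(samples)
```

Under H0 the accumulated log-likelihood ratio drifts toward the lower threshold and the monitor declares the environment normal; a shift in the innovations mean drives it across the upper threshold, signaling a change.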

  7. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensors and fusion center, as well as variants that introduce physical channel impairments, has been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach in which a complete model of the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure and requires only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  8. A novel interacting multiple model based network intrusion detection scheme

    NASA Astrophysics Data System (ADS)

    Xin, Ruichi; Venkatasubramanian, Vijay; Leung, Henry

    2006-04-01

    In today's information age, information and network security are of primary importance to any organization. Network intrusion is a serious threat to the security of computers and data networks. In an Internet Protocol (IP) based network, intrusions originate in different kinds of packets/messages contained in the open system interconnection (OSI) layer 3 or higher layers. Network intrusion detection and prevention systems observe the layer 3 packets (or layer 4 to 7 messages) to screen for intrusions and security threats. Signature based methods use a pre-existing database that documents intrusion patterns as perceived in layer 3 to 7 protocol traffic and match the incoming traffic against it for potential intrusion attacks. Alternately, network traffic data can be modeled and any large anomaly relative to the established traffic pattern can be detected as network intrusion. The latter method, also known as anomaly based detection, is gaining popularity for its versatility in learning new patterns and discovering new attacks. It is apparent that for reliable performance, an accurate model of the network data needs to be established. In this paper, we illustrate using collected data that network traffic is seldom stationary. We propose the use of multiple models to accurately represent the traffic data. The improvement in reliability of the proposed model is verified by measuring the detection and false alarm rates on several datasets.
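
The multiple-model idea, that traffic explained by any learned regime is normal while traffic explained by none is suspicious, can be caricatured with one Gaussian per regime; the regimes and the z-score threshold below are invented for illustration.

```python
import statistics

def build_models(segments):
    """One Gaussian (mean, std) per traffic regime; `segments` is a list of
    per-regime sample lists (e.g., daytime vs. nighttime packet rates)."""
    return [(statistics.mean(s), statistics.stdev(s)) for s in segments]

def is_anomaly(x, models, z_max=3.0):
    """Flag x as intrusion-like only if it deviates from *every* model:
    nonstationary traffic is normal as long as some regime explains it."""
    return all(abs(x - mu) / sd > z_max for mu, sd in models)

# Hypothetical packet-rate samples for two regimes.
day = [100, 110, 95, 105, 102, 98]
night = [10, 12, 9, 11, 10, 8]
models = build_models([day, night])
```

A single-model detector fit to the pooled data would flag normal nighttime rates as anomalous; the multiple-model test accepts them because the night regime explains them.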

  9. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is key to many underwater computer vision tasks, such as object recognition, locating, and tracking. Given the remarkable visual sensing abilities found in underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are better adapted to underwater environments. However, low accuracy rates and the absence of prior knowledge learning limit their adoption in underwater applications. Aiming to solve the problems caused by inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing mechanism of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several subblocks. The intensity information is extracted to establish a background model that roughly identifies the object and background regions. The texture feature of each pixel in the rough object region is then analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives better performance: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results. PMID:25140194
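
A first-stage, block-intensity background test of the kind described above can be sketched as follows; the block size and threshold are invented for illustration, and the texture-based refinement stage of the hierarchical model is omitted.

```python
import numpy as np

def rough_object_blocks(frame, background, block=8, thresh=20.0):
    """Simplified first stage of a block-based background model (a sketch,
    not the authors' frog-vision model): split the frame into blocks and
    mark those whose mean intensity deviates from the background model as
    the rough object region. A texture test would then refine these blocks."""
    h, w = frame.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for i in range(h // block):
        for j in range(w // block):
            patch = frame[i * block:(i + 1) * block, j * block:(j + 1) * block]
            ref = background[i * block:(i + 1) * block, j * block:(j + 1) * block]
            mask[i, j] = abs(patch.mean() - ref.mean()) > thresh
    return mask

# Synthetic scene: a bright object occupying exactly one 8x8 block.
bg = np.full((32, 32), 50.0)
frame = bg.copy()
frame[8:16, 8:16] = 200.0
mask = rough_object_blocks(frame, bg)
```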

  11. Monitoring the Ocean Acoustic Environment: A Model-Based Detection Approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    2000-03-13

    A model-based approach is applied in the development of a processor designed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an adaptive, model-based processor embedded in a sequential likelihood detection scheme. The trade-off between state-based and innovations-based monitor designs is discussed, conceptually. The underlying theory for the innovations-based design is briefly developed and applied to a simulated data set.

  12. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events to be detected would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  13. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    PubMed

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology for addressing difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It involves the incorporation of analytical models of both the physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks, including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibrational system based on controlled experimental data. PMID:25480059

  14. On event-based optical flow detection

    PubMed Central

    Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko

    2015-01-01

    Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods through plane fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion-related activations. PMID:25941470
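
The plane-fitting family of methods surveyed above can be illustrated with a least-squares fit to a local event cloud: for a plane t = a*x + b*y + c, the normal flow is grad(t) / |grad(t)|^2. This is a generic sketch of the idea, not a specific published algorithm, and the synthetic event data are an assumption for the demonstration.

```python
import numpy as np

def plane_fit_flow(events):
    """Plane-fitting flow from a local cloud of DVS events (x, y, t):
    fit t = a*x + b*y + c by least squares, then recover the normal
    flow vector as grad(t) / |grad(t)|^2 (pixels per unit time)."""
    xs, ys, ts = (np.asarray(c, dtype=float) for c in zip(*events))
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *_ = np.linalg.lstsq(A, ts, rcond=None)
    g2 = a * a + b * b
    return a / g2, b / g2

# Synthetic edge sweeping along +x at 2 px per unit time: events obey t = x / 2.
events = [(x, y, x / 2.0) for x in range(5) for y in range(5)]
vx, vy = plane_fit_flow(events)
```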

  15. Phenological Event Detection from Multitemporal Image Data

    SciTech Connect

    Vatsavai, Raju

    2009-01-01

    Monitoring biomass over large geographic regions for seasonal changes in vegetation and crop phenology is important for many applications. In this paper we present a novel clustering-based change detection method using MODIS NDVI time series data. We use the well-known EM technique to find the GMM parameters and the Bayesian Information Criterion (BIC) to determine the number of clusters. A KL divergence measure is then used to establish the cluster correspondence across two years (2001 and 2006) and determine the changes between them. The changes identified were further analyzed to understand phenological events. This preliminary study shows interesting relationships between key phenological events such as onset, length, and end of growing seasons.
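
Cluster correspondence via KL divergence, as used above, has a closed form when the clusters are Gaussian; the sketch below matches each year-1 cluster to its lowest-divergence year-2 counterpart. The toy cluster parameters are invented.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N0 || N1) between two Gaussian clusters."""
    k = len(mu0)
    inv1 = np.linalg.inv(cov1)
    diff = np.asarray(mu1) - np.asarray(mu0)
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def match_clusters(year1, year2):
    """For each (mean, cov) cluster from year 1, return the index of the
    nearest (lowest-KL) cluster from year 2; unmatched or shifted clusters
    would then be inspected as candidate phenological changes."""
    return [min(range(len(year2)),
                key=lambda j: gaussian_kl(m, c, *year2[j]))
            for m, c in year1]

# Toy NDVI-feature clusters for two years (means and covariances invented).
year1 = [(np.zeros(2), np.eye(2)), (np.full(2, 5.0), np.eye(2))]
year2 = [(np.full(2, 5.1), np.eye(2)), (np.array([0.1, 0.0]), np.eye(2))]
```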

  16. Detection and recognition of indoor smoking events

    NASA Astrophysics Data System (ADS)

    Bien, Tse-Lun; Lin, Chang Hong

    2013-03-01

    Smoking in public indoor spaces has become prohibited in many countries, since it not only affects the health of nearby people but also increases the risk of fire outbreaks. This paper proposes a novel scheme to automatically detect and recognize smoking events using existing surveillance cameras. The main idea of the proposed method is to detect smoking events by recognizing human actions. In this scheme, human pose estimation is introduced to analyze actions from poses. The pose estimation method segments the head and both hands from the body using a skin color detection method. However, skin color methods may fail under insufficient lighting, so lighting compensation is applied to make the skin color detection more accurate. Because body parts may be covered by shadows, which can cause the pose estimation to fail, a Kalman filter is applied to track the missed body parts. After that, we evaluate probability features of the hands approaching the head. A support vector machine (SVM) is applied to learn and recognize smoking events from these probability features. To analyze the performance of the proposed method, datasets recorded from indoor surveillance camera views are tested. The experimental results show the effectiveness of the proposed method, with an accuracy rate of 83.33%.
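
The abstract does not specify the skin color model; a common stand-in is a rule-based test in YCbCr space, sketched below with typical literature ranges for Cb and Cr (these are illustrative values, not the authors' parameters).

```python
def is_skin_ycbcr(r, g, b):
    """Rule-based skin test in YCbCr space: convert an 8-bit RGB pixel
    (ITU-R BT.601 coefficients) and check the chroma components against
    commonly cited skin ranges. A generic sketch, not the paper's model."""
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return 77 <= cb <= 127 and 133 <= cr <= 173

# A typical skin tone passes; saturated green does not.
```

The luma component is deliberately ignored, which is what makes chroma-based rules partially robust to lighting; the lighting compensation step mentioned above would further normalize the input before this test.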

  17. Implementation of a Fractional Model-Based Fault Detection Algorithm into a PLC Controller

    NASA Astrophysics Data System (ADS)

    Kopka, Ryszard

    2014-12-01

    This paper presents results related to the implementation of model-based fault detection and diagnosis procedures in a typical PLC controller. To construct the mathematical model and to implement the PID regulator, non-integer order differential/integral calculus was used. Such an approach allows for more exact control of the process and more precise modelling, which is crucial in model-based diagnostic methods. The theoretical results were verified on a real object in the form of a supercapacitor connected to a PLC controller by a dedicated electronic circuit controlled directly from the PLC outputs.
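
A non-integer order derivative of the kind used for the model and regulator can be approximated with the Grünwald-Letnikov definition; the sketch below is a generic textbook form, not the paper's PLC implementation.

```python
def gl_fractional_derivative(samples, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative at the
    last sample of a uniformly spaced signal with step h. For alpha = 1 it
    reduces to the backward difference, which makes it easy to sanity-check."""
    c, acc = 1.0, samples[-1]
    for j in range(1, len(samples)):
        c *= 1 - (alpha + 1) / j          # recursive binomial coefficient
        acc += c * samples[-1 - j]
    return acc / h ** alpha

# For f(t) = t sampled at h = 0.1, the order-1 derivative is 1.
d1 = gl_fractional_derivative([0.1 * i for i in range(50)], 1.0, 0.1)
half = gl_fractional_derivative([0.1 * i for i in range(50)], 0.5, 0.1)
```

On a resource-constrained PLC the sum is typically truncated to a fixed memory length, trading accuracy for bounded cycle time.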

  18. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding that complicate the transport physics significantly. The development of a model-based sequential Bayesian processor that captures the underlying transport physics, including scattering, offers a physics-based approach to attack this challenging problem. It is shown that this processor can be used to develop an effective detection technique.

  19. Phase-Space Detection of Cyber Events

    SciTech Connect

    Hernandez Jimenez, Jarilyn M; Ferber, Aaron E; Prowell, Stacy J; Hively, Lee M

    2015-01-01

    Energy Delivery Systems (EDS) are a network of processes that produce, transfer and distribute energy. EDS are increasingly dependent on networked computing assets, as are many Industrial Control Systems. Consequently, cyber-attacks pose a real and pertinent threat, as evidenced by Stuxnet, Shamoon and Dragonfly. Hence, there is a critical need for novel methods to detect, prevent, and mitigate effects of such attacks. To detect cyber-attacks in EDS, we developed a framework for gathering and analyzing timing data that involves establishing a baseline execution profile and then capturing the effect of perturbations in the state from injecting various malware. The data analysis was based on nonlinear dynamics and graph theory to improve detection of anomalous events in cyber applications. The goal was the extraction of changing dynamics or anomalous activity in the underlying computer system. Takens' theorem in nonlinear dynamics allows reconstruction of topologically invariant, time-delay-embedding states from the computer data in a sufficiently high-dimensional space. The resultant dynamical states were nodes, and the state-to-state transitions were links in a mathematical graph. Alternatively, sequential tabulation of executing instructions provides the nodes with corresponding instruction-to-instruction links. Graph theorems guarantee graph-invariant measures to quantify the dynamical changes in the running applications. Results showed a successful detection of cyber events.
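
The time-delay embedding and graph construction steps described above can be sketched as follows; the embedding dimension, delay, and quantization are illustrative choices, not the parameters used in the study.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens time-delay embedding: map a scalar series into dim-dimensional
    state vectors (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def transition_graph(states, n_bins=4):
    """Quantize embedded states into symbols (graph nodes) and collect the
    state-to-state transitions (graph links), as in the framework above;
    graph-invariant measures computed on (nodes, links) would then
    quantify dynamical change between baseline and perturbed runs."""
    lo, hi = states.min(), states.max()
    q = np.clip(((states - lo) / (hi - lo + 1e-12) * n_bins).astype(int),
                0, n_bins - 1)
    symbols = tuple(map(tuple, q))
    return set(symbols), set(zip(symbols[:-1], symbols[1:]))

# Toy "timing signal": a sine wave standing in for execution-timing data.
x = np.sin(np.linspace(0, 8 * np.pi, 200))
emb = delay_embed(x, 2, 5)
nodes, links = transition_graph(emb)
```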

  20. Model-Based Detection of Radioactive Contraband for Harbor Defense Incorporating Compton Scattering Physics

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-03-02

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Photon emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding that complicate the transport physics significantly. This problem becomes especially important when ships are intercepted by U.S. Coast Guard harbor patrols searching for contraband. The development of a sequential model-based processor that captures both the underlying transport physics of gamma-ray emissions, including Compton scattering, and the measurement of photon energies offers a physics-based approach to attack this challenging problem. The inclusion of a basic radionuclide representation of absorbed/scattered photons at a given energy, along with interarrival times, is used to extract the physics information available from the noisy measurements of the portable radiation detection systems used to interdict contraband. It is shown that this physics representation can incorporate scattering physics, leading to an 'extended' model-based structure that can be used to develop an effective sequential detection technique. The resulting model-based processor is shown to perform quite well on data obtained from a controlled experiment.

  1. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairings static, modify ARP itself, or require patching the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks that requires no extra constraints such as static IP-MAC pairings or changes to ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, yielding the same DES models for both cases; to create distinguishable ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing, although this probing adds extra ARP traffic to the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine or spoofed by the detector. Spoofed IP-MAC pairs determined by the detector are also stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MitM) and denial-of-service attacks. The scheme is successfully validated in a test bed. PMID:20804980
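
The active-probing idea can be caricatured in a few lines: each observed IP-MAC claim is verified by probing the IP and comparing the set of answering MACs, and probing is skipped for pairs the detector has already classified (which is how extra ARP traffic is minimized). The `probe` callback and the addresses below are stand-ins for real ARP requests on the wire, not the paper's DES machinery.

```python
def arp_detector(observed_pairs, probe):
    """Classify observed (ip, mac) claims as verified or spoofed.
    `probe(ip)` returns the list of MACs answering an ARP probe for ip;
    conflicting answers mean a genuine host and an attacker both replied."""
    verified, spoofed = {}, set()
    for ip, mac in observed_pairs:
        if ip in spoofed or verified.get(ip) == mac:
            continue                     # already classified: no extra traffic
        answers = probe(ip)
        if len(set(answers)) > 1:
            spoofed.add(ip)
        else:
            verified[ip] = answers[0]
    return verified, spoofed

# Simulated wire: 10.0.0.2 is being spoofed, so two MACs answer its probe.
fake_probe = lambda ip: ["aa"] if ip == "10.0.0.1" else ["bb", "cc"]
traffic = [("10.0.0.1", "aa"), ("10.0.0.1", "aa"), ("10.0.0.2", "bb")]
verified, spoofed = arp_detector(traffic, fake_probe)
```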

  2. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C), which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2 degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two hour segment from October 1993. The output at the correct grid point was at least 33% larger than at adjacent grid points, and the output at the correct grid point at the correct origin time was more than 500% larger than the output at the same grid point immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
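
The matrix formulation can be illustrated on toy dimensions: a single matrix product M @ D computes every template-channel correlation once, and each grid point's detector value is then just a sum along that point's path through C. The dimensions and random data below are placeholders, not the system's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_templates, n_samples, n_channels = 4, 64, 3

M = rng.standard_normal((n_templates, n_samples))   # Master Image waveforms
D = rng.standard_normal((n_samples, n_channels))    # one data window per channel
C = M @ D   # all unique template-channel correlations from one matrix product

def detector_value(C, path):
    """Detector output for one grid point: sum C along that point's path,
    i.e., the template each channel is predicted to observe."""
    return sum(C[t, ch] for ch, t in enumerate(path))
```

Because many grid points share the same underlying correlations, reusing C across all summation paths is where the claimed reduction in multiplications comes from.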

  3. Implementation of a model based fault detection and diagnosis technique for actuation faults of the SSME

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1991-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the Space Shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the Space Shuttle Main Engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  4. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    NASA Technical Reports Server (NTRS)

    Walker, M.; Figueroa, F.

    2015-01-01

    The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. Associated loss of pressure control also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere, involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.

  5. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGESBeta

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  7. A model-based approach for detection of objects in low resolution passive millimeter wave images

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Tang, Yuan-Liang; Devadiga, Sadashiva

    1993-01-01

    A model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions is described. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as an airport model and the approximate position and orientation of the aircraft are available. These data are exploited to guide the model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. Analytical expressions were also derived for the accuracy of the camera position estimate obtained by detecting the positions of known objects in the image.

  8. Model-based approach for detection of objects in low-resolution passive-millimeter images

    NASA Astrophysics Data System (ADS)

    Tang, Yuan-Ling; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1994-03-01

    We describe a model-based vision system to assist the pilots in landing maneuvers under restricted visibility conditions. The system has been designed to analyze image sequences obtained from a passive millimeter wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as airport model and approximate position and orientation of aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  9. Rare Event Detection Algorithm Of Water Quality

    NASA Astrophysics Data System (ADS)

    Ungs, M. J.

    2011-12-01

    A novel method is presented describing the development and implementation of an on-line water quality event detection algorithm. An algorithm was developed to distinguish between normal variation in water quality parameters and changes in these parameters triggered by the presence of contaminant spikes. Emphasis is placed on simultaneously limiting the number of false alarms (false positives) that occur and the number of misses (false negatives). The problem of excessive false alarms is common to existing change detection algorithms. EPA's standard measure of evaluation for event detection algorithms is to have a false alarm rate of less than 0.5 percent and a false positive rate less than 2 percent (EPA 817-R-07-002). A detailed description of the algorithm's development is presented. The algorithm is tested using historical water quality data collected by a public water supply agency at multiple locations, with spiking contaminants developed by the USEPA Water Security Division. The water quality parameters of specific conductivity, chlorine residual, total organic carbon, pH, and oxidation reduction potential are considered. Abnormal data sets are generated by superimposing water quality changes on the historical or baseline data. Eddies-ET has defined reaction expressions which specify how the peak or spike concentration of a particular contaminant affects each water quality parameter. Nine default contaminants (Eddies-ET) were previously derived from pipe-loop tests performed at EPA's National Homeland Security Research Center (NHSRC) Test and Evaluation (T&E) Facility. A contaminant strength value of approximately 1.5 is considered to be a significant threat. The proposed algorithm has been able to achieve a combined false alarm rate of less than 0.03 percent for both false positives and false negatives using contaminant spikes of strength 2 or more.
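
A generic baseline-deviation detector illustrates the false-alarm versus miss trade-off discussed above (this is a sketch of the general idea, not the abstract's algorithm): an event is declared only when several consecutive samples deviate strongly from a trailing baseline window. The window length, z-threshold, and persistence count are illustrative choices.

```python
from collections import deque
from statistics import mean, stdev

def spike_events(series, window=20, z=4.0, persist=2):
    """Return indices where a contaminant-spike-like event is declared.
    Requiring `persist` consecutive exceedances of z baseline standard
    deviations trades a little detection delay for fewer false alarms."""
    hist, run, events = deque(maxlen=window), 0, []
    for i, x in enumerate(series):
        if len(hist) == window:
            mu, sd = mean(hist), stdev(hist)
            run = run + 1 if abs(x - mu) > z * max(sd, 1e-9) else 0
            if run == persist:           # fire once per sustained excursion
                events.append(i)
        hist.append(x)
    return events
```

A single-sample glitch never reaches the persistence count, so it is ignored; a sustained spike is flagged one sample after it begins.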

  10. Model-based estimation of measures of association for time-to-event outcomes

    PubMed Central

    2014-01-01

Background Hazard ratios are ubiquitously used in time-to-event applications to quantify adjusted covariate effects. Although hazard ratios are invaluable for hypothesis testing, other adjusted measures of association, both relative and absolute, should be provided to fully appreciate study results. The corrected group prognosis method is generally used to estimate the absolute risk reduction and the number needed to treat for categorical covariates. Methods The goal of this paper is to present transformation models for time-to-event outcomes that yield, directly from estimated coefficients, the measures of association widely used in biostatistics together with their confidence intervals. Pseudo-values are used for a practical estimation of transformation models. Results Using the regression model estimated through pseudo-values with suitable link functions, relative risks, risk differences and the number needed to treat are obtained together with their confidence intervals. One example based on literature data and one original application to the study of prognostic factors in primary retroperitoneal soft tissue sarcomas are presented. A simulation study is used to show some properties of the different estimation methods. Conclusions Clinically useful measures of treatment or exposure effect are widely available in epidemiology. When time-to-event outcomes are present, the analysis is generally performed by resorting to predicted values from the Cox regression model. It is now possible to resort to more general regression models, adopting suitable link functions and pseudo-values for estimation, to obtain alternative measures of effect directly from regression coefficients together with their confidence intervals. This may be especially useful when, in the presence of time-dependent covariate effects, it is not straightforward to specify the correct, if any, time-dependent functional form. The method can easily be implemented with standard software. PMID:25106903
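The pseudo-value device the abstract relies on can be shown in a toy setting. The sketch below assumes complete (uncensored) data, rather than the Kaplan-Meier-based pseudo-values a real analysis would use, so that the leave-one-out jackknife construction is easy to verify by hand:

```python
def surv_prob(times, t):
    """Empirical survival probability S(t) = P(T > t), no censoring."""
    return sum(ti > t for ti in times) / len(times)

def pseudo_values(times, t):
    """Leave-one-out jackknife pseudo-observations for S(t):
    theta_i = n*theta - (n-1)*theta_(-i)."""
    n = len(times)
    theta = surv_prob(times, t)
    return [n * theta - (n - 1) * surv_prob(times[:i] + times[i + 1:], t)
            for i in range(n)]

times = [2.0, 5.0, 7.0, 9.0, 12.0]
pv = pseudo_values(times, t=6.0)
# With complete data each pseudo-value reduces to the indicator 1{T_i > t},
# and their mean recovers S(t) itself
print(pv, sum(pv) / len(pv))
```

The pseudo-values can then be used as the response in an ordinary regression with a chosen link function, which is how the paper obtains relative risks and risk differences directly from coefficients.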

  11. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
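The residual-monitoring step described above can be sketched simply. This is illustrative only: the tolerance and the persistence requirement are assumptions, not values from the paper, and a real implementation would drive the predictions from the piecewise linear engine model rather than a fixed trace:

```python
def monitor_residuals(sensed, predicted, tol=2.0, persist=3):
    """Flag an anomaly when the residual between sensed and
    model-predicted output exceeds `tol` for `persist` consecutive
    samples, which suppresses single-sample noise spikes."""
    run, alarms = 0, []
    for k, (y, yhat) in enumerate(zip(sensed, predicted)):
        run = run + 1 if abs(y - yhat) > tol else 0
        if run == persist:
            alarms.append(k - persist + 1)  # index where the excursion began
    return alarms

pred = [100.0] * 20
meas = [100.0] * 20
meas[7] = 104.0                       # isolated spike: ignored
for k in range(12, 20):
    meas[k] = 106.0                   # sustained shift: flagged
print(monitor_residuals(meas, pred))  # -> [12]
```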

  12. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  13. A model-based framework for the detection of spiculated masses on mammography

    SciTech Connect

    Sampat, Mehul P.; Bovik, Alan C.; Whitman, Gary J.; Markey, Mia K.

    2008-05-15

The detection of lesions on mammography is a repetitive and fatiguing task. Thus, computer-aided detection systems have been developed to aid radiologists. The detection accuracy of current systems is much higher for clusters of microcalcifications than for spiculated masses. In this article, the authors present a new model-based framework for the detection of spiculated masses. The authors have invented a new class of linear filters, spiculated lesion filters, for the detection of converging lines or spiculations. These filters are highly specific narrowband filters designed to match the expected structures of spiculated masses. As part of this algorithm, the authors have also invented a novel technique to enhance spicules on mammograms, which entails filtering in the Radon domain. They have also developed models to reduce the false positives due to normal linear structures. A key contribution of this work is that the parameters of the detection algorithm are based on measurements of physical properties of spiculated masses. The results of the detection algorithm are presented in the form of free-response receiver operating characteristic curves on images from the Mammographic Image Analysis Society and Digital Database for Screening Mammography databases.

  14. Model-based fault detection and identification with online aerodynamic model structure selection

    NASA Astrophysics Data System (ADS)

    Lombaerts, T.

    2013-12-01

This publication describes a recursive algorithm for the approximation of time-varying nonlinear aerodynamic models by means of a joint adaptive selection of the model structure and parameter estimation. This procedure, called adaptive recursive orthogonal least squares (AROLS), is an extension and modification of the previously developed ROLS procedure. The algorithm is particularly useful for model-based fault detection and identification (FDI) of aerospace systems. After a failure, a completely new aerodynamic model can be identified recursively, with respect to both structure and parameter values. The performance of the identification algorithm is demonstrated on a simulation data set.
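The recursive parameter-estimation half of such a scheme can be sketched with plain recursive least squares. This is a stand-in for illustration, not the AROLS algorithm itself (which additionally adapts the model structure); the coefficient values and forgetting factor are assumptions:

```python
import random

def rls_update(theta, P, x, y, lam=0.99):
    """One recursive least squares step with forgetting factor lam.
    Forgetting lets the estimate track time-varying aerodynamics."""
    n = len(x)
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(x[i] * Px[i] for i in range(n))
    k = [p / denom for p in Px]                       # gain vector
    err = y - sum(t * xi for t, xi in zip(theta, x))  # prediction error
    theta = [t + ki * err for t, ki in zip(theta, k)]
    P = [[(P[i][j] - k[i] * Px[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P

# Recover hypothetical aerodynamic coefficients [2, -1] from noiseless data
random.seed(3)
theta, P = [0.0, 0.0], [[1000.0, 0.0], [0.0, 1000.0]]
for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    y = 2.0 * x[0] - 1.0 * x[1]
    theta, P = rls_update(theta, P, x, y)
print([round(t, 3) for t in theta])  # -> [2.0, -1.0]
```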

  15. Observer and data-driven-model-based fault detection in power plant coal mills

    SciTech Connect

    Odgaard, P.F.; Lin, B.; Jorgensen, S.B.

    2008-06-15

This paper presents and compares model-based and data-driven fault detection approaches for coal mill systems. The first approach detects faults with an optimal unknown input observer developed from a simplified energy balance model. Due to the time-consuming effort in developing a first principles model with motor power as the controlled variable, data-driven methods for fault detection are also investigated. Regression models that represent normal operating conditions (NOCs) are developed with both static and dynamic principal component analysis and partial least squares methods. The residual between process measurement and the NOC model prediction is used for fault detection. A hybrid approach, where a data-driven model is employed to derive an optimal unknown input observer, is also implemented. The three methods are evaluated with case studies on coal mill data, which include a fault caused by a blocked inlet pipe. All three approaches detect the fault as it emerges. The optimal unknown input observer approach is the most robust, in that it has no false positives. On the other hand, the data-driven approaches are more straightforward to implement, since they only require the selection of an appropriate confidence limit to avoid false detections. The proposed hybrid approach is promising for systems where a first principles model is cumbersome to obtain.
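A static-PCA fault detector of the kind described can be sketched as follows. The two-channel "sensor" data, the single retained component, and the empirical 99% confidence limit are all assumptions for illustration; the residual statistic is the standard squared prediction error (Q statistic):

```python
import numpy as np

rng = np.random.default_rng(0)
# Normal operating condition (NOC) data: two correlated "sensor" channels
t = rng.normal(size=400)
noc = np.column_stack([t, 2 * t]) + 0.05 * rng.normal(size=(400, 2))

mean = noc.mean(axis=0)
X = noc - mean
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:1].T                        # retain one principal component

def spe(x):
    """Squared prediction error (Q statistic): the part of a sample
    not explained by the NOC principal-component model."""
    r = (x - mean) - (x - mean) @ P @ P.T
    return float(r @ r)

limit = np.quantile([spe(x) for x in noc], 0.99)  # empirical 99% limit
normal_pt = np.array([1.0, 2.0])    # follows the NOC correlation
faulty_pt = np.array([1.0, -2.0])   # breaks the correlation
print(spe(normal_pt) < limit, spe(faulty_pt) > limit)  # -> True True
```

The confidence limit is the only tuning choice, which matches the abstract's point that the data-driven route is straightforward to implement.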

  16. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification deliver the initial multi-class classification map. The DTM is then calculated through an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene's buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.

  17. Comparison of chiller models for use in model-based fault detection

    SciTech Connect

    Sreedharan, Priya; Haves, Philip

    2001-06-07

Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Factors considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor-compression chillers. Three models were studied: the second-generation Gordon and Ng Universal Chiller model and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools™, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.

  18. Detecting seismic events using Benford's Law

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Gallart, Josep; Ruiz, Mario

    2015-04-01

Benford's Law (BL) states that the distribution of first significant digits is not uniform but follows a logarithmic frequency distribution. Although a remarkably wide range of natural and socioeconomic data sets, from stock market values to quantum phase transitions, fit this peculiar law, conformity to it has seen few scientific applications, being used mainly as a test to pinpoint anomalous or fraudulent data. We developed a procedure to detect the arrival of seismic waves based on the degree of conformity of the amplitude values in the raw seismic trace to the BL. The signal is divided into time windows of appropriate length and the fit of the first-digit distribution to the BL is checked in each time window using a conformity estimator. We document that both teleseismic and local earthquakes can be clearly identified with this procedure and we compare its performance with that of the classical STA/LTA approach. Moreover, we show that the conformity of the seismic record to the BL does not depend on the amplitude of the incoming series, as events with very different amplitudes result in quite similar degrees of BL fit. On the other hand, we show that natural or man-made quasi-monochromatic seismic signals, surface wave trains or engine-generated vibrations can be identified through their very low BL estimator values, when appropriate interval lengths are used. Therefore, we conclude that the degree of conformity of a seismic signal with the BL depends primarily on the frequency content of that signal.
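A Benford conformity estimator of the kind described can be sketched directly. The misfit measure below (mean absolute deviation from the Benford frequencies) and the synthetic "seismic" series are assumptions for illustration, not the authors' estimator or data:

```python
import math
import random
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    x = abs(x)
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def benford_misfit(window):
    """Mean absolute deviation between the window's first-digit
    frequencies and Benford's Law; smaller means better conformity."""
    vals = [v for v in window if v != 0]
    counts = Counter(first_digit(v) for v in vals)
    n = len(vals)
    return sum(abs(counts.get(d, 0) / n - p) for d, p in BENFORD.items()) / 9

# Broadband amplitudes spanning orders of magnitude conform well;
# a quasi-monochromatic oscillation (amplitudes all alike) does not.
random.seed(1)
broadband = [random.lognormvariate(0, 2.5) for _ in range(2000)]
monochromatic = [5.0 * math.sin(0.1 * k) for k in range(1, 2000)]
print(benford_misfit(broadband) < benford_misfit(monochromatic))  # -> True
```

This mirrors the abstract's conclusion: conformity tracks the signal's frequency/amplitude diversity, not its absolute amplitude.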

  19. AESOP: Adaptive Event detection SOftware using Programming by example

    NASA Astrophysics Data System (ADS)

    Thangali, Ashwin; Prasad, Harsha; Kethamakka, Sai; Demirdjian, David; Checka, Neal

    2015-05-01

This paper presents AESOP, a software tool for automatic event detection in video. AESOP employs a supervised learning approach for constructing event models, given training examples from different event classes. A trajectory-based formulation is used for modeling events with an aim towards incorporating invariance to changes in the camera location and orientation parameters. The proposed formulation is designed to accommodate events that involve interactions between two or more entities over an extended period of time. AESOP's event models are formulated as HMMs to improve the event detection algorithm's robustness to noise in input data and to achieve computationally efficient algorithms for event model training and event detection. AESOP's performance is demonstrated on a wide range of different scenarios, including stationary camera surveillance and aerial video footage captured in land and maritime environments.

  20. Model-based imputation approach for data analysis in the presence of non-detects.

    PubMed

    Krishnamoorthy, K; Mallick, Avishek; Mathew, Thomas

    2009-04-01

A model-based multiple imputation approach for analyzing sample data with non-detects is proposed. The imputation approach involves randomly generating observations below the detection limit using the detected sample values and then analyzing the data using complete-sample techniques, along with suitable adjustments to account for the imputation. The method is described for the normal case and is illustrated for constructing prediction limits and tolerance limits, for setting an upper bound for an exceedance probability, and for interval estimation of a log-normal mean. Two imputation approaches are investigated in the paper: one uses approximate maximum likelihood estimates (MLEs) of the parameters and the second uses simple ad hoc estimates developed for the specific purpose of imputation. The accuracy of the approaches is verified using Monte Carlo simulation. Simulation studies show that both approaches are very satisfactory for small to moderately large sample sizes, but only the MLE-based approach is satisfactory for large sample sizes. The MLE-based approach can be calibrated to perform very well for large samples. Applicability of the method to the log-normal distribution and the gamma distribution (via a cube-root transformation) is outlined. Simulation studies also show that the imputation approach works well for constructing tolerance limits and prediction limits for a gamma distribution. The approach is illustrated using a few practical examples. PMID:19181626
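One round of the imputation step can be sketched as follows. This uses crude moment estimates from the detected values only (in the spirit of the paper's "ad hoc" variant, though not its actual estimators) and an accept-reject draw from the left tail of a normal; the data values and detection limit are hypothetical:

```python
import random
import statistics

def impute_nondetects(detects, n_nd, dl, rng):
    """Singly impute n_nd non-detects below detection limit `dl`,
    assuming normally distributed data. Parameters are rough estimates
    from the detected values (not the paper's MLE-based approach)."""
    mu = statistics.fmean(detects)    # crude; biased high with many NDs
    sigma = statistics.stdev(detects)
    imputed = []
    while len(imputed) < n_nd:
        x = rng.gauss(mu, sigma)
        if x < dl:                    # accept only draws from the
            imputed.append(x)         # tail below the detection limit
    return detects + imputed

rng = random.Random(42)
detects = [2.3, 3.1, 2.8, 4.0, 3.5, 2.6]
full = impute_nondetects(detects, n_nd=3, dl=2.0, rng=rng)
print(len(full), all(x < 2.0 for x in full[6:]))  # -> 9 True
```

In the full method this draw is repeated to produce multiple completed data sets, and the downstream inferences are adjusted for the extra imputation variability.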

  1. 3D model-based detection and tracking for space autonomous and uncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Shang, Yang; Zhang, Yueqiang; Liu, Haibo

    2015-10-01

In order to navigate fully using a vision sensor, a 3D edge-model-based detection and tracking technique was developed. Firstly, we proposed a target detection strategy over a sequence of several images that uses the 3D model to initialize tracking. The overall purpose of this approach is to robustly match each image with the model views of the target. Thus we designed a line segment detection and matching method based on multi-scale space technology. Experiments on real images showed that our method is highly robust under various image changes. Secondly, we proposed a method based on a 3D particle filter (PF) coupled with M-estimation to track the target and estimate its pose efficiently. In the proposed approach, a similarity observation model was designed according to a new distance function between line segments. Then, based on the tracking results of the PF, the pose was optimized using M-estimation. Experiments indicated that the proposed method can effectively track and accurately estimate the pose of a freely moving target in an unconstrained environment.

  2. Model-based approach to the detection and classification of mines in sidescan sonar.

    PubMed

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-10

This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis. PMID:14735943

  3. Model-based approach to the detection and classification of mines in sidescan sonar

    NASA Astrophysics Data System (ADS)

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-01

This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis.

  4. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the "wobble" and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by four-part prompts consisting of a multiple-choice claim, an open-ended explanation, a Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of the task, such as the use of noisy data or the framing of

  5. Asynchronous event-based corner detection and matching.

    PubMed

    Clady, Xavier; Ieng, Sio-Hoi; Benosman, Ryad

    2015-06-01

    This paper introduces an event-based luminance-free method to detect and match corner events from the output of asynchronous event-based neuromorphic retinas. The method relies on the use of space-time properties of moving edges. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating "spiking" events that encode relative changes in pixels' illumination at high temporal resolutions. Corner events are defined as the spatiotemporal locations where the aperture problem can be solved using the intersection of several geometric constraints in events' spatiotemporal spaces. A regularization process provides the required constraints, i.e. the motion attributes of the edges with respect to their spatiotemporal locations using local geometric properties of visual events. Experimental results are presented on several real scenes showing the stability and robustness of the detection and matching. PMID:25828960

  6. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  7. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J [Oak Ridge, TN; Worley, Brian A [Knoxville, TN

    2005-08-16

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  8. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that post-processes a CSV file containing the sensor readings and activities (time series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective, and in general met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The
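The GMM-plus-EM fitting step mentioned at the end of the abstract can be shown in one dimension. This toy version (two components, deterministic initialization from the data range, synthetic data) is an illustration of EM and a likelihood-based anomaly score, not the paper's multi-variate Linear Dynamic System method:

```python
import math
import random
import statistics

def npdf(x, mu, s):
    return math.exp(-((x - mu) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def em_gmm_1d(data, iters=100):
    """Fit a two-component 1-D Gaussian mixture by EM."""
    mus = [min(data), max(data)]         # deterministic, well-spread init
    sigmas = [statistics.pstdev(data)] * 2
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w * npdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)]
            tot = sum(p)
            resp.append([pi / tot for pi in p])
        # M-step: re-estimate weights, means, and variances
        for j in range(2):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / len(data)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sigmas[j] = max(1e-3, math.sqrt(var))
    return weights, mus, sigmas

def anomaly_score(x, weights, mus, sigmas):
    """Negative log-likelihood under the fitted mixture; high = anomalous."""
    return -math.log(sum(w * npdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)))

random.seed(1)
data = [random.gauss(0, 1) for _ in range(150)] + [random.gauss(10, 1) for _ in range(150)]
params = em_gmm_1d(data)
# A reading between the two nominal clusters scores as more anomalous
print(anomaly_score(5.0, *params) > anomaly_score(0.2, *params))  # -> True
```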

  9. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-01-01

Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data of soil properties showing how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurrence regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191
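The two-step window-based scheme (detect, then assign to the nearest class) can be sketched as follows. The class prototypes, the signal-strength values, and the mean-absolute-change detection rule are hypothetical stand-ins, and nearest-prototype assignment here replaces the paper's Bayesian-derived decision rule:

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

PROTOTYPES = {
    # mean RSS change (dB) per node for each geo-event; hypothetical values
    "water_intrusion": [-6.0, -5.0, -4.5],
    "density_change":  [2.0, 2.5, 1.5],
    "relative_motion": [-1.0, 4.0, -3.0],
}

def classify_window(window, detect_thresh=1.0):
    """Step 1: declare an event when the mean absolute RSS change
    exceeds detect_thresh. Step 2: assign the nearest class prototype
    (minimum-distance classification over the window)."""
    if sum(abs(v) for v in window) / len(window) < detect_thresh:
        return "no_event"
    return min(PROTOTYPES, key=lambda c: euclid(window, PROTOTYPES[c]))

print(classify_window([-5.5, -4.8, -4.0]))  # -> water_intrusion
print(classify_window([0.2, -0.1, 0.3]))    # -> no_event
```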

  10. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data of soil properties showing how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurrence regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191

  11. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70 s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B, or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC, and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions, such as the 2000 Miyakejima sequence near Japan, and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
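
The envelope-stacking step lends itself to a compact sketch: delay each station's envelope by the Rayleigh-wave travel time predicted for a trial source location, then sum. The group velocity, station distances, and spike envelopes below are illustrative assumptions, not the study's actual processing.

```python
import numpy as np

def stack_envelopes(envelopes, distances_km, dt=1.0, vel_km_s=3.7):
    """Shift each envelope back by its predicted travel time and sum."""
    n = min(len(e) for e in envelopes)
    stack = np.zeros(n)
    for env, dist in zip(envelopes, distances_km):
        shift = int(round(dist / vel_km_s / dt))  # delay in samples
        shifted = np.roll(env[:n], -shift)        # undo the propagation delay
        if shift:
            shifted[-shift:] = 0.0                # drop wrapped-around samples
        stack += shifted
    return stack

# Synthetic check: an event at t = 100 s seen at two stations.
vel = 3.7
env1 = np.zeros(600)
env2 = np.zeros(600)
env1[100 + int(370 / vel)] = 1.0   # station 370 km from the trial location
env2[100 + int(740 / vel)] = 1.0   # station 740 km away
stack = stack_envelopes([env1, env2], [370.0, 740.0])
origin = int(np.argmax(stack))     # peak marks the candidate origin time
```

Repeating this stack over a grid of trial locations (the 1654 target points above) turns the peak search into a joint detection and location step.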

  12. Event Detection using Twitter: A Spatio-Temporal Approach

    PubMed Central

    Cheng, Tao; Wicks, Thomas

    2014-01-01

    Background Every day, around 400 million tweets are sent worldwide, which has become a rich source for detecting, monitoring and analysing news stories and special (disaster) events. Existing research within this field follows key words attributed to an event, monitoring temporal changes in word usage. However, this method requires prior knowledge of the event in order to know which words to follow, and does not guarantee that the words chosen will be the most appropriate to monitor. Methods This paper suggests an alternative methodology for event detection using space-time scan statistics (STSS). This technique looks for clusters within the dataset across both space and time, regardless of tweet content. It is expected that clusters of tweets will emerge during spatio-temporally relevant events, as people will tweet more than expected in order to describe the event and spread information. The special event used as a case study is the 2013 London helicopter crash. Results and Conclusion A spatio-temporally significant cluster is found relating to the London helicopter crash. Although the cluster only remains significant for a relatively short time, it is rich in information, such as important key words and photographs. The method also detects other special events such as football matches, as well as train and flight delays from Twitter data. These findings demonstrate that STSS is an effective approach to analysing Twitter data for event detection. PMID:24893168
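
The core of a space-time scan statistic is scoring each space-time cylinder by how much its tweet count exceeds the baseline. A minimal sketch of Poisson likelihood-ratio scoring in the spirit of Kulldorff's scan statistic follows; the counts and baseline rate are synthetic assumptions, and the real STSS also permutes the data to assess significance.

```python
import math

def poisson_lr_score(c, e, total):
    """Kulldorff-style log-likelihood ratio for a space-time cylinder
    with c observed and e expected tweets out of `total` overall."""
    if c <= e:
        return 0.0                      # only count excesses
    inside = c * math.log(c / e)
    outside = (total - c) * math.log((total - c) / (total - e))
    return inside + outside

total_tweets = 1000
expected = 5.0        # baseline rate for this area and hour (assumed)
observed = 40         # burst of tweets after the incident
score = poisson_lr_score(observed, expected, total_tweets)
```

In practice the cylinder with the maximum score over all centers, radii, and time windows is reported, and its p-value is estimated by Monte Carlo replication of the baseline.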

  13. Video Event Detection Framework on Large-Scale Video Data

    ERIC Educational Resources Information Center

    Park, Dong-Jun

    2011-01-01

Detection of events and actions in video entails substantial processing of very large, even open-ended, video streams. Video data present a unique challenge for the information retrieval community because properly representing video events is difficult. We propose a novel approach to analyze temporal aspects of video data. We consider video data…


  14. Automatic detection of iceberg calving events using seismic observations

    NASA Astrophysics Data System (ADS)

    Andersen, M. L.; Larsen, T.; Hamilton, G. S.; Nettles, M.

    2014-12-01

    Iceberg calving at large, marine-terminating glaciers has been shown to be seismogenic. Seismic energy from these events is released slowly, resulting in characteristic low-frequency signals. The events therefore typically escape detection by traditional systematic methods. Here we show the results of a detection algorithm applied to data observed at two stations, both ~100 km from Helheim Glacier, South East Greenland, in 2007 and 2008 for the purpose of detecting calving-related seismic signals. The detector entails sliding a 150 s wide window over the observed vertical displacement seismograms at steps of one second. Relative power in the 1.1-3.3 s band is monitored, and the detector is activated when a pre-defined threshold is exceeded. We determine the threshold by calibrating the detector with a record of known events observed by time lapse cameras at Helheim Glacier and automatic detections of glacial earthquakes from the GSN (Global Seismic Network) stations. The resulting list of detections is then filtered for events overlapping with tectonic events, both local and global. We observe a clear periodicity in the detections, with most events occurring during the late summer and early fall, roughly coinciding with the end of the melt season. This apparent offset from peak melt intensity leads us to speculate that the pattern in calving is the result of a combination of the seasonal development of multiple physical properties of the glacier, i.e., surface crevassing, subglacial melt and crevassing, and the subglacial drainage system.
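
The sliding-window band-power detector described above can be sketched as follows. The sampling rate, threshold, and synthetic trace are illustrative assumptions, and the 1.1-3.3 s band is interpreted here as a period band (about 0.3-0.9 Hz).

```python
import numpy as np

def relative_band_power(window, fs, f_lo, f_hi):
    """Fraction of spectral power between f_lo and f_hi (Hz)."""
    spec = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].sum() / spec.sum()

def detect(trace, fs, win_s=150, step_s=1, threshold=0.5,
           f_lo=1 / 3.3, f_hi=1 / 1.1):      # the 1.1-3.3 s period band
    """Start times (s) of windows whose band-power fraction exceeds threshold."""
    n, step = int(win_s * fs), int(step_s * fs)
    return [start / fs for start in range(0, len(trace) - n + 1, step)
            if relative_band_power(trace[start:start + n], fs, f_lo, f_hi) > threshold]

fs = 5.0                                      # Hz, assumed sampling rate
t = np.arange(0, 600, 1 / fs)
trace = 0.05 * np.random.default_rng(0).standard_normal(t.size)
calving = (t >= 300) & (t < 400)              # synthetic calving signal
trace[calving] += np.sin(2 * np.pi * t[calving] / 2.0)   # 2 s period
hits = detect(trace, fs)
```

White noise spreads its power across the whole band, so its in-band fraction stays near 0.25 here and never trips the 0.5 threshold; only windows overlapping the narrow-band calving signal trigger.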

  15. An Organic Model for Detecting Cyber Events

    SciTech Connect

    Oehmen, Christopher S.; Peterson, Elena S.; Dowson, Scott T.

    2010-04-21

Cyber entities in many ways mimic the behavior of organic systems. Individuals or groups compete for limited resources using a variety of strategies, and effective strategies are re-used and refined in later ‘generations’. Traditionally this drift has made detection of malicious entities very difficult because 1) recognition systems are often built on exact matching to a pattern that can only be ‘learned’ after a malicious entity reveals itself, and 2) the enormous volume and variation in benign entities is an overwhelming source of previously unseen entities that often confound detectors. To turn the tables of complexity on the would-be attackers, we have developed a method for mapping the sequence of behaviors in which cyber entities engage to strings of text, and we analyze these strings using modified bioinformatics algorithms. Bioinformatics algorithms optimize the alignment between text strings even in the presence of mismatches, insertions, or deletions, and do not require an a priori definition of the patterns one is seeking. Nor do they require any type of exact matching. This allows the data itself to suggest meaningful patterns that are conserved between cyber entities. We demonstrate this method on data generated from network traffic. The impact of this approach is that it can rapidly calculate similarity measures of previously unseen cyber entities in terms of well-characterized entities. These measures may also be used to organize large collections of data into families, making it possible to identify motifs indicative of each family.
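
The behavior-string idea can be illustrated with a Smith-Waterman local alignment, the classic bioinformatics algorithm that tolerates insertions, deletions, and mismatches without any predefined pattern. The behavior alphabet and scores below are hypothetical, not the paper's encoding.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Best local alignment score between two behavior strings."""
    H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Behaviors mapped to letters, e.g. S=scan, C=connect, A=authenticate, X=exfiltrate.
known_malware = "SSCAX"
unknown_entity = "SSCCAX"   # same strategy with one extra connect
benign_client = "CACACA"
sim_known = smith_waterman(known_malware, unknown_entity)
sim_benign = smith_waterman(known_malware, benign_client)
```

The unknown entity aligns almost perfectly with the known strategy despite the inserted behavior, while the benign sequence scores low, which is exactly the "similarity to well-characterized entities" measure the abstract describes.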

  16. Detection of flood events in hydrological discharge time series

    NASA Astrophysics Data System (ADS)

    Seibert, S. P.; Ehret, U.

    2012-04-01

The shortcomings of mean-squared-error (MSE) based distance metrics are well known (Beran 1999, Schaeffli & Gupta 2007), and the development of novel distance metrics (Pappenberger & Beven 2004, Ehret & Zehe 2011) and multi-criteria approaches enjoys increasing popularity (Reusser 2009, Gupta et al. 2009). Nevertheless, the hydrological community still lacks metrics which identify and thus allow signature-based evaluations of hydrological discharge time series. Signature-based information/evaluations are required wherever specific time series features, such as flood events, are of special concern. Calculation of event-based runoff coefficients or precise knowledge of flood event characteristics (like the onset or duration of the rising limb, or the volume of the falling limb, etc.) are possible applications. The same applies to flood forecasting/simulation models. Directly comparing simulated and observed flood event features may reveal thorough insights into model dynamics. Compared to continuous space-and-time-aggregated distance metrics, event-based evaluations may provide answers like the distributions of event characteristics or the percentage of events which were actually reproduced by a hydrological model. It may also help to provide information on the simulation accuracy of small, medium, and/or large events in terms of timing and magnitude. However, the number of approaches which expose time series features is small and their usage is limited to very specific questions (Merz & Blöschl 2009, Norbiato et al. 2009). We believe this is due to the following reasons: i) a generally accepted definition of the signature of interest is missing or difficult to obtain (in our case: what makes a flood event a flood event?) and/or ii) it is difficult to translate such a definition into an equation or (graphical) procedure which exposes the feature of interest in the discharge time series. 
We reviewed approaches which detect event starts and/or ends in hydrological discharge time

  17. Stable algorithm for event detection in event-driven particle dynamics: logical states

    NASA Astrophysics Data System (ADS)

    Strobl, Severin; Bannerman, Marcus N.; Pöschel, Thorsten

    2016-07-01

    Following the recent development of a stable event-detection algorithm for hard-sphere systems, the implications of more complex interaction models are examined. The relative location of particles leads to ambiguity when it is used to determine the interaction state of a particle in stepped potentials, such as the square-well model. To correctly predict the next event in these systems, the concept of an additional state that is tracked separately from the particle position is introduced and integrated into the stable algorithm for event detection.

  18. Structuring an event ontology for disease outbreak detection

    PubMed Central

    Kawazoe, Ai; Chanlekha, Hutchatai; Shigematsu, Mika; Collier, Nigel

    2008-01-01

Background This paper describes the design of an event ontology being developed for application in the machine understanding of infectious disease-related events reported in natural language text. This event ontology is designed to support timely detection of disease outbreaks and rapid judgment of their alerting status by 1) bridging a gap between layman's language used in disease outbreak reports and public health experts' deep knowledge, and 2) making multi-lingual information available. Construction and content This event ontology integrates a model of experts' knowledge for disease surveillance with sets of linguistic expressions that denote disease-related events and formal definitions of events. In this ontology, rather general event classes, which are suitable for application to language-oriented tasks such as recognition of event expressions, are placed at the upper level, and more specific events of interest to experts are at the lower level. Each class is related to other classes which represent participants of events, and linked with multi-lingual synonym sets and axioms. Conclusions We consider that the design of the event ontology and the methodology introduced in this paper are applicable to other domains which require integration of natural language information and machine support for experts to assess them. The first version of the ontology, with about 40 concepts, will be available in March 2008. PMID:18426553

  19. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394
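
The event-source step above amounts to finding the barycenter of a weighted graph: the vertex minimizing the total shortest-path distance to all other vertices. A self-contained sketch with a toy coverage graph follows; the node names, topology, and deviation-based weights are illustrative assumptions.

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src in a weighted undirected graph."""
    dist = {v: float("inf") for v in adj}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def barycenter(adj):
    """Vertex minimizing the total distance to all other vertices."""
    return min(adj, key=lambda v: sum(dijkstra(adj, v).values()))

# Edge weights reflect how strongly sensory data deviate between neighbors.
coverage_graph = {
    "n1": [("n2", 1.0), ("n3", 2.0)],
    "n2": [("n1", 1.0), ("n3", 1.0), ("n4", 3.0)],
    "n3": [("n1", 2.0), ("n2", 1.0), ("n4", 2.5)],
    "n4": [("n2", 3.0), ("n3", 2.5)],
}
source = barycenter(coverage_graph)
```

On an underwater sensor node this computation would run at the sink after the coverage graph is assembled from relayed sensory data.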


  1. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    PubMed Central

    Liu, Changyu; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches. PMID:25147840

  2. Adaptive noise estimation and suppression for improving microseismic event detection

    NASA Astrophysics Data System (ADS)

    Mousavi, S. Mostafa; Langston, Charles A.

    2016-09-01

Microseismic data recorded by surface arrays are often strongly contaminated by unwanted noise. This background noise makes the detection of small-magnitude events difficult. A noise level estimation and noise reduction algorithm is presented for microseismic data analysis, based upon minimally controlled recursive averaging and neighborhood shrinkage estimators. The method may not match more sophisticated and computationally expensive denoising algorithms in terms of preserving detailed features of the seismic signal. However, it is fast and data-driven, and can be applied in real-time processing of continuous data for event detection purposes. Results from applying this algorithm to synthetic and real seismic data show that it holds great promise for improving microseismic event detection.
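
A rough sketch of the two ingredients named in the abstract: a recursive average that tracks the noise level, and a shrinkage (soft-threshold) estimator applied to the trace. The minima-tracking variant, smoothing factor, and threshold below are illustrative assumptions, not the paper's exact estimators.

```python
import numpy as np

def smoothed_power(x, alpha=0.95):
    """Recursive (leaky) average of instantaneous power."""
    out = np.empty(len(x))
    p = float(np.mean(x[:100] ** 2))      # data-driven initialization
    for i, v in enumerate(x):
        p = alpha * p + (1 - alpha) * v ** 2
        out[i] = p
    return out

def minima_noise_estimate(power, win=200):
    """Follow the recent minimum of smoothed power, so the estimate
    tracks the noise floor rather than the signal itself."""
    return np.array([power[max(0, i - win):i + 1].min()
                     for i in range(len(power))])

def shrink(x, noise_power, k=3.0):
    """Soft-threshold samples toward zero below k times the noise std."""
    thresh = k * np.sqrt(noise_power)
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

rng = np.random.default_rng(1)
trace = 0.1 * rng.standard_normal(2000)
trace[1000:1100] += np.sin(2 * np.pi * np.arange(100) / 20)  # weak event
denoised = shrink(trace, minima_noise_estimate(smoothed_power(trace)))
```

Because the noise estimate follows the recent minimum of the smoothed power, it does not chase the event's own energy, so the event survives shrinkage while noise-only samples are pushed toward zero.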

  3. Data mining for signal detection of adverse event safety data.

    PubMed

    Chen, Hung-Chia; Tsong, Yi; Chen, James J

    2013-01-01

    The Adverse Event Reporting System (AERS) is the primary database designed to support the Food and Drug Administration (FDA) postmarketing safety surveillance program for all approved drugs and therapeutic biologic products. Most current disproportionality analysis focuses on the detection of potential adverse events (AE) involving a single drug and a single AE only. In this paper, we present a data mining biclustering technique based on the singular value decomposition to extract local regions of association for a safety study. The analysis consists of collection of biclusters, each representing an association between a set of drugs with the corresponding set of adverse events. Significance of each bicluster can be tested using disproportionality analysis. Individual drug-event combination can be further tested. A safety data set consisting of 193 drugs with 8453 adverse events is analyzed as an illustration. PMID:23331228
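
The SVD-based biclustering idea can be illustrated on a toy drug-by-adverse-event count matrix: the leading singular vectors concentrate on the rows and columns of a planted association block. The data are synthetic, and the loading threshold is an illustrative assumption rather than the paper's disproportionality test.

```python
import numpy as np

rng = np.random.default_rng(0)
reports = rng.poisson(1.0, size=(8, 10)).astype(float)  # background counts
reports[2:5, 6:9] += 15.0      # planted drug/adverse-event association block

# The leading singular vectors localize the dominant drug-AE block.
U, s, Vt = np.linalg.svd(reports, full_matrices=False)
drug_scores, ae_scores = U[:, 0], Vt[0]

# Resolve the SVD sign ambiguity, then keep rows/columns with large loadings.
flip = np.sign(drug_scores[np.argmax(np.abs(drug_scores))])
drugs = np.nonzero(flip * drug_scores > 0.3)[0].tolist()
aes = np.nonzero(flip * ae_scores > 0.3)[0].tolist()
```

In the actual workflow each recovered bicluster (a drug set paired with an AE set) would then be tested for significance with disproportionality analysis, followed by tests of individual drug-event combinations.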

  4. On Identifiability of Bias-Type Actuator-Sensor Faults in Multiple-Model-Based Fault Detection and Identification

    NASA Technical Reports Server (NTRS)

    Joshi, Suresh M.

    2012-01-01

    This paper explores a class of multiple-model-based fault detection and identification (FDI) methods for bias-type faults in actuators and sensors. These methods employ banks of Kalman-Bucy filters to detect the faults, determine the fault pattern, and estimate the fault values, wherein each Kalman-Bucy filter is tuned to a different failure pattern. Necessary and sufficient conditions are presented for identifiability of actuator faults, sensor faults, and simultaneous actuator and sensor faults. It is shown that FDI of simultaneous actuator and sensor faults is not possible using these methods when all sensors have biases.

  5. Implementation of a model based fault detection and diagnosis for actuation faults of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1992-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the space shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the space shuttle main engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  6. Method for early detection of cooling-loss events

    SciTech Connect

    Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.

    2015-12-22

A method for early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit; generating a warning in the event that the measured relative humidity is outside the defined limit; determining whether a change in the measured relative humidity exceeds the defined change threshold for the given space; and generating an alarm in the event that it does.
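
The claimed logic reduces to two checks per measurement: a limit check that raises a warning, and a rate-of-change check that raises an alarm. The limit and threshold values below are illustrative assumptions, not figures from the patent.

```python
def check_humidity(prev_rh, curr_rh, rh_limit=(35.0, 65.0), change_threshold=5.0):
    """Return (warning, alarm) flags for one relative-humidity reading."""
    low, high = rh_limit
    warning = not (low <= curr_rh <= high)              # outside defined limit
    alarm = abs(curr_rh - prev_rh) > change_threshold   # rapid change
    return warning, alarm

steady = check_humidity(50.0, 52.0)   # in range, slow drift
loss = check_humidity(52.0, 70.0)     # out of range and fast rise
```

The rate check is what makes detection "early": a cooling loss shows up as a rapid humidity swing before the absolute limit is breached for long.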

  7. Method for early detection of cooling-loss events

    SciTech Connect

    Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.

    2015-06-30

A method for early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit; generating a warning in the event that the measured relative humidity is outside the defined limit; determining whether a change in the measured relative humidity exceeds the defined change threshold for the given space; and generating an alarm in the event that it does.

  8. Development of the IDC Infrasound Event Detection Pipeline

    NASA Astrophysics Data System (ADS)

    Mialle, P.; Bittner, P.; Brown, D.; Given, J. W.

    2012-12-01

The first atmospheric event built only from infrasound arrivals was reported in the Reviewed Event Bulletin (REB) of the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in 2003. In the subsequent decade, 45 infrasound stations of the International Monitoring System (IMS) have been installed and are transmitting continuous data to the IDC. The growing amount of infrasound data and detections produced by the automatic system challenges station and network processing at the IDC and requires the Organization to improve its infrasound data processing. In 2010, the IDC began full-time operational automatic processing of infrasound data followed by interactive analysis. The detected and located events are systematically included in the analyst-reviewed Late Event Bulletin (LEB) and REB. Approximately 16% of SEL3 (Selected Event List 3, produced automatically 6 hours after real time) events with associated infrasound signals pass interactive analysis and make it into the IDC bulletins. 41% of the SEL3 events rejected after review have only two associated infrasound phases (and possibly other seismic and hydroacoustic detections). Therefore, the process whereby infrasound detections are associated with events needs further investigation. The objective of this study is to reduce the number of infrasound arrivals that are falsely associated during the creation of the SEL3. There are two parts to the study. First, detection accuracy at the infrasound arrays is improved by improving the infrasound signal detector, which is based on the PMCC (Progressive Multi-Channel Correlation) algorithm. The second part focuses on improving the reliability of the association algorithm. The association algorithm is enhanced to include better characterization of the variable atmospheric phenomena, which profoundly affect the detection patterns of the infrasound signals. 
The algorithm is then further tuned to reduce the

  9. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
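
The descriptor side of this approach can be sketched directly: a magnitude-weighted histogram of optical-flow orientations per frame, with frames flagged when their histogram deviates from a learned "normal" model. For self-containment a simple chi-square distance to the mean normal histogram stands in for the paper's one-class SVM and kernel PCA; the synthetic flow fields are illustrative assumptions.

```python
import numpy as np

def hofo(flow_dx, flow_dy, bins=8):
    """Histogram of optical-flow orientation, weighted by flow magnitude."""
    mag = np.hypot(flow_dx, flow_dy)
    ang = np.arctan2(flow_dy, flow_dx) % (2 * np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

def chi2(p, q):
    """Chi-square distance between two normalized histograms."""
    return 0.5 * np.sum((p - q) ** 2 / (p + q + 1e-9))

rng = np.random.default_rng(0)
# "Normal" frames: flow mostly rightward (e.g. a crowd walking one way).
normal = [hofo(1 + 0.1 * rng.standard_normal(500),
               0.1 * rng.standard_normal(500)) for _ in range(20)]
model = np.mean(normal, axis=0)
threshold = 1.5 * max(chi2(h, model) for h in normal)

# Abnormal frame: sudden scattering of motion in all directions.
abnormal_score = chi2(hofo(rng.standard_normal(500), rng.standard_normal(500)), model)
```

The one-class SVM in the paper plays the same role as the threshold here, but learns a nonlinear boundary around the normal histograms instead of a fixed distance cutoff.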

  10. Context-aware event detection smartphone application for first responders

    NASA Astrophysics Data System (ADS)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

The rise of social networking platforms like Twitter, Facebook, etc., has provided seamless sharing of information (as chat, video, and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding events happening in their immediate vicinity in real time. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). Further, when appropriately employed, this real-time data can support detection of localized events like fires, accidents, shootings, etc., as they unfold, and can pinpoint individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at AFRL Tec^Edge Discovery labs has demonstrated the feasibility of developing smartphone applications that can provide an augmented-reality view of the detected events in a given geographical location (localized) and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework which deals with data challenges such as identifying and aggregating important events, analyzing and correlating the events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.

  11. A model-based approach for detection of objects in low resolution passive-millimeter wave images

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1993-01-01

    We describe a model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere; but, their spatial resolution is very low. However, additional data such as airport model and approximate position and orientation of aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  12. Summary of gas release events detected by hydrogen monitoring

    SciTech Connect

    MCCAIN, D.J.

    1999-05-18

This paper summarizes the results of monitoring tank headspace for flammable gas release events. In over 40 tank-years of monitoring, the largest detected release in a single-shell tank is 2.4 cubic meters of hydrogen. In the double-shell tanks the largest release is 19.3 cubic meters, except in tank SY-101 prior to mixer pump installation.

  13. Context and quality estimation in video for enhanced event detection

    NASA Astrophysics Data System (ADS)

    Irvine, John M.; Wood, Richard J.

    2015-05-01

    Numerous practical applications for automated event recognition in video rely on analysis of the objects and their associated motion, i.e., the kinematics of the scene. The ability to recognize events in practice depends on accurate tracking objects of interest in the video data and accurate recognition of changes relative to the background. Numerous factors can degrade the performance of automated algorithms. Our object detection and tracking algorithms estimate the object position and attributes within the context of a dynamic assessment of video quality, to provide more reliable event recognition under challenging conditions. We present an approach to robustly modeling the image quality which informs tuning parameters to use for a given video stream. The video quality model rests on a suite of image metrics computed in real-time from the video. We will describe the formulation of the image quality model. Results from a recent experiment will quantify the empirical performance for recognition of events of interest.

  14. Adaptive Model-Based Mine Detection/Localization using Noisy Laser Doppler Vibration Measurements

    SciTech Connect

    Sullivan, E J; Xiang, N; Candy, J V

    2009-04-06

    The acoustic detection of buried mines is hampered by the fact that at the frequencies required for obtaining useful penetration, the energy is quickly absorbed by the ground. A recent approach which avoids this problem, is to excite the ground with a high-level low frequency sound, which excites low frequency resonances in the mine. These resonances cause a low-level vibration on the surface which can be detected by a Laser Doppler Vibrometer. This paper presents a method of quickly and efficiently detecting these vibrations by sensing a change in the statistics of the signal when the mine is present. Results based on real data are shown.
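
Detecting "a change in the statistics of the signal" can be illustrated with a windowed variance-ratio test against a noise-only reference; the window length, threshold, and synthetic resonance are illustrative assumptions, not the paper's statistic.

```python
import numpy as np

def variance_change_detector(x, noise_var, win=200, threshold=2.0):
    """Start indices of windows whose variance exceeds threshold * noise_var."""
    return [s for s in range(0, len(x) - win + 1, win)
            if x[s:s + win].var() > threshold * noise_var]

rng = np.random.default_rng(0)
noise_var = 1.0
ldv = rng.normal(0.0, 1.0, 4000)              # noisy LDV surface-velocity trace
resonance = 2.0 * np.sin(2 * np.pi * np.arange(1000) / 125)
ldv[2000:3000] += resonance                   # mine resonance raises variance
hits = variance_change_detector(ldv, noise_var)
```

A low-frequency mine resonance adds power on top of the measurement noise, so the sample variance jumps in exactly the windows covering the resonance, which is the statistical change the detector keys on.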

  15. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety of urban areas, public parks, airplanes, hospitals, schools, and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture, and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is capable of detecting small smoking events and uncertain actions with various cigarette sizes, colors, and shapes.

  16. Automatic event detection based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Doubravová, Jana; Wiszniowski, Jan; Horálek, Josef

    2015-04-01

    The proposed algorithm was developed for Webnet, a local seismic network built to monitor the West Bohemia/Vogtland swarm area. During earthquake swarms there is a large number of events that must be evaluated automatically to obtain a quick estimate of the current earthquake activity. Our focus is on obtaining good automatic results prior to precise manual processing; automatic processing may also lower the magnitude of completeness. The first step of automatic seismic data processing is the detection of events. Good detection performance requires a low number of false detections as well as a high number of correctly detected events. We used a single-layer recurrent neural network (SLRNN) trained on manual detections from West Bohemian swarms of past years. As inputs to the SLRNN we use STA/LTA ratios from a half-octave filter bank fed by the vertical and horizontal components of the seismograms. All stations were trained together to obtain a single network with shared neuron weights. We tried several architectures - different numbers of neurons - and different starting points for training. Networks giving the best results on the training set need not be optimal for unknown waveforms, so we test each network on a test set from a different swarm (but with similar characteristics, i.e. location, focal mechanisms, and magnitude range). We also apply coincidence verification: the number of false detections is lowered by rejecting events detected at a single station only, and an event is declared for the network when two or more stations detect it in coincidence. In further work we would like to retrain the network for each station individually, so that each station has its own set of coefficients (neural weights), and to apply the method to data from the Reykjanet network on the Reykjanes peninsula, Iceland.
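
    The STA/LTA characteristic function used as the SLRNN input can be sketched for a single channel; the window lengths and trigger threshold below are illustrative, not Webnet's settings.

```python
import random

def sta_lta(samples, sta_len=5, lta_len=50):
    """Short-term over long-term average of absolute amplitude.

    Returns one ratio per sample (0 where the LTA window is not yet full);
    this is the standard characteristic function for seismic triggers.
    """
    ratios = [0.0] * len(samples)
    for i in range(lta_len, len(samples)):
        sta = sum(abs(x) for x in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in samples[i - lta_len:i]) / lta_len
        ratios[i] = sta / lta if lta > 0 else 0.0
    return ratios

def trigger(ratios, on=4.0):
    """Indices where the STA/LTA ratio first crosses the 'on' threshold."""
    return [i for i in range(1, len(ratios))
            if ratios[i] >= on and ratios[i - 1] < on]

# Quiet noise followed by a sudden arrival.
random.seed(1)
trace = [random.gauss(0, 0.1) for _ in range(200)]
for i in range(120, 140):
    trace[i] += 5.0
onsets = trigger(sta_lta(trace))
print(onsets)
```

    A detector front end like this, computed per filter band and per component, is the kind of feature a recurrent network can then combine.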

  17. Detection of dominant flow and abnormal events in surveillance video

    NASA Astrophysics Data System (ADS)

    Kwak, Sooyeong; Byun, Hyeran

    2011-02-01

    We propose an algorithm for abnormal event detection in surveillance video. The algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not require detecting each moving object individually. It identifies the dominant flow in crowded environments without individual object tracking, using a latent Dirichlet allocation model, and it can automatically detect and localize abnormally moving objects in real-life video. Performance tests on several real-life databases show that the proposed algorithm can efficiently detect abnormally moving objects in real time, and it can be applied to any situation in which abnormal motion directions or speeds must be detected.

  18. Discriminative boundary detection for model-based heart segmentation in CT images

    NASA Astrophysics Data System (ADS)

    Peters, Jochen; Ecabert, Olivier; Schramm, Hauke; Weese, Jürgen

    2007-03-01

    Segmentation of organs in medical images can be successfully performed with deformable models. Most approaches combine a boundary detection step with some smoothness or shape constraint. An objective function for the model deformation is thus established from two terms: the first one attracts the surface model to the detected boundaries while the second one keeps the surface smooth or close to expected shapes. In this work, we assign locally varying boundary detection functions to all parts of the surface model. These functions combine an edge detector with local image analysis in order to accept or reject possible edge candidates. The goal is to optimize the discrimination between the wanted and misleading boundaries. We present a method to automatically learn from a representative set of 3D training images which features are optimal at each position of the surface model. The basic idea is to simulate the boundary detection for the given 3D images and to select those features that minimize the distance between the detected position and the desired object boundary. The approach is experimentally evaluated for the complex task of full-heart segmentation in CT images. A cyclic cross-evaluation on 25 cardiac CT images shows that the optimized feature training and selection enables robust, fully automatic heart segmentation with a mean error well below 1 mm. Comparing this approach to simpler training schemes that use the same basic formalism to accept or reject edges shows the importance of the discriminative optimization.

  19. ARX model-based gearbox fault detection and localization under varying load conditions

    NASA Astrophysics Data System (ADS)

    Yang, Ming; Makis, Viliam

    2010-11-01

    The development of fault detection schemes for gearbox systems has received considerable attention in recent years. Both time series modeling and feature extraction based on wavelet methods have been considered, mostly under constant load. The constant load assumption implies that changes in vibration data are caused only by deterioration of the gearbox. However, most real gearbox systems operate under varying load and speed, which affect the vibration signature of the system and in general make it difficult to recognize the occurrence of an impending fault. This paper presents a novel approach to detect and localize gear failure occurrence for a gearbox operating under varying load conditions. First, a residual signal is calculated using an autoregressive model with exogenous variables (ARX) fitted to the time-synchronously averaged (TSA) vibration data and filtered TSA envelopes when the gearbox operated under various load conditions in the healthy state. The gear of interest is divided into several sections so that each section includes the same number of adjacent teeth. Then, the fault detection and localization indicator is calculated by applying an F-test to the residual signal of the ARX model. The proposed fault detection scheme indicates not only when the gear fault occurs, but also in which section of the gear. Finally, the performance of the fault detection scheme is checked using full lifetime vibration data obtained from the gearbox operating from a new condition to a breakdown under varying load.
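
    The detection stage can be sketched with a plain AR model standing in for the full ARX fit, and a variance-ratio (F-like) test per gear section; the coefficients, section count, and threshold below are hypothetical.

```python
import random
import statistics

def ar_residuals(signal, coeffs):
    """One-step-ahead AR prediction residuals. `coeffs` are coefficients
    fitted on healthy-state data (hypothetical values here; the paper fits
    a full ARX model to time-synchronously averaged vibration signals)."""
    p = len(coeffs)
    return [signal[i] - sum(c * signal[i - k - 1] for k, c in enumerate(coeffs))
            for i in range(p, len(signal))]

def faulty_sections(residuals, n_sections, healthy_var, f_threshold=4.0):
    """Variance-ratio (F-like) test of each gear section's residuals
    against the healthy-state residual variance."""
    sec_len = len(residuals) // n_sections
    flagged = []
    for s in range(n_sections):
        seg = residuals[s * sec_len:(s + 1) * sec_len]
        if statistics.pvariance(seg) / healthy_var > f_threshold:
            flagged.append(s)
    return flagged

# Toy example: AR(1)-like healthy signal with a local fault burst in section 2.
random.seed(2)
coeffs = [0.5]                       # hypothetical healthy-state AR coefficient
x = [0.0]
for _ in range(399):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
for i in range(200, 300):
    x[i] += random.gauss(0, 4)       # local fault signature
res = ar_residuals(x, coeffs)
healthy_var = statistics.pvariance(res[:150])
print(faulty_sections(res, 4, healthy_var))
```

    Only the section containing the fault burst shows a residual variance far above the healthy baseline.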

  20. Detecting rare gene transfer events in bacterial populations.

    PubMed

    Nielsen, Kaare M; Bøhn, Thomas; Townsend, Jeffrey P

    2014-01-01

    Horizontal gene transfer (HGT) enables bacteria to access, share, and recombine genetic variation, resulting in genetic diversity that cannot be obtained through mutational processes alone. In most cases, the observation of evolutionarily successful HGT events relies on the outcome of initially rare events that lead to novel functions in the new host, and that exhibit a positive effect on host fitness. Conversely, the large majority of HGT events occurring in bacterial populations will go undetected due to lack of replication success of transformants. Moreover, other HGT events that would be highly beneficial to new hosts can fail to ensue due to lack of physical proximity to the donor organism, lack of a suitable gene transfer mechanism or of genetic compatibility, and stochasticity in spatiotemporal occurrence. Experimental attempts to detect HGT events in bacterial populations have typically focused on the transformed cells or their immediate offspring. However, rare HGT events occurring in large and structured populations are unlikely to reach relative population sizes that will allow their immediate identification; the exception being the unusually strong positive selection conferred by antibiotics. Most HGT events are not expected to alter the likelihood of host survival to such an extreme extent, and will confer only minor changes in host fitness. Due to the large population sizes of bacteria and the time scales involved, the process and outcome of HGT are often not amenable to experimental investigation. Population genetic modeling of the growth dynamics of bacteria with differing HGT rates and resulting fitness changes is therefore necessary to guide sampling design and predict realistic time frames for detection of HGT, as it occurs in laboratory or natural settings. Here we review the key population genetic parameters, consider their complexity and highlight knowledge gaps for further research. PMID:24432015
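
    The time scales involved can be illustrated with a minimal deterministic haploid selection model — a sketch of the dynamics the authors discuss, not their full population-genetic framework; the initial frequency, selection coefficients, and detection threshold are hypothetical.

```python
def generations_to_detection(p0, s, p_detect):
    """Generations until a transferred variant with selection coefficient s
    rises from initial frequency p0 to a detectable frequency p_detect,
    under deterministic haploid selection: p' = p(1+s) / (1 + p*s)."""
    p, gens = p0, 0
    while p < p_detect:
        p = p * (1 + s) / (1 + p * s)
        gens += 1
        if gens > 10**6:
            raise RuntimeError("variant effectively undetectable")
    return gens

# A rare transformant (1 in 10^9) with a 5% fitness advantage, versus
# strong antibiotic-like selection (s = 2); detection at 0.1% frequency.
print(generations_to_detection(1e-9, 0.05, 1e-3))
print(generations_to_detection(1e-9, 2.0, 1e-3))
```

    A mildly beneficial transfer needs hundreds of generations to become detectable, while antibiotic-strength selection needs only about a dozen — consistent with the review's point that only unusually strong selection makes rare HGT events immediately observable.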

  1. Detecting rare gene transfer events in bacterial populations

    PubMed Central

    Nielsen, Kaare M.; Bøhn, Thomas; Townsend, Jeffrey P.

    2014-01-01

    Horizontal gene transfer (HGT) enables bacteria to access, share, and recombine genetic variation, resulting in genetic diversity that cannot be obtained through mutational processes alone. In most cases, the observation of evolutionary successful HGT events relies on the outcome of initially rare events that lead to novel functions in the new host, and that exhibit a positive effect on host fitness. Conversely, the large majority of HGT events occurring in bacterial populations will go undetected due to lack of replication success of transformants. Moreover, other HGT events that would be highly beneficial to new hosts can fail to ensue due to lack of physical proximity to the donor organism, lack of a suitable gene transfer mechanism, genetic compatibility, and stochasticity in tempo-spatial occurrence. Experimental attempts to detect HGT events in bacterial populations have typically focused on the transformed cells or their immediate offspring. However, rare HGT events occurring in large and structured populations are unlikely to reach relative population sizes that will allow their immediate identification; the exception being the unusually strong positive selection conferred by antibiotics. Most HGT events are not expected to alter the likelihood of host survival to such an extreme extent, and will confer only minor changes in host fitness. Due to the large population sizes of bacteria and the time scales involved, the process and outcome of HGT are often not amenable to experimental investigation. Population genetic modeling of the growth dynamics of bacteria with differing HGT rates and resulting fitness changes is therefore necessary to guide sampling design and predict realistic time frames for detection of HGT, as it occurs in laboratory or natural settings. Here we review the key population genetic parameters, consider their complexity and highlight knowledge gaps for further research. PMID:24432015

  2. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event.

    PubMed

    Donner, Simon D; Knutson, Thomas R; Oppenheimer, Michael

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, we use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 degrees C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO(2) levels may still represent an additional long-term threat to corals. PMID:17360373
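
    The thermal stress metric behind such assessments can be illustrated with a degree-heating-weeks calculation in the style of NOAA Coral Reef Watch (an assumption for illustration; the paper derives thermal stress from simulated sea surface temperatures, not from this exact formula).

```python
def degree_heating_weeks(weekly_sst, mmm):
    """NOAA-style degree heating weeks over a trailing 12-week window:
    accumulate weekly exceedances of at least 1 degree C above the maximum
    monthly mean (MMM) climatology. DHW >= 4 is commonly associated with
    significant bleaching, DHW >= 8 with severe bleaching."""
    dhw = []
    for i in range(len(weekly_sst)):
        window = weekly_sst[max(0, i - 11):i + 1]
        dhw.append(sum(t - mmm for t in window if t - mmm >= 1.0))
    return dhw

# An 8-week warm anomaly 1.5 degrees above a 29.0 degree C climatology.
mmm = 29.0
sst = [29.2] * 6 + [30.5] * 8 + [29.3] * 6
series = degree_heating_weeks(sst, mmm)
print(max(series))
```

    The 0.2-degree weeks contribute nothing (below the 1-degree hotspot cutoff), while the sustained anomaly accumulates to a severe-bleaching-level stress of 12 degree-weeks.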

  3. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event

    SciTech Connect

    Donner, S.D.; Knutson, T.R.; Oppenheimer, M.

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, the authors use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 °C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term 'committed warming' even after stabilization of atmospheric CO2 levels may still represent an additional long-term threat to corals.

  4. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogino, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern. Smart sensor networks are one of the promising technologies attracting considerable attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared on their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to determine whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is advantageous compared to simpler methods.

  5. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    PubMed

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers. PMID:25302317
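
    The matching idea can be sketched with per-customer attribute sets standing in for mined frequent itemsets; the customer names, attributes, and labels below are hypothetical, and every attribute is weighted equally, as in the paper.

```python
def build_patterns(transactions):
    """Per-customer 'pattern' as the set of attribute-value pairs seen in
    legal and in fraudulent training transactions (a simplification of the
    paper's frequent-itemset mining)."""
    patterns = {}
    for cust, attrs, label in transactions:
        legal, fraud = patterns.setdefault(cust, (set(), set()))
        (legal if label == "legal" else fraud).update(attrs.items())
    return patterns

def classify(patterns, cust, attrs):
    """Label an incoming transaction by which pattern it overlaps more."""
    legal, fraud = patterns.get(cust, (set(), set()))
    items = set(attrs.items())
    return "legal" if len(items & legal) >= len(items & fraud) else "fraud"

history = [
    ("alice", {"amount": "low", "country": "US", "hour": "day"}, "legal"),
    ("alice", {"amount": "low", "country": "US", "hour": "night"}, "legal"),
    ("alice", {"amount": "high", "country": "RO", "hour": "night"}, "fraud"),
]
p = build_patterns(history)
print(classify(p, "alice", {"amount": "low", "country": "US", "hour": "day"}))
print(classify(p, "alice", {"amount": "high", "country": "RO", "hour": "day"}))
```

    A transaction matching the customer's usual behavior is accepted; one overlapping the fraud pattern more strongly is flagged.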

  6. Model-Based Design of Tree WSNs for Decentralized Detection

    PubMed Central

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989
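
    The classical fusion-center rule underlying such designs is the Chair-Varshney log-likelihood-ratio weighting of local binary decisions; a minimal sketch, assuming equal priors and hypothetical per-sensor error rates:

```python
import math

def fuse(decisions, p_false, p_miss):
    """Chair-Varshney-style fusion of binary local decisions: each vote is
    weighted by the log-likelihood ratio implied by that sensor's
    false-alarm and miss probabilities; declare the event if the sum is
    positive (equal priors assumed)."""
    score = 0.0
    for d, pf, pm in zip(decisions, p_false, p_miss):
        if d:
            score += math.log((1 - pm) / pf)
        else:
            score += math.log(pm / (1 - pf))
    return score > 0

# Two reliable sensors outvote three poor ones.
decisions = [1, 1, 0, 0, 0]
p_false = [0.01, 0.01, 0.4, 0.4, 0.4]
p_miss = [0.01, 0.01, 0.4, 0.4, 0.4]
print(fuse(decisions, p_false, p_miss))
```

    The weighting makes the fusion center trust high-quality sensors more, which is exactly why sensor quality measures enter the network design problem.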

  7. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    PubMed Central

    Seeja, K. R.; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced) and it is found that the proposed model has very high fraud detection rate, balanced classification rate, Matthews correlation coefficient, and very less false alarm rate than other state-of-the-art classifiers. PMID:25302317

  8. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources, such as meteoroids and industrial and mining activity, generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events, and helps characterize seismic events by distinguishing man-made from natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into the ground and into the air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations, as well as some conclusions on the benefit of a combined seismoacoustic analysis, are presented within this study.
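
    The combined interpretation can be illustrated by locating a surface source from the delay between the seismic and acoustic arrivals at one station. The velocities below are textbook values (crustal P wave, tropospheric sound speed), not a real propagation model.

```python
def source_distance_km(dt_seconds, v_seismic=6.0, v_acoustic=0.34):
    """Distance to a surface source from the delay between the fast seismic
    arrival and the later infrasound arrival at a collocated station:
    d = dt / (1/v_acoustic - 1/v_seismic)."""
    return dt_seconds / (1.0 / v_acoustic - 1.0 / v_seismic)

# An infrasound signal arriving 120 s after the seismic phase:
print(round(source_distance_km(120.0), 1))
```

    With the seismic arrival nearly instantaneous at these ranges, the delay is dominated by the slow acoustic path, so a two-minute lag places the source roughly 40 km away.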

  9. Automatic Detection of Student Mental Models Based on Natural Language Student Input during Metacognitive Skill Training

    ERIC Educational Resources Information Center

    Lintean, Mihai; Rus, Vasile; Azevedo, Roger

    2012-01-01

    This article describes the problem of detecting the student mental models, i.e. students' knowledge states, during the self-regulatory activity of prior knowledge activation in MetaTutor, an intelligent tutoring system that teaches students self-regulation skills while learning complex science topics. The article presents several approaches to…

  10. PMU Data Event Detection: A User Guide for Power Engineers

    SciTech Connect

    Allen, A.; Singh, M.; Muljadi, E.; Santoso, S.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
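
    A minimal stand-in for such event screening is a deviation-from-trailing-mean test on a PMU frequency channel; the window length, threshold, and data below are hypothetical and do not reproduce the report's algorithms.

```python
def detect_events(freq, window=10, threshold=0.055):
    """Flag samples where frequency departs from the trailing-window mean
    by more than `threshold` Hz - a simple change detector for scanning
    long PMU records for disturbances."""
    events = []
    for i in range(window, len(freq)):
        mean = sum(freq[i - window:i]) / window
        if abs(freq[i] - mean) > threshold:
            events.append(i)
    return events

# A 60 Hz signal with a generator-trip-like frequency dip at sample 30.
freq = [60.0] * 50
for i in range(30, 35):
    freq[i] = 59.9
print(detect_events(freq))
```

    The detector isolates the disturbance samples, which is the point of the exercise: reducing vast PMU archives to a short list of events worth inspecting.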

  11. Gait Event Detection during Stair Walking Using a Rate Gyroscope

    PubMed Central

    Formento, Paola Catalfamo; Acevedo, Ruben; Ghoussayni, Salim; Ewins, David

    2014-01-01

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. These applications often require detection of the initial contact (IC) of the foot with the floor and/or final contact or foot off (FO) from the floor during outdoor walking. Previous investigations have reported the use of a single gyroscope placed on the shank for detection of IC and FO on level ground and incline walking. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects ascending and descending a set of stairs. Performance was compared with a reference pressure measurement system. The absolute mean difference between the gyroscope and the reference was less than 45 ms for IC and less than 135 ms for FO for both activities. Detection success was over 93%. These results provide preliminary evidence supporting the use of a gyroscope for gait event detection when walking up and down stairs. PMID:24651724
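
    The detection idea can be sketched on a synthetic shank angular-velocity stride: find the mid-swing peak, then take the adjacent minima as the FO and IC estimates. This is a common simplification of shank-gyroscope event detection, not the paper's exact rules, and all thresholds and the synthetic stride are illustrative.

```python
import math

def gait_events(gyro, swing_threshold=2.0):
    """Locate FO and IC from shank sagittal angular velocity (rad/s).
    Mid-swing shows a large positive peak; the local minima just before
    and after that peak are used as FO and IC estimates."""
    events = []
    i = 1
    while i < len(gyro) - 1:
        # a swing peak: local maximum above the threshold
        if gyro[i] > swing_threshold and gyro[i - 1] <= gyro[i] >= gyro[i + 1]:
            fo = min(range(max(0, i - 15), i), key=gyro.__getitem__)
            ic = min(range(i, min(len(gyro), i + 15)), key=gyro.__getitem__)
            events.append((fo, i, ic))
            i += 15                      # skip past this stride
        i += 1
    return events

# One synthetic stride: FO dip, smooth swing burst, IC dip.
gyro = [0.0] * 60
for i in range(20, 41):
    gyro[i] = 4.0 * math.sin((i - 20) * math.pi / 20)
gyro[18] = -1.5   # foot-off dip
gyro[42] = -2.0   # initial-contact dip
print(gait_events(gyro))
```

    The returned triple gives the FO sample, the swing-peak sample, and the IC sample for the stride.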

  12. Model Based Determination of Detection Limits for Proton Transfer Reaction Mass Spectrometer

    NASA Astrophysics Data System (ADS)

    Amann, Anton; Schwarz, Konrad; Wimmer, Gejza; Witkovský, Viktor

    2010-01-01

    Proton Transfer Reaction Mass Spectrometry (PTR-MS) is a chemical ionization mass spectrometric technique that allows trace gases to be measured, for example in exhaled human breath. Quantification of compounds at low concentrations is desirable for medical diagnostics. Measurement accuracy can typically be increased by extending the duration of the measurement; for real-time measurements, however, the time windows are relatively short in order to achieve good time resolution (e.g. breath-to-breath resolution during exercise on a stationary bicycle). Determination of statistical detection limits is typically based on calibration measurements, but this approach is limited, especially at very low concentrations. To overcome this problem, a calculation of the limit of quantification (LOQ) and the limit of detection (LOD), respectively, based on a theoretical model of the measurement process is outlined.
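
    For the calibration-based baseline the paper moves beyond, the standard rules of thumb are LOD = 3.3 σ/S and LOQ = 10 σ/S (ICH-style conventions, not the paper's model-based derivation); the background noise and sensitivity values below are hypothetical.

```python
def lod_loq(blank_sd, slope):
    """Calibration-based detection limits: LOD = 3.3 * sigma / S and
    LOQ = 10 * sigma / S, with sigma the blank/background standard
    deviation and S the calibration slope (sensitivity)."""
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

# E.g. a background count-rate sd of 1.2 ncps and sensitivity of
# 6 ncps per ppb would give limits in ppb:
lod, loq = lod_loq(1.2, 6.0)
print(round(lod, 3), round(loq, 3))
```

    The model-based approach in the paper replaces the empirical blank standard deviation with quantities derived from a statistical model of the counting process, which matters precisely when calibration at very low concentrations is impractical.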

  13. Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully-automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses regional array processing with fixed, carefully calibrated, site-specific parameters as a basis, in conjunction with improved automatic phase onset time estimation. We have in parallel developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered using the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  14. Real-time detection of traffic events using smart cameras

    NASA Astrophysics Data System (ADS)

    Macesic, M.; Jelaca, V.; Niño-Castaneda, J. O.; Prodanovic, N.; Panic, M.; Pizurica, A.; Crnojevic, V.; Philips, W.

    2012-01-01

    With the rapid increase in the number of vehicles on roads, it is necessary to maintain close monitoring of traffic. For this purpose many surveillance cameras are placed along roads and at crossroads, creating a huge communication load between the cameras and the monitoring center. The data therefore needs to be processed on site and transferred to the monitoring centers in the form of metadata or as a set of selected images. This requires detecting events of interest already on the camera side, which implies using smart cameras as visual sensors. In this paper we propose a method for tracking vehicles and analyzing vehicle trajectories to detect different traffic events. Kalman filtering is used for tracking, combining foreground and optical flow measurements. The obtained vehicle trajectories are used to detect different traffic events. Every new trajectory is compared with a collection of normal routes and clustered accordingly. If the observed trajectory differs from all normal routes by more than a predefined threshold, it is marked as abnormal and an alarm is raised. The system was developed and tested on the Texas Instruments OMAP platform. Testing was done at four different locations: two in the city and two on the open road.
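
    The trajectory-versus-normal-routes comparison can be sketched with a mean point-to-point distance and a threshold; the Kalman tracking and clustering stages are omitted, and the routes, tracks, and threshold are illustrative.

```python
def trajectory_distance(a, b):
    """Mean point-to-point distance between two equally sampled trajectories."""
    return sum(((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
               for (xa, ya), (xb, yb) in zip(a, b)) / len(a)

def classify_trajectory(track, normal_routes, threshold=2.0):
    """Compare a new vehicle track with the stored normal routes; raise an
    alarm when it differs from all of them by more than the threshold."""
    d = min(trajectory_distance(track, r) for r in normal_routes)
    return "normal" if d <= threshold else "abnormal"

# One straight normal route; a track close to it, and a wrong-way-style
# track far from it.
route = [(i, 0.0) for i in range(10)]
ok_track = [(i, 0.3) for i in range(10)]
bad_track = [(9 - i, 5.0) for i in range(10)]
print(classify_trajectory(ok_track, [route]))
print(classify_trajectory(bad_track, [route]))
```

    A track following the learned route passes; one deviating in position (or, with time-stamped samples, in direction or speed) triggers the alarm.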

  15. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second cause of cancer-related death in the world, and it remains difficult to cure because it has been in late-stage once that is found. Early gastric cancer detection becomes an effective approach to decrease the gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, the gastric cancer commonly originates from the gastric mucosa and grows outwards. The bioluminescent light will pass through a non-scattering region constructed by gastric pouch when it transports in tissues. Thus, the current BLT reconstruction algorithms based on the approximation model of radiative transfer equation are not optimal to handle this problem. To address the gastric cancer specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe the bioluminescent light propagation in tissues. The radiosity theory integrated with the diffusion equation to form the hybrid light transport model is utilized to describe light propagation in the non-scattering region. After the finite element discretization, the hybrid light transport model is converted into a minimization problem which fuses an l1 norm based regularization term to reveal the sparsity of bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital mouse based simulation with the reconstruction error less than 1mm. An in situ gastric cancer-bearing nude mouse based experiment is then conducted. The primary result reveals the ability of the novel BLT reconstruction algorithm in early gastric cancer detection.

  16. Gaussian mixture model based approach to anomaly detection in multi/hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, N.; Diani, M.; Corsini, G.

    2005-10-01

    Anomaly detectors reveal the presence of objects/materials in a multi/hyperspectral image simply by searching for those pixels whose spectrum differs from the background (anomalies). This procedure can be applied directly to the radiance at the sensor level and has the great advantage of avoiding the difficult step of atmospheric correction. The most popular anomaly detector is the RX algorithm derived by Yu and Reed. It is based on the assumption that the pixels in a region around the one under test follow a single multivariate Gaussian distribution. Unfortunately, such a hypothesis is generally not met in actual scenarios, and a large number of false alarms is usually experienced when the RX algorithm is applied in practice. In this paper, a more general approach to anomaly detection is considered, based on the assumption that the background contains different terrain types (clusters), each of which is Gaussian distributed. In this approach the parameters of each cluster are estimated and used in the detection process. Two detectors are considered: the SEM-RX and the K-means RX. Both algorithms follow two steps: first, 1) the parameters of the background clusters are estimated; then, 2) a detection rule based on the RX test is applied. The SEM-RX stems from the GMM and employs the SEM algorithm to estimate the clusters' parameters; the K-means RX instead resorts to the well-known K-means algorithm to obtain the background clusters. An automatic procedure is defined, for both detectors, to select the number of clusters, and a novel criterion is proposed to set the test threshold. The performance of the two detectors is also evaluated on an experimental data set and compared to that of the RX algorithm. The comparative analysis is carried out in terms of experimental Receiver Operating Characteristics.
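
    The baseline RX test that both detectors apply per cluster can be sketched for two bands, with the 2x2 background covariance inverted in closed form; this is the single-cluster case, unlike the paper's GMM background, and the data are synthetic.

```python
import random

def rx_scores(pixels, background):
    """RX anomaly score: squared Mahalanobis distance of each two-band
    pixel from the background mean, using the background covariance
    (2x2 case inverted in closed form; real detectors use many bands)."""
    n = len(background)
    m0 = sum(p[0] for p in background) / n
    m1 = sum(p[1] for p in background) / n
    c00 = sum((p[0] - m0) ** 2 for p in background) / n
    c11 = sum((p[1] - m1) ** 2 for p in background) / n
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in background) / n
    det = c00 * c11 - c01 * c01
    i00, i01, i11 = c11 / det, -c01 / det, c00 / det
    scores = []
    for p in pixels:
        d0, d1 = p[0] - m0, p[1] - m1
        scores.append(d0 * (i00 * d0 + i01 * d1) + d1 * (i01 * d0 + i11 * d1))
    return scores

# A Gaussian background and one spectrally anomalous pixel.
random.seed(3)
background = [(random.gauss(10, 1), random.gauss(20, 1)) for _ in range(500)]
pixels = background[:5] + [(16.0, 14.0)]      # last pixel is the anomaly
s = rx_scores(pixels, background)
print([round(v, 1) for v in s])
```

    The clustered detectors in the paper run this same test against the statistics of the best-matching background cluster instead of one global mean and covariance.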

  17. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
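
    LDA itself is involved to implement; a much simpler unigram surprisal score illustrates how a language model over log tokens can flag anomalous lines without hand-written rules. This is a stand-in for illustration, not the paper's LDA method, and the log lines are invented.

```python
import math
from collections import Counter

def background_model(logs):
    """Unigram token frequencies from normal network-log lines (a much
    simpler stand-in for a latent Dirichlet allocation topic model)."""
    counts = Counter(tok for line in logs for tok in line.split())
    total = sum(counts.values())
    vocab = len(counts)
    # add-one smoothing so unseen tokens get nonzero probability
    return lambda tok: (counts[tok] + 1) / (total + vocab + 1)

def surprisal(model, line):
    """Average negative log-probability of a line's tokens; anomalous
    (exfiltration-like) lines score high."""
    toks = line.split()
    return -sum(math.log(model(t)) for t in toks) / len(toks)

normal = ["GET /index.html 200", "GET /style.css 200", "GET /index.html 200"]
model = background_model(normal)
print(surprisal(model, "GET /index.html 200"))
print(surprisal(model, "POST /upload.php 7734003914"))
```

    Lines made of familiar tokens score low; a line of tokens never seen in the background traffic scores high, giving a continuous risk score rather than a binary rule match.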

  18. Model-based estimation of cardiovascular repolarization features: ischaemia detection and PTCA monitoring.

    PubMed

    Laguna, P; García, J; Roncal, I; Wagner, G; Lander, P; Mark, R

    1998-01-01

    The ST-T segment of the surface ECG reflects cardiac repolarization, and is quite sensitive to a number of pathological conditions, particularly ischaemia. ST-T changes generally affect the entire waveshape, and are inadequately characterized by single features such as depression of the ST segment at one particular point. Metrics which represent overall waveshape should provide more sensitive indicators of ST-T wave abnormalities, particularly when they are subtle, intermittent or periodic. This study discusses a Karhunen-Loève transform (KLT) technique for the analysis of the ST-T waveform. The KL technique was used to analyse the ST-T complexes in the ESC ST-T database. KL coefficients were plotted as a function of time, and were effective in the detection of transient ischaemic episodes. Twenty per cent of the records showed bursts of periodic ischaemia, suggesting local vascular instability. A comparison between the KL and ST depression series has shown the KL technique to be more appropriate for the study of ST-T complex variations. Using the KL series, an ischaemia detector has been developed based on a resampled, filtered, and differentiated KL series. This technique demonstrates a sensitivity of 65% and a specificity of 54%. These low values can be due to shifts of the electrical axis which are detected as ischaemic changes, real ischaemic episodes that were not annotated with the protocol used in the European ST-T database, or erroneous detections. An increase in sensitivity can be obtained at the expense of a decrease in the positive predictive value, making this a useful technique for preliminary scanning of the ECG record and subsequent review by an expert. The technique has also been used to monitor patients during a PTCA procedure, demonstrating that it allows us to monitor PTCA-induced ischaemia.
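
    The KL-coefficient series described above can be sketched in plain numpy: the KL basis is the set of leading eigenvectors of the beat covariance matrix, and each beat's coefficients form a time series in which repolarization changes appear as shifts (toy ST-T complexes with a simulated ischaemic drift; not the authors' ESC ST-T processing):

```python
import numpy as np

rng = np.random.default_rng(1)
# toy "ST-T complexes": 200 beats x 64 samples, baseline waveform + noise
t = np.linspace(0, np.pi, 64)
beats = np.sin(t) + 0.05 * rng.normal(size=(200, 64))
beats[100:] += 0.3 * t / np.pi            # simulated ischaemic drift in later beats

# Karhunen-Loeve basis = eigenvectors of the beat covariance matrix
centered = beats - beats.mean(axis=0)
cov = centered.T @ centered / len(beats)
w, v = np.linalg.eigh(cov)
basis = v[:, ::-1][:, :4]                 # 4 leading KL functions

kl_series = centered @ basis              # KL coefficients per beat over time
# a change in the first coefficient's level flags the simulated episode
print(abs(kl_series[:100, 0].mean() - kl_series[100:, 0].mean()) > 0.1)
```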
A detailed analysis has shown that in some cases a repetitive oscillatory behaviour appears, lasting for a period of around 20 s, and highly related to the

  19. Multi-resolution model-based traffic sign detection and tracking

    NASA Astrophysics Data System (ADS)

    Marinas, Javier; Salgado, Luis; Camplani, Massimo

    2012-06-01

    In this paper we propose an innovative approach to tackle the problem of traffic sign detection using a computer vision algorithm, taking into account real-time operation constraints and establishing intelligent strategies to simplify the algorithm complexity as much as possible and to speed up the process. Firstly, a set of candidates is generated by a color segmentation stage, followed by a region analysis strategy, where spatial characteristics of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, performed using a Kalman filter for each potential candidate. Taking time constraints into consideration, efficiency is achieved in two ways: on the one hand, a multi-resolution strategy is adopted for segmentation, where global operations are applied only to low-resolution images, increasing the resolution to the maximum only when a potential road sign is being tracked. On the other hand, we take advantage of the expected spacing between traffic signs. Namely, the tracking of objects of interest allows us to generate inhibition areas, which are those where no new traffic signs are expected to appear due to the existence of a traffic sign in the neighborhood. The proposed solution has been tested with real sequences in both urban areas and highways, and proved to achieve higher computational efficiency, especially as a result of the multi-resolution approach.
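
    The per-candidate tracking stage can be illustrated with a standard constant-velocity Kalman filter over image coordinates (a generic sketch with invented noise settings, not the paper's exact state model):

```python
import numpy as np

# Constant-velocity Kalman filter for one tracked sign candidate: state (x, y, vx, vy)
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # only position is observed
Q = 0.01 * np.eye(4)                                # process noise
R = 1.0 * np.eye(2)                                 # measurement noise

x = np.zeros(4)
P = 100.0 * np.eye(4)

def kalman_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

for k in range(20):                                 # sign drifting right at 2 px/frame
    x, P = kalman_step(x, P, np.array([2.0 * k, 50.0]))

print(round(float(x[2]), 1))                        # estimated vx, approaches 2 px/frame
```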

  20. Hidden Markov Models for Detecting Aseismic Events in Southern California

    NASA Astrophysics Data System (ADS)

    Granat, R.

    2004-12-01

    We employ a hidden Markov model (HMM) to segment surface displacement time series collected by the Southern California Integrated GPS Network (SCIGN). These segmented time series are then used to detect regional events by observing the number of simultaneous mode changes across the network; if a large number of stations change mode at the same time, that indicates an event. The hidden Markov model approach assumes that the observed data have been generated by an unobservable dynamical statistical process. The process is of a particular form such that each observation is coincident with the system being in a particular discrete state, which is interpreted as a behavioral mode. The dynamics of the model are constructed so that the next state depends directly only on the current state -- it is a first order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. The result of this approach is that our segmentation decisions are based entirely on statistical changes in the behavior of the observed daily displacements. In general, finding the optimal model parameters to fit the data is a difficult problem. We present an innovative model fitting method that is unsupervised (i.e., it requires no labeled training data) and uses a regularized version of the expectation-maximization (EM) algorithm to ensure that model solutions are both robust with respect to initial conditions and of high quality. We demonstrate the reliability of the method as compared to standard model fitting methods and show that it results in lower noise in the mode change correlation signal used to detect regional events. We compare candidate events detected by this method to the seismic record and observe that most are not correlated with a significant seismic event. Our analysis
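
    The event-detection step, flagging days on which many stations change mode simultaneously, can be sketched directly, with hypothetical state labels standing in for the HMM segmentation output:

```python
import numpy as np

def coincident_mode_changes(states, min_stations=3):
    """states: (n_stations, n_days) array of per-station mode labels.
    Returns day indices on which at least `min_stations` stations change
    mode simultaneously, i.e. candidate regional events."""
    changes = states[:, 1:] != states[:, :-1]    # per-station mode switches
    n_changing = changes.sum(axis=0)             # stations changing on each day
    return np.where(n_changing >= min_stations)[0] + 1

# toy network: 5 stations, quiet except a network-wide switch on day 10
states = np.zeros((5, 20), dtype=int)
states[:, 10:] = 1
states[0, 5:] = 1                                # one station switches alone earlier
print(coincident_mode_changes(states))           # only day 10 qualifies
```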

  1. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.
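
    The probabilistic (Bayesian) inference step can be illustrated with a random-walk Metropolis sampler over a toy reflection model. The Gaussian-pulse forward model below is a placeholder for illustration only, not the paper's S-parameter model of a lossy coaxial cable:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 200)

def forward(fault_pos):
    """Toy forward model: a reflection pulse whose delay tracks the fault
    position (stand-in for the paper's S-parameter model)."""
    return np.exp(-0.5 * ((t - fault_pos) / 0.2) ** 2)

true_pos = 4.0
data = forward(true_pos) + 0.05 * rng.normal(size=t.size)

def log_post(pos):
    if not 0.0 < pos < 10.0:
        return -np.inf                       # uniform prior over the cable length
    return -0.5 * np.sum((data - forward(pos)) ** 2) / 0.05 ** 2

# random-walk Metropolis sampler for the fault-position posterior
pos, samples = 4.5, []
for _ in range(3000):
    prop = pos + 0.1 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(pos):
        pos = prop
    samples.append(pos)

print(round(float(np.median(samples[1000:])), 2))  # posterior median, near 4.0
```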

  2. [Establishment and Improvement of Portable X-Ray Fluorescence Spectrometer Detection Model Based on Wavelet Transform].

    PubMed

    Li, Fang; Wang, Ji-hua; Lu, An-xiang; Han, Ping

    2015-04-01

    The concentration of Cr, Cu, Zn, As and Pb in soil was tested by portable X-ray fluorescence spectrometer. Each sample was tested 3 times; then, after using a wavelet threshold noise filtering method to denoise and smooth the spectra, a standard curve for each heavy metal was established according to the standard values of heavy metals in soil and the corresponding counts, which were the average of the 3 processed spectra. The signal to noise ratio (SNR), mean square error (MSE) and information entropy (H) were used to assess the denoising effect when selecting the best wavelet basis and wavelet decomposition level for the wavelet threshold noise filtering method. Some samples with different concentrations and H3BO3 (blank) were chosen and retested on the instrument to verify its stability. The results show that the best denoising result was obtained with the coif3 wavelet basis at decomposition level 3. The determination coefficient (R2) of the instrument ranges from 0.990 to 0.996, indicating a high degree of linearity between the contents of heavy metals in soil and the corresponding X-ray fluorescence characteristic peak intensities within the instrument's measurement range (0-1,500 mg · kg(-1)). After retesting and calculation, the results indicate that all the detection limits of the instrument are below the national soil standards. The accuracy of the model has been effectively improved, and the instrument also shows good precision with the practical application of wavelet transform to the establishment and improvement of the X-ray fluorescence spectrometer detection model. Thus the instrument can be applied in on-site rapid screening of heavy metals in contaminated soil. PMID:26197612
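
    Wavelet threshold denoising of a spectrum can be sketched with a hand-rolled Haar transform (the paper's coif3 basis at level 3 would require a wavelet library such as PyWavelets; Haar keeps the sketch self-contained):

```python
import numpy as np

def haar_denoise(signal, level=3, thresh=0.1):
    """One-dimensional Haar wavelet soft-thresholding (a stand-in for the
    paper's coif3 basis). Signal length must be divisible by 2**level."""
    coeffs, approx = [], signal.astype(float)
    for _ in range(level):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        coeffs.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0))  # soft threshold
        approx = a
    for d in reversed(coeffs):                       # inverse transform
        out = np.empty(2 * approx.size)
        out[0::2] = (approx + d) / np.sqrt(2)
        out[1::2] = (approx - d) / np.sqrt(2)
        approx = out
    return approx

rng = np.random.default_rng(3)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.2 * rng.normal(size=256)
den = haar_denoise(noisy, level=3, thresh=0.3)
mse_before = np.mean((noisy - clean) ** 2)
mse_after = np.mean((den - clean) ** 2)
print(mse_after < mse_before)                        # denoising reduces the MSE
```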

  3. Application of Kalman Filtering Techniques for Microseismic Event Detection

    NASA Astrophysics Data System (ADS)

    Baziw, E.; Weir-Jones, I.

    - Microseismic monitoring systems are generally installed in areas of induced seismicity caused by human activity. Induced seismicity results from changes in the state of stress which may occur as a result of excavation within the rock mass in mining (i.e., rockbursts), and changes in hydrostatic pressures and rock temperatures (e.g., during fluid injection or extraction) in oil exploitation, dam construction or fluid disposal. Microseismic monitoring systems determine event locations and important source parameters such as attenuation, seismic moment, source radius, static stress drop, peak particle velocity and seismic energy. An essential part of the operation of a microseismic monitoring system is the reliable detection of microseismic events. In the absence of reliable, automated picking techniques, operators rely upon manual picking. This is time-consuming, costly and, in the presence of background noise, very prone to error. The techniques described in this paper not only permit the reliable identification of events in cluttered signal environments, but have also enabled the authors to develop reliable automated event picking procedures. This opens the way to using microseismic monitoring as a cost-effective production/operations procedure. It has been the experience of the authors that in certain noisy environments, the seismic monitoring system may trigger on and subsequently acquire substantial quantities of erroneous data, due to the high energy content of the ambient noise. Digital filtering techniques need to be applied to the microseismic data so that the ambient noise is removed and event detection is simplified. The monitoring of seismic acoustic emissions is a continuous, real-time process, so it is desirable to implement digital filters which can be designed in the time domain and applied in real time, such as the Kalman Filter. This paper presents a real-time Kalman Filter which removes the statistically describable background noise from the recorded

  4. Swarm intelligence for detecting interesting events in crowded environments.

    PubMed

    Kaltsa, Vagia; Briassouli, Alexia; Kompatsiaris, Ioannis; Hadjileontiadis, Leontios J; Strintzis, Michael Gerasimos

    2015-07-01

    This paper focuses on detecting and localizing anomalous events in videos of crowded scenes, i.e., divergences from a dominant pattern. Both motion and appearance information are considered, so as to robustly distinguish different kinds of anomalies, for a wide range of scenarios. A newly introduced concept based on swarm theory, histograms of oriented swarms (HOS), is applied to capture the dynamics of crowded environments. HOS, together with the well-known histograms of oriented gradients, are combined to build a descriptor that effectively characterizes each scene. These appearance and motion features are only extracted within spatiotemporal volumes of moving pixels to ensure robustness to local noise, increase accuracy in the detection of local, nondominant anomalies, and achieve a lower computational cost. Experiments on benchmark data sets containing various situations with human crowds, as well as on traffic data, led to results that surpassed the current state of the art (SoA), confirming the method's efficacy and generality. Finally, the experiments show that our approach achieves significantly higher accuracy, especially for pixel-level event detection compared to SoA methods, at a low computational cost. PMID:25769154

  5. Endmember detection in marine environment with oil spill event

    NASA Astrophysics Data System (ADS)

    Andreou, Charoula; Karathanassi, Vassilia

    2011-11-01

    Oil spill events are a crucial environmental issue. Detection of oil spills is important for both oil exploration and environmental protection. In this paper, hyperspectral remote sensing is investigated for the detection of oil spills and the discrimination of different oil types. Spectral signatures of different oil types are very useful, since they may serve as endmembers in unmixing and classification models. To this end, an oil spectral library was compiled from spectral measurements of artificial oil spills as well as of look-alikes in the marine environment. Samples of four different oil types were used: two crude oils, one marine residual fuel oil, and one light petroleum product. Look-alikes comprise sea water, river discharges, shallow water and water with algae. Spectral measurements were acquired with the GER1500 spectro-radiometer. Moreover, the oil and look-alike spectral signatures were examined to determine whether they can serve as endmembers, which was accomplished by testing their linear independence. After that, synthetic hyperspectral images based on the oil spectral library were created. Several simplex-based endmember algorithms such as sequential maximum angle convex cone (SMACC), vertex component analysis (VCA), n-finder algorithm (N-FINDR), and automatic target generation process (ATGP) were applied to the synthetic images in order to evaluate their effectiveness in detecting oil spill events arising from different oil types. Results showed that different types of oil spills with various thicknesses can be extracted as endmembers.
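
    The linear-independence check on candidate endmember spectra amounts to a matrix-rank test (toy 5-band spectra, values invented for illustration):

```python
import numpy as np

def independent_endmembers(spectra, tol=1e-6):
    """Candidate endmember spectra (rows) are linearly independent exactly
    when the matrix rank equals the number of candidates."""
    m = np.asarray(spectra, dtype=float)
    return bool(np.linalg.matrix_rank(m, tol=tol) == len(m))

# toy 5-band signatures: crude oil, fuel oil, sea water
crude = [0.02, 0.05, 0.10, 0.20, 0.30]
fuel  = [0.01, 0.03, 0.06, 0.10, 0.15]
water = [0.08, 0.06, 0.04, 0.02, 0.01]
print(independent_endmembers([crude, fuel, water]))               # True
mixture = list(0.5 * np.add(crude, fuel))                         # 50/50 mixture
print(independent_endmembers([crude, fuel, mixture]))             # False
```

A spectrum that is a mixture of others fails the test and therefore cannot serve as an endmember.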

  6. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can advance our understanding of the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Often there are not enough human resources to analyze the video, and the strain that video analysis places on human attention demands an automated way to assist in the task. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  7. A Pulse-type Hardware Level Difference Detection Model Based on Sound Source Localization Mechanism in Barn Owl

    NASA Astrophysics Data System (ADS)

    Sakurai, Tsubasa; Sekine, Yoshifumi

    Auditory information processing is very important in the darkness, where visual information is extremely limited. Barn owls have excellent auditory information processing capabilities: they can detect a sound source with an accuracy of better than two degrees in both the vertical and horizontal directions. When performing sound source localization, barn owls use the interaural time difference for localization in the horizontal plane, and the interaural level difference for localization in the vertical plane. We are constructing a two-dimensional sound source localization model using pulse-type hardware neuron models based on the sound source localization mechanism of the barn owl, with a view to engineering applications. In this paper, we propose a pulse-type hardware model for level difference detection based on the sound source localization mechanism of the barn owl. Firstly, we discuss the response characteristics of the mathematical model for level difference detection. Next, we discuss the response characteristics of the hardware model. As a result, we show that the proposed model can be used as a sound source localization model for the vertical direction.

  8. Use of sonification in the detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Cole, Robert J.; Kruesi, Heidi; Greene, Herbert; Monahan, Ganesh; Hall, David L.

    2012-06-01

    In this paper, we describe the construction of a soundtrack that fuses stock market data with information taken from tweets. This soundtrack, or auditory display, presents the numerical and text data in such a way that anomalous events may be readily detected, even by untrained listeners. The soundtrack generation is flexible, allowing an individual listener to create a unique audio mix from the available information sources. Properly constructed, the display exploits the auditory system's sensitivities to periodicities, to dynamic changes, and to patterns. This type of display could be valuable in environments that demand high levels of situational awareness based on multiple sources of incoming information.

  9. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.
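
    The core of waveform-correlation detection can be sketched as a sliding normalized cross-correlation against a template (a generic single-trace illustration, not the WCEDS global grid search):

```python
import numpy as np

def correlation_detector(trace, template, threshold=0.8):
    """Slide a waveform template along a trace and flag windows whose
    normalized cross-correlation exceeds the threshold."""
    n = template.size
    tpl = (template - template.mean()) / template.std()
    hits = []
    for i in range(trace.size - n + 1):
        w = trace[i:i + n]
        if w.std() == 0:
            continue
        cc = np.dot(tpl, (w - w.mean()) / w.std()) / n   # Pearson correlation
        if cc > threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(4)
template = np.sin(np.linspace(0, 6 * np.pi, 50)) * np.hanning(50)
trace = 0.1 * rng.normal(size=500)
trace[200:250] += template          # event buried in noise at sample 200
hits = correlation_detector(trace, template)
print(hits)                         # detections cluster at the event onset
```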

  10. Automatic adverse drug events detection using letters to the editor.

    PubMed

    Yang, Chao; Srinivasan, Padmini; Polgreen, Philip M

    2012-01-01

    We present and test the intuition that letters to the editor in journals carry early signals of adverse drug events (ADEs). Surprisingly these letters have not yet been exploited for automatic ADE detection unlike for example, clinical records and PubMed. Part of the challenge is that it is not easy to access the full-text of letters (for the most part these do not appear in PubMed). Also letters are likely underrated in comparison with full articles. Besides demonstrating that this intuition holds we contribute techniques for post market drug surveillance. Specifically, we test an automatic approach for ADE detection from letters using off-the-shelf machine learning tools. We also involve natural language processing for feature definitions. Overall we achieve high accuracy in our experiments and our method also works well on a second new test set. Our results encourage us to further pursue this line of research. PMID:23304379

  12. Increased SERS detection efficiency for characterizing rare events in flow.

    PubMed

    Jacobs, Kevin T; Schultz, Zachary D

    2015-08-18

    Improved surface-enhanced Raman scattering (SERS) measurements of a flowing aqueous sample are accomplished by combining line focus optics with sheath-flow SERS detection. The straightforward introduction of a cylindrical lens into the optical path of the Raman excitation laser increases the efficiency of SERS detection and the reproducibility of SERS signals at low concentrations. The width of the line focus is matched to the width of the sample capillary from which the analyte elutes under hydrodynamic focusing conditions, allowing for increased collection across the SERS substrate while maintaining the power density below the damage threshold at any specific point. We show that a 4× increase in power spread across the line increases the signal-to-noise ratio by a factor of 2 for a variety of analytes, such as rhodamine 6G, amino acids, and lipid vesicles, without any detectable photodamage. COMSOL simulations and Raman maps elucidate the hydrodynamic focusing properties of the flow cell, providing a clearer picture of the confinement effects at the surface where the sample exits the capillary. The lipid vesicle results suggest that the combination of hydrodynamic focusing and increased optical collection enables the reproducible detection of rare events, in this case individual lipid vesicles. PMID:26168151

  13. Local Seismic Event Detection Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.

    2013-12-01

    The large footprint of regularly-spaced broadband seismometers afforded by EarthScope's USArray Transportable Array (TA) [www.usarray.org] presents an unprecedented opportunity to develop novel seismic array processing methods. Here we report preliminary results from a new automated method for detecting small local seismic events within the footprint of the TA using image processing techniques. The overarching goal is to develop a new methodology for automated searches of large seismic datasets for signals that are difficult to detect by traditional means, such as STA/LTA triggering algorithms. We first process the raw broadband data for each station by bandpass filtering at 7-19 Hz and integrating the absolute value of the velocity waveform over a sequence of 5-second intervals. We further combine the integrated values of all three orthogonal channels into a single new time series with a 5-second sampling rate. This new time series is analogous to a measurement of the total seismic energy recorded at the station in each 5-second interval; we call this time series Integrated Ground Motion (IGM). Each sample is compared to a sliding longer-term average to remove diurnal and long-term noise effects. We create an image file by mapping each station location to an equivalent position in a blank image array, and use a modified Voronoi tessellation algorithm to assign each pixel in the image to the IGM value of the nearest station. We assign a value of zero if the pixel is more than a maximum distance from the nearest station. We apply 2-dimensional spatial image filtering techniques to remove large-scale features affecting much of the image, as we assume these likely result from teleseismic events. We also filter the time series to remove very small-scale features from noise spikes affecting a single seismic station. The resulting image contains only features of regional scale affecting 2 or more stations. For each of the remaining image features, we find the center
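
    The Integrated Ground Motion (IGM) series described above can be sketched directly (an assumed 40 samples per second; the station's three channels are rectified, summed, and integrated over 5-second intervals):

```python
import numpy as np

def integrated_ground_motion(z, n, e, rate=40, window=5):
    """Integrate the absolute velocity of three orthogonal channels over
    `window`-second intervals, giving one IGM sample per interval."""
    samples = rate * window
    total = np.abs(z) + np.abs(n) + np.abs(e)
    usable = total[: total.size // samples * samples]
    return usable.reshape(-1, samples).sum(axis=1)

rng = np.random.default_rng(5)
z = 0.01 * rng.normal(size=40 * 60)      # one minute of noise at 40 sps
n = 0.01 * rng.normal(size=40 * 60)
e = 0.01 * rng.normal(size=40 * 60)
z[40 * 30: 40 * 32] += 0.5               # 2-second local event at t = 30 s
igm = integrated_ground_motion(z, n, e)
print(igm.argmax())                      # interval 6 covers t = 30-35 s
```

In the full method each IGM sample is then compared to a sliding longer-term average before the station values are rasterized into an image.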

  14. A robustness study of parametric and non-parametric tests in model-based multifactor dimensionality reduction for epistasis detection

    PubMed Central

    2013-01-01

    Background Applying a statistical method implies identifying underlying (model) assumptions and checking their validity in the particular context. One of these contexts is association modeling for epistasis detection. Here, depending on the technique used, violation of model assumptions may result in increased type I error, power loss, or biased parameter estimates. Remedial measures for violated underlying conditions or assumptions include data transformation or selecting a more relaxed modeling or testing strategy. Model-Based Multifactor Dimensionality Reduction (MB-MDR) for epistasis detection relies on association testing between a trait and a factor consisting of multilocus genotype information. For quantitative traits, the framework is essentially Analysis of Variance (ANOVA) that decomposes the variability in the trait amongst the different factors. In this study, we assess through simulations, the cumulative effect of deviations from normality and homoscedasticity on the overall performance of quantitative Model-Based Multifactor Dimensionality Reduction (MB-MDR) to detect 2-locus epistasis signals in the absence of main effects. Methodology Our simulation study focuses on pure epistasis models with varying degrees of genetic influence on a quantitative trait. Conditional on a multilocus genotype, we consider quantitative trait distributions that are normal, chi-square or Student’s t with constant or non-constant phenotypic variances. All data are analyzed with MB-MDR using the built-in Student’s t-test for association, as well as a novel MB-MDR implementation based on Welch’s t-test. Traits are either left untransformed or are transformed into new traits via logarithmic, standardization or rank-based transformations, prior to MB-MDR modeling. Results Our simulation results show that MB-MDR controls type I error and false positive rates irrespective of the association test considered. Empirically-based MB-MDR power estimates for MB-MDR with Welch
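
    The Welch's t-test underlying the novel MB-MDR implementation can be written out in a few lines: it replaces Student's pooled variance with per-group variances and the Welch-Satterthwaite degrees of freedom (toy genotype groups with unequal variance):

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom:
    a heteroscedasticity-robust alternative to the pooled Student's t."""
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (a.size - 1) + vb ** 2 / (b.size - 1))
    return t, df

rng = np.random.default_rng(6)
g1 = rng.normal(0.0, 1.0, 50)    # trait values for one multilocus genotype
g2 = rng.normal(1.0, 3.0, 30)    # another genotype group, unequal variance
t, df = welch_t(g1, g2)
print(f"t={t:.2f}, df={df:.1f}")
```

With unequal variances the Welch degrees of freedom fall below the pooled value of n1 + n2 - 2, which is what protects the type I error rate under heteroscedasticity.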

  15. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
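
    The idea of storing wavelet coefficients as compact, queryable trend descriptors can be sketched with sqlite3 standing in for the paper's MySQL schema (level-1 Haar coefficients; the table layout is invented for illustration):

```python
import sqlite3
import numpy as np

def haar_level1(x):
    """Level-1 Haar detail coefficients: a compact descriptor of a trend segment."""
    x = np.asarray(x, float)
    return (x[0::2] - x[1::2]) / np.sqrt(2)

con = sqlite3.connect(":memory:")        # stand-in for the MySQL database
con.execute("CREATE TABLE trends (segment INTEGER, max_detail REAL)")

rng = np.random.default_rng(7)
for seg in range(10):
    x = np.full(64, 80.0) + rng.normal(0, 0.5, 64)   # e.g. heart-rate trend
    if seg == 3:
        x[33:] -= 20                                  # abrupt change in segment 3
    d = haar_level1(x)
    con.execute("INSERT INTO trends VALUES (?, ?)", (seg, float(np.abs(d).max())))

# query: segments whose largest detail coefficient indicates an abrupt event,
# answered from the stored coefficients without touching the raw samples
rows = con.execute("SELECT segment FROM trends WHERE max_detail > 5").fetchall()
print(rows)
```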

  16. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is to extract the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness. Much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we are faced with the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because we must balance selection criteria that capture the largest number of objects of interest against a high contamination rate. We discuss considerations and techniques for developing classification schemes.
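
    The detection half of the problem can be sketched as image subtraction with a robust noise threshold (the bare bones of difference imaging, omitting the PSF matching a real pipeline needs):

```python
import numpy as np

def difference_detect(ref, new, k=5.0):
    """Subtract a reference image from a new epoch and flag pixels that
    deviate by more than k robust sigma (MAD-based) from the median."""
    diff = new - ref
    med = np.median(diff)
    sigma = 1.4826 * np.median(np.abs(diff - med))   # robust noise estimate
    return np.argwhere(np.abs(diff - med) > k * sigma)

rng = np.random.default_rng(9)
ref = 100.0 + rng.normal(0, 1.0, size=(64, 64))
new = ref + rng.normal(0, 1.0, size=(64, 64))
new[40, 12] += 50.0                  # a transient appears in the new epoch
hits = difference_detect(ref, new)
print(hits)                          # the transient's pixel coordinates
```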

  17. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has been typically addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
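
    The underlying idea, a continuous ramp-intensity index built from power gradients evaluated under several time scales, can be sketched as follows (toy series and scales; a simplification of the authors' wavelet formulation):

```python
import numpy as np

def ramp_function(p, scales=(2, 4, 8)):
    """Continuous ramp-intensity index: at each time step, the largest
    absolute mean power gradient over several time scales (in steps)."""
    r = np.zeros(p.size)
    for s in scales:
        grad = np.zeros(p.size)
        grad[s:] = (p[s:] - p[:-s]) / s      # mean gradient over scale s
        r = np.maximum(r, np.abs(grad))
    return r

# toy normalized wind power series: flat, a sharp ramp-up near t = 50, flat
p = np.concatenate([np.full(50, 0.2), np.linspace(0.2, 0.9, 10), np.full(40, 0.9)])
r = ramp_function(p)
print(r.argmax())                            # peak intensity falls inside the ramp
```

Unlike a binary ramp/non-ramp label, the index stays continuous, so thresholds can be chosen after the fact or avoided altogether.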

  18. Barometric pressure and triaxial accelerometry-based falls event detection.

    PubMed

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Lovell, Nigel H

    2010-12-01

    Falls and fall related injuries are a significant cause of morbidity, disability, and health care utilization, particularly among the age group of 65 years and over. The ability to detect falls events in an unsupervised manner would lead to improved prognoses for falls victims. Several wearable accelerometry and gyroscope-based falls detection devices have been described in the literature; however, they all suffer from unacceptable false positive rates. This paper investigates the augmentation of such systems with a barometric pressure sensor, as a surrogate measure of altitude, to assist in discriminating real fall events from normal activities of daily living. The acceleration and air pressure data are recorded using a wearable device attached to the subject's waist and analyzed offline. The study incorporates several protocols including simulated falls onto a mattress and simulated activities of daily living, in a cohort of 20 young healthy volunteers (12 male and 8 female; age: 23.7 ±3.0 years). A heuristically trained decision tree classifier is used to label suspected falls. The proposed system demonstrated considerable improvements in comparison to an existing accelerometry-based technique; showing an accuracy, sensitivity and specificity of 96.9%, 97.5%, and 96.5%, respectively, in the indoor environment, with no false positives generated during extended testing during activities of daily living. This is compared to 85.3%, 75%, and 91.5% for the same measures, respectively, when using accelerometry alone. The increased specificity of this system may enhance the usage of falls detectors among the elderly population. PMID:20805056
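
    The fusion of accelerometry with barometric pressure can be illustrated with a simple two-condition heuristic; the thresholds below are invented for illustration, whereas the paper uses a heuristically trained decision tree:

```python
import numpy as np

def fall_suspected(accel_g, pressure_hpa, rate=50):
    """Hedged heuristic (not the paper's classifier): an impact spike in
    acceleration magnitude followed by a sustained barometric pressure
    rise, i.e. a drop in altitude of roughly a metre."""
    impact = accel_g.max() > 3.0                          # impact spike, in g
    rise = pressure_hpa[-rate:].mean() - pressure_hpa[:rate].mean()
    return bool(impact and rise > 0.1)                    # ~1 m near sea level

rng = np.random.default_rng(8)
n = 5 * 50                                                # 5-second window at 50 Hz
accel = np.abs(1.0 + 0.05 * rng.normal(size=n))           # ~1 g at rest
pressure = 1013.0 + 0.01 * rng.normal(size=n)
accel[120] = 5.0                                          # simulated impact
pressure[150:] += 0.15                                    # waist now ~1.2 m lower
print(fall_suspected(accel, pressure))
```

Requiring both conditions is what suppresses the false positives that accelerometry alone produces, e.g. sitting down hard without a change in height.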

  19. Event Detection and Spatial Analysis for Characterizing Extreme Precipitation

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Prabhat, M.; Byna, S.; Collins, W.; Wehner, M. F.

    2013-12-01

    Atmospheric Rivers (ARs) are large, spatially coherent weather systems with high concentrations of elevated water vapor that often cause severe downpours and flooding over the western coastal United States. With more atmospheric moisture available in the future under global warming, we expect ARs to play an important role as a potential cause of extreme precipitation. We have recently developed the TECA software for automatically identifying and tracking features in climate datasets. In particular, we are able to identify ARs that make landfall on the western coast of North America. This detection tool examines the integrated water vapor field above a certain threshold and performs geometric analysis. Based on the detection procedure, we investigate the impacts of ARs by exploring the spatial extent of AR precipitation in CMIP5 simulations, and characterize the spatial pattern of dependence for future projections under climate change within the framework of extreme value theory. The results show that AR events in the RCP8.5 scenario (2076-2100) tend to produce heavier rainfall with higher frequency and longer duration than events from the historical run (1981-2005). The range of spatial dependence between extreme precipitation events is concentrated over a smaller, more localized area in California under the highest emission scenario than in the present day. Preliminary results are illustrated in Figures 1 and 2. Fig 1: Boxplot of annual max precipitation (left two) and max AR precipitation (right two) from GFDL-ESM2M during the 25-year time period, by station in California, US. Fig 2: Spatial dependence of max AR precipitation calculated from Station 4 (triangle) for the historical run (left) and for future projections under RCP8.5 (right) from GFDL-ESM2M. Green and orange colors represent complete dependence and independence between two stations, respectively.
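
    The threshold-plus-geometry detection step can be sketched as follows (a simplified stand-in for TECA's AR detector; the vapor threshold and minimum-extent values are illustrative assumptions, and real geometric tests are far richer):

```python
def detect_ar(iwv, threshold=20.0, min_cells=5):
    """Label 4-connected regions of an integrated-water-vapor grid that
    exceed a threshold; keep regions above a minimum spatial extent.
    Threshold (kg/m^2) and extent are illustrative."""
    rows, cols = len(iwv), len(iwv[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if iwv[r][c] > threshold and (r, c) not in seen:
                stack, cells = [(r, c)], []  # flood-fill one region
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols and
                                iwv[ny][nx] > threshold and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(cells) >= min_cells:
                    regions.append(cells)
    return regions
```

    A production detector would additionally test each region's length, width, and landfall location.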

  20. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification

    PubMed Central

    Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Background Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). Purpose To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). Material and Methods This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Results Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy·cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification was high with UL-MBIR (0.67–0.89) compared to L-ASIR or UL-ASIR (0.11–0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818–0.860) was comparable to that for L-ASIR (0.696–0.844). The specificity was lower with UL-MBIR (0.79–0.92) than with L-ASIR or UL-ASIR (0.96–0.99), and a significant difference was seen for one reader (P < 0.01). Conclusion With UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, attention should be paid to the slightly lower specificity. PMID:27110389

  1. Detecting Tidal Disruption Events (TDEs) with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Zhang, S.; Osborne, J.; O'Brien, P.; Watson, M.; Fraser, G.

    2014-07-01

    Stars are tidally disrupted and accreted when they approach supermassive black holes (SMBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible, including (1) TDE rate measurements as a function of host galaxy type, (2) an assessment of the population of IMBHs, and (3) new probes of general relativity and accretion processes. Here, we present the proposed X-ray mission Einstein Probe, which aims at detecting TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60deg x 60deg, or ~1 sr), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry an X-ray telescope with the same micro-pore optics for follow-ups, with a smaller field-of-view. It will be capable of issuing public transient alerts rapidly.

  2. Communication of ALS Patients by Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic Lateral Sclerosis (ALS) patients are unable to successfully communicate their desires, although their mental capacity is the same as that of non-affected persons. Therefore, the authors put emphasis on the Event-Related Potential (ERP), which shows the strongest response to target visual and auditory stimuli. P300 is one component of the ERP: a positive potential elicited when the subject focuses attention on stimuli that appear infrequently. In this paper, the authors focused on the P200 and N200 components, in addition to P300, for their great improvement in the rate of correct judgment in the target word-specific experiment. Hence the authors propose an algorithm that specifies target words by detecting these three components. Ten healthy subjects and an ALS patient underwent an experiment in which a target word out of five words was specified by this algorithm. The rates of correct judgment in nine of ten healthy subjects were more than 90.0%; the highest rate was 99.7%. The highest rate for the ALS patient was 100.0%. Through these results, the authors found that ALS patients may be able to communicate with surrounding persons through detection of the ERP components (P200, N200 and P300) reflecting their desires.
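
    The word-selection idea can be sketched by averaging epochs per candidate word and scoring a P300-like latency window (a hedged simplification: the latency window and sampling rate are illustrative, and the paper's algorithm additionally uses the P200 and N200 components):

```python
def pick_target(epochs_by_word, p300_window=(250, 400), fs=1000):
    """Average the epochs recorded for each candidate word and choose the
    word with the largest mean amplitude inside the P300 latency window
    (window bounds in ms; window and sampling rate are illustrative)."""
    lo = p300_window[0] * fs // 1000
    hi = p300_window[1] * fs // 1000
    best_word, best_amp = None, float("-inf")
    for word, epochs in epochs_by_word.items():
        n = len(epochs)
        avg = [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]
        amp = sum(avg[lo:hi]) / (hi - lo)
        if amp > best_amp:
            best_word, best_amp = word, amp
    return best_word
```

    Averaging across repeated presentations is what lifts the small P300 above ongoing EEG background.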

  3. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.

  4. Detection of Local/Regional Events in Kuwait Using Next-Generation Detection Algorithms

    SciTech Connect

    Gok, M. Rengin; Al-Jerri, Farra; Dodge, Douglas; Al-Enezi, Abdullah; Hauk, Terri; Mellors, R.

    2014-12-10

    Seismic networks around the world use conventional triggering algorithms to detect seismic signals in order to locate local/regional seismic events. The Kuwait National Seismological Network (KNSN) of the Kuwait Institute for Scientific Research (KISR) operates seven broad-band and short-period three-component stations in Kuwait. The network is equipped with Nanometrics digitizers and uses Antelope and Guralp acquisition software for processing and archiving the data. In this study, we selected 10 days of archived hourly-segmented continuous data from five stations (Figure 1) and 250 days of continuous recording at MIB. For the temporary deployment, our selection criterion was based on KNSN catalog intensity over the period for which we test the method. An autonomous event detection and clustering framework is employed to build a more complete catalog for this short period of time. The goal is to illustrate the effectiveness of the technique and pursue the framework over longer periods of time.

  5. Automatic detection of volcano-seismic events by modeling state and event duration in hidden Markov models

    NASA Astrophysics Data System (ADS)

    Bhatti, Sohail Masood; Khan, Muhammad Salman; Wuth, Jorge; Huenupan, Fernando; Curilem, Millaray; Franco, Luis; Yoma, Nestor Becerra

    2016-09-01

    In this paper we propose an automatic volcano event detection system based on a Hidden Markov Model (HMM) with state and event duration models. Since different volcanic events have different durations, the state and whole-event durations learnt from the training data are enforced on the corresponding state and event duration models within the HMM. Seismic signals from the Llaima volcano are used to train the system. Two types of events are employed in this study, Long Period (LP) and Volcano-Tectonic (VT). Experiments show that standard HMMs can detect the volcano events with high accuracy but generate false positives. The results presented in this paper show that the incorporation of duration modeling can reduce the false positive rate in event detection by as much as 31% with a true positive accuracy of 94%. Further evaluation of the false positives indicates that the false alarms generated by the system were mostly potential events according to the signal-to-noise ratio criteria recommended by a volcano expert.
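
    One simple way to exploit learned durations is a post-hoc filter on the decoded state sequence (a hedged simplification: the paper builds duration models directly into the HMM rather than filtering afterwards; the "noise" label and class names here are illustrative):

```python
def enforce_min_duration(states, min_dur):
    """Keep only decoded event segments at least as long as the minimum
    duration learnt for that event class; shorter runs are dropped as
    likely false positives. Returns (label, start, end) tuples."""
    events = []
    start, label = None, None
    for t, s in enumerate(states + ["noise"]):  # sentinel closes last run
        if start is not None and s != label:
            if t - start >= min_dur.get(label, 1):
                events.append((label, start, t))
            start, label = None, None
        if s != "noise" and start is None:
            start, label = t, s
    return events
```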

  6. Direct phosphorescent detection of primary event of photodynamic action

    NASA Astrophysics Data System (ADS)

    Losev, Anatoly P.; Knukshto, Valentin N.; Zhuravkin, Ivan N.

    1994-07-01

    The highly phosphorescent photosensitizer Pd-tetra (o-methoxy-p-sulfo) phenyl porphyrin (Pd-MSPP) was used to follow the primary event of photodynamic action - quenching of triplet states by free oxygen - in different systems: water solutions of proteins, and cells and tissues in vivo and in vitro. The photosensitizer forms complexes with proteins in solutions and biosystems, showing remarkable hypsochromic band shifts and an increase of the quantum yield and lifetime of phosphorescence upon binding to proteins. In the absence of oxygen the phosphorescence decay is almost single-exponential, with a lifetime that depends on the energy of the lowest triplet state of the sensitizer. The photochemical quenching of the triplets by cell components is negligible. In the presence of free oxygen, quenching of the sensitizer triplets takes place. The emission spectrum of singlet oxygen, with a maximum at 1271 nm, was recorded in water protein solutions, and the quantum yield of sensitized luminescence was measured. In the systems studied, oxygen consumption was detected and oxygen concentration was estimated in the course of photodynamic action from the increase in photosensitizer phosphorescence lifetime, using a laser flash photolysis technique. The at least bi-exponential kinetics of the phosphorescence decay show that the distribution of free oxygen in tissues is not uniform.

  7. Visual traffic surveillance framework: classification to event detection

    NASA Astrophysics Data System (ADS)

    Ambardekar, Amol; Nicolescu, Mircea; Bebis, George; Nicolescu, Monica

    2013-10-01

    Visual traffic surveillance using computer vision techniques can be noninvasive, automated, and cost effective. Traffic surveillance systems with the ability to detect, count, and classify vehicles can be employed in gathering traffic statistics and achieving better traffic control in intelligent transportation systems. However, vehicle classification poses a difficult problem as vehicles have high intraclass variation and relatively low interclass variation. Five different object recognition techniques are investigated: principal component analysis (PCA)+difference from vehicle space, PCA+difference in vehicle space, PCA+support vector machine, linear discriminant analysis, and constellation-based modeling applied to the problem of vehicle classification. Three of the techniques that performed well were incorporated into a unified traffic surveillance system for online classification of vehicles, which uses tracking results to improve the classification accuracy. To evaluate the accuracy of the system, 31 min of traffic video containing a multilane traffic intersection was processed. It was possible to achieve classification accuracy as high as 90.49% while classifying correctly tracked vehicles into four classes: cars, SUVs/vans, pickup trucks, and buses/semis. While processing a video, our system also recorded important traffic parameters such as the appearance, speed, and trajectory of each vehicle. This information was later used in a search assistant tool to find interesting traffic events.
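
    The "difference from vehicle space" idea (scoring a sample by its PCA reconstruction error) can be sketched as follows (a hedged illustration; the training data, dimensionality, and per-class handling are simplified assumptions):

```python
import numpy as np

def pca_fit(X, k):
    """Fit a k-component PCA on training vectors (rows of X)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def reconstruction_error(x, mean, components):
    """'Difference from vehicle space': distance between a sample and its
    reconstruction from the principal subspace; a small error suggests the
    sample belongs to the modeled class."""
    proj = components @ (x - mean)
    recon = mean + components.T @ proj
    return float(np.linalg.norm(x - recon))
```

    In a classifier, one such subspace would be fit per vehicle class and a sample assigned to the class with the smallest error.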

  8. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept [add ref to TPC section] has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g., neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation, and due to very low signal rates a TPC with the largest active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity, as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant and the effect improves with density. The liquid phase TPC can obtain a high density at low pressure, which results in very good self-shielding and compact installation with a lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, and lower energy resolution (e

  9. The waveform correlation event detection system project: Issues in system refinement, tuning, and operation

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Harris, J.M.; Moore, S.G.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.

    1996-08-01

    The goal of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs has been to develop a prototype of a full-waveform correlation-based seismic event detection system which could be used to assess its potential usefulness for CTBT monitoring. The current seismic event detection system in use at the IDC is very sophisticated and provides good results, but there is still significant room for improvement, particularly in reducing the number of false events (currently nearly equal to the number of real events). Our first prototype was developed last year and since then we have used it for extensive testing from which we have gained considerable insight. The original prototype was based on a long-period detector designed by Shearer (1994), but it has been heavily modified to address problems encountered in application to a data set from the Incorporated Research Institutions for Seismology (IRIS) broadband global network. Important modifications include capabilities for event masking and iterative event detection, continuous near-real-time execution, improved Master Image creation, and individualized station pre-processing. All have been shown to improve bulletin quality. In some cases the system has detected marginal events which may not be detectable by traditional detection systems, but definitive conclusions cannot be made without direct comparisons. For this reason future work will focus on using the system to process GSETT-3 data for comparison with current event detection systems at the IDC.
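
    The core of a waveform-correlation detector can be sketched as a normalized cross-correlation of a master waveform against continuous data (a hedged simplification of the WCEDS approach; the detection threshold is illustrative):

```python
def correlation_detect(trace, master, threshold=0.8):
    """Slide a master waveform along a continuous trace; report offsets
    where the normalized cross-correlation exceeds the threshold."""
    m = len(master)
    mm = sum(master) / m
    ms = [v - mm for v in master]
    me = sum(v * v for v in ms) ** 0.5
    hits = []
    for t in range(len(trace) - m + 1):
        w = trace[t:t + m]
        wm = sum(w) / m
        ws = [v - wm for v in w]
        we = sum(v * v for v in ws) ** 0.5
        if we > 0 and me > 0:
            cc = sum(a * b for a, b in zip(ws, ms)) / (we * me)
            if cc >= threshold:
                hits.append((t, cc))
    return hits
```

    Normalizing both windows makes the score amplitude-independent, so small events matching the master's shape can still be found.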

  10. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how can regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations be efficiently utilized to detect water quality contamination events? This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has largely been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result may be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. PMID:25996752

  11. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.

  12. Reading Times and the Detection of Event Shift Processing

    ERIC Educational Resources Information Center

    Radvansky, Gabriel A.; Copeland, David E.

    2010-01-01

    When people read narratives, they often need to update their situation models as the described events change. Previous research has shown little to no increases in reading times for spatial shifts but consistent increases for temporal shifts. On this basis, researchers have suggested that spatial updating does not regularly occur, whereas temporal…

  13. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application.

    PubMed

    Mur, Angel; Dormido, Raquel; Vega, Jesús; Duro, Natividad; Dormido-Canto, Sebastian

    2016-01-01

    In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method; to this end, an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classification results, the proposed method detects events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time. PMID:27120605

  14. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application

    PubMed Central

    Mur, Angel; Dormido, Raquel; Vega, Jesús; Duro, Natividad; Dormido-Canto, Sebastian

    2016-01-01

    In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method; to this end, an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classification results, the proposed method detects events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time. PMID:27120605

  15. Setting objective thresholds for rare event detection in flow cytometry.

    PubMed

    Richards, Adam J; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N; Weinhold, Kent J; Chan, Cliburn

    2014-07-01

    The accurate identification of rare antigen-specific cytokine-positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine-positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events, since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events ("smear"). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, and illustrate problems that occur with commonly employed clustering algorithms. In contrast, a single parameterization of the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
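
    The idea of objectively choosing a hard cutoff by maximizing Fβ on control data can be sketched as follows (a hedged simplification of the cited method; the candidate-threshold scan over observed intensities is an illustrative choice):

```python
def f_beta_threshold(pos, neg, beta=1.0):
    """Scan candidate cutoffs over positive- and negative-control values
    and return the one maximizing the F-beta score, with its score.
    F_beta = (1+b^2)*TP / ((1+b^2)*TP + b^2*FN + FP)."""
    best_t, best_f = None, -1.0
    for t in sorted(set(pos) | set(neg)):
        tp = sum(1 for v in pos if v >= t)
        fp = sum(1 for v in neg if v >= t)
        fn = len(pos) - tp
        denom = (1 + beta ** 2) * tp + beta ** 2 * fn + fp
        f = (1 + beta ** 2) * tp / denom if denom else 0.0
        if f > best_f:
            best_t, best_f = t, f
    return best_t, best_f
```

    Setting beta above 1 weights recall over precision, trading more false positives for fewer missed rare events.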

  16. Motor task event detection using Subthalamic Nucleus Local Field Potentials.

    PubMed

    Niketeghad, Soroush; Hebb, Adam O; Nedrud, Joshua; Hanrahan, Sara J; Mahoor, Mohammad H

    2015-08-01

    Deep Brain Stimulation (DBS) provides significant therapeutic benefit for movement disorders such as Parkinson's disease. Current DBS devices lack real-time feedback (thus are open loop) and stimulation parameters are adjusted during scheduled visits with a clinician. A closed-loop DBS system may reduce power consumption and DBS side effects. In such systems, DBS parameters are adjusted based on the patient's behavior, which means that behavior detection is a major step in designing such systems. Various physiological signals can be used to recognize the behaviors. The Subthalamic Nucleus (STN) Local Field Potential (LFP) is a strong candidate signal for the neural feedback, because it can be recorded from the stimulation lead and does not require additional sensors. A practical behavior detection method should be able to detect behaviors asynchronously, meaning that it should not use any prior knowledge of behavior onsets. In this paper, we introduce a behavior detection method that is able to asynchronously detect the finger movements of Parkinson's patients. As a result of this study, we learned that there is a motor-modulated inter-hemispheric connectivity between LFP signals recorded bilaterally from the STN. We used a non-linear regression method to measure this connectivity and used it to detect the finger movements. Performance of this method is evaluated using the Receiver Operating Characteristic (ROC). PMID:26737550

  17. The GRACE satellites detect recent extreme climate events in China

    NASA Astrophysics Data System (ADS)

    Tang, Jingshi; Liu, Lin

    2012-07-01

    As the climate changes, extreme climate events are occurring more frequently across the globe. In China, drought or flood has struck almost every year recently, and there have been several disastrous events in these years. We show that some of these disastrous events are so strong that the corresponding gravity change can be observed by geodetic satellites. We use the Gravity Recovery and Climate Experiment (GRACE), a joint mission between NASA and DLR. One primary job of GRACE is to map Earth's temporal gravity field with high resolution. Over the years the twin satellites have observed the loss of mass in Antarctica and Greenland, strong earthquakes, severe climate change in South America, and so on, which provides a unique way to study geophysical and climatological processes. In this report, the Level-2 product from the Center for Space Research for recent years is used and specific areas in China are focused on. It is shown that after decorrelation, filtering and other processing, the gravity anomalies observed by GRACE match the extreme climate events and the hydrological data from the Global Land Data Assimilation System (GLDAS).

  18. Nonthreshold-based event detection for 3d environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches for event detection are mainly based on some predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds but some complex pattern in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events through matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach on detecting events of complex phenomena from real-life records.

  19. Minimal elastographic modeling of breast cancer for model based tumor detection in a digital image elasto tomography (DIET) system

    NASA Astrophysics Data System (ADS)

    Lotz, Thomas F.; Muller, Natalie; Hann, Christopher E.; Chase, J. Geoffrey

    2011-03-01

    Digital Image Elasto Tomography (DIET) is a non-invasive breast cancer screening technology that images the surface motion of a breast under harmonic mechanical actuation. A new approach capturing the dynamics and characteristics of tumor behavior is presented. A simple mechanical model of the breast is used to identify a transfer function relating the input harmonic actuation to the output surface displacements using imaging data of a silicone phantom. Areas of higher stiffness cause significant changes of damping and resonant frequencies as seen in the resulting Bode plots. A case study on a healthy and tumor silicone breast phantom shows the potential for this model-based method to clearly distinguish cancerous and healthy tissue as well as correctly predicting the tumor position.

  20. Neuro-evolutionary event detection technique for downhole microseismic surveys

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam; Salehi, Iraj

    2016-01-01

    Recent years have seen a significant increase in borehole microseismic data acquisition programs associated with unconventional reservoir developments such as hydraulic fracturing programs for shale oil and gas. The data so acquired is used for hydraulic fracture monitoring and diagnostics and therefore, the quality of the data in terms of resolution and accuracy has a significant impact on its value to the industry. Borehole microseismic data acquired in such environments typically suffer from propagation effects due to the presence of thin interbedded shale layers as well as noise and interference effects. Moreover, acquisition geometry has significant impact on detectability across portions of the sensor array. Our work focuses on developing robust first arrival detection and pick selection workflow for both P and S waves specifically designed for such environments. We introduce a novel workflow for refinement of picks with immunity towards significant noise artifacts and applicability over data with very low signal-to-noise ratio provided some accurate picks have already been made. This workflow utilizes multi-step hybrid detection and classification routine which makes use of a neural network based autopicker for initial picking and an evolutionary algorithm for pick refinement. We highlight the results from an actual field case study including multiple examples demonstrating immunity towards noise and compare the effectiveness of the workflow with two contemporary autopicking routines without the application of the shared detection/refinement procedure. Finally, we use a windowed waveform cross-correlation based uncertainty estimation method for potential quality control purposes. While the workflow was developed to work with the neural network based autopicker, it can be used with any other traditional autopicker and provides significant improvements in pick detection across seismic gathers.

  1. Spatial-temporal event detection in climate parameter imagery.

    SciTech Connect

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.

  2. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short-range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long-range (power-law) correlations and non-Gaussian distributions of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. Actually, the local flux-gradient relationship is greatly preferred due to its clearer physical meaning, allowing one to perform direct comparisons with experimental data, and, especially, due to its smaller computational cost in numerical models. In particular, the linearity of this relationship allows one to define a transport coefficient (e.g., turbulent diffusivity), and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model is strongly dependent on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  3. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it accounts for the various properties that reflect an event's status, the composite event is more consistent with the objective world, and its study is therefore more realistic. In this paper, we analyze the characteristics of the composite event; then we propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision based composite event decision mechanism. In the case that the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The composite event judgment mechanism, which is based on fuzzy decision, retains the advantages of fuzzy-logic based algorithms; moreover, it does not need the support of a huge rule base and its computational complexity is small. Compared to the CollECT algorithm and the CDS algorithm, this algorithm improves the detection accuracy and reduces the traffic. PMID:25136690

  4. Object-Oriented Query Language For Events Detection From Images Sequences

    NASA Astrophysics Data System (ADS)

    Ganea, Ion Eugen

    2015-09-01

This paper presents a method to represent events extracted from image sequences and the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detection of incidents, as the final goal of the research.

  5. Detection of Upper Airway Status and Respiratory Events by a Current Generation Positive Airway Pressure Device

    PubMed Central

    Li, Qing Yun; Berry, Richard B.; Goetting, Mark G.; Staley, Bethany; Soto-Calderon, Haideliza; Tsai, Sheila C.; Jasko, Jeffrey G.; Pack, Allan I.; Kuna, Samuel T.

    2015-01-01

    Study Objectives: To compare a positive airway pressure (PAP) device's detection of respiratory events and airway status during device-detected apneas with events scored on simultaneous polysomnography (PSG). Design: Prospective PSGs of patients with sleep apnea using a new-generation PAP device. Settings: Four clinical and academic sleep centers. Patients: Forty-five patients with obstructive sleep apnea (OSA) and complex sleep apnea (Comp SA) performed a PSG on PAP levels adjusted to induce respiratory events. Interventions: None. Measurements and Results: PAP device data identifying the type of respiratory event and whether the airway during a device-detected apnea was open or obstructed were compared to time-synced, manually scored respiratory events on simultaneous PSG recording. Intraclass correlation coefficients between device-detected and PSG scored events were 0.854 for apnea-hypopnea index (AHI), 0.783 for apnea index, 0.252 for hypopnea index, and 0.098 for respiratory event-related arousals index. At a device AHI (AHIFlow) of 10 events/h, area under the receiver operating characteristic curve was 0.98, with sensitivity 0.92 and specificity 0.84. AHIFlow tended to overestimate AHI on PSG at values less than 10 events/h. The device detected that the airway was obstructed in 87.4% of manually scored obstructive apneas. Of the device-detected apneas with clear airway, a minority (15.8%) were manually scored as obstructive apneas. Conclusions: A device-detected apnea-hypopnea index (AHIFlow) < 10 events/h on a positive airway pressure device is strong evidence of good treatment efficacy. Device-detected airway status agrees closely with the presumed airway status during polysomnography scored events, but should not be equated with a specific type of respiratory event. Citation: Li QY, Berry RB, Goetting MG, Staley B, Soto-Calderon H, Tsai SC, Jasko JG, Pack AI, Kuna ST. 
Detection of upper airway status and respiratory events by a current generation positive

  6. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
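The dynamic query expansion step described above (iteratively growing a set of domain-related terms from terms that co-occur with the current query) can be sketched as follows. This is a deliberately simplified illustration: the tweet data, function name, and parameters are assumptions, not the paper's actual scoring scheme.

```python
from collections import Counter

def expand_query(tweets, seed_terms, rounds=2, top_k=2):
    """Iteratively grow a seed term set from co-occurring terms.

    tweets: list of tokenized tweets (lists of words).
    Each round, tweets matching the current query vote for new terms;
    the top_k most frequent co-occurring terms join the query.
    """
    terms = set(seed_terms)
    for _ in range(rounds):
        counts = Counter()
        for words in tweets:
            if terms & set(words):  # tweet matches the current query
                counts.update(w for w in words if w not in terms)
        terms.update(w for w, _ in counts.most_common(top_k))
    return terms

# Illustrative toy corpus: "protest" pulls in "march", which pulls in "strike".
tweets = [["protest", "march", "downtown"],
          ["march", "strike", "city"],
          ["weather", "sunny"]]
expanded = expand_query(tweets, {"protest"})
```

In the paper's pipeline the expanded term set then defines the tweet graph over which spatial scan statistics are computed; this sketch covers only the expansion loop.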

  7. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2016-05-01

The geographical area where a seismic event of magnitude M ≥ Mt is detected by a seismic station network, for a defined probability, is derived from a station probability of detection estimated as a function of epicentral distance. The latter is determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. To simulate the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. An attempt is presented to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.
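The network-level detection probability underlying this kind of assessment can be illustrated under a strong independence assumption: given each station's probability of detecting the event, the probability that at least k stations detect it follows by enumeration. This sketch omits the association logic that the study additionally models, and all names and values are illustrative.

```python
from itertools import combinations

def network_detection_prob(station_probs, min_stations):
    """Probability that at least `min_stations` stations detect an event,
    assuming independent per-station detection probabilities.

    Enumerates all subsets of detecting stations (fine for small networks).
    """
    n = len(station_probs)
    total = 0.0
    for k in range(min_stations, n + 1):
        for idx in combinations(range(n), k):
            p = 1.0
            for i in range(n):
                p *= station_probs[i] if i in idx else (1.0 - station_probs[i])
            total += p
    return total
```

Mapping this probability over a grid of epicenters, with each station's probability taken from its distance-dependent detection curve, yields the detection-capability map the abstract describes.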

  8. Find Your Manners: How Do Infants Detect the Invariant Manner of Motion in Dynamic Events?

    ERIC Educational Resources Information Center

    Pruden, Shannon M.; Goksun, Tilbe; Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta M.

    2012-01-01

    To learn motion verbs, infants must be sensitive to the specific event features lexicalized in their language. One event feature important for the acquisition of English motion verbs is the manner of motion. This article examines when and how infants detect manners of motion across variations in the figure's path. Experiment 1 shows that 13- to…

  9. MCD for detection of event-based landslides

    NASA Astrophysics Data System (ADS)

    Mondini, A. C.; Chang, K.; Guzzetti, F.

    2011-12-01

Landslides play an important role in the landscape evolution of mountainous terrain. They also present a socioeconomic problem in terms of risk for people and properties. Landslide inventory maps are not available for many areas affected by slope instabilities, resulting in a lack of primary information for the comprehension of the phenomenon, evaluation of relative landslide statistics, and civil protection operations on large scales. Traditional methods for the preparation of landslide inventory maps are based on the geomorphological interpretation of stereoscopic aerial photography and field surveys. These methods are expensive and time consuming. The exploitation of new remote sensing data, in particular very high resolution (VHR) satellite images, and new dedicated methods present an alternative to the traditional methods and are at the forefront of modern landslide research. Recent studies have shown the possibility of producing accurate landslide maps, reducing the time and resources required for their compilation and systematic update. This paper presents the Multiple Change Detection (MCD) technique, a new method that has shown promising results in landslide mapping. Through supervised or unsupervised classifiers, MCD combines different change detection metrics, such as change in Normalized Difference Vegetation Index, spectral angle, principal component analysis, and independent component analysis, and applies them to a multi-temporal set of VHR satellite images to distinguish new landslides from stable areas. MCD has been applied with success in different geographical areas and with different satellite images, suggesting it is a reliable and robust technique. The technique can distinguish old from new landslides and capture runout features. Results of these case studies will be presented at the conference.
Also to be presented are new developments of MCD involving the introduction of a priori information on landslide susceptibility within

  10. A model-based information sharing protocol for profile Hidden Markov Models used for HIV-1 recombination detection

    PubMed Central

    2014-01-01

    Background In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatical tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, i.e., HIV-1 subtypes for which only a small number of sequences is known. Results To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regards to the biological characteristics of HI viruses. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. Conclusions We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning. PMID:24946781
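The information-sharing idea above (estimating a subfamily's emission probabilities not only from its own nucleotide frequencies but also from those pooled across all subfamilies) can be illustrated with a simple shrinkage estimator. The function name and the pooling weight `alpha` are assumptions for illustration; the paper's actual probabilistic model of commonality and variation between subtypes is more elaborate.

```python
def shared_emission_probs(subtype_counts, pooled_counts, alpha=10.0):
    """Blend subtype-specific nucleotide counts with pooled counts.

    Small subtypes are pulled toward the pooled frequencies; large
    subtypes are dominated by their own counts (classic shrinkage).
    """
    n_sub = sum(subtype_counts.values())
    n_pool = sum(pooled_counts.values())
    probs = {}
    for base in "ACGT":
        f_pool = pooled_counts.get(base, 0) / n_pool
        probs[base] = (subtype_counts.get(base, 0) + alpha * f_pool) / (n_sub + alpha)
    return probs

# A subtype known from a single sequence position still gets a proper,
# non-degenerate emission distribution thanks to the pooled counts.
probs = shared_emission_probs({"A": 1}, {"A": 100, "C": 100, "G": 100, "T": 100})
```

With only one observed `A`, the estimate favors `A` but keeps substantial mass on the other bases, which is exactly the behavior needed for subtypes with very few sequences.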

  11. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. PMID:25770443
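The core of a logit fusion step like the one described above is a weighted logistic function mapping the single-indicator scores to one event probability. In the paper the weights are calibrated jointly with the rest of the system (by maximum likelihood and genetic algorithms); the values below are purely illustrative assumptions.

```python
import math

def fused_event_probability(indicator_scores, weights, bias):
    """Fuse per-indicator anomaly scores into one event probability
    via a binary logit model: p = sigmoid(bias + w . s)."""
    z = bias + sum(w * s for w, s in zip(weights, indicator_scores))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative: three water-quality indicators all alarming vs. all quiet.
p_alarm = fused_event_probability([1.0, 1.0, 1.0], [2.0, 2.0, 2.0], -3.0)
p_quiet = fused_event_probability([0.0, 0.0, 0.0], [2.0, 2.0, 2.0], -3.0)
```

The point of the fused model is that a single event probability, rather than a heuristic vote over separate alarms, can be thresholded to trade off false positives against detection sensitivity.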

  12. Comparison of the STA/LTA and power spectral density methods for microseismic event detection

    NASA Astrophysics Data System (ADS)

    Vaezi, Yoones; Van der Baan, Mirko

    2015-12-01

    Robust event detection and picking is a prerequisite for reliable (micro-) seismic interpretations. Detection of weak events is a common challenge among various available event detection algorithms. In this paper we compare the performance of two event detection methods, the short-term average/long-term average (STA/LTA) method, which is the most commonly used technique in industry, and a newly introduced method that is based on the power spectral density (PSD) measurements. We have applied both techniques to a 1-hr long segment of the vertical component of some raw continuous data recorded at a borehole geophone in a hydraulic fracturing experiment. The PSD technique outperforms the STA/LTA technique by detecting a higher number of weak events while keeping the number of false alarms at a reasonable level. The time-frequency representations obtained through the PSD method can also help define a more suitable bandpass filter which is usually required for the STA/LTA method. The method offers thus much promise for automated event detection in industrial, local, regional and global seismological data sets.
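A minimal windowed STA/LTA, the industry-standard detector compared in this study, can be written in a few lines. The window lengths and the synthetic trace are illustrative choices; production implementations typically use recursive averages and careful edge handling.

```python
def sta_lta(trace, n_sta, n_lta):
    """Windowed STA/LTA ratio on a 1-D trace.

    For each sample (after the first full LTA window), divide the mean
    absolute amplitude of the short trailing window by that of the long
    trailing window; an event onset shows up as a spike in the ratio.
    """
    ratios = []
    for i in range(n_lta, len(trace)):
        sta = sum(abs(x) for x in trace[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in trace[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: quiet background with a short burst in the middle.
trace = [0.1] * 50 + [1.0] * 10 + [0.1] * 50
ratios = sta_lta(trace, n_sta=5, n_lta=20)
```

On stationary background the ratio hovers near 1; at the burst onset the short window rises before the long one does, so a trigger threshold of a few units flags the event.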

  13. Field testing of component-level model-based fault detection methods for mixing boxes and VAV fan systems

    SciTech Connect

    Xu, Peng; Haves, Philip

    2002-05-16

An automated fault detection and diagnosis tool for HVAC systems is being developed, based on an integrated, life-cycle, approach to commissioning and performance monitoring. The tool uses component-level HVAC equipment models implemented in the SPARK equation-based simulation environment. The models are configured using design information and component manufacturers' data and then fine-tuned to match the actual performance of the equipment by using data measured during functional tests of the sort used in commissioning. This paper presents the results of field tests of mixing box and VAV fan system models in an experimental facility and a commercial office building. The models were found to be capable of representing the performance of correctly operating mixing box and VAV fan systems and detecting several types of incorrect operation.

  14. Probabilistic approaches to fault detection in networked discrete event systems.

    PubMed

    Athanasopoulou, Eleftheria; Hadjicostis, Christoforos N

    2005-09-01

In this paper, we consider distributed systems that can be modeled as finite state machines with known behavior under fault-free conditions, and we study the detection of a general class of faults that manifest themselves as permanent changes in the next-state transition functionality of the system. This scenario could arise in a variety of situations encountered in communication networks, including faults occurring due to design or implementation errors during the execution of communication protocols. In our approach, fault diagnosis is performed by an external observer/diagnoser that functions as a finite state machine and which has access to the input sequence applied to the system but has only limited access to the system state or output. In particular, we assume that the observer/diagnoser is only able to obtain partial information regarding the state of the given system at intermittent time intervals that are determined by certain synchronizing conditions between the system and the observer/diagnoser. By adopting a probabilistic framework, we analyze ways to optimally choose these synchronizing conditions and develop adaptive strategies that achieve a low probability of aliasing, i.e., a low probability that the external observer/diagnoser incorrectly declares the system as fault-free. An application of these ideas in the context of protocol testing/classification is provided as an example. PMID:16252815

  15. The waveform correlation event detection system project, Phase I: Issues in prototype development and testing

    SciTech Connect

    Young, C.; Harris, M.; Beiriger, J.; Moore, S.; Trujillo, J.; Withers, M.; Aster, R.

    1996-08-01

A study using long-period seismic data showed that seismic events can be detected and located based on correlations of processed waveform profiles with the profile expected for an event. In this technique both time and space are discretized, and events are found by forming profiles and calculating correlations for all time-distance points. Events are declared at points with large correlations. In the first phase of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs, we have developed a prototype automatic event detection system based on Shearer's work which shows promise for treaty monitoring applications. Many modifications have been made to meet the requirements of the monitoring environment. A new full matrix multiplication has been developed which can reduce the number of computations needed for the data correlation by as much as two orders of magnitude for large grids. New methodology has also been developed to deal with the problems caused by false correlations (sidelobes) generated during the correlation process. When an event has been detected, masking matrices are set up which mask all correlation sidelobes due to the event, allowing other events with intermingled phases to be found. This process is repeated until a detection threshold is reached. The system was tested on one hour of Incorporated Research Institutions for Seismology (IRIS) broadband data and built all 4 of the events listed in the National Earthquake Information Center (NEIC) Preliminary Determination of Epicenters (PDE) which were observable by the IRIS network. A continuous execution scheme has been developed for the system but has not yet been implemented. Improvements to the efficiency of the code are in various stages of development.
Many refinements would have to be made to the system before it could be used as part of an actual monitoring system, but at this stage we know of no clear barriers which would prevent an eventual implementation of the system.
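The grid-search idea behind this family of detectors (stack a characteristic function at the arrival times predicted for each candidate location and origin time, and declare an event where the stack peaks) can be sketched as follows. This toy version ignores the sidelobe masking the prototype implements, and all names and data are illustrative assumptions.

```python
def locate_event(cfs, travel_times):
    """Grid-search stacking over candidate locations and origin times.

    cfs: per-station characteristic functions (lists of non-negative values).
    travel_times: travel_times[g][s] = predicted delay (in samples) from
    grid point g to station s. Returns (best grid point, origin time, stack).
    """
    n_time = min(len(cf) for cf in cfs)
    best_score, best_point, best_t0 = 0.0, None, None
    for g, tts in enumerate(travel_times):          # candidate grid points
        max_tt = max(tts)
        for t0 in range(n_time - max_tt):           # candidate origin times
            score = sum(cf[t0 + tt] for cf, tt in zip(cfs, tts))
            if score > best_score:
                best_score, best_point, best_t0 = score, g, t0
    return best_point, best_t0, best_score

# Two stations, spikes consistent with grid point 1 and origin time 4.
cf0 = [0.0] * 15; cf0[6] = 1.0
cf1 = [0.0] * 15; cf1[9] = 1.0
point, t0, score = locate_event([cf0, cf1], [[1, 2], [2, 5]])
```

Only the correct (location, origin time) pair lines both spikes up, so the stack there is twice the single-station value; a mismatched grid point picks up at most one spike.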

  16. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

Bt brinjal event EE-1 with cry1Ac gene, expressing insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional as well as real-time PCR methods to characterize the event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying cry1Ac transgene, Cauliflower Mosaic Virus (CaMV) 35S promoter, nopaline synthase (nos) terminator, aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, construct-specific PCR, targeting an approximately 1.8 kb region of the inserted gene construct comprising the region of the CaMV 35S promoter and the cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The method performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and postrelease monitoring, but also address consumer concerns and resolution of legal disputes. PMID:23451391

  17. Real-time detection and classification of anomalous events in streaming data

    DOEpatents

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
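One simple instance of the low-probability scoring such a system might apply is rarity scoring against smoothed empirical counts, where an event's anomaly score is the negative log of its estimated probability. This is an illustrative sketch only, not the patented algorithm; the class name and smoothing choice are assumptions.

```python
import math
from collections import Counter

class AnomalyScorer:
    """Score streaming events by rarity.

    Maintains running counts of observed event types and returns
    -log(p) under a Laplace-smoothed empirical distribution, so
    rarely seen events receive high scores.
    """
    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def score(self, event):
        p = (self.counts[event] + 1) / (self.total + 2)  # Laplace smoothing
        self.counts[event] += 1
        self.total += 1
        return -math.log(p)
```

In a full pipeline this anomalousness score would be one input among several detectors, with a separate classifier deciding whether a flagged pattern is of interest (e.g., malicious) or not.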

  18. Investigation of EMIC Waves During Balloon Detected Relativistic Electron Precipitation Events

    NASA Astrophysics Data System (ADS)

    Woodger, L. A.; Millan, R. M.

    2009-12-01

Multiple relativistic electron precipitation (REP) events were detected by balloon-borne instrumentation during the MAXIS 2000 and MINIS 2005 campaigns. It has been suggested that resonance with EMIC waves caused these precipitation events (Lorentzen et al., 2000; Millan et al., 2002) due to their location in the dusk sector. We present observations of dusk-side relativistic electron precipitation events, and use supporting satellite and theoretical data to investigate the relationship between EMIC waves and the detected REP. Satellite data can provide direct measurements of not only the waves themselves but also important resonance condition parameters. The data will be presented collectively with each event to showcase similarities and differences between events and the challenges that arise in trying to understand the relationship between dusk-side relativistic electron precipitation and EMIC waves.

  19. Detection of stick-slip events within the Whillans Ice Stream using an artificial neural network

    NASA Astrophysics Data System (ADS)

    Bernsen, S. P.

    2014-12-01

Temporal changes in the periodic stick-slip events on the Whillans Ice Stream (WIS) help us understand the hydrosphere-cryosphere coupling in West Antarctica. Previous studies have shown that the periodic behavior has been ongoing for a number of years, but the record of slip events is incomplete. Rayleigh waves from WIS grounding line events exhibit different patterns than events from the interior of the glacier. An algorithm using a backpropagation neural network is proposed to efficiently extract surface waves that result from stick-slip events. A neural network approach has the advantages of machine learning and simplified mathematics, and it eliminates the need for an analyst to correctly pick first arrivals. Training data have been assembled using 107 events occurring during the 2010 austral summer that were previously identified as stick-slip events at the grounding line as well as in the interior of the WIS. A 0.1 s moving window over 3 s of each of the preprocessed attributes is input into the neural network for automated surface wave detection. Following surface wave detection, a much longer 30 minute sliding window is used to classify surface wave detections as grounding line, interior, or non-stick-slip events. Similar to automatic detection algorithms for body waves, preprocessed attributes such as the STA/LTA ratio, degree of polarization, variance, and skewness exhibit obvious patterns during the onset of surface waves. The automated event detection could lead to more cost-effective data collection in future seismic experiments, especially with an increase in array density in cold weather regions.

  20. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    NASA Astrophysics Data System (ADS)

    Alese Semmens, Kathryn; Ramage, Joan; Bartsch, Annett; Liston, Glen E.

    2013-03-01

    High latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite detected events were validated with ground station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and weather station data. All melt events coincided with above freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change.

  1. Event Detection and Visualization of Ocean Eddies based on SSH and Velocity Field

    NASA Astrophysics Data System (ADS)

    Matsuoka, Daisuke; Araki, Fumiaki; Inoue, Yumi; Sasaki, Hideharu

    2016-04-01

Numerical studies of ocean eddies have progressed using high-resolution ocean general circulation models. In order to understand ocean eddies from simulation results containing a large volume of information, it is necessary to visualize not only the distribution of eddies at each time step, but also events or phenomena involving eddies. However, previous methods cannot precisely detect eddies, especially during events such as amalgamation and bifurcation. In the present study, we propose a new approach to eddy detection, tracking, and event visualization based on sea surface height (SSH) and the velocity field. The proposed method detects eddy regions as well as stream and current regions, and classifies detected eddies into several types. By tracking the time-varying changes of classified eddies, it is possible to detect not only eddy events such as amalgamation and bifurcation but also interactions between eddies and ocean currents. By visualizing the detected eddies and events, we succeeded in creating a movie that enables us to intuitively understand the regions of interest.

  2. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  3. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  4. Detection of gait events using an F-Scan in-shoe pressure measurement system.

    PubMed

    Catalfamo, Paola; Moser, David; Ghoussayni, Salim; Ewins, David

    2008-10-01

    A portable system capable of accurate detection of initial contact (IC) and foot off (FO) without adding encumbrance to the subject would be extremely useful in many gait analysis applications. Force platforms represent the gold standard method for determining these events, and other methods including foot switches and kinematic data have also been proposed. These approaches, however, present limitations in terms of the number of steps that can be analysed per trial, the portability for outdoor measurements or the information needed beforehand. The purpose of this study was to evaluate the F-Scan® Mobile pressure measurement system when detecting IC and FO. Two methods were used: one was the force detection (FD) in-built algorithm used by F-Scan software, and the other a new area detection (AD) method using the loaded area during the gait cycle. Both methods were tested in ten healthy adults and compared with the detection provided by a kinetic detection (KT) algorithm. The absolute mean differences between KT and FD were (mean ± standard deviation) 42 ± 11 ms for IC and 37 ± 11 ms for FO. The absolute mean differences between KT and AD were 22 ± 9 ms for IC and 10 ± 4 ms for FO. The AD method remained closer to KT detection for all subjects, providing sufficiently accurate detection of both events and offering advantages in portability, number of steps analysed per trial and practicality that make it a system of choice for gait event detection. PMID:18468441
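
The area-detection idea, flagging IC when the loaded area rises above a threshold and FO when it falls back below, can be sketched as follows. This is an illustrative reconstruction, not the paper's AD algorithm; the threshold, sampling rate and synthetic data are invented.

```python
import numpy as np

def detect_gait_events(loaded_area, threshold, fs):
    """Detect initial contact (IC) and foot off (FO) from an in-shoe
    loaded-area time series sampled at fs Hz.

    IC is flagged where the loaded area rises above the threshold,
    FO where it falls back below it.
    """
    above = loaded_area > threshold
    # transitions: +1 at rising edges (IC), -1 at falling edges (FO)
    edges = np.diff(above.astype(int))
    ic_times = np.where(edges == 1)[0] / fs
    fo_times = np.where(edges == -1)[0] / fs
    return ic_times, fo_times

# synthetic step cycle: swing (unloaded) -> stance (loaded) -> swing
area = np.array([0, 0, 5, 40, 60, 55, 30, 4, 0, 0], dtype=float)
ic, fo = detect_gait_events(area, threshold=10.0, fs=100.0)
```

In practice the threshold would be tuned per subject, and short spurious crossings would be debounced.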

  5. Model-based analysis supports interglacial refugia over long-dispersal events in the diversification of two South American cactus species.

    PubMed

    Perez, M F; Bonatelli, I A S; Moraes, E M; Carstens, B C

    2016-06-01

    Pilosocereus machrisii and P. aurisetus are cactus species within the P. aurisetus complex, a group of eight cacti that are restricted to rocky habitats within the Neotropical savannas of eastern South America. Previous studies have suggested that diversification within this complex was driven by distributional fragmentation, isolation leading to allopatric differentiation, and secondary contact among divergent lineages. These events have been associated with Quaternary climatic cycles, leading to the hypothesis that the xerophytic vegetation patches which presently harbor these populations operate as refugia during the current interglacial. However, owing to limitations of the standard phylogeographic approaches used in these studies, this hypothesis was not explicitly tested. Here we use Approximate Bayesian Computation to refine the previous inferences and test the role of different events in the diversification of two species within the P. aurisetus group. We used molecular data from chloroplast DNA and simple sequence repeats loci of P. machrisii and P. aurisetus, the two species with the broadest distribution in the complex, in order to test whether the diversification in each species was driven mostly by vicariance or by long-dispersal events. We found that both species were affected primarily by vicariance, with a refuge model as the most likely scenario for P. aurisetus and a soft vicariance scenario most probable for P. machrisii. These results emphasize the importance of distributional fragmentation in these species, and add support to the hypothesis of long-term isolation in interglacial refugia previously proposed for the P. aurisetus species complex diversification. PMID:27071846

  6. A canonical correlation analysis based method for contamination event detection in water sources.

    PubMed

    Li, Ruonan; Liu, Shuming; Smith, Kate; Che, Han

    2016-06-15

    In this study, a general framework integrating a data-driven estimation model is employed for contamination event detection in water sources. Sequential canonical correlation coefficients are updated in the model using multivariate water quality time series. The proposed method utilizes canonical correlation analysis for studying the interplay between two sets of water quality parameters. The model is assessed by precision, recall and F-measure. The proposed method is tested using data from a laboratory contaminant injection experiment. The proposed method could detect a contamination event 1 minute after the introduction of 1.600 mg l⁻¹ acrylamide solution. With optimized parameter values, the proposed method can correctly detect 97.50% of all contamination events with no false alarms. The robustness of the proposed method can be explained using the Bauer-Fike theorem. PMID:27264637
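
The core signal in this kind of detector, a drop in the canonical correlation between two blocks of water quality parameters when contamination decouples them, can be sketched with a QR/SVD-based canonical correlation analysis. This is a minimal illustration, not the paper's sequential estimator; the two-parameter blocks, the shared driver and the alarm margin are all invented.

```python
import numpy as np

def leading_canonical_corr(X, Y):
    """Largest canonical correlation between parameter sets X and Y
    (rows = time samples, columns = water quality parameters)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    # orthonormalize each block; the singular values of the product of
    # the orthonormal bases are the canonical correlations
    qx, _ = np.linalg.qr(X)
    qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return min(1.0, s[0])

rng = np.random.default_rng(0)
t = rng.normal(size=200)                      # shared driver (normal water)
X = np.c_[t + 0.1 * rng.normal(size=200), t + 0.1 * rng.normal(size=200)]
Y = np.c_[t + 0.1 * rng.normal(size=200), rng.normal(size=200)]
baseline = leading_canonical_corr(X, Y)       # high: blocks share t

Y_event = Y.copy()
Y_event[:, 0] = rng.normal(size=200)          # contamination decouples Y
disturbed = leading_canonical_corr(X, Y_event)
alarm = disturbed < baseline - 0.2            # correlation drop -> alarm
```

A deployed detector would update the coefficients sequentially over a sliding window rather than recomputing them on full blocks.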

  7. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data

  8. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    DOEpatents

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  9. Detecting, Monitoring, and Reporting Possible Adverse Drug Events Using an Arden-Syntax-based Rule Engine.

    PubMed

    Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2015-01-01

    The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are some of the most relevant ADEs in internal medical and geriatric wards. For each patient, ADE risk scores for all four situations are calculated, compared against a threshold, and judged to be monitored, or reported. A ward-based cockpit view summarizes the results. PMID:26262252

  10. Comparison of pointwise and regional statistical approaches to detect non-stationarity in extreme rainfall events. Application to the Sahelian region

    NASA Astrophysics Data System (ADS)

    Panthou, G.; Vischel, T.; Lebel, T.; Quantin, G.; Favre, A.; Blanchet, J.; Ali, A.

    2012-12-01

    Studying trends in rainfall extremes at regional scale is required to provide a reference climatology against which to evaluate General Circulation Model predictions, as well as to help manage and design hydraulic works. The present study compares three methods to detect trends (linear and change-point) in series of annual maximum daily rainfall: (i) the first, widely used approach applies statistical stationarity tests (linear trend and change-point) to the point-wise maxima series; (ii) the second approach compares the performance of a constant and a time-dependent Generalized Extreme Value (GEV) distribution fitted to the point-wise maxima series; (iii) the last method uses an original regional statistical model based on a space-time GEV distribution, which detects changes in rainfall extremes directly at regional scale. The three methods are applied to detect trends in extreme daily rainfall over the Sahel during the period 1950-1990, for which a network of 128 daily rain gauges is available. This region has experienced an intense drought since the end of the 1960s; it is thus an interesting case study to illustrate how a regional climate change can affect extreme rainfall distributions. One major result is that the statistical stationarity tests rarely detect non-stationarities in the series, while the two GEV-based models converge to show that the extreme rainfall series have a negative break point around 1970. The study points out the limits of the widely used classical stationarity tests for detecting trends in noisy series affected by sampling errors. The use of a parametric time-dependent GEV seems to reduce this effect, especially when a regional approach is used. From a climatological point of view, the results show that the great Sahelian drought has been accompanied by a decrease in extreme rainfall events, both in magnitude and occurrence.

  11. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
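
The residual-based detection idea, flagging a fault when the filter's innovation exceeds a statistical gate, can be sketched with a drastically simplified scalar Kalman filter. This is not the MBFTC implementation (which uses an Extended Kalman Filter on a full engine model); the random-walk state model, the noise variances and the 4-sigma gate here are invented for illustration.

```python
import numpy as np

def kalman_residual_faults(z, q=1e-4, r=0.04, gate=4.0):
    """Flag samples whose Kalman innovation exceeds `gate` standard
    deviations: a simplified residual-based fault detector for a
    slowly varying scalar sensor (random-walk state model)."""
    x, p = z[0], 1.0
    flags = []
    for zk in z[1:]:
        p = p + q                      # predict (state is a random walk)
        s = p + r                      # innovation variance
        nu = zk - x                    # innovation (residual)
        flags.append(abs(nu) / np.sqrt(s) > gate)
        k = p / s                      # Kalman gain
        x = x + k * nu                 # update state estimate
        p = (1.0 - k) * p
    return np.array(flags)

rng = np.random.default_rng(1)
z = 5.0 + 0.2 * rng.normal(size=100)   # healthy sensor reading around 5.0
z[60] += 3.0                           # injected sensor spike fault
flags = kalman_residual_faults(z)      # flags[59] corresponds to z[60]
```

A multiple-hypothesis scheme would run one such filter per fault hypothesis and compare their residual likelihoods rather than thresholding a single residual.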

  12. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We prove the efficiency of a new method for the detection of crucial events that might have useful applications to the war against terrorism. This has to do with the search for rare but significant events, a theme of research made extremely important by the tragedy of September 11. This method is applied here to defining the statistics of seismic main-shocks, as done in cond-mat/0212529. The emphasis here is more on the conceptual issues behind the results obtained in cond-mat/0212529 than on geophysics. This discussion suggests that the method has a wider range of validity. We support this general discussion with a dynamic model originally proposed in cond-mat/0107597 for purposes different from geophysical applications. However, it is a case where the crucial events to detect are under our control, thereby making it possible for us to check the accuracy of the method of detection of invisible and crucial events that we propose here for a general purpose, including the war against terrorism. For this model an analytical treatment has recently been found [cond-mat/0209038], supporting the claims that we make in this paper for the accuracy of the method of detection. For the reader's convenience, the results on the seismic fluctuations are suitably reviewed and discussed in the light of the more general perspective of this paper. We also review the model for seismic fluctuations proposed in the earlier work of cond-mat/0212529. This model shares with the model of cond-mat/0107597 the property that the crucial events are embedded in a sea of secondary events, but it allows us to reveal with accuracy the statistics of the crucial events for different mathematical reasons.

  13. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. PMID:26905801

  14. Model-based fault detection and isolation for intermittently active faults with application to motion-based thruster fault detection and isolation for spacecraft

    NASA Technical Reports Server (NTRS)

    Wilson, Edward (Inventor)

    2008-01-01

    The present invention is a method for detecting and isolating fault modes in a system having a model describing its behavior and regularly sampled measurements. The models are used to calculate past and present deviations from measurements that would result with no faults present, as well as with one or more potential fault modes present. Algorithms that calculate and store these deviations, along with memory of when said faults, if present, would have an effect on the said actual measurements, are used to detect when a fault is present. Related algorithms are used to exonerate false fault modes and finally to isolate the true fault mode. This invention is presented with application to detection and isolation of thruster faults for a thruster-controlled spacecraft. As a supporting aspect of the invention, a novel, effective, and efficient filtering method for estimating the derivative of a noisy signal is presented.

  15. Development of an algorithm for automatic detection and rating of squeak and rattle events

    NASA Astrophysics Data System (ADS)

    Chandrika, Unnikrishnan Kuttan; Kim, Jay H.

    2010-10-01

    A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL) that approximates the human perception of a transient noise. First, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components to take advantage of the high signal-to-noise ratio in the high frequency range. A S&R event is identified when the value of the PTL time history exceeds a detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and minimum possibility of false alarm.

  16. Fast and robust microseismic event detection using very fast simulated annealing

    NASA Astrophysics Data System (ADS)

    Velis, Danilo R.; Sabbione, Juan I.; Sacchi, Mauricio D.

    2013-04-01

    The study of microseismic data has become an essential tool in many geoscience fields, including oil reservoir geophysics, mining and CO2 sequestration. In hydraulic fracturing, microseismicity studies permit the characterization and monitoring of the reservoir dynamics in order to optimize the production and the fluid injection process itself. As the number of events is usually large and the signal-to-noise ratio is in general very low, fast, automated, and robust detection algorithms are required for most applications. Also, real-time functionality is commonly needed to control the fluid injection in the field. Generally, events are located by means of grid search algorithms that rely on some approximate velocity model. These techniques are very effective and accurate, but computationally intensive when dealing with large three or four-dimensional grids. Here, we present a fast and robust method that allows to automatically detect and pick an event in 3C microseismic data without any input information about the velocity model. The detection is carried out by means of a very fast simulated annealing (VFSA) algorithm. To this end, we define an objective function that measures the energy of a potential microseismic event along the multichannel signal. This objective function is based on the stacked energy of the envelope of the signals calculated within a predefined narrow time window that depends on the source position, receivers geometry and velocity. Once an event has been detected, the source location can be estimated, in a second stage, by inverting the corresponding traveltimes using a standard technique, which would naturally require some knowledge of the velocity model. Since the proposed technique focuses on the detection of the microseismic events only, the velocity model is not required, leading to a fast algorithm that carries out the detection in real-time. Besides, the strategy is applicable to data with very low signal-to-noise ratios, for it relies
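
The core of such a search, annealing over candidate source parameters to maximize stacked envelope energy, can be sketched in a much-reduced form. This is an illustrative reconstruction, not the authors' VFSA implementation: it assumes a line of receivers, a homogeneous velocity, single-component traces, a rectified-amplitude sum in place of a true envelope, and anneals over horizontal position and origin time only; all numerical values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
v, fs = 2000.0, 1000.0                       # velocity (m/s), sampling (Hz)
rx = np.array([0.0, 400.0, 800.0, 1200.0])  # receiver positions (m)
true_x, true_t0 = 500.0, 0.2                # hidden event position / time

n = 1000
traces = 0.05 * rng.normal(size=(len(rx), n))        # background noise
for i, r in enumerate(rx):                           # add weak arrivals
    k = int((true_t0 + abs(true_x - r) / v) * fs)
    traces[i, k:k + 20] += 0.3 * np.hanning(20)

def stacked_energy(x, t0, win=20):
    """Objective: rectified-amplitude sum of the moveout-aligned traces."""
    e = 0.0
    for i, r in enumerate(rx):
        k = int((t0 + abs(x - r) / v) * fs)
        if 0 <= k and k + win <= n:
            e += np.sum(np.abs(traces[i, k:k + win]))
    return e

# simulated annealing over (x, t0) with exponential cooling
x, t0 = 600.0, 0.1
best = (stacked_energy(x, t0), x, t0)
T = 1.0
for it in range(3000):
    T *= 0.998
    xn = x + T * 200.0 * rng.standard_normal()       # shrinking steps
    tn = t0 + T * 0.1 * rng.standard_normal()
    dE = stacked_energy(xn, tn) - stacked_energy(x, t0)
    if dE > 0 or rng.random() < np.exp(dE / max(T, 1e-6)):
        x, t0 = xn, tn                               # Metropolis accept
        e = stacked_energy(x, t0)
        if e > best[0]:
            best = (e, x, t0)
```

The objective is sharply peaked when all four traces align on the arrivals, which is why a global stochastic search such as VFSA is used instead of gradient descent.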

  17. Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.

    2014-12-01

    The well-studied Wenchuan aftershock sequence, triggered by the May 12, 2008, Ms 8.0 mainshock, offers an ideal test case for evaluating the effectiveness of waveform correlation coupled with double-difference relocation to detect and locate events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then take the detections and relocate them using the double-difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one with waveform correlation to find even more events. We benchmark our results against previously published relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events with precision at the few-hundred-meter and even tens-of-meters level.

  18. Testing the ability of different seismic detections approaches to monitor aftershocks following a moderate magnitude event.

    NASA Astrophysics Data System (ADS)

    Romero, Paula; Díaz, Jordi; Ruiz, Mario; Cantavella, Juan Vicente; Gomez-García, Clara

    2016-04-01

    The detection and picking of seismic events is a permanent concern for seismic surveying, in particular when dealing with aftershocks of moderate magnitude events. Many efforts have been made to find the balance between computational efficiency and the robustness of the detection methods. In this work, data recorded by a high-density seismic network deployed following a magnitude 5.2 event located close to Albacete, SE Spain, is used to test the ability of classical and recently proposed detection methodologies. Two days after the main shock, which occurred on February 23rd, a network formed by 11 stations from ICTJA-CSIC and 2 stations from IGN was deployed over the region, with inter-station distances ranging between 5 and 10 km. The network remained in operation until April 6th, 2015 and allowed us to manually identify up to 552 events with magnitudes from 0.2 to 3.5, located in an area of just 25 km² inside the network limits. The detection methods studied here are the classical STA/LTA, a power spectral method, a detector based on Benford's law, and a waveform similarity method. The STA/LTA method, based on the comparison of background noise and seismic signal amplitudes, is taken as a reference to evaluate the results arising from the other approaches. The power spectral density method is based on the inspection of the characteristic frequency pattern associated with seismic events. The Benford's law detector analyses the distribution of the first digit of displacement counts in the histogram of a seismic waveform, considering that only the windows containing seismic wave arrivals will match the logarithmic law. Finally, the waveform similarity method is based on the analysis of the normalized waveform amplitude, detecting those events with waveforms similar to a previously defined master event. The aim of this contribution is to inspect the ability of the different approaches to accurately detect the aftershock events for this kind of seismic crisis and to
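
The STA/LTA reference method compares a short-term and a long-term moving average of signal energy. A textbook sketch, not the authors' implementation (window lengths, threshold and synthetic data are arbitrary illustrative choices):

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    """Classic STA/LTA characteristic function: ratio of a short-term
    to a long-term moving average of the squared trace, computed with
    cumulative sums and aligned to the end of each window."""
    e = trace ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[ns:] - csum[:-ns]) / ns          # short-term average
    lta = (csum[nl:] - csum[:-nl]) / nl          # long-term average
    m = min(len(sta), len(lta))
    return sta[-m:] / (lta[-m:] + 1e-12)

rng = np.random.default_rng(3)
fs = 100.0
trace = 0.1 * rng.normal(size=3000)              # background noise
trace[1500:1550] += rng.normal(size=50)          # burst: an "aftershock"
ratio = sta_lta(trace, fs)
triggered = np.any(ratio > 5.0)                  # typical trigger threshold
```

Production pickers (e.g. recursive STA/LTA with separate on/off thresholds) add hysteresis so a single trigger covers the whole event.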

  19. Detecting Continuity Violations in Infancy: A New Account and New Evidence from Covering and Tube Events

    ERIC Educational Resources Information Center

    Wang, S.h.; Baillargeon, R.; Paterson, S.

    2005-01-01

    Recent research on infants' responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age e.g. Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants' reasoning about when objects should and should not be occluded. Cognitive Psychology 39, 116-157; Hespos, S.…

  20. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.
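
The predictive state-assignment step, modeling each learned event state as a multidimensional Gaussian and assigning new windows to the most likely state, can be illustrated in isolation. This sketch does not implement the paper's Bayesian nonparametric Markov switching process; the two hand-made states, the toy features and their cluster positions are invented.

```python
import numpy as np

class GaussianState:
    """One event state modeled as a multivariate Gaussian over
    per-window iEEG features (e.g. line length, band power)."""
    def __init__(self, features):
        self.mu = features.mean(0)
        self.cov = np.cov(features.T) + 1e-6 * np.eye(features.shape[1])
        self.inv = np.linalg.inv(self.cov)
        _, logdet = np.linalg.slogdet(self.cov)
        self.const = -0.5 * (logdet + features.shape[1] * np.log(2 * np.pi))

    def loglik(self, x):
        d = x - self.mu
        return self.const - 0.5 * d @ self.inv @ d

rng = np.random.default_rng(4)
# toy feature clusters standing in for states learned upstream:
# interictal background vs. seizure-onset windows
interictal = rng.normal([1.0, 1.0], 0.3, size=(200, 2))
onset = rng.normal([4.0, 5.0], 0.5, size=(200, 2))
states = {"interictal": GaussianState(interictal),
          "onset": GaussianState(onset)}

def assign(x):
    """Predictive state assignment: pick the most likely event state."""
    return max(states, key=lambda s: states[s].loglik(x))

label = assign(np.array([3.8, 4.9]))   # a window near the onset cluster
```

In the full method the state inventory and transition structure come from the Markov switching model, and only states specific to the seizure onset zone trigger detections.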

  1. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field.

    PubMed

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes' coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672
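
What the critical points of a scalar field are can be illustrated with a centralized, grid-based sketch. This is only a reference point, not the paper's algorithm, which is decentralized and coordinate-free; the test field is invented, and passes/saddles (which need a sign-change test around the neighborhood) are omitted.

```python
import numpy as np

def peaks_and_pits(field):
    """Locate peaks (local maxima) and pits (local minima) of a scalar
    field on a grid by comparing each interior cell with its 8
    neighbours. Passes/saddles are omitted for brevity."""
    peaks, pits = [], []
    rows, cols = field.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            c = field[i, j]
            nb = field[i-1:i+2, j-1:j+2].ravel()
            others = np.delete(nb, 4)          # drop the center cell
            if np.all(c > others):
                peaks.append((i, j))
            elif np.all(c < others):
                pits.append((i, j))
    return peaks, pits

# one bump and one dip: exactly one peak and one pit expected
x, y = np.meshgrid(np.linspace(-2, 2, 21), np.linspace(-2, 2, 21))
field = np.exp(-((x - 1) ** 2 + y ** 2)) - np.exp(-((x + 1) ** 2 + y ** 2))
peaks, pits = peaks_and_pits(field)
```

The decentralized version reaches the same classification by having each node compare its value only with its network neighbors' values.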

  2. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field

    PubMed Central

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes’ coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672

  3. Feature selection of seismic waveforms for long period event detection at Cotopaxi Volcano

    NASA Astrophysics Data System (ADS)

    Lara-Cueva, R. A.; Benítez, D. S.; Carrera, E. V.; Ruiz, M.; Rojo-Álvarez, J. L.

    2016-04-01

    Volcano Early Warning Systems (VEWS) have become a research topic in order to prevent the loss of human lives and material damage. In this setting, event detection criteria based on classification using machine learning techniques have proven useful, and a number of systems have been proposed in the literature. However, to the best of our knowledge, no comprehensive and principled study has been conducted to compare the influence of the many different sets of possible features that have been used as input spaces in previous works. We present an automatic recognition system for volcano seismicity that comprises feature extraction, event classification, and subsequent event detection, in order to reduce processing time as a first step towards a highly reliable real-time automatic detection system. We compiled and extracted a comprehensive set of temporal, moving-average, spectral, and scale-domain features for separating long-period seismic events from background noise. We benchmarked two usual kinds of feature selection techniques, namely filter (mutual information and statistical dependence) and embedded (cross-validation and pruning), each of them using suitable classification algorithms such as k-Nearest Neighbors (k-NN) and Decision Trees (DT). We applied this approach to the seismicity recorded at Cotopaxi Volcano in Ecuador during 2009 and 2010. The best results were obtained by using a 15 s segmentation window, a feature matrix in the frequency domain, and the DT classifier, yielding 99% detection accuracy and sensitivity. Selected features and their interpretation were consistent among different input spaces, in simple terms of amplitude and spectral content. Our study provides the framework for an event detection system with high accuracy and reduced computational requirements.
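
The pipeline of windowing, frequency-domain feature extraction and classification can be sketched end to end. This is an illustrative toy, not the paper's system: the band edges, window length, synthetic signals and the from-scratch k-NN (used here instead of a decision tree for self-containment) are all invented.

```python
import numpy as np

def spectral_features(window, fs):
    """Toy frequency-domain features for one segmentation window:
    fraction of spectral energy in three coarse bands (Hz)."""
    spec = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), 1 / fs)
    total = spec.sum() + 1e-12
    bands = [(0.5, 2.0), (2.0, 8.0), (8.0, 16.0)]   # illustrative edges
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum() / total
                     for lo, hi in bands])

def knn_predict(x, X_train, y_train, k=3):
    """Minimal k-NN: majority vote among the k nearest training windows."""
    d = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

rng = np.random.default_rng(5)
fs, n = 100.0, 1500                       # 15 s windows at 100 Hz
t = np.arange(n) / fs

def lp_event():   # long-period event: low-frequency wave packet in noise
    return np.sin(2 * np.pi * 1.2 * t) * np.hanning(n) + 0.2 * rng.normal(size=n)

def noise():      # background noise window
    return rng.normal(size=n)

X = np.array([spectral_features(lp_event(), fs) for _ in range(20)]
             + [spectral_features(noise(), fs) for _ in range(20)])
y = np.array([1] * 20 + [0] * 20)         # 1 = LP event, 0 = background
pred = knn_predict(spectral_features(lp_event(), fs), X, y)
```

Feature selection would then rank these band-energy features (e.g. by mutual information with the label) before training the final classifier.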

  4. Detection, tracking and event localization of interesting features in 4-D atmospheric data

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2011-11-01

We introduce a novel algorithm for the efficient detection and tracking of interesting features in spatio-temporal atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. The algorithm is based on the well-known region growing segmentation method. We extended the basic idea towards the analysis of the complete 4-D dataset, identifying segments representing the spatial features and their development over time. Each segment consists of one set of distinct 3-D features per time step. The algorithm keeps track of the successors of each 3-D feature, constructing the so-called event graph of each segment. The precise localization of the splitting events is based on a search for all grid points inside the initial 3-D feature which have a similar distance to all successive 3-D features of the next time step. Merging events are localized analogously, with the direction of time inverted. We tested the implementation on a four-dimensional field of wind speed data from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses and computed a climatology of upper-tropospheric jet streams and their events. We compare our results with a previous climatology, investigate the statistical distribution of the merging and splitting events, and illustrate the meteorological significance of the jet splitting events with a case study. A brief outlook is given on additional potential applications of the 4-D data segmentation technique.
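The splitting-event localization described above searches the initial 3-D feature for grid points with similar distances to the successor features of the next time step. A simplified 2-D sketch of that idea, using hypothetical point sets rather than actual wind-field segments:

```python
import numpy as np

def locate_split(parent_pts, child_a, child_b, tol=1.0):
    """Grid points of the initial feature that are roughly equidistant
    from the two successor features of the next time step."""
    def dist_to(pts, feat):
        # distance from each point to the nearest grid point of a feature
        return np.linalg.norm(pts[:, None, :] - feat[None, :, :], axis=2).min(axis=1)
    da = dist_to(parent_pts, child_a)
    db = dist_to(parent_pts, child_b)
    return parent_pts[np.abs(da - db) < tol]

parent = np.array([[x, 0.0] for x in range(11)])           # elongated parent feature
child_a = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])   # left remnant at t+1
child_b = np.array([[8.0, 0.0], [9.0, 0.0], [10.0, 0.0]])  # right remnant at t+1
split_pts = locate_split(parent, child_a, child_b)
```

With these toy features the equidistant set collapses to the midpoint of the parent, which is where the split would be localized; merging events would reuse the same routine with the roles of the time steps swapped.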

  5. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue.
Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
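The detection step described above amounts to sliding a parent-event template along the continuous record and flagging windows whose normalized cross-correlation exceeds a threshold. A single-channel sketch with synthetic signals (the actual MFA stacks such correlations over multiple channels):

```python
import numpy as np

def matched_filter(trace, template, threshold=0.8):
    """Sample indices where the normalized cross-correlation of a
    parent-event template with the continuous trace exceeds threshold."""
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    hits = []
    for i in range(len(trace) - m + 1):
        w = trace[i:i + m]
        s = w.std()
        # dot product below equals the Pearson correlation in [-1, 1]
        if s > 0 and np.dot(t, (w - w.mean()) / s) > threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(1)
m = 50
template = np.sin(2 * np.pi * 4 * np.arange(m) / m) * np.hanning(m)
trace = 0.01 * rng.standard_normal(400)    # background noise
trace[100:100 + m] += template             # hidden 'child' event
detections = matched_filter(trace, template)
```

Because the statistic is a correlation coefficient, the same threshold works regardless of the child event's amplitude, which is why relative magnitudes are estimated separately from the envelope ratio.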

  6. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-05-01

A new Matched Filtering Algorithm (MFA) is proposed for detecting and analyzing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multi-channel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multi-component waveforms into the ray-centered co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, i.e. microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavorable S/N conditions. A real-data example using microseismic monitoring data from 4 stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalog.
Moreover, analysis of the new MFA catalog suggests that this approach leads to more robust interpretation of the

  7. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. Different subfields of information technology have also attracted much attention from researchers and practitioners seeking to design systems that can detect the key members actually responsible for such events. In this paper, we present a novel method to predict key players from a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674
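The per-node centrality measures that feed the hybrid classifier can be sketched on a toy graph. The network below and the choice of two measures (degree, and eigenvector centrality via power iteration) are illustrative; the paper's actual measures and classifier are not reproduced here:

```python
import numpy as np

# Toy covert network: node 2 brokers every connection between two cells
edges = [(0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (4, 5)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0        # undirected adjacency matrix

degree = A.sum(axis=1)             # degree centrality (unnormalized)

v = np.ones(n)                     # eigenvector centrality by power iteration
for _ in range(100):
    v = A @ v
    v /= np.linalg.norm(v)

key_player = int(np.argmax(degree))
```

In a full pipeline each node's vector of centrality scores would become the feature vector handed to the classifier.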

  8. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection

    PubMed Central

    Olson, Sarah H.; Benedum, Corey M.; Mekaru, Sumiko R.; Preston, Nicholas D.; Mazet, Jonna A.K.; Joly, Damien O.

    2015-01-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  9. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.

    PubMed

    Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S

    2015-08-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  10. A novel adaptive, real-time algorithm to detect gait events from wearable sensors.

    PubMed

    Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona

    2015-05-01

A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GAITRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait training and/or assistive devices. PMID:25069118
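The detection rules above reduce to locating extrema of the shank signals: EC at the minimum and MS at the maximum of the angular velocity (IC uses the flexion/extension angle analogously). A sketch on an idealized sinusoidal angular-velocity trace; real data would first be calibrated and filtered, and a causal version would confirm extrema sample by sample:

```python
import numpy as np

def local_extrema(x):
    """Indices of strict local minima and maxima of a 1-D signal."""
    d = np.diff(x)
    minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
    maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
    return minima, maxima

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
gyro = np.sin(2 * np.pi * 1.0 * t)          # idealized 1 Hz shank angular velocity
ec_idx, ms_idx = local_extrema(gyro)        # EC at minima, MS at maxima
```

The step-by-step update stage of the published algorithm would then adapt the search windows for these extrema to the subject's current cadence.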

  11. Sparse conditional mixture model: late fusion with missing scores for multimedia event detection

    NASA Astrophysics Data System (ADS)

    Nallapati, Ramesh; Yeh, Eric; Myers, Gregory

    2013-03-01

The problem of event detection in multimedia clips is typically handled by modeling each of the component modalities independently, then combining their detection scores in a late fusion approach. One of the problems of a late fusion model in the multimedia setting is that the detection scores may be missing from one or more components for a given clip, e.g., when there is no speech or no overlay text in the clip. Standard fusion techniques typically address this problem by assuming a default backoff score for a component when its detection score is missing for a clip. This may potentially bias the fusion model, especially if there are many missing detections from a given component. In this work, we present the Sparse Conditional Mixture Model (SCMM), which models only the observed detection scores for each example, thereby avoiding the assumptions about the score distributions that backoff models make. Our experiments in multimedia event detection using the TRECVID-2011 corpus demonstrate that SCMM achieves statistically significant performance gains over standard late fusion techniques. The SCMM model is very general and is applicable to fusion problems with missing data in any domain.
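The bias SCMM avoids can be illustrated with a much simpler fusion rule that likewise ignores missing components: fuse only the observed scores and renormalize the weights, instead of substituting a backoff value. This weighted average is a stand-in for exposition, not the actual mixture model:

```python
import numpy as np

def fuse_observed(scores, weights):
    """Late fusion over the observed (non-NaN) component scores only,
    renormalizing the weights rather than imputing a backoff score."""
    scores = np.asarray(scores, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mask = ~np.isnan(scores)
    if not mask.any():
        return float("nan")
    w = weights[mask] / weights[mask].sum()
    return float(np.dot(w, scores[mask]))

weights = [0.5, 0.3, 0.2]                             # audio, visual, overlay-text
full = fuse_observed([0.9, 0.7, 0.8], weights)
partial = fuse_observed([0.9, 0.7, np.nan], weights)  # clip with no overlay text
```

A backoff model would instead fill the missing slot with a fixed default (say 0.5), dragging the fused score toward that constant whenever a component is absent.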

  12. Event Detection and Location of Earthquakes Using the Cascadia Initiative Dataset

    NASA Astrophysics Data System (ADS)

    Morton, E.; Bilek, S. L.; Rowe, C. A.

    2015-12-01

    The Cascadia subduction zone (CSZ) produces a range of slip behavior along the plate boundary megathrust, from great earthquakes to episodic slow slip and tremor (ETS). Unlike other subduction zones that produce great earthquakes and ETS, the CSZ is notable for the lack of small and moderate magnitude earthquakes recorded. The seismogenic zone extent is currently estimated to be primarily offshore, thus the lack of observed small, interplate earthquakes may be partially due to the use of only land seismometers. The Cascadia Initiative (CI) community seismic experiment seeks to address this issue by including ocean bottom seismometers (OBS) deployed directly over the locked seismogenic zone, in addition to land seismometers. We use these seismic data to explore whether small magnitude earthquakes are occurring on the plate interface, but have gone undetected by the land-based seismic networks. We select a subset of small magnitude (M0.1-3.7) earthquakes from existing earthquake catalogs, based on land seismic data, whose preliminary hypocentral locations suggest they may have occurred on the plate interface. We window the waveforms on CI OBS and land seismometers around the phase arrival times for these earthquakes to generate templates for subspace detection, which allows for additional flexibility over traditional matched filter detection methods. Here we present event detections from the first year of CI deployment and preliminary locations for the detected events. Initial results of scanning the first year of the CI deployment using one cluster of template events, located near a previously identified subducted seamount, include 473 detections on OBS station M08A (~61.6 km offshore) and 710 detections on OBS station J25A (~44.8 km northeast of M08A). Ongoing efforts include detection using additional OBS stations along the margin, as well as determining locations of clusters detected in the first year of deployment.
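The subspace detection mentioned above generalizes matched filtering: instead of correlating against one template, data windows are projected onto a low-rank basis spanned by several aligned templates, which provides the extra flexibility the abstract refers to. A minimal sketch with synthetic waveforms (not CI data):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 64
base = np.sin(2 * np.pi * 4 * np.arange(m) / m) * np.hanning(m)
# Aligned template events: perturbed repetitions of one source
templates = np.stack([base + 0.05 * rng.standard_normal(m) for _ in range(5)])

# Signal subspace: leading left singular vectors of the template matrix
U, _, _ = np.linalg.svd(templates.T, full_matrices=False)
basis = U[:, :2]

def subspace_stat(window, basis):
    """Fraction of window energy captured by the signal subspace."""
    w = window / np.linalg.norm(window)
    return float(np.sum((basis.T @ w) ** 2))

event_stat = subspace_stat(base + 0.05 * rng.standard_normal(m), basis)
noise_stat = subspace_stat(rng.standard_normal(m), basis)
```

Windows resembling any combination of the templates score near 1, while noise scores near the subspace dimension divided by the window length; thresholding this statistic over a sliding window yields the detections.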

  13. The development of a temporal-BRDF model-based approach to change detection, an application to the identification and delineation of fire affected areas

    NASA Astrophysics Data System (ADS)

    Rebelo, Lisa-Maria

Although large areas of southern Africa burn every year, minimal information is available relating to the fire regimes of this area. This study develops a new, generic approach to change detection, applicable to the identification of land cover change from high temporal and moderate spatial resolution satellite data. Traditional change detection techniques have several key limitations which are identified and addressed in this work. In particular these approaches fail to account for directional effects in the remote sensing signal introduced by variations in the solar and sensing geometry, and are sensitive to underlying phenological changes in the surface as well as noise in the data due to cloud or atmospheric contamination. This research develops a bi-directional, model-based change detection algorithm. An empirical temporal component is incorporated into a semi-empirical linear BRDF model. This may be fitted to a long time series of reflectance with less sensitivity to the presence of underlying phenological change. Outliers are identified based on an estimation of noise in the data and the calculation of uncertainty in the model parameters and are removed from the sequence. A "step function kernel" is incorporated into the formulation in order to explicitly detect sudden step-like changes in the surface reflectance induced by burning. The change detection model is applied to the problem of locating and mapping fire affected areas from daily moderate spatial resolution satellite data, and an indicator of burn severity is introduced. Monthly burned area datasets for a 2400 km by 1200 km area of southern Africa detailing the day and severity of burning are created for a five-year period (2000-2004). These data are analysed and the fire regimes of southern African ecosystems during this time are characterised.
The results highlight the extent of the burning which is taking place within southern Africa, with between 27-32% of the study area burning during each
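The "step function kernel" above can be illustrated as an extra regressor in a linear temporal model: candidate burn days are scanned, and the day whose step term best reduces the residual marks the detected change. A toy version with synthetic reflectance; the coefficients, noise level, and dates are invented, and the real model also carries BRDF kernel terms omitted here:

```python
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(60, dtype=float)
true_day = 33                               # synthetic burn date
refl = (0.30 + 0.001 * days                 # slow phenological trend
        - 0.12 * (days >= true_day)         # step drop caused by burning
        + 0.01 * rng.standard_normal(days.size))

best = None
for day in range(5, 55):                    # scan candidate burn days
    X = np.column_stack([np.ones_like(days), days, (days >= day).astype(float)])
    coef, res, _, _ = np.linalg.lstsq(X, refl, rcond=None)
    rss = res[0] if res.size else np.sum((refl - X @ coef) ** 2)
    if best is None or rss < best[0]:
        best = (rss, day, coef[2])
_, burn_day, step_size = best
```

The magnitude of the fitted step coefficient is one natural candidate for the burn-severity indicator the abstract mentions.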

  14. Event-specific quantitative detection of nine genetically modified maizes using one novel standard reference molecule.

    PubMed

    Yang, Litao; Guo, Jinchao; Pan, Aihu; Zhang, Haibo; Zhang, Kewei; Wang, Zhengming; Zhang, Dabing

    2007-01-10

With the development of genetically modified organism (GMO) detection techniques, the Polymerase Chain Reaction (PCR) technique has become the mainstay of GMO detection, and real-time PCR is the most effective and important method for GMO quantification. An event-specific detection strategy, based on the unique and specific integration junction sequences between the host plant genome DNA and the integrated gene, has been developed because of its high specificity. This study establishes event-specific detection methods for TC1507 and CBH351 maizes. In addition, the event-specific TaqMan real-time PCR detection methods for another seven GM maize events (Bt11, Bt176, GA21, MON810, MON863, NK603, and T25) were systematically optimized and developed. In these PCR assays, the fluorescent quencher, TAMRA, was attached to an internal T base of the probe to improve the intensity of the fluorescent signal. To overcome the difficulty of obtaining certified reference materials for these GM maizes, one novel standard reference molecule containing all nine specific integration junction sequences of these GM maizes and the maize endogenous reference gene, zSSIIb, was constructed and used for quantitative analysis. The limits of detection of these methods were 20 copies for the different GM maizes, the limits of quantitation were about 20 copies, and the dynamic ranges for quantification were from 0.05 to 100% in 100 ng of DNA template. Furthermore, nine groups of mixed maize samples of these nine GM maize events were quantitatively analyzed to evaluate accuracy and precision. The accuracy, expressed as bias, varied from 0.67 to 28.00% for the nine tested groups of GM maize samples, and the precision, expressed as relative standard deviation, was from 0.83 to 26.20%.
All of these indicated that the established event-specific real-time PCR detection systems and the reference molecule in this study are suitable for the identification and quantification of these GM
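Quantification by real-time PCR rests on a standard curve relating the cycle threshold (Ct) to log10 copy number, and GM content follows from the ratio of event-specific copies to endogenous-gene (zSSIIb) copies. A sketch with a hypothetical curve; a slope of -3.32 corresponds to 100% amplification efficiency, and the intercept and Ct values are illustrative, not this study's calibration:

```python
def copies_from_ct(ct, slope=-3.32, intercept=37.0):
    """Copy number from a Ct value via the standard curve
    Ct = intercept + slope * log10(copies)."""
    return 10 ** ((ct - intercept) / slope)

def gmo_percent(ct_event, ct_reference, **curve):
    """GM content as event copies relative to endogenous-gene copies."""
    return 100.0 * copies_from_ct(ct_event, **curve) / copies_from_ct(ct_reference, **curve)

event_copies = copies_from_ct(30.36)   # ~100 copies of the junction target
content = gmo_percent(30.36, 23.72)    # vs ~10,000 zSSIIb copies -> 1% GM
```

The single reference molecule described in the abstract supplies the calibration points for both curves, which is what makes it a substitute for separate certified reference materials.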

  15. Detection, tracking and event localization of jet stream features in 4-D atmospheric data

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2012-04-01

    We introduce a novel algorithm for the efficient detection and tracking of features in spatiotemporal atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. The algorithm works on data given on a four-dimensional structured grid. Feature selection and clustering are based on adjustable local and global criteria, feature tracking is predominantly based on spatial overlaps of the feature's full volumes. The resulting 3-D features and the identified correspondences between features of consecutive time steps are represented as the nodes and edges of a directed acyclic graph, the event graph. Merging and splitting events appear in the event graph as nodes with multiple incoming or outgoing edges, respectively. The precise localization of the splitting events is based on a search for all grid points inside the initial 3-D feature that have a similar distance to two successive 3-D features of the next time step. The merging event is localized analogously, operating backward in time. As a first application of our method we present a climatology of upper-tropospheric jet streams and their events, based on four-dimensional wind speed data from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. We compare our results with a climatology from a previous study, investigate the statistical distribution of the merging and splitting events, and illustrate the meteorological significance of the jet splitting events with a case study. A brief outlook is given on additional potential applications of the 4-D data segmentation technique.
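The overlap-based tracking above can be sketched by representing each 3-D feature as the set of grid cells it occupies and drawing an event-graph edge wherever consecutive features overlap; a splitting event then appears as a node with multiple outgoing edges. The toy features below are illustrative, not actual wind-speed segments:

```python
# Each 3-D feature represented by its set of occupied grid-cell indices
t0 = {"A": {1, 2, 3, 4, 5, 6}}          # one jet feature at time t
t1 = {"B": {1, 2}, "C": {5, 6, 7}}      # two features at time t+1

def correspondences(prev, curr, min_overlap=1):
    """Event-graph edges: pairs of consecutive features whose volumes overlap."""
    return [(p, c) for p, pcells in prev.items()
                   for c, ccells in curr.items()
                   if len(pcells & ccells) >= min_overlap]

edges = correspondences(t0, t1)
out_degree = {}
for p, _ in edges:
    out_degree[p] = out_degree.get(p, 0) + 1
splitting = sorted(p for p, d in out_degree.items() if d > 1)
```

Merging events are the mirror image: nodes with multiple incoming edges, found by counting in-degrees over the same edge list.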

  16. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database.

    PubMed

    Soukavong, Mick; Kim, Jungmee; Park, Kyounghoon; Yang, Bo Ram; Lee, Joongyub; Jin, Xue Mei; Park, Byung Joo

    2016-09-01

We conducted pharmacovigilance data mining for the β-lactam antibiotic amoxicillin and compared the adverse events (AEs) with the drug labels of nine countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, covering December 1988 to June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 were detected as amoxicillin signals. Comparing the drug labels of the nine countries, 12 AEs, including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms, were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the nine countries. These signals should be followed by evaluation of causal association, clinical significance, and preventability. PMID:27510377
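The three disproportionality indices above are all computed from the 2x2 contingency table of spontaneous reports. A sketch with hypothetical counts; the signal thresholds shown are common pharmacovigilance conventions and not necessarily the criteria of this study:

```python
import math

def disproportionality(a, b, c, d):
    """PRR, ROR, and IC for one drug-AE pair from the 2x2 report table:
    a = drug & AE, b = drug & other AEs, c = other drugs & AE, d = rest."""
    n = a + b + c + d
    prr = (a / (a + b)) / (c / (c + d))        # proportional reporting ratio
    ror = (a * d) / (b * c)                    # reporting odds ratio
    ic = math.log2(a * n / ((a + b) * (a + c)))  # information component (point estimate)
    return prr, ror, ic

prr, ror, ic = disproportionality(a=20, b=480, c=100, d=19400)
is_signal = prr > 2 and ror > 1 and ic > 0     # example thresholds
```

Requiring all three indices to fire, as in the study, makes the screen more conservative than any single index alone.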

  17. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database

    PubMed Central

    2016-01-01

We conducted pharmacovigilance data mining for the β-lactam antibiotic amoxicillin and compared the adverse events (AEs) with the drug labels of nine countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, covering December 1988 to June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 were detected as amoxicillin signals. Comparing the drug labels of the nine countries, 12 AEs, including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms, were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the nine countries. These signals should be followed by evaluation of causal association, clinical significance, and preventability. PMID:27510377

  18. A method for detecting and locating geophysical events using groups of arrays

    NASA Astrophysics Data System (ADS)

    de Groot-Hedlin, Catherine D.; Hedlin, Michael A. H.

    2015-11-01

    We have developed a novel method to detect and locate geophysical events that makes use of any sufficiently dense sensor network. This method is demonstrated using acoustic sensor data collected in 2013 at the USArray Transportable Array (TA). The algorithm applies Delaunay triangulation to divide the sensor network into a mesh of three-element arrays, called triads. Because infrasound waveforms are incoherent between the sensors within each triad, the data are transformed into envelopes, which are cross-correlated to find signals that satisfy a consistency criterion. The propagation azimuth, phase velocity and signal arrival time are computed for each signal. Triads with signals that are consistent with a single source are bundled as an event group. The ensemble of arrival times and azimuths of detected signals within each group are used to locate a common source in space and time. A total of 513 infrasonic stations that were active for part or all of 2013 were divided into over 2000 triads. Low (0.5-2 Hz) and high (2-8 Hz) catalogues of infrasonic events were created for the eastern USA. The low-frequency catalogue includes over 900 events and reveals several highly active source areas on land that correspond with coal mining regions. The high-frequency catalogue includes over 2000 events, with most occurring offshore. Although their cause is not certain, most events are clearly anthropogenic as almost all occur during regular working hours each week. The regions to which the TA is most sensitive vary seasonally, with the direction of reception dependent on the direction of zonal winds. The catalogue has also revealed large acoustic events that may provide useful insight into the nature of long-range infrasound propagation in the atmosphere.
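Within one triad, the cross-correlation lags of the envelopes yield relative arrival times, from which propagation azimuth and phase velocity follow from a plane-wave fit. A sketch with a synthetic wave; the triad coordinates and sound speed are illustrative, not TA geometry:

```python
import numpy as np

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # triad (km, E/N)
az_true, c_true = 60.0, 0.34       # propagation azimuth (deg from N), km/s
d = np.array([np.sin(np.radians(az_true)), np.cos(np.radians(az_true))])
delays = stations @ (d / c_true)   # relative arrival times (as from cross-correlation)

# Plane-wave fit: delays = stations @ slowness, solved by least squares
slowness, *_ = np.linalg.lstsq(stations, delays, rcond=None)
az_est = np.degrees(np.arctan2(slowness[0], slowness[1]))
v_est = 1.0 / np.linalg.norm(slowness)
```

Bundling triads into an event group then amounts to checking that these per-triad azimuths and arrival times are consistent with a single source in space and time.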

  19. A Heuristic Indication and Warning Staging Model for Detection and Assessment of Biological Events

    PubMed Central

    Wilson, James M.; Polyak, Marat G.; Blake, Jane W.; Collmann, Jeff

    2008-01-01

Objective This paper presents a model designed to enable rapid detection and assessment of biological threats that may require swift intervention by the international public health community. Design We utilized Strauss’ grounded theory to develop an expanded model of social disruption due to biological events based on retrospective and prospective case studies. We then applied this model to the temporal domain and propose a heuristic staging model, the Wilson–Collmann Scale, for assessing biological event evolution. Measurements We retrospectively and manually examined hard copy archival local media reports in the native vernacular for three biological events associated with substantial social disruption. The model was then tested prospectively through media harvesting based on keywords corresponding to the model parameters. Results Our heuristic staging model provides valuable information about the features of a biological event that can be used to determine the level of concern warranted, such as whether the pathogen in question is responding to established public health disease control measures, including the use of antimicrobials or vaccines; whether the public health and medical infrastructure of the country involved is adequate to mount the necessary response; whether the country’s officials are providing an appropriate level of information to international public health authorities; and whether the event poses an international threat. The approach is applicable for monitoring open-source (public-domain) media for indications and warnings of such events, and specifically for markers of the social disruption that commonly occurs as these events unfold. These indications and warnings can then be used as the basis for staging the biological threat in the same manner that the United States National Weather Service currently uses storm warning models (such as the Saffir-Simpson Hurricane Scale) to detect and assess threatening weather conditions. Conclusion

  20. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    Energy Science and Technology Software Center (ESTSC)

    2011-05-27

CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY can also compare patterns against a library of previously seen data to indicate that a certain pattern has recurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database or through the US EPA EDDIES software.

  1. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  2. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly on the basis of dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle’s speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of the prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles. PMID:27420073
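The SMO algorithm referenced above trains a support vector machine. As a stand-in for exposition, a linear SVM fitted by hinge-loss subgradient descent on two invented stand-in features (e.g. speed s.d. and brake-pressure s.d.) illustrates the classification stage; this is not the paper's actual MB-SMO pipeline, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
# Two synthetic selected features, two classes of driving episodes
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),     # normal driving
               rng.normal(3.0, 1.0, (100, 2))])    # hazardous events
y = np.array([-1.0] * 100 + [1.0] * 100)

w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1
for _ in range(300):                     # hinge-loss subgradient descent
    viol = y * (X @ w + b) < 1           # margin violators
    w -= lr * (lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X))
    b -= lr * (-(y[viol].sum()) / len(X))

accuracy = np.mean(np.sign(X @ w + b) == y)
```

In the published pipeline, the Markov blanket step would first have pruned the feature set so that only the factors listed above reach the SVM.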

  4. Collaborative-Comparison Learning for Complex Event Detection Using Distributed Hierarchical Graph Neuron (DHGN) Approach in Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Muhamad Amin, Anang Hudaya; Khan, Asad I.

    Research trends in existing event detection schemes using Wireless Sensor Networks (WSN) have mainly focused on routing and localisation of nodes for optimum coordination when retrieving sensory information. Efforts have also been made to create schemes that provide learning mechanisms for event detection using classification or clustering approaches. These schemes entail substantial communication and computational overheads owing to the event-oblivious nature of their data transmissions. In this paper, we present an event detection scheme that is able to distribute detection processes over the resource-constrained wireless sensor nodes and is suitable for events with spatio-temporal characteristics. We adopt a pattern recognition algorithm known as the Distributed Hierarchical Graph Neuron (DHGN) with collaborative-comparison learning for detecting critical events in a WSN. The scheme demonstrates good accuracy for binary classification and offers low complexity and high scalability in its processing requirements.

  5. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize, and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique, the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. We aim to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As authorized users, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support
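As an illustration of the kind of single-trace inputs such a classifier consumes, the sketch below computes two simple discriminant features, dominant frequency and envelope statistics, from a synthetic waveform. The feature set actually used with the IMS data is not specified here; these are generic stand-ins:

```python
import numpy as np

fs = 100.0                                     # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic "event": decaying 5 Hz wavelet plus background noise
trace = np.exp(-0.5 * t) * np.sin(2 * np.pi * 5 * t) + 0.05 * rng.normal(size=t.size)

# Frequency-domain feature: dominant frequency of the trace
spec = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(trace.size, 1 / fs)
dominant_freq = freqs[np.argmax(spec)]

# Time-domain features: envelope peak and mean
envelope = np.abs(trace)
features = np.array([dominant_freq, envelope.max(), envelope.mean()])
print("feature vector:", features)
```

A feature vector like this, computed per detection, is what would be fed to the SVM in the training and classification steps.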

  6. Surface-Wave Multiple-Event Relocation and Detection of Earthquakes along the Romanche Fracture Zone

    NASA Astrophysics Data System (ADS)

    Cleveland, M.; Ammon, C. J.; VanDeMark, T. F.

    2011-12-01

    The Romanche Transform system, located along the equatorial Mid-Atlantic Ridge, is approximately 900 km in length and separates plates moving with a relative plate speed of 3 cm/yr. We use cross-correlation of globally recorded Rayleigh waves to estimate precise relative epicentroids of moderate-size earthquakes along the Romanche Fracture Zone system. The Romanche transform has an even distribution of large events along its entire length, providing a good base of events with excellent signal-to-noise observations. Two distinct moderate-magnitude event clusters occur along the eastern half of the transform, and the region between the clusters hosted a large event in the last decade. Initial results (VanDeMark, 2006) indicate that, unlike events on shorter transform systems, the events along the Romanche do not follow narrow features; the event clusters appear to spread perpendicular to, as well as along, the transform trend. These patterns are consistent with parallel, en echelon, and/or braided fault systems, which have previously been observed on the Romanche through the use of side-scanning sonar (Parson and Searle, 1986). We also explore the character and potential of seismic body waves to extend the method to help improve relative event depth estimates. Relying on a good base of larger and moderate-magnitude seismicity, we attempt to extend the analysis by processing continuous data streams through measures of waveform similarity (e.g. cross-correlation) in an attempt to detect smaller events using a subset of the nearest seismic stations.
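The core cross-correlation measurement can be sketched as follows: given two similar waveforms recorded at a common station, the lag that maximizes their correlation gives the relative timing used in the relative relocation. The wavelet and the 1.5 s shift below are synthetic, for illustration only:

```python
import numpy as np

fs = 20.0                                       # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
# Synthetic Rayleigh-like wavelet: windowed low-frequency oscillation
wavelet = np.exp(-((t - 10) ** 2) / 4) * np.sin(2 * np.pi * 0.5 * t)
true_shift = 1.5                                # event B arrives 1.5 s later (assumed)
shifted = np.interp(t - true_shift, t, wavelet, left=0.0, right=0.0)

# Full cross-correlation; the lag of the peak estimates the relative delay
corr = np.correlate(shifted, wavelet, mode="full")
lags = (np.arange(corr.size) - (wavelet.size - 1)) / fs
est_shift = lags[np.argmax(corr)]
print(f"estimated relative shift: {est_shift:.2f} s")
```

Converting many such pairwise delays into relative epicentroids is the (omitted) inversion step of the method.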

  7. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    SciTech Connect

    Perez, Rafael B; Protopopescu, Vladimir A; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
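A minimal sketch of the Hilbert-spectral half of the Hilbert-Huang approach: after empirical mode decomposition isolates oscillatory modes (the sifting itself is omitted here), the analytic signal's instantaneous amplitude makes a hidden intermittent burst stand out from the baseline. All signals and the threshold are synthetic:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0        # keep positive frequencies, doubled
    if n % 2 == 0:
        h[n // 2] = 1.0            # Nyquist bin for even-length signals
    return np.fft.ifft(X * h)

fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
signal = np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.normal(size=t.size)
signal[500:520] += 3.0 * np.sin(2 * np.pi * 20 * t[500:520])  # hidden burst at t ~ 5 s

amplitude = np.abs(analytic_signal(signal))    # instantaneous amplitude
threshold = amplitude.mean() + 3 * amplitude.std()
events = t[amplitude > threshold]
print("burst detected near t =", events[:3])
```

In the full method the same amplitude analysis is applied mode by mode to the EMD output, which is what lets intermittent events be separated from the non-stationary baseline.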

  8. Use of a clinical event monitor to prevent and detect medication errors.

    PubMed Central

    Payne, T. H.; Savarino, J.; Marshall, R.; Hoey, C. T.

    2000-01-01

    Errors in health care facilities are common and often unrecognized. We have used our clinical event monitor to prevent and detect medication errors by scrutinizing electronic messages sent to it when any medication order is written in our facility. A growing collection of medication safety rules covering dose limit errors, laboratory monitoring, and other topics may be applied to each medication order message to provide an additional layer of protection beyond existing order checks, reminders, and alerts available within our computer-based record system. During a typical day the event monitor receives 4802 messages, of which 4719 pertain to medication orders. We have found the clinical event monitor to be a valuable tool for clinicians and quality management groups charged with improving medication safety. PMID:11079962
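A dose-limit rule of the kind described can be sketched as a lookup applied to each incoming order message. The drug names, limits, and message fields below are hypothetical illustrations, not clinical guidance:

```python
# Hypothetical single-dose limits (mg); a real monitor would hold many rules
# covering dose limits, laboratory monitoring, interactions, and so on.
DOSE_LIMITS = {"warfarin": 10.0, "digoxin": 0.5}

def check_order(message):
    """Return a list of alert strings for one medication order message."""
    alerts = []
    drug, dose_mg = message["drug"], message["dose_mg"]
    limit = DOSE_LIMITS.get(drug)
    if limit is not None and dose_mg > limit:
        alerts.append(f"dose limit exceeded: {drug} {dose_mg} mg > {limit} mg")
    return alerts

orders = [
    {"drug": "warfarin", "dose_mg": 5.0},
    {"drug": "digoxin", "dose_mg": 0.75},
]
for order in orders:
    print(order["drug"], check_order(order))
```

Each order message is checked as it arrives, so the rule layer sits on top of, and independently of, the order-entry system's own checks.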

  9. Detection of Severe Rain on Snow events using passive microwave remote sensing

    NASA Astrophysics Data System (ADS)

    Grenfell, T. C.; Putkonen, J.

    2007-12-01

    Severe wintertime rain-on-snow (ROS) events create a strong ice layer or layers in the snow on arctic tundra that act as a barrier to ungulate grazing. These events are linked with large-scale ungulate herd declines via starvation and reduced calf production when the animals are unable to penetrate the resulting ice layer. ROS events also produce considerable perturbation in the mean wintertime soil temperature beneath the snow pack. ROS is a sporadic but well-known and significant phenomenon that is currently very poorly documented. Characterization of the distribution and occurrence of severe rain-on-snow events is based only on anecdotal evidence, indirect observations of carcasses found adjacent to iced snow packs, and irregular detection by a sparse observational weather network. We have analyzed in detail a particular well-identified ROS event that took place on Banks Island in early October 2003 and resulted in the death of 20,000 musk oxen. We make use of multifrequency passive microwave imagery from the Special Sensor Microwave/Imager (SSM/I) satellite sensor suite in conjunction with a strong-fluctuation-theory (SFT) emissivity model. We show that a combination of time series analysis and cluster analysis based on microwave spectral gradients and polarization ratios provides a means to detect the stages of the ROS event resulting from the modification of the vertical structure of the snow pack: specifically, wetting of the snow, the accumulation of liquid water at the base of the snow during the rain event, and the subsequent modification of the snowpack after refreezing. SFT model analysis provides quantitative confirmation of our interpretation of the evolution of the microwave properties of the snowpack as a result of the ROS event. In particular, in addition to the grain coarsening due to destructive metamorphism, we detect the presence of the internal water and ice layers, directly identifying the physical properties producing the
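The two microwave discriminants mentioned, spectral gradients and polarization ratios, reduce to simple brightness-temperature arithmetic. The values below are illustrative, not SSM/I measurements: dry snow scatters strongly at 37 GHz, giving a strongly negative spectral gradient that weakens when liquid water appears:

```python
def spectral_gradient(tb19, tb37):
    """Brightness-temperature gradient (K/GHz) between the 19 and 37 GHz channels."""
    return (tb37 - tb19) / (37.0 - 19.0)

def polarization_ratio(tb_v, tb_h):
    """Normalized V-H polarization difference at a single frequency."""
    return (tb_v - tb_h) / (tb_v + tb_h)

dry = spectral_gradient(240.0, 215.0)   # strong volume scattering in dry snow
wet = spectral_gradient(255.0, 253.0)   # liquid water damps the scattering signal
pr = polarization_ratio(250.0, 225.0)   # layered ice can raise the polarization
print(f"dry-snow gradient {dry:.2f} K/GHz, wet-snow gradient {wet:.2f} K/GHz, PR {pr:.3f}")
```

Time series and clusters of exactly these quantities are what the study tracks through the wetting, ponding, and refreezing stages of the event.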

  10. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events such as fires and volcanoes. The absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area, and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, volcanic activity, and oil fires in Iraq. The smallest fires detected by BIRD, which were verified on the ground, had an area of 12 m2 in daytime and 4 m2 at night.
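The last of those retrieved quantities follows directly from the first two: given an effective fire temperature and area, the radiative energy release can be estimated with the Stefan-Boltzmann law. The temperature and area below are assumed values for illustration, not BIRD retrievals:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def fire_radiative_power(temp_kelvin, area_m2):
    """Radiative power (W) of a fire with effective temperature T and area A."""
    return SIGMA * area_m2 * temp_kelvin ** 4

frp = fire_radiative_power(800.0, 12.0)  # hypothetical 12 m^2 fire at 800 K
print(f"fire radiative power ~ {frp / 1e3:.0f} kW")
```

The bi-spectral part of the retrieval, estimating T and A themselves from the two infrared channels, is the harder step and is not sketched here.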

  11. Assessing Reliability of Medical Record Reviews for the Detection of Hospital Adverse Events

    PubMed Central

    Ock, Minsu; Lee, Sang-il; Jo, Min-Woo; Lee, Jin Yong; Kim, Seon-Ha

    2015-01-01

    Objectives: The purpose of this study was to assess the inter-rater and intra-rater reliability of medical record review for the detection of hospital adverse events. Methods: We conducted a two-stage retrospective review of the medical records of a random sample of 96 patients from one acute-care general hospital. The first stage was an explicit patient record review by two nurses to detect the presence of 41 screening criteria (SC). The second stage was an implicit structured review by two physicians to identify the occurrence of adverse events among the cases positive on the SC. The inter-rater reliability of the two nurses and that of the two physicians were assessed. Intra-rater reliability was also evaluated using a test-retest method approximately two weeks later. Results: In 84.2% of the patient medical records, the nurses agreed as to the necessity for the second-stage review (kappa, 0.68; 95% confidence interval [CI], 0.54 to 0.83). In 93.0% of the patient medical records screened by nurses, the physicians agreed about the absence or presence of adverse events (kappa, 0.71; 95% CI, 0.44 to 0.97). When assessing intra-rater reliability, the kappa indices of the two nurses were 0.54 (95% CI, 0.31 to 0.77) and 0.67 (95% CI, 0.47 to 0.87), whereas those of the two physicians were 0.87 (95% CI, 0.62 to 1.00) and 0.37 (95% CI, -0.16 to 0.89). Conclusions: In this study, medical record review for detecting adverse events showed intermediate to good levels of inter-rater and intra-rater reliability. A well-organized training program for reviewers and clearly defined SC are required to obtain more reliable results in hospital adverse event studies. PMID:26429290
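The agreement statistic reported throughout the study is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal computation for two raters making binary judgments, on toy data rather than the study's records:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' binary (0/1) judgments on the same items."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)                  # chance both say 1
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)           # chance both say 0
    p_exp = p_yes + p_no                                 # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

rater1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
rater2 = [1, 1, 0, 0, 1, 1, 0, 0, 0, 0]
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")
```

Here 8 of 10 items agree (p_obs = 0.8) but chance alone would give 0.52, so kappa is 0.58, a markedly more conservative figure than raw agreement.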

  12. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
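The adaptive background-subtraction step can be sketched with an exponentially updated background model: slow illumination drift is absorbed into the background, while fast changes are flagged as motion. The frames here are synthetic 1-D "images", and the adaptation rate and threshold are assumptions:

```python
import numpy as np

alpha = 0.05          # background adaptation rate (assumed)
threshold = 20.0      # foreground threshold, intensity units (assumed)

frames = np.full((50, 100), 100.0)
frames += np.linspace(0, 10, 50)[:, None]   # slow illumination drift over time
frames[30:, 60:65] = 200.0                  # a moving object appears at frame 30

background = frames[0].copy()
detections = []
for frame in frames:
    foreground = np.abs(frame - background) > threshold
    detections.append(bool(foreground.any()))
    background = (1 - alpha) * background + alpha * frame  # adapt to the scene

print("first frame with motion:", detections.index(True))
```

Because the background tracks the drift with only a small lag, the 10-unit illumination change never trips the 20-unit threshold, while the object's 100-unit jump does immediately.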

  13. Energy Reconstruction for Events Detected in TES X-ray Detectors

    NASA Astrophysics Data System (ADS)

    Ceballos, M. T.; Cardiel, N.; Cobo, B.

    2015-09-01

    The processing of X-ray events detected by a TES (Transition Edge Sensor) device (such as the one proposed in the ESA AO call for instruments for the Athena mission (Nandra et al. 2013) as a high spectral resolution instrument, X-IFU (Barret et al. 2013)) is a multi-step procedure that starts with the detection of current pulses in a noisy signal and ends with their energy reconstruction. For this last stage, an energy calibration process is required to convert the pseudo-energies measured in the detector to the real energies of the incoming photons, accounting for possible nonlinearity effects in the detector. We present the details of the energy calibration algorithm we implemented as the last part of the Event Processing software we are developing for the X-IFU instrument, which permits the calculation of the calibration constants in an analytical way.
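The calibration step described can be sketched as fitting an analytic relation between measured pseudo-energies and the known energies of calibration lines, absorbing the detector's nonlinearity. The line energies and the quadratic nonlinearity model below are illustrative, not X-IFU's actual response:

```python
import numpy as np

# Known calibration-line energies (keV) and their measured pseudo-energies,
# here generated with a hypothetical 2% quadratic nonlinearity.
known_energies = np.array([1.5, 3.0, 4.5, 6.0, 7.5])
pseudo = known_energies + 0.02 * known_energies ** 2

# Calibration constants: a polynomial mapping pseudo-energy -> true energy.
coeffs = np.polyfit(pseudo, known_energies, deg=2)

def reconstruct(pseudo_energy):
    """Convert a measured pseudo-energy to a calibrated photon energy (keV)."""
    return np.polyval(coeffs, pseudo_energy)

test_pseudo = 5.0 + 0.02 * 5.0 ** 2     # what a 5.0 keV photon would measure as
print(f"reconstructed energy: {reconstruct(test_pseudo):.3f} keV")
```

Once the coefficients are computed, applying them per event is a single polynomial evaluation, which is what makes an analytical calibration attractive at high count rates.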

  14. How unusual are the "unusual events" detected by control chart techniques in healthcare settings?

    PubMed

    Borckardt, Jeffrey J; Nash, Michael R; Hardesty, Susan; Herbert, Joan; Cooney, Harriet; Pelic, Christopher

    2006-01-01

    Statistical process control (SPC) charts have become widely implemented tools for quality monitoring and assurance in healthcare settings across the United States. SPC methods have been successfully used in industrial settings to track the quality of products manufactured by machines and to detect deviations from acceptable levels of product quality. However, problems may arise when SPC methods are used to evaluate human behavior. Specifically, when human behavior is tracked over time, the data stream generated usually exhibits periodicity and gradualism with respect to behavioral changes over time. These tendencies can be quantified and are recognized in the statistical field as autocorrelation. When autocorrelation is present, conventional SPC methods too often identify events as "unusual" when they really should be understood as products of random fluctuation. This article discusses the concept of autocorrelation and demonstrates the negative impact of autocorrelation on traditional SPC methods, with a specific focus on the use of SPC charts to detect unusual events. PMID:16944647
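The article's central point can be reproduced numerically. An individuals chart estimates sigma from short-term variation (the average moving range), which understates the spread of an autocorrelated series, so an AR(1) process trips the 3-sigma limits far more often than independent noise. The parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n, phi = 10_000, 0.8                 # phi: lag-1 autocorrelation of the process

white = rng.normal(size=n)           # independent noise
ar1 = np.empty(n)                    # autocorrelated AR(1) series
ar1[0] = white[0]
for t in range(1, n):
    ar1[t] = phi * ar1[t - 1] + white[t]

def i_chart_alarm_rate(x):
    """Fraction of points outside 3-sigma limits, sigma from the moving range."""
    sigma_hat = np.mean(np.abs(np.diff(x))) / 1.128   # MRbar / d2 estimator
    return np.mean(np.abs(x - x.mean()) > 3 * sigma_hat)

print("white-noise alarm rate:", i_chart_alarm_rate(white))
print("AR(1) alarm rate:      ", i_chart_alarm_rate(ar1))
```

For white noise the rate stays near the nominal 0.27%, while the autocorrelated series alarms on a substantial fraction of ordinary points, exactly the spurious "unusual events" the article warns about.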

  15. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use

    PubMed Central

    Moghaddam, Athena K.; Yuen, Hiu Kim; Archambault, Philippe S.; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user’s driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data was processed using time-series analysis and data mining techniques to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user’s PW driving behavior. PMID:27170879
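Two of the four feature families compared (time-domain characterization and frequency-domain features) can be sketched for one window of 3-D acceleration. The synthetic spike and the collision threshold are illustrative, not the study's values:

```python
import numpy as np

fs = 50.0                                   # accelerometer rate (Hz), assumed
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(3)
accel = 0.1 * rng.normal(size=(t.size, 3))  # quiet rolling (3-axis, in g)
accel[60:70, 0] += 4.0                      # brief spike, e.g. a collision

def window_features(a):
    """Time- and frequency-domain features over one acceleration window."""
    mag = np.linalg.norm(a, axis=1)         # acceleration magnitude
    spec = np.abs(np.fft.rfft(mag - mag.mean()))
    freqs = np.fft.rfftfreq(mag.size, 1 / fs)
    return {
        "mean": mag.mean(),                 # time-domain characterization
        "std": mag.std(),
        "peak": mag.max(),
        "dominant_freq": freqs[np.argmax(spec)],  # frequency-domain feature
    }

feats = window_features(accel)
is_unsafe = feats["peak"] > 2.0             # illustrative collision threshold
print(feats, "unsafe event:", is_unsafe)
```

In the study these per-window features feed a trained classifier rather than a fixed threshold; the threshold here only illustrates why the peak feature separates collisions from quiet rolling.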

  16. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use.

    PubMed

    Pineau, Joelle; Moghaddam, Athena K; Yuen, Hiu Kim; Archambault, Philippe S; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user's driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data was processed using time-series analysis and data mining techniques to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user's PW driving behavior. PMID:27170879

  17. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or by the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate. PMID:25996758
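The detector's core idea can be sketched as follows: compute correlation coefficients between sensor pairs in a sliding window, then flag contamination when the distance of this correlation indicator from its baseline value exceeds a threshold (shown one-dimensionally here for two sensors). The signal model and threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 600
t = np.arange(n) / 50.0
common = np.sin(2 * np.pi * 0.5 * t)            # shared water-quality variation
s1 = common + 0.05 * rng.normal(size=n)         # sensor 1 (e.g. chlorine)
s2 = common + 0.05 * rng.normal(size=n)         # sensor 2 (e.g. conductivity)
s2[400:] += 1.0 * rng.normal(size=200)          # contamination decorrelates sensor 2

def corr_indicator(a, b, window=50):
    """Trailing-window correlation coefficient between two sensor streams."""
    out = np.full(a.size, np.nan)
    for i in range(window, a.size):
        out[i] = np.corrcoef(a[i - window:i], b[i - window:i])[0, 1]
    return out

r = corr_indicator(s1, s2)
baseline = np.nanmean(r[:400])                  # correlation under normal operation
distance = np.abs(r - baseline)                 # distance from baseline (1-D case)
alarms = np.where(distance > 0.3)[0]
print("first alarm at sample:", alarms[0])
```

Equipment noise perturbs each sensor independently but leaves the inter-sensor correlation largely intact, which is why a correlation-based indicator can separate it from genuine contamination.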

  18. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence

    PubMed Central

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. The predictability of the spatial location and temporal position of the event was controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while in the spatially unpredictable conditions it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable conditions it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, detection RTs were faster for a letter simultaneously presented at the same location as the event only when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable in both space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on the detection and memory of a visual item embedded in a sequence of images. PMID:26869966

  20. Testing the waveform correlation event detection system: Teleseismic, regional, and local distances

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Harris, J.M.

    1997-08-01

    Waveform Correlation Event Detection System (WCEDS) prototypes have now been developed for both global and regional networks, and the authors have extensively tested them to assess the potential usefulness of this technology for CTBT (Comprehensive Test Ban Treaty) monitoring. In this paper they present the results of tests on data sets from the IDC (International Data Center) Primary Network and the New Mexico Tech Seismic Network. The data sets span a variety of event types and noise conditions. The results are encouraging at both scales but show particular promise for regional networks. The global system was developed at Sandia Labs and has been tested on data from the IDC Primary Network. The authors have found that for this network the system does not perform at acceptable levels for either detection or location unless directional information (azimuth and slowness) is used. By incorporating directional information, however, both areas can be improved substantially, suggesting that WCEDS may be able to offer a global detection capability which could complement that provided by the GA (Global Association) system in use at the IDC and USNDC (United States National Data Center). The local version of WCEDS (LWCEDS) has been developed and tested at New Mexico Tech using data from the New Mexico Tech Seismic Network (NMTSN). Results indicate that the WCEDS technology works well at this scale, despite the fact that the present implementation of LWCEDS does not use directional information. The NMTSN data set is a good test bed for the development of LWCEDS because of the typically large number of observed local phases and near network-wide recording of most local and regional events. Detection levels approach those of trained analysts, and locations are within 3 km of manually determined locations for local events.

  1. Event Detection for Hydrothermal Plumes: A case study at Grotto Vent

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.

    2012-12-01

    Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique adapted from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among the many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net-based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional change varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however, inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. Still, considerable complexity overlies the "normal" tidal response pattern (the data has a dominant frequency of
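A Petri-net-style detector of the kind described can be sketched with places, tokens, and guarded transitions: a token advances each time a feature condition holds, and the activity is flagged when it reaches the final place. The states, the orientation-change feature, and the threshold below are illustrative, not COVIS's actual net:

```python
class PetriNet:
    """Minimal Petri net: places hold tokens; transitions fire on conditions."""

    def __init__(self, transitions, places):
        self.marking = {p: 0 for p in places}
        self.marking[places[0]] = 1          # one token starts in the initial place
        self.transitions = transitions       # (source, target, condition) triples

    def step(self, obs):
        snapshot = dict(self.marking)        # fire against the pre-step marking
        for src, dst, cond in self.transitions:
            if snapshot[src] > 0 and cond(obs):
                self.marking[src] -= 1
                self.marking[dst] += 1
        return self.marking["event"] > 0

# Flag an "event" once two large orientation changes (deg/sample) are seen.
big = lambda obs: abs(obs["d_orientation"]) > 15
net = PetriNet(
    transitions=[("idle", "bending", big), ("bending", "event", big)],
    places=["idle", "bending", "event"],
)

series = [{"d_orientation": d} for d in (2, -3, 1, 20, 25, 4)]
flags = [net.step(obs) for obs in series]
print("activity detected at sample:", flags.index(True))
```

Because the activity is expressed as explicit states and transitions supplied by a domain expert, no training data are needed, which is the property that motivated the choice of Petri Nets in the study.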

  2. Real-time gait event detection for transfemoral amputees during ramp ascending and descending.

    PubMed

    Maqbool, H F; Husman, M A B; Awad, M I; Abouhossein, A; Dehghani-Sanij, A A

    2015-01-01

    Event and phase detection of the human gait is vital for controlling prostheses, orthoses and functional electrical stimulation (FES) systems. Wearable sensors are inexpensive, portable and have fast processing capability. They are frequently used to assess spatio-temporal, kinematic and kinetic parameters of the human gait, which in turn provide more detail about human voluntary control and the amputee-prosthesis interaction. This paper presents a reliable real-time gait event detection algorithm based on a simple heuristic approach, applicable to signals from a tri-axial gyroscope, for lower limb amputees during ramp ascending and descending. Experimental validation is done by comparing the gyroscope-based results with footswitch data. For healthy subjects, the mean difference between events detected by the gyroscope and by footswitches is 14 ms and 10.5 ms for initial contact (IC), whereas for toe-off (TO) it is -5 ms and -25 ms, for ramp up and down respectively. For the transfemoral amputee, the error is slightly higher, either due to the placement of footswitches underneath the foot or the lack of proper knee flexion and ankle plantarflexion/dorsiflexion during ramp up and down. Finally, repeatability tests showed promising results. PMID:26737364
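
A minimal heuristic of the kind described can be sketched as follows, under the assumption that IC and TO appear as local minima of the sagittal shank angular velocity on either side of the large mid-swing peak; the threshold and signal shape are illustrative, not the paper's tuned values:

```python
# Heuristic gait-event sketch: detect initial contact (IC) and toe-off (TO)
# from sagittal-plane shank angular velocity samples (deg/s).
def detect_gait_events(omega, fs, swing_thresh=100.0):
    """Return (IC times, TO times) in seconds for a sampled gyroscope trace."""
    ics, tos = [], []
    i = 1
    while i < len(omega) - 1:
        # mid-swing: a large positive angular-velocity peak
        if omega[i] > swing_thresh and omega[i - 1] <= omega[i] >= omega[i + 1]:
            # TO estimate: nearest local minimum before the mid-swing peak
            k = i - 1
            while k > 0 and not (omega[k - 1] >= omega[k] <= omega[k + 1]):
                k -= 1
            if k > 0:
                tos.append(k / fs)
            # IC estimate: first local minimum after the mid-swing peak
            j = i + 1
            while j < len(omega) - 1 and not (omega[j - 1] >= omega[j] <= omega[j + 1]):
                j += 1
            if j < len(omega) - 1:
                ics.append(j / fs)
            i = j
        i += 1
    return ics, tos
```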

  3. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    DOEpatents

    De Geronimo, Gianluigi; Bolotnikov, Aleksey E.; Carini, Gabriella

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, cathode electrode, collecting grid and non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 Volts is applied to the cathode electrode. A voltage greater than the voltage applied to the cathode is applied to the non-collecting grid. A voltage greater than the voltage applied to the non-collecting grid is applied to the collecting grid. The signals from the collecting grid and the non-collecting grid are summed and subtracted, creating a sum and a difference, respectively. The difference is divided by the sum, creating a ratio. A gain coefficient for each depth (the distance between the ionizing event and the collecting grid) is determined, whereby the difference between the collecting electrode and the non-collecting electrode multiplied by the corresponding gain coefficient is the depth-corrected energy of an ionizing event. The energy of each ionizing event is therefore the difference between the collecting grid and the non-collecting grid multiplied by the corresponding gain coefficient. The depth of the ionizing event can also be determined from the ratio.
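
The arithmetic of the claim can be written down directly; the gain-lookup callable is hypothetical, standing in for the per-depth calibration the patent describes:

```python
# Sketch of the depth-correction arithmetic described above. The mapping from
# the ratio (difference/sum) to a gain coefficient is a hypothetical stand-in
# for the sensor's calibration table.
def depth_corrected_energy(collecting, non_collecting, gain_for_ratio):
    diff = collecting - non_collecting   # depth-dependent amplitude
    total = collecting + non_collecting
    ratio = diff / total                 # encodes the interaction depth
    return diff * gain_for_ratio(ratio), ratio
```

For example, with a (made-up) calibration curve `gain_for_ratio = lambda r: 1.0 + 0.1 * (1 - r)`, signals of 100 and 60 give a ratio of 0.25 and a corrected energy of 40 × 1.075.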

  4. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warning can be issued upon the detection of a seismic event which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal-to-noise ratio for the tsunami signal, which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework is employed in which a tsunamigenic signal is detected via a scan of a network of water-level-sensing hydrodynamic stations. The aim is to re-identify the same signatures as the tsunami wave spatially propagates through the hydrodynamic sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal in the open ocean conserves the birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis.
In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded

  5. BioSense: implementation of a National Early Event Detection and Situational Awareness System.

    PubMed

    Bradley, Colleen A; Rolka, H; Walker, D; Loonsk, J

    2005-08-26

    BioSense is a CDC initiative to support enhanced early detection, quantification, and localization of possible biologic terrorism attacks and other events of public health concern on a national level. The goals of the BioSense initiative are to advance early detection by providing the standards, infrastructure, and data acquisition for near real-time reporting, analytic evaluation and implementation, and early event detection support for state and local public health officials. BioSense collects and analyzes Department of Defense and Department of Veterans Affairs ambulatory clinical diagnoses and procedures and Laboratory Corporation of America laboratory-test orders. The application summarizes and presents analytical results and data visualizations by source, day, and syndrome for each ZIP code, state, and metropolitan area through maps, graphs, and tables. An initial proof-of-concept evaluation project was conducted before the system was made available to state and local users in April 2004. User recruitment involved identifying and training BioSense administrators and users from state and local health departments. User support has been an essential component of the implementation and enhancement process. CDC initiated the BioIntelligence Center (BIC) in June 2004 to conduct internal monitoring of BioSense national data daily. BIC staff have supported state and local system monitoring, conducted data anomaly inquiries, and communicated with state and local public health officials. Substantial investments will be made in providing regional, state, and local data for early event detection and situational awareness, test beds for data and algorithm evaluation, detection algorithm development, and data management technologies, while maintaining the focus on state and local public health needs. PMID:16177687

  6. Group localisation and unsupervised detection and classification of basic crowd behaviour events for surveillance applications

    NASA Astrophysics Data System (ADS)

    Roubtsova, Nadejda S.; de With, Peter H. N.

    2013-02-01

    Technology for monitoring crowd behaviour is in demand for surveillance and security applications. The trend in research is to tackle detection of complex crowd behaviour events (panic, fight, evacuation etc.) directly using machine learning techniques. In this paper, we present a contrary, bottom-up approach seeking basic group information: (1) instantaneous location and (2) the merge, split and lateral slide-by events - the three basic motion patterns comprising any crowd behaviour. The focus on such generic group information makes our algorithm suitable as a building block in a variety of surveillance systems, possibly integrated with static content analysis solutions. Our feature extraction framework has optical flow at its core. The framework is universal, being motion-based rather than object-detection-based, and generates a large variety of motion-blob-characterising features useful for an array of classification problems. Motion-based characterisation is performed on a group as an atomic whole and not by means of superposition of individual human motions. Within that feature space, our classification system makes decisions based on heuristic rules and thresholds, without machine learning. Our system performs well on group localisation, consistently generating contours around both moving and halted groups. The visual output of our periodical group localisation is equivalent to tracking and the group contour accuracy ranges from adequate to exceptionally good. The system successfully detects and classifies within our merge/split/slide-by event space in surveillance-type video sequences, differing in resolution, scale, quality and motion content. Quantitatively, its performance is characterised by a good recall: 83% on detection and 71% on combined detection and classification.

  7. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  8. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation System (GLDAS) data set.
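
A minimal SPRT detector for such a time series might look as follows, testing a "normal" against an "extreme" Gaussian regime with Wald's decision bounds; all parameter values are illustrative, not those of the GLDAS application:

```python
import math

# Sequential probability ratio test (SPRT) on a data-rod time series:
# decide between "normal" N(mu0, sigma) and "extreme" N(mu1, sigma).
def sprt(series, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    a = math.log(beta / (1 - alpha))       # accept-"normal" boundary
    b = math.log((1 - beta) / alpha)       # accept-"extreme" boundary
    llr = 0.0                              # cumulative log-likelihood ratio
    for t, x in enumerate(series):
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= b:
            return ("extreme", t)          # anomaly declared at sample t
        if llr <= a:
            return ("normal", t)
    return ("undecided", len(series) - 1)
```

The appeal of the SPRT here is that it decides as soon as the accumulated evidence crosses either bound, rather than after a fixed window.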

  9. Non Conventional Seismic Events Along the Himalayan Arc Detected in the Hi-Climb Dataset

    NASA Astrophysics Data System (ADS)

    Vergne, J.; Nàbĕlek, J. L.; Rivera, L.; Bollinger, L.; Burtin, A.

    2008-12-01

    From September 2002 to August 2005, more than 200 broadband seismic stations were operated across the Himalayan arc and the southern Tibetan plateau in the framework of the Hi-Climb project. Here, we take advantage of the high density of stations along the main profile to look for coherent seismic wave arrivals that cannot be attributed to ordinary tectonic events. An automatic detection algorithm is applied to the continuous data streams filtered between 1 and 10 Hz, followed by a visual inspection of all detections. We discovered about one hundred coherent signals that cannot be attributed to local, regional or teleseismic earthquakes and which are characterized by emergent arrivals and long durations ranging from one minute to several hours. Most of these non-conventional seismic events have a low signal-to-noise ratio and are thus only observed above 1 Hz, in the frequency band where the seismic noise is lowest. However, a small subset of them are strong enough to be observed in a larger frequency band and show an enhancement of long periods compared to standard earthquakes. Based on the analysis of the relative amplitude measured at each station or, when possible, on the correlation of the low frequency part of the signals, most of these events appear to be located along the High Himalayan range. But, because of their emergent character and the main orientation of the seismic profile, their longitude and depth remain poorly constrained. The origin of these non-conventional seismic events is still unresolved, but their seismic signature shares several characteristics with non-volcanic tremors, glacial earthquakes and/or debris avalanches. All these phenomena may occur along the Himalayan range but were not seismically detected before. Here we discuss the pros and cons for each of these postulated candidates based on the analysis of the recorded waveforms and slip models.

  10. Global Detection of Protein Kinase D-dependent Phosphorylation Events in Nocodazole-treated Human Cells*

    PubMed Central

    Franz-Wachtel, Mirita; Eisler, Stephan A.; Krug, Karsten; Wahl, Silke; Carpy, Alejandro; Nordheim, Alfred; Pfizenmaier, Klaus; Hausser, Angelika; Macek, Boris

    2012-01-01

    Protein kinase D (PKD) is a cytosolic serine/threonine kinase implicated in regulation of several cellular processes such as response to oxidative stress, directed cell migration, invasion, differentiation, and fission of the vesicles at the trans-Golgi network. Its variety of functions must be mediated by numerous substrates; however, only a couple of PKD substrates have been identified so far. Here we perform stable isotope labeling of amino acids in cell culture-based quantitative phosphoproteomic analysis to detect phosphorylation events dependent on PKD1 activity in human cells. We compare relative phosphorylation levels between constitutively active and kinase dead PKD1 strains of HEK293 cells, both treated with nocodazole, a microtubule-depolymerizing reagent that disrupts the Golgi complex and activates PKD1. We identify 124 phosphorylation sites that are significantly down-regulated upon decrease of PKD1 activity and show that the PKD target motif is significantly enriched among down-regulated phosphorylation events, pointing to the presence of direct PKD1 substrates. We further perform PKD1 target motif analysis, showing that a proline residue at position +1 relative to the phosphorylation site serves as an inhibitory cue for PKD1 activity. Among PKD1-dependent phosphorylation events, we detect predominantly proteins with localization at Golgi membranes and function in protein sorting, among them several sorting nexins and members of the insulin-like growth factor 2 receptor pathway. This study presents the first global detection of PKD1-dependent phosphorylation events and provides a wealth of information for functional follow-up of PKD1 activity upon disruption of the Golgi network in human cells. PMID:22496350

  11. Event Detection Using Mobile Phone Mass GPS Data and Their Reliability Verification by DMSP/OLS Night Light Image

    NASA Astrophysics Data System (ADS)

    Akiyama, Yuki; Ueyama, Satoshi; Shibasaki, Ryosuke; Adachi, Ryuichiro

    2016-06-01

    In this study, we developed a method to detect sudden population concentration on a certain day and area, that is, an "Event," all over Japan in 2012 using mass GPS data provided from mobile phone users. First, stay locations of all phone users were detected using existing methods. Second, areas and days where Events occurred were detected by aggregation of mass stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influences of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development of urban planning, disaster prevention, and transportation management.
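
The aggregation step can be sketched as follows; the grid indexing and the threshold rule (a fixed multiple of each cell's average daily count) are simplifying assumptions, not the authors' exact procedure:

```python
from collections import Counter, defaultdict

# Count stay locations per grid cell and day, then flag (cell, day) pairs far
# above that cell's typical daily count. 0.01 degrees of latitude is ~1 km.
def to_cell(lat, lon, cell_deg=0.01):
    return (round(lat / cell_deg), round(lon / cell_deg))

def detect_events(stays, factor=3.0):
    """stays: iterable of (day, lat, lon). Returns unusual (cell, day) pairs."""
    counts = Counter((to_cell(lat, lon), day) for day, lat, lon in stays)
    per_cell = defaultdict(list)
    for (cell, day), n in counts.items():
        per_cell[cell].append(n)
    events = []
    for (cell, day), n in counts.items():
        baseline = sum(per_cell[cell]) / len(per_cell[cell])
        # require a history of more than one day so a single burst of data
        # cannot be its own baseline
        if n > factor * baseline and len(per_cell[cell]) > 1:
            events.append((cell, day))
    return events
```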

  12. Fault detection and isolation in manufacturing systems with an identified discrete event model

    NASA Astrophysics Data System (ADS)

    Roth, Matthias; Schneider, Stefan; Lesage, Jean-Jacques; Litz, Lothar

    2012-10-01

    In this article a generic method for fault detection and isolation (FDI) in manufacturing systems considered as discrete event systems (DES) is presented. The method uses an identified model of the closed loop of plant and controller built on the basis of observed fault-free system behaviour. An identification algorithm known from the literature is used to determine the fault detection model in the form of a non-deterministic automaton. New results on how to parameterise this algorithm are reported. To assess the fault detection capability of an identified automaton, probabilistic measures are proposed. For fault isolation, the concept of residuals adapted for DES is used by defining appropriate set operations representing generic fault symptoms. The method is applied to a case study system.
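
The detection half of such a scheme reduces to running the observed event sequence through the identified non-deterministic automaton and flagging the first observation no automaton path explains. The transition structure used in the usage note below is a toy example, not the paper's case study:

```python
# Fault detection with an identified non-deterministic automaton: track the
# set of states consistent with the observations; an empty set means the
# observed behaviour is outside the identified fault-free model.
def advance(transitions, states, event):
    """transitions: {(state, event): {next states}}; returns the new state set."""
    nxt = set()
    for s in states:
        nxt |= transitions.get((s, event), set())
    return nxt

def detect_fault(transitions, initial, trace):
    states = {initial}
    for t, event in enumerate(trace):
        states = advance(transitions, states, event)
        if not states:            # no path explains the observation
            return t              # fault detected at this step
    return None                   # trace is consistent with fault-free model
```

With `transitions = {("q0", "start"): {"q1"}, ("q1", "stop"): {"q0"}}`, the trace `["start", "stop"]` is explained and returns `None`, while `["start", "start"]` is flagged at step 1.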

  13. An evaluation of generalized likelihood Ratio Outlier Detection to identification of seismic events in Western China

    SciTech Connect

    Taylor, S.R.; Hartse, H.E.

    1996-09-24

    The Generalized Likelihood Ratio Outlier Detection Technique for seismic event identification is evaluated using synthetic test data and frequency-dependent P{sub g}/L{sub g} measurements from western China. For most seismic stations that are to be part of the proposed International Monitoring System for the Comprehensive Test Ban Treaty, there will be few or no nuclear explosions in the magnitude range of interest (e.g. M{sub b} < 4) on which to base an event-identification system using traditional classification techniques. Outlier detection is a reasonable alternative approach to the seismic discrimination problem when no calibration explosions are available. Distance-corrected P{sub g}/L{sub g} data in seven different frequency bands ranging from 0.5 to 8 Hz from the Chinese Digital Seismic Station WMQ are used to evaluate the technique. The data are collected from 157 known earthquakes, 215 unknown events (presumed earthquakes and possibly some industrial explosions), and 18 known nuclear explosions (1 from the Chinese Lop Nor test site and 17 from the East Kazakh test site). A feature selection technique is used to find the best combination of discriminants to use for outlier detection. Good discrimination performance is found by combining a low-frequency (0.5 to 1 Hz) P{sub g}/L{sub g} ratio with high-frequency ratios (e.g. 2 to 4 and 4 to 8 Hz). Although the low-frequency ratio does not discriminate between earthquakes and nuclear explosions well by itself, it can be effectively combined with the high-frequency discriminants. Based on the tests with real and synthetic data, the outlier detection technique appears to be an effective approach to seismic monitoring in uncalibrated regions.
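
In the same spirit, a stripped-down outlier test over two discriminant features (e.g. a low- and a high-frequency Pg/Lg ratio) can score each event by its Mahalanobis distance from the earthquake population. This is a simplification, not the authors' exact GLR statistic, and the threshold (chi-square with 2 degrees of freedom, ~99th percentile) is a generic choice:

```python
# Outlier detection sketch: events far from the earthquake population in a
# 2-D discriminant space are flagged as potential explosions.
def mean_cov2(data):
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    return (mx, my), (sxx, sxy, syy)

def mahalanobis2(p, mean, cov):
    (mx, my), (sxx, sxy, syy) = mean, cov
    det = sxx * syy - sxy * sxy            # determinant of the 2x2 covariance
    dx, dy = p[0] - mx, p[1] - my
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

def is_outlier(p, earthquakes, threshold=9.21):  # chi-square(2 dof), ~99%
    mean, cov = mean_cov2(earthquakes)
    return mahalanobis2(p, mean, cov) > threshold
```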

  14. Event detection of hydrological processes with passive L-band data from SMOS

    NASA Astrophysics Data System (ADS)

    Al Bitar, Ahmad; Jacquette, Elsa; Kerr, Yann; Mialon, Arnaud; Cabot, Francois; Quesney, Arnaud; Merlin, Olivier; Richaume, Philippe

    2010-10-01

    Since its launch, ESA's Soil Moisture and Ocean Salinity (SMOS) satellite has been delivering new data from its L-band 1.4 GHz 2D interferometer [1]. The observations from SMOS are used to retrieve soil moisture in the first centimeters of soil and ocean salinity at the surface of the water. The observations are multi-angular, with a 3-day maximum revisit time. The spatial resolution of SMOS data is 40 km. In this paper we present an event detection algorithm implemented at CATDS (Centre Aval de Traitement des Données SMOS), the CNES level 3 and level 4 SMOS data center. This algorithm is a three-stage change detection algorithm. At stage one, the possibility/probability of occurrence of the event is evaluated. This is done via spatiotemporal constraint maps. These maps are obtained from the analysis of NSIDC's freezing index products over the last century. Climate data from ancillary files are tested while taking into consideration the uncertainty of the data. Some selected retrieved variables are also tested. At stage two, a time series analysis is applied. In the current version of the algorithm a direct change detection algorithm is used. The tests make use of available variables such as the polarization index and retrieved soil moisture. Finally, at stage three, a simple fuzzy logic approach is used to decide whether the event occurred. This approach takes into consideration the separation time of the data. Ascending and descending orbits are taken into consideration. In this study, freezing detection is presented over the central CONUS. The temporal and angular signature of SMOS will be presented. Comparison is done with the SCAN network

  15. Predictive modeling of structured electronic health records for adverse drug event detection

    PubMed Central

    2015-01-01

    Background The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Methods Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Results Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both

  16. Exupery volcano fast response system - The event detection and waveform classification system

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    Volcanic eruptions are often preceded by seismic activity, which can be used to quantify the volcanic activity since the number and size of certain types of seismic events usually increase before periods of volcanic crisis. The implementation of an automatic detection and classification system for seismic signals of volcanic origin not only allows the processing of large amounts of data in a short time, but also provides consistent and time-invariant results. Here, we have developed a system based upon a combination of different methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely distributed in active volcano monitoring observatories worldwide. Among those software modules are classical trigger algorithms such as STA/LTA and cross-correlation master-event matching, which is also used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters, which are also used to quantify the volcanic activity. Most automatic classification systems need a sufficiently large pre-classified data set for training the system. However, in the case of a volcanic crisis we are often confronted with a lack of training data due to insufficient prior observations, because prior data acquisition might be carried out with different equipment at a low number of sites, and because of the imminent crisis there might be no time for the time-consuming and tedious process of preparing a training data set. For this reason we have developed a novel seismic event spotting technique in order to be less dependent on the existence of previously acquired databases of event classes. One main goal is therefore to provide observatory staff with a robust event classification based on a minimum number of reference waveforms. By using a "learning-while-recording" approach we are allowing for the fast build-up of a
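
The STA/LTA trigger mentioned above is simple enough to sketch directly; the window lengths and on-ratio below are typical illustrative values, not an observatory's configuration:

```python
# Classical STA/LTA trigger: compare the short-term average (STA) of signal
# energy against the long-term average (LTA) and fire when the ratio exceeds
# a threshold. Plain-Python sliding sums; fine for a sketch.
def sta_lta_trigger(x, fs, sta_s=1.0, lta_s=10.0, on_ratio=3.0):
    nsta, nlta = int(sta_s * fs), int(lta_s * fs)
    triggers = []
    for i in range(nlta, len(x)):
        sta = sum(v * v for v in x[i - nsta:i]) / nsta   # recent energy
        lta = sum(v * v for v in x[i - nlta:i]) / nlta   # background energy
        if lta > 0 and sta / lta >= on_ratio:
            triggers.append(i)
    return triggers
```

On a 10 Hz trace of low-level noise followed by a burst, the trigger fires within a sample or two of the onset entering the short window.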

  17. Detection and identification of multiple genetically modified events using DNA insert fingerprinting.

    PubMed

    Raymond, Philippe; Gendron, Louis; Khalf, Moustafa; Paul, Sylvianne; Dibley, Kim L; Bhat, Somanath; Xie, Vicki R D; Partis, Lina; Moreau, Marie-Eve; Dollard, Cheryl; Coté, Marie-José; Laberge, Serge; Emslie, Kerry R

    2010-03-01

    Current screening and event-specific polymerase chain reaction (PCR) assays for the detection and identification of genetically modified organisms (GMOs) in samples of unknown composition or for the detection of non-regulated GMOs have limitations, and alternative approaches are required. A transgenic DNA fingerprinting methodology using restriction enzyme digestion, adaptor ligation, and nested PCR was developed where individual GMOs are distinguished by the characteristic fingerprint pattern of the fragments generated. The inter-laboratory reproducibility of the amplified fragment sizes using different capillary electrophoresis platforms was compared, and reproducible patterns were obtained with an average difference in fragment size of 2.4 bp. DNA insert fingerprints for 12 different maize events, including two maize hybrids and one soy event, were generated that reflected the composition of the transgenic DNA constructs. Once produced, the fingerprint profiles were added to a database which can be readily exchanged and shared between laboratories. This approach should facilitate the process of GMO identification and characterization. PMID:19943159

  18. Piezoelectric energy-harvesting power source and event detection sensors for gun-fired munitions

    NASA Astrophysics Data System (ADS)

    Rastegar, Jahangir; Feng, Dake; Pereira, Carlos M.

    2015-05-01

    This paper presents a review of piezoelectric-based energy harvesting devices and their charge collection electronics for use in the very harsh environment of gun-fired munitions. A number of novel classes of such energy harvesting power sources have been developed for gun-fired munitions and similar applications, including those with integrated safety and firing setback event detection electronics and logic circuitry. The power sources are designed to harvest energy from firing acceleration and vibratory motions during the flight. As an example, the application of the developed piezoelectric-based energy harvesting devices with event detection circuitry is presented for the development of self-powered initiators with full no-fire safety circuitry for protection against accidental drops, transportation vibration, and other similar low-amplitude accelerations and/or high-amplitude but short-duration acceleration events. The design allows the use of a very small piezoelectric element, thereby allowing such devices to be highly miniaturized. These devices can be readily hardened to withstand very high-G firing setback accelerations in excess of 100,000 G and the harsh firing environment. The design of prototypes and testing under realistic conditions are presented.

  19. Detection of Events in Biomedical Signals by a Rényi Entropy Measure

    NASA Astrophysics Data System (ADS)

    Gabarda, S.; Cristóbal, G.; Martínez-Alajarín, J.; Ruiz, R.

    2006-10-01

    Biomedical signals contain important information about the health condition of human beings. Anomalous events in these signals are commonly associated with diseases. The information content enclosed by time-frequency representations (TFR) of biomedical signals can be explored by means of different Rényi entropy measures. To be precise, Rényi entropy can be approached under different normalizations, producing different outcomes. The best choice depends upon the particularities of the application considered. In this paper we propose a new processing scheme for the problem of event detection in biomedical signals, based on a particular normalization of the Rényi entropy measure. As with other TFRs, the pseudo-Wigner distribution (PWD) of a biomedical signal can take negative values and thus cannot be properly interpreted as a probability density function. Therefore a complexity measure based on the classical Shannon entropy cannot be used and a generalized measure such as the Rényi entropy is required. Our method allows the identification of the events as the moments having the highest amount of information (entropy) along the temporal data. This provides localized information about normal and pathological events in biomedical signals, thereby facilitating the diagnosis of diseases. The method is illustrated with examples of application to phonocardiograms and electrocardiograms, and results are discussed.
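
A Rényi entropy profile over the time slices of a TFR can be sketched as follows; normalizing by absolute values is one simple way to handle the PWD's negative values, and the order alpha = 3 is a common but illustrative choice, not necessarily the paper's normalization:

```python
import math

# Rényi entropy of order alpha over one (normalized) time slice of a TFR;
# event candidates are the time bins with extreme entropy values.
def renyi_entropy(slice_vals, alpha=3.0):
    mags = [abs(v) for v in slice_vals]   # fold negative PWD values
    total = sum(mags)
    if total == 0:
        return 0.0
    p = [m / total for m in mags]         # pseudo-probability distribution
    return math.log2(sum(q ** alpha for q in p)) / (1 - alpha)

def entropy_profile(tfr, alpha=3.0):
    """tfr: list of time slices (each a list over frequency bins)."""
    return [renyi_entropy(col, alpha) for col in tfr]
```

A slice with energy spread uniformly over 4 frequency bins yields 2 bits, while a slice concentrated in a single bin yields 0, so the profile distinguishes broadband events from narrowband background.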

  20. Detecting regular sound changes in linguistics as events of concerted evolution

    SciTech Connect

    Hruschka, Daniel  J.; Branford, Simon; Smith, Eric  D.; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  1. Comprehensive temporal information detection from clinical text: medical events, time, and TLINK identification

    PubMed Central

    Sohn, Sunghwan; Wagholikar, Kavishwar B; Li, Dingcheng; Jonnalagadda, Siddhartha R; Tao, Cui; Komandur Elayavilli, Ravikumar; Liu, Hongfang

    2013-01-01

    Background Temporal information detection systems have been developed by the Mayo Clinic for the 2012 i2b2 Natural Language Processing Challenge. Objective To construct automated systems for EVENT/TIMEX3 extraction and temporal link (TLINK) identification from clinical text. Materials and methods The i2b2 organizers provided 190 annotated discharge summaries as the training set and 120 discharge summaries as the test set. Our Event system used a conditional random field classifier with a variety of features including lexical information, natural language elements, and medical ontology. The TIMEX3 system employed a rule-based method using regular expression pattern match and systematic reasoning to determine normalized values. The TLINK system employed both rule-based reasoning and machine learning. All three systems were built in an Apache Unstructured Information Management Architecture framework. Results Our TIMEX3 system performed the best (F-measure of 0.900, value accuracy 0.731) among the challenge teams. The Event system produced an F-measure of 0.870, and the TLINK system an F-measure of 0.537. Conclusions Our TIMEX3 system demonstrated good capability of regular expression rules to extract and normalize time information. Event and TLINK machine learning systems required well-defined feature sets to perform well. We could also leverage expert knowledge as part of the machine learning features to further improve TLINK identification performance. PMID:23558168

  2. Detecting Regular Sound Changes in Linguistics as Events of Concerted Evolution

    PubMed Central

    Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-01

    Summary Background Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. PMID:25532895

  3. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution, and often destroy abundant gelatinous animals resulting in species undersampling. Video camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50m to 4000m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames is developed by detecting whether or not there is an interesting candidate object for an animal present in a particular sequence of underwater video -- video frames that do not contain any "interesting" events. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are

  4. Analysis of grain boundary dynamics using event detection and cumulative averaging.

    PubMed

    Gautam, A; Ophus, C; Lançon, F; Denes, P; Dahmen, U

    2015-04-01

    To analyze extended time series of high resolution images, we have employed automated frame-by-frame comparisons that are able to detect dynamic changes in the structure of a grain boundary in Au. Using cumulative averaging of images between events allowed high resolution measurements of the atomic relaxation in the interface with sufficient accuracy for comparison with atomistic models. Cumulative averaging was also used to observe the structural rearrangement of atomic columns at a moving step in the grain boundary. The technique of analyzing changing features in high resolution images by averaging between incidents can be used to deconvolute stochastic events that occur at random intervals and on time scales well beyond that accessible to single-shot imaging. PMID:25498139
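
The event-detection-plus-averaging idea can be sketched in a few lines: compare consecutive frames, cut the sequence where the change metric spikes, and average within each quiescent segment. This is a hypothetical minimal version; the frame-difference metric and threshold are assumptions, not the authors' actual comparison procedure:

```python
import numpy as np

def segment_and_average(frames, threshold):
    """Split a frame sequence at 'events' (large frame-to-frame change)
    and return the cumulative average of each quiescent segment."""
    frames = np.asarray(frames, dtype=float)
    # frame-to-frame mean absolute difference as a simple change metric
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    cuts = np.flatnonzero(diffs > threshold) + 1   # event frame indices
    segments = np.split(frames, cuts)
    return [seg.mean(axis=0) for seg in segments]

# Two static structures with an abrupt rearrangement between frames 3 and 4
rng = np.random.default_rng(0)
a, b = np.zeros((2, 2)), np.full((2, 2), 5.0)
frames = [a + 0.1 * rng.standard_normal((2, 2)) for _ in range(4)] + \
         [b + 0.1 * rng.standard_normal((2, 2)) for _ in range(4)]
avgs = segment_and_average(frames, threshold=1.0)
assert len(avgs) == 2            # one averaged image per quiescent segment
```

Averaging within each segment suppresses noise by roughly the square root of the segment length, which is what makes the atomic-relaxation measurements possible.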

  5. Detecting consciousness in a total locked-in syndrome: an active event-related paradigm.

    PubMed

    Schnakers, Caroline; Perrin, Fabien; Schabus, Manuel; Hustinx, Roland; Majerus, Steve; Moonen, Gustave; Boly, Melanie; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurelie; Laureys, Steven

    2009-08-01

    Total locked-in syndrome is characterized by tetraplegia, anarthria and paralysis of eye motility. In this study, consciousness was detected in a 21-year-old woman who presented a total locked-in syndrome after a basilar artery thrombosis (49 days post-injury) using an active event-related paradigm. The patient was presented sequences of names containing the patient's own name and other names. The patient was instructed to count her own name or to count another target name. Similar to 4 age- and gender-matched healthy controls, the P3 response recorded for the voluntarily counted own name was larger than while passively listening. This P3 response was observed 14 days before the first behavioral signs of consciousness. This study shows that our active event-related paradigm allowed to identify voluntary brain activity in a patient who would behaviorally be diagnosed as comatose. PMID:19241281

  6. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
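
As a hedged illustration of SVM-based anomaly detection on a point-time series: the abstract does not specify the SVM variant, so a one-class SVM on short sliding windows is used here as one common choice, and the rainfall-like series is synthetic, not TRMM data:

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic daily-rainfall-like series with two extreme spikes
rng = np.random.default_rng(1)
series = rng.gamma(shape=2.0, scale=2.0, size=365)
series[100] = 60.0
series[250] = 75.0

# Short sliding windows as feature vectors for the one-class SVM
w = 5
X = np.lib.stride_tricks.sliding_window_view(series, w)
clf = OneClassSVM(nu=0.02, gamma="scale").fit(X)
labels = clf.predict(X)                  # -1 marks anomalous windows
anomalous_days = np.flatnonzero(labels == -1)
```

The window starts flagged -1 are the candidate extreme-event times; in the data-rods setting each window would be cut from a time series extracted from the data cube.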

  7. Detecting tidal disruption events of massive black holes in normal galaxies with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Ling, Z.-X.; Zhao, D. H.; Zhang, S.-N.; Osborne, J. P.; O'Brien, P.; Willingale, R.; Lapington, J.

    2016-02-01

    Stars are tidally disrupted and accreted when they approach massive black holes (MBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible. Here, we present the proposed Einstein Probe mission, which is a dedicated time-domain soft X-ray all-sky monitor aiming at detecting X-ray transients including TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60° × 60°), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry a more sensitive telescope for X-ray follow-ups, and will be capable of issuing public transient alerts rapidly. Einstein Probe is expected to revolutionise the field of TDE research by detecting several tens to hundreds of events per year from the early phase of flares, many with long-term, well sampled lightcurves.

  8. Robust cardiac event change detection method for long-term healthcare monitoring applications.

    PubMed

    Satija, Udit; Ramkumar, Barathram; Manikandan, M Sabarimalai

    2016-06-01

    A long-term continuous cardiac health monitoring system demands considerable battery power for real-time transmission of electrocardiogram (ECG) signals and increases the bandwidth, treatment costs, and traffic load of the diagnostic server. In this Letter, the authors present an automated low-complexity robust cardiac event change detection (CECD) method that can continuously detect specific changes in PQRST morphological patterns and heart rhythms and then enable transmission/storing of the recorded ECG signals. The proposed CECD method consists of four stages: ECG signal quality assessment, R-peak detection and beat waveform extraction, temporal and RR interval feature extraction, and cardiac event change decision. The proposed method is tested and validated using both normal and abnormal ECG signals including different types of arrhythmia beats, heart rates and signal qualities. Results show that the method achieves an average sensitivity of 99.76%, positive predictivity of 94.58% and overall accuracy of 94.32% in determining changes in the heartbeat waveforms of ECG signals. PMID:27382480
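
A toy version of the R-peak detection and RR-interval stages might look as follows; the amplitude-threshold detector and the 0.2 s rhythm-change criterion are illustrative assumptions, not the authors' method:

```python
import numpy as np

def detect_r_peaks(ecg, fs, thresh_ratio=0.6, refractory=0.2):
    """Naive R-peak detector: local maxima above a fraction of the signal
    maximum, separated by at least a refractory period (seconds)."""
    ecg = np.asarray(ecg, dtype=float)
    thresh = thresh_ratio * ecg.max()
    min_gap = int(refractory * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > thresh and ecg[i] >= ecg[i - 1]
                and ecg[i] >= ecg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    return np.array(peaks)

# Synthetic ECG: unit R spikes at a 1 s rhythm that abruptly speeds up
fs = 250
beats = list(range(10, 1260, 250)) + list(range(1400, 2400, 150))
ecg = np.zeros(2500)
ecg[beats] = 1.0
peaks = detect_r_peaks(ecg, fs)
rr = np.diff(peaks) / fs                            # RR intervals (s)
events = np.flatnonzero(np.abs(np.diff(rr)) > 0.2)  # rhythm-change candidates
assert len(peaks) == len(beats)
assert len(events) >= 1
```

A real CECD pipeline would precede this with signal-quality assessment and follow it with a decision over the full PQRST feature set.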

  9. Multiscale vision model for event detection and reconstruction in two-photon imaging data.

    PubMed

    Brazhe, Alexey; Mathiesen, Claus; Lind, Barbara; Rubin, Andrey; Lauritzen, Martin

    2014-07-01

    Reliable detection of calcium waves in multiphoton imaging data is challenging because of the low signal-to-noise ratio and because of the unpredictability of the time and location of these spontaneous events. This paper describes our approach to calcium wave detection and reconstruction based on a modified multiscale vision model, an object-detection framework that thresholds wavelet coefficients, builds hierarchical trees of significant coefficients, and then performs nonlinear iterative partial object reconstruction, for the analysis of two-photon calcium imaging data. The framework is discussed in the context of detection and reconstruction of intercellular glial calcium waves. We extend the framework with a different decomposition algorithm and iterative reconstruction of the detected objects. Comparison with several popular state-of-the-art image denoising methods shows that the multiscale vision model performs similarly in denoising but provides a better segmentation of the image into meaningful objects, whereas the other methods need to be combined with dedicated thresholding and segmentation utilities. PMID:26157968

  10. Automated detection and analysis of depolarization events in human cardiomyocytes using MaDEC.

    PubMed

    Szymanska, Agnieszka F; Heylman, Christopher; Datta, Rupsa; Gratton, Enrico; Nenadic, Zoran

    2016-08-01

    Optical imaging-based methods for assessing the membrane electrophysiology of in vitro human cardiac cells allow for non-invasive temporal assessment of the effect of drugs and other stimuli. Automated methods for detecting and analyzing the depolarization events (DEs) in image-based data allow quantitative assessment of these different treatments. In this study, we use 2-photon microscopy of fluorescent voltage-sensitive dyes (VSDs) to capture the membrane voltage of actively beating human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). We built a custom, freely available MATLAB software package, called MaDEC, to detect, quantify, and compare DEs of hiPS-CMs treated with the β-adrenergic drugs propranolol and isoproterenol. The efficacy of our software is quantified by comparing detection results against manual DE detection by expert analysts, and by comparing DE analysis results to known drug-induced electrophysiological effects. The software accurately detected DEs with true positive rates of 98-100% and false positive rates of 1-2%, at signal-to-noise ratios (SNRs) of 5 and above. The MaDEC software was also able to distinguish control DEs from drug-treated DEs both immediately as well as 10 min after drug administration. PMID:27281718

  11. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    PubMed

    Cron, Andrew; Gouttefangeas, Cécile; Frelinger, Jacob; Lin, Lin; Singh, Satwinder K; Britten, Cedrik M; Welters, Marij J P; van der Burg, Sjoerd H; West, Mike; Chan, Cliburn

    2013-01-01

    Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. 
We show that hierarchical modeling is a useful probabilistic approach that can provide a consistent labeling
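
A single-sample approximation of the DPGMM stage can be sketched with scikit-learn's truncated Dirichlet-process mixture; the paper's hierarchical extension across multiple samples and its GPU-accelerated implementation are not reproduced here:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic 2-D "flow cytometry" data: a dominant population plus a
# rare (~1%) well-separated subset
rng = np.random.default_rng(2)
major = rng.normal([0.0, 0.0], 0.5, size=(990, 2))
rare = rng.normal([4.0, 4.0], 0.2, size=(10, 2))
X = np.vstack([major, rare])

# Truncated Dirichlet-process Gaussian mixture: unused components
# receive near-zero weight, so the number of subsets is inferred
dpgmm = BayesianGaussianMixture(
    n_components=10,                       # truncation level
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)
labels = dpgmm.predict(X)
# The rare subset should land in a single mixture component of its own
assert len(set(labels[-10:])) == 1
```

Sharing a common set of components across samples, as HDPGMM does, is what aligns subsets between samples and boosts sensitivity to the rarest ones.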

  12. An analog cell to detect single event transients in voltage references

    NASA Astrophysics Data System (ADS)

    Franco, F. J.; Palomar, C.; Izquierdo, J. G.; Agapito, J. A.

    2015-01-01

    A reliable voltage reference is mandatory in mixed-signal systems. However, this family of components can undergo very long single event transients when operating in radiation environments such as space and nuclear facilities due to the impact of heavy ions. The purpose of the present paper is to demonstrate how a simple cell can be used to detect these transients. The cell was implemented with typical COTS components and its behavior was verified by SPICE simulations and in a laser facility. Different applications of the cell are explored as well.

  13. A novel progressive signal association algorithm for detecting teleseismic/network-outside events using regional seismic networks

    NASA Astrophysics Data System (ADS)

    Jin, Ping; Pan, Changzhou; Zhang, Chengliu; Shen, Xufeng; Wang, Hongchun; Lu, Na

    2015-06-01

    Regional seismic networks may, and in some cases must, be used to monitor teleseismic or network-outside events. To detect and localize teleseismic events automatically and reliably in this setting, we present a novel progressive association algorithm for teleseismic signals recorded by a regional seismic network. The algorithm takes triangular station arrays as its starting point and searches progressively for teleseismic P waves, exploiting two facts: when detections at different stations come from the same teleseismic event, their arrival times are linearly related through the average slowness vector with which the signal propagates across the network, and the slowness of the direct teleseismic P wave differs from that of the other major seismic phases. We have tested this algorithm using 16 days of data recorded by the Xinjiang Seismic Network of China (XJSN). The results show that the algorithm can effectively and reliably detect and localize earthquakes outside the network: for the test period, all mb 4.0+ events with Δc < 30° and all mb 4.5+ events with Δc < 60° listed in the International Data Centre Reviewed Event Bulletin (IDC REB) were detected, where Δc is the epicentral distance from the network's geographical centre, while false events accounted for only 2.4 per cent. This suggests that the new association algorithm has good application prospects for situations in which regional seismic networks must be used to monitor teleseismic events.
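
The linear relation the association test relies on, arrival time t_i = t0 + s · r_i for stations at positions r_i and average slowness vector s, can be recovered by a least-squares fit; the station coordinates below are hypothetical:

```python
import numpy as np

# Plane-wave model: arrival time at station i is t0 + s . r_i, where s is
# the average slowness vector of the wavefront across the network
stations = np.array([[0.0, 0.0], [50.0, 10.0],
                     [20.0, 60.0], [80.0, 40.0]])   # km (hypothetical)
s_true = np.array([0.06, 0.02])                     # s/km, teleseismic-P-like
t0 = 100.0
arrivals = t0 + stations @ s_true

# Least-squares fit of (t0, s_x, s_y) from the observed arrival times
A = np.hstack([np.ones((len(stations), 1)), stations])
t0_est, sx_est, sy_est = np.linalg.lstsq(A, arrivals, rcond=None)[0]
assert abs(sx_est - 0.06) < 1e-9 and abs(sy_est - 0.02) < 1e-9
```

In practice the association test would threshold the fit residuals and check that the estimated slowness magnitude lies in the teleseismic-P range before accepting a detection triple.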

  14. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
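
A minimal sketch of the original (unoptimized) swinging door segmentation, where eps plays the role of the door tolerance that the paper's optimization tunes:

```python
def swinging_door(times, values, eps):
    """Segment a series with the swinging door algorithm: each segment is
    approximable by a line through its anchor point within +/- eps."""
    anchors = [0]
    lo, hi = float("-inf"), float("inf")   # feasible slope interval
    for i in range(1, len(times)):
        dx = times[i] - times[anchors[-1]]
        dy = values[i] - values[anchors[-1]]
        lo = max(lo, (dy - eps) / dx)      # lower door swings up
        hi = min(hi, (dy + eps) / dx)      # upper door swings down
        if lo > hi:                        # doors crossed: new segment
            anchors.append(i - 1)
            dx = times[i] - times[i - 1]
            dy = values[i] - values[i - 1]
            lo, hi = (dy - eps) / dx, (dy + eps) / dx
    return anchors

# A flat stretch, an up-ramp, then a flat stretch -> three segments
t = list(range(12))
p = [0, 0, 0, 0, 2, 4, 6, 8, 8, 8, 8, 8]
print(swinging_door(t, p, eps=0.5))        # → [0, 3, 7]
```

The ramp shows up as the middle segment (anchors 3 to 7) with a large slope; the dynamic-programming merge step of the optimized SDA, which joins adjacent segments into significant ramps, is omitted here.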

  15. Label-Free Detection of Single Living Bacteria via Electrochemical Collision Event.

    PubMed

    Lee, Ji Young; Kim, Byung-Kwon; Kang, Mijeong; Park, Jun Hui

    2016-01-01

    We detected single living bacterial cells on ultramicroelectrode (UME) using a single-particle collision method and optical microscopic methods. The number of collision events involving the bacterial cells indicated in current-time (i-t) curves corresponds to the number of bacterial cells (i.e., Escherichia coli) on the UME surface, as observed visually. Simulations were performed to determine the theoretical current response (75 pA) and frequency (0.47 pM(-1) s(-1)) of single Escherichia coli collisions. The experimental current response (83 pA) and frequency (0.26 pM(-1) s(-1)) were on the same order of magnitude as the theoretical values. This single-particle collision approach facilitates detecting living bacteria and determining their concentration in solution and could be widely applied to studying other bacteria and biomolecules. PMID:27435527

  16. Highly specific detection of genetic modification events using an enzyme-linked probe hybridization chip.

    PubMed

    Zhang, M Z; Zhang, X F; Chen, X M; Chen, X; Wu, S; Xu, L L

    2015-01-01

    The enzyme-linked probe hybridization chip utilizes a method based on ligase-hybridizing probe chip technology. Its principle is to use thio-primers for protection against enzyme digestion, and lambda DNA exonuclease to cut the multiple PCR products obtained from the sample being tested into single-stranded chains for hybridization. The 5'-end amino-labeled probe was fixed onto the aldehyde chip and hybridized with the single-stranded PCR product, followed by addition of a fluorescent-modified probe that was then enzymatically ligated with the adjacent, substrate-bound probe in order to achieve highly specific, parallel, and high-throughput detection. Specificity and sensitivity testing demonstrated that enzyme-linked probe hybridization technology could be applied to the specific detection of eight genetic modification events at the same time, with a sensitivity reaching 0.1% and accurate, efficient, and stable results. PMID:26345863

  17. Label-Free Detection of Single Living Bacteria via Electrochemical Collision Event

    PubMed Central

    Lee, Ji Young; Kim, Byung-Kwon; Kang, Mijeong; Park, Jun Hui

    2016-01-01

    We detected single living bacterial cells on ultramicroelectrode (UME) using a single-particle collision method and optical microscopic methods. The number of collision events involving the bacterial cells indicated in current-time (i-t) curves corresponds to the number of bacterial cells (i.e., Escherichia coli) on the UME surface, as observed visually. Simulations were performed to determine the theoretical current response (75 pA) and frequency (0.47 pM−1 s−1) of single Escherichia coli collisions. The experimental current response (83 pA) and frequency (0.26 pM−1 s−1) were on the same order of magnitude as the theoretical values. This single-particle collision approach facilitates detecting living bacteria and determining their concentration in solution and could be widely applied to studying other bacteria and biomolecules. PMID:27435527

  18. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270
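
A minimal recurrence-matrix sketch; the paper's maximum-entropy choice of the neighbourhood size and its Hausdorff-clustering symbolization are omitted, and eps here is simply fixed by hand:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when states i and j lie
    within eps of each other (Euclidean distance)."""
    x = np.atleast_2d(np.asarray(x, dtype=float).T).T  # shape (n, dim)
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (d < eps).astype(int)

# A trajectory that dwells in state A, jumps to state B, and returns:
# recurrence domains appear as blocks of 1s on and off the diagonal
x = [0.0, 0.1, 0.05, 3.0, 3.1, 2.95, 0.02, 0.08]
R = recurrence_matrix(x, eps=0.5)
assert R[0, 1] == 1 and R[0, 6] == 1   # A-epoch states recur
assert R[0, 3] == 0                    # A and B states do not
```

Labeling each time index by the recurrence block it falls into yields the symbolic sequence whose quasi-stationary runs correspond to the recurrence domains of the paper.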

  19. Label-Free Detection of Single Living Bacteria via Electrochemical Collision Event

    NASA Astrophysics Data System (ADS)

    Lee, Ji Young; Kim, Byung-Kwon; Kang, Mijeong; Park, Jun Hui

    2016-07-01

    We detected single living bacterial cells on ultramicroelectrode (UME) using a single-particle collision method and optical microscopic methods. The number of collision events involving the bacterial cells indicated in current-time (i-t) curves corresponds to the number of bacterial cells (i.e., Escherichia coli) on the UME surface, as observed visually. Simulations were performed to determine the theoretical current response (75 pA) and frequency (0.47 pM‑1 s‑1) of single Escherichia coli collisions. The experimental current response (83 pA) and frequency (0.26 pM‑1 s‑1) were on the same order of magnitude as the theoretical values. This single-particle collision approach facilitates detecting living bacteria and determining their concentration in solution and could be widely applied to studying other bacteria and biomolecules.

  20. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    SciTech Connect

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-04-20

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection was not possible due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). Confident detection of planets in EWCP events requires both higher-cadence monitoring and better photometric accuracy than current follow-up observation systems provide. The next-generation ground-based observation project, the Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in EWCP events using KMTNet. From this study, we find that EWCP events occur with a frequency of >50% for planets of ≲ 100 M_E (Earth masses) with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, planets of ≳ 1 M_E in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright source stars such as giants, because KMTNet, with its constant exposure time, easily saturates around the peaks of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets, and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, will be detected in the near future, regardless of the planet mass.

  1. Automatic Detection of Whole Night Snoring Events Using Non-Contact Microphone

    PubMed Central

    Dafna, Eliran; Tarasiuk, Ariel; Zigel, Yaniv

    2013-01-01

    Objective Although awareness of sleep disorders is increasing, limited information is available on whole night detection of snoring. Our study aimed to develop and validate a robust, high performance, and sensitive whole-night snore detector based on non-contact technology. Design Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. Patients Sixty-seven subjects (age 52.5±13.5 years, BMI 30.8±4.7 kg/m2, m/f 40/27) referred for PSG for obstructive sleep apnea diagnoses were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. Measurements and Results To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental). A feature selection process was applied to select the most discriminative features extracted from time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4% based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2% with sensitivity of 98.0% (snore as a snore) and specificity of 98.3% (noise as noise). Conclusions Audio-based features extracted from time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients. PMID:24391903
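
The classification stage can be illustrated with scikit-learn's AdaBoostClassifier on synthetic two-feature data standing in for the paper's time- and spectral-domain acoustic features; the feature distributions are invented for the example:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# Invented stand-ins for acoustic features: snores as louder,
# lower-"frequency" events than background noise
rng = np.random.default_rng(3)
snore = np.column_stack([rng.normal(5, 1, 300), rng.normal(150, 30, 300)])
noise = np.column_stack([rng.normal(2, 1, 300), rng.normal(400, 80, 300)])
X = np.vstack([snore, noise])
y = np.array([1] * 300 + [0] * 300)         # 1 = snore, 0 = non-snore

# AdaBoost over decision stumps, trained/validated on a random split
idx = rng.permutation(len(X))
train, test = idx[:450], idx[450:]
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X[train], y[train])
acc = clf.score(X[test], y[test])
assert acc > 0.9    # classes are well separated by construction
```

The real detector works on manually labeled acoustic episodes and a selected subset of discriminative features, so its 98% figures are not implied by this toy separation.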

  2. Large-Scale Disturbance Events in Terrestrial Ecosystems Detected using Global Satellite Data Sets

    NASA Astrophysics Data System (ADS)

    Potter, C.; Tan, P.; Kumar, V.; Klooster, S.

    2004-12-01

Studies are being conducted to evaluate patterns in a 19-year record of global satellite observations of vegetation phenology from the Advanced Very High Resolution Radiometer (AVHRR), as a means to characterize large-scale ecosystem disturbance events and regimes. The fraction of photosynthetically active radiation absorbed by vegetation canopies (FPAR) worldwide has been computed at a monthly time interval from 1982 to 2000 and gridded at a spatial resolution of 8 km globally. Potential disturbance events were identified in the FPAR time series by locating anomalously low values (FPAR-LO) that lasted longer than 12 consecutive months at any 8-km pixel. We find verifiable evidence of numerous disturbance types across North America, including major regional patterns of cold and heat waves, forest fires, tropical storms, and large-scale forest logging. Based on this analysis, a historical picture is emerging of periodic droughts and heat waves, possibly coupled with herbivorous insect outbreaks, as among the most important causes of ecosystem disturbance in North America. In South America, large areas of northeastern Brazil appear to have been impacted in the early 1990s by severe drought. Amazon tropical forest disturbance can be detected at large scales, particularly in the mid 1990s. In Asia, large-scale disturbance events appear in the mid 1980s and the late 1990s across boreal and temperate forest zones, as well as in cropland areas of western India. In northern Europe and central Africa, large-scale forest disturbance appears in the mid 1990s.
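The FPAR-LO criterion above (anomalously low values persisting for more than 12 consecutive months at a pixel) can be sketched per pixel as follows; the anomaly threshold here is illustrative, not the authors' choice:

```python
import numpy as np

def fpar_lo_events(series, n_std=1.5, min_months=12):
    """Return (start, length) of runs where FPAR stays anomalously low
    for more than `min_months` consecutive months."""
    threshold = series.mean() - n_std * series.std()
    low = series < threshold
    events = []
    start = None
    for i, flag in enumerate(np.append(low, False)):  # sentinel ends a run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start > min_months:
                events.append((start, i - start))
            start = None
    return events

# 19 years of monthly data with one 15-month disturbance at month 60
rng = np.random.default_rng(1)
fpar = 0.6 + 0.02 * rng.standard_normal(228)
fpar[60:75] -= 0.3
print(fpar_lo_events(fpar))  # → [(60, 15)]
```

Applied to an 8-km global grid, the same function would simply run over every pixel's 228-month series.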

  3. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    NASA Astrophysics Data System (ADS)

    Wu, Chunhung

    2016-04-01

Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, including the landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) methods, in an extreme rainfall-induced landslide case. The landslide inventory in the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10^7 MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10 in 2009 and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfall in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km², equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion in the flooding processes. The downslope landslide area in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than the upslope landslide area. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been proven to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall

  4. A new multivariate time series data analysis technique: Automated detection of flux transfer events using Cluster data

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Sipes, T. B.; Wang, Y.; Lavraud, B.; Roberts, A.

    2009-06-01

    A new data mining technique called MineTool-TS is introduced which captures the time-lapse information in multivariate time series data through extraction of global features and metafeatures. This technique is developed into a JAVA-based data mining software which automates all the steps in the model building to make it more accessible to nonexperts. As its first application in space sciences, MineTool-TS is used to develop a model for automated detection of flux transfer events (FTEs) at Earth's magnetopause in the Cluster spacecraft time series data. The model classifies a given time series into one of three categories of non-FTE, magnetosheath FTE, or magnetospheric FTE. One important feature of MineTool-TS is the ability to explore the importance of each variable or combination of variables as indicators of FTEs. FTEs have traditionally been identified on the basis of their magnetic field signatures, but here we find that some plasma variables can also be effective indicators of FTEs. For example, the perpendicular ion temperature yields a model accuracy of ˜93%, while a model based solely on the normal magnetic field BN yields an accuracy of ˜95%. This opens up the possibility of searching for more unusual FTEs that may, for example, have no clear BN signature and create a more comprehensive and less biased list of FTEs for statistical studies. We also find that models using GSM coordinates yield comparable accuracy to those using boundary normal coordinates. This is useful since there are regions where magnetopause models are not accurate. Another surprising result is the finding that the algorithm can largely detect FTEs, and even distinguish between magnetosheath and magnetospheric FTEs, solely on the basis of models built from single parameters, something that experts may not do so straightforwardly on the basis of short time series intervals. The most accurate models use a combination of plasma and magnetic field variables and achieve a very high

  5. Endpoint Visual Detection of Three Genetically Modified Rice Events by Loop-Mediated Isothermal Amplification

    PubMed Central

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-01-01

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%–0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops. PMID:23203072

  6. Detection of Pharmacovigilance-Related adverse Events Using Electronic Health Records and automated Methods

    PubMed Central

    Haerian, K; Varn, D; Vaidya, S; Ena, L; Chase, HS; Friedman, C

    2013-01-01

    Electronic health records (EHRs) are an important source of data for detection of adverse drug reactions (ADRs). However, adverse events are frequently due not to medications but to the patients’ underlying conditions. Mining to detect ADRs from EHR data must account for confounders. We developed an automated method using natural-language processing (NLP) and a knowledge source to differentiate cases in which the patient’s disease is responsible for the event rather than a drug. Our method was applied to 199,920 hospitalization records, concentrating on two serious ADRs: rhabdomyolysis (n = 687) and agranulocytosis (n = 772). Our method automatically identified 75% of the cases, those with disease etiology. The sensitivity and specificity were 93.8% (confidence interval: 88.9-96.7%) and 91.8% (confidence interval: 84.0-96.2%), respectively. The method resulted in considerable saving of time: for every 1 h spent in development, there was a saving of at least 20 h in manual review. The review of the remaining 25% of the cases therefore became more feasible, allowing us to identify the medications that had caused the ADRs. PMID:22713699

  7. A Simple and Robust Event-Detection Algorithm for Single-Cell Impedance Cytometry.

    PubMed

    Caselli, Federica; Bisegna, Paolo

    2016-02-01

Microfluidic impedance cytometry is emerging as a powerful label-free technique for the characterization of single biological cells. In order to increase the sensitivity and the specificity of the technique, suitable digital signal processing methods are required to extract meaningful information from measured impedance data. In this study, a simple and robust event-detection algorithm for impedance cytometry is presented. Since a differential measuring scheme is generally adopted, the signal recorded when a cell passes through the sensing region of the device exhibits a typical odd-symmetric pattern. This feature is exploited twice by the proposed algorithm: first, a preliminary segmentation, based on the correlation of the data stream with the simplest odd-symmetric template, is performed; then, the quality of detected events is established by evaluating their E2O index, that is, a measure of the ratio between their even and odd parts. A thorough performance analysis is reported, showing the robustness of the algorithm with respect to parameter choice and noise level. In terms of sensitivity and positive predictive value, an overall performance of 94.9% and 98.5%, respectively, was achieved on two datasets relevant to microfluidic chips with very different characteristics, considering three noise levels. The present algorithm can foster the role of impedance cytometry in single-cell analysis, which is the new frontier in "Omics." PMID:26241968
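The even/odd decomposition behind the E2O quality index can be sketched directly: a candidate event window is split into its even and odd parts about its centre, and the ratio of their energies scores how odd-symmetric (i.e. event-like) the window is. The exact index definition and thresholds in the paper may differ; this is an illustrative version:

```python
import numpy as np

def e2o_index(window):
    """Energy of the even part over the energy of the odd part."""
    rev = window[::-1]
    even = 0.5 * (window + rev)
    odd = 0.5 * (window - rev)
    return np.sum(even ** 2) / (np.sum(odd ** 2) + 1e-12)

t = np.linspace(-1.0, 1.0, 201)
bipolar_event = t * np.exp(-8 * t ** 2)   # odd-symmetric, event-like pulse
artifact = np.exp(-8 * t ** 2)            # even-symmetric bump (artifact-like)
print(e2o_index(bipolar_event) < e2o_index(artifact))  # → True
```

A low E2O index (odd part dominant) keeps the event; a high one rejects it.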

  8. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    PubMed Central

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-01-01

The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, the online least squares one-class support vector machine (online LS-OC-SVM), together with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model with the remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by a covariance matrix descriptor encoding the motion information, and is then classified as a normal or an abnormal frame. Experiments are conducted on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
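The paper's online LS-OC-SVM is not available in standard libraries; as a hedged stand-in, the sketch below uses scikit-learn's batch OneClassSVM to show the underlying idea (learn a description of "normal" frame descriptors, then flag frames falling outside it). The 16-dimensional random vectors are placeholders for the covariance-descriptor features:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_frames = rng.standard_normal((500, 16))      # stand-in descriptors

# nu bounds the fraction of training frames treated as outliers
clf = OneClassSVM(nu=0.05, gamma="scale").fit(normal_frames)

abnormal_frames = rng.standard_normal((20, 16)) + 8.0   # far from normal
pred = clf.predict(abnormal_frames)                 # +1 normal, -1 abnormal
print(np.all(pred == -1))  # → True
```

The online and sparse variants in the paper replace the batch fit with incremental updates whose complexity is bounded by the coherence criterion.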

  9. AKSED: adaptive knowledge-based system for event detection using collaborative unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Wang, X. Sean; Lee, Byung Suk; Sadjadi, Firooz

    2006-05-01

Advances in sensor technology and image processing have made it possible to equip unmanned aerial vehicles (UAVs) with economical, high-resolution, energy-efficient sensors. Despite these improvements, current UAVs lack autonomous and collaborative operation capabilities due to limited bandwidth and limited on-board image processing abilities. The situation, however, is changing. In the next generation of UAVs, much image processing can be carried out onboard, and the communication bandwidth problem will ease. More importantly, with more processing power, collaborative operations among a team of autonomous UAVs can provide more intelligent event detection capabilities. In this paper, we present ideas for developing a system enabling target recognition through collaborative operation of autonomous UAVs. UAVs are configured in three stages: manufacturing, mission planning, and deployment. Different sets of information are needed at different stages, and the resulting outcome is an optimized event detection code deployed onto a UAV. The envisioned system architecture and the contemplated methodology, together with problems to be addressed, are presented.

  10. The Waveform Correlation Event Detection System project, Phase II: Testing with the IDC primary network

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Moore, S.G.

    1998-04-01

    Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.
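WCEDS itself correlates processed multi-station streams with directional consistency checks, which is well beyond a short sketch; as a generic illustration of the waveform-correlation principle it builds on, here is a minimal normalized cross-correlation detector (template shape, noise level, and threshold are all illustrative):

```python
import numpy as np

def correlation_detector(trace, template, threshold=0.7):
    """Indices where the normalized cross-correlation exceeds threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        cc[i] = np.dot(t, (w - w.mean()) / w.std()) / n
    return np.flatnonzero(cc > threshold)

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 6 * np.pi, 100)) * np.hanning(100)
trace = 0.2 * rng.standard_normal(2000)       # continuous noisy trace
trace[700:800] += template                    # buried "event" at sample 700
hits = correlation_detector(trace, template)
print(700 in hits)  # → True
```

Real systems would vectorize the correlation (e.g. via FFT), stack over stations, and, as the abstract notes, add azimuth/slowness consistency checks to disambiguate regions of high correlation.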

  11. Unreported seismic events found far off-shore Mexico using full-waveform, cross-correlation detection method.

    NASA Astrophysics Data System (ADS)

    Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli

    2015-04-01

A large subset of seismic events do not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremors, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time reversal (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these was an impulsive earthquake in the Ometepec area that had clear arrivals on only three stations and was therefore not located and reported by the Mexican National Seismological Service (SSN). The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A very rough estimate places these two events in the portion of the East Pacific Rise around 9° N. These two events are detected despite their distance from the search area, thanks to favorable move-out across the SSN network. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.

  12. EEG-based event detection using optimized echo state networks with leaky integrator neurons.

    PubMed

    Ayyagari, Sudhanshu S D P; Jones, Richard D; Weddell, Stephen J

    2014-01-01

This study investigates the classification ability of linear and nonlinear classifiers on biological signals using the electroencephalogram (EEG) and examines the impact of architectural changes within the classifier in order to enhance the classification. Consequently, artificial events were used to validate a prototype EEG-based microsleep detection system based around an echo state network (ESN) and a linear discriminant analysis (LDA) classifier. The artificial events comprised infrequent 2-s long bursts of 15 Hz sinusoids superimposed on prerecorded 16-channel EEG data, which provided a means of determining and optimizing the accuracy of the overall classifier on 'gold standard' events. The performance of this system was tested at different signal-to-noise amplitude ratios (SNRs) ranging from 16 down to 0.03. Results from several feature selection/reduction and pattern classification modules indicated that training the classifier using a leaky-integrator neuron ESN structure yielded the highest classification accuracy. For datasets with a low SNR of 0.3, training the leaky-neuron ESN using only those features which directly correspond to the underlying event resulted in a phi correlation of 0.92, compared with 0.37 when principal component analysis (PCA) was employed. On the same datasets, other classifiers such as LDA and simple ESNs using PCA performed weakly, with correlations of 0.05 and 0, respectively. These results suggest that ESNs with leaky-neuron architectures have superior pattern recognition properties. This, in turn, may reflect their superior ability to exploit differences in state dynamics and, hence, provide superior temporal characteristics in learning. PMID:25571328
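The leaky-integrator reservoir update the study found most accurate can be sketched as follows; reservoir size, input scaling, spectral radius, and leak rate are illustrative choices, not the study's tuned values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 16, 100                     # 16 EEG channels, 100 reservoir units
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def run_reservoir(inputs, alpha=0.3):
    """Collect leaky-integrator reservoir states for an input sequence."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x_new = np.tanh(W_in @ u + W @ x)
        x = (1 - alpha) * x + alpha * x_new       # leaky integration
        states.append(x)
    return np.array(states)

states = run_reservoir(rng.standard_normal((200, n_in)))
print(states.shape)  # → (200, 100)
```

The leak rate `alpha` controls how slowly the state forgets past inputs, which is the mechanism the authors credit for the superior temporal characteristics; a linear readout (e.g. ridge regression on `states`) would complete the classifier.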

  13. Slip-Related Changes in Plantar Pressure Distribution, and Parameters for Early Detection of Slip Events

    PubMed Central

    Choi, Seungyoung; Cho, Hyungpil; Kang, Boram; Lee, Dong Hun; Kim, Mi Jung

    2015-01-01

    Objective To investigate differences in plantar pressure distribution between a normal gait and unpredictable slip events to predict the initiation of the slipping process. Methods Eleven male participants were enrolled. Subjects walked onto a wooden tile, and two layers of oily vinyl sheet were placed on the expected spot of the 4th step to induce a slip. An insole pressure-measuring system was used to monitor plantar pressure distribution. This system measured plantar pressure in four regions (the toes, metatarsal head, arch, and heel) for three events: the step during normal gait; the recovered step, when the subject recovered from a slip; and the uncorrected, harmful slipped step. Four variables were analyzed: peak pressure (PP), contact time (CT), the pressure-time integral (PTI), and the instant of peak pressure (IPP). Results The plantar pressure pattern in the heel was unique, as compared with other parts of the sole. In the heel, PP, CT, and PTI values were high in slipped and recovered steps compared with normal steps. The IPP differed markedly among the three steps. The IPPs in the heel for the three events were, in descending order (from latest to earliest), slipped, recovered, and normal steps, whereas in the other regions the order was normal, recovered, and slipped steps. Finally, the metatarsal head-to-heel IPP ratios for the normal, recovered, and slipped steps were 6.1±2.9, 3.1±3.0, and 2.2±2.5, respectively. Conclusion A distinctive plantar pressure pattern in the heel might be useful for early detection of a slip event to prevent slip-related injuries. PMID:26798603

  14. Detecting regular sound changes in linguistics as events of concerted evolution

DOE PAGES

Hruschka, Daniel J.; Branford, Simon; Smith, Eric D.; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2014-12-18

Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  15. Automatic Event Detection and Characterization of solar events with IRIS, SDO/AIA and Hi-C

    NASA Astrophysics Data System (ADS)

    Alexander, Caroline; Fayock, Brian; Winebarger, Amy

    2016-05-01

    Dynamic, low-lying loops with peak temperatures <1 MK are observed throughout the solar transition region. These loops can be observed in SDO/AIA data due to some lower temperature spectral lines in the passbands, but have not been studied in great detail. We have developed a technique to automatically identify events (i.e., brightenings) on a pixel-by-pixel basis applying a set of selection criteria. The pixels are then grouped according to their proximity in space and relative progression of the event. This method allows us to characterize their overall lifetime and the rate at which these events occur. Our current progress includes identification of these groups of events in IRIS data, determination of their existence in AIA data, and characterization based on a comparison between the two. This technique has also been used on Hi-C data in preparation for the rocket re-flight in July 2016. Results on the success of this technique at identifying real structures and sources of heating will be shown.
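The two-step scheme described above (flag per-pixel brightenings, then group flagged pixels by proximity in space and time) can be sketched with a synthetic data cube; the background model and threshold below are placeholders, not the authors' selection criteria:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
cube = rng.normal(100.0, 2.0, size=(30, 64, 64))   # (time, y, x) intensities
cube[10:14, 20:25, 30:34] += 40.0                  # injected brightening

background = np.median(cube, axis=0)               # per-pixel quiet level
noise = 1.4826 * np.median(np.abs(cube - background), axis=0)  # robust sigma
flagged = cube > background + 8.0 * noise          # pixel-by-pixel criterion

# Connected-component labelling groups flagged pixels that are adjacent
# in space and consecutive in time into candidate events.
labels, n_events = ndimage.label(flagged)
print(n_events)  # → 1
```

Each labelled component's extent along the time axis then gives the event lifetime, and the count of components over the series gives the occurrence rate.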

  16. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of unauthorized maize GM event Bt10 in their market and subsequently the grain channel. In the United States measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union an embargo for maize gluten and brewer's grain import was implemented unless certified of Bt10 absence with a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out by both the company Syngenta, who produced the event, and the European Commission Joint Research Centre, who validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were found tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose. PMID:19368351

  17. Real-time detection of pathological cardiac events in the electrocardiogram.

    PubMed

    Iliev, Ivo; Krasteva, Vessela; Tabakov, Serafim

    2007-03-01

    The development of accurate and fast methods for real-time electrocardiogram (ECG) analysis is mandatory in handheld fully automated monitoring devices for high-risk cardiac patients. The present work describes a simple software method for fast detection of pathological cardiac events. It implements real-time procedures for QRS detection, interbeat RR-intervals analysis, QRS waveform evaluation and a decision-tree beat classifier. Two QRS descriptors are defined to assess (i) the RR interval deviation from the mean RR interval and (ii) the QRS waveform deviation from the QRS pattern of the sustained rhythm. The calculation of the second parameter requires a specific technique, in order to satisfy the demand for straight signal processing with minimum iterations and small memory size. This technique includes fast and resource efficient estimation of a histogram matrix, which accumulates dynamically the amplitude-temporal distribution of the successive QRS pattern waveforms. The pilot version of the method is developed in Matlab and it is tested with internationally recognized ECG databases. The assessment of the online single lead QRS detector showed sensitivity and positive predictivity of above 99%. The classification rules for detection of pathological ventricular beats were defined empirically by statistical analysis. The attained specificity and sensitivity are about 99.5% and 95.7% for all databases and about 99.81% and 98.87% for the noise free dataset. The method is applicable in low computational cost systems for long-term ECG monitoring, such as intelligent holters, automatic event/alarm recorders or personal devices with intermittent wireless data transfer to a central terminal. PMID:17322591
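The first QRS descriptor defined above (deviation of each RR interval from the mean RR interval) can be sketched directly; the window length and the synthetic beat train are illustrative:

```python
import numpy as np

def rr_deviation(beat_times, window=4):
    """Percent deviation of each RR interval from the mean of the
    preceding `window` intervals."""
    rr = np.diff(beat_times)
    devs = []
    for i in range(window, len(rr)):
        mean_rr = rr[i - window:i].mean()
        devs.append(100.0 * (rr[i] - mean_rr) / mean_rr)
    return np.array(devs)

# Regular 0.8 s rhythm with one premature (ectopic) beat inserted
beats = list(np.arange(0.0, 8.0, 0.8))
beats.insert(5, 3.5)                      # ectopic beat at 3.5 s
devs = rr_deviation(np.array(beats))
print(devs.min() < -50)  # premature interval deviates strongly → True
```

In the paper this descriptor is combined with the QRS waveform-deviation descriptor (computed against the histogram-matrix pattern of the sustained rhythm) before the decision-tree classification.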

  18. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    PubMed Central

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983

  19. The WISE Detection of an Infrared Echo in Tidal Disruption Event ASASSN-14li

    NASA Astrophysics Data System (ADS)

    Jiang, Ning; Dou, Liming; Wang, Tinggui; Yang, Chenwei; Lyu, Jianwei; Zhou, Hongyan

    2016-09-01

We report the detection of significant infrared variability of the nearest tidal disruption event (TDE), ASASSN-14li, using Wide-field Infrared Survey Explorer and newly released Near-Earth Object WISE Reactivation data. In comparison with the quiescent state, the infrared flux brightened by 0.12 and 0.16 mag in the W1 (3.4 μm) and W2 (4.6 μm) bands at 36 days after the optical discovery (or ∼110 days after the peak disruption date). The flux excess was still detectable ∼170 days later. Assuming that the flare-like infrared emission is from dust around the black hole, its blackbody temperature is estimated to be ∼2.1 × 10^3 K, slightly higher than the dust sublimation temperature, indicating that the dust is likely located close to the dust sublimation radius. The equilibrium between the heating and radiation of the dust implies a bolometric luminosity of ∼10^43–10^45 erg s^-1, comparable with the observed peak luminosity. This result confirms for the first time the detection of infrared emission from the dust echoes of TDEs.
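The luminosity quoted above follows from a grain radiative-equilibrium argument. A simplified version (assuming unit absorption efficiency and blackbody re-emission; the grain radius a cancels) reads:

```latex
% Flare heating absorbed by a grain of radius a at distance r
% balances its blackbody re-emission:
\frac{L_{\mathrm{bol}}}{4\pi r^{2}}\,\pi a^{2} = 4\pi a^{2}\sigma T_{\mathrm{dust}}^{4}
\quad\Longrightarrow\quad
L_{\mathrm{bol}} = 16\pi\sigma\, r^{2}\, T_{\mathrm{dust}}^{4}
```

For T_dust ≈ 2.1 × 10^3 K and illustrative sublimation-scale distances of order 0.01–0.1 pc, this lands in the quoted 10^43–10^45 erg s^-1 range; the paper's actual estimate may include grain emissivity and geometry factors omitted here.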

  20. Automatic Detection of Swallowing Events by Acoustical Means for Applications of Monitoring of Ingestive Behavior

    PubMed Central

    Sazonov, Edward S.; Makeyev, Oleksandr; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Edward L.; Neuman, Michael R.

    2010-01-01

    Our understanding of etiology of obesity and overweight is incomplete due to lack of objective and accurate methods for Monitoring of Ingestive Behavior (MIB) in the free living population. Our research has shown that frequency of swallowing may serve as a predictor for detecting food intake, differentiating liquids and solids, and estimating ingested mass. This paper proposes and compares two methods of acoustical swallowing detection from sounds contaminated by motion artifacts, speech and external noise. Methods based on mel-scale Fourier spectrum, wavelet packets, and support vector machines are studied considering the effects of epoch size, level of decomposition and lagging on classification accuracy. The methodology was tested on a large dataset (64.5 hours with a total of 9,966 swallows) collected from 20 human subjects with various degrees of adiposity. Average weighted epoch recognition accuracy for intra-visit individual models was 96.8% which resulted in 84.7% average weighted accuracy in detection of swallowing events. These results suggest high efficiency of the proposed methodology in separation of swallowing sounds from artifacts that originate from respiration, intrinsic speech, head movements, food ingestion, and ambient noise. The recognition accuracy was not related to body mass index, suggesting that the methodology is suitable for obese individuals. PMID:19789095

  1. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-05

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to enhance the state of the art in SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.
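As a rough illustration of the segmentation step described above, the swinging door algorithm can be sketched in a few lines of Python. The tolerance parameter and the simple per-segment ramp check are our own illustrative choices; the paper combines adjacent segments with dynamic programming, which is omitted here.

```python
def swinging_door(values, eps):
    """Swinging door segmentation: return (start, end) index pairs of
    consecutive piecewise-linear segments within tolerance eps."""
    segments, start = [], 0
    s_up, s_low = float("-inf"), float("inf")
    for i in range(1, len(values)):
        dx = i - start
        # slopes of the upper and lower "doors", pivoting at the segment start
        s_up = max(s_up, (values[i] - values[start] - eps) / dx)
        s_low = min(s_low, (values[i] - values[start] + eps) / dx)
        if s_up > s_low:                      # doors crossed: close the segment
            segments.append((start, i - 1))
            start = i - 1                     # new segment shares the endpoint
            s_up = values[i] - values[start] - eps
            s_low = values[i] - values[start] + eps
    segments.append((start, len(values) - 1))
    return segments


def detect_ramps(values, eps, min_slope):
    """Flag segments whose mean slope magnitude reaches min_slope
    (a greedy simplification of the paper's dynamic-programming merge)."""
    return [(a, b) for a, b in swinging_door(values, eps)
            if b > a and abs(values[b] - values[a]) / (b - a) >= min_slope]
```

For example, a series that is flat, ramps up, then flattens again splits into three segments, and only the ramping one is flagged.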

  2. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to improve SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.

  3. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-01-01

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when the drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, has been introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates. PMID:21347109
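The PRR itself is a simple 2×2 contingency ratio, which makes it a natural fit for a map/reduce decomposition: each report is mapped to its contribution to the contingency table, and the reduce step sums those contributions. A minimal single-machine sketch follows; the field names `drugs`/`events` are illustrative, not the FDA report schema.

```python
from functools import reduce

def mapper(report, drug, event):
    """Map step: one spontaneous report -> its 2x2 contingency contribution
    (a, b, c, d) for the given drug/event pair."""
    has_d, has_e = drug in report["drugs"], event in report["events"]
    return (int(has_d and has_e), int(has_d and not has_e),
            int(not has_d and has_e), int(not has_d and not has_e))

def reducer(acc, cell):
    """Reduce step: element-wise sum of contingency contributions."""
    return tuple(x + y for x, y in zip(acc, cell))

def prr(reports, drug, event):
    """Proportional Reporting Ratio: P(event | drug) / P(event | other drugs),
    i.e. (a / (a + b)) / (c / (c + d))."""
    a, b, c, d = reduce(reducer, (mapper(r, drug, event) for r in reports),
                        (0, 0, 0, 0))
    return (a / (a + b)) / (c / (c + d))
```

In a real MapReduce job the reduction would be distributed per (drug, event) key, so all pairs are scored in one pass over the reports.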

  4. Orbit Determination and Maneuver Detection Using Event Representation with Thrust-Fourier-Coefficients

    NASA Astrophysics Data System (ADS)

    Lubey, D.; Ko, H.; Scheeres, D.

    The classical orbit determination (OD) method of dealing with unknown maneuvers is to restart the OD process with post-maneuver observations. However, it is also possible to continue the OD process through such unknown maneuvers by representing them with an appropriate event representation. It has been shown in previous work (Ko & Scheeres, JGCD 2014) that any maneuver performed by a satellite transitioning between two arbitrary orbital states can be represented as an equivalent maneuver connecting those two states using Thrust-Fourier-Coefficients (TFCs). Event representation using TFCs rigorously provides a unique control law that can generate the desired secular behavior for a given unknown maneuver. This paper presents applications of this representation approach to the orbit prediction and maneuver detection problems across unknown maneuvers. The TFCs are appended to a sequential filter as an adjoint state to compensate for unknown perturbing accelerations, and the modified filter estimates the satellite state and thrust coefficients by processing OD across the time of an unknown maneuver. This modified sequential filter with TFCs is capable of fitting tracking data and maintaining an OD solution in the presence of unknown maneuvers. The modified filter is also found effective in detecting a sudden change in TFC values, which indicates a maneuver. In order to illustrate that the event representation approach with TFCs is robust and sufficiently general to be easily adjustable, different types of measurement data are processed with the filter in a realistic LEO setting. Further, cases with mis-modeling of non-gravitational forces are included in our study to verify the versatility and efficiency of the presented algorithm. Simulation results show that the modified sequential filter with TFCs can detect and estimate the orbit and thrust parameters in the presence of unknown maneuvers with or without measurement data during maneuvers.

  5. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    NASA Astrophysics Data System (ADS)

    Shimojo, M.

    2012-08-01

    The high spatial and time resolution data obtained by the telescopes aboard Hinode revealed new, interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and a polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
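The abstract does not give implementation details, but the underlying idea of optical flow, estimating the displacement of image features between successive frames, can be illustrated with a toy pure-Python block matcher. OpenCV's dense and sparse flow estimators (e.g. Farnebäck, Lucas-Kanade) are far more sophisticated; this is only a conceptual stand-in.

```python
def estimate_shift(frame_a, frame_b, max_shift=2):
    """Toy displacement estimator: brute-force search for the integer
    (dy, dx) shift that minimizes mean squared error between frames,
    i.e. the feature at a[y][x] is assumed to reappear at b[y+dy][x+dx]."""
    h, w = len(frame_a), len(frame_a[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:   # compare overlap only
                        err += (frame_a[y][x] - frame_b[yy][xx]) ** 2
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

Dividing the recovered shift by the frame interval gives an apparent velocity in pixels per unit time.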

  6. Event detection and localization for small mobile robots using reservoir computing.

    PubMed

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments. PMID:18662855

  7. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Banerjee, Torsha

    Detection (PERD) for WSNs. When a single event occurs, a child of the tree sends a Flagged Polynomial (FP) to its parent if the readings approximated by it fall outside the data range defining the existing phenomenon. After the aggregation process is over, the root, holding the two polynomials P and FP, can be queried for FP (approximating the new event region) instead of flooding the whole network. For multiple such events, instead of computing a polynomial corresponding to each new event, areas with the same data range are combined by the corresponding tree nodes and the aggregated coefficients are passed on. Results reveal that a new event can be detected by PERD while the detection error remains constant and below a threshold of 10%. As the node density increases, accuracy and delay for event detection remain almost constant, making PERD highly scalable. Whenever an event occurs in a WSN, data is generated by nearby sensors, and relaying the data to the base station (BS) makes sensors closer to the BS run out of energy at a much faster rate than sensors in other parts of the network. This gives rise to an unequal distribution of residual energy in the network and makes those sensors with lower remaining energy die at a much faster rate than others. We propose a scheme for enhancing network lifetime using mobile cluster heads (CHs) in a WSN. To keep the remaining energy more evenly distributed, some energy-rich nodes are designated as CHs, which move in a controlled manner towards sensors rich in energy and data. This eliminates the multihop transmission required by the static sensors and thus increases the overall lifetime of the WSN. We combine the idea of clustering and mobile CHs to first form clusters of static sensor nodes; a collaborative strategy among the CHs further increases the lifetime of the network. The time taken for transmitting data to the BS is reduced further by making the CHs follow a connectivity strategy that always maintains a connected path to the BS.

  8. Possible Detection of Volcanic Activity on Europa: Analysis of An Optical Transient Event

    NASA Astrophysics Data System (ADS)

    de La Fuente Marcos, R.; Nissar, A.

    2002-06-01

    Europa's low crater density suggests that geological activity has continued to the present epoch, leading to the possibility that current resurfacing events might be detectable. CCD observations were carried out with an ST-6 camera at the 0.5 m Mons Cassegrain telescope (Izaña Observatory, Tenerife, Canary Islands, Spain) during the night of 2-3 October 1999. Our images show a transient bright feature on the Galilean satellite. These images are analyzed here with the purpose of understanding the nature of the transient phenomenon, as it could be the result of explosive venting on the surface of the Jovian satellite. By comparison, we use NASA Infrared Telescope Facility images of two Io hot spots taken on 12 October 1990. Although we mainly restrict our discussion to a possible eruptive nature of the observed spots, we also consider other alternative mechanisms able to produce bright events. In particular, an interaction between charged material being ejected from Europa and the Jovian magnetosphere cannot be entirely ruled out. If confirmed, this result would lend support to the existence of active resurfacing on Europa.

  9. Fusion of waveform events and radionuclide detections with the help of atmospheric transport modelling

    NASA Astrophysics Data System (ADS)

    Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry

    2016-04-01

    Possibilities of associating information from the four pillars constituting the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have long been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Based on a concept of overlaying waveform events with the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate the various time scales, a dynamic product is envisioned that evolves while the results of the different technologies are being processed and compiled. The product would be available through the Secure Web Portal (SWP). In this presentation we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.

  10. Using AHRQ patient safety indicators to detect postdischarge adverse events in the Veterans Health Administration.

    PubMed

    Mull, Hillary J; Borzecki, Ann M; Chen, Qi; Shin, Marlena H; Rosen, Amy K

    2014-01-01

    Patient safety indicators (PSIs) use inpatient administrative data to flag cases with potentially preventable adverse events (AEs) attributable to hospital care. This study explored how many AEs the PSIs identified in the 30 days post discharge. PSI software was run on Veterans Health Administration 2003-2007 administrative data for 10 recently validated PSIs. Among PSI-eligible index hospitalizations not flagged with an AE, this study evaluated how many AEs occurred within 1 to 14 and 15 to 30 days post discharge using inpatient and outpatient administrative data. Considering all PSI-eligible index hospitalizations, 11 141 postdischarge AEs were identified, compared with 40 578 inpatient-flagged AEs. More than 60% of postdischarge AEs were detected within 14 days of discharge. The majority of postdischarge AEs were decubitus ulcers and postoperative pulmonary embolisms or deep vein thromboses. Extending PSI algorithms to the postdischarge period may provide a more complete picture of hospital quality. Future work should use chart review to validate postdischarge PSI events. PMID:23939485

  11. Applying a New Event Detection Algorithm to an Ocean Bottom Seismometer Dataset Recorded Offshore Southern California

    NASA Astrophysics Data System (ADS)

    Bishop, J.; Kohler, M. D.; Bunn, J.; Chandy, K. M.

    2015-12-01

    A number of active southern California offshore faults are capable of M>6 earthquakes, and the only permanent Southern California Seismic Network stations that can contribute to ongoing, small-magnitude earthquake detection and location are those located on the coastline and islands. To obtain a more detailed picture of the seismicity of the region, an array of 34 ocean bottom seismometers (OBSs) was deployed to record continuous waveform data off the coast of Southern California for 12 months (2010-2011) as part of the ALBACORE (Asthenospheric and Lithospheric Broadband Architecture from the California Offshore Region Experiment) project. To obtain a local event catalog based on OBS data, we make use of a newly developed data processing platform based on Python. The data processing procedure comprises a multi-step analysis that starts with the identification of significant signals above the time-adjusted noise floor for each sensor. This is followed by a time-dependent statistical estimate of the likelihood of an earthquake based on the aggregated signals in the array. For periods with elevated event likelihood, an adaptive grid-fitting procedure is used that yields candidate earthquake hypocenters with confidence estimates that best match the observed sensor signals. The results are validated with synthetic travel times and manual picks. Using results from ALBACORE, we have created a more complete view of active faulting in the California Borderland.
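The adaptive grid-fitting step described above can be illustrated with a minimal grid search that scores each candidate epicenter by its travel-time residuals, eliminating the unknown origin time analytically as the mean residual. The homogeneous velocity model and the function names are our own simplifications, not the ALBACORE pipeline's.

```python
import math

def locate(stations, arrivals, v, grid):
    """Grid search for the source location minimizing squared travel-time
    residuals, assuming a uniform wave speed v."""
    def misfit(src):
        preds = [math.dist(src, s) / v for s in stations]       # travel times
        t0 = sum(a - p for a, p in zip(arrivals, preds)) / len(arrivals)
        return sum((a - t0 - p) ** 2 for a, p in zip(arrivals, preds))
    return min(grid, key=misfit)
```

In the real processing chain this search runs only during periods of elevated event likelihood, and the residual distribution at the best grid node supplies the confidence estimate.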

  12. Application of remote sensing in coastal change detection after the tsunami event in Indonesia

    NASA Astrophysics Data System (ADS)

    Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Saleh, N. Mohd.; Surbakti, M. S.

    2008-10-01

    Shoreline mapping and shoreline change detection are critical in many coastal zone applications. This study focuses on applying remote sensing technology to identify and assess coastal changes in Banda Aceh, Indonesia. Major changes to land cover were found along the coastline. Using remote sensing data to detect coastline change requires high spatial resolution data. In this study, two 30 m resolution Landsat TM images, captured before and after the tsunami event, were acquired for this purpose. The two satellite images were overlaid to compare pre- and post-tsunami conditions. The two Landsat TM images were also used to generate land cover classification maps for 24 December 2004 and 27 March 2005, before and after the tsunami event respectively. Standard supervised classifiers were applied to the satellite images: Maximum Likelihood, Minimum Distance-to-Mean and Parallelepiped. High overall accuracy (>80%) and a high Kappa coefficient (>0.80) were achieved by the Maximum Likelihood classifier in this study. Damaged areas were estimated from the difference between the two classified land cover maps. Visible damage could be seen in the before-and-after image pair. The visibly damaged land areas were delineated using the polygon tool included in the PCI Geomatica image processing software; the final set of polygons contains the major changes in the coastline. An overview of the coastline changes derived from the Landsat TM images is also presented. This study provides useful information that helps local decision makers make better planning and land management choices.

  13. Automated Visual Event Detection, Tracking, and Data Management System for Cabled- Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Schlining, B.; Raymond, E.

    2008-12-01

    Ocean observatories and underwater video surveys have the potential to unlock important discoveries with new and existing camera systems. Yet the burden of video management and analysis often requires reducing the amount of video recorded through time-lapse video or similar methods. It is unknown how many digitized video data sets exist in the oceanographic community, but we suspect that many remain under-analyzed due to a lack of good tools or human resources to analyze the video. To help address this problem, the Automated Visual Event Detection (AVED) software and the Video Annotation and Reference System (VARS) have been under development at MBARI. For detecting interesting events in the video, the AVED software has been developed over the last 5 years. AVED is based on a neuromorphic selective-attention algorithm modeled on the human vision system. Frames are decomposed into specific feature maps that are combined into a unique saliency map. This saliency map is then scanned to determine the most salient locations. The candidate salient locations are then segmented from the scene using algorithms suitable for the low, non-uniform light and marine snow typical of deep underwater video. For managing the AVED descriptions of the video, the VARS system provides an interface and database for describing, viewing, and cataloging the video. VARS was developed by MBARI for annotating deep-sea video data and is currently being used to describe over 3000 dives by its remotely operated vehicles (ROVs), making it well suited to this deepwater observatory application with only a few modifications. To meet the compute- and data-intensive job of video processing, a distributed heterogeneous network of computers is managed using the Condor workload management system. This system manages data storage, video transcoding, and AVED processing. 
Looking to the future, we see high-speed networks and Grid technology as an important element in addressing the problem of processing and

  14. Detection of events of public health importance under the international health regulations: a toolkit to improve reporting of unusual events by frontline healthcare workers

    PubMed Central

    2011-01-01

    Background The International Health Regulations (IHR (2005)) require countries to notify WHO of any event which may constitute a public health emergency of international concern. This notification relies on reports of events occurring at the local level reaching the national public health authorities. By June 2012 WHO member states are expected to have implemented the capacity to "detect events involving disease or death above expected levels for the particular time and place" on the local level and report essential information to the appropriate level of public health authority. Our objective was to develop tools to assist European countries improve the reporting of unusual events of public health significance from frontline healthcare workers to public health authorities. Methods We investigated obstacles and incentives to event reporting through a systematic literature review and expert consultations with national public health officials from various European countries. Multi-day expert meetings and qualitative interviews were used to gather experiences and examples of public health event reporting. Feedback on specific components of the toolkit was collected from healthcare workers and public health officials throughout the design process. Results Evidence from 79 scientific publications, two multi-day expert meetings and seven qualitative interviews stressed the need to clarify concepts and expectations around event reporting in European countries between the frontline and public health authorities. An analytical framework based on three priority areas for improved event reporting (professional engagement, communication and infrastructure) was developed and guided the development of the various tools. 
We developed a toolkit adaptable to country-specific needs that includes a guidance document for IHR National Focal Points and nine tool templates targeted at clinicians and laboratory staff: five awareness campaign tools, three education and training tools, and

  15. Beat-by-Beat Quantification of Cardiac Cycle Events Detected From Three-Dimensional Precordial Acceleration Signals.

    PubMed

    Paukkunen, Mikko; Parkkila, Petteri; Hurnanen, Tero; Pankaala, Mikko; Koivisto, Tero; Nieminen, Tuomo; Kettunen, Raimo; Sepponen, Raimo

    2016-03-01

    The vibrations produced by the cardiovascular system that are coupled to the precordium can be noninvasively detected using accelerometers. This technique is called seismocardiography. Although clinical applications have been proposed for seismocardiography, the physiology underlying the signal is still not clear. The relationship between seismocardiograms measured on the back-to-front axis and cardiac events is fairly well known. However, the 3-D seismocardiograms detectable with modern accelerometers have not been quantified in terms of cardiac cycle events. A major reason for this might be the degree of intersubject variability observed in 3-D seismocardiograms. We present a method to quantify 3-D seismocardiography in terms of cardiac cycle events. First, cardiac cycle events are identified from the seismocardiograms and then assigned a number based on the location in which the corresponding event was found. In total, 396 cardiac cycle events from 9 healthy subjects and 120 cardiac cycle events from patients suffering from atrial flutter were analyzed. Despite the weak intersubject correlation of the waveforms (0.05, 0.27, and 0.15 for the x-, y-, and z-axes, respectively), the present method managed to find latent similarities in the seismocardiograms of healthy subjects. We observed that in healthy subjects the distribution of cardiac cycle event coordinates was centered on specific locations; these locations were different in patients with atrial flutter. The results suggest that the spatial distribution of seismocardiographic cardiac cycle events might be used to discriminate between healthy individuals and those with a failing heart. PMID:25594987

  16. Using GPS to Rapidly Detect and Model Earthquakes and Transient Deformation Events

    NASA Astrophysics Data System (ADS)

    Crowell, Brendan W.

    The rapid modeling and detection of earthquakes and transient deformation is a problem of extreme societal importance for earthquake early warning and rapid hazard response. To date, GPS data is not used in earthquake early warning or rapid source modeling even in Japan or California where the most extensive geophysical networks exist. This dissertation focuses on creating algorithms for automated modeling of earthquakes and transient slip events using GPS data in the western United States and Japan. First, I focus on the creation and use of high-rate GPS and combined seismogeodetic data for applications in earthquake early warning and rapid slip inversions. Leveraging data from earthquakes in Japan and southern California, I demonstrate that an accurate magnitude estimate can be made within seconds using P wave displacement scaling, and that a heterogeneous static slip model can be generated within 2-3 minutes. The preliminary source characterization is sufficiently robust to independently confirm the extent of fault slip used for rapid assessment of strong ground motions and improved tsunami warning in subduction zone environments. Secondly, I investigate the automated detection of transient slow slip events in Cascadia using daily positional estimates from GPS. Proper geodetic characterization of transient deformation is necessary for studies of regional interseismic, coseismic and postseismic tectonics, and miscalculations can affect our understanding of the regional stress field. I utilize the relative strength index (RSI) from financial forecasting to create a complete record of slow slip from continuous GPS stations in the Cascadia subduction zone between 1996 and 2012. I create a complete history of slow slip across the Cascadia subduction zone, fully characterizing the timing, progression, and magnitude of events. Finally, using a combination of continuous and campaign GPS measurements, I characterize the amount of extension, shear and subsidence in the
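The relative strength index mentioned above is a bounded momentum measure borrowed from financial forecasting; a minimal sketch of using it to flag sustained motion in a daily GPS position series follows. The window length and threshold here are illustrative, not the values used in the dissertation.

```python
def rsi(series, n=14):
    """Relative Strength Index over a trailing window of n one-step changes.
    Values near 100 indicate sustained gains; near 0, sustained losses.
    (A flat window is reported as 100 in this simple variant.)"""
    out = [None] * len(series)
    for i in range(n, len(series)):
        deltas = [series[k] - series[k - 1] for k in range(i - n + 1, i + 1)]
        gain = sum(d for d in deltas if d > 0) / n
        loss = -sum(d for d in deltas if d < 0) / n
        out[i] = 100.0 if loss == 0 else 100.0 - 100.0 / (1.0 + gain / loss)
    return out


def flag_transients(series, n=14, threshold=80.0):
    """Hypothetical detector: indices where RSI exceeds a threshold, e.g. a
    sustained reversal of interseismic motion during a slow slip event."""
    return [i for i, r in enumerate(rsi(series, n))
            if r is not None and r > threshold]
```

Applied to daily station positions, windows of persistently one-sided motion push the RSI toward one bound, which is what makes it usable as a slow-slip trigger.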

  17. PREFACE: Fourth Symposium on Large TPCs for Low Energy Rare Event Detection

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Colas, Paul; Giomataris, Ioannis

    2009-07-01

    The Fourth International Symposium on large TPCs for low-energy rare-event detection was held at the Hermite auditorium of the Institut Henri Poincaré, 11 rue Pierre et Marie Curie, Paris, on 18-19 December 2008. As with the previous editions of the meeting, always held in Paris (2002, 2004 and 2006), it gathered a significant community of physicists involved in rare event searches and/or the development of time projection chambers (TPCs). The purpose of the meeting was to present and discuss the status of current experiments or projects involving the use of large TPCs for the search for rare events, such as low-energy neutrinos, double beta decay, dark matter or axions, as well as to discuss new results and ideas arising from the latest developments in Micro Pattern Gaseous Detectors (MPGDs), and how these are being - or could be - applied to the mentioned searches. The rapid evolution of these devices and the relevance of their latest results need to be efficiently transferred to the rare event community. This series of meetings was created with the motivation of bringing both bodies of know-how together, and it is proving to be a fruitful area of collaboration. Once more, the format of the meeting proved to be a success: a short (2-day) and relatively informal program with recent highlighted results, rather than exhaustive reviews, attracted the interest of the audience. The symposium, the fourth of the series, is becoming consolidated as a regular meeting place for the synergistic interplay between the fields of rare events and TPC development. Apart from the usual topics central to the conference subject, such as the status of low-energy neutrino physics and double beta decay experiments, dark matter experiments (and, in general, physics in underground laboratories), axion searches, or development results, every year the conference programme is enriched with original, slightly off-topic contributions that trigger curiosity and stimulate further thought.

  18. Improvement of spectral density-based activation detection of event-related fMRI data.

    PubMed

    Ngan, Shing-Chung; Hu, Xiaoping; Tan, Li-Hai; Khong, Pek-Lan

    2009-09-01

    For event-related data obtained from an experimental paradigm with a periodic design, spectral density at the fundamental frequency of the paradigm has been used as a template-free activation detection measure. In this article, we build and expand upon this detection measure to create an improved, integrated measure. Such an integrated measure linearly combines information contained in the spectral densities at the fundamental frequency as well as the harmonics of the paradigm and in a spatial correlation function characterizing the degree of co-activation among neighboring voxels. Several figures of merit are described and used to find appropriate values for the coefficients in the linear combination. Using receiver-operating characteristic analysis on simulated functional magnetic resonance imaging (fMRI) data sets, we quantify and validate the improved performance of the integrated measure over the spectral density measure based on the fundamental frequency as well as over some other popular template-free data analysis methods. We then demonstrate the application of the new method on an experimental fMRI data set. Finally, several extensions to this work are suggested. PMID:19535208

  19. Detecting concealed information using feedback related event-related brain potentials.

    PubMed

    Sai, Liyang; Lin, Xiaohong; Hu, Xiaoqing; Fu, Genyue

    2014-10-01

    Employing an event-related potential (ERP)-based concealed information test (CIT), the present study investigated (1) the neurocognitive processes when people received feedbacks regarding their deceptive/truthful responses and (2) whether such feedback-related ERP activities can be used to detect concealed information above and beyond the recognition-related P300. During the CIT, participants were presented with rare, meaningful probes (their own names) embedded within a series of frequent yet meaningless irrelevants (others' names). Participants were instructed to deny their recognition of the probes. Critically, following participants' responses, they were provided with feedbacks regarding whether they succeeded or failed in the CIT. Replicating previous ERP-based CITs, we found a larger P300 elicited by probe compared to irrelevant. Regarding feedback-related ERPs, a temporospatial Principle Component Analyses found two ERP components that were not only sensitive to feedback manipulations but also can discriminate probe from irrelevant: an earlier, central-distributed positivity that was elicited by "success" feedbacks peaked around 219ms; and a later, right central-distributed positivity that was also elicited by "success" feedbacks, peaked around 400ms. Importantly, the feedback ERPs were not correlated with P300 that was elicited by probe/irrelevant, suggesting that these two ERPs reflect independent processes underlying memory concealment. These findings illustrate the feasibility and promise of using feedback-related ERPs to detect concealed memory and thus deception. PMID:25058495

  20. Fully Autonomous Multiplet Event Detection: Application to Local-Distance Monitoring of Blood Falls Seismicity

    SciTech Connect

    Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.

    2015-06-18

We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded at the terminus of Taylor Glacier, Antarctica, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and a noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls that is based on icequake detections.
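A minimal sketch of such a detector chain follows, assuming an STA/LTA-style power ratio as the noise-adaptive detector and a greedy complete-linkage grouping by normalized correlation. The window lengths, thresholds, and function names are illustrative assumptions, not the paper's values.

```python
import numpy as np

def sta_lta(x, ns=10, nl=200):
    """Short-term / long-term average power ratio. The long-term window
    acts as a running noise estimate, making the detector noise-adaptive."""
    p = x ** 2
    sta = np.convolve(p, np.ones(ns) / ns, mode="same")
    lta = np.convolve(p, np.ones(nl) / nl, mode="same") + 1e-12
    return sta / lta

def complete_link_clusters(snippets, cc_min=0.9):
    """Greedy complete-linkage grouping: a snippet joins a cluster only if
    its normalized correlation with EVERY current member exceeds cc_min."""
    clusters = []
    for s in snippets:
        s = s / (np.linalg.norm(s) + 1e-12)
        for c in clusters:
            if all(float(np.dot(s, m)) >= cc_min for m in c):
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters
```

Waveform snippets cut around STA/LTA triggers would then be fed to the clustering step, and clusters with two or more members form candidate multiplet sequences.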

  1. Localized Detection of Frozen Precipitation Events and the Rain/Snow Transition

    NASA Astrophysics Data System (ADS)

    Strachan, S.

    2014-12-01

Frozen precipitation in mid-latitude and semi-arid environments frequently serves a crucial role in the annual water budget. Often occurring along elevational gradients, the rain/snow transition (or "snow line") in mountain systems determines the amount of water which enters the system slowly during melt phases, as opposed to rain, which immediately infiltrates or runs off to lower elevations. This in turn influences the location and composition of ecological communities such as conifer forests, as well as the timing and nature of the entire mountain block's annual hydrologic cycle. Characterization of the rain/snow transition is becoming a priority in mountainous semi-arid regions, as increasing human populations and repeated drought episodes combine to create water shortages. The atmospheric conditions (temperature and relative humidity) which signal the rain/snow transition have been described, but variability within the conditions window can create error in estimating the true areal cover of frozen versus liquid precipitation. In populated, flood-prone regions, radar installations specifically tuned to the detection of the "bright band" transition elevation can be deployed; however, these cannot be permanently installed at remote, solar-power-dependent climate stations or at fine geographical scale. Characterization of current trends in the rain/snow transition can be made using automated weather stations placed along the elevational gradient, equipped with sensors for high-frequency (e.g. 1-10 minute) measurement of air temperature, relative humidity, liquid precipitation, and precipitation mass. Visual validation of precipitation modes detected through automated means is performed using time-series records from digital cameras placed at each station.
Refinements of geographically-explicit relationships of atmospheric conditions to precipitation mode can be made over time, as can detection of seasonally anomalous but eco-hydrologically significant frozen precipitation events.
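A common way to turn the temperature/relative-humidity window described above into a per-interval phase estimate is a wet-bulb-temperature threshold. The sketch below uses Stull's (2011) empirical wet-bulb approximation; the 1.0 °C threshold is an illustrative assumption, not a value from this abstract.

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) empirical wet-bulb temperature (deg C), valid roughly
    for -20..50 C and RH 5..99 %."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def precip_phase(t_c, rh_pct, tw_thresh=1.0):
    """Hypothetical rain/snow classifier: snow when the wet-bulb
    temperature is at or below tw_thresh (threshold is an assumption)."""
    return "snow" if wet_bulb_stull(t_c, rh_pct) <= tw_thresh else "rain"
```

Because the wet-bulb temperature combines temperature and humidity, it tracks the melting of falling hydrometeors better than air temperature alone, which is why it is a popular single-variable discriminator at stations without radar coverage.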

  2. FOREWORD: 3rd Symposium on Large TPCs for Low Energy Event Detection

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Colas, Paul; Gorodetzky, Phillippe

    2007-05-01

The Third International Symposium on Large TPCs for Low-Energy Rare-Event Detection was held at the Carré des sciences (Poincaré auditorium, 25 rue de la Montagne Ste Geneviève, Paris) on 11-12 December 2006. This prestigious venue, belonging to the Ministry of Research, is housed in the former École Polytechnique. The meeting, held in Paris every two years, gathers a significant community of physicists involved in rare-event detection. Its purpose is an extensive discussion of present and future projects using large TPCs for low-energy, low-background detection of rare events (low-energy neutrinos, dark matter, solar axions). The use of a new generation of Micro-Pattern Gaseous Detectors (MPGDs) appears to be a promising way to reach this goal. The program this year was enriched by a new session devoted to the detection challenge of polarized gamma rays, the relevant novel experimental techniques, and their impact on particle physics, astrophysics and astronomy. A very particular feature of this conference is the large variety of talks, ranging from purely theoretical to purely experimental subjects, including novel technological aspects. This allows discussion and exchange of useful information and of new ideas emerging to address experimental challenges in particle physics. The scientific highlights at the Symposium came on many fronts: • status of low-energy neutrino physics and double-beta decay • new ideas on double-beta decay experiments • gamma-ray polarization measurement combining high-precision TPCs with MPGD read-out • dark matter challenges in both axion and WIMP searches, with new emerging ideas for detection improvements • progress in gaseous and liquid TPCs for rare-event detection. Georges Charpak opened the meeting with a talk on gaseous detectors for applications in the bio-medical field. He also underlined the importance of new MPGD detectors for both physics and applications. There were about 100 registered participants at the symposium. The successful

  3. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
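The mean-square-error minimization at the heart of such a model-based estimator can be illustrated with a one-parameter toy problem: grid-search the echo delay (a proxy for wall position) whose predicted signal best matches the data. The single-echo forward model and all names here are illustrative assumptions, not the patent's algorithm.

```python
import numpy as np

def estimate_delay(data, template):
    """Grid search over delay: predict the received signal as the template
    placed at each candidate delay, and return the delay minimizing the
    mean-square error between prediction and data."""
    best_d, best_mse = 0, np.inf
    for d in range(len(data) - len(template) + 1):
        pred = np.zeros_like(data)
        pred[d:d + len(template)] = template
        mse = np.mean((data - pred) ** 2)
        if mse < best_mse:
            best_d, best_mse = d, mse
    return best_d
```

Layer stripping would then subtract the best-fit echo and repeat on the residual for the next wall, while stacking would average this fit over multiple array looks.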

  4. Automated detection of rare-event pathogens through time-gated luminescence scanning microscopy.

    PubMed

    Lu, Yiqing; Jin, Dayong; Leif, Robert C; Deng, Wei; Piper, James A; Yuan, Jingli; Duan, Yusheng; Huo, Yujing

    2011-05-01

Many microorganisms have a very low threshold (<10 cells) to trigger infectious diseases, and, in these cases, it is important to determine the absolute cell count in a low-cost and speedy fashion. Fluorescence microscopy is a routine method; however, one fundamental problem has been the presence in the sample of large numbers of nontarget particles, which are naturally autofluorescent, thereby obscuring the visibility of target organisms. This severely affects both direct visual inspection and automated microscopy based on computer pattern recognition. We report a novel strategy of time-gated luminescent scanning for accurate counting of rare-event cells, which exploits the large difference in luminescence lifetimes between lanthanide biolabels, >100 μs, and the autofluorescence backgrounds, <0.1 μs, to render background autofluorescence invisible to the detector. Rather than having to resort to sophisticated imaging analysis, the background-free feature allows a single-element photomultiplier to locate rare-event cells, so that requirements for data storage and analysis are minimized to the level of image confirmation only at the final step. We have evaluated this concept in a prototype instrument using a 2D scanning stage and applied it to rare-event Giardia detection labeled by a europium complex. For a slide area of 225 mm², the time-gated scanning method easily reduced the original 40,000 adjacent elements (0.075 mm × 0.075 mm) down to a few "elements of interest" containing the Giardia cysts. We achieved an averaged signal-to-background ratio of 41.2 (minimum ratio of 12.1). Such high contrasts ensured the accurate mapping of all the potential Giardia cysts free of false positives or negatives. This was confirmed by the automatic retrieving and time-gated luminescence bioimaging of these Giardia cysts. Such automated microscopy based on time-gated scanning can provide novel solutions for quantitative diagnostics in advanced

  5. Detection of Rapid Events at Mantle Depth by Future Gravity Missions

    NASA Astrophysics Data System (ADS)

    Ivins, Erik; Watkins, Michael

    2015-04-01

The robust detection of gravity changes associated with relatively shallow subduction zone earthquakes (0-50 km depth co-seismic rupture) has been one of the success stories of the GRACE (e.g., Han et al. 2013, JGR-B, doi:10.1002/jgrb.50116) and GOCE (e.g., Fuchs et al., 2013, JGR-B, doi: 10.1002/jgrb.50381) missions. This surprise is a testament to the sensitivity of the measurement system, for the satellites must map changes in the gravity potential field while flying at orbital altitudes exceeding 400 km (in the case of GRACE). It is clear that these observations contribute to advancing our understanding of large subduction zone earthquakes, if for no other reason than that they allow comprehensive observation over the ocean-covered solid Earth. The observations aid studies of the mass transport associated with both coseismic and post-seismic deformation. The measurement capability of missions proposed to follow GRACE-2 is anticipated to improve by an order of magnitude, or more, in accuracy and resolution (e.g., Wiese et al., 2012, J. Geodesy, doi: 10.1007/s00190-011-0493-8; Elsaka et al. 2014, J. Geodesy, doi: 10.1007/s00190-013-0665-9). Deep subduction zone earthquakes have not been detected, nor have any other non-seismic solid Earth deformations - with the exception of the glacial isostatic adjustment vertical response to the last glacial age. We examine the possibility that earthquakes occurring at, or near, the major transition zone in the mantle should be detectable in the region where mantle phases become unstable and undergo transition to a stable perovskite phase below 660 km depth. The Mw 8.2 1994 Bolivia earthquake and the May 24, 2013 Mw 8.3 earthquake beneath the Sea of Okhotsk, Russia, are prototypes of events that could be studied with future gravity missions.
Observation of gravity changes associated with deep subduction zone earthquakes could provide new clues on the enigmatic questions currently in debate over faulting mechanism (e.g., Zhan et al., 2014, Science

  6. Wavelet based automated postural event detection and activity classification with single imu - biomed 2013.

    PubMed

    Lockhart, Thurmon E; Soangra, Rahul; Zhang, Jian; Wu, Xuefan

    2013-01-01

Mobility characteristics associated with activities of daily living (ADLs) such as sitting down, lying down, rising up, and walking are considered important in maintaining functional independence and a healthy lifestyle, especially for the growing elderly population. Characteristics of postural transitions such as sit-to-stand are widely used by clinicians as a physical indicator of health, and walking is used as an important mobility assessment tool. Many tools have been developed to assist in the assessment of functional levels and to detect a person's activities during daily life. These include questionnaires, observation, diaries, kinetic and kinematic systems, and validated functional tests. These measures are costly and time consuming, rely on subjective patient recall, and may not accurately reflect functional ability in the patient's home. In order to provide a low-cost, objective assessment of functional ability, inertial measurement units (IMUs) based on MEMS technology have been employed to ascertain ADLs. Such measures facilitate long-term monitoring of activities of daily living using wearable sensors. IMU systems are desirable for monitoring human postures since they respond to both the frequency and the intensity of movements and measure both dc (gravitational acceleration vector) and ac (acceleration due to body movement) components at low cost. This has enabled the development of a small, lightweight, portable system that can be worn by a free-living subject without motion impediment – TEMPO (Technology Enabled Medical Precision Observation). Using this IMU system, we acquired indirect measures of biomechanical variables that can be used as an assessment of individual mobility characteristics, with accuracy and recognition rates comparable to modern motion capture systems. In this study, five subjects performed various ADLs, and mobility measures such as postural transitions and gait characteristics were obtained. We developed postural event detection

  7. Detection of air pollution events over Évora-Portugal during 2009

    NASA Astrophysics Data System (ADS)

    Filipa Domingues, Ana; Bortoli, Daniele; Silva, Ana Maria; Kulkarni, Pavan; Antón, Manuel

    2010-05-01

All over the world, polluting industries, traffic and other natural and anthropogenic sources are responsible for air pollution, affecting health as well as climate. Monitoring of air quality in urban and rural regions has become an urgent concern in atmospheric studies due to the impact of global air pollution on climate and on the environment. One piece of evidence for the global character of air pollution is that it not only affects industrialized countries but also reaches less developed countries with polluting gases and particles generated elsewhere. The development and deployment of instruments and techniques to measure the variation of atmospheric trace gases and to perform their monitoring are crucial for the improvement of air quality and the control of pollutant emissions. One instrument able to perform such air quality monitoring is the Spectrometer for Atmospheric TRacers Measurements (SPATRAM), installed at the CGE Observatory in Évora (38.5° N, 7.9° W, 300 m asl). This UV-VIS spectrometer is used to carry out measurements of zenith-scattered radiation (290-900 nm) to retrieve the vertical content of some atmospheric trace gases, such as stratospheric O3 and NO2, using the Differential Optical Absorption Spectroscopy (DOAS) methodology. Although SPATRAM, in its current geometric and operational configuration (zenith-sky viewing, passive-mode measurements), is not able to detect small variations of tracers in the troposphere, it is possible to identify enhancements in pollution loads due to air-mass movements from polluted sites. Despite the fact that Évora is a fairly unpolluted city, in-depth analysis of the DOAS output, namely the quantity of gas (in this case NO2) present along the optical path of the measurements (SCD - Slant Column Density), allows for the detection of unexpected variations in the diurnal NO2 cycle. The SPATRAM data allow the identification of polluting events which

  8. An automated cross-correlation based event detection technique and its application to surface passive data set

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

In studies on heavy oil, shale reservoirs, tight gas, and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days to months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Our algorithm also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and to a field surface passive data set recorded at a geothermal site.

  9. An engineered nano-plasmonic biosensing surface for colorimetric and SERS detection of DNA-hybridization events

    NASA Astrophysics Data System (ADS)

    Heydari, Esmaeil; Thompson, David; Graham, Duncan; Cooper, Jonathan M.; Clark, Alasdair W.

    2015-03-01

We report a versatile nanophotonic biosensing platform that enables both colorimetric detection and enhanced Raman spectroscopy detection of molecular binding events. Through the integration of electron-beam lithography, dip-pen nanolithography and molecular self-assembly, we demonstrate plasmonic nanostructures which change geometry and plasmonic properties in response to molecularly-mediated nanoparticle binding events. These biologically-active nanostructured surfaces hold considerable potential for use as multiplexed sensor platforms for point-of-care diagnostics, and as scaffolds for a new generation of molecularly dynamic metamaterials.

  10. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.

  11. Elastomeric optical fiber sensors and method for detecting and measuring events occurring in elastic materials

    DOEpatents

    Muhs, Jeffrey D.; Capps, Gary J.; Smith, David B.; White, Clifford P.

    1994-01-01

Fiber optic sensing means are provided for the detection and measurement of events, such as dynamic loadings, imposed upon elastic materials including cementitious materials, elastomers, and animal body components, and/or of the attrition of such elastic materials. One or more optical fibers, each having a deformable core and cladding formed of an elastomeric material such as silicone rubber, are embedded in the elastic material. Changes in light transmission through any of the optical fibers, due to the deformation of the optical fiber by the application of dynamic loads such as compression, tension, or bending loadings imposed on the elastic material, or due to the attrition of the elastic material such as by cracking, deterioration, aggregate break-up, and muscle, tendon, or organ atrophy, provide a measurement of the dynamic loadings and attrition. The fiber optic sensors can be embedded in elastomers subject to dynamic loadings and attrition, such as those commonly used in automobiles and in shoes, for determining the amount and frequency of the dynamic loadings and the extent of attrition. The fiber optic sensors are also usable in cementitious material for determining the maturation thereof.

  12. First Satellite-detected Perturbations of Outgoing Longwave Radiation Associated with Blowing Snow Events over Antarctica

    NASA Technical Reports Server (NTRS)

    Yang, Yuekui; Palm, Stephen P.; Marshak, Alexander; Wu, Dong L.; Yu, Hongbin; Fu, Qiang

    2014-01-01

We present the first satellite-detected perturbations of the outgoing longwave radiation (OLR) associated with blowing snow events over the Antarctic ice sheet, using data from the Cloud-Aerosol Lidar with Orthogonal Polarization and Clouds and the Earth's Radiant Energy System. Significant cloud-free OLR differences are observed between clear and blowing snow skies, with the sign and magnitude depending on season and time of day. During nighttime, OLRs are usually larger when blowing snow is present; the average difference in OLR between skies without and with blowing snow over the East Antarctic Ice Sheet is about 5.2 W/m2 for the winter months of 2009. During daytime, in contrast, the OLR perturbation is usually smaller or even of the opposite sign. The observed seasonal variations and day-night differences in the OLR perturbation are consistent with theoretical calculations of the influence of blowing snow on OLR. Detailed atmospheric profiles are needed to quantify the radiative effect of blowing snow from the satellite observations.

  13. PREFACE: 7th International Symposium on Large TPCs for Low-Energy Rare Event Detection

    NASA Astrophysics Data System (ADS)

    Colas, P.; Giomataris, I.; Irastorza, I.; Patzak, Th

    2015-11-01

The seventh "International Symposium on Large TPCs for Low-Energy Rare Event Detection" took place in Paris from the 15th to the 17th of December 2014 at the Institute of Astroparticle Physics (APC) campus - Paris Diderot University. As usual, the conference was organized during the week before Christmas, which seems to be convenient for most people; it is held every two years, and this edition drew almost 120 participants. Many people contributed to the success of the conference, but the organizers would particularly like to thank the management of APC for providing the nice Buffon auditorium and infrastructure. We also acknowledge the valuable support of DSM-Irfu and the University of Zaragoza. The scientific program consisted of plenary sessions including the following topics, with theoretical and experimental lectures: • Low energy neutrino physics • Neutrinoless double beta decay process • Dark matter searches • Axion and especially solar axion searches • Space experiments and gamma-ray polarimetry • New detector R&D and future experiments

  14. A convenient method for detecting electrolyte bridges in multichannel electroencephalogram and event-related potential recordings.

    PubMed

    Tenke, C E; Kayser, J

    2001-03-01

    Dense electrode arrays offer numerous advantages over single channel electroencephalogram/event-related potential (EEG/ERP) recordings, but also exaggerate the influence of common error sources arising from the preparation of scalp placements. Even with conventional low density recordings (e.g. 30-channel Electro-Cap), over-application of electrode gel may result in electrolyte leakage and create low impedance bridges, particularly at vertically-aligned sites (e.g. inferior-lateral). The ensuing electrical short produces an artificial similarity of ERPs at neighboring sites that distorts the ERP topography. This artifact is not immediately apparent in group averages, and may even go undetected after visual inspection of the individual ERP waveforms. Besides adding noise variance to the topography, this error source also has the capacity to introduce systematic, localized artifacts (e.g. add or remove evidence of lateralized activity). Electrolyte bridges causing these artifacts can be easily detected by a simple variant of the Hjorth algorithm (intrinsic Hjorth), in which spatial interelectrode distances are replaced by an electrical analog of distance (i.e. the variances of the difference waveforms for all pairwise combinations of electrodes). When a low impedance bridge exists, the Hjorth algorithm identifies all affected sites as flat lines that are readily distinguishable from Hjorth waveforms at unbridged electrodes. PMID:11222978
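The intrinsic-Hjorth idea — replacing spatial inter-electrode distance by an electrical analog, the variance of the pairwise difference waveform — can be sketched directly. The relative threshold below is an illustrative choice, not a value from the article.

```python
import numpy as np
from itertools import combinations

def bridged_pairs(eeg, rel_thresh=0.01):
    """Flag likely electrolyte bridges: for every pair of channels, compute
    the variance of the difference waveform; a value near zero relative to
    the channels' own variance indicates an electrical short.

    eeg : array of shape (n_channels, n_samples)
    """
    n = eeg.shape[0]
    flagged = []
    for i, j in combinations(range(n), 2):
        d_var = np.var(eeg[i] - eeg[j])                    # electrical "distance"
        ref = 0.5 * (np.var(eeg[i]) + np.var(eeg[j])) + 1e-12
        if d_var / ref < rel_thresh:
            flagged.append((i, j))
    return flagged
```

Bridged channels appear as near-flat difference waveforms, which is exactly the "flat line" signature described in the abstract.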

  15. Block-adaptive filtering and its application to seismic-event detection

    SciTech Connect

    Clark, G.A.

    1981-04-01

    Block digital filtering involves the calculation of a block or finite set of filter output samples from a block of input samples. The motivation for block processing arises from computational advantages of the technique. Block filters take good advantage of parallel processing architectures, which are becoming more and more attractive with the advent of very large scale integrated (VLSI) circuits. This thesis extends the block technique to Wiener and adaptive filters, both of which are statistical filters. The key ingredient to this extension turns out to be the definition of a new performance index, block mean square error (BMSE), which combines the well known sum square error (SSE) and mean square error (MSE). A block adaptive filtering procedure is derived in which the filter coefficients are adjusted once per each output block in accordance with a generalized block least mean-square (BLMS) algorithm. Convergence properties of the BLMS algorithm are studied, including conditions for guaranteed convergence, convergence speed, and convergence accuracy. Simulation examples are given for clarity. Convergence properties of the BLMS and LMS algorithms are analyzed and compared. They are shown to be analogous, and under the proper circumstances, equivalent. The block adaptive filter was applied to the problem of detecting small seismic events in microseismic background noise. The predictor outperformed the world-wide standardized seismograph network (WWSSN) seismometers in improving signal-to-noise ratio (SNR).
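A generalized block LMS update — filtering a whole block with fixed coefficients, then applying one coefficient adjustment from the block-averaged gradient — can be sketched as follows. The step size, block length, and filter order are illustrative, not values from the thesis.

```python
import numpy as np

def blms(x, d, n_taps=4, block=8, mu=0.05):
    """Block LMS adaptive filter: coefficients are updated once per output
    block using the gradient averaged over that block (BLMS-style).

    x : input signal, d : desired signal. Returns (weights, output)."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for s in range(0, len(x) - block + 1, block):
        grad = np.zeros(n_taps)
        for n in range(s, s + block):
            # regression vector [x[n], x[n-1], ...], zero-padded at the start
            u = x[max(0, n - n_taps + 1): n + 1][::-1]
            u = np.pad(u, (0, n_taps - len(u)))
            y[n] = w @ u
            e = d[n] - y[n]
            grad += e * u          # accumulate gradient over the block
        w += mu * grad / block     # single update per block
    return w, y
```

With a stationary input and a small enough step size, this behaves like sample-by-sample LMS but exposes block-level parallelism, which is the computational motivation given in the abstract.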

  16. Real-Time Microbiology Laboratory Surveillance System to Detect Abnormal Events and Emerging Infections, Marseille, France

    PubMed Central

    Abat, Cédric; Chaudet, Hervé; Colson, Philippe; Rolain, Jean-Marc

    2015-01-01

Infectious diseases are a major threat to humanity, and accurate surveillance is essential. We describe how to implement a laboratory data-based surveillance system in a clinical microbiology laboratory. Two historical Microsoft Excel databases were implemented. The data were then sorted and used to execute the following 2 surveillance systems in Excel: the Bacterial real-time Laboratory-based Surveillance System (BALYSES), for monitoring the number of patients infected with bacterial species isolated at least once in our laboratory during the study period; and the Marseille Antibiotic Resistance Surveillance System (MARSS), which surveys the primary β-lactam resistance phenotypes for 15 selected bacterial species. The first historical database contained 174,853 identifications of bacteria, and the second contained 12,062 results of antibiotic susceptibility testing. From May 21, 2013, through June 4, 2014, BALYSES and MARSS enabled the detection of 52 abnormal events for 24 bacterial species, leading to 19 official reports. This system is currently being refined and improved. PMID:26196165

  17. Evaluation of Outbreak Detection Performance Using Multi-Stream Syndromic Surveillance for Influenza-Like Illness in Rural Hubei Province, China: A Temporal Simulation Model Based on Healthcare-Seeking Behaviors

    PubMed Central

    Fan, Yunzhou; Wang, Ying; Jiang, Hongbo; Yang, Wenwen; Yu, Miao; Yan, Weirong; Diwan, Vinod K.; Xu, Biao; Dong, Hengjin; Palm, Lars; Nie, Shaofa

    2014-01-01

Background Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, its performance on outbreak detection, particularly in the case of multi-stream surveillance, has scarcely been evaluated in rural areas. Objective This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. Methods Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behaviors model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, frequency of over-the-counter drug purchases, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. Results In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp <90%). Conclusions The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance. PMID:25409025
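The Susceptible-Exposed-Infectious-Recovered generator behind such simulated outbreaks can be sketched deterministically with simple Euler steps. The rates below are illustrative, not the study's calibrated H1N1 parameters.

```python
def seir(beta, sigma, gamma, s0, e0, i0, r0, days, dt=0.1):
    """Deterministic SEIR epidemic via Euler integration.

    beta  : transmission rate, sigma : 1/latent period,
    gamma : recovery rate. Returns final (S, E, I, R) and peak prevalence."""
    s, e, i, r = float(s0), float(e0), float(i0), float(r0)
    n = s + e + i + r
    peak_i = i
    for _ in range(int(days / dt)):
        new_exp = beta * s * i / n * dt   # S -> E
        new_inf = sigma * e * dt          # E -> I
        new_rec = gamma * i * dt          # I -> R
        s -= new_exp
        e += new_exp - new_inf
        i += new_inf - new_rec
        r += new_rec
        peak_i = max(peak_i, i)
    return s, e, i, r, peak_i
```

The daily incidence from such a run, passed through a healthcare-seeking model (e.g., fixed probabilities of a clinic visit, a drug purchase, or a school absence per case), would yield the multi-stream syndromic counts superimposed on the observed baselines.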

  18. Gait event detection using linear accelerometers or angular velocity transducers in able-bodied and spinal-cord injured individuals.

    PubMed

    Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin

    2006-12-01

We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with able-bodied and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity, or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both able-bodied and spinal-cord injured subjects. PMID:16500102

  19. Discriminating famous from fictional names based on lifetime experience: evidence in support of a signal-detection model based on finite mixture distributions.

    PubMed

    Bowles, Ben; Harlow, Iain M; Meeking, Melissa M; Köhler, Stefan

    2012-01-01

  20. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    This paper describes EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring. The simulator is used in conjunction with a causal model to predict the future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and of the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.

  1. Discriminating Famous from Fictional Names Based on Lifetime Experience: Evidence in Support of a Signal-Detection Model Based on Finite Mixture Distributions

    ERIC Educational Resources Information Center

    Bowles, Ben; Harlow, Iain M.; Meeking, Melissa M.; Kohler, Stefan

    2012-01-01

    It is widely accepted that signal-detection mechanisms contribute to item-recognition memory decisions that involve discriminations between targets and lures based on a controlled laboratory study episode. Here, the authors employed mathematical modeling of receiver operating characteristics (ROC) to determine whether and how a signal-detection…

  2. Unsupervised spatio-temporal detection of brain functional activation based on hidden Markov multiple event sequence models

    NASA Astrophysics Data System (ADS)

    Faisan, Sylvain; Thoraval, Laurent; Armspach, Jean-Paul; Heitz, Fabrice; Foucher, Jack

    2005-04-01

    This paper presents a novel, completely unsupervised fMRI brain mapping approach that addresses the three problems of hemodynamic response function (HRF) shape variability, neural event timing, and fMRI response linearity. To make it robust, the method incorporates spatial and temporal information directly into the core of the activation detection process. In practice, activation detection is formulated in terms of temporal alignment between the sequence of hemodynamic response onsets (HROs) detected in the fMRI signal at a given voxel υ and in the spatial neighbourhood of υ, and the sequence of "off-on" transitions observed in the input blocked stimulation paradigm (when considering epoch-related fMRI data), or the sequence of stimuli of the event-based paradigm (when considering event-related fMRI data). This multiple event sequence alignment problem, which comes under multisensor data fusion, is solved within the probabilistic framework of hidden Markov multiple event sequence models (HMMESMs), a special class of hidden Markov models. Results obtained on real and synthetic data compete with those obtained with the popular statistical parametric mapping (SPM) approach, but without requiring any prior definition of the expected activation patterns, the HMMESM mapping approach being completely unsupervised.
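
    The HMMESM approach above scores how well a sequence of detected onsets aligns with a stimulus sequence under a hidden Markov model. As a minimal illustration of the underlying machinery, here is a generic scaled forward algorithm in numpy; the two-state model and all parameter values are invented for this sketch and are not the authors' HMMESM:

```python
import numpy as np

def forward_log_likelihood(obs, start, trans, emit):
    """Scaled forward algorithm: log P(obs | HMM), summed over state paths."""
    alpha = start * emit[:, obs[0]]
    c = alpha.sum()
    alpha /= c
    log_lik = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # predict, then weight by emission
        c = alpha.sum()                        # rescale to avoid underflow
        alpha /= c
        log_lik += np.log(c)
    return log_lik

# Two hidden states ("baseline", "activated"); symbols 0/1 = HRO absent/present
start = np.array([1.0, 0.0])
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
emit = np.array([[0.9, 0.1],
                 [0.1, 0.9]])

well_aligned = [0, 0, 0, 1, 1, 1]   # matches a block design ("off" then "on")
scrambled = [1, 0, 1, 0, 1, 0]
print(forward_log_likelihood(well_aligned, start, trans, emit) >
      forward_log_likelihood(scrambled, start, trans, emit))   # True
```

    A sequence consistent with the sticky two-state model scores a higher log-likelihood than a scrambled one, which is the basic signal an alignment-based detector exploits.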

  3. Effects of rainfall events on the occurrence and detection efficiency of viruses in river water impacted by combined sewer overflows.

    PubMed

    Hata, Akihiko; Katayama, Hiroyuki; Kojima, Keisuke; Sano, Shoichi; Kasuga, Ikuro; Kitajima, Masaaki; Furumai, Hiroaki

    2014-01-15

    Rainfall events can introduce large amounts of microbial contaminants, including human enteric viruses, into surface water through intermittent discharges from combined sewer overflows (CSOs). The present study aimed to investigate the effect of rainfall events on viral loads in surface waters impacted by CSOs and the reliability of molecular methods for detection of enteric viruses. The reliability of virus detection in the samples was assessed by using process controls for the virus concentration, nucleic acid extraction, and reverse transcription (RT)-quantitative PCR (qPCR) steps, which allowed accurate estimation of virus detection efficiencies. Recovery efficiencies of poliovirus in river water samples collected during rainfall events (<10%) were lower than those during dry weather conditions (>10%). The log10-transformed virus concentration efficiency was negatively correlated with suspended solid concentration (r² = 0.86), which increased significantly during rainfall events. Efficiencies of the DNA extraction and qPCR steps, determined with adenovirus type 5 and a primer-sharing control, respectively, were lower in dry weather. However, no clear relationship was observed between organic water quality parameters and the efficiencies of these two steps. Observed concentrations of indigenous enteric adenoviruses, GII noroviruses, enteroviruses, and Aichi viruses increased during rainfall events even though the virus concentration efficiency was presumed to be lower than in dry weather. The present study highlights the importance of using appropriate process controls to accurately evaluate the concentrations of waterborne enteric viruses in natural waters impacted by wastewater discharge, stormwater, and CSOs. PMID:24064345
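
    The reported negative correlation between suspended solids and log-transformed recovery efficiency can be reproduced in a few lines; the numbers below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical spiked-control data: suspended solids (mg/L) vs poliovirus recovery (%)
ss = np.array([5.0, 12.0, 30.0, 80.0, 150.0, 300.0])
recovery = np.array([45.0, 30.0, 15.0, 6.0, 2.5, 0.9])

log_eff = np.log10(recovery / 100.0)            # log10-transformed efficiency
slope, intercept = np.polyfit(ss, log_eff, 1)   # linear fit
r2 = np.corrcoef(ss, log_eff)[0, 1] ** 2        # coefficient of determination

print(slope < 0)   # True: efficiency drops as suspended solids rise
```

    The sign of the slope, rather than its exact value, is what motivates correcting observed virus concentrations for sample-specific recovery.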

  4. Event detection by feature unpredictability in phase-contrast videos of cell cultures.

    PubMed

    Kandemir, Melih; Rubio, Jose C; Schmidt, Ute; Wojek, Christian; Welbl, Johannes; Ommer, Björn; Hamprecht, Fred A

    2014-01-01

    In this work we propose a novel framework for generic event monitoring in live cell culture videos, built on the assumption that unpredictable observations should correspond to biological events. We use a small set of event-free data to train a multi-output multi-kernel Gaussian process model that operates as an event predictor by performing autoregression on a bank of heterogeneous features extracted from consecutive frames of a video sequence. We show that the prediction error of this model can be used as a probability measure of the presence of relevant events, which can enable users to perform further analysis or monitoring of large-scale non-annotated data. We validate our approach on two phase-contrast sequence data sets containing mitosis and apoptosis events: a new private dataset of human bone cancer (osteosarcoma) cells and a benchmark dataset of stem cells. PMID:25485374
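
    The core idea, using the prediction error of an autoregressive model trained on event-free data as an event score, can be sketched with exact GP regression in numpy. This toy single-feature AR(1) example, with an invented "event" injected as an abrupt jump, stands in for the authors' multi-output multi-kernel model:

```python
import numpy as np

def rbf(a, b, length_scale=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of exact GP regression with an RBF kernel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

# Event-free training sequence: a smooth AR(1) feature trace
rng = np.random.default_rng(0)
feat = np.zeros(300)
for t in range(1, 300):
    feat[t] = 0.9 * feat[t - 1] + 0.1 * rng.standard_normal()

# Test sequence with one injected "event": an abrupt jump at frame 100
test = feat.copy()
test[100] += 3.0

# Autoregression: predict frame t+1 from frame t; large error flags an event
pred = gp_predict(feat[:-1], feat[1:], test[:-1])
err = (pred - test[1:]) ** 2
print(int(np.argmax(err)))   # 99: the step that failed to predict the jump
```

    Frames the model predicts well are deemed event-free; the unpredictable jump dominates the error trace, which is the detection signal.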

  5. A new method to detect anisotropic electron events with SOHO/EPHIN

    NASA Astrophysics Data System (ADS)

    Banjac, Saša; Kühl, Patrick; Heber, Bernd

    2016-07-01

    The EPHIN instrument (Electron Proton Helium INstrument) forms part of the COSTEP experiment (COmprehensive SupraThermal and Energetic Particle Analyzer) within the CEPAC collaboration on board the SOHO spacecraft (SOlar and Heliospheric Observatory). The EPHIN sensor is a stack of six solid-state detectors surrounded by an anti-coincidence. It measures energy spectra of electrons in the range 250 keV to >8.7 MeV, and of hydrogen and helium isotopes in the range 4 MeV/n to >53 MeV/n. In order to improve the isotopic resolution, the first two detectors have been segmented: five segments form a ring enclosing a central segment. This not only allows the measured energy losses to be corrected for the different path lengths through the detectors but also provides an estimate of the arrival direction of the particles with respect to the sensor axis. Utilizing an extensive GEANT4 Monte Carlo simulation of the sensor head, we computed the scattering-induced modifications to the input angular distribution and developed an inversion method whose algorithm is optimized to cope with the poor counting statistics. This improvement makes it possible for the first time to detect long-lasting anisotropies in the 1-3 MeV electron flux with a single telescope on a three-axis stabilized spacecraft. We present the method and its application to several events with strong anisotropies. For validation, we compare our data with the WIND-3DP results.

  6. Vision-Based Finger Detection, Tracking, and Event Identification Techniques for Multi-Touch Sensing and Display Systems

    PubMed Central

    Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang

    2011-01-01

    This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the blob tracking phase associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can then identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
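
    The connected-component step mentioned above is a standard building block; a minimal 4-connected labeling pass over a binary touch-blob mask might look like this (a generic pure-Python sketch, not the paper's implementation):

```python
from collections import deque

def label_blobs(mask):
    """Label 4-connected components in a binary image (list of 0/1 rows)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                count += 1                      # found a new blob: flood-fill it
                queue = deque([(y, x)])
                labels[y][x] = count
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n = label_blobs(mask)
print(n)   # 2 touch blobs
```

    Each labeled component then becomes one candidate touch blob whose centroid can be tracked across frames.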

  7. An innovative methodological approach in the frame of Marine Strategy Framework Directive: a statistical model based on ship detection SAR data for monitoring programmes.

    PubMed

    Pieralice, Francesca; Proietti, Raffaele; La Valle, Paola; Giorgi, Giordano; Mazzolena, Marco; Taramelli, Andrea; Nicoletti, Luisa

    2014-12-01

    The Marine Strategy Framework Directive (MSFD, 2008/56/EC) is focused on protection, preservation and restoration of the marine environment by achieving and maintaining Good Environmental Status (GES) by 2020. Within this context, this paper presents a methodological approach for a fast and repeatable monitoring that allows quantitative assessment of seabed abrasion pressure due to recreational boat anchoring. The methodology consists of two steps: a semi-automatic procedure based on an algorithm for the ship detection in SAR imagery and a statistical model to obtain maps of spatial and temporal distribution density of anchored boats. Ship detection processing has been performed on 36 ASAR VV-pol images of Liguria test site, for the three years 2008, 2009 and 2010. Starting from the pointwise distribution layer produced by ship detection in imagery, boats points have been subdivided into 4 areas where a constant distribution density has been assumed for the entire period 2008-2010. In the future, this methodology will be applied also to higher resolution data of Sentinel-1 mission, specifically designed for the operational needs of the European Programme Copernicus. PMID:25096752

  8. Detection of water-quality contamination events based on multi-sensor fusion using an extended Dempster-Shafer method

    NASA Astrophysics Data System (ADS)

    Hou, Dibo; He, Huimei; Huang, Pingjie; Zhang, Guangxin; Loaiciga, Hugo

    2013-05-01

    This study presents a method for detecting contamination events in sources of drinking water based on Dempster-Shafer (D-S) evidence theory. The detection method has the purpose of protecting water supply systems against accidental and intentional contamination events. This purpose is achieved by first predicting future water-quality parameters using an autoregressive (AR) model. The AR model predicts future water-quality parameters using recent measurements of these parameters made with automated (on-line) water-quality sensors. Next, a probabilistic method assigns probabilities to the time series of residuals formed by comparing predicted water-quality parameters with threshold values. Finally, the D-S fusion method searches for anomalous probabilities of the residuals and uses the result of that search to determine whether the current water quality is normal (that is, free of pollution) or contaminated. The D-S fusion method is extended and improved in this paper by weighted averaging of water-contamination evidence and by analysis of the persistence of anomalous probabilities of water-quality parameters. The extended D-S fusion method makes determinations that have a high probability of being correct concerning whether or not a source of drinking water has been contaminated. The method was tested with water-quality time series from automated (on-line) water-quality sensors, and, in addition, on a small-scale experimental water-pipe network. The two tests demonstrated that the extended D-S fusion method achieves a low false-alarm rate and high probabilities of detecting water-contamination events.
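
    Dempster's rule of combination, the core of the D-S fusion step, is compact enough to sketch directly. The mass assignments below are invented for two hypothetical sensor channels over the frame {normal N, contaminated C}; this is the classic rule, not the paper's weighted extension:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination over a common frame of discernment."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict                      # normalise out conflicting mass
    return {s: v / k for s, v in combined.items()}

N, C = frozenset({'N'}), frozenset({'C'})
theta = N | C                               # "unsure": either hypothesis

m_ph = {C: 0.6, N: 0.1, theta: 0.3}         # hypothetical pH-residual evidence
m_cl = {C: 0.5, N: 0.2, theta: 0.3}         # hypothetical chlorine-residual evidence
fused = dempster_combine(m_ph, m_cl)
print(round(fused[C], 3))   # 0.759
```

    Two weakly contaminated verdicts reinforce each other after fusion, which is exactly the behaviour that lets multi-sensor fusion lower the false-alarm rate relative to any single channel.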

  9. The necessity of recognizing all events in X-ray detection.

    PubMed

    Papp, T; Maxwell, J A; Papp, A T

    2010-01-01

    In our work studying the properties of inner-shell ionization, we have been troubled by the large and unexplained scatter in the experimental data used to determine the basic parameters of X-ray physics. As we looked into the problems we found that many of them contradict simple logic, elementary arithmetic, and even parity and angular-momentum conservation laws. We have identified that the main source of the problems, other than the human factor, is rooted in the signal-processing electronics. To overcome these problems we have developed a fully digital signal processor, which not only has excellent resolution and line shape, but also allows a proper accounting of all events. This is achieved by processing all events and separating them into two or more spectra (maximum 16), where the first spectrum is the accepted or good spectrum and the second is the spectrum of all rejected events. The availability of all the events allows one to see the other part of the spectrum. To our surprise, this total information explains many of the shortcomings and contradictions of the X-ray database. The data-processing methodology cannot be established on the partial and fractional information offered by other approaches, and comparing Monte Carlo detector modeling results with partial spectra is ambiguous. This suggests that the metrology of calibration by radioactive sources, as well as other X-ray measurements, could be improved by a proper accounting of all events. It is not enough to know that an event was rejected and to increment the input counter; it is necessary to know what was rejected and why: whether it was noise, a disturbed event, a retarded event, a true event, or any pile-up combination of these. Such information is supplied by our processor, which reports the events rejected by each discriminator in separate spectra. Several industrial applications of this quality-assurance-capable signal processor are presented. PMID:19910204

  10. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier

    PubMed Central

    Kambhampati, Satya Samyukta; Singh, Vishal; Ramkumar, Barathram

    2015-01-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%. PMID:26609414
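
    The cumulant features are straightforward to compute from zero-meaned ACC samples using the standard moment-to-cumulant formulas. In the sketch below the simulated "walk" and "fall" signals are invented and the SVM stage is omitted; it only shows why a short asymmetric impact stands out in the odd fifth-order cumulant:

```python
import numpy as np

def cumulants(x):
    """Sample cumulants k2..k5 of a zero-meaned 1-D signal."""
    x = np.asarray(x, float)
    x = x - x.mean()
    m = {p: np.mean(x ** p) for p in (2, 3, 4, 5)}
    return {'k2': m[2],
            'k3': m[3],
            'k4': m[4] - 3 * m[2] ** 2,
            'k5': m[5] - 10 * m[3] * m[2]}

def signal_magnitude_vector(ax, ay, az):
    """Per-sample magnitude of a triaxial ACC signal."""
    return np.sqrt(ax ** 2 + ay ** 2 + az ** 2)

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
walk = np.sin(t)                            # periodic, symmetric activity
fall = np.zeros(256)
fall[100:105] = [3.0, 6.0, 9.0, 6.0, 3.0]   # short one-sided impact transient

# The impact's asymmetry shows up strongly in the fifth-order cumulant
print(abs(cumulants(fall)['k5']) > abs(cumulants(walk)['k5']))   # True
```

    A symmetric periodic signal has near-zero odd cumulants, while an impact transient does not, which is why the fifth-order cumulant is a discriminative fall feature.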

  11. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    PubMed

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers in effectively detecting and classifying different types of fall and non-fall events. It was discovered that the first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%. PMID:26609414

  12. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    SciTech Connect

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; Brogan, Ronald

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
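
    The grid-search step can be sketched in one dimension; the geometry, velocity, and Gaussian form of the station-specific fitness below are all invented for illustration:

```python
import numpy as np

# Hypothetical 1-D geometry: candidate nodes, stations, constant P velocity
grid = np.linspace(0.0, 100.0, 101)        # candidate event locations (km)
stations = np.array([10.0, 50.0, 90.0])    # station positions (km)
v = 6.0                                    # P-wave velocity (km/s)

def fitness(node, arrivals, origin_time, sigma=0.5):
    """Sum of station-specific Gaussian fitness values at one grid node."""
    predicted = origin_time + np.abs(stations - node) / v
    resid = arrivals - predicted
    return np.exp(-0.5 * (resid / sigma) ** 2).sum()

true_node, t0 = 30.0, 5.0
arrivals = t0 + np.abs(stations - true_node) / v   # noise-free arrivals

scores = np.array([fitness(n, arrivals, t0) for n in grid])
best = grid[np.argmax(scores)]

# Associate arrivals consistent with the winning node, then remove them
resid = arrivals - (t0 + np.abs(stations - best) / v)
associated = np.abs(resid) < 1.5                   # within 3 sigma
print(best, associated.all())   # 30.0 True
```

    After associating and removing these arrivals, the same search would be repeated over the remaining picks until no node exceeds the minimal fitness value.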

  13. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    DOE PAGESBeta

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; Brogan, Ronald

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.

  14. Breath-by-breath detection of apneic events for OSA severity estimation using non-contact audio recordings.

    PubMed

    Rosenwein, T; Dafna, E; Tarasiuk, A; Zigel, Y

    2015-08-01

    Obstructive sleep apnea (OSA) is a prevalent sleep disorder characterized by recurrent episodes of upper airway obstruction during sleep. We hypothesize that breath-by-breath audio analysis of the respiratory cycle (i.e., the inspiration and expiration phases) during sleep can reliably estimate the apnea-hypopnea index (AHI), a measure of OSA severity. The AHI is calculated as the average number of apnea (A)/hypopnea (H) events per hour of sleep. Audio recordings of 186 adults referred for OSA diagnosis were acquired under in-laboratory and at-home conditions during polysomnography and WatchPat studies, respectively. A/H events were automatically segmented and classified using a binary random forest classifier. A total accuracy rate of 86.3% and an agreement of κ=42.98% were achieved in A/H event detection. A correlation of r=0.87 (r=0.74), a diagnostic agreement of 76% (81.7%), and an average absolute AHI error of 7.4 (7.8) events/hour were achieved under in-laboratory (at-home) conditions, respectively. Here we provide evidence that A/H events can be reliably detected at their exact time locations during sleep using a non-contact audio approach. This study highlights the potential of this approach to reliably evaluate the AHI in at-home conditions. PMID:26738073
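
    The AHI itself is a simple rate. A sketch with the conventional clinical severity cut-offs (assumed here; they are not stated in the abstract):

```python
def apnea_hypopnea_index(n_apnea, n_hypopnea, sleep_hours):
    """Average number of apnea/hypopnea events per hour of sleep."""
    return (n_apnea + n_hypopnea) / sleep_hours

def severity(ahi):
    # Conventional cut-offs: <5 normal, 5-15 mild, 15-30 moderate, >=30 severe
    if ahi < 5:
        return 'normal'
    if ahi < 15:
        return 'mild'
    if ahi < 30:
        return 'moderate'
    return 'severe'

ahi = apnea_hypopnea_index(20, 22, 6.0)
print(ahi, severity(ahi))   # 7.0 mild
```

    An average absolute AHI error of about 7 events/hour, as reported above, therefore matters most near these category boundaries.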

  15. IRcall and IRclassifier: two methods for flexible detection of intron retention events from RNA-Seq data

    PubMed Central

    2015-01-01

    Background The emergence of next-generation RNA sequencing (RNA-Seq) provides tremendous opportunities for researchers to analyze alternative splicing on a genome-wide scale. However, accurate detection of intron retention (IR) events from RNA-Seq data has remained an unresolved challenge in next-generation sequencing (NGS) studies. Results We propose two new methods: IRcall and IRclassifier to detect IR events from RNA-Seq data. Our methods combine together gene expression information, read coverage within an intron, and read counts (within introns, within flanking exons, supporting splice junctions, and overlapping with 5' splice site/ 3' splice site), employing ranking strategy and classifiers to detect IR events. We applied our approaches to one published RNA-Seq data on contrasting skip mutant and wild-type in Arabidopsis thaliana. Compared with three state-of-the-art methods, IRcall and IRclassifier could effectively filter out false positives, and predict more accurate IR events. Availability The data and codes of IRcall and IRclassifier are available at http://mlg.hit.edu.cn/ybai/IR/IRcallAndIRclass.html PMID:25707295

  16. Accuracy and Precision of Equine Gait Event Detection during Walking with Limb and Trunk Mounted Inertial Sensors

    PubMed Central

    Olsen, Emil; Andersen, Pia Haubro; Pfau, Thilo

    2012-01-01

    The increased variation of temporal gait events when pathology is present makes these events good candidate features for objective diagnostic tests. We hypothesised that the gait events hoof-on/off and stance can be detected accurately and precisely using features from trunk- and distal-limb-mounted Inertial Measurement Units (IMUs). Four IMUs were mounted on the distal limbs and five IMUs were attached to the skin over the dorsal spinous processes at the withers, fourth lumbar vertebra and sacrum, as well as the left and right tuber coxae. IMU data were synchronised to a force plate array and a motion capture system. Accuracy (bias) and precision (SD of bias) were calculated to compare force plate and IMU timings for gait events. Data were collected from seven horses. One hundred and twenty-three (123) front limb steps were analysed; hoof-on was detected with a bias (SD) of −7 (23) ms, hoof-off with 0.7 (37) ms and front limb stance with −0.02 (37) ms. A total of 119 hind limb steps were analysed; hoof-on was found with a bias (SD) of −4 (25) ms, hoof-off with 6 (21) ms and hind limb stance with 0.2 (28) ms. IMUs mounted on the distal limbs and sacrum can detect gait events accurately and precisely. PMID:22969392
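
    Accuracy and precision as used above reduce to the mean and SD of the paired timing differences between the IMU detections and the force-plate reference; the timings below are invented:

```python
import numpy as np

# Hypothetical hoof-on timings (ms): force-plate reference vs IMU detection
ref = np.array([0.0, 520.0, 1040.0, 1558.0, 2081.0])
imu = np.array([-5.0, 514.0, 1041.0, 1549.0, 2070.0])

diff = imu - ref                 # per-step timing error
bias = diff.mean()               # accuracy
precision = diff.std(ddof=1)     # precision: SD of the differences

print(bias)   # -6.0 (ms): IMU detections slightly early on average
```

    A negative bias means the IMU flags the event slightly before the force plate does, exactly the convention used in the figures of accuracy reported above.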

  17. Wavelet packet transform for detection of single events in acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bianchi, Davide; Mayrhofer, Erwin; Gröschl, Martin; Betz, Gerhard; Vernes, András

    2015-12-01

    Acoustic emission signals in tribology can be used for monitoring the state of bodies in contact and relative motion. The recorded signal includes information which can be associated with different events, such as the formation and propagation of cracks, appearance of scratches and so on. One of the major challenges in analyzing these acoustic emission signals is to identify parts of the signal which belong to such an event and discern it from noise. In this contribution, a wavelet packet decomposition within the framework of multiresolution analysis theory is considered to analyze acoustic emission signals to investigate the failure of tribological systems. By applying the wavelet packet transform a method for the extraction of single events in rail contact fatigue test is proposed. The extraction of such events at several stages of the test permits a classification and the analysis of the evolution of cracks in the rail.
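
    A wavelet packet decomposition splits each node recursively into low- and high-pass halves. A minimal orthonormal Haar version (a stand-in for whatever wavelet basis the authors used, with an invented synthetic trace) illustrates how a single burst stands out against noise:

```python
import numpy as np

def haar_packet(x, levels):
    """Full Haar wavelet packet decomposition; returns the leaf-node arrays."""
    nodes = [np.asarray(x, float)]
    for _ in range(levels):
        new_nodes = []
        for n in nodes:
            new_nodes.append((n[0::2] + n[1::2]) / np.sqrt(2))  # low-pass
            new_nodes.append((n[0::2] - n[1::2]) / np.sqrt(2))  # high-pass
        nodes = new_nodes
    return nodes

# Synthetic acoustic-emission trace: low noise plus one short burst ("event")
rng = np.random.default_rng(1)
sig = 0.05 * rng.standard_normal(1024)
sig[400:408] += np.hanning(8) * 5.0

leaves = haar_packet(sig, 3)                    # 8 leaves of 128 samples each
energies = [float(np.sum(n ** 2)) for n in leaves]

# The transform is orthonormal, so total energy is preserved (Parseval), and
# the burst concentrates around leaf index 400 // 8 = 50
print(int(np.argmax(np.abs(leaves[0]))))   # 50
```

    Thresholding coefficient magnitude (or node energy) against the noise floor is then one simple way to extract and time-localize single events for later classification.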

  18. Validating administrative data for the detection of adverse events in older hospitalized patients

    PubMed Central

    Ackroyd-Stolarz, Stacy; Bowles, Susan K; Giffin, Lorri

    2014-01-01

    Older hospitalized patients are at risk of experiencing adverse events including, but not limited to, hospital-acquired pressure ulcers, fall-related injuries, and adverse drug events. A significant challenge in monitoring and managing adverse events is the lack of readily accessible information on their occurrence. Purpose: The objective of this retrospective cross-sectional study was to validate diagnostic codes for pressure ulcers, fall-related injuries, and adverse drug events found in routinely collected administrative hospitalization data. Methods: All patients 65 years of age or older discharged between April 1, 2009 and March 31, 2011 from a provincial academic health sciences center in Canada were eligible for inclusion in the validation study. For each of the three types of adverse events, a random sample of 50 patients whose records were positive and 50 patients whose records were not positive for an adverse event was sought for review in the validation study (n=300 records in total). A structured health record review was performed independently by two health care providers with experience in geriatrics, both of whom were unaware of the patient's status with respect to adverse event coding. A physician reviewed 40 records (20 reviewed by each health care provider) to establish interrater agreement. Results: A total of 39 pressure ulcers, 56 fall-related injuries, and 69 adverse drug events were identified through health record review. Of these, 34 pressure ulcers, 54 fall-related injuries, and 47 adverse drug events were also identified in administrative data. Overall, the diagnostic codes for adverse events had a sensitivity and specificity exceeding 0.67 (95% confidence interval [CI]: 0.56-0.99) and 0.89 (95% CI: 0.72-0.99), respectively. Conclusion: It is feasible and valid to identify pressure ulcers, fall-related injuries, and adverse drug events in older hospitalized patients using routinely collected administrative hospitalization data.
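
    Treating the record review as the reference standard, the pressure-ulcer sensitivity follows directly from the counts reported above (34 of 39 confirmed events also coded in administrative data); the true-negative and false-positive counts below are invented to complete the example:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Pressure ulcers: 39 confirmed on review, 34 of them coded in admin data
tp, fn = 34, 5
tn, fp = 45, 5          # hypothetical counts for the negative sample
sens, spec = sensitivity_specificity(tp, fn, tn, fp)
print(round(sens, 3))   # 0.872
```

    A sensitivity of about 0.87 is consistent with the reported lower bound of 0.67 across the three event types.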

  19. Hard X-ray Detectability of Small Impulsive Heating Events in the Solar Corona

    NASA Astrophysics Data System (ADS)

    Glesener, L.; Klimchuk, J. A.; Bradshaw, S. J.; Marsh, A.; Krucker, S.; Christe, S.

    2015-12-01

    Impulsive heating events ("nanoflares") are a candidate mechanism for heating the solar corona to its ~2 MK temperature. These transient events can be studied using extreme ultraviolet and soft X-ray observations, among others. However, the impulsive events may occur in tenuous loops on timescales short enough that, owing to finite ionization timescales, the heating itself is essentially not observed and only the cooling phase is seen. Bremsstrahlung hard X-rays could serve as a more direct and prompt indicator of transient heating events. A hard X-ray spacecraft based on the direct-focusing technology pioneered by the Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket could search for these direct signatures. In this work, we use the hydrodynamic EBTEL code to simulate differential emission measures produced by individual heating events and by ensembles of such events. We then directly predict hard X-ray spectra and consider their observability by a future spaceborne FOXSI, as well as by the RHESSI and NuSTAR spacecraft.

  20. A Time-Reversed Reciprocal Method for Detecting High-frequency events in Civil Structures

    NASA Astrophysics Data System (ADS)

    Kohler, M. D.; Heaton, T. H.

    2007-12-01

    A new method that uses the properties of wave propagation reciprocity and time-reversed reciprocal Green's functions is presented for identifying high-frequency events that occur within engineered structures. Wave propagation properties of a seismic source in an elastic medium are directly applicable to structural waveform data. The number of structures with dense seismic networks embedded in them is increasing, making it possible to develop new approaches to identifying failure events, such as fracturing welds, that take advantage of the large number of recordings. The event identification method is based on the hypothesis that a database of pre-event, source-receiver Green's functions can be compiled using experimental sources. For buildings it is assumed that the source-time excitation is a delta function, proportional to the displacement produced at the receiver site. In theory, if all the Green's functions for a structure are known for a complete set of potential failure event locations, forward modeling can be used to compute a range of displacements and to identify the correct Green's functions, locations, and source times from the suite of displacements that recorded actual events. The method is applied to a 17-story steel moment-frame building using experimentally applied impulse-force hammer sources. The building has an embedded 72-channel accelerometer array that is continuously recorded by 24-bit data loggers at 100 and 500 sps. The focus of this particular application is the identification of brittle-fractured welds at beam-column connections.

  1. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Devadiga, Sadashiva; Tang, Yuan-Liang

    1994-01-01

    This research was initiated as a part of the Advanced Sensor and Imaging System Technology (ASSIST) program at NASA Langley Research Center. The primary goal of this research is the development of image analysis algorithms for the detection of runways and other objects using an on-board camera. Initial effort was concentrated on images acquired using a passive millimeter wave (PMMW) sensor. The images obtained using PMMW sensors under poor visibility conditions due to atmospheric fog are characterized by very low spatial resolution but good image contrast compared to those images obtained using sensors operating in the visible spectrum. Algorithms developed for analyzing these images using a model of the runway and other objects are described in Part 1 of this report. Experimental verification of these algorithms was limited to a sequence of images simulated from a single frame of PMMW image. Subsequent development and evaluation of algorithms was done using video image sequences. These images have better spatial and temporal resolution compared to PMMW images. Algorithms for reliable recognition of runways and accurate estimation of spatial position of stationary objects on the ground have been developed and evaluated using several image sequences. These algorithms are described in Part 2 of this report. A list of all publications resulting from this work is also included.

  3. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore, the consistency and integrity of the various specification documents are assured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how needs are being addressed by international standards writing teams.

  4. Accumulation rates or percentages? How to quantify Sporormiella and other coprophilous fungal spores to detect late Quaternary megafaunal extinction events

    NASA Astrophysics Data System (ADS)

    Wood, Jamie R.; Wilmshurst, Janet M.

    2013-10-01

    Spores of coprophilous fungi, and in particular those of Sporormiella, are a routinely used proxy for detecting late Quaternary herbivore extinction events in sedimentary records. Spore abundance is typically quantified as a percentage of the total, or dryland, pollen sum. Although this is a quick method that does not require the development of site-specific age-depth models, it relies on stable pollen accumulation rates and is therefore highly sensitive to changes in vegetation. This may lead to incorrect placement of extinction events in sedimentary records, particularly when they occur contemporaneously with major climatic/vegetation transitions. We suggest that the preferred method of quantification should be accumulation rate, and that pollen abundance data should also be presented, particularly for periods of major vegetation change. This approach provides a more reliable record of past herbivore abundance independent of vegetation change, allowing extinction events to be more accurately placed in stratigraphic sequences.
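
    A small worked example (with invented numbers) shows why percentages can mislead: if pollen influx quadruples at a vegetation transition while spore input stays constant, the spore percentage drops roughly fourfold, mimicking a herbivore decline, whereas the accumulation rate is unchanged.

```python
# Hypothetical concentrations (per cm^3 of sediment) for two samples that
# bracket a vegetation transition; the spore input is actually constant.
sed_rate_cm_yr = 0.1          # sedimentation rate from an age-depth model
samples = [
    {"spores_conc": 50, "pollen_conc": 5000},    # pre-transition
    {"spores_conc": 50, "pollen_conc": 20000},   # post-transition: pollen x4
]

for s in samples:
    # Accumulation rate (influx): concentration times sedimentation rate.
    s["spore_accum"] = s["spores_conc"] * sed_rate_cm_yr   # spores cm^-2 yr^-1
    # Percentage of the pollen sum: sensitive to the pollen influx itself.
    s["spore_pct"] = 100.0 * s["spores_conc"] / (s["spores_conc"] + s["pollen_conc"])
```

    The accumulation rate stays at 5 spores cm^-2 yr^-1 across the transition while the percentage collapses, which is exactly the artifact the abstract warns against.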

  5. Externally Sensitized Deprotection of PPG-Masked Carbonyls as a Spatial Proximity Probe in Photoamplified Detection of Binding Events

    PubMed Central

    Gustafson, Tiffany P.; Metzel, Greg A.

    2013-01-01

    Externally-sensitized electron-transfer fragmentation in dithiane PPG-protected carbonyls is adopted for detection and amplification of molecular recognition events. The new methodology allows for detection of as little as 50 attomoles of avidin utilizing an imager based on a low-sensitivity, mass-produced consumer CCD camera. Numeric modelling is carried out to demonstrate the intrinsic limitations of 2D amplification on surfaces and the advantages of unconstrained amplification in a compartmentalized volume of spatially addressable 3D solutions. PMID:22252455

  6. Event-specific detection of seven genetically modified soybean and maizes using multiplex-PCR coupled with oligonucleotide microarray.

    PubMed

    Xu, Jia; Zhu, Shuifang; Miao, Haizhen; Huang, Wensheng; Qiu, Minyan; Huang, Yan; Fu, Xuping; Li, Yao

    2007-07-11

    With the increasing development of genetically modified organism (GMO) detection techniques, the polymerase chain reaction (PCR) technique has been the mainstay for GMO detection. An oligonucleotide microarray is a glass chip to the surface of which an array of oligonucleotides is fixed as spots, each containing numerous copies of a sequence-specific probe that is complementary to a gene of interest; it can therefore detect ten or more targets simultaneously. In this research, an event-specific detection strategy based on the unique and specific integration junction sequences between the host plant genome DNA and the integrated gene was developed, chosen for its high specificity, using multiplex PCR together with an oligonucleotide microarray. A commercial GM soybean (GTS 40-3-2) and six GM maize events (MON810, MON863, Bt176, Bt11, GA21, and T25) were detected by this method. The results indicate that it is a suitable method for the identification of these GM soybean and maize events. PMID:17559227

  7. Detecting Recent Atmospheric River Induced Flood Events over the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Mehran, A.; Lettenmaier, D. P.; Ralph, F. M.; Lavers, D. A.

    2015-12-01

    Almost all major flood events in the coastal Western U.S. occur as a result of multi-day extreme precipitation during the winter and late fall, and most such events are now known to be Atmospheric Rivers (ARs). AR events are defined as having integrated water vapor (IWV) exceeding 2 cm in an area at least 2000 km long and no more than 1000 km wide. The dominant moisture source in many AR events, including those associated with most floods in the Russian River basin in Northern California, is the tropics. We report on a hydrological analysis of selected floods in the Russian River basin using the Distributed Hydrology Soil Vegetation Model (DHSVM), forced alternately by gridded station data, NWS WSR-88D radar data, and output from a regional atmospheric model. We also report results of river state forecasts using a river hydrodynamics model to reconstruct flood inundation from selected AR events. We diagnose errors in both the hydrological and river stage predictions, and discuss alternatives for future error reduction.
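
    The geometric AR criteria quoted above (IWV exceeding 2 cm over a region at least 2000 km long and no more than 1000 km wide) can be sketched as a simple grid test. The following Python sketch uses a flood fill and bounding-box extents as a crude proxy for the along- and across-axis dimensions; the field and function names are hypothetical.

```python
import numpy as np

def find_ar_objects(iwv_cm, km_per_cell, threshold_cm=2.0,
                    min_length_km=2000.0, max_width_km=1000.0):
    """Flood-fill contiguous cells with IWV > threshold and keep regions whose
    bounding-box extents meet the AR length/width criteria."""
    mask = iwv_cm > threshold_cm
    seen = np.zeros_like(mask, dtype=bool)
    regions = []
    rows, cols = mask.shape
    for r0 in range(rows):
        for c0 in range(cols):
            if not mask[r0, c0] or seen[r0, c0]:
                continue
            stack, cells = [(r0, c0)], []
            seen[r0, c0] = True
            while stack:
                r, c = stack.pop()
                cells.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols \
                            and mask[rr, cc] and not seen[rr, cc]:
                        seen[rr, cc] = True
                        stack.append((rr, cc))
            rs = [r for r, _ in cells]
            cs = [c for _, c in cells]
            dims = sorted(((max(rs) - min(rs) + 1) * km_per_cell,
                           (max(cs) - min(cs) + 1) * km_per_cell))
            if dims[1] >= min_length_km and dims[0] <= max_width_km:
                regions.append(cells)
    return regions

# Synthetic field: dry background, one long narrow filament, one compact blob.
iwv = np.full((60, 60), 1.0)
iwv[10:14, 5:55] = 3.0   # 4 x 50 cells -> 200 x 2500 km at 50 km/cell: AR-like
iwv[40:50, 40:48] = 3.0  # 10 x 8 cells -> 500 x 400 km: too short to qualify
ars = find_ar_objects(iwv, km_per_cell=50.0)
```

    Operational AR detection uses integrated vapor transport and axis-following geometry rather than bounding boxes, but the thresholding step is the same in spirit.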

  8. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time-consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
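
    A sketch of the statistics-based detection idea follows. The exact first-pass log format varies across avconv versions, so the "key:value" lines below are invented for illustration; the detector flags frames whose predicted-texture bits (ptex) deviate strongly from the median, using a robust (MAD-based) z-score.

```python
import re
import statistics

# Hypothetical first-pass log lines in a "key:value" token style (the exact
# field names and layout vary across avconv versions; values invented here).
log_lines = [
    "in:0 out:0 type:P q:2.0 itex:100 ptex:5000 mv:800 misc:40",
    "in:1 out:1 type:P q:2.0 itex:110 ptex:5100 mv:820 misc:41",
    "in:2 out:2 type:P q:2.0 itex:105 ptex:4900 mv:790 misc:39",
    "in:3 out:3 type:P q:2.0 itex:900 ptex:52000 mv:9000 misc:42",  # "event"
    "in:4 out:4 type:P q:2.0 itex:108 ptex:5050 mv:810 misc:40",
]

def frame_stats(lines, key):
    """Extract one named per-frame statistic from the log lines."""
    pat = re.compile(rf"{key}:(-?\d+(?:\.\d+)?)")
    return [float(pat.search(line).group(1)) for line in lines]

def flag_events(values, z=3.0):
    """Indices whose robust (MAD-based) z-score exceeds the threshold."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1.0
    return [i for i, v in enumerate(values) if abs(v - med) / (1.4826 * mad) > z]

events = flag_events(frame_stats(log_lines, "ptex"))
```

    A sudden jump in texture bits or motion-vector cost is exactly what a structural change between consecutive frames, such as dendrite growth, would produce.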

  9. PortVis: A Tool for Port-Based Detection of Security Events

    SciTech Connect

    McPherson, J; Ma, K; Krystosk, P; Bartoletti, T; Christensen, M

    2004-06-29

    Most visualizations of security-related network data require large amounts of finely detailed, high-dimensional data. However, in some cases, the data available can only be coarsely detailed because of security concerns or other limitations. How can interesting security events still be discovered in data that lacks important details, such as IP addresses, network security alarms, and labels? In this paper, we discuss a system we have designed that takes very coarsely detailed data (basic, summarized information of the activity on each TCP port during each given hour) and uses visualization to help uncover interesting security events.

  10. Wideband acoustic activation and detection of droplet vaporization events using a capacitive micromachined ultrasonic transducer.

    PubMed

    Novell, Anthony; Arena, Christopher B; Oralkan, Omer; Dayton, Paul A

    2016-06-01

    An ongoing challenge exists in understanding and optimizing the acoustic droplet vaporization (ADV) process to enhance contrast agent effectiveness for biomedical applications. Acoustic signatures from vaporization events can be identified and differentiated from microbubble or tissue signals based on their frequency content. The present study exploited the wide bandwidth of a 128-element capacitive micromachined ultrasonic transducer (CMUT) array for activation (8 MHz) and real-time imaging (1 MHz) of ADV events from droplets circulating in a tube. Compared to a commercial piezoelectric probe, the CMUT array provides a substantial increase of the contrast-to-noise ratio. PMID:27369143

  11. Strategy for in situ detection of natural transformation-based horizontal gene transfer events.

    PubMed

    Rizzi, Aurora; Pontiroli, Alessandra; Brusetti, Lorenzo; Borin, Sara; Sorlini, Claudia; Abruzzese, Alessandro; Sacchi, Gian Attilio; Vogel, Timothy M; Simonet, Pascal; Bazzicalupo, Marco; Nielsen, Kaare Magne; Monier, Jean-Michel; Daffonchio, Daniele

    2008-02-01

    A strategy is described that enables the in situ detection of natural transformation in Acinetobacter baylyi BD413 by the expression of a green fluorescent protein. Microscale detection of bacterial transformants growing on plant tissues was shown by fluorescence microscopy and indicated that cultivation-based selection of transformants on antibiotic-containing agar plates underestimates transformation frequencies. PMID:18165369

  12. Dune Detective, Using Ecological Studies to Reconstruct Events Which Shaped a Barrier Island.

    ERIC Educational Resources Information Center

    Godfrey, Paul J.; Hon, Will

    This publication is designed for use as part of a curriculum series developed by the Regional Marine Science Project. Students in grades 11 and 12 are exposed to research methods through a series of field exercises guiding investigators in reconstructing the events which have shaped the natural communities of a barrier beach. Background…

  13. Electroencephalographic detection of respiratory-related cortical activity in humans: from event-related approaches to continuous connectivity evaluation.

    PubMed

    Hudson, Anna L; Navarro-Sune, Xavier; Martinerie, Jacques; Pouget, Pierre; Raux, Mathieu; Chavez, Mario; Similowski, Thomas

    2016-04-01

    The presence of a respiratory-related cortical activity during tidal breathing is abnormal and a hallmark of respiratory difficulties, but its detection requires superior discrimination and temporal resolution. The aim of this study was to validate a computational method using EEG covariance (or connectivity) matrices to detect a change in brain activity related to breathing. In 17 healthy subjects, EEG was recorded during resting unloaded breathing (RB), voluntary sniffs, and breathing against an inspiratory threshold load (ITL). EEG recordings were analyzed with the specially developed covariance-based classifier, event-related potentials, and time-frequency (T-F) distributions. Nine subjects repeated the protocol. The classifier could accurately detect ITL and sniffs compared with the reference period of RB. For ITL, EEG-based detection was superior to airflow-based detection (P < 0.05). A coincident improvement in EEG-airflow correlation in ITL compared with RB (P < 0.05) confirmed that EEG detection relates to breathing. Premotor potential incidence was significantly higher before inspiration in sniffs and ITL compared with RB (P < 0.05), but T-F distributions revealed a significant difference between sniffs and RB only (P < 0.05). Intraclass correlation values ranged from poor (-0.2) to excellent (1.0). Thus, as for conventional event-related potential analysis, the covariance-based classifier can accurately predict a change in brain state related to a change in respiratory state, and given its capacity for near "real-time" detection, it is suitable for monitoring the respiratory state of respiratory and critically ill patients in the development of a brain-ventilator interface. PMID:26864771
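
    The covariance-based detection idea can be sketched generically (this is not the authors' classifier): estimate covariance matrices over short windows of multichannel data, then flag windows whose matrix lies far, relative to the spread of a resting-baseline reference, from the baseline mean.

```python
import numpy as np

def window_cov(x, win):
    """Covariance matrices of consecutive windows of (channels, samples) data."""
    return [np.cov(x[:, i * win:(i + 1) * win]) for i in range(x.shape[1] // win)]

def detect_state_change(covs, ref_covs, thresh=3.0):
    """Flag windows whose covariance is far (Frobenius distance, standardized
    against the spread of the reference windows) from the reference mean."""
    ref_mean = np.mean(ref_covs, axis=0)
    ref_d = [np.linalg.norm(c - ref_mean) for c in ref_covs]
    mu, sd = np.mean(ref_d), np.std(ref_d) or 1.0
    return [i for i, c in enumerate(covs)
            if (np.linalg.norm(c - ref_mean) - mu) / sd > thresh]

rng = np.random.default_rng(1)
rest = rng.standard_normal((4, 2000))           # baseline: "resting breathing"
loaded = 3.0 * rng.standard_normal((4, 1000))   # variance jump: "loaded" state
ref = window_cov(rest, 200)
flags = detect_state_change(window_cov(np.hstack([rest, loaded]), 200), ref)
```

    Covariance features capture between-channel structure that amplitude thresholds miss, which is what makes such classifiers attractive for continuous monitoring.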

  14. Monkeying around with the gorillas in our midst: familiarity with an inattentional-blindness task does not improve the detection of unexpected events.

    PubMed

    Simons, Daniel J

    2010-01-01

    When people know to look for an unexpected event (e.g., a gorilla in a basketball game), they tend to notice that event. But does knowledge that an unexpected event might occur improve the detection of other unexpected events in a similar scene? Subjects watched a new video in which, in addition to the gorilla, two other unexpected events occurred: a curtain changed color, and one player left the scene. Subjects who knew about videos like this one consistently spotted the gorilla in the new video, but they were slightly less likely to notice the other events. Foreknowledge that unexpected events might occur does not enhance the ability to detect other such events. PMID:23397479

  15. Detecting specific health-related events using an integrated sensor system for vital sign monitoring.

    PubMed

    Adnane, Mourad; Jiang, Zhongwei; Choi, Samjin; Jang, Hoyoung

    2009-01-01

    In this paper, a new method for the detection of apnea/hypopnea periods in physiological data is presented. The method is based on the intelligent combination of an integrated sensor system for long-time cardiorespiratory signal monitoring and dedicated signal-processing packages. Integrated sensors are a PVDF film and conductive fabric sheets. The signal processing package includes dedicated respiratory cycle (RC) and QRS complex detection algorithms and a new method using the respiratory cycle variability (RCV) for detecting apnea/hypopnea periods in physiological data. Results show that our method is suitable for online analysis of long time series data. PMID:22399978
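
    The respiratory cycle variability (RCV) idea can be sketched as follows (a simplified stand-in for the paper's dedicated RC detection algorithm): pick breath peaks from the respiratory trace, compute inter-breath intervals, and flag intervals much longer than the median as candidate apnea/hypopnea periods.

```python
import numpy as np

def breath_intervals(resp, fs, min_gap_s=1.0):
    """Inter-breath intervals (s) from peak-inhalation picking: local maxima
    above an amplitude threshold, with a refractory gap between peaks."""
    peaks, gap, i = [], int(min_gap_s * fs), 1
    while i < len(resp) - 1:
        if resp[i] > 0.5 and resp[i] >= resp[i - 1] and resp[i] > resp[i + 1]:
            peaks.append(i)
            i += gap
        else:
            i += 1
    return np.diff(peaks) / fs

def flag_apnea(intervals, factor=2.0):
    """Indices of intervals much longer than the median (candidate apneas)."""
    med = np.median(intervals)
    return [i for i, d in enumerate(intervals) if d > factor * med]

fs = 10.0
t = np.arange(0, 120, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)   # 4-s respiratory cycle
resp[400:600] = 0.0                   # 20-s pause: simulated apnea
apneas = flag_apnea(breath_intervals(resp, fs))
```

    Real cardiorespiratory data would need band-pass filtering and adaptive thresholds, but the interval statistic itself is the core of the RCV measure.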

  16. A new PCR-CGE (size and color) method for simultaneous detection of genetically modified maize events.

    PubMed

    Nadal, Anna; Coll, Anna; La Paz, Jose-Luis; Esteve, Teresa; Pla, Maria

    2006-10-01

    We present a novel multiplex PCR assay for simultaneous detection of multiple transgenic events in maize. Initially, five PCR primer pairs specific to events Bt11, GA21, MON810, and NK603, and Zea mays L. (alcohol dehydrogenase) were included. The event specificity was based on amplification of transgene/plant genome flanking regions, i.e., the same targets as for validated real-time PCR assays. These short and similarly sized amplicons were selected to achieve high and similar amplification efficiency for all targets; however, their unambiguous identification was a technical challenge. We achieved a clear distinction by a novel CGE approach that combined identification by size and color (CGE-SC). In one single step, all five targets were amplified and specifically labeled with three different fluorescent dyes. The assay was specific and displayed an LOD of 0.1% of each genetically modified organism (GMO). Therefore, it was adequate to fulfill legal thresholds established, e.g., in the European Union. Our CGE-SC-based strategy, in combination with an adequate labeling design, has the potential to simultaneously detect higher numbers of targets. As an example, we present the detection of up to eight targets in a single run. Multiplex PCR-CGE-SC only requires a conventional sequencer device and enables automation and high throughput. In addition, it proved to be transferable to a different laboratory. The number of authorized GMO events and the acreage of genetically modified (GM) varieties cultivated and commercialized worldwide are rapidly growing. In this context, our multiplex PCR-CGE-SC can be suitable for screening GM contents in food. PMID:16972302

  17. Vy-PER: eliminating false positive detection of virus integration events in next generation sequencing data

    PubMed Central

    Forster, Michael; Szymczak, Silke; Ellinghaus, David; Hemmrich, Georg; Rühlemann, Malte; Kraemer, Lars; Mucha, Sören; Wienbrandt, Lars; Stanulla, Martin; Franke, Andre

    2015-01-01

    Several pathogenic viruses such as hepatitis B and human immunodeficiency viruses may integrate into the host genome. These virus/host integrations are detectable using paired-end next generation sequencing. However, the low number of expected true virus integrations may be difficult to distinguish from the noise of many false positive candidates. Here, we propose a novel filtering approach that increases specificity without compromising sensitivity for virus/host chimera detection. Our detection pipeline termed Vy-PER (Virus integration detection bY Paired End Reads) outperforms existing similar tools in speed and accuracy. We analysed whole genome data from childhood acute lymphoblastic leukemia (ALL), which is characterised by genomic rearrangements and usually associated with radiation exposure. This analysis was motivated by the recently reported virus integrations at genomic rearrangement sites and association with chromosomal instability in liver cancer. However, as expected, our analysis of 20 tumour and matched germline genomes from ALL patients finds no significant evidence for integrations by known viruses. Nevertheless, our method eliminates 12,800 false positives per genome (80× coverage) and only our method detects singleton human-phiX174-chimeras caused by optical errors of the Illumina HiSeq platform. This high accuracy is useful for detecting low virus integration levels as well as non-integrated viruses. PMID:26166306

  19. Data-Driven Multimodal Sleep Apnea Events Detection: Synchrosqueezing Transform Processing and Riemannian Geometry Classification Approaches.

    PubMed

    Rutkowski, Tomasz M

    2016-07-01

    A novel multimodal and bio-inspired approach to biomedical signal processing and classification is presented in the paper. This approach allows for an automatic semantic labeling (interpretation) of sleep apnea events based on the proposed data-driven biomedical signal processing and classification. The presented signal processing and classification methods have already been successfully applied to real-time unimodal brainwave (EEG only) decoding in brain-computer interfaces developed by the author. In the current project, very encouraging results are obtained using multimodal biomedical (brainwave and peripheral physiological) signals in a unified processing approach allowing for automatic semantic data description. The results thus support the validity of the data-driven and bio-inspired signal processing approach for medical data semantic interpretation based on the machine-learning classification of sleep apnea events. PMID:27194241

  20. Detection and location of multiple events by MARS. Final report. [Multiple Arrival Recognition System

    SciTech Connect

    Wang, J.; Masso, J.F.; Archambeau, C.B.; Savino, J.M.

    1980-09-01

    Seismic data from two explosions were processed using the Systems Science and Software MARS (Multiple Arrival Recognition System) seismic event detector in an effort to determine their relative spatial and temporal separation on the basis of seismic data alone. The explosions were less than 1.0 kilometer apart and were separated by less than 0.5 sec in origin times. The seismic data consisted of nine local accelerograms (r < 1.0 km) and four regional (240 through 400 km) seismograms. The MARS processing clearly indicates the presence of multiple explosions, but the restricted frequency range of the data inhibits accurate time picks and hence limits the precision of the event location.
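
    The MARS detector itself is not reproduced here, but the generic idea of picking multiple transient arrivals in one record can be sketched with a classic STA/LTA (short-term over long-term average) energy detector on a synthetic trace containing two events.

```python
import numpy as np

def sta_lta_picks(x, fs, sta_s=0.1, lta_s=1.0, on=4.0):
    """Arrival picks where the short-term/long-term average energy ratio
    crosses the 'on' threshold, with a hold-off after each pick."""
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    e = x ** 2
    picks, i = [], lta_n
    while i < len(x) - sta_n:
        sta = e[i:i + sta_n].mean()
        lta = e[i - lta_n:i].mean() + 1e-12
        if sta / lta > on:
            picks.append(i / fs)
            i += lta_n            # hold-off: one pick per transient
        else:
            i += 1
    return picks

fs = 200.0
rng = np.random.default_rng(2)
x = 0.01 * rng.standard_normal(int(6 * fs))   # background noise
for t0 in (2.0, 3.5):                         # two transient "events"
    n0 = int(t0 * fs)
    tt = np.arange(int(0.3 * fs)) / fs
    x[n0:n0 + len(tt)] += np.exp(-tt / 0.1) * np.sin(2 * np.pi * 15 * tt)
picks = sta_lta_picks(x, fs)
```

    Separating events only 0.5 s apart, as in the report, requires a much shorter hold-off and careful window tuning, which is where a dedicated multiple-arrival system earns its keep.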

  1. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    NASA Astrophysics Data System (ADS)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis, and the inferences an investigator normally makes when given digital evidence, are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied to three popular Windows-based programs as a proof of concept.
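
    The signature-matching idea can be sketched as an ordered-subsequence match of low-level trace predicates against a timeline of recovered artifacts. All signature and artifact names below are hypothetical, purely for illustration.

```python
# Each high-level user action is encoded as an ordered signature of low-level
# trace predicates; names are invented for this sketch.
SIGNATURES = {
    "usb_drive_used": ["registry:USBSTOR_key", "log:volume_mount",
                       "lnk:file_on_removable_volume"],
    "file_opened_in_word": ["registry:Word_MRU", "lnk:recent_docs"],
}

def match_signatures(timeline, signatures):
    """Return the high-level actions whose trace sequence occurs, in order
    (not necessarily contiguously), within the timeline of low-level traces."""
    found = []
    for action, sig in signatures.items():
        pos = 0
        for trace in timeline:
            if trace == sig[pos]:
                pos += 1
                if pos == len(sig):
                    found.append(action)
                    break
    return found

# A timeline of artifacts as they might be recovered post-mortem.
timeline = ["log:boot", "registry:USBSTOR_key", "log:volume_mount",
            "registry:Word_MRU", "lnk:file_on_removable_volume"]
actions = match_signatures(timeline, SIGNATURES)
```

    Encoding the investigator's inference as explicit signatures makes the reconstruction repeatable and testable, which is the paper's central point.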

  2. Detecting flood event trends assigned to changes in urbanisation levels using a bivariate copula model

    NASA Astrophysics Data System (ADS)

    Requena, Ana; Prosdocimi, Ilaria; Kjeldsen, Thomas R.; Mediero, Luis

    2014-05-01

    Flood frequency analyses based on stationary assumptions are usually employed for estimating design floods. However, more complex non-stationary approaches are increasingly being incorporated with the aim of improving such estimates. In this study, the effect of changing urbanisation on maximum flood peak (Q) and volume (V) series is analysed. The potential changes in an urbanised catchment and in a nearby hydrologically similar rural catchment in northwest England are investigated. The urbanised catchment is characterised by a noticeable increase of the urbanisation level in time, while the rural catchment has not been altered by anthropogenic actions. Winter, summer and annual maximum flood events are studied. With the aim of analysing changes in time, two non-superimposed time-windows are defined covering the periods 1976-1992 and 1993-2008, respectively. A preliminary analysis of temporal trends in Q, V and Kendall's tau was first performed visually and then formally tested by a resampling procedure. Differences were found among winter, summer and annual maximum flood events. As annual maximum flood events are commonly used for design purposes, the corresponding bivariate distribution (margins and copula) was obtained for the different time-windows. Trends regarding both time-windows were analysed by comparing bivariate return period curves in the Q-V space. Different behaviours were found depending on the catchment. As a result, the application of the proposed methodology provides useful information in describing changes in flood events, regarding different flood variables and their relationship. In addition, the methodology can inform practitioners on the potential changes connected with urbanisation for appropriate design flood estimation.
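
    The trend-testing step can be sketched with a naive Kendall's tau and a permutation (resampling) test for a change in Q-V dependence between the two time windows; this is a generic stand-in, not the authors' copula-fitting procedure.

```python
import random

def kendall_tau(x, y):
    """Kendall's rank correlation (naive O(n^2), assumes no ties)."""
    n, s = len(x), 0
    for i in range(n):
        for j in range(i + 1, n):
            d = (x[i] - x[j]) * (y[i] - y[j])
            s += (d > 0) - (d < 0)
    return 2.0 * s / (n * (n - 1))

def tau_change_pvalue(q1, v1, q2, v2, n_perm=1000, seed=0):
    """Permutation test for a change in Q-V dependence between two windows:
    pool the (Q, V) event pairs, reassign them to windows at random, and
    compare the observed |tau2 - tau1| with its resampling distribution."""
    rnd = random.Random(seed)
    obs = abs(kendall_tau(q2, v2) - kendall_tau(q1, v1))
    pairs = list(zip(q1, v1)) + list(zip(q2, v2))
    n1, count = len(q1), 0
    for _ in range(n_perm):
        rnd.shuffle(pairs)
        a, b = pairs[:n1], pairs[n1:]
        d = abs(kendall_tau(*zip(*b)) - kendall_tau(*zip(*a)))
        count += d >= obs
    return count / n_perm

# Synthetic illustration: the Q-V dependence reverses between the two windows.
p = tau_change_pvalue([1, 2, 3, 4, 5], [1, 2, 3, 4, 5],
                      [1, 2, 3, 4, 5], [5, 4, 3, 2, 1], n_perm=500)
```

    A significant change in tau would motivate fitting separate copulas per window, as the study does, rather than a single stationary Q-V model.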

  3. A new diamond biosensor with integrated graphitic microchannels for detecting quantal exocytic events from chromaffin cells.

    PubMed

    Picollo, Federico; Gosso, Sara; Vittone, Ettore; Pasquarelli, Alberto; Carbone, Emilio; Olivero, Paolo; Carabelli, Valentina

    2013-09-14

    An MeV ion-microbeam lithographic technique can be successfully employed for the fabrication of an all-carbon miniaturized cellular biosensor based on graphitic microchannels embedded in a single-crystal diamond matrix. The device is functionally characterized for the in vitro recording of quantal exocytic events from single chromaffin cells, with high sensitivity and signal-to-noise ratio, opening promising perspectives for the realization of monolithic all-carbon cellular biosensors. PMID:23847004

  4. Comparison of imaging modalities for detection of residual fragments and prediction of stone-related events following percutaneous nephrolithotomy

    PubMed Central

    Gokce, Mehmet Ilker; Ozden, Eriz; Suer, Evren; Gulpinar, Basak; Gulpınar, Omer; Tangal, Semih

    2015-01-01

    Introduction: Achieving stone-free status (SFS) is the goal of stone surgery. This study aimed to compare the effectiveness of unenhanced helical computerized tomography (UHCT), KUB radiography, and ultrasonography (US) for the detection of residual fragments (RFs) and the prediction of stone-related events following percutaneous nephrolithotomy (PNL). Materials and Methods: Patients who underwent PNL for radiopaque stones between November 2007 and February 2010 were followed. Patients were examined within 24-48 hours after the procedure by KUB, US, and UHCT. A stone size of 4 mm was accepted as the cut-off level of significance. The sensitivity and specificity of KUB and US for the detection of RFs, and their value for predicting stone-related events, were calculated. Results: SFS was achieved in 95 patients (54.9%), and when a cut-off value of 4 mm for RFs was employed, SFS was achieved in 131 patients (75.7%). Sensitivity was 70.5% for KUB and 52.5% for US. UHCT was significantly more efficient for the detection of RFs compared to both KUB (p=0.01) and US (p=0.001). When the cut-off level of 4 mm was employed, the sensitivities of KUB and US increased to 85.7% and 57.1%, respectively, and the statistically significant superiority of UHCT remained (p vs. KUB: 0.03; p vs. US: 0.008). Conclusion: UHCT is the most sensitive diagnostic tool for detecting RFs after PNL. It has higher sensitivity regardless of stone size compared to KUB and US. Additionally, UHCT is better at predicting the occurrence of stone-related events. PMID:25928513

  5. Invariance of exocytotic events detected by amperometry as a function of the carbon fiber microelectrode diameter.

    PubMed

    Amatore, Christian; Arbault, Stéphane; Bouret, Yann; Guille, Manon; Lemaître, Frédéric; Verchier, Yann

    2009-04-15

    Etched carbon fiber microelectrodes of different radii have been used for amperometric measurements of single exocytotic events occurring at adrenal chromaffin cells. Frequency, kinetic, and quantitative information on exocytosis provided by amperometric spikes were analyzed as a function of the surface area of the microelectrodes. Interestingly, the percentage of spikes with foot (as well as their own characteristics), a category revealing the existence of sufficiently long-lasting fusion pores, was found to be constant whatever the microelectrode diameter, whereas the probability of overlapping spikes decreased with the electrode size. This confirmed that the prespike foot could not feature accidental superimposition of separated events occurring at different places. Moreover, the features of amperometric spikes investigated here (charge, intensity, and kinetics) were found constant for all microelectrode diameters. This demonstrated that the electrochemical measurement does not introduce significant bias onto the kinetics and thermodynamics of release during individual exocytotic events. All in all, this work demonstrates that information on exocytosis amperometrically recorded with the usual 7 μm diameter carbon fiber electrodes is biologically relevant, although the frequent overlap between spikes requires a censorship of the data during the analytical treatment. PMID:19290664

  6. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  7. Detection of centimeter-sized meteoroid impact events in Saturn's F ring.

    PubMed

    Showalter, M R

    1998-11-01

    Voyager images reveal that three prominent clumps in Saturn's F ring were short-lived, appearing rapidly and then spreading and decaying in brightness over periods of approximately 2 weeks. These features arise from hypervelocity impacts by approximately 10-centimeter meteoroids into F ring bodies. Future ring observations of these impact events could constrain the centimeter-sized component of the meteoroid population, which is otherwise unmeasurable but plays an important role in the evolution of rings and surfaces in the outer solar system. The F ring's numerous other clumps are much longer lived and appear to be unrelated to impacts. PMID:9804543

  8. Detection of centimeter-sized meteoroid impact events in Saturn's F ring

    NASA Technical Reports Server (NTRS)

    Showalter, M. R.

    1998-01-01

    Voyager images reveal that three prominent clumps in Saturn's F ring were short-lived, appearing rapidly and then spreading and decaying in brightness over periods of approximately 2 weeks. These features arise from hypervelocity impacts by approximately 10-centimeter meteoroids into F ring bodies. Future ring observations of these impact events could constrain the centimeter-sized component of the meteoroid population, which is otherwise unmeasurable but plays an important role in the evolution of rings and surfaces in the outer solar system. The F ring's numerous other clumps are much longer lived and appear to be unrelated to impacts.

  9. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    PubMed Central

    Ahn, Junho; Han, Richard

    2016-01-01

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period. PMID:27223292

  10. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.

  11. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection.

    PubMed

    Ahn, Junho; Han, Richard

    2016-01-01

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users' daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period. PMID:27223292

  12. Analyzing direct dark matter detection data with unrejected background events by the AMIDAS website

    NASA Astrophysics Data System (ADS)

    Shan, Chung-Lin

    2012-09-01

    In this talk I have presented the data analysis results of extracting properties of halo WIMPs: the mass and the (ratios between the) spin-independent and spin-dependent couplings/cross sections on nucleons by the AMIDAS website, taking into account possible unrejected background events in the analyzed data sets. Although a non-standard astronomical setup has been used to generate pseudodata sets for our analyses, it has been found that, without prior information/assumptions about the local density and velocity distribution of halo Dark Matter, these WIMP properties can be reconstructed with ~2% to ≲30% deviations from the input values.

  13. Solar panel clearing events, dust devil tracks, and in-situ vortex detections on Mars

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Reiss, Dennis

    2015-03-01

    Spirit rover solar array data, which if publicly archived would provide a useful window on Mars meteorology, show dust-clearing events coinciding with the onset of dust devil season in three Mars years. The recurrence interval of 100-700 days is consistent with the extrapolation of Pathfinder and Phoenix vortex encounters indicated by pressure drops of ∼6-40 Pa (similar to laboratory measurements of the dust-lifting threshold), and with observed areas and rates of generation of dust devil tracks on Mars.

  14. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and the specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events, and the LOQ and LOD indicate that this method is suitable for the daily detection and quantitation of GMO events. PMID:27016439
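    The ratio-based absolute quantitation described here can be sketched with standard digital-PCR Poisson statistics; the partition counts and the percent convention below are illustrative assumptions, not the paper's validated protocol:

    ```python
    import math

    def copies_per_partition(positive, total):
        """Poisson-corrected mean copies per partition from the fraction of
        positive partitions: lambda = -ln(1 - p)."""
        p = positive / total
        return -math.log(1.0 - p)

    def gmo_content_percent(transgene_positives, reference_positives, total):
        """GMO content as the ratio of transgene to reference-gene copy
        concentrations, expressed as a percentage (counts are hypothetical)."""
        lam_t = copies_per_partition(transgene_positives, total)
        lam_r = copies_per_partition(reference_positives, total)
        return 100.0 * lam_t / lam_r
    ```

    The Poisson correction matters because a partition holding two or more copies still reads as a single positive; the ratio of corrected concentrations is what makes the quantitation absolute, with no standard curve.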

  15. Maximizing the probability of detecting an electromagnetic counterpart of gravitational-wave events

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Stubbs, Christopher

    2016-07-01

    Compact binary coalescences are a promising source of gravitational waves for second-generation interferometric gravitational-wave detectors such as Advanced LIGO and Advanced Virgo, and are among the most promising sources for joint detection of electromagnetic (EM) and gravitational-wave (GW) emission. To maximize the science performed with these objects, it is essential to undertake a follow-up observing strategy that maximizes the likelihood of detecting the EM counterpart. We present a follow-up strategy that maximizes the counterpart detection probability, given a fixed investment of telescope time, and show how the prior assumption on the luminosity function of the electromagnetic counterpart impacts the optimized follow-up strategy. Our results suggest that if the goal is to detect an EM counterpart from among a succession of GW triggers, the optimal strategy is to perform long integrations in the highest-likelihood regions. For certain assumptions about source luminosity and mass distributions, we find an optimal time investment that is proportional to the 2/3 power of the surface density of the GW location probability on the sky. In the future, this analysis framework will benefit significantly from the 3-dimensional localization probability.
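    The reported 2/3-power scaling suggests a simple allocation rule; a minimal sketch, assuming equal-area sky tiles and per-tile probability densities (both assumptions for illustration, not the authors' pipeline):

    ```python
    import numpy as np

    def allocate_time(prob_density, total_time):
        """Split a fixed telescope-time budget across equal-area sky tiles in
        proportion to the 2/3 power of the GW localization probability density,
        the scaling reported in the abstract."""
        w = np.asarray(prob_density, dtype=float) ** (2.0 / 3.0)
        return total_time * w / w.sum()
    ```

    The sublinear exponent spreads time more evenly than allocating strictly by probability: low-probability tiles still receive some integration, while the highest-likelihood regions get the longest exposures.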

  16. Systems and methods for detecting a failure event in a field programmable gate array

    NASA Technical Reports Server (NTRS)

    Ng, Tak-Kwong (Inventor); Herath, Jeffrey A. (Inventor)

    2009-01-01

    An embodiment generally relates to a method of self-detecting an error in a field programmable gate array (FPGA). The method includes writing a signature value into a signature memory in the FPGA and determining a conclusion of a configuration refresh operation in the FPGA. The method also includes reading an outcome value from the signature memory.
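    The signature-memory scheme can be sketched as follows, with a Python dict standing in for the FPGA's signature memory and an arbitrary 16-bit pattern (both assumptions for illustration, not details from the patent):

    ```python
    SIGNATURE = 0xA5A5  # illustrative 16-bit pattern, not from the patent

    def write_signature(memory):
        """Write the known value into signature memory before the
        configuration refresh begins."""
        memory["signature"] = SIGNATURE

    def failure_detected(memory):
        """After the refresh concludes, read the outcome value back; a
        mismatch indicates a failure event during reconfiguration."""
        return memory.get("signature") != SIGNATURE
    ```

    A refresh that completes cleanly leaves the signature intact; a corrupted or interrupted refresh alters or loses it, which the read-back check flags.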

  17. Electrical detection of specific versus non-specific binding events in breast cancer cells

    NASA Astrophysics Data System (ADS)

    King, Benjamin C.; Clark, Michael; Burkhead, Thomas; Sethu, Palaniappan; Rai, Shesh; Kloecker, Goetz; Panchapakesan, Balaji

    2012-10-01

    Detection of circulating tumor cells (CTCs) from patient blood samples offers a desirable alternative to invasive tissue biopsies for screening of malignant carcinomas. A rigorous CTC detection method must identify CTCs from millions of other formed elements in blood and distinguish them from healthy tissue cells also present in the blood. CTCs are known to overexpress surface receptors, many of which aid them in invading other tissue, and these provide an avenue for their detection. We have developed carbon nanotube (CNT) thin film devices to specifically detect these receptors in intact cells. The CNT sidewalls are functionalized with antibodies specific to Epithelial Cell Adhesion Molecule (EpCAM), a marker overexpressed by breast and other carcinomas. Specific binding of EpCAM to anti-EpCAM antibodies causes a change in the local charge environment of the CNT surface which produces a characteristic electrical signal. Two cell lines were tested in the device: MCF7, a mammary adenocarcinoma line which overexpresses EpCAM, and MCF10A, a non-tumorigenic mammary epithelial line which does not. Introduction of MCF7s caused significant changes in the electrical conductance of the devices due to specific binding and associated charge environment change near the CNT sidewalls. Introduction of MCF10A displays a different profile due to purely nonspecific interactions. The profile of specific vs. nonspecific interaction signatures using carbon based devices will guide development of this diagnostic tool towards clinical sample volumes with wide variety of markers.

  18. DETECTION OF SUPERSONIC DOWNFLOWS AND ASSOCIATED HEATING EVENTS IN THE TRANSITION REGION ABOVE SUNSPOTS

    SciTech Connect

    Kleint, L.; Martínez-Sykora, J.; Antolin, P.; Tian, H.; Testa, P.; Reeves, K. K.; McKillop, S.; Saar, S.; Golub, L.; Judge, P.; Carlsson, M.; Hansteen, V.; Jaeggli, S.; and others

    2014-07-10

    Interface Region Imaging Spectrograph data allow us to study the solar transition region (TR) with an unprecedented spatial resolution of 0.33″. On 2013 August 30, we observed bursts of high Doppler shifts suggesting strong supersonic downflows of up to 200 km s⁻¹ and weaker, slightly slower upflows in the spectral lines Mg II h and k, C II 1336 Å, Si IV 1394 Å, and 1403 Å, that are correlated with brightenings in the slitjaw images (SJIs). The bursty behavior lasts throughout the 2 hr observation, with average burst durations of about 20 s. The locations of these short-lived events appear to be the umbral and penumbral footpoints of EUV loops. Fast apparent downflows are observed along these loops in the SJIs and in the Atmospheric Imaging Assembly, suggesting that the loops are thermally unstable. We interpret the observations as cool material falling from coronal heights, and especially coronal rain produced along the thermally unstable loops, which leads to an increase of intensity at the loop footpoints, probably indicating an increase of density and temperature in the TR. The rain speeds are on the higher end of previously reported speeds for this phenomenon, and possibly higher than the free-fall velocity along the loops. On other observing days, similar bright dots are sometimes aligned into ribbons, resembling small flare ribbons. These observations provide a first insight into small-scale heating events in sunspots in the TR.

  19. Onboard Classifiers for Science Event Detection on a Remote Sensing Spacecraft

    NASA Technical Reports Server (NTRS)

    Castano, Rebecca; Mazzoni, Dominic; Tang, Nghia; Greeley, Ron; Doggett, Thomas; Cichy, Ben; Chien, Steve; Davies, Ashley

    2006-01-01

    Typically, data collected by a spacecraft is downlinked to Earth and pre-processed before any analysis is performed. We have developed classifiers that can be used onboard a spacecraft to identify high priority data for downlink to Earth, providing a method for maximizing the use of a potentially bandwidth limited downlink channel. Onboard analysis can also enable rapid reaction to dynamic events, such as flooding, volcanic eruptions or sea ice break-up. Four classifiers were developed to identify cryosphere events using hyperspectral images. These classifiers include a manually constructed classifier, a Support Vector Machine (SVM), a Decision Tree and a classifier derived by searching over combinations of thresholded band ratios. Each of the classifiers was designed to run in the computationally constrained operating environment of the spacecraft. A set of scenes was hand-labeled to provide training and testing data. Performance results on the test data indicate that the SVM and manual classifiers outperformed the Decision Tree and band-ratio classifiers with the SVM yielding slightly better classifications than the manual classifier.
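    The band-ratio classifier family searched over can be sketched as a one-line rule; the band indices and threshold below are placeholders, not the values derived for the mission:

    ```python
    def band_ratio_classify(bands, i, j, threshold):
        """Thresholded band-ratio rule of the kind searched over for the
        onboard classifiers: flag a pixel as a cryosphere event when the
        ratio of two spectral bands exceeds a threshold. Indices i, j and
        the threshold are illustrative placeholders."""
        return bands[i] / bands[j] > threshold
    ```

    Rules of this form suit the computationally constrained onboard environment: per pixel they cost one division and one comparison, with no model weights to store beyond the chosen bands and threshold.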

  20. Event-related brain potentials reveal the time-course of language change detection in early bilinguals.

    PubMed

    Kuipers, Jan-Rouke; Thierry, Guillaume

    2010-05-01

    Using event-related brain potentials, we investigated the temporal course of language change detection in proficient bilinguals as compared to matched controls. Welsh-English bilingual participants and English controls were presented with a variant of the oddball paradigm involving picture-word pairs. The language of the spoken word was manipulated such that English was the frequent stimulus (75%) and Welsh the infrequent stimulus (25%). We also manipulated semantic relatedness between pictures and words, such that only half of the pictures were followed by a word that corresponded with the identity of the picture. The P2 wave was significantly modulated by language in the bilingual group only, suggesting that this group detected a language change as early as 200 ms after word onset. Monolinguals also reliably detected the language change, but at a later stage of semantic integration (N400 range), since Welsh words were perceived as meaningless. The early detection of a language change in bilinguals triggered stimulus re-evaluation mechanisms reflected by a significant P600 modulation by Welsh words. Furthermore, compared to English unrelated words, English words matching the picture identity elicited significantly greater P2 amplitudes in the bilingual group only, suggesting that proficient bilinguals validate an incoming word against their expectation based on the context. Overall, highly proficient bilinguals appear to detect language changes very early on during speech perception and to consciously monitor language changes when they occur. PMID:20117220

  1. Combined passive detection and ultrafast active imaging of cavitation events induced by short pulses of high-intensity ultrasound.

    PubMed

    Gateau, Jérôme; Aubry, Jean-François; Pernot, Mathieu; Fink, Mathias; Tanter, Mickaël

    2011-03-01

    The activation of natural gas nuclei to induce larger bubbles is possible using short ultrasonic excitations of high amplitude, and is required for ultrasound cavitation therapies. However, little is known about the distribution of nuclei in tissues. Therefore, the acoustic pressure level necessary to generate bubbles in a targeted zone and their exact location are currently difficult to predict. To monitor the initiation of cavitation activity, a novel all-ultrasound technique sensitive to single nucleation events is presented here. It is based on combined passive detection and ultrafast active imaging over a large volume using the same multi-element probe. Bubble nucleation was induced using a focused transducer (660 kHz, f-number = 1) driven by a high-power electric burst (up to 300 W) of one to two cycles. Detection was performed with a linear array (4 to 7 MHz) aligned with the single-element focal point. In vitro experiments in gelatin gel and muscular tissue are presented. The synchronized passive detection enabled radio-frequency data to be recorded, comprising high-frequency coherent wave fronts as signatures of the acoustic emissions linked to the activation of the nuclei. Active change detection images were obtained by subtracting echoes collected in the unnucleated medium. These indicated the appearance of stable cavitating regions. Because of the ultrafast frame rate, active detection occurred as quickly as 330 μs after the high-amplitude excitation and the dynamics of the induced regions were studied individually. PMID:21429844

  2. Combined passive detection and ultrafast active imaging of cavitation events induced by short pulses of high-intensity ultrasound

    PubMed Central

    Gateau, Jérôme; Aubry, Jean-François; Pernot, Mathieu; Fink, Mathias; Tanter, Mickaël

    2011-01-01

    The activation of natural gas nuclei to induce larger bubbles is possible using short ultrasonic excitations of high amplitude, and is required for ultrasound cavitation therapies. However, little is known about the distribution of nuclei in tissues. Therefore, the acoustic pressure level necessary to generate bubbles in a targeted zone and their exact location are currently difficult to predict. In order to monitor the initiation of cavitation activity, a novel all-ultrasound technique sensitive to single nucleation events is presented here. It is based on combined passive detection and ultrafast active imaging over a large volume and with the same multi-element probe. Bubble nucleation was induced with a focused transducer (660 kHz, f# = 1) driven by a high-power (up to 300 W) electric burst of one to two cycles. Detection was performed with a linear array (4–7 MHz) aligned with the single-element focal point. In vitro experiments in gelatin gel and muscular tissue are presented. The synchronized passive detection enabled radio-frequency data to be recorded, comprising high-frequency coherent wave fronts as signatures of the acoustic emissions linked to the activation of the nuclei. Active change detection images were obtained by subtracting echoes collected in the unnucleated medium. These indicated the appearance of stable cavitating regions. Thanks to the ultrafast frame rate, active detection occurred as quickly as 330 μs after the high-amplitude excitation, and the dynamics of the induced regions were studied individually. PMID:21429844

  3. Detection and Analysis of High Ice Concentration Events and Supercooled Drizzle from IAGOS Commercial Aircraft

    NASA Astrophysics Data System (ADS)

    Gallagher, Martin; Baumgardner, Darrel; Lloyd, Gary; Beswick, Karl; Freer, Matt; Durant, Adam

    2016-04-01

    Hazardous encounters with high ice concentrations that lead to temperature and airspeed sensor measurement errors, as well as engine rollback and flameout, continue to pose serious problems for flight operations of commercial air carriers. Supercooled liquid droplets (SLD) are an additional hazard, especially for smaller commuter aircraft that do not have sufficient power to fly out of heavy icing conditions or heat to remove the ice. New regulations issued by the United States and European regulatory agencies are being implemented that will require aircraft below a certain weight class to carry sensors that will detect and warn of these types of icing conditions. Commercial aircraft do not currently carry standard sensors to detect the presence of ice crystals in high concentrations because they are typically found in sizes that are below the detection range of aircraft weather radar. Likewise, the sensors that are currently used to detect supercooled water do not respond well to drizzle-sized drops. Hence, there is a need for a sensor that can fill this measurement void. In addition, the forecast models that are used to predict regions of icing rely on pilot observations as the only means to validate the model products, and currently there are no forecasts for the prevalence of high-altitude ice crystals. Backscatter Cloud Probes (BCP) have been flying since 2011 under the IAGOS project on six Airbus commercial airliners operated by Lufthansa, Air France, China Air, Iberia and Cathay Pacific, and measure cloud droplets, ice crystals and aerosol particles larger than 5 μm. The BCP can detect these particles and measures an optical equivalent diameter (OED), but is not able to distinguish the type of particle, i.e. whether they are droplets, ice crystals, dust or ash. However, some qualification can be done based on measured temperature to discriminate between liquid water and ice.
The next generation BCP (BCPD, Backscatter Cloud Probe with polarization detection) is

  4. Audio-visual event detection based on mining of semantic audio-visual labels

    NASA Astrophysics Data System (ADS)

    Goh, King-Shy; Miyahara, Koji; Radhakrishnan, Regunathan; Xiong, Ziyou; Divakaran, Ajay

    2003-12-01

    Removing commercials from television programs is a much sought-after feature for a personal video recorder. In this paper, we employ an unsupervised clustering scheme (CM_Detect) to detect commercials in television programs. Each program is first divided into 8-minute chunks, and we extract audio and visual features from each of these chunks. Next, we apply k-means clustering to assign each chunk a commercial/program label. In contrast to other methods, we do not make any assumptions regarding the program content. Thus, our method is highly content-adaptive and computationally inexpensive. Through empirical studies on various content, including American news, Japanese news, and sports programs, we demonstrate that our method is able to filter out most of the commercials without falsely removing the regular program.
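    The unsupervised labeling step can be sketched as a minimal 2-means clustering over per-chunk feature vectors; the feature layout is an assumption, and the audio/visual feature extraction itself is out of scope:

    ```python
    import numpy as np

    def cluster_chunks(features, iters=20, seed=0):
        """Minimal 2-means clustering of per-chunk feature vectors,
        mirroring the unsupervised commercial/program labeling step of
        CM_Detect. Returns a 0/1 label per chunk; which label means
        'commercial' must be resolved afterwards (e.g. by cluster size)."""
        X = np.asarray(features, dtype=float)
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=2, replace=False)].copy()
        for _ in range(iters):
            # Assign each chunk to its nearest center, then recompute centers.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            for k in (0, 1):
                if np.any(labels == k):
                    centers[k] = X[labels == k].mean(axis=0)
        return labels
    ```

    Because the clustering is unsupervised, no labeled training data or content-specific model is needed, which is what makes the approach content-adaptive.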

  5. An electronic circuit that detects left ventricular ejection events by processing the arterial pressure waveform

    NASA Technical Reports Server (NTRS)

    Gebben, V. D.; Webb, J. A., Jr.

    1972-01-01

    An electronic circuit for processing arterial blood pressure waveform signals is described. The circuit detects the blood pressure as the heart pumps blood through the aortic valve, and the pressure disturbance caused by aortic valve closure. From these measurements, timing signals for measuring the left ventricular ejection time are determined, and signals are provided for computer monitoring of the cardiovascular system. Illustrations are given of the circuit and pressure waveforms.

  6. Detection of the 22-GHz line of water during and after the SL-9/Jupiter event.

    NASA Astrophysics Data System (ADS)

    Montebugnoli, S.; Bortolotti, C.; Cattani, A.; Grueff, G.; Maccaferri, A.; Maccaferri, G.; Orfei, A.; Padrielli, L.; Tugnoli, M.; Tuccari, G.; Roma, M.; Venturi, T.; Cosmovici, C. B.; Orfei, R.; Scappini, F.; Colom, P.; Pogrebenko, S.

    Because of the exceptional changes in the chemistry and in the excitation conditions of the Jovian atmosphere and ionosphere, the comet/Jupiter impacts represented a unique opportunity to detect the water radio line and to establish its cometary origin. By using a multichannel spectrometer (up to 128,000 channels) coupled with the 32-m dish of the Medicina Radiotelescope, the authors were able to detect the emission line of water at 1.35 cm as a consequence of blob E, and probably also of blobs A and C, on July 19, 1994. The detection of water from blob E covers a two-month range after impact. It follows that, since the water was excited at high altitudes (above the 1 microbar level), it should necessarily originate from the cometary nuclei. The extremely narrow bandwidth of the line (40 kHz) cannot be explained by a classical approach of thermal or collisional broadening. Thus a new model has to be worked out in order to explain the line bandwidth, brightness temperature, morphology and altitude of the water cloud.

  7. Multitemporal burnt area detection methods based on a couple of images acquired after the fire event

    NASA Astrophysics Data System (ADS)

    Carlà, R.; Santurri, L.; Bonora, L.; Conese, C.

    2009-09-01

    Fire detection methods based on remote sensing data are gaining more and more attention in the scientific community, and many algorithms have been developed for this purpose. In order to assess the location and characteristics of burned areas, some of them apply a suitable threshold to a multispectral index such as the NBR (Normalized Burn Ratio) or the NDII (Normalized Difference Infrared Index) evaluated on a single image acquired after the fire season. Other methods use a multitemporal approach based on the processing of a couple of images, the former acquired before and the latter after the fire season, applying a chosen threshold to the differential value of the same or other multispectral indexes. This paper focuses on the problem of assessing the performance of burnt area detection methods based on a couple of satellite images both acquired after the fire season. In particular, threshold methods applied to the differential forms of the NDII and the NDVI (Normalized Difference Vegetation Index) are considered with regard to their capacity to locate or detect (not characterize) burnt areas, and the resulting performances are evaluated and compared with the corresponding performances of the same methods applied to a single image only, acquired after the fire season.
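    Differential-index thresholding of the kind evaluated here can be sketched with NDVI; the band inputs and the 0.1 threshold are illustrative assumptions, not the paper's calibrated values:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red)

    def burnt_mask(nir_t1, red_t1, nir_t2, red_t2, threshold=0.1):
        """Flag pixels whose NDVI differs by more than `threshold` between
        the two acquisitions; the acquisition ordering and the threshold
        value are assumptions for illustration."""
        return (ndvi(nir_t1, red_t1) - ndvi(nir_t2, red_t2)) > threshold
    ```

    The same mechanics apply whether the pair brackets the fire season or, as in the paper, both images follow it; only the sign and magnitude of the expected index change differ.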

  8. Detection of overflow events in the shag rocks passage, scotia ridge.

    PubMed

    Zenk, W

    1981-09-01

    During an almost yearlong period of observations made with a current meter in the fracture zone between the Falkland Islands (Islas Malvinas) and South Georgia, several overflow events were recorded at a depth of 3000 meters carrying cold bottom water from the Scotia Sea into the Argentine Basin. The outflow bursts of Scotia Sea bottom water, a mixing product of Weddell Sea and eastern Pacific bottom water, were associated with typical speeds of more than 28 centimeters per second toward the northwest and characteristic temperatures below 0.6 degrees C. The maximum 24-hour average speed of 65 centimeters per second, together with a temperature of 0.29 degrees C, was encountered on 14 November 1980 at a water depth of 2973 meters, 35 meters above the sea floor. PMID:17741101

  9. Real-time detection of an extreme scattering event: Constraints on Galactic plasma lenses.

    PubMed

    Bannister, Keith W; Stevens, Jamie; Tuntsov, Artem V; Walker, Mark A; Johnston, Simon; Reynolds, Cormac; Bignall, Hayley

    2016-01-22

    Extreme scattering events (ESEs) are distinctive fluctuations in the brightness of astronomical radio sources caused by occulting plasma lenses in the interstellar medium. The inferred plasma pressures of the lenses are ~10³ times the ambient pressure, challenging our understanding of gas conditions in the Milky Way. Using a new survey technique, we discovered an ESE while it was in progress. Here we report radio and optical follow-up observations. Modeling of the radio data demonstrates that the lensing structure is a density enhancement and the lens is diverging, ruling out one of two competing physical models. Our technique will uncover many more ESEs, addressing a long-standing mystery of the small-scale gas structure of our Galaxy. PMID:26798008

  10. Real-time detection of an extreme scattering event: Constraints on Galactic plasma lenses

    NASA Astrophysics Data System (ADS)

    Bannister, Keith W.; Stevens, Jamie; Tuntsov, Artem V.; Walker, Mark A.; Johnston, Simon; Reynolds, Cormac; Bignall, Hayley

    2016-01-01

    Extreme scattering events (ESEs) are distinctive fluctuations in the brightness of astronomical radio sources caused by occulting plasma lenses in the interstellar medium. The inferred plasma pressures of the lenses are ~10³ times the ambient pressure, challenging our understanding of gas conditions in the Milky Way. Using a new survey technique, we discovered an ESE while it was in progress. Here we report radio and optical follow-up observations. Modeling of the radio data demonstrates that the lensing structure is a density enhancement and the lens is diverging, ruling out one of two competing physical models. Our technique will uncover many more ESEs, addressing a long-standing mystery of the small-scale gas structure of our Galaxy.

  11. Embedding surveillance into clinical care to detect serious adverse events in pregnancy.

    PubMed

    Seale, Anna C; Barsosio, Hellen C; Koech, Angela C; Berkley, James A

    2015-11-25

    Severe maternal complications in pregnancy in sub-Saharan Africa contribute to high maternal mortality and morbidity. Incidence data on severe maternal complications, life-threatening conditions, maternal deaths and birth outcomes are essential for clinical audit and to inform trial design of the types and frequency of expected severe adverse events (SAEs). However, such data are very limited, especially in sub-Saharan Africa. We set up standardized, systematic clinical surveillance embedded into routine clinical care in a rural county hospital in Kenya. Pregnant women and newborns are systematically assessed and investigated. Data are reported using a standardized Maternal Admission Record that forms both the hospital's clinical record and the data collection tool. Integrating clinical surveillance with routine clinical care is feasible and should be expanded in sub-Saharan Africa, both for improving clinical practice and as a basis for intervention studies to reduce maternal and newborn mortality and morbidity where rates are highest. PMID:26254977

  12. Embedding surveillance into clinical care to detect serious adverse events in pregnancy

    PubMed Central

    Seale, Anna C; Barsosio, Helen C; Koech, Angela; Berkley, James A

    2016-01-01

    Severe maternal complications in pregnancy in sub-Saharan Africa contribute to high maternal mortality and morbidity. Incidence data on severe maternal complications, life-threatening conditions, maternal deaths and birth outcomes are essential for clinical audit and to inform trial design of the types and frequency of expected severe adverse events (SAEs). However, such data are very limited, especially in sub-Saharan Africa. We set up standardized, systematic clinical surveillance embedded into routine clinical care in a rural county hospital in Kenya. Pregnant women and newborns are systematically assessed and investigated. Data are reported using a standardized Maternal Admission Record that forms both the hospital’s clinical record and the data collection tool. Integrating clinical surveillance with routine clinical care is feasible and should be expanded in sub-Saharan Africa, both for improving clinical practice and as a basis for intervention studies to reduce maternal and newborn mortality and morbidity where rates are highest. PMID:26254977

  13. 3D-nanostructured Au electrodes for the event-specific detection of MON810 transgenic maize.

    PubMed

    Barroso, M Fátima; Freitas, Maria; Oliveira, M Beatriz P P; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; Delerue-Matos, Cristina

    2015-03-01

    In the present work, the development of a genosensor for the event-specific detection of MON810 transgenic maize is proposed. Taking advantage of nanostructuration, a cost-effective three-dimensional electrode was fabricated, and a ternary monolayer containing a dithiol, a monothiol and the thiolated capture probe was optimized to minimize nonspecific signals. A sandwich format assay was selected as a way of precluding inefficient hybridization associated with stable secondary target structures. A comparison between the analytical performance of the Au nanostructured electrodes and commercially available screen-printed electrodes highlighted the superior performance of the nanostructured ones. Finally, the genosensor was effectively applied to detect the transgenic sequence in real samples, showing its potential for future quantitative analysis. PMID:25618653

  14. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  15. Characterization and event specific-detection by quantitative real-time PCR of T25 maize insert.

    PubMed

    Collonnier, Cécile; Schattner, Alexandra; Berthier, Georges; Boyer, Francine; Coué-Philippe, Géraldine; Diolez, Annick; Duplan, Marie-Noëlle; Fernandez, Sophie; Kebdani, Naïma; Kobilinsky, André; Romaniuk, Marcel; de Beuckeleer, Marc; de Loose, Marc; Windels, Pieter; Bertheau, Yves

    2005-01-01

    T25 is one of the four maize transformation events from which commercial lines have so far been authorized in Europe. It was created by polyethylene glycol-mediated transformation using a construct bearing one copy of the synthetic pat gene associated with both promoter and terminator of the 35S ribosomal gene from cauliflower mosaic virus. In this article, we report the sequencing of the whole T25 insert and the characterization of its integration site by using a genome walking strategy. Our results confirmed that one intact copy of the initial construct had been integrated in the plant genome. They also revealed, at the 5' junction of the insert, the presence of a second truncated 35S promoter, probably resulting from rearrangements which may have occurred before or during integration of the plasmid DNA. The analysis of the junction fragments showed that the integration site of the insert presented high homology with the Huck retrotransposon family. By using one primer annealing in the maize genome and the other in the 5' end of the integrated DNA, we developed a reliable event-specific detection system for T25 maize. To provide means to comply with the European regulation, a real-time PCR test was designed for specific quantitation of the T25 event using TaqMan chemistry. PMID:15859082

  16. Fast joint detection-estimation of evoked brain activity in event-related FMRI using a variational approach

    PubMed Central

    Chaari, Lotfi; Vincent, Thomas; Forbes, Florence; Dojat, Michel; Ciuciu, Philippe

    2013-01-01

    In standard within-subject analyses of event-related fMRI data, two steps are usually performed separately: detection of brain activity and estimation of the hemodynamic response. Because these two steps are inherently linked, we adopt the so-called region-based Joint Detection-Estimation (JDE) framework that addresses this joint issue using a multivariate inference for detection and estimation. JDE is built by making use of a regional bilinear generative model of the BOLD response and constraining the parameter estimation by physiological priors using temporal and spatial information in a Markovian model. In contrast to previous works that use Markov Chain Monte Carlo (MCMC) techniques to sample the resulting intractable posterior distribution, we recast the JDE into a missing data framework and derive a Variational Expectation-Maximization (VEM) algorithm for its inference. A variational approximation is used to approximate the Markovian model in the unsupervised spatially adaptive JDE inference, which allows automatic fine-tuning of spatial regularization parameters. It provides a new algorithm that exhibits interesting properties in terms of estimation error and computational cost compared to the previously used MCMC-based approach. Experiments on artificial and real data show that VEM-JDE is robust to model mis-specification and provides computational gain while maintaining good performance in terms of activation detection and hemodynamic shape recovery. PMID:23096056

  17. Surface plasmon resonance biosensor detects the downstream events of active PKCbeta in antigen-stimulated mast cells.

    PubMed

    Tanaka, Maiko; Hiragun, Takaaki; Tsutsui, Tomoko; Yanase, Yuhki; Suzuki, Hidenori; Hide, Michihiro

    2008-06-15

    Surface plasmon resonance (SPR) biosensors detect large changes of angle of resonance (AR) when RBL-2H3 mast cells are cultured on a sensor chip and stimulated with antigen. However, the details of the molecular events responsible for such large changes of AR remained unknown. In this study, we investigated the relationship between intracellular signaling events induced by antigen and the change of AR, by genetic manipulation of intracellular signaling molecules: spleen tyrosine kinase (Syk), src-like adaptor protein (SLAP), linker for activation of T cells (LAT), growth-factor-receptor-bound protein 2 (Grb2), Grb2-related adaptor protein (Gads), and isotypes of protein kinase C (PKC). RBL-2H3 mast cells overexpressing dominant-negative Syk or SLAP, which both interfere with active Syk, exhibited only minimal increase of AR in response to antigen stimulation. Likewise, the interference of the activation of LAT and Gads, by expressing dominant-negative LAT and Gads, respectively, resulted in nearly complete suppression of the antigen-induced increase of AR. The cells overexpressing PKCs, apart from PKCbeta, showed a reduced extent of increase of AR in response to antigen stimulation. Moreover, the introduction of the small interfering RNA targeted against PKCbeta suppressed the antigen-induced increase of AR. These results indicate that the activation of Syk, LAT, Gads, and subsequent PKCbeta is indispensable for the antigen-induced increase of AR of mast cells detected by SPR biosensors. PMID:18339533

  18. The direct detection of boosted dark matter at high energies and PeV events at IceCube

    SciTech Connect

    Bhattacharya, A.; Gandhi, R.; Gupta, A.

    2015-03-13

    We study the possibility of detecting dark matter directly via a small but energetic component that is allowed within present-day constraints. Drawing closely upon the fact that neutral current neutrino nucleon interactions are indistinguishable from DM-nucleon interactions at low energies, we extend this feature to high energies for a small, non-thermal but highly energetic population of DM particle χ, created via the decay of a significantly more massive and long-lived non-thermal relic Φ, which forms the bulk of DM. If χ interacts with nucleons, its cross-section, like the neutrino-nucleus coherent cross-section, can rise sharply with energy leading to deep inelastic scattering, similar to neutral current neutrino-nucleon interactions at high energies. Thus, its direct detection may be possible via cascades in very large neutrino detectors. As a specific example, we apply this notion to the recently reported three ultra-high energy PeV cascade events clustered around 1 – 2 PeV at IceCube (IC). We discuss the features which may help discriminate this scenario from one in which only astrophysical neutrinos constitute the event sample in detectors like IC.

  19. The direct detection of boosted dark matter at high energies and PeV events at IceCube

    SciTech Connect

    Bhattacharya, A.; Gandhi, R.; Gupta, A.

    2015-03-13

    We study the possibility of detecting dark matter directly via a small but energetic component that is allowed within present-day constraints. Drawing closely upon the fact that neutral current neutrino nucleon interactions are indistinguishable from DM-nucleon interactions at low energies, we extend this feature to high energies for a small, non-thermal but highly energetic population of DM particle χ, created via the decay of a significantly more massive and long-lived non-thermal relic ϕ, which forms the bulk of DM. If χ interacts with nucleons, its cross-section, like the neutrino-nucleus coherent cross-section, can rise sharply with energy leading to deep inelastic scattering, similar to neutral current neutrino-nucleon interactions at high energies. Thus, its direct detection may be possible via cascades in very large neutrino detectors. As a specific example, we apply this notion to the recently reported three ultra-high energy PeV cascade events clustered around 1−2 PeV at IceCube (IC). We discuss the features which may help discriminate this scenario from one in which only astrophysical neutrinos constitute the event sample in detectors like IC.

  20. The direct detection of boosted dark matter at high energies and PeV events at IceCube

    DOE PAGESBeta

    Bhattacharya, A.; Gandhi, R.; Gupta, A.

    2015-03-13

    We study the possibility of detecting dark matter directly via a small but energetic component that is allowed within present-day constraints. Drawing closely upon the fact that neutral current neutrino nucleon interactions are indistinguishable from DM-nucleon interactions at low energies, we extend this feature to high energies for a small, non-thermal but highly energetic population of DM particle χ, created via the decay of a significantly more massive and long-lived non-thermal relic Φ, which forms the bulk of DM. If χ interacts with nucleons, its cross-section, like the neutrino-nucleus coherent cross-section, can rise sharply with energy leading to deep inelastic scattering, similar to neutral current neutrino-nucleon interactions at high energies. Thus, its direct detection may be possible via cascades in very large neutrino detectors. As a specific example, we apply this notion to the recently reported three ultra-high energy PeV cascade events clustered around 1 – 2 PeV at IceCube (IC). We discuss the features which may help discriminate this scenario from one in which only astrophysical neutrinos constitute the event sample in detectors like IC.

  1. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    NASA Astrophysics Data System (ADS)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M. Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In recent decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open source, graphical software tool, named APASVO, which allows performing picking tasks in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, the graphical tool is complemented by two additional command-line tools, an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
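    The STA-LTA trigger mentioned above can be sketched in a few lines. The window lengths, threshold, and synthetic trace below are illustrative choices, not APASVO's own defaults:

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    """Short-term / long-term average ratio on the squared trace.

    trace   : 1-D seismic signal
    fs      : sampling rate (Hz)
    sta_win : short window length (s), lta_win : long window length (s)
    """
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    cf = np.asarray(trace, dtype=float) ** 2           # characteristic function
    csum = np.concatenate(([0.0], np.cumsum(cf)))      # running sums -> O(n) windows
    ratio = np.zeros(len(cf))
    for i in range(nlta, len(cf) + 1):                 # wait for a full LTA window
        sta = (csum[i] - csum[i - nsta]) / nsta
        lta = (csum[i] - csum[i - nlta]) / nlta
        ratio[i - 1] = sta / lta if lta > 0 else 0.0
    return ratio

# Synthetic trace: background noise with a high-amplitude burst at sample 1500
rng = np.random.default_rng(0)
fs = 100.0
trace = rng.normal(0, 1, 3000)
trace[1500:1600] += rng.normal(0, 10, 100)

ratio = sta_lta(trace, fs)
onset = int(np.argmax(ratio > 5.0))                    # first threshold crossing
```

    This coarse onset is what the abstract calls "a relatively imprecise estimation"; a picker such as AMPA or an AR-AIC method would then refine it.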

  2. Microlensing events by Proxima Centauri in 2014 and 2016: Opportunities for mass determination and possible planet detection

    SciTech Connect

    Sahu, Kailash C.; Bond, Howard E.; Anderson, Jay; Dominik, Martin

    2014-02-20

    We have found that Proxima Centauri, the star closest to our Sun, will pass close to a pair of faint background stars in the next few years. Using Hubble Space Telescope (HST) images obtained in 2012 October, we determine that the passage close to a mag 20 star will occur in 2014 October (impact parameter 1.6″), and to a mag 19.5 star in 2016 February (impact parameter 0.5″). As Proxima passes in front of these stars, the relativistic deflection of light will cause shifts in the positions of the background stars of ∼0.5 and 1.5 mas, respectively, readily detectable by HST imaging, and possibly by Gaia and ground-based facilities such as the Very Large Telescope. Measurement of these astrometric shifts offers a unique and direct method to measure the mass of Proxima. Moreover, if Proxima has a planetary system, the planets may be detectable through their additional microlensing signals, although the probability of such detections is small. With astrometric accuracies of 0.03 mas (achievable with HST spatial scanning), centroid shifts caused by Jovian planets are detectable at separations of up to 2.0″ (corresponding to 2.6 AU at the distance of Proxima), and centroid shifts by Earth-mass planets are detectable within a small band of 8 mas (corresponding to 0.01 AU) around the source trajectories. Jovian planets within a band of about 28 mas (corresponding to 0.036 AU) around the source trajectories would produce a brightening of the source by >0.01 mag and could hence be detectable. Estimated timescales of the astrometric and photometric microlensing events due to a planet range from a few hours to a few days, and both methods would provide direct measurements of the planetary mass.
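    The quoted shifts follow from the point-lens relations. As a sanity check, assuming a Proxima mass of ≈0.12 M⊙ and a distance of 1.30 pc (values not stated in the abstract) and treating the background stars as effectively at infinity, the Einstein radius and far-field centroid shifts come out close to the abstract's numbers:

```python
import math

G, c = 6.674e-11, 2.998e8              # SI units
M_SUN, PC = 1.989e30, 3.086e16
RAD_TO_MAS = 180 / math.pi * 3600e3    # radians -> milliarcseconds

M = 0.12 * M_SUN                       # assumed mass of Proxima Centauri
d_lens = 1.30 * PC                     # assumed distance; sources ~ at infinity

# Einstein radius of a point lens with a distant source:
# theta_E = sqrt(4 G M / (c^2 D_L))
theta_E = math.sqrt(4 * G * M / (c**2 * d_lens)) * RAD_TO_MAS   # ~27 mas

def astrometric_shift(impact_mas):
    """Centroid shift, valid in the far-field regime impact >> theta_E."""
    return theta_E**2 / impact_mas

shift_2014 = astrometric_shift(1600.0)   # 1.6" approach -> ~0.5 mas
shift_2016 = astrometric_shift(500.0)    # 0.5" approach -> ~1.5 mas
```

    The closer 2016 approach gives the larger shift, matching the ∼0.5 and 1.5 mas figures above.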

  3. Analyzing Protease Specificity and Detecting in Vivo Proteolytic Events Using Tandem Mass Spectrometry

    SciTech Connect

    Gupta, Nitin; Hixson, Kim K.; Culley, David E.; Smith, Richard D.; Pevzner, Pavel A.

    2010-07-01

    While trypsin remains the most commonly used protease in mass spectrometry, other proteases may be employed for increasing peptide coverage or generating overlapping peptides. Knowledge of the accurate specificity rules of these proteases is helpful for database search tools to detect peptides, and becomes crucial when mass spectrometry is used to discover in vivo proteolytic cleavages. In this study, we use tandem mass spectrometry to analyze the specificity rules of selected proteases and describe MS-Proteolysis, a software tool for identifying putative sites of in vivo proteolytic cleavage. Our analysis suggests that the specificity rules for some commonly used proteases can be improved, e.g., we find that V8 protease cuts not only after Asp and Glu, as currently expected, but also shows a smaller propensity to cleave after Gly for the conditions tested in this study. Finally, we show that comparative analysis of multiple proteases can be used to detect putative in vivo proteolytic sites on a proteome-wide scale.

  4. Dual-particle imaging system based on simultaneous detection of photon and neutron collision events

    NASA Astrophysics Data System (ADS)

    Poitrasson-Rivière, Alexis; Hamel, Michael C.; Polack, J. Kyle; Flaska, Marek; Clarke, Shaun D.; Pozzi, Sara A.

    2014-10-01

    A dual-particle imaging (DPI) system capable of simultaneously detecting and imaging fast neutrons and photons has been designed and built. Imaging fast neutrons and photons simultaneously is particularly desirable for nuclear nonproliferation and/or safeguards applications because typical sources of interest (special nuclear material) emit both particle types. The DPI system consists of three detection planes: the first two planes consist of organic-liquid scintillators and the third plane consists of NaI(Tl) inorganic scintillators. Pulse shape discrimination technique(s) may be used for the liquid scintillators to differentiate neutron and photon pulses, whereas the NaI(Tl) scintillators are highly insensitive to neutrons. A prototype DPI system was set up using a digital data acquisition system as a proof of concept. Initial measurements showed potential for use of the DPI system with special nuclear material. The DPI system has efficiencies of the order of 10⁻⁴ correlated counts per incident particle for both neutron and photon correlated counts, with simple-backprojection images displaying peaks within a few degrees of the source location. This uncertainty is expected to decrease with more extensive data interpretation.

  5. Clinical Experiments of Communication by ALS Patient Utilizing Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic Lateral Sclerosis (ALS) patients are unable to communicate their wishes successfully even though their cognition is normal, so the need for Communication Aids (CA) for ALS patients is clear. The authors therefore focused on the Event-Related Potential (ERP), which is elicited primarily by target visual and auditory stimuli. P200, N200 and P300 are components of the ERP; these potentials are elicited when the subject focuses attention on stimuli that appear infrequently. An ALS patient participated in two experiments. In the first experiment, a target word out of five words on a computer display was specified. Each of the five words was linked to an electric appliance, allowing the ALS patient to switch on a target appliance by ERP. In the second experiment, a target word in a 5×5 matrix was specified by measuring the ERP. The rows and columns of the matrix were reversed in random order, and the word at the crossing point of the row and column containing the target was identified as the target word. The rates of correct judgment in the first and second experiments were 100% with N200 and 96% with P200. For practical use of this system, it is very important to determine suitable communication algorithms for each patient by performing these experiments and evaluating the results.
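    The row/column paradigm of the second experiment can be sketched as follows. The epoch length, ERP window, repetition count, and noise levels are invented for illustration; a real system would score measured P200/N200/P300 amplitudes from EEG:

```python
import numpy as np

N_ROWS, N_COLS, N_SAMPLES = 5, 5, 50
ERP_WIN = slice(25, 40)                       # assumed post-stimulus ERP window

def decode_rc_speller(epochs, labels):
    """epochs: (n_flashes, N_SAMPLES) single-flash ERP epochs.
    labels: flashed line index, 0..4 = rows, 5..9 = columns.
    The target cell is the crossing of the best-scoring row and column."""
    scores = np.array([epochs[labels == k][:, ERP_WIN].mean()
                       for k in range(N_ROWS + N_COLS)])
    return int(np.argmax(scores[:N_ROWS])), int(np.argmax(scores[N_ROWS:]))

# Simulate 10 flash repetitions of each row/column; target cell is (row 2, col 3)
rng = np.random.default_rng(1)
labels = np.repeat(np.arange(N_ROWS + N_COLS), 10)
epochs = rng.normal(0, 1, (len(labels), N_SAMPLES))
target_lines = {2, N_ROWS + 3}                # row 2 and column 3
for i, lab in enumerate(labels):
    if lab in target_lines:                   # attended flashes evoke an ERP bump
        epochs[i, ERP_WIN] += 5.0

row, col = decode_rc_speller(epochs, labels)  # -> (2, 3)
```

    Averaging over repeated flashes is what makes the low-amplitude ERP separable from background EEG noise.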

  6. Seismic Event Detection and Localization Using Ionospheric Disturbances in TEC Measurements

    NASA Astrophysics Data System (ADS)

    Loveland, R.; Heavner, M.; Komjathy, A.; Mannucci, A. J.; Suszcynsky, D. M.

    2013-12-01

    Ionospheric signatures from a variety of seismic phenomena (e.g. earthquakes, volcanoes, nuclear tests, mining blasts) have been documented by a number of observers. These phenomena have been shown to generate Rayleigh waves, acoustic waves, and gravity waves radiating outward from sources with various levels of localization. The propagation of these waves can be observed in corresponding thin-shell ionospheric Total Electron Content (TEC) observations derived from ground-based GNSS networks. GNSS station network density varies; Japan GEONET density is high with average spacing of 25 km; USGS density is lower with large variability depending on the geographic location. In this effort we attempt to localize underlying seismic sources from GNSS TEC observations. We assume isotropic propagation speeds with observations that are periodic temporally but only sparsely sampled spatially, and apply optimization methods to the resulting model to find the best candidates for source location. We apply this to both Japan GEONET and USGS data, and compare the resulting locations and levels of TEC disturbance to known event locations and magnitudes.
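    Under the isotropic-propagation assumption described above, localization reduces to minimizing arrival-time residuals over candidate source positions. The grid search below is a deliberately simple stand-in for the authors' optimization; the station geometry, propagation speed, and grid are all invented:

```python
import numpy as np

def locate_source(stations, arrivals, speed, candidates):
    """Grid-search the source position minimizing travel-time residuals.
    Model: t_i = t0 + |x_i - s| / v, with t0 fit analytically per candidate."""
    best_cost, best_s, best_t0 = np.inf, None, None
    for s in candidates:
        d = np.linalg.norm(stations - s, axis=1)
        t0 = np.mean(arrivals - d / speed)       # least-squares origin time
        cost = np.sum((arrivals - t0 - d / speed) ** 2)
        if cost < best_cost:
            best_cost, best_s, best_t0 = cost, s, t0
    return best_s, best_t0

# Synthetic check: 8 stations, true source at (30, 40) km, v = 0.8 km/s
rng = np.random.default_rng(2)
stations = rng.uniform(0, 100, (8, 2))
true_src, v, t0_true = np.array([30.0, 40.0]), 0.8, 12.0
arrivals = t0_true + np.linalg.norm(stations - true_src, axis=1) / v

xx, yy = np.meshgrid(np.arange(0, 101, 5.0), np.arange(0, 101, 5.0))
candidates = np.column_stack([xx.ravel(), yy.ravel()])
src, t0 = locate_source(stations, arrivals, v, candidates)
```

    With sparse spatial sampling, as in the TEC case, the residual surface can be flat; denser networks such as GEONET constrain the minimum much more tightly.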

  7. Reconstruction of the Magnetkoepfl rockfall event - Detecting rock fall release zones using terrestrial laser scanning, Hohe Tauern, Austria

    NASA Astrophysics Data System (ADS)

    Hartmeyer, I.; Keuschnig, M.; Delleske, R.; Schrott, L.

    2012-04-01

    Instability of rock faces in high mountain areas is an important risk factor for man and infrastructure, particularly within the context of climate change. Numerous rock fall events in the European Alps suggest an increasing occurrence of mass movements due to rising temperatures in recent years. Within the MOREXPERT project ('Monitoring Expert System for Hazardous Rock Walls') a new long-term monitoring site for mass movement and permafrost interaction has been initiated in the Austrian Alps. The study area is located at the Kitzsteinhorn (Hohe Tauern), a particularly interesting site for the investigation of glacier retreat and potential permafrost degradation and their respective consequences for the stability of alpine rock faces. To detect and quantify changes occurring at the terrain surface an extensive terrestrial laser scanning (TLS) monitoring campaign was started in 2011. TLS creates three-dimensional high-resolution images of the scanned area allowing precise quantification of changes in geometry and volume in steep rock faces over distances of up to several hundreds of meters. Within the TLS monitoring campaign at the Kitzsteinhorn a large number of differently dimensioned rock faces are examined (varying size, slope inclination, etc.). Scanned areas include the Kitzsteinhorn northwest and south face, the Magnetkoepfl east face as well as a couple of small rock faces in the vicinity of the summit station. During the night of August 27-28, 2011 a rock fall event was documented by employees of the cable car company. The release zone could not immediately be detected. The east face of the Magnetkoepfl covers approximately 70 meters in height and about 200 meters in width. It is made up of calcareous mica-schist and displays an abundance of well-developed joint sets with large joint apertures. Meteorological data from a weather station located at the same altitude (2,950 m) and just 500 m away from the release zone show that the rock fall event

  8. ON THE DETECTABILITY OF A PREDICTED MESOLENSING EVENT ASSOCIATED WITH THE HIGH PROPER MOTION STAR VB 10

    SciTech Connect

    Lepine, Sebastien; DiStefano, Rosanne

    2012-04-10

    Extrapolation of the astrometric motion of the nearby low-mass star VB 10 indicates that sometime in late 2011 December or during the first 2-3 months of 2012, the star will make a close approach to a background point source. Based on astrometric uncertainties, we estimate a 1 in 2 chance that the distance of closest approach ρ_min will be less than 100 mas, a 1 in 5 chance that ρ_min < 50 mas, and a 1 in 10 chance that ρ_min < 20 mas. The last would result in a microlensing event with a 6% magnification in the light from the background source and an astrometric shift of 3.3 mas. The lensing signal will however be significantly diluted by the light from VB 10, which is 1.5 mag brighter than the background source in B band, 5 mag brighter in I band, and 10 mag brighter in K band, making the event undetectable in all but the bluer optical bands. However, we show that if VB 10 happens to harbor a ∼1 M_J planet on a moderately wide (≈0.18–0.84 AU) orbit, there is a chance (1% to more than 10%, depending on the distance of closest approach and orbital period and inclination) that a passage of the planet closer to the background source will result in a secondary event of higher magnification. The detection of secondary events could be made possible with a several-times-per-night multi-site monitoring campaign.

  9. Application of stochastic discrete event system framework for detection of induced low rate TCP attack.

    PubMed

    Barbhuiya, F A; Agarwal, Mayank; Purwar, Sanketh; Biswas, Santosh; Nandi, Sukumar

    2015-09-01

    TCP is the most widely accepted transport layer protocol. The major emphasis during the development of TCP was its functionality and efficiency. However, not much consideration was given to studying the possibility of attackers exploiting the protocol, which has led to several attacks on TCP. This paper deals with the induced low rate TCP attack. Since the attack is relatively new, only a few schemes have been proposed to mitigate it. However, the main issues with these schemes are scalability, required changes to the TCP header, lack of formal frameworks, etc. In this paper, we have adapted the stochastic DES framework for detecting the attack, which addresses most of these issues. We have successfully deployed and tested the proposed DES based IDS on a test bed. PMID:26073643

  10. High-throughput, low-cost, and event-specific polymerase chain reaction detection of herbicide tolerance in genetically modified soybean A2704-12.

    PubMed

    Ma, H; Li, H; Li, J; Wang, X F; Wei, P C; Li, L; Yang, J B

    2014-01-01

    The aim of this study was to develop an event-specific qualitative and real-time quantitative polymerase chain reaction (PCR) method for detection of herbicide-tolerance genetically modified (GM) soybean A2704-12. The event-specific PCR primers were designed, based on the 5'-flanking integration sequence in the soybean genome, to amplify the 239-bp target fragment. Employing the same event-specific primers, qualitative PCR and real-time quantitative PCR detection methods were successfully developed. The results showed that the A2704-12 event could be specifically distinguished from other GM soybean events. In the qualitative PCR assay, the limit of detection was 0.05%, and in the real-time quantitative PCR assay, the limit of detection was less than 0.01%. Moreover, our genomic DNA (gDNA) extraction protocol is high-throughput, safe, and low-cost. The event-specific PCR assay system is cost-efficient by using SYBR Green I in real-time PCR, and by using the same primers in both the qualitative and quantitative PCR assays. We therefore developed a high-throughput, low-cost, and event-specific qualitative and quantitative PCR detection method for GM soybean A2704-12. The method would be useful for market supervision and management of GM soybean A2704-12 due to its high specificity and sensitivity. PMID:24615034
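    Relative quantification with an event-specific assay of this kind is typically done against an endogenous reference gene and a calibrator of known GM content. The comparative-Ct (ΔΔCt) arithmetic below is a generic sketch of that calculation, not the authors' exact protocol, and the Ct values are invented:

```python
def relative_gm_content(ct_event, ct_ref, ct_event_cal, ct_ref_cal,
                        calibrator_pct=1.0, efficiency=2.0):
    """Comparative-Ct (delta-delta-Ct) quantification.

    ct_event / ct_ref         : sample Ct for the event-specific and
                                endogenous reference assays
    ct_event_cal / ct_ref_cal : the same for a calibrator of known GM %
    efficiency = 2.0 assumes a perfect doubling each PCR cycle.
    """
    ddct = (ct_event - ct_ref) - (ct_event_cal - ct_ref_cal)
    return calibrator_pct * efficiency ** (-ddct)

# The sample's event target amplifies 2 cycles earlier (relative to the
# reference gene) than a 1% GM calibrator -> 4x the calibrator's content.
pct = relative_gm_content(ct_event=28.0, ct_ref=22.0,
                          ct_event_cal=30.0, ct_ref_cal=22.0)  # -> 4.0 (% GM)
```

    Normalizing to the reference gene is what makes the estimate independent of the amount of genomic DNA loaded into each reaction.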

  11. Precursory Acoustic Signals Detection in Rockfall Events by Means of Optical Fiber Sensors

    NASA Astrophysics Data System (ADS)

    Schenato, L.; Marcato, G.; Gruca, G.; Iannuzzi, D.; Palmieri, L.; Galtarossa, A.; Pasuto, A.

    2012-12-01

    Rockfalls represent a major source of hazard in mountain areas: they occur at the apex of a process of stress accumulation in the unstable slope, during which part of the accumulated energy is released in small internal cracks. These cracks and the related acoustic emissions (AE) can, therefore, be used as precursory signals, through which the unstable rock could be monitored. In particular, according to previous scientific literature AE can be monitored in the range 20–100 kHz. With respect to traditional AE sensors, such as accelerometers and piezoelectric transducers, fiber optic sensors (FOSs) may provide a reliable solution, potentially offering more robustness to electromagnetic interference, smaller form factor, multiplexing ability, increased distance range, and higher sensitivity. To explore this possibility, in this work we have experimentally analyzed two interferometric fiber optical sensors for AE detection in rock masses. In particular, the first sensor is made of 100 m of G.657 optical fiber, tightly wound on an aluminum flanged hollow mandrel (inner diameter 30 mm, height 42 mm) that is isolated from the environment with acoustic absorbing material. A 4-cm-long M10 screw, which acts also as the main mean of acoustic coupling between the rock and the sensor, is used to fasten the sensor to the rock. This fiber coil sensor (FCS) is inserted in the sensing arm of a fiber Mach-Zehnder interferometer. The second sensor consists of a microcantilever carved on the top of a cylindrical silica ferrule, with a marked mechanical resonance at about 12.5 kHz (Q-factor of about 400). A standard single mode fiber is housed in the same ferrule and the gap between the cantilever and the fiber end face acts as a vibration-sensitive Fabry-Perot cavity, interrogated with a low-coherence laser, tuned at the quadrature point of the cavity. The sensor is housed in a 2-cm-long M10 bored bolt. Performance was compared with that of a standard piezo

  12. Automatically Detecting Acute Myocardial Infarction Events from EHR Text: A Preliminary Study.

    PubMed

    Zheng, Jiaping; Yarzebski, Jorge; Ramesh, Balaji Polepalli; Goldberg, Robert J; Yu, Hong

    2014-01-01

    The Worcester Heart Attack Study (WHAS) is a population-based surveillance project examining trends in the incidence, in-hospital, and long-term survival rates of acute myocardial infarction (AMI) among residents of central Massachusetts. It provides insights into various aspects of AMI. Much of the data has been assessed manually. We are developing supervised machine learning approaches to automate this process. Since the existing WHAS data cannot be used directly for an automated system, we first annotated the AMI information in electronic health records (EHR). With a strict inter-annotator agreement above 0.74 and a relaxed agreement above 0.9 (Cohen's κ), we annotated 105 EHR discharge summaries (135k tokens). Subsequently, we applied the state-of-the-art supervised machine-learning model, Conditional Random Fields (CRFs), for AMI detection. We explored different approaches to overcome the data sparseness challenge and our results showed that cluster-based word features achieved the highest performance. PMID:25954440

  13. Age Dating Merger Events in Early Type Galaxies via the Detection of AGB Light

    NASA Technical Reports Server (NTRS)

    Bothun, G.

    2005-01-01

    A thorough statistical analysis of the J-H vs. H-K color plane of all detected early type galaxies in the 2MASS catalog with velocities less than 5000 km/s has been performed. This all-sky survey is not sensitive to one particular galactic environment, and therefore a representative range of early type galaxy environments has been sampled. Virtually all N-body simulations of major mergers produce a central starburst due to rapid collection of gas. This central starburst is of sufficient amplitude to change the stellar population in the central regions of the galaxy. Intermediate age populations are revealed by the presence of AGB stars, which drive the central colors redder in H-K relative to the J-H baseline. This color anomaly has a lifetime of 2-5 billion years depending on the amplitude of the initial starburst. Employing this technique on the entire 2MASS sample (several hundred galaxies) reveals that the AGB signature occurs less than 1% of the time. This is a straightforward indication that virtually all nearby early type galaxies have not had a major merger within the last few billion years.

  14. Detection of Extreme Climate Event Impacts to Terrestrial Productivity From Airborne Hyperspectral Imagery

    NASA Astrophysics Data System (ADS)

    Desai, A. R.; DuBois, S.; Singh, A.; Serbin, S.; Goulden, M.; Baldocchi, D. D.; Oechel, W. C.; Kruger, E. L.; Townsend, P. A.

    2015-12-01

    Changes in drought frequency and intensity are likely to be some of the largest climate anomalies to influence the net productivity of ecosystems, especially in already water-limited regions. However, understanding of the physiological mechanisms that drive this response is limited by leaf-level measurements that are mostly infrequent and small in scale. Here, we integrated eddy covariance flux tower estimates of gross primary productivity (GPP) across an elevation gradient in California with airborne imagery from the NASA HyspIRI Preparatory campaign to evaluate the potential of hyperspectral imagery to detect responses of GPP to prolonged drought. We observed a number of spectral features in the visible, infrared, and shortwave infrared regions that yielded stronger linkages with spatial and temporal variation in GPP than traditional broadband indices, across a range of ecosystems in California experiencing water stress in the past three years. Further, partial least squares regression (PLSR) modeling offers the ability to generate predictive models of GPP from narrowband hyperspectral remote sensing that directly links plant chemistry and spectral properties to productivity, and could serve as a significant advance over broadband remote sensing of GPP anomalies.
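
    The PLSR modeling mentioned above regresses a response (here, GPP) onto many correlated predictor bands via latent components. The sketch below implements only a single-component PLS (PLS1) on synthetic data to show the core idea; a real analysis would use multiple components and cross-validation, and the band count, sample size, and coefficients here are invented.

```python
import numpy as np

# Hedged sketch: single-component partial least squares (PLS1) regression,
# the core of PLSR. Data are synthetic, not from the HyspIRI campaign.
def pls1_fit(X, y):
    """Fit one PLS component; return (w, b, x_mean, y_mean)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # direction of maximal covariance with y
    t = Xc @ w                      # scores along that direction
    b = (t @ yc) / (t @ t)          # regress y on the score
    return w, b, x_mean, y_mean

def pls1_predict(X, w, b, x_mean, y_mean):
    return y_mean + b * ((X - x_mean) @ w)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))       # 50 samples x 10 "bands"
y = X @ np.linspace(1, 0.1, 10)     # GPP-like response, linear in the bands
w, b, xm, ym = pls1_fit(X, y)
pred = pls1_predict(X, w, b, xm, ym)
```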

  15. High-sensitivity fluorescence anisotropy detection of protein-folding events: application to alpha-lactalbumin.

    PubMed

    Canet, D; Doering, K; Dobson, C M; Dupont, Y

    2001-04-01

    An experimental procedure has been devised to record simultaneously fluorescence intensity and fluorescence anisotropy. A photoelastic modulator on the excitation beam enables the anisotropy signal to be recorded in one pass using a single photomultiplier tube and eliminates the need for a polarizer on the emission path. In conjunction with a stopped-flow mixer, providing a time-resolved capability, this procedure was used to study the refolding of apo alpha-lactalbumin following dilution from guanidinium chloride. Although the fluorescence intensity does not change detectably, the fluorescence anisotropy was found to resolve the conformational changes occurring between the initial unfolded state and the molten globule state formed either kinetically during refolding at pH 7.0 or at equilibrium at pH 2.0 (A-state). This result provides further evidence that fluorescence anisotropy is a valuable probe of protein structural transitions and that the information it provides concerning the rotational mobility of a fluorophore can be complementary to the information about the local environment provided by fluorescence intensity. PMID:11259312
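
    The anisotropy signal discussed above follows the standard steady-state relation r = (I∥ − I⊥) / (I∥ + 2·I⊥). A minimal sketch, with illustrative intensities rather than data from the alpha-lactalbumin experiment:

```python
# Hedged sketch of the standard fluorescence anisotropy relation.
# Intensities are illustrative numbers only.
def anisotropy(i_par, i_perp):
    """Fluorescence anisotropy from parallel/perpendicular intensities."""
    total = i_par + 2.0 * i_perp
    if total == 0:
        raise ValueError("total intensity must be nonzero")
    return (i_par - i_perp) / total

r = anisotropy(3.0, 1.0)  # 0.4, the one-photon theoretical maximum
```

    Equal intensities give r = 0 (a freely rotating fluorophore), while increasing r reflects restricted rotational mobility, which is why the quantity resolves conformational changes that leave the total intensity unchanged.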

  16. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  17. SPR and SPR Imaging: Recent Trends in Developing Nanodevices for Detection and Real-Time Monitoring of Biomolecular Events.

    PubMed

    Puiu, Mihaela; Bala, Camelia

    2016-01-01

    In this paper we review the underlying principles of the surface plasmon resonance (SPR) technique, particularly emphasizing its advantages along with its limitations regarding the ability to discriminate between the specific binding response and interfering effects from biological samples. Although SPR sensors have been under development for almost three decades, SPR detection is not yet able to reduce the time-consuming steps of the analysis, and is hardly amenable to the miniaturized, portable platforms required in point-of-care (POC) testing. Recent advances in near-field optics have emerged, resulting in the development of SPR imaging (SPRi) as a powerful optical, label-free monitoring tool for multiplexed detection and monitoring of biomolecular events. The microarray design of SPRi chips, incorporating various metallic nanostructures, makes these optofluidic devices more suitable for diagnosis and near-patient testing than traditional SPR sensors. The latest developments indicate SPRi detection as the most promising surface plasmon-based technique fulfilling the demands for implementation in lab-on-a-chip (LOC) technologies. PMID:27314345

  18. SPR and SPR Imaging: Recent Trends in Developing Nanodevices for Detection and Real-Time Monitoring of Biomolecular Events

    PubMed Central

    Puiu, Mihaela; Bala, Camelia

    2016-01-01

    In this paper we review the underlying principles of the surface plasmon resonance (SPR) technique, particularly emphasizing its advantages along with its limitations regarding the ability to discriminate between the specific binding response and interfering effects from biological samples. Although SPR sensors have been under development for almost three decades, SPR detection is not yet able to reduce the time-consuming steps of the analysis, and is hardly amenable to the miniaturized, portable platforms required in point-of-care (POC) testing. Recent advances in near-field optics have emerged, resulting in the development of SPR imaging (SPRi) as a powerful optical, label-free monitoring tool for multiplexed detection and monitoring of biomolecular events. The microarray design of SPRi chips, incorporating various metallic nanostructures, makes these optofluidic devices more suitable for diagnosis and near-patient testing than traditional SPR sensors. The latest developments indicate SPRi detection as the most promising surface plasmon-based technique fulfilling the demands for implementation in lab-on-a-chip (LOC) technologies. PMID:27314345

  19. Detection of explosive events by monitoring acoustically-induced geomagnetic perturbations

    SciTech Connect

    Lewis, J P; Rock, D R; Shaeffer, D L; Warshaw, S I

    1999-10-07

    The Black Thunder Coal Mine (BTCM) near Gillette, Wyoming was used as a test bed to determine the feasibility of detecting explosion-induced geomagnetic disturbances with ground-based induction magnetometers. Two magnetic observatories were fielded at distances of 50 km and 64 km geomagnetically north from the northernmost edge of BTCM. Each observatory consisted of three separate but mutually orthogonal magnetometers, Global Positioning System (GPS) timing, battery and solar power, a data acquisition and storage system, and a three-axis seismometer. Explosions with yields of 1 to 3 kT of TNT equivalent occur approximately every three weeks at BTCM. We hypothesize that explosion-induced acoustic waves propagate upward and interact collisionally with the ionosphere to produce ionospheric electron density (and concomitant current density) perturbations which act as sources for geomagnetic disturbances. These disturbances propagate through an ionospheric Alfven waveguide that we postulate to be leaky (due to the imperfectly conducting lower ionospheric boundary). Consequently, wave energy may be observed on the ground. We observed transient pulses, known as Q-bursts, with pulse widths about 0.5 s and with spectral energy dominated by the Schumann resonances. These resonances appear to be excited in the earth-ionosphere cavity by Alfven solitons that may have been generated by the explosion-induced acoustic waves reaching the ionospheric E and F regions and that subsequently propagate down through the ionosphere to the atmosphere. In addition, we observe late time (> 800 s) ultra low frequency (ULF) geomagnetic perturbations that appear to originate in the upper F region ({approximately}300 km) and appear to be caused by the explosion-induced acoustic wave interacting with that part of the ionosphere. We suggest that explosion-induced Q-bursts may be discriminated from naturally occurring Q-bursts by association of the former with the late time explosion-induced ULF

  20. Multiple-Threshold Event Detection and Other Enhancements to the Virtual Seismologist (VS) Earthquake Early Warning Algorithm

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Caprio, M.; Cua, G. B.; Heaton, T. H.; Clinton, J. F.; Wiemer, S.

    2009-12-01

    The Virtual Seismologist (VS) algorithm is a Bayesian approach to earthquake early warning (EEW) being implemented by the Swiss Seismological Service at ETH Zurich. The application of Bayes’ theorem in earthquake early warning states that the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS algorithm was one of three EEW algorithms involved in the California Integrated Seismic Network (CISN) real-time EEW testing and performance evaluation effort. Its compelling real-time performance in California over the last three years has led to its inclusion in the new USGS-funded effort to develop key components of CISN ShakeAlert, a prototype EEW system that could potentially be implemented in California. A significant portion of VS code development was supported by the SAFER EEW project in Europe. We discuss recent enhancements to the VS EEW algorithm. We developed and continue to test a multiple-threshold event detection scheme, which uses different association/location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to be declared an event to reduce false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and its requirement of at least 4 picks, as implemented by the Binder Earthworm phase associator) to a hybrid on-site/regional approach capable of providing a continuously evolving stream of EEW
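
    The multiple-threshold declaration logic described above can be sketched as a simple decision rule: one pick with a large P amplitude declares an event immediately, while low-amplitude picks must accumulate to a minimum count first. The threshold value and pick count below are illustrative placeholders, not the VS algorithm's tuned settings.

```python
# Hedged sketch of multiple-threshold event declaration.
# Amplitude threshold and minimum pick count are illustrative only.
def declare_event(pick_amplitudes, high_amp=1000.0, min_picks=4):
    """Return True once the accumulated picks justify declaring an event."""
    if any(a >= high_amp for a in pick_amplitudes):
        return True                       # one strong pick: declare immediately
    return len(pick_amplitudes) >= min_picks  # else wait for more stations

declared = declare_event([1500.0])        # single strong station suffices
```

    The asymmetry is the point: large events get maximum warning time from a single station, while small events pay a latency cost in exchange for fewer false alarms.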

  1. Adaptive and context-aware detection and classification of potential QoS degradation events in biomedical wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Abreu, Carlos; Miranda, Francisco; Mendes, Paulo M.

    2016-06-01

    The use of wireless sensor networks in healthcare has the potential to enhance the services provided to citizens. In particular, they play an important role in the development of state-of-the-art patient monitoring applications. Nevertheless, due to the critical nature of the data conveyed by such patient monitoring applications, they have to fulfil high standards of quality of service in order to obtain the confidence of all players in the healthcare industry. In this context, with respect to the quality of service provided by the wireless sensor network, this work presents an adaptive and context-aware method to detect and classify performance degradation events. The proposed method has the ability to catch the most significant and damaging variations in the metrics being used to quantify the quality of service provided by the network, without overreacting to small and innocuous variations in the metric's value.
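
    One simple way to "catch significant variations without overreacting to small ones", in the spirit of the method above, is to compare each QoS sample against an exponentially weighted moving average (EWMA) and flag only large relative deviations. This is a generic sketch, not the paper's algorithm; the smoothing factor and tolerance are invented.

```python
# Hedged sketch: EWMA-based detection of significant QoS deviations.
# Smoothing factor and tolerance are illustrative placeholders.
def detect_degradations(samples, alpha=0.3, tolerance=0.25):
    """Return indices of samples deviating more than tolerance from the EWMA."""
    events, ewma = [], samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - ewma) > tolerance * abs(ewma):
            events.append(i)              # significant change in the metric
        ewma = alpha * x + (1 - alpha) * ewma
    return events

flagged = detect_degradations([100, 101, 99, 60, 98, 100])
```

    Small fluctuations around 100 are absorbed by the running average, while the drop to 60 exceeds the relative tolerance and is flagged.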

  2. Automatic Event Detection in Search for Inter-Moss Loops in IRIS Si IV Slit-Jaw Images

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy R.; De Pontieu, Bart

    2015-01-01

    The high-resolution capabilities of the Interface Region Imaging Spectrometer (IRIS) mission have allowed the exploration of the finer details of the solar magnetic structure from the chromosphere to the lower corona that have previously been unresolved. Of particular interest to us are the relatively short-lived, low-lying magnetic loops that have footpoints in neighboring moss regions. These inter-moss loops have also appeared in several AIA pass bands, which are generally associated with temperatures that are at least an order of magnitude higher than that of the Si IV emission seen in the 1400 angstrom pass band of IRIS. While the emission lines seen in these pass bands can be associated with a range of temperatures, the simultaneous appearance of these loops in IRIS 1400 and AIA 171, 193, and 211 suggests that they are not in ionization equilibrium. To study these structures in detail, we have developed a series of algorithms to automatically detect signal brightening or events on a pixel-by-pixel basis and group them together as structures for each of the above data sets. These algorithms have successfully picked out all activity fitting certain adjustable criteria. The resulting groups of events are then statistically analyzed to determine which characteristics can be used to distinguish the inter-moss loops from all other structures. While a few characteristic histograms reveal that manually selected inter-moss loops lie outside the norm, a combination of several characteristics will need to be used to determine the statistical likelihood that a group of events be categorized automatically as a loop of interest. The goal of this project is to be able to automatically pick out inter-moss loops from an entire data set and calculate the characteristics that have previously been determined manually, such as length, intensity, and lifetime. We will discuss the algorithms, preliminary results, and current progress of automatic characterization.
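
    The grouping step described above, merging per-pixel brightening events into structures, is essentially connected-component labeling of a thresholded mask. A minimal sketch using 4-neighbour connectivity on a synthetic mask (the actual IRIS/AIA pipeline and its criteria are not specified in the abstract):

```python
from collections import deque

# Hedged sketch: group per-pixel detections into connected structures.
# The thresholded mask is synthetic, for illustration only.
def group_events(mask):
    """Group truthy pixels of a 2D mask into 4-connected components."""
    rows, cols = len(mask), len(mask[0])
    seen, groups = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                queue, comp = deque([(r, c)]), []
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                groups.append(comp)
    return groups

mask = [[1, 1, 0],
        [0, 0, 0],
        [0, 1, 1]]
structures = group_events(mask)
```

    Per-structure statistics such as length, total intensity, and lifetime can then be computed over each component's pixel list.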

  3. Event-related fast optical signal in a rapid object recognition task: improving detection by the Independent Component Analysis

    PubMed Central

    Medvedev, Andrei V.; Kainerstorfer, Jana; Borisov, Sergey V.; Barbour, Randall L.; VanMeter, John

    2008-01-01

    Noninvasive recording of fast optical signals presumably reflecting neuronal activity is a challenging task because of a relatively low signal-to-noise ratio. To improve detection of those signals in rapid object recognition tasks, we used the Independent Component Analysis (ICA) to reduce “global interference” (heartbeat and contribution of superficial layers). We recorded optical signals from the left prefrontal cortex in 10 right-handed participants with a continuous-wave instrument (DYNOT, NIRx, Brooklyn, NY). Visual stimuli were pictures of urban, landscape and seashore scenes with various vehicles as targets (target-to-non-target ratio 1:6) presented at ISI = 166 ms or 250 ms. Subjects mentally counted targets. Data were filtered at 2–30 Hz and artifactual components were identified visually (for heartbeat) and using the ICA weight matrix (for superficial layers). Optical signals were restored from the ICA components with artifactual components removed and then averaged over target and non-target epochs. After ICA processing, the event-related response was detected in 70–100% of subjects. The refined signal showed a significant decrease from baseline within 200–300 ms after targets and a slight increase after non-targets. The temporal profile of the optical signal corresponded well to the profile of a “differential ERP response”, the difference between targets and non-targets, which peaks at 200 ms in similar object detection tasks. These results demonstrate that the detection of fast optical responses with continuous-wave instruments can be improved through the ICA method, which is capable of removing noise, global interference and the activity of superficial layers. Fast optical signals may provide further information on brain processing during higher-order cognitive tasks such as rapid categorization of objects. PMID:18725213
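
    The restoration step described above, "optical signals were restored from the ICA components with artifactual components removed", amounts to zeroing the artifact rows of the component matrix and mapping back to channel space. The sketch below takes the unmixing matrix W as given (estimating it is the job of an ICA algorithm such as FastICA); the trivial identity W and the synthetic data are for illustration only.

```python
import numpy as np

# Hedged sketch: reconstruct channel data with chosen ICA components removed.
# W is assumed known here; the demo uses an identity unmixing matrix.
def remove_components(X, W, bad):
    """Reconstruct channel data X with the listed ICA components zeroed."""
    S = W @ X                        # component time courses, S = W X
    S[list(bad), :] = 0.0            # drop artifact components (e.g. heartbeat)
    return np.linalg.inv(W) @ S      # back to channel space

rng = np.random.default_rng(1)
X = rng.normal(size=(3, 100))        # 3 channels x 100 samples
W = np.eye(3)                        # trivial unmixing for the demo
cleaned = remove_components(X, W, bad={2})
```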

  4. NuLat: 3D Event Reconstruction of a ROL Detector for Neutrino Detection and Background Rejection

    NASA Astrophysics Data System (ADS)

    Yokley, Zachary; NuLat Collaboration

    2015-04-01

    NuLat is a proposed very-short-baseline reactor antineutrino experiment that employs a unique detector design, a Raghavan Optical Lattice (ROL), developed for the LENS solar neutrino experiment. The 3D lattice provides high spatial and temporal resolution and allows for energy deposition in each voxel to be determined independently of other voxels, as well as the time sequence associated with each voxel energy deposition. This unique feature arises from two independent means to spatially locate energy deposits: via timing and via optical channeling. NuLat, the first application of a ROL detector targeting physics results, will measure the reactor antineutrino flux at very short baselines via inverse beta decay (IBD). The ROL design of NuLat makes possible the reconstruction of positron energy with little contamination due to the annihilation gammas which smear the positron energy resolution in a traditional detector. IBD events are cleanly tagged via temporal and spatial coincidence of neutron capture in the vertex voxel or nearest neighbors. This talk will present work on IBD event reconstruction in NuLat and its likely impact on sterile neutrino detection via operation in higher background locations enabled by its superior rejection of backgrounds. This research has been funded in part by the National Science Foundation on Award Numbers 1001394 and 1001078.

  5. Measurement of patient safety: a systematic review of the reliability and validity of adverse event detection with record review

    PubMed Central

    Hanskamp-Sebregts, Mirelle; Zegers, Marieke; Vincent, Charles; van Gurp, Petra J; de Vet, Henrica C W; Wollersheim, Hub

    2016-01-01

    Objectives: Record review is the most used method to quantify patient safety. We systematically reviewed the reliability and validity of adverse event detection with record review. Design: A systematic review of the literature. Methods: We searched PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Library from their inception through February 2015. We included all studies that aimed to describe the reliability and/or validity of record review. Two reviewers conducted data extraction. We pooled kappa (κ) values and analysed the differences in subgroups according to number of reviewers, reviewer experience and training level, adjusted for the prevalence of adverse events. Results: In 25 studies, the psychometric data of the Global Trigger Tool (GTT) and the Harvard Medical Practice Study (HMPS) were reported and 24 studies were included for statistical pooling. The inter-rater reliability of the GTT and HMPS showed a pooled κ of 0.65 and 0.55, respectively. The inter-rater agreement was statistically significantly higher when the group of reviewers within a study consisted of a maximum of five reviewers. We found no studies reporting on the validity of the GTT and HMPS. Conclusions: The reliability of record review is moderate to substantial and improved when a small group of reviewers carried out record review. The validity of the record review method has never been evaluated, while clinical data registries, autopsy or direct observations of patient care are potential reference methods that can be used to test concurrent validity. PMID:27550650
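
    The inter-rater statistic pooled in the review above is Cohen's κ: observed agreement corrected for the agreement expected by chance. A minimal two-reviewer sketch, with illustrative ratings ("AE" = adverse event found, "no" = none) rather than data from the reviewed studies:

```python
from collections import Counter

# Hedged sketch: Cohen's kappa for two reviewers over the same records.
# The ratings are illustrative only.
def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

k = cohens_kappa(["AE", "AE", "no", "no"], ["AE", "no", "no", "no"])
```

    Values around 0.55-0.65, as pooled for the HMPS and GTT, sit in the conventional "moderate to substantial" agreement range.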

  6. Method and device for detecting impact events on a security barrier which includes a hollow rebar allowing insertion and removal of an optical fiber

    DOEpatents

    Pies, Ross E.

    2016-03-29

    A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier based on the detection of disturbances within the optical fiber.

  7. Microlensing Events by Proxima Centauri in 2014 and 2016: Opportunities for Mass Determination and Possible Planet Detection

    NASA Astrophysics Data System (ADS)

    Sahu, Kailash C.; Bond, Howard E.; Anderson, Jay; Dominik, Martin

    2014-02-01

    We have found that Proxima Centauri, the star closest to our Sun, will pass close to a pair of faint background stars in the next few years. Using Hubble Space Telescope (HST) images obtained in 2012 October, we determine that the passage close to a mag 20 star will occur in 2014 October (impact parameter 1.6″), and to a mag 19.5 star in 2016 February (impact parameter 0.5″). As Proxima passes in front of these stars, the relativistic deflection of light will cause shifts in the positions of the background stars of ~0.5 and 1.5 mas, respectively, readily detectable by HST imaging, and possibly by Gaia and ground-based facilities such as the Very Large Telescope. Measurement of these astrometric shifts offers a unique and direct method to measure the mass of Proxima. Moreover, if Proxima has a planetary system, the planets may be detectable through their additional microlensing signals, although the probability of such detections is small. With astrometric accuracies of 0.03 mas (achievable with HST spatial scanning), centroid shifts caused by Jovian planets are detectable at separations of up to 2.0″ (corresponding to 2.6 AU at the distance of Proxima), and centroid shifts by Earth-mass planets are detectable within a small band of 8 mas (corresponding to 0.01 AU) around the source trajectories. Jovian planets within a band of about 28 mas (corresponding to 0.036 AU) around the source trajectories would produce a brightening of the source by >0.01 mag and could hence be detectable. Estimated timescales of the astrometric and photometric microlensing events due to a planet range from a few hours to a few days, and both methods would provide direct measurements of the planetary mass. Based in part on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.
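
    The predicted shifts follow the far-field astrometric microlensing relation: for impact parameters much larger than the Einstein radius θ_E, the centroid shift is approximately θ_E² / Δθ, so the closer passage produces the larger shift. The Einstein radius used below is an illustrative round number consistent with the quoted shifts, not a value taken from the paper.

```python
# Hedged sketch of the far-field astrometric microlensing approximation,
# shift ~ theta_E**2 / separation (valid for separation >> theta_E).
# theta_E here is an illustrative value, not the paper's.
def astrometric_shift_mas(theta_e_mas, separation_mas):
    """Approximate centroid shift (mas) for separation >> theta_E."""
    return theta_e_mas ** 2 / separation_mas

shift_2014 = astrometric_shift_mas(28.0, 1600.0)  # ~1.6 arcsec passage
shift_2016 = astrometric_shift_mas(28.0, 500.0)   # ~0.5 arcsec passage
```

    The inverse dependence on separation is why the 2016 event, with the smaller impact parameter, is expected to yield roughly three times the shift of the 2014 event.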

  8. Cladograms with Path to Event (ClaPTE): A novel algorithm to detect associations between genotypes or phenotypes using phylogenies

    PubMed Central

    Handelman, Samuel K; Aaronson, Jacob M.; Seweryn, Michal; Voronkin, Igor; Kwiek, Jesse J.; Sadee, Wolfgang; Verducci, Joseph S.; Janies, Daniel A.

    2015-01-01

    Background: Associations between genotype and phenotype provide insight into the evolution of pathogenesis, drug resistance, and the spread of pathogens between hosts. However, common ancestry can lead to apparent associations between biologically unrelated features. The novel method Cladograms with Path to Event (ClaPTE) detects associations between character-pairs (either a pair of mutations or a mutation paired with a phenotype) while adjusting for common ancestry, using phylogenetic trees. Methods: ClaPTE tests for character-pairs changing close together on the phylogenetic tree, consistent with an associated character-pair. ClaPTE is compared to three existing methods (independent contrasts, mixed model, and likelihood ratio) to detect character-pair associations adjusted for common ancestry. Comparisons utilize simulations on gene trees for HIV Env, HIV promoter, and bacterial DnaJ and GuaB; and case studies for Oseltamivir resistance in Influenza, and for DnaJ and GuaB. Simulated data include both true-positive/associated character-pairs and true-negative/not-associated character-pairs, used to assess type I (frequency of p-values in true-negatives) and type II (sensitivity to true-positives) error control. Results and conclusions: ClaPTE has competitive sensitivity and better type I error control than existing methods. In the Influenza/Oseltamivir case study, ClaPTE reports no new permissive mutations but detects associations between adjacent (in primary sequence) amino acid positions which other methods miss. In the DnaJ and GuaB case study, ClaPTE reports more frequent associations between positions from the same protein family than between positions from different families, in contrast to other methods. In both case studies, the results from ClaPTE are biologically plausible. PMID:25577610

  9. The Detection of a Type IIn Supernova in Optical Follow-up Observations of IceCube Neutrino Events

    NASA Astrophysics Data System (ADS)

    Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Brown, A. M.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Christy, B.; Clark, K.; Classen, L.; Coenders, S.; Cowen, D. F.; Cruz Silva, A. H.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; Dumm, J. P.; Dunkman, M.; Eagan, R.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fadiran, O.; Fahey, S.; Fazely, A. R.; Fedynitch, A.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Fischer-Wasels, T.; Flis, S.; Fuchs, T.; Glagla, M.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Gretskov, P.; Groh, J. C.; Gross, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hellwig, D.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfe, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jero, K.; Jurkovic, M.; Kaminsky, B.; Kappes, A.; Karg, T.; Karle, A.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Kläs, J.; Klein, S. 
R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Koob, A.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Middlemas, E.; Miller, J.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke, A.; Olivas, A.; Omairat, A.; O’Murchadha, A.; Palczewski, T.; Pandya, H.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Pütz, J.; Quinnan, M.; Rädel, L.; Rameez, M.; Rawlins, K.; Redl, P.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Saba, S. M.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Schatto, K.; Scheriau, F.; Schimp, M.; Schmidt, T.; Schmitz, M.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schukraft, A.; Schulte, L.; Seckel, D.; Seunarine, S.; Shanidze, R.; Smith, M. W. E.; Soldin, D.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stanisha, N. A.; Stasik, A.; Stezelberger, T.; Stokstad, R. G.; Stössl, A.; Strahler, E. A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. 
N.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; van Eijndhoven, N.; Vandenbroucke, J.; van Santen, J.; Vanheule, S.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Whitehorn, N.; Wichary, C.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zarzhitsky, P.; Zoll, M.; IceCube Collaboration; Ofek, Eran O.; Kasliwal, Mansi M.; Nugent, Peter E.; Arcavi, Iair; Bloom, Joshua S.; Kulkarni, Shrinivas R.; Perley, Daniel A.; Barlow, Tom; Horesh, Assaf; Gal-Yam, Avishay; Howell, D. A.; Dilday, Ben; for the PTF Collaboration; Evans, Phil A.; Kennea, Jamie A.; for the Swift Collaboration; Burgett, W. S.; Chambers, K. C.; Kaiser, N.; Waters, C.; Flewelling, H.; Tonry, J. L.; Rest, A.; Smartt, S. J.; Pan-STARRS1 Science Consortium, for the

    2015-09-01

    The IceCube neutrino observatory pursues a follow-up program selecting interesting neutrino events in real-time and issuing alerts for electromagnetic follow-up observations. In 2012 March, the most significant neutrino alert during the first three years of operation was issued by IceCube. In the follow-up observations performed by the Palomar Transient Factory (PTF), a Type IIn supernova (SN IIn) PTF12csy was found 0.2° away from the neutrino alert direction, with an error radius of 0.54°. It has a redshift of z = 0.0684, corresponding to a luminosity distance of about 300 Mpc, and the Pan-STARRS1 survey shows that its explosion time was at least 158 days (in host galaxy rest frame) before the neutrino alert, so that a causal connection is unlikely. The a posteriori significance of the chance detection of both the neutrinos and the SN at any epoch is 2.2σ within IceCube's 2011/12 data acquisition season. Also, a complementary neutrino analysis reveals no long-term signal over the course of one year. Therefore, we consider the SN detection coincidental and the neutrinos uncorrelated to the SN. However, the SN is unusual and interesting by itself: it is luminous and energetic, bearing strong resemblance to the SN IIn 2010jl, and shows signs of interaction of the SN ejecta with a dense circumstellar medium. High-energy neutrino emission is expected in models of diffusive shock acceleration, but at a low, non-detectable level for this specific SN. In this paper, we describe the SN PTF12csy and present both the neutrino and electromagnetic data, as well as their analysis.

  10. Automatic detection of epileptiform events in EEG by a three-stage procedure based on artificial neural networks.

    PubMed

    Acir, Nurettin; Oztura, Ibrahim; Kuntalp, Mehmet; Baklan, Bariş; Güzeliş, Cüneyt

    2005-01-01

    This paper introduces a three-stage procedure based on artificial neural networks for the automatic detection of epileptiform events (EVs) in a multichannel electroencephalogram (EEG) signal. In the first stage, two discrete perceptrons fed by six features are used to classify EEG peaks into three subgroups: 1) definite epileptiform transients (ETs); 2) definite non-ETs; and 3) possible ETs and possible non-ETs. The pre-classification done in the first stage not only reduces the computation time but also increases the overall detection performance of the procedure. In the second stage, a nonlinear artificial neural network functioning as a postclassifier aims to separate the peaks falling into the third group; its input is a vector of 41 consecutive sample values obtained from each peak. Different networks, i.e., a backpropagation multilayer perceptron and two radial basis function networks trained by a hybrid method and a support vector method, respectively, are constructed as the postclassifier and then compared in terms of their classification performances. In the third stage, multichannel information is integrated into the system to support the electroencephalographers (EEGers) in identifying an EV. After the integration of multichannel information, the overall performance of the system is determined with respect to EVs. Visual evaluation, by two EEGers, of 19-channel EEG records of 10 epileptic patients showed that the best performance is obtained with a radial basis support vector machine, providing an average sensitivity of 89.1%, an average selectivity of 85.9%, and a false detection rate (per hour) of 7.5. PMID:15651562
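
    The first-stage routing described above can be sketched as follows. This is a hedged, illustrative example, not the authors' implementation: the perceptron weights, biases, and feature values are invented, and only the agree/disagree routing logic mirrors the description in the abstract.

```python
# Sketch of the first-stage pre-classification: two discrete perceptrons
# vote on six peak features; peaks on which they disagree are deferred to
# the second-stage classifier. Weights and features are illustrative.

def perceptron(weights, bias, features):
    """Discrete perceptron: returns 1 (epileptiform) or 0 (not)."""
    s = sum(w * f for w, f in zip(weights, features)) + bias
    return 1 if s > 0 else 0

def pre_classify(peak_features, p1, p2):
    """Route a peak to 'definite ET', 'definite non-ET' or 'possible'."""
    v1 = perceptron(p1[0], p1[1], peak_features)
    v2 = perceptron(p2[0], p2[1], peak_features)
    if v1 == 1 and v2 == 1:
        return "definite ET"      # both perceptrons agree: epileptiform
    if v1 == 0 and v2 == 0:
        return "definite non-ET"  # both agree: background activity
    return "possible"             # disagreement: defer to stage two

# Invented weights over six hypothetical features (amplitude, duration, ...)
p1 = ([0.9, -0.4, 1.2, 0.3, -0.2, 0.5], -1.0)
p2 = ([1.1, -0.2, 0.8, 0.1, -0.4, 0.7], -1.5)

print(pre_classify([2.0, 0.5, 1.5, 0.8, 0.3, 1.0], p1, p2))
```

    Only the peaks routed to "possible" would then reach the computationally heavier nonlinear postclassifier, which is what saves time overall.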

  11. Detection of Rain-on-Snow (ROS) Events Using the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) and Weather Station Observations

    NASA Astrophysics Data System (ADS)

    Ryan, E. M.; Brucker, L.; Forman, B. A.

    2015-12-01

    During the winter months, the occurrence of rain-on-snow (ROS) events can impact snow stratigraphy via generation of large-scale ice crusts, e.g., on or within the snowpack. The formation of such layers significantly alters the electromagnetic response of the snowpack, which can be observed with space-based microwave radiometers. In addition, ROS layers can hinder the ability of wildlife to burrow through the snow to reach vegetation, limiting their foraging capability. A prime example occurred on 23 October 2003 on Banks Island, Canada, where an ROS event is believed to have caused the deaths of over 20,000 musk oxen. Through the use of passive microwave remote sensing, ROS events can be detected by utilizing observed brightness temperatures (Tb) from AMSR-E. Tb observed at different microwave frequencies and polarizations depends on snow properties. A wet snowpack formed from an ROS event yields a larger Tb than a typical dry snowpack would. This phenomenon makes observed Tb useful for detecting ROS events. With the use of data retrieved from AMSR-E, in conjunction with observations from ground-based weather station networks, a database of estimated ROS events over the past twelve years was generated. Using this database, changes in measured Tb following the ROS events were also observed. This study adds to the growing knowledge of ROS events and has the potential to help inform passive microwave snow water equivalent (SWE) retrievals or snow cover properties in polar regions.
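
    The Tb-based reasoning above can be illustrated with a minimal sketch: a wet snowpack after an ROS event raises the observed brightness temperature, so a sudden jump of Tb above a dry-snow baseline flags a candidate event. The 10 K threshold, the baseline window, and the synthetic Tb values are assumptions for illustration, not values from the study.

```python
def detect_ros_candidates(tb_series, baseline_window=5, jump_threshold=10.0):
    """Return indices where Tb jumps above the running dry-snow baseline."""
    candidates = []
    for i in range(baseline_window, len(tb_series)):
        # Running mean of the preceding window approximates dry-snow Tb
        baseline = sum(tb_series[i - baseline_window:i]) / baseline_window
        if tb_series[i] - baseline > jump_threshold:
            candidates.append(i)
    return candidates

# Synthetic daily Tb (K): dry snow ~220 K, a wet-snow spike at day 7
tb = [220, 221, 219, 220, 222, 221, 220, 245, 224, 221]
print(detect_ros_candidates(tb))
```

    In practice such a flag would be cross-checked against weather-station precipitation and air-temperature records, as the study combines both data sources.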

  12. Real-time automated 3D sensing, detection, and recognition of dynamic biological micro-organic events

    NASA Astrophysics Data System (ADS)

    Javidi, Bahram; Yeom, Seokwon; Moon, Inkyu; Daneshpanah, Mehdi

    2006-05-01

    In this paper, we present an overview of three-dimensional (3D) optical imaging techniques for real-time automated sensing, visualization, and recognition of dynamic biological microorganisms. Real-time sensing and 3D reconstruction of dynamic biological microscopic objects can be performed by single-exposure on-line (SEOL) digital holographic microscopy. A coherent 3D microscope-based interferometer is constructed to record digital holograms of dynamic microbiological events. Complex amplitude 3D images of the biological microorganisms are computationally reconstructed at different depths by digital signal processing. Bayesian segmentation algorithms are applied to identify regions of interest for further processing. A number of pattern recognition approaches are addressed to identify and recognize the microorganisms. One approach uses the 3D morphology of the microorganisms, analyzing 3D geometrical shapes composed of magnitude and phase. Segmentation, feature extraction, graph matching, feature selection, and training and decision rules are used to recognize the biological microorganisms. In a different approach, a 3D technique is used that is tolerant of the varying shapes of the non-rigid biological microorganisms. After segmentation, a number of sampling patches are arbitrarily extracted from the complex amplitudes of the reconstructed 3D biological microorganism. These patches are processed using a number of cost functions and statistical inference theory for the equality of means and equality of variances between the sampling segments. Also, we discuss the possibility of employing computational integral imaging for 3D sensing, visualization, and recognition of biological microorganisms illuminated under incoherent light. Experimental results with several biological microorganisms are presented to illustrate detection, segmentation, and identification of microbiological events.

  13. Efficient In Planta Detection and Dissection of De Novo Mutation Events in the Arabidopsis thaliana Disease Resistance Gene UNI.

    PubMed

    Ogawa, Tomohiko; Mori, Akiko; Igari, Kadunari; Morita, Miyo Terao; Tasaka, Masao; Uchida, Naoyuki

    2016-06-01

    Plants possess disease resistance (R) proteins encoded by R genes, and each R protein recognizes a specific pathogen factor(s) for immunity. Interestingly, a remarkably high degree of polymorphism in R genes, a trace of past mutation events during evolution, suggests the rapid diversification of R genes. However, little is known about the molecular aspects that facilitate the rapid change of R genes, because of the lack of tools that enable us to monitor de novo R gene mutations efficiently on an experimentally feasible time scale, especially in living plants. Here we introduce a model assay system that enables efficient in planta detection of de novo mutation events in the Arabidopsis thaliana R gene UNI in one generation. The uni-1D mutant harbors a gain-of-function allele of the UNI gene. uni-1D heterozygous individuals originally exhibit dwarfism with abnormally short stems. However, interestingly, morphologically normal stems sometimes emerge spontaneously from the uni-1D plants, and the morphologically reverted tissues carry additional de novo mutations in the UNI gene. Strikingly, under an extreme condition, almost half of the examined population shows the reversion phenomenon. By taking advantage of this phenomenon, we demonstrate that the reversion frequency is remarkably sensitive to a variety of fluctuations in DNA stability, highlighting the mutable tendency of the UNI gene. We also reveal that activities of the salicylic acid pathway and DNA damage sensor pathway are involved in the reversion phenomenon. Thus, we provide an experimentally feasible model tool to explore factors and conditions that significantly affect the R gene mutation phenomenon. PMID:27016096

  14. Enhanced health event detection and influenza surveillance using a joint Veterans Affairs and Department of Defense biosurveillance application

    PubMed Central

    2011-01-01

    Background The establishment of robust biosurveillance capabilities is an important component of the U.S. strategy for identifying disease outbreaks, environmental exposures and bioterrorism events. Currently, U.S. Departments of Defense (DoD) and Veterans Affairs (VA) perform biosurveillance independently. This article describes a joint VA/DoD biosurveillance project at North Chicago-VA Medical Center (NC-VAMC). The Naval Health Clinics-Great Lakes facility physically merged with NC-VAMC beginning in 2006 with the full merger completed in October 2010 at which time all DoD care and medical personnel had relocated to the expanded and remodeled NC-VAMC campus and the combined facility was renamed the Lovell Federal Health Care Center (FHCC). The goal of this study was to evaluate disease surveillance using a biosurveillance application which combined data from both populations. Methods A retrospective analysis of NC-VAMC/Lovell FHCC and other Chicago-area VAMC data was performed using the ESSENCE biosurveillance system, including one infectious disease outbreak (Salmonella/Taste of Chicago-July 2007) and one weather event (Heat Wave-July 2006). Influenza-like-illness (ILI) data from these same facilities was compared with CDC/Illinois Sentinel Provider and Cook County ESSENCE data for 2007-2008. Results Following consolidation of VA and DoD facilities in North Chicago, median number of visits more than doubled, median patient age dropped and proportion of females rose significantly in comparison with the pre-merger NC-VAMC facility. A high-level gastrointestinal alert was detected in July 2007, but only low-level alerts at other Chicago-area VAMCs. Heat-injury alerts were triggered for the merged facility in June 2006, but not at the other facilities. There was also limited evidence in these events that surveillance of the combined population provided utility above and beyond the VA-only and DoD-only components. 
Recorded ILI activity for NC-VAMC/Lovell FHCC was more

  15. Gaseous time projection chambers for rare event detection: results from the T-REX project. I. Double beta decay

    NASA Astrophysics Data System (ADS)

    Irastorza, I. G.; Aznar, F.; Castel, J.; Cebrián, S.; Dafni, T.; Galán, J.; Garcia, J. A.; Garza, J. G.; Gómez, H.; Herrera, D. C.; Iguaz, F. J.; Luzon, G.; Mirallas, H.; Ruiz, E.; Seguí, L.; Tomás, A.

    2016-01-01

    As part of the T-REX project, a number of R&D and prototyping activities have been carried out during the last years to explore the applicability of gaseous Time Projection Chambers (TPCs) with Micromesh Gas Structures (Micromegas) in rare event searches like double beta decay, axion research, and low-mass WIMP searches. In both this and its companion paper, we compile the main results of the project and give an outlook of application prospects for this detection technique. While the companion paper focuses on axions and WIMPs, in this paper we focus on the results regarding the measurement of the double beta decay (DBD) of 136Xe in a high pressure Xe (HPXe) TPC. Micromegas of the microbulk type have been extensively studied in high pressure Xe and Xe mixtures. Particularly relevant are the results obtained in Xe + trimethylamine (TMA) mixtures, showing very promising results in terms of gain, stability of operation, and energy resolution at high pressures up to 10 bar. The addition of TMA at levels of ~1% reduces electron diffusion by up to a factor of 10 with respect to pure Xe, improving the quality of the topological pattern, with a positive impact on the discrimination capability. Operation with a medium-size prototype of 30 cm diameter and 38 cm of drift (holding about 1 kg of Xe at 10 bar in the fiducial volume, enough to contain high-energy electron tracks in the detector volume) has allowed the detection concept to be tested in realistic experimental conditions. Microbulk Micromegas are able to image the DBD ionization signature with high quality while, at the same time, measuring its energy deposition with a resolution of at least ~3% FWHM at Qββ. This value was experimentally demonstrated for high-energy extended tracks at 10 bar, and can probably be improved to the ~1% FWHM level as extrapolated from low-energy events. In addition, first results on the topological signature information (one straggling track ending in two blobs) show promising
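
    The extrapolation from low-energy events to Qββ mentioned above typically assumes a statistics-dominated resolution, in which the fractional FWHM scales as 1/sqrt(E). The following is a minimal sketch under that assumption only; the calibration energy and measured resolution used below are illustrative, not values reported in the paper.

```python
import math

def extrapolate_fwhm(fwhm_at_e0, e0_kev, e_kev):
    """Scale a fractional FWHM from energy e0 to e, assuming FWHM ~ 1/sqrt(E)."""
    return fwhm_at_e0 * math.sqrt(e0_kev / e_kev)

# e.g. a hypothetical ~9% FWHM measured at a 30 keV calibration line,
# extrapolated to the Qbb of 136Xe (~2458 keV): comes out near 1% FWHM.
print(extrapolate_fwhm(0.09, 30.0, 2458.0))
```

    This scaling is why low-energy calibration data can support the ~1% FWHM projection quoted for the full DBD energy.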

  16. Analysis of Inter-Moss Loops in the Solar Region with IRIS and SDO AIA: Automatic Event Detection and Characterization

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy; De Pontieu, Bart; Alexander, Caroline

    2016-01-01

    The Interface Region Imaging Spectrograph (IRIS), launched in the summer of 2013, is designed specifically to observe and investigate the transition region and adjacent layers of the solar atmosphere, obtaining images in high spatial, temporal, and spectral resolution. Our particular work is focused on the evolution of inter-moss loops, which have been detected in the lower corona by the Atmospheric Imaging Assembly (AIA) and the High-Resolution Coronal Imager (Hi- C), but are known to have foot points below the transition region. With the high-resolution capabilities of IRIS and its Si IV pass band, which measures activity in the upper chromosphere, we can study these magnetic loops in detail and compare their characteristic length and time scales to those obtained from several AIA image sets, particularly the 171, 193, and 211 pass bands. By comparing the results between these four data sets, one can potentially establish a measure of the ionization equilibrium for the location in question. To explore this idea, we found a large, sit-and-stare observation within the IRIS database that fit our specifications. This data set contained a number of well-defined inter-moss loops (by visual inspection) with a cadence less than or equal to that of AIA (approximately 12 seconds). This particular data set was recorded on October 23, 2013 at 07:09:30, lasting for 3219 seconds with a field of view of 120.6 by 128.1 arcseconds, centered on -53.9 by 59.1 arcseconds from disk center. For ease of comparison, the AIA data has been interpolated to match the IRIS cadence and resolution. In the main portion of the poster, we demonstrate the detection of events, the information collected, and the immediate results to the right, showing the progress of an event with green as the start, blue as the peak, and red as the end. Below here, we demonstrate how pixels are combined to form groups. The 3D results are shown to the right

  17. Analysis of Inter-Moss Loops in the Solar Region with IRIS and SDO AIA: Automatic Event Detection and Characterization

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy; De Pontieu, Bart

    2014-01-01

    The Interface Region Imaging Spectrograph (IRIS), launched in the summer of 2013, is designed specifically to observe and investigate the transition region and adjacent layers of the solar atmosphere, obtaining images in high spatial, temporal, and spectral resolution. Our particular work is focused on the evolution of inter-moss loops, which have been detected in the lower corona by the Atmospheric Imaging Assembly (AIA) and the High-Resolution Coronal Imager (Hi- C), but are known to have foot points below the transition region. With the high-resolution capabilities of IRIS and its Si IV pass band, which measures activity in the upper chromosphere, we can study these magnetic loops in detail and compare their characteristic length and time scales to those obtained from several AIA image sets, particularly the 171, 193, and 211 pass bands. By comparing the results between these four data sets, one can potentially establish a measure of the ionization equilibrium for the location in question. To explore this idea, we found a large, sit-and-stare observation within the IRIS database that fit our specifications. This data set contained a number of well-defined inter-moss loops (by visual inspection) with a cadence less than or equal to that of AIA (approximately 12 seconds). This particular data set was recorded on October 23, 2013 at 07:09:30, lasting for 3219 seconds with a field of view of 120.6 by 128.1 arcseconds, centered on -53.9 by 59.1 arcseconds from disk center. For ease of comparison, the AIA data has been interpolated to match the IRIS cadence and resolution. In the main portion of the poster, we demonstrate the detection of events, the information collected, and the immediate results to the right, showing the progress of an event with green as the start, blue as the peak, and red as the end. Below here, we demonstrate how pixels are combined to form groups. The 3D results are shown to the right.

  18. An Unsorted Spike-Based Pattern Recognition Method for Real-Time Continuous Sensory Event Detection from Dorsal Root Ganglion Recording.

    PubMed

    Han, Sungmin; Chu, Jun-Uk; Kim, Hyungmin; Choi, Kuiwon; Park, Jong Woong; Youn, Inchan

    2016-06-01

    In functional neuromuscular stimulation systems, sensory information-based closed-loop control can be useful for restoring lost function in patients with hemiplegia or quadriplegia. The goal of this study was to detect sensory events from tactile afferent signals continuously in real time using a novel unsorted spike-based pattern recognition method. The tactile afferent signals were recorded with a 16-channel microelectrode in the dorsal root ganglion, and unsorted spike-based feature vectors were extracted as a novel combination of the time and time-frequency domain features. Principal component analysis was used to reduce the dimensionality of the feature vectors, and a multilayer perceptron classifier was used to detect sensory events. The proposed method showed good performance for classification accuracy, and the processing time delay of sensory event detection was less than 200 ms. These results indicated that the proposed method could be applicable for sensory feedback in closed-loop control systems. PMID:26672029
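
    The dimensionality-reduction step described above can be sketched as follows: feature vectors (here random stand-ins for the combined time and time-frequency features of the unsorted spikes) are projected onto their leading principal components before being passed to a classifier such as the multilayer perceptron. The feature dimensionality and component count below are assumptions for illustration.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)              # center the feature matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T      # scores in the reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 32))           # 120 windows x 32 features (invented)
Z = pca_reduce(X, n_components=8)
print(Z.shape)
```

    The reduced vectors Z (one row per analysis window) would then be the classifier's input, cutting training cost while retaining most of the feature variance.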

  19. Deficits in cue detection underlie event-based prospective memory impairment in major depression: an eye tracking study.

    PubMed

    Chen, Siyi; Zhou, Renlai; Cui, Hong; Chen, Xinyin

    2013-10-30

    This study examined the cue detection in the non-focal event-based prospective memory (PM) of individuals with and without a major depressive disorder using behavioural and eye tracking assessments. The participants were instructed to search on each trial for a different target stimulus that could be present or absent and to make prospective responses to the cue object. PM tasks included cue only and target plus cue, whereas ongoing tasks included target only and distracter only. The results showed that a) participants with depression performed more poorly than those without depression in PM; b) participants with depression showed more fixations and longer total and average fixation durations in both ongoing and PM conditions; c) participants with depression had lower scores on accuracy in target-plus-cue trials than in cue-only trials and had a higher gaze rate of targets on hits and misses in target-plus-cue trials than did those without depression. The results indicate that the state of depression may impair top-down cognitive control function, which in turn results in particular deficits in the engagement of monitoring for PM cues. PMID:23477903

  20. 1.3 mm WAVELENGTH VLBI OF SAGITTARIUS A*: DETECTION OF TIME-VARIABLE EMISSION ON EVENT HORIZON SCALES

    SciTech Connect

    Fish, Vincent L.; Doeleman, Sheperd S.; Beaudoin, Christopher; Bolin, David E.; Rogers, Alan E. E.; Blundell, Ray; Gurwell, Mark A.; Moran, James M.; Primiani, Rurik; Bower, Geoffrey C.; Plambeck, Richard; Chamberlin, Richard; Freund, Robert; Friberg, Per; Honma, Mareki; Oyama, Tomoaki; Inoue, Makoto; Krichbaum, Thomas P.; Lamb, James; Marrone, Daniel P.

    2011-02-01

    Sagittarius A*, the ~4 × 10^6 M_Sun black hole candidate at the Galactic center, can be studied on Schwarzschild radius scales with (sub)millimeter wavelength very long baseline interferometry (VLBI). We report on 1.3 mm wavelength observations of Sgr A* using a VLBI array consisting of the JCMT on Mauna Kea, the Arizona Radio Observatory's Submillimeter Telescope on Mt. Graham in Arizona, and two telescopes of the CARMA array at Cedar Flat in California. Both Sgr A* and the quasar calibrator 1924-292 were observed over three consecutive nights, and both sources were clearly detected on all baselines. For the first time, we are able to extract 1.3 mm VLBI interferometer phase information on Sgr A* through measurement of closure phase on the triangle of baselines. On the third night of observing, the correlated flux density of Sgr A* on all VLBI baselines increased relative to the first two nights, providing strong evidence for time-variable change on scales of a few Schwarzschild radii. These results suggest that future VLBI observations with greater sensitivity and additional baselines will play a valuable role in determining the structure of emission near the event horizon of Sgr A*.
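
    The closure-phase measurement mentioned above rests on a simple identity: summing the visibility phases around a triangle of baselines cancels station-based phase errors, leaving a robust observable. A toy numerical sketch (the phases and error terms below are synthetic, purely for illustration):

```python
import cmath

def closure_phase(v12, v23, v31):
    """Closure phase (radians) from three complex visibilities on a triangle."""
    return cmath.phase(v12 * v23 * v31)

# True baseline phases for a toy source
phi12, phi23, phi31 = 0.4, -0.9, 0.7
# Station-based errors: baseline (i, j) picks up e_i - e_j,
# which cancels when summed around the closed triangle.
e1, e2, e3 = 0.3, -1.1, 0.5
v12 = cmath.exp(1j * (phi12 + e1 - e2))
v23 = cmath.exp(1j * (phi23 + e2 - e3))
v31 = cmath.exp(1j * (phi31 + e3 - e1))
print(closure_phase(v12, v23, v31))  # recovers phi12 + phi23 + phi31
```

    This cancellation is what lets a sparse VLBI array extract phase information on Sgr A* despite large, uncalibrated atmospheric and instrumental phase offsets at each station.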

  1. Detection of the brain response during a cognitive task using perfusion-based event-related functional MRI.

    PubMed

    Yee, S H; Liu, H L; Hou, J; Pu, Y; Fox, P T; Gao, J H

    2000-08-01

    Event-related (ER) fMRI has attracted great interest due to its ability to depict the dynamic features of human brain function during various cognitive tasks. Thus far, all cognitive ER-fMRI studies have been based on blood oxygenation level-dependent (BOLD) contrast techniques. Compared with BOLD-based fMRI techniques, perfusion-based fMRI is able to localize the region of neuronal activity more accurately. This report demonstrates, for the first time, the detection of the brain response to a cognitive task using high temporal resolution perfusion-based ER-fMRI. An English verb generation task was used in this study. Results show that perfusion-based ER-fMRI accurately depicts the activation in Broca's area. Average changes in regional relative cerebral blood flow reached a maximum value of 30.7% at approximately 6.5 s after the start of stimulation and returned to 10% of the maximum value at approximately 12.8 s. Our results show that perfusion-based ER-fMRI is a useful tool for cognitive neuroscience studies, providing comparable temporal resolution and better localization of brain function than BOLD ER-fMRI. PMID:10943717

  2. Collaborative trial for the validation of event-specific PCR detection methods of genetically modified papaya Huanong No.1.

    PubMed

    Wei, Jiaojun; Le, Huangying; Pan, Aihu; Xu, Junfeng; Li, Feiwu; Li, Xiang; Quan, Sheng; Guo, Jinchao; Yang, Litao

    2016-03-01

    To transfer the event-specific PCR detection methods for genetically modified papaya Huanong No.1 to other laboratories, we validated the previously developed PCR assays of Huanong No.1 according to International Organization for Standardization (ISO) guidelines. A total of 11 laboratories participated and returned their test results in this trial. In the qualitative PCR assay, high specificity and a limit of detection as low as 0.1% were confirmed. For the quantitative PCR assay, the limit of quantification was as low as 25 copies. The quantitative biases among ten blind samples were within the range between 0.21% and 10.04%. Furthermore, the measurement uncertainty of the quantitative PCR results was calculated to be within the range between 0.28% and 2.92% for these ten samples. All results demonstrated that the Huanong No.1 qualitative and quantitative PCR assays are credible and applicable for the identification and quantification of GM papaya Huanong No.1 in routine laboratory analysis. PMID:26471522

  3. Multiplex polymerase chain reaction-capillary gel electrophoresis: a promising tool for GMO screening--assay for simultaneous detection of five genetically modified cotton events and species.

    PubMed

    Nadal, Anna; Esteve, Teresa; Pla, Maria

    2009-01-01

    A multiplex polymerase chain reaction assay coupled to capillary gel electrophoresis for amplicon identification by size and color (multiplex PCR-CGE-SC) was developed for simultaneous detection of cotton species and 5 events of genetically modified (GM) cotton. Validated real-time PCR reactions targeting Bollgard, Bollgard II, Roundup Ready, 3006-210-23, and 281-24-236 junction sequences, and the cotton reference gene acp1, were adapted to detect more than half of the European Union-approved individual or stacked GM cotton events in one reaction. The assay was fully specific (<1.7% false classification rate), with limit of detection values of 0.1% for each event, which were also achieved with simulated mixtures at different relative percentages of targets. The assay was further combined with a second multiplex PCR-CGE-SC assay to allow simultaneous detection of 6 cotton and 5 maize targets (two endogenous genes and 9 GM events) in two multiplex PCRs and a single CGE, making the approach more economical. Besides allowing simultaneous detection of many targets with adequate specificity and sensitivity, the multiplex PCR-CGE-SC approach has high throughput and automation capabilities, while keeping a very simple protocol, e.g., amplification and labeling in one step. Thus, it is an easy and inexpensive tool for initial screening, to be complemented with quantitative assays if necessary. PMID:19610365
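
    The size-based part of the amplicon identification described above can be sketched as a simple lookup: each electropherogram peak is assigned to a target by matching its measured size against the expected amplicon length, within a tolerance. The amplicon lengths and tolerance below are invented for illustration; they are not the assay's actual values.

```python
EXPECTED_SIZES = {            # target -> hypothetical amplicon length (bp)
    "acp1 (cotton reference)": 74,
    "Bollgard": 92,
    "Bollgard II": 108,
    "Roundup Ready": 121,
    "3006-210-23": 135,
    "281-24-236": 150,
}

def assign_peaks(peak_sizes, tolerance=2):
    """Map observed CGE peak sizes (bp) to targets within +/- tolerance bp."""
    calls = []
    for size in peak_sizes:
        for target, expected in EXPECTED_SIZES.items():
            if abs(size - expected) <= tolerance:
                calls.append(target)
                break                 # first matching target wins
    return calls

print(assign_peaks([73.6, 120.8]))
```

    In the real assay, color (the fluorescent label channel) provides a second identification axis on top of size, which is what allows so many targets in a single capillary run.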

  4. Automated detection of feeding strikes by larval fish using continuous high-speed digital video: a novel method to extract quantitative data from fast, sparse kinematic events.

    PubMed

    Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi

    2016-06-01

    Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors. PMID:26994179
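
    Scoring automated detections against manually labelled events, as in the comparison above, can be sketched with a tolerance-window match: a detection counts as a true positive if it falls close enough to a labelled strike. The frame indices and the tolerance below are invented for the example, not data from the study.

```python
def score_detections(detected, labelled, tolerance=5):
    """Return (true_positives, false_positives, missed) event counts."""
    unmatched = list(labelled)
    tp = 0
    for d in detected:
        # Greedily match each detection to the first unclaimed label nearby
        match = next((lab for lab in unmatched if abs(d - lab) <= tolerance), None)
        if match is not None:
            unmatched.remove(match)
            tp += 1
    return tp, len(detected) - tp, len(unmatched)

manual = [120, 480, 950]        # frame indices of labelled strikes
auto = [118, 300, 952, 1400]    # automated detections, two spurious
print(score_detections(auto, manual))
```

    With such a scorer, the expert's job reduces to reviewing the flagged detections rather than the whole video, which is the workload reduction the abstract reports.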

  5. PREFACE: 5th Symposium on Large TPCs for Low Energy Rare Event Detection and Workshop on Neutrinos from Supernovae

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Scholberg, Kate; Colas, Paul; Giomataris, Ioannis

    2011-08-01

    The Fifth International Symposium on large TPCs for low-energy rare-event detection was held at the auditorium of the Astroparticle and Cosmology (APC) Laboratory in Paris, on 14-17 December 2010. Like all previous meetings in the series, held in Paris in 2008, 2006, 2004, and 2002, it brought together a significant community of physicists involved in rare event searches and/or the development of time projection chambers (TPCs). As a novelty this year, the meeting was extended with two half-day sessions on supernova physics. These proceedings also include the contributions corresponding to the supernova sessions. The purpose of the meeting was to present and discuss the status of current experiments or projects involving the use of TPCs to search for rare events, such as low-energy neutrinos, double beta decay, dark matter, or axion experiments, as well as to discuss new results and ideas in the framework of the latest developments in Micro Pattern Gaseous Detectors (MPGD), and how these are being - or could be - applied to these searches. As in previous meetings in this series, the format included an informal program with some recent highlighted results, rather than exhaustive reviews, with time for discussion and interaction. The symposium, the fifth of the series, is becoming consolidated as a regular meeting place for the synergistic interplay between the fields of rare events and TPC development. The meeting started with a moving tribute by Ioannis Giomataris to the memory of George Charpak, who recently passed away. We then moved on to the usual topics, such as the status of some low-energy neutrino physics and double beta decay experiments, dark matter experiments with directional detectors, axion searches, and development results. A relevant subject this time was electroluminescence in Xe TPCs, covered by several speakers. Every time, the conference program is enriched with original, slightly off-topic contributions that trigger curiosity and stimulate further thought. As

  6. Final report for LDRD project 11-0029 : high-interest event detection in large-scale multi-modal data sets : proof of concept.

    SciTech Connect

    Rohrer, Brandon Robinson

    2011-09-01

    Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies: events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. No other solution to this problem currently exists, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project, the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets from both audio and imagery data.

  7. Detection of Healthcare-Related Extended-Spectrum Beta-Lactamase-Producing Escherichia coli Transmission Events Using Combined Genetic and Phenotypic Epidemiology

    PubMed Central

    Boers, Stefan A.; Jansen, Ruud; Hays, John P.; Goessens, Wil H. F.; Vos, Margreet C.

    2016-01-01

    Background Since the year 2000 there has been a sharp increase in the prevalence of healthcare-related infections caused by extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli. However, the high community prevalence of ESBL-producing E. coli isolates means that many E. coli typing techniques may not be suitable for detecting E. coli transmission events. Therefore, we investigated if High-throughput MultiLocus Sequence Typing (HiMLST) and/or Raman spectroscopy were suitable techniques for detecting recent E. coli transmission events. Methods This study was conducted from January until December 2010 at Erasmus University Medical Center, Rotterdam, the Netherlands. Isolates were typed using HiMLST and Raman spectroscopy. A genetic cluster was defined as two or more patients carrying identical isolates. We used predefined definitions for epidemiological relatedness to assess healthcare-related transmission. Results We included 194 patients; strains of 112 patients were typed using HiMLST and strains of 194 patients were typed using Raman spectroscopy. Raman spectroscopy identified 16 clusters while HiMLST identified 10 clusters. However, no healthcare-related transmission events were detected. When combining data from both typing techniques, we identified eight clusters (n = 34 patients), as well as 78 patients with a non-cluster isolate. However, we could not detect any healthcare-related transmission in these 8 clusters. Conclusions Although clusters were genetically detected using HiMLST and Raman spectroscopy, no definite epidemiological relationships could be demonstrated, which makes healthcare-related transmission events highly unlikely. Our results suggest that typing of ESBL-producing E. coli using HiMLST and/or Raman spectroscopy is not helpful in detecting E. coli healthcare-related transmission events. PMID:27463231
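
    The cluster definition used above (two or more patients carrying identical isolates) amounts to grouping patients by typing profile and keeping groups of at least two. A minimal sketch with invented patient IDs and profile labels:

```python
from collections import defaultdict

def find_clusters(typing_results):
    """Group patient IDs by identical typing profile; keep groups of >= 2."""
    groups = defaultdict(list)
    for patient, profile in typing_results.items():
        groups[profile].append(patient)
    return [sorted(p) for p in groups.values() if len(p) >= 2]

# Hypothetical patients and typing profiles, for illustration only
typing = {"P01": "ST131", "P02": "ST38", "P03": "ST131", "P04": "ST10"}
print(find_clusters(typing))
```

    As the study emphasizes, a genetic cluster alone is not evidence of transmission; each cluster must still be checked against epidemiological criteria (shared ward, overlapping admission dates, and so on).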

  8. A novel “correlated ion and neutral time of flight” method: Event-by-event detection of neutral and charged fragments in collision induced dissociation of mass selected ions

    SciTech Connect

    Teyssier, C.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.

    2014-01-15

    A new tandem mass spectrometry (MS/MS) method based on time-of-flight measurements performed with an event-by-event detection technique is presented. This “correlated ion and neutral time of flight” method makes it possible to explore Collision Induced Dissociation (CID) fragmentation processes by directly identifying not only all ion and neutral fragments produced but also their arrival-time correlations within each single fragmentation event from a dissociating molecular ion. This constitutes a new step in the characterization of molecular ions. The method is illustrated here for a prototypical case involving CID of protonated water clusters H{sup +}(H{sub 2}O){sub n=1–5} upon collisions with argon atoms.
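    The core idea of correlating fragment arrival times within single events can be sketched as follows. This is a minimal illustrative example only; the event structure, arrival times, and function names are hypothetical and not taken from the instrument described above.

    ```python
    # Illustrative sketch: event-by-event correlation of fragment arrival
    # times. Each event holds the time-of-flight values (in microseconds,
    # hypothetical) of the charged and neutral fragments detected in
    # coincidence for that single dissociation.
    from itertools import product

    events = [
        {"charged": [5.2], "neutral": [7.8, 8.1]},
        {"charged": [5.3, 6.0], "neutral": [7.9]},
        {"charged": [5.1], "neutral": [8.0]},
    ]

    def tof_pairs(event):
        """All (charged, neutral) arrival-time pairs within one event."""
        return list(product(event["charged"], event["neutral"]))

    # Accumulating these within-event pairs across many events (and
    # histogramming them in 2-D) reveals which fragments originate from
    # the same dissociation, rather than from uncorrelated events.
    all_pairs = [p for ev in events for p in tof_pairs(ev)]
    print(len(all_pairs))  # -> 5 correlated (ion, neutral) pairs
    ```

    The key design point is that pairs are formed only within an event, never across events, which is what distinguishes a correlated measurement from an ordinary summed TOF spectrum.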

  9. Development of multiplex PCR method for simultaneous detection of four events of genetically modified maize: DAS-59122-7, MIR604, MON863 and MON88017.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Mano, Junichi; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel multiplex PCR method was developed for simultaneous event-specific detection of four events of GM maize, i.e., DAS-59122-7, MIR604, MON88017, and MON863. The single laboratory examination of analytical performance using simulated DNA mixtures containing GM DNA at various concentrations in non-GM DNA suggested that the limits of detection (LOD) of the multiplex PCR method were 0.16% for MON863, MIR604, and MON88017, and 0.078% for DAS-59122-7. We previously developed a nonaplex (9plex) PCR method for eight events of GM maize, i.e., Bt11, Bt176, GA21, MON810, MON863, NK603, T25, and TC1507. Together with the nonaplex PCR method, the newly developed method enabled the detection and identification of eleven GM maize events that are frequently included in commercial GM seed used in Japan. In addition, this combinational analysis may be useful for the identification of combined event products of GM maize. PMID:20595789

  10. Time Dependent Model of the Slow Slip Event in the Tokai Region, South Central Japan, Detected by GPS Measurements in 2001

    NASA Astrophysics Data System (ADS)

    Ohta, Y.; Kimata, F.

    2003-04-01

    A slow slip event was detected in the Tokai region, south central Japan, by the national GPS network (GEONET) in 2001. As the Philippine Sea Plate subducts northwestward at the Suruga-Nankai Trough, ground displacements toward the northwest at rates of 1-2 cm/yr are normally observed in this area. These northwestward displacements were no longer observed in the 2001 GPS measurements. Reports from the ground deformation monitoring by GSI suggest that the slow event was still in progress as of January 2003. Repeated slow slip events were inferred from leveling and EDM ranging by Kimata et al. (2001), and Mogi (1989) suggested pre-slip of the 1944 Tonankai earthquake in this region from leveling conducted on the day of the earthquake. Given this history, we are interested in the relationship between slow slip events and great plate-boundary earthquakes. Models of the 2001 Tokai slow slip have been estimated from GPS measurements and leveling data [Ozawa et al. (2002); Kimata et al. (2002)]. They suggest that southeastward slow slip of 10-20 cm occurred over an area of less than 100 sq km, on a fault located on the gently dipping plate boundary at a depth of 20-30 km, inland of the trough. In addition, there is a dense set of GPS stations immediately above the slow slip fault. To resolve the slow slip in more detail, we are re-processing the GEONET GPS data using the PPP method in GIPSY OASIS II and discuss a time-dependent model of the slow slip event. In mid-2000 the Tokai region was affected by ground deformation caused by volcanic activity at Miyakejima Volcano, which makes it difficult to determine from the GPS measurements when the deformation associated with the slow slip event began. Our preliminary results on vertical movements and the shortening of baselines between neighboring GPS stations suggest that the slow slip event occurred in the western part of

  11. Line Identifications of Type I Supernovae: On the Detection of Si II for These Hydrogen-poor Events

    NASA Astrophysics Data System (ADS)

    Parrent, J. T.; Milisavljevic, D.; Soderberg, A. M.; Parthasarathy, M.

    2016-03-01

    Here we revisit line identifications of type I supernovae (SNe I) and highlight trace amounts of unburned hydrogen as an important free parameter for the composition of the progenitor. Most one-dimensional stripped-envelope models of supernovae indicate that observed features near 6000-6400 Å in type I spectra are due to more than Si ii λ6355. However, while an interpretation of conspicuous Si ii λ6355 can approximate 6150 Å absorption features for all SNe Ia during the first month of free expansion, similar identifications applied to 6250 Å features of SNe Ib and Ic have not been as successful. When the corresponding synthetic spectra are compared with high-quality time-series observations, the computed spectra are frequently too blue in wavelength. Some improvement can be achieved with Fe ii lines that contribute redward of 6150 Å; however, the computed spectra either remain too blue, or the spectrum only reaches a fair agreement when the rise-time to peak brightness of the model conflicts with observations by a factor of two. This degree of disagreement brings into question the proposed explosion scenario. Similarly, a detection of strong Si ii λ6355 in the spectra of broad-lined Ic and super-luminous events of type I/R is less convincing despite numerous model spectra used to show otherwise. Alternatively, we suggest 6000-6400 Å features are possibly influenced by either trace amounts of hydrogen or blueshifted absorption and emission in Hα, the latter being an effect which is frequently observed in the spectra of hydrogen-rich SNe II.

  12. Estimation of fault geometry of a slow slip event off the Kii Peninsula, southwest of Japan, detected by DONET

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Nakano, M.; Hori, T.; Takahashi, N.

    2015-12-01

    The Japan Agency for Marine-Earth Science and Technology installed a permanent ocean-bottom observation network, the Dense Oceanfloor Network System for Earthquakes and Tsunamis (DONET), off the Kii Peninsula, southwest of Japan, to monitor earthquakes and tsunamis. We detected long-term vertical displacements of the sea floor, starting from March 2013, in the ocean-bottom pressure records at several DONET stations (Suzuki et al., 2014). We consider these displacements to be crustal deformation caused by a slow slip event (SSE). We estimated the fault geometry of the SSE using the observed ocean-bottom displacements. The displacements were obtained by removing the tidal components from the pressure records. We also subtracted from each record the average of the pressure changes over all stations connected to the same science node, in order to remove contributions from atmospheric pressure changes and non-tidal ocean dynamic mass variations. Accordingly, in the fault-geometry estimation we compared the observed displacements with theoretical ones from which the corresponding average displacement had been subtracted, and we also compared the observed and theoretical average displacements for model evaluation; the observed average displacements were assumed to be zero. Although the fault model has nine parameters, vertical displacements were observed at only four stations, so we assumed three fault geometries: (1) a reverse fault slip along the plate boundary, (2) a strike slip along a splay fault, and (3) a reverse fault slip along the splay fault. Model (3) gave the smallest residual between observed and calculated displacements. We also observed that this SSE was synchronized with a decrease in the background seismicity within the area of a nearby earthquake cluster. In the future, we will investigate the relationship between the SSE and the seismicity change.
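    The common-mode correction described above (subtracting the node-averaged pressure change from each station's detided record) can be sketched as follows. Station names, epoch values, and units here are hypothetical, chosen only to illustrate the arithmetic.

    ```python
    # Illustrative sketch of node-average (common-mode) removal from
    # detided ocean-bottom pressure records. Values are hypothetical
    # pressure changes (hPa) at successive epochs for stations sharing
    # one science node.
    node_records = {
        "A1": [0.0, 1.2, 2.1, 2.0],
        "A2": [0.0, 1.0, 1.8, 1.9],
        "A3": [0.0, 1.1, 2.4, 2.6],
    }

    n_epochs = len(next(iter(node_records.values())))

    # Common-mode signal: the mean over all node stations at each epoch,
    # dominated by atmospheric loading and non-tidal oceanic mass changes.
    common = [
        sum(rec[t] for rec in node_records.values()) / len(node_records)
        for t in range(n_epochs)
    ]

    # Residuals approximate station-specific vertical sea-floor motion.
    residuals = {
        sta: [rec[t] - common[t] for t in range(n_epochs)]
        for sta, rec in node_records.items()
    }
    print(round(residuals["A3"][3], 2))  # -> 0.43
    ```

    Note that by construction the residuals average to zero across the node at every epoch, which is why the abstract compares observed and theoretical displacements only after subtracting the corresponding model average.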

  13. Concordance of nuclear and mitochondrial DNA markers in detecting a founder event in Lake Clark sockeye salmon

    USGS Publications Warehouse

    Ramstad, Kristina M.; Woody, Carol Ann; Habicht, Chris; Sage, G. Kevin; Seeb, James E.; Allendorf, Fred W.

    2007-01-01

    Genetic bottleneck effects can reduce genetic variation, persistence probability, and evolutionary potential of populations. Previous microsatellite analysis suggested a bottleneck associated with a common founding of sockeye salmon Oncorhynchus nerka populations of Lake Clark, Alaska, about 100 to 400 generations ago. The common founding event occurred after the last glacial recession and resulted in reduced allelic diversity and strong divergence of Lake Clark sockeye salmon relative to neighboring Six Mile Lake and Lake Iliamna populations. Here we used two additional genetic marker types (allozymes and mtDNA) to examine these patterns further. Allozyme and mtDNA results were congruent with the microsatellite data in suggesting a common founder event in Lake Clark sockeye salmon and confirmed the divergence of Lake Clark populations from neighboring Six Mile Lake and Lake Iliamna populations. The use of multiple marker types provided better understanding of the bottleneck in Lake Clark. For example, the Sucker Bay Lake population had an exceptionally severe reduction in allelic diversity at microsatellite loci, but not at mtDNA. This suggests that the reduced microsatellite variation in Sucker Bay Lake fish is due to consistently smaller effective population size than other Lake Clark populations, rather than a more acute or additi