Science.gov

Sample records for model-based event detection

  1. A Topic Modeling Based Representation to Detect Tweet Locations. Example of the Event "Je Suis Charlie"

    NASA Astrophysics Data System (ADS)

    Morchid, M.; Josselin, D.; Portilla, Y.; Dufour, R.; Altman, E.; Linarès, G.

    2015-09-01

    Social networks have become a major actor in information propagation. Using the popular Twitter platform, mobile users post or relay messages from different locations. The tweet content, that is its meaning and location, shows how an event such as the bursty "JeSuisCharlie", which happened in France in January 2015, is comprehended in different countries. This research aims at clustering the tweets according to the co-occurrence of their terms, including the country, and at forecasting the probable country of a non-located tweet from its content. First, we present the process of collecting a large quantity of data from the Twitter website, which yields a set of 2,189 located tweets about "Charlie" posted from the 7th to the 14th of January. We then describe an original method adapted from the Author-Topic (AT) model, itself based on Latent Dirichlet Allocation (LDA). We define a homogeneous space containing both lexical content (words) and spatial information (country). During a training process on part of the sample, we obtain a set of clusters (topics) based on statistical relations between lexical and spatial terms. During the clustering task, we evaluate the method's effectiveness on the rest of the sample, reaching up to 95% correct assignment. This shows that our model is pertinent for predicting tweet location after a learning process.
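
    As a rough illustration of the approach described above (and not the authors' exact AT-LDA implementation), the sketch below places tweet words and a country token in a single vocabulary, fits a plain LDA model with scikit-learn, and scores candidate countries for an unlabelled tweet through the inferred topic mixture. The toy tweets, country codes and model sizes are placeholder assumptions.

    ```python
    # Minimal sketch (not the authors' AT-LDA model): put tweet words and a country
    # token in one vocabulary, fit LDA, then score candidate countries for an
    # unlabelled tweet by P(country token | inferred topic mixture).
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical toy data: (text, country) pairs stand in for the located tweets.
    train = [("je suis charlie paris rally", "FR"),
             ("charlie hebdo solidarity march london", "GB"),
             ("je suis charlie soutien liberte", "FR"),
             ("free speech charlie cartoon debate", "US")]

    docs = [f"{text} COUNTRY_{c}" for text, c in train]      # lexical + spatial terms
    vec = CountVectorizer(token_pattern=r"\S+")
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

    phi = lda.components_ / lda.components_.sum(axis=1, keepdims=True)  # topic-word dists
    vocab = vec.vocabulary_
    countries = ["FR", "GB", "US"]

    def predict_country(text):
        theta = lda.transform(vec.transform([text]))[0]      # topic mixture of the tweet
        scores = {c: float(theta @ phi[:, vocab[f"country_{c.lower()}"]]) for c in countries}
        return max(scores, key=scores.get), scores

    print(predict_country("je suis charlie marche republicaine"))
    ```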

  2. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions, irregular distributions, and functions associated with continuous or discrete variables.
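
    The following sketch illustrates only the general idea behind such a detector (it is not the patented system): events are scored by their improbability under a fitted probability density, and the alert threshold is chosen from a user-specified false-alert rate, which is one way to read the "regulatability" property. The Gaussian density, the toy traffic feature and the rate value are assumptions.

    ```python
    # Sketch of the general idea only: score events by improbability under a fitted
    # density and set the alert threshold from a target false-alert rate on history.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    history = rng.normal(loc=500.0, scale=50.0, size=10_000)   # e.g. bytes per flow (toy data)

    mu, sigma = history.mean(), history.std()
    def anomaly_score(x):
        """Negative log-likelihood under the fitted Gaussian: larger = more anomalous."""
        return -stats.norm(mu, sigma).logpdf(x)

    # "Regulatability": pick the threshold from a chosen false-alert rate on history.
    target_false_alert_rate = 1e-3
    threshold = np.quantile(anomaly_score(history), 1 - target_false_alert_rate)

    new_events = np.array([510.0, 620.0, 900.0])
    alerts = anomaly_score(new_events) > threshold
    print(list(zip(new_events, alerts)))
    ```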

  3. Detection of solar events

    DOEpatents

    Fischbach, Ephraim; Jenkins, Jere

    2013-08-27

    A flux detection apparatus can include a radioactive sample having a decay rate capable of changing in response to interaction with a first particle or a field, and a detector associated with the radioactive sample. The detector is responsive to a second particle or radiation formed by decay of the radioactive sample. The rate of decay of the radioactive sample can be correlated to flux of the first particle or the field. Detection of the first particle or the field can provide an early warning for an impending solar event.

  4. Scintillation event energy measurement via a pulse model based iterative deconvolution method

    NASA Astrophysics Data System (ADS)

    Deng, Zhenzhou; Xie, Qingguo; Duan, Zhiwen; Xiao, Peng

    2013-11-01

    This work focuses on event energy measurement, a crucial task of scintillation detection systems. We modeled the scintillation detector as a linear system and treated the energy measurement as a deconvolution problem. We proposed a pulse model based iterative deconvolution (PMID) method, which can process pileup events without pileup detection and is adaptive to different signal pulse shapes. The proposed method was compared with the digital gated integrator (DGI) and digital delay-line clipping (DDLC) using real-world experimental data. For singles data, the energy resolution (ER) produced by PMID matched that of DGI. For pileups, the PMID method outperformed both DGI and DDLC in ER and counts recovery. These encouraging results suggest that the PMID method has great potential in applications like photon-counting systems and pulse-height spectrometers, in which multiple-event pileups are common.
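
    A hedged sketch of the underlying idea (not the published PMID algorithm): the digitized waveform is modeled as a linear superposition of a known pulse template placed at every sample, and per-event energies are recovered by non-negative least squares deconvolution, which resolves pile-up without a separate pile-up detection step. The pulse shape, sampling rate and noise level are made-up values.

    ```python
    # Treat the digitized trace as a superposition of a known pulse shape and recover
    # per-sample amplitudes (event energies) by non-negative least squares.
    import numpy as np
    from scipy.optimize import nnls
    from scipy.linalg import toeplitz

    fs, n = 100.0, 200                        # sampling rate (MHz) and trace length (toy values)
    t = np.arange(n) / fs
    pulse = np.exp(-t / 0.5) - np.exp(-t / 0.05)   # assumed single-event pulse template
    pulse /= pulse.max()

    # Convolution (Toeplitz) matrix of the template: trace = H @ amplitudes + noise
    H = toeplitz(pulse, np.zeros(n))

    # Synthetic piled-up trace: two events 0.3 us apart with energies 1.0 and 0.6
    amps_true = np.zeros(n); amps_true[20] = 1.0; amps_true[50] = 0.6
    rng = np.random.default_rng(1)
    trace = H @ amps_true + 0.01 * rng.standard_normal(n)

    amps_est, _ = nnls(H, trace)              # deconvolved amplitude train
    events = [(i / fs, a) for i, a in enumerate(amps_est) if a > 0.1]
    print(events)                             # candidate (time, energy) pairs, pile-up resolved
    ```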

  5. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  6. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, together with the concurrent enabling technology of digital computer systems and the development of sequential processors. This theory, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
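
    The recipe outlined above can be sketched in a few lines: a Kalman filter acts as the model-based processor under each hypothesis, its innovations supply the per-sample log-likelihoods, and a sequential (Wald) log-likelihood ratio is compared against thresholds set by the desired error probabilities. The scalar Gauss-Markov model and noise levels below are toy assumptions, not the authors' formulation.

    ```python
    # Toy sequential model-based detector: Kalman innovations feed a Wald
    # sequential log-likelihood ratio test. Model values are assumptions.
    import numpy as np

    A, C, Q, R = 0.95, 1.0, 0.05, 0.25        # scalar Gauss-Markov model (toy values)

    def kalman_loglik(y, signal_present):
        """Per-sample innovation log-likelihoods from a scalar Kalman filter."""
        a = A if signal_present else 0.0      # H0: a noise-only reference model
        x, P, ll = 0.0, 1.0, []
        for yk in y:
            x, P = a * x, a * P * a + Q                   # predict
            e, S = yk - C * x, C * P * C + R              # innovation and its variance
            ll.append(-0.5 * (np.log(2 * np.pi * S) + e * e / S))
            K = P * C / S                                 # update
            x, P = x + K * e, (1 - K * C) * P
        return np.array(ll)

    def sequential_detect(y, pfa=1e-3, pmiss=1e-3):
        lam = np.cumsum(kalman_loglik(y, True) - kalman_loglik(y, False))
        upper, lower = np.log((1 - pmiss) / pfa), np.log(pmiss / (1 - pfa))
        for k, L in enumerate(lam):
            if L >= upper:
                return "signal present", k
            if L <= lower:
                return "noise only", k
        return "undecided", len(y) - 1

    # Simulate data under the signal-present hypothesis and run the detector.
    rng = np.random.default_rng(2)
    x, ys = 0.0, []
    for _ in range(200):
        x = A * x + np.sqrt(Q) * rng.standard_normal()
        ys.append(C * x + np.sqrt(R) * rng.standard_normal())
    print(sequential_detect(np.array(ys)))
    ```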

  7. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, together with the concurrent enabling technology of digital computer systems and the development of sequential processors. This theory, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  8. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of these are "true zeros", indicating that the drug-adverse event pair cannot occur, and they must be distinguished from the remaining zero counts, which simply indicate that the drug-adverse event pair has not occurred, or has not been reported, yet. In this paper, a zero-inflated Poisson (ZIP) model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, also called signals. The maximum likelihood estimates of the model parameters are obtained using the expectation-maximization algorithm. The ZIP model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulation results show that the ZIP model based likelihood ratio test performs similarly to the Poisson model based likelihood ratio test when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, with varying percentages of observed zero-count cells, from the 2006 to 2011 Adverse Event Reporting System database.
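
    The estimation step behind the zero-inflated Poisson likelihood ratio test can be sketched with a small expectation-maximization routine that separates structural ("true") zeros from Poisson zeros; the full signal-detection procedure, stratification and error-rate control described in the paper are not reproduced here, and the simulated counts are placeholders.

    ```python
    # Minimal EM fit of a zero-inflated Poisson to report counts (estimation step only).
    import numpy as np

    def fit_zip_em(y, n_iter=200):
        """Return (pi, lam): structural-zero probability and Poisson mean."""
        y = np.asarray(y, dtype=float)
        pi, lam = 0.5, max(y.mean(), 1e-6)
        for _ in range(n_iter):
            # E-step: probability that each observed zero is a structural ("true") zero
            z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
            # M-step
            pi = z.mean()
            lam = ((1 - z) * y).sum() / (1 - z).sum()
        return pi, lam

    rng = np.random.default_rng(3)
    true_pi, true_lam, n = 0.3, 2.0, 5000
    structural = rng.random(n) < true_pi
    counts = np.where(structural, 0, rng.poisson(true_lam, size=n))
    print(fit_zip_em(counts))   # should be close to (0.3, 2.0)
    ```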

  9. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactors? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions.

  10. Model-based fault detection and diagnosis in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Carrasco, Rodrigo A.

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) observatory, with its 66 individual telescopes and other central equipment, generates a massive set of monitoring data every day, collecting information on the performance of a variety of critical and complex electrical, electronic and mechanical components. This data is crucial for most troubleshooting efforts performed by engineering teams. More than 5 years of accumulated data and expertise allow for a more systematic approach to fault detection and diagnosis. This paper presents model-based fault detection and diagnosis techniques to support corrective and predictive maintenance in a 24/7 minimum-downtime observatory.

  11. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during an operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of that error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
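
    A minimal sketch of the claimed processing steps (not the patented implementation): form a modeling error against expected operation data, convert it to a statistical significance, and indicate an anomaly only when the significance persists beyond a threshold number of samples. The noise level, z-score cutoff and persistence threshold are illustrative assumptions.

    ```python
    # Modeling error -> significance -> persistence check, with toy parameters.
    import numpy as np

    def detect_structural_anomaly(measured, expected, noise_sigma,
                                  z_crit=3.0, persistence_threshold=5):
        error = np.asarray(measured) - np.asarray(expected)   # modeling error signal
        significant = np.abs(error / noise_sigma) > z_crit    # error significance
        run = 0
        for k, s in enumerate(significant):
            run = run + 1 if s else 0                          # persistence counter
            if run >= persistence_threshold:
                return True, k                                 # anomaly indicated at sample k
        return False, None

    rng = np.random.default_rng(4)
    expected = np.zeros(300)
    measured = rng.normal(0.0, 0.1, 300)
    measured[200:] += 0.8                                      # injected structural change
    print(detect_structural_anomaly(measured, expected, noise_sigma=0.1))
    ```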

  12. Acoustic Event Detection and Classification

    NASA Astrophysics Data System (ADS)

    Temko, Andrey; Nadeu, Climent; Macho, Dušan; Malkin, Robert; Zieger, Christian; Omologo, Maurizio

    The human activity that takes place in meeting rooms or classrooms is reflected in a rich variety of acoustic events (AE), produced either by the human body or by objects handled by humans, so the determination of both the identity of sounds and their position in time may help to detect and describe that human activity. Indeed, speech is usually the most informative sound, but other kinds of AEs may also carry useful information, for example, clapping or laughing inside a speech, a strong yawn in the middle of a lecture, a chair moving or a door slam when the meeting has just started. Additionally, detection and classification of sounds other than speech may be useful to enhance the robustness of speech technologies like automatic speech recognition.

  13. Detectability of Discrete Event Systems with Dynamic Event Observation

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2009-01-01

    Our previous work considers detectability of discrete event systems, which is to determine the current state and subsequent states of a system based on event observation. We assume that event observation is static, that is, if an event is observable, then all its occurrences are observable. However, in practical systems such as sensor networks, event observation often needs to be dynamic, that is, occurrences of the same event may or may not be observable, depending on the state of the system. In this paper, we generalize static event observation into dynamic event observation and consider the detectability problem under dynamic event observation. We define four types of detectabilities. To check detectabilities, we construct the observer, with exponential complexity. To reduce the computational complexity, we can also construct a detector, with polynomial complexity, to check strong detectabilities. Dynamic event observation can be implemented in two possible ways: passive observation and active observation. For active observation, we discuss how to find minimal event observation policies that preserve each of the four types of detectabilities. PMID:20161618
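
    The observer construction mentioned above (for the static-observation case) can be sketched with a standard subset construction: unobservable events are absorbed into an unobservable reach, and each observer state is the set of system states consistent with the observed sequence, so detectability can be judged from whether singleton observer states are reached. The tiny nondeterministic automaton below is a made-up example, not one from the paper.

    ```python
    # Classical observer (subset) construction for a partially observed automaton.
    from itertools import chain

    # transitions: (state, event) -> set of next states (nondeterministic)
    transitions = {("s0", "u"): {"s1"}, ("s0", "a"): {"s2"},
                   ("s1", "a"): {"s2"}, ("s2", "b"): {"s0"}}
    observable = {"a", "b"}          # event "u" is unobservable

    def unobservable_reach(states):
        reach, stack = set(states), list(states)
        while stack:
            s = stack.pop()
            for (src, ev), dsts in transitions.items():
                if src == s and ev not in observable:
                    for d in dsts - reach:
                        reach.add(d); stack.append(d)
        return frozenset(reach)

    def build_observer(initial):
        start = unobservable_reach({initial})
        obs_states, obs_trans, stack = {start}, {}, [start]
        while stack:
            cur = stack.pop()
            for ev in observable:
                nxt = set(chain.from_iterable(transitions.get((s, ev), set()) for s in cur))
                if not nxt:
                    continue
                nxt = unobservable_reach(nxt)
                obs_trans[(cur, ev)] = nxt
                if nxt not in obs_states:
                    obs_states.add(nxt); stack.append(nxt)
        return obs_states, obs_trans

    states, trans = build_observer("s0")
    # Singleton observer states indicate points where the current state is known exactly.
    print([sorted(s) for s in states])
    ```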

  14. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and unpredictability of the location of natural hazards, as well as the actual demands of hazard work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using a crowdsourcing model and with the help of the Internet and the power of hundreds of millions of Internet users, the evaluation model provides visual interpretation of high-resolution remote sensing images of the hazard area and collects massive amounts of valuable disaster data; secondly, it adopts a strategy of dynamic voting consistency to evaluate the disaster data provided by the crowdsourcing workers; thirdly, it pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers; lastly, it actuates the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, allows public participation and citizen science to be realized, and improves the accuracy and timeliness of hazard assessment results.

  15. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  16. Event oriented dictionary learning for complex event detection.

    PubMed

    Yan, Yan; Yang, Yi; Meng, Deyu; Liu, Gaowen; Tong, Wei; Hauptmann, Alexander G; Sebe, Nicu

    2015-06-01

    Complex event detection is a retrieval task with the goal of finding videos of a particular event in a large-scale unconstrained Internet video archive, given example videos and text descriptions. Nowadays, different multimodal fusion schemes of low-level and high-level features are extensively investigated and evaluated for the complex event detection task. However, how to effectively select high-level semantically meaningful concepts from a large pool to assist complex event detection is rarely studied in the literature. In this paper, we propose a novel strategy to automatically select semantically meaningful concepts for the event detection task based on both the event-kit text descriptions and the concepts' high-level feature descriptions. Moreover, we introduce a novel event oriented dictionary representation based on the selected semantic concepts. Toward this goal, we leverage training images (frames) of selected concepts from the semantic indexing dataset, with a pool of 346 concepts, into a novel supervised multitask lp-norm dictionary learning framework. Extensive experimental results on the TRECVID multimedia event detection dataset demonstrate the efficacy of our proposed method.

  17. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Indexed citation fragments from the report: ...Conference on Artificial Intelligence, 1363-1368, Detroit, Michigan, August 1989. Chu, Wei-Hai, "Generic Expert System Shell for Diagnostic Reasoning..." ...Intelligence, 1324-1330, Detroit, Michigan, August 1989. de Kleer, Johan and Brian C. Williams, "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97... Benjamin Kuipers, "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, 1238...

  18. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.

  19. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on an observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called a detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable to solving many practical problems. PMID:21691432

  20. Detecting unitary events without discretization of time.

    PubMed

    Grün, S; Diesmann, M; Grammont, F; Riehle, A; Aertsen, A

    1999-12-15

    In earlier studies we developed the 'Unitary Events' analysis (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.) to detect the presence of conspicuous spike coincidences in multiple single unit recordings and to evaluate their statistical significance. The method enabled us to study the relation between spike synchronization and behavioral events (Riehle A, Grün S, Diesmann M, Aertsen A. Spike synchronization and rate modulation differentially involved in motor cortical function. Science 1997;278:1950-1953.). There is recent experimental evidence that the timing accuracy of coincident spiking events, which might be relevant for higher brain function, may be in the range of 1-5 ms. To detect coincidences on that time scale, we sectioned the observation interval into short disjoint time slices ('bins'). Unitary Events analysis of this discretized process demonstrated that coincident events can indeed be reliably detected. However, the method loses sensitivity for higher temporal jitter of the events constituting the coincidences (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.). Here we present a new approach, the 'multiple shift' method (MS), which overcomes the need for binning and treats the data in their (original) high time resolution (typically 1 ms, or better). Technically, coincidences are detected by shifting the spike trains against each other over the range of allowed coincidence width and integrating the number of exact coincidences (on the time resolution of the data) over all shifts. We found that the new method enhances the sensitivity for coincidences with temporal jitter. Both methods are outlined and compared on the basis of their analytical description and their application on
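
    The counting step of the 'multiple shift' method described above can be sketched directly: one spike train is shifted against the other over the allowed coincidence width and the exact coincidences (at the data's native resolution) are summed over all shifts; the significance evaluation of the Unitary Events framework is omitted. The spike times below are illustrative.

    ```python
    # 'Multiple shift' coincidence counting (counting step only).
    import numpy as np

    def multiple_shift_coincidences(spikes_a, spikes_b, max_shift):
        """spikes_a, spikes_b: spike times in sampling units (e.g. ms at 1 ms resolution)."""
        a, b = set(spikes_a), np.asarray(spikes_b)
        return sum(len(a.intersection(b + s)) for s in range(-max_shift, max_shift + 1))

    train_a = [10, 52, 98, 140, 203]
    train_b = [11, 53, 120, 141, 250]        # jittered counterparts of three spikes in train_a
    print(multiple_shift_coincidences(train_a, train_b, max_shift=3))   # -> 3
    ```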

  1. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-08-20

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches.

  2. Detecting Extreme Events in Gridded Climate Data

    SciTech Connect

    Ramachandra, Bharathkumar; Gadiraju, Krishna; Vatsavai, Raju; Kaiser, Dale Patrick; Karnowski, Thomas Paul

    2016-01-01

    Detecting and tracking extreme events in gridded climatological data is a challenging problem on several fronts: algorithms, scalability, and I/O. Successful detection of these events will give climate scientists an alternate view of the behavior of different climatological variables, leading to enhanced scientific understanding of the impacts of events such as heat and cold waves and, on a larger scale, the El Niño Southern Oscillation. Recent advances in computing power and research in data sciences enabled us to look at this problem with a different perspective from what was previously possible. In this paper we present our computationally efficient algorithms for anomalous cluster detection on climate change big data. We provide results on detection and tracking of surface temperature and geopotential height anomalies, a trend analysis, and a study of relationships between the variables. We also identify the limitations of our approaches, future directions for research and alternate approaches.
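
    As a simplified stand-in for the anomalous-cluster detection idea (the authors' algorithms are not reproduced here), the sketch below standardizes a gridded field against an assumed climatology, thresholds the anomalies, and labels spatially connected clusters with scipy.ndimage. The grid, climatology and threshold are toy assumptions.

    ```python
    # Threshold standardized anomalies and label connected clusters on a toy grid.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(5)
    clim_mean, clim_std = 15.0, 2.0                              # assumed climatology
    field = rng.normal(clim_mean, clim_std, size=(90, 180))      # toy lat x lon temperature grid
    field[30:38, 60:75] += 8.0                                   # injected heat-wave-like anomaly

    z = (field - clim_mean) / clim_std                           # standardized anomaly
    labels, n_clusters = ndimage.label(z > 3.0)                  # connected extreme-anomaly clusters
    sizes = ndimage.sum(z > 3.0, labels, index=range(1, n_clusters + 1))
    print(n_clusters, int(sizes.max()) if n_clusters else 0)     # cluster count and largest area
    ```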

  3. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is the key for many underwater computer vision tasks, such as object recognition, localization, and tracking. Considering their superior ability in visual sensing of underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are more adaptive to underwater environments. However, the low accuracy rate and the absence of prior knowledge learning limit their adoption in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and an unstable background, the visual information sensing and processing mechanism of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. Firstly, the image is segmented into several subblocks. The intensity information is extracted for establishing a background model which could roughly identify the object and the background regions. The texture feature of each pixel in the rough object region is further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives a better performance. Compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results.

  4. Detection of goal events in soccer videos

    NASA Astrophysics Data System (ADS)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2004-12-01

    In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio content comprises three steps: 1) extraction of audio features from a video sequence, 2) candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.

  5. Detection of goal events in soccer videos

    NASA Astrophysics Data System (ADS)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2005-01-01

    In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio content comprises three steps: 1) extraction of audio features from a video sequence, 2) candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.

  6. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  7. On event-based optical flow detection

    PubMed Central

    Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko

    2015-01-01

    Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter-based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
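
    The plane-fitting class of methods discussed above can be illustrated compactly: fit a plane t = a*x + b*y + c to a local cloud of address-events (x, y, t) by least squares, and read the local velocity off the plane gradient as v = (a, b)/(a^2 + b^2). The synthetic event cloud and neighbourhood handling below are assumptions; the paper's biologically inspired filter-based detector is not reproduced.

    ```python
    # Plane fitting to a local event cloud; velocity from the plane gradient.
    import numpy as np

    def plane_fit_flow(events):
        """events: array of shape (N, 3) with columns (x, y, t) in a local neighbourhood."""
        x, y, t = events[:, 0], events[:, 1], events[:, 2]
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
        g2 = a * a + b * b
        return np.array([a, b]) / g2 if g2 > 0 else np.zeros(2)   # pixels per second

    # Toy event cloud: an edge sweeping at 20 px/s in +x generates events whose
    # timestamps increase linearly with x (plus timing jitter).
    rng = np.random.default_rng(6)
    xs = rng.uniform(0, 10, 200); ys = rng.uniform(0, 10, 200)
    ts = xs / 20.0 + rng.normal(0, 1e-3, 200)
    print(plane_fit_flow(np.column_stack([xs, ys, ts])))          # approx [20, 0]
    ```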

  8. Certification Aspects in Critical Embedded Software Development with Model Based Techniques: Detection of Unintended Functions

    NASA Astrophysics Data System (ADS)

    Atencia Yepez, A.; Autrán Cerqueira, J.; Urueña, S.; Jurado, R.

    2012-01-01

    This paper, developed under contract with the European Aviation Safety Agency (EASA), analyses in detail what the certification implications in the aeronautic industry may be when model-level verification and validation techniques are applied. In particular, the paper focuses on the problem of detecting unintended functions by applying Model Coverage Criteria at the model level. This point is significantly important for the future extensive use of Model Based approaches in safety critical software, since the uncertainty in system performance introduced by unintended functions, which may also lead to unacceptable hazardous or catastrophic events, prevents the system from complying with certification requirements. The paper provides a definition and a categorization of unintended functions and gives some relevant examples to assess the efficiency of model-coverage techniques in the detection of unintended functions. The paper explains how this analysis is supported by a methodology based on the study of sources for introducing unintended functions. Finally, the feasibility of using model-level verification techniques to support the software certification process is analysed.

  9. A Bayesian Hidden Markov Model-based approach for anomaly detection in electronic systems

    NASA Astrophysics Data System (ADS)

    Dorj, E.; Chen, C.; Pecht, M.

    Early detection of anomalies in any system or component prevents impending failures and enhances performance and availability. The complex architecture of electronics, the interdependency of component functionalities, and the miniaturization of most electronic systems make it difficult to detect and analyze anomalous behaviors. A Hidden Markov Model-based classification technique determines unobservable hidden behaviors of complex and remotely inaccessible electronic systems using observable signals. This paper presents a data-driven approach for anomaly detection in electronic systems based on a Bayesian Hidden Markov Model classification technique. The posterior parameters of the Hidden Markov Models are estimated using the conjugate prior method. An application of the developed Bayesian Hidden Markov Model-based anomaly detection approach is presented for detecting anomalous behavior in Insulated Gate Bipolar Transistors using experimental data. The detection results illustrate that the developed anomaly detection approach can help detect anomalous behaviors in electronic systems, which can help prevent system downtime and catastrophic failures.
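
    A hedged sketch of the general HMM-based anomaly-detection idea described above: a Gaussian HMM is trained on observable signals from healthy operation and new sequences are flagged as anomalous when their per-sample log-likelihood drops well below the healthy baseline. The hmmlearn estimator here is maximum likelihood rather than the paper's conjugate-prior Bayesian estimation, and the IGBT-like signals and margin are synthetic assumptions.

    ```python
    # Train an HMM on healthy signals; low log-likelihood on new data flags anomalies.
    import numpy as np
    from hmmlearn import hmm

    rng = np.random.default_rng(12)
    healthy = np.concatenate([rng.normal(1.0, 0.05, (400, 1)),
                              rng.normal(1.4, 0.05, (400, 1))])     # e.g. collector-emitter voltage
    model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=50, random_state=0)
    model.fit(healthy)

    baseline = model.score(healthy) / len(healthy)                  # per-sample log-likelihood

    def is_anomalous(sequence, margin=2.0):
        return model.score(sequence) / len(sequence) < baseline - margin

    normal_seq = np.concatenate([rng.normal(1.0, 0.05, (50, 1)), rng.normal(1.4, 0.05, (50, 1))])
    degraded_seq = rng.normal(1.9, 0.2, (100, 1))                   # drifted, anomalous behaviour
    print(is_anomalous(normal_seq), is_anomalous(degraded_seq))     # expect False, True
    ```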

  10. Detection and recognition of indoor smoking events

    NASA Astrophysics Data System (ADS)

    Bien, Tse-Lun; Lin, Chang Hong

    2013-03-01

    Smoking in public indoor spaces has become prohibited in many countries since it not only affects the health of the people around the smoker, but also increases the risk of fire outbreaks. This paper proposes a novel scheme to automatically detect and recognize smoking events by using existing surveillance cameras. The main idea of our proposed method is to detect human smoking events by recognizing human actions. In this scheme, human pose estimation is introduced to analyze human actions from their poses. The human pose estimation method segments the head and both hands from the human body parts by using a skin color detection method. However, skin color methods may fail under insufficient lighting conditions; therefore, lighting compensation is applied to help the skin color detection method become more accurate. Because body parts may be covered by shadows, which may cause the human pose estimation to fail, a Kalman filter is applied to track the missing body parts. After that, we evaluate the probability features of hands approaching the head. A support vector machine (SVM) is applied to learn and recognize the smoking events from the probability features. To analyze the performance of the proposed method, datasets captured from surveillance camera views in an indoor environment are tested. The experimental results show the effectiveness of our proposed method, with an accuracy rate of 83.33%.

  11. A methodology for detecting routing events in discrete flow networks.

    SciTech Connect

    Garcia, H. E.; Yoo, T.; Nuclear Technology

    2004-01-01

    A theoretical framework for formulating and implementing model-based monitoring of discrete flow networks is discussed. Possible flows of items are described as the sequence of discrete-event (DE) traces. Each trace defines the DE sequence(s) that are triggered when an entity follows a given flow-path and visits tracking locations distributed within the monitored system. Given the set of possible discrete flows, a possible-behavior model - an interacting set of automata - is constructed, where each automaton models the discrete flow of items at each tracking location. Event labels or symbols contain all the information required to unambiguously distinguish each discrete flow. Within the possible behavior, there is a special sub-behavior whose occurrence is required to be detected. The special behavior may be specified by the occurrence of routing events, such as faults. These intermittent or non-persistent events may occur repeatedly. An observation mask is then defined, characterizing the actual observation configuration available for collecting item tracking data. The analysis task is then to determine whether this observation configuration is capable of detecting the identified special behavior. The assessment is accomplished by evaluating several observability notions, such as detectability and diagnosability. If the corresponding property is satisfied, associated formal observers are constructed to perform the monitoring task at hand. The synthesis of an optimal observation mask may also be conducted to suggest an appropriate observation configuration guaranteeing the detection of the special events and to construct associated monitoring agents. The proposed framework, modeling methodology, and supporting techniques for discrete flow networks monitoring are presented and illustrated with an example.

  12. Phase-Space Detection of Cyber Events

    SciTech Connect

    Hernandez Jimenez, Jarilyn M; Ferber, Aaron E; Prowell, Stacy J; Hively, Lee M

    2015-01-01

    Energy Delivery Systems (EDS) are a network of processes that produce, transfer and distribute energy. EDS are increasingly dependent on networked computing assets, as are many Industrial Control Systems. Consequently, cyber-attacks pose a real and pertinent threat, as evidenced by Stuxnet, Shamoon and Dragonfly. Hence, there is a critical need for novel methods to detect, prevent, and mitigate effects of such attacks. To detect cyber-attacks in EDS, we developed a framework for gathering and analyzing timing data that involves establishing a baseline execution profile and then capturing the effect of perturbations in the state from injecting various malware. The data analysis was based on nonlinear dynamics and graph theory to improve detection of anomalous events in cyber applications. The goal was the extraction of changing dynamics or anomalous activity in the underlying computer system. Takens' theorem in nonlinear dynamics allows reconstruction of topologically invariant, time-delay-embedding states from the computer data in a sufficiently high-dimensional space. The resultant dynamical states were nodes, and the state-to-state transitions were links in a mathematical graph. Alternatively, sequential tabulation of executing instructions provides the nodes with corresponding instruction-to-instruction links. Graph theorems guarantee graph-invariant measures to quantify the dynamical changes in the running applications. Results showed a successful detection of cyber events.
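
    Two ingredients named above, time-delay embedding and a graph of state-to-state transitions, can be sketched as follows (this is not the ORNL implementation): the timing series is symbolized, embedded into delay vectors, and turned into a directed transition graph whose invariant measures (node and edge counts here) can be compared between a baseline run and a perturbed run. The series, embedding parameters and perturbation are assumptions.

    ```python
    # Time-delay embedding of a symbolized series, then a directed transition graph.
    import numpy as np
    import networkx as nx

    def transition_graph(series, dim=3, delay=2, n_bins=8):
        x = np.asarray(series, dtype=float)
        x = np.digitize(x, np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1]))  # symbolize
        n = len(x) - (dim - 1) * delay
        states = [tuple(x[i + k * delay] for k in range(dim)) for i in range(n)]  # embedding
        g = nx.DiGraph()
        g.add_edges_from(zip(states[:-1], states[1:]))
        return g

    rng = np.random.default_rng(7)
    baseline = np.sin(np.arange(2000) * 0.1) + 0.05 * rng.standard_normal(2000)
    perturbed = baseline + 0.5 * (rng.random(2000) < 0.02)      # injected timing perturbations

    g0, g1 = transition_graph(baseline), transition_graph(perturbed)
    print((g0.number_of_nodes(), g0.number_of_edges()),
          (g1.number_of_nodes(), g1.number_of_edges()))          # graph-invariant comparison
    ```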

  13. Model Based Analysis of Clonal Developments Allows for Early Detection of Monoclonal Conversion and Leukemia

    PubMed Central

    Thielecke, Lars; Glauche, Ingmar

    2016-01-01

    The availability of several methods to unambiguously mark individual cells has strongly fostered the understanding of clonal developments in hematopoiesis and other stem cell driven regenerative tissues. While cellular barcoding is the method of choice for experimental studies, patients that underwent gene therapy carry a unique insertional mark within the transplanted cells originating from the integration of the retroviral vector. Close monitoring of such patients allows accessing their clonal dynamics; however, the early detection of events that predict monoclonal conversion and potentially the onset of leukemia is beneficial for treatment. We developed a simple mathematical model of a self-stabilizing hematopoietic stem cell population to generate a wide range of possible clonal developments, reproducing typical, experimentally and clinically observed scenarios. We use the resulting model scenarios to suggest and test a set of statistical measures that should allow for an interpretation and classification of relevant clonal dynamics. Apart from the assessment of several established diversity indices, we suggest a measure that quantifies the extent to which the increase in the size of one clone is attributed to the total loss in the size of all other clones. By evaluating the change in relative clone sizes between consecutive measurements, the suggested measure, referred to as maximum relative clonal expansion (mRCE), proves to be highly sensitive in the detection of rapidly expanding cell clones prior to their dominant manifestation. This predictive potential places the mRCE as a suitable means for the early recognition of leukemogenesis, especially in gene therapy patients that are closely monitored. Our model based approach illustrates how simulation studies can actively support the design and evaluation of preclinical strategies for the analysis and risk evaluation of clonal developments. PMID:27764218
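
    A small function conveys the mRCE idea as described above: for consecutive measurements of relative clone sizes, relate the largest single-clone expansion to the total relative size lost by all shrinking clones (the authors' exact definition and normalization may differ). The toy clone counts are illustrative.

    ```python
    # Maximum relative clonal expansion (mRCE), one reading of the description above.
    import numpy as np

    def max_relative_clonal_expansion(prev_counts, curr_counts):
        p = np.asarray(prev_counts, float); c = np.asarray(curr_counts, float)
        p, c = p / p.sum(), c / c.sum()               # relative clone sizes
        delta = c - p
        total_loss = -delta[delta < 0].sum()
        return delta.max() / total_loss if total_loss > 0 else 0.0

    # Toy clonal time series: clone 0 expands at the expense of the others.
    t0 = [100, 90, 80, 70, 60]
    t1 = [180, 70, 60, 50, 40]
    print(max_relative_clonal_expansion(t0, t1))      # close to 1: one clone absorbs the loss
    ```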

  14. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C) which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2 degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two hour segment from October 1993. The output at the correct gridpoint was at least 33% larger than at adjacent grid points, and the output at the correct gridpoint at the correct origin time was more than 500% larger than the output at the same gridpoint immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
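
    The matrix formulation summarized above can be rendered as a toy numpy computation: the correlation matrix C = M @ D collects all unique predicted-phase-by-station correlations for one origin time, and each grid point's detector value sums a grid-point-specific path through C. The Master Image, data, sizes and path bookkeeping below are placeholder assumptions, not the WCEDS implementation.

    ```python
    # Toy rendering of the C = M @ D formulation with per-grid-point summation paths.
    import numpy as np

    rng = np.random.default_rng(8)
    n_phases, n_lags, n_stations, n_grid = 4, 600, 20, 50

    M = rng.standard_normal((n_phases, n_lags))           # Master Image: predicted waveforms
    D = rng.standard_normal((n_lags, n_stations))         # windowed station data at one origin time
    C = M @ D                                             # all unique phase-station correlations

    # Each grid point selects, per station, which predicted phase should align there.
    paths = rng.integers(0, n_phases, size=(n_grid, n_stations))
    detector = np.array([C[paths[g], np.arange(n_stations)].sum() for g in range(n_grid)])
    print(int(detector.argmax()), float(detector.max()))  # best grid point for this origin time
    ```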

  15. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  16. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  17. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies especially when concealed by various types of shielding complicating the transport physics significantly. The development of a model-based sequential Bayesian processor that captures both the underlying transport physics including scattering offers a physics-based approach to attack this challenging problem. It is shown that this processor can be used to develop an effective detection technique.

  18. Model-Based Detection of Radioactive Contraband for Harbor Defense Incorporating Compton Scattering Physics

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-03-02

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Photon emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding, complicating the transport physics significantly. This problem becomes especially important when ships are intercepted by U.S. Coast Guard harbor patrols searching for contraband. The development of a sequential model-based processor that captures both the underlying transport physics of gamma-ray emissions, including Compton scattering, and the measurement of photon energies offers a physics-based approach to attack this challenging problem. The inclusion of a basic radionuclide representation of absorbed/scattered photons at a given energy, along with interarrival times, is used to extract the physics information available from the noisy measurements of the portable radiation detection systems used to interdict contraband. It is shown that this physics representation can incorporate scattering physics, leading to an 'extended' model-based structure that can be used to develop an effective sequential detection technique. The resulting model-based processor is shown to perform quite well based on data obtained from a controlled experiment.

  19. Implementation of a model based fault detection and diagnosis technique for actuation faults of the SSME

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1991-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the Space Shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the Space Shuttle Main Engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  20. Model-based estimation of measures of association for time-to-event outcomes

    PubMed Central

    2014-01-01

    Background Hazard ratios are ubiquitously used in time-to-event applications to quantify adjusted covariate effects. Although hazard ratios are invaluable for hypothesis testing, other adjusted measures of association, both relative and absolute, should be provided to fully appreciate a study's results. The corrected group prognosis method is generally used to estimate the absolute risk reduction and the number needed to be treated for categorical covariates. Methods The goal of this paper is to present transformation models for time-to-event outcomes to obtain, directly from estimated coefficients, the measures of association widely used in biostatistics together with their confidence intervals. Pseudo-values are used for a practical estimation of transformation models. Results Using the regression model estimated through pseudo-values with suitable link functions, relative risks, risk differences and the number needed to treat are obtained together with their confidence intervals. One example based on literature data and one original application to the study of prognostic factors in primary retroperitoneal soft tissue sarcomas are presented. A simulation study is used to show some properties of the different estimation methods. Conclusions Clinically useful measures of treatment or exposure effect are widely available in epidemiology. When time-to-event outcomes are present, the analysis is generally performed by resorting to predicted values from the Cox regression model. It is now possible to resort to more general regression models, adopting suitable link functions and pseudo-values for estimation, to obtain alternative measures of effect directly from regression coefficients together with their confidence intervals. This may be especially useful when, in the presence of time-dependent covariate effects, it is not straightforward to specify the correct, if any, time-dependent functional form. The method can easily be implemented with standard software. PMID:25106903
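
    The pseudo-value construction mentioned in the Methods can be illustrated compactly: jackknife pseudo-observations of the Kaplan-Meier survival probability at a fixed horizon are computed per subject and could then be regressed with a chosen link function to obtain risk differences or relative risks. The hand-rolled Kaplan-Meier estimator and the toy censored data are assumptions.

    ```python
    # Jackknife pseudo-values of the Kaplan-Meier survival probability at time t0.
    import numpy as np

    def km_surv(times, events, t0):
        """Kaplan-Meier estimate of S(t0) for right-censored data (event=1, censored=0)."""
        order = np.argsort(times)
        times, events = np.asarray(times)[order], np.asarray(events)[order]
        at_risk, s = len(times), 1.0
        for t, d in zip(times, events):
            if t > t0:
                break
            if d:
                s *= 1.0 - 1.0 / at_risk
            at_risk -= 1
        return s

    def pseudo_values(times, events, t0):
        n = len(times)
        s_all = km_surv(times, events, t0)
        return np.array([n * s_all - (n - 1) * km_surv(np.delete(times, i),
                                                       np.delete(events, i), t0)
                         for i in range(n)])

    rng = np.random.default_rng(9)
    t = rng.exponential(5.0, 200); c = rng.exponential(8.0, 200)
    obs, evt = np.minimum(t, c), (t <= c).astype(int)
    pv = pseudo_values(obs, evt, t0=3.0)
    print(pv.mean(), km_surv(obs, evt, 3.0))   # pseudo-value mean tracks the KM estimate
    ```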

  1. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    NASA Technical Reports Server (NTRS)

    Walker, M.; Figueroa, F.

    2015-01-01

    The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. The associated loss of pressure control also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere, involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.

  2. Model-based fault detection of blade pitch system in floating wind turbines

    NASA Astrophysics Data System (ADS)

    Cho, S.; Gao, Z.; Moan, T.

    2016-09-01

    This paper presents a model-based scheme for fault detection of a blade pitch system in floating wind turbines. The blade pitch system is one of the most critical components due to its effect on the operational safety and the dynamics of wind turbines. Faults in this system should be detected at an early stage to prevent failures. To detect faults of blade pitch actuators and sensors, an appropriate observer should be designed to estimate the states of the system. Residuals are generated by a Kalman filter together with a threshold based on H∞ optimization, and linear matrix inequality (LMI) techniques are used for residual evaluation. The proposed method is demonstrated in a case study with bias and fixed-output faults in pitch sensors and stuck faults in pitch actuators. The simulation results show that the proposed method detects different realistic fault scenarios of wind turbines under stochastic external winds.
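
    A minimal residual-generation sketch in the spirit of the scheme described above: a Kalman filter tracks a first-order pitch model and its normalized innovation is compared against a fixed threshold to flag a pitch-sensor bias. The turbine model, noise levels and threshold are toy assumptions, and the H-infinity threshold design and LMI-based residual evaluation are not reproduced.

    ```python
    # Kalman-filter residual generation with a fixed threshold for a sensor-bias fault.
    import numpy as np

    a, c, q, r = 0.98, 1.0, 1e-4, 1e-2      # toy first-order pitch dynamics and noise levels
    rng = np.random.default_rng(10)

    x, xhat, P = 0.2, 0.2, 1.0
    residuals = []
    for k in range(400):
        x = a * x + np.sqrt(q) * rng.standard_normal()            # true pitch angle (rad)
        bias = 0.5 if k >= 250 else 0.0                           # injected sensor bias fault
        y = c * x + bias + np.sqrt(r) * rng.standard_normal()
        xhat, P = a * xhat, a * P * a + q                         # Kalman predict
        S = c * P * c + r
        e = y - c * xhat                                          # innovation (residual)
        residuals.append(e / np.sqrt(S))                          # normalized residual
        K = P * c / S
        xhat, P = xhat + K * e, (1 - K * c) * P                   # Kalman update

    alarm = np.abs(np.array(residuals)) > 4.0                      # fixed detection threshold
    print(int(np.argmax(alarm)) if alarm.any() else "no alarm")    # first alarm, near fault onset
    ```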

  3. A model-based approach for detection of objects in low resolution passive millimeter wave images

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Tang, Yuan-Liang; Devadiga, Sadashiva

    1993-01-01

    A model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions is described. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as the airport model and the approximate position and orientation of the aircraft are available. These data are exploited to guide the model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. Analytical expressions were also derived for the accuracy of the camera position estimate obtained by detecting the positions of known objects in the image.

  4. Relevance popularity: A term event model based feature selection scheme for text classification.

    PubMed

    Feng, Guozhong; An, Baiguo; Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

    Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency with which a given term appears in each document has not been fully investigated, even though it is a promising feature for producing accurate classifications. In this paper, we propose a new feature selection scheme based on a term-event multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing the inner parameters with their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments using two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods.

  5. Detecting seismic events using Benford's Law

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Gallart, Josep; Ruiz, Mario

    2015-04-01

    Benford's Law (BL) states that the distribution of first significant digits is not uniform but follows a logarithmic frequency distribution. Although a remarkably wide range of natural and socioeconomic data sets, from stock market values to quantum phase transitions, fits this peculiar law, conformity to it has seen few scientific applications, being used mainly as a test to pinpoint anomalous or fraudulent data. We developed a procedure to detect the arrival of seismic waves based on the degree of conformity of the amplitude values in the raw seismic trace to the BL. The signal is divided into time windows of appropriate length and the fit of the first-digit distribution to the BL is checked in each time window using a conformity estimator. We document that both teleseismic and local earthquakes can be clearly identified with this procedure and we compare its performance with the classical STA/LTA approach. Moreover, we show that the conformity of the seismic record to the BL does not depend on the amplitude of the incoming series, as the occurrence of events with very different amplitudes results in quite similar degrees of BL fitting. On the other hand, we show that natural or man-made quasi-monochromatic seismic signals, surface wave trains or engine-generated vibrations can be identified through their very low BL estimator values, when appropriate interval lengths are used. Therefore, we conclude that the degree of conformity of a seismic signal with the BL is primarily dependent on the frequency content of that signal.
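
    A minimal sketch of this kind of test (the window length, conformity estimator, and flagging rule are assumptions, not the authors' exact choices) compares the first-significant-digit histogram of each window's amplitudes with Benford's proportions log10(1 + 1/d) and flags windows whose misfit departs strongly from the running baseline:

      import numpy as np

      BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))     # P(d) = log10(1 + 1/d), d = 1..9

      def first_digits(x):
          x = np.abs(np.asarray(x, dtype=float))
          x = x[x > 0]
          return (x / 10.0 ** np.floor(np.log10(x))).astype(int)   # leading digit, 1..9

      def benford_misfit(window):
          d = first_digits(window)
          if d.size == 0:
              return np.inf
          freq = np.bincount(d, minlength=10)[1:10] / d.size
          return np.abs(freq - BENFORD).sum()                # small = good BL conformity

      def flag_windows(trace, win=500, step=100, z=3.0):
          """Indices of windows whose BL misfit departs from the running baseline."""
          m = np.array([benford_misfit(trace[i:i + win])
                        for i in range(0, len(trace) - win, step)])
          med = np.median(m)
          mad = np.median(np.abs(m - med)) + 1e-12
          return np.where(np.abs(m - med) > z * mad)[0] * step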

  6. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  7. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  8. Model-based fault detection and identification with online aerodynamic model structure selection

    NASA Astrophysics Data System (ADS)

    Lombaerts, T.

    2013-12-01

    This publication describes a recursive algorithm for the approximation of time-varying nonlinear aerodynamic models by means of a joint adaptive selection of the model structure and parameter estimation. This procedure is called adaptive recursive orthogonal least squares (AROLS) and is an extension and modification of the previously developed ROLS procedure. This algorithm is particularly useful for model-based fault detection and identification (FDI) of aerospace systems. After a failure, a completely new aerodynamic model can be identified recursively, with respect to both structure and parameter values. The performance of the identification algorithm is demonstrated on a simulation data set.

  9. A Unified Framework for Event Summarization and Rare Event Detection from Multiple Views.

    PubMed

    Kwon, Junseok; Lee, Kyoung Mu

    2015-09-01

    A novel approach for event summarization and rare event detection is proposed. Unlike conventional methods that deal with event summarization and rare event detection independently, our method solves them in a single framework by transforming them into a graph editing problem. In our approach, a video is represented by a graph, each node of which indicates an event obtained by segmenting the video spatially and temporally. The edges between nodes describe the relationships between events. Based on the degree of relation, edges have different weights. After learning the graph structure, our method finds subgraphs that represent event summarization and rare events in the video by editing the graph, that is, merging its subgraphs or pruning its edges. The graph is edited to minimize a predefined energy model with the Markov Chain Monte Carlo (MCMC) method. The energy model consists of several parameters that represent the causality, frequency, and significance of events. We design a specific energy model that uses these parameters to satisfy each objective of event summarization and rare event detection. The proposed method is extended to obtain event summarization and rare event detection results across multiple videos captured from multiple views. For this purpose, the proposed method independently learns and edits each graph of the individual videos for event summarization or rare event detection. Then, the method matches the extracted graphs to each other and constructs a single composite graph that represents event summarization or rare events from multiple views. Experimental results show that the proposed approach accurately summarizes multiple videos in a fully unsupervised manner. Moreover, the experiments demonstrate that the approach is advantageous in detecting rare transitions of events.

  10. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and fish-eye effects. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.

  11. Robust event detection scheme for complex scenes in video surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Erkang; Xu, Yi; Yang, Xiaokang; Zhang, Wenjun

    2011-07-01

    Event detection for video surveillance is a difficult task due to many challenges: cluttered background, illumination variations, scale variations, occlusions among people, etc. We propose an effective and efficient event detection scheme for such complex situations. Moving shadows due to illumination are tackled with a segmentation method incorporating shadow detection, and scale variations are handled using the CamShift-guided particle filter tracking algorithm. For event modeling, hidden Markov models are employed. The proposed scheme also reduces the overall computational cost by combining two human detection algorithms and using tracking information to aid human detection. Experimental results on the TRECVid event detection evaluation demonstrate the efficacy of the proposed scheme. It is robust, especially to moving shadows and scale variations. Employing the scheme, we achieved the best run results for two events in the TRECVid benchmarking evaluation.

  12. Comparison of chiller models for use in model-based fault detection

    SciTech Connect

    Sreedharan, Priya; Haves, Philip

    2001-06-07

    Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Factors that are considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression chillers. Three different models were studied: the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools™, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older, centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.

  13. Failure analysis for model-based organ segmentation using outlier detection

    NASA Astrophysics Data System (ADS)

    Saalbach, Axel; Wächter Stehle, Irina; Lorenz, Cristian; Weese, Jürgen

    2014-03-01

    During the last years, Model-Based Segmentation (MBS) techniques have been used in a broad range of medical applications. In clinical practice, such techniques are increasingly employed for diagnostic purposes and treatment decisions. However, it is not guaranteed that a segmentation algorithm will converge towards the desired solution. In specific situations, such as in the presence of rare anatomical variants (which cannot be represented by the model) or for images of extremely low quality, a meaningful segmentation might not be feasible. At the same time, an automated estimate of the segmentation reliability is commonly not available. In this paper we present an approach for the identification of segmentation failures using concepts from the field of outlier detection. The approach is validated on a comprehensive set of Computed Tomography Angiography (CTA) images by means of Receiver Operating Characteristic (ROC) analysis. Encouraging results in terms of an Area Under the ROC Curve (AUC) of up to 0.965 were achieved.

  14. Detecting plastic events in emulsions simulations

    NASA Astrophysics Data System (ADS)

    Lulli, Matteo; Bernaschi, Massimo; Sbragaglia, Mauro

    2016-11-01

    Emulsions are complex systems formed by a number of non-coalescing droplets dispersed in a solvent, leading to non-trivial effects in the overall flowing dynamics. Such systems possess a yield stress below which an elastic response to an external forcing occurs, while above the yield stress the system flows as a non-Newtonian fluid, i.e. the stress is not proportional to the shear. In the solid-like regime the network of droplet interfaces stores the energy coming from the work exerted by an external forcing, which can be used to move the droplets in a non-reversible way, i.e. causing plastic events. The Kinetic-Elasto-Plastic (KEP) theory is an effective theory describing some features of the flowing regime, relating the rate of plastic events to a scalar field called the fluidity, f = γ̇/σ, i.e. the inverse of an effective viscosity. Boundary conditions have a non-trivial role not captured by the KEP description. In this contribution we will compare numerical results against experiments concerning the Poiseuille flow of emulsions in microchannels with complex boundary geometries. Using an efficient computational tool we can show non-trivial results on plastic events for different realizations of the rough boundaries. The research leading to these results has received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013)/ERC Grant Agreement no. [279004].

  15. 3D model-based detection and tracking for space autonomous and uncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Shang, Yang; Zhang, Yueqiang; Liu, Haibo

    2015-10-01

    In order to enable full navigation using a vision sensor, a 3D edge-model-based detection and tracking technique was developed. First, we proposed a target detection strategy over a sequence of several images from the 3D model to initialize the tracking. The overall purpose of this approach is to robustly match each image with the model views of the target. Thus we designed a line segment detection and matching method based on multi-scale space technology. Experiments on real images showed that our method is highly robust under various image changes. Second, we proposed a method based on a 3D particle filter (PF) coupled with M-estimation to track and estimate the pose of the target efficiently. In the proposed approach, a similarity observation model was designed according to a new distance function of line segments. Then, based on the tracking results of the PF, the pose was optimized using M-estimation. Experiments indicated that the proposed method can effectively track and accurately estimate the pose of a freely moving target in an unconstrained environment.

  16. A sparse model based detection of copy number variations from exome sequencing data

    PubMed Central

    Duan, Junbo; Wan, Mingxi; Deng, Hong-Wen; Wang, Yu-Ping

    2016-01-01

    Goal Whole-exome sequencing provides a more cost-effective way than whole-genome sequencing for detecting genetic variants such as copy number variations (CNVs). Although a number of approaches have been proposed to detect CNVs from whole-genome sequencing, a direct adoption of these approaches to whole-exome sequencing will often fail because exons are separately located along a genome. Therefore, an appropriate method is needed to target the specific features of exome sequencing data. Methods In this paper a novel sparse model based method is proposed to discover CNVs from multiple exome sequencing data sets. First, exome sequencing data are represented with a penalized matrix approximation, and technical variability and random sequencing errors are assumed to follow a generalized Gaussian distribution. Second, an iteratively re-weighted least squares algorithm is used to estimate the solution. Results The method is tested and validated on both synthetic and real data, and compared with other approaches including CoNIFER, XHMM and cn.MOPS. The tests demonstrate that the proposed method outperforms the other approaches. Conclusion The proposed sparse model can detect CNVs from exome sequencing data with high power and precision. Significance The sparse model can target the specific features of exome sequencing data. The software code is freely available at http://www.tulane.edu/wyp/software/ExonCNV.m PMID:26258935

  17. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurring regions, which are called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks was able to detect and classify subsurface events. PMID:23202191
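
    A hedged sketch of a window-based minimum-distance classifier in the spirit described above (class names, features, and the toy data are invented; with equal, spherical class covariances this nearest-centroid rule coincides with the Bayes decision rule):

      import numpy as np

      class WindowMinDistanceClassifier:
          def fit(self, X, y):
              """X: (n_windows, n_features) training features; y: geo-event labels."""
              self.classes_ = np.unique(y)
              self.centroids_ = np.vstack([X[y == c].mean(axis=0) for c in self.classes_])
              return self

          def predict(self, X):
              d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
              return self.classes_[np.argmin(d, axis=1)]

      # toy usage: signal-strength-change features for three hypothetical geo-events
      rng = np.random.default_rng(0)
      X_train = np.vstack([rng.normal(m, 0.5, (30, 3)) for m in (-4.0, -1.0, 2.0)])
      y_train = np.repeat(["water_intrusion", "density_change", "relative_motion"], 30)
      clf = WindowMinDistanceClassifier().fit(X_train, y_train)
      print(clf.predict(rng.normal(-4.0, 0.5, (2, 3))))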

  18. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70 s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of the events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions, such as the 2000 Miyakejima sequence near Japan, and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in three groups that are well separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
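
    The core of such a detector is the envelope stack along predicted Rayleigh-wave travel times. The sketch below (with an assumed group velocity, filter order, and a crude per-station gain control standing in for the published processing) illustrates the idea for one candidate source location:

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def rayleigh_stack(traces, distances_km, dt, origin_times, group_vel_kms=3.8):
          """traces: (n_sta, n_samp) vertical records; distances_km: station-source distances."""
          b, a = butter(2, [1.0 / 70.0, 1.0 / 35.0], btype="band", fs=1.0 / dt)
          env = np.abs(hilbert(filtfilt(b, a, traces, axis=1), axis=1))
          env /= env.max(axis=1, keepdims=True)          # crude automatic gain control
          n_samp = traces.shape[1]
          stack = np.zeros(len(origin_times))
          for j, t0 in enumerate(origin_times):
              s = 0.0
              for env_i, dist in zip(env, distances_km):
                  idx = int(round((t0 + dist / group_vel_kms) / dt))
                  if 0 <= idx < n_samp:
                      s += env_i[idx]
              stack[j] = s / len(distances_km)
          return stack                                    # peaks suggest candidate origin times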

  19. Saliency-based abnormal event detection in crowded scenes

    NASA Astrophysics Data System (ADS)

    Shi, Yanjiao; Liu, Yunxiang; Zhang, Qing; Yi, Yugen; Li, Wenju

    2016-11-01

    Abnormal event detection plays a critical role in intelligent video surveillance, and detection in crowded scenes is a challenging but more practical task. We present an abnormal event detection method for crowded video. Region-wise modeling is proposed to address the inconsistent detected motion of the same object due to different depths of field. Compared to traditional block-wise modeling, the region-wise method not only greatly reduces the number of models to be built but also enriches the samples for training the normal-event model. In order to reduce the computational burden and make region-based anomaly detection feasible, a saliency detection technique is adopted in this paper. By identifying the salient parts of the image sequences, the irrelevant blocks are ignored, which removes disturbances and further improves detection performance. Experiments on the benchmark dataset and comparisons with state-of-the-art algorithms validate the advantages of the proposed method.

  20. Event Detection using Twitter: A Spatio-Temporal Approach

    PubMed Central

    Cheng, Tao; Wicks, Thomas

    2014-01-01

    Background Every day, around 400 million tweets are sent worldwide, making Twitter a rich source for detecting, monitoring and analysing news stories and special (disaster) events. Existing research within this field follows key words attributed to an event, monitoring temporal changes in word usage. However, this method requires prior knowledge of the event in order to know which words to follow, and does not guarantee that the words chosen will be the most appropriate to monitor. Methods This paper suggests an alternative methodology for event detection using space-time scan statistics (STSS). This technique looks for clusters within the dataset across both space and time, regardless of tweet content. It is expected that clusters of tweets will emerge during spatio-temporally relevant events, as people will tweet more than expected in order to describe the event and spread information. The special event used as a case study is the 2013 London helicopter crash. Results and Conclusion A spatio-temporally significant cluster is found relating to the London helicopter crash. Although the cluster only remains significant for a relatively short time, it is rich in information, such as important key words and photographs. The method also detects other special events such as football matches, as well as train and flight delays, from Twitter data. These findings demonstrate that STSS is an effective approach to analysing Twitter data for event detection. PMID:24893168

  1. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by four-part prompts consisting of a multiple-choice claim, an open-ended explanation, a Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of the explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of the uncertainty. We found that (1) only 18% of the students' uncertainty rationales involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationales reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of the task, such as the use of noisy data or the framing of

  2. Model-based approach to the detection and classification of mines in sidescan sonar.

    PubMed

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-10

    This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis.

  3. CHIRP-Like Signals: Estimation, Detection and Processing A Sequential Model-Based Approach

    SciTech Connect

    Candy, J. V.

    2016-08-04

    Chirp signals have evolved primarily from radar/sonar signal processing applications, specifically attempting to estimate the location of a target in a surveillance/tracking volume. The chirp, which is essentially a sinusoidal signal whose phase changes instantaneously at each time sample, has an interesting property in that its correlation approximates an impulse function. It is well known that a matched-filter detector in radar/sonar estimates the target range by cross-correlating a replica of the transmitted chirp with the measurement data reflected from the target back to the radar/sonar receiver, yielding a maximum peak corresponding to the echo time and therefore enabling the desired range estimate. In this application, we perform the same operation as a radar or sonar system, that is, we transmit a “chirp-like pulse” into the target medium and attempt to first detect its presence and second estimate its location or range. Our problem is complicated by the presence of disturbance signals from surrounding broadcast stations as well as extraneous sources of interference in our frequency bands and, of course, the ever-present random noise from instrumentation. First, we discuss the chirp signal itself and illustrate its inherent properties, and then develop a model-based processing scheme enabling both the detection and estimation of the signal from noisy measurement data.
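
    The matched-filter operation described above is straightforward to demonstrate. The sketch below (with arbitrary sample rate, bandwidth, echo strength, and a sonar-like propagation speed, none of which come from this record) correlates a chirp replica with a noisy, delayed echo and converts the correlation peak into a range estimate:

      import numpy as np
      from scipy.signal import chirp, correlate

      fs = 10_000.0                       # sample rate [Hz]
      t = np.arange(0, 0.1, 1.0 / fs)     # 100 ms pulse
      replica = chirp(t, f0=500.0, t1=t[-1], f1=2500.0, method="linear")

      true_delay = 0.035                  # 35 ms two-way travel time
      n_delay = int(true_delay * fs)
      rx = np.zeros(2 * replica.size)
      rx[n_delay:n_delay + replica.size] += 0.3 * replica                  # weak echo
      rx += 0.5 * np.random.default_rng(0).standard_normal(rx.size)        # noise/interference

      xc = correlate(rx, replica, mode="valid")            # matched filter output
      est_delay = np.argmax(np.abs(xc)) / fs
      c = 1500.0                          # assumed propagation speed [m/s]
      print(f"estimated delay {est_delay * 1e3:.1f} ms, range {c * est_delay / 2:.1f} m")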

  4. Efficient method for events detection in phonocardiographic signals

    NASA Astrophysics Data System (ADS)

    Martinez-Alajarin, Juan; Ruiz-Merino, Ramon

    2005-06-01

    The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer a patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist physicians at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this method usually does not allow the detection of the main sounds when additional sounds and murmurs exist, or it may join several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of the murmurs is detected, which aids in differentiating diseases that can occur at the same temporal location. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
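
    A rough sketch of envelope-based event picking in a phonocardiogram, assuming an already band-limited signal (the smoothing window, minimum peak spacing, and prominence are illustrative choices, not the authors' parameters):

      import numpy as np
      from scipy.signal import hilbert, find_peaks

      def pcg_events(pcg, fs, smooth_ms=20, min_separation_ms=80):
          env = np.abs(hilbert(pcg))                            # amplitude envelope
          w = max(1, int(fs * smooth_ms / 1000))
          env = np.convolve(env, np.ones(w) / w, mode="same")   # moving-average smoothing
          peaks, _ = find_peaks(env,
                                distance=int(fs * min_separation_ms / 1000),
                                prominence=0.1 * env.max())
          # per-event descriptors that could feed later identification (S1/S2, murmurs, ...)
          return [{"sample": int(p), "time_s": p / fs, "amplitude": float(env[p])}
                  for p in peaks]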

  5. Video Event Detection Framework on Large-Scale Video Data

    ERIC Educational Resources Information Center

    Park, Dong-Jun

    2011-01-01

    Detection of events and actions in video entails substantial processing of very large, even open-ended, video streams. Video data present a unique challenge for the information retrieval community because properly representing video events is challenging. We propose a novel approach to analyze temporal aspects of video data. We consider video data…

  6. [Detecting gene-gene/environment interactions by model-based multifactor dimensionality reduction].

    PubMed

    Fan, Wei; Shen, Chao; Guo, Zhirong

    2015-11-01

    This paper introduces a method called model-based multifactor dimensionality reduction (MB-MDR), first proposed by Calle et al., which can be applied for detecting gene-gene or gene-environment interactions in genetic studies. The basic principle and characteristics of MB-MDR, as well as its operation in the R program, are briefly summarized. In addition, the detailed procedure of MB-MDR is illustrated using an example. Compared with classical MDR, MB-MDR follows a similar principle, merging multi-locus genotypes into a one-dimensional construct, and can be used in studies with small sample sizes. However, there are some differences between MB-MDR and classical MDR. First, it has higher statistical power than MDR and other MDR-based methods in the presence of different types of noise, due to the different way the genotype cells are merged. Second, compared with MDR, it can handle both binary and quantitative traits and can adjust for the marginal effects of factors and confounders. MB-MDR could be a useful method in the analysis of gene-gene/environment interactions.

  7. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  8. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-12-15

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.

  9. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    PubMed Central

    Liu, Changyu; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, a secure access control issue and a large-scale robust representation issue arise when traditional event detection algorithms are integrated into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches. PMID:25147840

  10. Abnormal events detection in crowded scenes by trajectory cluster

    NASA Astrophysics Data System (ADS)

    Zhou, Shifu; Zhang, Zhijiang; Zeng, Dan; Shen, Wei

    2015-02-01

    Abnormal event detection in crowded scenes has been a challenge due to the volatility of the definitions of both normality and abnormality, the small number of pixels on the target, appearance ambiguity resulting from dense packing, and severe inter-object occlusions. A novel framework was proposed for the detection of unusual events in crowded scenes using trajectories produced by moving pedestrians, based on the intuition that the motion patterns of usual behaviors are similar to those of group activity, whereas those of unusual behaviors are not. First, spectral clustering is used to group trajectories with similar spatial patterns. Different trajectory clusters represent different activities. Then, unusual trajectories can be detected using these patterns. Furthermore, the behavior of a moving pedestrian can be characterized by comparing its direction with these patterns, such as moving in the opposite direction of the group or traversing the group. Experimental results indicated that the proposed algorithm can reliably locate abnormal events in crowded scenes.
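
    As a hedged sketch of the trajectory-clustering idea (the Hausdorff-distance affinity, cluster count, and outlier rule are assumptions; the paper's own similarity measure may differ), spectral clustering can group trajectories and members far from their own cluster can then be flagged:

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff
      from sklearn.cluster import SpectralClustering

      def hausdorff(a, b):
          return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

      def cluster_trajectories(trajs, n_clusters=3, sigma=5.0):
          """trajs: list of (n_points_i, 2) arrays of image coordinates."""
          n = len(trajs)
          D = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  D[i, j] = D[j, i] = hausdorff(trajs[i], trajs[j])
          affinity = np.exp(-(D ** 2) / (2 * sigma ** 2))
          labels = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                                      random_state=0).fit_predict(affinity)
          return labels, D

      def unusual(labels, D, quantile=0.95):
          """Flag trajectories unusually far from the other members of their cluster."""
          scores = []
          for i, lab in enumerate(labels):
              same = [j for j in np.where(labels == lab)[0] if j != i]
              scores.append(D[i, same].mean() if same else 0.0)
          thr = np.quantile(scores, quantile)
          return [i for i, s in enumerate(scores) if s > thr]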

  11. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance is playing an increasingly important role in people's social life. Real-time alerting of threatening events and searching for interesting content in large-scale stored video footage require a human operator to pay full attention to the monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.

  12. Data mining for signal detection of adverse event safety data.

    PubMed

    Chen, Hung-Chia; Tsong, Yi; Chen, James J

    2013-01-01

    The Adverse Event Reporting System (AERS) is the primary database designed to support the Food and Drug Administration (FDA) postmarketing safety surveillance program for all approved drugs and therapeutic biologic products. Most current disproportionality analyses focus on the detection of potential adverse events (AEs) involving a single drug and a single AE only. In this paper, we present a data mining biclustering technique based on the singular value decomposition to extract local regions of association for a safety study. The analysis produces a collection of biclusters, each representing an association between a set of drugs and a corresponding set of adverse events. The significance of each bicluster can be tested using disproportionality analysis, and individual drug-event combinations can be tested further. A safety data set consisting of 193 drugs with 8453 adverse events is analyzed as an illustration.
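
    Purely as an illustrative sketch of SVD-based biclustering (not the authors' exact algorithm; the log transform, number of components, loading cutoff, and the proportional reporting ratio screen are all assumptions), candidate drug/adverse-event blocks can be read off the leading singular vectors of the report-count matrix:

      import numpy as np

      def svd_biclusters(counts, n_components=3, load_quantile=0.9):
          """counts: (n_drugs, n_events) matrix of report counts."""
          X = np.log1p(counts.astype(float))
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          biclusters = []
          for k in range(n_components):
              u, v = np.abs(U[:, k]), np.abs(Vt[k, :])
              drugs = np.where(u >= np.quantile(u, load_quantile))[0]
              events = np.where(v >= np.quantile(v, load_quantile))[0]
              biclusters.append((drugs, events))
          return biclusters

      def prr(counts, drugs, events):
          """Proportional reporting ratio of the (drugs, events) block versus the rest."""
          a = counts[np.ix_(drugs, events)].sum()
          b = counts[drugs, :].sum() - a
          c = counts[:, events].sum() - a
          d = counts.sum() - a - b - c
          return (a / (a + b)) / ((c / (c + d)) + 1e-12)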

  13. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    PubMed

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2016-03-28

    Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, using semantic representation has been credited for its promising performance and its desirable ability to support human-understandable reasoning. To generate a semantic representation, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences among these archives, the resulting representations presumably have different predictive capabilities for a given event. Nevertheless, not much work is available on assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analysis method. At the source level, our method learns weights for the semantic representations obtained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the overall concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly relevant in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  14. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.

    2015-12-22

    A method of detecting a cooling-loss event early is provided. The method includes defining a relative humidity limit and change threshold for a given space, measuring relative humidity in the given space, determining, with a processing unit, whether the measured relative humidity is within the defined relative humidity limit, generating a warning in an event the measured relative humidity is outside the defined relative humidity limit and determining whether a change in the measured relative humidity is less than the defined change threshold for the given space and generating an alarm in an event the change is greater than the defined change threshold.
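
    A minimal sketch of the logic described in this record, with made-up limits and thresholds (the patent does not prescribe numeric values): warn when the relative humidity leaves the allowed band for the space, and alarm when its change between readings exceeds the configured change threshold.

      def check_cooling(rh_now, rh_prev, rh_limits=(20.0, 60.0), change_threshold=5.0):
          """Return (warning, alarm) for one relative-humidity reading."""
          warning = not (rh_limits[0] <= rh_now <= rh_limits[1])
          alarm = abs(rh_now - rh_prev) > change_threshold
          return warning, alarm

      # e.g. a cooling loss often shows up as a rapid humidity swing in the monitored space
      print(check_cooling(rh_now=68.0, rh_prev=55.0))   # -> (True, True)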

  15. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.

    2015-06-30

    A method of detecting a cooling-loss event early is provided. The method includes defining a relative humidity limit and change threshold for a given space, measuring relative humidity in the given space, determining, with a processing unit, whether the measured relative humidity is within the defined relative humidity limit, generating a warning in an event the measured relative humidity is outside the defined relative humidity limit and determining whether a change in the measured relative humidity is less than the defined change threshold for the given space and generating an alarm in an event the change is greater than the defined change threshold.

  16. Human Rights Event Detection from Heterogeneous Social Media Graphs.

    PubMed

    Chen, Feng; Neill, Daniel B

    2015-03-01

    Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.

  17. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
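
    A hedged sketch of that pipeline (bin count, Farneback parameters, and SVM settings are assumptions; OpenCV and scikit-learn are assumed to be available, and the kernel PCA stage is omitted): dense optical flow per frame pair, a magnitude-weighted histogram of flow orientations as the frame descriptor, and a one-class SVM trained on normal frames only.

      import cv2
      import numpy as np
      from sklearn.svm import OneClassSVM

      def flow_orientation_histogram(prev_gray, curr_gray, bins=16):
          flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                              0.5, 3, 15, 3, 5, 1.2, 0)
          mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
          hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
          return hist / (hist.sum() + 1e-9)

      def train_normal_model(normal_frames):
          """normal_frames: grayscale frames from a period of normal behavior."""
          feats = [flow_orientation_histogram(f0, f1)
                   for f0, f1 in zip(normal_frames[:-1], normal_frames[1:])]
          return OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(np.array(feats))

      def is_abnormal(model, prev_frame, curr_frame):
          h = flow_orientation_histogram(prev_frame, curr_frame)
          return model.predict(h.reshape(1, -1))[0] == -1   # -1 = outlier under the OCSVM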

  18. Context-aware event detection smartphone application for first responders

    NASA Astrophysics Data System (ADS)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms like Twitter and Facebook has provided seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information about events happening in their immediate vicinity in real time. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). Further, when appropriately employed, this real-time data can support the detection of localized events like fires, accidents, or shootings as they unfold, and can pinpoint individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that provide an augmented-reality view of the detected events in a given (localized) geographical location and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that addresses data challenges such as identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.

  19. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action analysis (e.g. smoking, eating, and phoning) is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact on raising the level of safety in urban areas, public parks, airplanes, hospitals, schools and elsewhere. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events of uncertain actions with various cigarette sizes, colors, and shapes.

  20. Aseismic events in Southern California: Detection with InSAR

    NASA Astrophysics Data System (ADS)

    Lohman, R. B.; McGuire, J. J.; Lundgren, P.

    2007-05-01

    Aseismic slow slip events are usually studied using data types that have a dense temporal sampling rate, such as continuous GPS or tremor analysis using seismic data. However, even the sparser temporal coverage of InSAR data can further our understanding of these events in three significant ways. First, in areas where aseismic transients have been detected on geodetic arrays, InSAR may be able to provide a spatially denser image of the extent and magnitude of deformation. Second, InSAR observations are complementary to GPS because of their differing sensitivities to horizontal and vertical motions. Third, in areas with no ground-based geodetic instrumentation, InSAR can be used in survey mode to detect deformation signals that are not associated with any observed seismicity. The temporal constraints on such signals may not be tight enough to allow for dynamic models of how aseismic transients occur, but InSAR-only detections can improve our understanding of the spatial extent of these types of events and can also identify key areas for future instrumentation and observation. Here, I summarize some of the contributions of InSAR observations of slow slip events, including data spanning the 2005 Obsidian Buttes swarm in the Salton Trough, CA, and InSAR time-series results for the Salton Trough using both traditional interferometry and the persistent scatterer method.

  1. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogiono, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern. Smart sensor networks are one of the promising technologies attracting a lot of attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to check whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared to simpler methods.

  2. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  3. Implementation of a model based fault detection and diagnosis for actuation faults of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1992-01-01

    In a previous study (Guo, Merrill and Duyar, 1990), a conceptual development of a fault detection and diagnosis system for actuation faults of the space shuttle main engine was reported. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for real-time actuation fault diagnosis of the space shuttle main engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  4. On Identifiability of Bias-Type Actuator-Sensor Faults in Multiple-Model-Based Fault Detection and Identification

    NASA Technical Reports Server (NTRS)

    Joshi, Suresh M.

    2012-01-01

    This paper explores a class of multiple-model-based fault detection and identification (FDI) methods for bias-type faults in actuators and sensors. These methods employ banks of Kalman-Bucy filters to detect the faults, determine the fault pattern, and estimate the fault values, wherein each Kalman-Bucy filter is tuned to a different failure pattern. Necessary and sufficient conditions are presented for identifiability of actuator faults, sensor faults, and simultaneous actuator and sensor faults. It is shown that FDI of simultaneous actuator and sensor faults is not possible using these methods when all sensors have biases.

  5. Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy

    DTIC Science & Technology

    2014-03-27

    This research reviews literature concerning its focus areas, which include SCADA vulnerabilities, information theory, and intrusion detection.

  6. PMU Data Event Detection: A User Guide for Power Engineers

    SciTech Connect

    Allen, A.; Singh, M.; Muljadi, E.; Santoso, S.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
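    The guide's specific algorithms are not reproduced in this record, but the basic idea of flagging events in PMU streams can be illustrated with a simple rate-of-change-of-frequency (ROCOF) threshold test. The sketch below is written in Python rather than MATLAB, and the sampling rate, window length, and threshold multiplier are illustrative assumptions, not parameters of the NREL/UT package.

```python
# Minimal sketch of threshold-based event flagging in PMU frequency data.
# The thresholds and window length are illustrative assumptions, not values
# from the MATLAB package described above.
import numpy as np

def detect_frequency_events(freq_hz, fs=30.0, window_s=10.0, k=5.0):
    """Flag samples where the rate of change of frequency (ROCOF) is
    unusually large compared with its recent variability."""
    rocof = np.gradient(freq_hz) * fs              # Hz/s
    n = int(window_s * fs)
    # rolling standard deviation of ROCOF (simple, non-causal for brevity)
    pad = np.concatenate([np.full(n - 1, rocof[0]), rocof])
    windows = np.lib.stride_tricks.sliding_window_view(pad, n)
    rolling_std = windows.std(axis=1)
    return np.abs(rocof) > k * (rolling_std + 1e-9)

# Example: a 60 Hz signal with a small step change injected at sample 500
rng = np.random.default_rng(0)
freq = 60.0 + 0.001 * rng.standard_normal(1000)
freq[500:] -= 0.05
events = detect_frequency_events(freq)
print("flagged samples:", np.flatnonzero(events)[:5])
```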

  7. Detecting modification of biomedical events using a deep parsing approach

    PubMed Central

    2012-01-01

    Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
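    As a rough illustration of the shallow-feature baseline described above (a maximum-entropy classifier over a small bag-of-words window around the event trigger), the sketch below uses logistic regression, which is equivalent to a maximum-entropy model. The deep-parser (ERG/RASP/MRS) features are omitted, and the toy sentences, labels, and window width are invented for illustration.

```python
# Sketch of the shallow bag-of-words baseline only: features from a small
# window around each event trigger, fed to a maximum-entropy classifier
# (logistic regression). Toy data; not the BioNLP 2009 corpus.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def window_features(tokens, trigger_idx, width=3):
    lo, hi = max(0, trigger_idx - width), min(len(tokens), trigger_idx + width + 1)
    feats = {f"bow={t.lower()}": 1 for t in tokens[lo:hi]}
    feats["trigger=" + tokens[trigger_idx].lower()] = 1
    return feats

# toy training examples: (tokens, index of trigger word, label)
examples = [
    ("analysis of IkappaBalpha phosphorylation".split(), 3, "speculated"),
    ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negated"),
    ("IkappaBalpha phosphorylation was observed".split(), 1, "none"),
]
X = [window_features(t, i) for t, i, _ in examples]
y = [label for _, _, label in examples]

clf = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(X, y)
print(clf.predict([window_features("no analysis of STAT3 phosphorylation".split(), 4)]))
```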

  8. Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully-automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses as a basis regional array processing with fixed, carefully calibrated, site-specific parameters in conjunction with improved automatic phase onset time estimation. We have in parallel developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered using the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  9. Gait event detection during stair walking using a rate gyroscope.

    PubMed

    Formento, Paola Catalfamo; Acevedo, Ruben; Ghoussayni, Salim; Ewins, David

    2014-03-19

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. These applications often require detection of the initial contact (IC) of the foot with the floor and/or final contact or foot off (FO) from the floor during outdoor walking. Previous investigations have reported the use of a single gyroscope placed on the shank for detection of IC and FO on level ground and incline walking. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects ascending and descending a set of stairs. Performance was compared with a reference pressure measurement system. The absolute mean difference between the gyroscope and the reference was less than 45 ms for IC and less than 135 ms for FO for both activities. Detection success was over 93%. These results provide preliminary evidence supporting the use of a gyroscope for gait event detection when walking up and down stairs.
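    The exact detection rules evaluated in the study are not given in this record; a common simplification with a single shank gyroscope is to locate the mid-swing peak of the sagittal angular velocity and take the adjacent negative peaks as approximate IC and FO. The sketch below follows that simplified heuristic, with illustrative thresholds and a synthetic signal.

```python
# Illustrative sketch of shank-gyroscope gait event detection: find the
# mid-swing angular-velocity peaks and take the adjacent negative peaks as
# approximate IC and FO. These rules are a common simplification, not the
# exact algorithm evaluated in the study above.
import numpy as np
from scipy.signal import find_peaks

def detect_gait_events(gyro_sagittal, fs=100.0):
    w = np.asarray(gyro_sagittal, dtype=float)
    swing_peaks, _ = find_peaks(w, height=1.0, distance=int(0.5 * fs))
    troughs, _ = find_peaks(-w, distance=int(0.1 * fs))
    ic, fo = [], []
    for p in swing_peaks:
        after = troughs[troughs > p]
        before = troughs[troughs < p]
        if after.size:            # first trough after mid-swing ~ initial contact
            ic.append(int(after[0]))
        if before.size:           # last trough before mid-swing ~ foot off
            fo.append(int(before[-1]))
    return ic, fo

# synthetic angular velocity with two "strides"
t = np.linspace(0, 2, 200)
omega = 2.0 * np.sin(2 * np.pi * 1.0 * t)
print(detect_gait_events(omega, fs=100.0))
```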

  10. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.
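    The paper's implementation is not reproduced here, but the general idea of using latent Dirichlet allocation to flag unusual network events can be sketched as follows: fit LDA over tokenized log "documents" and score each document by how well the learned topics explain its words. The toy log lines, the two-topic model, and the per-word likelihood approximation are all illustrative assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# toy "documents" built from network-log tokens (invented for illustration)
logs = [
    "dns query internal host allowed",
    "http get internal proxy allowed",
    "dns query internal host allowed",
    "smtp outbound large attachment external blocked",
]
vec = CountVectorizer()
X = vec.fit_transform(logs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
theta = lda.transform(X)                                            # per-document topic mixtures
beta = lda.components_ / lda.components_.sum(axis=1, keepdims=True) # topic-word distributions

# approximate average per-word log-likelihood of each document under the model
word_probs = theta @ beta                                           # (n_docs, n_words)
Xd = X.toarray()
loglik = (Xd * np.log(word_probs + 1e-12)).sum(axis=1) / np.maximum(Xd.sum(axis=1), 1)

# lower (more negative) scores mark documents that the learned topics explain poorly
for doc, ll in zip(logs, loglik):
    print(f"{ll:7.3f}  {doc}")
```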

  11. A model-based approach for detection of objects in low resolution passive-millimeter wave images

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1993-01-01

    We describe a model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as an airport model and the approximate position and orientation of the aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  12. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event.

    PubMed

    Donner, Simon D; Knutson, Thomas R; Oppenheimer, Michael

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, we use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 degrees C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO(2) levels may still represent an additional long-term threat to corals.

  13. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event

    SciTech Connect

    Donner, S.D.; Knutson, T.R.; Oppenheimer, M.

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, the authors use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 °C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term 'committed warming' even after stabilization of atmospheric CO2 levels may still represent an additional long-term threat to corals.

  14. Application of Kalman Filtering Techniques for Microseismic Event Detection

    NASA Astrophysics Data System (ADS)

    Baziw, E.; Weir-Jones, I.

    Microseismic monitoring systems are generally installed in areas of induced seismicity caused by human activity. Induced seismicity results from changes in the state of stress which may occur as a result of excavation within the rock mass in mining (i.e., rockbursts), and changes in hydrostatic pressures and rock temperatures (e.g., during fluid injection or extraction) in oil exploitation, dam construction or fluid disposal. Microseismic monitoring systems determine event locations and important source parameters such as attenuation, seismic moment, source radius, static stress drop, peak particle velocity and seismic energy. An essential part of the operation of a microseismic monitoring system is the reliable detection of microseismic events. In the absence of reliable, automated picking techniques, operators rely upon manual picking. This is time-consuming, costly and, in the presence of background noise, very prone to error. The techniques described in this paper not only permit the reliable identification of events in cluttered signal environments, they have also enabled the authors to develop reliable automated event picking procedures. This opens the way to use microseismic monitoring as a cost-effective production/operations procedure. It has been the experience of the authors that in certain noisy environments, the seismic monitoring system may trigger on and subsequently acquire substantial quantities of erroneous data, due to the high energy content of the ambient noise. Digital filtering techniques need to be applied to the microseismic data so that the ambient noise is removed and event detection simplified. The monitoring of seismic acoustic emissions is a continuous, real-time process, and it is desirable to implement digital filters which can also be designed in the time domain and in real time, such as the Kalman Filter. This paper presents a real-time Kalman Filter which removes the statistically describable background noise from the recorded data.
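    The real-time filter design described above is not reproduced in this record; the sketch below shows only the basic scalar Kalman predict/update recursion (with a random-walk state model) that underlies this kind of background-noise suppression. The process and measurement variances are illustrative, not tuned values from the monitoring system.

```python
# Minimal scalar Kalman filter (random-walk state model) illustrating the
# predict/update recursion; process and measurement variances are
# illustrative assumptions.
import numpy as np

def kalman_smooth(z, q=1e-4, r=1e-1):
    """z: noisy samples; q: process variance; r: measurement variance."""
    x_hat = np.zeros_like(z, dtype=float)
    x, p = z[0], 1.0                       # initial state estimate and covariance
    for k, zk in enumerate(z):
        p = p + q                          # predict (state is a random walk)
        kgain = p / (p + r)                # Kalman gain
        x = x + kgain * (zk - x)           # update with the innovation
        p = (1.0 - kgain) * p
        x_hat[k] = x
    return x_hat

rng = np.random.default_rng(1)
truth = np.concatenate([np.zeros(300), 0.5 * np.ones(50), np.zeros(300)])  # a small "event"
measured = truth + 0.3 * rng.standard_normal(truth.size)
filtered = kalman_smooth(measured)
print("peak of filtered trace:", filtered.max().round(3))
```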

  15. Integrating event detection system operation characteristics into sensor placement optimization.

    SciTech Connect

    Hart, William Eugene; McKenna, Sean Andrew; Phillips, Cynthia Ann; Murray, Regan Elizabeth; Hart, David Blaine

    2010-05-01

    We consider the problem of placing sensors in a municipal water network when we can choose both the location of sensors and the sensitivity and specificity of the contamination warning system. Sensor stations in a municipal water distribution network continuously send sensor output information to a centralized computing facility, and event detection systems at the control center determine when to signal an anomaly worthy of response. Although most sensor placement research has assumed perfect anomaly detection, signal analysis software has parameters that control the tradeoff between false alarms and false negatives. We describe a nonlinear sensor placement formulation, which we heuristically optimize with a linear approximation that can be solved as a mixed-integer linear program. We report the results of initial experiments on a real network and discuss tradeoffs between early detection of contamination incidents, and control of false alarms.
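    The paper's formulation, which couples detector sensitivity and specificity into the placement problem, is not reproduced here. The sketch below shows only the classical mixed-integer core of such problems: choose at most p sensor locations to minimize the expected impact over contamination scenarios, with a dummy location representing non-detection. The scenario impacts and the PuLP-based formulation are illustrative assumptions.

```python
# Greatly simplified placement MILP: choose at most p sensor locations to
# minimize expected impact over contamination scenarios, where impact[a][i]
# is the damage if scenario a is first detected at location i (the "dummy"
# location models non-detection). Toy data; detector ROC coupling omitted.
import pulp

scenarios = ["a1", "a2", "a3"]
locations = ["n1", "n2", "n3", "dummy"]
impact = {
    "a1": {"n1": 10, "n2": 40, "n3": 60, "dummy": 100},
    "a2": {"n1": 50, "n2": 15, "n3": 70, "dummy": 100},
    "a3": {"n1": 80, "n2": 30, "n3": 20, "dummy": 100},
}
p = 1  # sensor budget

prob = pulp.LpProblem("sensor_placement", pulp.LpMinimize)
s = pulp.LpVariable.dicts("s", locations, cat="Binary")             # sensor placed at i?
x = pulp.LpVariable.dicts("x", (scenarios, locations), lowBound=0)  # scenario a detected at i

prob += pulp.lpSum(impact[a][i] * x[a][i] for a in scenarios for i in locations)
for a in scenarios:
    prob += pulp.lpSum(x[a][i] for i in locations) == 1             # each scenario assigned once
    for i in locations:
        prob += x[a][i] <= s[i]                                     # only to instrumented locations
prob += s["dummy"] == 1                                             # non-detection always available
prob += pulp.lpSum(s[i] for i in locations if i != "dummy") <= p

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("place sensors at:", [i for i in locations if i != "dummy" and s[i].value() == 1])
```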

  16. Detection of red tide events in the Ariake Sound, Japan

    NASA Astrophysics Data System (ADS)

    Ishizaka, Joji

    2003-05-01

    High resolution SeaWiFS data was used to detect a red tide event occurred in the Ariake Sound, Japan, in winter of 2000 to 2001. The area is small embayment surrounding by tidal flat, and it is known as one of the most productive areas in coast of Japan. The red tide event damaged to seaweed (Nori) culture, and the relation to the reclamation at the Isahaya Bay in the Sound has been discussed. SeaWiFS chlorophyll data showed the red tide started early December 2000, from the Isahaya Bay, although direct relationship to the reclamation was not clear. The red tide persisted to the end of February. Monthly average of SeaWiFS data from May 1998 to December 2001 indicated that the chlorophyll increased twice a year, early summer and fall after the rain. The red tide event was part of the fall bloom which started later and continued longer than other years. Ocean color is useful to detect the red tide; however, it is required to improve the algorithms to accurately estimate chlorophyll in high turbid water and to discriminate toxic flagellates.

  17. Adaptive Model-Based Mine Detection/Localization using Noisy Laser Doppler Vibration Measurements

    SciTech Connect

    Sullivan, E J; Xiang, N; Candy, J V

    2009-04-06

    The acoustic detection of buried mines is hampered by the fact that, at the frequencies required for obtaining useful penetration, the energy is quickly absorbed by the ground. A recent approach, which avoids this problem, is to excite the ground with a high-level low frequency sound, which excites low frequency resonances in the mine. These resonances cause a low-level vibration on the surface which can be detected by a Laser Doppler Vibrometer. This paper presents a method of quickly and efficiently detecting these vibrations by sensing a change in the statistics of the signal when the mine is present. Results based on real data are shown.
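    The adaptive detector used in the paper is not specified in this record; a minimal way to "sense a change in the statistics of the signal" is to compare the variance in a sliding test window against a noise-only reference window with an F-test, as sketched below. Window sizes, the false-alarm level, and the synthetic resonance are illustrative assumptions.

```python
# Illustrative detection of a change in signal statistics: compare the
# variance in a sliding test window against a noise-only reference window
# with an F-test. Parameters are assumptions, not values from the paper.
import numpy as np
from scipy.stats import f as f_dist

def variance_change_detector(x, n_ref=200, n_test=50, alpha=1e-3):
    x = np.asarray(x, dtype=float)
    ref_var = np.var(x[:n_ref], ddof=1)               # noise-only reference
    crit = f_dist.ppf(1 - alpha, n_test - 1, n_ref - 1)
    flags = np.zeros(x.size, dtype=bool)
    for k in range(n_ref + n_test, x.size):
        test_var = np.var(x[k - n_test:k], ddof=1)
        flags[k] = (test_var / ref_var) > crit
    return flags

rng = np.random.default_rng(2)
signal = rng.standard_normal(1000)
t = np.arange(1000)
signal[600:] += 2.0 * np.sin(2 * np.pi * 0.05 * t[600:])   # resonance "turns on"
flags = variance_change_detector(signal)
print("first detection index:", int(np.flatnonzero(flags)[0]) if flags.any() else None)
```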

  18. A Framework for Automated Spine and Vertebrae Interpolation-Based Detection and Model-Based Segmentation.

    PubMed

    Korez, Robert; Ibragimov, Bulat; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2015-08-01

    Automated and semi-automated detection and segmentation of spinal and vertebral structures from computed tomography (CT) images is a challenging task due to a relatively high degree of anatomical complexity, presence of unclear boundaries and articulation of vertebrae with each other, as well as due to insufficient image spatial resolution, partial volume effects, presence of image artifacts, intensity variations and low signal-to-noise ratio. In this paper, we describe a novel framework for automated spine and vertebrae detection and segmentation from 3-D CT images. A novel optimization technique based on interpolation theory is applied to detect the location of the whole spine in the 3-D image and, using the obtained location of the whole spine, to further detect the location of individual vertebrae within the spinal column. The obtained vertebra detection results represent a robust and accurate initialization for the subsequent segmentation of individual vertebrae, which is performed by an improved shape-constrained deformable model approach. The framework was evaluated on two publicly available CT spine image databases of 50 lumbar and 170 thoracolumbar vertebrae. Quantitative comparison against corresponding reference vertebra segmentations yielded an overall mean centroid-to-centroid distance of 1.1 mm and Dice coefficient of 83.6% for vertebra detection, and an overall mean symmetric surface distance of 0.3 mm and Dice coefficient of 94.6% for vertebra segmentation. The results indicate that by applying the proposed automated detection and segmentation framework, vertebrae can be successfully detected and accurately segmented in 3-D from CT spine images.

  19. High Probabilities of Planet Detection during Microlensing Events.

    NASA Astrophysics Data System (ADS)

    Peale, S. J.

    2000-10-01

    The averaged probability of detecting a planetary companion of a lensing star during a gravitational microlensing event toward the Galactic center when the planet-lens mass ratio is 0.001 is shown to have a maximum exceeding 20% for a distribution of source-lens impact parameters that is determined by the efficiency of event detection, and a maximum exceeding 10% for a uniform distribution of impact parameters. The probability varies as the square root of the planet-lens mass ratio. A planet is assumed detectable if the perturbation of the light curve exceeds 2/(S/N) for a significant number of data points, where S/N is the signal-to-noise ratio for the photometry of the source. The probability peaks at a planetary semimajor axis a that is close to the mean Einstein ring radius of the lenses of about 2 AU along the line of sight, and remains significant for 0.6 <= a <= 10 AU. The low value of the mean Einstein ring radius results from the dominance of M stars in the mass function of the lenses. The probability is averaged over the distribution of the projected position of the planet onto the lens plane, over the lens mass function, over the distribution of impact parameters, over the distribution of lenses along the line of sight to the source star, over the I band luminosity function of the sources adjusted for the source distance, and over the source distribution along the line of sight. If two or more parameters of the lensing event are known, such as the I magnitude of the source and the impact parameter, the averages over these parameters can be omitted and the probability of detection determined for a particular event. The calculated probabilities behave as expected with variations in the line of sight, the mass function of the lenses, the extinction and distance to and magnitude of the source, and with a more demanding detection criterion. The relatively high values of the probabilities are robust to plausible variations in the assumptions. The high

  20. Towards perception awareness: Perceptual event detection for Brain computer interfaces.

    PubMed

    Nejati, Hossein; Tsourides, Kleovoulos; Pomponiu, Victor; Ehrenberg, Evan C; Ngai-Man Cheung; Sinha, Pawan

    2015-08-01

    Brain computer interface (BCI) technology is becoming increasingly popular in many domains such as entertainment, mental state analysis, and rehabilitation. For robust performance in these domains, detecting perceptual events would be a vital ability, enabling a system to adapt to and act on the basis of the user's perception of the environment. Here we present a framework to automatically mine the spatiotemporal characteristics of a given perceptual event. As this "signature" is derived directly from the subject's neural behavior, it can serve as a representation of the subject's perception of the targeted scenario, which in turn allows a BCI system to gain a new level of context awareness: perception awareness. As a proof of concept, we show the application of the proposed framework on MEG signal recordings from a face perception study, and the resulting temporal and spatial characteristics of the derived neural signature, as well as its compatibility with the neuroscientific literature on face perception.

  1. Endmember detection in marine environment with oil spill event

    NASA Astrophysics Data System (ADS)

    Andreou, Charoula; Karathanassi, Vassilia

    2011-11-01

    Oil spill events are a crucial environmental issue. Detection of oil spills is important for both oil exploration and environmental protection. In this paper, hyperspectral remote sensing is investigated for the detection of oil spills and the discrimination of different oil types. Spectral signatures of different oil types are very useful, since they may serve as endmembers in unmixing and classification models. Towards this direction, an oil spectral library was compiled from spectral measurements of artificial oil spills as well as of look-alikes in a marine environment. Samples of four different oil types were used: two crude oils, one marine residual fuel oil, and one light petroleum product. Look-alikes comprise sea water, river discharges, shallow water and water with algae. Spectral measurements were acquired with the GER1500 spectro-radiometer. Moreover, the oil and look-alike spectral signatures were examined to determine whether they can serve as endmembers; this was accomplished by testing their linear independence. After that, synthetic hyperspectral images based on the oil spectral library were created. Several simplex-based endmember algorithms, such as the sequential maximum angle convex cone (SMACC), vertex component analysis (VCA), the n-finder algorithm (N-FINDR), and the automatic target generation process (ATGP), were applied to the synthetic images in order to evaluate their effectiveness in detecting oil spill events caused by different oil types. Results showed that different types of oil spills with various thicknesses can be extracted as endmembers.

  2. ARX model-based gearbox fault detection and localization under varying load conditions

    NASA Astrophysics Data System (ADS)

    Yang, Ming; Makis, Viliam

    2010-11-01

    The development of fault detection schemes for gearbox systems has received considerable attention in recent years. Both time series modeling and feature extraction based on wavelet methods have been considered, mostly under constant load. The constant load assumption implies that changes in vibration data are caused only by deterioration of the gearbox. However, most real gearbox systems operate under varying load and speed, which affect the vibration signature of the system and in general make it difficult to recognize the occurrence of an impending fault. This paper presents a novel approach to detect and localize gear failure for a gearbox operating under varying load conditions. First, a residual signal is calculated using an autoregressive model with exogenous variables (ARX) fitted to the time-synchronously averaged (TSA) vibration data and filtered TSA envelopes when the gearbox operates under various load conditions in the healthy state. The gear of interest is divided into several sections so that each section includes the same number of adjacent teeth. Then, the fault detection and localization indicator is calculated by applying an F-test to the residual signal of the ARX model. The proposed fault detection scheme indicates not only when the gear fault occurs, but also in which section of the gear. Finally, the performance of the fault detection scheme is evaluated using full lifetime vibration data obtained from a gearbox operating from a new condition to breakdown under varying load.
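    A minimal sketch of the residual-plus-F-test idea is given below: an ARX model is fitted to healthy data by least squares, residuals are computed on new data, and the residual variances are compared with an F-test. The TSA preprocessing, envelope filtering, and per-tooth sectioning described above are omitted, and the model orders and simulated signals are illustrative.

```python
# Minimal ARX-residual sketch: fit y[k] from past outputs and a past
# exogenous load input on healthy data, then compare residual variances of
# new data against the healthy baseline with an F-test. Illustrative only.
import numpy as np
from scipy.stats import f as f_dist

def arx_design(y, u, na=4, nb=2):
    """Build the ARX regression matrix: y[k] regressed on past y and past u."""
    rows, targets = [], []
    for k in range(max(na, nb), len(y)):
        rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
        targets.append(y[k])
    return np.array(rows), np.array(targets)

def residual_signal(y, u, theta, na=4, nb=2):
    phi, target = arx_design(y, u, na, nb)
    return target - phi @ theta

rng = np.random.default_rng(3)
n = 2000
u = rng.standard_normal(n)                         # varying load (exogenous input)
y = np.zeros(n)
for k in range(2, n):                              # simulated healthy gearbox response
    y[k] = 0.5 * y[k - 1] - 0.2 * y[k - 2] + 0.8 * u[k - 1] + 0.05 * rng.standard_normal()

phi, target = arx_design(y, u)
theta, *_ = np.linalg.lstsq(phi, target, rcond=None)   # fit ARX on healthy data

y_faulty = y.copy()
y_faulty[1000:] += 0.2 * rng.standard_normal(1000)     # extra vibration energy (simulated fault)
r_healthy = residual_signal(y, u, theta)
r_faulty = residual_signal(y_faulty, u, theta)

stat = np.var(r_faulty, ddof=1) / np.var(r_healthy, ddof=1)
p_value = f_dist.sf(stat, len(r_faulty) - 1, len(r_healthy) - 1)
print(f"F statistic = {stat:.2f}, p = {p_value:.3g}")
```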

  3. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can advance the boundaries of understanding of the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Sometimes there are not enough human resources to analyze the video, and the strain on human attention needed to analyze video demands an automated way to assist in video analysis. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  4. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings

    SciTech Connect

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-26

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  5. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    PubMed

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced) and it is found that the proposed model has very high fraud detection rate, balanced classification rate, Matthews correlation coefficient, and very less false alarm rate than other state-of-the-art classifiers.
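    The FraudMiner algorithm itself is not reproduced here; the sketch below only illustrates the general idea of per-customer pattern matching: mine itemsets that occur frequently in legal and in fraudulent transactions, then classify an incoming transaction by which pattern set it overlaps more. The toy categorical transactions and support threshold are illustrative assumptions.

```python
# Sketch of frequent-itemset-based fraud matching: mine itemsets that occur
# often in legal vs. fraudulent transactions, then classify a new transaction
# by which pattern set it overlaps more. Toy data; not the FraudMiner
# algorithm or the UCSD contest dataset.
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support=2, max_len=2):
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for r in range(1, max_len + 1):
            counts.update(combinations(items, r))
    return {s for s, c in counts.items() if c >= min_support}

def match_score(transaction, itemsets):
    items = set(transaction)
    return sum(1 for s in itemsets if set(s) <= items)

legal = [["pos", "local", "low"], ["pos", "local", "medium"], ["online", "local", "low"]]
fraud = [["online", "foreign", "high"], ["online", "foreign", "medium"]]

legal_patterns = frequent_itemsets(legal)
fraud_patterns = frequent_itemsets(fraud)

incoming = ["online", "foreign", "high"]
label = "fraud" if match_score(incoming, fraud_patterns) > match_score(incoming, legal_patterns) else "legal"
print(label)
```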

  6. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    PubMed Central

    Seeja, K. R.; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced) and it is found that the proposed model has very high fraud detection rate, balanced classification rate, Matthews correlation coefficient, and very less false alarm rate than other state-of-the-art classifiers. PMID:25302317

  7. Model-Based Design of Tree WSNs for Decentralized Detection

    PubMed Central

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  8. Moving object detection using a background modeling based on entropy theory and quad-tree decomposition

    NASA Astrophysics Data System (ADS)

    Elharrouss, Omar; Moujahid, Driss; Elkah, Samah; Tairi, Hamid

    2016-11-01

    A particular algorithm for moving object detection using a background subtraction approach is proposed. We generate the background model by combining quad-tree decomposition with entropy theory. In general, many background subtraction approaches are sensitive to sudden illumination changes in the scene and cannot update the background image in such scenes. The proposed background modeling approach analyzes the illumination change problem. After performing the background subtraction based on the proposed background model, the moving targets can be accurately detected at each frame of the image sequence. In order to produce high accuracy for the motion detection, the binary motion mask is computed by the proposed threshold function. The experimental analysis, based on statistical measurements, demonstrates the efficiency of the proposed method in terms of quality and quantity, and it even substantially outperforms existing methods in perceptual evaluation.

  9. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    SciTech Connect

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-01

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  10. Data mining to generate adverse drug events detection rules.

    PubMed

    Chazard, Emmanuel; Ficheur, Grégoire; Bernonville, Stéphanie; Luyckx, Michel; Beuscart, Régis

    2011-11-01

    Adverse drug events (ADEs) are a public health issue. Their detection usually relies on voluntary reporting or medical chart reviews. The objective of this paper is to automatically detect cases of ADEs by data mining. 115,447 complete past hospital stays are extracted from six French, Danish, and Bulgarian hospitals using a common data model including diagnoses, drug administrations, laboratory results, and free-text records. Different kinds of outcomes are traced, and supervised rule induction methods (decision trees and association rules) are used to discover ADE detection rules, with respect to time constraints. The rules are then filtered, validated, and reorganized by a committee of experts. The rules are described in a rule repository, and several statistics are automatically computed in every medical department, such as the confidence, relative risk, and median delay of outcome appearance. 236 validated ADE-detection rules are discovered; they enable the detection of 27 different kinds of outcomes. The rules use varying numbers of conditions related to laboratory results, diseases, drug administration, and demographics. Some rules involve innovative conditions, such as drug discontinuations.

  11. Use of sonification in the detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Cole, Robert J.; Kruesi, Heidi; Greene, Herbert; Monahan, Ganesh; Hall, David L.

    2012-06-01

    In this paper, we describe the construction of a soundtrack that fuses stock market data with information taken from tweets. This soundtrack, or auditory display, presents the numerical and text data in such a way that anomalous events may be readily detected, even by untrained listeners. The soundtrack generation is flexible, allowing an individual listener to create a unique audio mix from the available information sources. Properly constructed, the display exploits the auditory system's sensitivities to periodicities, to dynamic changes, and to patterns. This type of display could be valuable in environments that demand high levels of situational awareness based on multiple sources of incoming information.

  12. Automatic Detection of Student Mental Models Based on Natural Language Student Input during Metacognitive Skill Training

    ERIC Educational Resources Information Center

    Lintean, Mihai; Rus, Vasile; Azevedo, Roger

    2012-01-01

    This article describes the problem of detecting the student mental models, i.e. students' knowledge states, during the self-regulatory activity of prior knowledge activation in MetaTutor, an intelligent tutoring system that teaches students self-regulation skills while learning complex science topics. The article presents several approaches to…

  13. Automatic adverse drug events detection using letters to the editor.

    PubMed

    Yang, Chao; Srinivasan, Padmini; Polgreen, Philip M

    2012-01-01

    We present and test the intuition that letters to the editor in journals carry early signals of adverse drug events (ADEs). Surprisingly, these letters have not yet been exploited for automatic ADE detection, unlike, for example, clinical records and PubMed. Part of the challenge is that it is not easy to access the full text of letters (for the most part these do not appear in PubMed). Also, letters are likely underrated in comparison with full articles. Besides demonstrating that this intuition holds, we contribute techniques for post-market drug surveillance. Specifically, we test an automatic approach for ADE detection from letters using off-the-shelf machine learning tools. We also involve natural language processing for feature definitions. Overall, we achieve high accuracy in our experiments, and our method also works well on a second new test set. Our results encourage us to further pursue this line of research.

  14. Detectability of GW150914-like events by gravitational microlensing

    NASA Astrophysics Data System (ADS)

    Eilbott, Daniel; Riley, Alexander; Cohn, Jonathan; Kesden, Michael H.; King, Lindsay J.

    2017-01-01

    The recent discovery of gravitational waves from stellar-mass binary black holes (BBHs) provided direct evidence of the existence of these systems. These BBHs would have gravitational microlensing signatures that are, due to their large masses and small separations, distinct from single-lens signals. We apply Bayesian statistics to examine the distinguishability of BBH microlensing events from single-lens events under ideal observing conditions, using modern photometric and astrometric capabilities. Given one year of ideal observations, a source star at the Galactic center, a GW150914-like BBH lens (total mass 65 M⊙, mass ratio 0.8) at half that distance, and an impact parameter of 0.4 Einstein radii, we find that BBH separations down to 0.00634 Einstein radii are detectable, which is < 0.00716 Einstein radii, the limit at which the BBH would merge within the age of the universe. We encourage analyses of LSST data to search for similar modulation in all long-duration events, providing a new channel for the discovery of short-period BBHs in our Galaxy.

  15. Increased SERS detection efficiency for characterizing rare events in flow.

    PubMed

    Jacobs, Kevin T; Schultz, Zachary D

    2015-08-18

    Improved surface-enhanced Raman scattering (SERS) measurements of a flowing aqueous sample are accomplished by combining line focus optics with sheath-flow SERS detection. The straightforward introduction of a cylindrical lens into the optical path of the Raman excitation laser increases the efficiency of SERS detection and the reproducibility of SERS signals at low concentrations. The width of the line focus is matched to the width of the sample capillary from which the analyte elutes under hydrodynamic focusing conditions, allowing for increased collection across the SERS substrate while maintaining the power density below the damage threshold at any specific point. We show that a 4× increase in power spread across the line increases the signal-to-noise ratio by a factor of 2 for a variety of analytes, such as rhodamine 6G, amino acids, and lipid vesicles, without any detectable photodamage. COMSOL simulations and Raman maps elucidate the hydrodynamic focusing properties of the flow cell, providing a clearer picture of the confinement effects at the surface where the sample exits the capillary. The lipid vesicle results suggest that the combination of hydrodynamic focusing and increased optical collection enables the reproducible detection of rare events, in this case individual lipid vesicles.

  16. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is often at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decrease gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations of the radiative transfer equation are not optimal for this problem. To address this gastric-cancer-specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. Radiosity theory is integrated with the diffusion equation to form the hybrid light transport model, which describes light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem that incorporates an l1-norm regularization term to promote sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital-mouse-based simulation, with a reconstruction error of less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The preliminary result demonstrates the ability of the novel BLT reconstruction algorithm in early gastric cancer detection.
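    The hybrid radiosity-diffusion forward model is not reproduced here; once the forward problem has been discretized to a linear system m = A q, the l1-regularized recovery of a sparse source vector q can be sketched with an off-the-shelf Lasso solver, as below. The random surrogate for A, the noise level, and the regularization weight are illustrative assumptions.

```python
# After discretization the reconstruction reduces to solving m = A q for a
# sparse source vector q with an l1 penalty. A random surrogate stands in
# for the forward operator A of the paper's hybrid light transport model.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n_detectors, n_nodes = 60, 300
A = rng.standard_normal((n_detectors, n_nodes))    # surrogate forward operator

q_true = np.zeros(n_nodes)
q_true[[42, 43, 44]] = [1.0, 2.0, 1.5]             # small, localized source
m = A @ q_true + 0.01 * rng.standard_normal(n_detectors)

lasso = Lasso(alpha=0.05, max_iter=20000)
q_hat = lasso.fit(A, m).coef_
print("recovered support:", np.flatnonzero(np.abs(q_hat) > 0.1))
```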

  17. Model-based detection of heart rate turbulence using mean shape information.

    PubMed

    Smith, Danny; Solem, Kristian; Laguna, Pablo; Martínez, Juan Pablo; Sörnmo, Leif

    2010-02-01

    A generalized likelihood ratio test (GLRT) statistic is proposed for detection of heart rate turbulence (HRT), where a set of Karhunen-Loève basis functions models HRT. The detector structure is based on the extended integral pulse frequency modulation model that accounts for the presence of ectopic beats and HRT. This new test statistic takes a priori information regarding HRT shape into account, whereas our previously presented GLRT detector relied solely on the energy contained in the signal subspace. The spectral relationship between heart rate variability (HRV) and HRT is investigated for the purpose of modeling HRV "noise" present during the turbulence period, the results suggesting that the white noise assumption is feasible to pursue. The performance was studied for both simulated and real data, leading to results which show that the new GLRT detector is superior to the original one as well as to the commonly used parameter turbulence slope (TS) on both types of data. Averaging ten ventricular ectopic beats, the estimated detection probability of the new detector, the previous detector, and TS were found to be 0.83, 0.35, and 0.41, respectively, when the false alarm probability was held fixed at 0.1.

  18. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  19. Takagi-Sugeno fuzzy-model-based fault detection for networked control systems with Markov delays.

    PubMed

    Zheng, Ying; Fang, Huajing; Wang, Hua O

    2006-08-01

    A Takagi-Sugeno (T-S) model is employed to represent a networked control system (NCS) with different network-induced delays. Compared with existing NCS modeling methods, this approach does not require knowledge of the exact values of network-induced delays. Instead, it addresses situations involving all possible network-induced delays. Moreover, this approach also handles data-packet loss. As an application of the T-S-based modeling method, a parity-equation approach and a fuzzy-observer-based approach for fault detection of an NCS were developed. An example of a two-link inverted pendulum is used to illustrate the utility and viability of the proposed approaches.

  20. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  1. [Establishment and Improvement of Portable X-Ray Fluorescence Spectrometer Detection Model Based on Wavelet Transform].

    PubMed

    Li, Fang; Wang, Ji-hua; Lu, An-xiang; Han, Ping

    2015-04-01

    The concentrations of Cr, Cu, Zn, As and Pb in soil were tested with a portable X-ray fluorescence spectrometer. Each sample was tested 3 times; after using a wavelet threshold noise filtering method to denoise and smooth the spectra, a standard curve for each heavy metal was established according to the standard values of heavy metals in soil and the corresponding counts, which were the average of the 3 processed spectra. The signal to noise ratio (SNR), mean square error (MSE) and information entropy (H) were used to assess the denoising effect and to determine the best wavelet basis and wavelet decomposition level for the wavelet threshold noise filtering method. Some samples with different concentrations and H3BO3 (blank) were chosen for retesting to verify the stability of the instrument. The results show that the best denoising result was obtained with the coif3 wavelet basis at a decomposition level of 3. The determination coefficient (R2) of the instrument ranges from 0.990 to 0.996, indicating a high degree of linearity between the heavy metal contents in soil and the corresponding X-ray fluorescence characteristic peak intensities within the instrument's measurement range (0-1,500 mg · kg(-1)). After retesting and calculation, the results indicate that all detection limits of the instrument are below the national soil standards. With the practical application of the wavelet transform to the establishment and improvement of the X-ray fluorescence spectrometer detection model, the accuracy of the model is effectively improved and the instrument also shows good precision. Thus the instrument can be applied to on-site rapid screening of heavy metals in contaminated soil.
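    The record reports that the coif3 wavelet at decomposition level 3 gave the best denoising result; the sketch below shows a generic wavelet soft-threshold denoising step with those settings using PyWavelets. The universal (VisuShrink) threshold and the synthetic spectrum are illustrative choices, not necessarily those used by the authors.

```python
# Minimal wavelet-threshold denoising sketch with PyWavelets, using the
# coif3 basis and decomposition level 3 reported above. The threshold rule
# and the synthetic "spectrum" are illustrative assumptions.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="coif3", level=3):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))         # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 1024)
spectrum = np.exp(-((x - 0.3) ** 2) / 0.001) + 0.6 * np.exp(-((x - 0.7) ** 2) / 0.002)
noisy = spectrum + 0.05 * rng.standard_normal(x.size)
clean = wavelet_denoise(noisy)
print("residual RMS after denoising:", float(np.sqrt(np.mean((clean - spectrum) ** 2))))
```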

  2. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
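    The paper's exact definition of the ramp function is not reproduced in this record; the sketch below illustrates the underlying idea of a continuous ramp-intensity index built from power gradients evaluated over several time scales. The choice of scales, the simple averaging, and the toy power series are assumptions for illustration.

```python
# Minimal sketch of a multi-scale ramp-intensity index: for each time step,
# combine power differences evaluated over several time scales (up to a
# typical ramp duration). Scales and averaging are illustrative assumptions.
import numpy as np

def ramp_index(power, scales=(1, 2, 4, 6)):
    """power: wind power time series (e.g., hourly, normalized by capacity)."""
    p = np.asarray(power, dtype=float)
    idx = np.zeros_like(p)
    for s in scales:
        grad = np.zeros_like(p)
        grad[s:] = (p[s:] - p[:-s]) / s          # average gradient over scale s
        idx += np.abs(grad)
    return idx / len(scales)                     # signed ramp-up/ramp-down variants are also possible

# toy series: calm production, then a steep ramp-up around hour 30
power = np.concatenate([np.full(30, 0.2), np.linspace(0.2, 0.9, 6), np.full(30, 0.9)])
ri = ramp_index(power)
print("hour of maximum ramp intensity:", int(ri.argmax()))
```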

  3. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is to extract the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness. Much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we are faced with the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because we must balance selection criteria that capture the largest number of objects of interest against a high contamination rate. We discuss considerations and techniques for developing classification schemes.

  4. An approach to model-based fault detection in industrial measurement systems with application to engine test benches

    NASA Astrophysics Data System (ADS)

    Angelov, P.; Giglio, V.; Guardiola, C.; Lughofer, E.; Luján, J. M.

    2006-07-01

    An approach to fault detection (FD) in industrial measurement systems is proposed in this paper which includes an identification strategy for early detection of the appearance of a fault. This approach is model based, i.e. nominal models are used which represent the fault-free state of the on-line measured process. This approach is also suitable for off-line FD. The framework that combines FD with isolation and correction (FDIC) is outlined in this paper. The proposed approach is characterized by automatic threshold determination, ability to analyse local properties of the models, and aggregation of different fault detection statements. The nominal models are built using data-driven and hybrid approaches, combining first principle models with on-line data-driven techniques. At the same time the models are transparent and interpretable. This novel approach is then verified on a number of real and simulated data sets of car engine test benches (both gasoline—Alfa Romeo JTS, and diesel—Caterpillar). It is demonstrated that the approach can work effectively in real industrial measurement systems with data of large dimensions in both on-line and off-line modes.

  5. Pulmonary Nodule Detection Model Based on SVM and CT Image Feature-Level Fusion with Rough Sets

    PubMed Central

    Lu, Huiling; Zhang, Junjie; Shi, Hongbin

    2016-01-01

    In order to improve the detection accuracy of pulmonary nodules in CT images, and considering two problems of pulmonary nodule detection models, namely unreasonable feature structure and insufficient tightness of feature representation, a pulmonary nodule detection algorithm based on SVM and CT image feature-level fusion with rough sets is proposed. Firstly, CT images of pulmonary nodules are analyzed, and 42-dimensional feature components are extracted, including six new 3-dimensional features proposed by this paper and other 2-dimensional and 3-dimensional features. Secondly, these features are reduced five times with rough sets based on feature-level fusion. Thirdly, a grid optimization model is used to optimize the kernel function of the support vector machine (SVM), which is used as a classifier to identify pulmonary nodules. Finally, lung CT images of 70 patients with pulmonary nodules are collected as the original samples and used to verify the effectiveness and stability of the proposed model through four groups of comparative experiments. The experimental results show that the effectiveness and stability of the proposed model based on rough set feature-level fusion are improved to some degree.
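    The sketch below illustrates only the grid-optimized SVM classification stage: a cross-validated grid search over the RBF kernel parameters C and gamma. The CT feature extraction and rough-set feature reduction described above are omitted, and synthetic features stand in for the reduced nodule descriptors.

```python
# Grid-optimized SVM classification stage only: cross-validated search over
# the RBF kernel parameters C and gamma. Synthetic features stand in for the
# reduced nodule descriptors; the grid values are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=12, n_informative=6, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print("best parameters:", search.best_params_, "CV accuracy:", round(search.best_score_, 3))
```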

  6. Detecting Tidal Disruption Events (TDEs) with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Zhang, S.; Osborne, J.; O'Brien, P.; Watson, M.; Fraser, G.

    2014-07-01

    Stars are tidally disrupted and accreted when they approach supermassive black holes (SMBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible, including (1) TDE rate measurements as a function of host galaxy type, (2) an assessment of the population of IMBHs, and (3) new probes of general relativity and accretion processes. Here, we present the proposed X-ray mission Einstein Probe, which aims at detecting TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60deg x 60deg, or ˜1 ster), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry an X-ray telescope of the same micro-pore optics for follow-ups, with a smaller field-of-view. It will be capable of issuing public transient alerts rapidly.

  7. Communication of ALS Patients by Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic Lateral Sclerosis (ALS) patients are unable to successfully communicate their desires, although their mental capacity is the same as that of non-affected persons. Therefore, the authors put emphasis on the Event-Related Potential (ERP), which elicits the highest response to target visual and auditory stimuli. P300 is one component of the ERP. It is a positive potential that is elicited when the subject focuses attention on stimuli that appear infrequently. In this paper, the authors focused on the P200 and N200 components, in addition to P300, for their great improvement in the rate of correct judgment in the target word-specific experiment. Hence, the authors propose an algorithm that specifies target words by detecting these three components. Ten healthy subjects and an ALS patient underwent an experiment in which a target word out of five words was specified by this algorithm. The rates of correct judgment in nine of ten healthy subjects were more than 90.0%. The highest rate was 99.7%. The highest rate for the ALS patient was 100.0%. From these results, the authors found that ALS patients may be able to communicate their desires to surrounding persons through detection of the ERP components (P200, N200 and P300).

  8. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.

  9. Balloon-Borne Infrasound Detection of Energetic Bolide Events

    NASA Astrophysics Data System (ADS)

    Young, Eliot F.; Ballard, Courtney; Klein, Viliam; Bowman, Daniel; Boslough, Mark

    2016-10-01

    Infrasound is usually defined as sound waves below 20 Hz, the nominal limit of human hearing. Infrasound waves propagate over vast distances through the Earth's atmosphere: the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization) has 48 installed infrasound-sensing stations around the world to detect nuclear detonations and other disturbances. In February 2013, several CTBTO infrasound stations detected infrasound signals from a large bolide that exploded over Chelyabinsk, Russia. Some stations recorded signals that had circumnavigated the Earth, over a day after the original event. This project aims to improve upon the sensitivity of the CTBTO network by putting microphones on small, long-duration super-pressure balloons, with the overarching goal of studying the small end of the NEO population by using the Earth's atmosphere as a witness plate. A balloon-borne infrasound sensor is expected to have two advantages over ground-based stations: a lack of wind noise and a concentration of infrasound energy in the "stratospheric duct" between roughly 5 - 50 km altitude. To test these advantages, we have built a small balloon payload with five calibrated microphones. We plan to fly this payload on a NASA high-altitude balloon from Ft Sumner, NM in August 2016. We have arranged for three large explosions to take place in Socorro, NM while the balloon is aloft to assess the sensitivity of balloon-borne vs. ground-based infrasound sensors. We will report on the results from this test flight and the prospects for detecting/characterizing small bolides from the stratosphere.

  10. Automatic detection of volcano-seismic events by modeling state and event duration in hidden Markov models

    NASA Astrophysics Data System (ADS)

    Bhatti, Sohail Masood; Khan, Muhammad Salman; Wuth, Jorge; Huenupan, Fernando; Curilem, Millaray; Franco, Luis; Yoma, Nestor Becerra

    2016-09-01

    In this paper we propose an automatic volcano event detection system based on a Hidden Markov Model (HMM) with state and event duration models. Since different volcanic events have different durations, the state and whole-event durations learnt from the training data are enforced on the corresponding state and event duration models within the HMM. Seismic signals from the Llaima volcano are used to train the system. Two types of events are employed in this study, Long Period (LP) and Volcano-Tectonic (VT). Experiments show that standard HMMs can detect the volcano events with high accuracy but generate false positives. The results presented in this paper show that the incorporation of duration modeling can reduce the false positive rate in event detection by as much as 31% with a true positive accuracy equal to 94%. Further evaluation of the false positives indicates that the false alarms generated by the system were mostly potential events according to the signal-to-noise ratio criteria recommended by a volcano expert.
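
    A minimal sketch of the underlying idea, assuming synthetic per-frame features in place of the Llaima recordings: a plain Gaussian HMM segments the trace into states, and a simple minimum-duration filter stands in for the explicit state/event duration models that the paper adds to the HMM.

    ```python
    import numpy as np
    from hmmlearn import hmm

    # Hypothetical features (e.g., per-frame energy and spectral measures) from
    # continuous seismic data; in practice these would come from the Llaima records.
    X = np.random.randn(5000, 3)

    # Plain Gaussian HMM with three states (noise, LP-like, VT-like); the
    # published system additionally learns explicit state/event duration models.
    model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    model.fit(X)
    states = model.predict(X)

    def enforce_min_duration(states, event_state, min_len):
        """Discard detections shorter than a minimum duration (a crude stand-in
        for the duration modelling described in the paper)."""
        detections, start = [], None
        for i, s in enumerate(states):
            if s == event_state and start is None:
                start = i
            elif s != event_state and start is not None:
                if i - start >= min_len:
                    detections.append((start, i))
                start = None
        if start is not None and len(states) - start >= min_len:
            detections.append((start, len(states)))
        return detections

    print(enforce_min_duration(states, event_state=1, min_len=50))
    ```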

  11. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  12. Detection of Local/Regional Events in Kuwait Using Next-Generation Detection Algorithms

    SciTech Connect

    Gok, M. Rengin; Al-Jerri, Farra; Dodge, Douglas; Al-Enezi, Abdullah; Hauk, Terri; Mellors, R.

    2014-12-10

    Seismic networks around the world use conventional triggering algorithms to detect seismic signals in order to locate local/regional seismic events. The Kuwait National Seismological Network (KNSN) of the Kuwait Institute of Scientific Research (KISR) operates seven broad-band and short-period three-component stations in Kuwait. The network is equipped with Nanometrics digitizers and uses Antelope and Guralp acquisition software for processing and archiving the data. In this study, we selected 10 days of archived hourly-segmented continuous data from five stations (Figure 1) and 250 days of continuous recording at MIB. For the temporary deployment, our selection criterion was based on KNSN catalog intensity for the period of time over which we tested the method. An autonomous event detection and clustering framework is employed to build a more complete catalog for this short period of time. The goal is to illustrate the effectiveness of the technique and to pursue the framework over a longer period of time.

  13. Adverse drug events and medication errors: detection and classification methods.

    PubMed

    Morimoto, T; Gandhi, T K; Seger, A C; Hsieh, T C; Bates, D W

    2004-08-01

    Investigating the incidence, type, and preventability of adverse drug events (ADEs) and medication errors is crucial to improving the quality of health care delivery. ADEs, potential ADEs, and medication errors can be collected by extraction from practice data, solicitation of incidents from health professionals, and patient surveys. Practice data include charts, laboratory and prescription data, and administrative databases, and can be reviewed manually or screened by computer systems to identify signals. Research nurses, pharmacists, or research assistants review these signals, and those that are likely to represent an ADE or medication error are presented to reviewers who independently categorize them into ADEs, potential ADEs, medication errors, or exclusions. These incidents are also classified according to preventability, ameliorability, disability, severity, stage, and responsible person. These classifications, as well as the initial selection of incidents, have been evaluated for agreement between reviewers, and the level of agreement found ranged from satisfactory to excellent (kappa = 0.32-0.98). The method of ADE and medication error detection and classification described is feasible and has good reliability. It can be used in various clinical settings to measure and improve medication safety.

  14. Using REDItools to Detect RNA Editing Events in NGS Datasets.

    PubMed

    Picardi, Ernesto; D'Erchia, Anna Maria; Montalvo, Antonio; Pesole, Graziano

    2015-03-09

    RNA editing is a post-transcriptional/co-transcriptional molecular phenomenon whereby a genetic message is modified from the corresponding DNA template by means of substitutions, insertions, and/or deletions. It occurs in a variety of organisms and different cellular locations through evolutionarily and biochemically unrelated proteins. RNA editing has a plethora of biological effects including the modulation of alternative splicing and fine-tuning of gene expression. RNA editing events by base substitutions can be detected on a genomic scale by NGS technologies through the REDItools package, an ad hoc suite of Python scripts to study RNA editing using RNA-Seq and DNA-Seq data or RNA-Seq data alone. REDItools implements effective filters to minimize biases due to sequencing errors, mapping errors, and SNPs. The package is freely available at the Google Code repository (http://code.google.com/p/reditools/) and released under the MIT license. In the present unit we show three basic protocols corresponding to the three main REDItools scripts.

  15. Visual traffic surveillance framework: classification to event detection

    NASA Astrophysics Data System (ADS)

    Ambardekar, Amol; Nicolescu, Mircea; Bebis, George; Nicolescu, Monica

    2013-10-01

    Visual traffic surveillance using computer vision techniques can be noninvasive, automated, and cost effective. Traffic surveillance systems with the ability to detect, count, and classify vehicles can be employed in gathering traffic statistics and achieving better traffic control in intelligent transportation systems. However, vehicle classification poses a difficult problem as vehicles have high intraclass variation and relatively low interclass variation. Five different object recognition techniques are investigated and applied to the problem of vehicle classification: principal component analysis (PCA)+difference from vehicle space, PCA+difference in vehicle space, PCA+support vector machine, linear discriminant analysis, and constellation-based modeling. Three of the techniques that performed well were incorporated into a unified traffic surveillance system for online classification of vehicles, which uses tracking results to improve the classification accuracy. To evaluate the accuracy of the system, 31 min of traffic video containing a multilane traffic intersection was processed. It was possible to achieve classification accuracy as high as 90.49% while classifying correctly tracked vehicles into four classes: cars, SUVs/vans, pickup trucks, and buses/semis. While processing a video, our system also recorded important traffic parameters such as the appearance, speed, and trajectory of each vehicle. This information was later used in a search assistant tool to find interesting traffic events.

  16. Signal detection to identify serious adverse events (neuropsychiatric events) in travelers taking mefloquine for chemoprophylaxis of malaria

    PubMed Central

    Naing, Cho; Aung, Kyan; Ahmed, Syed Imran; Mak, Joon Wah

    2012-01-01

    Background For all medications, there is a trade-off between benefits and potential for harm. It is important for patient safety to detect drug-event combinations and analyze them by appropriate statistical methods. Mefloquine is used as chemoprophylaxis for travelers going to regions with known chloroquine-resistant Plasmodium falciparum malaria. As such, there is a concern about serious adverse events associated with mefloquine chemoprophylaxis. The objective of the present study was to assess whether any signal would be detected for the serious adverse events of mefloquine, based on data in clinicoepidemiological studies. Materials and methods We extracted data on adverse events related to mefloquine chemoprophylaxis from two published datasets. Disproportionality reporting of adverse events such as neuropsychiatric events and other adverse events was presented in a 2 × 2 contingency table. A reporting odds ratio (ROR) and corresponding 95% confidence interval (CI) data-mining algorithm was applied for signal detection. Safety signals are considered significant when the ROR estimate and the lower limit of the corresponding 95% CI are ≥2. Results Two datasets addressing adverse events of mefloquine chemoprophylaxis (one from a published article and one from a Cochrane systematic review) were included for analysis. The reporting odds ratio was 1.58 (95% CI: 1.49–1.68) based on published data in the selected article, and 1.195 (95% CI: 0.94–1.44) based on data in the selected Cochrane review. Overall, in both datasets, the lower bounds of the 95% CIs of the reporting odds ratios were less than 2. Conclusion Based on available data, the findings suggest that signals for serious adverse events pertinent to neuropsychiatric events were not detected for mefloquine. Further studies are needed to substantiate this. PMID:22936859
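
    The reporting odds ratio and its confidence interval follow directly from the 2 × 2 contingency table. The sketch below uses illustrative counts only (not the study's data) and applies the standard log-scale standard error formula.

    ```python
    import math

    def reporting_odds_ratio(a, b, c, d, z=1.96):
        """ROR and 95% CI from a 2x2 table:
            a: neuropsychiatric events with mefloquine   b: other events with mefloquine
            c: neuropsychiatric events with comparator   d: other events with comparator
        """
        ror = (a / b) / (c / d)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(ror) - z * se_log)
        hi = math.exp(math.log(ror) + z * se_log)
        return ror, lo, hi

    # Illustrative counts only; a signal would require both the ROR and the
    # lower CI bound to be >= 2.
    print(reporting_odds_ratio(120, 880, 80, 920))
    ```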

  17. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept [add ref to TPC section] has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g. neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation and, due to very low signal rates, a TPC with the largest active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the fieldcage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant and the effect improves with density. The liquid phase TPC can obtain a high density at low pressure which results in very good self-shielding and compact installation with a lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, lower energy resolution (e

  18. Transient Evoked Potential in a Critical Event Detection Task.

    DTIC Science & Technology

    1984-02-01

    "Vigilance and Discrimination: A Reassessment," Science, 164:326-328, 1969. Fabiani, Monica, and others, "Individual Differences in the von Restorff..." The excerpt implies that events which elicit a P300 are more likely to be remembered than events which do not invoke a P300 (15:507-510).

  19. Detection Method of Three Events to Window and Key Using Light Sensor for Crime Prevention

    NASA Astrophysics Data System (ADS)

    Yamawaki, Akira; Katakami, Takayuki; Kitazono, Yuhki; Serikawa, Seiichi

    Conventionally, the three events involving a window and its key that occur when a thief attempts to intrude into a house are detected by different sensors. This paper proposes a method for detecting the three events using a simple light sensor consisting of an infrared LED and a photodiode. In the experiments, the light sensor shows distinct response tendencies that allow each event to be detected. This indicates that our proposal can realize a sensor module more efficiently than using different sensors.

  20. Migration Based Event Detection and Automatic P- and S-Phase Picking in Hengill, Southwest Iceland

    NASA Astrophysics Data System (ADS)

    Wagner, F.; Tryggvason, A.; Gudmundsson, O.; Roberts, R.; Bodvarsson, R.; Fehler, M.

    2015-12-01

    Automatic detection of seismic events is a complicated process. Common procedures depend on the detection of seismic phases (e.g. P and S) in single-trace analyses and their correct association with locatable point sources. The event detection threshold is thus directly related to the single-trace detection threshold. Highly sensitive phase detectors detect low signal-to-noise ratio (S/N) phases but also produce a low percentage of locatable events. Short inter-event times of only a few seconds, which are not uncommon during seismic or volcanic crises, complicate any event association algorithm. We present an event detection algorithm based on seismic migration of trace attributes into an a-priori three-dimensional (3D) velocity model. We evaluate its capacity as an automatic detector compared to conventional methods. Detecting events using seismic migration removes the need for phase association. The event detector runs on a stack of time-shifted traces, which increases S/N and thus allows for a low detection threshold. Detected events come with an origin time and a location estimate, enabling a focused trace analysis, including P- and S-phase recognition, to discard false detections and build a basis for accurate automatic phase picking. We apply the migration-based detection algorithm to data from a semi-permanent seismic network at Hengill, an active volcanic region with several geothermal production sites in southwest Iceland. The network includes 26 stations with inter-station distances down to 5 km. Results show a high success rate compared to the manually picked catalogue (up to 90% detected). New detections, which were missed by the standard detection routine, show a generally good ratio of true to false alarms, i.e. most of the new events are locatable.
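
    The migration-and-stack idea can be sketched as follows, under the assumption that station characteristic functions and grid-point travel times (from the a-priori 3-D model) are already available; the array sizes, threshold, and random inputs are placeholders rather than the Hengill configuration.

    ```python
    import numpy as np

    def migrate_and_detect(cf, travel_times, dt, threshold):
        """Stack station characteristic functions along moveout curves.

        cf           : (n_stations, n_samples) characteristic functions (e.g. envelopes)
        travel_times : (n_gridpoints, n_stations) predicted travel times, in seconds,
                       from an a-priori 3-D velocity model
        Returns candidate (origin_index, grid_index) pairs whose stack exceeds threshold.
        """
        n_sta, n_samp = cf.shape
        shifts = np.round(travel_times / dt).astype(int)          # shifts in samples
        detections = []
        for g in range(travel_times.shape[0]):
            stack = np.zeros(n_samp)
            for s in range(n_sta):
                stack[: n_samp - shifts[g, s]] += cf[s, shifts[g, s]:]
            hits = np.where(stack > threshold)[0]
            detections.extend((int(t0), g) for t0 in hits)
        return detections

    # Toy usage with random data; real inputs would be STA/LTA-like traces and
    # travel times computed in a 3-D velocity model.
    cf = np.abs(np.random.randn(26, 6000))
    tt = np.random.uniform(0.5, 5.0, size=(100, 26))
    print(len(migrate_and_detect(cf, tt, dt=0.01, threshold=60)))
    ```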

  1. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the current most challenging topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploration of the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes.

  2. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Feeding the information displayed to the experimental subjects into the event detection model allows comparison of the model's performance with the performance of the subjects.
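
    A minimal sketch of the discriminant-analysis step, assuming hypothetical display-derived features: the fitted model returns, for a new observation, the probability that an event has occurred, which is the quantity the model assumes the human generates.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Hypothetical display features for each monitored process (e.g., recent
    # sample mean, trend, and deviation of the displayed signal).
    rng = np.random.default_rng(0)
    features_no_event = rng.normal(0.0, 1.0, size=(500, 3))
    features_event = rng.normal(1.0, 1.0, size=(500, 3))
    X = np.vstack([features_no_event, features_event])
    y = np.r_[np.zeros(500), np.ones(500)]

    # Discriminant analysis yields, for a new observation, the probability that
    # an event has occurred -- the quantity the model assumes the human generates.
    lda = LinearDiscriminantAnalysis().fit(X, y)
    print(lda.predict_proba([[1.2, 0.8, 1.1]])[0, 1])
    ```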

  3. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.

  4. Detection of severe weather events using new remote sensing methods

    NASA Astrophysics Data System (ADS)

    Choy, S.; Wang, C.; Zhang, K.; Kuleshov, Y.

    2012-04-01

    The potential of using ground- and space-based Global Positioning System (GPS) observations for studying severe weather events is presented using the March 2010 Melbourne storm as a case study. This event generated record rainfall and flash flooding across the state of Victoria, Australia. The Victorian GPSnet is the only state-wide, and the densest, ground-based GPS infrastructure in Australia. This provides a unique opportunity to study the spatial and temporal variations in precipitable water vapour (PWV) as the storm passed over the network. The results show strong spatial and temporal correlation between variations of the ground-based GPS-PWV estimates and the thunderstorm passage. This finding demonstrates that ground-based GPS techniques can supplement conventional meteorological observations in studying, monitoring, and potentially predicting severe weather events. The advantage of the ground-based GPS technique is that it is capable of providing continuous observation of the storm passage with high temporal resolution, while the spatial resolution of the distribution of water vapour depends on the geographical location and density of the GPS stations. The results from the space-based GPS radio occultation (RO) technique, on the other hand, are not as robust. Although GPS RO can capture the dynamics of the atmosphere with high vertical resolution, its limited geographical coverage in a local region and its temporal resolution over a short period of time raise an important question about its potential for monitoring severe weather events, particularly local thunderstorms which have a relatively short life-span. The GPS RO technique will be more suitable for long-term climatology studies over a large area. It is anticipated that the findings from this study will encourage further research into using the GPS meteorology technique for monitoring and forecasting severe weather events in the Australian region.

  5. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    The available abstract text consists of fragments of Simple Event Correlator (SEC) rules adapted from the SANS whitepaper "Detecting Attacks on Web Applications from Log Files": rules of type=Single with continue=TakeNext and ptype=RegExp that look for image tags in web application logs and, via a shellcmd action invoking the sec-2.5.3 syslogclient, emit a synthetic "xss detected in image tag" event alongside the raw log line.
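
    Since the rules themselves are only partially recoverable, the sketch below mimics the described correlation in Python rather than in SEC's own rule syntax: a regular expression flags web-log requests whose parameters embed an image tag and a synthetic event string is emitted, roughly in the spirit of the Single/RegExp-to-shellcmd rule referenced above. The regex and log format are illustrative assumptions.

    ```python
    import re

    # Regex loosely in the spirit of the SANS examples referenced above: flag
    # requests whose parameters embed an <img ...> tag (a common reflected-XSS probe).
    IMG_TAG = re.compile(r"GET\s+(\S*?)\?(\S*<img\b[^>]*>)", re.IGNORECASE)

    def correlate(log_lines):
        """Emit a synthetic 'xss detected in image tag' event for matching lines,
        roughly mirroring the Single/RegExp -> shellcmd rule in the excerpt."""
        for line in log_lines:
            m = IMG_TAG.search(line)
            if m:
                yield f"Synthetic: xss detected in image tag: {m.group(2)} (uri {m.group(1)})"

    sample = ['10.0.0.1 - - "GET /search?q=<img src=x onerror=alert(1)> HTTP/1.1" 200 512']
    print(list(correlate(sample)))
    ```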

  6. Contamination event detection using multiple types of conventional water quality sensors in source water.

    PubMed

    Liu, Shuming; Che, Han; Smith, Kate; Chen, Lei

    2014-08-01

    Early warning systems are often used to detect deliberate and accidental contamination events in a water system. Conventional methods normally detect a contamination event by comparing the predicted and observed water quality values from one sensor. This paper proposes a new method for event detection by exploring the correlative relationships between multiple types of conventional water quality sensors. The performance of the proposed method was evaluated using data from contaminant injection experiments in a laboratory. Results from these experiments demonstrated the correlative responses of multiple types of sensors. It was observed that the proposed method could detect a contamination event 9 minutes after the introduction of lead nitrate solution with a concentration of 0.01 mg L(-1). The proposed method employs three parameters. Their impact on the detection performance was also analyzed. The initial analysis showed that the correlative response is contaminant-specific, which implies that it can be utilized not only for contamination detection, but also for contaminant identification.

  7. Power System Extreme Event Detection: The VulnerabilityFrontier

    SciTech Connect

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We present the usefulness of this analysis with a complete analysis of a 30 bus system, and present results for larger systems.

  8. Data-mining-based detection of adverse drug events.

    PubMed

    Chazard, Emmanuel; Preda, Cristian; Merlin, Béatrice; Ficheur, Grégoire; Beuscart, Régis

    2009-01-01

    Every year adverse drug events (ADEs) are known to be responsible for 98,000 deaths in the USA. Classical methods rely on report statements, expert knowledge, and staff operated record review. One of our objectives, in the PSIP project framework, is to use data mining (e.g., decision trees) to electronically identify situations leading to risk of ADEs. 10,500 hospitalization records from Denmark and France were used. 500 rules were automatically obtained, which are currently being validated by experts. A decision support system to prevent ADEs is then to be developed. The article examines a decision tree and the rules in the field of vitamin K antagonists.
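
    A hedged sketch of the decision-tree step: hypothetical encoded hospitalization records stand in for the PSIP data, and each root-to-leaf path of the fitted tree corresponds to a candidate rule of the kind the authors submit to expert validation.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical encoded hospitalization records: columns might stand for lab
    # values, drug exposures, or administrative codes; the label marks ADE cases.
    X, y = make_classification(n_samples=1000, n_features=8, n_informative=4, random_state=1)

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=25, random_state=1)
    tree.fit(X, y)

    # Each root-to-leaf path is a candidate rule of the kind submitted to the
    # expert validation step described above.
    print(export_text(tree, feature_names=[f"var_{i}" for i in range(8)]))
    ```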

  9. Nonthreshold-based event detection for 3d environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches for event detection are mainly based on some predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds but some complex pattern in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events through matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach on detecting events of complex phenomena from real-life records.

  10. Neuro-evolutionary event detection technique for downhole microseismic surveys

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam; Salehi, Iraj

    2016-01-01

    Recent years have seen a significant increase in borehole microseismic data acquisition programs associated with unconventional reservoir developments such as hydraulic fracturing programs for shale oil and gas. The data so acquired are used for hydraulic fracture monitoring and diagnostics and, therefore, the quality of the data in terms of resolution and accuracy has a significant impact on their value to the industry. Borehole microseismic data acquired in such environments typically suffer from propagation effects due to the presence of thin interbedded shale layers as well as noise and interference effects. Moreover, acquisition geometry has a significant impact on detectability across portions of the sensor array. Our work focuses on developing a robust first arrival detection and pick selection workflow for both P and S waves specifically designed for such environments. We introduce a novel workflow for refinement of picks with immunity towards significant noise artifacts and applicability to data with very low signal-to-noise ratio, provided some accurate picks have already been made. This workflow utilizes a multi-step hybrid detection and classification routine which makes use of a neural network based autopicker for initial picking and an evolutionary algorithm for pick refinement. We highlight the results from an actual field case study, including multiple examples demonstrating immunity towards noise, and compare the effectiveness of the workflow with two contemporary autopicking routines without the application of the shared detection/refinement procedure. Finally, we use a windowed waveform cross-correlation based uncertainty estimation method for potential quality control purposes. While the workflow was developed to work with the neural network based autopicker, it can be used with any other traditional autopicker and provides significant improvements in pick detection across seismic gathers.

  11. Spatial-temporal event detection in climate parameter imagery.

    SciTech Connect

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the Earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and Southern India.

  12. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    SciTech Connect

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.

  13. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short-range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long-range (power-law) correlations and non-Gaussian distributions of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. In practice, the local flux-gradient relationship is greatly preferred because of its clearer physical meaning, which allows direct comparison with experimental data, and, especially, because of its smaller computational cost in numerical models. In particular, the linearity of this relationship allows a transport coefficient (e.g., turbulent diffusivity) to be defined, and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model depends strongly on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  14. Simple Movement and Detection in Discrete Event Simulation

    DTIC Science & Technology

    2005-12-01

    with a description of uniform linear motion in the following section. We will then consider the simplest kind of sensing, the "cookie-cutter." A cookie-cutter sensor sees everything that is within its range R, and must be notified at the precise time a target enters its range. In a time-step simulation, cookie-cutter detection is very easy. Simply compute the distance between the sensor and the target at each time step. If the target is
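
    In an event-driven (rather than time-step) simulation, the notification time can be computed analytically for uniform linear motion by solving for when the sensor-target distance first equals R. The sketch below does exactly that; the 2-D geometry and units are illustrative assumptions.

    ```python
    import math

    def entry_time(rel_pos, rel_vel, R):
        """Earliest future time at which a uniformly moving target enters a
        cookie-cutter sensor's range R (None if it never does).

        rel_pos, rel_vel: target position/velocity relative to the sensor.
        Solves |rel_pos + t*rel_vel| = R for the smallest non-negative root.
        """
        px, py = rel_pos
        vx, vy = rel_vel
        a = vx * vx + vy * vy
        b = 2.0 * (px * vx + py * vy)
        c = px * px + py * py - R * R
        if c <= 0.0:
            return 0.0                      # already inside the range
        disc = b * b - 4.0 * a * c
        if a == 0.0 or disc < 0.0:
            return None                     # never comes within range
        t = (-b - math.sqrt(disc)) / (2.0 * a)
        return t if t >= 0.0 else None

    # A target 10 km east closing at 0.2 km/s enters a 3 km range at t = 35 s.
    print(entry_time((10.0, 0.0), (-0.2, 0.0), 3.0))
    ```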

  15. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.

  16. Model-Based Multifactor Dimensionality Reduction to detect epistasis for quantitative traits in the presence of error-free and noisy data.

    PubMed

    Mahachie John, Jestinah M; Van Lishout, François; Van Steen, Kristel

    2011-06-01

    Detecting gene-gene interactions or epistasis in studies of human complex diseases is a big challenge in the area of epidemiology. To address this problem, several methods have been developed, mainly in the context of data dimensionality reduction. One of these methods, Model-Based Multifactor Dimensionality Reduction, has so far mainly been applied to case-control studies. In this study, we evaluate the power of Model-Based Multifactor Dimensionality Reduction for quantitative traits to detect gene-gene interactions (epistasis) in the presence of error-free and noisy data. Considered sources of error are genotyping errors, missing genotypes, phenotypic mixtures and genetic heterogeneity. Our simulation study encompasses a variety of settings with varying minor allele frequencies and genetic variance for different epistasis models. On each simulated data, we have performed Model-Based Multifactor Dimensionality Reduction in two ways: with and without adjustment for main effects of (known) functional SNPs. In line with binary trait counterparts, our simulations show that the power is lowest in the presence of phenotypic mixtures or genetic heterogeneity compared to scenarios with missing genotypes or genotyping errors. In addition, empirical power estimates reduce even further with main effects corrections, but at the same time, false-positive percentages are reduced as well. In conclusion, phenotypic mixtures and genetic heterogeneity remain challenging for epistasis detection, and careful thought must be given to the way important lower-order effects are accounted for in the analysis.

  17. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers multiple properties that reflect an event's status, a composite event is more consistent with the physical world, which makes composite event research more realistic. In this paper, we analyze the characteristics of the composite event; then we propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. In the case where the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The fuzzy-decision-based composite event judgment mechanism retains the advantages of fuzzy-logic-based algorithms while requiring neither a large rule base nor high computational complexity. Compared to the CollECT and CDS algorithms, this algorithm improves the detection accuracy and reduces the traffic. PMID:25136690

  18. Detection of Upper Airway Status and Respiratory Events by a Current Generation Positive Airway Pressure Device

    PubMed Central

    Li, Qing Yun; Berry, Richard B.; Goetting, Mark G.; Staley, Bethany; Soto-Calderon, Haideliza; Tsai, Sheila C.; Jasko, Jeffrey G.; Pack, Allan I.; Kuna, Samuel T.

    2015-01-01

    Study Objectives: To compare a positive airway pressure (PAP) device's detection of respiratory events and airway status during device-detected apneas with events scored on simultaneous polysomnography (PSG). Design: Prospective PSGs of patients with sleep apnea using a new-generation PAP device. Settings: Four clinical and academic sleep centers. Patients: Forty-five patients with obstructive sleep apnea (OSA) and complex sleep apnea (Comp SA) performed a PSG on PAP levels adjusted to induce respiratory events. Interventions: None. Measurements and Results: PAP device data identifying the type of respiratory event and whether the airway during a device-detected apnea was open or obstructed were compared to time-synced, manually scored respiratory events on simultaneous PSG recording. Intraclass correlation coefficients between device-detected and PSG scored events were 0.854 for apnea-hypopnea index (AHI), 0.783 for apnea index, 0.252 for hypopnea index, and 0.098 for respiratory event-related arousals index. At a device AHI (AHIFlow) of 10 events/h, area under the receiver operating characteristic curve was 0.98, with sensitivity 0.92 and specificity 0.84. AHIFlow tended to overestimate AHI on PSG at values less than 10 events/h. The device detected that the airway was obstructed in 87.4% of manually scored obstructive apneas. Of the device-detected apneas with clear airway, a minority (15.8%) were manually scored as obstructive apneas. Conclusions: A device-detected apnea-hypopnea index (AHIFlow) < 10 events/h on a positive airway pressure device is strong evidence of good treatment efficacy. Device-detected airway status agrees closely with the presumed airway status during polysomnography scored events, but should not be equated with a specific type of respiratory event. Citation: Li QY, Berry RB, Goetting MG, Staley B, Soto-Calderon H, Tsai SC, Jasko JG, Pack AI, Kuna ST. Detection of upper airway status and respiratory events by a current generation positive

  19. Event-related complexity analysis and its application in the detection of facial attractiveness.

    PubMed

    Deng, Zhidong; Zhang, Zimu

    2014-11-01

    In this study, an event-related complexity (ERC) analysis method is proposed and used to explore the neural correlates of facial attractiveness detection in the context of a cognitive experiment. The ERC method gives a quantitative index for measuring the diverse brain activation properties that represent the neural correlates of event-related responses. This analysis reveals distinct effects of facial attractiveness processing and also provides further information that could not have been achieved from event-related potential alone.

  20. An adaptive fault-tolerant event detection scheme for wireless sensor networks.

    PubMed

    Yim, Sung-Jib; Choi, Yoon-Hwa

    2010-01-01

    In this paper, we present an adaptive fault-tolerant event detection scheme for wireless sensor networks. Each sensor node detects an event locally in a distributed manner by using the sensor readings of its neighboring nodes. Confidence levels of sensor nodes are used to dynamically adjust the threshold for decision making, resulting in consistent performance even with increasing number of faulty nodes. In addition, the scheme employs a moving average filter to tolerate most transient faults in sensor readings, reducing the effective fault probability. Only three bits of data are exchanged to reduce the communication overhead in detecting events. Simulation results show that event detection accuracy and false alarm rate are kept very high and low, respectively, even in the case where 50% of the sensor nodes are faulty.
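
    A simplified sketch of the per-node decision, assuming illustrative sensing thresholds and neighbour data: raw readings are smoothed with a moving average to suppress transient faults, and neighbours' votes are weighted by their confidence levels so that the effective decision threshold adapts as nodes become faulty. This is a stand-in for, not a reproduction of, the published scheme.

    ```python
    import numpy as np

    def local_event_decision(own_window, neighbor_flags, neighbor_conf, base_threshold=0.5):
        """Sketch of the adaptive, fault-tolerant decision at one sensor node.

        own_window     : recent raw readings of this node (moving-average filtered
                         to suppress transient faults)
        neighbor_flags : 0/1 event votes reported by neighbouring nodes
        neighbor_conf  : confidence level of each neighbour in [0, 1]
        """
        smoothed = np.mean(own_window)                    # moving average filter
        own_vote = 1.0 if smoothed > 30.0 else 0.0        # illustrative sensing threshold
        conf = np.asarray(neighbor_conf, dtype=float)
        votes = np.asarray(neighbor_flags, dtype=float)
        # Confidence-weighted majority; the threshold adapts as confidences change,
        # so faulty (low-confidence) nodes carry less weight.
        weighted = (own_vote + np.sum(conf * votes)) / (1.0 + np.sum(conf))
        return weighted >= base_threshold

    print(local_event_decision([31, 33, 29, 32], [1, 1, 0, 1], [0.9, 0.8, 0.3, 0.7]))
    ```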

  1. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136

  2. MCD for detection of event-based landslides

    NASA Astrophysics Data System (ADS)

    Mondini, A. C.; Chang, K.; Guzzetti, F.

    2011-12-01

    Landslides play an important role in the landscape evolution of mountainous terrain. They also present a socioeconomic problem in terms of risk to people and property. Landslide inventory maps are not available for many areas affected by slope instabilities, resulting in a lack of primary information for the comprehension of the phenomenon, the evaluation of landslide statistics, and civil protection operations on large scales. Traditional methods for the preparation of landslide inventory maps are based on the geomorphological interpretation of stereoscopic aerial photography and field surveys. These methods are expensive and time consuming. The exploitation of new remote sensing data, in particular very high resolution (VHR) satellite images, and new dedicated methods present an alternative to the traditional methods and are at the forefront of modern landslide research. Recent studies have shown the possibility of producing accurate landslide maps, reducing the time and resources required for their compilation and systematic update. This paper presents the Multiple Change Detection (MCD) technique, a new method that has shown promising results in landslide mapping. Through supervised or unsupervised classifiers, MCD combines different change detection metrics, such as change in the Normalized Difference Vegetation Index, spectral angle, principal component analysis, and independent component analysis, and applies them to a multi-temporal set of VHR satellite images to distinguish new landslides from stable areas. MCD has been applied with success in different geographical areas and with different satellite images, suggesting it is a reliable and robust technique. The technique can distinguish old from new landslides and capture runout features. Results of these case studies will be presented at the conference. Also to be presented are new developments of MCD involving the introduction of a priori information on landslide susceptibility within
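
    One of the change metrics combined by MCD, the NDVI difference between pre- and post-event images, can be sketched as follows; the reflectance values, the robust threshold rule, and the synthetic landslide patch are illustrative assumptions rather than MCD's actual classifier.

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index, one of the change metrics
        combined by MCD."""
        return (nir - red) / (nir + red + 1e-9)

    def ndvi_change_mask(pre, post, k=3.0):
        """Flag pixels whose NDVI drop after the event is anomalously large.

        pre, post: dicts with 'nir' and 'red' reflectance arrays for the
        co-registered pre- and post-event VHR images.
        """
        d = ndvi(post["nir"], post["red"]) - ndvi(pre["nir"], pre["red"])
        return d < (d.mean() - k * d.std())      # strong vegetation loss

    # Toy 100x100 scene with a synthetic "landslide" patch of bare soil.
    pre = {"nir": np.full((100, 100), 0.5), "red": np.full((100, 100), 0.1)}
    post = {"nir": pre["nir"].copy(), "red": pre["red"].copy()}
    post["nir"][40:60, 40:60], post["red"][40:60, 40:60] = 0.25, 0.2
    print(ndvi_change_mask(pre, post).sum())     # ~400 flagged pixels
    ```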

  3. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2017-01-01

    The geographical area where a seismic event of magnitude M ≥ Mt is detected by a seismic station network, for a defined probability, is derived from a station probability of detection estimated as a function of epicentral distance. The latter is determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. To simulate the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. An attempt is presented to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.

  4. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the developed approach is the use of a statistically oriented model for discrete choice prediction, estimated by the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies.
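
    The fusion step can be illustrated with a logit (discrete choice) model that combines the single-indicator alarm probabilities into one event probability. The per-indicator probabilities and labels below are synthetic placeholders, and plain logistic regression stands in for the jointly calibrated model described in the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-indicator outlier probabilities (pH, chlorine, turbidity, ...)
    # produced by single-indicator detectors; labels mark known events in the
    # training window.
    rng = np.random.default_rng(1)
    P_train = rng.uniform(0, 1, size=(500, 4))
    y_train = (P_train.mean(axis=1) + 0.1 * rng.standard_normal(500) > 0.6).astype(int)

    # The discrete-choice (logit) model learns how much weight each indicator's
    # alarm deserves instead of fusing them with fixed heuristics.
    fusion = LogisticRegression().fit(P_train, y_train)
    print(fusion.predict_proba([[0.9, 0.2, 0.8, 0.7]])[0, 1])   # fused event probability
    ```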

  5. Comparison of the STA/LTA and power spectral density methods for microseismic event detection

    NASA Astrophysics Data System (ADS)

    Vaezi, Yoones; Van der Baan, Mirko

    2015-12-01

    Robust event detection and picking is a prerequisite for reliable (micro-) seismic interpretations. Detection of weak events is a common challenge among various available event detection algorithms. In this paper we compare the performance of two event detection methods, the short-term average/long-term average (STA/LTA) method, which is the most commonly used technique in industry, and a newly introduced method that is based on the power spectral density (PSD) measurements. We have applied both techniques to a 1-hr long segment of the vertical component of some raw continuous data recorded at a borehole geophone in a hydraulic fracturing experiment. The PSD technique outperforms the STA/LTA technique by detecting a higher number of weak events while keeping the number of false alarms at a reasonable level. The time-frequency representations obtained through the PSD method can also help define a more suitable bandpass filter which is usually required for the STA/LTA method. The method offers thus much promise for automated event detection in industrial, local, regional and global seismological data sets.
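
    For reference, a bare-bones STA/LTA detector of the kind used as the industry baseline is sketched below; window lengths, the trigger threshold, and the toy trace are illustrative choices, and the PSD-based alternative is not reproduced here.

    ```python
    import numpy as np

    def sta_lta(trace, dt, sta_win=0.05, lta_win=1.0):
        """Classic STA/LTA characteristic function on a single trace."""
        x = trace ** 2                                    # energy
        nsta, nlta = int(sta_win / dt), int(lta_win / dt)
        sta = np.convolve(x, np.ones(nsta) / nsta, mode="same")
        lta = np.convolve(x, np.ones(nlta) / nlta, mode="same")
        return sta / np.maximum(lta, 1e-12)

    def trigger(ratio, on=4.0):
        """Return sample indices where the ratio first crosses the 'on' threshold."""
        above = ratio > on
        return np.where(above & ~np.roll(above, 1))[0]

    # Toy trace: noise with a weak transient at 3 s (sampling at 1 kHz).
    dt = 0.001
    trace = 0.1 * np.random.randn(10000)
    trace[3000:3200] += np.sin(2 * np.pi * 40 * np.arange(200) * dt)
    print(trigger(sta_lta(trace, dt)))
    ```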

  6. Probabilistic approaches to fault detection in networked discrete event systems.

    PubMed

    Athanasopoulou, Eleftheria; Hadjicostis, Christoforos N

    2005-09-01

    In this paper, we consider distributed systems that can be modeled as finite state machines with known behavior under fault-free conditions, and we study the detection of a general class of faults that manifest themselves as permanent changes in the next-state transition functionality of the system. This scenario could arise in a variety of situations encountered in communication networks, including faults occurred due to design or implementation errors during the execution of communication protocols. In our approach, fault diagnosis is performed by an external observer/diagnoser that functions as a finite state machine and which has access to the input sequence applied to the system but has only limited access to the system state or output. In particular, we assume that the observer/diagnoser is only able to obtain partial information regarding the state of the given system at intermittent time intervals that are determined by certain synchronizing conditions between the system and the observer/diagnoser. By adopting a probabilistic framework, we analyze ways to optimally choose these synchronizing conditions and develop adaptive strategies that achieve a low probability of aliasing, i.e., a low probability that the external observer/diagnoser incorrectly declares the system as fault-free. An application of these ideas in the context of protocol testing/classification is provided as an example.

  7. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

    One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection corresponding to approximately 1-10 copies of the rice haploid genome. In this quantitative PCR assay, the squared regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method, and the results suggest it could be used routinely to identify these five GM rice events.

  8. Detecting cardiac events - state-of-the-art.

    PubMed

    Collinson, Paul

    2015-11-01

    Cardiac biomarker measurement currently addresses two key questions in patient management: the differential diagnosis of chest pain and the differential diagnosis of the patient with breathlessness. There are currently three major themes in the strategies for the differential diagnosis of chest pain. The first is to undertake troponin measurement in patients selected to be at lower risk, hence to have a low prior probability of disease. The second is the introduction of high-sensitivity cardiac troponin (hs cTn) assays into routine clinical use with measurement at 0 and 3 h from admission. Two other approaches that utilize the diagnostic characteristics of these assays have also been suggested. The first is to use the limit of detection or limit of blank of the assay as the diagnostic discriminant. The second approach is to use the low imprecision of the assay within the reference interval and combine a discriminant value with an absolute rate of change (delta value). The third is the use of additional biomarkers to allow early discharge from the emergency department. The concept is to measure high-sensitivity cardiac troponin plus the extra marker on admission. The role of measurement of B-type natriuretic peptide or its N-terminal prohormone, N-terminal pro-B-type natriuretic peptide, has been accepted and incorporated into guidelines for chronic heart failure for some time. More recently, guidelines for acute heart failure have also come to recommend a single measurement of B-type natriuretic peptide or N-terminal pro-B-type natriuretic peptide in people presenting with new suspected acute heart failure.

  9. Supervisory control design based on hybrid systems and fuzzy events detection. Application to an oxichlorination reactor.

    PubMed

    Altamiranda, Edmary; Torres, Horacio; Colina, Eliezer; Chacón, Edgar

    2002-10-01

    This paper presents a supervisory control scheme based on hybrid systems theory and fuzzy events detection. The fuzzy event detector is a linguistic model, which synthesizes complex relations between process variables and process events incorporating experts' knowledge about the process operation. This kind of detection allows the anticipation of appropriate control actions, which depend upon the selected membership functions used to characterize the process under scrutiny. The proposed supervisory control scheme was successfully implemented for an oxichlorination reactor in a vinyl monomer plant. This implementation has allowed improvement of reactor stability and reduction of raw material consumption.

  10. Real-time detection and classification of anomalous events in streaming data

    DOEpatents

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
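
    As a rough, hedged illustration of scoring streamed events by anomalousness (the patent discloses no code, and the Gaussian baseline below is purely an assumption), one can fit a probability model to historical feature vectors and report the negative log-probability of each incoming event, so that low-probability events receive high scores.

```python
# Hedged sketch: score streaming events by negative log-likelihood under a
# Gaussian fitted to historical traffic features (an assumed, simplified model).
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
history = rng.normal(loc=[100.0, 0.2], scale=[10.0, 0.05], size=(1000, 2))

mean = history.mean(axis=0)
cov = np.cov(history, rowvar=False)
model = multivariate_normal(mean=mean, cov=cov)

def anomaly_score(event_features):
    """Higher score = less probable under the baseline model."""
    return -model.logpdf(event_features)

for event in [[102.0, 0.21], [250.0, 0.9]]:   # typical vs. atypical event
    print(event, round(anomaly_score(np.array(event)), 2))
```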

  11. Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.

    PubMed

    Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng

    2016-12-08

    This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control systems (NCSs) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve the efficient utilization of the communication network bandwidth. Both the sensor-to-control station and the control station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous design of the FDF and controller.
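
    A generic event-triggering condition of the kind referred to above can be sketched as follows. The relative-error trigger and the simple adaptation rule below are common illustrative forms, not the paper's adaptively adjusted mechanism, which also incorporates the fault occurrence probability.

```python
# Generic sketch of an adaptively adjusted event trigger: the sensor transmits
# only when the deviation from the last transmitted state exceeds a fraction
# sigma of the current state norm; sigma is adapted online (illustrative rule).
import numpy as np

rng = np.random.default_rng(9)
sigma, sigma_min, sigma_max = 0.1, 0.02, 0.5
x_last_sent = None
transmissions = 0

x = np.array([1.0, -0.5])
A = np.array([[0.95, 0.1], [0.0, 0.9]])          # toy stable plant dynamics
for k in range(100):
    x = A @ x + rng.normal(0, 0.01, size=2)       # process noise

    error = np.inf if x_last_sent is None else np.linalg.norm(x - x_last_sent)
    if error > sigma * np.linalg.norm(x):
        x_last_sent = x.copy()                    # event: send state over the network
        transmissions += 1
        sigma = min(sigma_max, sigma * 1.05)      # recently active: relax trigger
    else:
        sigma = max(sigma_min, sigma * 0.99)      # quiet period: tighten trigger

print(f"{transmissions} transmissions out of 100 steps, final sigma = {sigma:.3f}")
```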

  12. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data.

    PubMed

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2016-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems.
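
    A minimal sketch of the first step described above, converting a numeric time series into a time-interval sequence of value abstractions, is shown below; the thresholds and state labels are hypothetical, and the pattern-mining machinery itself is not reproduced.

```python
# Sketch of temporal abstraction: map a lab-value series into time intervals
# labelled Low / Normal / High (thresholds below are illustrative only).
from typing import List, Tuple

def abstract_series(times: List[float], values: List[float],
                    low: float, high: float) -> List[Tuple[str, float, float]]:
    """Return (state, start_time, end_time) intervals for consecutive samples
    that share the same abstraction state."""
    def state(v: float) -> str:
        return "Low" if v < low else ("High" if v > high else "Normal")

    intervals = []
    start = times[0]
    current = state(values[0])
    for t, v in zip(times[1:], values[1:]):
        s = state(v)
        if s != current:                       # close the running interval
            intervals.append((current, start, t))
            start, current = t, s
    intervals.append((current, start, times[-1]))
    return intervals

# Example: hypothetical creatinine measurements (mg/dL) over five days.
print(abstract_series([0, 1, 2, 3, 4], [0.9, 1.0, 1.6, 1.8, 1.1],
                      low=0.7, high=1.3))
```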

  13. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800

  14. Minimal elastographic modeling of breast cancer for model based tumor detection in a digital image elasto tomography (DIET) system

    NASA Astrophysics Data System (ADS)

    Lotz, Thomas F.; Muller, Natalie; Hann, Christopher E.; Chase, J. Geoffrey

    2011-03-01

    Digital Image Elasto Tomography (DIET) is a non-invasive breast cancer screening technology that images the surface motion of a breast under harmonic mechanical actuation. A new approach capturing the dynamics and characteristics of tumor behavior is presented. A simple mechanical model of the breast is used to identify a transfer function relating the input harmonic actuation to the output surface displacements using imaging data of a silicone phantom. Areas of higher stiffness cause significant changes in damping and resonant frequencies, as seen in the resulting Bode plots. A case study on a healthy and a tumor silicone breast phantom shows the potential of this model-based method to clearly distinguish cancerous from healthy tissue and to correctly predict the tumor position.

  15. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    NASA Astrophysics Data System (ADS)

    Alese Semmens, Kathryn; Ramage, Joan; Bartsch, Annett; Liston, Glen E.

    2013-03-01

    High latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite detected events were validated with ground station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and weather station data. All melt events coincided with above freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change.

  16. Using machine learning to detect events in eye-tracking data.

    PubMed

    Zemblys, Raimondas; Niehorster, Diederick C; Komogortsev, Oleg; Holmqvist, Kenneth

    2017-02-23

    Event detection is a challenging stage in eye movement data analysis. A major drawback of current event detection methods is that parameters have to be adjusted based on eye movement data quality. Here we show that a fully automated classification of raw gaze samples as belonging to fixations, saccades, or other oculomotor events can be achieved using a machine-learning approach. Any already manually or algorithmically detected events can be used to train a classifier to produce similar classification of other data without the need for a user to set parameters. In this study, we explore the application of the random forest machine-learning technique for the detection of fixations, saccades, and post-saccadic oscillations (PSOs). To show the practical utility of the proposed method in applications that employ eye movement classification algorithms, we provide an example where the method is employed in an eye movement-driven biometric application. We conclude that machine-learning techniques lead to superior detection compared to current state-of-the-art event detection algorithms and can reach the performance of manual coding.
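
    As a hedged illustration of the general idea (not the authors' feature set or implementation), a random forest can be trained on per-sample gaze features such as velocity and acceleration using already-labelled data, and then used to classify new samples as fixation, saccade, or PSO:

```python
# Minimal sketch: classify gaze samples with a random forest trained on
# already-labelled data (features and labels here are synthetic placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 3000
velocity = rng.gamma(shape=2.0, scale=20.0, size=n)       # deg/s (synthetic)
acceleration = rng.normal(0.0, 300.0, size=n)             # deg/s^2 (synthetic)
X = np.column_stack([velocity, np.abs(acceleration)])
# Synthetic labels: 0 = fixation, 1 = saccade, 2 = PSO (toy rule, for demo only).
y = np.where(velocity > 80, 1, np.where(np.abs(acceleration) > 600, 2, 0))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```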

  17. Event Detection and Visualization of Ocean Eddies based on SSH and Velocity Field

    NASA Astrophysics Data System (ADS)

    Matsuoka, Daisuke; Araki, Fumiaki; Inoue, Yumi; Sasaki, Hideharu

    2016-04-01

    Numerical studies of ocean eddies have progressed using high-resolution ocean general circulation models. To understand ocean eddies from simulation results containing large volumes of information, it is necessary to visualize not only the distribution of eddies at each time step but also eddy events and phenomena. However, previous methods cannot precisely detect eddies, especially during events such as eddy amalgamation and bifurcation. In the present study, we propose a new approach to eddy detection, tracking and event visualization based on sea surface height (SSH) and the velocity field. The proposed method detects eddy regions as well as stream and current regions, and classifies the detected eddies into several types. By tracking the time-varying changes of the classified eddies, it is possible to detect not only eddy events such as amalgamation and bifurcation but also interactions between eddies and ocean currents. By visualizing the detected eddies and events, we produced a movie that enables an intuitive understanding of the regions of interest.

  18. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  19. Optimizing Biosurveillance Systems that Use Threshold-based Event Detection Methods

    DTIC Science & Technology

    2009-06-01

    Fricker, Ronald D., Jr.; Banschbach, David. We describe a methodology for optimizing a threshold detection-based biosurveillance system. The goal is to maximize the system-wide probability of ... Using this approach, public health officials can "tune" their biosurveillance systems to optimally detect various threats, thereby allowing ...

  20. Cooperative object tracking and composite event detection with wireless embedded smart cameras.

    PubMed

    Wang, Youlu; Velipasalar, Senem; Casares, Mauricio

    2010-10-01

    Embedded smart cameras have limited processing power, memory, energy, and bandwidth. Thus, many system- and algorithm-wise challenges remain to be addressed to have operational, battery-powered wireless smart-camera networks. We present a wireless embedded smart-camera system for cooperative object tracking and detection of composite events spanning multiple camera views. Each camera is a CITRIC mote consisting of a camera board and wireless mote. Lightweight and robust foreground detection and tracking algorithms are implemented on the camera boards. Cameras exchange small-sized data wirelessly in a peer-to-peer manner. Instead of transferring or saving every frame or trajectory, events of interest are detected. Simpler events are combined in a time sequence to define semantically higher-level events. Event complexity can be increased by increasing the number of primitives and/or number of camera views they span. Examples of consistently tracking objects across different cameras, updating location of occluded/lost objects from other cameras, and detecting composite events spanning two or three camera views, are presented. All the processing is performed on camera boards. Operating current plots of smart cameras, obtained when performing different tasks, are also presented. Power consumption is analyzed based upon these measurements.

  1. Adverse event detection in drug development: recommendations and obligations beyond phase 3.

    PubMed

    Berlin, Jesse A; Glasser, Susan C; Ellenberg, Susan S

    2008-08-01

    Premarketing studies of drugs, although large enough to demonstrate efficacy and detect common adverse events, cannot reliably detect an increased incidence of rare adverse events or events with significant latency. For most drugs, only about 500 to 3000 participants are studied, for relatively short durations, before a drug is marketed. Systems for assessment of postmarketing adverse events include spontaneous reports, computerized claims or medical record databases, and formal postmarketing studies. We briefly review the strengths and limitations of each. Postmarketing surveillance is essential for developing a full understanding of the balance between benefits and adverse effects. More work is needed in analysis of data from spontaneous reports of adverse effects and automated databases, design of ad hoc studies, and design of economically feasible large randomized studies.

  2. A model-based information sharing protocol for profile Hidden Markov Models used for HIV-1 recombination detection

    PubMed Central

    2014-01-01

    Background In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatical tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, i.e., HIV-1 subtypes for which only a small number of sequences is known. Results To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regards to the biological characteristics of HI viruses. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. Conclusions We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning. PMID:24946781

  3. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data

  4. Detection of bubble nucleation event in superheated drop detector by the pressure sensor

    NASA Astrophysics Data System (ADS)

    Das, Mala; Biswas, Nilanjan

    2017-01-01

    The superheated drop detector, consisting of drops of superheated liquid suspended in a polymer or gel matrix, is in great demand, mainly because of its insensitivity to β-particles and γ-rays and also because of its low cost. The bubble nucleation event is detected by measuring the acoustic shock wave released during the nucleation process. The present work demonstrates the detection of bubble nucleation events using a pressure sensor. The associated measurement circuits are described in this article. The detections are verified by measuring the same events with an acoustic sensor. The measurement was done using drops of various sizes to study the effect of drop size on the pressure recovery time. The probability of detecting events increased for larger superheated drops and for smaller volumes of air in contact with the gel matrix. Exponential decay fits to the pressure sensor signals show the dead time for pressure recovery of such a drop detector to be a few microseconds.
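
    The exponential decay fit mentioned above can be sketched as follows, using synthetic data and illustrative parameter values rather than the authors' recorded signals; the fitted time constant provides an estimate of the pressure recovery (dead) time.

```python
# Sketch: fit an exponential decay to a pressure-sensor transient to estimate
# the recovery time constant (all signal values below are synthetic).
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, offset):
    return amplitude * np.exp(-t / tau) + offset

t = np.linspace(0.0, 50e-6, 200)                       # 50 microseconds of samples
true_tau = 6e-6
signal = decay(t, 1.0, true_tau, 0.02)
signal += np.random.default_rng(4).normal(0, 0.01, t.size)   # sensor noise

popt, _ = curve_fit(decay, t, signal, p0=(1.0, 5e-6, 0.0))
print(f"estimated recovery time constant: {popt[1] * 1e6:.2f} microseconds")
```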

  5. Automated detection of apnea/hypopnea events in healthy children polysomnograms: preliminary results.

    PubMed

    Held, Claudio M; Causa, Leonardo; Jaillet, Fabrice; Chamorro, Rodrigo; Garrido, Marcelo; Algarin, Cecilia; Peirano, Patricio

    2013-01-01

    A methodology to detect sleep apnea/hypopnea events in the respiratory signals of polysomnographic recordings is presented. It applies empirical mode decomposition (EMD), Hilbert-Huang transform (HHT), fuzzy logic and signal preprocessing techniques for feature extraction, expert criteria and context analysis. EMD, HHT and fuzzy logic are used for artifact detection and preliminary detection of respiration signal zones with significant variations in the amplitude of the signal; feature extraction, expert criteria and context analysis are used to characterize and validate the respiratory events. An annotated database of 30 all-night polysomnographic recordings, acquired from 30 healthy ten-year-old children, was divided into a training set of 15 recordings (485 sleep apnea/hypopnea events), a validation set of five recordings (109 sleep apnea/hypopnea events), and a testing set of ten recordings (281 sleep apnea/hypopnea events). The overall detection performance on the testing data set was 89.7% sensitivity and 16.3% false-positive rate. The next step is to include discrimination among apneas, hypopneas and respiratory pauses.

  6. Field testing of component-level model-based fault detection methods for mixing boxes and VAV fan systems

    SciTech Connect

    Xu, Peng; Haves, Philip

    2002-05-16

    An automated fault detection and diagnosis tool for HVAC systems is being developed, based on an integrated, life-cycle approach to commissioning and performance monitoring. The tool uses component-level HVAC equipment models implemented in the SPARK equation-based simulation environment. The models are configured using design information and component manufacturers' data and then fine-tuned to match the actual performance of the equipment by using data measured during functional tests of the sort used in commissioning. This paper presents the results of field tests of mixing box and VAV fan system models in an experimental facility and a commercial office building. The models were found to be capable of representing the performance of correctly operating mixing box and VAV fan systems and of detecting several types of incorrect operation.

  7. Development of a Physical Model-Based Algorithm for the Detection of Single-Nucleotide Substitutions by Using Tiling Microarrays

    PubMed Central

    Ono, Naoaki; Suzuki, Shingo; Furusawa, Chikara; Shimizu, Hiroshi; Yomo, Tetsuya

    2013-01-01

    High-density DNA microarrays are useful tools for analyzing sequence changes in DNA samples. Although microarray analysis provides informative signals from a large number of probes, the analysis and interpretation of these signals have certain inherent limitations, namely, complex dependency of signals on the probe sequences and the existence of false signals arising from non-specific binding between probe and target. In this study, we have developed a novel algorithm to detect the single-base substitutions by using microarray data based on a thermodynamic model of hybridization. We modified the thermodynamic model by introducing a penalty for mismatches that represent the effects of substitutions on hybridization affinity. This penalty results in significantly higher detection accuracy than other methods, indicating that the incorporation of hybridization free energy can improve the analysis of sequence variants by using microarray data. PMID:23382915

  8. Model-based waveform design for optimal detection: A multi-objective approach to dealing with incomplete a priori knowledge.

    PubMed

    Hamschin, Brandon M; Loughlin, Patrick J

    2015-11-01

    This work considers the design of optimal, energy-constrained transmit signals for active sensing for the case when the designer has incomplete or uncertain knowledge of the target and/or environment. The mathematical formulation is that of a multi-objective optimization problem, wherein one can incorporate a plurality of potential targets, interference, or clutter models and in doing so take advantage of the wide range of results in the literature related to modeling each. It is shown, via simulation, that when the objective function of the optimization problem is chosen to maximize the minimum (i.e., maxmin) probability of detection among all possible model combinations, the optimal waveforms obtained are advantageous. The advantage arises because the maxmin waveforms judiciously allocate energy to spectral regions where each of the target models responds strongly and where the environmental models cause minimal degradation in detection performance. In particular, improved detection performance is shown compared to linear frequency modulated transmit signals and compared to signals designed with the wrong target spectrum assumed. Additionally, it is shown that the maxmin design yields performance comparable to an optimal design matched to the correct target/environmental model. Finally, it is proven that the maxmin problem formulation is convex.
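
    As a simplified, hedged illustration of the maxmin idea (not the paper's formulation, which maximizes detection probability directly and proves convexity), one can allocate a fixed transmit energy budget across frequency bins so as to maximize the worst-case response over a set of candidate target/environment models; with a linear SNR-like surrogate this reduces to a small linear program.

```python
# Simplified maxmin sketch: choose per-bin transmit energies e_k maximizing the
# minimum (over candidate models j) of sum_k e_k * gain[j, k], subject to a
# total energy budget. Gains below are hypothetical target/clutter responses.
import numpy as np
from scipy.optimize import linprog

gain = np.array([           # rows = candidate models, cols = frequency bins
    [0.9, 0.1, 0.3, 0.05],
    [0.1, 0.8, 0.2, 0.10],
    [0.2, 0.2, 0.7, 0.30],
])
n_models, n_bins = gain.shape
total_energy = 1.0

# Variables: x = [e_1, ..., e_K, t]; maximize t  <=>  minimize -t.
c = np.zeros(n_bins + 1)
c[-1] = -1.0
# Constraints: t - gain[j] @ e <= 0 for every model j.
A_ub = np.hstack([-gain, np.ones((n_models, 1))])
b_ub = np.zeros(n_models)
# Energy budget: sum(e) = total_energy.
A_eq = np.hstack([np.ones((1, n_bins)), np.zeros((1, 1))])
b_eq = np.array([total_energy])
bounds = [(0, None)] * n_bins + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
energies, worst_case = res.x[:-1], res.x[-1]
print("per-bin energies:", np.round(energies, 3), "worst-case response:", round(worst_case, 3))
```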

  9. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    DOEpatents

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  10. Visual sensor based abnormal event detection with moving shadow removal in home healthcare applications.

    PubMed

    Lee, Young-Sook; Chung, Wan-Young

    2012-01-01

    Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Abnormal event detection based on visual sensors, using shape feature variation and 3-D trajectory, is presented to overcome the low fall detection rate. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities, such as forward falls, backward falls, and sideways falls, from normal activities.

  11. A model-based fault-detection and prediction scheme for nonlinear multivariable discrete-time systems with asymptotic stability guarantees.

    PubMed

    Thumati, Balaje T; Jagannathan, S

    2010-03-01

    In this paper, a novel, unified model-based fault-detection and prediction (FDP) scheme is developed for nonlinear multiple-input-multiple-output (MIMO) discrete-time systems. The proposed scheme addresses both state and output faults by considering separate time profiles. The faults, which could be incipient or abrupt, are modeled using input and output signals of the system. The fault-detection (FD) scheme comprises an online approximator in discrete time (OLAD) with a robust adaptive term. An output residual is generated by comparing the FD estimator output with the measured system output. A fault is detected when this output residual exceeds a predefined threshold. Upon detecting a fault, the robust adaptive terms and the OLADs are initiated, wherein the OLAD approximates the unknown fault dynamics online while the robust adaptive terms help in ensuring asymptotic stability of the FD design. Using the OLAD outputs, a fault diagnosis scheme is introduced. A stable parameter update law is developed not only to tune the OLAD parameters but also to estimate the time to failure (TTF), which is considered a first step for prognostics. The asymptotic stability of the FDP scheme enhances the detection and TTF accuracy. The effectiveness of the proposed approach is demonstrated using a fourth-order MIMO satellite system.
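
    The residual-thresholding logic at the core of such model-based fault detection schemes can be sketched generically as follows; this toy single-output example uses an assumed nominal model and an arbitrary threshold, and does not reproduce the paper's OLAD approximator, robust adaptive terms, or TTF estimation.

```python
# Generic residual-based fault detection sketch (toy first-order nominal model
# and an arbitrary threshold; not the paper's OLAD-based estimator).
import numpy as np

def nominal_model(x, u):
    """Assumed fault-free discrete-time dynamics: x_{k+1} = 0.9 x_k + 0.5 u_k."""
    return 0.9 * x + 0.5 * u

rng = np.random.default_rng(5)
threshold = 0.3
x_true, x_est = 0.0, 0.0
for k in range(60):
    u = np.sin(0.1 * k)
    fault = 0.8 if k >= 40 else 0.0              # inject an abrupt fault at k = 40
    x_true = nominal_model(x_true, u) + fault + rng.normal(0, 0.02)
    y = x_true                                   # measured output
    residual = abs(y - nominal_model(x_est, u))  # compare with estimator output
    if residual > threshold:
        print(f"fault detected at step {k}, residual = {residual:.2f}")
        break
    x_est = y   # simple estimator update: track the measurement while healthy
```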

  12. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We prove the efficiency of a new method for the detection of crucial events that might have useful applications to the war against terrorism. This has to do with the search for rare but significant events, a theme of research made extremely important by the tragedy of September 11. This method is applied here to defining the statistics of seismic main-shocks, as done in cond-mat/0212529. The emphasis here is more on the conceptual issues behind the results obtained in cond-mat/0212529 than on geophysics. This discussion suggests that the method has a wider range of validity. We support this general discussion with a dynamic model originally proposed in cond-mat/0107597 for purposes different from geophysical applications. However, it is a case where the crucial events to detect are under our control, thereby making it possible for us to check the accuracy of the method of detection of invisible and crucial events that we propose here for a general purpose, including the war against terrorism. For this model an analytical treatment has been recently found [cond-mat/0209038], supporting the claims that we make in this paper for the accuracy of the method of detection. For the reader's convenience, the results on the seismic fluctuations are suitably reviewed and discussed in the light of the more general perspective of this paper. We also review the model for seismic fluctuations proposed in the earlier work of cond-mat/0212529. This model shares with the model of cond-mat/0107597 the property that the crucial events are embedded in a sea of secondary events, but it allows us to reveal with accuracy the statistics of the crucial events for different mathematical reasons.

  13. Using Atmospheric Circulation Patterns to Detect and Attribute Changes in the Risk of Extreme Climate Events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Horton, D. E.; Singh, D.; Swain, D. L.; Touma, D. E.; Mankin, J. S.

    2015-12-01

    Because of the high cost of extreme events and the growing evidence that global warming is likely to alter the statistical distribution of climate variables, detection and attribution of changes in the probability of extreme climate events has become a pressing topic for the scientific community, elected officials, and the public. While most of the emphasis has thus far focused on analyzing the climate variable of interest (most often temperature or precipitation, but also flooding and drought), there is an emerging emphasis on applying detection and attribution analysis techniques to the underlying physical causes of individual extreme events. This approach is promising in part because the underlying physical causes (such as atmospheric circulation patterns) can in some cases be more accurately represented in climate models than the more proximal climate variable (such as precipitation). In addition, and more scientifically critical, is the fact that the most extreme events result from a rare combination of interacting causes, often referred to as "ingredients". Rare events will therefore always have a strong influence of "natural" variability. Analyzing the underlying physical mechanisms can therefore help to test whether there have been changes in the probability of the constituent conditions of an individual event, or whether the co-occurrence of causal conditions cannot be distinguished from random chance. This presentation will review approaches to applying detection/attribution analysis to the underlying physical causes of extreme events (including both "thermodynamic" and "dynamic" causes), and provide a number of case studies, including the role of frequency of atmospheric circulation patterns in the probability of hot, cold, wet and dry events.

  14. Developing assessment system for wireless capsule endoscopy videos based on event detection

    NASA Astrophysics Data System (ADS)

    Chen, Ying-ju; Yasen, Wisam; Lee, Jeongkyu; Lee, Dongha; Kim, Yongho

    2009-02-01

    With advances in wireless technology and miniature cameras, Wireless Capsule Endoscopy (WCE), which combines the two, enables a physician to examine a patient's digestive system without actually performing a surgical procedure. Although WCE is a technical breakthrough that allows physicians to visualize the entire small bowel noninvasively, viewing the video takes 1-2 hours, which is very time consuming for the gastroenterologist. Not only does this limit the wide application of the technology, it also incurs considerable cost. It is therefore important to automate the process so that clinicians can focus only on events of interest. As an extension of our previous work characterizing the motility of the digestive tract in WCE videos, we propose a new assessment system for energy-based event detection (EG-EBD) to classify the events in WCE videos. For this system, we first extract general features of a WCE video that characterize the intestinal contractions in the digestive organs. The event boundaries are then identified using a High Frequency Content (HFC) function, and the segments are classified into WCE events using specialized features. In this system, we focus on entering the duodenum, entering the cecum, and active bleeding. The assessment system can easily be extended to discover more WCE events, such as detailed organ segmentation and additional diseases, by introducing new specialized features. In addition, the system provides a score for every WCE image for each event. Using these event scores, the system helps a specialist speed up the diagnosis process.

  15. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading.
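
    A minimal sketch of a Euclidean-distance (MED-style) detector of the kind discussed above flags a time step when the multivariate water-quality vector deviates too far from a recent baseline; the window length, normalization, and threshold are illustrative assumptions rather than the study's settings.

```python
# Sketch of a Euclidean-distance detector on multivariate water-quality data:
# flag samples whose standardized distance to a moving baseline mean exceeds a
# threshold (all parameters below are illustrative).
import numpy as np

def med_detect(samples, window=50, threshold=4.0):
    """samples: (n_steps, n_indicators) array. Returns flagged step indices."""
    flagged = []
    for k in range(window, len(samples)):
        baseline = samples[k - window:k]
        mu = baseline.mean(axis=0)
        sigma = baseline.std(axis=0) + 1e-9
        distance = np.linalg.norm((samples[k] - mu) / sigma)
        if distance > threshold:
            flagged.append(k)
    return flagged

rng = np.random.default_rng(6)
data = rng.normal([0.5, 200.0, 7.2], [0.02, 5.0, 0.05], size=(300, 3))
data[250:] += [0.3, 40.0, 0.4]        # synthetic contamination-like shift
print(med_detect(data))
```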

  16. Adaptive hidden Markov model-based online learning framework for bearing faulty detection and performance degradation monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Jianbo

    2017-01-01

    This study proposes an adaptive-learning-based method for machine fault detection and health degradation monitoring. The kernel of the proposed method is an "evolving" model that uses an unsupervised online learning scheme, in which an adaptive hidden Markov model (AHMM) is used for online learning of the dynamic health changes of machines over their full life. A statistical index is developed for recognizing new health states in the machines. Those new health states are then described online by adding new hidden states to the AHMM. Furthermore, the health degradations in machines are quantified online by an AHMM-based health index (HI) that measures the similarity between two density distributions describing the historic and current health states, respectively. When necessary, the proposed method characterizes the distinct operating modes of the machine and can learn both abrupt and gradual health changes online. Our method overcomes some drawbacks of HIs based on fixed monitoring models constructed in the offline phase (e.g., relatively low comprehensibility and applicability). Results from its application in a bearing life test reveal that the proposed method is effective in online detection and adaptive assessment of machine health degradation. This study provides a useful guide for developing a condition-based maintenance (CBM) system that uses an online learning method without considerable human intervention.

  17. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    SciTech Connect

    Grimm, Lars J. Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-03-15

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral mediolateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739, 95% Confidence Interval: 0.543–0.680, p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  18. Improving Magnitude Detection Thresholds Using Multi-Station Multi-Event, and Multi-Phase Methods

    DTIC Science & Technology

    2008-07-31

    ... applied to different tectonic settings and for what percentage of the seismicity. 111 million correlations were performed on Lg-waves for the events in ... a significant detection spike. Figure 24 shows an example of an aftershock (spike at 2400 samples) detected after a mainshock (spike at 1500 ...) ... false alarms in 36 days for an SNR of 0.32. The significant result of this study is that a correlation detector has more than an order of magnitude ...

  19. Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.

    2014-12-01

    The well-studied Wenchuan aftershock sequence triggered by the May 12, 2008, Ms 8.0 mainshock offers an ideal test case for evaluating the effectiveness of using waveform correlation coupled with double difference relocation to detect and locate events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then relocate the detections using the double difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one with waveform correlation to find even more events. We benchmark our results against previously published results describing relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events, with precision at the level of a few hundred and even tens of meters.

  20. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  1. Detection and identification of multiple genetically modified events using DNA insert fingerprinting.

    PubMed

    Raymond, Philippe; Gendron, Louis; Khalf, Moustafa; Paul, Sylvianne; Dibley, Kim L; Bhat, Somanath; Xie, Vicki R D; Partis, Lina; Moreau, Marie-Eve; Dollard, Cheryl; Coté, Marie-José; Laberge, Serge; Emslie, Kerry R

    2010-03-01

    Current screening and event-specific polymerase chain reaction (PCR) assays for the detection and identification of genetically modified organisms (GMOs) in samples of unknown composition or for the detection of non-regulated GMOs have limitations, and alternative approaches are required. A transgenic DNA fingerprinting methodology using restriction enzyme digestion, adaptor ligation, and nested PCR was developed where individual GMOs are distinguished by the characteristic fingerprint pattern of the fragments generated. The inter-laboratory reproducibility of the amplified fragment sizes using different capillary electrophoresis platforms was compared, and reproducible patterns were obtained with an average difference in fragment size of 2.4 bp. DNA insert fingerprints for 12 different maize events, including two maize hybrids and one soy event, were generated that reflected the composition of the transgenic DNA constructs. Once produced, the fingerprint profiles were added to a database which can be readily exchanged and shared between laboratories. This approach should facilitate the process of GMO identification and characterization.

  2. Detecting Continuity Violations in Infancy: A New Account and New Evidence from Covering and Tube Events

    ERIC Educational Resources Information Center

    Wang, S.h.; Baillargeon, R.; Paterson, S.

    2005-01-01

    Recent research on infants' responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age e.g. Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants' reasoning about when objects should and should not be occluded. Cognitive Psychology 39, 116-157; Hespos, S.…

  3. Adverse drug reactions – examples of detection of rare events using databases

    PubMed Central

    Chan, Esther W; Liu, Kirin Q L; Chui, Celine S L; Sing, Chor-Wing; Wong, Lisa Y L; Wong, Ian C K

    2015-01-01

    It is recognised that randomised controlled trials are not feasible for capturing rare adverse events. There is an increasing trend towards observational research methodologies using large population-based health databases. These databases offer more scope for adequate sample sizes, allowing for comprehensive patient characterisation and assessment of the associated factors. While direct causality cannot be established and confounders cannot be ignored, databases present an opportunity to explore and quantify rare events. The use of databases for the detection of rare adverse events is discussed using the following examples: sudden death associated with attention deficit hyperactivity disorder (ADHD) treatment, retinal detachment associated with the use of fluoroquinolones, and toxic epidermal necrolysis associated with drug exposure. In general, rare adverse events tend to have immediate and important clinical implications and may be life-threatening. An understanding of the causative factors is therefore important, in addition to the research methodologies and database platforms that enable the undertaking of the research.

  4. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h^-1 versus 0.058 h^-1). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
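
    The predictive state-assignment step described above can be illustrated with a hedged sketch: each learned event state is summarized by a multivariate Gaussian over iEEG features, and a new feature window is assigned to the state with the highest log-likelihood. The Bayesian nonparametric segmentation that learns the states is not reproduced, and all state parameters and features below are synthetic.

```python
# Sketch of predictive event-state assignment: score a new iEEG feature window
# against per-state multivariate Gaussians (states and features are synthetic).
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical learned states: (mean, covariance) of 3-D feature vectors
# (e.g., line length, energy, spectral edge), one of which is seizure-specific.
states = {
    "background": multivariate_normal([1.0, 0.5, 8.0], np.diag([0.1, 0.05, 1.0])),
    "interictal_burst": multivariate_normal([2.0, 1.5, 12.0], np.diag([0.2, 0.1, 2.0])),
    "seizure_onset": multivariate_normal([4.0, 3.5, 25.0], np.diag([0.3, 0.2, 4.0])),
}

def assign_state(feature_window):
    scores = {name: dist.logpdf(feature_window) for name, dist in states.items()}
    return max(scores, key=scores.get), scores

window = np.array([3.8, 3.2, 23.0])          # synthetic incoming feature vector
best, scores = assign_state(window)
print("assigned state:", best)
if best == "seizure_onset":
    print("trigger responsive stimulation")
```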

  5. Visual and Real-Time Event-Specific Loop-Mediated Isothermal Amplification Based Detection Assays for Bt Cotton Events MON531 and MON15985.

    PubMed

    Randhawa, Gurinder Jit; Chhabra, Rashmi; Bhoge, Rajesh K; Singh, Monika

    2015-01-01

    Bt cotton events MON531 and MON15985 are authorized for commercial cultivation in more than 18 countries. In India, four Bt cotton events have been commercialized; more than 95% of the total area under genetically modified (GM) cotton cultivation comprises events MON531 and MON15985. The present study reports on the development of efficient event-specific visual and real-time loop-mediated isothermal amplification (LAMP) assays for detection and identification of cotton events MON531 and MON15985. The efficiency of the LAMP assays was compared with that of conventional and real-time PCR assays. The real-time LAMP assay was found to be time-efficient and the most sensitive, detecting as few as two target copies within 35 min. The developed real-time LAMP assays, when combined with an efficient DNA extraction kit/protocol, may facilitate onsite GM detection to check the authenticity of Bt cotton seeds.

  6. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
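
    The core of the MFA, cross-correlation of continuous data against a parent template, can be illustrated with a single-channel, hedged sketch; the detection threshold and synthetic data are placeholders, and the multichannel stacking, waveform rotation, and relative-location steps are not reproduced.

```python
# Single-channel sketch of template (parent) matching by normalized
# cross-correlation; threshold and data are synthetic placeholders.
import numpy as np

def normalized_xcorr(data, template):
    """Normalized cross-correlation of a template slid over continuous data."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        cc[i] = np.dot(w, t) / n
    return cc

rng = np.random.default_rng(7)
parent = np.sin(2 * np.pi * 30 * np.arange(200) / 2000.0) * np.hanning(200)
stream = rng.normal(0, 0.2, 8000)
stream[3000:3200] += 0.6 * parent            # buried 'child' event

cc = normalized_xcorr(stream, parent)
detections = np.where(cc > 0.6)[0]           # illustrative threshold
print("candidate child-event onsets:", detections)
```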

  7. Pre-trained D-CNN models for detecting complex events in unconstrained videos

    NASA Astrophysics Data System (ADS)

    Robinson, Joseph P.; Fu, Yun

    2016-05-01

    Rapid event detection faces an emergent need to process large video collections; whether for surveillance videos or unconstrained web videos, the ability to automatically recognize high-level, complex events is a challenging task. Motivated by pre-existing methods being complex, computationally demanding, and often non-replicable, we designed a simple system that is quick, effective and carries minimal overhead in terms of memory and storage. Our system is clearly described, modular in nature, replicable on any desktop, and demonstrated with extensive experiments, backed by insightful analysis of different Convolutional Neural Networks (CNNs), as stand-alone models and fused with others. With a large corpus of unconstrained, real-world video data, we examine the usefulness of different CNN models as feature extractors for modeling high-level events, i.e., pre-trained CNNs that differ in architecture, training data, and number of outputs. For each CNN, we use frames sampled at 1 fps from all training exemplars to train one-vs-rest SVMs for each event. To represent videos, frame-level features were fused using a variety of techniques, the best being to max-pool between predetermined shot boundaries and then average-pool to form the final video-level descriptor. Through extensive analysis, several insights were found on using pre-trained CNNs as off-the-shelf feature extractors for the task of event detection. Fusing SVMs of different CNNs revealed some interesting facts, with some combinations found to be complementary. It was concluded that no single CNN works best for all events, as some events are more object-driven while others are more scene-based. Our top performance resulted from learning event-dependent weights for different CNNs.
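
    The frame-to-video pooling described above (max-pool within shots, then average across shots) can be sketched as follows; the frame features and shot boundaries are synthetic placeholders for whatever pre-trained CNN outputs and shot detector are available. The resulting descriptor is what would be fed to the one-vs-rest SVMs.

```python
# Sketch of the two-stage pooling used to form a video-level descriptor:
# max-pool frame features within each shot, then average the shot vectors.
# Frame features and shot boundaries below are synthetic placeholders.
import numpy as np

def video_descriptor(frame_features, shot_boundaries):
    """frame_features: (n_frames, d) CNN outputs sampled at 1 fps.
    shot_boundaries: sorted frame indices where new shots begin (incl. 0)."""
    edges = list(shot_boundaries) + [len(frame_features)]
    shot_vectors = [frame_features[a:b].max(axis=0)      # max-pool within shot
                    for a, b in zip(edges[:-1], edges[1:]) if b > a]
    return np.mean(shot_vectors, axis=0)                  # average across shots

rng = np.random.default_rng(8)
features = rng.random((120, 4096))          # e.g., 120 frames of fc7-like features
boundaries = [0, 40, 90]                    # hypothetical shot boundaries
descriptor = video_descriptor(features, boundaries)
print(descriptor.shape)                      # (4096,) video-level descriptor
```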

  8. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.

    PubMed

    Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S

    2015-08-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data.

  9. EO/IR satellite constellations for the early detection and tracking of collision events

    NASA Astrophysics Data System (ADS)

    Zatezalo, A.; El-Fallah, A.; Mahler, R.; Mehra, R. K.; Pham, K.

    2010-04-01

    The detection and tracking of collision events involving existing Low Earth Orbit (LEO) Resident Space Objects (RSOs) is becoming increasingly important as LEO traffic volume grows, and it is anticipated to increase even further in the near future. Changes in velocity that can lead to a collision are hard to detect early, before the collision happens. Several collision events can happen at the same time, and continuous monitoring of the LEO environment is necessary in order to determine and implement collision avoidance strategies. We present a simulation of a constellation system consisting of multiple platforms carrying EO/IR sensors for the detection of such collisions. The simulation encompasses the full complexity of LEO trajectory changes that can lead to collisions with currently operating satellites. An efficient multitarget filter with information-theoretic multisensor management is implemented and evaluated on different constellations.

  10. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection

    PubMed Central

    Olson, Sarah H.; Benedum, Corey M.; Mekaru, Sumiko R.; Preston, Nicholas D.; Mazet, Jonna A.K.; Joly, Damien O.

    2015-01-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  11. Event Detection and Location of Earthquakes Using the Cascadia Initiative Dataset

    NASA Astrophysics Data System (ADS)

    Morton, E.; Bilek, S. L.; Rowe, C. A.

    2015-12-01

    The Cascadia subduction zone (CSZ) produces a range of slip behavior along the plate boundary megathrust, from great earthquakes to episodic slow slip and tremor (ETS). Unlike other subduction zones that produce great earthquakes and ETS, the CSZ is notable for the lack of small and moderate magnitude earthquakes recorded. The seismogenic zone extent is currently estimated to be primarily offshore, thus the lack of observed small, interplate earthquakes may be partially due to the use of only land seismometers. The Cascadia Initiative (CI) community seismic experiment seeks to address this issue by including ocean bottom seismometers (OBS) deployed directly over the locked seismogenic zone, in addition to land seismometers. We use these seismic data to explore whether small magnitude earthquakes are occurring on the plate interface, but have gone undetected by the land-based seismic networks. We select a subset of small magnitude (M0.1-3.7) earthquakes from existing earthquake catalogs, based on land seismic data, whose preliminary hypocentral locations suggest they may have occurred on the plate interface. We window the waveforms on CI OBS and land seismometers around the phase arrival times for these earthquakes to generate templates for subspace detection, which allows for additional flexibility over traditional matched filter detection methods. Here we present event detections from the first year of CI deployment and preliminary locations for the detected events. Initial results of scanning the first year of the CI deployment using one cluster of template events, located near a previously identified subducted seamount, include 473 detections on OBS station M08A (~61.6 km offshore) and 710 detections on OBS station J25A (~44.8 km northeast of M08A). Ongoing efforts include detection using additional OBS stations along the margin, as well as determining locations of clusters detected in the first year of deployment.

  12. Long-Duration Neutron Production in Solar Eruptive Events Detected with the MESSENGER Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Feldman, W. C.; Lawrence, D. J.; Vestrand, W. T.; Peplowski, P. N.

    2014-12-01

    Nine long-duration neutron solar eruptive events (SEEs) between 31 December 2007 and 16 March 2013 appear to be excellent candidates for detection of fast neutrons from the Sun by the MESSENGER Neutron Spectrometer (NS). One event (on 4 June 2011) is the cleanest example, because it was not accompanied by energetic ions at MESSENGER having energies greater than 50±10 MeV/nuc. The purpose of this study is to assemble a set of conditions common to all events that can help identify the physical conditions at their origin. We classified the nine events into four categories: (1) those having tight magnetic connection to the Sun as well as to spacecraft at 1 AU that can separately measure the energetic proton, alpha particle, and electron flux spectra, (2) those with sufficiently close connection that the energetic flux spectra can be compared, (3) those that have only marginal connections, and (4) those that are also seen at Earth. Four events fall into category (1), three into category (2), two into category (3), and parts of four events overlapped neutron events also seen by the scintillation FIBer solar neutron telescope (FIB) detector placed on the International Space Station in 2009. Seven of the nine events that have either tight or marginal magnetic connection have alpha particle abundances less than 2%. For each event, we modeled expected fast neutron count rates from the 1 AU ion spectrum, a process that accounts for the transport of the neutrons through the spacecraft to the NS. The ratios of measured to predicted fast-neutron counts range between 2.0 and 12.1.

  13. Automatic microseismic event detection by band-limited phase-only correlation

    NASA Astrophysics Data System (ADS)

    Wu, Shaojiang; Wang, Yibo; Zhan, Yi; Chang, Xu

    2016-12-01

    Identification and detection of microseismic events is a significant issue in source location and source mechanism analysis. The number of records is notably large, especially in real-time monitoring, and because most microseismic events are weak and sparse, automatic algorithms are indispensable. In this study, we introduce an effective method for the identification and detection of microseismic events by judging whether the P-wave phase exists in a local segment of a single three-component microseismic record. The new judging algorithm consists primarily of the following key steps: 1) transform the waveform time series into time-varying spectral representations using the S-transform; 2) calculate the similarity of the frequency content in the time-frequency domain using the phase-only correlation function; and 3) identify the P-phase by the combination analysis between any two components. The proposed algorithm is compared to a similar approach using the cross-correlation in the time domain between any two components and later tested with synthetic microseismic datasets and real field-recorded datasets. The results indicate that the proposed algorithm is able to distinguish similar and dissimilar waveforms, even for low signal-to-noise ratio and emergent events, which is important for accurate and rapid selection of microseismic events from a large number of records. This method can be applied to other geophysical analyses based on waveform data.
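
    A simplified, FFT-based version of the phase-only correlation used in step 2 is sketched below; note that the paper applies it to S-transform spectra, whereas this sketch correlates two raw components directly, and the band limits are illustrative assumptions.

      import numpy as np

      def phase_only_correlation(x, y, fs, band=(10.0, 120.0)):
          """Band-limited phase-only correlation of two equal-length traces.

          The cross-spectrum is normalized to unit magnitude (phase only),
          restricted to the pass band, and inverse-transformed; a sharp peak
          indicates that the two traces share the same phase structure.
          band : (low, high) frequency limits in Hz (illustrative values)
          """
          n = len(x)
          X = np.fft.rfft(x, n)
          Y = np.fft.rfft(y, n)
          cross = X * np.conj(Y)
          mag = np.abs(cross)
          mag[mag == 0] = 1.0                      # avoid division by zero
          poc = cross / mag                        # keep phase, discard amplitude
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          poc[(freqs < band[0]) | (freqs > band[1])] = 0.0
          return np.fft.fftshift(np.fft.irfft(poc, n))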

  14. Event-specific quantitative detection of nine genetically modified maizes using one novel standard reference molecule.

    PubMed

    Yang, Litao; Guo, Jinchao; Pan, Aihu; Zhang, Haibo; Zhang, Kewei; Wang, Zhengming; Zhang, Dabing

    2007-01-10

    With the development of genetically modified organism (GMO) detection techniques, the Polymerase Chain Reaction (PCR) technique has been the mainstay for GMO detection, and real-time PCR is the most effective and important method for GMO quantification. An event-specific detection strategy based on the unique and specific integration junction sequences between the host plant genome DNA and the integrated gene is being developed for its high specificity. This study establishes the event-specific detection methods for TC1507 and CBH351 maizes. In addition, the event-specific TaqMan real-time PCR detection methods for another seven GM maize events (Bt11, Bt176, GA21, MON810, MON863, NK603, and T25) were systematically optimized and developed. In these PCR assays, the fluorescent quencher, TAMRA, was dyed on the T-base of the probe at the internal position to improve the intensity of the fluorescent signal. To overcome the difficulties in obtaining the certified reference materials of these GM maizes, one novel standard reference molecule containing all nine specific integration junction sequences of these GM maizes and the maize endogenous reference gene, zSSIIb, was constructed and used for quantitative analysis. The limits of detection of these methods were 20 copies for these different GM maizes, the limits of quantitation were about 20 copies, and the dynamic ranges for quantification were from 0.05 to 100% in 100 ng of DNA template. Furthermore, nine groups of the mixed maize samples of these nine GM maize events were quantitatively analyzed to evaluate the accuracy and precision. The accuracy expressed as bias varied from 0.67 to 28.00% for the nine tested groups of GM maize samples, and the precision expressed as relative standard deviations was from 0.83 to 26.20%. All of these indicated that the established event-specific real-time PCR detection systems and the reference molecule in this study are suitable for the identification and quantification of these GM

  15. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database.

    PubMed

    Soukavong, Mick; Kim, Jungmee; Park, Kyounghoon; Yang, Bo Ram; Lee, Joongyub; Jin, Xue Mei; Park, Byung Joo

    2016-09-01

    We conducted pharmacovigilance data mining for a β-lactam antibiotic, amoxicillin, and compared the adverse events (AEs) with the drug labels of 9 countries including Korea, USA, UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, between December 1988 and June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices of proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC) was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 adverse events were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms were not listed on any of the labels of the nine countries. In conclusion, we detected 12 new signals of amoxicillin which were not listed on the labels of the 9 countries. These signals should therefore be followed by evaluation of causal association, clinical significance, and preventability.
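
    The frequentist indices mentioned here (PRR and ROR) are standard 2 x 2 contingency-table statistics and can be sketched as follows; the information component requires the Bayesian BCPNN machinery and is omitted, and the cell names a-d follow the usual convention rather than anything specific to KAERS.

      import math

      def disproportionality(a, b, c, d):
          """PRR and ROR from a 2 x 2 drug-event contingency table.

          a: reports with the drug of interest and the event of interest
          b: reports with the drug of interest and other events
          c: reports with other drugs and the event of interest
          d: reports with other drugs and other events
          """
          prr = (a / (a + b)) / (c / (c + d))
          ror = (a * d) / (b * c)
          se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(ROR)
          ror_ci = (math.exp(math.log(ror) - 1.96 * se),
                    math.exp(math.log(ror) + 1.96 * se))
          return prr, ror, ror_ci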

  16. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database

    PubMed Central

    2016-01-01

    We conducted pharmacovigilance data mining for a β-lactam antibiotic, amoxicillin, and compared the adverse events (AEs) with the drug labels of 9 countries including Korea, USA, UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, between December 1988 and June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices of proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC) was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 adverse events were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms were not listed on any of the labels of the nine countries. In conclusion, we detected 12 new signals of amoxicillin which were not listed on the labels of the 9 countries. These signals should therefore be followed by evaluation of causal association, clinical significance, and preventability. PMID:27510377

  17. Detections of Planets in Binaries Through the Channel of Chang–Refsdal Gravitational Lensing Events

    NASA Astrophysics Data System (ADS)

    Han, Cheongho; Shin, In-Gu; Jung, Youn Kil

    2017-02-01

    Chang–Refsdal (C–R) lensing, which refers to the gravitational lensing of a point mass perturbed by a constant external shear, provides a good approximation in describing lensing behaviors of either a very wide or a very close binary lens. C–R lensing events, which are identified by short-term anomalies near the peak of high-magnification lensing light curves, are routinely detected from lensing surveys, but not much attention is paid to them. In this paper, we point out that C–R lensing events provide an important channel to detect planets in binaries, both in close and wide binary systems. Detecting planets through the C–R lensing event channel is possible because the planet-induced perturbation occurs in the same region of the C–R lensing-induced anomaly and thus the existence of the planet can be identified by the additional deviation in the central perturbation. By presenting the analysis of the actually observed C–R lensing event OGLE-2015-BLG-1319, we demonstrate that dense and high-precision coverage of a C–R lensing-induced perturbation can provide a strong constraint on the existence of a planet in a wide range of planet parameters. The sample of an increased number of microlensing planets in binary systems will provide important observational constraints in giving shape to the details of planet formation, which have been restricted to the case of single stars to date.

  18. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in the presence of these faults. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
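
    The residual-based detection idea (deviations between measured and model-predicted behaviour flagging a fault) can be illustrated with a toy sketch. Here the Extended Kalman Filter is assumed to run elsewhere and only its innovation sequence is consumed, and the window length and threshold are illustrative values, not the MBFTC settings.

      import numpy as np

      def detect_fault(residuals, window=20, threshold=3.0):
          """Flag a fault when the windowed RMS of estimator residuals exceeds
          a multiple of the nominal residual standard deviation.

          residuals : (n_steps, n_sensors) innovation sequence from a state
                      estimator such as an Extended Kalman Filter
          """
          sigma = residuals[:window].std(axis=0)          # nominal noise level
          flags = np.zeros(len(residuals), dtype=bool)
          for k in range(window, len(residuals)):
              rms = np.sqrt((residuals[k - window:k] ** 2).mean(axis=0))
              flags[k] = bool(np.any(rms > threshold * sigma))
          return flags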

  19. A Heuristic Indication and Warning Staging Model for Detection and Assessment of Biological Events

    PubMed Central

    Wilson, James M.; Polyak, Marat G.; Blake, Jane W.; Collmann, Jeff

    2008-01-01

    Objective This paper presents a model designed to enable rapid detection and assessment of biological threats that may require swift intervention by the international public health community. Design We utilized Strauss’ grounded theory to develop an expanded model of social disruption due to biological events based on retrospective and prospective case studies. We then applied this model to the temporal domain and propose a heuristic staging model, the Wilson–Collmann Scale, for assessing biological event evolution. Measurements We retrospectively and manually examined hard copy archival local media reports in the native vernacular for three biological events associated with substantial social disruption. The model was then tested prospectively through media harvesting based on keywords corresponding to the model parameters. Results Our heuristic staging model provides valuable information about the features of a biological event that can be used to determine the level of concern warranted, such as whether the pathogen in question is responding to established public health disease control measures, including the use of antimicrobials or vaccines; whether the public health and medical infrastructure of the country involved is adequate to mount the necessary response; whether the country’s officials are providing an appropriate level of information to international public health authorities; and whether the event poses an international threat. The approach is applicable for monitoring open-source (public-domain) media for indications and warnings of such events, and specifically for markers of the social disruption that commonly occur as these events unfold. These indications and warnings can then be used as the basis for staging the biological threat in the same manner that the United States National Weather Service currently uses storm warning models (such as the Saffir-Simpson Hurricane Scale) to detect and assess threatening weather conditions. Conclusion

  20. Exploring the Limits of Waveform Correlation Event Detection as Applied to Three Earthquake Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Young, C. J.; Carr, D.; Resor, M.; Duffey, S.

    2009-12-01

    Swarms of earthquakes and/or aftershock sequences can dramatically increase the level of seismicity in a region for a period of time lasting from days to months, depending on the swarm or sequence. Such occurrences can provide a large amount of useful information to seismologists. For those who monitor seismic events for possible nuclear explosions, however, these swarms/sequences are a nuisance. In an explosion monitoring system, each event must be treated as a possible nuclear test until it can be proven, to a high degree of confidence, not to be. Seismic events recorded by the same station with highly correlated waveforms almost certainly have a similar location and source type, so clusters of events within a swarm can quickly be identified as earthquakes. We have developed a number of tools that can be used to exploit the high degree of waveform similarity expected to be associated with swarms/sequences. Dendro Tool measures correlations between known events. The Waveform Correlation Detector is intended to act as a detector, finding events in raw data which correlate with known events. The Self Scanner is used to find all correlated segments within a raw data stream and does not require an event library. All three techniques together provide an opportunity to study the similarities of events in an aftershock sequence in different ways. To comprehensively characterize the benefits and limits of waveform correlation techniques, we studied 3 aftershock sequences, using our 3 tools, at multiple stations. We explored the effects of station distance and event magnitudes on correlation results. Lastly, we show the reduction in detection threshold and analyst workload offered by waveform correlation techniques compared to STA/LTA-based detection. We analyzed 4 days of data from each aftershock sequence using all three methods. Most known events clustered in a similar manner across the toolsets. Up to 25% of catalogued events were found to be a member of a cluster. In
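
    For reference, the STA/LTA trigger used here as the non-correlation baseline can be sketched in a few lines; the window lengths and trigger ratio below are illustrative defaults, not the study's settings.

      import numpy as np

      def sta_lta_trigger(trace, fs, sta_win=1.0, lta_win=30.0, on_ratio=4.0):
          """Classic short-term / long-term average trigger.

          trace : 1-D waveform samples
          fs    : sampling rate in Hz
          Returns sample indices where STA/LTA first exceeds on_ratio.
          """
          e = trace.astype(float) ** 2                    # signal energy
          nsta, nlta = int(sta_win * fs), int(lta_win * fs)
          sta = np.convolve(e, np.ones(nsta) / nsta, mode="same")
          lta = np.convolve(e, np.ones(nlta) / nlta, mode="same")
          lta[lta == 0] = np.finfo(float).eps             # guard against division by zero
          ratio = sta / lta
          crossings = np.flatnonzero((ratio[1:] >= on_ratio) & (ratio[:-1] < on_ratio))
          return crossings + 1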

  1. Wave-induced burst precipitation events detected with a digital ionosonde

    SciTech Connect

    Jarvis, M.J.; Smith, A.J.; Berkey, F.T.; Carpenter, D.L.

    1990-01-01

    Initial results are presented from two methods whereby burst precipitation events in the lower ionosphere, almost certainly induced by VLF wave-particle interactions in the magnetosphere, have been detected using a ground-based digital ionosonde. In the first method, HF echoes are received above the critical frequency of the surrounding plasma; particle energies and the location and extent of the plasma enhancement may be deduced. In the second method, a rapid decrease in the phase of ionospheric echoes is observed due to refractive index changes along the echo path; particle energies, the duration of the precipitation event and the precipitation energy flux can be estimated.

  2. Composite Event Specification and Detection for Supporting Active Capability in an OODBMS: Semantics Architecture and Implementation.

    DTIC Science & Technology

    1995-03-01

    (The indexed excerpt is an OCR fragment of the report's pseudocode for composite event detection in the recent context: the activate-operator-node procedure, illustrated in Figure 6, "Detection of X in recent mode", in which an AND(E1, E2) operator node passes <e2, e1> to its parent once the left event e1 has been signalled and E2's list is not empty.)

  3. Detecting adverse drug events in discharge summaries using variations on the simple Bayes model.

    PubMed

    Visweswaran, Shyam; Hanbury, Paul; Saul, Melissa; Cooper, Gregory F

    2003-01-01

    Detection and prevention of adverse events and, in particular, adverse drug events (ADEs), is an important problem in health care today. We describe the implementation and evaluation of four variations on the simple Bayes model for identifying ADE-related discharge summaries. Our results show that these probabilistic techniques achieve an ROC curve area of up to 0.77 in correctly determining which patient cases should be assigned an ADE-related ICD-9-CM code. These results suggest a potential for these techniques to contribute to the development of an automated system that helps identify ADEs, as a step toward further understanding and preventing them.
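
    A minimal bag-of-words naive Bayes classifier of the kind evaluated here can be sketched with scikit-learn; the example documents, labels, and pipeline choices are purely illustrative and are not the authors' implementation.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      # Hypothetical labeled discharge summaries (1 = ADE-related ICD-9-CM code).
      docs = ["rash after starting penicillin, drug discontinued",
              "routine follow-up, no new complaints",
              "acute kidney injury attributed to NSAID use",
              "elective knee arthroscopy, uneventful recovery"]
      labels = [1, 0, 1, 0]

      model = make_pipeline(CountVectorizer(stop_words="english"), MultinomialNB())
      model.fit(docs, labels)

      # Probability that a new summary is ADE-related.
      print(model.predict_proba(["hypotension after new antihypertensive dose"])[0][1])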

  4. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Only certain hazardous traffic events have been studied in previous work, which was mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle’s speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of the prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles. PMID:27420073
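
    A rough two-stage pipeline in the spirit of MB-SMO can be sketched with scikit-learn. Since no standard Markov blanket implementation ships with scikit-learn, a univariate mutual-information filter stands in for the MB step (clearly a substitution), while the RBF SVM relies on libsvm's SMO-type solver. The data below are random placeholders, not the study's measurements.

      import numpy as np
      from sklearn.feature_selection import SelectKBest, mutual_info_classif
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Hypothetical driving data: rows = events, columns = candidate factors
      # (speed, std of speed, skin conductance std, brake pressure std, ...).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 12))
      y = rng.integers(0, 2, size=200)             # 1 = hazardous event

      clf = make_pipeline(
          StandardScaler(),
          SelectKBest(mutual_info_classif, k=8),   # stand-in for MB feature selection
          SVC(kernel="rbf", C=1.0),                # trained with an SMO-type solver
      )
      clf.fit(X, y)
      print("training accuracy:", clf.score(X, y))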

  5. Detection of seismic events triggered by P-waves from the 2011 Tohoku-Oki earthquake

    NASA Astrophysics Data System (ADS)

    Miyazawa, Masatoshi

    2012-12-01

    Large-amplitude surface waves from the 2011 Tohoku-Oki earthquake triggered many seismic events across Japan, while the smaller amplitude P-wave triggering remains unclear. A spectral method was used to detect seismic events triggered by the first arriving P-waves over Japan. This method uses a reference event to correct for source and propagation effects, so that the local response near the station can be examined in detail. P-wave triggering was found in the regions where triggered non-volcanic tremor (NVT) has been observed, and some seismic and volcanic regions. The triggering strain due to P-waves is of the order of 10^-8 to 10^-7, which is 1 to 2 orders of magnitude smaller than the triggering strain necessary for the surface wave triggering. In the regions of NVT, the triggered event was not identified with slow events, but with other seismic events such as tectonic earthquakes. The sequence of triggering in the regions started with P-wave arrivals. The subsequent surface waves contributed to triggering of NVT, possibly together with slow slip, which resulted in the large amplitude of the NVT.

  6. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-07-13

    The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Only certain hazardous traffic events have been studied in previous work, which was mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of the prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles.

  7. Microfluidic Arrayed Lab-On-A-Chip for Electrochemical Capacitive Detection of DNA Hybridization Events.

    PubMed

    Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza

    2017-01-01

    A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.

  8. An algorithm to detect low incidence arrhythmic events in electrocardiographic records from ambulatory patients.

    PubMed

    Hungenahally, S K; Willis, R J

    1994-11-01

    An algorithm was devised to detect low incidence arrhythmic events in electrocardiograms obtained during ambulatory monitoring. The algorithm incorporated baseline correction and R wave detection. The RR interval was used to identify tachycardia, bradycardia, and premature ventricular beats. Only a few beats before and after the arrhythmic event were stored. The software was evaluated on a prototype hardware system which consisted of an Intel 86/30 single board computer with a suitable analog pre-processor and an analog to digital converter. The algorithm was used to determine the incidence and type of arrhythmia in records from an ambulatory electrocardiogram (ECG) database and from a cardiac exercise laboratory. These results were compared to annotations on the records which were assumed to be correct. Standard criteria used previously to evaluate algorithms designed for arrhythmia detection were sensitivity, specificity, and diagnostic accuracy. Sensitivities ranging from 77 to 100%, specificities from 94 to 100%, and diagnostic accuracies from 92 to 100% were obtained on the different data sets. These results compare favourably with published results based on more elaborate algorithms. By circumventing the need to make a continuous record of the ECG, the algorithm could form the basis for a compact monitoring device for the detection of arrhythmic events which are so infrequent that standard 24-h Holter monitoring is insufficient.
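
    The RR-interval rules described can be illustrated with a toy classifier; the bradycardia, tachycardia, and prematurity thresholds below are illustrative choices, not the published criteria.

      def classify_beats(r_peaks_s, brady_bpm=50.0, tachy_bpm=120.0, pvc_ratio=0.8):
          """Label each RR interval as normal, bradycardia, tachycardia, or
          premature (PVC-like) using simple interval rules.

          r_peaks_s : detected R-wave times in seconds
          """
          rr = [t2 - t1 for t1, t2 in zip(r_peaks_s, r_peaks_s[1:])]
          labels = []
          for i, interval in enumerate(rr):
              bpm = 60.0 / interval
              recent = rr[max(0, i - 8):i]         # running baseline of prior intervals
              if recent and interval < pvc_ratio * (sum(recent) / len(recent)):
                  labels.append("premature")
              elif bpm < brady_bpm:
                  labels.append("bradycardia")
              elif bpm > tachy_bpm:
                  labels.append("tachycardia")
              else:
                  labels.append("normal")
          return labels

      # Example: the short 0.5 s interval is flagged as a premature beat.
      print(classify_beats([0.0, 0.8, 1.6, 2.1, 2.9]))

    In a device of the kind described, only a few beats around each flagged interval would be stored, avoiding a continuous ECG record.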

  9. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  10. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing for classifying the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquake, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support

  11. Model-based fault detection and isolation for intermittently active faults with application to motion-based thruster fault detection and isolation for spacecraft

    NASA Technical Reports Server (NTRS)

    Wilson, Edward (Inventor)

    2008-01-01

    The present invention is a method for detecting and isolating fault modes in a system having a model describing its behavior and regularly sampled measurements. The models are used to calculate past and present deviations from measurements that would result with no faults present, as well as with one or more potential fault modes present. Algorithms that calculate and store these deviations, along with memory of when said faults, if present, would have an effect on the said actual measurements, are used to detect when a fault is present. Related algorithms are used to exonerate false fault modes and finally to isolate the true fault mode. This invention is presented with application to detection and isolation of thruster faults for a thruster-controlled spacecraft. As a supporting aspect of the invention, a novel, effective, and efficient filtering method for estimating the derivative of a noisy signal is presented.

  12. Use of a clinical event monitor to prevent and detect medication errors.

    PubMed Central

    Payne, T. H.; Savarino, J.; Marshall, R.; Hoey, C. T.

    2000-01-01

    Errors in health care facilities are common and often unrecognized. We have used our clinical event monitor to prevent and detect medication errors by scrutinizing electronic messages sent to it when any medication order is written in our facility. A growing collection of medication safety rules covering dose limit errors, laboratory monitoring, and other topics may be applied to each medication order message to provide an additional layer of protection beyond existing order checks, reminders, and alerts available within our computer-based record system. During a typical day the event monitor receives 4802 messages, of which 4719 pertain to medication orders. We have found the clinical event monitor to be a valuable tool for clinicians and quality management groups charged with improving medication safety. PMID:11079962

  13. Assessment and validation of a simple automated method for the detection of gait events and intervals.

    PubMed

    Ghoussayni, Salim; Stevens, Christopher; Durham, Sally; Ewins, David

    2004-12-01

    A simple and rapid automatic method for detection of gait events at the foot could speed up and possibly increase the repeatability of gait analysis and evaluations of treatments for pathological gaits. The aim of this study was to compare and validate a kinematic-based algorithm used in the detection of four gait events, heel contact, heel rise, toe contact and toe off. Force platform data is often used to obtain start and end of contact phases, but not usually heel rise and toe contact events. For this purpose synchronised kinematic, kinetic and video data were captured from 12 healthy adult subjects walking both barefoot and shod at slow and normal self-selected speeds. The data were used to determine the gait events using three methods: force, visual inspection and algorithm methods. Ninety percent of all timings given by the algorithm were within one frame (16.7 ms) when compared to visual inspection. There were no statistically significant differences between the visual and algorithm timings. For both heel and toe contact the differences between the three methods were within 1.5 frames, whereas for heel rise and toe off the differences between the force on one side and the visual and algorithm on the other were higher and more varied (up to 175 ms). In addition, the algorithm method provided the duration of three intervals, heel contact to toe contact, toe contact to heel rise and heel rise to toe off, which are not readily available from force platform data. The ability to automatically and reliably detect the timings of these four gait events and three intervals using kinematic data alone is an asset to clinical gait analysis.

  14. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories: Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  15. Real-time gait event detection for lower limb amputees using a single wearable sensor.

    PubMed

    Maqbool, H F; Husman, M A B; Awad, M I; Abouhossein, A; Mehryar, P; Iqbal, N; Dehghani-Sanij, A A

    2016-08-01

    This paper presents a rule-based real-time gait event/phase detection system (R-GEDS) using a shank-mounted inertial measurement unit (IMU) for lower limb amputees during level-ground walking. Development of the algorithm is based on the shank angular velocity in the sagittal plane and the linear acceleration signal in the shank longitudinal direction. System performance was evaluated with four control subjects (CS) and one transfemoral amputee (TFA), and the results were validated with four FlexiForce footswitches (FSW). The results showed a data latency for initial contact (IC) and toe off (TO) within a range of ± 40 ms for both CS and TFA. A delay of about 3.7 ± 62 ms for a foot-flat start (FFS) and an early detection of -9.4 ± 66 ms for heel-off (HO) was found for CS. The prosthetic side showed an early detection of -105 ± 95 ms for FFS, whereas the intact side showed a delay of 141 ± 73 ms for HO. The difference in the kinematics of the TFA and CS is one of the potential reasons for the high variations in the time difference. Overall, detection accuracy was 99.78% for all the events in both groups. Based on the validated results, the proposed system can be used to accurately detect temporal gait events in real time, which in turn enables gait phase detection, and it can therefore be utilized in gait analysis applications and the control of lower limb prostheses.
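
    A minimal rule-based sketch of this kind of detection from the sagittal-plane shank angular velocity is given below: mid-swing is taken as a large positive peak and the surrounding negative troughs as toe-off and initial contact. The sign convention, search window, and threshold are assumptions for illustration, not the R-GEDS rules.

      import numpy as np

      def detect_gait_events(gyro_sagittal, fs, swing_thresh=2.0):
          """Rule-based IC/TO detection from shank angular velocity (rad/s).

          Mid-swing is taken as a positive peak above swing_thresh; the local
          minimum before it is labelled toe-off (TO) and the local minimum
          after it initial contact (IC).
          """
          g = np.asarray(gyro_sagittal, dtype=float)
          win = int(0.4 * fs)                       # search window around mid-swing
          peaks = [i for i in range(1, len(g) - 1)
                   if g[i] > swing_thresh and g[i] >= g[i - 1] and g[i] > g[i + 1]]
          events = []
          for p in peaks:
              if p < win or p + win > len(g):
                  continue                          # skip peaks too close to the edges
              to = np.argmin(g[p - win:p]) + p - win
              ic = np.argmin(g[p:p + win]) + p
              events.append({"TO": to / fs, "mid_swing": p / fs, "IC": ic / fs})
          return events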

  16. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use

    PubMed Central

    Moghaddam, Athena K.; Yuen, Hiu Kim; Archambault, Philippe S.; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user’s driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real-time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data was processed using time-series analysis and data mining techniques, to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user’s PW driving behavior. PMID:27170879

  17. Residual Events during Use of CPAP: Prevalence, Predictors, and Detection Accuracy

    PubMed Central

    Reiter, Joel; Zleik, Bashar; Bazalakova, Mihaela; Mehta, Pankaj; Thomas, Robert Joseph

    2016-01-01

    Study Objectives: To assess the frequency, severity, and determinants of residual respiratory events during continuous positive airway pressure (CPAP) therapy for obstructive sleep apnea (OSA) as determined by device output. Methods: Subjects were consecutive OSA patients at an American Academy of Sleep Medicine accredited multidisciplinary sleep center. Inclusion criteria included CPAP use for a minimum of 3 months, and a minimum nightly use of 4 hours. Compliance metrics and waveform data from 217 subjects were analyzed retrospectively. Events were scored manually when there was a clear reduction of amplitude (≥ 30%) or flow-limitation with 2–3 larger recovery breaths. Automatically detected versus manually scored events were subjected to statistical analyses that included Bland-Altman plots, correlation coefficients, and logistic regression exploring predictors of residual events. Results: The mean patient age was 54.7 ± 14.2 years; 63% were males. All patients had a primary diagnosis of obstructive sleep apnea, 26% defined as complex sleep apnea. A residual flow-based apnea-hypopnea index (AHIFLOW) > 5, 10, and 15/h was seen in 32.3%, 9.7%, and 1.8% vs. 60.8%, 23%, and 7.8% of subjects based on automated vs. manual scoring of waveform data. Automatically detected versus manually scored average AHIFLOW was 4.4 ± 3.8 vs. 7.3 ± 5.1 per hour. In a logistic regression analysis, the only predictors for a manual AHIFLOW > 5/h were the absolute central apnea index (CAI) (odds ratio [OR]: 1.5, p: 0.01, CI: 1.1–2.0), or using a CAI threshold of 5/h of sleep (OR: 5.0, p: < 0.001, CI: 2.2–13.8). For AHIFLOW > 10/h, the OR was 1.14, p: 0.03 (CI: 1.1–1.3) per every CAI unit of 1/hour. Conclusions: Residual respiratory events are common during CPAP treatment, may be missed by automated device detection, and are predicted by a high central apnea index on the baseline diagnostic study. Direct visualization of flow data is generally available and improves detection

  18. Using Structured Telephone Follow-up Assessments to Improve Suicide-related Adverse Event Detection

    PubMed Central

    Arias, Sarah A.; Zhang, Zi; Hillerns, Carla; Sullivan, Ashley F.; Boudreaux, Edwin D.; Miller, Ivan; Camargo, Carlos A.

    2014-01-01

    Adverse event (AE) detection and reporting practices were compared during the first phase of the Emergency Department Safety Assessment and Follow-up Evaluation (ED-SAFE), a suicide intervention study. Data were collected using a combination of chart reviews and structured telephone follow-up assessments post-enrollment. Beyond chart reviews, structured telephone follow-up assessments identified 45% of the total AEs in our study. Notably, detection of suicide attempts significantly varied by approach with 53 (18%) detected by chart review, 173 (59%) by structured telephone follow-up assessments, and 69 (23%) marked as duplicates. Findings provide support for utilizing multiple methods for more robust AE detection in suicide research. PMID:24588679

  19. A Method for Automated Detection of Usability Problems from Client User Interface Events

    PubMed Central

    Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.

    2005-01-01

    Think-aloud usability analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and manually computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121

  20. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence.

    PubMed

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of the spatial location and temporal position of the event were controlled in a 2 × 2 design. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable conditions it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  1. Nanoporous niobium oxide for label-free detection of DNA hybridization events.

    PubMed

    Choi, Jinsub; Lim, Jae Hoon; Rho, Sangchul; Jahng, Deokjin; Lee, Jaeyoung; Kim, Kyung Ja

    2008-01-15

    We found that DNA probes can be immobilized on anodically prepared porous niobium oxide without chemical modification of either the DNA probes or the substrate. By using the porous niobium oxide with a positive surface charge, DNA hybridization events are detected on the basis of the blue-shift of a maximum absorption peak in UV-vis-NIR spectroscopy. The blue-shift is ascribed to the change of surface charge upon binding of single- or double-stranded DNA. The method does not require a label and shows high sensitivity, with a detection limit of 1 nM.

  2. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate.
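
    The correlation-indicator idea can be sketched as follows: pairwise correlation coefficients over a moving window of multi-sensor readings form an indicator vector, and a window is flagged when its Euclidean distance from a baseline indicator exceeds a threshold. The window length and threshold below are illustrative, not the optimized values from the paper.

      import numpy as np

      def correlation_indicator(window):
          """Upper-triangular correlation coefficients of a (samples, sensors) window."""
          c = np.corrcoef(window, rowvar=False)
          iu = np.triu_indices_from(c, k=1)
          return c[iu]

      def detect_events(data, baseline, win=60, threshold=1.5):
          """Flag windows whose correlation indicator is far (in Euclidean
          distance) from a baseline indicator derived from clean data.

          data, baseline : (n_samples, n_sensors) arrays of sensor readings
          """
          ref = correlation_indicator(baseline)
          alarms = np.zeros(len(data), dtype=bool)
          for k in range(win, len(data)):
              ind = correlation_indicator(data[k - win:k])
              alarms[k] = np.linalg.norm(ind - ref) > threshold
          return alarms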

  3. Event Detection for Hydrothermal Plumes: A case study at Grotto Vent

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.

    2012-12-01

    Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique modified from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional changes varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. Still, considerable complexity overlies the "normal" tidal response pattern (the data has a dominant frequency of

  4. Real-time gait event detection for normal subjects from lower trunk accelerations.

    PubMed

    González, Rafael C; López, Antonio M; Rodriguez-Uría, Javier; Alvarez, Diego; Alvarez, Juan C

    2010-03-01

    In this paper we report on a novel algorithm for the real-time detection and timing of initial contact (IC) and final contact (FC) gait events. We process the vertical and antero-posterior accelerations registered at the lower trunk (L3 vertebra). The algorithm is based on a set of heuristic rules extracted from a set of 1719 steps. An independent experiment was conducted to compare the results of our algorithms with those obtained from a Digimax force platform. The results show small deviations from the times of occurrence of events recorded by the platform (13+/-35 ms for IC and 9+/-54 ms for FC). Results for the FC timing are especially relevant in this field, as no previous work has addressed its temporal location through the processing of lower trunk accelerations. The delay in the real-time detection of the IC is 117+/-39 ms and 34+/-72 ms for the FC, improving previously reported results for real-time detection of events from lower trunk accelerations.

  5. Adverse event detection (AED) system for continuously monitoring and evaluating structural health status

    NASA Astrophysics Data System (ADS)

    Yun, Jinsik; Ha, Dong Sam; Inman, Daniel J.; Owen, Robert B.

    2011-03-01

    Structural damage for spacecraft is mainly due to impacts such as collision of meteorites or space debris. We present a structural health monitoring (SHM) system for space applications, named Adverse Event Detection (AED), which integrates an acoustic sensor, an impedance-based SHM system, and a Lamb wave SHM system. With these three health-monitoring methods in place, we can determine the presence, location, and severity of damage. An acoustic sensor continuously monitors acoustic events, while the impedance-based and Lamb wave SHM systems are in sleep mode. If an acoustic sensor detects an impact, it activates the impedance-based SHM. The impedance-based system determines if the impact incurred damage. When damage is detected, it activates the Lamb wave SHM system to determine the severity and location of the damage. Further, since an acoustic sensor dissipates much less power than the two SHM systems and the two systems are activated only when there is an acoustic event, our system reduces overall power dissipation significantly. Our prototype system demonstrates the feasibility of the proposed concept.

  6. Early detection of cell activation events by means of attenuated total reflection Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Filfili, Chadi; Hilliard, Julia K.; Ward, John A.; Unil Perera, A. G.

    2014-06-01

    Activation of Jurkat T-cells in culture following treatment with anti-CD3 (Cluster of Differentiation 3) antibody is detectable by interrogating the treated T-cells using the Attenuated Total Reflection-Fourier Transform Infrared (ATR-FTIR) Spectroscopy technique. Cell activation was detected within 75 min after the cells encountered specific immunoglobulin molecules. Spectral markers noted following ligation of the CD3 receptor with anti-CD3 antibody provide proof-of-concept that ATR-FTIR spectroscopy is a sensitive measure of molecular events subsequent to cells interacting with anti-CD3 Immunoglobulin G. The resultant ligation of the CD3 receptor results in the initiation of well-defined, specific signaling pathways that parallel the measurable molecular events detected using ATR-FTIR. Paired t-tests with post-hoc Bonferroni corrections for multiple comparisons identified statistically significant spectral markers (p < 0.02) at 1367 and 1358 cm-1. Together, these data demonstrate that early treatment-specific cellular events can be measured by ATR-FTIR and that this technique can be used to identify specific agents via the responses of the cell biosensor at different time points postexposure.

  7. Event-Triggered Fault Detection Filter Design for a Continuous-Time Networked Control System.

    PubMed

    Wang, Yu-Long; Shi, Peng; Lim, Cheng-Chew; Liu, Yuan

    2016-12-01

    This paper studies the problem of event-triggered fault detection filter (FDF) and controller coordinated design for a continuous-time networked control system (NCS) with biased sensor faults. By considering sensor-to-FDF network-induced delays and packet dropouts, which do not impose a constraint on the event-triggering mechanism, and proposing an event-triggering mechanism based jointly on the network bandwidth utilization ratio and the fault occurrence probability, a new closed-loop model for the considered NCS is established. Based on the established model, the event-triggered H∞ performance analysis and the FDF and controller coordinated design are presented. The combined mutually exclusive distribution and Wirtinger-based integral inequality approach is proposed for the first time to deal with integral inequalities for products of vectors. This approach is proved to be less conservative than the existing Wirtinger-based integral inequality approach. The designed FDF and controller can guarantee the sensitivity of the residual signal to faults and the robustness of the NCS to external disturbances. The simulation results verify the effectiveness of the proposed event-triggering mechanism and of the FDF and controller coordinated design.

  8. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    DOEpatents

    De Geronimo, Gianluigi; Bolotnikov, Aleksey E.; Carini, Gabriella

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, a cathode electrode, a collecting grid, and a non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage of less than 0 volts is applied to the cathode electrode, a voltage greater than the cathode voltage is applied to the non-collecting grid, and a voltage greater than the non-collecting grid voltage is applied to the collecting grid. The collecting-grid and non-collecting-grid signals are summed and subtracted, creating a sum and a difference, respectively, and the difference and the sum are divided, creating a ratio. A gain coefficient is determined for each depth (the distance between the ionizing event and the collecting grid), and the depth-corrected energy of each ionizing event is the difference between the collecting-grid and non-collecting-grid signals multiplied by the corresponding gain coefficient. The depth of the ionizing event can also be determined from the ratio.
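
    A worked sketch of the arithmetic described above, assuming the ratio is formed as difference over sum and using a hypothetical calibration table of depth-dependent gain coefficients (in practice these would come from calibrating the actual sensor):

        import numpy as np

        # Hypothetical gain coefficients indexed by depth bin (placeholder values).
        GAIN_BY_DEPTH_BIN = np.linspace(1.00, 1.15, 32)

        def depth_corrected_energy(collecting, non_collecting):
            """Depth-corrected energy of one ionizing event from co-planar grid signals."""
            difference = collecting - non_collecting
            ratio = difference / (collecting + non_collecting)  # encodes interaction depth
            # Map the ratio onto a calibration bin; this linear binning is an assumption.
            depth_bin = int(np.clip(ratio, 0.0, 1.0) * (len(GAIN_BY_DEPTH_BIN) - 1))
            return difference * GAIN_BY_DEPTH_BIN[depth_bin]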

  9. Data mining in the US Vaccine Adverse Event Reporting System (VAERS): early detection of intussusception and other events after rotavirus vaccination.

    PubMed

    Niu, M T; Erwin, D E; Braun, M M

    2001-09-14

    The Vaccine Adverse Event Reporting System (VAERS) is the US passive surveillance system monitoring vaccine safety. A major limitation of VAERS is the lack of denominator data (number of doses of administered vaccine), an element necessary for calculating reporting rates. Empirical Bayesian data mining, a data analysis method, utilizes the number of events reported for each vaccine and statistically screens the database for higher-than-expected vaccine-event combinations, signaling a potential vaccine-associated event. This is the first study of data mining in VAERS designed to test the utility of this method to retrospectively detect a known side effect of vaccination: intussusception following rotavirus (RV) vaccine. From October 1998 to December 1999, 112 cases of intussusception were reported. The data mining method was able to detect a signal for RV-intussusception in February 1999 when only four cases were reported. These results demonstrate the utility of data mining to detect significant vaccine-associated events at an early date. Data mining appears to be an efficient and effective computer-based program that may enhance early detection of adverse events in passive surveillance systems.

  10. Syndromic Surveillance Based on Emergency Visits: A Reactive Tool for Unusual Events Detection

    PubMed Central

    Vilain, Pascal; Bourdé, Arnaud; Cassou, Pierre-Jean Marianne dit; Jacques-Antoine, Yves; Morbidelli, Philippe; Filleul, Laurent

    2013-01-01

    Objective To show with examples that a syndromic surveillance system can be a reactive tool for public health surveillance. Introduction Health events such as the 2003 heat wave showed the need for public health surveillance in France to evolve. Thus, the French Institute for Public Health Surveillance has developed syndromic surveillance systems based on several information sources, such as emergency departments (1). In Reunion Island, the chikungunya outbreak of 2005–2006 and then the influenza pandemic of 2009 contributed to the implementation and development of this surveillance system (2–3). In recent years, this tool has made it possible to follow and measure the impact of seasonal epidemics. Nevertheless, its usefulness for the detection of minor unusual events had yet to be demonstrated. Methods In Reunion Island, the syndromic surveillance system is based on the activity of six emergency departments. Two types of indicators are constructed from the collected data: qualitative indicators for alerts (every visit whose diagnosis relates to a notifiable disease or a potential epidemic disease), and quantitative indicators for epidemic/cluster detection (number of visits based on syndromic grouping). Daily and weekly analyses are carried out. A decision algorithm is used to validate the signal and to organize an epidemiological investigation if necessary. Results Each year, about 150 000 visits are registered in the six emergency departments, i.e., 415 consultations per day on average. Several small-scale unusual health events were detected early. In August 2011, the surveillance system detected the first autochthonous cases of measles a few days before this notifiable disease was reported to health authorities (Figure 1). In January 2012, emergency department data made it possible to validate a signal of viral meningitis, to detect a cluster in the west of the island, and to follow its trend. In June 2012, a family foodborne illness

  11. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warnings can be issued upon the detection of a seismic event which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal-to-noise ratio for the tsunami signal, which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework is used, in which a tsunamigenic signal is detected by scanning a network of hydrodynamic stations with water level sensing. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the hydrodynamic stations sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal at the open ocean itself conserves its birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded

  12. BioSense: implementation of a National Early Event Detection and Situational Awareness System.

    PubMed

    Bradley, Colleen A; Rolka, H; Walker, D; Loonsk, J

    2005-08-26

    BioSense is a CDC initiative to support enhanced early detection, quantification, and localization of possible biologic terrorism attacks and other events of public health concern on a national level. The goals of the BioSense initiative are to advance early detection by providing the standards, infrastructure, and data acquisition for near real-time reporting, analytic evaluation and implementation, and early event detection support for state and local public health officials. BioSense collects and analyzes Department of Defense and Department of Veterans Affairs ambulatory clinical diagnoses and procedures and Laboratory Corporation of America laboratory-test orders. The application summarizes and presents analytical results and data visualizations by source, day, and syndrome for each ZIP code, state, and metropolitan area through maps, graphs, and tables. An initial proof-of-concept evaluation project was conducted before the system was made available to state and local users in April 2004. User recruitment involved identifying and training BioSense administrators and users from state and local health departments. User support has been an essential component of the implementation and enhancement process. CDC initiated the BioIntelligence Center (BIC) in June 2004 to conduct internal monitoring of BioSense national data daily. BIC staff have supported state and local system monitoring, conducted data anomaly inquiries, and communicated with state and local public health officials. Substantial investments will be made in providing regional, state, and local data for early event detection and situational awareness, test beds for data and algorithm evaluation, detection algorithm development, and data management technologies, while maintaining the focus on state and local public health needs.

  13. Group localisation and unsupervised detection and classification of basic crowd behaviour events for surveillance applications

    NASA Astrophysics Data System (ADS)

    Roubtsova, Nadejda S.; de With, Peter H. N.

    2013-02-01

    Technology for monitoring crowd behaviour is in demand for surveillance and security applications. The trend in research is to tackle detection of complex crowd behaviour events (panic, fight, evacuation etc.) directly using machine learning techniques. In this paper, we present a contrary, bottom-up approach seeking basic group information: (1) instantaneous location and (2) the merge, split and lateral slide-by events - the three basic motion patterns comprising any crowd behaviour. The focus on such generic group information makes our algorithm suitable as a building block in a variety of surveillance systems, possibly integrated with static content analysis solutions. Our feature extraction framework has optical flow in its core. The framework is universal being motion-based, rather than object-detection-based and generates a large variety of motion-blob-characterising features useful for an array of classification problems. Motion-based characterisation is performed on a group as an atomic whole and not by means of superposition of individual human motions. Within that feature space, our classification system makes decisions based on heuristic rules and thresholds, without machine learning. Our system performs well on group localisation, consistently generating contours around both moving and halted groups. The visual output of our periodical group localisation is equivalent to tracking and the group contour accuracy ranges from adequate to exceptionally good. The system successfully detects and classifies within our merge/split/slide-by event space in surveillance-type video sequences, differing in resolution, scale, quality and motion content. Quantitatively, its performance is characterised by a good recall: 83% on detection and 71% on combined detection and classification.

  14. Seismic Event Identification Using Scanning Detection: A Comparison of Denoising and Classification Methods

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; MacCarthy, J. K.; Giudicepietro, F.

    2005-12-01

    Automatic detection and classification methods are increasingly important in observatory operations, as the volume and rate of incoming data exceed the capacity of human analysis staff to process the data in near-real-time. We explore the success of scanning detection for similar event identification in a variety of seismic waveform catalogs. Several waveform pre-processing methods are applied to previously recorded events which are scanned through triggered and continuous waveform catalogs to determine the success and false alarm rate for detections of repeating signals. Pre-processing approaches include adaptive cross-coherency filtering, adaptive auto-associative neural network filtering, discrete wavelet packet decomposition and linear predictive coding, as well as suites of standard bandpass filters. Classification/detection methods for the various pre-processed signals are applied to investigate the robustness of the individual and combined approaches. The classifiers as applied to the processed waveforms include dendrogram-based clustering and neural network classifiers. We will present findings for the various combinations of methods as applied to tectonic earthquakes, mine blasts and volcanic seismicity.

  15. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the data, as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection, requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation System (GLDAS) data set.
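
    A minimal sketch of the SPRT step for a single time series, assuming Gaussian residuals with known variance and a shifted mean under the anomalous hypothesis; the thresholds follow Wald's approximations, and the parameter values are illustrative:

        import math

        def sprt_alarms(values, mu0, mu1, sigma, alpha=0.01, beta=0.01):
            """Sequential probability ratio test for a mean shift (H0: mu0 vs H1: mu1).

            Accumulates the log-likelihood ratio and raises an alarm whenever the
            upper Wald threshold is crossed; the statistic is reset after a decision.
            """
            upper = math.log((1 - beta) / alpha)   # decide H1 (anomaly)
            lower = math.log(beta / (1 - alpha))   # decide H0 (normal)
            llr, alarms = 0.0, []
            for t, x in enumerate(values):
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
                if llr >= upper:
                    alarms.append(t)
                    llr = 0.0
                elif llr <= lower:
                    llr = 0.0
            return alarms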

  16. Event Detection Using Mobile Phone Mass GPS Data and Their Reliability Verification by DMSP/OLS Night Light Image

    NASA Astrophysics Data System (ADS)

    Akiyama, Yuki; Ueyama, Satoshi; Shibasaki, Ryosuke; Adachi, Ryuichiro

    2016-06-01

    In this study, we developed a method to detect sudden population concentration on a certain day and area, that is, an "Event," all over Japan in 2012 using mass GPS data provided by mobile phone users. First, stay locations of all phone users were detected using existing methods. Second, areas and days where Events occurred were detected by aggregation of mass stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influences of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development of urban planning, disaster prevention, and transportation management.
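
    A toy sketch of the aggregation step, assuming stay locations have already been extracted and approximating the 1-km-square grid by rounding coordinates; the paper's year-long handling of recurring Events is reduced here to a simple per-cell mean comparison:

        from collections import Counter, defaultdict

        def detect_grid_events(stay_points, factor=3.0):
            """Flag (day, grid cell) pairs with unusually many visitors.

            stay_points: iterable of (day, lat, lon) tuples for detected stay locations.
            A cell/day is flagged when its count exceeds `factor` times that cell's
            mean daily count.
            """
            counts = Counter()
            for day, lat, lon in stay_points:
                cell = (round(lat, 2), round(lon, 2))  # crude ~1-km grid cell proxy
                counts[(day, cell)] += 1

            per_cell = defaultdict(list)
            for (day, cell), n in counts.items():
                per_cell[cell].append(n)

            events = []
            for (day, cell), n in counts.items():
                if n > factor * (sum(per_cell[cell]) / len(per_cell[cell])):
                    events.append((day, cell, n))
            return events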

  17. Fault detection and isolation in manufacturing systems with an identified discrete event model

    NASA Astrophysics Data System (ADS)

    Roth, Matthias; Schneider, Stefan; Lesage, Jean-Jacques; Litz, Lothar

    2012-10-01

    In this article, a generic method for fault detection and isolation (FDI) in manufacturing systems considered as discrete event systems (DES) is presented. The method uses an identified model of the closed loop of plant and controller, built on the basis of observed fault-free system behaviour. An identification algorithm known from the literature is used to determine the fault detection model in the form of a non-deterministic automaton. New results on how to parameterise this algorithm are reported. To assess the fault detection capability of an identified automaton, probabilistic measures are proposed. For fault isolation, the concept of residuals adapted for DES is used by defining appropriate set operations representing generic fault symptoms. The method is applied to a case study system.
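
    A highly simplified sketch of the identification-based idea, treating the identified model as the set of event transitions observed during fault-free runs; the actual method identifies a parameterised non-deterministic automaton and uses probabilistic measures and residual set operations that are not reproduced here:

        def identify_model(fault_free_sequences):
            """'Identify' a model as the set of observed event-to-event transitions."""
            transitions = set()
            for seq in fault_free_sequences:
                transitions.update(zip(seq, seq[1:]))
            return transitions

        def detect_fault(event_stream, transitions):
            """Return the first transition never seen in fault-free behaviour, or None."""
            for pair in zip(event_stream, event_stream[1:]):
                if pair not in transitions:
                    return pair  # fault symptom: unexpected transition
            return None

        model = identify_model([["start", "clamp", "drill", "release", "stop"]])
        print(detect_fault(["start", "clamp", "release", "stop"], model))  # ('clamp', 'release')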

  18. Bio-inspired WSN architecture: event detection and localization in a fault tolerant WSN

    NASA Astrophysics Data System (ADS)

    Alayev, Yosef; Damarla, Thyagaraju

    2009-05-01

    One can think of the human body as a sensory network. In particular, skin has several neurons that provide the sense of touch with different sensitivities, and neurons for communicating the sensory signals to the brain. Even though skin might occasionally experience some lacerations, it performs remarkably well (fault tolerant) with the failure of some sensors. One of the challenges in collaborative wireless sensor networks (WSN) is fault-tolerant detection and localization of targets. In this paper we present a biologically inspired architecture model for WSN. Diagnosis of sensors in the WSN model presented here is derived from the concept of the immune system. We present an architecture for WSN for detection and localization of multiple targets inspired by the human nervous system. We show that the advantages of such bio-inspired networks are reduced data for communication, self-diagnosis to detect faulty sensors in real-time and the ability to localize events. We present the results of our algorithms on simulation data.

  19. The temporal reliability of sound modulates visual detection: an event-related potential study.

    PubMed

    Li, Qi; Wu, Yan; Yang, Jingjing; Wu, Jinglong; Touge, Tetsuo

    2015-01-01

    Utilizing the high temporal resolution of event-related potentials (ERPs), we examined the effects of temporal reliability of sounds on visual detection. Significantly faster reaction times to visual target stimuli were observed when reliable temporal information was provided by a task-irrelevant auditory stimulus. Three main ERP components related to the effects of auditory temporal reliability were found: the first at 180-240 ms over a wide central area, the second at 300-400 ms over an anterior area, and the third at 300-380 ms over bilateral temporal areas. Our results support the hypothesis that temporal reliability affects visual detection and indicate that auditory facilitation of visual detection is partly due to spread of attention and thus results from implicit temporal linking of auditory and visual information at a relatively late processing stage.

  20. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs were approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, has been introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
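
    A sketch of the PRR computation itself, with the pair counting organized the way a map/reduce job would treat it (map each report to a (drug, event) key, reduce by summing counts per key); the distributed execution and FDA-specific preprocessing are omitted:

        from collections import Counter

        def prr_scores(reports):
            """Proportional Reporting Ratio per (drug, event) pair.

            reports: list of (drug, event) tuples extracted from spontaneous reports.
            PRR = [a / (a + b)] / [c / (c + d)], where a..d are the cells of the
            2x2 contingency table for one drug-event pair.
            """
            pair_counts = Counter(reports)               # "reduce": count per key
            drug_counts = Counter(d for d, _ in reports)
            event_counts = Counter(e for _, e in reports)
            total = len(reports)

            scores = {}
            for (drug, event), a in pair_counts.items():
                b = drug_counts[drug] - a                # this drug, other events
                c = event_counts[event] - a              # other drugs, this event
                d = total - a - b - c                    # other drugs, other events
                if b > 0 and c > 0 and d > 0:
                    scores[(drug, event)] = (a / (a + b)) / (c / (c + d))
            return scores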

  1. An evaluation of generalized likelihood Ratio Outlier Detection to identification of seismic events in Western China

    SciTech Connect

    Taylor, S.R.; Hartse, H.E.

    1996-09-24

    The Generalized Likelihood Ratio Outlier Detection Technique for seismic event identification is evaluated using synthetic test data and frequency-dependent Pg/Lg measurements from western China. For most seismic stations that are to be part of the proposed International Monitoring System for the Comprehensive Test Ban Treaty, there will be few or no nuclear explosions in the magnitude range of interest (e.g. Mb < 4) on which to base an event-identification system using traditional classification techniques. Outlier detection is a reasonable alternative approach to the seismic discrimination problem when no calibration explosions are available. Distance-corrected Pg/Lg data in seven different frequency bands ranging from 0.5 to 8 Hz from the Chinese Digital Seismic Station WMQ are used to evaluate the technique. The data are collected from 157 known earthquakes, 215 unknown events (presumed earthquakes and possibly some industrial explosions), and 18 known nuclear explosions (1 from the Chinese Lop Nor test site and 17 from the East Kazakh test site). A feature selection technique is used to find the best combination of discriminants to use for outlier detection. Good discrimination performance is found by combining a low-frequency (0.5 to 1 Hz) Pg/Lg ratio with high-frequency ratios (e.g. 2 to 4 and 4 to 8 Hz). Although the low-frequency ratio does not discriminate between earthquakes and nuclear explosions well by itself, it can be effectively combined with the high-frequency discriminants. Based on the tests with real and synthetic data, the outlier detection technique appears to be an effective approach to seismic monitoring in uncalibrated regions.

  2. Exupery volcano fast response system - The event detection and waveform classification system

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    Volcanic eruptions are often preceded by seismic activity which can be used to quantify the volcanic activity, since the number and the size of certain types of seismic events usually increase before periods of volcanic crisis. The implementation of an automatic detection and classification system for seismic signals of volcanic origin allows not only for the processing of large amounts of data in a short time, but also provides consistent and time-invariant results. Here, we have developed a system based upon a combination of different methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely distributed in active volcano monitoring observatories worldwide. Among those software modules are classical trigger algorithms such as STA/LTA and cross-correlation master-event matching, which is also used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters, which are also used to quantify the volcanic activity. Most automatic classification systems need a sufficiently large pre-classified data set for training the system. However, in the case of a volcanic crisis we are often confronted with a lack of training data: prior data acquisition might have been carried out with different equipment at a small number of sites, and due to the imminent crisis there might be no time for the time-consuming and tedious process of preparing a training data set. For this reason we have developed a novel seismic event spotting technique in order to be less dependent on the existence of previously acquired databases of event classes. One main goal is therefore to provide observatory staff with a robust event classification based on a minimum number of reference waveforms. By using a "learning-while-recording" approach we are allowing for the fast build-up of a
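
    A minimal sketch of the classical STA/LTA trigger mentioned above (window lengths and threshold are illustrative; production modules such as those used with Earthworm also handle de-triggering, data gaps, and pre-filtering):

        import numpy as np

        def sta_lta_onsets(trace, fs, sta_win=1.0, lta_win=30.0, on=3.5):
            """Return sample indices where the STA/LTA ratio first exceeds `on`."""
            energy = np.asarray(trace, dtype=float) ** 2
            n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # short-term averages
            lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # long-term averages
            # Compare the STA and LTA windows that end at the same sample.
            ratio = sta[n_lta - n_sta:] / np.maximum(lta, 1e-12)
            above = ratio > on
            return np.flatnonzero(above[1:] & ~above[:-1]) + 1 + n_lta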

  3. Clinical outcome of subchromosomal events detected by whole‐genome noninvasive prenatal testing

    PubMed Central

    Helgeson, J.; Wardrop, J.; Boomer, T.; Almasri, E.; Paxton, W. B.; Saldivar, J. S.; Dharajiya, N.; Monroe, T. J.; Farkas, D. H.; Grosu, D. S.

    2015-01-01

    Abstract Objective A novel algorithm to identify fetal microdeletion events in maternal plasma has been developed and used in clinical laboratory‐based noninvasive prenatal testing. We used this approach to identify the subchromosomal events 5pdel, 22q11del, 15qdel, 1p36del, 4pdel, 11qdel, and 8qdel in routine testing. We describe the clinical outcomes of those samples identified with these subchromosomal events. Methods Blood samples from high‐risk pregnant women submitted for noninvasive prenatal testing were analyzed using low-coverage whole-genome massively parallel sequencing. Sequencing data were analyzed using a novel algorithm to detect trisomies and microdeletions. Results In testing 175 393 samples, 55 subchromosomal deletions were reported. The overall positive predictive value for each subchromosomal aberration ranged from 60% to 100% for cases with diagnostic and clinical follow‐up information. The total false positive rate was 0.0017% for confirmed false positive results; the false negative rate and sensitivity were not conclusively determined. Conclusion Noninvasive testing can be expanded into the detection of subchromosomal copy number variations, while maintaining overall high test specificity. In the current setting, our results demonstrate high positive predictive values for testing of rare subchromosomal deletions. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons Ltd. PMID:26088833

  4. Gait event detection for use in FES rehabilitation by radial and tangential foot accelerations.

    PubMed

    Rueterbories, Jan; Spaich, Erika G; Andersen, Ole K

    2014-04-01

    Gait rehabilitation by Functional Electrical Stimulation (FES) requires a reliable trigger signal to start the stimulations. This could be obtained by a simple switch under the heel or by means of an inertial sensor system. This study provides an algorithm to detect gait events in differential acceleration signals of the foot. The key feature of differential measurements is that they compensate for the impact of gravity. The real-time detection capability of a rule-based algorithm in healthy and hemiparetic individuals was investigated. Detection accuracy and precision compared to signals from foot switches were evaluated. The algorithm detected curve features of the vectorial sum of radial and tangential accelerations and mapped those to discrete gait states. The results showed detection rates for healthy and hemiparetic gait ranging from 84.2% to 108.5%. The sensitivity was between 0.81 and 1, and the specificity between 0.85 and 1, depending on gait phase and group of subjects. The algorithm detected gait phase changes earlier than the reference. Differential acceleration signals combined with the proposed algorithm have the potential to be implemented in a future FES system.

  5. Detecting regular sound changes in linguistics as events of concerted evolution

    SciTech Connect

    Hruschka, Daniel  J.; Branford, Simon; Smith, Eric  D.; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  6. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    newborn infants [5] as well as the monitoring of fatigue in prolonged driving simulations [6]. In many of these settings, the experiments may last...

  7. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution, and often destroy abundant gelatinous animals resulting in species undersampling. Video camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50m to 4000m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames is developed by detecting whether or not there is an interesting candidate object for an animal present in a particular sequence of underwater video -- video frames that do not contain any "interesting" events. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are

  8. Detecting consciousness in a total locked-in syndrome: an active event-related paradigm.

    PubMed

    Schnakers, Caroline; Perrin, Fabien; Schabus, Manuel; Hustinx, Roland; Majerus, Steve; Moonen, Gustave; Boly, Melanie; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurelie; Laureys, Steven

    2009-08-01

    Total locked-in syndrome is characterized by tetraplegia, anarthria and paralysis of eye motility. In this study, consciousness was detected in a 21-year-old woman who presented with total locked-in syndrome after a basilar artery thrombosis (49 days post-injury) using an active event-related paradigm. The patient was presented with sequences of names containing the patient's own name and other names. The patient was instructed to count her own name or to count another target name. Similar to 4 age- and gender-matched healthy controls, the P3 response recorded for the voluntarily counted own name was larger than while passively listening. This P3 response was observed 14 days before the first behavioral signs of consciousness. This study shows that our active event-related paradigm made it possible to identify voluntary brain activity in a patient who would behaviorally be diagnosed as comatose.

  9. Detecting tidal disruption events of massive black holes in normal galaxies with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Ling, Z.-X.; Zhao, D. H.; Zhang, S.-N.; Osborne, J. P.; O'Brien, P.; Willingale, R.; Lapington, J.

    2016-02-01

    Stars are tidally disrupted and accreted when they approach massive black holes (MBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible. Here, we present the proposed Einstein Probe mission, which is a dedicated time-domain soft X-ray all-sky monitor aiming at detecting X-ray transients including TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60° × 60°), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry a more sensitive telescope for X-ray follow-ups, and will be capable of issuing public transient alerts rapidly. Einstein Probe is expected to revolutionise the field of TDE research by detecting several tens to hundreds of events per year from the early phase of flares, many with long-term, well sampled lightcurves.

  10. Foot contact event detection using kinematic data in cerebral palsy children and normal adults gait.

    PubMed

    Desailly, Eric; Daniel, Yepremian; Sardain, Philippe; Lacouture, Patrick

    2009-01-01

    Initial contact (IC) and toe off (TO) times are essential measurements in the analysis of temporal gait parameters, especially in cerebral palsy (CP) gait analysis. A new gait event detection algorithm, called the high pass algorithm (HPA) has been developed and is discussed in this paper. Kinematics of markers on the heel and metatarsal are used. Their forward components are high pass filtered, to amplify the contact discontinuities, thus the local extrema of the processed signal correspond to IC and TO. The accuracy and precision of HPA are compared with the gold standard of foot contact event detection, that is, force plate measurements. Furthermore HPA is compared with two other kinematics methods. This study has been conducted on 20 CP children and on eight normal adults. For normal subjects all the methods performed equally well. True errors in HPA (mean+/-standard deviation) were found to be 1+/-23 ms for IC and 2+/-25 ms for TO in CP children. These results were significantly (p<0.05) more accurate and precise than those obtained using the other algorithms. Moreover, in the case of pathological gaits, the other methods are not suitable for IC detection when IC is flatfoot or forefoot. In conclusion, the HPA is a simple and robust algorithm, which performs equally well for adults and actually performs better when applied to the gait of CP children. It is therefore recommended as the method of choice.
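
    A minimal sketch of the high-pass-and-extrema idea, assuming a forward (antero-posterior) heel or metatarsal marker trajectory sampled at fs Hz; the cutoff frequency, minimum extrema spacing, and the assignment of maxima/minima to IC and TO are illustrative assumptions, not the validated HPA settings:

        import numpy as np
        from scipy.signal import argrelextrema, butter, filtfilt

        def hpa_candidates(forward_pos, fs, cutoff=7.0):
            """High-pass the forward marker position and return candidate IC/TO indices."""
            b, a = butter(4, cutoff / (fs / 2.0), btype="highpass")
            filtered = filtfilt(b, a, np.asarray(forward_pos, dtype=float))
            spacing = max(1, int(0.1 * fs))                  # suppress nearby extrema
            ic = argrelextrema(filtered, np.greater, order=spacing)[0]
            to = argrelextrema(filtered, np.less, order=spacing)[0]
            return ic, to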

  11. Femtomolar detection of single mismatches by discriminant analysis of DNA hybridization events using gold nanoparticles.

    PubMed

    Ma, Xingyi; Sim, Sang Jun

    2013-03-21

    Even though DNA-based nanosensors have been demonstrated for quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for further understanding of DNA mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate the hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combined in a suitable orientation following the hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single mismatched (MMT) and perfectly matched DNA (PMT) were different. Therefore, we obtained an optimized salt concentration that allowed for discrimination of MMT from PMT without stringent control of temperature or pH. The results indicated this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.

  12. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the data, as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).

  13. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    PubMed

    Cron, Andrew; Gouttefangeas, Cécile; Frelinger, Jacob; Lin, Lin; Singh, Satwinder K; Britten, Cedrik M; Welters, Marij J P; van der Burg, Sjoerd H; West, Mike; Chan, Cliburn

    2013-01-01

    Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a consistent labeling
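
    A non-hierarchical sketch of the underlying Dirichlet process Gaussian mixture step, using scikit-learn's BayesianGaussianMixture as a stand-in; the hierarchical extension (HDPGMM) that shares information across samples is the paper's contribution and is not provided by this library call:

        from sklearn.mixture import BayesianGaussianMixture

        def fit_dp_mixture(events, max_components=30):
            """Cluster flow cytometry events (n_cells x n_markers) with a DP mixture.

            The Dirichlet process prior effectively prunes unused components, which
            lets rare subsets occupy small, dedicated components.
            """
            model = BayesianGaussianMixture(
                n_components=max_components,
                weight_concentration_prior_type="dirichlet_process",
                covariance_type="full",
                max_iter=500,
                random_state=0,
            )
            labels = model.fit_predict(events)
            return model, labels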

  14. The Evaluation of a Pulmonary Display to Detect Adverse Respiratory Events Using High Resolution Human Simulator

    PubMed Central

    Wachter, S. Blake; Johnson, Ken; Albert, Robert; Syroid, Noah; Drews, Frank; Westenskow, Dwayne

    2006-01-01

    Objective Authors developed a picture-graphics display for pulmonary function to present typical respiratory data used in perioperative and intensive care environments. The display utilizes color, shape and emergent alerting to highlight abnormal pulmonary physiology. The display serves as an adjunct to traditional operating room displays and monitors. Design To evaluate the prototype, nineteen clinician volunteers each managed four adverse respiratory events and one normal event using a high-resolution patient simulator which included the new displays (intervention subjects) and traditional displays (control subjects). Between-group comparisons included (i) time to diagnosis and treatment for each adverse respiratory event; (ii) the number of unnecessary treatments during the normal scenario; and (iii) self-reported workload estimates while managing study events. Measurements Two expert anesthesiologists reviewed video-taped transcriptions of the volunteers to determine time to treat and time to diagnosis. Time values were then compared between groups using a Mann-Whitney-U Test. Estimated workload for both groups was assessed using the NASA-TLX and compared between groups using an ANOVA. P-values < 0.05 were considered significant. Results Clinician volunteers detected and treated obstructed endotracheal tubes and intrinsic PEEP problems faster with graphical rather than conventional displays (p < 0.05). During the normal scenario simulation, 3 clinicians using the graphical display, and 5 clinicians using the conventional display gave unnecessary treatments. Clinician-volunteers reported significantly lower subjective workloads using the graphical display for the obstructed endotracheal tube scenario (p < 0.001) and the intrinsic PEEP scenario (p < 0.03). Conclusion Authors conclude that the graphical pulmonary display may serve as a useful adjunct to traditional displays in identifying adverse respiratory events. PMID:16929038

  15. Automated detection and analysis of depolarization events in human cardiomyocytes using MaDEC.

    PubMed

    Szymanska, Agnieszka F; Heylman, Christopher; Datta, Rupsa; Gratton, Enrico; Nenadic, Zoran

    2016-08-01

    Optical imaging-based methods for assessing the membrane electrophysiology of in vitro human cardiac cells allow for non-invasive temporal assessment of the effect of drugs and other stimuli. Automated methods for detecting and analyzing the depolarization events (DEs) in image-based data allow quantitative assessment of these different treatments. In this study, we use 2-photon microscopy of fluorescent voltage-sensitive dyes (VSDs) to capture the membrane voltage of actively beating human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). We built a custom and freely available Matlab software, called MaDEC, to detect, quantify, and compare DEs of hiPS-CMs treated with the β-adrenergic drugs, propranolol and isoproterenol. The efficacy of our software is quantified by comparing detection results against manual DE detection by expert analysts, and comparing DE analysis results to known drug-induced electrophysiological effects. The software accurately detected DEs with true positive rates of 98-100% and false positive rates of 1-2%, at signal-to-noise ratios (SNRs) of 5 and above. The MaDEC software was also able to distinguish control DEs from drug-treated DEs both immediately as well as 10min after drug administration.

  16. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    SciTech Connect

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim E-mail: leecu@kasi.re.kr

    2014-04-20

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars because it is easy for KMTNet to be saturated around the peak of the events because of its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be significantly detected in the near future.

  17. Gait event detection on level ground and incline walking using a rate gyroscope.

    PubMed

    Catalfamo, Paola; Ghoussayni, Salim; Ewins, David

    2010-01-01

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. Accurate determination of the Initial Contact of the foot with the floor (IC) and the final contact or Foot Off (FO) on different terrains is important. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects walking outdoors on level ground, and up and down an incline. Performance was compared with a reference pressure measurement system. The mean difference between the gyroscope and the reference was less than -25 ms for IC and less than 75 ms for FO for all terrains. Detection success was over 98%. These results provide preliminary evidence supporting the use of the gyroscope for gait event detection on inclines as well as level walking.

  18. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270

  19. Predicting Negative Events: Using Post-discharge Data to Detect High-Risk Patients

    PubMed Central

    Sulieman, Lina; Fabbri, Daniel; Wang, Fei; Hu, Jianying; Malin, Bradley A

    2016-01-01

    Predicting negative outcomes, such as readmission or death, and detecting high-risk patients are important yet challenging problems in medical informatics. Various models have been proposed to detect high-risk patients; however, the state of the art relies on patient information collected before or at the time of discharge to predict future outcomes. In this paper, we investigate the effect of including data generated post discharge to predict negative outcomes. Specifically, we focus on two types of patients admitted to the Vanderbilt University Medical Center between 2010-2013: i) those with an acute event - 704 hip fractures and ii) those with chronic problems — 5250 congestive heart failure (CHF) patients. We show that the post-discharge model improved the AUC of the LACE index, a standard readmission scoring function, by 20 - 30%. Moreover, the new model resulted in higher AUCs by 15 - 27% for hip fracture and 10 - 12% for CHF compared to standard models. PMID:28269914

  20. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
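
    A minimal sketch of the plain swinging door segmentation followed by a naive significance filter (any segment whose start-to-end change exceeds a threshold); the paper's optimization of the door width and the dynamic-programming merge of adjacent ramps are not reproduced:

        def swinging_door_segments(values, epsilon):
            """Segment a series into piecewise-linear intervals; epsilon is the door width."""
            boundaries, anchor = [0], 0
            up, low = float("-inf"), float("inf")
            for i in range(1, len(values)):
                dt = i - anchor
                up = max(up, (values[i] - (values[anchor] + epsilon)) / dt)
                low = min(low, (values[i] - (values[anchor] - epsilon)) / dt)
                if up > low:                 # the doors have crossed: close the segment
                    anchor = i - 1
                    boundaries.append(anchor)
                    up = (values[i] - (values[anchor] + epsilon)) / (i - anchor)
                    low = (values[i] - (values[anchor] - epsilon)) / (i - anchor)
            boundaries.append(len(values) - 1)
            return boundaries

        def significant_ramps(values, boundaries, min_change):
            """Report segments whose start-to-end power change exceeds min_change."""
            return [(s, e, values[e] - values[s])
                    for s, e in zip(boundaries, boundaries[1:])
                    if abs(values[e] - values[s]) >= min_change]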

  1. Label-Free Detection of Single Living Bacteria via Electrochemical Collision Event

    PubMed Central

    Lee, Ji Young; Kim, Byung-Kwon; Kang, Mijeong; Park, Jun Hui

    2016-01-01

    We detected single living bacterial cells on ultramicroelectrode (UME) using a single-particle collision method and optical microscopic methods. The number of collision events involving the bacterial cells indicated in current-time (i-t) curves corresponds to the number of bacterial cells (i.e., Escherichia coli) on the UME surface, as observed visually. Simulations were performed to determine the theoretical current response (75 pA) and frequency (0.47 pM−1 s−1) of single Escherichia coli collisions. The experimental current response (83 pA) and frequency (0.26 pM−1 s−1) were on the same order of magnitude as the theoretical values. This single-particle collision approach facilitates detecting living bacteria and determining their concentration in solution and could be widely applied to studying other bacteria and biomolecules. PMID:27435527

  2. Increasing cognitive load to facilitate lie detection: the benefit of recalling an event in reverse order.

    PubMed

    Vrij, Aldert; Mann, Samantha A; Fisher, Ronald P; Leal, Sharon; Milne, Rebecca; Bull, Ray

    2008-06-01

    In two experiments, we tested the hypotheses that (a) the difference between liars and truth tellers will be greater when interviewees report their stories in reverse order than in chronological order, and (b) instructing interviewees to recall their stories in reverse order will facilitate detecting deception. In Experiment 1, 80 mock suspects told the truth or lied about a staged event and did or did not report their stories in reverse order. The reverse order interviews contained many more cues to deceit than the control interviews. In Experiment 2, 55 police officers watched a selection of the videotaped interviews of Experiment 1 and made veracity judgements. Requesting suspects to convey their stories in reverse order improved police observers' ability to detect deception and did not result in a response bias.

  3. Automatic Event Detection and Characterization of solar events with IRIS, SDO/AIA and Hi-C

    NASA Astrophysics Data System (ADS)

    Alexander, Caroline; Fayock, Brian; Winebarger, Amy

    2016-05-01

    Dynamic, low-lying loops with peak temperatures <1 MK are observed throughout the solar transition region. These loops can be observed in SDO/AIA data thanks to some lower-temperature spectral lines in the passbands, but have not been studied in great detail. We have developed a technique to automatically identify events (i.e., brightenings) on a pixel-by-pixel basis by applying a set of selection criteria. The pixels are then grouped according to their proximity in space and the relative progression of the event. This method allows us to characterize the overall lifetime of these events and the rate at which they occur. Our current progress includes identification of these groups of events in IRIS data, determination of their existence in AIA data, and characterization based on a comparison between the two. This technique has also been used on Hi-C data in preparation for the rocket re-flight in July 2016. Results on the success of this technique at identifying real structures and sources of heating will be shown.
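
    The sketch below illustrates one plausible reading of the described pipeline: flag pixels that brighten above a per-pixel noise threshold, then group flagged pixels by proximity in space and time with connected-component labelling. The threshold and minimum group size are hypothetical, not values from the paper.

        import numpy as np
        from scipy import ndimage

        def detect_brightenings(cube, n_sigma=3.0, min_pixels=4):
            """Flag pixels whose intensity rises n_sigma above their own temporal
            mean, then group flagged pixels that touch in space and time.

            cube: array of shape (n_frames, ny, nx)."""
            mean = cube.mean(axis=0)
            std = cube.std(axis=0)
            flagged = cube > (mean + n_sigma * std)          # pixel-by-pixel selection criterion
            labels, n_events = ndimage.label(flagged)        # 3-D connectivity groups events in space and time
            sizes = ndimage.sum(flagged, labels, index=np.arange(1, n_events + 1))
            keep = np.where(sizes >= min_pixels)[0] + 1      # discard tiny groups as noise
            return labels, keep

        # toy usage: a single transient brightening in a noisy cube
        cube = np.random.randn(50, 64, 64)
        cube[20:25, 30:33, 40:43] += 10.0
        labels, events = detect_brightenings(cube)
        print(len(events), "event(s) found")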

  4. AKSED: adaptive knowledge-based system for event detection using collaborative unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Wang, X. Sean; Lee, Byung Suk; Sadjadi, Firooz

    2006-05-01

    Advances in sensor technology and image processing have made it possible to equip unmanned aerial vehicles (UAVs) with economical, high-resolution, energy-efficient sensors. Despite these improvements, current UAVs lack autonomous and collaborative operation capabilities due to limited bandwidth and limited on-board image processing abilities. The situation, however, is changing. In the next generation of UAVs, much of the image processing can be carried out onboard, and the communication bandwidth problem will ease. More importantly, with more processing power, collaborative operations among a team of autonomous UAVs can provide more intelligent event detection capabilities. In this paper, we present ideas for developing a system that enables target recognition through collaborative operations of autonomous UAVs. UAVs are configured in three stages: manufacturing, mission planning, and deployment. Different sets of information are needed at each stage, and the resulting outcome is an optimized event detection code deployed onto a UAV. The envisioned system architecture and the contemplated methodology, together with the problems to be addressed, are presented.

  5. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    PubMed Central

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-01-01

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
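
    For readers who want to experiment with the general idea, the sketch below characterizes each frame by a flattened covariance-matrix descriptor and trains a one-class SVM on normal frames. It substitutes scikit-learn's batch OneClassSVM for the paper's online LS-OC-SVM, so it is only a rough stand-in, and the feature dimensions and parameters are illustrative.

        import numpy as np
        from sklearn.svm import OneClassSVM

        def covariance_descriptor(frame_features):
            """Covariance matrix of per-pixel features for one frame, flattened to a vector.
            frame_features: (n_pixels, n_features), e.g. intensity and optical-flow components."""
            c = np.cov(frame_features, rowvar=False)
            iu = np.triu_indices_from(c)
            return c[iu]

        # toy data: "normal" frames plus a few abnormal ones with different motion statistics
        rng = np.random.default_rng(0)
        normal = [covariance_descriptor(rng.normal(0, 1, (500, 4))) for _ in range(200)]
        abnormal = [covariance_descriptor(rng.normal(0, 3, (500, 4))) for _ in range(10)]

        clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(np.array(normal))
        pred = clf.predict(np.array(abnormal))   # -1 = flagged as abnormal
        print((pred == -1).mean())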

  6. Development of electrochemical biosensor for detection of pathogenic microorganism in Asian dust events.

    PubMed

    Yoo, Min-Sang; Shin, Minguk; Kim, Younghun; Jang, Min; Choi, Yoon-E; Park, Si Jae; Choi, Jonghoon; Lee, Jinyoung; Park, Chulhwan

    2017-05-01

    We developed a single-walled carbon nanotube (SWCNT)-based electrochemical biosensor for the detection of Bacillus subtilis, one of the microorganisms observed in Asian dust events, which causes respiratory diseases such as asthma and pneumonia. The SWCNTs play the role of a transducer that converts the biological antigen/antibody reaction into an electrical signal, while 1-pyrenebutanoic acid succinimidyl ester (1-PBSE) and anti-B. subtilis antibody serve as a chemical linker and an acceptor, respectively, for the adhesion of the target microorganism in the developed biosensor. The detection range (10^2-10^10 CFU/mL) and the detection limit (10^2 CFU/mL) of the developed biosensor were identified, and the response time was 10 min. In the specificity test, the amount of captured target B. subtilis was the highest among the tested microorganisms (Staphylococcus aureus, Flavobacterium psychrolimnae, and Aquabacterium commune). In addition, the target B. subtilis detected by the developed biosensor was observed by scanning electron microscope (SEM) analysis.

  7. Endpoint visual detection of three genetically modified rice events by loop-mediated isothermal amplification.

    PubMed

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-11-07

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%−0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  8. The Waveform Correlation Event Detection System project, Phase II: Testing with the IDC primary network

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Moore, S.G.

    1998-04-01

    Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.

  9. Slip-Related Changes in Plantar Pressure Distribution, and Parameters for Early Detection of Slip Events

    PubMed Central

    Choi, Seungyoung; Cho, Hyungpil; Kang, Boram; Lee, Dong Hun; Kim, Mi Jung

    2015-01-01

    Objective To investigate differences in plantar pressure distribution between a normal gait and unpredictable slip events to predict the initiation of the slipping process. Methods Eleven male participants were enrolled. Subjects walked onto a wooden tile, and two layers of oily vinyl sheet were placed on the expected spot of the 4th step to induce a slip. An insole pressure-measuring system was used to monitor plantar pressure distribution. This system measured plantar pressure in four regions (the toes, metatarsal head, arch, and heel) for three events: the step during normal gait; the recovered step, when the subject recovered from a slip; and the uncorrected, harmful slipped step. Four variables were analyzed: peak pressure (PP), contact time (CT), the pressure-time integral (PTI), and the instant of peak pressure (IPP). Results The plantar pressure pattern in the heel was unique, as compared with other parts of the sole. In the heel, PP, CT, and PTI values were high in slipped and recovered steps compared with normal steps. The IPP differed markedly among the three steps. The IPPs in the heel for the three events were, in descending order (from latest to earliest), slipped, recovered, and normal steps, whereas in the other regions the order was normal, recovered, and slipped steps. Finally, the metatarsal head-to-heel IPP ratios for the normal, recovered, and slipped steps were 6.1±2.9, 3.1±3.0, and 2.2±2.5, respectively. Conclusion A distinctive plantar pressure pattern in the heel might be useful for early detection of a slip event to prevent slip-related injuries. PMID:26798603
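
    A minimal sketch of how the four reported variables (PP, CT, PTI, IPP) could be computed from one region's pressure time series is given below; the contact threshold is a placeholder, not a value from the study.

        import numpy as np

        def step_parameters(t, p, contact_threshold=1.0):
            """Compute peak pressure (PP), contact time (CT), pressure-time integral (PTI)
            and instant of peak pressure (IPP) for one region during one step.

            t: time stamps (s); p: regional pressure samples; threshold is a placeholder."""
            in_contact = p > contact_threshold
            pp = p.max()
            ct = np.trapz(in_contact.astype(float), t)       # total time above threshold
            pti = np.trapz(np.where(in_contact, p, 0.0), t)  # integral of pressure while in contact
            ipp = t[np.argmax(p)]                            # time at which the peak occurs
            return pp, ct, pti, ipp

        # toy usage: a heel-like pressure pulse
        t = np.linspace(0, 1.0, 200)
        p = 80 * np.exp(-((t - 0.25) / 0.08) ** 2)
        print(step_parameters(t, p))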

  10. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE PAGES

    Hruschka, Daniel  J.; Branford, Simon; Smith, Eric  D.; ...

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  11. Unreported seismic events found far off-shore Mexico using full-waveform, cross-correlation detection method.

    NASA Astrophysics Data System (ADS)

    Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli

    2015-04-01

    A large subset of seismic events does not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that only has clear arrivals on three stations and was therefore not located and reported by the SSN. The other two events are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A rough estimate places these two events in the portion of the East Pacific Rise around 9 N. These two events are detected despite their distance from the search area, thanks to favorable move-out on the array of the Mexican National Seismological Service (SSN) network. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.

  12. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of the unauthorized maize GM event Bt10 into their market and subsequently into the grain channel. In the United States, measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union, an embargo on maize gluten and brewer's grain imports was implemented unless consignments were certified free of Bt10 with a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out by both the company Syngenta, who produced the event, and the European Commission Joint Research Centre, who validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose.

  13. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086
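
    A much simplified sketch of the core idea, differentiating the acceleration into a jerk signal and selecting its prominent peaks under heuristic spacing constraints, is given below; it omits the time-frequency stage and uses hypothetical parameters rather than those of the paper.

        import numpy as np
        from scipy.signal import find_peaks

        def detect_gait_events(acc, fs, min_step_interval=0.4):
            """Very simplified stand-in for the paper's method: differentiate the
            acceleration to get a jerk signal and pick its prominent peaks as
            candidate heel-strike (HS) / toe-off (TO) instants.

            acc: 1-D acceleration signal; fs: sampling rate in Hz."""
            jerk = np.gradient(acc) * fs                     # numerical derivative of acceleration
            distance = int(min_step_interval * fs)           # heuristic: events cannot be closer than this
            pos_peaks, _ = find_peaks(jerk, distance=distance, prominence=jerk.std())
            neg_peaks, _ = find_peaks(-jerk, distance=distance, prominence=jerk.std())
            return pos_peaks / fs, neg_peaks / fs            # candidate HS and TO times (s)

        # toy usage: synthetic periodic gait-like acceleration
        fs = 100
        t = np.arange(0, 10, 1 / fs)
        acc = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(t.size)
        hs, to = detect_gait_events(acc, fs)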

  14. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    PubMed Central

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983

  15. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    NASA Astrophysics Data System (ADS)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, namely landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II), in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10^7 MT/yr (ranking 11th in the world). Typhoon Morakot struck southern Taiwan from Aug. 6-10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfalls in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km^2, equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are the dense landslide distribution and the large share of downslope landslide areas owing to headward erosion and bank erosion during the flooding processes. The area of downslope landslides in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times that of upslope landslide areas. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been shown to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall

  16. The WISE Detection of an Infrared Echo in Tidal Disruption Event ASASSN-14li

    NASA Astrophysics Data System (ADS)

    Jiang, Ning; Dou, Liming; Wang, Tinggui; Yang, Chenwei; Lyu, Jianwei; Zhou, Hongyan

    2016-09-01

    We report the detection of significant infrared variability of the nearest tidal disruption event (TDE) ASASSN-14li using Wide-field Infrared Survey Explorer and newly released Near-Earth Object WISE Reactivation data. In comparison with the quiescent state, the infrared flux is brightened by 0.12 and 0.16 mag in the W1 (3.4 μm) and W2 (4.6 μm) bands at 36 days after the optical discovery (or ~110 days after the peak disruption date). The flux excess is still detectable ~170 days later. Assuming that the flare-like infrared emission is from dust around the black hole, its blackbody temperature is estimated to be ~2.1 × 10^3 K, slightly higher than the dust sublimation temperature, indicating that the dust is likely located close to the dust sublimation radius. The equilibrium between the heating and radiation of the dust implies a bolometric luminosity of ~10^43-10^45 erg s^-1, comparable with the observed peak luminosity. This result has for the first time confirmed the detection of infrared emission from the dust echoes of TDEs.

  17. Rare-event detection and process control for a biomedical application

    NASA Astrophysics Data System (ADS)

    Kegelmeyer, Laura N.

    1990-05-01

    Medical researchers are seeking a method for detecting chromosomal abnormalities in unborn children without requiring invasive procedures such as amniocentesis. Software has been developed to utilize a light microscope to detect fetal cells that occur with very low frequency in a sample of maternal blood. This rare-event detection involves dividing a microscope slide containing a maternal blood sample into as many as 40,000 fields, automatically focusing on each field-of-view, and searching for fetal cells. Size and shape information is obtained by calculating a figure of merit through various binary operations and is used to discriminate fetal cells from noise and artifacts. Once the rare fetal cells are located, the slide is automatically rescanned to count the total number of cells on the slide. Binary operations and image processing hardware are used as much as possible to reduce the total time needed to analyze one slide. Current runtime for scoring one full slide is about four hours, with motorized stage movement and focusing being the speed-limiting factors. Fetal cells occurring with a frequency of less than 1 in 200,000 maternal cells have been consistently found with this system.

  18. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-05

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to enhance the state of the art in SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.

  19. Automatic Detection of Swallowing Events by Acoustical Means for Applications of Monitoring of Ingestive Behavior

    PubMed Central

    Sazonov, Edward S.; Makeyev, Oleksandr; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Edward L.; Neuman, Michael R.

    2010-01-01

    Our understanding of etiology of obesity and overweight is incomplete due to lack of objective and accurate methods for Monitoring of Ingestive Behavior (MIB) in the free living population. Our research has shown that frequency of swallowing may serve as a predictor for detecting food intake, differentiating liquids and solids, and estimating ingested mass. This paper proposes and compares two methods of acoustical swallowing detection from sounds contaminated by motion artifacts, speech and external noise. Methods based on mel-scale Fourier spectrum, wavelet packets, and support vector machines are studied considering the effects of epoch size, level of decomposition and lagging on classification accuracy. The methodology was tested on a large dataset (64.5 hours with a total of 9,966 swallows) collected from 20 human subjects with various degrees of adiposity. Average weighted epoch recognition accuracy for intra-visit individual models was 96.8% which resulted in 84.7% average weighted accuracy in detection of swallowing events. These results suggest high efficiency of the proposed methodology in separation of swallowing sounds from artifacts that originate from respiration, intrinsic speech, head movements, food ingestion, and ambient noise. The recognition accuracy was not related to body mass index, suggesting that the methodology is suitable for obese individuals. PMID:19789095

  20. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) are those that significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetration in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to improve SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.

  1. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    NASA Astrophysics Data System (ADS)

    Shimojo, M.

    2012-08-01

    The high spatial and temporal resolution data obtained by the telescopes aboard Hinode have revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical-flow estimation methods based on OpenCV, the open-source computer vision library. We applied the methods to a prominence eruption observed by NoRH and to a polar X-ray jet observed by XRT. The results show that the methods work well for solar images, provided the images are optimized for them. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
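
    A minimal sketch of how an apparent velocity could be estimated with OpenCV's dense Farneback optical flow is shown below; the choice of routine and parameter values is illustrative, and the pixel scale in the example is an assumption rather than a value from the abstract.

        import cv2
        import numpy as np

        def mean_apparent_velocity(prev_img, next_img, dt, pixel_scale):
            """Estimate a mean apparent velocity between two solar images with
            OpenCV's dense Farneback optical flow.

            dt: time between frames (s); pixel_scale: physical size of one pixel (e.g. km)."""
            flow = cv2.calcOpticalFlowFarneback(prev_img, next_img, None,
                                                pyr_scale=0.5, levels=3, winsize=15,
                                                iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
            speed_pix = np.linalg.norm(flow, axis=2)          # per-pixel displacement in pixels
            return speed_pix.mean() * pixel_scale / dt        # convert to physical velocity

        # toy usage: a feature shifted by 3 pixels between frames
        a = np.zeros((128, 128), np.uint8); a[60:68, 60:68] = 255
        b = np.roll(a, 3, axis=1)
        print(mean_apparent_velocity(a, b, dt=10.0, pixel_scale=725.0))  # km/s if 1 px is ~725 km (assumed)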

  2. DADA: Data Assimilation for the Detection and Attribution of Weather and Climate-related Events

    NASA Astrophysics Data System (ADS)

    Hannart, Alexis; Bocquet, Marc; Carrassi, Alberto; Ghil, Michael; Naveau, Philippe; Pulido, Manuel; Ruiz, Juan; Tandeo, Pierre

    2015-04-01

    We describe a new approach that allows near-real-time, systematic causal attribution of weather and climate-related events. The method is purposely designed to be operable at meteorological centers by synergizing causal attribution with data treatments that are routinely performed when numerically forecasting the weather, thereby taking advantage of their powerful computational and observational capacity. Namely, we show that causal attribution can be obtained as a by-product of the so-called data assimilation procedures that are run on a daily basis to update the meteorological model with new atmospheric observations. We explain the theoretical rationale of this approach and sketch the most prominent features of a "data assimilation-based detection and attribution" (DADA) procedure. The proposal is illustrated in the context of the three-variable Lorenz model. Several practical and theoretical research questions that need to be addressed to make the proposal readily operational within weather forecasting centers are finally laid out.

  3. Event detection and localization for small mobile robots using reservoir computing.

    PubMed

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system operating at the edge of stability, in which only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks that are based solely on a few low-range, high-noise distance sensors. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.

  4. Information Systems Developments to Detect and Analyze Chemotherapy-associated Adverse Drug Events

    PubMed Central

    Weiner, Mark G.; Livshits, Alice; Carozzoni, Carol; McMenamin, Erin; Gibson, Gene; Loren, Alison W.; Hennessy, Sean

    2002-01-01

    A difficult balance exists in the use of cancer chemotherapy in which the cytotoxic medicine must act on the cancer without causing neutropenic fever, a condition that is caused by over-suppression of the immune system. An improved understanding of dosing strategies as well as the use of medications to support the immune system has helped to reduce the likelihood of an admission for neutropenic fever following cancer chemotherapy. Therefore, as with any drug therapy, chemotherapy administration that is temporally associated with an unexpected hospitalization for neutropenia is an adverse drug event (ADE). Analogous to other informatics research to monitor and address the occurrence of ADEs, this work develops and validates the information systems infrastructure necessary to detect the occurrence of and analyze the factors contributing to chemotherapy associated ADEs.

  5. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals, who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in implementing computer algorithms within a comprehensive and holistic drug safety signal detection program.
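
    The abstract does not name specific algorithms, but one simple example of this class of measures is the proportional reporting ratio (PRR), computed from a 2x2 contingency table of drug-event report counts; the sketch below is illustrative only and is not necessarily one of the algorithms the authors review.

        def proportional_reporting_ratio(a, b, c, d):
            """PRR for a drug-event combination from a 2x2 contingency table:
                a: reports with the drug and the event
                b: reports with the drug, other events
                c: reports with other drugs and the event
                d: reports with other drugs, other events"""
            rate_drug = a / (a + b)
            rate_other = c / (c + d)
            return rate_drug / rate_other

        # toy usage: an event reported disproportionately often with one drug
        print(proportional_reporting_ratio(a=20, b=480, c=40, d=9460))  # PRR = 9.5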

  6. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2016-12-01

    Microseismic monitoring is an effective means for providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low SNR microseismic signals often cannot effectively be detected by routine methods. To solve this problem, this paper presents permutation entropy and a support vector machine to detect low SNR microseismic events. First, an extraction method of signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, the detection model of low SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation for the collected vibration signals, constructing a feature vector set of signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which has the features of high classification accuracy and fast real-time algorithms, can meet the requirements of online, real-time extractions of microseismic events.
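
    A minimal sketch of the feature-extraction step, permutation entropy computed over several coarse-graining scales, is shown below; the scales and embedding order are illustrative, and the downstream least squares support vector machine is not reproduced.

        import numpy as np
        from itertools import permutations
        from math import factorial

        def permutation_entropy(x, order=3, delay=1):
            """Normalized permutation entropy of a 1-D signal (Bandt-Pompe ordinal patterns)."""
            n = len(x) - (order - 1) * delay
            patterns = {p: 0 for p in permutations(range(order))}
            for i in range(n):
                window = x[i:i + order * delay:delay]
                patterns[tuple(np.argsort(window))] += 1
            counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
            p = counts / counts.sum()
            return -(p * np.log(p)).sum() / np.log(factorial(order))

        def multiscale_pe(x, scales=(1, 2, 3, 4, 5), order=3):
            """Coarse-grain the signal at several scales and compute PE at each,
            giving a feature vector that could feed a classifier such as an SVM."""
            feats = []
            for s in scales:
                trimmed = x[:len(x) // s * s].reshape(-1, s).mean(axis=1)  # coarse-graining
                feats.append(permutation_entropy(trimmed, order=order))
            return np.array(feats)

        # toy usage: noise has higher PE than a regular oscillation
        t = np.linspace(0, 10, 2000)
        print(multiscale_pe(np.sin(2 * np.pi * 5 * t)), multiscale_pe(np.random.randn(2000)))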

  7. Detecting binarity of GW150914-like lenses in gravitational microlensing events

    NASA Astrophysics Data System (ADS)

    Kesden, Michael; Eilbott, Daniel; Riley, Alexander; Cohn, Jonathan; King, Lindsay

    2017-01-01

    The recent discovery of gravitational waves from stellar-mass binary black holes (BBHs) provided direct evidence of the existence of these systems. These BBHs would have gravitational microlensing signatures that are, due to their large masses and small separations, distinct from single-lens signals. We apply Bayesian statistics to examine the distinguishability of BBH microlensing events from single-lens events under ideal observing conditions, using modern photometric and astrometric capabilities. Given one year of ideal observations, a source star at the Galactic center, a GW150914-like BBH lens (total mass 65 solar masses, mass ratio 0.8) at half that distance, and an impact parameter of 0.4 Einstein radii, we find that BBHs with separations down to 0.00634 Einstein radii are detectable, marginally below the separation at which such systems would merge due to gravitational radiation within the age of the Universe. Supported by Alfred P. Sloan Foundation Grant No. RG-2015-65299 and NSF Grant No. PHY-1607031.

  8. Using the AHRQ PSIs to Detect Post-Discharge Adverse Events in the Veterans Health Administration

    PubMed Central

    Mull, Hillary J.; Borzecki, Ann M.; Chen, Qi; Shin, Marlena H.; Rosen, Amy K.

    2015-01-01

    Background PSIs use inpatient administrative data to flag cases with potentially preventable adverse events (AEs) attributable to hospital care. We explored how many AEs the PSIs identified in the 30 days post-discharge. Methods We ran the PSI software (version 3.1a) on VA 2003–2007 administrative data for ten recently validated PSIs. Among PSI-eligible index hospitalizations not flagged with an AE, we evaluated how many AEs occurred within 1–14 and 15–30 days post-discharge using inpatient and outpatient administrative data. Results Considering all PSI-eligible index hospitalizations, we identified 11,141 post-discharge AEs, compared to 40,578 inpatient-flagged AEs. More than 60% of post-discharge AEs were detected within 14 days of discharge. The majority of post-discharge AEs were decubitus ulcers and postoperative pulmonary embolisms or deep vein thromboses. Conclusions Extending PSI algorithms to the post-discharge period may provide a more complete picture of hospital quality. Future work should use chart review to validate post-discharge PSI events. PMID:23939485

  9. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Banerjee, Torsha

    Detection (PERD) for WSNs. When a single event occurs, a child of the tree sends a Flagged Polynomial (FP) to its parent if the readings it approximates fall outside the data range defining the existing phenomenon. After the aggregation process is over, the root, which holds the two polynomials P and FP, can be queried for FP (approximating the new event region) instead of flooding the whole network. For multiple such events, instead of computing a polynomial corresponding to each new event, areas with the same data range are combined by the corresponding tree nodes and the aggregated coefficients are passed on. Results reveal that a new event can be detected by PERD while the detection error remains constant and below a threshold of 10%. As the node density increases, the accuracy and delay of event detection remain almost constant, making PERD highly scalable. Whenever an event occurs in a WSN, data is generated by nearby sensors, and relaying the data to the base station (BS) makes sensors closer to the BS run out of energy at a much faster rate than sensors in other parts of the network. This gives rise to an unequal distribution of residual energy in the network and makes those sensors with lower remaining energy die at a much faster rate than others. We propose a scheme for enhancing network lifetime using mobile cluster heads (CHs) in a WSN. To keep the remaining energy more evenly distributed, some energy-rich nodes are designated as CHs, which move in a controlled manner towards sensors rich in energy and data. This eliminates the multihop transmission required by the static sensors and thus increases the overall lifetime of the WSN. We combine the ideas of clustering and mobile CHs to first form clusters of static sensor nodes. A collaborative strategy among the CHs further increases the lifetime of the network. The time taken for transmitting data to the BS is reduced further by making the CHs follow a connectivity strategy that always maintains a connected path to the BS

  10. Applying a New Event Detection Algorithm to an Ocean Bottom Seismometer Dataset Recorded Offshore Southern California

    NASA Astrophysics Data System (ADS)

    Bishop, J.; Kohler, M. D.; Bunn, J.; Chandy, K. M.

    2015-12-01

    A number of active southern California offshore faults are capable of M>6 earthquakes, and the only permanent Southern California Seismic Network stations that can contribute to ongoing, small-magnitude earthquake detection and location are those located on the coastline and islands. To obtain a more detailed picture of the seismicity of the region, an array of 34 ocean bottom seismometers (OBSs) was deployed to record continuous waveform data off the coast of Southern California for 12 months (2010-2011) as part of the ALBACORE (Asthenospheric and Lithospheric Broadband Architecture from the California Offshore Region Experiment) project. To obtain a local event catalog based on OBS data, we make use of a newly developed data processing platform based on Python. The data processing procedure comprises a multi-step analysis that starts with the identification of significant signals above the time-adjusted noise floor for each sensor. This is followed by a time-dependent statistical estimate of the likelihood of an earthquake based on the aggregated signals in the array. For periods with elevated event likelihood, an adaptive grid-fitting procedure is used that yields candidate earthquake hypocenters with confidence estimates that best match the observed sensor signals. The results are validated with synthetic travel times and manual picks. Using results from ALBACORE, we have created a more complete view of active faulting in the California Borderland.
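
    A minimal sketch of the first stage described above, flagging samples that exceed a time-adjusted noise floor estimated with a rolling median absolute deviation, is given below; the window length and threshold are hypothetical, and the likelihood-aggregation and grid-fitting stages of the platform are not shown.

        import numpy as np

        def significant_signal(x, win=500, n_mad=6.0):
            """Flag samples whose amplitude exceeds a time-adjusted noise floor,
            estimated from a rolling median absolute deviation (MAD).

            x: one station's continuous waveform; win: noise-estimation window length."""
            amp = np.abs(x)
            # crude rolling MAD via strided windows (fine for a sketch, not for production)
            pad = np.pad(amp, (win // 2, win - win // 2 - 1), mode="edge")
            windows = np.lib.stride_tricks.sliding_window_view(pad, win)
            med = np.median(windows, axis=1)
            mad = np.median(np.abs(windows - med[:, None]), axis=1)
            return amp > med + n_mad * 1.4826 * mad          # boolean detection mask per sample

        # toy usage: an impulsive arrival buried in noise
        x = np.random.randn(5000)
        x[3000:3050] += 8.0
        mask = significant_signal(x)
        print(mask[3000:3050].any(), mask[:2000].any())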

  11. KIWI: A technology for public health event monitoring and early warning signal detection

    PubMed Central

    Mukhi, Shamir N

    2016-01-01

    Objectives: To introduce the Canadian Network for Public Health Intelligence's new Knowledge Integration using Web-based Intelligence (KIWI) technology, and to perform a preliminary evaluation of the KIWI technology using a case study. The purpose of this new technology is to support surveillance activities by monitoring unstructured data sources for the early detection and awareness of potential public health threats. Methods: A prototype of the KIWI technology, adapted for zoonotic and emerging diseases, was piloted by end-users with expertise in the field of public health and zoonotic/emerging disease surveillance. The technology was assessed using variables such as geographic coverage, user participation, and others, categorized by high-level attributes from evaluation guidelines for internet-based surveillance systems. Special attention was given to the evaluation of the system's automated sense-making algorithm, which used variables such as sensitivity, specificity, and predictive values. Event-based surveillance evaluation was not applied to its full capacity, as such an evaluation is beyond the scope of this paper. Results: KIWI was piloted with user participation = 85.0% and geographic coverage within monitored sources = 83.9% of countries. The pilots, which focused on zoonotic and emerging diseases, lasted a combined total of 65 days and resulted in the collection of 3243 individual information pieces (IIP) and 2 community reported events (CRE) for processing. Ten sources were monitored during the second phase of the pilot, which resulted in 545 anticipatory intelligence signals (AIS). KIWI's automated sense-making algorithm (SMA) had sensitivity = 63.9% (95% CI: 60.2-67.5%), specificity = 88.6% (95% CI: 87.3-89.8%), positive predictive value = 59.8% (95% CI: 56.1-63.4%), and negative predictive value = 90.3% (95% CI: 89.0-91.4%). Discussion: Literature suggests the need for internet based monitoring and surveillance systems that are customizable
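
    For reference, the reported screening statistics follow directly from a standard confusion table; the sketch below shows the computation with illustrative counts that are not the pilot's actual figures.

        def screening_metrics(tp, fp, tn, fn):
            """Sensitivity, specificity, PPV and NPV from signal-verification counts."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            ppv = tp / (tp + fp)
            npv = tn / (tn + fn)
            return sensitivity, specificity, ppv, npv

        # illustrative counts only (not the pilot's actual confusion table)
        print(screening_metrics(tp=90, fp=60, tn=800, fn=50))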

  12. Using GPS to Rapidly Detect and Model Earthquakes and Transient Deformation Events

    NASA Astrophysics Data System (ADS)

    Crowell, Brendan W.

    The rapid modeling and detection of earthquakes and transient deformation is a problem of extreme societal importance for earthquake early warning and rapid hazard response. To date, GPS data is not used in earthquake early warning or rapid source modeling even in Japan or California where the most extensive geophysical networks exist. This dissertation focuses on creating algorithms for automated modeling of earthquakes and transient slip events using GPS data in the western United States and Japan. First, I focus on the creation and use of high-rate GPS and combined seismogeodetic data for applications in earthquake early warning and rapid slip inversions. Leveraging data from earthquakes in Japan and southern California, I demonstrate that an accurate magnitude estimate can be made within seconds using P wave displacement scaling, and that a heterogeneous static slip model can be generated within 2-3 minutes. The preliminary source characterization is sufficiently robust to independently confirm the extent of fault slip used for rapid assessment of strong ground motions and improved tsunami warning in subduction zone environments. Secondly, I investigate the automated detection of transient slow slip events in Cascadia using daily positional estimates from GPS. Proper geodetic characterization of transient deformation is necessary for studies of regional interseismic, coseismic and postseismic tectonics, and miscalculations can affect our understanding of the regional stress field. I utilize the relative strength index (RSI) from financial forecasting to create a complete record of slow slip from continuous GPS stations in the Cascadia subduction zone between 1996 and 2012. I create a complete history of slow slip across the Cascadia subduction zone, fully characterizing the timing, progression, and magnitude of events. Finally, using a combination of continuous and campaign GPS measurements, I characterize the amount of extension, shear and subsidence in the
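
    A minimal sketch of the relative strength index (RSI) applied to a daily position series, in the spirit of the slow-slip detector described above, is given below; the smoothing period and the synthetic data are assumptions, not the dissertation's settings.

        import numpy as np

        def relative_strength_index(x, period=14):
            """Classic Wilder-style RSI applied to a position time series: values near
            100 indicate sustained gains, values near 0 sustained losses."""
            dx = np.diff(x)
            gains = np.where(dx > 0, dx, 0.0)
            losses = np.where(dx < 0, -dx, 0.0)
            rsi = np.full(len(x), np.nan)
            avg_gain, avg_loss = gains[:period].mean(), losses[:period].mean()
            for i in range(period, len(dx)):
                avg_gain = (avg_gain * (period - 1) + gains[i]) / period   # Wilder smoothing
                avg_loss = (avg_loss * (period - 1) + losses[i]) / period
                rs = avg_gain / avg_loss if avg_loss > 0 else np.inf
                rsi[i + 1] = 100 - 100 / (1 + rs)
            return rsi

        # toy usage: a slow-slip-like reversal in a noisy daily position series
        days = np.arange(300)
        pos = 0.01 * days + 0.5 * np.random.randn(300)
        pos[150:180] -= 0.15 * np.arange(30)        # transient motion opposite to the secular trend
        rsi = relative_strength_index(pos)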

  13. PREFACE: Fourth Symposium on Large TPCs for Low Energy Rare Event Detection

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Colas, Paul; Giomataris, Ioannis

    2009-07-01

    The Fourth International Symposium on large TPCs for low-energy rare-event detection was held in the Hermite auditorium of the Institut Henri Poincaré, 11 rue Pierre et Marie Curie, Paris, on 18-19 December 2008. As in the previous editions of the meeting, held in Paris in 2006, 2004 and 2002, it gathered a significant community of physicists involved in rare event searches and/or the development of time projection chambers (TPCs). The purpose of the meeting was to present and discuss the status of current experiments or projects involving the use of large TPCs for the search for rare events, such as low-energy neutrinos, double beta decay, dark matter or axions, as well as to discuss new results and ideas in the framework of the latest developments in Micro Pattern Gaseous Detectors (MPGD), and how these are being - or could be - applied to the mentioned searches. The rapid evolution of these devices and the relevance of their latest results need to be efficiently transferred to the rare event community. The creation of this series of meetings followed the motivation of bringing both kinds of know-how together, and it is proving to be a fruitful area of collaboration. Once more, the format of the meeting proved to be a success. A short (2-day) and relatively informal programme with some recent highlighted results, rather than exhaustive reviews, attracted the interest of the audience. The symposium, the fourth of the series, is becoming consolidated as a regular meeting place for the synergic interplay between the fields of rare events and TPC development. Apart from the usual topics central to the conference subject, such as the status of low-energy neutrino physics and double beta decay experiments, dark matter experiments (and, in general, physics in underground laboratories), axion searches, or development results, every year the conference programme is enriched with original, slightly off-topic contributions that trigger the curiosity and stimulate further thought

  14. Implementation and performance results of neural network for power quality event detection

    NASA Astrophysics Data System (ADS)

    Huang, Weijian; Tian, Wenzhi

    2008-10-01

    A novel method to detect power quality events in a distributed power system, combining a wavelet network with an improved back-propagation algorithm, is presented. The paper explains how to design complex compactly supported orthogonal wavelets from compactly supported orthogonal real wavelets, explores the extraction of the disturbance signal to obtain its feature information, and proposes several ways of combining the complex wavelet information to analyze the disturbance signal, with results superior to real wavelet analysis. The features obtained from the wavelet transform coefficients are input into the wavelet network for power quality disturbance pattern recognition. The power quality disturbance recognition model is established, and the improved back-propagation algorithm is used to perform the network parameter initialization. By choosing enough samples to train the recognition model, the type of disturbance can be identified when a signal representing a fault is input to the trained network. Simulation results show that the complex wavelet transform combined with a wavelet network is more sensitive to signal singularities and offers a significant improvement over current methods in real-time detection.

  15. Fully Autonomous Multiplet Event Detection: Application to Local-Distance Monitoring of Blood Falls Seismicity

    SciTech Connect

    Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.

    2015-06-18

    We apply a fully autonomous icequake detection methodology to a single day of high-sample rate (200 Hz) seismic network data recorded from the terminus of Taylor Glacier, ANT that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and noise-adaptive correlation detector. This detector-chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls that is based on icequake detections.
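
    A minimal sketch of the clustering stage, complete-linkage clustering of triggered waveforms on correlation-based distances, is given below; the correlation threshold is a placeholder, and the noise-adaptive power and correlation detectors of the actual detector chain are not reproduced.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def cluster_waveforms(waveforms, cc_threshold=0.7):
            """Group triggered waveforms into multiplets by complete-linkage clustering
            on (1 - max normalized cross-correlation) distances.

            waveforms: (n_events, n_samples) array of detrended, equal-length traces."""
            w = waveforms - waveforms.mean(axis=1, keepdims=True)
            w /= np.linalg.norm(w, axis=1, keepdims=True)
            n = len(w)
            dist = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    cc = np.correlate(w[i], w[j], mode="full").max()   # allow lag between events
                    dist[i, j] = dist[j, i] = 1.0 - cc
            z = linkage(squareform(dist, checks=False), method="complete")
            return fcluster(z, t=1.0 - cc_threshold, criterion="distance")

        # toy usage: two repeated sources plus one outlier
        rng = np.random.default_rng(1)
        base1, base2 = rng.standard_normal(200), rng.standard_normal(200)
        events = np.array([base1 + 0.1 * rng.standard_normal(200) for _ in range(3)]
                          + [base2 + 0.1 * rng.standard_normal(200) for _ in range(3)]
                          + [rng.standard_normal(200)])
        print(cluster_waveforms(events))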

  16. Noise Reduction for Detecting Event-Related Potential by Processing in Dipole Space

    NASA Astrophysics Data System (ADS)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Fumito; Ishikawa, Bunnoshin; Saito, Yoichi

    2007-06-01

    Averaged responses are generally used to detect event-related potentials (ERPs) by suppressing the background electroencephalography (EEG) wave; however, the ERP component of a single-trial response, or the average of a small number of responses, is needed to assess time variation in a subject's state in detail. We therefore propose a new method of reducing the noise component, including the background wave, in a single-trial response. In this study, our target is a component such as N100 that can be approximated by one dipole. The method works by modifying the dipole position in the brain and detecting the projected components with reference to the dipole estimated from an averaged response. Simulation results indicate that the proposed method can improve the signal-to-noise ratio by 7.6 dB and decrease the error in N100 peak latency by 6.7 ms by suppressing the influence of the background wave. In the EEG experiment, eight healthy subjects participated, and their results show that the sway of the waveforms caused by the background wave is suppressed and that the peak of the N100 component becomes more prominent compared with that of the original single-trial response.

  17. Giant magenetoresistive sensors. 2. Detection of biorecognition events at self-referencing and magnetically tagged arrays.

    PubMed

    Millen, Rachel L; Nordling, John; Bullen, Heather A; Porter, Marc D; Tondra, Mark; Granger, Michael C

    2008-11-01

    Microfabricated devices formed from alternating layers of magnetic and nonmagnetic materials at combined thicknesses of a few hundred nanometers exhibit a phenomenon known as the giant magnetoresistance effect. Devices based on this effect are known as giant magnetoresistive (GMR) sensors. The resistance of a GMR is dependent on the strength of an external magnetic field, which has resulted in the widespread usage of such platforms in high-speed, high-data density storage drives. The same attributes (i.e., sensitivity, small size, and speed) are also important embodiments of many types of bioanalytical sensors, pointing to an intriguing opportunity via an integration of GMR technology, magnetic labeling strategies, and biorecognition elements (e.g., antibodies). This paper describes the utilization of GMRs for the detection of streptavidin-coated magnetic particles that are selectively captured by biotinylated gold addresses on a 2 x 0.3 cm sample stick. A GMR sensor network reads the addresses on a sample stick in a manner that begins to emulate that of a "card-swipe" system. This study also takes advantage of on-sample magnetic addresses that function as references for internal calibration of the GMR response and as a facile means to account for small variations in the gap between the sample stick and sensor. The magnetic particle surface coverage at the limit of detection was determined to be approximately 2%, which corresponds to approximately 800 binding events over the 200 x 200 microm capture address. These findings, along with the potential use of streptavidin-coated magnetic particles as a universal label for antigen detection in, for example, heterogeneous assays, are discussed.

  18. The Power to Detect Recent Fragmentation Events Using Genetic Differentiation Methods

    PubMed Central

    Lloyd, Michael W.; Campbell, Lesley; Neel, Maile C.

    2013-01-01

    Habitat loss and fragmentation are imminent threats to biological diversity worldwide and thus are fundamental issues in conservation biology. Increased isolation alone has been implicated as a driver of negative impacts in populations associated with fragmented landscapes. Genetic monitoring and the use of measures of genetic divergence have been proposed as means to detect changes in landscape connectivity. Our goal was to evaluate the sensitivity of Wright's Fst, Hedrick's G'st, Sherwin's MI, and Jost's D to recent fragmentation events across a range of population sizes and sampling regimes. We constructed an individual-based model, which used a factorial design to compare effects of varying population size, presence or absence of overlapping generations, and presence or absence of population sub-structuring. Increases in population size, overlapping generations, and population sub-structuring each reduced Fst, G'st, MI, and D. The signal of fragmentation was detected within two generations for all metrics. However, the magnitude of the change in each was small in all cases, and when Ne was >100 individuals it was extremely small. Multi-generational sampling and population estimates are required to differentiate the signal of background divergence from changes in Fst, G'st, MI, and D associated with fragmentation. Finally, the window during which rapid change in Fst, G'st, MI, and D between generations occurs can be small, and if missed, would lead to inconclusive results. For these reasons, use of Fst, G'st, MI, or D for detecting and monitoring changes in connectivity is likely to prove difficult in real-world scenarios. We advocate use of genetic monitoring only in conjunction with estimates of actual movement among patches such that one could compare current movement with the genetic signature of past movement to determine whether there has been a change. PMID:23704965

  19. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    DTIC Science & Technology

    2004-01-01

    [Abstract not available; the record contains only fragments of figure captions: the ROC curve of an "EWMA vector" classifier (window size 10, λ = 0.04) compared with the CHAID algorithm, and plots of the observed event intensity k(i) for the training and testing data using the event intensity method.]
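
    The surviving fragments above reference an "EWMA vector" classifier with λ = 0.04. As a rough illustration of that class of detector (not the report's algorithm), the sketch below flags points whose exponentially weighted moving average drifts outside a control limit; the series, λ, limit L, and baseline length are invented for the example.

```python
# Generic EWMA control-chart detector (illustrative; not the report's classifier).
import numpy as np

def ewma_alarms(x, lam=0.04, L=4.0, baseline=300):
    """Estimate the in-control mean/std from the first `baseline` samples, run an
    EWMA control chart over the whole series, and return the smoothed series plus
    the indices where it leaves the +/- L-sigma control limits."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x[:baseline].mean(), x[:baseline].std(ddof=1)
    z = np.empty_like(x)
    z_prev = mu
    alarms = []
    for i, xi in enumerate(x):
        z_prev = lam * xi + (1.0 - lam) * z_prev
        z[i] = z_prev
        # variance of the EWMA statistic after i+1 observations
        var_z = sigma**2 * (lam / (2.0 - lam)) * (1.0 - (1.0 - lam)**(2 * (i + 1)))
        if abs(z_prev - mu) > L * np.sqrt(var_z):
            alarms.append(i)
    return z, alarms

rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0, 1, 500), rng.normal(1.5, 1, 100)])  # shift at t=500
_, hits = ewma_alarms(series, lam=0.04)
print("first alarm at index:", hits[0] if hits else None)
```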

  20. FOREWORD: 3rd Symposium on Large TPCs for Low Energy Event Detection

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Colas, Paul; Gorodetzky, Phillippe

    2007-05-01

    The Third International Symposium on large TPCs for low-energy rare-event detection was held at Carré des sciences, Poincaré auditorium, 25 rue de la Montagne Ste Geneviève in Paris on 11-12 December 2006. This prestigious location, belonging to the Ministry of Research, is housed in the former Ecole Polytechnique. The meeting, held in Paris every two years, gathers a significant community of physicists involved in rare event detection. Its purpose is an extensive discussion of present and future projects using large TPCs for low energy, low background detection of rare events (low-energy neutrinos, dark matter, solar axions). The use of a new generation of Micro-Pattern Gaseous Detectors (MPGD) appears to be a promising way to reach this goal. The program this year was enriched by a new session devoted to the detection challenge of polarized gamma rays, relevant novel experimental techniques and the impact on particle physics, astrophysics and astronomy. A very particular feature of this conference is the large variety of talks ranging from purely theoretical to purely experimental subjects including novel technological aspects. This allows discussion and exchange of useful information and new ideas that are emerging to address particle physics experimental challenges. The scientific highlights at the Symposium came on many fronts: • Status of low-energy neutrino physics and double-beta decay • New ideas on double-beta decay experiments • Gamma-ray polarization measurement combining high-precision TPCs with MPGD read-out • Dark matter challenges in both axion and WIMP searches, with new emerging ideas for detection improvements • Progress in gaseous and liquid TPCs for rare event detection. Georges Charpak opened the meeting with a talk on gaseous detectors for applications in the bio-medical field. He also underlined the importance of new MPGD detectors for both physics and applications. There were about 100 registered participants at the symposium. The successful

  1. Wavelet based automated postural event detection and activity classification with single imu - biomed 2013.

    PubMed

    Lockhart, Thurmon E; Soangra, Rahul; Zhang, Jian; Wu, Xuefan

    2013-01-01

    Mobility characteristics associated with activities of daily living (ADLs) such as sitting down, lying down, rising up, and walking are considered to be important in maintaining functional independence and a healthy lifestyle, especially for the growing elderly population. Characteristics of postural transitions such as sit-to-stand are widely used by clinicians as a physical indicator of health, and walking is used as an important mobility assessment tool. Many tools have been developed to assist in the assessment of functional levels and to detect a person's activities during daily life. These include questionnaires, observation, diaries, kinetic and kinematic systems, and validated functional tests. These measures are costly and time consuming, rely on subjective patient recall and may not accurately reflect functional ability in the patient's home. In order to provide a low-cost, objective assessment of functional ability, an inertial measurement unit (IMU) using MEMS technology has been employed to ascertain ADLs. These measures facilitate long-term monitoring of ADLs using wearable sensors. IMU systems are desirable for monitoring human postures since they respond to both the frequency and intensity of movements and measure both dc (gravitational acceleration vector) and ac (acceleration due to body movement) components at a low cost. This has enabled the development of a small, lightweight, portable system that can be worn by a free-living subject without motion impediment – TEMPO (Technology Enabled Medical Precision Observation). Using this IMU system, we acquired indirect measures of biomechanical variables that can be used as an assessment of individual mobility characteristics with accuracy and recognition rates that are comparable to those of modern motion capture systems. In this study, five subjects performed various ADLs, and mobility measures such as posture transitions and gait characteristics were obtained. We developed postural event detection
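
    As a rough illustration of wavelet-based postural event detection from a single IMU channel (not the TEMPO pipeline), the sketch below keeps only the low-frequency approximation of a vertical trunk-acceleration signal and flags candidate sit-to-stand transitions where that trend changes quickly; the sampling rate, wavelet, and threshold are assumed values.

```python
# Minimal sketch (assumed signal, wavelet, and threshold; not the TEMPO pipeline):
# isolate the low-frequency (postural) trend of a vertical trunk-acceleration
# signal with a discrete wavelet decomposition and flag fast changes in the trend.
import numpy as np
import pywt

def postural_transitions(acc_vertical, fs=100.0, wavelet='db4', level=5, thresh=0.15):
    coeffs = pywt.wavedec(acc_vertical, wavelet, level=level)
    # keep only the approximation coefficients -> slowly varying postural trend
    trend_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(trend_coeffs, wavelet)[: len(acc_vertical)]
    d_trend = np.abs(np.diff(trend)) * fs            # rate of change of the trend (g/s)
    idx = np.where(d_trend > thresh)[0]
    return trend, idx / fs                           # candidate transition times (s)

# Example: a simulated sit-to-stand as a smoothed step in vertical acceleration
t = np.arange(0, 20, 1 / 100.0)
acc = 1.0 + 0.3 / (1 + np.exp(-(t - 10) * 4)) \
      + 0.05 * np.random.default_rng(2).standard_normal(t.size)
_, times = postural_transitions(acc)
print("candidate transition near:", np.round(times[:3], 2), "s")
```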

  2. Detection of air pollution events over Évora-Portugal during 2009

    NASA Astrophysics Data System (ADS)

    Filipa Domingues, Ana; Bortoli, Daniele; Silva, Ana Maria; Kulkarni, Pavan; Antón, Manuel

    2010-05-01

    All over the world, polluting industries, traffic, and other natural and anthropogenic sources are responsible for air pollution, affecting health and also the climate. At present, the monitoring of air quality in urban and rural regions has become an urgent concern in atmospheric studies due to the impact of global air pollution on climate and on the environment. One piece of evidence of the global character of air pollution is that it not only affects industrialized countries but also reaches less developed countries with polluting gases and particles generated elsewhere. The development and employment of instruments and techniques to measure the variation of atmospheric trace gases and to monitor them are crucial for the improvement of air quality and the control of pollutant emissions. One of the instruments able to perform such air quality monitoring is the Spectrometer for Atmospheric TRacers Measurements (SPATRAM), installed at the CGE Observatory in Évora (38.5° N, 7.9° W, 300 m asl). This UV-VIS spectrometer is used to carry out measurements of the zenith scattered radiation (290-900 nm) to retrieve the vertical content of some atmospheric trace gases, such as O3 and NO2 in the stratosphere, using the Differential Optical Absorption Spectroscopy (DOAS) methodology. Although SPATRAM, in its current geometric and operational configuration (zenith-sky viewing and passive-mode measurements), is not able to detect small variations of tracers in the troposphere, it is possible to identify enhancements in the pollution loads due to air-mass movements from polluted sites. In spite of the fact that Évora is a fairly unpolluted city, deep analysis of the DOAS output, namely the quantity of gas (in this case NO2) present along the optical path of the measurements (SCD - Slant Column Density), allows for the detection of unpredicted variations in the diurnal NO2 cycle. The SPATRAM data allow the identification of polluting events which

  3. An automated cross-correlation based event detection technique and its application to surface passive data set

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

    In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
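
    A simplified sketch of the idea follows (not the authors' implementation): compute a running STA/LTA energy ratio per trace, then measure how strongly the ratios agree across receivers, so that weak arrivals common to the whole array stand out. The window lengths and the synthetic array below are illustrative.

```python
# Simplified sketch of the approach (not the authors' implementation): STA/LTA
# energy ratios per trace, then their agreement across receivers through time.
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    e = trace.astype(float) ** 2
    sta = np.convolve(e, np.ones(int(sta_win * fs)) / int(sta_win * fs), mode='same')
    lta = np.convolve(e, np.ones(int(lta_win * fs)) / int(lta_win * fs), mode='same')
    return sta / np.maximum(lta, 1e-12)

def coherent_detection(traces, fs, window=2.0):
    """traces: (n_receivers, n_samples). Returns, per time window, the mean pairwise
    correlation of the STA/LTA ratios -- a crude measure of array-wide coherence."""
    ratios = np.array([sta_lta(tr, fs) for tr in traces])
    ratios -= ratios.mean(axis=1, keepdims=True)
    n = int(window * fs)
    score = np.zeros(ratios.shape[1] // n)
    for k in range(score.size):
        c = np.corrcoef(ratios[:, k * n:(k + 1) * n])
        score[k] = c[np.triu_indices_from(c, k=1)].mean()
    return score

rng = np.random.default_rng(3)
data = rng.standard_normal((6, 60_000))              # 6 receivers, 5 min at 200 Hz
data[:, 30_000:30_200] += 5.0                         # an event common to all traces
print("event window index:", np.argmax(coherent_detection(data, fs=200)))
```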

  4. Detection of Rapid Events at Mantle Depth by Future Gravity Missions

    NASA Astrophysics Data System (ADS)

    Ivins, Erik; Watkins, Michael

    2015-04-01

    The robust detection of gravity changes associated with relatively shallow subduction zone earthquakes (0-50 km depth co-seismic rupture) has been one of the success stories of the GRACE (e.g., Han et al. 2013, JGR-B, doi:10.1002/jgrb.50116) and GOCE (e.g., Fuchs et al., 2013, JGR-B, doi: 10.1002/jgrb.50381) missions. This surprise is a testament to the sensitivity of the measurement system, for the satellites must map the gravity potential field changes while flying at orbital altitudes exceeding 400 km (in the case of GRACE). It is clear that these observations contribute to advancing our understanding of large subduction zone earthquakes, if for no other reason than they allow comprehensive observation over the ocean-covered solid Earth. The observations aid studies of the mass transport associated with both coseismic and post-seismic deformation. Measurement capabilities for missions proposed to be flown after GRACE-2 are anticipated to improve by an order of magnitude, or more, in accuracy and resolution (e.g., Wiese et al., 2012, J. Geodesy, doi: 10.1007/s00190-011-0493-8; Elsaka et al. 2014, J. Geodesy, doi: 10.1007/s00190-013-0665-9). Deep subduction zone earthquakes have not been detected, nor have any other non-seismic solid Earth deformations - with the exception of the glacial isostatic adjustment vertical response to the last glacial age. We examine the possibility that earthquakes occurring at, or near, the major transition zone in the mantle should be detected in the region where mantle phases become unstable and undergo transition to a stable perovskite phase below 660 km depth. The Mw 8.2 1994 Bolivian Earthquake and the May 24, 2013 Mw 8.3 earthquake beneath the Sea of Okhotsk, Russia, are prototypes of events that can be studied with future gravity missions. Observation of gravity changes associated with deep subduction zone earthquakes could provide new clues on the enigmatic questions currently in debate over faulting mechanism (e.g., Zhan et al., 2014, Science

  5. Computer aided detection of transient inflation events at Alaskan volcanoes using GPS measurements from 2005-2015

    NASA Astrophysics Data System (ADS)

    Li, Justin D.; Rude, Cody M.; Blair, David M.; Gowanlock, Michael G.; Herring, Thomas A.; Pankratius, Victor

    2016-11-01

    Analysis of transient deformation events in time series data observed via networks of continuous Global Positioning System (GPS) ground stations provides insight into the magmatic and tectonic processes that drive volcanic activity. Typical analyses of spatial positions originating from each station require careful tuning of algorithmic parameters and selection of time and spatial regions of interest to observe possible transient events. This iterative, manual process is tedious when attempting to make new discoveries and does not easily scale with the number of stations. Addressing this challenge, we introduce a novel approach based on a computer-aided discovery system that facilitates the discovery of such potential transient events. The advantages of this approach are demonstrated by actual detections of transient deformation events at volcanoes selected from the Alaska Volcano Observatory database using data recorded by GPS stations from the Plate Boundary Observatory network. Our technique successfully reproduces the analysis of a transient signal detected in the first half of 2008 at Akutan volcano and is also directly applicable to 3 additional volcanoes in Alaska, with the new detection of 2 previously unnoticed inflation events: in early 2011 at Westdahl and in early 2013 at Shishaldin. This study also discusses the benefits of our computer-aided discovery approach for volcanology in general. Advantages include rapid analysis at multi-scale resolutions of transient deformation events at a large number of sites of interest and the capability to enhance reusability and reproducibility in volcano studies.

  6. One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms.

    PubMed

    Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus

    2016-05-18

    Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings of two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations. The evaluation used both event duration parameters and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of what algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple events, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transactions on Biomedical Engineering, 60(9):2484-2493, 2013) outperforms all algorithms in data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.

  7. An engineered nano-plasmonic biosensing surface for colorimetric and SERS detection of DNA-hybridization events

    NASA Astrophysics Data System (ADS)

    Heydari, Esmaeil; Thompson, David; Graham, Duncan; Cooper, Jonathan M.; Clark, Alasdair W.

    2015-03-01

    We report a versatile nanophotonic biosensing platform that enables both colorimetric detection and enhanced Raman spectroscopy detection of molecular binding events. Through the integration of electron-beam lithography, dip-pen nanolithography and molecular self-assembly, we demonstrate plasmonic nanostructures which change geometry and plasmonic properties in response to molecularly-mediated nanoparticle binding events. These biologically-active nanostructured surfaces hold considerable potential for use as multiplexed sensor platforms for point-of-care diagnostics, and as scaffolds for a new generation of molecularly dynamic metamaterials.

  8. Real-Time Microbiology Laboratory Surveillance System to Detect Abnormal Events and Emerging Infections, Marseille, France.

    PubMed

    Abat, Cédric; Chaudet, Hervé; Colson, Philippe; Rolain, Jean-Marc; Raoult, Didier

    2015-08-01

    Infectious diseases are a major threat to humanity, and accurate surveillance is essential. We describe how to implement a laboratory data-based surveillance system in a clinical microbiology laboratory. Two historical Microsoft Excel databases were implemented. The data were then sorted and used to execute the following 2 surveillance systems in Excel: the Bacterial real-time Laboratory-based Surveillance System (BALYSES) for monitoring the number of patients infected with bacterial species isolated at least once in our laboratory during the study period, and the Marseille Antibiotic Resistance Surveillance System (MARSS), which surveys the primary β-lactam resistance phenotypes for 15 selected bacterial species. The first historical database contained 174,853 identifications of bacteria, and the second contained 12,062 results of antibiotic susceptibility testing. From May 21, 2013, through June 4, 2014, BALYSES and MARSS enabled the detection of 52 abnormal events for 24 bacterial species, leading to 19 official reports. This system is currently being refined and improved.

  9. Block-adaptive filtering and its application to seismic-event detection

    SciTech Connect

    Clark, G.A.

    1981-04-01

    Block digital filtering involves the calculation of a block, or finite set, of filter output samples from a block of input samples. The motivation for block processing arises from the computational advantages of the technique. Block filters take good advantage of parallel processing architectures, which are becoming more and more attractive with the advent of very large scale integrated (VLSI) circuits. This thesis extends the block technique to Wiener and adaptive filters, both of which are statistical filters. The key ingredient of this extension turns out to be the definition of a new performance index, block mean square error (BMSE), which combines the well-known sum square error (SSE) and mean square error (MSE). A block adaptive filtering procedure is derived in which the filter coefficients are adjusted once per output block in accordance with a generalized block least mean-square (BLMS) algorithm. Convergence properties of the BLMS algorithm are studied, including conditions for guaranteed convergence, convergence speed, and convergence accuracy. Simulation examples are given for clarity. Convergence properties of the BLMS and LMS algorithms are analyzed and compared. They are shown to be analogous, and under the proper circumstances, equivalent. The block adaptive filter was applied to the problem of detecting small seismic events in microseismic background noise. The predictor outperformed the world-wide standardized seismograph network (WWSSN) seismometers in improving signal-to-noise ratio (SNR).
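
    A minimal block LMS (BLMS) sketch is shown below; it is illustrative rather than the thesis implementation, updating the coefficients once per block with the gradient accumulated over that block and recovering an assumed unknown FIR system.

```python
# Minimal block LMS (BLMS) sketch (illustrative, not the thesis implementation):
# the filter coefficients are updated once per block using the gradient
# accumulated over that block.
import numpy as np

def blms(x, d, num_taps=16, block_len=32, mu=0.01):
    """x: input signal, d: desired signal. Returns the filter output and weights."""
    w = np.zeros(num_taps)
    y = np.zeros(len(d))
    for start in range(0, len(x) - block_len, block_len):
        grad = np.zeros(num_taps)
        for n in range(start, start + block_len):
            if n < num_taps - 1:
                continue
            u = x[n - num_taps + 1:n + 1][::-1]       # most recent samples first
            y[n] = w @ u
            e = d[n] - y[n]
            grad += e * u
        w += (mu / block_len) * grad                   # one update per block
    return y, w

# Example: identify an unknown 4-tap FIR system from noisy observations
rng = np.random.default_rng(4)
x = rng.standard_normal(20_000)
h = np.array([0.6, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
_, w_hat = blms(x, d, num_taps=4, block_len=64, mu=0.05)
print(np.round(w_hat, 2))                              # close to h
```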

  10. PREFACE: 7th International Symposium on Large TPCs for Low-Energy Rare Event Detection

    NASA Astrophysics Data System (ADS)

    Colas, P.; Giomataris, I.; Irastorza, I.; Patzak, Th

    2015-11-01

    The seventh "International Symposium on Large TPCs for Low-Energy Rare Event Detection" took place in Paris between the 15th and 17th of December 2014 at the Institute of Astroparticle Physics (APC) campus - Paris Diderot University. As usual, the conference was organized during the week before Christmas, which seems to be convenient for most people; it occurs every two years, and this edition attracted almost 120 participants. Many people contributed to the success of the conference, but the organizers would particularly like to thank the management of APC for providing the nice Buffon auditorium and infrastructure. We also acknowledge the valuable support of DSM-Irfu and the University of Zaragoza. The scientific program consisted of plenary sessions including the following topics with theoretical and experimental lectures: • Low energy neutrino physics • Neutrinoless double beta decay process • Dark matter searches • Axion and especially solar axion searches • Space experiments and gamma-ray polarimetry • New detector R&D and future experiments

  11. PREFACE: Sixth Symposium on Large TPCs for Low Energy Rare Event Detection

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Colas, Paul; Giomataris, Ioannis

    2013-10-01

    For the sixth time, the International Symposium on Large TPCs for Low-Energy Rare-Event Detection was organized in Paris, on 17-19 December 2012. As for the previous conference, we were welcomed in the Astroparticle and Cosmology Laboratory (APC). Around one hundred physicists from all over the world gathered to discuss progress in the dark matter and low-energy neutrino search. The new results from the LHC were also widely discussed. The Higgs discovery at 125 GeV, without any sign of other new heavy particles, does not provide us with any information on the nature of dark matter. Alternatives to the favored SUSY model, in which the role of the WIMP is played by a stable neutralino, predict low mass candidates below a few GeV. Developing low threshold detectors at sub-keV energies becomes mandatory, and interest in axion or axion-like particles as dark matter is revived. We have seen increasing activity in the field, and new infrastructures for these searches have been developed. We heard news of activities in the Canfranc laboratory in Spain, Jinping in China, SURF in the USA and about the extension project of the Fréjus (LSM) laboratory. We would like to thank the organizing and advisory committees as well as the session chairpersons: J Zinn-Justin, G Wormser, D Nygren, G Chardin, F Vannucci, D Attié, T Patzak and S Jullian. I Giomataris, P Colas and I G Irastorza

  12. Fractal analysis of GPS time series for early detection of disastrous seismic events

    NASA Astrophysics Data System (ADS)

    Filatov, Denis M.; Lyubushin, Alexey A.

    2017-03-01

    A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start from analysing both methods from the physical point of view and demonstrate the difference between them which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known due to numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate efficiency of the developed method in identification of critical states of the Earth's crust. Finally we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events and provide a detailed discussion of the numerical results, which are shown to be consistent with outcomes of other researches on the topic.
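
    For reference, a baseline (conventional) DFA sketch is given below; the paper proposes a modified variant, so this only illustrates how the scaling exponent is estimated from an integrated displacement series with assumed box sizes.

```python
# Baseline (conventional) DFA sketch; the paper proposes a modified variant, so
# this only illustrates how a scaling exponent is estimated from a series.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256), order=1):
    y = np.cumsum(x - np.mean(x))                     # integrated (profile) series
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) vs log s is the scaling exponent alpha
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(5)
print(round(dfa_exponent(rng.standard_normal(10_000)), 2))   # white noise: alpha ~ 0.5
```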

  13. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    PubMed Central

    Van Strien, Jan W.; Isbell, Lynne A.

    2017-01-01

    Studies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to partially exposed snake models and scale patterns on the snake skin. Here, we examined whether snake skin patterns and partially exposed snakes elicit a larger EPN in humans. In Task 1, we employed pictures with close-ups of snake skins, lizard skins, and bird plumage. In task 2, we employed pictures of partially exposed snakes, lizards, and birds. Participants watched a random rapid serial visual presentation of these pictures. The EPN was scored as the mean activity (225–300 ms after picture onset) at occipital and parieto-occipital electrodes. Consistent with previous studies, and with the Snake Detection Theory, the EPN was significantly larger for snake skin pictures than for lizard skin and bird plumage pictures, and for lizard skin pictures than for bird plumage pictures. Likewise, the EPN was larger for partially exposed snakes than for partially exposed lizards and birds. The results suggest that the EPN snake effect is partly driven by snake skin scale patterns which are otherwise rare in nature. PMID:28387376

  14. The Functional Mobility Scale: ability to detect change following single event multilevel surgery.

    PubMed

    Harvey, Adrienne; Graham, H Kerr; Morris, Meg E; Baker, Richard; Wolfe, Rory

    2007-08-01

    The aim of this study was to examine the ability of the Functional Mobility Scale (FMS) to detect change in children with cerebral palsy (CP) undergoing single event multilevel surgery (SEMLS). A retrospective study was conducted of gait laboratory records and video assessments for a consecutive sample of children with CP aged 4 to 18 years who were managed by multilevel surgery. FMS ratings and Gross Motor Function Classification System (GMFCS) levels were recorded preoperatively and at regular postoperative time points. The sample comprised 66 children (32 females, 34 males) with spastic diplegia, GMFCS Levels I (n=18), II (n=24), and III (n=24). The mean age at surgery was 10 years (SD 2y 6mo, range 6-16y). For each FMS distance (5, 50, and 500m) odds ratios showed significant deterioration in mobility at 3 and 6 months postoperatively. Mobility then improved to baseline levels by 12 months and improved further by 24 months postoperatively. GMFCS level remained stable throughout most of the postoperative period for children classified as GMFCS Level III preoperatively but not for children classified as Levels I or II. The FMS was found to be a clinically feasible tool for quantifying change after SEMLS in children with CP.

  15. Elastomeric optical fiber sensors and method for detecting and measuring events occurring in elastic materials

    DOEpatents

    Muhs, Jeffrey D.; Capps, Gary J.; Smith, David B.; White, Clifford P.

    1994-01-01

    Fiber optic sensing means for the detection and measurement of events such as dynamic loadings imposed upon elastic materials including cementitious materials, elastomers, and animal body components and/or the attrition of such elastic materials are provided. One or more optical fibers, each having a deformable core and cladding formed of an elastomeric material such as silicone rubber, are embedded in the elastic material. Changes in light transmission through any of the optical fibers due to the deformation of the optical fiber by the application of dynamic loads such as compression, tension, or bending loadings imposed on the elastic material or by the attrition of the elastic material such as by cracking, deterioration, aggregate break-up, and muscle, tendon, or organ atrophy provide a measurement of the dynamic loadings and attrition. The fiber optic sensors can be embedded in elastomers subject to dynamic loadings and attrition, such as those commonly used in automobiles and in shoes, for determining the amount and frequency of the dynamic loadings and the extent of attrition. The fiber optic sensors are also usable in cementitious material for determining the maturation thereof.

  16. Real-Time Microbiology Laboratory Surveillance System to Detect Abnormal Events and Emerging Infections, Marseille, France

    PubMed Central

    Abat, Cédric; Chaudet, Hervé; Colson, Philippe; Rolain, Jean-Marc

    2015-01-01

    Infectious diseases are a major threat to humanity, and accurate surveillance is essential. We describe how to implement a laboratory data–based surveillance system in a clinical microbiology laboratory. Two historical Microsoft Excel databases were implemented. The data were then sorted and used to execute the following 2 surveillance systems in Excel: the Bacterial real-time Laboratory-based Surveillance System (BALYSES) for monitoring the number of patients infected with bacterial species isolated at least once in our laboratory during the study period, and the Marseille Antibiotic Resistance Surveillance System (MARSS), which surveys the primary β-lactam resistance phenotypes for 15 selected bacterial species. The first historical database contained 174,853 identifications of bacteria, and the second contained 12,062 results of antibiotic susceptibility testing. From May 21, 2013, through June 4, 2014, BALYSES and MARSS enabled the detection of 52 abnormal events for 24 bacterial species, leading to 19 official reports. This system is currently being refined and improved. PMID:26196165

  17. First Satellite-detected Perturbations of Outgoing Longwave Radiation Associated with Blowing Snow Events over Antarctica

    NASA Technical Reports Server (NTRS)

    Yang, Yuekui; Palm, Stephen P.; Marshak, Alexander; Wu, Dong L.; Yu, Hongbin; Fu, Qiang

    2014-01-01

    We present the first satellite-detected perturbations of the outgoing longwave radiation (OLR) associated with blowing snow events over the Antarctic ice sheet using data from Cloud-Aerosol Lidar with Orthogonal Polarization and Clouds and the Earth's Radiant Energy System. Significant cloud-free OLR differences are observed between the clear and blowing snow sky, with the sign and magnitude depending on season and time of the day. During nighttime, OLRs are usually larger when blowing snow is present; the average difference in OLR between cases without and with blowing snow over the East Antarctic Ice Sheet is about 5.2 W/m2 for the winter months of 2009. During daytime, in contrast, the OLR perturbation is usually smaller or even has the opposite sign. The observed seasonal variations and day-night differences in the OLR perturbation are consistent with theoretical calculations of the influence of blowing snow on OLR. Detailed atmospheric profiles are needed to quantify the radiative effect of blowing snow from the satellite observations.

  18. Analysis of different device-based intrathoracic impedance vectors for detection of heart failure events (from the Detect Fluid Early from Intrathoracic Impedance Monitoring study).

    PubMed

    Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B

    2014-10-15

    Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart

  19. Gait event detection using linear accelerometers or angular velocity transducers in able-bodied and spinal-cord injured individuals.

    PubMed

    Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin

    2006-12-01

    We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers in comparison to using standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity, or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.

  20. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
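
    A generic event-driven simulation loop can make the paradigm concrete. The sketch below is not EDSE itself; it only shows event consumption from a time-ordered queue, event creation by callbacks, and a bounded run, with invented callbacks.

```python
# A generic event-driven simulation loop (illustrative of the paradigm, not EDSE):
# events are consumed from a time-ordered queue and may create further events.
import heapq

class EventDrivenSimulator:
    def __init__(self):
        self.queue = []          # heap of (time, seq, callback, args)
        self.now = 0.0
        self._seq = 0            # tie-breaker for simultaneous events

    def schedule(self, delay, callback, *args):
        heapq.heappush(self.queue, (self.now + delay, self._seq, callback, args))
        self._seq += 1

    def run(self, until=float('inf')):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, callback, args = heapq.heappop(self.queue)
            callback(self, *args)            # a callback may schedule new events

# Example: a sensor model emitting a reading every 2 s, plus a one-off anomaly
def reading(sim, k):
    print(f"t={sim.now:.1f}s  reading #{k}")
    sim.schedule(2.0, reading, k + 1)

sim = EventDrivenSimulator()
sim.schedule(0.0, reading, 0)
sim.schedule(5.0, lambda s: print(f"t={s.now:.1f}s  anomaly event"))
sim.run(until=8.0)
```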

  1. Effects of rainfall events on the occurrence and detection efficiency of viruses in river water impacted by combined sewer overflows.

    PubMed

    Hata, Akihiko; Katayama, Hiroyuki; Kojima, Keisuke; Sano, Shoichi; Kasuga, Ikuro; Kitajima, Masaaki; Furumai, Hiroaki

    2014-01-15

    Rainfall events can introduce large amounts of microbial contaminants, including human enteric viruses, into surface water by intermittent discharges from combined sewer overflows (CSOs). The present study aimed to investigate the effect of rainfall events on viral loads in surface waters impacted by CSO and the reliability of molecular methods for detection of enteric viruses. The reliability of virus detection in the samples was assessed by using process controls for the virus concentration, nucleic acid extraction and reverse transcription (RT)-quantitative PCR (qPCR) steps, which allowed accurate estimation of virus detection efficiencies. Recovery efficiencies of poliovirus in river water samples collected during rainfall events (<10%) were lower than those during dry weather conditions (>10%). The log10-transformed virus concentration efficiency was negatively correlated with suspended solids concentration (r(2)=0.86), which increased significantly during rainfall events. Efficiencies of DNA extraction and qPCR steps, determined with adenovirus type 5 and a primer sharing control, respectively, were lower in dry weather. However, no clear relationship was observed between organic water quality parameters and efficiencies of these two steps. Observed concentrations of indigenous enteric adenoviruses, GII-noroviruses, enteroviruses, and Aichi viruses increased during rainfall events even though the virus concentration efficiency was presumed to be lower than in dry weather. The present study highlights the importance of using appropriate process controls to evaluate accurately the concentration of waterborne enteric viruses in natural waters impacted by wastewater discharge, stormwater, and CSOs.

  2. UKIRT Microlensing Surveys as a Pathfinder for WFIRST: The Detection of Five Highly Extinguished Low-∣b∣ Events

    NASA Astrophysics Data System (ADS)

    Shvartzvald, Y.; Bryden, G.; Gould, A.; Henderson, C. B.; Howell, S. B.; Beichman, C.

    2017-02-01

    Optical microlensing surveys are restricted from detecting events near the Galactic plane and center, where the event rate is thought to be the highest due to the high optical extinction of these fields. In the near-infrared (NIR), however, the lower extinction leads to a corresponding increase in event detections and is a primary driver for the wavelength coverage of the WFIRST microlensing survey. During the 2015 and 2016 bulge observing seasons, we conducted NIR microlensing surveys with UKIRT in conjunction with and in support of the Spitzer and Kepler microlensing campaigns. Here, we report on five highly extinguished (A_H = 0.81-1.97), low-Galactic latitude (-0.98 ≤ b ≤ -0.36) microlensing events discovered from our 2016 survey. Four of them were monitored with an hourly cadence by optical surveys but were not reported as discoveries, likely due to the high extinction. Our UKIRT surveys and suggested future NIR surveys enable the first measurement of the microlensing event rate in the NIR. This wavelength regime overlaps with the bandpass of the filter in which the WFIRST microlensing survey will conduct its highest-cadence observations, making this event rate derivation critically important for optimizing its yield.

  3. Signal classification and event reconstruction for acoustic neutrino detection in sea water with KM3NeT

    NASA Astrophysics Data System (ADS)

    Kießling, Dominik

    2017-03-01

    The research infrastructure KM3NeT will comprise a multi cubic kilometer neutrino telescope that is currently being constructed in the Mediterranean Sea. Modules with optical and acoustic sensors are used in the detector. While the main purpose of the acoustic sensors is the position calibration of the detection units, they can be used as instruments for studies on acoustic neutrino detection, too. In this article, methods for signal classification and event reconstruction for acoustic neutrino detectors will be presented, which were developed using Monte Carlo simulations. For the signal classification the disk-like emission pattern of the acoustic neutrino signal is used. This approach improves the suppression of transient background by several orders of magnitude. Additionally, an event reconstruction is developed based on the signal classification. An overview of these algorithms will be presented and the efficiency of the classification will be discussed. The quality of the event reconstruction will also be presented.

  4. A robust real-time gait event detection using wireless gyroscope and its application on normal and altered gaits.

    PubMed

    Gouwanda, Darwin; Gopalai, Alpha Agape

    2015-02-01

    Gait event detection allows clinicians and biomechanics researchers to determine the timing of gait events, to estimate the duration of the stance phase and swing phase, and to segment gait data. It also aids biomedical engineers in improving the design of orthoses and FES (functional electrical stimulation) systems. In recent years, researchers have resorted to using gyroscopes to determine heel-strike (HS) and toe-off (TO) events in gait cycles. However, these methods are subject to significant delays when implemented in real-time gait monitoring devices, orthoses, and FES systems. Therefore, the work presented in this paper proposes a method that addresses these delays, to ensure real-time gait event detection. The proposed algorithm combines the use of heuristics and a zero-crossing method to identify HS and TO. Experiments involving (1) normal walking, (2) walking with a knee brace, and (3) walking with an ankle brace, for overground walking and treadmill walking, were designed to verify and validate the identified HS and TO. The performance of the proposed method was compared against established gait detection algorithms. It was observed that the proposed method produced a detection rate comparable to that of earlier reported methods and recorded reduced time delays, at an average of 100 ms.
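
    A bare-bones zero-crossing sketch in the spirit of the approach (without the paper's delay-reducing heuristics) is given below; it assumes a shank sagittal angular-velocity signal that is positive during swing, and the threshold and synthetic signal are illustrative.

```python
# Bare-bones zero-crossing detector on shank sagittal angular velocity (simplified;
# the paper adds heuristics to reduce detection delay). Sign convention assumed:
# angular velocity is positive during swing.
import numpy as np

def detect_hs_to(omega, fs, swing_thresh=1.0):
    """Toe-off (TO): last negative-to-positive zero crossing before each swing peak.
    Heel-strike (HS): first positive-to-negative zero crossing after it."""
    hs, to = [], []
    above = omega > swing_thresh
    starts = np.where(~above[:-1] & above[1:])[0]     # entries into the swing phase
    for s in starts:
        pre = np.where(np.diff(np.sign(omega[:s + 1])) > 0)[0]
        if pre.size:
            to.append(pre[-1] / fs)
        post = np.where(np.diff(np.sign(omega[s:])) < 0)[0]
        if post.size:
            hs.append((s + post[0]) / fs)
    return np.array(hs), np.array(to)

# Example: synthetic angular velocity with two swing phases (1 Hz half-rectified sine)
fs = 100
t = np.arange(0, 2, 1 / fs)
omega = 3.0 * np.sin(2 * np.pi * t) * (np.sin(2 * np.pi * t) > 0)
hs_times, to_times = detect_hs_to(omega, fs)
print("TO (s):", to_times, "HS (s):", hs_times)
```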

  5. A secure distributed logistic regression protocol for the detection of rare adverse drug events

    PubMed Central

    El Emam, Khaled; Samet, Saeed; Arbuckle, Luk; Tamblyn, Robyn; Earle, Craig; Kantarcioglu, Murat

    2013-01-01

    Background There is limited capacity to assess the comparative risks of medications after they enter the market. For rare adverse events, the pooling of data from multiple sources is necessary to have the power and sufficient population heterogeneity to detect differences in safety and effectiveness in genetic, ethnic and clinically defined subpopulations. However, combining datasets from different data custodians or jurisdictions to perform an analysis on the pooled data creates significant privacy concerns that would need to be addressed. Existing protocols for addressing these concerns can result in reduced analysis accuracy and can allow sensitive information to leak. Objective To develop a secure distributed multi-party computation protocol for logistic regression that provides strong privacy guarantees. Methods We developed a secure distributed logistic regression protocol using a single analysis center with multiple sites providing data. A theoretical security analysis demonstrates that the protocol is robust to plausible collusion attacks and does not allow the parties to gain new information from the data that are exchanged among them. The computational performance and accuracy of the protocol were evaluated on simulated datasets. Results The computational performance scales linearly as the dataset sizes increase. The addition of sites results in an exponential growth in computation time. However, for up to five sites, the time is still short and would not affect practical applications. The model parameters are the same as the results on pooled raw data analyzed in SAS, demonstrating high model accuracy. Conclusion The proposed protocol and prototype system would allow the development of logistic regression models in a secure manner without requiring the sharing of personal health information. This can alleviate one of the key barriers to the establishment of large-scale post-marketing surveillance programs. We extended the secure protocol to account for

  6. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
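
    A toy version of the model-based estimation step is sketched below under an assumed one-dimensional, two-reflection wall model (it is not the patented processing chain): wall position and thickness are found by minimizing the mean-square error between a predicted reflection signal and noisy data.

```python
# Toy model-based fit under an assumed 1-D, two-reflection wall model (not the
# patented processing chain): estimate wall position and thickness by minimizing
# the mean-square error between a predicted reflection signal and the data.
import numpy as np
from scipy.optimize import minimize

def predicted_signal(t, wall_pos, thickness, c=0.15):
    """Gaussian echoes from the front and back faces of a wall; c in m/ns."""
    pulse = lambda t0: np.exp(-((t - t0) / 2.0) ** 2)
    return pulse(2 * wall_pos / c) + 0.6 * pulse(2 * (wall_pos + thickness) / c)

t = np.linspace(0, 60, 600)                            # time gate, ns
data = predicted_signal(t, wall_pos=2.0, thickness=0.3) \
       + 0.05 * np.random.default_rng(6).standard_normal(t.size)

mse = lambda p: np.mean((predicted_signal(t, *p) - data) ** 2)
fit = minimize(mse, x0=np.array([1.8, 0.25]), method='Nelder-Mead')
print("estimated wall position (m), thickness (m):", np.round(fit.x, 2))
```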

  7. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    SciTech Connect

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; Brogan, Ronald

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
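
    A schematic version of the grid-search step is sketched below. It is heavily simplified relative to an operational system (single refracted-P phase, straight-ray travel times, Gaussian per-station fitness), and the grid, velocity, and fitness threshold are invented for the example.

```python
# Schematic grid-search step (heavily simplified: single refracted P phase,
# straight-ray travel times, Gaussian per-station fitness; values are invented).
import numpy as np

def best_node(grid_xy, station_xy, arrival_times, v_p=6.0, sigma=1.0, min_fitness=2.0):
    """grid_xy: (n_nodes, 2) km; station_xy: (n_sta, 2) km; arrival_times: (n_sta,) s.
    Sum per-station conditional fitness values at each node; the unknown origin
    time is removed by demeaning the travel-time residuals per node."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    resid = arrival_times[None, :] - d / v_p
    resid -= resid.mean(axis=1, keepdims=True)
    fitness = np.exp(-0.5 * (resid / sigma) ** 2).sum(axis=1)
    i = int(np.argmax(fitness))
    return (grid_xy[i], fitness[i]) if fitness[i] >= min_fitness else (None, fitness[i])

# Example: a 50 x 50 km grid, 4 corner stations, event at (20, 30) km
grid = np.array([[x, y] for x in range(0, 51, 2) for y in range(0, 51, 2)], dtype=float)
sta = np.array([[0, 0], [50, 0], [0, 50], [50, 50]], dtype=float)
true_d = np.linalg.norm(sta - np.array([20.0, 30.0]), axis=1)
loc, fit = best_node(grid, sta, arrival_times=10.0 + true_d / 6.0)
print("best node:", loc, "fitness:", round(fit, 2))
```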

  8. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    DOE PAGES

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; ...

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.

  9. Accuracy and precision of equine gait event detection during walking with limb and trunk mounted inertial sensors.

    PubMed

    Olsen, Emil; Andersen, Pia Haubro; Pfau, Thilo

    2012-01-01

    The increased variations of temporal gait events when pathology is present are good candidate features for objective diagnostic tests. We hypothesised that the gait events hoof-on/off and stance can be detected accurately and precisely using features from trunk and distal limb-mounted Inertial Measurement Units (IMUs). Four IMUs were mounted on the distal limb and five IMUs were attached to the skin over the dorsal spinous processes at the withers, fourth lumbar vertebrae and sacrum as well as left and right tuber coxae. IMU data were synchronised to a force plate array and a motion capture system. Accuracy (bias) and precision (SD of bias) were calculated to compare force plate and IMU timings for gait events. Data were collected from seven horses. One hundred and twenty three (123) front limb steps were analysed; hoof-on was detected with a bias (SD) of -7 (23) ms, hoof-off with 0.7 (37) ms and front limb stance with -0.02 (37) ms. A total of 119 hind limb steps were analysed; hoof-on was found with a bias (SD) of -4 (25) ms, hoof-off with 6 (21) ms and hind limb stance with 0.2 (28) ms. IMUs mounted on the distal limbs and sacrum can detect gait events accurately and precisely.

  10. The truth will out: interrogative polygraphy ("lie detection") with event-related brain potentials.

    PubMed

    Farwell, L A; Donchin, E

    1991-09-01

    The feasibility of using Event Related Brain Potentials (ERPs) in Interrogative Polygraphy ("Lie Detection") was tested by examining the effectiveness of the Guilty Knowledge Test designed by Farwell and Donchin (1986, 1988). The subject is assigned an arbitrary task requiring discrimination between experimenter-designated targets and other, irrelevant stimuli. A group of diagnostic items ("probes"), which to the unwitting are indistinguishable from the irrelevant items, are embedded among the irrelevant. For subjects who possess "guilty knowledge" these probes are distinct from the irrelevants and are likely to elicit a P300, thus revealing their possessing the special knowledge that allows them to differentiate the probes from the irrelevants. We report two experiments in which this paradigm was tested. In Experiment 1, 20 subjects participated in one of two mock espionage scenarios and were tested for their knowledge of both scenarios. All stimuli consisted of short phrases presented for 300 ms each at an interstimulus interval of 1550 ms. A set of items were designated as "targets" and appeared on 17% of the trials. Probes related to the scenarios also appeared on 17% of the trials. The rest of the items were irrelevants. Subjects responded by pressing one switch following targets, and the other following irrelevants (and, of course, probes). ERPs were recorded from FZ, CZ, and PZ. As predicted, targets elicited large P300s in all subjects. Probes associated with a given scenario elicited a P300 in subjects who participated in that scenario. A bootstrapping method was used to assess the quality of the decision for each subject. The algorithm declared the decision indeterminate in 12.5% of the cases. In all other cases a decision was made. There were no false positives and no false negatives: whenever a determination was made it was accurate. The second experiment was virtually identical to the first, with identical results, except that this time 4 subjects were
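
    The per-subject decision step can be illustrated with a generic bootstrap sketch (this is not the exact Farwell-Donchin bootstrapped procedure): resample probe and irrelevant trials, compare their mean P300-window amplitudes, and declare a decision only when the resampled proportion is decisively high or low. The amplitudes and cutoffs below are invented.

```python
# Generic bootstrap sketch (not the exact procedure used in the study): estimate
# how often resampled probe trials show a larger mean P300-window amplitude than
# resampled irrelevant trials, and withhold a decision in the middle range.
import numpy as np

def bootstrap_decision(probe_amps, irrel_amps, n_boot=2000, lo=0.1, hi=0.9, rng=None):
    rng = rng or np.random.default_rng()
    wins = 0
    for _ in range(n_boot):
        p = rng.choice(probe_amps, size=len(probe_amps), replace=True).mean()
        q = rng.choice(irrel_amps, size=len(irrel_amps), replace=True).mean()
        wins += p > q
    frac = wins / n_boot
    if frac >= hi:
        return "information present", frac
    if frac <= lo:
        return "information absent", frac
    return "indeterminate", frac

rng = np.random.default_rng(7)
probes = rng.normal(6.0, 4.0, 30)     # mean P300-window amplitude per probe trial (uV)
irrels = rng.normal(2.0, 4.0, 90)     # same for irrelevant trials (invented values)
print(bootstrap_decision(probes, irrels, rng=rng))
```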

  11. Wavelet packet transform for detection of single events in acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bianchi, Davide; Mayrhofer, Erwin; Gröschl, Martin; Betz, Gerhard; Vernes, András

    2015-12-01

    Acoustic emission signals in tribology can be used for monitoring the state of bodies in contact and relative motion. The recorded signal includes information which can be associated with different events, such as the formation and propagation of cracks, the appearance of scratches, and so on. One of the major challenges in analyzing these acoustic emission signals is to identify parts of the signal which belong to such an event and discern them from noise. In this contribution, a wavelet packet decomposition within the framework of multiresolution analysis theory is considered to analyze acoustic emission signals to investigate the failure of tribological systems. By applying the wavelet packet transform, a method for the extraction of single events in rail contact fatigue tests is proposed. The extraction of such events at several stages of the test permits a classification and the analysis of the evolution of cracks in the rail.
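
    A short sketch of wavelet-packet feature extraction for one acoustic-emission frame follows; the band index, threshold factor, and synthetic burst are assumptions, not values from the rail tests.

```python
# Sketch of wavelet-packet feature extraction for one acoustic-emission frame
# (band index, threshold factor, and the synthetic burst are illustrative).
import numpy as np
import pywt

def wp_band_energies(frame, wavelet='db4', level=4):
    """Energy in each terminal wavelet-packet node, in frequency order."""
    wp = pywt.WaveletPacket(data=frame, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(np.asarray(n.data) ** 2)
                     for n in wp.get_level(level, order='freq')])

def is_event(frame, noise_energies, band=5, factor=5.0):
    """Flag a frame if one band's energy far exceeds its typical noise level."""
    return wp_band_energies(frame)[band] > factor * noise_energies[band]

rng = np.random.default_rng(8)
noise = 0.1 * rng.standard_normal(1024)
burst = noise.copy()
burst[400:460] += 2.0 * np.sin(2 * np.pi * 0.17 * np.arange(60)) * np.exp(-np.arange(60) / 20)
print(is_event(burst, wp_band_energies(noise)))        # True: burst energy falls in band 5
```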

  12. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.

  13. Hard X-ray Detectability of Small Impulsive Heating Events in the Solar Corona

    NASA Astrophysics Data System (ADS)

    Glesener, L.; Klimchuk, J. A.; Bradshaw, S. J.; Marsh, A.; Krucker, S.; Christe, S.

    2015-12-01

    Impulsive heating events ("nanoflares") are a candidate to supply the solar corona with its ~2 MK temperature. These transient events can be studied using extreme ultraviolet and soft X-ray observations, among others. However, the impulsive events may occur in tenuous loops on small enough timescales that the heating is essentially not observed due to ionization timescales, and only the cooling phase is observed. Bremsstrahlung hard X-rays could serve as a more direct and prompt indicator of transient heating events. A hard X-ray spacecraft based on the direct-focusing technology pioneered by the Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket could search for these direct signatures. In this work, we use the hydrodynamical EBTEL code to simulate differential emission measures produced by individual heating events and by ensembles of such events. We then directly predict hard X-ray spectra and consider their observability by a future spaceborne FOXSI, and also by the RHESSI and NuSTAR spacecraft.

  14. Utility of Clinical Breast Exams in Detecting Local-Regional Breast Events after Breast-Conservation in Women with a Personal History of High-risk Breast Cancer

    PubMed Central

    Neuman, Heather B.; Schumacher, Jessica R.; Francescatti, Amanda B.; Adesoye, Taiwo; Edge, SB; Burnside, ES; Vanness, DJ; Yu, M; Si, Y; McKellar, D; Winchester, DP; Greenberg, Caprice C.

    2016-01-01

    Introduction Although breast cancer follow-up guidelines emphasize the importance of clinical exams, prior studies suggest a small fraction of local-regional events occurring after breast conservation are detected by exam alone. Our objective was to examine how local-regional events are detected in a contemporary, national cohort of high-risk breast cancer survivors. Methods A stage-stratified sample of stage II/III breast cancer patients diagnosed in 2006-2007 (n=11,099) was identified from 1,217 facilities within the National Cancer Data Base. Additional data on local-regional and distant breast events, method of event detection, imaging received, and mortality were collected. We further limited the cohort to patients with breast conservation (n=4,854). Summary statistics describe local-regional event rates and detection method. Results Local-regional events were detected in 5.5% (n=265). 83% were ipsilateral or contralateral in-breast events, and 17% within ipsilateral lymph nodes. 48% of local-regional events were detected on asymptomatic breast imaging, 29% by patients, and 10% on clinical exam. Overall, 0.5% of the 4,854 patients had a local-regional event detected on exam. Exams detected a higher proportion of lymph node events (8/45) than of in-breast events (18/220). No factors were associated with method of event detection. Discussion Clinical exams, as an adjunct to screening mammography, have a modest effect on local-regional event detection. This contradicts current belief that exams are a critical adjunct to mammographic screening. These findings can help to streamline follow-up care, potentially improving follow-up efficiency and quality. PMID:27491784

  15. Discriminating famous from fictional names based on lifetime experience: evidence in support of a signal-detection model based on finite mixture distributions.

    PubMed

    Bowles, Ben; Harlow, Iain M; Meeking, Melissa M; Köhler, Stefan

    2012-01-01

    It is widely accepted that signal-detection mechanisms contribute to item-recognition memory decisions that involve discriminations between targets and lures based on a controlled laboratory study episode. Here, the authors employed mathematical modeling of receiver operating characteristics (ROC) to determine whether and how a signal-detection mechanism contributes to discriminations between moderately famous and fictional names based on lifetime experience. Unique to fame judgments is a lack of control over participants' previous exposure to the stimuli deemed "targets" by the experimenter; specifically, if they pertain to moderately famous individuals, participants may have had no prior exposure to a substantial proportion of the famous names presented. The authors adopted established models from the recognition-memory literature to examine the quantitative fit that could be obtained through the inclusion of signal-detection and threshold mechanisms for two data sets. They first established that a signal-detection process operating on graded evidence is critical to account for the fame judgment data they collected. They then determined whether the graded memory evidence for famous names would best be described with one distribution with greater variance than that for the fictional names, or with two finite mixture distributions for famous names that correspond to items with or without prior exposure, respectively. Analyses revealed that a model that included a d' parameter, as well as a mixture parameter, provided the best compromise between number of parameters and quantitative fit. Additional comparisons between this equal-variance signal-detection mixture model and a dual-process model, which included a high-threshold process in addition to a signal-detection process, also favored the former model. In support of the conjecture that the mixture parameter captures participants' prior experience, the authors found that it was increased when the analysis was
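
    A minimal sketch of the equal-variance signal-detection mixture model described above is given below: "famous" items are modeled as a mixture of a prior-exposure distribution N(d', 1), with mixture weight lam, and the same N(0, 1) distribution that generates evidence for fictional names. The observed ROC points and the crude grid-search fit are invented for illustration and do not reproduce the authors' data or their model-comparison machinery.

      import numpy as np
      from scipy.stats import norm

      def predicted_hit_rate(fa_rate, d_prime, lam):
          """Equal-variance mixture model: hit = lam*P(exposed > c) + (1-lam)*fa."""
          criterion = norm.isf(np.asarray(fa_rate))       # criterion implied by the FA rate
          return lam * norm.sf(criterion - d_prime) + (1.0 - lam) * np.asarray(fa_rate)

      def fit_by_grid(obs_fa, obs_hit):
          """Crude grid search over (d', lam) minimizing squared error to the ROC points."""
          best = (np.inf, None, None)
          for d_prime in np.linspace(0.2, 3.0, 57):
              for lam in np.linspace(0.05, 1.0, 96):
                  err = np.sum((predicted_hit_rate(obs_fa, d_prime, lam) - np.asarray(obs_hit)) ** 2)
                  if err < best[0]:
                      best = (err, d_prime, lam)
          return best

      # Hypothetical ROC points (false-alarm rate, hit rate) from confidence ratings
      obs_fa = [0.05, 0.12, 0.25, 0.45, 0.70]
      obs_hit = [0.30, 0.45, 0.60, 0.75, 0.88]
      err, d_prime, lam = fit_by_grid(obs_fa, obs_hit)
      print(f"fitted d' = {d_prime:.2f}, mixture weight = {lam:.2f}, SSE = {err:.4f}")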

  16. Discriminating Famous from Fictional Names Based on Lifetime Experience: Evidence in Support of a Signal-Detection Model Based on Finite Mixture Distributions

    ERIC Educational Resources Information Center

    Bowles, Ben; Harlow, Iain M.; Meeking, Melissa M.; Kohler, Stefan

    2012-01-01

    It is widely accepted that signal-detection mechanisms contribute to item-recognition memory decisions that involve discriminations between targets and lures based on a controlled laboratory study episode. Here, the authors employed mathematical modeling of receiver operating characteristics (ROC) to determine whether and how a signal-detection…

  17. Decaplex and real-time PCR based detection of MON531 and MON15985 Bt cotton events.

    PubMed

    Randhawa, Gurinder Jit; Chhabra, Rashmi; Singh, Monika

    2010-09-22

    The genetically modified (GM) Bt crops expressing delta-endotoxins from Bacillus thuringiensis provide protection against a wide range of lepidopteran insect pests throughout the growing season of the plant. Bt cotton is the only GM crop commercialized in India and is planted on an area of 7.6 million hectares. With the increase in development and commercialization of transgenic crops, it is necessary to develop appropriate qualitative and quantitative methods for detection of different transgenic events. The present study reports on the development of a decaplex polymerase chain reaction (PCR) assay for simultaneous detection of transgene sequences, specific transgene constructs, and the endogenous stearoyl acyl desaturase (Sad1) gene in two events of Bt cotton, i.e., MON531 and MON15985. The decaplex PCR assay is an efficient tool to identify and discriminate the two major commercialized events of Bt cotton in India, i.e., MON531 and MON15985. Real-time PCR assays were also developed for quantification of the cry1Ac and cry2Ab genes employed in these two events. The quantitative method was developed using seven serial dilutions containing different levels of Bt cotton DNA mixed with a non-Bt counterpart, ranging from 0.01 to 100%. The results revealed that the biases from the true value and the relative standard deviations were all within the range of ±20%. The limit of quantification (LOQ) of the developed real-time PCR method was established at 0.01%.
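
    The quantitative step rests on a standard real-time PCR calibration: Ct values from a dilution series are regressed against the logarithm of GM content, and unknowns are read off the fitted line. The sketch below illustrates that relation with invented Ct values; it is not the validated assay from the paper.

      import numpy as np

      gm_percent = np.array([0.01, 0.1, 1.0, 10.0, 100.0])    # dilution series, % Bt DNA
      ct_values = np.array([36.8, 33.5, 30.1, 26.8, 23.4])    # hypothetical Ct measurements

      # Standard curve: Ct = slope * log10(GM%) + intercept
      slope, intercept = np.polyfit(np.log10(gm_percent), ct_values, 1)
      efficiency = 10.0 ** (-1.0 / slope) - 1.0               # amplification efficiency

      def quantify(ct_unknown):
          """Invert the standard curve to estimate GM content (%) from a Ct value."""
          return 10.0 ** ((ct_unknown - intercept) / slope)

      print(f"slope {slope:.2f}, efficiency {efficiency:.1%}")
      print(f"estimated GM content at Ct = 28.5: {quantify(28.5):.2f}%")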

  18. Characterization and Quantification of Nanoparticle-Antibody Conjugates on Cells Using C60 ToF SIMS in the Event-By-Event Bombardment/Detection Mode

    PubMed Central

    Chen, Li-Jung; Shah, Sunny S.; Silangcruz, Jaime; Eller, Michael J.; Verkhoturov, Stanislav V.; Revzin, Alexander; Schweikert, Emile A.

    2011-01-01

    Cluster C60 ToF-SIMS (time-of-flight secondary ion mass spectrometry) operated in the event-by-event bombardment-detection mode has been applied to: a) quantify the binding density of Au nanoparticle (AuNP)-antiCD4 conjugates on the cell surface; b) identify the binding sites between the AuNPs and the antibody. Briefly, our method consists of recording the secondary ions, SIs, individually emitted from a single C60(1,2+) impact. From the cumulative mass spectral data we selected events where a specific SI was detected. The selected records revealed the SIs co-ejected from the nanovolume impacted by an individual C60, with an emission area of ~10 nm in diameter and an emission depth of 5–10 nm. The fractional coverage is obtained as the ratio of the effective number of projectile impacts on a specified sampling area (Ne) to the total number of impacts (N0). In the negative ion mass spectrum, the palmitate (C16H31O2−) and oleate (C18H33O2−) fatty acid ions provide signals from the lipid membrane of the cells. The signals at m/z 197 (Au−) and 223 (AuCN−) originate from the AuNP-labeled antibodies (antiCD4) bound to the cell surface antigens. The characteristic amino acid ions validate the presence of antiCD4. A coincidence mass spectrum extracted with the ion at m/z 223 (AuCN−) reveals the presence of cysteine at m/z 120, documenting the closeness of cysteine and the AuNP. Their proximity suggests that the binding site for the AuNP on the antibody is the sulfur-terminal cysteine. The fractional coverage of membrane lipid was determined to be ~23% of the cell surface, while that of the AuNPs was ~21%. The method can be implemented with smaller NPs; it should thus be applicable to studies of size-dependent binding of NP-antibody conjugates. PMID:21691427

  19. PortVis: A Tool for Port-Based Detection of Security Events

    SciTech Connect

    McPherson, J; Ma, K; Krystosk, P; Bartoletti, T; Christensen, M

    2004-06-29

    Most visualizations of security-related network data require large amounts of finely detailed, high-dimensional data. However, in some cases, the data available can only be coarsely detailed because of security concerns or other limitations. How can interesting security events still be discovered in data that lacks important details, such as IP addresses, network security alarms, and labels? In this paper, we discuss a system we have designed that takes very coarsely detailed data (basic, summarized information about the activity on each TCP port during each hour) and uses visualization to help uncover interesting security events.
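
    To make the notion of "coarsely detailed data" concrete, the sketch below works from the same kind of summary PortVis uses (activity counts per TCP port per hour, with no IP addresses or alarms) and flags hours that deviate strongly from a port's own history. PortVis itself relies on interactive visualization rather than this particular scoring rule, and the counts here are synthetic.

      import numpy as np

      rng = np.random.default_rng(0)
      n_ports, n_hours = 1024, 168                    # one week of hourly port summaries
      counts = rng.poisson(lam=20, size=(n_ports, n_hours)).astype(float)
      counts[22, 100] = 400.0                         # inject a suspicious burst on one port

      mean = counts.mean(axis=1, keepdims=True)
      std = counts.std(axis=1, keepdims=True) + 1e-9
      z = (counts - mean) / std                       # per-port z-score of each hour

      for port, hour in zip(*np.where(z > 8.0)):
          print(f"port index {port}: unusual activity at hour {hour} "
                f"(count {counts[port, hour]:.0f}, z = {z[port, hour]:.1f})")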

  20. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion-vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the selected statistic(s) for each data type.
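
    The sketch below illustrates the second half of the workflow under stated assumptions: it takes a per-frame statistic that has already been extracted from the encoder's first-pass log (the log parsing is not shown) and flags frames that jump relative to a rolling baseline. The synthetic trace and the threshold rule are illustrative, not the specific statistics or criteria used by the authors.

      import numpy as np

      rng = np.random.default_rng(1)
      stat = rng.normal(1000.0, 30.0, size=500)        # synthetic per-frame statistic
      stat[240:255] += 400.0                           # burst of change, e.g. dendrite growth

      def detect_events(x, window=50, k=5.0):
          """Flag frames whose value exceeds mean + k*std of the preceding window."""
          events = []
          for i in range(window, len(x)):
              baseline = x[i - window:i]
              if x[i] > baseline.mean() + k * baseline.std():
                  events.append(i)
          return events

      frames = detect_events(stat)
      print(f"candidate event frames: {frames}")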

  1. An innovative methodological approach in the frame of Marine Strategy Framework Directive: a statistical model based on ship detection SAR data for monitoring programmes.

    PubMed

    Pieralice, Francesca; Proietti, Raffaele; La Valle, Paola; Giorgi, Giordano; Mazzolena, Marco; Taramelli, Andrea; Nicoletti, Luisa

    2014-12-01

    The Marine Strategy Framework Directive (MSFD, 2008/56/EC) is focused on protection, preservation and restoration of the marine environment by achieving and maintaining Good Environmental Status (GES) by 2020. Within this context, this paper presents a methodological approach for fast and repeatable monitoring that allows quantitative assessment of seabed abrasion pressure due to recreational boat anchoring. The methodology consists of two steps: a semi-automatic procedure based on a ship-detection algorithm for SAR imagery, and a statistical model to obtain maps of the spatial and temporal distribution density of anchored boats. Ship detection processing has been performed on 36 ASAR VV-pol images of the Liguria test site for the three years 2008, 2009 and 2010. Starting from the pointwise distribution layer produced by ship detection in the imagery, boat points were subdivided into four areas in which a constant distribution density was assumed for the entire period 2008-2010. In the future, this methodology will also be applied to higher-resolution data from the Sentinel-1 mission, which is specifically designed for the operational needs of the European Copernicus programme.

  2. Hard X-Ray Burst Detected From Caltech Plasma Jet Experiment Magnetic Reconnection Event

    NASA Astrophysics Data System (ADS)

    Marshall, Ryan S.; Bellan, Paul M.

    2016-10-01

    In the Caltech plasma jet experiment a 100 kA MHD driven jet becomes kink unstable leading to a Rayleigh-Taylor instability that quickly causes a magnetic reconnection event. Movies show that the Rayleigh-Taylor instability is simultaneous with voltage spikes across the electrodes that provide the current that drives the jet. Hard x-rays between 4 keV and 9 keV have now been observed using an x-ray scintillator detector mounted just outside of a kapton window on the vacuum chamber. Preliminary results indicate that the timing of the x-ray burst coincides with a voltage spike on the electrodes occurring in association with the Rayleigh-Taylor event. The x-ray signal accompanies the voltage spike and Rayleigh-Taylor event in approximately 50% of the shots. A possible explanation for why the x-ray signal is sometimes missing is that the magnetic reconnection event may be localized to a specific region of the plasma outside the line of sight of the scintillator. The x-ray signal has also been seen accompanying the voltage spike when no Rayleigh-Taylor is observed. This may be due to the interframe timing on the camera being longer than the very short duration of the Rayleigh-Taylor instability.

  3. Dune Detective, Using Ecological Studies to Reconstruct Events Which Shaped a Barrier Island.

    ERIC Educational Resources Information Center

    Godfrey, Paul J.; Hon, Will

    This publication is designed for use as part of a curriculum series developed by the Regional Marine Science Project. Students in grades 11 and 12 are exposed to research methods through a series of field exercises guiding investigators in reconstructing the events which have shaped the natural communities of a barrier beach. Background…

  4. Best practice for single-trial detection of event-related potentials: Application to brain-computer interfaces.

    PubMed

    Cecotti, Hubert; Ries, Anthony J

    2017-01-01

    The detection of event-related potentials (ERPs) in the electroencephalogram (EEG) signal is a fundamental component in non-invasive brain-computer interface (BCI) research, and in modern cognitive neuroscience studies. Whereas the grand average response across trials provides an estimation of essential characteristics of a brain-evoked response, an estimation of the differences between trials for a particular type of stimulus can provide key insight about the brain dynamics and possible origins of the brain response. Research on ERP single-trial detection has been mainly driven by applications in biomedical engineering, with interest from machine learning and signal processing groups that test novel methods on noisy signals. Efficient single-trial detection techniques require processing steps that include temporal filtering, spatial filtering, and classification. In this paper, we review the current state-of-the-art methods for single-trial detection of event-related potentials with applications in BCI. Efficient single-trial detection techniques should embed simple yet efficient functions requiring as few hyper-parameters as possible. The focus of this paper is on methods that do not include a large number of hyper-parameters and can be easily implemented with datasets containing a limited number of trials. A benchmark of different classification methods is proposed on a database recorded from sixteen healthy subjects during a rapid serial visual presentation task. The results support the conclusion that single-trial detection can be achieved with an area under the ROC curve above 0.9 using fewer than ten sensors and 20 trials corresponding to the presentation of a target. Whereas the number of sensors is not a key element for efficient single-trial detection, the number of trials must be carefully chosen to create a robust classifier.

  5. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-04-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. The significance threshold of Hotelling's T2 statistic is calculated and used to detect outliers. A contamination-event alarm is triggered by sequential Bayesian analysis when outliers appear in several consecutive observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
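
    A condensed sketch of the detection chain is given below: wavelet-transform each spectrum, reduce with PCA fitted on clean baseline spectra, and score new spectra with Hotelling's T2. The spectra are synthetic, the control limit is an empirical percentile rather than the statistical threshold used in practice, and the sequential Bayesian alarm stage is omitted.

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)
      n_wavelengths = 256

      def features(spectrum):
          """Concatenate coiflet detail coefficients to emphasize abrupt spectral changes."""
          coeffs = pywt.wavedec(spectrum, "coif1", level=3)
          return np.concatenate(coeffs[1:])            # drop the approximation band

      baseline = rng.normal(1.0, 0.01, size=(200, n_wavelengths))   # clean water spectra
      X = np.array([features(s) for s in baseline])
      pca = PCA(n_components=5).fit(X)

      def t_squared(spectrum):
          scores = pca.transform(features(spectrum)[None, :])[0]
          return np.sum(scores ** 2 / pca.explained_variance_)

      limit = np.percentile([t_squared(s) for s in baseline], 99)   # empirical control limit

      contaminated = baseline[0].copy()
      contaminated[120:140] += 0.5                     # strong absorption feature from a contaminant
      print(f"T2 clean: {t_squared(baseline[1]):.1f}, "
            f"contaminated: {t_squared(contaminated):.1f}, limit: {limit:.1f}")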

  6. A new PCR-CGE (size and color) method for simultaneous detection of genetically modified maize events.

    PubMed

    Nadal, Anna; Coll, Anna; La Paz, Jose-Luis; Esteve, Teresa; Pla, Maria

    2006-10-01

    We present a novel multiplex PCR assay for simultaneous detection of multiple transgenic events in maize. Initially, five PCR primer pairs specific to events Bt11, GA21, MON810, and NK603, and to the Zea mays L. alcohol dehydrogenase gene were included. The event specificity was based on amplification of transgene/plant genome flanking regions, i.e., the same targets as for validated real-time PCR assays. These short and similarly sized amplicons were selected to achieve high and similar amplification efficiency for all targets; however, their unambiguous identification was a technical challenge. We achieved a clear distinction with a novel CGE approach that combines identification by size and color (CGE-SC). In one single step, all five targets were amplified and specifically labeled with three different fluorescent dyes. The assay was specific and displayed an LOD of 0.1% of each genetically modified organism (GMO). Therefore, it was adequate to meet the legal thresholds established, e.g., in the European Union. Our CGE-SC-based strategy, in combination with an adequate labeling design, has the potential to simultaneously detect higher numbers of targets. As an example, we present the detection of up to eight targets in a single run. Multiplex PCR-CGE-SC only requires a conventional sequencer device and enables automation and high throughput. In addition, it proved to be transferable to a different laboratory. The number of authorized GMO events is growing rapidly, and the acreage of genetically modified (GM) varieties cultivated and commercialized worldwide is increasing. In this context, our multiplex PCR-CGE-SC can be suitable for screening GM content in food.

  7. Detecting specific health-related events using an integrated sensor system for vital sign monitoring.

    PubMed

    Adnane, Mourad; Jiang, Zhongwei; Choi, Samjin; Jang, Hoyoung

    2009-01-01

    In this paper, a new method for the detection of apnea/hypopnea periods in physiological data is presented. The method is based on the intelligent combination of an integrated sensor system for long-term cardiorespiratory signal monitoring and dedicated signal-processing packages. The integrated sensors are a PVDF film and conductive fabric sheets. The signal processing package includes dedicated respiratory cycle (RC) and QRS complex detection algorithms and a new method using the respiratory cycle variability (RCV) for detecting apnea/hypopnea periods in physiological data. Results show that our method is suitable for online analysis of long time series data.

  8. Event detection and control co-design of sampled-data systems

    NASA Astrophysics Data System (ADS)

    Meng, Xiangyu; Chen, Tongwen

    2014-04-01

    This paper proposes event detector and controller co-design criteria for sampled-data systems in which static or dynamic output feedback controllers are used. The resulting criteria provide sufficient conditions to ensure the asymptotic stability of the closed-loop system with reduced communication rates among sensors, controllers, and actuators. The transmissions are mediated by event detectors at both sensor and controller nodes. For static output feedback control, the sufficient condition is given in terms of the feasibility of bilinear matrix inequalities (BMIs). The BMI feasibility problem is converted into a nonlinear optimisation problem involving linear matrix inequalities (LMIs), which is solved via the complementarity linearisation algorithm. For dynamic output feedback control, the sufficient condition is formulated into an LMI feasibility problem, which is readily solved with existing tools. Numerical examples are included to show the effectiveness of the proposed methods.
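
    The event-detector idea can be illustrated with a simple relative-threshold rule: the sensor transmits a new measurement only when it has drifted sufficiently far from the last transmitted value, and the controller holds stale data between events. The scalar plant, gains, and trigger threshold below are assumptions for illustration; the paper's contribution is the joint BMI/LMI design of the detector and output-feedback controller, which is not reproduced here.

      sigma = 0.5                      # event-trigger sensitivity (assumed)
      a, b, k = 0.9, 0.5, 0.5          # scalar plant x+ = a*x + b*u with u = -k*y_held

      x, y_held = 1.0, 1.0             # initial state and initially transmitted output
      transmissions = 0
      for step in range(60):
          y = x                                        # measured output
          if abs(y - y_held) > sigma * abs(y):         # event condition checked at the sensor
              y_held = y                               # transmit and update the held value
              transmissions += 1
          u = -k * y_held                              # controller uses the last transmitted output
          x = a * x + b * u

      print(f"state after 60 steps: {x:.2e}, transmissions: {transmissions}/60")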

  9. Data-Driven Multimodal Sleep Apnea Events Detection: Synchrosqueezing Transform Processing and Riemannian Geometry Classification Approaches.

    PubMed

    Rutkowski, Tomasz M

    2016-07-01

    A novel multimodal and bio-inspired approach to biomedical signal processing and classification is presented in the paper. This approach allows for automatic semantic labeling (interpretation) of sleep apnea events based on the proposed data-driven biomedical signal processing and classification. The presented signal processing and classification methods have already been successfully applied to real-time unimodal brainwave (EEG-only) decoding in brain-computer interfaces developed by the author. In the current project, very encouraging results are obtained using multimodal biomedical (brainwave and peripheral physiological) signals in a unified processing approach that allows automatic semantic description of the data. The results thus support the hypothesis that the data-driven, bio-inspired signal processing approach is valid for semantic interpretation of medical data, based on machine-learning classification of sleep apnea events.

  10. Near Real-Time Event Detection & Prediction Using Intelligent Software Agents

    DTIC Science & Technology

    2006-03-01

    While domain-specific methodologies have garnered varying success levels, a general approach for the complex task of robust event pattern recognition has yet to be found, and this motivates the present research effort. The overall research goal is to develop, test, and validate a robust generic…

  11. Detection of Olfactory Dysfunction Using Olfactory Event Related Potentials in Young Patients with Multiple Sclerosis

    PubMed Central

    Caminiti, Fabrizia; De Salvo, Simona; De Cola, Maria Cristina; Russo, Margherita; Bramanti, Placido; Marino, Silvia; Ciurleo, Rosella

    2014-01-01

    Background Several studies have reported olfactory dysfunction in patients with multiple sclerosis. The estimate of the incidence of olfactory deficits in multiple sclerosis is uncertain; this may arise from different testing methods that may be influenced by patients' response bias and by clinical, demographic and cognitive features. Aims To objectively evaluate olfactory function using olfactory event-related potentials. Materials and Methods We tested the olfactory function of 30 patients with relapsing remitting multiple sclerosis (mean age of 36.03±6.96 years) and of 30 age-, sex- and smoking-habit-matched healthy controls by using olfactory potentials. A selective and controlled stimulation of the olfactory system to elicit the olfactory event-related potentials was achieved by a computer-controlled olfactometer linked directly to the electroencephalograph. Relationships between olfactory potential results and patients' clinical characteristics, such as gender, disability status score, disease-modifying therapy, and disease duration, were evaluated. Results Seven of the 30 patients did not show olfactory event-related potentials. Sixteen of the remaining 23 patients had a mean amplitude significantly lower than that of the control group (p<0.01). The presence/absence of olfactory event-related potentials was associated with the dichotomized expanded disability status scale (p = 0.0433) and inversely correlated with disease duration (r = −0.3641, p = 0.0479). Conclusion The unbiased assessment of olfactory dysfunction of different severity found in multiple sclerosis patients suggests an organic impairment which could be related to neuroinflammatory and/or neurodegenerative processes of olfactory networks, supporting recent findings on the neurophysiopathology of the disease. PMID:25047369

  12. Enriched Encoding: Reward Motivation Organizes Cortical Networks for Hippocampal Detection of Unexpected Events

    PubMed Central

    Murty, Vishnu P.; Adcock, R. Alison

    2014-01-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical–hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions—a potentially unique contribution of the hippocampus to reward learning. PMID:23529005

  13. Enriched encoding: reward motivation organizes cortical networks for hippocampal detection of unexpected events.

    PubMed

    Murty, Vishnu P; Adcock, R Alison

    2014-08-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical-hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions-a potentially unique contribution of the hippocampus to reward learning.

  14. Vy-PER: eliminating false positive detection of virus integration events in next generation sequencing data.

    PubMed

    Forster, Michael; Szymczak, Silke; Ellinghaus, David; Hemmrich, Georg; Rühlemann, Malte; Kraemer, Lars; Mucha, Sören; Wienbrandt, Lars; Stanulla, Martin; Franke, Andre

    2015-07-13

    Several pathogenic viruses such as hepatitis B and human immunodeficiency viruses may integrate into the host genome. These virus/host integrations are detectable using paired-end next generation sequencing. However, the low number of expected true virus integrations may be difficult to distinguish from the noise of many false positive candidates. Here, we propose a novel filtering approach that increases specificity without compromising sensitivity for virus/host chimera detection. Our detection pipeline termed Vy-PER (Virus integration detection bY Paired End Reads) outperforms existing similar tools in speed and accuracy. We analysed whole genome data from childhood acute lymphoblastic leukemia (ALL), which is characterised by genomic rearrangements and usually associated with radiation exposure. This analysis was motivated by the recently reported virus integrations at genomic rearrangement sites and association with chromosomal instability in liver cancer. However, as expected, our analysis of 20 tumour and matched germline genomes from ALL patients finds no significant evidence for integrations by known viruses. Nevertheless, our method eliminates 12,800 false positives per genome (80× coverage) and only our method detects singleton human-phiX174-chimeras caused by optical errors of the Illumina HiSeq platform. This high accuracy is useful for detecting low virus integration levels as well as non-integrated viruses.

  15. Vy-PER: eliminating false positive detection of virus integration events in next generation sequencing data

    PubMed Central

    Forster, Michael; Szymczak, Silke; Ellinghaus, David; Hemmrich, Georg; Rühlemann, Malte; Kraemer, Lars; Mucha, Sören; Wienbrandt, Lars; Stanulla, Martin; Franke, Andre

    2015-01-01

    Several pathogenic viruses such as hepatitis B and human immunodeficiency viruses may integrate into the host genome. These virus/host integrations are detectable using paired-end next generation sequencing. However, the low number of expected true virus integrations may be difficult to distinguish from the noise of many false positive candidates. Here, we propose a novel filtering approach that increases specificity without compromising sensitivity for virus/host chimera detection. Our detection pipeline termed Vy-PER (Virus integration detection bY Paired End Reads) outperforms existing similar tools in speed and accuracy. We analysed whole genome data from childhood acute lymphoblastic leukemia (ALL), which is characterised by genomic rearrangements and usually associated with radiation exposure. This analysis was motivated by the recently reported virus integrations at genomic rearrangement sites and association with chromosomal instability in liver cancer. However, as expected, our analysis of 20 tumour and matched germline genomes from ALL patients finds no significant evidence for integrations by known viruses. Nevertheless, our method eliminates 12,800 false positives per genome (80× coverage) and only our method detects singleton human-phiX174-chimeras caused by optical errors of the Illumina HiSeq platform. This high accuracy is useful for detecting low virus integration levels as well as non-integrated viruses. PMID:26166306

  16. A Systematic Analysis of the Sensitivity of Plasma Pharmacokinetics to Detect Differences in the Pulmonary Performance of Inhaled Fluticasone Propionate Products Using a Model-Based Simulation Approach.

    PubMed

    Weber, Benjamin; Hochhaus, Guenther

    2015-07-01

    The role of plasma pharmacokinetics (PK) for assessing bioequivalence at the target site, the lung, for orally inhaled drugs remains unclear. A validated semi-mechanistic model, considering the presence of mucociliary clearance in central lung regions, was expanded for quantifying the sensitivity of PK studies in detecting differences in the pulmonary performance (total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics) between test (T) and reference (R) inhaled fluticasone propionate (FP) products. PK bioequivalence trials for inhaled FP were simulated based on this PK model for a varying number of subjects and T products. The statistical power to conclude bioequivalence when T and R products are identical was demonstrated to be 90% for approximately 50 subjects. Furthermore, the simulations demonstrated that PK metrics (area under the concentration-time curve (AUC) and Cmax) are capable of detecting differences between T and R formulations of inhaled FP products when the products differ by more than 20%, 30%, and 25% for total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics, respectively. These results were derived using a rather conservative risk assessment approach with an error rate of <10%. The simulations thus indicated that PK studies might be a viable alternative to clinical studies comparing pulmonary efficacy biomarkers for slowly dissolving inhaled drugs. PK trials for pulmonary efficacy equivalence testing should be complemented by in vitro studies to avoid false positive bioequivalence assessments that are theoretically possible for some specific scenarios. Moreover, a user-friendly web application for simulating such PK equivalence trials with inhaled FP is provided.
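
    The power statement above can be explored with a generic simulation of 2x2 crossover PK bioequivalence trials; the sketch below uses a paired-analysis simplification, an assumed within-subject CV, and a true geometric mean ratio of 1.0, and is not the semi-mechanistic lung model used in the paper.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)

      def be_power(n_subjects, gmr=1.0, cv_within=0.30, n_trials=2000):
          """Fraction of simulated crossover trials whose 90% CI of the geometric
          mean ratio falls entirely within the 0.80-1.25 bioequivalence limits."""
          sigma_w = np.sqrt(np.log(1.0 + cv_within ** 2))    # within-subject SD, log scale
          t_crit = stats.t.ppf(0.95, df=n_subjects - 1)
          success = 0
          for _ in range(n_trials):
              # per-subject log(AUC_T / AUC_R); variance 2*sigma_w^2 in a crossover
              d = rng.normal(np.log(gmr), np.sqrt(2.0) * sigma_w, size=n_subjects)
              half_width = t_crit * d.std(ddof=1) / np.sqrt(n_subjects)
              lo, hi = np.exp(d.mean() - half_width), np.exp(d.mean() + half_width)
              success += (lo >= 0.80) and (hi <= 1.25)
          return success / n_trials

      for n in (24, 36, 48, 60):
          print(f"n = {n}: estimated power ~ {be_power(n):.2f}")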

  17. Event-triggered fault detection for a class of discrete-time linear systems using interval observers.

    PubMed

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-02-13

    This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the fault sensitivity are improved by introducing l1 and H∞ performances. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegative conditions for the estimation error variables are presented with the aid of the slack matrix variables. This technique allows considering a more general Lyapunov function. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method.
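
    The decision rule itself is simple to state: a fault is declared whenever zero leaves the interval spanned by the lower and upper residuals delivered by the interval observer. The sketch below illustrates only that rule on synthetic residual bounds; the observer design, the l1/H-infinity conditions, and the event-triggering mechanism are not reproduced.

      import numpy as np

      rng = np.random.default_rng(3)
      steps = 100
      disturbance_bound = 0.3                          # assumed bound on the disturbance effect

      residual = rng.uniform(-0.1, 0.1, size=steps)    # nominal (fault-free) residual
      residual[60:] += 1.0                             # additive fault appearing at step 60

      r_upper = residual + disturbance_bound           # bounds an interval observer would deliver
      r_lower = residual - disturbance_bound

      alarms = [t for t in range(steps) if not (r_lower[t] <= 0.0 <= r_upper[t])]
      print(f"first fault alarm at step {alarms[0] if alarms else None}")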

  18. Gait event detection for FES using accelerometers and supervised machine learning.

    PubMed

    Williamson, R; Andrews, B J

    2000-09-01

    Rule-based detectors were used with a single cluster of accelerometers attached to the shank for real-time detection of the main phases of normal gait during walking. The gait phase detectors were synthesized from two rule induction algorithms, Rough Sets (RS) and Adaptive Logic Networks (ALNs), and compared with a previously reported stance/swing detector based on a hand-crafted, rule-based algorithm. Data were sampled at 100 Hz and the detection errors were determined at each sample for 50 steps. For three able-bodied subjects, the sample-by-sample accuracy of stance/swing detection ranged within 94-97%, 87-94%, and 87-95% for the RS, ALN, and handcrafted methods, respectively. A heuristically formulated postdetector filter improved the RS and ALN detectors' accuracy to 98%. RS and ALN also detected five gait phases to an overall accuracy of 82-89% and 86-91%, respectively. The postdetector filter localized the errors to the phase transitions, but did not change the detection accuracy. The average duration of the error at each transition was 40 ms and 23 ms for RS and ALN, respectively. When implemented on a microcontroller, the RS-based detector executed ten times faster and required one tenth of the memory of the ALN-based detector.

  19. Plate Coupling and Transient Events Detection from Geodetic Measurements in Nicoya Peninsula, Costa Rica

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; McCaffrey, R.; Wdowinski, S.; Dixon, T.; Protti, M.; Gonzalez, V. M.; Newman, A. V.; Feng, L.

    2011-12-01

    Aseismic tremor and slow slip events (SSE) are known to perturb the stress field at the plate subduction interface. The Nicoya Peninsula in northern Costa Rica is located near the Middle America Trench (MAT), where the Cocos plate subducts underneath the Caribbean plate. The subducting Cocos plate contains two types of oceanic crust: East Pacific Rise (EPR) crust in the northern peninsula and Cocos-Nazca Spreading center (CNS) crust in the southern peninsula. The two crust types differ in subducting speed and orientation, topography, age and heat flow. This unique geological setting provides an opportunity to investigate the kinematics and dynamics of SSE and tremor. In the Nicoya Peninsula, SSE are found in high b-value regions and occur approximately every 20 months. However, the location and magnitude of SSE are still uncertain due to limited observations. Here we report additional geodetic observations and use a new GPS time-series inversion scheme to investigate simultaneously both SSE and interseismic locking patterns in the study area and their evolution with time. We solve for the steady inter-seismic velocity field and parameters characterizing SSE, including slip amount and duration. A preliminary analysis of continuous and campaign GPS data using the time dependent geodetic inversion software TDefnode [McCaffrey 2009] reveals three slow-slipping patches for events in 2003, 2005, 2007 and 2009. A previous inversion analysis of the 2007 SSE by Outerbridge et al. [2010] identified two of the three patches: a deep one at a depth of ~25 km and a shallower patch at a depth of ~7 km. The third patch identified by our inversion, at a depth of ~15 km, is similar in area and location to that reported by Protti et al. [2004] for the 2003 event.

  20. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Devadiga, Sadashiva; Tang, Yuan-Liang

    1994-01-01

    This research was initiated as a part of the Advanced Sensor and Imaging System Technology (ASSIST) program at NASA Langley Research Center. The primary goal of this research is the development of image analysis algorithms for the detection of runways and other objects using an on-board camera. Initial effort was concentrated on images acquired using a passive millimeter wave (PMMW) sensor. The images obtained using PMMW sensors under poor visibility conditions due to atmospheric fog are characterized by very low spatial resolution but good image contrast compared to those images obtained using sensors operating in the visible spectrum. Algorithms developed for analyzing these images using a model of the runway and other objects are described in Part 1 of this report. Experimental verification of these algorithms was limited to a sequence of images simulated from a single frame of PMMW image. Subsequent development and evaluation of algorithms was done using video image sequences. These images have better spatial and temporal resolution compared to PMMW images. Algorithms for reliable recognition of runways and accurate estimation of spatial position of stationary objects on the ground have been developed and evaluated using several image sequences. These algorithms are described in Part 2 of this report. A list of all publications resulting from this work is also included.

  1. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera

    NASA Astrophysics Data System (ADS)

    Kasturi, Rangachar; Devadiga, Sadashiva; Tang, Yuan-Liang

    1994-08-01

    This research was initiated as a part of the Advanced Sensor and Imaging System Technology (ASSIST) program at NASA Langley Research Center. The primary goal of this research is the development of image analysis algorithms for the detection of runways and other objects using an on-board camera. Initial effort was concentrated on images acquired using a passive millimeter wave (PMMW) sensor. The images obtained using PMMW sensors under poor visibility conditions due to atmospheric fog are characterized by very low spatial resolution but good image contrast compared to those images obtained using sensors operating in the visible spectrum. Algorithms developed for analyzing these images using a model of the runway and other objects are described in Part 1 of this report. Experimental verification of these algorithms was limited to a sequence of images simulated from a single frame of PMMW image. Subsequent development and evaluation of algorithms was done using video image sequences. These images have better spatial and temporal resolution compared to PMMW images. Algorithms for reliable recognition of runways and accurate estimation of spatial position of stationary objects on the ground have been developed and evaluated using several image sequences. These algorithms are described in Part 2 of this report. A list of all publications resulting from this work is also included.

  2. Quantal release, incremental detection, and long-period Ca2+ oscillations in a model based on regulatory Ca2+-binding sites along the permeation pathway.

    PubMed Central

    Dupont, G; Swillens, S

    1996-01-01

    Quantal release, incremental detection, and oscillations are three types of Ca2+ responses that can be obtained in different conditions, after stimulation of the intracellular Ca2+ stores by submaximum concentrations of inositol 1,4,5-trisphosphate (InsP3). All three phenomena are thought to occur through the regulatory properties of the InsP3 receptor/Ca2+ channel. In the present study, we perform further analysis of the model (Swillens et al., 1994, Proc. Natl. Acad. Sci. USA. 91:10074-10078) previously proposed for transient InsP3-induced Ca2+ release, based on the bell-shaped dependence of the InsP3 receptor activity on the Ca2+ level and on the existence of an intermediate Ca2+ domain located around the mouth of the channel. We show that Ca2+ oscillations also arise in the latter model. Conditions for the occurrence of the various behaviors are investigated. Numerical simulations also show that the existence of an intermediate Ca2+ domain can markedly increase the period of oscillations. Periods on the order of 1 min can indeed be accounted for by the model when one assigns realistic values to the kinetic constants of the InsP3 receptor, which, in the absence of a domain, lead to oscillations with periods of a few seconds. Finally, theoretical support in favor of a positive cooperativity in the regulation of the InsP3 receptor by Ca2+ is presented. PMID:8889149

  3. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    PubMed Central

    Ahn, Junho; Han, Richard

    2016-01-01

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period. PMID:27223292

  4. Probabilistic Swinging Door Algorithm as Applied to Photovoltaic Power Ramping Event Detection

    SciTech Connect

    Florita, Anthony; Zhang, Jie; Brancucci Martinez-Anido, Carlo; Hodge, Bri-Mathias; Cui, Mingjian

    2015-10-02

    Photovoltaic (PV) power generation experiences power ramping events due to cloud interference. Depending on the extent of PV aggregation and local grid features, such power variability can be constructive or destructive to measures of uncertainty regarding renewable power generation; in either case, it directly influences contingency planning, production costs, and the overall reliable operation of power systems. For enhanced power system flexibility, and to help mitigate the negative impacts of power ramping, it is desirable to analyze events in a probabilistic fashion so that degrees of belief concerning system states and forecastability are better captured and uncertainty is explicitly quantified. A probabilistic swinging door algorithm is developed and presented in this paper. It is then applied to a solar data set of PV power generation. The probabilistic swinging door algorithm builds on results from the original swinging door algorithm, first used for data compression in trend logging, and it is described by two uncertain parameters: (i) e, the threshold sensitivity to a given ramp, and (ii) s, the residual of the piecewise linear ramps. These two parameters determine the distribution of ramps and capture the uncertainty in PV power generation.
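
    For reference, the deterministic swinging door rule that the probabilistic version builds on can be sketched as follows: a segment anchored at a point is extended while a single slope keeps every subsequent point within a tolerance band, and each closed segment defines a ramp. The PV power trace and tolerance below are invented, and the probabilistic treatment of the two uncertain parameters is not reproduced.

      def swinging_door_segments(times, values, eps):
          """Deterministic swinging door: extend a segment anchored at (t_a, y_a) while a
          single slope keeps every point within +/- eps of the line y_a + m*(t - t_a)."""
          segments, a, i = [], 0, 1
          max_lower, min_upper = float("-inf"), float("inf")
          while i < len(values):
              dt = times[i] - times[a]
              max_lower = max(max_lower, (values[i] - values[a] - eps) / dt)
              min_upper = min(min_upper, (values[i] - values[a] + eps) / dt)
              if max_lower > min_upper:                # no single slope fits: close the segment
                  segments.append((a, i - 1))
                  a = i - 1                            # previous point becomes the new anchor
                  max_lower, min_upper = float("-inf"), float("inf")
              else:
                  i += 1
          segments.append((a, len(values) - 1))
          return segments

      # Hypothetical PV power trace (MW): flat, a cloud-driven down-ramp, then recovery
      power = [5.0, 5.1, 5.0, 4.9, 3.9, 2.8, 1.9, 1.0, 1.1, 1.0, 2.2, 3.4, 4.6, 5.0]
      for start, end in swinging_door_segments(list(range(len(power))), power, eps=0.3):
          print(f"segment {start}-{end}: ramp {power[end] - power[start]:+.1f} MW "
                f"over {end - start} steps")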

  5. Onboard Classifiers for Science Event Detection on a Remote Sensing Spacecraft

    NASA Technical Reports Server (NTRS)

    Castano, Rebecca; Mazzoni, Dominic; Tang, Nghia; Greeley, Ron; Doggett, Thomas; Cichy, Ben; Chien, Steve; Davies, Ashley

    2006-01-01

    Typically, data collected by a spacecraft is downlinked to Earth and pre-processed before any analysis is performed. We have developed classifiers that can be used onboard a spacecraft to identify high priority data for downlink to Earth, providing a method for maximizing the use of a potentially bandwidth limited downlink channel. Onboard analysis can also enable rapid reaction to dynamic events, such as flooding, volcanic eruptions or sea ice break-up. Four classifiers were developed to identify cryosphere events using hyperspectral images. These classifiers include a manually constructed classifier, a Support Vector Machine (SVM), a Decision Tree and a classifier derived by searching over combinations of thresholded band ratios. Each of the classifiers was designed to run in the computationally constrained operating environment of the spacecraft. A set of scenes was hand-labeled to provide training and testing data. Performance results on the test data indicate that the SVM and manual classifiers outperformed the Decision Tree and band-ratio classifiers with the SVM yielding slightly better classifications than the manual classifier.

  6. Detection of Supersonic Downflows and Associated Heating Events in the Transition Region above Sunspots

    NASA Astrophysics Data System (ADS)

    Kleint, L.; Antolin, P.; Tian, H.; Judge, P.; Testa, P.; De Pontieu, B.; Martínez-Sykora, J.; Reeves, K. K.; Wuelser, J. P.; McKillop, S.; Saar, S.; Carlsson, M.; Boerner, P.; Hurlburt, N.; Lemen, J.; Tarbell, T. D.; Title, A.; Golub, L.; Hansteen, V.; Jaeggli, S.; Kankelborg, C.

    2014-07-01

    Interface Region Imaging Spectrograph data allow us to study the solar transition region (TR) with an unprecedented spatial resolution of 0.''33. On 2013 August 30, we observed bursts of high Doppler shifts suggesting strong supersonic downflows of up to 200 km s-1 and weaker, slightly slower upflows in the spectral lines Mg II h and k, C II 1336, Si IV 1394 Å, and 1403 Å, that are correlated with brightenings in the slitjaw images (SJIs). The bursty behavior lasts throughout the 2 hr observation, with average burst durations of about 20 s. The locations of these short-lived events appear to be the umbral and penumbral footpoints of EUV loops. Fast apparent downflows are observed along these loops in the SJIs and in the Atmospheric Imaging Assembly, suggesting that the loops are thermally unstable. We interpret the observations as cool material falling from coronal heights, and especially coronal rain produced along the thermally unstable loops, which leads to an increase of intensity at the loop footpoints, probably indicating an increase of density and temperature in the TR. The rain speeds are on the higher end of previously reported speeds for this phenomenon, and possibly higher than the free-fall velocity along the loops. On other observing days, similar bright dots are sometimes aligned into ribbons, resembling small flare ribbons. These observations provide a first insight into small-scale heating events in sunspots in the TR.

  7. DETECTION OF SUPERSONIC DOWNFLOWS AND ASSOCIATED HEATING EVENTS IN THE TRANSITION REGION ABOVE SUNSPOTS

    SciTech Connect

    Kleint, L.; Martínez-Sykora, J.; Antolin, P.; Tian, H.; Testa, P.; Reeves, K. K.; McKillop, S.; Saar, S.; Golub, L.; Judge, P.; Carlsson, M.; Hansteen, V.; Jaeggli, S.; and others

    2014-07-10

    Interface Region Imaging Spectrograph data allow us to study the solar transition region (TR) with an unprecedented spatial resolution of 0.''33. On 2013 August 30, we observed bursts of high Doppler shifts suggesting strong supersonic downflows of up to 200 km s-1 and weaker, slightly slower upflows in the spectral lines Mg II h and k, C II 1336, Si IV 1394 Å, and 1403 Å, that are correlated with brightenings in the slitjaw images (SJIs). The bursty behavior lasts throughout the 2 hr observation, with average burst durations of about 20 s. The locations of these short-lived events appear to be the umbral and penumbral footpoints of EUV loops. Fast apparent downflows are observed along these loops in the SJIs and in the Atmospheric Imaging Assembly, suggesting that the loops are thermally unstable. We interpret the observations as cool material falling from coronal heights, and especially coronal rain produced along the thermally unstable loops, which leads to an increase of intensity at the loop footpoints, probably indicating an increase of density and temperature in the TR. The rain speeds are on the higher end of previously reported speeds for this phenomenon, and possibly higher than the free-fall velocity along the loops. On other observing days, similar bright dots are sometimes aligned into ribbons, resembling small flare ribbons. These observations provide a first insight into small-scale heating events in sunspots in the TR.

  8. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for routine detection and quantitation of GMO events.
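
    The absolute quantitation step in digital PCR rests on the standard Poisson correction relating the fraction of positive partitions to the average copies per partition; the GM content then follows from the ratio of target to reference copies. The partition counts below are invented, and any event-specific calibration (e.g. copy-number or zygosity adjustments) is ignored.

      import math

      def copies_per_partition(positive, total):
          """Poisson correction: average template copies per partition."""
          return -math.log(1.0 - positive / total)

      # Hypothetical duplex chip: one dye counts the event-specific target,
      # the other counts the endogenous reference gene, in the same partitions.
      partitions = 20000
      target_lambda = copies_per_partition(positive=310, total=partitions)
      reference_lambda = copies_per_partition(positive=15500, total=partitions)

      gm_content = 100.0 * target_lambda / reference_lambda
      print(f"target: {target_lambda:.4f} copies/partition, "
            f"reference: {reference_lambda:.4f} copies/partition, "
            f"GM content ~ {gm_content:.2f}%")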

  9. Maximizing the probability of detecting an electromagnetic counterpart of gravitational-wave events

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Stubbs, Christopher

    2016-10-01

    Compact binary coalescences are a promising source of gravitational waves for second-generation interferometric gravitational-wave detectors such as Advanced LIGO and Advanced Virgo. These are among the most promising sources for joint detection of electromagnetic (EM) and gravitational-wave (GW) emission. To maximize the science performed with these objects, it is essential to undertake a follow-up observing strategy that maximizes the likelihood of detecting the EM counterpart. We present a follow-up strategy that maximizes the counterpart detection probability, given a fixed investment of telescope time. We show how the prior assumption on the luminosity function of the electromagnetic counterpart impacts the optimized follow-up strategy. Our results suggest that if the goal is to detect an EM counterpart from among a succession of GW triggers, the optimal strategy is to perform long integrations in the highest likelihood regions. For certain assumptions about source luminosity and mass distributions, we find that the optimal time investment is proportional to the 2/3 power of the surface density of the GW location probability on the sky. In the future, this analysis framework will benefit significantly from 3-dimensional localization probabilities.
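
    The 2/3-power scaling quoted above translates into a simple allocation rule when the sky is divided into equal-area tiles, so that each tile's probability tracks the local probability surface density. The tile probabilities and time budget below are invented, and the full analysis, which also folds in the assumed counterpart luminosity function and detector sensitivity, is not reproduced.

      import numpy as np

      tile_prob = np.array([0.40, 0.25, 0.15, 0.10, 0.05, 0.05])   # GW skymap probability per tile
      total_time_hours = 10.0                                      # assumed telescope budget

      weights = tile_prob ** (2.0 / 3.0)
      time_per_tile = total_time_hours * weights / weights.sum()

      for i, (p, t) in enumerate(zip(tile_prob, time_per_tile)):
          print(f"tile {i}: probability {p:.2f} -> {t:.2f} h")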

  10. Systems and methods for detecting a failure event in a field programmable gate array

    NASA Technical Reports Server (NTRS)

    Ng, Tak-Kwong (Inventor); Herath, Jeffrey A. (Inventor)

    2009-01-01

    An embodiment generally relates to a method of self-detecting an error in a field programmable gate array (FPGA). The method includes writing a signature value into a signature memory in the FPGA and determining a conclusion of a configuration refresh operation in the FPGA. The method also includes reading an outcome value from the signature memory.

  11. Evaluating robustness of gait event detection based on machine learning and natural sensors.

    PubMed

    Hansen, Morten; Haugland, Morten K; Sinkjaer, Thomas

    2004-03-01

    A real-time system for deriving timing control for functional electrical stimulation for foot-drop correction, using peripheral nerve activity as a sensor input, was tested for reliability to investigate its potential for clinical use. The system, which was previously reported, was tested on a hemiplegic subject instrumented with a recording cuff electrode on the sural nerve and a stimulation cuff electrode on the peroneal nerve. Implanted devices enabled recording and stimulation through telelinks. An input domain was derived from the recorded electroneurogram and fed to a detection algorithm based on an adaptive logic network for controlling the stimulation timing. Reliability was tested by having the subject wear different footwear and walk on different surfaces than those used when the training data were recorded. The detection system was also evaluated several months after training. The detection system successfully detected gait events while walking with different footwear on varying surfaces up to 374 days after training, and thereby showed great potential for clinical use.

  12. dCaP: detecting differential binding events in multiple conditions and proteins

    PubMed Central

    2014-01-01

    Background Current ChIP-seq studies are interested in comparing multiple epigenetic profiles across several cell types and tissues simultaneously for studying constitutive and differential regulation. Simultaneous analysis of multiple epigenetic features in many samples can gain substantial power and specificity compared with analyzing individual features and/or samples separately. Yet there are currently few tools that can perform joint inference of constitutive and differential regulation in multi-feature, multi-condition contexts with statistical testing. Existing tools either test regulatory variation for one factor in multiple samples at a time, or for multiple factors in one or two samples. Many of them only identify binary rather than quantitative variation, which is sensitive to threshold choices. Results We propose a novel and powerful method called dCaP for simultaneously detecting constitutive and differential regulation of multiple epigenetic factors in multiple samples. Using simulation, we demonstrate the superior power of dCaP compared to existing methods. We then apply dCaP to two datasets from the human and mouse ENCODE projects to demonstrate its utility. We show in the human dataset that the cell-type-specific regulatory loci detected by dCaP are significantly enriched near genes with cell-type-specific functions and disease relevance. We further show in the mouse dataset that dCaP captures genomic regions showing significant signal variations for TAL1 occupancy between two mouse erythroid cell lines. The novel TAL1 occupancy loci detected only by dCaP are highly enriched with GATA1 occupancy and differential gene expression, while those detected only by other methods are not. Conclusions Here, we developed a novel approach that utilizes the cooperative property of proteins to detect differential binding given multivariate ChIP-seq samples with better power, aiming to complement existing approaches and provide new insights in the method development in

  13. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work on having the completed system meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. Because the consistency and integrity of the model is assured, the consistency and integrity of the various specification documents is assured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how these needs are being addressed by international standards writing teams.

  14. Detection of climate change impacts on boreal soil carbon cycling: A model-based analysis of carbon stock and flux changes over the coming decades

    NASA Astrophysics Data System (ADS)

    Fan, Z.; Neff, J.

    2009-12-01

    Future changes in the organic carbon (OC) cycling of northern soils due to climate change may have significant impacts on global C cycling. However, such changes are complex and poorly understood, in part because boreal soils have unique factors that preserve OC (e.g., permafrost) and loss pathways that include CO2, CH4, and dissolved organic carbon (DOC) fluxes. Additionally, boreal soils contain large stocks of OC that challenge attempts to measure OC loss through repeat measurements of OC pools. With multiple pathways of OC loss and challenges to OC monitoring, it becomes critical to determine which component or property of boreal soil OC (e.g., thickness of the OC layer; 14C in the solid, liquid, or gas phase) is likely to be most sensitive to potential climate changes, and when changes in these components would become detectable using laboratory or field measurements. The objective of this study is to provide theoretical answers to these questions using a single, integrated biogeochemical model along with various sensitivity analyses. Several existing models have been incorporated into the biogeochemical model, including 1) a multi-isotope OC dynamics model simulating the dynamics of OC layers through time, 2) a soil thermal dynamics model simulating soil heat transported by conduction and by convection via movement of liquid water and water vapor, 3) a DOC dynamics model simulating the production, fate, and transport of DOC, and 4) a CO2 dynamics model simulating the production and transport of CO2. Six synthesis sites with a factorial combination of drainage class (i.e., well drained, intermediately well drained, and poorly drained) and permafrost status (i.e., with or without underlying permafrost) were studied in this research. The results highlight the importance of DOC fluxes from the OC layers to the mineral soils; however, the importance of DOC fluxes varied among sites and was strongly dependent on soil physical properties, including soil texture and moisture content.

  15. Combined passive detection and ultrafast active imaging of cavitation events induced by short pulses of high-intensity ultrasound.

    PubMed

    Gateau, Jérôme; Aubry, Jean-François; Pernot, Mathieu; Fink, Mathias; Tanter, Mickaël

    2011-03-01

    The activation of natural gas nuclei to induce larger bubbles is possible using short ultrasonic excitations of high amplitude, and is required for ultrasound cavitation therapies. However, little is known about the distribution of nuclei in tissues. Therefore, the acoustic pressure level necessary to generate bubbles in a targeted zone and their exact location are currently difficult to predict. To monitor the initiation of cavitation activity, a novel all-ultrasound technique sensitive to single nucleation events is presented here. It is based on combined passive detection and ultrafast active imaging over a large volume using the same multi-element probe. Bubble nucleation was induced using a focused transducer (660 kHz, f-number = 1) driven by a high-power electric burst (up to 300 W) of one to two cycles. Detection was performed with a linear array (4 to 7 MHz) aligned with the single-element focal point. In vitro experiments in gelatin gel and muscular tissue are presented. The synchronized passive detection enabled radio-frequency data to be recorded, comprising high-frequency coherent wave fronts as signatures of the acoustic emissions linked to the activation of the nuclei. Active change detection images were obtained by subtracting echoes collected in the unnucleated medium. These indicated the appearance of stable cavitating regions. Because of the ultrafast frame rate, active detection occurred as quickly as 330 μs after the high-amplitude excitation and the dynamics of the induced regions were studied individually.

  16. Event-related brain potentials reveal the time-course of language change detection in early bilinguals.

    PubMed

    Kuipers, Jan-Rouke; Thierry, Guillaume

    2010-05-01

    Using event-related brain potentials, we investigated the temporal course of language change detection in proficient bilinguals as compared to matched controls. Welsh-English bilingual participants and English controls were presented with a variant of the oddball paradigm involving picture-word pairs. The language of the spoken word was manipulated such that English was the frequent stimulus (75%) and Welsh the infrequent stimulus (25%). We also manipulated semantic relatedness between pictures and words, such that only half of the pictures were followed by a word that corresponded with the identity of the picture. The P2 wave was significantly modulated by language in the bilingual group only, suggesting that this group detected a language change as early as 200 ms after word onset. Monolinguals also reliably detected the language change, but at a later stage of semantic integration (N400 range), since Welsh words were perceived as meaningless. The early detection of a language change in bilinguals triggered stimulus re-evaluation mechanisms reflected by a significant P600 modulation by Welsh words. Furthermore, compared to English unrelated words, English words matching the picture identity elicited significantly greater P2 amplitudes in the bilingual group only, suggesting that proficient bilinguals validate an incoming word against their expectation based on the context. Overall, highly proficient bilinguals appear to detect language changes very early on during speech perception and to consciously monitor language changes when they occur.

  17. Efficient Data Collection and Event Boundary Detection in Wireless Sensor Networks Using Tiny Models

    NASA Astrophysics Data System (ADS)

    King, Kraig; Nittel, Silvia

    Using wireless geosensor networks (WGSN), sensor nodes often monitor a phenomenon that is continuous in both time and space. However, sensor nodes take discrete samples, and an analytical framework inside or outside the WGSN is used to analyze the phenomenon. In both cases, expensive communication is used to stream a large number of data samples to other nodes and to the base station. In this work, we explore a novel alternative that utilizes predictive process knowledge of the observed phenomena to minimize upstream communication. Often, observed phenomena adhere to a process with predictable behavior over time. We present a strategy for developing and running so-called 'tiny models' on individual sensor nodes that capture the predictable behavior of the phenomenon; nodes then communicate only when unexpected events are observed. Using multiple simulations, we demonstrate that a significant percentage of messages can be eliminated during data collection.
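
    The following Python sketch illustrates the general idea of a "tiny model" node; it is not the authors' implementation. The node and the base station share a trivial trend predictor, and the node transmits a reading only when the prediction fails.

      import random

      def tiny_model_node(samples, tolerance=0.5):
          """Return the subset of samples the node must transmit upstream.

          Node and base station share the same trivial model: the next reading
          equals the current model state plus the last reported slope, so both
          sides stay synchronized using only the transmitted reports.
          """
          reports = [(0, samples[0])]              # the first reading is always sent
          state, slope = samples[0], 0.0
          for i in range(1, len(samples)):
              predicted = state + slope
              if abs(samples[i] - predicted) > tolerance:   # unexpected event
                  slope = samples[i] - state
                  state = samples[i]
                  reports.append((i, samples[i]))
              else:
                  state = predicted                # both sides advance identically
          return reports

      # A slowly warming temperature trace with one sudden event at index 60.
      trace = [20.0 + 0.02 * i + random.gauss(0, 0.05) for i in range(100)]
      trace[60] += 3.0
      sent = tiny_model_node(trace)
      print(f"{len(sent)} of {len(trace)} samples transmitted")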

  18. Detection of overflow events in the Shag Rocks Passage, Scotia Ridge.

    PubMed

    Zenk, W

    1981-09-04

    During an almost yearlong period of observations made with a current meter in the fracture zone between the Falkland Islands (Islas Malvinas) and South Georgia, several overflow events were recorded at a depth of 3000 meters carrying cold bottom water from the Scotia Sea into the Argentine Basin. The outflow bursts of Scotia Sea bottom water, a mixing product of Weddell Sea and eastern Pacific bottom water, were associated with typical speeds of more than 28 centimeters per second toward the northwest and characteristic temperatures below 0.6 degrees C. The maximum 24-hour average speed of 65 centimeters per second, together with a temperature of 0.29 degrees C, was encountered on 14 November 1980 at a water depth of 2973 meters, 35 meters above the sea floor.

  19. Can social media data lead to earlier detection of drug‐related adverse events?

    PubMed Central

    Cremieux, Pierre; Audenrode, Marc Van; Vekeman, Francis; Karner, Paul; Zhang, Haimin; Greenberg, Paul

    2016-01-01

    Purpose: To compare the patient characteristics and the inter-temporal reporting patterns of adverse events (AEs) for atorvastatin (Lipitor®) and sibutramine (Meridia®) in social media (AskaPatient.com) versus the FDA Adverse Event Reporting System (FAERS). Methods: We identified clinically important AEs associated with atorvastatin (muscle pain) and sibutramine (cardiovascular AEs), compared their patterns in social media postings versus FAERS, and used Granger causality tests to assess whether social media postings were useful in forecasting FAERS reports. Results: We analyzed 998 and 270 social media postings between 2001 and 2014, and 69,003 and 7,383 FAERS reports between 1997 and 2014, for atorvastatin and sibutramine, respectively. Social media reporters were younger (atorvastatin: 53.9 vs. 64.0 years, p < 0.001; sibutramine: 36.8 vs. 43.8 years, p < 0.001). Social media reviews contained fewer serious AEs (atorvastatin, pain: 2.5% vs. 38.2%; sibutramine, cardiovascular issues: 7.9% vs. 63.0%; p < 0.001 for both) and concentrated on fewer types of AEs (proportion comprising the top 20 AEs: atorvastatin, 88.7% vs. 55.4%; sibutramine, 86.3% vs. 65.4%) compared with FAERS. While social media sibutramine reviews mentioning cardiac issues helped predict those in FAERS 11 months later (p < 0.001), social media atorvastatin reviews did not help predict FAERS reports. Conclusions: Social media AE reporters were younger and focused on less serious and fewer types of AEs than FAERS reporters. The potential for social media to provide earlier indications of AEs compared with FAERS is uncertain. Our findings highlight some of the promises and limitations of online social media versus conventional pharmacovigilance sources and the need for careful interpretation of the results. © 2016 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons Ltd. PMID: 27601271
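
    For readers unfamiliar with the test named above, the snippet below runs a Granger-causality check of this general kind with statsmodels. The two monthly series are synthetic stand-ins constructed so that one leads the other by roughly 11 months; they are not the study's data.

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(0)
      social = rng.poisson(20, size=120).astype(float)          # monthly social-media AE posts
      faers = np.roll(social, 11) + rng.poisson(50, size=120)   # FAERS counts lagging by ~11 months

      # Column 0 is the series being predicted (FAERS), column 1 the candidate cause.
      data = np.column_stack([faers, social])
      results = grangercausalitytests(data, maxlag=12)

      for lag, (tests, _) in results.items():
          f_stat, p_value, _, _ = tests["ssr_ftest"]
          print(f"lag {lag:2d}: F = {f_stat:6.2f}, p = {p_value:.4f}")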

  20. Detection and Analysis of High Ice Concentration Events and Supercooled Drizzle from IAGOS Commercial Aircraft

    NASA Astrophysics Data System (ADS)

    Gallagher, Martin; Baumgardner, Darrel; Lloyd, Gary; Beswick, Karl; Freer, Matt; Durant, Adam

    2016-04-01

    Hazardous encounters with high ice concentrations that lead to temperature and airspeed sensor measurement errors, as well as engine rollback and flameout, continue to pose serious problems for flight operations of commercial air carriers. Supercooled liquid droplets (SLD) are an additional hazard, especially for smaller commuter aircraft that do not have sufficient power to fly out of heavy icing conditions or heat to remove the ice. New regulations issued by the United States and European regulatory agencies are being implemented that will require aircraft below a certain weight class to carry sensors that detect and warn of these types of icing conditions. Commercial aircraft do not currently carry standard sensors to detect the presence of ice crystals in high concentrations because such crystals are typically found in sizes that are below the detection range of aircraft weather radar. Likewise, the sensors currently used to detect supercooled water do not respond well to drizzle-sized drops. Hence, there is a need for a sensor that can fill this measurement void. In addition, the forecast models used to predict regions of icing rely on pilot observations as the only means of validating the model products, and currently there are no forecasts for the prevalence of high-altitude ice crystals. Backscatter Cloud Probes (BCP) have been flying since 2011 under the IAGOS project on six Airbus commercial airliners operated by Lufthansa, Air France, China Air, Iberia and Cathay Pacific, and measure cloud droplets, ice crystals and aerosol particles larger than 5 μm. The BCP can detect these particles and measures an optical equivalent diameter (OED), but it is not able to distinguish the type of particle, i.e., whether they are droplets, ice crystals, dust or ash. However, some qualification can be done based on the measured temperature to discriminate between liquid water and ice. The next-generation BCP (BCPD, Backscatter Cloud Probe with polarization detection) is intended to address this limitation.

  1. Audio-visual event detection based on mining of semantic audio-visual labels

    NASA Astrophysics Data System (ADS)

    Goh, King-Shy; Miyahara, Koji; Radhakrishnan, Regunathan; Xiong, Ziyou; Divakaran, Ajay

    2003-12-01

    Removing commercials from television programs is a much sought-after feature for a personal video recorder. In this paper, we employ an unsupervised clustering scheme (CM_Detect) to detect commercials in television programs. Each program is first divided into 8-minute chunks, and we extract audio and visual features from each of these chunks. Next, we apply k-means clustering to assign each chunk a commercial/program label. In contrast to other methods, we do not make any assumptions regarding the program content. Thus, our method is highly content-adaptive and computationally inexpensive. Through empirical studies on various types of content, including American news, Japanese news, and sports programs, we demonstrate that our method is able to filter out most of the commercials without falsely removing the regular program.
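
    The clustering step can be pictured with the short scikit-learn sketch below. It is a schematic stand-in for CM_Detect, using synthetic features and an assumed "minority cluster = commercials" heuristic that the abstract itself does not spell out.

      import numpy as np
      from sklearn.cluster import KMeans

      def label_chunks(chunk_features: np.ndarray) -> np.ndarray:
          """chunk_features: (n_chunks, n_features) audio-visual features per chunk.
          Returns a boolean array, True where a chunk is labelled as commercial."""
          km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(chunk_features)
          labels = km.labels_
          commercial_cluster = np.argmin(np.bincount(labels))   # assume ads fill less airtime
          return labels == commercial_cluster

      # Synthetic example: 40 program-like chunks and 10 commercial-like chunks.
      rng = np.random.default_rng(1)
      program = rng.normal(0.0, 1.0, size=(40, 6))
      ads = rng.normal(3.0, 1.0, size=(10, 6))
      features = np.vstack([program, ads])
      print(label_chunks(features).sum(), "chunks flagged as commercials")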

  2. STROBE-based methodology for detection of adverse events across multiple communities.

    PubMed

    Sordo, Margarita; Colecchi, Judith; Dubey, Anil; Dubey, Anil Kumar; Gainer, Vivian; Murphy, Shawn N

    2008-11-06

    Partners Healthcare is one of five institutions collaborating with the eHealth Initiative (eHI) and the FDA in a nationwide effort to develop novel health information technology tools to create an active drug safety surveillance system across the U.S. The STROBE statement serves as the standard for defining a structured, systematic, reproducible approach for detecting both the risks and benefits of drug treatments in multiple settings.

  3. 3D-nanostructured Au electrodes for the event-specific detection of MON810 transgenic maize.

    PubMed

    Barroso, M Fátima; Freitas, Maria; Oliveira, M Beatriz P P; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; Delerue-Matos, Cristina

    2015-03-01

    In the present work, the development of a genosensor for the event-specific detection of MON810 transgenic maize is proposed. Taking advantage of nanostructuration, a cost-effective three dimensional electrode was fabricated and a ternary monolayer containing a dithiol, a monothiol and the thiolated capture probe was optimized to minimize the unspecific signals. A sandwich format assay was selected as a way of precluding inefficient hybridization associated with stable secondary target structures. A comparison between the analytical performance of the Au nanostructured electrodes and commercially available screen-printed electrodes highlighted the superior performance of the nanostructured ones. Finally, the genosensor was effectively applied to detect the transgenic sequence in real samples, showing its potential for future quantitative analysis.

  4. An evaluation of B-mode and color Doppler ultrasonography for detecting periovulatory events in the bitch.

    PubMed

    Bergeron, Lindsay H; Nykamp, Stephanie G; Brisson, Brigitte A; Madan, Pavneesh; Gartley, Cathy J

    2013-01-15

    When determining optimal breeding time in the bitch, specific periovulatory events must be identified. The main objectives were to relate ultrasonographic changes in ovarian blood flow, follicle/corpora lutea count and echotexture to periovulatory events, and to assess the efficacy of each for identifying these events. Twelve Beagle (N = 3), Beagle-cross (N = 2) and hound-cross (N = 7) bitches (body weight range, 7.5-27.5 kg) were examined daily from the onset of proestrus to approximately 4 days post-LH peak. Follicle and corpora lutea count and echotexture analyses were performed using B-mode ultrasound and ovarian blood flow analysis was performed using color Doppler ultrasound. Serum LH concentrations were analyzed by validated RIA. There was an increase (P < 0.05) in ovarian blood flow from the day of the preovulatory LH peak (605 pixels; confidence interval, 397-856), to 1 day after this peak (1092 pixels; confidence interval, 724-1535), enabling detection of the preovulatory LH peak. There were no significant changes in follicle/corpora lutea echotexture relative to days from the preovulatory LH peak. There were significant decreases in follicle/corpora lutea number between Days -1 and 3; Days -1 and 4; and Days 0 and 3, relative to the preovulatory LH peak. We concluded that color Doppler ultrasound performed once daily was more accurate in identifying the preovulatory LH peak than B-mode ultrasound and enabled prospective determination of ovulation.

  5. Characterization and event specific-detection by quantitative real-time PCR of T25 maize insert.

    PubMed

    Collonnier, Cécile; Schattner, Alexandra; Berthier, Georges; Boyer, Francine; Coué-Philippe, Géraldine; Diolez, Annick; Duplan, Marie-Noëlle; Fernandez, Sophie; Kebdani, Naïma; Kobilinsky, André; Romaniuk, Marcel; de Beuckeleer, Marc; de Loose, Marc; Windels, Pieter; Bertheau, Yves

    2005-01-01

    T25 is one of the 4 maize transformation events from which commercial lines have so far been authorized in Europe. It was created by polyethylene glycol-mediated transformation using a construct bearing one copy of the synthetic pat gene associated with both promoter and terminator of the 35S ribosomal gene from cauliflower mosaic virus. In this article, we report the sequencing of the whole T25 insert and the characterization of its integration site by using a genome walking strategy. Our results confirmed that one intact copy of the initial construct had been integrated in the plant genome. They also revealed, at the 5' junction of the insert, the presence of a second truncated 35S promoter, probably resulting from rearrangements which may have occurred before or during integration of the plasmid DNA. The analysis of the junction fragments showed that the integration site of the insert presented high homologies with the Huck retrotransposon family. By using one primer annealing in the maize genome and the other in the 5' end of the integrated DNA, we developed a reliable event-specific detection system for T25 maize. To provide means to comply with the European regulation, a real-time PCR test was designed for specific quantitation of T25 event by using Taqman chemistry.

  6. Architecture design of the multi-functional wavelet-based ECG microprocessor for realtime detection of abnormal cardiac events.

    PubMed

    Cheng, Li-Fang; Chen, Tung-Chien; Chen, Liang-Gee

    2012-01-01

    Most abnormal cardiac events, such as myocardial ischemia, acute myocardial infarction (AMI) and fatal arrhythmia, can be diagnosed through continuous electrocardiogram (ECG) analysis. According to recent clinical research, early detection and alarming of such cardiac events can reduce the time delay to the hospital, and the clinical outcomes of these individuals can be greatly improved. Therefore, it would be helpful to have a long-term ECG monitoring system with the ability to identify abnormal cardiac events and provide real-time warnings to the user. The combination of a wireless body area sensor network (BASN) and an on-sensor ECG processor is a possible solution for this application. In this paper, we aim to design and implement a digital signal processor suitable for continuous ECG monitoring and alarming based on the continuous wavelet transform (CWT), using both a programmable RISC processor and application-specific integrated circuits (ASICs) for performance optimization. According to the implementation results, the power consumption of the proposed processor integrated with an ASIC for CWT computation is only 79.4 mW. Compared with the single-RISC processor, a power reduction of about 91.6% is achieved.
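
    As a rough software analogue of the CWT stage (not the paper's hardware design), the NumPy sketch below convolves a synthetic ECG with Ricker wavelets at a few scales and thresholds the mid-scale coefficients to flag beat-like events.

      import numpy as np

      def ricker(points: int, a: float) -> np.ndarray:
          """Ricker ("Mexican hat") wavelet with width parameter a."""
          t = np.arange(points) - (points - 1) / 2.0
          amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
          return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

      def cwt_ricker(signal: np.ndarray, widths) -> np.ndarray:
          """Continuous wavelet transform: one row of coefficients per width."""
          return np.array([np.convolve(signal, ricker(10 * w, w), mode="same")
                           for w in widths])

      # Synthetic "ECG": noisy baseline with sharp spikes standing in for QRS complexes.
      fs = 250
      rng = np.random.default_rng(0)
      ecg = 0.05 * rng.standard_normal(10 * fs)
      ecg[::fs] += 1.0                                  # one spike per second

      coeffs = cwt_ricker(ecg, widths=[4, 8, 16])
      beats = np.where(coeffs[1] > 0.5 * coeffs[1].max())[0]
      print(len(beats), "samples exceed the detection threshold")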

  7. The direct detection of boosted dark matter at high energies and PeV events at IceCube

    SciTech Connect

    Bhattacharya, A.; Gandhi, R.; Gupta, A.

    2015-03-13

    We study the possibility of detecting dark matter directly via a small but energetic component that is allowed within present-day constraints. Drawing closely upon the fact that neutral current neutrino nucleon interactions are indistinguishable from DM-nucleon interactions at low energies, we extend this feature to high energies for a small, non-thermal but highly energetic population of DM particle χ, created via the decay of a significantly more massive and long-lived non-thermal relic Φ, which forms the bulk of DM. If χ interacts with nucleons, its cross-section, like the neutrino-nucleus coherent cross-section, can rise sharply with energy leading to deep inelastic scattering, similar to neutral current neutrino-nucleon interactions at high energies. Thus, its direct detection may be possible via cascades in very large neutrino detectors. As a specific example, we apply this notion to the recently reported three ultra-high energy PeV cascade events clustered around 1 – 2 PeV at IceCube (IC). We discuss the features which may help discriminate this scenario from one in which only astrophysical neutrinos constitute the event sample in detectors like IC.

  10. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    NASA Astrophysics Data System (ADS)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M. Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves or picking is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In the last decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open source, software graphical tool, named APASVO, which allows picking tasks in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. Besides, this graphical tool is complemented with two additional command line tools, an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
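
    The STA/LTA characteristic function that the tool implements can be written in a few lines of NumPy, as sketched below; the window lengths and trigger threshold are illustrative values, not APASVO's defaults.

      import numpy as np

      def sta_lta(trace, nsta, nlta):
          """Classic STA/LTA: ratio of short- to long-term averages of signal energy."""
          energy = np.asarray(trace, dtype=float) ** 2
          csum = np.concatenate([[0.0], np.cumsum(energy)])
          ratio = np.zeros(len(energy))
          for i in range(nlta, len(energy) + 1):
              sta = (csum[i] - csum[i - nsta]) / nsta
              lta = (csum[i] - csum[i - nlta]) / nlta
              ratio[i - 1] = sta / lta if lta > 0 else 0.0
          return ratio

      # Synthetic trace: background noise with an "event" starting near sample 3000.
      rng = np.random.default_rng(42)
      trace = rng.normal(0, 1, 6000)
      trace[3000:3400] += rng.normal(0, 8, 400)

      cft = sta_lta(trace, nsta=50, nlta=1000)
      onset = int(np.argmax(cft > 4.0))    # first sample whose STA/LTA exceeds the threshold
      print("declared onset near sample", onset)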

  11. Clinical Experiments of Communication by ALS Patient Utilizing Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic lateral sclerosis (ALS) patients are unable to communicate their desires successfully even though their mentality is normal, so the need for communication aids (CA) for ALS patients is well recognized. The authors therefore focused on the event-related potential (ERP), which is elicited primarily by target visual and auditory stimuli. P200, N200 and P300 are components of the ERP; these potentials are elicited when the subject focuses attention on stimuli that appear infrequently. An ALS patient participated in two experiments. In the first experiment, a target word out of five words on a computer display was specified. The five words were each linked to an electric appliance, allowing the ALS patient to switch on a target appliance via the ERP. In the second experiment, a target word in a 5×5 matrix was specified by measuring the ERP. The rows and columns of the matrix were reversed randomly, and the word at the intersection of the row and column that elicited the ERP was identified as the target word. The rates of correct judgment in the first and second experiments were 100% for N200 and 96% for P200. For practical use of this system, it is very important to determine suitable communication algorithms for each patient by performing these experiments and evaluating the results.
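
    A minimal decoding sketch for the 5×5 matrix experiment is shown below. The average-and-argmax rule is an assumption made for illustration, not the authors' published algorithm.

      import numpy as np

      words = np.array([[f"w{r}{c}" for c in range(5)] for r in range(5)])

      rng = np.random.default_rng(3)
      # Simulated ERP amplitudes (arbitrary units) for 10 flashes of each row and
      # column; the target sits at row 2, column 4, so those flashes respond more.
      row_resp = rng.normal(1.0, 0.5, size=(5, 10))
      col_resp = rng.normal(1.0, 0.5, size=(5, 10))
      row_resp[2] += 3.0
      col_resp[4] += 3.0

      target_row = int(np.argmax(row_resp.mean(axis=1)))
      target_col = int(np.argmax(col_resp.mean(axis=1)))
      print("decoded word:", words[target_row, target_col])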

  12. High-frequency deletion event at aprt locus of CHO cells: detection and characterization of endpoints.

    PubMed

    Dewyse, P; Bradley, W E

    1989-01-01

    Two mechanisms are implicated in generating recessive drug resistance mutants at the adenine phosphoribosyltransferase (aprt) locus of Chinese hamster ovary (CHO) cells, one of which is a spontaneous high-frequency deletion of the entire gene. We have isolated and mapped a 19-kb fragment carrying aprt and its flanking sequences. A Southern blot study of 198 independent deletion mutants revealed that two different mutants have one of their breakpoints within the 19-kb region analyzed. One of these has an upstream breakpoint which could be narrowed down to a 4-kb fragment containing repetitive sequences. The other mutant has a breakpoint within a 410-bp sequence located 8.5 kb downstream of the aprt gene and which carries several elements similar to those signaling V-(D)-J joining in immunoglobulin and T-cell receptor gene rearrangements. In each case the other breakpoint lay outside of the analyzed region. These results support the previous indications that the deletions created by this spontaneous event are large.

  13. A new computational method for the detection of horizontal gene transfer events.

    PubMed

    Tsirigos, Aristotelis; Rigoutsos, Isidore

    2005-01-01

    In recent years, the increase in the amount of available genomic data has made it easier to appreciate the extent to which organisms increase their genetic diversity through horizontally transferred genetic material. Such transfers have the potential to give rise to extremely dynamic genomes in which a significant proportion of the coding DNA has been contributed by external sources. Because of the impact of these horizontal transfers on the ecological and pathogenic character of the recipient organisms, methods are continuously sought that are able to computationally determine which of the genes of a given genome are products of transfer events. In this paper, we introduce and discuss a novel computational method for identifying horizontal transfers that relies on a gene's nucleotide composition and obviates the need for knowledge of codon boundaries. In addition to being applicable to individual genes, the method can be easily extended to the case of clusters of horizontally transferred genes. With the help of an extensive and carefully designed set of experiments on 123 archaeal and bacterial genomes, we demonstrate that the new method exhibits a significant improvement in sensitivity when compared to previously published approaches. In fact, it achieves an average relative improvement across genomes of between 11 and 41% compared to the Codon Adaptation Index method in distinguishing native from foreign genes. Our method's horizontal gene transfer predictions for 123 microbial genomes are available online at http://cbcsrv.watson.ibm.com/HGT/.
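
    A drastically simplified, illustrative version of composition-based scoring is sketched below. It scores each gene by the divergence of its dinucleotide frequencies from the genome-wide background; the method described in the paper is considerably more elaborate.

      from collections import Counter
      from math import log

      def dinucleotide_freqs(seq: str) -> dict:
          counts = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
          total = sum(counts.values())
          return {k: v / total for k, v in counts.items()}

      def composition_score(gene: str, background: dict) -> float:
          """Kullback-Leibler divergence of the gene's dinucleotide usage from the genome's."""
          freqs = dinucleotide_freqs(gene)
          return sum(p * log(p / background.get(k, 1e-9)) for k, p in freqs.items())

      # Toy data: a GC-rich "genome" and one AT-rich gene standing in for a transfer.
      genome = "GCGGCCGCATGCGGCCGC" * 200
      genes = {"native_1": genome[100:700],
               "native_2": genome[800:1400],
               "foreign": "ATTATAAATTTATAATAT" * 30}

      background = dinucleotide_freqs(genome)
      for name, seq in genes.items():
          print(f"{name:>9}: {composition_score(seq, background):.3f}")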

  14. Fast joint detection-estimation of evoked brain activity in event-related FMRI using a variational approach

    PubMed Central

    Chaari, Lotfi; Vincent, Thomas; Forbes, Florence; Dojat, Michel; Ciuciu, Philippe

    2013-01-01

    In standard within-subject analyses of event-related fMRI data, two steps are usually performed separately: detection of brain activity and estimation of the hemodynamic response. Because these two steps are inherently linked, we adopt the so-called region-based Joint Detection-Estimation (JDE) framework that addresses this joint issue using a multivariate inference for detection and estimation. JDE is built by making use of a regional bilinear generative model of the BOLD response and constraining the parameter estimation by physiological priors using temporal and spatial information in a Markovian model. In contrast to previous works that use Markov Chain Monte Carlo (MCMC) techniques to sample the resulting intractable posterior distribution, we recast the JDE into a missing data framework and derive a Variational Expectation-Maximization (VEM) algorithm for its inference. A variational approximation is used to approximate the Markovian model in the unsupervised spatially adaptive JDE inference, which allows automatic fine-tuning of spatial regularization parameters. It provides a new algorithm that exhibits interesting properties in terms of estimation error and computational cost compared to the previously used MCMC-based approach. Experiments on artificial and real data show that VEM-JDE is robust to model mis-specification and provides computational gain while maintaining good performance in terms of activation detection and hemodynamic shape recovery. PMID:23096056

  15. Source and path corrections, feature selection, and outlier detection applied to regional event discrimination in China

    SciTech Connect

    Hartse, H.E.; Taylor, S.R.; Phillips, W.S.; Velasco, A.A.

    1999-03-01

    The authors are investigating techniques to improve regional discrimination performance in uncalibrated regions. These include combined source and path corrections, spatial path corrections, path-specific waveguide corrections to construct frequency-dependent amplitude corrections that remove attenuation, corner frequency scaling, and source region/path effects (such as blockages). The spatial method and the waveguide method address corrections for specific source regions and along specific paths. After applying the above corrections to phase amplitudes, the authors form amplitude ratios and use a combination of feature selection and outlier detection to choose the best-performing combination of discriminants. Feature selection remains an important issue. Most stations have an inadequate population of nuclear explosions on which to base discriminant selection. Additionally, mining explosions are probably not good surrogates for nuclear explosions. The authors are exploring the feasibility of sampling the source and path corrected amplitudes for each phase as a function of frequency in an outlier detection framework. In this case, the source identification capability will be based on the inability of the earthquake source model to fit data from explosion sources.

  16. Analyzing Protease Specificity and Detecting in Vivo Proteolytic Events Using Tandem Mass Spectrometry

    SciTech Connect

    Gupta, Nitin; Hixson, Kim K.; Culley, David E.; Smith, Richard D.; Pevzner, Pavel A.

    2010-07-01

    While trypsin remains the most commonly used protease in mass spectrometry, other proteases may be employed to increase peptide coverage or to generate overlapping peptides. Knowledge of the accurate specificity rules of these proteases helps database search tools to detect peptides, and becomes crucial when mass spectrometry is used to discover in vivo proteolytic cleavages. In this study, we use tandem mass spectrometry to analyze the specificity rules of selected proteases and describe MS-Proteolysis, a software tool for identifying putative sites of in vivo proteolytic cleavage. Our analysis suggests that the specificity rules for some commonly used proteases can be improved; e.g., we find that V8 protease cuts not only after Asp and Glu, as currently expected, but also shows a smaller propensity to cleave after Gly under the conditions tested in this study. Finally, we show that comparative analysis of multiple proteases can be used to detect putative in vivo proteolytic sites on a proteome-wide scale.

  17. Dual-particle imaging system based on simultaneous detection of photon and neutron collision events

    NASA Astrophysics Data System (ADS)

    Poitrasson-Rivière, Alexis; Hamel, Michael C.; Polack, J. Kyle; Flaska, Marek; Clarke, Shaun D.; Pozzi, Sara A.

    2014-10-01

    A dual-particle imaging (DPI) system capable of simultaneously detecting and imaging fast neutrons and photons has been designed and built. Imaging fast neutrons and photons simultaneously is particularly desirable for nuclear nonproliferation and/or safeguards applications because typical sources of interest (special nuclear material) emit both particle types. The DPI system consists of three detection planes: the first two planes consist of organic-liquid scintillators and the third plane consists of NaI(Tl) inorganic scintillators. Pulse shape discrimination techniques may be used for the liquid scintillators to differentiate neutron and photon pulses, whereas the NaI(Tl) scintillators are highly insensitive to neutrons. A prototype DPI system was set up using a digital data acquisition system as a proof of concept. Initial measurements showed potential for use of the DPI system with special nuclear material. The DPI system has efficiencies on the order of 10^-4 correlated counts per incident particle for both neutron and photon correlated counts, with simple-backprojection images displaying peaks within a few degrees of the source location. This uncertainty is expected to decrease with more extensive data interpretation.

  18. Event detection and sub-state discovery from biomolecular simulations using higher-order statistics: application to enzyme adenylate kinase.

    PubMed

    Ramanathan, Arvind; Savol, Andrej J; Agarwal, Pratul K; Chennubhotla, Chakra S

    2012-11-01

    Biomolecular simulations at millisecond and longer time-scales can provide vital insights into functional mechanisms. Because post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to the biological function online, that is, as simulations are progressing. Recently, we introduced a novel computational technique, quasi-anharmonic analysis (QAA) (Ramanathan et al., PLoS One 2011;6:e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this article, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD, a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states, and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond-timescale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three subdomains (LID, CORE, and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate that HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations.
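
    The fourth-order statistic at the heart of QAA and HOST4MD can be illustrated with a short SciPy example: excess kurtosis separates near-Gaussian (harmonic) fluctuations from anharmonic, multi-state ones. The snippet shows only the statistic on synthetic coordinates, not the full analysis pipeline.

      import numpy as np
      from scipy.stats import kurtosis

      rng = np.random.default_rng(7)
      n_frames = 5000
      # Two harmonic (Gaussian) coordinates and one that occasionally hops to a
      # second sub-state, which makes its distribution strongly non-Gaussian.
      harmonic = rng.normal(0.0, 1.0, size=(n_frames, 2))
      hopping = rng.normal(0.0, 0.3, n_frames) + 3.0 * (rng.random(n_frames) < 0.05)
      fluct = np.column_stack([harmonic, hopping])
      fluct -= fluct.mean(axis=0)                 # work with fluctuations about the mean

      excess_k = kurtosis(fluct, axis=0)          # Fisher definition: 0 for a Gaussian
      for i, k in enumerate(excess_k):
          tag = "anharmonic" if abs(k) > 0.5 else "harmonic"
          print(f"coordinate {i}: excess kurtosis = {k:+.2f} ({tag})")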

  19. On the Detectability of a Predicted Mesolensing Event Associated with the High Proper Motion Star VB 10

    NASA Astrophysics Data System (ADS)

    Lépine, Sébastien; DiStefano, Rosanne

    2012-04-01

    Extrapolation of the astrometric motion of the nearby low-mass star VB 10 indicates that sometime in late 2011 December or during the first 2-3 months of 2012, the star will make a close approach to a background point source. Based on astrometric uncertainties, we estimate a 1 in 2 chance that the distance of closest approach ρmin will be less than 100 mas, a 1 in 5 chance that ρmin < 50 mas, and a 1 in 10 chance that ρmin < 20 mas. The last would result in a microlensing event with a 6% magnification in the light from the background source and an astrometric shift of 3.3 mas. The lensing signal will however be significantly diluted by the light from VB 10, which is 1.5 mag brighter than the background source in B band, 5 mag brighter in I band, and 10 mag brighter in K band, making the event undetectable in all but the bluer optical bands. However, we show that if VB 10 happens to harbor a ~1 MJ planet on a moderately wide (≈0.18 AU-0.84 AU) orbit, there is a chance (1% to more than 10%, depending on the distance of closest approach and orbital period and inclination) that a passage of the planet closer to the background source will result in a secondary event of higher magnification. The detection of secondary events could be made possible with a several-times-per-night multi-site monitoring campaign. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).
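
    For reference, the quoted magnification and astrometric shift follow from the standard point-source, point-lens relations (not spelled out in the abstract), where u is the lens-source angular separation in units of the Einstein radius \theta_E:

      A(u) = \frac{u^2 + 2}{u\sqrt{u^2 + 4}}, \qquad
      \delta\theta = \frac{u}{u^2 + 2}\,\theta_E, \qquad
      u = \frac{\rho}{\theta_E}.

    These relations reproduce the numbers above for an assumed \theta_E of roughly 10 mas: with \rho_{min} of about 20 mas one gets u of about 2, hence A of about 1.06 (a 6% magnification) and \delta\theta of about \theta_E/3, i.e. 3.3 mas.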

  1. Application of stochastic discrete event system framework for detection of induced low rate TCP attack.

    PubMed

    Barbhuiya, F A; Agarwal, Mayank; Purwar, Sanketh; Biswas, Santosh; Nandi, Sukumar

    2015-09-01

    TCP is the most widely accepted transport layer protocol. The major emphasis during the development of TCP was on functionality and efficiency; not much consideration was given to the possibility of attackers exploiting the protocol, which has led to several attacks on TCP. This paper deals with the induced low-rate TCP attack. Since the attack is relatively new, only a few schemes have been proposed to mitigate it. However, the main issues with these schemes are scalability, required changes to the TCP header, lack of formal frameworks, etc. In this paper, we adapt the stochastic DES (discrete event system) framework for detecting the attack, which addresses most of these issues. We have successfully deployed and tested the proposed DES-based IDS on a test bed.

  2. Precursory Acoustic Signals Detection in Rockfall Events by Means of Optical Fiber Sensors

    NASA Astrophysics Data System (ADS)

    Schenato, L.; Marcato, G.; Gruca, G.; Iannuzzi, D.; Palmieri, L.; Galtarossa, A.; Pasuto, A.

    2012-12-01

    Rockfalls represent a major source of hazard in mountain areas: they occur at the apex of a process of stress accumulation in the unstable slope, during which part of the accumulated energy is released in small internal cracks. These cracks and the related acoustic emissions (AE) can therefore be used as precursory signals through which the unstable rock can be monitored. In particular, according to previous scientific literature, AE can be monitored in the range of 20 to 100 kHz. With respect to traditional AE sensors, such as accelerometers and piezoelectric transducers, fiber optic sensors (FOSs) may provide a reliable solution, potentially offering greater robustness to electromagnetic interference, a smaller form factor, multiplexing ability, increased distance range and higher sensitivity. To explore this possibility, in this work we have experimentally analyzed two interferometric fiber optic sensors for AE detection in rock masses. The first sensor is made of 100 m of G.657 optical fiber, tightly wound on an aluminum flanged hollow mandrel (inner diameter 30 mm, height 42 mm) that is isolated from the environment with acoustic absorbing material. A 4-cm-long M10 screw, which also acts as the main means of acoustic coupling between the rock and the sensor, is used to fasten the sensor to the rock. This fiber coil sensor (FCS) is inserted in the sensing arm of a fiber Mach-Zehnder interferometer. The second sensor consists of a micro-cantilever carved on the top of a cylindrical silica ferrule, with a marked mechanical resonance at about 12.5 kHz (Q-factor of about 400). A standard single-mode fiber is housed in the same ferrule, and the gap between the cantilever and the fiber end face acts as a vibration-sensitive Fabry-Perot cavity, interrogated with a low-coherence laser tuned at the quadrature point of the cavity. The sensor is housed in a 2-cm-long M10 bored bolt. Performance has been compared with that of a standard piezoelectric sensor.

  3. Age Dating Merger Events in Early Type Galaxies via the Detection of AGB Light

    NASA Technical Reports Server (NTRS)

    Bothun, G.

    2005-01-01

    A thorough statistical analysis of the J-H vs. H-K color plane of all detected early-type galaxies in the 2MASS catalog with velocities less than 5000 km/s has been performed. This all-sky survey is not restricted to one particular galactic environment, and therefore a representative range of early-type galaxy environments has been sampled. Virtually all N-body simulations of major mergers produce a central starburst due to the rapid collection of gas. This central starburst is of sufficient amplitude to change the stellar population in the central regions of the galaxy. Intermediate-age populations are revealed by the presence of AGB stars, which drive the central colors redder in H-K relative to the J-H baseline. This color anomaly has a lifetime of 2-5 billion years, depending on the amplitude of the initial starburst. Employing this technique on the entire 2MASS sample (several hundred galaxies) reveals that the AGB signature occurs less than 1% of the time. This is a straightforward indication that virtually all nearby early-type galaxies have not had a major merger within the last few billion years.

  4. Telehealth streams reduction based on pattern recognition techniques for events detection and efficient storage in EHR.

    PubMed

    Henriques, J; Rocha, T; Paredes, S; de Carvalho, P

    2013-01-01

    This work proposes a framework for telehealth stream analysis, founded on a pattern recognition technique that evaluates the similarity between multi-sensor biosignals. The strategy combines the Haar wavelet and Karhunen-Loève transforms to describe biosignals by means of a reduced set of parameters. These parameters, which reflect the dynamic behavior of the biosignals, can support the detection of relevant clinical conditions. Moreover, the simplicity and fast execution of the proposed approach allow its application in real-time operation and provide a practical way to manage historical electronic health records: i) common and uncommon behaviors can be distinguished; ii) different models, tailored to specific conditions, can be created and efficiently stored. The efficiency of the methodology is assessed through performance analysis, namely by computing the required number of operations and the compression rate. Its effectiveness is evaluated in the prediction of decompensation episodes using biosignals collected daily in the myHeart study (blood pressure, weight, respiration rate and heart rate).
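
    A minimal sketch of the two transforms named above is given below; it illustrates the idea rather than the myHeart processing chain. A one-level Haar step compresses each biosignal window, and a Karhunen-Loève (PCA) projection across windows reduces each one to a handful of scores.

      import numpy as np

      def haar_step(x: np.ndarray) -> np.ndarray:
          """One level of the Haar transform: pairwise averages, then differences."""
          x = x[: len(x) // 2 * 2]                       # force an even length
          avg = (x[0::2] + x[1::2]) / np.sqrt(2)
          diff = (x[0::2] - x[1::2]) / np.sqrt(2)
          return np.concatenate([avg, diff])

      def klt(windows: np.ndarray, n_keep: int):
          """Karhunen-Loeve transform: project windows onto the top principal directions."""
          centered = windows - windows.mean(axis=0)
          eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
          basis = eigvecs[:, ::-1][:, :n_keep]           # strongest components first
          return centered @ basis, basis

      # 200 windows of a synthetic quasi-periodic biosignal, 64 samples each.
      rng = np.random.default_rng(5)
      t = np.arange(64)
      windows = np.array([np.sin(2 * np.pi * t / 32 + rng.normal(0, 0.2))
                          + rng.normal(0, 0.1, 64) for _ in range(200)])

      haar_coeffs = np.array([haar_step(w)[:32] for w in windows])   # keep the coarse half
      scores, basis = klt(haar_coeffs, n_keep=4)
      print("each 64-sample window reduced to", scores.shape[1], "parameters")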

  5. An automatic rules extraction approach to support OSA events detection in an mHealth system.

    PubMed

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Giuseppe

    2014-09-01

    Detection and real-time monitoring of obstructive sleep apnea (OSA) episodes are very important tasks in healthcare. To address them, this paper proposes an easy-to-use, inexpensive, mobile-based approach relying on three steps. First, single-channel ECG data from a patient are collected by a wearable sensor and recorded on a mobile device. Second, automatic extraction of knowledge about that patient takes place offline, yielding a set of IF…THEN rules containing heart-rate variability (HRV) parameters. Third, these rules are used in our real-time mobile monitoring system: the same wearable sensor collects the single-channel ECG data and sends them to the same mobile device, which now processes those data online to compute HRV-related parameter values. If these values activate one of the rules found for that patient, an alarm is immediately produced. This approach has been tested on a literature database with 35 OSA patients, and a comparison against five well-known classifiers has been carried out.
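
    The third, online step can be pictured with the hedged sketch below. The HRV parameters are standard definitions, but the rule and its thresholds are invented for illustration; in practice they would come from the per-patient offline extraction step.

      import numpy as np

      def hrv_features(rr_intervals_ms):
          """A few standard HRV parameters from a window of RR intervals (ms)."""
          rr = np.asarray(rr_intervals_ms, dtype=float)
          sdnn = rr.std(ddof=1)                              # overall variability
          rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))         # short-term variability
          mean_hr = 60000.0 / rr.mean()                      # beats per minute
          return {"SDNN": sdnn, "RMSSD": rmssd, "meanHR": mean_hr}

      def osa_rule(f):
          """Example IF...THEN rule for one hypothetical patient."""
          return f["SDNN"] > 90.0 and f["RMSSD"] > 60.0 and f["meanHR"] < 55.0

      window = [1100, 1180, 950, 1250, 900, 1300, 980, 1220]   # RR intervals in ms
      features = hrv_features(window)
      print("ALARM" if osa_rule(features) else "no alarm", features)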

  6. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  7. SPR and SPR Imaging: Recent Trends in Developing Nanodevices for Detection and Real-Time Monitoring of Biomolecular Events

    PubMed Central

    Puiu, Mihaela; Bala, Camelia

    2016-01-01

    In this paper we review the underlying principles of the surface plasmon resonance (SPR) technique, particularly emphasizing its advantages along with its limitations regarding the ability to discriminate between the specific binding response and interfering effects from biological samples. Although SPR sensors have been developed for almost three decades, SPR detection is not yet able to reduce the time-consuming steps of the analysis, and it is hardly amenable to the miniaturized, portable platforms required in point-of-care (POC) testing. Recent advances in near-field optics have emerged, resulting in the development of SPR imaging (SPRi) as a powerful optical, label-free monitoring tool for multiplexed detection and monitoring of biomolecular events. The microarray design of SPRi chips, incorporating various metallic nanostructures, makes these optofluidic devices more suitable for diagnosis and near-patient testing than traditional SPR sensors. The latest developments indicate SPRi detection as the most promising surface plasmon-based technique for fulfilling the demands for implementation in lab-on-a-chip (LOC) technologies. PMID: 27314345

  8. Detection of explosive events by monitoring acoustically-induced geomagnetic perturbations

    SciTech Connect

    Lewis, J P; Rock, D R; Shaeffer, D L; Warshaw, S I

    1999-10-07

    The Black Thunder Coal Mine (BTCM) near Gillette, Wyoming was used as a test bed to determine the feasibility of detecting explosion-induced geomagnetic disturbances with ground-based induction magnetometers. Two magnetic observatories were fielded at distances of 50 km and 64 km geomagnetically north of the northernmost edge of BTCM. Each observatory consisted of three separate but mutually orthogonal magnetometers, Global Positioning System (GPS) timing, battery and solar power, a data acquisition and storage system, and a three-axis seismometer. Explosions with yields of 1 to 3 kT of TNT equivalent occur approximately every three weeks at BTCM. We hypothesize that explosion-induced acoustic waves propagate upward and interact collisionally with the ionosphere to produce ionospheric electron density (and concomitant current density) perturbations which act as sources for geomagnetic disturbances. These disturbances propagate through an ionospheric Alfven waveguide that we postulate to be leaky (due to the imperfectly conducting lower ionospheric boundary). Consequently, wave energy may be observed on the ground. We observed transient pulses, known as Q-bursts, with pulse widths of about 0.5 s and with spectral energy dominated by the Schumann resonances. These resonances appear to be excited in the earth-ionosphere cavity by Alfven solitons that may have been generated by the explosion-induced acoustic waves reaching the ionospheric E and F regions and that subsequently propagate down through the ionosphere to the atmosphere. In addition, we observe late-time (> 800 s) ultra-low-frequency (ULF) geomagnetic perturbations that appear to originate in the upper F region (approximately 300 km) and appear to be caused by the explosion-induced acoustic wave interacting with that part of the ionosphere. We suggest that explosion-induced Q-bursts may be discriminated from naturally occurring Q-bursts by association of the former with the late-time explosion-induced ULF perturbations.

  9. Automatic Event Detection in Search for Inter-Moss Loops in IRIS Si IV Slit-Jaw Images

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy R.; De Pontieu, Bart

    2015-01-01

    The high-resolution capabilities of the Interface Region Imaging Spectrometer (IRIS) mission have allowed the exploration of the finer details of the solar magnetic structure from the chromosphere to the lower corona that have previously been unresolved. Of particular interest to us are the relatively short-lived, low-lying magnetic loops that have foot points in neighboring moss regions. These inter-moss loops have also appeared in several AIA pass bands, which are generally associated with temperatures that are at least an order of magnitude higher than that of the Si IV emission seen in the 1400 angstrom pass band of IRIS. While the emission lines seen in these pass bands can be associated with a range of temperatures, the simultaneous appearance of these loops in IRIS 1400 and AIA 171, 193, and 211 suggest that they are not in ionization equilibrium. To study these structures in detail, we have developed a series of algorithms to automatically detect signal brightening or events on a pixel-by-pixel basis and group them together as structures for each of the above data sets. These algorithms have successfully picked out all activity fitting certain adjustable criteria. The resulting groups of events are then statistically analyzed to determine which characteristics can be used to distinguish the inter-moss loops from all other structures. While a few characteristic histograms reveal that manually selected inter-moss loops lie outside the norm, a combination of several characteristics will need to be used to determine the statistical likelihood that a group of events be categorized automatically as a loop of interest. The goal of this project is to be able to automatically pick out inter-moss loops from an entire data set and calculate the characteristics that have previously been determined manually, such as length, intensity, and lifetime. We will discuss the algorithms, preliminary results, and current progress of automatic characterization.
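
    A toy version of the per-pixel detection and grouping step is sketched below; it is not the IRIS/AIA pipeline itself. A synthetic space-time cube is thresholded and scipy.ndimage.label groups contiguous bright pixels into candidate events, from which lifetime and peak intensity can be read off.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(11)
      # Synthetic (time, y, x) cube: background noise plus two localized brightenings.
      cube = rng.normal(100.0, 5.0, size=(30, 64, 64))
      cube[5:12, 10:18, 20:28] += 60.0          # event 1
      cube[18:25, 40:46, 40:50] += 80.0         # event 2

      threshold = cube.mean() + 5 * cube.std()
      mask = cube > threshold                    # per-pixel brightening detection
      labels, n_groups = ndimage.label(mask)     # group connected pixels in space-time

      print(n_groups, "candidate events found")
      for group in range(1, n_groups + 1):
          frames, ys, xs = np.where(labels == group)
          lifetime = frames.max() - frames.min() + 1
          peak = cube[labels == group].max()
          print(f"event {group}: {frames.size} pixels, lifetime {lifetime} frames, peak {peak:.0f}")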

  10. Adaptive and context-aware detection and classification of potential QoS degradation events in biomedical wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Abreu, Carlos; Miranda, Francisco; Mendes, Paulo M.

    2016-06-01

    The use of wireless sensor networks in healthcare has the potential to enhance the services provided to citizens. In particular, they play an important role in the development of state-of-the-art patient monitoring applications. Nevertheless, due to the critical nature of the data conveyed by such patient monitoring applications, they have to fulfil high standards of quality of service in order to obtain the confidence of all players in the healthcare industry. In this context, vis-à-vis the quality of service being provided by the wireless sensor network, this work presents an adaptive and context-aware method to detect and classify performance degradation events. The proposed method has the ability to catch the most significant and damaging variations in the metrics used to quantify the quality of service provided by the network, without overreacting to small and innocuous variations in the metric's value.
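
    To make the idea of reacting only to significant degradations concrete, here is a minimal sketch using an exponentially weighted moving average with an adaptive deviation band; the smoothing factors, band width, and one-sided degradation rule are assumptions, not the paper's context-aware method.

```python
# Sketch of an adaptive degradation detector for a QoS metric (e.g., packet
# delivery ratio). EWMA smoothing and the deviation band are assumptions.
class QoSDegradationDetector:
    def __init__(self, alpha=0.1, beta=0.1, k=3.0):
        self.alpha = alpha   # smoothing for the metric's running mean
        self.beta = beta     # smoothing for the running absolute deviation
        self.k = k           # band width in deviations
        self.mean = None
        self.dev = 0.0

    def update(self, value):
        if self.mean is None:
            self.mean = value
            return "ok"
        deviation = abs(value - self.mean)
        # Only drops well outside the adaptive band raise a degradation event.
        event = "degradation" if value < self.mean - self.k * max(self.dev, 1e-6) else "ok"
        self.mean = (1 - self.alpha) * self.mean + self.alpha * value
        self.dev = (1 - self.beta) * self.dev + self.beta * deviation
        return event

detector = QoSDegradationDetector()
for pdr in [0.98, 0.97, 0.98, 0.96, 0.97, 0.75, 0.97]:   # hypothetical packet delivery ratios
    print(pdr, detector.update(pdr))
```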

  11. Collaborative trial for the validation of event-specific PCR detection methods of genetically modified papaya Huanong No.1.

    PubMed

    Wei, Jiaojun; Le, Huangying; Pan, Aihu; Xu, Junfeng; Li, Feiwu; Li, Xiang; Quan, Sheng; Guo, Jinchao; Yang, Litao

    2016-03-01

    To facilitate transfer of the event-specific PCR methods for genetically modified papaya Huanong No.1 to other laboratories, we validated the previously developed PCR assays for Huanong No.1 according to International Organization for Standardization (ISO) guidelines. A total of 11 laboratories participated in this trial and returned their test results. In the qualitative PCR assay, high specificity and a limit of detection as low as 0.1% were confirmed. For the quantitative PCR assay, the limit of quantification was as low as 25 copies. The quantitative biases among ten blind samples ranged from 0.21% to 10.04%. Furthermore, the measurement uncertainty of the quantitative PCR results was within the range of 0.28% to 2.92% for these ten samples. All results demonstrated that the Huanong No.1 qualitative and quantitative PCR assays are credible and applicable for the identification and quantification of GM papaya Huanong No.1 in routine laboratory analysis.

  12. Detection and discrimination of maintenance and de novo CpG methylation events using MethylBreak.

    PubMed

    Hsu, William; Mercado, Augustus T; Hsiao, George; Yeh, Jui-Ming; Chen, Chung-Yung

    2017-05-15

    Understanding the principles governing the establishment and maintenance activities of DNA methyltransferases (DNMTs) can help in the development of predictive biomarkers associated with genetic disorders and diseases. A detection system was developed that distinguishes and quantifies methylation events using methylation-sensitive endonucleases and molecular beacon technology. MethylBreak (MB) is a 22-mer oligonucleotide with one hemimethylated and two unmethylated CpG sites, which are also recognition sites for Sau96I and SacII, and is attached to a fluorophore and a quencher. Maintenance methylation was quantified by the fluorescence emitted upon SacII digestion when the hemimethylated CpG site is methylated, which inhibits Sau96I cleavage. The signal difference between SacII digestion of the MB substrate and of maintenance-methylated MB corresponds to the de novo methylation event. Our technology successfully discriminated and measured both methylation activities at different concentrations of MB and achieved a high correlation coefficient of R² = 0.997. Additionally, MB was effectively applied to normal and cancer cell lines and to the analysis of enzymatic kinetics and RNA inhibition of recombinant human DNMT1.

  13. Measurement of patient safety: a systematic review of the reliability and validity of adverse event detection with record review

    PubMed Central

    Hanskamp-Sebregts, Mirelle; Zegers, Marieke; Vincent, Charles; van Gurp, Petra J; de Vet, Henrica C W; Wollersheim, Hub

    2016-01-01

    Objectives Record review is the most commonly used method to quantify patient safety. We systematically reviewed the reliability and validity of adverse event detection with record review. Design A systematic review of the literature. Methods We searched PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Library from their inception through February 2015. We included all studies that aimed to describe the reliability and/or validity of record review. Two reviewers conducted data extraction. We pooled kappa (κ) values and analysed the differences in subgroups according to number of reviewers, reviewer experience and training level, adjusted for the prevalence of adverse events. Results In 25 studies, the psychometric data of the Global Trigger Tool (GTT) and the Harvard Medical Practice Study (HMPS) were reported, and 24 studies were included for statistical pooling. The inter-rater reliability of the GTT and HMPS showed a pooled κ of 0.65 and 0.55, respectively. The inter-rater agreement was statistically significantly higher when the group of reviewers within a study consisted of a maximum of five reviewers. We found no studies reporting on the validity of the GTT and HMPS. Conclusions The reliability of record review is moderate to substantial and improved when a small group of reviewers carried out the record review. The validity of the record review method has never been evaluated; clinical data registries, autopsy and direct observation of patient care are potential reference methods that could be used to test concurrent validity. PMID:27550650
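
    For readers unfamiliar with the statistic being pooled, the sketch below computes Cohen's κ for two reviewers judging the presence or absence of an adverse event per record; the meta-analytic pooling and prevalence adjustment used in the review are not reproduced, and the example ratings are hypothetical.

```python
# Minimal Cohen's kappa for two reviewers screening records for adverse events.
# The review's random-effects pooling and prevalence adjustment are not shown.
def cohens_kappa(ratings_a, ratings_b):
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement under independent marginal rating frequencies.
    categories = set(ratings_a) | set(ratings_b)
    expected = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Example: two reviewers screening ten records (1 = adverse event found).
print(cohens_kappa([1, 0, 1, 1, 0, 0, 1, 0, 0, 1],
                   [1, 0, 0, 1, 0, 0, 1, 0, 1, 1]))
```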

  14. Generation of a Solar Cycle of Sunspot Metadata Using the AIA Event Detection Framework - A Test of the System

    NASA Astrophysics Data System (ADS)

    Slater, G. L.; Zharkov, S.

    2008-12-01

    The soon-to-be-launched Solar Dynamics Observatory (SDO) will generate roughly 2 TB of image data per day, far more than previous solar missions. Because of the difficulty of widely distributing this enormous volume of data, and in order to maximize discovery and scientific return, a sophisticated automated metadata extraction system is being developed at Stanford University and the Lockheed Martin Solar and Astrophysics Laboratory in Palo Alto, CA. A key component of this system is the Event Detection System, which will supervise the execution of a set of feature and event extraction algorithms running in parallel, in real time, on all images recorded by the four telescopes of the key imaging instrument, the Atmospheric Imaging Assembly (AIA). The system will run on a Beowulf cluster of 160 processors. As a test of the new system, we will run feature extraction software developed under the European Grid of Solar Observatories (EGSO) program to extract sunspot metadata from the 12-year SOHO MDI mission archive of full-disk continuum and magnetogram images and also from the TRACE high-resolution image archive. Although the main goal will be to test the performance of the production-line framework, the resulting database will have applications for both research and space weather prediction. We examine some of these applications and compare the databases generated with others currently available.

  15. Method and device for detecting impact events on a security barrier which includes a hollow rebar allowing insertion and removal of an optical fiber

    DOEpatents

    Pies, Ross E.

    2016-03-29

    A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier based on the detection of disturbances within the optical fiber.

  16. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  17. Detection of short-term slow slip events along the Nankai Trough, southwest Japan, using GNSS data

    NASA Astrophysics Data System (ADS)

    Nishimura, Takuya; Matsuzawa, Takanori; Obara, Kazushige

    2013-06-01

    We detected short-term slow slip events (SSEs) previously observable only with tilt and strain data along the Nankai Trough, southwest Japan, using GNSS (Global Navigation Satellite System) data. Offsets detected in GNSS time series using Akaike's information criterion helped automatically identify 207 episodes with a motion direction opposite to that of the relative plate motion from June 1996 to January 2012. By nonlinear inversion of the detected displacement, we estimated rectangular fault models for 133 probable and 25 possible short-term SSEs over 15 years. The SSE moment magnitudes range from 5.5 to 6.3. Most SSE fault models are located in a narrow band of non-volcanic tremors on the interfaces of the subducting Philippine Sea Plate. Large SSEs (moment magnitude, Mw, ≥6) often occur in western and central Shikoku. The cumulative slip is distributed heterogeneously along the strike, generally decreasing eastward with the maximum slip (~50 cm) in western Shikoku. No definite short-term SSEs were found in the Kii Channel, but several short-term SSEs occurred in Ise Bay. Both regions are known as tremor gaps. The local maximum of the cumulative slip fills in the tremor gap located in Ise Bay. The long-term rate of short-term SSE cumulative moment increased by threefold around 2003 in eastern Shikoku, whereas it was almost constant in other regions. Comparison with short-term SSE catalogues using tilt data suggests that both this study and previous studies missed some SSEs along the Nankai Trough. A combination of geodetic data is important in the monitoring of the spatiotemporal distribution of short-term SSEs.
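
    The offset-detection step can be illustrated with a toy comparison, via Akaike's information criterion, between a constant-mean model and a single-step model for one coordinate component; the Gaussian noise assumption, the parameter counts, and the brute-force search below are simplifications and not the authors' processing pipeline.

```python
# Toy AIC-based offset (step) detection on a single GNSS coordinate series.
# Gaussian noise and a brute-force step search are simplifying assumptions.
import numpy as np

def detect_offset(y):
    n = len(y)
    rss0 = np.sum((y - y.mean()) ** 2)
    aic0 = n * np.log(rss0 / n) + 2 * 1            # constant model: one parameter (the mean)
    best_epoch, best_aic = None, np.inf
    for k in range(2, n - 2):                      # candidate step epochs
        rss1 = np.sum((y[:k] - y[:k].mean()) ** 2) + np.sum((y[k:] - y[k:].mean()) ** 2)
        aic1 = n * np.log(rss1 / n) + 2 * 3        # step model: two means plus the step epoch
        if aic1 < best_aic:
            best_epoch, best_aic = k, aic1
    # Accept the step only if it lowers AIC relative to the constant-mean model.
    return best_epoch if best_aic < aic0 else None

rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])  # synthetic offset at epoch 50
print(detect_offset(series))
```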

  18. The Detection of a Type IIn Supernova in Optical Follow-up Observations of IceCube Neutrino Events

    NASA Astrophysics Data System (ADS)

    Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Brown, A. M.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Christy, B.; Clark, K.; Classen, L.; Coenders, S.; Cowen, D. F.; Cruz Silva, A. H.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; Dumm, J. P.; Dunkman, M.; Eagan, R.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fadiran, O.; Fahey, S.; Fazely, A. R.; Fedynitch, A.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Fischer-Wasels, T.; Flis, S.; Fuchs, T.; Glagla, M.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Gretskov, P.; Groh, J. C.; Gross, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hellwig, D.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfe, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jero, K.; Jurkovic, M.; Kaminsky, B.; Kappes, A.; Karg, T.; Karle, A.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Kläs, J.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Koob, A.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Middlemas, E.; Miller, J.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Pütz, J.; Quinnan, M.; Rädel, L.; Rameez, M.; Rawlins, K.; Redl, P.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Saba, S. M.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Schatto, K.; Scheriau, F.; Schimp, M.; Schmidt, T.; Schmitz, M.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schukraft, A.; Schulte, L.; Seckel, D.; Seunarine, S.; Shanidze, R.; Smith, M. W. E.; Soldin, D.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stanisha, N. A.; Stasik, A.; Stezelberger, T.; Stokstad, R. G.; Stössl, A.; Strahler, E. A.; Ström, R.; Strotjohann, N. 
L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; van Eijndhoven, N.; Vandenbroucke, J.; van Santen, J.; Vanheule, S.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Whitehorn, N.; Wichary, C.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zarzhitsky, P.; Zoll, M.; IceCube Collaboration; Ofek, Eran O.; Kasliwal, Mansi M.; Nugent, Peter E.; Arcavi, Iair; Bloom, Joshua S.; Kulkarni, Shrinivas R.; Perley, Daniel A.; Barlow, Tom; Horesh, Assaf; Gal-Yam, Avishay; Howell, D. A.; Dilday, Ben; PTF Collaboration; Evans, Phil A.; Kennea, Jamie A.; Swift Collaboration; Burgett, W. S.; Chambers, K. C.; Kaiser, N.; Waters, C.; Flewelling, H.; Tonry, J. L.; Rest, A.; Smartt, S. J.; Pan-STARRS1 Science Consortium

    2015-09-01

    The IceCube neutrino observatory pursues a follow-up program selecting interesting neutrino events in real-time and issuing alerts for electromagnetic follow-up observations. In 2012 March, the most significant neutrino alert during the first three years of operation was issued by IceCube. In the follow-up observations performed by the Palomar Transient Factory (PTF), a Type IIn supernova (SN IIn), PTF12csy, was found 0.2° away from the neutrino alert direction, with an error radius of 0.54°. It has a redshift of z = 0.0684, corresponding to a luminosity distance of about 300 Mpc, and the Pan-STARRS1 survey shows that its explosion time was at least 158 days (in the host galaxy rest frame) before the neutrino alert, so that a causal connection is unlikely. The a posteriori significance of the chance detection of both the neutrinos and the SN at any epoch is 2.2σ within IceCube's 2011/12 data acquisition season. Also, a complementary neutrino analysis reveals no long-term signal over the course of one year. Therefore, we consider the SN detection coincidental and the neutrinos uncorrelated to the SN. However, the SN is unusual and interesting by itself: it is luminous and energetic, bearing strong resemblance to the SN IIn 2010jl, and shows signs of interaction of the SN ejecta with a dense circumstellar medium. High-energy neutrino emission is expected in models of diffusive shock acceleration, but at a low, non-detectable level for this specific SN. In this paper, we describe the SN PTF12csy and present both the neutrino and electromagnetic data, as well as their analysis.

  19. Detection of Rain-on-Snow (ROS) Events Using the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) and Weather Station Observations

    NASA Astrophysics Data System (ADS)

    Ryan, E. M.; Brucker, L.; Forman, B. A.

    2015-12-01

    During the winter months, the occurrence of rain-on-snow (ROS) events can impact snow stratigraphy via the generation of large-scale ice crusts on or within the snowpack. The formation of such layers significantly alters the electromagnetic response of the snowpack, which can be observed using space-based microwave radiometers. In addition, ROS layers can hinder the ability of wildlife to burrow through the snow for vegetation, which limits their foraging capability. A prime example occurred on 23 October 2003 on Banks Island, Canada, where an ROS event is believed to have caused the deaths of over 20,000 musk oxen. Through the use of passive microwave remote sensing, ROS events can be detected by utilizing observed brightness temperatures (Tb) from AMSR-E. Tb observed at different microwave frequencies and polarizations depends on snow properties. A wet snowpack formed from an ROS event yields a larger Tb than a typical dry snowpack would. This phenomenon makes observed Tb useful for detecting ROS events. Using data retrieved from AMSR-E, in conjunction with observations from ground-based weather station networks, a database of estimated ROS events over the past twelve years was generated. Using this database, changes in measured Tb following the ROS events were also observed. This study adds to the growing knowledge of ROS events and has the potential to help inform passive microwave snow water equivalent (SWE) retrievals or snow cover properties in polar regions.
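
    A highly simplified flag for wet-snow/ROS conditions from brightness temperatures might look like the sketch below; the channel combination, the thresholds, and the optional air-temperature check are assumptions chosen only to reflect that a wet snowpack raises Tb, not the criteria used in this study.

```python
# Illustrative rain-on-snow flag from passive microwave brightness temperatures.
# Channels and thresholds are assumptions, not the study's detection criteria.
def flag_ros(tb_36v, tb_36h, air_temp_c=None):
    wet_snow = (tb_36v - tb_36h) < 10.0 and tb_36v > 245.0   # low polarization difference, warm Tb
    near_freezing = air_temp_c is None or -2.0 <= air_temp_c <= 2.0
    return wet_snow and near_freezing

# Hypothetical observation coincident with a weather-station report near 0 degrees C.
print(flag_ros(tb_36v=250.0, tb_36h=244.0, air_temp_c=0.5))
```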

  20. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  1. Real-time automated 3D sensing, detection, and recognition of dynamic biological micro-organic events

    NASA Astrophysics Data System (ADS)

    Javidi, Bahram; Yeom, Seokwon; Moon, Inkyu; Daneshpanah, Mehdi

    2006-05-01

    In this paper, we present an overview of three-dimensional (3D) optical imaging techniques for real-time automated sensing, visualization, and recognition of dynamic biological microorganisms. Real-time sensing and 3D reconstruction of dynamic biological microscopic objects can be performed by single-exposure on-line (SEOL) digital holographic microscopy. A coherent 3D microscope-based interferometer is constructed to record digital holograms of dynamic microbiological events. Complex amplitude 3D images of the biological microorganisms are computationally reconstructed at different depths by digital signal processing. Bayesian segmentation algorithms are applied to identify regions of interest for further processing. A number of pattern recognition approaches are addressed to identify and recognize the microorganisms. One uses the 3D morphology of the microorganisms by analyzing 3D geometrical shapes composed of magnitude and phase. Segmentation, feature extraction, graph matching, feature selection, and training and decision rules are used to recognize the biological microorganisms. In a different approach, a 3D technique is used that is tolerant to the varying shapes of the non-rigid biological microorganisms. After segmentation, a number of sampling patches are arbitrarily extracted from the complex amplitudes of the reconstructed 3D biological microorganism. These patches are processed using a number of cost functions and statistical inference theory for the equality of means and equality of variances between the sampling segments. Also, we discuss the possibility of employing computational integral imaging for 3D sensing, visualization, and recognition of biological microorganisms illuminated under incoherent light. Experimental results with several biological microorganisms are presented to illustrate detection, segmentation, and identification of microbiological events.

  2. Simultaneous detection of eight genetically modified maize lines using a combination of event- and construct-specific multiplex-PCR technique.

    PubMed

    Shrestha, Hari K; Hwu, Kae-Kang; Wang, Shu-Jen; Liu, Li-Fei; Chang, Men-Chi

    2008-10-08

    To fulfill the labeling and traceability requirements for genetically modified (GM) maize in trade and regulation, it is essential to develop event-specific detection methods for monitoring the presence of transgenes. In pursuit of this purpose, we systematically optimized and established a combined event- and construct-specific multiplex polymerase chain reaction (mPCR) technique for simultaneous detection of 8 GM maize lines. Altogether, 9 sets of primers were designed: six event-specific primer sets for Event176, Bt11, TC1507, NK603, MON863, and MON810; two construct-specific primer sets for T25 and GA21; and one set for an endogenous zein gene. The transgene in each GM maize line and the endogenous zein gene could be clearly detected and distinguished according to the different sizes of the PCR amplicons. The limit of detection (LOD) was approximately 0.25% (v/v), although detection as sensitive as 0.1% was demonstrated in the International Seed Testing Association (ISTA) proficiency test. This study further improves current PCR-based detection methods for GM maize. The method can be used in an easy, sensitive, and cost- and time-effective way for the identification and quality screening of specific GM maize lines.

  3. Gaseous time projection chambers for rare event detection: results from the T-REX project. I. Double beta decay

    SciTech Connect

    Irastorza, I. G.; Aznar, F.; Castel, J. (Grupo de Física Nuclear y Astropartículas, Departamento de Física Teórica, Universidad de Zaragoza; E-mail: faznar@unizar.es); and others

    2016-01-01

    As part of the T-REX project, a number of R&D and prototyping activities have been carried out during the last years to explore the applicability of gaseous Time Projection Chambers (TPCs) with Micromesh Gas Structures (Micromegas) in rare event searches like double beta decay, axion research and low-mass WIMP searches. In both this and its companion paper, we compile the main results of the project and give an outlook of application prospects for this detection technique. While in the companion paper we focus on axions and WIMPs, in this paper we focus on the results regarding the measurement of the double beta decay (DBD) of ¹³⁶Xe in a high pressure Xe (HPXe) TPC. Micromegas of the microbulk type have been extensively studied in high pressure Xe and Xe mixtures. Particularly relevant are the results obtained in Xe + trimethylamine (TMA) mixtures, showing very promising results in terms of gain, stability of operation, and energy resolution at high pressures up to 10 bar. The addition of TMA at levels of ~1% reduces electron diffusion by up to a factor of 10 with respect to pure Xe, improving the quality of the topological pattern, with a positive impact on the discrimination capability. Operation with a medium-size prototype of 30 cm diameter and 38 cm drift length (holding about 1 kg of Xe at 10 bar in the fiducial volume, enough to contain high energy electron tracks in the detector volume) has allowed the detection concept to be tested under realistic experimental conditions. Microbulk Micromegas are able to image the DBD ionization signature with high quality while, at the same time, measuring its energy deposition with a resolution of at least ~3% FWHM at Qββ. This value was experimentally demonstrated for high-energy extended tracks at 10 bar, and is probably improvable down to the ~1% FWHM level as extrapolated from low energy events. In addition, first results on the topological signature information (one straggling track ending in two

  4. Analysis of Inter-Moss Loops in the Solar Region with IRIS and SDO AIA: Automatic Event Detection and Characterization

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy; De Pontieu, Bart

    2014-01-01

    The Interface Region Imaging Spectrograph (IRIS), launched in the summer of 2013, is designed specifically to observe and investigate the transition region and adjacent layers of the solar atmosphere, obtaining images in high spatial, temporal, and spectral resolution. Our particular work is focused on the evolution of inter-moss loops, which have been detected in the lower corona by the Atmospheric Imaging Assembly (AIA) and the High-Resolution Coronal Imager (Hi-C), but are known to have foot points below the transition region. With the high-resolution capabilities of IRIS and its Si IV pass band, which measures activity in the upper chromosphere, we can study these magnetic loops in detail and compare their characteristic length and time scales to those obtained from several AIA image sets, particularly the 171, 193, and 211 pass bands. By comparing the results between these four data sets, one can potentially establish a measure of the ionization equilibrium for the location in question. To explore this idea, we found a large, sit-and-stare observation within the IRIS database that fit our specifications. This data set contained a number of well-defined inter-moss loops (by visual inspection) with a cadence less than or equal to that of AIA (approximately 12 seconds). This particular data set was recorded on October 23, 2013 at 07:09:30, lasting for 3219 seconds with a field of view of 120.6 by 128.1 arcseconds, centered on -53.9 by 59.1 arcseconds from disk center. For ease of comparison, the AIA data has been interpolated to match the IRIS cadence and resolution. In the main portion of the poster, we demonstrate the detection of events, the information collected, and the immediate results to the right, showing the progress of an event with green as the start, blue as the peak, and red as the end. Below here, we demonstrate how pixels are combined to form groups. The 3D results are shown to the right.

  5. Analysis of Inter-Moss Loops in the Solar Region with IRIS and SDO AIA: Automatic Event Detection and Characterization

    NASA Technical Reports Server (NTRS)

    Fayock, Brian; Winebarger, Amy; De Pontieu, Bart; Alexander, Caroline

    2016-01-01

    The Interface Region Imaging Spectrograph (IRIS), launched in the summer of 2013, is designed specifically to observe and investigate the transition region and adjacent layers of the solar atmosphere, obtaining images in high spatial, temporal, and spectral resolution. Our particular work is focused on the evolution of inter-moss loops, which have been detected in the lower corona by the Atmospheric Imaging Assembly (AIA) and the High-Resolution Coronal Imager (Hi-C), but are known to have foot points below the transition region. With the high-resolution capabilities of IRIS and its Si IV pass band, which measures activity in the upper chromosphere, we can study these magnetic loops in detail and compare their characteristic length and time scales to those obtained from several AIA image sets, particularly the 171, 193, and 211 pass bands. By comparing the results between these four data sets, one can potentially establish a measure of the ionization equilibrium for the location in question. To explore this idea, we found a large, sit-and-stare observation within the IRIS database that fit our specifications. This data set contained a number of well-defined inter-moss loops (by visual inspection) with a cadence less than or equal to that of AIA (approximately 12 seconds). This particular data set was recorded on October 23, 2013 at 07:09:30, lasting for 3219 seconds with a field of view of 120.6 by 128.1 arcseconds, centered on -53.9 by 59.1 arcseconds from disk center. For ease of comparison, the AIA data has been interpolated to match the IRIS cadence and resolution. In the main portion of the poster, we demonstrate the detection of events, the information collected, and the immediate results to the right, showing the progress of an event with green as the start, blue as the peak, and red as the end. Below here, we demonstrate how pixels are combined to form groups. The 3D results are shown to the right.

  6. A multi-station matched filter and coherent network processing approach to the automatic detection and relative location of seismic events

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Näsholm, Sven Peter; Kværna, Tormod

    2014-05-01

    Correlation detectors facilitate seismic monitoring in the near vicinity of previously observed events at far lower detection thresholds than are possible using the methods applied in most existing processing pipelines. The use of seismic arrays has been demonstrated to be highly beneficial in lowering the detection threshold, due to superior noise suppression, and also in eliminating vast numbers of false alarms by performing array processing on the multi-channel output of the correlation detectors. This last property means that it is highly desirable to run continuous detectors for sites of repeating seismic events on a single-array basis for many arrays across a global network. Spurious detections for a given signal template on a single array can, however, still occur when an unrelated wavefront crosses the array from a direction very similar to that of the master event wavefront. We present an algorithm which automatically scans the output from multiple stations - both array and 3-component - for coherence between the individual station correlator outputs that is consistent with a disturbance in the vicinity of the master event. The procedure results in a categorical rejection of an event hypothesis in the absence of support from stations other than the one generating the trigger, and provides a fully automatic relative event location estimate when patterns in the correlation detector outputs are found to be consistent with a common event. This coherence-based approach removes the need to make explicit time-difference measurements for single stations, eliminating a potential source of error. The method is demonstrated for the North Korea nuclear test site, and the relative event location estimates obtained for the 2006, 2009, and 2013 events are compared with previous estimates from different station configurations.
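
    The building blocks, a single-channel normalized correlation detector plus a crude multi-station coherence check, can be sketched as follows; the detection threshold, the lag tolerance, and the way stations are combined are illustrative assumptions and do not reproduce the authors' algorithm or its relative-location step.

```python
# Sketch of a matched-filter (correlation) detector and a simple multi-station
# coherence check: a trigger is kept only if the other stations also correlate
# strongly near the lag expected for a repeat of the master event.
import numpy as np

def normalized_cc(template, data):
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    out = np.empty(len(data) - nt + 1)
    for i in range(len(out)):
        w = data[i:i + nt]
        out[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)   # correlation coefficient
    return out

def network_detect(templates, streams, threshold=0.7, tolerance=5):
    """templates/streams: dicts keyed by station code; returns coherent trigger lags."""
    cc = {sta: normalized_cc(templates[sta], streams[sta]) for sta in templates}
    ref = next(iter(cc))                                          # reference station
    triggers = np.where(cc[ref] > threshold)[0]
    coherent = []
    for lag in triggers:
        # Because templates are aligned on the master event, a genuine repeat
        # should trigger at (nearly) the same lag on every station.
        if all(cc[sta][max(0, lag - tolerance):lag + tolerance + 1].max() > threshold
               for sta in cc if sta != ref and lag < len(cc[sta])):
            coherent.append(int(lag))
    return coherent
```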

  7. 1.3 mm WAVELENGTH VLBI OF SAGITTARIUS A*: DETECTION OF TIME-VARIABLE EMISSION ON EVENT HORIZON SCALES

    SciTech Connect

    Fish, Vincent L.; Doeleman, Sheperd S.; Beaudoin, Christopher; Bolin, David E.; Rogers, Alan E. E.; Blundell, Ray; Gurwell, Mark A.; Moran, James M.; Primiani, Rurik; Bower, Geoffrey C.; Plambeck, Richard; Chamberlin, Richard; Freund, Robert; Friberg, Per; Honma, Mareki; Oyama, Tomoaki; Inoue, Makoto; Krichbaum, Thomas P.; Lamb, James; Marrone, Daniel P.

    2011-02-01

    Sagittarius A*, the ~4 × 10^6 solar-mass black hole candidate at the Galactic center, can be studied on Schwarzschild radius scales with (sub)millimeter wavelength very long baseline interferometry (VLBI). We report on 1.3 mm wavelength observations of Sgr A* using a VLBI array consisting of the JCMT on Mauna Kea, the Arizona Radio Observatory's Submillimeter Telescope on Mt. Graham in Arizona, and two telescopes of the CARMA array at Cedar Flat in California. Both Sgr A* and the quasar calibrator 1924-292 were observed over three consecutive nights, and both sources were clearly detected on all baselines. For the first time, we are able to extract 1.3 mm VLBI interferometer phase information on Sgr A* through measurement of closure phase on the triangle of baselines. On the third night of observing, the correlated flux density of Sgr A* on all VLBI baselines increased relative to the first two nights, providing strong evidence for time-variable change on scales of a few Schwarzschild radii. These results suggest that future VLBI observations with greater sensitivity and additional baselines will play a valuable role in determining the structure of emission near the event horizon of Sgr A*.
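
    Closure phase on a triangle of baselines is the sum of the three baseline visibility phases, in which station-based phase errors cancel. A minimal demonstration with synthetic, hypothetical visibilities:

```python
# Closure phase on a triangle of baselines: station-based phase errors cancel
# in arg(V12 * V23 * V31). All visibilities below are synthetic.
import numpy as np

def closure_phase(v12, v23, v31):
    """Return the closure phase (radians) for complex visibilities on a triangle."""
    return np.angle(v12 * v23 * v31)

# Corrupt 'true' visibilities with arbitrary station phase gains and check invariance.
true = {"12": 1.0 * np.exp(1j * 0.3), "23": 0.8 * np.exp(-1j * 0.7), "31": 0.5 * np.exp(1j * 0.2)}
g = {1: np.exp(1j * 1.1), 2: np.exp(-1j * 0.4), 3: np.exp(1j * 2.0)}   # phase-only station gains
obs = {"12": true["12"] * g[1] * np.conj(g[2]),
       "23": true["23"] * g[2] * np.conj(g[3]),
       "31": true["31"] * g[3] * np.conj(g[1])}
print(closure_phase(true["12"], true["23"], true["31"]),
      closure_phase(obs["12"], obs["23"], obs["31"]))   # identical up to rounding
```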

  8. Detection, Localization and Quantification of Impact Events on a Stiffened Composite Panel with Embedded Fiber Bragg Grating Sensor Networks.

    PubMed

    Lamberti, Alfredo; Luyckx, Geert; Van Paepegem, Wim; Rezayat, Ali; Vanlanduit, Steve

    2017-04-01

    Nowadays, it is possible to manufacture smart composite materials with embedded fiber optic sensors. These sensors can be exploited during the composite's operating life to identify occurring damage such as delaminations. For composite materials adopted in the aviation and wind energy sectors, delaminations are most often caused by impacts with external objects. The detection, localization and quantification of such impacts are therefore crucial for the prevention of catastrophic events. In this paper, we demonstrate the feasibility of performing impact identification in smart composite structures with embedded fiber optic sensors. For our analyses, we manufactured a carbon fiber reinforced plate in which we embedded a distributed network of fiber Bragg grating (FBG) sensors. We impacted the plate with a modal hammer and identified the impacts by processing the FBG data with an improved fast phase correlation (FPC) algorithm in combination with a variable selective least squares (VS-LS) inverse solver approach. A total of 164 impacts distributed over 41 possible impact locations were analyzed. We compared our methodology with the traditional P-Inv based approach. In terms of impact localization, our methodology performed better in 70.7% of the cases. An improvement in the impact time domain reconstruction was achieved in 95.1% of the cases.
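
    A much-simplified version of least-squares impact localization, in which measured sensor responses are compared against reference responses for a grid of candidate impact locations, is sketched below; the reference matrix and noise level are synthetic, and the sketch does not reproduce the FPC or VS-LS algorithms of the paper.

```python
# Toy least-squares impact localization: pick the candidate location whose
# best-fit scaled reference response most closely matches the measurements.
import numpy as np

def localize(measured, reference):
    """measured: (n_sensors,) peak responses; reference: (n_sensors, n_locations)."""
    residuals = []
    for j in range(reference.shape[1]):
        h = reference[:, j]
        amp = np.dot(h, measured) / np.dot(h, h)    # best-fit impact amplitude for location j
        residuals.append(np.linalg.norm(measured - amp * h))
    return int(np.argmin(residuals))

rng = np.random.default_rng(1)
ref = rng.random((8, 41))                    # 8 FBG sensors, 41 candidate impact locations
true_loc, true_amp = 17, 2.5
meas = true_amp * ref[:, true_loc] + rng.normal(0, 0.01, 8)
print(localize(meas, ref))                   # expected: 17
```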

  9. PREFACE: 5th Symposium on Large TPCs for Low Energy Rare Event Detection and Workshop on Neutrinos from Supernovae

    NASA Astrophysics Data System (ADS)

    Irastorza, Igor G.; Scholberg, Kate; Colas, Paul; Giomataris, Ioannis

    2011-08-01

    The Fifth International Symposium on large TPCs for low-energy rare-event detection was held at the auditorium of the Astroparticle and Cosmology (APC) Laboratory in Paris, on 14-17 December 2010. As with all previous meetings, held in Paris in 2008, 2006, 2004 and 2002, it brought together a significant community of physicists involved in rare event searches and/or the development of time projection chambers (TPCs). As a novelty this year, the meeting was extended with two half-day sessions on supernova physics. These proceedings also include the contributions corresponding to the supernova sessions. The purpose of the meeting was to present and discuss the status of current experiments or projects involving the use of TPCs to search for rare events, like low-energy neutrinos, double beta decay, dark matter or axion experiments, as well as to discuss new results and ideas in the framework of the latest developments of Micro Pattern Gaseous Detectors (MPGD), and how these are being - or could be - applied to these searches. As in previous meetings in this series, the format included an informal program with some recent highlighted results, rather than exhaustive reviews, with time for discussion and interaction. The symposium, the fifth of the series, is becoming consolidated as a regular meeting place for the synergic interplay between the fields of rare events and TPC development. The meeting started with a moving tribute by Ioannis Giomataris to the memory of Georges Charpak, who recently passed away. We then moved on to the usual topics, like the status of some low-energy neutrino physics and double beta decay experiments, dark matter experiments with directional detectors, axion searches, or development results. A relevant subject this time was electroluminescence in Xe TPCs, covered by several speakers. Each time, the conference program is enriched with original, slightly off-topic contributions that trigger the curiosity and stimulate further thought. As

  10. Multiplex polymerase chain reaction-capillary gel electrophoresis: a promising tool for GMO screening--assay for simultaneous detection of five genetically modified cotton events and species.

    PubMed

    Nadal, Anna; Esteve, Teresa; Pla, Maria

    2009-01-01

    A multiplex polymerase chain reaction assay coupled to capillary gel electrophoresis for amplicon identification by size and color (multiplex PCR-CGE-SC) was developed for simultaneous detection of cotton species and 5 events of genetically modified (GM) cotton. Validated real-time PCR reactions targeting Bollgard, Bollgard II, Roundup Ready, 3006-210-23, and 281-24-236 junction sequences, and the cotton reference gene acp1, were adapted to detect more than half of the European Union-approved individual or stacked GM cotton events in one reaction. The assay was fully specific (<1.7% false classification rate), with limit of detection values of 0.1% for each event, which were also achieved with simulated mixtures at different relative percentages of targets. The assay was further combined with a second multiplex PCR-CGE-SC assay to allow simultaneous detection of 6 cotton and 5 maize targets (two endogenous genes and 9 GM events) in two multiplex PCRs and a single CGE, making the approach more economical. Besides allowing simultaneous detection of many targets with adequate specificity and sensitivity, the multiplex PCR-CGE-SC approach has high throughput and automation capabilities, while keeping a very simple protocol, e.g., amplification and labeling in one step. Thus, it is an easy and inexpensive tool for initial screening, to be complemented with quantitative assays if necessary.

  11. Automated detection of feeding strikes by larval fish using continuous high-speed digital video: a novel method to extract quantitative data from fast, sparse kinematic events.

    PubMed

    Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi

    2016-06-01

    Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors.
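
    One simple per-window video descriptor of the kind that could feed such a detector is frame-difference motion energy; the descriptor choice, threshold, and synthetic frames below are assumptions for illustration, not the descriptors or classifiers evaluated in the study.

```python
# Sketch of a single video descriptor (frame-difference motion energy) and a
# threshold classifier for strike/no-strike windows. Real pipelines combine
# several descriptors with trained classifiers; this is only illustrative.
import numpy as np

def motion_energy(frames):
    """frames: (n_frames, h, w) grayscale window; returns one scalar descriptor."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean()

def classify_windows(windows, threshold):
    return [("strike" if motion_energy(w) > threshold else "no strike") for w in windows]

rng = np.random.default_rng(2)
quiet = rng.normal(100, 1, (20, 64, 64))     # little frame-to-frame change
strike = quiet.copy()
strike[10:] += 40                            # abrupt change mimicking a fast movement
print(classify_windows([quiet, strike], threshold=2.0))
```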

  12. Comparison of drought events detected by SPI calculated from different historical precipitation data sets - case study from Southern Alps

    NASA Astrophysics Data System (ADS)

    Brencic, M.; Hictaler, J.

    2012-04-01

    In recent years, substantial efforts have been directed toward the reconstruction of past meteorological data sets of precipitation, air temperature, air pressure and sunshine. In the Alpine space of Europe there is a long tradition of meteorological monitoring, starting with the first modern measurements in the late 18th century. However, older data were obtained under very different conditions, standards and quality. Consequently, direct comparison between data sets from different observation points is not possible. Several methods, known as data homogenisation procedures, have been developed to enable comparison of data from different observation points and sources. Although homogenisation procedures are scientifically agreed upon, the final result, represented as a homogenised data series, depends on the ability and approach of the interpreters. A well-known data set from the Greater Alpine region based on a common homogenisation procedure is the HISTALP data series. However, the HISTALP data set is not the only available homogenised data set in the region. Local agencies responsible for meteorological observations (e.g. in Slovenia the Environmental Agency of Slovenia - ARSO) perform their own homogenisation procedures. Because more detailed information about measuring procedures and locations of the particular stations is available to them, one can expect differences between the homogenised data sets. Longer meteorological data sets can be used to detect past drought events of various magnitudes and can help to discern past droughts and their characteristics. A very frequently used meteorological drought index is the standardized precipitation index (SPI). SPI is designed to detect low-frequency events; with its help, periods of extremely low precipitation can be defined. It is usually based on monthly precipitation amounts, where the cumulative precipitation for a particular time period is calculated. During the calculation of SPI with a time
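
    A minimal SPI computation for one accumulation window is sketched below: accumulate precipitation, fit a gamma distribution, convert the cumulative probabilities to standard-normal quantiles, and flag values below a chosen threshold. The zero-precipitation handling and the -1.5 drought threshold follow common practice and are assumptions, not necessarily the study's choices.

```python
# Minimal SPI sketch for one accumulation window (e.g., 3-month sums).
import numpy as np
from scipy import stats

def spi(precip, window=3):
    p = np.convolve(precip, np.ones(window), mode="valid")       # rolling precipitation sums
    nonzero = p[p > 0]
    shape, _, scale = stats.gamma.fit(nonzero, floc=0)           # fit gamma to wet totals
    q_zero = np.mean(p == 0)                                     # probability of a zero total
    cdf = q_zero + (1 - q_zero) * stats.gamma.cdf(p, shape, loc=0, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))          # SPI values

monthly = np.random.default_rng(3).gamma(2.0, 40.0, 600)         # synthetic monthly precipitation
spi3 = spi(monthly, window=3)
drought_months = np.where(spi3 < -1.5)[0]                        # candidate drought periods
print(len(drought_months))
```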

  13. Naive Probability: Model-based Estimates of Unique Events

    DTIC Science & Technology

    2014-05-04

    [Abstract not available; only fragments of the document were retrieved. The recoverable fragments are sample unique-event questions used in the study (e.g., whether the shortage of replacement organs will end in the next 15 years, whether the Supreme Court will rule on the constitutionality of gay marriage in the next 5 years, whether a gay person will be elected as president in the next 50 years, whether Greece will make a full economic recovery in the next 10 years) and a reference fragment: La Mont, K., Lipton, J., Dehaene, S., Kanwisher, N., & Spelke, E.S. (2006). Nonsymbolic arithmetic in adults and young children. Cognition, 98, 199.]

  14. Final report for LDRD project 11-0029 : high-interest event detection in large-scale multi-modal data sets : proof of concept.

    SciTech Connect

    Rohrer, Brandon Robinson

    2011-09-01

    Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies, events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There currently is no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets created from both audio and imagery data.

  15. Is detection of adverse events affected by record review methodology? an evaluation of the “Harvard Medical Practice Study” method and the “Global Trigger Tool”

    PubMed Central

    2013-01-01

    Background There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the “Harvard Medical Practice Study” method and the “Global Trigger Tool”, in detecting adverse events in adult orthopaedic inpatients. Methods We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by the registered nurses. Records containing a potential adverse event were forwarded to the physicians for review in stage 2. The physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage, all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Results Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The “Harvard Medical Practice Study” method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the “Global Trigger Tool”. Adverse events “causing harm without permanent disability” accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the “Harvard Medical Practice Study” method and the “Global Trigger Tool” was 40.3% and 30.4%, respectively. Conclusions More adverse

  16. A novel “correlated ion and neutral time of flight” method: Event-by-event detection of neutral and charged fragments in collision induced dissociation of mass selected ions

    SciTech Connect

    Teyssier, C.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.

    2014-01-15

    A new tandem mass spectrometry (MS/MS) method based on time-of-flight measurements performed with an event-by-event detection technique is presented. This “correlated ion and neutral time of flight” method makes it possible to explore Collision Induced Dissociation (CID) fragmentation processes by directly identifying not only all ions and neutral fragments produced but also their arrival-time correlations within each single fragmentation event from a dissociating molecular ion. This constitutes a new step in the characterization of molecular ions. The method will be illustrated here for a prototypical case involving CID of protonated water clusters H+(H2O)n with n = 1–5 upon collisions with argon atoms.

  17. Development of multiplex PCR method for simultaneous detection of four events of genetically modified maize: DAS-59122-7, MIR604, MON863 and MON88017.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Mano, Junichi; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel multiplex PCR method was developed for simultaneous event-specific detection of four events of GM maize, i.e., DAS-59122-7, MIR604, MON88017, and MON863. The single laboratory examination of analytical performance using simulated DNA mixtures containing GM DNA at various concentrations in non-GM DNA suggested that the limits of detection (LOD) of the multiplex PCR method were 0.16% for MON863, MIR604, and MON88017, and 0.078% for DAS-59122-7. We previously developed a nonaplex (9plex) PCR method for eight events of GM maize, i.e., Bt11, Bt176, GA21, MON810, MON863, NK603, T25, and TC1507. Together with the nonaplex PCR method, the newly developed method enabled the detection and identification of eleven GM maize events that are frequently included in commercial GM seed used in Japan. In addition, this combinational analysis may be useful for the identification of combined event products of GM maize.

  18. On-site detection of stacked genetically modified soybean based on event-specific TM-LAMP and a DNAzyme-lateral flow biosensor.

    PubMed

    Cheng, Nan; Shang, Ying; Xu, Yuancong; Zhang, Li; Luo, Yunbo; Huang, Kunlun; Xu, Wentao

    2017-05-15

    Stacked genetically modified organisms (GMOs) are becoming popular for their enhanced production efficiency and improved functional properties, and on-site detection of stacked GMOs is an urgent challenge to be solved. In this study, we developed a cascade system combining event-specific tag-labeled multiplex LAMP with a DNAzyme-lateral flow biosensor for reliable detection of the stacked event (DP305423 × GTS 40-3-2). Three primer sets, both event-specific and soybean species-specific, were newly designed for the tag-labeled multiplex LAMP system. A trident-like lateral flow biosensor displayed amplified products simultaneously without cross contamination, and DNAzyme enhancement improved the sensitivity effectively. After optimization, the limit of detection was approximately 0.1% (w/w) for stacked GM soybean, which is sensitive enough to detect genetically modified content at the threshold values established by several countries for regulatory compliance. The entire detection process could be shortened to 120 min without any large-scale instrumentation. This method may be useful for the in-field detection of DP305423 × GTS 40-3-2 soybean on a single-kernel basis and for on-site screening tests of stacked GM soybean lines and individual parent GM soybean lines in highly processed foods.

  19. Extended likelihood ratio test-based methods for signal detection in a drug class with application to FDA's adverse event reporting system database.

    PubMed

    Zhao, Yueqin; Yi, Min; Tiwari, Ram C

    2016-05-02

    A likelihood ratio test, recently developed for the detection of signals of adverse events for a drug of interest in the FDA Adverse Event Reporting System database, is extended to detect signals of adverse events simultaneously for all the drugs in a drug class. The extended likelihood ratio test methods, based on the Poisson model (Ext-LRT) and the zero-inflated Poisson model (Ext-ZIP-LRT), are discussed and are analytically shown, like the likelihood ratio test method, to control the type-I error and false discovery rate. Simulation studies are performed to evaluate the performance characteristics of Ext-LRT and Ext-ZIP-LRT. The proposed methods are applied to the gadolinium drug class in the FAERS database. An in-house likelihood ratio test tool, incorporating the Ext-LRT methodology, is being developed within the Food and Drug Administration.
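
    The single-drug likelihood ratio statistic that the extended methods build on can be sketched as follows; the counts are hypothetical, the Poisson log-likelihood-ratio form shown is the standard building block, and the Monte Carlo significance step and the Ext-LRT/Ext-ZIP-LRT extensions themselves are not reproduced.

```python
# Sketch of the basic Poisson likelihood ratio statistic for one drug across
# several adverse events. n_j = reports of event j with the drug, E_j = expected
# count under the null (proportional to the event's share among all other drugs).
import numpy as np

def log_lr(n_j, n_total, E_j, E_total):
    """Log likelihood ratio for one event; 0 when observed <= expected."""
    if n_j <= E_j:
        return 0.0
    rest_n, rest_E = n_total - n_j, E_total - E_j
    return (n_j * np.log(n_j / E_j)
            + (rest_n * np.log(rest_n / rest_E) if rest_n > 0 else 0.0))

# Hypothetical counts for one drug across four adverse events.
n = np.array([30, 5, 12, 3])                    # observed with the drug
p = np.array([0.05, 0.30, 0.40, 0.25])          # event shares among all other drugs
E = n.sum() * p                                  # expected counts under the null
scores = [log_lr(n[j], n.sum(), E[j], E.sum()) for j in range(len(n))]
print(scores)                                    # the maximum over j is the test statistic
```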

  20. LINE IDENTIFICATIONS OF TYPE I SUPERNOVAE: ON THE DETECTION OF Si II FOR THESE HYDROGEN-POOR EVENTS

    SciTech Connect

    Parrent, J. T.; Milisavljevic, D.; Soderberg, A. M.; Parthasarathy, M.

    2016-03-20

    Here we revisit line identifications of type I supernovae (SNe I) and highlight trace amounts of unburned hydrogen as an important free parameter for the composition of the progenitor. Most one-dimensional stripped-envelope models of supernovae indicate that observed features near 6000–6400 Å in type I spectra are due to more than Si ii λ6355. However, while an interpretation of conspicuous Si ii λ6355 can approximate 6150 Å absorption features for all SNe Ia during the first month of free expansion, similar identifications applied to 6250 Å features of SNe Ib and Ic have not been as successful. When the corresponding synthetic spectra are compared with high-quality timeseries observations, the computed spectra are frequently too blue in wavelength. Some improvement can be achieved with Fe ii lines that contribute redward of 6150 Å; however, the computed spectra either remain too blue or the spectrum only reaches a fair agreement when the rise-time to peak brightness of the model conflicts with observations by a factor of two. This degree of disagreement brings into question the proposed explosion scenario. Similarly, a detection of strong Si ii λ6355 in the spectra of broadlined Ic and super-luminous events of type I/R is less convincing despite numerous model spectra used to show otherwise. Alternatively, we suggest 6000–6400 Å features are possibly influenced by either trace amounts of hydrogen or blueshifted absorption and emission in Hα, the latter being an effect which is frequently observed in the spectra of hydrogen-rich, SNe II.

  1. Concordance of nuclear and mitochondrial DNA markers in detecting a founder event in Lake Clark sockeye salmon

    USGS Publications Warehouse

    Ramstad, Kristina M.; Woody, Carol Ann; Habicht, Chris; Sage, G. Kevin; Seeb, James E.; Allendorf, Fred W.

    2007-01-01

    Genetic bottleneck effects can reduce genetic variation, persistence probability, and evolutionary potential of populations. Previous microsatellite analysis suggested a bottleneck associated with a common founding of sockeye salmon Oncorhynchus nerka populations of Lake Clark, Alaska, about 100 to 400 generations ago. The common founding event occurred after the last glacial recession and resulted in reduced allelic diversity and strong divergence of Lake Clark sockeye salmon relative to neighboring Six Mile Lake and Lake Iliamna populations. Here we used two additional genetic marker types (allozymes and mtDNA) to examine these patterns further. Allozyme and mtDNA results were congruent with the microsatellite data in suggesting a common founder event in Lake Clark sockeye salmon and confirmed the divergence of Lake Clark populations from neighboring Six Mile Lake and Lake Iliamna populations. The use of multiple marker types provided better understanding of the bottleneck in Lake Clark. For example, the Sucker Bay Lake population had an exceptionally severe reduction in allelic diversity at microsatellite loci, but not at mtDNA. This suggests that the reduced microsatellite variation in Sucker Bay Lake fish is due to a consistently smaller effective population size than other Lake Clark populations, rather than a more acute or additional bottleneck since founding. Caution is urged in using reduced heterozygosity as a measure of genetic bottleneck effects because stochastic variance among loci resulted in an overall increase in allozyme heterozygosity within bottlenecked Lake Clark populations. However, heterozygosity excess, which assesses heterozygosity relative to allelic variation, detected genetic bottleneck effects in both allozyme and microsatellite loci.

  2. Estimation of fault geometry of a slow slip event off the Kii Peninsula, southwest of Japan, detected by DONET

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Nakano, M.; Hori, T.; Takahashi, N.

    2015-12-01

    The Japan Agency for Marine-Earth Science and Technology installed a permanent ocean-bottom observation network, the Dense Oceanfloor Network System for Earthquakes and Tsunamis (DONET), off the Kii Peninsula, southwest of Japan, to monitor earthquakes and tsunamis. We detected long-term vertical displacements of the sea floor, starting from March 2013, in the ocean-bottom pressure records at several DONET stations (Suzuki et al., 2014). We consider that these displacements were caused by crustal deformation due to a slow slip event (SSE). We estimated the fault geometry of the SSE using the observed ocean-bottom displacements. The ocean-bottom displacements were obtained by removing the tidal components from the pressure records. We also subtracted from each record the average of the pressure changes over all stations connected to the same science node, in order to remove contributions from atmospheric pressure changes and non-tidal ocean dynamic mass variations. In the fault-geometry estimation we therefore compared the observed displacements with theoretical displacements from which the corresponding average had likewise been subtracted. For the model evaluation we also compared observed and theoretical average displacements; in this study the observed average displacements were assumed to be zero. Although the fault model has nine parameters, vertical displacements were observed at only four stations. We therefore considered three candidate fault geometries: (1) a reverse-fault slip along the plate boundary, (2) a strike slip along a splay fault, and (3) a reverse-fault slip along the splay fault. We found that model (3) gives the smallest residual between observed and calculated displacements. We also observed that this SSE was synchronized with a decrease in the background seismicity within the area of a nearby earthquake cluster. In the future, we will investigate the relationship between the SSE and the seismicity change.
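
    A minimal sketch of the model-selection step described above, under the simplifications stated in the abstract: the station average is removed from both observed and predicted vertical displacements (mirroring the node-average correction applied to the pressure records), and the candidate geometry with the smallest RMS residual is retained. The displacement values and model names below are placeholders, not DONET data.

        # Minimal sketch: choose the candidate fault model whose predicted vertical
        # displacements best match the observations after removing the station mean.
        import math

        def demeaned(values):
            m = sum(values) / len(values)
            return [v - m for v in values]

        def rms_residual(observed, predicted):
            o, p = demeaned(observed), demeaned(predicted)
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(o, p)) / len(o))

        # Hypothetical vertical displacements (cm) at four stations:
        obs = [1.2, -0.4, 0.8, -1.6]
        candidates = {
            "plate-boundary reverse slip": [0.9, 0.1, 0.5, -1.5],
            "splay-fault strike slip":     [0.2, 0.3, -0.1, -0.4],
            "splay-fault reverse slip":    [1.1, -0.3, 0.7, -1.5],
        }

        best = min(candidates, key=lambda k: rms_residual(obs, candidates[k]))
        print(best, rms_residual(obs, candidates[best]))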

  3. A novel quadruplex real-time PCR method for simultaneous detection of Cry2Ae and two genetically modified cotton events (GHB119 and T304-40)

    PubMed Central

    2014-01-01

    Background: To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. Results: To detect the entry and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of the GM cotton events GHB119 and T304-40 in cotton-derived products (based on their 5′-flanking sequences) and of the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40; the limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analyses of six blind samples containing different proportions of GHB119 and T304-40. Conclusions: The developed quadruplex quantitative method can be used for quantitative detection of the two GM cotton events (GHB119 and T304-40) and the Cry2Ae gene in cotton-derived products. PMID:24884946
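
    Copy-number estimates in assays like this one are typically read off a standard curve relating the threshold cycle (Ct) to log10 of the initial copy number, Ct = m·log10(N0) + b. The sketch below inverts such a curve with an illustrative slope and intercept; it is not the calibration reported in the paper.

        # Minimal sketch: estimate initial template copies from a Ct value using a
        # linear standard curve Ct = slope * log10(copies) + intercept.
        def copies_from_ct(ct, slope=-3.32, intercept=38.0):
            """Invert the standard curve; slope and intercept here are illustrative."""
            return 10 ** ((ct - intercept) / slope)

        # A perfectly efficient reaction has slope ~ -3.32 (10-fold dilution per ~3.32 cycles).
        for ct in (36.5, 33.2, 29.9):
            print(ct, round(copies_from_ct(ct)))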

  4. The chemically homogeneous evolutionary channel for binary black hole mergers: rates and properties of gravitational-wave events detectable by advanced LIGO

    NASA Astrophysics Data System (ADS)

    de Mink, S. E.; Mandel, I.

    2016-08-01

    We explore the predictions for detectable gravitational-wave signals from merging binary black holes formed through chemically homogeneous evolution in massive short-period stellar binaries. We find that ˜500 events per year could be detected with advanced ground-based detectors operating at full sensitivity. We analyse the distribution of detectable events, and conclude that there is a very strong preference for detecting events with nearly equal components (mass ratio >0.66 at 90 per cent confidence in our default model) and high masses (total source-frame mass between 57 and 103 M⊙ at 90 per cent confidence). We consider multiple alternative variations to analyse the sensitivity to uncertainties in the evolutionary physics and cosmological parameters, and conclude that while the rates are sensitive to assumed variations, the mass distributions are robust predictions. Finally, we consider the recently reported results of the analysis of the first 16 double-coincident days of the O1 LIGO (Laser Interferometer Gravitational-wave Observatory) observing run, and find that this formation channel is fully consistent with the inferred parameters of the GW150914 binary black hole detection and the inferred merger rate.
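
    The 90 per cent ranges quoted above are credible intervals over the population of detectable events; given a detection-weighted Monte Carlo sample, they reduce to percentiles. A minimal sketch with synthetic placeholder samples, not the paper's population model:

        # Minimal sketch: symmetric 90% interval (5th-95th percentile) over a
        # detection-weighted Monte Carlo sample of binary parameters.
        import random

        random.seed(1)
        # Placeholder samples standing in for detectable-event mass ratios:
        mass_ratios = [min(random.gauss(0.85, 0.1), 1.0) for _ in range(10_000)]

        def interval_90(samples):
            s = sorted(samples)
            return s[int(0.05 * len(s))], s[int(0.95 * len(s))]

        print(interval_90(mass_ratios))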

  5. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    Because labeling policies for genetically modified organisms (GMOs) have been issued in many countries and regions, polymerase chain reaction (PCR) methods, including screening, gene-specific, construct-specific, and event-specific assays, have been developed to enforce them and have become a mainstay of GMO detection. Event-specific PCR detection is the leading approach because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of MON863 was revealed by means of thermal asymmetric interlaced PCR, specific PCR primers and a TaqMan probe were designed based upon this 5'-integration junction sequence, and conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probe were then developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems are reliable, sensitive, and accurate.
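
    In event-specific quantitative assays of this kind, GM content is commonly reported as the ratio of event-specific copies to copies of an endogenous reference gene (haploid genome equivalents). A minimal sketch of that final step, with hypothetical copy numbers:

        # Minimal sketch: express GM content as the percentage ratio of event-specific
        # copies to endogenous reference-gene copies (haploid genome equivalents).
        def gm_content_percent(event_copies, reference_copies):
            return 100.0 * event_copies / reference_copies

        # Hypothetical copy estimates from the two real-time PCR assays:
        print(gm_content_percent(event_copies=420.0, reference_copies=41_800.0))  # ~1.0%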

  6. Improvement of depth resolution and detection efficiency by control of secondary-electrons in single-event three-dimensional time-of-flight Rutherford backscattering spectrometry

    NASA Astrophysics Data System (ADS)

    Abo, Satoshi; Hamada, Yasuhisa; Seidl, Albert; Wakaya, Fujio; Takai, Mikio

    2015-04-01

    Improvement of the depth resolution and detection efficiency in single-event three-dimensional time-of-flight (TOF) Rutherford backscattering spectrometry (RBS) by controlling secondary-electron trajectories with a sample bias voltage is discussed through both simulation and experiment. In the simulation, the secondary electron used as the start signal in single-event TOF-RBS travels more directly to the secondary-electron detector with a positive sample bias of several tens of volts than without one. The simulated collection efficiency of the secondary electrons also increases with a positive sample bias of several tens of volts. These simulation results indicate the possibility of finer depth resolution and shorter measurement times in single-event TOF-RBS with a positive sample bias. The measurement time for the Pt-stripe sample using single-event three-dimensional TOF-RBS with a sample bias of +100 V is 65% shorter than that without bias, resulting in less sample damage from the probe beam. The depth resolution for the Pt stripes under the 50-nm-thick SiO2 cover layer with a sample bias of +100 V is 4 nm finer than that without bias. A positive sample bias thus improves the depth resolution and detection efficiency of single-event three-dimensional TOF-RBS without affecting the beam focusing.
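
    On the timing side, the flight time of a backscattered ion over a drift length L follows the non-relativistic relation t = L·sqrt(m / 2E), with ion mass m and backscattered energy E. The sketch below uses illustrative values, not the instrument's actual geometry or beam energy.

        # Minimal sketch: non-relativistic time of flight t = L * sqrt(m / (2E))
        # for a backscattered ion over a drift length L.
        import math

        AMU_KG = 1.660_539e-27  # atomic mass unit in kg
        EV_J   = 1.602_177e-19  # electron volt in joules

        def time_of_flight(length_m, mass_amu, energy_kev):
            energy_j = energy_kev * 1e3 * EV_J
            mass_kg = mass_amu * AMU_KG
            return length_m * math.sqrt(mass_kg / (2.0 * energy_j))

        # Illustrative: He ion (4 amu) at 380 keV over a 0.5 m flight path, in ns:
        print(time_of_flight(0.5, 4.0, 380.0) * 1e9)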

  7. Using memory for prior aircraft events to detect conflicts under conditions of proactive air traffic control and with concurrent task requirements.

    PubMed

    Bowden, Vanessa K; Loft, Shayne

    2016-06-01

    In 2 experiments we examined the impact of memory for prior events on conflict detection in simulated air traffic control under conditions where individuals proactively controlled aircraft and completed concurrent tasks. Individuals were faster to detect conflicts that had repeatedly been presented during training (positive transfer). Bayesian statistics indicated strong evidence for the null hypothesis that conflict detection was not impaired for events that resembled an aircraft pair that had repeatedly come close to conflicting during training. This is likely because aircraft altitude (the feature manipulated between training and test) was attended to by participants when proactively controlling aircraft. In contrast, a minor change to the relative position of a repeated nonconflicting aircraft pair moderately impaired conflict detection (negative transfer). There was strong evidence for the null hypothesis that positive transfer was not impacted by dividing participant attention, which suggests that part of the information retrieved regarding prior aircraft events was perceptual (the new aircraft pair "looked" like a conflict based on familiarity). These findings extend the effects previously reported by Loft, Humphreys, and Neal (2004), answering the recent strong and unanimous calls across the psychological science discipline to formally establish the robustness and generality of previously published effects.

  8. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    PubMed

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

    To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and the carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng of total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the biases between the observed and true values of three samples were lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
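
    The acceptance check mentioned above reduces to the relative bias of each measured GM content against its true value, compared with the 25% criterion. A minimal sketch with invented sample values:

        # Minimal sketch: relative quantification bias against the <25% acceptance
        # criterion used for GMO quantitative methods.
        def bias_percent(measured, true_value):
            return 100.0 * abs(measured - true_value) / true_value

        samples = {"S1": (0.92, 1.0), "S2": (4.6, 5.0), "S3": (0.055, 0.05)}  # (measured %, true %)
        for name, (measured, true_value) in samples.items():
            b = bias_percent(measured, true_value)
            print(name, round(b, 1), "accepted" if b < 25.0 else "rejected")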

  9. Interlaboratory validation study of an event-specific real-time polymerase chain reaction detection method for genetically modified 55-1 papaya.

    PubMed

    Noguchi, Akio; Nakamura, Kosuke; Sakata, Kozue; Kobayashi, Tomoko; Akiyama, Hiroshi; Kondo, Kazunari; Teshima, Reiko; Ohmori, Kiyomi; Kasahara, Masaki; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    Genetically modified (GM) papaya line 55-1 (55-1) is resistant to papaya ringspot virus infection and is commercially available in several countries. A specific detection method for 55-1 is required for mandatory labeling regulations. An event-specific real-time PCR method was developed by our laboratory. To validate the method, interlaboratory validation of event-specific qualitative real-time PCR analysis for 55-1 was performed in collaboration with 12 laboratories. DNA extraction and real-time PCR reaction methods were evaluated using 12 blind samples: six non-GM papayas and six GM papayas in each laboratory. Genomic DNA was highly purified from all papayas using an ion-exchange column, and the resulting DNA samples were analyzed using real-time PCR. The papaya endogenous reference gene chymopapain (CHY) and the event-specific 55-1 targeted sequence were detected in GM papayas, whereas CHY alone was detected in non-GM papayas in all laboratories. The cycle threshold values of CHY and the 55-1 targeted sequence showed high repeatability (RSD, 0.6-0.8%) and reproducibility (RSDR, 2.2-3.6%). This study demonstrates that the 55-1 real-time PCR detection method is a useful and reliable method to monitor 55-1 papaya in foods.
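
    Repeatability and reproducibility here are relative standard deviations of Ct values within and across laboratories. The simplified sketch below computes a per-laboratory RSD and a pooled RSD over hypothetical Ct values; a full reproducibility estimate would use a variance-components analysis.

        # Minimal sketch: relative standard deviation of Ct values within one
        # laboratory (repeatability) and pooled across laboratories.
        import statistics

        def rsd_percent(values):
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        # Hypothetical Ct values for the 55-1 target, three replicates per laboratory:
        ct_by_lab = {
            "lab01": [27.1, 27.3, 27.0],
            "lab02": [28.0, 27.8, 28.3],
            "lab03": [26.9, 27.2, 27.1],
        }

        within = [rsd_percent(v) for v in ct_by_lab.values()]               # per-lab repeatability
        pooled = rsd_percent([ct for v in ct_by_lab.values() for ct in v])  # across-lab spread
        print(max(within), pooled)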

  10. Event-Driven Collaboration through Publish/Subscribe Messaging Services for Near-Real- Time Environmental Sensor Anomaly Detection and Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Downey, S.; Minsker, B.; Myers, J. D.; Wentling, T.; Marini, L.

    2006-12-01

    One of the challenges in designing cyberinfrastructure for national environmental observatories is how to provide an integrated cyberenvironment which not only provides a standardized pipeline for streaming data from sensors into the observatory for archiving and distrib
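
    A minimal in-memory sketch of the publish/subscribe pattern underlying such event-driven sensor pipelines: a detector publishes anomaly events on a topic, and independent subscribers (alerting, archiving) receive them. The topic name, threshold, and handlers below are hypothetical, not part of the system described above.

        # Minimal sketch: in-memory publish/subscribe bus delivering sensor anomaly
        # events to independent subscribers (alerting, archiving, ...).
        from collections import defaultdict

        class MessageBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, event):
                for handler in self._subscribers[topic]:
                    handler(event)

        bus = MessageBus()
        bus.subscribe("sensor.anomaly", lambda e: print("ALERT:", e))
        bus.subscribe("sensor.anomaly", lambda e: print("archived:", e))

        # A detector publishes when a reading deviates strongly from its running mean:
        reading = {"station": "S-07", "value": 9.8, "zscore": 4.2}
        if reading["zscore"] > 3.0:
            bus.publish("sensor.anomaly", reading)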