Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
About 80 biotech crop events have been approved through safety assessment in Korea. They are controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems, with DNA-based detection methods used as efficient scientific management tools. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of multiple biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. Its specificity was confirmed and its sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes, one each from soybean, maize, cotton and canola, together with two regulatory elements (P35S and tNOS) and seven genes (pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B). Its specificity was confirmed and its sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola. The multiplex PCR and DNA chip can serve as efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.
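The workflow described (screen first, then confirm event-specifically) can be summarized computationally. A minimal sketch in Python, assuming a hypothetical element-by-event screening matrix; the element profiles below are illustrative, not taken from the paper:

```python
# Illustrative screening matrix: which elements/genes each event is assumed
# to carry. These profiles are hypothetical examples, not the paper's data.
SCREENING_MATRIX = {
    "MIR604":      {"pmi", "cry3B"},
    "MON88017":    {"P35S", "tNOS", "epsps2", "cry3B"},
    "DAS-59122-7": {"P35S", "tNOS", "pat"},
}

def candidate_events(detected_elements):
    """Return events whose element profile is consistent with the chip result."""
    detected = set(detected_elements)
    return [ev for ev, profile in SCREENING_MATRIX.items()
            if profile <= detected]      # all of the event's markers were seen

# Chip reports these elements positive; event-specific PCR then confirms.
print(candidate_events({"P35S", "tNOS", "pat", "cry3B", "epsps2"}))
```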
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and underwent reliable visual field testing every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity were 71% and 96% for GPA event analysis and 57% and 93% for GPA trend analysis, respectively. The 3 methods detected similar numbers of progressing cases. GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
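For readers unfamiliar with the agreement statistics used here, a minimal sketch of Cohen's kappa and of sensitivity/specificity against an expert gold standard; the progression calls below are invented for illustration, not the study's data:

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters (1 = progression, 0 = stable)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                # observed agreement
    pa = (sum(a) / n) * (sum(b) / n) + \
         ((n - sum(a)) / n) * ((n - sum(b)) / n)              # chance agreement
    return (po - pa) / (1 - pa)

def sens_spec(test, gold):
    """Sensitivity and specificity of `test` calls against `gold` calls."""
    tp = sum(t and g for t, g in zip(test, gold))
    tn = sum((not t) and (not g) for t, g in zip(test, gold))
    fp = sum(t and (not g) for t, g in zip(test, gold))
    fn = sum((not t) and g for t, g in zip(test, gold))
    return tp / (tp + fn), tn / (tn + fp)

# Invented calls for 10 eyes: expert panel vs automated event analysis.
expert = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
gpa_ev = [1, 0, 0, 1, 0, 1, 0, 0, 0, 0]
print(cohen_kappa(expert, gpa_ev), sens_spec(gpa_ev, expert))
```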
Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing
2005-11-30
Because of the genetically modified organism (GMO) labeling policies issued in many countries and areas, polymerase chain reaction (PCR) methods (screening, gene-specific, construct-specific, and event-specific) were developed for the enforcement of these policies and have become a mainstay of GMO detection. The event-specific PCR detection method is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems are reliable, sensitive, and accurate.
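Quantitative TaqMan assays of this kind typically estimate copy number from a log-linear Ct standard curve. A minimal sketch, with invented slope/intercept values (not the calibration from this study):

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Estimate template copies from a Ct value via a log-linear standard curve:
    Ct = slope * log10(copies) + intercept. A slope of -3.32 corresponds to
    ~100% amplification efficiency. Slope/intercept here are illustrative."""
    return 10 ** ((ct - intercept) / slope)

def gm_percentage(ct_event, ct_reference):
    """GM content as event-specific copies over taxon-specific reference copies."""
    return 100.0 * copies_from_ct(ct_event) / copies_from_ct(ct_reference)

print(round(copies_from_ct(31.4)))           # copies at Ct 31.4 (illustrative)
print(round(gm_percentage(33.0, 25.0), 3))   # illustrative Ct pair
```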
An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.
Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K
2007-08-01
To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution (EBGM05)) were generated for product-event pairs. Overall (1993-2004 data, EBGM05 ≥ 2, individual terms) results of signal detection using DA compared to standard methods were sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
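The paper's EBGM05 comes from an empirical Bayes shrinkage model whose full implementation is involved; the underlying observed/expected disproportionality for a product-event pair can, however, be sketched. This is the simpler frequentist relative reporting ratio, not Merck's exact method, and the counts are invented:

```python
def observed_expected(n11, n1_, n_1, n__):
    """Relative reporting ratio O/E for a product-event pair.
    n11: reports with product AND event; n1_: reports with the product;
    n_1: reports with the event; n__: all reports in the database.
    Empirical Bayes methods shrink this ratio toward 1 for sparse counts."""
    expected = n1_ * n_1 / n__
    return n11 / expected

# Invented counts for illustration only:
rr = observed_expected(n11=40, n1_=5000, n_1=800, n__=1_000_000)
print(rr)   # the paper's signal threshold was 2 (applied to EBGM05)
```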
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods have been developed for the enforcement of genetically modified organism (GMO) labeling policies, of which event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. Event-specific PCR primers and a TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these primers and probes. In qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of the haploid soybean genome. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.
Pacheco Coello, Ricardo; Pestana Justo, Jorge; Factos Mendoza, Andrés; Santos Ordoñez, Efrén
2017-12-20
In Ecuador, food products must be labeled if their transgenic content exceeds 0.9% of the whole product. For the detection of genetically modified organisms (GMOs), three DNA extraction methods were tested on 35 food products commercialized in Ecuador. Samples with positive amplification of endogenous genes were screened for the presence of the Cauliflower mosaic virus 35S promoter (P35S) and the nopaline synthase terminator (Tnos). TaqMan™ probes were used to determine the transgenic content of the GTS 40-3-2 and MON810 events through quantitative PCR (qPCR). Twenty-six processed food samples were positive for P35S alone and eight samples for both Tnos and P35S. Absolute qPCR results indicated that eleven samples were positive for the GTS 40-3-2 event and two for the MON810 event. A total of nine samples carrying the GTS 40-3-2 and MON810 events exceeded the allowed threshold of transgenic content in the whole food product. Different food products may require different DNA extraction protocols for GMO detection through PCR. Among the three methods tested, the DNeasy mericon food kit yielded the highest proportion of amplified endogenous genes through PCR. Finally, event-specific GMOs were detected in food products in Ecuador.
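Once absolute qPCR has yielded copy numbers, the labeling decision reduces to comparing the event-to-endogenous copy ratio with the 0.9% threshold. A minimal sketch with invented copy numbers:

```python
LABEL_THRESHOLD_PCT = 0.9   # Ecuadorian labeling threshold cited in the study

def needs_label(event_copies, endogenous_copies):
    """Return the measured transgenic content (%) and whether it exceeds
    the labeling threshold. Inputs are absolute qPCR copy numbers."""
    pct = 100.0 * event_copies / endogenous_copies
    return pct, pct > LABEL_THRESHOLD_PCT

# Illustrative absolute-quantification result for one sample:
print(needs_label(event_copies=210, endogenous_copies=9800))
```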
Nakamura, Kosuke; Akiyama, Hiroshi; Kawano, Noriaki; Kobayashi, Tomoko; Yoshimatsu, Kayo; Mano, Junichi; Kitta, Kazumi; Ohmori, Kiyomi; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko
2013-12-01
Genetically modified (GM) rice (Oryza sativa) lines, such as insecticidal Kefeng and Kemingdao, have been developed and found as unauthorised contaminants in processed rice products in many countries. Therefore, qualitative detection methods for these GM rice lines are required for GM food regulation. A transgenic construct for expressing cowpea (Vigna unguiculata) trypsin inhibitor (CpTI) was detected in some imported processed rice products contaminated with Kemingdao. The 3' terminal sequence of the identified transgenic construct for expression of CpTI included an endoplasmic reticulum retention signal coding sequence (KDEL) and the nopaline synthase terminator (T-nos). The sequence was identical to that reported for Kefeng. A novel construct-specific real-time polymerase chain reaction (PCR) detection method targeting the junction region sequence between CpTI-KDEL and T-nos was developed. Imported processed rice products were evaluated for contamination with GM rice using the developed construct-specific real-time PCR method, and the detection frequency was compared with that of five event-specific detection methods. The construct-specific method detected the GM rice at a higher frequency than the event-specific methods. Therefore, we propose that the construct-specific detection method is a beneficial tool for screening processed rice products for contamination with GM rice lines, such as Kefeng, for GM food regulation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
Patwardhan, Supriya; Dasari, Srikanth; Bhagavatula, Krishna; Mueller, Steffen; Deepak, Saligrama Adavigowda; Ghosh, Sudip; Basak, Sanjay
2015-01-01
An efficient PCR-based method to trace genetically modified food and feed products is in demand due to regulatory requirements and contamination issues in India. However, post-PCR detection with conventional methods has limited sensitivity in amplicon separation, which is crucial in multiplexing. This study aimed to develop a sensitive post-PCR detection method using PCR-chip capillary electrophoresis (PCR-CCE) to detect and identify specific genetically modified organisms in genomic DNA mixtures by targeting event-specific nucleotide sequences. Using the PCR-CCE approach, novel multiplex methods were developed to detect MON531 cotton, EH 92-527-1 potato, Bt176 maize, GT73 canola, or GA21 maize simultaneously when their genomic DNAs in mixtures were amplified using their primer mixture. The repeatability RSD (RSDr) of the peak migration time was 0.06% and 3.88% for MON531 and Bt176, respectively. The reproducibility RSD (RSDR) of the Cry1Ac peak ranged from 0.12% to 0.40% in multiplex methods. The method was sensitive enough to resolve amplicons differing in size by as little as 4 bp. The PCR-CCE method is suitable for detecting multiple genetically modified events in a composite DNA sample by tagging their event-specific sequences.
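Repeatability and reproducibility here are expressed as relative standard deviations; a minimal sketch of the RSD calculation on replicate peak migration times (values invented):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample SD over the mean, times 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Invented replicate peak migration times (s) for one amplicon:
migration_times = [41.02, 41.05, 41.01, 41.08, 41.03]
print(round(rsd_percent(migration_times), 2))
```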
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H
2012-04-27
To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon this sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng of total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the bias between the observed and true values of three samples was lower than the acceptance criterion (<25%) for GMO detection methods. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
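Trueness in such validations is judged by the bias between measured and true GM content against the <25% acceptance criterion. A minimal sketch with invented measured/true pairs:

```python
ACCEPTANCE_PCT = 25.0   # bias acceptance criterion cited in the abstract

def bias_percent(measured, true):
    """Relative bias (%) of the quantified GM content versus the true value."""
    return 100.0 * (measured - true) / true

# Invented (measured %, true %) pairs for three samples:
for measured, true in [(0.92, 1.0), (4.6, 5.0), (11.1, 10.0)]:
    b = bias_percent(measured, true)
    print(f"bias {b:+.1f}% -> {'pass' if abs(b) < ACCEPTANCE_PCT else 'fail'}")
```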
Lee, David; La Mura, Maurizio; Allnutt, Theo R; Powell, Wayne
2009-02-02
The most common method of GMO detection is based upon the amplification of GMO-specific DNA amplicons using the polymerase chain reaction (PCR). Here we have applied the loop-mediated isothermal amplification (LAMP) method to amplify GMO-related DNA sequences: 'internal' commonly used motifs for controlling transgene expression, and event-specific (plant-transgene) junctions. We have tested the specificity and sensitivity of the technique for use in GMO studies. Results show that detection of 0.01% GMO in equivalent background DNA was possible, and dilutions of template suggest that detection from single copies of the template may be possible using LAMP. This work shows that GMO detection can be carried out using LAMP for routine screening as well as for specific event detection. Moreover, the sensitivity and the ability to amplify targets even against a high background of DNA demonstrated here highlight the advantages of this isothermal amplification when applied to GMO detection.
Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko
2016-08-15
Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of the papaya genome sequence. Transgenic construct- and event-specific sequences were identified as belonging to a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled the identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps, and singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and the specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve absolute quantitation of different GMO events, and the LOQ and LOD indicate that the method is suitable for routine detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
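Absolute quantitation in digital PCR rests on Poisson correction of positive-partition counts; in a duplex assay the ratio then cancels the partition volume. A minimal sketch using the generic dPCR formula, not the authors' exact pipeline; partition counts are invented:

```python
import math

def dpcr_copies(positive, total, partition_volume_ul=0.00085):
    """Poisson-corrected concentration (copies/uL) from dPCR partition counts.
    The default partition volume is a typical droplet size, not this study's."""
    lam = -math.log(1.0 - positive / total)   # mean copies per partition
    return lam / partition_volume_ul

def gm_ratio(pos_event, pos_ref, total):
    """GM content (%) as event-target copies over reference-gene copies;
    the partition volume cancels in the ratio."""
    return 100.0 * dpcr_copies(pos_event, total) / dpcr_copies(pos_ref, total)

# Invented duplex readout: 20,000 partitions, one dye per target.
print(round(gm_ratio(pos_event=95, pos_ref=9000, total=20000), 3))
```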
Fu, Wei; Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Du, Zhixin; Tian, Wenying; Wang, Qin; Wang, Huiyu; Xu, Wentao; Zhu, Shuifang
2015-01-01
Digital PCR has developed rapidly since it was first reported in the 1990s. It was recently reported that an improved method facilitated the detection of genetically modified organisms (GMOs). However, to use this improved method, the samples must be pretreated, which could introduce inaccuracy into the results. In our study, we explored a pretreatment-free digital PCR detection method for GMO screening. We chose the CaMV 35S promoter and the NOS terminator as the targets in our assay. To determine the specificity of our method, 9 GMO events were collected, including MON810, MON863, TC1507, MIR604, MIR162, GA21, T25, NK603 and Bt176. Moreover, the sensitivity and the intra-laboratory and inter-laboratory reproducibility of our detection method were assessed. The results showed that the limit of detection of our method was 0.1%, which is lower than the EU labeling threshold. The specificity and stability were consistent across the 9 events, and the intra-laboratory and inter-laboratory reproducibility were both good. Finally, the perfect agreement of results for eight double-blind samples indicated the good practicability of our method. In conclusion, this method allows more sensitive, specific and stable screening of the GMO content of internationally traded products. PMID:26239916
Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo
2005-05-18
To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system that simultaneously targets many sequences in a single reaction. The system included probes for a screening gene, a species reference gene, a specific gene, a construct-specific gene, an event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as an internal control to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structural genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate the two GM maize events Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false positives and false negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.
Full-waveform detection of non-impulsive seismic events based on time-reversal methods
NASA Astrophysics Data System (ADS)
Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya
2017-12-01
We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function that depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area lies between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms in March 2012 and May 2016. The second area of interest is the Gulf of California, where two swarms took place during July and September of 2015. We show that we are able to detect previously unreported non-impulsive events, and recommend that this method be used together with more traditional template-matching methods to maximize the number of detected events.
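The detection function described (stacked station correlations of synthetic templates with continuous data, thresholded relative to the noise level) can be sketched with generic normalized cross-correlation. The toy traces below are synthetic, not strain-Green's-tensor templates from the study:

```python
import numpy as np

def detection_function(continuous, templates):
    """Stack per-station cross-correlations of one template per station.
    Data are normalized globally (not per window), for simplicity."""
    stacks = []
    for data, tmpl in zip(continuous, templates):
        tmpl = (tmpl - tmpl.mean()) / (tmpl.std() * len(tmpl))
        data = (data - data.mean()) / data.std()
        stacks.append(np.correlate(data, tmpl, mode="valid"))
    return np.mean(stacks, axis=0)

def detect(det_fn, n_sigma=6.0):
    """Sample indices where the stack exceeds a noise-relative threshold."""
    thr = det_fn.mean() + n_sigma * det_fn.std()
    return np.where(det_fn > thr)[0]

# Toy example: one 'event' waveform buried in noise at 3 stations.
rng = np.random.default_rng(0)
tmpl = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
continuous, templates = [], []
for _ in range(3):
    trace = rng.normal(0, 1.0, 5000)
    trace[2000:2200] += 3 * tmpl          # inject the event at sample 2000
    continuous.append(trace)
    templates.append(tmpl)

print(detect(detection_function(continuous, templates)))   # indices near 2000
```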
Development of an algorithm for automatic detection and rating of squeak and rattle events
NASA Astrophysics Data System (ADS)
Chandrika, Unnikrishnan Kuttan; Kim, Jay H.
2010-10-01
A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL) that approximates the human perception of a transient noise. At first, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components, to take advantage of the high signal-to-noise ratio in the high frequency range. An S&R event is identified when the value of the PTL time history exceeds the detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and minimum possibility of false alarms.
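A faithful PTL implementation requires wavelet analysis, Zwicker specific loudness and Glasberg-Moore temporal integration; as a heavily simplified stand-in, a band-passed envelope minus a running background estimate can illustrate the detection-by-thresholding stage. Everything below (band limits, windows, threshold) is illustrative:

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def transient_history(x, fs, lo=4000.0, hi=11000.0, bg_win=0.5):
    """Crude stand-in for the PTL: high-band envelope minus a running
    background estimate, clipped at zero."""
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    env = np.abs(hilbert(sosfilt(sos, x)))
    k = int(bg_win * fs)
    bg = np.convolve(env, np.ones(k) / k, mode="same")   # background estimate
    return np.clip(env - bg, 0.0, None)

def detect_events(x, fs, n_sigma=6.0):
    h = transient_history(x, fs)
    thr = h.mean() + n_sigma * h.std()       # stand-in for the jury threshold
    return np.where(h > thr)[0] / fs         # event times in seconds

fs = 44100
t = np.arange(0, 2.0, 1 / fs)
x = np.random.default_rng(1).normal(0, 0.05, t.size)     # background noise
burst = np.random.default_rng(2).normal(0, 1.0, 400)
x[int(0.8 * fs):int(0.8 * fs) + 400] += burst            # a 'rattle' at 0.8 s
print(detect_events(x, fs)[:5])                          # times near 0.8 s
```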
Fighting detection using interaction energy force
NASA Astrophysics Data System (ADS)
Wateosot, Chonthisa; Suvonvorn, Nikom
2017-02-01
Fighting detection is an important issue in security, aimed at preventing criminal or undesirable events in public places. Much research on computer vision techniques has studied the detection of specific events in crowded scenes. In this paper we focus on fighting detection using a social-based Interaction Energy Force (IEF). The method uses low-level features without object extraction and tracking. The interaction force is modeled using the magnitude and direction of optical flows. A fighting factor is derived from this model to detect fighting events using a thresholding method. An energy map of the interaction force is also presented to identify the corresponding events. The evaluation is performed using the NUSHGA and BEHAVE datasets. The results show the method's efficiency, with high accuracy under various conditions.
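A schematic reading of the IEF idea: dense optical flow supplies per-pixel motion vectors, and an interaction energy can be proxied by flow magnitude weighted by directional disagreement within neighborhoods, then thresholded. This sketch uses OpenCV's Farnebäck flow and is not the authors' exact model; the threshold is a placeholder to be tuned per scene:

```python
import cv2
import numpy as np

def interaction_energy(prev_gray, gray, grid=16):
    """Proxy IEF: per-cell sum of |flow| weighted by directional disagreement."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    h, w = mag.shape
    energy = 0.0
    for y in range(0, h - grid, grid):
        for x in range(0, w - grid, grid):
            m = mag[y:y + grid, x:x + grid]
            a = ang[y:y + grid, x:x + grid]
            # circular variance of directions: near 1 when motions oppose
            disagreement = 1.0 - np.hypot(np.cos(a).mean(), np.sin(a).mean())
            energy += m.sum() * disagreement
    return energy

def is_fighting(energy, threshold=5e4):   # hypothetical threshold, tune per scene
    return energy > threshold

# Toy frames: textured noise image shifted by 3 px (coherent, non-fighting motion)
rng = np.random.default_rng(0)
f1 = (rng.random((240, 320)) * 255).astype(np.uint8)
f2 = np.roll(f1, 3, axis=1)
print(is_fighting(interaction_energy(f1, f2)))
```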
Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong
2017-07-01
A novel standard reference plasmid, pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as of the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection corresponding to approximately 1-10 copies of the rice haploid genome. In this quantitative PCR assay, the squared regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method, and the results suggest it could be used routinely to identify the five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.
Treml, Diana; Venturelli, Gustavo L; Brod, Fábio C A; Faria, Josias C; Arisi, Ana C M
2014-12-10
A genetically modified (GM) common bean event, namely Embrapa 5.1, resistant to the bean golden mosaic virus (BGMV), was approved for commercialization in Brazil. Brazilian regulation for genetically modified organism (GMO) labeling requires that any food containing more than 1% GMO be labeled. The event-specific polymerase chain reaction (PCR) method has been the primary trend for GMO identification and quantitation because of its high specificity based on the flanking sequence. This work reports the development of an event-specific assay, named FGM, for Embrapa 5.1 detection and quantitation by use of SYBR Green or hydrolysis probe. The FGM assay specificity was tested for Embrapa 2.3 event (a noncommercial GM common bean also resistant to BGMV), 46 non-GM common bean varieties, and other crop species including maize, GM maize, soybean, and GM soybean. The FGM assay showed high specificity to detect the Embrapa 5.1 event. Standard curves for the FGM assay presented a mean efficiency of 95% and a limit of detection (LOD) of 100 genome copies in the presence of background DNA. The primers and probe developed are suitable for the detection and quantitation of Embrapa 5.1.
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems and living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of the biotech maize events 3272 and LY 038 on the basis of their 3' flanking regions. The qualitative primers showed specificity, yielding a single PCR product, and sensitivity down to 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and the two event-specific 3' flanking DNA sequences of event 3272 and LY 038 as the reference molecule. In-house validation of the quantitative methods was performed using six levels of mixed samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of ±30%. Limits of quantitation (LOQs) of the quantitative methods were 0.1% for the simplex real-time PCRs of event 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study reports event-specific analytical methods applicable to qualitative and quantitative analysis of the biotech maize events 3272 and LY 038.
A high-throughput multiplex method adapted for GMO detection.
Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique
2008-12-24
A high-throughput multiplex assay for the detection of genetically modified organisms (GMOs) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and donor organisms. The assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection, and demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.
Cheng, Nan; Shang, Ying; Xu, Yuancong; Zhang, Li; Luo, Yunbo; Huang, Kunlun; Xu, Wentao
2017-05-15
Stacked genetically modified organisms (GMO) are becoming popular for their enhanced production efficiency and improved functional properties, and on-site detection of stacked GMO is an urgent challenge to be solved. In this study, we developed a cascade system combining event-specific tag-labeled multiplex LAMP with a DNAzyme-lateral flow biosensor for reliable detection of stacked events (DP305423 × GTS 40-3-2). Three primer sets, both event-specific and soybean species-specific, were newly designed for the tag-labeled multiplex LAMP system. A trident-like lateral flow biosensor displayed amplified products simultaneously without cross contamination, and DNAzyme enhancement improved the sensitivity effectively. After optimization, the limit of detection was approximately 0.1% (w/w) for stacked GM soybean, which is sensitive enough to detect genetically modified content at the threshold values established by several countries for regulatory compliance. The entire detection process could be shortened to 120 min without any large-scale instrumentation. This method may be useful for the in-field detection of DP305423 × GTS 40-3-2 soybean on a single-kernel basis and for on-site screening of stacked GM soybean lines and individual parent GM soybean lines in highly processed foods. Copyright © 2017 Elsevier B.V. All rights reserved.
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and provides a user-friendly search service for detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to the database, and newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
Picking vs Waveform based detection and location methods for induced seismicity monitoring
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2017-04-01
Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas and mining operations or geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by very many events with short inter-event times that can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison in terms of performance between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at the local scale. It is based on a cluster search algorithm to associate detections to one or many potential earthquake sources; although recent applications have proved promising, an extensive test with induced seismicity datasets has not yet been performed. SCAUTOLOC, on the other hand, is a more "conventional" method and the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for regional and teleseismic applications, thus its performance with microseismic data might be limited. We analyze the performance of the three methodologies on a synthetic dataset with realistic noise conditions as well as on the first hour of continuous waveform data, including the Ml 3.5 St. Gallen earthquake, recorded by a microseismic network deployed in the area. We finally compare the results obtained with all three methods against a manually revised catalogue.
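For contrast with the waveform-based approach, the pick-based side of such comparisons conventionally starts from an STA/LTA trigger of the kind automated picker modules build on. A minimal sketch on toy data (parameters illustrative, not those of SCAUTOLOC or SCANLOC):

```python
import numpy as np

def sta_lta(trace, fs, sta_s=0.5, lta_s=10.0):
    """Classic STA/LTA ratio on the squared trace (characteristic function)."""
    cf = trace ** 2
    n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(cf, np.ones(n_sta) / n_sta, "same")
    lta = np.convolve(cf, np.ones(n_lta) / n_lta, "same")
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(3)
fs = 100
trace = rng.normal(0, 1, 60 * fs)
trace[3000:3400] += 6 * rng.normal(0, 1, 400)   # impulsive arrival at t = 30 s
picks = np.where(sta_lta(trace, fs) > 4.0)[0] / fs   # trigger threshold of 4
print(picks[:3])                                     # times near 30 s
```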
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
Anesthesia information management systems (AIMS) are optimized to find and display data and curves for one specific intervention, but not for retrospective analysis of a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture and (2) the completeness of documentation. To solve the architectural problem, data warehouses were developed to provide an architecture suitable for analysis. However, the completeness of documentation remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitute, defined as the nearest documented event. As an example, we focused on the automatic detection of the start and the end of the anesthesia procedure when these events were not documented by the clinicians. We applied our method to a set of records in order to evaluate (1) the event detection accuracy and (2) the improvement in the number of valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. In addition, we increased data completeness by 21.1% (from 80.3% to 97.2% of the total database) for the start and end of anesthesia events. This method appears efficient for replacing missing "start and end of anesthesia" events. It could also be used to replace other missing time events in this particular data warehouse as well as in other kinds of data warehouses.
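A minimal sketch of the substitution principle: when a target event is undocumented, take the time of the nearest documented surrogate from an ordered rule list. Event names and the rule ordering below are invented for illustration, not the paper's actual rules:

```python
# Ordered substitution rules: surrogate events for a missing target event.
# Names and ordering are illustrative, not taken from the paper.
SUBSTITUTES = {
    "start_of_anesthesia": ["induction_drug_given", "airway_secured", "incision"],
    "end_of_anesthesia":   ["extubation", "leave_operating_room"],
}

def substitute_time(record, target):
    """Return the documented time for `target`, or the first available
    substitute's time according to the rule list (None if nothing applies)."""
    if target in record:
        return record[target]
    for surrogate in SUBSTITUTES.get(target, []):
        if surrogate in record:
            return record[surrogate]
    return None

# One intervention with a missing 'start_of_anesthesia' timestamp:
record = {"airway_secured": "08:12", "incision": "08:31", "extubation": "10:05"}
print(substitute_time(record, "start_of_anesthesia"))   # -> '08:12'
```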
Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang
2014-01-01
The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two other methods could only be applied to certain GMOs. The remaining three methods were not suitable for measurement of P35S in some test events, because SNPs in the primer/probe binding sites would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each test event. This study provides a general P35S screening method, with greater coverage than existing methods. PMID:25483893
Molecular toolbox for the identification of unknown genetically modified organisms.
Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc
2010-03-01
Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.
In-situ trainable intrusion detection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Symons, Christopher T.; Beaver, Justin M.; Gillen, Rob
A computer-implemented method detects intrusions by analyzing network traffic. The method includes a semi-supervised learning module connected to a network node. The learning module uses labeled and unlabeled data to train a semi-supervised machine learning sensor. The method records events that include a feature set made up of unauthorized intrusions and benign computer requests. The method identifies at least some of the benign computer requests that occur during the recording of the events while treating the remainder of the data as unlabeled. The method trains the semi-supervised learning module at the network node in situ, such that the semi-supervised learning module may identify malicious traffic without relying on specific rules, signatures, or anomaly detection.
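A minimal sketch of the semi-supervised idea, using scikit-learn's SelfTrainingClassifier as a stand-in for the patent's learning module; unlabeled events carry the label -1 per sklearn's convention, and the traffic features are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
# Invented 4-feature summaries of recorded traffic events:
X = rng.normal(0, 1, (500, 4))
y_true = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # 1 = intrusion, 0 = benign
y = np.full(500, -1)                             # -1 marks unlabeled events
labeled = rng.choice(500, 40, replace=False)     # only a few events are labeled
y[labeled] = y_true[labeled]

# Self-training: confident pseudo-labels extend the small labeled set in situ.
model = SelfTrainingClassifier(LogisticRegression(), threshold=0.9)
model.fit(X, y)
print(model.predict(X[:5]))
```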
Singh, Monika; Bhoge, Rajesh K; Randhawa, Gurinderjit
2018-04-20
Background: Confirming the integrity of seed samples in powdered form is important prior to conducting a genetically modified organism (GMO) test. Rapid onsite methods may provide a technological solution to check for genetically modified (GM) events at ports of entry. In India, Bt cotton is the commercialized GM crop with four approved GM events; however, 59 GM events have been approved globally. GMO screening is required to test for authorized GM events. The identity and amplifiability of test samples can be ensured first by employing endogenous genes as an internal control. Objective: A rapid onsite detection method was developed for an endogenous reference gene, stearoyl acyl carrier protein desaturase (Sad1) of cotton, employing visual and real-time loop-mediated isothermal amplification (LAMP). Methods: The assays were performed at a constant temperature of 63°C for 30 min for visual LAMP and 62°C for 40 min for real-time LAMP. Positive amplification was visualized as a change in color from orange to green on addition of SYBR® Green, or detected as real-time amplification curves. Results: The specificity of the LAMP assays was confirmed using a set of 10 samples. The LOD was up to 0.1% for visual LAMP, detecting 40 target copies, and up to 0.05% for real-time LAMP, detecting 20 target copies. Conclusions: The developed methods can be used to confirm the integrity of seed powder prior to conducting a GMO test for specific GM events of cotton. Highlights: LAMP assays for the endogenous Sad1 gene of cotton have been developed to be used as an internal control for onsite GMO testing in cotton.
Reischer, G H; Haider, J M; Sommer, R; Stadler, H; Keiblinger, K M; Hornek, R; Zerobin, W; Mach, R L; Farnleitner, A H
2008-10-01
The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources but the few recent quantitative microbial source tracking applications disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring (Monitoring) covering seasonal hydrological dynamics and an investigation of flood events (Events) as periods of the strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with preponderance of the ruminant-specific marker. Applying multiparametric analysis of all data allowed linking the ruminant-specific marker to general faecal pollution indicators, especially during Events. Up to 80% of the variation of faecal indicator levels during Events could be explained by ruminant-specific marker levels proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design.
Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling
Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren
2014-01-01
Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
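A minimal sketch of dynamic query expansion: score documents by overlap with the seed terms, pull in the most frequent co-occurring terms, and iterate. The scoring is deliberately simplified relative to the paper, and the tweets are toy examples:

```python
from collections import Counter

def expand_query(docs, seeds, rounds=3, top_k=2):
    """Iteratively grow a seed term set from terms co-occurring with matches."""
    terms = set(seeds)
    for _ in range(rounds):
        matched = [d for d in docs if terms & set(d.split())]
        counts = Counter(w for d in matched for w in d.split()
                         if w not in terms)
        if not counts:
            break
        terms |= {w for w, _ in counts.most_common(top_k)}
    return terms

tweets = [
    "protest march downtown tonight",
    "march blocked by police downtown",
    "great pizza downtown",
    "strike and protest at the plaza",
]
print(expand_query(tweets, seeds={"protest"}))
```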
Vaudano, Enrico; Costantini, Antonella; Garcia-Moruno, Emilia
2016-10-03
The availability of genetically modified (GM) yeasts for winemaking, in particular transgenic strains based on the integration of genetic constructs derived from other organisms into the genome of Saccharomyces cerevisiae, has been a reality for several years. Despite this, their use is only authorized in a few countries and limited to two strains: ML01, able to convert malic acid into lactic acid during alcoholic fermentation, and ECMo01, suitable for reducing the risk of carbamate production. In this work we propose a quali-quantitative, culture-independent method for the detection of the GM yeast ML01 in commercial preparations of active dry yeast (ADY), consisting of efficient DNA extraction and quantitative PCR (qPCR) analysis based on an event-specific assay targeting the malolactic cassette (MLC) and a taxon-specific S. cerevisiae assay detecting the MRP2 gene. The ADY DNA extraction methodology has been shown to provide DNA of good purity suitable for subsequent qPCR. The MLC and MRP2 qPCR assays showed specificity, dynamic range, limit of quantification (LOQ), limit of detection (LOD), precision and trueness fully compliant with international reference guidelines. The method has been shown to reliably detect 0.005% (mass/mass) of GM ML01 S. cerevisiae in commercial preparations of ADY. Copyright © 2016 Elsevier B.V. All rights reserved.
Object-Oriented Query Language For Events Detection From Images Sequences
NASA Astrophysics Data System (ADS)
Ganea, Ion Eugen
2015-09-01
In this paper a method is presented to represent the events extracted from image sequences, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. As the final goal of this research, the language will be extended with specific syntax for constructing templates for abnormal events and for incident detection.
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision-analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride-event detection methods. To date, no method has been validated for movement on a circle, and algorithms are commonly specific to front or hind limbs or to a particular gait. In this study, we aimed to develop and validate kinematic stride-segmentation methods that rely exclusively on a single dorsal hoof marker and are applicable to movement on a straight line and on a circle at walk and trot. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected from the vertical force profile and used as the reference values. Kinematic events were detected from the displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds applied to derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision from the differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches gave very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. Use of velocity thresholds for foot-on detection produced biased event estimates for the foot on the inside of the circle at trot, but adjusting the thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (on average up to 16 ms per condition), greater than the effect on foot-on estimates or on foot-off estimates in the forelimbs (on average up to ±7 ms per condition). PMID:26157641
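As a sketch of the threshold-based family of algorithms described above (not the validated implementation, whose signals, thresholds and filtering are specific to the study), the following Python fragment flags foot-on and foot-off candidates where the speed of a single dorsal hoof marker crosses a fixed threshold:

    # Hedged sketch: velocity-threshold stride segmentation from one hoof marker.
    # Sampling rate, threshold and the toy trajectory are illustrative only.
    import numpy as np

    def detect_stride_events(pos, fs, thresh=0.05):
        """pos: (n, 3) marker trajectory in metres; fs: sampling rate in Hz.
        Returns sample indices of foot-on (speed drops below thresh) and
        foot-off (speed rises above thresh) candidates."""
        vel = np.gradient(pos, 1.0 / fs, axis=0)   # numerical time derivative
        speed = np.linalg.norm(vel, axis=1)
        below = speed < thresh                     # hoof (nearly) stationary in stance
        foot_on = np.where(~below[:-1] & below[1:])[0] + 1
        foot_off = np.where(below[:-1] & ~below[1:])[0] + 1
        return foot_on, foot_off

    # toy trajectory: 0.5 s stance (still) alternating with 0.5 s swing (advancing)
    fs = 200.0
    t = np.arange(0, 2.0, 1.0 / fs)
    x = 0.5 * np.floor(t) + np.clip((t % 1.0) - 0.5, 0.0, None)
    pos = np.column_stack([x, np.zeros_like(x), np.zeros_like(x)])
    print(detect_stride_events(pos, fs))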
Ballari, Rajashekhar V; Martin, Asha; Gowda, Lalitha R
2013-01-01
Brinjal is an important vegetable crop, and its major crop losses are due to insect attack. Insect-resistant EE-1 brinjal has been developed and is awaiting approval for commercial release. Consumer health concerns and the implementation of international labelling legislation demand reliable analytical detection methods for genetically modified (GM) varieties. End-point and real-time polymerase chain reaction (PCR) methods were used to detect EE-1 brinjal. In end-point PCR, primer pairs specific to the 35S CaMV promoter, the NOS terminator and the nptII gene common to other GM crops were used. Based on the revealed 3' transgene integration sequence, primers specific for the event EE-1 brinjal were designed. These primers were used for end-point single, multiplex and SYBR-based real-time PCR. End-point single PCR showed that the designed primers were highly specific to event EE-1, with a sensitivity of 20 pg of genomic DNA, corresponding to 20 copies of haploid EE-1 brinjal genomic DNA. The limits of detection and quantification for the SYBR-based real-time PCR assay were 10 and 100 copies, respectively. The prior development of detection methods for this important vegetable crop will facilitate compliance with any forthcoming labelling regulations. Copyright © 2012 Society of Chemical Industry.
Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng
2012-11-07
Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well-known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%–0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.
Semiautomated tremor detection using a combined cross-correlation and neural network approach
Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.
2013-01-01
Despite observations of tectonic tremor in many locations around the globe, the emergent phase arrivals, low-amplitude waveforms, and variable event durations make automatic detection a nontrivial task. In this study, we employ a new method to identify tremor in large data sets using a semiautomated technique. The method first reduces the data volume with an envelope cross-correlation technique, followed by a Self-Organizing Map (SOM) algorithm to identify and classify event types. The method detects tremor in an automated fashion after calibrating for a specific data set, hence we refer to it as being "semiautomated". We apply the semiautomated detection algorithm to a newly acquired data set of waveforms from a temporary deployment of 13 seismometers near Cholame, California, from May 2010 to July 2011. We manually identify tremor events in a 3-week-long test data set and compare to the SOM output and find a detection accuracy of 79.5%. Detection accuracy improves with increasing signal-to-noise ratios and number of available stations. We find detection completeness of 96% for tremor events with signal-to-noise ratios above 3 and optimal results when data from at least 10 stations are available. We compare the SOM algorithm to the envelope correlation method of Wech and Creager and find the SOM performs significantly better, at least for the data set examined here. Using the SOM algorithm, we detect 2606 tremor events with a cumulative signal duration of nearly 55 h during the 13-month deployment. Overall, the SOM algorithm is shown to be a flexible new method that utilizes characteristics of the waveforms to identify tremor from noise or other seismic signals.
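The first, data-reduction stage can be pictured with a short Python sketch: compute smoothed envelopes at each station and flag windows whose envelopes are coherent across the network. This is simplified to zero-lag correlation; window length, smoothing and the 0.6 threshold are illustrative, not the study's settings.

    # Hedged sketch: envelope coherence scan across stations (zero-lag simplification).
    import numpy as np
    from scipy.signal import hilbert

    def envelope(trace, smooth=50):
        env = np.abs(hilbert(trace))                       # analytic-signal envelope
        return np.convolve(env, np.ones(smooth) / smooth, mode="same")

    def coherent_windows(traces, fs, win_s=10.0, thresh=0.6):
        """traces: (n_sta, n_samp). Returns start samples of windows whose mean
        pairwise envelope correlation exceeds thresh."""
        envs = np.array([envelope(tr) for tr in traces])
        n = int(win_s * fs)
        hits = []
        for start in range(0, envs.shape[1] - n, n // 2):  # 50% overlap
            w = envs[:, start:start + n]
            w = (w - w.mean(axis=1, keepdims=True)) / (w.std(axis=1, keepdims=True) + 1e-12)
            cc = (w @ w.T) / n                             # pairwise correlation matrix
            if cc[np.triu_indices_from(cc, k=1)].mean() > thresh:
                hits.append(start)
        return hits

    rng = np.random.default_rng(0)
    t = np.arange(6000) / 100.0
    burst = np.exp(-0.5 * ((t - 40) / 5) ** 2) * np.sin(2 * np.pi * 5 * t)  # tremor-like burst
    traces = burst + 0.4 * rng.standard_normal((5, 6000))
    print(coherent_windows(traces, fs=100.0))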
Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2011-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated into A2704-12, and this region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared in both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we therefore constructed a plasmid using pBR322. The conversion factor (Cf), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined Cf values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility expressed as relative standard deviation (RSDR), and the determined bias and RSDR values for the method were each less than 20%. These results suggest that the developed method is suitable for practical analyses for the detection and quantification of A2704-12.
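The role of the conversion factor can be shown in two lines of Python; the Cf value below matches the one reported above, while the copy counts are invented. In this family of methods the GM amount is the sample's event/endogenous copy ratio divided by Cf (the same ratio measured in 100% GM material):

    # Hedged sketch: GM content (%) from event-specific and endogenous copy numbers.
    def gmo_percent(event_copies, endogenous_copies, cf=0.98):
        return (event_copies / endogenous_copies) / cf * 100.0

    print(gmo_percent(event_copies=52.0, endogenous_copies=10600.0))  # ~0.5 %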
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, Olga; Zeng, Jing; Novak, Avrey
Purpose: The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. Methods: This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Results: Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Conclusions: Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
Improved Detection of Local Earthquakes in the Vienna Basin (Austria), using Subspace Detectors
NASA Astrophysics Data System (ADS)
Apoloner, Maria-Theresia; Caffagni, Enrico; Bokelmann, Götz
2016-04-01
The Vienna Basin in Eastern Austria is densely populated and highly developed; it is also a region of low to moderate seismicity, yet the seismological network coverage is relatively sparse. This demands improving our earthquake-detection capability by testing new methods and enlarging the existing local earthquake catalogue. This contributes to imaging tectonic fault zones for better understanding seismic hazard, also through improved earthquake statistics (b-value, magnitude of completeness). Detection of low-magnitude earthquakes, or of events whose highest amplitudes only slightly exceed the noise level (low signal-to-noise ratio, SNR), may be possible using standard methods such as the short-term over long-term average (STA/LTA). However, due to sparse network coverage and high background noise, such a technique may not detect all potentially recoverable events. Yet earthquakes originating from the same source region, relatively close to each other, should be characterized by similar seismic waveforms at a given station. This waveform similarity can be exploited by specific techniques such as correlation-template matching (also known as matched filtering) or subspace detection methods (based on subspace theory). Matching techniques require a reference or template event, usually characterized by high waveform coherence across the array receivers and high SNR, which is cross-correlated with the continuous data. Subspace detection methods, by contrast, overcome in principle the need to define a single template event, using instead a subspace extracted from multiple events. This approach should in theory be more robust in detecting signals that exhibit strong variability (e.g., because of source or magnitude). In this study we scan the continuous data recorded in the Vienna Basin with a subspace detector to identify additional events. This allows us to estimate the increase of the seismicity rate in the local earthquake catalogue, thereby providing an evaluation of network performance and of the method's efficiency.
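A minimal Python sketch of the subspace idea, under simplifying assumptions (time-aligned single-channel templates, a fixed subspace dimension, and an illustrative threshold):

    # Hedged sketch: subspace detector built from several aligned template events.
    import numpy as np

    def build_subspace(templates, d=3):
        """templates: (n_events, n_samp), time-aligned. Returns (n_samp, d) basis."""
        u, s, vt = np.linalg.svd(templates.T, full_matrices=False)
        return u[:, :d]

    def subspace_scan(data, basis, thresh=0.7):
        n = basis.shape[0]
        hits = []
        for start in range(0, data.size - n, n // 4):
            w = data[start:start + n] - data[start:start + n].mean()
            stat = np.sum((basis.T @ w) ** 2) / (np.sum(w ** 2) + 1e-12)  # energy capture, 0..1
            if stat > thresh:
                hits.append((start, round(stat, 3)))
        return hits

    rng = np.random.default_rng(2)
    wavelet = np.sin(2 * np.pi * 4 * np.arange(0, 1, 0.01)) * np.hanning(100)
    templates = wavelet + 0.1 * rng.standard_normal((6, 100))  # variable "past events"
    data = 0.2 * rng.standard_normal(5000)
    data[2000:2100] += wavelet                                 # hidden event
    print(subspace_scan(data, build_subspace(templates)))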
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to detecting previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none is geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, part of the seventh framework programme (FP7) of the European Commission. The criteria for selecting the vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g., Micromedex), and summaries of product characteristics. Classification into positive (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events for inclusion in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC, and 51 as unclassifiable. Most classifications (91) were based on literature review and 45 on expert committee reports; for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Zhang, M Z; Zhang, X F; Chen, X M; Chen, X; Wu, S; Xu, L L
2015-08-10
The enzyme-linked probe hybridization chip is based on ligase-hybridizing probe chip technology, with the principle of using thio-primers for protection against enzyme digestion and lambda DNA exonuclease to cut the multiplex PCR products obtained from the sample under test into single strands for hybridization. The 5'-end amino-labeled probe was fixed onto the aldehyde chip and hybridized with the single-stranded PCR product, followed by the addition of a fluorescently modified probe that was then enzymatically ligated to the adjacent, substrate-bound probe in order to achieve highly specific, parallel, and high-throughput detection. Specificity and sensitivity testing demonstrated that enzyme-linked probe hybridization technology could be applied to the specific detection of eight genetic-modification events simultaneously, with a sensitivity reaching 0.1% and accurate, efficient, and stable results.
Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong
2017-07-05
Identifying safety signals by manually reviewing individual reports in large surveillance databases is time consuming; such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data-mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. It may therefore be expected that reporting rates of adverse events following flu vaccines (the number of reports for a specific vaccine-event combination divided by the number of reports for all vaccine-event combinations) vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together for safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events from 1990 to 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with type of vaccine, group of adverse events, and reporting time as the three dimensions. We propose a random-effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database, and showed that it had high statistical power to detect variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates across years suggest potential safety issues due to changes in vaccines, which require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity of reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of the reporting rate across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigation.
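The heterogeneity idea can be illustrated with a deliberately simplified Python test, reducing the paper's random-effects model to a chi-square homogeneity test on per-year counts (which this sketch does not claim to reproduce); all counts are invented:

    # Hedged sketch: do reporting rates of one vaccine-event pair vary across years?
    import numpy as np
    from scipy.stats import chi2_contingency

    # rows = reporting years; columns = (reports of the target vaccine-event pair,
    # all other reports that year)
    counts = np.array([
        [12, 4100],
        [30, 4350],
        [ 9, 3980],
        [41, 4600],
    ])
    chi2, p, dof, _ = chi2_contingency(counts)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")  # small p: rates vary across years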
Sengupta, Partha Pratim; Gloria, Jared N; Amato, Dahlia N; Amato, Douglas V; Patton, Derek L; Murali, Beddhu; Flynt, Alex S
2015-10-12
Detection of specific RNA or DNA molecules by hybridization to "probe" nucleic acids via complementary base-pairing is a powerful method for analysis of biological systems. Here we describe a strategy for transducing hybridization events through modulating intrinsic properties of the electroconductive polymer polyaniline (PANI). When DNA-based probes electrostatically interact with PANI, its fluorescence properties are increased, a phenomenon that can be enhanced by UV irradiation. Hybridization of target nucleic acids results in dissociation of probes, causing PANI fluorescence to return to basal levels. By monitoring restoration of the base PANI fluorescence, as little as 10^(-11) M (10 pM) of target oligonucleotides could be detected within 15 min of hybridization. Detection of complementary oligos was specific, with introduction of a single mismatch failing to form a target-probe duplex that would dissociate from PANI. Furthermore, this approach is robust and is capable of detecting specific RNAs in extracts from animals. This sensor system improves on previously reported strategies by transducing highly specific probe dissociation events through intrinsic properties of a conducting polymer without the need for additional labels.
2013-01-01
Background The fovea, which is the most sensitive part of the retina, is known to have birefringent properties, i.e. it changes the polarization state of light upon reflection. Existing devices use this property to obtain information on the orientation of the fovea and the direction of gaze. Such devices employ specific frequency components that appear during moments of fixation on a target. To detect them, previous methods have used solely the power spectrum of the Fast Fourier Transform (FFT), which, unfortunately, is an integral method, and does not give information as to where exactly the events of interest occur. With very young patients who are not cooperative enough, this presents a problem, because central fixation may be present only during very short-lasting episodes, and can easily be missed by the FFT. Method This paper presents a method for detecting short-lasting moments of central fixation in existing devices for retinal birefringence scanning, with the goal of a reliable detection of eye alignment. Signal analysis is based on the Continuous Wavelet Transform (CWT), which reliably localizes such events in the time-frequency plane. Even though the characteristic frequencies are not always strongly expressed due to possible artifacts, simple topological analysis of the time-frequency distribution can detect fixation reliably. Results In all six subjects tested, the CWT allowed precise identification of both frequency components. Moreover, in four of these subjects, episodes of intermittent but definitely present central fixation were detectable, similar to those in Figure 4. A simple FFT is likely to treat them as borderline cases, or entirely miss them, depending on the thresholds used. Conclusion Joint time-frequency analysis is a powerful tool in the detection of eye alignment, even in a noisy environment. The method is applicable to similar situations, where short-lasting diagnostic events need to be detected in time series acquired by means of scanning some substrate along a specific path. PMID:23668264
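The time-localization argument can be reproduced in a few lines of Python with PyWavelets; the sampling rate, target frequency, scales and threshold are illustrative stand-ins for the device-specific values:

    # Hedged sketch: CWT localization of a short-lasting frequency component.
    import numpy as np
    import pywt

    fs = 200.0
    rng = np.random.default_rng(5)
    t = np.arange(0, 10, 1 / fs)
    x = 0.3 * rng.standard_normal(t.size)
    fixation = (t > 4) & (t < 5)                    # a 1-s episode of a 20 Hz component
    x[fixation] += np.sin(2 * np.pi * 20 * t[fixation])

    coefs, freqs = pywt.cwt(x, np.arange(2, 64), "morl", sampling_period=1 / fs)
    row = np.argmin(np.abs(freqs - 20.0))           # scale closest to the target frequency
    power = np.abs(coefs[row]) ** 2
    hit = t[power > 5 * power.mean()]
    if hit.size:
        print(f"20 Hz component roughly from {hit.min():.2f} s to {hit.max():.2f} s")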
Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility expressed as relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method is thus applicable for practical analyses for the detection and quantification of MON87701.
Jacchia, Sara; Nardini, Elena; Bassani, Niccolò; Savini, Christian; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-05-27
This article describes the international validation of the quantitative real-time polymerase chain reaction (PCR) detection method for Golden Rice 2. The method consists of a taxon-specific assay amplifying a fragment of rice Phospholipase D α2 gene, and an event-specific assay designed on the 3' junction between transgenic insert and plant DNA. We validated the two assays independently, with absolute quantification, and in combination, with relative quantification, on DNA samples prepared in haploid genome equivalents. We assessed trueness, precision, efficiency, and linearity of the two assays, and the results demonstrate that both the assays independently assessed and the entire method fulfill European and international requirements for methods for genetically modified organism (GMO) testing, within the dynamic range tested. The homogeneity of the results of the collaborative trial between Europe and Asia is a good indicator of the robustness of the method.
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.
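The predictive state-assignment step lends itself to a compact Python sketch: each learned event state is summarized as a multivariate Gaussian over iEEG features, and each new feature window goes to the state with the highest log-density. The two states and their parameters below are invented for illustration:

    # Hedged sketch: Gaussian event-state assignment for feature windows.
    from scipy.stats import multivariate_normal

    states = {
        "background": multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.0], [0.0, 1.0]]),
        "seizure_onset": multivariate_normal(mean=[3.0, 2.5], cov=[[0.5, 0.1], [0.1, 0.5]]),
    }

    def assign_state(feature_vec):
        return max(states, key=lambda s: states[s].logpdf(feature_vec))

    print(assign_state([0.2, -0.1]))   # -> background
    print(assign_state([2.8, 2.9]))    # -> seizure_onset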
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Novak, A; Zeng, J
Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness. More hard data are needed. The purpose of this study was to quantify the effectiveness of physics review with the goal of improving it. Methods: This study analyzed 315 "potentially serious" near-miss incidents within an institutional incident learning system collected over a two-year period. 139 of these originated prior to physics review and were found at the review or after. Incidents were classified as events that: 1)were detected by physics review, 2)could have been detected (but were not), and 3)could not have been detected. Category 1 and 2 events were classified by which specific check (within physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review; although, 42/73 (58%) were not actually detected. 45/73 (62%) errors originated in treatment planning, making physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying isocenter (7/73). Software-based plan checking systems were evaluated and found to have potential effectiveness of 40%. Given current data structures, software implementations of some tests such as isocenter verification check would be challenging. Conclusion: Physics plan review is a key safety measure and can detect majority of reported events. However, a majority of events that potentially could have been detected were NOT detected in this study, indicating the need to improve the performance of physics review.
N-glycosylation of plant recombinant pharmaceuticals.
Bardor, Muriel; Cabrera, Gleysin; Stadlmann, Johannes; Lerouge, Patrice; Cremata, José A; Gomord, Véronique; Fitchette, Anne-Catherine
2009-01-01
N-glycosylation is a maturation event necessary for the correct function, efficiency, and stability of a large number of biopharmaceuticals. The chapter presented here proposes various methods to determine whether, how, and where a plant recombinant pharmaceutical is N-glycosylated. These methods rely on blot detection with glycan-specific probes, specific deglycosylation of glycoproteins followed by mass spectrometry, N-glycan profile analysis, and glycopeptide identification by LC-MS.
Using waveform cross correlation for automatic recovery of aftershock sequences
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail
2017-04-01
Aftershock sequences of the largest earthquakes are difficult to recover. There can be several hundred mid-sized aftershocks per hour, within a few hundred km of each other, recorded by the same stations. Moreover, these events generate thousands of reflected/refracted phases with azimuth and slowness close to those of the P-waves. Therefore, aftershock sequences with thousands of events represent a major challenge for automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Standard methods of detection and phase association do not use all the information contained in the signals. As a result, wrong association of first and later phases, both regular and site-specific, produces an enormous number of wrong event hypotheses and destroys valid event hypotheses in automatic IDC processing. In turn, the IDC analysts have to reject false hypotheses and recreate valid ones, wasting precious human resources. At the current level of IDC catalogue completeness, the method of waveform cross correlation (WCC) can resolve most detection and association problems by fully utilizing the similarity of waveforms generated by aftershocks. Array seismic stations of the International Monitoring System (IMS) can enhance the performance of the WCC method: they reduce station-specific detection thresholds, allow accurate estimation of signal attributes, including relative magnitude, and effectively suppress irrelevant arrivals. We have developed and tested a prototype of an aftershock tool matching all IDC processing requirements and merged it with the current IDC pipeline. This tool includes the creation of master events consisting of real or synthetic waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on the CC traces into event hypotheses; building of events matching the IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is a starting point for interactive analysis with standard tools. We present selected results for the largest earthquakes, such as Sumatra 2004 and Tohoku 2011, as well as for several smaller events with hundreds of aftershocks. The sensitivity and resolution of the aftershock tool are demonstrated on the example of an mb=2.2 aftershock found after the September 9, 2016 DPRK test.
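The core matched-filter operation behind such a tool is a sliding normalized cross-correlation of the continuous data against a master-event template. A hedged Python sketch follows (single channel, brute-force loop, illustrative threshold; production systems work in the frequency domain over many stations):

    # Hedged sketch: normalized cross-correlation scan against one template.
    import numpy as np

    def ncc_scan(data, template, thresh=0.8):
        n = template.size
        tpl = (template - template.mean()) / (template.std() + 1e-12)
        hits = []
        for start in range(data.size - n):
            w = data[start:start + n]
            w = (w - w.mean()) / (w.std() + 1e-12)
            cc = float(np.dot(w, tpl)) / n
            if cc > thresh:
                hits.append((start, round(cc, 3)))
        return hits

    rng = np.random.default_rng(4)
    master = np.sin(2 * np.pi * 3 * np.arange(0, 2, 0.01)) * np.hanning(200)
    stream = 0.2 * rng.standard_normal(10000)
    stream[6400:6600] += 0.8 * master          # a smaller repeat of the master event
    print(ncc_scan(stream, master))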
Mull, Hillary J; Borzecki, Ann M; Loveland, Susan; Hickson, Kathleen; Chen, Qi; MacDonald, Sally; Shin, Marlena H; Cevasco, Marisa; Itani, Kamal M F; Rosen, Amy K
2014-04-01
The Patient Safety Indicators (PSIs) use administrative data to screen for select adverse events (AEs). In this study, VA Surgical Quality Improvement Program (VASQIP) chart review data were used as the gold standard to measure the criterion validity of 5 surgical PSIs. Independent chart review was also used to determine reasons for PSI errors. The sensitivity, specificity, and positive predictive value of PSI software version 4.1a were calculated among Veterans Health Administration hospitalizations (2003-2007) reviewed by VASQIP (n = 268,771). Nurses re-reviewed a sample of hospitalizations for which PSI and VASQIP AE detection disagreed. Sensitivities ranged from 31% to 68%, specificities from 99.1% to 99.8%, and positive predictive values from 31% to 72%. Reviewers found that coding errors accounted for some PSI-VASQIP disagreement; some disagreement was also the result of differences in AE definitions. These results suggest that the PSIs have moderate criterion validity; however, some surgical PSIs detect different AEs than VASQIP. Future research should explore using both methods to evaluate surgical quality. Published by Elsevier Inc.
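For reference, the three reported metrics follow directly from a 2x2 table of PSI flags against chart review; the counts in this Python snippet are invented, chosen only to land in the ranges quoted above:

    # Hedged worked example: criterion-validity metrics from a confusion table.
    tp, fp, fn, tn = 120, 90, 140, 268421   # invented counts

    sensitivity = tp / (tp + fn)            # flagged among chart-confirmed AEs
    specificity = tn / (tn + fp)            # unflagged among hospitalizations without the AE
    ppv = tp / (tp + fp)                    # chart-confirmed AEs among flagged

    print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.2%}, ppv={ppv:.1%}")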
NASA Astrophysics Data System (ADS)
Yuki, Akiyama; Satoshi, Ueyama; Ryosuke, Shibasaki; Adachi, Ryuichiro
2016-06-01
In this study, we developed a method to detect a sudden population concentration on a certain day and in a certain area, that is, an "Event," across Japan in 2012, using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, the areas and days where Events occurred were detected by aggregating these stay locations into 1-km-square grid polygons. Finally, the proposed method detected the Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing its results with light intensities obtained from DMSP/OLS night-light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
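A hedged Python sketch of the aggregation-and-flagging step (the grid origin, per-cell baseline, and 5-sigma rule are illustrative; the study's screening of year-round recurring Events is not reproduced):

    # Hedged sketch: flag cell-days whose stay counts far exceed the cell's typical level.
    import numpy as np
    from collections import defaultdict

    def detect_events(stays, n_sigma=5.0):
        """stays: iterable of (day, x_m, y_m) stay locations in metres."""
        counts = defaultdict(lambda: defaultdict(int))       # cell -> day -> count
        for day, x, y in stays:
            cell = (int(x // 1000), int(y // 1000))          # 1-km-square grid
            counts[cell][day] += 1
        events = []
        for cell, per_day in counts.items():
            vals = np.array(list(per_day.values()), dtype=float)
            mu, sd = vals.mean(), vals.std() + 1e-9
            events += [(day, cell) for day, c in per_day.items() if c > mu + n_sigma * sd]
        return events

    rng = np.random.default_rng(3)
    stays = [(d, 1000 * rng.random(), 1000 * rng.random()) for d in range(365) for _ in range(20)]
    stays += [(200, 1000 * rng.random(), 1000 * rng.random()) for _ in range(400)]  # festival day
    print(detect_events(stays))   # -> [(200, (0, 0))]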
Self-similarity Clustering Event Detection Based on Triggers Guidance
NASA Astrophysics Data System (ADS)
Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan
Traditional methods for Event Detection and Characterization (EDC) treat event detection as a classification problem, using words as training samples. This can leave the classifier's positive and negative samples imbalanced, and it suffers from data sparseness when the corpus is small. Rather than classifying events with words as samples, this paper clusters events when judging event types. It uses self-similarity, under the guidance of event triggers, to converge on the value of K in the K-means algorithm, optimizing the clustering. Then, combining named entities and their relative position information, the method pins down the precise event type. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can readily be used in automatic text summarization, text retrieval, and topic detection and tracking.
Nadal, Anna; Coll, Anna; La Paz, Jose-Luis; Esteve, Teresa; Pla, Maria
2006-10-01
We present a novel multiplex PCR assay for the simultaneous detection of multiple transgenic events in maize. Initially, five PCR primer pairs, specific to events Bt11, GA21, MON810, and NK603 and to the Zea mays L. alcohol dehydrogenase gene, were included. The event specificity was based on amplification of transgene/plant genome flanking regions, i.e., the same targets as in validated real-time PCR assays. These short and similarly sized amplicons were selected to achieve high and similar amplification efficiency for all targets; however, their unambiguous identification was a technical challenge. We achieved a clear distinction by a novel capillary gel electrophoresis approach that combines identification by size and by color (CGE-SC). In one single step, all five targets were amplified and specifically labeled with three different fluorescent dyes. The assay was specific and displayed an LOD of 0.1% of each genetically modified organism (GMO). It is therefore adequate to fulfill legal thresholds established, e.g., in the European Union. Our CGE-SC-based strategy, in combination with an adequate labeling design, has the potential to simultaneously detect higher numbers of targets. As an example, we present the detection of up to eight targets in a single run. Multiplex PCR-CGE-SC only requires a conventional sequencer device and enables automation and high throughput. In addition, it proved to be transferable to a different laboratory. The number of authorized GMO events is rapidly growing, and the acreage of genetically modified (GM) varieties cultivated and commercialized worldwide is rapidly increasing. In this context, our multiplex PCR-CGE-SC can be suitable for screening GM contents in food.
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak-current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD, together with information on site signal-detection thresholds, the type of solution algorithm used, and range attenuation, to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection-efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, the modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event-density contour maps, and this application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented, and a new method for producing an analytical representation of the empirical PCD is also introduced.
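A hedged Monte Carlo rendering of the idea in Python (the lognormal PCD, 1/r attenuation, per-site threshold and 3-site solution rule are all illustrative assumptions, not the paper's parameterization):

    # Hedged sketch: probability that a flash at a given location yields a solution.
    import numpy as np

    rng = np.random.default_rng(1)
    sites = np.array([[0, 0], [100e3, 0], [0, 100e3], [100e3, 100e3]])  # site x, y in metres
    site_thresh = 4.0        # minimum site signal (arbitrary units)
    n_required = 3           # sites needed for a location solution

    def detection_probability(flash_xy, n_draws=20000):
        peak = rng.lognormal(mean=np.log(30.0), sigma=0.7, size=n_draws)  # PCD draws, kA
        r = np.linalg.norm(sites - np.asarray(flash_xy), axis=1)          # ranges, m
        signal = peak[:, None] * (100e3 / r[None, :])                     # 1/r attenuation
        return ((signal > site_thresh).sum(axis=1) >= n_required).mean()

    print(detection_probability([50e3, 50e3]))     # centre of the array: near 1
    print(detection_probability([400e3, 400e3]))   # far outside: noticeably lower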
Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio
2008-06-25
Many countries have introduced mandatory labeling requirements for foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based on the TaqMan probe chemistry has become the method most used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity, based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide-long) locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct, as well as on the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. The application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.
Manzanares-Palenzuela, C Lorena; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz
2015-06-15
Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
Pham, Thuy T; Moore, Steven T; Lewis, Simon John Geoffrey; Nguyen, Diep N; Dutkiewicz, Eryk; Fuglevand, Andrew J; McEwan, Alistair L; Leong, Philip H W
2017-11-01
Freezing of gait (FoG) is common in Parkinsonian gait and strongly relates to falls. Current clinical FoG assessments are patients' self-report diaries and experts' manual video analysis. Both are subjective and yield moderate reliability. Existing detection algorithms have been predominantly designed in subject-dependent settings. In this paper, we aim to develop an automated FoG detector for subject-independent settings. After extracting highly relevant features, we apply anomaly detection techniques to detect FoG events. Specifically, feature selection is performed using correlation and clusterability metrics. From a list of 244 feature candidates, 36 candidates were selected using saliency and robustness criteria. We develop an anomaly score detector with adaptive thresholding to identify FoG events. Then, using accuracy metrics, we reduce the feature list to seven candidates. Our novel multichannel freezing index was the most selective across all window sizes, achieving sensitivity (specificity) of (). On the other hand, the freezing index from the vertical axis was the best choice for a single input, achieving sensitivity (specificity) of () for ankle and () for back sensors. Our subject-independent method is not only significantly more accurate than those previously reported, but also uses a much smaller window (e.g., versus ) and/or lower tolerance (e.g., versus ).
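One widely used single-channel building block for FoG detection is a freezing index, the power ratio between a "freeze" band and the locomotor band; a hedged Python sketch follows (the band edges follow the common convention in the accelerometry literature; the paper's multichannel index and adaptive thresholding are not reproduced):

    # Hedged sketch: freezing index from an accelerometer axis via a Welch PSD.
    import numpy as np
    from scipy.signal import welch

    def freezing_index(acc, fs):
        f, pxx = welch(acc, fs=fs, nperseg=min(len(acc), 256))
        freeze = pxx[(f >= 3) & (f < 8)].sum()        # trembling-in-place band
        locomotor = pxx[(f >= 0.5) & (f < 3)].sum()   # normal stride band
        return freeze / (locomotor + 1e-12)

    fs = 64.0
    t = np.arange(0, 4, 1 / fs)
    walking = np.sin(2 * np.pi * 1.5 * t)     # stride-dominated signal: low index
    freezing = np.sin(2 * np.pi * 6.0 * t)    # freeze-band oscillation: high index
    print(freezing_index(walking, fs), freezing_index(freezing, fs))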
Freedman, Kevin J; Bastian, Arangassery R; Chaiken, Irwin; Kim, Min Jun
2013-03-11
Protein conjugation provides a unique look into many biological phenomena and has been used for decades for molecular recognition purposes. In this study, the use of solid-state nanopores for the detection of gp120-associated complexes is investigated. These complexes exhibit monovalent and multivalent binding to anti-gp120 antibody monomers and dimers. In order to investigate the feasibility of many practical applications related to nanopores, the detection of specific protein complexes is attempted within a heterogeneous protein sample, and the role of voltage on complexed proteins is studied. It is found that the electric field within the pore can result in unbinding of a freely translocating protein complex within the transient event durations measured experimentally. The strong dependence of the unbinding time on voltage can be used to improve the detection capability of the nanopore system by adding an additional level of specificity that can be probed. These data provide a strong framework for future protein-specific detection schemes, which are shown to be feasible in the realm of a 'real-world' sample and with an automated multidimensional method of detecting events. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
JRC GMO-Matrix: a web application to support Genetically Modified Organisms detection strategies.
Angers-Loustau, Alexandre; Petrillo, Mauro; Bonfini, Laura; Gatto, Francesco; Rosa, Sabrina; Patak, Alexandre; Kreysa, Joachim
2014-12-30
The polymerase chain reaction (PCR) is the current state of the art technique for DNA-based detection of Genetically Modified Organisms (GMOs). A typical control strategy starts by analyzing a sample for the presence of target sequences (GM-elements) known to be present in many GMOs. Positive findings from this "screening" are then confirmed with GM (event) specific test methods. A reliable knowledge of which GMOs are detected by combinations of GM-detection methods is thus crucial to minimize the verification efforts. In this article, we describe a novel platform that links the information of two unique databases built and maintained by the European Union Reference Laboratory for Genetically Modified Food and Feed (EU-RL GMFF) at the Joint Research Centre (JRC) of the European Commission, one containing the sequence information of known GM-events and the other validated PCR-based detection and identification methods. The new platform compiles in silico determinations of the detection of a wide range of GMOs by the available detection methods using existing scripts that simulate PCR amplification and, when present, probe binding. The correctness of the information has been verified by comparing the in silico conclusions to experimental results for a subset of forty-nine GM events and six methods. The JRC GMO-Matrix is unique for its reliance on DNA sequence data and its flexibility in integrating novel GMOs and new detection methods. Users can mine the database using a set of web interfaces that thus provide a valuable support to GMO control laboratories in planning and evaluating their GMO screening strategies. The platform is accessible at http://gmo-crl.jrc.ec.europa.eu/jrcgmomatrix/ .
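The in silico determination at the heart of the platform can be caricatured in Python: look for a forward primer and the reverse complement of a reverse primer on the event sequence, facing each other within a plausible amplicon length. Real engines also tolerate mismatches and check probe binding; all sequences below are toy data:

    # Hedged sketch: naive exact-match in silico PCR.
    COMP = str.maketrans("ACGT", "TGCA")

    def revcomp(seq):
        return seq.translate(COMP)[::-1]

    def in_silico_pcr(template, fwd, rev, max_amplicon=400):
        """Return the predicted amplicon length, or None if the primer pair fails."""
        i = template.find(fwd)                 # forward primer on the top strand
        j = template.find(revcomp(rev))        # reverse primer site on the top strand
        if i < 0 or j < 0:
            return None
        length = j + len(rev) - i
        return length if 0 < length <= max_amplicon else None

    template = "AAGGTTCCAAGCTTGGATCCTTAAGGCCGGAATTCGAGCTCGGTACCC"
    print(in_silico_pcr(template, fwd="AAGGTTCC", rev="GGTACCGAGCTC"))  # -> 47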
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2014-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility expressed as relative standard deviation (RSDr). The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggest that the limit of quantitation of the method is 0.5% and that the developed method is suitable for practical analyses for the detection and quantification of MIR162.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ=-0.028), and between the DELTA and embedded feature selection approaches was 88.1% (349 of 396, κ=-0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
Nonparametric Bayesian clustering to detect bipolar methylated genomic loci.
Wu, Xiaowei; Sun, Ming-An; Zhu, Hongxiao; Xie, Hehuang
2015-01-16
With recent development in sequencing technology, a large number of genome-wide DNA methylation studies have generated massive amounts of bisulfite sequencing data. The analysis of DNA methylation patterns helps researchers understand epigenetic regulatory mechanisms. Highly variable methylation patterns reflect stochastic fluctuations in DNA methylation, whereas well-structured methylation patterns imply deterministic methylation events. Among these methylation patterns, bipolar patterns are important as they may originate from allele-specific methylation (ASM) or cell-specific methylation (CSM). Utilizing nonparametric Bayesian clustering followed by hypothesis testing, we have developed a novel statistical approach to identify bipolar methylated genomic regions in bisulfite sequencing data. Simulation studies demonstrate that the proposed method achieves good performance in terms of specificity and sensitivity. We used the method to analyze data from mouse brain and human blood methylomes. The bipolar methylated segments detected are found highly consistent with the differentially methylated regions identified by using purified cell subsets. Bipolar DNA methylation often indicates epigenetic heterogeneity caused by ASM or CSM. With allele-specific events filtered out or appropriately taken into account, our proposed approach sheds light on the identification of cell-specific genes/pathways under strong epigenetic control in a heterogeneous cell population.
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges, and suggested that the intelligence process, integrating forensic data, could be a valid framework for systematic follow-up and analysis, provided it is adapted to the specificities of repetitive deliberate fires. In this manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and pave the way for a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need for, and benefit of, increased collection of forensic information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
2012-01-01
Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias and use of individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery, among others. Many of the references reviewed included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of reviews (70/166), observational studies only in 33.7% (56/166), and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, were found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms. Conclusions Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
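One of the sparse-data issues the review highlights, zero events in one arm, is often handled with a continuity correction before pooling. A hedged sketch of one common approach (not a recommendation drawn from the review; all trial counts are hypothetical): an inverse-variance pooled odds ratio with a 0.5 correction applied to any table containing a zero cell.

```python
import numpy as np

def pooled_or_cc(tables, cc=0.5):
    """Inverse-variance pooled odds ratio with a continuity correction.

    tables: list of (a, b, c, d) counts -- events/non-events in the
    treatment and control arms. cc is added to every cell of any table
    containing a zero (one common, if debated, fix for sparse data).
    """
    logs, weights = [], []
    for a, b, c, d in tables:
        if 0 in (a, b, c, d):
            a, b, c, d = (x + cc for x in (a, b, c, d))
        logs.append(np.log(a * d / (b * c)))
        weights.append(1 / (1 / a + 1 / b + 1 / c + 1 / d))  # 1 / variance
    pooled = np.average(logs, weights=weights)
    se = 1 / np.sqrt(sum(weights))
    return np.exp(pooled), (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))

# Hypothetical trials, one with zero events in the treatment arm
print(pooled_or_cc([(3, 97, 1, 99), (0, 50, 2, 48), (5, 195, 3, 197)]))
```

Double-zero trials contribute no information to the odds ratio and are often excluded altogether, which is one reason the review calls for clearer guidance on this point.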
Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo
2005-06-01
Investigations of the validity of labeling of genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods on foreign-made processed foods made from corn and potato purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods: MON810 was detected in 11 of the 12 samples, and Bt11 in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the qualitatively GM-positive maize samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%; for this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one of 21 potato-processed foods, although GM potatoes were not detected in any sample.
Neural network pattern recognition of lingual-palatal pressure for automated detection of swallow.
Hadley, Aaron J; Krival, Kate R; Ridgel, Angela L; Hahn, Elizabeth C; Tyler, Dustin J
2015-04-01
We describe a novel device and method for real-time measurement of lingual-palatal pressure and automatic identification of the oral transfer phase of deglutition. Clinical measurement of the oral transport phase of swallowing is a complicated process requiring either placement of obstructive sensors or sitting within a fluoroscope or articulograph for recording. Existing detection algorithms distinguish oral events with EMG, sound, and pressure signals from the head and neck, but are imprecise and frequently result in false detections. We placed seven pressure sensors on a molded mouthpiece fitting over the upper teeth and hard palate and recorded pressure during a variety of swallow and non-swallow activities. Pressure measures and swallow times from 12 healthy and 7 Parkinson's subjects provided training data for a time-delay artificial neural network to categorize the recordings as swallow or non-swallow events. User-specific neural networks properly categorized 96% of swallow and non-swallow events, while a generalized population-trained network properly categorized 93% of swallow and non-swallow events across all recordings. Lingual-palatal pressure signals are sufficient to selectively and specifically recognize the initiation of swallowing in healthy and dysphagic patients.
Chan, Cheng Leng; Rudrappa, Sowmya; Ang, Pei San; Li, Shu Chuen; Evans, Stephen J W
2017-08-01
The ability to detect safety concerns from spontaneous adverse drug reaction reports in a timely and efficient manner remains important in public health. This paper explores the behaviour of the Sequential Probability Ratio Test (SPRT) and its ability to detect signals of disproportionate reporting (SDRs) in the Singapore context. We used SPRT with a combination of two hypothesised relative risks (hRRs), 2 and 4.1, to detect signals of both common and rare adverse events in our small database. We compared SPRT with other methods in terms of the number of signals detected and whether labelled adverse drug reactions were detected or the reaction terms were considered serious. The other methods used were the reporting odds ratio (ROR), Bayesian Confidence Propagation Neural Network (BCPNN) and Gamma Poisson Shrinker (GPS). The SPRT produced 2187 signals in common with all methods, 268 unique signals, and 70 signals in common with at least one other method; it did not produce signals in 178 cases where two other methods detected them, and 403 signals were unique to one of the other methods. In terms of sensitivity, ROR performed better than the other methods, but SPRT found more new signals. The performances of the methods were similar for negative predictive value and specificity. Using a combination of hRRs for SPRT could be a useful screening tool for regulatory agencies, and more detailed investigation of the medical utility of the system is merited.
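A simplified sketch of the core SPRT computation under a Poisson model for observed versus expected report counts, using the paper's two hypothesised relative risks; the boundary constants and example counts are illustrative assumptions, and the agency's actual implementation may differ.

```python
import math

def sprt_llr(n_obs, n_expected, h_rr):
    """Poisson SPRT log-likelihood ratio of RR = h_rr versus RR = 1."""
    return n_obs * math.log(h_rr) - n_expected * (h_rr - 1)

def sprt_signal(n_obs, n_expected, h_rrs=(2.0, 4.1), alpha=0.05, beta=0.2):
    """Flag a drug-event pair if the LLR crosses the upper Wald boundary
    for any hypothesised relative risk (mirroring the paper's use of 2 and 4.1)."""
    upper = math.log((1 - beta) / alpha)
    return any(sprt_llr(n_obs, n_expected, h) >= upper for h in h_rrs)

print(sprt_signal(n_obs=9, n_expected=3.2))  # True: ~2.8x the expected count
print(sprt_signal(n_obs=4, n_expected=3.6))  # False: close to expectation
```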
Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems
2009-03-01
using the composite event detection method [Kerman, Jiang, Blumberg, and Buttrey, 2009]. Although the techniques and utility of the ... aforementioned method have been clearly demonstrated, there is still much work and research to be conducted within the realm of event detection. This ... detection methods. The paragraphs that follow summarize the discoveries of and lessons learned by multiple researchers and authors over many ...
Konry, Tania; Yarmush, Joel M.; Irimia, Daniel
2011-01-01
With advances in immunology and cancer biology, there is an unmet need for increasingly sensitive systems to monitor the expression of specific cell markers for the development of new diagnostic and therapeutic tools. To address this challenge, we have applied a highly sensitive labeling method that translates antigen-antibody recognition processes into a DNA detection event that can be greatly amplified via isothermal Rolling Circle Amplification (RCA). By merging the single-molecule detection power of the RCA reaction with microfluidic technology, we were able to demonstrate that identification of specific protein markers can be achieved on the tumor cell surface in miniaturized nano-liter reaction droplets. Furthermore, this combined approach of signal amplification in a microfluidic format could extend the utility of existing methods by reducing sample and reagent consumption and enhancing sensitivities and specificities for various applications, including early diagnosis of cancer. PMID:21294269
Queirós, R B; Gouveia, C; Fernandes, J R A; Jorge, P A S
2014-12-15
An evanescent wave fiber optic sensor for detection of Escherichia coli (E. coli) outer membrane proteins (EcOMPs) using long period gratings (LPGs) as a refractometric platform is presented. The sensing probes were obtained by functionalization of LPGs inscribed in single mode fiber using two different immobilization methods: electrostatic assembly and covalent binding. The resulting label-free configuration enabled the specific recognition of EcOMPs in water by monitoring the resonance wavelength shift due to refractive index changes induced by binding events. The sensors displayed linear responses in the range of 0.1 nM to 10 nM EcOMPs with sensitivities of -0.1563±0.005 nm decade(-1) [EcOMP, M] (electrostatic method) and -0.1597±0.004 nm decade(-1) [EcOMP, M] (covalent method). The devices could be regenerated (under low pH conditions) with a deviation of less than 0.1% for at least three subsequent detection events. The sensors were also applied to spiked environmental water samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab
2017-01-01
As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as PCR (polymerase chain reaction) and enzyme-linked immunosorbent assay (ELISA) have routinely been employed for DNA- and protein-based quantification, respectively. Although these techniques are convenient and productive, there is a need for more advanced technologies that allow high-throughput detection and quantification of GM events, as the production of more complex GMOs increases day by day. Therefore, recent approaches such as microarray, capillary gel electrophoresis, digital PCR and next generation sequencing are more promising due to their accuracy and precise detection of transgenic contents. The present article is a brief comparative study of all such detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. However, these emerging technologies still have much to address: detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins are critical issues to be resolved in the future. PMID:29085378
Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.
Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse
2017-03-24
Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
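The algorithms themselves are threshold rules applied to kinematic signals. As a flavour of the approach (not the paper's algorithms: the specific signals, thresholds, and use of vertical marker velocities here are assumptions), a toy detector for IC and TO:

```python
import numpy as np

def detect_ic_to(heel_z_vel, toe_z_vel, fs, thresh=0.05):
    """Toy threshold detector for initial contact (IC) and toe off (TO).

    heel_z_vel, toe_z_vel: vertical marker velocities (m/s). IC is flagged
    where the heel's downward velocity recovers past -thresh after impact;
    TO where the toe's upward velocity first exceeds +thresh.
    """
    ic = np.flatnonzero((heel_z_vel[:-1] < -thresh) & (heel_z_vel[1:] >= -thresh))
    to = np.flatnonzero((toe_z_vel[:-1] <= thresh) & (toe_z_vel[1:] > thresh))
    return ic / fs, to / fs  # event times in seconds

# One synthetic gait cycle at 100 Hz: heel dips at ~0.1 s, toe rises at ~0.7 s
fs = 100
t = np.arange(0, 1.2, 1 / fs)
heel = -0.3 * np.exp(-((t - 0.1) / 0.05) ** 2)
toe = 0.4 * np.exp(-((t - 0.7) / 0.05) ** 2)
print(detect_ic_to(heel, toe, fs))
```

Because such rules operate sample-by-sample on signals already available to the controller, they add essentially no computational burden, which is the property the paper emphasizes for NR/NP use.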
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy.
Gopan, Olga; Zeng, Jing; Novak, Avrey; Nyflot, Matthew; Ford, Eric
2016-09-01
The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of the three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review. Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
Combining functionalised nanoparticles and SERS for the detection of DNA relating to disease.
Graham, Duncan; Stevenson, Ross; Thompson, David G; Barrett, Lee; Dalton, Colette; Faulds, Karen
2011-01-01
DNA functionalised nanoparticle probes offer new opportunities in analyte detection. Ultrasensitive, molecularly specific targeting of analytes is possible through the use of metallic nanoparticles and their ability to generate a surface enhanced Raman scattering (SERS) response. This is leading to a new range of diagnostic clinical probes based on SERS detection. Our approaches have shown how such probes can detect specific DNA sequences by using a biomolecular recognition event to 'turn on' a SERS response through a controlled assembly process of the DNA functionalised nanoparticles. Further, we have prepared DNA aptamer functionalised SERS probes and demonstrated how introduction of a protein target can change the aggregation state of the nanoparticles in a dose-dependent manner. These approaches are being used as methods to detect biomolecules that indicate the presence of a specific disease, with a view to improving disease management.
van Holle, Lionel; Bauchau, Vincent
2014-01-01
Purpose Disproportionality methods measure how unexpected the observed number of adverse events is. Time-to-onset (TTO) methods measure how unexpected the TTO distribution of a vaccine-event pair is compared with what is expected from other vaccines and events. Our purpose is to compare the performance associated with each method. Methods For the disproportionality algorithms, we defined 336 combinations of stratification factors (sex, age, region and year) and threshold values of the multi-item gamma Poisson shrinker (MGPS). For the TTO algorithms, we defined 18 combinations of significance level and time windows. We used spontaneous reports of adverse events recorded for eight vaccines. The vaccine product labels were used as proxies for true safety signals. Algorithms were ranked according to their positive predictive value (PPV) for each vaccine separately; amedian rank was attributed to each algorithm across vaccines. Results The algorithm with the highest median rank was based on TTO with a significance level of 0.01 and a time window of 60 days after immunisation. It had an overall PPV 2.5 times higher than for the highest-ranked MGPS algorithm, 16th rank overall, which was fully stratified and had a threshold value of 0.8. A TTO algorithm with roughly the same sensitivity as the highest-ranked MGPS had better specificity but longer time-to-detection. Conclusions Within the scope of this study, the majority of the TTO algorithms presented a higher PPV than for any MGPS algorithm. Considering the complementarity of TTO and disproportionality methods, a signal detection strategy combining them merits further investigation. PMID:24038719
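With the product label as the truth proxy, the ranking metric reduces to a per-vaccine positive predictive value. A toy sketch (all event names and detections are hypothetical, not the study's data):

```python
def ppv(detected, labelled):
    """Positive predictive value: fraction of detected signals that are
    'true', i.e. listed on the product label (the paper's proxy)."""
    return sum(s in labelled for s in detected) / len(detected)

label = {"fever", "injection-site pain", "rash"}
tto_hits = ["fever", "rash", "headache"]
mgps_hits = ["fever", "headache", "dizziness", "nausea"]
ranking = sorted({"TTO": ppv(tto_hits, label),
                  "MGPS": ppv(mgps_hits, label)}.items(),
                 key=lambda kv: -kv[1])
print(ranking)  # TTO 0.67 vs MGPS 0.25 in this invented example
```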
Jiang, Lingxi; Yang, Litao; Rao, Jun; Guo, Jinchao; Wang, Shu; Liu, Jia; Lee, Seonghun; Zhang, Dabing
2010-02-01
To implement genetically modified organism (GMO) labeling regulations, event-specific analysis based on the junction sequence between the exogenous integration and host genomic DNA has become the preferred approach for GMO identification and quantification. In this study, specific primers and TaqMan probes based on the revealed 5'-end junction sequence of GM cotton MON15985 were designed, and qualitative and quantitative polymerase chain reaction (PCR) assays were established employing the designed primers and probes. In the qualitative PCR assay, the limit of detection (LOD) was 0.5 g kg(-1) in 100 ng of total cotton genomic DNA, corresponding to about 17 copies of the haploid cotton genome, and the LOD and limit of quantification (LOQ) for the quantitative PCR assay were 10 and 17 copies of the haploid cotton genome, respectively. Furthermore, the developed quantitative PCR assays were validated in-house by five different researchers. Five practical samples with known GM contents were also quantified using the developed PCR assay during in-house validation, and the bias between the true and quantified values ranged from 2.06% to 12.59%. This study shows that the developed qualitative and quantitative PCR methods are applicable for the identification and quantification of GM cotton MON15985 and its derivatives.
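The copy-number figure follows from simple mass arithmetic. A worked sketch, assuming a cotton 1C genome size of roughly 2.9 pg (an assumed literature value, not stated in the abstract), which reproduces the ~17 copies quoted for the qualitative LOD:

```python
def haploid_copies(total_dna_ng, mass_fraction, genome_pg_per_1c):
    """Approximate haploid genome copies of a GM target in a DNA aliquot.

    mass_fraction: GM content as a fraction of total DNA mass.
    genome_pg_per_1c: 1C genome size in picograms (assumed value below).
    """
    target_pg = total_dna_ng * 1000 * mass_fraction
    return target_pg / genome_pg_per_1c

# 0.5 g/kg (0.05%) GM cotton in 100 ng total DNA, assuming ~2.9 pg per 1C
print(round(haploid_copies(100, 0.0005, 2.9)))  # -> 17
```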
Tai, Bee-Choo; Grundy, Richard G; Machin, David
2010-04-01
In trials designed to delay or avoid irradiation among children with malignant brain tumors, irradiation after disease progression is an important event, but patients who have disease progression may decline radiotherapy (RT), and those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We describe the summary of competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results obtained are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and the log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed a reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference (p = 0.331) between histologic subtypes, the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of a reduction in hazard among those with ependymoma (p = 0.005) than did Gray's test (p = 0.086). When evaluating treatment differences, failing to account for competing risks with appropriate methodology may lead to incorrect interpretations.
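The Kaplan-Meier overestimation the authors describe comes from the estimator itself: 1 - KM treats competing events as censoring, whereas the cumulative incidence function increments by the cause-specific hazard weighted by overall survival. A minimal nonparametric sketch of the latter (toy data, not the trial's):

```python
import numpy as np

def cuminc(times, events, cause):
    """Nonparametric cumulative incidence function for one event type.

    events: 0 = censored, positive integers = competing event codes.
    Each increment uses the cause-specific deaths at time t weighted by
    all-cause Kaplan-Meier survival just before t.
    """
    times, events = np.asarray(times, float), np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, cif, at_risk = 1.0, 0.0, len(times)
    out = []
    for t in np.unique(times):
        at_t = times == t
        d_cause = np.sum(at_t & (events == cause))
        d_all = np.sum(at_t & (events > 0))
        cif += surv * d_cause / at_risk
        surv *= 1 - d_all / at_risk  # all-cause KM update
        at_risk -= at_t.sum()
        out.append((t, round(cif, 3)))
    return out

# Toy data: cause 1 = RT after progression, cause 2 = a competing event
t = [2, 3, 3, 5, 7, 8, 9, 12]
e = [1, 2, 0, 1, 2, 1, 0, 2]
print(cuminc(t, e, cause=1))
```

Replacing the all-cause update with a cause-specific one recovers 1 - KM, which is always at least as large as the CIF, illustrating the overestimation.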
Detecting event-related changes in organizational networks using optimized neural network models.
Li, Ze; Sun, Duoyong; Zhu, Renqi; Lin, Zihan
2017-01-01
Changes in an organization's external behavior are caused by its internal structure and interactions. External behaviors are also known as the behavioral events of an organization. Detecting event-related changes in organizational networks can be used to efficiently monitor the dynamics of organizational behaviors. Although many different methods have been used to detect changes in organizational networks, these methods usually ignore the correlation between the internal structure and external events. Event-related change detection considers this correlation and can be used for event recognition based on social network modeling and supervised classification. Detecting event-related changes can be effective in providing early warnings and faster responses to both positive and negative organizational activities. In this study, event-related change in an organizational network was defined, and artificial neural network models were used to quantitatively determine whether and when a change occurred. To achieve higher accuracy, Back Propagation Neural Networks (BPNNs) were optimized using Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). We showed the feasibility of the proposed method by comparing its performance with that of other methods in two cases. The results suggested that the proposed method can identify organizational events based on the correlation between organizational networks and events, and that it has not only higher precision but also better robustness than previously used techniques.
Prins, Theo W; Scholtens, Ingrid M J; Bak, Arno W; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Laurensse, Emile J; Kok, Esther J
2016-12-15
During routine monitoring for GMOs in food in the Netherlands, papaya-containing food supplements were found positive for the genetically modified (GM) elements P-35S and T-nos. The goal of this study was to identify the unknown, EU-unauthorised GM papaya event(s). A screening strategy was applied using additional GM screening elements, including a newly developed PRSV coat protein PCR. The detected PRSV coat protein PCR product was sequenced, and the nucleotide sequence showed identity to PRSV YK strains indigenous to China and Taiwan. The GM events 16-0-1 and 18-2-4 could be identified by amplifying and sequencing event-specific sequences. Further analyses showed that both papaya event 16-0-1 and event 18-2-4 were transformed with the same construct. For use in routine analysis, derived TaqMan qPCR methods for events 16-0-1 and 18-2-4 were developed. Event 16-0-1 was detected in all samples tested, whereas event 18-2-4 was detected in one sample. This study presents a strategy for combining information from different sources (literature, patent databases) with novel sequence data to identify unknown GM papaya events. Copyright © 2016 Elsevier Ltd. All rights reserved.
Comprehensive GMO detection using real-time PCR array: single-laboratory validation.
Mano, Junichi; Harada, Mioko; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi; Nakamura, Kosuke; Akiyama, Hiroshi; Teshima, Reiko; Noritake, Hiromichi; Hatano, Shuko; Futo, Satoshi; Minegishi, Yasutaka; Iizuka, Tayoshi
2012-01-01
We have developed a real-time PCR array method to comprehensively detect genetically modified (GM) organisms. In this method, genomic DNA extracted from an agricultural product is analyzed using various qualitative real-time PCR assays on a 96-well PCR plate, targeting individual GM events, recombinant DNA (r-DNA) segments, taxon-specific DNAs, and the donor organisms of the respective r-DNAs. In this article, we report the single-laboratory validation of both the DNA extraction methods and the component PCR assays constituting the real-time PCR array. We selected DNA extraction methods for specified plant matrixes, i.e., maize flour, soybean flour, and ground canola seeds, and then evaluated the DNA quantity, DNA fragmentation, and PCR inhibition of the resultant DNA extracts. For the component PCR assays, we evaluated the specificity and LOD. All DNA extraction methods and component PCR assays satisfied the criteria set on the basis of previous reports.
Li, Xiang; Pan, Liangwen; Li, Junyi; Zhang, Qigang; Zhang, Shuya; Lv, Rong; Yang, Litao
2011-12-28
For implementation of the issued regulations and labeling policies for genetically modified organism (GMO) supervision, the polymerase chain reaction (PCR) method has been widely used due to its high specificity and sensitivity. In particular, the event-specific PCR method based on the flanking sequence of transgenes has become the primary trend. In this study, qualitative and quantitative PCR methods were established on the basis of the 5' flanking sequence of transgenic soybean A2704-12 and the 3' flanking sequence of transgenic soybean A5547-127, respectively. In qualitative PCR assays, the limits of detection (LODs) were 10 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127. In quantitative real-time PCR assays, the LODs were 5 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127, and the limits of quantification (LOQs) were 10 copies for both. Low bias and acceptable SD and RSD values were also achieved in the quantification of four blind samples using the developed real-time PCR assays. In addition, the developed PCR assays for the two transgenic soybean events were used for routine analysis of soybean samples imported to Shanghai in the 6-month period from October 2010 to March 2011. A total of 27 lots of soybean from the United States and Argentina were analyzed: 8 lots from the United States were found to contain the GM soybean A2704-12 event, with GM contents <1.5% in all eight analyzed lots. In contrast, no GM soybean A5547-127 content was found in any of the eight lots. These results demonstrate that the established event-specific qualitative and quantitative PCR methods can be used effectively in routine identification and quantification of GM soybeans A2704-12 and A5547-127 and their derived products.
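Quantitative event-specific assays like these are typically read off a standard curve of Ct against log10 copy number. A hedged sketch of the fitting and back-calculation (the dilution-series values are invented; the paper's calibrants will differ):

```python
import numpy as np

def fit_standard_curve(log10_copies, ct):
    """Fit Ct = slope * log10(copies) + intercept; also report PCR efficiency
    (an ideal slope of about -3.32 corresponds to ~100% efficiency)."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Back-calculate copy number of an unknown from its Ct value."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical plasmid dilution series: 10 to 100,000 copies per reaction
logc = np.array([1, 2, 3, 4, 5], float)
cts = np.array([34.9, 31.6, 28.2, 24.9, 21.6])
slope, intercept, eff = fit_standard_curve(logc, cts)
print(round(eff, 2))                               # ~1.0, i.e. ~100%
print(round(copies_from_ct(27.0, slope, intercept)))  # unknown sample
```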
Detection of Epileptic Seizure Event and Onset Using EEG
Ahammad, Nabeel; Fathima, Thasneem; Joseph, Paul
2014-01-01
This study proposes a method for automatic detection of epileptic seizure events and onset using wavelet-based features together with certain statistical features computed without wavelet decomposition. Normal and epileptic EEG signals were classified using a linear classifier. For seizure event detection, the Bonn University EEG database was used. Three types of EEG signals (recorded from healthy volunteers with eyes open, from epilepsy patients in the epileptogenic zone during a seizure-free interval, and from epilepsy patients during epileptic seizures) were classified. Features such as energy, entropy, standard deviation, maximum, minimum, and mean at different subbands were computed, and classification was done using the linear classifier. The performance of the classifier was determined in terms of specificity, sensitivity, and accuracy. The overall accuracy was 84.2%. For seizure onset detection, the CHB-MIT scalp EEG database was used. Along with wavelet-based features, the interquartile range (IQR) and mean absolute deviation (MAD) without wavelet decomposition were extracted. Latency was used to study the performance of seizure onset detection. The classifier gave a sensitivity of 98.5% with an average latency of 1.76 seconds. PMID:24616892
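A compact sketch of the non-wavelet part of such a pipeline: per-segment statistical and spectral features fed to a linear classifier. Synthetic signals stand in for the Bonn recordings, and logistic regression stands in for the paper's (unspecified) linear classifier; none of this is the authors' code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def segment_features(sig):
    """Energy, spectral entropy, and basic statistics of one EEG segment."""
    p = np.abs(np.fft.rfft(sig)) ** 2
    p = p / p.sum()
    spectral_entropy = -np.sum(p * np.log2(p + 1e-12))
    return [np.sum(sig ** 2), spectral_entropy,
            sig.std(), sig.max(), sig.min(), sig.mean()]

# Toy two-class problem: low-amplitude 'interictal' vs high-amplitude 'ictal'
rng = np.random.default_rng(1)
X = np.array([segment_features(a * rng.standard_normal(1024))
              for a in [1.0] * 50 + [3.0] * 50])
y = np.array([0] * 50 + [1] * 50)
print(cross_val_score(LogisticRegression(max_iter=2000), X, y, cv=5).mean())
```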
Hernández, Marta; Rodríguez-Lázaro, David; Zhang, David; Esteve, Teresa; Pla, Maria; Prat, Salomé
2005-05-04
The number of cultivated hectares and commercialized genetically modified organisms (GMOs) has increased exponentially in the past 9 years. Governments in many countries have established a policy of labeling all food and feed containing or produced by GMOs. Consequently, versatile, laboratory-transferable GMO detection methods are in increasing demand. Here, we describe a qualitative PCR-based multiplex method for simultaneous detection and identification of four genetically modified maize lines: Bt11, MON810, T25, and GA21. The described system is based on the use of five primers directed to specific sequences in these insertion events. Primers were used in a single optimized multiplex PCR reaction, and the sequences of the amplified fragments are reported. The assay allows amplification of the MON810 event from the 35S promoter to the hsp intron, yielding a 468 bp amplicon. Amplification of the Bt11 and T25 events from the 35S promoter to the PAT gene yielded two different amplicons of 280 and 177 bp, respectively, whereas amplification of the 5' flanking region of GA21 gave rise to an amplicon of 72 bp. These fragments are clearly distinguishable on agarose gels and have been reproduced successfully in a different laboratory. Hence, the proposed method comprises a rapid, simple, reliable, and sensitive (down to 0.05%) PCR-based assay, suitable for detection of these four GM maize lines in a single reaction.
Semiautomated TaqMan PCR screening of GMO labelled samples for (unauthorised) GMOs.
Scholtens, Ingrid M J; Molenaar, Bonnie; van Hoof, Richard A; Zaaijer, Stephanie; Prins, Theo W; Kok, Esther J
2017-06-01
In most countries, systems are in place to analyse food products for the potential presence of genetically modified organisms (GMOs), to enforce labelling requirements and to screen for the potential presence of unauthorised GMOs. With the growing number of GMOs on the world market, a larger diversity of methods is required for informative analyses. In this paper, the specificity of an extended screening set consisting of 32 screening methods to identify different crop species (endogenous genes) and GMO elements was verified against 59 different GMO reference materials. In addition, a cost- and time-efficient strategy for DNA isolation, screening and identification is presented. A module for semiautomated analysis of the screening results and planning of subsequent event-specific tests for identification has been developed. The Excel-based module contains information on the experimentally verified specificity of the element methods and of the EU authorisation status of the GMO events. If a detected GMO element cannot be explained by any of the events as identified in the same sample, this may indicate the presence of an unknown unauthorised GMO that may not yet have been assessed for its safety for humans, animals or the environment.
Jin, Dayong; Piper, James A.; Leif, Robert C.; Yang, Sean; Ferrari, Belinda C.; Yuan, Jingli; Wang, Guilan; Vallarino, Lidia M.; Williams, John W.
2009-03-01
A fundamental problem for rare-event cell analysis is auto-fluorescence from nontarget particles and cells. Time-gated flow cytometry is based on temporal-domain discrimination of long-lifetime (>1 μs) luminescence-stained cells and can render invisible all nontarget cells and particles. We aim to further evaluate the technique, focusing on detection of ultra-rare-event 5-μm calibration beads in dirt-laden environmental water samples. Europium-labeled 5-μm calibration beads with improved luminescence homogeneity and reduced aggregation were evaluated using the prototype UV LED excited time-gated luminescence (TGL) flow cytometer (FCM). A BD FACSAria flow cytometer was used to accurately sort a very low number of beads (<100 events), which were then spiked into concentrated samples of environmental water. The use of europium-labeled beads permitted the demonstration of specific detection rates of 100% ± 30% and 91% ± 3% with 10 and 100 target beads, respectively, mixed with over one million nontarget autofluorescent background particles. Under the same conditions, a conventional FCM was unable to recover rare-event fluorescein isothiocyanate (FITC) calibration beads. Preliminary results on Giardia detection are also reported. We have demonstrated the scientific value of lanthanide-complex biolabels in flow cytometry. This approach may augment the current method that uses multifluorescence-channel flow cytometry gating.
Reynen, Andrew; Audet, Pascal
2017-09-01
A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as the attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States is used to automatically detect 25 times more events than in the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with low false alarm rates, allowing for a larger automatic event catalogue with a high degree of trust.
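The classification step can be pictured as a regression over the two attributes. A hedged sketch (synthetic, linearly separable data; logistic regression as one plausible choice of regressor, not necessarily the authors'):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic blasts vs earthquakes described by two generalized waveform
# attributes: a polarization measure and a frequency-content index.
rng = np.random.default_rng(2)
n = 200
polarization = np.r_[rng.normal(0.3, 0.1, n), rng.normal(0.7, 0.1, n)]
freq_index = np.r_[rng.normal(2.0, 0.5, n), rng.normal(5.0, 0.8, n)]
X = np.c_[polarization, freq_index]
y = np.r_[np.zeros(n), np.ones(n)]  # 0 = earthquake, 1 = blast

clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))            # near-perfect on this separable toy data
print(clf.predict([[0.65, 4.5]])) # -> 1 (blast)
```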
Flanking sequence determination and event-specific detection of genetically modified wheat B73-6-1.
Xu, Junyi; Cao, Jijuan; Cao, Dongmei; Zhao, Tongtong; Huang, Xin; Zhang, Piqiao; Luan, Fengxia
2013-05-01
In order to establish a specific identification method for genetically modified (GM) wheat, the exogenous insert DNA and the flanking sequence between the exogenous fragment and the recombinant chromosome of GM wheat B73-6-1 were successfully acquired by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The newly acquired exogenous fragment covered the full-length sequence of the transformed genes, including the transformed plasmid and corresponding functional genes: the marker uidA, the herbicide-resistance gene bar, the ubiquitin promoter, and the high-molecular-weight glutenin subunit gene. The flanking sequence adjacent to the insert DNA showed high similarity to the Triticum turgidum A gene (GenBank: AY494981.1). A specific PCR detection method for GM wheat B73-6-1 was established on the basis of primers designed from the flanking sequence. This method was validated against GM wheat, GM corn, GM soybean, GM rice, and non-GM wheat; the specifically amplified target band was observed only in GM wheat B73-6-1. The method offers high specificity, high reproducibility, rapid identification, and excellent accuracy for GM wheat B73-6-1.
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
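The ME idea reduces to pooling single-event classifier outputs within a trial before making a decision. A minimal sketch (the probabilities, the averaging rule, and the 0.5 threshold are illustrative placeholders, not the paper's parameters):

```python
import numpy as np

def trial_is_error(event_probs, threshold=0.5):
    """Combine single-event error probabilities within one MI trial.

    event_probs: classifier outputs P(error) for each coin/barrier event
    in the trial. Averaging over events trades decision latency for a
    more reliable call than any single-event ErrP classification.
    """
    return float(np.mean(event_probs)) > threshold

# One trial with three events: noisy single-event outputs, clearer average
print(trial_is_error([0.62, 0.55, 0.71]))  # True  -> discard, repeat trial
print(trial_is_error([0.48, 0.30, 0.52]))  # False -> keep trial
```

This mirrors the reported trade-off: higher game scores with error detection, at the cost of longer completion times.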
Ahmed, W; Stewart, J; Gardner, T; Powell, D; Brooks, P; Sullivan, D; Tindale, N
2007-08-01
Library-dependent (LD) (biochemical fingerprinting of Escherichia coli and enterococci) and library-independent (LI) (PCR detection of human-specific biomarkers) methods were used to detect human faecal pollution in three non-sewered catchments. In all, 550 E. coli isolates and 700 enterococci isolates from 18 water samples were biochemically fingerprinted and compared with metabolic fingerprint libraries of 4508 E. coli and 4833 enterococci isolates. E. coli fingerprints identified human unique biochemical phenotypes (BPTs) in nine of 18 water samples; similarly, enterococci fingerprints identified human faecal pollution in 10 water samples. Seven samples were tested by PCR for the detection of biomarkers. Human-specific HF134 Bacteroides and enterococci surface protein (esp) biomarkers were detected in five samples. Four samples were also positive for the HF183 Bacteroides biomarker. The combination of biomarkers detected human faecal pollution in six of seven water samples. Of the seven samples analysed with both indicators/markers, at least one indicator/marker was detected in every sample. Four of the seven PCR-positive samples were also positive for one of the human-specific E. coli or enterococci BPTs. The results indicated human faecal pollution in the studied sub-catchments after storm events. The LD and LI methods used in this study complemented each other and provided additional information on polluting sources when one method failed to detect human faecal pollution. Therefore, it is recommended that a combination of methods be used to identify the source(s) of faecal pollution where possible.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
Almenoff, June S; LaCroix, Karol K; Yuen, Nancy A; Fram, David; DuMouchel, William
2006-01-01
There is increasing interest in using disproportionality-based signal detection methods to support postmarketing safety surveillance activities. Two commonly used methods, empirical Bayes multi-item gamma Poisson shrinker (MGPS) and proportional reporting ratio (PRR), perform differently with respect to the number and types of signals detected. The goal of this study was to compare and analyse the performance characteristics of these two methods, to understand why they differ and to consider the practical implications of these differences for a large, industry-based pharmacovigilance department. We compared the numbers and types of signals of disproportionate reporting (SDRs) obtained with MGPS and PRR using two postmarketing safety databases and a simulated database. We recorded signal counts and performed a qualitative comparison of the drug-event combinations signalled by the two methods as well as a sensitivity analysis to better understand how the thresholds commonly used for these methods impact their performance. PRR detected more SDRs than MGPS. We observed that MGPS is less subject to confounding by demographic factors because it employs stratification and is more stable than PRR when report counts are low. Simulation experiments performed using published empirical thresholds demonstrated that PRR detected false-positive signals at a rate of 1.1%, while MGPS did not detect any statistical false positives. In an attempt to separate the effect of choice of signal threshold from more fundamental methodological differences, we performed a series of experiments in which we modified the conventional threshold values for each method so that each method detected the same number of SDRs for the example drugs studied. This analysis, which provided quantitative examples of the relationship between the published thresholds for the two methods, demonstrates that the signalling criterion published for PRR has a higher signalling frequency than that published for MGPS. The performance differences between the PRR and MGPS methods are related to (i) greater confounding by demographic factors with PRR; (ii) a higher tendency of PRR to detect false-positive signals when the number of reports is small; and (iii) the conventional thresholds that have been adapted for each method. PRR tends to be more 'sensitive' and less 'specific' than MGPS. A high-specificity disproportionality method, when used in conjunction with medical triage and investigation of critical medical events, may provide an efficient and robust approach to applying quantitative methods in routine postmarketing pharmacovigilance.
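For concreteness, PRR itself is a simple 2x2 computation, and a commonly cited signalling criterion (published by Evans et al., and referenced widely; not specific to this study) requires PRR >= 2, chi-square >= 4, and at least three reports. A sketch with invented counts:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: target event, target drug;   b: other events, target drug
    c: target event, other drugs;   d: other events, other drugs
    """
    return (a / (a + b)) / (c / (c + d))

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the same 2x2 table (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def prr_signal(a, b, c, d):
    """Commonly cited threshold: PRR >= 2, chi-square >= 4, >= 3 reports."""
    return prr(a, b, c, d) >= 2 and chi2_2x2(a, b, c, d) >= 4 and a >= 3

print(round(prr(12, 488, 40, 9460), 1), prr_signal(12, 488, 40, 9460))  # 5.7 True
```

Because PRR is computed on raw counts, any demographic stratification has to be bolted on separately, which is one root of the confounding difference the study describes relative to the stratified MGPS.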
Rizzi, Giovanni; Lee, Jung-Rok; Dahl, Christina; Guldberg, Per; Dufva, Martin; Wang, Shan X; Hansen, Mikkel F
2017-09-26
Epigenetic modifications, in particular DNA methylation, are gaining increasing interest as complementary information to DNA mutations for cancer diagnostics and prognostics. We introduce a method to simultaneously profile DNA mutation and methylation events for an array of sites with single site specificity. Genomic (mutation) or bisulphite-treated (methylation) DNA is amplified using nondiscriminatory primers, and the amplicons are then hybridized to a giant magnetoresistive (GMR) biosensor array followed by melting curve measurements. The GMR biosensor platform offers scalable multiplexed detection of DNA hybridization, which is insensitive to temperature variation. The melting curve approach further enhances the assay specificity and tolerance to variations in probe length. We demonstrate the utility of this method by simultaneously profiling five mutation and four methylation sites in human melanoma cell lines. The method correctly identified all mutation and methylation events and further provided quantitative assessment of methylation density validated by bisulphite pyrosequencing.
Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module
Gay, Robert S.
2011-01-01
Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g., descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU). The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of simulated Orion landing conditions. This paper details the touchdown detection method chosen and the analysis used to support the decision.
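Both detector families can be expressed in a few lines over an IMU stream. A toy sketch (the thresholds, window length, and synthetic impact profile are placeholders, not Orion flight values):

```python
import numpy as np

def detect_touchdown(accel_g, fs, g_thresh=3.0, dv_thresh=2.0, window_s=0.2):
    """Two illustrative detectors on an IMU acceleration-magnitude stream.

    Method 1: first sample whose magnitude exceeds g_thresh (in g's).
    Method 2: first window whose accumulated velocity change -- the
    integral of acceleration over window_s seconds -- exceeds dv_thresh (m/s).
    """
    spike = int(np.argmax(accel_g > g_thresh)) if np.any(accel_g > g_thresh) else None
    w = int(window_s * fs)
    dv = np.convolve(accel_g * 9.81 / fs, np.ones(w), mode="valid")
    accum = int(np.argmax(dv > dv_thresh)) if np.any(dv > dv_thresh) else None
    return spike, accum  # sample indices (accum indexes the window start)

fs = 200
t = np.arange(0, 2, 1 / fs)
a = 0.1 * np.ones_like(t)
a[int(1.5 * fs):int(1.5 * fs) + 10] = 8.0  # brief 8 g impact spike at t = 1.5 s
print(detect_touchdown(a, fs))
```

The structure makes the paper's finding intuitive: the spike detector fires on the first hard sample, while the accumulated-velocity detector must wait for enough of the window to fill, so it is inherently slower.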
Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli
2015-04-01
A large subset of seismic events does not have impulsive arrivals; examples include low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One such method is full-waveform detection based on time reversal (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One was an impulsive earthquake in the Ometepec area that had clear arrivals on only three stations and was therefore not located and reported by the SSN. The other two are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A rough estimate places these two events in the portion of the East Pacific Rise around 9 N. They were detected despite their distance from the search area thanks to favorable move-out across the network of the Mexican National Seismological Service (SSN). We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
Fall Detection Using Smartphone Audio Features.
Cheffena, Michael
2016-07-01
An automated fall detection system based on smartphone audio features is developed. The spectrogram, mel frequency cepstral coefficients (MFCCs), linear predictive coding (LPC), and matching pursuit (MP) features of different fall and no-fall sound events are extracted from experimental data. Based on the extracted audio features, four different machine learning classifiers: k-nearest neighbor (k-NN), support vector machine (SVM), least squares method (LSM), and artificial neural network (ANN) are investigated for distinguishing between fall and no-fall events. For each audio feature, the performance of each classifier in terms of sensitivity, specificity, accuracy, and computational complexity is evaluated. The best performance is achieved using spectrogram features with the ANN classifier, with sensitivity, specificity, and accuracy all above 98%. The classifier also has acceptable computational requirements for training and testing. The system is applicable in home environments where the phone is placed in the vicinity of the user.
State-of-the-Art Fusion-Finder Algorithms Sensitivity and Specificity
Carrara, Matteo; Beccuti, Marco; Lazzarato, Fulvio; Cavallo, Federica; Cordero, Francesca; Donatelli, Susanna; Calogero, Raffaele A.
2013-01-01
Background. Gene fusions arising from chromosomal translocations have been implicated in cancer. RNA-seq has the potential to discover such rearrangements generating functional proteins (chimeras/fusions). Recently, many methods for chimera detection have been published. However, the specificity and sensitivity of those tools have not been extensively investigated in a comparative way. Results. We tested eight fusion-detection tools (FusionHunter, FusionMap, FusionFinder, MapSplice, deFuse, Bellerophontes, ChimeraScan, and TopHat-fusion) on synthetic and real datasets encompassing chimeras. A comparison run only on synthetic data could generate misleading results, since we found no counterpart in the real dataset. Furthermore, most tools report a very high number of false positive chimeras. In particular, the most sensitive tool, ChimeraScan, reports a large number of false positives that we were able to significantly reduce by devising and applying two filters to remove fusions not supported by fusion junction-spanning reads or encompassing large intronic regions. Conclusions. The discordant results obtained using synthetic and real datasets suggest that synthetic datasets encompassing fusion events may not fully capture the complexity of an RNA-seq experiment. Moreover, fusion detection tools are still limited in sensitivity or specificity; thus, there is room for further improvement in fusion-finder algorithms. PMID:23555082
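The two filters translate naturally into a post-processing step over a fusion caller's output. A sketch with hypothetical field names and thresholds (the paper's exact cutoffs are not given here):

```python
def filter_fusions(fusions, min_spanning=2, max_intron=50_000):
    """Keep fusion calls supported by junction-spanning reads and drop
    calls whose breakpoints merely span a large intronic region."""
    kept = []
    for f in fusions:
        if f["junction_spanning_reads"] < min_spanning:
            continue  # filter 1: no junction-spanning support
        if f["same_gene_pair"] and f["breakpoint_gap"] > max_intron:
            continue  # filter 2: likely intronic artefact, not a fusion
        kept.append(f)
    return kept

calls = [
    {"name": "BCR-ABL1", "junction_spanning_reads": 14,
     "same_gene_pair": False, "breakpoint_gap": 0},
    {"name": "weak candidate", "junction_spanning_reads": 1,
     "same_gene_pair": False, "breakpoint_gap": 0},
    {"name": "intronic artefact", "junction_spanning_reads": 5,
     "same_gene_pair": True, "breakpoint_gap": 120_000},
]
print([f["name"] for f in filter_fusions(calls)])  # ['BCR-ABL1']
```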
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Ning; He, Yuqing; Mao, Xun
This paper presents a novel approach to electrochemically determine enzymatically active PSA using a ferrocene-functionalized helix peptide (CHSSLKQK). The principle of the electrochemical measurement is based on the specific proteolytic cleavage events of the Fc-peptide on the gold electrode surface in the presence of PSA, resulting in a change of the current signal of the electrode. The percentage decrease in current is linear with the concentration of active PSA over the range of 0.5-40 ng/mL, with a detection limit of 0.2 ng/mL. The direct transduction of peptide cleavage events into an electrical signal provides a simple, sensitive method for detecting the enzymatic activity of PSA and determining the active PSA concentration.
Samson, Maria Cristina; Gullì, Mariolina; Marmiroli, Nelson
2010-07-01
Methodologies that enable the detection of genetically modified organisms (GMOs) (authorized and non-authorized) in food and feed strongly influence the potential for adequate updating and implementation of legislation together with labeling requirements. Quantitative polymerase chain reaction (qPCR) systems were designed to boost the sensitivity and specificity of GMO identification in highly degraded DNA samples; however, such testing will become economically difficult to sustain as the number of approved genetically modified (GM) lines increases. Multiplexing approaches are therefore in development to provide a cost-efficient solution. Construct-specific primers and a probe were developed for quantitative analysis of the Roundup Ready soybean (RRS) event glyphosate-tolerant soybean (GTS) 40-3-2. The lectin gene (Le1) was used as a reference gene, and its specificity was verified. RRS- and Le1-specific quantitative real-time PCR (qRTPCR) assays were optimized in a duplex platform that has been validated with respect to limit of detection (LOD) and limit of quantification (LOQ), as well as accuracy. The analysis of model processed food samples showed that DNA degradation has little or no adverse effect on the performance of the quantification assay. In this study, a duplex qRTPCR using TaqMan minor groove binder-non-fluorescent quencher (MGB-NFQ) chemistry was developed for specific detection and quantification of RRS event GTS 40-3-2 that can be used for practical monitoring in processed food products.
Huber, Ingrid; Block, Annette; Sebah, Daniela; Debode, Frédéric; Morisset, Dany; Grohmann, Lutz; Berben, Gilbert; Stebih, Dejan; Milavec, Mojca; Zel, Jana; Busch, Ulrich
2013-10-30
Worldwide, qualitative methods based on PCR are most commonly used as screening tools for genetically modified material in food and feed. However, the increasing number and diversity of genetically modified organisms (GMO) require effective methods for simultaneously detecting several genetic elements marking the presence of transgenic events. Herein we describe the development and validation of a pentaplex, as well as complementary triplex and duplex real-time PCR assays, for the detection of the most common screening elements found in commercialized GMOs: P-35S, T-nos, ctp2-cp4-epsps, bar, and pat. The use of these screening assays allows the coverage of many GMO events globally approved for commercialization. Each multiplex real-time PCR assay shows high specificity and sensitivity with an absolute limit of detection below 20 copies for the targeted sequences. We demonstrate by intra- and interlaboratory tests that the assays are robust as well as cost- and time-effective for GMO screening if applied in routine GMO analysis.
A new prior for bayesian anomaly detection: application to biosurveillance.
Shen, Y; Cooper, G F
2010-01-01
Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated from historical data. The evaluation reported here used semi-synthetic data to assess the detection performance of the proposed Bayesian method and a control chart method, a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior and avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy to apply, computationally efficient, and performs as well as or better than a standard frequentist method.
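The core computation, a posterior probability of outbreak versus baseline for a single monitored daily count, can be sketched compactly. In the Python below, the Poisson likelihood, the grid of rate multipliers standing in for the semi-informative increase-is-expected prior, and the prior outbreak probability are simplifying assumptions for illustration; the paper derives a closed-form prior rather than the grid approximation used here.

```python
import numpy as np
from scipy.stats import poisson

def outbreak_posterior(count, baseline_rate, prior_outbreak=0.01,
                       multipliers=np.linspace(1.2, 5.0, 20)):
    """Posterior probability of an anomalous (outbreak) day for one monitored count.

    Semi-informative prior: under an outbreak the count is expected to be
    *increased* over baseline, modeled here as a uniform prior over a grid of
    rate multipliers > 1 (a simplification of the paper's closed-form prior).
    """
    like_null = poisson.pmf(count, baseline_rate)
    like_outbreak = poisson.pmf(count, multipliers * baseline_rate).mean()
    num = prior_outbreak * like_outbreak
    return num / (num + (1 - prior_outbreak) * like_null)

# Baseline assumed estimated from historical data: 10 respiratory
# chief complaints per day on average at this emergency department.
for c in (10, 18, 30):
    print(c, round(outbreak_posterior(c, baseline_rate=10.0), 4))
```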
Bower, Patricia A.; Scopel, Caitlin O.; Jensen, Erika T.; Depas, Morgan M.; McLellan, Sandra L.
2005-01-01
Lake Michigan surface waters impacted by fecal pollution were assessed to determine the occurrence of genetic markers for Bacteroides and Escherichia coli. Initial experiments with sewage treatment plant influent demonstrated that total Bacteroides spp. could be detected by PCR in a 25- to 125-fold-higher dilution series than E. coli and human-specific Bacteroides spp., which were both found in similar dilution ranges. The limit of detection for the human-specific genetic marker ranged from 0.2 CFU/100 ml to 82 CFU/100 ml culturable E. coli for four wastewater treatment plants in urban and rural areas. The spatial and temporal distributions of these markers were assessed following major rain events that introduced urban storm water, agricultural runoff, and sewage overflows into Lake Michigan. Bacteroides spp. were detected in all of these samples by PCR, including those with <1 CFU/100 ml E. coli. Human-specific Bacteroides spp. were detected as far as 2 km into Lake Michigan during sewage overflow events, with variable detection 1 to 9 days postoverflow, whereas the cow-specific Bacteroides spp. were detected in only highly contaminated samples near the river outflow. Lake Michigan beaches were also assessed throughout the summer season for the same markers. Bacteroides spp. were detected in all beach samples, including 28 of the 74 samples that did not exceed 235 CFU/100 ml of E. coli. Human-specific Bacteroides spp. were detected at three of the seven beaches; one of the sites demonstrating positive results was sampled during a reported sewage overflow, but E. coli levels were below 235 CFU/100 ml. This study demonstrates the usefulness of non-culture-based microbial-source tracking approaches and the prevalence of these genetic markers in the Great Lakes, including freshwater coastal beaches. PMID:16332817
PCR technology for screening and quantification of genetically modified organisms (GMOs).
Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G
2003-04-01
Although PCR technology has obvious limitations, the potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of target analyte (e.g. protein or DNA) frequently challenges the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable methods for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of the PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wildtype organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries including those in the European Union. The success of the labelling schemes is dependent upon the efficiency with which GM-derived material can be detected. We will present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we will discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
Hernández, Marta; Rodríguez-Lázaro, David; Esteve, Teresa; Prat, Salomé; Pla, Maria
2003-12-15
Commercialization of several genetically modified crops has been approved worldwide to date. Uniplex polymerase chain reaction (PCR)-based methods to identify these different insertion events have been developed, but their use in the analysis of all commercially available genetically modified organisms (GMOs) is becoming progressively insufficient. These methods require a large number of assays to detect all possible GMOs present in a sample; the development of multiplex PCR systems, using combined probes and primers targeted to sequences specific to various GMOs, is therefore needed to cope with this increasing number of GMOs. Here we report on the development of a multiplex real-time PCR suitable for multiple GMO identification, based on the intercalating dye SYBR Green I and the analysis of the melting curves of the amplified products. Using this method, different amplification products specific for Maximizer 176, Bt11, MON810, and GA21 maize and for GTS 40-3-2 soybean were obtained and identified by their specific Tm. We have combined amplification of these products in a number of multiplex reactions and show the suitability of the method for identification of GMOs with a sensitivity of 0.1% in duplex reactions. The described methods offer an economic and simple alternative to real-time PCR systems based on sequence-specific probes (i.e., TaqMan chemistry). These methods can be used as selection tests and further optimized for uniplex GMO quantification.
Efficient method for events detection in phonocardiographic signals
NASA Astrophysics Data System (ADS)
Martinez-Alajarin, Juan; Ruiz-Merino, Ramon
2005-06-01
The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer the patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist the physician at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this approach usually does not allow detection of the main sounds when additional sounds and murmurs exist, or it may merge several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of the murmurs is detected, which aids in differentiating diseases that can occur at the same temporal localization. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
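A minimal Python sketch of the envelope-and-relative-maxima idea follows; the Hilbert envelope, smoothing window, prominence criterion and the synthetic test signal are all illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np
from scipy.signal import hilbert, find_peaks

def pcg_events(x, fs, min_separation_s=0.05, rel_height=0.2):
    """Detect candidate events as relative maxima of the amplitude envelope.

    Unlike a single global threshold, relative-maximum detection can still
    separate main sounds from murmurs that would otherwise merge into one event.
    """
    env = np.abs(hilbert(x))                                   # amplitude envelope
    win = int(0.01 * fs)
    env = np.convolve(env, np.ones(win) / win, mode="same")    # light smoothing
    peaks, props = find_peaks(env,
                              distance=int(min_separation_s * fs),
                              prominence=rel_height * env.max())
    # Per-event parameters that could feed later identification (S1, S2, murmur...)
    return [{"t": p / fs, "amplitude": float(env[p]),
             "prominence": float(props["prominences"][i])}
            for i, p in enumerate(peaks)]

fs = 2000
t = np.arange(0, 2.0, 1 / fs)
x = (np.sin(2 * np.pi * 50 * t) * np.exp(-((t % 0.8) / 0.03) ** 2)   # S1-like bursts
     + 0.05 * np.random.default_rng(1).standard_normal(t.size))
print(len(pcg_events(x, fs)), "events found")
```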
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods fail to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusions, the system relies on the fusion of "unusual" audio event detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviations from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent event detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts; a GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events are not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
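The unsupervised ambience-modeling branch can be sketched with a standard GMM from scikit-learn. In the Python below, the random feature vectors standing in for MFCC frames, the component count, and the 1st-percentile alarm threshold are assumptions for illustration only.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Model the *normal* acoustic ambience with a GMM over frames of acoustic
# features (e.g. MFCC vectors, here random placeholders), then flag test
# frames with low likelihood as "unusual" audio events.
rng = np.random.default_rng(0)
normal_frames = rng.normal(0.0, 1.0, size=(2000, 13))       # training ambience
gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
gmm.fit(normal_frames)

# Alarm threshold: 1st percentile of the training log-likelihoods (assumed).
threshold = np.percentile(gmm.score_samples(normal_frames), 1)

test_frames = np.vstack([rng.normal(0.0, 1.0, size=(50, 13)),   # ambience
                         rng.normal(4.0, 1.0, size=(5, 13))])   # e.g. a shout
unusual = gmm.score_samples(test_frames) < threshold
print("unusual frames:", np.flatnonzero(unusual))
```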
Prins, Theo W; van Dijk, Jeroen P; Beenen, Henriek G; Van Hoef, AM Angeline; Voorhuijzen, Marleen M; Schoen, Cor D; Aarts, Henk JM; Kok, Esther J
2008-01-01
Background To maintain EU GMO regulations, producers of new GM crop varieties need to supply an event-specific method for the new variety. As a result, methods are nowadays available for EU-authorised genetically modified organisms (GMOs), but only to a limited extent for EU-non-authorised GMOs (NAGs). In the last decade the diversity of genetically modified (GM) ingredients in food and feed has increased significantly. As a result of this increase, GMO laboratories currently need to apply many different methods to establish the potential presence of NAGs in raw materials and complex derived products. Results In this paper we present an innovative method for detecting (approved) GMOs as well as the potential presence of NAGs in complex DNA samples containing different crop species. An optimised protocol has been developed for padlock probe ligation in combination with microarray detection (PPLMD) that can easily be scaled up. Linear padlock probes targeted against GMO-events, -elements and -species have been developed that can hybridise to their genomic target DNA and are visualised using microarray hybridisation. In a tenplex PPLMD experiment, different genomic targets in Roundup-Ready soya, MON1445 cotton and Bt176 maize were detected down to at least 1%. In single experiments, the targets were detected down to 0.1%, i.e. comparable to standard qPCR. Conclusion Compared to currently available methods this is a significant step forward towards multiplex detection in complex raw materials and derived products. It is shown that the PPLMD approach is suitable for large-scale detection of GMOs in real-life samples and provides the possibility to detect and/or identify NAGs that would otherwise remain undetected. PMID:19055784
Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-03-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network: event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. the FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
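The key property exploited by pairwise pseudo-association, that the differential time within an event pair is invariant across stations even though absolute arrival times shift with the unmodeled move-out, can be illustrated with a short Python sketch. The grouping-by-sorted-differential-time logic, the tolerance, and the station-count cut are simplifications of the published pipeline, not its actual implementation.

```python
import numpy as np

def pseudo_associate(pairs_by_station, tol=2.0, min_stations=2):
    """Associate single-station detection pairs across a network.

    pairs_by_station : dict station -> list of (t1, t2) similar-event pairs
                       (e.g. extracted from a FAST sparse similarity matrix)
    Two pairs from different stations are grouped when their inter-event
    times agree within `tol` seconds; the absolute times differ by the
    (unmodeled) move-out, but the *differential* time cancels it out.
    """
    candidates = []
    for sta, pairs in pairs_by_station.items():
        for t1, t2 in pairs:
            candidates.append((t2 - t1, sta, t1, t2))
    candidates.sort()
    groups, current = [], [candidates[0]]
    for item in candidates[1:]:
        if item[0] - current[-1][0] <= tol:
            current.append(item)
        else:
            groups.append(current); current = [item]
    groups.append(current)
    # keep only groups seen at enough distinct stations
    return [g for g in groups if len({sta for _, sta, _, _ in g}) >= min_stations]

pairs = {"ST1": [(100.0, 350.0), (40.0, 800.0)],
         "ST2": [(103.1, 352.9)],            # same pair, shifted by move-out
         "ST3": [(99.2, 349.5)]}
for g in pseudo_associate(pairs):
    print(g)
```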
NASA Technical Reports Server (NTRS)
Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)
2007-01-01
A method and apparatus for detecting and determining event characteristics, such as the material failure of a component, in a manner that significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detecting the occurrence of the triggering event, which causes a change of state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change of state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
Cooper, Andrew J P; Lettis, Sally; Chapman, Charlotte L; Evans, Stephen J W; Waller, Patrick C; Shakir, Saad; Payvandi, Nassrin; Murray, Alison B
2008-05-01
Following the adoption of the ICH E2E guideline, risk management plans (RMPs) defining the cumulative safety experience and identifying limitations in safety information are now required for marketing authorisation applications (MAA). A collaborative research project was conducted to gain experience with tools for presenting and evaluating data in the safety specification. This paper presents the tools found to be useful and the lessons learned from their use. Archive data from a successful MAA were utilised. Methods were assessed for demonstrating the extent of clinical safety experience, evaluating the sensitivity of the clinical trial data to detect treatment differences, and identifying safety signals from adverse event and laboratory data, in order to define the extent of safety knowledge about the drug. The extent of clinical safety experience was demonstrated by plots of patient exposure over time. Adverse event data were presented using dot plots, which display the percentages of patients with the events of interest, the odds ratio, and the 95% confidence interval. Power and confidence interval plots were utilised for evaluating the sensitivity of the clinical database to detect treatment differences. Box and whisker plots were used to display laboratory data. This project enabled us to identify new evidence-based methods for presenting and evaluating clinical safety data. These methods represent an advance in the way safety data from clinical trials can be analysed and presented. This project emphasises the importance of early and comprehensive planning of the safety package, including evaluation of the use of epidemiology data.
Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...
2017-07-14
Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on the Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios, eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
NASA Astrophysics Data System (ADS)
Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus
2016-04-01
The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before they corrupt subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied in SPC systems and evaluate their capability to detect, for example, known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatio-temporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected, as well as anomalies which are not detectable with univariate methods.
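A classic SPC statistic of the kind such studies evaluate is Hotelling's T-squared, which flags multivariate observations far from an in-control distribution. The following Python sketch is a generic illustration, not one of the study's specific algorithm combinations; the simulated data and the chi-squared control limit (an asymptotic approximation for known parameters) are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def hotelling_t2(X_train, X_test):
    """Multivariate SPC: Hotelling's T^2 statistic of test observations
    relative to an in-control (training) sample."""
    mu = X_train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))
    diff = X_test - mu
    # diag(diff @ cov_inv @ diff.T), computed row-wise
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

rng = np.random.default_rng(42)
normal = rng.multivariate_normal([0, 0, 0], np.eye(3), size=500)  # in-control data
test = rng.multivariate_normal([0, 0, 0], np.eye(3), size=100)
test[60:65] += [3.0, -2.5, 2.0]                                   # injected extreme event

t2 = hotelling_t2(normal, test)
limit = chi2.ppf(0.999, df=3)        # approximate control limit
print("out-of-control samples:", np.flatnonzero(t2 > limit))
```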
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events through recognition by individual doctors. We developed a system that detects hematotoxicity adverse events according to blood test results recorded in an electronic medical record system. The blood test results were graded based on the Common Terminology Criteria for Adverse Events (CTCAE), and changes in the blood test results (Up, Down, Flat) were assessed according to the variation in grade. The changes in the blood test results and injection data were stored in a database. By comparing the date of injection with the start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grade 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rates of occurrence of a decreased WBC count, increased ALT level and increased creatinine level were 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
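The grade-and-overlap logic lends itself to a compact sketch. In the Python below, the WBC grade boundaries approximate CTCAE-style thresholds but are illustrative, as is the rule that an injection falling between the pre-change and post-change test dates links the event to the drug; the authors' actual criteria may differ.

```python
from datetime import date

# Illustrative CTCAE-style grade thresholds for a decreased WBC count
# (10^9 cells/L); check against the current CTCAE criteria before use.
def wbc_grade(value):
    if value < 1.0:  return 4
    if value < 2.0:  return 3
    if value < 3.0:  return 2
    if value < 3.3:  return 1   # assumed lower limit of normal
    return 0

def detect_ade(injections, tests, serious=3):
    """Flag injections followed by an upward grade change into Grade 3/4.

    injections : list of injection dates for the designated drug
    tests      : list of (date, wbc_value), chronologically ordered
    """
    graded = [(d, wbc_grade(v)) for d, v in tests]
    events = []
    for i in range(1, len(graded)):
        prev, cur = graded[i - 1], graded[i]
        if cur[1] >= serious and cur[1] > prev[1]:          # "Up" change
            start, end = prev[0], cur[0]
            if any(start <= inj <= end for inj in injections):
                events.append({"onset": cur[0], "grade": cur[1]})
    return events

tests = [(date(2015, 1, 5), 5.2), (date(2015, 1, 12), 1.6), (date(2015, 1, 19), 4.8)]
print(detect_ade([date(2015, 1, 8)], tests))
```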
History, rare, and multiple events of mechanical unfolding of repeat proteins
NASA Astrophysics Data System (ADS)
Sumbul, Fidan; Marchesi, Arin; Rico, Felix
2018-03-01
Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest has important limitations for studying unfolding history, and it may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty about the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable dockerin/cohesin III complex, revealing its advantages and limitations for assessing unfolding history and for investigating rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust, reproducible, and provides larger statistics than conventional unspecific methods. We show that the method is optimal to reveal the history of unfolding from the very first domain and to detect rare events, while being more limited in assessing intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, which is difficult when using unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.
Measuring target detection performance in paradigms with high event rates.
Bendixen, Alexandra; Andersen, Søren K
2013-05-01
Combining behavioral and neurophysiological measurements inevitably implies mutual constraints, such as when the neurophysiological measurement requires fast-paced stimulus presentation and hence the attribution of a behavioral response to a particular preceding stimulus becomes ambiguous. We develop and test a method for validly assessing behavioral detection performance in spite of this ambiguity. We examine four approaches taken in the literature to treat such situations. We analytically derive a new variant for computing the classical parameters of signal detection theory, hit and false alarm rates, adapted to fast-paced paradigms. Each of the previous approaches shows specific shortcomings (susceptibility to response window choice, biased estimates of behavioral detection performance). Superior performance of our new approach is demonstrated for both simulated and empirical behavioral data. Further evidence is provided by reliable correspondence between behavioral performance and the N2b component as an electrophysiological indicator of target detection. The appropriateness of our approach is substantiated by both theoretical and empirical arguments. We demonstrate an easy-to-implement solution for measuring target detection performance independent of the rate of event presentation. Thus, overcoming the measurement bias of previous approaches, our method will help to clarify the behavioral relevance of different measures of cortical activation. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
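For concreteness, here is a naive response-window attribution scheme and the resulting hit and false alarm rates in a rapid serial paradigm, written in Python. This illustrates the problem setting rather than the authors' analytically derived variant; the window bounds and the one-response-per-target rule are assumptions.

```python
import numpy as np

def detection_rates(target_times, response_times, stim_onsets, window=(0.1, 1.0)):
    """Classical SDT rates under rapid serial presentation.

    A response is attributed to a target if it falls inside the response
    window after that target; responses attributable to no target count as
    false alarms against the remaining (non-target) stimuli.
    """
    target_times = np.asarray(target_times)
    hits = 0
    used = np.zeros(len(response_times), dtype=bool)
    for t in target_times:
        for i, r in enumerate(response_times):
            if not used[i] and window[0] <= r - t <= window[1]:
                hits += 1; used[i] = True; break
    false_alarms = int(np.sum(~used))
    n_nontargets = len(stim_onsets) - len(target_times)
    return hits / len(target_times), false_alarms / n_nontargets

stims = np.arange(0, 30, 0.5)                 # 500 ms SOA: attribution is ambiguous
targets = stims[::10]
responses = targets[:-1] + 0.45               # one miss, hits at ~450 ms latency
print(detection_rates(targets, responses, stims))
```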
Automatic Classification of Extensive Aftershock Sequences Using Empirical Matched Field Processing
NASA Astrophysics Data System (ADS)
Gibbons, Steven J.; Harris, David B.; Kværna, Tormod; Dodge, Douglas A.
2013-04-01
The aftershock sequences that follow large earthquakes create considerable problems for data centers attempting to produce comprehensive event bulletins in near real-time. The greatly increased number of events which require processing can overwhelm analyst resources and reduce the capacity for analyzing events of monitoring interest. This exacerbates a potentially reduced detection capability at key stations, due to the noise generated by the sequence, and a deterioration in the quality of the fully automatic preliminary event bulletins caused by the difficulty in associating the vast numbers of closely spaced arrivals over the network. Considerable success has been enjoyed by waveform correlation methods for the automatic identification of groups of events belonging to the same geographical source region, facilitating the more time-efficient analysis of event ensembles as opposed to individual events. There are, however, formidable challenges associated with the automation of correlation procedures. The signal generated by a very large earthquake seldom correlates well enough with the signals generated by far smaller aftershocks for a correlation detector to produce statistically significant triggers at the correct times. Correlation between events within clusters of aftershocks is significantly better, although the issues of when and how to initiate new pattern detectors are still being investigated. Empirical Matched Field Processing (EMFP) is a highly promising method for detecting event waveforms suitable as templates for correlation detectors. EMFP is a quasi-frequency-domain technique that calibrates the spatial structure of a wavefront crossing a seismic array in a collection of narrow frequency bands. The amplitude and phase weights that result are applied in a frequency-domain beamforming operation that compensates for scattering and refraction effects not properly modeled by plane-wave beams. It has been demonstrated to outperform waveform correlation as a classifier of ripple-fired mining blasts, since the narrowband procedure is insensitive to differences in the source-time functions. For sequences in which the spectral content and time-histories of the signals from the main shock and aftershocks vary greatly, the spatial structure calibrated by EMFP is an invariant that permits reliable detection of events in the specific source region. Examples from the 2005 Kashmir and 2011 Van earthquakes demonstrate how EMFP templates from the main events detect arrivals from the aftershock sequences with high sensitivity and exceptionally low false alarm rates. Classical waveform correlation detectors are demonstrated to fail for these examples. Even arrivals with SNR below unity can produce significant EMFP triggers as the spatial pattern of the incoming wavefront is identified, leading to robust detections at a greater number of stations and potentially more reliable automatic bulletins. False EMFP triggers are readily screened by scanning a space of phase shifts relative to the imposed template. EMFP has the potential to produce a rapid and robust overview of the evolving aftershock sequence such that correlation and subspace detectors can be applied semi-autonomously, with well-chosen parameter specifications, to identify and classify clusters of very closely spaced aftershocks.
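At its core, the calibration-then-beamform step can be sketched as follows in Python: per-band complex channel weights are estimated from a master event and then matched against sliding data windows. This is a bare-bones reading of matched field processing; the band-averaging, normalization, and detection statistic are our assumptions, not the authors' implementation.

```python
import numpy as np

def emfp_weights(calib, fs, bands):
    """Calibrate narrowband spatial weights from a master-event recording.

    calib : (n_channels, n_samples) array of calibration waveforms
    bands : list of (f_lo, f_hi) frequency bands in Hz
    Returns a (n_bands, n_channels) complex weight matrix (unit norm per band)
    capturing the amplitude and phase structure of the wavefront per band.
    """
    spec = np.fft.rfft(calib, axis=1)
    freqs = np.fft.rfftfreq(calib.shape[1], d=1 / fs)
    W = []
    for lo, hi in bands:
        sel = (freqs >= lo) & (freqs < hi)
        w = spec[:, sel].mean(axis=1)               # average complex response in band
        W.append(w / np.linalg.norm(w))
    return np.array(W)

def emfp_statistic(window, W, fs, bands):
    """Matched-field power: coherence of a data window with calibrated weights."""
    spec = np.fft.rfft(window, axis=1)
    freqs = np.fft.rfftfreq(window.shape[1], d=1 / fs)
    stats = []
    for (lo, hi), w in zip(bands, W):
        sel = (freqs >= lo) & (freqs < hi)
        x = spec[:, sel].mean(axis=1)
        x = x / (np.linalg.norm(x) + 1e-12)
        stats.append(np.abs(np.vdot(w, x)) ** 2)    # 1.0 = perfect spatial match
    return float(np.mean(stats))
```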
Chu, Catherine. J.; Chan, Arthur; Song, Dan; Staley, Kevin J.; Stufflebeam, Steven M.; Kramer, Mark A.
2017-01-01
Background High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. New Method The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. Results We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. Comparison with Existing Method The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Conclusions Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. PMID:27988323
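The two-step structure, candidate intervals of elevated high-frequency power followed by feature checks (at least three oscillation cycles co-occurring with a large discharge), can be sketched in a few lines of Python. The band edges, envelope threshold, and discharge criterion below are illustrative assumptions, not the published set of seven features.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def spike_ripple_candidates(eeg, fs, band=(80, 150), env_k=3.0, min_cycles=3):
    """Step 1: find intervals of increased high-frequency activity.
    Step 2 (partial): keep intervals with >= min_cycles oscillation cycles
    that co-occur with a large-amplitude discharge in the broadband signal.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    hf = filtfilt(b, a, eeg)
    env = np.abs(hilbert(hf))
    above = env > env.mean() + env_k * env.std()        # candidate samples
    # group contiguous supra-threshold samples into candidate intervals
    padded = np.r_[False, above, False]
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, stops = edges[::2], edges[1::2]
    events = []
    for s, e in zip(starts, stops):
        # cycle count via zero crossings of the band-passed signal
        cycles = np.count_nonzero(np.diff(np.signbit(hf[s:e]))) / 2
        big_discharge = np.max(np.abs(eeg[s:e])) > 3.0 * np.std(eeg)
        if cycles >= min_cycles and big_discharge:
            events.append((s / fs, e / fs, cycles))
    return events
```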
Detecting Service Chains and Feature Interactions in Sensor-Driven Home Network Services
Inada, Takuya; Igaki, Hiroshi; Ikegami, Kosuke; Matsumoto, Shinsuke; Nakamura, Masahide; Kusumoto, Shinji
2012-01-01
Sensor-driven services often cause chain reactions, since one service may generate an environmental impact that automatically triggers another service. We first propose a framework that can formalize and detect such service chains based on ECA (event, condition, action) rules. Although the service chain can be a major source of feature interactions, not all service chains lead to harmful interactions. Therefore, we then propose a method that identifies feature interactions within the service chains. Specifically, we characterize the degree of deviation of every service chain by evaluating the gap between expected and actual service states. An experimental evaluation demonstrates that the proposed method successfully detects 11 service chains and 6 feature interactions within 7 practical sensor-driven services. PMID:23012499
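The chain-detection idea, one rule's environmental effect serving as another rule's triggering event, reduces to a reachability search over ECA rules. The Python sketch below uses invented rule names and environment labels purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class EcaRule:
    name: str
    event: str        # environmental event that triggers the rule
    action: str       # appliance operation the rule performs
    effect: str       # environmental impact of that action

rules = [
    EcaRule("AutoAC",      event="temperature_high", action="ac_on",           effect="temperature_low"),
    EcaRule("AutoWindow",  event="temperature_low",  action="window_open",     effect="humidity_high"),
    EcaRule("AutoDehumid", event="humidity_high",    action="dehumidifier_on", effect="humidity_low"),
]

def service_chains(rules):
    """Find chains where one rule's environmental effect triggers another rule."""
    chains = []
    def extend(chain):
        nexts = [r for r in rules if r.event == chain[-1].effect and r not in chain]
        if not nexts:
            chains.append(chain)
        for r in nexts:
            extend(chain + [r])
    for r in rules:
        extend([r])
    return [c for c in chains if len(c) > 1]

for chain in service_chains(rules):
    print(" -> ".join(r.name for r in chain))
```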
Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; ...
2015-10-01
Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
Detection of rain events in radiological early warning networks with spectro-dosimetric systems
NASA Astrophysics Data System (ADS)
Dąbrowski, R.; Dombrowski, H.; Kessler, P.; Röttger, A.; Neumaier, S.
2017-10-01
Short-term pronounced increases of the ambient dose equivalent rate due to rainfall are a well-known phenomenon. Increases of the same order of magnitude or even below may also be caused by a nuclear or radiological event, i.e. by artificial radiation. Hence, it is important to be able to identify natural rain events in dosimetric early warning networks and to distinguish them from radiological events. Novel spectrometric systems based on scintillators may be used to differentiate between the two scenarios, because the measured gamma spectra provide significant nuclide-specific information. This paper describes three simple, automatic methods to check whether an increase of the ambient dose equivalent rate, Ḣ*(10), is caused by a rain event or by artificial radiation. These methods were applied to measurements of three spectrometric systems based on CeBr3, LaBr3 and SrI2 scintillation crystals, investigated and tested for their practicability at a free-field reference site of PTB.
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor, which captures the movement information of the global video frame or of the foreground, are illustrated. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
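A minimal Python sketch of this pipeline follows: orientation histograms of (here, synthesized) flow fields are projected with kernel PCA and scored by a one-class SVM trained on normal motion only. The flow values, bin count, and model hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.decomposition import KernelPCA

def flow_orientation_histogram(u, v, n_bins=16, min_mag=0.5):
    """Histogram of optical-flow orientations for one frame.
    u, v : 2-D arrays of horizontal/vertical flow components."""
    mag = np.hypot(u, v)
    ang = np.arctan2(v, u)[mag > min_mag]          # ignore near-static pixels
    hist, _ = np.histogram(ang, bins=n_bins, range=(-np.pi, np.pi), density=True)
    return hist

rng = np.random.default_rng(3)
def frame(anomal=False):
    u = rng.normal(1.0, 0.2, (32, 32))             # coherent rightward crowd motion
    v = rng.normal(0.0, 0.2, (32, 32))
    if anomal:                                     # erratic multi-directional motion
        u, v = rng.normal(0, 1.5, (32, 32)), rng.normal(0, 1.5, (32, 32))
    return flow_orientation_histogram(u, v)

X_train = np.array([frame() for _ in range(200)])            # normal learning period
kpca = KernelPCA(n_components=8, kernel="rbf").fit(X_train)
ocsvm = OneClassSVM(nu=0.05, kernel="rbf").fit(kpca.transform(X_train))

X_test = np.array([frame(), frame(anomal=True)])
print(ocsvm.predict(kpca.transform(X_test)))       # +1 = normal, -1 = abnormal
```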
Bühler, Mira; Vollstädt-Klein, Sabine; Klemen, Jane; Smolka, Michael N
2008-01-01
Background Existing brain imaging studies investigating sexual arousal via the presentation of erotic pictures or film excerpts have mainly used blocked designs with long stimulus presentation times. Methods To clarify how experimental functional magnetic resonance imaging (fMRI) design affects stimulus-induced brain activity, we compared brief event-related presentation of erotic vs. neutral stimuli with blocked presentation in 10 male volunteers. Results Brain activation differed depending on design type in only 10% of the voxels showing task-related brain activity. Differences between blocked and event-related stimulus presentation were found in occipitotemporal and temporal regions (Brodmann Area (BA) 19, 37, 48), parietal areas (BA 7, 40) and areas in the frontal lobe (BA 6, 44). Conclusion Our results suggest that event-related designs might be a potential alternative when the core interest is the detection of networks associated with immediate processing of erotic stimuli. Additionally, blocked, compared to event-related, stimulus presentation allows the emergence and detection of non-specific secondary processes, such as sustained attention, motor imagery and inhibition of sexual arousal. PMID:18647397
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on the real-time information that is observable and detectable.
NASA Astrophysics Data System (ADS)
Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum
2017-04-01
We analyze the relations among the parameters of a moving average method in order to enhance the event detectability of a phase-sensitive optical time domain reflectometer (OTDR). If an external event has a unique vibration frequency, the control parameters of the moving average method should be optimized to detect it efficiently. A phase-sensitive OTDR was implemented with a pulsed light source, composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier and a fiber Bragg grating filter, and a light-receiving part comprising a photo-detector and a high-speed data acquisition system. The moving average method is operated with three control parameters: the total number of raw traces, M; the number of averaged traces, N; and the step size of the moving window, n. Raw traces were obtained by the phase-sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relations among the control parameters were analyzed. As a result, if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
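A Python sketch of the averaging scheme follows: from M raw traces, overlapping groups of N traces are averaged with step n, and an event shows up as variation between consecutive averaged traces at one fiber location. The trace rate, signal frequencies, and detection statistic are invented for illustration; note that n sets the effective trace rate f/n, which must still satisfy the sampling theorem for the vibration of interest (our reading of the parameter trade-off).

```python
import numpy as np

def moving_average_traces(raw, N, n):
    """Moving average over phase-OTDR traces.

    raw : (M, L) array of M raw traces of length L
    N   : number of traces averaged per output trace
    n   : step size of the moving window (in traces)
    Returns ((M - N)//n + 1, L) averaged traces; the effective trace rate
    becomes f_trace / n.
    """
    M, L = raw.shape
    return np.array([raw[i:i + N].mean(axis=0) for i in range(0, M - N + 1, n)])

rng = np.random.default_rng(7)
f_trace, M, L = 1000.0, 1000, 500            # 1 kHz pulse rate, hypothetical
raw = 0.1 * rng.normal(0, 1, (M, L))         # per-trace noise
t = np.arange(M) / f_trace
raw[:, 250] += np.sin(2 * np.pi * 5 * t)     # 5 Hz vibration at one location

avg = moving_average_traces(raw, N=20, n=4)
event_signal = np.abs(np.diff(avg, axis=0)).max(axis=0)   # trace-to-trace change
print("event located near sample", int(np.argmax(event_signal)))
```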
MOLECULAR DIAGNOSTICS - ANOTHER PIECE IN THE ENVIRONMENTAL PUZZLE
Molecular biology offers sensitive and expedient tools for the detection of exposure to environmental stressors. Molecular approaches provide the means for detection of the "first cellular event(s)" in response to environmental changes, specifically, immediate changes in gene expr...
Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin
2013-04-15
There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage, all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between the two expert review teams in the detection of adverse events.
Schoenfeld, David A.; Brown, Samuel M.; Hough, Catherine L.; Yealy, Donald M.; Moss, Marc; Angus, Derek C.; Iwashyna, Theodore J.
2017-01-01
Rationale: After the sample size of a randomized clinical trial (RCT) is set by the power requirement of its primary endpoint, investigators select secondary endpoints while unable to further adjust sample size. How the sensitivity and specificity of an instrument used to measure these outcomes, together with their expected underlying event rates, affect an RCT’s power to measure significant differences in these outcomes is poorly understood. Objectives: Motivated by the design of an RCT of neuromuscular blockade in acute respiratory distress syndrome, we examined how power to detect a difference in secondary endpoints varies with the sensitivity and specificity of the instrument used to measure such outcomes. Methods: We derived a general formula and Stata code for calculating an RCT’s power to detect differences in binary outcomes when such outcomes are measured with imperfect sensitivity and specificity. The formula informed the choice of instrument for measuring post-traumatic stress–like symptoms in the Reevaluation of Systemic Early Neuromuscular Blockade RCT (www.clinicaltrials.gov identifier NCT02509078). Measurements and Main Results: On the basis of published sensitivities and specificities, the Impact of Events Scale-Revised was predicted to measure a 36% symptom rate, whereas the Post-Traumatic Stress Symptoms instrument was predicted to measure a 23% rate, if the true underlying rate of post-traumatic stress symptoms were 25%. Despite its lower sensitivity, the briefer Post-Traumatic Stress Symptoms instrument provided superior power to detect a difference in rates between trial arms, owing to its higher specificity. Conclusions: Examining instruments’ power to detect differences in outcomes may guide their selection when multiple instruments exist, each with different sensitivities and specificities. PMID:27788018
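The underlying relationship is that an instrument with sensitivity Se and specificity Sp observes an event rate p_obs = Se*p + (1 - Sp)*(1 - p) when the true rate is p, so false positives from imperfect specificity dilute a between-arm difference. The Python sketch below, using statsmodels, illustrates the power comparison; the sensitivities, specificities, true rates, and sample size are hypothetical values chosen only to roughly reproduce the observed rates quoted above, not the published operating characteristics.

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

def observed_rate(true_rate, sens, spec):
    """Rate an imperfect instrument measures: true positives plus false positives."""
    return sens * true_rate + (1 - spec) * (1 - true_rate)

# Hypothetical operating characteristics: at a 25% true symptom rate these
# yield observed rates near the 36% and 23% quoted above.
instruments = {"IES-R-like": (0.90, 0.82), "PTSS-like": (0.75, 0.95)}
p_control, p_treated = 0.25, 0.15          # assumed true rates in the two arms
n_per_arm = 300                            # assumed sample size

for name, (se, sp) in instruments.items():
    o1 = observed_rate(p_control, se, sp)
    o2 = observed_rate(p_treated, se, sp)
    h = proportion_effectsize(o1, o2)      # Cohen's h for two proportions
    power = NormalIndPower().power(effect_size=h, nobs1=n_per_arm, alpha=0.05)
    print(f"{name}: observed rates {o1:.3f} vs {o2:.3f}, power {power:.2f}")
```

Running this shows the higher-specificity instrument achieving greater power despite its lower sensitivity, which is the qualitative point of the study.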
TE-Tracker: systematic identification of transposition events through whole-genome resequencing.
Gilly, Arthur; Etcheverry, Mathilde; Madoui, Mohammed-Amin; Guy, Julie; Quadrana, Leandro; Alberti, Adriana; Martin, Antoine; Heitkam, Tony; Engelen, Stefan; Labadie, Karine; Le Pen, Jeremie; Wincker, Patrick; Colot, Vincent; Aury, Jean-Marc
2014-11-19
Transposable elements (TEs) are DNA sequences that are able to move from their location in the genome by cutting or copying themselves to another locus. As such, they are increasingly recognized as impacting all aspects of genome function. With the dramatic reduction in the cost of DNA sequencing, it is now possible to resequence whole genomes in order to systematically characterize novel TE mobilization in a particular individual. However, this task is made difficult by the inherently repetitive nature of TE sequences, which in some eukaryotes compose over half of the genome sequence. Currently, only a few software tools dedicated to the detection of TE mobilization using next-generation sequencing are described in the literature. They often target specific TEs for which annotation is available, and are only able to identify families of closely related TEs, rather than individual elements. We present TE-Tracker, a general and accurate computational method for the de novo detection of germline TE mobilization from resequenced genomes, as well as the identification of both their source and destination sequences. We compare our method with the two classes of existing software: specialized TE-detection tools and generic structural variant (SV) detection tools. We show that TE-Tracker, while working independently of any prior annotation, bridges the gap between these two approaches in terms of detection power. Indeed, its positive predictive value (PPV) is comparable to that of dedicated TE software while its sensitivity is typical of a generic SV detection tool. TE-Tracker demonstrates the benefit of adopting an annotation-independent, de novo approach for the detection of TE mobilization events. We use TE-Tracker to provide a comprehensive view of transposition events induced by loss of DNA methylation in Arabidopsis. TE-Tracker is freely available at http://www.genoscope.cns.fr/TE-Tracker . We show that TE-Tracker accurately detects both the source and destination of novel transposition events in resequenced genomes. Moreover, TE-Tracker is able to detect all potential donor sequences for a given insertion, and can identify the correct one among them. Furthermore, TE-Tracker produces significantly fewer false positives than common SV detection programs, thus greatly facilitating the detection and analysis of TE mobilization events.
Yih, W Katherine; Maro, Judith C; Nguyen, Michael; Baker, Meghan A; Balsbaugh, Carolyn; Cole, David V; Dashevsky, Inna; Mba-Jonas, Adamma; Kulldorff, Martin
2018-06-01
The self-controlled tree-temporal scan statistic, a new signal-detection method, can evaluate whether any of a wide variety of health outcomes are temporally associated with receipt of a specific vaccine, while adjusting for multiple testing. Neither health outcomes nor postvaccination potential periods of increased risk need be prespecified. Using US medical claims data in the Food and Drug Administration's Sentinel system, we employed the method to evaluate adverse events occurring after receipt of quadrivalent human papillomavirus vaccine (4vHPV). Incident outcomes recorded in emergency department or inpatient settings within 56 days after first doses of 4vHPV received by 9- through 26.9-year-olds in 2006-2014 were identified using International Classification of Diseases, Ninth Revision, diagnosis codes and analyzed by pairing the new method with a standard hierarchical classification of diagnoses. On scanning diagnoses of 1.9 million 4vHPV recipients, 2 statistically significant categories of adverse events were found: cellulitis on days 2-3 after vaccination and "other complications of surgical and medical procedures" on days 1-3 after vaccination. Cellulitis is a known adverse event. Clinically informed investigation of electronic claims records of the patients with "other complications" did not suggest any previously unknown vaccine safety problem. Considering that thousands of potential short-term adverse events and hundreds of potential risk intervals were evaluated, these findings add significantly to the growing safety record of 4vHPV.
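The temporal component of the scan can be illustrated with a toy version: every post-vaccination day window is evaluated with a Poisson log-likelihood ratio against a uniform-risk baseline, and the maximizing window is reported. This sketch omits the hierarchical diagnosis tree and the Monte Carlo significance assessment of the published method, and its counts are synthetic:

```python
import numpy as np

def temporal_scan_llr(daily_counts):
    """Scan all day windows [a, b]; return the window maximizing the Poisson
    log-likelihood ratio against a uniform-risk baseline (toy, temporal-only)."""
    counts = np.asarray(daily_counts, dtype=float)
    total, days = counts.sum(), len(counts)
    best = (0.0, None)
    for a in range(days):
        for b in range(a, days):
            obs = counts[a:b + 1].sum()
            exp = total * (b - a + 1) / days          # expected under uniform risk
            if obs > exp > 0 and obs < total:
                llr = (obs * np.log(obs / exp)
                       + (total - obs) * np.log((total - obs) / (total - exp)))
                if llr > best[0]:
                    best = (llr, (a + 1, b + 1))      # 1-indexed day window
    return best

# e.g. a cellulitis-like excess on days 2-3 of a 56-day risk interval
rng = np.random.default_rng(0)
counts = rng.poisson(10, 56)
counts[1:3] += 15
print(temporal_scan_llr(counts))
```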
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. This framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. This method thus provides an approach to understanding video scenes. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them automatically from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
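As a rough illustration of the kind of centralized event record the paper describes, the sketch below defines a minimal type for client interface events; all field names are hypothetical, not the authors' actual schema:

```python
from dataclasses import dataclass
from typing import Optional

# Minimal record type in the spirit of the paper's design: every client
# interface event is funneled to one centralized store and later grouped
# into higher-order subgoals (all field names here are hypothetical).
@dataclass
class InterfaceEvent:
    session_id: str
    timestamp_ms: int
    widget: str              # UI element that fired the event
    action: str              # e.g. "click", "keypress", "tutor_response"
    subgoal: Optional[str]   # higher-order subgoal assigned later by the algorithm

event_log: list = []
event_log.append(InterfaceEvent("s01", 153200, "hypothesis_list", "click", None))
print(event_log[0])
```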
Heat waves in Senegal: detection, characterization and associated processes.
NASA Astrophysics Data System (ADS)
Gnacoussa Sambou, Marie Jeanne; Janicot, Serge; Badiane, Daouda; Pohl, Benjamin; Dieng, Abdou L.; Gaye, Amadou T.
2017-04-01
Atmospheric configuration and synoptic evolution of patterns associated with Senegalese heat waves (HW) are examined over the period 1979-2014 using the Global Surface Summary of the Day (GSOD) observational database and ERA-Interim reanalysis. Since there is no objective and uniform definition of HW events, threshold methods based on atmospheric variables such as daily maximum (Tmax) / minimum (Tmin) temperatures and daily mean apparent temperature (AT) are used for HW detection. Each criterion is related to a specific category of HW events: Tmax (warm day events), Tmin (warm night events) and AT (combining temperature and moisture). These definitions are used in order to characterize as well as possible the warm events over the Senegalese regions (oceanic versus continental regions). Statistics on the time evolution and spatial distribution of warm events are carried out over the two seasons of maximum temperature (March-May and October-November). For each season, a composite of HW events, as well as the most extended event over Senegal (as a case study), are analyzed using standard atmospheric fields (sea level pressure, geopotential height, total column water content, wind components, 2 m temperature). This study is part of the project ACASIS (https://acasis.locean-ipsl.upmc.fr/doku.php) on heat wave occurrences over the Sahel and their impact on health. Keywords: heat wave, Senegal, ACASIS.
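A minimal sketch of a percentile-threshold HW detector of the kind described, assuming a 90th-percentile threshold on Tmax and a minimum duration of three consecutive days (both parameters are illustrative; the study evaluates several criteria on Tmax, Tmin and AT):

```python
import numpy as np

def detect_heat_waves(tmax, pct=90, min_days=3):
    """Flag heat-wave events: runs of at least `min_days` consecutive days with
    Tmax above its `pct`-th percentile (threshold and duration are illustrative)."""
    tmax = np.asarray(tmax, dtype=float)
    hot = tmax > np.percentile(tmax, pct)
    events, start = [], None
    for i, flag in enumerate(hot):
        if flag and start is None:
            start = i                                  # run begins
        elif not flag and start is not None:
            if i - start >= min_days:
                events.append((start, i - 1))          # run long enough: keep it
            start = None
    if start is not None and len(hot) - start >= min_days:
        events.append((start, len(hot) - 1))           # run extends to series end
    return events

daily_tmax = 30 + 5 * np.random.default_rng(1).random(365)   # synthetic series
print(detect_heat_waves(daily_tmax))
```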
Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A
2017-02-01
High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
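The first step can be sketched as band-pass filtering followed by an envelope threshold; candidate intervals crossing the threshold would then be passed to the seven-feature test. The band edges and threshold below are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def candidate_intervals(eeg, fs, band=(80.0, 250.0), n_std=3.0):
    """Step 1 of a spike-ripple-style detector: return (start, stop) sample
    indices where the high-frequency envelope exceeds mean + n_std * std.
    Band edges and threshold are illustrative assumptions."""
    b, a = butter(4, [f / (fs / 2) for f in band], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, eeg)))          # HF amplitude envelope
    above = np.r_[False, env > env.mean() + n_std * env.std(), False]
    edges = np.flatnonzero(np.diff(above.astype(int)))  # rising/falling edges
    return list(zip(edges[::2], edges[1::2] - 1))       # inclusive intervals

fs = 2000.0
t = np.arange(0, 2, 1 / fs)
eeg = np.random.default_rng(2).normal(0, 1, t.size)
eeg[2000:2100] += 5 * np.sin(2 * np.pi * 140 * t[2000:2100])  # injected ripple
print(candidate_intervals(eeg, fs))
```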
Method and apparatus for data sampling
Odell, Daniel M. C.
1994-01-01
A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
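A software analogue of the discrimination step might look like the following sketch, which rejects digitized samples that stay within a noise band around an estimated baseline (all parameters are illustrative, not taken from the patent):

```python
import numpy as np

def discriminate(samples, baseline_window=100, n_sigma=5.0):
    """Discard digital samples that are not representative of detected events:
    estimate the baseline from an event-free stretch, then keep only samples
    deviating from it by more than n_sigma standard deviations (illustrative)."""
    base = samples[:baseline_window]
    keep = np.abs(samples - base.mean()) > n_sigma * base.std()
    return samples[keep]

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 5000)     # high-rate A/D samples of detector output
trace[3000:3010] += 40.0               # a detector pulse spanning several samples
print(discriminate(trace).size)        # roughly the 10 event samples survive
```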
Discovery of rare, diagnostic AluYb8/9 elements in diverse human populations.
Feusier, Julie; Witherspoon, David J; Scott Watkins, W; Goubert, Clément; Sasani, Thomas A; Jorde, Lynn B
2017-01-01
Polymorphic human Alu elements are excellent tools for assessing population structure, and new retrotransposition events can contribute to disease. Next-generation sequencing has greatly increased the potential to discover Alu elements in human populations, and various sequencing and bioinformatics methods have been designed to tackle the problem of detecting these highly repetitive elements. However, current techniques for Alu discovery may miss rare, polymorphic Alu elements. Combining multiple discovery approaches may provide a better profile of the polymorphic Alu mobilome. Alu Yb8/9 elements have been a focus of our recent studies as they are young subfamilies (~2.3 million years old) that contribute ~30% of recent polymorphic Alu retrotransposition events. Here, we update our ME-Scan methods for detecting Alu elements and apply these methods to discover new insertions in a large set of individuals with diverse ancestral backgrounds. We identified 5,288 putative Alu insertion events, including several hundred novel Alu Yb8/9 elements, from 213 individuals from 18 diverse human populations. Hundreds of these loci were specific to continental populations, and 23 non-reference population-specific loci were validated by PCR. We provide high-quality sequence information for 68 rare Alu Yb8/9 elements, of which 11 have hallmarks of an active source element. Our subfamily distribution of rare Alu Yb8/9 elements is consistent with previous datasets, and may be representative of rare loci. We also find that while ME-Scan and low-coverage whole-genome sequencing (WGS) detect different Alu elements in 41 individuals from the 1000 Genomes Project, the two methods yield similar population structure results. Current in silico methods for Alu discovery may miss rare, polymorphic Alu elements. Therefore, using multiple techniques can provide a more accurate profile of Alu elements in individuals and populations. We improved our false-negative rate as an indicator of sample quality for future ME-Scan experiments. In conclusion, we demonstrate that ME-Scan is a good complement to next-generation sequencing methods and is well-suited for population-level analyses.
Subsurface event detection and classification using Wireless Signal Networks.
Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T
2012-11-05
Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion, using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier proceeds in two steps: event detection and event classification. After event detection, the classifier assigns geo-events within the regions where events occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
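A minimal sketch of a window-based minimum distance classifier, assuming per-class mean signal-strength templates learned from labeled experiments (class names and RSSI values are illustrative, and the Bayesian formulation of the paper is reduced to nearest-mean assignment):

```python
import numpy as np

def min_distance_classify(window, class_means):
    """Assign the measured signal-strength window to the geo-event class whose
    training mean is nearest (a simplified stand-in for the Bayesian version)."""
    dists = {label: np.linalg.norm(window - mean) for label, mean in class_means.items()}
    return min(dists, key=dists.get)

# toy RSSI templates (dB) learned from labeled experiments -- illustrative values
class_means = {
    "baseline":        np.array([-60.0, -61.0, -60.5]),
    "water_intrusion": np.array([-70.0, -72.0, -71.0]),
    "soil_motion":     np.array([-65.0, -58.0, -66.0]),
}
window = np.array([-69.0, -71.5, -70.2])   # RSSI averaged over a classification window
print(min_distance_classify(window, class_means))   # -> "water_intrusion"
```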
An Improved Sparse Representation over Learned Dictionary Method for Seizure Detection.
Li, Junhui; Zhou, Weidong; Yuan, Shasha; Zhang, Yanli; Li, Chengcheng; Wu, Qi
2016-02-01
Automatic seizure detection has played an important role in the monitoring, diagnosis and treatment of epilepsy. In this paper, a patient-specific method is proposed for seizure detection in long-term intracranial electroencephalogram (EEG) recordings. This seizure detection method is based on sparse representation with online dictionary learning and an elastic net constraint. The online learned dictionary can sparsely represent the testing samples more accurately, and the elastic net constraint, which combines the l1-norm and l2-norm, not only makes the coefficients sparse but also avoids the over-fitting problem. First, the EEG signals are preprocessed using wavelet filtering and differential filtering, and a kernel function is applied to make the samples closer to linearly separable. Then the dictionaries of seizure and nonseizure are respectively learned from the original ictal and interictal training samples with an online dictionary optimization algorithm to compose the training dictionary. After that, the test samples are sparsely coded over the learned dictionary and the residuals associated with the ictal and interictal sub-dictionaries are calculated, respectively. Finally, the test samples are classified into two distinct categories, seizure or nonseizure, by comparing the reconstruction residuals. An average segment-based sensitivity of 95.45%, specificity of 99.08%, and event-based sensitivity of 94.44%, with a false detection rate of 0.23/h and an average latency of -5.14 s, have been achieved with our proposed method.
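The residual-based decision rule can be sketched as follows, using scikit-learn's ElasticNet as the sparse coder over each sub-dictionary. The paper learns its dictionaries online; the dictionaries and test epoch below are toy stand-ins:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def classify_by_residual(x, D_seizure, D_normal, alpha=0.1, l1_ratio=0.5):
    """Sparse-code a test epoch over each sub-dictionary with an elastic-net
    penalty, then label it by the smaller reconstruction residual."""
    def residual(D):
        coder = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, fit_intercept=False)
        coder.fit(D, x)                     # columns of D act as dictionary atoms
        return np.linalg.norm(x - D @ coder.coef_)
    return "seizure" if residual(D_seizure) < residual(D_normal) else "nonseizure"

rng = np.random.default_rng(3)
D_seizure = rng.normal(size=(64, 20))       # toy ictal dictionary
D_normal = rng.normal(size=(64, 20))        # toy interictal dictionary
x = D_seizure[:, :3] @ np.array([1.0, -0.5, 0.8])   # epoch built from ictal atoms
print(classify_by_residual(x, D_seizure, D_normal))  # -> "seizure"
```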
van der Klift, Heleen M; Tops, Carli M J; Bik, Elsa C; Boogaard, Merel W; Borgstein, Anne-Marijke; Hansson, Kerstin B M; Ausems, Margreet G E M; Gomez Garcia, Encarna; Green, Andrew; Hes, Frederik J; Izatt, Louise; van Hest, Liselotte P; Alonso, Angel M; Vriends, Annette H J T; Wagner, Anja; van Zelst-Stams, Wendy A G; Vasen, Hans F A; Morreau, Hans; Devilee, Peter; Wijnen, Juul T
2010-05-01
Heterozygous mutations in PMS2 are involved in Lynch syndrome, whereas biallelic mutations are found in Constitutional mismatch repair-deficiency syndrome patients. Mutation detection is complicated by the occurrence of sequence exchange events between the duplicated regions of PMS2 and PMS2CL. We investigated the frequency of such events with a nonspecific polymerase chain reaction (PCR) strategy, co-amplifying both PMS2 and PMS2CL sequences. This allowed us to score ratios between gene- and pseudogene-specific nucleotides at 29 paralogous sequence variant (PSV) sites from exon 11 to the end of the gene. We found sequence transfer at all investigated PSVs from intron 12 to the 3' end of the gene in 4 to 52% of DNA samples. Overall, sequence exchange between PMS2 and PMS2CL was observed in 69% (83/120) of individuals. We demonstrate that mutation scanning with PMS2-specific PCR primers and MLPA probes, designed on PSVs, in the 3' duplicated region is unreliable, and present an RNA-based mutation detection strategy to improve reliability. Using this strategy, we found 19 different putative pathogenic PMS2 mutations. Four of these (21%) lie in the region with frequent sequence transfer and are missed or called incorrectly as homozygous with several PSV-based mutation detection methods. (c) 2010 Wiley-Liss, Inc.
Performance of Traditional and Molecular Methods for Detecting Biological Agents in Drinking Water
Francy, Donna S.; Bushon, Rebecca N.; Brady, Amie M.G.; Bertke, Erin E.; Kephart, Christopher M.; Likirdopulos, Christina A.; Mailot, Brian E.; Schaefer, Frank W.; Lindquist, H.D. Alan
2009-01-01
To reduce the impact from a possible bioterrorist attack on drinking-water supplies, analytical methods are needed to rapidly detect the presence of biological agents in water. To this end, 13 drinking-water samples were collected at 9 water-treatment plants in Ohio to assess the performance of a molecular method in comparison to traditional analytical methods that take longer to perform. Two 100-liter samples were collected at each site during each sampling event; one was seeded in the laboratory with six biological agents: Bacillus anthracis (B. anthracis), Burkholderia cepacia (as a surrogate for Bu. pseudomallei), Francisella tularensis (F. tularensis), Salmonella Typhi (S. Typhi), Vibrio cholerae (V. cholerae), and Cryptosporidium parvum (C. parvum). The seeded and unseeded samples were processed by ultrafiltration and analyzed by use of quantitative polymerase chain reaction (qPCR), a molecular method, and culture methods for bacterial agents or the immunomagnetic separation/fluorescent antibody (IMS/FA) method for C. parvum as traditional methods. Six replicate seeded samples were also processed and analyzed. For traditional methods, recoveries were highly variable between samples and even between some replicate samples, ranging from below detection to greater than 100 percent. Recoveries were significantly related to water pH, specific conductance, and dissolved organic carbon (DOC) for all bacteria combined by culture methods, but none of the water-quality characteristics tested were related to recoveries of C. parvum by IMS/FA. Recoveries were not determined by qPCR because of problems in quantifying organisms by qPCR in the composite seed. Instead, qPCR results were reported as detected, not detected (no qPCR signal), or +/- detected (Cycle Threshold or 'Ct' values were greater than 40). Several sample results by qPCR were omitted from the dataset because of possible problems with qPCR reagents, primers, and probes. For the remaining 14 qPCR results (including some replicate samples), F. tularensis and V. cholerae were detected in all samples after ultrafiltration, B. anthracis was detected in 13 and +/- detected in 1 sample, and C. parvum was detected in 9 and +/- detected in 4 samples. Bu. cepacia was detected in nine samples, +/- detected in two samples, and not detected in three samples (for two out of three samples not detected, a different strain was used). The qPCR assay for V. cholerae provided two false-positive, but late, signals in one unseeded sample. Numbers found by qPCR after ultrafiltration were significantly or nearly significantly related to those found by traditional methods for B. anthracis, F. tularensis, and V. cholerae but not for Bu. cepacia and C. parvum. A qPCR assay for S. Typhi was not available. The qPCR method can be used to rapidly detect B. anthracis, F. tularensis, and V. cholerae with some certainty in drinking-water samples, but additional work would be needed to optimize and test qPCR for Bu. cepacia and C. parvum and establish relations to traditional methods. The specificity for the V. cholerae assay needs to be further investigated. Evidence is provided that ultrafiltration and qPCR are promising methods to rapidly detect biological agents in the Nation's drinking-water supplies and thus reduce the impact and consequences from intentional bioterrorist events. To our knowledge, this is the first study to compare the use of traditional and qPCR methods to detect biological agents in large-volume drinking-water samples.
Method for detection of selected chemicals in an open environment
NASA Technical Reports Server (NTRS)
Duong, Tuan (Inventor); Ryan, Margaret (Inventor)
2009-01-01
The present invention relates to a space-invariant independent component analysis and electronic nose for detection of selective chemicals in an unknown environment, and more specifically, an approach to analysis of sensor responses to mixtures of unknown chemicals by an electronic nose in an open and changing environment. It is intended to fill the gap between an alarm, which has little or no ability to distinguish among chemical compounds causing a response, and an analytical instrument, which can distinguish all compounds present but with no real-time or continuous event monitoring ability.
A spatial scan statistic for compound Poisson data.
Rosychuk, Rhonda J; Chang, Hsing-Ming
2013-12-20
The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
Method for detecting binding events using micro-X-ray fluorescence spectrometry
Warner, Benjamin P.; Havrilla, George J.; Mann, Grace
2010-12-28
Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
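A compact sketch of the MFCC/HMM pipeline, using librosa for feature extraction and hmmlearn for the class models (file paths, state counts and the training-set structure are illustrative; the MPEG-7 ASP alternative evaluated in the paper is not shown):

```python
import numpy as np
import librosa
from hmmlearn.hmm import GaussianHMM

# Train one HMM per audio class (e.g. excited speech, clapping) on MFCC frames,
# then label new segments by the highest log-likelihood model.
def mfcc_frames(path, sr=16000, n_mfcc=13):
    y, sr = librosa.load(path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # (frames, coeffs)

def train_class_model(paths, n_states=4):
    # For simplicity, training sequences are concatenated; hmmlearn also
    # accepts per-sequence lengths for proper multi-sequence training.
    X = np.vstack([mfcc_frames(p) for p in paths])
    return GaussianHMM(n_components=n_states, covariance_type="diag").fit(X)

# Hypothetical usage (file names are placeholders):
# models = {label: train_class_model(paths) for label, paths in training_sets.items()}
# segment = mfcc_frames("candidate_segment.wav")
# best = max(models, key=lambda label: models[label].score(segment))
```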
Modeling Concept Dependencies for Event Detection
2014-04-04
Gaussian Mixture Model (GMM). Jiang et al. [8] provide a summary of experiments for TRECVID MED 2010. They employ low-level features such as SIFT and... event detection literature. Ballan et al. [2] present a method to introduce temporal information for video event detection with a BoW (bag-of-words) approach. Zhou et al. [24] study video event detection by encoding a video with a set of bag of SIFT feature vectors and describe the distribution with a
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing modeling that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.
NASA Astrophysics Data System (ADS)
Hutchison, A. A.; Ghosh, A.
2016-12-01
Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure moment tensor solutions are similar to that of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
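A minimal matched-filter sketch in the spirit of this approach: a normalized cross-correlation of a VLFE template is slid along the continuous record, and detections are declared above a median-absolute-deviation threshold (the 8x MAD level and the synthetic data are illustrative choices, not the study's settings):

```python
import numpy as np

def match_filter(template, data, threshold=8.0):
    """Slide a normalized cross-correlation of the template along the record;
    flag samples where it exceeds `threshold` times the MAD of the trace."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = np.dot(t, (w - w.mean()) / w.std()) / n   # correlation in [-1, 1]
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > threshold * mad), cc

rng = np.random.default_rng(4)
template = np.sin(2 * np.pi * 0.03 * np.arange(900))      # ~90 s at 10 Hz, VLF band
data = rng.normal(0, 1, 20000)
data[5000:5900] += 0.8 * template                          # buried repeat of the event
picks, cc = match_filter(template, data)
print(picks[:5])                                           # detections near sample 5000
```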
Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong
2015-06-01
With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 using a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study supplies a powerful, very simple, and accurate detection strategy for the unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. Copyright © 2014 Elsevier Ltd. All rights reserved.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-01-01
Study Objective: To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Methods: Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHIauto) was calculated using both signals, and a respiratory disturbance index (RDIauto) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter-derived pulse wave signal and compared with manual scoring. Results: Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m2) were included in the analysis. AHImanual (19.4 ± 18.5 events/h) correlated highly significantly with AHIauto (19.9 ± 16.5 events/h) and RDIauto (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001), with a mean difference of −0.5 ± 6.6 and −1.0 ± 6.1 events/h. The automatic analysis of AHIauto and RDIauto detected sleep apnea (cutoff AHImanual ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manual scoring (r = 0.87 and 0.95, p < 0.001), with mean differences of −4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Conclusions: Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep. Citation: Sommermeyer D; Zou D; Grote L; Hedner J. Detection of sleep disordered breathing and its central/obstructive character using nasal cannula and finger pulse oximeter. J Clin Sleep Med 2012;8(5):527-533. PMID:23066364
Predictive modeling of structured electronic health records for adverse drug event detection
2015-01-01
Background The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Methods Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Results Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and combined. Conclusions We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two. PMID:26606038
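As a rough sketch of the representation-plus-feature-selection setup described above (feature names, dimensions and the classifier choice are illustrative, not the paper's exact models):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import make_pipeline

# Toy setup: binary indicators for assigned clinical codes plus summary
# statistics of lab measurements (names and shapes are illustrative).
rng = np.random.default_rng(8)
n_patients = 500
codes = rng.integers(0, 2, size=(n_patients, 300))        # drug/diagnosis codes
measurements = rng.normal(size=(n_patients, 40))          # e.g. last lab values
X = np.hstack([codes, np.abs(measurements)])              # chi2 needs nonnegative input
y = rng.integers(0, 2, size=n_patients)                   # ADE label per record

model = make_pipeline(SelectKBest(chi2, k=100),           # feature selection step
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(X, y)
print(model.predict(X[:5]))
```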
Detecting and characterizing coal mine related seismicity in the Western U.S. using subspace methods
NASA Astrophysics Data System (ADS)
Chambers, Derrick J. A.; Koper, Keith D.; Pankow, Kristine L.; McCarter, Michael K.
2015-11-01
We present an approach for subspace detection of small seismic events that includes methods for estimating magnitudes and associating detections from multiple stations into unique events. The process is used to identify mining related seismicity from a surface coal mine and an underground coal mining district, both located in the Western U.S. Using a blasting log and a locally derived seismic catalogue as ground truth, we assess detector performance in terms of verified detections, false positives and failed detections. We are able to correctly identify over 95 per cent of the surface coal mine blasts and about 33 per cent of the events from the underground mining district, while keeping the number of potential false positives relatively low by requiring all detections to occur on two stations. We find that most of the potential false detections for the underground coal district are genuine events missed by the local seismic network, demonstrating the usefulness of regional subspace detectors in augmenting local catalogues. We note a trade-off in detection performance between stations at smaller source-receiver distances, which have increased signal-to-noise ratio, and stations at larger distances, which have greater waveform similarity. We also explore the increased detection capabilities of a single higher dimension subspace detector, compared to multiple lower dimension detectors, in identifying events that can be described as linear combinations of training events. We find, in our data set, that such an advantage can be significant, justifying the use of a subspace detection scheme over conventional correlation methods.
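The core of a subspace detector can be sketched in a few lines: aligned training waveforms are stacked, an orthonormal basis is taken from their SVD, and the detection statistic is the fraction of sliding-window energy captured by that basis (dimensions and the synthetic data below are illustrative; the paper's magnitude estimation and multi-station association steps are omitted):

```python
import numpy as np

def build_subspace(training_events, dim):
    """Stack aligned, normalized training waveforms; keep the top `dim` left
    singular vectors as an orthonormal signal subspace."""
    X = np.vstack([w / np.linalg.norm(w) for w in training_events]).T
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]                       # (n_samples, dim)

def detection_statistic(U, data):
    """Fraction of window energy captured by the subspace, slid along the record."""
    n = U.shape[0]
    stats = np.empty(len(data) - n + 1)
    for i in range(len(stats)):
        w = data[i:i + n]
        proj = U.T @ w
        stats[i] = (proj @ proj) / (w @ w)  # in [0, 1]
    return stats

rng = np.random.default_rng(5)
master = np.sin(2 * np.pi * 5 * np.arange(200) / 100.0) * np.hanning(200)
training = [master + 0.1 * rng.normal(size=200) for _ in range(6)]
U = build_subspace(training, dim=2)
data = rng.normal(0, 0.5, 5000)
data[3000:3200] += master                   # buried event
print(np.argmax(detection_statistic(U, data)))   # ~3000
```

A higher `dim` lets the detector represent linear combinations of training events, which is the advantage over single-template correlation noted in the abstract.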
Carulli, Giovanni; Marini, Alessandra; Sammuri, Paola; Domenichini, Cristiana; Ottaviano, Virginia; Pacini, Simone; Petrini, Mario
2015-01-01
The identification of eosinophils by flow cytometry is difficult because most of the surface antigens expressed by eosinophils are shared with neutrophils. Some methods have been proposed, generally based on differential light scatter properties, enhanced autofluorescence, lack of CD16 or selective positivity of CD52. Such methods, however, show several limitations. In the present study we report a novel method based on the analysis of glycosylphosphatidylinositol (GPI)-linked molecules. The combination of CD157 and FLAER was used, since FLAER recognizes all GPI-linked molecules, while CD157 is absent on the membrane of eosinophils and expressed by neutrophils. Peripheral blood samples from normal subjects and patients with variable percentages of eosinophils (n = 31), and without any evidence of circulating immature myeloid cells, were stained with the combination of FLAER-Alexa Fluor and CD157-PE. A FACSCanto II cytometer was used. Granulocytes were gated after CD33 staining and eosinophils were identified as CD157(-)/FLAER(+) events. Neutrophils were identified as CD157(+)/FLAER(+) events. The percentages of eosinophils detected by this method showed a very significant correlation both with automated counting and with manual counting (r = 0.981 and 0.989, respectively). Sorting assays were carried out on an S3 Cell Sorter: cytospins obtained from CD157(-)/FLAER(+) events consisted of 100% eosinophils, while samples from CD157(+)/FLAER(+) events contained only neutrophils. In conclusion, this method shows high sensitivity and specificity for distinguishing eosinophils from neutrophils by flow cytometry. However, since CD157 is gradually up-regulated throughout bone marrow myeloid maturation, our method cannot be applied to cases characterized by immature myeloid cells.
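The gating logic reduces to simple boolean masks over the measured intensities, as in the sketch below (channel values and gate thresholds are illustrative; real gates are set per sample on the cytometer):

```python
import numpy as np

# One value per cell and channel (illustrative intensities, not real data)
cd33, cd157, flaer = (np.array(c) for c in [
    [900, 850, 30, 920, 880],    # CD33: granulocyte gate
    [600, 20, 15, 650, 25],      # CD157: neutrophils positive, eosinophils negative
    [700, 720, 10, 690, 710],    # FLAER: all GPI-intact granulocytes positive
])

granulocytes = cd33 > 200                                     # gate on CD33 first
eosinophils = granulocytes & (cd157 < 100) & (flaer > 100)    # CD157(-)/FLAER(+)
neutrophils = granulocytes & (cd157 > 100) & (flaer > 100)    # CD157(+)/FLAER(+)
print(f"eosinophils: {100 * eosinophils.sum() / granulocytes.sum():.1f}%")
```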
Dick, Jeffrey E.; Hilterbrand, Adam T.; Strawsine, Lauren M.; Upton, Jason W.; Bard, Allen J.
2016-01-01
We report the specific collision of a single murine cytomegalovirus (MCMV) on a platinum ultramicroelectrode (UME, radius of 1 μm). Antibody directed against the viral surface protein glycoprotein B, functionalized with glucose oxidase (GOx), allowed for specific detection of the virus in solution and in a biological sample (urine). The oxidation of ferrocene methanol to ferrocenium methanol was carried out at the electrode surface, and the ferrocenium methanol acted as the cosubstrate for GOx to catalyze the oxidation of glucose to gluconolactone. In the presence of glucose, the incident collision of a GOx-covered virus onto the UME while ferrocene methanol was being oxidized produced stepwise increases in current as observed by amperometry. These current increases were observed due to the feedback loop of ferrocene methanol to the surface of the electrode after GOx reduces ferrocenium methanol back to ferrocene. Negative controls (i) without glucose, (ii) with an irrelevant virus (murine gammaherpesvirus 68), and (iii) without either virus do not display these current increases. Stepwise current decreases were observed for the first two negative controls, and no discrete events were observed for the third. We further apply this method to the detection of MCMV in urine of infected mice. The method provides for a selective, rapid, and sensitive detection technique based on electrochemical collisions. PMID:27217569
Duan, Junbo; Zhang, Ji-Gang; Deng, Hong-Wen; Wang, Yu-Ping
2013-01-01
Copy number variation (CNV) has played an important role in studies of susceptibility or resistance to complex diseases. Traditional methods such as fluorescence in situ hybridization (FISH) and array comparative genomic hybridization (aCGH) suffer from low resolution of genomic regions. Following the emergence of next generation sequencing (NGS) technologies, CNV detection methods based on short read data have recently been developed. However, due to the relatively young age of these procedures, their performance is not fully understood. To help investigators choose suitable methods to detect CNVs, comparative studies are needed. We compared six publicly available CNV detection methods: CNV-seq, FREEC, readDepth, CNVnator, SegSeq and event-wise testing (EWT). They are evaluated both on simulated and real data with different experimental settings. Receiver operating characteristic (ROC) curves are used to demonstrate detection performance in terms of sensitivity and specificity, box plots to compare performance in breakpoint and copy number estimation, Venn diagrams to show the consistency among the methods, and F-scores to assess the overlap quality of detected CNVs. The computational demands are also studied. The results of our work provide a comprehensive evaluation of the performance of the selected CNV detection methods, which will help biological investigators choose the best possible method.
Recovery and normalization of triple coincidences in PET.
Lage, Eduardo; Parot, Vicente; Moore, Stephen C; Sitek, Arkadiusz; Udías, Jose M; Dave, Shivang R; Park, Mi-Ae; Vaquero, Juan J; Herraiz, Joaquin L
2015-03-01
Triple coincidences in positron emission tomography (PET) are events in which three γ-rays are detected simultaneously. These events, though potentially useful for enhancing the sensitivity of PET scanners, are discarded or processed without special consideration in current systems, because there is not a clear criterion for assigning them to a unique line-of-response (LOR). Methods proposed for recovering such events usually rely on the use of highly specialized detection systems, hampering general adoption, and/or are based on Compton-scatter kinematics and, consequently, are limited in accuracy by the energy resolution of standard PET detectors. In this work, the authors propose a simple and general solution for recovering triple coincidences, which does not require specialized detectors or additional energy resolution requirements. To recover triple coincidences, the authors' method distributes such events among their possible LORs using the relative proportions of double coincidences in these LORs. The authors show analytically that this assignment scheme represents the maximum-likelihood solution for the triple-coincidence distribution problem. The PET component of a preclinical PET/CT scanner was adapted to enable the acquisition and processing of triple coincidences. Since the efficiencies for detecting double and triple events were found to be different throughout the scanner field-of-view, a normalization procedure specific for triple coincidences was also developed. The effect of including triple coincidences using their method was compared against the cases of equally weighting the triples among their possible LORs and discarding all the triple events. As figures of merit for this comparison, the authors used sensitivity, noise-equivalent count (NEC) rates and image quality calculated as described in the NEMA NU-4 protocol for the assessment of preclinical PET scanners. The addition of triple-coincidence events with the authors' method increased peak NEC rates of the scanner by 26.6% and 32% for mouse- and rat-sized objects, respectively. This increase in NEC-rate performance was also reflected in the image-quality metrics. Images reconstructed using double and triple coincidences recovered with their method had better signal-to-noise ratio than those obtained using only double coincidences, while preserving spatial resolution and contrast. Distribution of triple coincidences using an equal-weighting scheme increased apparent system sensitivity but degraded image quality. The performance boost provided by the inclusion of triple coincidences using their method made it possible to reduce the acquisition time of standard imaging procedures by up to ∼25%. Recovering triple coincidences with the proposed method can effectively increase the sensitivity of current clinical and preclinical PET systems without compromising other parameters such as spatial resolution or contrast.
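The assignment rule itself is compact: each triple coincidence is split fractionally among its candidate LORs in proportion to the double-coincidence counts already recorded on those LORs. A sketch with toy counts (the LOR indexing is illustrative; a real sinogram has far more bins):

```python
import numpy as np

def distribute_triples(double_counts, triple_events):
    """Assign each triple coincidence fractionally among its candidate LORs in
    proportion to the double-coincidence counts on those LORs, the
    maximum-likelihood weighting described in the paper."""
    sinogram = double_counts.astype(float).copy()
    for lors in triple_events:                  # lors: candidate LOR indices
        d = double_counts[lors].astype(float)
        w = d / d.sum() if d.sum() > 0 else np.full(len(lors), 1 / len(lors))
        sinogram[lors] += w                     # fractional counts added
    return sinogram

doubles = np.array([120, 40, 8, 75])            # double counts on four LORs
triples = [[0, 1], [0, 2], [1, 3]]              # candidate LORs per triple event
print(distribute_triples(doubles, triples))
```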
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means of providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines permutation entropy and a support vector machine to detect low-SNR microseismic events. First, an extraction method for signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation for the collected vibration signals and constructing a feature vector set from the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which offers high classification accuracy and fast computation, can meet the requirements of online, real-time extraction of microseismic events.
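A minimal sketch of the feature-extraction stage: permutation entropy is computed on coarse-grained copies of the signal at several scales, yielding one feature vector per record that would then be fed to the (least squares) SVM. Order, scales and the test signals are illustrative choices:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (1 = maximally random)."""
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + order * delay:delay]))  # ordinal pattern
        patterns[pattern] = patterns.get(pattern, 0) + 1
    probs = np.array(list(patterns.values())) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(order))

def multiscale_pe(x, scales=(1, 2, 3, 4, 5), order=3):
    """Coarse-grain the signal at each scale, compute PE -> feature vector."""
    feats = []
    for s in scales:
        coarse = np.asarray(x[:len(x) // s * s]).reshape(-1, s).mean(axis=1)
        feats.append(permutation_entropy(coarse, order=order))
    return np.array(feats)

rng = np.random.default_rng(6)
noise = rng.normal(size=2000)                       # noise record
event = np.sin(2 * np.pi * 0.01 * np.arange(2000))  # structured waveform
print(multiscale_pe(noise).round(2), multiscale_pe(event).round(2))
```

Noise-like records score near the maximum entropy of 1 at every scale, while structured microseismic waveforms score lower; that separation is what the classifier exploits.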
An automated approach towards detecting complex behaviours in deep brain oscillations.
Mace, Michael; Yousif, Nada; Naushahi, Mohammad; Abdullah-Al-Mamun, Khondaker; Wang, Shouyan; Nandi, Dipankar; Vaidyanathan, Ravi
2014-03-15
Extracting event-related potentials (ERPs) from neurological rhythms is of fundamental importance in neuroscience research. Standard ERP techniques typically require the associated ERP waveform to have low variance, be shape and latency invariant and require many repeated trials. Additionally, the non-ERP part of the signal needs to be sampled from an uncorrelated Gaussian process. This limits methods of analysis to quantifying simple behaviours and movements only when multi-trial data-sets are available. We introduce a method for automatically detecting events associated with complex or large-scale behaviours, where the ERP need not conform to the aforementioned requirements. The algorithm is based on the calculation of a detection contour and adaptive threshold. These are combined using logical operations to produce a binary signal indicating the presence (or absence) of an event with the associated detection parameters tuned using a multi-objective genetic algorithm. To validate the proposed methodology, deep brain signals were recorded from implanted electrodes in patients with Parkinson's disease as they participated in a large movement-based behavioural paradigm. The experiment involved bilateral recordings of local field potentials from the sub-thalamic nucleus (STN) and pedunculopontine nucleus (PPN) during an orientation task. After tuning, the algorithm is able to extract events achieving training set sensitivities and specificities of [87.5 ± 6.5, 76.7 ± 12.8, 90.0 ± 4.1] and [92.6 ± 6.3, 86.0 ± 9.0, 29.8 ± 12.3] (mean ± 1 std) for the three subjects, averaged across the four neural sites. Furthermore, the methodology has the potential for utility in real-time applications as only a single-trial ERP is required. Copyright © 2013 Elsevier B.V. All rights reserved.
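A simplified stand-in for the detection pipeline: a smoothed power contour is compared against an adaptive running-median threshold, and the two are combined logically into a binary event signal. Smoothing window, threshold scale and the synthetic data are illustrative; in the paper these parameters are tuned by the multi-objective genetic algorithm:

```python
import numpy as np

def detect_events(lfp, fs, smooth_s=0.25, thresh_scale=2.0):
    """Binary event signal: logical combination of (i) a detection contour
    (smoothed signal power) and (ii) an adaptive threshold tracking its
    running median (both simplified relative to the GA-tuned versions)."""
    power = lfp ** 2
    k = int(smooth_s * fs)
    contour = np.convolve(power, np.ones(k) / k, mode="same")   # detection contour
    window = int(5 * fs)                                        # 5 s running median
    threshold = np.array([thresh_scale * np.median(contour[max(0, i - window):i + 1])
                          for i in range(len(contour))])        # adaptive threshold
    return contour > threshold                                  # logical combination

fs = 250.0
rng = np.random.default_rng(7)
lfp = rng.normal(0, 1, int(60 * fs))
lfp[5000:5500] *= 4                      # movement-related power burst
events = detect_events(lfp, fs)
print(events[5100], events[1000])        # True near the burst, False in baseline
```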
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
Linardy, Evelyn M; Erskine, Simon M; Lima, Nicole E; Lonergan, Tina; Mokany, Elisa; Todd, Alison V
2016-01-15
Advancements in molecular biology have improved the ability to characterize disease-related nucleic acids and proteins. Recently, there has been an increasing desire for tests that can be performed outside of centralised laboratories. This study describes a novel isothermal signal amplification cascade called EzyAmp (enzymatic signal amplification) that is being developed for detection of targets at the point of care. EzyAmp exploits the ability of some restriction endonucleases to cleave substrates containing nicks within their recognition sites. EzyAmp uses two oligonucleotide duplexes (partial complexes 1 and 2) which are initially cleavage-resistant as they lack a complete recognition site. The recognition site of partial complex 1 can be completed by hybridization of a triggering oligonucleotide (Driver Fragment 1) that is generated by a target-specific initiation event. Binding of Driver Fragment 1 generates a completed complex 1, which, upon cleavage, releases Driver Fragment 2. In turn, binding of Driver Fragment 2 to partial complex 2 creates completed complex 2, which when cleaved releases additional Driver Fragment 1. Each cleavage event separates fluorophore-quencher pairs, resulting in an increase in fluorescence. At this stage, the cascade of signal production becomes independent of further target-specific initiation events. This study demonstrated that the EzyAmp cascade can facilitate detection and quantification of nucleic acid targets with sensitivity down to aM concentrations. Further, the same cascade detected VEGF protein with a sensitivity of 20 nM, showing that this universal method for amplifying signal may be linked to the detection of different types of analytes in an isothermal format. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Wei, Wei; Gao, Chunyan; Xiong, Yanxiang; Zhang, Yuanjian; Liu, Songqin; Pu, Yuepu
2015-01-01
DNA methylation plays an important role in many biological events and is associated with various diseases. Most traditional methods for detection of DNA methylation are based on the complex and expensive bisulfite method. In this paper, we report a novel fluorescence method to detect DNA and DNA methylation based on graphene oxide (GO) and the restriction endonuclease HpaII. The carefully designed probe DNA, labeled with 5-carboxyfluorescein (FAM), and the optimized GO concentration keep the probe/target DNA adsorbed on the GO. After cleavage by HpaII, the labeled FAM is released from the GO surface and its fluorescence recovers, which can be used to detect DNA in the linear range of 50 pM-50 nM with a detection limit of 43 pM. DNA methylation induced by methyltransferase (MTase) or other chemical reagents prevents HpaII from recognizing and cleaving the specific site; as a result, the fluorescence cannot recover. The fluorescence recovery efficiency is closely related to the DNA methylation level, which can be used to detect DNA methylation by comparison with the fluorescence in the presence of intact target DNA. The method for detection of DNA and DNA methylation is simple, reliable and accurate. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novak, Avrey; Nyflot, Matthew J.; Ermoian, Ralph P.
Purpose: Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. Methods: From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, of the 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Results: Near-miss incidents most commonly originated within treatment planning (33%). However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, average NMRI of all events = 1.5), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are ones that the barrier is not designed to capture. Conclusions: Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation, with the most severe near-miss events detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
Magnetoresistive biosensors for quantitative proteomics
NASA Astrophysics Data System (ADS)
Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.
2017-08-01
Quantitative proteomics, as a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, and magnetic sensors, have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and its flexibility in allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and they are compatible with semiconductor fabrication processes, enabling low-cost, small-size devices for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.
Detecting Disease Outbreaks in Mass Gatherings Using Internet Data
Yom-Tov, Elad; Cox, Ingemar J; McKendry, Rachel A
2014-01-01
Background Mass gatherings, such as music festivals and religious events, pose a health care challenge because of the risk of transmission of communicable diseases. This is exacerbated by the fact that participants disperse soon after the gathering, potentially spreading disease within their communities. The dispersion of participants also poses a challenge for traditional surveillance methods. The ubiquitous use of the Internet may enable the detection of disease outbreaks through analysis of data generated by users during events and shortly thereafter. Objective The intent of the study was to develop algorithms that can alert to possible outbreaks of communicable diseases from Internet data, specifically Twitter and search engine queries. Methods We extracted all Twitter postings and queries made to the Bing search engine by users who repeatedly mentioned one of nine major music festivals held in the United Kingdom and one religious event (the Hajj in Mecca) during 2012, for a period of 30 days before and after each festival. We analyzed these data using three methods, two of which compared words associated with disease symptoms before and after the time of the festival, and one that compared the frequency of these words with those of other users in the United Kingdom in the days following the festivals. Results The data comprised, on average, 7.5 million tweets made by 12,163 users, and 32,143 queries made by 1756 users from each festival. Our methods indicated the statistically significant appearance of a disease symptom in two of the nine festivals. For example, cough was detected at higher than expected levels following the Wakestock festival. Statistically significant agreement (chi-square test, P<.01) between methods and across data sources was found where a statistically significant symptom was detected. Anecdotal evidence suggests that symptoms detected are indeed indicative of a disease that some users attributed to being at the festival. Conclusions Our work shows the feasibility of creating a public health surveillance system for mass gatherings based on Internet data. The use of multiple data sources and analysis methods was found to be advantageous for rejecting false positives. Further studies are required in order to validate our findings with data from public health authorities. PMID:24943128
Detection of explosive cough events in audio recordings by internal sound analysis.
Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P
2017-07-01
We present a new method for the discrimination of explosive cough events, based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline set, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Moreover, this feature set seems to generalize well when it is trained on a small data set of patients with a variety of respiratory and cardiovascular diseases and tested on a bigger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved under those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.
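The first step of the pipeline, obtaining event boundaries after removing near-silent segments, could look like the following minimal sketch (assumptions: frame-based RMS energy and a relative threshold; the paper's actual segmentation may differ):

```python
import numpy as np

def event_boundaries(x, fs, frame_ms=20, rel_threshold=0.1):
    """Return (onset, offset) sample indices of non-silent events."""
    frame = int(fs * frame_ms / 1000)
    n_frames = len(x) // frame
    rms = np.array([np.sqrt(np.mean(x[i*frame:(i+1)*frame] ** 2))
                    for i in range(n_frames)])
    active = rms > rel_threshold * rms.max()      # frames above the energy floor
    edges = np.flatnonzero(np.diff(active.astype(int)))
    # pair rising/falling edges; assumes the recording starts near-silent
    starts, ends = edges[::2], edges[1::2]
    return [(s * frame, e * frame) for s, e in zip(starts, ends)]
```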
Method for early detection of cooling-loss events
Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.
2015-06-30
A method for early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined relative humidity limit; generating a warning in the event that the measured relative humidity is outside the defined limit; and determining whether the change in the measured relative humidity exceeds the defined change threshold for the given space and generating an alarm in the event that it does.
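The claimed decision logic is simple enough to sketch directly; the threshold values below are hypothetical and would be calibrated per space:

```python
RH_LIMIT = (30.0, 60.0)   # allowed relative-humidity band for the space, in %
CHANGE_THRESHOLD = 5.0    # allowed change between readings, percentage points

def check_cooling(rh_now: float, rh_prev: float) -> str:
    """Warning if RH leaves its limit; alarm if it changes faster than allowed."""
    if not (RH_LIMIT[0] <= rh_now <= RH_LIMIT[1]):
        return "WARNING: relative humidity outside defined limit"
    if abs(rh_now - rh_prev) > CHANGE_THRESHOLD:
        return "ALARM: relative-humidity change exceeds threshold"
    return "OK"

print(check_cooling(rh_now=62.0, rh_prev=55.0))  # WARNING
print(check_cooling(rh_now=52.0, rh_prev=45.0))  # ALARM
```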
NASA Astrophysics Data System (ADS)
McBeth, Rafe A.
Space radiation exposure to astronauts will need to be carefully monitored on future missions beyond low Earth orbit. NASA has proposed an updated radiation risk framework that takes into account a significant amount of radiobiological and heavy-ion track structure information. These models require active radiation detection systems that measure the particle energy and ion charge (Z). However, current radiation detection systems cannot meet these demands. The aim of this study was to investigate several topics that will help next-generation detection systems meet the NASA objectives. Specifically, this work investigates the spatial resolution required to avoid coincident events in a detector, the effects of energy straggling and the conversion of dose from silicon to water, and methods for ion identification (Z) using machine learning. The main results of this dissertation are as follows: 1. Spatial resolution on the order of 0.1 cm is required for active space radiation detectors to have high confidence in identifying individual particles, i.e., to eliminate coincident events. 2. The energy resolution of a detector system will be limited by energy straggling effects and the conversion of dose in silicon to dose in biological tissue (water). 3. Machine learning methods show strong promise for identification of ion charge (Z) with simple detector designs.
Clinical outcome of subchromosomal events detected by whole‐genome noninvasive prenatal testing
Helgeson, J.; Wardrop, J.; Boomer, T.; Almasri, E.; Paxton, W. B.; Saldivar, J. S.; Dharajiya, N.; Monroe, T. J.; Farkas, D. H.; Grosu, D. S.
2015-01-01
Objective A novel algorithm to identify fetal microdeletion events in maternal plasma has been developed and used in clinical laboratory-based noninvasive prenatal testing. We used this approach to identify the subchromosomal events 5pdel, 22q11del, 15qdel, 1p36del, 4pdel, 11qdel, and 8qdel in routine testing. We describe the clinical outcomes of those samples identified with these subchromosomal events. Methods Blood samples from high-risk pregnant women submitted for noninvasive prenatal testing were analyzed using low-coverage whole-genome massively parallel sequencing. Sequencing data were analyzed using a novel algorithm to detect trisomies and microdeletions. Results In testing 175,393 samples, 55 subchromosomal deletions were reported. The overall positive predictive value for each subchromosomal aberration ranged from 60% to 100% for cases with diagnostic and clinical follow-up information. The total false positive rate was 0.0017% for confirmed false positive results; the false negative rate and sensitivity were not conclusively determined. Conclusion Noninvasive testing can be expanded to the detection of subchromosomal copy number variations, while maintaining overall high test specificity. In the current setting, our results demonstrate high positive predictive values for testing of rare subchromosomal deletions. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons Ltd. PMID:26088833
Fritsch, Leonie; Fischer, Rainer; Wambach, Christoph; Dudek, Max; Schillberg, Stefan; Schröper, Florian
2015-08-01
Simple and reliable, high-throughput techniques to detect the zygosity of transgenic events in plants are valuable for biotechnology and plant breeding companies seeking robust genotyping data for the assessment of new lines and the monitoring of breeding programs. We show that next-generation sequencing (NGS) applied to short PCR products spanning the transgene integration site provides accurate zygosity data that are more robust and reliable than those generated by PCR-based methods. The NGS reads covered the 5' border of the transgenic events (incorporating part of the transgene and the flanking genomic DNA) or the genomic sequences flanking the unfilled transgene integration site at the wild-type locus. We compared the NGS method to competitive real-time PCR with transgene-specific and wild-type-specific primer/probe pairs, one pair matching the 5' genomic flanking sequence and the 5' part of the transgene, and the other matching the unfilled transgene integration site. Although both NGS and real-time PCR provided useful zygosity data, the NGS technique was favorable because it needed fewer optimization steps. It also provided statistically more reliable evidence for the presence of each allele, because each product was often covered by more than 100 reads. The NGS method is also more suitable for the genotyping of large panels of plants, because up to 80 million reads can be produced in one sequencing run. Our novel method is therefore ideal for the rapid and accurate genotyping of large numbers of samples.
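A minimal sketch of the genotype call implied by this abstract, counting junction-spanning reads (the thresholds are assumptions, not the published pipeline's values):

```python
def call_zygosity(transgene_reads: int, wildtype_reads: int, min_depth: int = 100):
    """Call zygosity from reads covering the transgene 5' border junction
    versus reads covering the unfilled wild-type locus."""
    total = transgene_reads + wildtype_reads
    if total < min_depth:
        return "insufficient coverage"
    frac = transgene_reads / total
    if frac > 0.9:
        return "homozygous transgenic"   # essentially only transgene junctions
    if frac < 0.1:
        return "wild type"               # essentially only the unfilled locus
    return "hemizygous"                  # both junction types present

print(call_zygosity(transgene_reads=152, wildtype_reads=148))  # hemizygous
```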
Pies, Ross E.
2016-03-29
A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier, based on the detection of disturbances within the optical fiber.
Nadal, Anna; Esteve, Teresa; Pla, Maria
2009-01-01
A multiplex polymerase chain reaction assay coupled to capillary gel electrophoresis for amplicon identification by size and color (multiplex PCR-CGE-SC) was developed for simultaneous detection of cotton species and 5 events of genetically modified (GM) cotton. Validated real-time PCR reactions targeting Bollgard, Bollgard II, Roundup Ready, 3006-210-23, and 281-24-236 junction sequences, and the cotton reference gene acp1, were adapted to detect more than half of the European Union-approved individual or stacked GM cotton events in one reaction. The assay was fully specific (<1.7% false classification rate), with limit of detection values of 0.1% for each event, which were also achieved with simulated mixtures at different relative percentages of targets. The assay was further combined with a second multiplex PCR-CGE-SC assay to allow simultaneous detection of 6 cotton and 5 maize targets (two endogenous genes and 9 GM events) in two multiplex PCRs and a single CGE, making the approach more economical. Besides allowing simultaneous detection of many targets with adequate specificity and sensitivity, the multiplex PCR-CGE-SC approach has high throughput and automation capabilities while keeping a very simple protocol, e.g., amplification and labeling in one step. Thus, it is an easy and inexpensive tool for initial screening, to be complemented with quantitative assays if necessary.
Extracting rate changes in transcriptional regulation from MEDLINE abstracts.
Liu, Wenting; Miao, Kui; Li, Guangxia; Chang, Kuiyu; Zheng, Jie; Rajapakse, Jagath C
2014-01-01
Time delays are important factors that are often neglected in gene regulatory network (GRN) inference models. Validating time delays from knowledge bases is a challenge, since the vast majority of biological databases do not record temporal information about gene regulations. Biological knowledge and facts on gene regulations are typically extracted from the bio-literature with specialized methods that depend on the regulation task. In this paper, we mine evidence for time delays related to the transcriptional regulation of yeast from PubMed abstracts. Since the vast majority of abstracts lack quantitative time information, we can only collect qualitative evidence of time delays. Specifically, a speed-up or delay in transcriptional regulation rate can provide evidence for a time delay (shorter or longer) in a GRN. Thus, we focus on deriving events related to rate changes in transcriptional regulation. A corpus of yeast-regulation-related abstracts was manually labeled with such events. To capture these events automatically, we created an ontology of sub-processes that are likely to result in transcription rate changes by combining textual patterns and biological knowledge. We also propose effective feature extraction methods based on this ontology to identify direct evidence, with specific details, of these events. Our ontologies outperform existing state-of-the-art gene regulation ontologies in the automatic rule learning method applied to our corpus. The proposed deterministic ontology rule-based method achieves performance comparable to the automatic rule learning method based on decision trees, which demonstrates the effectiveness of our ontology in identifying rate-changing events. We also tested the effectiveness of the proposed feature mining methods for detecting direct evidence of events. Experimental results show that the machine learning method using these features achieves an F1-score of 71.43%. The manually labeled corpus of events relating to rate changes in transcriptional regulation for yeast is available at https://sites.google.com/site/wentingntu/data. The created ontologies summarize both the biological causes of rate changes in transcriptional regulation and the corresponding positive and negative textual patterns from the corpus. They are demonstrated to be effective in identifying rate-changing events, which shows the benefits of combining textual patterns and biological knowledge for extracting complex biological events.
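As a toy illustration of the pattern-matching idea (far simpler than the paper's ontology; the regular expressions are hypothetical):

```python
import re

SPEEDUP = re.compile(r"\b(accelerat\w*|enhanc\w*|increas\w*)\b.*\btranscription\b",
                     re.IGNORECASE)
SLOWDOWN = re.compile(r"\b(delay\w*|slow\w*|reduc\w*)\b.*\btranscription\b",
                      re.IGNORECASE)

def classify_sentence(sentence: str) -> str:
    """Label a sentence with the direction of a transcription rate change."""
    if SPEEDUP.search(sentence):
        return "rate increase (evidence for a shorter delay)"
    if SLOWDOWN.search(sentence):
        return "rate decrease (evidence for a longer delay)"
    return "no rate-change evidence"

print(classify_sentence("Gcn4 enhances transcription of ARG1 under starvation."))
```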
Striking circadian neuron diversity and cycling of Drosophila alternative splicing.
Wang, Qingqing; Abruzzi, Katharine C; Rosbash, Michael; Rio, Donald C
2018-06-04
Although alternative pre-mRNA splicing (AS) significantly diversifies the neuronal proteome, the extent of AS is still unknown due in part to the large number of diverse cell types in the brain. To address this complexity issue, we used an annotation-free computational method to analyze and compare the AS profiles between small specific groups of Drosophila circadian neurons. The method, the Junction Usage Model (JUM), allows the comprehensive profiling of both known and novel AS events from specific RNA-seq libraries. The results show that many diverse and novel pre-mRNA isoforms are preferentially expressed in one class of clock neuron and also absent from the more standard Drosophila head RNA preparation. These AS events are enriched in potassium channels important for neuronal firing, and there are also cycling isoforms with no detectable underlying transcriptional oscillations. The results suggest massive AS regulation in the brain that is also likely important for circadian regulation. © 2018, Wang et al.
A high-throughput method for GMO multi-detection using a microfluidic dynamic array.
Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J
2014-02-01
The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly as the number of GMOs potentially present in an individual sample grows. The present work reports the results of an innovative approach to GM crop analysis by DNA-based methods: the use of a microfluidic dynamic array as a high-throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested gave the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.
Using Dictionary Pair Learning for Seizure Detection.
Ma, Xin; Yu, Nana; Zhou, Weidong
2018-02-13
Automatic seizure detection is extremely important in the monitoring and diagnosis of epilepsy. This paper presents a novel method based on dictionary pair learning (DPL) for seizure detection in long-term intracranial electroencephalogram (EEG) recordings. First, wavelet filtering and differential filtering are applied to the EEG data, and a kernel function is used to make the signal linearly separable. In DPL, the synthesis dictionary and analysis dictionary are learned jointly from the original training samples with an alternating minimization method, and sparse coefficients are obtained by linear projection instead of costly l0-norm or l1-norm optimization. Finally, the reconstruction residuals associated with the seizure and nonseizure sub-dictionary pairs are calculated as the decision values, and postprocessing is performed to improve the recognition rate and reduce the false detection rate of the system. A total of 530 h of recordings from 20 patients with 81 seizures were used to evaluate the system. Our proposed method achieved an average segment-based sensitivity of 93.39%, specificity of 98.51%, and event-based sensitivity of 96.36%, with a false detection rate of 0.236/h.
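The classification step reduces to a compact rule: with a learned synthesis dictionary D_k and analysis dictionary P_k per class, the sparse code is the linear projection P_k x, and the class with the smallest reconstruction residual wins. A minimal sketch of this general DPL decision rule (not the authors' full system with filtering and postprocessing; the class labeling is an assumption):

```python
import numpy as np

def dpl_classify(x, dictionaries):
    """dictionaries: list of (D_k, P_k) pairs, one per class
    (e.g., index 0 = nonseizure, 1 = seizure, by convention here)."""
    residuals = [np.linalg.norm(x - D @ (P @ x)) for D, P in dictionaries]
    return int(np.argmin(residuals))   # class with the smallest residual
```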
NASA Astrophysics Data System (ADS)
Tang, Xiaojing
Fast and accurate monitoring of tropical forest disturbance is essential for understanding current patterns of deforestation and for helping eliminate illegal logging. This dissertation explores the use of data from different satellites for near real-time monitoring of forest disturbance in tropical forests, including development of new monitoring methods, development of new assessment methods, and assessment of the performance and operational readiness of existing methods. Current methods for accuracy assessment of remote sensing products do not address the priority of near real-time monitoring: detecting disturbance events as early as possible. I introduce a new assessment framework for near real-time products that focuses on the timing and the minimum detectable size of disturbance events. The new framework reveals the relationship between change detection accuracy and the time needed to identify events. In regions that are frequently cloudy, near real-time monitoring using data from a single sensor is difficult. This study extends the work of Xin et al. (2013) and develops a new time series method (Fusion2) based on fusion of Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data. Results for three test sites in the Amazon Basin show that Fusion2 can detect 44.4% of forest disturbance within 13 clear observations (82 days) after the initial disturbance. The smallest event detected by Fusion2 is 6.5 ha. Fusion2 also detects disturbance faster and has less commission error than more conventional methods. In a comparison of coarse-resolution sensors, MODIS Terra and Aqua combined provide faster and more accurate detection of disturbance events than VIIRS (Visible Infrared Imaging Radiometer Suite) or single-sensor MODIS data. The performance of near real-time monitoring using VIIRS is slightly worse than MODIS Terra but significantly better than MODIS Aqua. The new monitoring methods developed in this dissertation provide forest protection organizations with the capacity to monitor illegal logging events promptly. In the future, combining two Landsat and two Sentinel-2 satellites will provide global coverage at 30 m resolution every 4 days, and routine monitoring may be possible at high resolution. The methods and assessment framework developed in this dissertation are adaptable to newly available datasets.
Alves, Natasha; Chau, Tom
2010-04-01
Knowledge of muscle activity timing is critical to many clinical applications, such as the assessment of muscle coordination and the prescription of muscle-activated switches for individuals with disabilities. In this study, we introduce a continuous wavelet transform (CWT) algorithm for the detection of muscle activity via mechanomyogram (MMG) signals. CWT coefficients of the MMG signal were compared to scale-specific thresholds derived from the baseline signal to estimate the timing of muscle activity. Test signals were recorded from the flexor carpi radialis muscles of 15 able-bodied participants as they squeezed and released a hand dynamometer. Using the dynamometer signal as a reference, the proposed CWT detection algorithm was compared against a global-threshold CWT detector as well as amplitude-based event detection for sensitivity and specificity to voluntary contractions. The scale-specific CWT-based algorithm exhibited superior detection performance over the other detectors. CWT detection also showed good muscle selectivity during hand movement, particularly when a given muscle was the primary facilitator of the contraction. This may suggest that, during contraction, the compound MMG signal has a recurring morphological pattern that is not prevalent in the baseline signal. The ability of CWT analysis to be implemented in real time makes it a candidate for muscle-activity detection in clinical applications.
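A minimal sketch of scale-specific CWT thresholding (using PyWavelets; the Morlet wavelet, scale range, and the mean-plus-k-sigma baseline threshold are assumptions, not necessarily the authors' choices):

```python
import numpy as np
import pywt

def detect_activity(mmg, baseline, fs, scales=None, k=3.0):
    """Flag samples where any scale's |CWT coefficient| exceeds its
    scale-specific threshold derived from a baseline (rest) recording."""
    scales = np.arange(1, 32) if scales is None else scales
    coef, _ = pywt.cwt(mmg, scales, "morl", sampling_period=1.0 / fs)
    base, _ = pywt.cwt(baseline, scales, "morl", sampling_period=1.0 / fs)
    thresholds = np.abs(base).mean(axis=1) + k * np.abs(base).std(axis=1)
    return (np.abs(coef) > thresholds[:, None]).any(axis=0)
```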
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop-foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed in previous studies to detect toe-off (TO) and heel-strike (HS) gait events. While these algorithms achieve relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair ascent and stair descent terrains. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop-foot correction devices and leg prostheses.
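A simplified sketch of jerk-based gait event candidates (this uses plain peak picking via SciPy; the paper's time-frequency parameterization and peak heuristics are richer, and the mapping of HS to positive jerk peaks is an assumption):

```python
import numpy as np
from scipy.signal import find_peaks

def gait_event_candidates(accel, fs, min_step_s=0.4):
    """Return candidate heel-strike and toe-off sample indices from the
    jerk (time derivative) of a shank/foot acceleration signal."""
    jerk = np.gradient(accel) * fs   # finite-difference derivative
    distance = int(min_step_s * fs)  # enforce a minimum inter-event spacing
    hs, _ = find_peaks(jerk, distance=distance, prominence=jerk.std())
    to, _ = find_peaks(-jerk, distance=distance, prominence=jerk.std())
    return hs, to
```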
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
Bounds on the minimum number of recombination events in a sample history.
Myers, Simon R; Griffiths, Robert C
2003-01-01
Recombination is an important evolutionary factor in many organisms, including humans, and understanding its effects is an important task facing geneticists. Detecting past recombination events is thus important; this article introduces statistics that give a lower bound on the number of recombination events in the history of a sample, on the basis of the patterns of variation in the sample DNA. Such lower bounds are appropriate, since many recombination events in the history are typically undetectable, so the true number of historical recombinations is unobtainable. The statistics can be calculated quickly by computer and improve upon the earlier bound of Hudson and Kaplan (1985). A method is developed to combine bounds on local regions in the data to produce more powerful improved bounds. The method is flexible to different models of recombination occurrence. The approach gives recombination event bounds between all pairs of sites, to help identify regions with more detectable recombinations, and these bounds can be viewed graphically. Under coalescent simulations, there is a substantial improvement over the earlier method (of up to a factor of 2) in the expected number of recombination events detected by one of the new minima, across a wide range of parameter values. The method is applied to data from a region within the lipoprotein lipase gene, and the amount of detected recombination is substantially increased. Further, there is strong clustering of detected recombination events in an area near the center of the region. A program implementing these statistics, which was used for this article, is available from http://www.stats.ox.ac.uk/mathgen/programs.html. PMID:12586723
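For context, the Hudson and Kaplan (1985) bound that this article improves on can be computed with a four-gamete test and a greedy interval scan; a minimal sketch (haplotypes as rows of a 0/1 matrix):

```python
from itertools import combinations

def hudson_kaplan_rm(haplotypes):
    """Lower bound R_m on recombination events: count disjoint site intervals
    that fail the four-gamete test."""
    n_sites = len(haplotypes[0])
    incompatible = []
    for i, j in combinations(range(n_sites), 2):
        gametes = {(h[i], h[j]) for h in haplotypes}
        if len(gametes) == 4:            # all four gametes: recombination needed
            incompatible.append((i, j))
    incompatible.sort(key=lambda ij: ij[1])
    rm, last_end = 0, -1
    for i, j in incompatible:            # greedy choice of disjoint intervals
        if i >= last_end:
            rm, last_end = rm + 1, j
    return rm

print(hudson_kaplan_rm(["0011", "0100", "1101", "1010"]))  # 2
```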
Enzyme-Mediated Individual Nanoparticle Release Assay
Glass, James R.; Dickerson, Janet C.; Schultz, David A.
2007-01-01
Numerous methods have been developed to measure the presence of macromolecular species in a sample; however, methods that detect functional activity, or modulators of that activity, are more limited. To address this limitation, an approach was developed that utilizes the optical detection of nanoparticles as a measure of enzyme activity. Nanoparticles are increasingly being used as biological labels in static binding assays; here we describe their use in a release assay format in which the enzyme-mediated liberation of individual nanoparticles from a surface is measured. A double-stranded fragment of DNA is used as the initial tether to bind the nanoparticles to a solid surface. The nanoparticle spatial distribution and number are determined using dark-field optical microscopy and digital image capture. Site-specific cleavage of the DNA tether results in nanoparticle release. The methodology and validation of this approach for measuring enzyme-mediated, individual DNA cleavage events rapidly, with high specificity, and in real time are described. This approach was used to detect and discriminate between non-methylated and methylated DNA, and demonstrates a novel platform for high-throughput screening of modulators of enzyme activity. PMID:16620746
Eventogram: A Visual Representation of Main Events in Biomedical Signals.
Elgendi, Mohamed
2016-09-22
Biomedical signals carry valuable physiological information, and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signal. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in signals in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
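In the spirit of the two event-related moving averages, a minimal sketch (the window lengths and offset are assumptions to be tuned per signal type, e.g., QRS detection in ECG):

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def blocks_of_interest(signal, fs, w_event_s=0.1, w_cycle_s=0.6, beta=0.08):
    """Boolean mask of candidate events: where the short (event-scale) moving
    average exceeds the long (cycle-scale) moving average plus an offset."""
    y = np.asarray(signal, dtype=float) ** 2      # squaring emphasizes events
    ma_event = moving_average(y, int(w_event_s * fs))
    ma_cycle = moving_average(y, int(w_cycle_s * fs))
    return ma_event > ma_cycle + beta * y.mean()
```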
Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan
2009-01-01
Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554
An integrated logit model for contamination event detection in water distribution systems.
Housh, Mashor; Ostfeld, Avi
2015-05-15
The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches, including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators, and unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented discrete-choice model, estimated by the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicators' probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in the ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
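The fusion idea reduces to a logistic (logit) link over the single-indicator alarms; a minimal sketch with hypothetical weights (in the paper these are calibrated jointly, with the other system components, by a genetic algorithm):

```python
import numpy as np

def fused_event_probability(indicator_alarms, weights, bias):
    """indicator_alarms: per-indicator alarm scores (0/1 or probabilities)."""
    z = float(np.dot(weights, indicator_alarms) + bias)
    return 1.0 / (1.0 + np.exp(-z))               # logistic link

# e.g., chlorine, turbidity, pH, conductivity alarms at one time step
alarms = np.array([1.0, 0.0, 1.0, 0.0])
weights = np.array([1.8, 0.9, 1.2, 0.7])          # hypothetical coefficients
print(fused_event_probability(alarms, weights, bias=-2.5))
```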
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
Induced Seismicity in Greeley, CO: The Effects of Pore Pressure on Seismic Wave Character
NASA Astrophysics Data System (ADS)
Bogolub, K. R.; Holmes, R.; Sheehan, A. F.; Brown, M. R. M.
2017-12-01
Since 2013, a series of injection-induced earthquakes has occurred near Greeley, Colorado, including a Mw 3.2 event in June 2014. With induced seismicity on the rise, it is important to understand injection-induced earthquakes to improve mitigation efforts. In this research, we analyzed seismograms from a local seismic network to see whether there are any notable differences in seismic waveforms as a result of changes in pore pressure from wastewater injection. Catalogued earthquake events from January-June 2017 that were clearly visible on four or more stations in the network were used as template events in a subspace detector. Since the template events were constructed using seismograms from a single event, the subspace detector operated similarly to a matched filter, and detections had very similar waveforms to the template event. Having these detections ultimately helped us identify similar earthquakes, which gave us better-located events for comparison. These detections were then examined and located using a 1D local velocity model. While many of these detections were already catalogued events, we also identified >20 new events by using this detector. Any two events that were matched by the detector, collocated within the error ellipses of both events, and at least a month apart temporally were classified as "event pairs". One challenge of this method is that most of the collocated earthquakes occurred in a very narrow time window, which indicates that the events tend to cluster both spatially and temporally. However, we were able to examine an event pair that fit our spatial proximity criteria and were several months apart (March 3, 2017 and May 8, 2017). We present an examination of propagation velocity and frequency content for these two events specifically, to assess whether transient changes in pore pressure had any observable influence on these characteristics. Our preliminary results indicate a slight difference in lag time between P-wave and S-wave arrivals (slightly greater lag time for the March event) and in frequency content (slightly higher dominant frequencies for the March event). However, more work needs to be done to refine our earthquake locations so we can determine whether these observations are caused by a transient change in velocity structure, a difference in the locations of the two events, or some other mechanism.
Silas, Reshma; Tibballs, James
2010-12-01
Little is known of the incidence of adverse events in the paediatric intensive care unit (PICU), and the perceived incidence may depend on data-collection methods. The objective was to determine the incidence of adverse events by voluntary reporting and by systematic enquiry. Adverse events in PICU were recorded contemporaneously by systematic enquiry with bedside nurses and attending doctors, and compared with data submitted voluntarily to the hospital's quality and safety unit. Events were classified as insignificant, minor, moderate, major and catastrophic or lethal, and assigned origins as medical/surgical diagnosis or management, medical/surgical procedures, medication or miscellaneous. Among 740 patients, 524 adverse events (mean 0.71 per patient) occurred in 193 patients (26.1%). Systematic enquiry detected 405 events (80%) among 165 patients, which were classified by one investigator as insignificant, 30 (7%); minor, 100 (25%); moderate, 160 (37%); major, 103 (25%); and catastrophic, 12 (3%). The coefficient of agreement (kappa) of severity between the two investigators was 0.82 (95% CI 0.78-0.87). Voluntary reporting detected 166 (32%) adverse events among 100 patients, of which 119 were undetected by systematic reporting. Forty-nine events (9%) were detected by both methods. The number and severity of events reported by the two methods were significantly different (p<0.0001). Voluntary reporting, mainly by nurses, did not capture major, severe or catastrophic events related to medical/surgical diagnosis or management. Neither voluntary reporting nor systematic enquiry captures all adverse events: while the two methods both capture some events, systematic reporting captures serious events, while voluntary reporting captures mainly insignificant and minor events.
Recovery and normalization of triple coincidences in PET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lage, Eduardo, E-mail: elage@mit.edu; Parot, Vicente; Dave, Shivang R.
2015-03-15
Purpose: Triple coincidences in positron emission tomography (PET) are events in which three γ-rays are detected simultaneously. These events, though potentially useful for enhancing the sensitivity of PET scanners, are discarded or processed without special consideration in current systems, because there is no clear criterion for assigning them to a unique line-of-response (LOR). Methods proposed for recovering such events usually rely on the use of highly specialized detection systems, hampering general adoption, and/or are based on Compton-scatter kinematics and, consequently, are limited in accuracy by the energy resolution of standard PET detectors. In this work, the authors propose a simple and general solution for recovering triple coincidences, which does not require specialized detectors or additional energy resolution requirements. Methods: To recover triple coincidences, the authors' method distributes such events among their possible LORs using the relative proportions of double coincidences in these LORs. The authors show analytically that this assignment scheme represents the maximum-likelihood solution for the triple-coincidence distribution problem. The PET component of a preclinical PET/CT scanner was adapted to enable the acquisition and processing of triple coincidences. Since the efficiencies for detecting double and triple events were found to be different throughout the scanner field-of-view, a normalization procedure specific for triple coincidences was also developed. The effect of including triple coincidences using the authors' method was compared against the cases of equally weighting the triples among their possible LORs and discarding all the triple events. As figures of merit for this comparison, the authors used sensitivity, noise-equivalent count (NEC) rates, and image quality calculated as described in the NEMA NU-4 protocol for the assessment of preclinical PET scanners. Results: The addition of triple-coincidence events with the authors' method increased the peak NEC rates of the scanner by 26.6% and 32% for mouse- and rat-sized objects, respectively. This increase in NEC-rate performance was also reflected in the image-quality metrics. Images reconstructed using double and triple coincidences recovered with the authors' method had better signal-to-noise ratio than those obtained using only double coincidences, while preserving spatial resolution and contrast. Distribution of triple coincidences using an equal-weighting scheme increased apparent system sensitivity but degraded image quality. The performance boost provided by the inclusion of triple coincidences using the authors' method allowed the acquisition time of standard imaging procedures to be reduced by up to ∼25%. Conclusions: Recovering triple coincidences with the proposed method can effectively increase the sensitivity of current clinical and preclinical PET systems without compromising other parameters like spatial resolution or contrast.
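The assignment rule itself is compact; a minimal sketch (the LOR identifiers and counts are hypothetical):

```python
def distribute_triple(candidate_lors, double_counts):
    """Split one triple coincidence among its candidate LORs in proportion to
    the double-coincidence counts already accumulated in those LORs."""
    weights = [double_counts.get(lor, 0) for lor in candidate_lors]
    total = sum(weights)
    if total == 0:    # no doubles information: fall back to equal weighting
        return {lor: 1.0 / len(candidate_lors) for lor in candidate_lors}
    return {lor: w / total for lor, w in zip(candidate_lors, weights)}

print(distribute_triple(["lor_a", "lor_b"], {"lor_a": 30, "lor_b": 10}))
# {'lor_a': 0.75, 'lor_b': 0.25}
```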
Cottenet, Geoffrey; Blancpain, Carine; Sonnard, Véronique; Chuah, Poh Fong
2013-08-01
Considering the increase in the total cultivated land area dedicated to genetically modified organisms (GMO), consumers' perception of GMOs and the need to comply with various local GMO legislations, efficient and accurate analytical methods are needed for their detection and identification. Considered the gold standard for GMO analysis, real-time polymerase chain reaction (RTi-PCR) technology was optimised to produce a high-throughput GMO screening method. Based on 24 multiplex RTi-PCR assays run simultaneously on a ready-to-use 384-well plate, this new procedure allows the detection and identification of 47 targets in seven samples in duplicate. To comply with GMO analytical quality requirements, a negative and a positive control were analysed in parallel. In addition, an internal positive control was included in each reaction well for the detection of potential PCR inhibition. Tested on non-GM materials, on different GM events and on proficiency test samples, the method offered high specificity and sensitivity, with an absolute limit of detection between 1 and 16 copies depending on the target. Easy to use, fast and cost efficient, this multiplex approach fits the purpose of GMO testing laboratories.
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically-important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the capability to follow-up on transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
Turkec, Aydin; Lucas, Stuart J; Karacanli, Burçin; Baykut, Aykut; Yuksel, Hakki
2016-03-01
Detection of GMO material in crop and food samples is the primary step in GMO monitoring and regulation, and the increasing number of GM events on the world market requires detection solutions with high multiplexing capacity. In this study, we test the suitability of a high-density oligonucleotide microarray platform for direct, quantitative detection of GMOs found in the Turkish feed market. We tested 1830 different 60-nt probes designed to cover the GM cassettes from 12 different GM cultivars (3 soya, 9 maize), as well as plant species-specific and contamination controls, and developed a data analysis method aiming to provide maximum throughput and sensitivity. The system was able to identify each cultivar specifically, and in 10/12 cases was sensitive enough to detect GMO DNA at concentrations of ≤1%. These GMOs could also be quantified using the microarray, as their fluorescence signals increased linearly with GMO concentration. Copyright © 2015 Elsevier Ltd. All rights reserved.
Sun, Kai; Chang, Yong; Zhou, Binbin; Wang, Xiaojin; Liu, Lin
2017-01-01
This article presents a general method for the detection of protein kinases with a peptide-like kinase inhibitor as the bioreceptor, achieved by converting a gold nanoparticle (AuNP)-based colorimetric assay into a sensitive electrochemical analysis. In the colorimetric assay, the kinase-specific aptameric peptide triggered the aggregation of AuNPs in solution; however, the specific binding of the peptide to the target protein (kinase) inhibited its ability to trigger the assembly of AuNPs. In the electrochemical analysis, peptide immobilized on a gold electrode and peptide present in solution together triggered the in situ formation of an AuNP-based network architecture on the electrode surface. Formation of the peptide–kinase complex on the electrode surface, however, hindered the peptide-triggered AuNP assembly. Electrochemical impedance spectroscopy was used to measure the changes in surface properties resulting from the binding events. When a ferrocene-labeled peptide (Fc-peptide) was used in this design, the network of AuNPs/Fc-peptide produced a good voltammetric signal. The competitive assay allowed for the detection of protein kinase A with a detection limit of 20 mU/mL. This work should be valuable for designing novel optical or electronic biosensors and will likely lead to many detection applications. PMID:28331314
A large 2D PSD for thermal neutron detection
NASA Astrophysics Data System (ADS)
Knott, R. B.; Smith, G. C.; Watt, G.; Boldeman, J. W.
1997-02-01
A 2D PSD based on a MWPC has been constructed for a small angle neutron scattering instrument. The active area of the detector was 640 × 640 mm². To meet the specifications for neutron detection efficiency and spatial resolution, and to minimise parallax, the gas mixture was 190 kPa ³He plus 100 kPa CF₄, and the active volume had a thickness of 30 mm. The design maximum neutron count rate of the detector was 10⁵ events per second. The (calculated) neutron detection efficiency was 60% for 2 Å neutrons, and the (measured) neutron energy resolution on the anode grid was typically 20% (fwhm). The location of a neutron detection event within the active area was determined using the wire-by-wire method: the spatial resolution (5 × 5 mm²) was thereby defined by the wire geometry. A 16-channel charge-sensitive preamplifier/amplifier/comparator module has been developed with a channel sensitivity of 0.1 V/fC, noise line width of 0.4 fC (fwhm) and channel-to-channel cross-talk of less than 5%. The Proportional Counter Operating System (PCOS III) (LeCroy Corp, USA) was used for event encoding. The ECL signals produced by the 16-channel modules were latched in PCOS III by a trigger pulse from the anode, and the fast encoders produced a position and width for each event. The information was transferred to a UNIX workstation for accumulation and online display.
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
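A minimal sketch of the similarity-graph view described above, assuming hypothetical detector output (candidate-event pairs with similarity scores): candidate events become vertices, similarities become weighted edges, groups of similar waveforms fall out as connected components, and each event gets a simple confidence score. This illustrates the approach in general, not the authors' implementation.

    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components

    # Hypothetical detector output: candidate-event pairs (i, j) and similarities.
    pairs = np.array([[0, 1], [1, 2], [3, 4]])
    sims = np.array([0.82, 0.75, 0.91])
    n = 5  # number of candidate events (graph vertices)

    keep = sims >= 0.7  # prune weak edges before grouping
    graph = coo_matrix((sims[keep], (pairs[keep, 0], pairs[keep, 1])), shape=(n, n))

    # Groups of mutually similar waveforms = connected components of the graph.
    n_groups, labels = connected_components(graph, directed=False)

    # A simple per-event confidence score: strongest similarity to any neighbour.
    sym = graph.maximum(graph.T).toarray()
    confidence = sym.max(axis=1)
    print(n_groups, labels, confidence)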
Causality Patterns for Detecting Adverse Drug Reactions From Social Media: Text Mining Approach.
Bollegala, Danushka; Maskell, Simon; Sloane, Richard; Hajne, Joanna; Pirmohamed, Munir
2018-05-09
Detecting adverse drug reactions (ADRs) is an important task that has direct implications for the use of that drug. If we can detect previously unknown ADRs as quickly as possible, then this information can be provided to the regulators, pharmaceutical companies, and health care organizations, thereby potentially reducing drug-related morbidity and saving lives of many patients. A promising approach for detecting ADRs is to use social media platforms such as Twitter and Facebook. A high level of correlation between a drug name and an event may be an indication of a potential adverse reaction associated with that drug. Although numerous association measures have been proposed by the signal detection community for identifying ADRs, these measures are limited in that they detect correlations but often ignore causality. This study aimed to propose a causality measure that can detect an adverse reaction that is caused by a drug rather than merely being a correlated signal. To the best of our knowledge, this was the first causality-sensitive approach for detecting ADRs from social media. Specifically, the relationship between a drug and an event was represented using a set of automatically extracted lexical patterns. We then learned the weights for the extracted lexical patterns that indicate their reliability for expressing an adverse reaction of a given drug. Our proposed method obtains an ADR detection accuracy of 74% on a large-scale manually annotated dataset of tweets, covering a standard set of drugs and adverse reactions. By using lexical patterns, we can accurately detect the causality between drugs and adverse reaction-related events. ©Danushka Bollegala, Simon Maskell, Richard Sloane, Joanna Hajne, Munir Pirmohamed. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 09.05.2018.
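The pattern-weighting step can be illustrated with a small stand-in: the sketch below learns per-pattern weights with logistic regression over hypothetical drug–event context snippets, using word n-grams as a crude proxy for the automatically extracted lexical patterns. The data, feature choice and classifier are illustrative assumptions, not the paper's exact method.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical (drug, event) context snippets; 1 = true ADR, 0 = not.
    contexts = [
        "felt dizzy after taking DRUG", "DRUG caused severe headache",
        "took DRUG at a party", "no side effects since starting DRUG",
    ]
    labels = [1, 1, 0, 0]

    # Lexical patterns approximated here as word n-grams around the drug mention.
    vec = CountVectorizer(ngram_range=(1, 3))
    X = vec.fit_transform(contexts)

    # Learned coefficients act as reliability weights for each pattern.
    clf = LogisticRegression().fit(X, labels)
    weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
    print(sorted(weights, key=weights.get, reverse=True)[:5])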
Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time
NASA Astrophysics Data System (ADS)
Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.
2017-12-01
Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-art data processing algorithms cannot meet the automation and quality standard of a near-real-time (NRT) system, quality controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDF) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of 1) self-optimized multi-threshold classification, 2) over-detection removal using land-cover information and change detection, 3) under-detection compensation, and 4) machine-learning based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the results from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (see Figure 1). Specifically, the over- and under-detections that are typically noted in automated methods are reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, Sentinel, etc. RAPID data can support many applications such as rapid assessment of damage losses and disaster alleviation/rescue at global scale.
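The self-optimized multi-threshold step can be approximated with Otsu-style histogram splitting. The sketch below, on an illustrative synthetic backscatter tile, takes the darkest class as a first-pass water mask; RAPID's actual threshold optimization, land-cover corrections and machine-learning stage are not reproduced here.

    import numpy as np
    from skimage.filters import threshold_multiotsu

    # Hypothetical SAR backscatter tile (dB); open water is typically dark.
    rng = np.random.default_rng(0)
    tile = np.concatenate([rng.normal(-20, 1.5, 4000),   # water-like returns
                           rng.normal(-10, 2.0, 6000)])  # land-like returns
    tile = tile.reshape(100, 100)

    # Multi-Otsu splits the histogram into k classes; the darkest class
    # serves as a first-pass water/flood mask.
    thresholds = threshold_multiotsu(tile, classes=3)
    water_mask = tile < thresholds[0]
    print(thresholds, water_mask.mean())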
Reliability of a Qualitative Video Analysis for Running.
Pipkin, Andrew; Kotecki, Kristy; Hetzel, Scott; Heiderscheit, Bryan
2016-07-01
Study Design Reliability study. Background Video analysis of running gait is frequently performed in orthopaedic and sports medicine practices to assess biomechanical factors that may contribute to injury. However, the reliability of a whole-body assessment has not been determined. Objective To determine the intrarater and interrater reliability of the qualitative assessment of specific running kinematics from a 2-dimensional video. Methods Running-gait analysis was performed on videos recorded from 15 individuals (8 male, 7 female) running at a self-selected pace (3.17 ± 0.40 m/s, 8:28 ± 1:04 min/mi) using a high-speed camera (120 frames per second). These videos were independently rated on 2 occasions by 3 experienced physical therapists using a standardized qualitative assessment. Fifteen sagittal and frontal plane kinematic variables were rated on a 3- or 5-point categorical scale at specific events of the gait cycle, including initial contact (n = 3) and midstance (n = 9), or across the full gait cycle (n = 3). The video frame number corresponding to each gait event was also recorded. Intrarater and interrater reliability values were calculated for gait-event detection (intraclass correlation coefficient [ICC] and standard error of measurement [SEM]) and the individual kinematic variables (weighted kappa [κw]). Results Gait-event detection was highly reproducible within raters (ICC = 0.94-1.00; SEM, 0.3-1.0 frames) and between raters (ICC = 0.77-1.00; SEM, 0.4-1.9 frames). Eleven of the 15 kinematic variables demonstrated substantial (κw = 0.60-0.799) or excellent (κw>0.80) intrarater agreement, with the exception of foot-to-center-of-mass position (κw = 0.59), forefoot position (κw = 0.58), ankle dorsiflexion at midstance (κw = 0.49), and center-of-mass vertical excursion (κw = 0.36). Interrater agreement for the kinematic measures varied more widely (κw = 0.00-0.85), with 5 variables showing substantial or excellent reliability. Conclusion The qualitative assessment of specific kinematic measures during running can be reliably performed with the use of a high-speed video camera. Detection of specific gait events was highly reproducible, as were common kinematic variables such as rearfoot position, foot-strike pattern, tibial inclination angle, knee flexion angle, and forward trunk lean. Other variables should be used with caution. J Orthop Sports Phys Ther 2016;46(7):556-561. Epub 6 Jun 2016. doi:10.2519/jospt.2016.6280.
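For readers reproducing the agreement statistics, a linearly weighted kappa for ordinal categorical ratings can be computed as below; the ratings are illustrative, since the study's own data are not given in the abstract.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical ordinal ratings (e.g., a 3-point kinematic scale) from two raters.
    rater1 = [0, 1, 1, 2, 2, 0, 1, 2, 1, 0]
    rater2 = [0, 1, 2, 2, 1, 0, 1, 2, 1, 1]

    # Linearly weighted kappa penalizes disagreements by their ordinal distance.
    kw = cohen_kappa_score(rater1, rater2, weights="linear")
    print(f"weighted kappa = {kw:.2f}")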
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model based on two multivariate-analysis methods applied to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via the quantification method IV, and group similar audit event sequences based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Guo, Jinchao; Yang, Litao; Liu, Xin; Guan, Xiaoyan; Jiang, Lingxi; Zhang, Dabing
2009-08-26
Genetically modified (GM) papaya (Carica papaya L.), Huanong No. 1, was approved for commercialization in Guangdong province, China in 2006, and the development of a Huanong No. 1 papaya detection method is necessary for implementing genetically modified organism (GMO) labeling regulations. In this study, we report the characterization of the exogenous integration of GM Huanong No. 1 papaya by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The results suggested that one intact copy of the initial construct was integrated into the papaya genome, probably resulting in a 38 bp deletion of host genomic DNA. Also, one unintended insertion of a 92 bp truncated NptII fragment was observed at the 5' end of the exogenous insert. Furthermore, we revealed the 5' and 3' flanking sequences between the insert DNA and the papaya genomic DNA, and developed event-specific qualitative and quantitative PCR assays for GM Huanong No. 1 papaya based on the 5' integration flanking sequence. The relative limit of detection (LOD) of the qualitative PCR assay was about 0.01% in 100 ng of total papaya genomic DNA, corresponding to about 25 copies of the papaya haploid genome. In the quantitative PCR, the limits of detection and quantification (LOD and LOQ) were as low as 12.5 and 25 copies of the papaya haploid genome, respectively. In practical sample quantification, the quantified biases between the test and true values of three samples ranged from 0.44% to 4.41%. Collectively, we propose that all of these results are useful for the identification and quantification of Huanong No. 1 papaya and its derivatives.
Berlin, Conny; Blanch, Carles; Lewis, David J; Maladorno, Dionigi D; Michel, Christiane; Petrin, Michael; Sarp, Severine; Close, Philippe
2012-06-01
The detection of safety signals with medicines is an essential activity to protect public health. Despite widespread acceptance, it is unclear whether recently applied statistical algorithms provide enhanced performance characteristics when compared with traditional systems. Novartis has adopted a novel system for automated signal detection on the basis of disproportionality methods within a safety data mining application (Empirica™ Signal System [ESS]). ESS uses two algorithms for routine analyses: empirical Bayes Multi-item Gamma Poisson Shrinker and logistic regression (LR). A model was developed comprising 14 medicines, categorized as "new" or "established." A standard was prepared on the basis of safety findings selected from traditional sources. ESS results were compared with the standard to calculate the positive predictive value (PPV), specificity, and sensitivity. PPVs of the lower one-sided 5% and 0.05% confidence limits of the Bayes geometric mean (EB05) and of the LR odds ratio (LR0005) almost coincided for all the drug-event combinations studied. There was no obvious difference comparing the PPV of the leading Medical Dictionary for Regulatory Activities (MedDRA) terms to the PPV for all terms. The PPV of narrow MedDRA query searches was higher than that for broad searches. The widely used threshold value of EB05 = 2.0 or LR0005 = 2.0 together with more than three spontaneous reports of the drug-event combination produced balanced results for PPV, sensitivity, and specificity. Consequently, performance characteristics were best for leading terms with narrow MedDRA query searches irrespective of applying Multi-item Gamma Poisson Shrinker or LR at a threshold value of 2.0. This research formed the basis for the configuration of ESS for signal detection at Novartis. Copyright © 2011 John Wiley & Sons, Ltd.
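The performance characteristics reported above reduce to simple functions of a 2x2 comparison against the reference standard; a small sketch with purely illustrative counts:

    def signal_performance(tp, fp, tn, fn):
        """Performance characteristics from a 2x2 comparison with a standard."""
        ppv = tp / (tp + fp)          # positive predictive value
        sensitivity = tp / (tp + fn)  # fraction of reference signals found
        specificity = tn / (tn + fp)  # fraction of non-signals correctly rejected
        return ppv, sensitivity, specificity

    # Hypothetical counts for drug-event combinations flagged at a threshold
    # of EB05 = 2.0 with more than three reports (numbers are illustrative).
    print(signal_performance(tp=40, fp=15, tn=120, fn=25))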
Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J; Templeton, D C; Harris, D B
The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, and our MFP technique identified 1094 events. Therefore, we believe that the empirical MFP method combined with conventional methods significantly improves the network detection ability in an efficient manner.
Oligonucleotide gap-fill ligation for mutation detection and sequencing in situ
Mignardi, Marco; Mezger, Anja; Qian, Xiaoyan; La Fleur, Linnea; Botling, Johan; Larsson, Chatarina; Nilsson, Mats
2015-01-01
In clinical diagnostics a great need exists for targeted in situ multiplex nucleic acid analysis as the mutational status can offer guidance for effective treatment. One well-established method uses padlock probes for mutation detection and multiplex expression analysis directly in cells and tissues. Here, we use oligonucleotide gap-fill ligation to further increase specificity and to capture molecular substrates for in situ sequencing. Short oligonucleotides are joined at both ends of a padlock gap probe by two ligation events and are then locally amplified by target-primed rolling circle amplification (RCA) preserving spatial information. We demonstrate the specific detection of the A3243G mutation of mitochondrial DNA and we successfully characterize a single nucleotide variant in the ACTB mRNA in cells by in situ sequencing of RCA products generated by padlock gap-fill ligation. To demonstrate the clinical applicability of our assay, we show specific detection of a point mutation in the EGFR gene in fresh frozen and formalin-fixed, paraffin-embedded (FFPE) lung cancer samples and confirm the detected mutation by in situ sequencing. This approach presents several advantages over conventional padlock probes allowing simpler assay design for multiplexed mutation detection to screen for the presence of mutations in clinically relevant mutational hotspots directly in situ. PMID:26240388
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term sustained time series record. Considering that the time-varying Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point detection (CPD) on the seismic record. In this method, an anomaly signal (seismic event) is modeled as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of present statistics on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to prove the feasibility and advantage of our method.
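A minimal sketch of the SDAR idea, restricted to an AR(1) model for brevity: the model parameters are updated online with a discounting factor r, and each new sample is scored by its negative log-likelihood under the current model, so a change point produces a burst of high scores. This simplified single-order version is an illustration, not the full algorithm used in the study.

    import numpy as np

    def sdar_scores(x, r=0.02):
        """Minimal sequentially discounting AR(1) sketch: per-sample anomaly
        scores (negative log-likelihood under the discounted model)."""
        mu, c0, c1, a, sigma2 = 0.0, 1.0, 0.0, 0.0, 1.0
        scores = np.zeros(len(x))
        prev = x[0]
        for t in range(1, len(x)):
            # Predict from the current AR(1) parameters, then score the residual.
            pred = mu + a * (prev - mu)
            err = x[t] - pred
            scores[t] = 0.5 * (np.log(2 * np.pi * sigma2) + err**2 / sigma2)
            # Discounted (forgetting-factor) updates of the sufficient statistics.
            mu = (1 - r) * mu + r * x[t]
            c0 = (1 - r) * c0 + r * (x[t] - mu) ** 2
            c1 = (1 - r) * c1 + r * (x[t] - mu) * (prev - mu)
            a = c1 / c0 if c0 > 1e-12 else 0.0
            sigma2 = max(c0 - a * c1, 1e-12)
            prev = x[t]
        return scores

    # A step change in the signal statistics produces a burst of high scores.
    sig = np.concatenate([np.random.default_rng(1).normal(0, 1, 500),
                          np.random.default_rng(2).normal(4, 1, 100)])
    print(sdar_scores(sig)[495:505].round(2))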
Mathes, Melvin V.; O'Brien, Tara L.; Strickler, Kriston M.; Hardy, Joshua J.; Schill, William B.; Lukasik, Jerzy; Scott, Troy M.; Bailey, David E.; Fenger, Terry L.
2007-01-01
Several methods were used to determine the sources of fecal contamination in water samples collected during September and October 2004 from four tributaries to the New River Gorge National River -- Arbuckle Creek, Dunloup Creek, Keeney Creek, and Wolf Creek. All four tributaries historically have had elevated levels of fecal coliform bacteria. The source-tracking methods used yielded various results, possibly because one or more methods failed. Sourcing methods used in this study included the detection of several human-specific and animal-specific biological or molecular markers, and library-dependent pulsed-field gel electrophoresis analysis that attempted to associate Escherichia coli bacteria obtained from water samples with animal sources by matching DNA-fragment banding patterns. Evaluation of the results of quality-control analysis indicated that pulsed-field gel electrophoresis analysis was unable to identify known-source bacteria isolates. Increasing the size of the known-source library did not improve the results for quality-control samples. A number of emerging methods, using markers in Enterococcus, human urine, Bacteroidetes, and host mitochondrial DNA, demonstrated some potential in associating fecal contamination with human or animal sources in a limited analysis of quality-control samples. All four of the human-specific markers were detected in water samples from Keeney Creek, a watershed with no centralized municipal wastewater-treatment facilities, thus indicating human sources of fecal contamination. The human-specific Bacteroidetes and host mitochondrial DNA markers were detected in water samples from Dunloup Creek, Wolf Creek, and to a lesser degree Arbuckle Creek. Results of analysis for wastewater compounds indicate that the September 27 sample from Arbuckle Creek contained numerous human tracer compounds likely from sewage. Dog, horse, chicken, and pig host mitochondrial DNA were detected in some of the water samples with the exception of the October 5 sample from Dunloup Creek. Cow, white-tailed deer, and Canada goose DNA were not detected in any of the samples collected from the four tributaries, despite the presence of these animals in the watersheds. Future studies with more rigorous quality-control analyses are needed to investigate the potential applicability and use of these emerging methods. Because many of the detections for the various methods could vary over time and with flow conditions, repeated sampling during both base flow and storm events would be necessary to more definitively determine the sources of fecal contamination for each watershed.
Multi-Detection Events, Probability Density Functions, and Reduced Location Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Schrom, Brian T.
2016-03-01
Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
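The Bayesian combination of detections and non-detections can be illustrated on a toy 1D location grid, where each station contributes a likelihood factor p (detection) or 1 - p (non-detection). All numbers below are hypothetical stand-ins for the ATM-derived reach probabilities.

    import numpy as np

    # Hypothetical 1D grid of candidate source locations and, per station, the
    # modeled probability that material from each cell is detected there.
    grid = np.linspace(0.0, 100.0, 201)
    p_reach_station = [np.exp(-((grid - 30) / 15) ** 2),   # station A
                       np.exp(-((grid - 55) / 20) ** 2)]   # station B
    detected = [True, False]  # station A saw xenon, station B did not

    # Bayesian update: detections multiply by p, non-detections by (1 - p).
    posterior = np.ones_like(grid)
    for p, hit in zip(p_reach_station, detected):
        posterior *= p if hit else (1.0 - p)
    posterior /= posterior.sum()
    print(grid[posterior.argmax()])  # most probable source location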
Joint Attributes and Event Analysis for Multimedia Event Detection.
Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G
2017-06-15
Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm based on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
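The plane-fitting approach favored above can be sketched in a few lines: fit the local event cloud with a plane t = ax + by + c, whose gradient determines the edge velocity via v = ∇t/|∇t|². The events below are synthetic and the fit is a bare least-squares illustration, not the paper's biologically inspired detector.

    import numpy as np

    def plane_fit_velocity(events):
        """Fit t = a*x + b*y + c to a local cloud of events (x, y, t) by least
        squares; the plane gradient gives the inverse of the edge velocity."""
        x, y, t = events[:, 0], events[:, 1], events[:, 2]
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
        g2 = a * a + b * b            # squared gradient of the time surface
        return np.array([a, b]) / g2  # velocity in pixels per unit time

    # Synthetic edge moving at 2 px/unit-time along x: t = x / 2 (+ noise).
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 10, size=(200, 2))
    t = xy[:, 0] / 2.0 + rng.normal(0, 0.01, 200)
    print(plane_fit_velocity(np.column_stack([xy, t])))  # approx. [2, 0]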
Chuang, Trees-Juen; Tseng, Yu-Hsiang; Chen, Chia-Ying; Wang, Yi-Da
2017-08-01
Genomic imprinting is an important epigenetic process that silences one of the parentally-inherited alleles of a gene and thereby exhibits allele-specific expression (ASE). Detection of human imprinting events is hampered by the infeasibility of the reciprocal mating system in humans and the removal of ASE events arising from non-imprinting factors. Here, we describe a pipeline with the pattern of reciprocal allele descendants (RADs) through genotyping and transcriptome sequencing data across independent parent-offspring trios to discriminate between varied types of ASE (e.g., imprinting, genetic variation-dependent ASE, and random monoallelic expression (RME)). We show that the vast majority of ASE events are due to sequence-dependent genetic variants, which are evolutionarily conserved and may themselves play a cis-regulatory role. Particularly, 74% of non-RAD ASE events, even though they exhibit ASE biases toward the same parentally-inherited allele across different individuals, are derived from genetic variation rather than imprinting. We further show that the RME effect may affect the effectiveness of the population-based method for detecting imprinting events and that our pipeline can help to distinguish between these two ASE types. Taken together, this study provides a good indicator for the categorization of different types of ASE, opening up this widespread and complex mechanism for comprehensive characterization.
Unconstrained sleep apnea monitoring using polyvinylidene fluoride film-based sensor.
Hwang, Su Hwan; Lee, Hong Ji; Yoon, Hee Nam; Jung, Da Woon; Lee, Yu-Jin G; Lee, Yu Jin; Jeong, Do-Un; Park, Kwang Suk
2014-07-01
We established and tested an unconstrained sleep apnea monitoring method using a polyvinylidene fluoride (PVDF) film-based sensor for continuous and accurate monitoring of apneic events occurring during sleep. Twenty-six sleep apnea patients and six normal subjects participated in this study. Subjects' respiratory signals were measured using the PVDF-based sensor during polysomnography. The PVDF sensor comprised a 4 × 1 array, and a thin silicon pad was placed over the sensor to prevent damage. Total thickness of the merged system was approximately 1.1 mm, thin enough to prevent the subject from being consciously aware of its presence. It was designed to be placed under subjects' backs and installed between a bed cover and mattress. The proposed method was based on the standard deviation of the PVDF signals, and it was applied to a test set for detecting apneic events. The method's performance was assessed by comparing the results with a sleep physician's manual scoring. The correlation coefficient for the apnea-hypopnea index (AHI) values between the methods was 0.94 (p < 0.001). The areas under the receiver operating curves at three AHI threshold levels (>5, >15, and >20) for sleep apnea diagnosis were 0.98, 0.99, and 0.98, respectively. For min-by-min apnea detection, the method classified sleep apnea with an average sensitivity of 72.9%, specificity of 90.6%, accuracy of 85.5%, and kappa statistic of 0.60. The developed system and method can be applied to sleep apnea detection in home or ambulatory monitoring.
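A minimal sketch of the standard-deviation criterion: flag segments where the moving standard deviation of the respiratory signal drops well below its median level for a sustained period. The window lengths, thresholds and sampling rate below are illustrative assumptions, not the study's tuned values.

    import numpy as np

    def detect_apneic_events(resp, fs, win_s=10.0, ratio=0.3, min_dur_s=10.0):
        """Flag segments where the moving standard deviation of the respiratory
        signal stays below a fraction of its median value (reduced effort)."""
        win = int(win_s * fs)
        sd = np.array([resp[max(0, i - win):i + 1].std() for i in range(len(resp))])
        low = sd < ratio * np.median(sd)
        events, start = [], None
        for i, flag in enumerate(np.append(low, False)):  # sentinel closes last run
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if (i - start) / fs >= min_dur_s:
                    events.append((start / fs, i / fs))
                start = None
        return events

    fs = 10  # Hz, hypothetical respiration sampling rate
    t = np.arange(0, 120, 1 / fs)
    resp = np.sin(2 * np.pi * 0.25 * t)
    resp[400:700] *= 0.05  # simulated 30 s apneic episode (40-70 s)
    print(detect_apneic_events(resp, fs))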
Multi-anode wire two dimensional proportional counter for detecting Iron-55 X-Ray Radiation
NASA Astrophysics Data System (ADS)
Weston, Michael William James
Radiation detectors in many applications use small sensor areas or large tubes which only collect one-dimensional information. Some applications require analyzing a large area and locating specific elements, such as contamination on the heat tiles of a space shuttle or features on historical artifacts. The process can be time consuming, and scanning a large area in a single pass is beneficial. The use of a two-dimensional multi-wire proportional counter provides a large detection window presenting positional information in a single pass. This thesis describes the design and implementation of an experimental detector to evaluate a specific design intended for use as a handheld instrument. The main effort of this research was to custom build a detector for testing purposes. The aluminum chamber and all circuit boards were custom designed and built specifically for this application. Various software and programmable logic algorithms were designed to analyze the raw data in real time and to determine which data were useful and which could be discarded. The research presented here provides results useful for designing an improved second-generation detector in the future. With the anode wire spacing chosen and the minimal collimation of the radiation source, detected events occurred across the entire detection grid at all times. The raw event data did not make determining the source position easy, and further data correlation was required. Many samples had multiple wire hits, which were not useful because they falsely indicated that the source was spread across the grid at varying energy levels. By narrowing the results down to only the largest signal pairs on different axes in each event, a much more accurate determination of where the source was located above the grid was obtained. The basic principle and construction method were shown to work; however, the gas selection, geometry and anode wire construction proved to be poor. Providing a system optimized for a specific application would require detailed Monte Carlo simulations. These simulation results, together with the details and techniques implemented in this thesis, would provide a final instrument of much higher accuracy.
Alsina-Pagès, Rosa Ma; Navarro, Joan; Alías, Francesc; Hervás, Marcos
2017-04-13
The consistent growth in human life expectancy during recent years has driven governments and private organizations to increase their efforts in caring for the eldest segment of the population. These institutions have built hospitals and retirement homes that have been rapidly overfilled, making their associated maintenance and operating costs prohibitive. The latest advances in technology and communications envisage new ways to monitor those people with special needs at their own home, increasing their quality of life in a cost-affordable way. The purpose of this paper is to present an Ambient Assisted Living (AAL) platform able to analyze, identify, and detect specific acoustic events happening in daily life environments, which enables the medical staff to remotely track the status of every patient in real time. Additionally, this tele-care proposal is validated through a proof-of-concept experiment that takes advantage of the capabilities of the NVIDIA Graphical Processing Unit running on a Jetson TK1 board to locally detect acoustic events. Conducted experiments demonstrate the feasibility of this approach by reaching an overall accuracy of 82% when identifying a set of 14 indoor environment events related to the domestic surveillance and patients' behaviour monitoring field. Obtained results encourage practitioners to keep working in this direction, and enable health care providers to remotely track the status of their patients in real time with non-invasive methods.
Exploring the evolution of node neighborhoods in Dynamic Networks
NASA Astrophysics Data System (ADS)
Orman, Günce Keziban; Labatut, Vincent; Naskali, Ahmet Teoman
2017-09-01
Dynamic Networks are a popular way of modeling and studying the behavior of evolving systems. However, their analysis constitutes a relatively recent subfield of Network Science, and the number of available tools is consequently much smaller than for static networks. In this work, we propose a method specifically designed to take advantage of the longitudinal nature of dynamic networks. It characterizes each individual node by studying the evolution of its direct neighborhood, based on the assumption that the way this neighborhood changes reflects the role and position of the node in the whole network. For this purpose, we define the concept of neighborhood event, which corresponds to the various transformations such groups of nodes can undergo, and describe an algorithm for detecting such events. We demonstrate the interest of our method on three real-world networks: DBLP, LastFM and Enron. We apply frequent pattern mining to extract meaningful information from temporal sequences of neighborhood events. This results in the identification of behavioral trends emerging in the whole network, as well as the individual characterization of specific nodes. We also perform a cluster analysis, which reveals that, in all three networks, one can distinguish two types of nodes exhibiting different behaviors: a very small group of active nodes, whose neighborhoods undergo diverse and frequent events, and a very large group of stable nodes.
Obstructive Sleep Apnea Screening Using a Piezo-Electric Sensor
2017-01-01
In this study, we propose a novel method for obstructive sleep apnea (OSA) detection using a piezo-electric sensor. OSA is a relatively common sleep disorder. However, more than 80% of OSA patients remain undiagnosed. We investigated the feasibility of OSA assessment using a single-channel physiological signal to simplify the OSA screening. We detected both snoring and heartbeat information by using a piezo-electric sensor, and snoring index (SI) and features based on pulse rate variability (PRV) analysis were extracted from the filtered piezo-electric sensor signal. A support vector machine (SVM) was used as a classifier to detect OSA events. The performance of the proposed method was evaluated on 45 patients from mild, moderate, and severe OSA groups. The method achieved a mean sensitivity, specificity, and accuracy of 72.5%, 74.2%, and 71.5%; 85.8%, 80.5%, and 80.0%; and 70.3%, 77.1%, and 71.9% for the mild, moderate, and severe groups, respectively. Finally, these results not only show the feasibility of OSA detection using a piezo-electric sensor, but also illustrate its usefulness for monitoring sleep and diagnosing OSA. PMID:28480645
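A minimal stand-in for the classification stage: an RBF-kernel SVM over hypothetical per-epoch features (snoring index plus PRV statistics). The feature values are synthetic and for illustration only; the study's actual feature set and SVM parameters are not reproduced here.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical per-epoch features: [snoring index, mean RR, SDNN, LF/HF]
    rng = np.random.default_rng(0)
    X_normal = rng.normal([5, 0.9, 0.05, 1.5], 0.2, size=(100, 4))
    X_apnea = rng.normal([20, 0.8, 0.09, 2.5], 0.2, size=(100, 4))
    X = np.vstack([X_normal, X_apnea])
    y = np.array([0] * 100 + [1] * 100)  # 1 = epoch contains an OSA event

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)
    print(clf.score(X, y))  # training accuracy on the synthetic epochs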
Adaptable, high recall, event extraction system with minimal configuration.
Miwa, Makoto; Ananiadou, Sophia
2015-01-01
Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task specific configuration and tuning, EventMine achieved the 1st place in the PC task, and 2nd in the CG, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration.
Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation
NASA Astrophysics Data System (ADS)
Drasco, Steve; Flanagan, Éanna É.
2002-12-01
We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.
Conjugate LEP Events at Palmer Station, Antarctica: Hemisphere-Dependent Timing
NASA Astrophysics Data System (ADS)
Kim, D.; Moore, R. C.
2016-12-01
During March 2015, a large number of lightning-induced electron precipitation (LEP) events were simultaneously observed using very low frequency receivers in both the northern and southern hemispheres. After removing overlapping events and unclear (or not well-defined) events, 22 conjugate LEP events remain and are used to statistically analyze the hemispheric dependence of LEP onset time. LEP events were detected in the northern hemisphere using the VLF remote sensing method by tracking the NAA transmitter signal (24.0 kHz, Cutler, Maine) at Tuscaloosa, Alabama. In the southern hemisphere, the NPM transmitter signal (21.4 kHz, Lualualei, Hawaii) is tracked at Palmer Station, Antarctica. In each case, the GLD360 dataset from Vaisala is used to determine the hemisphere of the causative lightning flash, and this is compared with the hemisphere in which the LEP event is detected first. The onset times and onset durations can, however, be calculated using a number of different methods. In this paper, we compare and contrast the onset times and durations calculated using multiple different methods, with each method applied to the same 22 conjugate LEP events.
Sugano, Hiroto; Hara, Shinsuke; Tsujioka, Tetsuo; Inoue, Tadayuki; Nakajima, Shigeyoshi; Kozaki, Takaaki; Namkamura, Hajime; Takeuchi, Kazuhide
2011-01-01
For ubiquitous health care systems which continuously monitor a person's vital signs, such as electrocardiogram (ECG), body surface temperature and three-dimensional (3D) acceleration, by wireless, it is important to accurately detect the occurrence of an abnormal event in the data and immediately inform a medical doctor of its details. In this paper, we introduce a remote health care system composed of a wireless vital sensor, multiple receivers and a triage engine installed on a desktop personal computer (PC). The middleware installed in the receiver, developed in C++, supports reliable handling of vital data to the Ethernet port. On the other hand, the human interface of the triage engine, developed in Java, displays graphics of the ECG data, 3D acceleration data, body surface temperature data and behavior status on the desktop PC and sends an urgent e-mail containing the displayed data to a pre-registered medical doctor when it detects the occurrence of an abnormal event. In the triage engine, the lethal arrhythmia detection algorithm based on short time Fourier transform (STFT) analysis can achieve 100% sensitivity and 99.99% specificity, and the behavior recognition algorithm based on the combination of the nearest neighbor method and the Naive Bayes method can achieve more than 71% classification accuracy.
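The STFT-based detection stage can be illustrated by a band-energy feature that separates a flutter-like oscillation from normal-rate activity. The signals, band limits and sampling rate below are illustrative assumptions, not the system's actual detector.

    import numpy as np
    from scipy.signal import stft

    fs = 250  # Hz, hypothetical ECG sampling rate
    t = np.arange(0, 10, 1 / fs)
    # Toy stand-in signals: normal-rate activity vs. a ~5 Hz sinusoid
    # mimicking the dominant oscillation of ventricular flutter.
    normal = np.sin(2 * np.pi * 1.2 * t) \
        + 0.1 * np.random.default_rng(0).normal(size=t.size)
    vf_like = np.sin(2 * np.pi * 5.0 * t)

    def dominant_band_ratio(sig, fs, lo=3.0, hi=8.0):
        """Fraction of spectral energy in the lo-hi Hz band per STFT frame,
        a simple cue that a lethal-arrhythmia detector could threshold."""
        f, _, Z = stft(sig, fs=fs, nperseg=512)
        power = np.abs(Z) ** 2
        band = (f >= lo) & (f <= hi)
        return power[band].sum(axis=0) / power.sum(axis=0)

    print(dominant_band_ratio(normal, fs).mean(),
          dominant_band_ratio(vf_like, fs).mean())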
NAIMA as a solution for future GMO diagnostics challenges.
Dobnik, David; Morisset, Dany; Gruden, Kristina
2010-03-01
In the field of genetically modified organism (GMO) diagnostics, real-time PCR has been the method of choice for target detection and quantification in most laboratories. Despite its numerous advantages, however, the lack of a true multiplexing option may render real-time PCR less practical in the face of future GMO detection challenges such as the multiplicity and increasing complexity of new transgenic events, as well as the repeated occurrence of unauthorized GMOs on the market. In this context, we recently reported the development of a novel multiplex quantitative DNA-based target amplification method, named NASBA implemented microarray analysis (NAIMA), which is suitable for sensitive, specific and quantitative detection of GMOs on a microarray. In this article, the performance of NAIMA is compared with that of real-time PCR, the focus being their performances in view of the upcoming challenge to detect/quantify an increasing number of possible GMOs at a sustainable cost and affordable staff effort. Finally, we present our conclusions concerning the applicability of NAIMA for future use in GMO diagnostics.
Debode, Frederic; Janssen, Eric; Bragard, Claude; Berben, Gilbert
2017-08-01
The presence of genetically modified organisms (GMOs) in food and feed is mainly detected by the use of targets focusing on promoters and terminators. As some genes are frequently used in genetically modified (GM) constructs, they also constitute excellent screening elements, and their use is increasing. In this paper we propose a new target for the detection of cry1Ab and cry1Ac genes by real-time polymerase chain reaction (PCR) and pyrosequencing. The specificity, sensitivity and robustness of the real-time PCR method were tested following the recommendations of international guidelines, and the method met the expected performance criteria. This paper also shows how the robustness testing was assessed. This new cry1Ab/Ac method can provide a positive signal for a larger number of GM events than the other existing methods using double-dye probes. The method permits the analysis of results with less ambiguity than the SYBRGreen method recommended by the European Reference Laboratory (EURL) for GM Food and Feed (GMFF). A pyrosequencing method was also developed to gain additional information from the sequence of the amplicon. This method of sequencing-by-synthesis can determine the sequence between the primers used for PCR. Pyrosequencing showed that the sequences internal to the primers differ depending on the GM event considered, and three different sequences were observed. The sensitivity of the pyrosequencing was tested on reference flours with a low percentage GM content and different copy numbers. Improvements in the pyrosequencing protocol provided correct sequences with 50 copies of the target. Below this copy number, the quality of the sequence was more variable.
Cao, Youfang; Liang, Jie
2013-01-01
Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
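The core mechanism, biasing reaction selection and correcting with likelihood weights, can be shown on the birth-death example named above. The sketch below uses a fixed bias on births rather than ABSIS's adaptive look-ahead biases, so it illustrates weighted stochastic simulation in general, not the ABSIS algorithm itself; all rates and the bias factor are illustrative.

    import numpy as np

    def rare_event_prob(n_runs=20000, x0=10, target=40, birth=1.0, death=1.5,
                        bias=1.6, seed=0):
        """Estimate P(population hits `target` before extinction) in a
        birth-death process by biasing births up and reweighting paths."""
        rng = np.random.default_rng(seed)
        total_w = 0.0
        for _ in range(n_runs):
            x, logw = x0, 0.0
            while 0 < x < target:
                a_birth, a_death = birth * x, death * x      # true propensities
                b_birth, b_death = bias * a_birth, a_death   # biased propensities
                a0, b0 = a_birth + a_death, b_birth + b_death
                tau = rng.exponential(1.0 / b0)              # biased waiting time
                logw += -(a0 - b0) * tau                     # waiting-time correction
                if rng.random() < b_birth / b0:              # biased reaction choice
                    x += 1
                    logw += np.log(a_birth / b_birth)
                else:
                    x -= 1
                    logw += np.log(a_death / b_death)
            if x == target:
                total_w += np.exp(logw)
        return total_w / n_runs

    print(rare_event_prob())  # unbiased estimate of a ~1e-5 rare event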
Abnormal global and local event detection in compressive sensing domain
NASA Astrophysics Data System (ADS)
Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem
2018-05-01
Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection to cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can locate abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. The abnormal detection task is then formulated as a one-class classification problem, learning only from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark abnormal detection datasets against state-of-the-art methods.
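The one-class formulation can be sketched with a one-class SVM trained only on normal motion features; the 16-dimensional vectors below are hypothetical placeholders for the paper's compressed optical-flow representation, and the classifier choice is an illustrative stand-in.

    import numpy as np
    from sklearn.svm import OneClassSVM

    # Hypothetical per-frame motion features; training uses normal frames only.
    rng = np.random.default_rng(0)
    X_train = rng.normal(0.0, 1.0, size=(500, 16))             # normal patterns
    X_test = np.vstack([rng.normal(0.0, 1.0, size=(20, 16)),
                        rng.normal(4.0, 1.0, size=(5, 16))])   # last 5 abnormal

    ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)
    print(ocsvm.predict(X_test))  # +1 = normal frame, -1 = abnormal frame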
Diagnosis diagrams for passing signals on an automatic block signaling railway section
NASA Astrophysics Data System (ADS)
Spunei, E.; Piroi, I.; Chioncel, C. P.; Piroi, F.
2018-01-01
This work presents a diagnosis method for railway traffic security installations. More specifically, the authors present a series of diagnosis charts for passing signals on a railway block equipped with an automatic block signaling installation. These charts are based on the electric schemes used in operation and are subsequently used to develop a diagnosis software package. The developed software package contributes substantially to reducing failure detection and repair times for these types of installation faults. The software package also prevents wrong decisions in the fault detection process, which could otherwise result in longer repair times and, sometimes, railway traffic events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warburton, William K.; Hennig, Wolfgang G.
A method and apparatus for measuring the concentrations of radioxenon isotopes in a gaseous sample wherein the sample cell is surrounded by N sub-detectors that are sensitive to both electrons and to photons from radioxenon decays. Signal processing electronics are provided that can detect events within the sub-detectors, measure their energies, determine whether they arise from electrons or photons, and detect coincidences between events within the same or different sub-detectors. The energies of detected two or three event coincidences are recorded as points in associated two or three-dimensional histograms. Counts within regions of interest in the histograms are then used to compute estimates of the radioxenon isotope concentrations. The method achieves lower backgrounds and lower minimum detectable concentrations by using smaller detector crystals, eliminating interference between double and triple coincidence decay branches, and segregating double coincidences within the same sub-detector from those occurring between different sub-detectors.
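The histogram-and-ROI step can be illustrated with NumPy: accumulate coincidence energies into a 2D histogram and sum the counts inside a region of interest. The energies and ROI bounds below are illustrative stand-ins, not calibrated radioxenon values.

    import numpy as np

    # Hypothetical two-event coincidences: (electron energy, photon energy)
    # in keV, one point per detected beta-gamma coincidence.
    rng = np.random.default_rng(0)
    energies = rng.normal([129.0, 81.0], [10.0, 5.0], size=(1000, 2))

    # Accumulate the coincidences into a 2D histogram.
    counts, e_edges, p_edges = np.histogram2d(
        energies[:, 0], energies[:, 1], bins=(64, 64), range=[[0, 400], [0, 300]])

    # Sum events inside an illustrative beta-gamma region of interest.
    roi = (slice(*np.searchsorted(e_edges, [100, 160])),
           slice(*np.searchsorted(p_edges, [70, 95])))
    print(counts[roi].sum())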
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
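A minimal TERMA-style sketch: a short event-related moving average is compared against a longer cycle-related moving average plus an offset, and sufficiently long blocks of excess become detected events. The default windows below satisfy the recommended inequality (8 × W1) ≥ W2 ≥ (2 × W1); the enhancement stage, offset rule and test signal are simplified assumptions, not the published framework in full.

    import numpy as np

    def terma_peaks(x, fs, w1_s=0.097, w2_s=0.611, beta=0.08):
        """Minimal TERMA-style detector: blocks where the short (event) moving
        average exceeds the long (cycle) moving average plus an offset."""
        w1, w2 = int(w1_s * fs) | 1, int(w2_s * fs) | 1  # odd window lengths
        y = x ** 2                                        # simple enhancement stage
        ma_event = np.convolve(y, np.ones(w1) / w1, mode="same")
        ma_cycle = np.convolve(y, np.ones(w2) / w2, mode="same")
        blocks = ma_event > ma_cycle + beta * y.mean()
        peaks, i = [], 0
        while i < len(x):
            if blocks[i]:
                j = i
                while j < len(x) and blocks[j]:
                    j += 1
                if j - i >= w1:                           # reject too-short blocks
                    peaks.append(i + int(np.argmax(x[i:j])))
                i = j
            else:
                i += 1
        return peaks

    fs = 360
    t = np.arange(0, 3, 1 / fs)
    pulse_train = np.sin(2 * np.pi * 1.0 * t) ** 16  # sharp periodic peaks
    print(terma_peaks(pulse_train, fs))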
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegradation in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase detection sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.
Liu, Chengyu; Zhao, Lina; Tang, Hong; Li, Qiao; Wei, Shoushui; Li, Jianqing
2016-08-01
False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease the quality of care by slowing staff response times while increasing patient burden and stress. In this study, we proposed a rule-based, multi-channel information fusion method for accurately classifying true and false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consisted of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure signal and/or one photoplethysmogram signal, were included in the analysis. Two events were set for the method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. The results showed that a 100% true positive ratio (i.e., sensitivity) was obtained on the training set for the ASY, EBR, ETC and VFB types, and 94% for the VTA type, accompanied by corresponding true negative ratios (i.e., specificity) of 93%, 81%, 78%, 85% and 50% respectively, resulting in score values of 96.50, 90.70, 88.89, 92.31 and 64.90, with a final score of 80.57 for event 1 and 79.12 for event 2. For the test set, the proposed method obtained scores of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA, with final scores of 71.68 for event 1 and 75.91 for event 2.
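As a toy illustration of the channel-fusion step (step 4), the sketch below combines per-channel true/false alarm votes while discarding channels with poor signal quality; the quality index, its threshold, and the fail-safe rule are assumptions of this sketch, not details taken from the paper.

```python
def fuse_alarm_decisions(channel_votes, channel_quality, min_quality=0.5):
    """Combine per-channel true/false alarm decisions, ignoring channels whose
    signal quality index falls below min_quality. A false alarm is declared
    only if every usable channel votes 'false'; with no usable channel the
    alarm is kept (fail safe)."""
    usable = [ch for ch, q in channel_quality.items() if q >= min_quality]
    if not usable:
        return True
    return any(channel_votes[ch] for ch in usable)

votes = {"ECG1": False, "ECG2": False, "ABP": False, "PPG": True}
quality = {"ECG1": 0.9, "ECG2": 0.8, "ABP": 0.4, "PPG": 0.7}
print(fuse_alarm_decisions(votes, quality))  # True: PPG still sees the event
```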
Real time algorithms for sharp wave ripple detection.
Sethi, Ankit; Kemere, Caleb
2014-01-01
Neural activity during sharp wave ripples (SWRs), short bursts of coordinated oscillatory activity in the CA1 region of the rodent hippocampus, is implicated in a variety of memory functions from consolidation to recall. Detection of these events in an algorithmic framework has thus far relied on simple thresholding techniques with heuristically derived parameters. This study tests and improves upon current methods for detecting SWR events in neural recordings. We propose and profile methods to reduce latency in ripple detection. The proposed algorithms are tested on simulated ripple data. The findings show that simple real-time algorithms can improve upon existing power-thresholding methods and can detect ripple activity with latencies in the range of 10-20 ms.
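For reference, the power-thresholding baseline that such work aims to improve upon can be sketched in a few lines of Python; the filter order, band edges, smoothing window and the n_std threshold are conventional illustrative choices, not values from the abstract.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def ripple_mask(lfp, fs, band=(150.0, 250.0), n_std=3.0, smooth_s=0.01):
    """Baseline power-threshold SWR detector: band-pass the LFP in the ripple
    band, smooth the squared signal to estimate power, and flag samples that
    exceed mean + n_std standard deviations of that power estimate."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfilt(sos, lfp)
    win = max(1, int(smooth_s * fs))
    power = np.convolve(filtered ** 2, np.ones(win) / win, mode="same")
    return power > power.mean() + n_std * power.std()
```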
Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles
2007-01-01
Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203
Bowman, Howard; Filetti, Marco; Janssen, Dirk; Su, Li; Alsufyani, Abdulmajeed; Wyble, Brad
2013-01-01
We propose a novel deception detection system based on Rapid Serial Visual Presentation (RSVP). One motivation for the new method is to present stimuli on the fringe of awareness, such that it is more difficult for deceivers to confound the deception test using countermeasures. The proposed system is able to detect identity deception (by using the first names of participants) with a 100% hit rate (at an alpha level of 0.05). To achieve this, we extended the classic Event-Related Potential (ERP) techniques (such as peak-to-peak) by applying Randomisation, a form of Monte Carlo resampling, which we used to detect deception at an individual level. In order to make the deployment of the system simple and rapid, we utilised data from three electrodes only: Fz, Cz and Pz. We then combined data from the three electrodes using Fisher's method so that each participant was assigned a single p-value, which represents the combined probability that a specific participant was being deceptive. We also present subliminal salience search as a general method to determine what participants find salient by detecting breakthrough into conscious awareness using EEG. PMID:23372697
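The p-value combination step the authors describe, Fisher's method, is standard and compact; the sketch below shows the idea, with electrode names and example values purely illustrative.

```python
import math
from scipy.stats import chi2

def fisher_combine(p_values):
    """Fisher's method: under the null hypothesis, -2 * sum(ln p_i) follows a
    chi-squared distribution with 2k degrees of freedom for k independent
    p-values, yielding one combined p-value per participant."""
    stat = -2.0 * sum(math.log(p) for p in p_values)
    return chi2.sf(stat, df=2 * len(p_values))

# e.g. per-electrode p-values from Fz, Cz and Pz
print(fisher_combine([0.04, 0.10, 0.06]))
```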
Detection of medication-related problems in hospital practice: a review
Manias, Elizabeth
2013-01-01
This review examines the effectiveness of detection methods in terms of their ability to identify and accurately determine medication-related problems in hospitals. A search was conducted of databases from inception to June 2012. The following keywords were used in combination: medication error or adverse drug event or adverse drug reaction, comparison, detection, hospital and method. Seven detection methods were considered: chart review, claims data review, computer monitoring, direct care observation, interviews, prospective data collection and incident reporting. Forty relevant studies were located. Detection methods that were better able to identify medication-related problems compared with other methods tested in the same study included chart review, computer monitoring, direct care observation and prospective data collection. However, only small numbers of studies were involved in comparisons with direct care observation (n = 5) and prospective data collection (n = 6). There was little focus on detecting medication-related problems during various stages of the medication process, and comparisons associated with the seriousness of medication-related problems were examined in 19 studies. Only 17 studies involved appropriate comparisons with a gold standard, which provided details about sensitivities and specificities. In view of the relatively low identification of medication-related problems with incident reporting, use of this method in tracking trends over time should be met with some scepticism. Greater attention should be placed on combining methods, such as chart review and computer monitoring in examining trends. More research is needed on the use of claims data, direct care observation, interviews and prospective data collection as detection methods. PMID:23194349
Bottlenecks drive temporal and spatial genetic changes in alpine caddisfly metapopulations.
Shama, Lisa N S; Kubow, Karen B; Jokela, Jukka; Robinson, Christopher T
2011-09-27
Extinction and re-colonisation of local populations is common in ephemeral habitats such as temporary streams. In most cases, such population turnover leads to reduced genetic diversity within populations and increased genetic differentiation among populations due to stochastic founder events, genetic drift, and bottlenecks associated with re-colonisation. Here, we examined the spatio-temporal genetic structure of 8 alpine caddisfly populations inhabiting permanent and temporary streams from four valleys in two regions of the Swiss Alps in years before and after a major stream drying event, the European heat wave in summer 2003. We found that population turnover after 2003 led to a loss of allelic richness and gene diversity but not to significant changes in observed heterozygosity. Within all valleys, permanent and temporary streams in any given year were not differentiated, suggesting considerable gene flow and admixture between streams with differing hydroperiods. Large changes in allele frequencies after 2003 resulted in a substantial increase in genetic differentiation among valleys within one to two years (1-2 generations) driven primarily by drift and immigration. Signatures of genetic bottlenecks were detected in all 8 populations after 2003 using the M-ratio method, but in no populations when using a heterozygosity excess method, indicating differential sensitivity of bottleneck detection methods. We conclude that genetic differentiation among A. uncatus populations changed markedly both temporally and spatially in response to the extreme climate event in 2003. Our results highlight the magnitude of temporal population genetic changes in response to extreme events. More specifically, our results show that extreme events can cause rapid genetic divergence in metapopulations. Further studies are needed to determine if recovery from this perturbation through gradual mixing of diverged populations by migration and gene flow leads to the pre-climate event state, or whether the observed changes represent a new genetic equilibrium.
Estimate of within population incremental selection through branch imbalance in lineage trees
Liberman, Gilad; Benichou, Jennifer I.C.; Maman, Yaakov; Glanville, Jacob; Alter, Idan; Louzoun, Yoram
2016-01-01
Incremental selection within a population, defined as limited fitness changes following mutation, is an important aspect of many evolutionary processes. Strongly advantageous or deleterious mutations are detected using the synonymous to non-synonymous mutation ratio. However, there are currently no precise methods to estimate incremental selection. Here we provide, for the first time, such a detailed method and show its precision in multiple cases of micro-evolution. The proposed method is a novel mixed lineage-tree/sequence-based method to detect within-population selection as defined by the effect of mutations on the average number of offspring. Specifically, we propose to measure the log of the ratio between the numbers of leaves in lineage tree branches following synonymous and non-synonymous mutations. The method requires a sufficiently high number of sequences and a large number of independent mutations. It assumes that all mutations are independent events. It does not require a baseline model and is practically unaffected by sampling biases. We show the method's wide applicability by testing it on multiple cases of micro-evolution. We show that it can detect selection in genes and inter-genic regions using the selection rate and can detect selection pressures in viral proteins and in the immune response to pathogens. PMID:26586802
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases to USArray data. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform time-slowness analysis of the signals and correlate it with the time-slowness equivalent of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 1 February 2010. This event, which lasted one day, has a narrow frequency band around 0.1 Hz, and it occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by the interference between oceanic waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the South East Indian Ridge.
Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth
2017-01-01
Non-verbal communication involves the encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g., prosody) or visual (e.g., gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g., laughter, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughter is associated with affective expressions while fillers (e.g., um, uh, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal event detection system focusing on the detection of laughter and fillers. We extend our system presented during the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an area under the receiver operating characteristic curve of 95.3% for detecting laughter and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications for better system design. PMID:28713197
Khosla, Deepak; Huber, David J.; Martin, Kevin
2017-05-01
This paper describes a technique in which we improve upon the prior performance of the Rapid Serial Visual Presentation (RSVP) EEG paradigm for image classification through the insertion of visual attention distracters and overall sequence reordering based upon the expected ratio of rare to common "events" in the environment and operational context. Inserting distracter images maintains the ratio of common events to rare events at an ideal level, maximizing rare event detection via the P300 EEG response to the RSVP stimuli. The method has two steps: first, we compute the optimal number of distracters needed for an RSVP stimulus based on the desired sequence length and expected number of targets and insert the distracters into the RSVP sequence; then we reorder the RSVP sequence to maximize P300 detection. We show that by reducing the ratio of target events to nontarget events using this method, we can allow RSVP sequences with more targets without sacrificing area under the ROC curve (Az).
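The first step, computing the number of distracters, is straightforward arithmetic; the sketch below assumes a desired maximum target proportion target_ratio, which is a parameter of this illustration rather than the value used in the paper.

```python
import math

def distracters_needed(n_targets, n_common, target_ratio=0.1):
    """Hypothetical step 1: how many distracter images must be inserted so
    that targets make up at most target_ratio of the final RSVP sequence."""
    required_total = math.ceil(n_targets / target_ratio)
    return max(0, required_total - n_targets - n_common)

# e.g. 12 targets among 60 common images at a desired 10% target rate:
# 48 distracters bring the sequence to 120 images, 10% of which are targets.
print(distracters_needed(12, 60))
```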
A large 2D PSD for thermal neutron detection
Knott, R.B.; Watt, G.; Boldeman, J.W.
1996-12-31
A 2D PSD based on a MWPC has been constructed for a small-angle neutron scattering instrument. The active area of the detector was 640 × 640 mm². To meet the specifications for neutron detection efficiency and spatial resolution, and to minimize parallax, the gas mixture was 190 kPa ³He plus 100 kPa CF₄ and the active volume had a thickness of 30 mm. The design maximum neutron count-rate of the detector was 10⁵ events per second. The (calculated) neutron detection efficiency was 60% for 2 Å neutrons and the (measured) neutron energy resolution on the anode grid was typically 20% (FWHM). The location of a neutron detection event within the active area was determined using the wire-by-wire method: the spatial resolution (5 × 5 mm²) was thereby defined by the wire geometry. A 16-channel charge-sensitive preamplifier/amplifier/comparator module has been developed with a channel sensitivity of 0.1 V/fC, a noise linewidth of 0.4 fC (FWHM) and channel-to-channel cross-talk of less than 5%. The Proportional Counter Operating System (PCOS III) (LeCroy Corp., USA) was used for event encoding. The ECL signals produced by the 16-channel modules were latched in PCOS III by a trigger pulse from the anode, and the fast encoders produced a position and width for each event. The information was transferred to a UNIX workstation for accumulation and online display.
Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.
Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.
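A hedged sketch of the detection idea: once one per-frame statistic has been pulled out of the first-pass encoder log (the exact log format depends on the avconv build, so parsing is deliberately left out here), events can be flagged as statistical outliers against the overall baseline. The z-score rule below is an assumption of this sketch, not the authors' algorithm.

```python
import numpy as np

def flag_event_frames(frame_stats, n_std=3.0):
    """Given one per-frame statistic from the encoder's first-pass log
    (e.g. predicted-texture bits; field names vary by build), return the
    indices of frames whose value deviates strongly from the baseline."""
    x = np.asarray(frame_stats, dtype=float)
    z = (x - x.mean()) / (x.std() + 1e-12)
    return np.flatnonzero(np.abs(z) > n_std)   # candidate event frames
```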
Detecting Seismic Events Using a Supervised Hidden Markov Model
Burks, L.; Forrest, R.; Ray, J.; Young, C.
2017-12-01
We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which a transition occurs from noise to signal. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No: SAND2017-8154 A
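The STA/LTA trigger mentioned above as an HMM input feature is easy to sketch. Window lengths and the discretization threshold below are conventional illustrative choices, not values from the abstract.

```python
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
    """Classic STA/LTA characteristic function, the kind of simple feature fed
    to the HMM above: the ratio of short-term to long-term average energy
    rises sharply when a seismic arrival begins."""
    x = np.asarray(trace, dtype=float) ** 2
    sta = np.convolve(x, np.ones(int(sta_s * fs)) / (sta_s * fs), mode="same")
    lta = np.convolve(x, np.ones(int(lta_s * fs)) / (lta_s * fs), mode="same")
    return sta / np.maximum(lta, 1e-12)

def discretize(cf, threshold=3.5):
    """Binarize the characteristic function for a discrete-emission HMM
    (the threshold is an illustrative choice)."""
    return (cf > threshold).astype(int)
```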
Space-time clusters for early detection of grizzly bear predation.
Kermish-Wells, Joseph; Massolo, Alessandro; Stenhouse, Gordon B; Larsen, Terrence A; Musiani, Marco
2018-01-01
Accurate detection and classification of predation events is important to determine predation and consumption rates by predators. However, obtaining this information for large predators is constrained by the speed at which carcasses disappear and the cost of field data collection. To accurately detect predation events, researchers have used GPS collar technology combined with targeted site visits. However, kill sites are often investigated well after the predation event due to limited data retrieval options on GPS collars (VHF or UHF downloading) and to ensure crew safety when working with large predators. This can lead to missing information from small-prey (including young ungulates) kill sites due to scavenging and general site deterioration (e.g., vegetation growth). We used a space-time permutation scan statistic (STPSS) clustering method (SaTScan) to detect predation events of grizzly bears ( Ursus arctos ) fitted with satellite transmitting GPS collars. We used generalized linear mixed models to verify predation events and the size of carcasses using spatiotemporal characteristics as predictors. STPSS uses a probability model to compare expected cluster size (space and time) with the observed size. We applied this method retrospectively to data from 2006 to 2007 to compare our method to random GPS site selection. In 2013-2014, we applied our detection method to visit sites one week after their occupation. Both datasets were collected in the same study area. Our approach detected 23 of 27 predation sites verified by visiting 464 random grizzly bear locations in 2006-2007, 187 of which were within space-time clusters and 277 outside. Predation site detection increased by 2.75 times (54 predation events of 335 visited clusters) using 2013-2014 data. Our GLMMs showed that cluster size and duration predicted predation events and carcass size with high sensitivity (0.72 and 0.94, respectively). Coupling GPS satellite technology with clusters using a program based on space-time probability models allows for prompt visits to predation sites. This enables accurate identification of the carcass size and increases fieldwork efficiency in predation studies.
Design and Deployment of a Pediatric Cardiac Arrest Surveillance System
Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M.; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne
2018-01-01
Objective We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. Materials and Methods We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. Results From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 requiring CPR. The PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3% of events. 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts. Performance data from both the defibrillator and the bedside monitor increased annually (2013: 1%; 2014: 18%; 2015: 27%). Discussion After deployment of this system, detection increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50–70% of PICU, NICU, and PEDS-ED events would have been missed. Conclusion By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest, with an associated increase in the proportion of objective performance data captured, is possible. PMID:29854451
Ben Mansour, Khaireddine; Rezzoug, Nasser; Gorce, Philippe
2015-10-01
The purpose of this paper was to determine which types of inertial sensors, and which of the advocated sensor locations, should be used for reliable and accurate gait event detection and temporal parameter assessment in normal adults. In addition, we aimed to remove the ambiguity found in the literature regarding the definition of initial contact (IC) from the lumbar accelerometer. Acceleration and angular velocity data were gathered from the lumbar region and the distal edge of each shank. These data were evaluated against an instrumented treadmill and an optoelectronic system during five treadmill speed sessions. The lumbar accelerometer showed that the peak of the anteroposterior component was the most accurate for IC detection. Similarly, the valley that followed the peak of the vertical component was the most precise for terminal contact (TC) detection. Results based on ANOVA and Tukey tests showed that the set of inertial methods was suitable for temporal gait assessment and gait event detection in able-bodied subjects. For gait event detection, an exception was found with the shank accelerometer. That tool was suitable for temporal parameter assessment despite the high root mean square errors on the detection of IC (RMSE_IC) and TC (RMSE_TC). The shank gyroscope was found to be as accurate as the kinematic method, since the statistical tests revealed no significant difference between the two techniques in the RMSE of all gait events and temporal parameters. The lumbar and shank accelerometers were the most accurate alternatives to the shank gyroscope for gait event detection and temporal parameter assessment, respectively. Copyright © 2015. Published by Elsevier B.V.
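The reported IC rule (the peak of the anteroposterior lumbar acceleration) translates directly into a peak-picking sketch; the minimum inter-step interval and the height percentile below are assumptions of this illustration, not parameters from the study.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_initial_contacts(ap_accel, fs, min_step_interval=0.4):
    """Sketch of the finding above: take initial contact (IC) as the peak of
    the anteroposterior lumbar acceleration. A minimum inter-peak distance
    (0.4 s here, an assumption) suppresses spurious secondary peaks."""
    peaks, _ = find_peaks(np.asarray(ap_accel, dtype=float),
                          distance=int(min_step_interval * fs),
                          height=np.percentile(ap_accel, 75))
    return peaks / fs     # IC times in seconds
```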
Detection of thiol-based redox switch processes in parasites - facts and future.
Rahbari, Mahsa; Diederich, Kathrin; Becker, Katja; Krauth-Siegel, R Luise; Jortzik, Esther
2015-05-01
Malaria and African trypanosomiasis are tropical diseases caused by the protozoa Plasmodium and Trypanosoma, respectively. The parasites undergo complex life cycles in the mammalian host and insect vector, during which they are exposed to oxidative and nitrosative challenges induced by the host immune system and endogenous processes. Attacking the parasite's redox metabolism is a target mechanism of several known antiparasitic drugs and a promising approach to novel drug development. Apart from this aspect, oxidation of cysteine residues plays a key role in protein-protein interaction, metabolic responses to redox events, and signaling. Understanding the role and dynamics of reactive oxygen species and thiol switches in regulating cellular redox homeostasis is crucial for both basic and applied biomedical approaches. Numerous techniques have therefore been established to detect redox changes in parasites including biochemical methods, fluorescent dyes, and genetically encoded probes. In this review, we aim to give an insight into the characteristics of redox networks in the pathogens Plasmodium and Trypanosoma, including a comprehensive overview of the consequences of specific deletions of redox-associated genes. Furthermore, we summarize mechanisms and detection methods of thiol switches in both parasites and discuss their specificity and sensitivity.
Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits
Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.
2017-12-01
The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single-mode fiber optics in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active seismic, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). For some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare the repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on what distances and magnitudes are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open-source software, available for free download, that can manage large sets of continuous seismic data files (both existing files and files as they stream in). This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from many continuous recordings saved on the user's machines.
DARHT Multi-intelligence Seismic and Acoustic Data Analysis
Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.
The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.
Kamdar, Maulik R; Musen, Mark A
2017-01-01
Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
High-speed event detector for embedded nanopore bio-systems.
Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie
2015-08-01
Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
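A common baseline for nanopore event detection, and a plausible starting point for the kind of detector described, is a two-threshold (hysteresis) scheme. The entry/exit fractions below are illustrative assumptions, and this sketch omits the noise and sampler-resolution mitigations that are the paper's actual contribution.

```python
def detect_blockades(current, open_level, enter_frac=0.8, exit_frac=0.9):
    """Toy hysteresis event detector for nanopore traces: an event starts
    when the current drops below enter_frac * open_level and ends when it
    recovers above exit_frac * open_level, suppressing chatter near a single
    threshold."""
    events, start, in_event = [], None, False
    for i, sample in enumerate(current):
        if not in_event and sample < enter_frac * open_level:
            in_event, start = True, i
        elif in_event and sample > exit_frac * open_level:
            events.append((start, i))   # (start_index, end_index) of blockade
            in_event = False
    return events
```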
Association rule mining in the US Vaccine Adverse Event Reporting System (VAERS).
Wei, Lai; Scott, John
2015-09-01
Spontaneous adverse event reporting systems are critical tools for monitoring the safety of licensed medical products. Commonly used signal detection algorithms identify disproportionate product-adverse event pairs and may not be sensitive to more complex potential signals. We sought to develop a computationally tractable multivariate data-mining approach to identify product-multiple adverse event associations. We describe an application of stepwise association rule mining (Step-ARM) to detect potential vaccine-symptom group associations in the US Vaccine Adverse Event Reporting System. Step-ARM identifies strong associations between one vaccine and one or more adverse events. To reduce the number of redundant association rules found by Step-ARM, we also propose a clustering method for the post-processing of association rules. In sample applications to a trivalent intradermal inactivated influenza virus vaccine and to measles, mumps, rubella, and varicella (MMRV) vaccine and in simulation studies, we find that Step-ARM can detect a variety of medically coherent potential vaccine-symptom group signals efficiently. In the MMRV example, Step-ARM appears to outperform univariate methods in detecting a known safety signal. Our approach is sensitive to potentially complex signals, which may be particularly important when monitoring novel medical countermeasure products such as pandemic influenza vaccines. The post-processing clustering algorithm improves the applicability of the approach as a screening method to identify patterns that may merit further investigation. Copyright © 2015 John Wiley & Sons, Ltd.
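A stripped-down version of mining one-vaccine-to-symptom-group rules can be written with plain support/confidence counting. The report structure, thresholds and group-size cap below are assumptions of this sketch; Step-ARM's stepwise search and the post-processing rule clustering are not reproduced.

```python
from itertools import combinations
from collections import Counter

def vaccine_symptom_rules(reports, vaccine, min_support=0.01, min_conf=0.2,
                          max_group=3):
    """Toy mining of 'one vaccine -> symptom group' rules from spontaneous
    reports. Each report is assumed to be a dict with a set of 'symptoms'
    and a set of 'vaccines'."""
    with_vax = [r["symptoms"] for r in reports if vaccine in r["vaccines"]]
    n, n_vax = len(reports), len(with_vax)
    counts = Counter()
    for symptoms in with_vax:
        for k in range(1, max_group + 1):
            for group in combinations(sorted(symptoms), k):
                counts[group] += 1
    rules = []
    for group, c in counts.items():
        support, confidence = c / n, c / n_vax
        if support >= min_support and confidence >= min_conf:
            rules.append((group, support, confidence))
    return sorted(rules, key=lambda r: -r[2])   # strongest rules first
```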
Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R
2011-09-01
In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
Sols, Ignasi; DuBrow, Sarah; Davachi, Lila; Fuentemilla, Lluís
2017-11-20
Although everyday experiences unfold continuously over time, shifts in context, or event boundaries, can influence how those events come to be represented in memory [1-4]. Specifically, mnemonic binding across sequential representations is more challenging at context shifts, such that successful temporal associations are more likely to be formed within than across contexts [1, 2, 5-9]. However, in order to preserve a subjective sense of continuity, it is important that the memory system bridge temporally adjacent events, even if they occur in seemingly distinct contexts. Here, we applied pattern similarity analysis to scalp electroencephalographic (EEG) recordings during a sequential learning task [2, 3] in humans and showed that the detection of event boundaries triggered a rapid memory reinstatement of the just-encoded sequence episode. Memory reactivation was detected rapidly (∼200-800 ms from the onset of the event boundary) and was specific to context shifts that were preceded by an event sequence with episodic content. Memory reinstatement was not observed during the sequential encoding of events within an episode, indicating that memory reactivation was induced specifically upon context shifts. Finally, the degree of neural similarity between responses elicited during sequence encoding and at event boundaries correlated positively with participants' ability to later link across sequences of events, suggesting a critical role in binding temporally adjacent events in long-term memory. The current results shed light on the neural mechanisms that promote episodic encoding, not only of information within an event but also, importantly, of the links across events that create a memory representation of continuous experience. Copyright © 2017 Elsevier Ltd. All rights reserved.
Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin
2006-12-01
We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.
Numerical study on the sequential Bayesian approach for radioactive materials detection
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on classical statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling theory in order to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
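The flavor of a sequential Bayesian detector can be conveyed with a two-hypothesis Poisson update; the known background and source-plus-background count rates and the decision threshold are assumptions of this sketch, not parameters from the paper.

```python
import math

def sequential_bayes(counts, bkg_rate, src_rate, prior=0.5, decide_at=0.95):
    """Minimal sequential Bayesian detector: after each counting interval,
    update the posterior probability that a source is present, assuming
    Poisson counts with known expectations for background (bkg_rate) and
    background plus source (src_rate)."""
    def poisson(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)
    p = prior
    for t, k in enumerate(counts, 1):
        num = p * poisson(k, src_rate)
        den = num + (1 - p) * poisson(k, bkg_rate)
        p = num / den
        if p >= decide_at:
            return t, p        # detection declared at interval t
    return None, p             # no decision within the record
```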
Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.
Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming
2015-12-15
With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
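One plausible rendering of the source-determination step in code, sketched with networkx; both the library choice and the inverse-deviation edge weighting (so that strongly deviating neighborhoods are "short" and attract the barycenter) are assumptions of this illustration.

```python
import networkx as nx

def event_sources(deviations, edges, eps=1e-9):
    """Build the event coverage graph (nodes = sensor nodes; each edge is
    weighted by its endpoints' deviation from the normal sensing range) and
    return its barycenter(s), taken here as the likely event source(s)."""
    g = nx.Graph()
    for u, v in edges:
        # strong deviations make an edge short, pulling the barycenter
        # toward the most anomalous region of the network
        g.add_edge(u, v, weight=1.0 / (deviations[u] + deviations[v] + eps))
    # nx.barycenter returns nodes minimizing total weighted distance to all others
    return nx.barycenter(g, weight="weight")
```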
Recent advances in understanding the role of nutrition in human genome evolution.
Ye, Kaixiong; Gu, Zhenglong
2011-11-01
Dietary transitions in human history have been suggested to play important roles in the evolution of mankind. Genetic variations caused by adaptation to diet during human evolution could have important health consequences in current society. The advance of sequencing technologies and the rapid accumulation of genome information provide an unprecedented opportunity to comprehensively characterize genetic variations in human populations and unravel the genetic basis of human evolution. Series of selection detection methods, based on various theoretical models and exploiting different aspects of selection signatures, have been developed. Their applications at the species and population levels have respectively led to the identification of human specific selection events that distinguish human from nonhuman primates and local adaptation events that contribute to human diversity. Scrutiny of candidate genes has revealed paradigms of adaptations to specific nutritional components and genome-wide selection scans have verified the prevalence of diet-related selection events and provided many more candidates awaiting further investigation. Understanding the role of diet in human evolution is fundamental for the development of evidence-based, genome-informed nutritional practices in the era of personal genomics.
Fast neutron imaging device and method
Popov, Vladimir; Degtiarenko, Pavel; Musatov, Igor V.
2014-02-11
A fast neutron imaging apparatus and method of constructing fast neutron radiography images, the apparatus including a neutron source and a detector that provides event-by-event acquisition of position and energy deposition, and optionally timing and pulse shape for each individual neutron event detected by the detector. The method for constructing fast neutron radiography images utilizes the apparatus of the invention.
A comparison of earthquake backprojection imaging methods for dense local arrays
Beskardes, G. D.; Hole, J. A.; Wang, K.; Michaelides, M.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Brown, L. D.; Quiros, D. A.
2018-03-01
Backprojection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. While backprojection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed and simplified to overcome imaging challenges. Real-data issues include aliased station spacing, inadequate array aperture, inaccurate velocity models, low signal-to-noise ratio, large noise bursts and varying waveform polarity. We compare the performance of backprojection with four previously used data pre-processing methods: raw waveform, envelope, short-term averaging/long-term averaging and kurtosis. Our primary goal is to detect and locate events smaller than noise by stacking prior to detection to improve the signal-to-noise ratio. The objective is to identify an optimized strategy for automated imaging that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the source images, preserves magnitude, and considers computational cost. Imaging method performance is assessed using a real aftershock data set recorded by the dense AIDA array following the 2011 Virginia earthquake. Our comparisons show that raw-waveform backprojection provides the best spatial resolution, preserves magnitude and boosts signal to detect events smaller than noise, but is most sensitive to velocity error, polarity error and noise bursts. On the other hand, the other methods avoid polarity error and reduce sensitivity to velocity error, but sacrifice spatial resolution and cannot effectively reduce noise by stacking. Of these, only kurtosis is insensitive to large noise bursts while being as efficient as the raw-waveform method in lowering the detection threshold; however, it does not preserve magnitude information. For automatic detection and location of events in a large data set, we therefore recommend backprojecting kurtosis waveforms, followed by a second pass on the detected events using noise-filtered raw waveforms to achieve the best of all criteria.
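The recommended kurtosis pre-processing and the shift-and-stack core of backprojection can both be sketched compactly. The window length and the per-station travel-time delays are inputs the caller must supply, and this illustration ignores the velocity-model and aliasing issues discussed above.

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_cf(trace, win):
    """Sliding-window kurtosis characteristic function: an impulsive arrival
    makes the local sample distribution heavy-tailed, so the kurtosis spikes
    at onsets while large-amplitude but stationary noise leaves it flat."""
    cf = np.zeros(len(trace))
    for i in range(win, len(trace)):
        cf[i] = kurtosis(trace[i - win:i])
    return np.maximum(cf, 0.0)        # keep only positive excess kurtosis

def backproject(cfs, delays):
    """Shift-and-stack for one candidate source location: align each station's
    characteristic function by its predicted travel-time delay (in samples)
    and average; a true event produces a coherent peak in the stack."""
    n = min(len(cf) - d for cf, d in zip(cfs, delays))
    return sum(cf[d:d + n] for cf, d in zip(cfs, delays)) / len(cfs)
```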
Homaeinezhad, M R; Erfanianmoshiri-Nejad, M; Naseri, H
2014-01-01
The goal of this study is to introduce a simple, standard and safe procedure for detecting and delineating the P and T waves of the electrocardiogram (ECG) signal in real conditions. The proposed method consists of four major steps: (1) a secure QRS detection and delineation algorithm, (2) a pattern recognition algorithm designed to distinguish the various ECG clusters that occur between consecutive R-waves, (3) extraction of a template of the dominant events of each cluster waveform and (4) application of correlation analysis to automatically delineate the P- and T-waves in noisy conditions. The performance characteristics of the proposed P and T detection-delineation algorithm are evaluated on various ECG signals whose quality is degraded from the best to the worst case based on random-walk noise theory. The method is also applied to the MIT-BIH Arrhythmia and QT databases to compare parts of its performance characteristics with a number of P and T detection-delineation algorithms. The conducted evaluations indicate that, in a signal with a low quality value of about 0.6, the proposed method detects the P and T events with a sensitivity of Se=85% and a positive predictive value of P+=89%, respectively. In addition, at the same quality, the average delineation errors associated with those ECG events are 45 and 63 ms, respectively. Stable delineation error, high detection accuracy and high noise tolerance were the most important aspects considered during development of the proposed method. © 2013 Elsevier Ltd. All rights reserved.
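The correlation step (step 4) amounts to sliding a dominant-event template along the inter-R segment and keeping the best normalized match; the sketch below assumes the template has already been extracted in step 3 and is an illustration, not the authors' implementation.

```python
import numpy as np

def locate_by_template(segment, template):
    """Slide the dominant-event template along the inter-R segment and return
    the offset with maximum normalized cross-correlation, taken as the
    P (or T) wave position."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    best_i, best_r = 0, -np.inf
    for i in range(len(segment) - len(template) + 1):
        w = segment[i:i + len(template)]
        r = float(np.dot(t, (w - w.mean()) / (w.std() + 1e-12))) / len(t)
        if r > best_r:
            best_i, best_r = i, r
    return best_i, best_r      # sample offset and correlation coefficient
```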
The use of Matlab for colour fuzzy representation of multichannel EEG short time spectra.
Bigan, C; Strungaru, R
1998-01-01
In recent years, much EEG research effort has been directed toward intelligent methods for the automatic analysis of data from multichannel EEG recordings. However, the applications reported have focused on specific single tasks, such as the detection of one specific "event" in the EEG signal: spikes, sleep spindles, epileptic seizures, K complexes, alpha or other rhythms, or even artefacts. The aim of this paper is to present a complex system that is able to represent the dynamic changes in the frequency components of each EEG channel. This representation uses colours as a powerful means to show one chosen frequency range, computed from the shortest epoch of signal that can be processed with the conventional "Short Time Fast Fourier Transform" (S.T.F.F.T.) method.
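A minimal sketch of such a representation using SciPy's STFT follows; the fuzzy colour assignment of the original system is not reproduced, and the band edges and window length are assumptions of this illustration.

```python
import numpy as np
from scipy.signal import stft

def band_power_map(eeg_channels, fs, band=(8.0, 12.0), nperseg=256):
    """Short-time FFT each channel, keep the power in one chosen frequency
    band (here alpha, 8-12 Hz), and return a channels x time matrix ready to
    be colour-mapped."""
    rows = []
    for ch in eeg_channels:
        f, t, z = stft(ch, fs=fs, nperseg=nperseg)
        sel = (f >= band[0]) & (f <= band[1])
        rows.append(np.abs(z[sel]).mean(axis=0))   # mean band amplitude per epoch
    return np.vstack(rows)   # e.g. plt.imshow(..., aspect="auto", cmap="jet")
```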
Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao
2017-09-27
Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes for matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and the performance of pCAN in the analysis of various canola samples. The LODs are 15 copies for the RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R²) are all higher than 0.99. The quantification bias values varied from 0.47% to 20.68%, with relative standard deviations (RSD) from 1.06% to 24.61%, in the quantification of simulated samples. Furthermore, 10 practical canola samples taken from imported shipments in the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those of assays using commercial certified materials as the calibrator. Concluding from these results, we believe that the newly developed pCAN plasmid is a good candidate plasmid DNA reference material for the detection and quantification of the eight GM canola events in routine analysis.
Cartan invariants and event horizon detection
Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.
2018-04-01
We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.
Adaptable, high recall, event extraction system with minimal configuration
2015-01-01
Background Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. Results We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task specific configuration and tuning, EventMine achieved the 1st place in the PC task, and 2nd in the CG, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. Conclusions We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration. PMID:26201408
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach
Elgendi, Mohamed
2016-01-01
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to here for the first time as two event-related moving averages (“TERMA”), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results recommend that the window sizes of the two moving averages (W1 and W2) follow the inequality (8×W1) ≥ W2 ≥ (2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared with more complex machine learning solutions. PMID:27827852
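A minimal sketch of the two-moving-averages idea, with assumed settings (window sizes respecting the inequality above, a simple squaring stage, and an arbitrary offset fraction beta); this is an illustration, not the authors' reference implementation:

```python
import numpy as np

def terma_detect(x, w1=11, w2=55, beta=0.1):
    """TERMA-style detection: an event-level moving average (window W1) is
    compared against a cycle-level moving average (window W2) to generate
    blocks of interest; one peak is kept per block. Here W2 = 5*W1, which
    satisfies the recommended (8*W1) >= W2 >= (2*W1)."""
    def moving_avg(s, w):
        return np.convolve(s, np.ones(w) / w, mode="same")

    y = x.astype(float) ** 2            # assumed signal-enhancement stage
    ma_event, ma_cycle = moving_avg(y, w1), moving_avg(y, w2)
    blocks = ma_event > ma_cycle + beta * y.mean()

    peaks, in_block, start = [], False, 0
    for i, flag in enumerate(blocks):
        if flag and not in_block:
            start, in_block = i, True
        elif not flag and in_block:
            peaks.append(start + int(np.argmax(y[start:i])))  # peak inside block
            in_block = False
    if in_block:
        peaks.append(start + int(np.argmax(y[start:])))
    return peaks
```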
Christoforides, Alexis; Carpten, John D; Weiss, Glen J; Demeure, Michael J; Von Hoff, Daniel D; Craig, David W
2013-05-04
The field of cancer genomics has rapidly adopted next-generation sequencing (NGS) in order to study and characterize malignant tumors with unprecedented resolution. In particular for cancer, one is often trying to identify somatic mutations: changes specific to a tumor and not within an individual's germline. However, false positive and false negative detections often result from lack of sufficient variant evidence, contamination of the biopsy by stromal tissue, sequencing errors, and the erroneous classification of germline variation as tumor-specific. We have developed a generalized Bayesian analysis framework for matched tumor/normal samples with the purpose of identifying tumor-specific alterations such as single nucleotide mutations, small insertions/deletions, and structural variation. We describe our methodology, and discuss its application to other types of paired-tissue analysis such as the detection of loss of heterozygosity as well as allelic imbalance. We also demonstrate the high level of sensitivity and specificity in discovering simulated somatic mutations, for various combinations of (a) genomic coverage and (b) emulated heterogeneity. We present a Java-based implementation of our methods named Seurat, which is made available for free academic use. We have demonstrated and reported on the discovery of different types of somatic change by applying Seurat to an experimentally-derived cancer dataset using our methods; and have discussed considerations and practices regarding the accurate detection of somatic events in cancer genomes. Seurat is available at https://sites.google.com/site/seuratsomatic.
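As a toy illustration of the matched tumor/normal comparison (this is not Seurat's Bayesian model; the read counts, the choice of Fisher's exact test, and the threshold are all assumptions made for the sketch):

```python
from scipy.stats import fisher_exact

def somatic_candidate(tumor_alt, tumor_ref, normal_alt, normal_ref, alpha=1e-3):
    """Flag a site as a somatic candidate when the alternate-allele fraction
    is significantly higher in the tumor than in the matched normal. A real
    caller must also model stromal contamination, sequencing error, and
    germline variation, the very confounders the abstract lists."""
    table = [[tumor_alt, tumor_ref], [normal_alt, normal_ref]]
    _, p_value = fisher_exact(table, alternative="greater")
    return p_value < alpha, p_value

# e.g. 18 of 60 tumor reads support the variant vs. 0 of 55 normal reads
print(somatic_candidate(18, 42, 0, 55))
```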
Tao, Rongkun; Shi, Mei; Zou, Yejun; Cheng, Di; Wang, Qiaohui; Liu, Renmei; Wang, Aoxue; Zhu, Jiahuan; Deng, Lei; Hu, Hanyang; Chen, Xianjun; Du, Jiulin; Zhu, Weiping; Zhao, Yuzheng; Yang, Yi
2018-06-01
Engineered fluorescent indicators for visualizing mercury ion (Hg²⁺) are powerful tools to illustrate the intracellular distribution and serious toxicity of the ion. However, the sensitive and specific detection of Hg²⁺ in living cells and in vivo is challenging. This paper reported the development of fluorescent indicators for Hg²⁺ in green or red color by inserting a circularly permuted fluorescent protein into a highly mercury-specific repressor. These sensors provided a rapid, sensitive, specific, and real-time read-out of Hg²⁺ dynamics in solutions, bacteria, subcellular organelles of mammalian cells, and zebrafish, thereby providing a useful new method for Hg²⁺ detection and bioimaging. In conjunction with the hydrogen peroxide sensor HyPer, we found mercury uptake would trigger subcellular oxidative events at the single-cell level, and provided visual evidence of the causality of mercury and oxidative damage. These sensors would paint the landscape of mercury toxicity to cell functions. Copyright © 2018 Elsevier Inc. All rights reserved.
Microbial source tracking and transfer hydrodynamics in rural catchments.
NASA Astrophysics Data System (ADS)
Murphy, Sinead; Bhreathnach, Niamh; O'Flaherty, Vincent; Jordan, Philip; Wuertz, Stefan
2013-04-01
In Ireland, bacterial pathogens from continual point-source pollution and intermittent pollution from diffuse sources can impact both drinking water supplies and recreational waters. This poses a serious public health threat. Observing and establishing the source of faecal pollution is imperative for the protection of water quality and human health. Traditional culture methods to detect such pollution via faecal indicator bacteria have been widely utilised but do not decipher the source of pollution. To address this, microbial source tracking, an important emerging molecular tool, is applied to detect host-specific markers in faecally contaminated waters. The aim of this study is to target ruminant- and human-specific faecal Bacteroidales and Bacteroides 16S rRNA genes within rural river catchments in Ireland and investigate hydrological transfer dependencies. During storm events and non-storm periods, 1 L untreated water samples, taken every 2 hours over a 48-hour period at the spring (Cregduff) or outlet (Dunleer), and large (5-20 L) untreated water samples were collected from two catchment sites. Cregduff is a spring emergence under a grassland karst landscape in Co. Mayo (west coast of Ireland) and Dunleer is a mixed land-use catchment over till soils in Co. Louth (east coast). From a risk assessment point of view, the catchments are very different. Samples were filtered through 0.2 µm nitrocellulose filters to concentrate bacterial cells, which then underwent chemical extraction of total nucleic acids. Animal and human stool samples were also collected from the catchments to determine assay sensitivity and specificity following nucleic acid extraction. Aquifer response to seasonal events was assessed by monitoring coliform and E. coli occurrence using the IDEXX Colisure® Quanti-Tray®/2000 system in conjunction with chemical and hydrological parameters. Autoanalysers deployed at each catchment monitor multiple water parameters every 10 min, such as phosphorus, nitrogen (nitrate), turbidity, conductivity and flow rate. InStat V 3.06 was used to determine correlations between chemical and microbial parameters (P < 0.05 considered significant). There was a positive correlation between E. coli and phosphorus in Cregduff during rain events (p=0.040) and a significant correlation during non-rain periods (p<0.001). There was a positive correlation between E. coli and turbidity in Dunleer during rain events (p=0.0008) and in Cregduff during non-rain periods (p=0.0241). The water samples from Dunleer have a higher concentration of phosphorus than those from Cregduff. Host-specific primers BacCow-UCD, BacHum-UCD, BacUni-UCD and BoBac were then assayed against both faecal and water extracts and quantified using qPCR. BacUni-UCD, BacCow-UCD and BoBac detected faecal contamination in three of the four sample sites in Dunleer, and BacHum-UCD detected faecal contamination in one of the sites. The concentrations in the BacUni-UCD qPCR assay were higher in the water samples taken from the Dunleer outlet than in those taken from the Cregduff spring. BacCow-UCD and BacHum-UCD qPCR detected low and very low concentrations, respectively, in water from the Dunleer outlet. The concentrations can be seen changing over the hydrograph event. None of the host-specific assays detected pollution in Cregduff. From these results, it can be seen that Dunleer is more subject to contamination than Cregduff.
Detection and Classification of Motor Vehicle Noise in a Forested Landscape
NASA Astrophysics Data System (ADS)
Brown, Casey L.; Reed, Sarah E.; Dietz, Matthew S.; Fristrup, Kurt M.
2013-11-01
Noise emanating from human activity has become a common addition to natural soundscapes and has the potential to harm wildlife and erode human enjoyment of nature. In particular, motor vehicles traveling along roads and trails produce high levels of both chronic and intermittent noise, eliciting varied responses from a wide range of animal species. Anthropogenic noise is especially conspicuous in natural areas where ambient background sound levels are low. In this article, we present an acoustic method to detect and analyze motor vehicle noise. Our approach uses inexpensive consumer products to record sound, sound analysis software to automatically detect sound events within continuous recordings and measure their acoustic properties, and statistical classification methods to categorize sound events. We describe an application of this approach to detect motor vehicle noise on paved, gravel, and natural-surface roads, and off-road vehicle trails in 36 sites distributed throughout a national forest in the Sierra Nevada, CA, USA. These low-cost, unobtrusive methods can be used by scientists and managers to detect anthropogenic noise events for many potential applications, including ecological research, transportation and recreation planning, and natural resource management.
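A minimal version of the detection stage, flagging frames whose RMS level rises well above the background, can be sketched as follows (the frame length and threshold factor are assumed values; the authors' pipeline used dedicated sound-analysis software and statistical classifiers):

```python
import numpy as np

def detect_sound_events(samples, rate, frame_s=0.5, k=3.0):
    """Flag frames whose RMS level exceeds k times the median background
    level; returns (start, end) times in seconds for each flagged frame."""
    frame = int(frame_s * rate)
    n = len(samples) // frame
    rms = np.sqrt(np.mean(samples[: n * frame].reshape(n, frame) ** 2, axis=1))
    background = np.median(rms)
    hits = np.flatnonzero(rms > k * background)
    return [(i * frame_s, (i + 1) * frame_s) for i in hits]
```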
Method of controlling cyclic variation in engine combustion
Davis, Jr., Leighton Ira; Daw, Charles Stuart; Feldkamp, Lee Albert; Hoard, John William; Yuan, Fumin; Connolly, Francis Thomas
1999-01-01
Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output, such as torsional acceleration, in a cylinder (i) at a combustion event (k); using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1); modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i); and calculating a control output, such as fuel pulse width or spark timing, necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1), based on anti-correlation with the detected acceleration and spill-over effects from fueling.
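The anti-correlation idea can be caricatured in a few lines: if a cylinder's acceleration at event k falls below its running average, the next fuel pulse is lengthened, and vice versa. The gain and the running-average target are our assumptions; this is not the patented algorithm, which also accounts for average phase and fueling spill-over:

```python
def next_cycle_fuel(accel_k, accel_avg, fuel_k, gain=0.05):
    """Toy next-cycle correction for cylinder i: the fuel pulse width for
    combustion event k+1 is adjusted anti-correlated with the deviation of
    the measured acceleration at event k from its running average."""
    error = accel_k - accel_avg
    return fuel_k - gain * error
```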
Single-Step qPCR and dPCR Detection of Diverse CRISPR-Cas9 Gene Editing Events In Vivo.
Falabella, Micol; Sun, Linqing; Barr, Justin; Pena, Andressa Z; Kershaw, Erin E; Gingras, Sebastien; Goncharova, Elena A; Kaufman, Brett A
2017-10-05
Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-CRISPR-associated protein 9 (Cas9)-based technology is currently the most flexible means to create targeted mutations by recombination or indel mutations by nonhomologous end joining. During mouse transgenesis, recombinant and indel alleles are often pursued simultaneously. Multiple alleles can be formed in each animal to create significant genetic complexity that complicates the CRISPR-Cas9 approach and analysis. Currently, there are no rapid methods to measure the extent of on-site editing with broad mutation sensitivity. In this study, we demonstrate the allelic diversity arising from targeted CRISPR editing in founder mice. Using this DNA sample collection, we validated specific quantitative and digital PCR methods (qPCR and dPCR, respectively) for measuring the frequency of on-target editing in founder mice. We found that locked nucleic acid (LNA) probes combined with an internal reference probe (Drop-Off Assay) provide accurate measurements of editing rates. The Drop-Off LNA Assay also detected on-target CRISPR-Cas9 gene editing in blastocysts with a sensitivity comparable to PCR-clone sequencing. Lastly, we demonstrate that the allele-specific LNA probes used in qPCR competitor assays can accurately detect recombinant mutations in founder mice. In summary, we show that LNA-based qPCR and dPCR assays provide a rapid method for quantifying the extent of on-target genome editing in vivo, testing RNA guides, and detecting recombinant mutations. Copyright © 2017 Falabella et al.
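For the digital PCR read-out, the drop-off logic reduces to comparing partition counts for the two probes: the cut-site (drop-off) probe fails on edited alleles while the internal reference probe counts all alleles. A sketch with invented counts (the probe roles follow the abstract's description; real assays also correct for chance co-localization of targets in partitions):

```python
def editing_fraction_dpcr(dropoff_positive, reference_positive):
    """Edited fraction = relative loss of drop-off probe signal against the
    internal reference probe (both given as positive-partition counts)."""
    return 1.0 - dropoff_positive / reference_positive

# e.g. 1450 drop-off-positive vs. 5200 reference-positive partitions
print(f"{editing_fraction_dpcr(1450, 5200):.1%} edited")  # -> 72.1% edited
```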
Automatic Detection and Classification of Audio Events for Road Surveillance Applications.
Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine
2018-06-06
This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate compared with methods that use individual temporal and spectral features.
Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander
2017-01-01
Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
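The PRx benchmark used in this comparison is conventionally computed as a moving correlation between slow waves of intracranial pressure (ICP) and arterial blood pressure; a sketch with typical, assumed settings (10-second averages, a 30-sample window):

```python
import numpy as np

def prx(icp, abp, window=30):
    """Pressure reactivity index: moving Pearson correlation between
    consecutive 10-second averages of ICP and arterial blood pressure.
    Persistently positive PRx is commonly read as impaired autoregulation."""
    out = np.full(len(icp), np.nan)
    for i in range(window, len(icp) + 1):
        out[i - 1] = np.corrcoef(icp[i - window:i], abp[i - window:i])[0, 1]
    return out
```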
Determining dark matter properties with a XENONnT/LZ signal and LHC Run 3 monojet searches
NASA Astrophysics Data System (ADS)
Baum, Sebastian; Catena, Riccardo; Conrad, Jan; Freese, Katherine; Krauss, Martin B.
2018-04-01
We develop a method to forecast the outcome of the LHC Run 3 based on the hypothetical detection of O(100) signal events at XENONnT. Our method relies on a systematic classification of renormalizable single-mediator models for dark matter-quark interactions and is valid for dark matter candidates of spin less than or equal to one. Applying our method to simulated data, we find that at the end of the LHC Run 3 only two mutually exclusive scenarios would be compatible with the detection of O(100) signal events at XENONnT. In the first scenario, the energy distribution of the signal events is featureless, as for canonical spin-independent interactions. In this case, if a monojet signal is detected at the LHC, dark matter must have spin 1/2 and interact with nucleons through a unique velocity-dependent operator. If a monojet signal is not detected, dark matter interacts with nucleons through canonical spin-independent interactions. In a second scenario, the spectral distribution of the signal events exhibits a bump at nonzero recoil energies. In this second case, a monojet signal can be detected at the LHC Run 3; dark matter must have spin 1/2 and interact with nucleons through a unique momentum-dependent operator. We therefore conclude that the observation of O(100) signal events at XENONnT combined with the detection, or the lack of detection, of a monojet signal at the LHC Run 3 would significantly narrow the range of possible dark matter-nucleon interactions. As we argued above, it can also provide key information on the dark matter particle spin.
Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes
Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun
2014-01-01
We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the commonly used low-level feature descriptors in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain the information of a single event. Second, the LNND descriptor is a compact representation and its dimensionality is typically much lower than that of low-level feature descriptors. Therefore, using the LNND descriptor in an anomaly detection method with offline training not only saves computation time and storage, but also avoids the negative effects of high-dimensional feature descriptors. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worth noting that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, yet achieves comparable or even better performance. PMID:25105164
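The heart of such a score is a k-nearest-neighbor distance in feature space; a minimal sketch (k, the Euclidean metric, and the feature space are assumptions rather than the paper's exact construction):

```python
import numpy as np

def lnnd_score(query, normal_features, k=5):
    """Anomaly score of an observed event descriptor: mean distance to its
    k nearest neighbors among descriptors of normal training events. Large
    scores mark candidate anomalies."""
    d = np.linalg.norm(normal_features - query, axis=1)
    return float(np.sort(d)[:k].mean())
```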
Detecting and Locating Seismic Events Without Phase Picks or Velocity Models
NASA Astrophysics Data System (ADS)
Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.
2015-12-01
The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
Applying complex networks to evaluate precipitation patterns over South America
NASA Astrophysics Data System (ADS)
Ciemer, Catrin; Boers, Niklas; Barbosa, Henrique; Kurths, Jürgen; Rammig, Anja
2016-04-01
The climate of South America exhibits pronounced differences between the wet and the dry season, which are accompanied by specific synoptic events such as changes in the location of the South American Low Level Jet (SALLJ) and the establishment of the South American Convergence Zone (SACZ). The onset of these events can be related to the presence of typical large-scale precipitation patterns over South America, as previous studies have shown [1,2]. The application of complex network methods to precipitation data has recently received increased scientific attention for the special case of extreme events, as such methods make it possible to analyze the spatiotemporal correlation structure as well as possible teleconnections of these events [3,4]. In these approaches the correlation between precipitation datasets is calculated by means of Event Synchronization, which restricts their applicability to extreme precipitation events. In this work, we propose a method that can consider not only extreme precipitation but complete time series. A direct application of standard similarity measures to precipitation time series is impossible because of their intricate statistical properties, such as the large number of zeros. Therefore, we introduce and evaluate a suitable modification of Pearson's correlation coefficient to construct spatial correlation networks of precipitation. By analyzing the characteristics of spatial correlation networks constructed on the basis of this new measure, we are able to determine coherent areas of similar precipitation patterns, spot teleconnections between correlated areas, and detect central regions of precipitation correlation. By analyzing the change of the network over the year [5], we are also able to determine local and global changes in precipitation correlation patterns. Additionally, global network characteristics such as network connectivity indicate the beginning and end of the wet and dry seasons. In order to identify large-scale synoptic events such as the SACZ and SALLJ onset, detecting changes of correlation over time between certain regions is of significant relevance. [1] Nieto-Ferreira et al., Quarterly Journal of the Royal Meteorological Society (2011) [2] Vera et al., Bulletin of the American Meteorological Society (2006) [3] Quiroga et al., Physical Review E (2002) [4] Boers et al., Nature Communications (2014) [5] Radebach et al., Physical Review E (2013)
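Once a suitable similarity measure is in hand, the network construction itself is generic: grid cells become nodes and strongly correlated pairs become edges. The sketch below applies a plain threshold to a precomputed correlation matrix; the zero-aware modification of Pearson's coefficient proposed in the abstract is not reproduced here:

```python
import numpy as np

def correlation_network(corr, threshold=0.5):
    """Build an undirected network from an (n_cells x n_cells) correlation
    matrix: an edge links two grid cells whose correlation magnitude exceeds
    the threshold. Node degree then highlights central regions."""
    adj = (np.abs(corr) >= threshold) & ~np.eye(corr.shape[0], dtype=bool)
    degree = adj.sum(axis=1)
    return adj, degree
```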
Shang, Ying; Xu, Wentao; Wang, Yong; Xu, Yuancong; Huang, Kunlun
2017-12-15
This study described a novel multiplex qualitative detection method using pyrosequencing. Based on the principle of universal-primer multiplex PCR, only one sequencing primer was employed to detect the multiple targets. Samples containing three genetically modified (GM) crops in different proportions were used to validate the method. The dNTP dispensing order was designed based on the product sequences. Only 12 rounds (ATCTGATCGACT) of dNTP addition, and as few as three rounds (CAT) under ideal conditions, were required to detect the GM events qualitatively, with a sensitivity as low as 1% of a mixture. Moreover, for a mixture, calculating the signal values allowed the proportion of each GM event to be estimated. Based on these results, we concluded that our novel method not only achieved qualitative detection but also allowed semi-quantitative estimation of individual events. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Stadler, Philipp; Farnleitner, Andreas H.; Sommer, Regina; Kumpan, Monika; Zessner, Matthias
2014-05-01
For the near-real-time, on-site detection of microbiological fecal pollution of water, measurement of beta-D-glucuronidase (GLUC) enzymatic activity has been suggested as a surrogate parameter and has already been operated successfully for water quality monitoring of groundwater resources (Ryzinska-Paier et al. 2014). With measurement intervals as short as three hours, this method has high potential as a water quality monitoring tool. While cultivation-based standard determination takes more than one working day (Cabral 2010), the potential advantage of detecting GLUC activity is its high temporal measuring resolution. Yet there is still a large gap in knowledge on the fecal indication capacity of GLUC (specificity, sensitivity, persistence, etc.) in relation to potential pollution sources and catchment conditions (Cabral 2010, Ryzinska-Paier et al. 2014). Furthermore, surface waters are a major technical challenge for automated detection devices because of the high sediment load during event conditions. This presentation shows results gained from two years of monitoring in an experimental catchment (HOAL) dominated by agricultural land use. Two enzymatic measurement devices are operated in parallel at the catchment outlet to test the reproducibility and precision of the method. Data from continuous GLUC monitoring under both base-flow and event conditions are compared with reference samples analyzed by standardized laboratory methods for fecal pollution detection (e.g. ISO 16649-1, Colilert-18). It is shown that rapid enzymatic on-site GLUC determination can be operated successfully, from a technical point of view, for surface water quality monitoring under the observed catchment conditions. The comparison of enzyme activity with standard microbiological analytics reveals distinct differences in the dynamics of the signals during event conditions. Cabral J. P. S. (2010) "Water Microbiology. Bacterial Pathogens and Water" International Journal of Environmental Research and Public Health 7 (10): 3657-3703. Ryzinska-Paier, G., T. Lendenfeld, K. Correa, P. Stadler, A. P. Blaschke, R. L. Mach, H. Stadler, A. K. T. Kirschner and A. H. Farnleitner (2014) A sensitive and robust method for automated on-line monitoring of enzymatic activities in water and water resources. Water Sci. Technol., in press.
Heller, Daniel A.; Pratt, George W.; Zhang, Jingqing; Nair, Nitish; Hansborough, Adam J.; Boghossian, Ardemis A.; Reuel, Nigel F.; Barone, Paul W.; Strano, Michael S.
2011-01-01
A class of peptides from the bombolitin family, not previously identified for nitroaromatic recognition, allows near-infrared fluorescent single-walled carbon nanotubes to transduce specific changes in their conformation. In response to the binding of specific nitroaromatic species, such peptide–nanotube complexes form a virtual “chaperone sensor,” which reports modulation of the peptide secondary structure via changes in single-walled carbon nanotube near-infrared photoluminescence. A split-channel microscope constructed to image quantized spectral wavelength shifts in real time, in response to nitroaromatic adsorption, results in the first single-nanotube imaging of solvatochromic events. The described indirect detection mechanism, as well as an additional exciton quenching-based optical nitroaromatic detection method, illustrates that functionalization of the carbon nanotube surface can result in completely unique sites for recognition, resolvable at the single-molecule level. PMID:21555544
Rockfall induced seismic signals: case study in Montserrat, Catalonia
NASA Astrophysics Data System (ADS)
Vilajosana, I.; Suriñach, E.; Abellán, A.; Khazaradze, G.; Garcia, D.; Llosa, J.
2008-08-01
After a rockfall event, a typical post-event survey includes qualitative volume estimation, trajectory mapping and determination of departure zones. However, quantitative measurements are not usually made. Additional relevant quantitative information could be useful in determining the spatial occurrence of rockfall events and help us in quantifying their size. Seismic measurements could be suitable for detection purposes since they are non-invasive and relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event at the Montserrat massif (near Barcelona, Spain), carried out in order to purge a slope. Two 3-component seismic stations were deployed in the area about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated to be 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted on the ground at different locations. The blocks fell onto a terrace, 120 m below the release zone. The impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted on the road 60 m below. Time-domain and time-frequency analysis, particle motion analysis of the seismic records, and seismic energy estimation were performed. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that the procedure to locate the rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization, and size determination are confirmed.
Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System
NASA Astrophysics Data System (ADS)
Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.
2012-12-01
The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison materials from satellites and small bodies into space. The most notable examples of such events are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect those events in visible images. This technique will allow future outer planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. The interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface due to the higher magnitude gradient of the plume or jet when compared with the small, randomly oriented gradients of the textured surface. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that our method was able to distinguish from the background surface and surface features of a body. Dozens of distinct features are identifiable under a variety of viewing conditions and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of images, the ability to differentiate an event from artifacts, and the variety of event appearances in user-classified images. Other important factors include the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets which feed into the overall plume at the south pole of Enceladus). A notable strength of this method is the ability to detect events that do not extend beyond the limb of a planetary body or are adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
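The described pipeline (gradient-based interest points classified by majority vote of their nearest neighbors in user-labeled images) can be sketched with off-the-shelf tools. Below, OpenCV's SIFT stands in for the interest-point detector, and labeled_desc/labels are hypothetical arrays of descriptors and 0/1 labels from manually classified images; this is our illustration, not the authors' code:

```python
import cv2
import numpy as np

def detect_transient_candidates(image_path, labeled_desc, labels, k=5):
    """Extract SIFT interest points from an image and classify each one by
    majority vote of its k nearest labeled descriptors (label 1 = event).
    Returns (x, y) positions of interest points voted as events."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    keypoints, desc = cv2.SIFT_create().detectAndCompute(img, None)
    hits = []
    for kp, d in zip(keypoints, desc):
        dist = np.linalg.norm(labeled_desc - d, axis=1)
        if labels[np.argsort(dist)[:k]].mean() > 0.5:
            hits.append(kp.pt)
    return hits
```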
Obstructive Sleep Apnea Screening Using a Piezo-Electric Sensor.
Erdenebayar, Urtnasan; Park, Jong Uk; Jeong, Pilsoo; Lee, Kyoung Joung
2017-06-01
In this study, we propose a novel method for obstructive sleep apnea (OSA) detection using a piezo-electric sensor. OSA is a relatively common sleep disorder. However, more than 80% of OSA patients remain undiagnosed. We investigated the feasibility of OSA assessment using a single-channel physiological signal to simplify the OSA screening. We detected both snoring and heartbeat information by using a piezo-electric sensor, and snoring index (SI) and features based on pulse rate variability (PRV) analysis were extracted from the filtered piezo-electric sensor signal. A support vector machine (SVM) was used as a classifier to detect OSA events. The performance of the proposed method was evaluated on 45 patients from mild, moderate, and severe OSA groups. The method achieved a mean sensitivity, specificity, and accuracy of 72.5%, 74.2%, and 71.5%; 85.8%, 80.5%, and 80.0%; and 70.3%, 77.1%, and 71.9% for the mild, moderate, and severe groups, respectively. Finally, these results not only show the feasibility of OSA detection using a piezo-electric sensor, but also illustrate its usefulness for monitoring sleep and diagnosing OSA. © 2017 The Korean Academy of Medical Sciences.
NASA Astrophysics Data System (ADS)
Hopp, C. J.; Savage, M. K.; Townend, J.; Sherburn, S.
2016-12-01
Monitoring patterns in local microseismicity gives clues to the existence and location of subsurface structures. In the context of a geothermal reservoir, subsurface structures often indicate areas of high permeability and are vitally important in understanding fluid flow within the geothermal resource. Detecting and locating microseismic events within an area of power generation, however, is often challenging due to high levels of noise associated with nearby power plant infrastructure. In this situation, matched filter detection improves drastically upon standard earthquake detection techniques, specifically when events are likely induced by fluid injection and are therefore near-repeating. Using an earthquake catalog of 637 events which occurred between 1 January and 18 November 2015 as our initial dataset, we implemented a matched filtering routine for the Mighty River Power (MRP) geothermal fields at Rotokawa and Ngatamariki, central North Island, New Zealand. We detected nearly 21,000 additional events across both geothermal fields, a roughly 30-fold increase from the original catalog. On average, each of the 637 template events detected 45 additional events throughout the study period, with a maximum number of additional detections for a single template of 359. Cumulative detection rates for all template events, in general, do not mimic large scale changes in injection rates within the fields, however we do see indications of an increase in detection rate associated with power plant shutdown at Ngatamariki. Locations of detected events follow established patterns of historic seismicity at both Ngatamariki and Rotokawa. One large cluster of events persists in the southeastern portion of Rotokawa and is likely bounded to the northwest by a known fault dividing the injection and production sections of the field. Two distinct clusters of microseismicity occur in the North and South of Ngatamariki, the latter appearing to coincide with a structure dividing the production zone and the southern injection zone.
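At its core, matched-filter detection slides a template event along the continuous record and declares a detection where the normalized correlation rises far above the noise level. A single-channel sketch follows (production systems like the one described stack correlation sums over many stations and channels, e.g. via packages such as EQcorrscan; the MAD threshold of 8 is a typical, assumed choice):

```python
import numpy as np

def matched_filter(continuous, template, threshold=8.0):
    """Normalized cross-correlation of a template against continuous data;
    detections are sample offsets where |cc| exceeds threshold times the
    median absolute deviation (MAD) of the correlation trace."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(continuous) - nt + 1)
    for i in range(len(cc)):
        w = continuous[i:i + nt]
        cc[i] = np.dot(t, (w - w.mean()) / (w.std() + 1e-12)) / nt
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(np.abs(cc) > threshold * mad)
```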
Real-time surveillance for abnormal events: the case of influenza outbreaks.
Rao, Yao; McCabe, Brendan
2016-06-15
This paper introduces a method of surveillance using deviations from probabilistic forecasts. Realised observations are compared with probabilistic forecasts, and the "deviation" metric is based on low-probability events. If an alert is declared, the algorithm continues to monitor until an all-clear is announced. Specifically, this article addresses the problem of syndromic surveillance for influenza (flu) with the intention of detecting outbreaks due to new strains of viruses over and above the normal seasonal pattern. The syndrome is hospital admissions for flu-like illness, and hence the data are low counts. In accordance with the count properties of the observations, an integer-valued autoregressive process is used to model flu occurrences. Monte Carlo evidence suggests the method works well in stylised but somewhat realistic situations. An application to real flu data indicates that the ideas may have promise. The model estimated on a short run of training data did not declare false alarms when used with new observations deemed in control, ex post. The model easily detected the 2009 H1N1 outbreak. Copyright © 2016 John Wiley & Sons, Ltd.
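For a Poisson INAR(1) count model, the one-step-ahead forecast is a convolution of binomial thinning with Poisson innovations, and an alarm can be raised when the realized count lands in a low-probability upper tail. A sketch with invented parameters (the paper's fitting procedure and exact alarm rule are not reproduced):

```python
import numpy as np
from scipy.stats import binom, poisson

def forecast_tail_prob(x_new, x_prev, alpha, lam):
    """P(X_t >= x_new | X_{t-1} = x_prev) under X_t = alpha o X_{t-1} +
    Poisson(lam), where 'o' is binomial thinning: sum over the number of
    thinned survivors s, then take the Poisson upper tail for the rest."""
    s = np.arange(x_prev + 1)
    thin = binom.pmf(s, x_prev, alpha)
    return float(sum(p * poisson.sf(x_new - si - 1, lam)
                     for si, p in zip(s, thin)))

# Alert if a count of 14 admissions is implausible given 3 in the last period
if forecast_tail_prob(x_new=14, x_prev=3, alpha=0.4, lam=2.0) < 0.01:
    print("alert: count inconsistent with the in-control model")
```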
DNA Clutch Probes for Circulating Tumor DNA Analysis.
Das, Jagotamoy; Ivanov, Ivaylo; Sargent, Edward H; Kelley, Shana O
2016-08-31
Progress toward the development of minimally invasive liquid biopsies of disease is being bolstered by breakthroughs in the analysis of circulating tumor DNA (ctDNA): DNA released from cancer cells into the bloodstream. However, robust, sensitive, and specific methods of detecting this emerging analyte are lacking. ctDNA analysis has unique challenges, since it is imperative to distinguish circulating DNA from normal cells vs mutation-bearing sequences originating from tumors. Here we report the electrochemical detection of mutated ctDNA in samples collected from cancer patients. By developing a strategy relying on the use of DNA clutch probes (DCPs) that render specific sequences of ctDNA accessible, we were able to readout the presence of mutated ctDNA. DCPs prevent reassociation of denatured DNA strands: they make one of the two strands of a dsDNA accessible for hybridization to a probe, and they also deactivate other closely related sequences in solution. DCPs ensure thereby that only mutated sequences associate with chip-based sensors detecting hybridization events. The assay exhibits excellent sensitivity and specificity in the detection of mutated ctDNA: it detects 1 fg/μL of a target mutation in the presence of 100 pg/μL of wild-type DNA, corresponding to detecting mutations at a level of 0.01% relative to wild type. This approach allows accurate analysis of samples collected from lung cancer and melanoma patients. This work represents the first detection of ctDNA without enzymatic amplification.
Comparison of Event Detection Methods for Centralized Sensor Networks
NASA Technical Reports Server (NTRS)
Sauvageon, Julien; Agogiono, Alice M.; Farhang, Ali; Tumer, Irem Y.
2006-01-01
The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a major concern. Smart sensor networks are one of the promising technologies attracting considerable attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to check whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared with simpler methods.
Predictive modeling of structured electronic health records for adverse drug event detection.
Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik
2015-01-01
The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and combined. We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two.
de Bruijn cycles for neural decoding.
Aguirre, Geoffrey Karl; Mattar, Marcelo Gomes; Magis-Weinberg, Lucía
2011-06-01
Stimulus counterbalance is critical for studies of neural habituation, bias, anticipation, and (more generally) the effect of stimulus history and context. We introduce de Bruijn cycles, a class of combinatorial objects, as the ideal source of pseudo-random stimulus sequences with arbitrary levels of counterbalance. Neuro-vascular imaging studies (such as BOLD fMRI) have an additional requirement imposed by the filtering and noise properties of the method: only some temporal frequencies of neural modulation are detectable. Extant methods of generating counterbalanced stimulus sequences yield neural modulations that are weakly (or not at all) detected by BOLD fMRI. We solve this limitation using a novel "path-guided" approach for the generation of de Bruijn cycles. The algorithm encodes a hypothesized neural modulation of specific temporal frequency within the seemingly random order of events. By positioning the modulation between the signal and noise bands of the neuro-vascular imaging method, the resulting sequence markedly improves detection power. These sequences may be used to study stimulus context and history effects in a manner not previously possible. Copyright © 2011 Elsevier Inc. All rights reserved.
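A plain B(k, n) cycle can be generated with the classic Lyndon-word (FKM) construction, sketched below; the paper's path-guided variant additionally chooses among the many valid cycles so that the event ordering embeds a detectable temporal-frequency modulation:

```python
def de_bruijn(k, n):
    """Return a de Bruijn cycle B(k, n): a cyclic sequence over k symbols in
    which every length-n subsequence occurs exactly once, giving exact n-th
    order counterbalance of stimulus history."""
    a, sequence = [0] * (k * n), []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

print(de_bruijn(2, 3))  # -> [0, 0, 0, 1, 0, 1, 1, 1]
```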
Confidential Clinician-reported Surveillance of Adverse Events Among Medical Inpatients
Weingart, Saul N; Ship, Amy N; Aronson, Mark D
2000-01-01
BACKGROUND Although iatrogenic injury poses a significant risk to hospitalized patients, detection of adverse events (AEs) is costly and difficult. METHODS The authors developed a confidential reporting method for detecting AEs on a medicine unit of a teaching hospital. Adverse events were defined as patient injuries. Potential adverse events (PAEs) represented errors that could have resulted in harm but did not. Investigators interviewed house officers during morning rounds and by e-mail, asking them to identify obstacles to high-quality care and iatrogenic injuries. They compared house officer reports with hospital incident reports and patients' medical records. A multivariate regression model identified correlates of reporting. RESULTS One hundred ten events occurred, affecting 84 patients. Queries by e-mail (incidence rate ratio [IRR]=0.16; 95% confidence interval [95% CI], 0.05 to 0.49) and on days when house officers rotated to a new service (IRR=0.12; 95% CI, 0.02 to 0.91) resulted in fewer reports. The most commonly reported process-of-care problems were inadequate evaluation of the patient (16.4%), failure to monitor or follow up (12.7%), and failure of the laboratory to perform a test (12.7%). Respondents identified 29 (26.4%) AEs, 52 (47.3%) PAEs, and 29 (26.4%) other house officer-identified quality problems. An AE occurred in 2.6% of admissions. The hospital incident reporting system detected only one house officer-reported event. Chart review corroborated 72.9% of events. CONCLUSIONS House officers detect many AEs among inpatients. Confidential peer interviews of front-line providers are a promising method for identifying medical errors and substandard quality. PMID:10940133
Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng
2015-01-01
Switching between different alternative polyadenylation (APA) sites plays an important role in the fine-tuning of gene expression. New technologies for 3’-end-enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all switching events observed between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
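The contrast between the two tests is easy to reproduce on a 2 × m table of read counts (rows = the two samples, columns = poly(A) sites ordered proximal to distal). The sketch below pairs the chi-square independence test with a Mantel-Haenszel-type linear-trend statistic, (N-1)r²; the counts are invented, and the symmetric shift in the example is exactly the kind of complex pattern that passes the independence test but not the trend test:

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency

def apa_switch_tests(counts, scores=None):
    """Return (p_independence, p_trend) for a 2 x m table of APA read counts.
    The trend statistic is (N-1)*r^2, where r is the correlation between the
    sample indicator (0/1) and the ordered site score, computed from counts."""
    c = np.asarray(counts, dtype=float)
    _, p_independence, _, _ = chi2_contingency(c)

    s = (np.arange(c.shape[1], dtype=float) if scores is None
         else np.asarray(scores, dtype=float))
    N, col = c.sum(), c.sum(axis=0)
    ey, es = c[1].sum() / N, (col * s).sum() / N       # means of y and score
    cov = (c[1] * s).sum() / N - ey * es               # cov(y, s), y in {0, 1}
    r2 = cov ** 2 / (ey * (1 - ey) * ((col * s ** 2).sum() / N - es ** 2))
    return p_independence, chi2.sf((N - 1) * r2, df=1)

# Complex switch: usage moves from the middle site to both flanking sites
print(apa_switch_tests([[50, 200, 50], [120, 60, 120]]))  # tiny p_ind, p_trend ~ 1
```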
Commonality of drug-associated adverse events detected by 4 commonly used data mining algorithms.
Sakaeda, Toshiyuki; Kadoyama, Kaori; Minami, Keiko; Okuno, Yasushi
2014-01-01
Data mining algorithms have been developed for the quantitative detection of drug-associated adverse events (signals) from a large database on spontaneously reported adverse events. In the present study, the commonality of signals detected by 4 commonly used data mining algorithms was examined. A total of 2,231,029 reports were retrieved from the public release of the US Food and Drug Administration Adverse Event Reporting System database between 2004 and 2009. The deletion of duplicated submissions and revision of arbitrary drug names resulted in a reduction in the number of reports to 1,644,220. Associations with adverse events were analyzed for 16 unrelated drugs, using the proportional reporting ratio (PRR), reporting odds ratio (ROR), information component (IC), and empirical Bayes geometric mean (EBGM). All EBGM-based signals were included in the PRR-based signals as well as IC- or ROR-based ones, and PRR- and IC-based signals were included in ROR-based ones. The PRR scores of PRR-based signals were significantly larger for 15 of 16 drugs when adverse events were also detected as signals by the EBGM method, as were the IC scores of IC-based signals for all drugs; however, no such effect was observed in the ROR scores of ROR-based signals. The EBGM method was the most conservative among the 4 methods examined, which suggested its better suitability for pharmacoepidemiological studies. Further examinations should be performed on the reproducibility of clinical observations, especially for EBGM-based signals.
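The two frequentist measures are simple functions of the standard 2 × 2 disproportionality table; a sketch with invented counts (IC and EBGM are omitted because they require Bayesian shrinkage models, and the PRR signal rule shown is one common convention, not necessarily this study's criterion):

```python
def disproportionality(a, b, c, d):
    """a = reports with the drug and the event; b = with the drug, other
    events; c = other drugs with the event; d = other drugs, other events."""
    prr = (a / (a + b)) / (c / (c + d))   # proportional reporting ratio
    ror = (a * d) / (b * c)               # reporting odds ratio
    return prr, ror

prr, ror = disproportionality(a=12, b=988, c=480, d=198520)
print(f"PRR = {prr:.2f}, ROR = {ror:.2f}, "
      f"signal = {prr >= 2 and 12 >= 3}")  # e.g. PRR >= 2 with >= 3 reports
```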
3D-nanostructured Au electrodes for the event-specific detection of MON810 transgenic maize.
Fátima Barroso, M; Freitas, Maria; Oliveira, M Beatriz P P; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; Delerue-Matos, Cristina
2015-03-01
In the present work, the development of a genosensor for the event-specific detection of MON810 transgenic maize is proposed. Taking advantage of nanostructuration, a cost-effective three-dimensional electrode was fabricated, and a ternary monolayer containing a dithiol, a monothiol and the thiolated capture probe was optimized to minimize nonspecific signals. A sandwich-format assay was selected as a way of precluding inefficient hybridization associated with stable secondary target structures. A comparison between the analytical performance of the Au nanostructured electrodes and commercially available screen-printed electrodes highlighted the superior performance of the nanostructured ones. Finally, the genosensor was effectively applied to detect the transgenic sequence in real samples, showing its potential for future quantitative analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation
NASA Technical Reports Server (NTRS)
Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.
2016-01-01
A major focus of the commercial aviation community is the discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events than rule-based methods, which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
Traumatic Brain Injury Detection Using Electrophysiological Methods
Rapp, Paul E.; Keyser, David O.; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B.; Zambon, Robert A.; Hairston, W. David; Hughes, John D.; Krystal, Andrew; Nichols, Andrew S.
2015-01-01
Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in qEEG evaluations of TBI must be interpreted with care. High specificities have been reported in carefully constructed clinical studies in which healthy controls were compared against a carefully selected TBI population. The published literature indicates, however, that similar abnormalities in qEEG measures are observed in other neuropsychiatric disorders. While it may be possible to distinguish a clinical patient from a healthy control participant with this technology, these measures are unlikely to discriminate between, for example, major depressive disorder, bipolar disorder, or TBI. The specificities observed in these clinical studies may well be lost in real world clinical practice. (5) The absence of specificity does not preclude clinical utility. The possibility of use as a longitudinal measure of treatment response remains. However, efficacy as a longitudinal clinical measure does require acceptable test–retest reliability. To date, very few test–retest reliability studies have been published with qEEG data obtained from TBI patients or from healthy controls. This is a particular concern because high variability is a known characteristic of the injured central nervous system. PMID:25698950
NASA Astrophysics Data System (ADS)
Sciotto, M.; Rowe, C. A.; Cannata, A.; Arrowsmith, S.; Privitera, E.; Gresta, S.
2011-12-01
The current eruption of Mount Etna, which began in January 2011, has produced numerous energetic episodes of lava fountaining, which have been recorded by the INGV seismic and acoustic sensors located on and around the volcano. The source of these events was the pit crater on the east flank of the Southeast Crater of Etna. Simultaneously, low levels of activity were noted in the Bocca Nuova as well, prior to its lava fountaining activity. We will present an analysis of seismic and acoustic signals related to the 2011 activity, wherein we apply the method of subspace detection to determine whether the source exhibits a temporal evolution within or between fountaining events, or otherwise produces repeating, classifiable events occurring through the continuous explosive degassing. We will examine not only the raw waveforms but also spectral variations in time, as well as time-varying statistical functions such as signal skewness and kurtosis. These results will be compared to straightforward cross-correlation analysis. In addition to classification performance, the subspace method has promise to outperform standard STA/LTA methods for real-time event detection in cases where similar events can be expected.
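As a rough illustration of the subspace-detection idea mentioned above (not the authors' implementation), the sketch below builds an orthonormal basis from aligned template waveforms and scans a continuous trace for windows whose energy is largely captured by that basis; `templates` and `data` are assumed inputs.

```python
# Subspace detector sketch: SVD basis from aligned, demeaned event waveforms;
# detection statistic = fraction of window energy captured by the subspace.
import numpy as np

def build_subspace(templates, dim=3):
    # templates: (n_events, n_samples), events aligned and demeaned
    _, _, Vt = np.linalg.svd(templates, full_matrices=False)
    return Vt[:dim]                       # orthonormal rows spanning the family

def subspace_statistic(data, basis):
    n = basis.shape[1]
    stats = np.zeros(len(data) - n + 1)
    for k in range(len(stats)):
        w = data[k:k + n] - data[k:k + n].mean()
        energy = np.dot(w, w)
        proj = basis @ w                  # coordinates in the event subspace
        stats[k] = np.dot(proj, proj) / energy if energy > 0 else 0.0
    return stats                          # values in [0, 1]; threshold to detect
```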
Pick- and waveform-based techniques for real-time detection of induced seismicity
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2018-05-01
The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production, mining, and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step; the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and compare its performance with two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches with both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project in the vicinity of the city of St. Gallen, Switzerland.
Hemozoin-generated vapor nanobubbles for transdermal reagent- and needle-free detection of malaria
Lukianova-Hleb, Ekaterina Y.; Campbell, Kelly M.; Constantinou, Pamela E.; Braam, Janet; Olson, John S.; Ware, Russell E.; Sullivan, David J.; Lapotko, Dmitri O.
2014-01-01
Successful diagnosis, screening, and elimination of malaria critically depend on rapid and sensitive detection of this dangerous infection, preferably transdermally and without sophisticated reagents or blood drawing. Such diagnostic methods are not currently available. Here we show that the high optical absorbance and nanosize of endogenous heme nanoparticles called “hemozoin,” a unique component of all blood-stage malaria parasites, generates a transient vapor nanobubble around hemozoin in response to a short and safe near-infrared picosecond laser pulse. The acoustic signals of these malaria-specific nanobubbles provided transdermal noninvasive and rapid detection of a malaria infection as low as 0.00034% in animals without using any reagents or drawing blood. These on-demand transient events have no analogs among current malaria markers and probes, can detect and screen malaria in seconds, and can be realized as a compact, easy-to-use, inexpensive, and safe field technology. PMID:24379385
Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul
2008-01-01
AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
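The relative-risk arithmetic underlying results like RR = 5.06 (95% CI: 1.59-16.12) follows the standard log-RR formula; the sketch below uses invented 2x2 counts purely for illustration, not data from the meta-analysis above.

```python
# Relative risk with a 95% CI on the log scale (standard Wald interval).
# The counts are hypothetical, not trial data.
import numpy as np
from scipy.stats import norm

def relative_risk(events_tx, n_tx, events_ctl, n_ctl):
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    se = np.sqrt(1/events_tx - 1/n_tx + 1/events_ctl - 1/n_ctl)  # SE of log(RR)
    z = norm.ppf(0.975)
    lo, hi = np.exp(np.log(rr) - z * se), np.exp(np.log(rr) + z * se)
    return rr, (lo, hi)

print(relative_risk(30, 200, 6, 200))   # -> RR = 5.0 with a wide interval
```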
van Luijtelaar, Gilles; Lüttjohann, Annika; Makarov, Vladimir V; Maksimenko, Vladimir A; Koronovskii, Alexei A; Hramov, Alexander E
2016-02-15
Genetic rat models for childhood absence epilepsy have become instrumental in developing theories on the origin of absence epilepsy, the evaluation of new and experimental treatments, as well as in developing new methods for automatic seizure detection, prediction, and/or interference with seizures. Various methods for automated off- and on-line analyses of ECoG in rodent models are reviewed, as well as data on how to interfere with spike-wave discharges by different types of invasive and non-invasive electrical, magnetic, and optical brain stimulation. A new method for seizure prediction is also proposed. Many selective and specific methods for off- and on-line spike-wave discharge detection seem excellent, with possibilities to overcome the issue of individual differences. Moreover, electrical deep brain stimulation is rather effective in interrupting ongoing spike-wave discharges with low stimulation intensity. A network-based method is proposed for absence seizure prediction with a high sensitivity but a low selectivity. Solutions that prevent false alarms, integrated in a closed-loop brain stimulation system, open the way for experimental seizure control. The presence of preictal precursor activity detected with state-of-the-art time-frequency and network analyses shows that spike-wave discharges are not caused by sudden and abrupt transitions but that there are detectable preceding dynamic events. Their changes in time-space-frequency characteristics might yield new options for seizure prediction and seizure control. Copyright © 2015 Elsevier B.V. All rights reserved.
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas, and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days or months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate, and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Our algorithm also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive seismic data set recorded at a geothermal site.
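A compact sketch of the conventional energy-ratio detector and the cross-trace comparison described above, assuming `traces` is an (n_traces, n_samples) array; the window lengths and the zero-lag similarity measure are illustrative simplifications.

```python
# STA/LTA energy ratio per trace, then zero-lag correlation of the ratio
# curves across traces to reward events seen coherently at many stations.
import numpy as np

def sta_lta(trace, nsta=50, nlta=500):
    e = trace.astype(float) ** 2
    c = np.concatenate(([0.0], np.cumsum(e)))
    out = np.zeros(len(e))
    for k in range(nlta, len(e)):
        sta = (c[k + 1] - c[k + 1 - nsta]) / nsta   # short-term average energy
        lta = (c[k + 1] - c[k + 1 - nlta]) / nlta   # long-term average energy
        out[k] = sta / max(lta, 1e-12)
    return out

def cross_trace_similarity(traces, **kw):
    ratios = np.array([sta_lta(t, **kw) for t in traces])
    ref = ratios[0] - ratios[0].mean()
    sims = [float(np.dot(ref, r - r.mean()) /
                  (np.linalg.norm(ref) * np.linalg.norm(r - r.mean()) + 1e-12))
            for r in ratios]
    return ratios.mean(axis=0), sims  # stacked ratio curve + per-trace similarity
```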
[Optimized application of nested PCR method for detection of malaria].
Yao-Guang, Z; Li, J; Zhen-Yu, W; Li, C
2017-04-28
Objective To optimize the application of the nested PCR method for the detection of malaria in routine practice, so as to improve the efficiency of malaria detection. Methods A PCR premix, inner primers for further amplification, and newly designed primers targeting the two Plasmodium ovale subspecies were employed to optimize the reaction system, reaction conditions, and P. ovale-specific primers on the basis of routine nested PCR. The specificity and sensitivity of the optimized method were then analyzed. Positive blood samples and malaria examination samples were tested by routine nested PCR and the optimized method in parallel, and the results were compared. Results The optimized method showed good specificity, and its sensitivity reached the pg to fg level. When the two methods were used to test the same positive malarial blood samples, the PCR products showed no significant difference, but the optimized method markedly reduced non-specific amplification and improved the detection rate of P. ovale subspecies as well as the overall specificity. Testing of 111 malarial blood samples showed that the sensitivity and specificity of routine nested PCR were 94.57% and 86.96%, respectively, whereas those of the optimized method were both 93.48%; the difference between the two methods was not statistically significant for sensitivity (P > 0.05) but was statistically significant for specificity (P < 0.05). Conclusion The optimized PCR improves specificity without reducing sensitivity relative to routine nested PCR, and it saves costs and increases the efficiency of malaria detection by requiring fewer experimental steps.
The Meteorology of Storms that Produce Narrow Bipolar Events
NASA Technical Reports Server (NTRS)
Lang, Timothy J.; McCaul, Eugene W.; Cummer, Steven A.
2013-01-01
Narrow Bipolar Events (NBEs) are compact intracloud discharges that produce the most powerful lightning-related radio frequency signals that have been observed. However, their luminosity is below the threshold for detectability from current and past spaceborne optical sensors. NBEs have been loosely associated with convective intensity, but their occurrence tends to be highly localized in time and space within a thunderstorm, and there remain many questions about whether and to what extent they are significantly related to meteorological processes within thunderstorms. Using the North Alabama Lightning Mapping Array (NALMA), the National Lightning Detection Network, and available Doppler and polarimetric radar data, case studies will be presented for storm events that produced large numbers of NBEs (10s-100s) during their lifetimes. NBEs are documented via a method that identifies high peak power (>40-50 dBW) initial VHF sources within a specific altitude band in the upper levels of thunderstorms. The production of NBEs, including spatial and temporal variability, will be compared to the radar-inferred kinematic and microphysical structure and evolution of thunderstorms, as well as their NALMA- and NLDN-inferred electrical characteristics. The results should provide new insights into the relationships between NBEs and thunderstorm processes.
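The screening rule described in the abstract (high peak-power initial VHF sources in an upper-level altitude band) reduces to a simple filter; in the sketch below the band limits are placeholders, since the abstract does not give exact altitudes.

```python
# NBE candidate screen: initial VHF sources with peak power > 40 dBW inside
# an assumed upper-level altitude band (band limits are illustrative).
import numpy as np

def select_nbe_candidates(power_dbw, alt_km, is_initial_source,
                          min_power=40.0, band_km=(8.0, 16.0)):
    mask = (is_initial_source
            & (power_dbw > min_power)
            & (alt_km >= band_km[0]) & (alt_km <= band_km[1]))
    return np.where(mask)[0]   # indices of candidate NBE flashes
```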
Progresses with Net-VISA on Global Infrasound Association
NASA Astrophysics Data System (ADS)
Mialle, Pierrick; Arora, Nimar
2017-04-01
Global Infrasound Association algorithms are an important area of active development at the International Data Centre (IDC). These algorithms play an important part in the automatic processing system for verification technologies. A key focus at the IDC is to enhance association and signal characterization methods by incorporating the identification of signals of interest and the optimization of the network detection threshold. The overall objective is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the Reviewed Event Bulletins (REB), and hence reduce IDC analyst workload. Despite the good accuracy of the IDC categorization, a number of signal detections due to clutter sources such as microbaroms or surf are built into events. In this work we aim to optimize the association criteria based on knowledge acquired by the IDC over the last 6 years, and focus on the specificity of seismo-acoustic events. The resulting work has been incorporated into NETVISA [1], a Bayesian approach to network processing. The model that we propose is a fusion of seismic, hydroacoustic, and infrasound processing built on a unified probabilistic framework. References: [1] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013
Noller, Anna C; McEllistrem, M Catherine; Shutt, Kathleen A; Harrison, Lee H
2006-02-01
Multilocus variable-number tandem repeat analysis (MLVA) is a validated molecular subtyping method for detecting and evaluating Escherichia coli O157:H7 outbreaks. In a previous study, five outbreaks with a total of 21 isolates were examined by MLVA. Nearly 20% of the epidemiologically linked strains were single-locus variants (SLVs) of their respective predominant outbreak clone. This result prompted an investigation into the mutation rates of the seven MLVA loci (TR1 to TR7). Using an outbreak strain that was an SLV of the predominant clone at the TR1 locus, parallel and serial batch culture experiments were performed. In the parallel experiment, none (0/384) of the strains analyzed had mutations at the seven MLVA loci. In contrast, in the two 5-day serial experiments, 4.3% (41/960) of the strains analyzed had a significant variation in at least one of these loci (P < 0.001). The TR2 locus accounted for 85.3% (35/41) of the mutations, with an average mutation rate of 3.5 x 10(-3); the mutation rates for TR1 and TR5 were 10-fold lower. Single additions accounted for 77.1% (27/35) of the mutation events in TR2 and all (6/6) of the additions in TR1 and TR5. The remaining four loci had no slippage events detected. The mutation rates were locus specific and may impact the interpretation of MLVA data for epidemiologic investigations.
NASA Astrophysics Data System (ADS)
Salançon, Evelyne; Degiovanni, Alain; Lapena, Laurent; Morin, Roger
2018-04-01
An event-counting method using a two-microchannel-plate stack in a low-energy electron point projection microscope is implemented. A detector spatial resolution of 15 μm, i.e., the distance between first-neighbor microchannels, is demonstrated. This yields a sevenfold improvement in microscope resolution. Compared to previous work with neutrons [Tremsin et al., Nucl. Instrum. Methods Phys. Res., Sect. A 592, 374 (2008)], the large number of detection events achieved with electrons shows that the local response of the detector is mainly governed by the angle between the hexagonal structures of the two microchannel plates. Using this method in point projection microscopy offers the prospect of working with a greater source-object distance (350 nm instead of 50 nm), advancing toward atomic resolution.
Létant, Sonia E.; Murphy, Gloria A.; Alfaro, Teneile M.; Avila, Julie R.; Kane, Staci R.; Raber, Ellen; Bunt, Thomas M.; Shah, Sanjiv R.
2011-01-01
In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples. PMID:21764960
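The RV-PCR decision logic is essentially a before/after comparison of PCR cycle thresholds. A toy sketch follows, with the Ct-change cutoff and the no-amplification placeholder chosen for illustration rather than taken from the validated protocol.

```python
# Rapid-viability PCR call: live organisms grow during incubation, so the
# post-incubation Ct drops. The cutoff (6 cycles) and the 39-cycle
# "no amplification" stand-in are illustrative assumptions.
def rv_pcr_call(ct_before, ct_after, delta_ct_min=6.0, no_amp=39.0):
    ct_before = no_amp if ct_before is None else ct_before
    ct_after = no_amp if ct_after is None else ct_after
    delta = ct_before - ct_after
    verdict = "viable agent detected" if delta >= delta_ct_min else "not detected"
    return verdict, delta

print(rv_pcr_call(35.2, 24.8))   # large Ct drop after incubation -> viable
```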
Cyber Surveillance for Flood Disasters
Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han
2015-01-01
Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609
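Treating a flood as an "invading object" maps naturally onto off-the-shelf background subtraction; the sketch below is one plausible realization using OpenCV, with the region-of-interest mask and alarm fraction as assumed inputs, not the authors' exact pipeline.

```python
# Flood-as-intruder sketch: background subtraction over a monitored river
# segment; alarm when enough ROI pixels deviate from the learned background.
import cv2
import numpy as np

def flood_alerts(frames, roi_mask, alarm_fraction=0.3):
    backsub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
    roi_px = np.count_nonzero(roi_mask)
    for frame in frames:                      # frames: iterable of BGR images
        fg = backsub.apply(frame)             # foreground = candidate water rise
        hits = np.count_nonzero(fg[roi_mask > 0] > 0)
        yield hits / roi_px > alarm_fraction  # True -> raise a flood alert
```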
Li, Qi; Melton, Kristin; Lingren, Todd; Kirkendall, Eric S; Hall, Eric; Zhai, Haijun; Ni, Yizhao; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre
2014-01-01
Although electronic health records (EHRs) have the potential to provide a foundation for quality and safety algorithms, few studies have measured their impact on automated adverse event (AE) and medical error (ME) detection within the neonatal intensive care unit (NICU) environment. This paper presents two phenotyping AE and ME detection algorithms (ie, IV infiltrations, narcotic medication oversedation and dosing errors) and describes manual annotation of airway management and medication/fluid AEs from NICU EHRs. From 753 NICU patient EHRs from 2011, we developed two automatic AE/ME detection algorithms, and manually annotated 11 classes of AEs in 3263 clinical notes. Performance of the automatic AE/ME detection algorithms was compared to trigger tool and voluntary incident reporting results. AEs in clinical notes were double annotated and consensus achieved under neonatologist supervision. Sensitivity, positive predictive value (PPV), and specificity are reported. Twelve severe IV infiltrates were detected. The algorithm identified one more infiltrate than the trigger tool and eight more than incident reporting. One narcotic oversedation was detected demonstrating 100% agreement with the trigger tool. Additionally, 17 narcotic medication MEs were detected, an increase of 16 cases over voluntary incident reporting. Automated AE/ME detection algorithms provide higher sensitivity and PPV than currently used trigger tools or voluntary incident-reporting systems, including identification of potential dosing and frequency errors that current methods are unequipped to detect. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
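The reported comparison rests on standard confusion-matrix metrics; a small sketch with invented counts shows the computation.

```python
# Sensitivity, PPV, and specificity from confusion counts (toy numbers,
# not the study's data).
def detector_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)            # positive predictive value
    specificity = tn / (tn + fp)
    return sensitivity, ppv, specificity

print(detector_metrics(tp=12, fp=3, fn=1, tn=737))  # hypothetical AE screen
```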
Ontology-based knowledge management for personalized adverse drug events detection.
Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue
2011-01-01
Since Adverse Drug Event (ADE) has become a leading cause of death around the world, there arises high demand for helping clinicians or patients to identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with the focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate the personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define the ADE ontology to uniformly manage the ADE knowledge from multiple sources. We take advantage of the rich semantics from the terminology SNOMED-CT and apply it to ADE detection via the semantic query and reasoning.
Tackle and impact detection in elite Australian football using wearable microsensor technology.
Gastin, Paul B; McLean, Owen C; Breed, Ray V P; Spittle, Michael
2014-01-01
The effectiveness of a wearable microsensor device (MinimaxX(TM) S4, Catapult Innovations, Melbourne, VIC, Australia) in automatically detecting tackles and impact events in elite Australian football (AF) was assessed during four matches. Video observation was used as the criterion measure. A total of 352 tackles were observed, with 78% correctly detected as tackles by the manufacturer's software. Tackles against (i.e. being tackled by an opponent) were more accurately detected than tackles made (90% vs 66%). Of the 77 tackles that were not detected at all, the majority (74%) were categorised as low-intensity. In contrast, a total of 1510 "tackle" events were detected, with only 18% of these verified as tackles. A further 57% were from contested ball situations involving player contact. The remaining 25% were in general play where no contact was evident; these were significantly lower in peak Player Load™ than those involving player contact (P < 0.01). The tackle detection algorithm, developed primarily for rugby, was not suitable for tackle detection in AF. The underlying sensor data may have the potential to detect a range of events within contact sports such as AF, yet doing so is a complex task and requires sophisticated sport- and event-specific algorithms.
Day-time identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2013-10-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather that has significant impact on the environment, property, and the population. A new method, the Hail Detection Tool (HDT), is described for identifying hail-bearing storms using multi-spectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the Convective Mask (CM) algorithm, devised for the detection of deep convection, and the second is a Hail Detection (HD) algorithm for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HD are based on logistic regression models trained with multi-spectral MSG data sets of summer convective events in the middle Ebro Valley between 2006 and 2010, detected by the RGB visualization technique (CM) or by the C-band weather radar system of the University of León (HD). By means of the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HD is computed by exploiting a proper selection of MSG wavelengths or their combination. A number of cloud physical properties (liquid water path, optical thickness, and effective cloud drop radius) were used to physically interpret the results of the statistical models from a meteorological perspective, using a method based on these "ingredients." Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall Probability of Detection (POD) was 76.9% and the False Alarm Ratio was 16.7%.
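A two-phase classifier of this kind can be sketched with ordinary logistic regression; the feature matrix, labels, and 0.5 cutoff below are assumptions for illustration, not the trained HDT models.

```python
# Two-phase sketch: a Convective Mask model first, then a Hail Detection
# model applied only where convection is predicted.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_two_phase(X, y_convective, y_hail):
    cm = LogisticRegression(max_iter=1000).fit(X, y_convective)
    conv = cm.predict(X).astype(bool)      # assumed to contain both hail classes
    hd = LogisticRegression(max_iter=1000).fit(X[conv], y_hail[conv])
    return cm, hd

def apply_two_phase(cm, hd, X, cutoff=0.5):
    p_conv = cm.predict_proba(X)[:, 1]
    p_hail = np.zeros(len(X))
    mask = p_conv > cutoff
    if mask.any():
        p_hail[mask] = hd.predict_proba(X[mask])[:, 1]  # hail prob within Cb only
    return p_conv, p_hail
```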
Multimodal Event Detection in Twitter Hashtag Networks
Yilmaz, Yasin; Hero, Alfred O.
2016-07-01
In this study, event detection in a multimodal Twitter dataset is considered. We treat the hashtags in the dataset as instances with two modes: text and geolocation features. The text feature consists of a bag-of-words representation. The geolocation feature consists of geotags (i.e., geographical coordinates) of the tweets. Fusing the multimodal data, we aim to detect, in terms of topic and geolocation, the interesting events and the associated hashtags. To this end, a generative latent variable model is assumed, and a generalized expectation-maximization (EM) algorithm is derived to learn the model parameters. The proposed method is computationally efficient and lends itself to big datasets. Lastly, experimental results on a Twitter dataset from August 2014 show the efficacy of the proposed method.
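The paper derives a generalized EM algorithm for a joint text-and-geolocation latent-variable model; the sketch below is a deliberately simplified stand-in that fits a Gaussian mixture (itself estimated by EM) to the geotags and summarizes each component's text mode by word counts.

```python
# Simplified stand-in for the multimodal event model: EM-fitted Gaussian
# mixture on geotags + per-component word counts as a crude "topic".
import numpy as np
from collections import Counter
from sklearn.mixture import GaussianMixture

def detect_geo_events(geotags, bags_of_words, n_events=5, seed=0):
    # geotags: (n_hashtags, 2) lat/lon; bags_of_words: list of word->count dicts
    gm = GaussianMixture(n_components=n_events, random_state=seed).fit(geotags)
    z = gm.predict(geotags)                   # event assignment per hashtag
    topics = {}
    for k in range(n_events):
        words = Counter()
        for i in np.where(z == k)[0]:
            words.update(bags_of_words[i])
        topics[k] = words.most_common(5)      # top words for event k
    return gm.means_, topics                  # event centers + topic summaries
```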
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), when he follows the victim, or when he interacts with an accomplice before and after the incident (longer time scale). This paper focusses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assists operators to find threatening behavior and enrich the selection of videos that are to be observed.
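The single-track actions listed above lend themselves to simple rule-based classification over track features; the thresholds in this sketch are illustrative, not the paper's tuned values.

```python
# Rule-based single-track classifier sketch. A track is an (n, 3) array of
# [t_seconds, x_m, y_m]; speed and turn thresholds are assumptions.
import numpy as np

def classify_track(track, run_speed=3.0, walk_speed=0.5, turn_deg=60.0):
    t, xy = track[:, 0], track[:, 1:]
    step = np.diff(xy, axis=0)
    speed = np.linalg.norm(step, axis=1) / np.diff(t)
    if speed.mean() < walk_speed:
        return "stop" if speed.max() < walk_speed else "loiter"
    heading = np.degrees(np.arctan2(step[:, 1], step[:, 0]))
    if np.abs(np.diff(heading)).sum() > turn_deg:   # ignores 360-deg wraparound
        return "turn"
    return "run" if speed.mean() > run_speed else "walk"
```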
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W.; Benz, H.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detection, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing, such as automated S-wave arrival-time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
Application of data cubes for improving detection of water cycle extreme events
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.
2015-12-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cube concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVMs) for anomaly classification. We show an example of detection of WCE events using the Global Land Data Assimilation System (GLDAS) data set.
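For readers unfamiliar with SPRT, the sketch below shows the classical Wald test on a data-rods time series under an assumed Gaussian model; the means, sigma, and error rates are placeholders.

```python
# Sequential probability ratio test (Wald): accumulate the log-likelihood
# ratio of "extreme event" vs "normal" models until a boundary is crossed.
import numpy as np
from scipy.stats import norm

def sprt(series, mu_normal, mu_event, sigma, alpha=0.01, beta=0.01):
    upper = np.log((1 - beta) / alpha)    # cross -> declare event
    lower = np.log(beta / (1 - alpha))    # cross -> declare normal
    llr = 0.0
    for k, x in enumerate(series):
        llr += norm.logpdf(x, mu_event, sigma) - norm.logpdf(x, mu_normal, sigma)
        if llr >= upper:
            return "event", k
        if llr <= lower:
            return "normal", k
    return "undecided", len(series) - 1
```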
NASA Astrophysics Data System (ADS)
Kosuga, M.
2013-12-01
The location of early aftershocks is very important for obtaining information on the mainshock fault; however, it is often difficult due to the long-lasting coda waves of the mainshock and the successive occurrence of aftershocks. To overcome this difficulty, we developed a location method that uses seismogram envelopes as templates and applied it to the early aftershock sequence of the 2004 Mid-Niigata Prefecture (Chuetsu) Earthquake (M = 6.8) in central Japan. The location method comprises three processes. The first is the calculation of cross-correlation coefficients between a continuous (target) envelope and template envelopes. We prepare envelopes by taking the logarithm of the root-mean-squared amplitude of band-pass-filtered seismograms, and we perform the calculation by shifting the time window to obtain a set of cross-correlation values for each template. The second process is event detection (selection of a template) and magnitude estimation. We search for events in descending order of cross-correlation in a time window excluding the dead times around previously detected events; magnitude is calculated from the amplitude ratio of the target and template envelopes. The third process is the relative location of the event with respect to the selected template. We applied this method to the Chuetsu earthquake, a large inland earthquake with extensive aftershock activity. The number of detected events depends on the number of templates, the frequency range, and the threshold value of cross-correlation; we set the threshold to 0.5 by referring to the histogram of cross-correlation values. During the one-hour period after the mainshock, we detected more events than are listed in the JMA catalog, with locations generally near the catalog locations. Though the methods of relative location and magnitude estimation should be improved, we conclude that the proposed method works adequately even immediately after the mainshock of a large inland earthquake. Acknowledgement: We thank JMA, NIED, and the University of Tokyo for providing arrival time data and waveform data. This work was supported by JSPS KAKENHI Grant Number 23540487.
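A stripped-down sketch of the envelope template matching described above, assuming the template and continuous envelopes share the same sampling rate; the 0.5 threshold follows the abstract, while the window length and the median-based amplitude ratio are illustrative choices.

```python
# Envelope template matching: log-RMS envelopes, sliding normalized
# cross-correlation, and a magnitude offset from the amplitude ratio.
import numpy as np

def log_rms_envelope(x, win=100):
    e = np.convolve(x.astype(float) ** 2, np.ones(win) / win, mode="same")
    return np.log10(np.sqrt(e) + 1e-12)

def scan(template_env, cont_env, threshold=0.5):
    n = len(template_env)
    t = template_env - template_env.mean()
    hits = []
    for k in range(len(cont_env) - n):
        w = cont_env[k:k + n] - cont_env[k:k + n].mean()
        cc = np.dot(t, w) / (np.linalg.norm(t) * np.linalg.norm(w) + 1e-12)
        if cc >= threshold:
            ratio = np.median(10 ** cont_env[k:k + n]) / np.median(10 ** template_env)
            hits.append((k, cc, np.log10(ratio)))  # (time, cc, magnitude offset)
    return hits
```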
Signaling communication events in a computer network
Bender, Carl A.; DiNicola, Paul D.; Gildea, Kevin J.; Govindaraju, Rama K.; Kim, Chulho; Mirza, Jamshed H.; Shah, Gautam H.; Nieplocha, Jaroslaw
2000-01-01
A method, apparatus and program product for detecting a communication event in a distributed parallel data processing system in which a message is sent from an origin to a target. A low-level application programming interface (LAPI) is provided which has an operation for associating a counter with a communication event to be detected. The LAPI increments the counter upon the occurrence of the communication event. The number in the counter is monitored, and when the number increases, the event is detected. A completion counter in the origin is associated with the completion of a message being sent from the origin to the target. When the message is completed, LAPI increments the completion counter such that monitoring the completion counter detects the completion of the message. The completion counter may be used to insure that a first message has been sent from the origin to the target and completed before a second message is sent.
Cognitive workload modulation through degraded visual stimuli: a single-trial EEG study
NASA Astrophysics Data System (ADS)
Yu, K.; Prasad, I.; Mir, H.; Thakor, N.; Al-Nashash, H.
2015-08-01
Objective. Our experiments explored the effect of visual stimuli degradation on cognitive workload. Approach. We investigated the subjective assessment, event-related potentials (ERPs) as well as electroencephalogram (EEG) as measures of cognitive workload. Main results. These experiments confirm that degradation of visual stimuli increases cognitive workload as assessed by subjective NASA task load index and confirmed by the observed P300 amplitude attenuation. Furthermore, the single-trial multi-level classification using features extracted from ERPs and EEG is found to be promising. Specifically, the adopted single-trial oscillatory EEG/ERP detection method achieved an average accuracy of 85% for discriminating 4 workload levels. Additionally, we found from the spatial patterns obtained from EEG signals that the frontal parts carry information that can be used for differentiating workload levels. Significance. Our results show that visual stimuli can modulate cognitive workload, and the modulation can be measured by the single trial EEG/ERP detection method.
Security and Privacy Qualities of Medical Devices: An Analysis of FDA Postmarket Surveillance
Kramer, Daniel B.; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R.
2012-01-01
Background Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients’ stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. Methods We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Results Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Conclusions Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware. PMID:22829874
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
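A minimal modern equivalent of this regression setup, assuming a design matrix of event regressors and one voxel's signal; scikit-learn's built-in validation split stands in for the paper's cross-validation and early-stopping scheme.

```python
# Feed-forward network as a non-linear regression of BOLD on event inputs,
# with early stopping on a held-out fraction to avoid fitting noise.
from sklearn.neural_network import MLPRegressor

def fit_voxel(design, bold):
    net = MLPRegressor(hidden_layer_sizes=(16,),
                       early_stopping=True,      # stop when validation stalls
                       validation_fraction=0.1,
                       max_iter=2000, random_state=0)
    net.fit(design, bold)
    return net.predict(design)                   # fitted voxel time course
```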
NASA Astrophysics Data System (ADS)
Campbell, Adam J.; Hulbe, Christina L.; Lee, Choon-Ki
2018-01-01
As time series observations of Antarctic change proliferate, it is imperative that mathematical frameworks through which they are understood keep pace. Here we present a new method of interpreting remotely sensed change using spatial statistics and apply it to the specific case of thickness change on the Ross Ice Shelf. First, a numerical model of ice shelf flow is used together with empirical orthogonal function analysis to generate characteristic patterns of response to specific forcings. Because they are continuous and scalable in space and time, the patterns allow short duration observations to be placed in a longer time series context. Second, focusing only on changes that are statistically significant, the synthetic response surfaces are used to extract magnitude and timing of past events from the observational data. Slowdown of Kamb and Whillans Ice Streams is clearly detectable in remotely sensed thickness change. Moreover, those past events will continue to drive thinning into the future.
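Empirical orthogonal function analysis of a modeled response field reduces to an SVD; the sketch assumes an (n_times, n_points) anomaly matrix of simulated thickness change.

```python
# EOF analysis via SVD: spatial patterns, principal-component time series,
# and the fraction of variance carried by each mode.
import numpy as np

def eof_analysis(field, n_modes=3):
    A = field - field.mean(axis=0)        # remove the time mean at each point
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    patterns = Vt[:n_modes]               # characteristic spatial patterns
    pcs = U[:, :n_modes] * s[:n_modes]    # their amplitudes through time
    var_frac = (s ** 2) / np.sum(s ** 2)
    return patterns, pcs, var_frac[:n_modes]
```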
Melé, Marta; Javed, Asif; Pybus, Marc; Zalloua, Pierre; Haber, Marc; Comas, David; Netea, Mihai G; Balanovsky, Oleg; Balanovska, Elena; Jin, Li; Yang, Yajun; Pitchappan, R M; Arunkumar, G; Parida, Laxmi; Calafell, Francesc; Bertranpetit, Jaume
2012-01-01
The information left by recombination in our genomes can be used to make inferences on our recent evolutionary history. Specifically, the number of past recombination events in a population sample is a function of its effective population size (Ne). We have applied a method, Identifying Recombination in Sequences (IRiS), to detect specific past recombination events in 30 Old World populations to infer their Ne. We have found that sub-Saharan African populations have an Ne that is approximately four times greater than those of non-African populations and that outside of Africa, South Asian populations had the largest Ne. We also observe that the patterns of recombinational diversity of these populations correlate with distance out of Africa if that distance is measured along a path crossing South Arabia. No such correlation is found through a Sinai route, suggesting that anatomically modern humans first left Africa through the Bab-el-Mandeb strait rather than through present Egypt.
Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash
2010-04-01
The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.
Monitoring Seismo-volcanic and Infrasonic Signals at Volcanoes: Mt. Etna Case Study
NASA Astrophysics Data System (ADS)
Cannata, Andrea; Di Grazia, Giuseppe; Aliotta, Marco; Cassisi, Carmelo; Montalto, Placido; Patanè, Domenico
2013-11-01
Volcanoes generate a broad range of seismo-volcanic and infrasonic signals, whose features and variations are often closely related to volcanic activity. The study of these signals is hence very useful in the monitoring and investigation of volcano dynamics. The analysis of seismo-volcanic and infrasonic signals requires specifically developed techniques due to their unique characteristics, which are generally quite distinct compared with tectonic and volcano-tectonic earthquakes. In this work, we describe analysis methods used to detect and locate seismo-volcanic and infrasonic signals at Mt. Etna. Volcanic tremor sources are located using a method based on the spatial seismic amplitude distribution, assuming propagation in a homogeneous medium. The tremor source is found by calculating the goodness of the linear regression fit (R²) of the log-linearized equation of the seismic amplitude decay with distance. The location method for long-period events is based on the joint computation of semblance and R² values, and the location method for very long-period events is based on the application of radial semblance. Infrasonic events and tremor are located by semblance-brightness-based and semblance-based methods, respectively. The techniques described here can also be applied to other volcanoes and do not require particular network geometries (such as arrays) but rather simple sparse networks. Using the source locations of all the considered signals, we were able to reconstruct the shallow plumbing system (above sea level) during 2011.
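The tremor-location step above amounts to a grid search maximizing the linear-fit R² of log amplitude against distance; the sketch keeps only a geometrical-spreading term for brevity, which is a simplification of the full amplitude-decay model.

```python
# Amplitude-decay source location: for each grid node, regress log amplitude
# on log distance and keep the node with the best R^2.
import numpy as np

def locate_tremor(station_xyz, amplitudes, grid_nodes):
    log_a = np.log(amplitudes)
    best_r2, best_node = -np.inf, None
    for node in grid_nodes:                  # candidate source points (x, y, z)
        r = np.linalg.norm(station_xyz - node, axis=1)
        X = np.column_stack([np.ones_like(r), np.log(r)])
        coef, *_ = np.linalg.lstsq(X, log_a, rcond=None)
        resid = log_a - X @ coef
        r2 = 1 - np.sum(resid ** 2) / np.sum((log_a - log_a.mean()) ** 2)
        if r2 > best_r2:
            best_r2, best_node = r2, node
    return best_r2, best_node
```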
Eriksson, Robert; Werge, Thomas; Jensen, Lars Juhl; Brunak, Søren
2014-04-01
Data collected for medical, filing, and administrative purposes in electronic patient records (EPRs) represent a rich source of individualised clinical data, which has great potential for improved detection of patients experiencing adverse drug reactions (ADRs), across all approved drugs and across all indication areas. The aim of this study was to take advantage of techniques for temporal data mining of EPRs in order to detect ADRs in a patient- and dose-specific manner. We used a psychiatric hospital's EPR system to investigate undesired drug effects. Within one workflow, the method identified patient-specific adverse events (AEs) and linked these to specific drugs and dosages in a temporal manner, based on the integration of text mining results and structured data. The structured data contained precise information on drug identity, dosage, and strength. When applying the method to the 3,394 patients in the cohort, we identified AEs linked with a drug in 2,402 patients (70.8%). Of the 43,528 patient-specific drug substances prescribed, 14,736 (33.9%) were linked with AEs. From these links we identified multiple ADRs (p < 0.05) and found them to occur at frequencies similar to those stated by the manufacturer and in the literature. We showed that drugs displaying similar ADR profiles share targets, and we compared submitted spontaneous AE reports with our findings. For nine of the ten most prescribed antipsychotics in the patient population, larger doses were prescribed to sedated patients than to non-sedated patients; five antipsychotics exhibited a significant difference (p < 0.05). Finally, we present two cases (p < 0.05) identified by the workflow. The method identified the potentially fatal AE QT prolongation caused by methadone, and a previously undescribed likely ADR between levomepromazine and nightmares found among the hundreds of identified novel links between drugs and AEs (p < 0.05). The developed method can be used to extract dose-dependent ADR information from already collected EPR data. Large-scale AE extraction from EPRs may complement or even replace current drug safety monitoring methods in the future, reducing or eliminating manual reporting and enabling much faster ADR detection.
Expanding the 2011 Prague, OK Event Catalog: Detections, Relocations, and Stress Drop Estimates
NASA Astrophysics Data System (ADS)
Clerc, F.; Cochran, E. S.; Dougherty, S. L.; Keranen, K. M.; Harrington, R. M.
2016-12-01
The Mw 5.6 earthquake occurring on 6 Nov. 2011, near Prague, OK, is thought to have been triggered by a Mw 4.8 foreshock, which was likely induced by fluid injection into local wastewater disposal wells [Keranen et al., 2013; Sumy et al., 2014]. Previous stress drop estimates for the sequence have suggested values lower than those for most Central and Eastern U.S. tectonic events of similar magnitudes [Hough, 2014; Sun & Hartzell, 2014; Sumy & Neighbors et al., 2016]. Better stress drop estimates allow more realistic assessment of seismic hazard and more effective regulation of wastewater injection. More reliable estimates of source properties may help to differentiate induced events from natural ones. Using data from local and regional networks, we perform event detections, relocations, and stress drop calculations of the Prague aftershock sequence. We use the Match & Locate method, a variation on the matched-filter method which detects events of lower magnitudes by stacking cross-correlograms from different stations [Zhang & Wen, 2013; 2015], in order to create a more complete catalog from 6 Nov to 31 Dec 2011. We then relocate the detected events using the HypoDD double-difference algorithm. Using our enhanced catalog and relocations, we examine the seismicity distribution for evidence of migration and investigate implications for triggering mechanisms. To account for path and site effects, we calculate stress drops using the Empirical Green's Function (EGF) spectral ratio method, beginning with 2730 previously relocated events. We determine whether there is a correlation between the stress drop magnitudes and the spatial and temporal distribution of events, including depth, position relative to existing faults, and proximity to injection wells. Finally, we consider the range of stress drop values and scaling with respect to event magnitudes within the context of previously published work for the Prague sequence as well as other induced and natural sequences.
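The detection step of matched-filter (and, with moveout shifts, Match & Locate) workflows can be sketched compactly; per-station moveout alignment is omitted here, and the MAD-based threshold is a common convention rather than this study's stated choice.

```python
# Matched filter sketch: normalized cross-correlation per station, stacked
# across the network, thresholded at median + k * MAD.
import numpy as np

def normalized_cc(template, data):
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    out = np.zeros(len(data) - n + 1)
    for k in range(len(out)):
        w = data[k:k + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[k] = np.dot(t, w) / n
    return out

def detect(templates, traces, k_mad=9.0):
    stack = np.mean([normalized_cc(tp, tr)      # one template/trace per station
                     for tp, tr in zip(templates, traces)], axis=0)
    mad = np.median(np.abs(stack - np.median(stack)))
    picks = np.where(stack > np.median(stack) + k_mad * mad)[0]
    return picks, stack
```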
Guha Mazumder, Arpan; Chatterjee, Swarnadip; Chatterjee, Saunak; Gonzalez, Juan Jose; Bag, Swarnendu; Ghosh, Sambuddha; Mukherjee, Anirban; Chatterjee, Jyotirmoy
2017-01-01
Introduction Image-based early detection for diabetic retinopathy (DR) needs value addition due to lack of well-defined disease-specific quantitative imaging biomarkers (QIBs) for neuroretinal degeneration and spectropathological information at the systemic level. Retinal neurodegeneration is an early event in the pathogenesis of DR. Therefore, development of an integrated assessment method for detecting neuroretinal degeneration using spectropathology and QIBs is necessary for the early diagnosis of DR. Methods The present work explored the efficacy of intensity and textural features extracted from optical coherence tomography (OCT) images after selecting a specific subset of features for the precise classification of retinal layers using variants of support vector machine (SVM). Fourier transform infrared (FTIR) spectroscopy and nuclear magnetic resonance (NMR) spectroscopy were also performed to confirm the spectropathological attributes of serum for further value addition to the OCT, fundoscopy, and fluorescein angiography (FA) findings. The serum metabolomic findings were also incorporated for characterizing retinal layer thickness alterations and vascular asymmetries. Results Results suggested that OCT features could differentiate the retinal lesions indicating retinal neurodegeneration with high sensitivity and specificity. OCT, fundoscopy, and FA provided geometrical as well as optical features. NMR revealed elevated levels of ribitol, glycerophosphocholine, and uridine diphosphate N-acetyl glucosamine, while the FTIR of serum samples confirmed the higher expressions of lipids and β-sheet-containing proteins responsible for neoangiogenesis, vascular fragility, vascular asymmetry, and subsequent neuroretinal degeneration in DR. Conclusion Our data indicated that disease-specific spectropathological alterations could be the major phenomena behind the vascular attenuations observed through fundoscopy and FA, as well as the variations in the intensity and textural features observed in OCT images. Finally, we propose a model that uses spectropathology corroborated with specific QIBs for detecting neuroretinal degeneration in early diagnosis of DR. PMID:29200821
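As a loose illustration of the feature-plus-SVM pipeline (not the paper's feature set or its specific SVM variants), simple intensity and gradient statistics per OCT patch can feed an RBF SVM:

```python
# Toy OCT-layer classifier: hand-crafted intensity/texture statistics + SVM.
import numpy as np
from sklearn.svm import SVC

def patch_features(patch):
    p = patch.astype(float)
    gy, gx = np.gradient(p)
    skew = ((p - p.mean()) ** 3).mean() / (p.std() ** 3 + 1e-12)
    return [p.mean(), p.std(),                     # intensity statistics
            np.abs(gx).mean(), np.abs(gy).mean(),  # crude texture/edge energy
            skew]

def train_layer_classifier(patches, labels):
    X = np.array([patch_features(p) for p in patches])
    return SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
```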
Mouffouk, Fouzi; Aouabdi, Sihem; Al-Hetlani, Entesar; Serrai, Hacene; Alrefae, Tareq; Leo Chen, Liaohai
2017-01-01
Screening and early diagnosis are the key factors for the reduction of mortality rate and treatment cost of cancer. Therefore, sensitive and selective methods that can reveal the low abundance of cancer biomarkers in a biological sample are always desired. Here, we report the development of a novel electrochemical biosensor for early detection of breast cancer by using bioconjugated self-assembled pH-responsive polymeric micelles. The micelles were loaded with ferrocene molecules as “tracers” to specifically target cell surface-associated epithelial mucin (MUC1), a biomarker for breast and other solid carcinomas. The synthesis of target-specific, ferrocene-loaded polymeric micelles was confirmed, and the resulting sensor was capable of detecting the presence of MUC1 in a sample containing approximately 10 cells/mL. Such high sensitivity was achieved by maximizing the loading capacity of ferrocene inside the polymeric micelles. Every single antibody-antigen binding event was represented by the signal of the hundreds of thousands of ferrocene molecules released from the polymeric micelles. This resulted in a significant increase in the intensity of the ferrocene signal detected by cyclic voltammetry. PMID:28450780
Systematic detection of seismic events at Mount St. Helens with an ultra-dense array
NASA Astrophysics Data System (ADS)
Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.
2016-12-01
During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us with an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intense swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring seismic velocity changes at Mount St. Helens using coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
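The catalog statistics quoted above (magnitude of completeness and b-value) come from standard Gutenberg-Richter analysis. As a hedged illustration of that last step, here is the usual maximum-likelihood b-value estimator; the magnitudes, completeness cutoff, and bin width are assumed inputs, not values recomputed from the study.

```python
# Hedged sketch: maximum-likelihood b-value (Aki's estimator with Utsu's
# bin-width correction) for a detected catalog with completeness magnitude mc.
import numpy as np

def b_value(mags, mc, dm=0.1):
    """MLE b-value for magnitudes >= mc, with magnitude bin-width dm."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Example usage (illustrative): b_value(catalog_magnitudes, mc=-0.5)
# would be expected to return roughly 1.1 for a catalog like the one above.
```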
NASA Astrophysics Data System (ADS)
Bartlow, N. M.
2017-12-01
Slow Earthquake Hunters is a new citizen science project to detect, catalog, and monitor slow slip events. Slow slip events, also called "slow earthquakes", occur when faults slip too slowly to generate significant seismic radiation. They typically take between a few days and over a year to occur, and are most often found on subduction zone plate interfaces. While slow slip events are not dangerous in and of themselves, recent evidence suggests that monitoring them is important for earthquake hazards, as slow slip events have been known to trigger damaging "regular" earthquakes. Because they do not radiate seismically, slow slip events are detected with a variety of methods, most commonly continuous geodetic Global Positioning System (GPS) stations. There is now a wealth of GPS data in some regions that experience slow slip events, but a reliable automated method to detect them in GPS data remains elusive. This project aims to recruit human users to view GPS time series data, with some post-processing to highlight slow slip signals, and flag slow slip events for further analysis by the scientific team. Slow Earthquake Hunters will begin with data from the Cascadia subduction zone, where geodetically detectable slow slip events with a duration of at least a few days recur at regular intervals. The project will then expand to other areas with slow slip events or other transient geodetic signals, including other subduction zones and areas with strike-slip faults. This project has not yet rolled out to the public and is in a beta testing phase. This presentation will show results from an initial pilot group of student participants at the University of Missouri, and solicit feedback for the future of Slow Earthquake Hunters.
Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura
2015-01-01
Introduction: We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists’ use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Materials and Methods: Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy’s focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Results: Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. Practice Implications: A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead. PMID:26834939
Comparing CNV detection methods for SNP arrays.
Winchester, Laura; Yau, Christopher; Ragoussis, Jiannis
2009-09-01
Data from whole genome association studies can now be used for dual purposes, genotyping and copy number detection. In this review we discuss some of the methods for using SNP data to detect copy number events. We examine a number of algorithms designed to detect copy number changes through the use of signal-intensity data and consider methods to evaluate the changes found. We describe the use of several statistical models in copy number detection in germline samples. We also present a comparison of data using these methods to assess accuracy of prediction and detection of changes in copy number.
Detection of low-level environmental chemical allergy by a long-term sensitization method.
Fukuyama, Tomoki; Ueda, Hideo; Hayashi, Koichi; Tajima, Yukari; Shuto, Yasufumi; Saito, Toru R; Harada, Takanori; Kosaka, Tadashi
2008-07-30
Multiple chemical sensitivity (MCS) is characterized by various signs, including neurological disorders and allergy. Exposure may occur through a major event, such as a chemical spill, or from long-term contact with chemicals at low levels. We are interested in the allergenicity of MCS and the detection of low-level chemical-related hypersensitivity. We used long-term sensitization followed by low-dose challenge to evaluate sensitization by well-known Th2 type sensitizers (trimellitic anhydride (TMA) and toluene diisocyanate (TDI)) and a Th1 type sensitizer (2,4-dinitrochlorobenzene (DNCB)). After topically sensitizing BALB/c mice (9 times in 3 weeks) and challenging them with TMA, TDI or DNCB, we assayed their auricular lymph nodes (LNs) for number of lymphocytes, surface antigen expression of B cells, and local cytokine production, and measured antigen-specific serum IgE levels. TMA and TDI induced marked increases in levels of antigen-specific serum IgE and of Th2 cytokines (IL-4, IL-5, IL-10, and IL-13) produced by ex vivo restimulated lymph node cells. DNCB induced a marked increase in Th1 cytokine (IL-2, IFN-gamma, and TNF-alpha) levels, but antigen-specific serum IgE levels were not elevated. All chemicals induced significant increases in number of lymphocytes and surface antigen expression of B cells. Our mouse model enabled the identification and characterization of chemical-related allergic reactions at low levels. This long-term sensitization method would be useful for detecting environmental chemical-related hypersensitivity.
Methodological Caveats in the Detection of Coordinated Replay between Place Cells and Grid Cells.
Trimper, John B; Trettel, Sean G; Hwaun, Ernie; Colgin, Laura Lee
2017-01-01
At rest, hippocampal "place cells," neurons with receptive fields corresponding to specific spatial locations, reactivate in a manner that reflects recently traveled trajectories. These "replay" events have been proposed as a mechanism underlying memory consolidation, or the transfer of a memory representation from the hippocampus to neocortical regions associated with the original sensory experience. Accordingly, it has been hypothesized that hippocampal replay of a particular experience should be accompanied by simultaneous reactivation of corresponding representations in the neocortex and in the entorhinal cortex, the primary interface between the hippocampus and the neocortex. Recent studies have reported that coordinated replay may occur between hippocampal place cells and medial entorhinal cortex grid cells, cells with multiple spatial receptive fields. Assessing replay in grid cells is problematic, however, as the cells exhibit regularly spaced spatial receptive fields in all environments and, therefore, coordinated replay between place cells and grid cells may be detected by chance. In the present report, we adapted analytical approaches utilized in recent studies of grid cell and place cell replay to determine the extent to which coordinated replay is spuriously detected between grid cells and place cells recorded from separate rats. For a subset of the employed analytical methods, coordinated replay was detected spuriously in a significant proportion of cases in which place cell replay events were randomly matched with grid cell firing epochs of equal duration. More rigorous replay evaluation procedures and minimum spike count requirements greatly reduced the amount of spurious findings. These results provide insights into aspects of place cell and grid cell activity during rest that contribute to false detection of coordinated replay. The results further emphasize the need for careful controls and rigorous methods when testing the hypothesis that place cells and grid cells exhibit coordinated replay.
2013-01-01
Background We have previously published a technique for objective assessment of freezing of gait (FOG) in Parkinson's disease (PD) from a single shank-mounted accelerometer. Here we extend this approach to evaluate the optimal configuration of sensor placement and signal processing parameters using seven sensors attached to the lumbar back, thighs, shanks and feet. Methods Multi-segmental acceleration data was obtained from 25 PD patients performing 134 timed up and go tasks, and clinical assessment of FOG was performed by two experienced raters from video. Four metrics were used to compare objective and clinical measures; the intraclass correlation coefficient (ICC) for number of FOG episodes and the percent time frozen per trial; and the sensitivity and specificity of FOG detection. Results The seven-sensor configuration was the most robust, scoring highly on all measures of performance (ICC number of FOG 0.75; ICC percent time frozen 0.80; sensitivity 84.3%; specificity 78.4%). A simpler single-shank sensor approach provided similar ICC values and exhibited a high sensitivity to FOG events, but specificity was lower at 66.7%. Recordings from the lumbar sensor offered only moderate agreement with the clinical raters in terms of absolute number and duration of FOG events (likely due to musculoskeletal attenuation of lower-limb 'trembling' during FOG), but demonstrated a high sensitivity (86.2%) and specificity (82.4%) when considered as a binary test for the presence/absence of FOG within a single trial. Conclusions The seven-sensor approach was the most accurate method for quantifying FOG, and is best suited to demanding research applications. A single shank sensor provided measures comparable to the seven-sensor approach but is relatively straightforward in execution, facilitating clinical use. A single lumbar sensor may provide a simple means of objective FOG detection given the ubiquitous nature of accelerometers in mobile telephones and other belt-worn devices. PMID:23405951
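The abstract above builds on a previously published single-sensor FOG technique but does not spell out the detector. A widely used metric in this literature is the "freeze index" of Moore et al. (spectral power in a 3-8 Hz freeze band relative to a 0.5-3 Hz locomotor band); the sketch below shows that metric under stated assumptions, with band edges and the threshold as illustrative placeholders rather than the paper's validated parameters.

```python
# Hedged sketch of a common accelerometer-based FOG metric (freeze index).
# Band limits and the threshold are assumptions for illustration only.
import numpy as np
from scipy.signal import welch

def freeze_index(acc, fs):
    """Ratio of freeze-band (3-8 Hz) to locomotor-band (0.5-3 Hz) power."""
    f, pxx = welch(acc, fs=fs, nperseg=min(len(acc), 4 * int(fs)))
    freeze = np.trapz(pxx[(f >= 3) & (f < 8)], f[(f >= 3) & (f < 8)])
    locomotor = np.trapz(pxx[(f >= 0.5) & (f < 3)], f[(f >= 0.5) & (f < 3)])
    return freeze / locomotor

def is_frozen(acc_window, fs, threshold=2.0):
    """Flag a window as a candidate FOG episode when the index is high."""
    return freeze_index(acc_window, fs) > threshold
```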
Systems and methods of detecting force and stress using tetrapod nanocrystal
Choi, Charina L.; Koski, Kristie J.; Sivasankar, Sanjeevi; Alivisatos, A. Paul
2013-08-20
Systems and methods of detecting force on the nanoscale including methods for detecting force using a tetrapod nanocrystal by exposing the tetrapod nanocrystal to light, which produces a luminescent response by the tetrapod nanocrystal. The method continues with detecting a difference in the luminescent response by the tetrapod nanocrystal relative to a base luminescent response that indicates a force between a first and second medium or stresses or strains experienced within a material. Such systems and methods find use with biological systems to measure forces in biological events or interactions.
Multilingual Analysis of Twitter News in Support of Mass Emergency Events
NASA Astrophysics Data System (ADS)
Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.
2012-04-01
Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual twitter feeds for emergency events. Specifically, we consider tsunami and earthquakes, as one possible originating cause of tsunami, and propose to analyze twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunami, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets, which prevents us from observing correlations of events across languages. One way to overcome this deficit is to identify geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use twitter analysis for situation picture assessment, e.g. for planning relief actions. At present, a multilingual corpus of Twitter messages related to crises is being assembled, and domain-specific language resources such as multilingual terminology lists and language-specific Natural Language Processing (NLP) tools are being built up to help cross the language barrier. The final goal is to extend this work to the main languages spoken around the Mediterranean and to classify and extract relevant information from tweets, translating the main keywords into English.
Evaluation of the LWVD Luminosity for Use in the Spectral-Based Volume Sensor Algorithms
2010-04-29
Acronym glossary fragment: VMI = Vibro-Meter, Inc.; VS = Volume Sensor; VSCS = Volume Sensor Communications Specification; VSDS = Volume Sensor Detection Suite; VSNP = Volume Sensor Nodal Panel. [...] using the VSCS communications protocol. Appendix A gives a complete listing of the SBVS EVENT parameters and the EVENT algorithm descriptions.
Smalling, Kelly L.; Orlando, James L.
2011-01-01
Water and sediment (bed and suspended) were collected from January 2008 through October 2009 from 12 sites in 3 of the largest watersheds along California's Central Coast (Pajaro, Salinas, and Santa Maria Rivers) and analyzed for a suite of pesticides by the U.S. Geological Survey. Water samples were collected in each watershed from the estuaries and major tributaries during 4 storm events and 11 dry season sampling events in 2008 and 2009. Bed sediments were collected from depositional zones at the tributary sampling sites three times over the course of the study. Suspended sediment samples were collected from the major tributaries during the four storm events and in the tributaries and estuaries during three dry season sampling events in 2009. Water samples were analyzed for 68 pesticides using gas chromatography/mass spectrometry. A total of 38 pesticides were detected in 144 water samples, and 13 pesticides were detected in more than half the samples collected over the course of the study. Dissolved pesticide concentrations ranged from below their method detection limits to 36,000 nanograms per liter (boscalid). The most frequently detected pesticides in water from all the watersheds were azoxystrobin, boscalid, chlorpyrifos, DCPA, diazinon, oxyfluorfen, prometryn, and propyzamide, which were found in more than 80 percent of the samples. On average, detection frequencies and concentrations were higher in samples collected during winter storm events compared to the summer dry season. With the exception of the fungicide myclobutanil, the Santa Maria estuary watershed exhibited higher pesticide detection frequencies than the Pajaro and Salinas watersheds. Bed and suspended sediment samples were analyzed for 55 pesticides using accelerated solvent extraction, gel permeation chromatography for sulfur removal, and carbon/alumina stacked solid-phase extraction cartridges to remove interfering sediment matrices. In bed sediment samples, 17 pesticides were detected, including pyrethroid and organophosphate (OP) insecticides, p,p'-DDT and its degradates, as well as several herbicides. The only pesticides detected more than half the time were p,p'-DDD, p,p'-DDE, and p,p'-DDT. Maximum pesticide concentrations ranged from less than their respective method detection limits to 234 micrograms per kilogram (p,p'-DDE). Four pyrethroids (bifenthrin, λ-cyhalothrin, permethrin, and τ-fluvalinate) were detected in bed sediment samples, though concentrations were relatively low (less than 10 micrograms per kilogram). The greatest number of pesticides was detected in samples collected from Lower Orcutt Creek, the major tributary to the Santa Maria estuary. In suspended sediment samples, 19 pesticides were detected, and maximum concentrations ranged from less than the method detection limits to 549 micrograms per kilogram (chlorpyrifos). The most frequently detected pesticides were p,p'-DDE (49 percent), p,p'-DDT (38 percent), and chlorpyrifos (32 percent). During storm events, 19 pesticides were detected in suspended sediment samples, compared to 10 detected during the dry season. Pesticide concentrations were also commonly higher in suspended sediments during storm events than during the dry season.
Station Set Residual: Event Classification Using Historical Distribution of Observing Stations
NASA Astrophysics Data System (ADS)
Procopio, Mike; Lewis, Jennifer; Young, Chris
2010-05-01
Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and to its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event, versus the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
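The abstract notes that the Station Set Residual can be quantified in many ways. One simple formulation, shown below as a hedged sketch, penalizes unexpected detecting stations and missing expected stations according to historical detection frequencies for similar nearby events; the scoring form is our assumption, not the paper's specific metric.

```python
# Sketch of one possible Station Set Residual: compare the observed detecting
# stations against historical per-station detection probabilities for similar
# past events. The scoring rule here is an illustrative assumption.
def station_set_residual(observed, history):
    """observed: set of station codes detecting the event.
    history: dict station -> fraction of similar past events it detected."""
    score = 0.0
    for sta, p in history.items():
        if sta in observed:
            score += (1.0 - p)   # penalty for an unexpected station
        else:
            score += p           # penalty for a missing expected station
    return score / max(len(history), 1)

# Low scores mean the observed station set matches historical expectation;
# high scores flag hypothesized events as candidates for screening out.
```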
Vertically Integrated Seismological Analysis II: Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / (π(x)q(x′ | x))). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or human analysts.
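The acceptance rule above is the standard Metropolis-Hastings step. A generic sketch follows, using log densities for numerical stability; the move-specific proposal machinery (birth, death, split, merge, swap) is abstracted into a propose() callable, which is our simplification rather than the authors' implementation.

```python
# Generic Metropolis-Hastings step matching the acceptance formula in the
# abstract, written with log densities. propose() stands in for the paper's
# birth/death/split/merge/swap move set.
import math, random

def mh_step(x, log_pi, propose):
    """One M-H transition. propose(x) returns (x_new, log_q_fwd, log_q_rev),
    i.e. the candidate world and log q(x_new | x), log q(x | x_new)."""
    x_new, log_q_fwd, log_q_rev = propose(x)
    log_alpha = (log_pi(x_new) + log_q_rev) - (log_pi(x) + log_q_fwd)
    if math.log(random.random()) < min(0.0, log_alpha):
        return x_new, True    # move accepted
    return x, False           # move rejected; chain stays at x
```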
Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A
2017-11-01
Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long duration, and random interstimulus interval. The decoding scores were considerably lower for eye movements relative to other movement types tested. The average successful rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average successful rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc.
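The core decoding operation, scoring each candidate movement type by the summed overlap between its normalized ALE meta-map and an SPFM event map, can be sketched as below. The voxelwise-array representation and the normalization details are assumptions for illustration; the study's exact preprocessing differs.

```python
# Sketch of the summation-overlap decoding step: each label's ALE meta-map
# is normalized and multiplied voxelwise against the SPFM event map; the
# label with the largest summed overlap is the decoded event type.
import numpy as np

def decode_event(spfm_map, meta_maps):
    """spfm_map: voxelwise array for one detected event.
    meta_maps: dict label -> ALE meta-map on the same grid."""
    scores = {}
    s = np.clip(spfm_map, 0, None)            # keep positive BOLD weights
    for label, meta in meta_maps.items():
        m = meta / (meta.sum() + 1e-12)       # normalize the meta-map
        scores[label] = float((m * s).sum())  # summation overlap
    return max(scores, key=scores.get), scores
```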
Dick, Jeffrey E.; Hilterbrand, Adam T.; Boika, Aliaksei; Upton, Jason W.; Bard, Allen J.
2015-01-01
We report observations of stochastic collisions of murine cytomegalovirus (MCMV) on ultramicroelectrodes (UMEs), extending the observation of discrete collision events on UMEs to biologically relevant analytes. Adsorption of an antibody specific for a virion surface glycoprotein allowed differentiation of MCMV from MCMV bound by antibody from the collision frequency decrease and current magnitudes in the electrochemical collision experiments, which shows the efficacy of the method to size viral samples. To add selectivity to the technique, interactions between MCMV, a glycoprotein-specific primary antibody to MCMV, and polystyrene bead “anchors,” which were functionalized with a secondary antibody specific to the Fc region of the primary antibody, were used to affect virus mobility. Bead aggregation was observed, and the extent of aggregation was measured using the electrochemical collision technique. Scanning electron microscopy and optical microscopy further supported aggregate shape and extent of aggregation with and without MCMV. This work extends the field of collisions to biologically relevant antigens and provides a novel foundation upon which qualitative sensor technology might be built for selective detection of viruses and other biologically relevant analytes. PMID:25870261
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, this lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr). The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for detection and quantification of 3272.
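The conversion factor enters the standard quantification arithmetic for event-specific real-time PCR: the GMO amount is the event-to-endogenous copy-number ratio divided by Cf. A worked sketch follows; the copy numbers in the example are illustrative, while the Cf value 0.60 is the reported ABI 7900 figure.

```python
# Worked sketch of Cf-based GMO quantification, following the general scheme
# the abstract describes: GMO (%) = (event copies / endogenous copies) / Cf * 100.
# Copy numbers are read off per-target standard curves; values below are
# illustrative, not data from the study.
def gmo_percent(event_copies, endogenous_copies, cf):
    return (event_copies / endogenous_copies) / cf * 100.0

# Example with the reported ABI 7900 conversion factor (Cf = 0.60): a sample
# measuring 60 event copies against 10,000 endogenous copies quantifies as
# gmo_percent(60, 10_000, 0.60) == 1.0 (%).
```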
NASA Astrophysics Data System (ADS)
Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros
2017-10-01
The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in changes of |0.4| and |0.18| mGal in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
... monitoring will achieve detection and quantification of CO 2 in the event surface leakage occurs. The UIC... leakage detection monitoring system or technical specifications should also be described in the MRV plan... of injected CO 2 or from another cause (e.g. natural variability). The MRV plan leakage detection and...
ROKU: a novel method for identification of tissue-specific genes.
Kadota, Koji; Ye, Jiazhen; Nakai, Yuji; Terada, Tohru; Shimizu, Kentaro
2006-06-12
One of the important goals of microarray research is the identification of genes whose expression is considerably higher or lower in some tissues than in others. We would like to have ways of identifying such tissue-specific genes. We describe a method, ROKU, which selects tissue-specific patterns from gene expression data for many tissues and thousands of genes. ROKU ranks genes according to their overall tissue specificity using Shannon entropy and detects tissues specific to each gene if any exist using an outlier detection method. We evaluated the capacity for the detection of various specific expression patterns using synthetic and real data. We observed that ROKU was superior to a conventional entropy-based method in its ability to rank genes according to overall tissue specificity and to detect genes whose expression pattern are specific only to objective tissues. ROKU is useful for the detection of various tissue-specific expression patterns. The framework is also directly applicable to the selection of diagnostic markers for molecular classification of multiple classes.
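ROKU's ranking step scores each gene by the Shannon entropy of its processed expression profile across tissues: a uniformly expressed gene has maximal entropy, while a tissue-specific one has low entropy. The sketch below shows only this entropy ranking on an assumed nonnegative profile; ROKU additionally preprocesses each vector (Tukey biweight centering and absolute values) and applies an outlier test, which are omitted here.

```python
# Sketch of the entropy-ranking step of ROKU: low Shannon entropy of a gene's
# cross-tissue expression profile indicates higher tissue specificity. The
# preprocessing and outlier-detection steps of the full method are omitted.
import numpy as np

def profile_entropy(expr):
    """Shannon entropy (bits) of a nonnegative expression vector over tissues."""
    p = np.asarray(expr, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Ranking genes by ascending entropy puts the most tissue-specific first:
# ranked = sorted(genes, key=lambda g: profile_entropy(expression[g]))
```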
Daytime identification of summer hailstorm cells from MSG data
NASA Astrophysics Data System (ADS)
Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.
2014-04-01
Identifying deep convection is of paramount importance, as it may be associated with extreme weather phenomena that have significant impact on the environment, property and populations. A new method, the hail detection tool (HDT), is described for identifying hail-bearing storms using multispectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the convective mask (CM) algorithm devised for detection of deep convection, and the second a hail mask (HM) algorithm for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HM are based on logistic regression models trained with multispectral MSG data sets comprising summer convective events in the middle Ebro Valley (Spain) between 2006 and 2010, detected by the RGB (red-green-blue) visualization technique (CM) or the C-band weather radar system of the University of León (HM). By means of the logistic regression approach, the probabilities of identifying a cumulonimbus event with CM or a hail event with HM are computed by exploiting a proper selection of MSG wavelengths or their combinations. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret the results of the statistical models from a meteorological perspective, using an ingredients-based method. Finally, HDT was applied to a new validation sample consisting of events during summer 2011. The overall probability of detection was 76.9% and the false alarm ratio 16.7%.
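The two-stage logistic screening can be sketched as below. The coefficient vectors, predictor choices, and probability cutoffs are placeholders, not the fitted CM/HM models; the point is only the cascade structure (first screen for deep convection, then for hail).

```python
# Illustrative two-stage screening in the spirit of HDT: one logistic model
# for P(cumulonimbus) and a second for P(hail), applied only to pixels that
# pass the first stage. All coefficients and thresholds are placeholders.
import numpy as np

def logistic(x, beta):
    """beta[0] is the intercept; x holds the per-pixel MSG predictors."""
    return 1.0 / (1.0 + np.exp(-(beta[0] + np.dot(beta[1:], x))))

def hdt_pixel(x_cm, beta_cm, x_hm, beta_hm, p_cm=0.5, p_hm=0.5):
    if logistic(x_cm, beta_cm) < p_cm:
        return "no deep convection"
    return "hail-bearing" if logistic(x_hm, beta_hm) >= p_hm else "cumulonimbus"
```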
Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing
NASA Astrophysics Data System (ADS)
LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.
2017-12-01
With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, (1) the algorithm adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may have resulted from data quality issues or natural events.
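The spatial-temporal intuition above can be captured by a simple score that compares a window against both the sensor's own history (temporal) and its neighbors' concurrent readings (spatial). The combination rule below is an assumption for illustration, not the paper's learned model.

```python
# Sketch of a spatial-temporal anomaly score for one sensor window: departure
# from the node's own history times departure from its spatial neighbors.
# The multiplicative combination is an illustrative assumption.
import numpy as np

def window_anomaly_score(history, window, neighbor_windows):
    """history: 1-D array of the node's past values; window: current values;
    neighbor_windows: list of concurrent windows from nearby sensor nodes."""
    mu, sd = history.mean(), history.std() + 1e-9
    temporal = abs(window.mean() - mu) / sd          # departure from own history
    nbr = np.array([w.mean() for w in neighbor_windows])
    spatial = abs(window.mean() - nbr.mean()) / (nbr.std() + 1e-9)
    # A window departing from both its own history and its neighbors is more
    # likely a sensor fault; departing together with neighbors suggests a
    # shared natural event rather than a data quality issue.
    return temporal * spatial
```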
Yu, Ke; Wang, Yue; Shen, Kaiquan; Li, Xiaoping
2013-01-01
The common spatial pattern analysis (CSP), a frequently utilized feature extraction method in brain-computer-interface applications, is believed to be time-invariant and sensitive to noise, mainly due to an inherent shortcoming of relying purely on spatial filtering. Therefore, temporal/spectral filtering, which can be very effective in counteracting the unfavorable influence of noise, is usually used as a supplement. This work integrates the CSP spatial filters with complex channel-specific finite impulse response (FIR) filters in a natural and intuitive manner. Each hybrid spatial-FIR filter is high-order, data-driven, and unique to its corresponding channel. They are derived by introducing multiple time delays and regularization into conventional CSP. The general framework of the method follows that of CSP but performs better, as proven in single-trial classification tasks such as event-related potential detection and motor imagery.
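Conventional CSP solves a generalized eigenvalue problem on class covariance matrices; introducing time-delayed copies of each channel before that step is what lets the same eigendecomposition yield channel-specific FIR-like weights. The sketch below shows both pieces under stated assumptions; the regularization constant and data layout are illustrative, and the paper's exact formulation may differ.

```python
# Sketch of conventional CSP plus time-delay embedding, so the learned
# eigenvectors carry spatio-temporal (FIR-like) weights per channel.
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, reg=1e-6):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns filters (columns) sorted by discriminability between classes."""
    def avg_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    ca, cb = avg_cov(trials_a), avg_cov(trials_b)
    n = ca.shape[0]
    vals, vecs = eigh(ca, ca + cb + reg * np.eye(n))  # generalized eigenproblem
    return vecs[:, np.argsort(vals)[::-1]]

def delay_embed(trial, n_delays):
    """Stack delayed copies of each channel: (n_channels, T) becomes
    (n_channels * n_delays, T - n_delays + 1), so CSP on the embedded data
    learns an FIR filter of length n_delays per channel."""
    c, t = trial.shape
    return np.vstack([trial[:, d:t - n_delays + 1 + d] for d in range(n_delays)])
```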
Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain
NASA Astrophysics Data System (ADS)
Krauß, Thomas; Fischer, Peter
2016-08-01
In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News reports show that the majority of natural hazards that occur are flood events, and many flood prediction systems have accordingly been developed. However, most existing systems for deriving areas endangered by flooding are based only on horizontal and vertical distances to existing rivers and lakes, and typically do not take into account dangers arising directly from heavy rain events. In a study conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to yield a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and evaluate the results using the available insurance data.
Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events
NASA Astrophysics Data System (ADS)
Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.
2008-12-01
To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
Zhang, Xiaobing; Tang, Qiaoling; Wang, Xujing; Wang, Zhixing
2016-01-01
In this study, the flanking sequence of an inserted fragment conferring glyphosate tolerance on transgenic cotton line BG2-7 was analyzed by thermal asymmetric interlaced polymerase chain reaction (TAIL-PCR) and standard PCR. The results showed apparent insertion of the exogenous gene into chromosome D10 of the Gossypium hirsutum L. genome, as the left and right borders of the inserted fragment are nucleotides 61,962,952 and 61,962,921 of chromosome D10, respectively. In addition, a 31-bp cotton microsatellite sequence was noted between the genome sequence and the 5' end of the exogenous gene. In total, 84 and 298 bp were deleted from the left and right borders of the exogenous gene, respectively, with 30 bp deleted from the cotton chromosome at the insertion site. According to the flanking sequence obtained, several pairs of event-specific detection primers were designed to amplify the sequence between the 5' end of the exogenous gene and the cotton genome junction region, as well as between the 3' end and the cotton genome junction region. Based on screening tests, the 5'-end primer pair GTCATAACGTGACTCCCTTAATTCTCC/CCTATTACACGGCTATGC and the 3'-end primer pair TCCTTTCGCTTTCTTCCCTT/ACACTTACATGGCGTCTTCT were selected as the BG2-7 event-specific detection primers. The limit of detection of the former pair reached 44 copies, and that of the latter pair reached 88 copies. The results of this study provide useful data for assessment of BG2-7 safety and for accelerating its industrialization.
NASA Astrophysics Data System (ADS)
Touati, Sarah; Naylor, Mark; Main, Ian
2016-02-01
The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global 'background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and 'runs' tests, and a variety of Z-score tests. The results demonstrate that the effectiveness of these tests varies widely. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved relatively ineffective in reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large events worldwide has increased in recent years.
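The Information Criteria comparison the abstract favors can be sketched for the simplest case: a constant-rate Poisson model against a one-change-point model on a declustered catalog, scored by BIC as a cheap approximation to the Bayes factor. The parameter counts and candidate-point search below are illustrative choices, not the paper's exact procedure, and the catalog is assumed to span [t0, t1] with t0 strictly before the first event.

```python
# Sketch of a BIC-based rate-change test on a declustered catalog: compare a
# constant-rate Poisson model against a single-change-point model.
import numpy as np

def poisson_loglik(n, duration, rate):
    """Log-likelihood of n events in 'duration' under a Poisson rate."""
    return n * np.log(rate) - rate * duration

def rate_change_test(event_times, t0, t1):
    """Lower BIC is preferred; returns (BIC_constant, BIC_change, change_time)."""
    times = np.sort(np.asarray(event_times, dtype=float))
    n, T = len(times), t1 - t0
    bic_const = -2 * poisson_loglik(n, T, n / T) + 1 * np.log(n)
    best_bic, best_tc = np.inf, None
    for tc in times[1:-1]:               # candidate change points at event times
        n1 = np.searchsorted(times, tc)  # events before the change point
        ll = (poisson_loglik(n1, tc - t0, n1 / (tc - t0))
              + poisson_loglik(n - n1, t1 - tc, (n - n1) / (t1 - tc)))
        bic = -2 * ll + 3 * np.log(n)    # two rates plus the change time
        if bic < best_bic:
            best_bic, best_tc = bic, tc
    return bic_const, best_bic, best_tc
```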
Surface Plasmon Resonance Label-Free Monitoring of Antibody Antigen Interactions in Real Time
ERIC Educational Resources Information Center
Kausaite, Asta; van Dijk, Martijn; Castrop, Jan; Ramanaviciene, Almira; Baltrus, John P.; Acaite, Juzefa; Ramanavicius, Arunas
2007-01-01
Detection of biologically active compounds is one of the most important topics in molecular biology and biochemistry. One of the most promising detection methods is based on the application of surface plasmon resonance for label-free detection of biologically active compounds. This method allows one to monitor binding events in real time without…
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
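The chemistry side of the approach above rests on fingerprint similarity between drugs. A hedged sketch follows: Tanimoto similarity on binary fingerprints, blended with a data-mining signal score. The blending rule and weight are illustrative assumptions, not the paper's model; fingerprints are assumed precomputed (e.g., as sets of on-bit positions).

```python
# Sketch of fingerprint-based signal enhancement: Tanimoto similarity between
# binary molecular fingerprints, blended with a disproportionality score from
# AERS mining. The blend weight is an illustrative placeholder.
def tanimoto(fp_a, fp_b):
    """fp_a, fp_b: sets of 'on' bit positions in two molecular fingerprints."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def enhanced_signal(base_score, drug_fp, positive_fps, weight=0.5):
    """Boost a drug's ADE signal by its max similarity to drugs with known
    evidence for the same adverse event."""
    sim = max((tanimoto(drug_fp, fp) for fp in positive_fps), default=0.0)
    return (1 - weight) * base_score + weight * sim
```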
Misu, Shogo; Asai, Tsuyoshi; Ono, Rei; Sawa, Ryuichi; Tsutsumimoto, Kota; Ando, Hiroshi; Doi, Takehiko
2017-09-01
The heel is likely a suitable location to which inertial sensors are attached for the detection of gait events. However, there are few studies to detect gait events and determine temporal gait parameters using sensors attached to the heels. We developed two methods to determine temporal gait parameters: detecting heel-contact using acceleration and detecting toe-off using angular velocity data (acceleration-angular velocity method; A-V method), and detecting both heel-contact and toe-off using angular velocity data (angular velocity-angular velocity method; V-V method). The aim of this study was to examine the concurrent validity of the A-V and V-V methods against the standard method, and to compare their accuracy. Temporal gait parameters were measured in 10 younger and 10 older adults. The intra-class correlation coefficients were excellent in both methods compared with the standard method (0.80 to 1.00). The root mean square errors of stance and swing time in the A-V method were smaller than the V-V method in older adults, although there were no significant discrepancies in the other comparisons. Our study suggests that inertial sensors attached to the heels, using the A-V method in particular, provide a valid measurement of temporal gait parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
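The A-V method above pairs acceleration-based heel-contact detection with angular-velocity-based toe-off detection. A hedged sketch of that pairing follows; the peak-height percentiles and minimum event separation are illustrative assumptions, not the validated parameters of the study.

```python
# Hedged sketch in the spirit of the A-V method: heel-contact from peaks in
# heel acceleration magnitude, toe-off from peaks in sagittal angular
# velocity. Thresholds and minimum separation are illustrative only.
import numpy as np
from scipy.signal import find_peaks

def detect_gait_events(acc_mag, gyro_sagittal, fs):
    min_sep = int(0.4 * fs)   # assume no two same-type events within 0.4 s
    hc, _ = find_peaks(acc_mag, distance=min_sep,
                       height=np.percentile(acc_mag, 90))
    to, _ = find_peaks(gyro_sagittal, distance=min_sep,
                       height=np.percentile(gyro_sagittal, 90))
    return hc / fs, to / fs   # heel-contact and toe-off times in seconds

# Temporal parameters then follow directly: stance time is toe-off minus the
# preceding heel-contact; swing time is the next heel-contact minus toe-off.
```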
Comprehensive methods for earlier detection and monitoring of forest decline
Jennifer Pontius; Richard Hallett
2014-01-01
Forested ecosystems are threatened by invasive pests, pathogens, and unusual climatic events brought about by climate change. Earlier detection of incipient forest health problems and a quantitatively rigorous assessment method is increasingly important. Here, we describe a method that is adaptable across tree species and stress agents and practical for use in the...
Detecting NEO Impacts using the International Monitoring System
NASA Astrophysics Data System (ADS)
Brown, Peter G.; Dube, Kimberlee; Silber, Elizabeth
2014-11-01
As part of the verification regime for the Comprehensive Nuclear Test Ban Treaty, an International Monitoring System (IMS) consisting of seismic, hydroacoustic, infrasound and radionuclide technologies has been globally deployed beginning in the late 1990s. The infrasound network sub-component of the IMS consists of 47 active stations as of mid-2014. These microbarograph arrays detect coherent infrasonic signals from a range of sources including volcanoes, man-made explosions and bolides. Bolide detections from IMS stations have been reported since ~2000, but with the maturation of the network over the last several years the rate of detections has increased substantially. Presently the IMS performs semi-automated near real-time global event identification on timescales of 6-12 hours, as well as analyst-verified event identification with time lags of several weeks. Here we report on infrasound events identified by the IMS between 2010 and 2014 which are likely bolide impacts. Identification in this context refers to an event being included in one of the event bulletins issued by the IMS. In this untargeted study we find that the IMS globally identifies approximately 16 events per year which are likely bolide impacts. Using data on US Government sensor detections of fireballs released since the beginning of 2014 (as given at http://neo.jpl.nasa.gov/fireballs/), we find in a complementary targeted survey that the current IMS system is able to identify ~25% of fireballs with E > 0.1 kT energy. Using all 16 US Government sensor fireballs listed as of July 31, 2014, we are able to detect infrasound from 75% of these events on at least one IMS station. The high ratio of detection to identification is a product of the stricter criteria adopted by the IMS for inclusion in an event bulletin as compared to simple station detection. We discuss comparisons between infrasound-estimated energies based on amplitudes and periods and estimates provided by US Government sensors. Specific impact events of interest will be discussed, as well as the utility of the global IMS infrasound system for location and timing of future NEAs detected prior to impact.
Local Explosion Monitoring using Rg
NASA Astrophysics Data System (ADS)
O'Rourke, C. T.; Baker, G. E.
2016-12-01
Rg is the high-frequency fundamental-mode Rayleigh wave, which is only excited by near-surface events. As such, an Rg detection indicates that a seismic source is shallow, generally less than a few km deep depending on the velocity structure, and so likely man-made. Conversely, the absence of Rg can indicate that the source is deeper and so likely naturally occurring. We have developed a new automated method of detecting Rg arrivals from various explosion sources at local distances, and a process for estimating the likelihood that a source is not shallow when no Rg is detected. Our Rg detection method scans the spectrogram of a seismic signal for a characteristic frequency peak. We test this on the Bighorn Arch Seismic Experiment data, which includes earthquakes, active-source explosions in boreholes, and mining explosions recorded on a dense network that spans the Bighorn Mountains and Powder River Basin. The Rg passbands used were 0.4-0.8 Hz for mining blasts and 0.8-1.2 Hz for borehole shots. We successfully detect Rg across the full network for most mining blasts. The lower-yield shots are detectable out to 50 km. We achieve a <1% false-positive rate for the small-magnitude earthquakes in the region. Rg detections on known non-shallow earthquake seismograms indicate that they are largely due to windowing leakage at very close distances or occasionally to cultural noise. We compare our results to existing methods that use cross-correlation to detect retrograde motion of the surface waves. Our method shows more complete detection across the network, especially in the Powder River Basin, where Rg exhibits prograde motion that does not trigger the existing detector. We also estimate the likelihood that Rg would have been detected from a surface source, based on the measured P amplitude. For example, an event with a large P wave and no detectable Rg would have a high probability of being a deeper event, whereas we cannot confidently determine whether an event with a small P wave and no Rg detection is shallow or not. These results allow us to detect Rg arrivals, which indicate a shallow source, and to use the absence of Rg to estimate the likelihood that a source in a calibrated region is not shallow enough to be man-made.
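The spectrogram scan for a band-limited energy peak can be sketched as below. The window length, frequency bands, and in-band/out-of-band power ratio are illustrative stand-ins for the paper's tuned detector, using the 0.4-0.8 Hz mining-blast passband from the abstract as the default.

```python
# Sketch of the described detector: scan a spectrogram for a spectral peak in
# the expected Rg passband. The ratio criterion and 20 s windows are assumed.
import numpy as np
from scipy.signal import spectrogram

def detect_rg(trace, fs, band=(0.4, 0.8), ratio=3.0):
    f, t, sxx = spectrogram(trace, fs=fs, nperseg=int(20 * fs))
    in_band = (f >= band[0]) & (f <= band[1])
    out_band = ~in_band & (f > 0.1) & (f < 5.0)
    for k in range(len(t)):
        # Flag windows where in-band power stands well above the background.
        if sxx[in_band, k].mean() > ratio * sxx[out_band, k].mean():
            return True, t[k]   # Rg energy peak found at time t[k]
    return False, None
```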
Monitoring hydrofrac-induced seismicity by surface arrays - the DHM-Project Basel case study
NASA Astrophysics Data System (ADS)
Blascheck, P.; Häge, M.; Joswig, M.
2012-04-01
The method "nanoseismic monitoring" was applied during the hydraulic stimulation at the Deep-Heat-Mining-Project (DHM-Project) Basel. Two small arrays in a distance of 2.1 km and 4.8 km to the borehole recorded continuously for two days. During this time more than 2500 seismic events were detected. The method of the surface monitoring of induced seismicity was compared to the reference which the hydrofrac monitoring presented. The latter was conducted by a network of borehole seismometers by Geothermal Explorers Limited. Array processing provides a outlier resistant, graphical jack-knifing localization method which resulted in a average deviation towards the reference of 850 m. Additionally, by applying the relative localization master-event method, the NNW-SSE strike direction of the reference was confirmed. It was shown that, in order to successfully estimate the magnitude of completeness as well as the b-value at the event rate and detection sensibility present, 3 h segments of data are sufficient. This is supported by two segment out of over 13 h of evaluated data. These segments were chosen so that they represent a time during the high seismic noise during normal working hours in daytime as well as the minimum anthropogenic noise at night. The low signal-to-noise ratio was compensated by the application of a sonogram event detection as well as a coincidence analysis within each array. Sonograms allow by autoadaptive, non-linear filtering to enhance signals whose amplitudes are just above noise level. For these events the magnitude was determined by the master-event method, allowing to compute the magnitude of completeness by the entire-magnitude-range method provided by the ZMAP toolbox. Additionally, the b-values were determined and compared to the reference values. An introduction to the method of "nanoseismic monitoring" will be given as well as the comparison to reference data in the Basel case study.
Detecting modification of biomedical events using a deep parsing approach
2012-01-01
Background: This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method: To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results: Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions: Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
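The shallow part of the feature set lends itself to a compact sketch. Below, a maximum-entropy classifier (multinomial logistic regression in scikit-learn) is trained on bag-of-words features from a 3-token window around the trigger word; the tiny training set and feature names are hypothetical placeholders, and the deep-parser features are omitted.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def window_features(tokens, trigger, width=3):
    """Bag-of-words features from a small context window around the trigger."""
    lo, hi = max(0, trigger - width), min(len(tokens), trigger + width + 1)
    feats = {f"w={tokens[i]}": 1 for i in range(lo, hi) if i != trigger}
    feats[f"trigger={tokens[trigger]}"] = 1
    return feats

# hypothetical examples: (tokens, trigger index, modification label)
train = [
    (["inhibition", "of", "IkappaBalpha", "phosphorylation"], 3, "negation"),
    (["analysis", "of", "IkappaBalpha", "phosphorylation"], 3, "speculation"),
]
vec = DictVectorizer()
X = vec.fit_transform([window_features(t, i) for t, i, _ in train])
y = [label for _, _, label in train]
clf = LogisticRegression(max_iter=1000).fit(X, y)  # MaxEnt learner
```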
Schwind, Jessica S.; Wolking, David J.; Brownstein, John S.; Mazet, Jonna A. K.; Smith, Woutrina A.
2014-01-01
Digital disease detection tools are technologically sophisticated, but dependent on digital information, which for many areas suffering from high disease burdens is simply not an option. In areas where news is often reported in local media with no digital counterpart, integration of local news information with digital surveillance systems, such as HealthMap (Boston Children’s Hospital), is critical. Little research has been published regarding the specific contribution of local health-related articles to digital surveillance systems. In response, the USAID PREDICT project implemented a local media surveillance (LMS) pilot study in partner countries to monitor disease events reported in print media. This research assessed the potential of LMS to enhance digital surveillance reach in five low- and middle-income countries. Over 16 weeks, select surveillance system attributes of LMS, such as simplicity, flexibility, acceptability, timeliness, and stability, were evaluated to identify strengths and weaknesses in the surveillance method. Findings revealed that LMS filled gaps in digital surveillance network coverage by contributing valuable localized information on disease events to the global HealthMap database. A total of 87 health events were reported through the LMS pilot in the 16-week monitoring period, including 71 unique reports not found by the HealthMap digital detection tool. Furthermore, HealthMap identified an additional 236 health events outside of LMS. It was also observed that participants' belief in the importance of the project and proper source selection were crucial to the success of this method. The timely identification of disease outbreaks near points of emergence and the recognition of risk factors associated with disease occurrence continue to be important components of any comprehensive surveillance system for monitoring disease activity across populations. The LMS method, with its minimal resource commitment, could be one tool used to address the information gaps seen in global ‘hot spot’ regions. PMID:25333618
Content analysis of 150 years of British periodicals.
Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Cristianini, Nello
2017-01-24
Previous studies have shown that it is possible to detect macroscopic patterns of cultural change over periods of centuries by analyzing large textual time series, specifically digitized books. This method promises to empower scholars with a quantitative and data-driven tool to study culture and society, but its power has been limited by the use of data from books and simple analytics based essentially on word counts. This study addresses these problems by assembling a vast corpus of regional newspapers from the United Kingdom, incorporating very fine-grained geographical and temporal information that is not available for books. The corpus spans 150 years and is formed by millions of articles, representing 14% of all British regional outlets of the period. Simple content analysis of this corpus allowed us to detect specific events, like wars, epidemics, coronations, or conclaves, with high accuracy, whereas the use of more refined techniques from artificial intelligence enabled us to move beyond counting words by detecting references to named entities. These techniques allowed us to observe both a systematic underrepresentation and a steady increase of women in the news during the 20th century and the change of geographic focus for various concepts. We also estimate the dates when electricity overtook steam and trains overtook horses as a means of transportation, both around the year 1900, along with observing other cultural transitions. We believe that these data-driven approaches can complement the traditional method of close reading in detecting trends of continuity and change in historical corpora.
Ma, Xiaoyue; Niezgoda, Michael; Blanton, Jesse D; Recuenco, Sergio; Rupprecht, Charles E
2012-08-03
Two major techniques are currently used to estimate rabies virus antibody values: neutralization assays, such as the rapid fluorescent focus inhibition test (RFFIT), and enzyme-linked immunosorbent assays (ELISAs). The RFFIT is considered the gold standard assay and has been used to assess the titer of rabies virus neutralizing antibodies for more than three decades. In the late 1970s, ELISA began to be used to estimate the level of rabies virus antibody and has recently been used by some laboratories as an alternate screening test for animal sera. Although the ELISA appears simpler, safer and more efficient, the assay is less sensitive in detecting low values of rabies virus neutralizing antibodies than neutralization tests. This study was designed to evaluate a new ELISA-based method for detecting rabies virus binding antibody. This new technique uses electro-chemi-luminescence labels and carbon electrode plates to detect binding events. In this comparative study, the RFFIT and the new ELISA-based technique were used to evaluate the level of rabies virus antibodies in human and animal serum samples. By using a conservative approximation of 0.15 IU/ml as a cutoff point, the new ELISA-based technique demonstrated a sensitivity of 100% and a specificity of 95% for human samples and for experimental animal samples. The sensitivity and specificity for field animal samples was 96% and 95%, respectively. The preliminary results from this study appear promising and demonstrate a higher sensitivity than traditional ELISA methods. Published by Elsevier Ltd.
Britton, Jr., Charles L.; Wintenberg, Alan L.
1993-01-01
A radiation detection method and system for continuously correcting the quantization of detected charge during pulse pile-up conditions. Charge pulses from a radiation detector responsive to the energy of detected radiation events are converted, by means of a charge-sensitive preamplifier, to voltage pulses of predetermined shape whose peak amplitudes are proportional to the quantity of charge of each corresponding detected event. These peak amplitudes are sampled and stored sequentially in accordance with their respective times of occurrence. Based on the stored peak amplitudes and times of occurrence, a correction factor is generated which represents the fraction of a previous pulse's influence on the following pulse's peak amplitude. This correction factor is subtracted from the following pulse amplitude in a summing amplifier, whose output then represents the corrected charge quantity measurement.
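A minimal numerical sketch of the correction idea: each stored peak is reduced by the estimated residual of the preceding pulse at its time of occurrence. An exponential tail with decay constant `tau` is assumed here for illustration; the true shape is fixed by the preamplifier's pulse shaping.

```python
import numpy as np

def corrected_amplitudes(peaks, times, tau):
    """Subtract each prior pulse's estimated residual from the next peak
    (pile-up correction under an assumed exponential tail)."""
    peaks = np.asarray(peaks, dtype=float)
    out = np.empty_like(peaks)
    out[0] = peaks[0]
    for i in range(1, len(peaks)):
        dt = times[i] - times[i - 1]
        residual = out[i - 1] * np.exp(-dt / tau)  # fraction of prior pulse remaining
        out[i] = peaks[i] - residual               # corrected charge measurement
    return out
```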
NASA Astrophysics Data System (ADS)
Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.
2015-12-01
The quality of automatic detections from seismic sensor networks depends on a large number of data processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings. Yet achieving superior automatic detection of seismic events is closely tied to these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historic data. AST learns to test the raw signal against all event settings and automatically self-tunes to an emerging event in real time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections. Reducing false alarms early in the seismic pipeline processing will have a significant impact on this goal. Applicable both to boosting the performance of existing sensors and to new sensor deployments, this system provides an important new method to automatically tune complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible by manual tuning, with much less time and effort devoted to the tuning process. With ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
Optimal filter parameters for low SNR seismograms as a function of station and event location
NASA Astrophysics Data System (ADS)
Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.
1999-06-01
Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few usable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower-SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low-SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR are used to indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low-SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low-SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
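A sketch of the band-decomposition step under stated assumptions: log-spaced (constant-Q) band edges, fourth-order Butterworth band-passes, and SNR measured as the ratio of signal-window to noise-window standard deviations. The band count and threshold are illustrative, not the authors' values.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def optimal_band(trace, fs, noise_slice, signal_slice,
                 fmin=0.5, fmax=10.0, n_bands=10, snr_min=2.0):
    """Split [fmin, fmax] into constant-Q bands, measure per-band SNR from
    pre-event noise vs. signal windows, and return the span of high-SNR bands."""
    edges = np.geomspace(fmin, fmax, n_bands + 1)  # constant Q: log-spaced edges
    keep = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        y = sosfiltfilt(sos, trace)
        snr = np.std(y[signal_slice]) / max(np.std(y[noise_slice]), 1e-12)
        if snr >= snr_min:
            keep.append((lo, hi))
    return (keep[0][0], keep[-1][1]) if keep else None  # cutoff filter limits
```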
An iterative matching and locating technique for borehole microseismic monitoring
NASA Astrophysics Data System (ADS)
Chen, H.; Meng, X.; Niu, F.; Tang, Y.
2016-12-01
Microseismic monitoring has been proven to be an effective and valuable technology for imaging hydraulic fracture geometry. The success of hydraulic fracturing monitoring relies on the detection and characterization (i.e., location and focal mechanism estimation) of a maximum number of induced microseismic events. All the events are important for quantifying the stimulated reservoir volume (SRV) and characterizing the newly created fracture network. Detecting and locating low-magnitude events, however, is notoriously difficult, particularly in a noisy production environment. Here we propose an iterative matching and locating technique (iMLT) to obtain a maximum detection of small events and the best determination of their locations from continuous data recorded by a single-azimuth downhole geophone array. Because the downhole array is located at one azimuth, regular M&L using P-wave cross-correlation alone cannot resolve the location of a matched event relative to the template event. We therefore introduce the polarization direction into the matching, which significantly improves the lateral resolution of the M&L method, based on numerical simulations with synthetic data. Our synthetic tests further indicate that the inclusion of S-wave cross-correlation data can help better constrain the focal depth of the matched events. We apply this method to a dataset recorded during the hydraulic fracturing treatment of a pilot horizontal well within the shale play in southwest China. Our approach yields a more than fourfold increase in the number of located events, compared with the original event catalog from traditional downhole processing.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become ubiquitous. This has been accompanied by a growth in illicit events, and computer and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify such events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment with timeliness suited to the application.
Li, Feiwu; Yan, Wei; Long, Likun; Qi, Xing; Li, Congcong; Zhang, Shihong
2014-01-01
The cry2Ab and cry3A genes are two of the most important insect-resistance exogenous genes and have been widely used in genetically-modified crops. To develop more effective alternatives for the quick identification of genetically-modified organisms (GMOs) containing these genes, a rapid and visual loop-mediated isothermal amplification (LAMP) method to detect the cry2Ab and cry3A genes is described in this study. The LAMP assay can be finished within 60 min under an isothermal condition of 63 °C. The derived LAMP products can be assessed with a real-time turbidimeter, by monitoring the white turbidity, or directly observed by the naked eye after adding SYBR Green I dye. The specificity of the LAMP assay was determined by analyzing thirteen insect-resistant genetically-modified (GM) crop events with different Bt genes. Furthermore, the sensitivity of the LAMP assay was evaluated by diluting the template genomic DNA. Results showed that the limit of detection of the established LAMP assays was approximately five copies of haploid genomic DNA, about five-fold more sensitive than conventional PCR assays. All of the results indicated that this established rapid and visual LAMP assay is quick, accurate and cost-effective, with high specificity and sensitivity. In addition, this method does not need specific expensive instruments or facilities, and can provide a simpler and quicker approach to detecting the cry2Ab and cry3A genes in GM crops, especially for on-site, large-scale testing purposes in the field. PMID:25167136
Application of Data Cubes for Improving Detection of Water Cycle Extreme Events
NASA Technical Reports Server (NTRS)
Albayrak, Arif; Teng, William
2015-01-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
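A toy illustration of the SVM classification step on time-series windows ("data rods"): synthetic precipitation-like series with injected spikes stand in for extreme events. Real features and labels would come from the TRMM-derived data cubes; everything below is fabricated for the sketch.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# synthetic "data rods": 30-step precipitation-like windows
X_normal = rng.gamma(2.0, 1.0, size=(200, 30))
X_extreme = X_normal.copy()
X_extreme[:, 15] += 25.0                      # injected extreme-event spike
X = np.vstack([X_normal, X_extreme])
y = np.r_[np.zeros(200), np.ones(200)]        # 1 = window contains an extreme

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:2]), clf.predict(X[-2:]))  # expect [0. 0.] then [1. 1.]
```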
Bates, Jonathan; Parzynski, Craig S; Dhruva, Sanket S; Coppi, Andreas; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Shaw, Richard E; Warner, Frederick; Krumholz, Harlan M; Ross, Joseph S
2018-06-12
To estimate the medical device utilization needed to detect safety differences among implantable cardioverter-defibrillator (ICD) generator models, and to compare these estimates to utilization in practice. We conducted repeated sample size estimates to calculate the medical device utilization needed, systematically varying device-specific safety event rate ratios and significance levels while maintaining 80% power, testing 3 average adverse event rates (3.9, 6.1, and 12.6 events per 100 person-years) estimated from the American College of Cardiology's 2006 to 2010 National Cardiovascular Data Registry of ICDs. We then compared these with actual medical device utilization. At significance level 0.05 and 80% power, 34% or fewer ICD models accrued sufficient utilization in practice to detect safety differences for rate ratios <1.15 and an average event rate of 12.6 events per 100 person-years. For average event rates of 3.9 and 12.6 events per 100 person-years, 30% and 50% of ICD models, respectively, accrued sufficient utilization for a rate ratio of 1.25, and 52% and 67% for a rate ratio of 1.50. Because actual ICD utilization was not uniformly distributed across ICD models, the proportion of individuals receiving any ICD that accrued sufficient utilization in practice was 0% to 21%, 32% to 70%, and 67% to 84% for rate ratios of 1.05, 1.15, and 1.25, respectively, over the range of 3 average adverse event rates. Small safety differences among ICD generator models are unlikely to be detected through routine surveillance given current ICD utilization in practice, but large safety differences can be detected for most patients at anticipated average adverse event rates. Copyright © 2018 John Wiley & Sons, Ltd.
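The flavor of such a calculation can be sketched with a textbook two-sample Poisson-rate approximation, using Var(log RR) ≈ 1/E1 + 1/E2 for expected event counts E1 and E2; this is a generic sketch under stated assumptions, not necessarily the authors' exact sample-size procedure.

```python
import math
from scipy.stats import norm

def person_years_needed(base_rate, rate_ratio, alpha=0.05, power=0.80):
    """Approximate person-years per comparison arm needed to detect a given
    rate ratio between two Poisson event rates (two-sided test)."""
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (za + zb) ** 2 * (1 + 1 / rate_ratio) / (base_rate * math.log(rate_ratio) ** 2)

# e.g. 12.6 events per 100 person-years and a rate ratio of 1.25
print(round(person_years_needed(12.6 / 100, 1.25)))  # ~2250 person-years per arm
```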
Rabiei, Maryam; Mehdizadeh, Mehrangiz; Rastegar, Hossein; Vahidi, Hossein; Alebouyeh, Mahmoud
2013-01-01
Detection of genetically modified organisms (GMOs) in food is an important issue for all the parties involved in food control and consumers' rights. Due to the increasing number of GMOs imported to Iran during the past few years, it has become necessary to screen products in order to determine the identity of daily consumed foodstuffs. In this study, following the extraction of genomic DNA from processed foods sold commercially in Iran, qualitative PCR was performed to detect genetically modified maize. The recombinant DNA target sequences were detected with primers highly specific for each investigated transgene, such as the CaMV35s gene, Bt-11, MON810 and Bt-176, separately. Based on the gel electrophoresis results, Bt-11 and MON810 events were detected in some maize samples, while the Bt-176 event was not detected in any of them. For the first time, the results demonstrate the presence of genetically modified maize in Iranian food products, reinforcing the need for the development of a labeling system and valid quantitative methods for routine analyses. PMID:24250568
Selected control events and reporting odds ratio in signal detection methodology.
Ooba, Nobuhiro; Kubota, Kiyoshi
2010-11-01
To determine whether the reporting odds ratio (ROR) using "control events" can detect signals hidden behind striking reports on one or more particular events. We used data from 956 drug use investigations (DUIs) conducted between 1970 and 1998 in Japan and domestic spontaneous reports (SRs) between 1998 and 2008. The event terms in DUIs were converted to the preferred terms in the Medical Dictionary for Regulatory Activities (MedDRA). We calculated the incidence proportion for various events and selected 20 "control events" with a relatively constant incidence proportion across DUIs that were also reported regularly to the spontaneous reporting system. A "signal" was generated for a drug-event combination when the lower limit of the 95% confidence interval of the ROR exceeded 1. We also compared the ROR in SRs with the RR in DUIs. The "control events" accounted for 18.2% of all reports. The ROR using "control events" may detect some hidden signals for a drug with a proportion of "control events" lower than the average. The median of the ratios of the ROR using "control events" to the RR was around unity, indicating that "control events" roughly represented the exposure distribution, though the range of the ratios was so wide that an individual ROR should not be regarded as an estimate of the RR. The use of the ROR with "control events" may serve as an adjunct to traditional signal detection methods for finding a signal hidden behind some major events. Copyright © 2010 John Wiley & Sons, Ltd.
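The ROR and its 95% confidence interval follow directly from a 2x2 contingency table of reports; a minimal sketch using the standard Woolf logit interval (not necessarily the authors' exact computation):

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR from a 2x2 table of spontaneous reports:
        a = drug of interest & event of interest
        b = drug of interest & control events
        c = other drugs      & event of interest
        d = other drugs      & control events
    Signal when the lower 95% CI bound exceeds 1 (as in the abstract)."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi, lo > 1.0
```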
Accurate measurement of transgene copy number in crop plants using droplet digital PCR.
Collier, Ray; Dasgupta, Kasturi; Xing, Yan-Ping; Hernandez, Bryan Tarape; Shao, Min; Rohozinski, Dominica; Kovak, Emma; Lin, Jeanie; de Oliveira, Maria Luiza P; Stover, Ed; McCue, Kent F; Harmon, Frank G; Blechl, Ann; Thomson, James G; Thilmony, Roger
2017-06-01
Genetic transformation is a powerful means for the improvement of crop plants, but requires labor- and resource-intensive methods. An efficient method for identifying single-copy transgene insertion events from a population of independent transgenic lines is desirable. Currently, transgene copy number is estimated by either Southern blot hybridization analyses or quantitative polymerase chain reaction (qPCR) experiments. Southern hybridization is a convincing and reliable method, but it also is expensive, time-consuming and often requires a large amount of genomic DNA and radioactively labeled probes. Alternatively, qPCR requires less DNA and is potentially simpler to perform, but its results can lack the accuracy and precision needed to confidently distinguish between one- and two-copy events in transgenic plants with large genomes. To address this need, we developed a droplet digital PCR-based method for transgene copy number measurement in an array of crops: rice, citrus, potato, maize, tomato and wheat. The method utilizes specific primers to amplify target transgenes, and endogenous reference genes in a single duplexed reaction containing thousands of droplets. Endpoint amplicon production in the droplets is detected and quantified using sequence-specific fluorescently labeled probes. The results demonstrate that this approach can generate confident copy number measurements in independent transgenic lines in these crop species. This method and the compendium of probes and primers will be a useful resource for the plant research community, enabling the simple and accurate determination of transgene copy number in these six important crop species. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
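The copy-number arithmetic behind duplexed ddPCR can be sketched compactly: positive-droplet fractions are Poisson-corrected to per-droplet concentrations, and the target-to-reference ratio is scaled by the reference gene's per-genome copy number (assumed diploid, two copies, here). A generic sketch of the standard calculation, not the authors' exact pipeline; the droplet counts below are invented.

```python
import math

def copies_per_droplet(positive, total):
    """Poisson correction: mean copies per droplet from the positive fraction."""
    return -math.log(1.0 - positive / total)

def transgene_copy_number(pos_target, pos_reference, total, reference_copies=2):
    """Transgene copies per genome from a duplexed reaction, assuming a
    single-locus reference gene present at `reference_copies` per genome."""
    ratio = copies_per_droplet(pos_target, total) / copies_per_droplet(pos_reference, total)
    return ratio * reference_copies

print(transgene_copy_number(4800, 9000, 20000))  # ~0.9, i.e. a one-copy event
```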
Detection of visual events along the apparent motion trace in patients with paranoid schizophrenia.
Sanders, Lia Lira Olivier; Muckli, Lars; de Millas, Walter; Lautenschlager, Marion; Heinz, Andreas; Kathmann, Norbert; Sterzer, Philipp
2012-07-30
Dysfunctional prediction in sensory processing has been suggested as a possible causal mechanism in the development of delusions in patients with schizophrenia. Previous studies in healthy subjects have shown that while the perception of apparent motion can mask visual events along the illusory motion trace, such motion masking is reduced when events are spatio-temporally compatible with the illusion, and, therefore, predictable. Here we tested the hypothesis that this specific detection advantage for predictable target stimuli on the apparent motion trace is reduced in patients with paranoid schizophrenia. Our data show that, although target detection along the illusory motion trace is generally impaired, both patients and healthy control participants detect predictable targets more often than unpredictable targets. Patients had a stronger motion masking effect when compared to controls. However, patients showed the same advantage in the detection of predictable targets as healthy control subjects. Our findings reveal stronger motion masking but intact prediction of visual events along the apparent motion trace in patients with paranoid schizophrenia and suggest that the sensory prediction mechanism underlying apparent motion is not impaired in paranoid schizophrenia. Copyright © 2012. Published by Elsevier Ireland Ltd.
The Event Detection and the Apparent Velocity Estimation Based on Computer Vision
NASA Astrophysics Data System (ADS)
Shimojo, M.
2012-08-01
The high spatial and temporal resolution data obtained by the telescopes aboard Hinode have revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical-flow estimation methods based on OpenCV, the computer vision library. We applied the methods to the prominence eruption observed by NoRH and the polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. This indicates that the optical-flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
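A minimal OpenCV sketch of dense optical-flow estimation of the kind examined above; the image pair here is synthetic, and converting pixels per frame into a physical apparent velocity would additionally require the instrument's plate scale and cadence.

```python
import cv2
import numpy as np

# two consecutive grayscale frames (synthetic placeholders for solar images)
prev_img = np.random.randint(0, 255, (256, 256), dtype=np.uint8)
next_img = np.roll(prev_img, 2, axis=1)  # simulate uniform horizontal motion

# dense optical flow (Farneback); OpenCV also offers sparse Lucas-Kanade
flow = cv2.calcOpticalFlowFarneback(prev_img, next_img, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
speed = np.hypot(flow[..., 0], flow[..., 1])  # apparent speed, pixels/frame
print(speed.mean())  # ~2 for this synthetic pair
```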
Systems and methods for detecting a failure event in a field programmable gate array
NASA Technical Reports Server (NTRS)
Ng, Tak-Kwong (Inventor); Herath, Jeffrey A. (Inventor)
2009-01-01
An embodiment generally relates to a method of self-detecting an error in a field programmable gate array (FPGA). The method includes writing a signature value into a signature memory in the FPGA and determining a conclusion of a configuration refresh operation in the FPGA. The method also includes reading an outcome value from the signature memory.
A method for detecting and locating geophysical events using groups of arrays
NASA Astrophysics Data System (ADS)
de Groot-Hedlin, Catherine D.; Hedlin, Michael A. H.
2015-11-01
We have developed a novel method to detect and locate geophysical events that makes use of any sufficiently dense sensor network. This method is demonstrated using acoustic sensor data collected in 2013 at the USArray Transportable Array (TA). The algorithm applies Delaunay triangulation to divide the sensor network into a mesh of three-element arrays, called triads. Because infrasound waveforms are incoherent between the sensors within each triad, the data are transformed into envelopes, which are cross-correlated to find signals that satisfy a consistency criterion. The propagation azimuth, phase velocity and signal arrival time are computed for each signal. Triads with signals that are consistent with a single source are bundled as an event group. The ensemble of arrival times and azimuths of detected signals within each group are used to locate a common source in space and time. A total of 513 infrasonic stations that were active for part or all of 2013 were divided into over 2000 triads. Low (0.5-2 Hz) and high (2-8 Hz) catalogues of infrasonic events were created for the eastern USA. The low-frequency catalogue includes over 900 events and reveals several highly active source areas on land that correspond with coal mining regions. The high-frequency catalogue includes over 2000 events, with most occurring offshore. Although their cause is not certain, most events are clearly anthropogenic as almost all occur during regular working hours each week. The regions to which the TA is most sensitive vary seasonally, with the direction of reception dependent on the direction of zonal winds. The catalogue has also revealed large acoustic events that may provide useful insight into the nature of long-range infrasound propagation in the atmosphere.
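The triangulation step maps directly onto scipy: Delaunay triangulation of station coordinates yields the three-element triads. The coordinates below are randomly generated placeholders for the TA geometry.

```python
import numpy as np
from scipy.spatial import Delaunay

# hypothetical station coordinates (lon, lat) for a dense acoustic network
stations = np.random.default_rng(1).uniform([-110, 30], [-90, 45], size=(60, 2))
tri = Delaunay(stations)
triads = tri.simplices  # each row: indices of the 3 stations forming one triad
print(f"{len(triads)} three-element arrays from {len(stations)} stations")
```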
McGarraugh, Geoffrey
2010-01-01
Continuous glucose monitoring (CGM) devices available in the United States are approved for use as adjuncts to self-monitoring of blood glucose (SMBG). Alarm evaluation in the Clinical and Laboratory Standards Institute (CLSI) guideline for CGM does not specifically address devices that employ both CGM and SMBG. In this report, an alarm evaluation method is proposed for these devices. The proposed method builds on the CLSI method using data from an in-clinic study of subjects with type 1 diabetes. CGM was used to detect glycemic events, and SMBG was used to determine treatment. To optimize detection of a single glucose level, such as 70 mg/dl, a range of alarm threshold settings was evaluated. The alarm characterization provides a choice of alarm settings that trade off detection and false alarms. Detection of a range of high glucose levels was similarly evaluated. Using low glucose alarms, detection of 70 mg/dl within 30 minutes increased from 64 to 97% as alarm settings increased from 70 to 100 mg/dl, and alarms that did not require treatment (SMBG >85 mg/dl) increased from 18 to 52%. Using high glucose alarms, detection of 180 mg/dl within 30 minutes increased from 87 to 96% as alarm settings decreased from 180 to 165 mg/dl, and alarms that did not require treatment (SMBG <180 mg/dl) increased from 24 to 42%. The proposed alarm evaluation method provides information for choosing appropriate alarm thresholds and reflects the clinical utility of CGM alarms. 2010 Diabetes Technology Society.
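A simplified sketch of the threshold sweep described above, using paired CGM/SMBG readings and ignoring the 30-minute detection window; the target and treatment cutoffs follow the numbers in the abstract, while the pairing-based bookkeeping is an assumption of this sketch.

```python
import numpy as np

def alarm_tradeoff(cgm, smbg, thresholds, target=70.0, treat_limit=85.0):
    """Sweep low-glucose alarm thresholds over paired CGM/SMBG readings:
    'detection' = share of true low events (SMBG <= target) that alarm;
    'no_treat'  = share of alarms not requiring treatment (SMBG > treat_limit)."""
    cgm, smbg = np.asarray(cgm, float), np.asarray(smbg, float)
    results = []
    for thr in thresholds:
        alarm = cgm <= thr
        low = smbg <= target
        detection = (alarm & low).sum() / max(low.sum(), 1)
        no_treat = (alarm & (smbg > treat_limit)).sum() / max(alarm.sum(), 1)
        results.append((thr, detection, no_treat))
    return results
```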
The rate of transient beta frequency events predicts behavior across tasks and species
Law, Robert; Tsutsui, Shawn; Moore, Christopher I; Jones, Stephanie R
2017-01-01
Beta oscillations (15-29 Hz) are among the most prominent signatures of brain activity. Beta power is predictive of healthy and abnormal behaviors, including perception, attention and motor action. In non-averaged signals, beta can emerge as transient high-power 'events'. As such, functionally relevant differences in averaged power across time and trials can reflect changes in event number, power, duration, and/or frequency span. We show that functionally relevant differences in averaged beta power in primary somatosensory neocortex reflect a difference in the number of high-power beta events per trial, i.e. event rate. Further, beta events occurring close to the stimulus were more likely to impair perception. These results are consistent across detection and attention tasks in human magnetoencephalography, and in local field potentials from mice performing a detection task. These results imply that an increased propensity of beta events predicts the failure to effectively transmit information through specific neocortical representations. PMID:29106374
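An operational sketch of counting transient beta events in a non-averaged trial: spectrogram power in 15-29 Hz, with events defined as crossings of a multiple of the median power. The 6x-median criterion is a common convention in this literature, assumed here rather than taken from the paper.

```python
import numpy as np
from scipy.signal import spectrogram

def beta_event_count(signal, fs, thresh_factor=6.0):
    """Count high-power beta events in one trial as rising-edge crossings of
    a threshold on 15-29 Hz spectrogram power."""
    f, t, sxx = spectrogram(signal, fs=fs, nperseg=int(fs))
    beta_power = sxx[(f >= 15) & (f <= 29)].max(axis=0)
    above = beta_power > thresh_factor * np.median(beta_power)
    return int((np.diff(above.astype(int)) == 1).sum())
```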
Using Boosting Decision Trees in Gravitational Wave Searches triggered by Gamma-ray Bursts
NASA Astrophysics Data System (ADS)
Zuraw, Sarah; LIGO Collaboration
2015-04-01
The search for gravitational wave bursts requires the ability to distinguish weak signals from background detector noise. Gravitational wave bursts are characterized by their transient nature, making them particularly difficult to detect as they are similar to non-Gaussian noise fluctuations in the detector. The Boosted Decision Tree method is a powerful machine learning algorithm which uses Multivariate Analysis techniques to explore high-dimensional data sets in order to distinguish between gravitational wave signal and background detector noise. It does so by training with known noise events and simulated gravitational wave events. The method is tested using waveform models and compared with the performance of the standard gravitational wave burst search pipeline for Gamma-ray Bursts. It is shown that the method is able to effectively distinguish between signal and background events under a variety of conditions and over multiple Gamma-ray Burst events. This example demonstrates the usefulness and robustness of the Boosted Decision Tree and Multivariate Analysis techniques as a detection method for gravitational wave bursts. LIGO, UMass, PREP, NEGAP.
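A toy illustration of the training setup with scikit-learn's gradient-boosted decision trees: known noise events and simulated signal events, each described by a handful of hypothetical per-trigger features (the feature set and distributions below are invented for the sketch).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
# hypothetical trigger features (e.g. SNR, duration, bandwidth, central frequency)
X_noise = rng.normal(0.0, 1.0, size=(500, 4))    # background detector noise
X_signal = rng.normal(1.5, 1.0, size=(500, 4))   # simulated GW burst injections
X = np.vstack([X_noise, X_signal])
y = np.r_[np.zeros(500), np.ones(500)]

bdt = GradientBoostingClassifier(n_estimators=200).fit(X, y)  # boosted decision trees
print(bdt.score(X, y))  # training accuracy on the toy set
```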
Initial Evaluation of Signal-Based Bayesian Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Russell, S.
2016-12-01
We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved, because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.
[Are non-clinical studies predictive of adverse events in humans?].
Claude, N
2007-09-01
The predictability of adverse events induced by drugs in non-clinical safety studies performed on in vitro and/or in vivo models is a key point for the safety of humans exposed to pharmaceuticals. The strengths and weaknesses of animal studies in predicting human toxicity were assessed by an international study on the concordance of the toxicity of 150 pharmaceuticals observed in humans with that observed in experimental animals. The results showed a good correlation (70% of the adverse events in humans were detected in animal studies) and an early time to first appearance of concordant animal toxicity: 94% were first observed in studies of 1 month or less in duration. The highest incidence of overall concordance was seen in hematological and cardiovascular adverse effects, and the lowest in cutaneous and ophthalmological adverse effects. These studies, standardized both scientifically and by regulation, need in some cases to be adapted to specific problems linked to sensitive populations (young, old, or with a pathology that could be worsened by the drug) or to specific pharmaceuticals (such as those produced by biotechnology). Some severe adverse events are not detected in conventional animal models (immuno-allergy, idiosyncrasy). Taken together, these elements support the value of toxicology studies in predicting many human toxic events associated with pharmaceuticals. Nevertheless, part of human toxicity is not detected by these experimental approaches, and new tools developed through progress in biology and bio-informatics should reduce this margin of uncertainty.
Robotic guarded motion system and method
Bruemmer, David J.
2010-02-23
A robot platform includes perceptors, locomotors, and a system controller. The system controller executes instructions for repeating, on each iteration through an event timing loop, the acts of defining an event horizon, detecting a range to obstacles around the robot, and testing for an event horizon intrusion. Defining the event horizon includes determining a distance from the robot that is proportional to a current velocity of the robot and testing for the event horizon intrusion includes determining if any range to the obstacles is within the event horizon. Finally, on each iteration through the event timing loop, the method includes reducing the current velocity of the robot in proportion to a loop period of the event timing loop if the event horizon intrusion occurs.
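One iteration of the event timing loop can be sketched as follows; the `robot` interface, gain, and slowdown factor are hypothetical stand-ins for the platform's perceptors and locomotors, not the patented implementation.

```python
K_HORIZON = 2.0   # horizon distance per unit velocity (assumed gain, seconds)
SLOWDOWN = 0.5    # velocity reduction rate on intrusion (assumed, 1/seconds)

def guarded_motion_step(robot, loop_period):
    """One pass through the event timing loop. `robot` is a hypothetical
    interface exposing `velocity` (m/s) and `read_obstacle_ranges()`
    (perceptor ranges in meters)."""
    horizon = K_HORIZON * robot.velocity          # define the event horizon
    ranges = robot.read_obstacle_ranges()         # detect range to obstacles
    if any(r < horizon for r in ranges):          # test for horizon intrusion
        robot.velocity -= robot.velocity * SLOWDOWN * loop_period
```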
Accuracy of a prey-specific DNA assay and a generic prey-immunomarking assay for detecting predation
USDA-ARS?s Scientific Manuscript database
1. Predator gut examinations are useful for detecting arthropod predation events. Here, the accuracy and reproducibility of two different types of gut assays are tested on various predator species that consumed an immature lacewing, Chrysoperla carnea (Stephens), that was externally labelled with ra...
Phenology satellite experiment
NASA Technical Reports Server (NTRS)
Dethier, B. E. (Principal Investigator)
1973-01-01
The author has identified the following significant results. The detection of a phenological event (the Brown Wave, i.e., vegetation senescence) for specific forest and crop types using ERTS-1 imagery is described. Data handling techniques, including computer analysis and photointerpretation procedures, are explained. Computer analysis of multispectral scanner digital tapes in all bands was used to give the relative changes of spectral reflectance with time for forests and specified crops. These data were obtained for a number of the twenty-four sites located within four north-south corridors across the United States. Analysis of ground observation photography and ERTS-1 imagery for sites in the Appalachian Corridor and Mississippi Valley Corridor indicates that the recession of vegetation development can be detected very well. Tentative conclusions are that specific phenological events such as crop maturity or leaf fall can be mapped for specific sites and possibly for different regions. Preliminary analysis based on a number of samples in mixed deciduous hardwood stands indicates that as senescence proceeds, both the rate of change and differences in color among species can be detected. The results to date show the feasibility of the development and refinement of phenoclimatic models.
Deep Recurrent Neural Networks for seizure detection and early seizure detection systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talathi, S. S.
Epilepsy is a common neurological disease, affecting about 0.6-0.8% of the world population. Epileptic patients suffer from chronic unprovoked seizures, which can result in a broad spectrum of debilitating medical and social consequences. Since seizures generally occur infrequently and are unpredictable, automated seizure detection systems are recommended to screen for seizures during long-term electroencephalogram (EEG) recordings. In addition, systems for early seizure detection can lead to the development of new types of intervention systems designed to control or shorten the duration of seizure events. In this article, we investigate the utility of recurrent neural networks (RNNs) in designing seizure detection and early seizure detection systems. We propose a deep learning framework based on Gated Recurrent Unit (GRU) RNNs for seizure detection. We use publicly available data to evaluate our method and demonstrate very promising evaluation results, with overall accuracy close to 100%. We also systematically investigate the application of our method to early seizure warning systems. Our method can detect about 98% of seizure events within the first 5 seconds of the overall epileptic seizure duration.
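A minimal Keras sketch of a GRU-based seizure detector of the kind investigated above; the window length, channel count, and layer sizes are illustrative assumptions, not the authors' architecture.

```python
from tensorflow.keras import layers, models

# hypothetical input: 1-second EEG windows, 256 samples x 23 channels
model = models.Sequential([
    layers.GRU(64, return_sequences=True, input_shape=(256, 23)),
    layers.GRU(64),
    layers.Dense(1, activation="sigmoid"),  # P(window contains seizure activity)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```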
ROKU: a novel method for identification of tissue-specific genes
Kadota, Koji; Ye, Jiazhen; Nakai, Yuji; Terada, Tohru; Shimizu, Kentaro
2006-01-01
Background: One of the important goals of microarray research is the identification of genes whose expression is considerably higher or lower in some tissues than in others. We would like to have ways of identifying such tissue-specific genes. Results: We describe a method, ROKU, which selects tissue-specific patterns from gene expression data for many tissues and thousands of genes. ROKU ranks genes according to their overall tissue specificity using Shannon entropy and, using an outlier detection method, detects the tissues specific to each gene, if any exist. We evaluated its capacity for the detection of various specific expression patterns using synthetic and real data. We observed that ROKU was superior to a conventional entropy-based method in its ability to rank genes according to overall tissue specificity and to detect genes whose expression patterns are specific only to objective tissues. Conclusion: ROKU is useful for the detection of various tissue-specific expression patterns. The framework is also directly applicable to the selection of diagnostic markers for molecular classification of multiple classes. PMID:16764735
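The entropy ranking at the core of ROKU is easy to sketch: low entropy over the tissue-expression vector indicates specificity. Note that ROKU additionally preprocesses each vector and applies an outlier test, which this fragment omits.

```python
import numpy as np

def shannon_entropy(expr):
    """Tissue-specificity score over an expression vector: low entropy
    means expression concentrated in few tissues (specific)."""
    p = np.abs(expr) / np.abs(expr).sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

uniform = np.array([5.0] * 10)           # housekeeping-like profile
specific = np.array([0.1] * 9 + [9.0])   # expressed mainly in one tissue
print(shannon_entropy(uniform), shannon_entropy(specific))  # ~3.32 vs ~0.73
```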
Precise genotyping and recombination detection of Enterovirus
2015-01-01
Enteroviruses (EV) with different genotypes cause diverse infectious diseases in humans and mammals. A correct EV typing result is crucial for effective medical treatment and disease control; however, the emergence of novel viral strains has impaired the performance of available diagnostic tools. Here, we present a web-based tool, named EVIDENCE (EnteroVirus In DEep conception, http://symbiont.iis.sinica.edu.tw/evidence), for EV genotyping and recombination detection. We introduce the idea of using mixed-ranking scores to evaluate the fitness of prototypes based on relatedness and on the genome regions of interest. Using phylogenetic methods, the most probable genotype is determined based on the closest neighbor among the selected references. To detect possible recombination events, EVIDENCE calculates the sequence distance and phylogenetic relationship among sequences within all sliding windows scanning over the whole genome. Detected recombination events are plotted in an interactive figure for viewing of fine details. In addition, all EV sequences available in GenBank were collected and revised using the latest classification and nomenclature of EV in EVIDENCE. These sequences are built into the database and can be retrieved from an indexed catalog, or searched for by keywords or by sequence similarity. EVIDENCE is the first web-based tool containing pipelines for genotyping and recombination detection, with updated, built-in, and complete reference sequences to improve sensitivity and specificity. The use of EVIDENCE can accelerate genotype identification, aiding clinical diagnosis and enhancing our understanding of EV evolution. PMID:26678286
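The sliding-window scan can be illustrated with a simple p-distance version: a query is compared against two candidate parental references window by window, and a crossover in which reference is closer suggests a breakpoint. EVIDENCE uses phylogenetic methods over each window; this distance-only fragment is a deliberate simplification, and window/step sizes are arbitrary.

```python
def sliding_p_distance(query, ref_a, ref_b, window=400, step=50):
    """Per-window p-distance of an aligned query against two aligned
    references (equal-length strings). Returns (start, dist_a, dist_b)."""
    hits = []
    for start in range(0, len(query) - window + 1, step):
        q = query[start:start + window]
        da = sum(x != y for x, y in zip(q, ref_a[start:start + window])) / window
        db = sum(x != y for x, y in zip(q, ref_b[start:start + window])) / window
        hits.append((start, da, db))
    return hits  # a switch in the smaller distance hints at recombination
```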
Video content analysis of surgical procedures.
Loukas, Constantinos
2018-02-01
In addition to its therapeutic benefits, minimally invasive surgery offers the potential for video recording of the operation. The videos may be archived and used later for purposes such as cognitive training, skills assessment, and workflow analysis. Methods from the major field of video content analysis and representation are increasingly applied in the surgical domain. In this paper, we review recent developments and analyze future directions in the field of content-based video analysis of surgical operations. The reviewed articles were obtained from PubMed and Google Scholar searches on combinations of the following keywords: 'surgery', 'video', 'phase', 'task', 'skills', 'event', 'shot', 'analysis', 'retrieval', 'detection', 'classification', and 'recognition'. The collected articles were categorized and reviewed based on the technical goal sought, the type of surgery performed, and the structure of the operation. A total of 81 articles were included. The publication activity is constantly increasing; more than 50% of these articles were published in the last 3 years. Significant research has been performed on video task detection and retrieval in eye surgery. In endoscopic surgery, the research activity is more diverse: gesture/task classification, skills assessment, tool type recognition, and shot/event detection and retrieval. Recent works employ deep neural networks for phase and tool recognition as well as shot detection. Content-based video analysis of surgical operations is a rapidly expanding field. Several future prospects for research exist, including, inter alia, shot boundary detection, keyframe extraction, video summarization, pattern discovery, and video annotation. The development of publicly available benchmark datasets to evaluate and compare task-specific algorithms is essential.
Development of an Electrochemical DNA Biosensor to Detect a Foodborne Pathogen.
Nordin, Noordiana; Yusof, Nor Azah; Radu, Son; Hushiarian, Roozbeh
2018-06-03
Vibrio parahaemolyticus (V. parahaemolyticus) is a common foodborne pathogen that contributes to a large proportion of public health problems globally, significantly affecting the rate of human mortality and morbidity. Conventional methods for the detection of V. parahaemolyticus, such as culture-based methods, immunological assays, and molecular-based methods, require complicated sample handling and are time-consuming, tedious, and costly. Recently, biosensors have proven to be a promising and comprehensive detection approach with the advantages of fast detection, cost-effectiveness, and practicality. This research focuses on developing a rapid method of detecting V. parahaemolyticus with high selectivity and sensitivity using the principles of DNA hybridization. In this work, characterization of synthesized polylactic acid-stabilized gold nanoparticles (PLA-AuNPs) was achieved using X-ray Diffraction (XRD), Ultraviolet-visible Spectroscopy (UV-Vis), Transmission Electron Microscopy (TEM), Field-emission Scanning Electron Microscopy (FESEM), and Cyclic Voltammetry (CV). We also carried out further testing of the stability, sensitivity, and reproducibility of the PLA-AuNPs. We found that the PLA-AuNPs formed a stable structure of nanoparticles in aqueous solution. We also observed that the sensitivity improved as a result of the smaller charge transfer resistance (Rct) value and an increase of active surface area (0.41 cm²). The development of our DNA biosensor was based on modification of a screen-printed carbon electrode (SPCE) with PLA-AuNPs, using methylene blue (MB) as the redox indicator. We assessed the immobilization and hybridization events by differential pulse voltammetry (DPV). We found that complementary, non-complementary, and mismatched oligonucleotides were specifically distinguished by the fabricated biosensor. It also showed reliably sensitive detection in cross-reactivity studies against various foodborne pathogens and in the identification of V. parahaemolyticus in fresh cockles.
Tanabe, Soichi; Miyauchi, Eiji; Muneshige, Akemi; Mio, Kazuhiro; Sato, Chikara; Sato, Masahiko
2007-07-01
A PCR method to detect porcine DNA was developed for verifying the allergen labeling of foods and for identifying hidden pork ingredients in processed foods. The primer pair, F2/R1, was designed to detect the gene encoding porcine cytochrome b for the specific detection of pork with high sensitivity. The amplified DNA fragment (130 bp) was specifically detected from porcine DNA, while no amplification occurred with other species such as cattle, chicken, sheep, and horse. When the developed PCR method was used for investigating commercial food products, porcine DNA was clearly detected in those containing pork in the list of ingredients. In addition, 100 ppb of pork in heated gyoza (pork and vegetable dumpling) could be detected by this method. This method is rapid, specific and sensitive, making it applicable for detecting trace amounts of pork in processed foods.
Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian
2013-07-01
Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time and thereby reduce time in hypoglycemia and avoid any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which the hypoglycemic event is not detected by the CGM. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data of 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Nondiscriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a Support Vector Machine model and the performance assessed by sample-based sensitivity and specificity and event-based sensitivity and number of false-positives. The best model used seven features and was able to detect 17 of 17 hypoglycemic events with one false-positive, compared with 12 of 17 hypoglycemic events with zero false-positives for the CGM alone. Lead-time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
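A minimal sketch of this style of window-based classifier, assuming an illustrative feature set (trend slope, kurtosis, skewness, last reading, minutes since injection) and synthetic stand-in data rather than the authors' exact SEPCOR-selected configuration:

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def cgm_features(window, mins_since_insulin):
    """One feature row per CGM window: trend slope, kurtosis, skewness, last value."""
    t = np.arange(len(window))
    slope = np.polyfit(t, window, 1)[0]          # linear-regression trend
    return [slope, kurtosis(window), skew(window), window[-1], mins_since_insulin]

# Synthetic stand-ins: falling traces labelled hypoglycemic (1), stable ones (0).
windows = [80 - 2.0 * np.arange(12) + rng.normal(0, 2, 12) for _ in range(50)] \
        + [110 + rng.normal(0, 2, 12) for _ in range(50)]
y = np.array([1] * 50 + [0] * 50)
X = np.array([cgm_features(w, rng.uniform(0, 240)) for w in windows])

clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print(clf.predict([cgm_features(90 - 1.5 * np.arange(12), 60.0)]))  # likely flags hypoglycemia
```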
Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing
2016-05-01
Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected, nor have they applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned a NMRI score ranging from 0 to 4 reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%). However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, average NMRI of all events = 1.5), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are of a type that the barrier is not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
NASA Astrophysics Data System (ADS)
Ziegler, A.; Balch, R. S.; Knox, H. A.; Van Wijk, J. W.; Draelos, T.; Peterson, M. G.
2016-12-01
We present results (e.g. seismic detections and STA/LTA detection parameters) from a continuous downhole seismic array in the Farnsworth Field, an oil field in northern Texas that hosts an ongoing carbon capture, utilization, and storage project. Specifically, we evaluate data from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. This detection database is directly compared to ancillary data (i.e. wellbore pressure) to determine if there is any relationship between seismic observables and CO2 injection and pressure maintenance in the field. Of particular interest is detection of relatively low-amplitude signals constituting long-period long-duration (LPLD) events that may be associated with slow shear-slip analogous to low-frequency tectonic tremor. While this category of seismic event provides great insight into the dynamic behavior of the pressurized subsurface, it is inherently difficult to detect. To automatically detect seismic events using effective data processing parameters, an automated sensor tuning (AST) algorithm developed by Sandia National Laboratories is being utilized. AST exploits ideas from neuro-dynamic programming (reinforcement learning) to automatically self-tune and determine optimal detection parameter settings. AST adapts in near real-time to changing conditions and automatically self-tunes a signal detector to identify (detect) only signals from events of interest, leading to a reduction in the number of missed legitimate event detections and the number of false event detections. Funding for this project is provided by the U.S. Department of Energy's (DOE) National Energy Technology Laboratory (NETL) through the Southwest Regional Partnership on Carbon Sequestration (SWP) under Award No. DE-FC26-05NT42591. Additional support has been provided by site operator Chaparral Energy, L.L.C. and Schlumberger Carbon Services. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
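For orientation, the STA/LTA statistic whose parameters AST tunes has a compact standard form; a sketch follows, where the window lengths, trigger threshold and synthetic trace are illustrative assumptions, not the AST configuration:

```python
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
    """STA/LTA ratio of a 1-D trace; characteristic function = squared amplitude."""
    cf = trace.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(cf, np.ones(sta_n) / sta_n, mode="same")  # non-causal smoothing,
    lta = np.convolve(cf, np.ones(lta_n) / lta_n, mode="same")  # adequate for a sketch
    return sta / np.maximum(lta, 1e-12)

fs = 500.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
trace = rng.normal(0, 1, t.size)
n0, n1 = int(30 * fs), int(30.4 * fs)                # 0.4 s event at t = 30 s
trace[n0:n1] += 8 * np.sin(2 * np.pi * 15 * t[:n1 - n0])
ratio = sta_lta(trace, fs)
print("triggered:", bool((ratio > 4.0).any()))
```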
NASA Astrophysics Data System (ADS)
Habu, K.; Kaminohara, S.; Kimoto, T.; Kawagoe, A.; Sumiyoshi, F.; Okamoto, H.
2010-11-01
We have developed a new monitoring system to detect an unusual event in superconducting coils without direct contact with the coils, using Poynting's vector method. In this system, the potential leads and pickup coils are set around the superconducting coils to measure local electric and magnetic fields, respectively. By measuring the sets of magnetic and electric fields, the Poynting's vectors around the coil can be obtained. An unusual event in the coil can be detected as a change of the Poynting's vector. This system carries no risk of the voltage breakdown that can occur with the balance voltage method, because no direct contacts on the coil windings are needed. In a previous paper, we demonstrated with a small test system that our method can detect normal transitions in a Bi-2223 coil without direct contact with the coil windings. For the system to be applied to practical devices, early detection of an unusual event requires the ability to detect local normal transitions in the coils. The signal voltages of the small sensors used to measure local magnetic and electric fields are small. Although the pickup-coil signals are easily increased by adding turns, the potential-lead signals are not easily increased. In this paper, a new method to amplify the signal of local electric fields around the coil is proposed. The validity of the method has been confirmed by measuring local electric fields around the Bi-2223 coil.
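A toy numerical sketch of the stated principle: form S = E x H at each sensor pair and flag departures from a quiescent baseline. All field values and the 10% threshold below are invented for illustration.

```python
import numpy as np

def poynting(E, H):
    """Local Poynting vectors S = E x H (W/m^2), one row per sensor pair."""
    return np.cross(E, H)

# Illustrative quiescent fields at two sensor pairs around the coil.
E0 = np.array([[0.0, 1.2e-3, 0.0], [0.0, 1.1e-3, 0.0]])   # V/m (invented)
H0 = np.array([[4.0e2, 0.0, 0.0], [3.8e2, 0.0, 0.0]])     # A/m (invented)
S_ref = poynting(E0, H0)

# A local normal transition changes the electric field near one sensor pair.
E1 = E0.copy()
E1[1, 1] *= 1.8
S_now = poynting(E1, H0)

rel_change = np.linalg.norm(S_now - S_ref, axis=1) / np.linalg.norm(S_ref, axis=1)
print("unusual event near sensor pair(s):", np.flatnonzero(rel_change > 0.10))
```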
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foxall, W; Vincent, P; Walter, W
1999-07-23
We have previously presented simple elastic deformation modeling results for three classes of seismic events of concern in monitoring the CTBT--underground explosions, mine collapses and earthquakes. Those results explored the theoretical detectability of each event type using synthetic aperture radar interferometry (InSAR) based on commercially available satellite data. In those studies we identified and compared the characteristics of synthetic interferograms that distinguish each event type, as well as the ability of the interferograms to constrain source parameters. These idealized modeling results, together with preliminary analysis of InSAR data for the 1995 mb 5.2 Solvay mine collapse in southwestern Wyoming, suggested that InSAR data used in conjunction with regional seismic monitoring holds great potential for CTBT discrimination and seismic source analysis, as well as providing accurate ground truth parameters for regional calibration events. In this paper we further examine the detectability and ''discriminating'' power of InSAR by presenting results from InSAR data processing, analysis and modeling of the surface deformation signals associated with underground explosions. Specifically, we present results of a detailed study of coseismic and postseismic surface deformation signals associated with underground nuclear and chemical explosion tests at the Nevada Test Site (NTS). Several interferograms were formed from raw ERS-1/2 radar data covering different time spans and epochs beginning just prior to the last U.S. nuclear tests in 1992 and ending in 1996. These interferograms have yielded information about the nature and duration of the source processes that produced the surface deformations associated with these events. A critical result of this study is that significant post-event surface deformation associated with underground nuclear explosions detonated at depths in excess of 600 meters can be detected using differential radar interferometry. An immediate implication of this finding is that underground nuclear explosions may not need to be captured coseismically by radar images acquired before and after an event in order to be detectable. This has obvious advantages in CTBT monitoring since suspect seismic events--which usually can be located within a 100 km by 100 km area of an ERS-1/2 satellite frame by established seismic methods--can be imaged after the event has been identified and located by existing regional seismic networks. Key Words: InSAR, SLC images, interferogram, synthetic interferogram, ERS-1/2 frame, phase unwrapping, DEM, coseismic, postseismic, source parameters.
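For reference, the quantitative link these interferograms exploit is the standard repeat-pass relation: one fringe (2π of unwrapped phase) corresponds to half a radar wavelength of line-of-sight motion. A one-function sketch using the ERS-1/2 C-band wavelength:

```python
import numpy as np

ERS_WAVELENGTH_M = 0.0566   # ERS-1/2 C-band radar wavelength, ~5.66 cm

def los_displacement(unwrapped_phase_rad):
    """Repeat-pass InSAR: line-of-sight displacement from unwrapped phase."""
    return unwrapped_phase_rad * ERS_WAVELENGTH_M / (4 * np.pi)

print(los_displacement(2 * np.pi))   # one fringe -> ~0.028 m of LOS motion
```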
High infrasonic goniometry applied to the detection of a helicopter in a high activity environment
NASA Astrophysics Data System (ADS)
Chritin, Vincent; Van Lancker, Eric; Wellig, Peter; Ott, Beat
2016-10-01
A current concern of armasuisse is the feasibility of a fixed or mobile acoustic surveillance and recognition network of sensors allowing permanent monitoring of the noise immissions of a wide range of aerial activities, such as civil or military aviation, and of other possible acoustic events such as transient events and subsonic or sonic booms. This objective requires an ability to detect, localize and recognize a wide range of potential acoustic events of interest, among them possibly parasitic acoustic events (natural and industrial events on the ground, for example), possibly in high background noise (for example close to urban or high-activity areas). This article presents a general discussion and conclusion about this problem, based on 20 years of experience spanning a dozen research programs and internal studies by IAV, with an illustration through one central experimental case study carried out within the framework of an armasuisse research program.
Towards a global flood detection system using social media
NASA Astrophysics Data System (ADS)
de Bruijn, Jens; de Moel, Hans; Jongman, Brenden; Aerts, Jeroen
2017-04-01
It is widely recognized that an early warning is critical in improving international disaster response. Analysis of social media in real time can provide valuable information about an event or help to detect unexpected events. For successful and reliable detection systems that work globally, it is important that sufficient data are available and that the algorithm works both in data-rich and data-poor environments. In this study, both a new geotagging system and a multi-level event detection system for flood hazards were developed using Twitter data. Geotagging algorithms that regard one tweet as a single document are well studied. However, no algorithms exist that combine several sequential tweets mentioning keywords regarding a specific event type. Within the time frame of an event, multiple users use event-related keywords that refer to the same place name. This notion allows us to treat several sequential tweets posted in the last 24 hours as one document. For all these tweets, we collect a series of spatial indicators given in the tweet metadata and extract additional topological indicators from the text. Using these indicators, we can reduce ambiguity and thus better estimate which locations are tweeted about. Using these localized tweets, Bayesian change-point analysis is used to find significant increases in tweets mentioning countries, provinces or towns. In data-poor environments detection of events on a country level is possible, while in other, data-rich, environments detection on a city level is achieved. Additionally, on a city level we analyse the spatial dependence of mentioned places. If multiple places within a limited spatial extent are mentioned, detection confidence increases. We run the algorithm using 2 years of Twitter data with flood-related keywords in 13 major languages and validate against a flood event database. We find that the geotagging algorithm yields significantly more data than previously developed algorithms and successfully deals with ambiguous place names. In addition, we show that our detection system can both quickly and reliably detect floods, even in countries where data is scarce, while achieving high detail in countries where more data is available.
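One simple form such a change-point detector can take, assuming daily keyword counts are Poisson with a single rate shift and conjugate Gamma priors on the rates; the counts below are invented:

```python
import numpy as np
from scipy.special import gammaln

def seg_logml(seg, a=1.0, b=1.0):
    """Log marginal likelihood of a Poisson count segment under a Gamma(a, b) rate prior."""
    s, m = seg.sum(), len(seg)
    return (gammaln(a + s) - gammaln(a) + a * np.log(b)
            - (a + s) * np.log(b + m) - gammaln(seg + 1).sum())

def changepoint_posterior(counts):
    """Posterior over the day k at which the tweet rate shifts (uniform prior on k)."""
    n = len(counts)
    logp = np.array([seg_logml(counts[:k]) + seg_logml(counts[k:])
                     for k in range(1, n)])
    p = np.exp(logp - logp.max())
    return p / p.sum()

counts = np.array([2, 3, 1, 2, 4, 2, 3, 2, 20, 35, 28, 40])  # invented daily counts
post = changepoint_posterior(counts)
k = 1 + int(np.argmax(post))
print(f"most probable onset: day {k} (posterior mass {post.max():.2f})")
```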
Boudreaux, Edwin D; Bock, Beth; O'Hea, Erin
2012-03-01
Experiencing a negative consequence related to one's health behavior, like a medical problem leading to an emergency department (ED) visit, can promote behavior change, giving rise to the popular concept of the "teachable moment." However, the mechanisms of action underlying this process of change have received scant attention. In particular, most existing health behavior theories are limited in explaining why such events can inspire short-term change in some and long-term change in others. Expanding on recommendations published in the 2009 Academic Emergency Medicine consensus conference on public health in emergency medicine (EM), we propose a new method for developing conceptual models that explain how negative events, like medical emergencies, influence behavior change, called the Sentinel Event Method. The method itself is atheoretical; instead, it defines steps to guide investigations that seek to relate specific consequences or events to specific health behaviors. This method can be used to adapt existing health behavior theories to study the event-behavior change relationship or to guide formulation of completely new conceptual models. This paper presents the tenets underlying the Sentinel Event Method, describes the steps comprising the process, and illustrates its application to EM through an example of a cardiac-related ED visit and tobacco use. © 2012 by the Society for Academic Emergency Medicine.
Human visual system-based smoking event detection
NASA Astrophysics Data System (ADS)
Odetallah, Amjad D.; Agaian, Sos S.
2012-06-01
Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact on raising the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new scheme for detecting human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events involving uncertain actions and various cigarette sizes, colors, and shapes.
Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2013-01-01
A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We therefore designed a screening method combining the quantification of P35S with the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMO.
Automatic detection of freezing of gait events in patients with Parkinson's disease.
Tripoliti, Evanthia E; Tzallas, Alexandros T; Tsipouras, Markos G; Rigas, George; Bougia, Panagiota; Leontiou, Michael; Konitsiotis, Spiros; Chondrogiorgi, Maria; Tsouli, Sofia; Fotiadis, Dimitrios I
2013-04-01
The aim of this study is to detect freezing of gait (FoG) events in patients suffering from Parkinson's disease (PD) using signals received from wearable sensors (six accelerometers and two gyroscopes) placed on the patients' body. For this purpose, an automated methodology has been developed which consists of four stages. In the first stage, missing values due to signal loss or degradation are replaced, and then (second stage) low-frequency components of the raw signal are removed. In the third stage, the entropy of the raw signal is calculated. Finally (fourth stage), four classification algorithms have been tested (Naïve Bayes, Random Forests, Decision Trees and Random Tree) in order to detect the FoG events. The methodology has been evaluated using several different configurations of sensors in order to determine the set of sensors that produces optimal FoG episode detection. Signals were recorded from five healthy subjects, five patients with PD who presented the symptom of FoG, and six patients with PD who did not present FoG events. The signals included 93 FoG events with 405.6 s total duration. The results indicate that the proposed methodology is able to detect FoG events with 81.94% sensitivity, 98.74% specificity, 96.11% accuracy and 98.6% area under curve (AUC) using the signals from all sensors and the Random Forests classification algorithm. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
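A minimal stand-in for the entropy and classification stages, pairing an amplitude-histogram entropy feature with a Random Forest; the synthetic "FoG" and normal-gait windows and the two-feature set are illustrative assumptions only:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_entropy(x, bins=16):
    """Shannon entropy of the amplitude histogram of one sensor window."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 200)                        # 2 s windows at 100 Hz
fog  = [rng.uniform(2, 4) * np.sin(2 * np.pi * 5 * t) + rng.normal(0, 0.5, t.size)
        for _ in range(40)]                       # trembling-like, ~5 Hz
gait = [np.sin(2 * np.pi * 1 * t) + rng.normal(0, 0.5, t.size)
        for _ in range(40)]                       # normal locomotion, ~1 Hz
X = np.array([[window_entropy(w), w.std()] for w in fog + gait])
y = np.array([1] * 40 + [0] * 40)                 # 1 = FoG episode

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```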
Jothikumar, Prithiviraj; Narayanan, Jothikumar; Hill, Vincent R
2014-03-01
Rapid and specific detection methods for bacterial agents in drinking water are important for disease prevention and responding to suspected contamination events. In this study, an isothermal Genome Exponential Amplification Reaction (GEAR) assay for Escherichia coli O157:H7 was designed specifically to recognize a 199-bp fragment of the lipopolysaccharide gene (rfbE) for rapid testing of water samples. The GEAR assay was found to be specific for E. coli O157:H7 using 10 isolates of E. coli O157:H7 and a panel of 86 bacterial controls. The GEAR assay was performed at a constant temperature of 65°C using SYTO 9 intercalating dye. The detection limit was determined to be 20 CFU for the GEAR assay. When SYTO 9 fluorescence was measured using a real-time PCR instrument, the assay had the same detection limit as when malachite green was added to the reaction mix and a characteristic blue color was visually observed in positive reactions. The study also found that 50 and 20 CFU of E. coli O157:H7 seeded into 100 liters of tap water could be detected by the GEAR assay after the sample was concentrated by hollow-fiber ultrafiltration (HFUF) and approximately 10% of the HFUF concentrate was cultured using trypticase soy broth-novobiocin. When applied to 19 surface water samples collected from Tennessee and Kentucky, the GEAR assay and a published real-time PCR assay both detected E. coli O157:H7 in two of the samples. The results of this study indicate that the GEAR assay can be a sensitive method for rapid detection of E. coli O157:H7 in water samples using fluorometric instruments and visual endpoint determination. Published by Elsevier B.V.
Ireno, Ivanildce C; Baumann, Cindy; Stöber, Regina; Hengstler, Jan G; Wiesmüller, Lisa
2014-05-01
In vitro genotoxicity tests are known to suffer from several shortcomings, mammalian cell-based assays, in particular, from low specificities. Following a novel concept of genotoxicity detection, we developed a fluorescence-based method in living human cells. The assay quantifies DNA recombination events triggered by DNA double-strand breaks and damage-induced replication fork stalling predicted to detect a broad spectrum of genotoxic modes of action. To maximize sensitivities, we engineered a DNA substrate encompassing a chemoresponsive element from the human genome. Using this substrate, we screened various human tumor and non-transformed cell types differing in the DNA damage response, which revealed that detection of genotoxic carcinogens was independent of the p53 status but abrogated by apoptosis. Cell types enabling robust and sensitive genotoxicity detection were selected for the generation of reporter clones with chromosomally integrated DNA recombination substrate. Reporter cell lines were scrutinized with 21 compounds, stratified into five sets according to the established categories for identification of carcinogenic compounds: genotoxic carcinogens ("true positives"), non-genotoxic carcinogens, compounds without genotoxic or carcinogenic effect ("true negatives") and non-carcinogenic compounds, which have been reported to induce chromosomal aberrations or mutations in mammalian cell-based assays ("false positives"). Our results document detection of genotoxic carcinogens in independent cell clones and at levels of cellular toxicities <60 % with a sensitivity of >85 %, specificity of ≥90 % and detection of false-positive compounds <17 %. Importantly, through testing cyclophosphamide in combination with primary hepatocyte cultures, we additionally provide proof-of-concept for the identification of carcinogens requiring metabolic activation using this novel assay system.
Representation of photon limited data in emission tomography using origin ensembles
NASA Astrophysics Data System (ADS)
Sitek, A.
2008-06-01
Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of origins of detected events in a three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. a grid-based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a simulated 10 000 000 event acquisition ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data, and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to representation and reconstruction of data obtained by photon-limited emission tomography measurements.
Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China
NASA Astrophysics Data System (ADS)
Sheng, M.; Chu, R.; Wei, Z.
2016-12-01
On a landslide, slope movement and fracturing of the rock mass often lead to microearthquakes, which are recorded as weak signals on seismographs. The temporal and spatial distribution of regional instability, as well as the impact of external factors on the unstable regions, can be understood and analyzed by monitoring those microseismic events. The microseismic method can provide information from inside the landslide, which can supplement geodetic methods for monitoring the movement of the landslide surface. Compared to drilling on the landslide, the microseismic method is more economical and safer. The Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it kept deforming after the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months with intervals of 200-500 meters. First, we used regional earthquakes for time correction of the seismometers to eliminate the influence of inaccurate GPS clocks and of the subsurface structure beneath the stations. Due to the low velocity of the loose medium, the travel-time difference of microseismic events across the landslide can reach 5 s. According to travel time and waveform characteristics, we found many microseismic events and converted them into envelopes as templates; then we used a sliding-window cross-correlation technique based on the waveform envelope to detect the other microseismic events. Consequently, 100 microseismic events were detected with the waveforms recorded on all seismometers. Based on the locations, we found that most of them occurred at the front of the landslide while the others occurred at the back end. The bottom and top of the landslide accumulated considerable energy and deformed substantially; their radiated waves could be recorded by all stations. Moreover, the bottom, with more events, appeared very active. In addition, many smaller events occurred in the middle part of the landslide; they released less energy, and the generated signals could be recorded only by a few stations. Based on the distribution of those microseismic events, we found four unstable regions which agreed well with the deformed areas monitored by geodetic methods. The distribution of those microseismic events should be related to the internal structure and movement of the landslide.
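The envelope-based detector can be sketched as follows; the envelope smoothing, window handling and synthetic trace are assumptions for illustration, not the authors' exact processing:

```python
import numpy as np
from scipy.signal import hilbert, fftconvolve

def envelope(trace, smooth=50):
    """Smoothed amplitude envelope of a seismic trace (Hilbert magnitude)."""
    env = np.abs(hilbert(trace))
    return fftconvolve(env, np.ones(smooth) / smooth, mode="same")

def envelope_cc(template, cont):
    """Normalized sliding cross-correlation of a master-event envelope template
    against a continuous envelope; peaks above a threshold are new detections."""
    t = (template - template.mean()) / template.std()
    n = t.size
    cc = np.empty(cont.size - n + 1)
    for i in range(cc.size):
        w = cont[i:i + n]
        cc[i] = np.dot(t, (w - w.mean()) / (w.std() + 1e-12)) / n
    return cc

rng = np.random.default_rng(0)
cont = rng.normal(0, 1, 20000)
burst = 5 * np.hanning(500) * np.sin(2 * np.pi * 8 * np.linspace(0, 1, 500))
cont[8000:8500] += burst                      # hidden event in the noise
master_env = envelope(cont[8000:8500])        # master-event envelope as template
cc = envelope_cc(master_env, envelope(cont))
print("best match at sample:", int(np.argmax(cc)))
```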
Femtosecond laser fabrication of fiber based optofluidic platform for flow cytometry applications
NASA Astrophysics Data System (ADS)
Serhatlioglu, Murat; Elbuken, Caglar; Ortac, Bulend; Solmaz, Mehmet E.
2017-02-01
Miniaturized optofluidic platforms play an important role in bio-analysis, detection and diagnostic applications. The advantages of such miniaturized devices are extremely low sample requirements, low-cost development and rapid analysis capabilities. Fused silica is advantageous for optofluidic systems due to properties such as being chemically inert, mechanically stable, and optically transparent to a wide spectrum of light. As a three-dimensional manufacturing method, femtosecond laser scanning followed by chemical etching shows great potential for fabricating glass-based optofluidic chips. In this study, we demonstrate fabrication of an all-fiber-based optofluidic flow cytometer in fused silica glass by femtosecond laser machining. 3D particle focusing was achieved through a straightforward planar chip design with two separately fabricated fused silica glass slides thermally bonded together. Bioparticles in a fluid stream encounter an optical interrogation region specifically designed to accommodate a 405 nm single-mode fiber laser source and two multi-mode collection fibers for forward-scattering (FSC) and side-scattering (SSC) signal detection. Detected signal data were collected with an oscilloscope and post-processed with a MATLAB script. We were able to count events at rates over 4000 events/s and to obtain the size distribution of 5.95 μm monodisperse polystyrene beads using the FSC and SSC signals. Our platform shows promise for optical and fluidic miniaturization of flow cytometry systems.
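Event counting and bead sizing from such scatter signals reduce to pulse detection on the photodetector trace; a sketch on an invented forward-scatter trace (pulse shape, rate and thresholds are assumptions):

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(4)
fs = 1_000_000                        # 1 MS/s oscilloscope sampling (assumed)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms of forward-scatter voltage
fsc = rng.normal(0, 0.01, t.size)     # detector noise floor
for c in rng.uniform(0, 0.01, 40):    # 40 bead transits as Gaussian pulses
    fsc += 0.5 * np.exp(-0.5 * ((t - c) / 3e-6) ** 2)

# One peak = one event; the peak heights feed the size distribution.
peaks, props = find_peaks(fsc, height=0.1, distance=int(5e-6 * fs))
print("events counted:", peaks.size)
print("mean pulse height:", props["peak_heights"].mean().round(3))
```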
Phenotypic screening for developmental neurotoxicity ...
There are large numbers of environmental chemicals with little or no available information on their toxicity, including developmental neurotoxicity. Because of the resource-intensive nature of traditional animal tests, high-throughput (HTP) methods that can rapidly evaluate chemicals for the potential to affect the developing brain are being explored. Typically, HTP screening uses biochemical and molecular assays to detect the interaction of a chemical with a known target or molecular initiating event (e.g., the mechanism of action). For developmental neurotoxicity, however, the mechanism(s) is often unknown. Thus, we have developed assays for detecting chemical effects on the key events of neurodevelopment at the cellular level (e.g., proliferation, differentiation, neurite growth, synaptogenesis, network formation). Cell-based assays provide a test system at a level of biological complexity that encompasses many potential neurotoxic mechanisms. For example, phenotypic assessment of neurite outgrowth at the cellular level can detect chemicals that target kinases, ion channels, or esterases at the molecular level. The results from cell-based assays can be placed in a conceptual framework using an Adverse Outcome Pathway (AOP) which links molecular, cellular, and organ level effects with apical measures of developmental neurotoxicity. Testing a wide range of concentrations allows for the distinction between selective effects on neurodevelopmental and non-specific
Method and apparatus for distinguishing actual sparse events from sparse event false alarms
Spalding, Richard E.; Grotbeck, Carter L.
2000-01-01
Remote sensing method and apparatus wherein sparse optical events are distinguished from false events. "Ghost" images of actual optical phenomena are generated using an optical beam splitter and optics configured to direct split beams to a single sensor or segmented sensor. True optical signals are distinguished from false signals or noise based on whether the ghost image is present or absent. The invention obviates the need for dual sensor systems to effect a false target detection capability, thus significantly reducing system complexity and cost.
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.
NASA Astrophysics Data System (ADS)
Meng, Xiaobo; Chen, Haichao; Niu, Fenglin; Tang, Youcai; Yin, Chen; Wu, Furong
2018-02-01
We introduce an improved matching and locating technique to detect and locate microseismic events (-4 < ML < 0) associated with hydraulic fracturing treatment. We employ a set of representative master events to act as template waveforms and detect slave events that strongly resemble master events through stacking cross correlograms of both P and S waves between the template waveforms and the continuous records of the monitoring array. Moreover, the residual moveout in the cross correlograms across the array is used to locate slave events relative to the corresponding master event. In addition, P wave polarization constraint is applied to resolve the lateral extent of slave events in the case of unfavorable array configuration. We first demonstrate the detectability and location accuracy of the proposed approach with a pseudo-synthetic data set. Compared to the matched filter analysis, the proposed approach can significantly enhance detectability at low false alarm rate and yield robust location estimates of very low SNR events, particularly along the vertical direction. Then, we apply the method to a real microseismic data set acquired in the Weiyuan shale reservoir of China in November of 2014. The expanded microseismic catalog provides more easily interpretable spatiotemporal evolution of microseismicity, which is investigated in detail in a companion paper.
Human Rights Event Detection from Heterogeneous Social Media Graphs.
Chen, Feng; Neill, Daniel B
2015-03-01
Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.
Polyvinylidene fluoride sensor-based method for unconstrained snoring detection.
Hwang, Su Hwan; Han, Chung Min; Yoon, Hee Nam; Jung, Da Woon; Lee, Yu Jin; Jeong, Do-Un; Park, Kwang Suk
2015-07-01
We established and tested a snoring detection method using a polyvinylidene fluoride (PVDF) sensor for accurate, fast, and motion-artifact-robust monitoring of snoring events during sleep. Twenty patients with obstructive sleep apnea participated in this study. The PVDF sensor was located between a mattress cover and the mattress, and the patients' snoring signals were measured with the sensor in an unconstrained manner during polysomnography. The power ratio and peak frequency from the short-time Fourier transform were used to extract spectral features from the PVDF data. A support vector machine was applied to the spectral features to classify the data into either the snore or non-snore class. The performance of the method was assessed using manual labelling by three human observers as a reference. For event-by-event snoring detection, PVDF data that contained 'snoring' (SN), 'snoring with movement' (SM), and 'normal breathing' epochs were selected for each subject. As a result, the overall sensitivity and positive predictive value were 94.6% and 97.5%, respectively, and there was no significant difference between the SN and SM results. The proposed method can be applied in both residential and ambulatory snoring monitoring systems.
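A sketch of the two spectral features named above, computed per STFT frame; the snore band limits and the demo signal are illustrative assumptions:

```python
import numpy as np
from scipy.signal import stft

def spectral_features(x, fs, band=(70.0, 300.0)):
    """Per-frame power ratio (band power / total power) and peak frequency."""
    f, t, Z = stft(x, fs=fs, nperseg=1024)
    P = np.abs(Z) ** 2
    in_band = (f >= band[0]) & (f <= band[1])
    power_ratio = P[in_band].sum(axis=0) / (P.sum(axis=0) + 1e-12)
    peak_freq = f[np.argmax(P, axis=0)]
    return np.column_stack([power_ratio, peak_freq])

fs = 2000
x = np.random.default_rng(0).normal(0, 0.1, 10 * fs)
x[2 * fs:4 * fs] += np.sin(2 * np.pi * 120 * np.arange(2 * fs) / fs)  # snore-like burst
features = spectral_features(x, fs)       # rows: frames, columns: the two features
print(features.shape, features[:, 0].max().round(2))
```

Rows of the returned array would then feed a two-class SVM, as in the study.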
Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Darren
2004-07-01
MatSeis's infrasound analysis tool, Infra Tool, uses frequency slowness processing to deconstruct the array data into three outputs per processing step: correlation, azimuth and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns in the signal-processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. These two statements describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file will list the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg) and trace velocity (km/s). As an example, a ground truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array location, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window closely correlated to the theoretical stratospheric arrival time. Further testing will be required for tuning of detection threshold parameters for different types of infrasound events.
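A miniature of the flat-azimuth criterion both algorithms formalize, using an ordinary least-squares slope in place of the Hough accumulator; the window length, tolerances and synthetic series are assumptions:

```python
import numpy as np

def detect(azimuth, correlation, win=10, slope_tol=0.5, corr_bg=0.3):
    """Flag window starts where the azimuth series is flat (|slope| in deg/step
    below tolerance) while correlation stays above the background value."""
    steps = np.arange(win)
    hits = []
    for i in range(azimuth.size - win):
        slope = np.polyfit(steps, azimuth[i:i + win], 1)[0]
        if abs(slope) < slope_tol and correlation[i:i + win].mean() > corr_bg:
            hits.append(i)
    return hits

rng = np.random.default_rng(0)
az = rng.uniform(0, 360, 200)                   # incoherent background azimuths
corr = rng.uniform(0.0, 0.25, 200)              # background correlation
az[120:140] = 282.0 + rng.normal(0, 0.2, 20)    # stable arrival azimuth
corr[120:140] = 0.7                             # correlated wavefronts
print("first detection near step:", detect(az, corr)[:1])
```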
Bayes Node Energy Polynomial Distribution to Improve Routing in Wireless Sensor Network
Palanisamy, Thirumoorthy; Krishnasamy, Karthikeyan N.
2015-01-01
Wireless Sensor Networks monitor and control the physical world via large numbers of small, low-priced sensor nodes. Existing methods for Wireless Sensor Networks (WSNs) communicate sensed data through continuous data collection, resulting in higher delay and energy consumption. To overcome the routing issue and reduce the energy drain rate, the Bayes Node Energy and Polynomial Distribution (BNEPD) technique is introduced with energy-aware routing in the wireless sensor network. The Bayes Node Energy Distribution initially distributes the sensor nodes that detect objects of similar events (i.e., temperature, pressure, flow) into specific regions with the application of the Bayes rule. The object detection of similar events is accomplished based on the Bayes probabilities and is sent to the sink node, minimizing energy consumption. Next, the Polynomial Regression Function is applied to combine the target objects of similar events observed by different sensors. These are based on the minimum and maximum values of the object events and are transferred to the sink node. Finally, the Poly Distribute algorithm effectively distributes the sensor nodes. An energy-efficient routing path for each sensor node is created by data aggregation at the sink based on the polynomial regression function, which reduces the energy drain rate with minimum communication overhead. Experimental performance is evaluated using the Dodgers Loop Sensor Data Set from the UCI repository. Simulation results show that the proposed distribution algorithm significantly reduces the node energy drain rate and ensures fairness among different users while reducing communication overhead. PMID:26426701
Novel measurement method of heat and light detection for neutrinoless double beta decay
NASA Astrophysics Data System (ADS)
Kim, G. B.; Choi, J. H.; Jo, H. S.; Kang, C. S.; Kim, H. L.; Kim, I.; Kim, S. R.; Kim, Y. H.; Lee, C.; Lee, H. J.; Lee, M. K.; Li, J.; Oh, S. Y.; So, J. H.
2017-05-01
We developed a cryogenic phonon-scintillation detector to search for 0νββ decay of 100Mo. The detector module, a prototype setup of the AMoRE experiment, has a scintillating 40Ca100MoO4 absorber composed of 100Mo-enriched and 48Ca-depleted elements. This new detection method employs metallic magnetic calorimeters (MMCs) as the sensor technology for simultaneous detection of heat and light signals. It is designed to have high energy and timing resolutions to increase the sensitivity for probing this rare event. The detector, which is composed of a 200 g 40Ca100MoO4 crystal and phonon/photon sensors, showed an energy resolution of 8.7 keV FWHM at 2.6 MeV, with a weak temperature dependence in the range of 10-40 mK. Using rise-time and mean-time parameters and light/heat ratios, the proposed method showed a strong capability to reject alpha-induced events from electron events, with separation as good as 20σ. Moreover, we discuss how the signal rise-time improves the rejection efficiency for random coincidence signals.
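The rise-time and mean-time pulse-shape parameters have simple definitions; a sketch on two synthetic pulse templates (the double-exponential pulse model and its time constants are invented, and which population corresponds to alpha events is detector-specific):

```python
import numpy as np

def rise_time(t, pulse, lo=0.1, hi=0.9):
    """Time for the pulse to climb from 10% to 90% of its maximum."""
    p = pulse / pulse.max()
    return t[np.argmax(p >= hi)] - t[np.argmax(p >= lo)]

def mean_time(t, pulse):
    """Amplitude-weighted mean time <t> of the pulse."""
    return (t * pulse).sum() / pulse.sum()

t = np.linspace(0.0, 0.1, 5000)                            # 100 ms record
pulse = lambda tr, tf: np.exp(-t / tf) - np.exp(-t / tr)   # double exponential
pop_a = pulse(1e-3, 20e-3)    # faster-rising population
pop_b = pulse(3e-3, 20e-3)    # slower-rising population
for name, p in (("A", pop_a), ("B", pop_b)):
    print(name, "rise %.2f ms" % (1e3 * rise_time(t, p)),
          "mean %.2f ms" % (1e3 * mean_time(t, p)))
```

Thresholding on such parameters (together with the light/heat ratio) is what separates the two event populations.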
Taking Halo-Independent Dark Matter Methods Out of the Bin
Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew
2014-10-30
We develop a new halo-independent strategy for analyzing emerging DM hints, utilizing the method of extended maximum likelihood. This approach does not require the binning of events, making it uniquely suited to the analysis of emerging DM direct detection hints. It determines a preferred envelope, at a given confidence level, for the DM velocity integral which best fits the data using all available information and can be used even in the case of a single anomalous scattering event. All of the halo-independent information from a direct detection result may then be presented in a single plot, allowing simple comparisons between multiple experiments. This results in the halo-independent analogue of the usual mass and cross-section plots found in typical direct detection analyses, where limit curves may be compared with best-fit regions in halo-space. The method is straightforward to implement, using already-established techniques, and its utility is demonstrated through the first unbinned halo-independent comparison of the three anomalous events observed in the CDMS-Si detector with recent limits from the LUX experiment.
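The extended-maximum-likelihood ingredient that removes the need for binning is standard: the observed event count is itself Poisson-distributed, so each event enters the likelihood unbinned. In generic direct-detection notation (ours, not necessarily the paper's), with dR/dE the predicted differential rate and N_E its integral over the analysis window:

```latex
\ln\mathcal{L} \;=\; -\,N_E \;+\; \sum_{i=1}^{N}\ln\!\left.\frac{dR}{dE}\right|_{E_i},
\qquad
N_E \;=\; \int \frac{dR}{dE}\,dE .
```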
Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network
NASA Astrophysics Data System (ADS)
Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey
2010-05-01
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications; it is typically composed of 4 to 9 sensors, with 1 to 3 km aperture geometry. At the end of 2000 only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system. Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications like monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with seismic and hydroacoustic technologies. The arrival phases detected on the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies has only recently (in early 2010) become part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These infragenic sources may be natural or man-made. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain associated waves.
Only a small fraction of events meet the event definition criteria considering the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
Identification and validation of FGFR2 peptide for detection of early Barrett's neoplasia
Zhou, Juan; He, Lei; Pang, Zhijun; Appelman, Henry D.; Kuick, Rork; Beer, David G.; Li, Meng; Wang, Thomas D.
2017-01-01
The incidence of esophageal adenocarcinoma (EAC) is rising rapidly, and early detection within the precursor state of Barrett's esophagus (BE) is challenged by flat premalignant lesions that are difficult to detect with conventional endoscopic surveillance. Overexpression of cell surface fibroblast growth factor receptor 2 (FGFR2) is an early event in progression of BE to EAC, and is a promising imaging target. We used phage display to identify the peptide SRRPASFRTARE that binds specifically to the extracellular domain of FGFR2. We labeled this peptide with a near-infrared fluorophore Cy5.5, and validated the specific binding to FGFR2 overexpressed in cells in vitro. We found high affinity (kd = 68 nM) and rapid binding (k = 0.16 min⁻¹, i.e., 6.2 min). In human esophageal specimens, we found significantly greater peptide binding to high-grade dysplasia (HGD) versus either BE or normal squamous epithelium, and good correlation with anti-FGFR2 antibody. We also observed significantly greater peptide binding to excised specimens of esophageal squamous cell carcinoma and gastric cancer compared to normal mucosa. These results demonstrate potential for this FGFR2 peptide to be used as a clinical imaging agent to guide tissue biopsy and improve methods for early detection of EAC and potentially other epithelial-derived cancers. PMID:29152066
Security Event Recognition for Visual Surveillance
NASA Astrophysics Data System (ADS)
Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.
2017-05-01
With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are in demand for many practical applications. This paper proposes a novel effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time. If anyone moves an object, the system verifies whether this person is its owner. If not, the event is further analyzed and classified into one of two scenes: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for more complex tasks. For comparison purposes, experiments are also carried out on benchmark databases related to the abandoned-luggage detection task. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of revealing an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
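The analytic computation rests on the standard change-of-variables rule: a prior on physical quantities x induces a prior on the light-curve parameters y = f(x) through the Jacobian (generic notation assumed here, not the paper's specific parameter names):

```latex
p_Y(\mathbf{y}) \;=\; p_X\!\big(\mathbf{x}(\mathbf{y})\big)\,
\left|\det\frac{\partial \mathbf{x}}{\partial \mathbf{y}}\right| .
```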
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize, and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique, the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As authorized users, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical signal pattern of these events. Moreover, comparing the performance of the support-vector network with various classical learning algorithms previously used in seismic detection and classification is an essential final step in analyzing the advantages and disadvantages of the model.
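A minimal sketch of the classification stage, assuming waveform features (e.g. spectral ratios and amplitudes) have already been extracted from the recordings; the synthetic feature matrix, labels, and hyperparameters below are illustrative, not the station data or settings used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 12))                 # 12 waveform features per event
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # 0 = earthquake, 1 = quarry blast

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF-kernel SVM; scaling matters because SVMs are not scale-invariant
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```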
Category-Specific Comparison of Univariate Alerting Methods for Biosurveillance Decision Support
Elbert, Yevgeniy; Hung, Vivian; Burkom, Howard
2013-01-01
Objective: For a multi-source decision support application, we sought to match univariate alerting algorithms to surveillance data types to optimize detection performance.
Introduction: Temporal alerting algorithms commonly used in syndromic surveillance systems are often adjusted for data features such as cyclic behavior, but are subject to overfitting or misspecification errors when applied indiscriminately. In a project for the Armed Forces Health Surveillance Center to enable multivariate decision support, we obtained 4.5 years of outpatient, prescription, and laboratory test records from all US military treatment facilities. A proof-of-concept project phase produced 16 events with multiple-evidence corroboration for comparison of alerting algorithms for detection performance. We used the representative streams from each data source to compare the sensitivity of six algorithms to injected spikes, and we used all data streams from the 16 known events to compare them for detection timeliness.
Methods: The six methods compared were:
1. the Holt-Winters generalized exponential smoothing method;
2. automated choice between daily methods, regression and an exponentially weighted moving average (EWMA);
3. an adaptive daily Shewhart-type chart;
4. an adaptive one-sided daily CUSUM;
5. an EWMA applied to 7-day means with a trend correction; and
6. a 7-day temporal scan statistic.
Sensitivity testing: We conducted comparative sensitivity testing for categories of time series with similar scales and seasonal behavior. We added multiples of the standard deviation of each time series as single-day injects in separate algorithm runs. For each candidate method, we then used as a sensitivity measure the proportion of these runs for which the output of each algorithm was below alerting thresholds estimated empirically for each algorithm using simulated data streams. We identified the algorithm(s) whose sensitivity was most consistently high for each data category. For each syndromic query applied to each data source (outpatient, lab test orders, and prescriptions), 502 authentic time series were derived, one for each reporting treatment facility. Data categories were selected to group time series with similar expected algorithm performance: Median > 10; 0 < Median ≤ 10; Median = 0; lag-7 autocorrelation coefficient ≥ 0.2; and lag-7 autocorrelation coefficient < 0.2.
Timeliness testing: For the timeliness testing, we avoided the artificiality of simulated signals by measuring alerting detection delays in the 16 corroborated outbreaks. The multiple time series from these events gave a total of 141 time series with outbreak intervals for timeliness testing. The following measures were computed to quantify timeliness of detection: Median Detection Delay, the median number of days to detect the outbreak; and Penalized Mean Detection Delay, the mean number of days to detect the outbreak, with outbreak misses penalized as 1 day plus the maximum detection time.
Results: Based on the injection results, the Holt-Winters algorithm was most sensitive among time series with positive medians. The adaptive CUSUM and Shewhart methods were most sensitive for data streams with median zero. Table 1 provides timeliness results using the 141 outbreak-associated streams for sparse (Median = 0) and non-sparse data categories.
Table 1. Detection delay (days) by data category and method.
Data category | Measure        | Holt-Winters | Regression/EWMA | Adaptive Shewhart | Adaptive CUSUM | Trend-adj. 7-day EWMA | 7-day temporal scan
Median = 0    | Median         | 3            | 2               | 4                 | 2              | 4.5                   | 2
Median = 0    | Penalized mean | 7.2          | 7               | 6.6               | 6.2            | 7.3                   | 7.6
Median > 0    | Median         | 2            | 2               | 2.5               | 2              | 6                     | 4
Median > 0    | Penalized mean | 6.1          | 7               | 7.2               | 7.1            | 7.7                   | 6.6
The shading in the original Table 1 (not reproduced here) indicated the methods with the shortest detection delays for sparse and non-sparse data streams. The Holt-Winters method was again superior for non-sparse data. For data with median = 0, the adaptive CUSUM was superior at a daily false alarm probability of 0.01, but the Shewhart method was timelier at more liberal thresholds.
Conclusions: Both kinds of detection performance analysis showed the method based on Holt-Winters exponential smoothing to be superior on non-sparse time series with day-of-week effects. The adaptive CUSUM and Shewhart methods proved optimal on sparse data and data without weekly patterns.
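For concreteness, the sketch below implements simplified versions of two of the compared detectors, an EWMA chart and a one-sided CUSUM, on a simulated daily count series. The baselines, smoothing constants, and alerting thresholds are illustrative assumptions, not the study's tuned settings.

```python
import numpy as np

rng = np.random.default_rng(2)
counts = rng.poisson(5, 200).astype(float)
counts[150] += 15                      # injected single-day spike

def ewma_alerts(x, lam=0.4, threshold=3.0, baseline=28):
    alerts = []
    for t in range(baseline, len(x)):
        mu = x[t - baseline:t].mean()
        sd = x[t - baseline:t].std(ddof=1)
        s = x[t - baseline]
        for v in x[t - baseline + 1:t + 1]:   # exponentially weighted mean
            s = lam * v + (1 - lam) * s
        sigma_s = sd * np.sqrt(lam / (2 - lam))
        if sigma_s > 0 and (s - mu) / sigma_s > threshold:
            alerts.append(t)
    return alerts

def cusum_alerts(x, k=0.5, h=4.0, baseline=28):
    alerts, s = [], 0.0
    for t in range(baseline, len(x)):
        mu = x[t - baseline:t].mean()
        sd = x[t - baseline:t].std(ddof=1)
        z = (x[t] - mu) / sd if sd > 0 else 0.0
        s = max(0.0, s + z - k)        # one-sided upper CUSUM
        if s > h:
            alerts.append(t)
            s = 0.0                    # reset after an alarm
    return alerts

print("EWMA alerts:", ewma_alerts(counts))
print("CUSUM alerts:", cusum_alerts(counts))
```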
Knowledge acquisition for temporal abstraction.
Stein, A; Musen, M A; Shahar, Y
1996-01-01
Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
Detecting Single-Nucleotide Substitutions Induced by Genome Editing.
Miyaoka, Yuichiro; Chan, Amanda H; Conklin, Bruce R
2016-08-01
The detection of genome editing is critical in evaluating genome-editing tools or conditions, but it is not easy to detect genome-editing events, especially single-nucleotide substitutions, without a surrogate marker. Here we introduce a procedure that significantly contributes to the advancement of genome-editing technologies. It uses droplet digital polymerase chain reaction (ddPCR) and allele-specific hydrolysis probes to detect single-nucleotide substitutions generated by genome editing (via homology-directed repair, HDR). HDR events that introduce substitutions using donor DNA are generally infrequent, even with genome-editing tools, and the outcome is only a one-base-pair difference in the 3 billion base pairs of the human genome. This task is particularly difficult in induced pluripotent stem (iPS) cells, in which editing events can be very rare. Therefore, the technological advances described here have implications for therapeutic genome editing and experimental approaches to disease modeling with iPS cells. © 2016 Cold Spring Harbor Laboratory Press.
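The quantitative step behind ddPCR rests on Poisson statistics: a droplet is positive if it received at least one template, so the mean number of copies per droplet is recovered as λ = -ln(fraction of negative droplets). A minimal sketch of that arithmetic, with made-up droplet counts, follows.

```python
import math

def copies_per_droplet(n_positive, n_total):
    """Mean copies per droplet, correcting for droplets with >1 copy."""
    return -math.log(1 - n_positive / n_total)

total = 15000            # accepted droplets (hypothetical counts)
hdr_positive = 45        # droplets positive for the HDR-allele probe
wt_positive = 9000       # droplets positive for the wild-type probe

lam_hdr = copies_per_droplet(hdr_positive, total)
lam_wt = copies_per_droplet(wt_positive, total)

# fraction of edited alleles among all alleles
print("HDR fraction: %.4f" % (lam_hdr / (lam_hdr + lam_wt)))
```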
Design and Deployment of a Pediatric Cardiac Arrest Surveillance System.
Duval-Arnould, Jordan Michel; Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne
2018-01-01
We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 of which required CPR. The PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3% of events. 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts under previous detection methods. Performance data from both defibrillators and bedside monitors increased annually (2013: 1%; 2014: 18%; 2015: 27%). After deployment of this system, detection increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50-70% of PICU, NICU, and PEDS-ED events would have been missed. By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest with an associated increase in the proportion of objective performance data captured is possible.
Follow-up of high energy neutrinos detected by the ANTARES telescope
NASA Astrophysics Data System (ADS)
Mathieu, Aurore
2016-04-01
The ANTARES telescope is well-suited to detect high energy neutrinos produced in astrophysical transient sources as it can observe a full hemisphere of the sky with a high duty cycle. Potential neutrino sources are gamma-ray bursts, core-collapse supernovae and flaring active galactic nuclei. To enhance the sensitivity of ANTARES to such sources, a detection method based on follow-up observations from the neutrino direction has been developed. This program, denoted as TAToO, includes a network of robotic optical telescopes (TAROT, Zadko and MASTER) and the Swift-XRT telescope, which are triggered when an "interesting" neutrino is detected by ANTARES. A follow-up of special events, such as neutrino doublets in time/space coincidence or a single neutrino having a very high energy or in the specific direction of a local galaxy, significantly improves the perspective for the detection of transient sources. The analysis of early and long term follow-up observations to search for fast and slowly varying transient sources, respectively, has been performed and the results covering optical and X-ray data are presented in this contribution.
The effects of lossy compression on diagnostically relevant seizure information in EEG signals.
Higgins, G; McGinley, B; Faul, S; McEvoy, R P; Glavin, M; Marnane, W P; Jones, E
2013-01-01
This paper examines the effects of compression on EEG signals, in the context of automated detection of epileptic seizures. Specifically, it examines the use of lossy compression on EEG signals in order to reduce the amount of data which has to be transmitted or stored, while having as little impact as possible on the information in the signal relevant to diagnosing epileptic seizures. Two popular compression methods, JPEG2000 and SPIHT, were used. A range of compression levels was selected for both algorithms in order to compress the signals with varying degrees of loss. This compression was applied to the database of epileptiform data provided by the University of Freiburg, Germany. The "real-time EEG analysis for event detection" automated seizure detection system was used in place of a trained clinician for scoring the reconstructed data. Results demonstrate that compression by a factor of up to 120:1 can be achieved, with minimal loss in seizure detection performance as measured by the area under the receiver operating characteristic curve of the seizure detection system.
Setting objective thresholds for rare event detection in flow cytometry
Richards, Adam J.; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N.; Weinhold, Kent J.; Chan, Cliburn
2014-01-01
The accurate identification of rare antigen-specific cytokine-positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine-positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events, since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events ("smear"). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, as do commonly employed clustering algorithms. In contrast, a single parameterization for the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
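A minimal sketch of threshold selection by F-beta maximization in this spirit: scan candidate cutoffs over labelled control events and keep the one with the best precision/recall tradeoff. The simulated intensities and the beta value are our assumptions, not EQAPOL data or the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(3)
neg = rng.normal(0.0, 1.0, 5000)          # negative-control events
pos = rng.normal(2.5, 1.0, 50)            # rare cytokine-positive events
values = np.concatenate([neg, pos])
labels = np.concatenate([np.zeros(neg.size), np.ones(pos.size)])

def f_beta(threshold, beta=1.0):
    pred = values > threshold
    tp = np.sum(pred & (labels == 1))
    fp = np.sum(pred & (labels == 0))
    fn = np.sum(~pred & (labels == 1))
    if tp == 0:
        return 0.0
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

# grid search for the cutoff that maximizes F-beta on the control data
grid = np.linspace(values.min(), values.max(), 500)
scores = [f_beta(th) for th in grid]
best = grid[int(np.argmax(scores))]
print("chosen threshold: %.2f (F-beta = %.3f)" % (best, max(scores)))
```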
Hierarchical CuInS2-based heterostructure: Application for photocathodic bioanalysis of sarcosine.
Jiang, Xin-Yuan; Zhang, Ling; Liu, Yi-Li; Yu, Xiao-Dong; Liang, Yan-Yu; Qu, Peng; Zhao, Wei-Wei; Xu, Jing-Juan; Chen, Hong-Yuan
2018-06-01
In this study, on the basis of a hierarchical CuInS2-based heterostructure, a novel cathodic photoelectrochemical (PEC) enzymatic bioanalysis for sarcosine detection was reported. Specifically, a heterostructured CuInS2/NiO/ITO photocathode was prepared, and sarcosine oxidases (SOx) were integrated for the construction of the enzymatic biosensor. In the bioanalysis, O2-dependent suppression of the cathodic photocurrent can be observed due to the competition between the as-fabricated O2-sensitive photocathode and the SOx-catalytic event toward O2 reduction. Based on the sarcosine-controlled O2 concentration, a novel photocathodic enzymatic biosensor could be realized for sensitive and specific sarcosine detection. This work demonstrates the great potential of CuInS2-based heterostructures as a platform for future PEC bioanalytical development, and also a PEC method for sarcosine detection that could easily be extended to numerous other enzymatic systems and, to our knowledge, has not previously been reported. This work is expected to stimulate more interest in the design and implementation of CuInS2-based heterostructured photocathodic enzymatic sensors. Copyright © 2018 Elsevier B.V. All rights reserved.
Identification of unusual events in multi-channel bridge monitoring data
NASA Astrophysics Data System (ADS)
Omenzetter, Piotr; Brownjohn, James Mark William; Moyo, Pilate
2004-03-01
Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to visual inspection for assessing the condition and soundness of civil infrastructure such as bridges. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to the identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of the foundation, ground movement, excessive traffic load, or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localising sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones, the wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later used effectively for detection of anomalous post-construction events.
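A hedged sketch of the wavelet idea: fine-scale detail coefficients respond to abrupt jumps but not to slow drift, so thresholding them against a robust noise estimate flags candidate events. The simulated strain record, the Haar wavelet, and the 5-sigma cutoff are illustrative choices, not the paper's settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
n = 1024
strain = np.cumsum(rng.normal(0, 0.01, n))   # slow drift (temperature etc.)
strain[600:] += 0.8                          # sudden event, e.g. settlement

# single-level discrete wavelet transform; detail coefficients respond to
# sharp changes but not to smooth drift
_, detail = pywt.dwt(strain, "haar")
sigma = np.median(np.abs(detail)) / 0.6745   # robust noise estimate (MAD)
hits = np.where(np.abs(detail) > 5 * sigma)[0] * 2  # map back to sample index
print("abrupt-change samples near:", hits)
```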
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
Method for detection of antibodies for metallic elements
Barrick, C.W.; Clarke, S.M.; Nordin, C.W.
1993-11-30
An apparatus and method for detecting antibodies specific to non-protein antigens. The apparatus is an immunological plate containing a plurality of plastic projections coated with a non-protein material. Assays utilizing the plate are capable of stabilizing the non-protein antigens, with detection levels for antibodies specific to the antigens at the nanogram level. A screening assay with the apparatus allows for early detection of exposure to non-protein materials. Specifically, metallic elements are detected. 10 figures.
A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging
Boulanger, Jérôme; Gidon, Alexandre; Kervran, Charles; Salamero, Jean
2010-01-01
Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy have become key issues in deciphering the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches, designed to distinguish suddenly appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performance of two statistical control procedures and compare the proposed approach to a frame-difference approach using the same controls on a benchmark of synthetic image sequences. We then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling transmembrane protein, Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentrations arising in image sequences from a database acquired with two different microscopy modalities: wide-field (WF) video microscopy using maximum intensity projection along the axial direction, and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected in the pilot biological model. PMID:20976222
Feng, Jiawang; Tang, Shiming; Liu, Lideng; Kuang, Xiaoshan; Wang, Xiaoyu; Hu, Songnan; You, Shuzhu
2015-03-01
Here, we developed a loop-mediated isothermal amplification (LAMP) assay for 11 common transgenic target DNA sequences in GMOs. Six sets of candidate LAMP primers for each target were designed, and their specificity, sensitivity, and reproducibility were evaluated. With the optimized LAMP primers, this LAMP assay was run within 45-60 min to detect all of the tested GMO targets. The sensitivity, specificity, and reproducibility of the LAMP assay were further analyzed in comparison with those of real-time PCR. Consistent with real-time PCR, detection of 0.5% GMO content in equivalent background DNA was possible using this LAMP assay for all targets. In comparison with real-time PCR, the LAMP assay produced the same results with simpler instruments. Hence, the LAMP assay developed here can provide a rapid and simple approach for routine screening as well as event-specific detection of many GMOs.
Elsner, H-A; Himmel, A; Steitz, M; Hammer, P; Schmitz, G; Ballas, M; Blasczyk, R
2002-07-01
The serological characterization of allelic variants that have been generated by large-scale interallelic recombination events indicates which residues may be involved in the formation of epitopes crucial for serological recognition. The allelic product of HLA-B*3531 is composed of B35 in its alpha1 domain and of B61(40) in its alpha2 domain. Both specificities are only weakly detectable with available sera. Allelic products with 'mixed' serology also represent a challenge to DNA-based HLA typing methods, as only the sequence motif of one ancestral allele may be recognized. In this case the hidden specificity would not be considered in the matching process and might not be recognized as an antigen 'unacceptable' to the recipient.
Algorithms for Autonomous Plume Detection on Outer Planet Satellites
NASA Astrophysics Data System (ADS)
Lin, Y.; Bunte, M. K.; Saripalli, S.; Greeley, R.
2011-12-01
We investigate techniques for automated detection of geophysical events (i.e., volcanic plumes) from spacecraft images. The algorithms presented here have not been previously applied to detection of transient events on outer planet satellites. We apply Scale Invariant Feature Transform (SIFT) to raw images of Io and Enceladus from the Voyager, Galileo, Cassini, and New Horizons missions. SIFT produces distinct interest points in every image; feature descriptors are reasonably invariant to changes in illumination, image noise, rotation, scaling, and small changes in viewpoint. We classified these descriptors as plumes using the k-nearest neighbor (KNN) algorithm. In KNN, an object is classified by its similarity to examples in a training set of images based on user defined thresholds. Using the complete database of Io images and a selection of Enceladus images where 1-3 plumes were manually detected in each image, we successfully detected 74% of plumes in Galileo and New Horizons images, 95% in Voyager images, and 93% in Cassini images. Preliminary tests yielded some false positive detections; further iterations will improve performance. In images where detections fail, plumes are less than 9 pixels in size or are lost in image glare. We compared the appearance of plumes and illuminated mountain slopes to determine the potential for feature classification. We successfully differentiated features. An advantage over other methods is the ability to detect plumes in non-limb views where they appear in the shadowed part of the surface; improvements will enable detection against the illuminated background surface where gradient changes would otherwise preclude detection. This detection method has potential applications to future outer planet missions for sustained plume monitoring campaigns and onboard automated prioritization of all spacecraft data. The complementary nature of this method is such that it could be used in conjunction with edge detection algorithms to increase effectiveness. We have demonstrated an ability to detect transient events above the planetary limb and on the surface and to distinguish feature classes in spacecraft images.
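A rough sketch of the SIFT + KNN pipeline under stated assumptions: the image file names are hypothetical placeholders, descriptors from labelled example regions train the classifier, and the interest points of a new frame are then voted on. This is not the authors' code or dataset.

```python
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def descriptors(path):
    """SIFT descriptors of an image, or an empty array if unreadable."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        return np.empty((0, 128), dtype=np.float32)
    _, desc = sift.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 128), dtype=np.float32)

# hypothetical labelled examples: descriptors from a known plume region and
# from plume-free background terrain
plume = descriptors("plume_example.png")
background = descriptors("background_example.png")
X = np.vstack([plume, background])
y = np.concatenate([np.ones(len(plume)), np.zeros(len(background))])

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# vote over the interest points of a new frame; a high plume fraction (or a
# spatial cluster of plume-labelled points) would flag a candidate detection
new = descriptors("new_frame.png")
if len(new):
    print("fraction of plume-like interest points:", knn.predict(new).mean())
```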
An investigation of Martian and terrestrial dust devils
NASA Astrophysics Data System (ADS)
Ringrose, Timothy John
2004-10-01
It is the purpose of this work to provide an insight into the theoretical and practical dynamics of dust devils and how they are detected remotely from orbit or in situ on planetary surfaces. There is particular interest in the detection of convective vortices on Mars; this has been driven by involvement in the development of the Beagle 2 Environmental Sensor Suite. This suite of sensors is essentially a martian weather station and will be the first planetary lander experiment specifically looking for the presence of dust devils on Mars. Dust devils are characterised by their visible dusty core and intense rotation. The physics of particle motion, including dust lofting and the rotational dynamics within convective vortices are explained and modelled. This modelling has helped in identifying dust devils in meteorological data from both terrestrial and martian investigations. An automated technique for dust devil detection using meteorological data has been developed. This technique searches data looking for the specific vortex signature as well as detecting other transient events. This method has been tested on both terrestrial and martian data with surprising results. 38 possible convective vortices were detected in the first 60 sols of the Viking Lander 2 meteorological data. Tests were also carried out on data from a terrestrial dust devil campaign, which provided conclusive evidence from visual observations of the reliability of this technique. A considerable amount of this work does focus on terrestrial vortices. This is to aid in the understanding of dust devils, specifically how, why and when they form. Both laboratory and terrestrial fieldwork is investigated, providing useful data on the general structure of dust devils.
Tonelli, Marcello; Hemmelgarn, Brenda; Reiman, Tony; Manns, Braden; Reaume, M. Neil; Lloyd, Anita; Wiebe, Natasha; Klarenbach, Scott
2009-01-01
Background Erythropoiesis-stimulating agents are used to treat anemia in patients with cancer. However, their safety and effectiveness are controversial. We did a systematic review of the clinical efficacy and harms of these agents in adults with anemia related to cancer or chemotherapy. Methods We conducted a systematic review of published and unpublished randomized controlled trials (RCTs) using accepted methods for literature searches, article selection, data extraction and quality assessment. We included RCTs involving anemic adults with cancer. We compared the use of erythropoiesis-stimulating agents with nonuse and assessed clinical outcomes (all-cause mortality, cardiovascular events and hypertension, health-related quality of life, blood transfusions and tumour response) and harms (serious adverse events) between groups. Results We identified 52 trials (n = 12 006) that met our selection criteria. The pooled all-cause mortality during treatment was significantly higher in the group receiving erythropoiesis-stimulating therapy than in the control group (relative risk [RR] 1.15, 95% confidence interval [CI] 1.03 to 1.29). Compared with no treatment, use of erythropoiesis-stimulating agents led to clinically detectable improvements in disease-specific measures of quality of life. It also reduced the use of blood transfusions (RR 0.64, 95% CI 0.56 to 0.73). However, it led to an increased risk of thrombotic events (RR 1.69, 95% CI 1.27 to 2.24) and serious adverse events (RR 1.16, 95% CI 1.08 to 1.25). Interpretation Use of erythropoiesis-stimulating agents in patients with cancer-related anemia improved some disease-specific measures of quality of life and decreased the use of blood transfusions. However, it increased the risk of death and serious adverse events. Our findings suggest that such therapy should not be used routinely as an alternative to blood transfusion in patients with anemia related to cancer. PMID:19407261
Development of structure switching aptamer assay for detection of aflatoxin M1 in milk sample.
Sharma, Atul; Catanante, Gaëlle; Hayat, Akhtar; Istamboulie, Georges; Ben Rejeb, Ines; Bhand, Sunil; Marty, Jean Louis
2016-09-01
The discovery of the in-vitro systematic evolution of ligands by exponential enrichment (SELEX) process has considerably broadened the utility of aptamers as bio-recognition elements, providing high binding affinity and specificity against target analytes. Recent research has focused on the development of structure-switching signaling aptamer assays, transducing the aptamer-target recognition event into an easily detectable signal. In this paper, we demonstrate the development of a structure-switching aptamer assay for the determination of aflatoxin M1 (AFM1) employing a quenching-dequenching mechanism. Hybridization of a fluorescein-labelled anti-AFM1 aptamer (F-aptamer) with a TAMRA-labelled complementary sequence (Q-aptamer) brings the fluorophore and the quencher into close proximity, which results in maximum fluorescence quenching. On addition of AFM1, the target-induced formation of an antiparallel G-quadruplex aptamer-AFM1 complex results in fluorescence recovery. Under optimized experimental conditions, the developed method showed good linearity with a limit of detection (LOD) of 5.0 ng kg(-1) for AFM1. The specificity of the sensing platform was carefully investigated against aflatoxin B1 (AFB1) and ochratoxin A (OTA); the developed assay showed high specificity towards AFM1. The practical application of the developed aptamer assay was verified by detection of AFM1 in spiked milk samples. Good recoveries in the range of 94.40% to 95.28% (n=3) were obtained from AFM1-spiked milk samples. Copyright © 2016. Published by Elsevier B.V.
Park, Jong-Uk; Lee, Hyo-Ki; Lee, Junghun; Urtnasan, Erdenebayar; Kim, Hojoong; Lee, Kyoung-Joung
2015-09-01
This study proposes a method for automatically classifying sleep apnea/hypopnea events based on sleep states and the severity of sleep-disordered breathing (SDB) using photoplethysmogram (PPG) and oxygen saturation (SpO2) signals acquired from a pulse oximeter. The PPG was used to classify sleep state, while the severity of SDB was estimated by detecting events of SpO2 oxygen desaturation. Furthermore, we classified sleep apnea/hypopnea events by applying different categorisations according to the severity of SDB based on a support vector machine. The classification results showed sensitivities and positive predictive values of 74.2% and 87.5% for apnea, 87.5% and 63.4% for hypopnea, and 92.4% and 92.8% for apnea + hypopnea, respectively. These results represent better or comparable outcomes compared with those of previous studies. In addition, our method reliably detected sleep apnea/hypopnea events in all patient groups, without bias towards particular patient groups, when the algorithm was applied to a variety of patient groups. Therefore, this method has the potential to diagnose SDB more reliably and conveniently using a pulse oximeter.
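One ingredient, the detection of SpO2 desaturation events, can be sketched simply: flag samples that drop a few percentage points below a recent running maximum and extend each event until recovery. The simulated signal, the 3% criterion, and the window length below are our assumptions, not the study's clinical definitions.

```python
import numpy as np

rng = np.random.default_rng(5)
spo2 = 97 + rng.normal(0, 0.3, 600)     # 10 minutes of SpO2 at 1 Hz
spo2[300:330] -= 5                      # simulated desaturation episode

def desaturation_events(x, drop=3.0, window=60):
    """Return (start, end) sample indices of desaturation events."""
    events, t = [], window
    while t < len(x):
        baseline = x[t - window:t].max()   # recent running maximum
        if baseline - x[t] >= drop:
            start = t
            while t < len(x) and baseline - x[t] >= drop:
                t += 1                     # extend until recovery
            events.append((start, t))
        t += 1
    return events

print("desaturation events (start, end):", desaturation_events(spo2))
```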
Austin, Peter C.
2017-01-01
The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694
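A hedged sketch of this kind of power estimation, using the lifelines package's Schoenfeld-residual-based test for proportional hazards. The data-generating mechanism (a binary covariate whose Weibull shape differs between groups, which violates proportionality), the censoring scheme, and all sample sizes are our illustrative choices, not the paper's simulation design.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(6)

def one_trial(n=200):
    x = rng.integers(0, 2, n)
    shape = np.where(x == 1, 1.8, 1.0)   # group-specific Weibull shape: non-PH
    t = rng.weibull(shape) * 10
    c = rng.uniform(5, 15, n)            # random censoring times
    df = pd.DataFrame({"T": np.minimum(t, c),
                       "E": (t <= c).astype(int),
                       "x": x})
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    res = proportional_hazard_test(cph, df, time_transform="rank")
    return float(np.min(res.p_value)) < 0.05   # single covariate, one p-value

# power = rejection rate over repeated simulated datasets
rejections = sum(one_trial() for _ in range(200))
print("estimated power: %.2f" % (rejections / 200))
```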
Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference
NASA Astrophysics Data System (ADS)
Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.
2014-12-01
The well-studied Wenchuan aftershock sequence, triggered by the May 12, 2008, Ms 8.0 mainshock, offers an ideal test case for evaluating the effectiveness of waveform correlation coupled with double-difference relocation to detect and locate events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then take the detections and relocate them using the double-difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one using waveform correlation to find even more events. We benchmark our results against previously published relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events with precision at the few-hundred-metre and even tens-of-metres level.
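The core detection step, correlating a master-event template against the continuous trace, can be sketched in a few lines; the synthetic trace, template, and 0.6 threshold below stand in for real miniSEED data (which would typically be read with something like ObsPy) and the study's detector settings.

```python
import numpy as np

rng = np.random.default_rng(7)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
trace = rng.normal(0, 0.3, 5000)
trace[2000:2100] += 0.8 * template     # buried repeat of the master event

def normalized_xcorr(trace, template):
    """Sliding normalized cross-correlation of template against trace."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(trace) - n + 1)
    for i in range(out.size):
        w = trace[i:i + n]
        s = w.std()
        out[i] = np.dot(t, (w - w.mean()) / s) / n if s > 0 else 0.0
    return out

cc = normalized_xcorr(trace, template)
detections = np.where(cc > 0.6)[0]     # threshold is data-dependent in practice
print("detections at samples:", detections)
```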
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
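A toy sketch of the shallow baseline: a maximum-entropy (logistic regression) classifier over bag-of-words features from a small window around the trigger word. The two training examples and labels are fabricated for illustration; the real system is trained on the BioNLP 2009 data and adds deep-parser features.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def window(tokens, trigger_idx, size=3):
    """Bag-of-words context of +/- size tokens around the trigger."""
    lo, hi = max(0, trigger_idx - size), trigger_idx + size + 1
    return " ".join(tokens[lo:hi])

train_texts = [
    window("analysis of IkappaBalpha phosphorylation was proposed".split(), 3),
    window("inhibition of IkappaBalpha phosphorylation was observed".split(), 3),
]
train_labels = ["speculated", "negated"]   # illustrative labels only

clf = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(train_texts, train_labels)
print(clf.predict([window("no phosphorylation of STAT3 was detected".split(), 1)]))
```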
O'Leary, Kevin J; Devisetty, Vikram K; Patel, Amitkumar R; Malkenson, David; Sama, Pradeep; Thompson, William K; Landler, Matthew P; Barnard, Cynthia; Williams, Mark V
2013-02-01
Research supports medical record review using screening triggers as the optimal method to detect hospital adverse events (AE), yet the method is labour-intensive. This study compared a traditional trigger tool with an enterprise data warehouse (EDW) based screening method to detect AEs. We created 51 automated queries based on 33 traditional triggers from prior research, and then applied them to 250 randomly selected medical patients hospitalised between 1 September 2009 and 31 August 2010. Two physicians each abstracted records from half the patients using a traditional trigger tool and then performed targeted abstractions for patients with positive EDW queries in the complementary half of the sample. A third physician confirmed presence of AEs and assessed preventability and severity. Traditional trigger tool and EDW based screening identified 54 (22%) and 53 (21%) patients with one or more AE. Overall, 140 (56%) patients had one or more positive EDW screens (total 366 positive screens). Of the 137 AEs detected by at least one method, 86 (63%) were detected by a traditional trigger tool, 97 (71%) by EDW based screening and 46 (34%) by both methods. Of the 11 total preventable AEs, 6 (55%) were detected by traditional trigger tool, 7 (64%) by EDW based screening and 2 (18%) by both methods. Of the 43 total serious AEs, 28 (65%) were detected by traditional trigger tool, 29 (67%) by EDW based screening and 14 (33%) by both. We found relatively poor agreement between traditional trigger tool and EDW based screening with only approximately a third of all AEs detected by both methods. A combination of complementary methods is the optimal approach to detecting AEs among hospitalised patients.
Detection limit for rate fluctuations in inhomogeneous Poisson processes
NASA Astrophysics Data System (ADS)
Shintani, Toshiaki; Shinomoto, Shigeru
2012-04-01
Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.
Wang, Jingbo; Templeton, Dennise C.; Harris, David B.
2015-07-30
Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that could then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2 and M0.8. The increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in the long-term sustainability and monitoring of managed underground reservoirs.
The digital trigger system for the RED-100 detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naumov, P. P., E-mail: ddr727@yandex.ru; Akimov, D. Yu.; Belov, V. A.
The system for forming a trigger for the liquid xenon detector RED-100 has been developed. The trigger can be generated for all types of events that the detector needs for calibration and data acquisition, including events with a single ionization electron. In the system, a mechanism of event detection is implemented according to which a timestamp and event type are assigned to each event. The trigger system is required in systems searching for rare events to select and keep only the necessary information from the ADC array. The specifications and implementation of the trigger unit, which provides a high efficiency of response even to low-energy events, are considered.
SNIa detection in the SNLS photometric analysis using Morphological Component Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.
2015-04-01
Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images, with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible contamination of signal coordinates. We developed a subtracted-image stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of the objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects, while most spurious detections exhibit different shapes. A two-step procedure was necessary to obtain a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data, this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all of them faint. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.
Shan, Yun; Xu, Jing-Juan; Chen, Hong-Yuan
2011-07-01
This work reports an aptasensor for ultrasensitive detection of thrombin based on remarkably efficient energy-transfer-induced electrochemiluminescence (ECL) quenching from a CdS:Mn nanocrystal (NC) film to CdTe QD-doped silica nanoparticles (CdTe/SiO(2) NPs). CdTe/SiO(2) NPs were synthesized via the Stöber method and exhibited strong, blackbody-like absorption over a wide spectral range without excitonic emission, which made them excellent ECL quenchers. Within the effective distance of energy scavenging, the ECL quenching efficiency depended on the number of CdTe QDs doped into the silica NPs. Using 40 nm diameter silica NPs doped with ca. 200 CdTe QDs on average as ECL quenching labels, attomolar detection of thrombin was successfully realized. The protein detection involves a competitive binding event, in which thrombin displaces CdTe/SiO(2) NP-labeled probe DNA hybridized with a capturing aptamer immobilized on a CdS:Mn NC film-modified glassy carbon electrode surface, through specific aptamer-protein affinity interactions. This results in the displacement of the ECL quenching labels from the CdS:Mn NC film and concomitant ECL signal recovery. Owing to the high content of CdTe QDs in the silica NPs, the increment of ECL intensity (ΔI(ECL)) and the concentration of thrombin showed a double-logarithmic linear correlation in the range of 5.0 aM to 5.0 fM, with a detection limit of 1 aM. Moreover, the aptasensor hardly responded to antibody, bovine serum albumin (BSA), haemoglobin (Hb), or lysozyme, showing good detection selectivity for thrombin. This long-distance energy scavenging could have promising application prospects in the detection of biological recognition events at the molecular level.
Sato, Naoki; Seo, Genichiro; Benno, Yoshimi
2014-01-01
Strain-specific polymerase chain reaction (PCR) primers for detection of Bacillus mesentericus strain TO-A (BM TO-A) were developed. The randomly amplified polymorphic DNA (RAPD) technique was used to produce potential strain-specific markers. A 991-bp RAPD marker found to be strain-specific was sequenced, and two primer pairs specific to BM TO-A were constructed based on this sequence. In addition, we explored a more specific DNA region using inverse PCR, and designed a strain-specific primer set for use in real-time quantitative PCR (qPCR). These primer pairs were tested against 25 Bacillus subtilis strains and were found to be strain-specific. After examination of the detection limit and linearity of detection of BM TO-A in feces, the qPCR method and strain-specific primers were used to quantify BM TO-A in the feces of healthy volunteers who had ingested 3×10(8) colony-forming units (CFU) of BM TO-A per day in tablets. During the administration period, BM TO-A was detected in the feces of all 24 subjects, and the average number of BM TO-A cells detected using the culture method and qPCR was about 10(4.8) and 10(5.8) per gram of feces, respectively. Using the qPCR method, BM TO-A was detected in the feces of half of the subjects 3 d after withdrawal, and in the feces of only one subject 1 week after withdrawal. These results suggest that the qPCR method using BM TO-A strain-specific primers is useful for the quantitative detection of this strain in feces.
Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan
2017-12-20
Biomedical event extraction is one of the most active frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and argument detection, both of which can be considered classification problems. However, traditional state-of-the-art methods are based on support vector machines (SVM) with massive manually designed one-hot features, which require enormous work but lack semantic relations among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines context, in the form of dependency-based word embeddings, with task-based features represented in a distributed way as the input to deep learning models. Finally, we used a softmax classifier to label the example candidates. The experimental results on the Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% overall compared with the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the problems of the semantic gap and the dimension disaster found in traditional one-hot representation methods. The promising results demonstrate that our proposed method is effective for biomedical event extraction.
A novel automatic method for monitoring Tourette motor tics through a wearable device.
Bernabei, Michel; Preatoni, Ezio; Mendez, Martin; Piccini, Luca; Porta, Mauro; Andreoni, Giuseppe
2010-09-15
The aim of this study was to propose a novel automatic method for quantifying motor tics caused by Tourette Syndrome (TS). In this preliminary report, the feasibility of the monitoring process was tested over a series of standard clinical trials in a population of 12 subjects affected by TS. A wearable instrument with an embedded three-axial accelerometer was used to detect and classify motor tics during standing and walking activities. An algorithm was devised to analyze the acceleration data by eliminating noise, detecting peaks connected to pathological events, and classifying the intensity and frequency of motor tics into quantitative scores. These indexes were compared with the video-based ones provided by expert clinicians, which were taken as the gold standard. Sensitivity, specificity, and accuracy of tic detection were estimated, and an agreement analysis was performed through least-squares regression and the Bland-Altman test. The tic recognition algorithm showed sensitivity = 80.8% ± 8.5% (mean ± SD), specificity = 75.8% ± 17.3%, and accuracy = 80.5% ± 12.2%. The agreement study showed that automatic detection tended to overestimate the number of tics that occurred, although this appeared to be a systematic error due to the different recognition principles of the wearable and video-based systems. Furthermore, there was substantial concurrency with the gold standard in estimating the severity indexes. The proposed methodology gave promising performances in terms of automatic motor-tic detection and classification in a standard clinical context. The system may provide physicians with a quantitative aid for TS assessment. Further developments will focus on extending its application to everyday long-term monitoring outside clinical environments. © 2010 Movement Disorder Society.
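A minimal sketch of the peak-detection step under stated assumptions (the sampling rate, band edges, and thresholds are ours, not the instrument's): band-pass filter the acceleration signal and pick prominent peaks as candidate tic events.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                  # Hz, assumed sampling rate
rng = np.random.default_rng(8)
t = np.arange(0, 30, 1 / fs)
acc = rng.normal(0, 0.05, t.size)           # baseline movement noise
for onset in (5.0, 12.0, 21.0):             # three simulated tics
    i = int(onset * fs)
    acc[i:i + 20] += np.hanning(20) * 1.5   # 0.2 s burst

# band-pass to suppress gravity/drift and high-frequency sensor noise
b, a = butter(4, [1, 20], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, acc)

# peaks above an amplitude threshold, at least 0.5 s apart
peaks, props = find_peaks(np.abs(filtered), height=0.5, distance=int(0.5 * fs))
print("candidate tics at t =", t[peaks], "s; amplitudes:", props["peak_heights"])
```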
2012-01-01
Background Recent emerging evidences identify Human Papillomavirus (HPV) related Head and Neck squamous cell carcinomas (HN-SCCs) as a separate subgroup among Head and Neck Cancers with different epidemiology, histopathological characteristics, therapeutic response to chemo-radiation treatment and clinical outcome. However, there is not a worldwide consensus on the methods to be used in clinical practice. The endpoint of this study was to demonstrate the reliability of a triple method which combines evaluation of: 1. p16 protein expression by immunohistochemistry (p16-IHC); 2. HPV-DNA genotyping by consensus HPV-DNA PCR methods (Consensus PCR); and 3 viral integration into the host by in situ hybridization method (ISH). This triple method has been applied to HN-SCC originated from oral cavity (OSCC) and oropharynx (OPSCC), the two anatomical sites in which high risk (HR) HPVs have been clearly implicated as etiologic factors. Methylation-Specific PCR (MSP) was performed to study inactivation of p16-CDKN2a locus by epigenetic events. Reliability of multiple methods was measured by Kappa statistics. Results All the HN-SCCs confirmed HPV positive by PCR and/or ISH were also p16 positive by IHC, with the latter showing a very high level of sensitivity as single test (100% in both OSCC and OPSCC) but lower specificity level (74% in OSCC and 93% in OPSCC). Concordance analysis between ISH and Consensus PCR showed a faint agreement in OPSCC (κ = 0.38) and a moderate agreement in OSCC (κ = 0.44). Furthermore, the addition of double positive score (ISHpositive and Consensus PCR positive) increased significantly the specificity of HR-HPV detection on formalin-fixed paraffin embedded (FFPE) samples (100% in OSCC and 78.5% in OPSCC), but reduced the sensitivity (33% in OSCC and 60% in OPSCC). The significant reduction of sensitivity by the double method was compensated by a very high sensitivity of p16-IHC detection in the triple approach. Conclusions Although HR-HPVs detection is of utmost importance in clinical settings for the Head and Neck Cancer patients, there is no consensus on which to consider the 'golden standard' among the numerous detection methods available either as single test or combinations. Until recently, quantitative E6 RNA PCR has been considered the 'golden standard' since it was demonstrated to have very high accuracy level and very high statistical significance associated with prognostic parameters. In contrast, quantitative E6 DNA PCR has proven to have very high level of accuracy but lesser prognostic association with clinical outcome than the HPV E6 oncoprotein RNA PCR. However, although it is theoretically possible to perform quantitative PCR detection methods also on FFPE samples, they reach the maximum of accuracy on fresh frozen tissue. Furthermore, worldwide diagnostic laboratories have not all the same ability to analyze simultaneously both FFPE and fresh tissues with these quantitative molecular detection methods. Therefore, in the current clinical practice a p16-IHC test is considered as sufficient for HPV diagnostic in accordance with the recently published Head and Neck Cancer international guidelines. Although p16-IHC may serve as a good prognostic indicator, our study clearly demonstrated that it is not satisfactory when used exclusively as the only HPV detecting method. 
Adding ISH, although known to be less sensitive than PCR-based detection methods, has the advantage of preserving the morphological context of HPV-DNA signals in FFPE samples and thus increases the overall specificity of the p16/Consensus PCR combination tests. PMID:22376902
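The kappa values quoted above measure chance-corrected agreement between paired test calls. A minimal sketch of Cohen's kappa for two binary HPV assays follows; the ten paired calls are invented for illustration, not the study's data:

```python
# Cohen's kappa for agreement between two binary test calls (e.g. ISH vs
# Consensus PCR); the paired calls below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

ish = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
pcr = [1, 0, 0, 0, 1, 1, 1, 0, 0, 1]
print(f"kappa = {cohen_kappa_score(ish, pcr):.2f}")  # 0.60: 'moderate' by the usual convention
```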
Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura; Burkom, Howard
2015-01-01
We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists' use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy's focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead.
Bacteriophage Amplification-Coupled Detection and Identification of Bacterial Pathogens
NASA Astrophysics Data System (ADS)
Cox, Christopher R.; Voorhees, Kent J.
Current methods of species-specific bacterial detection and identification are complex, time-consuming, and often require expensive specialized equipment and highly trained personnel. Numerous biochemical and genotypic identification methods have been applied to bacterial characterization, but all rely on tedious microbiological culturing practices and/or costly sequencing protocols, which render them impractical for deployment as rapid, cost-effective point-of-care or field detection and identification methods. With a view towards addressing these shortcomings, we have exploited the evolutionarily conserved interactions between a bacteriophage (phage) and its bacterial host to develop species-specific detection methods. Phage amplification-coupled matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was utilized to rapidly detect phage propagation resulting from species-specific in vitro bacterial infection. This novel signal amplification method allowed for bacterial detection and identification in as little as 2 h and, when combined with disulfide bond reduction methods developed in our laboratory to enhance MALDI-TOF-MS resolution, was observed to lower the limit of detection by several orders of magnitude relative to conventional spectroscopy and phage typing methods. Phage amplification has been combined with lateral flow immunochromatography (LFI) to develop rapid, easy-to-operate, portable, species-specific point-of-care (POC) detection devices. Prototype LFI detectors have been developed and characterized for Yersinia pestis and Bacillus anthracis, the etiologic agents of plague and anthrax, respectively. Comparable sensitivity and rapidity were observed when phage amplification was adapted to a species-specific handheld LFI detector, thus allowing for rapid, simple POC bacterial detection and identification while eliminating the need for bacterial culturing or DNA isolation and amplification techniques.
Population-based absolute risk estimation with survey data
Kovalchik, Stephanie A.; Pfeiffer, Ruth M.
2013-01-01
Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
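The quantity defined above has a direct computational form when each cause-specific hazard is piecewise constant (the piecewise exponential model). The sketch below illustrates the integral of the cause-specific hazard against all-cause survival; the hazard values and relative-risk multipliers are hypothetical, not NHANES-derived estimates, and the paper's survey weighting and influence-function variance machinery is omitted:

```python
# Minimal sketch of cause-specific absolute risk under competing events
# with piecewise constant (piecewise exponential) hazards; all numeric
# inputs below are hypothetical.
import numpy as np

def absolute_risk(breaks, base_hazards, rr, cause):
    """
    breaks:       interval endpoints, e.g. [5, 10] for (0,5], (5,10] years
    base_hazards: (n_causes, n_intervals) baseline hazards per year
    rr:           individualized relative risk per cause
    cause:        index of the event of interest; horizon = last break
    """
    haz = base_hazards * np.asarray(rr, float)[:, None]  # individualized hazards
    edges = np.concatenate(([0.0], np.asarray(breaks, float)))
    risk, cum_all = 0.0, 0.0
    for i in range(len(breaks)):
        width = edges[i + 1] - edges[i]
        total = haz[:, i].sum()          # all-cause hazard in this interval
        # P(event of `cause` in this interval, and event-free at its start):
        risk += np.exp(-cum_all) * (haz[cause, i] / total) * (1.0 - np.exp(-total * width))
        cum_all += total * width
    return risk

# Two competing causes (say, CVD death and cancer death), two 5-year intervals:
print(absolute_risk([5, 10],
                    np.array([[0.010, 0.020],
                              [0.005, 0.010]]),
                    rr=[1.4, 1.0], cause=0))
```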
Li, Ye; Whelan, Michael; Hobbs, Leigh; Fan, Wen Qi; Fung, Cecilia; Wong, Kenny; Marchand-Austin, Alex; Badiani, Tina; Johnson, Ian
2016-06-27
In 2014/2015, Public Health Ontario developed disease-specific, cumulative sum (CUSUM)-based statistical algorithms for detecting aberrant increases in reportable infectious disease incidence in Ontario. The objective of this study was to determine whether the prospective application of these CUSUM algorithms, based on historical patterns, had improved specificity and sensitivity compared with the currently used Early Aberration Reporting System (EARS) algorithm, developed by the US Centers for Disease Control and Prevention. A total of seven algorithms were developed for the following diseases: cyclosporiasis, giardiasis, influenza (one each for type A and type B), mumps, pertussis, and invasive pneumococcal disease. Historical data were used as a baseline to assess known outbreaks. Regression models were used to model seasonality, and CUSUM was applied to the difference between observed and expected counts. An interactive web application was developed, allowing program staff to interact directly with the data and tune the parameters of the CUSUM algorithms using their expertise on the epidemiology of each disease. Using these parameters, the CUSUM detection system was applied prospectively and the results were compared with the outputs generated by EARS. The outcomes were the detection of outbreaks, identification of the start of a known seasonal increase, and prediction of the peak in activity. The CUSUM algorithms detected provincial outbreaks earlier than the EARS algorithm, identified the start of the influenza season in advance of traditional methods, and had fewer false positive alerts. Additionally, involving staff in the creation of the algorithms improved both their understanding of the algorithms and their use in practice. Using interactive web-based technology to tune CUSUM improved the sensitivity and specificity of the detection algorithms.
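A minimal sketch of a one-sided CUSUM applied to observed-minus-expected counts, in the spirit of the approach described above, is shown below. The reference value k and decision threshold h are hypothetical tuning parameters of the kind the web application would expose; the expected counts would come from the seasonal regression model:

```python
# One-sided CUSUM on standardized observed-minus-expected counts; k and h
# are illustrative tuning parameters, not Public Health Ontario's values.
import numpy as np

def cusum_alerts(observed, expected, k=0.5, h=4.0):
    """Return indices (e.g. days) where the positive CUSUM exceeds h."""
    resid = (np.asarray(observed, float) - expected) / np.sqrt(np.maximum(expected, 1.0))
    s, alerts = 0.0, []
    for t, z in enumerate(resid):
        s = max(0.0, s + z - k)   # accumulate only upward deviations
        if s > h:
            alerts.append(t)
            s = 0.0               # reset after signaling
    return alerts
```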
Visualizing frequent patterns in large multivariate time series
NASA Astrophysics Data System (ADS)
Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.
2011-01-01
The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series into a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles to visualize the occurrences and hierarchical relationships of motifs in a multivariate time series; (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis; and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected in a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
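The encode-then-count idea (cluster each time step into a symbol, then look for recurring subsequences) can be sketched as follows; the clustering method, symbol count, and window length are illustrative choices, not the paper's configuration:

```python
# Toy sketch of motif discovery by event encoding: discretize each time
# step of a multivariate series into a symbol via k-means, then count
# recurring fixed-length symbol subsequences as candidate motifs.
from collections import Counter
import numpy as np
from sklearn.cluster import KMeans

def frequent_motifs(series, n_symbols=4, window=3, top=5):
    """series: (T, d) multivariate time series; returns the top motifs."""
    symbols = KMeans(n_clusters=n_symbols, n_init=10).fit_predict(series)
    windows = [tuple(symbols[i:i + window]) for i in range(len(symbols) - window + 1)]
    return Counter(windows).most_common(top)

rng = np.random.default_rng(0)
print(frequent_motifs(rng.normal(size=(200, 3))))
```

Each frequent symbol sequence would then be scored against a performance metric and laid out visually, as the abstract describes.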
Slow earthquakes in microseism frequency band (0.1-2 Hz) off the Kii peninsula
NASA Astrophysics Data System (ADS)
Kaneko, L.; Ide, S.; Nakano, M.
2017-12-01
Slow earthquakes are divided into deep tectonic tremors, very low frequency (VLF) events, and slow slip events (SSEs), each of which is observed in a different frequency band. Tremors are observed above 2 Hz, and VLF signals are visible mainly at 0.01-0.05 Hz. It has generally been very difficult to find signals of slow underground deformation at the frequencies between them, i.e., 0.1-2 Hz, where microseism noise is dominant. However, after a Mw 5.9 plate boundary earthquake off the Kii peninsula on April 1st, 2016, sufficiently large signals were observed in the microseism band, accompanied by signals from active tremors, VLF events, and SSEs, by the ocean bottom seismometer network DONET maintained by JAMSTEC and NIED. This is the first observation of slow earthquakes in the microseism frequency band. Here we report the detection and location of events in this band and compare them with the spatial and temporal distributions of ordinary tectonic tremors above 2 Hz and of VLF events. We used continuous records of 20 broadband seismometers of DONET from April 1st to 12th. We detected events by calculating arrival time differences between stations using the envelope correlation method of Ide (2010). Unlike ordinary applications, we repeated the analysis for seismograms bandpass-filtered in four separate frequency bands: 0.1-1, 1-2, 2-4, and 4-8 Hz. For each band, we successfully detected events and determined their hypocenter locations. Many VLF events have also been detected in this region in the 0.03-0.05 Hz band, with locations and focal mechanisms determined using the method of Nakano et al. (2008). In the 0.1-1 Hz microseism band, hypocenters were determined mainly on April 10th, when microseism noise was low and signal amplitudes were quite large. In several time windows, events were detected in all four bands and located within the 2-sigma error ellipses, with similar source time functions. Sometimes events were detected in only two or three bands, suggesting wide variation in wave radiation at different frequencies. Although the location errors are not always small enough to confirm the collocation of sources, due to uncertainty in the velocity structure, we can confirm that seismic waves are radiated in the microseism band by slow earthquakes, which are considered a continuous, broadband, and complex phenomenon.
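A minimal sketch of the envelope-correlation step described above: bandpass two station records, take their envelopes via the Hilbert transform, and read the arrival-time difference from the cross-correlation peak. The band edges follow the 0.1-1 Hz example; the filter order is an arbitrary choice, and the full method combines many station pairs to locate each event:

```python
# Envelope correlation between two station records; band edges follow the
# 0.1-1 Hz microseism band discussed above, other parameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, correlate

def arrival_time_diff(tr1, tr2, fs, band=(0.1, 1.0)):
    """Return the lag (s) of tr1 relative to tr2 from their envelope cross-correlation."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    env1 = np.abs(hilbert(filtfilt(b, a, tr1)))
    env2 = np.abs(hilbert(filtfilt(b, a, tr2)))
    cc = correlate(env1 - env1.mean(), env2 - env2.mean(), mode="full")
    lag = np.argmax(cc) - (len(env2) - 1)
    return lag / fs   # positive: the signal arrives later at station 1
```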
NASA Astrophysics Data System (ADS)
Muszynski, G.; Kashinath, K.; Wehner, M. F.; Prabhat, M.; Kurlin, V.
2017-12-01
We investigate novel approaches to detecting, classifying and characterizing extreme weather events, such as atmospheric rivers (ARs), in large high-dimensional climate datasets. ARs are narrow filaments of concentrated water vapour in the atmosphere that bring much of the precipitation in many mid-latitude regions. The precipitation associated with ARs is also responsible for major flooding events in many coastal regions of the world, including the west coast of the United States and western Europe. In this study we combine ideas from Topological Data Analysis (TDA) with Machine Learning (ML) for detecting, classifying and characterizing extreme weather events such as ARs. TDA is a young field at the interface of topology and computer science that studies "shape", i.e., hidden topological structure, in raw data. It has been applied successfully in many areas of applied science, including complex networks, signal processing and image recognition. Using TDA, we provide ARs with a shape characteristic as a new feature descriptor for the task of AR classification. In particular, we track the change in topology of precipitable water (integrated water vapour) fields using the Union-Find algorithm. We use the generated feature descriptors with ML classifiers to establish the reliability and classification performance of our approach. We utilize the parallel toolkit for extreme climate events analysis (TECA: Petascale Pattern Recognition for Climate Science, Prabhat et al., Computer Analysis of Images and Patterns, 2015) for comparison, treating events identified by TECA as ground truth. Preliminary results indicate that our approach brings new insight into the study of ARs and provides quantitative information about the relevance of topological feature descriptors in analyses of large climate datasets. We illustrate this method on climate model output and NCEP reanalysis datasets. Further, our method outperforms existing methods on detection and classification of ARs. This work illustrates that TDA combined with ML may provide a uniquely powerful approach for the detection, classification and characterization of extreme weather phenomena.
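The Union-Find component-tracking idea can be illustrated on a single precipitable-water field: threshold the field and count the connected super-threshold regions. The 40 kg/m^2 threshold and 4-connectivity below are hypothetical placeholders; the full method tracks how such components appear and merge across thresholds and time to build topological feature descriptors:

```python
# Union-Find (disjoint-set) pass counting connected regions of high
# precipitable water in one 2D field; threshold and connectivity are
# illustrative, not the study's settings.
import numpy as np

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path halving
        i = parent[i]
    return i

def count_components(field, thresh=40.0):
    """field: 2D array (e.g., precipitable water in kg/m^2)."""
    rows, cols = field.shape
    mask = field > thresh
    parent = {(y, x): (y, x) for y in range(rows) for x in range(cols) if mask[y, x]}
    for (y, x) in list(parent):
        for nb in ((y + 1, x), (y, x + 1)):        # 4-connectivity neighbours
            if nb in parent:
                ra, rb = find(parent, (y, x)), find(parent, nb)
                if ra != rb:
                    parent[ra] = rb                # union the two regions
    return len({find(parent, p) for p in parent})

rng = np.random.default_rng(0)
print(count_components(rng.uniform(20, 60, size=(50, 80))))
```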
Yamashita, S; Nakagawa, H; Sakaguchi, T; Arima, T-H; Kikoku, Y
2018-01-01
Heat-resistant fungi occur sporadically and are a continuing problem for the food and beverage industry. The genus Talaromyces is a typical example, capable of producing the heat-resistant ascospores responsible for the spoilage of processed food products. Isocitrate lyase, a signature enzyme of the glyoxylate cycle, is required for the metabolism of non-fermentable carbon compounds such as acetate and ethanol. Here, species-specific primer sets for the detection and identification of DNA derived from Talaromyces macrosporus and Talaromyces trachyspermus were designed based on the nucleotide sequences of their isocitrate lyase genes. Polymerase chain reaction (PCR) using each species-specific primer set amplified products specific to T. macrosporus and T. trachyspermus. Other fungal species that cause food spoilage, such as Byssochlamys fulva and Hamigera striata, were not detected using the Talaromyces-specific primer sets. The detection limit for each species-specific primer set was determined to be 50 pg of template DNA, without using a nested PCR method. The specificity of each species-specific primer set was maintained in the presence of a 1,000-fold excess of genomic DNA from other fungi. The method also detected fungal DNA extracted from blueberry inoculated with T. macrosporus. This PCR method provides a quick, simple, powerful and reliable way to detect T. macrosporus and T. trachyspermus. PCR-based detection is rapid, convenient and sensitive compared with traditional methods of detecting heat-resistant fungi. In this study, a PCR-based method was developed for the detection and identification of amplification products from Talaromyces macrosporus and Talaromyces trachyspermus using primer sets that target the isocitrate lyase gene. This method could be used for the on-site detection of T. macrosporus and T. trachyspermus in the near future and will be helpful in the safety control of raw materials and in food and beverage production. © 2017 The Authors. Letters in Applied Microbiology published by John Wiley & Sons Ltd on behalf of The Society for Applied Microbiology.
2013-06-01
benefitting from rapid, automated discrimination of specific predefined signals, and is free-standing (requiring no other plugins or packages). The ... previously labeled dataset, and comparing two labeled datasets. Subject terms: artifact, signal detection, EEG, MATLAB, toolbox.