Sample records for detection method called

  1. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  2. Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys

    USGS Publications Warehouse

    Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.

    2007-01-01

    Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona and density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.

  3. Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean.

    PubMed

    Stafford, K M; Fox, C G; Clark, D S

    1998-12-01

    Analysis of acoustic signals recorded from the U.S. Navy's SOund SUrveillance System (SOSUS) was used to detect and locate blue whale (Balaenoptera musculus) calls offshore in the northeast Pacific. The long, low-frequency components of these calls are characteristic of calls recorded in the presence of blue whales elsewhere in the world. Mean values for frequency and time characteristics from field-recorded blue whale calls were used to develop a simple matched filter for detecting such calls in noisy time series. The matched filter was applied to signals from three different SOSUS arrays off the coast of the Pacific Northwest to detect and associate individual calls from the same animal on the different arrays. A U.S. Navy maritime patrol aircraft was directed to an area where blue whale calls had been detected on SOSUS using these methods, and the presence of vocalizing blue whales was confirmed at the site with field recordings from sonobuoys.
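
    The matched-filter detection step described above can be illustrated with a minimal sketch: cross-correlate a call template against a noisy time series and flag correlation peaks above a threshold. The template (a 16 Hz tone), sampling rate, call amplitudes, and threshold below are illustrative assumptions, not values from the SOSUS analysis.

```python
import numpy as np
from scipy.signal import correlate

fs = 100.0                                   # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)                 # 10 s template window
template = np.sin(2 * np.pi * 16 * t)        # crude stand-in for a low-frequency tonal call

# Build a noisy 10-minute time series containing two embedded "calls"
rng = np.random.default_rng(0)
series = rng.normal(0, 1, int(600 * fs))
for start in (120.0, 400.0):                           # call start times in seconds
    i = int(start * fs)
    series[i:i + template.size] += 0.5 * template      # weak calls buried in noise

# Normalized matched-filter output: correlate and scale by template energy
mf = correlate(series, template, mode="valid") / np.sqrt(np.sum(template ** 2))
threshold = 5.0 * np.std(mf)                           # ad hoc detection threshold
detections = np.flatnonzero(mf > threshold) / fs
print("candidate call onsets (s):", detections[:5])
```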

  4. The development of a super-fine-grained nuclear emulsion

    NASA Astrophysics Data System (ADS)

    Asada, Takashi; Naka, Tatsuhiro; Kuwabara, Ken-ichi; Yoshimoto, Masahiro

    2017-06-01

    A nuclear emulsion with micronized crystals is required for detecting the submicron tracks of ionizing particles, one of the goals of dark-matter detection and other techniques. We found that a new production method, called the PVA-gelatin mixing method (PGMM), can effectively control crystal size from 20 nm to 50 nm. We call the two types of emulsion produced with this method the nano imaging tracker and the ultra-nano imaging tracker. Their composition and spatial resolution were measured, and the results indicate that these emulsions can detect extremely short tracks.

  5. Observations and Bayesian location methodology of transient acoustic signals (likely blue whales) in the Indian Ocean, using a hydrophone triplet.

    PubMed

    Le Bras, Ronan J; Kuzma, Heidi; Sucic, Victor; Bokelmann, Götz

    2016-05-01

    A notable sequence of calls was encountered, spanning several days in January 2003, in the central part of the Indian Ocean on a hydrophone triplet recording acoustic data at a 250 Hz sampling rate. This paper presents signal processing methods applied to the waveform data to detect and group the recorded signals and to extract amplitude and bearing estimates for them. An approximate location for the source of the call sequence is inferred from the features extracted from the waveforms. As the source approaches the hydrophone triplet, the source level (SL) of the calls is estimated at 187 ± 6 dB re 1 μPa at 1 m in the 15-60 Hz frequency range. The calls are attributed to a subgroup of blue whales, Balaenoptera musculus, with a characteristic acoustic signature. A Bayesian location method using probabilistic models for bearing and amplitude is demonstrated on the call sequence. The method is applied to the case of detection at a single triad of hydrophones and results in a probability distribution map for the origin of the calls. It can be extended to detections at multiple triads and, because of the Bayesian formulation, additional modeling complexity can be built in as needed.
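
    A minimal sketch of this kind of Bayesian grid localization is given below, assuming a Gaussian bearing likelihood, spherical spreading for transmission loss, and a fixed source level; all numerical values and the propagation model are illustrative assumptions rather than the paper's models.

```python
import numpy as np

# Hydrophone triad at the origin; candidate source positions on a 200 x 200 km grid
x = np.linspace(-100e3, 100e3, 201)
y = np.linspace(-100e3, 100e3, 201)
X, Y = np.meshgrid(x, y)
r = np.hypot(X, Y) + 1.0                      # range to each grid node (m)

obs_bearing_deg = 40.0                         # measured bearing, illustrative
obs_rl_db = 120.0                              # measured received level (dB re 1 uPa)
sigma_bearing = 5.0                            # bearing std dev (deg), assumed
sigma_rl = 6.0                                 # received-level std dev (dB), assumed
source_level_db = 187.0                        # assumed source level (dB re 1 uPa at 1 m)

# Bearing likelihood: Gaussian in angular mismatch
grid_bearing = np.degrees(np.arctan2(X, Y))    # bearing clockwise from north (X east, Y north)
dtheta = (grid_bearing - obs_bearing_deg + 180) % 360 - 180
log_like = -0.5 * (dtheta / sigma_bearing) ** 2

# Amplitude likelihood: spherical spreading loss 20*log10(r) as a simple TL model
predicted_rl = source_level_db - 20.0 * np.log10(r)
log_like += -0.5 * ((obs_rl_db - predicted_rl) / sigma_rl) ** 2

posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()                   # normalized probability map over the grid
i, j = np.unravel_index(posterior.argmax(), posterior.shape)
print(f"MAP location: x = {x[j]/1e3:.1f} km, y = {y[i]/1e3:.1f} km")
```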

  6. Evaluation of copy number variation detection for a SNP array platform

    PubMed Central

    2014-01-01

    Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays using software packages that implement different calling algorithms. However, the performance of these packages is not well understood, making it difficult to select one or several of them for CNV detection on a SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array, including Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered the “gold standard”. The success rate, average stability rate, sensitivity, consistency and reproducibility of the four software packages were assessed against this “gold standard”. Specifically, we also compared the efficiency of detecting CNVs simultaneously with two, three or all of the software packages against that of a single package. Results In terms of the number of CNVs detected, Birdsuite called the most and GTC the fewest. Birdsuite and dChip showed obvious detection bias, and GTC appeared inferior because it detected the fewest CNVs. We then investigated the detection consistency between each software package and the other three: dChip showed the lowest consistency and GTC the highest. Compared with the CGH-based CNV calls, GTC called the most matching CNVs, with PennCNV-Affy second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency and Birdsuite the poorest. Conclusion PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses, so optimal calling may require multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calls. PMID:24555668

  7. An effective method on pornographic images realtime recognition

    NASA Astrophysics Data System (ADS)

    Wang, Baosong; Lv, Xueqiang; Wang, Tao; Wang, Chengrui

    2013-03-01

    In this paper, skin detection, texture filtering and face detection are used to extract features from an image library, and a decision tree algorithm is trained on these features to produce rules that classify unknown images. In experiments based on more than twenty thousand images, precision reached 76.21% when testing on 13,025 pornographic images, with an elapsed time of less than 0.2 s, suggesting the approach generalizes well. As part of this pipeline, a new skin detection model, called the irregular polygon region skin detection model and based on the YCbCr color space, is proposed; it lowers the false detection rate of skin detection. A new method called sequence region labeling on binary connected areas computes features of connected areas and is faster and needs less memory than recursive alternatives.

  8. Site specific passive acoustic detection and densities of humpback whale calls off the coast of California

    NASA Astrophysics Data System (ADS)

    Helble, Tyler Adam

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. Automated methods are needed to aid in the analyses of the recorded data. When a mammal vocalizes in the marine environment, the received signal is a filtered version of the original waveform emitted by the marine mammal. The waveform is reduced in amplitude and distorted due to propagation effects that are influenced by the bathymetry and environment. It is important to account for these effects to determine a site-specific probability of detection for marine mammal calls in a given study area. A knowledge of that probability function over a range of environmental and ocean noise conditions allows vocalization statistics from recordings of single, fixed, omnidirectional sensors to be compared across sensors and at the same sensor over time with less bias and uncertainty in the results than direct comparison of the raw statistics. This dissertation focuses on both the development of new tools needed to automatically detect humpback whale vocalizations from single-fixed omnidirectional sensors as well as the determination of the site-specific probability of detection for monitoring sites off the coast of California. Using these tools, detected humpback calls are "calibrated" for environmental properties using the site-specific probability of detection values, and presented as call densities (calls per square kilometer per time). A two-year monitoring effort using these calibrated call densities reveals important biological and ecological information on migrating humpback whales off the coast of California. Call density trends are compared between the monitoring sites and at the same monitoring site over time. Call densities also are compared to several natural and human-influenced variables including season, time of day, lunar illumination, and ocean noise. The results reveal substantial differences in call densities between the two sites which were not noticeable using uncorrected (raw) call counts. Additionally, a Lombard effect was observed for humpback whale vocalizations in response to increasing ocean noise. The results presented in this thesis develop techniques to accurately measure marine mammal abundances from passive acoustic sensors.

  9. Selected Aspects of the eCall Emergency Notification System

    NASA Astrophysics Data System (ADS)

    Kaminski, Tomasz; Nowacki, Gabriel; Mitraszewska, Izabella; Niezgoda, Michał; Kruszewski, Mikołaj; Kaminska, Ewa; Filipek, Przemysław

    2012-02-01

    The article describes problems associated with road collision detection for the purpose of automatic emergency calling. At the moment a collision is detected, the eCall device installed in the vehicle automatically contacts the Emergency Notification Centre and sends a set of essential information about the vehicle and the location of the accident. Information about airbag deployment is not used to activate the alarm, because connecting the eCall device to it might interfere with the vehicle's safety systems. It is therefore necessary to develop a collision detection method, similar to the one used in airbag systems, based on signals from the acceleration sensors.

  10. A Shellcode Detection Method Based on Full Native API Sequence and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Cheng, Yixuan; Fan, Wenqing; Huang, Wei; An, Jing

    2017-09-01

    Dynamically monitoring the behavior of a program is widely used to discriminate between benign programs and malware, and usually relies on dynamic characteristics such as the API call sequence or API call frequency. The key innovation of this paper is to consider the full Native API sequence and use a support vector machine to detect shellcode. A Markov chain is used to extract and digitize the Native API sequence features. Experimental results show that the proposed method achieves high accuracy with a low false detection rate.
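
    The following sketch illustrates the general idea of digitizing an API call sequence with a first-order Markov transition matrix and classifying it with a support vector machine. The API vocabulary, toy traces, and classifier settings are invented for illustration and are not the authors' pipeline or data.

```python
import numpy as np
from sklearn.svm import SVC

API = ["NtOpenFile", "NtReadFile", "NtWriteFile", "NtClose", "NtAllocateVirtualMemory"]
IDX = {name: i for i, name in enumerate(API)}

def markov_features(seq):
    """Flatten a first-order Markov transition matrix of the Native API sequence."""
    counts = np.zeros((len(API), len(API)))
    for a, b in zip(seq, seq[1:]):
        counts[IDX[a], IDX[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0                     # avoid division by zero for unused APIs
    return (counts / rows).ravel()            # row-normalized transition probabilities

# Toy training data: benign traces vs. shellcode-like traces (purely illustrative)
benign = [["NtOpenFile", "NtReadFile", "NtClose"] * 10,
          ["NtOpenFile", "NtWriteFile", "NtClose"] * 10]
malicious = [["NtAllocateVirtualMemory", "NtWriteFile", "NtAllocateVirtualMemory"] * 10,
             ["NtAllocateVirtualMemory", "NtOpenFile", "NtWriteFile"] * 10]

X = np.array([markov_features(s) for s in benign + malicious])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
test = ["NtAllocateVirtualMemory", "NtWriteFile", "NtAllocateVirtualMemory"] * 8
print("predicted label:", clf.predict([markov_features(test)])[0])   # 1 = shellcode-like
```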

  11. Automated detection and localization of bowhead whale sounds in the presence of seismic airgun surveys.

    PubMed

    Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael

    2012-05-01

    An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219 471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.

  12. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing

    PubMed Central

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.

    2015-01-01

    Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151

  13. Do you hear what I see? Vocalization relative to visual detection rates of Hawaiian hoary bats (Lasiurus cinereus semotus)

    USGS Publications Warehouse

    Gorresen, Paulo Marcos; Cryan, Paul; Montoya-Aiona, Kristina; Bonaccorso, Frank

    2017-01-01

    Bats vocalize during flight as part of the sensory modality called echolocation, but very little is known about whether flying bats consistently call. Occasional vocal silence during flight when bats approach prey or conspecifics has been documented for relatively few species and situations. Bats flying alone in clutter-free airspace are not known to forgo vocalization, yet prior observations suggested possible silent behavior in certain, unexpected situations. Determining when, why, and where silent behavior occurs in bats will help evaluate major assumptions of a primary monitoring method for bats used in ecological research, management, and conservation. In this study, we recorded flight activity of Hawaiian hoary bats (Lasiurus cinereus semotus) under seminatural conditions using both thermal video cameras and acoustic detectors. Simultaneous video and audio recordings from 20 nights of observation at 10 sites were analyzed for correspondence between detection methods, with a focus on video observations in three distance categories for which accompanying vocalizations were detected. Comparison of video and audio detections revealed that a high proportion of Hawaiian hoary bats “seen” on video were not simultaneously “heard.” On average, only about one in three visual detections within a night had an accompanying call detection, but this varied greatly among nights. Bats flying on curved flight paths and individuals nearer the cameras were more likely to be detected by both methods. Feeding and social calls were detected, but no clear pattern emerged from the small number of observations involving closely interacting bats. These results may indicate that flying Hawaiian hoary bats often forgo echolocation, or do not always vocalize in a way that is detectable with common sampling and monitoring methods. Possible reasons for the low correspondence between visual and acoustic detections range from methodological to biological and include a number of biases associated with the propagation and detection of sound, cryptic foraging strategies, or conspecific presence. Silent flight behavior may be more prevalent in echolocating bats than previously appreciated, has profound implications for ecological research, and deserves further characterization and study.

  14. Cetacean population density estimation from single fixed sensors using passive acoustics.

    PubMed

    Küsel, Elizabeth T; Mellinger, David K; Thomas, Len; Marques, Tiago A; Moretti, David; Ward, Jessica

    2011-06-01

    Passive acoustic methods are increasingly being used to estimate animal population density. Most density estimation methods are based on estimates of the probability of detecting calls as functions of distance. Typically these are obtained using receivers capable of localizing calls or from studies of tagged animals. However, both approaches are expensive to implement. The approach described here uses a Monte Carlo model to estimate the probability of detecting calls from single sensors. The passive sonar equation is used to predict signal-to-noise ratios (SNRs) of received clicks, which are then combined with a detector characterization that predicts probability of detection as a function of SNR. Input distributions for source level, beam pattern, and whale depth are obtained from the literature. Acoustic propagation modeling is used to estimate transmission loss. Other inputs for density estimation are call rate, obtained from the literature, and false positive rate, obtained from manual analysis of a data sample. The method is applied to estimate density of Blainville's beaked whales over a 6-day period around a single hydrophone located in the Tongue of the Ocean, Bahamas. Results are consistent with those from previous analyses, which use additional tag data. © 2011 Acoustical Society of America
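
    A minimal sketch of the single-sensor Monte Carlo idea follows: draw source levels and ranges from assumed distributions, apply the passive sonar equation with a simple spherical-spreading stand-in for the propagation model, and pass the resulting SNR through an assumed logistic detector curve. All distributions and parameters are illustrative, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed input distributions (illustrative, not the paper's values)
source_level = rng.normal(200.0, 5.0, n)             # click source level, dB re 1 uPa at 1 m
ranges = rng.uniform(100.0, 6000.0, n)               # range to the hydrophone (m)
noise_level = 70.0                                    # band noise level (dB)

# Passive sonar equation with spherical spreading in place of a full propagation model
transmission_loss = 20.0 * np.log10(ranges)
snr = source_level - transmission_loss - noise_level

def p_detect(snr_db):
    """Assumed detector characterization: logistic curve centered at 10 dB SNR."""
    return 1.0 / (1.0 + np.exp(-(snr_db - 10.0) / 2.0))

detected = rng.random(n) < p_detect(snr)

# Average probability of detecting a click as a function of range
bins = np.linspace(0, 6000, 13)
which = np.digitize(ranges, bins)
for k in range(1, len(bins)):
    sel = which == k
    if sel.any():
        print(f"{bins[k-1]:5.0f}-{bins[k]:5.0f} m : P(det) = {detected[sel].mean():.2f}")
```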

  15. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing.

    PubMed

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J; Szatkiewicz, Jin P

    2015-08-18

    Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
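
    The sketch below illustrates one way contamination can be folded into genotype likelihoods at a single biallelic site: a fraction alpha of reads is assumed to come from a contaminating individual whose genotype follows Hardy-Weinberg proportions at the population allele frequency. The error model and mixing scheme are simplified assumptions, not the authors' full calling model.

```python
import numpy as np

def _hwe_priors(q):
    """Hardy-Weinberg genotype priors for alt-allele frequency q."""
    return ((1 - q) ** 2, 2 * q * (1 - q), q ** 2)

def genotype_log_likelihoods(ref_reads, alt_reads, alpha, pop_alt_freq, err=0.01):
    """Log-likelihood of each true genotype (0, 1, 2 alt alleles) for the target sample,
    assuming a fraction `alpha` of reads comes from a contaminating individual drawn
    from a population with alt-allele frequency `pop_alt_freq`. Binomial coefficients
    are omitted because they are constant across genotypes."""
    loglik = {}
    for g in (0, 1, 2):                        # target-sample genotype
        p_alt_self = g / 2.0
        ll = 0.0
        # Marginalize over the contaminant's (unknown) genotype gc
        for gc, prior in zip((0, 1, 2), _hwe_priors(pop_alt_freq)):
            p_alt = (1 - alpha) * p_alt_self + alpha * (gc / 2.0)
            p_alt = p_alt * (1 - err) + (1 - p_alt) * err      # sequencing error
            ll += prior * (p_alt ** alt_reads) * ((1 - p_alt) ** ref_reads)
        loglik[g] = np.log(ll)
    return loglik

# A heterozygous-looking site read with 10% assumed contamination
lls = genotype_log_likelihoods(ref_reads=18, alt_reads=12, alpha=0.10, pop_alt_freq=0.3)
print("most likely genotype:", max(lls, key=lls.get), lls)
```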

  17. Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D

    2013-09-01

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed by comparing calling statistics amassed from 24,690 humpback units recorded in the month of October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.

  18. Mobile phones improve case detection and management of malaria in rural Bangladesh

    PubMed Central

    2013-01-01

    Background The recent introduction of mobile phones into the rural Bandarban district of Bangladesh provided a resource to improve case detection and treatment of patients with malaria. Methods During studies to define the epidemiology of malaria in villages in south-eastern Bangladesh, an area with hypoendemic malaria, the project recorded 986 mobile phone calls from families reporting illness suspected to be malaria between June 2010 and June 2012. Results Based on these phone calls, field workers visited the homes of the ill persons and collected blood samples for malaria testing from 1,046 people. 265 (25%) of the patients tested were positive for malaria. Of the 509 symptomatic malaria cases diagnosed during this study period, 265 (52%) were detected because of an initial mobile phone call. Conclusion Mobile phone technology was found to be an efficient and effective method for rapidly detecting and treating patients with malaria in this remote area. This technology, when combined with local knowledge and field support, may be applicable to other hard-to-reach areas to improve malaria control. PMID:23374585

  19. Target Detection and Classification Using Seismic and PIR Sensors

    DTIC Science & Technology

    2012-06-01

    time series analysis via wavelet-based partitioning,” Signal Process...regard, this paper presents a wavelet-based method for target detection and classification. The proposed method has been validated on data sets of...The work reported in this paper makes use of a wavelet-based feature extraction method, called Symbolic Dynamic Filtering (SDF) [12]–[14]. The

  20. Acoustic signal detection of manatee calls

    NASA Astrophysics Data System (ADS)

    Niezrecki, Christopher; Phillips, Richard; Meyer, Michael; Beusse, Diedrich O.

    2003-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of a growing number of collisions with boats. A system that signals to boaters that manatees are present in the immediate vicinity could potentially reduce these collisions. Acoustic methods are employed to identify the presence of manatees. In this paper, three different detection algorithms are used to detect the calls of the West Indian manatee. The detection systems are tested in the laboratory using simulated manatee vocalizations from an audio compact disc. The detection method that provides the best overall performance correctly identifies approximately 96% of the manatee vocalizations, but it also produces a false positive rate of approximately 16%. The results of this work may ultimately lead to the development of a warning system that alerts boaters to the presence of manatees.

  1. Modeling seasonal detection patterns for burrowing owl surveys

    Treesearch

    Quresh S. Latif; Kathleen D. Fleming; Cameron Barrows; John T. Rotenberry

    2012-01-01

    To guide monitoring of burrowing owls (Athene cunicularia) in the Coachella Valley, California, USA, we analyzed survey-method-specific seasonal variation in detectability. Point-based call-broadcast surveys yielded high early season detectability that then declined through time, whereas detectability on driving surveys increased through the season. Point surveys...

  2. Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.

    PubMed

    Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M

    2014-04-21

    Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.

  3. Profiler-2000: Attacking the Insider Threat

    DTIC Science & Technology

    2005-09-01

    detection approach and its incorporation into a number of current automated intrusion-detection strategies (e.g., AT&T’s ComputerWatch, SRI’s Emerald ...administrative privileges, to be activated upon his or her next login. The system calls required to implement this method are chmod and exit. These two calls...kinds of information that can be derived from these (and other) logs are: time of login, physical location of login, duration of user session

  4. THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES

    PubMed Central

    Song, Chi; Min, Xiaoyi; Zhang, Heping

    2016-01-01

    The chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, which may be associated with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and disease. CNVs can be called by detecting change-points in the mean of sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of available CNV calling methods are single-sample based. Only a few multiple-sample methods have been proposed, using scan statistics that are computationally intensive and designed for detecting either common or rare change-points. In this paper, we propose a novel multiple-sample method that adaptively combines the scan statistic of the screening and ranking algorithm (SaRa); it is computationally efficient and able to detect both common and rare change-points. We prove that asymptotically this method finds the true change-points with almost certainty and show in theory that multiple-sample methods are superior to single-sample methods when shared change-points are of interest. Additionally, we report extensive simulation studies examining the performance of the proposed method. Finally, using our proposed method as well as two competing approaches, we attempt to detect CNVs in data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information while its ability to detect CNVs is comparable or better. PMID:28090239
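
    The core SaRa screening idea can be sketched as follows: compute a local diagnostic (difference of means in flanking windows) at every position, keep local maxima, and rank them. The window size, ranking cutoff, and simulated data below are illustrative, and the multiple-sample combination step proposed in the paper is not shown.

```python
import numpy as np

def sara_screen(y, h=10, top_k=5):
    """Screening step of the SaRa idea: local diagnostic D(i) = mean of the h points
    after position i minus the mean of the h points before it, then keep local maxima
    of |D| and rank them. Window size h and top_k are illustrative choices."""
    n = len(y)
    D = np.zeros(n)
    for i in range(h, n - h):
        D[i] = y[i:i + h].mean() - y[i - h:i].mean()
    score = np.abs(D)
    # Keep positions that are local maxima of the diagnostic within a 2h neighborhood
    candidates = [i for i in range(h, n - h)
                  if score[i] == score[max(0, i - h):i + h].max() and score[i] > 0]
    return sorted(candidates, key=lambda i: -score[i])[:top_k]

# Simulated intensity sequence with change-points at positions 100 and 250
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 0.3, 100),
                    rng.normal(0.8, 0.3, 150),
                    rng.normal(0.0, 0.3, 150)])
print("ranked change-point candidates:", sara_screen(y))
```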

  5. Automated surveillance of 911 call data for detection of possible water contamination incidents

    PubMed Central

    2011-01-01

    Background Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, detecting illness from fast-acting chemicals has received less emphasis. Methods An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5 year period were studied. Results During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls were placed, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers, and the average alarm contained nine calls. Conclusions The 911 surveillance system provides timely notification of possible public health events but has limitations. While the alarms contained incident codes and the location of the caller, additional information such as medical status was not available to help validate the cause of an alarm. Furthermore, users indicated that a better understanding of 911 system functionality is needed to anticipate how it would behave in an actual water contamination event. PMID:21450105
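
    A simplified space-time scan of the kind described above can be sketched by evaluating a Poisson likelihood ratio over cylinders (a circular area crossed with a recent time window) and reporting the most anomalous one; in practice significance would be assessed by Monte Carlo replication. The simulated call data, cylinder sizes, and uniform baseline are assumptions for illustration, not the Cincinnati system's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated health-related 911 calls: (x km, y km, day), with an injected late cluster
n_days = 60
calls = np.column_stack([rng.uniform(0, 20, 1200),
                         rng.uniform(0, 20, 1200),
                         rng.integers(0, n_days, 1200)])
cluster = np.column_stack([rng.normal(5, 0.5, 40),
                           rng.normal(5, 0.5, 40),
                           rng.integers(57, 60, 40)])
calls = np.vstack([calls, cluster])
N = len(calls)

def poisson_llr(c, e):
    """Kulldorff-style log likelihood ratio for c observed vs. e expected calls."""
    if c <= e:
        return 0.0
    return c * np.log(c / e) + (N - c) * np.log((N - c) / (N - e))

# Scan cylinders: circles centered on call locations crossed with recent time windows
best_llr, best_cyl = 0.0, None
for cx, cy, _ in calls:
    d2 = (calls[:, 0] - cx) ** 2 + (calls[:, 1] - cy) ** 2
    for radius in (1.0, 2.0):
        for days_back in (1, 3, 7):
            inside = (d2 <= radius ** 2) & (calls[:, 2] >= n_days - days_back)
            c = int(inside.sum())
            # Expected count if calls were uniform over the 20 x 20 km area and 60 days
            e = N * (np.pi * radius ** 2 / 400.0) * (days_back / n_days)
            llr = poisson_llr(c, e)
            if llr > best_llr:
                best_llr, best_cyl = llr, (round(cx, 1), round(cy, 1), radius, days_back, c)

print("strongest space-time cluster:", best_cyl, "LLR =", round(best_llr, 1))
```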

  6. Vecuum: identification and filtration of false somatic variants caused by recombinant vector contamination.

    PubMed

    Kim, Junho; Maeng, Ju Heon; Lim, Jae Seok; Son, Hyeonju; Lee, Junehawk; Lee, Jeong Ho; Kim, Sangwoo

    2016-10-15

    Advances in sequencing technologies have remarkably lowered the detection limit of somatic variants to a low frequency. However, calling mutations at this range is still confounded by many factors including environmental contamination. Vector contamination is a continuously occurring issue and is especially problematic since vector inserts are hardly distinguishable from the sample sequences. Such inserts, which may harbor polymorphisms and engineered functional mutations, can result in calling false variants at corresponding sites. Numerous vector-screening methods have been developed, but none could handle contamination from inserts because they are focusing on vector backbone sequences alone. We developed a novel method, Vecuum, that identifies vector-originated reads and resultant false variants. Since vector inserts are generally constructed from intron-less cDNAs, Vecuum identifies vector-originated reads by inspecting the clipping patterns at exon junctions. False variant calls are further detected based on the biased distribution of mutant alleles to vector-originated reads. Tests on simulated and spike-in experimental data validated that Vecuum could detect 93% of vector contaminants and could remove up to 87% of variant-like false calls with 100% precision. Application to public sequence datasets demonstrated the utility of Vecuum in detecting false variants resulting from various types of external contamination. Java-based implementation of the method is available at http://vecuum.sourceforge.net/. CONTACT: swkim@yuhs.ac. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. A BAC clone fingerprinting approach to the detection of human genome rearrangements

    PubMed Central

    Krzywinski, Martin; Bosdet, Ian; Mathewson, Carrie; Wye, Natasja; Brebner, Jay; Chiu, Readman; Corbett, Richard; Field, Matthew; Lee, Darlene; Pugh, Trevor; Volik, Stas; Siddiqui, Asim; Jones, Steven; Schein, Jacquie; Collins, Collin; Marra, Marco

    2007-01-01

    We present a method, called fingerprint profiling (FPP), that uses restriction digest fingerprints of bacterial artificial chromosome clones to detect and classify rearrangements in the human genome. The approach uses alignment of experimental fingerprint patterns to in silico digests of the sequence assembly and is capable of detecting micro-deletions (1-5 kb) and balanced rearrangements. Our method has compelling potential for use as a whole-genome method for the identification and characterization of human genome rearrangements. PMID:17953769

  8. Red-shouldered hawk broadcast surveys: Factors affecting detection of responses and population trends

    USGS Publications Warehouse

    McLeod, M.A.; Andersen, D.E.

    1998-01-01

    Forest-nesting raptors are often difficult to detect and monitor because they can be secretive, and their nests can be difficult to locate. Some species, however, respond to broadcasts of taped calls, and these responses may be useful both in monitoring population trends and in locating nests. We conducted broadcast surveys on roads and at active red-shouldered hawk (Buteo lineatus) nests in northcentral Minnesota to determine effects of type of call (conspecific or great horned owl [Bubo virginianus]), time of day, and phase of the breeding cycle on red-shouldered hawk response behavior and to evaluate the usefulness of broadcasts as a population monitoring tool using area-occupied and probability-of-detection techniques. During the breeding seasons of 1994 and 1995, we surveyed 4 10-station road transects 59 times and conducted 76 surveys at 24 active nests. Results of these surveys indicated conspecific calls broadcast prior to hatch and early in the day were the most effective method of detecting red-shouldered hawks. Probability of detection via conspecific calls averaged 0.25, and area occupied was 100%. Computer simulations using these field data indicated broadcast surveys have the potential to be used as a population monitoring tool.

  9. Red-shouldered hawk occupancy surveys in central Minnesota, USA

    USGS Publications Warehouse

    Henneman, C.; McLeod, M.A.; Andersen, D.E.

    2007-01-01

    Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
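
    A minimal sketch of the single-season occupancy likelihood underlying such presence-absence analyses is shown below, with constant occupancy (psi) and detection probability (p) fitted by maximum likelihood from detection histories; covariates, within-season time effects, and the multi-year structure used in the study are omitted, and the example histories are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Detection histories: rows = survey sites, columns = repeat visits (1 = hawk detected)
histories = np.array([
    [1, 0, 1, 0],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
])

def neg_log_lik(params):
    """Single-season occupancy model with constant psi (occupancy) and p (detection)."""
    psi, p = expit(params)                     # keep both probabilities in (0, 1)
    d = histories.sum(axis=1)                  # detections per site
    k = histories.shape[1]                     # visits per site
    lik_occupied = psi * p ** d * (1 - p) ** (k - d)
    lik_empty = np.where(d == 0, 1 - psi, 0.0) # all-zero histories can also mean unoccupied
    return -np.sum(np.log(lik_occupied + lik_empty))

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"estimated occupancy psi = {psi_hat:.2f}, detection probability p = {p_hat:.2f}")
```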

  10. Seasonal and Diel Vocalization Patterns of Antarctic Blue Whale (Balaenoptera musculus intermedia) in the Southern Indian Ocean: A Multi-Year and Multi-Site Study.

    PubMed

    Leroy, Emmanuelle C; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves

    2016-01-01

    Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology.

  11. Seasonal and Diel Vocalization Patterns of Antarctic Blue Whale (Balaenoptera musculus intermedia) in the Southern Indian Ocean: A Multi-Year and Multi-Site Study

    PubMed Central

    Leroy, Emmanuelle C.; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves

    2016-01-01

    Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology. PMID:27828976

  12. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    Learning the behavior of a given program with machine-learning techniques (based on system-call audit data) is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are some of the representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. To apply them to real-world problems successfully, it is important to determine the structures and weights of the neural networks. However, finding appropriate structures takes a very long time because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of ENNs is that superior neural networks can be obtained in less time than with conventional approaches, because the structures and weights are discovered simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.

  13. MPAI (mass probes aided ionization) method for total analysis of biomolecules by mass spectrometry.

    PubMed

    Honda, Aki; Hayashi, Shinichiro; Hifumi, Hiroki; Honma, Yuya; Tanji, Noriyuki; Iwasawa, Naoko; Suzuki, Yoshio; Suzuki, Koji

    2007-01-01

    We have designed and synthesized various mass probes, which enable us to effectively ionize various molecules to be detected with mass spectrometry. We call the ionization method using mass probes the "MPAI (mass probes aided ionization)" method. We aim at the sensitive detection of various biological molecules, and also the detection of bio-molecules by a single mass spectrometry serially without changing the mechanical settings. Here, we review mass probes for small molecules with various functional groups and mass probes for proteins. Further, we introduce newly developed mass probes for proteins for highly sensitive detection.

  14. Estimating the location of baleen whale calls using dual streamers to support mitigation procedures in seismic reflection surveys.

    PubMed

    Abadi, Shima H; Tolstoy, Maya; Wilcock, William S D

    2017-01-01

    In order to mitigate against possible impacts of seismic surveys on baleen whales it is important to know as much as possible about the presence of whales within the vicinity of seismic operations. This study expands on previous work that analyzes single seismic streamer data to locate nearby calling baleen whales with a grid search method that utilizes the propagation angles and relative arrival times of received signals along the streamer. Three dimensional seismic reflection surveys use multiple towed hydrophone arrays for imaging the structure beneath the seafloor, providing an opportunity to significantly improve the uncertainty associated with streamer-generated call locations. All seismic surveys utilizing airguns conduct visual marine mammal monitoring surveys concurrent with the experiment, with powering-down of seismic source if a marine mammal is observed within the exposure zone. This study utilizes data from power-down periods of a seismic experiment conducted with two 8-km long seismic hydrophone arrays by the R/V Marcus G. Langseth near Alaska in summer 2011. Simulated and experiment data demonstrate that a single streamer can be utilized to resolve left-right ambiguity because the streamer is rarely perfectly straight in a field setting, but dual streamers provides significantly improved locations. Both methods represent a dramatic improvement over the existing Passive Acoustic Monitoring (PAM) system for detecting low frequency baleen whale calls, with ~60 calls detected utilizing the seismic streamers, zero of which were detected using the current R/V Langseth PAM system. Furthermore, this method has the potential to be utilized not only for improving mitigation processes, but also for studying baleen whale behavior within the vicinity of seismic operations.

  15. Estimating the location of baleen whale calls using dual streamers to support mitigation procedures in seismic reflection surveys

    PubMed Central

    Abadi, Shima H.; Tolstoy, Maya; Wilcock, William S. D.

    2017-01-01

    In order to mitigate against possible impacts of seismic surveys on baleen whales it is important to know as much as possible about the presence of whales within the vicinity of seismic operations. This study expands on previous work that analyzes single seismic streamer data to locate nearby calling baleen whales with a grid search method that utilizes the propagation angles and relative arrival times of received signals along the streamer. Three dimensional seismic reflection surveys use multiple towed hydrophone arrays for imaging the structure beneath the seafloor, providing an opportunity to significantly improve the uncertainty associated with streamer-generated call locations. All seismic surveys utilizing airguns conduct visual marine mammal monitoring surveys concurrent with the experiment, with powering-down of seismic source if a marine mammal is observed within the exposure zone. This study utilizes data from power-down periods of a seismic experiment conducted with two 8-km long seismic hydrophone arrays by the R/V Marcus G. Langseth near Alaska in summer 2011. Simulated and experiment data demonstrate that a single streamer can be utilized to resolve left-right ambiguity because the streamer is rarely perfectly straight in a field setting, but dual streamers provides significantly improved locations. Both methods represent a dramatic improvement over the existing Passive Acoustic Monitoring (PAM) system for detecting low frequency baleen whale calls, with ~60 calls detected utilizing the seismic streamers, zero of which were detected using the current R/V Langseth PAM system. Furthermore, this method has the potential to be utilized not only for improving mitigation processes, but also for studying baleen whale behavior within the vicinity of seismic operations. PMID:28199400

  16. Detecting the Edge of the Tongue: A Tutorial

    ERIC Educational Resources Information Center

    Iskarous, Khalil

    2005-01-01

    The goal of this paper is to provide a tutorial introduction to the topic of edge detection of the tongue from ultrasound scans for researchers in speech science and phonetics. The method introduced here is Active Contours (also called snakes), a method for searching for an edge, assuming that it is a smooth curve in the image data. The advantage…

  17. TreeShrink: fast and accurate detection of outlier long branches in collections of phylogenetic trees.

    PubMed

    Mai, Uyen; Mirarab, Siavash

    2018-05-08

    Sequence data used in reconstructing phylogenetic trees may include various sources of error. Typically errors are detected at the sequence level, but when missed, the erroneous sequences often appear as unexpectedly long branches in the inferred phylogeny. We propose an automatic method to detect such errors. We build a phylogeny including all the data then detect sequences that artificially inflate the tree diameter. We formulate an optimization problem, called the k-shrink problem, that seeks to find k leaves that could be removed to maximally reduce the tree diameter. We present an algorithm to find the exact solution for this problem in polynomial time. We then use several statistical tests to find outlier species that have an unexpectedly high impact on the tree diameter. These tests can use a single tree or a set of related gene trees and can also adjust to species-specific patterns of branch length. The resulting method is called TreeShrink. We test our method on six phylogenomic biological datasets and an HIV dataset and show that the method successfully detects and removes long branches. TreeShrink removes sequences more conservatively than rogue taxon removal and often reduces gene tree discordance more than rogue taxon removal once the amount of filtering is controlled. TreeShrink is an effective method for detecting sequences that lead to unrealistically long branch lengths in phylogenetic trees. The tool is publicly available at https://github.com/uym2/TreeShrink .
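
    The paper gives an exact polynomial-time algorithm for the k-shrink problem together with statistical outlier tests; the sketch below is only a greedy illustration of the underlying idea (remove the leaf whose deletion most reduces the tree diameter), operating on an assumed pairwise leaf-distance table rather than TreeShrink's tree data structures.

```python
def greedy_shrink(dist, leaves, k):
    """Greedy illustration of the k-shrink idea: repeatedly drop the leaf whose removal
    most reduces the tree diameter (maximum pairwise path length). `dist` is a dict of
    pairwise leaf-to-leaf path lengths, keyed by (leaf_a, leaf_b) in both orders."""
    remaining = list(leaves)
    removed = []
    for _ in range(k):
        def diameter(subset):
            return max(dist[a, b] for a in subset for b in subset if a != b)
        best = min(remaining,
                   key=lambda leaf: diameter([x for x in remaining if x != leaf]))
        remaining.remove(best)
        removed.append(best)
    return removed, remaining

# Toy example: leaf D sits on an unexpectedly long branch
leaves = ["A", "B", "C", "D"]
raw = {("A", "B"): 0.2, ("A", "C"): 0.3, ("A", "D"): 1.4,
       ("B", "C"): 0.25, ("B", "D"): 1.5, ("C", "D"): 1.45}
dist = {**raw, **{(b, a): v for (a, b), v in raw.items()}}

removed, kept = greedy_shrink(dist, leaves, k=1)
print("removed outlier leaves:", removed)       # expect ['D']
```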

  18. RAPTR-SV: a hybrid method for the detection of structural variants

    USDA-ARS?s Scientific Manuscript database

    Motivation: Identification of Structural Variants (SV) in sequence data results in a large number of false positive calls using existing software, which overburdens subsequent validation. Results: Simulations using RAPTR-SV and another software package that uses a similar algorithm for SV detection...

  19. [Sensitivity and specificity of the breast screening program in the Isere region based on positive results between 1991 and 1999].

    PubMed

    Garnier, A; Poncet, F; Billette De Villemeur, A; Exbrayat, C; Bon, M F; Chevalier, A; Salicru, B; Tournegros, J M

    2009-06-01

    The screening program guidelines specify that the call-back rate of women for additional imaging (positive mammogram) should not exceed 7% at initial screening and 5% at subsequent screening. The higher call-back rate observed in the Isere region (12%) prompted a review of the correlation between the call-back rate and quality indicators (detection rate, sensitivity, specificity, positive predictive value) for the radiologists providing interpretations during that period. Three groups of radiologists were identified: the group with a call-back rate of 10% achieved the best results (sensitivity: 92%, detection rate: 0.53%, specificity: 90%); the group with the lowest call-back rate (7.7%) showed insufficient sensitivity (58%); and the group with a call-back rate of 18.3% showed no improvement in sensitivity (82%) or detection rate (0.53%) but reduced specificity (82%). The protocol update in 2001 does not resolve this problematic situation, and national results continue to show a high percentage of positive screening mammograms. A number of positive screening examinations significantly above the recommended guidelines is not advantageous and leads to an overall decrease in the quality of screening.

  20. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, are what these NDE methods are intended to detect, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to achieve an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
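
    The binomial arithmetic behind such a point-estimate demonstration can be sketched as below: compute the probability of passing an all-hits 29-flaw demonstration as a function of the true POD, plus the chance of at least one false call over an assumed number of unflawed inspection opportunities. The pass criterion, flaw-set size, and false-call rate are illustrative of the classic 29-of-29 design, not NASA's specific procedure.

```python
from math import comb

def prob_pass_demo(pod, n=29, max_misses=0):
    """Probability that a flaw set of size n passes the demonstration when the true
    probability of detection is `pod` and at most `max_misses` misses are allowed."""
    return sum(comb(n, m) * (1 - pod) ** m * pod ** (n - m) for m in range(max_misses + 1))

def prob_any_false_call(pfc_per_opportunity, n_unflawed):
    """Probability of at least one false call over n_unflawed clean inspection sites."""
    return 1.0 - (1.0 - pfc_per_opportunity) ** n_unflawed

for pod in (0.90, 0.95, 0.99):
    print(f"true POD = {pod:.2f} -> P(pass 29/29 demo) = {prob_pass_demo(pod):.3f}")

print("P(>=1 false call in 29 clean sites at 2% per-site rate):",
      round(prob_any_false_call(0.02, 29), 3))
```

    The first loop makes the design trade-off concrete: a method with a true POD of 0.90 passes an all-hits 29-flaw demonstration only about 5% of the time, which is why the probability of passing the demonstration must be optimized alongside the false-call risk.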

  1. Do breeding phase and detection distance influence the effective area surveyed for northern goshawks?

    USGS Publications Warehouse

    Roberson, A.M.; Andersen, D.E.; Kennedy, P.L.

    2005-01-01

    Broadcast surveys using conspecific calls are currently the most effective method for detecting northern goshawks (Accipiter gentilis) during the breeding season. These surveys typically use alarm calls during the nestling phase and juvenile food-begging calls during the fledgling-dependency phase. Because goshawks are most vocal during the courtship phase, we hypothesized that this phase would be an effective time to detect goshawks. Our objective was to improve current survey methodology by evaluating the probability of detecting goshawks at active nests in northern Minnesota in 3 breeding phases and at 4 broadcast distances and to determine the effective area surveyed per broadcast station. Unlike previous studies, we broadcast calls at only 1 distance per trial. This approach better quantifies (1) the relationship between distance and probability of detection, and (2) the effective area surveyed (EAS) per broadcast station. We conducted 99 broadcast trials at 14 active breeding areas. When pooled over all distances, detection rates were highest during the courtship (70%) and fledgling-dependency phases (68%). Detection rates were lowest during the nestling phase (28%), when there appeared to be higher variation in likelihood of detecting individuals. EAS per broadcast station was 39.8 ha during courtship and 24.8 ha during fledgling-dependency. Consequently, in northern Minnesota, broadcast stations may be spaced 712 m and 562 m apart when conducting systematic surveys during courtship and fledgling-dependency, respectively. We could not calculate EAS for the nestling phase because probability of detection was not a simple function of distance from nest. Calculation of EAS could be applied to other areas where the probability of detection is a known function of distance.
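
    The reported spacings follow from treating EAS as a circular area around each broadcast station, so that spacing equals twice the radius of that circle; the quick check below is an assumption about how the spacing was derived, though the numbers match the abstract.

        # Quick check of the reported station spacings, assuming the effective area
        # surveyed (EAS) is treated as a circle around each broadcast station:
        # spacing = 2 * radius = 2 * sqrt(EAS / pi).
        from math import pi, sqrt

        def spacing_from_eas(eas_ha):
            eas_m2 = eas_ha * 10_000          # hectares -> square metres
            return 2 * sqrt(eas_m2 / pi)

        print(round(spacing_from_eas(39.8)))  # ~712 m (courtship)
        print(round(spacing_from_eas(24.8)))  # ~562 m (fledgling-dependency)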

  2. GRIDSS: sensitive and specific genomic rearrangement detection using positional de Bruijn graph assembly

    PubMed Central

    Do, Hongdo; Molania, Ramyar

    2017-01-01

    The identification of genomic rearrangements with high sensitivity and specificity using massively parallel sequencing remains a major challenge, particularly in precision medicine and cancer research. Here, we describe a new method for detecting rearrangements, GRIDSS (Genome Rearrangement IDentification Software Suite). GRIDSS is a multithreaded structural variant (SV) caller that performs efficient genome-wide break-end assembly prior to variant calling using a novel positional de Bruijn graph-based assembler. By combining assembly, split read, and read pair evidence using probabilistic scoring, GRIDSS achieves high sensitivity and specificity on simulated, cell line, and patient tumor data, recently winning SV subchallenge #5 of the ICGC-TCGA DREAM8.5 Somatic Mutation Calling Challenge. On human cell line data, GRIDSS halves the false discovery rate compared to other recent methods while matching or exceeding their sensitivity. GRIDSS identifies nontemplate sequence insertions, microhomologies, and large imperfect homologies, estimates a quality score for each breakpoint, stratifies calls into high or low confidence, and supports multisample analysis. PMID:29097403

  3. Tracking employment shocks using mobile phone data

    PubMed Central

    Toole, Jameson L.; Lin, Yu-Ru; Muehlegger, Erich; Shoag, Daniel; González, Marta C.; Lazer, David

    2015-01-01

    Can data from mobile phones be used to observe economic shocks and their consequences at multiple scales? Here we present novel methods to detect mass layoffs, identify individuals affected by them and predict changes in aggregate unemployment rates using call detail records (CDRs) from mobile phones. Using the closure of a large manufacturing plant as a case study, we first describe a structural break model to correctly detect the date of a mass layoff and estimate its size. We then use a Bayesian classification model to identify affected individuals by observing changes in calling behaviour following the plant's closure. For these affected individuals, we observe significant declines in social behaviour and mobility following job loss. Using the features identified at the micro level, we show that the same changes in these calling behaviours, aggregated at the regional level, can improve forecasts of macro unemployment rates. These methods and results highlight the promise of new data resources to measure microeconomic behaviour and improve estimates of critical economic indicators. PMID:26018965
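
    As a minimal sketch of the structural-break idea (a least-squares changepoint search, not the model actually used in the paper), one can scan candidate break dates in a daily call-count series and keep the one that best separates the pre- and post-break means; the series below is illustrative.

        # Minimal least-squares changepoint search on a daily call-count series.
        # This only illustrates the idea of detecting a layoff date from a drop in
        # calling activity; it is not the structural break model used in the paper.
        import numpy as np

        def best_break(series):
            series = np.asarray(series, dtype=float)
            best_t, best_sse = None, np.inf
            for t in range(1, len(series)):            # candidate break points
                pre, post = series[:t], series[t:]
                sse = ((pre - pre.mean())**2).sum() + ((post - post.mean())**2).sum()
                if sse < best_sse:
                    best_t, best_sse = t, sse
            return best_t, best_sse

        calls = [52, 49, 55, 51, 50, 48, 23, 20, 22, 19, 21, 18]  # drop after day 6
        print(best_break(calls))   # -> (6, ...)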

  4. Using accelerometers to determine the calling behavior of tagged baleen whales.

    PubMed

    Goldbogen, J A; Stimpert, A K; DeRuiter, S L; Calambokidis, J; Friedlaender, A S; Schorr, G S; Moretti, D J; Tyack, P L; Southall, B L

    2014-07-15

    Low-frequency acoustic signals generated by baleen whales can propagate over vast distances, making the assignment of calls to specific individuals problematic. Here, we report the novel use of acoustic recording tags equipped with high-resolution accelerometers to detect vibrations from the surface of two tagged fin whales that directly match the timing of recorded acoustic signals. A tag deployed on a buoy in the vicinity of calling fin whales and a recording from a tag that had just fallen off a whale were able to detect calls acoustically but did not record corresponding accelerometer signals that were measured on calling individuals. Across the hundreds of calls measured on two tagged fin whales, the accelerometer response was generally anisotropic across all three axes, appeared to depend on tag placement and increased with the level of received sound. These data demonstrate that high-sample rate accelerometry can provide important insights into the acoustic behavior of baleen whales that communicate at low frequencies. This method helps identify vocalizing whales, which in turn enables the quantification of call rates, a fundamental component of models used to estimate baleen whale abundance and distribution from passive acoustic monitoring. © 2014. Published by The Company of Biologists Ltd.

  5. Building dynamic population graph for accurate correspondence detection.

    PubMed

    Du, Shaoyi; Guo, Yanrong; Sanroma, Gerard; Ni, Dong; Wu, Guorong; Shen, Dinggang

    2015-12-01

    In medical imaging studies, there is an increasing trend for discovering the intrinsic anatomical difference across individual subjects in a dataset, such as hand images for skeletal bone age estimation. Pair-wise matching is often used to detect correspondences between each individual subject and a pre-selected model image with manually-placed landmarks. However, the large anatomical variability across individual subjects can easily compromise such a pair-wise matching step. In this paper, we present a new framework to simultaneously detect correspondences among a population of individual subjects, by propagating all manually-placed landmarks from a small set of model images through a dynamically constructed image graph. Specifically, we first establish graph links between models and individual subjects according to pair-wise shape similarity (called the forward step). Next, we detect correspondences for the individual subjects with direct links to any of the model images, which is achieved by a new multi-model correspondence detection approach based on our recently-published sparse point matching method. To correct those inaccurate correspondences, we further apply an error detection mechanism to automatically detect wrong correspondences and then update the image graph accordingly (called the backward step). After that, all subject images with detected correspondences are included in the set of model images, and the above two steps of graph expansion and error correction are repeated until accurate correspondences for all subject images are established. Evaluations on real hand X-ray images demonstrate that our proposed method using a dynamic graph construction approach can achieve much higher accuracy and robustness, when compared with the state-of-the-art pair-wise correspondence detection methods as well as a similar method using a static population graph. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. ParticleCall: A particle filter for base calling in next-generation sequencing systems

    PubMed Central

    2012-01-01

    Background Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results In this paper, we consider Illumina’s sequencing-by-synthesis platform which relies on reversible terminator chemistry and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina’s Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions The proposed ParticleCall provides more accurate calls than the Illumina’s base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067

  7. Bat detective-Deep learning tools for bat acoustic signal detection.

    PubMed

    Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E

    2018-03-01

    Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.

  8. Contour detection improved by context-adaptive surround suppression.

    PubMed

    Sang, Qiang; Cai, Biao; Chen, Hao

    2017-01-01

    Recently, many image processing applications have taken advantage of a psychophysical and neurophysiological mechanism, called "surround suppression" to extract object contour from a natural scene. However, these traditional methods often adopt a single suppression model and a fixed input parameter called "inhibition level", which needs to be manually specified. To overcome these drawbacks, we propose a novel model, called "context-adaptive surround suppression", which can automatically control the effect of surround suppression according to image local contextual features measured by a surface estimator based on a local linear kernel. Moreover, a dynamic suppression method and its stopping mechanism are introduced to avoid manual intervention. The proposed algorithm is demonstrated and validated by a broad range of experimental results.
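
    A hedged sketch of the kind of fixed-parameter, isotropic surround suppression that serves as the baseline here (the adaptive, context-dependent control proposed in the paper is not reproduced): the raw contour response is inhibited by a blurred version of itself, scaled by a manually chosen inhibition level.

        # Hedged sketch of fixed-parameter isotropic surround suppression, the kind
        # of baseline the paper improves upon: a contour response is inhibited by a
        # Gaussian-weighted average of responses in its surround, scaled by a
        # manually chosen "inhibition level" alpha.  Requires numpy and scipy.
        import numpy as np
        from scipy.ndimage import gaussian_filter, sobel

        def suppressed_contours(image, sigma=2.0, surround_scale=4.0, alpha=1.0):
            gx = sobel(image.astype(float), axis=1)
            gy = sobel(image.astype(float), axis=0)
            magnitude = np.hypot(gx, gy)                      # raw contour response
            surround = gaussian_filter(magnitude, sigma * surround_scale)
            return np.clip(magnitude - alpha * surround, 0, None)

        img = np.random.rand(64, 64)                          # stand-in for a natural image
        print(suppressed_contours(img, alpha=1.0).shape)      # (64, 64)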

  9. Toxicology and detection methods of the alkaloid neurotoxin produced by cyanobacteria, anatoxin-a.

    PubMed

    Osswald, Joana; Rellán, Sandra; Gago, Ana; Vasconcelos, Vitor

    2007-11-01

    Freshwater resources are under stress due to naturally occurring conditions and human impacts. One of the consequences is the proliferation of cyanobacteria, microphytoplankton organisms that are capable of producing toxins called cyanotoxins. Anatoxin-a is one of the main cyanotoxins. It is a very potent neurotoxin that has already been responsible for some animal fatalities. In this review we summarize much of the internationally published information about the toxicology, occurrence, and detection methods of anatoxin-a. Cyanobacteria generalities, anatoxin-a occurrence and production, and anatoxin-a toxicology and its methods of detection are the aspects covered in this review. Remediation of anatoxin-a occurrence is addressed from a public health perspective. Final remarks call attention to some important gaps in the knowledge about this neurotoxin and its implications for public health. Alterations of aquatic ecosystems caused by anatoxin-a are also addressed. Although anatoxin-a is not the most frequent cyanotoxin worldwide, it has to be regarded as a health risk that can be fatal to terrestrial and aquatic organisms because of its high toxicity.

  10. Prevention of bacterial foodborne disease using nanobiotechnology.

    PubMed

    Billington, Craig; Hudson, J Andrew; D'Sa, Elaine

    2014-01-01

    Foodborne disease is an important source of expense, morbidity, and mortality for society. Detection and control constitute significant components of the overall management of foodborne bacterial pathogens, and this review focuses on the use of nanosized biological entities and molecules to achieve these goals. There is an emphasis on the use of organisms called bacteriophages (phages: viruses that infect bacteria), which are increasingly being used in pathogen detection and biocontrol applications. Detection of pathogens in foods by conventional techniques is time-consuming and expensive, although it can also be sensitive and accurate. Nanobiotechnology is being used to decrease detection times and cost through the development of biosensors, exploiting specific cell-recognition properties of antibodies and phage proteins. Although sensitivity per test can be excellent (eg, the detection of one cell), the very small volumes tested mean that sensitivity per sample is less compelling. An ideal detection method needs to be inexpensive, sensitive, and accurate, but no approach yet achieves all three. For nanobiotechnology to displace existing methods (culture-based, antibody-based rapid methods, or those that detect amplified nucleic acid) it will need to focus on improving sensitivity. Although manufactured nonbiological nanoparticles have been used to kill bacterial cells, nanosized organisms called phages are increasingly finding favor in food safety applications. Phages are amenable to protein and nucleic acid labeling, and can be very specific, and the typical large "burst size" resulting from phage amplification can be harnessed to produce a rapid increase in signal to facilitate detection. There are now several commercially available phages for pathogen control, and many reports in the literature demonstrate efficacy against a number of foodborne pathogens on diverse foods. As a method for control of pathogens, nanobiotechnology is therefore flourishing.

  11. Ricin toxicokinetics and its sensitive detection in mouse sera or feces using immuno-PCR

    USDA-ARS?s Scientific Manuscript database

    Ricin (also called RCA-II or RCA60), one of the most potent toxins and documented bioweapons, is derived from castor beans of Ricinus communis. Several in vitro methods have been designed for ricin detection in complex food matrices in the event of intentional contamination. Recently, a novel Immuno...

  12. Accuracy of a Screening Tool for Early Identification of Language Impairment

    ERIC Educational Resources Information Center

    Uilenburg, Noëlle; Wiefferink, Karin; Verkerk, Paul; van Denderen, Margot; van Schie, Carla; Oudesluys-Murphy, Ann-Marie

    2018-01-01

    Purpose: A screening tool called the "VTO Language Screening Instrument" (VTO-LSI) was developed to enable more uniform and earlier detection of language impairment. This report, consisting of 2 retrospective studies, focuses on the effects of using the VTO-LSI compared to regular detection procedures. Method: Study 1 retrospectively…

  13. Accurate Detection of Dysmorphic Nuclei Using Dynamic Programming and Supervised Classification.

    PubMed

    Verschuuren, Marlies; De Vylder, Jonas; Catrysse, Hannes; Robijns, Joke; Philips, Wilfried; De Vos, Winnok H

    2017-01-01

    A vast array of pathologies is typified by the presence of nuclei with an abnormal morphology. Dysmorphic nuclear phenotypes feature dramatic size changes or foldings, but also entail much subtler deviations such as nuclear protrusions called blebs. Due to their unpredictable size, shape and intensity, dysmorphic nuclei are often not accurately detected in standard image analysis routines. To enable accurate detection of dysmorphic nuclei in confocal and widefield fluorescence microscopy images, we have developed an automated segmentation algorithm, called Blebbed Nuclei Detector (BleND), which relies on two-pass thresholding for initial nuclear contour detection, and an optimal path finding algorithm, based on dynamic programming, for refining these contours. Using a robust error metric, we show that our method matches manual segmentation in terms of precision and outperforms state-of-the-art nuclear segmentation methods. Its high performance allowed for building and integrating a robust classifier that recognizes dysmorphic nuclei with an accuracy above 95%. The combined segmentation-classification routine is bound to facilitate nucleus-based diagnostics and enable real-time recognition of dysmorphic nuclei in intelligent microscopy workflows.

  14. Accurate Detection of Dysmorphic Nuclei Using Dynamic Programming and Supervised Classification

    PubMed Central

    Verschuuren, Marlies; De Vylder, Jonas; Catrysse, Hannes; Robijns, Joke; Philips, Wilfried

    2017-01-01

    A vast array of pathologies is typified by the presence of nuclei with an abnormal morphology. Dysmorphic nuclear phenotypes feature dramatic size changes or foldings, but also entail much subtler deviations such as nuclear protrusions called blebs. Due to their unpredictable size, shape and intensity, dysmorphic nuclei are often not accurately detected in standard image analysis routines. To enable accurate detection of dysmorphic nuclei in confocal and widefield fluorescence microscopy images, we have developed an automated segmentation algorithm, called Blebbed Nuclei Detector (BleND), which relies on two-pass thresholding for initial nuclear contour detection, and an optimal path finding algorithm, based on dynamic programming, for refining these contours. Using a robust error metric, we show that our method matches manual segmentation in terms of precision and outperforms state-of-the-art nuclear segmentation methods. Its high performance allowed for building and integrating a robust classifier that recognizes dysmorphic nuclei with an accuracy above 95%. The combined segmentation-classification routine is bound to facilitate nucleus-based diagnostics and enable real-time recognition of dysmorphic nuclei in intelligent microscopy workflows. PMID:28125723

  15. QuateXelero: An Accelerated Exact Network Motif Detection Algorithm

    PubMed Central

    Khakabimamaghani, Sahand; Sharafuddin, Iman; Dichter, Norbert; Koch, Ina; Masoudi-Nejad, Ali

    2013-01-01

    Finding motifs in biological, social, technological, and other types of networks has become a widespread method to gain more knowledge about these networks’ structure and function. However, this task is very computationally demanding, because it is closely related to the graph isomorphism problem, whose complexity status is unresolved (not known to belong to P or to be NP-complete). Accordingly, this research endeavors to decrease the need to call the NAUTY isomorphism detection method, which is the most time-consuming step in many existing algorithms. The work provides an extremely fast motif detection algorithm called QuateXelero, which has a quaternary tree data structure at its heart. The proposed algorithm is based on the well-known ESU (FANMOD) motif detection algorithm. The results of experiments on some standard model networks confirm the overall superiority of the proposed algorithm, QuateXelero, compared with two of the fastest existing algorithms, G-Tries and Kavosh. QuateXelero is especially fast in constructing the central data structure of the algorithm from scratch based on the input network. PMID:23874498

  16. Bat detective—Deep learning tools for bat acoustic signal detection

    PubMed Central

    Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.

    2018-01-01

    Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076

  17. Probabilistic double guarantee kidnapping detection in SLAM.

    PubMed

    Tian, Yang; Ma, Shugen

    2016-01-01

    To determine whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method was previously proposed. DGKD performs well in relatively small environments. However, our recent work has found a limitation of DGKD in large-scale environments. In order to increase the adaptability of DGKD in a large-scale environment, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, which combines the probabilities of feature positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.

  18. Mutual recognition of TNT using antibodies polymeric shell having CdS.

    PubMed

    Say, Ridvan; Büyüktiryaki, Sibel; Hür, Deniz; Yilmaz, Filiz; Ersöz, Arzu

    2012-02-15

    Click chemistry is the latest strategy called upon in the development of state-of-the-art bioconjugation approaches. In this study, we propose a covalent and photosensitive crosslinking conjugation of the antibody on nanostructures. For this purpose, quantum dots (QDs) have been applied through ruthenium-chelate-based amino acid monomer linkages, without affecting the conformation and function of the proteins. The amino acid monomer linkages, called ANADOLUCA (AmiNoAcid Decorated and Light Underpinning Conjugation Approach), give reusable, oriented, and cross-linked anti-2,4,6-trinitrotoluene (TNT) conjugated QDs for TNT detection. In this work, a new and simple method has been developed to design and prepare highly sensitive nanoconjugates for TNT determination. We demonstrate the use of luminescent QDs conjugated to the antibody for the specific detection of the explosive TNT in aqueous environments. The binding affinity of each nanoconjugate for TNT has also been investigated using Langmuir adsorption methods. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Modelling the effects of environmental conditions on the acoustic occurrence and behaviour of Antarctic blue whales

    PubMed Central

    Shabangu, Fannie W.; Yemane, Dawit; Stafford, Kathleen M.; Ensor, Paul; Findlay, Ken P.

    2017-01-01

    Antarctic blue whales Balaenoptera musculus intermedia were harvested to perilously low numbers by commercial whaling during the past century, and their large-scale response to environmental variability is poorly understood. This study uses acoustic data collected from 586 sonobuoys deployed in the austral summers of 1997 through 2009, south of 38°S, coupled with visual observations of blue whales during the IWC SOWER line-transect surveys. The characteristic Z-call and D-call of Antarctic blue whales were detected using an automated detection template and visual verification method. Using a random forest model, we characterized the environmental preference patterns, spatial occurrence and acoustic behaviour of Antarctic blue whales. Distance to the southern boundary of the Antarctic Circumpolar Current (SBACC), latitude and distance from the nearest Antarctic shores were the main geographic predictors of blue whale call occurrence. Satellite-derived sea surface height, sea surface temperature, and productivity (chlorophyll-a) were the most important environmental predictors of blue whale call occurrence. Call rates of D-calls were strongly predicted by the location of the SBACC, latitude and visually detected number of whales in an area, while call rates of Z-call were predicted by the SBACC, latitude and longitude. Satellite-derived sea surface height, wind stress, wind direction, water depth, sea surface temperatures, chlorophyll-a and wind speed were important environmental predictors of blue whale call rates in the Southern Ocean. Blue whale call occurrence and call rates varied significantly in response to inter-annual and long term variability of those environmental predictors. Our results identify the response of Antarctic blue whales to inter-annual variability in environmental conditions and highlight potential suitable habitats for this population. Such emerging knowledge about the acoustic behaviour, environmental and habitat preferences of Antarctic blue whales is important in improving the management and conservation of this highly depleted species. PMID:28222124

  20. Modelling the effects of environmental conditions on the acoustic occurrence and behaviour of Antarctic blue whales.

    PubMed

    Shabangu, Fannie W; Yemane, Dawit; Stafford, Kathleen M; Ensor, Paul; Findlay, Ken P

    2017-01-01

    Antarctic blue whales Balaenoptera musculus intermedia were harvested to perilously low numbers by commercial whaling during the past century, and their large-scale response to environmental variability is poorly understood. This study uses acoustic data collected from 586 sonobuoys deployed in the austral summers of 1997 through 2009, south of 38°S, coupled with visual observations of blue whales during the IWC SOWER line-transect surveys. The characteristic Z-call and D-call of Antarctic blue whales were detected using an automated detection template and visual verification method. Using a random forest model, we characterized the environmental preference patterns, spatial occurrence and acoustic behaviour of Antarctic blue whales. Distance to the southern boundary of the Antarctic Circumpolar Current (SBACC), latitude and distance from the nearest Antarctic shores were the main geographic predictors of blue whale call occurrence. Satellite-derived sea surface height, sea surface temperature, and productivity (chlorophyll-a) were the most important environmental predictors of blue whale call occurrence. Call rates of D-calls were strongly predicted by the location of the SBACC, latitude and visually detected number of whales in an area, while call rates of Z-call were predicted by the SBACC, latitude and longitude. Satellite-derived sea surface height, wind stress, wind direction, water depth, sea surface temperatures, chlorophyll-a and wind speed were important environmental predictors of blue whale call rates in the Southern Ocean. Blue whale call occurrence and call rates varied significantly in response to inter-annual and long term variability of those environmental predictors. Our results identify the response of Antarctic blue whales to inter-annual variability in environmental conditions and highlight potential suitable habitats for this population. Such emerging knowledge about the acoustic behaviour, environmental and habitat preferences of Antarctic blue whales is important in improving the management and conservation of this highly depleted species.

  1. Climate Verification Using Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A robust method previously used to detect observed intra- to multi-decadal (IMD) climate regimes was adapted to test whether climate models could reproduce IMD variations in U.S. surface temperatures during 1919-2008. This procedure, called the running Mann Whitney Z (MWZ) method, samples data ranki...

  2. Automated surveillance of 911 call data for detection of possible water contamination incidents.

    PubMed

    Haas, Adam J; Gibbons, Darcy; Dangel, Chrissy; Allgeier, Steve

    2011-03-30

    Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5-year period were studied. During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event.
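
    As a hedged, purely temporal simplification of the scan statistic idea (the deployed system scans over space and time), each sliding window of call counts can be scored with a Poisson likelihood ratio against a uniform baseline; window sizes and counts below are illustrative.

        # Hedged, purely temporal simplification of a scan statistic over 911 call
        # counts (the operational system scans over space and time).  Each sliding
        # window is scored with a Poisson likelihood ratio against a uniform baseline.
        import math

        def temporal_scan(counts, max_window=6):
            total, n = sum(counts), len(counts)
            best = (0.0, None)                      # (log-likelihood ratio, window)
            for w in range(1, max_window + 1):
                for start in range(0, n - w + 1):
                    obs = sum(counts[start:start + w])
                    exp = total * w / n             # expected under uniform baseline
                    if exp > 0 and obs > exp and total > obs:
                        llr = (obs * math.log(obs / exp)
                               + (total - obs) * math.log((total - obs) / (total - exp)))
                        if llr > best[0]:
                            best = (llr, (start, start + w))
            return best

        hourly_calls = [3, 2, 4, 3, 2, 3, 9, 11, 10, 3, 2, 3]   # spike mid-series
        print(temporal_scan(hourly_calls))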

  3. Improved Spectroscopy of Molecular Ions in the Mid-Infrared with Up-Conversion Detection

    NASA Astrophysics Data System (ADS)

    Markus, Charles R.; Perry, Adam J.; Hodges, James N.; McCall, Benjamin J.

    2016-06-01

    Heterodyne detection, velocity modulation, and cavity enhancement are useful tools for observing rovibrational transitions of important molecular ions. We have utilized these methods to investigate a number of molecular ions, such as H_3^+, CH_5^+, HeH^+, and OH^+. In the past, parasitic etalons and the lack of fast and sensitive detectors in the mid-infrared have limited the number of transitions we could measure with MHz-level precision. Recently, we have significantly reduced the amplitude of unwanted interference fringes with a Brewster-plate spoiler. We have also developed a detection scheme which up-converts the mid-infrared light with difference frequency generation, which allows the use of a faster and more sensitive avalanche photodetector. The higher detection bandwidth allows for optimized heterodyne detection at higher modulation frequencies. The overall gain in signal-to-noise from both improvements will enable extensive high-precision line lists of molecular ions and searches for previously unobserved transitions.

    References:
    K.N. Crabtree, J.N. Hodges, B.M. Siller, A.J. Perry, J.E. Kelly, P.A. Jenkins II, and B.J. McCall, Chem. Phys. Lett. 551 (2012) 1-6.
    A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall, J. Mol. Spec. 317 (2015) 71-73.
    J.N. Hodges, A.J. Perry, P.A. Jenkins II, B.M. Siller, and B.J. McCall, J. Chem. Phys. 139 (2013) 164291.
    A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall, J. Chem. Phys. 141 (2014) 101101.
    C.R. Markus, J.N. Hodges, A.J. Perry, G.S. Kocheril, H.S.P. Muller, and B.J. McCall, Astrophys. J. 817 (2016) 138.

  4. Construction of a combinatorial pipeline using two somatic variant  calling  methods  for whole exome sequence data of gastric cancer.

    PubMed

    Kohmoto, Tomohiro; Masuda, Kiyoshi; Naruto, Takuya; Tange, Shoichiro; Shoda, Katsutoshi; Hamada, Junichi; Saito, Masako; Ichikawa, Daisuke; Tajima, Atsushi; Otsuji, Eigo; Imoto, Issei

    2017-01-01

    High-throughput next-generation sequencing is a powerful tool to identify the genotypic landscapes of somatic variants and therapeutic targets in various cancers including gastric cancer, forming the basis for personalized medicine in the clinical setting. Although the advent of many computational algorithms leads to higher accuracy in somatic variant calling, no standard method exists due to the limitations of each method. Here, we constructed a new pipeline. We combined two different somatic variant callers with different algorithms, Strelka and VarScan 2, and evaluated performance using whole exome sequencing data obtained from 19 Japanese cases with gastric cancer (GC); then, we characterized these tumors based on identified driver molecular alterations. More single nucleotide variants (SNVs) and small insertions/deletions were detected by Strelka and VarScan 2, respectively. SNVs detected by both tools showed higher accuracy for estimating somatic variants compared with those detected by only one of the two tools and accurately showed the mutation signature and mutations of driver genes reported for GC. Our combinatorial pipeline may have an advantage in detection of somatic mutations in GC and may be useful for further genomic characterization of Japanese patients with GC to improve the efficacy of GC treatments. J. Med. Invest. 64: 233-240, August, 2017.
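
    A minimal sketch of the combinatorial step, keeping SNVs reported by both callers; the record keys and toy calls are illustrative assumptions, not the authors' pipeline or the callers' actual output formats.

        # Minimal sketch of combining two somatic variant call sets by intersection,
        # keeping SNVs reported by both callers (the paper's observation is that such
        # shared calls are more likely to be true somatic variants).  The call tuples
        # below are illustrative, not actual Strelka/VarScan 2 output.
        def to_key(call):
            return (call["chrom"], call["pos"], call["ref"], call["alt"])

        def intersect_calls(calls_a, calls_b):
            keys_b = {to_key(c) for c in calls_b}
            return [c for c in calls_a if to_key(c) in keys_b]

        strelka_like = [{"chrom": "chr17", "pos": 7577121, "ref": "G", "alt": "A"},
                        {"chrom": "chr3",  "pos": 178936091, "ref": "C", "alt": "T"}]
        varscan_like = [{"chrom": "chr17", "pos": 7577121, "ref": "G", "alt": "A"}]

        print(intersect_calls(strelka_like, varscan_like))   # one shared SNV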

  5. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis

    PubMed Central

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. A further issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim’ based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882
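
    A hedged sketch of one plausible reading of the two base metrics (the exact definitions should be checked against the original RDMA and DegSim literature): RDMA averages a user's deviation from each item's mean rating, down-weighted by the item's rating count, and DegSim averages the user's Pearson similarity with the k most similar users.

        # Hedged sketch of the two base metrics used for shilling-attack detection.
        # The exact definitions should be checked against the RDMA/DegSim literature;
        # this is an illustrative reading of them.
        import numpy as np

        def rdma(user_ratings, item_means, item_counts):
            """Average |rating - item mean| / item rating count over the user's items."""
            devs = [abs(r - item_means[i]) / item_counts[i] for i, r in user_ratings.items()]
            return sum(devs) / len(devs)

        def degsim(user_vec, other_vecs, k=2):
            """Mean Pearson correlation with the k most similar users."""
            sims = sorted((np.corrcoef(user_vec, v)[0, 1] for v in other_vecs), reverse=True)
            return float(np.mean(sims[:k]))

        item_means = {"i1": 3.2, "i2": 4.1}
        item_counts = {"i1": 120, "i2": 45}
        print(rdma({"i1": 5.0, "i2": 5.0}, item_means, item_counts))

        ratings_matrix = np.array([[5, 5, 1, 1], [4, 5, 2, 1], [1, 2, 5, 4]])
        print(degsim(ratings_matrix[0], ratings_matrix[1:], k=2))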

  6. Eye Tracking and Early Detection of Confusion in Digital Learning Environments: Proof of Concept

    ERIC Educational Resources Information Center

    Pachman, Mariya; Arguel, Amaël; Lockyer, Lori; Kennedy, Gregor; Lodge, Jason M.

    2016-01-01

    Research on incidence of and changes in confusion during complex learning and problem-solving calls for advanced methods of confusion detection in digital learning environments (DLEs). In this study we attempt to address this issue by investigating the use of multiple measures, including psychophysiological indicators and self-ratings, to detect…

  7. Speech Characteristics of Patients with Pallido-Ponto-Nigral Degeneration and Their Application to Presymptomatic Detection in At-Risk Relatives

    ERIC Educational Resources Information Center

    Liss, Julie M.; Krein-Jones, Kari; Wszolek, Zbigniew K.; Caviness, John N.

    2006-01-01

    Purpose: This report describes the speech characteristics of individuals with a neurodegenerative syndrome called pallido-ponto-nigral degeneration (PPND) and examines the speech samples of at-risk, but asymptomatic, relatives for possible preclinical detection. Method: Speech samples of 9 members of a PPND kindred were subjected to perceptual…

  8. Thermoelectric SQUID method for the detection of segregations

    NASA Astrophysics Data System (ADS)

    Hinken, Johann H.; Tavrin, Yury

    2000-05-01

    Aero engine turbine discs are highly critical parts. Material inhomogeneities can cause disc fractures during flight, with the potential for fatal air disasters. Nondestructive testing (NDT) of the discs at various machining steps is therefore necessary and is performed as thoroughly as possible. Conventional NDT methods, however, such as eddy current testing and ultrasonic testing, have unacceptable limits. For example, subsurface segregations often cannot be detected directly but only indirectly, once cracks have already developed from them; this may be too late. A new NDT method, which we call the Thermoelectric SQUID Method, has been developed. It allows for the detection of metallic inclusions within non-ferromagnetic metallic base material. This paper describes the results of a feasibility study on aero engine turbine discs made from Inconel® 718. These contained segregations that had previously been detected by anodic etching. With the Thermoelectric SQUID Method, these segregations were detected again, and further segregations below the surfaces were found that had not been detected before. For this new NDT method the disc material is quasi-transparent. The Thermoelectric SQUID Method is also useful for detecting distributed and localized inhomogeneities in pure metals, such as niobium sheets for particle accelerators.

  9. Varying face occlusion detection and iterative recovery for face recognition

    NASA Astrophysics Data System (ADS)

    Wang, Meng; Hu, Zhengping; Sun, Zhe; Zhao, Shuhuan; Sun, Mei

    2017-05-01

    In most sparse representation methods for face recognition (FR), occlusion problems are usually solved by removing the occluded part of both query samples and training samples before performing the recognition process. This practice ignores the global features of the facial image and may lead to unsatisfactory results due to the limitations of local features. Considering the aforementioned drawback, we propose a method called varying occlusion detection and iterative recovery for FR. The main contributions of our method are as follows: (1) to detect an accurate occlusion area of facial images, a combined image processing and intersection-based clustering method is used for occlusion FR; (2) according to the resulting occlusion map, new integrated facial images are recovered iteratively and put into the recognition process; and (3) the effect of our method on recognition accuracy is verified by comparing it with three typical occlusion map detection methods. Experiments show that the proposed method has highly accurate detection and recovery performance and that it outperforms several similar state-of-the-art methods against partial contiguous occlusion.

  10. Point counts from clustered populations: Lessons from an experiment with Hawaiian crows

    USGS Publications Warehouse

    Hayward, G.D.; Kepler, C.B.; Scott, J.M.

    1991-01-01

    We designed an experiment to identify factors contributing most to error in counts of Hawaiian Crow or Alala (Corvus hawaiiensis) groups that are detected aurally. Seven observers failed to detect calling Alala on 197 of 361 3-min point counts on four transects extending from cages with captive Alala. A detection curve describing the relation between frequency of flock detection and distance typified the distribution expected in transect or point counts. Failure to detect calling Alala was affected most by distance, observer, and Alala calling frequency. The number of individual Alala calling was not important in detection rate. Estimates of the number of Alala calling (flock size) were biased and imprecise: average difference between number of Alala calling and number heard was 3.24 (± 0.277). Distance, observer, number of Alala calling, and Alala calling frequency all contributed to errors in estimates of group size (P < 0.0001). Multiple regression suggested that number of Alala calling contributed most to errors. These results suggest that well-designed point counts may be used to estimate the number of Alala flocks but cast doubt on attempts to estimate flock size when individuals are counted aurally.

  11. Implementation of sobel method to detect the seed rubber plant leaves

    NASA Astrophysics Data System (ADS)

    Suyanto; Munte, J.

    2018-03-01

    This research was conducted to develop a system that can identify and recognize the type of rubber tree based on the pattern of the plant's leaves. The research steps start with image data acquisition, followed by image processing, image edge detection, and identification using template matching. Edge detection uses the Sobel operator. For pattern recognition, an input image is compared with other images in a database called templates. Experiments were carried out in one phase, identification of the leaf edge, using images of 14 superior rubber plant leaves and 5 test images for each type (clone) of the plant. The experimental results gave a recognition rate of 91.79%.
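
    Since the record names the Sobel operator, a minimal sketch of Sobel edge detection on a grayscale image follows; the kernels are the standard 3x3 Sobel masks, while the input array and threshold are illustrative stand-ins.

        # Minimal Sobel edge detection sketch on a grayscale image (2D numpy array).
        # The kernels are the standard 3x3 Sobel operators; the threshold is arbitrary.
        import numpy as np

        KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
        KY = KX.T

        def convolve2d(img, kernel):
            h, w = img.shape
            out = np.zeros_like(img, dtype=float)
            padded = np.pad(img, 1, mode="edge")
            for y in range(h):
                for x in range(w):
                    out[y, x] = np.sum(padded[y:y + 3, x:x + 3] * kernel)
            return out

        def sobel_edges(gray, threshold=100.0):
            gx, gy = convolve2d(gray, KX), convolve2d(gray, KY)
            magnitude = np.hypot(gx, gy)
            return magnitude > threshold          # boolean edge map

        leaf = np.random.rand(32, 32) * 255       # stand-in for a grayscale leaf image
        print(sobel_edges(leaf).sum(), "edge pixels")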

  12. A microRNA detection system based on padlock probes and rolling circle amplification

    PubMed Central

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-01-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) have been studied intensively in recent years. Their minute size of only 19–24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA. PMID:16888321

  13. A microRNA detection system based on padlock probes and rolling circle amplification.

    PubMed

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-09-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) have been studied intensively in recent years. Their minute size of only 19-24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA.

  14. Low Base-Substitution Mutation Rate in the Germline Genome of the Ciliate Tetrahymena thermophila

    DTIC Science & Technology

    2016-09-15

    generations of mutation accumulation (MA). We applied an existing mutation-calling pipeline and developed a new probabilistic mutation detection approach...noise introduced by mismapped reads. We used both our new method and an existing mutation-calling pipeline (Sung, Tucker, et al. 2012) to analyse the...and larger MA experiments will be required to confidently estimate the mutational spectrum of a species with such a low mutation rate. Materials and

  15. Microarray Detection Call Methodology as a Means to Identify and Compare Transcripts Expressed within Syncytial Cells from Soybean (Glycine max) Roots Undergoing Resistant and Susceptible Reactions to the Soybean Cyst Nematode (Heterodera glycines)

    PubMed Central

    Klink, Vincent P.; Overall, Christopher C.; Alkharouf, Nadim W.; MacDonald, Margaret H.; Matthews, Benjamin F.

    2010-01-01

    Background. A comparative microarray investigation was done using detection call methodology (DCM) and differential expression analyses. The goal was to identify genes found in specific cell populations that were eliminated by differential expression analysis due to the nature of differential expression methods. Laser capture microdissection (LCM) was used to isolate nearly homogeneous populations of plant root cells. Results. The analyses identified the presence of 13,291 transcripts among the 4 different sample types. These were filtered down to a total of 6,267 transcripts that were detected as being present in one or more sample types. A comparative analysis of DCM and differential expression methods showed a group of genes that were not differentially expressed, but were expressed at detectable amounts within specific cell types. Conclusion. The DCM has identified patterns of gene expression not shown by differential expression analyses. DCM has identified genes that are possibly cell-type specific and/or involved in important aspects of plant-nematode interactions during the resistance response, revealing the uniqueness of a particular cell population at a particular point during its differentiation process. PMID:20508855

  16. An Optimal Method for Detecting Internal and External Intrusion in MANET

    NASA Astrophysics Data System (ADS)

    Rafsanjani, Marjan Kuchaki; Aliahmadipour, Laya; Javidi, Mohammad M.

    A Mobile Ad hoc Network (MANET) is formed by a set of mobile hosts which communicate among themselves through radio waves. The hosts establish the infrastructure and cooperate to forward data in a multi-hop fashion without central administration. Due to their communication type and resource constraints, MANETs are vulnerable to diverse types of attacks and intrusions. In this paper, we propose a method for preventing internal intrusion and detecting external intruders using game theory in mobile ad hoc networks. One solution for reducing the resource consumption of external intrusion detection is to elect a leader for each cluster to provide the intrusion detection service to the other nodes in its cluster; we call this the moderate mode. The moderate mode is only suitable when the probability of attack is low. Once the probability of attack is high, victim nodes should launch their own IDS to detect and thwart intrusions; we call this the robust mode. The leader should not be a malicious or selfish node and must detect external intrusion in its cluster with minimum cost. Our proposed method has three steps: the first step builds trust relationships between nodes and estimates a trust value for each node to prevent internal intrusion; in the second step we propose an optimal method for leader election using the trust values; and in the third step we find the threshold value for notifying a victim node to launch its IDS once the probability of attack exceeds that value. In the first and third steps we apply Bayesian game theory. By using game theory, trust values, and an honest leader, our method can effectively improve network security and performance and reduce resource consumption.

  17. Multi-Centrality Graph Spectral Decompositions and Their Application to Cyber Intrusion Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pin-Yu; Choudhury, Sutanay; Hero, Alfred

    Many modern datasets can be represented as graphs and hence spectral decompositions such as graph principal component analysis (PCA) can be useful. Distinct from previous graph decomposition approaches based on subspace projection of a single topological feature, e.g., the centered graph adjacency matrix (graph Laplacian), we propose spectral decomposition approaches to graph PCA and graph dictionary learning that integrate multiple features, including graph walk statistics, centrality measures and graph distances to reference nodes. In this paper we propose a new PCA method for single graph analysis, called multi-centrality graph PCA (MC-GPCA), and a new dictionary learning method for ensembles of graphs, called multi-centrality graph dictionary learning (MC-GDL), both based on spectral decomposition of multi-centrality matrices. As an application to cyber intrusion detection, MC-GPCA can be an effective indicator of anomalous connectivity patterns and MC-GDL can provide a discriminative basis for attack classification.

  18. Mapping Farming Practices in Belgian Intensive Cropping Systems from Sentinel-1 SAR Time Series

    NASA Astrophysics Data System (ADS)

    Chome, G.; Baret, P. V.; Defourny, P.

    2016-08-01

    The environmental impact of the so-called conventional farming system calls for new farming practices that reduce negative externalities. Emerging farming practices such as no-till and new inter-cropping management are promising tracks. The development of methods to characterize crop management across an entire region, and to understand its spatial dimension, offers opportunities to accompany the transition towards more sustainable agriculture. This research takes advantage of the unmatched polarimetric and temporal resolutions of Sentinel-1 SAR C-band data to develop a method to identify farming practices at the parcel level. To this end, the detection of changes in backscattering due to surface roughness modification (tillage, inter-crop cover destruction, etc.) is used to detect the farming management. The final results are compared to a reference dataset collected through an intensive field campaign. Finally, the performance is discussed from the perspective of monitoring cropping-system practices through remote sensing.

  19. QQ-SNV: single nucleotide variant detection at low frequency by comparing the quality quantiles.

    PubMed

    Van der Borght, Koen; Thys, Kim; Wetzels, Yves; Clement, Lieven; Verbist, Bie; Reumers, Joke; van Vlijmen, Herman; Aerssens, Jeroen

    2015-11-10

    Next generation sequencing enables studying heterogeneous populations of viral infections. When the sequencing is done at high coverage depth ("deep sequencing"), low frequency variants can be detected. Here we present QQ-SNV (http://sourceforge.net/projects/qqsnv), a logistic regression classifier model developed for the Illumina sequencing platforms that uses the quantiles of the quality scores, to distinguish true single nucleotide variants from sequencing errors based on the estimated SNV probability. To train the model, we created a dataset of an in silico mixture of five HIV-1 plasmids. Testing of our method in comparison to the existing methods LoFreq, ShoRAH, and V-Phaser 2 was performed on two HIV and four HCV plasmid mixture datasets and one influenza H1N1 clinical dataset. For default application of QQ-SNV, variants were called using a SNV probability cutoff of 0.5 (QQ-SNV(D)). To improve the sensitivity we used a SNV probability cutoff of 0.0001 (QQ-SNV(HS)). To also increase specificity, SNVs called were overruled when their frequency was below the 80(th) percentile calculated on the distribution of error frequencies (QQ-SNV(HS-P80)). When comparing QQ-SNV versus the other methods on the plasmid mixture test sets, QQ-SNV(D) performed similarly to the existing approaches. QQ-SNV(HS) was more sensitive on all test sets but with more false positives. QQ-SNV(HS-P80) was found to be the most accurate method over all test sets by balancing sensitivity and specificity. When applied to a paired-end HCV sequencing study, with lowest spiked-in true frequency of 0.5%, QQ-SNV(HS-P80) revealed a sensitivity of 100% (vs. 40-60% for the existing methods) and a specificity of 100% (vs. 98.0-99.7% for the existing methods). In addition, QQ-SNV required the least overall computation time to process the test sets. Finally, when testing on a clinical sample, four putative true variants with frequency below 0.5% were consistently detected by QQ-SNV(HS-P80) from different generations of Illumina sequencers. We developed and successfully evaluated a novel method, called QQ-SNV, for highly efficient single nucleotide variant calling on Illumina deep sequencing virology data.
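
    A hedged sketch of the post-filtering described above (the logistic-regression probability model itself is not reproduced, and the inputs are illustrative): keep candidates whose estimated SNV probability exceeds the cutoff, then, for the HS-P80 variant, overrule calls whose frequency falls below the 80th percentile of the error-frequency distribution.

        # Hedged sketch of the QQ-SNV post-filtering described above: keep candidates
        # whose estimated SNV probability exceeds a cutoff, then (HS-P80 variant)
        # overrule calls whose frequency is below the 80th percentile of the observed
        # error-frequency distribution.  Probabilities and frequencies are illustrative.
        import numpy as np

        def qq_snv_filter(candidates, error_freqs, prob_cutoff=0.0001, percentile=80):
            freq_floor = np.percentile(error_freqs, percentile)
            return [c for c in candidates
                    if c["snv_prob"] >= prob_cutoff and c["frequency"] >= freq_floor]

        candidates = [{"pos": 101, "snv_prob": 0.93, "frequency": 0.012},
                      {"pos": 255, "snv_prob": 0.40, "frequency": 0.0005},
                      {"pos": 380, "snv_prob": 0.02, "frequency": 0.004}]
        error_freqs = np.random.uniform(0.0001, 0.002, size=1000)   # background errors

        print(qq_snv_filter(candidates, error_freqs))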

  20. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    PubMed Central

    Hauschild, Anne-Christin; Kopczynski, Dominik; D’Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-01-01

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors’ results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications. PMID:24957992
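
    As a sketch of the simplest of the four compared approaches, local maxima search, the code below flags points in an IMS intensity matrix that exceed a threshold and are maximal in their 3x3 neighbourhood; the matrix and threshold are illustrative, and the other three peak pickers are not reproduced.

        # Sketch of the simplest compared approach, local maxima search: flag points in
        # an IMS intensity matrix that exceed a threshold and are maximal within their
        # 3x3 neighbourhood.  Requires numpy and scipy.
        import numpy as np
        from scipy.ndimage import maximum_filter

        def local_maxima_peaks(intensity, threshold):
            neighbourhood_max = maximum_filter(intensity, size=3)
            peaks = (intensity == neighbourhood_max) & (intensity > threshold)
            return np.argwhere(peaks)          # (retention time index, drift time index)

        spectrum = np.random.rand(200, 150)    # stand-in for an MCC/IMS measurement
        spectrum[50, 80] += 5.0                # inject an obvious peak
        print(local_maxima_peaks(spectrum, threshold=2.0))   # -> [[50 80]]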

  1. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    PubMed

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
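
    The evaluation design, comparing peak detectors by the downstream classification accuracy they support, can be sketched briefly; the detector names are taken from the abstract, but the data matrices, labels, and classifier below are synthetic placeholders rather than the study's actual pipeline.

      # Sketch of comparing peak detectors by downstream classification accuracy.
      # Each detector's output is assumed to be a samples-by-peaks intensity matrix
      # with class labels (e.g. healthy vs. not); here the matrices are synthetic.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      labels = np.array([0] * 20 + [1] * 20)

      detector_outputs = {
          "local_maxima": rng.normal(size=(40, 120)),
          "watershed_IPHEx": rng.normal(size=(40, 110)),
          "region_merging_VisualNow": rng.normal(size=(40, 95)),
          "peak_model_estimation": rng.normal(size=(40, 80)),
      }

      for name, X in detector_outputs.items():
          scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
          print(f"{name}: mean accuracy {scores.mean():.2f} (+/- {scores.std():.2f})")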

  2. An image-based automatic recognition method for the flowering stage of maize

    NASA Astrophysics Data System (ADS)

    Yu, Zhenghong; Zhou, Huabing; Li, Cuina

    2018-03-01

    In this paper, we propose an image-based approach for automatically recognizing the flowering stage of maize. A modified HOG/SVM detection framework is first adopted to detect the ears of maize. Then, we use low-rank matrix recovery technology to precisely extract the ears at the pixel level. Finally, a new feature called the color gradient histogram is proposed as an indicator to determine the flowering stage. A comparative experiment was carried out to verify the validity of our method, and the results indicate that our method can meet the demands of practical observation.

  3. Lead Poison Detection

    NASA Technical Reports Server (NTRS)

    1976-01-01

    With NASA contracts, Whittaker Corporation's Space Science division has developed an electro-optical instrument to mass screen for lead poisoning. The device is portable and detects protoporphyrin in whole blood; free corpuscular porphyrins occur as an early effect of lead ingestion. It also detects lead in urine, used to confirm blood tests. The test is inexpensive and can be applied by relatively unskilled personnel. A similar Whittaker fluorometry device called "drug screen" can measure morphine and quinine in urine much faster and cheaper than other methods.

  4. Computer-aided detection of initial polyp candidates with level set-based adaptive convolution

    NASA Astrophysics Data System (ADS)

    Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong

    2009-02-01

    In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first and second order spatial derivatives of computed tomographic colonography images, which is the beginning of various geometric analyses. However, the performance of such methods greatly depends on the single-layer representation of the colon wall, which is called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect the initial polyp candidates, and experiments showed that it benefits the CAD scheme in both the detection sensitivity and specificity as compared to our previous work.

  5. The ICR96 exon CNV validation series: a resource for orthogonal assessment of exon CNV calling in NGS data.

    PubMed

    Mahamdallie, Shazia; Ruark, Elise; Yost, Shawn; Ramsay, Emma; Uddin, Imran; Wylie, Harriett; Elliott, Anna; Strydom, Ann; Renwick, Anthony; Seal, Sheila; Rahman, Nazneen

    2017-01-01

    Detection of deletions and duplications of whole exons (exon CNVs) is a key requirement of genetic testing. Accurate detection of this variant type has proved very challenging in targeted next-generation sequencing (NGS) data, particularly if only a single exon is involved. Many different NGS exon CNV calling methods have been developed over the last five years. Such methods are usually evaluated using simulated and/or in-house data due to a lack of publicly available datasets with orthogonally generated results. This hinders tool comparisons, transparency and reproducibility. To provide a community resource for assessment of exon CNV calling methods in targeted NGS data, we here present the ICR96 exon CNV validation series. The dataset includes high-quality sequencing data from a targeted NGS assay (the TruSight Cancer Panel) together with Multiplex Ligation-dependent Probe Amplification (MLPA) results for 96 independent samples. 66 samples contain at least one validated exon CNV and 30 samples have validated negative results for exon CNVs in 26 genes. The dataset includes 46 exon CNVs in BRCA1, BRCA2, TP53, MLH1, MSH2, MSH6, PMS2, EPCAM or PTEN, giving excellent representation of the cancer predisposition genes most frequently tested in clinical practice. Moreover, the validated exon CNVs include 25 single exon CNVs, the most difficult type of exon CNV to detect. The FASTQ files for the ICR96 exon CNV validation series can be accessed through the European Genome-phenome Archive (EGA) under the accession number EGAS00001002428.

  6. A novel sensitivity-based method for damage detection of structures under unknown periodic excitations

    NASA Astrophysics Data System (ADS)

    Naseralavi, S. S.; Salajegheh, E.; Fadaee, M. J.; Salajegheh, J.

    2014-06-01

    This paper presents a technique for damage detection in structures under unknown periodic excitations using the transient displacement response. The method is capable of identifying the damage parameters without finding the input excitations. We first define the concept of displacement space as a linear space in which each point represents displacements of structure under an excitation and initial condition. Roughly speaking, the method is based on the fact that structural displacements under free and forced vibrations are associated with two parallel subspaces in the displacement space. Considering this novel geometrical viewpoint, an equation called kernel parallelization equation (KPE) is derived for damage detection under unknown periodic excitations and a sensitivity-based algorithm for solving KPE is proposed accordingly. The method is evaluated via three case studies under periodic excitations, which confirm the efficiency of the proposed method.

  7. Content recognition for telephone monitoring

    NASA Astrophysics Data System (ADS)

    Wenndt, Stanley J.; Harris, David M.; Cupples, Edward J.

    2001-02-01

    This research began because federal inmates were abusing their telephone privileges by committing serious offenses such as murder, drug dealing, and fraud. On average, about 1000 calls are made per day at each federal prison, with a peak of over 4000. Current monitoring capabilities are very labor-intensive and only allow about 2-3% of inmate telephone conversations to be monitored. One of the main deficiencies identified by prison officials is the need to flag phone conversations pertaining to criminal activity. This research looks at two unique voice-processing methods to detect phone conversations pertaining to criminal activity: digit string detection and whisper detection.

  8. A method for the detection of the refractive index of irregular shape solid pigments in light absorbing liquid matrix.

    PubMed

    Niskanen, Ilpo; Räty, Jukka; Peiponen, Kai-Erik

    2010-06-15

    The immersion liquid method is powerful for the measurement of the refractive index of solid particles in a liquid matrix. However, this method applies best for cases when the liquid matrix is transparent. A problem is usually how to assess the refractive index of a pigment when it is in a colored host liquid. In this article we introduce a method, and show that by combining a so-called multifunction spectrophotometer, the immersion liquid method, and detection of light transmission and reflection, we can assess the refractive index of a pigment in a colored liquid, and also the extinction or absorption coefficient of the host liquid.

  9. A Machine Learning Method for Power Prediction on the Mobile Devices.

    PubMed

    Chen, Da-Ren; Chen, You-Shyang; Chen, Lin-Chih; Hsu, Ming-Yang; Chiang, Kai-Feng

    2015-10-01

    Energy profiling and estimation have been popular areas of research in multicore mobile architectures. While short sequences of system calls have been recognized by machine learning as pattern descriptions for anomaly detection, the power consumption of running processes with respect to system-call patterns is not well studied. In this paper, we propose a fuzzy neural network (FNN) for training and analyzing process execution behaviour with respect to series of system calls, their parameters and their power consumption. On the basis of the patterns of a series of system calls, we develop a power estimation daemon (PED) to analyze and predict the energy consumption of the running process. In the initial stage, PED categorizes sequences of system calls as functional groups and predicts their energy consumption with the FNN. In the operational stage, PED is applied to identify the predefined sequences of system calls invoked by running processes and estimate their energy consumption.

  10. Polarization switching detection method using a ferroelectric liquid crystal for dichroic atomic vapor laser lock frequency stabilization techniques.

    PubMed

    Dudzik, Grzegorz; Rzepka, Janusz; Abramski, Krzysztof M

    2015-04-01

    We present a concept of the polarization switching detection method implemented for frequency-stabilized lasers, called the polarization switching dichroic atomic vapor laser lock (PSDAVLL) technique. It is a combination of the well-known dichroic atomic vapor laser lock method for laser frequency stabilization with a synchronous detection system based on a surface-stabilized ferroelectric liquid crystal (SSFLC). The SSFLC acts as a polarization switch and quarter-wave-plate component. This technique provides a 9.6 dB better dynamic range ratio (DNR) than the well-known two-photodiode detection configuration known as the balanced polarimeter. This paper describes the proposed method used practically in a VCSEL laser frequency stabilization system. The applied PSDAVLL method has allowed us to obtain a frequency stability of 2.7×10⁻⁹ and a reproducibility of 1.2×10⁻⁸, with a DNR of detected signals of around 81 dB. It has been shown that PSDAVLL might be successfully used as a method for spectrally stable laser sources.

  11. Beluga whale, Delphinapterus leucas, vocalizations and their relation to behaviour in the Churchill River, Manitoba, Canada

    NASA Astrophysics Data System (ADS)

    Chmelnitsky, Elly Golda

    The investigation of a species' repertoire and the contexts in which different calls are used is central to understanding vocal communication among animals. Beluga whale, Delphinapterus leucas, calls were classified and described in association with behaviours, from recordings collected in the Churchill River, Manitoba, during the summers of 2006-2008. Calls were subjectively classified based on sound and visual analysis into whistles (64.2% of total calls; 22 call types), pulsed or noisy calls (25.9%; 15 call types), and combined calls (9.9%; seven types). A hierarchical cluster analysis, using six call measurements as variables, separated whistles into 12 groups and results were compared to subjective classification. Beluga calls associated with social interactions, travelling, feeding, and interactions with the boat were described. Call type percentages, relative proportions of different whistle contours (shapes), average frequency, and call duration varied with behaviour. Generally, higher percentages of whistles, more broadband pulsed and noisy calls, and shorter calls (<0.49s) were produced during behaviours associated with higher levels of activity and/or apparent arousal. Information on call types, call characteristics, and behavioural context of calls can be used for automated detection and classification methods and in future studies on call meaning and function.
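
    The clustering step lends itself to a short sketch: standardize the call measurements and cut a Ward-linkage dendrogram into a fixed number of groups. The six measurement columns below are synthetic stand-ins for the variables actually measured.

      # Sketch: hierarchical clustering of whistle measurements into call groups.
      # The 200 x 6 measurement matrix is a synthetic placeholder for variables
      # such as start/end frequency and duration.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import zscore

      rng = np.random.default_rng(2)
      measurements = rng.normal(size=(200, 6))          # 200 whistles x 6 measurements

      Z = linkage(zscore(measurements, axis=0), method="ward")
      groups = fcluster(Z, t=12, criterion="maxclust")  # cut the tree into 12 groups
      print(np.bincount(groups)[1:])                    # whistles per cluster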

  12. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid the preoperative diagnosis of abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might have a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk for relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and the structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the proposed method uses a classifier based on support vector machine with the texture and shape information. The experimental results reveal that it detects 70.5% of the lymph nodes with 13.0 false positives per case.

  13. Detection of shifted double JPEG compression by an adaptive DCT coefficient model

    NASA Astrophysics Data System (ADS)

    Wang, Shi-Lin; Liew, Alan Wee-Chung; Li, Sheng-Hong; Zhang, Yu-Jin; Li, Jian-Hua

    2014-12-01

    In many JPEG image splicing forgeries, the tampered image patch has been JPEG-compressed twice with different block alignments. Such phenomenon in JPEG image forgeries is called the shifted double JPEG (SDJPEG) compression effect. Detection of SDJPEG-compressed patches could help in detecting and locating the tampered region. However, the current SDJPEG detection methods do not provide satisfactory results especially when the tampered region is small. In this paper, we propose a new SDJPEG detection method based on an adaptive discrete cosine transform (DCT) coefficient model. DCT coefficient distributions for SDJPEG and non-SDJPEG patches have been analyzed and a discriminative feature has been proposed to perform the two-class classification. An adaptive approach is employed to select the most discriminative DCT modes for SDJPEG detection. The experimental results show that the proposed approach can achieve much better results compared with some existing approaches in SDJPEG patch detection especially when the patch size is small.

  14. Call progress time measurement in IP telephony

    NASA Astrophysics Data System (ADS)

    Khasnabish, Bhumip

    1999-11-01

    Usually a voice call is established through multiple stages in IP telephony. In the first stage, a phone number is dialed to reach a near-end or call-originating IP-telephony gateway. The next stages involve user identification through delivering an m-digit user-id to the authentication and/or billing server, and then user authentication by using an n-digit PIN. After that, the caller is allowed (a last-stage dial tone is provided) to dial a destination phone number provided that authentication is successful. In this paper, we present a very flexible method for measuring call progress time in IP telephony. The proposed technique can be used to measure the system response time at every stage. It is flexible, so that it can be easily modified to include a new 'tone' or a set of tones, or 'voice begin' can be used in every stage to detect the system's response. The proposed method has been implemented using scripts written in Hammer visual basic language for testing with a few commercially available IP telephony gateways.

  15. Testing the effectiveness of automated acoustic sensors for monitoring vocal activity of Marbled Murrelets Brachyramphus marmoratus

    USGS Publications Warehouse

    Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.

    2015-01-01

    Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind and rain. False-positives were lower in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.

  16. Applying the Multiple Signal Classification Method to Silent Object Detection Using Ambient Noise

    NASA Astrophysics Data System (ADS)

    Mori, Kazuyoshi; Yokoyama, Tomoki; Hasegawa, Akio; Matsuda, Minoru

    2004-05-01

    The revolutionary concept of using ocean ambient noise positively to detect objects, called acoustic daylight imaging, has attracted much attention. The authors attempted the detection of a silent target object using ambient noise and a wide-band beam former consisting of an array of receivers. In experimental results obtained in air, using the wide-band beam former, we successfully applied the delay-sum array (DSA) method to detect a silent target object in an acoustic noise field generated by a large number of transducers. This paper reports some experimental results obtained by applying the multiple signal classification (MUSIC) method to a wide-band beam former to detect silent targets. The ocean ambient noise was simulated by transducers decentralized to many points in air. Both MUSIC and DSA detected a spherical target object in the noise field. The relative power levels near the target obtained with MUSIC were compared with those obtained by DSA. Then the effectiveness of the MUSIC method was evaluated according to the rate of increase in the maximum and minimum relative power levels.
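
    For reference, a compact narrowband MUSIC sketch for a uniform linear array is given below; the geometry and signal model are illustrative and much simpler than the wide-band, ambient-noise setup of the experiments.

      # Narrowband MUSIC pseudospectrum for a uniform linear array (illustrative;
      # the paper applies MUSIC within a wide-band beamformer, which is more involved).
      import numpy as np

      def music_spectrum(snapshots, n_sources, d_over_lambda=0.5,
                         angles=np.linspace(-90, 90, 361)):
          """snapshots: (n_sensors, n_snapshots) complex array of sensor outputs."""
          n_sensors = snapshots.shape[0]
          R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
          eigvals, eigvecs = np.linalg.eigh(R)
          En = eigvecs[:, : n_sensors - n_sources]                  # noise subspace
          spectrum = []
          for theta in np.deg2rad(angles):
              a = np.exp(2j * np.pi * d_over_lambda * np.arange(n_sensors) * np.sin(theta))
              spectrum.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
          return angles, np.array(spectrum)

      # Two simulated plane waves at -20 and 35 degrees on an 8-element array.
      rng = np.random.default_rng(9)
      doas = np.deg2rad([-20.0, 35.0])
      A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(8), np.sin(doas)))
      S = rng.normal(size=(2, 200)) + 1j * rng.normal(size=(2, 200))
      N = 0.1 * (rng.normal(size=(8, 200)) + 1j * rng.normal(size=(8, 200)))
      angles, P = music_spectrum(A @ S + N, n_sources=2)
      print(angles[np.argmax(P)])                       # near one of the true directions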

  17. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226
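
    A toy version of the within-species contamination idea helps fix intuition: at sites where the sample is homozygous reference, alternate-allele reads are assumed to come from the contaminating individual, so the expected alternate fraction is roughly the contamination fraction times the population allele frequency. The grid-search estimator below is far simpler than the likelihood models in the paper and is shown only as an assumption-laden sketch.

      # Toy contamination estimator (much simpler than the published models): at
      # homozygous-reference sites the expected alt-read fraction is taken to be
      # alpha * f, with f the population alt-allele frequency; grid-search alpha.
      import numpy as np
      from scipy.stats import binom

      def estimate_alpha(alt_reads, depths, pop_alt_freqs, grid=np.linspace(0.0, 0.5, 501)):
          alt_reads, depths = np.asarray(alt_reads), np.asarray(depths)
          f = np.asarray(pop_alt_freqs)
          loglik = [binom.logpmf(alt_reads, depths, np.clip(a * f, 1e-6, 1 - 1e-6)).sum()
                    for a in grid]
          return grid[int(np.argmax(loglik))]

      # Example: simulate 1% contamination at 2000 hom-ref sites with 40x depth.
      rng = np.random.default_rng(3)
      f = rng.uniform(0.05, 0.95, 2000)
      depth = np.full(2000, 40)
      alt = rng.binomial(depth, 0.01 * f)
      print(estimate_alpha(alt, depth, f))              # should be close to 0.01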

  18. Evaluation of the status of anurans on a refuge in suburban Maryland

    USGS Publications Warehouse

    Brander, S.M.; Royle, J. Andrew; Eames, M.

    2007-01-01

    Because many anurans have well-defined breeding seasons and male anurans produce loud advertisement calls, surveys of these breeding choruses are believed to provide a dependable means of monitoring population trends. The Patuxent Research Refuge initiated such a calling survey in the spring of 1997, which uses volunteers to collect anuran (frog and toad) calling survey data. The primary goal of initiating the calling surveys at the Patuxent Refuge was to obtain baseline information on anuran populations, such as species occurrence, frequency of occurrence, and relative abundance over time. In this paper, we used the calling survey data to develop models for the "proportion of area occupied" by individual anuran species, a method in which analysis is focused on the proportion of sites that are occupied by a species, instead of the number of individuals present in the population. This type of analysis is ideal for use in large-scale monitoring programs focused on species that are difficult to count, such as anurans or birds. We considered models for proportion of area occupied that allow for imperfect detection (that is, a species may be present but go undetected during sampling) by incorporating parameters that describe detection probability and the response of detection probability to various environmental and sampling covariates. Our results indicate that anuran populations on the Patuxent Research Refuge have high rates of occupancy compared to areas nearby and that extinction and colonization rates are stable. The potential uses for "proportion of area occupied" analyses are far-reaching and will allow for more accurate quantification of data and better-informed management decisions for calling surveys on a larger scale.

  19. Evaluation of the status of anurans on a refuge in suburban Maryland

    USGS Publications Warehouse

    Brander, S.M.; Royle, J. Andrew; Eames, M.

    2007-01-01

    Because many anurans have well-defined breeding seasons and male anurans produce loud advertisement calls, surveys of these breeding choruses are believed to provide a dependable means of monitoring population trends. The Patuxent Research Refuge initiated such a calling survey in the spring of 1997, which uses volunteers to collect anuran (frog and toad) calling survey data. The primary goal of initiating the calling surveys at the Patuxent Refuge was to obtain baseline information on anuran populations, such as species occurrence, frequency of occurrence, and relative abundance over time. In this paper, we used the calling survey data to develop models for the "proportion of area occupied" by individual anuran species, a method in which analysis is focused on the proportion of sites that are occupied by a species, instead of the number of individuals present in the population. This type of analysis is ideal for use in large-scale monitoring programs focused on species that are difficult to count, such as anurans or birds. We considered models for proportion of area occupied that allow for imperfect detection (that is, a species may be present but go undetected during sampling) by incorporating parameters that describe detection probability and the response of detection probability to various environmental and sampling covariates. Our results indicate that anuran populations on the Patuxent Research Refuge have high rates of occupancy compared to areas nearby and that extinction and colonization rates are stable. The potential uses for "proportion of area occupied" analyses are far-reaching and will allow for more accurate quantification of data and better-informed management decisions for calling surveys on a larger scale. Copyright 2007 Society for the Study of Amphibians and Reptiles.
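
    A minimal single-season occupancy likelihood in the spirit of these "proportion of area occupied" models, with constant occupancy and detection probabilities and no covariates, can be fitted in a few lines; the detection histories below are simulated, and the full models in the paper add covariates and dynamics not shown here.

      # Minimal single-season occupancy model: constant occupancy (psi) and
      # detection probability (p), no covariates. Data: sites x visits detection
      # histories. The per-site binomial coefficient is omitted (constant in the MLE).
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      def neg_log_lik(params, y):
          psi, p = expit(params)                            # keep both in (0, 1)
          d = y.sum(axis=1)                                 # detections per site
          k = y.shape[1]                                    # visits per site
          lik = psi * p ** d * (1 - p) ** (k - d)           # occupied-and-observed term
          lik = lik + (d == 0) * (1 - psi)                  # never-detected sites may be empty
          return -np.log(lik).sum()

      rng = np.random.default_rng(4)
      z = rng.random(100) < 0.7                             # true occupancy, psi = 0.7
      y = (rng.random((100, 5)) < 0.4) & z[:, None]         # detections, p = 0.4

      fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,))
      print("psi, p =", expit(fit.x))                       # roughly (0.7, 0.4)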

  20. E-commerce Review System to Detect False Reviews.

    PubMed

    Kolhar, Manjur

    2017-08-15

    E-commerce sites have been doing profitable business since their inception on high-speed, secure networks. Moreover, they continue to influence consumers through various methods. One of the most effective methods is the e-commerce review rating system, in which consumers provide review ratings for the products they have used. However, almost all e-commerce review rating systems are unable to provide cumulative review ratings. Furthermore, review ratings are influenced by positive and negative malicious feedback ratings, collectively called false reviews. In this paper, we propose an e-commerce review system framework developed using the cumulative sum method to detect and remove malicious review ratings.
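
    The cumulative sum idea for flagging shifts in incoming ratings can be conveyed in a few lines; the target mean, slack and threshold below are arbitrary assumptions rather than values from the paper, and the published framework wraps this in a larger review-system design.

      # Illustrative one-sided CUSUM on a stream of review ratings: flag entries once
      # the cumulative deviation from an expected mean rating exceeds a threshold.
      # Target mean, slack (k) and threshold (h) are arbitrary assumptions.
      def cusum_flags(ratings, target=4.0, k=0.5, h=3.0):
          s_hi = s_lo = 0.0
          flags = []
          for r in ratings:
              s_hi = max(0.0, s_hi + (r - target - k))   # drift upwards (inflated ratings)
              s_lo = max(0.0, s_lo + (target - r - k))   # drift downwards (bashing)
              flags.append(s_hi > h or s_lo > h)
          return flags

      print(cusum_flags([4, 5, 4, 1, 1, 2, 1, 1]))       # the later low ratings get flagged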

  1. Episodic Upwelling of Zooplankton within a Bowhead Whale Feeding Area Near Barrow, AK

    DTIC Science & Technology

    2011-09-30

    the Beaufort year-round. Bowhead whales vocalize using both calls and songs. There was distinct seasonal variability in the detection of the...different species' calls/songs. Calls/songs from whale species were detected in fall and declined as ice concentration in the mooring vicinity increased...(Figs. 4 & 5). In the spring, however, whale calls/songs were detected beginning in April when the region was still covered with ice, and continued

  2. Mutual information estimation reveals global associations between stimuli and biological processes

    PubMed Central

    Suzuki, Taiji; Sugiyama, Masashi; Kanamori, Takafumi; Sese, Jun

    2009-01-01

    Background Although microarray gene expression analysis has become popular, it remains difficult to interpret the biological changes caused by stimuli or variation of conditions. Clustering of genes and associating each group with biological functions are commonly used methods. However, such methods only detect partial changes within cell processes. Herein, we propose a method for discovering global changes within a cell by associating observed conditions of gene expression with gene functions. Results To elucidate the association, we introduce a novel feature selection method called Least-Squares Mutual Information (LSMI), which computes mutual information without density estimation, and therefore LSMI can detect nonlinear associations within a cell. We demonstrate the effectiveness of LSMI through comparison with existing methods. The results of the application to yeast microarray datasets reveal that non-natural stimuli affect various biological processes, whereas others show no significant relation to specific cell processes. Furthermore, we discover that biological processes can be categorized into four types according to their responses to various stimuli: DNA/RNA metabolism, gene expression, protein metabolism, and protein localization. Conclusion We proposed a novel feature selection method called LSMI, and applied LSMI to mining the association between conditions of yeast and biological processes through microarray datasets. In fact, LSMI allows us to elucidate the global organization of cellular process control. PMID:19208155

  3. Directional frequency and recording (DIFAR) sensors in seafloor recorders to locate calling bowhead whales during their fall migration.

    PubMed

    Greene, Charles R; McLennan, Miles Wm; Norman, Robert G; McDonald, Trent L; Jakubczak, Ray S; Richardson, W John

    2004-08-01

    Bowhead whales, Balaena mysticetus, migrate west during fall approximately 10-75 km off the north coast of Alaska, passing the petroleum developments around Prudhoe Bay. Oil production operations on an artificial island 5 km offshore create sounds heard by some whales. As part of an effort to assess whether migrating whales deflect farther offshore at times with high industrial noise, an acoustical approach was selected for localizing calling whales. The technique incorporated DIFAR (directional frequency and recording) sonobuoy techniques. An array of 11 DASARs (directional autonomous seafloor acoustic recorders) was built and installed with unit-to-unit separation of 5 km. When two or more DASARs detected the same call, the whale location was determined from the bearing intersections. This article describes the acoustic methods used to determine the locations of the calling bowhead whales and shows the types and precision of the data acquired. Calibration transmissions at GPS-measured times and locations provided measures of the individual DASAR clock drift and directional orientation. The standard error of the bearing measurements at distances of 3-4 km was approximately 1.35 degrees after corrections for gain imbalance in the two directional sensors. During 23 days in 2002, 10,587 bowhead calls were detected and 8383 were localized.
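
    The localization step, intersecting bearings from two recorders, reduces to a small linear solve; the sketch below works in a local flat-earth east/north frame and ignores bearing error, so it is only a geometric illustration of the approach.

      # Locate a calling whale from two sensor positions and two measured bearings
      # (degrees clockwise from north) by intersecting the bearing lines in a local
      # east/north frame. Purely illustrative geometry; no error handling.
      import numpy as np

      def bearings_intersection(p1, b1_deg, p2, b2_deg):
          p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
          d1 = np.array([np.sin(np.deg2rad(b1_deg)), np.cos(np.deg2rad(b1_deg))])
          d2 = np.array([np.sin(np.deg2rad(b2_deg)), np.cos(np.deg2rad(b2_deg))])
          # Solve p1 + t1*d1 = p2 + t2*d2 for (t1, t2).
          t = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
          return p1 + t[0] * d1

      # Two sensors 5 km apart; a source to the north-east of the first.
      print(bearings_intersection((0, 0), 45.0, (5000, 0), 315.0))   # ~[2500, 2500]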

  4. The 'sniffer-patch' technique for detection of neurotransmitter release.

    PubMed

    Allen, T G

    1997-05-01

    A wide variety of techniques have been employed for the detection and measurement of neurotransmitter release from biological preparations. Whilst many of these methods offer impressive levels of sensitivity, few are able to combine sensitivity with the necessary temporal and spatial resolution required to study quantal release from single cells. One detection method that is seeing a revival of interest and has the potential to fill this niche is the so-called 'sniffer-patch' technique. In this article, specific examples of the practical aspects of using this technique are discussed along with the procedures involved in calibrating these biosensors to extend their applications to provide quantitative, in addition to simple qualitative, measurements of quantal transmitter release.

  5. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    NASA Astrophysics Data System (ADS)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection with a background subtraction process remains an unresolved issue and attracts research interest due to challenges encountered on static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. In order to achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitters, and intermittent object motions. It is argued that the obtained method makes a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail outside the scenarios they were designed for.
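
    A stripped-down sliding-window background model conveys the general mechanism: keep the last N frames, take their per-pixel median as the background, and threshold the difference. The window size and threshold below are arbitrary, and the self-regulated update control that distinguishes SWCD is not reproduced.

      # Stripped-down sliding-window background subtraction: the background is the
      # per-pixel median of the last N frames and a fixed threshold marks change.
      # SWCD additionally self-regulates its update parameters; that part is omitted.
      import numpy as np
      from collections import deque

      class SlidingWindowDetector:
          def __init__(self, window=20, threshold=25):
              self.frames = deque(maxlen=window)
              self.threshold = threshold

          def apply(self, gray_frame):
              gray_frame = gray_frame.astype(np.float32)
              if len(self.frames) < self.frames.maxlen:
                  self.frames.append(gray_frame)
                  return np.zeros(gray_frame.shape, dtype=bool)
              background = np.median(np.stack(self.frames), axis=0)
              mask = np.abs(gray_frame - background) > self.threshold
              self.frames.append(gray_frame)                 # slide the window forward
              return mask

      detector = SlidingWindowDetector()
      rng = np.random.default_rng(5)
      for _ in range(25):
          mask = detector.apply(rng.integers(0, 255, (120, 160)))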

  6. A comparison of survey methods for documenting presence of Myotis leibii (Eastern Small-Footed Bats) at roosting areas in Western Virginia

    USGS Publications Warehouse

    Huth, John K.; Silvis, Alexander; Moosman, Paul R.; Ford, W. Mark; Sweeten, Sara E.

    2015-01-01

    Many aspects of foraging and roosting habitat of Myotis leibii (Eastern Small-Footed Bat), an emergent rock roosting-obligate, are poorly described. Previous comparisons of effectiveness of acoustic sampling and mist-net captures have not included Eastern Small-Footed Bat. Habitat requirements of this species differ from congeners in the region, and it is unclear whether survey protocols developed for other species are applicable. Using data from three overlapping studies at two sampling sites in western Virginia’s central Appalachian Mountains, detection probabilities were examined for three survey methods (acoustic surveys with automated identification of calls, visual searches of rock crevices, and mist-netting) for use in the development of “best practices” for future surveys and monitoring. Observer effects were investigated using an expanded version of visual search data. Results suggested that acoustic surveys with automated call identification are not effective for documenting presence of Eastern Small-Footed Bats on talus slopes (basal detection rate of 0%) even when the species is known to be present. The broadband, high frequency echolocation calls emitted by Eastern Small-Footed Bat may be prone to attenuation by virtue of their high frequencies, and these factors, along with signal reflection, lower echolocation rates or possible misidentification to other bat species over talus slopes may all have contributed to poor acoustic survey success. Visual searches and mist-netting of emergent rock had basal detection probabilities of 91% and 75%, respectively. Success of visual searches varied among observers, but detection probability improved with practice. Additionally, visual searches were considerably more economical than mist-netting.

  7. Bandwidth and Detection of Packet Length Covert Channels

    DTIC Science & Technology

    2011-03-01

    Shared Resource Matrix (SRM): Develop a matrix of all resources on one side and on the other all the processes. Then, determine which process uses which...system calls. This method is similar to that of the SRM. Covert channels have also been created by modulating packet timing, data and headers of network...analysis, noninterference analysis, SRM method, and the covert flow tree method [4]. These methods can be used during the design phase of a system. Less

  8. DB2: a probabilistic approach for accurate detection of tandem duplication breakpoints using paired-end reads.

    PubMed

    Yavaş, Gökhan; Koyutürk, Mehmet; Gould, Meetha P; McMahon, Sarah; LaFramboise, Thomas

    2014-03-05

    With the advent of paired-end high throughput sequencing, it is now possible to identify various types of structural variation on a genome-wide scale. Although many methods have been proposed for structural variation detection, most do not provide precise boundaries for identified variants. In this paper, we propose a new method, Distribution Based detection of Duplication Boundaries (DB2), for accurate detection of tandem duplication breakpoints, an important class of structural variation, with high precision and recall. Our computational experiments on simulated data show that DB2 outperforms state-of-the-art methods in terms of finding breakpoints of tandem duplications, with a higher positive predictive value (precision) in calling the duplications' presence. In particular, DB2's prediction of tandem duplications is correct 99% of the time even for very noisy data, while narrowing down the space of possible breakpoints within a margin of 15 to 20 bps on the average. Most of the existing methods provide boundaries in ranges that extend to hundreds of bases with lower precision values. Our method is also highly robust to varying properties of the sequencing library and to the sizes of the tandem duplications, as shown by its stable precision, recall and mean boundary mismatch performance. We demonstrate our method's efficacy using both simulated paired-end reads, and those generated from a melanoma sample and two ovarian cancer samples. Newly discovered tandem duplications are validated using PCR and Sanger sequencing. Our method, DB2, uses discordantly aligned reads, taking into account the distribution of fragment length to predict tandem duplications along with their breakpoints on a donor genome. The proposed method fine tunes the breakpoint calls by applying a novel probabilistic framework that incorporates the empirical fragment length distribution to score each feasible breakpoint. DB2 is implemented in Java programming language and is freely available at http://mendel.gene.cwru.edu/laframboiselab/software.php.

  9. Polarization-dependent optical reflection ultrasonic detection

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaoyi; Huang, Zhiyu; Wang, Guohe; Li, Wenzhao; Li, Changhui

    2017-03-01

    Although ultrasound transducers based on commercial piezoelectric materials have been widely used, they generally have limited bandwidth centered at the resonant frequency. Currently, several pure-optical ultrasonic detection methods have gained increasing interest due to their wide bandwidth and high sensitivity. However, most of them require customized components (such as micro-rings, SPR sensors, Fabry-Perot films, etc.), which limits their broad implementation. In this study, we present a simple pure-optical ultrasound detection method, called "Polarization-dependent Reflection Ultrasonic Detection" (PRUD). It detects the intensity difference between two polarization components of a probe beam that is modulated by ultrasound waves. PRUD detects the two components using a balanced detector, which effectively suppresses much of the unwanted noise. We have achieved a sensitivity (noise-equivalent pressure) of 1.7 kPa, and this can be further improved. In addition, like many other pure-optical ultrasonic detection methods, PRUD has a flat and broad bandwidth from almost zero to over 100 MHz. Besides theoretical analysis, we performed a phantom study by imaging a tungsten filament to demonstrate the performance of PRUD. We believe this simple and economic method will attract both researchers and engineers in the optical and ultrasound fields.

  10. Crack Detection in Concrete Tunnels Using a Gabor Filter Invariant to Rotation.

    PubMed

    Medina, Roberto; Llamas, José; Gómez-García-Bermejo, Jaime; Zalama, Eduardo; Segarra, Miguel José

    2017-07-20

    In this article, a system for the detection of cracks in concrete tunnel surfaces, based on image sensors, is presented. Both data acquisition and processing are covered. Linear cameras and proper lighting are used for data acquisition. The required resolution of the camera sensors and the number of cameras are discussed in terms of the crack size and the tunnel type. Data processing is done by applying a new method called the Gabor filter invariant to rotation, allowing the detection of cracks in any direction. The parameter values of this filter are set using a modified genetic algorithm based on the Differential Evolution optimization method. Pixels belonging to cracks are detected with a balanced accuracy of 95.27%, improving on the results of previous approaches.
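
    Rotation invariance via a bank of oriented Gabor filters can be sketched with scikit-image by keeping the per-pixel maximum response over orientations; the frequency, number of orientations and threshold below are placeholders, and the Differential-Evolution-based parameter tuning from the paper is not included.

      # Sketch of a rotation-invariant Gabor response: filter at several orientations,
      # keep the per-pixel maximum magnitude, then threshold to get crack pixels.
      # Frequency, orientation count and threshold are illustrative assumptions.
      import numpy as np
      from skimage.filters import gabor

      def crack_response(gray, frequency=0.15, n_orientations=8):
          responses = []
          for theta in np.linspace(0, np.pi, n_orientations, endpoint=False):
              real, imag = gabor(gray, frequency=frequency, theta=theta)
              responses.append(np.hypot(real, imag))
          return np.max(responses, axis=0)                  # invariant to crack direction

      rng = np.random.default_rng(6)
      image = rng.random((64, 64))
      image[30:34, :] = 0.0                                 # a dark horizontal "crack"
      crack_mask = crack_response(image) > 0.3              # threshold chosen arbitrarily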

  11. Convolutional neural network for earthquake detection and location

    PubMed Central

    Perol, Thibaut; Gharbi, Michaël; Denolle, Marine

    2018-01-01

    The recent evolution of induced seismicity in Central United States calls for exhaustive catalogs to improve seismic hazard assessment. Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today’s most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. We leverage the recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for earthquake detection and location from a single waveform. We apply our technique to study the induced seismicity in Oklahoma, USA. We detect more than 17 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is orders of magnitude faster than established methods. PMID:29487899

  12. [Demand for and the Development of Detection Techniques for Source of Schistosome Infection in China].

    PubMed

    Wang, Shi-ping; He, Xin; Zhou, Yun-fei

    2015-12-01

    Schistosomiasis is a type of zoonotic parasitosis that severely impairs human health. Rapid detection of infection sources is key to the control of schistosomiasis. With the effective control of schistosomiasis in China, the detection techniques for infection sources have also developed. The rate and intensity of infection among humans and livestock have decreased significantly in China, as the control program has entered the transmission control stage in most of the endemic areas. Under this situation, traditional etiological diagnostic techniques and common immunological methods cannot provide rapid detection of schistosomiasis infection sources. Instead, we are calling for detection methods with higher sensitivity, specificity and stability that are less time-consuming, more convenient and less costly. In recent years, many improved or novel detection methods have been applied for the epidemiological surveillance of schistosomiasis, such as the automatic scanning microscopic image acquisition system, PCR-ELISA, immunosensors, loop-mediated isothermal amplification, etc. The development of new monitoring techniques can facilitate rapid detection of schistosome infection sources in endemic areas.

  13. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method called the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz) was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine (Colchester Master Tornado T4) in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result can then be used for real-time tool wear monitoring.

  14. A call for benchmarking transposable element annotation methods.

    PubMed

    Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu

    2015-01-01

    DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.

  15. Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911

    ERIC Educational Resources Information Center

    Sharkey, Sonya; Denke, Linda; Herbert, Morley A.

    2016-01-01

    To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). We obtained permission from parents and guardians to use an 8-min puppet show to instruct the fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A…

  16. An Accurate Framework for Arbitrary View Pedestrian Detection in Images

    NASA Astrophysics Data System (ADS)

    Fan, Y.; Wen, G.; Qiu, S.

    2018-01-01

    We consider the problem of detecting pedestrians in images collected from various viewpoints. This paper utilizes a novel framework called locality-constrained affine subspace coding (LASC). Firstly, the positive training samples are clustered into similar entities, each representing a similar viewpoint. Then Principal Component Analysis (PCA) is used to obtain the shared features of each viewpoint. Finally, samples that can be reconstructed by linear approximation using their top-k nearest shared features with a small error are regarded as correct detections. No negative samples are required by our method. Histograms of oriented gradients (HOG) features are used as the feature descriptors, and the sliding window scheme is adopted to detect humans in images. The proposed method exploits the sparse property of intrinsic information and the correlations among the multiple-view samples. Experimental results on the INRIA and SDL human datasets show that the proposed method achieves higher performance than state-of-the-art methods in terms of effectiveness and efficiency.
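
    The core decision rule, accepting a window if some viewpoint subspace reconstructs it with small error, can be simplified to the sketch below; the clustering, HOG extraction and the locality-constrained coding of the full LASC formulation are replaced here by plain per-cluster PCA on synthetic features.

      # Simplified decision rule: a feature vector is accepted if its reconstruction
      # error under the PCA subspace of at least one viewpoint cluster is small.
      # Clustering, HOG extraction and the full LASC coding are not reproduced.
      import numpy as np
      from sklearn.decomposition import PCA

      def fit_viewpoint_subspaces(clustered_features, n_components=10):
          """clustered_features: list of (n_samples, n_dims) arrays, one per viewpoint."""
          return [PCA(n_components=n_components).fit(X) for X in clustered_features]

      def is_pedestrian(feature, subspaces, max_error=5.0):
          x = feature.reshape(1, -1)
          errors = [np.linalg.norm(x - pca.inverse_transform(pca.transform(x)))
                    for pca in subspaces]
          return min(errors) < max_error

      rng = np.random.default_rng(7)
      clusters = [rng.normal(size=(200, 100)) + i for i in range(3)]   # fake viewpoints
      subspaces = fit_viewpoint_subspaces(clusters)
      print(is_pedestrian(rng.normal(size=100), subspaces))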

  17. Edge-directed inference for microaneurysms detection in digital fundus images

    NASA Astrophysics Data System (ADS)

    Huang, Ke; Yan, Michelle; Aviyente, Selin

    2007-03-01

    Microaneurysms (MAs) detection is a critical step in diabetic retinopathy screening, since MAs are the earliest visible warning of potential future problems. A variety of algorithms have been proposed for MAs detection in mass screening. The core technology for most existing methods is a directional mathematical morphological operation called the "Top-Hat" filter, which requires multiple filtering operations at each pixel. Background structure, uneven illumination and noise often cause confusion between MAs and some non-MA structures and limit the applicability of the filter. In this paper, a novel detection framework based on edge-directed inference is proposed for MAs detection. The candidate MA regions are first delineated from the edge map of a fundus image. Features measuring shape, brightness and contrast are extracted for each candidate MA region to better exclude false detections from true MAs. Algorithmic analysis and empirical evaluation reveal that the proposed edge-directed inference outperforms the "Top-Hat" based algorithm in both detection accuracy and computational speed.

  18. The Main Belt Comets and ice in the Solar System

    NASA Astrophysics Data System (ADS)

    Snodgrass, Colin; Agarwal, Jessica; Combi, Michael; Fitzsimmons, Alan; Guilbert-Lepoutre, Aurelie; Hsieh, Henry H.; Hui, Man-To; Jehin, Emmanuel; Kelley, Michael S. P.; Knight, Matthew M.; Opitom, Cyrielle; Orosei, Roberto; de Val-Borro, Miguel; Yang, Bin

    2017-11-01

    We review the evidence for buried ice in the asteroid belt; specifically the questions around the so-called Main Belt Comets (MBCs). We summarise the evidence for water throughout the Solar System, and describe the various methods for detecting it, including remote sensing from ultraviolet to radio wavelengths. We review progress in the first decade of study of MBCs, including observations, modelling of ice survival, and discussion on their origins. We then look at which methods will likely be most effective for further progress, including the key challenge of direct detection of (escaping) water in these bodies.

  19. Multi-Source Fusion for Explosive Hazard Detection in Forward Looking Sensors

    DTIC Science & Technology

    2016-12-01

    include: (1) Investigating (a) thermal, (b) synthetic aperture acoustics (SAA) and (c) voxel space Radar for buried and side threat attacks. (2...detection. (3) With respect to SAA, we developed new approaches in the time and frequency domains for analyzing signatures of concealed targets (called...Fraz). We also developed a method to extract a multi-spectral signature from SAA and deep learning was used on limited training and class imbalance

  20. GStream: Improving SNP and CNV Coverage on Genome-Wide Association Studies

    PubMed Central

    Alonso, Arnald; Marsal, Sara; Tortosa, Raül; Canela-Xandri, Oriol; Julià, Antonio

    2013-01-01

    We present GStream, a method that combines genome-wide SNP and CNV genotyping in the Illumina microarray platform with unprecedented accuracy. This new method outperforms previous well-established SNP genotyping software. More importantly, the CNV calling algorithm of GStream dramatically improves the results obtained by previous state-of-the-art methods and yields an accuracy that is close to that obtained by purely CNV-oriented technologies like Comparative Genomic Hybridization (CGH). We demonstrate the superior performance of GStream using microarray data generated from HapMap samples. Using the reference CNV calls generated by the 1000 Genomes Project (1KGP) and well-known studies on whole genome CNV characterization based either on CGH or genotyping microarray technologies, we show that GStream can increase the number of reliably detected variants up to 25% compared to previously developed methods. Furthermore, the increased genome coverage provided by GStream allows the discovery of CNVs in close linkage disequilibrium with SNPs, previously associated with disease risk in published Genome-Wide Association Studies (GWAS). These results could provide important insights into the biological mechanism underlying the detected disease risk association. With GStream, large-scale GWAS will not only benefit from the combined genotyping of SNPs and CNVs at an unprecedented accuracy, but will also take advantage of the computational efficiency of the method. PMID:23844243

  1. Reversed-phase high-performance liquid chromatography of sulfur mustard in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghuveeran, C.D.; Malhotra, R.C.; Dangi, R.S.

    1993-01-01

    A reversed-phase high-performance liquid chromatography method for the detection and quantitation of sulfur mustard (HD) in water is described, with detection at 200 nm. Solubility measurements revealed that only very low quantities of HD (4 to 5 mg/L) are soluble in water. Experience shows that water is still the medium of choice for the analysis of HD in water and aqueous effluents, in spite of the minor handicap of its half-life of ca. 4 minutes, which calls for speedy analysis.

  2. A deterministic compressive sensing model for bat biosonar.

    PubMed

    Hague, David A; Buck, John R; Bilik, Igal

    2012-12-01

    The big brown bat (Eptesicus fuscus) uses frequency modulated (FM) echolocation calls to accurately estimate range and resolve closely spaced objects in clutter and noise. They resolve glints spaced down to 2 μs in time delay, which surpasses what traditional signal processing techniques can achieve using the same echolocation call. The Matched Filter (MF) attains 10-12 μs resolution while the Inverse Filter (IF) achieves higher resolution at the cost of significantly degraded detection performance. Recent work by Fontaine and Peremans [J. Acoust. Soc. Am. 125, 3052-3059 (2009)] demonstrated that a sparse representation of bat echolocation calls coupled with a decimating sensing method facilitates distinguishing closely spaced objects over realistic SNRs. Their work raises the intriguing question of whether sensing approaches structured more like a mammalian auditory system contain the necessary information for the hyper-resolution observed in behavioral tests. This research estimates sparse echo signatures using a gammatone filterbank decimation sensing method which loosely models the processing of the bat's auditory system. The decimated filterbank outputs are processed with ℓ1 minimization. Simulations demonstrate that this model maintains higher resolution than the MF and significantly better detection performance than the IF for SNRs of 5-45 dB while undersampling the return signal by a factor of six.
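
    As a generic stand-in for the ℓ1 step (not the gammatone filterbank model used in the paper), the sketch below recovers a sparse sequence of echo glints from a convolution with a known pulse using the iterative shrinkage-thresholding algorithm (ISTA); the pulse shape, penalty and glint spacing are illustrative.

      # Generic l1 sparse-recovery sketch (ISTA) for deconvolving a sparse echo
      # sequence from a known pulse. This stands in for the paper's l1 step and does
      # not model the gammatone filterbank decimation stage.
      import numpy as np

      def ista(A, y, lam=0.1, n_iter=500):
          L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              z = x - A.T @ (A @ x - y) / L
              x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
          return x

      rng = np.random.default_rng(8)
      pulse = np.hanning(16)
      x_true = np.zeros(200)
      x_true[[60, 63, 130]] = [1.0, 0.8, 0.5]           # closely spaced glints
      A = np.column_stack([np.roll(np.pad(pulse, (0, 184)), k) for k in range(200)])
      y = A @ x_true + 0.01 * rng.normal(size=200)
      x_hat = ista(A, y)
      print(np.flatnonzero(np.abs(x_hat) > 0.1))        # recovered glints near 60, 63, 130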

  3. An operant-based detection method for inferring tinnitus in mice.

    PubMed

    Zuo, Hongyan; Lei, Debin; Sivaramakrishnan, Shobhana; Howie, Benjamin; Mulvany, Jessica; Bao, Jianxin

    2017-11-01

    Subjective tinnitus is a hearing disorder in which a person perceives sound when no external sound is present. It can be acute or chronic. Because our current understanding of its pathology is incomplete, no effective cures have yet been established. Mouse models are useful for studying the pathophysiology of tinnitus as well as for developing therapeutic treatments. We have developed a new method for determining acute and chronic tinnitus in mice, called sound-based avoidance detection (SBAD). The SBAD method utilizes one paradigm to detect tinnitus and another paradigm to monitor possible confounding factors, such as motor impairment, loss of motivation, and deficits in learning and memory. The SBAD method has succeeded in monitoring both acute and chronic tinnitus in mice. Its detection ability is further validated by functional studies demonstrating an abnormal increase in neuronal activity in the inferior colliculus of mice that had previously been identified as having tinnitus by the SBAD method. The SBAD method provides a new means by which investigators can detect tinnitus in a single mouse accurately and with more control over potential confounding factors than existing methods. This work establishes a new behavioral method for detecting tinnitus in mice. The detection outcome is consistent with functional validation. One key advantage of mouse models is they provide researchers the opportunity to utilize an extensive array of genetic tools. This new method could lead to a deeper understanding of the molecular pathways underlying tinnitus pathology. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.

    PubMed

    Kidney, Darren; Rawson, Benjamin M; Borchers, David L; Stevenson, Ben C; Marques, Tiago A; Thomas, Len

    2016-01-01

    Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data. We anticipate that the low-tech field requirements will make this method an attractive option in many situations where populations can be surveyed acoustically by humans.

  5. Improvements to Passive Acoustic Tracking Methods for Marine Mammal Monitoring

    DTIC Science & Technology

    2016-05-02

    ...separate and associate calls from individual animals. Keywords: marine mammal; passive acoustic monitoring; localization; tracking; multiple source; sparse array. ...hydrophone position and hydrophone timing offset in addition to animal position; almost all marine mammal tracking methods treat animal position as the only unknown. ...(Workshop on Detection, Classification and Localization (DCL) of Marine Mammals). The animals were expected to be relatively close to the surface.

  6. Missing RRI interpolation for HRV analysis using locally-weighted partial least squares regression.

    PubMed

    Kamata, Keisuke; Fujiwara, Koichi; Yamakawa, Toshiki; Kano, Manabu

    2016-08-01

    The R-R interval (RRI) fluctuation in electrocardiogram (ECG) is called heart rate variability (HRV). Since HRV reflects autonomic nervous function, HRV-based health monitoring services, such as stress estimation, drowsy driving detection, and epileptic seizure prediction, have been proposed. In these HRV-based health monitoring services, precise R wave detection from ECG is required; however, R waves cannot always be detected due to ECG artifacts. Missing RRI data should be interpolated appropriately for HRV analysis. The present work proposes a missing RRI interpolation method utilizing just-in-time (JIT) modeling. The proposed method adopts locally weighted partial least squares (LW-PLS) for RRI interpolation, which is a well-known JIT modeling method used in the field of process control. The usefulness of the proposed method was demonstrated through a case study of real RRI data collected from healthy persons. The proposed JIT-based interpolation method could improve the interpolation accuracy in comparison with a static interpolation method.
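    The following sketch illustrates the just-in-time idea on synthetic RRI data: for a query time in a gap, nearby samples are weighted by a Gaussian similarity kernel and a local linear model is fitted. Kernel-weighted linear regression is used here as a simplified stand-in for full LW-PLS; the data, bandwidth, and gap location are invented.

```python
# Sketch of just-in-time (query-local) interpolation of a missing RRI value.
# Kernel-weighted linear regression stands in for full LW-PLS; the local-weighting
# idea is the same. All data below are synthetic.
import numpy as np

def local_interpolate(times, rri, t_query, bandwidth=5.0):
    """Estimate RRI at t_query from neighbouring beats, weighting nearby samples more."""
    w = np.exp(-((times - t_query) ** 2) / (2 * bandwidth ** 2))   # Gaussian similarity
    X = np.column_stack([np.ones_like(times), times])              # local linear model
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ rri)             # weighted least squares
    return beta[0] + beta[1] * t_query

# Synthetic RRI series (seconds) with a gap around t = 30 s.
times = np.concatenate([np.arange(0, 28, 0.8), np.arange(33, 60, 0.8)])
rri = 0.85 + 0.05 * np.sin(2 * np.pi * times / 20) + 0.01 * np.random.randn(times.size)

print("interpolated RRI at t=30 s:", round(local_interpolate(times, rri, 30.0), 3))
```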

  7. Detection of influenza antigenic variants directly from clinical samples using polyclonal antibody based proximity ligation assays

    PubMed Central

    Martin, Brigitte E.; Jia, Kun; Sun, Hailiang; Ye, Jianqiang; Hall, Crystal; Ware, Daphne; Wan, Xiu-Feng

    2016-01-01

    Identification of antigenic variants is the key to a successful influenza vaccination program. The empirical serological methods used to determine influenza antigenic properties require viral propagation. Here a novel quantitative PCR-based antigenic characterization method using polyclonal antibodies and proximity ligation assays, or so-called polyPLA, was developed and validated. This method can detect a viral titer of less than 1000 TCID50/mL. Not only can this method differentiate between different HA subtypes of influenza viruses, but it can also effectively identify antigenic drift events within the same HA subtype. Application to H3N2 seasonal influenza data showed that the results from this novel method are consistent with those from conventional serological assays. This method is not limited to the detection of antigenic variants in influenza; it can also be applied to other pathogens. It has the potential to be applied through a large-scale platform in disease surveillance requiring minimal biosafety and directly using clinical samples. PMID:25546251

  8. A system for measuring thermal activation energy levels in silicon by thermally stimulated capacitance

    NASA Technical Reports Server (NTRS)

    Cockrum, R. H.

    1982-01-01

    One method being used to determine energy level(s) and electrical activity of impurities in silicon is described. The method is called capacitance transient spectroscopy (CTS). It can be classified into three basic categories: the thermally stimulated capacitance method, the voltage-stimulated capacitance method, and the light-stimulated capacitance method; the first two categories are discussed. From the total change in capacitance and the time constant of the capacitance response, emission rates, energy levels, and trap concentrations can be determined. A major advantage of using CTS is its ability to detect the presence of electrically active impurities that are invisible to other techniques, such as Zeeman effect atomic absorption, and the ability to detect more than one electrically active impurity in a sample. Examples of detection of majority and minority carrier traps from gold donor and acceptor centers in silicon using the capacitance transient spectrometer are given to illustrate the method and its sensitivity.
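    For orientation, the toy calculation below applies the standard dilute-trap capacitance-transient relations (emission rate e_n = 1/tau and trap concentration N_T ≈ 2 N_D ΔC/C0) to invented numbers; it is illustrative only and not taken from the article.

```python
# Illustrative numbers only: emission rate and trap concentration from a
# thermally stimulated capacitance transient, using the standard dilute-trap
# approximations e_n = 1/tau and N_T ~= 2 * N_D * (dC / C0).
tau = 2.5e-3        # measured capacitance-transient time constant (s)
dC = 0.15e-12       # total capacitance change (F)
C0 = 30e-12         # steady-state junction capacitance (F)
N_D = 1e15          # shallow donor concentration (cm^-3)

e_n = 1.0 / tau                    # thermal emission rate (s^-1)
N_T = 2.0 * N_D * (dC / C0)        # trap concentration (cm^-3)
print(f"emission rate ~ {e_n:.0f} 1/s, trap concentration ~ {N_T:.1e} cm^-3")
```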

  9. Predicting chaos in memristive oscillator via harmonic balance method.

    PubMed

    Wang, Xin; Li, Chuandong; Huang, Tingwen; Duan, Shukai

    2012-12-01

    This paper studies the possible chaotic behaviors in a memristive oscillator with cubic nonlinearities via the harmonic balance method, which is also called the describing function method. This method was originally proposed to detect chaos in the classical Chua's circuit. We first transform the considered memristive oscillator system into a Lur'e model and present the prediction of the existence of chaotic behaviors. To ensure that the prediction result is correct, the distortion index is also measured. Numerical simulations are presented to show the effectiveness of the theoretical results.

  10. Penalty dynamic programming algorithm for dim targets detection in sensor systems.

    PubMed

    Huang, Dayu; Xue, Anke; Guo, Yunfei

    2012-01-01

    In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD), called penalty DP-TBD (PDP-TBD), is proposed. The performance of the tracking techniques is used as feedback to the detection part. The feedback is constructed as a penalty term in the merit function, and the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD, and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from one target or from clutter is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.

  11. Acoustic detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Niezrecki, Christopher; Phillips, Richard; Meyer, Michael; Beusse, Diedrich O.

    2003-09-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of a growing number of collisions with boats. A system that can warn boaters that manatees are present in the immediate vicinity could potentially reduce these collisions. In order to identify the presence of manatees, acoustic methods are employed. Within this paper, three different detection algorithms are used to detect the calls of the West Indian manatee. The detection systems are tested in the laboratory using simulated manatee vocalizations from an audio compact disk. The detection method that provides the best overall performance is able to correctly identify ~96% of the manatee vocalizations. However, the system also results in a false alarm rate of ~16%. The results of this work may ultimately lead to the development of a manatee warning system that can warn boaters of the presence of manatees.

  12. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428

  13. Rootkit Detection Using a Cross-View Clean Boot Method

    DTIC Science & Technology

    2013-03-01

    ...FindNextFile [2] (Kernel32.dll); SSDT hooks (e.g., CALL NtQueryDirectoryFile); code patching; layered drivers (NTFS driver, volume manager, disk driver) [2]. IAT hooks take advantage of function calls in applications [13].

  14. Comparative performance of high-density oligonucleotide sequencing and dideoxynucleotide sequencing of HIV type 1 pol from clinical samples.

    PubMed

    Günthard, H F; Wong, J K; Ignacio, C C; Havlir, D V; Richman, D D

    1998-07-01

    The performance of the high-density oligonucleotide array methodology (GeneChip) in detecting drug resistance mutations in HIV-1 pol was compared with that of automated dideoxynucleotide sequencing (ABI) of clinical samples, viral stocks, and plasmid-derived NL4-3 clones. Sequences from 29 clinical samples (plasma RNA, n = 17; lymph node RNA, n = 5; lymph node DNA, n = 7) from 12 patients, from 6 viral stock RNA samples, and from 13 NL4-3 clones were generated by both methods. Editing was done independently by a different investigator for each method before comparing the sequences. In addition, NL4-3 wild type (WT) and mutants were mixed in varying concentrations and sequenced by both methods. Overall, a concordance of 99.1% was found for a total of 30,865 bases compared. The comparison of clinical samples (plasma RNA and lymph node RNA and DNA) showed a slightly lower match of base calls, 98.8% for 19,831 nucleotides compared (protease region, 99.5%, n = 8272; RT region, 98.3%, n = 11,316), than for viral stocks and NL4-3 clones (protease region, 99.8%; RT region, 99.5%). Artificial mixing experiments showed a bias toward calling wild-type bases by GeneChip. Discordant base calls are most likely due to differential detection of mixtures. The concordance between GeneChip and ABI was high and appeared dependent on the nature of the templates (directly amplified versus cloned) and the complexity of mixes.

  15. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets (small induced subgraphs that describe local topology). By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
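    A minimal sketch of the graphlet-counting idea is given below: each time slice of (synthetic) flow records is turned into a directed graph, the sixteen size-3 triad counts are computed with networkx, and the slice whose graphlet distribution deviates most from the typical one is flagged. This is an illustration of the approach, not the authors' implementation.

```python
# Sketch of the GraphPrints idea: per time slice, count small induced subgraphs
# (size-3 directed triads via networkx's triadic census) and flag slices whose
# graphlet distribution departs from the typical one. Flow data below are synthetic.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

def graphlet_vector(edges):
    g = nx.DiGraph()
    g.add_edges_from(edges)
    census = nx.triadic_census(g)                      # counts of the 16 triad types
    return np.array([census[k] for k in sorted(census)], dtype=float)

def random_slice():
    pairs = rng.integers(0, 20, size=(80, 2))
    return [tuple(p) for p in pairs if p[0] != p[1]]   # drop self-loops

slices = [random_slice() for _ in range(30)]
slices.append([(0, dst) for dst in range(1, 20)])      # implanted scan-like anomaly

vectors = np.array([graphlet_vector(s) for s in slices])
profiles = vectors / vectors.sum(axis=1, keepdims=True)
scores = np.abs(profiles - np.median(profiles, axis=0)).sum(axis=1)
print("most anomalous slice:", int(np.argmax(scores)))  # expected: the implanted one (index 30)
```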

  16. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.

  17. An Energy-Efficient Multi-Tier Architecture for Fall Detection Using Smartphones.

    PubMed

    Guvensan, M Amac; Kansiz, A Oguz; Camgoz, N Cihan; Turkmen, H Irem; Yavuz, A Gokhan; Karsligil, M Elif

    2017-06-23

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy-efficiency without compromising on fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented in a mobile application, called uSurvive, for Android smartphones. It runs as a background service, monitors the activities of a person in daily life, and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has an accuracy of 93%, which is superior to thresholding methods and better than machine-learning-only solutions.
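    The two-stage sketch below illustrates the energy-saving idea behind such tiered designs: a cheap acceleration-magnitude threshold gates a machine-learning classifier so that the classifier only runs on suspicious windows. It is not the uSurvive implementation; the threshold, features, and training windows are invented.

```python
# Minimal two-stage sketch of threshold-gated fall detection (not the uSurvive code).
# Stage 1: a cheap acceleration-magnitude threshold runs continuously.
# Stage 2: a classifier is consulted only when the threshold fires, saving energy.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

THRESH_G = 2.5                                   # illustrative impact threshold (g)

def features(window):
    """Simple hand-crafted features over a short accelerometer-magnitude window (g)."""
    return [window.max(), window.min(), window.std(), np.abs(np.diff(window)).mean()]

# Hypothetical training windows: label 1 = fall, 0 = daily activity.
rng = np.random.default_rng(0)
falls = [np.concatenate([np.ones(40), 3 + rng.random(10), 0.2 * rng.random(50)]) for _ in range(40)]
adls  = [1 + 0.3 * rng.standard_normal(100) for _ in range(40)]
X = np.array([features(w) for w in falls + adls])
y = np.array([1] * 40 + [0] * 40)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def on_new_window(window):
    if window.max() < THRESH_G:                        # tier 1: cheap gate, most windows stop here
        return False
    return bool(clf.predict([features(window)])[0])    # tier 2: ML only on suspicious windows

print(on_new_window(falls[0]), on_new_window(adls[0]))
```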

  18. Spatio-temporal segregation of calling behavior at a multispecies fish spawning site in Little Cayman

    NASA Astrophysics Data System (ADS)

    Cameron, K. C.; Sirovic, A.; Jaffe, J. S.; Semmens, B.; Pattengill-Semmens, C.; Gibb, J.

    2016-02-01

    Fish spawning aggregation (FSA) sites are extremely vulnerable to over-exploitation. Accurate understanding of the spatial and temporal use of such sites is necessary for effective species management. The size of FSAs can be on the order of kilometers and peak spawning often occurs at night, posing challenges to visual observation. Passive acoustics are an alternative method for dealing with these challenges. An array of passive acoustic recorders and GoPro cameras were deployed during Nassau grouper (Epinephelus striatus) spawning from February 7th to 12th, 2015 at a multispecies spawning aggregation site in Little Cayman, Cayman Islands. In addition to Nassau grouper, at least 10 other species are known to spawn at this location including tiger grouper (Mycteroperca tigris), red hind (Epinephelus guttatus), black grouper (Mycteroperca bonaci), and yellowfin grouper (Mycteroperca venenosa). During 5 days of continuous recordings, over 21,000 fish calls were detected. These calls were classified into 15 common types. Species identification and behavioral context of unknown common call types were determined by coupling video recordings collected during this time with call localizations. There are distinct temporal patterns in call production of different species. For example, red hind and yellowfin grouper call predominately at night with yellowfin call rates increasing after midnight, and black grouper call primarily during dusk and dawn. In addition, localization methods were used to reveal how the FSA area was divided among species. These findings facilitate a better understanding of the behavior of these important reef fish species allowing policymakers to more effectively manage and protect them.

  19. Experimental investigation of false positive errors in auditory species occurrence surveys

    USGS Publications Warehouse

    Miller, David A.W.; Weir, Linda A.; McClintock, Brett T.; Grant, Evan H. Campbell; Bailey, Larissa L.; Simons, Theodore R.

    2012-01-01

    False positive errors are a significant component of many ecological data sets, which, in combination with false negative errors, can lead to severe biases in conclusions about ecological systems. We present results of a field experiment where observers recorded observations for known combinations of electronically broadcast calling anurans under conditions mimicking field surveys to determine species occurrence. Our objectives were to characterize false positive error probabilities for auditory methods based on a large number of observers, to determine if targeted instruction could be used to reduce false positive error rates, and to establish useful predictors of among-observer and among-species differences in error rates. We recruited 31 observers, ranging in ability from novice to expert, who recorded detections for 12 species during 180 calling trials (66,960 total observations). All observers made multiple false positive errors, and on average 8.1% of recorded detections in the experiment were false positive errors. Additional instruction had only minor effects on error rates. After instruction, false positive error probabilities decreased by 16% for treatment individuals compared to controls, with broad confidence interval overlap of 0 (95% CI: -46 to 30%). This coincided with an increase in false negative errors due to the treatment (26%; -3 to 61%). Differences among observers in false positive and in false negative error rates were best predicted by scores from an online test and a self-assessment of observer ability completed prior to the field experiment. In contrast, years of experience conducting call surveys was a weak predictor of error rates. False positive errors were also more common for species that were played more frequently, but were not related to the dominant spectral frequency of the call. Our results corroborate other work demonstrating that false positives are a significant component of species occurrence data collected by auditory methods. Instructing observers to only report detections they are completely certain are correct is not sufficient to eliminate errors. As a result, analytical methods that account for false positive errors will be needed, and independent testing of observer ability is a useful predictor of among-observer variation in observation error rates.

  20. Survival analysis, or what to do with upper limits in astronomical surveys

    NASA Technical Reports Server (NTRS)

    Isobe, Takashi; Feigelson, Eric D.

    1986-01-01

    A field of applied statistics called survival analysis has been developed over several decades to deal with censored data, which occur in astronomical surveys when objects are too faint to be detected. How these methods can assist in the statistical interpretation of astronomical data is reviewed.
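    As a small self-contained example of one survival-analysis tool, the sketch below implements the Kaplan-Meier product-limit estimator for censored measurements; the values and censoring flags are invented. Astronomical upper limits are left-censored and are in practice handled with the same estimator after reversing the variable.

```python
# A small, self-contained product-limit (Kaplan-Meier) estimator for right-censored
# data. All values and censoring flags below are invented.
import numpy as np

def kaplan_meier(values, observed):
    """Return (event value, survival probability just after that event) pairs."""
    order = np.argsort(values)
    values = np.asarray(values, dtype=float)[order]
    observed = np.asarray(observed, dtype=bool)[order]
    n_at_risk = len(values)
    s, out = 1.0, []
    for v, obs in zip(values, observed):
        if obs:                          # an actual detection (event)
            s *= 1.0 - 1.0 / n_at_risk
            out.append((v, s))
        n_at_risk -= 1                   # censored points simply leave the risk set
    return out

fluxes   = [2.1, 3.4, 1.2, 5.0, 4.2, 2.8, 3.9]
detected = [True, True, False, True, False, True, True]   # False = censored point
for value, surv in kaplan_meier(fluxes, detected):
    print(f"value {value}: S = {surv:.2f}")
```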

  1. PCA method for automated detection of mispronounced words

    NASA Astrophysics Data System (ADS)

    Ge, Zhenhao; Sharma, Sudhendu R.; Smith, Mark J. T.

    2011-06-01

    This paper presents a method for detecting mispronunciations with the aim of improving Computer Assisted Language Learning (CALL) tools used by foreign language learners. The algorithm is based on Principal Component Analysis (PCA). It is hierarchical, with each successive step refining the estimate to classify the test word as being either mispronounced or correct. Preprocessing before detection, such as normalization and time-scale modification, is implemented to guarantee uniformity of the feature vectors input to the detection system. The performance obtained using various features, including spectrograms and Mel-Frequency Cepstral Coefficients (MFCCs), is compared and evaluated. Best results were obtained using MFCCs, achieving up to 99% accuracy in word verification and 93% in native/non-native classification. Compared with Hidden Markov Models (HMMs), which are used pervasively in recognition applications, this particular approach is computationally efficient and effective when training data is limited.
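    The sketch below illustrates the PCA-based verification idea under simplified assumptions: a PCA model is fitted to feature vectors of correctly pronounced tokens, and a test token is flagged when its reconstruction error exceeds a percentile threshold. The "MFCC" vectors are synthetic stand-ins, and this is not the authors' hierarchical pipeline.

```python
# Sketch of PCA-based word verification: fit PCA on features of correctly pronounced
# tokens, then flag test tokens whose reconstruction error is large. The "MFCC"
# vectors below are synthetic stand-ins; real features would come from audio.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
template = rng.standard_normal(130)                          # e.g. 10 frames x 13 MFCCs, flattened
correct = template + 0.2 * rng.standard_normal((80, 130))    # training tokens (correct speakers)

pca = PCA(n_components=10).fit(correct)

def reconstruction_error(x):
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))[0]
    return float(np.linalg.norm(x - x_hat))

threshold = np.percentile([reconstruction_error(x) for x in correct], 95)

good_token = template + 0.2 * rng.standard_normal(130)
bad_token  = rng.standard_normal(130)                        # a "mispronounced" token
print("good:", reconstruction_error(good_token) <= threshold,
      "bad:", reconstruction_error(bad_token) <= threshold)
```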

  2. Multi-species call-broadcast improved detection of endangered Yuma clapper rail compared to single-species call-broadcast

    USGS Publications Warehouse

    Nadeau, Christopher P.; Conway, Courtney J.; Piest, Linden; Burger, William P.

    2013-01-01

    Broadcasting calls of marsh birds during point-count surveys increases their detection probability and decreases variation in the number of birds detected across replicate surveys. However, multi-species monitoring using call-broadcast may reduce these benefits if birds are reluctant to call once they hear broadcasted calls of other species. We compared a protocol that uses call-broadcast for only one species (Yuma clapper rail [Rallus longirostris yumanensis]) to a protocol that uses call-broadcast for multiple species. We detected more of each of the following species using the multi-species protocol: 25 % more pied-billed grebes, 160 % more American bitterns, 52 % more least bitterns, 388 % more California black rails, 12 % more Yuma clapper rails, 156 % more Virginia rails, 214 % more soras, and 19 % more common gallinules. Moreover, the coefficient of variation was smaller when using the multi-species protocol: 10 % smaller for pied-billed grebes, 38 % smaller for American bitterns, 19 % smaller for least bitterns, 55 % smaller for California black rails, 5 % smaller for Yuma clapper rails, 38 % smaller for Virginia rails, 44 % smaller for soras, and 8 % smaller for common gallinules. Our results suggest that multi-species monitoring approaches may be more effective and more efficient than single-species approaches even when using call-broadcast.

  3. New methods for the detection of viruses: call for review of drinking water quality guidelines.

    PubMed

    Grabow, W O; Taylor, M B; de Villiers, J C

    2001-01-01

    Drinking water supplies which meet international recommendations for source, treatment and disinfection were analysed. Viruses recovered from 100 L-1,000 L volumes by in-line glass wool filters were inoculated in parallel into four cell culture systems. Cell culture inoculation was used to isolate cytopathogenic viruses, amplify the nucleic acid of non-cytopathogenic viruses and confirm viability of viruses. Over a period of two years, viruses were detected in 23% of 413 drinking water samples and 73% of 224 raw water samples. Cytopathogenic viruses were detected in 6% of raw water samples but not in any treated drinking water supplies. Enteroviruses were detected in 17% of drinking water samples, adenoviruses in 4% and hepatitis A virus in 3%. In addition to these viruses, astro- and rotaviruses were detected in raw water. All drinking water supplies had heterotrophic plate counts of < 100/mL, total and faecal coliform counts of 0/100 mL and negative results in qualitative presence-absence tests for somatic and F-RNA coliphages (500 mL samples). These results call for a revision of water quality guidelines that rely on indicator organisms and vague references to the absence of viruses.

  4. Detection of explosives, nerve agents, and illicit substances by zero-energy electron attachment

    NASA Technical Reports Server (NTRS)

    Chutjian, A.; Darrach, M. R.

    2000-01-01

    The Reversal Electron Attachment Detection (READ) method, developed at JPL/Caltech, has been used to detect a variety of substances which have electron-attachment resonances at low and intermediate electron energies. In the case of zero-energy resonances, the cross section (hence attachment probability and instrument sensitivity) is mediated by the so-called s-wave phenomenon, in which the cross sections vary as the inverse of the electron velocity. Hence this is, in the limit of zero electron energy or velocity, one of the rare cases in atomic and molecular physics where one carries out detection via infinite cross sections.

  5. Comparative Study of Speckle Filtering Methods in PolSAR Radar Images

    NASA Astrophysics Data System (ADS)

    Boutarfa, S.; Bouchemakh, L.; Smara, Y.

    2015-04-01

    Images acquired by polarimetric SAR (PolSAR) radar systems are characterized by the presence of a noise called speckle. This noise has a multiplicative nature, corrupts both the amplitude and phase images, complicates data interpretation, degrades segmentation performance, and reduces the detectability of targets. Hence the need to preprocess the images with adapted filtering methods before analysis. In this paper, we present a comparative study of implemented methods for reducing speckle in PolSAR images. The developed filters are: the refined Lee filter based on minimum mean square error (MMSE) estimation; the improved Sigma filter with detection of strong scatterers, based on the calculation of the coherency matrix to detect the different scatterers in order to preserve the polarization signature and maintain structures that are necessary for image interpretation; filtering by the stationary wavelet transform (SWT) using multi-scale edge detection and the technique for improving the wavelet coefficients called SSC (sum of squared coefficients); and the Turbo filter, which is a combination of two complementary filters, the refined Lee filter and the SWT wavelet transform, each of which can reinforce the results of the other. The originality of our work lies in the application of these methods to several types of images (amplitude, intensity and complex, from satellite or airborne radar) and in the optimization of wavelet filtering by adding a parameter to the threshold calculation. This parameter controls the filtering effect and provides a good compromise between smoothing homogeneous areas and preserving linear structures. The methods are applied to fully polarimetric RADARSAT-2 images (HH, HV, VH, VV) acquired over Algiers, Algeria, in C-band and to three polarimetric E-SAR images (HH, HV, VV) acquired over the Oberpfaffenhofen area near Munich, Germany, in P-band. To evaluate the performance of each filter, we used the following criteria: smoothing of homogeneous areas, preservation of edges, and preservation of polarimetric information. Experimental results are included to illustrate the different implemented methods.

  6. Detecting and quantifying stellar magnetic fields. Sparse Stokes profile approximation using orthogonal matching pursuit

    NASA Astrophysics Data System (ADS)

    Carroll, T. A.; Strassmeier, K. G.

    2014-03-01

    Context. In recent years, we have seen a rapidly growing number of stellar magnetic field detections for various types of stars. Many of these magnetic fields are estimated from spectropolarimetric observations (Stokes V) by using the so-called center-of-gravity (COG) method. Unfortunately, the accuracy of this method rapidly deteriorates with increasing noise and thus calls for a more robust procedure that combines signal detection and field estimation. Aims: We introduce an estimation method that provides not only the effective or mean longitudinal magnetic field from an observed Stokes V profile but also uses the net absolute polarization of the profile to obtain an estimate of the apparent (i.e., velocity-resolved) absolute longitudinal magnetic field. Methods: By combining the COG method with an orthogonal-matching-pursuit (OMP) approach, we were able to decompose observed Stokes profiles with an overcomplete dictionary of wavelet-basis functions to reliably reconstruct the observed Stokes profiles in the presence of noise. The elementary wave functions of the sparse reconstruction process were utilized to estimate the effective longitudinal magnetic field and the apparent absolute longitudinal magnetic field. A multiresolution analysis complements the OMP algorithm to provide a robust detection and estimation method. Results: An extensive Monte Carlo simulation confirms the reliability and accuracy of the magnetic OMP approach, for which a mean error of under 2% is found. Its full potential is obtained for heavily noise-corrupted Stokes profiles with signal-to-noise variance ratios down to unity. In this case, a conventional COG method yields a mean error for the effective longitudinal magnetic field of up to 50%, whereas the OMP method gives a maximum error of 18%. It is, moreover, shown that even in the case of very small residual noise at a level between 10^-3 and 10^-5, a regime reached by current multiline reconstruction techniques, the conventional COG method incorrectly interprets a large portion of the residual noise as a magnetic field, with values of up to 100 G. The magnetic OMP method, on the other hand, remains largely unaffected by the noise; regardless of the noise level, the maximum error is no greater than 0.7 G.
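    To make the OMP step concrete, the hedged sketch below reconstructs a noisy, synthetic Stokes-V-like profile as a sparse combination of atoms from a simple overcomplete cosine/sine dictionary (standing in for the paper's wavelet dictionary) using scikit-learn's OrthogonalMatchingPursuit. The profile shape, dictionary, and noise level are illustrative assumptions.

```python
# Sketch of the OMP step: reconstruct a noisy Stokes-V-like profile as a sparse
# combination of dictionary atoms. A plain overcomplete cosine/sine dictionary stands
# in for the paper's wavelet dictionary; the profile and noise level are synthetic.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

n = 200
v = np.linspace(-1, 1, n)
profile = v * np.exp(-v**2 / 0.05)                 # antisymmetric, Stokes-V-like shape
noisy = profile + 0.05 * np.random.default_rng(3).standard_normal(n)

# Overcomplete dictionary: cosines and sines of many frequencies, unit-normalised.
freqs = np.arange(1, 80)
D = np.hstack([np.cos(np.outer(v, freqs) * np.pi), np.sin(np.outer(v, freqs) * np.pi)])
D /= np.linalg.norm(D, axis=0)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=8).fit(D, noisy)
reconstructed = D @ omp.coef_ + omp.intercept_
print("residual rms vs. noise-free profile:",
      float(np.sqrt(np.mean((reconstructed - profile) ** 2))))
```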

  7. Pairing call-response surveys and distance sampling for a mammalian carnivore

    USGS Publications Warehouse

    Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.

    2015-01-01

    Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (be it a single or group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.

  8. Salient object detection based on discriminative boundary and multiple cues integration

    NASA Astrophysics Data System (ADS)

    Jiang, Qingzhu; Wu, Zemin; Tian, Chang; Liu, Tao; Zeng, Mingyong; Hu, Lei

    2016-01-01

    In recent years, many saliency models have achieved good performance by taking the image boundary as the background prior. However, if all boundaries of an image are equally and artificially selected as background, misjudgment may happen when the object touches the boundary. We propose an algorithm called weighted contrast optimization based on discriminative boundary (wCODB). First, a background estimation model is reliably constructed through discriminating each boundary via Hausdorff distance. Second, the background-only weighted contrast is improved by fore-background weighted contrast, which is optimized through weight-adjustable optimization framework. Then to objectively estimate the quality of a saliency map, a simple but effective metric called spatial distribution of saliency map and mean saliency in covered window ratio (MSR) is designed. Finally, in order to further promote the detection result using MSR as the weight, we propose a saliency fusion framework to integrate three other cues-uniqueness, distribution, and coherence from three representative methods into our wCODB model. Extensive experiments on six public datasets demonstrate that our wCODB performs favorably against most of the methods based on boundary, and the integrated result outperforms all state-of-the-art methods.

  9. CNNdel: Calling Structural Variations on Low Coverage Data Based on Convolutional Neural Networks

    PubMed Central

    2017-01-01

    Many structural variation (SV) detection methods have been proposed due to the popularization of next-generation sequencing (NGS). These SV calling methods use different SV-property-dependent features; however, they all suffer from poor accuracy when running on low coverage sequences. The union of results from these tools achieves fairly high sensitivity but still produces low accuracy on low coverage sequence data. That is, these methods produce many false positives. In this paper, we present CNNdel, an approach for calling deletions from paired-end reads. CNNdel gathers SV candidates reported by multiple tools and then extracts features from aligned BAM files at the positions of the candidates. With labeled, feature-represented candidates as a training set, CNNdel trains convolutional neural networks (CNNs) to distinguish true unlabeled candidates from false ones. Results show that CNNdel works well with NGS reads from 26 low coverage genomes of the 1000 Genomes Project. The paper demonstrates that convolutional neural networks can automatically assign the priority of SV features and reduce false positives efficaciously. PMID:28630866
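    A minimal sketch of the CNN filtering idea is given below in PyTorch: a small 1D convolutional network scores each deletion candidate from a fixed-size feature matrix. The input shape and the choice of per-position features are assumptions for illustration, not CNNdel's actual design.

```python
# Minimal PyTorch sketch of the filtering idea: a small CNN scores deletion
# candidates from a fixed-size feature matrix (e.g. per-position read depth,
# split-read and discordant-pair counts). Shapes and features are assumptions,
# not CNNdel's actual inputs.
import torch
import torch.nn as nn

class CandidateCNN(nn.Module):
    def __init__(self, n_features=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_features, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):               # x: (batch, n_features, window)
        return torch.sigmoid(self.net(x)).squeeze(-1)

model = CandidateCNN()
candidates = torch.randn(8, 4, 200)     # 8 hypothetical candidate windows
print(model(candidates).shape)          # one probability per candidate
```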

  10. Change in chromatogram patterns after volatilization of some Aroclors, and the associated quantitation problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, J.M.; Hee, S.S.Q.

    1987-07-01

    PCBs with the highest vapor pressures (fewest chlorines) in Aroclors 1016, 1242, 1254, and 1268 were enriched in the vapor phase relative to the original Aroclor during volatilization from a glass surface for up to 8 hr. PCBs with the lowest vapor pressures (most highly chlorinated) were enriched in the corresponding residue. Thus, visual matching of gas chromatograms with those of Aroclor standards may not be sufficient to identify a specific Aroclor since the past history of a sample is often unknown. The enrichment also was detected using isomeric classes, but not using total chlorine content. The perchlorination method and the Webb-McCall method using all chromatographic peaks agreed quantitatively; this was not always so for the NIOSH multiple peaks and the Webb-McCall methods.

  11. Development of a Salmonella screening tool for consumer complaint-based foodborne illness surveillance systems.

    PubMed

    Li, John; Maclehose, Rich; Smith, Kirk; Kaehler, Dawn; Hedberg, Craig

    2011-01-01

    Foodborne illness surveillance based on consumer complaints detects outbreaks by finding common exposures among callers, but this process is often difficult. Laboratory testing of ill callers could also help identify potential outbreaks. However, collection of stool samples from all callers is not feasible. Methods to help screen calls for etiology are needed to increase the efficiency of complaint surveillance systems and increase the likelihood of detecting foodborne outbreaks caused by Salmonella. Data from the Minnesota Department of Health foodborne illness surveillance database (2000 to 2008) were analyzed. Complaints with identified etiologies were examined to create a predictive model for Salmonella. Bootstrap methods were used to internally validate the model. Seventy-one percent of complaints in the foodborne illness database with known etiologies were due to norovirus. The predictive model had a good discriminatory ability to identify Salmonella calls. Three cutoffs for the predictive model were tested: one that maximized sensitivity, one that maximized specificity, and one that maximized predictive ability, providing sensitivities and specificities of 32 and 96%, 100 and 54%, and 89 and 72%, respectively. Development of a predictive model for Salmonella could help screen calls for etiology. The cutoff that provided the best predictive ability for Salmonella corresponded to a caller reporting diarrhea and fever with no vomiting, and five or fewer people ill. Screening calls for etiology would help identify complaints for further follow-up and result in identifying Salmonella cases that would otherwise go unconfirmed; in turn, this could lead to the identification of more outbreaks.
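    The best-performing cutoff reported above (diarrhea and fever, no vomiting, five or fewer people ill) can be expressed as a simple screening rule; the field names in this sketch are hypothetical.

```python
# The reported best-predictive cutoff (diarrhea and fever, no vomiting, five or fewer
# people ill) expressed as a simple screening rule. Field names are hypothetical.
def flag_for_salmonella_followup(call: dict) -> bool:
    return (call.get("diarrhea", False)
            and call.get("fever", False)
            and not call.get("vomiting", False)
            and call.get("num_ill", 1) <= 5)

example_call = {"diarrhea": True, "fever": True, "vomiting": False, "num_ill": 3}
print(flag_for_salmonella_followup(example_call))   # True -> candidate for stool testing
```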

  12. Systematic Evaluation of In Vitro and In Vivo Adventitious Virus Assays for the Detection of Viral Contamination of Cell Banks and Biological Products1

    PubMed Central

    Gombold, James; Karakasidis, Stephen; Niksa, Paula; Podczasy, John; Neumann, Kitti; Richardson, James; Sane, Nandini; Johnson-Leva, Renita; Randolph, Valerie; Sadoff, Jerald; Minor, Phillip; Schmidt, Alexander; Duncan, Paul; Sheets, Rebecca L.

    2015-01-01

    Viral vaccines and the cell substrates used to manufacture them are subjected to tests for adventitious agents, including viruses, which might contaminate them. Some of the compendial methods (in vivo and in vitro in cell culture) were established in the mid-20th century. These methods have not been subjected to current assay validation, as new methods would need to be. This study was undertaken to provide insight into the breadth (selectivity) and sensitivity (limit of detection) of the routine methods, two such validation parameters. Sixteen viral stocks were prepared and characterized. These stocks were tested in serial dilutions by the routine methods to establish which viruses were detected by which methods and above what limit of detection. Sixteen out of sixteen viruses were detected in vitro, though one (bovine viral diarrhea virus) required special conditions to detect and another (rubella virus) was detected with low sensitivity. Many were detected at levels below 1 TCID50 or PFU (titers were established on the production cell line in most cases). In contrast, in vivo, only 6/11 viruses were detected, and 4 of these were detected only at amounts one or more logs above 1 TCID50 or PFU. Only influenza virus and vesicular stomatitis virus were detected at lower amounts in vivo than in vitro. Given the call to reduce, refine, or replace (3 R's) the use of animals in product safety testing and the emergence of new technologies for the detection of viruses, a re-examination of the current adventitious virus testing strategies seems warranted. Suggested pathways forward are offered. PMID:24681273

  13. Hungarian Marfan family with large FBN1 deletion calls attention to copy number variation detection in the current NGS era

    PubMed Central

    Ágg, Bence; Meienberg, Janine; Kopps, Anna M.; Fattorini, Nathalie; Stengl, Roland; Daradics, Noémi; Pólos, Miklós; Bors, András; Radovits, Tamás; Merkely, Béla; De Backer, Julie; Szabolcs, Zoltán; Mátyás, Gábor

    2018-01-01

    Copy number variations (CNVs) comprise about 10% of reported disease-causing mutations in Mendelian disorders. Nevertheless, pathogenic CNVs may have been under-detected due to the lack or insufficient use of appropriate detection methods. In this report, on the example of the diagnostic odyssey of a patient with Marfan syndrome (MFS) harboring a hitherto unreported 32-kb FBN1 deletion, we highlight the need for and the feasibility of testing for CNVs (>1 kb) in Mendelian disorders in the current next-generation sequencing (NGS) era. PMID:29850152

  14. UV gated Raman spectroscopy for standoff detection of explosives

    NASA Astrophysics Data System (ADS)

    Gaft, M.; Nagli, L.

    2008-07-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called improvised explosive devices (IEDs). It is recognized that the only method potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, namely its micro-particle variety. It is based on the commonly held belief that surface contamination is very difficult to avoid and can therefore be exploited for standoff detection. We applied gated Raman spectroscopy to the detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for the remote field detection and identification of minimal amounts of explosives on relevant surfaces at distances of up to 30 m.

  15. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may cause $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
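    The sketch below illustrates the AAKR-plus-SPRT monitoring idea on synthetic data: the AAKR estimate is a kernel-weighted average of healthy exemplars, and a mean-shift SPRT runs on the resulting residuals. Memory vectors, kernel bandwidth, and SPRT parameters are invented and not taken from the paper.

```python
# Minimal sketch of AAKR residual generation plus a mean-shift SPRT on the residuals.
# Memory vectors, bandwidth, and SPRT parameters below are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
memory = rng.normal([50.0, 1.2, 300.0], [2.0, 0.05, 5.0], size=(500, 3))  # healthy history

def aakr_estimate(x, memory, h=1.0):
    """Auto-associative kernel regression: kernel-weighted average of healthy exemplars."""
    z = (memory - memory.mean(0)) / memory.std(0)
    q = (x - memory.mean(0)) / memory.std(0)
    w = np.exp(-np.sum((z - q) ** 2, axis=1) / (2 * h ** 2))
    return (w[:, None] * memory).sum(0) / (w.sum() + 1e-12)

def sprt(residuals, m0=0.0, m1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """One-sided SPRT for a mean shift from m0 to m1 in Gaussian residuals."""
    upper, lower, llr = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha)), 0.0
    for r in residuals:
        llr += (m1 - m0) / sigma**2 * (r - (m0 + m1) / 2)
        if llr >= upper:
            return "anomaly"
        if llr <= lower:
            llr = 0.0            # accept "normal" for now, reset, keep monitoring
    return "normal"

# Simulated drift on the first sensor; the residual of that channel feeds the SPRT.
obs = rng.normal([53.0, 1.2, 300.0], [2.0, 0.05, 5.0], size=(40, 3))
residuals = [o[0] - aakr_estimate(o, memory)[0] for o in obs]
print(sprt(residuals, m1=2.0, sigma=2.0))
```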

  16. Structural Damage Detection Using Changes in Natural Frequencies: Theory and Applications

    NASA Astrophysics Data System (ADS)

    He, K.; Zhu, W. D.

    2011-07-01

    A vibration-based method that uses changes in natural frequencies of a structure to detect damage has advantages over conventional nondestructive tests in detecting various types of damage, including loosening of bolted joints, using minimum measurement data. Two major challenges associated with applications of the vibration-based damage detection method to engineering structures are addressed: accurate modeling of structures and the development of a robust inverse algorithm to detect damage, which are defined as the forward and inverse problems, respectively. To resolve the forward problem, new physics-based finite element modeling techniques are developed for fillets in thin-walled beams and for bolted joints, so that complex structures can be accurately modeled with a reasonable model size. To resolve the inverse problem, a logistical function transformation is introduced to convert the constrained optimization problem to an unconstrained one, and a robust iterative algorithm using a trust-region method, called the Levenberg-Marquardt method, is developed to accurately detect the locations and extent of damage. The new methodology can ensure global convergence of the iterative algorithm in solving under-determined system equations and deal with damage detection problems with relatively large modeling error and measurement noise. The vibration-based damage detection method is applied to various structures including lightning masts, a space frame structure and one of its components, and a pipeline. The exact locations and extent of damage can be detected in the numerical simulation where there is no modeling error and measurement noise. The locations and extent of damage can be successfully detected in experimental damage detection.
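    As a toy version of the inverse problem described above, the sketch below estimates fractional stiffness reductions in a 3-DOF spring-mass chain so that the model's natural frequencies match "measured" ones, using a Levenberg-Marquardt least-squares solve from SciPy. It stands in for the paper's finite element models and trust-region implementation; all numbers are invented.

```python
# Toy version of the inverse problem: find stiffness reductions in a 3-DOF spring-mass
# chain so that model natural frequencies match "measured" ones, using a
# Levenberg-Marquardt least-squares solve. This stands in for the paper's FE models.
import numpy as np
from scipy.optimize import least_squares

k0 = np.array([100.0, 100.0, 100.0])            # healthy spring stiffnesses (unit masses)

def frequencies(damage):
    k = k0 * (1.0 - damage)                     # damage = fractional stiffness loss per spring
    K = np.array([[k[0] + k[1], -k[1], 0.0],
                  [-k[1], k[1] + k[2], -k[2]],
                  [0.0, -k[2], k[2]]])
    return np.sqrt(np.linalg.eigvalsh(K))       # natural frequencies (rad/s, unit mass)

measured = frequencies(np.array([0.0, 0.3, 0.0]))   # "measured": 30% damage in spring 2

result = least_squares(lambda d: frequencies(d) - measured,
                       x0=np.zeros(3), method="lm")
print("estimated damage:", np.round(result.x, 3))
```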

  17. Detecting Damage in Composite Material Using Nonlinear Elastic Wave Spectroscopy Methods

    NASA Astrophysics Data System (ADS)

    Meo, Michele; Polimeno, Umberto; Zumpano, Giuseppe

    2008-05-01

    Modern aerospace structures make increasing use of fibre reinforced plastic composites, due to their high specific mechanical properties. However, because of their brittleness, low velocity impact can cause delaminations beneath the surface while the surface may appear undamaged upon visual inspection. Such damage is called barely visible impact damage (BVID). Such internal damage leads to a significant reduction in local strength and ultimately could lead to catastrophic failures. It is therefore important to detect and monitor damage in highly loaded composite components to receive an early warning for well-timed maintenance of the aircraft. Non-linear ultrasonic spectroscopy methods are promising damage detection and material characterization tools. In this paper, two different non-linear elastic wave spectroscopy (NEWS) methods are presented: single-mode nonlinear resonance ultrasound (NRUS) and the nonlinear wave modulation technique (NWMS). The NEWS methods were applied to detect delamination damage due to low velocity impact (<12 J) on various composite plates. The results showed that the proposed methodology appears to be highly sensitive to the presence of damage, with very promising future NDT and structural health monitoring applications.

  18. Red Lesion Detection Using Dynamic Shape Features for Diabetic Retinopathy Screening.

    PubMed

    Seoud, Lama; Hurtut, Thomas; Chelbi, Jihed; Cheriet, Farida; Langlois, J M Pierre

    2016-04-01

    The development of an automatic telemedicine system for computer-aided screening and grading of diabetic retinopathy depends on reliable detection of retinal lesions in fundus images. In this paper, a novel method for the automatic detection of both microaneurysms and hemorrhages in color fundus images is described and validated. The main contribution is a new set of shape features, called Dynamic Shape Features, that do not require precise segmentation of the regions to be classified. These features represent the evolution of the shape during image flooding and allow discrimination between lesions and vessel segments. The method is validated per-lesion and per-image using six databases, four of which are publicly available. It proves to be robust with respect to variability in image resolution, quality and acquisition system. On the Retinopathy Online Challenge's database, the method achieves a FROC score of 0.420, which ranks it fourth. On the Messidor database, when detecting images with diabetic retinopathy, the proposed method achieves an area under the ROC curve of 0.899, comparable to the score of human experts, and it outperforms state-of-the-art approaches.

  19. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presented a fault diagnosis method for key components of satellites, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided the failure of LIBs into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, using actual in-orbit telemetry data of the key parameters of LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of LIBs based on the state estimation of MSET, and, from these residual values (RX and RL), we detected the anomaly states based on the anomaly detection of SPRT. Lastly, we conducted an example of AMM for LIBs, and, according to the results of AMM, we validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703

  20. Wavelet based detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Gur, Berke M.; Niezrecki, Christopher

    2005-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of watercraft collisions in Florida's coastal waterways. Several boater warning systems, based upon manatee vocalizations, have been proposed to reduce the number of collisions. Three detection methods based on the Fourier transform (threshold, harmonic content and autocorrelation methods) were previously suggested and tested. In the last decade, the wavelet transform has emerged as an alternative to the Fourier transform and has been successfully applied in various fields of science and engineering including the acoustic detection of dolphin vocalizations. As of yet, no prior research has been conducted in analyzing manatee vocalizations using the wavelet transform. Within this study, the wavelet transform is used as an alternative to the Fourier transform in detecting manatee vocalizations. The wavelet coefficients are analyzed and tested against a specified criterion to determine the existence of a manatee call. The performance of the method presented is tested on the same data previously used in the prior studies, and the results are compared. Preliminary results indicate that using the wavelet transform as a signal processing technique to detect manatee vocalizations shows great promise.
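    A minimal sketch of the wavelet-based detection idea, assuming PyWavelets is available: decompose a short window, measure the energy of the detail band that spans the call's frequency range, and compare it against a noise-derived threshold. The signal, band choice, and threshold are synthetic illustrations, not the study's actual detector.

```python
# Sketch of a wavelet-band energy detector with PyWavelets. The 4-kHz tone stands in
# for a manatee call; the decomposition level and threshold are illustrative only.
import numpy as np
import pywt

fs = 48_000
t = np.arange(0, 0.25, 1 / fs)
rng = np.random.default_rng(0)
noise = 0.1 * rng.standard_normal(t.size)
call = 0.3 * np.sin(2 * np.pi * 4000 * t) * np.hanning(t.size)   # tonal stand-in for a call

def band_energy(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    d3 = coeffs[1]                   # level-3 detail band: roughly fs/16 to fs/8 (3-6 kHz here)
    return float(np.sum(d3 ** 2))

threshold = 3.0 * band_energy(noise)  # crude threshold derived from a noise-only window
print("noise only:", band_energy(noise) > threshold,
      "| call present:", band_energy(noise + call) > threshold)
```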

  1. A New Forensic Picture Polygraph Technique for Terrorist and Crime Deception System

    ERIC Educational Resources Information Center

    Costello, R. H. Brian; Axton, JoAnn; Gold, Karen L.

    2006-01-01

    The Forensic Terrorist Detection System called the Pinocchio Assessment Profile (PAP) employs standard-issue polygraphs for a non-verbal picture technique that originated as a biofeedback careers-interest instrument. The system can be integrated readily into airport screening protocols. However, the method does not rely on questioning or foreign language…

  2. A confusing world: what to call histology of three-dimensional tumour margins?

    PubMed

    Moehrle, M; Breuninger, H; Röcken, M

    2007-05-01

    Complete three-dimensional histology of excised skin tumour margins has a long tradition and, unfortunately, a multitude of names as well. Mohs, who introduced it, called it 'microscopically controlled surgery'. Others have described it as 'micrographic surgery', 'Mohs' micrographic surgery', or simply 'Mohs' surgery'. Semantic confusion became truly rampant when variant forms, each useful in its own way for detecting subclinical outgrowths of malignant skin tumours, were later introduced under such names as histographic surgery, systematic histologic control of the tumour bed, histological control of excised tissue margins, the square procedure, the perimeter technique, etc. All of these methods are basically identical in concept. All involve complete, three-dimensional histological visualization and evaluation of excision margins. Their common goal is to detect unseen tumour outgrowths. For greater clarity, the authors of this paper recommend general adoption of '3D histology' as a collective designation for all the above methods. As an added advantage, 3D histology can also be used in other medical disciplines to confirm true R0 resection of, for example, breast cancer or intestinal cancer.

  3. Base-Calling Algorithm with Vocabulary (BCV) Method for Analyzing Population Sequencing Chromatograms

    PubMed Central

    Fantin, Yuri S.; Neverov, Alexey D.; Favorov, Alexander V.; Alvarez-Figueroa, Maria V.; Braslavskaya, Svetlana I.; Gordukova, Maria A.; Karandashova, Inga V.; Kuleshov, Konstantin V.; Myznikova, Anna I.; Polishchuk, Maya S.; Reshetov, Denis A.; Voiciehovskaya, Yana A.; Mironov, Andrei A.; Chulanov, Vladimir P.

    2013-01-01

    Sanger sequencing is a common method of reading DNA sequences. It is less expensive than high-throughput methods, and it is appropriate for numerous applications including molecular diagnostics. However, sequencing mixtures of similar DNA of pathogens with this method is challenging. This is important because most clinical samples contain such mixtures, rather than pure single strains. The traditional solution is to sequence selected clones of PCR products, a complicated, time-consuming, and expensive procedure. Here, we propose the base-calling with vocabulary (BCV) method that computationally deciphers Sanger chromatograms obtained from mixed DNA samples. The inputs to the BCV algorithm are a chromatogram and a dictionary of sequences that are similar to those we expect to obtain. We apply the base-calling function on a test dataset of chromatograms without ambiguous positions, as well as one with 3–14% sequence degeneracy. Furthermore, we use BCV to assemble a consensus sequence for an HIV genome fragment in a sample containing a mixture of viral DNA variants and to determine the positions of the indels. Finally, we detect drug-resistant Mycobacterium tuberculosis strains carrying frameshift mutations mixed with wild-type bacteria in the pncA gene, and roughly characterize bacterial communities in clinical samples by direct 16S rRNA sequencing. PMID:23382983

  4. Detection of Tephra Layers in Antarctic Sediment Cores with Hyperspectral Imaging

    PubMed Central

    Aymerich, Ismael F.; Oliva, Marc; Giralt, Santiago; Martín-Herrero, Julio

    2016-01-01

    Tephrochronology uses recognizable volcanic ash layers (from airborne pyroclastic deposits, or tephras) in geological strata to set unique time references for paleoenvironmental events across wide geographic areas. This involves the detection of tephra layers which sometimes are not evident to the naked eye, including the so-called cryptotephras. Tests that are expensive, time-consuming, and/or destructive are often required. Destructive testing for tephra layers of cores from difficult regions, such as Antarctica, which are useful sources of other kinds of information beyond tephras, is always undesirable. Here we propose hyperspectral imaging of cores, Self-Organizing Map (SOM) clustering of the preprocessed spectral signatures, and spatial analysis of the classified images as a convenient, fast, non-destructive method for tephra detection. We test the method in five sediment cores from three Antarctic lakes, and show its potential for detection of tephras and cryptotephras. PMID:26815202
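    The clustering step can be sketched with the MiniSom package (an assumption; any SOM implementation would do): per-pixel spectra are mapped onto a small SOM, and depths assigned to rarely used nodes are candidates for thin layers with anomalous spectra. The spectra below are synthetic.

```python
# Sketch of SOM clustering of per-pixel spectra with MiniSom (third-party package).
# Spectra are synthetic: a smooth sediment background plus a thin anomalous layer.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
n_depths, n_bands = 400, 50
spectra = np.linspace(0.2, 0.6, n_bands) + 0.02 * rng.standard_normal((n_depths, n_bands))
spectra[180:185] += 0.3 * np.exp(-((np.arange(n_bands) - 30) / 5.0) ** 2)   # thin "tephra" layer

som = MiniSom(4, 4, n_bands, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(spectra, 2000)

labels = np.array([som.winner(s) for s in spectra])        # winning SOM node per depth pixel
nodes, counts = np.unique(labels, axis=0, return_counts=True)
rare = nodes[np.argmin(counts)]                            # a rarely used node is a layer candidate
print("candidate layer depths:", np.where((labels == rare).all(axis=1))[0])
```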

  5. Utility of NIST Whole-Genome Reference Materials for the Technical Validation of a Multigene Next-Generation Sequencing Test.

    PubMed

    Shum, Bennett O V; Henner, Ilya; Belluoccio, Daniele; Hinchcliffe, Marcus J

    2017-07-01

    The sensitivity and specificity of next-generation sequencing laboratory developed tests (LDTs) are typically determined by an analyte-specific approach. Analyte-specific validations use disease-specific controls to assess an LDT's ability to detect known pathogenic variants. Alternatively, a methods-based approach can be used for LDT technical validations. Methods-focused validations do not use disease-specific controls but use benchmark reference DNA that contains known variants (benign, variants of unknown significance, and pathogenic) to assess variant calling accuracy of a next-generation sequencing workflow. Recently, four whole-genome reference materials (RMs) from the National Institute of Standards and Technology (NIST) were released to standardize methods-based validations of next-generation sequencing panels across laboratories. We provide a practical method for using NIST RMs to validate multigene panels. We analyzed the utility of RMs in validating a novel newborn screening test that targets 70 genes, called NEO1. Despite the NIST RM variant truth set originating from multiple sequencing platforms, replicates, and library types, we discovered a 5.2% false-negative variant detection rate in the RM truth set genes that were assessed in our validation. We developed a strategy using complementary non-RM controls to demonstrate 99.6% sensitivity of the NEO1 test in detecting variants. Our findings have implications for laboratories or proficiency testing organizations using whole-genome NIST RMs for testing. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
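
    In practice, a methods-based validation of this kind reduces to comparing the panel's calls against the RM truth set restricted to the targeted regions; the bookkeeping is sketched below, with the variant representation and example call sets assumed for illustration.

      def evaluate_against_truth(called, truth):
          """Sensitivity and false-negative rate of a call set versus a truth set.

          Variants are (chrom, pos, ref, alt) tuples, already restricted to the
          panel's targeted regions (an assumed preprocessing step).
          """
          called, truth = set(called), set(truth)
          sensitivity = len(called & truth) / len(truth)
          return {"sensitivity": sensitivity,
                  "false_negative_rate": 1.0 - sensitivity,
                  "missed": sorted(truth - called)}

      truth = [("chr1", 100, "A", "G"), ("chr1", 250, "C", "T"), ("chr2", 40, "G", "GA")]
      called = [("chr1", 100, "A", "G"), ("chr2", 40, "G", "GA")]
      print(evaluate_against_truth(called, truth))
      # sensitivity 0.67, false-negative rate 0.33, missed: chr1:250 C>T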

  6. Parallel evaluation of broad virus detection methods.

    PubMed

    Modrof, Jens; Berting, Andreas; Kreil, Thomas R

    2014-01-01

    The testing for adventitious viruses is of critical importance during development and production of biological products. The recent emergence and ongoing development of broad virus detection methods calls for an evaluation of whether these methods can appropriately be implemented into current adventitious agent testing procedures. To assess the suitability of several broad virus detection methods, a comparative experimental study was conducted: four virus preparations, which were spiked at two different concentrations each into two different cell culture media, were sent to four investigators in a blinded fashion for analysis with broad virus detection methods such as polymerase chain reaction-electrospray ionization mass spectrometry (PCR-ESI/MS), microarray, and two approaches utilizing massively parallel sequencing. The results that were reported by the investigators revealed that all methods were able to identify the majority of samples correctly (mean 83%), with a surprisingly narrow range among the methods, that is, between 72% (PCR-ESI/MS) and 95% (microarray). In addition to the correct results, a variety of unexpected assignments were reported for a minority of samples, again with little variation regarding the methods used (range 20-45%), while false negatives were reported for 0-25% of the samples. Regarding assay sensitivity, the viruses were detected by all methods included in this study at concentrations of about 4-5 log10 quantitative PCR copies/mL, and probably with higher sensitivity in some cases. In summary, the broad virus detection methods investigated were shown to be suitable even for detection of relatively low virus concentrations. However, there is also some potential for the production of false-positive as well as false-negative assignments, which indicates the requirement for further improvements before these methods can be considered for routine use. © PDA, Inc. 2014.

  7. Adiabatic Quantum Anomaly Detection and Machine Learning

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen; Lidar, Daniel

    2012-02-01

    We present methods of anomaly detection and machine learning using adiabatic quantum computing. The machine learning algorithm is a boosting approach which seeks to optimally combine somewhat accurate classification functions to create a unified classifier which is much more accurate than its components. This algorithm then becomes the first part of the larger anomaly detection algorithm. In the anomaly detection routine, we first use adiabatic quantum computing to train two classifiers which detect two sets, the overlap of which forms the anomaly class. We call this the learning phase. Then, in the testing phase, the two learned classification functions are combined to form the final Hamiltonian for an adiabatic quantum computation, the low energy states of which represent the anomalies in a binary vector space.

  8. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
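
    The basic ASE test that QuASAR builds on, counting reference and alternate reads at a heterozygous site and testing the 1:1 null, can be sketched as follows; this is the plain binomial version, without QuASAR's genotype-uncertainty, base-call-error, or over-dispersion modeling.

      from scipy.stats import binomtest

      def ase_test(ref_reads, alt_reads):
          """Test the 1:1 allelic-ratio null hypothesis at a heterozygous site."""
          n = ref_reads + alt_reads
          return {"allelic_ratio": ref_reads / n,
                  "p_value": binomtest(ref_reads, n, p=0.5).pvalue}

      # Example: 70 reference vs. 30 alternate reads suggests allelic imbalance.
      print(ase_test(70, 30))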

  9. VaDiR: an integrated approach to Variant Detection in RNA.

    PubMed

    Neums, Lisa; Suenaga, Seiji; Beyerlein, Peter; Anders, Sara; Koestler, Devin; Mariani, Andrea; Chien, Jeremy

    2018-02-01

    Advances in next-generation DNA sequencing technologies are now enabling detailed characterization of sequence variations in cancer genomes. With whole-genome sequencing, variations in coding and non-coding sequences can be discovered, but the associated cost currently limits its general use in research. Whole-exome sequencing is used to characterize sequence variations in coding regions, but the cost associated with capture reagents and biases in capture rate limit its full use in research. Additional limitations include uncertainty in assigning the functional significance of the mutations when these mutations are observed in the non-coding region or in genes that are not expressed in cancer tissue. We investigated the feasibility of uncovering mutations from expressed genes using RNA sequencing datasets with a method called Variant Detection in RNA (VaDiR) that integrates three variant callers: SNPiR, RVBoost, and MuTect2. The combination of all three methods, which we called Tier 1 variants, produced the highest precision, with true-positive mutations from RNA-seq that could be validated at the DNA level. We also found that the integration of Tier 1 variants with those called by MuTect2 and SNPiR produced the highest recall with acceptable precision. Finally, we observed a higher rate of mutation discovery in genes that are expressed at higher levels. Our method, VaDiR, offers a way to uncover mutations from RNA sequencing datasets that could be useful in further functional analysis. In addition, our approach allows orthogonal validation of DNA-based mutation discovery by providing complementary sequence variation analysis from paired RNA/DNA sequencing datasets.
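
    The Tier 1 definition, keeping only variants reported by all three callers, is essentially a set intersection over normalized variant records, as in this sketch (the variant keys and example call sets are illustrative).

      def tier1_variants(snpir_calls, rvboost_calls, mutect2_calls):
          """Variants reported by all three callers: the highest-precision 'Tier 1' set."""
          return set(snpir_calls) & set(rvboost_calls) & set(mutect2_calls)

      # Variants keyed as (chrom, pos, ref, alt); only chr1:100 A>G is called by all three.
      snpir   = {("chr1", 100, "A", "G"), ("chr1", 300, "C", "T")}
      rvboost = {("chr1", 100, "A", "G"), ("chr2", 55, "G", "A")}
      mutect2 = {("chr1", 100, "A", "G"), ("chr1", 300, "C", "T"), ("chr3", 9, "T", "C")}
      print(tier1_variants(snpir, rvboost, mutect2))  # {('chr1', 100, 'A', 'G')}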

  10. Optimizing occupancy surveys by maximizing detection probability: application to amphibian monitoring in the Mediterranean region.

    PubMed

    Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien

    2014-09-01

    Setting up effective conservation strategies requires the precise determination of the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present on a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call count, nighttime visual encounter, and daytime netting. The detection/nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply among species, the survey method used, and the date of the survey. These three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection/nondetection surveys combined with a site-occupancy modeling approach are powerful methods that can be used to estimate the detection probability and to determine the prospecting effort necessary to assert that a species is absent from a site.
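
    The practical question behind such protocols, how many visits are needed before a species can be declared absent with a given confidence, follows directly from the per-visit detection probability, as in this sketch (the example probability is illustrative, not a value from the study).

      import math

      def visits_needed(p_detect, target=0.95):
          """Smallest number of visits n with cumulative detection 1 - (1 - p)^n >= target."""
          return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

      def cumulative_detection(p_detect, n_visits):
          return 1.0 - (1.0 - p_detect) ** n_visits

      # A species detected with probability 0.65 on any single visit (illustrative value):
      print(visits_needed(0.65))             # 3 visits
      print(cumulative_detection(0.65, 3))   # ~0.96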

  11. Simultaneous Multi-band Detection of Low Surface Brightness Galaxies with Markovian Modeling

    NASA Astrophysics Data System (ADS)

    Vollmer, B.; Perret, B.; Petremand, M.; Lavigne, F.; Collet, Ch.; van Driel, W.; Bonnarel, F.; Louys, M.; Sabatini, S.; MacArthur, L. A.

    2013-02-01

    We present to the astronomical community an algorithm for the detection of low surface brightness (LSB) galaxies in images, called MARSIAA (MARkovian Software for Image Analysis in Astronomy), which is based on multi-scale Markovian modeling. MARSIAA can be applied simultaneously to different bands. It segments an image into a user-defined number of classes, according to their surface brightness and surroundings—typically, one or two classes contain the LSB structures. We have developed an algorithm, called DetectLSB, which allows the efficient identification of LSB galaxies from among the candidate sources selected by MARSIAA. The application of the method to two and three bands simultaneously was tested on simulated images. Based on our tests, we are confident that we can detect LSB galaxies down to a central surface brightness level of only 1.5 times the standard deviation from the mean pixel value in the image background. To assess the robustness of our method, it was applied to a set of 18 B- and I-band images (covering 1.3 deg^2 in total) of the Virgo Cluster to which Sabatini et al. previously applied a matched-filter dwarf LSB galaxy search algorithm. We have detected all 20 objects from the Sabatini et al. catalog which we could classify by eye as bona fide LSB galaxies. Our method has also detected four additional Virgo Cluster LSB galaxy candidates undetected by Sabatini et al. To further assess the completeness of the results of our method, MARSIAA, SExtractor, and DetectLSB were all applied to search for (1) mock Virgo LSB galaxies inserted into a set of deep Next Generation Virgo Survey (NGVS) gri-band subimages and (2) Virgo LSB galaxies identified by eye in a full set of NGVS square degree gri images. MARSIAA/DetectLSB recovered ~20% more mock LSB galaxies and ~40% more LSB galaxies identified by eye than SExtractor/DetectLSB. With a 90% fraction of false positives from an entirely unsupervised pipeline, a completeness of 90% is reached for sources with r_e > 3'' at a mean surface brightness level of μ_g = 27.7 mag arcsec^-2 and a central surface brightness of μ_0,g = 26.7 mag arcsec^-2. About 10% of the false positives are artifacts, the rest being background galaxies. We have found our proposed Markovian LSB galaxy detection method to be complementary to the application of matched filters and an optimized use of SExtractor, and to have the following advantages: it is scale free, can be applied simultaneously to several bands, and is well adapted for crowded regions on the sky.

  12. ERASE-Seq: Leveraging replicate measurements to enhance ultralow frequency variant detection in NGS data

    PubMed Central

    Kamps-Hughes, Nick; McUsic, Andrew; Kurihara, Laurie; Harkins, Timothy T.; Pal, Prithwish; Ray, Claire

    2018-01-01

    The accurate detection of ultralow allele frequency variants in DNA samples is of interest in both research and medical settings, particularly in liquid biopsies where cancer mutational status is monitored from circulating DNA. Next-generation sequencing (NGS) technologies employing molecular barcoding have shown promise but significant sensitivity and specificity improvements are still needed to detect mutations in a majority of patients before the metastatic stage. To address this we present analytical validation data for ERASE-Seq (Elimination of Recurrent Artifacts and Stochastic Errors), a method for accurate and sensitive detection of ultralow frequency DNA variants in NGS data. ERASE-Seq differs from previous methods by creating a robust statistical framework to utilize technical replicates in conjunction with background error modeling, providing a 10 to 100-fold reduction in false positive rates compared to published molecular barcoding methods. ERASE-Seq was tested using spiked human DNA mixtures with clinically realistic DNA input quantities to detect SNVs and indels between 0.05% and 1% allele frequency, the range commonly found in liquid biopsy samples. Variants were detected with greater than 90% sensitivity and a false positive rate below 0.1 calls per 10,000 possible variants. The approach represents a significant performance improvement compared to molecular barcoding methods and does not require changing molecular reagents. PMID:29630678
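
    The replicate-plus-background idea can be caricatured as follows: a candidate variant is retained only if its allele count is improbable under a site-specific background error rate in every technical replicate. The test, threshold, and error rate below are illustrative stand-ins, not ERASE-Seq's actual statistical framework.

      from scipy.stats import binomtest

      def call_ultralow_variant(replicates, background_error, alpha=0.01):
          """Keep a candidate only if every replicate's alternate-allele count exceeds
          what the site-specific background error rate would explain.

          replicates: list of (alt_reads, total_reads) per technical replicate.
          """
          for alt, depth in replicates:
              p = binomtest(alt, depth, p=background_error, alternative="greater").pvalue
              if p >= alpha:
                  return False  # this replicate is consistent with sequencing noise
          return True

      # Two replicates at ~0.3% allele fraction over a 0.05% background error rate:
      print(call_ultralow_variant([(30, 10000), (27, 9500)], background_error=0.0005))  # True
      print(call_ultralow_variant([(7, 10000), (4, 9500)], background_error=0.0005))    # False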

  13. An improved ChIP-seq peak detection system for simultaneously identifying post-translational modified transcription factors by combinatorial fusion, using SUMOylation as an example.

    PubMed

    Cheng, Chia-Yang; Chu, Chia-Han; Hsu, Hung-Wei; Hsu, Fang-Rong; Tang, Chung Yi; Wang, Wen-Ching; Kung, Hsing-Jien; Chang, Pei-Ching

    2014-01-01

    Post-translational modification (PTM) of transcription factors and chromatin remodelling proteins is recognized as a major mechanism by which transcriptional regulation occurs. Chromatin immunoprecipitation (ChIP) in combination with high-throughput sequencing (ChIP-seq) is being applied as a gold standard when studying the genome-wide binding sites of transcription factors (TFs). This has greatly improved our understanding of protein-DNA interactions on a genome-wide scale. However, current ChIP-seq peak calling tools are not sufficiently sensitive and are unable to simultaneously identify post-translationally modified TFs based on ChIP-seq analysis; this is largely due to the widespread presence of multiple modified TFs. Using SUMO-1 modification as an example, we describe here an improved approach that allows the simultaneous identification of the particular genomic binding regions of all TFs with SUMO-1 modification. Traditional peak calling methods are inadequate when identifying multiple TF binding sites that involve long genomic regions, and therefore we designed a ChIP-seq processing pipeline for the detection of peaks via a combinatorial fusion method. Then, we annotate the peaks with known transcription factor binding sites (TFBS) using the Transfac Matrix Database (v7.0), which predicts potential SUMOylated TFs. Next, the peak calling results were further analyzed based on promoter proximity, TFBS annotation, and a literature review, and were validated by ChIP real-time quantitative PCR (qPCR) and ChIP-reChIP real-time qPCR. The results clearly show that SUMOylated TFs can be pinpointed using our pipeline. A methodology is presented that analyzes SUMO-1 ChIP-seq patterns and predicts related TFs. Our analysis uses three peak calling tools. The fusion of these different tools increases the precision of the peak calling results. The TFBS annotation method is able to predict potential SUMOylated TFs. Here, we offer a new approach that enhances ChIP-seq data analysis and allows the identification of multiple SUMOylated TF binding sites simultaneously, which can then be utilized for other functional PTM binding site prediction in the future.

  14. An Energy-Efficient Multi-Tier Architecture for Fall Detection on Smartphones

    PubMed Central

    Guvensan, M. Amac; Kansiz, A. Oguz; Camgoz, N. Cihan; Turkmen, H. Irem; Yavuz, A. Gokhan; Karsligil, M. Elif

    2017-01-01

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy-efficiency without compromising on the fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented on a mobile application, called uSurvive, for Android smartphones. It runs as a background service and monitors the activities of a person in daily life and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different smartphone models demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has an accuracy of 93%, which is superior to thresholding methods and better than machine-learning-only solutions. PMID:28644378
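
    The energy-saving logic of such a tiered design, letting a cheap threshold on acceleration magnitude gate the expensive classifier, can be sketched as below; the threshold, features, and placeholder classifier are assumptions, not uSurvive's actual parameters.

      import math

      IMPACT_THRESHOLD_G = 2.5  # cheap first-tier check (assumed value)

      def magnitude(ax, ay, az):
          return math.sqrt(ax * ax + ay * ay + az * az)

      def classify_window(features):
          """Placeholder for the expensive machine-learning tier."""
          return features["post_impact_stillness"] > 0.8  # assumed decision rule

      def fall_detected(window):
          """Tiered gating: run the classifier only when the cheap impact check fires."""
          peak_g = max(magnitude(*sample) for sample in window["samples"])
          if peak_g < IMPACT_THRESHOLD_G:      # no impact-like spike: stay in the cheap tier
              return False
          return classify_window(window["features"])

      window = {"samples": [(0.1, 0.2, 1.0), (1.8, 2.1, 2.0)],
                "features": {"post_impact_stillness": 0.9}}
      print(fall_detected(window))  # True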

  15. Contour-Based Corner Detection and Classification by Using Mean Projection Transform

    PubMed Central

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-01-01

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images. PMID:24590354

  16. Contour-based corner detection and classification by using mean projection transform.

    PubMed

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-02-28

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images.

  17. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121

  18. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    PubMed

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
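
    Once corresponding feature points have been extracted from the two scans, the baseline comparison itself is straightforward: compute the lengths of the lines connecting point pairs in each epoch and report the differences, as in this sketch (the coordinates are illustrative, in metres).

      import numpy as np
      from itertools import combinations

      def baseline_changes(points_epoch1, points_epoch2):
          """Compare lengths of all baselines (point-pair distances) between two epochs.

          points_epoch*: dicts mapping a feature-point ID to (x, y, z) coordinates,
          with the same IDs present in both scans.
          """
          changes = {}
          for a, b in combinations(sorted(points_epoch1), 2):
              d1 = np.linalg.norm(np.subtract(points_epoch1[a], points_epoch1[b]))
              d2 = np.linalg.norm(np.subtract(points_epoch2[a], points_epoch2[b]))
              changes[(a, b)] = d2 - d1
          return changes

      before = {"T1": (0.0, 0.0, 0.0), "T2": (2.0, 0.0, 0.0), "B5": (1.0, 3.00, 0.0)}
      after  = {"T1": (0.0, 0.0, 0.0), "T2": (2.0, 0.0, 0.0), "B5": (1.0, 3.05, 0.0)}
      print(baseline_changes(before, after))  # baselines to B5 lengthen by a few centimetres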

  19. An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems.

    PubMed

    Glover, Jack L; Hudson, Lawrence T

    2016-06-01

    The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in a US national aviation security standard.

  20. An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems

    PubMed Central

    Glover, Jack L.; Hudson, Lawrence T.

    2016-01-01

    The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in a US national aviation security standard. PMID:27499586

  1. An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems

    NASA Astrophysics Data System (ADS)

    Glover, Jack L.; Hudson, Lawrence T.

    2016-06-01

    The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in an international aviation security standard.
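
    The role of the Radon transform here is to collapse the two-dimensional wire-detection problem into finding a peak in a sinogram; the sketch below shows that reduction on a synthetic image. The decision threshold is an arbitrary illustration, not the calibrated criterion used in the standard.

      import numpy as np
      from skimage.transform import radon

      def wire_visible(image, snr_threshold=8.0):
          """Decide whether a roughly straight wire is present by looking for a
          sinogram peak standing well above the background projections."""
          theta = np.linspace(0.0, 180.0, 180, endpoint=False)
          sinogram = radon(image, theta=theta, circle=False)
          score = (sinogram.max() - sinogram.mean()) / sinogram.std()
          return score > snr_threshold, score

      # Synthetic radiograph: noisy background plus one faint vertical wire.
      rng = np.random.default_rng(0)
      img = rng.normal(0.0, 0.05, (128, 128))
      img[:, 64] += 0.5  # the "wire"
      print(wire_visible(img))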

  2. Penalty Dynamic Programming Algorithm for Dim Targets Detection in Sensor Systems

    PubMed Central

    Huang, Dayu; Xue, Anke; Guo, Yunfei

    2012-01-01

    In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performance of the tracking techniques is used as feedback to the detection part. The feedback is constructed by a penalty term in the merit function, and the penalty term is a function of the possible target state estimation, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD, and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from one target or clutter is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations. PMID:22666074

  3. Eye pupil detection system using an ensemble of regression forest and fast radial symmetry transform with a near infrared camera

    NASA Astrophysics Data System (ADS)

    Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul

    2017-09-01

    In this paper, we focus on pupil center detection in video sequences that include varying head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye by using cascade local regression based on a regression forest. Based on the rough location of the pupil, a fast radial symmetry transform is applied around the previously found location to refine the pupil center. As the final step, the pupil displacement is estimated between the previous frame and the current frame to maintain accuracy against a false localization occurring in a particular frame. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.

  4. VARiD: a variation detection framework for color-space and letter-space platforms.

    PubMed

    Dalca, Adrian V; Rumble, Stephen M; Levy, Samuel; Brudno, Michael

    2010-06-15

    High-throughput sequencing (HTS) technologies are transforming the study of genomic variation. The various HTS technologies have different sequencing biases and error rates, and while most HTS technologies sequence the residues of the genome directly, generating base calls for each position, the Applied Biosystems SOLiD platform generates dibase-coded (color-space) sequences. While combining data from the various platforms should increase the accuracy of variation detection, to date there are only a few tools that can identify variants from color-space data, and none that can analyze color-space and regular (letter-space) data together. We present VARiD, a probabilistic method for variation detection from both letter- and color-space reads simultaneously. VARiD is based on a hidden Markov model and uses the forward-backward algorithm to accurately identify heterozygous, homozygous and tri-allelic SNPs, as well as micro-indels. Our analysis shows that VARiD performs better than the AB SOLiD toolset at detecting variants from color-space data alone, and improves the calls dramatically when letter- and color-space reads are combined. The toolset is freely available at http://compbio.cs.utoronto.ca/varid.

  5. Mapping Base Modifications in DNA by Transverse-Current Sequencing

    NASA Astrophysics Data System (ADS)

    Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.

    2018-02-01

    Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.

  6. Seasonal variability and detection range modeling of baleen whale calls in the Gulf of Alaska, 1999-2002.

    PubMed

    Stafford, Kathleen M; Mellinger, David K; Moore, Sue E; Fox, Christopher G

    2007-12-01

    Five species of large whales, including the blue (Balaenoptera musculus), fin (B. physalus), sei (B. borealis), humpback (Megaptera novaeangliae), and North Pacific right (Eubalaena japonica), were the target of commercial harvests in the Gulf of Alaska (GoA) during the 19th through mid-20th centuries. Since then, there have been a few summertime visual surveys for these species, but no overview of year-round use of these waters by endangered whales, primarily because standard visual survey data are difficult and costly to obtain. From October 1999 to May 2002, moored hydrophones were deployed at six locations in the GoA to record whale calls. Reception of calls from fin, humpback, and blue whales and an unknown source, called Watkins' whale, showed seasonal and geographic variation. Calls were detected more often during the winter than during the summer, suggesting that animals inhabit the GoA year-round. To estimate the distance at which species-diagnostic calls could be heard, parabolic equation propagation loss models for frequencies characteristic of each call type were run. Maximum detection ranges in the subarctic North Pacific ranged from 45 to 250 km among three species (fin, humpback, blue), although modeled detection ranges varied greatly with input parameters and choice of ambient noise level.

  7. Synthetic internal control sequences to increase negative call veracity in multiplexed, quantitative PCR assays for Phakopsora pachyrhizi

    USDA-ARS?s Scientific Manuscript database

    Quantitative PCR (Q-PCR) utilizing specific primer sequences and a fluorogenic, 5’-exonuclease linear hydrolysis probe is well established as a detection and identification method for Phakopsora pachyrhizi, the soybean rust pathogen. Because of the extreme sensitivity of Q-PCR, the DNA of a single u...

  8. A Call for a Neuroscience Approach to Cancer-Related Cognitive Impairment.

    PubMed

    Horowitz, Todd S; Suls, Jerry; Treviño, Melissa

    2018-05-23

    Cancer-related cognitive impairment (CRCI) is a widespread problem for the increasing population of cancer survivors. Our understanding of the nature, causes, and prevalence of CRCI is hampered by a reliance on clinical neuropsychological methods originally designed to detect focal lesions. Future progress will require collaboration between neuroscience and clinical neuropsychology. Published by Elsevier Ltd.

  9. Damage detection of an in-service condensation pipeline joint

    NASA Astrophysics Data System (ADS)

    Briand, Julie; Rezaei, Davood; Taheri, Farid

    2010-04-01

    The early detection of damage in structural or mechanical systems is of vital importance. With early detection, the damage may be repaired before the integrity of the system is jeopardized, resulting in monetary losses, loss of life or limb, and environmental impacts. Among the various types of structural health monitoring techniques, vibration-based methods are of significant interest since the damage location does not need to be known beforehand, making it a more versatile approach. The non-destructive damage detection method used for the experiments herein is a novel vibration-based method which uses an index called the EMD Energy Damage Index, developed with the aim of providing improved qualitative results compared to those methods currently available. As part of an effort to establish the integrity and limitation of this novel damage detection method, field testing was completed on a mechanical pipe joint on a condensation line, located in the physical plant of Dalhousie University. Piezoceramic sensors, placed at various locations around the joint were used to monitor the free vibration of the pipe imposed through the use of an impulse hammer. Multiple damage progression scenarios were completed, each having a healthy state and multiple damage cases. Subsequently, the recorded signals from the healthy and damaged joint were processed through the EMD Energy Damage Index developed in-house in an effort to detect the inflicted damage. The proposed methodology successfully detected the inflicted damages. In this paper, the effects of impact location, sensor location, frequency bandwidth, intrinsic mode functions, and boundary conditions are discussed.

  10. Surface contamination detection by means of near-infrared stimulation of thermal luminescence

    NASA Astrophysics Data System (ADS)

    Carrieri, Arthur H.; Roese, Erik S.

    2006-02-01

    A method for remotely detecting liquid chemical contamination on terrestrial surfaces is presented. Concurrent to irradiation by an absorbing near-infrared beam, the subject soil medium liberates radiance called thermal luminescence (TL) comprising middle-infrared energies (ν_mir) that is scanned interferometrically in beam duration τ. Cyclic states of absorption and emission by the contaminant surrogate are rendered from a sequential differential-spectrum measurement [ΔS(ν_mir, τ)] of the scanned TL. Detection of chemical warfare agent simulant wetting soil is performed in this manner, for example, through pattern recognition of its unique, thermally dynamic, molecular vibration resonance bands on display in the ΔS(ν_mir, τ) metric.

  11. Calibrating passive acoustic monitoring: correcting humpback whale call detections for site-specific and time-dependent environmental characteristics.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Campbell, Greg S; Hildebrand, John A

    2013-11-01

    This paper demonstrates the importance of accounting for environmental effects on passive underwater acoustic monitoring results. The situation considered is the reduction in shipping off the California coast between 2008 and 2010 due to the recession and environmental legislation. The resulting variations in ocean noise change the probability of detecting marine mammal vocalizations. An acoustic model was used to calculate the time-varying probability of detecting humpback whale vocalizations under best-guess environmental conditions and varying noise. The uncorrected call counts suggest a diel pattern and an increase in calling over a two-year period; the corrected call counts show minimal evidence of these features.

  12. Inappropriate Fiddling with Statistical Analyses to Obtain a Desirable P-value: Tests to Detect its Presence in Published Literature

    PubMed Central

    Gadbury, Gary L.; Allison, David B.

    2012-01-01

    Much has been written regarding p-values below certain thresholds (most notably 0.05) denoting statistical significance and the tendency of such p-values to be more readily publishable in peer-reviewed journals. Intuition suggests that there may be a tendency to manipulate statistical analyses to push a “near significant p-value” to a level that is considered significant. This article presents a method for detecting the presence of such manipulation (herein called “fiddling”) in a distribution of p-values from independent studies. Simulations are used to illustrate the properties of the method. The results suggest that the method has low type I error and that power approaches acceptable levels as the number of p-values being studied approaches 1000. PMID:23056287

  13. Inappropriate fiddling with statistical analyses to obtain a desirable p-value: tests to detect its presence in published literature.

    PubMed

    Gadbury, Gary L; Allison, David B

    2012-01-01

    Much has been written regarding p-values below certain thresholds (most notably 0.05) denoting statistical significance and the tendency of such p-values to be more readily publishable in peer-reviewed journals. Intuition suggests that there may be a tendency to manipulate statistical analyses to push a "near significant p-value" to a level that is considered significant. This article presents a method for detecting the presence of such manipulation (herein called "fiddling") in a distribution of p-values from independent studies. Simulations are used to illustrate the properties of the method. The results suggest that the method has low type I error and that power approaches acceptable levels as the number of p-values being studied approaches 1000.
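
    The intuition, that fiddling should produce an excess of p-values just below the threshold relative to just above it, can be probed with a simple caricature; the binomial comparison below is a stand-in chosen for illustration and is not the test developed in the paper.

      from scipy.stats import binomtest

      def excess_below_threshold(p_values, threshold=0.05, width=0.01):
          """Compare counts of p-values just below vs. just above a significance threshold;
          absent fiddling, the two narrow bands should be roughly balanced."""
          below = sum(threshold - width <= p < threshold for p in p_values)
          above = sum(threshold <= p < threshold + width for p in p_values)
          test = binomtest(below, below + above, p=0.5, alternative="greater")
          return {"just_below": below, "just_above": above, "p_value": test.pvalue}

      # Illustrative collection with a suspicious pile-up just under 0.05:
      pvals = [0.043, 0.047, 0.049, 0.048, 0.046, 0.051, 0.12, 0.33, 0.71, 0.002]
      print(excess_below_threshold(pvals))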

  14. Detecting SNPs and estimating allele frequencies in clonal bacterial populations by sequencing pooled DNA.

    PubMed

    Holt, Kathryn E; Teo, Yik Y; Li, Heng; Nair, Satheesh; Dougan, Gordon; Wain, John; Parkhill, Julian

    2009-08-15

    Here, we present a method for estimating the frequencies of SNP alleles present within pooled samples of DNA using high-throughput short-read sequencing. The method was tested on real data from six strains of the highly monomorphic pathogen Salmonella Paratyphi A, sequenced individually and in a pool. A variety of read mapping and quality-weighting procedures were tested to determine the optimal parameters, which afforded ≥80% sensitivity of SNP detection and strong correlation with true SNP frequency at a poolwide read depth of 40x, declining only slightly at read depths of 20-40x. The method was implemented in Perl and relies on the open-source software Maq for read mapping and SNP calling. The Perl script is freely available from ftp://ftp.sanger.ac.uk/pub/pathogens/pools/.

  15. Accurate and exact CNV identification from targeted high-throughput sequence data.

    PubMed

    Nord, Alex S; Lee, Ming; King, Mary-Claire; Walsh, Tom

    2011-04-12

    Massively parallel sequencing of barcoded DNA samples significantly increases screening efficiency for clinically important genes. Short read aligners are well suited to single nucleotide and indel detection. However, methods for CNV detection from targeted enrichment are lacking. We present a method combining coverage with map information for the identification of deletions and duplications in targeted sequence data. Sequencing data is first scanned for gains and losses using a comparison of normalized coverage data between samples. CNV calls are confirmed by testing for a signature of sequences that span the CNV breakpoint. With our method, CNVs can be identified regardless of whether breakpoints are within regions targeted for sequencing. For CNVs where at least one breakpoint is within targeted sequence, exact CNV breakpoints can be identified. In a test data set of 96 subjects sequenced across ~1 Mb genomic sequence using multiplexing technology, our method detected mutations as small as 31 bp, predicted quantitative copy count, and had a low false-positive rate. Application of this method allows for identification of gains and losses in targeted sequence data, providing comprehensive mutation screening when combined with a short read aligner.
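
    The coverage-comparison stage can be sketched as computing, per target, the ratio of a sample's normalized depth to the cohort median and flagging targets whose ratio departs from 1. The thresholds and toy depths are illustrative, and the published method additionally confirms calls with breakpoint-spanning reads.

      import numpy as np

      def coverage_cnv_scan(depths, sample_idx, loss_cut=0.6, gain_cut=1.4):
          """Flag candidate deletions/duplications from normalized target coverage.

          depths: targets x samples array of raw read depths.
          """
          norm = depths / depths.sum(axis=0, keepdims=True)   # per-sample normalization
          ref = np.median(norm, axis=1)                       # cohort reference per target
          ratio = norm[:, sample_idx] / ref
          calls = np.where(ratio < loss_cut, "loss",
                           np.where(ratio > gain_cut, "gain", "normal"))
          return ratio, calls

      # Toy panel of 5 targets x 4 samples; sample 0 has half coverage on target 2.
      depths = np.array([[100, 110,  95, 105],
                         [ 80,  85,  90,  78],
                         [ 45, 100,  98, 102],
                         [120, 115, 125, 118],
                         [ 60,  65,  58,  62]], dtype=float)
      print(coverage_cnv_scan(depths, sample_idx=0))  # target 2 is flagged as a loss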

  16. A new way of searching for transients: the ADWO method and its results

    NASA Astrophysics Data System (ADS)

    Bagoly, Z.; Szecsi, D.; Ripa, J.; Racz, I. I.; Csabai, I.; Dobos, L.; Horvath, I.; Balazs, L. G.; Toth, L. V.

    2017-12-01

    With the detection of gravitational wave emission from merging compact objects, it is now more important than ever to effectively mine the data-sets of gamma-ray satellites for non-triggered, short-duration transients. Hence we developed a new method called the Automatized Detector Weight Optimization (ADWO), applicable to space-borne detectors such as Fermi's GBM and RHESSI's Ge detectors. Provided that the trigger time of an astrophysical event is well known (as in the case of a gravitational wave detection) but the detector response matrix is uncertain, ADWO combines the data of all detectors and energy channels to provide the best signal-to-noise ratio. We used ADWO to search for potential electromagnetic counterparts of gravitational wave events, as well as to detect previously un-triggered short-duration GRBs in the data-sets.

  17. Fast preparation of ultrafine monolayered transition-metal dichalcogenide quantum dots using electrochemical shock for explosive detection.

    PubMed

    Chen, Zhigang; Tao, Zhengxu; Cong, Shan; Hou, Junyu; Zhang, Dengsong; Geng, Fengxia; Zhao, Zhigang

    2016-09-15

    A simple, general and fast method called "electrochemical shock" is developed to prepare monolayered transition-metal dichalcogenide (TMD) QDs with an average size of 2-4 nm and an average thickness of 0.85 ± 0.5 nm with only about 10 min of ultrasonication. Just like nails hammered into a plate, the electrochemical shock with Al^3+ ions and the following extraction with the help of oleic acid can disintegrate bulk TMD crystals into ultrafine TMD QDs. The fast-prepared QDs are then applied to detect highly explosive molecules such as 2,4,6-trinitrophenol (TNP) with a low detection limit of 10^-6 M. Our versatile method could be broadly applicable for the fast production of ultrathin QDs of other materials with great promise for various applications.

  18. Avian predators are less abundant during periodical cicada emergences, but why?

    PubMed

    Koenig, Walter D; Ries, Leslie; Olsen, V Beth K; Liebhold, Andrew M

    2011-03-01

    Despite a substantial resource pulse, numerous avian insectivores known to depredate periodical cicadas (Magicicada spp.) are detected less commonly during emergence years than in either the previous or following years. We used data on periodical cicada calls collected by volunteers conducting North American Breeding Bird Surveys within the range of cicada Brood X to test three hypotheses for this observation: lower detection rates could be caused by bird calls being obscured by cicada calls ("detectability" hypothesis), by birds avoiding areas with cicadas ("repel" hypothesis), or because bird abundances are generally lower during emergence years for some reason unrelated to the current emergence event ("true decline" hypothesis). We tested these hypotheses by comparing bird detections at stations coincident with calling cicadas vs. those without calling cicadas in the year prior to and during cicada emergences. At four distinct levels (stop, route, range, and season), parallel declines of birds in groups exposed and not exposed to cicada calls supported the true decline hypothesis. We discuss several potential mechanisms for this pattern, including the possibility that it is a consequence of the ecological and evolutionary interactions between predators of this extraordinary group of insects.

  19. Random Access Memories: A New Paradigm for Target Detection in High Resolution Aerial Remote Sensing Images.

    PubMed

    Zou, Zhengxia; Shi, Zhenwei

    2018-03-01

    We propose a new paradigm for target detection in high resolution aerial remote sensing images under small target priors. Previous remote sensing target detection methods frame the detection as learning of a detection model plus inference of class-label and bounding-box coordinates. Instead, we formulate it from a Bayesian view: at the inference stage, the detection model is adaptively updated to maximize its posterior, which is determined by both training and observation. We call this paradigm "random access memories (RAM)." In this paradigm, "Memories" can be interpreted as any model distribution learned from training data and "random access" means accessing memories and randomly adjusting the model at detection phase to obtain better adaptivity to any unseen distribution of test data. By leveraging some of the latest detection techniques, e.g., deep Convolutional Neural Networks and multi-scale anchors, experimental results on a public remote sensing target detection data set show our method outperforms several other state-of-the-art methods. We also introduce a new data set "LEarning, VIsion and Remote sensing laboratory (LEVIR)", which is one order of magnitude larger than other data sets of this field. LEVIR consists of a large set of Google Earth images, with over 22k images and 10k independently labeled targets. RAM gives a noticeable upgrade in accuracy (a mean average precision improvement of 1%-4%) over our baseline detectors with acceptable computational overhead.

  20. High order filtering methods for approximating hyperbolic systems of conservation laws

    NASA Technical Reports Server (NTRS)

    Lafon, F.; Osher, S.

    1991-01-01

    The essentially nonoscillatory (ENO) schemes, while potentially useful in the computation of discontinuous solutions of hyperbolic conservation-law systems, are computationally costly relative to simple central-difference methods. A filtering technique is presented which employs central differencing of arbitrarily high-order accuracy except where a local test detects the presence of spurious oscillations and calls upon the full ENO apparatus to remove them. A factor-of-three speedup is thus obtained over the full-ENO method for a wide range of problems, with high-order accuracy in regions of smooth flow.

  1. Combining Whispering-Gallery Mode Optical Biosensors with Microfluidics for Real-Time Detection of Protein Secretion from Living Cells in Complex Media.

    PubMed

    Chen, Ying-Jen; Schoeler, Ulrike; Huang, Chung-Hsuan Benjamin; Vollmer, Frank

    2018-05-01

    The noninvasive monitoring of protein secretion of cells responding to drug treatment is an effective and essential tool in latest drug development and for cytotoxicity assays. In this work, a surface functionalization method is demonstrated for specific detection of protein released from cells and a platform that integrates highly sensitive optical devices, called whispering-gallery mode biosensors, with precise microfluidics control to achieve label-free and real-time detection. Cell biomarker release is measured in real time and with nanomolar sensitivity. The surface functionalization method allows for antibodies to be immobilized on the surface for specific detection, while the microfluidics system enables detection in a continuous flow with a negligible compromise between sensitivity and flow control over stabilization and mixing. Cytochrome c detection is used to illustrate the merits of the system. Jurkat cells are treated with the toxin staurosporine to trigger cell apoptosis and cytochrome c released into the cell culture medium is monitored via the newly invented optical microfluidic platform. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. ROI Detection and Vessel Segmentation in Retinal Image

    NASA Astrophysics Data System (ADS)

    Sabaz, F.; Atila, U.

    2017-11-01

    Diabetes affects the structure of the eye and can eventually lead to loss of vision. Depending on the stage of the disease, called diabetic retinopathy, problems range from blurred vision to sudden loss of vision. Automated detection of vessels in retinal images is useful for diagnosing eye diseases, classifying disease, and supporting other clinical studies. The shape and structure of the vessels give information about the severity of the disease and the stage of the disease. Automatic and fast detection of vessels allows a quick diagnosis of the disease and lets the treatment process start sooner. ROI detection and vessel extraction methods for retinal images are presented in this study. It is shown that the Frangi filter used in image processing can be successfully applied to the detection and extraction of vessels.
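
    A minimal version of the vessel-extraction step with the Frangi vesselness filter might look like the sketch below; the threshold and the synthetic test image are illustrative, and real use would start from the green channel of a fundus photograph restricted to the detected ROI.

      import numpy as np
      from skimage.filters import frangi

      def extract_vessels(gray_image, threshold=0.15):
          """Enhance tubular structures with the Frangi filter and binarize the result."""
          vesselness = frangi(gray_image)          # strong response on vessel-like ridges
          vesselness /= vesselness.max() + 1e-12
          return vesselness > threshold

      # Synthetic fundus-like image: bright background with one dark, curved "vessel".
      img = np.ones((128, 128))
      rows = np.arange(128)
      cols = (64 + 20 * np.sin(rows / 20.0)).astype(int)
      img[rows, cols] = 0.0
      print(extract_vessels(img).sum(), "vessel-like pixels detected")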

  3. Detecting very low allele fraction variants using targeted DNA sequencing and a novel molecular barcode-aware variant caller.

    PubMed

    Xu, Chang; Nezami Ranjbar, Mohammad R; Wu, Zhong; DiCarlo, John; Wang, Yexun

    2017-01-03

    Detection of DNA mutations at very low allele fractions with high accuracy will significantly improve the effectiveness of precision medicine for cancer patients. To achieve this goal through next generation sequencing, researchers need a detection method that 1) captures rare mutation-containing DNA fragments efficiently in the mix of abundant wild-type DNA; 2) sequences the DNA library extensively to deep coverage; and 3) distinguishes low level true variants from amplification and sequencing errors with high accuracy. Targeted enrichment using PCR primers provides researchers with a convenient way to achieve deep sequencing for a small, yet most relevant region using benchtop sequencers. Molecular barcoding (or indexing) provides a unique solution for reducing sequencing artifacts analytically. Although different molecular barcoding schemes have been reported in recent literature, most variant calling has been done on limited targets, using simple custom scripts. The analytical performance of barcode-aware variant calling can be significantly improved by incorporating advanced statistical models. We present here a highly efficient, simple and scalable enrichment protocol that integrates molecular barcodes in multiplex PCR amplification. In addition, we developed smCounter, an open source, generic, barcode-aware variant caller based on a Bayesian probabilistic model. smCounter was optimized and benchmarked on two independent read sets with SNVs and indels at 5 and 1% allele fractions. Variants were called with very good sensitivity and specificity within coding regions. We demonstrated that we can accurately detect somatic mutations with allele fractions as low as 1% in coding regions using our enrichment protocol and variant caller.

  4. High-order optical vortex position detection using a Shack-Hartmann wavefront sensor.

    PubMed

    Luo, Jia; Huang, Hongxin; Matsui, Yoshinori; Toyoda, Haruyoshi; Inoue, Takashi; Bai, Jian

    2015-04-06

    Optical vortex (OV) beams have null-intensity singular points, and the intensities in the region surrounding the singular point are quite low. This low intensity region influences the position detection accuracy of phase singular point, especially for high-order OV beam. In this paper, we propose a new method for solving this problem, called the phase-slope-combining correlation matching method. A Shack-Hartmann wavefront sensor (SH-WFS) is used to measure phase slope vectors at lenslet positions of the SH-WFS. Several phase slope vectors are combined into one to reduce the influence of low-intensity regions around the singular point, and the combined phase slope vectors are used to determine the OV position with the aid of correlation matching with a pre-calculated database. Experimental results showed that the proposed method works with high accuracy, even when detecting an OV beam with a topological charge larger than six. The estimated precision was about 0.15 in units of lenslet size when detecting an OV beam with a topological charge of up to 20.

  5. Toward the detection of abnormal chest radiographs the way radiologists do it

    NASA Astrophysics Data System (ADS)

    Alzubaidi, Mohammad; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.

    2011-03-01

    Computer Aided Detection (CADe) and Computer Aided Diagnosis (CADx) are relatively recent areas of research that attempt to employ feature extraction, pattern recognition, and machine learning algorithms to aid radiologists in detecting and diagnosing abnormalities in medical images. These computational methods are based on the assumption that there are distinct classes of abnormalities, and that each class has some distinguishing features that set it apart from other classes. In practice, however, abnormalities in chest radiographs tend to be very heterogeneous. The literature suggests that thoracic (chest) radiologists develop their ability to detect abnormalities by developing a sense of what is normal, so that anything abnormal attracts their attention. This paper discusses an approach to CADe based on a technique called anomaly detection (which aims to detect outliers in data sets) for the purpose of detecting atypical regions in chest radiographs. To apply anomaly detection to chest radiographs, it is first necessary to develop a basis for extracting features from corresponding anatomical locations in different chest radiographs. This paper proposes a method for doing this and describes how it can be used to support CADe.
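
    As a generic illustration of the anomaly-detection idea (not the authors' feature extraction or model), regions described by feature vectors can be scored by their Mahalanobis distance from a "normal" training distribution; the random feature matrices below are placeholders.

```python
# Generic anomaly-detection sketch: score feature vectors by Mahalanobis
# distance from the "normal" training distribution. The features themselves
# (random placeholders here) would come from corresponding anatomical regions.
import numpy as np

rng = np.random.default_rng(0)
normal_features = rng.normal(size=(500, 16))   # placeholder "normal" training set
test_features = rng.normal(size=(10, 16))      # placeholder test regions

mu = normal_features.mean(axis=0)
cov = np.cov(normal_features, rowvar=False)
cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # regularized inverse

diff = test_features - mu
scores = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
flagged = scores > np.percentile(scores, 95)   # mark the most atypical regions
```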

  6. Online detecting system of roller wear based on laser-linear array CCD technology

    NASA Astrophysics Data System (ADS)

    Guo, Yuan

    2010-10-01

    The roller is an important tool in a rolling mill, and its surface directly affects the quality of the rolled product. After a period of use, a roller must be repaired or replaced. Examining the profile of a working roller between rolling intervals is called online detection of roller wear. Online detection of roller wear is important for choosing a reasonable grinding time, reducing the number of roller changes, improving product quality, and enabling online roller grinding. Applying laser-linear array CCD detection technology, a non-contact method for online detection of roller wear is proposed. The principle, composition, and operation of the linear array CCD detection system are described. An error compensation algorithm is used to offset shifts of the roller axis in the measurement system, which markedly improves stability and accuracy. Experiments show that the accuracy of the system meets the demands of practical production, providing a new high-speed, high-accuracy method for online detection of roller wear.

  7. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises

    PubMed Central

    Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called the Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noise, as well as for impulse noise alone and for Gaussian noise alone. PMID:28692667
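
    The ROAD statistic that ROADGI extends has a standard form: for each pixel, take the absolute intensity differences to its eight neighbours, sort them, and sum the smallest few. A small sketch under those assumptions follows; summing the four smallest differences is the usual convention, not a value taken from this paper.

```python
# Sketch of the Rank-Ordered Absolute Differences (ROAD) statistic for one
# pixel: sum of the m smallest absolute differences to its 8 neighbours.
# Impulse-corrupted pixels tend to have a large ROAD value.
import numpy as np

def road(image, i, j, m=4):
    window = image[i - 1:i + 2, j - 1:j + 2].astype(float)
    diffs = np.abs(window - image[i, j])
    diffs = np.delete(diffs.ravel(), 4)        # drop the centre pixel itself
    return np.sort(diffs)[:m].sum()

img = np.full((5, 5), 100.0)
img[2, 2] = 255.0                              # isolated impulse
print(road(img, 2, 2))                         # large -> likely impulse noise
print(road(img, 1, 1))                         # small -> likely clean pixel
```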

  8. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises.

    PubMed

    Jin, Qiyu; Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called the Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noise, as well as for impulse noise alone and for Gaussian noise alone.

  9. A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wan, Chao; Yuan, Fuh-Gwo

    2017-04-01

    In this paper a method for damage detection in beam structures using high-speed camera is presented. Traditional methods of damage detection in structures typically involve contact (i.e., piezoelectric sensor or accelerometer) or non-contact sensors (i.e., laser vibrometer) which can be costly and time consuming to inspect an entire structure. With the popularity of the digital camera and the development of computer vision technology, video cameras offer a viable capability of measurement including higher spatial resolution, remote sensing and low-cost. In the study, a damage detection method based on the high-speed camera was proposed. The system setup comprises a high-speed camera and a line-laser which can capture the out-of-plane displacement of a cantilever beam. The cantilever beam with an artificial crack was excited and the vibration process was recorded by the camera. A methodology called motion magnification, which can amplify subtle motions in a video is used for modal identification of the beam. A finite element model was used for validation of the proposed method. Suggestions for applications of this methodology and challenges in future work will be discussed.
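
    Motion magnification is described in detail elsewhere; as a rough intensity-based (Eulerian) stand-in for the technique named above, each pixel's time series can be band-pass filtered around a structural frequency band of interest, amplified, and added back. The band edges, gain, and frame rate below are illustrative assumptions, not the paper's settings.

```python
# Simplified Eulerian (intensity-based) magnification sketch: band-pass each
# pixel's intensity over time around a frequency band of interest, amplify,
# and add back. This is a toy stand-in for the motion magnification used in
# the paper; band edges, gain, and frame rate are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

def magnify(frames, fs, f_lo, f_hi, alpha=20.0):
    # frames: array of shape (n_frames, height, width), grayscale
    b, a = butter(2, [f_lo, f_hi], btype="bandpass", fs=fs)
    band = filtfilt(b, a, frames.astype(float), axis=0)
    return frames + alpha * band

rng = np.random.default_rng(1)
video = rng.normal(loc=128, scale=1.0, size=(240, 32, 32))  # placeholder frames
magnified = magnify(video, fs=120.0, f_lo=8.0, f_hi=12.0)
```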

  10. Searching for exoplanets using artificial intelligence

    NASA Astrophysics Data System (ADS)

    Pearson, Kyle A.; Palafox, Leon; Griffith, Caitlin A.

    2018-02-01

    In the last decade, over a million stars were monitored to detect transiting planets. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, deep nets learn to recognize planet features instead of relying on hand-coded metrics that humans perceive as the most representative. Our convolutional neural network is capable of detecting Earth-like exoplanets in noisy time-series data with greater accuracy than a least-squares method. Deep nets are highly generalizable, allowing data from different time series to be evaluated after interpolation without compromising performance. As validated by our deep net analysis of Kepler light curves, we detect periodic transits consistent with the true period without any model fitting. Our study indicates that machine learning will facilitate the characterization of exoplanets in future analyses of large astronomy data sets.
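
    As a sketch of the general approach (not the network from the paper), a small one-dimensional convolutional classifier over fixed-length light-curve windows might look like the following, assuming PyTorch; the layer sizes, window length, and random placeholder data are illustrative.

```python
# Minimal 1D-CNN sketch for classifying fixed-length light-curve windows as
# "transit" vs "no transit". Layer sizes are illustrative, not the paper's net.
import torch
import torch.nn as nn

class TransitCNN(nn.Module):
    def __init__(self, window=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (window // 4), 64), nn.ReLU(),
            nn.Linear(64, 1),                  # logit for "transit present"
        )

    def forward(self, x):                      # x: (batch, 1, window)
        return self.classifier(self.features(x))

model = TransitCNN()
flux = torch.randn(8, 1, 256)                  # placeholder normalized flux windows
logits = model(flux)
loss = nn.BCEWithLogitsLoss()(logits.squeeze(1), torch.randint(0, 2, (8,)).float())
```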

  11. Fast Vessel Detection in Gaofen-3 SAR Images with Ultrafine Strip-Map Mode

    PubMed Central

    Liu, Lei; Qiu, Xiaolan; Lei, Bin

    2017-01-01

    This study aims to detect vessels with lengths ranging from about 70 to 300 m, in Gaofen-3 (GF-3) SAR images with ultrafine strip-map (UFS) mode as fast as possible. Based on the analysis of the characteristics of vessels in GF-3 SAR imagery, an effective vessel detection method is proposed in this paper. Firstly, the iterative constant false alarm rate (CFAR) method is employed to detect the potential ship pixels. Secondly, the mean-shift operation is applied on each potential ship pixel to identify the candidate target region. During the mean-shift process, we maintain a selection matrix recording which pixels can be taken, and these pixels are called as the valid points of the candidate target. The l1 norm regression is used to extract the principal axis and detect the valid points. Finally, two kinds of false alarms, the bright line and the azimuth ambiguity, are removed by comparing the valid area of the candidate target with a pre-defined value and computing the displacement between the true target and the corresponding replicas respectively. Experimental results on three GF-3 SAR images with UFS mode demonstrate the effectiveness and efficiency of the proposed method. PMID:28678197
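
    The CFAR stage can be illustrated with a basic (non-iterative) cell-averaging variant, in which each pixel is compared against a scaled estimate of the local clutter mean taken from a background box with the guard box excluded; the window sizes, scale factor, and synthetic image below are illustrative assumptions.

```python
# Basic cell-averaging CFAR sketch for SAR intensity images: a pixel is a
# detection if it exceeds a scaled estimate of the local clutter mean taken
# from a background ring around a guard window. Window sizes and the scaling
# factor are illustrative; the paper uses an iterative CFAR variant.
import numpy as np
from scipy.ndimage import uniform_filter

def ca_cfar(intensity, guard=5, background=15, scale=5.0):
    big = uniform_filter(intensity, size=background)    # mean over background box
    small = uniform_filter(intensity, size=guard)        # mean over guard box
    n_big, n_small = background ** 2, guard ** 2
    # Mean of the background ring only (guard area excluded).
    ring_mean = (big * n_big - small * n_small) / (n_big - n_small)
    return intensity > scale * ring_mean

rng = np.random.default_rng(2)
img = rng.exponential(scale=1.0, size=(512, 512))        # speckle-like clutter
img[250:256, 300:330] += 40.0                             # synthetic bright vessel
detections = ca_cfar(img)
```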

  12. Animals as Mobile Biological Sensors for Forest Fire Detection

    PubMed Central

    2007-01-01

    This paper proposes a mobile biological sensor system that can assist in the early detection of forest fires, one of the most dreaded natural disasters on Earth. The main idea is to equip animals with sensors and use them as Mobile Biological Sensors (MBS). The system consists of animals native to the forest; sensors (thermal and radiation sensors with GPS) that measure temperature and transmit the location of the MBS; access points for wireless communication; and a central computer system that classifies animal actions. The system offers two methods. In the first, access points continuously receive the animals' GPS locations at set time intervals, and the gathered data are classified and checked for sudden movement (panic) of the animal groups; this method is called animal behavior classification (ABC). The second method, thermal detection (TD), has the access points collect temperature values from the MBS devices and send them to a central computer, which checks for abrupt temperature changes. The system may also be used for purposes other than fire detection, such as animal tracking, poaching prevention, and detecting sudden animal deaths. PMID:28903281

  13. Modeling anuran detection and site occupancy on North American Amphibian Monitoring Program (NAAMP) routes in Maryland

    USGS Publications Warehouse

    Weir, L.A.; Royle, J. Andrew; Nanjappa, P.; Jung, R.E.

    2005-01-01

    One of the most fundamental problems in monitoring animal populations is that of imperfect detection. Although imperfect detection can be modeled, studies examining patterns in occurrence often ignore detection and thus fail to properly partition variation in detection from that of occurrence. In this study, we used anuran calling survey data collected on North American Amphibian Monitoring Program routes in eastern Maryland to investigate factors that influence detection probability and site occupancy for 10 anuran species. In 2002, 17 calling survey routes in eastern Maryland were surveyed to collect environmental and species data nine or more times. To analyze these data, we developed models incorporating detection probability and site occupancy. The results suggest that, for more than half of the 10 species, detection probabilities vary most with season (i.e., day-of-year), air temperature, time, and moon illumination, whereas site occupancy may vary by the amount of palustrine forested wetland habitat. Our results suggest anuran calling surveys should document air temperature, time of night, moon illumination, observer skill, and habitat change over time, as these factors can be important to model-adjusted estimates of site occupancy. Our study represents the first formal modeling effort aimed at developing an analytic assessment framework for NAAMP calling survey data.
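
    The underlying single-season occupancy model with constant occupancy (psi) and detection (p) probabilities can be fitted by maximum likelihood; a sketch with placeholder detection histories follows. The study's actual models additionally include covariates such as temperature, date, and habitat, which are omitted here.

```python
# Sketch of a constant-psi, constant-p single-season occupancy model fitted by
# maximum likelihood to a site-by-visit detection/non-detection matrix.
# The detection histories below are placeholders, not NAAMP data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

y = np.array([[0, 1, 0, 1],      # each row: one route/site, 0/1 detections
              [0, 0, 0, 0],
              [1, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 0]])

def neg_log_lik(params):
    psi, p = expit(params)                       # keep probabilities in (0, 1)
    det = y.sum(axis=1)
    n_visits = y.shape[1]
    lik_occupied = psi * p**det * (1 - p)**(n_visits - det)
    lik = lik_occupied + (1 - psi) * (det == 0)  # never-detected sites may be empty
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"occupancy ~ {psi_hat:.2f}, detection ~ {p_hat:.2f}")
```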

  14. Effects of Airgun Sounds on Bowhead Whale Calling Rates: Evidence for Two Behavioral Thresholds

    PubMed Central

    Blackwell, Susanna B.; Nations, Christopher S.; McDonald, Trent L.; Thode, Aaron M.; Mathias, Delphine; Kim, Katherine H.; Greene, Charles R.; Macrander, A. Michael

    2015-01-01

    In proximity to seismic operations, bowhead whales (Balaena mysticetus) decrease their calling rates. Here, we investigate the transition from normal calling behavior to decreased calling and identify two threshold levels of received sound from airgun pulses at which calling behavior changes. Data were collected in August–October 2007–2010, during the westward autumn migration in the Alaskan Beaufort Sea. Up to 40 directional acoustic recorders (DASARs) were deployed at five sites offshore of the Alaskan North Slope. Using triangulation, whale calls localized within 2 km of each DASAR were identified and tallied every 10 minutes each season, so that the detected call rate could be interpreted as the actual call production rate. Moreover, airgun pulses were identified on each DASAR, analyzed, and a cumulative sound exposure level was computed for each 10-min period each season (CSEL10-min). A Poisson regression model was used to examine the relationship between the received CSEL10-min from airguns and the number of detected bowhead calls. Calling rates increased as soon as airgun pulses were detectable, compared to calling rates in the absence of airgun pulses. After the initial increase, calling rates leveled off at a received CSEL10-min of ~94 dB re 1 μPa2-s (the lower threshold). In contrast, once CSEL10-min exceeded ~127 dB re 1 μPa2-s (the upper threshold), whale calling rates began decreasing, and when CSEL10-min values were above ~160 dB re 1 μPa2-s, the whales were virtually silent. PMID:26039218

  15. Effects of airgun sounds on bowhead whale calling rates: evidence for two behavioral thresholds.

    PubMed

    Blackwell, Susanna B; Nations, Christopher S; McDonald, Trent L; Thode, Aaron M; Mathias, Delphine; Kim, Katherine H; Greene, Charles R; Macrander, A Michael

    2015-01-01

    In proximity to seismic operations, bowhead whales (Balaena mysticetus) decrease their calling rates. Here, we investigate the transition from normal calling behavior to decreased calling and identify two threshold levels of received sound from airgun pulses at which calling behavior changes. Data were collected in August-October 2007-2010, during the westward autumn migration in the Alaskan Beaufort Sea. Up to 40 directional acoustic recorders (DASARs) were deployed at five sites offshore of the Alaskan North Slope. Using triangulation, whale calls localized within 2 km of each DASAR were identified and tallied every 10 minutes each season, so that the detected call rate could be interpreted as the actual call production rate. Moreover, airgun pulses were identified on each DASAR, analyzed, and a cumulative sound exposure level was computed for each 10-min period each season (CSEL10-min). A Poisson regression model was used to examine the relationship between the received CSEL10-min from airguns and the number of detected bowhead calls. Calling rates increased as soon as airgun pulses were detectable, compared to calling rates in the absence of airgun pulses. After the initial increase, calling rates leveled off at a received CSEL10-min of ~94 dB re 1 μPa2-s (the lower threshold). In contrast, once CSEL10-min exceeded ~127 dB re 1 μPa2-s (the upper threshold), whale calling rates began decreasing, and when CSEL10-min values were above ~160 dB re 1 μPa2-s, the whales were virtually silent.

  16. Single molecule molecular inversion probes for targeted, high-accuracy detection of low-frequency variation.

    PubMed

    Hiatt, Joseph B; Pritchard, Colin C; Salipante, Stephen J; O'Roak, Brian J; Shendure, Jay

    2013-05-01

    The detection and quantification of genetic heterogeneity in populations of cells is fundamentally important to diverse fields, ranging from microbial evolution to human cancer genetics. However, despite the cost and throughput advances associated with massively parallel sequencing, it remains challenging to reliably detect mutations that are present at a low relative abundance in a given DNA sample. Here we describe smMIP, an assay that combines single molecule tagging with multiplex targeted capture to enable practical and highly sensitive detection of low-frequency or subclonal variation. To demonstrate the potential of the method, we simultaneously resequenced 33 clinically informative cancer genes in eight cell line and 45 clinical cancer samples. Single molecule tagging facilitated extremely accurate consensus calling, with an estimated per-base error rate of 8.4 × 10(-6) in cell lines and 2.6 × 10(-5) in clinical specimens. False-positive mutations in the single molecule consensus base-calls exhibited patterns predominantly consistent with DNA damage, including 8-oxo-guanine and spontaneous deamination of cytosine. Based on mixing experiments with cell line samples, sensitivity for mutations above 1% frequency was 83% with no false positives. At clinically informative sites, we identified seven low-frequency point mutations (0.2%-4.7%), including BRAF p.V600E (melanoma, 0.2% alternate allele frequency), KRAS p.G12V (lung, 0.6%), JAK2 p.V617F (melanoma, colon, two lung, 0.3%-1.4%), and NRAS p.Q61R (colon, 4.7%). We anticipate that smMIP will be broadly adoptable as a practical and effective method for accurately detecting low-frequency mutations in both research and clinical settings.

  17. Single molecule molecular inversion probes for targeted, high-accuracy detection of low-frequency variation

    PubMed Central

    Hiatt, Joseph B.; Pritchard, Colin C.; Salipante, Stephen J.; O'Roak, Brian J.; Shendure, Jay

    2013-01-01

    The detection and quantification of genetic heterogeneity in populations of cells is fundamentally important to diverse fields, ranging from microbial evolution to human cancer genetics. However, despite the cost and throughput advances associated with massively parallel sequencing, it remains challenging to reliably detect mutations that are present at a low relative abundance in a given DNA sample. Here we describe smMIP, an assay that combines single molecule tagging with multiplex targeted capture to enable practical and highly sensitive detection of low-frequency or subclonal variation. To demonstrate the potential of the method, we simultaneously resequenced 33 clinically informative cancer genes in eight cell line and 45 clinical cancer samples. Single molecule tagging facilitated extremely accurate consensus calling, with an estimated per-base error rate of 8.4 × 10−6 in cell lines and 2.6 × 10−5 in clinical specimens. False-positive mutations in the single molecule consensus base-calls exhibited patterns predominantly consistent with DNA damage, including 8-oxo-guanine and spontaneous deamination of cytosine. Based on mixing experiments with cell line samples, sensitivity for mutations above 1% frequency was 83% with no false positives. At clinically informative sites, we identified seven low-frequency point mutations (0.2%–4.7%), including BRAF p.V600E (melanoma, 0.2% alternate allele frequency), KRAS p.G12V (lung, 0.6%), JAK2 p.V617F (melanoma, colon, two lung, 0.3%–1.4%), and NRAS p.Q61R (colon, 4.7%). We anticipate that smMIP will be broadly adoptable as a practical and effective method for accurately detecting low-frequency mutations in both research and clinical settings. PMID:23382536

  18. Standoff laser-based spectroscopy for explosives detection

    NASA Astrophysics Data System (ADS)

    Gaft, M.; Nagli, L.

    2007-10-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IEDs). It is recognized that laser-based spectroscopy is the only technique potentially capable of standoff detection of minimal amounts of explosives. The LDS activity is based on a combination of laser-based spectroscopic methods with orthogonal capabilities. Our technique belongs to trace detection, specifically its micro-particle variety, and builds on the commonly held observation that surface contamination is very difficult to avoid and can be exploited for standoff detection. We have applied optical techniques, including gated Raman and time-resolved luminescence spectroscopy, to the detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for field remote detection and identification of minimal amounts of explosives on relevant surfaces at distances of up to 30 meters.

  19. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no methods that jointly infer genotypes and conduct ASE inference while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. QuASAR is available at http://github.com/piquelab/QuASAR.
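
    The baseline ASE test described above, counting reads per allele at a heterozygous site and testing the null of a 1:1 ratio, reduces to a binomial test; QuASAR additionally models genotype uncertainty, base-call error, and over-dispersion. A sketch of the baseline test with hypothetical read counts:

```python
# Baseline allele-specific expression test at one heterozygous site: count
# RNA-seq reads supporting each allele and test the null of a 1:1 ratio.
# QuASAR goes further (genotype uncertainty, error rates, over-dispersion);
# this is only the simple binomial version mentioned in the abstract.
from scipy.stats import binomtest

ref_reads, alt_reads = 78, 42                  # hypothetical allele counts
result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5, alternative="two-sided")
print(f"allelic ratio {ref_reads / (ref_reads + alt_reads):.2f}, p = {result.pvalue:.3g}")
```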

  20. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
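
    The central step, computing an optimal threshold from the histogram of a shell region rather than from the whole image, can be illustrated generically with an Otsu threshold restricted to a ring-shaped mask; the hand-built ring and synthetic image below stand in for the propagating shell and are not the DT level set itself.

```python
# Generic illustration of thresholding from a shell region: compute an Otsu
# threshold using only the pixels inside a ring-shaped mask around a front.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(3)
img = rng.normal(60, 5, size=(128, 128))
yy, xx = np.mgrid[:128, :128]
r = np.hypot(yy - 64, xx - 64)
img[r < 20] += 40                                  # brighter "tumor" region

shell = (r > 15) & (r < 25)                        # ring straddling the boundary
t = threshold_otsu(img[shell])                     # threshold from shell pixels only
segmentation = img > t
```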

  1. A Corpus-Based System of Error Detection and Revision Suggestion for Spanish Learners in Taiwan: A Case Study

    ERIC Educational Resources Information Center

    Lu, Hui-Chuan; Chu, Yu-Hsin; Chang, Cheng-Yu

    2013-01-01

    Compared with English learners, Spanish learners have fewer resources for automatic error detection and revision and following the current integrative Computer Assisted Language Learning (CALL), we combined corpus-based approach and CALL to create the System of Error Detection and Revision Suggestion (SEDRS) for learning Spanish. Through…

  2. High-speed railway real-time localization auxiliary method based on deep neural network

    NASA Astrophysics Data System (ADS)

    Chen, Dongjie; Zhang, Wensheng; Yang, Yang

    2017-11-01

    A high-speed railway intelligent monitoring and management system combines schedule integration, geographic information, location services, and data mining technology to integrate time and space data. Auxiliary localization is a significant submodule of the intelligent monitoring system. In practical applications, the usual approach is to capture image sequences of the components with a high-definition camera and to apply digital image processing, target detection, tracking, and even behavior analysis methods. In this paper, we present an end-to-end character recognition method for high-speed railway pillar plate numbers based on a deep CNN called YOLO-toc. Unlike other deep CNNs, YOLO-toc is an end-to-end multi-target detection framework; furthermore, it exhibits state-of-the-art real-time detection performance, reaching nearly 50 fps on a GPU (GTX 960). Finally, we realize a real-time, high-accuracy pillar plate number recognition system and integrate natural scene OCR into a dedicated classification YOLO-toc model.

  3. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization

    PubMed Central

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution. PMID:28045981

  4. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization.

    PubMed

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael; Ambur, Ole Herman

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution.

  5. Harmonic component detection: Optimized Spectral Kurtosis for operational modal analysis

    NASA Astrophysics Data System (ADS)

    Dion, J.-L.; Tawfiq, I.; Chevallier, G.

    2012-01-01

    This work is a contribution to the field of Operational Modal Analysis (OMA), which identifies the modal parameters of mechanical structures using only measured responses. The study deals with structural responses coupled with harmonic components whose amplitude and frequency are modulated over a narrow range, a common combination for mechanical systems with engines and other rotating machines in operation. These harmonic components generate misleading data that are interpreted erroneously by the classical methods used in OMA. The present work attempts to differentiate spectral maxima stemming from harmonic components from those stemming from structural modes. The proposed detection method is based on the so-called Optimized Spectral Kurtosis and is compared with other definitions of Spectral Kurtosis described in the literature. After a parametric study of the method, a critical study is performed on numerical simulations and then on an experimental structure in operation in order to assess the method's performance.
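
    Spectral Kurtosis is commonly estimated from a short-time Fourier transform as SK(f) = <|X(t,f)|^4>_t / <|X(t,f)|^2>_t^2 - 2, which is near 0 for a Gaussian broadband response and near -1 for a pure harmonic component. The sketch below shows that baseline estimator, not the Optimized Spectral Kurtosis of the paper; signal and window parameters are illustrative.

```python
# Baseline Spectral Kurtosis estimator from an STFT:
#   SK(f) = <|X(t,f)|^4>_t / <|X(t,f)|^2>_t^2 - 2
# SK is near 0 for Gaussian broadband responses and strongly negative for a
# tone, which is what makes it useful for flagging harmonic components.
import numpy as np
from scipy.signal import stft

fs = 2048.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(4)
x = np.sin(2 * np.pi * 50 * t) + rng.normal(scale=1.0, size=t.size)  # tone + noise

f, _, X = stft(x, fs=fs, nperseg=256)
mag2 = np.abs(X) ** 2
sk = np.mean(mag2 ** 2, axis=1) / np.mean(mag2, axis=1) ** 2 - 2

print("SK near 50 Hz:", sk[np.argmin(np.abs(f - 50))])   # strongly negative (harmonic)
print("median SK elsewhere:", np.median(sk))              # near 0 (random response)
```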

  6. Deep Whole-Genome Sequencing to Detect Mixed Infection of Mycobacterium tuberculosis

    PubMed Central

    Gan, Mingyu; Liu, Qingyun; Yang, Chongguang; Gao, Qian; Luo, Tao

    2016-01-01

    Mixed infection by multiple Mycobacterium tuberculosis (MTB) strains is associated with poor treatment outcome of tuberculosis (TB). Traditional genotyping methods have been used to detect mixed infections of MTB, however, their sensitivity and resolution are limited. Deep whole-genome sequencing (WGS) has been proved highly sensitive and discriminative for studying population heterogeneity of MTB. Here, we developed a phylogenetic-based method to detect MTB mixed infections using WGS data. We collected published WGS data of 782 global MTB strains from public database. We called homogeneous and heterogeneous single nucleotide variations (SNVs) of individual strains by mapping short reads to the ancestral MTB reference genome. We constructed a phylogenomic database based on 68,639 homogeneous SNVs of 652 MTB strains. Mixed infections were determined if multiple evolutionary paths were identified by mapping the SNVs of individual samples to the phylogenomic database. By simulation, our method could specifically detect mixed infections when the sequencing depth of minor strains was as low as 1× coverage, and when the genomic distance of two mixed strains was as small as 16 SNVs. By applying our methods to all 782 samples, we detected 47 mixed infections and 45 of them were caused by locally endemic strains. The results indicate that our method is highly sensitive and discriminative for identifying mixed infections from deep WGS data of MTB isolates. PMID:27391214

  7. Cause-specific mortality time series analysis: a general method to detect and correct for abrupt data production changes

    PubMed Central

    2011-01-01

    Background Monitoring the time course of mortality by cause is a key public health issue. However, several mortality data production changes may affect cause-specific time trends, thus altering the interpretation. This paper proposes a statistical method that detects abrupt changes ("jumps") and estimates correction factors that may be used for further analysis. Methods The method was applied to a subset of the AMIEHS (Avoidable Mortality in the European Union, toward better Indicators for the Effectiveness of Health Systems) project mortality database and considered for six European countries and 13 selected causes of deaths. For each country and cause of death, an automated jump detection method called Polydect was applied to the log mortality rate time series. The plausibility of a data production change associated with each detected jump was evaluated through literature search or feedback obtained from the national data producers. For each plausible jump position, the statistical significance of the between-age and between-gender jump amplitude heterogeneity was evaluated by means of a generalized additive regression model, and correction factors were deduced from the results. Results Forty-nine jumps were detected by the Polydect method from 1970 to 2005. Most of the detected jumps were found to be plausible. The age- and gender-specific amplitudes of the jumps were estimated when they were statistically heterogeneous, and they showed greater by-age heterogeneity than by-gender heterogeneity. Conclusion The method presented in this paper was successfully applied to a large set of causes of death and countries. The method appears to be an alternative to bridge coding methods when the latter are not systematically implemented because they are time- and resource-consuming. PMID:21929756
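
    Polydect itself is not reproduced here; as a generic illustration of jump detection in an annual log mortality-rate series, abrupt level shifts can be flagged by comparing the mean level in windows before and after each candidate year. The simulated series, window width, and threshold below are illustrative assumptions.

```python
# Generic jump-detection sketch for an annual log mortality-rate series:
# flag years where the mean level in the following window differs from the
# preceding window by more than a threshold. A simple stand-in for Polydect.
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1970, 2006)
log_rate = -6.0 - 0.01 * (years - 1970) + rng.normal(scale=0.02, size=years.size)
log_rate[years >= 1990] += 0.15                 # synthetic data-production jump

def detect_jumps(series, half_window=5, threshold=0.1):
    jumps = []
    for i in range(half_window, series.size - half_window):
        before = series[i - half_window:i].mean()
        after = series[i:i + half_window].mean()
        if abs(after - before) > threshold:
            jumps.append(i)
    return jumps

candidates = detect_jumps(log_rate)
print([int(years[i]) for i in candidates])       # years around the simulated jump
```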

  8. Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort

    PubMed Central

    Gambin, Tomasz; Akdemir, Zeynep C.; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M.B.; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M.; Eldomery, Mohammad K.; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W.; Boerwinkle, Eric; Beaudet, Arthur L.; Gibbs, Richard A.

    2017-01-01

    We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor–Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% single-exonic and 96% for multi-exonic calls). Out of 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR and 64 were confirmed. These include 18 single-exon deletions, of which 8 were exclusively detected by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17–50% of pathogenic CNVs in different disease cohorts in which 7.1–11% of the molecular diagnostic rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. PMID:27980096

  9. Detection of baleen whales on an ocean-bottom seismometer array in the Lau Basin

    NASA Astrophysics Data System (ADS)

    Brodie, D.; Dunn, R.

    2011-12-01

    Long-term deployment of ocean-bottom seismometer arrays provides a unique opportunity for identifying and tracking whales in a manner not usually possible in biological studies. Large baleen whales emit low frequency (>5Hz) sounds called 'calls' or 'songs' that can be detected on either the hydrophone or vertical channel of the instrument at distances in excess of 50 km. The calls are distinct to individual species and even geographical groups among species, and are thought to serve a variety of purposes. Distinct repeating calls can be automatically identified using matched-filter processing, and whales can be located in a manner similar to that of earthquakes. Many baleen whale species are endangered, and little is known about their geographic distribution, population dynamics, and basic behaviors. The Lau back-arc basin, a tectonically active, elongated basin bounded by volcanic shallows, lies in the southwestern Pacific Ocean between Fiji and Tonga. Although whales are known to exist around Fiji and Tonga, little is understood about the population dynamics and migration patterns throughout the basin. Twenty-nine broadband ocean-bottom seismometers deployed in the basin recorded data for approximately ten months during the years 2009-2010. To date, four species of whales have been identified in the data: Blue (one call type), Humpback (two call types, including long-lasting 'songs'), Bryde's (one call type), and Fin whales (three call types). Three as-yet-unknown call types have also been identified. After the calls were identified, idealized spectrograms of the known calls were matched against the entire data set using an auto-detection algorithm. The auto-detection output provides the number of calls and times of year when each call type was recorded. Based on the results, whales migrate seasonally through the basin with some overlapping of species. Initial results also indicate that different species of whales are more common in some parts of the basin than others, suggesting preferences in water depth and distance to land. In future work, whales will be tracked through the basin using call localization information to illustrate migration patterns of the various species.
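
    The matched-filter detection step can be sketched by correlating a synthetic call template against a continuous record and flagging peaks above a threshold. The downswept template, sample rate, and threshold below are illustrative, not the idealized spectrograms used in the study.

```python
# Matched-filter sketch: correlate a synthetic downswept "call" template with
# a continuous record and flag peaks above a threshold. Template and threshold
# are illustrative placeholders.
import numpy as np
from scipy.signal import chirp, correlate

fs = 100.0                                                # Hz, placeholder sample rate
t_call = np.arange(0, 10, 1 / fs)
template = chirp(t_call, f0=20, f1=15, t1=t_call[-1])     # synthetic 20 -> 15 Hz sweep

rng = np.random.default_rng(6)
record = rng.normal(scale=1.0, size=int(3600 * fs))       # one hour of noise
onset = int(1200 * fs)
record[onset:onset + template.size] += 0.5 * template      # buried call

corr = correlate(record, template, mode="valid")
corr /= np.std(corr)                                       # crude normalization
detections = np.flatnonzero(corr > 8.0) / fs               # detection times (s)
print(detections[:5])
```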

  10. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    PubMed

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.

  11. Use of a Parabolic Microphone to Detect Hidden Subjects in Search and Rescue.

    PubMed

    Bowditch, Nathaniel L; Searing, Stanley K; Thomas, Jeffrey A; Thompson, Peggy K; Tubis, Jacqueline N; Bowditch, Sylvia P

    2018-03-01

    This study compares a parabolic microphone to unaided hearing in detecting and comprehending hidden callers at ranges of 322 to 2510 m. Eight subjects were placed 322 to 2510 m away from a central listening point. The subjects were concealed, and their calling volume was calibrated. In random order, subjects were asked to call the name of a state for 5 minutes. Listeners with parabolic microphones and others with unaided hearing recorded the direction of the call (detection) and name of the state (comprehension). The parabolic microphone was superior to unaided hearing in both detecting subjects and comprehending their calls, with an effect size (Cohen's d) of 1.58 for detection and 1.55 for comprehension. For each of the 8 hidden subjects, there were 24 detection attempts with the parabolic microphone and 54 to 60 attempts by unaided listeners. At the longer distances (1529-2510 m), the parabolic microphone was better at detecting callers (83% vs 51%; P<0.00001 by χ²) and comprehension (57% vs 12%; P<0.00001). At the shorter distances (322-1190 m), the parabolic microphone offered advantages in detection (100% vs 83%; P=0.000023) and comprehension (86% vs 51%; P<0.00001), although not as pronounced as at the longer distances. Use of a 66-cm (26-inch) parabolic microphone significantly improved detection and comprehension of hidden calling subjects at distances between 322 and 2510 m when compared with unaided hearing. This study supports the use of a parabolic microphone in search and rescue to locate responsive subjects in favorable weather and terrain.

  12. Efficient Forest Fire Detection Index for Application in Unmanned Aerial Systems (UASs).

    PubMed

    Cruz, Henry; Eckert, Martina; Meneses, Juan; Martínez, José-Fernán

    2016-06-16

    This article proposes a novel method for detecting forest fires, through the use of a new color index, called the Forest Fire Detection Index (FFDI), developed by the authors. The index is based on methods for vegetation classification and has been adapted to detect the tonalities of flames and smoke; the latter could be included adaptively into the Regions of Interest (RoIs) with the help of a variable factor. Multiple tests have been performed upon database imagery and present promising results: a detection precision of 96.82% has been achieved for image sizes of 960 × 540 pixels at a processing time of 0.0447 seconds. This achievement would lead to a performance of 22 f/s, for smaller images, while up to 54 f/s could be reached by maintaining a similar detection precision. Additional tests have been performed on fires in their early stages, achieving a precision rate of p = 96.62%. The method could be used in real-time in Unmanned Aerial Systems (UASs), with the aim of monitoring a wider area than through fixed surveillance systems. Thus, it would result in more cost-effective outcomes than conventional systems implemented in helicopters or satellites. UASs could also reach inaccessible locations without jeopardizing people's safety. On-going work includes implementation into a commercially available drone.

  13. Direct Detection of Biotinylated Proteins by Mass Spectrometry

    PubMed Central

    2015-01-01

    Mass spectrometric strategies to identify protein subpopulations involved in specific biological functions rely on covalently tagging biotin to proteins using various chemical modification methods. The biotin tag is primarily used for enrichment of the targeted subpopulation for subsequent mass spectrometry (MS) analysis. A limitation of these strategies is that MS analysis does not easily discriminate unlabeled contaminants from the labeled protein subpopulation under study. To solve this problem, we developed a flexible method that only relies on direct MS detection of biotin-tagged proteins called “Direct Detection of Biotin-containing Tags” (DiDBiT). Compared with conventional targeted proteomic strategies, DiDBiT improves direct detection of biotinylated proteins ∼200 fold. We show that DiDBiT is applicable to several protein labeling protocols in cell culture and in vivo using cell permeable NHS-biotin and incorporation of the noncanonical amino acid, azidohomoalanine (AHA), into newly synthesized proteins, followed by click chemistry tagging with biotin. We demonstrate that DiDBiT improves the direct detection of biotin-tagged newly synthesized peptides more than 20-fold compared to conventional methods. With the increased sensitivity afforded by DiDBiT, we demonstrate the MS detection of newly synthesized proteins labeled in vivo in the rodent nervous system with unprecedented temporal resolution as short as 3 h. PMID:25117199

  14. Intelligent Method for Diagnosing Structural Faults of Rotating Machinery Using Ant Colony Optimization

    PubMed Central

    Li, Ke; Chen, Peng

    2011-01-01

    Structural faults, such as unbalance, misalignment and looseness, etc., often occur in the shafts of rotating machinery. These faults may cause serious machine accidents and lead to great production losses. This paper proposes an intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization (ACO) and relative ratio symptom parameters (RRSPs) in order to detect faults and distinguish fault types at an early stage. New symptom parameters called “relative ratio symptom parameters” are defined for reflecting the features of vibration signals measured in each state. Synthetic detection index (SDI) using statistical theory has also been defined to evaluate the applicability of the RRSPs. The SDI can be used to indicate the fitness of a RRSP for ACO. Lastly, this paper also compares the proposed method with the conventional neural networks (NN) method. Practical examples of fault diagnosis for a centrifugal fan are provided to verify the effectiveness of the proposed method. The verification results show that the structural faults often occurring in the centrifugal fan, such as unbalance, misalignment and looseness states are effectively identified by the proposed method, while these faults are difficult to detect using conventional neural networks. PMID:22163833

  15. Intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization.

    PubMed

    Li, Ke; Chen, Peng

    2011-01-01

    Structural faults, such as unbalance, misalignment and looseness, etc., often occur in the shafts of rotating machinery. These faults may cause serious machine accidents and lead to great production losses. This paper proposes an intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization (ACO) and relative ratio symptom parameters (RRSPs) in order to detect faults and distinguish fault types at an early stage. New symptom parameters called "relative ratio symptom parameters" are defined for reflecting the features of vibration signals measured in each state. Synthetic detection index (SDI) using statistical theory has also been defined to evaluate the applicability of the RRSPs. The SDI can be used to indicate the fitness of a RRSP for ACO. Lastly, this paper also compares the proposed method with the conventional neural networks (NN) method. Practical examples of fault diagnosis for a centrifugal fan are provided to verify the effectiveness of the proposed method. The verification results show that the structural faults often occurring in the centrifugal fan, such as unbalance, misalignment and looseness states are effectively identified by the proposed method, while these faults are difficult to detect using conventional neural networks.

  16. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    DTIC Science & Technology

    2004-01-01

    [Only indexing fragments of this report are available. The recoverable content concerns monitoring system-call attributes (the login identity under which a system call executes and the call's parameters, such as file names with full paths) and a comparison of intrusion detection systems, including COAST-EIMDT (distributed on target hosts) and EMERALD (distributed on target hosts and security servers, combining signature recognition and anomaly detection).]

  17. An Ultrasonographic Periodontal Probe

    NASA Astrophysics Data System (ADS)

    Bertoncini, C. A.; Hinders, M. K.

    2010-02-01

    Periodontal disease, commonly known as gum disease, affects millions of people. The current method of detecting periodontal pocket depth is painful, invasive, and inaccurate. As an alternative to manual probing, an ultrasonographic periodontal probe is being developed to use ultrasound echo waveforms to measure periodontal pocket depth, which is the main measure of periodontal disease. Wavelet transforms and pattern classification techniques are implemented in artificial intelligence routines that can automatically detect pocket depth. The main pattern classification technique used here, called a binary classification algorithm, compares test objects with only two possible pocket depth measurements at a time and relies on dimensionality reduction for the final determination. This method correctly identifies up to 90% of the ultrasonographic probe measurements within the manual probe's tolerance.

  18. Estimating Lion Abundance using N-mixture Models for Social Species

    PubMed Central

    Belant, Jerrold L.; Bled, Florent; Wilton, Clay M.; Fyumagwa, Robert; Mwampeta, Stanslaus B.; Beyer, Dean E.

    2016-01-01

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170–551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the direction of the corresponding effects was undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species. PMID:27786283

  19. Estimating Lion Abundance using N-mixture Models for Social Species.

    PubMed

    Belant, Jerrold L; Bled, Florent; Wilton, Clay M; Fyumagwa, Robert; Mwampeta, Stanslaus B; Beyer, Dean E

    2016-10-27

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170-551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the direction of the corresponding effects was undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species.

  20. Is biomedical nuclear magnetic resonance limited by a revisitable paradigm in physics?

    PubMed

    de Certaines, J D

    2005-12-14

    The history of nuclear magnetic resonance (NMR) can be divided generally into two phases: before the Second World War, molecular beam methods made it possible to detect the whole set of spins. However, these methods were destructive for the sample and had a very low precision. The publications of F. Bloch and E. Purcell in 1946 opened up a second phase for NMR with the study of condensed matter, but at the expense of an enormous loss in theoretical sensitivity. During more than half a century, the method of Bloch and Purcell, based on inductive detection of the NMR signal, has allowed many developments in biomedicine. But, curiously, this severely constraining limitation on sensitivity has not been called into question during this half-century, as if the pioneers of the pre-war period had been forgotten.

  1. Detection of the Haemophilus somnus antibodies in the bulls' reproductive tract fluids using the ELISA. I. Elaboration of the ELISA for the detection of the specific antibodies in the IgG, IgM and IgA classes.

    PubMed

    Stefaniak, T

    1993-01-01

    The conditions of an ELISA for detecting Haemophilus somnus antibodies of the IgG, IgM and IgA classes were established. The test was adapted for examining seminal plasma, preputial washings and blood serum samples of bulls. To evaluate the results more precisely, the tests were run at two different dilutions of the examined material. The resulting difficulty in evaluating the intensity of the reaction was solved by introducing a semi-quantitative classification method using an eleven-degree scale. The arithmetic mean classification values calculated from the absorbance readings were called the "absorbance index". This parameter proved especially useful when comparing antibody reactions in reproductive tract fluid samples.

  2. Infrared small target detection based on multiscale center-surround contrast measure

    NASA Astrophysics Data System (ADS)

    Fu, Hao; Long, Yunli; Zhu, Ran; An, Wei

    2018-04-01

    Infrared (IR) small target detection plays a critical role in the Infrared Search And Track (IRST) system. Although it has been studied for years, difficulties remain in cluttered environments. Following the principle by which humans discriminate small targets in a natural scene, namely that there is a signature of discontinuity between an object and its neighboring regions, we develop an efficient method for infrared small target detection called the multiscale center-surround contrast measure (MCSCM). First, to determine the maximum neighboring window size, an entropy-based window selection technique is used. Then, we construct a novel multiscale center-surround contrast measure to calculate the saliency map. Compared with the original image, the MCSCM map has less background clutter and noise residue. Subsequently, a simple threshold is used to segment the target. Experimental results show that our method achieves better performance.
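
    A basic center-surround contrast map, the local mean of a small centre box compared with the mean of a larger surrounding ring over a few scales, conveys the core idea; the MCSCM additionally selects the maximum window size with an entropy criterion. The window sizes and synthetic frame below are illustrative assumptions.

```python
# Basic center-surround contrast sketch for IR small-target detection: for
# each pixel, compare the mean of a small centre box with the mean of the
# surrounding ring, over a few scales, and keep the maximum response.
import numpy as np
from scipy.ndimage import uniform_filter

def center_surround(img, scales=((3, 9), (5, 15), (7, 21))):
    img = img.astype(float)
    response = np.zeros_like(img)
    for c, s in scales:
        center = uniform_filter(img, size=c)
        big = uniform_filter(img, size=s)
        surround = (big * s**2 - center * c**2) / (s**2 - c**2)  # ring mean
        response = np.maximum(response, center - surround)
    return response

rng = np.random.default_rng(7)
frame = rng.normal(100, 3, size=(256, 256))
frame[120:123, 200:203] += 25                  # 3x3 "small target"
saliency = center_surround(frame)
target_mask = saliency > 0.6 * saliency.max()
```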

  3. From where to what: a neuroanatomically based evolutionary model of the emergence of speech in humans

    PubMed Central

    Poliva, Oren

    2017-01-01

    In the brain of primates, the auditory cortex connects with the frontal lobe via the temporal pole (auditory ventral stream; AVS) and via the inferior parietal lobe (auditory dorsal stream; ADS). The AVS is responsible for sound recognition, and the ADS for sound-localization, voice detection and integration of calls with faces. I propose that the primary role of the ADS in non-human primates is the detection and response to contact calls. These calls are exchanged between tribe members (e.g., mother-offspring) and are used for monitoring location. Detection of contact calls occurs by the ADS identifying a voice, localizing it, and verifying that the corresponding face is out of sight. Once a contact call is detected, the primate produces a contact call in return via descending connections from the frontal lobe to a network of limbic and brainstem regions. Because the ADS of present day humans also performs speech production, I further propose an evolutionary course for the transition from contact call exchange to an early form of speech. In accordance with this model, structural changes to the ADS endowed early members of the genus Homo with partial vocal control. This development was beneficial as it enabled offspring to modify their contact calls with intonations for signaling high or low levels of distress to their mother. Eventually, individuals were capable of participating in yes-no question-answer conversations. In these conversations the offspring emitted a low-level distress call for inquiring about the safety of objects (e.g., food), and his/her mother responded with a high- or low-level distress call to signal approval or disapproval of the interaction. Gradually, the ADS and its connections with brainstem motor regions became more robust and vocal control became more volitional. Speech emerged once vocal control was sufficient for inventing novel calls. PMID:28928931

  4. Detectability in Audio-Visual Surveys of Tropical Rainforest Birds: The Influence of Species, Weather and Habitat Characteristics.

    PubMed

    Anderson, Alexander S; Marques, Tiago A; Shoo, Luke P; Williams, Stephen E

    2015-01-01

    Indices of relative abundance do not control for variation in detectability, which can bias density estimates such that ecological processes are difficult to infer. Distance sampling methods can be used to correct for detectability, but in rainforest, where dense vegetation and diverse assemblages complicate sampling, information is lacking about factors affecting their application. Rare species present an additional challenge, as data may be too sparse to fit detection functions. We present analyses of distance sampling data collected for a diverse tropical rainforest bird assemblage across broad elevational and latitudinal gradients in North Queensland, Australia. Using audio and visual detections, we assessed the influence of various factors on Effective Strip Width (ESW), an intuitively useful parameter, since it can be used to calculate an estimate of density from count data. Body size and species exerted the most important influence on ESW, with larger species detectable over greater distances than smaller species. Secondarily, wet weather and high shrub density decreased ESW for most species. ESW for several species also differed between summer and winter, possibly due to seasonal differences in calling behavior. Distance sampling proved logistically intensive in these environments, but large differences in ESW between species confirmed the need to correct for detection probability to obtain accurate density estimates. Our results suggest an evidence-based approach to controlling for factors influencing detectability, and avenues for further work including modeling detectability as a function of species characteristics such as body size and call characteristics. Such models may be useful in developing a calibration for non-distance sampling data and for estimating detectability of rare species.

  5. Detectability in Audio-Visual Surveys of Tropical Rainforest Birds: The Influence of Species, Weather and Habitat Characteristics

    PubMed Central

    Anderson, Alexander S.; Marques, Tiago A.; Shoo, Luke P.; Williams, Stephen E.

    2015-01-01

    Indices of relative abundance do not control for variation in detectability, which can bias density estimates such that ecological processes are difficult to infer. Distance sampling methods can be used to correct for detectability, but in rainforest, where dense vegetation and diverse assemblages complicate sampling, information is lacking about factors affecting their application. Rare species present an additional challenge, as data may be too sparse to fit detection functions. We present analyses of distance sampling data collected for a diverse tropical rainforest bird assemblage across broad elevational and latitudinal gradients in North Queensland, Australia. Using audio and visual detections, we assessed the influence of various factors on Effective Strip Width (ESW), an intuitively useful parameter, since it can be used to calculate an estimate of density from count data. Body size and species exerted the most important influence on ESW, with larger species detectable over greater distances than smaller species. Secondarily, wet weather and high shrub density decreased ESW for most species. ESW for several species also differed between summer and winter, possibly due to seasonal differences in calling behavior. Distance sampling proved logistically intensive in these environments, but large differences in ESW between species confirmed the need to correct for detection probability to obtain accurate density estimates. Our results suggest an evidence-based approach to controlling for factors influencing detectability, and avenues for further work including modeling detectability as a function of species characteristics such as body size and call characteristics. Such models may be useful in developing a calibration for non-distance sampling data and for estimating detectability of rare species. PMID:26110433
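
    As a quick illustration of how ESW feeds into density estimation for a line transect (the effectively surveyed area is twice the transect length times the ESW), the sketch below uses hypothetical numbers, not values from this study.

      # Density from counts and Effective Strip Width: D = n / (2 * L * w).
      def density_from_esw(n_detected, transect_length_m, esw_m):
          effective_area_m2 = 2.0 * transect_length_m * esw_m
          return n_detected / effective_area_m2   # individuals per square metre

      # 14 detections along a 500 m transect with an ESW of 35 m (hypothetical):
      d = density_from_esw(14, 500.0, 35.0)
      print(f"{d * 10_000:.2f} birds per hectare")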

  6. Scaling of echolocation call parameters in bats.

    PubMed

    Jones, G

    1999-12-01

    I investigated the scaling of echolocation call parameters (frequency, duration and repetition rate) in bats in a functional context. Low-duty-cycle bats operate with search phase cycles of usually less than 20 %. They process echoes in the time domain and are therefore intolerant of pulse-echo overlap. High-duty-cycle (>30 %) species use Doppler shift compensation, and they separate pulse and echo in the frequency domain. Call frequency scales negatively with body mass in at least five bat families. Pulse duration scales positively with mass in low-duty-cycle quasi-constant-frequency (QCF) species because the large aerial-hawking species that emit these signals fly fast in open habitats. They therefore detect distant targets and experience pulse-echo overlap later than do smaller bats. Pulse duration also scales positively with mass in the Hipposideridae, which show at least partial Doppler shift compensation. Pulse repetition rate corresponds closely with wingbeat frequency in QCF bat species that fly relatively slowly. Larger, fast-flying species often skip pulses when detecting distant targets. There is probably a trade-off between call intensity and repetition rate because 'whispering' bats (and hipposiderids) produce several calls per predicted wingbeat and because batches of calls are emitted per wingbeat during terminal buzzes. Severe atmospheric attenuation at high frequencies limits the range of high-frequency calls. Low-duty-cycle bats that call at high frequencies must therefore use short pulses to avoid pulse-echo overlap. Rhinolophids escape this constraint by Doppler shift compensation and, importantly, can exploit advantages associated with the emission of both high-frequency and long-duration calls. Low frequencies are unsuited for the detection of small prey, and low repetition rates may limit prey detection rates. Echolocation parameters may therefore constrain maximum body size in aerial-hawking bats.

  7. Environmental Influences On Diel Calling Behavior In Baleen Whales

    DTIC Science & Technology

    2015-09-30

    and calm seas were infrequent and short (Figure 1b), making traditional shipboard marine mammal observations difficult. The real time detection...first use of real-time detection and reporting of marine mammal calls from autonomous underwater vehicles to adaptively plan research activities. 3...conferences: • 6th International Workshop on Detection, Classification, Localization, and Density Estimation (DCLDE) of Marine Mammals using

  8. GiniClust: detecting rare cell types from single-cell gene expression data with Gini index.

    PubMed

    Jiang, Lan; Chen, Huidong; Pinello, Luca; Yuan, Guo-Cheng

    2016-07-01

    High-throughput single-cell technologies have great potential to discover new cell types; however, it remains challenging to detect rare cell types that are distinct from a large population. We present a novel computational method, called GiniClust, to overcome this challenge. Validation against a benchmark dataset indicates that GiniClust achieves high sensitivity and specificity. Application of GiniClust to public single-cell RNA-seq datasets uncovers previously unrecognized rare cell types, including Zscan4-expressing cells within mouse embryonic stem cells and hemoglobin-expressing cells in the mouse cortex and hippocampus. GiniClust also correctly detects a small number of normal cells that are mixed in a cancer cell population.
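
    The core intuition behind GiniClust is that genes whose expression is concentrated in a handful of cells have a high Gini index and are therefore informative for rare cell types. The sketch below shows only that gene-selection idea (a Gini index per gene on toy data); GiniClust's normalization and clustering steps are not reproduced.

      # Gini index per gene: 0 for uniform expression, approaching 1 when
      # expression is concentrated in very few cells.
      import numpy as np

      def gini_index(x):
          x = np.sort(np.asarray(x, dtype=float))
          n = x.size
          if x.sum() == 0:
              return 0.0
          cum = np.cumsum(x)
          # Standard Lorenz-curve formula for the Gini coefficient
          return (n + 1 - 2.0 * np.sum(cum) / cum[-1]) / n

      # Toy expression vectors: rows = genes, columns = cells.
      rng = np.random.default_rng(1)
      housekeeping = rng.poisson(10, size=200)          # expressed in all cells
      rare_marker = np.zeros(200)
      rare_marker[:5] = 50                              # expressed in 5 cells only
      print(round(gini_index(housekeeping), 2), round(gini_index(rare_marker), 2))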

  9. Developing the Quantitative Histopathology Image Ontology (QHIO): A case study using the hot spot detection problem.

    PubMed

    Gurcan, Metin N; Tomaszewski, John; Overton, James A; Doyle, Scott; Ruttenberg, Alan; Smith, Barry

    2017-02-01

    Interoperability across data sets is a key challenge for quantitative histopathological imaging. There is a need for an ontology that can support effective merging of pathological image data with associated clinical and demographic data. To foster organized, cross-disciplinary, information-driven collaborations in the pathological imaging field, we propose to develop an ontology to represent imaging data and methods used in pathological imaging and analysis, and call it Quantitative Histopathological Imaging Ontology - QHIO. We apply QHIO to breast cancer hot-spot detection with the goal of enhancing reliability of detection by promoting the sharing of data between image analysts. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Low frequency baleen whale calls detected on ocean-bottom seismometers in the Lau basin, southwest Pacific Ocean.

    PubMed

    Brodie, Dana C; Dunn, Robert A

    2015-01-01

    Ten months of broadband seismic data, recorded on six ocean-bottom seismographs located in the Lau Basin, were examined to identify baleen whale species. As the first systematic survey of baleen whales in this part of the southwest Pacific Ocean, this study reveals the variety of species present and their temporal occurrence in and near the basin. Baleen whales produce species-specific low frequency calls that can be identified by distinct patterns in data spectrograms. By matching spectrograms with published accounts, fin, Bryde's, Antarctic blue, and New Zealand blue whale calls were identified. Probable whale sounds that could not be matched to published spectrograms, as well as non-biologic sounds that are likely of volcanogenic origin, were also recorded. Detections of fin whale calls (mid-June to mid-October) and blue whale calls (June through September) suggest that these species migrate through the region seasonally. Detections of Bryde's whale calls (primarily February to June, but also other times of the year) suggest this species resides around the basin nearly year round. The discovery of previously unpublished call types emphasizes the limited knowledge of the full call repertoires of baleen whales and the utility of using seismic survey data to enhance understanding in understudied regions.

  11. An aerial-hawking bat uses stealth echolocation to counter moth hearing.

    PubMed

    Goerlitz, Holger R; ter Hofstede, Hannah M; Zeale, Matt R K; Jones, Gareth; Holderied, Marc W

    2010-09-14

    Ears evolved in many nocturnal insects, including some moths, to detect bat echolocation calls and evade capture [1, 2]. Although there is evidence that some bats emit echolocation calls that are inconspicuous to eared moths, it is difficult to determine whether this was an adaptation to moth hearing or originally evolved for a different purpose [2, 3]. Aerial-hawking bats generally emit high-amplitude echolocation calls to maximize detection range [4, 5]. Here we present the first example of an echolocation counterstrategy to overcome prey hearing at the cost of reduced detection distance. We combined comparative bat flight-path tracking and moth neurophysiology with fecal DNA analysis to show that the barbastelle, Barbastella barbastellus, emits calls that are 10 to 100 times lower in amplitude than those of other aerial-hawking bats, remains undetected by moths until close, and captures mainly eared moths. Model calculations demonstrate that only bats emitting such low-amplitude calls hear moth echoes before their calls are conspicuous to moths. This stealth echolocation allows the barbastelle to exploit food resources that are difficult to catch for other aerial-hawking bats emitting calls of greater amplitude. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Echolocation calls of Poey's flower bat (Phyllonycteris poeyi) unlike those of other phyllostomids.

    PubMed

    Mora, Emanuel C; Macías, Silvio

    2007-05-01

    Unlike any other foraging phyllostomid bat studied to date, Poey's flower bats (Phyllonycteris poeyi-Phyllostomidae) emit relatively long (up to 7.2 ms), intense, single-harmonic echolocation calls. These calls are readily detectable at distances of at least 15 m. Furthermore, the echolocation calls contain only the first harmonic, which is usually filtered out in the vocal tract of phyllostomids. The foraging echolocation calls of P. poeyi are more like search-phase echolocation calls of sympatric aerial-feeding bats (Molossidae, Vespertilionidae, Mormoopidae). Intense, long, narrowband, single-harmonic echolocation calls focus acoustic energy maximizing range and favoring detection, which may be particularly important for cruising bats, like P. poeyi, when flying in the open. Flying in enclosed spaces, P. poeyi emit short, low-intensity, frequency-modulated, multiharmonic echolocation calls typical of other phyllostomids. This is the first report of a phyllostomid species emitting long, intense, single-harmonic echolocation calls with most energy in the first harmonic.

  13. Effect of temporal and spectral noise features on gap detection behavior by calling green treefrogs.

    PubMed

    Höbel, Gerlinde

    2014-10-01

    Communication plays a central role in the behavioral ecology of many animals, yet the background noise generated by large breeding aggregations may impair effective communication. A common behavioral strategy to ameliorate noise interference is gap detection, where signalers display primarily during lulls in the background noise. When attempting gap detection, signalers have to deal with the fact that the spacing and duration of silent gaps is often unpredictable, and that noise varies in its spectral composition and may thus vary in the degree in which it impacts communication. I conducted playback experiments to examine how male treefrogs deal with the problem that refraining from calling while waiting for a gap to appear limits a male's ability to attract females, yet producing calls during noise also interferes with effective sexual communication. I found that the temporal structure of noise (i.e., duration of noise and silent gap segments) had a stronger effect on male calling behavior than the spectral composition. Males placed calls predominantly during silent gaps and avoided call production during short, but not long, noise segments. This suggests that male treefrogs use a calling strategy that maximizes the production of calls without interference, yet allows for calling to persist if lulls in the background noise are infrequent. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Global warming alters sound transmission: differential impact on the prey detection ability of echolocating bats

    PubMed Central

    Luo, Jinhong; Koselj, Klemen; Zsebők, Sándor; Siemers, Björn M.; Goerlitz, Holger R.

    2014-01-01

    Climate change impacts the biogeography and phenology of plants and animals, yet the underlying mechanisms are little known. Here, we present a functional link between rising temperature and the prey detection ability of echolocating bats. The maximum distance for echo-based prey detection is physically determined by sound attenuation. Attenuation is more pronounced for high-frequency sound, such as echolocation, and is a nonlinear function of both call frequency and ambient temperature. Hence, the prey detection ability, and thus possibly the foraging efficiency, of echolocating bats is susceptible to rising temperatures through climate change. Using present-day climate data and projected temperature rises, we modelled this effect for the entire range of bat call frequencies and climate zones around the globe. We show that depending on call frequency, the prey detection volume of bats will either decrease or increase: species calling above a crossover frequency will lose and species emitting lower frequencies will gain prey detection volume, with crossover frequency and magnitude depending on the local climatic conditions. Within local species assemblages, this may cause a change in community composition. Global warming can thus directly affect the prey detection ability of individual bats and indirectly their interspecific interactions with competitors and prey. PMID:24335559

  15. Global warming alters sound transmission: differential impact on the prey detection ability of echolocating bats.

    PubMed

    Luo, Jinhong; Koselj, Klemen; Zsebok, Sándor; Siemers, Björn M; Goerlitz, Holger R

    2014-02-06

    Climate change impacts the biogeography and phenology of plants and animals, yet the underlying mechanisms are little known. Here, we present a functional link between rising temperature and the prey detection ability of echolocating bats. The maximum distance for echo-based prey detection is physically determined by sound attenuation. Attenuation is more pronounced for high-frequency sound, such as echolocation, and is a nonlinear function of both call frequency and ambient temperature. Hence, the prey detection ability, and thus possibly the foraging efficiency, of echolocating bats is susceptible to rising temperatures through climate change. Using present-day climate data and projected temperature rises, we modelled this effect for the entire range of bat call frequencies and climate zones around the globe. We show that depending on call frequency, the prey detection volume of bats will either decrease or increase: species calling above a crossover frequency will lose and species emitting lower frequencies will gain prey detection volume, with crossover frequency and magnitude depending on the local climatic conditions. Within local species assemblages, this may cause a change in community composition. Global warming can thus directly affect the prey detection ability of individual bats and indirectly their interspecific interactions with competitors and prey.

  16. Tripartite community structure in social bookmarking data

    NASA Astrophysics Data System (ADS)

    Neubauer, Nicolas; Obermayer, Klaus

    2011-12-01

    Community detection is a branch of network analysis concerned with identifying strongly connected subnetworks. Social bookmarking sites aggregate datasets of often hundreds of millions of triples (document, user, and tag), which, when interpreted as edges of a graph, give rise to special networks called 3-partite, 3-uniform hypergraphs. We identify challenges and opportunities of generalizing community detection and in particular modularity optimization to these structures. Two methods for community detection are introduced that preserve the hypergraph's special structure to different degrees. Their performance is compared on synthetic datasets, showing the benefits of structure preservation. Furthermore, a tool for interactive exploration of the community detection results is introduced and applied to examples from real datasets. We find additional evidence for the importance of structure preservation and, more generally, demonstrate how tripartite community detection can help understand the structure of social bookmarking data.

  17. Fiber Scrambling for Extreme Doppler Precision

    NASA Astrophysics Data System (ADS)

    Spronck, Julien; Kaplan, Z.; Fischer, D.

    2011-09-01

    The detection of Earth-like exoplanets with the radial velocity method requires extreme Doppler precision and long-term stability in order to measure tiny reflex velocities in the host star. Recent planet searches have led to the detection of so-called “super-Earths” (up to a few Earth masses) that induce radial velocity changes of about 1 m/s. However, the detection of true Earth analogs requires a precision of 10 cm/s. One of the factors limiting Doppler precision is variation in the Point Spread Function (PSF) from observation to observation due to changes in the illumination of the slit and spectrograph optics. Thus, this stability has become a focus of current instrumentation work. Fiber optics have been used since the 1980s to couple telescopes to high-precision spectrographs, initially for simpler mechanical design and control. However, fiber optics are also naturally efficient scramblers. Scrambling refers to a fiber’s ability to produce an output beam independent of the input. Our research is focused on understanding the scrambling properties of fibers with different geometries (circular, square, octagonal), different lengths and fiber sizes. Another important fiber property is the so-called focal ratio degradation (FRD), whereby the focal ratio of the beam exiting the fiber is faster than that of the beam sent into it. In this paper, we present new insight on fiber scrambling, FRD and what we call fiber personality, which describes differing behaviors among supposedly identical fibers.

  18. Modeling of ultrasonic and terahertz radiations in defective tiles for condition monitoring of thermal protection systems

    NASA Astrophysics Data System (ADS)

    Kabiri Rahani, Ehsan

    Condition-based monitoring of Thermal Protection Systems (TPS) is necessary for safe operation of space shuttles when quick turn-around time is desired. In the current research, Terahertz radiation (T-ray) has been used to detect mechanical and heat-induced damage in TPS tiles. Voids and cracks inside the foam tile are denoted as mechanical damage, while property changes due to long- and short-term exposure of tiles to high heat are denoted as heat-induced damage. Ultrasonic waves cannot detect cracks and voids inside the tile because the tile material (silica foam) strongly attenuates ultrasonic energy. Instead, electromagnetic terahertz radiation can easily penetrate the foam material and detect internal voids, although this electromagnetic radiation finds it difficult to detect delaminations between the foam tile and the substrate plate. Thus these two technologies are complementary for TPS inspection. Ultrasonic and T-ray field modeling in free and mounted tiles with different types of mechanical and thermal damage has been the focus of this research. Shortcomings and limitations of the FEM method in modeling 3D problems, especially at high frequencies, are discussed, and a newly developed semi-analytical technique called the Distributed Point Source Method (DPSM) has been used for this purpose. A FORTRAN code called DPSM3D has been developed to model both ultrasonic and electromagnetic problems using the conventional DPSM method. This code is designed in a general form capable of modeling a variety of geometries. DPSM has been extended from ultrasonic to electromagnetic applications to model THz Gaussian beams, multilayered dielectrics and Gaussian beam-scatterer interaction problems. To overcome some drawbacks of the conventional DPSM, two modified methods, called G-DPSM and ESM, have been proposed. The conventional DPSM was previously only capable of solving time-harmonic (frequency-domain) problems; time histories were obtained by the FFT (Fast Fourier Transform) algorithm. In this research DPSM has been extended to model transient problems without using the FFT. This modified technique is denoted t-DPSM. Using DPSM, scattering of focused ultrasonic fields by single and multiple cavities in fluid and solid media is studied. It is investigated when two cavities in close proximity can be distinguished and when they cannot. A comparison between the radiation forces generated by the ultrasonic energies reflected from two small cavities versus a single large cavity is also carried out.

  19. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
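
    The fingerprint-and-group idea behind FAST can be sketched in a few lines: continuous data are cut into overlapping windows, each window is reduced to a compact fingerprint, and windows whose fingerprints collide in a hash table are reported as candidate repeating events. The real method uses wavelet-based fingerprints and locality-sensitive hashing; the sparse spectral fingerprint and exact-match grouping below are simplifications for illustration only.

      # Toy fingerprint-and-similarity-search sketch (not the FAST algorithm itself).
      import numpy as np
      from collections import defaultdict

      def fingerprint(window, thresh=5.0):
          """Sparse fingerprint: indices of spectral bins whose magnitude exceeds
          thresh times the median magnitude (empty for noise-only windows)."""
          spec = np.abs(np.fft.rfft(window))
          return tuple(int(i) for i in np.flatnonzero(spec > thresh * np.median(spec)))

      def find_similar_windows(trace, win=200, step=100, thresh=5.0):
          buckets = defaultdict(list)
          for start in range(0, len(trace) - win + 1, step):
              fp = fingerprint(trace[start:start + win], thresh)
              if fp:                                  # skip featureless windows
                  buckets[fp].append(start)
          return {fp: idx for fp, idx in buckets.items() if len(idx) > 1}

      # Toy trace: the same small "event" buried in noise at two different times.
      rng = np.random.default_rng(2)
      trace = rng.normal(0, 0.1, 5000)
      event = np.sin(2 * np.pi * 0.05 * np.arange(200))
      trace[1000:1200] += event
      trace[3600:3800] += event
      print(find_similar_windows(trace))   # windows near 1000 and 3600 share a fingerprint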

  20. Environmental Influences on Diel Calling Behavior in Baleen Whales

    DTIC Science & Technology

    2013-09-30

    to allow known calls (e.g., right whale upcall and gunshot, fin whale 20-Hz pulses, humpback whale downsweeps, sei whale low-frequency downsweeps...fin, humpback, sei, and North Atlantic right whales. Real-time detections were evaluated after recovery of the gliders by (1) comparing the acoustic...from both an aircraft and ship. The overall false detection rate for individual calls was 14%, and for right, humpback, and fin whales, false

  1. Remote sensing change detection methods to track deforestation and growth in threatened rainforests in Madre de Dios, Peru

    USGS Publications Warehouse

    Shermeyer, Jacob S.; Haack, Barry N.

    2015-01-01

    Two forestry-change detection methods are described, compared, and contrasted for estimating deforestation and growth in threatened forests in southern Peru from 2000 to 2010. The methods used in this study rely on freely available data, including atmospherically corrected Landsat 5 Thematic Mapper and Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation continuous fields (VCF). The two methods include a conventional supervised signature extraction method and a unique self-calibrating method called MODIS VCF guided forest/nonforest (FNF) masking. The process chain for each of these methods includes a threshold classification of MODIS VCF, training data or signature extraction, signature evaluation, k-nearest neighbor classification, analyst-guided reclassification, and postclassification image differencing to generate forest change maps. Comparisons of all methods were based on an accuracy assessment using 500 validation pixels. Results of this accuracy assessment indicate that FNF masking had a 5% higher overall accuracy and was superior to conventional supervised classification when estimating forest change. Both methods succeeded in classifying persistently forested and nonforested areas, and both had limitations when classifying forest change.
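
    Both workflows end in a threshold-and-difference step: a percent-tree-cover layer (MODIS VCF-like) is thresholded into forest/non-forest masks at two dates, and post-classification differencing labels each pixel as stable forest, stable non-forest, loss, or growth. The 30% threshold and the toy arrays in the sketch below are illustrative choices, not values taken from the study.

      # Threshold classification of a VCF-like layer plus post-classification differencing.
      import numpy as np

      def forest_mask(vcf_percent, threshold=30):
          return vcf_percent >= threshold          # True = forest

      def change_map(vcf_t0, vcf_t1, threshold=30):
          f0, f1 = forest_mask(vcf_t0, threshold), forest_mask(vcf_t1, threshold)
          change = np.zeros(vcf_t0.shape, dtype="U10")
          change[f0 & f1] = "forest"
          change[~f0 & ~f1] = "nonforest"
          change[f0 & ~f1] = "loss"
          change[~f0 & f1] = "growth"
          return change

      vcf_2000 = np.array([[80, 75], [10, 60]])    # hypothetical percent tree cover
      vcf_2010 = np.array([[82, 20], [55, 65]])
      print(change_map(vcf_2000, vcf_2010))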

  2. A comparison of acoustic monitoring methods for common anurans of the northeastern United States

    USGS Publications Warehouse

    Brauer, Corinne; Donovan, Therese; Mickey, Ruth M.; Katz, Jonathan; Mitchell, Brian R.

    2016-01-01

    Many anuran monitoring programs now include autonomous recording units (ARUs). These devices collect audio data for extended periods of time with little maintenance and at sites where traditional call surveys might be difficult. Additionally, computer software programs have grown increasingly accurate at automatically identifying the calls of species. However, increased automation may cause increased error. We collected 435 min of audio data with 2 types of ARUs at 10 wetland sites in Vermont and New York, USA, from 1 May to 1 July 2010. For each minute, we determined presence or absence of 4 anuran species (Hyla versicolor, Pseudacris crucifer, Anaxyrus americanus, and Lithobates clamitans) using 1) traditional human identification versus 2) computer-mediated identification with software package, Song Scope® (Wildlife Acoustics, Concord, MA). Detections were compared with a data set consisting of verified calls in order to quantify false positive, false negative, true positive, and true negative rates. Multinomial logistic regression analysis revealed a strong (P < 0.001) 3-way interaction between the ARU recorder type, identification method, and focal species, as well as a trend in the main effect of rain (P = 0.059). Overall, human surveyors had the lowest total error rate (<2%) compared with 18–31% total errors with automated methods. Total error rates varied by species, ranging from 4% for A. americanus to 26% for L. clamitans. The presence of rain may reduce false negative rates. For survey minutes where anurans were known to be calling, the odds of a false negative were increased when fewer individuals of the same species were calling.

  3. Weak-lensing detection of intracluster filaments with ground-based data

    NASA Astrophysics Data System (ADS)

    Maturi, Matteo; Merten, Julian

    2013-11-01

    According to the current standard model of cosmology, matter in the Universe arranges itself along a network of filamentary structure. These filaments connect the main nodes of this so-called "cosmic web", which are clusters of galaxies. Although its large-scale distribution is clearly characterized by numerical simulations, constraining the dark-matter content of the cosmic web in reality turns out to be difficult. The natural method of choice is gravitational lensing. However, the direct detection and mapping of the elusive filament signal is challenging and in this work we present two methods that are specifically tailored to achieve this task. A linear matched filter aims at detecting the smooth mass-component of filaments and is optimized to perform a shear decomposition that follows the anisotropic component of the lensing signal. Filaments clearly inherit this property due to their morphology. At the same time, the contamination arising from the central massive cluster is controlled in a natural way. The 1σ filament detection limit is about κ ~ 0.01-0.005, depending on the filter's template width and length, enabling the detection of structures beyond the reach of other approaches. The second, complementary method seeks to detect the clumpy component of filaments. The detection is determined by the number density of subclump identifications in an area enclosing the potential filament, as found within the observed field with the filter approach. We tested both methods against mocked observations based on realistic N-body simulations of filamentary structure and proved the feasibility of detecting filaments with ground-based data.

  4. Novel optical scanning cryptography using Fresnel telescope imaging.

    PubMed

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method called modified optical scanning cryptography, using a Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by the Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method has strong performance through the use of secure Fresnel telescope scanning with orthogonally polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  5. Intelligent monitoring and control of semiconductor manufacturing equipment

    NASA Technical Reports Server (NTRS)

    Murdock, Janet L.; Hayes-Roth, Barbara

    1991-01-01

    The use of AI methods to monitor and control semiconductor fabrication in a state-of-the-art manufacturing environment called the Rapid Thermal Multiprocessor is described. Semiconductor fabrication involves many complex processing steps with limited opportunities to measure process and product properties. By applying additional process and product knowledge to that limited data, AI methods augment classical control methods by detecting abnormalities and trends, predicting failures, diagnosing, planning corrective action sequences, explaining diagnoses or predictions, and reacting to anomalous conditions that classical control systems typically would not correct. Research methodology and issues are discussed, and two diagnosis scenarios are examined.

  6. In situ temperature measurement of α-mercuric iodide by reflection spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nason, D.; Burger, A.

    1991-12-30

    Crystal face temperatures of single crystals of α-HgI₂ growing in transparent ampules by physical vapor transport have been measured, in situ, by a novel, noncontact method which may be called reflectance spectroscopy thermometry. The method is based on the temperature dependence of the energy of the free-exciton peak as detected with a low-energy reflected beam. As presently configured, the accuracy is ±1.5 °C for a slowly varying surface temperature. The method has potential for noncontact temperature measurement in some systems for which pyrometry is unsatisfactory.

  7. Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort.

    PubMed

    Gambin, Tomasz; Akdemir, Zeynep C; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M B; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M; Eldomery, Mohammad K; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W; Boerwinkle, Eric; Beaudet, Arthur L; Gibbs, Richard A; Lupski, James R

    2017-02-28

    We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor-Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Of the 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR, and 64 were confirmed. These include 18 single-exon deletions, of which 8 were exclusively detected by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17-50% of pathogenic CNVs in different disease cohorts in which 7.1-11% of the molecular diagnostic rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
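
    The intuition behind coverage-based HMZ deletion calling from WES can be sketched simply: after normalizing per-exon read depth within each sample, an exon whose normalized depth is near zero in one sample but solidly covered across the rest of the cohort is a candidate homozygous or hemizygous deletion. The normalization, thresholds and toy matrix below are illustrative only and do not reproduce HMZDelFinder's actual statistical model.

      # Flag exons with near-zero normalized coverage in one sample but normal
      # coverage in the cohort (illustrative sketch, not HMZDelFinder).
      import numpy as np

      def candidate_hmz_deletions(depth, low_frac=0.1, cohort_min=0.5):
          """depth: exons x samples matrix of mean read depth.
          Returns (exon_index, sample_index) pairs flagged as candidate deletions."""
          # Normalize each sample by its own median depth (removes library-size effects)
          norm = depth / np.median(depth, axis=0, keepdims=True)
          cohort_median = np.median(norm, axis=1, keepdims=True)
          flags = (norm < low_frac) & (cohort_median > cohort_min)
          return [(int(i), int(j)) for i, j in zip(*np.nonzero(flags))]

      # Toy cohort: 4 exons x 5 samples; sample 2 has lost exon 1.
      depth = np.array([[100,  95, 110,  90, 105],
                        [ 80,  85,   2,  95,  88],
                        [120, 110, 115, 100, 108],
                        [ 60,  70,  65,  75,  68]])
      print(candidate_hmz_deletions(depth))   # expect [(1, 2)]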

  8. Evaluation of somatic copy number estimation tools for whole-exome sequencing data.

    PubMed

    Nam, Jae-Yong; Kim, Nayoung K D; Kim, Sang Cheol; Joung, Je-Gun; Xi, Ruibin; Lee, Semin; Park, Peter J; Park, Woong-Yang

    2016-03-01

    Whole-exome sequencing (WES) has become a standard method for detecting genetic variants in human diseases. Although the primary use of WES data has been the identification of single nucleotide variations and indels, these data also offer a possibility of detecting copy number variations (CNVs) at high resolution. However, WES data have uneven read coverage along the genome owing to the target capture step, and the development of a robust WES-based CNV tool is challenging. Here, we evaluate six WES somatic CNV detection tools: ADTEx, CONTRA, Control-FREEC, EXCAVATOR, ExomeCNV and Varscan2. Using WES data from 50 kidney chromophobe, 50 bladder urothelial carcinoma, and 50 stomach adenocarcinoma patients from The Cancer Genome Atlas, we compared the CNV calls from the six tools with a reference CNV set that was identified by both single nucleotide polymorphism array 6.0 and whole-genome sequencing data. We found that these algorithms gave highly variable results: visual inspection reveals significant differences between the WES-based segmentation profiles and the reference profile, as well as among the WES-based profiles. Using a 50% overlap criterion, 13-77% of WES CNV calls were covered by CNVs from the reference set, up to 21% of the copy gains were called as losses or vice versa, and dramatic differences in CNV sizes and CNV numbers were observed. Overall, ADTEx and EXCAVATOR had the best performance with relatively high precision and sensitivity. We suggest that the current algorithms for somatic CNV detection from WES data are limited in their performance and that more robust algorithms are needed. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
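
    A small sketch of an overlap rule such as the 50% criterion used when matching WES CNV calls to a reference set may be helpful. The reciprocal-overlap form shown here (the shared segment must cover at least a given fraction of both intervals) is a common convention; the abstract does not spell out its exact definition, so treat this as an assumption.

      # Reciprocal-overlap test between two CNV intervals on the same chromosome.
      def reciprocal_overlap(call, ref, frac=0.5):
          """call, ref: (start, end) intervals; True if overlap >= frac of both."""
          ov = min(call[1], ref[1]) - max(call[0], ref[0])
          if ov <= 0:
              return False
          return ov >= frac * (call[1] - call[0]) and ov >= frac * (ref[1] - ref[0])

      print(reciprocal_overlap((1_000, 5_000), (3_500, 6_000)))   # False: 1.5 kb < half of the 4 kb call
      print(reciprocal_overlap((1_000, 5_000), (2_000, 5_500)))   # True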

  9. Improving the Navy’s Passive Underwater Acoustic Monitoring of Marine Mammal Populations

    DTIC Science & Technology

    2013-09-30

    passive acoustic monitoring: Correcting humpback whale call detections for site-specific and time-dependent environmental characteristics ,” JASA Exp...marine mammal species using passive acoustic monitoring, with application to obtaining density estimates of transiting humpback whale populations in...minimize the variance of the density estimates, 3) to apply the numerical modeling methods for humpback whale vocalizations to understand distortions

  10. A challenging issue: Detection of white matter hyperintensities in neonatal brain MRI.

    PubMed

    Morel, Baptiste; Yongchao Xu; Virzi, Alessio; Geraud, Thierry; Adamsbaum, Catherine; Bloch, Isabelle

    2016-08-01

    The progress of magnetic resonance imaging (MRI) allows for a precise exploration of the brain of premature infants at term equivalent age. The so-called DEHSI (diffuse excessive high signal intensity) of the white matter of premature brains remains a challenging issue in terms of definition, and thus of interpretation. We propose a semi-automatic detection and quantification method of white matter hyperintensities in MRI relying on morphological operators and max-tree representations, which constitutes a powerful tool to help radiologists to improve their interpretation. Results show better reproducibility and robustness than interactive segmentation.

  11. Multiband tissue classification for ultrasonic transmission tomography using spectral profile detection

    NASA Astrophysics Data System (ADS)

    Jeong, Jeong-Won; Kim, Tae-Seong; Shin, Dae-Chul; Do, Synho; Marmarelis, Vasilis Z.

    2004-04-01

    Recently it was shown that soft tissue can be differentiated with spectral unmixing and detection methods that utilize multi-band information obtained from a High-Resolution Ultrasonic Transmission Tomography (HUTT) system. In this study, we focus on tissue differentiation using the spectral target detection method based on Constrained Energy Minimization (CEM). We have developed a new tissue differentiation method called "CEM filter bank". Statistical inference on the output of each CEM filter of a filter bank is used to make a decision based on the maximum statistical significance rather than the magnitude of each CEM filter output. We validate this method through 3-D inter/intra-phantom soft tissue classification where target profiles obtained from an arbitrary single slice are used for differentiation in multiple tomographic slices. Also spectral coherence between target and object profiles of an identical tissue at different slices and phantoms is evaluated by conventional cross-correlation analysis. The performance of the proposed classifier is assessed using Receiver Operating Characteristic (ROC) analysis. Finally we apply our method to classify tiny structures inside a beef kidney such as Styrofoam balls (~1mm), chicken tissue (~5mm), and vessel-duct structures.
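
    A hedged sketch of a single Constrained Energy Minimization (CEM) filter, the building block of the proposed filter bank: given a target spectral signature d and the data correlation matrix R, the filter w = R⁻¹d / (dᵀR⁻¹d) minimizes the average output energy while passing the target signature with unit gain. The filter-bank decision by statistical significance described in the abstract is not reproduced here, and the data are synthetic.

      # Basic CEM filter applied to synthetic multiband pixels.
      import numpy as np

      def cem_filter(pixels, target):
          """pixels: (n_pixels, n_bands) multiband data; target: (n_bands,) signature."""
          R = pixels.T @ pixels / pixels.shape[0]      # sample correlation matrix
          Rinv_d = np.linalg.solve(R, target)
          w = Rinv_d / (target @ Rinv_d)               # unit gain on the target signature
          return pixels @ w                            # filter output per pixel

      rng = np.random.default_rng(3)
      background = rng.normal(1.0, 0.2, size=(500, 8))
      target_sig = np.linspace(0.2, 2.0, 8)
      pixels = np.vstack([background, target_sig + rng.normal(0, 0.05, size=(5, 8))])
      scores = cem_filter(pixels, target_sig)
      print(scores[:500].mean().round(2), scores[500:].mean().round(2))  # target pixels score ≈ 1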

  12. VarDict: a novel and versatile variant caller for next-generation sequencing in cancer research

    PubMed Central

    Lai, Zhongwu; Markovets, Aleksandra; Ahdesmaki, Miika; Chapman, Brad; Hofmann, Oliver; McEwen, Robert; Johnson, Justin; Dougherty, Brian; Barrett, J. Carl; Dry, Jonathan R.

    2016-01-01

    Abstract Accurate variant calling in next generation sequencing (NGS) is critical to understand cancer genomes better. Here we present VarDict, a novel and versatile variant caller for both DNA- and RNA-sequencing data. VarDict simultaneously calls SNV, MNV, InDels, complex and structural variants, expanding the detected genetic driver landscape of tumors. It performs local realignments on the fly for more accurate allele frequency estimation. VarDict performance scales linearly to sequencing depth, enabling ultra-deep sequencing used to explore tumor evolution or detect tumor DNA circulating in blood. In addition, VarDict performs amplicon aware variant calling for polymerase chain reaction (PCR)-based targeted sequencing often used in diagnostic settings, and is able to detect PCR artifacts. Finally, VarDict also detects differences in somatic and loss of heterozygosity variants between paired samples. VarDict reprocessing of The Cancer Genome Atlas (TCGA) Lung Adenocarcinoma dataset called known driver mutations in KRAS, EGFR, BRAF, PIK3CA and MET in 16% more patients than previously published variant calls. We believe VarDict will greatly facilitate application of NGS in clinical cancer research. PMID:27060149

  13. Field evaluation of distance-estimation error during wetland-dependent bird surveys

    USGS Publications Warehouse

    Nadeau, Christopher P.; Conway, Courtney J.

    2012-01-01

    Context: The most common methods to estimate detection probability during avian point-count surveys involve recording a distance between the survey point and individual birds detected during the survey period. Accurately measuring or estimating distance is an important assumption of these methods; however, this assumption is rarely tested in the context of aural avian point-count surveys. Aims: We expand on recent bird-simulation studies to document the error associated with estimating distance to calling birds in a wetland ecosystem. Methods: We used two approaches to estimate the error associated with five surveyors' distance estimates between the survey point and calling birds, and to determine the factors that affect a surveyor's ability to estimate distance. Key results: We observed biased and imprecise distance estimates when estimating distance to simulated birds in a point-count scenario (mean error = -9 m, s.d. = 47 m) and when estimating distances to real birds during field trials (mean error = 39 m, s.d. = 79 m). The amount of bias and precision in distance estimates differed among surveyors; surveyors with more training and experience were less biased and more precise when estimating distance to both real and simulated birds. Three environmental factors were important in explaining the error associated with distance estimates: the measured distance from the bird to the surveyor, the volume of the call and the species of bird. Surveyors tended to make large overestimations to birds close to the survey point, which is an especially serious error in distance sampling. Conclusions: Our results suggest that distance-estimation error is prevalent, but surveyor training may be the easiest way to reduce distance-estimation error. Implications: The present study has demonstrated how relatively simple field trials can be used to estimate the error associated with distance estimates used to estimate detection probability during avian point-count surveys. Evaluating distance-estimation errors will allow investigators to better evaluate the accuracy of avian density and trend estimates. Moreover, investigators who evaluate distance-estimation errors could employ recently developed models to incorporate distance-estimation error into analyses. We encourage further development of such models, including the inclusion of such models into distance-analysis software.

  14. Directional ratio based on parabolic molecules and its application to the analysis of tubular structures

    NASA Astrophysics Data System (ADS)

    Labate, Demetrio; Negi, Pooran; Ozcan, Burcin; Papadakis, Manos

    2015-09-01

    As advances in imaging technologies make more and more data available for biomedical applications, there is an increasing need to develop efficient quantitative algorithms for the analysis and processing of imaging data. In this paper, we introduce an innovative multiscale approach called Directional Ratio which is especially effective at distinguishing isotropic from anisotropic structures. This task is especially useful in the analysis of images of neurons, the main units of the nervous system, which consist of a main cell body called the soma and many elongated processes called neurites. We analyze the theoretical properties of our method on idealized models of neurons and develop a numerical implementation of this approach for the analysis of fluorescent images of cultured neurons. We show that this algorithm is very effective for the detection of somas and the extraction of neurites in images of small circuits of neurons.

  15. Design of parity generator and checker circuit using electro-optic effect of Mach-Zehnder interferometers

    NASA Astrophysics Data System (ADS)

    Kumar, Santosh; Chanderkanta; Amphawan, Angela

    2016-04-01

    Parity is an extra bit added to digital information to detect errors at the receiver end. It can be even or odd parity. In even parity, the number of ones, including the parity bit, is even; the reverse holds for odd parity. The circuit used to generate the parity bit at the transmitter side is called the parity generator, and the circuit used to check the parity at the receiver side is called the parity checker. In this paper, even and odd parity generator and checker circuits are designed using the electro-optic effect inside lithium niobate based Mach-Zehnder interferometers (MZIs). The MZI structures collectively show a powerful capability for switching an input optical signal to a desired output port from a collection of output ports. The paper provides a mathematical description of the proposed device and simulations using MATLAB. The study is verified using the beam propagation method (BPM).
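
    For reference, the logic that the optical circuit implements can be written in ordinary software: an even-parity generator XORs the data bits to form the parity bit, and the checker XORs data plus parity and flags an error when the result is 1. The MZI-based realization in the paper maps this same truth table onto optical switches; that hardware is not modelled here.

      # Even-parity generation and checking with XOR.
      from functools import reduce
      from operator import xor

      def even_parity_bit(data_bits):
          return reduce(xor, data_bits, 0)            # 1 if the count of ones is odd

      def parity_check(data_bits, parity_bit):
          return reduce(xor, data_bits, parity_bit) == 0   # True = no error detected

      word = [1, 0, 1, 1]
      p = even_parity_bit(word)                       # p = 1, making the total even
      print(p, parity_check(word, p))                 # 1 True
      corrupted = [1, 0, 0, 1]                        # single-bit error
      print(parity_check(corrupted, p))               # False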

  16. Turnover of Lipidated LC3 and Autophagic Cargoes in Mammalian Cells.

    PubMed

    Rodríguez-Arribas, M; Yakhine-Diop, S M S; González-Polo, R A; Niso-Santano, M; Fuentes, J M

    2017-01-01

    Macroautophagy (usually referred to as autophagy) is the most important degradation system in mammalian cells. It is responsible for the elimination of protein aggregates, organelles, and other cellular content. During autophagy, these materials (i.e., cargo) must be engulfed by a double-membrane structure called an autophagosome, which delivers the cargo to the lysosome to complete its degradation. Autophagy is a highly dynamic pathway, and its progression is referred to as autophagic flux, which encompasses all the steps from autophagosome formation to cargo degradation. There are several techniques to monitor autophagic flux. Among them, the method most used experimentally to assess autophagy is the detection of LC3 protein processing and p62 degradation by Western blotting. In this chapter, we provide a detailed and straightforward protocol for this purpose in cultured mammalian cells, including a brief set of notes concerning problems associated with the Western-blotting detection of LC3 and p62. © 2017 Elsevier Inc. All rights reserved.

  17. Poisson denoising on the sphere

    NASA Astrophysics Data System (ADS)

    Schmitt, J.; Starck, J. L.; Fadili, J.; Grenier, I.; Casandjian, J. M.

    2009-08-01

    In the scope of the Fermi mission, Poisson noise removal should improve data quality and make source detection easier. This paper presents a method for Poisson data denoising on the sphere, called the Multi-Scale Variance Stabilizing Transform on the Sphere (MS-VSTS). This method is based on a Variance Stabilizing Transform (VST), a transform which aims to stabilize a Poisson data set such that each stabilized sample has an (asymptotically) constant variance. In addition, for the VST used in the method, the transformed data are asymptotically Gaussian. Thus, MS-VSTS consists of decomposing the data into a sparse multi-scale dictionary (wavelets, curvelets, ridgelets...), and then applying a VST to the coefficients in order to get quasi-Gaussian stabilized coefficients. In the present article, the multi-scale transform used is the Isotropic Undecimated Wavelet Transform. Then, hypothesis tests are made to detect significant coefficients, and the denoised image is reconstructed with an iterative method based on Hybrid Steepest Descent (HSD). The method is tested on simulated Fermi data.
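
    The variance-stabilizing idea can be illustrated with the classical Anscombe transform 2*sqrt(x + 3/8): after transformation, Poisson samples have an approximately constant (unit) variance regardless of their mean, so Gaussian-style thresholding of coefficients becomes possible. MS-VSTS applies the VST inside a multiscale decomposition on the sphere, which this simple sketch does not reproduce.

      # Anscombe transform as a minimal example of a Poisson VST.
      import numpy as np

      def anscombe(x):
          return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

      rng = np.random.default_rng(4)
      for lam in (2, 10, 50):
          samples = rng.poisson(lam, size=100_000)
          print(f"lambda={lam:3d}  raw var={samples.var():6.2f}  "
                f"stabilized var={anscombe(samples).var():4.2f}")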

  18. Blind Linguistic Steganalysis against Translation Based Steganography

    NASA Astrophysics Data System (ADS)

    Chen, Zhili; Huang, Liusheng; Meng, Peng; Yang, Wei; Miao, Haibo

    Translation based steganography (TBS) is a kind of relatively new and secure linguistic steganography. It takes advantage of the "noise" created by automatic translation of natural language text to encode the secret information. To date, there has been little research on steganalysis against this kind of linguistic steganography. In this paper, a blind steganalytic method named natural frequency zoned word distribution analysis (NFZ-WDA) is presented. This method improves on a previously proposed linguistic steganalysis method based on word distribution, which targeted the detection of linguistic steganography like nicetext and texto. The new method aims to detect the application of TBS and uses none of the related information about TBS; its only resource is a word frequency dictionary obtained from a large corpus, a so-called natural frequency dictionary, so it is totally blind. To verify the effectiveness of NFZ-WDA, two experiments with two-class and multi-class SVM classifiers, respectively, are carried out. The experimental results show that the steganalytic method is quite promising.

  19. Effective Prediction of Errors by Non-native Speakers Using Decision Tree for Speech Recognition-Based CALL System

    NASA Astrophysics Data System (ADS)

    Wang, Hongcui; Kawahara, Tatsuya

    CALL (Computer Assisted Language Learning) systems using ASR (Automatic Speech Recognition) for second language learning have received increasing interest recently. However, it remains a challenge to achieve high speech recognition performance, including accurate detection of erroneous utterances by non-native speakers. Conventionally, possible error patterns, based on linguistic knowledge, are added to the lexicon and language model, or the ASR grammar network. However, this approach quickly runs into a trade-off between error coverage and increased perplexity. To solve the problem, we propose a method based on a decision tree to learn effective prediction of errors made by non-native speakers. An experimental evaluation with a number of foreign students learning Japanese shows that the proposed method can effectively generate an ASR grammar network, given a target sentence, achieving both better coverage of errors and lower perplexity, resulting in significant improvement in ASR accuracy.

  20. Automatic Hypocenter Determination Method in JMA Catalog and its Application

    NASA Astrophysics Data System (ADS)

    Tamaribuchi, K.

    2017-12-01

    The number of detectable earthquakes around Japan has increased with the development of the high-sensitivity seismic observation network. After the 2011 Tohoku-oki earthquake, the number of detectable earthquakes increased dramatically due to aftershocks and induced earthquakes. This enormous number of earthquakes made manual determination of all hypocenters impossible. The Japan Meteorological Agency (JMA), which produces the earthquake catalog in Japan, has developed a new automatic hypocenter determination method and started its operation on April 1, 2016. This method (named the PF method, for Phase combination Forward search method) can determine the hypocenters of earthquakes that occur simultaneously by searching for the optimal combination of P- and S-wave arrival times and the maximum amplitudes using a Bayesian estimation technique. In the 2016 Kumamoto earthquake sequence, we successfully detected about 70,000 aftershocks automatically during the period from April 14 to the end of May, and this method contributed to the real-time monitoring of the seismic activity. Furthermore, this method can also be applied to Earthquake Early Warning (EEW). The application of this method to EEW is called the IPF method and has been used as the hypocenter determination method of the EEW system in JMA since December 2016. By developing this method further, it is possible to contribute not only to speeding up catalog production, but also to improving the reliability of early warnings.

  1. Lead Apron Inspection Using Infrared Light: A Model Validation Study.

    PubMed

    McKenney, Sarah E; Otero, Hansel J; Fricke, Stanley T

    2018-02-01

    To evaluate defect detection in radiation protective apparel, typically called lead aprons, using infrared (IR) thermal imaging. The use of IR lighting eliminates the need for access to x-ray-emitting equipment and radiation dose to the inspector. The performance of radiation workers was prospectively assessed using both a tactile inspection and the IR inspection with a lead apron phantom over a 2-month period. The phantom was a modified lead apron with a series of nine holes of increasing diameter ranging from 2 to 35 mm in accordance with typical rejection criteria. Using the tactile method, a radiation worker would feel for the defects in the lead apron. For the IR inspection, a 250-W IR light source was used to illuminate the lead apron phantom; an IR camera detected the transmitted radiation. The radiation workers evaluated two stills from the IR camera. Of the 31 participants who inspected the lead apron phantom with the tactile method, only 2 (6%) correctly discovered all 9 holes, and 1 participant reported a defect that was not there; 10 of the 20 participants (50%) correctly identified all 9 holes using the IR method. Using a weighted average, 5.4 defects were detected with the tactile method and 7.5 defects were detected with the IR method. IR light can penetrate an apron's protective outer fabric and illuminate defects below the current standard rejection size criteria. The IR method improves defect detectability as compared with the tactile method. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  2. Applicability of a Conservative Margin Approach for Assessing NDE Flaw Detectability

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2007-01-01

    Nondestructive Evaluation (NDE) procedures are required to detect flaws in structures with a high percentage detectability and high confidence. Conventional Probability of Detection (POD) methods are statistical in nature and require detection data from a relatively large number of flaw specimens. In many circumstances, due to the high cost and long lead time, it is impractical to build the large set of flaw specimens that is required by the conventional POD methodology. Therefore, in such situations it is desirable to have a flaw detectability estimation approach that allows for a reduced number of flaw specimens but provides a high degree of confidence in establishing the flaw detectability size. This paper presents an alternative approach called the conservative margin approach (CMA). To investigate the applicability of the CMA approach, flaw detectability sizes determined by the CMA and POD approaches have been compared on actual datasets. The results of these comparisons are presented and the applicability of the CMA approach is discussed.

  3. Selective object encryption for privacy protection

    NASA Astrophysics Data System (ADS)

    Zhou, Yicong; Panetta, Karen; Cherukuri, Ravindranath; Agaian, Sos

    2009-05-01

    This paper introduces a new recursive sequence called the truncated P-Fibonacci sequence, its corresponding binary code called the truncated Fibonacci p-code, and a new bit-plane decomposition method using the truncated Fibonacci p-code. In addition, a new lossless image encryption algorithm is presented that can encrypt a selected object using this new decomposition method for privacy protection. The user has the flexibility (1) to define the object to be protected as an object in an image or in a specific part of the image, a selected region of an image, or an entire image, (2) to utilize any new or existing method for edge detection or segmentation to extract the selected object from an image or a specific part/region of the image, and (3) to select any new or existing method for the shuffling process. The algorithm can be used in many different areas such as wireless networking, mobile phone services and applications in homeland security and medical imaging. Simulation results and analysis verify that the algorithm shows good performance in object/image encryption and can withstand plaintext attacks.
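    For context, the sketch below generates the standard Fibonacci p-sequence (F(n) = F(n-1) + F(n-p-1)) and uses it for a greedy decomposition of a pixel value into p-code bits. This only illustrates the underlying sequence; the paper's truncated variant and its exact decomposition rules are not reproduced here.

```python
# Illustrative sketch of the standard Fibonacci p-sequence and a greedy p-code
# representation; the paper's "truncated" variant is not reproduced here.
def fibonacci_p_sequence(p, count):
    """F(n) = F(n-1) + F(n-p-1); the first p+1 terms are 1."""
    seq = [1] * (p + 1)
    while len(seq) < count:
        seq.append(seq[-1] + seq[-(p + 1)])
    return seq

def fibonacci_p_code(value, p, length):
    """Greedy decomposition of a pixel value into Fibonacci p-code bits."""
    weights = fibonacci_p_sequence(p, length)[::-1]  # largest weight first
    bits = []
    for w in weights:
        if w <= value:
            bits.append(1)
            value -= w
        else:
            bits.append(0)
    return bits

# Example: decompose the 8-bit pixel value 200 using p = 1.
print(fibonacci_p_code(200, p=1, length=12))
```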

  4. ICPD-a new peak detection algorithm for LC/MS.

    PubMed

    Zhang, Jianqiu; Haskins, William

    2010-12-01

    The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low abundance proteins extremely challenging. As a result, the inability to identify low abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD) for high resolution LC/MS. In LC/MS, peptides elute during a certain time period and, as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide's existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. The performance of the new algorithm is evaluated based on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low abundance proteins than other LC/MS peak detection methods.
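    The hedged sketch below illustrates the general idea of pooling isotope-pattern evidence over multiple scans: a per-scan match score is computed against the theoretical isotope envelope and the log-scores are summed. The cosine-similarity score and the synthetic envelopes are assumptions for illustration; the actual ICPD likelihood model is not reproduced.

```python
# Minimal sketch of combining isotope-pattern evidence across scans: per-scan
# match scores are treated as independent log-likelihood contributions.
import numpy as np

def isotope_match_score(observed, theoretical):
    """Cosine similarity between an observed and theoretical isotope pattern."""
    observed = np.asarray(observed, dtype=float)
    theoretical = np.asarray(theoretical, dtype=float)
    return float(observed @ theoretical /
                 (np.linalg.norm(observed) * np.linalg.norm(theoretical)))

def combined_evidence(scans, theoretical):
    """Sum log-scores over the scans in which the peptide elutes."""
    eps = 1e-12
    return sum(np.log(max(isotope_match_score(s, theoretical), eps)) for s in scans)

theoretical = [1.0, 0.6, 0.25, 0.08]          # expected isotope envelope (toy)
scans = [[1.0, 0.55, 0.3, 0.1],               # the same pattern seen in 3 scans
         [0.9, 0.62, 0.2, 0.05],
         [1.1, 0.58, 0.27, 0.09]]
print(combined_evidence(scans, theoretical))
```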

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christoph, G.G; Jackson, K.A.; Neuman, M.C.

    An effective method for detecting computer misuse is the automatic auditing and analysis of on-line user activity. This activity is reflected in the system audit record, by changes in the vulnerability posture of the system configuration, and in other evidence found through active testing of the system. In 1989 we started developing an automatic misuse detection system for the Integrated Computing Network (ICN) at Los Alamos National Laboratory. Since 1990 this system has been operational, monitoring a variety of network systems and services. We call it the Network Anomaly Detection and Intrusion Reporter, or NADIR. During the last year and a half, we expanded NADIR to include processing of audit and activity records for the Cray UNICOS operating system. This new component is called the UNICOS Real-time NADIR, or UNICORN. UNICORN summarizes user activity and system configuration information in statistical profiles. In near real-time, it can compare current activity to historical profiles and test activity against expert rules that express our security policy and define improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. UNICORN is currently operational on four Crays in Los Alamos' main computing network, the ICN.

  6. Amphibian Bioacoustics

    NASA Astrophysics Data System (ADS)

    Christensen-Dalsgaard, Jakob

    Most of the roughly 3,500 species of anuran amphibians (frogs and toads) that exist today are highly vocal animals. In most frogs, males will spend considerable energy on calling and incur sizeable predation risks, and the females’ detection and localization of the calls of conspecific males is often a prerequisite for successful mating. Therefore, acoustic communication is evidently evolutionarily important in the anurans, and their auditory system is probably shaped by the selective pressures associated with production, detection and localization of the communication calls.

  7. Quality control and quality assurance in genotypic data for genome-wide association studies

    PubMed Central

    Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.

    2011-01-01

    Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
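    One of the checks listed above, the dependence of Hardy-Weinberg equilibrium test p-values on allele frequency, can be illustrated with the minimal per-SNP HWE chi-square test sketched below; the genotype counts are invented and the GENEVA pipeline's actual implementation is not reproduced.

```python
# Sketch of a single-SNP Hardy-Weinberg equilibrium (HWE) chi-square test from
# observed genotype counts; plotting HWE p-values against allele frequency can
# then expose genotyping artifacts as described above.
from scipy.stats import chi2

def hwe_chisq_pvalue(n_aa, n_ab, n_bb):
    """Return the 1-df chi-square HWE p-value for genotype counts AA, AB, BB."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)
    return chi2.sf(stat, df=1)

# Invented counts for one SNP across 1000 samples.
print(hwe_chisq_pvalue(n_aa=450, n_ab=430, n_bb=120))
```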

  8. Improved detection of CXCR4-using HIV by V3 genotyping: application of population-based and "deep" sequencing to plasma RNA and proviral DNA.

    PubMed

    Swenson, Luke C; Moores, Andrew; Low, Andrew J; Thielen, Alexander; Dong, Winnie; Woods, Conan; Jensen, Mark A; Wynhoven, Brian; Chan, Dennison; Glascock, Christopher; Harrigan, P Richard

    2010-08-01

    Tropism testing should rule out CXCR4-using HIV before treatment with CCR5 antagonists. Currently, the recombinant phenotypic Trofile assay (Monogram) is most widely utilized; however, genotypic tests may represent alternative methods. Independent triplicate amplifications of the HIV gp120 V3 region were made from either plasma HIV RNA or proviral DNA. These underwent standard, population-based sequencing with an ABI3730 (RNA n = 63; DNA n = 40), or "deep" sequencing with a Roche/454 Genome Sequencer-FLX (RNA n = 12; DNA n = 12). Position-specific scoring matrices (PSSMX4/R5) (-6.96 cutoff) and geno2pheno[coreceptor] (5% false-positive rate) inferred tropism from V3 sequence. These methods were then independently validated with a separate, blinded dataset (n = 278) of screening samples from the maraviroc MOTIVATE trials. Standard sequencing of HIV RNA with PSSM yielded 69% sensitivity and 91% specificity, relative to Trofile. The validation dataset gave 75% sensitivity and 83% specificity. Proviral DNA plus PSSM gave 77% sensitivity and 71% specificity. "Deep" sequencing of HIV RNA detected >2% inferred-CXCR4-using virus in 8/8 samples called non-R5 by Trofile, and <2% in 4/4 samples called R5. Triplicate analyses of V3 standard sequence data detect greater proportions of CXCR4-using samples than previously achieved. Sequencing proviral DNA and "deep" V3 sequencing may also be useful tools for assessing tropism.
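    To illustrate the genotypic scoring step, the sketch below applies a toy position-specific scoring matrix to a V3 fragment and compares the summed score against the -6.96 cutoff mentioned above. The matrix entries, the sequence, and the convention that scores at or above the cutoff indicate CXCR4 use are illustrative assumptions, not the published PSSM X4/R5 matrix.

```python
# Hedged sketch of PSSM-based tropism calling: a V3 amino-acid fragment is
# scored position by position and compared with a fixed cutoff. The matrix
# below is a toy; the real PSSM X4/R5 matrix spans the full V3 loop.
TOY_PSSM = {  # per-position, per-residue scores (illustrative only)
    0: {"C": 0.5, "T": -0.2},
    1: {"R": 1.2, "S": -0.8},
    2: {"P": 0.1, "K": 2.0},
}

def pssm_score(v3_fragment, pssm):
    """Sum per-position residue scores; unknown residues score 0."""
    return sum(pssm.get(i, {}).get(res, 0.0) for i, res in enumerate(v3_fragment))

CUTOFF = -6.96  # cutoff quoted in the abstract
score = pssm_score("CRP", TOY_PSSM)
# Assumed convention: scores at or above the cutoff are treated as CXCR4-using.
tropism = "non-R5 (CXCR4-using)" if score >= CUTOFF else "R5"
print(score, tropism)
```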

  9. Pattern-histogram-based temporal change detection using personal chest radiographs

    NASA Astrophysics Data System (ADS)

    Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-05-01

    An accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, the loss of depth information, the elasticity of the object, the absence of clearly defined landmarks and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any differences in the pattern histograms imply that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both of the images. We found that the method can be used as an alternative way of detecting temporal change, particularly when precise image registration is not available.
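    A minimal sketch of the pattern-histogram idea, under simplifying assumptions (small synthetic images, 3x3 patches as pattern vectors, 16 k-means bins), is given below; the paper's successive partitioning scheme and visualization step are not reproduced.

```python
# Sketch: local pattern vectors from two images are quantized with k-means and
# the resulting histograms are compared bin by bin; no registration is used.
import numpy as np
from sklearn.cluster import KMeans

def pattern_vectors(image, patch=3):
    """Collect flattened patch x patch neighbourhoods as pattern vectors."""
    h, w = image.shape
    vecs = [image[i:i + patch, j:j + patch].ravel()
            for i in range(h - patch + 1) for j in range(w - patch + 1)]
    return np.array(vecs, dtype=float)

rng = np.random.default_rng(0)
img_a = rng.random((32, 32))
img_b = img_a.copy()
img_b[10:15, 10:15] += 0.8          # a local "interval change"

vectors = np.vstack([pattern_vectors(img_a), pattern_vectors(img_b)])
km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(vectors)

hist_a = np.bincount(km.predict(pattern_vectors(img_a)), minlength=16)
hist_b = np.bincount(km.predict(pattern_vectors(img_b)), minlength=16)
print(np.abs(hist_a - hist_b))      # bins that differ flag changed patterns
```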

  10. Animals as Mobile Biological Sensors for Forest Fire Detection.

    PubMed

    Sahin, Yasar Guneri

    2007-12-04

    This paper proposes a mobile biological sensor system that can assist in early detection of forest fires, one of the most dreaded natural disasters on the earth. The main idea presented in this paper is to utilize animals with sensors as Mobile Biological Sensors (MBS). The devices used in this system are animals native to the forests, sensors (thermo and radiation sensors with GPS features) that measure the temperature and transmit the location of the MBS, access points for wireless communication and a central computer system which classifies animal actions. The system offers two different methods. Firstly, access points continuously receive data about the animals' location using GPS at certain time intervals, and the gathered data is then classified and checked to see if there is a sudden movement (panic) of the animal groups; this method is called animal behavior classification (ABC). The second method can be defined as thermal detection (TD): the access points get the temperature values from the MBS devices and send the data to a central computer to check for instant changes in the temperatures. This system may be used for many purposes other than fire detection, namely animal tracking, poaching prevention and detecting instantaneous animal death.

  11. F-Formation Detection: Individuating Free-Standing Conversational Groups in Images

    PubMed Central

    Setti, Francesco; Russell, Chris; Bassetti, Chiara; Cristani, Marco

    2015-01-01

    Detection of groups of interacting people is a very interesting and useful task in many modern technologies, with application fields spanning from video-surveillance to social robotics. In this paper we first furnish a rigorous definition of group considering the background of the social sciences: this allows us to specify many kinds of group, so far neglected in the Computer Vision literature. On top of this taxonomy we present a detailed state of the art on group detection algorithms. Then, as a main contribution, we present a brand new method for the automatic detection of groups in still images, which is based on a graph-cuts framework for clustering individuals; in particular, we are able to codify in a computational sense the sociological definition of F-formation, which is very useful for encoding a group using only proxemic information: the position and orientation of people. We call the proposed method Graph-Cuts for F-formation (GCFF). We show how GCFF definitely outperforms all the state of the art methods in terms of different accuracy measures (some of them brand new), also demonstrating strong robustness to noise and versatility in recognizing groups of various cardinality. PMID:25996922

  12. Shadow detection and removal in RGB VHR images for land use unsupervised classification

    NASA Astrophysics Data System (ADS)

    Movia, A.; Beinat, A.; Crosilla, F.

    2016-09-01

    Nowadays, high resolution aerial images are widely available thanks to the diffusion of advanced technologies such as UAVs (Unmanned Aerial Vehicles) and new satellite missions. Although these developments offer new opportunities for accurate land use analysis and change detection, cloud and terrain shadows actually limit the benefits and possibilities of modern sensors. Focusing on the problem of shadow detection and removal in VHR color images, the paper proposes new solutions and analyses how they can enhance common unsupervised classification procedures for identifying land use classes related to CO2 absorption. To this aim, an improved fully automatic procedure has been developed for detecting image shadows using exclusively RGB color information, and avoiding user interaction. Results show a significant accuracy enhancement with respect to similar methods using RGB based indexes. Furthermore, novel solutions derived from Procrustes analysis have been applied to remove shadows and restore brightness in the images. In particular, two methods implementing the so-called "anisotropic Procrustes" and the "not-centered oblique Procrustes" algorithms have been developed and compared with the linear correlation correction method based on the Cholesky decomposition. To assess how shadow removal can enhance unsupervised classifications, results obtained with classical methods such as k-means, maximum likelihood, and self-organizing maps, have been compared to each other and with a supervised clustering procedure.

  13. Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring

    PubMed Central

    Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
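    The published cISID formula is not given in this abstract, so the sketch below only illustrates the general notion of an inter-slice intensity discontinuity score: each slice's mean intensity is compared with the average of its neighbours and large deviations are flagged. All numbers and the scoring rule are assumptions for illustration.

```python
# Hedged sketch of an inter-slice intensity discontinuity check (not the
# published cISID formula): a slice whose mean intensity departs sharply from
# its neighbours is flagged as potentially motion-corrupted.
import numpy as np

def slice_discontinuity(volume):
    """volume: (n_slices, h, w). Return a per-slice discontinuity score."""
    means = volume.reshape(volume.shape[0], -1).mean(axis=1)
    scores = np.zeros_like(means)
    for k in range(1, len(means) - 1):
        neighbour_avg = 0.5 * (means[k - 1] + means[k + 1])
        scores[k] = abs(means[k] - neighbour_avg)
    return scores

rng = np.random.default_rng(1)
vol = rng.normal(100.0, 1.0, size=(20, 64, 64))   # synthetic DWI volume
vol[7] *= 0.6                                     # simulate a signal-dropout slice
scores = slice_discontinuity(vol)
print(np.argmax(scores), scores.max())            # slice 7 should stand out
```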

  14. Diagnostic accuracy of the light microscope method to detect the eggs of Cardicola spp. in the gill filaments of the bluefin tuna.

    PubMed

    Palacios-Abella, José F; Rodríguez-Llanos, Javier; Víllora-Montero, María; Mele, Salvatore; Raga, Juan A; Montero, Francisco E

    2017-11-30

    Trematode blood flukes of the genus Cardicola are potentially lethal in bluefin tuna cultures. The present study proposes a new method to detect aporocotylid eggs in tuna gills. Aporocotylid eggs were detected by analysing a pair of gill filaments from five transversal areas of the eight holobranches of one hundred Atlantic bluefin tuna, observed with glycerol under a stereomicroscope with an oblique brightfield. Data were gathered according to holobranches, transversal areas and their combination. Eggs were uniformly distributed among the holobranches, but they had the highest prevalence in the second and fifth transversal areas, which contrasts with previous studies of egg distribution. An abbreviated method called the T-two test, which had the highest sensitivity (96.8%), is proposed for the detection of Cardicola spp. infections instead of analysing all the holobranches. The T-two test limits the time and cost of the egg parasite screening analysis. Analysis of ten samples could be sufficient to detect the presence of parasites in farmed bluefin tuna; fish from the wild are expected to be less infected, and more samples (45) would therefore be necessary. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. A surface plasmon resonance based biochip for the detection of patulin toxin

    NASA Astrophysics Data System (ADS)

    Pennacchio, Anna; Ruggiero, Giuseppe; Staiano, Maria; Piccialli, Gennaro; Oliviero, Giorgia; Lewkowicz, Aneta; Synak, Anna; Bojarski, Piotr; D'Auria, Sabato

    2014-08-01

    Patulin is a toxic secondary metabolite of a number of fungal species belonging to the genera Penicillium and Aspergillus. One important aspect of patulin toxicity in vivo is injury to the gastrointestinal tract, including ulceration and inflammation of the stomach and intestine. Recently, patulin has been shown to be genotoxic by causing oxidative damage to DNA, and oxidative DNA base modifications have been considered to play a role in mutagenesis and cancer initiation. Conventional analytical methods for patulin detection involve chromatographic analyses, such as HPLC, GC, and, more recently, techniques such as LC/MS and GC/MS. All of these methods require the use of extensive protocols and expensive analytical instrumentation. In this work, the conjugation of a new derivative of patulin to bovine serum albumin for the production of polyclonal antibodies is described, and an innovative competitive immunoassay for the detection of patulin is presented. Experimentally, an important part of the detection method is based on the optical technique called surface plasmon resonance (SPR). Laser-beam-induced interactions between probe and target molecules in the vicinity of the gold surface of the biochip lead to a shift in resonance conditions and consequently to a slight but easily detectable change in reflectivity.

  16. SvABA: genome-wide detection of structural variants and indels by local assembly.

    PubMed

    Wala, Jeremiah A; Bandopadhayay, Pratiti; Greenwald, Noah F; O'Rourke, Ryan; Sharpe, Ted; Stewart, Chip; Schumacher, Steve; Li, Yilong; Weischenfeldt, Joachim; Yao, Xiaotong; Nusbaum, Chad; Campbell, Peter; Getz, Gad; Meyerson, Matthew; Zhang, Cheng-Zhong; Imielinski, Marcin; Beroukhim, Rameen

    2018-04-01

    Structural variants (SVs), including small insertion and deletion variants (indels), are challenging to detect through standard alignment-based variant calling methods. Sequence assembly offers a powerful approach to identifying SVs, but is difficult to apply at scale genome-wide for SV detection due to its computational complexity and the difficulty of extracting SVs from assembly contigs. We describe SvABA, an efficient and accurate method for detecting SVs from short-read sequencing data using genome-wide local assembly with low memory and computing requirements. We evaluated SvABA's performance on the NA12878 human genome and in simulated and real cancer genomes. SvABA demonstrates superior sensitivity and specificity across a large spectrum of SVs and substantially improves detection performance for variants in the 20-300 bp range, compared with existing methods. SvABA also identifies complex somatic rearrangements with chains of short (<1000 bp) templated-sequence insertions copied from distant genomic regions. We applied SvABA to 344 cancer genomes from 11 cancer types and found that short templated-sequence insertions occur in ∼4% of all somatic rearrangements. Finally, we demonstrate that SvABA can identify sites of viral integration and cancer driver alterations containing medium-sized (50-300 bp) SVs. © 2018 Wala et al.; Published by Cold Spring Harbor Laboratory Press.

  17. Movements of radio-marked California Ridgway's rails during monitoring surveys: Implications for population monitoring

    USGS Publications Warehouse

    Bui, Thuy-Vy D.; Takekawa, John Y.; Overton, Cory T.; Schultz, Emily R.; Hull, Joshua M.; Casazza, Michael L.

    2015-01-01

    The California Ridgway's rail Rallus obsoletus obsoletus (hereafter California rail) is a secretive marsh bird endemic to tidal marshes in the San Francisco Bay (hereafter bay) of California. The California rail has undergone significant range contraction and population declines due to a variety of factors, including predation and the degradation and loss of habitat. Call-count surveys, which include call playbacks, based on the standardized North American marsh bird monitoring protocol have been conducted throughout the bay since 2005 to monitor population size and distribution of the California rail. However, call-count surveys are difficult to evaluate for efficacy or accuracy. To measure the accuracy of call-count surveys and investigate whether radio-marked California rails moved in response to call-count surveys, we compared locations of radio-marked California rails collected at frequent intervals (15 min) to California rail detections recorded during call-count surveys conducted over the same time periods. Overall, 60% of radio-marked California rails within 200 m of observers were not detected during call-count surveys. Movements of radio-marked California rails showed no directional bias (P = 0.92) irrespective of whether or not playbacks of five marsh bird species (including the California rail) were broadcast from listening stations. Our findings suggest that playbacks of rail vocalizations do not consistently influence California rail movements during surveys. However, call-count surveys may underestimate California rail presence; therefore, caution should be used when relating raw numbers of call-count detections to population abundance.

  18. Structure and Possible Functions of Constant-Frequency Calls in Ariopsis seemanni (Osteichthyes, Ariidae)

    PubMed Central

    Schmidtke, Daniel; Schulz, Jochen; Hartung, Jörg; Esser, Karl-Heinz

    2013-01-01

    In the 1970s, Tavolga conducted a series of experiments in which he found behavioral evidence that the vocalizations of the catfish species Ariopsis felis may play a role in a coarse form of echolocation. Based on his findings, he postulated a similar function for the calls of closely related catfish species. Here, we describe the physical characteristics of the predominant call-type of Ariopsis seemanni. In two behavioral experiments, we further explore whether A. seemanni uses these calls for acoustic obstacle detection by testing the hypothesis that the call-emission rate of individual fish should increase when subjects are confronted with novel objects, as is known from other vertebrate species that use pulse-type signals to actively probe the environment. Audio-video monitoring of the fish under different obstacle conditions did not reveal a systematic increase in the number of emitted calls in the presence of novel objects or in dependence on the proximity between individual fish and different objects. These negative findings, in combination with our current understanding of directional hearing in fishes (which is a prerequisite for acoustic obstacle detection), make it highly unlikely that A. seemanni uses its calls for acoustic obstacle detection. We argue that the calls are more likely to play a role in intra- or interspecific communication (e.g. in school formation or predator deterrence) and present results from a preliminary Y-maze experiment that are indicative of positive phonotaxis of A. seemanni towards the calls of conspecifics. PMID:23741408

  19. Rapid detection of Streptococcus pneumoniae by real-time fluorescence loop-mediated isothermal amplification

    PubMed Central

    Guo, Xu-Guang; Zhou, Shan

    2014-01-01

    Background and aim of study A significant human pathogenic bacterium, Streptococcus pneumoniae is recognized as a major cause of pneumonia and is the subject of many humoral immunity studies. Diagnosis is generally made based on clinical suspicion along with a positive culture from a sample from virtually any place in the body, but the testing time is long. The aim of this study was to establish a rapid diagnostic method for the identification of Streptococcus pneumoniae. Methods Our laboratory has recently developed a new platform, called real-amp, which combines loop-mediated isothermal amplification (LAMP) with a portable tube scanner real-time isothermal instrument for the rapid detection of Streptococcus pneumoniae. Two pairs of amplification primers required for this method were derived from a conserved DNA sequence unique to Streptococcus pneumoniae. The amplification was carried out at 63 degrees Celsius using SYBR Green for 60 minutes, with the tube scanner set to collect fluorescence signals. Clinical samples of Streptococcus pneumoniae and other bacteria were used to determine the sensitivity and specificity of the primers by comparison with the traditional culture method. Results The new set of primers consistently detected Streptococcus pneumoniae in laboratory-maintained isolates from our hospital. The new primers also proved to be more sensitive than the published species-specific primers developed for the LAMP method in detecting Streptococcus pneumoniae. Conclusions This study demonstrates that the Streptococcus pneumoniae LAMP primers developed here have the ability to accurately detect Streptococcus pneumoniae infections by real-time fluorescence LAMP. PMID:25276360

  20. Image-based fall detection and classification of a user with a walking support system

    NASA Astrophysics Data System (ADS)

    Taghvaei, Sajjad; Kosuge, Kazuhiro

    2017-10-01

    The classification of visual human action is important in the development of systems that interact with humans. This study investigates an image-based classification of the human state while using a walking support system to improve the safety and dependability of these systems. We categorize the possible human behavior while utilizing a walker robot into eight states (i.e., sitting, standing, walking, and five falling types), and propose two different methods, namely, normal distribution and hidden Markov models (HMMs), to detect and recognize these states. The visual feature for the state classification is the centroid position of the upper body, which is extracted from the user's depth images. The first method shows that the centroid position follows a normal distribution while walking, which can be adopted to detect any non-walking state. The second method implements HMMs to detect and recognize these states. We then measure and compare the performance of both methods. The classification results are employed to control the motion of a passive-type walker (called "RT Walker") by activating its brakes in non-walking states. Thus, the system can be used for sit/stand support and fall prevention. The experiments are performed with four subjects, including an experienced physiotherapist. Results show that the algorithm can be adapted to the new user's motion pattern within 40 s, with a fall detection rate of 96.25% and a state classification rate of 81.0%. The proposed method can be applied to other abnormality detection/classification applications that employ depth image-sensing devices.
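    The first method can be illustrated with the hedged sketch below: the upper-body centroid during normal walking is modelled as a 2-D Gaussian, and a frame whose Mahalanobis distance from that distribution exceeds a threshold is flagged as non-walking. The training data, feature dimensions, and threshold value are assumptions, not values from the study.

```python
# Sketch of normal-distribution-based non-walking detection on synthetic
# upper-body centroid positions (x, height); all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
walking = rng.normal([0.0, 1.0], [0.05, 0.03], size=(500, 2))  # training frames

mu = walking.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(walking, rowvar=False))

def is_non_walking(centroid, threshold=3.0):
    """Flag a frame whose Mahalanobis distance exceeds the threshold."""
    d = centroid - mu
    mahalanobis = np.sqrt(d @ cov_inv @ d)
    return mahalanobis > threshold

print(is_non_walking(np.array([0.01, 1.01])))   # typical walking frame -> False
print(is_non_walking(np.array([0.40, 0.30])))   # centroid dropped -> True
```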

  1. Evaluation of listener-based anuran surveys with automated audio recording devices

    USGS Publications Warehouse

    Shearin, A. F.; Calhoun, A.J.K.; Loftin, C.S.

    2012-01-01

    Volunteer-based audio surveys are used to document long-term trends in anuran community composition and abundance. Current sampling protocols, however, are not region- or species-specific and may not detect relatively rare or audibly cryptic species. We used automated audio recording devices to record calling anurans during 2006–2009 at wetlands in Maine, USA. We identified species calling, chorus intensity, time of day, and environmental variables when each species was calling and developed logistic and generalized mixed models to determine the time interval and environmental variables that optimize detection of each species during peak calling periods. We detected eight of nine anurans documented in Maine. Individual recordings selected from the sampling period (0.5 h past sunset to 0100 h) described in the North American Amphibian Monitoring Program (NAAMP) detected fewer species than were detected in recordings from 30 min past sunset until sunrise. Time of maximum detection of presence and full chorusing for three species (green frogs, mink frogs, pickerel frogs) occurred after the NAAMP sampling end time (0100 h). The NAAMP protocol’s sampling period may result in omissions and misclassifications of chorus sizes for certain species. These potential errors should be considered when interpreting trends generated from standardized anuran audio surveys.

  2. A comprehensive assessment of somatic mutation detection in cancer using whole-genome sequencing

    PubMed Central

    Alioto, Tyler S.; Buchhalter, Ivo; Derdak, Sophia; Hutter, Barbara; Eldridge, Matthew D.; Hovig, Eivind; Heisler, Lawrence E.; Beck, Timothy A.; Simpson, Jared T.; Tonon, Laurie; Sertier, Anne-Sophie; Patch, Ann-Marie; Jäger, Natalie; Ginsbach, Philip; Drews, Ruben; Paramasivam, Nagarajan; Kabbe, Rolf; Chotewutmontri, Sasithorn; Diessl, Nicolle; Previti, Christopher; Schmidt, Sabine; Brors, Benedikt; Feuerbach, Lars; Heinold, Michael; Gröbner, Susanne; Korshunov, Andrey; Tarpey, Patrick S.; Butler, Adam P.; Hinton, Jonathan; Jones, David; Menzies, Andrew; Raine, Keiran; Shepherd, Rebecca; Stebbings, Lucy; Teague, Jon W.; Ribeca, Paolo; Giner, Francesc Castro; Beltran, Sergi; Raineri, Emanuele; Dabad, Marc; Heath, Simon C.; Gut, Marta; Denroche, Robert E.; Harding, Nicholas J.; Yamaguchi, Takafumi N.; Fujimoto, Akihiro; Nakagawa, Hidewaki; Quesada, Víctor; Valdés-Mas, Rafael; Nakken, Sigve; Vodák, Daniel; Bower, Lawrence; Lynch, Andrew G.; Anderson, Charlotte L.; Waddell, Nicola; Pearson, John V.; Grimmond, Sean M.; Peto, Myron; Spellman, Paul; He, Minghui; Kandoth, Cyriac; Lee, Semin; Zhang, John; Létourneau, Louis; Ma, Singer; Seth, Sahil; Torrents, David; Xi, Liu; Wheeler, David A.; López-Otín, Carlos; Campo, Elías; Campbell, Peter J.; Boutros, Paul C.; Puente, Xose S.; Gerhard, Daniela S.; Pfister, Stefan M.; McPherson, John D.; Hudson, Thomas J.; Schlesner, Matthias; Lichter, Peter; Eils, Roland; Jones, David T. W.; Gut, Ivo G.

    2015-01-01

    As whole-genome sequencing for cancer genome analysis becomes a clinical tool, a full understanding of the variables affecting sequencing analysis output is required. Here using tumour-normal sample pairs from two different types of cancer, chronic lymphocytic leukaemia and medulloblastoma, we conduct a benchmarking exercise within the context of the International Cancer Genome Consortium. We compare sequencing methods, analysis pipelines and validation methods. We show that using PCR-free methods and increasing sequencing depth to ∼100 × shows benefits, as long as the tumour:control coverage ratio remains balanced. We observe widely varying mutation call rates and low concordance among analysis pipelines, reflecting the artefact-prone nature of the raw data and lack of standards for dealing with the artefacts. However, we show that, using the benchmark mutation set we have created, many issues are in fact easy to remedy and have an immediate positive impact on mutation detection accuracy. PMID:26647970

  3. TINS, target immobilized NMR screening: an efficient and sensitive method for ligand discovery.

    PubMed

    Vanwetswinkel, Sophie; Heetebrij, Robert J; van Duynhoven, John; Hollander, Johan G; Filippov, Dmitri V; Hajduk, Philip J; Siegal, Gregg

    2005-02-01

    We propose a ligand screening method, called TINS (target immobilized NMR screening), which reduces the amount of target required for the fragment-based approach to drug discovery. Binding is detected by comparing 1D NMR spectra of compound mixtures in the presence of a target immobilized on a solid support to a control sample. The method has been validated by the detection of a variety of ligands for protein and nucleic acid targets (KD from 60 to 5000 μM). The ligand binding capacity of a protein was undiminished after 2000 different compounds had been applied, indicating the potential to apply the assay for screening typical fragment libraries. TINS can be used in competition mode, allowing rapid characterization of the ligand binding site. TINS may allow screening of targets that are difficult to produce or that are insoluble, such as membrane proteins.

  4. Robust Foregrounds Removal for 21-cm Experiments

    NASA Astrophysics Data System (ADS)

    Mertens, F.; Ghosh, A.; Koopmans, L. V. E.

    2018-05-01

    Direct detection of the Epoch of Reionization via the redshifted 21-cm line will have unprecedented implications for the study of structure formation in the early Universe. To fulfill this promise, current and future 21-cm experiments will need to detect the weak 21-cm signal over foregrounds several orders of magnitude greater. This requires accurate modeling of the galactic and extragalactic emission and of its contaminants due to instrument chromaticity, the ionosphere and imperfect calibration. To address this complex modeling problem, we propose a new method based on Gaussian Process Regression (GPR) which is able to cleanly separate the cosmological signal from most of the foreground contaminants. We also propose a new imaging method based on a maximum likelihood framework which solves the interferometric equation directly on the sphere. Using this method, chromatic effects causing the so-called ``wedge'' are effectively eliminated (i.e. deconvolved) in the cylindrical (k⊥, k∥) power spectrum.
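    As a toy illustration of foreground removal with GPR, the sketch below models a spectrally smooth foreground with a long-coherence-scale kernel and subtracts the fit, leaving the rapidly fluctuating component in the residual. The kernels, length scales, and synthetic spectra are assumptions and do not reflect the covariance model of the cited work.

```python
# Minimal sketch of GPR foreground subtraction: the smooth foreground is fitted
# with a long-length-scale kernel; what remains is signal plus noise.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

freq = np.linspace(120.0, 160.0, 200)[:, None]              # MHz (synthetic)
foreground = 1e3 * (freq.ravel() / 140.0) ** -2.5            # smooth in frequency
signal = 5.0 * np.sin(2 * np.pi * freq.ravel() / 1.5)        # rapidly varying
noise = np.random.default_rng(3).normal(0, 1.0, freq.size)
data = foreground + signal + noise

kernel = RBF(length_scale=20.0) + WhiteKernel(noise_level=25.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(freq, data)

residual = data - gpr.predict(freq)   # foreground-subtracted spectrum
print(residual.std())
```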

  5. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by a typical CCD/CMOS sensor. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm yields sharp images while reducing ringing and crisping artifacts over a wider frequency range. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.

  6. Single Component Sorption-Desorption Test Experimental Design Approach Discussions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phil Winston

    A task was identified within the fission-product-transport work package to develop a path forward for testing to determine the behavior of volatile fission products and to engage members of the NGNP community to advise and dissent on the approach. The following document is a summary of the discussions and the specific approaches suggested for components of the testing. Included in the summary are the minutes of the conference call that was held with INL and external interested parties to elicit comments on the approaches brought forward by the INL participants. The conclusion was that an initial non-radioactive, single component test will be useful to establish the limits of currently available chemical detection methods and to evaluate source-dispersion uniformity. In parallel, development of a real-time low-concentration monitoring method is believed to be useful in detecting rapid dispersion as well as desorption phenomena. Ultimately, the test cycle is expected to progress to the use of radio-traced species, simply because this method will allow the lowest possible detection limits. The consensus of the conference call was that there is no need for an in-core test because the duct and heat exchanger surfaces that will be the sorption target will be outside the main neutron flux and will not be affected by irradiation. Participants in the discussion and contributors to the INL approach were Jeffrey Berg, Pattrick Calderoni, Gary Groenewold, Paul Humrickhouse, Brad Merrill, and Phil Winston. Participants from outside the INL included David Hanson of General Atomics, Todd Allen, Tyler Gerczak, and Izabela Szlufarska of the University of Wisconsin, Gary Was of the University of Michigan, Sudarshan Loyalka and Tushar Ghosh of the University of Missouri, and Robert Morris of Oak Ridge National Laboratory.

  7. Image reconstruction of muon tomographic data using a density-based clustering method

    NASA Astrophysics Data System (ADS)

    Perry, Kimberly B.

    Muons are subatomic particles capable of reaching the Earth's surface before decaying. When these particles collide with an object that has a high atomic number (Z), their path of travel changes substantially. Tracking muon movement through shielded containers can indicate what types of materials lie inside. This thesis proposes using a density-based clustering algorithm called OPTICS to perform image reconstructions using muon tomographic data. The results show that this method is capable of detecting high-Z materials quickly, and can also produce detailed reconstructions with large amounts of data.
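    A minimal sketch of the clustering step, assuming a synthetic point cloud of estimated muon scattering vertices, is shown below using scikit-learn's OPTICS implementation; the parameters and coordinates are illustrative and do not come from the thesis.

```python
# Sketch of density-based clustering of (synthetic) muon scattering points with
# OPTICS; dense clusters would mark candidate high-Z regions.
import numpy as np
from sklearn.cluster import OPTICS

rng = np.random.default_rng(4)
background = rng.uniform(-50, 50, size=(300, 3))             # scattered noise
dense_object = rng.normal([10, 0, 5], 1.0, size=(120, 3))    # high-Z candidate
points = np.vstack([background, dense_object])

clustering = OPTICS(min_samples=20, max_eps=5.0).fit(points)
labels = clustering.labels_                                   # -1 means noise
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("clusters found:", n_clusters)
```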

  8. The effect of algorithms on copy number variant detection.

    PubMed

    Tsuang, Debby W; Millard, Steven P; Ely, Benjamin; Chi, Peter; Wang, Kenneth; Raskind, Wendy H; Kim, Sulgi; Brkanac, Zoran; Yu, Chang-En

    2010-12-30

    The detection of copy number variants (CNVs) and the results of CNV-disease association studies rely on how CNVs are defined, and because array-based technologies can only infer CNVs, CNV-calling algorithms can produce vastly different findings. Several authors have noted the large-scale variability between CNV-detection methods, as well as the substantial false positive and false negative rates associated with those methods. In this study, we use variations of four common algorithms for CNV detection (PennCNV, QuantiSNP, HMMSeg, and cnvPartition) and two definitions of overlap (any overlap and an overlap of at least 40% of the smaller CNV) to illustrate the effects of varying algorithms and definitions of overlap on CNV discovery. We used a 56 K Illumina genotyping array enriched for CNV regions to generate hybridization intensities and allele frequencies for 48 Caucasian schizophrenia cases and 48 age-, ethnicity-, and gender-matched control subjects. No algorithm found a difference in CNV burden between the two groups. However, the total number of CNVs called ranged from 102 to 3,765 across algorithms. The mean CNV size ranged from 46 kb to 787 kb, and the average number of CNVs per subject ranged from 1 to 39. The number of novel CNVs not previously reported in normal subjects ranged from 0 to 212. Motivated by the availability of multiple publicly available genome-wide SNP arrays, investigators are conducting numerous analyses to identify putative additional CNVs in complex genetic disorders. However, the number of CNVs identified in array-based studies, and whether these CNVs are novel or valid, will depend on the algorithm(s) used. Thus, given the variety of methods used, there will be many false positives and false negatives. Both guidelines for the identification of CNVs inferred from high-density arrays and the establishment of a gold standard for validation of CNVs are needed.
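    The two overlap definitions used in this comparison can be made concrete with the short sketch below; the interval coordinates are invented and the 40% rule is applied to the smaller of the two calls, as described above.

```python
# Sketch of the two overlap definitions: "any overlap" versus an overlap
# covering at least 40% of the smaller CNV (half-open base-pair intervals on
# the same chromosome; coordinates are invented for illustration).
def overlap_bp(a, b):
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def any_overlap(a, b):
    return overlap_bp(a, b) > 0

def overlap_of_smaller(a, b, fraction=0.40):
    smaller = min(a[1] - a[0], b[1] - b[0])
    return smaller > 0 and overlap_bp(a, b) >= fraction * smaller

cnv_call_1 = (1_000_000, 1_050_000)   # 50 kb call from one algorithm
cnv_call_2 = (1_045_000, 1_200_000)   # 155 kb call from another
print(any_overlap(cnv_call_1, cnv_call_2))          # True
print(overlap_of_smaller(cnv_call_1, cnv_call_2))   # False: only 5 of 50 kb overlap
```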

  9. Detection methods for biotech cotton MON 15985 and MON 88913 by PCR.

    PubMed

    Lee, Seong-Hun; Kim, Jin-Kug; Yi, Bu-Young

    2007-05-02

    Plants derived through agricultural biotechnology, or genetically modified organisms (GMOs), may affect human health and ecological environment. A living GMO is also called a living modified organism (LMO). Biotech cotton is a GMO in food or feed and also an LMO in the environment. Recently, two varieties of biotech cotton, MON 15985 and MON 88913, were developed by Monsanto Co. The detection method is an essential element for the GMO labeling system or LMO management of biotech plants. In this paper, two primer pairs and probes were designed for specific amplification of 116 and 120 bp PCR products from MON 15985 and MON 88913, respectively, with no amplification from any other biotech cotton. Limits of detection of the qualitative method were all 0.05% for MON 15985 and MON 88913. The quantitative method was developed using a TaqMan real-time PCR. A synthetic plasmid, as a reference molecule, was constructed from a taxon-specific DNA sequence of cotton and two construct-specific DNA sequences of MON 15985 and MON 88913. The quantitative method was validated using six samples that contained levels of biotech cotton mixed with conventional cotton ranging from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-20%. Limits of quantitation of the quantitative method were all 0.1%. Consequently, it is reported that the proposed detection methods were applicable for qualitative and quantitative analyses for biotech cotton MON 15985 and MON 88913.

  10. Efficient Forest Fire Detection Index for Application in Unmanned Aerial Systems (UASs)

    PubMed Central

    Cruz, Henry; Eckert, Martina; Meneses, Juan; Martínez, José-Fernán

    2016-01-01

    This article proposes a novel method for detecting forest fires, through the use of a new color index, called the Forest Fire Detection Index (FFDI), developed by the authors. The index is based on methods for vegetation classification and has been adapted to detect the tonalities of flames and smoke; the latter could be included adaptively into the Regions of Interest (RoIs) with the help of a variable factor. Multiple tests have been performed upon database imagery and present promising results: a detection precision of 96.82% has been achieved for image sizes of 960 × 540 pixels at a processing time of 0.0447 seconds. This achievement would lead to a performance of 22 f/s, for smaller images, while up to 54 f/s could be reached by maintaining a similar detection precision. Additional tests have been performed on fires in their early stages, achieving a precision rate of p = 96.62%. The method could be used in real-time in Unmanned Aerial Systems (UASs), with the aim of monitoring a wider area than through fixed surveillance systems. Thus, it would result in more cost-effective outcomes than conventional systems implemented in helicopters or satellites. UASs could also reach inaccessible locations without jeopardizing people’s safety. On-going work includes implementation into a commercially available drone. PMID:27322264

  11. Railway crossing risk area detection using linear regression and terrain drop compensation techniques.

    PubMed

    Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing

    2014-06-16

    Most railway accidents happen at railway crossings. Therefore, how to detect humans or objects present in the risk area of a railway crossing and thus prevent accidents are important tasks. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) we use a terrain drop compensation (TDC) technique to solve the problem of the concavity of railway crossings; (2) we use a linear regression technique to predict the position and length of an object from image processing; (3) we have developed a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) to obtain the ground points of the object. In addition, image preprocessing is also applied to filter out the noise and successfully improve the object detection. From the experimental results, it is demonstrated that our scheme is an effective and corrective method for the detection of railway crossing risk areas.

  12. Railway Crossing Risk Area Detection Using Linear Regression and Terrain Drop Compensation Techniques

    PubMed Central

    Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing

    2014-01-01

    Most railway accidents happen at railway crossings. Therefore, how to detect humans or objects present in the risk area of a railway crossing and thus prevent accidents are important tasks. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) we use a terrain drop compensation (TDC) technique to solve the problem of the concavity of railway crossings; (2) we use a linear regression technique to predict the position and length of an object from image processing; (3) we have developed a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) to obtain the ground points of the object. In addition, image preprocessing is also applied to filter out the noise and successfully improve the object detection. From the experimental results, it is demonstrated that our scheme is an effective and corrective method for the detection of railway crossing risk areas. PMID:24936948

  13. A remark on copy number variation detection methods.

    PubMed

    Li, Shuo; Dou, Xialiang; Gao, Ruiqi; Ge, Xinzhou; Qian, Minping; Wan, Lin

    2018-01-01

    Copy number variations (CNVs) are gains and losses of DNA sequence in a genome. High throughput platforms such as microarrays and next generation sequencing technologies (NGS) have been applied to detect genome-wide copy number losses. Although progress has been made with both approaches, the accuracy and consistency of CNV calling from the two platforms remain in dispute. In this study, we perform a deep analysis of copy number losses in 254 human DNA samples, for which both SNP microarray data and NGS data are publicly available from the HapMap Project and the 1000 Genomes Project, respectively. We show that the copy number losses reported by the HapMap Project and the 1000 Genomes Project have less than 30% overlap, even though both projects require cross-platform (e.g. PCR, microarray and high-throughput sequencing) experimental support for their reports and employed state-of-the-art calling methods. On the other hand, copy number losses found directly from the HapMap microarray data by an accurate algorithm, CNVhac, almost all show lower read mapping depth in the NGS data; furthermore, 88% of them can be supported by breakpoint-containing sequences in the NGS data. Our results suggest that microarrays are capable of calling CNVs and that the unnecessary requirement of additional cross-platform support may introduce false negatives. The inconsistency of CNV reports from the HapMap Project and the 1000 Genomes Project might result from the inadequate information contained in microarray data, inconsistent detection criteria, or the filtering effect of cross-platform support. The statistical test on CNVs called by CNVhac shows that microarray data can offer reliable CNV reports, and the majority of CNV candidates can be confirmed by raw sequences. Therefore, the CNV candidates given by a good caller could be highly reliable without cross-platform support, so additional experimental information should be applied as needed rather than by default.

  14. A-Track: A new approach for detection of moving objects in FITS images

    NASA Astrophysics Data System (ADS)

    Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.

    2016-10-01

    We have developed a fast, open-source, cross-platform pipeline, called A-Track, for detecting the moving objects (asteroids and comets) in sequential telescope images in FITS format. The pipeline is coded in Python 3. The moving objects are detected using a modified line detection algorithm, called MILD. We tested the pipeline on astronomical data acquired by an SI-1100 CCD with a 1-meter telescope. We found that A-Track performs very well in terms of detection efficiency, stability, and processing time. The code is hosted on GitHub under the GNU GPL v3 license.

  15. Evaluation of a National Call Center and a Local Alerts System for Detection of New Cases of Ebola Virus Disease - Guinea, 2014-2015

    DTIC Science & Technology

    2016-03-11

    … principally through the use of a telephone alert system. Community members and health facilities report deaths and suspected Ebola cases to local alert … sensitivity of the national call center with the local alerts system, the CDC country team performed probabilistic record linkage of the combined …

  16. A generalized baleen whale call detection and classification system.

    PubMed

    Baumgartner, Mark F; Mussoline, Sarah E

    2011-05-01

    Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy and laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.
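    The classification step can be illustrated with the hedged sketch below, where each pitch track is reduced to a few attributes (start frequency in Hz, sweep rate in Hz/s, duration in s) and classified with quadratic discriminant function analysis; the feature choices and synthetic training values are assumptions, not measurements from the Gulf of Maine recordings.

```python
# Sketch of QDFA classification of pitch-track attributes; training values are
# synthetic caricatures of sei whale downsweeps and right whale upcalls.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(5)
# class 0: downsweeps (start ~80 Hz, negative sweep rate, ~1.4 s)
sei = np.column_stack([rng.normal(80, 5, 50),
                       rng.normal(-30, 5, 50),
                       rng.normal(1.4, 0.2, 50)])
# class 1: upcalls (start ~100 Hz, positive sweep rate, ~1.0 s)
right_whale = np.column_stack([rng.normal(100, 8, 50),
                               rng.normal(60, 10, 50),
                               rng.normal(1.0, 0.2, 50)])

X = np.vstack([sei, right_whale])
y = np.array([0] * 50 + [1] * 50)

qda = QuadraticDiscriminantAnalysis().fit(X, y)
print(qda.predict([[82.0, -28.0, 1.3], [98.0, 55.0, 0.9]]))   # expect [0, 1]
```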

  17. A Saliency Guided Semi-Supervised Building Change Detection Method for High Resolution Remote Sensing Images

    PubMed Central

    Hou, Bin; Wang, Yunhong; Liu, Qingjie

    2016-01-01

    Characterizations of up-to-date information about the Earth’s surface are an important application, providing insights for urban planning, resources monitoring and environmental studies. A large number of change detection (CD) methods have been developed to address this task by utilizing remote sensing (RS) images. The advent of high resolution (HR) remote sensing images further provides challenges to traditional CD methods and opportunities to object-based CD methods. While several kinds of geospatial objects are recognized, this manuscript mainly focuses on buildings. Specifically, we propose a novel automatic approach combining pixel-based strategies with object-based ones for detecting building changes with HR remote sensing images. A multiresolution contextual morphological transformation called extended morphological attribute profiles (EMAPs) allows the extraction of geometrical features related to the structures within the scene at different scales. Pixel-based post-classification is executed on EMAPs using hierarchical fuzzy clustering. Subsequently, hierarchical fuzzy frequency vector histograms are formed based on the image-objects acquired by simple linear iterative clustering (SLIC) segmentation. Then, saliency and the morphological building index (MBI), extracted on difference images, are used to generate a pseudo training set. Ultimately, object-based semi-supervised classification is implemented on this training set by applying random forest (RF). Most of the important changes are detected by the proposed method in our experiments. This study was checked for effectiveness using visual evaluation and numerical evaluation. PMID:27618903

  18. A Saliency Guided Semi-Supervised Building Change Detection Method for High Resolution Remote Sensing Images.

    PubMed

    Hou, Bin; Wang, Yunhong; Liu, Qingjie

    2016-08-27

    Characterizations of up-to-date information about the Earth's surface are an important application, providing insights for urban planning, resources monitoring and environmental studies. A large number of change detection (CD) methods have been developed to address this task by utilizing remote sensing (RS) images. The advent of high resolution (HR) remote sensing images further provides challenges to traditional CD methods and opportunities to object-based CD methods. While several kinds of geospatial objects are recognized, this manuscript mainly focuses on buildings. Specifically, we propose a novel automatic approach combining pixel-based strategies with object-based ones for detecting building changes with HR remote sensing images. A multiresolution contextual morphological transformation called extended morphological attribute profiles (EMAPs) allows the extraction of geometrical features related to the structures within the scene at different scales. Pixel-based post-classification is executed on EMAPs using hierarchical fuzzy clustering. Subsequently, hierarchical fuzzy frequency vector histograms are formed based on the image-objects acquired by simple linear iterative clustering (SLIC) segmentation. Then, saliency and the morphological building index (MBI), extracted on difference images, are used to generate a pseudo training set. Ultimately, object-based semi-supervised classification is implemented on this training set by applying random forest (RF). Most of the important changes are detected by the proposed method in our experiments. This study was checked for effectiveness using visual evaluation and numerical evaluation.

  19. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Park, Subok; Jennings, Robert; Liu, Haimo

Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, in both academia and industry. However, there is still much room for understanding how to best optimize and evaluate these devices over a large space of system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of including depth and background-variability information in the assessment and optimization of 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method by comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods differ from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring a system's detection performance with consideration of background variability may lead to differences in system performance estimates and comparisons. For the assessment of 3D systems, to accurately determine trade-offs between image quality and radiation dose, it is critical to incorporate randomness arising from the imaging chain, including background variability, into system performance calculations.
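    The Hotelling observer mentioned above has a standard closed form: its template is w = K^{-1} Δs and its detectability is SNR² = Δs^T K^{-1} Δs, where K is the covariance of the (variable) backgrounds and Δs is the mean signal difference. The sketch below estimates this quantity from sample images; it is a minimal illustration, not the authors' code, and the ridge regularization is an assumption for invertibility.

    ```python
    # Minimal sketch: Hotelling-observer detectability from ensembles of
    # signal-absent and signal-present images, each flattened to a vector.
    import numpy as np

    def hotelling_snr(signal_absent, signal_present, ridge=1e-6):
        """Each input: (n_images, n_pixels) array. Returns the Hotelling SNR."""
        delta_s = signal_present.mean(axis=0) - signal_absent.mean(axis=0)
        # Pooled covariance of the two classes (background variability enters here).
        cov = 0.5 * (np.cov(signal_absent, rowvar=False) +
                     np.cov(signal_present, rowvar=False))
        cov += ridge * np.eye(cov.shape[0])          # regularize for invertibility
        template = np.linalg.solve(cov, delta_s)     # Hotelling template w = K^-1 Δs
        return float(np.sqrt(delta_s @ template))    # SNR = sqrt(Δs^T K^-1 Δs)
    ```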

  20. Simultaneous Genotype Calling and Haplotype Phasing Improves Genotype Accuracy and Reduces False-Positive Associations for Genome-wide Association Studies

    PubMed Central

    Browning, Brian L.; Yu, Zhaoxia

    2009-01-01

We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10^-7 significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040

  1. FloCon 2005 Proceedings

    DTIC Science & Technology

    2005-09-01

from one subject to another [5, 6]. Since covert communication is very difficult to detect, most researchers resort to investigating methods that...situations (unlike our own) where traffic is not filtered (a darknet, for example). To prevent isolated anomalies during the learning period from...call to the computer running the flow tools. Then, using a standard R data input function, the wrapper function reads in the ASCII output of the com

  2. Variability in echolocation call intensity in a community of horseshoe bats: a role for resource partitioning or communication?

    PubMed

    Schuchmann, Maike; Siemers, Björn M

    2010-09-17

Only recently have data on bat echolocation call intensities started to accumulate. Yet, intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we therefore asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above that predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus be used as an honest signal of quality for intraspecific communication. We investigated for the first time whether a size-intensity relation is present in echolocating bats. We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for the frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale and a trend for a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. There were no correlations between body size or sex and intensity for the other species. Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than larger individuals. Such a negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure.

  3. Variability in Echolocation Call Intensity in a Community of Horseshoe Bats: A Role for Resource Partitioning or Communication?

    PubMed Central

    Schuchmann, Maike; Siemers, Björn M.

    2010-01-01

Background Only recently have data on bat echolocation call intensities started to accumulate. Yet, intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we therefore asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above that predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus be used as an honest signal of quality for intraspecific communication. We investigated for the first time whether a size-intensity relation is present in echolocating bats. Methodology/Principal Findings We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for the frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale and a trend for a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. There were no correlations between body size or sex and intensity for the other species. Conclusions/Significance Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than larger individuals. Such a negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure. PMID:20862252

  4. The sunstone and polarised skylight: ancient Viking navigational tools?

    NASA Astrophysics Data System (ADS)

    Ropars, Guy; Lakshminarayanan, Vasudevan; Le Floch, Albert

    2014-10-01

Although the polarisation of light was discovered at the beginning of the nineteenth century, the Vikings may have used polarised light around the tenth century in their navigation to America, using a 'sunstone' evoked in the Icelandic Sagas. Indeed, the birefringence of Iceland spar (calcite), a common crystal in Scandinavia, permits a simple observation of the axis of polarisation of the skylight at the zenith. From this, it is possible to infer, for instance, the azimuth of a Sun hidden below the horizon. The high sensitivity of the differential method provided by the ordinary and extraordinary beams of calcite at its so-called isotropy point is about two orders of magnitude higher than that of the best dichroic polariser and allows an accuracy of ±1° for the Sun azimuth (at sunrise and sunset). Unfortunately, due to the relative fragility of calcite, only the so-called Alderney crystal has been discovered on board an ancient ship, a sixteenth-century wreck. Curiously, beyond its use as a sunstone by the Vikings, over the past millennia calcite has led to the discovery of the polarisation of light itself by Malus and is currently being used to detect the atmospheres of exoplanets. Moreover, the differential method for detecting light polarisation is widely used in the animal world.

  5. MuSCoWERT: multi-scale consistence of weighted edge Radon transform for horizon detection in maritime images.

    PubMed

    Prasad, Dilip K; Rajan, Deepu; Rachmawati, Lily; Rajabally, Eshan; Quek, Chai

    2016-12-01

    This paper addresses the problem of horizon detection, a fundamental process in numerous object detection algorithms, in a maritime environment. The maritime environment is characterized by the absence of fixed features, the presence of numerous linear features in dynamically changing objects and background and constantly varying illumination, rendering the typically simple problem of detecting the horizon a challenging one. We present a novel method called multi-scale consistence of weighted edge Radon transform, abbreviated as MuSCoWERT. It detects the long linear features consistent over multiple scales using multi-scale median filtering of the image followed by Radon transform on a weighted edge map and computing the histogram of the detected linear features. We show that MuSCoWERT has excellent performance, better than seven other contemporary methods, for 84 challenging maritime videos, containing over 33,000 frames, and captured using visible range and near-infrared range sensors mounted onboard, onshore, or on floating buoys. It has a median error of about 2 pixels (less than 0.2%) from the center of the actual horizon and a median angular error of less than 0.4 deg. We are also sharing a new challenging horizon detection dataset of 65 videos of visible, infrared cameras for onshore and onboard ship camera placement.
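    As a rough, single-scale illustration of the Radon-transform idea (not MuSCoWERT itself, which adds multi-scale median filtering and cross-scale consistency checks), the sketch below smooths an image, builds a weighted edge map, and picks the strongest near-horizontal line in Radon space. The parameter choices and angular search window are assumptions.

    ```python
    # Single-scale illustration of horizon-line search via a Radon transform
    # on a weighted edge map; a simplification of the approach described above.
    import numpy as np
    from scipy.ndimage import median_filter
    from skimage.filters import sobel
    from skimage.transform import radon

    def horizon_candidate(gray_image, angles=np.linspace(80.0, 100.0, 41)):
        """gray_image: 2D float array. Returns (angle_deg, offset_index) of the
        dominant near-horizontal linear feature in Radon space."""
        smoothed = median_filter(gray_image, size=5)   # single-scale median smoothing
        edge_map = sobel(smoothed)                     # weighted edge map
        sinogram = radon(edge_map, theta=angles, circle=False)
        offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
        return angles[angle_idx], offset_idx
    ```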

  6. Detection of inter-frame forgeries in digital videos.

    PubMed

    K, Sitara; Mehtre, B M

    2018-05-26

Videos are acceptable as evidence in a court of law, provided their authenticity and integrity are scientifically validated. Videos recorded by surveillance systems are susceptible to malicious alterations of visual content by perpetrators, locally or remotely. Such malicious alterations of video content (called video forgeries) are categorized into inter-frame and intra-frame forgeries. In this paper, we propose inter-frame forgery detection techniques using tamper traces from the spatio-temporal and compressed domains. Pristine videos containing frames recorded during a sudden camera zooming event may be wrongly classified as tampered videos, leading to an increase in false positives. To address this issue, we propose a method for zooming detection and incorporate it into video tampering detection. Frame shuffling detection, which had not been explored previously, is also addressed in our work. Our method is capable of differentiating various inter-frame tamper events and localizing them in the temporal domain. The proposed system was tested on 23,586 videos, of which 2346 are pristine and the rest are candidates for inter-frame forged videos. Experimental results show that frame shuffling was detected with encouraging accuracy rates, and improved accuracy was achieved for detecting frame insertion, frame deletion and frame duplication.

  7. Application of a fast skyline computation algorithm for serendipitous searching problems

    NASA Astrophysics Data System (ADS)

    Koizumi, Kenichi; Hiraki, Kei; Inaba, Mary

    2018-02-01

Skyline computation is a method of extracting interesting entries from a large population with multiple attributes. These entries, called skyline or Pareto optimal entries, are known to have extreme characteristics that cannot be found by outlier detection methods. Skyline computation is an important task for characterizing large amounts of data and selecting interesting entries with extreme features. When the population changes dynamically, the task of calculating a sequence of skyline sets is called continuous skyline computation. This task is known to be difficult to perform for the following reasons: (1) information on non-skyline entries must be stored, since they may join the skyline in the future; (2) the appearance or disappearance of even a single entry can change the skyline drastically; (3) it is difficult to adopt geometric acceleration algorithms for skyline computation tasks with high-dimensional datasets. Our new algorithm, called jointed rooted-tree (JR-tree), manages entries using a rooted tree structure. JR-tree delays extending the tree to deep levels to accelerate tree construction and traversal. In this study, we present the difficulties in extracting entries tagged with a rare label in high-dimensional space and the potential of fast skyline computation in low-latency cell identification technology.
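    For reference, the skyline itself is just the set of non-dominated entries. The following minimal sketch computes a static skyline by brute force; it does not reproduce JR-tree or the continuous (streaming) case described above.

    ```python
    # Minimal static skyline (Pareto-front) computation by pairwise dominance checks.
    import numpy as np

    def skyline(points):
        """points: (n, d) array where smaller is better in every attribute.
        Returns the indices of entries not dominated by any other entry."""
        keep = []
        for i in range(points.shape[0]):
            # j dominates i if j <= i in all attributes and j < i in at least one.
            dominated = np.any(
                np.all(points <= points[i], axis=1) & np.any(points < points[i], axis=1)
            )
            if not dominated:
                keep.append(i)
        return keep
    ```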

  8. Oyster toadfish (Opsanus tau) boatwhistle call detection and patterns within a large-scale oyster restoration site

    PubMed Central

    Bohnenstiehl, DelWayne R.; Eggleston, David B.; Kellogg, M. Lisa; Lyon, R. Patrick

    2017-01-01

    During May 2015, passive acoustic recorders were deployed at eight subtidal oyster reefs within Harris Creek Oyster Sanctuary in Chesapeake Bay, Maryland USA. These sites were selected to represent both restored and unrestored habitats having a range of oyster densities. Throughout the survey, the soundscape within Harris Creek was dominated by the boatwhistle calls of the oyster toadfish, Opsanus tau. A novel, multi-kernel spectral correlation approach was developed to automatically detect these boatwhistle calls using their two lowest harmonic bands. The results provided quantitative information on how call rate and call frequency varied in space and time. Toadfish boatwhistle fundamental frequency ranged from 140 Hz to 260 Hz and was well correlated (r = 0.94) with changes in water temperature, with the fundamental frequency increasing by ~11 Hz for every 1°C increase in temperature. The boatwhistle call rate increased from just a few calls per minute at the start of monitoring on May 7th to ~100 calls/min on May 10th and remained elevated throughout the survey. As male toadfish are known to generate boatwhistles to attract mates, this rapid increase in call rate was interpreted to mark the onset of spring spawning behavior. Call rate was not modulated by water temperature, but showed a consistent diurnal pattern, with a sharp decrease in rate just before sunrise and a peak just after sunset. There was a significant difference in call rate between restored and unrestored reefs, with restored sites having nearly twice the call rate as unrestored sites. This work highlights the benefits of using automated detection techniques that provide quantitative information on species-specific call characteristics and patterns. This type of non-invasive acoustic monitoring provides long-term, semi-continuous information on animal behavior and abundance, and operates effectively in settings that are otherwise difficult to sample. PMID:28792543

  9. Oyster toadfish (Opsanus tau) boatwhistle call detection and patterns within a large-scale oyster restoration site.

    PubMed

    Ricci, Shannon W; Bohnenstiehl, DelWayne R; Eggleston, David B; Kellogg, M Lisa; Lyon, R Patrick

    2017-01-01

    During May 2015, passive acoustic recorders were deployed at eight subtidal oyster reefs within Harris Creek Oyster Sanctuary in Chesapeake Bay, Maryland USA. These sites were selected to represent both restored and unrestored habitats having a range of oyster densities. Throughout the survey, the soundscape within Harris Creek was dominated by the boatwhistle calls of the oyster toadfish, Opsanus tau. A novel, multi-kernel spectral correlation approach was developed to automatically detect these boatwhistle calls using their two lowest harmonic bands. The results provided quantitative information on how call rate and call frequency varied in space and time. Toadfish boatwhistle fundamental frequency ranged from 140 Hz to 260 Hz and was well correlated (r = 0.94) with changes in water temperature, with the fundamental frequency increasing by ~11 Hz for every 1°C increase in temperature. The boatwhistle call rate increased from just a few calls per minute at the start of monitoring on May 7th to ~100 calls/min on May 10th and remained elevated throughout the survey. As male toadfish are known to generate boatwhistles to attract mates, this rapid increase in call rate was interpreted to mark the onset of spring spawning behavior. Call rate was not modulated by water temperature, but showed a consistent diurnal pattern, with a sharp decrease in rate just before sunrise and a peak just after sunset. There was a significant difference in call rate between restored and unrestored reefs, with restored sites having nearly twice the call rate as unrestored sites. This work highlights the benefits of using automated detection techniques that provide quantitative information on species-specific call characteristics and patterns. This type of non-invasive acoustic monitoring provides long-term, semi-continuous information on animal behavior and abundance, and operates effectively in settings that are otherwise difficult to sample.
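    A simple way to appreciate the detection problem is a two-harmonic band-energy test. The sketch below is a hedged simplification of the multi-kernel spectral correlation detector described above: the band limits come from the reported 140-260 Hz fundamental range, while the FFT parameters and decision threshold are assumptions.

    ```python
    # Hedged sketch: flag spectrogram frames where both the fundamental band and
    # its second harmonic stand out above the median background level.
    import numpy as np
    from scipy.signal import spectrogram

    def boatwhistle_frames(audio, fs, f0_band=(140.0, 260.0), threshold_db=10.0):
        freqs, times, power = spectrogram(audio, fs=fs, nperseg=2048, noverlap=1024)
        power_db = 10.0 * np.log10(power + 1e-12)
        background = np.median(power_db, axis=1, keepdims=True)
        excess = power_db - background
        in_f0 = (freqs >= f0_band[0]) & (freqs <= f0_band[1])
        in_h2 = (freqs >= 2 * f0_band[0]) & (freqs <= 2 * f0_band[1])
        hits = (excess[in_f0].max(axis=0) > threshold_db) & \
               (excess[in_h2].max(axis=0) > threshold_db)
        return times[hits]
    ```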

  10. 'Known Secure Sensor Measurements' for Critical Infrastructure Systems: Detecting Falsification of System State

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles McQueen; Annarita Giani

    2011-09-01

This paper describes a first investigation of a low-cost, low-false-alarm, reliable mechanism for detecting manipulation of critical physical processes and falsification of system state. We call this novel mechanism Known Secure Sensor Measurements (KSSM). The method moves beyond analysis of network traffic and host-based state information; instead, it uses physical measurements of the process being controlled to detect falsification of state. KSSM is intended to be incorporated into the design of new, resilient, cost-effective critical infrastructure control systems. It can also be included in incremental upgrades of already installed systems for enhanced resilience. KSSM is based on known secure physical measurements for assessing the likelihood of an attack and will demonstrate a practical approach to creating, transmitting, and using the known secure measurements for detection.

  11. Detection of Fingerprints Based on Elemental Composition Using Micro-X-Ray Fluorescence.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, C. G.; Wiltshire, S.; Miller, T. C.

A method was developed to detect fingerprints using a technique known as micro-X-ray fluorescence. The traditional method of detecting fingerprints involves treating the sample with certain powders, liquids, or vapors to add color to the fingerprint so that it can be easily seen and photographed for forensic purposes. This is known as contrast enhancement, and a multitude of chemical processing methods have been developed in the past century to render fingerprints visible. However, fingerprints present on certain substances such as fibrous papers and textiles, wood, leather, plastic, adhesives, and human skin can sometimes be difficult to detect by contrast enhancement. Children's fingerprints are also difficult to detect due to the absence of sebum on their skin, and detection of prints left on certain colored backgrounds can sometimes be problematic. Micro-X-ray fluorescence (MXRF) was studied here as a method to detect fingerprints based on chemical elements present in fingerprint residue. For example, salts such as sodium chloride and potassium chloride excreted in sweat are sometimes present in detectable quantities in fingerprints. We demonstrated that MXRF can be used to detect the sodium, potassium, and chlorine from such salts. Furthermore, using MXRF, each of these elements (and many other elements if present) can be detected as a function of location on a surface, so we were able to 'see' a fingerprint because these salts are deposited mainly along the patterns present in a fingerprint (traditionally called friction ridges in forensic science). MXRF is not a panacea for detecting all fingerprints; some prints will not contain enough detectable material to be 'seen'. However, determining an effective means of coloring a fingerprint with traditional contrast enhancement methods can sometimes be an arduous process with limited success. Thus, MXRF offers a possible alternative for detecting fingerprints, and it does not require any additional chemical treatment steps, which can be time consuming and can permanently alter the sample. Additionally, MXRF is noninvasive, so a fingerprint analyzed by this method is left pristine for examination by other methods (e.g., DNA extraction). To the best of the authors' knowledge, no studies have been published to date concerning the detection of fingerprints by micro-X-ray fluorescence. Some studies have been published in which other spectroscopic methods were employed to examine the chemical composition of fingerprints (e.g., IR, SEM/EDX, and Auger), but very few papers discuss the actual detection and imaging of a complete fingerprint by any spectroscopic method. Thus, this work is unique.

  12. Assay for the simultaneous determination of guanidinoacetic acid, creatinine and creatine in plasma and urine by capillary electrophoresis UV-detection.

    PubMed

    Zinellu, Angelo; Sotgia, Salvatore; Zinellu, Elisabetta; Chessa, Roberto; Deiana, Luca; Carru, Ciriaco

    2006-03-01

    Guanidinoacetic acid (GAA) measurement has recently become of great interest for the diagnosis of creatine (Cn) metabolism disorders, and research calls for rapid and inexpensive methods for its detection in plasma and urine in order to assess a large number of patients. We propose a new assay for the measurement of GAA by a simple CZE UV-detection without previous sample derivatization. Plasma samples were filtered by Microcon-10 microconcentrators and directly injected into the capillary, while for urine specimens a simple water dilution before injection was needed. A baseline separation was obtained in less than 8 min using a 60.2 cm x 75 microm uncoated silica capillary, 75 mmol/L Tris-phosphate buffer pH 2.25 at 15 degrees C. The performance of the developed method was assessed by measuring plasma creatinine and Cn in 32 normal subjects and comparing the data obtained by the new method with those found with the previous CE assay. Our new method seems to be an inexpensive, fast and specific tool to assess a large number of patients both in clinical and in research laboratories.

  13. Modeling the heterogeneous traffic correlations in urban road systems using traffic-enhanced community detection approach

    NASA Astrophysics Data System (ADS)

    Lu, Feng; Liu, Kang; Duan, Yingying; Cheng, Shifen; Du, Fei

    2018-07-01

A better characterization of the traffic influence among urban roads is crucial for traffic control and traffic forecasting. The existence of spatial heterogeneity strongly affects how the extent and degree of road traffic correlation should be modeled, which is usually neglected by traditional distance-based methods. In this paper, we propose a traffic-enhanced community detection approach to spatially reveal the traffic correlation in city road networks. First, the road network is modeled as a traffic-enhanced dual graph, with the closeness between two road segments determined not only by their topological connection but also by the traffic correlation between them. Then a flow-based community detection algorithm called Infomap is utilized to identify the road segment clusters. Evaluated by Moran's I, the Calinski-Harabasz index and a traffic interpolation application, we find that, compared to the distance-based method and the community-based method, our proposed traffic-enhanced community-based method better captures the extent of traffic relevance, as both the topological structure of the road network and the traffic correlations among urban roads are considered. It can be used in more traffic-related applications, such as traffic forecasting, traffic control and guidance.
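    A hedged sketch of the dual-graph construction follows, using networkx. Greedy modularity communities are used here as a convenient stand-in for the Infomap algorithm that the paper actually uses; the weighting scheme and the mixing parameter alpha are assumptions.

    ```python
    # Illustrative sketch: segments become nodes of a dual graph, edges mix
    # topological adjacency with traffic correlation, then the graph is partitioned.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def traffic_enhanced_communities(adjacent_pairs, traffic_corr, alpha=0.5):
        """adjacent_pairs: iterable of (segment_a, segment_b) for topologically
        connected road segments; traffic_corr: dict {(a, b): correlation in [0, 1]}."""
        g = nx.Graph()
        for a, b in adjacent_pairs:
            corr = traffic_corr.get((a, b), traffic_corr.get((b, a), 0.0))
            # Closeness mixes topology (constant 1 for adjacency) with correlation.
            g.add_edge(a, b, weight=alpha * 1.0 + (1.0 - alpha) * corr)
        return list(greedy_modularity_communities(g, weight="weight"))
    ```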

  14. Object detection system based on multimodel saliency maps

    NASA Astrophysics Data System (ADS)

    Guo, Ya'nan; Luo, Chongfan; Ma, Yide

    2017-03-01

Detection of visually salient image regions is extensively applied in computer vision and computer graphics, for tasks such as object detection, adaptive compression, and object recognition, but any single model has its limitations across varied images. In this work, we therefore establish a method based on multimodel saliency maps to detect objects, one that intelligently absorbs the merits of various individual saliency detection models to achieve promising results. The method can be roughly divided into three steps. In the first step, we propose a decision-making system to evaluate saliency maps obtained by seven competitive methods and select only the three most valuable saliency maps. In the second step, we introduce a heterogeneous PCNN algorithm to obtain three prime foregrounds, and a self-designed nonlinear fusion method then merges these saliency maps. In the last step, an adaptive improved and simplified PCNN (SPCNN) model is used to detect the object. Our proposed method can constitute an object detection system for different occasions that requires no training, is simple, and is highly efficient. The proposed saliency fusion technique shows better performance over a broad range of images and enriches the applicability range by fusing different individual saliency models; the proposed system can therefore be regarded as a strong model. Moreover, the proposed adaptive improved SPCNN model stems from Eckhorn's neuron model, which is well suited to image segmentation because of its biological background, and all of its parameters adapt to the image information. We extensively appraise our algorithm on a classical salient object detection database, and the experimental results demonstrate that the aggregation of saliency maps outperforms the best individual saliency model in all cases, yielding the highest precision of 89.90%, better recall of 98.20%, the greatest F-measure of 91.20%, and the lowest mean absolute error of 0.057; the value of the proposed saliency evaluation EHA reaches 215.287. We believe our method can be applied to diverse applications in the future.

  15. Development of an innovative immunoassay for CP4EPSPS and Cry1AB genetically modified protein detection and quantification.

    PubMed

    Ermolli, M; Prospero, A; Balla, B; Querci, M; Mazzeo, A; Van Den Eede, G

    2006-09-01

An innovative immunoassay, called enzyme-linked immunosorbent assay (ELISA) Reverse, based on a new conformation of the solid phase, was developed. The solid support was expressly designed to be immersed directly in liquid samples to detect the presence of protein targets. Its application is proposed in those cases where a large number of samples have to be screened simultaneously or when the simultaneous detection of different proteins is required. As a first application, a quantitative immunoassay for the Cry1AB protein in genetically modified maize was optimized. The method was tested using genetically modified organism concentrations from 0.1 to 2.0%. The limit of detection and limit of quantitation of the method were determined as 0.0056 and 0.0168 (expressed as the percentage of genetically modified organism content), respectively. A qualitative multiplex assay to assess the presence of two genetically modified proteins simultaneously was also established for the case of Cry1AB and CP4EPSPS (5-enolpyruvylshikimate-3-phosphate synthase), present in genetically modified maize and soy, respectively.

  16. Identification of pathogen genomic variants through an integrated pipeline

    PubMed Central

    2014-01-01

    Background Whole-genome sequencing represents a powerful experimental tool for pathogen research. We present methods for the analysis of small eukaryotic genomes, including a streamlined system (called Platypus) for finding single nucleotide and copy number variants as well as recombination events. Results We have validated our pipeline using four sets of Plasmodium falciparum drug resistant data containing 26 clones from 3D7 and Dd2 background strains, identifying an average of 11 single nucleotide variants per clone. We also identify 8 copy number variants with contributions to resistance, and report for the first time that all analyzed amplification events are in tandem. Conclusions The Platypus pipeline provides malaria researchers with a powerful tool to analyze short read sequencing data. It provides an accurate way to detect SNVs using known software packages, and a novel methodology for detection of CNVs, though it does not currently support detection of small indels. We have validated that the pipeline detects known SNVs in a variety of samples while filtering out spurious data. We bundle the methods into a freely available package. PMID:24589256

  17. SEVEN NEW BINARIES DISCOVERED IN THE KEPLER LIGHT CURVES THROUGH THE BEER METHOD CONFIRMED BY RADIAL-VELOCITY OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faigler, S.; Mazeh, T.; Tal-Or, L.

We present seven newly discovered non-eclipsing short-period binary systems with low-mass companions, identified by the recently introduced BEER algorithm, applied to the publicly available 138-day photometric light curves obtained by the Kepler mission. The detection is based on the beaming effect (sometimes called Doppler boosting), which increases (decreases) the brightness of any light source approaching (receding from) the observer, enabling a prediction of the stellar Doppler radial-velocity (RV) modulation from its precise photometry. The BEER algorithm identifies the BEaming periodic modulation, with a combination of the well-known Ellipsoidal and Reflection/heating periodic effects, induced by short-period companions. The seven detections were confirmed by spectroscopic RV follow-up observations, indicating minimum secondary masses in the range 0.07-0.4 M_Sun. The binaries discovered establish for the first time the feasibility of the BEER algorithm as a new detection method for short-period non-eclipsing binaries, with the potential to detect in the near future non-transiting brown-dwarf secondaries, or even massive planets.

  18. ExScalibur: A High-Performance Cloud-Enabled Suite for Whole Exome Germline and Somatic Mutation Identification.

    PubMed

    Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge

    2015-01-01

Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on the Amazon cloud.

  19. Abnormal plasma DNA profiles in early ovarian cancer using a non-invasive prenatal testing platform: implications for cancer screening.

    PubMed

    Cohen, Paul A; Flowers, Nicola; Tong, Stephen; Hannan, Natalie; Pertile, Mark D; Hui, Lisa

    2016-08-24

    Non-invasive prenatal testing (NIPT) identifies fetal aneuploidy by sequencing cell-free DNA in the maternal plasma. Pre-symptomatic maternal malignancies have been incidentally detected during NIPT based on abnormal genomic profiles. This low coverage sequencing approach could have potential for ovarian cancer screening in the non-pregnant population. Our objective was to investigate whether plasma DNA sequencing with a clinical whole genome NIPT platform can detect early- and late-stage high-grade serous ovarian carcinomas (HGSOC). This is a case control study of prospectively-collected biobank samples comprising preoperative plasma from 32 women with HGSOC (16 'early cancer' (FIGO I-II) and 16 'advanced cancer' (FIGO III-IV)) and 32 benign controls. Plasma DNA from cases and controls were sequenced using a commercial NIPT platform and chromosome dosage measured. Sequencing data were blindly analyzed with two methods: (1) Subchromosomal changes were called using an open source algorithm WISECONDOR (WIthin-SamplE COpy Number aberration DetectOR). Genomic gains or losses ≥ 15 Mb were prespecified as "screen positive" calls, and mapped to recurrent copy number variations reported in an ovarian cancer genome atlas. (2) Selected whole chromosome gains or losses were reported using the routine NIPT pipeline for fetal aneuploidy. We detected 13/32 cancer cases using the subchromosomal analysis (sensitivity 40.6 %, 95 % CI, 23.7-59.4 %), including 6/16 early and 7/16 advanced HGSOC cases. Two of 32 benign controls had subchromosomal gains ≥ 15 Mb (specificity 93.8 %, 95 % CI, 79.2-99.2 %). Twelve of the 13 true positive cancer cases exhibited specific recurrent changes reported in HGSOC tumors. The NIPT pipeline resulted in one "monosomy 18" call from the cancer group, and two "monosomy X" calls in the controls. Low coverage plasma DNA sequencing used for prenatal testing detected 40.6 % of all HGSOC, including 38 % of early stage cases. Our findings demonstrate the potential of a high throughput sequencing platform to screen for early HGSOC in plasma based on characteristic multiple segmental chromosome gains and losses. The performance of this approach may be further improved by refining bioinformatics algorithms and targeting selected cancer copy number variations.
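    The prespecified "≥ 15 Mb" screen-positive rule can be illustrated with a toy filter over called copy-number segments. The segment format and the 10% copy-ratio deviation threshold below are assumptions for illustration, not the study's actual criteria.

    ```python
    # Toy illustration of a screen-positive rule over subchromosomal calls:
    # flag a sample if any segment of >= 15 Mb deviates from the normal copy ratio.
    def screen_positive(segments, min_length_bp=15_000_000, min_deviation=0.1):
        """segments: list of (chrom, start_bp, end_bp, copy_ratio) tuples from a
        subchromosomal caller such as WISECONDOR (format assumed here)."""
        for chrom, start, end, ratio in segments:
            if end - start >= min_length_bp and abs(ratio - 1.0) > min_deviation:
                return True   # segmental gain or loss of at least 15 Mb
        return False
    ```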

  20. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
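    The bootstrap-based power idea can be sketched outside of R (bmem itself is an R package): simulate data from a mediation model, bootstrap the indirect effect a*b, and count how often the confidence interval excludes zero. The sketch below uses normal errors for brevity; nonnormal errors with the desired skewness and kurtosis could be substituted for np.random normals, and the simulation sizes are illustrative.

    ```python
    # Conceptual sketch of Monte Carlo power estimation for a simple mediation model
    # (X -> M -> Y), with the indirect effect tested by a percentile bootstrap CI.
    import numpy as np

    def mediation_power(n, a, b, c_prime, n_sims=200, n_boot=500, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sims):
            x = rng.normal(size=n)
            m = a * x + rng.normal(size=n)
            y = b * m + c_prime * x + rng.normal(size=n)
            boot_ab = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, size=n)
                xb, mb, yb = x[idx], m[idx], y[idx]
                a_hat = np.polyfit(xb, mb, 1)[0]              # slope of M ~ X
                design = np.column_stack([mb, xb, np.ones(n)])
                b_hat = np.linalg.lstsq(design, yb, rcond=None)[0][0]  # M slope in Y ~ M + X
                boot_ab[i] = a_hat * b_hat
            lo, hi = np.percentile(boot_ab, [100 * alpha / 2, 100 * (1 - alpha / 2)])
            hits += (lo > 0) or (hi < 0)                      # CI excludes zero
        return hits / n_sims
    ```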

  1. Cross-correlation detection and analysis for California's electricity market based on analogous multifractal analysis

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Liao, Gui-ping; Li, Jian-hui; Zou, Rui-biao; Shi, Wen

    2013-03-01

A novel method, which we call the analogous multifractal cross-correlation analysis, is proposed in this paper to study the multifractal behavior in the power-law cross-correlation between price and load in the California electricity market. In addition, a statistic ρAMF-XA, which we call the analogous multifractal cross-correlation coefficient, is defined to test whether the cross-correlation between two given signals is genuine or not. Our analysis finds that both the price and load time series in the California electricity market express multifractal nature. However, as indicated by the ρAMF-XA statistical test, there is a huge difference in the cross-correlation behavior between the years 1999 and 2000 in the California electricity market.

  2. Cross-correlation detection and analysis for California's electricity market based on analogous multifractal analysis.

    PubMed

    Wang, Fang; Liao, Gui-ping; Li, Jian-hui; Zou, Rui-biao; Shi, Wen

    2013-03-01

A novel method, which we call the analogous multifractal cross-correlation analysis, is proposed in this paper to study the multifractal behavior in the power-law cross-correlation between price and load in the California electricity market. In addition, a statistic ρAMF-XA, which we call the analogous multifractal cross-correlation coefficient, is defined to test whether the cross-correlation between two given signals is genuine or not. Our analysis finds that both the price and load time series in the California electricity market express multifractal nature. However, as indicated by the ρAMF-XA statistical test, there is a huge difference in the cross-correlation behavior between the years 1999 and 2000 in the California electricity market.

  3. Real-time RT-PCR systems for CTC detection from blood samples of breast cancer and gynaecological tumour patients (Review).

    PubMed

    Andergassen, Ulrich; Kölbl, Alexandra C; Mahner, Sven; Jeschke, Udo

    2016-04-01

Cells that detach from a primary epithelial tumour and migrate through lymphatic vessels and the blood stream are called 'circulating tumour cells'. These cells are considered to be the main root of remote metastasis and are correlated with a worse prognosis in terms of progression-free and overall survival of the patients. Therefore, the detection of minimal residual disease is of great importance for therapeutic decisions. Many different detection strategies are already available, but only one method, the CellSearch® system, has reached FDA approval. The present review focusses on the detection of circulating tumour cells by means of real-time PCR, a highly sensitive method based on differences in gene expression between normal and malignant cells. Strategies for enrichment of tumour cells are described, as well as a large panel of potential marker genes. Drawbacks and advantages of the technique are elucidated; the greatest advantage might be that, by selecting appropriate marker genes, tumour cells that have already undergone epithelial-to-mesenchymal transition can also be detected. Finally, the application of real-time PCR in different gynaecological malignancies is described, with breast cancer being the most studied cancer entity.

  4. Trackline and point detection probabilities for acoustic surveys of Cuvier's and Blainville's beaked whales.

    PubMed

    Barlow, Jay; Tyack, Peter L; Johnson, Mark P; Baird, Robin W; Schorr, Gregory S; Andrews, Russel D; Aguilar de Soto, Natacha

    2013-09-01

    Acoustic survey methods can be used to estimate density and abundance using sounds produced by cetaceans and detected using hydrophones if the probability of detection can be estimated. For passive acoustic surveys, probability of detection at zero horizontal distance from a sensor, commonly called g(0), depends on the temporal patterns of vocalizations. Methods to estimate g(0) are developed based on the assumption that a beaked whale will be detected if it is producing regular echolocation clicks directly under or above a hydrophone. Data from acoustic recording tags placed on two species of beaked whales (Cuvier's beaked whale-Ziphius cavirostris and Blainville's beaked whale-Mesoplodon densirostris) are used to directly estimate the percentage of time they produce echolocation clicks. A model of vocal behavior for these species as a function of their diving behavior is applied to other types of dive data (from time-depth recorders and time-depth-transmitting satellite tags) to indirectly determine g(0) in other locations for low ambient noise conditions. Estimates of g(0) for a single instant in time are 0.28 [standard deviation (s.d.) = 0.05] for Cuvier's beaked whale and 0.19 (s.d. = 0.01) for Blainville's beaked whale.
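    The instantaneous g(0) described above is essentially the fraction of time an animal produces regular echolocation clicks; a minimal sketch of that bookkeeping follows, with the tag-data format assumed for illustration.

    ```python
    # Minimal sketch: instantaneous g(0) as the proportion of a tag record during
    # which the animal was producing regular clicks.
    def instantaneous_g0(click_intervals, total_record_seconds):
        """click_intervals: list of (start_s, end_s) clicking bouts from an acoustic tag."""
        clicking = sum(end - start for start, end in click_intervals)
        return clicking / total_record_seconds

    # Example: 1.7 h of clicking in a 6 h record gives g(0) ~ 0.28.
    print(instantaneous_g0([(0.0, 6120.0)], 6.0 * 3600.0))
    ```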

  5. A STATISTICAL SURVEY OF DIOXIN-LIKE COMPOUNDS IN ...

    EPA Pesticide Factsheets

The USEPA and the USDA completed the first statistically designed survey of the occurrence and concentration of dibenzo-p-dioxins (CDDs), dibenzofurans (CDFs), and coplanar polychlorinated biphenyls (PCBs) in the fat of beef animals raised for human consumption in the United States. Back fat was sampled from 63 carcasses at federally inspected slaughter establishments nationwide. The sample design called for sampling beef animal classes in proportion to national annual slaughter statistics. All samples were analyzed using a modification of EPA method 1613, using isotope dilution, High Resolution GC/MS to determine the rate of occurrence of 2,3,7,8-substituted CDDs/CDFs/PCBs. The method detection limits ranged from 0.05 ng/kg for TCDD to 3 ng/kg for OCDD. The results of this survey showed a mean concentration (reported as I-TEQ, lipid adjusted) in U.S. beef animals of 0.35 ng/kg and 0.89 ng/kg for CDD/CDF TEQs when non-detects are treated as 0 or assigned a value of 1/2 the detection limit, respectively, and 0.51 ng/kg for coplanar PCB TEQs under both non-detect conventions.
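    The two non-detect conventions mentioned above (zero versus half the detection limit) can be made concrete with a small toxic-equivalents calculation. The congener list and TEF values below are placeholders for illustration, not the survey's actual data.

    ```python
    # Worked example of TEQ under the two non-detect conventions described above.
    def toxic_equivalents(congeners, nondetect_as_half_dl=True):
        """congeners: list of dicts with keys 'conc' (ng/kg, None if non-detect),
        'dl' (detection limit, ng/kg) and 'tef' (toxic equivalency factor)."""
        teq = 0.0
        for c in congeners:
            if c["conc"] is not None:
                conc = c["conc"]
            else:
                conc = 0.5 * c["dl"] if nondetect_as_half_dl else 0.0
            teq += conc * c["tef"]
        return teq

    sample = [
        {"conc": None, "dl": 0.05, "tef": 1.0},    # placeholder TCDD-like congener, non-detect
        {"conc": 2.0, "dl": 3.0, "tef": 0.0003},   # placeholder OCDD-like congener, detected
    ]
    print(toxic_equivalents(sample, nondetect_as_half_dl=False))  # non-detects as 0
    print(toxic_equivalents(sample, nondetect_as_half_dl=True))   # non-detects as DL/2
    ```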

  6. Comparison of PCR methods for the detection of genetic variants of carp edema virus.

    PubMed

    Adamek, Mikolaj; Matras, Marek; Jung-Schroers, Verena; Teitge, Felix; Heling, Max; Bergmann, Sven M; Reichert, Michal; Way, Keith; Stone, David M; Steinhagen, Dieter

    2017-09-20

    The infection of common carp and its ornamental variety, koi, with the carp edema virus (CEV) is often associated with the occurrence of a clinical disease called 'koi sleepy disease'. The disease may lead to high mortality in both koi and common carp populations. To prevent further spread of the infection and the disease, a reliable detection method for this virus is required. However, the high genetic variability of the CEV p4a gene used for PCR-based diagnostics could be a serious obstacle for successful and reliable detection of virus infection in field samples. By analysing 39 field samples from different geographical origins obtained from koi and farmed carp and from all 3 genogroups of CEV, using several recently available PCR protocols, we investigated which of the protocols would allow the detection of CEV from all known genogroups present in samples from Central European carp or koi populations. The comparison of 5 different PCR protocols showed that the PCR assays (both end-point and quantitative) developed in the Centre for Environment, Fisheries and Aquaculture Science exhibited the highest analytical inclusivity and diagnostic sensitivity. Currently, this makes them the most suitable protocols for detecting viruses from all known CEV genogroups.

  7. X-ray Scatter Imaging of Hepatocellular Carcinoma in a Mouse Model Using Nanoparticle Contrast Agents

    NASA Astrophysics Data System (ADS)

    Rand, Danielle; Derdak, Zoltan; Carlson, Rolf; Wands, Jack R.; Rose-Petruck, Christoph

    2015-10-01

    Hepatocellular carcinoma (HCC) is one of the most common malignant tumors worldwide and is almost uniformly fatal. Current methods of detection include ultrasound examination and imaging by CT scan or MRI; however, these techniques are problematic in terms of sensitivity and specificity, and the detection of early tumors (<1 cm diameter) has proven elusive. Better, more specific, and more sensitive detection methods are therefore urgently needed. Here we discuss the application of a newly developed x-ray imaging technique called Spatial Frequency Heterodyne Imaging (SFHI) for the early detection of HCC. SFHI uses x-rays scattered by an object to form an image and is more sensitive than conventional absorption-based x-radiography. We show that tissues labeled in vivo with gold nanoparticle contrast agents can be detected using SFHI. We also demonstrate that directed targeting and SFHI of HCC tumors in a mouse model is possible through the use of HCC-specific antibodies. The enhanced sensitivity of SFHI relative to currently available techniques enables the x-ray imaging of tumors that are just a few millimeters in diameter and substantially reduces the amount of nanoparticle contrast agent required for intravenous injection relative to absorption-based x-ray imaging.

  8. Accelerated SPECT Monte Carlo Simulation Using Multiple Projection Sampling and Convolution-Based Forced Detection

    NASA Astrophysics Data System (ADS)

    Liu, Shaoying; King, Michael A.; Brill, Aaron B.; Stabin, Michael G.; Farncombe, Troy H.

    2008-02-01

Monte Carlo (MC) is a well-utilized tool for simulating photon transport in single photon emission computed tomography (SPECT) due to its ability to accurately model the physical processes of photon transport. As a consequence of this accuracy, it suffers from a relatively low detection efficiency and long computation time. One technique used to improve the speed of MC modeling is the effective and well-established variance reduction technique (VRT) known as forced detection (FD). With this method, photons are followed as they traverse the object under study but are then forced to travel in the direction of the detector surface, whereby they are detected at a single detector location. Another method, called convolution-based forced detection (CFD), is based on the fundamental idea of FD with the exception that photons are detected at multiple detector locations, determined with a distance-dependent blurring kernel. In order to further increase the speed of MC, a method named multiple projection convolution-based forced detection (MP-CFD) is presented. Rather than forcing photons to hit a single detector, the MP-CFD method follows the photon transport through the object but then, at each scatter site, forces the photon to interact with a number of detectors at a variety of angles surrounding the object. This way, it is possible to simulate all the projection images of a SPECT simulation in parallel, rather than as independent projections. The result is vastly improved simulation time, as much of the computational load of simulating photon transport through the object is incurred only once for all projection angles. The results of the proposed MP-CFD method agree well with experimental data in measurements of the point spread function (PSF), producing a correlation coefficient (r²) of 0.99 compared to experimental data. MP-CFD is shown to be about 60 times faster than a regular forced detection MC program, with similar results.

  9. High sensitivity contrast enhanced optical coherence tomography for functional in vivo imaging

    NASA Astrophysics Data System (ADS)

    Liba, Orly; SoRelle, Elliott D.; Sen, Debasish; de la Zerda, Adam

    2017-02-01

    In this study, we developed and applied highly-scattering large gold nanorods (LGNRs) and custom spectral detection algorithms for high sensitivity contrast-enhanced optical coherence tomography (OCT). We were able to detect LGNRs at a concentration as low as 50 pM in blood. We used this approach for noninvasive 3D imaging of blood vessels deep in solid tumors in living mice. Additionally, we demonstrated multiplexed imaging of spectrally-distinct LGNRs that enabled observations of functional drainage in lymphatic networks. This method, which we call MOZART, provides a platform for molecular imaging and characterization of tissue noninvasively at cellular resolution.

  10. Enhanced backscatter of optical beams reflected in turbulent air.

    PubMed

    Nelson, W; Palastro, J P; Wu, C; Davis, C C

    2015-07-01

    Optical beams propagating through air acquire phase distortions from turbulent fluctuations in the refractive index. While these distortions are usually deleterious to propagation, beams reflected in a turbulent medium can undergo a local recovery of spatial coherence and intensity enhancement referred to as enhanced backscatter (EBS). Here we validate the commonly used phase screen simulation with experimental results obtained from lab-scale experiments. We also verify theoretical predictions of the dependence of the turbulence strength on EBS. Finally, we present a novel algorithm called the "tilt-shift method" which allows detection of EBS in frozen turbulence, reducing the time required to detect the EBS signal.

  11. Validation of an acoustic location system to monitor Bornean orangutan (Pongo pygmaeus wurmbii) long calls.

    PubMed

    Spillmann, Brigitte; van Noordwijk, Maria A; Willems, Erik P; Mitra Setia, Tatang; Wipfli, Urs; van Schaik, Carel P

    2015-07-01

The long call is an important vocal communication signal in the widely dispersed, semi-solitary orangutan. Long calls affect individuals' ranging behavior, mediate social relationships, and regulate encounters between dispersed individuals in a dense rainforest. The aim of this study was to test the utility of an Acoustic Location System (ALS) for recording and triangulating the loud calls of free-living primates. We developed and validated a data extraction protocol for an ALS used to record wild orangutan males' long calls at the Tuanan field site (Central Kalimantan). We installed an ALS in a grid of 300 ha, containing 20 SM2+ recorders placed in a regular lattice at 500 m intervals, to monitor the distribution of calling males in the area. The validated system had the following main features: (i) a user-trained software algorithm (Song Scope) that reliably recognized orangutan long calls from sound files at distances up to 700 m from the nearest recorder, resulting in a total area of approximately 900 ha that could be monitored continuously; (ii) acoustic location of calling males up to 200 m outside the microphone grid, which meant that within an area of approximately 450 ha, call locations could be calculated through triangulation. The mean accuracy was 58 m, an error that is modest relative to orangutan mobility and average inter-individual distances. We conclude that an ALS is a highly effective method for detecting long-distance calls of wild primates and triangulating their position. In combination with conventional individual focal follow data, an ALS can greatly improve our knowledge of orangutans' social organization, and the approach is readily adaptable for studying other highly vocal animals.
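    Triangulation from a recorder grid is typically posed as a time-difference-of-arrival problem. The sketch below shows one hedged way to solve it by nonlinear least squares; the field system's actual solver is not described in this record, and the sound speed, starting point and 2D geometry are assumptions.

    ```python
    # Hedged sketch: locate a caller from arrival-time differences across recorders.
    import numpy as np
    from scipy.optimize import least_squares

    SOUND_SPEED = 340.0  # m/s in air, an assumed constant

    def locate_caller(recorder_xy, arrival_times):
        """recorder_xy: (n, 2) array of recorder positions (m); arrival_times: (n,)
        detection times (s) of the same call. Returns the estimated (x, y)."""
        recorder_xy = np.asarray(recorder_xy, float)
        arrival_times = np.asarray(arrival_times, float)
        ref = np.argmin(arrival_times)               # earliest-receiving recorder

        def residuals(xy):
            dists = np.linalg.norm(recorder_xy - xy, axis=1)
            predicted_tdoa = (dists - dists[ref]) / SOUND_SPEED
            observed_tdoa = arrival_times - arrival_times[ref]
            return predicted_tdoa - observed_tdoa

        x0 = recorder_xy.mean(axis=0)                # start at the grid centroid
        return least_squares(residuals, x0).x
    ```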

  12. Early and Real-Time Detection of Seasonal Influenza Onset

    PubMed Central

    Marques-Pita, Manuel

    2017-01-01

    Every year, influenza epidemics affect millions of people and place a strong burden on health care services. A timely knowledge of the onset of the epidemic could allow these services to prepare for the peak. We present a method that can reliably identify and signal the influenza outbreak. By combining official Influenza-Like Illness (ILI) incidence rates, searches for ILI-related terms on Google, and an on-call triage phone service, Saúde 24, we were able to identify the beginning of the flu season in 8 European countries, anticipating current official alerts by several weeks. This work shows that it is possible to detect and consistently anticipate the onset of the flu season, in real-time, regardless of the amplitude of the epidemic, with obvious advantages for health care authorities. We also show that the method is not limited to one country, specific region or language, and that it provides a simple and reliable signal that can be used in early detection of other seasonal diseases. PMID:28158192

  13. Apricot DNA as an indicator for persipan: detection and quantitation in marzipan using ligation-dependent probe amplification.

    PubMed

    Luber, Florian; Demmel, Anja; Hosken, Anne; Busch, Ulrich; Engel, Karl-Heinz

    2012-06-13

    The confectionery ingredient marzipan is exclusively prepared from almond kernels and sugar. The potential use of apricot kernels, so-called persipan, is an important issue for the quality assessment of marzipan. Therefore, a ligation-dependent probe amplification (LPA) assay was developed that enables a specific and sensitive detection of apricot DNA, as an indicator for the presence of persipan. The limit of detection was determined to be 0.1% persipan in marzipan. The suitability of the method was confirmed by the analysis of 20 commercially available food samples. The integration of a Prunus -specific probe in the LPA assay as a reference allowed for the relative quantitation of persipan in marzipan. The limit of quantitation was determined to be 0.5% persipan in marzipan. The analysis of two self-prepared mixtures of marzipan and persipan demonstrated the applicability of the quantitation method at concentration levels of practical relevance for quality control.

  14. Using distances between Top-n-gram and residue pairs for protein remote homology detection.

    PubMed

    Liu, Bin; Xu, Jinghao; Zou, Quan; Xu, Ruifeng; Wang, Xiaolong; Chen, Qingcai

    2014-01-01

    Protein remote homology detection is one of the central problems in bioinformatics, which is important for both basic research and practical application. Currently, discriminative methods based on Support Vector Machines (SVMs) achieve state-of-the-art performance. Exploring feature vectors incorporating the position information of amino acids or other protein building blocks is a key step to improve the performance of the SVM-based methods. Two new methods for protein remote homology detection were proposed, called SVM-DR and SVM-DT. SVM-DR is a sequence-based method, in which the feature vector representation for a protein is based on the distances between residue pairs. SVM-DT is a profile-based method, which considers the distances between Top-n-gram pairs. Top-n-gram can be viewed as a profile-based building block of proteins, which is calculated from the frequency profiles. These two methods are position-dependent approaches incorporating the sequence-order information of protein sequences. Various experiments were conducted on a benchmark dataset containing 54 families and 23 superfamilies. Experimental results showed that these two new methods are very promising. Compared with the position-independent methods, the performance improvement is obvious. Furthermore, the proposed methods can also provide useful insights for studying the features of protein families. The better performance of the proposed methods demonstrates that position-dependent approaches are efficient for protein remote homology detection. Another advantage of our methods arises from the explicit feature space representation, which can be used to analyze the characteristic features of protein families. The source code of SVM-DT and SVM-DR is available at http://bioinformatics.hitsz.edu.cn/DistanceSVM/index.jsp.
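
    The sketch below illustrates, under simplifying assumptions, the kind of distance-based residue-pair feature vector that SVM-DR builds from a sequence; the exact encoding, distance range, and normalization used by the authors may differ, and the example sequence is arbitrary.

        from itertools import product

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

        def distance_pair_features(seq, max_dist=3):
            # index every (residue, residue, distance) triple so each sequence
            # maps to a fixed-length feature vector
            triples = [(a, b, d) for a, b in product(AMINO_ACIDS, repeat=2)
                       for d in range(1, max_dist + 1)]
            index = {t: i for i, t in enumerate(triples)}
            vec = [0] * len(triples)
            for d in range(1, max_dist + 1):
                for i in range(len(seq) - d):
                    key = (seq[i], seq[i + d], d)
                    if key in index:          # skip non-standard residues
                        vec[index[key]] += 1
            return vec

        features = distance_pair_features("MKTAYIAKQR")
        print(len(features), sum(features))   # 1200 features for max_dist=3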

  15. Biclustering of gene expression data using reactive greedy randomized adaptive search procedure.

    PubMed

    Dharan, Smitha; Nair, Achuthsankar S

    2009-01-30

    Biclustering algorithms belong to a distinct class of clustering algorithms that perform simultaneous clustering of both rows and columns of the gene expression matrix and can be a very useful analysis tool when some genes have multiple functions and experimental conditions are diverse. Cheng and Church introduced a measure called the mean squared residue score to evaluate the quality of a bicluster, and it has become one of the most popular measures used to search for biclusters. In this paper, we review the basic concepts of the Greedy Randomized Adaptive Search Procedure (GRASP) metaheuristic, namely its construction and local search phases, and propose a new method, a variant of GRASP called the Reactive Greedy Randomized Adaptive Search Procedure (Reactive GRASP), to detect significant biclusters from large microarray datasets. The method has two major steps. First, high-quality bicluster seeds are generated by means of k-means clustering. In the second step, these seeds are grown using the Reactive GRASP, in which the basic parameter that defines the restrictiveness of the candidate list is self-adjusted, depending on the quality of the solutions found previously. We performed statistical and biological validations of the biclusters obtained and evaluated the method against basic GRASP as well as the classic work of Cheng and Church. The experimental results indicate that the Reactive GRASP approach outperforms the basic GRASP algorithm and the Cheng and Church approach. The Reactive GRASP approach for the detection of significant biclusters is robust and does not require calibration efforts.
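
    For reference, a minimal implementation of the Cheng and Church mean squared residue score, the bicluster quality measure that the GRASP variants optimize, might look as follows; the toy expression matrix and the chosen rows and columns are placeholders.

        import numpy as np

        def mean_squared_residue(X, rows, cols):
            """MSR of the sub-matrix of X restricted to the given rows and columns."""
            B = X[np.ix_(rows, cols)]
            row_mean = B.mean(axis=1, keepdims=True)
            col_mean = B.mean(axis=0, keepdims=True)
            overall = B.mean()
            residue = B - row_mean - col_mean + overall
            return float((residue ** 2).mean())   # lower = more coherent bicluster

        X = np.random.default_rng(0).normal(size=(100, 30))   # toy expression matrix
        print(mean_squared_residue(X, rows=[1, 5, 9, 20], cols=[0, 3, 7]))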

  16. Background noise cancellation for improved acoustic detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Niezrecki, Christopher; Beusse, Diedrich O.

    2005-06-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of an increase in the number of collisions with boats. A device to alert boaters of the presence of manatees, so that a collision can be avoided, is desired. A practical implementation of the technology is dependent on the hydrophone spacing and range of detection. These parameters are primarily dependent on the manatee vocalization strength, the decay of the signal's strength with distance, and the background noise levels. An efficient method to extend the detection range by using background noise cancellation is proposed in this paper. An adaptive line enhancer (ALE) that can detect and track narrow band signals buried in broadband noise is implemented to cancel the background noise. The results indicate that the ALE algorithm can efficiently extract the manatee calls from the background noise. The improved signal-to-noise ratio of the signal can be used to extend the range of detection of manatee vocalizations and reduce the false alarm and missing detection rate in their natural habitat.
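
    A minimal sketch of a least-mean-squares (LMS) adaptive line enhancer is given below: the delayed input drives an adaptive FIR filter whose output tracks narrowband (tonal) content such as a vocalization while broadband noise is suppressed. The filter length, delay, step size, and toy signal are illustrative assumptions, not the settings used in this study.

        import numpy as np

        def adaptive_line_enhancer(x, n_taps=32, delay=5, mu=0.001):
            w = np.zeros(n_taps)
            y = np.zeros_like(x)
            for n in range(n_taps + delay, len(x)):
                ref = x[n - delay - n_taps:n - delay][::-1]  # delayed reference vector
                y[n] = np.dot(w, ref)                        # narrowband estimate
                e = x[n] - y[n]                              # broadband residual
                w += 2 * mu * e * ref                        # LMS weight update
            return y

        fs = 8000
        t = np.arange(2 * fs) / fs
        tone = np.sin(2 * np.pi * 440 * t)                   # tonal "call" (toy signal)
        noisy = tone + np.random.default_rng(1).normal(scale=1.0, size=t.size)
        enhanced = adaptive_line_enhancer(noisy)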

  17. Background noise cancellation for improved acoustic detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Niezrecki, Christopher; Beusse, Diedrich O.

    2005-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of an increase in the number of collisions with boats. A device to alert boaters of the presence of manatees, so that a collision can be avoided, is desired. Practical implementation of the technology is dependent on the hydrophone spacing and range of detection. These parameters are primarily dependent on the manatee vocalization strength, the decay of the signal strength with distance, and the background noise levels. An efficient method to extend the detection range by using background noise cancellation is proposed in this paper. An adaptive line enhancer (ALE) that can detect and track narrowband signals buried in broadband noise is implemented to cancel the background noise. The results indicate that the ALE algorithm can efficiently extract the manatee calls from the background noise. The improved signal-to-noise ratio of the signal can be used to extend the range of detection of manatee vocalizations and reduce the false alarm and missing detection rate in their natural habitat.

  18. Accounting for GC-content bias reduces systematic errors and batch effects in ChIP-seq data.

    PubMed

    Teng, Mingxiang; Irizarry, Rafael A

    2017-11-01

    The main application of ChIP-seq technology is the detection of genomic regions that bind to a protein of interest. A large part of functional genomics' public catalogs is based on ChIP-seq data. These catalogs rely on peak calling algorithms that infer protein-binding sites by detecting genomic regions associated with more mapped reads (coverage) than expected by chance, as a result of the experimental protocol's lack of perfect specificity. We find that GC-content bias accounts for substantial variability in the observed coverage for ChIP-seq experiments and that this variability leads to false-positive peak calls. More concerning is that the GC effect varies across experiments, with the effect strong enough to result in a substantial number of peaks called differently when different laboratories perform experiments on the same cell line. However, accounting for GC-content bias in ChIP-seq is challenging because the binding sites of interest tend to be more common in high GC-content regions, which confounds real biological signals with unwanted variability. To address this challenge, we introduce a statistical approach that accounts for GC effects on both nonspecific noise and signal induced by the binding site. The method can be used to account for this bias in binding quantification as well as to improve existing peak calling algorithms. We use this approach to show a reduction in false-positive peaks as well as improved consistency across laboratories. © 2017 Teng and Irizarry; Published by Cold Spring Harbor Laboratory Press.
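
    As a simplified illustration of the general idea (not the authors' signal-plus-noise model), one can stratify coverage bins by GC fraction, estimate the average coverage within each stratum, and divide out the resulting trend; all values below are simulated.

        import numpy as np

        def gc_corrected_coverage(counts, gc, n_strata=20):
            counts = np.asarray(counts, dtype=float)
            gc = np.asarray(gc, dtype=float)
            strata = np.minimum((gc * n_strata).astype(int), n_strata - 1)
            expected = np.ones_like(counts)
            overall = counts.mean()
            for s in range(n_strata):
                mask = strata == s
                if mask.any():
                    # relative coverage expected for bins in this GC stratum
                    expected[mask] = counts[mask].mean() / overall
            return counts / expected        # coverage with the GC trend divided out

        rng = np.random.default_rng(2)
        gc = rng.uniform(0.3, 0.7, size=10_000)
        counts = rng.poisson(lam=20 * np.exp(2 * (gc - 0.5)))   # GC-dependent coverage
        corrected = gc_corrected_coverage(counts, gc)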

  19. Detection of expression quantitative trait Loci in complex mouse crosses: impact and alleviation of data quality and complex population substructure.

    PubMed

    Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon

    2012-01-01

    Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole-genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F(2) intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in a relatedness structure different from the whole-genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.

  20. Hybrid Monte Carlo/Deterministic Methods for Accelerating Active Interrogation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Miller, Thomas Martin; Patton, Bruce W

    2013-01-01

    The potential for smuggling special nuclear material (SNM) into the United States is a major concern to homeland security, so federal agencies are investigating a variety of preventive measures, including detection and interdiction of SNM during transport. One approach for SNM detection, called active interrogation, uses a radiation source, such as a beam of neutrons or photons, to scan cargo containers and detect the products of induced fissions. In realistic cargo transport scenarios, the process of inducing and detecting fissions in SNM is difficult due to the presence of various and potentially thick materials between the radiation source and the SNM, and the practical limitations on radiation source strength and detection capabilities. Therefore, computer simulations are being used, along with experimental measurements, in efforts to design effective active interrogation detection systems. The computer simulations mostly consist of simulating radiation transport from the source to the detector region(s). Although the Monte Carlo method is predominantly used for these simulations, difficulties persist related to calculating statistically meaningful detector responses in practical computing times, thereby limiting their usefulness for design and evaluation of practical active interrogation systems. In previous work, the benefits of hybrid methods that use the results of approximate deterministic transport calculations to accelerate high-fidelity Monte Carlo simulations have been demonstrated for source-detector type problems. In this work, the hybrid methods are applied and evaluated for three example active interrogation problems. Additionally, a new approach is presented that uses multiple goal-based importance functions depending on a particle's relevance to the ultimate goal of the simulation. Results from the examples demonstrate that the application of hybrid methods to active interrogation problems dramatically increases their calculational efficiency.

  1. Multi-modality image registration for effective thermographic fever screening

    NASA Astrophysics Data System (ADS)

    Dwith, C. Y. N.; Ghassemi, Pejhman; Pfefer, Joshua; Casamento, Jon; Wang, Quanzeng

    2017-02-01

    Fever screening based on infrared thermographs (IRTs) is a viable mass screening approach during infectious disease pandemics, such as Ebola and Severe Acute Respiratory Syndrome (SARS), for temperature monitoring in public places like hospitals and airports. IRTs have been found to be powerful, quick and non-invasive methods for detecting elevated temperatures. Moreover, regions medially adjacent to the inner canthi (called the canthi regions in this paper) are preferred sites for fever screening. Accurate localization of the canthi regions can be achieved through multi-modality registration of infrared (IR) and white-light images. Here we propose a registration method through a coarse-fine registration strategy using different registration models based on landmarks and edge detection on eye contours. We have evaluated the registration accuracy to be within +/- 2.7 mm, which enables accurate localization of the canthi regions.

  2. Precision Atomic Beam Laser Spectroscopy

    DTIC Science & Technology

    1999-02-20

    optical efficiency with a new coupled-cavity scheme. We have locked a MISER Nd:YAG laser to a finesse 50,000 cavity with a...sensitivity of optical heterodyne detection is preserved with ZERO sensitivity to small laser/cavity frequency noises. The new method is called Noise-Immune...1996), P. Dube, L.-S. Ma, J. Ye, and J. L. Hall. 9. "Free-induction decay in molecular iodine measured with an extended-cavity diode laser,"

  3. ICPD-A New Peak Detection Algorithm for LC/MS

    PubMed Central

    2010-01-01

    Background The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low abundance proteins extremely challenging. As a result, the inability of identifying low abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. Results In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD ) for high resolution LC/MS. In LC/MS, peptides elute during a certain time period and as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. Conclusions The performance of the new algorithm is evaluated based on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low abundance proteins than other LC/MS peak detection methods. PMID:21143790

  4. Task Performance with List-Mode Data

    NASA Astrophysics Data System (ADS)

    Caucci, Luca

    This dissertation investigates the application of list-mode data to detection, estimation, and image reconstruction problems, with an emphasis on emission tomography in medical imaging. We begin by introducing a theoretical framework for list-mode data and we use it to define two observers that operate on list-mode data. These observers are applied to the problem of detecting a signal (known in shape and location) buried in a random lumpy background. We then consider maximum-likelihood methods for the estimation of numerical parameters from list-mode data, and we characterize the performance of these estimators via the so-called Fisher information matrix. Reconstruction from PET list-mode data is then considered. In a process we called "double maximum-likelihood" reconstruction, we consider a simple PET imaging system and we use maximum-likelihood methods to first estimate a parameter vector for each pair of gamma-ray photons that is detected by the hardware. The collection of these parameter vectors forms a list, which is then fed to another maximum-likelihood algorithm for volumetric reconstruction over a grid of voxels. Efficient parallel implementation of the algorithms discussed above is then presented. In this work, we take advantage of two low-cost, mass-produced computing platforms that have recently appeared on the market, and we provide some details on implementing our algorithms on these devices. We conclude this dissertation work by elaborating on a possible application of list-mode data to X-ray digital mammography. We argue that today's CMOS detectors and computing platforms have become fast enough to make X-ray digital mammography list-mode data acquisition and processing feasible.

  5. SIRE: a MIMO radar for landmine/IED detection

    NASA Astrophysics Data System (ADS)

    Ojowu, Ode; Wu, Yue; Li, Jian; Nguyen, Lam

    2013-05-01

    Multiple-input multiple-output (MIMO) radar systems have been shown to have significant performance improvements over their single-input multiple-output (SIMO) counterparts. For transmit and receive elements that are collocated, the waveform diversity afforded by this radar is exploited for performance improvements. These improvements include but are not limited to improved target detection, improved parameter identifiability and better resolvability. In this paper, we present the Synchronous Impulse Reconstruction Radar (SIRE) Ultra-wideband (UWB) radar designed by the Army Research Lab (ARL) for landmine and improvised explosive device (IED) detection as a 2 by 16 MIMO radar (with collocated antennas). Its improvement over its SIMO counterpart in terms of beampattern/cross range resolution are discussed and demonstrated using simulated data herein. The limitations of this radar for Radio Frequency Interference (RFI) suppression are also discussed in this paper. A relaxation method (RELAX) combined with averaging of multiple realizations of the measured data is presented for RFI suppression; results show no noticeable target signature distortion after suppression. In this paper, the back-projection (delay and sum) data independent method is used for generating SAR images. A side-lobe minimization technique called recursive side-lobe minimization (RSM) is also discussed for reducing side-lobes in this data independent approach. We introduce a data-dependent sparsity based spectral estimation technique called Sparse Learning via Iterative Minimization (SLIM) as well as a data-dependent CLEAN approach for generating SAR images for the SIRE radar. These data-adaptive techniques show improvement in side-lobe reduction and resolution for simulated data for the SIRE radar.

  6. Glider-based Passive Acoustic Monitoring Techniques in the Southern California Region & West Coast Naval Training Range Demonstration of Glider-based Passive Acoustic Monitoring

    DTIC Science & Technology

    2012-09-30

    generalized power-law detection algorithm for humpback whale vocalizations. J. Acous. Soc. Am. 131(4), 2682-2699. Roch, M. A., H. Klinck, S...Heaney (2012b). Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones. J. Acous. Soc. Am...monitoring: Correcting humpback call detections for site-specific and time-dependent environmental characteristics . JASA Express Letters, submitted October, 2012, 5 pgs plus 3 figs.

  7. panelcn.MOPS: Copy-number detection in targeted NGS panel data for clinical diagnostics.

    PubMed

    Povysil, Gundula; Tzika, Antigoni; Vogt, Julia; Haunschmid, Verena; Messiaen, Ludwine; Zschocke, Johannes; Klambauer, Günter; Hochreiter, Sepp; Wimmer, Katharina

    2017-07-01

    Targeted next-generation-sequencing (NGS) panels have largely replaced Sanger sequencing in clinical diagnostics. They allow for the detection of copy-number variations (CNVs) in addition to single-nucleotide variants and small insertions/deletions. However, existing computational CNV detection methods have shortcomings regarding accuracy, quality control (QC), incidental findings, and user-friendliness. We developed panelcn.MOPS, a novel pipeline for detecting CNVs in targeted NGS panel data. Using data from 180 samples, we compared panelcn.MOPS with five state-of-the-art methods. With panelcn.MOPS leading the field, most methods achieved comparably high accuracy. panelcn.MOPS reliably detected CNVs ranging in size from part of a region of interest (ROI), to whole genes, which may comprise all ROIs investigated in a given sample. The latter is enabled by analyzing reads from all ROIs of the panel, but presenting results exclusively for user-selected genes, thus avoiding incidental findings. Additionally, panelcn.MOPS offers QC criteria not only for samples, but also for individual ROIs within a sample, which increases the confidence in called CNVs. panelcn.MOPS is freely available both as R package and standalone software with graphical user interface that is easy to use for clinical geneticists without any programming experience. panelcn.MOPS combines high sensitivity and specificity with user-friendliness rendering it highly suitable for routine clinical diagnostics. © 2017 The Authors. Human Mutation published by Wiley Periodicals, Inc.

  8. panelcn.MOPS: Copy‐number detection in targeted NGS panel data for clinical diagnostics

    PubMed Central

    Povysil, Gundula; Tzika, Antigoni; Vogt, Julia; Haunschmid, Verena; Messiaen, Ludwine; Zschocke, Johannes; Klambauer, Günter; Wimmer, Katharina

    2017-01-01

    Abstract Targeted next‐generation‐sequencing (NGS) panels have largely replaced Sanger sequencing in clinical diagnostics. They allow for the detection of copy‐number variations (CNVs) in addition to single‐nucleotide variants and small insertions/deletions. However, existing computational CNV detection methods have shortcomings regarding accuracy, quality control (QC), incidental findings, and user‐friendliness. We developed panelcn.MOPS, a novel pipeline for detecting CNVs in targeted NGS panel data. Using data from 180 samples, we compared panelcn.MOPS with five state‐of‐the‐art methods. With panelcn.MOPS leading the field, most methods achieved comparably high accuracy. panelcn.MOPS reliably detected CNVs ranging in size from part of a region of interest (ROI), to whole genes, which may comprise all ROIs investigated in a given sample. The latter is enabled by analyzing reads from all ROIs of the panel, but presenting results exclusively for user‐selected genes, thus avoiding incidental findings. Additionally, panelcn.MOPS offers QC criteria not only for samples, but also for individual ROIs within a sample, which increases the confidence in called CNVs. panelcn.MOPS is freely available both as R package and standalone software with graphical user interface that is easy to use for clinical geneticists without any programming experience. panelcn.MOPS combines high sensitivity and specificity with user‐friendliness rendering it highly suitable for routine clinical diagnostics. PMID:28449315

  9. A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.

    PubMed

    Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut

    2017-08-01

    Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Still today their quantification and taxonomic classification bear several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods, the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single cell survey called aliquot PCR (aPCR). All these methods have been tested either using aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique a detection of living cells up to morphospecies level is possible. Fixation of cells and staining methods are advantageous due to the possible long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might complete the deficiency of LAM in terms of the missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques reducing the gap between the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.

  10. Evaluation of roadside emergency call box technology : a summary report : technical assistance report.

    DOT National Transportation Integrated Search

    2003-04-01

    Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...

  11. Novel algorithm by low complexity filter on retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Rostampour, Samad

    2011-10-01

    This article presents a new method to detect blood vessels in the retina from digital images. Retinal vessel segmentation is important for detecting side effects of diabetic disease, because diabetes can form new capillaries that are very brittle. The research was carried out in two phases: preprocessing and processing. The preprocessing phase applies a new filter that produces a suitable output, showing vessels in dark color on a white background and creating a clear contrast between vessels and background. The filter's complexity is very low, and extraneous image content is eliminated. The processing phase uses a Bayesian method, a supervised classification approach that uses the mean and variance of pixel intensities to calculate class probabilities. Finally, the pixels of the image are divided into two classes: vessels and background. The images used come from the DRIVE database. The calculation gives an average efficiency of 95 percent. The method was also applied to a sample with retinopathy from outside the DRIVE database, and a perfect result was obtained.
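
    A minimal sketch of this kind of two-class Bayesian pixel classification, modeling vessel and background intensities as Gaussians with means and variances learned from labelled pixels, is shown below; the training intensities are synthetic placeholders rather than DRIVE data.

        import numpy as np
        from scipy.stats import norm

        def fit_class(intensities):
            x = np.asarray(intensities, dtype=float)
            return x.mean(), x.std(ddof=1)          # (mean, std) of the class

        def classify(pixels, vessel_stats, bg_stats, p_vessel=0.15):
            pixels = np.asarray(pixels, dtype=float)
            # log posterior (up to a constant) for each class
            log_post_v = norm.logpdf(pixels, *vessel_stats) + np.log(p_vessel)
            log_post_b = norm.logpdf(pixels, *bg_stats) + np.log(1 - p_vessel)
            return log_post_v > log_post_b           # True -> vessel

        rng = np.random.default_rng(3)
        vessel_train = rng.normal(60, 10, 500)        # darker vessel pixels (toy values)
        bg_train = rng.normal(140, 20, 5000)          # brighter background pixels
        labels = classify(rng.normal(100, 40, 10),
                          fit_class(vessel_train), fit_class(bg_train))
        print(labels)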

  12. Feature extraction and classification of clouds in high resolution panchromatic satellite imagery

    NASA Astrophysics Data System (ADS)

    Sharghi, Elan

    The development of sophisticated remote sensing sensors is rapidly increasing, and the vast amount of satellite imagery collected is too much to be analyzed manually by a human image analyst. It has become necessary for a tool to be developed to automate the job of an image analyst. This tool would need to intelligently detect and classify objects of interest through computer vision algorithms. Existing software called the Rapid Image Exploitation Resource (RAPIER®) was designed by engineers at Space and Naval Warfare Systems Center Pacific (SSC PAC) to perform exactly this function. This software automatically searches for anomalies in the ocean and reports the detections as a possible ship object. However, if the image contains a high percentage of cloud coverage, a high number of false positives are triggered by the clouds. The focus of this thesis is to explore various feature extraction and classification methods to accurately distinguish clouds from ship objects. An examination of a texture analysis method, line detection using the Hough transform, and edge detection using wavelets are explored as possible feature extraction methods. The features are then supplied to a K-Nearest Neighbors (KNN) or Support Vector Machine (SVM) classifier. Parameter options for these classifiers are explored and the optimal parameters are determined.

  13. A Frequency-Domain Adaptive Matched Filter for Active Sonar Detection.

    PubMed

    Zhao, Zhishan; Zhao, Anbang; Hui, Juan; Hou, Baochun; Sotudeh, Reza; Niu, Fang

    2017-07-04

    The most classical detector for active sonar and radar is the matched filter (MF), which is the optimal processor under ideal conditions. Aiming at the problem of active sonar detection, we propose a frequency-domain adaptive matched filter (FDAMF) with the use of a frequency-domain adaptive line enhancer (ALE). The FDAMF is an improved MF. In the simulations in this paper, the signal-to-noise ratio (SNR) gain of the FDAMF is about 18.6 dB higher than that of the classical MF when the input SNR is -10 dB. In order to improve the performance of the FDAMF with a low input SNR, we propose a pre-processing method, which is called frequency-domain time reversal convolution and interference suppression (TRC-IS). Compared with the classical MF, the FDAMF combined with the TRC-IS method obtains higher SNR gain, a lower detection threshold, and a better receiver operating characteristic (ROC) in the simulations in this paper. The simulation results show that the FDAMF has higher processing gain and better detection performance than the classical MF under ideal conditions. The experimental results indicate that the FDAMF does improve the performance of the MF, and can adapt to actual interference to some extent. In addition, the TRC-IS preprocessing method works well in an actual noisy ocean environment.
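
    For context, the classical matched filter that the FDAMF improves upon can be computed in the frequency domain as replica correlation via the FFT, as in the hedged sketch below; the adaptive line-enhancer and TRC-IS stages are not reproduced, and the pulse and noise are toy data.

        import numpy as np

        def matched_filter(received, replica):
            n = len(received) + len(replica) - 1
            R = np.fft.rfft(received, n)
            S = np.fft.rfft(replica, n)
            out = np.fft.irfft(R * np.conj(S), n)   # cross-correlation with the replica
            return out / np.sqrt(np.sum(replica ** 2))

        fs = 10_000
        t = np.arange(0, 0.1, 1 / fs)
        pulse = np.sin(2 * np.pi * (1000 + 5000 * t) * t)      # transmitted chirp
        rx = np.concatenate([np.zeros(3000), pulse, np.zeros(2000)])
        rx += np.random.default_rng(4).normal(scale=1.0, size=rx.size)
        mf_out = matched_filter(rx, pulse)
        print(int(np.argmax(np.abs(mf_out))))                   # peak near sample 3000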

  14. Bias Characterization in Probabilistic Genotype Data and Improved Signal Detection with Multiple Imputation

    PubMed Central

    Palmer, Cameron; Pe’er, Itsik

    2016-01-01

    Missing data are an unavoidable component of modern statistical genetics. Different array or sequencing technologies cover different single nucleotide polymorphisms (SNPs), leading to a complicated mosaic pattern of missingness where both individual genotypes and entire SNPs are sporadically absent. Such missing data patterns cannot be ignored without introducing bias, yet cannot be inferred exclusively from nonmissing data. In genome-wide association studies, the accepted solution to missingness is to impute missing data using external reference haplotypes. The resulting probabilistic genotypes may be analyzed in the place of genotype calls. A general-purpose paradigm, called Multiple Imputation (MI), is known to model uncertainty in many contexts, yet it is not widely used in association studies. Here, we undertake a systematic evaluation of existing imputed data analysis methods and MI. We characterize biases related to uncertainty in association studies, and find that bias is introduced both at the imputation level, when imputation algorithms generate inconsistent genotype probabilities, and at the association level, when analysis methods inadequately model genotype uncertainty. We find that MI performs at least as well as existing methods or in some cases much better, and provides a straightforward paradigm for adapting existing genotype association methods to uncertain data. PMID:27310603
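
    A minimal sketch of the MI idea applied to uncertain genotypes is shown below: draw hard genotype calls from each individual's posterior probabilities, run the association analysis on every draw, and pool the estimates with Rubin's rules. The data, the simple least-squares association test, and all parameter values are illustrative assumptions.

        import numpy as np

        def rubins_rules(estimates, variances):
            m = len(estimates)
            qbar = np.mean(estimates)                      # pooled effect estimate
            within = np.mean(variances)
            between = np.var(estimates, ddof=1)
            total_var = within + (1 + 1 / m) * between     # total MI variance
            return qbar, total_var

        def one_imputation(geno_probs, phenotype, rng):
            # sample a hard genotype 0/1/2 for each person from its probability triple
            g = np.array([rng.choice(3, p=p) for p in geno_probs], dtype=float)
            # least-squares slope and its variance as a toy association statistic
            X = np.column_stack([np.ones_like(g), g])
            beta, res, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
            resid = phenotype - X @ beta
            sigma2 = resid @ resid / (len(g) - 2)
            var_beta = sigma2 * np.linalg.inv(X.T @ X)[1, 1]
            return beta[1], var_beta

        rng = np.random.default_rng(5)
        probs = rng.dirichlet([1, 1, 1], size=200)          # uncertain genotypes
        pheno = rng.normal(size=200)
        draws = [one_imputation(probs, pheno, rng) for _ in range(10)]
        print(rubins_rules(*zip(*draws)))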

  15. Strelka: accurate somatic small-variant calling from sequenced tumor-normal sample pairs.

    PubMed

    Saunders, Christopher T; Wong, Wendy S W; Swamy, Sajani; Becq, Jennifer; Murray, Lisa J; Cheetham, R Keira

    2012-07-15

    Whole genome and exome sequencing of matched tumor-normal sample pairs is becoming routine in cancer research. The consequent increased demand for somatic variant analysis of paired samples requires methods specialized to model this problem so as to sensitively call variants at any practical level of tumor impurity. We describe Strelka, a method for somatic SNV and small indel detection from sequencing data of matched tumor-normal samples. The method uses a novel Bayesian approach which represents continuous allele frequencies for both tumor and normal samples, while leveraging the expected genotype structure of the normal. This is achieved by representing the normal sample as a mixture of germline variation with noise, and representing the tumor sample as a mixture of the normal sample with somatic variation. A natural consequence of the model structure is that sensitivity can be maintained at high tumor impurity without requiring purity estimates. We demonstrate that the method has superior accuracy and sensitivity on impure samples compared with approaches based on either diploid genotype likelihoods or general allele-frequency tests. The Strelka workflow source code is available at ftp://strelka@ftp.illumina.com/. csaunders@illumina.com
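
    The toy sketch below illustrates only the central modeling idea, treating the tumor as a mixture of the normal sample and a somatic variant subpopulation, as a log-likelihood ratio between "no somatic variant" and "somatic variant present"; it is a simplification and not Strelka's actual probability model.

        import numpy as np
        from scipy.stats import binom

        def somatic_evidence(alt_tumor, depth_tumor, alt_normal, depth_normal,
                             purity=0.6, somatic_vaf=0.5):
            normal_af = max(alt_normal / depth_normal, 1e-3)   # germline/noise fraction
            # H0: tumor alt reads explained by the normal allele fraction alone
            ll_h0 = binom.logpmf(alt_tumor, depth_tumor, normal_af)
            # H1: tumor is a mixture of normal reads and a variant subpopulation
            mix_af = (1 - purity) * normal_af + purity * somatic_vaf
            ll_h1 = binom.logpmf(alt_tumor, depth_tumor, mix_af)
            return ll_h1 - ll_h0                                # log-likelihood ratio

        print(somatic_evidence(alt_tumor=18, depth_tumor=60,
                               alt_normal=1, depth_normal=55))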

  16. Computerized In Vitro Test for Chemical Toxicity Based on Tetrahymena Swimming Patterns

    NASA Technical Reports Server (NTRS)

    Noever, David A.; Matsos, Helen C.; Cronise, Raymond J.; Looger, Loren L.; Relwani, Rachna A.; Johnson, Jacqueline U.

    1994-01-01

    An apparatus and a method for rapidly determining chemical toxicity have been evaluated as an alternative to the rabbit eye irritancy test (Draize). The toxicity monitor includes an automated scoring of how motile biological cells (Tetrahymena pyriformis) slow down or otherwise change their swimming patterns in a hostile chemical environment. The method, called the motility assay (MA), is tested for 30 s to determine the chemical toxicity in 20 aqueous samples containing trace organics and salts. With equal or better detection limits, results compare favorably to in vivo animal tests of eye irritancy.

  17. Using Ozone To Clean and Passivate Oxygen-Handling Hardware

    NASA Technical Reports Server (NTRS)

    Torrance, Paul; Biesinger, Paul

    2009-01-01

    A proposed method of cleaning, passivating, and verifying the cleanliness of oxygen-handling hardware would extend the established art of cleaning by use of ozone. As used here, "cleaning" signifies ridding all exposed surfaces of combustible (in particular, carbon-based) contaminants. The method calls for exposing the surfaces of the hardware to ozone while monitoring the ozone effluent for carbon dioxide. The ozone would passivate the hardware while oxidizing carbon-based residues, converting the carbon in them to carbon dioxide. The exposure to ozone would be continued until no more carbon dioxide was detected, signifying that cleaning and passivation were complete.

  18. A Bioinformatics Approach for Detecting Repetitive Nested Motifs using Pattern Matching.

    PubMed

    Romero, José R; Carballido, Jessica A; Garbus, Ingrid; Echenique, Viviana C; Ponzoni, Ignacio

    2016-01-01

    The identification of nested motifs in genomic sequences is a complex computational problem. The detection of these patterns is important to allow the discovery of transposable element (TE) insertions, incomplete reverse transcripts, deletions, and/or mutations. In this study, a de novo strategy for detecting patterns that represent nested motifs was designed based on exhaustive searches for pairs of motifs and combinatorial pattern analysis. These patterns can be grouped into three categories, motifs within other motifs, motifs flanked by other motifs, and motifs of large size. The methodology used in this study, applied to genomic sequences from the plant species Aegilops tauschii and Oryza sativa , revealed that it is possible to identify putative nested TEs by detecting these three types of patterns. The results were validated through BLAST alignments, which revealed the efficacy and usefulness of the new method, which is called Mamushka.

  19. Hemozoin-generated vapor nanobubbles for transdermal reagent- and needle-free detection of malaria

    PubMed Central

    Lukianova-Hleb, Ekaterina Y.; Campbell, Kelly M.; Constantinou, Pamela E.; Braam, Janet; Olson, John S.; Ware, Russell E.; Sullivan, David J.; Lapotko, Dmitri O.

    2014-01-01

    Successful diagnosis, screening, and elimination of malaria critically depend on rapid and sensitive detection of this dangerous infection, preferably transdermally and without sophisticated reagents or blood drawing. Such diagnostic methods are not currently available. Here we show that the high optical absorbance and nanosize of endogenous heme nanoparticles called “hemozoin,” a unique component of all blood-stage malaria parasites, generates a transient vapor nanobubble around hemozoin in response to a short and safe near-infrared picosecond laser pulse. The acoustic signals of these malaria-specific nanobubbles provided transdermal noninvasive and rapid detection of a malaria infection as low as 0.00034% in animals without using any reagents or drawing blood. These on-demand transient events have no analogs among current malaria markers and probes, can detect and screen malaria in seconds, and can be realized as a compact, easy-to-use, inexpensive, and safe field technology. PMID:24379385

  20. Hemozoin-generated vapor nanobubbles for transdermal reagent- and needle-free detection of malaria.

    PubMed

    Lukianova-Hleb, Ekaterina Y; Campbell, Kelly M; Constantinou, Pamela E; Braam, Janet; Olson, John S; Ware, Russell E; Sullivan, David J; Lapotko, Dmitri O

    2014-01-21

    Successful diagnosis, screening, and elimination of malaria critically depend on rapid and sensitive detection of this dangerous infection, preferably transdermally and without sophisticated reagents or blood drawing. Such diagnostic methods are not currently available. Here we show that the high optical absorbance and nanosize of endogenous heme nanoparticles called "hemozoin," a unique component of all blood-stage malaria parasites, generates a transient vapor nanobubble around hemozoin in response to a short and safe near-infrared picosecond laser pulse. The acoustic signals of these malaria-specific nanobubbles provided transdermal noninvasive and rapid detection of a malaria infection as low as 0.00034% in animals without using any reagents or drawing blood. These on-demand transient events have no analogs among current malaria markers and probes, can detect and screen malaria in seconds, and can be realized as a compact, easy-to-use, inexpensive, and safe field technology.

  1. Overlapping community detection in weighted networks via a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Wang, Xiaolong; Xiang, Xin; Tang, Buzhou; Chen, Qingcai; Fan, Shixi; Bu, Junzhao

    2017-02-01

    Complex networks, as a powerful way to represent complex systems, have been widely studied during the past several years. One of the most important tasks of complex network analysis is to detect communities embedded in networks. In the real world, weighted networks are very common and may contain overlapping communities where a node is allowed to belong to multiple communities. In this paper, we propose a novel Bayesian approach, called the Bayesian mixture network (BMN) model, to detect overlapping communities in weighted networks. The advantages of our method are (i) providing soft-partition solutions in weighted networks; (ii) providing soft memberships, which quantify 'how strongly' a node belongs to a community. Experiments on a large number of real and synthetic networks show that our model is able to detect overlapping communities in weighted networks and is competitive with other state-of-the-art models at shedding light on community partitioning.

  2. Anaconda: AN automated pipeline for somatic COpy Number variation Detection and Annotation from tumor exome sequencing data.

    PubMed

    Gao, Jianing; Wan, Changlin; Zhang, Huan; Li, Ao; Zang, Qiguang; Ban, Rongjun; Ali, Asim; Yu, Zhenghua; Shi, Qinghua; Jiang, Xiaohua; Zhang, Yuanwei

    2017-10-03

    Copy number variations (CNVs) are the main genetic structural variations in the cancer genome. Detecting CNVs in exome regions is efficient and cost-effective for identifying cancer-associated genes. Many tools have been developed accordingly, yet these tools lack reliability because of high false-negative rates, which are intrinsically caused by exonic bias in the genome. To provide an alternative option, we report Anaconda, a comprehensive pipeline that allows flexible integration of multiple CNV-calling methods and systematic annotation of CNVs when analyzing WES data. With a single command, Anaconda can generate CNV detection results from up to four CNV detection tools. Combined with comprehensive annotation analysis of genes involved in shared CNV regions, Anaconda delivers a more reliable and useful report to assist CNV-associated cancer research. The Anaconda package and manual can be freely accessed at http://mcg.ustc.edu.cn/bsc/ANACONDA/ .

  3. ProtDec-LTR2.0: an improved method for protein remote homology detection by combining pseudo protein and supervised Learning to Rank.

    PubMed

    Chen, Junjie; Guo, Mingyue; Li, Shumin; Liu, Bin

    2017-11-01

    As one of the most important tasks in protein sequence analysis, protein remote homology detection is critical for both basic research and practical applications. Here, we present an effective web server for protein remote homology detection called ProtDec-LTR2.0 by combining ProtDec-Learning to Rank (LTR) and pseudo protein representation. Experimental results showed that the detection performance is obviously improved. The web server provides a user-friendly interface to explore the sequence and structure information of candidate proteins and find their conserved domains by launching a multiple sequence alignment tool. The web server is free and open to all users with no login requirement at http://bioinformatics.hitsz.edu.cn/ProtDec-LTR2.0/. bliu@hit.edu.cn. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  4. Evaluation of the FilmArray® system for detection of Bacillus anthracis, Francisella tularensis, and Yersinia pestis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seiner, Derrick R.; Colburn, Heather A.; Baird, Cheryl L.

    2013-04-29

    To evaluate the sensitivity and specificity of the Idaho Technologies FilmArray® Biothreat Panel for the detection of Bacillus anthracis (Ba), Francisella tularensis (Ft), and Yersinia pestis (Yp) DNA, and demonstrate the detection of Ba spores. Methods and Results: DNA samples from Ba, Ft and Yp strains and near-neighbors, and live Ba spores were analyzed using the Biothreat Panel, a multiplexed PCR-based assay for 17 pathogens and toxins. Sensitivity studies with DNA suggest a limit of detection of 250 genome equivalents (GEs) per sample. Furthermore, the correct call of Ft, Yp or Bacillus species was made in 63 of 72 samples tested at 25 GE or less. With samples containing 25 Ba Sterne spores, at least one of the two possible Ba markers was identified in all samples tested. We observed no cross-reactivity with near-neighbor DNAs.

  5. NUTS and BOLTS: Applications of Fluorescence Detected Sedimentation

    PubMed Central

    Kroe, Rachel R.; Laue, Thomas M.

    2008-01-01

    Analytical ultracentrifugation is a widely used method for characterizing the solution behavior of macromolecules. However, the two commonly used detectors (absorbance and interference) impose some fundamental restrictions on the concentrations and complexity of the solutions that can be analyzed. The recent addition of a fluorescence detector for the XL-I analytical ultracentrifuge (AU-FDS) enables two different types of sedimentation experiments. First, the AU-FDS can detect picomolar concentrations of labeled solutes allowing the characterization of very dilute solutions of macromolecules, applications we call Normal Use Tracer Sedimentation (NUTS). The great sensitivity of NUTS analysis allows the characterization of small quantities of materials and high affinity interactions. Second, AU-FDS allows characterization of trace quantities of labeled molecules in solutions containing high concentrations and complex mixtures of unlabeled molecules, applications we call Biological On Line Tracer Sedimentation (BOLTS). The discrimination of BOLTS enables the size distribution of a labeled macromolecule to be determined in biological milieu such as cell lysates and serum. Examples are presented that embody features of both NUTS and BOLTS applications, along with our observations on these applications. PMID:19103145

  6. Exploring homogeneity of correlation structures of gene expression datasets within and between etiological disease categories.

    PubMed

    Jong, Victor L; Novianti, Putri W; Roes, Kit C B; Eijkemans, Marinus J C

    2014-12-01

    The literature shows that classifiers perform differently across datasets and that correlations within datasets affect the performance of classifiers. The question that arises is whether the correlation structure within datasets differ significantly across diseases. In this study, we evaluated the homogeneity of correlation structures within and between datasets of six etiological disease categories; inflammatory, immune, infectious, degenerative, hereditary and acute myeloid leukemia (AML). We also assessed the effect of filtering; detection call and variance filtering on correlation structures. We downloaded microarray datasets from ArrayExpress for experiments meeting predefined criteria and ended up with 12 datasets for non-cancerous diseases and six for AML. The datasets were preprocessed by a common procedure incorporating platform-specific recommendations and the two filtering methods mentioned above. Homogeneity of correlation matrices between and within datasets of etiological diseases was assessed using the Box's M statistic on permuted samples. We found that correlation structures significantly differ between datasets of the same and/or different etiological disease categories and that variance filtering eliminates more uncorrelated probesets than detection call filtering and thus renders the data highly correlated.
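
    For reference, a bare-bones version of the Box's M statistic for comparing covariance structures across groups might look like the sketch below; the chi-square approximation and the permutation scheme used in the study are omitted, and the two groups are simulated.

        import numpy as np

        def boxs_m(groups):
            """groups: list of (n_i x p) arrays of observations."""
            covs = [np.cov(g, rowvar=False) for g in groups]
            dof = np.array([len(g) - 1 for g in groups])
            pooled = sum(d * c for d, c in zip(dof, covs)) / dof.sum()
            m = dof.sum() * np.linalg.slogdet(pooled)[1]
            for d, c in zip(dof, covs):
                m -= d * np.linalg.slogdet(c)[1]
            return m          # larger values indicate less homogeneous covariances

        rng = np.random.default_rng(6)
        a = rng.normal(size=(40, 5))
        b = rng.normal(size=(35, 5)) * 1.5     # inflated variances in the second group
        print(boxs_m([a, b]))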

  7. High resolution in-operando microimaging of solar cells with pulsed electrically-detected magnetic resonance

    NASA Astrophysics Data System (ADS)

    Katz, Itai; Fehr, Matthias; Schnegg, Alexander; Lips, Klaus; Blank, Aharon

    2015-02-01

    The in-operando detection and high resolution spatial imaging of paramagnetic defects, impurities, and states becomes increasingly important for understanding loss mechanisms in solid-state electronic devices. Electron spin resonance (ESR), commonly employed for observing these species, cannot meet this challenge since it suffers from limited sensitivity and spatial resolution. An alternative and much more sensitive method, called electrically-detected magnetic resonance (EDMR), detects the species through their magnetic fingerprint, which can be traced in the device's electrical current. However, until now it could not obtain high resolution images in operating electronic devices. In this work, the first spatially-resolved electrically-detected magnetic resonance images (EDMRI) of paramagnetic states in an operating real-world electronic device are provided. The presented method is based on a novel microwave pulse sequence allowing for the coherent electrical detection of spin echoes in combination with powerful pulsed magnetic-field gradients. The applicability of the method is demonstrated on a device-grade 1-μm-thick amorphous silicon (a-Si:H) solar cell and an identical device that was degraded locally by an electron beam. The degraded areas with increased concentrations of paramagnetic defects lead to a local increase in recombination that is mapped by EDMRI with ∼20-μm-scale pixel resolution. The novel approach presented here can be widely used in the nondestructive in-operando three-dimensional characterization of solid-state electronic devices with a resolution potential of less than 100 nm.

  8. Recent advances of liquid chromatography-(tandem) mass spectrometry in clinical and forensic toxicology - An update.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T

    2016-09-01

    Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as doping control especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolvement of high resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so called general unknown setting. The present paper will provide an overview and discuss these recent developments focusing on the literature published after 2010. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  9. Immuno-Northern Blotting: Detection of RNA Modifications by Using Antibodies against Modified Nucleosides.

    PubMed

    Mishima, Eikan; Jinno, Daisuke; Akiyama, Yasutoshi; Itoh, Kunihiko; Nankumo, Shinnosuke; Shima, Hisato; Kikuchi, Koichi; Takeuchi, Yoichi; Elkordy, Alaa; Suzuki, Takehiro; Niizuma, Kuniyasu; Ito, Sadayoshi; Tomioka, Yoshihisa; Abe, Takaaki

    2015-01-01

    The biological roles of RNA modifications are still largely not understood. Thus, developing a method for detecting RNA modifications is important for further clarification. We developed a method for detecting RNA modifications called immuno-northern blotting (INB) analysis and herein introduce its various capabilities. This method involves the separation of RNAs using either polyacrylamide or agarose gel electrophoresis, followed by transfer onto a nylon membrane and subsequent immunoblotting using antibodies against modified nucleosides for the detection of specific modifications. We confirmed that INB with the antibodies for 1-methyladenosine (m1A), N6-methyladenosine (m6A), pseudouridine, and 5-methylcytidine (m5C) showed different modifications in a variety of RNAs from various species and organelles. INB with the anti-m5C antibody revealed that the antibody cross-reacted with another modification on DNA, suggesting the application of this method for characterization of the antibody for modified nucleosides. Additionally, using INB with the antibody for m1A, which is a highly specific modification in eukaryotic tRNA, we detected tRNA-derived fragments known as tiRNAs under the cellular stress response, suggesting the application for tracking target RNA containing specific modifications. INB with the anti-m6A antibody confirmed the demethylation of m6A by the specific demethylases fat mass and obesity-associated protein (FTO) and ALKBH5, suggesting its application for quantifying target modifications in separated RNAs. Furthermore, INB demonstrated that the knockdown of FTO and ALKBH5 increased the m6A modification in small RNAs as well as in mRNA. The INB method has high specificity, sensitivity, and quantitative capability, and it can be employed with conventional experimental apparatus. Therefore, this method would be useful for research on RNA modifications and metabolism.

  10. Searching for life in the Universe: unconventional methods for an unconventional problem.

    PubMed

    Nealson, K H; Tsapin, A; Storrie-Lombardi, M

    2002-12-01

    The search for life, on and off our planet, can be done by conventional methods with which we are all familiar. These methods are sensitive and specific, and are often capable of detecting even single cells. However, if the search broadens to include life that may be different (even subtly different) in composition, the methods and even the approach must be altered. Here we discuss the development of what we call non-earthcentric life detection--detecting life with methods that could detect life no matter what its form or composition. To develop these methods, we simply ask, can we define life in terms of its general properties and particularly those that can be measured and quantified? Taking such an approach we can search for life using physics and chemistry to ask questions about structure, chemical composition, thermodynamics, and kinetics. Structural complexity can be searched for using computer algorithms that recognize complex structures. Once identified, these structures can be examined for a variety of chemical traits, including elemental composition, chirality, and complex chemistry. A second approach involves defining our environment in terms of energy sources (i.e., reductants), and oxidants (e.g. what is available to eat and breathe), and then looking for areas in which such phenomena are inexplicably out of chemical equilibrium. These disequilibria, when found, can then be examined in detail for the presence of the structural and chemical complexity that presumably characterizes any living systems. By this approach, we move the search for life to one that should facilitate the detection of any earthly life it encountered, as well as any non-conventional life forms that have structure, complex chemistry, and live via some form of redox chemistry.

  11. Migratory behavior of eastern North Pacific gray whales tracked using a hydrophone array

    PubMed Central

    Helble, Tyler A.; D’Spain, Gerald L.; Weller, David W.; Wiggins, Sean M.; Hildebrand, John A.

    2017-01-01

    Eastern North Pacific gray whales make one of the longest annual migrations of any mammal, traveling from their summer feeding areas in the Bering and Chukchi Seas to their wintering areas in the lagoons of Baja California, Mexico. Although a significant body of knowledge on gray whale biology and behavior exists, little is known about their vocal behavior while migrating. In this study, we used a sparse hydrophone array deployed offshore of central California to investigate how gray whales behave and use sound while migrating. We detected, localized, and tracked whales for one full migration season, a first for gray whales. We verified and localized 10,644 gray whale M3 calls and grouped them into 280 tracks. Results confirm that gray whales are acoustically active while migrating and their swimming and acoustic behavior changes on daily and seasonal time scales. The seasonal timing of the calls verifies the gray whale migration timing determined using other methods such as counts conducted by visual observers. The total number of calls and the percentage of calls that were part of a track changed significantly over both seasonal and daily time scales. An average calling rate of 5.7 calls/whale/day was observed, which is significantly greater than previously reported migration calling rates. We measured a mean speed of 1.6 m/s and quantified heading, direction, and water depth where tracks were located. Mean speed and water depth remained constant between night and day, but these quantities had greater variation at night. Gray whales produce M3 calls with a root mean square source level of 156.9 dB re 1 μPa at 1 m. Quantities describing call characteristics were variable and dependent on site-specific propagation characteristics. PMID:29084266

  12. Migratory behavior of eastern North Pacific gray whales tracked using a hydrophone array.

    PubMed

    Guazzo, Regina A; Helble, Tyler A; D'Spain, Gerald L; Weller, David W; Wiggins, Sean M; Hildebrand, John A

    2017-01-01

    Eastern North Pacific gray whales make one of the longest annual migrations of any mammal, traveling from their summer feeding areas in the Bering and Chukchi Seas to their wintering areas in the lagoons of Baja California, Mexico. Although a significant body of knowledge on gray whale biology and behavior exists, little is known about their vocal behavior while migrating. In this study, we used a sparse hydrophone array deployed offshore of central California to investigate how gray whales behave and use sound while migrating. We detected, localized, and tracked whales for one full migration season, a first for gray whales. We verified and localized 10,644 gray whale M3 calls and grouped them into 280 tracks. Results confirm that gray whales are acoustically active while migrating and their swimming and acoustic behavior changes on daily and seasonal time scales. The seasonal timing of the calls verifies the gray whale migration timing determined using other methods such as counts conducted by visual observers. The total number of calls and the percentage of calls that were part of a track changed significantly over both seasonal and daily time scales. An average calling rate of 5.7 calls/whale/day was observed, which is significantly greater than previously reported migration calling rates. We measured a mean speed of 1.6 m/s and quantified heading, direction, and water depth where tracks were located. Mean speed and water depth remained constant between night and day, but these quantities had greater variation at night. Gray whales produce M3 calls with a root mean square source level of 156.9 dB re 1 μPa at 1 m. Quantities describing call characteristics were variable and dependent on site-specific propagation characteristics.

  13. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems. II. Further results with application to a set of ALMA and ATCA data

    NASA Astrophysics Data System (ADS)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques to detect signals of known structure and amplitude smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
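
    As a rough illustration of the underlying idea (not the authors' implementation), the sketch below computes a standard matched-filter statistic on pure Gaussian noise and compares a false-alarm rate estimated from all samples with one estimated only from the local peaks of the statistic, which is the ingredient the peak-based PFA correction builds on. The template shape, noise level, and threshold quantile are arbitrary placeholders.

      import numpy as np

      rng = np.random.default_rng(0)

      def matched_filter_stat(data, template, sigma):
          """Correlation of the data with the known template, normalized so the
          statistic has unit variance under pure Gaussian noise."""
          norm = np.sqrt(np.sum(template**2) * sigma**2)
          return np.correlate(data, template, mode="valid") / norm

      template = np.exp(-0.5 * ((np.arange(50) - 25) / 5.0) ** 2)   # placeholder shape
      stat = matched_filter_stat(rng.normal(0.0, 1.0, 100_000), template, sigma=1.0)

      # False-alarm rate over all samples vs. over local maxima of the statistic
      threshold = np.quantile(stat, 0.999)
      is_peak = (stat[1:-1] > stat[:-2]) & (stat[1:-1] > stat[2:])
      pfa_all_samples = np.mean(stat > threshold)
      pfa_peaks_only = np.mean(stat[1:-1][is_peak] > threshold)
      print(pfa_all_samples, pfa_peaks_only)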

  14. Cloud detection method for Chinese moderate high resolution satellite imagery (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Zhong, Bo; Chen, Wuhan; Wu, Shanlong; Liu, Qinhuo

    2016-10-01

    Cloud detection of satellite imagery is very important for quantitative remote sensing research and remote sensing applications. However, many satellite sensors don't have enough bands for a quick, accurate, and simple detection of clouds. Particularly, the newly launched moderate to high spatial resolution satellite sensors of China, such as the charge-coupled device on-board the Chinese Huan Jing 1 (HJ-1/CCD) and the wide field of view (WFV) sensor on-board the Gao Fen 1 (GF-1), only have four available bands including blue, green, red, and near infrared bands, which are far from the requirements of most cloud detection methods. In order to solve this problem, an improved and automated cloud detection method for Chinese satellite sensors called OCM (Object oriented Cloud and cloud-shadow Matching method) is presented in this paper. It first modifies the Automatic Cloud Cover Assessment (ACCA) method, which was developed for Landsat-7 data, to get an initial cloud map. The modified ACCA method is mainly threshold based, and different threshold settings produce different cloud maps. Subsequently, a strict threshold is used to produce a cloud map with high confidence and a large amount of omission, and a loose threshold is used to produce a cloud map with low confidence and a large amount of commission. Secondly, a corresponding cloud-shadow map is also produced using a threshold on the near-infrared band. Thirdly, the cloud maps and the cloud-shadow map are converted into cloud objects and cloud-shadow objects. Cloud and cloud-shadow usually occur in pairs; consequently, the final cloud and cloud-shadow maps are made based on the relationship between cloud and cloud-shadow objects. The OCM method was tested using almost 200 HJ-1/CCD images across China and the overall accuracy of cloud detection is close to 90%.
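
    A minimal sketch of the strict/loose dual-threshold step combined object-wise is given below, assuming scipy is available; the toy band, threshold values, and the hysteresis-style object matching are illustrative stand-ins for the modified ACCA thresholds and the cloud/cloud-shadow pairing described above.

      import numpy as np
      from scipy import ndimage

      def dual_threshold_cloud_map(band, strict=0.5, loose=0.3):
          """Keep loose-threshold cloud objects that contain at least one
          strict-threshold (high-confidence) pixel."""
          strict_map = band > strict          # high confidence, many omissions
          loose_map = band > loose            # low confidence, many commissions
          labels, n = ndimage.label(loose_map)
          keep = np.zeros(n + 1, dtype=bool)
          seed_ids = np.unique(labels[strict_map])
          keep[seed_ids[seed_ids > 0]] = True
          return keep[labels]

      band = np.random.default_rng(1).random((64, 64))   # toy reflectance band
      cloud_mask = dual_threshold_cloud_map(band)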

  15. Blue-Whale Calls Detected at the Pioneer Seamount Underwater Observatory

    NASA Astrophysics Data System (ADS)

    Hoffman, M. D.; Vuosalo, C. O.; Bland, R. W.; Garfield, N.

    2002-12-01

    In September of 2001 a cabled vertical linear array (VLA) of hydrophones was deployed on Pioneer Seamount, 90 km off the California coast near Half Moon Bay, by the NOAA-PMEL and University of Washington-APL. The array of 4 hydrophones is at a depth of 950 m, and the four signals are digitized at the shore end of the cable at 1000 Hz. The data are archived by PMEL, and are available to the public over the internet. Spectrograms of all of the data are accessible on the SFSU web site. A large number of blue-whale calls are evident in the spectrograms. We have employed spectrogram correlation [Mellinger 2000] and a matched-filter detection scheme [Stafford 1998] to automatically identify these whale calls in three months of data. Results on the frequency of calls and their variability will be presented. Mellinger, David K., and Christopher W. Clark [2000], "Recognizing transient low-frequency whale sounds by spectrogram correlation," J. Acoust. Soc. Am. 107 (3518). Stafford, Kathleen M., Christopher G. Fox, and Davis S. Clark [1998], "Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean," J. Acoust. Soc. Am. 104 (3616).
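
    A minimal sketch of spectrogram correlation in the spirit of Mellinger and Clark (2000) follows; the sampling rate matches the 1000 Hz data stream mentioned above, but the kernel (a flat 16 Hz tonal template) and all window parameters are illustrative assumptions rather than the detector actually used.

      import numpy as np
      from scipy import signal

      def spectrogram_correlation(x, fs, call_freq=16.0, call_dur_bins=5, nperseg=256):
          """Slide a simple tonal time-frequency kernel along the spectrogram and
          return a per-time correlation score (higher = more call-like energy)."""
          f, t, S = signal.spectrogram(x, fs=fs, nperseg=nperseg, noverlap=nperseg - 64)
          kernel = np.zeros((S.shape[0], call_dur_bins))
          kernel[np.argmin(np.abs(f - call_freq)), :] = 1.0   # energy at the call frequency
          score = signal.correlate(S, kernel, mode="valid").ravel()
          return t[: score.size], score

      t, score = spectrogram_correlation(np.random.randn(10_000), fs=1000)   # toy data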

  16. Walsh-Hadamard transform kernel-based feature vector for shot boundary detection.

    PubMed

    Lakshmi, Priya G G; Domnic, S

    2014-12-01

    Video shot boundary detection (SBD) is the first step of video analysis, summarization, indexing, and retrieval. In the SBD process, videos are segmented into basic units called shots. In this paper, a new SBD method is proposed using color, edge, texture, and motion strength as a vector of features (feature vector). Features are extracted by projecting the frames on selected basis vectors of the Walsh-Hadamard transform (WHT) kernel and WHT matrix. After extracting the features, based on the significance of the features, weights are calculated. The weighted features are combined to form a single continuity signal, used as input for the Procedure Based shot transition Identification process (PBI). Using the procedure, shot transitions are classified into abrupt and gradual transitions. Experimental results are examined using large-scale test sets provided by TRECVID 2007, which evaluated hard cut and gradual transition detection. To evaluate the robustness of the proposed method, a system evaluation is performed. The proposed method yields an F1-score of 97.4% for cut, 78% for gradual, and 96.1% for overall transitions. We have also evaluated the proposed feature vector with a support vector machine classifier. The results show that WHT-based features can perform better than the other existing methods. In addition to this, a few more video sequences are taken from the Openvideo project and the performance of the proposed method is compared with a recent existing SBD method.
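
    A minimal sketch of block-wise WHT features and a frame-to-frame continuity value is shown below, using scipy.linalg.hadamard; the block size, number of retained coefficients, and the plain L2 continuity are placeholders, not the weighted feature combination used in the paper.

      import numpy as np
      from scipy.linalg import hadamard

      def wht_features(frame_gray, block=8, n_coeffs=4):
          """Project non-overlapping blocks onto a Walsh-Hadamard basis and keep
          the first few coefficients of each block as the feature vector."""
          H = hadamard(block) / np.sqrt(block)
          h, w = frame_gray.shape
          feats = []
          for i in range(0, h - block + 1, block):
              for j in range(0, w - block + 1, block):
                  coeff = H @ frame_gray[i:i + block, j:j + block] @ H.T
                  feats.append(coeff.ravel()[:n_coeffs])
          return np.concatenate(feats)

      f1, f2 = np.random.rand(64, 64), np.random.rand(64, 64)   # toy grayscale frames
      continuity = np.linalg.norm(wht_features(f1) - wht_features(f2))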

  17. MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data

    PubMed Central

    Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong

    2015-01-01

    Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and there does not exist any SNP caller that produces p-values for calling SNPs in a frequentist framework. To fill in this gap, we develop a new method MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations, the involved parameter is very close to the boundary of the parameter space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters in the mixture distribution are estimated, p-values can be easily calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of FDR than the existing SNP callers. Through the application to two real datasets, MAFsnp is also shown to outperform the existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
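
    A hedged sketch of how such a boundary-constrained test statistic can be converted into p-values and FDR-controlled calls is given below; the chi-square continuous part and the fixed mixing weight are stand-ins for the two-parameter mixture actually fitted by MAFsnp, and the Benjamini-Hochberg step uses statsmodels.

      import numpy as np
      from scipy import stats
      from statsmodels.stats.multitest import multipletests

      def mixture_pvalue(lrt_stat, pi0, df=1.0):
          """P(T >= t) under a mixture of a point mass at zero (weight pi0) and a
          chi-square tail (weight 1 - pi0); the chi-square is a placeholder for
          the paper's fitted continuous component."""
          lrt_stat = np.asarray(lrt_stat, dtype=float)
          return np.where(lrt_stat <= 0, 1.0, (1 - pi0) * stats.chi2.sf(lrt_stat, df))

      pvals = mixture_pvalue([0.0, 2.3, 9.1, 15.0], pi0=0.7)          # toy statistics
      reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")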

  18. Mass-tag enhanced immuno-laser desorption/ionization mass spectrometry for sensitive detection of intact protein antigens.

    PubMed

    Lorey, Martina; Adler, Belinda; Yan, Hong; Soliymani, Rabah; Ekström, Simon; Yli-Kauhaluoma, Jari; Laurell, Thomas; Baumann, Marc

    2015-05-19

    A new read-out method for antibody arrays using laser desorption/ionization-mass spectrometry (LDI-MS) is presented. Small, photocleavable reporter molecules with a defined mass called "mass-tags" are used for detection of immunocaptured proteins from human plasma. Using prostate specific antigen (PSA), a biomarker for prostate cancer, as a model antigen, a high-sensitivity generic detection methodology is demonstrated, based on immunocapture with a primary antibody and a biotin-labeled secondary antibody coupled to mass-tagged avidin. As each secondary antibody can bind several avidin molecules, each having a large number of mass-tags, signal amplification can be achieved. The developed PSA sandwich mass-tag analysis method provided a limit of detection below 200 pg/mL (6 pM) for a 10 μL plasma sample, well below the clinically relevant cutoff value of 3-4 ng/mL. This brings the limit of detection (LOD) for detection of intact antigens with matrix-assisted laser desorption/ionization-mass spectrometry (MALDI-MS) down to levels comparable to capture by anti-peptide antibodies selected reaction monitoring (SISCAPA SRM) and enzyme linked immunosorbent assay (ELISA), as 6 pM corresponds to a maximal amount of 60 amol PSA captured on-spot. We propose the potential use of LDI (laser desorption/ionization) with mass-tag read-out implemented in a sandwich assay format for low abundant and/or early disease biomarker detection.

  19. On the robustness of EC-PC spike detection method for online neural recording.

    PubMed

    Zhou, Yin; Wu, Tong; Rastegarnia, Amir; Guan, Cuntai; Keefer, Edward; Yang, Zhi

    2014-09-30

    Online spike detection is an important step to compress neural data and perform real-time neural information decoding. An unsupervised, automatic, yet robust signal processing method is strongly desired so that it can support a wide range of applications. We have developed a novel spike detection algorithm called "exponential component-polynomial component" (EC-PC) spike detection. We first evaluate the robustness of the EC-PC spike detector under different firing rates and SNRs. Secondly, we show that the detection precision can be quantitatively derived without requiring additional user input parameters. We have realized the algorithm (including training) in a 0.13 μm CMOS chip, where an unsupervised, nonparametric operation has been demonstrated. Both simulated data and real data are used to evaluate the method under different firing rates (FRs) and SNRs. The results show that the EC-PC spike detector is the most robust in comparison with some popular detectors. Moreover, the EC-PC detector can track changes in the background noise due to the ability to re-estimate the neural data distribution. Both real and synthesized data have been used for testing the proposed algorithm in comparison with other methods, including the absolute thresholding detector (AT), median absolute deviation detector (MAD), nonlinear energy operator detector (NEO), and continuous wavelet detector (CWD). Comparative testing results reveal that the EC-PC detection algorithm performs better than the other algorithms regardless of recording conditions. The EC-PC spike detector can be considered an unsupervised and robust online spike detection method. It is also suitable for hardware implementation. Copyright © 2014 Elsevier B.V. All rights reserved.
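
    The EC-PC model itself is not reproduced here; as context, the sketch below shows two of the baseline detectors it is compared against, the median-absolute-deviation (MAD) amplitude threshold and the nonlinear energy operator (NEO), on stand-in data. The threshold multipliers are conventional choices, not the paper's settings.

      import numpy as np

      def mad_threshold(x, k=4.0):
          """Amplitude threshold from a median-absolute-deviation noise estimate."""
          sigma = np.median(np.abs(x)) / 0.6745
          return k * sigma

      def neo(x):
          """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
          psi = np.zeros_like(x)
          psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
          return psi

      x = np.random.default_rng(2).normal(size=10_000)   # stand-in for filtered neural data
      spikes_mad = np.flatnonzero(np.abs(x) > mad_threshold(x))
      spikes_neo = np.flatnonzero(neo(x) > 8 * np.mean(neo(x)))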

  20. Automatic Ship Detection in Remote Sensing Images from Google Earth of Complex Scenes Based on Multiscale Rotation Dense Feature Pyramid Networks

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Sun, Hao; Fu, Kun; Yang, Jirui; Sun, Xian; Yan, Menglong; Guo, Zhi

    2018-01-01

    Ship detection has been playing a significant role in the field of remote sensing for a long time but it is still full of challenges. The main limitations of traditional ship detection methods usually lie in the complexity of application scenarios, the difficulty of intensive object detection and the redundancy of detection region. In order to solve the problems above, we propose a framework called Rotation Dense Feature Pyramid Networks (R-DFPN) which can effectively detect ships in different scenes, including ocean and port. Specifically, we put forward the Dense Feature Pyramid Network (DFPN), which is aimed at solving the problem resulting from the narrow width of the ship. Compared with previous multi-scale detectors such as the Feature Pyramid Network (FPN), DFPN builds the high-level semantic feature-maps for all scales by means of dense connections, which enhances feature propagation and encourages feature reuse. Additionally, in the case of ship rotation and dense arrangement, we design a rotation anchor strategy to predict the minimum circumscribed rectangle of the object so as to reduce the redundant detection region and improve the recall. Furthermore, we also propose multi-scale ROI Align for the purpose of maintaining the completeness of semantic and spatial information. Experiments based on remote sensing images from Google Earth for ship detection show that our detection method based on the R-DFPN representation achieves state-of-the-art performance.

  1. [Assessment of cervical intraepithelial neoplasia (CIN) lesions by DNA image cytometry].

    PubMed

    Sun, Xiao-rong; Che, Dong-yuan; Tu, Hong-zhang; Li, Dan; Wang, Jian

    2006-11-01

    To compare the value of conventional cytology and DNA image cytometry (DNA-ICM) assisted cytology in detection and prognostic assessment of cervical CIN lesions. 87 women were enrolled in this study. Cervical samples were collected employing cervix brushes which were then washed in Sedfix. After preparing single cell suspensions by a mechanical procedure, cell monolayers were prepared by cyto-spinning the cells onto microscope slides. Two slides were prepared from each case: one slide was stained by Papanicolaou staining for conventional cytology, and the other was stained by the Feulgen-Thionin method for measurement of the amount of DNA in the cell nuclei using an automated DNA imaging cytometer. Biopsies from the cervical lesions were also taken for histopathology and Ki-67 immunohistochemistry. Of the total of 20 ASCUS cases called by conventional cytology, no CIN or greater lesions were found. Among the 20 cases, 7 cases did not show any cells with a DNA amount greater than 5c, while CIN2 lesions were found in 11 of the other 13 cases that had some aneuploid cells with a DNA amount greater than 5c. Of 30 LSIL cases called by conventional cytology, CIN2 lesions were detected in 3 out of 7 cases that did not contain any aneuploid cells with DNA greater than 5c, but in 22 out of the other 23 cases that contained aneuploid cells with a DNA amount greater than 5c. Of the remaining 7 cases called HSIL by conventional cytology, all cases contained aneuploid cells containing DNA greater than 5c. If cytology was used to refer all cases of LSIL and HSIL to a colposcopy procedure to detect potential CIN2 or greater lesions, the sensitivity, specificity, positive predictive value and negative predictive value were 58.2%, 84.4%, 86.5% and 54.0%, respectively. If DNA-ICM were used and all cases having 3 or more cells with a DNA amount greater than 5c were to be referred to pathology to detect potential CIN2 or greater lesions, the sensitivity, specificity, positive predictive value and negative predictive value were 72.7%, 87.5%, 90.9% and 65.1%, respectively. We also compared Ki67-positive cells in these samples and found that DNA-ICM results were comparable to this biomarker method. The study demonstrated that the DNA-ICM approach can be successfully used to detect significant (i.e. CIN2 or greater) lesions, and can also provide a prognostic assessment of CIN lesions.
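
    The reported percentages follow from standard 2x2 diagnostic-table formulas; a minimal helper is sketched below with made-up counts, since the abstract gives only the derived percentages and not the underlying tables.

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV and NPV from a 2x2 table of counts."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      print(diagnostic_metrics(tp=40, fp=4, fn=15, tn=28))   # illustrative counts only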

  2. Minimizing Higgs potentials via numerical polynomial homotopy continuation

    NASA Astrophysics Data System (ADS)

    Maniatis, M.; Mehta, D.

    2012-08-01

    The study of models with extended Higgs sectors requires minimizing the corresponding Higgs potentials, which is in general very difficult. Here, we apply a recently developed method, called numerical polynomial homotopy continuation (NPHC), which is guaranteed to find all the stationary points of Higgs potentials with polynomial-like nonlinearity. The detection of all stationary points reveals the structure of the potential, with maxima, metastable minima, and saddle points besides the global minimum. We apply the NPHC method to the most general Higgs potential having two complex Higgs-boson doublets and up to five real Higgs-boson singlets. Moreover the method is applicable to even more involved potentials. Hence the NPHC method allows one to go far beyond the limits of the Gröbner basis approach.
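
    Homotopy continuation itself is not sketched here; instead, a toy two-field potential illustrates what "finding and classifying all stationary points" means, using sympy's algebraic solver and the Hessian as a small-scale stand-in for NPHC. The potential below is an arbitrary example, not a Higgs potential.

      import sympy as sp

      x, y = sp.symbols("x y", real=True)
      V = -2 * x**2 - y**2 + (x**2 + y**2) ** 2        # toy polynomial potential

      grad = [sp.diff(V, v) for v in (x, y)]
      stationary = sp.solve(grad, [x, y], dict=True)   # all real stationary points

      H = sp.hessian(V, (x, y))
      for pt in stationary:
          eigs = list(H.subs(pt).eigenvals())
          kind = ("minimum" if all(e > 0 for e in eigs)
                  else "maximum" if all(e < 0 for e in eigs) else "saddle")
          print(pt, V.subs(pt), kind)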

  3. Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic.

    PubMed

    Yokoyama, Jun'ichi

    2014-01-01

    After reviewing the standard hypothesis test and the matched filter technique to identify gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, where the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student's t-distribution, which has larger tails than a Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter method works well in the highly non-Gaussian case.
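
    As a minimal numerical illustration of why the noise model matters, the sketch below evaluates the per-sample log-likelihood ratio for a known template under a Gaussian and under a Student's t marginal. It ignores noise correlations (and hence the Gaussian-mapping step), so it is only a sketch; the template and the degrees of freedom are arbitrary assumptions.

      import numpy as np
      from scipy import stats

      def log_likelihood_ratio(data, template, noise_dist):
          """log Lambda = sum[ log f(d - s) - log f(d) ] for a known template s and
          an assumed independent marginal noise density f."""
          return np.sum(noise_dist.logpdf(data - template) - noise_dist.logpdf(data))

      rng = np.random.default_rng(3)
      template = 0.5 * np.sin(np.linspace(0, 4 * np.pi, 200))
      data = template + stats.t(df=3).rvs(200, random_state=rng)

      lr_gaussian = log_likelihood_ratio(data, template, stats.norm(scale=np.sqrt(3)))
      lr_student = log_likelihood_ratio(data, template, stats.t(df=3))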

  4. An EMAT-based shear horizontal (SH) wave technique for adhesive bond inspection

    NASA Astrophysics Data System (ADS)

    Arun, K.; Dhayalan, R.; Balasubramaniam, Krishnan; Maxfield, Bruce; Peres, Patrick; Barnoncel, David

    2012-05-01

    The evaluation of adhesively bonded structures has been a challenge over the several decades that these structures have been used. Applications within the aerospace industry often call for particularly high performance adhesive bonds. Several techniques have been proposed for the detection of disbonds and cohesive weakness but a reliable NDE method for detecting interfacial weakness (also sometimes called a kissing bond) has been elusive. Different techniques, including ultrasonic, thermal imaging and shearographic methods, have been proposed; all have had some degree of success. In particular, ultrasonic methods, including those based upon shear and guided waves, have been explored for the assessment of interfacial bond quality. Since 3-D guided shear horizontal (SH) waves in plates have predominantly shear displacement at the plate surfaces, we conjectured that SH guided waves should be influenced by interfacial conditions when they propagate between adhesively bonded plates of comparable thickness. This paper describes a new technique based on SH guided waves that propagate within and through a lap joint. Through mechanisms we have yet to fully understand, the propagation of an SH wave through a lap joint gives rise to a reverberation signal that is due to one or more reflections of an SH guided wave mode within that lap joint. Based upon a combination of numerical simulations and measurements, this method shows promise for detecting and classifying interfacial bonds. It is also apparent from our measurements that the SH wave modes can discriminate between adhesive and cohesive bond weakness in both Aluminum-Epoxy-Aluminum and Composite-Epoxy-Composite lap joints. All measurements reported here used periodic permanent magnet (PPM) Electro-Magnetic Acoustic Transducers (EMATs) to generate either or both of the two lowest order SH modes in the plates that comprise the lap joint. This exact configuration has been simulated using finite element (FE) models to describe the SH mode generation, propagation and reception. Of particular interest is that one SH guided wave mode (probably SH0) reverberates within the lap joint. Moreover, in both simulations and measurements, features of this so-called reverberation signal appear to be related to interfacial weakness between the plate (substrate) and the epoxy bond. The results of a hybrid numerical (FE) approach based on using COMSOL to calculate the driving forces within an elastic solid and ABAQUS to propagate the resulting elastic disturbances (waves) within the plates and lap joint are compared with measurements of SH wave generation and reception in lap joint specimens having different interfacial and cohesive bonding conditions.

  5. Analysis of nonlinear modulation between sound and vibrations in metallic structure and its use for damage detection

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gang, Tie; Wan, Chuhao; Wang, Changxi; Luo, Zhiwei

    2015-07-01

    The vibro-acoustic modulation technique is a nonlinear ultrasonic method in nondestructive testing. This technique detects defects by monitoring the modulation components generated by the interaction between the vibration and the ultrasound wave due to the nonlinear material behaviour caused by the damage. In this work, a swept frequency signal was used as the high-frequency excitation; Hilbert-transform-based amplitude and phase demodulation and synchronous demodulation (SD) were then used to extract the modulation information from the received signal, and the results were graphed in the time-frequency domain after a short-time Fourier transform. The demodulation results were quite different from each other. The reason for the difference was investigated by analysing the demodulation process of the two methods. According to the analysis and the subsequent verification test, the SD method was found to be more appropriate for this test, and a new index called MISD was defined to evaluate structure quality in the vibro-acoustic modulation test with swept probing excitation.
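
    A minimal sketch of Hilbert-transform amplitude demodulation followed by a time-frequency view is given below; for simplicity it uses a fixed-frequency probe with a weak synthetic amplitude modulation rather than the swept excitation and measured signals used in the paper, and all frequencies are placeholders.

      import numpy as np
      from scipy.signal import hilbert, stft

      def amplitude_demodulation(received, fs):
          """Envelope via the Hilbert transform; damage-induced modulation appears
          as low-frequency content in the envelope's time-frequency map."""
          envelope = np.abs(hilbert(received))
          f, t, Z = stft(envelope - envelope.mean(), fs=fs, nperseg=1024)
          return f, t, np.abs(Z)

      fs = 200_000
      t = np.arange(0, 0.2, 1 / fs)
      probe = np.sin(2 * np.pi * 60_000 * t) * (1 + 0.05 * np.sin(2 * np.pi * 300 * t))
      f, tt, mag = amplitude_demodulation(probe, fs)   # 300 Hz modulation line expected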

  6. Bi-model processing for early detection of breast tumor in CAD system

    NASA Astrophysics Data System (ADS)

    Mughal, Bushra; Sharif, Muhammad; Muhammad, Nazeer

    2017-06-01

    Early screening of suspicious masses in mammograms may reduce the mortality rate among women. This rate can be further reduced by developing computer-aided diagnosis systems with fewer false assumptions in medical informatics. This method targets early tumor detection in digitized mammograms. For improving the performance of this system, a novel bi-model processing algorithm is introduced. It divides the region of interest into two parts: the first is called the pre-segmented region (breast parenchyma) and the other is the post-segmented region (suspicious region). This system follows the scheme of a contrast-enhancement preprocessing technique that can be utilized to segment and extract the desired features of the given mammogram. In the next phase, a hybrid feature block is presented to show the effective performance of computer-aided diagnosis. In order to assess the effectiveness of the proposed method, a database provided by the society of mammographic images is tested. Our experimental outcomes on this database exhibit the usefulness and robustness of the proposed method.

  7. Early Warning and Outbreak Detection Using Social Networking Websites: The Potential of Twitter

    NASA Astrophysics Data System (ADS)

    de Quincey, Ed; Kostkova, Patty

    Epidemic Intelligence is being used to gather information about potential disease outbreaks from both formal and increasingly informal sources. A potential addition to these informal sources is social networking sites such as Facebook and Twitter. In this paper we describe a method for extracting messages, called "tweets", from the Twitter website and the results of a pilot study which collected over 135,000 tweets in a week during the current Swine Flu pandemic.

  8. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
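
    The core of such peak-ratio quantification can be written in a few lines; the sketch below assumes the C and T peak heights at each CpG position have already been read from the electropherogram, and the numbers are made up. It illustrates the principle only and is not the Mquant code itself.

      def methylation_fraction(c_peak, t_peak):
          """After bisulfite conversion, unmethylated cytosines read as T, so the
          methylation level at a CpG site is the relative height of the C peak."""
          return c_peak / (c_peak + t_peak)

      sites = [(820.0, 180.0), (450.0, 520.0), (90.0, 910.0)]   # toy peak heights
      levels = [methylation_fraction(c, t) for c, t in sites]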

  9. SigEMD: A powerful method for differential gene expression analysis in single-cell RNA sequencing data.

    PubMed

    Wang, Tianyu; Nabavi, Sheida

    2018-04-24

    Differential gene expression analysis is one of the significant efforts in single cell RNA sequencing (scRNAseq) analysis to discover the specific changes in expression levels of individual cell types. Since scRNAseq exhibits multimodality, large amounts of zero counts, and sparsity, it is different from the traditional bulk RNA sequencing (RNAseq) data. The new challenges of scRNAseq data promote the development of new methods for identifying differentially expressed (DE) genes. In this study, we proposed a new method, SigEMD, that combines a data imputation approach, a logistic regression model and a nonparametric method based on the Earth Mover's Distance, to precisely and efficiently identify DE genes in scRNAseq data. The regression model and data imputation are used to reduce the impact of large amounts of zero counts, and the nonparametric method is used to improve the sensitivity of detecting DE genes from multimodal scRNAseq data. By additionally employing gene interaction network information to adjust the final states of DE genes, we further reduce the false positives of calling DE genes. We used simulated datasets and real datasets to evaluate the detection accuracy of the proposed method and to compare its performance with those of other differential expression analysis methods. Results indicate that the proposed method has an overall powerful performance in terms of precision in detection, sensitivity, and specificity. Copyright © 2018 Elsevier Inc. All rights reserved.
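
    The distance at the heart of the method is the one-dimensional Earth Mover's (Wasserstein-1) distance between a gene's expression values in two cell groups; a minimal per-gene score is sketched below with scipy, without the imputation, regression, and network-adjustment steps described above. The toy counts are placeholders.

      import numpy as np
      from scipy.stats import wasserstein_distance

      def emd_score(expr_a, expr_b, nonzero_only=False):
          """Earth Mover's Distance between one gene's expression distributions in
          two groups of cells; optionally drop zeros to soften dropout effects."""
          a, b = np.asarray(expr_a, float), np.asarray(expr_b, float)
          if nonzero_only:
              a, b = a[a > 0], b[b > 0]
          return wasserstein_distance(a, b)

      score = emd_score([0, 0, 1, 5, 7, 0, 2], [0, 3, 8, 9, 12, 0, 10])   # toy counts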

  10. Bio-Aerosol Detection Using Mass Spectrometry: Public Health Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludvigson, Laura D.

    2004-01-01

    I recently spent a summer as an intern at the Lawrence Livermore National Laboratory. I worked on a project involving the real-time, reagentless, single cell detection of aerosolized pathogens using a novel mass spectrometry approach called Bio-Aerosol Mass Spectrometry (BAMS). Based upon preliminary results showing the differentiation capabilities of BAMS, I would like to explore the development and use of this novel detection system in the context of both environmental and clinical sample pathogen detection. I would also like to explore the broader public health applications that a system such as BAMS might have in terms of infectious disease prevention and control. In order to appreciate the potential of this instrument, I will demonstrate the need for better pathogen detection methods, and outline the instrumentation, data analysis and preliminary results that lead me toward a desire to explore this technology further. I will also discuss potential experiments for the future along with possible problems that may be encountered along the way.

  11. A deeper look at two concepts of measuring gene-gene interactions: logistic regression and interaction information revisited.

    PubMed

    Mielniczuk, Jan; Teisseyre, Paweł

    2018-03-01

    Detection of gene-gene interactions is one of the most important challenges in genome-wide case-control studies. Besides traditional logistic regression analysis, recently the entropy-based methods have attracted significant attention. Among entropy-based methods, interaction information is one of the most promising measures, having many desirable properties. Although both logistic regression and interaction information have been used in several genome-wide association studies, the relationship between them has not been thoroughly investigated theoretically. The present paper attempts to fill this gap. We show that although certain connections between the two methods exist, in general they refer to two different concepts of dependence, and looking for interactions in those two senses leads to different approaches to interaction detection. We introduce an ordering between interaction measures and specify conditions for independent and dependent genes under which interaction information is a more discriminative measure than logistic regression. Moreover, we show that for so-called perfect distributions those measures are equivalent. The numerical experiments illustrate the theoretical findings, indicating that interaction information and its modified version are more universal tools for detecting various types of interaction than logistic regression and linkage disequilibrium measures. © 2017 WILEY PERIODICALS, INC.
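
    Interaction information for two SNPs and a phenotype can be computed from plug-in entropies as II(X;Y;Z) = I(X,Y;Z) - I(X;Z) - I(Y;Z); the sketch below uses toy genotype and phenotype vectors and no bias correction, so it only illustrates the quantity being discussed.

      import numpy as np
      from collections import Counter

      def entropy(labels):
          """Empirical Shannon entropy (bits) of a sequence of discrete labels."""
          counts = np.array(list(Counter(labels).values()), float)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p))

      def interaction_information(x, y, z):
          """II(X;Y;Z) = I(X,Y;Z) - I(X;Z) - I(Y;Z); positive values indicate that
          the SNP pair carries synergistic information about the phenotype Z."""
          def mi(a, b):
              return entropy(a) + entropy(b) - entropy(list(zip(a, b)))
          return mi(list(zip(x, y)), z) - mi(x, z) - mi(y, z)

      x = [0, 1, 2, 0, 1, 2, 0, 1]   # toy genotypes for SNP X
      y = [0, 0, 1, 1, 2, 2, 0, 1]   # toy genotypes for SNP Y
      z = [0, 0, 1, 1, 1, 0, 0, 1]   # toy case-control status
      print(interaction_information(x, y, z))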

  12. Estimating site occupancy and abundance using indirect detection indices

    USGS Publications Warehouse

    Stanley, T.R.; Royle, J. Andrew

    2005-01-01

    Knowledge of factors influencing animal distribution and abundance is essential in many areas of ecological research, management, and policy-making. Because common methods for modeling and estimating abundance (e.g., capture-recapture, distance sampling) are sometimes not practical for large areas or elusive species, indices are sometimes used as surrogate measures of abundance. We present an extension of the Royle and Nichols (2003) generalization of the MacKenzie et al. (2002) site-occupancy model that incorporates the length of the sampling interval into the model for detection probability. As a result, we obtain a modeling framework that shows how useful information can be extracted from a class of index methods we call indirect detection indices (IDIs). Examples of IDIs include scent station, tracking tube, snow track, tracking plate, and hair snare surveys. Our model is fit by maximum likelihood, and it can be used to estimate site occupancy and model factors influencing patterns of occupancy and abundance in space. Under certain circumstances, it can also be used to estimate abundance. We evaluated model properties using Monte Carlo simulations and illustrate the method with tracking tube and scent station data. We believe this model will be a useful tool for determining factors that influence animal distribution and abundance.
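
    A heavily hedged likelihood sketch in the Royle-Nichols spirit follows, in which the per-station detection probability depends on latent abundance and the length of the sampling interval through an assumed hazard form, 1 - exp(-r*L*N). This functional form and all numbers are illustrative assumptions, not the exact model of Stanley and Royle.

      import numpy as np
      from scipy.stats import poisson
      from scipy.optimize import minimize

      def neg_log_lik(params, y, L, n_max=50):
          """Occupancy-style likelihood where Pr(detect | N, L) = 1 - exp(-r * L * N)
          and N ~ Poisson(lam); the hazard form is an assumption, not the paper's
          exact parameterization."""
          r, lam = np.exp(params)                      # keep parameters positive
          n = np.arange(n_max + 1)
          prior = poisson.pmf(n, lam)
          ll = 0.0
          for yi, Li in zip(y, L):
              p = 1.0 - np.exp(-r * Li * n)            # detection prob per abundance class
              ll += np.log(np.sum((p if yi else 1 - p) * prior))
          return -ll

      y = np.array([1, 0, 1, 1, 0])                    # any detection at a station? (toy)
      L = np.array([3, 2, 4, 3, 1])                    # nights each station operated (toy)
      fit = minimize(neg_log_lik, x0=np.log([0.1, 1.0]), args=(y, L))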

  13. ExScalibur: A High-Performance Cloud-Enabled Suite for Whole Exome Germline and Somatic Mutation Identification

    PubMed Central

    Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge

    2015-01-01

    Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modulated pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud. PMID:26271043

  14. The dawn of the liquid biopsy in the fight against cancer

    PubMed Central

    Domínguez-Vigil, Irma G.; Moreno-Martínez, Ana K.; Wang, Julia Y.; Roehrl, Michael H.A.; Barrera-Saldaña, Hugo A.

    2018-01-01

    Cancer is a molecular disease associated with alterations in the genome, which, thanks to the highly improved sensitivity of mutation detection techniques, can be identified in cell-free DNA (cfDNA) circulating in blood, a method also called liquid biopsy. This is a non-invasive alternative to surgical biopsy and has the potential of revealing the molecular signature of tumors to aid in the individualization of treatments. In this review, we focus on cfDNA analysis, its advantages, and clinical applications employing genomic tools (NGS and dPCR) particularly in the field of oncology, and highlight its valuable contributions to early detection, prognosis, and prediction of treatment response. PMID:29416824

  15. Enhanced backscatter of optical beams reflected in atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Nelson, W.; Palastro, J. P.; Wu, C.; Davis, C. C.

    2014-10-01

    Optical beams propagating through the atmosphere acquire phase distortions from turbulent fluctuations in the refractive index. While these distortions are usually deleterious to propagation, beams reflected in a turbulent medium can undergo a local recovery of spatial coherence and intensity enhancement referred to as enhanced backscatter (EBS). Using simulations, we investigate the EBS of optical beams reflected from mirrors, corner cubes, and rough surfaces, and identify the regimes in which EBS is most distinctly observed. Standard EBS detection requires averaging the reflected intensity over many passes through uncorrelated turbulence. Here we present an algorithm called the "tilt-shift method" which allows detection of EBS in static turbulence, improving its suitability for potential applications.

  16. Automatic Line Calling Badminton System

    NASA Astrophysics Data System (ADS)

    Affandi Saidi, Syahrul; Adawiyah Zulkiplee, Nurabeahtul; Muhammad, Nazmizan; Sarip, Mohd Sharizan Md

    2018-05-01

    A system and relevant method are described to detect whether a projectile impact occurs on one side of a boundary line or the other. The system employs the use of force sensing resistor-based sensors that may be designed in segments or assemblies and linked to a mechanism with a display. An impact classification system is provided for distinguishing between various events, including a footstep, ball impact and tennis racquet contact. A sensor monitoring system is provided for determining the condition of sensors and providing an error indication if sensor problems exist. A service detection system is provided when the system is used for tennis that permits activation of selected groups of sensors and deactivation of others.

  17. Cell culture-based biosensing techniques for detecting toxicity in water.

    PubMed

    Tan, Lu; Schirmer, Kristin

    2017-06-01

    The significant increase of contaminants entering fresh water bodies calls for the development of rapid and reliable methods to monitor the aquatic environment and to detect water toxicity. Cell culture-based biosensing techniques utilise the overall cytotoxic response to external stimuli, mediated by a transduced signal, to specify the toxicity of aqueous samples. These biosensing techniques can effectively indicate water toxicity for human safety and aquatic organism health. In this review we account for the recent developments of the mainstream cell culture-based biosensing techniques for water quality evaluation, discuss their key features, potentials and limitations, and outline the future prospects of their development. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  18. Characterization of St. Lawrence blue whale vocalizations and their correlation with field observations

    NASA Astrophysics Data System (ADS)

    Berchok, Catherine L.

    During four field seasons from 1998--2001, 115 hours of acoustic recordings were made in the presence of the well-studied St. Lawrence population of blue whales. The primary field site for this study was the estuary region of the St. Lawrence River (Quebec, Canada) with most recordings made between mid-August and late October. Effort was concentrated in the daylight hours, although occasionally extending past nightfall. An inexpensive and portable recording system was built that was easy to deploy and provided quality recordings in a variety of sea conditions. It consisted of a calibrated omni-directional hydrophone with a flat (+/-3dB) response from 5Hz to 800Hz; and a surface isolation buoy to minimize the vertical movement of the sensor. During the recording sessions detailed field notes were taken on all blue whales within sight, with individual identities confirmed through photo-identification work between sessions. Notes were also taken on all other species sighted during the recording sessions. Characterization of the more than one-thousand blue whale calls detected during this study revealed that the St. Lawrence repertoire is much more extensive than previously reported. Three infrasonic (<20Hz) and four audible range (30--200Hz) call types were detected in this study, with much time/frequency variation seen within each type. The infrasonic calls were long (5--30s) in duration and arranged into regularly patterned series. These calls were similar in call characteristics and spacing to those detected in the North Atlantic, but had much shorter and more variable patterned series. The audible call types were much shorter (1--4s), and occurred singly or in irregularly spaced clusters, although a special patterning was seen that contained both regular and irregular spaced components. Comparison of the daily, seasonal, and spatial distributions of calling behavior with those of several biological parameters revealed interesting differences between the three call types examined. The trends seen suggest a migratory, reproductive, or foraging context for the infrasonic calls. A closer-range social context is suggested for the audible downsweeps, which have been detected in foraging situations as well as in courtship displays. The audible mixed-pattern call type appears to have a primarily reproductive context.

  19. Systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a service representative

    DOEpatents

    Harris, Scott H.; Johnson, Joel A.; Neiswanger, Jeffery R.; Twitchell, Kevin E.

    2004-03-09

    The present invention includes systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a customer service representative. In one embodiment of the invention, a system configured to distribute a telephone call within a network includes a distributor adapted to connect with a telephone system, the distributor being configured to connect a telephone call using the telephone system and output the telephone call and associated data of the telephone call; and a plurality of customer service representative terminals connected with the distributor and a selected customer service representative terminal being configured to receive the telephone call and the associated data, the distributor and the selected customer service representative terminal being configured to synchronize application of the telephone call and associated data from the distributor to the selected customer service representative terminal.

  20. Evaluation and optimisation of indel detection workflows for ion torrent sequencing of the BRCA1 and BRCA2 genes.

    PubMed

    Yeo, Zhen Xuan; Wong, Joshua Chee Leong; Rozen, Steven G; Lee, Ann Siew Gek

    2014-06-24

    The Ion Torrent PGM is a popular benchtop sequencer that shows promise in replacing conventional Sanger sequencing as the gold standard for mutation detection. Despite the PGM's reported high accuracy in calling single nucleotide variations, it tends to generate many false positive calls in detecting insertions and deletions (indels), which may hinder its utility for clinical genetic testing. Recently, the proprietary analytical workflow for the Ion Torrent sequencer, Torrent Suite (TS), underwent a series of upgrades. We evaluated three major upgrades of TS by calling indels in the BRCA1 and BRCA2 genes. Our analysis revealed that false negative indels could be generated by TS under both default calling parameters and parameters adjusted for maximum sensitivity. However, indel calling with the same data using the open source variant callers, GATK and SAMtools showed that false negatives could be minimised with the use of appropriate bioinformatics analysis. Furthermore, we identified two variant calling measures, Quality-by-Depth (QD) and VARiation of the Width of gaps and inserts (VARW), which substantially reduced false positive indels, including non-homopolymer associated errors without compromising sensitivity. In our best case scenario that involved the TMAP aligner and SAMtools, we achieved 100% sensitivity, 99.99% specificity and 29% False Discovery Rate (FDR) in indel calling from all 23 samples, which is a good performance for mutation screening using PGM. New versions of TS, BWA and GATK have shown improvements in indel calling sensitivity and specificity over their older counterpart. However, the variant caller of TS exhibits a lower sensitivity than GATK and SAMtools. Our findings demonstrate that although indel calling from PGM sequences may appear to be noisy at first glance, proper computational indel calling analysis is able to maximize both the sensitivity and specificity at the single base level, paving the way for the usage of this technology for future clinical genetic testing.
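
    As an illustration of the kind of post-hoc filtering discussed above, the sketch below screens an indel call by a Quality-by-Depth cutoff and, when supplied, by a measure of gap-width spread across reads (in the spirit of the VARW idea); the thresholds and field names are placeholders, not the tuned values from the paper.

      def pass_indel_filters(qual, depth, qd_min=2.0, gap_width_var=None, varw_max=1.0):
          """Return True if an indel call survives a QD (= QUAL / depth) cutoff and,
          when supplied, a cutoff on the spread of gap widths across reads."""
          qd = qual / max(depth, 1)
          if qd < qd_min:
              return False
          if gap_width_var is not None and gap_width_var > varw_max:
              return False
          return True

      print(pass_indel_filters(qual=120.0, depth=80))   # low QD -> filtered out
      print(pass_indel_filters(qual=900.0, depth=60))   # kept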

  1. Novel diagnostic procedure for determining metastasis to sentinel lymph nodes in breast cancer using a semi-dry dot-blot method.

    PubMed

    Otsubo, Ryota; Oikawa, Masahiro; Hirakawa, Hiroshi; Shibata, Kenichiro; Abe, Kuniko; Hayashi, Tomayoshi; Kinoshita, Naoe; Shigematsu, Kazuto; Hatachi, Toshiko; Yano, Hiroshi; Matsumoto, Megumi; Takagi, Katsunori; Tsuchiya, Tomoshi; Tomoshige, Koichi; Nakashima, Masahiro; Taniguchi, Hideki; Omagari, Takeyuki; Itoyanagi, Noriaki; Nagayasu, Takeshi

    2014-02-15

    We developed an easy, quick and cost-effective detection method for lymph node metastasis called the semi-dry dot-blot (SDB) method, which visualizes the presence of cancer cells with washing of sectioned lymph nodes by anti-pancytokeratin antibody, modifying dot-blot technology. We evaluated the validity and efficacy of the SDB method for the diagnosis of lymph node metastasis in a clinical setting (Trial 1). To evaluate the validity of the SDB method in clinical specimens, 180 dissected lymph nodes from 29 cases, including breast, gastric and colorectal cancer, were examined. Each lymph node was sliced at the maximum diameter and the sensitivity, specificity and accuracy of the SDB method were determined and compared with the final pathology report. Metastasis was detected in 32 lymph nodes (17.8%), and the sensitivity, specificity and accuracy of the SDB method were 100, 98.0 and 98.3%, respectively (Trial 2). To evaluate the efficacy of the SDB method in sentinel lymph node (SLN) biopsy, 174 SLNs from 100 cases of clinically node-negative breast cancer were analyzed. Each SLN was longitudinally sliced at 2-mm intervals and the sensitivity, specificity, accuracy and time required for the SDB method were determined and compared with the intraoperative pathology report. Metastasis was detected in 15 SLNs (8.6%), and the sensitivity, specificity, accuracy and mean required time of the SDB method were 93.3, 96.9, 96.6 and 43.3 min, respectively. The SDB method is a novel and reliable modality for the intraoperative diagnosis of SLN metastasis. © 2013 UICC.

  2. Use of a public telephone hotline to detect urban plague cases.

    PubMed

    Malberg, J A; Pape, W J; Lezotte, D; Hill, A E

    2012-11-01

    Current methods for vector-borne disease surveillance are limited by time and cost. To avoid human infections from emerging zoonotic diseases, it is important that the United States develop cost-effective surveillance systems for these diseases. This study examines the methodology used in the surveillance of a plague epizootic involving tree squirrels (Sciurus niger) in Denver Colorado, during the summer of 2007. A call-in centre for the public to report dead squirrels was used to direct animal carcass sampling. Staff used these reports to collect squirrel carcasses for the analysis of Yersinia pestis infection. This sampling protocol was analysed at the census tract level using Poisson regression to determine the relationship between higher call volumes in a census tract and the risk of a carcass in that tract testing positive for plague. Over-sampling owing to call volume-directed collection was accounted for by including the number of animals collected as the denominator in the model. The risk of finding an additional plague-positive animal increased as the call volume per census tract increased. The risk in the census tracts with >3 calls a month was significantly higher than that with three or less calls in a month. For tracts with 4-5 calls, the relative risk (RR) of an additional plague-positive carcass was 10.08 (95% CI 5.46-18.61); for tracts with 6-8 calls, the RR = 5.20 (2.93-9.20); for tracts with 9-11 calls, the RR = 12.80 (5.85-28.03) and tracts with >11 calls had RR = 35.41 (18.60-67.40). Overall, the call-in centre directed sampling increased the probability of locating plague-infected carcasses in the known Denver epizootic. Further studies are needed to determine the effectiveness of this methodology at monitoring large-scale zoonotic disease occurrence in the absence of a recognized epizootic. © 2012 Blackwell Verlag GmbH.
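
    The analysis described above is a Poisson regression with the number of collected carcasses as the exposure; a minimal version with statsmodels and entirely made-up tract-level numbers is sketched below (the paper models call volume in categories, whereas this sketch uses it as a single covariate).

      import numpy as np
      import statsmodels.api as sm

      calls = np.array([1, 2, 4, 5, 7, 9, 12, 15])         # toy monthly calls per tract
      collected = np.array([1, 2, 3, 4, 5, 6, 8, 10])      # carcasses tested (exposure)
      positives = np.array([0, 0, 1, 1, 2, 2, 4, 6])       # plague-positive carcasses

      model = sm.GLM(positives, sm.add_constant(calls),
                     family=sm.families.Poisson(),
                     offset=np.log(collected))             # accounts for sampling effort
      result = model.fit()
      print(np.exp(result.params))                         # rate ratios per extra call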

  3. Alkaline Comet Assay for Assessing DNA Damage in Individual Cells.

    PubMed

    Pu, Xinzhu; Wang, Zemin; Klaunig, James E

    2015-08-06

    Single-cell gel electrophoresis, commonly called a comet assay, is a simple and sensitive method for assessing DNA damage at the single-cell level. It is an important technique in genetic toxicological studies. The comet assay performed under alkaline conditions (pH >13) is considered the optimal version for identifying agents with genotoxic activity. The alkaline comet assay is capable of detecting DNA double-strand breaks, single-strand breaks, alkali-labile sites, DNA-DNA/DNA-protein cross-linking, and incomplete excision repair sites. The inclusion of digestion of lesion-specific DNA repair enzymes in the procedure allows the detection of various DNA base alterations, such as oxidative base damage. This unit describes alkaline comet assay procedures for assessing DNA strand breaks and oxidative base alterations. These methods can be applied in a variety of cells from in vitro and in vivo experiments, as well as human studies. Copyright © 2015 John Wiley & Sons, Inc.

  4. Novel trace chemical detection algorithms: a comparative study

    NASA Astrophysics Data System (ADS)

    Raz, Gil; Murphy, Cara; Georgan, Chelsea; Greenwood, Ross; Prasanth, R. K.; Myers, Travis; Goyal, Anish; Kelley, David; Wood, Derek; Kotidis, Petros

    2017-05-01

    Algorithms for standoff detection and estimation of trace chemicals in hyperspectral images in the IR band are a key component for a variety of applications relevant to law-enforcement and the intelligence communities. Performance of these methods is impacted by the spectral signature variability due to presence of contaminants, surface roughness, nonlinear dependence on abundances as well as operational limitations on the compute platforms. In this work we provide a comparative performance and complexity analysis of several classes of algorithms as a function of noise levels, error distribution, scene complexity, and spatial degrees of freedom. The algorithm classes we analyze and test include adaptive cosine estimator (ACE and modifications to it), compressive/sparse methods, Bayesian estimation, and machine learning. We explicitly call out the conditions under which each algorithm class is optimal or near optimal as well as their built-in limitations and failure modes.
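
    Of the algorithm classes listed, the adaptive cosine estimator has a compact closed form; a minimal version (mean-subtracted, with a pseudo-inverse background covariance) is sketched below on random stand-in spectra. It shows the ACE statistic only, not the authors' modifications or the other algorithm classes compared in the paper.

      import numpy as np

      def ace_detector(X, target, background):
          """Adaptive cosine estimator scores for pixels X (N x B), a target
          spectrum, and background pixels used to estimate the covariance."""
          mu = background.mean(axis=0)
          Sigma_inv = np.linalg.pinv(np.cov(background, rowvar=False))
          Xc, s = X - mu, target - mu
          num = (Xc @ Sigma_inv @ s) ** 2
          den = (s @ Sigma_inv @ s) * np.einsum("ij,jk,ik->i", Xc, Sigma_inv, Xc)
          return num / den

      rng = np.random.default_rng(4)
      scores = ace_detector(rng.normal(size=(100, 30)),            # toy pixels (N x bands)
                            target=np.ones(30),                    # toy target spectrum
                            background=rng.normal(size=(500, 30)))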

  5. Probe classification of on-off type DNA microarray images with a nonlinear matching measure

    NASA Astrophysics Data System (ADS)

    Ryu, Munho; Kim, Jong Dae; Min, Byoung Goo; Kim, Jongwon; Kim, Y. Y.

    2006-01-01

    We propose a nonlinear matching measure, called counting measure, as a signal detection measure that is defined as the number of on pixels in the spot area. It is applied to classify probes for an on-off type DNA microarray, where each probe spot is classified as hybridized or not. The counting measure also incorporates the maximum response search method, where the expected signal is obtained by taking the maximum among the measured responses of the various positions and sizes of the spot template. The counting measure was compared to existing signal detection measures such as the normalized covariance and the median for 2390 patient samples tested on the human papillomavirus (HPV) DNA chip. The counting measure performed the best regardless of whether or not the maximum response search method was used. The experimental results showed that the counting measure combined with the positional search was the most preferable.
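
    The counting measure and the maximum-response search reduce to a few lines, sketched below; the circular spot template, the on-pixel threshold, and the search grid are illustrative assumptions rather than the settings used for the HPV chip.

      import numpy as np

      def counting_measure(image, center, radius, on_threshold):
          """Number of 'on' pixels inside a circular spot template."""
          yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
          mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
          return int(np.sum(image[mask] > on_threshold))

      def max_response_search(image, nominal_center, radii, offsets, on_threshold):
          """Maximum counting measure over candidate spot positions and sizes."""
          best = 0
          for dy in offsets:
              for dx in offsets:
                  for r in radii:
                      c = (nominal_center[0] + dy, nominal_center[1] + dx)
                      best = max(best, counting_measure(image, c, r, on_threshold))
          return best

      img = np.random.default_rng(5).random((100, 100))    # toy spot image
      score = max_response_search(img, (50, 50), radii=(6, 8, 10),
                                  offsets=(-2, 0, 2), on_threshold=0.9)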

  6. Numerical detection of the Gardner transition in a mean-field glass former.

    PubMed

    Charbonneau, Patrick; Jin, Yuliang; Parisi, Giorgio; Rainone, Corrado; Seoane, Beatriz; Zamponi, Francesco

    2015-07-01

    Recent theoretical advances predict the existence, deep into the glass phase, of a novel phase transition, the so-called Gardner transition. This transition is associated with the emergence of a complex free energy landscape composed of many marginally stable sub-basins within a glass metabasin. In this study, we explore several methods to detect numerically the Gardner transition in a simple structural glass former, the infinite-range Mari-Kurchan model. The transition point is robustly located from three independent approaches: (i) the divergence of the characteristic relaxation time, (ii) the divergence of the caging susceptibility, and (iii) the abnormal tail in the probability distribution function of cage order parameters. We show that the numerical results are fully consistent with the theoretical expectation. The methods we propose may also be generalized to more realistic numerical models as well as to experimental systems.

  7. Microfluidic devices for sample preparation and rapid detection of foodborne pathogens.

    PubMed

    Kant, Krishna; Shahbazi, Mohammad-Ali; Dave, Vivek Priy; Ngo, Tien Anh; Chidambara, Vinayaka Aaydha; Than, Linh Quyen; Bang, Dang Duong; Wolff, Anders

    2018-03-10

    Rapid detection of foodborne pathogens at an early stage is imperative for preventing the outbreak of foodborne diseases, known as serious threats to human health. Conventional bacterial culturing methods for foodborne pathogen detection are time consuming, laborious, and with poor pathogen diagnosis competences. This has prompted researchers to call the current status of detection approaches into question and leverage new technologies for superior pathogen sensing outcomes. Novel strategies mainly rely on incorporating all the steps from sample preparation to detection in miniaturized devices for online monitoring of pathogens with high accuracy and sensitivity in a time-saving and cost effective manner. Lab on chip is a blooming area in diagnosis, which exploits different mechanical and biological techniques to detect very low concentrations of pathogens in food samples. This is achieved through streamlining the sample handling and concentrating procedures, which will subsequently reduce human errors and enhance the accuracy of the sensing methods. Integration of sample preparation techniques into these devices can effectively minimize the impact of complex food matrix on pathogen diagnosis and improve the limit of detections. Integration of pathogen capturing bio-receptors on microfluidic devices is a crucial step, which can facilitate recognition abilities in harsh chemical and physical conditions, offering a great commercial benefit to the food-manufacturing sector. This article reviews recent advances in current state-of-the-art of sample preparation and concentration from food matrices with focus on bacterial capturing methods and sensing technologies, along with their advantages and limitations when integrated into microfluidic devices for online rapid detection of pathogens in foods and food production line. Copyright © 2018. Published by Elsevier Inc.

  8. The SOBANE risk management strategy and the Déparis method for the participatory screening of the risks.

    PubMed

    Malchaire, J B

    2004-08-01

    The first section of the document describes a risk-prevention strategy, called SOBANE, in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost effective, and more effective in coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. These four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements to develop specific solutions; expertise, where, in very sophisticated and rare cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of the risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level screening of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the easiness, the effectiveness and the satisfaction at work are discussed, in search of practical prevention measures. The points to be studied more in detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple, sparing in time and means and playing a significant role in the development of a dynamic plan of risk management and of a culture of dialogue in the company.

  9. Maximizing detection probability of Wetland-dependent birds during point-count surveys in northwestern Florida

    USGS Publications Warehouse

    Nadeau, C.P.; Conway, C.J.; Smith, B.S.; Lewis, T.E.

    2008-01-01

    We conducted 262 call-broadcast point-count surveys (1-6 replicate surveys on each of 62 points) using standardized North American Marsh Bird Monitoring Protocols between 31 May and 7 July 2006 on St. Vincent National Wildlife Refuge, an island off the northwest coast of Florida. We conducted double-blind multiple-observer surveys, paired morning and evening surveys, and paired morning and night surveys to examine the influence of call-broadcast and time of day on detection probability. Observer detection probability for all species pooled was 75% and was similar between passive (69%) and call-broadcast (65%) periods. Detection probability was higher on morning than evening (t = 3.0, P = 0.030) or night (t = 3.4, P = 0.042) surveys when we pooled all species. Detection probability was higher (but not significant for all species) on morning compared to evening or night surveys for all five focal species detected on surveys: Least Bittern (Ixobrychus exilis), Clapper Rail (Rallus longirostris), Purple Gallinule (Porphyrula martinica), Common Moorhen (Gallinula chloropus), and American Coot (Fulica americana). We detected more Least Bitterns (t = 2.4, P = 0.064) and Common Moorhens (t = 2.8, P = 0.026) on morning than evening surveys, and more Clapper Rails (t = 5.1, P = 0.014) on morning than night surveys.

  10. A scalable self-priming fractal branching microchannel net chip for digital PCR.

    PubMed

    Zhu, Qiangyuan; Xu, Yanan; Qiu, Lin; Ma, Congcong; Yu, Bingwen; Song, Qi; Jin, Wei; Jin, Qinhan; Liu, Jinyu; Mu, Ying

    2017-05-02

    As an absolute quantification method at the single-molecule level, digital PCR has been widely used in many bioresearch fields, such as next generation sequencing, single cell analysis, gene editing detection and so on. However, existing digital PCR methods still have some disadvantages, including high cost, sample loss, and complicated operation. In this work, we develop an exquisite scalable self-priming fractal branching microchannel net digital PCR chip. This chip with a special design inspired by natural fractal-tree systems has an even distribution and 100% compartmentalization of the sample without any sample loss, which is not available in existing chip-based digital PCR methods. A special 10 nm nano-waterproof layer was created to prevent the solution from evaporating. A vacuum pre-packaging method called self-priming reagent introduction is used to passively drive the reagent flow into the microchannel nets, so that this chip can realize sequential reagent loading and isolation within a couple of minutes, which is very suitable for point-of-care detection. When the number of positive microwells stays in the range of 100 to 4000, the relative uncertainty is below 5%, which means that one panel can detect an average of 101 to 15 374 molecules by the Poisson distribution. This chip is proved to have an excellent ability for single molecule detection and quantification of low expression of hHF-MSC stem cell markers. Due to its potential for high throughput, high density, low cost, lack of sample and reagent loss, self-priming even compartmentalization and simple operation, we envision that this device will significantly expand and extend the application range of digital PCR involving rare samples, liquid biopsy detection and point-of-care detection with higher sensitivity and accuracy.
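
    As a minimal sketch of the Poisson correction that underlies the molecule counts quoted above, the snippet below converts positive-well counts into estimated molecule numbers. The 4,096-partition panel size is an assumption made here purely for illustration; the abstract does not state the chip's well count.

    ```python
    import math

    def dpcr_copies(positive: int, total: int) -> float:
        """Estimate the number of target molecules on a digital PCR panel from the
        count of positive partitions, assuming Poisson-distributed loading."""
        if positive >= total:
            raise ValueError("saturated panel: all partitions positive")
        p = positive / total          # fraction of positive partitions
        lam = -math.log(1.0 - p)      # estimated mean copies per partition
        return lam * total            # expected copies on the whole panel

    # With a hypothetical 4,096-partition panel, 100 and 4,000 positive wells map
    # to roughly 101 and 15,400 molecules, close to the range quoted above.
    print(round(dpcr_copies(100, 4096)))    # ~101
    print(round(dpcr_copies(4000, 4096)))   # ~15373
    ```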

  11. Design and analysis of a spectro-angular surface plasmon resonance biosensor operating in the visible spectrum

    NASA Astrophysics Data System (ADS)

    Filion-Côté, Sandrine; Roche, Philip J. R.; Foudeh, Amir M.; Tabrizian, Maryam; Kirk, Andrew G.

    2014-09-01

    Surface plasmon resonance (SPR) sensing is one of the most widely used methods to implement biosensing due to its sensitivity and capacity for label-free detection. Whilst most commercial SPR sensors operate in the angular regime, it has recently been shown that an increase in sensitivity and a greater robustness against noise can be achieved by measuring the reflectivity when varying both the angle and wavelength simultaneously, in a so-called spectro-angular SPR biosensor. A singular value decomposition (SVD) method is used to project the two-dimensional spectro-angular reflection signal onto a basis set and allow the image obtained from an unknown refractive index sample to be compared very accurately with a pre-calculated reference set. Herein we demonstrate that a previously reported system operated in the near infra-red has a lower detection limit when operating in the visible spectrum due to the improved spatial resolution and numerical precision of the image sensor. The SPR biosensor presented here has an experimental detection limit of 9.8 × 10^-7 refractive index units. To validate the system as a biosensor, we also performed the detection of synthetic RNA from pathogenic Legionella pneumophila with the developed biosensing platform.
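
    A minimal sketch of the projection idea described above, assuming a pre-computed stack of reference spectro-angular reflectivity maps; the array shapes, variable names, and use of random data are illustrative and not taken from the paper.

    ```python
    import numpy as np

    # Build an orthonormal basis from flattened reference reflectivity maps (one per
    # known refractive index), project a measured map onto it, and pick the closest
    # pre-calculated reference by comparing coefficient vectors.
    rng = np.random.default_rng(0)
    n_angle, n_wavelength, n_ref = 64, 64, 20
    refs = rng.random((n_ref, n_angle * n_wavelength))       # flattened reference maps
    mean_map = refs.mean(axis=0)

    U, s, Vt = np.linalg.svd(refs - mean_map, full_matrices=False)
    basis = Vt[:10]                                          # first 10 right singular vectors

    ref_coeffs = (refs - mean_map) @ basis.T                 # reference coordinates
    measured = refs[7] + 0.01 * rng.standard_normal(refs.shape[1])
    meas_coeff = (measured - mean_map) @ basis.T

    closest = np.argmin(np.linalg.norm(ref_coeffs - meas_coeff, axis=1))
    print("closest pre-calculated reference:", closest)      # expected: 7
    ```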

  12. Predictive inference for best linear combination of biomarkers subject to limits of detection.

    PubMed

    Coolen-Maturi, Tahani

    2017-08-15

    Measuring the accuracy of diagnostic tests is crucial in many application areas including medicine, machine learning and credit scoring. The receiver operating characteristic (ROC) curve is a useful tool to assess the ability of a diagnostic test to discriminate between two classes or groups. In practice, multiple diagnostic tests or biomarkers are combined to improve diagnostic accuracy. Often, biomarker measurements are undetectable either below or above the so-called limits of detection (LoD). In this paper, nonparametric predictive inference (NPI) for best linear combination of two or more biomarkers subject to limits of detection is presented. NPI is a frequentist statistical method that is explicitly aimed at using few modelling assumptions, enabled through the use of lower and upper probabilities to quantify uncertainty. The NPI lower and upper bounds for the ROC curve subject to limits of detection are derived, where the objective function to maximize is the area under the ROC curve. In addition, the paper discusses the effect of restriction on the linear combination's coefficients on the analysis. Examples are provided to illustrate the proposed method. Copyright © 2017 John Wiley & Sons, Ltd.
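
    The NPI lower and upper bounds themselves are not reproduced here; the sketch below only illustrates the objective being maximized, the empirical AUC of a linear combination of two biomarkers, with values below a hypothetical lower limit of detection replaced by the LoD itself. The data, the substitution rule, and the grid search are illustrative assumptions.

    ```python
    import numpy as np

    def empirical_auc(scores_pos, scores_neg):
        """Empirical AUC: probability that a diseased score exceeds a healthy one
        (ties counted as 1/2)."""
        diff = scores_pos[:, None] - scores_neg[None, :]
        return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

    rng = np.random.default_rng(1)
    # two hypothetical biomarkers per subject; rows = subjects, cols = markers
    x_pos = rng.normal([1.0, 0.8], 1.0, size=(50, 2))   # diseased group
    x_neg = rng.normal([0.0, 0.0], 1.0, size=(50, 2))   # healthy group

    # left-censor values below a hypothetical limit of detection at the LoD itself
    lod = -0.5
    x_pos, x_neg = np.maximum(x_pos, lod), np.maximum(x_neg, lod)

    # grid search over the angle parameterising the linear combination w = (cos t, sin t)
    angles = np.linspace(0, np.pi, 181)
    aucs = [empirical_auc(x_pos @ np.array([np.cos(t), np.sin(t)]),
                          x_neg @ np.array([np.cos(t), np.sin(t)])) for t in angles]
    best = int(np.argmax(aucs))
    print(f"best empirical AUC {aucs[best]:.3f} at angle {angles[best]:.2f} rad")
    ```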

  13. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) across different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot adequately handle this nonlinearity. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, one of the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), since an exponentially weighted moving average (EWMA) can further improve fault detection performance by reducing the FAR. The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates, smaller FARs, and a smaller average run length. The idea behind EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data in a decaying exponential fashion, giving more weight to the most recent data. This provides a more accurate estimate of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and used in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. It is used to enhance fault detection of the Cad system in an E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed in terms of FAR, missed detection rate, and average run length (ARL1) values.
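
    The exact EWMA-GLRT statistic is not reproduced here; as a minimal, hedged illustration of the exponential-weighting idea, the sketch below applies a standard EWMA chart to model residuals, with a simulated mean shift standing in for a fault. The smoothing constant, noise level, and control limit are illustrative assumptions.

    ```python
    import numpy as np

    def ewma_statistic(residuals, lam=0.2):
        """Exponentially weighted moving average of model residuals: more recent
        residuals get more weight, older ones decay geometrically."""
        z = np.zeros_like(residuals, dtype=float)
        for t, r in enumerate(residuals):
            z[t] = lam * r + (1 - lam) * (z[t - 1] if t > 0 else 0.0)
        return z

    rng = np.random.default_rng(2)
    resid = rng.normal(0, 1, 300)
    resid[200:] += 1.5                           # simulated fault: mean shift in the residuals

    lam, sigma = 0.2, 1.0
    z = ewma_statistic(resid, lam=lam)
    ucl = 3 * sigma * np.sqrt(lam / (2 - lam))   # steady-state 3-sigma control limit
    alarms = np.where(np.abs(z) > ucl)[0]
    print("first alarm at sample", alarms[0] if alarms.size else None)
    ```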

  14. Automated real time constant-specificity surveillance for disease outbreaks.

    PubMed

    Wieland, Shannon C; Brownstein, John S; Berger, Bonnie; Mandl, Kenneth D

    2007-06-13

    For real time surveillance, detection of abnormal disease patterns is based on a difference between patterns observed, and those predicted by models of historical data. The usefulness of outbreak detection strategies depends on their specificity; the false alarm rate affects the interpretation of alarms. We evaluate the specificity of five traditional models: autoregressive, Serfling, trimmed seasonal, wavelet-based, and generalized linear. We apply each to 12 years of emergency department visits for respiratory infection syndromes at a pediatric hospital, finding that the specificity of the five models was almost always a non-constant function of the day of the week, month, and year of the study (p < 0.05). We develop an outbreak detection method, called the expectation-variance model, based on generalized additive modeling to achieve a constant specificity by accounting for not only the expected number of visits, but also the variance of the number of visits. The expectation-variance model achieves constant specificity on all three time scales, as well as earlier detection and improved sensitivity compared to traditional methods in most circumstances. Modeling the variance of visit patterns enables real-time detection with known, constant specificity at all times. With constant specificity, public health practitioners can better interpret the alarms and better evaluate the cost-effectiveness of surveillance systems.
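
    A minimal sketch of the underlying idea: standardizing the observed visit count by both the predicted mean and the predicted variance so that the alarm threshold corresponds to a fixed specificity. The numbers and the normal approximation are illustrative assumptions, not the study's generalized additive model.

    ```python
    import numpy as np

    def alarm(observed, expected, variance, z_threshold=2.33):
        """Flag days where the observed visit count exceeds the model expectation
        by more than z_threshold predicted standard deviations (~1% one-sided
        false alarm rate if the standardized counts are approximately normal)."""
        z = (observed - expected) / np.sqrt(variance)
        return z > z_threshold

    # Hypothetical day: the seasonal model predicts 40 visits with variance 60
    print(alarm(observed=62, expected=40.0, variance=60.0))   # True -> raise an alarm
    print(alarm(observed=48, expected=40.0, variance=60.0))   # False
    ```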

  15. X-ray scatter imaging of hepatocellular carcinoma in a mouse model using nanoparticle contrast agents

    DOE PAGES

    Rand, Danielle; Derdak, Zoltan; Carlson, Rolf; ...

    2015-10-29

    Hepatocellular carcinoma (HCC) is one of the most common malignant tumors worldwide and is almost uniformly fatal. Current methods of detection include ultrasound examination and imaging by CT scan or MRI; however, these techniques are problematic in terms of sensitivity and specificity, and the detection of early tumors (<1 cm diameter) has proven elusive. Better, more specific, and more sensitive detection methods are therefore urgently needed. Here we discuss the application of a newly developed x-ray imaging technique called Spatial Frequency Heterodyne Imaging (SFHI) for the early detection of HCC. SFHI uses x-rays scattered by an object to form an image and is more sensitive than conventional absorption-based x-radiography. We show that tissues labeled in vivo with gold nanoparticle contrast agents can be detected using SFHI. We also demonstrate that directed targeting and SFHI of HCC tumors in a mouse model is possible through the use of HCC-specific antibodies. As a result, the enhanced sensitivity of SFHI relative to currently available techniques enables the x-ray imaging of tumors that are just a few millimeters in diameter and substantially reduces the amount of nanoparticle contrast agent required for intravenous injection relative to absorption-based x-ray imaging.

  16. GUIDE-Seq enables genome-wide profiling of off-target cleavage by CRISPR-Cas nucleases

    PubMed Central

    Nguyen, Nhu T.; Liebers, Matthew; Topkar, Ved V.; Thapar, Vishal; Wyvekens, Nicolas; Khayter, Cyd; Iafrate, A. John; Le, Long P.; Aryee, Martin J.; Joung, J. Keith

    2014-01-01

    CRISPR RNA-guided nucleases (RGNs) are widely used genome-editing reagents, but methods to delineate their genome-wide off-target cleavage activities have been lacking. Here we describe an approach for global detection of DNA double-stranded breaks (DSBs) introduced by RGNs and potentially other nucleases. This method, called Genome-wide Unbiased Identification of DSBs Enabled by Sequencing (GUIDE-Seq), relies on capture of double-stranded oligodeoxynucleotides into breaks. Application of GUIDE-Seq to thirteen RGNs in two human cell lines revealed wide variability in RGN off-target activities and unappreciated characteristics of off-target sequences. The majority of identified sites were not detected by existing computational methods or ChIP-Seq. GUIDE-Seq also identified RGN-independent genomic breakpoint ‘hotspots’. Finally, GUIDE-Seq revealed that truncated guide RNAs exhibit substantially reduced RGN-induced off-target DSBs. Our experiments define the most rigorous framework for genome-wide identification of RGN off-target effects to date and provide a method for evaluating the safety of these nucleases prior to clinical use. PMID:25513782

  17. Color filter array pattern identification using variance of color difference image

    NASA Astrophysics Data System (ADS)

    Shin, Hyun Jun; Jeon, Jong Ju; Eom, Il Kyu

    2017-07-01

    A color filter array is placed on the image sensor of a digital camera to acquire color images. Each pixel uses only one color, since the image sensor can measure only one color per pixel. Therefore, empty pixels are filled using an interpolation process called demosaicing. The original and the interpolated pixels have different statistical characteristics. If the image is modified by manipulation or forgery, the color filter array pattern is altered. This pattern change can be a clue for image forgery detection. However, most forgery detection algorithms have the disadvantage of assuming the color filter array pattern. We present an identification method of the color filter array pattern. Initially, the local mean is eliminated to remove the background effect. Subsequently, the color difference block is constructed to emphasize the difference between the original pixel and the interpolated pixel. The variance measure of the color difference image is proposed as a means of estimating the color filter array configuration. The experimental results show that the proposed method is effective in identifying the color filter array pattern. Compared with conventional methods, our method provides superior performance.
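
    As a simplified, hedged illustration of the idea (not the authors' exact pipeline): after removing the local mean, pixels that were physically sampled by the sensor retain more high-frequency variance than demosaiced (interpolated) pixels, so the candidate 2x2 arrangement whose hypothesized original sites score highest is taken as the CFA phase. The demo input is random, so the printed phase is meaningless; a real demosaiced image would be needed for a sensible answer.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def bayer_phase_score(channel, phase):
        """Score a candidate 2x2 Bayer phase for one colour channel: variance of the
        local-mean-removed (high-pass) signal at hypothesised original sites minus
        the variance at hypothesised interpolated sites."""
        hp = channel - uniform_filter(channel, size=3, mode="nearest")  # remove local mean
        dy, dx = phase
        mask = np.zeros(channel.shape, dtype=bool)
        mask[dy::2, dx::2] = True
        return hp[mask].var() - hp[~mask].var()

    rng = np.random.default_rng(3)
    red = rng.random((64, 64))                     # stand-in for the red channel of an image
    scores = {p: bayer_phase_score(red, p) for p in [(0, 0), (0, 1), (1, 0), (1, 1)]}
    print(max(scores, key=scores.get))             # estimated phase of the red samples
    ```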

  18. Real-time loop-mediated isothermal amplification (RealAmp) for the species-specific identification of Plasmodium vivax.

    PubMed

    Patel, Jaymin C; Oberstaller, Jenna; Xayavong, Maniphet; Narayanan, Jothikumar; DeBarry, Jeremy D; Srinivasamoorthy, Ganesh; Villegas, Leopoldo; Escalante, Ananias A; DaSilva, Alexandre; Peterson, David S; Barnwell, John W; Kissinger, Jessica C; Udhayakumar, Venkatachalam; Lucchi, Naomi W

    2013-01-01

    Plasmodium vivax infections remain a major source of malaria-related morbidity and mortality. Early and accurate diagnosis is an integral component of effective malaria control programs. Conventional molecular diagnostic methods provide accurate results but are often resource-intensive, expensive, have a long turnaround time and are beyond the capacity of most malaria-endemic countries. Our laboratory has recently developed a new platform called RealAmp, which combines loop-mediated isothermal amplification (LAMP) with a portable tube scanner real-time isothermal instrument for the rapid detection of malaria parasites. Here we describe new primers for the detection of P. vivax using the RealAmp method. Three pairs of amplification primers required for this method were derived from a conserved DNA sequence unique to the P. vivax genome. The amplification was carried out at 64°C using SYBR Green or SYTO-9 intercalating dyes for 90 minutes with the tube scanner set to collect fluorescence signals at 1-minute intervals. Clinical samples of P. vivax and other human-infecting malaria parasite species were used to determine the sensitivity and specificity of the primers by comparing with an 18S ribosomal RNA-based nested PCR as the gold standard. The new set of primers consistently detected laboratory-maintained isolates of P. vivax from different parts of the world. The primers detected P. vivax in the clinical samples with 94.59% sensitivity (95% CI: 87.48-98.26%) and 100% specificity (95% CI: 90.40-100%) compared to the gold standard nested-PCR method. The new primers also proved to be more sensitive than the published species-specific primers specifically developed for the LAMP method in detecting P. vivax.

  19. Drowsy driver mobile application: Development of a novel scleral-area detection method.

    PubMed

    Mohammad, Faisal; Mahadas, Kausalendra; Hung, George K

    2017-10-01

    A reliable and practical app for mobile devices was developed to detect driver drowsiness. It consisted of two main components: a Haar cascade classifier, provided by a computer vision framework called OpenCV, for face/eye detection; and a dedicated JAVA software code for image processing that was applied over a masked region circumscribing the eye. A binary threshold was performed over the masked region to provide a quantitative measure of the number of white pixels in the sclera, which represented the state of eye opening. A continuously low white-pixel count would indicate drowsiness, thereby triggering an alarm to alert the driver. This system was successfully implemented on: (1) a static face image, (2) two subjects under laboratory conditions, and (3) a subject in a vehicle environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
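
    A hedged Python/OpenCV sketch of the pipeline just described (the original implementation was Java; the threshold value and function name here are illustrative, while the cascade file is one shipped with OpenCV):

    ```python
    import cv2

    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def scleral_white_pixels(gray_frame, thresh=200):
        """Detect an eye with a Haar cascade, binarise the eye region, and return
        the white-pixel count of the sclera as a proxy for eye openness."""
        eyes = eye_cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) == 0:
            return None                                   # no eye found in this frame
        x, y, w, h = eyes[0]
        roi = gray_frame[y:y + h, x:x + w]
        _, binary = cv2.threshold(roi, thresh, 255, cv2.THRESH_BINARY)
        return int(cv2.countNonZero(binary))

    # A white-pixel count that stays low over consecutive frames would indicate a
    # closed eye and trigger the drowsiness alarm.
    ```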

  20. Automatic Railway Traffic Object Detection System Using Feature Fusion Refine Neural Network under Shunting Mode.

    PubMed

    Ye, Tao; Wang, Baocheng; Song, Ping; Li, Juan

    2018-06-12

    Many accidents happen under shunting mode when the speed of a train is below 45 km/h. In this mode, train attendants observe the railway condition ahead using the traditional manual method and relay the observations to the driver in order to avoid danger. To address this problem, an automatic object detection system based on a convolutional neural network (CNN) is proposed to detect objects ahead in shunting mode, called the Feature Fusion Refine neural network (FR-Net). It consists of three connected modules, i.e., the depthwise-pointwise convolution, the coarse detection module, and the object detection module. Depthwise-pointwise convolutions are used to improve real-time detection performance. The coarse detection module coarsely refines the locations and sizes of prior anchors to provide better initialization for the subsequent module and also reduces the search space for classification, whereas the object detection module regresses accurate object locations and predicts the class labels for the prior anchors. The experimental results on the railway traffic dataset show that FR-Net achieves 0.8953 mAP with 72.3 FPS performance on a machine with a GeForce GTX1080Ti with an input size of 320 × 320 pixels. The results imply that FR-Net achieves a good tradeoff between effectiveness and real-time performance. The proposed method can meet the needs of practical application in shunting mode.
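
    A minimal PyTorch sketch of a depthwise-pointwise (depthwise-separable) convolution block, the building element named above; the layer sizes are illustrative and this is not the FR-Net architecture itself.

    ```python
    import torch
    import torch.nn as nn

    class DepthwisePointwise(nn.Module):
        """Depthwise convolution (one filter per input channel) followed by a 1x1
        pointwise convolution that mixes channels: far fewer multiply-adds than a
        standard 3x3 convolution, which is what makes real-time detection feasible."""
        def __init__(self, c_in, c_out, stride=1):
            super().__init__()
            self.depthwise = nn.Conv2d(c_in, c_in, 3, stride, padding=1, groups=c_in, bias=False)
            self.pointwise = nn.Conv2d(c_in, c_out, 1, bias=False)
            self.bn = nn.BatchNorm2d(c_out)

        def forward(self, x):
            return torch.relu(self.bn(self.pointwise(self.depthwise(x))))

    x = torch.randn(1, 32, 160, 160)
    print(DepthwisePointwise(32, 64)(x).shape)   # torch.Size([1, 64, 160, 160])
    ```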

  1. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki)

    PubMed Central

    2013-01-01

    Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455

  2. Biclustering of gene expression data using reactive greedy randomized adaptive search procedure

    PubMed Central

    Dharan, Smitha; Nair, Achuthsankar S

    2009-01-01

    Background Biclustering algorithms belong to a distinct class of clustering algorithms that perform simultaneous clustering of both rows and columns of the gene expression matrix and can be a very useful analysis tool when some genes have multiple functions and experimental conditions are diverse. Cheng and Church introduced a measure called the mean squared residue score to evaluate the quality of a bicluster, which has become one of the most popular measures used to search for biclusters. In this paper, we review the basic concepts of the metaheuristic Greedy Randomized Adaptive Search Procedure (GRASP), namely its construction and local search phases, and propose a new method, a variant of GRASP called Reactive Greedy Randomized Adaptive Search Procedure (Reactive GRASP), to detect significant biclusters from large microarray datasets. The method has two major steps. First, high quality bicluster seeds are generated by means of k-means clustering. In the second step, these seeds are grown using the Reactive GRASP, in which the basic parameter that defines the restrictiveness of the candidate list is self-adjusted, depending on the quality of the solutions found previously. Results We performed statistical and biological validations of the biclusters obtained and evaluated the method against the results of basic GRASP as well as the classic work of Cheng and Church. The experimental results indicate that the Reactive GRASP approach outperforms the basic GRASP algorithm and the Cheng and Church approach. Conclusion The Reactive GRASP approach for the detection of significant biclusters is robust and does not require calibration efforts. PMID:19208127
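
    The Cheng-and-Church mean squared residue score mentioned above has a simple closed form; a minimal sketch with illustrative data:

    ```python
    import numpy as np

    def mean_squared_residue(bicluster):
        """Cheng-and-Church mean squared residue of a bicluster (sub-matrix of the
        expression matrix): lower scores indicate more coherent biclusters."""
        row_mean = bicluster.mean(axis=1, keepdims=True)
        col_mean = bicluster.mean(axis=0, keepdims=True)
        overall = bicluster.mean()
        residue = bicluster - row_mean - col_mean + overall
        return float((residue ** 2).mean())

    # A perfectly additive sub-matrix has residue 0; noise raises the score.
    perfect = np.add.outer([0.0, 1.0, 2.0], [10.0, 20.0, 30.0])
    print(mean_squared_residue(perfect))                       # 0.0
    print(mean_squared_residue(perfect + np.random.default_rng(4).normal(0, 0.1, (3, 3))))
    ```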

  3. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki).

    PubMed

    Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian

    2013-11-09

    The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.

  4. Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911.

    PubMed

    Sharkey, Sonya; Denke, Linda; Herbert, Morley A

    2016-08-01

    To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). We obtained permission from parents and guardians to use an 8-min puppet show to instruct the fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A pretest and three posttests (one immediately following the presentation, one at 3 months, and a third at 6 months) were administered. Responses from 282 students were evaluable. Significant improvements (p < .001) in knowledge were found through all posttests in identifying what parts of the body a stroke affects and through the first two posttests in recognizing the symptoms stroke victims experience. Students demonstrated at pretest a high awareness of EMS and 911 (97.5%) and showed slight, but not significant, improvement over time. © The Author(s) 2016.

  5. Timing and technique impact the effectiveness of road-based, mobile acoustic surveys of bats.

    PubMed

    D'Acunto, Laura E; Pauli, Benjamin P; Moy, Mikko; Johnson, Kiara; Abu-Omar, Jasmine; Zollner, Patrick A

    2018-03-01

    Mobile acoustic surveys are a common method of surveying bat communities. However, there is a paucity of empirical studies exploring different methods for conducting mobile road surveys of bats. During 2013, we conducted acoustic mobile surveys on three routes in north-central Indiana, U.S.A., using (1) a standard road survey, (2) a road survey where the vehicle stopped for 1 min at every half mile of the survey route (called a "start-stop method"), and (3) a road survey with an individual using a bicycle. Linear mixed models with multiple comparison procedures revealed that when all bat passes were analyzed, using a bike to conduct mobile surveys detected significantly more bat passes per unit time compared to other methods. However, incorporating genus-level comparisons revealed no advantage to using a bike over vehicle-based methods. We also found that survey method had a significant effect when analyses were limited to those bat passes that could be identified to genus, with the start-stop method generally detecting more identifiable passes than the standard protocol or bike survey. Additionally, we found that significantly more identifiable bat passes (particularly those of the Eptesicus and Lasiurus genera) were detected in surveys conducted immediately following sunset. As governing agencies, particularly in North America, implement vehicle-based bat monitoring programs, it is important for researchers to understand how variations on protocols influence the inference that can be gained from different monitoring schemes.

  6. A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies

    PubMed Central

    Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.

    2008-01-01

    Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969

  7. Method for detecting the reactivity of chemicals towards peptides as an alternative test method for assessing skin sensitization potential.

    PubMed

    Cho, Sun-A; Jeong, Yun Hyeok; Kim, Ji Hoon; Kim, Seoyoung; Cho, Jun-Cheol; Heo, Yong; Heo, Young; Suh, Kyung-Do; Shin, Kyeho; An, Susun

    2014-02-10

    Cosmetics are normally composed of various ingredients. Some cosmetic ingredients can act as chemical haptens that react with proteins or peptides of human skin, and they can provoke an immunologic reaction called skin sensitization. This haptenation process is a key step in inducing skin sensitization, and evaluating the sensitizing potential of cosmetic ingredients is essential for consumer safety. Therefore, animal-alternative methods focused on monitoring haptenation potential are undergoing vigorous research. The aim here was to examine the further usefulness of spectrophotometric methods for monitoring the reactivity of chemicals toward peptides for cosmetic ingredients. Forty chemicals (25 sensitizers and 15 non-sensitizers) were reacted with two synthetic peptides: a cysteine peptide (Ac-RFAACAA-COOH) with a free thiol group and a lysine peptide (Ac-RFAAKAA-COOH) with a free amine group. Unreacted peptides were detected after incubation with 5,5'-dithiobis-2-nitrobenzoic acid or fluorescamine™ as detection reagents for the free thiol and amine groups, respectively. Chemicals were categorized as sensitizers when they induced more than 10% depletion of the cysteine peptide or more than 30% depletion of the lysine peptide. The sensitivity, specificity, and accuracy were 80.0%, 86.7%, and 82.5%, respectively. These results demonstrate that spectrophotometric methods can serve as easy, fast, high-throughput screening tools for predicting the skin sensitization potential of chemicals, including cosmetic ingredients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
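
    The classification rule stated above can be written directly; the example depletion values below are hypothetical:

    ```python
    def classify_sensitizer(cys_depletion_pct, lys_depletion_pct):
        """Threshold rule from the study: a chemical is called a sensitizer when it
        depletes more than 10% of the cysteine peptide or more than 30% of the
        lysine peptide (depletion measured spectrophotometrically)."""
        return cys_depletion_pct > 10.0 or lys_depletion_pct > 30.0

    print(classify_sensitizer(35.2, 5.0))   # True  -> predicted sensitizer
    print(classify_sensitizer(4.1, 12.7))   # False -> predicted non-sensitizer
    ```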

  8. Prospective multi-center study of an automatic online seizure detection system for epilepsy monitoring units.

    PubMed

    Fürbass, F; Ossenblok, P; Hartmann, M; Perko, H; Skupch, A M; Lindinger, G; Elezi, L; Pataraia, E; Colon, A J; Baumgartner, C; Kluge, T

    2015-06-01

    A method for automatic detection of epileptic seizures in long-term scalp-EEG recordings called EpiScan will be presented. EpiScan is used as an alarm device to notify medical staff of epilepsy monitoring units (EMUs) in case of a seizure. A prospective multi-center study was performed in three EMUs including 205 patients. A comparison between EpiScan and the Persyst seizure detector on the prospective data will be presented. In addition, the detection results of EpiScan on retrospective EEG data of 310 patients and the publicly available CHB-MIT dataset will be shown. A detection sensitivity of 81% was reached for unequivocal electrographic seizures with a false alarm rate of only 7 per day. No statistically significant differences in the detection sensitivities could be found between the centers. The comparison to the Persyst seizure detector showed a lower false alarm rate for EpiScan, but the difference was not statistically significant. The automatic seizure detection method EpiScan showed high sensitivity and a low false alarm rate in a prospective multi-center study on a large number of patients. Its application as a seizure alarm device in EMUs thus becomes feasible and should raise the efficiency of video-EEG monitoring and the safety of patients. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    PubMed Central

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD) to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement has shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms. PMID:22016625
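
    As a hedged sketch of the generic quasiconformal kernel transformation underlying QK-SVDD: the base kernel is rescaled as K~(x, y) = c(x) c(y) K(x, y). The specific conformal factor c(.) used by the authors is not reproduced here; an arbitrary positive vector stands in for it.

    ```python
    import numpy as np

    def rbf_kernel(X, Y, gamma=0.5):
        """Gaussian RBF base kernel matrix between rows of X and rows of Y."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def quasiconformal_kernel(X, Y, c_X, c_Y, gamma=0.5):
        """Quasiconformal scaling of a base kernel: K~(x, y) = c(x) * c(y) * K(x, y).
        The positive factor c(.) is typically chosen to expand the kernel-induced
        metric near the decision boundary (here it is a user-supplied vector)."""
        return np.outer(c_X, c_Y) * rbf_kernel(X, Y, gamma)

    rng = np.random.default_rng(5)
    X = rng.normal(size=(6, 2))
    c = 1.0 + rng.random(6)          # illustrative positive conformal factors
    print(quasiconformal_kernel(X, X, c, c).shape)   # (6, 6)
    ```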

  10. A Study of Lane Detection Algorithm for Personal Vehicle

    NASA Astrophysics Data System (ADS)

    Kobayashi, Kazuyuki; Watanabe, Kajiro; Ohkubo, Tomoyuki; Kurihara, Yosuke

    By "personal vehicle" we mean a simple, lightweight vehicle expected to emerge as a personal ground transportation device. Motorcycles, electric wheelchairs, and motor-powered bicycles are examples of personal vehicles that have been developed as useful personal transportation. Recently, a new type of intelligent personal vehicle, the Segway, has been developed; it is controlled and stabilized using multiple on-board intelligent sensors. Demand for such personal vehicles is increasing: 1) to enhance human mobility, 2) to support mobility for elderly persons, and 3) to reduce environmental burdens. As the personal-vehicle market grows rapidly, the number of accidents caused by human error is also increasing. Many of these accidents are related to driving ability. To enhance or support driving ability, as well as to prevent accidents, intelligent assistance is necessary. One of the most important elemental functions for a personal vehicle is robust lane detection. In this paper, we develop a robust lane detection method for personal vehicles in outdoor environments. The proposed method employs a 360-degree omnidirectional camera and a robust image processing algorithm. To detect lanes, a combination of template matching and the Hough transform is employed. The validity of the proposed lane detection algorithm is confirmed on an actual vehicle under various sunlit outdoor conditions.
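
    A hedged OpenCV sketch of the edge-plus-Hough step of such a lane detector; it omits the omnidirectional-camera unwarping and the template-matching stage described above, and all thresholds are illustrative.

    ```python
    import cv2
    import numpy as np

    def detect_lane_lines(bgr_frame):
        """Return candidate lane segments as (x1, y1, x2, y2) tuples using Canny
        edges followed by the probabilistic Hough transform."""
        gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                minLineLength=60, maxLineGap=10)
        return [] if lines is None else [tuple(l[0]) for l in lines]
    ```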

  11. Automatic defect detection for TFT-LCD array process using quasiconformal kernel support vector data description.

    PubMed

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD) to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement has shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  12. Detection and labeling ribs on expiration chest radiographs

    NASA Astrophysics Data System (ADS)

    Park, Mira; Jin, Jesse S.; Wilson, Laurence S.

    2003-06-01

    Typically, inspiration is preferred when x-raying the lungs. The x-ray technologist will ask a patient to be still, take a deep breath, and hold it. This not only reduces the possibility of a blurred image but also enhances the quality of the image, since air-filled lungs are easier to see on x-ray film. However, inspiration causes low density in the inner part of the lung field. That means that ribs in the inner part of the lung field have lower density than the parts nearer to the border of the lung field. That is why edge detection algorithms often fail to detect ribs. Therefore, to make rib edges clear, we try to produce an expiration lung field using a 'hemi-elliptical cavity.' Based on the expiration lung field, we extract the rib edges using a Canny edge detector and a new connectivity method, called '4 way with 10-neighbors connectivity', to detect clavicle and rib edge candidates. Once the edge candidates are formed, our system selects the best candidates using knowledge-based constraints such as gradient, length, and location. The edges can be paired and labeled as superior and inferior rib edges. The system then uses the clavicle, which is obtained by the same method as the rib edges, as a landmark to label all detected ribs.

  13. Fault detection on a sewer network by a combination of a Kalman filter and a binary sequential probability ratio test

    NASA Astrophysics Data System (ADS)

    Piatyszek, E.; Voignier, P.; Graillot, D.

    2000-05-01

    One of the aims of sewer networks is the protection of the population against floods and the reduction of pollution discharged to the receiving water during rainy events. To meet these goals, managers have to equip sewer networks with, and set up, real-time control systems. Unfortunately, a component fault (leading to intolerable behaviour of the system) or sensor fault (deteriorating the process view and disturbing the local automatism) makes sewer network supervision delicate. In order to ensure adequate flow management during rainy events, it is essential to set up procedures capable of detecting and diagnosing these anomalies. This article introduces a real-time fault detection method, applicable to sewer networks, for the follow-up of rainy events. This method consists of comparing the sensor response with a forecast of this response. This forecast is provided by a model, more precisely by a state estimator: a Kalman filter. The Kalman filter provides not only a flow estimate but also an entity called the 'innovation'. In order to detect abnormal operations within the network, this innovation is analysed with the binary sequential probability ratio test of Wald. Moreover, by cross-checking available information from several nodes of the network, a diagnosis of the detected anomalies is carried out. This method provided encouraging results during the analysis of several rain events on the sewer network of Seine-Saint-Denis County, France.
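
    A minimal sketch of the detection chain described above: a scalar Kalman filter producing innovations, followed by Wald's binary sequential probability ratio test on those innovations. The noise levels, mean-shift alternative, and restart rule are illustrative assumptions, not the paper's calibration.

    ```python
    import numpy as np

    def kalman_innovations(z, q=1e-3, r=0.5):
        """Scalar random-walk Kalman filter over a flow series z; returns the
        innovation (measurement minus one-step prediction) at each step."""
        x, p = z[0], 1.0
        innov = []
        for zk in z:
            p = p + q                        # predict
            nu = zk - x                      # innovation
            k = p / (p + r)                  # Kalman gain
            x, p = x + k * nu, (1 - k) * p   # update
            innov.append(nu)
        return np.array(innov)

    def sprt_alarm(innov, sigma, shift=1.0, a=0.01, b=0.01):
        """Wald binary SPRT on the innovations: H0 mean 0 vs H1 mean `shift`.
        Returns the index at which H1 is accepted, or None."""
        upper, lower = np.log((1 - b) / a), np.log(b / (1 - a))
        llr = 0.0
        for t, nu in enumerate(innov):
            llr += (shift / sigma**2) * (nu - shift / 2.0)   # Gaussian log-likelihood ratio step
            if llr >= upper:
                return t
            if llr <= lower:
                llr = 0.0                    # restart the test after accepting H0
        return None

    rng = np.random.default_rng(6)
    flow = np.concatenate([rng.normal(10, 0.7, 150), rng.normal(12, 0.7, 50)])  # simulated fault
    nu = kalman_innovations(flow)
    print("fault flagged at sample", sprt_alarm(nu, sigma=0.7, shift=1.5))
    ```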

  14. Validation of the TrichinEasy® digestion system for the detection of Anisakidae larvae in fish products.

    PubMed

    Cammilleri, Gaetano; Chetta, Michele; Costa, Antonella; Graci, Stefania; Collura, Rosaria; Buscemi, Maria Drussilla; Cusimano, Maria; Alongi, Angelina; Principato, Deborah; Giangrosso, Giuseppe; Vella, Antonio; Ferrantelli, Vincenzo

    2016-03-01

    Anisakis and other parasites belonging to the Anisakidae family are organisms of interest for human health, because of their high zoonotic potential. Parasites belonging to this family can cause Anisakiasis, a parasitological disease caused by the ingestion of raw, infested fish products. Furthermore, evidence from the EFSA (European Food Safety Authority; EFSA 2010) has highlighted the allergological potential of nematodes belonging to the Anisakis genus. The detection and identification of Anisakidae larvae in fish products requires an initial visual inspection of the fish sample, as well as other techniques such as candling, UV illumination and artificial digestion. The digestion method consists of the simulation of digestive mechanics, which is made possible by the utilization of HCl and pepsin, according to EC Regulation 2075/2005. In this study, a new Anisakidae larvae detection method using a mechanical digestion system called TrichinEasy® was developed. A total of 142 fish samples, belonging to 14 different species, were examined to validate the method. A reaction mixture with 100 g of sample, 10 g of pepsin (1:10000 NF) and 50 ml of 10% HCl at 36 ± 1°C for 20 minutes was found to be the best condition for the digestion of fish samples. These parameters have also allowed the detection of viable larvae after digestion. The results confirm this instrumentation as a valuable and safe tool for the detection of Anisakidae larvae in fishery products.

  15. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), and species that call infrequently, such as Pileated Woodpecker, had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
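
    As a hedged illustration of the removal-model idea (not necessarily the exact likelihood used in the paper), the sketch below assumes an exponential time to first detection with a constant per-minute rate and fits it to counts of new detections per interval. The interval boundaries and counts are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def removal_detectability(counts, breaks):
        """Fit a constant per-minute detection rate r to interval counts from a
        removal-type point count (counts[i] = birds first detected in
        (breaks[i], breaks[i+1]] minutes) and return the estimated probability
        that a bird present is detected at least once during the full count."""
        counts, breaks = np.asarray(counts, float), np.asarray(breaks, float)

        def negloglik(r):
            cell = np.exp(-r * breaks[:-1]) - np.exp(-r * breaks[1:])   # first-detection probs
            total = 1.0 - np.exp(-r * breaks[-1])
            return -(counts * np.log(cell / total)).sum()               # conditional multinomial

        r_hat = minimize_scalar(negloglik, bounds=(1e-4, 5.0), method="bounded").x
        return 1.0 - np.exp(-r_hat * breaks[-1])

    # Hypothetical counts of new detections in 0-3, 3-5 and 5-10 minute intervals
    print(round(removal_detectability([60, 20, 15], breaks=[0, 3, 5, 10]), 2))
    ```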

  16. Correlated evolution between hearing sensitivity and social calls in bats

    PubMed Central

    Bohn, Kirsten M; Moss, Cynthia F; Wilkinson, Gerald S

    2006-01-01

    Echolocating bats are auditory specialists, with exquisite hearing that spans several octaves. In the ultrasonic range, bat audiograms typically show highest sensitivity in the spectral region of their species-specific echolocation calls. Well-developed hearing in the audible range has been commonly attributed to a need to detect sounds produced by prey. However, bat pups often emit isolation calls with low-frequency components that facilitate mother–young reunions. In this study, we examine whether low-frequency hearing in bats exhibits correlated evolution with (i) body size; (ii) high-frequency hearing sensitivity or (iii) pup isolation call frequency. Using published audiograms, we found that low-frequency hearing sensitivity is not dependent on body size but is related to high-frequency hearing. After controlling for high-frequency hearing, we found that low-frequency hearing exhibits correlated evolution with isolation call frequency. We infer that detection and discrimination of isolation calls have favoured enhanced low-frequency hearing because accurate parental investment is critical: bats have low reproductive rates, non-volant altricial young and must often identify their pups within large crèches. PMID:17148288

  17. On the Inference of Functional Circadian Networks Using Granger Causality

    PubMed Central

    Pourzanjani, Arya; Herzog, Erik D.; Petzold, Linda R.

    2015-01-01

    Being able to infer one-way direct connections in an oscillatory network such as the suprachiasmatic nucleus (SCN) of the mammalian brain using time series data is difficult but crucial to understanding network dynamics. Although techniques have been developed for inferring networks from time series data, there have been no attempts to adapt these techniques to infer directional connections in oscillatory time series while accurately distinguishing between direct and indirect connections. In this paper, an adaptation of Granger Causality, called Adaptive Frequency Granger Causality (AFGC), is proposed that allows for inference of circadian networks and oscillatory networks in general. Additionally, an extension of this method, called LASSO AFGC, is proposed to infer networks with large numbers of cells. The method was validated using simulated data from several different networks. For the smaller networks the method was able to identify all one-way direct connections without identifying connections that were not present. For larger networks of up to twenty cells the method shows excellent performance in identifying true and false connections; this is quantified by an area under the curve (AUC) of 96.88%. We note that this method, like other Granger Causality-based methods, is based on the detection of high frequency signals propagating between cell traces. Thus it requires a relatively high sampling rate and a network that can propagate high frequency signals. PMID:26413748
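
    AFGC and its LASSO extension are not reproduced here; as a minimal illustration of the underlying building block, the sketch below runs a standard Granger causality test between two simulated oscillatory traces using statsmodels. The lag order, noise levels, and simulated delay are illustrative assumptions.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(7)
    t = np.arange(500)
    x = np.sin(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(500)
    y = np.roll(x, 2) + 0.3 * rng.standard_normal(500)      # y lags x by 2 samples

    # column order is [effect, cause]: does the history of x help predict y?
    res = grangercausalitytests(np.column_stack([y, x]), maxlag=4, verbose=False)
    print(min(r[0]["ssr_ftest"][1] for r in res.values()))   # smallest p-value across lags
    ```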

  18. G-index: A new metric to describe dynamic refractive index effects in HPLC absorbance detection.

    PubMed

    Kraiczek, Karsten G; Rozing, Gerard P; Zengerle, Roland

    2018-09-01

    High performance liquid chromatography (HPLC) with a solvent gradient and absorbance detection is one of the most widely used methods in analytical chemistry. The observed absorbance baseline is affected by the changes in the refractive index (RI) of the mobile phase. Near the limit of detection, this complicates peak quantitation. The general aspects of these RI-induced apparent absorbance effects are discussed. Two different detectors with fundamentally different optics and flow cell concepts, a variable-wavelength detector equipped with a conventional flow cell and a diode-array detector equipped with a liquid core waveguide flow cell, are compared with respect to their RI behavior. A simple method to separate static, partly unavoidable, RI effects from dynamic RI effects is presented. It is shown that the dynamic RI behavior of an absorbance detector can be well described using a single, relatively easy-to-determine metric called the G-index. The G-index is typically on the order of a few seconds and its sign depends on the optical flow cell concept. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Methods for Multiplex Template Sampling in Digital PCR Assays

    PubMed Central

    Petriv, Oleh I.; Heyries, Kevin A.; VanInsberghe, Michael; Walker, David; Hansen, Carl L.

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision. PMID:24854517

  20. Methods for multiplex template sampling in digital PCR assays.

    PubMed

    Petriv, Oleh I; Heyries, Kevin A; VanInsberghe, Michael; Walker, David; Hansen, Carl L

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision.

  1. Automatic Recognition of Road Signs

    NASA Astrophysics Data System (ADS)

    Inoue, Yasuo; Kohashi, Yuuichirou; Ishikawa, Naoto; Nakajima, Masato

    2002-11-01

    The increase in traffic accidents is becoming a serious social problem with the recent rapid growth of traffic. In many cases, the driver's carelessness is the primary factor in traffic accidents, and driver assistance systems are in demand to support driver safety. In this research, we propose a new method for automatic detection and recognition of road signs by image processing. The purpose of this research is to prevent accidents caused by driver carelessness and to call attention to the driver when a traffic regulation is violated. High accuracy and efficient sign detection are realized by removing unnecessary information (everything except the road sign) from the image and detecting the road sign using shape features. First, color information that is not used in road signs is removed from the image. Next, edges other than circular and triangular ones are removed to select the sign shape. In the recognition process, a normalized cross correlation operation is carried out on the two-dimensional differentiation pattern of a sign, yielding an accurate and efficient method for detecting the road sign. Moreover, real-time operation in software was realized by keeping the computational cost low while maintaining highly precise sign detection and recognition. Specifically, processing at 0.1 s/frame is possible using a general-purpose PC (CPU: Pentium4 1.7GHz). In-vehicle experiments confirmed that our system runs in real time and that detection and recognition of signs are performed correctly.
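
    A minimal OpenCV sketch of the normalized cross-correlation step described above; synthetic arrays stand in for the differentiation (gradient) patterns the paper correlates, and the sizes are illustrative.

    ```python
    import cv2
    import numpy as np

    rng = np.random.default_rng(9)
    scene = rng.random((240, 320)).astype(np.float32)
    template = scene[60:100, 120:160].copy()           # pretend this crop is the sign template

    # Normalized cross-correlation of the template against every position in the scene.
    result = cv2.matchTemplate(scene, template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    print("best match at", max_loc, "score", round(max_val, 3))   # expect (120, 60), ~1.0
    ```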

  2. Integrated Detection and Prediction of Influenza Activity for Real-Time Surveillance: Algorithm Design

    PubMed Central

    2017-01-01

    Background Influenza is a viral respiratory disease capable of causing epidemics that represent a threat to communities worldwide. The rapidly growing availability of electronic “big data” from diagnostic and prediagnostic sources in health care and public health settings permits the advance of a new generation of methods for local detection and prediction of winter influenza seasons and influenza pandemics. Objective The aim of this study was to present a method for integrated detection and prediction of influenza virus activity in local settings using electronically available surveillance data and to evaluate its performance by retrospective application on authentic data from a Swedish county. Methods An integrated detection and prediction method was formally defined based on a design rationale for influenza detection and prediction methods adapted for local surveillance. The novel method was retrospectively applied on data from the winter influenza season 2008-09 in a Swedish county (population 445,000). Outcome data represented individuals who met a clinical case definition for influenza (based on International Classification of Diseases version 10 [ICD-10] codes) from an electronic health data repository. Information from calls to a telenursing service in the county was used as syndromic data source. Results The novel integrated detection and prediction method is based on nonmechanistic statistical models and is designed for integration in local health information systems. The method is divided into separate modules for detection and prediction of local influenza virus activity. The function of the detection module is to alert for an upcoming period of increased load of influenza cases on local health care (using influenza-diagnosis data), whereas the function of the prediction module is to predict the timing of the activity peak (using syndromic data) and its intensity (using influenza-diagnosis data). For detection modeling, exponential regression was used based on the assumption that the beginning of a winter influenza season has an exponential growth of infected individuals. For prediction modeling, linear regression was applied to 7-day periods at a time in order to find the peak timing, whereas a derivative of a normal distribution density function was used to find the peak intensity. We found that the integrated detection and prediction method detected the 2008-09 winter influenza season on its starting day (optimal timeliness 0 days), whereas the predicted peak was estimated to occur 7 days ahead of the factual peak and the predicted peak intensity was estimated to be 26% lower than the factual intensity (6.3 compared with 8.5 influenza-diagnosis cases/100,000). Conclusions Our detection and prediction method is one of the first integrated methods specifically designed for local application on influenza data electronically available for surveillance. The performance of the method in a retrospective study indicates that further prospective evaluations of the methods are justified. PMID:28619700
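
    A minimal sketch of the detection module's core idea: log-linear (i.e. exponential) regression over a recent window of influenza-diagnosis counts, flagging the season start when the fitted growth rate is clearly positive. The window length, growth-rate threshold, and case counts are illustrative assumptions, not the study's calibration.

    ```python
    import numpy as np

    def exponential_growth_detected(daily_cases, window=7, min_rate=0.05):
        """Fit log(cases) ~ a + b*day over the most recent `window` days and flag
        the start of the influenza season when the growth rate b exceeds
        `min_rate` (an illustrative threshold)."""
        y = np.log(np.asarray(daily_cases[-window:], float) + 1.0)   # +1 avoids log(0)
        x = np.arange(window, dtype=float)
        b, a = np.polyfit(x, y, 1)                                   # slope, intercept
        return b > min_rate

    cases = [2, 1, 3, 2, 4, 6, 9, 13, 20, 31]     # hypothetical influenza-diagnosis counts
    print(exponential_growth_detected(cases))     # True: roughly exponential rise
    ```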

  3. A Machine Learning Ensemble Classifier for Early Prediction of Diabetic Retinopathy.

    PubMed

    S K, Somasundaram; P, Alli

    2017-11-09

    The main complication of diabetes is diabetic retinopathy (DR), a retinal vascular disease that can lead to blindness. Regular screening for early DR detection is a labor- and resource-intensive task, so automatic detection of DR by computational techniques is an attractive solution. An automatic method is more reliable for determining the presence of an abnormality in fundus images (FI), but the classification process is often performed poorly. Recently, a few research works have been designed for analyzing texture discrimination capacity in FI to distinguish healthy images. However, the feature extraction (FE) process was not performed well, due to the high dimensionality. Therefore, to identify retinal features for DR disease diagnosis and early detection, a machine learning and ensemble classification method called the Machine Learning Bagging Ensemble Classifier (ML-BEC) is designed. The ML-BEC method comprises two stages. The first stage extracts candidate objects from retinal images (RI). The candidate objects, or features, for DR disease diagnosis include blood vessels, the optic nerve, neural tissue, the neuroretinal rim, and optic disc size, thickness, and variance. These features are initially extracted by applying a machine learning technique called t-distributed Stochastic Neighbor Embedding (t-SNE). t-SNE generates a probability distribution across the high-dimensional images in which the images are separated into similar and dissimilar pairs, and then describes a similar probability distribution across the points in a low-dimensional map. This minimizes the Kullback-Leibler divergence between the two distributions with respect to the locations of the points on the map. The second stage applies ensemble classifiers to the extracted features to provide accurate analysis of digital FI using machine learning. In this stage, an automatic DR screening system using a Bagging Ensemble Classifier (BEC) is investigated. With the help of the voting process in ML-BEC, bagging minimizes the error due to the variance of the base classifier. With the publicly available retinal image databases, our classifier is trained with 25% of the RI. Results show that the ensemble classifier can achieve better classification accuracy (CA) than single classification models. Empirical experiments suggest that the machine learning-based ensemble classifier is efficient for further reducing DR classification time (CT).
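
    A minimal scikit-learn sketch of the two-stage idea (t-SNE feature embedding followed by a bagging ensemble) is shown below; the feature matrix, labels, base classifier, and parameters are placeholders, and because t-SNE has no out-of-sample transform the embedding is computed on all samples before the split purely for illustration:

        import numpy as np
        from sklearn.manifold import TSNE
        from sklearn.ensemble import BaggingClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split

        # Stand-in feature matrix: one row of measured retinal features per image.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 50))          # 200 images, 50 raw features (placeholder)
        y = rng.integers(0, 2, size=200)        # 0 = healthy, 1 = DR (placeholder labels)

        # Stage 1: t-SNE embedding of the high-dimensional features.
        X_embedded = TSNE(n_components=2, random_state=0).fit_transform(X)

        # Stage 2: bagging ensemble of base classifiers on the embedded features,
        # trained on 25% of the images, as in the abstract.
        X_tr, X_te, y_tr, y_te = train_test_split(X_embedded, y, train_size=0.25, random_state=0)
        clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25, random_state=0)
        clf.fit(X_tr, y_tr)
        accuracy = clf.score(X_te, y_te)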

  4. Prognostics Methodology for Complex Systems

    NASA Technical Reports Server (NTRS)

    Gulati, Sandeep; Mackey, Ryan

    2003-01-01

    An automatic method to schedule maintenance and repair of complex systems is produced based on a computational structure called the Informed Maintenance Grid (IMG). This method provides solutions to the two fundamental problems in autonomic logistics: (1) unambiguous detection of deterioration or impending loss of function and (2) determination of the time remaining to perform maintenance or other corrective action based upon information from the system. The IMG provides a health determination over the medium-to-long-term operation of the system, from one or more days to years of study. The IMG is especially applicable to spacecraft, to both piloted and autonomous aircraft, and to industrial control processes.

  5. Simple Backdoors on RSA Modulus by Using RSA Vulnerability

    NASA Astrophysics Data System (ADS)

    Sun, Hung-Min; Wu, Mu-En; Yang, Cheng-Ta

    This investigation proposes two methods for embedding backdoors in the RSA modulus N=pq rather than in the public exponent e. This strategy not only permits manufacturers to embed backdoors in an RSA system, but also allows users to choose any desired public exponent, such as e=2^16+1, to ensure efficient encryption. This work utilizes the lattice attack and the exhaustive attack to embed backdoors in the two proposed methods, called RSASBLT and RSASBES, respectively. Both approaches involve straightforward steps, making their running time roughly the same as normal RSA key-generation time, implying that no one can detect the backdoor by observing timing differences.

  6. Application of learning to rank to protein remote homology detection.

    PubMed

    Liu, Bin; Chen, Junjie; Wang, Xiaolong

    2015-11-01

    Protein remote homology detection is one of the fundamental problems in computational biology, aiming to find protein sequences in a database of known structures that are evolutionarily related to a given query protein. Some computational methods treat this problem as a ranking problem and achieve state-of-the-art performance, such as PSI-BLAST, HHblits and ProtEmbed. This raises the possibility of combining these methods to improve the predictive performance. In this regard, we propose a new computational method called ProtDec-LTR for protein remote homology detection, which is able to combine various ranking methods in a supervised manner using the Learning to Rank (LTR) algorithm derived from natural language processing. Experimental results on a widely used benchmark dataset showed that ProtDec-LTR can achieve an ROC1 score of 0.8442 and an ROC50 score of 0.9023, outperforming all the individual predictors and some state-of-the-art methods. These results indicate that it is correct to treat protein remote homology detection as a ranking problem, and that predictive performance improvement can be achieved by combining different ranking approaches in a supervised manner using LTR. For users' convenience, the software tools of the three basic ranking predictors and the Learning to Rank algorithm were provided at http://bioinformatics.hitsz.edu.cn/ProtDec-LTR/home/. Contact: bliu@insun.hit.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. A Multi-Index Integrated Change detection method for updating the National Land Cover Database

    USGS Publications Warehouse

    Jin, Suming; Yang, Limin; Xian, George Z.; Danielson, Patrick; Homer, Collin G.

    2010-01-01

    Land cover change is typically captured by comparing two or more dates of imagery and associating spectral change with true thematic change. A new change detection method, Multi-Index Integrated Change (MIIC), has been developed to capture a full range of land cover disturbance patterns for updating the National Land Cover Database (NLCD). Specific indices typically specialize in identifying only certain types of disturbances; for example, the Normalized Burn Ratio (NBR) has been widely used for monitoring fire disturbance. Recognizing the potential complementary nature of multiple indices, we integrated four indices into one model to more accurately detect true change between two NLCD time periods. The four indices are NBR, Normalized Difference Vegetation Index (NDVI), Change Vector (CV), and a newly developed index called the Relative Change Vector (RCV). The model is designed to provide both change location and change direction (e.g. biomass increase or biomass decrease). The integrated change model has been tested on five image pairs from different regions exhibiting a variety of disturbance types. Compared with a simple change vector method, MIIC can better capture the desired change without introducing additional commission errors. The model is particularly accurate at detecting forest disturbances, such as forest harvest, forest fire, and forest regeneration. Agreement between the initial change map areas derived from MIIC and the retained final land cover type change areas will be showcased from the pilot test sites.
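
    As an illustrative sketch (under assumed band names and thresholds, not the published MIIC model), the following Python/NumPy snippet computes NDVI and NBR differences and a change-vector magnitude from two-date imagery and combines them into a simple change mask; the RCV index is defined in the paper and not reproduced here:

        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red + 1e-9)

        def nbr(nir, swir2):
            return (nir - swir2) / (nir + swir2 + 1e-9)

        def change_vector_magnitude(bands_t1, bands_t2):
            """Euclidean magnitude of the spectral change vector across all bands."""
            diff = np.stack(bands_t2) - np.stack(bands_t1)
            return np.sqrt((diff ** 2).sum(axis=0))

        # Placeholder two-date Landsat-like bands (red, NIR, SWIR2) as 2-D arrays.
        shape = (100, 100)
        rng = np.random.default_rng(1)
        red1, nir1, swir1 = (rng.random(shape) for _ in range(3))
        red2, nir2, swir2 = (rng.random(shape) for _ in range(3))

        d_ndvi = ndvi(nir2, red2) - ndvi(nir1, red1)   # vegetation change
        d_nbr = nbr(nir2, swir2) - nbr(nir1, swir1)    # burn/disturbance change
        cv_mag = change_vector_magnitude([red1, nir1, swir1], [red2, nir2, swir2])

        # Illustrative integration: flag change only where several indices agree.
        change_mask = (np.abs(d_ndvi) > 0.2) & (cv_mag > 0.3)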

  8. Application of immobilized synthetic anti-lipopolysaccharide peptides for the isolation and detection of bacteria.

    PubMed

    Sandetskaya, N; Engelmann, B; Brandenburg, K; Kuhlmeier, D

    2015-08-01

    The molecular detection of microorganisms in liquid samples generally requires their enrichment or isolation. The aim of our study was to evaluate the capture and pre-concentration of bacteria by immobilized particular cationic antimicrobial peptides, called synthetic anti-lipopolysaccharide peptides (SALP). For the proof-of-concept and screening of different SALP, the peptides were covalently immobilized on glass slides, and the binding of bacteria was confirmed by microscopic examination of the slides or their scanning, in case of fluorescent bacterial cells. The most efficient SALP was further tethered to magnetic beads. SALP beads were used for the magnetic capture of Escherichia coli in liquid samples. The efficiency of this strategy was evaluated using polymerase chain reaction (PCR). Covalently immobilized SALP were capable of capturing bacteria in liquid samples. However, PCR was hampered by the unspecific binding of DNA to the positively charged peptide. We developed a method for DNA recovery by the enzymatic digestion of the peptide, which allowed for a successful PCR, though the method had its own adverse impact on the detection and, thus, did not allow for the reliable quantitative analysis of the pathogen enrichment. Immobilized SALP can be used as capture molecules for bacteria in liquid samples and can be recommended for the design of the assays or decontamination of the fluids. For the accurate subsequent detection of bacteria, DNA-independent methods should be used.

  9. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2002-01-01

    Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
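
    A simplified sketch of a removal-model likelihood is given below, assuming a single constant per-minute detection (singing) rate and the 3-, 2-, and 5-minute intervals mentioned above; the counts are made-up example data and the parameterization is reduced relative to the published estimator:

        import numpy as np
        from scipy.optimize import minimize_scalar

        intervals = np.array([3.0, 2.0, 5.0])   # minutes per counting interval
        counts = np.array([40, 12, 10])         # birds first detected in each interval (example data)

        def neg_log_lik(log_rate):
            """Negative log-likelihood of the interval counts, conditional on detection,
            under a constant per-minute detection rate."""
            r = np.exp(log_rate)
            edges = np.concatenate(([0.0], np.cumsum(intervals)))
            cell = np.exp(-r * edges[:-1]) - np.exp(-r * edges[1:])  # P(first detection in interval j)
            p_det = 1.0 - np.exp(-r * edges[-1])                     # P(detected at all)
            return -np.sum(counts * np.log(cell / p_det))

        fit = minimize_scalar(neg_log_lik, bounds=(-5.0, 2.0), method="bounded")
        rate = np.exp(fit.x)
        p_detect = 1.0 - np.exp(-rate * intervals.sum())  # estimated overall detection probability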

  10. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.

  11. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    PubMed Central

    Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250
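
    The windowing-plus-coherence idea behind selected correlation analysis can be sketched as follows (a rough illustration, not the authors' tool set); the sampling rate, window length, and coherence threshold are assumptions, and the SCP index itself follows the authors' definition, which is not reproduced here:

        import numpy as np
        from scipy.signal import coherence

        fs = 1.0          # sampling rate in Hz (assumption: 1 sample/s trend data)
        win_len = 600     # samples per analysis window (assumption)
        coh_thresh = 0.8  # coherence threshold for a "selected" correlation (assumption)

        def selected_correlations(abp, icp):
            """Slide a window over simultaneously recorded ABP and ICP, keep windows with
            high spectral coherence, and label them by the sign of the correlation."""
            labels = []
            for start in range(0, len(abp) - win_len + 1, win_len):
                a = abp[start:start + win_len]
                i = icp[start:start + win_len]
                f, coh = coherence(a, i, fs=fs, nperseg=win_len // 4)
                if coh.max() >= coh_thresh:                 # strongly coupled window
                    sign = np.sign(np.corrcoef(a, i)[0, 1])
                    labels.append((start, "positive" if sign > 0 else "negative"))
            return labels

        # Synthetic example: ICP partly driven by ABP.
        rng = np.random.default_rng(8)
        abp = rng.normal(size=3600)
        icp = 0.5 * abp + rng.normal(size=3600)
        flagged_windows = selected_correlations(abp, icp)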

  12. Early outbreak detection by linking health advice line calls to water distribution areas retrospectively demonstrated in a large waterborne outbreak of cryptosporidiosis in Sweden.

    PubMed

    Bjelkmar, Pär; Hansen, Anette; Schönning, Caroline; Bergström, Jakob; Löfdahl, Margareta; Lebbad, Marianne; Wallensten, Anders; Allestam, Görel; Stenmark, Stephan; Lindh, Johan

    2017-04-18

    In the winter and spring of 2011 a large outbreak of cryptosporidiosis occurred in Skellefteå municipality, Sweden. This study summarizes the outbreak investigation in terms of outbreak size, duration, clinical characteristics, possible source(s) and the potential for earlier detection using calls to a health advice line. The investigation included two epidemiological questionnaires and microbial analysis of samples from patients, water and other environmental sources. In addition, a retrospective study based on phone calls to a health advice line was performed by comparing patterns of phone calls between different water distribution areas. Our analyses showed that approximately 18,500 individuals were affected by a waterborne outbreak of cryptosporidiosis in Skellefteå in 2011. This makes it the second largest outbreak of cryptosporidiosis in Europe to date. Cryptosporidium hominis oocysts of subtype IbA10G2 were found in patient and sewage samples, but not in raw water or in drinking water, and the initial contamination source could not be determined. The outbreak went unnoticed to authorities for several months. The analysis of the calls to the health advice line provides strong indications early in the outbreak that it was linked to a particular water treatment plant. We conclude that an earlier detection of the outbreak by linking calls to a health advice line to water distribution areas could have limited the outbreak substantially.

  13. Whole exome sequencing is an efficient, sensitive and specific method of mutation detection in osteogenesis imperfecta and Marfan syndrome

    PubMed Central

    McInerney-Leo, Aideen M; Marshall, Mhairi S; Gardiner, Brooke; Coucke, Paul J; Van Laer, Lut; Loeys, Bart L; Summers, Kim M; Symoens, Sofie; West, Jennifer A; West, Malcolm J; Paul Wordsworth, B; Zankl, Andreas; Leo, Paul J; Brown, Matthew A; Duncan, Emma L

    2013-01-01

    Osteogenesis imperfecta (OI) and Marfan syndrome (MFS) are common Mendelian disorders. Both conditions are usually diagnosed clinically, as genetic testing is expensive due to the size and number of potentially causative genes and mutations. However, genetic testing may benefit patients, at-risk family members and individuals with borderline phenotypes, as well as improving genetic counseling and allowing critical differential diagnoses. We assessed whether whole exome sequencing (WES) is a sensitive method for mutation detection in OI and MFS. WES was performed on genomic DNA from 13 participants with OI and 10 participants with MFS who had known mutations, with exome capture followed by massive parallel sequencing of multiplexed samples. Single nucleotide polymorphisms (SNPs) and small indels were called using Genome Analysis Toolkit (GATK) and annotated with ANNOVAR. CREST, exomeCopy and exomeDepth were used for large deletion detection. Results were compared with the previous data. Specificity was calculated by screening WES data from a control population of 487 individuals for mutations in COL1A1, COL1A2 and FBN1. The target capture of five exome capture platforms was compared. All 13 mutations in the OI cohort and 9/10 in the MFS cohort were detected (sensitivity=95.6%) including non-synonymous SNPs, small indels (<10 bp), and a large UTR5/exon 1 deletion. One mutation was not detected by GATK due to strand bias. Specificity was 99.5%. Capture platforms and analysis programs differed considerably in their ability to detect mutations. Consumable costs for WES were low. WES is an efficient, sensitive, specific and cost-effective method for mutation detection in patients with OI and MFS. Careful selection of platform and analysis programs is necessary to maximize success. PMID:24501682

  14. Cloud Detection of Optical Satellite Images Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lee, Kuan-Yi; Lin, Chao-Hung

    2016-06-01

    Cloud cover is generally present in optical remote-sensing images, which limits the usage of acquired images and increases the difficulty of data analysis, such as image compositing, correction of atmospheric effects, calculation of vegetation indices, land cover classification, and land cover change detection. In previous studies, thresholding is a common and useful method in cloud detection. However, a selected threshold is usually suitable for certain cases or local study areas, and it may fail in other cases. In other words, thresholding-based methods are data-sensitive. Besides, there are many exceptions to handle, and the environment changes dynamically; using the same threshold value on various data is not effective. In this study, a threshold-free method based on Support Vector Machine (SVM) is proposed, which can avoid the abovementioned problems. A statistical model is adopted to detect clouds instead of a subjective thresholding-based method, which is the main idea of this study. The features used in a classifier are the key to a successful classification. As a result, the Automatic Cloud Cover Assessment (ACCA) algorithm, which is based on physical characteristics of clouds, is used to distinguish clouds from other objects. Similarly, the algorithm called Fmask (Zhu et al., 2012) uses many thresholds and criteria to screen clouds, cloud shadows, and snow. Therefore, the feature extraction algorithm is based on the ACCA algorithm and Fmask. Spatial and temporal information are also important for satellite images. Consequently, the co-occurrence matrix and temporal variance with uniformity of the major principal axis are used in the proposed method. We aim to classify images into three groups: cloud, non-cloud, and others. In experiments, images acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) and images containing agricultural landscapes, snow areas, and islands are tested. Experimental results demonstrate that the detection accuracy of the proposed method is better than that of related methods.
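
    A minimal scikit-learn sketch of the threshold-free, per-pixel SVM classification described above follows; the eight-dimensional feature vectors stand in for the ACCA/Fmask-derived spectral, texture, and temporal features, and all data and parameters are placeholders:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Placeholder per-pixel feature matrix (spectral bands, texture, temporal variance).
        rng = np.random.default_rng(2)
        X_train = rng.normal(size=(5000, 8))
        y_train = rng.integers(0, 3, size=5000)   # 0 = cloud, 1 = non-cloud, 2 = other

        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        model.fit(X_train, y_train)

        # Classify all pixels of a new scene, then reshape back to the image grid.
        h, w = 120, 160
        X_scene = rng.normal(size=(h * w, 8))
        cloud_map = model.predict(X_scene).reshape(h, w)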

  15. Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic

    PubMed Central

    YOKOYAMA, Jun’ichi

    2014-01-01

    After reviewing the standard hypothesis test and the matched filter technique to identify gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, where the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student’s t-distribution, which has larger tails than a Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter method works well for the highly non-Gaussian case. PMID:25504231

  16. Integrated Detection and Prediction of Influenza Activity for Real-Time Surveillance: Algorithm Design.

    PubMed

    Spreco, Armin; Eriksson, Olle; Dahlström, Örjan; Cowling, Benjamin John; Timpka, Toomas

    2017-06-15

    Influenza is a viral respiratory disease capable of causing epidemics that represent a threat to communities worldwide. The rapidly growing availability of electronic "big data" from diagnostic and prediagnostic sources in health care and public health settings permits the advance of a new generation of methods for local detection and prediction of winter influenza seasons and influenza pandemics. The aim of this study was to present a method for integrated detection and prediction of influenza virus activity in local settings using electronically available surveillance data and to evaluate its performance by retrospective application on authentic data from a Swedish county. An integrated detection and prediction method was formally defined based on a design rationale for influenza detection and prediction methods adapted for local surveillance. The novel method was retrospectively applied on data from the winter influenza season 2008-09 in a Swedish county (population 445,000). Outcome data represented individuals who met a clinical case definition for influenza (based on International Classification of Diseases version 10 [ICD-10] codes) from an electronic health data repository. Information from calls to a telenursing service in the county was used as a syndromic data source. The novel integrated detection and prediction method is based on nonmechanistic statistical models and is designed for integration in local health information systems. The method is divided into separate modules for detection and prediction of local influenza virus activity. The function of the detection module is to alert for an upcoming period of increased load of influenza cases on local health care (using influenza-diagnosis data), whereas the function of the prediction module is to predict the timing of the activity peak (using syndromic data) and its intensity (using influenza-diagnosis data). For detection modeling, exponential regression was used based on the assumption that the beginning of a winter influenza season has an exponential growth of infected individuals. For prediction modeling, linear regression was applied on 7-day periods at a time in order to find the peak timing, whereas a derivative of a normal distribution density function was used to find the peak intensity. We found that the integrated detection and prediction method detected the 2008-09 winter influenza season on its starting day (optimal timeliness 0 days), whereas the predicted peak was estimated to occur 7 days ahead of the factual peak and the predicted peak intensity was estimated to be 26% lower than the factual intensity (6.3 compared with 8.5 influenza-diagnosis cases/100,000). Our detection and prediction method is one of the first integrated methods specifically designed for local application on influenza data electronically available for surveillance. The performance of the method in a retrospective study indicates that further prospective evaluations of the methods are justified. ©Armin Spreco, Olle Eriksson, Örjan Dahlström, Benjamin John Cowling, Toomas Timpka. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.06.2017.

  17. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-11-05

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the regions where the events occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.

  18. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the regions where the events occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191
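
    A minimal sketch of a window-based minimum distance classifier of the kind described above follows; the class mean vectors, number of sensor links, and window size are placeholders rather than values from the study:

        import numpy as np

        # Mean received-signal-strength vectors (one value per sensor link) learned for
        # each geo-event class; the numbers here are placeholders.
        class_means = {
            "water_intrusion": np.array([-62.0, -70.0, -58.0]),
            "density_change":  np.array([-55.0, -60.0, -65.0]),
            "relative_motion": np.array([-48.0, -72.0, -66.0]),
        }

        def classify_window(window_rssi):
            """Assign the event class whose mean vector is closest (Euclidean distance)
            to the average signal strength observed in the detection window."""
            obs = np.mean(window_rssi, axis=0)
            return min(class_means, key=lambda c: np.linalg.norm(obs - class_means[c]))

        # Example: a classification window of 10 RSSI measurements from 3 links.
        window = np.random.default_rng(3).normal(loc=[-60, -69, -59], scale=2.0, size=(10, 3))
        event = classify_window(window)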

  19. Evaluation of the QuEChERS method for the extraction of pharmaceuticals and personal care products from drinking-water treatment sludge with determination by UPLC-ESI-MS/MS.

    PubMed

    Cerqueira, Maristela B R; Guilherme, Juliana R; Caldas, Sergiane S; Martins, Manoel L; Zanella, Renato; Primel, Ednei G

    2014-07-01

    A modified version of the QuEChERS method has been evaluated for the determination of 21 pharmaceuticals and 6 personal care products (PPCPs) in drinking-water sludge samples by employing ultra high liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). The performance of the method was evaluated through linearity, recovery, precision (intra-day), method detection and quantification limits (MDL and MQL) and matrix effect. The calibration curves prepared in acetonitrile and in the matrix extract showed a correlation coefficient ranging from 0.98 to 0.99. MQLs values were on the ng g(-1) order of magnitude for most compounds. Recoveries between 50% and 93% were reached with RSDs lower than 10% for most compounds. Matrix effect was almost absent with values lower than 16% for 93% of the compounds. By coupling a quick and simple extraction called QuEChERS with the UPLC-MS/MS analysis, a method that is both selective and sensitive was obtained. This methodology was successfully applied to real samples and caffeine and benzophenone-3 were detected in ng g(-1) levels. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Acoustic and Visual Monitoring for Marine Mammals at the Southern California Off-Shore Range (SCORE)

    DTIC Science & Technology

    2005-02-28

    [Reference excerpts from the record] Stafford, K. M., Fox, C. G., and Clark, D. S. (1998). Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean. Journal of the Acoustical Society of America; a related Gulf of Alaska study (Marine Mammal Science 19: 682-693) is also cited.

  1. Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.

    PubMed

    Chmelnitsky, Elly G; Ferguson, Steven H

    2012-06-01

    Classification of animal vocalizations is often done by a human observer using aural and visual analysis, but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008 in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type, but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis, and results were similar to the whistle contours described. This study provided the first description of beluga calls in Hudson Bay, and using two methods provides more robust interpretations and an assessment of appropriate methods for future studies.
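
    A rough SciPy sketch of hierarchical clustering on standardized call measurements, in the spirit of the analysis described above, is shown below; the feature matrix is synthetic, and the choice of Ward linkage and of twelve clusters is illustrative:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.stats import zscore

        # Placeholder matrix: one row per whistle, columns are measured call
        # characteristics (e.g., start/end frequency, duration, inflection points).
        rng = np.random.default_rng(4)
        calls = rng.normal(size=(200, 6))

        Z = linkage(zscore(calls, axis=0), method="ward")   # agglomerative clustering
        groups = fcluster(Z, t=12, criterion="maxclust")    # cut the tree into 12 groups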

  2. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.

  3. Application of high-performance liquid chromatography with ultraviolet diode array detection and refractive index detection to the determination of class composition and to the analysis of gasoline.

    PubMed

    Kamiński, Marian; Kartanowicz, Rafał; Przyjazny, Andrzej

    2004-03-12

    A method of effective application of normal-phase high-performance liquid chromatography (NP-HPLC) with ultraviolet diode array detection (DAD) and refractive index detection (RID) for the determination of class composition of gasoline and its components, i.e. for the determination of content of alkenes, aromatic and saturated hydrocarbons in gasoline meeting modern quality standards, has been developed. An aminopropyl-bonded silica stationary phase was used along with n-hexane or n-heptane as the mobile phase. A DAD signal integrated over the 207-240 nm range was used to determine alkenes. This eliminates the necessity of separating alkenes from saturates, because the latter do not absorb UV radiation above 200 nm. The content of aromatic hydrocarbons is determined by means of a refractive index detector. Calibration was based on hydrocarbon type composition determined by the fluorescent indicator adsorption method, ASTM D1319. The results obtained by the developed method were found to be consistent with those obtained by fluorescent indicator adsorption or by a multidimensional GC method (PIONA) (ASTM D5443). The method can be applied to gasoline meeting recent quality standards, irrespective of refining technology used in the production of gasoline components, including gasoline with various contents of oxygenates. The developed method cannot be used to determine the hydrocarbon type composition of gasoline that contains as a component the so-called pyrocondensate, i.e. the fraction with a boiling point up to 220 degrees C, obtained through thermal pyrolysis of distillation residues of crude oil or coal and, consequently, does not meet the quality standards. The paper includes the procedure for identification of this type of gasoline.

  4. An improved method to detect correct protein folds using partial clustering.

    PubMed

    Zhou, Jianjun; Wishart, David S

    2013-01-16

    Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either C(α) RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance.

  5. An improved method to detect correct protein folds using partial clustering

    PubMed Central

    2013-01-01

    Background Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient “partial“ clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. Results We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either Cα RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. Conclusions The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance. PMID:23323835

  6. New target and detection methods: active detectors

    NASA Astrophysics Data System (ADS)

    Mittig, W.; Savajols, H.; Demonchy, C. E.; Giot, L.; Roussel-Chomaz, P.; Wang, H.; Ter-Akopian, G.; Fomichev, A.; Golovkov, M. S.; Stepansov, S.; Wolski, R.; Alamanos, N.; Drouart, A.; Gillibert, A.; Lapoux, V.; Pollacco, E.

    2003-07-01

    The study of nuclei far from stability interacting with simple target nuclei, such as protons, deuterons, 3He and 4He, implies the use of inverse kinematics. The very special kinematics, together with the low intensities of the beams, calls for special techniques. In July 2002 we tested a new detector in which the detector gas is the target. This allows in principle a 4π solid angle of detection and a large effective target thickness without loss of resolution. The detector developed, called Maya, used isobutane C4H10 as gas in the present tests, and other gases are possible. The multiplexed electronics of more than 1000 channels allows the reconstruction of the events occurring between the incoming particle and the detector gas atoms in 3D. Here we were interested in the elastic scattering of 8He on protons for the study of the isobaric analogue states (IAS) of 9He. The beam, in this case, is stopped in the detector. The resonance energy is determined by the place of interaction and the energy of the recoiling proton. The design of the detector is shown, and some preliminary results are discussed.

  7. Method and apparatus for detecting halogenated hydrocarbons

    DOEpatents

    Monagle, Matthew; Coogan, John J.

    1997-01-01

    A halogenated hydrocarbon (HHC) detector is formed from a silent discharge (also called a dielectric barrier discharge) plasma generator. A silent discharge plasma device receives a gas sample that may contain one or more HHCs and produces free radicals and excited electrons for oxidizing the HHCs in the gas sample to produce water, carbon dioxide, and an acid including halogens in the HHCs. A detector is used to sensitively detect the presence of the acid. A conductivity cell detector combines the oxidation products with a solvent where dissociation of the acid increases the conductivity of the solvent. The conductivity cell output signal is then functionally related to the presence of HHCs in the gas sample. Other detectors include electrochemical cells, infrared spectrometers, and negative ion mobility spectrometers.

  8. Environment Monitor

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Viking landers touched down on Mars equipped with a variety of systems to conduct automated research, each carrying a compact but highly sophisticated instrument for analyzing Martian soil and atmosphere. The instrument, called a Gas Chromatograph/Mass Spectrometer (GC/MS), had to be small, lightweight, shock resistant, highly automated and extremely sensitive, yet require minimal electrical power. Viking Instruments Corporation commercialized this technology and targeted environmental monitoring, especially toxic and hazardous waste site monitoring, as its primary market. Waste sites often contain chemicals in complex mixtures, and the conventional method of site characterization, taking samples on-site and sending them to a laboratory for analysis, is time consuming and expensive. Other terrestrial applications are explosive detection in airports, drug detection, industrial air monitoring, medical metabolic monitoring and, for the military, detection of chemical warfare agents.

  9. Two-colour dip spectroscopy of jet-cooled molecules

    NASA Astrophysics Data System (ADS)

    Ito, Mitsuo

    In optical-optical double resonance spectroscopy, the resonance transition from an intermediate state to a final state can be detected by a dip of the signal (fluorescence or ion) associated with the intermediate state. This method probing the signal of the intermediate state may be called `two-colour dip spectroscopy'. Various kinds of two-colour dip spectroscopy such as two-colour fluorescence/ion dip spectroscopy, two-colour ionization dip spectroscopy employing stimulated emission, population labelling spectroscopy and mass-selected ion dip spectroscopy with dissociation were briefly described, paying special attention to their characteristics in excitation, detection and application. They were extensively and successfully applied to jet-cooled large molecules and provided us with new useful information on the energy and dynamics of excited molecules.

  10. Multi-fault clustering and diagnosis of gear system mined by spectrum entropy clustering based on higher order cumulants

    NASA Astrophysics Data System (ADS)

    Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei

    2013-02-01

    Higher order cumulants (HOC) is a new kind of modern signal analysis of theory and technology. Spectrum entropy clustering (SEC) is a data mining method of statistics, extracting useful characteristics from a mass of nonlinear and non-stationary data. Following a discussion on the characteristics of HOC theory and SEC method in this paper, the study of signal processing techniques and the unique merits of nonlinear coupling characteristic analysis in processing random and non-stationary signals are introduced. Also, a new clustering analysis and diagnosis method is proposed for detecting multi-damage on gear by introducing the combination of HOC and SEC into the damage-detection and diagnosis of the gear system. The noise is restrained by HOC and by extracting coupling features and separating the characteristic signal at different speeds and frequency bands. Under such circumstances, the weak signal characteristics in the system are emphasized and the characteristic of multi-fault is extracted. Adopting a data-mining method of SEC conducts an analysis and diagnosis at various running states, such as the speed of 300 r/min, 900 r/min, 1200 r/min, and 1500 r/min of the following six signals: no-fault, short crack-fault in tooth root, long crack-fault in tooth root, short crack-fault in pitch circle, long crack-fault in pitch circle, and wear-fault on tooth. Research shows that this combined method of detection and diagnosis can also identify the degree of damage of some faults. On this basis, the virtual instrument of the gear system which detects damage and diagnoses faults is developed by combining with advantages of MATLAB and VC++, employing component object module technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system possesses functions of collecting and introducing vibration signals of gear, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, detecting and monitoring, etc. Finally, the results of testing and verifying show that the developed system can effectively be used to detect and diagnose faults in an actual operating gear transmission system.

  11. Multi-fault clustering and diagnosis of gear system mined by spectrum entropy clustering based on higher order cumulants.

    PubMed

    Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei

    2013-02-01

    Higher order cumulants (HOC) is a new kind of modern signal analysis of theory and technology. Spectrum entropy clustering (SEC) is a data mining method of statistics, extracting useful characteristics from a mass of nonlinear and non-stationary data. Following a discussion on the characteristics of HOC theory and SEC method in this paper, the study of signal processing techniques and the unique merits of nonlinear coupling characteristic analysis in processing random and non-stationary signals are introduced. Also, a new clustering analysis and diagnosis method is proposed for detecting multi-damage on gear by introducing the combination of HOC and SEC into the damage-detection and diagnosis of the gear system. The noise is restrained by HOC and by extracting coupling features and separating the characteristic signal at different speeds and frequency bands. Under such circumstances, the weak signal characteristics in the system are emphasized and the characteristic of multi-fault is extracted. Adopting a data-mining method of SEC conducts an analysis and diagnosis at various running states, such as the speed of 300 r/min, 900 r/min, 1200 r/min, and 1500 r/min of the following six signals: no-fault, short crack-fault in tooth root, long crack-fault in tooth root, short crack-fault in pitch circle, long crack-fault in pitch circle, and wear-fault on tooth. Research shows that this combined method of detection and diagnosis can also identify the degree of damage of some faults. On this basis, the virtual instrument of the gear system which detects damage and diagnoses faults is developed by combining with advantages of MATLAB and VC++, employing component object module technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system possesses functions of collecting and introducing vibration signals of gear, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, detecting and monitoring, etc. Finally, the results of testing and verifying show that the developed system can effectively be used to detect and diagnose faults in an actual operating gear transmission system.

  12. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors

    PubMed Central

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-01-01

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most of the previously proposed PAD methods for face recognition systems have focused on using handcrafted image features, which are designed by expert knowledge of designers, such as Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding a detection accuracy that is low and varies with the characteristics of presentation attack face images. The deep learning method has been developed in the computer vision research community, which is proven to be suitable for automatically training a feature extractor that can be used to enhance the ability of handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from the images by visible-light camera sensor. Our proposed method uses the convolutional neural network (CNN) method to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images to discriminate the real and presentation attack face images. By combining the two types of image features, we form a new type of image features, called hybrid features, which has stronger discrimination ability than single image features. Finally, we use the support vector machine (SVM) method to classify the image features into real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases. PMID:29495417

  13. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors.

    PubMed

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-02-26

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most of the previously proposed PAD methods for face recognition systems have focused on using handcrafted image features, which are designed by expert knowledge of designers, such as Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding a detection accuracy that is low and varies with the characteristics of presentation attack face images. The deep learning method has been developed in the computer vision research community, which is proven to be suitable for automatically training a feature extractor that can be used to enhance the ability of handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from the images by visible-light camera sensor. Our proposed method uses the convolutional neural network (CNN) method to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images to discriminate the real and presentation attack face images. By combining the two types of image features, we form a new type of image features, called hybrid features, which has stronger discrimination ability than single image features. Finally, we use the support vector machine (SVM) method to classify the image features into real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases.
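
    A hedged sketch of the hybrid-feature idea (deep features concatenated with a handcrafted LBP histogram, classified by an SVM) follows; the deep feature extractor is only a stand-in for a trained CNN, and the images, labels, and LBP parameters are placeholders rather than the authors' configuration:

        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.svm import SVC

        def lbp_histogram(gray_face, P=8, R=1.0):
            """Handcrafted skin-detail feature: histogram of uniform LBP codes."""
            codes = local_binary_pattern(gray_face, P, R, method="uniform")
            n_bins = P + 2                                   # uniform LBP yields P+2 codes
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            return hist

        def deep_features(gray_face):
            """Placeholder for CNN features (e.g., a penultimate-layer embedding from a
            trained network); a seeded random projection keeps the sketch self-contained."""
            rng = np.random.default_rng(abs(hash(gray_face.tobytes())) % (2 ** 32))
            return rng.normal(size=128)

        def hybrid_feature(gray_face):
            return np.concatenate([deep_features(gray_face), lbp_histogram(gray_face)])

        # Placeholder training set of real (0) and presentation-attack (1) face crops.
        rng = np.random.default_rng(5)
        faces = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
        labels = rng.integers(0, 2, size=40)
        X = np.array([hybrid_feature(f) for f in faces])
        clf = SVC(kernel="rbf").fit(X, labels)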

  14. Spatial detection of tv channel logos as outliers from the content

    NASA Astrophysics Data System (ADS)

    Ekin, Ahmet; Braspenning, Ralph

    2006-01-01

    This paper proposes a purely image-based TV channel logo detection algorithm that can detect logos independently of their motion and transparency features. The proposed algorithm can robustly detect any type of logo, such as transparent and animated ones, without requiring any temporal constraints, whereas known methods have to wait for the occurrence of large motion in the scene and assume stationary logos. The algorithm models logo pixels as outliers from the actual scene content, which is represented by multiple 3-D histograms in the YCbCr space. We use four scene histograms corresponding to each of the four corners because the content characteristics change from one image corner to another. A further novelty of the proposed algorithm is that we define the image corners and the areas where we compute the scene histograms by a cinematic technique called the Golden Section Rule that is used by professionals. The robustness of the proposed algorithm is demonstrated over a dataset of representative TV content.
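
    The outlier idea can be sketched as follows (an illustration under assumed bin counts, corner geometry, and probability threshold, not the published algorithm): build a 3-D YCbCr histogram of the scene content adjacent to a corner and flag corner pixels whose colours are rare under it:

        import numpy as np

        def scene_histogram(ycbcr_region, bins=8):
            """Normalized 3-D colour histogram of a scene region in YCbCr."""
            hist, edges = np.histogramdd(ycbcr_region.reshape(-1, 3),
                                         bins=bins, range=[(0, 256)] * 3)
            return hist / hist.sum(), edges

        def logo_candidates(corner, hist, edges, prob_thresh=1e-4):
            """Mark corner pixels whose colour is an outlier under the scene histogram."""
            idx = [np.clip(np.digitize(corner[..., c], edges[c]) - 1, 0, hist.shape[c] - 1)
                   for c in range(3)]
            probs = hist[idx[0], idx[1], idx[2]]
            return probs < prob_thresh   # boolean mask of likely logo pixels

        # Placeholder frame in YCbCr; the top-left corner block is tested against the
        # histogram of the larger scene area surrounding it.
        rng = np.random.default_rng(6)
        frame = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)
        corner = frame[:120, :160]
        scene = frame[:240, :320]
        hist, edges = scene_histogram(scene)
        logo_mask = logo_candidates(corner, hist, edges)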

  15. Entropy Beacon: A Hairpin-Free DNA Amplification Strategy for Efficient Detection of Nucleic Acids

    PubMed Central

    2015-01-01

    Here, we propose an efficient strategy for enzyme- and hairpin-free nucleic acid detection called an entropy beacon (abbreviated as Ebeacon). Different from previously reported DNA hybridization/displacement-based strategies, Ebeacon is driven forward by increases in the entropy of the system, instead of free energy released from new base-pair formation. Ebeacon shows high sensitivity, with a detection limit of 5 pM target DNA in buffer and 50 pM in cellular homogenate. Ebeacon also benefits from the hairpin-free amplification strategy and zero-background, excellent thermostability from 20 °C to 50 °C, as well as good resistance to complex environments. In particular, based on the huge difference between the breathing rate of a single base pair and two adjacent base pairs, Ebeacon also shows high selectivity toward base mutations, such as substitution, insertion, and deletion and, therefore, is an efficient nucleic acid detection method, comparable to most reported enzyme-free strategies. PMID:26505212

  16. Morphological filtering and multiresolution fusion for mammographic microcalcification detection

    NASA Astrophysics Data System (ADS)

    Chen, Lulin; Chen, Chang W.; Parker, Kevin J.

    1997-04-01

    Mammographic images are often of relatively low contrast and poor sharpness with non-stationary background or clutter and are usually corrupted by noise. In this paper, we propose a new method for microcalcification detection using gray scale morphological filtering followed by multiresolution fusion and present a unified general filtering form called the local operating transformation for whitening filtering and adaptive thresholding. The gray scale morphological filters are used to remove all large areas that are considered as non-stationary background or clutter variations, i.e., to prewhiten images. The multiresolution fusion decision is based on matched filter theory. In addition to the normal matched filter, the Laplacian matched filter which is directly related through the wavelet transforms to multiresolution analysis is exploited for microcalcification feature detection. At the multiresolution fusion stage, the region growing techniques are used in each resolution level. The parent-child relations between resolution levels are adopted to make final detection decision. FROC is computed from test on the Nijmegen database.
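
    A minimal scikit-image sketch of the prewhitening step (greyscale white top-hat to remove slowly varying background, followed by a simple adaptive threshold) is given below; the structuring-element radius and threshold factor are assumptions, the multiresolution fusion stage is omitted, and the footprint keyword assumes a recent scikit-image version:

        import numpy as np
        from skimage.morphology import white_tophat, disk

        def prewhiten(mammogram, radius=7):
            """Remove slowly varying background/clutter with a greyscale white top-hat,
            keeping only bright structures smaller than the structuring element."""
            return white_tophat(mammogram, footprint=disk(radius))

        def candidate_calcifications(mammogram, k=4.0):
            """Adaptive threshold on the prewhitened image (mean + k * std)."""
            residual = prewhiten(mammogram)
            thresh = residual.mean() + k * residual.std()
            return residual > thresh

        # Placeholder image standing in for a mammographic region of interest.
        roi = np.random.default_rng(7).integers(0, 4096, size=(256, 256)).astype(np.uint16)
        candidate_mask = candidate_calcifications(roi)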

  17. Infrared Imaging and Characterization of Exoplanets: Can we Detect Earth-Twins on a Budget?

    NASA Technical Reports Server (NTRS)

    Danchi, William

    2010-01-01

    During the past decade considerable progress has been made developing techniques that can be used to detect and characterize Earth twins in the mid-infrared (7-20 microns). The principal technique is called nulling interferometry, and it was invented by Bracewell in the late 1970s. The nulling technique is an interferometric equivalent of an optical coronagraph. At the present time most of the technological hurdles have been overcome for a space mission to be able to begin Phase A early in the next decade, and it is possible to detect and characterize Earth twins on a mid-sized strategic mission budget ($600-800 million). I will review progress on this exciting method of planet detection in the context of recent work from the Exoplanet Community Forum and the US Decadal Survey (Astro2010), including biomarkers, technological progress, mission concepts, the theory of these instruments, and a comparison of the discovery space of this technique with others also under consideration.
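
    The nulling idea can be summarised by the ideal two-element (Bracewell) transmission pattern T(theta) = sin^2(pi * B * theta / lambda): the on-axis star is cancelled while a planet at angular separation theta can be transmitted. The short sketch below evaluates this formula; the baseline and wavelength values are illustrative assumptions, not parameters of any proposed mission.

      import numpy as np

      def nuller_transmission(theta_rad, baseline_m, wavelength_m):
          """Ideal two-element nulling-interferometer transmission vs. off-axis angle."""
          return np.sin(np.pi * baseline_m * theta_rad / wavelength_m) ** 2

      # Example: a 10 m baseline observing at 10 microns.
      arcsec = np.deg2rad(1.0 / 3600.0)
      print(nuller_transmission(0.0, 10.0, 10e-6))           # on-axis star: fully nulled
      print(nuller_transmission(0.1 * arcsec, 10.0, 10e-6))  # planet at 0.1": ~1, transmitted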

  18. Intentional Voice Command Detection for Trigger-Free Speech Interface

    NASA Astrophysics Data System (ADS)

    Obuchi, Yasunari; Sumiyoshi, Takashi

    In this paper we introduce a new framework of audio processing, which is essential to achieve a trigger-free speech interface for home appliances. If the speech interface works continually in real environments, it must extract occasional voice commands and reject everything else. It is extremely important to reduce the number of false alarms because the number of irrelevant inputs is much larger than the number of voice commands even for heavy users of appliances. The framework, called Intentional Voice Command Detection, is based on voice activity detection, but enhanced by various speech/audio processing techniques such as emotion recognition. The effectiveness of the proposed framework is evaluated using a newly-collected large-scale corpus. The advantages of combining various features were tested and confirmed, and the simple LDA-based classifier demonstrated acceptable performance. The effectiveness of various methods of user adaptation is also discussed.
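
    The final classification step lends itself to a small sketch: several per-utterance scores (voice activity, prosody, an emotion-style score) are combined by Linear Discriminant Analysis to separate intentional commands from other audio. The feature names and synthetic data below are illustrative assumptions, not the paper's corpus or feature set.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      # Feature columns: [vad_score, pitch_variation, speaking_rate, emotion_score]
      commands = rng.normal([0.9, 0.6, 0.5, 0.7], 0.1, (200, 4))   # intentional commands
      other = rng.normal([0.4, 0.3, 0.4, 0.3], 0.1, (800, 4))      # everything else
      X = np.vstack([commands, other])
      y = np.array([1] * 200 + [0] * 800)

      clf = LinearDiscriminantAnalysis().fit(X, y)
      utterance = [[0.85, 0.55, 0.50, 0.65]]
      print(clf.predict(utterance), clf.predict_proba(utterance))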

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, K.A.; Neuman, M.C.; Simmonds, D.D.

    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. This activity is reflected in the system audit record, in the system vulnerability posture, and in other evidence found through active testing of the system. During the last several years we have implemented an automatic misuse detection system at Los Alamos. This is the Network Anomaly Detection and Intrusion Reporter (NADIR). We are currently expanding NADIR to include processing of the Cray UNICOS operating system. This new component is called the UNICOS Realtime NADIR, or UNICORN. UNICORN summarizes user activity and system configuration in statistical profiles. It compares these profiles to expert rules that define security policy and improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. The first phase of UNICORN development is nearing completion and will be operational in late 1994.
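
    The profile-plus-expert-rules approach can be pictured with a few lines of code: user activity is summarised into a statistical profile, the profile is checked against simple policy rules, and any hits are reported for follow-up. The field names and thresholds below are illustrative assumptions, not NADIR's or UNICORN's actual rule base.

      from dataclasses import dataclass

      @dataclass
      class UserProfile:
          user: str
          failed_logins: int
          off_hours_sessions: int
          files_read_outside_home: int

      # Hypothetical expert rules encoding a security policy.
      RULES = [
          ("excessive failed logins", lambda p: p.failed_logins > 5),
          ("repeated off-hours use", lambda p: p.off_hours_sessions > 3),
          ("broad file system access", lambda p: p.files_read_outside_home > 100),
      ]

      def audit(profile):
          """Return the names of the policy rules this profile violates."""
          return [name for name, rule in RULES if rule(profile)]

      print(audit(UserProfile("alice", failed_logins=9,
                              off_hours_sessions=1, files_read_outside_home=12)))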

  20. Linguistic Summarization of Video for Fall Detection Using Voxel Person and Fuzzy Logic

    PubMed Central

    Anderson, Derek; Luke, Robert H.; Keller, James M.; Skubic, Marjorie; Rantz, Marilyn; Aud, Myra

    2009-01-01

    In this paper, we present a method for recognizing human activity from linguistic summarizations of temporal fuzzy inference curves that represent the states of a three-dimensional object called voxel person. A hierarchy of fuzzy logic is used, where the output from each level is summarized and fed into the next level. We present a two-level model for fall detection. The first level infers the states of the person in each image. The second level operates on linguistic summarizations of voxel person’s states and performs inference about activity. The rules used for fall detection were designed under the supervision of nurses to ensure that they reflect the manner in which elders perform these activities. The proposed framework is extremely flexible: rules can be modified, added, or removed, allowing for per-resident customization based on knowledge about each resident’s cognitive and physical ability. PMID:20046216
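
    The two-level hierarchy described above can be sketched with simple fuzzy membership functions: level one maps a per-frame voxel-person feature (here, centroid height) to fuzzy states, and level two summarises those states over a short window to decide whether a fall occurred. The membership shapes and thresholds are illustrative assumptions, not the nurse-designed rules from the paper.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular fuzzy membership function with support [a, c] and peak at b."""
          return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

      def frame_states(centroid_height_m):
          """Level 1: fuzzy state memberships of the voxel person for one frame."""
          return {"upright": tri(centroid_height_m, 0.8, 1.3, 2.0),
                  "on_the_ground": tri(centroid_height_m, -0.1, 0.15, 0.5)}

      def fall_detected(height_series, window=10, threshold=0.7):
          """Level 2: fall if the person was upright earlier and is now mostly on the ground."""
          states = [frame_states(h) for h in height_series]
          was_upright = max(s["upright"] for s in states[:-window])
          now_down = float(np.mean([s["on_the_ground"] for s in states[-window:]]))
          return min(was_upright, now_down) > threshold

      heights = [1.4] * 30 + [0.1] * 12       # the person drops to the floor
      print(fall_detected(heights))           # True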
