Sample records for ridge detection algorithm

  1. Ridge-branch-based blood vessel detection algorithm for multimodal retinal images

    NASA Astrophysics Data System (ADS)

    Li, Y.; Hutchings, N.; Knighton, R. W.; Gregori, G.; Lujan, B. J.; Flanagan, J. G.

    2009-02-01

    Automatic detection of retinal blood vessels is important to medical diagnosis and imaging. With the development of imaging technologies, various modalities of retinal images are available. Few currently published algorithms are applicable to multimodal retinal images, and performance on images with pathologies still needs improvement. The purpose of this paper is to propose an automatic Ridge-Branch-Based (RBB) algorithm for detecting blood vessel centerlines and blood vessels in multimodal retinal images (for example, color fundus photographs, fluorescein angiograms, fundus autofluorescence images, SLO fundus images and OCT fundus images). Ridges, which can be considered centerlines of vessel-like patterns, are first extracted. The method uses the connective branching information of image ridges: if ridge pixels are connected, they are more likely to belong to the same class, vessel ridge pixels or non-vessel ridge pixels. Thanks to the good discriminating ability of the designed "Segment-Based Ridge Features", the classifier and its parameters can be easily adapted to multimodal retinal images without ground-truth training. We present thorough experimental results on SLO images, a color fundus photograph database and other multimodal retinal images, as well as comparisons with other published algorithms. The results show that the RBB algorithm achieves good performance.
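
    To make the ridge-extraction step concrete, here is a minimal sketch (not the authors' RBB implementation) that produces vessel-ridge centerline candidates with scikit-image; the filter scales and the Otsu threshold are illustrative assumptions.

    ```python
    # Minimal sketch of the ridge-extraction step described above (not the
    # authors' RBB implementation): enhance vessel-like ridges with a
    # Hessian-based filter, then thin them to centerline (ridge) pixels.
    import numpy as np
    from skimage import io, img_as_float
    from skimage.filters import sato, threshold_otsu
    from skimage.morphology import skeletonize

    def extract_ridge_pixels(path, sigmas=(1, 2, 3, 4)):
        image = img_as_float(io.imread(path, as_gray=True))
        # Vessels are darker than the background in fundus photographs,
        # so look for dark ridges (black_ridges=True).
        ridgeness = sato(image, sigmas=sigmas, black_ridges=True)
        mask = ridgeness > threshold_otsu(ridgeness)   # crude global threshold
        centerlines = skeletonize(mask)                # 1-pixel-wide ridge pixels
        return ridgeness, centerlines

    # Connected groups of centerline pixels ("ridge branches") would then be
    # classified as vessel vs. non-vessel, as the abstract describes.
    ```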

  2. Automated detection of jet contrails using the AVHRR split window

    NASA Technical Reports Server (NTRS)

    Engelstad, M.; Sengupta, S. K.; Lee, T.; Welch, R. M.

    1992-01-01

    This paper investigates the automated detection of jet contrails using data from the Advanced Very High Resolution Radiometer. A preliminary algorithm subtracts the 11.8-micron image from the 10.8-micron image, creating a difference image on which contrails are enhanced. Then a three-stage algorithm searches the difference image for the nearly-straight line segments which characterize contrails. First, the algorithm searches for elevated, linear patterns called 'ridges'. Second, it applies a Hough transform to the detected ridges to locate nearly-straight lines. Third, the algorithm determines which of the nearly-straight lines are likely to be contrails. The paper applies this technique to several test scenes.
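
    A rough sketch of this three-stage idea, assuming two co-registered brightness-temperature arrays for the 10.8- and 11.8-micron channels; the scikit-image ridge filter and thresholds stand in for the paper's own ridge detector and are illustrative only.

    ```python
    # Illustrative sketch of the contrail pipeline described above (not the
    # published algorithm): split-window difference image, crude ridge mask,
    # then a Hough transform to find nearly-straight line segments.
    import numpy as np
    from skimage.filters import sato, threshold_otsu
    from skimage.transform import hough_line, hough_line_peaks

    def find_contrail_lines(bt_108, bt_118):
        # Brightness-temperature difference image: contrails are enhanced here.
        diff = bt_108 - bt_118
        # Stage 1: elevated, linear patterns ("ridges") in the difference image.
        ridges = sato(diff, sigmas=(1, 2), black_ridges=False)
        mask = ridges > threshold_otsu(ridges)
        # Stage 2: Hough transform locates nearly-straight lines through the ridges.
        hspace, angles, dists = hough_line(mask)
        peaks = hough_line_peaks(hspace, angles, dists)
        # Stage 3 (not shown): screen each line for contrail-like length/contrast.
        return list(zip(*peaks))
    ```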

  3. Morphological analysis of dendrites and spines by hybridization of ridge detection with twin support vector machine.

    PubMed

    Wang, Shuihua; Chen, Mengmeng; Li, Yang; Shao, Ying; Zhang, Yudong; Du, Sidan; Wu, Jane

    2016-01-01

    Dendritic spines are described as neuronal protrusions. The morphology of dendrites and dendritic spines is strongly related to their function and plays an important role in understanding brain function. Quantitative analysis of dendrites and dendritic spines is essential to an understanding of the formation and function of the nervous system. However, highly efficient tools for the quantitative analysis of dendrites and dendritic spines are currently lacking. In this paper we propose a novel three-step cascaded algorithm, RTSVM, which is composed of ridge detection as the curvature structure identifier for backbone extraction, boundary location based on differences in density, Hu moments as features, and Twin Support Vector Machine (TSVM) classifiers for spine classification. Our data demonstrate that this newly developed algorithm performs better than other available techniques in terms of detection accuracy and false alarm rate. This algorithm can be used effectively in neuroscience research.

  4. An enhanced structure tensor method for sea ice ridge detection from GF-3 SAR imagery

    NASA Astrophysics Data System (ADS)

    Zhu, T.; Li, F.; Zhang, Y.; Zhang, S.; Spreen, G.; Dierking, W.; Heygster, G.

    2017-12-01

    In SAR imagery, ridges and leads appear as curvilinear features, and the proposed ridge detection method exploits this curvilinear shape: bright curvilinear features are recognized as ridges, while dark curvilinear features are classified as leads. In the HH or HV channel of dual-polarization C-band SAR imagery, however, a bright curvilinear feature may be a false alarm, because frost flowers on young leads can appear as bright pixels associated with changes in surface salinity under calm conditions. Wind-roughened leads also increase backscatter and can be misclassified as ridges [1]. A width constraint is therefore incorporated into the proposed structure tensor method [2], since a shape-based criterion alone is not sufficient for detecting ridges. The ridge detection algorithm is based on the hypothesis that ridges are bright pixels with curvilinear shapes and a width of less than 30 meters. Benefiting from the high (3-meter) spatial resolution of GF-3, we provide an enhanced structure tensor method for detecting significant ridges. Preprocessing procedures, including calibration and incidence angle normalization, are also investigated. Bright pixels respond strongly to bandpass filtering; ridge training samples are delineated from the SAR imagery, and Log-Gabor filter responses are used to construct the structure tensor. From the tensor, the dominant orientation at each ridge pixel is determined by the dominant eigenvector. In post-processing of the structure tensor, an elongated kernel is used to enhance the curvilinear shape of ridges. Since a ridge extends along a particular direction, the ratio of the tensor eigenvalues is used to measure the strength of local anisotropy. A convolution filter applied to the constructed structure tensor is used to model spatial contextual information. Ridge detection results from GF-3 show that the proposed method performs better than a direct thresholding method.
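
    The anisotropy measure at the core of a structure tensor approach can be sketched as follows; this is a generic implementation under assumed smoothing scales, not the enhanced method of the abstract (no Log-Gabor filtering or elongated-kernel post-processing is included).

    ```python
    # Sketch of a structure-tensor anisotropy measure, assuming a calibrated,
    # incidence-angle-normalized SAR backscatter image. Window sizes are
    # illustrative only.
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def structure_tensor_anisotropy(img, grad_sigma=1.0, window_sigma=3.0):
        # Image gradients.
        gx = sobel(gaussian_filter(img, grad_sigma), axis=1)
        gy = sobel(gaussian_filter(img, grad_sigma), axis=0)
        # Local averaging of the gradient outer product -> tensor components.
        jxx = gaussian_filter(gx * gx, window_sigma)
        jxy = gaussian_filter(gx * gy, window_sigma)
        jyy = gaussian_filter(gy * gy, window_sigma)
        # Eigenvalues of the 2x2 tensor at every pixel.
        trace = jxx + jyy
        root = np.sqrt((jxx - jyy) ** 2 + 4.0 * jxy ** 2)
        lam1, lam2 = (trace + root) / 2.0, (trace - root) / 2.0
        # Dominant orientation and eigenvalue-ratio anisotropy (high along ridges).
        orientation = 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)
        anisotropy = (lam1 - lam2) / (lam1 + lam2 + 1e-12)
        return orientation, anisotropy
    ```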

  5. Object-Based Arctic Sea Ice Feature Extraction through High Spatial Resolution Aerial photos

    NASA Astrophysics Data System (ADS)

    Miao, X.; Xie, H.

    2015-12-01

    High-resolution aerial photographs used to detect and classify sea ice features can provide accurate physical parameters to refine, validate, and improve climate models. However, manually delineating sea ice features such as melt ponds, submerged ice, water, ice/snow, and pressure ridges is time-consuming and labor-intensive. An object-based classification algorithm is developed to automatically and efficiently extract sea ice features from aerial photographs taken during the Chinese National Arctic Research Expedition in summer 2010 (CHINARE 2010) in the marginal ice zone (MIZ) near the Alaska coast. The algorithm includes four steps: (1) image segmentation groups neighboring pixels into objects based on the similarity of spectral and textural information; (2) a random forest classifier distinguishes four general classes: water, general submerged ice (GSI, including melt ponds and submerged ice), shadow, and ice/snow; (3) polygon neighbor analysis separates melt ponds from submerged ice based on spatial relationships; and (4) pressure ridge features are extracted from shadow based on local illumination geometry. A producer's accuracy of 90.8% and a user's accuracy of 91.8% are achieved for melt pond detection, and shadow shows a user's accuracy of 88.9% and a producer's accuracy of 91.4%. Finally, pond density, pond fraction, ice floes, mean ice concentration, average ridge height, ridge profile, and ridge frequency are extracted from batch processing of aerial photos, and their uncertainties are estimated.

  6. A ridge tracking algorithm and error estimate for efficient computation of Lagrangian coherent structures.

    PubMed

    Lipinski, Doug; Mohseni, Kamran

    2010-03-01

    A ridge tracking algorithm for the computation and extraction of Lagrangian coherent structures (LCS) is developed. This algorithm takes advantage of the spatial coherence of LCS by tracking the ridges which form LCS to avoid unnecessary computations away from the ridges. We also make use of the temporal coherence of LCS by approximating the time dependent motion of the LCS with passive tracer particles. To justify this approximation, we provide an estimate of the difference between the motion of the LCS and that of tracer particles which begin on the LCS. In addition to the speedup in computational time, the ridge tracking algorithm uses less memory and results in smaller output files than the standard LCS algorithm. Finally, we apply our ridge tracking algorithm to two test cases, an analytically defined double gyre as well as the more complicated example of the numerical simulation of a swimming jellyfish. In our test cases, we find up to a 35 times speedup when compared with the standard LCS algorithm.
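
    For orientation, the following sketch computes a standard full-grid FTLE field for the analytic double gyre mentioned above, i.e., the baseline computation whose cost the ridge tracking algorithm avoids; the grid size, integration time, and integrator are illustrative assumptions, and this is not the authors' ridge-tracking code.

    ```python
    # Baseline (non-tracking) FTLE computation for the analytic double gyre.
    # LCS appear as ridges of this field; a ridge tracker would follow them
    # instead of evaluating the whole grid.
    import numpy as np

    A, EPS, OMEGA = 0.1, 0.25, 2.0 * np.pi / 10.0   # usual test-case values

    def velocity(x, y, t):
        a = EPS * np.sin(OMEGA * t)
        b = 1.0 - 2.0 * EPS * np.sin(OMEGA * t)
        f = a * x ** 2 + b * x
        dfdx = 2.0 * a * x + b
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return u, v

    def ftle(nx=201, ny=101, t0=0.0, T=10.0, steps=200):
        x, y = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
        px, py, dt = x.copy(), y.copy(), T / steps
        for k in range(steps):                      # simple midpoint (RK2) advection
            u1, v1 = velocity(px, py, t0 + k * dt)
            u2, v2 = velocity(px + 0.5 * dt * u1, py + 0.5 * dt * v1,
                              t0 + (k + 0.5) * dt)
            px, py = px + dt * u2, py + dt * v2
        # Flow-map gradient by finite differences, then the largest stretch rate.
        dxdx = np.gradient(px, axis=1) / np.gradient(x, axis=1)
        dxdy = np.gradient(px, axis=0) / np.gradient(y, axis=0)
        dydx = np.gradient(py, axis=1) / np.gradient(x, axis=1)
        dydy = np.gradient(py, axis=0) / np.gradient(y, axis=0)
        c11 = dxdx ** 2 + dydx ** 2
        c12 = dxdx * dxdy + dydx * dydy
        c22 = dxdy ** 2 + dydy ** 2
        lam = 0.5 * (c11 + c22 + np.sqrt((c11 - c22) ** 2 + 4 * c12 ** 2))
        return np.log(np.sqrt(lam)) / abs(T)
    ```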

  7. Automatic vasculature identification in coronary angiograms by adaptive geometrical tracking.

    PubMed

    Xiao, Ruoxiu; Yang, Jian; Goyal, Mahima; Liu, Yue; Wang, Yongtian

    2013-01-01

    Owing to the uneven distribution of contrast agent and the perspective projection of X-ray imaging, the vasculature in an angiographic image has low contrast and is generally superimposed on other organic tissues; it is therefore very difficult to identify the vasculature and quantitatively estimate blood flow directly from angiographic images. In this paper, we propose a fully automatic algorithm named adaptive geometrical vessel tracking (AGVT) for coronary artery identification in X-ray angiograms. Initially, a ridge enhancement (RE) image is obtained using multiscale Hessian information. Then, automatic initialization procedures, including seed point detection and initial direction determination, are performed on the RE image. The extracted ridge points are adaptively adjusted to the geometrical centerline points through diameter estimation. Bifurcations are identified by discriminating the connecting relationships of the tracked ridge points. Finally, all tracked centerlines are merged and smoothed by classifying the connected components of the vascular structures. Synthetic angiographic images and clinical angiograms are used to evaluate the performance of the proposed algorithm. The proposed algorithm is compared with two other vascular tracking techniques in terms of efficiency and accuracy, demonstrating the successful application of the proposed segmentation and extraction scheme to vasculature identification.
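
    A minimal sketch of a multiscale Hessian ridge-enhancement (RE) image in the spirit of the first step, assuming a grayscale angiogram with dark vessels; the eigenvalue combination and scales are illustrative assumptions, not the AGVT formula.

    ```python
    # Minimal sketch of a multiscale Hessian "ridge enhancement" (RE) image.
    import numpy as np
    from skimage.feature import hessian_matrix, hessian_matrix_eigvals

    def ridge_enhancement(image, sigmas=(1, 2, 3, 4)):
        re = np.zeros_like(image, dtype=float)
        for s in sigmas:
            # Second-derivative (Hessian) response at scale s.
            h = hessian_matrix(image, sigma=s)
            l1, l2 = hessian_matrix_eigvals(h)      # l1 >= l2 at every pixel
            # Dark tubular structures (vessels) give a large positive l1 and a
            # small |l2|; keep the strongest response over all scales.
            response = np.where(l1 > 0, l1 - np.abs(l2), 0.0)
            re = np.maximum(re, (s ** 2) * response)  # simple scale normalization
        return re

    # Seed points and initial tracking directions would then be taken from the
    # local maxima and principal directions of this RE image.
    ```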

  8. Enclosure Transform for Interest Point Detection From Speckle Imagery.

    PubMed

    Yongjian Yu; Jue Wang

    2017-03-01

    We present a fast enclosure transform (ET) to localize complex objects of interest in speckle imagery. The approach exploits the spatial confinement of regional features in a sparse image feature representation. Unrelated, broken ridge features surrounding an object are organized collaboratively, giving rise to the enclosureness of the object. Three enclosure likelihood measures are constructed, consisting of the enclosure force, potential energy, and encloser count. In the transform domain, local maxima manifest the locations of objects of interest, for which only the intrinsic dimension is known a priori. The discrete ET algorithm is computationally efficient, on the order of O(MN) for N measuring distances across an image of M ridge pixels, and it involves few, easily set parameters. We demonstrate and assess the performance of ET on the automatic detection of prostate locations in supra-pubic ultrasound images. ET yields superior results in terms of positive detection rate, accuracy and coverage.

  9. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint.

    PubMed

    Gao, Zhi; Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Ramesh, Bharath; Zhai, Ruifang

    2018-05-06

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.
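
    A minimal sketch of denoising with a fixed, pre-learned dictionary, using scikit-learn's SparseCoder with OMP as a stand-in; the patch layout and sparsity level are assumptions, and the paper's ridge dictionary construction is not reproduced.

    ```python
    # Sketch of range-data denoising with a fixed, pre-learned dictionary via
    # sparse coding (sklearn's SparseCoder as a stand-in for the paper's method).
    import numpy as np
    from sklearn.decomposition import SparseCoder

    def denoise_range_patches(noisy_patches, ridge_dictionary, n_nonzero=3):
        """noisy_patches: (n_patches, patch_len) windows cut from a range scanline.
        ridge_dictionary: (n_atoms, patch_len) fixed dictionary, rows L2-normalized."""
        coder = SparseCoder(dictionary=ridge_dictionary,
                            transform_algorithm='omp',
                            transform_n_nonzero_coefs=n_nonzero)
        codes = coder.transform(noisy_patches)       # sparse coefficients
        return codes @ ridge_dictionary              # reconstructed (denoised) patches
    ```

    Because the dictionary is fixed rather than learned per scene, the only per-frame cost is the sparse coding step itself, which is the efficiency argument made in the abstract.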

  10. Automated circumferential construction of first-order aqueous humor outflow pathways using spectral-domain optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Huang, Alex S.; Belghith, Akram; Dastiridou, Anna; Chopra, Vikas; Zangwill, Linda M.; Weinreb, Robert N.

    2017-06-01

    The purpose was to create a three-dimensional (3-D) model of circumferential aqueous humor outflow (AHO) in a living human eye with an automated detection algorithm for Schlemm's canal (SC) and first-order collector channels (CC) applied to spectral-domain optical coherence tomography (SD-OCT). Anterior segment SD-OCT scans from a subject were acquired circumferentially around the limbus. A Bayesian Ridge method was used to approximate the location of the SC on infrared confocal laser scanning ophthalmoscopic images with a cross multiplication tool developed to initiate SC/CC detection automated through a fuzzy hidden Markov Chain approach. Automatic segmentation of SC and initial CC's was manually confirmed by two masked graders. Outflow pathways detected by the segmentation algorithm were reconstructed into a 3-D representation of AHO. Overall, only <1% of images (5114 total B-scans) were ungradable. Automatic segmentation algorithm performed well with SC detection 98.3% of the time and <0.1% false positive detection compared to expert grader consensus. CC was detected 84.2% of the time with 1.4% false positive detection. 3-D representation of AHO pathways demonstrated variably thicker and thinner SC with some clear CC roots. Circumferential (360 deg), automated, and validated AHO detection of angle structures in the living human eye with reconstruction was possible.
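
    To illustrate the Bayesian Ridge step in isolation, here is a minimal scikit-learn sketch; the feature matrix and the per-scan SC position target are hypothetical stand-ins for whatever image-derived quantities the authors actually regress on.

    ```python
    # Minimal sketch of a Bayesian ridge regression step: fit a prior that maps
    # per-B-scan features to an approximate Schlemm's canal location.
    import numpy as np
    from sklearn.linear_model import BayesianRidge

    def fit_sc_location_prior(features, sc_positions):
        """features: (n_scans, n_features); sc_positions: (n_scans,) pixel rows."""
        model = BayesianRidge()
        model.fit(features, sc_positions)
        return model

    # model.predict(new_features) would then seed the fuzzy hidden-Markov-chain
    # segmentation with an approximate SC location for each scan.
    ```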

  11. Leukemia and colon tumor detection based on microarray data classification using momentum backpropagation and genetic algorithm as a feature selection method

    NASA Astrophysics Data System (ADS)

    Wisesty, Untari N.; Warastri, Riris S.; Puspitasari, Shinta Y.

    2018-03-01

    Cancer is one of the major causes of morbidity and mortality worldwide. There is therefore a need for a system that can analyze microarray data derived from a patient's deoxyribonucleic acid (DNA) and identify whether the person has cancer. However, microarray data have thousands of attributes, which makes data processing challenging; this is often referred to as the curse of dimensionality. In this study we therefore built a system capable of detecting whether or not a patient has cancer. The algorithms used are a genetic algorithm for feature selection and a momentum backpropagation neural network for classification, with data taken from the Kent Ridge Bio-medical Dataset. Based on the system testing performed, the system can detect leukemia and colon tumors with a best accuracy of 98.33% for the colon tumor data and 100% for the leukemia data. The genetic algorithm used for feature selection improves system accuracy from 64.52% to 98.33% on the colon tumor data and from 65.28% to 100% on the leukemia data, and the momentum parameter accelerates convergence during neural network training.
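
    A compact sketch of the overall scheme, using scikit-learn's MLPClassifier trained with SGD and momentum as a stand-in for the paper's momentum backpropagation network; the population size, mutation rate, and generation count are illustrative, and running this on a full microarray matrix would be slow.

    ```python
    # Genetic-algorithm feature selection wrapped around a momentum-trained
    # neural network (illustrative sketch, not the paper's implementation).
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    def fitness(mask, X, y):
        if mask.sum() == 0:
            return 0.0
        clf = MLPClassifier(hidden_layer_sizes=(32,), solver='sgd',
                            momentum=0.9, learning_rate_init=0.01, max_iter=300)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

    def ga_select(X, y, pop_size=20, generations=15, p_mut=0.02):
        n = X.shape[1]
        pop = rng.integers(0, 2, size=(pop_size, n))      # binary feature masks
        for _ in range(generations):
            scores = np.array([fitness(ind, X, y) for ind in pop])
            parents = pop[np.argsort(scores)[::-1][:pop_size // 2]]  # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                cut = rng.integers(1, n)
                child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
                flip = rng.random(n) < p_mut                  # bit-flip mutation
                children.append(np.where(flip, 1 - child, child))
            pop = np.vstack([parents, np.array(children)])
        scores = np.array([fitness(ind, X, y) for ind in pop])
        return pop[scores.argmax()].astype(bool)              # best feature mask
    ```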

  12. Automated circumferential construction of first-order aqueous humor outflow pathways using spectral-domain optical coherence tomography.

    PubMed

    Huang, Alex S; Belghith, Akram; Dastiridou, Anna; Chopra, Vikas; Zangwill, Linda M; Weinreb, Robert N

    2017-06-01

    The purpose was to create a three-dimensional (3-D) model of circumferential aqueous humor outflow (AHO) in a living human eye with an automated detection algorithm for Schlemm’s canal (SC) and first-order collector channels (CC) applied to spectral-domain optical coherence tomography (SD-OCT). Anterior segment SD-OCT scans from a subject were acquired circumferentially around the limbus. A Bayesian Ridge method was used to approximate the location of the SC on infrared confocal laser scanning ophthalmoscopic images with a cross multiplication tool developed to initiate SC/CC detection automated through a fuzzy hidden Markov Chain approach. Automatic segmentation of SC and initial CC’s was manually confirmed by two masked graders. Outflow pathways detected by the segmentation algorithm were reconstructed into a 3-D representation of AHO. Overall, only <1% of images (5114 total B-scans) were ungradable. Automatic segmentation algorithm performed well with SC detection 98.3% of the time and <0.1% false positive detection compared to expert grader consensus. CC was detected 84.2% of the time with 1.4% false positive detection. 3-D representation of AHO pathways demonstrated variably thicker and thinner SC with some clear CC roots. Circumferential (360 deg), automated, and validated AHO detection of angle structures in the living human eye with reconstruction was possible.

  13. Diffusion tensor driven contour closing for cell microinjection targeting.

    PubMed

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2010-01-01

    This article introduces a novel approach to the robust automatic detection of unstained living cells in bright-field (BF) microscope images, with the goal of producing a target list for an automated microinjection system. The overall image analysis process is described and includes preprocessing, ridge enhancement, image segmentation, shape analysis and injection point definition. The developed algorithm implements a new version of anisotropic contour completion (ACC) based on the partial differential equation (PDE) for heat diffusion, which improves the cell segmentation process by elongating edges only along their tangent direction. The developed ACC algorithm is equivalent to a dilation of the binary edge image with a continuous elliptic structural element that takes into account the local orientation of the contours, preventing extension in the normal direction. Experiments carried out on real images of 10-50 μm CHO-K1 adherent cells show that the algorithm is remarkably reliable, with up to 85% success for cell detection and injection point definition.

  14. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint

    PubMed Central

    Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Zhai, Ruifang

    2018-01-01

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency. PMID:29734793

  15. Pulmonary lobe segmentation based on ridge surface sampling and shape model fitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, James C., E-mail: jross@bwh.harvard.edu; Surgical Planning Lab, Brigham and Women's Hospital, Boston, Massachusetts 02215; Laboratory of Mathematics in Imaging, Brigham and Women's Hospital, Boston, Massachusetts 02126

    2013-12-15

    Purpose: Performing lobe-based quantitative analysis of the lung in computed tomography (CT) scans can assist in efforts to better characterize complex diseases such as chronic obstructive pulmonary disease (COPD). While airways and vessels can help to indicate the location of lobe boundaries, segmentations of these structures are not always available, so methods to define the lobes in the absence of these structures are desirable. Methods: The authors present a fully automatic lung lobe segmentation algorithm that is effective in volumetric inspiratory and expiratory computed tomography (CT) datasets. The authors rely on ridge surface image features indicating fissure locations and a novel approach to modeling shape variation in the surfaces defining the lobe boundaries. The authors employ a particle system that efficiently samples ridge surfaces in the image domain and provides a set of candidate fissure locations based on the Hessian matrix. Following this, lobe boundary shape models generated from principal component analysis (PCA) are fit to the particle data to discriminate between fissure and nonfissure candidates. The resulting set of particle points is used to fit thin plate spline (TPS) interpolating surfaces that form the final boundaries between the lung lobes. Results: The authors tested algorithm performance on 50 inspiratory and 50 expiratory CT scans taken from the COPDGene study. Results indicate that the authors' algorithm performs comparably to pulmonologist-generated lung lobe segmentations and can produce good results in cases with accessory fissures, incomplete fissures, advanced emphysema, and low-dose acquisition protocols. Dice scores indicate that only 29 out of 500 (5.8%) lobes showed Dice scores lower than 0.9. Two different approaches for evaluating lobe boundary surface discrepancies were applied and indicate that algorithm boundary identification is most accurate in the vicinity of fissures detectable on CT. Conclusions: The proposed algorithm is effective for lung lobe segmentation in the absence of auxiliary structures such as vessels and airways. The most challenging cases are those with mostly incomplete, absent, or near-absent fissures and cases with poorly revealed fissures due to high image noise. However, the authors observe good performance even in the majority of these cases.

  16. Automated detection of geological landforms on Mars using Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Palafox, Leon F.; Hamilton, Christopher W.; Scheidt, Stephen P.; Alvarez, Alexander M.

    2017-04-01

    The large volume of high-resolution images acquired by the Mars Reconnaissance Orbiter has opened a new frontier for developing automated approaches to detecting landforms on the surface of Mars. However, most landform classifiers focus on crater detection, which represents only one of many geological landforms of scientific interest. In this work, we use Convolutional Neural Networks (ConvNets) to detect both volcanic rootless cones and transverse aeolian ridges. Our system, named MarsNet, consists of five networks, each of which is trained to detect landforms of different sizes. We compare our detection algorithm with a widely used method for image recognition, Support Vector Machines (SVMs) using Histogram of Oriented Gradients (HOG) features. We show that ConvNets can detect a wide range of landforms and has better accuracy and recall in testing data than traditional classifiers based on SVMs.

  17. Automated detection of geological landforms on Mars using Convolutional Neural Networks.

    PubMed

    Palafox, Leon F; Hamilton, Christopher W; Scheidt, Stephen P; Alvarez, Alexander M

    2017-04-01

    The large volume of high-resolution images acquired by the Mars Reconnaissance Orbiter has opened a new frontier for developing automated approaches to detecting landforms on the surface of Mars. However, most landform classifiers focus on crater detection, which represents only one of many geological landforms of scientific interest. In this work, we use Convolutional Neural Networks (ConvNets) to detect both volcanic rootless cones and transverse aeolian ridges. Our system, named MarsNet, consists of five networks, each of which is trained to detect landforms of different sizes. We compare our detection algorithm with a widely used method for image recognition, Support Vector Machines (SVMs) using Histogram of Oriented Gradients (HOG) features. We show that ConvNets can detect a wide range of landforms and has better accuracy and recall in testing data than traditional classifiers based on SVMs.

  18. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
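
    The moving-median idea can be sketched independently of the workflow machinery; this sorted-window version is illustrative only and is not the new algorithm developed for SigWin-detector.

    ```python
    # Illustrative moving-median routine: maintain a sorted sliding window with
    # bisect, so each step costs a binary search plus one list insert/remove.
    from bisect import insort, bisect_left

    def moving_median(values, window):
        """Median of each length-`window` sliding window over `values`."""
        buf = sorted(values[:window])
        def med(b):
            m = len(b) // 2
            return b[m] if len(b) % 2 else 0.5 * (b[m - 1] + b[m])
        out = [med(buf)]
        for i in range(window, len(values)):
            buf.pop(bisect_left(buf, values[i - window]))  # drop the oldest value
            insort(buf, values[i])                         # insert the newest value
            out.append(med(buf))
        return out

    # Example: windows of gene-expression values along a chromosome; windows whose
    # median exceeds a significance threshold would be reported as RIDGEs.
    print(moving_median([1, 5, 2, 8, 7, 3, 9, 4], window=3))
    ```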

  19. Blood vessel segmentation in modern wide-field retinal images in the presence of additive Gaussian noise.

    PubMed

    Asem, Morteza Modarresi; Oveisi, Iman Sheikh; Janbozorgi, Mona

    2018-07-01

    Retinal blood vessels can indicate serious health conditions, such as cardiovascular disease and stroke. Thanks to modern imaging technology, high-resolution images provide detailed information that helps analyze retinal vascular features before symptoms associated with such conditions fully develop. Additionally, these retinal images can be used by ophthalmologists to facilitate diagnosis and eye surgery procedures. A fuzzy noise reduction algorithm was employed to enhance color images corrupted by Gaussian noise. The present paper proposes employing contrast-limited adaptive histogram equalization to enhance illumination and increase the contrast of retinal images captured by state-of-the-art cameras. Possessing directional properties, the multistructure elements method can lead to high-performance edge detection; therefore, multistructure-element-based morphology operators are used to detect high-quality image ridges. Following this detection, irrelevant ridges that are not part of the vessel tree are removed by morphological reconstruction operators, while attempting to preserve thin vessels. A combination of connected components analysis (CCA) and a thresholding approach is then used to identify the ridges that correspond to vessels. CCA yields higher efficiency when applied locally rather than to the whole image. The significance of our work lies in the way several methods are effectively combined and in the originality of the database employed, making this work unique in the literature. Computer simulation results on wide-field retinal images with up to a 200-deg field of view testify to the efficacy of the proposed approach, with an accuracy of 0.9524.
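
    A minimal sketch of multistructure-element morphological edge detection, using short line structuring elements at several orientations; the element length and angle step are illustrative assumptions, not the paper's settings.

    ```python
    # Multistructure-element morphological ridge/edge detection: apply a
    # morphological gradient with line structuring elements at several
    # orientations and keep the maximum response at each pixel.
    import numpy as np
    from scipy.ndimage import grey_dilation, grey_erosion

    def line_footprint(length, angle_deg):
        """Binary line-shaped structuring element of given length and angle."""
        r = length // 2
        canvas = np.zeros((2 * r + 1, 2 * r + 1), dtype=bool)
        t = np.deg2rad(angle_deg)
        for s in np.linspace(-r, r, 2 * length):
            canvas[int(round(r + s * np.sin(t))), int(round(r + s * np.cos(t)))] = True
        return canvas

    def multistructure_gradient(image, length=7, angles=range(0, 180, 15)):
        responses = []
        for a in angles:
            fp = line_footprint(length, a)
            grad = grey_dilation(image, footprint=fp) - grey_erosion(image, footprint=fp)
            responses.append(grad)
        return np.max(responses, axis=0)   # strongest directional edge response
    ```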

  20. [An automatic peak detection method for LIBS spectrum based on continuous wavelet transform].

    PubMed

    Chen, Peng-Fei; Tian, Di; Qiao, Shu-Jun; Yang, Guang

    2014-07-01

    Spectral peak detection is an essential step in laser-induced breakdown spectroscopy (LIBS), but the presence of background and noise seriously disturbs the accuracy of peak positions. The present paper proposes a method for automatic peak detection in LIBS spectra that improves the ability to find overlapping peaks and the adaptivity of the search. We introduce the ridge peak detection method based on the continuous wavelet transform to LIBS, discuss the choice of the mother wavelet, and optimize the scale factor and the shift factor. The method also improves ridge peak detection with a ridge-correction step. The experimental results show that, compared with other peak detection methods (the direct comparison method, the derivative method and the ridge peak search method), our method has a significant advantage in its ability to distinguish overlapping peaks and in the precision of peak detection, and it can be applied to data processing in LIBS.
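
    SciPy ships a generic implementation of the same CWT ridge idea, which can serve as a minimal illustration; the width range below is an assumption to be tuned per spectrometer, and the paper's mother-wavelet choice and ridge correction are not reproduced.

    ```python
    # Minimal sketch of continuous-wavelet-transform ridge peak detection on a
    # LIBS spectrum using SciPy's generic implementation of the same idea.
    import numpy as np
    from scipy.signal import find_peaks_cwt

    def detect_libs_peaks(intensities, widths=np.arange(1, 15)):
        """intensities: 1-D array of spectral intensities sampled along wavelength.
        Returns indices of peaks found by tracking ridges across wavelet scales."""
        return find_peaks_cwt(intensities, widths)

    # Example with two overlapping Gaussian lines on a sloping background:
    x = np.linspace(0, 1, 500)
    spectrum = (np.exp(-((x - 0.40) / 0.01) ** 2) +
                0.6 * np.exp(-((x - 0.43) / 0.01) ** 2) + 0.2 * x)
    print(detect_libs_peaks(spectrum))
    ```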

  1. Using landscape topology to compare continuous metaheuristics: a framework and case study on EDAs and ridge structure.

    PubMed

    Morgan, R; Gallagher, M

    2012-01-01

    In this paper we extend a previously proposed randomized landscape generator in combination with a comparative experimental methodology to study the behavior of continuous metaheuristic optimization algorithms. In particular, we generate two-dimensional landscapes with parameterized, linear ridge structure, and perform pairwise comparisons of algorithms to gain insight into what kind of problems are easy and difficult for one algorithm instance relative to another. We apply this methodology to investigate the specific issue of explicit dependency modeling in simple continuous estimation of distribution algorithms. Experimental results reveal specific examples of landscapes (with certain identifiable features) where dependency modeling is useful, harmful, or has little impact on mean algorithm performance. Heat maps are used to compare algorithm performance over a large number of landscape instances and algorithm trials. Finally, we perform a meta-search in the landscape parameter space to find landscapes which maximize the performance between algorithms. The results are related to some previous intuition about the behavior of these algorithms, but at the same time lead to new insights into the relationship between dependency modeling in EDAs and the structure of the problem landscape. The landscape generator and overall methodology are quite general and extendable and can be used to examine specific features of other algorithms.

  2. Algorithm applying a modified BRDF function in Λ-ridge concentrator of solar radiation

    NASA Astrophysics Data System (ADS)

    Plachta, Kamil

    2015-05-01

    This paper presents an algorithm that uses a modified BRDF function to calculate the parameters of a Λ-ridge concentrator system. The concentrator directs reflected solar radiation onto the photovoltaic surface, increasing its efficiency. The efficiency of the concentrator depends on the surface characteristics of the material it is made of, the angle of the photovoltaic panel, and the resolution of the tracking system. The paper shows a method of modeling the surface using the BRDF function and describes its basic parameters, e.g., roughness and the components of the reflected stream. A cost calculation of selected models using the BRDF modification presented in this article has been carried out. The author's own simulation program makes it possible to choose the appropriate material for the construction of a Λ-ridge concentrator, to generate the microsurface of the material, and to simulate the shape and components of the reflected stream.

  3. Fingerprint Identification Using SIFT-Based Minutia Descriptors and Improved All Descriptor-Pair Matching

    PubMed Central

    Zhou, Ru; Zhong, Dexing; Han, Jiuqiang

    2013-01-01

    The performance of conventional minutiae-based fingerprint authentication algorithms degrades significantly when dealing with low quality fingerprints with lots of cuts or scratches. A similar degradation of the minutiae-based algorithms is observed when small overlapping areas appear because of the quite narrow width of the sensors. Based on the detection of minutiae, Scale Invariant Feature Transformation (SIFT) descriptors are employed to fulfill verification tasks in the above difficult scenarios. However, the original SIFT algorithm is not suitable for fingerprint because of: (1) the similar patterns of parallel ridges; and (2) high computational resource consumption. To enhance the efficiency and effectiveness of the algorithm for fingerprint verification, we propose a SIFT-based Minutia Descriptor (SMD) to improve the SIFT algorithm through image processing, descriptor extraction and matcher. A two-step fast matcher, named improved All Descriptor-Pair Matching (iADM), is also proposed to implement the 1:N verifications in real-time. Fingerprint Identification using SMD and iADM (FISiA) achieved a significant improvement with respect to accuracy in representative databases compared with the conventional minutiae-based method. The speed of FISiA also can meet real-time requirements. PMID:23467056

  4. Identification of residue pairing in interacting β-strands from a predicted residue contact map.

    PubMed

    Mao, Wenzhi; Wang, Tong; Zhang, Wenxuan; Gong, Haipeng

    2018-04-19

    Despite the rapid progress of protein residue contact prediction, predicted residue contact maps frequently contain many errors. However, information on residue pairing in β strands can be extracted from a noisy contact map, owing to the presence of characteristic contact patterns in β-β interactions. This information may benefit the tertiary structure prediction of mainly-β proteins. In this work, we propose a novel ridge-detection-based β-β contact predictor to identify residue pairing in β strands from any predicted residue contact map. Our algorithm, RDb2C, adopts ridge detection, a well-developed technique in computer image processing, to capture consecutive residue contacts, and then utilizes a novel multi-stage random forest framework to integrate the ridge information and additional features for prediction. Starting from the predicted contact map of CCMpred, RDb2C remarkably outperforms all state-of-the-art methods on two conventional test sets of β proteins (BetaSheet916 and BetaSheet1452), achieving F1-scores of ~62% and ~76% at the residue level and strand level, respectively. Taking the prediction of the more advanced RaptorX-Contact as input, RDb2C achieves impressively higher performance, with F1-scores reaching ~76% and ~86% at the residue level and strand level, respectively. In a test of structural modeling using the top 1L predicted contacts as constraints, for 61 mainly-β proteins, the average TM-score reaches 0.442 when using the raw RaptorX-Contact prediction, but increases to 0.506 when using the improved prediction from RDb2C. Our method can significantly improve the prediction of β-β contacts from any predicted residue contact map, and its results can be directly applied to facilitate the practical structure prediction of mainly-β proteins. All source data and codes are available at http://166.111.152.91/Downloads.html or the GitHub address https://github.com/wzmao/RDb2C .

  5. Biologically inspired EM image alignment and neural reconstruction.

    PubMed

    Knowles-Barley, Seymour; Butcher, Nancy J; Meinertzhagen, Ian A; Armstrong, J Douglas

    2011-08-15

    Three-dimensional reconstruction of consecutive serial-section transmission electron microscopy (ssTEM) images of neural tissue currently requires many hours of manual tracing and annotation. Several computational techniques have already been applied to ssTEM images to facilitate 3D reconstruction and ease this burden. Here, we present an alternative computational approach for ssTEM image analysis. We have used biologically inspired receptive fields as a basis for a ridge detection algorithm to identify cell membranes, synaptic contacts and mitochondria. Detected line segments are used to improve alignment between consecutive images, and we have joined small segments of membrane into cell surfaces using a dynamic programming algorithm similar to the Needleman-Wunsch and Smith-Waterman DNA sequence alignment procedures. A shortest-path-based approach has been used to close edges and achieve image segmentation. Partial reconstructions were automatically generated and used as a basis for semi-automatic reconstruction of neural tissue. The accuracy of partial reconstructions was evaluated, and 96% of membrane could be identified at the cost of 13% false positive detections. An open-source reference implementation is available in the Supplementary information. Contact: seymour.kb@ed.ac.uk; douglas.armstrong@ed.ac.uk. Supplementary data are available at Bioinformatics online.

  6. Local electron tomography using angular variations of surface tangents: Stomo version 2

    NASA Astrophysics Data System (ADS)

    Petersen, T. C.; Ringer, S. P.

    2012-03-01

    In a recent publication, we investigated the prospect of measuring the outer three-dimensional (3D) shapes of nano-scale atom probe specimens from tilt-series of images collected in the transmission electron microscope. For this purpose alone, an algorithm and simplified reconstruction theory were developed to circumvent issues that arise in commercial "back-projection" computations in this context. In our approach, we give up the difficult task of computing the complete 3D continuum structure and instead seek only the 3D morphology of internal and external scattering interfaces. These interfaces can be described as embedded 2D surfaces projected onto each image in a tilt series. Curves and other features in the images are interpreted as inscribed sets of tangent lines, which intersect the scattering interfaces at unknown locations along the direction of the incident electron beam. Smooth angular variations of the tangent line abscissa are used to compute the surface tangent intersections and hence the 3D morphology as a "point cloud". We have published the explicit details of our alternative algorithm along with the source code entitled "stomo_version_1". For this work, we have further modified the code to efficiently handle rectangular image sets and to perform much faster tangent-line "edge detection" and smoother tilt-axis image alignment using simple bi-linear interpolation. We have also adapted the algorithm to detect tangent lines as "ridges", based upon 2nd order partial derivatives of the image intensity, the magnitude and orientation of which are described by a Hessian matrix. Ridges are more appropriate descriptors for tangent-line curves in phase contrast images outlined by Fresnel fringes or for absorption contrast data from fine-scale objects. Improved accuracy, efficiency and speed of "stomo_version_2" are demonstrated in this paper using both high resolution electron tomography data of a nano-sized atom probe tip and simulated absorption-contrast images.

    Program summary:
    Program title: STOMO version 2
    Catalogue identifier: AEFS_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFS_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2854
    No. of bytes in distributed program, including test data, etc.: 23 559
    Distribution format: tar.gz
    Programming language: C/C++
    Computer: PC
    Operating system: Windows XP
    RAM: Scales as the product of the experimental image dimensions multiplied by the number of points chosen by the user in polynomial fitting. Typical runs require between 50 Mb and 100 Mb of RAM.
    Supplementary material: Sample output files, for the test run provided, are available.
    Classification: 7.4, 14
    Catalogue identifier of previous version: AEFS_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 676
    Does the new version supersede the previous version?: Yes
    Nature of problem: A local electron tomography algorithm for specimens for which conventional back projection may fail and/or data for which there is a limited angular range (which would otherwise cause significant 'missing-wedge' artefacts). The algorithm does not solve the tomography back projection problem but rather locally reconstructs the 3D morphology of surfaces defined by varied scattering densities.
    Solution method: Local reconstruction is effected using image-analysis edge and ridge detection computations on experimental tilt series to measure smooth angular variations of surface tangent-line intersections, which generate point clouds decorating the embedded and/or external scattering surfaces of a specimen.
    Reasons for new version: The new version was coded to cater for rectangular images in experimental tilt-series, ensure smoother image rotations, provide ridge detection (suitable for sensing phase-contrast Fresnel fringes and other fine-scale structures), provide faster/larger kernel edge detection, and greatly reduce RAM usage. Specimen surface normals are also explicitly computed from tangent-line and edge intersections, providing new information for potential use in point cloud rendering. Hysteresis thresholding implemented in the version 1 edge-detection algorithm provided only sparse edge-linking; version 2 now implements edge tracking using recursion to fully link the edges during hysteresis thresholding. Furthermore, in version 1 the minimum number of fitted polynomial points (specified in the input file) was not correctly imposed, which has been fixed for version 2. Most of these changes increase the accuracy of 3D morphology surface-tomography reconstructions by facilitating the use of more/finer tilt angles and experimental images of increased spatial resolution. The ridge detection was incorporated to specifically improve the reconstruction of internal specimen morphology.
    Summary of revisions: Included a Hessian() function to compute 2nd order spatial derivatives of image intensities (it operates in the same fashion as the existing Sobel() function). Changed the convolve_Gaussian() function to use successive 1D convolutions (rather than the cumbersome 2D summations implemented in version 1), resulting in a large increase in computational speed without any loss in accuracy; the convolution kernel size was hence widened to three times the full width at half maximum of the Gaussian filter to improve scale-space selection accuracy. A ridge detection option was included to compute edge maps sensitive to ridges, rather than edges, using elements of a Hessian matrix, whose eigenvalues define the ridge direction for Canny-type hysteresis thresholding. The function edge_detect_Canny() was also altered to pass the gradient-direction maps (from either the Hessian or Sobel based operators) in and out of scope for computation of surface normals, thereby enabling the output of both point-cloud and corresponding unstructured vector-field surface descriptors. The function rotate_imgs() was changed to incorporate basic bi-linear interpolation for improved tilt-axis alignment of the entire tilt series in exp_data.dat, producing smoother and more accurate edge maps. The algorithm convert_point_cloud_to_tomogram() was created to output the tomogram 3d_imgs.dat in a more memory-efficient manner; the function shell_sort(), adapted from Numerical Recipes in C, was also coded for this purpose. The new function compute_xyz() was coded to calculate point clouds and tomogram surface normals using information from single tilt images, as opposed to the entire stack, and is hence used iteratively throughout the reconstruction as each tilt image is analysed in succession. The new function reconstruct_local() is the heart of stomo_version_2.cpp: the main() source code of stomo_version_1.cpp has been rewritten here to process experimental images and edge maps one at a time, using a buffered 3D array of dimensions dictated solely by the number of tilt images required for the local SVD fit of the angular variations. These changes (along with similar iterative file writing) have been made to vastly reduce memory usage and hence allow higher spatial and angular resolution data sets to be analysed without recourse to high performance computing resources. The input file has been simplified by removing the 'slices' and 'channels' settings (used in version 1 for crude image binning), which are now equal to the respective numbers of image rows and columns. Every summation over image rows and columns has been checked to enable the analysis of rectangular images without error; for images of specimens with high aspect ratios, such as narrow tips, these fixes allow significant reductions in computation time and memory usage. Some arrays in the source code were not appropriately zeroed in version 1, causing reconstruction artefacts in some cases; these problems have now been fixed. Fixed an if-statement to correctly impose the minimum number of fitted polynomial points, thereby reducing noise in the reconstructed data. Implemented proper edge linking in the hysteresis thresholding code for Canny edge detection.
    Restrictions: The input experimental tilt-series of images must be registered with respect to a common single tilt axis with known orientation and position.
    Running time: For high quality reconstruction, 2-5 min.

  7. Ultrafast fingerprint indexing for embedded systems

    NASA Astrophysics Data System (ADS)

    Zhou, Ru; Sin, Sang Woo; Li, Dongju; Isshiki, Tsuyoshi; Kunieda, Hiroaki

    2011-10-01

    A novel core-based fingerprint indexing scheme for embedded systems is presented in this paper. Our approach is enabled by a new, precise and fast core-detection algorithm based on the direction map. It introduces the CMP (core minutiae pair) feature, which describes the coordinates of minutiae and the direction of the ridges associated with the minutiae based on the uniquely defined core coordinates. Since each CMP is invariant to shift and rotation of the fingerprint image, CMP comparison between a template and an input image can be performed without any alignment. The proposed CMP-based indexing algorithm is suitable for embedded systems because it achieves a tremendous speedup and memory reduction. In fact, experiments with the fingerprint database FVC2002 show that identification becomes about 40 times faster than with conventional approaches, even though the database includes fingerprints with no core.

  8. From template to image: reconstructing fingerprints from minutiae points.

    PubMed

    Ross, Arun; Shah, Jidnya; Jain, Anil K

    2007-04-01

    Most fingerprint-based biometric systems store the minutiae template of a user in the database. It has been traditionally assumed that the minutiae template of a user does not reveal any information about the original fingerprint. In this paper, we challenge this notion and show that three levels of information about the parent fingerprint can be elicited from the minutiae template alone, viz., 1) the orientation field information, 2) the class or type information, and 3) the friction ridge structure. The orientation estimation algorithm determines the direction of local ridges using the evidence of minutiae triplets. The estimated orientation field, along with the given minutiae distribution, is then used to predict the class of the fingerprint. Finally, the ridge structure of the parent fingerprint is generated using streamlines that are based on the estimated orientation field. Line Integral Convolution is used to impart texture to the ensuing ridges, resulting in a ridge map resembling the parent fingerprint. The salient feature of this noniterative method to generate ridges is its ability to preserve the minutiae at specified locations in the reconstructed ridge map. Experiments using a commercial fingerprint matcher suggest that the reconstructed ridge structure bears close resemblance to the parent fingerprint.

  9. Evaluation of Variable Refrigerant Flow Systems Performance and the Enhanced Control Algorithm on Oak Ridge National Laboratory s Flexible Research Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Munk, Jeffrey D; Gehl, Anthony C

    2015-06-01

    A research project “Evaluation of Variable Refrigerant Flow (VRF) Systems Performance and the Enhanced Control Algorithm on Oak Ridge National Laboratory’s (ORNL’s) Flexible Research Platform” was performed to (1) install and validate the performance of Samsung VRF systems compared with the baseline rooftop unit (RTU) variable-air-volume (VAV) system and (2) evaluate the enhanced control algorithm for the VRF system on the two-story flexible research platform (FRP) in Oak Ridge, Tennessee. Based on the VRF system designed by Samsung and ORNL, the system was installed from February 18 through April 15, 2014. The final commissioning and system optimization were completed on June 2, 2014, and the initial test of system operation was started the following day, June 3, 2014. In addition, the enhanced control algorithm was implemented and updated on June 18. After a series of additional commissioning actions, the energy performance data from the RTU and the VRF system were monitored from July 7, 2014, through February 28, 2015. Data monitoring and analysis were performed for the cooling season and heating season separately, and a calibrated simulation model was developed and used to estimate the energy performance of the RTU and VRF systems. This final report includes discussion of the design and installation of the VRF system, the data monitoring and analysis plan, the cooling season and heating season data analysis, and the building energy modeling study.

  10. Crustal thickness variations across the Blue Ridge mountains, southern Appalachians: an alternative procedure for migrating wide-angle reflection data

    Treesearch

    Robert B. Hawman

    2008-01-01

    Migration of wide-angle reflections generated by quarry blasts suggests that crustal thickness increases from 38 km beneath the Carolina Terrane to 47–51 km along the southeastern flank of the Blue Ridge. The migration algorithm, developed for generating single-fold images from explosions and earthquakes recorded with isolated, short-aperture arrays, uses the localized...

  11. Groundwater quality in the Valley and Ridge and Piedmont and Blue Ridge carbonate-rock aquifers, eastern United States

    USGS Publications Warehouse

    Lindsey, Bruce; Belitz, Kenneth

    2017-01-19

    Groundwater provides nearly 50 percent of the Nation’s drinking water. To help protect this vital resource, the U.S. Geological Survey (USGS) National Water-Quality Assessment (NAWQA) Project assesses groundwater quality in aquifers that are important sources of drinking water. The Valley and Ridge and Piedmont and Blue Ridge carbonate-rock aquifers constitute two of the important areas being evaluated. One or more inorganic constituents with human-health benchmarks were detected at high concentrations in about 15 percent of the study area and at moderate concentrations in about 17 percent. Organic constituents were not detected at high concentrations in the study area. One or more organic constituents with human-health benchmarks were detected at moderate concentrations in about 2 percent of the study area.

  12. Accurately tracking single-cell movement trajectories in microfluidic cell sorting devices.

    PubMed

    Jeong, Jenny; Frohberg, Nicholas J; Zhou, Enlu; Sulchek, Todd; Qiu, Peng

    2018-01-01

    Microfluidics are routinely used to study cellular properties, including the efficient quantification of single-cell biomechanics and label-free cell sorting based on the biomechanical properties, such as elasticity, viscosity, stiffness, and adhesion. Both quantification and sorting applications require optimal design of the microfluidic devices and mathematical modeling of the interactions between cells, fluid, and the channel of the device. As a first step toward building such a mathematical model, we collected video recordings of cells moving through a ridged microfluidic channel designed to compress and redirect cells according to cell biomechanics. We developed an efficient algorithm that automatically and accurately tracked the cell trajectories in the recordings. We tested the algorithm on recordings of cells with different stiffness, and showed the correlation between cell stiffness and the tracked trajectories. Moreover, the tracking algorithm successfully picked up subtle differences of cell motion when passing through consecutive ridges. The algorithm for accurately tracking cell trajectories paves the way for future efforts of modeling the flow, forces, and dynamics of cell properties in microfluidics applications.

  13. Studies of images of short-lived events using ERTS data. [forest fires, oil spills, vegetation damage, volcanoes, storm ridges, earthquakes, and floods

    NASA Technical Reports Server (NTRS)

    Deutschman, W. A. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Detection of short-lived events has continued. Forest fires, oil spills, vegetation damage, volcanoes, storm ridges, earthquakes, and floods have been detected and analyzed.

  14. Lava Morphology Classification of a Fast-Spreading Ridge Using Deep-Towed Sonar Data: East Pacific Rise

    NASA Astrophysics Data System (ADS)

    Meyer, J.; White, S.

    2005-05-01

    Classification of lava morphology on a regional scale contributes to understanding the distribution and extent of lava flows at a mid-ocean ridge, and seafloor classification is essential for understanding the regional undersea environment at mid-ocean ridges. In this study, a classification scheme is developed to identify and extract textural patterns of different lava morphologies along the East Pacific Rise using DSL-120 side-scan sonar and ARGO camera imagery. Applying an accurate image classification technique to side-scan sonar allows us to expand upon the locally available visual ground reference data and make the first comprehensive regional maps of small-scale lava morphology present at a mid-ocean ridge. The submarine lava morphologies focused upon in this study (sheet flows, lobate flows, and pillow flows) have unique textures. Several algorithms were applied to the sonar backscatter intensity images to produce multiple textural image layers useful in distinguishing the different lava morphologies. The intensity and spatially enhanced images were then combined and used in a hybrid classification technique. The hybrid classification involves two integrated classifiers, a rule-based expert system classifier and a machine learning classifier; their complementary capabilities provide higher accuracy of regional seafloor classification than either classifier alone. Once trained, the hybrid classifier can be applied to classify neighboring images with relative ease. This classification technique has been used to map the lava morphology distribution and infer the spatial variability of lava effusion rates along two segments of the East Pacific Rise, 17 deg S and 9 deg N. The technique may also be useful for obtaining temporal information: repeated mapping of morphology in this dynamic environment can be compared to detect regional seafloor change.

  15. Novel methods for detecting buried explosive devices

    NASA Astrophysics Data System (ADS)

    Kercel, Stephen W.; Burlage, Robert S.; Patek, David R.; Smith, Cyrus M.; Hibbs, Andrew D.; Rayner, Timothy J.

    1997-07-01

    Oak Ridge National Laboratory and Quantum Magnetics, Inc. are exploring novel landmine detection technologies. Technologies considered here include bioreporter bacteria, swept acoustic resonance, nuclear quadrupole resonance (NQR), and semiotic data fusion. Bioreporter bacteria look promising for third-world humanitarian applications; they are inexpensive, and deployment does not require high-tech methods. Swept acoustic resonance may be a useful adjunct to magnetometers in humanitarian demining. For military demining, NQR is a promising method for detecting explosive substances; of 50,000 substances that have been tested, one has an NQR signature that can be mistaken for RDX or TNT. For both military and commercial demining, sensor fusion entails two daunting tasks, identifying fusible features in both present-day and emerging technologies, and devising a fusion algorithm that runs in real-time on cheap hardware. Preliminary research in these areas is encouraging. A bioreporter bacterium for TNT detection is under development. Investigation has just started in swept acoustic resonance as an approach to a cheap mine detector for humanitarian use. Real-time wavelet processing appears to be a key to extending NQR bomb detection into mine detection, including TNT-based mines. Recent discoveries in semiotics may be the breakthrough that will lead to a robust fused detection scheme.

  16. Detection of retinal capillary nonperfusion in fundus fluorescein angiogram of diabetic retinopathy.

    PubMed

    Rasta, Seyed Hossein; Nikfarjam, Shima; Javadzadeh, Alireza

    2015-01-01

    Retinal capillary nonperfusion (CNP) is one of the retinal vascular diseases in diabetic retinopathy (DR) patients. As there is no comprehensive detection technique to recognize CNP areas, we propose a new method for the automated detection of ischemic retina, i.e., non-perfused (NP) regions, in fundus fluorescein angiogram (FFA) images. Whilst major vessels appear as ridges, non-perfused areas are usually observed as ponds surrounded by healthy capillaries in FFA images. A new technique using homomorphic filtering to correct uneven illumination and detect the ponds surrounded by healthy capillaries in FFA images was designed and applied to DR fundus images. These images were acquired from diabetic patients who had been referred to Nikookari hospital and were diagnosed with diabetic retinopathy over a one-year period. Our strategy was to screen the whole image with a fixed window size, small enough to enclose areas with the identified topographic characteristics. To discard false candidates, we also performed a thresholding operation on the screened and marked images. To validate its performance we applied our detection algorithm to 41 FFA diabetic retinopathy fundus images in which the CNP areas were manually delineated by three clinical experts. Lesions were found as smooth regions with very high uniformity, low entropy, and small intensity variations in FFA images. The results of the automated detection method were compared with the manually marked CNP areas, achieving a sensitivity of 81%, specificity of 78%, and accuracy of 91%. The results were also presented as a receiver operating characteristic (ROC) curve, with an area under the curve (AUC) of 0.796 and 95% confidence intervals. This technique introduces a new automated detection algorithm to recognize non-perfusion lesions on FFA. It has the potential to assist in detecting and managing ischemic retina and may be incorporated into automated diabetic retinopathy grading systems.
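
    A minimal sketch of the homomorphic illumination-correction step described above, assuming a single-channel FFA image and using a Gaussian high-emphasis filter in the frequency domain (the cutoff sigma and gains are illustrative, not the paper's values):

    ```python
    # Minimal homomorphic filtering sketch for illumination correction
    # (cutoff and gain values are illustrative, not those of the paper).
    import numpy as np

    def homomorphic_filter(img, sigma=30.0, gamma_low=0.5, gamma_high=1.5):
        """Suppress slow illumination variation and boost reflectance detail."""
        img = img.astype(np.float64) + 1.0           # avoid log(0)
        log_img = np.log(img)
        F = np.fft.fftshift(np.fft.fft2(log_img))
        rows, cols = img.shape
        y, x = np.ogrid[:rows, :cols]
        d2 = (y - rows / 2) ** 2 + (x - cols / 2) ** 2
        # Gaussian high-emphasis filter: gamma_low at DC, gamma_high at high frequencies
        H = (gamma_high - gamma_low) * (1 - np.exp(-d2 / (2 * sigma ** 2))) + gamma_low
        filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
        out = np.exp(filtered) - 1.0
        return (out - out.min()) / (out.max() - out.min())   # rescale to [0, 1]
    ```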

  17. The application research of microwave nondestructive testing and imaging based on ω-k algorithm

    NASA Astrophysics Data System (ADS)

    Qi, Shengxiang; Ren, Jian; Gu, Lihua; Xu, Hui; Wang, Yuanbo

    2017-07-01

    Several bridge collapse accidents have occurred in recent years due to quality problems, so nondestructive testing of concrete is particularly important. At present, most applications use Ground Penetrating Radar (GPR) technology for the inspection of reinforced concrete structures. GPR uses the pulse method, which has definite advantages, but testing the internal structure of small-thickness concrete yields very low resolution with this method. In this paper, ultra-wideband (UWB) stepped-frequency radar is applied to this problem for the first time. We use a microwave imaging system consisting of a vector network analyzer and a double-ridged horn antenna to test a reinforced concrete block. The internal structure of the concrete is reconstructed with a synthetic-aperture method, the ω-k algorithm. With this method, a steel bar with a diameter of 1 cm is imaged accurately inside a 450 mm × 400 mm × 500 mm block, and the depth error does not exceed 1 cm.

  18. Methods for predicting properties and tailoring salt solutions for industrial processes

    NASA Technical Reports Server (NTRS)

    Ally, Moonis R.

    1993-01-01

    An algorithm developed at Oak Ridge National Laboratory accurately and quickly predicts thermodynamic properties of concentrated aqueous salt solutions. This algorithm is much simpler and much faster than other modeling schemes and is unique because it can predict solution behavior at very high concentrations and under varying conditions. Typical industrial applications of this algorithm would be in manufacture of inorganic chemicals by crystallization, thermal storage, refrigeration and cooling, extraction of metals, emissions controls, etc.

  19. Determining the bias and variance of a deterministic finger-tracking algorithm.

    PubMed

    Morash, Valerie S; van der Velden, Bas H M

    2016-06-01

    Finger tracking has the potential to expand haptic research and applications, as eye tracking has done in vision research. In research applications, it is desirable to know the bias and variance associated with a finger-tracking method. However, assessing the bias and variance of a deterministic method is not straightforward. Multiple measurements of the same finger position data will not produce different results, implying zero variance. Here, we present a method of assessing deterministic finger-tracking variance and bias through comparison to a non-deterministic measure. A proof-of-concept is presented using a video-based finger-tracking algorithm developed for the specific purpose of tracking participant fingers during a psychological research study. The algorithm uses ridge detection on videos of the participant's hand, and estimates the location of the right index fingertip. The algorithm was evaluated using data from four participants, who explored tactile maps using only their right index finger and all right-hand fingers. The algorithm identified the index fingertip in 99.78% of one-finger video frames and 97.55% of five-finger video frames. Although the algorithm produced slightly biased and more dispersed estimates relative to a human coder, these differences (x = 0.08 cm, y = 0.04 cm) and standard deviations (σx = 0.16 cm, σy = 0.21 cm) were small compared to the size of a fingertip (1.5-2.0 cm). Some example finger-tracking results are provided where corrections are made using the bias and variance estimates.
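
    The bias/variance comparison itself reduces to simple statistics over per-frame disagreements between the deterministic tracker and a non-deterministic reference (here a human coder). A hedged sketch, with hypothetical array names:

    ```python
    # Sketch: bias and dispersion of a deterministic tracker relative to a human coder.
    # `algo_xy` and `human_xy` are (N, 2) arrays of fingertip positions (cm) on the
    # same video frames; names and data are hypothetical.
    import numpy as np

    def bias_and_spread(algo_xy, human_xy):
        diff = algo_xy - human_xy            # per-frame disagreement
        bias = diff.mean(axis=0)             # systematic offset (x, y)
        spread = diff.std(axis=0, ddof=1)    # dispersion around that offset
        return bias, spread

    # Example: correct later estimates by subtracting the estimated bias.
    # corrected = new_algo_xy - bias
    ```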

  20. Using GA-Ridge regression to select hydro-geological parameters influencing groundwater pollution vulnerability.

    PubMed

    Ahn, Jae Joon; Kim, Young Min; Yoo, Keunje; Park, Joonhong; Oh, Kyong Joo

    2012-11-01

    For groundwater conservation and management, it is important to accurately assess groundwater pollution vulnerability. This study proposed an integrated model using ridge regression and a genetic algorithm (GA) to effectively select the major hydro-geological parameters influencing groundwater pollution vulnerability in an aquifer. The GA-Ridge regression method determined that depth to water, net recharge, topography, and the impact of vadose zone media were the hydro-geological parameters that influenced trichloroethene pollution vulnerability in a Korean aquifer. When using these selected hydro-geological parameters, the accuracy was improved for various statistical nonlinear and artificial intelligence (AI) techniques, such as multinomial logistic regression, decision trees, artificial neural networks, and case-based reasoning. These results provide a proof of concept that the GA-Ridge regression is effective at determining influential hydro-geological parameters for the pollution vulnerability of an aquifer, and in turn, improves the AI performance in assessing groundwater pollution vulnerability.
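
    A simplified sketch of the GA-Ridge idea, i.e. a genetic algorithm searching over binary masks of hydro-geological parameters with cross-validated ridge regression as the fitness, is given below; the GA operators, population size, and scoring are illustrative simplifications rather than the authors' configuration:

    ```python
    # Sketch of GA-based parameter selection scored by cross-validated ridge
    # regression (a much simplified GA; population size, mutation rate, and the
    # scoring metric are illustrative choices).
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    def fitness(mask, X, y):
        if mask.sum() == 0:
            return -np.inf
        return cross_val_score(Ridge(alpha=1.0), X[:, mask], y, cv=5).mean()

    def ga_ridge_select(X, y, pop_size=30, generations=40, p_mut=0.1):
        rng = np.random.default_rng(0)
        n = X.shape[1]
        pop = rng.integers(0, 2, size=(pop_size, n)).astype(bool)
        for _ in range(generations):
            scores = np.array([fitness(ind, X, y) for ind in pop])
            parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                cut = rng.integers(1, n)
                child = np.concatenate([a[:cut], b[cut:]])            # one-point crossover
                flip = rng.random(n) < p_mut                          # bit-flip mutation
                children.append(np.where(flip, ~child, child))
            pop = np.vstack([parents, children])
        best = pop[np.argmax([fitness(ind, X, y) for ind in pop])]
        return np.flatnonzero(best)   # indices of selected parameters
    ```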

  1. Polygonal Ridge Networks on Mars

    NASA Astrophysics Data System (ADS)

    Kerber, Laura; Dickson, James; Grosfils, Eric; Head, James W.

    2016-10-01

    Polygonal ridge networks, also known as boxwork or reticulate ridges, are found in numerous locations and geological contexts across Mars. While networks formed from mineralized fractures hint at hot, possibly life-sustaining circulating ground waters, networks formed by impact-driven clastic diking, magmatic dikes, gas escape, or lava flows do not have the same astrobiological implications. Distinguishing the morphologies and geological context of the ridge networks sheds light on their potential as astrobiological and mineral resource sites of interest. The most widespread type of ridge morphology is characteristic of the Nili Fossae and Nilosyrtis region and consists of thin, criss-crossing ridges with a variety of heights, widths, and intersection angles. They are found in ancient Noachian terrains at a variety of altitudes and geographic locations and may be a mixture of clastic dikes, brecciated dikes, and mineral veins. They occur in the same general areas as valley networks and ancient lake basins, but they are not more numerous where these features are concentrated, and can appear in places where these features are absent. Similarly, some of the ridge networks are associated with hydrated mineral detections, but some occur in locations without detections. Smaller, light-toned ridges of variable widths have been found in Gale Crater and at other rover sites and are interpreted to be smaller versions of the Nili-like ridges, in this case formed by the mineralization of fractures. This type of ridge is likely to be found in many other places on Mars as more high-resolution data becomes available. Hellas Basin is host to a third type of ridge morphology consisting of large, thick, light-toned ridges forming regular polygons at several superimposed scales. While still enigmatic, these are most likely to be the result of sediment-filled fractures. The Eastern Medusae Fossae Formation contains large swaths of a fourth, previously undocumented, ridge network type. The dark ridges, reaching up to 50 m in height, enclose regular polygons and erode into dark boulders. These ridge networks are interpreted to form as a result of lava flow embayment of deeply fractured Medusae Fossae Formation outcrops.

  2. Detecting Multi-scale Structures in Chandra Images of Centaurus A

    NASA Astrophysics Data System (ADS)

    Karovska, M.; Fabbiano, G.; Elvis, M. S.; Evans, I. N.; Kim, D. W.; Prestwich, A. H.; Schwartz, D. A.; Murray, S. S.; Forman, W.; Jones, C.; Kraft, R. P.; Isobe, T.; Cui, W.; Schreier, E. J.

    1999-12-01

    Centaurus A (NGC 5128) is a giant early-type galaxy with a merger history, containing the nearest radio-bright AGN. Recent Chandra High Resolution Camera (HRC) observations of Cen A reveal X-ray multi-scale structures in this object with unprecedented detail and clarity. We show the results of an analysis of the Chandra data with smoothing and edge enhancement techniques that allow us to enhance and quantify the multi-scale structures present in the HRC images. These techniques include an adaptive smoothing algorithm (Ebeling et al. 1999), and a multi-directional gradient detection algorithm (Karovska et al. 1994). The Ebeling et al. adaptive smoothing algorithm, which is incorporated in the CXC analysis software package, is a powerful tool for smoothing images containing complex structures at various spatial scales. The adaptively smoothed images of Centaurus A show simultaneously the high-angular resolution bright structures at scales as small as an arcsecond and the extended faint structures as large as several arcminutes. The large scale structures suggest complex symmetry, including a component possibly associated with the inner radio lobes (as suggested by the ROSAT HRI data, Dobereiner et al. 1996), and a separate component with an orthogonal symmetry that may be associated with the galaxy as a whole. The dust lane and the X-ray ridges are very clearly visible. The adaptively smoothed images and the edge-enhanced images also suggest several filamentary features including a large filament-like structure extending as far as about 5 arcminutes to the northwest.

  3. Demonstration and Validation of an Improved Airborne Electromagnetic System for UXO Detection and Mapping

    DTIC Science & Technology

    2010-05-01

    William E. Doll Battelle 105 Mitchell Road Suite 103 Oak Ridge, TN 37830 865-483-2548 865-599-6165 dollw@battelle.org Airborne Survey...Manager David T. Bell Battelle 105 Mitchell Road Suite 103 Oak Ridge, TN 37830 865-483-2547 865-250-0578 belldt@battelle.org Battelle-Oak Ridge

  4. Unexpected series of regular frequency spacing of δ Scuti stars in the non-asymptotic regime. II. Sample-Echelle diagrams and rotation

    DOE PAGES

    Paparo, M.; Benko, J. M.; Hareter, M.; ...

    2016-06-17

    A sequence search method was developed for searching for regular frequency spacing in δ Scuti stars by visual inspection (VI) and algorithmic search. The sample contains 90 δ Scuti stars observed by CoRoT. An example is given to represent the VI. The algorithm (SSA) is described in detail. The data treatment of the CoRoT light curves, the criteria for frequency filtering, and the spacings derived by two methods (i.e., three approaches: VI, SSA, and FT) are given for each target. Echelle diagrams are presented for 77 targets for which at least one sequence of regular spacing was identified. Comparing the spacing and the shifts between pairs of echelle ridges revealed that at least one pair of echelle ridges is shifted to midway between the spacing for 22 stars. The estimated rotational frequencies compared to the shifts revealed rotationally split doublets, triplets, and multiplets not only for single frequencies, but for the complete echelle ridges in 31 δ Scuti stars. Furthermore, using several possible assumptions for the origin of the spacings, we derived the large separation (Δν).

  5. Unexpected series of regular frequency spacing of δ Scuti stars in the non-asymptotic regime. II. Sample-Echelle diagrams and rotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paparo, M.; Benko, J. M.; Hareter, M.

    A sequence search method was developed for searching for regular frequency spacing in δ Scuti stars by visual inspection (VI) and algorithmic search. The sample contains 90 δ Scuti stars observed by CoRoT. An example is given to represent the VI. The algorithm (SSA) is described in detail. The data treatment of the CoRoT light curves, the criteria for frequency filtering, and the spacings derived by two methods (i.e., three approaches: VI, SSA, and FT) are given for each target. Echelle diagrams are presented for 77 targets for which at least one sequence of regular spacing was identified. Comparing the spacing and the shifts between pairs of echelle ridges revealed that at least one pair of echelle ridges is shifted to midway between the spacing for 22 stars. The estimated rotational frequencies compared to the shifts revealed rotationally split doublets, triplets, and multiplets not only for single frequencies, but for the complete echelle ridges in 31 δ Scuti stars. Furthermore, using several possible assumptions for the origin of the spacings, we derived the large separation (Δν).

  6. Ridge extraction from the time-frequency representation (TFR) of signals based on an image processing approach: application to the analysis of uterine electromyogram AR TFR.

    PubMed

    Terrien, Jérémy; Marque, Catherine; Germain, Guy

    2008-05-01

    Time-frequency representations (TFRs) of signals are increasingly being used in biomedical research. Analysis of such representations is sometimes difficult, however, and is often reduced to the extraction of ridges, or local energy maxima. In this paper, we describe a new ridge extraction method based on the image processing technique of active contours or snakes. We have tested our method on several synthetic signals and for the analysis of uterine electromyogram or electrohysterogram (EHG) recorded during gestation in monkeys. We have also evaluated a postprocessing algorithm that is especially suited for EHG analysis. Parameters are evaluated on real EHG signals in different gestational periods. The presented method gives good results when applied to synthetic as well as EHG signals. We have been able to obtain smaller ridge extraction errors when compared to two other methods specially developed for EHG. The gradient vector flow (GVF) snake method, or GVF-snake method, appears to be a good ridge extraction tool, which could be used on TFR of mono or multicomponent signals with good results.
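
    A much simpler baseline than the GVF-snake method of the paper, following the local energy maximum of the TFR column by column under a continuity constraint, can be sketched as follows (this is not the authors' method; max_jump is an illustrative constraint):

    ```python
    # Baseline sketch (not the GVF-snake of the paper): follow the local energy
    # maximum of a time-frequency representation column by column, with a simple
    # continuity constraint on how far the ridge may jump between time steps.
    import numpy as np

    def extract_ridge(tfr, max_jump=3):
        """tfr: 2-D array (frequency bins x time samples); returns one bin index per column."""
        n_freq, n_time = tfr.shape
        ridge = np.empty(n_time, dtype=int)
        ridge[0] = int(np.argmax(tfr[:, 0]))
        for t in range(1, n_time):
            lo = max(0, ridge[t - 1] - max_jump)
            hi = min(n_freq, ridge[t - 1] + max_jump + 1)
            ridge[t] = lo + int(np.argmax(tfr[lo:hi, t]))   # search only near the previous bin
        return ridge
    ```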

  7. A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression.

    PubMed

    Stock, Michiel; Pahikkala, Tapio; Airola, Antti; De Baets, Bernard; Waegeman, Willem

    2018-06-12

    Many machine learning problems can be formulated as predicting labels for a pair of objects. Problems of that kind are often referred to as pairwise learning, dyadic prediction, or network inference problems. During the past decade, kernel methods have played a dominant role in pairwise learning. They still achieve state-of-the-art predictive performance, but a theoretical analysis of their behavior has been underexplored in the machine learning literature. In this work we review and unify kernel-based algorithms that are commonly used in different pairwise learning settings, ranging from matrix filtering to zero-shot learning. To this end, we focus on closed-form efficient instantiations of Kronecker kernel ridge regression. We show that independent task kernel ridge regression, two-step kernel ridge regression, and a linear matrix filter arise naturally as special cases of Kronecker kernel ridge regression, implying that all these methods implicitly minimize a squared loss. In addition, we analyze universality, consistency, and spectral filtering properties. Our theoretical results provide valuable insights into assessing the advantages and limitations of existing pairwise learning methods.
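
    A hedged sketch of one of the closed-form instantiations discussed, Kronecker kernel ridge regression solved through the eigendecompositions of the two object kernels, is shown below; the kernels and labels in the toy usage are random and purely illustrative:

    ```python
    # Sketch of Kronecker kernel ridge regression in closed form via
    # eigendecompositions. Ku, Kv are kernel matrices over the two object
    # domains; Y[i, j] is the label of pair (u_i, v_j); inputs are hypothetical.
    import numpy as np

    def kron_krr_fit(Ku, Kv, Y, lam=1.0):
        """Solve (Kv kron Ku + lam*I) vec(A) = vec(Y) without forming the Kronecker product."""
        su, U = np.linalg.eigh(Ku)
        sv, V = np.linalg.eigh(Kv)
        M = U.T @ Y @ V                          # rotate labels into the eigenbases
        M /= (su[:, None] * sv[None, :] + lam)   # shrink each eigen-component
        return U @ M @ V.T                       # dual coefficients A

    def kron_krr_predict(Ku_test, Kv_test, A):
        """Kernel evaluations between test and training objects give predictions."""
        return Ku_test @ A @ Kv_test.T

    # Toy usage with random linear kernels (illustrative only):
    rng = np.random.default_rng(0)
    Xu, Xv = rng.normal(size=(8, 3)), rng.normal(size=(6, 4))
    Ku, Kv = Xu @ Xu.T, Xv @ Xv.T
    Y = rng.normal(size=(8, 6))
    A = kron_krr_fit(Ku, Kv, Y, lam=0.5)
    F = kron_krr_predict(Ku, Kv, A)              # in-sample predictions
    ```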

  8. The non-contact detection and identification of blood stained fingerprints using visible wavelength reflectance hyperspectral imaging: Part 1.

    PubMed

    Cadd, Samuel; Li, Bo; Beveridge, Peter; O'Hare, William T; Campbell, Andrew; Islam, Meez

    2016-05-01

    Blood is one of the most commonly encountered types of biological evidence found at scenes of violent crime and one of the most commonly observed fingerprint contaminants. Current visualisation methods rely on presumptive tests or chemical enhancement methods. Although these can successfully visualise ridge detail, they are destructive, do not confirm the presence of blood and can have a negative impact on DNA sampling. A novel application of visible wavelength reflectance hyperspectral imaging (HSI) has been used for the detection and positive identification of blood stained fingerprints in a non-contact and non-destructive manner on white ceramic tiles. The identification of blood was based on the unique visible absorption spectrum of haemoglobin between 400 and 500 nm. HSI has been used to successfully visualise ridge detail in blood stained fingerprints to the ninth depletion. Ridge detail was still detectable with diluted blood to 20-fold dilutions. Latent blood stains were detectable to 15,000-fold dilutions. Ridge detail was detectable for fingerprints up to 6 months old. HSI was also able to conclusively distinguish blood stained fingerprints from fingerprints in six paints and eleven other red/brown media with zero false positives.

  9. On the adequacy of identified Cole-Cole models

    NASA Astrophysics Data System (ADS)

    Xiang, Jianping; Cheng, Daizhan; Schlindwein, F. S.; Jones, N. B.

    2003-06-01

    The Cole-Cole model has been widely used to interpret electrical geophysical data. Normally an iterative computer program is used to invert the frequency domain complex impedance data and simple error estimation is obtained from the squared difference of the measured (field) and calculated values over the full frequency range. Recently a new direct inversion algorithm was proposed for the "optimal" estimation of the Cole-Cole parameters, which differs from existing inversion algorithms in that the estimated parameters are direct solutions of a set of equations without the need for an initial guess for initialisation. This paper first briefly investigates the advantages and disadvantages of the new algorithm compared to the standard Levenberg-Marquardt "ridge regression" algorithm. Then, and more importantly, we address the adequacy of the models resulting from both the "ridge regression" and the new algorithm, using two different statistical tests and we give objective statistical criteria for acceptance or rejection of the estimated models. The first is the standard χ² technique. The second is a parameter-accuracy based test that uses a joint multi-normal distribution. Numerical results that illustrate the performance of both testing methods are given. The main goals of this paper are (i) to provide the source code for the new "direct inversion" algorithm in Matlab and (ii) to introduce and demonstrate two methods to determine the reliability of a set of data before data processing, i.e., to consider the adequacy of the resulting Cole-Cole model.
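
    For readers who want to reproduce the conventional iterative fit that the direct inversion is compared against, a sketch using the Pelton form of the Cole-Cole model and SciPy's bounded least squares is given below (starting values, bounds, and the synthetic data are illustrative; the paper's own Matlab source is not reproduced here):

    ```python
    # Sketch: iterative least-squares fit of the Pelton-form Cole-Cole model
    # rho(w) = rho0 * (1 - m*(1 - 1/(1 + (i*w*tau)**c))). This is the conventional
    # iterative approach, not the paper's direct inversion; values are illustrative.
    import numpy as np
    from scipy.optimize import least_squares

    def cole_cole(w, rho0, m, tau, c):
        return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + (1j * w * tau) ** c)))

    def residuals(p, w, z_obs):
        z = cole_cole(w, *p)
        return np.concatenate([(z - z_obs).real, (z - z_obs).imag])

    def fit_cole_cole(w, z_obs, p0=(100.0, 0.5, 0.01, 0.5)):
        bounds = ([0, 0, 1e-6, 0], [np.inf, 1, 1e3, 1])   # rho0, m, tau, c
        res = least_squares(residuals, p0, args=(w, z_obs), bounds=bounds)
        return res.x

    # Synthetic example: recover known parameters from noiseless data.
    w = 2 * np.pi * np.logspace(-2, 4, 40)
    z_true = cole_cole(w, 150.0, 0.3, 0.05, 0.4)
    print(fit_cole_cole(w, z_true))    # approximately [150, 0.3, 0.05, 0.4]
    ```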

  10. Oak Ridge Graph Analytics for Medical Innovation (ORiGAMI)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Larry W.; Lee, Sangkeun

    2016-01-01

    In this era of data-driven decisions and discovery where Big Data is producing Bigger Data, data scientists at the Oak Ridge National Laboratory are leveraging unique leadership infrastructure (e.g., Urika XA and Urika GD appliances) to develop scalable algorithms for semantic, logical and statistical reasoning with Big Data (i.e., data stored in databases as well as unstructured data in documents). ORiGAMI is a next-generation knowledge-discovery framework that is: (a) knowledge nurturing (i.e., evolves seamlessly with newer knowledge and data), (b) smart and curious (i.e., using information-foraging and reasoning algorithms to digest content) and (c) synergistic (i.e., interfaces computers with what they do best to help subject-matter-experts do their best). ORiGAMI has been demonstrated using the National Library of Medicine's SEMANTIC MEDLINE (archive of medical knowledge since 1994).

  11. Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.

    PubMed

    Pang, Xufang; Song, Zhan; Xie, Wuyuan

    2013-01-01

    3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More important, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
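
    A simplified sketch of the first stage only, fitting a local quadratic patch around each point (a stand-in for the paper's moving-least-squares paraboloid) and flagging points with large principal curvature as valley/ridge candidates, assuming the point cloud is an (N, 3) NumPy array; the neighborhood size and threshold are illustrative, and the later projection and polyline-growing stages are omitted:

    ```python
    # Flag candidate valley/ridge points from local quadratic fits; separating
    # ridges from valleys additionally needs consistently oriented normals,
    # which is omitted here.
    import numpy as np
    from scipy.spatial import cKDTree

    def candidate_ridge_valley(points, k=20, thresh=0.05):
        tree = cKDTree(points)
        flags = np.zeros(len(points), dtype=bool)
        for i, p in enumerate(points):
            _, idx = tree.query(p, k=k)
            nb = points[idx] - p
            # local frame from PCA of the neighborhood; smallest-variance axis ~ normal
            _, vecs = np.linalg.eigh(np.cov(nb.T))
            u, v, w = nb @ vecs[:, 2], nb @ vecs[:, 1], nb @ vecs[:, 0]
            # fit w = a*u^2 + b*u*v + c*v^2 + d*u + e*v + f
            A = np.column_stack([u * u, u * v, v * v, u, v, np.ones_like(u)])
            coef, *_ = np.linalg.lstsq(A, w, rcond=None)
            H = np.array([[2 * coef[0], coef[1]], [coef[1], 2 * coef[2]]])
            k1, k2 = np.linalg.eigvalsh(H)            # approximate principal curvatures
            flags[i] = max(abs(k1), abs(k2)) > thresh
        return flags
    ```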

  12. New Bottom Roughness Calculation from Multibeam Echo Sounders for Mine Warfare

    DTIC Science & Technology

    2012-09-01

    complex including craters, gullies, seaweed, rocks, sand ridges, tall obstructions, deep holes and sloping regions. Underwater mines can be hidden...and shadows for detecting objects lying on the seafloor. The seafloor is rather complex including craters, gullies, seaweed, rocks, sand ridges, tall...roughness as craters, gullies, seaweed, sand ridges, tall obstructions, deep holes, or steeply sloping regions. Slopes can make it possible for mines to

  13. Retina Image Analysis and Ocular Telehealth: The Oak Ridge National Laboratory-Hamilton Eye Institute Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karnowski, Thomas Paul; Giancardo, Luca; Li, Yaquin

    2013-01-01

    Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, both using data from the telemedicine network and other public databases.

  14. Fracture Analysis of Vessels. Oak Ridge FAVOR, v06.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, P. T.; Dickson, T. L.; Yin, S.

    The current regulations to insure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.

  15. Chemical and Mineralogical Characterization of a Hematite-bearing Ridge on Mauna Kea, Hawaii: A Potential Mineralogical Process Analog for the Mount Sharp Hematite Ridge

    NASA Technical Reports Server (NTRS)

    Graff, T. G.; Morris, R. V.; Ming, D. W.; Hamilton, J. C.; Adams, M.; Fraeman, A. A.; Arvidson, R. E.; Catalano, J. G.; Mertzman, S. A.

    2014-01-01

    The Mars Science Laboratory (MSL) rover Curiosity landed in Gale Crater in August 2012 and is currently roving towards the layered central mound known as Mount Sharp [1]. Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) hyperspectral data indicate Mount Sharp contains a 5 km stratigraphic sequence including Fe-Mg smectites, hematite, and hydrated sulfates in the lower layers separated by an unconformity from the overlying anhydrous strata [1,2,3]. Hematite was initially detected in CRISM data to occur in the lower sulfate layers on the north side of the mound [2]. [3] further mapped a distinct hematite detection occurring as part of a 200 m wide ridge that extends 6.5 km NE-SW, approximately parallel with the base of Mount Sharp. It is likely a target for in-situ analyses by Curiosity. We document here the occurrence of a stratum of hematite-bearing breccia that is exposed on the Puu Poliahu cinder cone near the summit of Mauna Kea volcano (Hawaii) (Fig. 1). The stratum is more resistant to weathering than surrounding material, giving it the appearance of a ridge. The Mauna Kea hematite ridge is thus arguably a potential terrestrial mineralogical and process analog for the Gale Crater hematite ridge. We are acquiring a variety of chemical and mineralogical data on the Mauna Kea samples, with a focus on the chemical and mineralogical information already available or planned for the Gale hematite ridge.

  16. High-Power Vehicle-Towed TEM for Small Ordnance Detection at Depth

    DTIC Science & Technology

    2014-02-01

    Operations, 100A Donner Dr, Oak Ridge, TN 37830 8. PERFORMING ORGANIZATION REPORT NUMBER 9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES) 10...T. Jeffrey Gamey Battelle, 100A Donner Dr., Oak Ridge, TN 37830 (865) 599-0820 gameytj@battelle.org Geophysicist – Principal Investigator...William Doll Battelle, 100A Donner Dr., Oak Ridge, TN 37830 (865) 599-6165 dollw@battelle.org Geophysicist – modeling assessment, data collection, data

  17. 2002 Airborne Geophysical Survey at Pueblo of Isleta Bombing Targets, New Mexico, April 10 May 6, 2002 (Rev 1)

    DTIC Science & Technology

    2005-12-01

    helicopter geophysical survey performed by US Army Engineering Support Center, Huntsville (USAESCH) and Oak Ridge National Laboratory (ORNL) over areas...Array Detection System NAD North American Datum ORAGS Oak Ridge Airborne Geophysical System ORNL Oak Ridge National Laboratory RMS Root...used by ORNL in 1999 for... Figure 2.4 ORAGS-Hammerhead airborne magnetometer system used at Badlands Bombing Range in FY2000

  18. Particle detector spatial resolution

    DOEpatents

    Perez-Mendez, V.

    1992-12-15

    Method and apparatus for producing separated columns of scintillation layer material, for use in detection of X-rays and high energy charged particles with improved spatial resolution is disclosed. A pattern of ridges or projections is formed on one surface of a substrate layer or in a thin polyimide layer, and the scintillation layer is grown at controlled temperature and growth rate on the ridge-containing material. The scintillation material preferentially forms cylinders or columns, separated by gaps conforming to the pattern of ridges, and these columns direct most of the light produced in the scintillation layer along individual columns for subsequent detection in a photodiode layer. The gaps may be filled with a light-absorbing material to further enhance the spatial resolution of the particle detector. 12 figs.

  19. Particle detector spatial resolution

    DOEpatents

    Perez-Mendez, Victor

    1992-01-01

    Method and apparatus for producing separated columns of scintillation layer material, for use in detection of X-rays and high energy charged particles with improved spatial resolution. A pattern of ridges or projections is formed on one surface of a substrate layer or in a thin polyimide layer, and the scintillation layer is grown at controlled temperature and growth rate on the ridge-containing material. The scintillation material preferentially forms cylinders or columns, separated by gaps conforming to the pattern of ridges, and these columns direct most of the light produced in the scintillation layer along individual columns for subsequent detection in a photodiode layer. The gaps may be filled with a light-absorbing material to further enhance the spatial resolution of the particle detector.

  20. A Hessian-based methodology for automatic surface crack detection and classification from pavement images

    NASA Astrophysics Data System (ADS)

    Ghanta, Sindhu; Shahini Shamsabadi, Salar; Dy, Jennifer; Wang, Ming; Birken, Ralf

    2015-04-01

    Around 3 trillion vehicle miles are traveled annually on US transportation systems alone. In addition to road traffic safety, maintaining the road infrastructure in a sound condition promotes a more productive and competitive economy. Due to the significant amounts of financial and human resources required to detect surface cracks by visual inspection, detection of these surface defects is often delayed, resulting in deferred maintenance operations. This paper introduces an automatic system for acquisition, detection, classification, and evaluation of pavement surface cracks by unsupervised analysis of images collected from a camera mounted on the rear of a moving vehicle. A Hessian-based multi-scale filter has been utilized to detect ridges in these images at various scales. Post-processing on the extracted features has been implemented to produce statistics of length, width, and area covered by cracks, which are crucial for roadway agencies to assess pavement quality. This process has been realized on three sets of roads with different pavement conditions in the city of Brockton, MA. A manually labeled ground truth dataset is made available to evaluate this algorithm, and the results showed more than 90% segmentation accuracy, demonstrating the feasibility of employing this approach at a larger scale.
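
    A hedged sketch of a Hessian-based multi-scale ridge response of the kind described, computed from scale-normalized Gaussian second derivatives, is shown below; the scales and threshold are illustrative and not the paper's settings:

    ```python
    # Sketch of a Hessian-based multi-scale ridge detector: at each scale, smooth
    # with a Gaussian, form the Hessian, and keep the response where the dominant
    # eigenvalue indicates a dark, line-like (crack) structure.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hessian_ridge_response(img, sigmas=(1.0, 2.0, 4.0)):
        img = img.astype(np.float64)
        response = np.zeros_like(img)
        for s in sigmas:
            # scale-normalized second derivatives
            Ixx = gaussian_filter(img, s, order=(0, 2)) * s ** 2
            Iyy = gaussian_filter(img, s, order=(2, 0)) * s ** 2
            Ixy = gaussian_filter(img, s, order=(1, 1)) * s ** 2
            # eigenvalues of the 2x2 Hessian at every pixel
            tmp = np.sqrt(((Ixx - Iyy) / 2.0) ** 2 + Ixy ** 2)
            lam1 = (Ixx + Iyy) / 2.0 + tmp      # larger eigenvalue
            # dark cracks on a brighter background give a large positive lam1
            response = np.maximum(response, np.where(lam1 > 0, lam1, 0))
        return response

    def crack_mask(img, thresh=0.5):
        r = hessian_ridge_response(img)
        r = r / (r.max() + 1e-12)
        return r > thresh
    ```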

  1. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    DTIC Science & Technology

    2013-09-01

    Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS) Neil Goldstein, Rainer A...at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field-of-view, < 0.05...objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis based image

  2. A wavelet ridge extraction method employing a novel cost function in two-dimensional wavelet transform profilometry

    NASA Astrophysics Data System (ADS)

    Wang, Jianhua; Yang, Yanxi

    2018-05-01

    We present a new wavelet ridge extraction method employing a novel cost function in two-dimensional wavelet transform profilometry (2-D WTP). First, the maximum value point of the two-dimensional wavelet transform coefficient modulus is extracted, and the local extreme value points above 90% of the maximum value are also obtained; together they constitute the wavelet ridge candidates. Then, the gradient of the rotation factor is introduced into Abid's cost function, and a logarithmic Logistic model is used to adjust and improve the cost function weights so as to obtain a more reasonable value estimation. Finally, a dynamic programming method is used to find the optimal wavelet ridge accurately, and the wrapped phase is obtained by extracting the phase at the ridge. The advantage is that fringe patterns with a low signal-to-noise ratio can be demodulated accurately, with better noise immunity. Meanwhile, only one fringe pattern needs to be projected onto the measured object, so dynamic three-dimensional (3-D) measurement in harsh environments can be realized. Computer simulation and experimental results show that, for fringe patterns with noise pollution, the 3-D surface recovery accuracy of the proposed algorithm is increased. In addition, the demodulation phase accuracy of the Morlet, Fan, and Cauchy mother wavelets is compared.
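
    The dynamic-programming step can be illustrated with a generic ridge search over the wavelet coefficient modulus: each position picks the scale that maximizes accumulated modulus minus a jump penalty, followed by backtracking. The cost below omits the rotation-factor gradient and logistic weighting of the paper and uses an illustrative penalty value:

    ```python
    # Generic dynamic-programming ridge extraction over a wavelet-coefficient
    # modulus map; `penalty` trades ridge amplitude against smoothness.
    import numpy as np

    def dp_ridge(modulus, penalty=0.1):
        """modulus: (n_scale, n_pos) array; returns the optimal scale index per position."""
        n_scale, n_pos = modulus.shape
        score = np.full((n_scale, n_pos), -np.inf)
        back = np.zeros((n_scale, n_pos), dtype=int)
        score[:, 0] = modulus[:, 0]
        for t in range(1, n_pos):
            for s in range(n_scale):
                jump_cost = penalty * np.abs(np.arange(n_scale) - s)
                cand = score[:, t - 1] - jump_cost
                back[s, t] = int(np.argmax(cand))
                score[s, t] = modulus[s, t] + cand[back[s, t]]
        # backtrack from the best final node
        ridge = np.zeros(n_pos, dtype=int)
        ridge[-1] = int(np.argmax(score[:, -1]))
        for t in range(n_pos - 1, 0, -1):
            ridge[t - 1] = back[ridge[t], t]
        return ridge
    ```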

  3. Temporal and spatial patterns in vegetation and atmospheric properties from AVIRIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, D.A.; Green, R.O.; Adams, J.B.

    1997-12-01

    Little research has focused on the use of imaging spectrometry for change detection. In this paper, the authors apply Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data to the monitoring of seasonal changes in atmospheric water vapor, liquid water, and surface cover in the vicinity of Jasper Ridge, CA, for three dates in 1992. Apparent surface reflectance was retrieved and water vapor and liquid water mapped by using a radiative-transfer-based inversion that accounts for spatially variable atmospheres. Spectral mixture analysis (SMA) was used to model reflectance data as mixtures of green vegetation (GV), nonphotosynthetic vegetation (NPV), soil, and shade. Temporal and spatial patterns in endmember fractions and liquid water were compared to the normalized difference vegetation index (NDVI). The reflectance retrieval algorithm was tested by using a temporally invariant target.

  4. Spatial Variability of Barrow-Area Shore-Fast Sea Ice and Its Relationships to Passive Microwave Emissivity

    NASA Technical Reports Server (NTRS)

    Maslanik, J. A.; Rivas, M. Belmonte; Holmgren, J.; Gasiewski, A. J.; Heinrichs, J. F.; Stroeve, J. C.; Klein, M.; Markus, T.; Perovich, D. K.; Sonntag, J. G.; hide

    2006-01-01

    Aircraft-acquired passive microwave data, laser radar height observations, RADARSAT synthetic aperture radar imagery, and in situ measurements obtained during the AMSR-Ice03 experiment are used to investigate relationships between microwave emission and ice characteristics over several space scales. The data fusion allows delineation of the shore-fast ice and pack ice in the Barrow area, AK, into several ice classes. Results show good agreement between observed and Polarimetric Scanning Radiometer (PSR)-derived snow depths over relatively smooth ice, with larger differences over ridged and rubbled ice. The PSR results are consistent with the effects on snow depth of the spatial distribution and nature of ice roughness, ridging, and other factors such as ice age. Apparent relationships exist between ice roughness and the degree of depolarization of emission at 10, 19, and 37 GHz. This depolarization would yield overestimates of total ice concentration using polarization-based algorithms, with indications of this seen when the NT-2 algorithm is applied to the PSR data. Other characteristics of the microwave data, such as effects of grounding of sea ice and large contrast between sea ice and adjacent land, are also apparent in the PSR data. Overall, the results further demonstrate the importance of macroscale ice roughness conditions such as ridging and rubbling on snow depth and microwave emissivity.

  5. Segmentation Control on Crustal Accretion: Insights From the Chile Ridge

    NASA Astrophysics Data System (ADS)

    Martinez, F.; Karsten, J. L.; Milman, M. S.; Klein, E. M.

    2002-12-01

    Controls on crustal accretion at mid-ocean ridges include spreading rate and mantle temperature and composition. Less studied is the effect of the segmentation geometry, although it has been known for some time that large offset transforms have significant effects on the extent of melting and lava compositions produced by ridges in their vicinity. The PANORAMA 4 expedition surveyed the Chile Ridge between 36°-43°S in order to examine the effects of ridge segmentation on crustal accretion. This section of the ridge is spreading uniformly at intermediate rates (~53 mm/yr) and rock sampling and regional data indicate a largely uniform mantle composition with no systematic changes in mantle thermal structure. Thus the segmentation geometry is the primary crustal accretion variable. The survey mapped and sampled 19 first order ridge segments and their transform offsets. The ridges range from 130 to 10 km in length with mapped transform offsets from 168 to 19 km. The segments primarily have axial valley morphology, with segments longer than ~65 km typically displaying central highs deepening toward segment ends. Mantle Bouguer anomalies (MBAs) show that these segments also have bull's-eye lows associated with the central highs indicating thicker crust than at segment ends. Overall, the mapped segments display a trend of increasing depth and MBA, implying diminishing crustal production, with decreasing segment length and increasing transform offset. We examine the cause of this trend by modeling the mantle flow pattern generated by finite length ridge segments using the Phipps-Morgan and Forsyth (1988) algorithm. The results indicate that at a constant spreading rate mantle upwelling rates are greatest and extend deeper near the segment center, and that for segments that are significantly offset, upwelling rates decrease overall with decreasing segment length. The modeling implies that segmentation itself, even without cooling and lithospheric relief at transforms, has a strong influence on mantle advection and therefore on crustal production.

  6. A global reference model of Moho depths based on WGM2012

    NASA Astrophysics Data System (ADS)

    Zhou, D.; Li, C.

    2017-12-01

    The crust-mantle boundary (Moho discontinuity) represents the largest density contrast in the lithosphere, which can be detected by the Bouguer gravity anomaly. We present our recent inversion of global Moho depths from the World Gravity Map 2012. Because oceanic lithosphere increases in density as it cools, we perform a thermal correction based on the plate cooling model. We adopt a temperature Tm = 1300°C at the bottom of the lithosphere. The plate thickness is tested in 5 km steps from 90 to 140 km, and is taken as 130 km, which gives a best-fit crustal thickness constrained by seismic crustal thickness profiles. We obtain the residual Bouguer gravity anomalies by subtracting the thermal correction from WGM2012, and then estimate Moho depths based on the Parker-Oldenburg algorithm. Taking the global model Crust1.0 as an a priori constraint, we adopt Moho density contrasts of 0.43 and 0.4 g/cm3, and initial mean Moho depths of 37 and 20 km in the continental and oceanic domains, respectively. The number of iterations in the inversion is set to 150, which is large enough to obtain an error lower than a pre-assigned convergence criterion. The estimated Moho depths range between 0 and 76 km, and average 36 and 15 km in the continental and oceanic domains, respectively. Our results correlate very well with Crust1.0, with differences mostly within ±5.0 km. Compared to the low resolution of Crust1.0 in the oceanic domain, our results have a much larger depth range reflecting diverse structures such as ridges, seamounts, volcanic chains and subduction zones. Based on this model, we find that young (<5 Ma) oceanic crust thicknesses show a dependence on spreading rates: (1) from ultraslow (<4 mm/yr) to slow (4-45 mm/yr) spreading ridges, the thickness increases dramatically; (2) from slow to fast (45-95 mm/yr) spreading ridges, the thickness decreases slightly; (3) for super-fast ridges (>95 mm/yr) we observe relatively thicker crust. Conductive cooling of the lithosphere may limit melting of the mantle at ultraslow spreading centers. Lower mantle temperatures, indicated by deeper Curie depths at slow and fast spreading ridges, may decrease the volume of magmatism and the crustal thickness. This new global model of gravity-derived Moho depth, combined with geochemical and Curie point depth data, can be used to investigate the thermal evolution of the lithosphere.
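
    As a rough, first-order illustration of the gravity-to-Moho step (the full Parker-Oldenburg scheme iterates over higher-order terms and applies a careful low-pass filter, neither of which is reproduced here), one can downward-continue the residual anomaly to the mean interface depth and scale by 2πGΔρ; the grid spacing, z0, Δρ, the crude wavenumber taper, and the sign convention below are all illustrative assumptions:

    ```python
    # First-order sketch in the spirit of Parker-Oldenburg: downward-continue the
    # residual Bouguer anomaly to the mean interface depth z0 and scale by
    # 2*pi*G*drho. z0 and drho echo the continental values quoted above but are
    # used here purely for illustration.
    import numpy as np

    G = 6.674e-11                        # m^3 kg^-1 s^-2

    def moho_first_order(dg, dx, dy, z0=37e3, drho=430.0):
        """dg: residual anomaly grid in m/s^2; returns interface undulation (m) about z0."""
        ny, nx = dg.shape
        kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
        k = np.hypot(*np.meshgrid(kx, ky))
        k[0, 0] = 1e-12                  # avoid issues at the zero wavenumber
        H = -np.fft.fft2(dg) * np.exp(k * z0) / (2 * np.pi * G * drho)
        H[k * z0 > 15] = 0.0             # crude taper: drop strongly amplified wavenumbers
        return np.real(np.fft.ifft2(H))
    ```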

  7. Adaptive OFDM Waveform Design for Spatio-Temporal-Sparsity Exploited STAP Radar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Satyabrata

    In this chapter, we describe a sparsity-based space-time adaptive processing (STAP) algorithm to detect a slowly moving target using an orthogonal frequency division multiplexing (OFDM) radar. The motivation for employing an OFDM signal is that it improves target detectability in the presence of interfering signals by increasing the frequency diversity of the system. However, due to the addition of one extra dimension in terms of frequency, the adaptive degrees-of-freedom in an OFDM-STAP also increase. Therefore, to avoid the construction of a fully adaptive OFDM-STAP, we develop a sparsity-based STAP algorithm. We observe that the interference spectrum is inherently sparse in the spatio-temporal domain, as the clutter responses occupy only a diagonal ridge on the spatio-temporal plane and the jammer signals interfere only from a few spatial directions. Hence, we exploit that sparsity to develop an efficient STAP technique that utilizes a considerably smaller amount of secondary data compared to other existing STAP techniques, and produces nearly optimum STAP performance. In addition to designing the STAP filter, we optimally design the transmit OFDM signals by maximizing the output signal-to-interference-plus-noise ratio (SINR) in order to improve the STAP performance. The computation of output SINR depends on the estimated value of the interference covariance matrix, which we obtain by applying the sparse recovery algorithm. Therefore, we analytically assess the effects of the synthesized OFDM coefficients on the sparse recovery of the interference covariance matrix by computing the coherence measure of the sparse measurement matrix. Our numerical examples demonstrate the STAP performance achieved with the sparsity-based technique and adaptive waveform design.

  8. Morphology of sea ice pressure ridges in the northwestern Weddell Sea in winter

    NASA Astrophysics Data System (ADS)

    Tan, Bing; Li, Zhi-Jun; Lu, Peng; Haas, Christian; Nicolaus, Marcel

    2012-06-01

    To investigate the morphology and distribution of pressure ridges in the northwestern Weddell Sea, ice surface elevation profiles were measured by a helicopter-borne laser altimeter during Winter Weddell Outflow Study with the German R/V Polarstern in 2006. An optimal cutoff height of 0.62 m, derived from the best fits between the measured and theoretical ridge height and spacing distributions, was first used to separate pressure ridges from other sea ice surface undulations. It was found that the measured ridge height distribution was well modeled by a negative exponential function, and the ridge spacing distribution by a lognormal function. Next, based on the ridging intensity Ri (the ratio of mean ridge sail height to mean spacing), all profiles were clustered into three regimes by an improved k-means clustering algorithm: Ri ≤ 0.01, 0.01 < Ri ≤ 0.026, and Ri > 0.026 (denoted as C1, C2, and C3 respectively). Mean (and standard deviation) of sail height was 0.99 (±0.07) m in Regime C1, 1.12 (±0.06) m in C2, and 1.17 (±0.04) m in C3, respectively, while the mean spacings (and standard deviations) were 232 (±240) m, 54 (±20) m, and 31 (±5.6) m. These three ice regimes coincided closely with distinct sea ice regions identified in a satellite radar image, where C1 corresponded to the broken ice in the marginal ice zone and level ice formed in the Larsen Polynya, C2 corresponded to the deformed first- and second-year ice formed by dynamic action in the center of the study region, and C3 corresponded to heavily deformed ice in the outflowing branch of the Weddell Gyre. The results of our analysis showed that the relationship between the mean ridge height and frequency was well modeled by a logarithmic function with a correlation coefficient of 0.8, although such correlation was weaker when considering each regime individually. The measured ridge height and frequency were both greater than those reported by others for the Ross Sea. Compared with reported values for other parts of the Antarctic, the present ridge heights were greater, but the ridge frequencies and ridging intensities were smaller than the most extreme of them. Meanwhile, average thickness of ridged ice in our study region was significantly larger than that of the Coastal Ross Sea showing the importance of deformation and ice age for ice conditions in the northwestern Weddell Sea.
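
    The two distribution fits mentioned above (a negative exponential for sail heights above the cutoff and a lognormal for spacings) are straightforward with SciPy; the synthetic samples below are illustrative, not the survey data:

    ```python
    # Fit an exponential to sail-height exceedances above the cutoff and a
    # lognormal to ridge spacings; the samples here are synthetic placeholders.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    cutoff = 0.62                                             # m, cutoff used in the study
    heights = cutoff + rng.exponential(scale=0.4, size=500)   # synthetic sail heights (m)
    spacings = rng.lognormal(mean=4.0, sigma=1.0, size=500)   # synthetic spacings (m)

    # Exponential fit of the exceedance above the cutoff (maximum-likelihood rate)
    lam = 1.0 / (heights - cutoff).mean()

    # Lognormal fit of spacings (shape = sigma of log, scale = exp(mu))
    shape, loc, scale = stats.lognorm.fit(spacings, floc=0)

    print(f"exponential rate: {lam:.2f} 1/m, lognormal mu: {np.log(scale):.2f}, sigma: {shape:.2f}")
    ```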

  9. An effective one-dimensional anisotropic fingerprint enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Ye, Zhendong; Xie, Mei

    2012-01-01

    Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the ridge orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.
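
    A condensed sketch of the pipeline described, normalization, block-wise orientation estimation from the structure tensor, and oriented Gabor filtering, is given below; block size, Gabor wavelength, and sigma are illustrative, and the anisotropic-filter combination stage of the paper is omitted:

    ```python
    # Sketch: normalization + block-wise structure-tensor orientation + Gabor
    # filtering. Parameters are illustrative, not the authors' settings.
    import cv2
    import numpy as np

    def normalize(img, mean0=100.0, var0=100.0):
        """Shift and scale the image to a prescribed mean and variance."""
        img = img.astype(np.float64)
        return mean0 + (img - img.mean()) * np.sqrt(var0 / (img.var() + 1e-12))

    def block_gradient_dirs(img, block=16):
        """Dominant gradient direction per block (perpendicular to the local ridges)."""
        gx = cv2.Sobel(img, cv2.CV_64F, 1, 0)
        gy = cv2.Sobel(img, cv2.CV_64F, 0, 1)
        h, w = img.shape
        theta = np.zeros((h // block, w // block))
        for i in range(theta.shape[0]):
            for j in range(theta.shape[1]):
                sl = np.s_[i * block:(i + 1) * block, j * block:(j + 1) * block]
                gxx, gyy = (gx[sl] ** 2).sum(), (gy[sl] ** 2).sum()
                gxy = (gx[sl] * gy[sl]).sum()
                theta[i, j] = 0.5 * np.arctan2(2 * gxy, gxx - gyy)
        return theta

    def gabor_enhance(img, block=16, wavelength=10.0, sigma=4.0):
        """Filter each block with a Gabor kernel oriented across the local ridges."""
        img = normalize(img)
        theta = block_gradient_dirs(img, block)
        out = np.zeros_like(img)
        for i in range(theta.shape[0]):
            for j in range(theta.shape[1]):
                # the kernel's carrier varies along theta, so its stripes follow the ridges
                kern = cv2.getGaborKernel((21, 21), sigma, theta[i, j], wavelength, 0.5, 0)
                sl = np.s_[i * block:(i + 1) * block, j * block:(j + 1) * block]
                out[sl] = cv2.filter2D(img, cv2.CV_64F, kern)[sl]   # simple but not fast
        return out
    ```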

  10. An effective one-dimensional anisotropic fingerprint enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Ye, Zhendong; Xie, Mei

    2011-12-01

    Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the ridge orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.

  11. Results from a 14-month hydroacoustic monitoring of the three mid-oceanic ridges in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Royer, J.-Y.; Dziak, R. P.; Delatre, M.; Chateau, R.; Brachet, C.; Haxel, J. H.; Matsumoto, H.; Goslin, J.; Brandon, V.; Bohnenstielh, D. R.

    2009-04-01

    From October 2006 to January 2008, a hydroacoustic experiment in the Indian Ocean was carried out by the CNRS/University of Brest and NOAA/Oregon State University to monitor the low-level seismic activity associated with the three contrasting spreading ridges and deforming zones in the Indian Ocean. Three autonomous hydrophones were moored in the SOFAR channel by R/V Marion Dufresne for 14 months in the Madagascar Basin, and northeast and southwest of Amsterdam Island, complementing the two permanent hydroacoustic stations of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) located near Diego Garcia Island and off Cape Leeuwin. The three instruments successfully collected 14 months of continuous acoustic records. Combined with the records from the permanent stations, the array detected 1780 acoustic events consisting mostly of earthquake-generated T-waves, but also of iceberg tremors from Wilkes Land, Antarctica. Within the triangle defined by the temporary array, the three ridges exhibit contrasting seismicity patterns. Along the Southeast Indian Ridge (SEIR), the 272 acoustic events (vs 24 events in the NEIC catalog) occur predominantly along the transform faults; only one ridge segment (76˚E) displays continuous activity for 10 months. Along the Central Indian Ridge (CIR), seismicity is distributed along fracture zones and ridge segments (269 events vs 45 NEIC events), with two clusters of events near the triple junction (24-25S) and south of Marie-Celeste FZ (18.5S). Along the Southwest Indian Ridge (SWIR), the 222 events (vs 31 NEIC events) are distributed along the ridge segments with a larger number of events west of Melville FZ and a cluster at 58E. The immediate vicinity of the Rodrigues triple junction shows periods of quiescence and of intense activity. Some large earthquakes (Mb>5) near the triple junction (SEIR and CIR) seem to be preceded by several acoustic events that may be precursors. Finally, off-ridge seismicity is mostly detected in the southern part of the Central Indian Basin as a result of the intraplate deformation between the Capricorn and Australian plates. Other signals of interest are identified such as a 6-week-long series of broadband (1-125 Hz) explosive signals detected only by the instrument located between Kerguelen and Amsterdam islands, many cryogenic tremors easily recognizable from their varying tones and harmonics, some of which can be precisely located off the Antarctic shelf, and finally whale calls attributed to four different whale species. This vocal activity is found to be highly seasonal, occurring mainly from April to October with subspecies variations. Detailed analyses of this unique data set are still underway.

  12. Water Quality and Evaluation of Pesticides in Lakes in the Ridge Citrus Region of Central Florida

    USGS Publications Warehouse

    Choquette, Anne F.; Kroening, Sharon E.

    2009-01-01

    Water chemistry, including major inorganic constituents, nutrients, and pesticide compounds, was compared between seven lakes surrounded by citrus agriculture and an undeveloped lake on the Lake Wales Ridge (herein referred to as the Ridge) in central Florida. The region has been recognized for its vulnerability to the leaching of agricultural chemicals into the subsurface due to factors including soils, climate, and land use. About 40 percent of Florida's citrus cultivation occurs in 'ridge citrus' areas characterized by sandy well drained soils, with the remainder in 'flatwoods citrus' characterized by high water tables and poorly drained soils. The lakes on the Ridge are typically flow-through lakes that exchange water with adjacent and underlying aquifer systems. This study is the first to evaluate the occurrence of pesticides in lakes on the Ridge, and also represents one of the first monitoring efforts nationally to focus on regional-scale assessment of current-use pesticides in small- to moderate-sized lakes (5 to 393 acres). The samples were collected between December 2003 and September 2005. The lakes in citrus areas contained elevated concentrations of major inorganic constituents (including alkalinity, total dissolved solids, calcium, magnesium, sodium, potassium, chloride, and sulfate), total nitrogen, pH, and pesticides compared to the undeveloped lake. Nitrate (as N) and total nitrogen concentrations were typically elevated in the citrus lakes, with maximum values of 4.70 and 5.19 mg/L (milligrams per liter), respectively. Elevated concentrations of potassium, nitrate, and other inorganic constituents in the citrus lakes likely reflect inputs from the surficial ground-water system that originated predominantly from agricultural fertilizers, soil amendments, and inorganic pesticides. A total of 20 pesticide compounds were detected in the lakes, of which 12 compounds exceeded the standardized reporting level of 0.06 ug/L (microgram per liter). Those most frequently detected above the 0.06-ug/L level were aldicarb sulfoxide, diuron, simazine degradates hydroxysimazine and didealkylatrazine (DDA), bromacil, norflurazon, and demethyl norflurazon which occurred at detection rates ranging from 25 to 86 percent of samples, respectively. Typically, pesticide concentrations in the lake samples were less than 1 microgram per liter. The number of targeted pesticide compounds detected per lake in the citrus areas ranged from 9 to 14 compared to 3 compounds detected at trace levels in the undeveloped lake. Consistent detections of parents and degradates in quarterly samples indicated the presence of pesticide compounds in the lakes many months or years (for example, bromacil) after their application, signaling the persistence of some pesticide compounds in the lakes and/or ground-water systems. Pesticide degradate concentrations frequently exceeded parent concentrations in the lakes. This study was the first in the Ridge citrus region to analyze for glyphosate - widely used in citrus - and its degradate aminomethylphosphonic acid (AMPA), neither of which were detected, as well as a number of triazine degradates, including hydroxysimazine, which were detected. The lake pesticide concentrations did not exceed current Federal aquatic-life benchmarks, available for 10 of the 20 detected pesticide compounds. 
    In a few instances, bromacil, diuron, or norflurazon concentrations in one or two of the lakes were within about 10 to 90 percent of benchmark guidelines for acute effects on nonvascular aquatic plants. The lake pesticide concentrations for several targeted pesticides were relatively high compared to corresponding national stream-water percentiles, which is consistent with this region's vulnerability to pesticide leaching into water resources. Several factors were evaluated to gain insight into the processes controlling pesticide transport and fate, and to assess their utility for estimating th

  13. III/V nano ridge structures for optical applications on patterned 300 mm silicon substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunert, B.; Guo, W.; Mols, Y.

    We report on an integration approach of III/V nano ridges on patterned silicon (Si) wafers by metal organic vapor phase epitaxy (MOVPE). Trenches of different widths (≤500 nm) were processed in a silicon oxide (SiO₂) layer on top of a 300 mm (001) Si substrate. The MOVPE growth conditions were chosen in a way to guarantee an efficient defect trapping within narrow trenches and to form a box-shaped ridge with increased III/V volume when growing out of the trench. Compressively strained InGaAs/GaAs multi-quantum wells with 19% indium were deposited on top of the fully relaxed GaAs ridges as an active material for optical applications. Transmission electron microscopy investigation shows that very flat quantum well (QW) interfaces were realized. A clear defect trapping inside the trenches is observed, whereas the ridge material is free of threading dislocations with only a very low density of planar defects. Pronounced QW photoluminescence (PL) is detected from different ridge sizes at room temperature. The potential of these III/V nano ridges for laser integration on Si substrates is emphasized by the achieved ridge volume, which could enable wave guidance, and by the high crystal quality in line with the distinct PL.

  14. Geochemical and Visual Indicators of Hydrothermal Fluid Flow through a Sediment-Hosted Volcanic Ridge in the Central Bransfield Basin (Antarctica)

    PubMed Central

    Aquilina, Alfred; Connelly, Douglas P.; Copley, Jon T.; Green, Darryl R. H.; Hawkes, Jeffrey A.; Hepburn, Laura E.; Huvenne, Veerle A. I.; Marsh, Leigh; Mills, Rachel A.; Tyler, Paul A.

    2013-01-01

    In the austral summer of 2011 we undertook an investigation of three volcanic highs in the Central Bransfield Basin, Antarctica, in search of hydrothermal activity and associated fauna to assess changes since previous surveys and to evaluate the extent of hydrothermalism in this basin. At Hook Ridge, a submarine volcanic edifice at the eastern end of the basin, anomalies in water column redox potential (Eh) were detected close to the seafloor, unaccompanied by temperature or turbidity anomalies, indicating low-temperature hydrothermal discharge. Seepage was manifested as shimmering water emanating from the sediment and from mineralised structures on the seafloor; recognisable vent endemic fauna were not observed. Pore fluids extracted from Hook Ridge sediment were depleted in chloride, sulfate and magnesium by up to 8% relative to seawater, enriched in lithium, boron and calcium, and had a distinct strontium isotope composition (87Sr/86Sr  = 0.708776 at core base) compared with modern seawater (87Sr/86Sr ≈0.70918), indicating advection of hydrothermal fluid through sediment at this site. Biogeochemical zonation of redox active species implies significant moderation of the hydrothermal fluid with in situ diagenetic processes. At Middle Sister, the central ridge of the Three Sisters complex located about 100 km southwest of Hook Ridge, small water column Eh anomalies were detected but visual observations of the seafloor and pore fluid profiles provided no evidence of active hydrothermal circulation. At The Axe, located about 50 km southwest of Three Sisters, no water column anomalies in Eh, temperature or turbidity were detected. These observations demonstrate that the temperature anomalies observed in previous surveys are episodic features, and suggest that hydrothermal circulation in the Bransfield Strait is ephemeral in nature and therefore may not support vent biota. PMID:23359806

  15. Porous silicon micro-resonator implemented by standard photolithography process for sensing application

    NASA Astrophysics Data System (ADS)

    Girault, P.; Azuelos, P.; Lorrain, N.; Poffo, L.; Lemaitre, J.; Pirasteh, P.; Hardy, I.; Thual, M.; Guendouz, M.; Charrier, J.

    2017-10-01

    A micro-resonator based on porous silicon ridge waveguides is implemented by a large-scale standard photolithography process to obtain a low-cost and sensitive sensor based on a volume detection principle instead of the evanescent one usually used. The porous nature of the ridge waveguides allows target molecules to infiltrate the core and to be detected by direct interaction with the propagated light. A racetrack resonator with a radius of 100 μm and a coupling length of 70 μm is optically characterized for the volume detection of different concentrations of glucose. A high sensitivity of 560 nm/RIU is reached with only one micro-resonator, and a limit of detection of 8 × 10⁻⁵ RIU, equivalent to a glucose concentration of 0.7 g/L, is obtained.

  16. [Study on objectively evaluating skin aging according to areas of skin texture].

    PubMed

    Shan, Gaixin; Gan, Ping; He, Ling; Sun, Lu; Li, Qiannan; Jiang, Zheng; He, Xiangqian

    2015-02-01

    Skin aging principles play important roles in skin disease diagnosis, the evaluation of skin cosmetic effect, forensic identification, age identification in sports competition, etc. This paper proposes a new method to evaluate skin aging objectively and quantitatively by skin texture area. First, an enlarged skin image was acquired. Then, the skin texture image was segmented by using the iterative threshold method, and the skin ridge image was extracted according to the watershed algorithm. Finally, the skin ridge areas of the skin texture were extracted. The experimental data showed that the average areas of skin ridges, for both men and women, had a good correlation with age (r = 0.938 for men and r = 0.922 for women), and the regression of skin texture area against age showed that the area increases with age. Therefore, it is effective to evaluate skin aging objectively by the new method presented in this paper.
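
    As an illustration of the pipeline described in this record (iterative thresholding followed by watershed-based ridge extraction and area measurement), the following is a minimal sketch using scikit-image. The isodata threshold, the marker rule, and the pixel_area_mm2 parameter are illustrative assumptions rather than the authors' implementation, and it assumes ridges appear brighter than furrows in the close-up image.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import filters, measure, segmentation

      def mean_ridge_area(skin_image, pixel_area_mm2=1.0):
          # Iterative (isodata) threshold separates ridges from furrows
          # (assumes ridges are the brighter phase in the image).
          thresh = filters.threshold_isodata(skin_image)
          ridges = skin_image > thresh
          # Watershed on the distance transform splits touching ridge regions.
          distance = ndi.distance_transform_edt(ridges)
          markers, _ = ndi.label(distance > 0.5 * distance.max())
          labels = segmentation.watershed(-distance, markers, mask=ridges)
          # Mean area of the segmented ridge regions, in physical units.
          areas = [r.area * pixel_area_mm2 for r in measure.regionprops(labels)]
          return float(np.mean(areas)) if areas else 0.0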

  17. HYFLUX: Satellite Exploration of Natural Hydrocarbon Seeps and Discovery of a Methane Hydrate Mound at GC600

    NASA Astrophysics Data System (ADS)

    Garcia-Pineda, O. G.; MacDonald, I. R.; Shedd, W.; Zimmer, B.

    2009-12-01

    Analysis of natural hydrocarbon seeps is important to improve our understanding of methane flux from deeper sediments to the water column. In order to quantify natural hydrocarbon seep formations in the Northern Gulf of Mexico, a set of 686 Synthetic Aperture Radar (SAR) images was analyzed using the Texture Classifying Neural Network Algorithm (TCNNA), which processes SAR data to delineate oil slicks. This analysis resulted in a characterization of 396 natural seep sites distributed in the northern GOM. Within these sites, a maximum of 1248 individual vents were identified. Oil reaching the sea-surface is deflected from its source during transit through the water column. This presentation describes a method for estimating locations of active oil vents based on repeated slick detection in SAR. One of the most active seep formations was detected in MMS lease block GC600. A total of 82 SAR scenes (collected by RADARSAT-1 from 1995 to 2007) were processed, covering this region. Using TCNNA, the area covered by each slick was computed and Oil Slick Origins (OSO) were selected as single points within detected oil slicks. At this site, oil slick signatures had lengths of up to 74 km and areas of up to 27 km². Using SAR and TCNNA, four active vents were identified in this seep formation. The geostatistical mean centroid among all detections indicated a location along a ridge-line at ~1,200 m. Sea-truth observations with an ROV confirmed that the estimated location of vents had a maximum offset of ~30 m from their actual locations on the seafloor. At the largest vent, a 3-m high, 12-m long mound of oil-saturated gas hydrate was observed. The outcrop contained thousands of ice worms and numerous semi-rigid chimneys from which oily bubbles were escaping in a continuous stream. Three additional vents were found along the ridge; these had lower apparent flow, but were also plugged with gas hydrate mounds. These results support the use of SAR data for precise delineation of active seep formations and shallow gas hydrate deposits.

  18. Evidence of recent volcanic activity on the ultraslow-spreading Gakkel ridge.

    PubMed

    Edwards, M H; Kurras, G J; Tolstoy, M; Bohnenstiehl, D R; Coakley, B J; Cochran, J R

    2001-02-15

    Seafloor spreading is accommodated by volcanic and tectonic processes along the global mid-ocean ridge system. As spreading rate decreases, the influence of volcanism also decreases, and it is unknown whether significant volcanism occurs at all at ultraslow spreading rates (<1.5 cm yr⁻¹). Here we present three-dimensional sonar maps of the Gakkel ridge, Earth's slowest-spreading mid-ocean ridge, located in the Arctic basin under the Arctic Ocean ice canopy. We acquired these data using hull-mounted sonars attached to a nuclear-powered submarine, the USS Hawkbill. Sidescan data for the ultraslow-spreading (approximately 1.0 cm yr⁻¹) eastern Gakkel ridge depict two young volcanoes covering approximately 720 km² of an otherwise heavily sedimented axial valley. The western volcano coincides with the average location of epicentres for more than 250 teleseismic events detected in 1999, suggesting that an axial eruption was imaged shortly after its occurrence. These findings demonstrate that eruptions along the ultraslow-spreading Gakkel ridge are focused at discrete locations and appear to be more voluminous and occur more frequently than was previously thought.

  19. Predicting Microstegium vimineum invasion in natural plant communities of the southern Blue Ridge Mountains, USA

    Treesearch

    Dean P. Anderson; Monica G. Turner; Scott M. Pearson; Thomas P. Albright; Robert K. Peet; Ann Wieben

    2012-01-01

    Shade-tolerant non-native invasive plant species may make deep incursions into natural plant communities, but detecting such species is challenging because occurrences are often sparse. We developed Bayesian models of the distribution of Microstegium vimineum in natural plant communities of the southern Blue Ridge Mountains, USA to address three objectives: (1) to...

  20. REPORT FOR COMMERCIAL GRADE NICKEL CHARACTERIZATION AND BENCHMARKING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-12-20

    Oak Ridge Associated Universities (ORAU), under the Oak Ridge Institute for Science and Education (ORISE) contract, has completed the collection, sample analysis, and review of analytical results to benchmark the concentrations of gross alpha-emitting radionuclides, gross beta-emitting radionuclides, and technetium-99 in commercial grade nickel. This report presents methods, change management, observations, and statistical analysis of materials procured from sellers representing nine countries on four continents. The data suggest there is a low probability of detecting alpha- and beta-emitting radionuclides in commercial nickel. Technetium-99 was not detected in any samples, thus suggesting it is not present in commercial nickel.

  1. Polygonal ridge networks on Mars: Diversity of morphologies and the special case of the Eastern Medusae Fossae Formation

    NASA Astrophysics Data System (ADS)

    Kerber, Laura; Dickson, James L.; Head, James W.; Grosfils, Eric B.

    2017-01-01

    Polygonal ridge networks, also known as boxwork or reticulate ridges, are found in numerous locations and geological contexts across Mars. Distinguishing the morphologies and geological context of the ridge networks sheds light on their potential as astrobiological and mineral resource sites of interest. The most widespread type of ridge morphology is characteristic of the Nili Fossae and Nilosyrtis region and consists of thin, criss-crossing ridges with a variety of heights, widths, and intersection angles. They are found in ancient Noachian terrains at a variety of altitudes (between -2500 and 2200 m) and geographic locations and are likely to be chemically altered fracture planes or mineral veins. They occur in the same general areas as valley networks and ancient lake basins, but they are not more numerous where these water-related features are concentrated, and can appear in places where these morphologies are absent. Similarly, some of the ridge networks are located near hydrated mineral detections, but there is not a one-to-one correlation. Smaller, light-toned ridges of variable widths have been found in Gale Crater and other rover sites and are interpreted to be smaller versions of the Nili-like ridges, mostly formed by the mineralization of fractures. This type of ridge is likely to be found in many other places on Mars as more high-resolution data become available. Sinus Meridiani contains many flat-topped ridges arranged into quasi-circular patterns. The ridges are eroding from a clay-rich unit, and could be formed by a similar process as the Nili-type ridges, but at a much larger scale and controlled by fractures made through a different process. Hellas Basin is host to a fourth type of ridge morphology consisting of large, thick, light-toned ridges forming regular polygons at several superimposed scales. While still enigmatic, these are most likely to be the result of sediment-filled fractures. The Eastern Medusae Fossae Formation contains large swaths of a fifth, previously undocumented, ridge network type. The dark ridges, reaching up to 50 m in height, enclose regular polygons and erode into dark boulders. These ridge networks are interpreted to form as a result of lava flow embayment of deeply fractured Medusae Fossae Formation outcrops.

  2. Probabilistic 3D data fusion for multiresolution surface generation

    NASA Technical Reports Server (NTRS)

    Manduchi, R.; Johnson, A. E.

    2002-01-01

    In this paper we present an algorithm for adaptive resolution integration of 3D data collected from multiple distributed sensors. The input to the algorithm is a set of 3D surface points and associated sensor models. Using a probabilistic rule, a surface probability function is generated that represents the probability that a particular volume of space contains the surface. The surface probability function is represented using an octree data structure; regions of space with samples of large covariance are stored at a coarser level than regions of space containing samples with smaller covariance. The algorithm outputs an adaptive resolution surface generated by connecting points that lie on the ridge of surface probability with triangles scaled to match the local discretization of space. We present results from 3D data generated by scanning lidar and structure from motion.
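
    The covariance-adaptive octree idea in this record can be sketched compactly: each 3D sample is stored at the octree depth whose cell size is comparable to the sample's positional uncertainty, and cells accumulate surface-probability mass. This is a simplified illustration with invented names (octree_level, accumulate, root_size), not the authors' algorithm.

      import numpy as np

      def octree_level(sigma, root_size, max_depth=10):
          # Cell size halves at each depth (root_size / 2**level); pick the depth
          # whose cell size best matches the sample's standard deviation.
          level = int(np.floor(np.log2(root_size / max(sigma, 1e-9))))
          return int(np.clip(level, 0, max_depth))

      def accumulate(points, sigmas, root_size=10.0):
          # Each sample votes for "surface present" in a cell whose size matches
          # its uncertainty; a full sensor model would spread Gaussian weights.
          cells = {}
          for p, s in zip(points, sigmas):
              lvl = octree_level(s, root_size)
              cell = root_size / 2 ** lvl
              key = (lvl,) + tuple(np.floor(p / cell).astype(int))
              cells[key] = cells.get(key, 0.0) + 1.0
          return cells

      # Example: noisy samples of the plane z = 0 with uniform 0.1 m uncertainty.
      pts = np.random.randn(100, 3) * [1.0, 1.0, 0.05]
      print(len(accumulate(pts, sigmas=np.full(100, 0.1))))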

  3. Estimating the dose response relationship for occupational radiation exposure measured with minimum detection level.

    PubMed

    Xue, Xiaonan; Shore, Roy E; Ye, Xiangyang; Kim, Mimi Y

    2004-10-01

    Occupational exposures are often recorded as zero when the exposure is below the minimum detection level (BMDL). This can lead to an underestimation of the doses received by individuals and can lead to biased estimates of risk in occupational epidemiologic studies. The extent of the exposure underestimation is increased with the magnitude of the minimum detection level (MDL) and the frequency of monitoring. This paper uses multiple imputation methods to impute values for the missing doses due to BMDL. A Gibbs sampling algorithm is developed to implement the method, which is applied to two distinct scenarios: when dose information is available for each measurement (but BMDL is recorded as zero or some other arbitrary value), or when the dose information available represents the summation of a series of measurements (e.g., only yearly cumulative exposure is available but based on, say, weekly measurements). Then the average of the multiple imputed exposure realizations for each individual is used to obtain an unbiased estimate of the relative risk associated with exposure. Simulation studies are used to evaluate the performance of the estimators. As an illustration, the method is applied to a sample of historical occupational radiation exposure data from the Oak Ridge National Laboratory.
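
    A simplified sketch of the imputation idea for doses recorded as zero because they fell below a minimum detection level (MDL): assuming lognormal doses, censored values are drawn from the lognormal truncated above at the MDL and the distribution parameters are re-estimated iteratively. The function name, the lognormal assumption, and the iteration scheme are illustrative; the paper's Gibbs sampler and dose-response model are more elaborate.

      import numpy as np
      from scipy import stats

      def impute_bmdl(doses, mdl, n_iter=50, rng=np.random.default_rng(0)):
          doses = np.asarray(doses, dtype=float)
          censored = doses <= 0.0            # recorded as zero: true dose lies in (0, MDL)
          imputed = doses.copy()
          imputed[censored] = mdl / 2.0      # crude starting value
          for _ in range(n_iter):
              mu, sigma = np.log(imputed).mean(), np.log(imputed).std() + 1e-6
              # Draw censored values from the lognormal truncated above at the MDL
              # via inverse-CDF sampling in log space.
              cap = stats.norm.cdf((np.log(mdl) - mu) / sigma)
              u = rng.uniform(1e-9, cap, size=censored.sum())
              imputed[censored] = np.exp(mu + sigma * stats.norm.ppf(u))
          return imputed

      print(impute_bmdl([0, 0, 1.2, 3.4, 0.9], mdl=0.5).round(3))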

  4. Detection and Rectification of Distorted Fingerprints.

    PubMed

    Si, Xuanbin; Feng, Jianjiang; Zhou, Jie; Luo, Yuxuan

    2015-03-01

    Elastic distortion of fingerprints is one of the major causes of false non-matches. While this problem affects all fingerprint recognition applications, it is especially dangerous in negative recognition applications, such as watchlist and deduplication applications. In such applications, malicious users may purposely distort their fingerprints to evade identification. In this paper, we propose novel algorithms to detect and rectify skin distortion based on a single fingerprint image. Distortion detection is viewed as a two-class classification problem, for which the registered ridge orientation map and period map of a fingerprint are used as the feature vector and an SVM classifier is trained to perform the classification task. Distortion rectification (or equivalently distortion field estimation) is viewed as a regression problem, where the input is a distorted fingerprint and the output is the distortion field. To solve this problem, a database (called the reference database) of various distorted reference fingerprints and corresponding distortion fields is built in the offline stage, and then in the online stage, the nearest neighbor of the input fingerprint is found in the reference database and the corresponding distortion field is used to transform the input fingerprint into a normal one. Promising results have been obtained on three databases containing many distorted fingerprints, namely FVC2004 DB1, Tsinghua Distorted Fingerprint database, and the NIST SD27 latent fingerprint database.
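
    Schematically, the two stages described here can be sketched as an SVM flagging a registered orientation/period feature vector as distorted, followed by a nearest-neighbour lookup of the distortion field in a reference database. All array shapes, names, and the synthetic data below are illustrative assumptions, not the paper's implementation.

      import numpy as np
      from sklearn.svm import SVC

      # Offline: reference feature vectors, labels, and stored distortion fields.
      ref_features = np.random.randn(200, 64)        # registered orientation+period maps
      ref_labels = np.random.randint(0, 2, 200)      # 0 = normal, 1 = distorted
      ref_fields = np.random.randn(200, 32, 32, 2)   # per-reference distortion fields

      detector = SVC(kernel="rbf").fit(ref_features, ref_labels)

      def rectify(feature_vec):
          # Return the distortion field of the nearest distorted reference, if any.
          if detector.predict(feature_vec[None, :])[0] == 0:
              return None                             # classified as undistorted
          distorted = np.flatnonzero(ref_labels == 1)
          d = np.linalg.norm(ref_features[distorted] - feature_vec, axis=1)
          return ref_fields[distorted[np.argmin(d)]]  # field used to rectify the input

      field = rectify(np.random.randn(64))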

  5. Structure of the southern Juan de Fuca Ridge from seismic reflection records

    USGS Publications Warehouse

    Morton, Janet L.; Sleep, Norman H.; Normark, William R.; Tompkins, Donald H.

    1987-01-01

    Twenty-four-channel seismic reflection records were obtained from the axial region of the southern Juan de Fuca Ridge. Two profiles are normal to the strike of the spreading center and intersect the ridge at latitude 44°40′N and 45°05′N; a third profile extends south along the ridge axis from latitude 45°20′N and crosses the Blanco Fracture Zone. Processing of the axial portions of the cross-strike lines resolved a weak reflection centered beneath the axis. The reflector is at a depth similar to seismically detected magma chambers on the East Pacific Rise and a Lau Basin spreading center; we suggest that the reflector represents the top of an axial magma chamber. In the migrated sections the top of the probable magma chamber is relatively flat and 1–2 km wide, and the subbottom depth of the chamber is greater where the depth to the ridge axis is greater.

  6. Visualizing dispersive features in 2D image via minimum gradient method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Yu; Wang, Yan; Shen, Zhi -Xun

    Here, we developed a minimum gradient based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast especially for weak intensity features and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high temperature superconductors is demonstrated.

  7. Visualizing dispersive features in 2D image via minimum gradient method

    DOE PAGES

    He, Yu; Wang, Yan; Shen, Zhi -Xun

    2017-07-24

    Here, we developed a minimum gradient based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast especially for weak intensity features and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high temperature superconductors is demonstrated.
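
    A minimal reading of the minimum-gradient ridge-tracking idea: for each column of a 2D intensity map, the ridge position is taken where the local gradient magnitude is smallest within a window around the previous estimate. The window size, the synthetic band, and the function name are illustrative, not the authors' released code.

      import numpy as np

      def track_ridge(image, start_row, half_window=5):
          # Return one ridge row index per column of a 2D array.
          gy, gx = np.gradient(image.astype(float))
          gmag = np.hypot(gx, gy)
          rows = np.arange(image.shape[0])
          ridge, r = [], start_row
          for col in range(image.shape[1]):
              lo, hi = max(r - half_window, 0), min(r + half_window + 1, image.shape[0])
              r = rows[lo:hi][np.argmin(gmag[lo:hi, col])]   # smallest gradient = ridge
              ridge.append(r)
          return np.array(ridge)

      # Example: a noisy dispersing band whose intensity peaks along a parabola.
      k = np.linspace(-1, 1, 200)
      E = np.linspace(0, 1, 150)
      band = np.exp(-((E[:, None] - (0.5 + 0.3 * k ** 2)) ** 2) / 0.01)
      print(track_ridge(band + 0.05 * np.random.randn(*band.shape), start_row=120)[:5])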

  8. Partial fingerprint identification algorithm based on the modified generalized Hough transform on mobile device

    NASA Astrophysics Data System (ADS)

    Qin, Jin; Tang, Siqi; Han, Congying; Guo, Tiande

    2018-04-01

    Partial fingerprint identification technology, which is mainly used in devices with small sensor areas such as cellphones, USB flash drives, and computers, has attracted more attention in recent years because of its unique advantages. However, owing to the lack of sufficient minutiae points, conventional methods do not perform well in this situation. We propose a new fingerprint matching technique that uses ridges as features to deal with partial fingerprint images and combines a modified generalized Hough transform with a machine-learning-based scoring strategy. The algorithm can effectively meet the real-time and space-saving requirements of resource-constrained devices. Experiments on an in-house database indicate that the proposed algorithm performs well.
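
    The Hough-voting idea behind ridge-based partial matching can be illustrated with a toy translation-only sketch: each pairing of a query ridge point with a template ridge point of similar local orientation votes for a candidate shift, and the best-supported shift wins. The bin size, angle tolerance, and the omission of rotation and of the machine-learning scoring stage are simplifications of the paper's modified generalized Hough transform.

      import numpy as np
      from collections import Counter

      def hough_align(query_pts, query_dirs, tmpl_pts, tmpl_dirs,
                      angle_tol=np.deg2rad(10), bin_size=4):
          votes = Counter()
          for (qx, qy), qd in zip(query_pts, query_dirs):
              for (tx, ty), td in zip(tmpl_pts, tmpl_dirs):
                  if abs(np.angle(np.exp(1j * (qd - td)))) > angle_tol:
                      continue                      # orientations disagree; skip
                  dx, dy = tx - qx, ty - qy         # candidate translation
                  votes[(int(round(dx / bin_size)), int(round(dy / bin_size)))] += 1
          (bx, by), score = votes.most_common(1)[0]
          return bx * bin_size, by * bin_size, score

      # Example: the template is the query shifted by (12, -7).
      q = np.random.rand(40, 2) * 100
      d = np.random.rand(40) * np.pi
      print(hough_align(q, d, q + [12, -7], d))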

  9. Reading Ombrone river delta evolution through beach ridges morphology

    NASA Astrophysics Data System (ADS)

    Mammi, Irene; Piccardi, Marco; Pranzini, Enzo; Rossi, Lorenzo

    2017-04-01

    The present study focuses on the evolution of the Ombrone River delta (Southern Tuscany, Italy) over the last five centuries, when fluvial sediment input was very large, partly as a consequence of deforestation of the watershed. The aim of this study is to find a correlation between river input and beach ridge morphology and to explain the different distribution of wetlands and sand deposits on the two sides of the delta. Visible, NIR and TIR satellite images were processed to retrieve soil wetness associated with sand ridges and interdune silty deposits. High-resolution LiDAR data were analysed using a vegetation filter and GIS enhancement algorithms in order to highlight small morphological variations, especially in areas closer to the river where agriculture has almost erased these morphologies. A topographic survey and a very high resolution 3D model obtained from a set of images acquired by an Unmanned Aerial Vehicle (UAV) were produced at selected sites, both to calibrate the satellite and LiDAR 3D data and to map low-relief areas. Historical maps, aerial photography and written documents were analysed to date ancient shorelines associated with specific beach ridges, thus allowing the reconstruction of erosive and accretive phases of the delta. Seventy beach ridges were identified on the two wings of the delta. On the longer down-drift side (Northern wing), beach ridges are more widely spaced at the apex and gradually converge toward the extremity, where the Bruna River runs and delimits the subaerial depositional area of the Ombrone River. On the shorter up-drift lobe (Southern wing), beach ridges are more closely spaced but run almost parallel to each other. In this case, a rocky headland, the Collelungo promontory, closes and cuts the beach ridge sequence, but the shallow water depth allows sediment bypass. One kilometre to the south, a more pronounced promontory encloses a small pocket beach (Cala di Forno) and marks the limit of the subaerial depositional area. Beach ridge heights were analysed from the LiDAR data, and some were found to be higher than average. Conceptual models in the literature allowed us to interpret the higher beach ridges as recording periods of stability or a very early stage of erosion affecting the beach. The high-resolution DTM produced from the LiDAR and UAV data permitted a better reconstruction of the last five centuries of delta evolution and a characterization of the differences in beach ridge morphology between the up-drift and down-drift sides of the delta. Within this framework, the presence of interdune swales on the down-drift side has been explained.

  10. Mantle-crust interaction at the Blanco Ridge segment of the Blanco Transform Fault Zone: Results from the Blanco Transform Fault OBS Experiment

    NASA Astrophysics Data System (ADS)

    Kuna, V. M.; Nabelek, J.; Braunmiller, J.

    2016-12-01

    We present results of the Blanco Transform OBS Experiment, which consists of the deployment of 55 three-component broadband and short-period ocean bottom seismometers in the vicinity of the Blanco Fault Zone for the period between September 2012 and October 2013. Our research concentrates on the Blanco Ridge, a purely transform segment of the Blanco Fault Zone, that spans over 130 km between the Cascadia and the Gorda pull-apart depressions. Almost 3,000 well-constrained earthquakes were detected and located along the Blanco Ridge by an automatic procedure (using BRTT Antelope) and relocated using a relative location algorithm (hypoDD). The catalog magnitude of completeness is M=2.2 with an overall b value of 1. Earthquakes extend from 0 km to 20 km depth, but cluster predominantly at two depth levels: in the crust (5-7 km) and in the uppermost mantle (12-17 km). Statistical analysis reveals striking differences between crustal and mantle seismicity. The temporal distribution of crustal events follows common patterns given by Omori's law, while most mantle seismicity occurs in spatially tight sequences of unusually short durations lasting 30 minutes or less. These sequences cannot be described by known empirical laws. Moreover, we observe increased seismic activity in the uppermost mantle about 30 days before the largest (M=5.4) earthquake. Two mantle sequences occurred in a small area of 3x3 km about 4 and 2 weeks before the M=5.4 event. In the week leading up to the M=5.4 event we observe a significant downward migration of crustal seismicity, which results in the subsequent nucleation of the main event at the base of the crust. We hypothesize that the highly localized uppermost mantle seismicity is triggered by aseismic slow-slip of the surrounding ductile mantle. We also suggest that the mantle slip loads the crust eventually resulting in relatively large crustal earthquakes.
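
    For context on the quoted catalogue statistics, a b value such as the one reported here is commonly obtained with the maximum-likelihood (Aki 1965) estimator b = log10(e) / (mean(M) - Mc). The sketch below applies it to synthetic magnitudes, not to the Blanco Ridge catalogue, and omits the magnitude-binning correction.

      import numpy as np

      def b_value(magnitudes, m_c):
          m = np.asarray(magnitudes, dtype=float)
          m = m[m >= m_c]                              # keep events above completeness
          return np.log10(np.e) / (m.mean() - m_c)     # Aki (1965), no binning correction

      rng = np.random.default_rng(1)
      mags = 2.2 + rng.exponential(scale=1 / np.log(10), size=3000)   # synthetic, b = 1
      print(round(b_value(mags, m_c=2.2), 2))          # typically prints a value near 1.0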

  11. Sex differences in fingerprint ridge density in a Turkish young adult population: a sample of Baskent University.

    PubMed

    Oktem, Hale; Kurkcuoglu, Ayla; Pelin, Ismail Can; Yazici, Ayse Canan; Aktaş, Gulnihal; Altunay, Fikret

    2015-05-01

    Fingerprints are considered to be one of the most reliable methods of identification. Identification of an individual is a vital part of any medico-legal investigation. Dermatoglyphics is a branch of science that studies epidermal ridges and ridge patterns. Epidermal ridges are polygenic characteristics that form in utero at 10-18 weeks of gestation and are considered fully developed by the sixth month of fetal growth. Fingerprints are permanent morphological characteristics, and fingerprint-based criminal detection rests on the principle that no two people have identical fingerprints. Sex determination from fingerprints has been examined in different populations. In this study we aimed to examine fingerprint ridge density in a Turkish population sample of Baskent University students. Fingerprints were obtained from 118 women and 88 men, a total of 206 students aged between 17 and 28 years, by means of a simple inking method. Fingerprints from all fingers of the right and left hands were collected, and three different areas of each print were examined. Ridges were counted diagonally on squares measuring 5 mm × 5 mm in the radial, ulnar and inferior areas. Fingerprint ridge density in the radial, ulnar and inferior areas, and between the sexes, was compared statistically using the Mann-Whitney U test and the Friedman test. The ridge density was significantly greater in women in every region studied and in all fingers when compared to men. Ridge density in the ulnar and radial areas was significantly greater than in the inferior area. Fingerprint ridge density can be used in medico-legal examination for sex identification. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  12. Fingermark ridge drift.

    PubMed

    De Alcaraz-Fossoul, Josep; Roberts, Katherine A; Feixat, Carme Barrot; Hogrebe, Gregory G; Badia, Manel Gené

    2016-01-01

    Distortions of the fingermark topography are usually considered when comparing latent and exemplar fingerprints. These alterations are characterized as caused by an extrinsic action, which affects entire areas of the deposition and alters the overall flow of a series of contiguous ridges. Here we introduce a novel visual phenomenon that does not follow these principles, named fingermark ridge drift. An experiment was designed that included variables such as type of secretion (eccrine and sebaceous), substrate (glass and polystyrene), and degrees of exposure to natural light (darkness, shade, and direct light) indoors. Fingermarks were sequentially visualized with titanium dioxide powder, photographed and analyzed. The comparison between fresh and aged depositions revealed that under certain environmental conditions an individual ridge could randomly change its original position regardless of its unaltered adjacent ridges. The causes of the drift phenomenon are not well understood. We believe it is exclusively associated with intrinsic natural aging processes of latent fingermarks. This discovery will help explain the detection of certain dissimilarities at the minutiae/ridge level; determine more accurate "hits"; identify potentially erroneous corresponding points; and rethink identification protocols, especially the criteria of "no single minutiae discrepancy" for a positive identification. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Mid-2017 Map of NASA's Curiosity Mars Rover Mission

    NASA Image and Video Library

    2017-07-11

    This map shows the route driven by NASA's Curiosity Mars rover, from the location where it landed in August 2012 to its location in July 2017, and its planned path to additional geological layers of lower Mount Sharp. The blue star near top center marks "Bradbury Landing," the site where Curiosity arrived on Mars on Aug. 5, 2012, PDT (Aug. 6, EDT and Universal Time). Blue triangles mark waypoints investigated by Curiosity on the floor of Gale Crater and, starting with "Pahrump Hills," on Mount Sharp. The Sol 1750 label identifies the rover's location on July 9, 2017, the 1,750th Martian day, or sol, since the landing. In July 2017, the mission is examining "Vera Rubin Ridge" from the downhill side of the ridge. Spectrometry observations from NASA's Mars Reconnaissance Orbiter have detected hematite, an iron-oxide mineral, in the ridge. Curiosity's planned route continues to the top of the ridge and then to geological units where clay minerals and sulfate minerals have been detected from orbit. The base image for the map is from the High Resolution Imaging Science Experiment (HiRISE) camera on the Mars Reconnaissance Orbiter. North is up. "Bagnold Dunes" form a band of dark, wind-blown material at the foot of Mount Sharp. https://photojournal.jpl.nasa.gov/catalog/PIA21720

  14. Seafloor seismicity, Antarctic ice-sounds, cetacean vocalizations and long-term ambient sound in the Indian Ocean basin

    NASA Astrophysics Data System (ADS)

    Royer, J.-Y.; Chateau, R.; Dziak, R. P.; Bohnenstiehl, D. R.

    2015-08-01

    This paper presents the results from the Deflo-hydroacoustic experiment in the Southern Indian Ocean using three autonomous underwater hydrophones, complemented by two permanent hydroacoustic stations. The array monitored for 14 months, from November 2006 to December 2007, a 3000 × 3000 km wide area, encompassing large segments of the three Indian spreading ridges that meet at the Indian Triple Junction. A catalogue of 11 105 acoustic events is derived from the recorded data, of which 55 per cent are located from three hydrophones, 38 per cent from 4, 6 per cent from five and less than 1 per cent by six hydrophones. From a comparison with land-based seismic catalogues, the smallest detected earthquakes are mb 2.6 in size, the range of recorded magnitudes is about twice that of land-based networks and the number of detected events is 5-16 times larger. Seismicity patterns vary between the three spreading ridges, with activity mainly focused on transform faults along the fast spreading Southeast Indian Ridge and more evenly distributed along spreading segments and transforms on the slow spreading Central and ultra-slow spreading Southwest Indian ridges; the Central Indian Ridge is the most active of the three with an average of 1.9 events/100 km/month. Along the Sunda Trench, acoustic events mostly radiate from the inner wall of the trench and show a 200-km-long seismic gap between 2 °S and the Equator. The array also detected more than 3600 cryogenic events, with different seasonal trends observed for events from the Antarctic margin, compared to those from drifting icebergs at lower (up to 50°S) latitudes. Vocalizations of five species and subspecies of large baleen whales were also observed and exhibit clear seasonal variability. On the three autonomous hydrophones, whale vocalizations dominate sound levels in the 20-30 and 100 Hz frequency bands, whereas earthquakes and ice tremor are a dominant source of ambient sound at frequencies <20 Hz.

  15. Capillary sieving electrophoresis and micellar electrokinetic capillary chromatography produce highly correlated separation of tryptic digests

    PubMed Central

    Dickerson, Jane A.; Dovichi, Norman J.

    2011-01-01

    We perform two-dimensional capillary electrophoresis on fluorescently labeled proteins and peptides. Capillary sieving electrophoresis was performed in the first dimension and micellar electrokinetic capillary chromatography was performed in the second. A cellular homogenate was labeled with the fluorogenic reagent FQ and separated using the system. This homogenate generated a pair of ridges; the first had essentially constant migration time in the CSE dimension, while the second had essentially constant migration time in the MEKC dimension. In addition a few spots were scattered through the electropherogram. The same homogenate was digested using trypsin, and then labeled and subjected to the two dimensional separation. In this case, the two ridges observed from the original two-dimensional separation disappeared, and were replaced by a set of spots that fell along the diagonal. Those spots were identified using a local-maximum algorithm and each was fit using a two-dimensional Gaussian surface by an unsupervised nonlinear least squares regression algorithm. The migration times of the tryptic digest components were highly correlated (r = 0.862). When the slowest migrating components were eliminated from the analysis, the correlation coefficient improved to r = 0.956. PMID:20564272
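
    The spot-fitting step described here (a local maximum refined by a two-dimensional Gaussian least-squares fit) can be sketched as follows; the patch size, initial guesses, and synthetic spot are illustrative, not the authors' software.

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(coords, amp, x0, y0, sx, sy, offset):
          x, y = coords
          return (amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                                 + (y - y0) ** 2 / (2 * sy ** 2))) + offset).ravel()

      def fit_spot(image):
          # Fit a 2D Gaussian around the brightest pixel of a small image patch.
          y0, x0 = np.unravel_index(np.argmax(image), image.shape)   # local maximum
          yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
          p0 = (image.max(), x0, y0, 2.0, 2.0, float(image.min()))
          popt, _ = curve_fit(gauss2d, (xx, yy), image.ravel(), p0=p0)
          return popt   # amplitude, centre (x, y), widths, baseline

      # Example: a synthetic spot with noise.
      yy, xx = np.mgrid[0:31, 0:31]
      spot = 5 * np.exp(-((xx - 14) ** 2 + (yy - 17) ** 2) / 8) + 0.1 * np.random.randn(31, 31)
      print(np.round(fit_spot(spot), 2))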

  16. Fingerprint separation: an application of ICA

    NASA Astrophysics Data System (ADS)

    Singh, Meenakshi; Singh, Deepak Kumar; Kalra, Prem Kumar

    2008-04-01

    Among all existing biometric techniques, fingerprint-based identification is the oldest method, which has been successfully used in numerous applications. Fingerprint-based identification is the most recognized tool in biometrics because of its reliability and accuracy. Fingerprint identification is done by matching questioned and known friction skin ridge impressions from fingers, palms, and toes to determine if the impressions are from the same finger (or palm, toe, etc.). There are many fingerprint matching algorithms which automate and facilitate the job of fingerprint matching, but for any of these algorithms matching can be difficult if the fingerprints are overlapped or mixed. In this paper, we have proposed a new algorithm for separating overlapped or mixed fingerprints so that the performance of the matching algorithms will improve when they are fed with these inputs. Independent Component Analysis (ICA) has been used as a tool to separate the overlapped or mixed fingerprints.
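
    A minimal sketch of the separation idea, assuming two registered images that are approximately linear mixtures of two fingerprints, so that FastICA can recover the statistically independent sources; the mixing weights and synthetic patterns below are illustrative, not the paper's data or algorithm settings.

      import numpy as np
      from sklearn.decomposition import FastICA

      def separate(mix1, mix2):
          # Treat each pixel as an observation of two mixed channels.
          X = np.column_stack([mix1.ravel(), mix2.ravel()])
          sources = FastICA(n_components=2, random_state=0).fit_transform(X)
          return [s.reshape(mix1.shape) for s in sources.T]

      # Example with synthetic "prints": two independent random patterns.
      a, b = np.random.rand(64, 64), np.random.rand(64, 64)
      est1, est2 = separate(0.6 * a + 0.4 * b, 0.3 * a + 0.7 * b)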

  17. Adaptive multimode signal reconstruction from time–frequency representations

    PubMed Central

    Meignen, Sylvain; Oberlin, Thomas; Depalle, Philippe; Flandrin, Patrick

    2016-01-01

    This paper discusses methods for the adaptive reconstruction of the modes of multicomponent AM–FM signals by their time–frequency (TF) representation derived from their short-time Fourier transform (STFT). The STFT of an AM–FM component or mode spreads the information relative to that mode in the TF plane around curves commonly called ridges. An alternative view is to consider a mode as a particular TF domain termed a basin of attraction. Here we discuss two new approaches to mode reconstruction. The first determines the ridge associated with a mode by considering the location where the direction of the reassignment vector sharply changes, the technique used to determine the basin of attraction being directly derived from that used for ridge extraction. A second uses the fact that the STFT of a signal is fully characterized by its zeros (and then the particular distribution of these zeros for Gaussian noise) to deduce an algorithm to compute the mode domains. For both techniques, mode reconstruction is then carried out by simply integrating the information inside these basins of attraction or domains. PMID:26953184
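
    A simplified sketch of ridge-based mode reconstruction from an STFT: the ridge is followed as the per-frame magnitude maximum within a small band around the previous ridge bin, and the mode is rebuilt by keeping only coefficients near the ridge before inverting. Window and band sizes are illustrative, and the reassignment-vector and zero-based refinements discussed in the record are omitted.

      import numpy as np
      from scipy.signal import stft, istft

      def extract_mode(x, fs, band=3, nperseg=256):
          f, t, Z = stft(x, fs=fs, nperseg=nperseg)
          ridge, r = [], int(np.argmax(np.abs(Z[:, 0])))
          for j in range(Z.shape[1]):
              lo, hi = max(r - band, 0), min(r + band + 1, Z.shape[0])
              r = lo + int(np.argmax(np.abs(Z[lo:hi, j])))   # follow the magnitude ridge
              ridge.append(r)
          mask = np.zeros_like(Z, dtype=bool)
          for j, rj in enumerate(ridge):
              mask[max(rj - band, 0):rj + band + 1, j] = True
          _, x_mode = istft(np.where(mask, Z, 0), fs=fs, nperseg=nperseg)
          return x_mode, f[np.array(ridge)]

      # Example: a noisy linear chirp.
      fs = 1000
      tt = np.arange(0, 1, 1 / fs)
      sig = np.cos(2 * np.pi * (50 * tt + 40 * tt ** 2)) + 0.2 * np.random.randn(tt.size)
      mode, inst_freq = extract_mode(sig, fs)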

  18. Is ridge preservation/augmentation at periodontally compromised extraction sockets safe? A retrospective study.

    PubMed

    Kim, Jung-Ju; Ben Amara, Heithem; Schwarz, Frank; Kim, Hae-Young; Lee, Jung-Won; Wikesjö, Ulf M E; Koo, Ki-Tae

    2017-10-01

    This study aimed to evaluate the safety of ridge preservation/augmentation procedures when performed at compromised extraction sockets. Patients subjected to ridge preservation/augmentation at periodontally compromised sockets at Seoul National University Dental Hospital (SNUDH) were evaluated in a chart review. Tooth extractions due to acute infection were not included in our study, as chronically formed lesions are the only lesions that can be detected from radiographic images. If inflammatory symptoms persisted following ridge preservation/augmentation and antimicrobial and anti-inflammatory therapy, the patient was categorized as a re-infection case and the implanted biomaterial was removed. Of 10,060 patients subjected to tooth extractions at SNUDH, 2011 through 2015, 297 cases meeting inclusion criteria were reviewed. The severity and type of the lesions could not be specified because data extraction relied only on radiographic images and chart records. The review identified eight patients exhibiting inflammatory symptoms that required additional antimicrobial and anti-inflammatory therapy. Within this group, re-infection occurred in two patients, requiring biomaterial removal. The final safety rate for ridge preservation/augmentation was 99.3%. None of the demographic factors, systemic conditions or choice of biomaterial affected the safety of ridge preservation/augmentation. Alveolar ridge preservation/augmentation at periodontally compromised sockets appears safe following thorough removal of the infectious source. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. An introduction of Markov chain Monte Carlo method to geochemical inverse problems: Reading melting parameters from REE abundances in abyssal peridotites

    NASA Astrophysics Data System (ADS)

    Liu, Boda; Liang, Yan

    2017-04-01

    Markov chain Monte Carlo (MCMC) simulation is a powerful statistical method in solving inverse problems that arise from a wide range of applications. In Earth sciences applications of MCMC simulations are primarily in the field of geophysics. The purpose of this study is to introduce MCMC methods to geochemical inverse problems related to trace element fractionation during mantle melting. MCMC methods have several advantages over least squares methods in deciphering melting processes from trace element abundances in basalts and mantle rocks. Here we use an MCMC method to invert for extent of melting, fraction of melt present during melting, and extent of chemical disequilibrium between the melt and residual solid from REE abundances in clinopyroxene in abyssal peridotites from Mid-Atlantic Ridge, Central Indian Ridge, Southwest Indian Ridge, Lena Trough, and American-Antarctic Ridge. We consider two melting models: one with exact analytical solution and the other without. We solve the latter numerically in a chain of melting models according to the Metropolis-Hastings algorithm. The probability distribution of inverted melting parameters depends on assumptions of the physical model, knowledge of mantle source composition, and constraints from the REE data. Results from MCMC inversion are consistent with and provide more reliable uncertainty estimates than results based on nonlinear least squares inversion. We show that chemical disequilibrium is likely to play an important role in fractionating LREE in residual peridotites during partial melting beneath mid-ocean ridge spreading centers. MCMC simulation is well suited for more complicated but physically more realistic melting problems that do not have analytical solutions.
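
    A generic Metropolis-Hastings sketch of this kind of inversion: given a forward model predicting observed concentrations from melting parameters, sample the posterior under a Gaussian data misfit. The forward model, prior, step size, and parameter names below are placeholders, not the melting model of the paper.

      import numpy as np

      def forward(theta):                       # placeholder forward model
          F, phi = theta
          return np.array([F + 0.5 * phi, F * phi, F - 0.2 * phi])

      def metropolis(data, sigma, n_steps=20000, step=0.01, rng=np.random.default_rng(0)):
          theta = np.array([0.05, 0.01])        # initial extent of melting, porosity
          def log_like(th):
              return -0.5 * np.sum(((forward(th) - data) / sigma) ** 2)
          samples, ll = [], log_like(theta)
          for _ in range(n_steps):
              prop = theta + rng.normal(0, step, size=theta.size)
              if np.all(prop > 0):              # simple positivity prior
                  ll_prop = log_like(prop)
                  if np.log(rng.uniform()) < ll_prop - ll:   # accept/reject step
                      theta, ll = prop, ll_prop
              samples.append(theta.copy())
          return np.array(samples)

      obs = forward([0.08, 0.005]) * (1 + 0.02 * np.random.randn(3))
      post = metropolis(obs, sigma=0.02 * np.abs(obs))
      print(post[10000:].mean(axis=0))          # posterior means after burn-in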

  20. Comparison of human and algorithmic target detection in passive infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.; Hutchinson, Meredith

    2003-09-01

    We have designed an experiment that compares the performance of human observers and a scale-insensitive target detection algorithm that uses pixel-level information for the detection of ground targets in passive infrared imagery. The test database contains targets near clutter whose detectability ranged from easy to very difficult. Results indicate that human observers detect more "easy-to-detect" targets, and with far fewer false alarms, than the algorithm. For "difficult-to-detect" targets, human and algorithm detection rates are considerably degraded, and algorithm false alarms are excessive. Analysis of detections as a function of observer confidence shows that algorithm confidence attribution does not correspond to human attribution, and does not adequately correlate with correct detections. The best target detection score for any human observer was 84%, as compared to 55% for the algorithm at the same false alarm rate. At 81%, the maximum detection score for the algorithm, the same human observer had 6 false alarms per frame as compared to 29 for the algorithm. Detector ROC curves and observer-confidence analysis benchmark the algorithm and provide insights into algorithm deficiencies and possible paths to improvement.

  1. RCRA Facility investigation report for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Volume 5, Technical Memorandums 06-09A, 06-10A, and 06-12A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1 activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities at WAG 6 to date is given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachments 1 and 2 provide analytical results for selected RFI groundwater samples and the ORNL sampling event. Attachment 3 provides a summary of the contaminants detected in each well sampled for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits and Resource Conservation and Recovery Act (RCRA) quarterly compliance monitoring (1988-1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits (for the 1990 RCRA semi-annual compliance monitoring).

  2. In Search for Diffuse Hydrothermal Venting at North Pond, Western Flank of the Mid-Atlantic-Ridge

    NASA Astrophysics Data System (ADS)

    Villinger, H. W.; Becker, K.; Hulme, S.; Kaul, N. E.; Müller, P.; Wheat, C. G.

    2015-12-01

    We present results from temperature measurements made with a ROV temperature lance in sediments deposited on the slopes of abyssal hills and small basins surrounding North Pond. North Pond is an ~8 x 15 km sediment basin located on ~7 Ma old crust west of the Mid-Atlantic Ridge at 23°N. Data were collected with the ROV Jason II during cruise MSM37 on the German RV Maria S. Merian in April 2014. The temperature lance consists of a 60 cm long stainless steel tube (o.d. 12 mm) housing 8 thermistors with a spacing of 80 mm, resulting in an active length of 56 cm. Data are logged with an 8-channel data logger (XR-420-T8, RBR, Ottawa) and transmitted online to the control van of the ROV. Data reduction and temperature gradient calculation are done according to the HFRED algorithm (Villinger & Davis, 1987). In total, 90 sites were visited; 88 yielded good data for temperature gradient calculation. Calculated gradients are usually of good to very good quality. The gradients vary from less than 20 to more than 1000 mK/m, reflecting the very heterogeneous distribution of geothermal heat flow. The expected conductive lithospheric heat flow for North Pond is ~190 mW/m² (geothermal gradient of ~190 mK/m with a thermal conductivity of 1 W/(m·K)). The highest temperature gradients are measured in places where the temperature ~50 cm below the sediment-water boundary exceeds bottom water temperature by ~0.5 K. These high temperature gradients may reflect local hydrothermal circulation within the pillow lavas; however, no focused discharge was detected. The analysis of temperature measurements made with the ROV-mounted CTD shows clearly detectable bottom water temperature anomalies. We infer that they are caused either by hydrothermal discharge through the thin sediment cover or through unsedimented pillow basalts nearby. Hydrothermal circulation in a North-Pond-like environment appears to be diffuse in nature, hence very difficult if not impossible to detect and to quantify.
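
    As an illustration of the data reduction described here, a geothermal gradient can be obtained by fitting a straight line to temperature versus sensor depth (8 thermistors at 80 mm spacing) and multiplying by thermal conductivity to get conductive heat flow. This is a simplification of the HFRED-style reduction, ignoring sensor equilibration and frictional heating; the numbers below are synthetic.

      import numpy as np

      def gradient_and_heatflow(temps_degC, spacing_m=0.08, k_w_per_mk=1.0):
          depths = np.arange(len(temps_degC)) * spacing_m   # depth below shallowest sensor
          grad, _ = np.polyfit(depths, temps_degC, 1)       # slope in K/m
          return grad, grad * k_w_per_mk                    # gradient, heat flow in W/m^2

      # Example: a 190 mK/m gradient, as expected for ~7 Ma crust at North Pond;
      # prints roughly (0.19, 0.19), i.e. 190 mK/m and 190 mW/m^2.
      t = 2.50 + 0.190 * np.arange(8) * 0.08
      print(gradient_and_heatflow(t))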

  3. The distribution of near-axis seamounts at intermediate spreading ridges

    NASA Astrophysics Data System (ADS)

    Howell, J. K.; Bohnenstiehl, D. R.; White, S. M.; Supak, S. K.

    2008-12-01

    The ridge axes along the intermediate-spreading rate Galapagos Spreading Center (GSC, 46-56 mm/yr) and South East Indian Ridge (SEIR, 72-76 mm/yr) vary from rifted axial valleys to inflated axial highs independent of spreading rate. The delivery and storage of melt is believed to control axial morphology, with axial highs typically observed in areas underlain by a shallow melt lens and axial valleys in areas without a significant melt lens [e.g., Baran et al., 2005 G-cubed; Detrick et al. 2002 G-cubed]. To investigate a possible correlation between the style of seafloor volcanism and axial morphology, a closed contour algorithm is used to identify near axis (2.5km off axis) semi-circular seamounts of heights greater than 20m from shipboard multibeam bathymetry. In areas characterized by an axial high, more seamounts are formed at the ends of the segments than in the center. This is consistent with observations at fast-spreading ridges and suggests a tendency of lavas to erupt at lower effusion rates near second-order segment boundaries. Segments with a rift valley along the GSC show the opposite trend, with more seamounts at the center of second-order segments. Both patterns however are observed along SEIR segments with rift valleys where magma supply may be reflected in size and not their abundance.

  4. Non-stationary component extraction in noisy multicomponent signal using polynomial chirping Fourier transform.

    PubMed

    Lu, Wenlong; Xie, Junwei; Wang, Heming; Sheng, Chuan

    2016-01-01

    Inspired by track-before-detection technology in radar, a novel time-frequency transform, namely the polynomial chirping Fourier transform (PCFT), is exploited to extract components from a noisy multicomponent signal. The PCFT combines the advantages of the Fourier transform and the polynomial chirplet transform to accumulate component energy along a polynomial chirping curve in the time-frequency plane. The particle swarm optimization algorithm is employed to search for optimal polynomial parameters with which the PCFT achieves the most concentrated energy ridge in the time-frequency plane for the target component. The component can be well separated in the polynomial chirping Fourier domain with a narrow-band filter and then reconstructed by the inverse PCFT. Furthermore, an iterative procedure, involving parameter estimation, PCFT, filtering and recovery, is introduced to extract components from a noisy multicomponent signal successively. Simulations and experiments show that the proposed method has better performance in component extraction from noisy multicomponent signals and provides more time-frequency detail about the analyzed signal than conventional methods.
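
    A toy sketch of the underlying idea: the energy of the signal demodulated along a polynomial frequency law serves as the objective, and a small particle-swarm search looks for the polynomial coefficients that maximise it. The quadratic frequency law, swarm settings, and test signal are illustrative; the paper's PCFT, filtering, and iterative extraction are more elaborate.

      import numpy as np

      rng = np.random.default_rng(0)
      fs = 1000
      t = np.arange(0, 1, 1 / fs)
      x = np.cos(2 * np.pi * (60 * t + 45 * t ** 2)) + 0.5 * rng.standard_normal(t.size)

      def energy(coeffs):
          # |sum x(t) exp(-2j*pi*phase(t))| for an assumed f(t) = c0 + c1*t.
          c0, c1 = coeffs
          phase = c0 * t + 0.5 * c1 * t ** 2
          return np.abs(np.sum(x * np.exp(-2j * np.pi * phase)))

      # Minimal particle swarm over (c0, c1).
      n, w, cp, cg = 30, 0.7, 1.5, 1.5
      pos = rng.uniform([0, 0], [200, 200], size=(n, 2))
      vel = np.zeros_like(pos)
      best_p, best_pf = pos.copy(), np.array([energy(p) for p in pos])
      best_g = best_p[np.argmax(best_pf)].copy()
      for _ in range(100):
          vel = (w * vel + cp * rng.random((n, 1)) * (best_p - pos)
                 + cg * rng.random((n, 1)) * (best_g - pos))
          pos += vel
          f = np.array([energy(p) for p in pos])
          improved = f > best_pf
          best_p[improved], best_pf[improved] = pos[improved], f[improved]
          best_g = best_p[np.argmax(best_pf)].copy()
      print(np.round(best_g, 1))   # typically converges near (60, 90): f(t) = 60 + 90 t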

  5. Rock fracture skeleton tracing by image processing and quantitative analysis by geometry features

    NASA Astrophysics Data System (ADS)

    Liang, Yanjie

    2016-06-01

    In rock engineering, fracture measurement is important for many applications. This paper proposes a novel method for rock fracture skeleton tracing and analysis. For skeleton localization, the curvilinear fractures are enhanced at multiple scales using a Hessian matrix; after image binarization, clutter is removed by image analysis; subsequently, the fracture skeleton is extracted via ridge detection combined with a distance transform and a thinning algorithm, after which gap sewing and burr removal repair the skeleton. For skeleton analysis, the roughness and distribution of a fracture network are described by the fractal dimensions Ds and Db, respectively; the intersection and fragmentation of a fracture network are characterized by the average number of ends and junctions per fracture, Naverage, and the average length per fracture, Laverage, respectively. Three rock fracture surfaces are analyzed in experiments, and the results verify that both the fracture tracing accuracy and the analysis feasibility are satisfactory using the new method.
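
    A rough sketch of the tracing stage using off-the-shelf scikit-image operations as stand-ins (Hessian-based multiscale enhancement, binarization, small-object removal, thinning); the gap-sewing and burr-removal repairs and the geometry statistics of the paper are not reproduced, and all parameter values are illustrative.

      import numpy as np
      from skimage import filters, morphology

      def fracture_skeleton(gray):
          enhanced = filters.frangi(gray, black_ridges=True)       # curvilinear enhancement
          binary = enhanced > filters.threshold_otsu(enhanced)     # binarization
          binary = morphology.remove_small_objects(binary, min_size=50)   # declutter
          return morphology.skeletonize(binary)                    # one-pixel-wide skeleton

      # Example on a synthetic image with a dark, roughly linear "fracture".
      img = np.ones((128, 128))
      rr = np.clip((np.arange(128) * 0.7 + 20).astype(int), 0, 127)
      img[rr, np.arange(128)] = 0.2
      skel = fracture_skeleton(img + 0.05 * np.random.randn(128, 128))
      print(int(skel.sum()), "skeleton pixels")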

  6. Efficacy of a bioactive glass-ceramic (Biosilicate) in the maintenance of alveolar ridges and in osseointegration of titanium implants.

    PubMed

    Roriz, Virgílio M; Rosa, Adalberto L; Peitl, Oscar; Zanotto, Edgar D; Panzeri, Heitor; de Oliveira, Paulo T

    2010-02-01

    The aims of this research were to evaluate the efficacy of a bioactive glass-ceramic (Biosilicate) and a bioactive glass (Biogran) placed in dental sockets in the maintenance of alveolar ridge and in the osseointegration of Ti implants. Six dogs had their low premolars extracted and the sockets were implanted with Biosilicate, Biogran particles, or left untreated. After the extractions, measurements of width and height on the alveolar ridge were taken. After 12 weeks a new surgery was performed to take the final ridge measurements and to insert bilaterally three Ti implants in biomaterial-implanted and control sites. Eight weeks post-Ti implant placement block biopsies were processed for histological and histomorphometric analysis. The percentages of bone-implant contact (BIC), of mineralized bone area between threads (BABT), and of mineralized bone area within the mirror area (BAMA) were determined. The presence of Biosilicate or Biogran particles preserved alveolar ridge height without affecting its width. No significant differences in terms of BIC, BAMA, and BABT values were detected among Biosilicate, Biogran, and the non-implanted group. The results of the present study indicate that filling of sockets with either Biosilicate or Biogran particles preserves alveolar bone ridge height and allows osseointegration of Ti implants.

  7. A tool for computer-aided diagnosis of retinopathy of prematurity

    NASA Astrophysics Data System (ADS)

    Zhao, Zheen; Wallace, David K.; Freedman, Sharon F.; Aylward, Stephen R.

    2008-03-01

    In this paper we present improvements to a software application, named ROPtool, that aids in the timely and accurate detection and diagnosis of retinopathy of prematurity (ROP). ROP occurs in 68% of infants weighing less than 1251 grams at birth, and it is a leading cause of blindness in prematurely born infants. The standard of care for its diagnosis is the subjective assessment of retinal vessel dilation and tortuosity. There is significant inter-observer variation in those assessments. ROPtool analyzes retinal images, extracts user-selected blood vessels from those images, and quantifies the tortuosity of those vessels. The presence of ROP is then gauged by comparing the tortuosity of an infant's retinal vessels with measures made from a clinical-standard image of severely tortuous retinal vessels. The presence of such tortuous retinal vessels is referred to as 'plus disease'. In this paper, a novel metric of tortuosity is proposed. From the ophthalmologist's point of view, the new metric is an improvement over our previously published algorithm, since it uses smooth curves instead of straight lines to simulate 'normal vessels'. Another advantage of the new ROPtool is that minimal user interaction is required. ROPtool utilizes a ridge traversal algorithm to extract retinal vessels. The algorithm reconstructs connectivity along a vessel automatically. This paper supports its claims by reporting ROC curves from a pilot study involving 20 retinal images. The areas under the ROC curves for two ROP experts, using the new metric to diagnose 'tortuosity sufficient for plus disease', ranged from 0.86 to 0.91.
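
    One plausible reading of a smooth-reference tortuosity measure, shown only as an illustration (this is not ROPtool's actual formula): the arc length of the extracted centreline is compared with the arc length of a heavily smoothed curve fit through the same points, rather than with a straight chord.

      import numpy as np
      from scipy.interpolate import splprep, splev

      def arc_length(x, y):
          return np.sum(np.hypot(np.diff(x), np.diff(y)))

      def tortuosity(x, y, smooth=500.0):
          tck, _ = splprep([x, y], s=smooth)            # heavily smoothed reference curve
          xs, ys = splev(np.linspace(0, 1, 200), tck)
          return arc_length(x, y) / arc_length(xs, ys)  # 1.0 means no extra wiggle

      # Example: a gently curving vessel with a superimposed high-frequency wiggle.
      t = np.linspace(0, 1, 100)
      x, y = 100 * t, 20 * t ** 2 + 2 * np.sin(20 * np.pi * t)
      print(round(tortuosity(x, y), 3))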

  8. Geophysical Characteristics of the Australian-Antarctic Ridge

    NASA Astrophysics Data System (ADS)

    Kim, S. S.; Lin, J.; Park, S. H.; Choi, H.; Lee, S. M.

    2014-12-01

    Between 2011 and 2013, the Korea Polar Research Institute (KOPRI) conducted three consecutive geologic surveys at the little explored eastern ends of the Australian-Antarctic Ridge (AAR) to characterize the tectonics, geochemistry, and hydrothermal activity of this intermediate spreading system. Using the Korean icebreaker R/V Araon, the multi-disciplinary research team collected bathymetry, gravity, magnetics, and rock and water column samples. In addition, Miniature Autonomous Plume Recorders (MAPRs) were deployed at wax-core rock sampling sites to detect the presence of active hydrothermal vents. Here we present a detailed analysis of a 300-km-long supersegment of the AAR to quantify the spatial variations in ridge morphology and robust axial and off-axis volcanisms. The ridge axis morphology alternates between rift valleys and axial highs within relatively short ridge segments. To obtain a geological proxy for regional variations in magma supply, we calculated residual mantle Bouguer gravity anomalies (RMBA), gravity-derived crustal thickness, and residual topography for seven sub-segments. The results of the analyses revealed that the southern flank of the AAR is associated with shallower seafloor, more negative RMBA, thicker crust, and/or less dense mantle than the conjugate northern flank. Furthermore, this north-south asymmetry becomes more prominent toward the KR1 supersegment of the AAR. The axial topography of the KR1 supersegment exhibits a sharp transition from axial highs at the western end to rift valleys at the eastern end, with regions of axial highs being associated with more magma supply as indicated by more negative RMBA. We also compare and contrast the characteristics of the AAR supersegment with that of other ridges of intermediate spreading rates, including the Juan de Fuca Ridge, Galápagos Spreading Center, and Southeast Indian Ridge west of the Australian-Antarctic Discordance, to investigate the influence of ridge-hotspot interaction on ridge magma supply and tectonics.

  9. Proteomics goes forensic: Detection and mapping of blood signatures in fingermarks.

    PubMed

    Deininger, Lisa; Patel, Ekta; Clench, Malcolm R; Sears, Vaughn; Sammon, Chris; Francese, Simona

    2016-06-01

    A bottom-up in situ proteomic method has been developed, enabling the mapping of multiple blood signatures on the intact ridges of blood fingermarks by Matrix Assisted Laser Desorption Ionisation Mass Spectrometry Imaging (MALDI-MSI). This method, at a proof of concept stage, builds upon recently published work demonstrating the opportunity to profile and identify multiple blood signatures in bloodstains via a bottom-up proteomic approach. The present protocol addresses the limitation of the previously developed profiling method with respect to destructivity; destructivity should be avoided for evidence such as blood fingermarks, where the ridge detail must be preserved in order to provide the associative link between the biometric information and the events of bloodshed. Using a blood mark reference model, trypsin concentration and spraying conditions have been optimised within the technical constraints of the depositor eventually employed; the application of MALDI-MSI and Ion Mobility MS has enabled the detection, confirmation and visualisation of blood signatures directly on the ridge pattern. These results are to be considered a first insight into a method eventually informing investigations (and judicial debates) of violent crimes in which the reliable and non-destructive detection and mapping of blood in fingermarks is paramount to reconstruct the events of bloodshed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. TEV GAMMA-RAY OBSERVATIONS OF THE GALACTIC CENTER RIDGE BY VERITAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, A.; Buckley, J. H.; Bugaev, V.

    2016-04-20

    The Galactic Center ridge has been observed extensively in the past by both GeV and TeV gamma-ray instruments revealing a wealth of structure, including a diffuse component and the point sources G0.9+0.1 (a composite supernova remnant) and Sgr A* (believed to be associated with the supermassive black hole located at the center of our Galaxy). Previous very high energy (VHE) gamma-ray observations with the H.E.S.S. experiment have also detected an extended TeV gamma-ray component along the Galactic plane in the >300 GeV gamma-ray regime. Here we report on observations of the Galactic Center ridge from 2010 to 2014 by the VERITAS telescope array in the >2 TeV energy range. From these observations we (1) provide improved measurements of the differential energy spectrum for Sgr A* in the >2 TeV gamma-ray regime, (2) provide a detection of >2 TeV gamma-ray emission from the composite SNR G0.9+0.1 and an improved determination of its multi-TeV gamma-ray energy spectrum, and (3) report on the detection of VER J1746-289, a localized enhancement of >2 TeV gamma-ray emission along the Galactic plane.

  11. Preliminary Analysis of Multibeam, Subbottom, and Water Column Data Collected from the Juan de Fuca Plate and Gorda Ridge Earthquake Swarm Sites, March-April 2008.

    NASA Astrophysics Data System (ADS)

    Merle, S. G.; Dziak, R. P.; Embley, R. W.; Lupton, J. E.; Greene, R. R.; Chadwick, W. W.; Lilley, M.; Bohnenstiehl, D. R.; Braunmiller, J.; Fowler, M.; Resing, J.

    2008-12-01

    Two oceanographic expeditions were undertaken in the northeast Pacific during April and September of 2008 to collect a variety of scientific data at the sites of intense earthquake swarms that occurred from 30 March to 9 April 2008. The earthquake swarms were detected by the NOAA/PMEL and US Navy SOSUS hydrophone system in the northeast Pacific. The first swarm occurred within the central Juan de Fuca Plate, ~280 km west of the Oregon coast and ~70 km north of the Blanco Transform Fault Zone (BTFZ). The time history of the events indicates that this swarm was not a typical mainshock-aftershock sequence and that it was the largest SOSUS-detected intraplate swarm. This intraplate swarm activity was followed by three distinct clusters of earthquakes located along the BTFZ. Two of the clusters, which began on 10 and 12 April, were initiated by MW 5+ earthquakes suggesting these were mainshock-aftershock sequences, and the number of earthquakes on the BTFZ was small relative to the intraplate swarm. On 22 April, another intense earthquake swarm began on the northern Gorda Ridge segment adjacent to the BTFZ. The Gorda swarm produced >1000 SOSUS-detected earthquakes over a five-day duration, with activity distributed between the mid-segment high and the ridge-transform intersection. This swarm was of special interest because of previous magmatic activity near its location in 1996. Overall, the March-April earthquake activity showed an interesting spatio-temporal progression, beginning with intraplate activity, moving to the transform, and ending with a spreading event at the ridge. This pattern once again demonstrates that the Juan de Fuca plate is continually moving and converging with North America at the Cascadia Subduction Zone. As the initial swarm was not focused on the ridge crest, it was not interpreted as a significant eruptive event, and we did not advocate a large-scale Ridge2000 response effort. The earthquake activity, however, did have an unusual character and therefore a short (four-day) cruise was organized using the R/V Wecoma in April (support via NOAA Vents Program and NSF). While this cruise was underway, the Gorda Ridge swarm began and therefore another day was added to also sample the Gorda site. A total of 11 CTD casts were completed, covering the significant areas of earthquake activity. Measurements for helium isotopes have been completed on all 11 casts, and for methane and CO2 on one of the Gorda Ridge casts. A second response cruise aboard the R/V Melville will take place in September, funded by NOAA/Vents, providing 2 days of multibeam survey time. The cruise plan is to collect EM120 multibeam bathymetry and backscatter data, as well as 3.5 kHz subbottom data, in the area of the initial swarm. The northern Gorda Ridge will also be surveyed, with the goal of comparing this bathymetry with previously collected data to see if there is evidence of depth anomalies and therefore recent seafloor eruptions.

  12. Ridge Tectonics, Magma Supply, and Ridge-Hotspot Interaction at the Eastern End of the Australian-Antarctic Ridge

    NASA Astrophysics Data System (ADS)

    Kim, S.; Lin, J.; Park, S.; Choi, H.; Lee, S.

    2013-12-01

    During 2011-2013 the Korea Polar Research Institute (KOPRI) conducted three successive expeditions to the eastern end of the Australian-Antarctic Ridge (AAR) to investigate the tectonics, geochemistry, and hydrothermal activity of this intermediate-fast spreading system. On board the Korean icebreaker R/V Araon, the science party collected multiple types of data including multibeam bathymetry, gravity, and magnetics, as well as rock and water column samples. In addition, Miniature Autonomous Plume Recorders (MAPRs) were deployed at each of the wax-core rock sampling sites to detect the presence of active hydrothermal vents. In this study, we present a detailed analysis of a 360-km-long super-segment at the eastern end of the AAR to quantify the spatial variations in ridge morphology and investigate its response to changes in melt supply. The study region contains several intriguing bathymetric features including (1) abrupt changes in the axial topography, alternating between rift valleys and axial highs within relatively short ridge segments; (2) overshooting ridge tips at the ridge-transform intersections; (3) systematic migration patterns of hooked ridges; (4) a 350-km-long mega-transform fault; and (5) robust axial and off-axis volcanism. To obtain a proxy for regional variations in magma supply, we calculated residual mantle Bouguer gravity anomalies (RMBA), gravity-derived crustal thickness, and residual topography for seven sub-segments. The results of the analyses revealed that the southern flank of the AAR is associated with a shallower seafloor, more negative RMBA, thicker crust, and/or less dense mantle than the conjugate northern flank. Furthermore, this N-S asymmetry becomes more prominent toward the super-segment of the AAR. Such regional variations in seafloor topography and RMBA are consistent with the hypothesis that ridge segments in the study area have interacted with the Balleny hotspot, which currently lies southwest of the AAR. However, the influence of the Balleny hotspot is not dominant in the axial morphology of the AAR super-segment. The axial topography of this super-segment exhibits a sharp transition from axial highs at the western end to rift valleys at the eastern end, with regions of axial highs being associated with more magma supply as indicated by more negative RMBA. The eastern AAR will be further compared with other intermediate-fast spreading ridges, such as the Juan de Fuca Ridge, Galápagos Spreading Center, and Southeast Indian Ridge west of the Australian-Antarctic Discordance, to better understand the influence of ridge-hotspot interaction on ridge magma supply and tectonics.

  13. Large-scale fluid-deposited mineralization in Margaritifer Terra, Mars

    NASA Astrophysics Data System (ADS)

    Thomas, Rebecca J.; Potter-McIntyre, Sally L.; Hynek, Brian M.

    2017-07-01

    Mineral deposits precipitated from subsurface-sourced fluids are a key astrobiological detection target on Mars, due to the long-term viability of the subsurface as a habitat for life and the ability of precipitated minerals to preserve biosignatures. We report morphological and stratigraphic evidence for ridges along fractures in impact crater floors in Margaritifer Terra. Parallels with terrestrial analog environments and the regional context indicate that two observed ridge types are best explained by groundwater-emplaced cementation in the shallow subsurface and higher-temperature hydrothermal deposition at the surface, respectively. Both mechanisms have considerable astrobiological significance. Finally, we propose that morphologically similar ridges previously documented at the Mars 2020 landing site in NE Syrtis Major may have formed by similar mechanisms.

  14. Spatiotemporal distribution of the seismicity along the Mid-Atlantic Ridge north of the Azores from hydroacoustic data: Insights into seismogenic processes in a ridge-hot spot context

    NASA Astrophysics Data System (ADS)

    Goslin, J.; Perrot, J.; Royer, J.-Y.; Martin, C.; LourençO, N.; Luis, J.; Dziak, R. P.; Matsumoto, H.; Haxel, J.; Fowler, M. J.; Fox, C. G.; Lau, A. T.-K.; Bazin, S.

    2012-02-01

    The seismicity of the North Atlantic was monitored from May 2002 to September 2003 by the `SIRENA array' of autonomous hydrophones. The hydroacoustic signals provide a unique data set documenting numerous low-magnitude earthquakes along the section of the Mid-Atlantic Ridge (MAR) located in a ridge-hot spot interaction context. During the experiment, 1696 events were detected along the MAR axis between 40°N and 51°N, with a magnitude of completeness level of mb ≈ 2.4. Inside the array, location errors are on the order of 2 km, and errors in the origin time are less than 1 s. From this catalog, 15 clusters were detected. The distribution of source level (SL) versus time within each cluster is used to discriminate clusters occurring in a tectonic context from those attributed to non-tectonic (i.e. volcanic or hydrothermal) processes. The location of tectonic and non-tectonic sequences correlates well with regions with positive and negative Mantle Bouguer Anomalies (MBAs), indicating the presence of thinner/colder and thicker/warmer crust respectively. At the scale of the entire array, both the complete and declustered catalogs derived from the hydroacoustic signals show an increase of the seismicity rate from the Azores up to 43°30'N, suggesting a diminishing influence of the Azores hot spot on the ridge-axis temperature; this increase correlates well with a similar increase in the along-axis MBAs. The comparison of the MAR seismicity with the Residual MBA (RMBA) at different scales leads us to conclude that the low-magnitude seismicity rates are directly related to along-axis variations in lithosphere rheology and temperatures.

  15. Low-frequency coherent motions within the spruce canopy on the upwind vs. downwind side of a forested ridge

    NASA Astrophysics Data System (ADS)

    Potužníková, K.; Sedlák, P.; Šauli, P.

    2009-09-01

    Airflow and turbulence within and above the forest canopy determine the forest-atmosphere exchange of atmospheric constituents and pollutants. Our investigation is related to the existence of large-scale intermittent coherent structures, which have been detected in turbulence time series measured at the Experimental Ecological Study Site Bílý Kříž (800-900 m a.s.l.) in the Czech Republic. The site is situated on a steep (13°) SSW-facing slope near the top of a mountain ridge forested by a young Norway spruce plantation. Flow directions across the ridge (along the slope) strongly prevail at the site. Results based on a recent study reveal significant differences between the cases when the site is on the upwind vs. downwind side of the ridge. Typical downwind cases are characterized by a low wind speed above the canopy and by a relatively higher friction velocity than in the upwind cases. This is explained by the flow retardation by the upslope-directed hydrodynamic pressure gradient and by the large wind shear in the upper part of the wake behind the ridge top. This contribution concentrates on the vertical coherency of the turbulent flow within the forest canopy. Analysed variables include the high-frequency wind velocity components and sonic temperature measured during periods of neutral thermal stratification at two different levels. Wavelet analysis was used to detect the characteristic temporal scales of coherent structures, their persistence, and an effectivity parameter. Special attention is paid to the differences between the upwind and downwind cases. Acknowledgements: The study is supported by grants IAA300420803 and IAA300420704 from the Grant Agency of the Academy of Sciences of the Czech Republic.

  16. DETECTION OF SHOCK MERGING IN THE CHROMOSPHERE OF A SOLAR PORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chae, Jongchul; Song, Donguk; Seo, Minju

    2015-06-01

    It was theoretically demonstrated that a shock propagating in the solar atmosphere can overtake another and merge with it. We provide clear observational evidence that shock merging does occur quite often in the chromosphere of sunspots. Using Hα imaging spectral data taken by the Fast Imaging Solar Spectrograph of the 1.6 m New Solar Telescope at the Big Bear Solar Observatory, we construct time–distance maps of line-of-sight velocities along two appropriately chosen cuts in a pore. The maps show a number of alternating redshift and blueshift ridges, and we identify each interface between a preceding redshift ridge and the following blueshift ridge as a shock ridge. The important finding is that two successive shock ridges often merge with each other. This finding can be theoretically explained by the merging of magneto-acoustic shock waves propagating with lower speeds of about 10 km s{sup −1} and those propagating at higher speeds of about 16–22 km s{sup −1}. The shock merging is an important nonlinear dynamical process of the solar chromosphere that can bridge the gap between higher-frequency chromospheric oscillations and lower-frequency dynamic phenomena such as fibrils.

  17. Investigating the Relationship between Fin Whales, Zooplankton Concentrations and Hydrothermal Venting on the Juan de Fuca Ridge

    DTIC Science & Technology

    2014-09-30

    correlation detector to investigate the behavior of vocalizing whales and their distribution relative to the vent fields. To determine call...in earthquake studies but has not previously been applied to marine mammals. The close relationship between biomass and acoustic backscatter...F., and Ellsworth, W. L. (2000). A double-difference earthquake location algorithm: Method and application to the northern Hayward Fault, California

  18. Resolution and quantification accuracy enhancement of functional delay and sum beamforming for three-dimensional acoustic source identification with solid spherical arrays

    NASA Astrophysics Data System (ADS)

    Chu, Zhigang; Yang, Yang; Shen, Linbang

    2017-05-01

    Functional delay and sum (FDAS) is a novel beamforming algorithm introduced for three-dimensional (3D) acoustic source identification with solid spherical microphone arrays. Capable of offering significantly attenuated sidelobes at high computation speed, the algorithm promises to play an important role in interior acoustic source identification. However, it presents some intrinsic imperfections, specifically poor spatial resolution and low quantification accuracy. This paper focuses on overcoming these imperfections by ridge detection (RD) and the deconvolution approach for the mapping of acoustic sources (DAMAS). The suggested methods are referred to as FDAS+RD and FDAS+RD+DAMAS. Both computer simulations and experiments are utilized to validate their effects. Several interesting conclusions have emerged: (1) FDAS+RD and FDAS+RD+DAMAS can both dramatically improve FDAS's spatial resolution while inheriting its advantages. (2) Compared to the conventional DAMAS, FDAS+RD+DAMAS enjoys the same super spatial resolution, stronger sidelobe attenuation capability, and a more than two hundred times faster speed. (3) FDAS+RD+DAMAS can effectively overcome FDAS's low quantification accuracy. Whether or not the focus distance equals the distance from the source to the array center, it can accurately quantify the average pressure contribution of the source. This study will be of great significance to the accurate and quick localization and quantification of acoustic sources in cabin environments.
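
    The role of the RD step is to keep only the crest (ridge) points of the beamforming output before deconvolution. The following is a minimal, generic sketch of that idea on a 2D source power map; the array model, steering vectors, and the actual FDAS formulation are not reproduced, and the local-maximum rule and dynamic-range threshold are assumptions for illustration.

    ```python
    import numpy as np

    def ridge_points(source_map, dynamic_range_db=10.0):
        """Keep only crest points of a 2D beamforming power map.

        A grid point is kept if it is a local maximum along rows or columns
        and lies within `dynamic_range_db` of the map peak. This is only an
        illustrative stand-in for a ridge-detection step, not the paper's method.
        """
        p = np.asarray(source_map, dtype=float)
        level_db = 10.0 * np.log10(p / p.max())
        keep = level_db >= -dynamic_range_db

        local_max = np.zeros_like(keep)
        # Local maximum along rows.
        local_max[:, 1:-1] |= (p[:, 1:-1] >= p[:, :-2]) & (p[:, 1:-1] >= p[:, 2:])
        # Local maximum along columns.
        local_max[1:-1, :] |= (p[1:-1, :] >= p[:-2, :]) & (p[1:-1, :] >= p[2:, :])

        return np.argwhere(keep & local_max)  # (row, col) indices of ridge points
    ```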

  19. Detection of gas hydrate with downhole logs and assessment of gas hydrate concentrations (saturations) and gas volumes on the Blake Ridge with electrical resistivity log data

    USGS Publications Warehouse

    Collett, T.S.; Ladd, J.

    2000-01-01

    Leg 164 of the Ocean Drilling Program was designed to investigate the occurrence of gas hydrate in the sedimentary section beneath the Blake Ridge on the southeastern continental margin of North America. Sites 994, 995, and 997 were drilled on the Blake Ridge to refine our understanding of the in situ characteristics of natural gas hydrate. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas hydrate-bearing sediments. Downhole logging tool strings deployed on Leg 164 included the Schlumberger quad-combination tool (NGT, LSS/SDT, DIT, CNT-G, HLDT), the Formation MicroScanner (FMS), and the Geochemical Combination Tool (GST). Electrical resistivity (DIT) and acoustic transit-time (LSS/SDT) downhole logs from Sites 994, 995, and 997 indicate the presence of gas hydrate in the depth interval between 185 and 450 mbsf on the Blake Ridge. Electrical resistivity log calculations suggest that the gas hydrate-bearing sedimentary section on the Blake Ridge may contain between 2 and 11 percent bulk volume (vol%) gas hydrate. We have determined that the log-inferred gas hydrates and underlying free-gas accumulations on the Blake Ridge may contain as much as 57 trillion m3 of gas.
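
    For orientation, resistivity-log estimates of gas hydrate saturation are commonly made with Archie's relation, in which anomalously high formation resistivity relative to a fully water-saturated baseline is attributed to hydrate filling pore space. The standard textbook form is shown below; the specific parameterization applied to the Blake Ridge sites is not reproduced here.

    \( S_w = \left( \dfrac{a\,R_w}{\phi^{m}\,R_t} \right)^{1/n}, \qquad S_h = 1 - S_w \)

    where \(R_t\) is the measured formation resistivity from the log, \(R_w\) the pore-water resistivity, \(\phi\) the porosity, and \(a\), \(m\), \(n\) empirical constants; \(S_h\) is the fraction of pore space occupied by gas hydrate, and the bulk-volume percentage quoted above corresponds to \(\phi \cdot S_h\).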

  20. Motion Between the Indian, African and Antarctic Plates in the Early Cenozoic

    NASA Astrophysics Data System (ADS)

    Cande, S. C.; Patriat, P.; Dyment, J.

    2009-12-01

    We used a three-plate, best-fit algorithm to calculate four sets of Euler rotations for India (Capricorn)-Africa (Somali), India (Capricorn)-Antarctic, and Africa (Somali)-Antarctic motion for twelve time intervals between Chrons 20 and 29 in the early Cenozoic. Each set of rotations had a different combination of data constraints. The first set of rotations used a basic set of magnetic anomaly picks on the Central Indian Ridge (CIR), Southeast Indian Ridge (SEIR) and Southwest Indian Ridge (SWIR), but did not incorporate data from the Carlsberg ridge and did not use fracture zones on the SWIR. The second set added fracture zone constraints from the region west of the Bain FZ on the SWIR and also included corrections for Nubia-Somalia and Lwandle-Somalia motion on the western and central SWIR, respectively. The third set of rotations used the basic constraints from the first rotation set and added data from the Carlsberg ridge. The fourth set of rotations combined both the additional SWIR constraints of the second data set and the Carlsberg ridge constraints of the third data set. Data on the Indian plate side of the Carlsberg ridge (Arabian Basin) were rotated to the Capricorn plate before being included in the constraints. We found that the rotations constrained by the Carlsberg ridge data set diverged from the other two sets of rotations prior to anomaly 22o. We concluded that, relative to the rest of the CIR, there is a progressively larger separation of anomalies on the Carlsberg ridge, starting at anomaly 22o and increasing to over 100 km for anomaly 26. These observations support two alternative interpretations. First, they are consistent with a distinct Seychelles microplate in the early Cenozoic. The sense of the misfit on the Carlsberg ridge is consistent with roughly 100 to 150 km of convergence across a boundary between the Seychelles microplate and Somali plate between Chrons 26 and 22 running from the Amirante Trench and extending north to the Carlsberg ridge axis. Alternatively, the misfit is consistent with convergent motion of the same magnitude between the Indian and a proto-Capricorn plate east of the CIR between Chrons 26 and 22. Our work also sharpens the dating of the two major Eocene events that Patriat and Achache (1984) recognized in the Indian Ocean: a large but gradual slowdown on the CIR and SEIR starting shortly after Chron 23o (51.9 Ma) and continuing until Chron 21y (45.3 Ma), a period of 6.6 Ma, followed two or three million years later by an abrupt change in spreading azimuth on the CIR and SEIR which occurred around Chron 20o (42.8 Ma) and which was completed by Chron 20y (41.5 Ma). No change in spreading rate accompanied the change in spreading direction.

  1. Trans-Pacific Bathymetry Survey crossing over the Pacific, Antarctic, and Nazca plates

    NASA Astrophysics Data System (ADS)

    Abe, N.; Fujiwara, T.

    2013-12-01

    Multibeam bathymetric data reveal seafloor fabrics, i.e. abyssal hills and fracture zones, and the distribution of seamounts and/or knolls, which are usually smaller than the size detectable by global predictions derived from satellite altimetry. The seafloor depths, combined with shipboard gravity data, indicate the structure of the oceanic lithosphere, its thermal state, and mantle dynamics, and provide a more accurate data set for estimating fine-scale crustal structures and subsurface mass distribution. We present the ~22000 km long survey line from northeast Japan through to the equator at the mid-Pacific and on to the southwest Chilean coast, collected during the JAMSTEC R/V Mirai MR08-06 Leg-1 cruise in January-March 2009. The cruise was a part of SORA2009 (Abe, 2009 Cruise report) for geological and geophysical studies in the southern Pacific, and was an unprecedented opportunity to collect data in regions of the Pacific Ocean that have been sparsely surveyed, using state-of-the-art echo-sounding technology. Our multibeam bathymetric and shipboard gravity survey track crossed over the Pacific, the Antarctic, and the Nazca plates, and covered lithospheric ages varying from zero to 150 Ma. Strikes of lineated abyssal hills give critical evidence for future studies of the plate reconstruction and tectonic evolution of the old Pacific Plate because magnetic lineations are unconstrained on the seafloor in the Cretaceous magnetic quiet (125-80 Ma) zone. Consecutive trends of lineated abyssal hills and fracture zones indicate a stable tectonic stress field originating from the Pacific Antarctic Ridge (PAR) and the Chile Ridge spreading systems. The seafloor fabric morphology revealed a clear boundary between the PAR and the Chile Ridge domains. The observed bathymetric boundary is probably a part of a trace of the Pacific-Antarctic-Farallon (Nazca) plate triple junction. The result will be a constraint for future studies of the plate reconstruction and tectonic evolution of the PAR, the Chile Ridge, and the Antarctic Plate. Fluctuation of the seafloor fabric strikes on the Chile Ridge off-ridge flank suggests instability of the tectonic stress field. The seafloor fabric may be largely influenced by the tectonic structure of offsets at the fracture zone system separated by short ridge segments. The offset length by fracture zones is short at the flank. The offsets of fracture zones increase as age decreases, due to ridge jumps (Bourgois et al., 2000 JGR) or changes in spreading rates (Matsumoto et al., 2013 Geochem. J.). The dominant stress may vary spatially or temporally during the fracture zone evolution. Abyssal hills elongated in the direction originating from the Chile Ridge system and fracture zones having long offset lengths distinctly intersect at right angles. We also detected many small seamounts and knolls superimposed on the seafloor fabrics. These are considered to have been constructed by excess magmatism at a mid-ocean ridge or by intra-plate volcanism.

  2. GPU based cloud system for high-performance arrhythmia detection with parallel k-NN algorithm.

    PubMed

    Tae Joon Jun; Hyun Ji Park; Hyuk Yoo; Young-Hak Kim; Daeyoung Kim

    2016-08-01

    In this paper, we propose a GPU-based Cloud system for high-performance arrhythmia detection. The Pan-Tompkins algorithm is used for QRS detection, and we optimized the beat classification algorithm with K-Nearest Neighbor (K-NN). To support high-performance beat classification on the system, we parallelized the beat classification algorithm with CUDA to execute it on virtualized GPU devices on the Cloud system. The MIT-BIH Arrhythmia database is used for validation of the algorithm. The system achieved a detection rate of about 93.5%, which is comparable to previous studies, while our algorithm shows 2.5 times faster execution time compared to a CPU-only detection algorithm.
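
    As a rough illustration of the classification step, the following sketch performs majority-vote K-NN over per-beat feature vectors. It assumes features have already been extracted around Pan-Tompkins QRS detections; the feature definition, the value of k, and the CUDA parallelization of the distance computation are not reproduced here.

    ```python
    import numpy as np

    def knn_classify(train_feats, train_labels, test_feats, k=5):
        """Classify heartbeats by majority vote among the k nearest training beats.

        train_feats  : (N, D) feature vectors of labelled beats (assumed to be
                       extracted around QRS detections).
        train_labels : (N,) non-negative integer class labels (e.g. normal vs. arrhythmic).
        test_feats   : (M, D) feature vectors of beats to classify.
        """
        preds = np.empty(len(test_feats), dtype=train_labels.dtype)
        for i, x in enumerate(test_feats):
            # Euclidean distance to every training beat; this is the part that
            # would be offloaded to the GPU in a parallel implementation.
            d = np.linalg.norm(train_feats - x, axis=1)
            nearest = np.argsort(d)[:k]
            votes = np.bincount(train_labels[nearest])
            preds[i] = np.argmax(votes)
        return preds
    ```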

  3. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g. fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template to a cryptographic key, so that the key is protected and can be accessed through fingerprint verification. In order to absorb the intrinsic fuzziness of variant fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template, which is then bound with the key after a process of fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself is secure because only its hash value is stored, and the key is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our ideas.
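
    The following is a minimal, fuzzy-commitment-style sketch of the bind/release idea: only helper data and a hash of the key are stored, and the key is reproduced only when a fresh template maps to the same bits. The toy quantizer stands in for vector quantization, and the error-correction stage of the paper is omitted, so this illustrates the storage and release logic only, not the authors' algorithm.

    ```python
    import hashlib
    import numpy as np

    def quantize(template, step=16):
        """Toy stand-in for vector quantization: coarse-quantize feature values
        so that small fingerprint variations map to the same bits."""
        bits = (np.asarray(template) // step) % 2
        return bits.astype(np.uint8)

    def bind(template_bits, key_bits):
        """Bind key to template: store only XOR helper data and a key hash."""
        helper = template_bits ^ key_bits
        key_hash = hashlib.sha256(key_bits.tobytes()).hexdigest()
        return helper, key_hash

    def release(query_template_bits, helper, key_hash):
        """Recover the key from a fresh fingerprint; succeed only if the hash matches."""
        candidate = query_template_bits ^ helper
        if hashlib.sha256(candidate.tobytes()).hexdigest() == key_hash:
            return candidate  # verification succeeded, key released
        return None           # verification failed, nothing revealed

    # Toy usage with a synthetic template and key (no error correction included).
    rng = np.random.default_rng(0)
    enroll = rng.integers(0, 256, size=64)
    key = rng.integers(0, 2, size=64, dtype=np.uint8)
    helper, key_hash = bind(quantize(enroll), key)
    print(release(quantize(enroll), helper, key_hash) is not None)  # True
    ```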

  4. Multiscale analysis of potential fields by a ridge consistency criterion: the reconstruction of the Bishop basement

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Florio, G.; Cascone, L.

    2012-01-01

    We use a multiscale approach as a semi-automated tool for interpreting potential fields. The depth to the source and the structural index are estimated in two steps: first the depth to the source, as the intersection of the field ridges (lines built by joining the extrema of the field at various altitudes), and second the structural index, from the scale function. We introduce a new criterion, called 'ridge consistency', in this strategy. The criterion is based on the principle that the structural index estimations on all the ridges converging towards the same source should be consistent. If these estimates are significantly different, field differentiation is used to lessen the interference effects from nearby sources or regional fields, to obtain a consistent set of estimates. In our multiscale framework, vertical differentiation is naturally coupled with the low-pass filtering properties of upward continuation, so it is a stable process. Before applying our criterion, we carefully studied the errors in upward continuation caused by the finite size of the survey area. To this end, we analysed the complex magnetic synthetic case, known as the Bishop model, and evaluated the best extrapolation algorithm and the optimal width of the area extension, needed to obtain accurate upward continuation. Afterwards, we applied the method to the depth estimation of the whole Bishop basement bathymetry. The result is a good reconstruction of the complex basement and of the shape properties of the source at the estimated points.
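
    The multiscale analysis relies on upward-continuing the measured field to a set of altitudes. A minimal sketch of upward continuation of a gridded field in the wavenumber domain is given below; it assumes the grid has already been extended and tapered at its edges, which is precisely the boundary problem examined in the paper.

    ```python
    import numpy as np

    def upward_continue(field, dx, dy, height):
        """Upward-continue a gridded potential field by `height` (same units as dx, dy).

        Works in the wavenumber domain: each Fourier component is damped by
        exp(-|k| * height). Assumes a regular grid that has already been
        extended/tapered at its edges to limit boundary artefacts.
        """
        f = np.asarray(field, dtype=float)
        ny, nx = f.shape
        kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
        kxg, kyg = np.meshgrid(kx, ky)
        k = np.sqrt(kxg**2 + kyg**2)
        spectrum = np.fft.fft2(f)
        continued = np.fft.ifft2(spectrum * np.exp(-k * height))
        return continued.real
    ```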

  5. Linear feature detection algorithm for astronomical surveys - I. Algorithm description

    NASA Astrophysics Data System (ADS)

    Bektešević, Dino; Vinković, Dejan

    2017-11-01

    Computer vision algorithms are powerful tools in astronomical image analyses, especially when automation of object detection and extraction is required. Modern object detection algorithms in astronomy are oriented towards detection of stars and galaxies, completely ignoring the detection of existing linear features. With the emergence of wide-field sky surveys, linear features attract scientific interest as possible trails of fast flybys of near-Earth asteroids and meteors. In this work, we describe a new linear feature detection algorithm designed specifically for implementation in big data astronomy. The algorithm combines a series of algorithmic steps that first remove other objects (stars and galaxies) from the image and then enhance the lines to enable more efficient line detection with the Hough algorithm. The rate of false positives is greatly reduced thanks to a step that replaces possible line segments with rectangles and then compares lines fitted to the rectangles with the lines obtained directly from the image. The speed of the algorithm and its applicability in astronomical surveys are also discussed.
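
    As a rough illustration of the Hough stage, the sketch below accumulates (rho, theta) votes over a binary mask and returns the strongest candidate lines. It assumes the earlier object-removal and line-enhancement steps have already produced the mask; the peak-selection rule and parameter values are placeholders, not the authors' pipeline.

    ```python
    import numpy as np

    def hough_lines(mask, n_theta=180, n_peaks=5):
        """Detect straight lines in a binary mask with a basic Hough transform.

        Each foreground pixel votes for all (theta, rho) lines passing through it;
        the strongest accumulator peaks are returned as candidate lines.
        """
        ys, xs = np.nonzero(mask)
        thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
        diag = int(np.ceil(np.hypot(*mask.shape)))
        rhos = np.arange(-diag, diag + 1)
        acc = np.zeros((len(rhos), n_theta), dtype=np.int64)
        cos_t, sin_t = np.cos(thetas), np.sin(thetas)
        for x, y in zip(xs, ys):
            # rho = x*cos(theta) + y*sin(theta), shifted to a non-negative index.
            r = np.round(x * cos_t + y * sin_t).astype(int) + diag
            acc[r, np.arange(n_theta)] += 1
        peaks = np.argsort(acc.ravel())[::-1][:n_peaks]
        ri, ti = np.unravel_index(peaks, acc.shape)
        # Return (rho, theta) pairs of the strongest candidate lines.
        return [(rhos[r], thetas[t]) for r, t in zip(ri, ti)]
    ```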

  6. East Pacific Rise 18 deg-19 deg S: Asymmetric spreading and ridge reorientation by ultrafast migration of axial discontinuities

    NASA Astrophysics Data System (ADS)

    Cormier, Marie-Helene; MacDonald, Ken C.

    1994-01-01

    A detailed bathymetric, side scan, and magnetic survey of the East Pacific Rise out to a seafloor age of 1 Ma has been carried out between 18 deg and 19 deg S. It reveals that some left-stepping axial discontinuities have been migrating southward at rates an order of magnitude faster than the spreading rates (1000 mm/a or higher). These rapid migration events have left on the Nazca plate discordant features striking nearly parallel to the ridge axis. A discontinuity with an offset of several kilometers has migrated in two stages at around 0.45 and 0.3 Ma, and has left two large discordant zones consisting of a series of unfaulted, hummocky basins bounded to the east by short ridges oriented about N-S, oblique to the ambient 013 deg fabric. The morphology and reflectivity characteristics of these discordant zones are akin to the overlap basins and abandoned ridge tips which make up the migration trails of large, slowly-migrating overlapping spreading centers. Between 18 deg 35 min and 19 deg 03 min S, the ridge axis is flanked a few kilometers to the east by a prominent, sedimented ridge previously recognized as a recent abandoned ridge axis. The present ridge segment steadily deepens and narrows southward, which suggests the abandoned ridge has been rafted onto the Nazca plate during the ultrafast southward propagation of the ridge segment rather than by one discrete ridge jump. By transferring Pacific lithosphere to the Nazca plate, these migration events account for most of the asymmetric accretion observed (faster to the east). This process is consistent with the features common to asymmetric spreading, namely the sudden onset or demise of asymmetric spreading, and the ridge-segment-to-ridge-segment variability. Because the discordant zones left by these rapid migration events are near-parallel to the ambient seafloor fabric, they are unlikely to be detected by conventional bathymetry or magnetic surveys, and so-called 'ridge-jumps' may actually often represent ultrafast propagation of a ridge segment. Variations in fault azimuth with age show there has not been any significant change in spreading direction over the past 0.8 m.y. Instead, the counterclockwise trend of the East Pacific Rise relative to the Brunhes/Matuyama reversal (0.78 Ma) mostly reflects that ultrafast propagation of ridge segments has transferred a larger amount of Pacific lithosphere to the Nazca plate at 18 deg S than at 19 deg S. In keeping with the regional features of the magnetic anomalies, we propose that an 8 to 10 km left-stepping discontinuity which was located between 17 deg and 17 deg 30 min S at 0.78 Ma has been recently redistributed into the present staircase of small left-stepping discontinuities between 16 deg and 19 deg S. This smoothing of the ridge geometry probably occurred through repeated small lateral steps of the ridge segments inside of the discontinuities during ultra-fast propagation episodes, and may be the consequence of a significant replenishment of the magma reservoir between 17 deg and 17 deg 30 min S during the past 1 m.y.

  7. An efficient parallel termination detection algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A. H.; Crivelli, S.; Jessup, E. R.

    2004-05-27

    Information local to any one processor is insufficient to monitor the overall progress of most distributed computations. Typically, a second distributed computation for detecting termination of the main computation is necessary. In order to be a useful computational tool, the termination detection routine must operate concurrently with the main computation, adding minimal overhead, and it must promptly and correctly detect termination when it occurs. In this paper, we present a new algorithm for detecting the termination of a parallel computation on distributed-memory MIMD computers that satisfies all of those criteria. A variety of termination detection algorithms have been devised. Of these, the algorithm presented by Sinha, Kale, and Ramkumar (henceforth, the SKR algorithm) is unique in its ability to adapt to the load conditions of the system on which it runs, thereby minimizing the impact of termination detection on performance. Because their algorithm also detects termination quickly, we consider it to be the most efficient practical algorithm presently available. The termination detection algorithm presented here was developed for use in the PMESC programming library for distributed-memory MIMD computers. Like the SKR algorithm, our algorithm adapts to system loads and imposes little overhead. Also like the SKR algorithm, ours is tree-based, and it does not depend on any assumptions about the physical interconnection topology of the processors or the specifics of the distributed computation. In addition, our algorithm is easier to implement and requires only half as many tree traversals as does the SKR algorithm. This paper is organized as follows. In section 2, we define our computational model. In section 3, we review the SKR algorithm. We introduce our new algorithm in section 4, and prove its correctness in section 5. We discuss its efficiency and present experimental results in section 6.

  8. ELASTIC NET FOR COX'S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM.

    PubMed

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox's proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox's proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems.
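
    In standard notation (not copied from the paper), the objective traced by such a path can be written as the elastic-net-penalized negative log partial likelihood:

    \( \min_{\beta}\; -\ell(\beta) + \lambda\Big( \alpha\,\lVert\beta\rVert_1 + \tfrac{1-\alpha}{2}\,\lVert\beta\rVert_2^2 \Big), \qquad \ell(\beta) = \sum_{i:\,\delta_i=1} \Big[ x_i^{\top}\beta - \log \sum_{j \in R(t_i)} \exp\!\big(x_j^{\top}\beta\big) \Big] \)

    where \(\delta_i\) is the event indicator, \(R(t_i)\) the risk set at time \(t_i\), and \(\alpha\) blends the LASSO (\(\alpha = 1\)) and ridge (\(\alpha = 0\)) penalties; the fixed small ridge term mentioned above corresponds to a small value of \((1-\alpha)\lambda\).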

  9. Learning to Predict Combinatorial Structures

    NASA Astrophysics Data System (ADS)

    Vembu, Shankar

    2009-12-01

    The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.

  10. An improved principal component analysis based region matching method for fringe direction estimation

    NASA Astrophysics Data System (ADS)

    He, A.; Quan, C.

    2018-04-01

    The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and the algorithm for conversion of orientation to direction in mask areas is computationally heavy and non-optimized. We propose an improved PCA-based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme, and a fast and optimized orientation-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used in the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used to refine the phase. The robustness and effectiveness of the proposed method are demonstrated by both simulated and experimental fringe patterns.

  11. Seismicity And Accretion Process Along The North Mid-Atlantic Ridge From The SIRENA Autonomous Hydrophone Data

    NASA Astrophysics Data System (ADS)

    Perrot, J.; Goslin, J.; Dziak, R. P.; Haxel, J. H.; Maia, M. A.; Tisseau, C.; Royer, J.

    2009-12-01

    The seismicity of the North Atlantic Ocean was recorded by the SIRENA array of 6 autonomous underwater hydrophones (AUH) moored within the SOFAR channel on the flanks of the Mid-Atlantic Ridge (MAR). The instruments were deployed north of the Azores Plateau between 40° and 50°N from June 2002 to September 2003. The low attenuation properties of the SOFAR channel for earthquake T-wave propagation result in a detection threshold reduction to a magnitude completeness level (Mc) of ~2.8, to be compared to a Mc~4.7 for MAR events recorded by land-based seismic networks. A spatio-temporal analysis was performed over the 1696 events localized inside the SIRENA array. For hydrophone-derived catalogs, the acoustic magnitude, or Source Level (SL), is used as a measure of earthquake size. The "source level completeness", above which the data set is complete, is SLc=208 dB. The SIRENA catalog was searched for swarms using the cluster software of the SEISAN distribution. A minimum SL of 210 dB was chosen to detect a possible mainshock, and all subsequent events within 40 days following the possible mainshock, located within a radius of 15 km from the mainshock, were considered as events of the swarm. A distance of 15 km corresponds to the maximum fault length in a slow-ridge context. Eleven swarms with more than 15 events were detected along the MAR between 40° and 50°N during the SIRENA deployment. The maximum number of earthquakes in a swarm is 40 events. The SL vs. time distribution within each swarm allowed a first discrimination between the swarms occurring in a tectonic context and those which can be attributed to volcanic processes, the latter showing a more constant SL vs. time distribution. Moreover, the swarms occurring in a tectonic context show a "mainshock-aftershock" distribution of the cumulative number of events vs. time, fitting a Modified Omori Law. The location of tectonic and volcanic swarms correlates well with regions where positive and negative Mantle Bouguer Anomalies (MBAs) (Maia et al., 2007) are observed, indicating the presence of thinner/colder and thicker/warmer crust respectively. Our results thus show that hydrophone data can be fruitfully used to help characterize active ridge processes at various spatial scales. Maia M., J. Goslin, and P. Gente (2007), Evolution of accretion processes along the Mid-Atlantic Ridge north of the Azores since 5.5 Ma: An insight into the interactions between the ridge and the plume, Geochem. Geophys. Geosyst., 8.

  12. Tectonics and evolution of the Juan Fernandez microplate at the Pacific-Nazca-Antarctic triple junction

    NASA Technical Reports Server (NTRS)

    Anderson-Fontana, S.; Larson, R. L.; Engein, J. F.; Lundgren, P.; Stein, S.

    1986-01-01

    Magnetic and bathymetric profiles derived from the R/V Endeavor survey and focal mechanism studies for earthquakes on two of the Juan Fernandez microplate boundaries are analyzed. It is observed that the Nazca-Juan Fernandez pole is in the northern end of the microplate since the magnetic lineation along the East Ridge of the microplate fans to the south. The calculation of the relative motion of the Juan Fernandez-Pacific-Nazca-Antarctic four-plate system using the algorithm of Minster et al. (1974) is described. The development of tectonic and evolutionary models of the region is examined. The tectonic model reveals that the northern boundary of the Juan Fernandez microplate is a zone of compression and that the West Ridge and southwestern boundary are spreading obliquely; the evolutionary model relates the formation of the Juan Fernandez microplate to differential spreading rates at the triple junction.

  13. Passive tire pressure sensor and method

    DOEpatents

    Pfeifer, Kent Bryant; Williams, Robert Leslie; Waldschmidt, Robert Lee; Morgan, Catherine Hook

    2006-08-29

    A surface acoustic wave device includes a micro-machined pressure transducer for monitoring tire pressure. The device is configured having a micro-machined cavity that is sealed with a flexible conductive membrane. When an external tire pressure equivalent to the cavity pressure is detected, the membrane makes contact with ridges on the backside of the surface acoustic wave device. The ridges are electrically connected to conductive fingers of the device. When the detected pressure is correct, selected fingers on the device will be grounded producing patterned acoustic reflections to an impulse RF signal. When the external tire pressure is less than the cavity reference pressure, a reduced reflected signal to the receiver results. The sensor may further be constructed so as to identify itself by a unique reflected identification pulse series.

  14. Passive tire pressure sensor and method

    DOEpatents

    Pfeifer, Kent Bryant; Williams, Robert Leslie; Waldschmidt, Robert Lee; Morgan, Catherine Hook

    2007-09-04

    A surface acoustic wave device includes a micro-machined pressure transducer for monitoring tire pressure. The device is configured having a micro-machined cavity that is sealed with a flexible conductive membrane. When an external tire pressure equivalent to the cavity pressure is detected, the membrane makes contact with ridges on the backside of the surface acoustic wave device. The ridges are electrically connected to conductive fingers of the device. When the detected pressure is correct, selected fingers on the device will be grounded producing patterned acoustic reflections to an impulse RF signal. When the external tire pressure is less than the cavity reference pressure, a reduced reflected signal to the receiver results. The sensor may further be constructed so as to identify itself by a unique reflected identification pulse series.

  15. Fingerprint deposition on nitrocellulose and polyvinylidene difluoride membranes using alkaline phosphatase.

    PubMed

    Kurien, Biji T; Danda, Debashish; Scofield, R Hal

    2015-01-01

    Dactyloscopy or fingerprint identification is a vital part of forensic evidence. Identification with fingerprints has been known since the finding of finger impressions on the clay surface of Babylonian legal contracts almost 4,000 years ago. The skin on the fingers and palms appears as grooves and ridges when observed under a microscope. A unique fingerprint is produced by the patterns of these friction skin ridges. Visible fingerprints can be deposited on solid surfaces. Colored inks have been used to deposit fingermarks on documents. Herein, we show that alkaline phosphatase can be used to transfer prints from fingers or palm to nitrocellulose or polyvinylidene difluoride membranes. The prints can be detected by using the nitro blue tetrazolium/5-bromo-4-chloro-3-indolyl phosphate method of detection.

  16. Radar Detection of Marine Mammals

    DTIC Science & Technology

    2011-09-30

    BFT-BPT algorithm for use with our radar data. This track-before-detect algorithm had been effective in enhancing small but persistent signatures in...will be possible with the detect-before-track algorithm. We next evaluated the track-before-detect algorithm, the BFT-BPT, on the CEDAR data

  17. Preliminary Results from an Hydroacoustic Experiment in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Royer, J.; Dziak, R. P.; Delatre, M.; Brachet, C.; Haxel, J. H.; Matsumoto, H.; Goslin, J.; Brandon, V.; Bohnenstiehl, D. R.; Guinet, C.; Samaran, F.

    2008-12-01

    We report initial results from a 14-month hydroacoustic experiment in the Indian Ocean conducted by CNRS/University of Brest and NOAA/Oregon State University. The objective was to monitor the low-level seismic activity associated with the three contrasting spreading ridges and deforming zones in the Indian Ocean. Three autonomous hydrophones, moored in the SOFAR channel, were deployed in October 2006 and recovered in early 2008 by R/V Marion Dufresne, in the Madagascar Basin, and northeast and southwest of Amsterdam Island, complementing the two permanent hydroacoustic stations of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) located near Diego Garcia Island and off Cape Leeuwin. Our temporary network detected more than 2000 events. Inside the array, we located 592 events (compared to 49 in the NEIC earthquake catalog) with location errors less than 5 km and time errors less than 2 s. The hydrophone array detected on average 5 to 40 times more events per month than land-based networks. First-order observations indicate that hydroacoustic seismicity along the Southeast Indian Ridge (SEIR) occurs predominantly along the transform faults. The Southwest Indian Ridge exhibits some periodicity in earthquake activity between adjacent ridge segments. Two large tectonic/volcanic earthquake swarms are observed along the Central Indian Ridge (near the triple junction) in September and December 2007. Moreover, many off ridge-axis events are also observed both south and north of the SEIR axis. Improved localization using the CTBTO records will help refine these preliminary results and further investigate extended volcanic sequences along the SEIR east of 80°E and other events outside of the temporary array. The records also display numerous vocalizations of baleen whales in the 20-40 Hz band. The calls are attributed to fin whales, Antarctic blue whales and pygmy blue whales of the Madagascar and Australian types. Their vocal activity is found to be highly seasonal, occurring mainly from April to October with subspecies variations. This array thus provides a unique data set to improve our understanding of the seismic activity in this region and to establish the occurrence and migration pattern of critically endangered whale species.

  18. EM Bias-Correction for Ice Thickness and Surface Roughness Retrievals over Rough Deformed Sea Ice

    NASA Astrophysics Data System (ADS)

    Li, L.; Gaiser, P. W.; Allard, R.; Posey, P. G.; Hebert, D. A.; Richter-Menge, J.; Polashenski, C. M.

    2016-12-01

    Very rough, ridged sea ice accounts for a significant percentage of the total ice area and an even larger percentage of the total ice volume. The commonly used radar altimeter surface detection techniques are empirical in nature and work well only over level/smooth sea ice. Rough sea ice surfaces can modify the return waveforms, resulting in significant electromagnetic (EM) bias in the estimated surface elevations, and thus large errors in the ice thickness retrievals. To understand and quantify such sea ice surface roughness effects, a combined EM rough surface and volume scattering model was developed to simulate radar returns from the rough sea ice `layer cake' structure. A waveform matching technique was also developed to fit observed waveforms to a physically-based waveform model and subsequently correct the roughness-induced EM bias in the estimated freeboard. This new EM Bias Corrected (EMBC) algorithm was able to better retrieve surface elevations and estimate the surface roughness parameter simultaneously. In situ data from multi-instrument airborne and ground campaigns were used to validate the ice thickness and surface roughness retrievals. For the surface roughness retrievals, we applied this EMBC algorithm to coincident LiDAR/radar measurements collected during a Cryosat-2 under-flight by the NASA IceBridge missions. Results show that not only does the waveform model fit very well to the measured radar waveform, but also the roughness parameters derived independently from the LiDAR and radar data agree very well for both level and deformed sea ice. For sea ice thickness retrievals, validation based on in-situ data from the coordinated CRREL/NRL field campaign demonstrates that the physically-based EMBC algorithm performs fundamentally better than the empirical algorithm over very rough deformed sea ice, suggesting that sea ice surface roughness effects can be modeled and corrected based solely on the radar return waveforms.

  19. Performances of the New Real Time Tsunami Detection Algorithm applied to tide gauges data

    NASA Astrophysics Data System (ADS)

    Chierici, F.; Embriaco, D.; Morucci, S.

    2017-12-01

    Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection (TDA) based on the real-time tide removal and real-time band-pass filtering of seabed pressure time series acquired by Bottom Pressure Recorders. The TDA algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability with respect to the most widely used tsunami detection algorithm, while containing the computational cost. The algorithm is designed to be used also in autonomous early warning systems with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test the tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. In this work we present the performance of the TDA algorithm applied to tide gauge data. We have adapted the new tsunami detection algorithm and the Monte Carlo test methodology to tide gauges. Sea level data acquired by coastal tide gauges in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event generated by the Tohoku earthquake on 11 March 2011, using data recorded by several tide gauges scattered all over the Pacific area.
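
    An offline sketch of the detrend + band-pass + threshold idea is given below, assuming NumPy/SciPy are available. It is not the authors' real-time recursive formulation: the moving-average tide removal, the Butterworth band edges, and the fixed threshold are all assumptions for illustration, and filtfilt is a non-causal (offline) filter.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_tsunami(pressure, fs, band=(1/3600.0, 1/120.0), threshold=0.03):
        """Offline sketch of detrend + band-pass + threshold detection.

        pressure  : sea-level or bottom-pressure series (metres of water equivalent).
        fs        : sampling frequency in Hz (must exceed twice the upper band edge).
        band      : pass band in Hz bracketing typical tsunami periods (~2-60 min);
                    placeholder values, not the TDA settings.
        threshold : detection threshold on the filtered signal, in metres.
        Returns the sample indices where the filtered amplitude exceeds the threshold.
        """
        x = np.asarray(pressure, dtype=float)
        # Crude tide removal: subtract a long (2-hour) moving average.
        win = int(2 * 3600 * fs)
        tide = np.convolve(x, np.ones(win) / win, mode="same")
        residual = x - tide
        # Band-pass filter isolating tsunami-band oscillations.
        b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, residual)
        return np.nonzero(np.abs(filtered) > threshold)[0]
    ```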

  20. A Novel Zero Velocity Interval Detection Algorithm for Self-Contained Pedestrian Navigation System with Inertial Sensors

    PubMed Central

    Tian, Xiaochun; Chen, Jiabin; Han, Yongqiang; Shang, Jianyu; Li, Nan

    2016-01-01

    Zero velocity update (ZUPT) plays an important role in pedestrian navigation algorithms with the premise that the zero velocity interval (ZVI) should be detected accurately and effectively. A novel adaptive ZVI detection algorithm based on a smoothed pseudo Wigner–Ville distribution to remove multiple frequencies intelligently (SPWVD-RMFI) is proposed in this paper. The novel algorithm adopts the SPWVD-RMFI method to extract the pedestrian gait frequency and to calculate the optimal ZVI detection threshold in real time by establishing the function relationships between the thresholds and the gait frequency; then, the adaptive adjustment of thresholds with gait frequency is realized and improves the ZVI detection precision. To put it into practice, a ZVI detection experiment is carried out; the result shows that compared with the traditional fixed threshold ZVI detection method, the adaptive ZVI detection algorithm can effectively reduce the false and missed detection rate of ZVI; this indicates that the novel algorithm has high detection precision and good robustness. Furthermore, pedestrian trajectory positioning experiments at different walking speeds are carried out to evaluate the influence of the novel algorithm on positioning precision. The results show that the ZVI detected by the adaptive ZVI detection algorithm for pedestrian trajectory calculation can achieve better performance. PMID:27669266
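
    For contrast with the adaptive scheme, the sketch below implements the traditional fixed-threshold ZVI detection that the paper improves upon: a sample is declared still when the accelerometer magnitude is close to gravity and the angular rate is small. The threshold values are assumptions; the gait-frequency-adaptive thresholding of SPWVD-RMFI is not reproduced here.

    ```python
    import numpy as np

    def zero_velocity_intervals(acc, gyro, fs, acc_tol=0.3, gyro_tol=0.5, min_len=0.1):
        """Fixed-threshold stand-in for zero-velocity interval (ZVI) detection.

        acc     : (N, 3) accelerometer samples in m/s^2.
        gyro    : (N, 3) gyroscope samples in rad/s.
        fs      : sampling frequency in Hz.
        A sample is 'still' when the accelerometer magnitude is close to gravity
        and the angular rate is small; runs of still samples longer than `min_len`
        seconds are reported as ZVIs.
        """
        g = 9.81
        acc_mag = np.linalg.norm(acc, axis=1)
        gyro_mag = np.linalg.norm(gyro, axis=1)
        still = (np.abs(acc_mag - g) < acc_tol * g) & (gyro_mag < gyro_tol)

        intervals, start = [], None
        for i, s in enumerate(still):
            if s and start is None:
                start = i
            elif not s and start is not None:
                if (i - start) / fs >= min_len:
                    intervals.append((start, i))
                start = None
        if start is not None and (len(still) - start) / fs >= min_len:
            intervals.append((start, len(still)))
        return intervals
    ```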

  1. Vascularized interpositional periosteal connective tissue flap: A modern approach to augment soft tissue

    PubMed Central

    Agarwal, Chitra; Deora, Savita; Abraham, Dennis; Gaba, Rohini; Kumar, Baron Tarun; Kudva, Praveen

    2015-01-01

    Context: Nowadays, esthetics plays an important role in dentistry along with the function of the prosthesis. Various soft tissue augmentation procedures are available to correct ridge defects in the anterior region. A newer technique, the vascularized interpositional periosteal connective tissue (VIP-CT) flap, has been introduced; it has the potential to augment a predictable amount of tissue and has many benefits when compared to other techniques. Aim: The study was designed to determine the efficacy of the VIP-CT flap in augmenting the ridge defect. Materials and Methods: Ten patients with Class III (Seibert's) ridge defects were treated with the VIP-CT flap technique before fabrication of a fixed partial denture. Height and width of the ridge defects were measured before and after the procedure. Subsequent follow-up was done every 3 months for 1 year. Statistical Analysis Used: Paired t-test was performed to detect the significance of the procedure. Results: The surgical site healed uneventfully. A predictable amount of soft tissue augmentation was achieved with the procedure. The increase in height and width of the ridge was statistically highly significant. Conclusion: The VIP-CT flap technique was effective in augmenting the soft tissue in the esthetic area, and the result remained stable over a long period. PMID:25810597

  2. Meta-analysis of cancer gene expression signatures reveals new cancer genes, SAGE tags and tumor associated regions of co-regulation

    PubMed Central

    Kavak, Erşen; Ünlü, Mustafa; Nistér, Monica; Koman, Ahmet

    2010-01-01

    Cancer is among the major causes of human death and its mechanism(s) are not fully understood. We applied a novel meta-analysis approach to multiple sets of merged serial analysis of gene expression and microarray cancer data in order to analyze transcriptome alterations in human cancer. Our methodology, which we denote ‘COgnate Gene Expression patterNing in tumours’ (COGENT), unmasked numerous genes that were differentially expressed in multiple cancers. COGENT detected well-known tumor-associated (TA) genes such as TP53, EGFR and VEGF, as well as many multi-cancer, but not-yet-tumor-associated genes. In addition, we identified 81 co-regulated regions on the human genome (RIDGEs) by using expression data from all cancers. Some RIDGEs (28%) consist of paralog genes while another subset (30%) are specifically dysregulated in tumors but not in normal tissues. Furthermore, a significant number of RIDGEs are associated with GC-rich regions on the genome. All assembled data is freely available online (www.oncoreveal.org) as a tool implementing COGENT analysis of multi-cancer genes and RIDGEs. These findings engender a deeper understanding of cancer biology by demonstrating the existence of a pool of under-studied multi-cancer genes and by highlighting the cancer-specificity of some TA-RIDGEs. PMID:20621981

  3. Detection of Chorus Elements and other Wave Signatures Using Geometric Computational Techniques in the Van Allen radiation belts

    NASA Astrophysics Data System (ADS)

    Sengupta, A.; Kletzing, C.; Howk, R.; Kurth, W. S.

    2017-12-01

    An important goal of the Van Allen Probes mission is to understand wave-particle interactions that can energize relativistic electrons in the Earth's Van Allen radiation belts. The EMFISIS instrumentation suite provides measurements of the wave electric and magnetic fields of wave features, such as chorus, that participate in these interactions. Geometric signal processing discovers structural relationships, e.g. connectivity across ridge-like features in chorus elements, to reveal properties such as the dominant angles of the element (frequency sweep rate) and the integrated power along a given chorus element. These techniques disambiguate such wave features against background hiss-like chorus. This enables autonomous discovery of chorus elements across the large volumes of EMFISIS data. At the scale of individual or overlapping chorus elements, topological pattern recognition techniques enable interpretation of chorus microstructure by discovering connectivity and other geometric features within the wave signature of a single chorus element or between overlapping chorus elements. Thus chorus wave features can be quantified and studied at multiple scales of spectral geometry using geometric signal processing techniques. We present recently developed computational techniques that exploit the spectral geometry of chorus elements and whistlers to enable large-scale automated discovery, detection and statistical analysis of these events over EMFISIS data. Specifically, we present case studies across a diverse portfolio of chorus elements and discuss the performance of our algorithms regarding precision of detection as well as interpretation of chorus microstructure. We also provide a large-scale statistical analysis of the distribution of dominant sweep rates and other properties of the detected chorus elements.
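
    As a rough illustration of ridge-style processing on a wave spectrogram (not the EMFISIS pipeline), one can track the frequency of maximum power in each time bin above a power floor and fit a line through those ridge points to estimate a dominant sweep rate. The FFT parameters and the -20 dB floor are assumptions.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def dominant_sweep_rate(x, fs, floor_db=-20.0):
        """Estimate a dominant frequency sweep rate (Hz/s) from a 1-D waveform."""
        f, t, S = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
        S_db = 10 * np.log10(S + 1e-20)

        # Ridge: frequency of peak power in each time bin, kept only where the
        # power exceeds an assumed floor relative to the spectrogram maximum.
        peak_idx = S_db.argmax(axis=0)
        keep = S_db.max(axis=0) > S_db.max() + floor_db
        if keep.sum() < 2:
            return None
        ridge_f, ridge_t = f[peak_idx[keep]], t[keep]

        # Sweep rate from a least-squares line through the ridge points.
        slope, _ = np.polyfit(ridge_t, ridge_f, 1)
        return slope
    ```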

  4. Determination of the Ecological and Geographic Distributions of Armillaria Species in Missouri Ozark Forest Ecosystems

    Treesearch

    Johann N. Bruhn; James J. Wetteroff; Jeanne D. Mihail; Susan Burks

    1997-01-01

    Armillaria root rot contributes to oak decline in the Ozarks. Three Armillaria species were detected in Ecological Landtypes (ELT's) representing south- to west-facing side slopes (ELT 17), north- to east-facing side slopes (ELT 18), and ridge tops (ELT 11). Armillaria mellea was detected in 91 percent...

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elmagarmid, A.K.

    The availability of distributed data bases is directly affected by the timely detection and resolution of deadlocks. Consequently, mechanisms are needed to make deadlock detection algorithms resilient to failures. Presented first is a centralized algorithm that allows transactions to have multiple requests outstanding. Next, a new distributed deadlock detection algorithm (DDDA) is presented, using a global detector (GD) to detect global deadlocks and local detectors (LDs) to detect local deadlocks. This algorithm essentially identifies transaction-resource interactions that may cause global (multisite) deadlocks. Third, a deadlock detection algorithm utilizing a transaction-wait-for (TWF) graph is presented. It is a fully disjoint algorithm that allows multiple outstanding requests. The proposed algorithm can achieve improved overall performance by using multiple disjoint controllers coupled with the two-phase property while maintaining the simplicity of centralized schemes. Fourth, an algorithm that combines deadlock detection and avoidance is given. This algorithm uses concurrent transaction controllers and resource coordinators to achieve maximum distribution. The language of CSP is used to describe this algorithm. Finally, two efficient deadlock resolution protocols are given along with some guidelines to be used in choosing a transaction for abortion.
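
    For orientation, a deadlock in a transaction-wait-for (TWF) graph corresponds to a cycle. The sketch below is a generic depth-first cycle check over such a graph, not the distributed DDDA described above.

    ```python
    def has_deadlock(twf):
        """Return True if the transaction-wait-for graph contains a cycle.

        twf -- dict mapping a transaction id to the set of transactions it waits for.
        """
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {}

        def visit(t):
            color[t] = GRAY
            for u in twf.get(t, ()):
                c = color.get(u, WHITE)
                if c == GRAY:                 # back edge: cycle, hence deadlock
                    return True
                if c == WHITE and visit(u):
                    return True
            color[t] = BLACK
            return False

        return any(color.get(t, WHITE) == WHITE and visit(t) for t in twf)

    # Example: T1 waits for T2 and T2 waits for T1, so a deadlock is reported.
    print(has_deadlock({"T1": {"T2"}, "T2": {"T1"}}))   # True
    ```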

  6. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.

  7. Comparison of public peak detection algorithms for MALDI mass spectrometry data analysis.

    PubMed

    Yang, Chao; He, Zengyou; Yu, Weichuan

    2009-01-06

    In mass spectrometry (MS) based proteomic data analysis, peak detection is an essential step for subsequent analysis. Recently, there has been significant progress in the development of various peak detection algorithms. However, neither a comprehensive survey nor an experimental comparison of these algorithms is yet available. The main objective of this paper is to provide such a survey and to compare the performance of single-spectrum-based peak detection methods. In general, a peak detection procedure can be decomposed into three consecutive steps: smoothing, baseline correction and peak finding. We first categorize existing peak detection algorithms according to the techniques used in the different phases. Such a categorization reveals the differences and similarities among existing peak detection algorithms. Then, we choose five typical peak detection algorithms to conduct a comprehensive experimental study using both simulated data and real MALDI MS data. The results of the comparison show that the continuous wavelet-based algorithm provides the best average performance.
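
    A minimal example of continuous wavelet transform based peak picking, the class of methods that performed best in the comparison; scipy's find_peaks_cwt is used as a stand-in for the specific implementations surveyed, and the width range is an assumption.

    ```python
    import numpy as np
    from scipy.signal import find_peaks_cwt

    # Synthetic "spectrum": two Gaussian peaks on top of noise.
    mz = np.linspace(0, 100, 2000)
    spectrum = (np.exp(-(mz - 30) ** 2 / 2)
                + 0.5 * np.exp(-(mz - 70) ** 2 / 8)
                + 0.02 * np.random.randn(mz.size))

    # CWT-based peak finding over a range of peak widths (in samples, assumed).
    peak_idx = find_peaks_cwt(spectrum, widths=np.arange(5, 60))
    print(mz[peak_idx])        # approximate m/z locations of the detected peaks
    ```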

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Dib, Gerges

    This project involved the development of enhanced risk monitors (ERMs) for active components in Advanced Reactor (AdvRx) designs by integrating real-time information about equipment condition with risk monitors. Health monitoring techniques in combination with predictive estimates of component failure based on condition and risk monitors can serve to indicate the risk posed by continued operation in the presence of detected degradation. This combination of predictive health monitoring based on equipment condition assessment and risk monitors can also enable optimization of maintenance scheduling with respect to the economics of plant operation. This report summarizes PNNL's multi-year project on the development and evaluation of an ERM concept for active components while highlighting FY2016 accomplishments. Specifically, this report provides a status summary of the integration and demonstration of the prototypic ERM framework with the plant supervisory control algorithms being developed at Oak Ridge National Laboratory (ORNL), and describes additional case studies conducted to assess the sensitivity of the technology to different quantities. Supporting documentation on the software package to be provided to ORNL is incorporated in this report.

  9. 3D matching techniques using OCT fingerprint point clouds

    NASA Astrophysics Data System (ADS)

    Gutierrez da Costa, Henrique S.; Silva, Luciano; Bellon, Olga R. P.; Bowden, Audrey K.; Czovny, Raphael K.

    2017-02-01

    Optical Coherence Tomography (OCT) makes viable the acquisition of 3D fingerprints from both the dermis and epidermis skin layers and their interfaces, exposing features that can be exploited to improve biometric identification, such as curvatures and distinctive 3D regions. Scanned images from eleven volunteers allowed the construction of the first OCT 3D fingerprint database, to our knowledge, containing epidermal and dermal fingerprints. 3D dermal fingerprints can be used to overcome cases of Failure to Enroll (FTE) due to poor ridge image quality and skin alterations, cases that affect 2D matching performance. We evaluate three matching techniques, including the well-established Iterative Closest Points algorithm (ICP), the Surface Interpenetration Measure (SIM) and the well-known KH Curvature Maps, all assessed using a 3D OCT fingerprint database, the first one for this purpose. Two of these techniques are based on registration and one on curvatures. These were evaluated and compared, and the fusion of matching scores was assessed. We applied a sequence of steps to extract regions of interest (ROI) named minutiae clouds, representing small regions around distinctive minutiae, usually located at ridge/valley endings or bifurcations. The ROI is acquired from the epidermis and the dermis-epidermis interface by OCT imaging. A comparative analysis of identification accuracy was explored using different scenarios, and the obtained results show improvements for biometric identification. A comparison against 2D fingerprint matching algorithms is also presented to assess the improvements.
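
    A bare-bones point-to-point ICP loop (nearest neighbours followed by an SVD-based rigid alignment) is sketched below purely to illustrate the registration step used in such pipelines; it omits the SIM and curvature-based scoring discussed in the abstract, and the iteration count is an assumption.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def icp(src, dst, iters=30):
        """Rigidly align src (N, 3) to dst (M, 3); returns aligned src, R, t."""
        src = src.copy()
        R_total, t_total = np.eye(3), np.zeros(3)
        tree = cKDTree(dst)
        for _ in range(iters):
            # 1. Correspondences: nearest destination point for each source point.
            _, idx = tree.query(src)
            matched = dst[idx]
            # 2. Best rigid transform via SVD of the cross-covariance (Kabsch).
            src_c, dst_c = src.mean(axis=0), matched.mean(axis=0)
            H = (src - src_c).T @ (matched - dst_c)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = dst_c - R @ src_c
            # 3. Apply the incremental transform and accumulate it.
            src = src @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        return src, R_total, t_total
    ```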

  10. Environmental baseline survey report for West Black Oak Ridge, East Black Oak Ridge, McKinney Ridge, West Pine Ridge and parcel 21D in the vicinity of the East Technology Park, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    2012-11-29

    This environmental baseline survey (EBS) report documents the baseline environmental conditions of five land parcels located near the U.S. Department of Energy's (DOE's) East Tennessee Technology Park (ETTP), including West Black Oak Ridge, East Black Oak Ridge, McKinney Ridge, West Pine Ridge, and Parcel 21d. Preparation of this report included a detailed search of federal government records, title documents, and aerial photos that may reflect prior uses, and visual inspections of the property and adjacent properties. Interviews with current employees involved in, or familiar with, operations on the real property were also conducted to identify any areas on the property where hazardous substances and petroleum products, or their derivatives, and acutely hazardous wastes may have been released or disposed. In addition, a search was made of reasonably obtainable federal, state, and local government records of each adjacent facility where there has been a release of any hazardous substance or any petroleum product or their derivatives, including aviation fuel and motor oil, and which is likely to cause or contribute to a release of any hazardous substance or any petroleum product or its derivatives, including aviation fuel or motor oil, on the real property. A radiological survey and soil/sediment sampling were conducted to assess baseline conditions of Parcel 21d that were not addressed by the soils-only no-further-investigation (NFI) reports. Groundwater sampling was also conducted to support a Parcel 21d decision. Based on available data, West Black Oak Ridge, East Black Oak Ridge, McKinney Ridge, and West Pine Ridge are not impacted by site operations and are not subject to actions per the Federal Facility Agreement (FFA). This determination is supported by visual inspections, records searches and interviews, groundwater conceptual modeling, approved NFI reports, analytical data, and risk analysis results. Parcel 21d data, however, demonstrate impacts from site operations, specifically associated with lead in surface soil at the abandoned water tank and nickel in surface soils over the northern portion of the parcel from former Bldg. K-1037 smelting operations. Low-level detections of organics are also reported in some surface soils, including polycyclic aromatic hydrocarbons (PAHs) near Blair Road and common laboratory contaminants at randomly distributed locations. However, human health risks from site-related contaminants of potential concern (COPCs) are acceptable; maximum concentrations of lead and nickel and the screening-level ecological risk assessment (SLERA) demonstrate that no further ecological evaluation is warranted. The weight of evidence leads to the conclusion that Parcel 21d does not require any actions per the FFA.

  11. Improved target detection algorithm using Fukunaga-Koontz transform and distance classifier correlation filter

    NASA Astrophysics Data System (ADS)

    Bal, A.; Alam, M. S.; Aslan, M. S.

    2006-05-01

    Often sensor ego-motion or fast target movement causes the target to temporarily go out of the field of view, leading to the reappearing-target detection problem in target tracking applications. Since the target goes out of the current frame and reenters at a later frame, the reentering location and the variations in rotation, scale, and other 3D orientations of the target are not known, which complicates detection. A detection algorithm has been developed using the Fukunaga-Koontz Transform (FKT) and a distance classifier correlation filter (DCCF). The detection algorithm uses target and background information, extracted from training samples, to detect possible candidate target images. The detected candidate target images are then introduced into the second algorithm, the DCCF clutter rejection module, to determine the target coordinates; once the target coordinates are determined, the tracking algorithm is initiated. The performance of the proposed FKT-DCCF based target detection algorithm has been tested using real-world forward looking infrared (FLIR) video sequences.

  12. Adaboost multi-view face detection based on YCgCr skin color model

    NASA Astrophysics Data System (ADS)

    Lan, Qi; Xu, Zhiyong

    2016-09-01

    The traditional Adaboost face detection algorithm uses Haar-like features to train face classifiers, whose detection error rate is low in the face region. Under a complex background, however, the classifiers easily make wrong detections in background regions whose gray-level distribution is similar to that of faces, so the false detection rate of the traditional Adaboost algorithm is high. As one of the most important features of a face, skin has good clustering properties in the YCgCr color space, so non-face areas can be quickly excluded through a skin color model. Therefore, combining the advantages of the Adaboost algorithm and skin color detection, this paper proposes an Adaboost face detection method based on the YCgCr skin color model. Experiments show that, compared with the traditional algorithm, the proposed method significantly improves detection accuracy and reduces errors.

  13. A method for velocity signal reconstruction of AFDISAR/PDV based on crazy-climber algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Ying-cheng; Guo, Xian; Xing, Yuan-ding; Chen, Rong; Li, Yan-jie; Bai, Ting

    2017-10-01

    The resolution of the continuous wavelet transform (CWT) varies with frequency. Exploiting this property, the time-frequency representation of the coherent signal obtained by the All Fiber Displacement Interferometer System for Any Reflector (AFDISAR) is extracted. The crazy-climber algorithm is adopted to extract the wavelet ridge, from which the velocity history curve of the measured object is obtained. A numerical simulation is carried out; the reconstructed signal is completely consistent with the original signal, which verifies the accuracy of the algorithm. The vibration of a loudspeaker and of the free end of a Hopkinson incident bar under impact loading are measured by AFDISAR, and the measured coherent signals are processed to reconstruct the velocity signals of the loudspeaker and of the free end of the Hopkinson incident bar, respectively. Compared with the theoretical calculation, the error in the particle vibration arrival time difference at the free end of the Hopkinson incident bar is 2 μs. The results indicate that the algorithm is highly accurate and adapts well to signals with different time-frequency features. The algorithm overcomes the limitation of adjusting the time window manually according to the signal variation, as required when adopting the STFT, and is suitable for extracting signals measured by AFDISAR.
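
    The core of ridge-based demodulation can be illustrated with a small Morlet CWT and a per-sample maximum of the scalogram as a naive ridge estimate; the crazy-climber algorithm performs this ridge extraction stochastically and far more robustly. The centre frequency w0 and the frequency grid are assumptions of the sketch.

    ```python
    import numpy as np

    def morlet_cwt(x, fs, freqs, w0=6.0):
        """Continuous wavelet transform of x with a complex Morlet wavelet."""
        n = len(x)
        coeffs = np.empty((len(freqs), n), dtype=complex)
        for i, f in enumerate(freqs):
            s = w0 * fs / (2 * np.pi * f)              # scale (in samples) for centre frequency f
            m = int(min(10 * s, n))                    # truncate the wavelet support
            t = (np.arange(m) - m / 2) / fs
            wavelet = np.exp(2j * np.pi * f * t) * np.exp(-(t * fs / s) ** 2 / 2)
            coeffs[i] = np.convolve(x, np.conj(wavelet)[::-1], mode="same") / np.sqrt(s)
        return coeffs

    def ridge_frequency(x, fs, freqs):
        """Naive wavelet ridge: frequency of maximum |CWT| at each time sample."""
        power = np.abs(morlet_cwt(x, fs, freqs))
        return freqs[power.argmax(axis=0)]             # instantaneous-frequency estimate
    ```

    For a PDV/AFDISAR-style signal, the extracted ridge frequency is related to the surface velocity through the Doppler relation, so a velocity history follows directly from the ridge.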

  14. A false-alarm aware methodology to develop robust and efficient multi-scale infrared small target detection algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2018-03-01

    False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, in this paper a false-alarm-aware methodology is presented to reduce the false alarm rate while the detection rate remains undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of the false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in such a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, the multi-scale average absolute gray difference (AAGD) and the Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and the performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is expandable to any pair of detection algorithms which have different false alarm sources.
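
    The AAGD building block can be sketched as the difference between local means computed over an inner and a surrounding window, evaluated at several scales; the window sizes, the inner/outer ratio and the max-pooling across scales are assumptions of this illustration rather than the exact definition used in the paper.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def multiscale_aagd(img, scales=(3, 5, 7, 9)):
        """Multi-scale average absolute gray difference map for small-target enhancement."""
        img = img.astype(float)
        responses = []
        for k in scales:
            inner = uniform_filter(img, size=k)          # local mean over the candidate cell
            outer = uniform_filter(img, size=3 * k)      # mean over the surrounding background
            responses.append(np.abs(inner - outer))      # bright where a small blob stands out
        return np.max(responses, axis=0)                 # best response across scales

    # Thresholding the map (e.g., at mean + k * std) yields candidate target pixels.
    ```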

  15. Multi-level study of C3H2: The first interstellar hydrocarbon ring

    NASA Technical Reports Server (NTRS)

    Madden, S. C.; Irvine, W. M.; Matthews, H. E.; Avery, L. W.

    1986-01-01

    Cyclic species in the interstellar medium have been searched for almost since the first detection of interstellar polyatomic molecules. Eleven different C3H2 rotational transitions were detected; 9 of these, studied in TMC-1, a nearby dark dust cloud, are shown. The 1_10 - 1_01 and 2_20 - 2_11 transitions were observed with the 43 m NRAO telescope, while the remaining transitions were detected with the 14 m antenna of the Five College Radio Astronomy Observatory (FCRAO). The lines detected in TMC-1 have energies above the ground state ranging from 0.9 to 17.1 K and include both ortho and para species. Limited maps were made along the ridge for several of the transitions. The HC3N J = 2 - 1 transition was mapped simultaneously with the C3H2 1_10 - 1_01 line, so the distribution of this ring can be compared with that of a carbon chain in TMC-1. C3H2 is distributed along a narrow ridge with a SE-NW extension which is slightly more extended than the HC3N J = 2 - 1 emission. Gaussian fits give a FWHP extent of 8.5' for C3H2, while HC3N has a FWHP of 7'. The data show variations of the two velocity components along the ridge as a function of transition. Most of the transitions show a peak at the position of strongest HC3N emission, while the 2_21 - 2_10 transition shows a peak at the NH3 position.

  16. Recognizable-image selection for fingerprint recognition with a mobile-device camera.

    PubMed

    Lee, Dongjae; Choi, Kyoungtaek; Choi, Heeseung; Kim, Jaihie

    2008-02-01

    This paper proposes a recognizable-image selection algorithm for fingerprint-verification systems that use a camera embedded in a mobile device. A recognizable image is defined as a fingerprint image which includes characteristics sufficient to discriminate an individual from other people. While general camera systems obtain focused images by using various gradient measures to estimate high-frequency components, mobile cameras cannot acquire recognizable images in the same way, because the obtained images may not be adequate for fingerprint recognition even if they are properly focused. A recognizable image has to meet the following two conditions. First, the valid region in the recognizable image should be large enough compared with that of nonrecognizable images; here, a valid region is a well-focused part in which ridges are clearly distinguishable from valleys. In order to select valid regions, this paper proposes a new focus-measurement algorithm using the secondary partial derivatives and a quality estimation utilizing the coherence and symmetry of the gradient distribution. Second, the rolling and pitching degrees of a finger measured from the camera plane should be within some limit for a recognizable image. The position of a core point and the contour of the finger are used to estimate the degrees of rolling and pitching. Experimental results show that the proposed method selects valid regions and estimates the degrees of rolling and pitching properly. In addition, fingerprint-verification performance is improved by detecting recognizable images.
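
    As a rough stand-in for the second-derivative focus measure and the gradient-coherence quality estimate, one can combine Laplacian energy with the coherence of the local structure tensor. The smoothing window and the idea of returning both scores per block are assumptions of this sketch.

    ```python
    import numpy as np
    from scipy.ndimage import laplace, sobel, uniform_filter

    def focus_and_coherence(block):
        """Return (focus, coherence) scores for a grayscale image block."""
        block = block.astype(float)
        focus = np.mean(laplace(block) ** 2)             # second-derivative (focus) energy

        gx, gy = sobel(block, axis=1), sobel(block, axis=0)
        jxx = uniform_filter(gx * gx, 7)                 # structure-tensor entries,
        jyy = uniform_filter(gy * gy, 7)                 # smoothed over a 7x7 window
        jxy = uniform_filter(gx * gy, 7)

        # Coherence is 1 for parallel ridge/valley flow and 0 for isotropic texture.
        num = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2)
        den = jxx + jyy + 1e-12
        coherence = float(np.mean(num / den))
        return focus, coherence

    # Blocks with high focus and high coherence would be kept as the valid region.
    ```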

  17. Pronounced Shear Velocity Asymmetry in the Mantle Across the Juan de Fuca Ridge and Curious Lack of Features at the Gorda Ridge

    NASA Astrophysics Data System (ADS)

    Bell, S. W.; Ruan, Y.; Forsyth, D. W.

    2015-12-01

    With new Rayleigh-wave tomography results, we have detected a clear and strong asymmetry in the shear velocity structure of the Juan de Fuca ridge. Concentrated in a relatively thin layer with a depth range of ~30-60km, there lies a region of very low shear velocity, with velocities ranging from ~3.8km/s to 4.0km/s. Such low velocities provide strong evidence for the presence of partial melt. This low-velocity region is highly asymmetric, extending much further west than east of the ridge. Especially at shallow depths of ~35 km, this low-velocity region is concentrated just west of the southern portion of the ridge. Peaking near the Axial Seamount, the youngest of the Cobb-Eickelberg Seamounts, it extends south to the region around the small Vance Seamounts just north of the junction with the Blanco Fracture Zone. The Juan de Fuca plate is relatively stationary in the hotspot reference frame, and the Juan de Fuca ridge migrates westward in the hotspot reference frame. Seamounts are overwhelmingly concentrated on the western flank of the ridge, and an asymmetric upwelling driven by migration in the hotspot reference frame has been proposed to explain the seamount asymmetry (i.e. Davis and Karsten, 1986). Our velocity asymmetry, which matches the seamount asymmetry, provides evidence for this asymmetric upwelling and its connection to migration in the absolute hotspot reference frame. In the shear velocity results, the Gorda ridge displays a remarkable lack of features, with no clearly identifiable expression in the subsurface velocity. There is evidence of a broad low-velocity feature beneath Gorda beginning at a depth of ~150 km, but no clear shallow features can be tied to the ridge. At the depths we can resolve (~25-250km), the anisotropy beneath and within the Juan de Fuca plate is small, indicating a deep source of the shear wave splitting results (Bodmer et al., in press), which indicate a fast axis aligned with the Juan de Fuca plate's absolute motion. Around the Gorda ridge, we observe clear East-West fast axis orientation on both the Pacific Plate and the Gorda portion of the Juan de Fuca Plate.

  18. Modified automatic R-peak detection algorithm for patients with epilepsy using a portable electrocardiogram recorder.

    PubMed

    Jeppesen, J; Beniczky, S; Fuglsang Frederiksen, A; Sidenius, P; Johansen, P

    2017-07-01

    Earlier studies have shown that short-term heart rate variability (HRV) analysis of the ECG seems promising for detection of epileptic seizures. A precise and accurate automatic R-peak detection algorithm is a necessity for real-time, continuous measurement of HRV in a portable ECG device. We used the portable CE-marked ePatch® heart monitor to record the ECG of 14 patients, who were enrolled in the video-EEG long-term monitoring unit for clinical workup of epilepsy. Recordings of the first 7 patients were used as the training set of data for the R-peak detection algorithm, and the recordings of the last 7 patients (467.6 recording hours) were used to test the performance of the algorithm. We aimed to modify an existing QRS-detection algorithm into a more precise R-peak detection algorithm, to avoid the jitter that Q- and S-peaks can create in the tachogram, which causes errors in short-term HRV analysis. The proposed R-peak detection algorithm showed a high sensitivity (Se = 99.979%) and positive predictive value (P+ = 99.976%), which was comparable with a previously published QRS-detection algorithm for the ePatch® ECG device when testing the same dataset. The novel R-peak detection algorithm designed to avoid jitter has very high sensitivity and specificity and thus is a suitable tool for robust, fast, real-time HRV analysis in patients with epilepsy, creating the possibility of real-time seizure detection for these patients.
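
    A compact illustration of the kind of processing involved: band-pass the ECG, take a rectified derivative, pick peaks with an amplitude threshold and a refractory distance, then snap each detection to the local maximum of the raw signal so that Q and S deflections do not shift the fiducial point. All filter settings and windows are assumptions, and this is not the ePatch algorithm.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def detect_r_peaks(ecg, fs):
        """Return sample indices of R peaks in a single-lead ECG."""
        # 1. Band-pass around the QRS energy (5-20 Hz is an assumed band).
        b, a = butter(2, [5 / (fs / 2), 20 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, ecg)

        # 2. Rectified derivative emphasises the steep QRS slopes.
        feature = np.abs(np.gradient(filtered))

        # 3. Peak picking with an amplitude threshold and a ~250 ms refractory period.
        peaks, _ = find_peaks(feature, height=3 * feature.std(), distance=int(0.25 * fs))

        # 4. Snap each detection to the raw-signal maximum in a +/-50 ms window.
        half = int(0.05 * fs)
        r_peaks = []
        for p in peaks:
            start = max(p - half, 0)
            r_peaks.append(start + int(np.argmax(ecg[start:p + half])))
        return np.array(r_peaks)
    ```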

  19. Low-complexity R-peak detection in ECG signals: a preliminary step towards ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo

    2011-01-01

    Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy, using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is however still unfeasible due to the computational complexity of noise robust solutions. In this paper an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed, as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity, while increasing the R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms as described in the literature. With a detection error rate of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.

  20. Resource Management plan for the Oak Ridge Reservation. Volume 28, Wetlands on the Oak Ridge Reservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cunningham, M.; Pounds, Larry

    1991-12-01

    A survey of wetlands on the Oak Ridge Reservation (ORR) was conducted in 1990. Wetlands occurring on ORR were identified using National Wetlands Inventory (NWI) maps and field surveys. More than 120 sites were visited and 90 wetlands were identified. Wetland types on ORR included emergent communities in shallow embayments on reservoirs, emergent and aquatic communities in ponds, forested wetland on low ground along major creeks, and wet meadows and marshes associated with streams and seeps. Vascular plant species occurring on sites visited were inventoried, and 57 species were added to the checklist of vascular plants on ORR. Three species listed as rare in Tennessee were discovered on ORR during the wetlands survey. The survey provided an intensive ground truth of the wetlands identified by NWI and offered an indication of wetlands that the NWI remote sensing techniques did not detect.

  1. NASA airborne radar wind shear detection algorithm and the detection of wet microbursts in the vicinity of Orlando, Florida

    NASA Technical Reports Server (NTRS)

    Britt, Charles L.; Bracalente, Emedio M.

    1992-01-01

    The algorithms used in the NASA experimental wind shear radar system for detection, characterization, and determination of windshear hazard are discussed. The performance of the algorithms in the detection of wet microbursts near Orlando is presented. Various suggested algorithms that are currently being evaluated using the flight test results from Denver and Orlando are reviewed.

  2. Deformation patterns in the southwestern part of the Mediterranean Ridge (South Matapan Trench, Western Greece)

    NASA Astrophysics Data System (ADS)

    Andronikidis, Nikolaos; Kokinou, Eleni; Vafidis, Antonios; Kamberis, Evangelos; Manoutsoglou, Emmanouil

    2017-12-01

    Seismic reflection data and bathymetry analyses, together with geological information, are combined in the present work to identify seabed structural deformation and crustal structure in the Western Mediterranean Ridge (the backstop and the South Matapan Trench). As a first step, we apply bathymetric data and state-of-the-art pattern recognition methods to automatically detect seabed lineaments, which are possibly related to the presence of tectonic structures (faults). The resulting pattern is tied to seismic reflection data, further assisting in the construction of a stratigraphic and structural model for this part of the Mediterranean Ridge. Structural elements and stratigraphic units in the final model are estimated based on: (a) the detected lineaments on the seabed, (b) the distribution of the interval velocities and the presence of velocity inversions, (c) the continuity and the amplitudes of the seismic reflections and the seismic structure of the units, and (d) well and stratigraphic data as well as the main tectonic structures from the nearest onshore areas. Seabed morphology in the study area is probably related to the past and recent tectonic movements resulting from the convergence of the African and European plates. Backthrusts and reverse faults, flower structures and deep normal faults are among the most important extensional/compressional structures interpreted in the study area.

  3. Online Adaboost-Based Parameterized Methods for Dynamic Distributed Network Intrusion Detection.

    PubMed

    Hu, Weiming; Gao, Jun; Wang, Yanguo; Wu, Ou; Maybank, Stephen

    2014-01-01

    Current network intrusion detection systems lack adaptability to the frequently changing network environments. Furthermore, intrusion detection in the new distributed architectures is now a major requirement. In this paper, we propose two online Adaboost-based intrusion detection algorithms. In the first algorithm, a traditional online Adaboost process is used where decision stumps are used as weak classifiers. In the second algorithm, an improved online Adaboost process is proposed, and online Gaussian mixture models (GMMs) are used as weak classifiers. We further propose a distributed intrusion detection framework, in which a local parameterized detection model is constructed in each node using the online Adaboost algorithm. A global detection model is constructed in each node by combining the local parametric models using a small number of samples in the node. This combination is achieved using an algorithm based on particle swarm optimization (PSO) and support vector machines. The global model in each node is used to detect intrusions. Experimental results show that the improved online Adaboost process with GMMs obtains a higher detection rate and a lower false alarm rate than the traditional online Adaboost process that uses decision stumps. Both algorithms outperform existing intrusion detection algorithms. It is also shown that our PSO- and SVM-based algorithm effectively combines the local detection models into the global model in each node; the global model in a node can handle the intrusion types that are found in other nodes, without sharing the samples of these intrusion types.

  4. Universal test fixture for monolithic mm-wave integrated circuits calibrated with an augmented TRD algorithm

    NASA Technical Reports Server (NTRS)

    Romanofsky, Robert R.; Shalkhauser, Kurt A.

    1989-01-01

    The design and evaluation of a novel fixturing technique for characterizing millimeter wave solid state devices is presented. The technique utilizes a cosine-tapered ridge guide fixture and a one-tier de-embedding procedure to produce accurate and repeatable device level data. Advanced features of this technique include nondestructive testing, full waveguide bandwidth operation, universality of application, and rapid, yet repeatable, chip-level characterization. In addition, only one set of calibration standards is required regardless of the device geometry.

  5. Engineering PFLOTRAN for Scalable Performance on Cray XT and IBM BlueGene Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Sripathi, Vamsi K; Mahinthakumar, Gnanamanika

    We describe PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - and the approaches we have employed to obtain scalable performance on some of the largest scale supercomputers in the world. We present detailed analyses of I/O and solver performance on Jaguar, the Cray XT5 at Oak Ridge National Laboratory, and Intrepid, the IBM BlueGene/P at Argonne National Laboratory, that have guided our choice of algorithms.

  6. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.
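
    In the batch supervised setting described above, attack detection reduces to training a binary classifier on labelled measurement vectors. The sketch below uses a scikit-learn SVM on synthetic data purely to illustrate that framing; the dimensions, the injected bias and the kernel choice are assumptions, not the paper's experimental setup.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n, d = 2000, 30                                  # samples and measurements per sample (assumed)
    secure = rng.normal(0.0, 1.0, (n, d))            # nominal measurement noise
    attacked = rng.normal(0.0, 1.0, (n, d))
    attacked[:, :5] += 1.5                           # bias injected on a few meters (assumed)

    X = np.vstack([secure, attacked])
    y = np.r_[np.zeros(n), np.ones(n)]               # 0 = secure, 1 = attacked
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
    print("detection accuracy:", clf.score(X_te, y_te))
    ```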

  7. Low-complexity R-peak detection for ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michael J; Rabotti, Chiara; Oei, S Guid; Mischi, Massimo

    2012-07-01

    Non-invasive fetal health monitoring during pregnancy is becoming increasingly important because of the increasing number of high-risk pregnancies. Despite recent advances in signal-processing technology, which have enabled fetal monitoring during pregnancy using abdominal electrocardiogram (ECG) recordings, ubiquitous fetal health monitoring is still unfeasible due to the computational complexity of noise-robust solutions. In this paper, an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed, as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity, without reducing the R-peak detection performance compared to the existing R-peak detection schemes. Validation of the algorithm is performed on three manually annotated datasets. With a detection error rate of 0.23%, 1.32% and 9.42% on the MIT/BIH Arrhythmia and in-house maternal and fetal databases, respectively, the detection rate of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.

  8. A joint swarm intelligence algorithm for multi-user detection in MIMO-OFDM system

    NASA Astrophysics Data System (ADS)

    Hu, Fengye; Du, Dakun; Zhang, Peng; Wang, Zhijun

    2014-11-01

    In the multi-input multi-output orthogonal frequency division multiplexing (MIMO-OFDM) system, traditional multi-user detection (MUD) algorithms, which are usually used to suppress multiple access interference, find it difficult to balance detection performance against algorithmic complexity. To solve this problem, this paper proposes a joint swarm intelligence algorithm called Ant Colony and Particle Swarm Optimisation (AC-PSO) that integrates the particle swarm optimisation (PSO) and ant colony optimisation (ACO) algorithms. Simulation results show that, with low computational complexity, MUD for the MIMO-OFDM system based on the AC-PSO algorithm achieves MUD performance comparable with the maximum likelihood algorithm. Thus, the proposed AC-PSO algorithm provides a satisfactory trade-off between computational complexity and detection performance.

  9. A new real-time tsunami detection algorithm

    NASA Astrophysics Data System (ADS)

    Chierici, F.; Embriaco, D.; Pignagnoli, L.

    2016-12-01

    Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection based on the real-time tide removal and real-time band-pass filtering of sea-bed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability, at low computational cost. The algorithm is designed to be used also in autonomous early warning systems with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test the tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability, the detection delay, which are functions of the tsunami amplitude and wavelength, and the occurring rate of false alarms. Pressure data sets acquired by Bottom Pressure Recorders in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event which occurred at Haida Gwaii on October 28th, 2012 using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm successfully ran for test purpose in year-long missions onboard the GEOSTAR stand-alone multidisciplinary abyssal observatory, deployed in the Gulf of Cadiz during the EC project NEAREST and on NEMO-SN1 cabled observatory deployed in the Western Ionian Sea, operational node of the European research infrastructure EMSO.

  10. A hardware-algorithm co-design approach to optimize seizure detection algorithms for implantable applications.

    PubMed

    Raghunathan, Shriram; Gupta, Sumeet K; Markandeya, Himanshu S; Roy, Kaushik; Irazoqui, Pedro P

    2010-10-30

    Implantable neural prostheses that deliver focal electrical stimulation upon demand are rapidly emerging as an alternate therapy for roughly a third of the epileptic patient population that is medically refractory. Seizure detection algorithms enable feedback mechanisms to provide focally and temporally specific intervention. Real-time feasibility and computational complexity often limit most reported detection algorithms to implementations using computers for bedside monitoring or external devices communicating with the implanted electrodes. A comparison of algorithms based on detection efficacy does not present a complete picture of the feasibility of the algorithm with limited computational power, as is the case with most battery-powered applications. We present a two-dimensional design optimization approach that takes into account both detection efficacy and hardware cost in evaluating algorithms for their feasibility in an implantable application. Detection features are first compared for their ability to detect electrographic seizures from micro-electrode data recorded from kainate-treated rats. Circuit models are then used to estimate the dynamic and leakage power consumption of the compared features. A score is assigned based on detection efficacy and the hardware cost for each of the features, then plotted on a two-dimensional design space. An optimal combination of compared features is used to construct an algorithm that provides maximal detection efficacy per unit hardware cost. The methods presented in this paper would facilitate the development of a common platform to benchmark seizure detection algorithms for comparison and feasibility analysis in the next generation of implantable neuroprosthetic devices to treat epilepsy. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270, US Army Research Laboratory, January 2018. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom (Sensors and Electron...). Reporting period: 1 October 2016-30 September 2017.

  12. ORNL Develops Novel, Nontoxic System That Seeks Air Leaks in Occupied Buildings

    ScienceCinema

    Hun, Diana

    2018-06-13

    Oak Ridge National Laboratory scientists demonstrate their novel, nontoxic fluorescent air leak detection system that uses a vitamin- and water-based solution to quickly locate cracks in occupied buildings without damaging property.

  13. ORNL Develops Novel, Nontoxic System That Seeks Air Leaks in Occupied Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hun, Diana

    2016-12-06

    Oak Ridge National Laboratory scientists demonstrate their novel, nontoxic fluorescent air leak detection system that uses a vitamin- and water-based solution to quickly locate cracks in occupied buildings without damaging property.

  14. Object detection approach using generative sparse, hierarchical networks with top-down and lateral connections for combining texture/color detection and shape/contour detection

    DOEpatents

    Paiton, Dylan M.; Kenyon, Garrett T.; Brumby, Steven P.; Schultz, Peter F.; George, John S.

    2015-07-28

    An approach to detecting objects in an image dataset may combine texture/color detection, shape/contour detection, and/or motion detection using sparse, generative, hierarchical models with lateral and top-down connections. A first independent representation of objects in an image dataset may be produced using a color/texture detection algorithm. A second independent representation of objects in the image dataset may be produced using a shape/contour detection algorithm. A third independent representation of objects in the image dataset may be produced using a motion detection algorithm. The first, second, and third independent representations may then be combined into a single coherent output using a combinatorial algorithm.

  15. Error detection method

    DOEpatents

    Olson, Eric J.

    2013-06-11

    An apparatus, program product, and method that run an algorithm on a hardware based processor, generate a hardware error as a result of running the algorithm, generate an algorithm output for the algorithm, compare the algorithm output to another output for the algorithm, and detect the hardware error from the comparison. The algorithm is designed to cause the hardware based processor to heat to a degree that increases the likelihood of hardware errors to manifest, and the hardware error is observable in the algorithm output. As such, electronic components may be sufficiently heated and/or sufficiently stressed to create better conditions for generating hardware errors, and the output of the algorithm may be compared at the end of the run to detect a hardware error that occurred anywhere during the run that may otherwise not be detected by traditional methodologies (e.g., due to cooling, insufficient heat and/or stress, etc.).

  16. Spectrum sensing algorithm based on autocorrelation energy in cognitive radio networks

    NASA Astrophysics Data System (ADS)

    Ren, Shengwei; Zhang, Li; Zhang, Shibing

    2016-10-01

    Cognitive radio networks have wide applications in the smart home, personal communications and other wireless communications. Spectrum sensing is the main challenge in cognitive radios. This paper proposes a new spectrum sensing algorithm based on the autocorrelation energy of the received signal. By taking the autocorrelation energy of the received signal as the test statistic for spectrum sensing, the effect of channel noise on the detection performance is reduced. Simulation results show that the algorithm is effective and performs well at low signal-to-noise ratio. Compared with the maximum generalized eigenvalue detection (MGED) algorithm, the function of covariance matrix based detection (FMD) algorithm and the autocorrelation-based detection (AD) algorithm, the proposed algorithm has a 2-11 dB advantage.
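
    A toy version of the decision statistic: compare the energy of the first few autocorrelation lags of the received samples against a threshold calibrated on noise-only records. The lag count and the quantile-based threshold calibration are assumptions of this sketch.

    ```python
    import numpy as np

    def autocorr_energy(x, max_lag=10):
        """Energy of the autocorrelation of x over lags 1..max_lag (lag 0 excluded)."""
        x = x - x.mean()
        r = np.array([np.sum(x[:-k] * x[k:]) for k in range(1, max_lag + 1)]) / len(x)
        return np.sum(r ** 2)

    def sense(x, noise_records, pfa=0.05, max_lag=10):
        """Decide 'occupied' if the statistic exceeds the (1 - pfa) noise quantile."""
        noise_stats = [autocorr_energy(n, max_lag) for n in noise_records]
        threshold = np.quantile(noise_stats, 1 - pfa)
        return autocorr_energy(x, max_lag) > threshold
    ```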

  17. Lining seam elimination algorithm and surface crack detection in concrete tunnel lining

    NASA Astrophysics Data System (ADS)

    Qu, Zhong; Bai, Ling; An, Shi-Quan; Ju, Fang-Rong; Liu, Ling

    2016-11-01

    Due to the particularity of the surface of concrete tunnel lining and the diversity of detection environments, such as uneven illumination, smudges, localized rock falls, water leakage, and the inherent seams of the lining structure, existing crack detection algorithms cannot detect real cracks accurately. This paper proposes an algorithm that combines lining seam elimination with an improved percolation detection algorithm based on grid cell analysis for surface crack detection in concrete tunnel lining. First, the characteristics of pixels within overlapping grid cells are checked to remove background noise and generate the percolation seed map (PSM). Second, cracks are detected based on the PSM by the accelerated percolation algorithm so that the fracture unit areas can be scanned and connected. Finally, the real surface cracks in concrete tunnel lining can be obtained by removing the lining seam and performing percolation denoising. Experimental results show that the proposed algorithm can accurately, quickly, and effectively detect the real surface cracks. Furthermore, it can fill the gap in existing concrete tunnel lining surface crack detection by removing the lining seam.

  18. A community detection algorithm based on structural similarity

    NASA Astrophysics Data System (ADS)

    Guo, Xuchao; Hao, Xia; Liu, Yaqiong; Zhang, Li; Wang, Lu

    2017-09-01

    In order to further improve the efficiency and accuracy of community detection algorithms, a new algorithm named SSTCA (the community detection algorithm based on structural similarity with threshold) is proposed. In this algorithm, structural similarities are taken as the weights of edges, and a threshold k is introduced so that edges whose weights are less than the threshold are removed, which improves the computational efficiency. Tests were done on Zachary's network, the dolphins' social network and the football dataset with the proposed algorithm, and the results were compared with the GN and SSNCA algorithms. The results show that the new algorithm is superior to the other algorithms in accuracy for dense networks, and its operating efficiency is clearly improved.
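
    The weighting and pruning step can be sketched with the usual structural (cosine) similarity of closed neighbourhoods; taking the connected components of the pruned graph as communities is a simplification of the full SSTCA procedure, and the threshold value below is an assumption.

    ```python
    import networkx as nx

    def structural_similarity(G, u, v):
        """Cosine similarity of closed neighbourhoods, as used in SCAN-style methods."""
        nu, nv = set(G[u]) | {u}, set(G[v]) | {v}
        return len(nu & nv) / (len(nu) * len(nv)) ** 0.5

    def prune_by_similarity(G, k=0.5):
        """Keep only edges whose structural similarity is at least the threshold k."""
        H = nx.Graph()
        H.add_nodes_from(G)
        for u, v in G.edges():
            s = structural_similarity(G, u, v)
            if s >= k:
                H.add_edge(u, v, weight=s)
        return H

    G = nx.karate_club_graph()              # Zachary's network, one of the test datasets
    communities = list(nx.connected_components(prune_by_similarity(G, k=0.5)))
    print(len(communities), "groups found")
    ```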

  19. Detection of dominant flow and abnormal events in surveillance video

    NASA Astrophysics Data System (ADS)

    Kwak, Sooyeong; Byun, Hyeran

    2011-02-01

    We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a kind of feature-based approach, so it does not detect moving objects individually. The proposed algorithm identifies dominant flow without individual object tracking using a latent Dirichlet allocation model in crowded environments. It can also automatically detect and localize abnormally moving objects in real-life video. Performance tests were carried out on several real-life databases, and their results show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The proposed algorithm can be applied to any situation in which abnormal directions or abnormal speeds are detected, regardless of direction.

  20. ELASTIC NET FOR COX’S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM

    PubMed Central

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox’s proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox’s proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems. PMID:23226932
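
    For reference, the objective behind the computed path is the log partial likelihood penalized by the elastic net; in a common parameterization (the exact scaling used in the paper may differ):

    \[
    \hat{\beta}(\lambda) = \arg\min_{\beta}\; -\ell(\beta) + \lambda\left[\alpha\,\|\beta\|_{1} + \tfrac{1-\alpha}{2}\,\|\beta\|_{2}^{2}\right],
    \qquad
    \ell(\beta) = \sum_{i:\,\delta_i = 1}\left[x_i^{\top}\beta - \log\sum_{j \in R(t_i)} \exp\big(x_j^{\top}\beta\big)\right],
    \]

    where \(\ell(\beta)\) is Cox's log partial likelihood, \(\delta_i\) is the event indicator, and \(R(t_i)\) is the risk set at event time \(t_i\).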

  1. Extremum seeking-based optimization of high voltage converter modulator rise-time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinker, Alexander; Bland, Michael; Krstic, Miroslav

    2013-02-01

    We digitally implement an extremum seeking (ES) algorithm, which optimizes the rise time of the output voltage of a high voltage converter modulator (HVCM) at the Los Alamos Neutron Science Center (LANSCE) HVCM test stand by iteratively and simultaneously tuning the first 8 switching edges of each of the three phase drive waveforms (24 variables in total). We achieve a 50 μs rise time, a reduction by half compared to the 100 μs achieved at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory. Considering that HVCMs typically operate with an output voltage of 100 kV and a 60 Hz repetition rate, the 50 μs rise time reduction will result in very significant energy savings. The ES algorithm proved successful despite the noisy measurements and cost calculations, confirming the theoretical result that the algorithm is not affected by noise whose frequency components are independent of the perturbing frequencies.

  2. Direct surface analysis coupled to high-resolution mass spectrometry reveals heterogeneous composition of the cuticle of Hibiscus trionum petals.

    PubMed

    Giorio, Chiara; Moyroud, Edwige; Glover, Beverley J; Skelton, Paul C; Kalberer, Markus

    2015-10-06

    Plant cuticle, which is the outermost layer covering the aerial parts of all plants including petals and leaves, can present a wide range of patterns that, combined with cell shape, can generate unique physical, mechanical, or optical properties. For example, arrays of regularly spaced nanoridges have been found on the dark (anthocyanin-rich) portion at the base of the petals of Hibiscus trionum. Those ridges act as a diffraction grating, producing an iridescent effect. Because the surface of the distal white region of the petals is smooth and noniridescent, a selective chemical characterization of the surface of the petals on different portions (i.e., ridged vs smooth) is needed to understand whether distinct cuticular patterns correlate with distinct chemical compositions of the cuticle. In the present study, a rapid screening method has been developed for the direct surface analysis of Hibiscus trionum petals using liquid extraction surface analysis (LESA) coupled with high-resolution mass spectrometry. The optimized method was used to characterize a wide range of plant metabolites and cuticle monomers on the upper (adaxial) surface of the petals on both the white/smooth and anthocyanic/ridged regions, and on the lower (abaxial) surface, which is entirely smooth. The main components detected on the surface of the petals are low-molecular-weight organic acids, sugars, and flavonoids. The ridged portion on the upper surface of the petal is enriched in long-chain fatty acids, which are constituents of the wax fraction of the cuticle. These compounds were not detected on the white/smooth region of the upper petal surface or on the smooth lower surface.

  3. Quantum machine learning for quantum anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Nana; Rebentrost, Patrick

    2018-04-01

    Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.

  4. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
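
    A bare-bones numerical version of the check described above (not the formally verified algorithm itself) is sketched below: given polynomial relative horizontal motion between two aircraft, test whether the squared separation drops below the minimum within the lookahead time. The polynomials, separation minimum, and lookahead value are made up for illustration:

    ```python
    import numpy as np
    from numpy.polynomial import Polynomial as P

    def horizontal_conflict(rel_x, rel_y, sep_min, lookahead):
        """Detect loss of horizontal separation for polynomial relative motion.

        rel_x, rel_y : Polynomial giving relative position components vs. time
        sep_min      : horizontal separation minimum (same units as position)
        lookahead    : conflicts are sought on the interval [0, lookahead]
        """
        # Conflict <=> d(t)^2 - sep_min^2 < 0 somewhere in [0, lookahead].
        g = rel_x * rel_x + rel_y * rel_y - sep_min**2
        if g(0.0) < 0 or g(lookahead) < 0:
            return True
        # Otherwise, any real root of g inside the window marks a crossing of the boundary.
        roots = g.roots()
        real = roots[np.isclose(np.imag(roots), 0.0)].real
        return bool(np.any((real > 0.0) & (real < lookahead)))

    # Hypothetical example: head-on closure at constant rate.
    rel_x = P([10.0, -2.0])   # 10 nmi apart, closing at 2 nmi/min
    rel_y = P([0.0])
    print(horizontal_conflict(rel_x, rel_y, sep_min=5.0, lookahead=8.0))  # True
    ```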

  5. Health management system for rocket engines

    NASA Technical Reports Server (NTRS)

    Nemeth, Edward

    1990-01-01

    The functional framework of a failure detection algorithm for the Space Shuttle Main Engine (SSME) is developed. The basic algorithm is based only on existing SSME measurements. Supplemental measurements, expected to enhance failure detection effectiveness, are identified. To support the algorithm development, a figure of merit is defined to estimate the likelihood of SSME criticality 1 failure modes and the failure modes are ranked in order of likelihood of occurrence. Nine classes of failure detection strategies are evaluated and promising features are extracted as the basis for the failure detection algorithm. The failure detection algorithm provides early warning capabilities for a wide variety of SSME failure modes. Preliminary algorithm evaluation, using data from three SSME failures representing three different failure types, demonstrated indications of imminent catastrophic failure well in advance of redline cutoff in all three cases.

  6. Clustering analysis of moving target signatures

    NASA Astrophysics Data System (ADS)

    Martone, Anthony; Ranney, Kenneth; Innocenti, Roberto

    2010-04-01

    Previously, we developed a moving target indication (MTI) processing approach to detect and track slow-moving targets inside buildings, which successfully detected moving targets (MTs) from data collected by a low-frequency, ultra-wideband radar. Our MTI algorithms include change detection, automatic target detection (ATD), clustering, and tracking. The MTI algorithms can be implemented in a real-time or near-real-time system; however, a person-in-the-loop is needed to select input parameters for the clustering algorithm. Specifically, the number of clusters to input into the cluster algorithm is unknown and requires manual selection. A critical need exists to automate all aspects of the MTI processing formulation. In this paper, we investigate two techniques that automatically determine the number of clusters: the adaptive knee-point (KP) algorithm and the recursive pixel finding (RPF) algorithm. The KP algorithm is based on a well-known heuristic approach for determining the number of clusters. The RPF algorithm is analogous to the image processing, pixel labeling procedure. Both algorithms are used to analyze the false alarm and detection rates of three operational scenarios of personnel walking inside wood and cinderblock buildings.
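
    The knee-point idea can be illustrated with a toy example: run k-means over a range of cluster counts, then pick the k whose point on the error curve lies farthest from the chord joining the curve's endpoints. The data, range of k, and distance-to-chord heuristic here are illustrative assumptions, not the adaptive KP algorithm from the paper:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def knee_point_k(X, k_max=10, random_state=0):
        """Pick a cluster count via a simple knee-point heuristic on the SSE curve."""
        ks = np.arange(1, k_max + 1)
        sse = np.array([KMeans(n_clusters=k, n_init=10, random_state=random_state)
                        .fit(X).inertia_ for k in ks])
        # Distance of each (k, SSE) point from the chord joining the first and last points.
        p1, p2 = np.array([ks[0], sse[0]]), np.array([ks[-1], sse[-1]])
        chord = p2 - p1
        d = np.stack([ks, sse], axis=1) - p1
        dist = np.abs(chord[0] * d[:, 1] - chord[1] * d[:, 0]) / np.linalg.norm(chord)
        return int(ks[np.argmax(dist)])

    # Toy usage: three well-separated blobs should yield k = 3.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0.0, 5.0, 10.0)])
    print(knee_point_k(X))
    ```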

  7. Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manges, W.W.; Hamel, W.R.; Weisbin, C.R.

    1988-01-01

    The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) in the Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform including a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 Vax 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.

  8. Cool-Season Moisture Delivery and Multi-Basin Streamflow Anomalies in the Western United States

    NASA Astrophysics Data System (ADS)

    Malevich, Steven B.

    Widespread droughts can have a significant impact on western United States streamflow, but the causes of these events are not fully understood. This dissertation examines streamflow from multiple western US basins and establishes the robust, leading modes of variability in interannual streamflow throughout the past century. I show that approximately 50% of this variability is associated with spatially widespread streamflow anomalies that are statistically independent from streamflow's response to the El Nino-Southern Oscillation (ENSO). The ENSO teleconnection accounts for approximately 25% of the interannual variability in streamflow across this network. The atmospheric circulation anomalies associated with the most spatially widespread variability involve the Aleutian low and the persistent coastal atmospheric ridge in the Pacific Northwest. I use a watershed segmentation algorithm to explicitly track the position and intensity of these features and compare their variability to the multi-basin streamflow variability. Results show that latitudinal shifts in the coastal atmospheric ridge are more strongly associated with streamflow's north-south dipole response to ENSO variability, while more spatially widespread anomalies in streamflow most strongly relate to seasonal changes in the coastal ridge intensity. This likely reflects persistent coastal ridge blocking of cool-season precipitation into western US river basins. I utilize the 35 model runs of the Community Earth System Model Large Ensemble (CESMLE) to determine whether the model ensemble simulates the anomalously strong coastal ridges and extreme widespread wintertime precipitation anomalies found in the observational record. Though there is considerable bias in the CESMLE, the CESMLE runs simulate extremely widespread dry precipitation anomalies with a frequency of approximately one extreme event per century during the historical simulations (1920-2005). These extremely widespread dry events correspond significantly with anomalously intense coastal atmospheric ridges. The results from these three papers connect widespread interannual streamflow anomalies in the western US--and especially extremely widespread streamflow droughts--with semi-permanent atmospheric ridge anomalies near the coastal Pacific Northwest. This is important to western US water managers because these widespread events appear to have been a robust feature of the past century. The semi-permanent atmospheric features associated with these widespread dry streamflow anomalies are projected to change position significantly in the next century as a response to global climate change. This may change the characteristics of widespread streamflow anomalies in the western US, though my results do not show evidence of such changes within the instrumental record of the last century.

  9. DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.

    PubMed

    Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A

    2017-01-01

    Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection suffer from difficulties in detecting particular cell types, cell populations of differing brightness, non-uniformly stained cells, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm and other software using 42 samples, representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise dependent confidence, including samples with cells of different brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among tested free and commercial software.

  10. Multi-object Detection and Discrimination Algorithms

    DTIC Science & Technology

    2015-03-26

    with an algorithm similar to a depth-first search. This stage of the algorithm is O(CN). From... Multi-object Detection and Discrimination Algorithms. This document contains an overview of research and work performed and published at the University of Florida from October 1, 2009 to October 31, 2013 pertaining to proposal 57306CS: Multi-object Detection and Discrimination Algorithms.

  11. Video Shot Boundary Detection Using QR-Decomposition and Gaussian Transition Detection

    NASA Astrophysics Data System (ADS)

    Amiri, Ali; Fathy, Mahmood

    2010-12-01

    This article explores the problem of video shot boundary detection and examines a novel shot boundary detection algorithm by using QR-decomposition and modeling of gradual transitions by Gaussian functions. Specifically, the authors attend to the challenges of detecting gradual shots and extracting appropriate spatiotemporal features that affect the ability of algorithms to efficiently detect shot boundaries. The algorithm utilizes the properties of QR-decomposition and extracts a block-wise probability function that illustrates the probability of video frames to be in shot transitions. The probability function has abrupt changes in hard cut transitions, and semi-Gaussian behavior in gradual transitions. The algorithm detects these transitions by analyzing the probability function. Finally, we will report the results of the experiments using large-scale test sets provided by the TRECVID 2006, which has assessments for hard cut and gradual shot boundary detection. These results confirm the high performance of the proposed algorithm.

  12. Airborne electromagnetic detection of shallow seafloor topographic features, including resolution of multiple sub-parallel seafloor ridges

    NASA Astrophysics Data System (ADS)

    Vrbancich, Julian; Boyd, Graham

    2014-05-01

    The HoistEM helicopter time-domain electromagnetic (TEM) system was flown over waters in Backstairs Passage, South Australia, in 2003 to test the bathymetric accuracy and hence the ability to resolve seafloor structure in shallow and deeper waters (extending to ~40 m depth) that contain interesting seafloor topography. The topography that forms a rock peak (South Page) in the form of a mini-seamount that barely rises above the water surface was accurately delineated along its ridge from the start of its base (where the seafloor is relatively flat) in ~30 m water depth to its peak at the water surface, after an empirical correction was applied to the data to account for imperfect system calibration, consistent with earlier studies using the same HoistEM system. A much smaller submerged feature (Threshold Bank) of ~9 m peak height located in waters of 35 to 40 m depth was also accurately delineated. These observations when checked against known water depths in these two regions showed that the airborne TEM system, following empirical data correction, was effectively operating correctly. The third and most important component of the survey was flown over the Yatala Shoals region that includes a series of sub-parallel seafloor ridges (resembling large sandwaves rising up to ~20 m from the seafloor) that branch out and gradually decrease in height as the ridges spread out across the seafloor. These sub-parallel ridges provide an interesting topography because the interpreted water depths obtained from 1D inversion of TEM data highlight the limitations of the EM footprint size in resolving both the separation between the ridges (which vary up to ~300 m) and the height of individual ridges (which vary up to ~20 m), and possibly also the limitations of assuming a 1D model in areas where the topography is quasi-2D/3D.

  13. Looking Up at Layers of 'Vera Rubin Ridge' on Sol 1790

    NASA Image and Video Library

    2017-09-13

    The Mast Camera (Mastcam) on NASA's Curiosity Mars rover captured this view of "Vera Rubin Ridge" about two weeks before the rover started ascending this steep ridge on lower Mount Sharp. The view combines 13 images taken with the Mastcam's right-eye, telephoto-lens camera, on Aug. 19, 2017, during the 1,790th Martian day, or sol, of Curiosity's work on Mars. This and other Mastcam panoramas show details of the sedimentary rocks that make up the "Vera Rubin Ridge." This distinct topographic feature located on the lower slopes of Mount Sharp (Aeolis Mons) is characterized by the presence of hematite, an iron-oxide mineral, which has been detected from orbit. The Mastcam images show that the rocks making up the lower part of the ridge are characterized by distinct horizontal stratification with individual rock layers of the order of several inches (tens of centimeters) thick. Scientists on the mission are using such images to determine the ancient environment these rocks were deposited in. The repeated beds indicate progressive accumulation of sediments that now make up the lower part of Mount Sharp, although from this distance it is not possible to know if they were formed by aqueous or wind-blown processes. Close-up images collected as the rover climbs the ridge will help answer this question. The stratified rocks are cross cut by veins filled with a white mineral, likely calcium sulfate, that provide evidence of later episodes of fluid flow through the rocks. The panorama has been white-balanced so that the colors of the rock materials resemble how they would appear under daytime lighting conditions on Earth. It spans about 55 compass degrees centered to the south-southeast. The Sol 1790 location just north of the ridge is shown in a Sol 1789 traverse map. The ridge was informally named in early 2017 in memory of Vera Cooper Rubin (1928-2016), whose astronomical observations provided evidence for the existence of the universe's dark matter. An annotated figure is shown at https://photojournal.jpl.nasa.gov/catalog/PIA21851

  14. Martian Ridge Looming Above Curiosity Prior to Ascent

    NASA Image and Video Library

    2017-09-13

    Researchers used the Mast Camera (Mastcam) on NASA's Curiosity Mars rover to gain this detailed view of layers in "Vera Rubin Ridge" from just below the ridge. The scene combines 70 images taken with the Mastcam's right-eye, telephoto-lens camera, on Aug. 13, 2017, during the 1,785th Martian day, or sol, of Curiosity's work on Mars. This and other Mastcam panoramas show details of the sedimentary rocks that make up the "Vera Rubin Ridge." This distinct topographic feature located on the lower slopes of Mount Sharp (Aeolis Mons) is characterized by the presence of hematite, an iron-oxide mineral, which has been detected from orbit. The Mastcam images show that the rocks making up the lower part of the ridge are characterized by distinct horizontal stratification with individual rock layers of the order of several inches (tens of centimeters) thick. Scientists on the mission are using such images to determine the ancient environment these rocks were deposited in. The repeated beds indicate progressive accumulation of sediments that now make up the lower part of Mount Sharp, although from this distance it is not possible to know if they were formed by aqueous or wind-blown processes. Close-up images collected as the rover climbs the ridge will help answer this question. The stratified rocks are cross cut by veins filled with a white mineral, likely calcium sulfate, that provide evidence of later episodes of fluid flow through the rocks. The panorama has been white-balanced so that the colors of the rock materials resemble how they would appear under daytime lighting conditions on Earth. It spans from southeast on the left to west on the right. The Sol 1785 location just north of the ridge is shown in a Sol 1782 traverse map. The ridge was informally named in early 2017 in memory of Vera Cooper Rubin (1928-2016), whose astronomical observations provided evidence for the existence of the universe's dark matter. An annotated figure is shown at https://photojournal.jpl.nasa.gov/catalog/PIA21850

  15. An open randomized controlled clinical trial to evaluate ridge preservation and repair using SocketKAP(™) and SocketKAGE(™) : part 1-three-dimensional volumetric soft tissue analysis of study casts.

    PubMed

    Zadeh, Homayoun H; Abdelhamid, Alaa; Omran, Mostafa; Bakhshalian, Neema; Tarnow, Dennis

    2016-06-01

    The aims of this study were to evaluate (i) the efficacy of ridge preservation and repair involving SocketKAP(™) and SocketKAGE(™) devices following tooth removal; and (ii) ridge contour changes at 6 months post-extraction in intact sockets and sockets with dehiscence defects. Thirty-six patients required a total of 61 teeth to be extracted. Five cohorts were established with groups A-C involving intact sockets and groups D and E involving facial dehiscence: (A) Negative Control; (B) SocketKAP(™) alone; (C) Anorganic Bovine Bone Mineral (ABBM) + SocketKAP(™) ; (D) Negative Control; and (E) ABBM + SocketKAP(™)  + SocketKAGE(™) . Preoperative CBCT and laser-scanned casts were obtained. Teeth segmented from preoperative CBCT were merged with study cast images to allow for digital removal of teeth from the casts. Volumetric measurements of ridge contour were performed. Images of preoperative and 6 months post-operative casts were superimposed to measure ridge contour changes. Post-extraction contour loss occurred in all sockets primarily in the crestal 3 mm but was also detected up to 6 mm from alveolar crest. For intact sockets, SocketKAP(™) or SocketKAP(™)  + ABBM interventions led to greater percentages of remaining ridge contour when compared to controls. A significant difference favoring SocketKAP(™)  + SocketKAGE(™)  + ABBM treatment was observed for sockets with facial dehiscence when compared to controls. SocketKAP(™) , with or without ABBM, significantly limited post-extraction ridge contour loss in intact sockets. In the absence of a group treated with only the SocketKAGE(™) , it is not possible to determine its efficacy, although the combination of SocketKAGE(™)  + SocketKAP(™)  + ABBM was effective in limiting post-extraction ridge contour loss in sockets with dehiscence defects. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Analysis of gravity and topography in the GLIMPSE study region: Isostatic compensation and uplift of the Sojourn and Hotu Matua Ridge systems

    USGS Publications Warehouse

    Harmon, N.; Forsyth, D.W.; Scheirer, D.S.

    2006-01-01

    The Gravity Lineations Intraplate Melting Petrologic and Seismic Expedition (GLIMPSE) Experiment investigated the formation of a series of non-hot spot, intraplate volcanic ridges in the South Pacific and their relationship to cross-grain gravity lineaments detected by satellite altimetry. Using shipboard gravity measurements and a simple model of surface loading of a thin elastic plate, we estimate effective elastic thicknesses ranging from ~2 km beneath the Sojourn Ridge to a maximum of 10 km beneath the Southern Cross Seamount. These elastic thicknesses are lower than predicted for the 3-9 Ma seafloor on which the volcanoes lie, perhaps due to reheating and thinning of the plate during emplacement. Anomalously low apparent densities estimated for the Matua and Southern Cross seamounts, 2050 and 2250 kg m-3, respectively, probably are artifacts caused by the assumption of only surface loading, ignoring the presence of subsurface loading in the form of underplated crust and/or low-density mantle. Using satellite free-air gravity and shipboard bathymetry, we calculate the age-detrended, residual mantle Bouguer anomaly (rMBA). The rMBA corrects the free-air anomaly for the direct effects of topography, including the thickening of the crust beneath the seamounts and volcanic ridges due to surface loading of the volcanic edifices. There are broad, negative rMBA anomalies along the Sojourn and Brown ridges and the Hotu Matua seamount chain that extend nearly to the East Pacific Rise. These negative rMBA anomalies connect to negative free-air anomalies in the western part of the study area that have been recognized previously as the beginnings of the cross-grain gravity lineaments. Subtracting the topographic effects of surface loading by the ridges and seamounts from the observed topography reveals that the ridges are built on broad bands of anomalously elevated seafloor. This swell topography and the negative rMBA anomalies contradict the predictions of lithospheric cracking models for the origin of gravity lineaments and associated volcanic ridges, favoring models with a dynamic mantle component such as small-scale convection or channelized asthenospheric return flow. Copyright 2006 by the American Geophysical Union.

  17. Fast and accurate image recognition algorithms for fresh produce food safety sensing

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Chieh; Kim, Moon S.; Chao, Kuanglin; Kang, Sukwon; Lefcourt, Alan M.

    2011-06-01

    This research developed and evaluated the multispectral algorithms derived from hyperspectral line-scan fluorescence imaging under violet LED excitation for detection of fecal contamination on Golden Delicious apples. The algorithms utilized the fluorescence intensities at four wavebands, 680 nm, 684 nm, 720 nm, and 780 nm, for computation of simple functions for effective detection of contamination spots created on the apple surfaces using four concentrations of aqueous fecal dilutions. The algorithms detected more than 99% of the fecal spots. The effective detection of feces showed that a simple multispectral fluorescence imaging algorithm based on violet LED excitation may be appropriate to detect fecal contamination on fast-speed apple processing lines.
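
    To make the idea concrete, here is a sketch of a simple two-band ratio detector applied to a multispectral image cube; the band ordering, the ratio form, and the threshold are placeholders, since the paper's exact waveband functions are not reproduced here:

    ```python
    import numpy as np

    def detect_contamination(cube, band_a=1, band_b=3, threshold=1.2, eps=1e-6):
        """Flag pixels where a simple fluorescence band ratio exceeds a threshold.

        cube      : float array of shape (rows, cols, bands), e.g. bands at
                    680, 684, 720 and 780 nm (the ordering here is an assumption)
        band_a/b  : indices of the two bands used in the ratio
        threshold : decision threshold applied to the ratio image
        """
        ratio = cube[..., band_a] / (cube[..., band_b] + eps)
        return ratio > threshold   # boolean mask of candidate contamination pixels

    # Toy usage on a random cube standing in for a hyperspectral line-scan image.
    rng = np.random.default_rng(0)
    cube = rng.uniform(0.1, 1.0, size=(64, 64, 4))
    mask = detect_contamination(cube)
    print(mask.sum(), "candidate pixels")
    ```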

  18. Object detection approach using generative sparse, hierarchical networks with top-down and lateral connections for combining texture/color detection and shape/contour detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paiton, Dylan M.; Kenyon, Garrett T.; Brumby, Steven P.

    An approach to detecting objects in an image dataset may combine texture/color detection, shape/contour detection, and/or motion detection using sparse, generative, hierarchical models with lateral and top-down connections. A first independent representation of objects in an image dataset may be produced using a color/texture detection algorithm. A second independent representation of objects in the image dataset may be produced using a shape/contour detection algorithm. A third independent representation of objects in the image dataset may be produced using a motion detection algorithm. The first, second, and third independent representations may then be combined into a single coherent output using a combinatorial algorithm.

  19. Gas leak detection in infrared video with background modeling

    NASA Astrophysics Data System (ADS)

    Zeng, Xiaoxia; Huang, Likun

    2018-03-01

    Background modeling plays an important role in the task of gas detection based on infrared video. The VIBE algorithm has been a widely used background modeling algorithm in recent years. However, the processing speed of the VIBE algorithm sometimes cannot meet the requirements of real-time detection applications. Therefore, based on the traditional VIBE algorithm, we propose a fast foreground model and optimize the results by combining a connected-domain algorithm and the nine-spaces algorithm in the subsequent processing steps. Experiments show the effectiveness of the proposed method.

  20. Network intrusion detection by the coevolutionary immune algorithm of artificial immune systems with clonal selection

    NASA Astrophysics Data System (ADS)

    Salamatova, T.; Zhukov, V.

    2017-02-01

    The paper presents the application of the artificial immune systems apparatus as a heuristic method of network intrusion detection for algorithmic provision of intrusion detection systems. The coevolutionary immune algorithm of artificial immune systems with clonal selection was elaborated. In testing different datasets the empirical results of evaluation of the algorithm effectiveness were achieved. To identify the degree of efficiency the algorithm was compared with analogs. The fundamental rules based of solutions generated by this algorithm are described in the article.

  1. Change detection using landsat time series: A review of frequencies, preprocessing, algorithms, and applications

    NASA Astrophysics Data System (ADS)

    Zhu, Zhe

    2017-08-01

    The free and open access to all archived Landsat images in 2008 has completely changed the way of using Landsat data. Many novel change detection algorithms based on Landsat time series have been developed. We present a comprehensive review of four important aspects of change detection studies based on Landsat time series, including frequencies, preprocessing, algorithms, and applications. We observed the trend that the more recent the study, the higher the frequency of Landsat time series used. We reviewed a series of image preprocessing steps, including atmospheric correction, cloud and cloud shadow detection, and composite/fusion/metrics techniques. We divided all change detection algorithms into six categories, including thresholding, differencing, segmentation, trajectory classification, statistical boundary, and regression. Within each category, six major characteristics of different algorithms, such as frequency, change index, univariate/multivariate, online/offline, abrupt/gradual change, and sub-pixel/pixel/spatial, were analyzed. Moreover, some of the widely-used change detection algorithms were also discussed. Finally, we reviewed different change detection applications by dividing these applications into two categories, change target and change agent detection.

  2. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation

    PubMed Central

    2013-01-01

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening. PMID:23938087
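
    CellSegm itself is a MATLAB toolbox, but the same major steps can be sketched in Python with scikit-image; the filter choices, sigma values, and marker construction below are illustrative stand-ins rather than the toolbox's actual parameters:

    ```python
    import numpy as np
    from skimage import filters, feature, segmentation

    def segment_surface_stained(image, smooth_sigma=1.0, ridge_sigmas=(1, 2, 3), min_dist=10):
        """Sketch of a smoothing -> ridge enhancement -> watershed pipeline on a 2D slice."""
        # (i) smoothing
        smoothed = filters.gaussian(image, sigma=smooth_sigma)
        # (ii) Hessian-based ridge enhancement of the bright, surface-stained cell boundaries
        ridges = filters.sato(smoothed, sigmas=ridge_sigmas, black_ridges=False)
        # (iii) marker-controlled watershed: markers placed at local minima of the ridge image
        coords = feature.peak_local_max(-ridges, min_distance=min_dist)
        markers = np.zeros(image.shape, dtype=int)
        markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
        labels = segmentation.watershed(ridges, markers)
        # (iv) a full pipeline would then classify candidate regions by size, shape, intensity, ...
        return labels
    ```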

  3. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.

    PubMed

    Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid

    2013-08-09

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.

  4. Investigating the Tectonics of Mare Crisium with Topographic Data

    NASA Astrophysics Data System (ADS)

    Byrne, P. K.; Klimczak, C.; Solomon, S. C.

    2013-12-01

    Mare Crisium is a 560-km-diameter lunar mare, 170,500 km2 in area. Like other lunar maria, Crisium has been tectonically deformed by wrinkle ridges. Early studies of the tectonics of Crisium were hampered by poor resolution or illumination conditions, however. The recent availability of high-resolution digital topographic models (DTMs) from Lunar Orbiter Laser Altimeter (LOLA) data enables a fresh assessment of lunar tectonics, including those in Mare Crisium. LOLA DTMs show that the basin is replete with wrinkle ridges, consistent with previous observations; we observe over 170. The largest such structures follow the basin outline and verge towards the interior, most notably from 30°-180° and 270°-330° azimuth (measured clockwise from north). Artificially illuminated hillshade maps derived from the DTMs, for solar azimuth angles of 0° and 180°, reveal ~east-west-orientated structures that are not readily visible in photogeological data. We identify 10 partially buried craters within Crisium, but we note a further five demarcated only by wrinkle ridges, the largest of which is ~95 km in diameter, that have no other surface manifestation. Moreover, LOLA topographic data reveal subtle ridge-like changes in relief across the mare that are virtually impossible to detect otherwise. We interpret these 13 ridges, ~30-100 km in length, as additional shortening structures that have no surficial faulted component. Surface displacement models can be fit to topographic profiles across structures to estimate displacements and geometries of the underlying faults. Models fit to one such profile (see accompanying figure) across an inward-verging ridge with 500 m of relief in the southeast of Crisium indicate that its fault dips 22°, penetrates to a depth of ~20 km (far beneath the base of the mare deposits), and accumulated ~1 km of along-slip displacement. This result, given the other large structures and inferred buried ridges in Crisium, implies that this mare experienced substantial shortening. Lunar wrinkle ridges are ascribed to some combination of mare subsidence and global contraction; if representative of lunar maria in general, our findings for Crisium suggest that these processes have shaped lunar tectonics to an extent greater than previously recognized. Structural map of Mare Crisium showing wrinkle ridges (flags give down-dip direction), buried ridges (arrows give down-slope direction), buried craters, superposed craters >5 km in diameter, and the location of the topographic profile (green line); inset shows topographic (green) and model (blue) profiles. Graticule has 10° increments in latitude and longitude.

  5. Comparison of human observer and algorithmic target detection in nonurban forward-looking infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.

    2005-07-01

    We have performed an experiment that compares the performance of human observers with that of a robust algorithm for the detection of targets in difficult, nonurban forward-looking infrared imagery. Our purpose was to benchmark the comparison and document performance differences for future algorithm improvement. The scale-insensitive detection algorithm, used as a benchmark by the Night Vision Electronic Sensors Directorate for algorithm evaluation, employed a combination of contrastlike features to locate targets. Detection receiver operating characteristic curves and observer-confidence analyses were used to compare human and algorithmic responses and to gain insight into differences. The test database contained ground targets, in natural clutter, whose detectability, as judged by human observers, ranged from easy to very difficult. In general, as compared with human observers, the algorithm detected most of the same targets, but correlated confidence with correct detections poorly and produced many more false alarms at any useful level of performance. Though characterizing human performance was not the intent of this study, results suggest that previous observational experience was not a strong predictor of human performance, and that combining individual human observations by majority vote significantly reduced false-alarm rates.

  6. SA-SOM algorithm for detecting communities in complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Luogeng; Wang, Yanran; Huang, Xiaoming; Hu, Mengyu; Hu, Fang

    2017-10-01

    Currently, community detection is a hot topic. Based on the self-organizing map (SOM) algorithm, this paper introduces the idea of self-adaptation (SA), whereby the number of communities can be identified automatically, and proposes a novel algorithm, SA-SOM, for detecting communities in complex networks. Several representative real-world networks and a set of computer-generated networks produced by the LFR benchmark are utilized to verify the accuracy and the efficiency of this algorithm. The experimental findings demonstrate that this algorithm can identify the communities automatically, accurately and efficiently. Furthermore, this algorithm can also acquire higher values of modularity, NMI and density than the SOM algorithm does.

  7. Primary studies of trace quantities of green vegetation in Mono Lake area using 1990 AVIRIS data

    NASA Technical Reports Server (NTRS)

    Chen, Zhi-Kang; Elvidge, Chris D.; Groeneveld, David P.

    1992-01-01

    Our primary results in the Jasper Ridge Biological Preserve indicate that high spectral resolution Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data may provide a substantial advantage in vegetation detection, based on the chlorophyll red edge feature from 700-780 nm. The chlorophyll red edge was detected for green vegetation cover as low as 4.8 percent. The objective of our studies in the Mono Lake area is to continue the experiments performed at Jasper Ridge and to examine the persistence of the red edge feature of trace quantities of green vegetation for different plant communities with non-uniform soil backgrounds.

  8. Hotspot activity and plume pulses recorded by geometry of spreading axes

    NASA Astrophysics Data System (ADS)

    Abelson, Meir; Agnon, Amotz

    2001-06-01

    Anomalous plan view geometry (planform) of spreading axes is shown to be a faithful indicator of hotspot influence, possibly capable of detecting pulses of hotspot discharge. A planform anomaly (PA) occurs when the orientation of second-order ridge segments is prominently oblique to the spreading direction. PA is found in the vicinity of hotspots at shallow ridges (<1.5 km), suggesting hotspot influence. In places the PA and shallow bathymetry are accompanied by geochemical anomalies, corroborating hotspot influence. This linkage is best expressed in the western Gulf of Aden, where the extent of the PA from the Afar hotspot coincides with the extent of La/Sm and Sr isotopic anomalies. Using fracture mechanics we predict PA to reflect overpressurized melt that dominates the stresses in the crust, consistent with hotspot regime. Accordingly, the temporal variations of the planform previously inferred from magnetic anomalies around the Kolbeinsey Ridge (KR), north of Iceland, record episodes of interaction with the hotspot and major pulses of the plume. This suggestion is corroborated by temporal correlation of episodes showing PA north of Iceland with plume pulses previously inferred by the V-shaped ridges around the Reykjanes Ridge (RR), south of Iceland. In contrast to the RR, the temporal correlation suggests simultaneous incidence of the plume pulses at Iceland and KR, hundreds of kilometers to the north. A deep northward branch of the Iceland plume active during pulse-periods may explain these observations.

  9. A novel adaptive, real-time algorithm to detect gait events from wearable sensors.

    PubMed

    Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona

    2015-05-01

    A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait trainings and/or assistive devices.
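
    As a rough offline illustration of the event definitions above (not the adaptive, real-time algorithm itself), IC, EC and MS can be located from shank angle and angular-velocity traces by simple peak picking; the signals, sampling rate, and spacing constraint below are synthetic assumptions:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def detect_gait_events(angle, gyro, fs, min_cycle_s=0.6):
        """Offline sketch: IC = minima of the flexion/extension angle,
        EC = minima of the angular velocity, MS = maxima of the angular velocity."""
        dist = int(min_cycle_s * fs)                 # enforce a minimum spacing between events
        ic, _ = find_peaks(-angle, distance=dist)    # minima of the angle
        ec, _ = find_peaks(-gyro, distance=dist)     # minima of the angular velocity
        ms, _ = find_peaks(gyro, distance=dist)      # maxima of the angular velocity
        return {"IC": ic / fs, "EC": ec / fs, "MS": ms / fs}   # event times in seconds

    # Synthetic usage: a crude periodic shank signal sampled at 100 Hz.
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    angle = 20 * np.sin(2 * np.pi * 1.0 * t)        # stand-in flexion/extension angle [deg]
    gyro = np.gradient(angle, 1 / fs)               # stand-in angular velocity [deg/s]
    events = detect_gait_events(angle, gyro, fs)
    print({k: v[:3] for k, v in events.items()})
    ```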

  10. Multiscale peak detection in wavelet space.

    PubMed

    Zhang, Zhi-Min; Tong, Xia; Peng, Ying; Ma, Pan; Zhang, Ming-Jin; Lu, Hong-Mei; Chen, Xiao-Qing; Liang, Yi-Zeng

    2015-12-07

    Accurate peak detection is essential for analyzing high-throughput datasets generated by analytical instruments. Derivatives with noise reduction and matched filtration are frequently used, but they are sensitive to baseline variations, random noise and deviations in the peak shape. A continuous wavelet transform (CWT)-based method is more practical and popular in this situation, which can increase the accuracy and reliability by identifying peaks across scales in wavelet space and implicitly removing noise as well as the baseline. However, its computational load is relatively high and the estimated features of peaks may not be accurate in the case of peaks that are overlapping, dense or weak. In this study, we present multi-scale peak detection (MSPD) by taking full advantage of additional information in wavelet space including ridges, valleys, and zero-crossings. It can achieve a high accuracy by thresholding each detected peak with the maximum of its ridge. It has been comprehensively evaluated with MALDI-TOF spectra in proteomics, the CAMDA 2006 SELDI dataset as well as the Romanian database of Raman spectra, which is particularly suitable for detecting peaks in high-throughput analytical signals. Receiver operating characteristic (ROC) curves show that MSPD can detect more true peaks while keeping the false discovery rate lower than MassSpecWavelet and MALDIquant methods. Superior results in Raman spectra suggest that MSPD seems to be a more universal method for peak detection. MSPD has been designed and implemented efficiently in Python and Cython. It is available as an open source package at .
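
    SciPy ships a related CWT ridge-based detector that conveys the flavor of peak picking in wavelet space (it is not the MSPD implementation); the synthetic spectrum and width range below are assumptions:

    ```python
    import numpy as np
    from scipy.signal import find_peaks_cwt

    # Synthetic "spectrum": three Gaussian peaks on a sloping baseline plus noise.
    rng = np.random.default_rng(0)
    x = np.arange(1000)
    peaks_true = (np.exp(-((x - 200) / 8.0) ** 2) +
                  0.6 * np.exp(-((x - 500) / 15.0) ** 2) +
                  0.3 * np.exp(-((x - 800) / 5.0) ** 2))
    spectrum = peaks_true + 0.002 * x + 0.02 * rng.standard_normal(x.size)

    # Peaks are identified from ridge lines that persist across wavelet scales.
    peak_indices = find_peaks_cwt(spectrum, widths=np.arange(3, 30))
    print(peak_indices)
    ```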

  11. AdaBoost-based algorithm for network intrusion detection.

    PubMed

    Hu, Weiming; Hu, Wei; Maybank, Steve

    2008-04-01

    Network intrusion detection aims at distinguishing the attacks on the Internet from normal use of the Internet. It is an indispensable part of the information security system. Due to the variety of network behaviors and the rapid development of attack fashions, it is necessary to develop fast machine-learning-based intrusion detection algorithms with high detection rates and low false-alarm rates. In this correspondence, we propose an intrusion detection algorithm based on the AdaBoost algorithm. In the algorithm, decision stumps are used as weak classifiers. The decision rules are provided for both categorical and continuous features. By combining the weak classifiers for continuous features and the weak classifiers for categorical features into a strong classifier, the relations between these two different types of features are handled naturally, without any forced conversions between continuous and categorical features. Adaptable initial weights and a simple strategy for avoiding overfitting are adopted to improve the performance of the algorithm. Experimental results show that our algorithm has low computational complexity and error rates, as compared with algorithms of higher computational complexity, as tested on the benchmark sample data.
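
    The core construction, boosting one-level decision trees (stumps) into a strong classifier, can be sketched with scikit-learn; the synthetic connection features and labels are placeholders for real traffic records:

    ```python
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    # Placeholder data standing in for labeled network-connection features
    # (0 = normal traffic, 1 = attack).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))
    y = (X[:, 0] + 0.5 * X[:, 3] + 0.2 * rng.normal(size=2000) > 0.8).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # AdaBoost's default base learner is a depth-1 decision tree, i.e. a decision stump.
    clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    ```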

  12. Corner detection and sorting method based on improved Harris algorithm in camera calibration

    NASA Astrophysics Data System (ADS)

    Xiao, Ying; Wang, Yonghong; Dan, Xizuo; Huang, Anqi; Hu, Yue; Yang, Lianxiang

    2016-11-01

    In traditional Harris corner detection algorithm, the appropriate threshold which is used to eliminate false corners is selected manually. In order to detect corners automatically, an improved algorithm which combines Harris and circular boundary theory of corners is proposed in this paper. After detecting accurate corner coordinates by using Harris algorithm and Forstner algorithm, false corners within chessboard pattern of the calibration plate can be eliminated automatically by using circular boundary theory. Moreover, a corner sorting method based on an improved calibration plate is proposed to eliminate false background corners and sort remaining corners in order. Experiment results show that the proposed algorithms can eliminate all false corners and sort remaining corners correctly and automatically.
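
    The Harris stage of such a pipeline is available in OpenCV; a minimal sketch follows, with the manual threshold left in precisely because removing it is what the circular-boundary step is meant to do (the file name and parameter values are placeholders):

    ```python
    import cv2
    import numpy as np

    # Load a chessboard calibration image (placeholder path) as grayscale.
    gray = cv2.imread("chessboard.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

    # Harris corner response; blockSize, Sobel aperture and k are typical values.
    response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)

    # Plain Harris still needs a manually chosen threshold to suppress false corners;
    # the improved algorithm described above replaces this step with an automatic test.
    corners = np.argwhere(response > 0.01 * response.max())
    print(f"{len(corners)} candidate corners")
    ```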

  13. QRS Detection Algorithm for Telehealth Electrocardiogram Recordings.

    PubMed

    Khamis, Heba; Weiss, Robert; Xie, Yang; Chang, Chan-Wei; Lovell, Nigel H; Redmond, Stephen J

    2016-07-01

    QRS detection algorithms are needed to analyze electrocardiogram (ECG) recordings generated in telehealth environments. However, the numerous published QRS detectors focus on clean clinical data. Here, a "UNSW" QRS detection algorithm is described that is suitable for clinical ECG and also poorer quality telehealth ECG. The UNSW algorithm generates a feature signal containing information about ECG amplitude and derivative, which is filtered according to its frequency content and an adaptive threshold is applied. The algorithm was tested on clinical and telehealth ECG and the QRS detection performance is compared to the Pan-Tompkins (PT) and Gutiérrez-Rivas (GR) algorithm. For the MIT-BIH Arrhythmia database (virtually artifact free, clinical ECG), the overall sensitivity (Se) and positive predictivity (+P) of the UNSW algorithm was >99%, which was comparable to PT and GR. When applied to the MIT-BIH noise stress test database (clinical ECG with added calibrated noise) after artifact masking, all three algorithms had overall Se >99%, and the UNSW algorithm had higher +P (98%, p < 0.05) than PT and GR. For 250 telehealth ECG records (unsupervised recordings; dry metal electrodes), the UNSW algorithm had 98% Se and 95% +P which was superior to PT (+P: p < 0.001) and GR (Se and +P: p < 0.001). This is the first study to describe a QRS detection algorithm for telehealth data and evaluate it on clinical and telehealth ECG with superior results to published algorithms. The UNSW algorithm could be used to manage increasing telehealth ECG analysis workloads.

  14. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    PubMed

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Bio-ALIRT biosurveillance detection algorithm evaluation.

    PubMed

    Siegrist, David; Pavlin, J

    2004-09-24

    Early detection of disease outbreaks by a medical biosurveillance system relies on two major components: 1) the contribution of early and reliable data sources and 2) the sensitivity, specificity, and timeliness of biosurveillance detection algorithms. This paper describes an effort to assess leading detection algorithms by arranging a common challenge problem and providing a common data set. The objectives of this study were to determine whether automated detection algorithms can reliably and quickly identify the onset of natural disease outbreaks that are surrogates for possible terrorist pathogen releases, and do so at acceptable false-alert rates (e.g., once every 2-6 weeks). Historic de-identified data were obtained from five metropolitan areas over 23 months; these data included International Classification of Diseases, Ninth Revision (ICD-9) codes related to respiratory and gastrointestinal illness syndromes. An outbreak detection group identified and labeled two natural disease outbreaks in these data and provided them to analysts for training of detection algorithms. All outbreaks in the remaining test data were identified but not revealed to the detection groups until after their analyses. The algorithms established a probability of outbreak for each day's counts. The probability of outbreak was assessed as an "actual" alert for different false-alert rates. The best algorithms were able to detect all of the outbreaks at false-alert rates of one every 2-6 weeks. They were often able to detect an outbreak on the same day that human investigators had identified as its true start. Because minimal data exists for an actual biologic attack, determining how quickly an algorithm might detect such an attack is difficult. However, application of these algorithms in combination with other data-analysis methods to historic outbreak data indicates that biosurveillance techniques for analyzing syndrome counts can rapidly detect seasonal respiratory and gastrointestinal illness outbreaks. Further research is needed to assess the value of electronic data sources for predictive detection. In addition, simulations need to be developed and implemented to better characterize the size and type of biologic attack that can be detected by current methods by challenging them under different projected operational conditions.

  16. A lightweight QRS detector for single lead ECG signals using a max-min difference algorithm.

    PubMed

    Pandit, Diptangshu; Zhang, Li; Liu, Chengyu; Chattopadhyay, Samiran; Aslam, Nauman; Lim, Chee Peng

    2017-06-01

    Detection of the R-peak pertaining to the QRS complex of an ECG signal plays an important role for the diagnosis of a patient's heart condition. To accurately identify the QRS locations from the acquired raw ECG signals, we need to handle a number of challenges, which include noise, baseline wander, varying peak amplitudes, and signal abnormality. This research aims to address these challenges by developing an efficient lightweight algorithm for QRS (i.e., R-peak) detection from raw ECG signals. A lightweight real-time sliding window-based Max-Min Difference (MMD) algorithm for QRS detection from Lead II ECG signals is proposed. Targeting to achieve the best trade-off between computational efficiency and detection accuracy, the proposed algorithm consists of five key steps for QRS detection, namely, baseline correction, MMD curve generation, dynamic threshold computation, R-peak detection, and error correction. Five annotated databases from Physionet are used for evaluating the proposed algorithm in R-peak detection. Integrated with a feature extraction technique and a neural network classifier, the proposed QRS detection algorithm has also been extended to undertake normal and abnormal heartbeat detection from ECG signals. The proposed algorithm exhibits a high degree of robustness in QRS detection and achieves an average sensitivity of 99.62% and an average positive predictivity of 99.67%. Its performance compares favorably with those from the existing state-of-the-art models reported in the literature. In regards to normal and abnormal heartbeat detection, the proposed QRS detection algorithm in combination with the feature extraction technique and neural network classifier achieves an overall accuracy rate of 93.44% based on an empirical evaluation using the MIT-BIH Arrhythmia data set with 10-fold cross validation. In comparison with other related studies, the proposed algorithm offers a lightweight adaptive alternative for R-peak detection with good computational efficiency. The empirical results indicate that it not only yields a high accuracy rate in QRS detection, but also exhibits efficient computational complexity at the order of O(n), where n is the length of an ECG signal. Copyright © 2017 Elsevier B.V. All rights reserved.
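
    A stripped-down version of the max-min difference idea (ignoring the baseline-correction, dynamic-threshold and error-correction refinements described above) might look like the following; the window length, threshold factor, and refractory spacing are assumptions:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def mmd_qrs_detect(ecg, fs, window_s=0.1, thresh_factor=0.5, min_rr_s=0.3):
        """Sketch of sliding-window Max-Min Difference R-peak detection (Lead II ECG)."""
        w = max(1, int(window_s * fs))
        n = len(ecg)
        # MMD curve: max minus min of the signal inside a window centred on each sample.
        mmd = np.array([np.ptp(ecg[max(0, i - w // 2):min(n, i + w // 2 + 1)])
                        for i in range(n)])
        threshold = thresh_factor * mmd.max()      # crude stand-in for the dynamic threshold
        peaks, _ = find_peaks(mmd, height=threshold, distance=int(min_rr_s * fs))
        return peaks                               # sample indices of candidate R-peaks

    # Toy usage: a synthetic 1 Hz train of narrow spikes plus noise, sampled at 250 Hz.
    fs = 250
    t = np.arange(0, 10, 1 / fs)
    ecg = np.where((t % 1.0) < 0.02, 1.0, 0.0)
    ecg += 0.05 * np.random.default_rng(0).standard_normal(t.size)
    print(mmd_qrs_detect(ecg, fs))
    ```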

  17. Detection of Coronal Mass Ejections Using Multiple Features and Space-Time Continuity

    NASA Astrophysics Data System (ADS)

    Zhang, Ling; Yin, Jian-qin; Lin, Jia-ben; Feng, Zhi-quan; Zhou, Jin

    2017-07-01

    Coronal Mass Ejections (CMEs) release tremendous amounts of energy in the solar system, which has an impact on satellites, power facilities and wireless transmission. To effectively detect a CME in Large Angle Spectrometric Coronagraph (LASCO) C2 images, we propose a novel algorithm to locate the suspected CME regions, using the Extreme Learning Machine (ELM) method and taking into account the features of the grayscale and the texture. Furthermore, space-time continuity is used in the detection algorithm to exclude the false CME regions. The algorithm includes three steps: i) define the feature vector which contains textural and grayscale features of a running difference image; ii) design the detection algorithm based on the ELM method according to the feature vector; iii) improve the detection accuracy rate by using the decision rule of the space-time continuum. Experimental results show the efficiency and the superiority of the proposed algorithm in the detection of CMEs compared with other traditional methods. In addition, our algorithm is insensitive to most noise.
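
    The ELM classifier used in step ii) is simple enough to sketch directly: a fixed random hidden layer followed by a least-squares output layer. The feature dimension, hidden size, and synthetic data below are assumptions standing in for the grayscale/texture feature vectors of candidate regions:

    ```python
    import numpy as np

    class ELMClassifier:
        """Minimal Extreme Learning Machine: random hidden layer + least-squares readout."""

        def __init__(self, n_hidden=100, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y):
            n_features = X.shape[1]
            self.W = self.rng.normal(size=(n_features, self.n_hidden))  # fixed random weights
            self.b = self.rng.normal(size=self.n_hidden)                # fixed random biases
            H = np.tanh(X @ self.W + self.b)                            # hidden-layer activations
            self.beta = np.linalg.pinv(H) @ y                           # least-squares output weights
            return self

        def predict(self, X):
            H = np.tanh(X @ self.W + self.b)
            return (H @ self.beta > 0.5).astype(int)

    # Toy usage with synthetic binary labels (1 = candidate CME region, 0 = background).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 8))
    y = (X[:, 0] - X[:, 2] > 0).astype(int)
    model = ELMClassifier(n_hidden=50).fit(X[:400], y[:400])
    print("held-out accuracy:", (model.predict(X[400:]) == y[400:]).mean())
    ```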

  18. STREAMFINDER - I. A new algorithm for detecting stellar streams

    NASA Astrophysics Data System (ADS)

    Malhan, Khyati; Ibata, Rodrigo A.

    2018-07-01

    We have designed a powerful new algorithm to detect stellar streams in an automated and systematic way. The algorithm, which we call the STREAMFINDER, is well suited for finding dynamically cold and thin stream structures that may lie along any simple or complex orbits in Galactic stellar surveys containing any combination of positional and kinematic information. In the present contribution, we introduce the algorithm, lay out the ideas behind it, explain the methodology adopted to detect streams, and detail its workings by running it on a suite of simulations of mock Galactic survey data of similar quality to that expected from the European Space Agency/Gaia mission. We show that our algorithm is able to detect even ultra-faint stream features lying well below previous detection limits. Tests show that our algorithm will be able to detect distant halo stream structures >10° long containing as few as ~15 members (ΣG ~ 33.6 mag arcsec^-2) in the Gaia data set.

  19. Distributed learning automata-based algorithm for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Khomami, Mohammad Mehdi Daliri; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-03-01

    Community structure is an important and universal topological property of many complex networks such as social and information networks. The detection of communities in a network is a significant technique for understanding the structure and function of networks. In this paper, we propose an algorithm based on distributed learning automata for community detection (DLACD) in complex networks. In the proposed algorithm, each vertex of the network is equipped with a learning automaton. Through cooperation among the network of learning automata and updates of each automaton's action probabilities, the algorithm iteratively tries to identify high-density local communities. The performance of the proposed algorithm is investigated through a number of simulations on popular synthetic and real networks. Experimental results in comparison with popular community detection algorithms such as Walktrap, Danon greedy optimization, fuzzy community detection, multi-resolution community detection and label propagation demonstrate the superiority of DLACD in terms of modularity, NMI, performance, min-max-cut and coverage.
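
    The probability-update rule at the heart of such schemes can be illustrated with the standard linear reward-inaction automaton shown below: when the environment rewards the chosen community label (for example, because the local community density improved), that label's probability is reinforced and the others are scaled down; on a penalty nothing changes. The learning rate and reward signal here are illustrative assumptions, not the DLACD paper's exact scheme.

      import numpy as np

      def lri_update(p, chosen, reward, a=0.1):
          """p: action-probability vector of one automaton; chosen: index of the selected action."""
          if reward:                       # reward: shift probability mass toward the chosen action
              p = p.copy()
              p[chosen] += a * (1.0 - p[chosen])
              others = np.arange(len(p)) != chosen
              p[others] *= (1.0 - a)
          return p                         # inaction: a penalty leaves the probabilities unchanged

      p = np.full(4, 0.25)                 # four candidate community labels for one vertex
      p = lri_update(p, chosen=2, reward=True)
      print(p, p.sum())                    # probabilities still sum to 1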

  20. An Improved Harmonic Current Detection Method Based on Parallel Active Power Filter

    NASA Astrophysics Data System (ADS)

    Zeng, Zhiwu; Xie, Yunxiang; Wang, Yingpin; Guan, Yuanpeng; Li, Lanfang; Zhang, Xiaoyu

    2017-05-01

    Harmonic detection technology plays an important role in the application of active power filters. The accuracy and real-time performance of harmonic detection are preconditions for ensuring the compensation performance of an Active Power Filter (APF). This paper proposes an improved instantaneous reactive power harmonic current detection algorithm: an improved ip-iq algorithm combined with a moving average filter. The proposed ip-iq algorithm can remove the αβ and dq coordinate transformations, decreasing the computational cost, simplifying the extraction of the fundamental components of the load currents, and improving the detection speed. The traditional low-pass filter is replaced by the moving average filter, which detects the harmonic currents more precisely and quickly. Compared with the traditional algorithm, the THD (Total Harmonic Distortion) of the grid currents is reduced from 4.41% to 3.89% in simulations and from 8.50% to 4.37% in experiments after the improvement. The results show that the proposed algorithm is more accurate and efficient.
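
    The role of the moving-average filter can be seen in the single-phase simplification below (the paper works with three-phase ip-iq quantities, so this is only an illustration with assumed values): once the load current is projected onto the synchronous reference, the fundamental becomes a DC term, and averaging over exactly one fundamental period removes the harmonic ripple without the phase lag of a conventional low-pass filter.

      import numpy as np

      fs, f1 = 10_000, 50                  # sampling rate and grid frequency (illustrative)
      t = np.arange(0, 0.2, 1 / fs)
      # Load current: fundamental plus 5th and 7th harmonics, typical of a rectifier load
      i_load = (np.sin(2*np.pi*f1*t)
                + 0.20*np.sin(2*np.pi*5*f1*t)
                + 0.14*np.sin(2*np.pi*7*f1*t))

      theta = 2 * np.pi * f1 * t           # PLL angle, assumed ideal here
      i_p = i_load * np.sin(theta)         # instantaneous "active" projection
      N = fs // f1                         # one fundamental period in samples
      i_p_dc = np.convolve(i_p, np.ones(N) / N, mode="same")   # moving-average filter
      i_fund = 2 * i_p_dc * np.sin(theta)  # reconstructed fundamental component
      i_harm_ref = i_load - i_fund         # harmonic reference current for the APF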

  1. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. Detection algorithms employed play a crucial role in fulfilling the detection component in the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follows a Gaussian distribution. This paper explores the potential for detection performance advantages when operating in the Fourier domain of long exposure images of small and/or dim space objects from ground based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image only contains background noise. The detection algorithm tests each pixel point of the Fourier transformed images to make the determination if an object is present based on the criteria threshold found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.
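
    As a much-simplified illustration of a pixel-wise binary hypothesis test carried out with Fourier-domain processing (white Gaussian noise, a known point-spread function, and a user-chosen false-alarm level are all assumptions here; the paper derives its statistic from the joint distribution of the Fourier-transformed image itself):

      import numpy as np

      def lrt_detect(image, psf, sigma, z=3.1):
          """Boolean detection map; z of about 3.1 corresponds to roughly a 1e-3 per-pixel false-alarm rate."""
          F_img = np.fft.fft2(image)
          F_psf = np.fft.fft2(psf, s=image.shape)
          # Matched-filter (correlation) output at every shift, computed in the Fourier domain
          corr = np.real(np.fft.ifft2(F_img * np.conj(F_psf)))
          # Under the noise-only hypothesis the statistic is N(0, sigma^2 * sum(psf^2))
          tau = z * sigma * np.sqrt(np.sum(psf ** 2))
          return corr > tau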

  2. Effect of Non-rigid Registration Algorithms on Deformation Based Morphometry: A Comparative Study with Control and Williams Syndrome Subjects

    PubMed Central

    Han, Zhaoying; Thornton-Wells, Tricia A.; Dykens, Elisabeth M.; Gore, John C.; Dawant, Benoit M.

    2014-01-01

    Deformation Based Morphometry (DBM) is a widely used method for characterizing anatomical differences across groups. DBM is based on the analysis of the deformation fields generated by non-rigid registration algorithms, which warp the individual volumes to a DBM atlas. Although several studies have compared non-rigid registration algorithms for segmentation tasks, few studies have compared the effect of the registration algorithms on group differences that may be uncovered through DBM. In this study, we compared group atlas creation and DBM results obtained with five well-established non-rigid registration algorithms using thirteen subjects with Williams Syndrome (WS) and thirteen Normal Control (NC) subjects. The five non-rigid registration algorithms include: (1) The Adaptive Bases Algorithm (ABA); (2) The Image Registration Toolkit (IRTK); (3) The FSL Nonlinear Image Registration Tool (FSL); (4) The Automatic Registration Tool (ART); and (5) the normalization algorithm available in SPM8. Results indicate that the choice of algorithm has little effect on the creation of group atlases. However, regions of differences between groups detected with DBM vary from algorithm to algorithm both qualitatively and quantitatively. The unique nature of the data set used in this study also permits comparison of visible anatomical differences between the groups and regions of difference detected by each algorithm. Results show that the interpretation of DBM results is difficult. Four out of the five algorithms we have evaluated detect bilateral differences between the two groups in the insular cortex, the basal ganglia, orbitofrontal cortex, as well as in the cerebellum. These correspond to differences that have been reported in the literature and that are visible in our samples. But our results also show that some algorithms detect regions that are not detected by the others and that the extent of the detected regions varies from algorithm to algorithm. These results suggest that using more than one algorithm when performing DBM studies would increase confidence in the results. Properties of the algorithms such as the similarity measure they maximize and the regularity of the deformation fields, as well as the location of differences detected with DBM, also need to be taken into account in the interpretation process. PMID:22459439

  3. A Region Tracking-Based Vehicle Detection Algorithm in Nighttime Traffic Scenes

    PubMed Central

    Wang, Jianqiang; Sun, Xiaoyan; Guo, Junbin

    2013-01-01

    The detection of preceding vehicles in nighttime traffic scenes is an important part of the advanced driver assistance system (ADAS). This paper proposes a region tracking-based vehicle detection algorithm based on image processing techniques. First, the brightness of the taillights during nighttime is used as the typical feature, and we use an existing global detection algorithm to detect and pair the taillights. When the vehicle is detected, a time series analysis model is introduced to predict vehicle positions and the possible region (PR) of the vehicle in the next frame. Then, the vehicle is only detected in the PR. This could reduce the detection time and avoid false pairing between bright spots inside the PR and bright spots outside the PR. Additionally, we present a threshold updating method to make the thresholds adaptive. Finally, experimental studies are provided to demonstrate the application and substantiate the superiority of the proposed algorithm. The results show that the proposed algorithm can simultaneously reduce both the false negative detection rate and the false positive detection rate.

  4. Expert system constant false alarm rate processor

    NASA Astrophysics Data System (ADS)

    Baldygo, William J., Jr.; Wicks, Michael C.

    1993-10-01

    The requirements for high detection probability and low false alarm probability in modern wide area surveillance radars are rarely met due to spatial variations in clutter characteristics. Many filtering and CFAR detection algorithms have been developed to effectively deal with these variations; however, any single algorithm is likely to exhibit excessive false alarms and intolerably low detection probabilities in a dynamically changing environment. A great deal of research has led to advances in the state of the art in Artificial Intelligence (AI) and numerous areas have been identified for application to radar signal processing. The approach suggested here, discussed in a patent application submitted by the authors, is to intelligently select the filtering and CFAR detection algorithms being executed at any given time, based upon the observed characteristics of the interference environment. This approach requires sensing the environment, employing the most suitable algorithms, and applying an appropriate multiple algorithm fusion scheme or consensus algorithm to produce a global detection decision.

  5. Toward an Objective Enhanced-V Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Brunner, Jason C.; Feltz, Wayne F.; Ackerman, Steven A.; Rabin, Robert M.

    2007-01-01

    The area of coldest cloud tops above thunderstorms sometimes has a distinct V or U shape. This pattern, often referred to as an "enhanced-V" signature, has been observed to occur during and preceding severe weather. This study describes an algorithmic approach to objectively detect overshooting tops, temperature couplets, and enhanced-V features with observations from the Geostationary Operational Environmental Satellite (GOES) and Low Earth Orbit data. The methodology consists of temperature, temperature difference, and distance thresholds for the overshooting top and temperature couplet detection parts of the algorithm, and of cross-correlation statistics of pixels for the enhanced-V detection part of the algorithm. The effectiveness of the overshooting top and temperature couplet detection components of the algorithm is examined using GOES and MODIS image data for case studies in the 2003-2006 seasons. The main goal is for the algorithm to be useful for operations with future sensors, such as GOES-R.
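
    The threshold-based part of such a scheme can be sketched as below: flag overshooting-top candidates as brightness-temperature minima colder than a fixed threshold, then look for a sufficiently warm pixel within a small distance window to form a temperature couplet. The 215 K, 6 K and 15-pixel values are placeholders for illustration, not the study's operational settings.

      import numpy as np

      def find_couplets(tb, cold_thresh=215.0, d_t=6.0, max_dist=15):
          """tb: 2-D brightness-temperature array (K); returns (cold, warm) pixel pairs."""
          couplets = []
          for (r, c) in np.argwhere(tb < cold_thresh):            # candidate overshooting tops
              r0, r1 = max(0, r - max_dist), min(tb.shape[0], r + max_dist + 1)
              c0, c1 = max(0, c - max_dist), min(tb.shape[1], c + max_dist + 1)
              window = tb[r0:r1, c0:c1]
              wr, wc = np.unravel_index(np.argmax(window), window.shape)
              if window[wr, wc] - tb[r, c] >= d_t:                # warm anomaly near the cold top
                  couplets.append(((r, c), (r0 + wr, c0 + wc)))
          return couplets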

  6. Enhancing Deep-Water Low-Resolution Gridded Bathymetry Using Single Image Super-Resolution

    NASA Astrophysics Data System (ADS)

    Elmore, P. A.; Nock, K.; Bonanno, D.; Smith, L.; Ferrini, V. L.; Petry, F. E.

    2017-12-01

    We present research employing single-image super-resolution (SISR) algorithms to enhance knowledge of the seafloor using the 1-minute GEBCO 2014 grid when 100 m grids from high-resolution sonar systems are available for training. Our numerical experiments perform x15 upscaling of the GEBCO grid in three areas of the Eastern Pacific Ocean along mid-ocean ridge systems where we have these 100 m gridded bathymetry data sets, which we accept as ground truth. We show that four SISR algorithms can enhance this low-resolution knowledge of bathymetry versus bicubic or Spline-In-Tension algorithms through upscaling under these conditions: 1) rough topography is present in both training and testing areas, and 2) the range of depths and features in the training area contains the range of depths in the enhancement area. We quantitatively judged SISR enhancement successful versus bicubic interpolation when Student's hypothesis testing showed a significant improvement in the root-mean-squared error (RMSE) between upscaled bathymetry and 100 m gridded ground-truth bathymetry at p < 0.05. In addition, we found evidence that random forest based SISR methods may provide more robust enhancements than non-forest based SISR algorithms.

  7. Improving energy efficiency in handheld biometric applications

    NASA Astrophysics Data System (ADS)

    Hoyle, David C.; Gale, John W.; Schultz, Robert C.; Rakvic, Ryan N.; Ives, Robert W.

    2012-06-01

    With improved smartphone and tablet technology, it is becoming increasingly feasible to implement powerful biometric recognition algorithms on portable devices. Typical iris recognition algorithms, such as Ridge Energy Direction (RED), utilize two-dimensional convolution in their implementation. This paper explores the energy consumption implications of 12 different methods of implementing two-dimensional convolution on a portable device. Typically, convolution is implemented using floating point operations. If a given algorithm implemented integer convolution vice floating point convolution, it could drastically reduce the energy consumed by the processor. The 12 methods compared include 4 major categories: Integer C, Integer Java, Floating Point C, and Floating Point Java. Each major category is further divided into 3 implementations: variable size looped convolution, static size looped convolution, and unrolled looped convolution. All testing was performed using the HTC Thunderbolt with energy measured directly using a Tektronix TDS5104B Digital Phosphor oscilloscope. Results indicate that energy savings as high as 75% are possible by using Integer C versus Floating Point C. Considering the relative proportion of processing time that convolution is responsible for in a typical algorithm, the savings in energy would likely result in significantly greater time between battery charges.
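
    The fixed-point idea behind the integer implementations can be illustrated in a few lines (in Python/NumPy rather than the study's C and Java code, and saying nothing about the measured energy savings): scale the image and kernel to integers, convolve entirely in integer arithmetic, then rescale, accepting a small quantization error.

      import numpy as np

      def conv2d(image, kernel):
          kh, kw = kernel.shape
          out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1), dtype=image.dtype)
          for i in range(out.shape[0]):
              for j in range(out.shape[1]):
                  out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
          return out

      img_f = np.random.rand(64, 64)
      ker_f = np.random.rand(9, 9)

      SCALE = 256                                    # fixed-point scaling factor (an assumption)
      img_i = (img_f * SCALE).astype(np.int32)
      ker_i = (ker_f * SCALE).astype(np.int32)

      out_float = conv2d(img_f, ker_f)
      out_int = conv2d(img_i, ker_i).astype(np.float64) / (SCALE * SCALE)
      print(np.max(np.abs(out_float - out_int)))     # quantization error is small relative to the output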

  8. Evaluating the utility of syndromic surveillance algorithms for screening to detect potentially clonal hospital infection outbreaks

    PubMed Central

    Talbot, Thomas R; Schaffner, William; Bloch, Karen C; Daniels, Titus L; Miller, Randolph A

    2011-01-01

    Objective The authors evaluated algorithms commonly used in syndromic surveillance for use as screening tools to detect potentially clonal outbreaks for review by infection control practitioners. Design Study phase 1 applied four aberrancy detection algorithms (CUSUM, EWMA, space-time scan statistic, and WSARE) to retrospective microbiologic culture data, producing a list of past candidate outbreak clusters. In phase 2, four infectious disease physicians categorized the phase 1 algorithm-identified clusters to ascertain algorithm performance. In phase 3, project members combined the algorithms to create a unified screening system and conducted a retrospective pilot evaluation. Measurements The study calculated recall and precision for each algorithm, and created precision-recall curves for various methods of combining the algorithms into a unified screening tool. Results Individual algorithm recall and precision ranged from 0.21 to 0.31 and from 0.053 to 0.29, respectively. Few candidate outbreak clusters were identified by more than one algorithm. The best method of combining the algorithms yielded an area under the precision-recall curve of 0.553. The phase 3 combined system detected all infection control-confirmed outbreaks during the retrospective evaluation period. Limitations Lack of phase 2 reviewers' agreement indicates that subjective expert review was an imperfect gold standard. Less conservative filtering of culture results and alternate parameter selection for each algorithm might have improved algorithm performance. Conclusion Hospital outbreak detection presents different challenges than traditional syndromic surveillance. Nevertheless, algorithms developed for syndromic surveillance have potential to form the basis of a combined system that might perform clinically useful hospital outbreak screening. PMID:21606134
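
    For readers unfamiliar with the aberrancy detectors named above, a one-sided CUSUM of daily counts, the simplest of them, can be sketched as follows; the baseline mean and standard deviation, the reference value k, and the decision limit h are illustrative assumptions rather than the study's tuned parameters.

      import numpy as np

      def cusum_alarms(counts, mu0, sigma0, k=0.5, h=4.0):
          """Indices of days on which the one-sided CUSUM crosses the decision limit."""
          s, alarms = 0.0, []
          for t, x in enumerate(counts):
              z = (x - mu0) / sigma0            # standardize against the baseline
              s = max(0.0, s + z - k)           # accumulate only upward departures
              if s > h:
                  alarms.append(t)
                  s = 0.0                       # reset after signalling
          return alarms

      daily_positive_cultures = [2, 1, 3, 2, 2, 6, 7, 8, 3, 2]            # made-up counts
      print(cusum_alarms(daily_positive_cultures, mu0=2.0, sigma0=1.0))   # -> [6, 7]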

  9. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. There are many algorithms that detect anomalies (outliers) in videos and images that have been introduced in recent years. However, these algorithms behave and perform differently based on differences in domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. There are many evaluation metrics that have been used in the literature such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these by measuring the performance of an existing anomaly detection algorithm.
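
    Once an evaluation scheme has labelled each scored detection as true or false, a precision-recall curve follows from a single sweep over the scores, as in the sketch below (the scores and labels are made-up illustrative values).

      import numpy as np

      def precision_recall_curve(scores, is_true, n_positives):
          order = np.argsort(scores)[::-1]               # sweep the decision threshold downward
          tp = np.cumsum(np.asarray(is_true)[order])
          fp = np.cumsum(~np.asarray(is_true)[order])
          return tp / (tp + fp), tp / n_positives        # precision, recall

      scores = [0.95, 0.9, 0.8, 0.7, 0.6, 0.5]
      labels = np.array([True, True, False, True, False, False])
      precision, recall = precision_recall_curve(scores, labels, n_positives=4)
      print(np.c_[precision, recall])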

  10. One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms.

    PubMed

    Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus

    2017-04-01

    Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings of two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations. The evaluation used both event duration parameters, and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of what algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple events, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transaction on Biomedical Engineering, 60(9):2484-2493,2013) outperforms all algorithms in data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.

  11. Development and validation of a dual sensing scheme to improve accuracy of bradycardia and pause detection in an insertable cardiac monitor.

    PubMed

    Passman, Rod S; Rogers, John D; Sarkar, Shantanu; Reiland, Jerry; Reisfeld, Erin; Koehler, Jodi; Mittal, Suneet

    2017-07-01

    Undersensing of premature ventricular beats and low-amplitude R waves are primary causes for inappropriate bradycardia and pause detections in insertable cardiac monitors (ICMs). The purpose of this study was to develop and validate an enhanced algorithm to reduce inappropriately detected bradycardia and pause episodes. Independent data sets to develop and validate the enhanced algorithm were derived from a database of ICM-detected bradycardia and pause episodes in de-identified patients monitored for unexplained syncope. The original algorithm uses an auto-adjusting sensitivity threshold for R-wave sensing to detect tachycardia and avoid T-wave oversensing. In the enhanced algorithm, a second sensing threshold is used with a long blanking and fixed lower sensitivity threshold, looking for evidence of undersensed signals. Data reported includes percent change in appropriate and inappropriate bradycardia and pause detections as well as changes in episode detection sensitivity and positive predictive value with the enhanced algorithm. The validation data set, from 663 consecutive patients, consisted of 4904 (161 patients) bradycardia and 2582 (133 patients) pause episodes, of which 2976 (61%) and 996 (39%) were appropriately detected bradycardia and pause episodes. The enhanced algorithm reduced inappropriate bradycardia and pause episodes by 95% and 47%, respectively, with 1.7% and 0.6% reduction in appropriate episodes, respectively. The average episode positive predictive value improved by 62% (P < .001) for bradycardia detection and by 26% (P < .001) for pause detection, with an average relative sensitivity of 95% (P < .001) and 99% (P = .5), respectively. The enhanced dual sense algorithm for bradycardia and pause detection in ICMs substantially reduced inappropriate episode detection with a minimal reduction in true episode detection. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Adapting detection sensitivity based on evidence of irregular sinus arrhythmia to improve atrial fibrillation detection in insertable cardiac monitors.

    PubMed

    Pürerfellner, Helmut; Sanders, Prashanthan; Sarkar, Shantanu; Reisfeld, Erin; Reiland, Jerry; Koehler, Jodi; Pokushalov, Evgeny; Urban, Luboš; Dekker, Lukas R C

    2017-10-03

    Intermittent change in p-wave discernibility during periods of ectopy and sinus arrhythmia is a cause of inappropriate atrial fibrillation (AF) detection in insertable cardiac monitors (ICM). To address this, we developed and validated an enhanced AF detection algorithm. Atrial fibrillation detection in Reveal LINQ ICM uses patterns of incoherence in RR intervals and absence of P-wave evidence over a 2-min period. The enhanced algorithm includes P-wave evidence during RR irregularity as evidence of sinus arrhythmia or ectopy to adaptively optimize sensitivity for AF detection. The algorithm was developed and validated using Holter data from the XPECT and LINQ Usability studies which collected surface electrocardiogram (ECG) and continuous ICM ECG over a 24-48 h period. The algorithm detections were compared with Holter annotations, performed by multiple reviewers, to compute episode and duration detection performance. The validation dataset comprised of 3187 h of valid Holter and LINQ recordings from 138 patients, with true AF in 37 patients yielding 108 true AF episodes ≥2-min and 449 h of AF. The enhanced algorithm reduced inappropriately detected episodes by 49% and duration by 66% with <1% loss in true episodes or duration. The algorithm correctly identified 98.9% of total AF duration and 99.8% of total sinus or non-AF rhythm duration. The algorithm detected 97.2% (99.7% per-patient average) of all AF episodes ≥2-min, and 84.9% (95.3% per-patient average) of detected episodes involved AF. An enhancement that adapts sensitivity for AF detection reduced inappropriately detected episodes and duration with minimal reduction in sensitivity. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Cardiology

  13. Modeling and analysis of the solar concentrator in photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Mroczka, Janusz; Plachta, Kamil

    2015-06-01

    The paper presents a Λ-ridge and V-trough concentrator system with a low concentration ratio. Calculations and simulations have been made in a program created by the author. The simulation results allow the best parameters of the photovoltaic system to be chosen: the opening angle between the surface of the photovoltaic module and the mirrors, the resolution of the tracking system, and the material for construction of the concentrator mirrors. The research shows the effect of each of these parameters on the efficiency of the photovoltaic system, as well as a method of surface modeling using the BRDF function. The parameters of the concentrator surface (e.g., surface roughness) were calculated using a new algorithm based on the BRDF function. The algorithm uses a combination of the Torrance-Sparrow and HTSG models. The simulation shows the change in voltage, current, and output power depending on the system parameters.

  14. Revealing Individual Lifestyles through Mass Spectrometry Imaging of Chemical Compounds in Fingerprints.

    PubMed

    Hinners, Paige; O'Neill, Kelly C; Lee, Young Jin

    2018-03-26

    Fingerprints, specifically the ridge details within the print, have long been used in forensic investigations for individual identification. Beyond the ridge detail, fingerprints contain useful chemical information. The study of fingerprint chemical information has become of interest, especially with mass spectrometry imaging technologies. Mass spectrometry imaging visualizes the spatial relationship of each compound detected, allowing ridge detail and chemical information to be obtained in a single analysis. In this work, a range of exogenous fingerprint compounds that may reveal a personal lifestyle were studied using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The studied chemical compounds include various brands of bug sprays and sunscreens, as well as food oils, alcohols, and citrus fruits. Brand differentiation and source determination were possible based on the active ingredients or exclusive compounds left in fingerprints. Tandem mass spectrometry was performed for the key compounds, so that these compounds could be confidently identified in a single multiplex mass spectrometry imaging data acquisition.

  15. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE PAGES

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; ...

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
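
    The idea that distinguishes Stride Search from a grid-point search can be conveyed in a few lines: step through latitude in increments matching a fixed physical search radius and widen the longitude stride by 1/cos(latitude) so the spacing stays roughly uniform in kilometres up to the poles. The radius below is an arbitrary illustrative value, and the sketch only generates search-circle centres, not the storm criteria evaluated inside them.

      import numpy as np

      EARTH_RADIUS_KM = 6371.0

      def stride_search_centers(radius_km=500.0, lat_min=-90.0, lat_max=90.0):
          centers = []
          dlat = np.degrees(radius_km / EARTH_RADIUS_KM)            # latitude stride in degrees
          for lat in np.arange(lat_min, lat_max + dlat, dlat):
              coslat = max(np.cos(np.radians(lat)), 1e-6)           # avoid division by zero at the poles
              dlon = min(360.0, dlat / coslat)                      # longitude stride widens poleward
              centers += [(lat, lon) for lon in np.arange(-180.0, 180.0, dlon)]
          return centers

      print(len(stride_search_centers()))    # far fewer candidate centres than grid points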

  16. Halogens in normal- and enriched-basalts from Central Indian Ridge (18-20°S): Testing the E-MORB subduction origin hypothesis

    NASA Astrophysics Data System (ADS)

    Ruzie, L.; Burgess, R.; Hilton, D. R.; Ballentine, C. J.

    2012-12-01

    Basalts erupted along oceanic ridges have often been subdivided into two categories: the Normal-MORB and the Enriched-MORB, anomalously enriched in highly incompatible elements. Donnelly et al. (2004) proposed that the formation of enriched sources is related to two stages of melting. The first one occurs in subduction zones, where the mantle wedge is enriched by the addition of low-degree melts of the subducted slab. The second stage of melting occurs beneath ocean ridges. Because of their incompatibility, relatively high concentrations and distinct elemental compositions in surface reservoirs, the heavy halogens (Cl, Br, I) are good tracers to detect the slab contribution in E-MORB sources. However, the halogen systematics in mantle reservoirs remains poorly constrained, mainly because of their very low abundance in materials of interest. An innovative halogen analytical technique, developed at the University of Manchester, involving neutron irradiation of samples to convert halogens to noble gases, provides detection limits unmatched by any other technique [Johnson et al. 2000]. For the first time Cl, Br and I can now be determined in appropriate samples. We focus on the content of halogens in the glassy margins of basalts erupted along the CIR from 18-20°S and the off-axis Gasitao Ridge. Our set of samples contains both N- and E-MORB and is fully described in terms of major and trace elements, as well as 3He/4He ratios and water concentrations [Murton et al., 2005; Nauret et al., 2006; Füri et al., 2011; Barry et al., in prep.]. The halogen concentration range is between 10 and 140 ppm for Cl, 30 and 500 ppb for Br and 0.8 and 10 ppb for I. The higher concentrations are found in E-MORB samples from the northern part of the ridge axis. Comparing our data with previous halogen studies, our sample suites fall within the range of N-MORB from the East Pacific Rise (EPR) and Mid-Atlantic Ridge (MAR) [Jambon et al. 1995; Deruelle et al. 1992] and in the lower range of E-MORB from Macquarie Island [Kendrick et al., 2012]. The concentrations are not related to superficial processes. The on-axis samples display a relatively restricted range (6.9-8.6 wt%) of MgO contents, suggesting no control by crystallisation processes. The basalts were erupted between 3900-2000 m bsl, so no appreciable degassing of halogens would be expected. The strong correlation which exists between the halogens and other incompatible elements (e.g., Rb, La) also rules out seawater assimilation. Therefore, concentrations and elemental ratios can be directly linked to melting and source features. Estimates of halogen abundances in the depleted-mantle source are 4 ppm Cl, 14 ppb Br and 0.3 ppb I. These low abundances, which are in agreement with values derived for sub-continental mantle from coated diamonds [Burgess et al., 2002], suggest that, like noble gases, the upper mantle is degassed of its halogens. Critically, the halogen elemental ratios show no significant variations along the axial ridge and off-axis ridge or between N-MORB and E-MORB: Br/Cl=0.00147±0.00014, I/Cl=0.000021±0.000005, I/Br=0.0142±0.0036. These ratios are similar to E-MORB from Macquarie Island [Kendrick et al., 2012]. This observation is thus not consistent with subduction as a source of halogen enrichment in E-MORB.

  17. LIDAR-based coastal landscape reconstruction and harbour location: The Viking-age royal burial site of Borre (Norway)

    NASA Astrophysics Data System (ADS)

    Draganits, Erich; Doneus, Michael; Gansum, Terje

    2013-04-01

    Airborne light detection and ranging (LIDAR) has found wide application in archaeological research for the detection and documentation of archaeological and palaeo-environmental features. In this study we demonstrate the analysis of a LIDAR-derived 1x1 m digital terrain model (DTM) combined with geoarchaeological research of the coastal Viking-age burial site in Borre, Oslo Fjord (Norway). Borre is an exceptional burial site in Scandinavia, containing burial mounds up to 40 m in diameter and 6 m in height, mentioned in Nordic Sagas, especially in the skaldic poem Ynglingatal, as the burial place of one or two kings of the Ynglinga dynasty. Archaeological findings and radiocarbon ages indicate that the Borre burial ground had been in use broadly between 600-1000 AD. Despite the reasonable expectation that a coastal site connected with the Viking kings of Vestfold, with hall buildings and ship graves, demands a harbour, no harbour has so far been found with traditional archaeological surveys. Since the area of Borre is affected by continuous land uplift related to the glacial rebound of Scandinavia, any former harbour site is expected to be exposed on the land surface today. The present-day vertical crustal uplift is calculated at around 2.5 mm/yr in the area of Borre. Burial mounds and surrounding borrow pits as well as geomorphological features of the uplifted coast of Borre have been analysed in the 1x1 m LIDAR-DTM, using hillshade, slope and local relief models for visualisation. Altogether, 41 burial mounds and a further 6 potential mounds are visible in the high-resolution DTM. A succession of more than 14 beach ridges, cross-cut by the burial mounds, is visible from the present shore line up to 18 m asl. They are more or less parallel and similar in size, except at ca. 4-6 m asl, where the most prominent ridge is located, which has probably been reinforced artificially. Using published shoreline displacement curves from nearby areas, the shore line at Borre in the period 600-1000 AD was ca. 6-4 m higher than today, exactly at the position of the prominent beach ridge at the eastern boundary of the burial ground. Below 4 m asl there are three prominent, ca. 70 m, 150 m and 200 m long ridges, oriented at right angles to the general trend of the coast. All three structures cause considerable deflection of the beach ridges in this area and therefore must be older than the beach ridges below 4 m asl. These ridges comprise polymict, sub-rounded to rounded boulders up to 1 m in diameter, including porphyric volcanics, various types of gneisses, amphibolite, sandstone, etc. These boulders occur neither at the nearby beach areas nor in the beach ridges in the whole Borre area, but can be found in the Younger Dryas Ra moraine, some 1 km west of Borre. In contrast to the strongly undulating shoreline at Borre, the nearby coast of the western Oslo Fjord between Åsgårdstrand and Horten shows quite a straight shore line with hardly any natural harbours. The ridges seen in the LIDAR are exactly in the size range of the modern-day jetties in this area, and we believe that those between 4-0 m asl have been made for the same purpose, representing harbour structures for landing at Borre in the Viking age.

  18. A scale-invariant keypoint detector in log-polar space

    NASA Astrophysics Data System (ADS)

    Tao, Tao; Zhang, Yun

    2017-02-01

    The scale-invariant feature transform (SIFT) algorithm is devised to detect keypoints via difference of Gaussian (DoG) images. However, the DoG data lack high-frequency information, which can lead to a performance drop of the algorithm. To address this issue, this paper proposes a novel log-polar feature detector (LPFD) to detect scale-invariant blobs (keypoints) in log-polar space, which, in contrast, can retain all the image information. The algorithm consists of three components, viz. keypoint detection, descriptor extraction and descriptor matching. In addition, the algorithm is evaluated in detecting keypoints from the INRIA dataset by comparison with the SIFT algorithm and one of its fast versions, the speeded-up robust features (SURF) algorithm, in terms of correspondences, repeatability, correct matches and matching score.

  19. CONEDEP: COnvolutional Neural network based Earthquake DEtection and Phase Picking

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Huang, Y.; Yue, H.; Zhou, S.; An, S.; Yun, N.

    2017-12-01

    We developed an automatic local earthquake detection and phase picking algorithm based on a Fully Convolutional Neural network (FCN). The FCN algorithm detects and segments certain features (phases) in 3-component seismograms to realize efficient picking. We use the STA/LTA algorithm and a template matching algorithm to construct the training set from seismograms recorded 1 month before and after the Wenchuan earthquake. Precise P and S phases are identified and labeled to construct the training set. Noise data are produced by combining background noise and artificial synthetic noise to form a noise set of the same scale as the signal set. Training is performed on GPUs to achieve efficient convergence. Our algorithm shows significantly improved performance in terms of detection rate and precision in comparison with the STA/LTA and template matching algorithms.
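
    The classical STA/LTA trigger used to seed the training set can be sketched as below: the ratio of a short-term to a long-term average of the signal energy rises sharply at phase arrivals. The window lengths and trigger level are illustrative assumptions.

      import numpy as np

      def sta_lta_triggers(trace, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
          """Sample indices at which the STA/LTA ratio first exceeds the trigger level."""
          env = np.asarray(trace, dtype=float) ** 2                 # simple energy envelope
          sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
          sta = np.convolve(env, np.ones(sta_n) / sta_n, mode="same")
          lta = np.convolve(env, np.ones(lta_n) / lta_n, mode="same")
          ratio = sta / np.maximum(lta, 1e-12)
          return np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold)) + 1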

  20. Improvement and implementation for Canny edge detection algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Qiu, Yue-hong

    2015-07-01

    Edge detection is necessary for image segmentation and pattern recognition. In this paper, an improved Canny edge detection approach is proposed to address the defects of the traditional algorithm. A modified bilateral filter with a compensation function based on a pixel intensity similarity judgment is used to smooth the image instead of a Gaussian filter, which preserves edge features and removes noise effectively. To reduce the sensitivity to noise in gradient calculation, the algorithm uses gradient templates in four directions. Finally, the Otsu algorithm adaptively obtains the dual thresholds. The algorithm was implemented with the OpenCV 2.4.0 library in Visual Studio 2010, and experimental analysis shows that the improved algorithm detects edge details more effectively and with greater adaptability.
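
    One common way to make the dual thresholds adaptive, shown below with OpenCV, is to take Otsu's optimal threshold as the high value and a fixed fraction of it as the low value; this is a generic illustration and does not reproduce the paper's modified bilateral filter or its four-direction gradient templates.

      import cv2

      def adaptive_canny(gray):
          # cv2.threshold returns the Otsu-optimal threshold as its first output
          high, _ = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          low = 0.5 * high                                  # assumed ratio between the two thresholds
          smoothed = cv2.bilateralFilter(gray, 9, 75, 75)   # edge-preserving smoothing
          return cv2.Canny(smoothed, low, high)

      # edges = adaptive_canny(cv2.imread("workpiece.png", cv2.IMREAD_GRAYSCALE))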

  1. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network

    PubMed Central

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering wireless sensor network characteristics, this paper combines anomaly and misuse detection and proposes an integrated detection model for cluster-based wireless sensor networks, aiming at enhancing the detection rate and reducing the false alarm rate. An Adaboost algorithm with hierarchical structures is used for anomaly detection at sensor nodes, cluster-head nodes and Sink nodes. A Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation network is applied to misuse detection at the Sink node. Extensive simulations demonstrate that this integrated model achieves strong intrusion detection performance. PMID:26447696

  2. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network.

    PubMed

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering wireless sensor network characteristics, this paper combines anomaly and misuse detection and proposes an integrated detection model for cluster-based wireless sensor networks, aiming at enhancing the detection rate and reducing the false alarm rate. An Adaboost algorithm with hierarchical structures is used for anomaly detection at sensor nodes, cluster-head nodes and Sink nodes. A Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation network is applied to misuse detection at the Sink node. Extensive simulations demonstrate that this integrated model achieves strong intrusion detection performance.

  3. Automatic target detection using binary template matching

    NASA Astrophysics Data System (ADS)

    Jun, Dong-San; Sun, Sun-Gu; Park, HyunWook

    2005-03-01

    This paper presents a new automatic target detection (ATD) algorithm to detect targets such as battle tanks and armored personnel carriers in ground-to-ground scenarios. Whereas most ATD algorithms were developed for forward-looking infrared (FLIR) images, we have developed an ATD algorithm for charge-coupled device (CCD) images, which have superior quality to FLIR images in daylight. The proposed algorithm uses fast binary template matching with an adaptive binarization, which is robust to various light conditions in CCD images and saves computation time. Experimental results show that the proposed method has good detection performance.

  4. Lesion Detection in CT Images Using Deep Learning Semantic Segmentation Technique

    NASA Astrophysics Data System (ADS)

    Kalinovsky, A.; Liauchuk, V.; Tarasau, A.

    2017-05-01

    In this paper, the problem of automatic detection of tuberculosis lesions in 3D lung CT images is considered as a benchmark for testing algorithms based on the modern concept of Deep Learning. For training and testing of the algorithms, a domestic dataset of 338 3D CT scans of tuberculosis patients with manually labelled lesions was used. The algorithms, which are based on Deep Convolutional Networks, were implemented and applied in three different ways: slice-wise lesion detection in 2D images using semantic segmentation, slice-wise lesion detection in 2D images using a sliding window technique, and straightforward detection of lesions via semantic segmentation in whole 3D CT scans. The algorithms demonstrate superior performance compared to algorithms based on conventional image analysis methods.

  5. Frequency hopping signal detection based on wavelet decomposition and Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Zheng, Yang; Chen, Xihao; Zhu, Rui

    2017-07-01

    Frequency hopping (FH) signals are widely adopted in military communications as a kind of low probability of interception signal. Therefore, research on FH signal detection algorithms is important. Existing FH signal detection algorithms based on time-frequency analysis cannot satisfy the time and frequency resolution requirements simultaneously due to the influence of the window function. To solve this problem, an algorithm based on wavelet decomposition and the Hilbert-Huang transform (HHT) is proposed. The proposed algorithm removes noise from the received signals by wavelet decomposition and detects the FH signals by the Hilbert-Huang transform. Simulation results show that the proposed algorithm accounts for both the time resolution and the frequency resolution. Correspondingly, the accuracy of FH signal detection is improved.

  6. Robust automatic line scratch detection in films.

    PubMed

    Newson, Alasdair; Almansa, Andrés; Gousseau, Yann; Pérez, Patrick

    2014-03-01

    Line scratch detection in old films is a particularly challenging problem due to the variable spatiotemporal characteristics of this defect. Some of the main problems include sensitivity to noise and texture, and false detections due to thin vertical structures belonging to the scene. We propose a robust and automatic algorithm for frame-by-frame line scratch detection in old films, as well as a temporal algorithm for the filtering of false detections. In the frame-by-frame algorithm, we relax some of the hypotheses used in previous algorithms in order to detect a wider variety of scratches. This step's robustness and lack of external parameters is ensured by the combined use of an a contrario methodology and local statistical estimation. In this manner, over-detection in textured or cluttered areas is greatly reduced. The temporal filtering algorithm eliminates false detections due to thin vertical structures by exploiting the coherence of their motion with that of the underlying scene. Experiments demonstrate the ability of the resulting detection procedure to deal with difficult situations, in particular in the presence of noise, texture, and slanted or partial scratches. Comparisons show significant advantages over previous work.

  7. Image based book cover recognition and retrieval

    NASA Astrophysics Data System (ADS)

    Sukhadan, Kalyani; Vijayarajan, V.; Krishnamoorthi, A.; Bessie Amali, D. Geraldine

    2017-11-01

    In this work, we develop a graphical user interface in MATLAB that lets users check book-related information in real time. A photo of the book cover is taken through the GUI, the MSER algorithm automatically detects features from the input image, and non-text features are then filtered out based on morphological differences between text and non-text regions. We implemented a text character alignment algorithm that improves the accuracy of the original text detection. We also examine the built-in MATLAB OCR algorithm and a commonly used open source OCR engine to obtain better detection results; a post-detection algorithm and natural language processing are applied to perform word correction and false detection inhibition. Finally, the detection result is linked to the internet to perform online matching. More than 86% accuracy can be obtained by this algorithm.

  8. The non-contact detection and identification of blood stained fingerprints using visible wavelength hyperspectral imaging: Part II effectiveness on a range of substrates.

    PubMed

    Cadd, Samuel; Li, Bo; Beveridge, Peter; O'Hare, William T; Campbell, Andrew; Islam, Meez

    2016-05-01

    Biological samples, such as blood, are regularly encountered at violent crime scenes and successful identification is critical for criminal investigations. Blood is one of the most commonly encountered fingerprint contaminants and current identification methods involve presumptive tests or wet chemical enhancement. These are destructive however; can affect subsequent DNA sampling; and do not confirm the presence of blood, meaning they are susceptible to false positives. A novel application of visible wavelength reflectance hyperspectral imaging (HSI) has been used for the non-contact, non-destructive detection and identification of blood stained fingerprints across a range of coloured substrates of varying porosities. The identification of blood was based on the Soret γ band absorption of haemoglobin between 400 nm and 500 nm. Ridge detail was successfully visualised to the third depletion across light coloured substrates and the stain detected to the tenth depletion on both porous and non-porous substrates. A higher resolution setup for blood stained fingerprints on black tiles, detected ridge detail to the third depletion and the stain to the tenth depletion, demonstrating considerable advancements from previous work. Diluted blood stains at 1500 and 1000 fold dilutions for wet and dry stains respectively were also detected on pig skin as a replica for human skin. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Hazardous gas detection for FTIR-based hyperspectral imaging system using DNN and CNN

    NASA Astrophysics Data System (ADS)

    Kim, Yong Chan; Yu, Hyeong-Geun; Lee, Jae-Hoon; Park, Dong-Jo; Nam, Hyun-Woo

    2017-10-01

    Recently, a hyperspectral imaging system (HIS) with a Fourier Transform InfraRed (FTIR) spectrometer has been widely used due to its strengths in detecting gaseous fumes. Even though numerous algorithms for detecting gaseous fumes have already been studied, it is still difficult to detect target gases properly because of atmospheric interference substances and the unclear characteristics of low-concentration gases. In this paper, we propose detection algorithms for classifying hazardous gases using a deep neural network (DNN) and a convolutional neural network (CNN). In both the DNN and CNN, spectral signal preprocessing, e.g., offset, noise, and baseline removal, is carried out. In the DNN algorithm, the preprocessed spectral signals are used as feature maps of the DNN with five layers, and it is trained by a stochastic gradient descent (SGD) algorithm (batch size of 50) and dropout regularization (0.7 ratio). In the CNN algorithm, preprocessed spectral signals are trained with 1 × 3 convolution layers and 1 × 2 max-pooling layers. As a result, the proposed algorithms improve the classification accuracy rate by 1.5% over the existing support vector machine (SVM) algorithm for detecting and classifying hazardous gases.
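
    A minimal sketch of a 1-D CNN of the kind described, with 1 x 3 convolutions and 1 x 2 max-pooling over a preprocessed spectrum, is given below; the channel counts, depth, spectrum length and number of gas classes are illustrative assumptions.

      import torch
      import torch.nn as nn

      class GasCNN(nn.Module):
          def __init__(self, n_classes=5, spectrum_len=512):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
                  nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
              )
              self.classifier = nn.Linear(32 * (spectrum_len // 4), n_classes)

          def forward(self, x):                     # x: (batch, 1, spectrum_len)
              return self.classifier(self.features(x).flatten(1))

      logits = GasCNN()(torch.randn(8, 1, 512))     # a batch of 8 preprocessed spectra
      print(logits.shape)                           # torch.Size([8, 5])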

  10. A Motion Detection Algorithm Using Local Phase Information

    PubMed Central

    Lazar, Aurel A.; Ukani, Nikul H.; Zhou, Yiyin

    2016-01-01

    Previous research demonstrated that global phase alone can be used to faithfully represent visual scenes. Here we provide a reconstruction algorithm by using only local phase information. We also demonstrate that local phase alone can be effectively used to detect local motion. The local phase-based motion detector is akin to models employed to detect motion in biological vision, for example, the Reichardt detector. The local phase-based motion detection algorithm introduced here consists of two building blocks. The first building block measures/evaluates the temporal change of the local phase. The temporal derivative of the local phase is shown to exhibit the structure of a second order Volterra kernel with two normalized inputs. We provide an efficient, FFT-based algorithm for implementing the change of the local phase. The second processing building block implements the detector; it compares the maximum of the Radon transform of the local phase derivative with a chosen threshold. We demonstrate examples of applying the local phase-based motion detection algorithm on several video sequences. We also show how the locally detected motion can be used for segmenting moving objects in video scenes and compare our local phase-based algorithm to segmentation achieved with a widely used optic flow algorithm. PMID:26880882

  11. Detection of dechallenge in spontaneous reporting systems: a comparison of Bayes methods.

    PubMed

    Banu, A Bazila; Alias Balamurugan, S Appavu; Thirumalaikolundusubramanian, Ponniah

    2014-01-01

    Dechallenge is a response observed as the reduction or disappearance of adverse drug reactions (ADR) on withdrawal of a drug from a patient. Currently available algorithms to detect dechallenge have limitations. Hence, there is a need to compare available new methods. To detect dechallenge in Spontaneous Reporting Systems, data-mining algorithms such as Naive Bayes and Improved Naive Bayes were applied, and their performance was compared in terms of accuracy and error. Analyzing the factors of dechallenge, such as outcome and disease category, will help medical practitioners and pharmaceutical industries determine the reasons for dechallenge in order to take essential steps toward drug safety. Adverse drug reactions from the years 2011 and 2012 were downloaded from the United States Food and Drug Administration's database. The outcomes of the classification algorithms showed that the Improved Naive Bayes algorithm outperformed Naive Bayes, with an accuracy of 90.11% and an error of 9.8% in detecting dechallenge. Detecting dechallenge for unknown samples is essential for proper prescription. To overcome the issues exposed by the Naive Bayes algorithm, the Improved Naive Bayes algorithm can be used to detect dechallenge with higher accuracy and minimal error.

  12. Detection and Tracking of Moving Objects with Real-Time Onboard Vision System

    NASA Astrophysics Data System (ADS)

    Erokhin, D. Y.; Feldman, A. B.; Korepanov, S. E.

    2017-05-01

    Detection of moving objects in a video sequence received from a moving video sensor is one of the most important problems in computer vision. The main purpose of this work is to develop a set of algorithms that can detect and track moving objects in a real-time computer vision system. This set includes three main parts: an algorithm for the estimation and compensation of geometric transformations of images, an algorithm for the detection of moving objects, and an algorithm for tracking the detected objects and predicting their positions. The results can be applied to onboard vision systems of aircraft, including small and unmanned aircraft.

  13. Research on improved edge extraction algorithm of rectangular piece

    NASA Astrophysics Data System (ADS)

    He, Yi-Bin; Zeng, Ya-Jun; Chen, Han-Xin; Xiao, San-Xia; Wang, Yan-Wei; Huang, Si-Yu

    Traditional edge detection operators, such as the Prewitt, LoG and Canny operators, cannot meet the requirements of modern industrial measurement. This paper proposes an image edge detection algorithm based on an improved morphological gradient. It detects the image using structural elements, which deal with the characteristic information of the image directly. By choosing different shapes and sizes of structural elements and using them together, the ideal image edge information can be detected. The experimental results show that the algorithm extracts image edges well in the presence of noise, producing clearer and more detailed edges than previous edge detection algorithms.

  14. Rapid detection of microbial cell abundance in aquatic systems

    DOE PAGES

    Rocha, Andrea M.; Yuan, Quan; Close, Dan M.; ...

    2016-06-01

    The detection and quantification of naturally occurring microbial cellular densities is an essential component of environmental systems monitoring. While there are a number of commonly utilized approaches for monitoring microbial abundance, capacitance-based biosensors represent a promising approach because of their low-cost and label-free detection of microbial cells, but are not as well characterized as more traditional methods. Here, we investigate the applicability of enhanced alternating current electrokinetics (ACEK) capacitive sensing as a new application for rapidly detecting and quantifying microbial cellular densities in cultured and environmentally sourced aquatic samples. ACEK capacitive sensor performance was evaluated using two distinct and dynamic systems: the Great Australian Bight and groundwater from the Oak Ridge Reservation in Oak Ridge, TN. Results demonstrate that ACEK capacitance-based sensing can accurately determine microbial cell counts throughout cellular concentrations typically encountered in naturally occurring microbial communities (10^3-10^6 cells/mL). A linear relationship was observed between cellular density and capacitance change, allowing a simple linear curve fitting equation to be used for determining microbial abundances in unknown samples. As a result, this work provides a foundation for understanding the limits of capacitance-based sensing in natural environmental samples and supports future efforts focusing on evaluating the robustness of ACEK capacitance-based sensing within aquatic environments.

  15. Rapid detection of microbial cell abundance in aquatic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocha, Andrea M.; Yuan, Quan; Close, Dan M.

    The detection and quantification of naturally occurring microbial cellular densities is an essential component of environmental systems monitoring. While there are a number of commonly utilized approaches for monitoring microbial abundance, capacitance-based biosensors represent a promising approach because of their low-cost and label-free detection of microbial cells, but are not as well characterized as more traditional methods. Here, we investigate the applicability of enhanced alternating current electrokinetics (ACEK) capacitive sensing as a new application for rapidly detecting and quantifying microbial cellular densities in cultured and environmentally sourced aquatic samples. ACEK capacitive sensor performance was evaluated using two distinct and dynamic systems: the Great Australian Bight and groundwater from the Oak Ridge Reservation in Oak Ridge, TN. Results demonstrate that ACEK capacitance-based sensing can accurately determine microbial cell counts throughout cellular concentrations typically encountered in naturally occurring microbial communities (10^3-10^6 cells/mL). A linear relationship was observed between cellular density and capacitance change, allowing a simple linear curve fitting equation to be used for determining microbial abundances in unknown samples. As a result, this work provides a foundation for understanding the limits of capacitance-based sensing in natural environmental samples and supports future efforts focusing on evaluating the robustness of ACEK capacitance-based sensing within aquatic environments.

  16. Heterogeneous Vision Data Fusion for Independently Moving Cameras

    DTIC Science & Technology

    2010-03-01

    target detection, tracking, and identification over a large terrain. The goal of the project is to investigate and evaluate the existing image ... fusion algorithms, develop new real-time algorithms for Category-II image fusion, and apply these algorithms in moving target detection and tracking. The ... moving target detection and classification. Subject terms: image fusion, target detection, moving cameras, IR camera, EO camera.

  17. An Automated Energy Detection Algorithm Based on Consecutive Mean Excision

    DTIC Science & Technology

    2018-01-01

    present in the RF spectrum. 15. SUBJECT TERMS: RF spectrum, detection threshold algorithm, consecutive mean excision, rank order filter, statistical... Median, 3.1.9 Rank Order Filter (ROF), 3.1.10 Crest Factor (CF), 3.2 Statistical Summary, 4. Algorithm, 5. Conclusion, 6. References ...energy detection algorithm based on morphological filter processing with a semi-disk structure. Adelphi (MD): Army Research Laboratory (US); 2018 Jan

  18. The BMPix and PEAK Tools: New Methods for Automated Laminae Recognition and Counting - Application to Glacial Varves From Antarctic Marine Sediment

    NASA Astrophysics Data System (ADS)

    Weber, M. E.; Reichelt, L.; Kuhn, G.; Thurow, J. W.; Ricken, W.

    2009-12-01

    We present software-based tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at ultrahigh (pixel) resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a Gaussian smoothed gray-scale curve. The zero-crossing algorithm counts every positive and negative halfway-passage of the gray-scale curve through a wide moving average. Hence, the record is separated into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the gray-scale curve into its frequency components, before positive and negative passages are counted. We applied the new methods successfully to tree rings and to well-dated and already manually counted marine varves from Saanich Inlet before we adapted the tools to the rather complex marine laminae from the Antarctic continental margin. In combination with AMS 14C dating, we found convincing evidence that the laminations from three Weddell Sea sites represent true varves that were deposited on sediment ridges over several millennia during the last glacial maximum (LGM). There are apparently two seasonal layers of terrigenous composition, a coarser-grained bright layer and a finer-grained dark layer. The new tools offer several advantages over previous tools. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Since PEAK associates counts with a specific depth, the thickness of each year or each season is also measured, which is an important prerequisite for later spectral analysis. Since all information required to conduct the analysis is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
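
    A minimal sketch of the two counting ideas described above (not the authors' BMPix/PEAK code), assuming the gray-scale curve is a 1-D NumPy array sampled down-core at pixel resolution; the smoothing sigma and moving-average window are illustrative parameters:

        import numpy as np
        from scipy.ndimage import gaussian_filter1d
        from scipy.signal import find_peaks

        def maximum_count(gray, sigma=3.0):
            """Count every bright peak of a couplet in a Gaussian-smoothed curve."""
            smoothed = gaussian_filter1d(gray, sigma)
            peaks, _ = find_peaks(smoothed)        # annual resolution: one peak per couplet
            return len(peaks), peaks

        def zero_crossing_count(gray, window=51):
            """Count halfway passages of the curve through a wide moving average."""
            baseline = np.convolve(gray, np.ones(window) / window, mode="same")
            sign = np.sign(gray - baseline)
            crossings = np.where(np.diff(sign) != 0)[0]   # bright/dark boundaries (seasonal)
            return len(crossings), crossings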

  19. Long-Term Seismicity of Northern (15° N-60° N) Mid-Atlantic Ridge (MAR) Recorded by two Regional Hydrophone Arrays: a Widespread Along-Ridge Influence of the Azores and Iceland Hotspots

    NASA Astrophysics Data System (ADS)

    Goslin, J.; Bazin, S.; Dziak, R. P.; Fox, C.; Fowler, M.; Haxel, J.; Lourenco, N.; Luis, J.; Martin, C.; Matsumoto, H.; Perrot, J.; Royer, J.

    2004-12-01

    The seismicity of the North Atlantic was recorded by two networks of hydrophones moored in the SOFAR channel, north and south of the Azores Plateau. The interpretation of the hydro-acoustic signals recorded during the first six-month common period of operation of the two networks (June 2002 to Nov. 2002) provides a unique data set on the spatial and time distributions of the numerous low-magnitude earthquakes which occurred along the Mid-Atlantic Ridge. Close to 2000 events were localized during this six-month period between latitudes 15° N and 63° N, 501 of which are localized within the SIRENA network (40° N-51° N) and 692 within the wider South Azores network (17° N-33° N). Using hydrophones to locate seafloor earthquakes by interpreting T-wave signals lowers the detection threshold of Mid-Atlantic Ridge events to 3.0 mb from the 4.7 mb of global seismic networks. This represents an average thirty-fold increase in the number of events: 62 events were detected by global seismological networks within the same area during the same period. An along-ridge spatial distribution of the seismicity is obtained by computing the cumulative numbers of events in 1°-wide latitudinal bins. When plotted vs. latitude, this first-order distribution shows remarkable long-wavelength patterns: the seismicity rate is low when approaching the Azores and Iceland (reaching values as low as 10 events/d°), while it peaks to 70 events/d° in the vicinity of the Gibbs FZ. Moreover, the latitudinal distribution of the seismicity hints at an asymmetric influence of the Azores hotspot on the MAR. Finally, the spatial distribution of the seismicity anti-correlates well at long wavelengths with the zero-age depths along the MAR and correlates with the zero-age Mantle Bouguer (MBA) anomaly values and the Vs velocity anomalies at 100 km in the upper mantle. It is thus proposed that the seismicity level would be partly tied to the rheology and thickness of the brittle layer and would thus be dependent on the thermal regime of the upper mantle. The seismicity distribution could then be used as an additional tool to characterize the along-ridge influence of the Azores and Iceland hotspots on the MAR slow-spreading center.

  20. Phenotyping for patient safety: algorithm development for electronic health record based automated adverse event and medical error detection in neonatal intensive care.

    PubMed

    Li, Qi; Melton, Kristin; Lingren, Todd; Kirkendall, Eric S; Hall, Eric; Zhai, Haijun; Ni, Yizhao; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre

    2014-01-01

    Although electronic health records (EHRs) have the potential to provide a foundation for quality and safety algorithms, few studies have measured their impact on automated adverse event (AE) and medical error (ME) detection within the neonatal intensive care unit (NICU) environment. This paper presents two phenotyping AE and ME detection algorithms (ie, IV infiltrations, narcotic medication oversedation and dosing errors) and describes manual annotation of airway management and medication/fluid AEs from NICU EHRs. From 753 NICU patient EHRs from 2011, we developed two automatic AE/ME detection algorithms, and manually annotated 11 classes of AEs in 3263 clinical notes. Performance of the automatic AE/ME detection algorithms was compared to trigger tool and voluntary incident reporting results. AEs in clinical notes were double annotated and consensus achieved under neonatologist supervision. Sensitivity, positive predictive value (PPV), and specificity are reported. Twelve severe IV infiltrates were detected. The algorithm identified one more infiltrate than the trigger tool and eight more than incident reporting. One narcotic oversedation was detected demonstrating 100% agreement with the trigger tool. Additionally, 17 narcotic medication MEs were detected, an increase of 16 cases over voluntary incident reporting. Automated AE/ME detection algorithms provide higher sensitivity and PPV than currently used trigger tools or voluntary incident-reporting systems, including identification of potential dosing and frequency errors that current methods are unequipped to detect.

  1. Texture orientation-based algorithm for detecting infrared maritime targets.

    PubMed

    Wang, Bin; Dong, Lili; Zhao, Ming; Wu, Houde; Xu, Wenhai

    2015-05-20

    Infrared maritime target detection is a key technology for maritime target searching systems. However, in infrared maritime images (IMIs) taken under complicated sea conditions, background clutters, such as ocean waves, clouds or sea fog, usually have high intensity that can easily overwhelm the brightness of real targets, which is difficult for traditional target detection algorithms to deal with. To mitigate this problem, this paper proposes a novel target detection algorithm based on texture orientation. This algorithm first extracts suspected targets by analyzing the intersubband correlation between horizontal and vertical wavelet subbands of the original IMI on the first scale. Then self-adaptive wavelet threshold denoising and local singularity analysis of the original IMI are combined to further remove false alarms. Experiments show that compared with traditional algorithms, this algorithm can suppress background clutter much better and realize better single-frame detection for infrared maritime targets. In addition, to further guarantee accurate target extraction, a pipeline-filtering algorithm is adopted to eliminate residual false alarms. The high practical value and applicability of the proposed strategy are strongly supported by experimental data acquired under different environmental conditions.
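
    A hedged sketch of the first stage only: compute the horizontal and vertical first-scale detail subbands with PyWavelets and flag pixels where both are jointly strong. The element-wise product used here is a crude stand-in for the paper's intersubband-correlation measure, and the wavelet choice and threshold are illustrative assumptions:

        import numpy as np
        import pywt

        def candidate_targets(image, wavelet="db2", k=3.0):
            """Flag locations where horizontal and vertical first-scale detail
            coefficients are jointly strong (a simple intersubband proxy)."""
            _, (cH, cV, _) = pywt.dwt2(image.astype(float), wavelet)
            joint = np.abs(cH) * np.abs(cV)            # joint response of the two subbands
            thresh = joint.mean() + k * joint.std()    # illustrative global threshold
            return joint > thresh                      # boolean map at half resolution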

  2. Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness

    NASA Astrophysics Data System (ADS)

    Hardy, Tyler J.; Cain, Stephen C.

    2016-05-01

    The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternate hypothesis prior probabilities. These probabilities are then implemented into an MHT algorithm, and the algorithm is then tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.

  3. A wavelet transform algorithm for peak detection and application to powder x-ray diffraction data.

    PubMed

    Gregoire, John M; Dale, Darren; van Dover, R Bruce

    2011-01-01

    Peak detection is ubiquitous in the analysis of spectral data. While many noise-filtering algorithms and peak identification algorithms have been developed, recent work [P. Du, W. Kibbe, and S. Lin, Bioinformatics 22, 2059 (2006); A. Wee, D. Grayden, Y. Zhu, K. Petkovic-Duran, and D. Smith, Electrophoresis 29, 4215 (2008)] has demonstrated that both of these tasks are efficiently performed through analysis of the wavelet transform of the data. In this paper, we present a wavelet-based peak detection algorithm with user-defined parameters that can be readily applied to any spectral data. Particular attention is given to the algorithm's resolution of overlapping peaks. The algorithm is implemented for the analysis of powder diffraction data, and successful detection of Bragg peaks is demonstrated for both low signal-to-noise data from theta-theta diffraction of nanoparticles and combinatorial x-ray diffraction data from a composition spread thin film. These datasets have different types of background signals which are effectively removed in the wavelet-based method, and the results demonstrate that the algorithm provides a robust method for automated peak detection.
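
    SciPy ships a wavelet-based peak detector of this general kind (find_peaks_cwt); the sketch below is only an illustration of that style of analysis on a synthetic diffraction-like pattern, not the authors' algorithm, and the widths range is the user-defined parameter:

        import numpy as np
        from scipy.signal import find_peaks_cwt

        # Synthetic pattern: two Bragg-like peaks on a sloping background plus noise.
        x = np.linspace(0, 100, 2000)
        signal = (np.exp(-((x - 30) ** 2) / 2.0)
                  + 0.6 * np.exp(-((x - 62) ** 2) / 8.0)
                  + 0.002 * x
                  + 0.02 * np.random.randn(x.size))

        # widths spans the expected peak widths (in samples); this is the key user parameter.
        peak_indices = find_peaks_cwt(signal, widths=np.arange(5, 80))
        print(x[peak_indices])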

  4. Robust crop and weed segmentation under uncontrolled outdoor illumination.

    PubMed

    Jeon, Hong Y; Tian, Lei F; Zhu, Heping

    2011-01-01

    An image processing algorithm for detecting individual weeds was developed and evaluated. The weed detection processes included normalized excessive green conversion, statistical threshold value estimation, adaptive image segmentation, median filtering, morphological feature calculation and an Artificial Neural Network (ANN). The developed algorithm was validated for its ability to identify and detect weeds and crop plants under uncontrolled outdoor illumination. A field robot implementing machine vision captured field images under outdoor illumination, and the image processing algorithm automatically processed them without manual adjustment. The errors of the algorithm, when processing 666 field images, ranged from 2.1 to 2.9%. The ANN correctly detected 72.6% of crop plants from the identified plants, and considered the rest as weeds. However, the ANN identification rates for crop plants were improved up to 95.1% by addressing the error sources in the algorithm. The developed weed detection and image processing algorithm provides a novel method to identify plants against a soil background under uncontrolled outdoor illumination, and to differentiate weeds from crop plants. Thus, the proposed new machine vision and processing algorithm may be useful for outdoor applications including plant specific direct applications (PSDA).
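
    A hedged sketch of the first two stages (excess-green conversion and threshold-based segmentation), using Otsu's threshold in place of the paper's statistical threshold estimation; OpenCV is assumed and the function name and kernel sizes are illustrative:

        import cv2
        import numpy as np

        def segment_plants(bgr_image):
            """Normalized excess-green index followed by automatic thresholding."""
            b, g, r = cv2.split(bgr_image.astype(np.float32))
            total = b + g + r + 1e-6
            exg = 2.0 * (g / total) - (r / total) - (b / total)    # normalized ExG
            exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
            _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            mask = cv2.medianBlur(mask, 5)                         # median filter step
            return mask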

  5. Superior Rhythm Discrimination With the SmartShock Technology Algorithm - Results of the Implantable Defibrillator With Enhanced Features and Settings for Reduction of Inaccurate Detection (DEFENSE) Trial.

    PubMed

    Oginosawa, Yasushi; Kohno, Ritsuko; Honda, Toshihiro; Kikuchi, Kan; Nozoe, Masatsugu; Uchida, Takayuki; Minamiguchi, Hitoshi; Sonoda, Koichiro; Ogawa, Masahiro; Ideguchi, Takeshi; Kizaki, Yoshihisa; Nakamura, Toshihiro; Oba, Kageyuki; Higa, Satoshi; Yoshida, Keiki; Tsunoda, Soichi; Fujino, Yoshihisa; Abe, Haruhiko

    2017-08-25

    Shocks delivered by implanted anti-tachyarrhythmia devices, even when appropriate, lower the quality of life and survival. The new SmartShock Technology® (SST) discrimination algorithm was developed to prevent the delivery of inappropriate shocks. This prospective, multicenter, observational study compared the rate of inaccurate detection of ventricular tachyarrhythmia using the SST vs. a conventional discrimination algorithm. Methods and Results: Recipients of implantable cardioverter defibrillators (ICD) or cardiac resynchronization therapy defibrillators (CRT-D) equipped with the SST algorithm were enrolled and followed up every 6 months. The tachycardia detection rate was set at ≥150 beats/min with the SST algorithm. The primary endpoint was the time to first inaccurate detection of ventricular tachycardia (VT) with the conventional vs. the SST discrimination algorithm, up to 2 years of follow-up. Between March 2012 and September 2013, 185 patients (mean age, 64.0±14.9 years; men, 74%; secondary prevention indication, 49.5%) were enrolled at 14 Japanese medical centers. Inaccurate detection was observed in 32 patients (17.6%) with the conventional algorithm vs. 19 patients (10.4%) with the SST algorithm. SST significantly lowered the rate of inaccurate detection by dual chamber devices (HR, 0.50; 95% CI: 0.263-0.950; P=0.034). Compared with previous algorithms, the SST discrimination algorithm significantly lowered the rate of inaccurate detection of VT in recipients of dual-chamber ICD or CRT-D.

  6. Data-driven approach of CUSUM algorithm in temporal aberrant event detection using interactive web applications.

    PubMed

    Li, Ye; Whelan, Michael; Hobbs, Leigh; Fan, Wen Qi; Fung, Cecilia; Wong, Kenny; Marchand-Austin, Alex; Badiani, Tina; Johnson, Ian

    2016-06-27

    In 2014/2015, Public Health Ontario developed disease-specific, cumulative sum (CUSUM)-based statistical algorithms for detecting aberrant increases in reportable infectious disease incidence in Ontario. The objective of this study was to determine whether the prospective application of these CUSUM algorithms, based on historical patterns, has improved specificity and sensitivity compared to the currently used Early Aberration Reporting System (EARS) algorithm, developed by the US Centers for Disease Control and Prevention. A total of seven algorithms were developed for the following diseases: cyclosporiasis, giardiasis, influenza (one each for type A and type B), mumps, pertussis, and invasive pneumococcal disease. Historical data were used as a baseline to assess known outbreaks. Regression models were used to model seasonality, and CUSUM was applied to the difference between observed and expected counts. An interactive web application was developed allowing program staff to directly interact with data and tune the parameters of the CUSUM algorithms using their expertise on the epidemiology of each disease. Using these parameters, a CUSUM detection system was applied prospectively and the results were compared to the outputs generated by EARS. The outcome was the detection of outbreaks or of the start of a known seasonal increase, and the prediction of the peak in activity. The CUSUM algorithms detected provincial outbreaks earlier than the EARS algorithm, identified the start of the influenza season in advance of traditional methods, and had fewer false positive alerts. Additionally, having staff involved in the creation of the algorithms improved their understanding of the algorithms and improved use in practice. Using interactive web-based technology to tune CUSUM improved the sensitivity and specificity of detection algorithms.
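
    A minimal sketch of the general approach (a one-sided CUSUM on standardized residuals of observed minus expected counts); the reference value k and decision threshold h are the usual CUSUM tuning parameters and are illustrative, not the Public Health Ontario settings:

        import numpy as np

        def cusum_alerts(counts, expected, k=0.5, h=5.0):
            """One-sided CUSUM on standardized residuals; returns alert time indices."""
            resid = np.asarray(counts, dtype=float) - np.asarray(expected, dtype=float)
            z = (resid - resid.mean()) / (resid.std() + 1e-9)
            s, alerts = 0.0, []
            for t, zt in enumerate(z):
                s = max(0.0, s + zt - k)     # accumulate only upward departures
                if s > h:
                    alerts.append(t)         # aberrant increase flagged at time t
                    s = 0.0                  # reset after signalling
            return alerts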

  7. Text Extraction from Scene Images by Character Appearance and Structure Modeling

    PubMed Central

    Yi, Chucai; Tian, Yingli

    2012-01-01

    In this paper, we propose a novel algorithm to detect text information from natural scene images. Scene text classification and detection are still open research topics. Our proposed algorithm is able to model both character appearance and structure to generate representative and discriminative text descriptors. The contributions of this paper include three aspects: 1) a new character appearance model by a structure correlation algorithm which extracts discriminative appearance features from detected interest points of character samples; 2) a new text descriptor based on structons and correlatons, which model character structure by structure differences among character samples and structure component co-occurrence; and 3) a new text region localization method by combining color decomposition, character contour refinement, and string line alignment to localize character candidates and refine detected text regions. We perform three groups of experiments to evaluate the effectiveness of our proposed algorithm, including text classification, text detection, and character identification. The evaluation results on benchmark datasets demonstrate that our algorithm achieves the state-of-the-art performance on scene text classification and detection, and significantly outperforms the existing algorithms for character identification. PMID:23316111

  8. Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.

    PubMed

    Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu

    2017-05-23

    This paper presents an investigation of natural inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of natural inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple-arc and ring detector information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71 compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD) respectively.

  9. Research on Abnormal Detection Based on Improved Combination of K-means and SVDD

    NASA Astrophysics Data System (ADS)

    Hao, Xiaohong; Zhang, Xiaofeng

    2018-01-01

    In order to improve the efficiency of network intrusion detection and reduce the false alarm rate, this paper proposes an anomaly detection algorithm based on improved K-means and SVDD. The algorithm first uses the improved K-means algorithm to cluster the training samples of each class, so that each class is independent and compact within the class. Then, according to the training samples, the SVDD algorithm is used to construct the minimum hyperspheres. Class membership is determined by calculating the distance of a sample to the minimum hyperspheres constructed by SVDD: if the distance from the test sample to the center of a hypersphere is less than its radius, the test sample belongs to that class; otherwise it does not, and after several such comparisons the test sample is finally classified. In this paper, we use the KDD CUP99 data set to simulate the proposed anomaly detection algorithm. The results show that the algorithm has a high detection rate and a low false alarm rate, making it an effective network security protection method.
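
    A hedged sketch of the overall idea using scikit-learn, with OneClassSVM (an RBF one-class SVM, closely related to but not identical with SVDD) standing in for the SVDD step; the cluster count, nu, and function names are illustrative assumptions:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import OneClassSVM

        def fit_detectors(X_normal, n_clusters=5, nu=0.05):
            """Cluster normal traffic, then fit one boundary model per cluster."""
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_normal)
            models = []
            for c in range(n_clusters):
                Xc = X_normal[km.labels_ == c]
                models.append(OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(Xc))
            return km, models

        def is_anomalous(x, models):
            """A sample is anomalous if no per-cluster boundary accepts it."""
            scores = [m.decision_function(x.reshape(1, -1))[0] for m in models]
            return max(scores) < 0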

  10. Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm

    NASA Astrophysics Data System (ADS)

    Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong

    2018-06-01

    The abnormal frequencies of an atomic clock mainly include frequency jumps and frequency drift jumps. Atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in real-time detection of abnormal frequency. In order to obtain an optimal state estimation, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions. The detection performance is degraded if anomalies affect the observation model or dynamic model. The adaptive Kalman filter algorithm, applied to clock frequency anomaly detection, uses the residuals given by the prediction to build an 'adaptive factor'; the predicted state covariance matrix is corrected in real time by this adaptive factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified by a frequency jump simulation, a frequency drift jump simulation and measured data from an atomic clock, using the chi-square test.
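
    A one-dimensional sketch of the idea, assuming a scalar random-walk model for the clock frequency: the innovation (prediction residual) drives an adaptive factor that inflates the predicted covariance when the residual is improbable under a chi-square gate. The model, constants, and function name are illustrative, not the paper's implementation:

        import numpy as np

        def adaptive_kalman(y, q=1e-4, r=1e-2, chi2_gate=6.63):
            """Scalar Kalman filter with residual-driven covariance inflation."""
            x, p = y[0], 1.0
            states, flags = [], []
            for z in y:
                # prediction (random-walk dynamic model)
                x_pred, p_pred = x, p + q
                # innovation and its variance
                v = z - x_pred
                s = p_pred + r
                # adaptive factor: inflate the predicted covariance on improbable residuals
                if v * v / s > chi2_gate:
                    alpha = (v * v / s) / chi2_gate
                    p_pred *= alpha
                    s = p_pred + r
                    flags.append(True)       # candidate frequency anomaly
                else:
                    flags.append(False)
                # update
                k = p_pred / s
                x = x_pred + k * v
                p = (1.0 - k) * p_pred
                states.append(x)
            return np.array(states), np.array(flags)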

  11. Reducing false-positive detections by combining two stage-1 computer-aided mass detection algorithms

    NASA Astrophysics Data System (ADS)

    Bedard, Noah D.; Sampat, Mehul P.; Stokes, Patrick A.; Markey, Mia K.

    2006-03-01

    In this paper we present a strategy for reducing the number of false-positives in computer-aided mass detection. Our approach is to only mark "consensus" detections from among the suspicious sites identified by different "stage-1" detection algorithms. By "stage-1" we mean that each of the Computer-aided Detection (CADe) algorithms is designed to operate with high sensitivity, allowing for a large number of false positives. In this study, two mass detection methods were used: (1) Heath and Bowyer's algorithm based on the average fraction under the minimum filter (AFUM) and (2) a low-threshold bi-lateral subtraction algorithm. The two methods were applied separately to a set of images from the Digital Database for Screening Mammography (DDSM) to obtain paired sets of mass candidates. The consensus mass candidates for each image were identified by a logical "and" operation of the two CADe algorithms so as to eliminate regions of suspicion that were not independently identified by both techniques. It was shown that by combining the evidence from the AFUM filter method with that obtained from bi-lateral subtraction, the same sensitivity could be reached with fewer false-positives per image relative to using the AFUM filter alone.
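
    The consensus step itself reduces to a logical AND of the two candidate maps; a minimal sketch assuming each stage-1 detector returns a boolean mask of suspicious pixels on the same registered mammogram (the function and argument names are illustrative):

        import numpy as np

        def consensus_detections(mask_afum, mask_bilateral):
            """Keep only suspicious regions flagged by both stage-1 detectors."""
            assert mask_afum.shape == mask_bilateral.shape
            return np.logical_and(mask_afum, mask_bilateral)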

  12. A New Pivoting and Iterative Text Detection Algorithm for Biomedical Images

    PubMed Central

    Xu, Songhua; Krauthammer, Michael

    2010-01-01

    There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating its performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. In this paper, we demonstrate that a projection histogram-based text detection approach is well suited for text detection in biomedical images, with an F score of 0.60. The approach performs better than comparable approaches for text detection. Further, we show that the iterative application of the algorithm boosts overall detection performance. A C++ implementation of our algorithm is freely available through email request for academic use. PMID:20887803
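
    A hedged sketch of plain projection-histogram text detection on a binarized figure: sum foreground pixels per row to find text bands, then per column within each band to find candidate boxes. The thresholds are illustrative, and the paper's pivoting/iterative refinement is not reproduced here:

        import numpy as np

        def runs_above(profile, thresh):
            """Return (start, end) index pairs where the profile exceeds the threshold."""
            above = profile > thresh
            edges = np.diff(above.astype(int))
            starts = list(np.where(edges == 1)[0] + 1)
            ends = list(np.where(edges == -1)[0] + 1)
            if above[0]:
                starts.insert(0, 0)
            if above[-1]:
                ends.append(len(profile))
            return list(zip(starts, ends))

        def detect_text_boxes(binary, row_frac=0.02, col_frac=0.02):
            """binary: 2-D array with text pixels set to 1; returns (r0, c0, r1, c1) boxes."""
            boxes = []
            row_profile = binary.sum(axis=1)
            for r0, r1 in runs_above(row_profile, row_frac * binary.shape[1]):
                band = binary[r0:r1]
                col_profile = band.sum(axis=0)
                for c0, c1 in runs_above(col_profile, col_frac * (r1 - r0)):
                    boxes.append((r0, c0, r1, c1))
            return boxes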

  13. A stationary wavelet transform and a time-frequency based spike detection algorithm for extracellular recorded data.

    PubMed

    Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane

    2017-06-01

    Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. The decision whether a specific part of the signal is a spike or not is important for any kind of other subsequent preprocessing steps, like spike sorting or burst detection in order to reduce the classification of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms have a poor performance. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than all of the most commonly used methods. The performance of the algorithms is confirmed by using simulated data, resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to only one specific set of data, we also verify the performance using a simulated publicly available data set. We show that both proposed algorithms have the best performance under all tested methods, regardless of the signal-to-noise ratio in both data sets. This contribution will redound to the benefit of electrophysiological investigations of human cells. Especially the spatial and temporal analysis of neural network communications is improved by using the proposed spike detection algorithms.
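
    A hedged sketch of the first method's general idea: a stationary wavelet transform, a Teager-Kaiser-style energy operator on one detail band, and a robust amplitude threshold. This is an illustration, not the authors' implementation; the wavelet, level, and threshold constant are assumptions, and pywt.swt requires the signal length to be a multiple of 2**level (hence the truncation):

        import numpy as np
        import pywt

        def swt_energy_spikes(x, fs, wavelet="sym5", level=2, k=5.0):
            """Detect spike candidate times (seconds) from an SWT detail band."""
            n = (len(x) // 2 ** level) * 2 ** level       # length required by pywt.swt
            coeffs = pywt.swt(np.asarray(x[:n], dtype=float), wavelet, level=level)
            detail = coeffs[0][1]                         # detail band of the deepest level
            # Teager-Kaiser energy operator emphasizes sharp, spike-like transients.
            energy = detail[1:-1] ** 2 - detail[:-2] * detail[2:]
            thresh = k * np.median(np.abs(energy)) / 0.6745   # robust noise-scaled threshold
            idx = np.where(energy > thresh)[0] + 1
            return idx / fs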

  14. TSCA Environmental Release Application (TERA) for Pseudomonas putida (P. putida)

    EPA Pesticide Factsheets

    TERA submitted by Oak Ridge National Laboratory and given the tracking designation R-01-0002. The microorganism will be tested to determine whether it will produce light in the presence of trinitrotoluene (TNT) as a means of detecting TNT in soil.

  15. Distortion in fingerprints: a statistical investigation using shape measurement tools.

    PubMed

    Sheets, H David; Torres, Anne; Langenburg, Glenn; Bush, Peter J; Bush, Mary A

    2014-07-01

    Friction ridge impression appearance can be affected due to the type of surface touched and pressure exerted during deposition. Understanding the magnitude of alterations, regions affected, and systematic/detectable changes occurring would provide useful information. Geometric morphometric techniques were used to statistically characterize these changes. One hundred and fourteen prints were obtained from a single volunteer and impressed with heavy, normal, and light pressure on computer paper, soft gloss paper, 10-print card stock, and retabs. Six hundred prints from 10 volunteers were rolled with heavy, normal, and light pressure on soft gloss paper and 10-print card stock. Results indicate that while different substrates/pressure levels produced small systematic changes in fingerprints, the changes were small in magnitude: roughly the width of one ridge. There were no detectable changes in the degree of random variability of prints associated with either pressure or substrate. In conclusion, the prints transferred reliably regardless of pressure or substrate. © 2014 American Academy of Forensic Sciences.

  16. Acoustic change detection algorithm using an FM radio

    NASA Astrophysics Data System (ADS)

    Goldman, Geoffrey H.; Wolfe, Owen

    2012-06-01

    The U.S. Army is interested in developing low-cost, low-power, non-line-of-sight sensors for monitoring human activity. One modality that is often overlooked is active acoustics using sources of opportunity such as speech or music. Active acoustics can be used to detect human activity by generating acoustic images of an area at different times, then testing for changes among the imagery. A change detection algorithm was developed to detect physical changes in a building, such as a door changing positions or a large box being moved using acoustics sources of opportunity. The algorithm is based on cross correlating the acoustic signal measured from two microphones. The performance of the algorithm was shown using data generated with a hand-held FM radio as a sound source and two microphones. The algorithm could detect a door being opened in a hallway.
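
    A minimal sketch of the core step as described above: cross-correlate the two microphone signals for a reference interval and for a test interval, and treat a large difference between the two correlation profiles as evidence of a physical change. It assumes synchronized 1-D signals of equal length for all four intervals; names and the scoring rule are illustrative:

        import numpy as np

        def xcorr(a, b):
            """Normalized cross-correlation of two equal-length signals."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return np.correlate(a, b, mode="full") / len(a)

        def change_score(mic1_ref, mic2_ref, mic1_test, mic2_test):
            """Compare reference and test cross-correlation profiles."""
            c_ref = xcorr(mic1_ref, mic2_ref)
            c_test = xcorr(mic1_test, mic2_test)
            return np.abs(c_ref - c_test).max()   # large values suggest a physical change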

  17. An Algorithm for Pedestrian Detection in Multispectral Image Sequences

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Fedorenko, V. V.

    2017-05-01

    The growing interest in self-driving cars creates a demand for scene understanding and obstacle detection algorithms. One of the most challenging problems in this field is the problem of pedestrian detection. The main difficulties arise from the diverse appearances of pedestrians. Poor visibility conditions such as fog and low light also significantly decrease the quality of pedestrian detection. This paper presents a new optical flow based algorithm, BipedDetect, that provides robust pedestrian detection on a single-board computer. The algorithm is based on the idea of simplified Kalman filtering suitable for realization on modern single-board computers. To detect a pedestrian, a synthetic optical flow of the scene without pedestrians is generated using a slanted-plane model. The estimate of the real optical flow is generated using a multispectral image sequence. The difference between the synthetic optical flow and the real optical flow yields the optical flow induced by pedestrians. The final detection of pedestrians is done by segmentation of the difference of optical flows. To evaluate the BipedDetect algorithm, a multispectral dataset was collected using a mobile robot.

  18. Infrared small target detection technology based on OpenCV

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Huang, Zhijian

    2013-05-01

    Accurate and fast detection of dim infrared (IR) targets is very important for infrared precise guidance, early warning, video surveillance, etc. In this paper, the basic principles and implementation flow charts of a series of target detection algorithms are described. These algorithms are the traditional two-frame difference method, an improved three-frame difference method, a background estimation and frame difference fusion method, and background modeling with a neighborhood mean method. Building on the above work, an infrared target detection software platform developed with OpenCV and MFC is introduced. Three kinds of tracking algorithms are integrated in this software. In order to explain the software clearly, its framework and functions are described in this paper. Finally, experiments are performed on some real-life IR images. The whole algorithm implementation process and results are analyzed, and the detection algorithms are evaluated both subjectively and objectively. The results prove that the proposed method has satisfying detection effectiveness and robustness. Meanwhile, it has high detection efficiency and can be used for real-time detection.
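
    A hedged sketch of the three-frame difference idea mentioned above, using OpenCV on three consecutive grayscale (uint8) frames; the function name and threshold are illustrative, not the platform's implementation:

        import cv2

        def three_frame_difference(f_prev, f_cur, f_next, thresh=25):
            """Moving-target mask from three consecutive grayscale frames."""
            d1 = cv2.absdiff(f_cur, f_prev)
            d2 = cv2.absdiff(f_next, f_cur)
            _, b1 = cv2.threshold(d1, thresh, 255, cv2.THRESH_BINARY)
            _, b2 = cv2.threshold(d2, thresh, 255, cv2.THRESH_BINARY)
            mask = cv2.bitwise_and(b1, b2)        # changed in both differences
            return cv2.medianBlur(mask, 3)        # suppress isolated noise pixels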

  19. Infrared small target detection technology based on OpenCV

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Huang, Zhijian

    2013-09-01

    Accurate and fast detection of dim infrared (IR) targets is very important for infrared precise guidance, early warning, video surveillance, etc. In this paper, the basic principles and implementation flow charts of a series of target detection algorithms are described. These algorithms are the traditional two-frame difference method, an improved three-frame difference method, a background estimation and frame difference fusion method, and background modeling with a neighborhood mean method. Building on the above work, an infrared target detection software platform developed with OpenCV and MFC is introduced. Three kinds of tracking algorithms are integrated in this software. In order to explain the software clearly, its framework and functions are described in this paper. Finally, experiments are performed on some real-life IR images. The whole algorithm implementation process and results are analyzed, and the detection algorithms are evaluated both subjectively and objectively. The results prove that the proposed method has satisfying detection effectiveness and robustness. Meanwhile, it has high detection efficiency and can be used for real-time detection.

  20. EFFECT OF POLONIUM α RADIATION ON GELATINE (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ader, M.

    1962-08-01

    When a nuclear plate that has been exposed to radiation, developed, and dried is exposed to a Po source, no effect can be detected by either the eye or the microscope. However, if the plate is placed in distilled water, the emulsion thickness of the irradiated region is reduced by approximately 20 μ. A "ridge" separates this region from the non-irradiated region. The ridge contains piles of silver grains, very deformed traces of the old radiation, and some gelatin fragments. It appears that the alpha particles penetrating the gelatine transform this gelatin, a reversible protein, into a substance "soluble" in distilled water or entrained by the distilled water. (J.S.R.)

  1. Tinea nigra showing a parallel ridge pattern on dermoscopy.

    PubMed

    Noguchi, Hiromitsu; Hiruma, Masataro; Inoue, Yuji; Miyata, Keishi; Tanaka, Masaru; Ihn, Hironobu

    2015-05-01

    An 18-year-old healthy female student noticed a brown macule measuring 21 mm in diameter on the left palm and visited our clinic concerned about a cancerous mole. Dermoscopic examination revealed a brown, fine-dotted and granule-like structure overlapping an amorphous light brown macule. However, unlike previous cases, analysis of the high dynamic range-converted image revealed the parallel ridge pattern frequently observed in malignant melanomas. Brown mycelia were detected on direct microscopic examination; black colonies were isolated on fungal culture and the fungus was identified as Hortaea werneckii. The lesion was treated with topical ketoconazole cream, and it diminished 1 month later. © 2015 Japanese Dermatological Association.

  2. Methods for monitoring hydroacoustic events using direct and reflected T waves in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Hanson, Jeffrey A.; Bowman, J. Roger

    2006-02-01

    The recent installation of permanent, three-element hydrophone arrays in the Indian Ocean offshore Diego Garcia and Cape Leeuwin, Australia, provides an opportunity to study hydroacoustic sources in more detail than previously possible. We developed and applied methods for coherent processing of the array data, for automated association of signals detected at more than one array, and for source location using only direct arrivals and using signals reflected from coastlines and other bathymetric features. During the 286-day study, 4725 hydroacoustic events were defined and located in the Indian and Southern oceans. Events fall into two classes: tectonic earthquakes and ice-related noise. The tectonic earthquakes consist of mid-ocean ridge, trench, and intraplate earthquakes. Mid-ocean ridge earthquakes are the most common tectonic events and often occur in clusters along transform offsets. Hydroacoustic signal levels for earthquakes in a standard catalog suggest that the hydroacoustic processing threshold for ridge events is one magnitude below the seismic network. Fewer earthquakes are observed along the Java Trench than expected because the large bathymetric relief of the source region complicates coupling between seismic and hydroacoustic signals, leading to divergent signal characteristics at different stations. We located 1843 events along the Antarctic coast resulting from various ice noises, most likely thermal fracturing and ice ridge forming events. Reflectors of signals from earthquakes are observed along coastlines, the mid-Indian Ocean and Ninety East ridges, and other bathymetric features. Reflected signals are used as synthetic stations to reduce location uncertainty and to enable event location with a single station.

  3. An improved algorithm for small and cool fire detection using MODIS data: A preliminary study in the southeastern United States

    Treesearch

    Wanting Wang; John J. Qu; Xianjun Hao; Yongqiang Liu; William T. Sommers

    2006-01-01

    Traditional fire detection algorithms mainly rely on hot spot detection using thermal infrared (TIR) channels with fixed or contextual thresholds. Three solar reflectance channels (0.65 μm, 0.86 μm, and 2.1 μm) were recently adopted into the MODIS version 4 contextual algorithm to improve the active fire detection. In the southeastern United...

  4. Influenza detection and prediction algorithms: comparative accuracy trial in Östergötland county, Sweden, 2008-2012.

    PubMed

    Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T

    2017-07-01

    Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed a satisfactory performance, while non-adaptive log-linear regression was the best performing prediction method. We conclude that the available algorithms for influenza detection and prediction display satisfactory performance when applied to local diagnostic data during winter influenza seasons. When applied to local syndromic data, the evaluated algorithms did not display consistent performance. Further evaluations and research on combining methods of these types in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
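
    A minimal sketch of a Serfling-style baseline of the kind named above: harmonic regression (trend plus annual sine and cosine terms) fitted to historical weekly counts, with an alert when observed counts exceed the upper prediction bound. The z value, period, and function name are illustrative, not the study's calibration:

        import numpy as np

        def serfling_threshold(weeks, counts, z=1.96, period=52.0):
            """Fit trend + annual harmonics and return the upper alert threshold curve."""
            t = np.asarray(weeks, dtype=float)
            y = np.asarray(counts, dtype=float)
            X = np.column_stack([
                np.ones_like(t), t,
                np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period),
            ])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            fitted = X @ beta
            sigma = np.std(y - fitted)
            return fitted + z * sigma    # alert when observed counts exceed this curve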

  5. A New Pivoting and Iterative Text Detection Algorithm for Biomedical Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Songhua; Krauthammer, Prof. Michael

    2010-01-01

    There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating the performance on a set of manuallymore » labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. We demonstrate that our projection histogram-based text detection approach is well suited for text detection in biomedical images, and that the iterative application of the algorithm boosts performance to an F score of .60. We provide a C++ implementation of our algorithm freely available for academic use.« less

  6. Breadth-First Search-Based Single-Phase Algorithms for Bridge Detection in Wireless Sensor Networks

    PubMed Central

    Akram, Vahid Khalilpour; Dagdeviren, Orhan

    2013-01-01

    Wireless sensor networks (WSNs) are promising technologies for exploring harsh environments, such as oceans, wild forests, volcanic regions and outer space. Since sensor nodes may have limited transmission range, application packets may be transmitted by multi-hop communication. Thus, connectivity is a very important issue. A bridge is a critical edge whose removal breaks the connectivity of the network. Hence, it is crucial to detect bridges and take preventive measures. Since sensor nodes are battery-powered, services running on nodes should consume low energy. In this paper, we propose energy-efficient and distributed bridge detection algorithms for WSNs. Our algorithms run in a single phase and are integrated with the Breadth-First Search (BFS) algorithm, which is a popular routing algorithm. Our first algorithm is an extended version of Milic's algorithm, which is designed to reduce the message length. Our second algorithm is novel and uses ancestral knowledge to detect bridges. We explain the operation of the algorithms and analyze their proof of correctness and their message, time, space and computational complexities. To evaluate practical importance, we provide testbed experiments and extensive simulations. We show that our proposed algorithms provide less resource consumption, and the energy savings of our algorithms are up to 5.5-times. PMID:23845930
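
    For reference, a compact centralized sketch of bridge detection on an undirected graph (the classic DFS discovery/low-value method), not the paper's distributed, BFS-integrated single-phase algorithms; it assumes a simple graph given as an adjacency-list dict:

        def find_bridges(adj):
            """Classic DFS bridge finding; adj maps each node to an iterable of neighbours."""
            disc, low, bridges = {}, {}, []
            timer = [0]

            def dfs(u, parent):
                disc[u] = low[u] = timer[0]
                timer[0] += 1
                for v in adj[u]:
                    if v == parent:
                        continue
                    if v in disc:
                        low[u] = min(low[u], disc[v])
                    else:
                        dfs(v, u)
                        low[u] = min(low[u], low[v])
                        if low[v] > disc[u]:
                            bridges.append((u, v))   # removing (u, v) disconnects the graph

            for node in adj:
                if node not in disc:
                    dfs(node, None)
            return bridges

        # Example: nodes 0-1 form a bridge into the triangle 1-2-3.
        print(find_bridges({0: [1], 1: [0, 2, 3], 2: [1, 3], 3: [1, 2]}))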

  7. An exact computational method for performance analysis of sequential test algorithms for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Lacy, Fred; Carriere, Patrick

    2015-05-01

    Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as port scanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and average detection time up to arbitrarily pre-specified accuracy.

  8. The 2016 seismic series in the south Alboran Sea: Seismotectonics, Coulomb Failure Stress changes and implications for the active tectonics in the area.

    NASA Astrophysics Data System (ADS)

    Alvarez-Gómez, José A.; Martín, Rosa; Pérez-López, Raul; Stich, Daniel; Cantavella, Juan V.; Martínez-Díaz, José J.; Morales, José; Soto, Juan I.; Carreño, Emilio

    2017-04-01

    The Southern Alboran Sea, particularly the area offshore Al Hoceima Bay, has presented moderate but continuous seismic activity since the Mw 6.0 1994 Al Hoceima earthquake. The maximum magnitude recorded in the area was a Mw 6.3 earthquake in the 2004 Al Hoceima-Tamasint seismic series. Since then, seismicity in the Al Hoceima area has been persistent, with maximum seismic magnitudes around 4. An increase in the seismic rate was registered during 2015, especially from May, culminating in the seismic series of January 2016. The mainshock occurred on January 25th 2016 with a magnitude Mw 6.3 and was preceded by a Mw 5.1 foreshock on January 21st. The seismic series took place at the western end of the Alboran Ridge. Towards the northeast the Alboran Ridge bends and seems to be connected with the NW-SE right-lateral transtensional Yusuf Fault. The recorded seismicity is mainly located in the Alboran Ridge area and along the N-S Al-Idrisi Fault that seems to continue southwards, towards the Al Hoceima Bay. The focal mechanisms previously calculated in the area showed left-lateral strike-slip faulting with some normal component in the Alboran Ridge, but always within a complex system of diffuse deformation and high rupture type variability. We have used 41 computed focal mechanisms of this seismic series to analyze its seismotectonic and structural characteristics. To group the focal mechanisms we used a clustering algorithm based on the spatial distribution of the events and also the type of rupture mechanism. For each cluster we obtained the composite focal mechanism, associating it with a particular fault or family of structures. We tested the mechanical compatibility of these structures by Coulomb Failure Stress transfer modeling. The mainshock of the series occurred on the Al Idrisi Fault where it intersects the western Alboran Ridge. This event triggered aftershocks and independent series on left-lateral strike-slip faults associated with the Al Idrisi Fault System towards the south, but also on nearly pure reverse faults in the fault zone bounding the Alboran Ridge. Both types of faults and rupture mechanisms coexist, linked mechanically by stress transfer, the uplift of the Alboran Ridge being coeval with its northwestward displacement due to the left-lateral motion of the Al-Idrisi Fault. It is also discussed how the contrasting faulting processes and seismic ruptures develop in two differently oriented fault zones in the context of the current NW-SE plate convergence between the African and Eurasian plates in the westernmost Mediterranean.

  9. A robust human face detection algorithm

    NASA Astrophysics Data System (ADS)

    Raviteja, Thaluru; Karanam, Srikrishna; Yeduguru, Dinesh Reddy V.

    2012-01-01

    Human face detection plays a vital role in many applications like video surveillance, managing a face image database, human computer interface among others. This paper proposes a robust algorithm for face detection in still color images that works well even in a crowded environment. The algorithm uses conjunction of skin color histogram, morphological processing and geometrical analysis for detecting human faces. To reinforce the accuracy of face detection, we further identify mouth and eye regions to establish the presence/absence of face in a particular region of interest.

  10. Detection of suspicious pain regions on a digital infrared thermal image using the multimodal function optimization.

    PubMed

    Lee, Junghoon; Lee, Joosung; Song, Sangha; Lee, Hyunsook; Lee, Kyoungjoung; Yoon, Youngro

    2008-01-01

    Automatic detection of suspicious pain regions is very useful in medical digital infrared thermal imaging research. To detect those regions, we use the SOFES (Survival Of the Fitness kind of the Evolution Strategy) algorithm, which is one of the multimodal function optimization methods. We apply this algorithm to common conditions such as the diabetic (glycosuria) foot, degenerative arthritis and varicose veins. The SOFES algorithm is able to detect hot spots or warm lines such as veins. According to a hundred trials, the algorithm converges very quickly.

  11. Biased normalized cuts for target detection in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Xuewen; Dorado-Munoz, Leidy P.; Messinger, David W.; Cahill, Nathan D.

    2016-05-01

    The Biased Normalized Cuts (BNC) algorithm is a useful technique for detecting targets or objects in RGB imagery. In this paper, we propose modifying BNC for the purpose of target detection in hyperspectral imagery. As opposed to other target detection algorithms that typically encode target information prior to dimensionality reduction, our proposed algorithm encodes target information after dimensionality reduction, enabling a user to detect different targets in interactive mode. To assess the proposed BNC algorithm, we utilize hyperspectral imagery (HSI) from the SHARE 2012 data campaign, and we explore the relationship between the number and the position of expert-provided target labels and the precision/recall of the remaining targets in the scene.

  12. Long-term observations of seafloor pressure variations at Lucky Strike volcano, Mid-Atlantic Ridge

    NASA Astrophysics Data System (ADS)

    Ballu, V.; de Viron, O.; Crawford, W. C.; Cannat, M.; Escartin, J.

    2012-12-01

    Lucky Strike volcano is a segment-center volcano on the Mid-Atlantic Ridge at 37°N. Extensive faulting reveals an important tectonic component in its formation, while a seismically imaged axial magma chamber reflector and active high-temperature hydrothermal vents reveal an important present-day magmatic component. Lucky Strike volcano has been the subject of long-term multidisciplinary seafloor observations to understand relations between magmatism, tectonism, hydrothermal circulation, biology and chemistry as part of the MoMAR (Monitoring of the Mid-Atlantic Ridge) program. Absolute pressure gauges have been recording on the volcano since 2007, to identify deformations associated with tectonism or magmatism. Deformation measurements are one of the principal means of determining volcanic activity, but the amount of deformation associated with volcanic events varies greatly between different volcanos. We installed two sites: one in the volcano's summit "lava lake" (1700 m depth) and another on the volcano's flank (2000 m depth). Pressure is recorded every thirty seconds, giving a data set that constrains movements on the scale from minutes to years. No major deformation event has been detected by the instruments since their installation (nor has any significant tectonic event been detected by a seismic network in place since 2007), so we concentrate here on the detection limit of these instruments and on variations in the long-period ocean wave climate. Using the statistical characteristics of the pressure signal, modeled by an auto-regressive process, we determine that a movement between the sites of 1 cm over less than 10 days is detectable; the detection threshold decreases to about 0.2 cm for the shortest time periods and increases for longer time periods due to instrumental drift. We compare the statistical characteristics and short- and long-term sensitivity of three different types of gauges used during the experiment: Paroscientific standard, Paroscientific nanoprecision and SeaBird. We also present seasonal variations in the ocean wave climate (infragravity waves) and correlate them with ocean storms in the Atlantic Ocean basin. We compare these measurements with those made over 3 years at the same site using a differential pressure gauge on a broadband ocean bottom seismometer.

  13. Algorithmic detectability threshold of the stochastic block model

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.

  14. Object Occlusion Detection Using Automatic Camera Calibration for a Wide-Area Video Surveillance System

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Paik, Joonki

    2016-01-01

    This paper presents an object occlusion detection algorithm using object depth information that is estimated by automatic camera calibration. The object occlusion problem is a major factor to degrade the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors but with a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems. PMID:27347978

  15. Detection of spontaneous vesicle release at individual synapses using multiple wavelets in a CWT-based algorithm.

    PubMed

    Sokoll, Stefan; Tönnies, Klaus; Heine, Martin

    2012-01-01

    In this paper we present an algorithm for the detection of spontaneous activity at individual synapses in microscopy images. By employing the optical marker pHluorin, we are able to visualize synaptic vesicle release with a spatial resolution in the nm range in a non-invasive manner. We compute individual synaptic signals from automatically segmented regions of interest and detect peaks that represent synaptic activity using a continuous wavelet transform based algorithm. As opposed to standard peak detection algorithms, we employ multiple wavelets to match all relevant features of the peak. We evaluate our multiple wavelet algorithm (MWA) on real data and assess the performance on synthetic data over a wide range of signal-to-noise ratios.

  16. Automated detection of hospital outbreaks: A systematic review of methods.

    PubMed

    Leclère, Brice; Buckeridge, David L; Boëlle, Pierre-Yves; Astagneau, Pascal; Lepelletier, Didier

    2017-01-01

    Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results.

  17. A service relation model for web-based land cover change detection

    NASA Astrophysics Data System (ADS)

    Xing, Huaqiao; Chen, Jun; Wu, Hao; Zhang, Jun; Li, Songnian; Liu, Boyu

    2017-10-01

    Change detection with remotely sensed imagery is a critical step in land cover monitoring and updating. Although a variety of algorithms or models have been developed, none of them is universal for all cases. The selection of appropriate algorithms and construction of processing workflows depend largely on experts' knowledge of the "algorithm-data" relations between change detection algorithms and the imagery data used. This paper presents a service relation model for land cover change detection by integrating the experts' knowledge about the "algorithm-data" relations into web-based geo-processing. The "algorithm-data" relations are mapped into a set of web service relations with the analysis of functional and non-functional service semantics. These service relations are further classified into three different levels, i.e., interface, behavior and execution levels. A service relation model is then established using the Object and Relation Diagram (ORD) approach to represent the multi-granularity services and their relations for change detection. A set of semantic matching rules are built and used for deriving on-demand change detection service chains from the service relation model. A web-based prototype system is developed in the .NET development environment, which encapsulates nine change detection and pre-processing algorithms and represents their service relations as an ORD. Three test areas from Shandong and Hebei provinces, China, with different imagery conditions are selected for online change detection experiments, and the results indicate that on-demand service chains can be generated according to different users' demands.

  18. Mathematical detection of aortic valve opening (B point) in impedance cardiography: A comparison of three popular algorithms.

    PubMed

    Árbol, Javier Rodríguez; Perakakis, Pandelis; Garrido, Alba; Mata, José Luis; Fernández-Santaella, M Carmen; Vila, Jaime

    2017-03-01

    The preejection period (PEP) is an index of left ventricle contractility widely used in psychophysiological research. Its computation requires detecting the moment when the aortic valve opens, which coincides with the B point in the first derivative of impedance cardiogram (ICG). Although this operation has been traditionally made via visual inspection, several algorithms based on derivative calculations have been developed to enable an automatic performance of the task. However, despite their popularity, data about their empirical validation are not always available. The present study analyzes the performance in the estimation of the aortic valve opening of three popular algorithms, by comparing their performance with the visual detection of the B point made by two independent scorers. Algorithm 1 is based on the first derivative of the ICG, Algorithm 2 on the second derivative, and Algorithm 3 on the third derivative. Algorithm 3 showed the highest accuracy rate (78.77%), followed by Algorithm 1 (24.57%) and Algorithm 2 (13.82%). In the automatic computation of PEP, Algorithm 2 resulted in significantly more missed cycles (48.57%) than Algorithm 1 (6.3%) and Algorithm 3 (3.5%). Algorithm 2 also estimated a significantly lower average PEP (70 ms), compared with the values obtained by Algorithm 1 (119 ms) and Algorithm 3 (113 ms). Our findings indicate that the algorithm based on the third derivative of the ICG performs significantly better. Nevertheless, a visual inspection of the signal proves indispensable, and this article provides a novel visual guide to facilitate the manual detection of the B point. © 2016 Society for Psychophysiological Research.
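    The third-derivative criterion can be sketched as follows; the window length, sampling rate, and the use of the largest third-derivative peak before the C point are illustrative assumptions rather than the exact rules of Algorithm 3.

```python
import numpy as np

def estimate_b_point(dzdt, c_index, fs=1000, search_ms=150):
    """Rough B-point candidate from the third derivative of the ICG.

    dzdt    : 1-D array, first derivative of the impedance signal (dZ/dt)
    c_index : sample index of the dZ/dt maximum (C point) in the beat
    fs      : sampling rate in Hz (assumed)
    Returns the index of the largest third-derivative peak in a window
    preceding the C point, a common heuristic for the B point.
    """
    d3 = np.gradient(np.gradient(np.gradient(dzdt)))
    start = max(0, c_index - int(search_ms * fs / 1000))
    window = d3[start:c_index]
    if window.size == 0:
        return None
    return start + int(np.argmax(window))
```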

  19. An Improved Vision-based Algorithm for Unmanned Aerial Vehicles Autonomous Landing

    NASA Astrophysics Data System (ADS)

    Zhao, Yunji; Pei, Hailong

    In a vision-based autonomous landing system for UAVs, the efficiency of target detection and tracking directly affects the control system. An improved SURF (Speeded-Up Robust Features) algorithm is proposed to resolve the inefficiency of the standard SURF algorithm in the autonomous landing system. The improved algorithm is composed of three steps: first, detect the target region using Camshift; second, detect feature points in the region acquired above using the SURF algorithm; third, match the template target against the target region in each frame. Experimental results and theoretical analysis verify the efficiency of the algorithm.
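    A hedged OpenCV sketch of the second and third steps is shown below; because SURF is only shipped in contrib builds, ORB is used here as a stand-in feature detector, and the tracking window is assumed to come from a Camshift tracker.

```python
import cv2

def match_target_in_region(template_gray, frame_gray, track_window):
    """Match a landing-pad template against a tracked region of a frame.

    track_window : (x, y, w, h) region produced by a Camshift tracker
    ORB is used here as a freely available stand-in for SURF features.
    """
    x, y, w, h = track_window
    roi = frame_gray[y:y + h, x:x + w]

    orb = cv2.ORB_create(nfeatures=500)
    kp_t, des_t = orb.detectAndCompute(template_gray, None)
    kp_r, des_r = orb.detectAndCompute(roi, None)
    if des_t is None or des_r is None:
        return []

    # brute-force Hamming matcher with cross-checking for binary descriptors
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_t, des_r)
    return sorted(matches, key=lambda m: m.distance)
```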

  20. Overdentures on implants placed in bone augmented with fresh frozen bone.

    PubMed

    Rigo, L; Viscioni, A; Franco, M; Lucchese, A; Zollino, I; Brunelli, G; Carinci, F

    2011-01-01

    In the last decade several studies have been performed to evaluate the clinical outcome of one or two stage loaded implants supporting overdentures. Since fresh frozen bone (FFB) has an ever-increasing number of clinical applications and few reports are available on implants inserted into FFB, we performed a retrospective study on fixtures inserted in FFB and bearing overdentures. In the period between December 2003 and December 2006, 17 patients (14 females and 3 males with a median age of about 56 years) were grafted and 60 implants inserted thereafter. A total of 17 overdentures were delivered: 8 in the mandible and 9 in the maxilla. Multiple implant systems were used: 22 Double etched, 7 SLA, 9 Anodic oxidized, and 22 CaPo4 ceramic-blasted. Implant diameter ranged from 3.25 to 4.3 mm and length from 11.5 to 16.0 mm. Implants were inserted to replace 23 incisors, 9 cuspids, 20 premolars and 8 molars. No implants were lost (i.e., survival rate=100%) and no differences were detected among the studied variables. The Kaplan-Meier algorithm and Cox regression did not reveal any statistical differences among the studied variables with regard to the success rate, either. Implants inserted in FFB and bearing overdentures have high survival and success rates, comparable to those of implants inserted in non-grafted bone. FFB is a reliable material for alveolar ridge augmentation. No difference was detected among removable prostheses supported by two or more implants.

  1. Implementing a Parallel Image Edge Detection Algorithm Based on the Otsu-Canny Operator on the Hadoop Platform.

    PubMed

    Cao, Jianfang; Chen, Lichao; Wang, Min; Tian, Yun

    2018-01-01

    The Canny operator is widely used to detect edges in images. However, as the size of the image dataset increases, the edge detection performance of the Canny operator decreases and its runtime becomes excessive. To improve the runtime and edge detection performance of the Canny operator, in this paper, we propose a parallel design and implementation for an Otsu-optimized Canny operator using a MapReduce parallel programming model that runs on the Hadoop platform. The Otsu algorithm is used to optimize the Canny operator's dual threshold and improve the edge detection performance, while the MapReduce parallel programming model facilitates parallel processing for the Canny operator to solve the processing speed and communication cost problems that occur when the Canny edge detection algorithm is applied to big data. For the experiments, we constructed datasets of different scales from the Pascal VOC2012 image database. The proposed parallel Otsu-Canny edge detection algorithm performs better than other traditional edge detection algorithms. The parallel approach reduced the running time by approximately 67.2% on a Hadoop cluster architecture consisting of 5 nodes with a dataset of 60,000 images. Overall, our approach speeds up processing by approximately 3.4 times on large-scale datasets, which demonstrates the clear superiority of our method. The proposed algorithm in this study demonstrates both better edge detection performance and improved time performance.
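    The Otsu-to-Canny threshold coupling (without the MapReduce layer) can be sketched in a few lines of OpenCV; using the Otsu value as the high threshold and half of it as the low threshold is a common convention and an assumption here, not necessarily the exact mapping used in the paper.

```python
import cv2

def otsu_canny(image_gray, blur_ksize=5):
    """Canny edge detection with thresholds derived from Otsu's method.

    The Otsu threshold serves as the high Canny threshold and half of it
    as the low threshold, avoiding hand-tuned dual-threshold parameters.
    """
    blurred = cv2.GaussianBlur(image_gray, (blur_ksize, blur_ksize), 0)
    otsu_thresh, _ = cv2.threshold(
        blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return cv2.Canny(blurred, 0.5 * otsu_thresh, otsu_thresh)
```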

  2. Global Stratigraphy of Venus: Analysis of a Random Sample of Thirty-Six Test Areas

    NASA Technical Reports Server (NTRS)

    Basilevsky, Alexander T.; Head, James W., III

    1995-01-01

    The age relations between 36 impact craters with dark paraboloids and other geologic units and structures at these localities have been studied through photogeologic analysis of Magellan SAR images of the surface of Venus. Geologic settings in all 36 sites, about 1000 x 1000 km each, could be characterized using only 10 different terrain units and six types of structures. These units and structures form a major stratigraphic and geologic sequence (from oldest to youngest): (1) tessera terrain; (2) densely fractured terrains associated with coronae and in the form of remnants among plains; (3) fractured and ridged plains and ridge belts; (4) plains with wrinkle ridges; (5) ridges associated with coronae annulae and ridges of arachnoid annulae which are contemporary with wrinkle ridges of the ridged plains; (6) smooth and lobate plains; (7) fractures of coronae annulae, and fractures not related to coronae annulae, which disrupt ridged and smooth plains; (8) rift-associated fractures; and (9) craters with associated dark paraboloids, which represent the youngest 10% of the Venus impact crater population (Campbell et al.), and are on top of all volcanic and tectonic units except the youngest episodes of rift-associated fracturing and volcanism; surficial streaks and patches are approximately contemporary with dark-paraboloid craters. Mapping of such units and structures in 36 randomly distributed large regions (each approximately 10(exp 6) sq km) shows evidence for a distinctive regional and global stratigraphic and geologic sequence. On the basis of this sequence we have developed a model that illustrates several major themes in the history of Venus. Most of the history of Venus (that of its first 80% or so) is not preserved in the surface geomorphological record. The major deformation associated with tessera formation in the period sometime between 0.5-1.0 b.y. ago (Ivanov and Basilevsky) is the earliest event detected. In the terminal stages of tessera formation, extensive parallel linear graben swarms representing a change in the style of deformation from shortening to extension were formed on the tessera and on some volcanic plains that were emplaced just after, and perhaps also during the latter stages of the major compressional phase of tessera emplacement. Our stratigraphic analyses suggest that following tessera formation, extensive volcanic flooding resurfaced at least 85% of the planet in the form of the presently-ridged and fractured plains. Several lines of evidence favor a high flux in the post-tessera period but we have no independent evidence for the absolute duration of ridged plains emplacement. During this time, the net state of stress in the lithosphere apparently changed from extensional to compressional, first in the form of extensive ridge belt development, followed by the formation of extensive wrinkle ridges on the flow units. Subsequently, there occurred local emplacement of smooth and lobate plains units which are presently essentially undeformed. The major events in the latest 10% of the presently preserved history of Venus (less than 50 m.y. ago) are continued rifting and some associated volcanism, and the redistribution of eolian material largely derived from impact crater deposits.

  3. Curiosity View of 'Vera Rubin Ridge' From Below, Sol 1734

    NASA Image and Video Library

    2017-09-13

    "Vera Rubin Ridge," a favored destination for NASA's Curiosity Mars rover even before the rover landed in 2012, rises near the rover nearly five years later in this panorama from Curiosity's Mast Camera (Mastcam). The scene combines 23 images taken with the Mastcam's right-eye, telephoto-lens camera, on June 22, 2017, during the 1,734th Martian day, or sol, of Curiosity's work on Mars. The rover began ascending the ridge in September 2017. This and other Mastcam panoramas show details of the sedimentary rocks that make up the "Vera Rubin Ridge." This distinct topographic feature located on the lower slopes of Mount Sharp (Aeolis Mons) is characterized by the presence of hematite, an iron-oxide mineral, which has been detected from orbit. The Mastcam images show that the rocks making up the lower part of the ridge are characterized by distinct horizontal stratification with individual rock layers of the order of several inches (tens of centimeters) thick. Scientists on the mission are using such images to determine the ancient environment these rocks were deposited in. The repeated beds indicate progressive accumulation of sediments that now make up the lower part of Mount Sharp, although from this distance it is not possible to know if they were formed by aqueous or wind-blown processes. Close-up images collected as the rover climbs the ridge will help answer this question. The stratified rocks are cross cut by veins filled with a white mineral, likely calcium sulfate, that provide evidence of later episodes of fluid flow through the rocks. The panorama has been white-balanced so that the colors of the rock materials resemble how they would appear under daytime lighting conditions on Earth. It spans about 65 compass degrees, centered toward the south-southeast. Higher portions of Mount Sharp are visible at upper left. The Sol 1734 location just north of the ridge is shown in a Sol 1732 traverse map. An annotated figure is shown at https://photojournal.jpl.nasa.gov/catalog/PIA21849

  4. Leveraging disjoint communities for detecting overlapping community structure

    NASA Astrophysics Data System (ADS)

    Chakraborty, Tanmoy

    2015-05-01

    Network communities represent mesoscopic structure for understanding the organization of real-world networks, where nodes often belong to multiple communities and form overlapping community structure in the network. Because finding the exact boundaries of such overlapping communities is non-trivial, the problem is challenging, and considerable effort has been devoted to detecting overlapping communities in networks. In this paper, we present PVOC (Permanence based Vertex-replication algorithm for Overlapping Community detection), a two-stage framework to detect overlapping community structure. We build on a novel observation that the non-overlapping community structure detected by a standard disjoint community detection algorithm closely resembles the actual overlapping community structure, except for the overlapping part. Based on this observation, we posit that there is perhaps no need to build yet another overlapping community finding algorithm; instead, one can efficiently manipulate the output of any existing disjoint community finding algorithm to obtain the required overlapping structure. We propose a new post-processing technique that, combined with any existing disjoint community detection algorithm, processes each vertex using a new vertex-based metric, called permanence, and thereby identifies overlapping candidates along with their community memberships. Experimental results on both synthetic and large real-world networks show that PVOC significantly outperforms six state-of-the-art overlapping community detection algorithms in terms of the similarity of its output with the ground-truth structure. Thus our framework not only finds meaningful overlapping communities in networks, but also removes the need to build yet another overlapping community detection algorithm.

  5. High reliability - low noise radionuclide signature identification algorithms for border security applications

    NASA Astrophysics Data System (ADS)

    Lee, Sangkyu

    Illicit trafficking and smuggling of radioactive materials and special nuclear materials (SNM) are considered one of the most important recent global nuclear threats. Monitoring the transport and safety of radioisotopes and SNM is challenging due to their weak signals and easy shielding. Great efforts worldwide are focused on developing and improving detection technologies and algorithms for accurate and reliable detection of radioisotopes of interest, thus better securing the borders against nuclear threats. In general, radiation portal monitors enable detection of gamma- and neutron-emitting radioisotopes. Passive and active interrogation techniques, existing and/or under development, are all aimed at increasing accuracy and reliability while shortening the interrogation time and reducing the cost of the equipment. Equally important efforts are aimed at advancing algorithms to process the imaging data efficiently, providing reliable "readings" of the interiors of the examined volumes of various sizes, ranging from cargo containers to suitcases. The main objective of this thesis is to develop two synergistic algorithms with the goal of providing highly reliable, low-noise identification of radioisotope signatures. These algorithms combine analysis of a passive radiation detection technique with active interrogation imaging techniques such as gamma radiography or muon tomography. One algorithm consists of gamma spectroscopy and cosmic muon tomography, and the other is based on gamma spectroscopy and gamma radiography. The purpose of fusing two detection methodologies per algorithm is to find both heavy-Z radioisotopes and shielding materials, since radionuclides can be identified with gamma spectroscopy, and shielding materials can be detected using muon tomography or gamma radiography. These combined algorithms are created and analyzed based on numerically generated images of various cargo sizes and materials. In summary, the three detection methodologies are fused into two algorithms with mathematical functions providing: reliable identification of radioisotopes in gamma spectroscopy; noise reduction and precision enhancement in muon tomography; and atomic number and density estimation in gamma radiography. It is expected that these new algorithms may be implemented in portal scanning systems with the goal of enhancing the accuracy and reliability of detecting nuclear materials inside cargo containers.

  6. Searching Information Sources in Networks

    DTIC Science & Technology

    2017-06-14

    During the course of this project, we made significant progress in multiple directions of information source detection: (1) ... result on information source detection on non-tree networks; (2) the development of information source localization algorithms to detect multiple ... information sources. The algorithms have provable performance guarantees and outperform existing algorithms ...

  7. Identification of large-scale meteorological patterns associated with extreme precipitation in the US northeast

    NASA Astrophysics Data System (ADS)

    Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.

    2018-03-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme day precipitation: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
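    A minimal sketch of the KMC step is given below, assuming the daily tropopause-height fields are already on a common grid; the anomaly removal and the choice of six clusters mirror the text, while the library and preprocessing details are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_circulation_patterns(fields, n_patterns=6, random_state=0):
    """Cluster daily tropopause-height fields into circulation patterns.

    fields : array of shape (n_days, n_lat, n_lon), e.g. only the
             extreme-precipitation days; each field becomes one sample
    Returns the fitted KMeans model and the per-day pattern labels.
    """
    n_days = fields.shape[0]
    X = fields.reshape(n_days, -1)
    X = X - X.mean(axis=0)            # remove the mean field before clustering
    km = KMeans(n_clusters=n_patterns, n_init=20, random_state=random_state)
    labels = km.fit_predict(X)
    return km, labels
```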

  8. Large-Scale Meteorological Patterns Associated with Extreme Precipitation in the US Northeast

    NASA Astrophysics Data System (ADS)

    Agel, L. A.; Barlow, M. A.

    2016-12-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. Tropopause height provides a compact representation of large-scale circulation patterns, as it is linked to mid-level circulation, low-level thermal contrasts and low-level diabatic heating. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into a larger context. Six tropopause patterns are identified on extreme days: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong upward motion during, and moisture transport preceding, extreme precipitation events.

  9. LPA-CBD an improved label propagation algorithm based on community belonging degree for community detection

    NASA Astrophysics Data System (ADS)

    Gui, Chun; Zhang, Ruisheng; Zhao, Zhili; Wei, Jiaxuan; Hu, Rongjing

    In order to deal with stochasticity in center node selection and instability in community detection of label propagation algorithm, this paper proposes an improved label propagation algorithm named label propagation algorithm based on community belonging degree (LPA-CBD) that employs community belonging degree to determine the number and the center of community. The general process of LPA-CBD is that the initial community is identified by the nodes with the maximum degree, and then it is optimized or expanded by community belonging degree. After getting the rough structure of network community, the remaining nodes are labeled by using label propagation algorithm. The experimental results on 10 real-world networks and three synthetic networks show that LPA-CBD achieves reasonable community number, better algorithm accuracy and higher modularity compared with other four prominent algorithms. Moreover, the proposed algorithm not only has lower algorithm complexity and higher community detection quality, but also improves the stability of the original label propagation algorithm.
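    For orientation, the baseline label propagation step (not the community-belonging-degree seeding of LPA-CBD) can be run with NetworkX as sketched below; the toy edge list is purely illustrative.

```python
import networkx as nx
from networkx.algorithms.community import label_propagation_communities

def baseline_lpa(edge_list):
    """Run standard label propagation on an edge list.

    This is only the baseline LPA that LPA-CBD builds on; the
    community-belonging-degree initialization is not reproduced here.
    """
    G = nx.Graph()
    G.add_edges_from(edge_list)
    communities = list(label_propagation_communities(G))
    return [sorted(c) for c in communities]

# Example: two triangles joined by a single bridge edge
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
print(baseline_lpa(edges))
```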

  10. QuateXelero: An Accelerated Exact Network Motif Detection Algorithm

    PubMed Central

    Khakabimamaghani, Sahand; Sharafuddin, Iman; Dichter, Norbert; Koch, Ina; Masoudi-Nejad, Ali

    2013-01-01

    Finding motifs in biological, social, technological, and other types of networks has become a widespread method to gain more knowledge about these networks' structure and function. However, this task is very computationally demanding, because it is closely tied to the graph isomorphism problem, which is in NP but not yet known to belong to either P or the NP-complete class. Accordingly, this research endeavors to decrease the need to call the NAUTY isomorphism detection method, which is the most time-consuming step in many existing algorithms. The work provides an extremely fast motif detection algorithm called QuateXelero, which has a quaternary tree data structure at its core. The proposed algorithm is based on the well-known ESU (FANMOD) motif detection algorithm. The results of experiments on some standard model networks confirm the overall superiority of the proposed algorithm, QuateXelero, compared with two of the fastest existing algorithms, G-Tries and Kavosh. QuateXelero is especially fast in constructing the central data structure of the algorithm from scratch based on the input network. PMID:23874498

  11. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
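    The normalized LCS similarity used for clustering can be written as a standard dynamic program; this is the straightforward O(nm) version, not the faster hybrid algorithm developed in the paper.

```python
def normalized_lcs(a, b):
    """Length of the longest common subsequence of a and b, normalized.

    Uses the standard O(len(a) * len(b)) dynamic program; the hybrid
    algorithm described in the paper is a faster replacement for this step.
    """
    n, m = len(a), len(b)
    if n == 0 or m == 0:
        return 0.0
    prev = [0] * (m + 1)
    for i in range(1, n + 1):
        curr = [0] * (m + 1)
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr
    return prev[m] / max(n, m)

# Similar sequences score close to 1, dissimilar ones close to 0
print(normalized_lcs("ABCBDAB", "BDCABA"))   # LCS length 4 -> 4/7
```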

  12. Robust Kalman filter design for predictive wind shear detection

    NASA Technical Reports Server (NTRS)

    Stratton, Alexander D.; Stengel, Robert F.

    1991-01-01

    Severe, low-altitude wind shear is a threat to aviation safety. Airborne sensors under development measure the radial component of wind along a line directly in front of an aircraft. In this paper, optimal estimation theory is used to define a detection algorithm to warn of hazardous wind shear from these sensors. To achieve robustness, a wind shear detection algorithm must distinguish threatening wind shear from less hazardous gustiness, despite variations in wind shear structure. This paper presents statistical analysis methods to refine wind shear detection algorithm robustness. Computational methods predict the ability to warn of severe wind shear and avoid false warning. Comparative capability of the detection algorithm as a function of its design parameters is determined, identifying designs that provide robust detection of severe wind shear.

  13. Real-time ECG monitoring and arrhythmia detection using Android-based mobile devices.

    PubMed

    Gradl, Stefan; Kugler, Patrick; Lohmuller, Clemens; Eskofier, Bjoern

    2012-01-01

    We developed an application for Android™-based mobile devices that allows real-time electrocardiogram (ECG) monitoring and automated arrhythmia detection by analyzing ECG parameters. ECG data provided by pre-recorded files or acquired live by accessing a Shimmer™ sensor node via Bluetooth™ can be processed and evaluated. The application is based on the Pan-Tompkins algorithm for QRS-detection and contains further algorithm blocks to detect abnormal heartbeats. The algorithm was validated using the MIT-BIH Arrhythmia and MIT-BIH Supraventricular Arrhythmia databases. More than 99% of all QRS complexes were detected correctly by the algorithm. Overall sensitivity for abnormal beat detection was 89.5% with a specificity of 80.6%. The application is available for download and may be used for real-time ECG-monitoring on mobile devices.
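    A compact sketch of the Pan-Tompkins stages (band-pass filtering, differentiation, squaring, moving-window integration, peak picking) is shown below; the filter order, window length, and fixed threshold are simplifying assumptions, and the abnormal-beat logic of the app is not reproduced.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def pan_tompkins_qrs(ecg, fs=360):
    """Simplified Pan-Tompkins pipeline for QRS detection.

    ecg : 1-D ECG signal; fs : sampling rate in Hz (360 Hz for MIT-BIH).
    Returns sample indices of detected QRS complexes.
    """
    # 5-15 Hz band-pass isolates QRS energy
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    squared = np.gradient(filtered) ** 2              # derivative then squaring
    window = int(0.15 * fs)                           # ~150 ms integration window
    integrated = np.convolve(squared, np.ones(window) / window, mode="same")
    threshold = 0.3 * np.max(integrated)              # crude fixed threshold
    peaks, _ = find_peaks(integrated, height=threshold, distance=int(0.2 * fs))
    return peaks
```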

  14. Accurate derivation of heart rate variability signal for detection of sleep disordered breathing in children.

    PubMed

    Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S

    2004-01-01

    The electrocardiogram (ECG) signal is used extensively as a low cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular, reliable detection of the R wave peak, is essential in computer based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was performed, and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincare plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on ECG data from Physionet, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing sleep disordered breathing in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in extraction of reliable measures to distinguish between periods of normal and sleep disordered breathing (SDB) in children.

  15. Algorithm for detection the QRS complexes based on support vector machine

    NASA Astrophysics Data System (ADS)

    Van, G. V.; Podmasteryev, K. V.

    2017-11-01

    The efficiency of computer ECG analysis depends on the accurate detection of QRS complexes. This paper presents an algorithm for QRS complex detection based on a support vector machine (SVM). The proposed algorithm is evaluated on annotated standard databases such as the MIT-BIH Arrhythmia database. The QRS detector obtained a sensitivity Se = 98.32% and specificity Sp = 95.46% for the MIT-BIH Arrhythmia database. This algorithm can be used as the basis for software to diagnose the electrical activity of the heart.

  16. Aiding the Detection of QRS Complex in ECG Signals by Detecting S Peaks Independently.

    PubMed

    Sabherwal, Pooja; Singh, Latika; Agrawal, Monika

    2018-03-30

    In this paper, a novel algorithm for accurate detection of the QRS complex is proposed, combining the independent detection of R and S peaks through a fusion algorithm. R peak detection has been extensively studied and is commonly used to detect the QRS complex, whereas S peaks, which are also part of the QRS complex, can be detected independently to aid QRS detection. We suggest a method to first estimate S peaks from the raw ECG signal and then use them to aid the detection of the QRS complex. The amplitude of the S peak in the ECG signal is weaker than that of the corresponding R peak, which is traditionally used for QRS detection; therefore, an appropriate digital filter is designed to enhance the S peaks. These enhanced S peaks are then detected by adaptive thresholding. The algorithm is validated on all signals of the MIT-BIH arrhythmia database and the noise stress database taken from physionet.org, and it performs reasonably well even for signals highly corrupted by noise. The algorithm's performance is confirmed by a sensitivity and positive predictivity of 99.99% and a detection accuracy of 99.98% for QRS complex detection. The numbers of false positives and false negatives were drastically reduced to 80 and 42, compared with 98 and 84 for the best results reported so far.

  17. Enhancement of Fast Face Detection Algorithm Based on a Cascade of Decision Trees

    NASA Astrophysics Data System (ADS)

    Khryashchev, V. V.; Lebedev, A. A.; Priorov, A. L.

    2017-05-01

    A face detection algorithm based on a cascade of ensembles of decision trees (CEDT) is presented. The new approach detects faces in non-frontal positions through the use of multiple classifiers, each trained for a specific range of head rotation angles. The results showed high throughput for CEDT on images of standard size. The algorithm increases the area under the ROC curve by 13% compared to the standard Viola-Jones face detection algorithm. The final implementation of the algorithm consists of five cascades for frontal and non-frontal faces. The simulation results also show that the CEDT algorithm has low computational complexity compared with the standard Viola-Jones approach. This could prove important in the embedded system and mobile device industries because it can reduce hardware costs and extend battery life.

  18. Detection of pseudosinusoidal epileptic seizure segments in the neonatal EEG by cascading a rule-based algorithm with a neural network.

    PubMed

    Karayiannis, Nicolaos B; Mukherjee, Amit; Glover, John R; Ktonas, Periklis Y; Frost, James D; Hrachovy, Richard A; Mizrahi, Eli M

    2006-04-01

    This paper presents an approach to detect epileptic seizure segments in the neonatal electroencephalogram (EEG) by characterizing the spectral features of the EEG waveform using a rule-based algorithm cascaded with a neural network. A rule-based algorithm screens out short segments of pseudosinusoidal EEG patterns as epileptic based on features in the power spectrum. The output of the rule-based algorithm is used to train and compare the performance of conventional feedforward neural networks and quantum neural networks. The results indicate that the trained neural networks, cascaded with the rule-based algorithm, improved the performance of the rule-based algorithm acting by itself. The evaluation of the proposed cascaded scheme for the detection of pseudosinusoidal seizure segments reveals its potential as a building block of the automated seizure detection system under development.

  19. Improved space object detection using short-exposure image data with daylight background.

    PubMed

    Becker, David; Cain, Stephen

    2018-05-10

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. The detection algorithms employed play a crucial role in fulfilling the detection component in the space situational awareness mission to detect, track, characterize, and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator on long-exposure data to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follow a Gaussian distribution. Long-exposure imaging is critical to detection performance in these algorithms; however, for imaging under daylight conditions, it becomes necessary to create a long-exposure image as the sum of many short-exposure images. This paper explores the potential for increasing detection capabilities for small and dim space objects in a stack of short-exposure images dominated by a bright background. The algorithm proposed in this paper improves the traditional stack and average method of forming a long-exposure image by selectively removing short-exposure frames of data that do not positively contribute to the overall signal-to-noise ratio of the averaged image. The performance of the algorithm is compared to a traditional matched filter detector using data generated in MATLAB as well as laboratory-collected data. The results are illustrated on a receiver operating characteristic curve to highlight the increased probability of detection associated with the proposed algorithm.
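    The frame-selection idea can be sketched as a greedy loop that keeps a short-exposure frame only if it does not lower a simple signal-to-noise estimate of the running average; the SNR estimate and acceptance rule below are illustrative stand-ins for the criterion described in the paper.

```python
import numpy as np

def selective_stack(frames, min_gain=0.0):
    """Average only the short-exposure frames that improve the stack's SNR.

    frames : array of shape (n_frames, rows, cols)
    Returns the selectively averaged image and the number of frames kept.
    """
    def snr(img):
        background = np.median(img)
        noise = np.std(img)
        return (img.max() - background) / (noise + 1e-12)

    stack = frames[0].astype(float)   # running average of accepted frames
    kept = 1
    for frame in frames[1:]:
        candidate = (stack * kept + frame) / (kept + 1)
        # keep the frame only if the peak-to-noise ratio does not decrease
        if snr(candidate) - snr(stack) >= min_gain:
            stack, kept = candidate, kept + 1
    return stack, kept
```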

  20. Two TSCA Environmental Release Applications (TERAs) for Pseudomonas putida (P. Putida)

    EPA Pesticide Factsheets

    TERAs submitted by Oak Ridge National Laboratory and given the tracking designations of R-01-0003 and R-01-0004. The microorganisms will be tested at the Ravenna Army Ammunition Plant in Ohio to determine whether they can detect traces of TNT in soil.

  1. Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video

    PubMed Central

    Lee, Gil-beom; Lee, Myeong-jin; Lee, Woo-Kyung; Park, Joo-heon; Kim, Tae-Hwan

    2017-01-01

    Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos. PMID:28327515

  2. Obstacle Detection Algorithms for Rotorcraft Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia I.; Huang, Ying; Narasimhamurthy, Anand; Pande, Nitin; Ahumada, Albert (Technical Monitor)

    2001-01-01

    In this research we addressed the problem of obstacle detection for low altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. After reviewing the line detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer generated wire images. The performance of the algorithm was evaluated at both the pixel and wire levels. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post-processing is performed to remove false alarms due to clutter.

  3. Falls event detection using triaxial accelerometry and barometric pressure measurement.

    PubMed

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Celler, Branko G; Lovell, Nigel H

    2009-01-01

    A falls detection system, employing a Bluetooth-based wearable device, containing a triaxial accelerometer and a barometric pressure sensor, is described. The aim of this study is to evaluate the use of barometric pressure measurement, as a surrogate measure of altitude, to augment previously reported accelerometry-based falls detection algorithms. The accelerometry and barometric pressure signals obtained from the waist-mounted device are analyzed by a signal processing and classification algorithm to discriminate falls from activities of daily living. This falls detection algorithm has been compared to two existing algorithms which utilize accelerometry signals alone. A set of laboratory-based simulated falls, along with other tasks associated with activities of daily living (16 tests) were performed by 15 healthy volunteers (9 male and 6 female; age: 23.7 +/- 2.9 years; height: 1.74 +/- 0.11 m). The algorithm incorporating pressure information detected falls with the highest sensitivity (97.8%) and the highest specificity (96.7%).
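    A toy version of the combined decision rule is sketched below; the impact and pressure thresholds are placeholders for illustration only and are not the values used in the reported study.

```python
import numpy as np

def detect_fall(acc_magnitude, pressure, fs=50,
                impact_g=2.5, pressure_rise_hpa=0.12):
    """Very simple fall test combining an impact peak with an altitude drop.

    acc_magnitude : 1-D vector magnitude of the triaxial accelerometer (in g)
    pressure      : 1-D barometric pressure (hPa), sampled at the same rate fs
    A fall is flagged if a large acceleration peak is followed within two
    seconds by a sustained pressure increase (i.e. a decrease in altitude).
    """
    impacts = np.where(acc_magnitude > impact_g)[0]
    for idx in impacts:
        if idx < fs or idx + 2 * fs > len(pressure):
            continue                       # not enough context around the peak
        before = pressure[idx - fs:idx].mean()
        after = pressure[idx + fs:idx + 2 * fs].mean()
        if after - before > pressure_rise_hpa:
            return True
    return False
```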

  4. Comparison of algorithms for automatic border detection of melanoma in dermoscopy images

    NASA Astrophysics Data System (ADS)

    Srinivasa Raghavan, Sowmya; Kaur, Ravneet; LeAnder, Robert

    2016-09-01

    Melanoma is one of the most rapidly accelerating cancers in the world [1]. Early diagnosis is critical to an effective cure. We propose a new algorithm for more accurately detecting melanoma borders in dermoscopy images. Proper border detection requires eliminating occlusions like hair and bubbles by processing the original image. The preprocessing step involves transforming the RGB image to the CIE L*u*v* color space, in order to decouple brightness from color information, then increasing contrast using contrast-limited adaptive histogram equalization (CLAHE), followed by artifact removal using a Gaussian filter. After preprocessing, the Chan-Vese technique segments the preprocessed images to create a lesion mask which undergoes a morphological closing operation. Next, the largest central blob in the lesion is detected, after which the blob is dilated to generate an image output mask. Finally, the automatically-generated mask is compared to the manual mask by calculating the XOR error [3]. Our border detection algorithm was developed using training and test sets of 30 and 20 images, respectively. This detection method was compared to the SRM method [4] by calculating the average XOR error for each of the two algorithms. Average error for test images was 0.10 using the new algorithm and 0.99 using the SRM method. In comparing the average error values produced by the two algorithms, it is evident that the average XOR error for our technique is lower than that of the SRM method, implying that the new algorithm detects melanoma borders more accurately than the SRM algorithm.
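    The preprocessing and segmentation chain can be approximated with scikit-image as follows; note that grayscale luminance is used here instead of the CIE L*u*v* space, and the CLAHE, Gaussian, and structuring-element parameters are assumptions.

```python
from skimage import color, exposure, filters, morphology, segmentation

def lesion_mask(rgb_image):
    """Preprocess a dermoscopy image and segment the lesion with Chan-Vese.

    The paper works in the CIE L*u*v* space; here a grayscale conversion
    approximates the luminance channel to keep the sketch short.
    """
    gray = color.rgb2gray(rgb_image)
    gray = exposure.equalize_adapthist(gray, clip_limit=0.02)   # CLAHE
    gray = filters.gaussian(gray, sigma=2)                      # suppress hair/bubbles
    mask = segmentation.chan_vese(gray, mu=0.25)                # lesion segmentation
    mask = morphology.closing(mask, morphology.disk(5))         # morphological closing
    return mask
```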

  5. Leapfrogging into new territory: How Mascarene ridged frogs diversified across Africa and Madagascar to maintain their ecological niche.

    PubMed

    Zimkus, Breda M; Lawson, Lucinda P; Barej, Michael F; Barratt, Christopher D; Channing, Alan; Dash, Katrina M; Dehling, J Maximilian; Du Preez, Louis; Gehring, Philip-Sebastian; Greenbaum, Eli; Gvoždík, Václav; Harvey, James; Kielgast, Jos; Kusamba, Chifundera; Nagy, Zoltán T; Pabijan, Maciej; Penner, Johannes; Rödel, Mark-Oliver; Vences, Miguel; Lötters, Stefan

    2017-01-01

    The Mascarene ridged frog, Ptychadena mascareniensis, is a species complex that includes numerous lineages occurring mostly in humid savannas and open forests of mainland Africa, Madagascar, the Seychelles, and the Mascarene Islands. Sampling across this broad distribution presents an opportunity to examine the genetic differentiation within this complex and to investigate how the evolution of bioclimatic niches may have shaped current biogeographic patterns. Using model-based phylogenetic methods and molecular-clock dating, we constructed a time-calibrated molecular phylogenetic hypothesis for the group based on mitochondrial 16S rRNA and cytochrome b (cytb) genes and the nuclear RAG1 gene from 173 individuals. Haplotype networks were reconstructed and species boundaries were investigated using three species-delimitation approaches: Bayesian generalized mixed Yule-coalescent model (bGMYC), the Poisson Tree Process model (PTP) and a cluster algorithm (SpeciesIdentifier). Estimates of similarity in bioclimatic niche were calculated from species-distribution models (maxent) and multivariate statistics (Principal Component Analysis, Discriminant Function Analysis). Ancestral-area reconstructions were performed on the phylogeny using probabilistic approaches implemented in BioGeoBEARS. We detected high levels of genetic differentiation yielding ten distinct lineages or operational taxonomic units, and Central Africa was found to be a diversity hotspot for these frogs. Most speciation events took place throughout the Miocene, including "out-of-Africa" overseas dispersal events to Madagascar in the East and to São Tomé in the West. Bioclimatic niche was remarkably well conserved, with most species tolerating similar temperature and rainfall conditions common to the Central African region. The P. mascareniensis complex provides insights into how bioclimatic niche shaped the current biogeographic patterns with niche conservatism being exhibited by the Central African radiation and niche divergence shaping populations in West Africa and Madagascar. Central Africa, including the Albertine Rift region, has been an important center of diversification for this species complex. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  7. Final report for “Extreme-scale Algorithms and Solver Resilience”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, William Douglas

    2017-06-30

    This is a joint project with principal investigators at Oak Ridge National Laboratory, Sandia National Laboratories, the University of California at Berkeley, and the University of Tennessee. Our part of the project involves developing performance models for highly scalable algorithms and the development of latency tolerant iterative methods. During this project, we extended our performance models for the Multigrid method for solving large systems of linear equations and conducted experiments with highly scalable variants of conjugate gradient methods that avoid blocking synchronization. In addition, we worked with the other members of the project on alternative techniques for resilience and reproducibility. We also presented an alternative approach for reproducible dot-products in parallel computations that performs almost as well as the conventional approach by separating the order of computation from the details of the decomposition of vectors across the processes.

  8. Localized Dictionaries Based Orientation Field Estimation for Latent Fingerprints.

    PubMed

    Xiao Yang; Jianjiang Feng; Jie Zhou

    2014-05-01

    Dictionary based orientation field estimation approach has shown promising performance for latent fingerprints. In this paper, we seek to exploit stronger prior knowledge of fingerprints in order to further improve the performance. Realizing that ridge orientations at different locations of fingerprints have different characteristics, we propose a localized dictionaries-based orientation field estimation algorithm, in which noisy orientation patch at a location output by a local estimation approach is replaced by real orientation patch in the local dictionary at the same location. The precondition of applying localized dictionaries is that the pose of the latent fingerprint needs to be estimated. We propose a Hough transform-based fingerprint pose estimation algorithm, in which the predictions about fingerprint pose made by all orientation patches in the latent fingerprint are accumulated. Experimental results on challenging latent fingerprint datasets show the proposed method outperforms previous ones markedly.

  9. Y-12 Groundwater Protection Program Groundwater and Surface Water Sampling and Analysis Plan for Calendar Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elvado Environmental LLC for the Environmental Compliance Department ES&H Division, Y-12 National Security Complex Oak Ridge, Tennessee

    2003-09-30

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2004 at the U.S. Department of Energy (DOE) Y-12 National Security Complex that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2004 will be in accordance with the following requirements of DOE Order 5400.1: (1) to maintain surveillance of existing and potential groundwater contamination sources; (2) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (3) to identify and characterize long-term trends in groundwater quality at Y-12; and (4) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2004 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge, along the boundary of the Oak Ridge Reservation (Figure A.1). Modifications to the CY 2004 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells, or wells could be added or removed from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan.

  10. Y-12 Groundwater Protection Program Groundwater and Surface Water Sampling and Analysis Plan for Calendar Year 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2004-09-30

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2005 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2005 will be in accordance with DOE Order 5400.1 requirements and the following goals: (1) to maintain surveillance of existing and potential groundwater contamination sources; (2) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (3) to identify and characterize long-term trends in groundwater quality at Y-12; and (4) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2005 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge, along the boundary of the Oak Ridge Reservation (Figure A.1). Modifications to the CY 2005 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan.

  11. Detachment Fault Behavior Revealed by Micro-Seismicity at 13°N, Mid-Atlantic Ridge

    NASA Astrophysics Data System (ADS)

    Parnell-Turner, R. E.; Sohn, R. A.; MacLeod, C. J.; Peirce, C.; Reston, T. J.; Searle, R. C.

    2016-12-01

    Under certain tectono-magmatic conditions, crustal accretion and extension at slow-spreading mid-ocean ridges is accommodated by low-angle detachment faults. While it is now generally accepted that oceanic detachments initiate on steeply dipping faults that rotate to low-angles at shallow depths, many details of their kinematics remain unknown. Debate has continued between a "continuous" model, where a single, undulating detachment surface underlies an entire ridge segment, and a "discrete" (or discontinuous) model, where detachments are spatially restricted and ephemeral. Here we present results from a passive microearthquake study of detachment faulting at the 13°N region of the Mid-Atlantic Ridge. This study is one component of a joint US-UK seismic study to constrain the sub-surface structure and 3-dimensional geometry of oceanic detachment faults. We detected over 300,000 microearthquakes during a 6-month deployment of 25 ocean bottom seismographs. Events are concentrated in two 1-2 km wide ridge-parallel bands, located between the prominent corrugated detachment fault surface at 13°20'N and the present-day spreading axis, separated by a 1-km wide patch of reduced seismicity. These two bands are 7-8 km in length parallel to the ridge and are clearly limited in spatial extent to the north and south. Events closest to the axis are generally at depths of 6-8 km, while those nearest to the oceanic detachment fault are shallower, at 4-6 km. There is an overall trend of deepening seismicity northwards, with events occurring progressively deeper by 4 km over an along-axis length of 8 km. Events are typically very small, and range in local magnitude from ML -1 to 3. Focal mechanisms indicate two modes of deformation, with extension nearest to the axis and compression at shallower depths near to the detachment fault termination.

  12. Strand-plain evidence for late Holocene lake-level variations in Lake Michigan

    USGS Publications Warehouse

    Thompson, T.A.; Baedke, S.J.

    1997-01-01

    Lake level is a primary control on shoreline behavior in Lake Michigan. The historical record from lake-level gauges is the most accurate source of information on past lake levels, but the short duration of the record does not permit the recognition of long-term patterns of lake-level change (longer than a decade or two). To extend the record of lake-level change, the internal architecture and timing of development of five strand plains of late Holocene beach ridges along the Lake Michigan coastline were studied. Relative lake-level curves for each site were constructed by determining the elevation of foreshore (swash zone) sediments in the beach ridges and by dating basal wetland sediments in the swales between ridges. These curves detect long-term (30+ yr) lake-level variations and differential isostatic adjustments over the past 4700 yr at a greater resolution than achieved by other studies. The average timing of beach-ridge development for all sites is between 29 and 38 yr/ridge. This correspondence occurs in spite of the embayments containing the strand plains being different in size, orientation, hydrographic regime, and available sediment type and caliber. If not coincidental, all sites responded to a lake-level fluctuation of a little more than three decades in duration and a range of 0.5 to 0.6 m. Most pronounced in the relative lake-level curves is a fluctuation of 120-180 yr in duration. This ~150 yr variation is defined by groups of four to six ridges that show a rise and fall in foreshore elevations of 0.5 to 1.5 m within the group. The 150 yr variation can be correlated between sites in the Lake Michigan basin. The ~30 and ~150 yr fluctuations are superimposed on a long-term loss of water to the Lake Michigan basin and differential rates of isostatic adjustment.

  13. A stationary wavelet transform and a time-frequency based spike detection algorithm for extracellular recorded data

    NASA Astrophysics Data System (ADS)

    Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane

    2017-06-01

    Objective. Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. Deciding whether a specific part of the signal is a spike or not is important for all subsequent processing steps, such as spike sorting or burst detection, since it reduces the number of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms have a poor performance. Approach. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than the most commonly used methods. Main results. The performance of the algorithms is confirmed by using simulated data, resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to only one specific set of data, we also verify the performance using a simulated publicly available data set. We show that both proposed algorithms have the best performance among all tested methods, regardless of the signal-to-noise ratio, in both data sets. Significance. This contribution will benefit electrophysiological investigations of human cells; in particular, the spatial and temporal analysis of neural network communication is improved by using the proposed spike detection algorithms.

  14. Crowdsourcing seizure detection: algorithm development and validation on human implanted device recordings.

    PubMed

    Baldassano, Steven N; Brinkmann, Benjamin H; Ung, Hoameng; Blevins, Tyler; Conrad, Erin C; Leyde, Kent; Cook, Mark J; Khambhati, Ankit N; Wagenaar, Joost B; Worrell, Gregory A; Litt, Brian

    2017-06-01

    There exist significant clinical and basic research needs for accurate, automated seizure detection algorithms. These algorithms have translational potential in responsive neurostimulation devices and in automatic parsing of continuous intracranial electroencephalography data. An important barrier to developing accurate, validated algorithms for seizure detection is limited access to high-quality, expertly annotated seizure data from prolonged recordings. To overcome this, we hosted a kaggle.com competition to crowdsource the development of seizure detection algorithms using intracranial electroencephalography from canines and humans with epilepsy. The top three performing algorithms from the contest were then validated on out-of-sample patient data including standard clinical data and continuous ambulatory human data obtained over several years using the implantable NeuroVista seizure advisory system. Two hundred teams of data scientists from all over the world participated in the kaggle.com competition. The top performing teams submitted highly accurate algorithms with consistent performance in the out-of-sample validation study. The performance of these seizure detection algorithms, achieved using freely available code and data, sets a new reproducible benchmark for personalized seizure detection. We have also shared a 'plug and play' pipeline to allow other researchers to easily use these algorithms on their own datasets. The success of this competition demonstrates how sharing code and high quality data results in the creation of powerful translational tools with significant potential to impact patient care.

  15. A novel data-driven learning method for radar target detection in nonstationary environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akcakaya, Murat; Nehorai, Arye; Sen, Satyabrata

    Most existing radar algorithms are developed under the assumption that the environment (clutter) is stationary. However, in practice, the characteristics of the clutter can vary enormously depending on the radar-operational scenarios. If unaccounted for, these nonstationary variabilities may drastically hinder the radar performance. Therefore, to overcome such shortcomings, we develop a data-driven method for target detection in nonstationary environments. In this method, the radar dynamically detects changes in the environment and adapts to these changes by learning the new statistical characteristics of the environment and by intelligibly updating its statistical detection algorithm. Specifically, we employ drift detection algorithms to detect changes in the environment; incremental learning, particularly learning under concept drift algorithms, to learn the new statistical characteristics of the environment from the new radar data that become available in batches over a period of time. The newly learned environment characteristics are then integrated in the detection algorithm. Furthermore, we use Monte Carlo simulations to demonstrate that the developed method provides a significant improvement in the detection performance compared with detection techniques that are not aware of the environmental changes.

  16. Advanced power system protection and incipient fault detection and protection of spaceborne power systems

    NASA Technical Reports Server (NTRS)

    Russell, B. Don

    1989-01-01

    This research concentrated on the application of advanced signal processing, expert system, and digital technologies for the detection and control of low grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and the protection of terrestrial power systems. This experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of the above areas. Several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments. Several digital architectures were developed and evaluated in light of the fault detection algorithms.

  17. Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters

    NASA Astrophysics Data System (ADS)

    Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon

    2018-04-01

    In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold adapts to the changes of diverse illumination conditions throughout the day. This leads to greater vehicle detection performance compared to a fixed user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, namely Fuzzy-based Vehicle Analysis (FBA), in order to reduce false estimates in vehicle tracking caused by the uneven edges of large vehicles and by vehicles changing lanes. The proposed FBA algorithm employs the average edge density and the Horizontal Moving Edge Detection (HMED) algorithm to alleviate those problems by adopting fuzzy rule-based algorithms to rectify the vehicle tracking. The experimental results demonstrate that the proposed system provides high vehicle detection accuracy of about 98.22%. In addition, it also offers a low false detection rate of about 3.92%.
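
    The internals of the GATE algorithm are not given in the abstract; as a rough illustration, the minimal Python sketch below assumes the adaptive threshold is derived from each frame's gradient-magnitude statistics, with the factor `k` being a hypothetical tuning parameter rather than a value from the paper.

```python
import cv2
import numpy as np

def adaptive_sobel_edges(gray, k=1.5):
    """Sobel edge map with a frame-adaptive threshold.

    The threshold is taken from the gradient-magnitude statistics
    (mean + k * std), so it follows changes in scene illumination.
    `k` is an illustrative constant, not taken from the paper.
    """
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    thresh = mag.mean() + k * mag.std()   # adapts to illumination changes
    return (mag > thresh).astype(np.uint8) * 255

# Example usage on one frame of a traffic video (hypothetical file name):
# frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
# edges = adaptive_sobel_edges(frame)
```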

  18. Automatic arrival time detection for earthquakes based on Modified Laplacian of Gaussian filter

    NASA Astrophysics Data System (ADS)

    Saad, Omar M.; Shalaby, Ahmed; Samy, Lotfy; Sayed, Mohammed S.

    2018-04-01

    Precise identification of an earthquake's onset time is essential for correctly determining its location and other parameters that are utilized for building seismic catalogues. P-wave arrivals of weak events or micro-earthquakes cannot be precisely detected due to background noise. In this paper, we propose a novel approach based on a Modified Laplacian of Gaussian (MLoG) filter to detect the onset time even at very low signal-to-noise ratios (SNRs). The proposed algorithm utilizes a denoising-filter algorithm to smooth the background noise. In the proposed algorithm, we employ the MLoG mask to filter the seismic data. Afterward, we apply a dual-threshold comparator to detect the onset time of the event. The results show that the proposed algorithm can detect the onset time for micro-earthquakes accurately, even at an SNR of -12 dB. The proposed algorithm achieves an onset time picking accuracy of 93% with a standard deviation error of 0.10 s for 407 field seismic waveforms. We also compare the results with the short-term/long-term average (STA/LTA) algorithm and the Akaike Information Criterion (AIC), and the proposed algorithm outperforms both.
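
    The exact MLoG mask is not specified in the abstract, so the sketch below substitutes a standard Laplacian-of-Gaussian smoother and a simple dual-threshold comparator on the filtered envelope; the `sigma`, `high`, and `low` parameters and the assumption that the first second is pre-event noise are all illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def pick_onset(trace, fs, sigma=0.05, high=3.0, low=1.5):
    """Sketch of LoG-based onset picking with a dual-threshold comparator.

    `high`/`low` are hypothetical thresholds expressed in units of the
    noise standard deviation; the paper's MLoG mask is not reproduced.
    """
    smoothed = gaussian_laplace(trace.astype(float), sigma=sigma * fs)
    env = np.abs(smoothed)
    noise = env[: int(fs)]                 # assume the first second is pre-event noise
    hi_t, lo_t = high * noise.std(), low * noise.std()
    above = np.where(env > hi_t)[0]
    if above.size == 0:
        return None                        # no trigger above the high threshold
    onset = above[0]
    # walk back to where the envelope last dropped below the low threshold
    while onset > 0 and env[onset] > lo_t:
        onset -= 1
    return onset / fs                      # onset time in seconds
```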

  19. Robust Crop and Weed Segmentation under Uncontrolled Outdoor Illumination

    PubMed Central

    Jeon, Hong Y.; Tian, Lei F.; Zhu, Heping

    2011-01-01

    An image processing algorithm for detecting individual weeds was developed and evaluated. Weed detection processes included normalized excessive green conversion, statistical threshold value estimation, adaptive image segmentation, median filtering, morphological feature calculation and an Artificial Neural Network (ANN). The developed algorithm was validated for its ability to identify and detect weeds and crop plants under uncontrolled outdoor illumination. A field robot implementing machine vision captured field images under outdoor illumination and the image processing algorithm automatically processed them without manual adjustment. The errors of the algorithm, when processing 666 field images, ranged from 2.1 to 2.9%. The ANN correctly detected 72.6% of crop plants from the identified plants, and considered the rest as weeds. However, the ANN identification rates for crop plants were improved up to 95.1% by addressing the error sources in the algorithm. The developed weed detection and image processing algorithm provides a novel method to identify plants against the soil background under uncontrolled outdoor illumination, and to differentiate weeds from crop plants. Thus, the proposed new machine vision and processing algorithm may be useful for outdoor applications including plant-specific direct applications (PSDA). PMID:22163954

  20. Detection and Tracking of Dynamic Objects by Using a Multirobot System: Application to Critical Infrastructures Surveillance

    PubMed Central

    Rodríguez-Canosa, Gonzalo; Giner, Jaime del Cerro; Barrientos, Antonio

    2014-01-01

    The detection and tracking of mobile objects (DATMO) is progressively gaining importance for security and surveillance applications. This article proposes a set of new algorithms and procedures for detecting and tracking mobile objects by robots that work collaboratively as part of a multirobot system. These surveillance algorithms are conceived to work with data provided by long-distance range sensors and are intended for highly reliable object detection in wide outdoor environments. Contrary to most common approaches, in which detection and tracking are done by an integrated procedure, the approach proposed here relies on a modular structure, in which detection and tracking are carried out independently, and the latter might accept input data from different detection algorithms. Two movement detection algorithms have been developed for the detection of dynamic objects by using both static and/or mobile robots. The solution to the overall problem is based on the use of a Kalman filter to predict the next state of each tracked object. Additionally, new tracking algorithms capable of combining dynamic object lists coming from either one or various sources complete the solution. The complementary performance of the separated modular structure for detection and identification is evaluated and, finally, a selection of test examples is discussed. PMID:24526305

  1. Detecting and visualizing weak signatures in hyperspectral data

    NASA Astrophysics Data System (ADS)

    MacPherson, Duncan James

    This thesis evaluates existing techniques for detecting weak spectral signatures from remotely sensed hyperspectral data. Algorithms are presented that successfully detect hard-to-find 'mystery' signatures in unknown cluttered backgrounds. The term 'mystery' is used to describe a scenario where the spectral target and background endmembers are unknown. Sub-Pixel analysis and background suppression are used to find deeply embedded signatures which can be less than 10% of the total signal strength. Existing 'mystery target' detection algorithms are derived and compared. Several techniques are shown to be superior both visually and quantitatively. Detection performance is evaluated using confidence metrics that are developed. A multiple algorithm approach is shown to improve detection confidence significantly. Although the research focuses on remote sensing applications, the algorithms presented can be applied to a wide variety of diverse fields such as medicine, law enforcement, manufacturing, earth science, food production, and astrophysics. The algorithms are shown to be general and can be applied to both the reflective and emissive parts of the electromagnetic spectrum. The application scope is a broad one and the final results open new opportunities for many specific applications including: land mine detection, pollution and hazardous waste detection, crop abundance calculations, volcanic activity monitoring, detecting diseases in food, automobile or airplane target recognition, cancer detection, mining operations, extracting galactic gas emissions, etc.

  2. Algorithms and data structures for automated change detection and classification of sidescan sonar imagery

    NASA Astrophysics Data System (ADS)

    Gendron, Marlin Lee

    During Mine Warfare (MIW) operations, MIW analysts perform change detection by visually comparing historical sidescan sonar imagery (SSI) collected by a sidescan sonar with recently collected SSI in an attempt to identify objects (which might be explosive mines) placed at sea since the last time the area was surveyed. This dissertation presents a data structure and three algorithms, developed by the author, that are part of an automated change detection and classification (ACDC) system. MIW analysts at the Naval Oceanographic Office are currently using ACDC to reduce the amount of time required to perform change detection. The dissertation introductory chapter gives background information on change detection and ACDC, and describes how SSI is produced from raw sonar data. Chapter 2 presents the author's Geospatial Bitmap (GB) data structure, which is capable of storing information geographically and is utilized by the three algorithms. This chapter shows that a GB data structure used in a polygon-smoothing algorithm ran between 1.3 and 48.4 times faster than a sparse matrix data structure. Chapter 3 describes the GB clustering algorithm, which is the author's repeatable, order-independent method for clustering. Results from tests performed in this chapter show that the time to cluster a set of points is not affected by the distribution or the order of the points. In Chapter 4, the author presents his real-time computer-aided detection (CAD) algorithm that automatically detects mine-like objects on the seafloor in SSI. The author ran his GB-based CAD algorithm on real SSI data, and results of these tests indicate that his real-time CAD algorithm performs comparably to or better than other non-real-time CAD algorithms. The author presents his computer-aided search (CAS) algorithm in Chapter 5. CAS helps MIW analysts locate mine-like features that are geospatially close to previously detected features. A comparison between the CAS and a great circle distance algorithm shows that the CAS performs geospatial searching 1.75 times faster on large data sets. Finally, the concluding chapter of this dissertation gives important details on how the completed ACDC system will function, and discusses the author's future research to develop additional algorithms and data structures for ACDC.

  3. Computer algorithms for automated detection and analysis of local Ca2+ releases in spontaneously beating cardiac pacemaker cells

    PubMed Central

    Kim, Mary S.; Tsutsui, Kenta; Stern, Michael D.; Lakatta, Edward G.; Maltsev, Victor A.

    2017-01-01

    Local Ca2+ Releases (LCRs) are crucial events involved in cardiac pacemaker cell function. However, specific algorithms for automatic LCR detection and analysis have not been developed in live, spontaneously beating pacemaker cells. In the present study we measured LCRs using a high-speed 2D-camera in spontaneously contracting sinoatrial (SA) node cells isolated from rabbit and guinea pig and developed a new algorithm capable of detecting and analyzing the LCRs spatially in two dimensions and in time. Our algorithm tracks points along the midline of the contracting cell. It uses these points as a coordinate system for affine transform, producing a transformed image series where the cell does not contract. Action potential-induced Ca2+ transients and LCRs were thereafter isolated from recording noise by applying a series of spatial filters. The LCR birth and death events were detected by a differential (frame-to-frame) sensitivity algorithm applied to each pixel (cell location). An LCR was detected when its signal changed sufficiently quickly within a sufficiently large area. The LCR is considered to have died when its amplitude decays substantially, or when it merges into the rising whole cell Ca2+ transient. Ultimately, our algorithm provides major LCR parameters such as period, signal mass, duration, and propagation path area. As the LCRs propagate within live cells, the algorithm identifies splitting and merging behaviors, indicating the importance of locally propagating Ca2+-induced-Ca2+-release for the fate of LCRs and for generating a powerful ensemble Ca2+ signal. Thus, our new computer algorithms eliminate motion artifacts and detect 2D local spatiotemporal events from recording noise and global signals. While the algorithms were developed to detect LCRs in sinoatrial nodal cells, they have the potential to be used in other applications in biophysics and cell physiology, for example, to detect Ca2+ wavelets (abortive waves), sparks and embers in muscle cells and Ca2+ puffs and syntillas in neurons. PMID:28683095

  4. A novel fast phase correlation algorithm for peak wavelength detection of Fiber Bragg Grating sensors.

    PubMed

    Lamberti, A; Vanlanduit, S; De Pauw, B; Berghmans, F

    2014-03-24

    Fiber Bragg Gratings (FBGs) can be used as sensors for strain, temperature and pressure measurements. For this purpose, the ability to determine the Bragg peak wavelength with adequate wavelength resolution and accuracy is essential. However, conventional peak detection techniques, such as the maximum detection algorithm, can yield inaccurate and imprecise results, especially when the Signal to Noise Ratio (SNR) and the wavelength resolution are poor. Other techniques, such as the cross-correlation demodulation algorithm, are more precise and accurate but require considerably higher computational effort. To overcome these problems, we developed a novel fast phase correlation (FPC) peak detection algorithm, which computes the wavelength shift in the reflected spectrum of a FBG sensor. This paper analyzes the performance of the FPC algorithm for different values of the SNR and wavelength resolution. Using simulations and experiments, we compared the FPC with the maximum detection and cross-correlation algorithms. The FPC method demonstrated a detection precision and accuracy comparable with those of cross-correlation demodulation and considerably higher than those obtained with the maximum detection technique. Additionally, the FPC was shown to be about 50 times faster than cross-correlation. It is therefore a promising tool for future implementation in real-time systems or in embedded hardware intended for FBG sensor interrogation.
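
    The authors' exact FPC formulation is not given in the abstract; the sketch below is a generic phase-correlation shift estimator between a reference and a measured FBG spectrum, with sub-sample refinement by parabolic interpolation. The wavelength step `d_lambda` and the overall sign convention are assumptions for illustration.

```python
import numpy as np

def spectral_shift(ref, meas, d_lambda):
    """Estimate the wavelength shift between two FBG reflection spectra.

    Generic phase-correlation sketch (not the authors' exact FPC method):
    the normalized cross-power spectrum is inverted and the correlation
    peak is refined with parabolic interpolation to sub-sample precision.
    `d_lambda` is the wavelength step of the interrogator (e.g. in nm).
    """
    F1, F2 = np.fft.fft(ref), np.fft.fft(meas)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep only phase information
    corr = np.real(np.fft.ifft(cross))
    k = int(np.argmax(corr))
    # parabolic interpolation around the peak for sub-sample accuracy
    y0, y1, y2 = corr[(k - 1) % len(corr)], corr[k], corr[(k + 1) % len(corr)]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2 + 1e-12)
    shift = k + delta
    if shift > len(corr) / 2:               # wrap shifts into a signed range
        shift -= len(corr)
    return shift * d_lambda                 # wavelength shift in spectrum units
```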

  5. Wearable physiological sensors and real-time algorithms for detection of acute mountain sickness.

    PubMed

    Muza, Stephen R

    2018-03-01

    This is a minireview of potential wearable physiological sensors and algorithms (process and equations) for detection of acute mountain sickness (AMS). Given the emerging status of this effort, the focus of the review is on the current clinical assessment of AMS, known risk factors (environmental, demographic, and physiological), and current understanding of AMS pathophysiology. Studies that have examined a range of physiological variables to develop AMS prediction and/or detection algorithms are reviewed to provide insight and potential technological roadmaps for future development of real-time physiological sensors and algorithms to detect AMS. Given the lack of signs and nonspecific symptoms associated with AMS, development of wearable physiological sensors and embedded algorithms to predict in the near term or detect established AMS will be challenging. Prior work using SpO2, HR, or HRv has not provided the sensitivity and specificity for useful application to predict or detect AMS. Rather than using spot checks as most prior studies have, wearable systems that continuously measure SpO2 and HR are commercially available. Employing other statistical modeling approaches such as general linear and logistic mixed models or time series analysis to these continuously measured variables is the most promising approach for developing algorithms that are sensitive and specific for physiological prediction or detection of AMS.

  6. Real time algorithms for sharp wave ripple detection.

    PubMed

    Sethi, Ankit; Kemere, Caleb

    2014-01-01

    Neural activity during sharp wave ripples (SWR), short bursts of co-ordinated oscillatory activity in the CA1 region of the rodent hippocampus, is implicated in a variety of memory functions from consolidation to recall. Detection of these events in an algorithmic framework has thus far relied on simple thresholding techniques with heuristically derived parameters. This study is an investigation into testing and improving the current methods for detection of SWR events in neural recordings. We propose and profile methods to reduce latency in ripple detection. Proposed algorithms are tested on simulated ripple data. The findings show that simple real-time algorithms can improve upon existing power thresholding methods and can detect ripple activity with latencies in the range of 10-20 ms.
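
    For context, the classic power-threshold baseline that the paper sets out to improve can be sketched as follows; the ripple band, smoothing window, and threshold of mean plus a few standard deviations are illustrative values, not the paper's low-latency algorithms.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_swr(lfp, fs, n_sd=3.0, smooth_ms=8):
    """Sketch of a conventional power-threshold SWR detector.

    The LFP is band-pass filtered in the ripple band (150-250 Hz), the
    squared signal is smoothed, and samples exceeding mean + n_sd * SD
    are flagged as candidate ripple samples.
    """
    b, a = butter(3, [150 / (fs / 2), 250 / (fs / 2)], btype="band")
    ripple = filtfilt(b, a, lfp)
    power = ripple ** 2
    win = max(1, int(smooth_ms * fs / 1000))
    envelope = np.convolve(power, np.ones(win) / win, mode="same")
    thr = envelope.mean() + n_sd * envelope.std()
    return envelope > thr          # boolean mask of candidate ripple samples
```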

  7. Glint-induced false alarm reduction in signature adaptive target detection

    NASA Astrophysics Data System (ADS)

    Crosby, Frank J.

    2002-07-01

    The signal adaptive target detection algorithm developed by Crosby and Riley uses target geometry to discern anomalies in local backgrounds. Detection is not restricted based on specific target signatures. The robustness of the algorithm is limited by an increased false alarm potential. The base algorithm is extended to eliminate one common source of false alarms in a littoral environment. This common source is glint reflected on the surface of water. The spectral and spatial transience of glint prevents straightforward characterization and complicates exclusion. However, the statistical basis of the detection algorithm and its inherent computations allow for glint discernment and the removal of its influence.

  8. Systolic peak detection in acceleration photoplethysmograms measured from emergency responders in tropical conditions.

    PubMed

    Elgendi, Mohamed; Norton, Ian; Brearley, Matt; Abbott, Derek; Schuurmans, Dale

    2013-01-01

    Photoplethysmogram (PPG) monitoring is not only essential for critically ill patients in hospitals or at home, but also for those undergoing exercise testing. However, processing PPG signals measured after exercise is challenging, especially if the environment is hot and humid. In this paper, we propose a novel algorithm that can detect systolic peaks under challenging conditions, as in the case of emergency responders in tropical conditions. Accurate systolic-peak detection is an important first step for the analysis of heart rate variability. Algorithms based on local maxima-minima, first-derivative, and slope sum are evaluated, and a new algorithm is introduced to improve the detection rate. With 40 healthy subjects, the new algorithm demonstrates the highest overall detection accuracy (99.84% sensitivity, 99.89% positive predictivity). Existing algorithms, such as Billauer's, Li's and Zong's, have comparable although lower accuracy. However, the proposed algorithm presents an advantage for real-time applications by avoiding human intervention in threshold determination. For best performance, we show that a combination of two event-related moving averages with an offset threshold has an advantage in detecting systolic peaks, even in heat-stressed PPG signals.
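
    The abstract describes the detector as a combination of two event-related moving averages with an offset threshold; the sketch below implements that general idea on a PPG trace. The window lengths and the offset factor `beta` are illustrative choices, not the exact parameters reported by the authors.

```python
import numpy as np

def detect_systolic_peaks(ppg, fs, w_peak=0.111, w_beat=0.667, beta=0.02):
    """Sketch of systolic-peak detection with two event-related moving averages.

    `w_peak` approximates the systolic-peak duration and `w_beat` one
    heartbeat (both in seconds); `beta` scales the offset threshold.
    """
    x = np.clip(ppg, 0, None) ** 2                     # squared, clipped signal

    def moving_avg(sig, w):
        n = max(1, int(round(w * fs)))
        return np.convolve(sig, np.ones(n) / n, mode="same")

    ma_peak = moving_avg(x, w_peak)
    ma_beat = moving_avg(x, w_beat)
    thr = ma_beat + beta * x.mean()                    # offset threshold
    blocks = ma_peak > thr                             # blocks of interest
    peaks, i = [], 0
    while i < len(blocks):
        if blocks[i]:
            j = i
            while j < len(blocks) and blocks[j]:
                j += 1
            if (j - i) >= int(round(w_peak * fs)):     # ignore very short blocks
                peaks.append(i + int(np.argmax(ppg[i:j])))
            i = j
        else:
            i += 1
    return np.array(peaks)
```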

  9. An incremental anomaly detection model for virtual machines.

    PubMed

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. In addition, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing nature, which causes the algorithm to exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large-scale and highly dynamic nature of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.

  10. An incremental anomaly detection model for virtual machines

    PubMed Central

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. In addition, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing nature, which causes the algorithm to exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large-scale and highly dynamic nature of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245

  11. Toward detecting deception in intelligent systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Johnson, Gregory, Jr.

    2004-08-01

    Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources including electronic sources such as knowledge-based diagnostic or decision support systems or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional disinformation as well as intentional misinformation. Our ongoing research focuses on employing models of deception and deception detection from the fields of psychology and cognitive science to these systems as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.

  12. Local Prediction Models on Mid-Atlantic Ridge MORB by Principal Component Regression

    NASA Astrophysics Data System (ADS)

    Ling, X.; Snow, J. E.; Chin, W.

    2017-12-01

    The isotopic compositions of the daughter isotopes of long-lived radioactive systems (Sr, Nd, Hf and Pb) can be used to map the scale and history of mantle heterogeneities beneath mid-ocean ridges. Our goal is to relate the multidimensional structure in the existing isotopic dataset with an underlying physical reality of mantle sources. The numerical technique of Principal Component Analysis is useful to reduce the linear dependence of the data to a minimum set of orthogonal eigenvectors encapsulating the information contained (cf. Agranier et al. 2005). The dataset used for this study covers almost all the MORBs along the Mid-Atlantic Ridge (MAR), from 54°S to 77°N and 8.8°W to -46.7°W, replicating the published dataset of Agranier et al. (2005) plus 53 basalt samples dredged and analyzed since then (data from PetDB). The principal components PC1 and PC2 account for 61.56% and 29.21%, respectively, of the total isotope ratio variability. Samples with compositions similar to HIMU, EM and DM are identified to better understand the PCs. PC1 and PC2 account for HIMU and EM, whereas PC2 has limited control over the DM source. PC3 is more strongly controlled by the depleted mantle source than PC2. This means that all three principal components are significantly related to the established mantle sources. We also tested the relationship between mantle heterogeneity and sample locality. The K-means clustering algorithm is an unsupervised learning method that finds groups in the data based on feature similarity. The PC factor scores of each sample are clustered into three groups. Clusters one and three alternate along the northern and southern MAR. Cluster two appears from 45.18°N to 0.79°N and -27.9°W to -30.40°W, alternating with cluster one. The ridge has been preliminarily divided into 16 sections considering both the clusters and ridge segments. The principal component regression models each section based on 6 isotope ratios and the PCs. The prediction residual is about 1-2 km, meaning that the combined isotope ratios are a strong predictor of geographic location along the ridge, a slightly surprising result. PCR is a robust and powerful method for both visualizing and manipulating the multidimensional representation of isotope data.
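
    The PCA, K-means, and principal component regression workflow described above can be sketched in a few lines of Python; the random matrix, column meanings, and latitude target below are placeholders standing in for the real PetDB isotope-ratio table, not data from the study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical isotope-ratio matrix: one row per MORB sample, columns such as
# 87Sr/86Sr, 143Nd/144Nd, 176Hf/177Hf, 206Pb/204Pb, 207Pb/204Pb, 208Pb/204Pb.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))              # placeholder for real PetDB data
lat = rng.uniform(-54, 77, size=100)       # placeholder sample latitudes

Xs = StandardScaler().fit_transform(X)     # isotope ratios sit on very different scales
pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)                 # PC factor scores per sample
print(pca.explained_variance_ratio_)       # how much PC1/PC2/PC3 each explain

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# Principal component regression: predict along-ridge position from the scores
pcr = LinearRegression().fit(scores, lat)
residual = lat - pcr.predict(scores)       # compare with the ~1-2 km residual reported
```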

  13. A multi-directional and multi-scale roughness filter to detect lineament segments on digital elevation models - analyzing spatial objects in R

    NASA Astrophysics Data System (ADS)

    Baumann, Sebastian; Robl, Jörg; Wendt, Lorenz; Willingshofer, Ernst; Hilberg, Sylke

    2016-04-01

    Automated lineament analysis on remotely sensed data requires two general process steps: the identification of neighboring pixels showing high contrast and the conversion of these domains into lines. The target output is the lineaments' position, extent and orientation. We developed a lineament extraction tool programmed in R that uses digital elevation models as input data to generate morphological lineaments, defined as follows: a morphological lineament represents a zone of high relief roughness whose length significantly exceeds its width. Any deviation from a flat plane, defined by a roughness threshold, is considered relief roughness. In our novel approach, a multi-directional and multi-scale roughness filter uses moving windows of different neighborhood sizes to identify threshold-limited rough domains on digital elevation models. Surface roughness is calculated as the vertical elevation difference between the center cell and the differently oriented straight lines connecting two edge cells of a neighborhood, divided by the horizontal distance of the edge cells. Thus, multiple roughness values depending on the neighborhood sizes and orientations of the edge-connecting lines are generated for each cell, and their maximum and minimum values are extracted. Negative roughness values represent concave relief structures such as valleys; positive values represent convex relief structures such as ridges. A threshold defines domains of high relief roughness. These domains are thinned to a representative point pattern by a 3x3 neighborhood filter, highlighting maximum and minimum roughness peaks and representing the center points of lineament segments. The orientation and extent of the lineament segments are calculated within the roughness domains, generating a straight line segment in the direction of least roughness differences. We tested our algorithm on digital elevation models of multiple sources and scales and compared the results visually with shaded relief maps of these digital elevation models. The lineament segments trace the relief structure to a great extent and the calculated roughness parameter represents the physical geometry of the digital elevation model. Modifying the threshold for the surface roughness value highlights different distinct relief structures. The neighborhood size at which lineament segments are detected also corresponds with the width of the surface structure and may be a useful additional parameter for further analysis. The discrimination of concave and convex relief structures matches the valleys and ridges of the surface very well.
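
    The roughness measure itself (center-cell elevation compared with the straight line between two opposite edge cells, normalized by their horizontal distance) can be illustrated with a small numpy sketch. The original tool is written in R; the Python version below handles only one window radius and four directions, whereas the full filter also varies the radius for multi-scale behavior.

```python
import numpy as np

def directional_roughness(dem, radius, cell_size):
    """Sketch of the multi-directional roughness measure for one window radius.

    For each interior cell the center elevation is compared with the midpoint
    of the straight line connecting two opposite edge cells of the
    (2*radius+1) neighborhood, normalized by the horizontal distance between
    those edge cells. Four directions are evaluated here.
    """
    r = radius
    rows, cols = dem.shape
    center = dem[r:rows - r, r:cols - r]
    rough_max = np.full(center.shape, -np.inf)
    rough_min = np.full(center.shape, np.inf)
    for dr, dc in [(0, r), (r, 0), (r, r), (r, -r)]:
        dist = 2.0 * np.hypot(dr, dc) * cell_size
        edge_a = dem[r + dr:rows - r + dr, r + dc:cols - r + dc]
        edge_b = dem[r - dr:rows - r - dr, r - dc:cols - r - dc]
        rough = (center - 0.5 * (edge_a + edge_b)) / dist
        rough_max = np.maximum(rough_max, rough)
        rough_min = np.minimum(rough_min, rough)
    return rough_max, rough_min   # >0: convex (ridge-like), <0: concave (valley-like)
```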

  14. The effect of orthology and coregulation on detecting regulatory motifs.

    PubMed

    Storms, Valerie; Claeys, Marleen; Sanchez, Aminael; De Moor, Bart; Verstuyf, Annemieke; Marchal, Kathleen

    2010-02-03

    Computational de novo discovery of transcription factor binding sites is still a challenging problem. The growing number of sequenced genomes allows integrating orthology evidence with coregulation information when searching for motifs. Moreover, the more advanced motif detection algorithms explicitly model the phylogenetic relatedness between the orthologous input sequences and thus should be well adapted towards using orthologous information. In this study, we evaluated the conditions under which complementing coregulation with orthologous information improves motif detection for the class of probabilistic motif detection algorithms with an explicit evolutionary model. We designed datasets (real and synthetic) covering different degrees of coregulation and orthologous information to test how well Phylogibbs and Phylogenetic sampler, as representatives of the motif detection algorithms with an evolutionary model, performed as compared to MEME, a more classical motif detection algorithm that treats orthologs independently. Under certain conditions, detecting motifs in the combined coregulation-orthology space is indeed more efficient than using each space separately, but this is not always the case. Moreover, the difference in success rate between the advanced algorithms and MEME is still marginal. The success rate of motif detection depends on the complex interplay between the added information and the specificities of the applied algorithms. Insights into this relation provide information useful to both developers and users. All benchmark datasets are available at http://homes.esat.kuleuven.be/~kmarchal/Supplementary_Storms_Valerie_PlosONE.

  15. The Effect of Orthology and Coregulation on Detecting Regulatory Motifs

    PubMed Central

    Storms, Valerie; Claeys, Marleen; Sanchez, Aminael; De Moor, Bart; Verstuyf, Annemieke; Marchal, Kathleen

    2010-01-01

    Background Computational de novo discovery of transcription factor binding sites is still a challenging problem. The growing number of sequenced genomes allows integrating orthology evidence with coregulation information when searching for motifs. Moreover, the more advanced motif detection algorithms explicitly model the phylogenetic relatedness between the orthologous input sequences and thus should be well adapted towards using orthologous information. In this study, we evaluated the conditions under which complementing coregulation with orthologous information improves motif detection for the class of probabilistic motif detection algorithms with an explicit evolutionary model. Methodology We designed datasets (real and synthetic) covering different degrees of coregulation and orthologous information to test how well Phylogibbs and Phylogenetic sampler, as representatives of the motif detection algorithms with an evolutionary model, performed as compared to MEME, a more classical motif detection algorithm that treats orthologs independently. Results and Conclusions Under certain conditions, detecting motifs in the combined coregulation-orthology space is indeed more efficient than using each space separately, but this is not always the case. Moreover, the difference in success rate between the advanced algorithms and MEME is still marginal. The success rate of motif detection depends on the complex interplay between the added information and the specificities of the applied algorithms. Insights into this relation provide information useful to both developers and users. All benchmark datasets are available at http://homes.esat.kuleuven.be/~kmarchal/Supplementary_Storms_Valerie_PlosONE. PMID:20140085

  16. Forward collision warning based on kernelized correlation filters

    NASA Astrophysics Data System (ADS)

    Pu, Jinchuan; Liu, Jun; Zhao, Yong

    2017-07-01

    A vehicle detection and tracking system is one of the indispensable methods to reduce the occurrence of traffic accidents. The nearest vehicle is the most likely to cause harm, so this paper focuses on the nearest vehicle in the region of interest (ROI). For this system, high accuracy, real-time operation and intelligence are the basic requirements. In this paper, we set up a system that combines the advanced KCF tracking algorithm with the Haar-AdaBoost detection algorithm. The KCF algorithm reduces computation time and increases speed through cyclic shifts and diagonalization, satisfying the real-time requirement. At the same time, Haar features offer similarly simple and fast detection. The combination of these two algorithms contributes to an obvious improvement of the system running rate compared with previous works. The detection result of the Haar-AdaBoost classifier provides the initial value for the KCF algorithm. This addresses the KCF algorithm's drawback of requiring manual vehicle marking in the initialization phase, making the system more automated and intelligent. Haar detection and KCF tracking with Histogram of Oriented Gradients (HOG) features ensure the accuracy of the system. We evaluate the performance of the framework on a self-collected dataset. The experimental results demonstrate that the proposed method is robust and real-time. The algorithm effectively adapts to illumination variation and meets the detection and tracking requirements even at night, which is an improvement over previous work.
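
    A minimal sketch of the detect-then-track idea using OpenCV is shown below. The cascade file "cars.xml", the video path, and the "largest box is the nearest vehicle" heuristic are assumptions; depending on the OpenCV build, the KCF tracker may be exposed as cv2.legacy.TrackerKCF_create instead, and the paper's fuzzy/HOG details are not reproduced.

```python
import cv2

# Hypothetical inputs: a Haar cascade trained for rear views of cars and a dashcam video.
cascade = cv2.CascadeClassifier("cars.xml")
cap = cv2.VideoCapture("dashcam.mp4")

tracker = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if tracker is None:
        # Detection phase: Haar-AdaBoost proposes the initial bounding box.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
        if len(boxes):
            x, y, w, h = max(boxes, key=lambda b: b[2] * b[3])  # nearest ~ largest box
            tracker = cv2.TrackerKCF_create()   # cv2.legacy.TrackerKCF_create on some builds
            tracker.init(frame, (int(x), int(y), int(w), int(h)))
    else:
        # Tracking phase: KCF follows the previously detected vehicle.
        found, box = tracker.update(frame)
        if not found:
            tracker = None                      # fall back to detection on failure
cap.release()
```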

  17. System and method for resolving gamma-ray spectra

    DOEpatents

    Gentile, Charles A.; Perry, Jason; Langish, Stephen W.; Silber, Kenneth; Davis, William M.; Mastrovito, Dana

    2010-05-04

    A system for identifying radionuclide emissions is described. The system includes at least one processor for processing output signals from a radionuclide detecting device, at least one training algorithm run by the at least one processor for analyzing data derived from at least one set of known sample data from the output signals, at least one classification algorithm derived from the training algorithm for classifying unknown sample data, wherein the at least one training algorithm analyzes the at least one sample data set to derive at least one rule used by said classification algorithm for identifying at least one radionuclide emission detected by the detecting device.

  18. A novel data-driven learning method for radar target detection in nonstationary environments

    DOE PAGES

    Akcakaya, Murat; Nehorai, Arye; Sen, Satyabrata

    2016-04-12

    Most existing radar algorithms are developed under the assumption that the environment (clutter) is stationary. However, in practice, the characteristics of the clutter can vary enormously depending on the radar-operational scenarios. If unaccounted for, these nonstationary variabilities may drastically hinder the radar performance. Therefore, to overcome such shortcomings, we develop a data-driven method for target detection in nonstationary environments. In this method, the radar dynamically detects changes in the environment and adapts to these changes by learning the new statistical characteristics of the environment and by intelligibly updating its statistical detection algorithm. Specifically, we employ drift detection algorithms to detect changes in the environment; incremental learning, particularly learning under concept drift algorithms, to learn the new statistical characteristics of the environment from the new radar data that become available in batches over a period of time. The newly learned environment characteristics are then integrated in the detection algorithm. Furthermore, we use Monte Carlo simulations to demonstrate that the developed method provides a significant improvement in the detection performance compared with detection techniques that are not aware of the environmental changes.

  19. Automatic cardiac cycle determination directly from EEG-fMRI data by multi-scale peak detection method.

    PubMed

    Wong, Chung-Ki; Luo, Qingfei; Zotev, Vadim; Phillips, Raquel; Chan, Kam Wai Clifford; Bodurka, Jerzy

    2018-03-31

    In simultaneous EEG-fMRI, identification of the period of the cardioballistic artifact (BCG) in EEG is required for artifact removal. Recording the electrocardiogram (ECG) waveform during fMRI is difficult, often causing inaccurate period detection. Since the waveform of the BCG extracted by independent component analysis (ICA) is relatively invariable compared to the ECG waveform, we propose a multi-scale peak-detection algorithm to determine the BCG cycle directly from the EEG data. The algorithm first extracts the high-contrast BCG component from the EEG data by ICA. The BCG cycle is then estimated by band-pass filtering the component around the fundamental frequency identified from its energy spectral density, and the peak of BCG artifact occurrence is selected from each estimated cycle. The algorithm is shown to achieve a high accuracy on a large EEG-fMRI dataset. It is also adaptive to various heart rates without the need to adjust the threshold parameters. The cycle detection remains accurate with the scan duration reduced to half a minute. Additionally, the algorithm gives a figure of merit to evaluate the reliability of the detection accuracy. The algorithm is shown to give a higher detection accuracy than the commonly used cycle detection algorithm fmrib_qrsdetect implemented in EEGLAB. The high cycle detection accuracy achieved by our algorithm without using ECG waveforms makes it possible to create and automate pipelines for processing large EEG-fMRI datasets, and virtually eliminates the need for ECG recordings for BCG artifact removal.
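
    The spectral-estimation and band-pass steps described above can be sketched as follows on an already-extracted BCG independent component; the ICA extraction, the figure-of-merit, and the exact filter settings from the paper are omitted, and the heart-rate search band and filter order are illustrative.

```python
import numpy as np
from scipy.signal import welch, butter, filtfilt, find_peaks

def bcg_peaks(bcg_ic, fs):
    """Sketch of cycle detection on an ICA-extracted BCG component.

    The fundamental (heart) frequency is taken from the Welch power spectrum,
    the component is band-pass filtered around it, and one peak per estimated
    cycle is selected.
    """
    f, pxx = welch(bcg_ic, fs=fs, nperseg=int(4 * fs))
    band = (f > 0.6) & (f < 2.5)                    # plausible heart-rate range, 36-150 bpm
    f0 = f[band][np.argmax(pxx[band])]              # fundamental frequency in Hz
    b, a = butter(3, [0.5 * f0 / (fs / 2), 1.5 * f0 / (fs / 2)], btype="band")
    narrow = filtfilt(b, a, bcg_ic)
    peaks, _ = find_peaks(narrow, distance=int(0.7 * fs / f0))
    return peaks, f0
```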

  20. Automated detection of hospital outbreaks: A systematic review of methods

    PubMed Central

    Buckeridge, David L.; Lepelletier, Didier

    2017-01-01

    Objectives Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. Methods We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Results Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Conclusion Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results. PMID:28441422

  1. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability when walking up or down stairs. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects while they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
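
    The core jerk-and-peak idea can be sketched as below; the paper's time-frequency analysis and terrain-specific heuristics are not reproduced, and the minimum inter-event interval and prominence criterion are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def gait_event_candidates(acc, fs, min_step_interval=0.4):
    """Sketch of jerk-based gait-event candidates from a leg-worn accelerometer.

    The jerk is the time derivative of the acceleration magnitude; candidate
    heel-strike (HS) and toe-off (TO) events are taken as prominent positive
    and negative jerk peaks.
    """
    mag = np.linalg.norm(acc, axis=1)            # acc: (n_samples, 3) in m/s^2
    jerk = np.gradient(mag, 1.0 / fs)
    dist = int(min_step_interval * fs)           # assume >= 0.4 s between same-type events
    hs, _ = find_peaks(jerk, distance=dist, prominence=jerk.std())
    to, _ = find_peaks(-jerk, distance=dist, prominence=jerk.std())
    return hs / fs, to / fs                      # candidate event times in seconds
```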

  2. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability when walking up or down stairs. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects while they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086

  3. Remote Sensing Image Change Detection Based on NSCT-HMT Model and Its Application.

    PubMed

    Chen, Pengyun; Zhang, Yichen; Jia, Zhenhong; Yang, Jie; Kasabov, Nikola

    2017-06-06

    Traditional image change detection based on a non-subsampled contourlet transform always ignores the neighborhood information's relationship to the non-subsampled contourlet coefficients, and the detection results are susceptible to noise interference. To address these disadvantages, we propose a denoising method based on the non-subsampled contourlet transform domain that uses the Hidden Markov Tree model (NSCT-HMT) for change detection of remote sensing images. First, the ENVI software is used to calibrate the original remote sensing images. After that, the mean-ratio operation is adopted to obtain the difference image that will be denoised by the NSCT-HMT model. Then, using the Fuzzy Local Information C-means (FLICM) algorithm, the difference image is divided into the change area and unchanged area. The proposed algorithm is applied to a real remote sensing data set. The application results show that the proposed algorithm can effectively suppress clutter noise, and retain more detailed information from the original images. The proposed algorithm has higher detection accuracy than the Markov Random Field-Fuzzy C-means (MRF-FCM), the non-subsampled contourlet transform-Fuzzy C-means clustering (NSCT-FCM), the pointwise approach and graph theory (PA-GT), and the Principal Component Analysis-Nonlocal Means (PCA-NLM) denoising algorithm. Finally, the five algorithms are used to detect the southern boundary of the Gurbantunggut Desert in Xinjiang Uygur Autonomous Region of China, and the results show that the proposed algorithm has the best effect on real remote sensing image change detection.
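
    For the mean-ratio step named above, a common formulation assigns each pixel 1 - min(mu1/mu2, mu2/mu1) from the local means of the two co-registered images; the sketch below uses that formulation as an assumption (the paper may use a different variant), with an illustrative window size, and does not include the NSCT-HMT denoising or FLICM clustering stages.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mean_ratio_difference(img1, img2, win=3):
    """Sketch of a mean-ratio difference image for change detection.

    Each pixel gets 1 - min(mu1/mu2, mu2/mu1), where mu1 and mu2 are local
    means of the two co-registered, calibrated images; values near 1
    indicate likely change.
    """
    m1 = uniform_filter(img1.astype(float), size=win) + 1e-6
    m2 = uniform_filter(img2.astype(float), size=win) + 1e-6
    ratio = np.minimum(m1 / m2, m2 / m1)
    return 1.0 - ratio
```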

  4. Remote Sensing Image Change Detection Based on NSCT-HMT Model and Its Application

    PubMed Central

    Chen, Pengyun; Zhang, Yichen; Jia, Zhenhong; Yang, Jie; Kasabov, Nikola

    2017-01-01

    Traditional image change detection based on a non-subsampled contourlet transform always ignores the neighborhood information’s relationship to the non-subsampled contourlet coefficients, and the detection results are susceptible to noise interference. To address these disadvantages, we propose a denoising method based on the non-subsampled contourlet transform domain that uses the Hidden Markov Tree model (NSCT-HMT) for change detection of remote sensing images. First, the ENVI software is used to calibrate the original remote sensing images. After that, the mean-ratio operation is adopted to obtain the difference image that will be denoised by the NSCT-HMT model. Then, using the Fuzzy Local Information C-means (FLICM) algorithm, the difference image is divided into the change area and unchanged area. The proposed algorithm is applied to a real remote sensing data set. The application results show that the proposed algorithm can effectively suppress clutter noise, and retain more detailed information from the original images. The proposed algorithm has higher detection accuracy than the Markov Random Field-Fuzzy C-means (MRF-FCM), the non-subsampled contourlet transform-Fuzzy C-means clustering (NSCT-FCM), the pointwise approach and graph theory (PA-GT), and the Principal Component Analysis-Nonlocal Means (PCA-NLM) denoising algorithm. Finally, the five algorithms are used to detect the southern boundary of the Gurbantunggut Desert in Xinjiang Uygur Autonomous Region of China, and the results show that the proposed algorithm has the best effect on real remote sensing image change detection. PMID:28587299

  5. Road detection and buried object detection in elevated EO/IR imagery

    NASA Astrophysics Data System (ADS)

    Kennedy, Levi; Kolba, Mark P.; Walters, Joshua R.

    2012-06-01

    To assist the warfighter in visually identifying potentially dangerous roadside objects, the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has developed an elevated video sensor system testbed for data collection. This system provides color and mid-wave infrared (MWIR) imagery. Signal Innovations Group (SIG) has developed an automated processing capability that detects the road within the sensor field of view and identifies potentially threatening buried objects within the detected road. The road detection algorithm leverages system metadata to project the collected imagery onto a flat ground plane, allowing for more accurate detection of the road as well as the direct specification of realistic physical constraints in the shape of the detected road. Once the road has been detected in an image frame, a buried object detection algorithm is applied to search for threatening objects within the detected road space. The buried object detection algorithm leverages textural and pixel intensity-based features to detect potential anomalies and then classifies them as threatening or non-threatening objects. Both the road detection and the buried object detection algorithms have been developed to facilitate their implementation in real-time in the NVESD system.

  6. Implementing a Parallel Image Edge Detection Algorithm Based on the Otsu-Canny Operator on the Hadoop Platform

    PubMed Central

    Wang, Min; Tian, Yun

    2018-01-01

    The Canny operator is widely used to detect edges in images. However, as the size of the image dataset increases, the edge detection performance of the Canny operator decreases and its runtime becomes excessive. To improve the runtime and edge detection performance of the Canny operator, in this paper, we propose a parallel design and implementation for an Otsu-optimized Canny operator using a MapReduce parallel programming model that runs on the Hadoop platform. The Otsu algorithm is used to optimize the Canny operator's dual threshold and improve the edge detection performance, while the MapReduce parallel programming model facilitates parallel processing for the Canny operator to solve the processing speed and communication cost problems that occur when the Canny edge detection algorithm is applied to big data. For the experiments, we constructed datasets of different scales from the Pascal VOC2012 image database. The proposed parallel Otsu-Canny edge detection algorithm performs better than other traditional edge detection algorithms. The parallel approach reduced the running time by approximately 67.2% on a Hadoop cluster architecture consisting of 5 nodes with a dataset of 60,000 images. Overall, our approach speeds up the system by approximately 3.4 times when processing large-scale datasets, which demonstrates the obvious superiority of our method. The proposed algorithm in this study demonstrates both better edge detection performance and improved time performance. PMID:29861711
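
    A minimal single-machine sketch of the Otsu-Canny coupling is given below (the MapReduce/Hadoop parallelization is not shown). OpenCV is assumed; deriving the Canny high threshold from Otsu's value and taking half of it as the low threshold is a common convention, not necessarily the exact scheme used in the paper, and the input file name is a placeholder.

      import cv2

      def otsu_canny(gray):
          # Otsu's method returns the global threshold that maximizes between-class variance.
          otsu_thresh, _ = cv2.threshold(gray, 0, 255,
                                         cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          high = otsu_thresh
          low = 0.5 * otsu_thresh   # common heuristic: low threshold = high / 2
          return cv2.Canny(gray, low, high)

      edges = otsu_canny(cv2.imread("sample.jpg", cv2.IMREAD_GRAYSCALE))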

  7. VOLATILE ORGANIC COMPOUNDS AND ISOPRENE OXIDATION PRODUCTS AT A TEMPERATE DECIDUOUS FOREST SITE

    EPA Science Inventory

    Biogenic volatile compounds (BVOCs) and their role in atmospheric oxidant formation were investigated at a forest site near Oak Ridge, Tennessee, as part of the Nashville Southern Oxidants Study (SOS) in July 1995. Of 98 VOCs detected, a major fraction were anthropogenic VOCs suc...

  8. Automatic Detection of Sand Ripple Features in Sidescan Sonar Imagery

    DTIC Science & Technology

    2014-07-09

    Among the features used in forensic scientific fingerprint analysis are terminations or bifurcations of print ridges. Sidescan sonar imagery of ripple...always be pathological cases. The size of the blocks of pixels used in determining the ripple wavelength is evident in the output images on the right in

  9. A new pivoting and iterative text detection algorithm for biomedical images.

    PubMed

    Xu, Songhua; Krauthammer, Michael

    2010-12-01

    There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating the performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. We demonstrate that our projection histogram-based text detection approach is well suited for text detection in biomedical images, and that the iterative application of the algorithm boosts performance to an F score of .60. We provide a C++ implementation of our algorithm freely available for academic use. Copyright © 2010 Elsevier Inc. All rights reserved.
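
    The core of a projection-histogram approach can be sketched in a few lines; the version below only finds horizontal candidate bands in a binarized image and is not the authors' C++ implementation (the min_fill threshold is a hypothetical parameter).

      import numpy as np

      def text_row_candidates(binary_img, min_fill=0.02):
          # binary_img: 2-D array, 1 = foreground (ink), 0 = background.
          # Returns (row_start, row_end) bands whose ink density exceeds min_fill.
          # Applying the same idea column-wise inside each band, and iterating on the
          # resulting sub-regions, yields the pivoting/iterative scheme described above.
          row_hist = binary_img.mean(axis=1)      # fraction of ink per image row
          active = row_hist > min_fill
          bands, start = [], None
          for i, flag in enumerate(active):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  bands.append((start, i))
                  start = None
          if start is not None:
              bands.append((start, len(active)))
          return bands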

  10. Site-specific standard request for underground storage tanks 1219-U, 1222-U, 2082-U, and 2068-U at the rust garage facility buildings 9754-1 and 9720-15: Oak Ridge Y-12 Plant, Oak Ridge, Tennessee, Facility ID No. 0-010117

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-01

    This document represents a Site-specific Standard Request for underground storage tanks (USTs) 1219-U, 1222-U and 2082-U previously located at former Building 9754-1, and tank 2086-U previously located at Building 9720-15, Oak Ridge Y-12 Plant, Oak Ridge, Tennessee. The tanks previously contained petroleum products. For the purposes of this report, the two building sites will be regarded as a single UST site and will be referred to as the Rust Garage Facility. The current land use associated with the Y-12 Plant is light industrial and the operational period of the plant is projected to be at least 30 years. Thus, potential future residential exposures are not expected to occur for at least 30 years. Based on the degradation coefficient for benzene (the only carcinogenic petroleum constituent detected in soils or groundwater at the Rust Garage Facility), it is expected that the benzene and other contaminants at the site will likely be reduced prior to expiration of the 30-year plant operational period. As the original sources of petroleum contamination have been removed, and the area of petroleum contamination is limited, a site-specific standard is therefore being requested for the Rust Garage Facility.

  11. Alveolar ridge keratosis--a retrospective clinicopathological study.

    PubMed

    Bellato, Lorenzo; Martinelli-Kläy, Carla P; Martinelli, Celso R; Lombardi, Tommaso

    2013-04-16

    Alveolar ridge keratosis (ARK) is a distinct, benign clinicopathological entity, characterized by a hyperkeratotic plaque or patch that occurs on the edentulous alveolar ridge or on the retromolar trigone and is considered to be caused by chronic frictional trauma. The aim of this retrospective study is to present the clinicopathological features of 23 consecutive cases of ARK. The 23 biopsy samples of ARK were selected and their pathological features were reviewed (keratosis, acanthosis, surface architecture, and inflammation). Factors such as the patient's gender, age, anatomical location, and tobacco and alcohol use were analyzed. Sixteen of the 23 cases were men and 7 were women, with a mean age of 55.05 years (range 17 to 88 years). Thirteen cases had a history of tobacco use, 4 of whom also reported alcohol consumption. All the cases presented only unilateral lesions. Nineteen cases involved the retromolar trigone while 4 cases involved edentulous alveolar ridges. Microscopically, the lesions were mainly characterized by moderate to marked hyperorthokeratosis. Inflammation was scanty or absent. In four of the cases, melanin pigment was detected in the superficial corium or in the cytoplasm of macrophages. None of the cases showed any features of dysplasia. Our results reveal that ARK is a benign lesion. However, the high prevalence of smokers amongst the patients might suggest that some potentially malignant disorders, such as tobacco-associated leukoplakia, may clinically mimic ARK.

  12. Analysis of the low-level seismicity along the Southern Indian Ocean spreading ridges recorded by the OHASISBIO array of hydrophones in 2012

    NASA Astrophysics Data System (ADS)

    Tsang-Hin-Sun, Eve; Royer, Jean-Yves; Sukhovich, Alexey; Perrot, Julie

    2014-05-01

    Arrays of autonomous hydrophones (AUHs) have proved to be a very valuable tool for monitoring the seismic activity of mid-ocean ridges. AUHs take advantage of ocean acoustic properties to detect many low-magnitude underwater earthquakes that go undetected by land-based stations, allowing a significant improvement in the magnitude completeness level of seismic catalogs in remote oceanic areas. This study presents results from the deployment of the OHASISBIO array, comprising 7 AUHs deployed in the southern Indian Ocean. The sources of acoustic events - the sites where conversion from seismic to acoustic waves occurs, which serve as proxies for the epicenters of shallow earthquakes - can be located to within a few km inside the AUH array. The distribution of uncertainties in the locations and origin times shows that the OHASISBIO array reliably covers a wide region encompassing the Indian Ocean triple junction and a large extent of the three Indian mid-ocean spreading ridges, from 52°E to 80°E and from 25°S to 40°S. During its one-year deployment in 2012, the AUH array recorded 1670 events in this area, while land-based networks detected only 470 events over the same period. A comparison of the background seismicity along the Southeast (SEIR) and Southwest (SWIR) Indian ridges suggests that the microseismicity, even over a one-year period, could be representative of the steady state of stress along the SEIR and SWIR; this conclusion is based on very high Spearman's correlations between our one-year AUH catalog and teleseismic catalogs spanning nearly 40 years. Seismicity along the ultra-slow-spreading SWIR is regularly distributed in space and time along spreading segments and transform faults, whereas the intermediate-spreading SEIR displays clusters of events in the vicinity of some transform faults or near specific geological structures such as the St-Paul and Amsterdam hotspot. A majority of these clusters appear to be related to magmatic processes, such as dyke intrusion or propagation. The analysis of mainshock-aftershock sequences reveals that few clusters fit a modified Omori law, regardless of their location (on transform faults or not), reflecting complex rupture mechanisms along both spreading ridges.

  13. Improved peak detection in mass spectrum by incorporating continuous wavelet transform-based pattern matching.

    PubMed

    Du, Pan; Kibbe, Warren A; Lin, Simon M

    2006-09-01

    A major problem for current peak detection algorithms is that noise in mass spectrometry (MS) spectra gives rise to a high rate of false positives. The false positive rate is especially problematic in detecting peaks with low amplitudes. Usually, various baseline correction algorithms and smoothing methods are applied before attempting peak detection. This approach is very sensitive to the amount of smoothing and aggressiveness of the baseline correction, which contribute to making peak detection results inconsistent between runs, instrumentation and analysis methods. Most peak detection algorithms simply identify peaks based on amplitude, ignoring the additional information present in the shape of the peaks in a spectrum. In our experience, 'true' peaks have characteristic shapes, and providing a shape-matching function that provides a 'goodness of fit' coefficient should provide a more robust peak identification method. Based on these observations, a continuous wavelet transform (CWT)-based peak detection algorithm has been devised that identifies peaks with different scales and amplitudes. By transforming the spectrum into wavelet space, the pattern-matching problem is simplified and in addition provides a powerful technique for identifying and separating the signal from the spike noise and colored noise. This transformation, with the additional information provided by the 2D CWT coefficients can greatly enhance the effective signal-to-noise ratio. Furthermore, with this technique no baseline removal or peak smoothing preprocessing steps are required before peak detection, and this improves the robustness of peak detection under a variety of conditions. The algorithm was evaluated with SELDI-TOF spectra with known polypeptide positions. Comparisons with two other popular algorithms were performed. The results show the CWT-based algorithm can identify both strong and weak peaks while keeping false positive rate low. The algorithm is implemented in R and will be included as an open source module in the Bioconductor project.
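
    SciPy exposes a peak finder built on the same idea (ridge lines followed across CWT scales). The snippet below illustrates the concept on a synthetic spectrum; it is not the R/Bioconductor implementation referred to above, and the width range is an illustrative choice.

      import numpy as np
      from scipy import signal

      # Synthetic "spectrum": two peaks of different widths plus spike-like noise.
      x = np.linspace(0, 100, 2000)
      rng = np.random.default_rng(0)
      spectrum = (np.exp(-(x - 30) ** 2 / 2.0)
                  + 0.4 * np.exp(-(x - 70) ** 2 / 18.0)
                  + 0.05 * rng.standard_normal(x.size))

      # find_peaks_cwt convolves the signal with wavelets over a range of widths and
      # keeps peaks whose ridge persists across scales, suppressing narrow spike noise,
      # so no separate baseline removal or smoothing step is needed.
      peak_indices = signal.find_peaks_cwt(spectrum, widths=np.arange(5, 80))
      print(x[peak_indices])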

  14. The evaluation of the OSGLR algorithm for restructurable controls

    NASA Technical Reports Server (NTRS)

    Bonnice, W. F.; Wagner, E.; Hall, S. R.; Motyka, P.

    1986-01-01

    The detection and isolation of commercial aircraft control surface and actuator failures using the orthogonal series generalized likelihood ratio (OSGLR) test was evaluated. The OSGLR algorithm was chosen as the most promising algorithm based on a preliminary evaluation of three failure detection and isolation (FDI) algorithms (the detection filter, the generalized likelihood ratio test, and the OSGLR test) and a survey of the literature. One difficulty of analytic FDI techniques and the OSGLR algorithm in particular is their sensitivity to modeling errors. Therefore, methods of improving the robustness of the algorithm were examined, with the incorporation of age-weighting into the algorithm being the most effective approach, significantly reducing the sensitivity of the algorithm to modeling errors. The steady-state implementation of the algorithm based on a single cruise linear model was evaluated using a nonlinear simulation of a C-130 aircraft. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling the linear models used by the algorithm on dynamic pressure and flap deflection was also considered. Since simply scheduling the linear models over the entire flight envelope is unlikely to be adequate, scheduling of the steady-state implementation of the algorithm was briefly investigated.

  15. Spatial cluster detection using dynamic programming.

    PubMed

    Sverchkov, Yuriy; Jiang, Xia; Cooper, Gregory F

    2012-03-25

    The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in the question of whether a cluster exists in the data, and if it exists, we are interested in finding the most accurate characterization of the cluster. We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used for both Bayesian maximum a-posteriori (MAP) estimation of the most likely spatial distribution of clusters and Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of Influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on-par with baseline methods in the task of Bayesian model averaging. We conclude that the dynamic programming algorithm performs on-par with other available methods for spatial cluster detection and point to its low computational cost and extendability as advantages in favor of further research and use of the algorithm.

  16. Spatial cluster detection using dynamic programming

    PubMed Central

    2012-01-01

    Background The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in the question of whether a cluster exists in the data, and if it exists, we are interested in finding the most accurate characterization of the cluster. Methods We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used for both Bayesian maximum a-posteriori (MAP) estimation of the most likely spatial distribution of clusters and Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of Influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. Results When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on-par with baseline methods in the task of Bayesian model averaging. Conclusions We conclude that the dynamic programming algorithm performs on-par with other available methods for spatial cluster detection and point to its low computational cost and extendability as advantages in favor of further research and use of the algorithm. PMID:22443103

  17. Sensitivity and specificity of automated detection of early repolarization in standard 12-lead electrocardiography.

    PubMed

    Kenttä, Tuomas; Porthan, Kimmo; Tikkanen, Jani T; Väänänen, Heikki; Oikarinen, Lasse; Viitasalo, Matti; Karanko, Hannu; Laaksonen, Maarit; Huikuri, Heikki V

    2015-07-01

    Early repolarization (ER) is defined as an elevation of the QRS-ST junction in at least two inferior or lateral leads of the standard 12-lead electrocardiogram (ECG). Our purpose was to create an algorithm for the automated detection and classification of ER. A total of 6,047 electrocardiograms were manually graded for ER by two experienced readers. The automated detection of ER was based on quantification of the characteristic slurring or notching in ER-positive leads. The ER detection algorithm was tested and its results were compared with manual grading, which served as the reference. Readers graded 183 ECGs (3.0%) as ER positive, of which the algorithm detected 176 recordings, resulting in sensitivity of 96.2%. Of the 5,864 ER-negative recordings, the algorithm classified 5,281 as negative, resulting in 90.1% specificity. Positive and negative predictive values for the algorithm were 23.2% and 99.9%, respectively, and its accuracy was 90.2%. Inferior ER was correctly detected in 84.6% and lateral ER in 98.6% of the cases. As the automatic algorithm has high sensitivity, it could be used as a prescreening tool for ER; only the electrocardiograms graded positive by the algorithm would be reviewed manually. This would reduce the need for manual labor by 90%. © 2014 Wiley Periodicals, Inc.
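
    The reported figures follow directly from the confusion-matrix counts given above; a quick check (plain arithmetic, using only the counts stated in the abstract):

      tp = 176            # ER-positive ECGs detected by the algorithm
      fn = 183 - 176      # ER-positive ECGs missed
      tn = 5281           # ER-negative ECGs correctly classified as negative
      fp = 5864 - 5281    # ER-negative ECGs flagged as positive

      sensitivity = tp / (tp + fn)                 # 0.962
      specificity = tn / (tn + fp)                 # 0.901
      ppv = tp / (tp + fp)                         # 0.232
      npv = tn / (tn + fn)                         # 0.999
      accuracy = (tp + tn) / (tp + tn + fp + fn)   # 0.902
      print(sensitivity, specificity, ppv, npv, accuracy)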

  18. Automated video-based detection of nocturnal convulsive seizures in a residential care setting.

    PubMed

    Geertsema, Evelien E; Thijs, Roland D; Gutter, Therese; Vledder, Ben; Arends, Johan B; Leijten, Frans S; Visser, Gerhard H; Kalitzin, Stiliyan N

    2018-06-01

    People with epilepsy need assistance and are at risk of sudden death when having convulsive seizures (CS). Automated real-time seizure detection systems can help alert caregivers, but wearable sensors are not always tolerated. We determined algorithm settings and investigated detection performance of a video algorithm to detect CS in a residential care setting. The algorithm calculates power in the 2-6 Hz range relative to 0.5-12.5 Hz range in group velocity signals derived from video-sequence optical flow. A detection threshold was found using a training set consisting of video-electroencephalography (EEG) recordings of 72 CS. A test set consisting of 24 full nights of 12 new subjects in residential care and additional recordings of 50 CS selected randomly was used to estimate performance. All data were analyzed retrospectively. The start and end of CS (generalized clonic and tonic-clonic seizures) and other seizures considered desirable to detect (long generalized tonic, hyperkinetic, and other major seizures) were annotated. The detection threshold was set to the value that obtained 97% sensitivity in the training set. Sensitivity, latency, and false detection rate (FDR) per night were calculated in the test set. A seizure was detected when the algorithm output exceeded the threshold continuously for 2 seconds. With the detection threshold determined in the training set, all CS were detected in the test set (100% sensitivity). Latency was ≤10 seconds in 78% of detections. Three/five hyperkinetic and 6/9 other major seizures were detected. Median FDR was 0.78 per night and no false detections occurred in 9/24 nights. Our algorithm could improve safety unobtrusively by automated real-time detection of CS in video registrations, with an acceptable latency and FDR. The algorithm can also detect some other motor seizures requiring assistance. © 2018 The Authors. Epilepsia published by Wiley Periodicals, Inc. on behalf of International League Against Epilepsy.
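
    The detection statistic lends itself to a short sketch: relative 2-6 Hz power in a motion signal, sustained above a threshold for at least 2 seconds. The optical-flow group-velocity extraction and the trained threshold are not reproduced here, so the signal, rates, and threshold below are placeholders.

      import numpy as np
      from scipy import signal

      def relative_band_power(motion, fs, band=(2.0, 6.0), ref=(0.5, 12.5)):
          # Power in `band` relative to power in `ref` for a 1-D motion signal.
          freqs, psd = signal.welch(motion, fs=fs, nperseg=int(2 * fs))
          def band_power(lo, hi):
              mask = (freqs >= lo) & (freqs <= hi)
              return np.trapz(psd[mask], freqs[mask])
          return band_power(*band) / band_power(*ref)

      def detect_convulsive(ratios, ratio_rate, threshold, min_duration=2.0):
          # Flag a detection once the ratio stays above threshold for >= min_duration seconds.
          run = 0
          for i, above in enumerate(ratios > threshold):
              run = run + 1 if above else 0
              if run >= min_duration * ratio_rate:
                  return i
          return None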

  19. Deformation of Forearcs during Aseismic Ridge Subduction

    NASA Astrophysics Data System (ADS)

    Zeumann, S.; Hampel, A.

    2014-12-01

    Subduction of aseismic oceanic ridges causes considerable deformation of the forearc region. To identify the crucial parameters for forearc deformation we created 3D finite-element models representing both erosive and accretive forearcs as well as migrating and non-migrating ridges. As natural examples we chose the Cocos Ridge subducting stationary beneath the erosive margin of Costa Rica and the Nazca and Gagua Ridges that migrate along the erosive Peruvian margin and the accretive Ryukyu margin, respectively. A series of models shows that the deformation of the forearc depends on the ridge shape (height, width), on the frictional coupling along the plate interface and the mechanical strength of the forearc. The forearc is uplifted and moved sideward during ridge subduction. Strain components show domains of both shortening and extension. Along the ridge axis, extension occurs except at the ridge tip, where shortening prevails. The strain component normal to the ridge axis reveals extension at the ridge tip and contraction above the ridge flanks. Shortening and extension increase with increasing ridge height. Higher friction coefficients lead to less extension and more shortening. Accretive wedges show larger indentation at the model trench. For stationary ridges (Cocos Ridge) the deformation pattern of the forearc is symmetric with respect to the ridge axis whereas for migrating ridges (Nazca Ridge, Gagua Ridge) the oblique convergence direction leads to asymmetric deformation of the forearc. In case of ridge migration, uplift occurs at the leading flank of the ridge and subsidence at the trailing flank, in agreement with field observations and analogue models. For a model with a 200-km-wide and 1500-m-high ridge (i.e. similar to the dimensions of the Nazca Ridge), the modelled uplift rate at the southern flank of the ridge is ~1 mm/a, which agrees well with uplift rates of ~0.7 mm/a derived from the elevation of marine terraces in southern Peru.

  20. Statistically significant performance results of a mine detector and fusion algorithm from an x-band high-resolution SAR

    NASA Astrophysics Data System (ADS)

    Williams, Arnold C.; Pachowicz, Peter W.

    2004-09-01

    Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in a real-time manner at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves that have been obtained through processing this multilook data for the high-resolution SAR data of the Veridian X-Band radar. We discuss the implications of these results on mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.

  1. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    DTIC Science & Technology

    2018-01-09

    ARL-TR-8272 ● JAN 2018 US Army Research Laboratory An Automated Energy Detection Algorithm Based on Morphological and...is no longer needed. Do not return it to the originator. ARL-TR-8272 ● JAN 2018 US Army Research Laboratory An Automated Energy ...4. TITLE AND SUBTITLE An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques 5a. CONTRACT NUMBER

  2. Advanced detection, isolation and accommodation of sensor failures: Real-time evaluation

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Delaat, John C.; Bruton, William M.

    1987-01-01

    The objective of the Advanced Detection, Isolation, and Accommodation (ADIA) Program is to improve the overall demonstrated reliability of digital electronic control systems for turbine engines by using analytical redundancy to detect sensor failures. The results of a real-time hybrid computer evaluation of the ADIA algorithm are presented. Minimum detectable levels of sensor failures for an F100 engine control system are determined. Also included are details about the microprocessor implementation of the algorithm as well as a description of the algorithm itself.

  3. Costs and consequences of automated algorithms versus manual grading for the detection of referable diabetic retinopathy.

    PubMed

    Scotland, G S; McNamee, P; Fleming, A D; Goatman, K A; Philip, S; Prescott, G J; Sharp, P F; Williams, G J; Wykes, W; Leese, G P; Olson, J A

    2010-06-01

    To assess the cost-effectiveness of an improved automated grading algorithm for diabetic retinopathy against a previously described algorithm, and in comparison with manual grading. Efficacy of the alternative algorithms was assessed using a reference graded set of images from three screening centres in Scotland (1253 cases with observable/referable retinopathy and 6333 individuals with mild or no retinopathy). Screening outcomes and grading and diagnosis costs were modelled for a cohort of 180,000 people, with prevalence of referable retinopathy at 4%. Algorithm (b), which combines image quality assessment with detection algorithms for microaneurysms (MA), blot haemorrhages and exudates, was compared with a simpler algorithm (a) (using image quality assessment and MA/dot haemorrhage (DH) detection), and the current practice of manual grading. Compared with algorithm (a), algorithm (b) would identify an additional 113 cases of referable retinopathy for an incremental cost of £68 per additional case. Compared with manual grading, automated grading would be expected to identify between 54 and 123 fewer referable cases, for a grading cost saving between £3,834 and £1,727 per case missed. Extrapolation modelling over a 20-year time horizon suggests manual grading would cost between £25,676 and £267,115 per additional quality-adjusted life year gained. Algorithm (b) is more cost-effective than the algorithm based on quality assessment and MA/DH detection. With respect to the value of introducing automated detection systems into screening programmes, automated grading operates within the recommended national standards in Scotland and is likely to be considered a cost-effective alternative to manual disease/no disease grading.

  4. Development and testing of incident detection algorithms. Vol. 2, research methodology and detailed results.

    DOT National Transportation Integrated Search

    1976-04-01

    The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...

  5. Predicting Error Bars for QSAR Models

    NASA Astrophysics Data System (ADS)

    Schroeter, Timon; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert

    2007-09-01

    Unfavorable physicochemical properties often cause drug failures. It is therefore important to take lipophilicity and water solubility into account early on in lead discovery. This study presents log D7 models built using Gaussian Process regression, Support Vector Machines, decision trees and ridge regression algorithms based on 14556 drug discovery compounds of Bayer Schering Pharma. A blind test was conducted using 7013 new measurements from the last months. We also present independent evaluations using public data. Apart from accuracy, we discuss the quality of error bars that can be computed by Gaussian Process models, and ensemble and distance based techniques for the other modelling approaches.
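
    A minimal scikit-learn sketch of the Gaussian Process idea is shown below: the predictive standard deviation returned for each compound serves as its error bar. The descriptors, kernel, and data are placeholders, not the Bayer Schering Pharma setup.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 5))     # placeholder molecular descriptors
      y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.1]) + 0.2 * rng.normal(size=200)

      kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      # return_std=True yields a per-prediction standard deviation, i.e. an error bar.
      mean, std = gp.predict(X[:5], return_std=True)
      for m, s in zip(mean, std):
          print(f"predicted logD7 = {m:.2f} +/- {s:.2f}")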

  6. Innovative signal processing for Johnson Noise thermometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezell, N. Dianne Bull; Britton, Jr, Charles L.; Roberts, Michael

    This report summarizes a newly developed algorithm that subtracts electromagnetic interference (EMI). EMI performance is very important to this measurement because any interference in the form of pickup from external signal sources, such as fluorescent lighting ballasts, motors, etc., can skew the measurement. Two methods of removing EMI were developed and tested at various locations. This report also summarizes the testing performed at different facilities outside Oak Ridge National Laboratory using both EMI removal techniques. The first EMI removal technique was reviewed in previous milestone reports; therefore, this report details the second method.

  7. Screening for Human Immunodeficiency Virus, Hepatitis B Virus, Hepatitis C Virus, and Treponema pallidum by Blood Testing Using a Bio-Flash Technology-Based Algorithm before Gastrointestinal Endoscopy

    PubMed Central

    Zhen, Chen; QuiuLi, Zhang; YuanQi, An; Casado, Verónica Vocero; Fan, Yuan

    2016-01-01

    Currently, conventional enzyme immunoassays which use manual gold immunoassays and colloidal tests (GICTs) are used as screening tools to detect Treponema pallidum (syphilis), hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus type 1 (HIV-1), and HIV-2 in patients undergoing surgery. The present observational, cross-sectional study compared the sensitivity, specificity, and work flow characteristics of the conventional algorithm with manual GICTs with those of a newly proposed algorithm that uses the automated Bio-Flash technology as a screening tool in patients undergoing gastrointestinal (GI) endoscopy. A total of 956 patients were examined for the presence of serological markers of infection with HIV-1/2, HCV, HBV, and T. pallidum. The proposed algorithm with the Bio-Flash technology was superior for the detection of all markers (100.0% sensitivity and specificity for detection of anti-HIV and anti-HCV antibodies, HBV surface antigen [HBsAg], and T. pallidum) compared with the conventional algorithm based on the manual method (80.0% sensitivity and 98.6% specificity for the detection of anti-HIV, 75.0% sensitivity for the detection of anti-HCV, 94.7% sensitivity for the detection of HBsAg, and 100% specificity for the detection of anti-HCV and HBsAg) in these patients. The automated Bio-Flash technology-based screening algorithm also reduced the operation time by 85.0% (205 min) per day, saving up to 24 h/week. In conclusion, the use of the newly proposed screening algorithm based on the automated Bio-Flash technology can provide an advantage over the use of conventional algorithms based on manual methods for screening for HIV, HBV, HCV, and syphilis before GI endoscopy. PMID:27707942

  8. Screening for Human Immunodeficiency Virus, Hepatitis B Virus, Hepatitis C Virus, and Treponema pallidum by Blood Testing Using a Bio-Flash Technology-Based Algorithm before Gastrointestinal Endoscopy.

    PubMed

    Jun, Zhou; Zhen, Chen; QuiuLi, Zhang; YuanQi, An; Casado, Verónica Vocero; Fan, Yuan

    2016-12-01

    Currently, conventional enzyme immunoassays which use manual gold immunoassays and colloidal tests (GICTs) are used as screening tools to detect Treponema pallidum (syphilis), hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus type 1 (HIV-1), and HIV-2 in patients undergoing surgery. The present observational, cross-sectional study compared the sensitivity, specificity, and work flow characteristics of the conventional algorithm with manual GICTs with those of a newly proposed algorithm that uses the automated Bio-Flash technology as a screening tool in patients undergoing gastrointestinal (GI) endoscopy. A total of 956 patients were examined for the presence of serological markers of infection with HIV-1/2, HCV, HBV, and T. pallidum. The proposed algorithm with the Bio-Flash technology was superior for the detection of all markers (100.0% sensitivity and specificity for detection of anti-HIV and anti-HCV antibodies, HBV surface antigen [HBsAg], and T. pallidum) compared with the conventional algorithm based on the manual method (80.0% sensitivity and 98.6% specificity for the detection of anti-HIV, 75.0% sensitivity for the detection of anti-HCV, 94.7% sensitivity for the detection of HBsAg, and 100% specificity for the detection of anti-HCV and HBsAg) in these patients. The automated Bio-Flash technology-based screening algorithm also reduced the operation time by 85.0% (205 min) per day, saving up to 24 h/week. In conclusion, the use of the newly proposed screening algorithm based on the automated Bio-Flash technology can provide an advantage over the use of conventional algorithms based on manual methods for screening for HIV, HBV, HCV, and syphilis before GI endoscopy. Copyright © 2016 Jun et al.

  9. A methodology for evaluating detection performance of ultrasonic array imaging algorithms for coarse-grained materials.

    PubMed

    Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S

    2014-12-01

    Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms; total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel super alloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms are shown to perform very similarly when comparing their flaw detection capabilities.

  10. Semi-supervised spectral algorithms for community detection in complex networks based on equivalence of clustering methods

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoke; Wang, Bingbo; Yu, Liang

    2018-01-01

    Community detection is fundamental for revealing the structure-functionality relationship in complex networks, and it involves two issues: the quantitative function for community as well as algorithms to discover communities. Despite significant research on either of them, few attempts have been made to establish the connection between the two issues. To attack this problem, a generalized quantification function is proposed for community in weighted networks, which provides a framework that unifies several well-known measures. Then, we prove that the trace optimization of the proposed measure is equivalent to the objective functions of algorithms such as nonnegative matrix factorization, kernel K-means as well as spectral clustering. It serves as the theoretical foundation for designing algorithms for community detection. On the second issue, a semi-supervised spectral clustering algorithm is developed by exploring the equivalence relation via combining the nonnegative matrix factorization and spectral clustering. Different from the traditional semi-supervised algorithms, the partial supervision is integrated into the objective of the spectral algorithm. Finally, through extensive experiments on both artificial and real world networks, we demonstrate that the proposed method improves the accuracy of the traditional spectral algorithms in community detection.

  11. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    NASA Technical Reports Server (NTRS)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). Brightness Temperature Difference (BTD) between two wavelength channels has been used widely in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with a 30-day composite, the optical properties of the dust particles, the variability of surface properties, and the cloud contamination. Therefore, the current algorithm shows improvements in detecting the dust-loaded region over land during daytime. Finally, the confidence index of the current dust algorithm is shown in 10 × 10 pixels of the MODIS observations. From January to June 2006, the results of the current algorithm are within 64 to 81% of those found using the fine mode fraction (FMF) and aerosol index (AI) from the MODIS and Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over the non-polluted land also ranges from 60 to 67% to avoid errors due to the anthropogenic aerosol. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
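
    The split-window tests behind such schemes can be sketched as below, using the 11 and 12 micron brightness temperatures; the thresholds, the 30-day BTR composite, and the cloud/surface screening of the combined algorithm are not reproduced, so the values shown are illustrative assumptions only.

      import numpy as np

      def dust_flags(bt11, bt12, btd_threshold=0.0, btr_threshold=0.995):
          # bt11, bt12: 2-D brightness-temperature arrays (K) for the split-window channels.
          # Mineral dust tends to make BT(11) - BT(12) negative, so the classical BTD
          # test flags pixels below a threshold; the BTR test uses the channel ratio,
          # which the combined algorithm above accumulates into a 30-day composite.
          btd = bt11 - bt12
          btr = bt11 / bt12
          return (btd < btd_threshold) | (btr < btr_threshold)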

  12. cWINNOWER algorithm for finding fuzzy dna motifs

    NASA Technical Reports Server (NTRS)

    Liang, S.; Samanta, M. P.; Biegel, B. A.

    2004-01-01

    The cWINNOWER algorithm detects fuzzy motifs in DNA sequences rich in protein-binding signals. A signal is defined as any short nucleotide pattern having up to d mutations differing from a motif of length l. The algorithm finds such motifs if a clique consisting of a sufficiently large number of mutated copies of the motif (i.e., the signals) is present in the DNA sequence. The cWINNOWER algorithm substantially improves the sensitivity of the winnower method of Pevzner and Sze by imposing a consensus constraint, enabling it to detect much weaker signals. We studied the minimum detectable clique size qc as a function of sequence length N for random sequences. We found that qc increases linearly with N for a fast version of the algorithm based on counting three-member sub-cliques. Imposing consensus constraints reduces qc by a factor of three in this case, which makes the algorithm dramatically more sensitive. Our most sensitive algorithm, which counts four-member sub-cliques, needs a minimum of only 13 signals to detect motifs in a sequence of length N = 12,000 for (l, d) = (15, 4). Copyright Imperial College Press.

  13. cWINNOWER Algorithm for Finding Fuzzy DNA Motifs

    NASA Technical Reports Server (NTRS)

    Liang, Shoudan

    2003-01-01

    The cWINNOWER algorithm detects fuzzy motifs in DNA sequences rich in protein-binding signals. A signal is defined as any short nucleotide pattern having up to d mutations differing from a motif of length l. The algorithm finds such motifs if multiple mutated copies of the motif (i.e., the signals) are present in the DNA sequence in sufficient abundance. The cWINNOWER algorithm substantially improves the sensitivity of the winnower method of Pevzner and Sze by imposing a consensus constraint, enabling it to detect much weaker signals. We studied the minimum number of detectable motifs qc as a function of sequence length N for random sequences. We found that qc increases linearly with N for a fast version of the algorithm based on counting three-member sub-cliques. Imposing consensus constraints reduces qc, by a factor of three in this case, which makes the algorithm dramatically more sensitive. Our most sensitive algorithm, which counts four-member sub-cliques, needs a minimum of only 13 signals to detect motifs in a sequence of length N = 12000 for (l,d) = (15,4).

  14. Epidemic failure detection and consensus for extreme parallelism

    DOE PAGES

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas; ...

    2017-02-01

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI Comm shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI Comm shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using Gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm and provides consistency guarantees even in very large and extreme-scale systems while at the same time being memory and bandwidth efficient.

  15. PROPAGATION AND LINKAGE OF OCEANIC RIDGE SEGMENTS.

    USGS Publications Warehouse

    Pollard, David D.; Aydin, Atilla

    1984-01-01

    An investigation was made of spreading ridges and the development of structures that link ridge segments using an analogy between ridges and cracks in elastic plates. The ridge-propagation force and a path factor that controls propagation direction were calculated for echelon ridge segments propagating toward each other. The ridge-propagation force increases as ridge ends approach but then declines sharply as the ends pass, so ridge segments may overlap somewhat. The sign of the path factor changes as ridge ends approach and pass, so the overlapping ridge ends may diverge and then converge following a hook-shaped path. The magnitudes of shear stresses in the plane of the plate and orientations of maximum shear planes between adjacent ridge segments were calculated to study transform faulting. For different loading conditions simulating ridge push, plate pull, and ridge suction, a zone of intense mechanical interaction between adjacent ridge ends in which stresses are concentrated was identified. The magnitudes of mean stresses in the plane of the plate and orientations of principal stress planes were also calculated.

  16. Performance improvement of multi-class detection using greedy algorithm for Viola-Jones cascade selection

    NASA Astrophysics Data System (ADS)

    Tereshin, Alexander A.; Usilin, Sergey A.; Arlazarov, Vladimir V.

    2018-04-01

    This paper aims to study the problem of multi-class object detection in a video stream with Viola-Jones cascades. An adaptive algorithm for selecting a Viola-Jones cascade, based on a greedy choice strategy for the N-armed bandit problem, is proposed. The efficiency of the algorithm is shown on the problem of detection and recognition of bank card logos in a video stream. The proposed algorithm can be effectively used in document localization and identification, recognition of road scene elements, localization and tracking of lengthy objects, and other problems of rigid object detection in heterogeneous data flows. The computational efficiency of the algorithm makes it possible to use it both on personal computers and on mobile devices based on processors with low power consumption.
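
    A minimal epsilon-greedy sketch of treating cascade selection as an N-armed bandit is given below. The reward used here (whether the chosen cascade produced any detection on the frame) and the epsilon value are illustrative assumptions, not the exact strategy of the paper.

      import random

      class GreedyCascadeSelector:
          # Pick, per frame, the Viola-Jones cascade most likely to find its object.
          def __init__(self, cascades, epsilon=0.1):
              self.cascades = cascades              # list of detector callables
              self.epsilon = epsilon
              self.counts = [0] * len(cascades)
              self.rewards = [0.0] * len(cascades)

          def select(self):
              if random.random() < self.epsilon:    # explore occasionally
                  return random.randrange(len(self.cascades))
              means = [r / c if c else float("inf")
                       for r, c in zip(self.rewards, self.counts)]
              return max(range(len(means)), key=means.__getitem__)

          def process(self, frame):
              i = self.select()
              detections = self.cascades[i](frame)
              self.counts[i] += 1
              self.rewards[i] += 1.0 if detections else 0.0   # reward = cascade fired
              return i, detections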

  17. Aircraft Detection in High-Resolution SAR Images Based on a Gradient Textural Saliency Map.

    PubMed

    Tan, Yihua; Li, Qingyun; Li, Yansheng; Tian, Jinwen

    2015-09-11

    This paper proposes a new automatic and adaptive aircraft target detection algorithm for high-resolution synthetic aperture radar (SAR) images of airports. The proposed method is based on a gradient textural saliency map under the contextual cues of the apron area. First, candidate regions where aircraft may be present are detected within the apron area. Second, a directional local gradient distribution detector is used to obtain a gradient textural saliency map over the candidate regions. Finally, the targets are detected by segmenting the saliency map using a CFAR-type algorithm. Real high-resolution airborne SAR image data are used to verify the proposed algorithm. The results demonstrate that this algorithm can detect aircraft targets quickly and accurately, and decreases the false alarm rate.

  18. Research on the attitude detection technology of the tetrahedron robot

    NASA Astrophysics Data System (ADS)

    Gong, Hao; Chen, Keshan; Ren, Wenqiang; Cai, Xin

    2017-10-01

    The traditional attitude detection technology cannot address the problem of attitude detection for a polyhedral robot. We therefore propose a novel multi-sensor data fusion algorithm based on the Kalman filter and investigate it on a tetrahedral robot. We devise an attitude detection system for the polyhedral robot and verify the data fusion algorithm. The results show that the minimal attitude detection system we devised can capture the attitudes of the tetrahedral robot under different working conditions, indicating that the kinematics model established for the tetrahedral robot is correct and demonstrating the feasibility of the attitude detection system.
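
    A scalar Kalman-filter sketch of the sensor-fusion step is shown below, fusing a gyro-integrated angle with an accelerometer-derived angle for one axis. The tetrahedral kinematics and the actual sensor suite are not modelled here; the noise variances are placeholders.

      import numpy as np

      def kalman_fuse(gyro_rates, accel_angles, dt, q=1e-4, r=1e-2):
          # q: process-noise variance (trust in the gyro model)
          # r: measurement-noise variance (trust in the accelerometer angle)
          angle, p = 0.0, 1.0
          fused = []
          for rate, meas in zip(gyro_rates, accel_angles):
              # Predict: integrate the gyro rate.
              angle += rate * dt
              p += q
              # Update: blend in the accelerometer measurement.
              k = p / (p + r)          # Kalman gain
              angle += k * (meas - angle)
              p *= (1.0 - k)
              fused.append(angle)
          return np.array(fused)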

  19. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
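
    The stepping-and-bracketing idea can be sketched with a generic event function; SciPy's Brent root finder stands in for the bracketing algorithm, and the toy event function below (positive while the 'event' is active) is purely illustrative.

      import numpy as np
      from scipy.optimize import brentq

      def event_function(t):
          # Toy smooth event function; a real one might encode eclipse or station visibility.
          return np.sin(0.1 * t) - 0.5

      def locate_events(f, t0, t1, step=10.0):
          # Step through [t0, t1]; whenever f changes sign, bracket and refine the root.
          roots = []
          t_prev, f_prev = t0, f(t0)
          t = t0 + step
          while t <= t1:
              f_curr = f(t)
              if f_prev * f_curr < 0:                  # sign change: an event boundary lies inside
                  roots.append(brentq(f, t_prev, t))   # refine with Brent's bracketing method
              t_prev, f_prev, t = t, f_curr, t + step
          return roots

      print(locate_events(event_function, 0.0, 200.0))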

  20. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.

  1. Fast object detection algorithm based on HOG and CNN

    NASA Astrophysics Data System (ADS)

    Lu, Tongwei; Wang, Dandan; Zhang, Yanduo

    2018-04-01

    In the field of computer vision, object classification and object detection are widely used in many fields. Traditional object detection has two main problems: first, the sliding-window region selection strategy has high time complexity and produces redundant windows; second, the hand-crafted features are not sufficiently robust. To solve these problems, a Region Proposal Network (RPN) is used to select candidate regions instead of the selective search algorithm. Compared with traditional algorithms and the selective search algorithm, RPN has higher efficiency and accuracy. We combine HOG features and a convolutional neural network (CNN) to extract features, and we use an SVM for classification. For TorontoNet, our algorithm's mAP is 1.6 percentage points higher, and for OxfordNet, it is 1.3 percentage points higher.

  2. A novel line segment detection algorithm based on graph search

    NASA Astrophysics Data System (ADS)

    Zhao, Hong-dan; Liu, Guo-ying; Song, Xu

    2018-02-01

    To overcome the problem of extracting line segments from an image, a line segment detection method based on a graph search algorithm is proposed. After obtaining the edge detection result of the image, candidate straight line segments are obtained in four directions. For the candidate straight line segments, their adjacency relationships are depicted by a graph model, based on which a depth-first search algorithm is employed to determine how many adjacent line segments need to be merged. Finally, the least squares method is used to fit the detected straight lines. The comparative experimental results verify that the proposed algorithm achieves better results than the line segment detector (LSD).

  3. Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model.

    PubMed

    Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai

    2017-02-08

    Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences.
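
    The threshold-based first stage can be sketched on tri-axial accelerometer data as a free-fall dip followed shortly by an impact spike; the thresholds, window, and units below are illustrative placeholders, and the knowledge-based multiphase classification is not reproduced.

      import numpy as np

      def threshold_fall_candidates(acc, fs, free_fall_g=0.4, impact_g=2.5, window_s=1.0):
          # acc: (N, 3) accelerometer samples in units of g; fs: sampling rate in Hz.
          # Returns sample indices where a free-fall dip is followed by an impact spike.
          mag = np.linalg.norm(acc, axis=1)
          window = int(window_s * fs)
          candidates = []
          for i in np.where(mag < free_fall_g)[0]:        # free-fall phase: magnitude drops
              if np.any(mag[i:i + window] > impact_g):    # impact phase follows shortly after
                  candidates.append(i)
          return candidates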

  4. Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model

    PubMed Central

    Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai

    2017-01-01

    Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences. PMID:28208694

  5. Ship detection in satellite imagery using rank-order greyscale hit-or-miss transforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Neal R; Porter, Reid B; Theiler, James

    2010-01-01

    Ship detection from satellite imagery is something that has great utility in various communities. Knowing where ships are and their types provides useful intelligence information. However, detecting and recognizing ships is a difficult problem. Existing techniques suffer from too many false alarms. We describe approaches we have taken in trying to build ship detection algorithms that have reduced false alarms. Our approach uses a version of the grayscale morphological Hit-or-Miss transform. While this is well known and used in its standard form, we use a version in which we use a rank-order selection for the dilation and erosion parts of the transform, instead of the standard maximum and minimum operators. This provides some slack in the fitting that the algorithm employs and provides a method for tuning the algorithm's performance for particular detection problems. We describe our algorithms, show the effect of the rank-order parameter on the algorithm's performance and illustrate the use of this approach for real ship detection problems with panchromatic satellite imagery.
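
    The key idea, replacing the maximum and minimum of greyscale dilation and erosion with rank-order (percentile) selections, can be sketched with SciPy's percentile filter. The footprints and percentiles below are illustrative assumptions, not the authors' configuration.

        import numpy as np
        from scipy.ndimage import percentile_filter

        def rank_order_hit_or_miss(img, fg_footprint, bg_footprint,
                                   erode_pct=10, dilate_pct=90):
            """Illustrative rank-order greyscale hit-or-miss response.

            Standard greyscale hit-or-miss compares an erosion of the image by a
            foreground structuring element with a dilation by a background element.
            Relaxing min/max to low/high percentiles tolerates a few outlier pixels
            inside each footprint, which is the "slack" described in the abstract.
            """
            # Rank-order "erosion": low percentile over the foreground footprint.
            fg_fit = percentile_filter(img, erode_pct, footprint=fg_footprint)
            # Rank-order "dilation": high percentile over the background footprint.
            bg_fit = percentile_filter(img, dilate_pct, footprint=bg_footprint)
            # Positive response where the foreground fit exceeds the background fit.
            return np.maximum(fg_fit - bg_fit, 0.0)

        # Toy footprints: a bright ship-like core surrounded by a darker ring.
        core = np.ones((3, 9), dtype=bool)
        ring = np.pad(np.zeros((5, 11), dtype=bool), 3, constant_values=True)
        img = np.random.default_rng(0).normal(size=(64, 64))
        response = rank_order_hit_or_miss(img, core, ring)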

  6. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is therefore in high demand to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with a support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  7. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is therefore in high demand to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with a support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.
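
    A minimal sketch of the two-stage cascade plus GMM classification described above, using common scientific-Python building blocks (Otsu threshold, Euclidean distance transform, marker-based watershed, Gaussian mixture model). The thresholding step, feature choices, and parameter values are assumptions; the paper's actual detection stages may differ.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.segmentation import watershed
        from skimage.feature import peak_local_max
        from skimage.measure import regionprops
        from sklearn.mixture import GaussianMixture

        def detect_and_classify_cells(img, n_classes=2, min_distance=7):
            """Illustrative cascade: region extraction, watershed splitting of
            clustered cells, then GMM classification on simple shape features."""
            # Stage 1: extract candidate cell regions by global thresholding.
            mask = img > threshold_otsu(img)
            # Stage 2: distance transform + watershed to separate touching cells.
            dist = ndi.distance_transform_edt(mask)
            peaks = peak_local_max(dist, min_distance=min_distance, labels=mask)
            markers = np.zeros(mask.shape, dtype=int)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            labels = watershed(-dist, markers, mask=mask)
            # Classification: fit a GMM on per-cell area and eccentricity.
            feats = np.array([[r.area, r.eccentricity] for r in regionprops(labels)])
            gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(feats)
            return labels, gmm.predict(feats)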

  8. Community detection in complex networks by using membrane algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Chuang; Fan, Linan; Liu, Zhou; Dai, Xiang; Xu, Jiamei; Chang, Baoren

    Community detection in complex networks is a key problem of network analysis. In this paper, a new membrane algorithm is proposed to solve the community detection problem in complex networks. The proposed algorithm is based on membrane systems, which consist of objects, reaction rules, and a membrane structure. Each object represents a candidate partition of a complex network, and the quality of objects is evaluated according to network modularity. The reaction rules include evolutionary rules and communication rules. Evolutionary rules are responsible for improving the quality of objects and employ the differential evolution algorithm to evolve objects. Communication rules implement the information exchange among membranes. Finally, the proposed algorithm is evaluated on synthetic networks, real-world networks with known real partitions, and large-scale networks with unknown real partitions. The experimental results indicate the superior performance of the proposed algorithm in comparison with other experimental algorithms.
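
    The evaluation step, scoring each candidate partition (object) by network modularity, can be sketched with NetworkX as below; the evolutionary and communication rules of the membrane system are omitted here, and the toy partitions are only for illustration.

        import networkx as nx
        from networkx.algorithms.community import modularity

        def best_partition(graph, candidate_partitions):
            """Score candidate partitions (each a list of node sets) by modularity
            and return the best one, mimicking the object-evaluation step."""
            scored = [(modularity(graph, p), p) for p in candidate_partitions]
            return max(scored, key=lambda t: t[0])

        # Toy usage on Zachary's karate club graph with two hand-made partitions.
        G = nx.karate_club_graph()
        half = set(range(17))
        candidates = [[half, set(G) - half],
                      [set(G)]]               # trivial one-community partition
        q, part = best_partition(G, candidates)
        print(f"best modularity: {q:.3f} over {len(part)} communities")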

  9. Early Obstacle Detection and Avoidance for All to All Traffic Pattern in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Huc, Florian; Jarry, Aubin; Leone, Pierre; Moraru, Luminita; Nikoletseas, Sotiris; Rolim, Jose

    This paper deals with early obstacle recognition in wireless sensor networks under various traffic patterns. In the presence of obstacles, the efficiency of routing algorithms is increased by voluntarily avoiding some regions in the vicinity of obstacles, areas which we call dead-ends. In this paper, we first propose a fast convergent routing algorithm with proactive dead-end detection together with a formal definition and description of dead-ends. Secondly, we present a generalization of this algorithm which improves performance in all to many and all to all traffic patterns. In a third part we prove that this algorithm produces paths that are optimal up to a constant factor of 2π + 1. In a fourth part we consider the reactive version of the algorithm which is an extension of a previously known early obstacle detection algorithm. Finally we give experimental results to illustrate the efficiency of our algorithms in different scenarios.

  10. A TCAS-II Resolution Advisory Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony; Chamberlain, James

    2013-01-01

    The Traffic Alert and Collision Avoidance System (TCAS) is a family of airborne systems designed to reduce the risk of mid-air collisions between aircraft. TCAS II, the current generation of TCAS devices, provides resolution advisories that direct pilots to maintain or increase vertical separation when aircraft distance and time parameters are beyond designed system thresholds. This paper presents a mathematical model of the TCAS II Resolution Advisory (RA) logic that assumes accurate aircraft state information. Based on this model, an algorithm for RA detection is also presented. This algorithm is analogous to a conflict detection algorithm, but instead of predicting loss of separation, it predicts resolution advisories. It has been formally verified that for a kinematic model of aircraft trajectories, this algorithm completely and correctly characterizes all encounter geometries between two aircraft that lead to a resolution advisory within a given lookahead time interval. The RA detection algorithm proposed in this paper is a fundamental component of a NASA sense and avoid concept for the integration of Unmanned Aircraft Systems in civil airspace.

  11. Pre-Scheduled and Self Organized Sleep-Scheduling Algorithms for Efficient K-Coverage in Wireless Sensor Networks

    PubMed Central

    Hwang, I-Shyan

    2017-01-01

    The K-coverage configuration that guarantees coverage of each location by at least K sensors is highly popular and is extensively used to monitor diversified applications in wireless sensor networks. Long network lifetime and high detection quality are the essentials of such K-covered sleep-scheduling algorithms. However, the existing sleep-scheduling algorithms either cause high cost or cannot preserve the detection quality effectively. In this paper, the Pre-Scheduling-based K-coverage Group Scheduling (PSKGS) and Self-Organized K-coverage Scheduling (SKS) algorithms are proposed to settle the problems in the existing sleep-scheduling algorithms. Simulation results show that our pre-scheduled-based KGS approach enhances the detection quality and network lifetime, whereas the self-organized-based SKS algorithm minimizes the computation and communication cost of the nodes and thereby is energy efficient. Besides, SKS outperforms PSKGS in terms of network lifetime and detection quality as it is self-organized. PMID:29257078

  12. Development of an Algorithm for Satellite Remote Sensing of Sea and Lake Ice

    NASA Astrophysics Data System (ADS)

    Dorofy, Peter T.

    Satellite remote sensing of snow and ice has a long history. The traditional method for many snow and ice detection algorithms has been the use of the Normalized Difference Snow Index (NDSI). This manuscript is composed of two parts. Chapter 1, Development of a Mid-Infrared Sea and Lake Ice Index (MISI) using the GOES Imager, discusses the desirability, development, and implementation of an alternative index for an ice detection algorithm, application of the algorithm to the detection of lake ice, and qualitative validation against other ice mapping products, such as the Ice Mapping System (IMS). Chapter 2, Application of Dynamic Threshold in a Lake Ice Detection Algorithm, continues with a discussion of the development of a method that considers the variable viewing and illumination geometry of observations throughout the day. The method is an alternative to Bidirectional Reflectance Distribution Function (BRDF) models. Evaluation of the performance of the algorithm is introduced by aggregating classified pixels within geometrical boundaries designated by IMS and obtaining sensitivity and specificity statistical measures.
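
    For reference, the traditional NDSI mentioned above is a simple normalized band ratio. The sketch below uses green and shortwave-infrared reflectance with a commonly quoted 0.4 cutoff; the dissertation's MISI formulation and dynamic threshold are not reproduced here.

        import numpy as np

        def ndsi(green, swir, eps=1e-6):
            """Normalized Difference Snow Index from green and SWIR reflectance."""
            return (green - swir) / (green + swir + eps)

        def snow_ice_mask(green, swir, threshold=0.4):
            """Classify pixels as snow/ice where NDSI exceeds a fixed threshold.
            The 0.4 cutoff is a commonly quoted starting point, not the dynamic
            threshold developed in the dissertation."""
            return ndsi(green, swir) > threshold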

  13. The effect of machine learning regression algorithms and sample size on individualized behavioral prediction with functional connectivity features.

    PubMed

    Cui, Zaixu; Gong, Gaolang

    2018-06-02

    Individualized behavioral/cognitive prediction using machine learning (ML) regression approaches is increasingly being applied. The specific ML regression algorithm and sample size are two key factors that non-trivially influence prediction accuracies. However, the effects of the ML regression algorithm and sample size on individualized behavioral/cognitive prediction performance have not been comprehensively assessed. To address this issue, the present study included six commonly used ML regression algorithms: ordinary least squares (OLS) regression, least absolute shrinkage and selection operator (LASSO) regression, ridge regression, elastic-net regression, linear support vector regression (LSVR), and relevance vector regression (RVR), to perform specific behavioral/cognitive predictions based on different sample sizes. Specifically, the publicly available resting-state functional MRI (rs-fMRI) dataset from the Human Connectome Project (HCP) was used, and whole-brain resting-state functional connectivity (rsFC) or rsFC strength (rsFCS) were extracted as prediction features. Twenty-five sample sizes (ranging from 20 to 700) were studied by sub-sampling from the entire HCP cohort. The analyses showed that rsFC-based LASSO regression performed remarkably worse than the other algorithms, and rsFCS-based OLS regression performed markedly worse than the other algorithms. Regardless of the algorithm and feature type, both the prediction accuracy and its stability exponentially increased with increasing sample size. The specific patterns of the observed algorithm and sample size effects were well replicated in the prediction using re-testing fMRI data, data processed by different imaging preprocessing schemes, and different behavioral/cognitive scores, thus indicating excellent robustness/generalization of the effects. The current findings provide critical insight into how the selected ML regression algorithm and sample size influence individualized predictions of behavior/cognition and offer important guidance for choosing the ML regression algorithm or sample size in relevant investigations. Copyright © 2018 Elsevier Inc. All rights reserved.
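
    The sub-sampling protocol can be sketched with scikit-learn: draw subsets of increasing size, fit several of the listed regression algorithms, and score prediction accuracy on held-out samples. Synthetic data stands in for the rsFC features, RVR (not available in scikit-learn) is omitted, and all parameter choices are assumptions.

        import numpy as np
        from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet
        from sklearn.svm import LinearSVR
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.standard_normal((700, 500))                    # stand-in for rsFC features
        y = X[:, :10].sum(axis=1) + rng.standard_normal(700)   # synthetic behavioral score

        models = {"OLS": LinearRegression(), "LASSO": Lasso(alpha=0.1),
                  "ridge": Ridge(alpha=1.0), "elastic-net": ElasticNet(alpha=0.1),
                  "LSVR": LinearSVR(max_iter=10000)}

        for n in (20, 50, 100, 200, 400, 700):
            idx = rng.choice(len(y), size=n, replace=False)    # sub-sample of size n
            Xtr, Xte, ytr, yte = train_test_split(X[idx], y[idx],
                                                  test_size=0.25, random_state=0)
            # R^2 on held-out samples as a simple stand-in for prediction accuracy.
            scores = {name: round(m.fit(Xtr, ytr).score(Xte, yte), 2)
                      for name, m in models.items()}
            print(n, scores)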

  14. Sensor failure detection for jet engines

    NASA Technical Reports Server (NTRS)

    Beattie, E. C.; Laprad, R. F.; Akhter, M. M.; Rock, S. M.

    1983-01-01

    Revisions to the advanced sensor failure detection, isolation, and accommodation (DIA) algorithm, developed under the sensor failure detection system program, were studied to eliminate the steady-state errors due to estimation filter biases. Three algorithm revisions were formulated and one was chosen for detailed evaluation. The selected version modifies the DIA algorithm to feed back the actual sensor outputs to the integral portion of the control for the no-failure case. In the case of a failure, the estimate of the failed sensor output is fed back to the integral portion. The estimator outputs are fed back to the linear regulator portion of the control at all times. The revised algorithm is evaluated and compared to the baseline algorithm developed previously.

  15. Algorithm architecture co-design for ultra low-power image sensor

    NASA Astrophysics Data System (ADS)

    Laforest, T.; Dupret, A.; Verdant, A.; Lattard, D.; Villard, P.

    2012-03-01

    In a context of embedded video surveillance, stand-alone, left-behind image sensors are used to detect events with a high level of confidence, but also with a very low power consumption. Using a steady camera, motion detection algorithms based on background estimation to find regions in movement are simple to implement and computationally efficient. To reduce power consumption, the background is estimated using a down-sampled image formed of macropixels. In order to extend the class of moving objects to be detected, we propose an original mixed-mode architecture developed thanks to an algorithm architecture co-design methodology. This programmable architecture is composed of a vector of SIMD processors. A basic RISC architecture was optimized in order to implement motion detection algorithms with a dedicated set of 42 instructions. Defining delta modulation as a calculation primitive has made it possible to implement the algorithms in a very compact way. As a result, a 1920x1080@25fps CMOS image sensor performing integrated motion detection is proposed, with an estimated power consumption of 1.8 mW.

  16. Collision detection for spacecraft proximity operations

    NASA Technical Reports Server (NTRS)

    Vaughan, Robin M.; Bergmann, Edward V.; Walker, Bruce K.

    1991-01-01

    A new collision detection algorithm has been developed for use when two spacecraft are operating in the same vicinity. The two spacecraft are modeled as unions of convex polyhedra, where the resulting polyhedron may be either convex or nonconvex. The relative motion of the two spacecraft is assumed to be such that one vehicle is moving with constant linear and angular velocity with respect to the other. Contacts between the vertices, faces, and edges of the polyhedra representing the two spacecraft are shown to occur when the value of one or more of a set of functions is zero. The collision detection algorithm is then formulated as a search for the zeros (roots) of these functions. Special properties of the functions for the assumed relative trajectory are exploited to expedite the zero search. The new algorithm is the first algorithm that can solve the collision detection problem exactly for relative motion with constant angular velocity. This is a significant improvement over models of rotational motion used in previous collision detection algorithms.

  17. Using Information From Prior Satellite Scans to Improve Cloud Detection Near the Day-Night Terminator

    NASA Technical Reports Server (NTRS)

    Yost, Christopher R.; Minnis, Patrick; Trepte, Qing Z.; Palikonda, Rabindra; Ayers, Jeffrey K.; Spangenberg, Doulas A.

    2012-01-01

    With geostationary satellite data it is possible to have a continuous record of diurnal cycles of cloud properties for a large portion of the globe. Daytime cloud property retrieval algorithms are typically superior to nighttime algorithms because daytime methods utilize measurements of reflected solar radiation. However, reflected solar radiation is difficult to accurately model for high solar zenith angles where the amount of incident radiation is small. Clear and cloudy scenes can exhibit very small differences in reflected radiation and threshold-based cloud detection methods have more difficulty setting the proper thresholds for accurate cloud detection. Because top-of-atmosphere radiances are typically more accurately modeled outside the terminator region, information from previous scans can help guide cloud detection near the terminator. This paper presents an algorithm that uses cloud fraction and clear and cloudy infrared brightness temperatures from previous satellite scan times to improve the performance of a threshold-based cloud mask near the terminator. Comparisons of daytime, nighttime, and terminator cloud fraction derived from Geostationary Operational Environmental Satellite (GOES) radiance measurements show that the algorithm greatly reduces the number of false cloud detections and smoothes the transition from the daytime to the nighttime cloud detection algorithm. Comparisons with the Geoscience Laser Altimeter System (GLAS) data show that using this algorithm decreases the number of false detections by approximately 20 percentage points.

  18. Recombinant Temporal Aberration Detection Algorithms for Enhanced Biosurveillance

    PubMed Central

    Murphy, Sean Patrick; Burkom, Howard

    2008-01-01

    Objective Broadly, this research aims to improve the outbreak detection performance and, therefore, the cost effectiveness of automated syndromic surveillance systems by building novel, recombinant temporal aberration detection algorithms from components of previously developed detectors. Methods This study decomposes existing temporal aberration detection algorithms into two sequential stages and investigates the individual impact of each stage on outbreak detection performance. The data forecasting stage (Stage 1) generates predictions of time series values a certain number of time steps in the future based on historical data. The anomaly measure stage (Stage 2) compares features of this prediction to corresponding features of the actual time series to compute a statistical anomaly measure. A Monte Carlo simulation procedure is then used to examine the recombinant algorithms’ ability to detect synthetic aberrations injected into authentic syndromic time series. Results New methods obtained with procedural components of published, sometimes widely used, algorithms were compared to the known methods using authentic datasets with plausible stochastic injected signals. Performance improvements were found for some of the recombinant methods, and these improvements were consistent over a range of data types, outbreak types, and outbreak sizes. For gradual outbreaks, the WEWD MovAvg7+WEWD Z-Score recombinant algorithm performed best; for sudden outbreaks, the HW+WEWD Z-Score performed best. Conclusion This decomposition was found not only to yield valuable insight into the effects of the aberration detection algorithms but also to produce novel combinations of data forecasters and anomaly measures with enhanced detection performance. PMID:17947614
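
    A minimal sketch of the two-stage decomposition: Stage 1 forecasts the next count with a 7-day moving average and Stage 2 turns the forecast residual into a Z-score anomaly measure. The baseline window length and the alert threshold are assumptions, and the weighting schemes (WEWD, HW) evaluated in the study are not reproduced.

        import numpy as np

        def movavg7_forecast(series, t):
            """Stage 1: forecast the value at time t from the previous 7 counts."""
            window = series[max(0, t - 7):t]
            return window.mean() if len(window) else 0.0

        def zscore_anomaly(series, t, baseline=28):
            """Stage 2: Z-score of today's forecast residual against recent residuals."""
            resid = np.array([series[i] - movavg7_forecast(series, i)
                              for i in range(max(7, t - baseline), t + 1)])
            mu, sd = resid[:-1].mean(), resid[:-1].std(ddof=1) + 1e-9
            return (resid[-1] - mu) / sd

        # Flag days whose anomaly measure exceeds an (assumed) threshold of 3.
        counts = np.random.default_rng(1).poisson(20, 120).astype(float)
        counts[100:104] += 15                        # injected synthetic outbreak
        alerts = [t for t in range(35, len(counts)) if zscore_anomaly(counts, t) > 3]
        print(alerts)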

  19. Fast, shape-directed, landmark-based deep gray matter segmentation for quantification of iron deposition

    NASA Astrophysics Data System (ADS)

    Ekin, Ahmet; Jasinschi, Radu; van der Grond, Jeroen; van Buchem, Mark A.; van Muiswinkel, Arianne

    2006-03-01

    This paper introduces image processing methods to automatically detect the 3D volume-of-interest (VOI) and 2D region-of-interest (ROI) for deep gray matter organs (thalamus, globus pallidus, putamen, and caudate nucleus) of patients with suspected iron deposition from MR dual echo images. Prior to the VOI and ROI detection, the cerebrospinal fluid (CSF) region is segmented by a clustering algorithm. For the segmentation, we automatically determine the cluster centers with the mean shift algorithm that can quickly identify the modes of a distribution. After the identification of the modes, we employ the K-Harmonic means clustering algorithm to segment the volumetric MR data into CSF and non-CSF. Having the CSF mask and observing that the frontal lobe of the lateral ventricle has a more consistent shape across age and pathological abnormalities, we propose a shape-directed landmark detection algorithm to detect the VOI in a speedy manner. The proposed landmark detection algorithm utilizes a novel shape model of the frontal lobe of the lateral ventricle for the slices where thalamus, globus pallidus, putamen, and caudate nucleus are expected to appear. After this step, for each slice in the VOI, we use horizontal and vertical projections of the CSF map to detect the approximate locations of the relevant organs to define the ROI. We demonstrate the robustness of the proposed VOI and ROI localization algorithms to pathologies, including severe amounts of iron accumulation as well as white matter lesions, and anatomical variations. The proposed algorithms achieved very high detection accuracy, 100% in the VOI detection, over a large and challenging MR dataset.

  20. [Tachycardia detection in implantable cardioverter-defibrillators by Sorin/LivaNova : Algorithms, pearls and pitfalls].

    PubMed

    Kolb, Christof; Ocklenburg, Rolf

    2016-09-01

    For physicians involved in the treatment of patients with implantable cardioverter-defibrillators (ICDs) the knowledge of tachycardia detection algorithms is of paramount importance. This knowledge is essential for adequate device selection during de-novo implantation, ICD replacement, and for troubleshooting during follow-up. This review describes tachycardia detection algorithms incorporated in ICDs by Sorin/LivaNova and analyses their strengths and weaknesses.

  1. Combining spatial and spectral information to improve crop/weed discrimination algorithms

    NASA Astrophysics Data System (ADS)

    Yan, L.; Jones, G.; Villette, S.; Paoli, J. N.; Gée, C.

    2012-01-01

    Reduction of herbicide spraying is an important key to environmentally and economically improve weed management. To achieve this, remote sensors such as imaging systems are commonly used to detect weed plants. We developed spatial algorithms that detect the crop rows to discriminate crop from weeds. These algorithms have been thoroughly tested and provide robust and accurate results without a learning process, but their detection is limited to inter-row areas. Crop/weed discrimination using spectral information is able to detect intra-row weeds but generally needs a prior learning process. We propose a method based on spatial and spectral information to enhance the discrimination and overcome the limitations of both algorithms. The classification from the spatial algorithm is used to build the training set for the spectral discrimination method. With this approach we are able to improve the range of weed detection in the entire field (inter and intra-row). To test the efficiency of these algorithms, a relevant database of virtual images derived from the SimAField model has been used and combined with the LOPEX93 spectral database. The developed method is evaluated and compared with the initial method in this paper, and shows an important improvement in weed detection from 86% to more than 95%.

  2. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
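
    A sketch of how a MODVOLC-style point operation can be combined with an RST-style time-series test. The normalized thermal index, the -0.8 cutoff, and the z-score threshold follow commonly published descriptions of those techniques and are assumptions here, not the authors' exact implementation.

        import numpy as np

        def nti(band22, band32):
            """MODVOLC-style Normalized Thermal Index from two MIR/TIR radiances."""
            return (band22 - band32) / (band22 + band32)

        def hybrid_hotspot_mask(band22, band32, ref_mean, ref_std,
                                nti_cut=-0.8, z_cut=2.0):
            """Flag a pixel if either the point operation or the time-series test fires.

            ref_mean/ref_std: per-pixel NTI statistics built from many images of the
            same calendar month (the reference images mentioned in the abstract).
            """
            x = nti(band22, band32)
            point_hit = x > nti_cut                      # simple point operation
            z = (x - ref_mean) / (ref_std + 1e-6)        # RST-style local index
            series_hit = z > z_cut
            return point_hit | series_hit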

  3. Spatial-Spectral Approaches to Edge Detection in Hyperspectral Remote Sensing

    NASA Astrophysics Data System (ADS)

    Cox, Cary M.

    This dissertation advances geoinformation science at the intersection of hyperspectral remote sensing and edge detection methods. A relatively new phenomenology among its remote sensing peers, hyperspectral imagery (HSI) comprises only about 7% of all remote sensing research - there are five times as many radar-focused peer reviewed journal articles than hyperspectral-focused peer reviewed journal articles. Similarly, edge detection studies comprise only about 8% of image processing research, most of which is dedicated to image processing techniques most closely associated with end results, such as image classification and feature extraction. Given the centrality of edge detection to mapping, that most important of geographic functions, improving the collective understanding of hyperspectral imagery edge detection methods constitutes a research objective aligned to the heart of geoinformation sciences. Consequently, this dissertation endeavors to narrow the HSI edge detection research gap by advancing three HSI edge detection methods designed to leverage HSI's unique chemical identification capabilities in pursuit of generating accurate, high-quality edge planes. The Di Zenzo-based gradient edge detection algorithm, an innovative version of the Resmini HySPADE edge detection algorithm and a level set-based edge detection algorithm are tested against 15 traditional and non-traditional HSI datasets spanning a range of HSI data configurations, spectral resolutions, spatial resolutions, bandpasses and applications. This study empirically measures algorithm performance against Dr. John Canny's six criteria for a good edge operator: false positives, false negatives, localization, single-point response, robustness to noise and unbroken edges. The end state is a suite of spatial-spectral edge detection algorithms that produce satisfactory edge results against a range of hyperspectral data types applicable to a diverse set of earth remote sensing applications. This work also explores the concept of an edge within hyperspectral space, the relative importance of spatial and spectral resolutions as they pertain to HSI edge detection and how effectively compressed HSI data improves edge detection results. The HSI edge detection experiments yielded valuable insights into the algorithms' strengths, weaknesses and optimal alignment to remote sensing applications. The gradient-based edge operator produced strong edge planes across a range of evaluation measures and applications, particularly with respect to false negatives, unbroken edges, urban mapping, vegetation mapping and oil spill mapping applications. False positives and uncompressed HSI data presented occasional challenges to the algorithm. The HySPADE edge operator produced satisfactory results with respect to localization, single-point response, oil spill mapping and trace chemical detection, and was challenged by false positives, declining spectral resolution and vegetation mapping applications. The level set edge detector produced high-quality edge planes for most tests and demonstrated strong performance with respect to false positives, single-point response, oil spill mapping and mineral mapping. False negatives were a regular challenge for the level set edge detection algorithm. 
Finally, HSI data optimized for spectral information compression and noise was shown to improve edge detection performance across all three algorithms, while the gradient-based algorithm and HySPADE demonstrated significant robustness to declining spectral and spatial resolutions.

  4. A new automated quantification algorithm for the detection and evaluation of focal liver lesions with contrast-enhanced ultrasound.

    PubMed

    Gatos, Ilias; Tsantis, Stavros; Spiliopoulos, Stavros; Skouroliakou, Aikaterini; Theotokas, Ioannis; Zoumpoulis, Pavlos; Hazle, John D; Kagadis, George C

    2015-07-01

    The aim was to detect and classify focal liver lesions (FLLs) from contrast-enhanced ultrasound (CEUS) imaging by means of an automated quantification algorithm. The proposed algorithm employs a sophisticated segmentation method to detect and contour focal lesions from 52 CEUS video sequences (30 benign and 22 malignant). Lesion detection uses wavelet transform zero crossings as an initialization step for the Markov random field model that extracts the lesion contour. After FLL detection across frames, a time intensity curve (TIC) is computed, which describes the contrast agents' behavior at all vascular phases with respect to adjacent parenchyma for each patient. From each TIC, eight features were automatically calculated and employed in the support vector machine (SVM) classification algorithm in the design of the image analysis model. With regard to FLL detection accuracy, all lesions detected had an average overlap value of 0.89 ± 0.16 with manual segmentations for all CEUS frame-subsets included in the study. The highest classification accuracy from the SVM model was 90.3%, misdiagnosing three benign and two malignant FLLs, with sensitivity and specificity values of 93.1% and 86.9%, respectively. The proposed quantification system that employs FLL detection and classification algorithms may be of value to physicians as a second opinion tool for avoiding unnecessary invasive procedures.

  5. Automated Detection of Craters in Martian Satellite Imagery Using Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Norman, C. J.; Paxman, J.; Benedix, G. K.; Tan, T.; Bland, P. A.; Towner, M.

    2018-04-01

    Crater counting is used in determining surface age of planets. We propose improvements to martian Crater Detection Algorithms by implementing an end-to-end detection approach with the possibility of scaling the algorithm planet-wide.

  6. A fuzzy clustering algorithm to detect planar and quadric shapes

    NASA Technical Reports Server (NTRS)

    Krishnapuram, Raghu; Frigui, Hichem; Nasraoui, Olfa

    1992-01-01

    In this paper, we introduce a new fuzzy clustering algorithm to detect an unknown number of planar and quadric shapes in noisy data. The proposed algorithm is computationally and implementationally simple, and it overcomes many of the drawbacks of the existing algorithms that have been proposed for similar tasks. Since the clustering is performed in the original image space, and since no features need to be computed, this approach is particularly suited for sparse data. The algorithm may also be used in pattern recognition applications.

  7. Face detection assisted auto exposure: supporting evidence from a psychophysical study

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Lin, Sheng; Dharumalingam, Dhandapani

    2010-01-01

    Face detection has been implemented in many digital still cameras and camera phones with the promise of enhancing existing camera functions (e.g. auto exposure) and adding new features to cameras (e.g. blink detection). In this study we examined the use of face detection algorithms in assisting auto exposure (AE). The set of 706 images used in this study was captured using Canon Digital Single Lens Reflex cameras and subsequently processed with an image processing pipeline. A psychophysical study was performed to obtain the optimal exposure along with the upper and lower bounds of exposure for all 706 images. Three methods of marking faces were utilized: manual marking, face detection algorithm A (FD-A), and face detection algorithm B (FD-B). The manual marking method found 751 faces in 426 images, which served as the ground-truth for face regions of interest. The remaining images do not have any faces or the faces are too small to be considered detectable. The two face detection algorithms differ in resource requirements and in performance. FD-A uses less memory and fewer gate counts compared to FD-B, but FD-B detects more faces and has fewer false positives. A face detection assisted auto exposure algorithm was developed and tested against the evaluation results from the psychophysical study. The AE test results showed noticeable improvement when faces were detected and used in auto exposure. However, the presence of false positives would negatively impact the added benefit.

  8. An ant colony based algorithm for overlapping community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Zhou, Xu; Liu, Yanheng; Zhang, Jindong; Liu, Tuming; Zhang, Di

    2015-06-01

    Community detection is of great importance for understanding the structures and functions of networks. Overlap is a significant feature of networks, and overlapping community detection has attracted increasing attention. Many algorithms have been presented to detect overlapping communities. In this paper, we present an ant colony based overlapping community detection algorithm which mainly includes ants' location initialization, ants' movement and post-processing phases. An ants' location initialization strategy is designed to identify the initial location of ants and initialize the label list stored in each node. During the ants' movement phase, all of the ants move according to the transition probability matrix, and a new heuristic information computation approach is defined to measure the similarity between two nodes. Every node keeps a label list through the cooperation of the ants until a termination criterion is reached. A post-processing phase is executed on the label list to obtain the final overlapping community structure naturally. We illustrate the capability of our algorithm by making experiments on both synthetic networks and real-world networks. The results demonstrate that our algorithm performs better at finding overlapping communities and overlapping nodes in synthetic and real-world datasets compared with state-of-the-art algorithms.

  9. The Classification of Diabetes Mellitus Using Kernel k-means

    NASA Astrophysics Data System (ADS)

    Alamsyah, M.; Nafisah, Z.; Prayitno, E.; Afida, A. M.; Imah, E. M.

    2018-01-01

    Diabetes mellitus is a metabolic disorder characterized by chronically elevated blood glucose. Automatic detection of diabetes mellitus is still challenging. This study detected diabetes mellitus using the kernel k-means algorithm. Kernel k-means is an algorithm developed from k-means; it uses kernel learning, which can handle non-linearly separable data, and this is where it differs from common k-means. The performance of kernel k-means in detecting diabetes mellitus is also compared with the SOM algorithm. The experimental results show that kernel k-means performs well and is much better than SOM.
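
    A compact kernel k-means sketch with an RBF kernel: points are assigned by their distance to cluster means in the implicit feature space, computed from kernel evaluations only, which is what lets the method separate data that plain k-means cannot. The kernel choice and parameters are assumptions.

        import numpy as np

        def rbf_kernel(X, gamma=1.0):
            sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * sq)

        def kernel_kmeans(X, k=2, gamma=1.0, n_iter=50, seed=0):
            """Kernel k-means: assign each point to the nearest cluster mean in the
            (implicit) feature space, using only kernel evaluations."""
            K = rbf_kernel(X, gamma)
            n = len(X)
            labels = np.random.default_rng(seed).integers(0, k, n)
            for _ in range(n_iter):
                dist = np.zeros((n, k))
                for c in range(k):
                    mask = labels == c
                    m = mask.sum()
                    if m == 0:
                        dist[:, c] = np.inf
                        continue
                    # ||phi(x) - mu_c||^2 = K(x,x) - 2/|C| sum_j K(x,j)
                    #                        + 1/|C|^2 sum_{j,l} K(j,l)
                    dist[:, c] = (np.diag(K) - 2.0 * K[:, mask].sum(1) / m
                                  + K[np.ix_(mask, mask)].sum() / m ** 2)
                new = dist.argmin(1)
                if np.array_equal(new, labels):
                    break
                labels = new
            return labels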

  10. Evaluation of methods for detection of fluorescence labeled subcellular objects in microscope images.

    PubMed

    Ruusuvuori, Pekka; Aijö, Tarmo; Chowdhury, Sharif; Garmendia-Torres, Cecilia; Selinummi, Jyrki; Birbaumer, Mirko; Dudley, Aimée M; Pelkmans, Lucas; Yli-Harja, Olli

    2010-05-13

    Several algorithms have been proposed for detecting fluorescently labeled subcellular objects in microscope images. Many of these algorithms have been designed for specific tasks and validated with limited image data. But despite the potential of using extensive comparisons between algorithms to provide useful information to guide method selection and thus more accurate results, relatively few studies have been performed. To better understand algorithm performance under different conditions, we have carried out a comparative study including eleven spot detection or segmentation algorithms from various application fields. We used microscope images from well plate experiments with a human osteosarcoma cell line and frames from image stacks of yeast cells in different focal planes. These experimentally derived images permit a comparison of method performance in realistic situations where the number of objects varies within the image set. We also used simulated microscope images in order to compare the methods and validate them against a ground truth reference result. Our study finds major differences in the performance of different algorithms, in terms of both object counts and segmentation accuracies. These results suggest that the selection of detection algorithms for image-based screens should be done carefully and take into account different conditions, such as the possibility of acquiring empty images or images with very few spots. Our inclusion of methods that have not been used before in this context broadens the set of available detection methods and compares them against the current state-of-the-art methods for subcellular particle detection.

  11. Intrusion-aware alert validation algorithm for cooperative distributed intrusion detection schemes of wireless sensor networks.

    PubMed

    Shaikh, Riaz Ahmed; Jameel, Hassan; d'Auriol, Brian J; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2009-01-01

    Existing anomaly and intrusion detection schemes of wireless sensor networks have mainly focused on the detection of intrusions. Once an intrusion is detected, alerts or claims are generated. However, any unidentified malicious nodes in the network could send faulty anomaly and intrusion claims about the legitimate nodes to the other nodes. Verifying the validity of such claims is a critical and challenging issue that is not considered in the existing cooperative-based distributed anomaly and intrusion detection schemes of wireless sensor networks. In this paper, we propose a validation algorithm that addresses this problem. This algorithm utilizes the concept of intrusion-aware reliability that helps to provide adequate reliability at a modest communication cost. In this paper, we also provide a security resiliency analysis of the proposed intrusion-aware alert validation algorithm.

  12. Intrusion-Aware Alert Validation Algorithm for Cooperative Distributed Intrusion Detection Schemes of Wireless Sensor Networks

    PubMed Central

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2009-01-01

    Existing anomaly and intrusion detection schemes of wireless sensor networks have mainly focused on the detection of intrusions. Once an intrusion is detected, alerts or claims are generated. However, any unidentified malicious nodes in the network could send faulty anomaly and intrusion claims about the legitimate nodes to the other nodes. Verifying the validity of such claims is a critical and challenging issue that is not considered in the existing cooperative-based distributed anomaly and intrusion detection schemes of wireless sensor networks. In this paper, we propose a validation algorithm that addresses this problem. This algorithm utilizes the concept of intrusion-aware reliability that helps to provide adequate reliability at a modest communication cost. In this paper, we also provide a security resiliency analysis of the proposed intrusion-aware alert validation algorithm. PMID:22454568

  13. Design of an Acoustic Target Intrusion Detection System Based on Small-Aperture Microphone Array.

    PubMed

    Zu, Xingshui; Guo, Feng; Huang, Jingchang; Zhao, Qin; Liu, Huawei; Li, Baoqing; Yuan, Xiaobing

    2017-03-04

    Automated surveillance of remote locations in a wireless sensor network is dominated by the detection algorithm because actual intrusions in such locations are a rare event. Therefore, a detection method with low power consumption is crucial for persistent surveillance to ensure longevity of the sensor networks. A simple and effective two-stage algorithm composed of an energy detector (ED) and a delay detector (DD), with all of its operations in the time domain using a small-aperture microphone array (SAMA), is proposed. The algorithm exploits the very different propagation velocities of wind noise and sound waves to improve the detection capability of the ED in the surveillance area. Experiments in four different fields with three types of vehicles show that the algorithm is robust to wind noise, and the probabilities of detection and false alarm are 96.67% and 2.857%, respectively.
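
    The first (energy detector) stage can be sketched as a frame-energy test against an adaptive noise floor, as below; the frame length and threshold factor are assumptions, and the delay-detector stage, which needs the microphone-array geometry, is omitted.

        import numpy as np

        def energy_detector(signal, fs, frame_ms=50, factor=4.0, noise_frames=20):
            """Stage-1 energy detector: flag frames whose short-term energy exceeds
            `factor` times a running estimate of the background-noise energy."""
            n = int(fs * frame_ms / 1000)
            frames = signal[:len(signal) // n * n].reshape(-1, n)
            energy = (frames ** 2).mean(axis=1)
            noise = energy[:noise_frames].mean()       # initial noise-floor estimate
            hits = []
            for i, e in enumerate(energy):
                if e > factor * noise:
                    hits.append(i)
                else:                                   # update floor on quiet frames
                    noise = 0.95 * noise + 0.05 * e
            return hits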

  14. Aircraft Detection in High-Resolution SAR Images Based on a Gradient Textural Saliency Map

    PubMed Central

    Tan, Yihua; Li, Qingyun; Li, Yansheng; Tian, Jinwen

    2015-01-01

    This paper proposes a new automatic and adaptive aircraft target detection algorithm for high-resolution synthetic aperture radar (SAR) images of airports. The proposed method is based on a gradient textural saliency map under the contextual cues of the apron area. Firstly, candidate regions in which aircraft may be present are detected within the apron area. Secondly, a directional local gradient distribution detector is used to obtain a gradient textural saliency map over the candidate regions. Finally, the targets are detected by segmenting the saliency map using a CFAR-type algorithm. Real high-resolution airborne SAR image data are used to verify the proposed algorithm. The results demonstrate that this algorithm can detect aircraft targets quickly and accurately, and decreases the false alarm rate. PMID:26378543
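
    The final segmentation step uses a CFAR-type test; a minimal cell-averaging CFAR over a 2-D saliency or intensity image might look like the following, with guard and training window sizes chosen arbitrarily for illustration.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def ca_cfar(img, guard=2, train=8, scale=3.5):
            """Cell-averaging CFAR: a pixel is a detection if it exceeds `scale`
            times the mean of its training band (a ring that excludes guard cells)."""
            big = 2 * (guard + train) + 1
            small = 2 * guard + 1
            # Window sums from mean filters (mean * window area).
            sum_big = uniform_filter(img.astype(float), size=big) * big ** 2
            sum_small = uniform_filter(img.astype(float), size=small) * small ** 2
            n_train = big ** 2 - small ** 2
            clutter = (sum_big - sum_small) / n_train    # local clutter estimate
            return img > scale * clutter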

  15. Android Malware Classification Using K-Means Clustering Algorithm

    NASA Astrophysics Data System (ADS)

    Hamid, Isredza Rahmi A.; Syafiqah Khalid, Nur; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Chai Wen, Chuah

    2017-08-01

    Malware is designed to gain access to or damage a computer system without the user's notice, and attackers exploit malware to commit crime or fraud. This paper proposes an Android malware classification approach based on the K-Means clustering algorithm. We evaluate the proposed model in terms of accuracy using machine learning algorithms. Two datasets, Virus Total and Malgenome, were selected to demonstrate the application of K-Means clustering. We classify the Android malware into three clusters: ransomware, scareware and goodware. Nine features were considered for each dataset: Lock Detected, Text Detected, Text Score, Encryption Detected, Threat, Porn, Law, Copyright and Moneypak. We used IBM SPSS Statistics software for data classification and WEKA tools to evaluate the built clusters. The proposed K-Means clustering approach shows promising results with high accuracy when tested using the Random Forest algorithm.

  16. Research On Vehicle-Based Driver Status/Performance Monitoring; Development, Validation, And Refinement Of Algorithms For Detection Of Driver Drowsiness, Final Report

    DOT National Transportation Integrated Search

    1994-12-01

    THIS REPORT SUMMARIZES THE RESULTS OF A 3-YEAR RESEARCH PROJECT TO DEVELOP RELIABLE ALGORITHMS FOR THE DETECTION OF MOTOR VEHICLE DRIVER IMPAIRMENT DUE TO DROWSINESS. THESE ALGORITHMS ARE BASED ON DRIVING PERFORMANCE MEASURES THAT CAN POTENTIALLY BE ...

  17. Multispectral fluorescence image algorithms for detection of frass on mature tomatoes

    USDA-ARS?s Scientific Manuscript database

    A multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet LED excitation was developed for the detection of frass contamination on mature tomatoes. The algorithm utilized the fluorescence intensities at five wavebands, 515 nm, 640 nm, 664 nm, 690 nm, and 724 nm...

  18. Access Restoration Project Task 1.2 Report 2 (of 2) Algorithms for Debris Volume and Water Depth Computation : Appendix A

    DOT National Transportation Integrated Search

    0000-01-01

    In the Access Restoration Project Task 1.2 Report 1, the algorithms for detecting roadway debris piles and flooded areas were described in detail. Those algorithms take CRS data as input and automatically detect the roadway obstructions. Although the ...

  19. Bifurcations and catastrophes in a nonlinear dynamical model of the western Pacific subtropical high ridge line index and its evolution mechanism

    NASA Astrophysics Data System (ADS)

    Hong, Mei; Zhang, Ren; Li, Ming; Wang, Shuo; Zeng, Wenhua; Wang, Zhengxin

    2017-07-01

    Despite much previous effort, the establishment of an accurate model of the western Pacific subtropical high (WPSH) and analysis of its chaotic behavior has proved to be difficult. Based on a phase-space technique, a nonlinear dynamical model of the WPSH ridge line and summer monsoon factors is constructed here from 50 years of data. Using a genetic algorithm, model inversion and parameter optimization are performed. The Lyapunov spectrum, phase portraits, time history, and Poincaré surface of section of the model are analyzed and an initial-value sensitivity test is performed, showing that the model and data have similar phase portraits and that the model is robust. Based on equilibrium stability criteria, four types of equilibria of the model are analyzed. Bifurcations and catastrophes of the equilibria are studied and related to the physical mechanism and actual weather phenomena. The results show that the onset and enhancement of the Somali low-level jet and the latent heat flux of the Indian monsoon are among the most important reasons for the appearance and maintenance of the double-ridge phenomenon. Violent breakout and enhancement of the Mascarene cold high will cause the WPSH to jump northward, resulting in the "empty plum" phenomenon. In the context of bifurcation and catastrophe in the dynamical system, the influence of the factors considered here on the WPSH has theoretical and practical significance. This work also opens the way to new lines of research on the interaction between the WPSH and the summer monsoon system.

  20. Development of a digital impression procedure using photogrammetry for complete denture fabrication.

    PubMed

    Matsuda, Takashi; Goto, Takaharu; Kurahashi, Kosuke; Kashiwabara, Toshiya; Ichikawa, Tetsuo

    We developed an innovative procedure for digitizing maxillary edentulous residual ridges with a photogrammetric system capable of estimating three-dimensional (3D) digital forms from multiple two-dimensional (2D) digital images. The aim of this study was to validate the effectiveness of the photogrammetric system. Impressions of the maxillary residual ridges of five edentulous patients were taken with four kinds of procedures: three conventional impression procedures and the photogrammetric system. Plaster models were fabricated from conventional impressions and digitized with a 3D scanner. Two 3D forms out of four forms were superimposed with 3D inspection software, and differences were evaluated using a least squares best fit algorithm. The in vitro experiment suggested that better imaging conditions were in the horizontal range of ± 15 degrees and at a vertical angle of 45 degrees. The mean difference between the photogrammetric image (Form A) and the image taken from conventional preliminarily impression (Form C) was 0.52 ± 0.22 mm. The mean difference between the image taken of final impression through a special tray (Form B) and Form C was 0.26 ± 0.06 mm. The mean difference between the image taken from conventional final impression (Form D) and Form C was 0.25 ± 0.07 mm. The difference between Forms A and C was significantly larger than the differences between Forms B and C and between Forms D and C. The results of this study suggest that obtaining digital impressions of edentulous residual ridges using a photogrammetric system is feasible and available for clinical use.

  1. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    PubMed

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul

    2015-08-01

    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.

  2. Cultural Artifact Detection in Long Wave Infrared Imagery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dylan Zachary; Craven, Julia M.; Ramon, Eric

    2017-01-01

    Detection of cultural artifacts from airborne remotely sensed data is an important task in the context of on-site inspections. Airborne artifact detection can reduce the size of the search area the ground based inspection team must visit, thereby improving the efficiency of the inspection process. This report details two algorithms for detection of cultural artifacts in aerial long wave infrared imagery. The first algorithm creates an explicit model for cultural artifacts, and finds data that fits the model. The second algorithm creates a model of the background and finds data that does not fit the model. Both algorithms are applied to orthomosaic imagery generated as part of the MSFE13 data collection campaign under the spectral technology evaluation project.

  3. Latent palmprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2009-06-01

    The evidential value of palmprints in forensic applications is clear as about 30 percent of the latents recovered from crime scenes are from palms. While biometric systems for palmprint-based personal authentication in access control type of applications have been developed, they mostly deal with low-resolution (about 100 ppi) palmprints and only perform full-to-full palmprint matching. We propose a latent-to-full palmprint matching system that is needed in forensic applications. Our system deals with palmprints captured at 500 ppi (the current standard in forensic applications) or higher resolution and uses minutiae as features to be compatible with the methodology used by latent experts. Latent palmprint matching is a challenging problem because latent prints lifted at crime scenes are of poor image quality, cover only a small area of the palm, and have a complex background. Other difficulties include a large number of minutiae in full prints (about 10 times as many as fingerprints), and the presence of many creases in latents and full prints. A robust algorithm to reliably estimate the local ridge direction and frequency in palmprints is developed. This facilitates the extraction of ridge and minutiae features even in poor quality palmprints. A fixed-length minutia descriptor, MinutiaCode, is utilized to capture distinctive information around each minutia and an alignment-based minutiae matching algorithm is used to match two palmprints. Two sets of partial palmprints (150 live-scan partial palmprints and 100 latent palmprints) are matched to a background database of 10,200 full palmprints to test the proposed system. Despite the inherent difficulty of latent-to-full palmprint matching, rank-1 recognition rates of 78.7 and 69 percent, respectively, were achieved in searching live-scan partial palmprints and latent palmprints against the background database.
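
    Local ridge direction, which the paper estimates robustly even in poor-quality prints, is commonly computed block-wise from image gradients via the structure tensor. The sketch below shows that standard approach, not the authors' specific algorithm.

        import numpy as np
        from scipy.ndimage import sobel, uniform_filter

        def ridge_orientation(img, block=16):
            """Block-smoothed ridge orientation (radians) from the structure tensor.

            The dominant ridge direction is perpendicular to the dominant gradient
            direction; averaging the doubled-angle terms (2*gx*gy, gx^2 - gy^2)
            keeps opposite-pointing gradients from cancelling.
            """
            gx = sobel(img.astype(float), axis=1)
            gy = sobel(img.astype(float), axis=0)
            gxy = uniform_filter(2.0 * gx * gy, size=block)
            gxx_yy = uniform_filter(gx ** 2 - gy ** 2, size=block)
            theta = 0.5 * np.arctan2(gxy, gxx_yy)        # gradient orientation
            return theta + np.pi / 2.0                   # ridge direction is normal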

  4. The Data Transfer Kit: A geometric rendezvous-based tool for multiphysics data transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, S. R.; Wilson, P. P. H.; Pawlowski, R. P.

    2013-07-01

    The Data Transfer Kit (DTK) is a software library designed to provide parallel data transfer services for arbitrary physics components based on the concept of geometric rendezvous. The rendezvous algorithm provides a means to geometrically correlate two geometric domains that may be arbitrarily decomposed in a parallel simulation. By repartitioning both domains such that they have the same geometric domain on each parallel process, efficient and load balanced search operations and data transfer can be performed at a desirable algorithmic time complexity with low communication overhead relative to other types of mapping algorithms. With the increased development efforts in multiphysics simulation and other multiple mesh and geometry problems, generating parallel topology maps for transferring fields and other data between geometric domains is a common operation. The algorithms used to generate parallel topology maps based on the concept of geometric rendezvous as implemented in DTK are described with an example using a conjugate heat transfer calculation and thermal coupling with a neutronics code. In addition, we provide the results of initial scaling studies performed on the Jaguar Cray XK6 system at Oak Ridge National Laboratory for a worst-case-scenario problem in terms of algorithmic complexity that shows good scaling on O(1 x 10^4) cores for topology map generation and excellent scaling on O(1 x 10^5) cores for the data transfer operation with meshes of O(1 x 10^9) elements. (authors)

  5. Hydrothermal plume anomalies over the southwest Indian ridge: magmatic control

    NASA Astrophysics Data System (ADS)

    Yue, X.; Li, H.; Tao, C.; Ren, J.; Zhou, J.; Chen, J.; Chen, S.; Wang, Y.

    2017-12-01

    Here we report the first extensive survey results of hydrothermal activity along the ultra-slow spreading Southwest Indian Ridge (SWIR). The study area is located at segment 27, between the Indomed and Gallieni transform faults, SWIR. The seismic crustal thickness reaches 9.5 km in this segment (Li et al., 2015), much thicker than normal crust. The anomalously thickened crust could be affected by the Crozet hotspot or by highly focused melt delivery from the mantle. The Duanqiao hydrothermal field was reported at the ridge valley of the segment by Tao et al. (2009). The Deep-towed Hydrothermal Detection System (DHDS) was used to collect information related to hydrothermal activity, such as temperature, turbidity, oxidation-reduction potential (ORP) and seabed type. There are 15 survey lines at intervals of 2 to 3 km, covering about 1300 km2 of segment 27. After processing the raw data, including removing random noise points, applying a 5-point moving average and subtracting the ambient background, we obtained anomalous Nephelometric Turbidity Unit values (ΔNTU). dE/dt was used to identify ORP anomalies, since the raw data are easily influenced by electrode potential drift (Baker et al., 2016). According to the water-column turbidity and ORP distributions, we confirmed three hydrothermal anomaly fields, named A1, A2 and A3. The three fields are all located in the western part of the segment. The A1 field lies in the ridge valley, on the west side of the Duanqiao field. The A2 and A3 fields lie on the northern and southern sides of the ridge valley, respectively. We propose that recent magmatic activity is probably focused on the western part of segment 27, and that the extensive distribution of hydrothermal plumes in the segment is the result of discrete magma intrusions. References: Baker, E.T., et al. How many vent fields? New estimates of vent field populations on ocean ridges from precise mapping of hydrothermal discharge locations. EPSL, 2016, 449:186-196. Li, J., et al. Seismic observation of an extremely magmatic accretion at the ultraslow spreading Southwest Indian Ridge. GRL, 2015, 42:2656-2663. Tao, C., Wu, G., Ni, J., et al. New hydrothermal fields found along the SWIR during Legs 5-7 of the Chinese DY115-20 Expedition. AGU, 2009.

  6. Continental Affinities of the Alpha Ridge

    NASA Astrophysics Data System (ADS)

    Jackson, H. Ruth; Li, Qingmou; Shimeld, John; Chian, Deping

    2017-04-01

    Identifying the crustal attributes of the Alpha Ridge (AR) part of the High Arctic Large Igneous Province and tracing the spreading centre across the Amerasia Basin play a key role in understanding the opening history of the Arctic Ocean. Here we report evidence for a continental influence on the development of the AR and for a reduced extent of oceanic crust in the Amerasia Basin. These points are inferred from a documented continental sedimentation source in the Amerasia Basin, from calculated diagnostic compressional and shear refraction waves, and from the tracing of the distinct spreading centre using the potential field data. (1) The circum-Arctic geology of the small polar ocean provides compelling evidence of a long-lived continental landmass north of the Sverdrup Basin in the Canadian Arctic Islands and north of the Barents Sea continental margin. Based on sediment distribution patterns in the Sverdrup Basin, a continental source is required from the Triassic to the mid Jurassic. In addition, an extensive continental sediment source to the north of the Barents Sea is required until the Barremian. (2) Offshore data suggest a portion of continental crust in the Alpha and Mendeleev ridges, including measured shear wave velocities, the similarity of compressional wave velocities to those of large igneous provinces containing continental fragments, and magnetic patterns. Ocean bottom seismometers across the Chukchi Borderland and the Mendeleev Ridge recorded shear wave velocities that are sensitive to the quartz content of rocks and diagnostic of both an upper and a lower continental crust. On the Nautilus Spur of the Alpha Ridge, expendable sonobuoys recorded clear converted shear waves, also consistent with continental crust. The magnetic patterns (amplitude, frequency, and textures) on the Northwind Ridge and the Nautilus Spur also have similarities. In fact, only limited parts of the deepest-water portions of the Canada Basin and the Makarov Basin have typical oceanic layer 2 and 3 crustal velocities and lineated magnetic anomalies. (3) The gravity and magnetic anomalies associated with the spreading centre in the Canada Basin, unveiled by multifractal singularity analysis of the potential field data, can now be traced as far as the Lomonosov Ridge. In addition, linear magnetic features cutting across the spreading centres are identified as transform faults. The combination of the detected continental attributes of the AR, the quantification of transform faults, and the outlined reduced extent of oceanic crust in the Amerasia Basin provides new insights into the opening history of the basin.

  7. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. In conclusion, we give advice on algorithm selection for typical real-world tasks.
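
    As a minimal illustration of the benchmarking setup described above (an unsupervised score evaluated against held-out labels), the sketch below uses a simple k-nearest-neighbor distance score on synthetic data. It is not one of the 19 algorithms evaluated in the paper, and the dataset, k, and scoring choice are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 2))        # inliers
outliers = rng.uniform(-6.0, 6.0, size=(25, 2))     # global anomalies
X = np.vstack([normal, outliers])
y = np.r_[np.zeros(len(normal)), np.ones(len(outliers))]

# Unsupervised k-NN anomaly score: mean distance to the k nearest neighbors.
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
dist, _ = nn.kneighbors(X)
score = dist[:, 1:].mean(axis=1)   # drop the self-distance in column 0

# Evaluate against the held-out labels, as a benchmark study would.
print("ROC AUC:", roc_auc_score(y, score))
```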

  8. Performance characterization of a combined material identification and screening algorithm

    NASA Astrophysics Data System (ADS)

    Green, Robert L.; Hargreaves, Michael D.; Gardner, Craig M.

    2013-05-01

    Portable analytical devices based on a gamut of technologies (Infrared, Raman, X-Ray Fluorescence, Mass Spectrometry, etc.) are now widely available. These tools have seen increasing adoption for field-based assessment by diverse users including military, emergency response, and law enforcement. Frequently, end-users of portable devices are non-scientists who rely on embedded software and the associated algorithms to convert collected data into actionable information. Two classes of problems commonly encountered in field applications are identification and screening. Identification algorithms are designed to scour a library of known materials and determine whether the unknown measurement is consistent with a stored response (or combination of stored responses). Such algorithms can be used to identify a material from many thousands of possible candidates. Screening algorithms evaluate whether at least a subset of features in an unknown measurement correspond to one or more specific substances of interest and are typically configured to detect analytes from a small list of potential targets. Thus, screening algorithms are much less broadly applicable than identification algorithms; however, they typically provide higher detection rates, which makes them attractive for specific applications such as chemical warfare agent or narcotics detection. This paper will present an overview and performance characterization of a combined identification/screening algorithm that has recently been developed. It will be shown that the combined algorithm provides enhanced detection capability more typical of screening algorithms while maintaining a broad identification capability. Additionally, we will highlight how this approach can enable users to incorporate situational awareness during a response.
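
    The distinction between the two algorithm classes can be made concrete with a toy sketch: a correlation-based library identification routine alongside a targeted screening rule, with the screening check tried first. All spectra, substance names, thresholds, and the combination logic below are hypothetical illustrations, not the algorithm characterized in the paper.

```python
import numpy as np

def identify(spectrum, library, threshold=0.95):
    """Identification: correlate the unknown against every library entry and
    return the best match above a correlation threshold (illustrative only)."""
    scores = {name: np.corrcoef(spectrum, ref)[0, 1] for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])

def screen(spectrum, target_bands, min_rel_intensity=0.2):
    """Screening: flag a target if all of its characteristic bands are present
    above a relative-intensity floor (hypothetical rule)."""
    rel = spectrum / spectrum.max()
    return all(rel[band] >= min_rel_intensity for band in target_bands)

def combined(spectrum, library, threat_bands):
    # Screen first for the short list of high-priority targets, then fall back
    # to broad library identification.
    if screen(spectrum, threat_bands):
        return "ALERT: screening hit"
    name, score = identify(spectrum, library)
    return f"identified as {name} (r={score:.2f})" if name else "no library match"

# Example with two synthetic library spectra (Gaussian bands on 200 channels).
channels = np.arange(200)
lib = {"substance_A": np.exp(-(channels - 60) ** 2 / 50.0),
       "substance_B": np.exp(-(channels - 140) ** 2 / 50.0)}
unknown = lib["substance_A"] + 0.02 * np.random.randn(200)
print(combined(unknown, lib, threat_bands=[140]))
```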

  9. Network Community Detection based on the Physarum-inspired Computational Framework.

    PubMed

    Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili

    2016-12-13

    Community detection is a crucial and essential problem in the structure analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from the optimization-based algorithms to the heuristic-based algorithms, have been proposed for solving such a problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with a higher accuracy and a lower computational cost still remains an open problem. Inspired by the computational capability and positive feedback mechanism in the wake of foraging process of Physarum, which is a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be identified from the intra-community edges in a network and the positive feedback of solving process in an algorithm can be further enhanced, which are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to estimate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.

  10. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    NASA Astrophysics Data System (ADS)

    Goldstein, N.; Dressler, R. A.; Richtsmeier, S. S.; McLean, J.; Dao, P. D.; Murray-Krezan, J.; Fulcoly, D. O.

    2013-09-01

    Recent ground testing of a wide area camera system and automated star removal algorithms has demonstrated the potential to detect, quantify, and track deep space objects using small aperture cameras and on-board processors. The camera system, which was originally developed for a space-based Wide Area Space Surveillance System (WASSS), operates in a fixed-stare mode, continuously monitoring a wide swath of space and differentiating celestial objects from satellites based on differential motion across the field of view. It would have greatest utility in a LEO orbit to provide automated and continuous monitoring of deep space with high refresh rates, and with particular emphasis on the GEO belt and GEO transfer space. Continuous monitoring allows a concept of change detection and custody maintenance not possible with existing sensors. The detection approach is equally applicable to Earth-based sensor systems. A distributed system of such sensors, either Earth-based or space-based, could provide automated, persistent night-time monitoring of all of deep space. The continuous monitoring provides a daily record of the light curves of all GEO objects above a certain brightness within the field of view. The daily updates of satellite light curves offer a means to identify specific satellites, to note changes in orientation and operational mode, and to cue other SSA assets for higher resolution queries. The data processing approach may also be applied to larger-aperture, higher resolution camera systems to extend the sensitivity towards dimmer objects. In order to demonstrate the utility of the WASSS system and data processing, a ground based field test was conducted in October 2012. We report here the results of the observations made at Magdalena Ridge Observatory using the prototype WASSS camera, which has a 4×60° field-of-view, <0.05° resolution, a 2.8 cm² aperture, and the ability to view within 4° of the sun. A single camera pointed at the GEO belt provided a continuous night-long record of the intensity and location of more than 50 GEO objects detected within the camera's 60-degree field-of-view, with a detection sensitivity similar to the camera's shot noise limit of Mv=13.7. Performance is anticipated to scale with aperture area, allowing the detection of dimmer objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and an image processing algorithm that exploits the different angular velocities of celestial objects and space objects (SOs). Principal Components Analysis (PCA) is used to filter out all objects moving with the velocity of the celestial frame of reference. The resulting filtered images are projected back into an Earth-centered frame of reference, or into any other relevant frame of reference, and co-added to form a series of images of the GEO objects as a function of time. The PCA approach not only removes the celestial background, but it also removes systematic variations in system calibration, sensor pointing, and atmospheric conditions. The resulting images are shot-noise limited, and can be exploited to automatically identify deep space objects, produce approximate state vectors, and track their locations and intensities as a function of time.
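
    The PCA-based background filtering described above can be illustrated on a synthetic frame stack. The sketch below is a toy demonstration of the idea only (project out the leading principal component that captures the correlated star-field and systematic variation, leaving the mover in the residual); the frame sizes, noise levels, object brightness, and number of removed components are assumptions, not WASSS parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_frames, h, w = 40, 64, 64

# Quasi-static star field whose overall brightness drifts slowly (standing in
# for systematic calibration/atmospheric variation), plus one moving object.
stars = np.zeros((h, w))
stars[rng.integers(0, h, 30), rng.integers(0, w, 30)] = 1.0
gain = 1.0 + 0.3 * np.sin(np.linspace(0, 2 * np.pi, n_frames))

stack = np.empty((n_frames, h * w))
for t in range(n_frames):
    frame = gain[t] * stars + 0.01 * rng.standard_normal((h, w))
    frame[10, (5 + t) % w] += 1.5          # the moving object, drifting in x
    stack[t] = frame.ravel()

# PCA via SVD of the mean-subtracted stack. The leading component captures the
# correlated background variation; projecting it out leaves the mover.
mean = stack.mean(axis=0)
U, S, Vt = np.linalg.svd(stack - mean, full_matrices=False)
n_remove = 1                               # in practice a few components are removed
residual = (stack - mean) - (U[:, :n_remove] * S[:n_remove]) @ Vt[:n_remove]

frame_20 = np.abs(residual[20]).reshape(h, w)
print("brightest residual pixel in frame 20:",
      np.unravel_index(frame_20.argmax(), (h, w)))   # expected near (10, 25)
```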

  11. Color object detection using spatial-color joint probability functions.

    PubMed

    Luo, Jiebo; Crandall, David

    2006-06-01

    Object detection in unconstrained images is an important image understanding problem with many potential applications. There has been little success in creating a single algorithm that can detect arbitrary objects in unconstrained images; instead, algorithms typically must be customized for each specific object. Consequently, it typically requires a large number of exemplars (for rigid objects) or a large amount of human intuition (for nonrigid objects) to develop a robust algorithm. We present a robust algorithm designed to detect a class of compound color objects given a single model image. A compound color object is defined as having a set of multiple, particular colors arranged spatially in a particular way, including flags, logos, cartoon characters, people in uniforms, etc. Our approach is based on a particular type of spatial-color joint probability function called the color edge co-occurrence histogram. In addition, our algorithm employs perceptual color naming to handle color variation, and prescreening to limit the search scope (i.e., size and location) for the object. Experimental results demonstrated that the proposed algorithm is insensitive to object rotation, scaling, partial occlusion, and folding, outperforming a closely related algorithm based on color co-occurrence histograms by a decisive margin.
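
    As a rough sketch of a spatial-color joint probability function of the kind described above, the code below builds a quantized color co-occurrence histogram restricted to edge pixels. The quantization, gradient-based edge test, spatial offset, and synthetic "flag" image are illustrative assumptions and do not reproduce the authors' exact feature or matching procedure.

```python
import numpy as np

def color_edge_cooccurrence(img, n_bins=8, offset=(0, 3)):
    """Toy color edge co-occurrence histogram: count pairs of quantized colors
    that co-occur at a fixed spatial offset, restricted to edge pixels."""
    # Quantize each RGB channel into n_bins levels -> a single color index.
    q = (img.astype(float) / 256.0 * n_bins).astype(int)
    idx = q[..., 0] * n_bins * n_bins + q[..., 1] * n_bins + q[..., 2]

    # Crude edge map from the luminance gradient magnitude.
    lum = img.astype(float).mean(axis=2)
    gy, gx = np.gradient(lum)
    edges = np.hypot(gx, gy) > 20.0

    dy, dx = offset
    hist = np.zeros((n_bins ** 3, n_bins ** 3))
    h, w = idx.shape
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            if edges[y, x]:
                hist[idx[y, x], idx[y + dy, x + dx]] += 1
    return hist / max(hist.sum(), 1)

# Example on a synthetic two-color "flag".
img = np.zeros((60, 90, 3), dtype=np.uint8)
img[:, :45] = (200, 30, 30)      # red half
img[:, 45:] = (240, 240, 240)    # white half
H = color_edge_cooccurrence(img)
print("non-zero co-occurrence pairs:", np.count_nonzero(H))
```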

  12. Optimized Seizure Detection Algorithm: A Fast Approach for Onset of Epileptic in EEG Signals Using GT Discriminant Analysis and K-NN Classifier

    PubMed Central

    Rezaee, Kh.; Azizi, E.; Haddadnia, J.

    2016-01-01

    Background Epilepsy is a severe disorder of the central nervous system that predisposes the person to recurrent seizures. Fifty million people worldwide suffer from epilepsy; after Alzheimer’s and stroke, it is the third most widespread nervous disorder. Objective In this paper, an algorithm to detect the onset of epileptic seizures based on the analysis of brain electrical signals (EEG) has been proposed. 844 hours of EEG were recorded from 23 pediatric patients consecutively with 163 occurrences of seizures. Signals had been collected from Children’s Hospital Boston with a sampling frequency of 256 Hz through 18 channels in order to assess epilepsy surgery. By selecting effective features from seizure and non-seizure signals of each individual and putting them into two categories, the proposed algorithm detects the onset of seizures quickly and with high sensitivity. Method In this algorithm, L-sec epochs of signals are displayed in the form of a third-order tensor in spatial, spectral and temporal spaces by applying wavelet transform. Then, after applying general tensor discriminant analysis (GTDA) on tensors and calculating the mapping matrix, feature vectors are extracted. GTDA increases the sensitivity of the algorithm by storing data without deleting them. Finally, K-Nearest neighbors (KNN) is used to classify the selected features. Results The results of simulating the algorithm on the standard dataset show that the algorithm is capable of detecting 98 percent of seizures with an average delay of 4.7 seconds and an average false detection rate of three errors per 24 hours. Conclusion Today, the lack of an automated system to detect or predict the seizure onset is strongly felt. PMID:27672628

  13. A cloud masking algorithm for EARLINET lidar systems

    NASA Astrophysics Data System (ADS)

    Binietoglou, Ioannis; Baars, Holger; D'Amico, Giuseppe; Nicolae, Doina

    2015-04-01

    Cloud masking is an important first step in any aerosol lidar processing chain as most data processing algorithms can only be applied on cloud free observations. Up to now, the selection of a cloud-free time interval for data processing is typically performed manually, and this is one of the outstanding problems for automatic processing of lidar data in networks such as EARLINET. In this contribution we present initial developments of a cloud masking algorithm that permits the selection of the appropriate time intervals for lidar data processing based on uncalibrated lidar signals. The algorithm is based on a signal normalization procedure using the range of observed values of lidar returns, designed to work with different lidar systems with minimal user input. This normalization procedure can be applied to measurement periods of only a few hours, even if no suitable cloud-free interval exists, and thus can be used even when only a short period of lidar measurements is available. Clouds are detected based on a combination of criteria including the magnitude of the normalized lidar signal and time-space edge detection performed using the Sobel operator. In this way the algorithm avoids misclassification of strong aerosol layers as clouds. Cloud detection is performed using the highest available time and vertical resolution of the lidar signals, allowing the effective detection of low-level clouds (e.g. cumulus humilis). Special attention is given to suppressing false cloud detection due to signal noise that can affect the algorithm's performance, especially during day-time. In this contribution we present the details of the algorithm, the effect of lidar characteristics (space-time resolution, available wavelengths, signal-to-noise ratio) on detection performance, and highlight the current strengths and limitations of the algorithm using lidar scenes from different lidar systems in different locations across Europe.
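
    A toy version of the two main ingredients mentioned above (a normalized-signal threshold combined with Sobel time-space edge strength) is sketched below on a synthetic range-corrected lidar curtain. The normalization choice, thresholds, and synthetic scene are assumptions and do not reproduce the EARLINET algorithm.

```python
import numpy as np
from scipy import ndimage

# Synthetic curtain (time x height): smooth aerosol decay plus one bright,
# sharply bounded cloud.
rng = np.random.default_rng(0)
profile = np.exp(-np.arange(400) / 150.0)
curtain = profile[None, :] * (1 + 0.02 * rng.standard_normal((240, 400)))
curtain[100:140, 250:270] += 3.0

# Normalize each profile by its own dynamic range (system-independent).
lo = curtain.min(axis=1, keepdims=True)
hi = curtain.max(axis=1, keepdims=True)
norm = (curtain - lo) / (hi - lo)

# Edge strength from Sobel operators along time and height.
edge = np.hypot(ndimage.sobel(norm, axis=0), ndimage.sobel(norm, axis=1))

# Flag pixels that are both strong and sharply bounded; diffuse aerosol layers
# fail the edge criterion. In practice the flagged edges would be closed or
# dilated to mask the full cloud interior.
cloud_mask = (norm > 0.6) & (edge > 1.0)
print("flagged cloud-edge pixels:", int(cloud_mask.sum()))
```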

  14. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of thermal intensity (surface temperature) characteristics of each PV module to verify the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels in the fault detection algorithm is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from individual arrays automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy of defective panels of 97 % or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection compared to a global detection rule.
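
    A minimal sketch of such a local, per-array detection rule is given below: each panel's mean thermal intensity is compared against the statistics of its own array rather than a global threshold. The k-sigma threshold, panel temperatures, and units are illustrative assumptions, not the values validated in the paper.

```python
import numpy as np

def flag_defective_panels(panel_means, k=2.0):
    """Local detection rule (sketch): within one PV array, flag panels whose
    mean thermal intensity exceeds the array mean by more than k standard
    deviations. Thresholding per array avoids the distance-dependent bias
    of a single global temperature threshold."""
    panel_means = np.asarray(panel_means, dtype=float)
    mu, sigma = panel_means.mean(), panel_means.std(ddof=1)
    return np.flatnonzero(panel_means > mu + k * sigma)

# Example: one array of 24 panels, two of them overheating.
rng = np.random.default_rng(2)
panels = 30.0 + rng.normal(0, 0.8, 24)   # surface-temperature proxies (a.u.)
panels[[5, 17]] += 10.0
print("suspect panels:", flag_defective_panels(panels))   # expected: [5 17]
```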

  15. Comparison of Diagnostic Algorithms for Detecting Toxigenic Clostridium difficile in Routine Practice at a Tertiary Referral Hospital in Korea.

    PubMed

    Moon, Hee-Won; Kim, Hyeong Nyeon; Hur, Mina; Shim, Hee Sook; Kim, Heejung; Yun, Yeo-Min

    2016-01-01

    Since every single test has some limitations for detecting toxigenic Clostridium difficile, multistep algorithms are recommended. This study aimed to compare the current, representative diagnostic algorithms for detecting toxigenic C. difficile, using VIDAS C. difficile toxin A&B (toxin ELFA), VIDAS C. difficile GDH (GDH ELFA, bioMérieux, Marcy-l'Etoile, France), and Xpert C. difficile (Cepheid, Sunnyvale, California, USA). In 271 consecutive stool samples, toxigenic culture, toxin ELFA, GDH ELFA, and Xpert C. difficile were performed. We simulated two algorithms: screening by GDH ELFA and confirmation by Xpert C. difficile (GDH + Xpert) and combined algorithm of GDH ELFA, toxin ELFA, and Xpert C. difficile (GDH + Toxin + Xpert). The performance of each assay and algorithm was assessed. The agreement of Xpert C. difficile and two algorithms (GDH + Xpert and GDH+ Toxin + Xpert) with toxigenic culture were strong (Kappa, 0.848, 0.857, and 0.868, respectively). The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of algorithms (GDH + Xpert and GDH + Toxin + Xpert) were 96.7%, 95.8%, 85.0%, 98.1%, and 94.5%, 95.8%, 82.3%, 98.5%, respectively. There were no significant differences between Xpert C. difficile and two algorithms in sensitivity, specificity, PPV and NPV. The performances of both algorithms for detecting toxigenic C. difficile were comparable to that of Xpert C. difficile. Either algorithm would be useful in clinical laboratories and can be optimized in the diagnostic workflow of C. difficile depending on costs, test volume, and clinical needs.

  16. Sounds from airguns and fin whales recorded in the mid-Atlantic Ocean, 1999-2009.

    PubMed

    Nieukirk, Sharon L; Mellinger, David K; Moore, Sue E; Klinck, Karolin; Dziak, Robert P; Goslin, Jean

    2012-02-01

    Between 1999 and 2009, autonomous hydrophones were deployed to monitor seismic activity from 16° N to 50° N along the Mid-Atlantic Ridge. These data were examined for airgun sounds produced during offshore surveys for oil and gas deposits, as well as the 20 Hz pulse sounds from fin whales, which may be masked by airgun noise. An automatic detection algorithm was used to identify airgun sound patterns, and fin whale calling levels were summarized via long-term spectral analysis. Both airgun and fin whale sounds were recorded at all sites. Fin whale calling rates were higher at sites north of 32° N, increased during the late summer and fall months at all sites, and peaked during the winter months, a time when airgun noise was often prevalent. Seismic survey vessels were acoustically located off the coasts of three major areas: Newfoundland, northeast Brazil, and Senegal and Mauritania in West Africa. In some cases, airgun sounds were recorded almost 4000 km from the survey vessel in areas that are likely occupied by fin whales, and at some locations airgun sounds were recorded more than 80% days/month for more than 12 consecutive months. © 2012 Acoustical Society of America

  17. 3D membrane segmentation and quantification of intact thick cells using cryo soft X-ray transmission microscopy: A pilot study

    PubMed Central

    Klementieva, Oxana; Werner, Stephan; Guttmann, Peter; Pratsch, Christoph; Cladera, Josep

    2017-01-01

    Structural analysis of biological membranes is important for understanding cell and sub-cellular organelle function as well as their interaction with the surrounding environment. Imaging of whole cells in three dimensions at high spatial resolution remains a significant challenge, particularly for thick cells. Cryo-transmission soft X-ray microscopy (cryo-TXM) has recently gained popularity to image, in 3D, intact thick cells (∼10 μm) with details of sub-cellular architecture and organization in a near-native state. This paper reports a new tool to segment and quantify structural changes of biological membranes in 3D from cryo-TXM images by tracking an initial 2D contour along the third axis of the microscope, through a multi-scale ridge detection followed by an active contours-based model, with a subsequent refinement along the other two axes. A quantitative metric that assesses the grayscale profiles perpendicular to the membrane surfaces is introduced and shown to be linearly related to the membrane thickness. Our methodology has been validated on synthetic phantoms using realistic microscope properties and structure dimensions, as well as on real cryo-TXM data. Results demonstrate the validity of our algorithms for cryo-TXM data analysis. PMID:28376110
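
    Multi-scale ridge detection of the kind used as the first step above is commonly built from the eigenvalues of the Gaussian-scale Hessian. The sketch below is a generic version of that idea on a synthetic membrane-like image; the scales, scale normalization, and test image are assumptions, and this is not the authors' implementation.

```python
import numpy as np
from scipy import ndimage

def multiscale_ridge_strength(img, sigmas=(1.0, 2.0, 4.0)):
    """Generic multi-scale ridge measure: at each scale, bright ridges have a
    strongly negative principal curvature; the response is the maximum over
    scales of that (sign-flipped) curvature."""
    img = img.astype(float)
    response = np.zeros_like(img)
    for s in sigmas:
        # Scale-normalized second derivatives of the Gaussian-smoothed image.
        hxx = ndimage.gaussian_filter(img, s, order=(0, 2)) * s**2
        hyy = ndimage.gaussian_filter(img, s, order=(2, 0)) * s**2
        hxy = ndimage.gaussian_filter(img, s, order=(1, 1)) * s**2
        # Smaller eigenvalue of the 2x2 Hessian at every pixel.
        tmp = np.sqrt(((hxx - hyy) / 2) ** 2 + hxy ** 2)
        lam2 = (hxx + hyy) / 2 - tmp
        response = np.maximum(response, np.clip(-lam2, 0, None))
    return response

# Example: a faint curved membrane-like line buried in noise.
yy, xx = np.mgrid[0:128, 0:128]
membrane = np.exp(-((yy - 64 - 10 * np.sin(xx / 15.0)) ** 2) / 4.0)
noisy = membrane + 0.3 * np.random.default_rng(3).standard_normal((128, 128))
ridge = multiscale_ridge_strength(noisy)
col = ridge[:, 30]
print("strongest ridge response in column 30 at row:", int(col.argmax()))  # ~73
```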

  18. New Seismic Monitoring Station at Mohawk Ridge, Valles Caldera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Peter Morse

    Two new broadband digital seismic stations were installed in the Valles Caldera in 2011 and 2012. The first is located on the summit of Cerros del Abrigo (station code CDAB) and the second is located on the flanks of San Antonio Mountain (station code SAMT). Seismic monitoring stations in the caldera serve multiple purposes. These stations augment and expand the current coverage of the Los Alamos Seismic Network (LASN), which is operated to support seismic and volcanic hazards studies for LANL and northern New Mexico (Figure 1). They also provide unique continuous seismic data within the caldera that can be used for scientific studies of the caldera’s substructure and detection of very small seismic signals that may indicate changes in the current and evolving state of remnant magma that is known to exist beneath the caldera. Since the installation of CDAB and SAMT, several very small earthquakes have already been detected near San Antonio Mountain just west of SAMT (Figure 2). These are the first events to be seen in that area. Caldera stations also improve the detection and epicenter determination quality for larger local earthquakes on the Pajarito Fault System east of the Preserve and the Nacimiento Uplift to the west. These larger earthquakes are a concern to LANL Seismic Hazards assessments and seismic monitoring of the Los Alamos region, including the VCNP, is a DOE requirement. Currently the next closest seismic stations to the caldera are on Pipeline Road (PPR) just west of Los Alamos, and Peralta Ridge (PER) south of the caldera. There is no station coverage near the resurgent dome, Redondo Peak, in the center of the caldera. Filling this “hole” is the highest priority for the next new LASN station. We propose to install this station in 2018 on Mohawk Ridge just east of Redondito, in the same area already occupied by other scientific installations, such as the MCON flux tower operated by UNM.

  19. Optimizations for the EcoPod field identification tool

    PubMed Central

    Manoharan, Aswath; Stamberger, Jeannie; Yu, YuanYuan; Paepcke, Andreas

    2008-01-01

    Background We sketch our species identification tool for palm sized computers that helps knowledgeable observers with census activities. An algorithm turns an identification matrix into a minimal length series of questions that guide the operator towards identification. Historic observation data from the census geographic area helps minimize question volume. We explore how much historic data is required to boost performance, and whether the use of history negatively impacts identification of rare species. We also explore how characteristics of the matrix interact with the algorithm, and how best to predict the probability of observing a previously unseen species. Results Point counts of birds taken at Stanford University's Jasper Ridge Biological Preserve between 2000 and 2005 were used to examine the algorithm. A computer identified species by correctly answering, and counting the algorithm's questions. We also explored how the character density of the key matrix and the theoretical minimum number of questions for each bird in the matrix influenced the algorithm. Our investigation of the required probability smoothing determined whether Laplace smoothing of observation probabilities was sufficient, or whether the more complex Good-Turing technique is required. Conclusion Historic data improved identification speed, but only impacted the top 25% most frequently observed birds. For rare birds the history based algorithms did not impose a noticeable penalty in the number of questions required for identification. For our dataset neither age of the historic data, nor the number of observation years impacted the algorithm. Density of characters for different taxa in the identification matrix did not impact the algorithms. Intrinsic differences in identifying different birds did affect the algorithm, but the differences affected the baseline method of not using historic data to exactly the same degree. We found that Laplace smoothing performed better for rare species than Simple Good-Turing, and that, contrary to expectation, the technique did not then adversely affect identification performance for frequently observed birds. PMID:18366649
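
    The Laplace smoothing examined in the conclusions above can be sketched in a few lines: historic observation counts are smoothed so that species never seen in the census history still receive a non-zero prior. The species names, counts, and alpha value below are hypothetical, and this shows only the smoothing step, not the question-selection algorithm itself.

```python
from collections import Counter

def laplace_probs(observations, species_list, alpha=1.0):
    """Laplace (add-alpha) smoothing of historic observation counts."""
    counts = Counter(observations)
    total = len(observations) + alpha * len(species_list)
    return {sp: (counts.get(sp, 0) + alpha) / total for sp in species_list}

history = ["towhee", "towhee", "junco", "scrub_jay", "towhee"]
species = ["towhee", "junco", "scrub_jay", "condor"]   # "condor" never observed
probs = laplace_probs(history, species)
print(probs["condor"], probs["towhee"])   # rare species keeps a small non-zero probability
```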

  20. Space Object Maneuver Detection Algorithms Using TLE Data

    NASA Astrophysics Data System (ADS)

    Pittelkau, M.

    2016-09-01

    An important aspect of Space Situational Awareness (SSA) is detection of deliberate and accidental orbit changes of space objects. Although space surveillance systems detect orbit maneuvers within their tracking algorithms, maneuver data are not readily disseminated for general use. However, two-line element (TLE) data is available and can be used to detect maneuvers of space objects. This work is an attempt to improve upon existing TLE-based maneuver detection algorithms. Three adaptive maneuver detection algorithms are developed and evaluated: the first is a fading-memory Kalman filter, which is equivalent to the sliding-window least-squares polynomial fit, but computationally more efficient and adaptive to the noise in the TLE data. The second algorithm is based on a sample cumulative distribution function (CDF) computed from a histogram of the magnitude-squared |ΔV|² of change-in-velocity vectors (ΔV), which is computed from the TLE data. A maneuver detection threshold is computed from the median estimated from the CDF, or from the CDF and a specified probability of false alarm. The third algorithm is a median filter. The median filter is the simplest of a class of nonlinear filters called order statistics filters, which is within the theory of robust statistics. The output of the median filter is practically insensitive to outliers, or large maneuvers. The median of the |ΔV|² data is proportional to the variance of the ΔV, so the variance is estimated from the output of the median filter. A maneuver is detected when the input data exceeds a constant times the estimated variance.
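
    The median-filter detector described last is easy to illustrate: a running median of the |ΔV|² sequence tracks the noise level (it is insensitive to the maneuver outliers themselves), and a sample well above a constant times that level is declared a maneuver. The kernel size, scaling constant, and synthetic data below are assumptions, not the paper's tuned values.

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(4)
n = 300
dv2 = rng.chisquare(df=3, size=n) * 1e-6     # noise-only |dV|^2 (synthetic units)
dv2[120] += 5e-4                             # an impulsive maneuver
dv2[230] += 2e-4                             # a smaller maneuver

baseline = medfilt(dv2, kernel_size=21)      # robust running noise estimate
threshold = 25.0 * baseline                  # scaling constant chosen for illustration

valid = slice(10, n - 10)                    # avoid the filter's zero-padded edges
detections = 10 + np.flatnonzero(dv2[valid] > threshold[valid])
print("maneuvers detected at epochs:", detections)   # expected: [120 230]
```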

  1. A Contextual Fire Detection Algorithm for Simulated HJ-1B Imagery.

    PubMed

    Qian, Yonggang; Yan, Guangjian; Duan, Sibo; Kong, Xiangsheng

    2009-01-01

    The HJ-1B satellite, which was launched on September 6, 2008, is one of the small satellites placed in the constellation for disaster prediction and monitoring. HJ-1B imagery containing fires of various sizes and temperatures in a wide range of terrestrial biomes and climates was simulated in this paper, including the RED, NIR, MIR and TIR channels. Based on the MODIS version 4 contextual algorithm and the characteristics of the HJ-1B sensor, a contextual fire detection algorithm was proposed and tested using simulated HJ-1B data. It was evaluated by the probability of fire detection and false alarm as functions of fire temperature and fire area. Results indicate that when the simulated fire area is larger than 45 m² and the simulated fire temperature is larger than 800 K, the algorithm has a higher probability of detection. If the simulated fire area is smaller than 10 m², the fire may be detected only when the simulated fire temperature is larger than 900 K. For fire areas of about 100 m², the proposed algorithm has a higher detection probability than the MODIS product. Finally, the omission and commission errors, which are important factors affecting the performance of the algorithm, were evaluated. It has been demonstrated that HJ-1B satellite data are much more sensitive to smaller and cooler fires than MODIS or AVHRR data, and the improved capabilities of HJ-1B data will offer a fine opportunity for fire detection.
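
    The essence of a contextual fire test is that a candidate pixel must stand out from the statistics of its surrounding background window rather than exceed a fixed global threshold. The sketch below is a generic version of that idea; the window size, k-sigma rule, fixed margins, and synthetic scene are illustrative assumptions, not the HJ-1B or MODIS coefficients.

```python
import numpy as np

def contextual_fire_mask(t_mir, t_diff, win=7, k=3.5, d=6.0):
    """Generic contextual fire test: flag a pixel when its MIR brightness
    temperature and the MIR-TIR difference both exceed the local background
    mean by k standard deviations plus a fixed margin d (kelvin)."""
    h, w = t_mir.shape
    r = win // 2
    mask = np.zeros((h, w), dtype=bool)
    for y in range(r, h - r):
        for x in range(r, w - r):
            bg_mir = t_mir[y - r:y + r + 1, x - r:x + r + 1]
            bg_dif = t_diff[y - r:y + r + 1, x - r:x + r + 1]
            if (t_mir[y, x] > bg_mir.mean() + k * bg_mir.std() + d and
                    t_diff[y, x] > bg_dif.mean() + k * bg_dif.std() + d):
                mask[y, x] = True
    return mask

# Synthetic scene: warm background with one hot sub-pixel fire. A fire raises
# the MIR brightness temperature far more than the TIR.
rng = np.random.default_rng(5)
t_mir = 300 + rng.normal(0, 1.0, (60, 60))
t_tir = 295 + rng.normal(0, 1.0, (60, 60))
t_mir[30, 30] += 40.0
t_tir[30, 30] += 5.0
fires = contextual_fire_mask(t_mir, t_mir - t_tir)
print("fire pixels:", np.argwhere(fires))   # expected: [[30 30]]
```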

  2. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on auto fluorescent data provided by the National Institute of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm that classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with VMF algorithm performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than individual images.

  3. Continuous Glucose Monitoring Enables the Detection of Losses in Infusion Set Actuation (LISAs)

    PubMed Central

    Howsmon, Daniel P.; Cameron, Faye; Baysal, Nihat; Ly, Trang T.; Forlenza, Gregory P.; Maahs, David M.; Buckingham, Bruce A.; Hahn, Juergen; Bequette, B. Wayne

    2017-01-01

    Reliable continuous glucose monitoring (CGM) enables a variety of advanced technology for the treatment of type 1 diabetes. In addition to artificial pancreas algorithms that use CGM to automate continuous subcutaneous insulin infusion (CSII), CGM can also inform fault detection algorithms that alert patients to problems in CGM or CSII. Losses in infusion set actuation (LISAs) can adversely affect clinical outcomes, resulting in hyperglycemia due to impaired insulin delivery. Prolonged hyperglycemia may lead to diabetic ketoacidosis—a serious metabolic complication in type 1 diabetes. Therefore, an algorithm for the detection of LISAs based on CGM and CSII signals was developed to improve patient safety. The LISA detection algorithm is trained retrospectively on data from 62 infusion set insertions from 20 patients. The algorithm collects glucose and insulin data, and computes relevant fault metrics over two different sliding windows; an alarm sounds when these fault metrics are exceeded. With the chosen algorithm parameters, the LISA detection strategy achieved a sensitivity of 71.8% and issued 0.28 false positives per day on the training data. Validation on two independent data sets confirmed that similar performance is seen on data that was not used for training. The developed algorithm is able to effectively alert patients to possible infusion set failures in open-loop scenarios, with limited evidence of its extension to closed-loop scenarios. PMID:28098839
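
    As a toy illustration of the sliding-window idea described above, the sketch below raises an alarm when glucose has risen persistently over a short window even though insulin was delivered over a longer window, suggesting the infusion set is not actuating. The window lengths, thresholds, fault metrics, and synthetic traces are all assumptions, not the trained parameters of the published detector.

```python
import numpy as np

def lisa_alarm(glucose, insulin, short_win=6, long_win=36,
               rise_thresh=15.0, insulin_thresh=1.5):
    """Hypothetical LISA-style check over two sliding windows (sketch only)."""
    alarms = []
    for t in range(long_win, len(glucose)):
        rise = glucose[t] - glucose[t - short_win]     # mg/dL over ~30 min
        dosed = insulin[t - long_win:t].sum()          # units over ~3 h
        if rise > rise_thresh and dosed > insulin_thresh:
            alarms.append(t)
    return alarms

# Synthetic 5-minute CGM/CSII traces with an infusion-set failure at sample 60.
t = np.arange(120)
glucose = 120 + np.where(t < 60, 0.0, 3.0 * (t - 60))   # steady rise after failure
insulin = np.full(120, 0.08)                             # basal delivery each sample
print("first alarm at sample:", lisa_alarm(glucose, insulin)[0])   # expected: 66
```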

  4. Continuous Glucose Monitoring Enables the Detection of Losses in Infusion Set Actuation (LISAs).

    PubMed

    Howsmon, Daniel P; Cameron, Faye; Baysal, Nihat; Ly, Trang T; Forlenza, Gregory P; Maahs, David M; Buckingham, Bruce A; Hahn, Juergen; Bequette, B Wayne

    2017-01-15

    Reliable continuous glucose monitoring (CGM) enables a variety of advanced technology for the treatment of type 1 diabetes. In addition to artificial pancreas algorithms that use CGM to automate continuous subcutaneous insulin infusion (CSII), CGM can also inform fault detection algorithms that alert patients to problems in CGM or CSII. Losses in infusion set actuation (LISAs) can adversely affect clinical outcomes, resulting in hyperglycemia due to impaired insulin delivery. Prolonged hyperglycemia may lead to diabetic ketoacidosis-a serious metabolic complication in type 1 diabetes. Therefore, an algorithm for the detection of LISAs based on CGM and CSII signals was developed to improve patient safety. The LISA detection algorithm is trained retrospectively on data from 62 infusion set insertions from 20 patients. The algorithm collects glucose and insulin data, and computes relevant fault metrics over two different sliding windows; an alarm sounds when these fault metrics are exceeded. With the chosen algorithm parameters, the LISA detection strategy achieved a sensitivity of 71.8% and issued 0.28 false positives per day on the training data. Validation on two independent data sets confirmed that similar performance is seen on data that was not used for training. The developed algorithm is able to effectively alert patients to possible infusion set failures in open-loop scenarios, with limited evidence of its extension to closed-loop scenarios.

  5. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432

  6. Detection of facilities in satellite imagery using semi-supervised image classification and auxiliary contextual observables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Neal R; Ruggiero, Christy E; Pawley, Norma H

    2009-01-01

    Detecting complex targets, such as facilities, in commercially available satellite imagery is a difficult problem that human analysts try to solve by applying world knowledge. Often there are known observables that can be extracted by pixel-level feature detectors that can assist in the facility detection process. Individually, each of these observables is not sufficient for an accurate and reliable detection, but in combination, these auxiliary observables may provide sufficient context for detection by a machine learning algorithm. We describe an approach for automatic detection of facilities that uses an automated feature extraction algorithm to extract auxiliary observables, and a semi-supervised assisted target recognition algorithm to then identify facilities of interest. We illustrate the approach using an example of finding schools in Quickbird image data of Albuquerque, New Mexico. We use Los Alamos National Laboratory's Genie Pro automated feature extraction algorithm to find a set of auxiliary features that should be useful in the search for schools, such as parking lots, large buildings, sports fields and residential areas and then combine these features using Genie Pro's assisted target recognition algorithm to learn a classifier that finds schools in the image data.

  7. Detection and inpainting of facial wrinkles using texture orientation fields and Markov random field modeling.

    PubMed

    Batool, Nazre; Chellappa, Rama

    2014-09-01

    Facial retouching is widely used in the media and entertainment industry. Professional software usually requires a minimum level of user expertise to achieve the desired results. In this paper, we present an algorithm to detect facial wrinkles/imperfections. We believe that any such algorithm would be amenable to facial retouching applications. The detection of wrinkles/imperfections can allow these skin features to be processed differently than the surrounding skin without much user interaction. For detection, Gabor filter responses along with the texture orientation field are used as image features. A bimodal Gaussian mixture model (GMM) represents distributions of Gabor features of normal skin versus skin imperfections. Then, a Markov random field model is used to incorporate the spatial relationships among neighboring pixels for their GMM distributions and texture orientations. An expectation-maximization algorithm then classifies skin versus skin wrinkles/imperfections. Once detected automatically, wrinkles/imperfections are removed completely instead of being blended or blurred. We propose an exemplar-based constrained texture synthesis algorithm to inpaint irregularly shaped gaps left by the removal of detected wrinkles/imperfections. We present results on images downloaded from the Internet to show the efficacy of our algorithms.

  8. Application of multivariate Gaussian detection theory to known non-Gaussian probability density functions

    NASA Astrophysics Data System (ADS)

    Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.

    1995-06-01

    A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms and their performance models which are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). The applicability of these algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms which convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics, under the Box-Cox transform, is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm which has an associated Gaussian performance model.
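
    The Box-Cox step referenced above is readily illustrated: positively valued, skewed data are transformed toward Gaussianity before a detector that assumes Gaussian statistics is applied. The sketch below uses scipy's maximum-likelihood lambda fit on synthetic lognormal "clutter"; the data and the 3-sigma threshold are illustrative assumptions, and this shows only the basic transform, not the joint-PDF extension developed in the paper.

```python
import numpy as np
from scipy import stats

# Box-Cox transform: y = (x**lam - 1) / lam  (log(x) when lam == 0).
rng = np.random.default_rng(6)
x = rng.lognormal(mean=0.0, sigma=0.8, size=2000)     # skewed, positive data

y, lam = stats.boxcox(x)                              # transformed data, fitted lambda
print(f"fitted lambda = {lam:.3f}")
print(f"skewness before = {stats.skew(x):.2f}, after = {stats.skew(y):.2f}")

# Gaussian statistics estimated on the transformed data can now feed a
# Gaussian-based detection threshold (e.g., mean + 3*sigma).
mu, sigma = y.mean(), y.std(ddof=1)
print(f"3-sigma threshold in transformed space: {mu + 3 * sigma:.3f}")
```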

  9. Algorithms used in the Airborne Lidar Processing System (ALPS)

    USGS Publications Warehouse

    Nagle, David B.; Wright, C. Wayne

    2016-05-23

    The Airborne Lidar Processing System (ALPS) analyzes Experimental Advanced Airborne Research Lidar (EAARL) data—digitized laser-return waveforms, position, and attitude data—to derive point clouds of target surfaces. A full-waveform airborne lidar system, the EAARL seamlessly and simultaneously collects mixed environment data, including submerged, sub-aerial bare earth, and vegetation-covered topographies. ALPS uses three waveform target-detection algorithms to determine target positions within a given waveform: centroid analysis, leading edge detection, and bottom detection using water-column backscatter modeling. The centroid analysis algorithm detects opaque hard surfaces. The leading edge algorithm detects topography beneath vegetation and shallow, submerged topography. The bottom detection algorithm uses water-column backscatter modeling for deeper submerged topography in turbid water. The report describes slant range calculations and explains how ALPS uses laser range and orientation measurements to project measurement points into the Universal Transverse Mercator coordinate system. Parameters used for coordinate transformations in ALPS are described, as are Interactive Data Language-based methods for gridding EAARL point cloud data to derive digital elevation models. Noise reduction in point clouds through use of a random consensus filter is explained, and detailed pseudocode, mathematical equations, and Yorick source code accompany the report.
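
    The centroid-analysis idea for opaque hard surfaces can be sketched simply: take the intensity-weighted centroid of the return pulse samples and convert the two-way travel time to range. The noise-floor handling, threshold, sampling interval, and synthetic waveform below are assumptions for illustration and are not drawn from the ALPS/Yorick source.

```python
import numpy as np

def waveform_centroid_range(waveform, sample_ns=1.0, threshold_frac=0.3):
    """Generic centroid target detection: the range to an opaque hard surface is
    taken from the intensity-weighted centroid of samples above a threshold."""
    w = np.asarray(waveform, dtype=float)
    w = w - np.median(w)                       # crude noise-floor removal
    w[w < threshold_frac * w.max()] = 0.0      # keep only the return peak
    t = np.arange(len(w)) * sample_ns          # time of each sample, ns
    centroid_ns = (t * w).sum() / w.sum()      # intensity-weighted centroid
    c = 0.299792458                            # speed of light, m/ns
    return centroid_ns * c / 2.0               # two-way travel time -> range (m)

# Synthetic digitized return: noise floor plus a Gaussian pulse centered at 80 ns.
t = np.arange(200)
pulse = 50.0 * np.exp(-((t - 80.0) ** 2) / 18.0)
waveform = 5.0 + pulse + np.random.default_rng(7).normal(0, 0.5, 200)
print(f"estimated range: {waveform_centroid_range(waveform):.2f} m")   # ~12 m
```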

  10. Distribution majorization of corner points by reinforcement learning for moving object detection

    NASA Astrophysics Data System (ADS)

    Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang

    2018-04-01

    Corner points play an important role in moving object detection, especially in the case of a free-moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, the information provided by preceding and subsequent frames can also be used. We utilize this information to focus on more valuable areas and ignore less valuable ones. The proposed algorithm is based on reinforcement learning, which regards the detection of corner points as a Markov process. In the Markov model, the video to be detected is regarded as the environment, the selections of blocks for one corner point are regarded as actions, and the performance of detection is regarded as the state. Corner points are assigned to blocks which are separated from the original whole image. Experimentally, we select a conventional method, which uses marching and the Random Sample Consensus algorithm to obtain objects, as the main framework and utilize our algorithm to improve the result. The comparison between the conventional method and the same method with our algorithm shows that our algorithm reduces false detections by 70%.

  11. Management and Analysis of Radiation Portal Monitor Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Nathan C; Alcala, Scott; Crye, Jason Michael

    2014-01-01

    Oak Ridge National Laboratory (ORNL) receives, archives, and analyzes data from radiation portal monitors (RPMs). Over time the amount of data submitted for analysis has grown significantly, and in fiscal year 2013, ORNL received 545 gigabytes of data representing more than 230,000 RPM operating days. This data comes from more than 900 RPMs. ORNL extracts this data into a relational database, which is accessed through a custom software solution called the Desktop Analysis and Reporting Tool (DART). DART is used by data analysts to complete a monthly lane-by-lane review of RPM status. Recently ORNL has begun to extend its data analysis based on program-wide data processing in addition to the lane-by-lane review. Program-wide data processing includes the use of classification algorithms designed to identify RPMs with specific known issues and clustering algorithms intended to identify as-yet-unknown issues or new methods and measures for use in future classification algorithms. This paper provides an overview of the architecture used in the management of this data, performance aspects of the system, and additional requirements and methods used in moving toward an increased program-wide analysis paradigm.

  12. Sensor Data Quality and Angular Rate Down-Selection Algorithms on SLS EM-1

    NASA Technical Reports Server (NTRS)

    Park, Thomas; Smith, Austin; Oliver, T. Emerson

    2018-01-01

    The NASA Space Launch System Block 1 launch vehicle is equipped with an Inertial Navigation System (INS) and multiple Rate Gyro Assemblies (RGA) that are used in the Guidance, Navigation, and Control (GN&C) algorithms. The INS provides the inertial position, velocity, and attitude of the vehicle along with both angular rate and specific force measurements. Additionally, multiple sets of co-located rate gyros supply angular rate data. The collection of angular rate data, taken along the launch vehicle, is used to separate out vehicle motion from flexible body dynamics. Since the system architecture uses redundant sensors, the capability was developed to evaluate the health (or validity) of the independent measurements. A suite of Sensor Data Quality (SDQ) algorithms is responsible for assessing the angular rate data from the redundant sensors. When failures are detected, SDQ will take the appropriate action and disqualify or remove faulted sensors from forward processing. Additionally, the SDQ algorithms contain logic for down-selecting the angular rate data used by the GNC software from the set of healthy measurements. This paper explores the trades and analyses that were performed in selecting a set of robust fault-detection algorithms included in the GN&C flight software. These trades included both an assessment of hardware-provided health and status data as well as an evaluation of different algorithms based on time-to-detection, type of failures detected, and probability of detecting false positives. We then provide an overview of the algorithms used for both fault-detection and measurement down selection. We next discuss the role of trajectory design, flexible-body models, and vehicle response to off-nominal conditions in setting the detection thresholds. Lastly, we present lessons learned from software integration and hardware-in-the-loop testing.

  13. Extracting information from the text of electronic medical records to improve case detection: a systematic review

    PubMed Central

    Carroll, John A; Smith, Helen E; Scott, Donia; Cassell, Jackie A

    2016-01-01

    Background Electronic medical records (EMRs) are revolutionizing health-related research. One key issue for study quality is the accurate identification of patients with the condition of interest. Information in EMRs can be entered as structured codes or unstructured free text. The majority of research studies have used only coded parts of EMRs for case-detection, which may bias findings, miss cases, and reduce study quality. This review examines whether incorporating information from text into case-detection algorithms can improve research quality. Methods A systematic search returned 9659 papers, 67 of which reported on the extraction of information from free text of EMRs with the stated purpose of detecting cases of a named clinical condition. Methods for extracting information from text and the technical accuracy of case-detection algorithms were reviewed. Results Studies mainly used US hospital-based EMRs, and extracted information from text for 41 conditions using keyword searches, rule-based algorithms, and machine learning methods. There was no clear difference in case-detection algorithm accuracy between rule-based and machine learning methods of extraction. Inclusion of information from text resulted in a significant improvement in algorithm sensitivity and area under the receiver operating characteristic in comparison to codes alone (median sensitivity 78% (codes + text) vs 62% (codes), P = .03; median area under the receiver operating characteristic 95% (codes + text) vs 88% (codes), P = .025). Conclusions Text in EMRs is accessible, especially with open source information extraction algorithms, and significantly improves case detection when combined with codes. More harmonization of reporting within EMR studies is needed, particularly standardized reporting of algorithm accuracy metrics like positive predictive value (precision) and sensitivity (recall). PMID:26911811

  14. Mid-ocean ridge jumps associated with hotspot magmatism

    NASA Astrophysics Data System (ADS)

    Mittelstaedt, Eric; Ito, Garrett; Behn, Mark D.

    2008-02-01

    Hotspot-ridge interaction produces a wide range of phenomena including excess crustal thickness, geochemical anomalies, off-axis volcanic ridges and ridge relocations or jumps. Ridges are recorded to have jumped toward many hotspots including Iceland, Discovery, Galápagos, Kerguelen and Tristan da Cunha. The causes of ridge jumps likely involve a number of interacting processes related to hotspots. One such process is reheating of the lithosphere as magma penetrates it to feed near-axis volcanism. We study this effect by using the hybrid, finite-element code, FLAC, to simulate two-dimensional (2-D, cross-section) viscous mantle flow, elasto-plastic deformation of the lithosphere and heat transport in a ridge setting near an off-axis hotspot. Heating due to magma transport through the lithosphere is implemented within a hotspot region of fixed width. To determine the conditions necessary to initiate a ridge jump, we vary four parameters: hotspot magmatic heating rate, spreading rate, seafloor age at the location of the hotspot and ridge migration rate. Our results indicate that the hotspot magmatic heating rate required to initiate a ridge jump increases non-linearly with increasing spreading rate and seafloor age. Models predict that magmatic heating, itself, is most likely to cause jumps at slow spreading rates such as at the Mid-Atlantic Ridge on Iceland. In contrast, despite the higher magma flux at the Galápagos hotspot, magmatic heating alone is probably insufficient to induce a ridge jump at the present day due to the intermediate ridge spreading rate of the Galápagos Spreading Center. The time required to achieve a ridge jump, for fixed or migrating ridges, is found to be on the order of 10^5-10^6 years. Simulations that incorporate ridge migration predict that after a ridge jump occurs the hotspot and ridge migrate together for time periods that increase with magma flux. Model results also suggest a mechanism for ridge reorganizations not related to hotspots such as ridge jumps in back-arc settings and ridge segment propagation along the Mid-Atlantic Ridge.

  15. Appearance of De Geer moraines in southern and western Finland - Implications for reconstructing glacier retreat dynamics

    NASA Astrophysics Data System (ADS)

    Ojala, Antti E. K.

    2016-02-01

    LiDAR digital elevation models (DEMs) from southern and western Finland were investigated to map and discriminate features of De Geer moraines, sparser and more scattered end moraines, and larger end moraine features (i.e., ice-marginal complexes). The results indicate that the occurrence and distribution of De Geer moraines and scattered end moraine ridges in Finland are more widespread than previously suggested. This is probably attributed to the ease of detecting and mapping these features with high-resolution DEMs, indicating the efficiency of LiDAR applications in geological and geomorphological studies. The variable appearance and distribution of moraine ridges in Finland support previous interpretations that no single model is likely to be appropriate for the genesis of De Geer moraines at all localities and for all types of end moraines. De Geer moraine appearances and interdistances probably result from a combination of the general rapidity of ice-margin recession during deglaciation, the proglacial water depth in which they were formed, and local glacier dynamics related to climate and terrain topography. The correlation between the varved clay-based rate of deglaciation and interdistances of distinct and regularly spaced De Geer moraine ridges indicates that the rate of deglaciation is probably involved in the De Geer ridge-forming process, but more thorough comparisons are needed to understand the extent to which De Geer interdistances represent an annual rate of ice-margin decay and the rapidity of regional deglaciation.

  16. Landscape genetics of raccoons (Procyon lotor) associated with ridges and valleys of Pennsylvania: implications for oral rabies vaccination programs.

    PubMed

    Root, J Jeffrey; Puskas, Robert B; Fischer, Justin W; Swope, Craig B; Neubaum, Melissa A; Reeder, Serena A; Piaggio, Antoinette J

    2009-12-01

    Raccoons are the reservoir for the raccoon rabies virus variant in the United States. To combat this threat, oral rabies vaccination (ORV) programs are conducted in many eastern states. To aid in these efforts, the genetic structure of raccoons (Procyon lotor) was assessed in southwestern Pennsylvania to determine if select geographic features (i.e., ridges and valleys) serve as corridors or hindrances to raccoon gene flow (e.g., movement) and, therefore, rabies virus trafficking in this physiographic region. Raccoon DNA samples (n = 185) were collected from one ridge site and two adjacent valleys in southwestern Pennsylvania (Westmoreland, Cambria, Fayette, and Somerset counties). Raccoon genetic structure within and among these study sites was characterized at nine microsatellite loci. Results indicated that there was little population subdivision among any sites sampled. Furthermore, analyses using a model-based clustering approach indicated one essentially panmictic population was present among all the raccoons sampled over a reasonably broad geographic area (e.g., sites up to 36 km apart). However, a signature of isolation by distance was detected, suggesting that widths of ORV zones are critical for success. Combined, these data indicate that geographic features within this landscape influence raccoon gene flow only to a limited extent, suggesting that ridges of this physiographic system will not provide substantial long-term natural barriers to rabies virus trafficking. These results may be of value for future ORV efforts in Pennsylvania and other eastern states with similar landscapes.

  17. Alveolar ridge keratosis - a retrospective clinicopathological study

    PubMed Central

    2013-01-01

    Background Alveolar ridge keratosis (ARK) is a distinct, benign clinicopathological entity, characterized by a hyperkeratotic plaque or patch that occurs on the edentulous alveolar ridge or on the retromolar trigone and is considered to be caused by chronic frictional trauma. The aim of this retrospective study is to present the clinicopathological features of 23 consecutive cases of ARK. Material and methods The 23 biopsy samples of ARK were selected and their pathological features were reviewed (keratosis, acanthosis, surface architecture, and inflammation). Factors such as the patient’s gender, age, anatomical location, and tobacco and alcohol use were analyzed. Results Sixteen of the 23 patients were men and 7 were women, with a mean age of 55.05 years (range 17 to 88 years). Thirteen patients had a history of tobacco use, 4 of whom also reported alcohol consumption. All cases presented only unilateral lesions. Nineteen cases involved the retromolar trigone, while 4 involved edentulous alveolar ridges. Microscopically, the lesions were mainly characterized by moderate to marked hyperorthokeratosis. Inflammation was scanty or absent. In four cases, melanin pigment was detected in the superficial corium or in the cytoplasm of macrophages. None of the cases showed any features of dysplasia. Conclusion Our results confirm that ARK is a benign lesion. However, the high prevalence of smokers among the patients suggests that potentially malignant disorders such as tobacco-associated leukoplakia may clinically mimic ARK. PMID:23587097

  18. Impacts of Volcanic Eruptions and Disturbances on Mid-Ocean Ridge Biological Communities

    NASA Astrophysics Data System (ADS)

    Shank, T. M.

    2009-12-01

    Understanding ecological processes in mid-ocean ridge benthic environments requires knowledge of the temporal and spatial scales over which those processes take place. Over the past 17 years, the detection and now “direct observation” of more than nine seafloor eruptions and even more numerous and diverse geologic disturbances (e.g., dyking and cracking events) have provided a broad spectrum of perturbing seafloor phenomena that serve as key agents for creating new vent habitat, providing bursts of nutrients, supporting blooms of microbial and macrobiological communities, imparting magmatic/hydrothermal fluxes, controlling fluid geochemical composition, altering the successional stage of faunal communities, guiding the temporal and spatial scales of local extinction and recolonization, and directing the evolution of physiological adaptations. Eruptions have now been documented on the East Pacific Rise, Southern Mid-Atlantic Ridge, Gakkel Ridge, Galapagos Rift, CoAxial, Northwest Rota, West Mata, and Loihi Seamounts, representing diverse eruptive styles, from explosive pyroclastic deposits to thin lava flows. These processes occur in different biogeographic regions hosting different regional species pools. As such, not only do these eruptions provide a method of establishing a “time-zero” with which to construct manipulative temporal experiments, but they also provide a contextual framework with which to interpret the effect that eruptions and disturbances have on ecological interactions in different biogeographic regions of the world, and the timescales over which they vary. The temporal and spatial impact of these different eruptive styles in relation to the alteration of biological community structure will be discussed.

  19. A landslide-quake detection algorithm with STA/LTA and diagnostic functions of moving average and scintillation index: A preliminary case study of the 2009 Typhoon Morakot in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Yu-Jie; Lin, Guan-Wei

    2017-04-01

    Since 1999, Taiwan has experienced a rapid rise in the number of landslides, and the number reached a peak after the 2009 Typhoon Morakot. Although it has been shown that ground-motion signals induced by slope processes can be recorded by seismographs, they are difficult to distinguish in continuous seismic records owing to the lack of distinct P and S waves. In this study, we combine three common seismic detectors: the short-term average/long-term average (STA/LTA) approach and two diagnostic functions, the moving average and the scintillation index. Based on these detectors, we establish an automatic detection algorithm for landslide-quakes, with detection thresholds defined to distinguish landslide-quakes from earthquakes and background noise. To further improve the proposed detection algorithm, we apply it to seismic archives recorded by the Broadband Array in Taiwan for Seismology (BATS) during the 2009 Typhoon Morakot, and the discrete landslide-quakes detected by the automatic algorithm are then located. The detection results are consistent with those of visual inspection, and the algorithm can therefore be used to monitor landslide-quakes automatically.
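
    To make the first of the three detectors concrete, the sketch below implements a basic STA/LTA trigger on a synthetic trace. It is only an illustration of the general technique named in the abstract: the window lengths, the energy-based ratio, and the threshold of 4.0 are assumptions, not the values or the combined moving-average/scintillation-index logic used in the study.

      import numpy as np

      def sta_lta(trace, sta_len, lta_len):
          """Short-term over long-term average of signal energy (window lengths in samples)."""
          sq = np.asarray(trace, dtype=float) ** 2
          csum = np.concatenate(([0.0], np.cumsum(sq)))
          ratio = np.zeros(len(sq))
          for i in range(lta_len, len(sq)):
              sta = (csum[i + 1] - csum[i + 1 - sta_len]) / sta_len
              lta = (csum[i + 1] - csum[i + 1 - lta_len]) / lta_len
              ratio[i] = sta / max(lta, 1e-12)
          return ratio

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          trace = rng.normal(0.0, 1.0, 6000)
          trace[3000:3400] += rng.normal(0.0, 6.0, 400)   # emergent burst without clear P/S onsets
          ratio = sta_lta(trace, sta_len=50, lta_len=1000)
          triggers = np.flatnonzero(ratio > 4.0)          # illustrative trigger threshold
          print("first trigger sample:", int(triggers[0]) if triggers.size else None)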

  20. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
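
    The sketch below illustrates the flavor of such a sequential test when the post-damage distribution is unknown: a CUSUM-style log-likelihood-ratio statistic in which the post-damage mean is re-estimated on the fly from a rolling window. The Gaussian feature model, the rolling-window estimate (a crude stand-in for the paper's maximum likelihood and Bayesian updates), and the threshold are illustrative assumptions.

      import numpy as np

      def sequential_detect(feature, mu0, sigma, h=10.0, window=50):
          """Return the sample index at which damage is declared, or None."""
          stat = 0.0
          for k in range(1, len(feature)):
              # rolling estimate of the unknown post-damage mean from recent samples
              mu1 = float(np.mean(feature[max(0, k - window):k]))
              if abs(mu1 - mu0) < 1e-9:
                  continue
              # Gaussian log-likelihood ratio of the newest sample, H1 versus H0
              llr = ((feature[k] - mu0) ** 2 - (feature[k] - mu1) ** 2) / (2.0 * sigma ** 2)
              stat = max(0.0, stat + llr)          # CUSUM reset keeps the detection delay short
              if stat > h:
                  return k
          return None

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          feature = np.concatenate([rng.normal(0.0, 1.0, 500),   # pre-damage AR feature
                                    rng.normal(0.8, 1.0, 500)])  # shifted after damage
          print("damage declared at sample:", sequential_detect(feature, mu0=0.0, sigma=1.0))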

  1. The development of line-scan image recognition algorithms for the detection of frass on mature tomatoes

    USDA-ARS?s Scientific Manuscript database

    In this research, a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet LED excitation was developed for the detection of frass contamination on mature tomatoes. The algorithm utilized the fluorescence intensities at two wavebands, 664 nm and 690 nm, for co...
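
    The abstract is truncated before the decision rule, so the sketch below only illustrates the general idea of a two-band multispectral test at the stated 664 nm and 690 nm wavebands; the ratio form and the threshold are assumptions for illustration, not the published algorithm.

      import numpy as np

      def detect_frass(band_664, band_690, threshold=1.15):
          """Flag pixels whose 664 nm / 690 nm fluorescence intensity ratio is high."""
          ratio = band_664 / np.clip(band_690, 1e-6, None)
          return ratio > threshold

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          b664 = rng.uniform(0.2, 1.0, (64, 64))   # stand-in single-band images
          b690 = rng.uniform(0.2, 1.0, (64, 64))
          print("flagged pixels:", int(detect_frass(b664, b690).sum()))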

  2. Detecting chaos in irregularly sampled time series.

    PubMed

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
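
    A minimal sketch of the central step, assuming SciPy is available: the power spectrum of an irregularly sampled series is computed with the Lomb-Scargle periodogram rather than the DFT. The peak-counting line at the end is only a placeholder; the paper's spectral criterion for distinguishing chaotic from non-chaotic behavior is more elaborate.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(3)
      t = np.sort(rng.uniform(0.0, 100.0, 800))          # irregular sampling times
      y = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)

      freqs_hz = np.linspace(0.01, 2.0, 2000)
      power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs_hz, normalize=True)

      # a few sharp peaks suggest (quasi-)periodicity; a broad, slowly decaying
      # spectrum is what a chaos test would need to examine more carefully
      print("strong spectral peaks:", int(np.sum(power > 0.5 * power.max())))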

  3. Foliage penetration by using 4-D point cloud data

    NASA Astrophysics Data System (ADS)

    Méndez Rodríguez, Javier; Sánchez-Reyes, Pedro J.; Cruz-Rivera, Sol M.

    2012-06-01

    Real-time awareness and rapid target detection are critical for the success of military missions. New technologies capable of detecting targets concealed in forest areas are needed in order to track and identify possible threats. Currently, LAser Detection And Ranging (LADAR) systems are capable of detecting obscured targets; however, tracking capabilities are severely limited. Now, a new LADAR-derived technology is under development to generate 4-D datasets (3-D video in a point cloud format). As such, there is a new need for algorithms that are able to process data in real time. We propose an algorithm capable of removing vegetation and other objects that may obscure concealed targets in a real 3-D environment. The algorithm is based on wavelets and can be used as a pre-processing step in a target recognition algorithm. Applications of the algorithm in a real-time 3-D system could help make pilots aware of high-risk hidden targets such as tanks and weapons, among others. We use simulated 4-D point cloud data to demonstrate the capabilities of our algorithm.

  4. Hyperspectral data acquisition and analysis in imaging and real-time active MIR backscattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Jarvis, Jan; Haertelt, Marko; Hugger, Stefan; Butschek, Lorenz; Fuchs, Frank; Ostendorf, Ralf; Wagner, Joachim; Beyerer, Juergen

    2017-04-01

    In this work we present data analysis algorithms for the detection of hazardous substances in hyperspectral observations acquired using active mid-infrared (MIR) backscattering spectroscopy. We present a novel background extraction algorithm, the adaptive background generation process (ABGP), based on the adaptive target generation process proposed by Ren and Chang; it generates a robust and physically meaningful set of background spectra for operation of the well-known adaptive matched subspace detection (AMSD) algorithm. It is shown that the resulting AMSD-ABGP detection algorithm competes well with other widely used detection algorithms. The method is demonstrated on measurement data obtained with two fundamentally different active MIR hyperspectral data acquisition devices. A hyperspectral image sensor suited to static scenes takes a wavelength-sequential approach to hyperspectral data acquisition, whereas a rapid wavelength-scanning single-element detector variant of the same principle uses spatial scanning to generate the hyperspectral observation. It is shown that the measurement timescale of the latter is sufficient for the application of the data analysis algorithms even in dynamic scenarios.
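
    As a rough illustration of the detector that ABGP feeds, the sketch below evaluates the standard matched subspace (GLRT) statistic for a single pixel spectrum, with the background subspace simply assumed to be given; the paper's ABGP construction of that subspace, and all sensor specifics, are omitted.

      import numpy as np

      def proj_perp(X):
          """Projector onto the orthogonal complement of the column space of X."""
          return np.eye(X.shape[0]) - X @ np.linalg.pinv(X)

      def amsd_statistic(y, target, background):
          """Matched subspace GLRT: energy explained by the target beyond the background."""
          Z = np.hstack([target, background])
          num = y @ proj_perp(background) @ y - y @ proj_perp(Z) @ y
          den = y @ proj_perp(Z) @ y
          return num / den

      if __name__ == "__main__":
          rng = np.random.default_rng(4)
          bands = 60
          background = rng.normal(size=(bands, 5))             # assumed background spectra
          target = rng.normal(size=(bands, 1))                 # assumed target signature
          clutter_pix = background @ rng.normal(size=5) + 0.05 * rng.normal(size=bands)
          target_pix = clutter_pix + 0.5 * target[:, 0]
          print("background pixel:", round(amsd_statistic(clutter_pix, target, background), 3))
          print("target pixel:   ", round(amsd_statistic(target_pix, target, background), 3))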

  5. An OMIC biomarker detection algorithm TriVote and its application in methylomic biomarker detection.

    PubMed

    Xu, Cheng; Liu, Jiamei; Yang, Weifeng; Shu, Yayun; Wei, Zhipeng; Zheng, Weiwei; Feng, Xin; Zhou, Fengfeng

    2018-04-01

    Transcriptomic and methylomic patterns represent two major OMIC data sources impacted by both inheritable genetic information and environmental factors, and have been widely used as disease diagnosis and prognosis biomarkers. Modern transcriptomic and methylomic profiling technologies detect the status of tens of thousands or even millions of probing residues in the human genome, and introduce a major computational challenge for the existing feature selection algorithms. This study proposes a three-step feature selection algorithm, TriVote, to detect a subset of transcriptomic or methylomic residues with highly accurate binary classification performance. TriVote outperforms both filter and wrapper feature selection algorithms with higher classification accuracy and fewer selected features on 17 transcriptomes and two methylomes. The biological functions of the methylome biomarkers detected by TriVote are discussed in terms of their disease associations. An easy-to-use Python package is also released to facilitate further applications.

  6. Decision-level fusion of SAR and IR sensor information for automatic target detection

    NASA Astrophysics Data System (ADS)

    Cho, Young-Rae; Yim, Sung-Hyuk; Cho, Hyun-Woong; Won, Jin-Ju; Song, Woo-Jin; Kim, So-Hyeon

    2017-05-01

    We propose a decision-level architecture that combines synthetic aperture radar (SAR) and an infrared (IR) sensor for automatic target detection. We present a new size-based feature, called target-silhouette, to reduce the number of false alarms produced by the conventional target-detection algorithm. Boolean Map Visual Theory is used to combine a pair of SAR and IR images to generate the target-enhanced map. Then basic belief assignment is used to transform this map into a belief map. The detection results of the two sensors are combined to build the target-silhouette map. We integrate the fusion mass and the target-silhouette map at the decision level to exclude false alarms. The proposed algorithm is evaluated using a SAR and IR synthetic database generated by the SE-WORKBENCH simulator and compared with conventional algorithms. The proposed fusion scheme achieves a higher detection rate and a lower false alarm rate than the conventional algorithms.

  7. Detecting P and S-wave of Mt. Rinjani seismic based on a locally stationary autoregressive (LSAR) model

    NASA Astrophysics Data System (ADS)

    Nurhaida, Subanar, Abdurakhman, Abadi, Agus Maman

    2017-08-01

    Seismic data are usually modelled using autoregressive processes. The aim of this paper is to find the arrival times of the seismic waves of Mt. Rinjani in Indonesia. Kitagawa's algorithm is used to detect the seismic P and S-waves. The Householder transformation used in the algorithm makes it effective in finding the number of change points and the parameters of the autoregressive models. The results show that the use of the Box-Cox transformation at the variable selection level makes the algorithm work well in detecting the change points. Furthermore, when the basic span of the subinterval is set to 200 seconds and the maximum AR order is 20, there are 8 change points, which occur at 1601, 2001, 7401, 7601, 7801, 8001, 8201 and 9601. Finally, the P- and S-wave arrival times are detected at times 1671 and 2045, respectively, using a precise detection algorithm.
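
    The sketch below conveys the locally stationary AR idea on a synthetic trace: fit an AR model by least squares on candidate segments and place a change point where splitting lowers the total AIC. Kitagawa's Householder-based updating, the order selection, and the Box-Cox step are all omitted, and the AR order and segment grid are arbitrary choices.

      import numpy as np

      def ar_aic(x, order=10):
          """AIC of a least-squares AR(order) fit with Gaussian innovations."""
          X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
          y = x[order:]
          coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
          rss = res[0] if res.size else float(np.sum((y - X @ coef) ** 2))
          n = len(y)
          return n * np.log(rss / n) + 2 * (order + 1)

      def split_gain(x, mid, order=10):
          """AIC reduction obtained by modelling x as two separate AR segments at mid."""
          return ar_aic(x, order) - (ar_aic(x[:mid], order) + ar_aic(x[mid:], order))

      if __name__ == "__main__":
          rng = np.random.default_rng(5)
          noise = rng.normal(0.0, 1.0, 400)                # background noise
          wave = rng.normal(0.0, 1.0, 400)
          for t in range(2, 400):                          # resonant AR(2) "arrival"
              wave[t] += 1.6 * wave[t - 1] - 0.9 * wave[t - 2]
          trace = np.concatenate([noise, wave])
          gains = {m: split_gain(trace, m) for m in range(200, 601, 50)}
          print("best candidate change point:", max(gains, key=gains.get))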

  8. An epileptic seizures detection algorithm based on the empirical mode decomposition of EEG.

    PubMed

    Orosco, Lorena; Laciar, Eric; Correa, Agustina Garces; Torres, Abel; Graffigna, Juan P

    2009-01-01

    Epilepsy is a neurological disorder that affects around 50 million people worldwide. Seizure detection is an important component in the diagnosis of epilepsy. In this study, the Empirical Mode Decomposition (EMD) method is proposed for the development of an automatic epileptic seizure detection algorithm. The algorithm first computes the Intrinsic Mode Functions (IMFs) of EEG records, then calculates the energy of each IMF and performs the detection based on an energy threshold and a minimum duration decision. The algorithm was tested on 9 invasive EEG records provided and validated by the Epilepsy Center of the University Hospital of Freiburg. In the 90 segments analyzed (39 with epileptic seizures) the sensitivity and specificity obtained with the method were 56.41% and 75.86%, respectively. It can be concluded that EMD is a promising method for epileptic seizure detection in EEG records.
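
    A compact sketch of the decision chain described above, assuming the third-party PyEMD package ("EMD-signal" on PyPI) for the decomposition; the window length, energy threshold, and minimum duration are placeholders rather than the values tuned on the Freiburg records.

      import numpy as np
      from PyEMD import EMD      # assumed dependency, not part of the standard library

      def seizure_segments(eeg, fs, win_s=2.0, energy_thr=5.0, min_dur_s=6.0):
          """Return (start, end) sample pairs of runs of windows flagged as seizure."""
          win = int(win_s * fs)
          flags = []
          for start in range(0, len(eeg) - win + 1, win):
              imfs = EMD()(eeg[start:start + win])          # intrinsic mode functions
              energies = np.sum(imfs ** 2, axis=1) / win    # mean energy per IMF
              flags.append(bool(energies.max() > energy_thr))
          need = int(np.ceil(min_dur_s / win_s))            # minimum-duration rule
          events, run_start = [], None
          for i, flagged in enumerate(flags + [False]):
              if flagged and run_start is None:
                  run_start = i
              elif not flagged and run_start is not None:
                  if i - run_start >= need:
                      events.append((run_start * win, i * win))
                  run_start = None
          return events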

  9. Penalty dynamic programming algorithm for dim targets detection in sensor systems.

    PubMed

    Huang, Dayu; Xue, Anke; Guo, Yunfei

    2012-01-01

    In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performance of the tracking techniques is used as feedback to the detection part. The feedback is constructed as a penalty term in the merit function, and the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint, that a sensor measurement can originate from only one target or from clutter, is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with an unknown number of targets. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.
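
    The sketch below shows a bare-bones dynamic-programming track-before-detect pass on a 1-D measurement grid, which is the baseline the paper modifies: each cell's merit accumulates the best reachable merit from the previous frame plus the current amplitude. The paper's penalty term, driven by the tracker's state estimate, would enter inside that maximization; it is not modelled here, and the scenario is purely illustrative.

      import numpy as np

      def dp_tbd(frames, max_step=2):
          """frames: (n_frames, n_cells) amplitude maps. Returns merit map and best track."""
          n_frames, n_cells = frames.shape
          merit = np.zeros_like(frames, dtype=float)
          back = np.zeros((n_frames, n_cells), dtype=int)
          merit[0] = frames[0]
          for k in range(1, n_frames):
              for c in range(n_cells):
                  lo, hi = max(0, c - max_step), min(n_cells, c + max_step + 1)
                  prev = int(np.argmax(merit[k - 1, lo:hi])) + lo
                  back[k, c] = prev
                  merit[k, c] = frames[k, c] + merit[k - 1, prev]
          cell = int(np.argmax(merit[-1]))
          track = [cell]
          for k in range(n_frames - 1, 0, -1):              # backtrack the best path
              cell = int(back[k, cell])
              track.append(cell)
          return merit, track[::-1]

      if __name__ == "__main__":
          rng = np.random.default_rng(6)
          frames = rng.rayleigh(1.0, size=(20, 100))        # clutter-only background
          path = (30 + 1.5 * np.arange(20)).astype(int)     # dim, slowly drifting target
          frames[np.arange(20), path] += 2.0
          _, track = dp_tbd(frames)
          print("recovered track cells:", track)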

  10. Automated detection and characterization of harmonic tremor in continuous seismic data

    NASA Astrophysics Data System (ADS)

    Roman, Diana C.

    2017-06-01

    Harmonic tremor is a common feature of volcanic, hydrothermal, and ice sheet seismicity and is thus an important proxy for monitoring changes in these systems. However, no automated methods for detecting harmonic tremor currently exist. Because harmonic tremor shares characteristics with speech and music, digital signal processing techniques for analyzing these signals can be adapted. I develop a novel pitch-detection-based algorithm to automatically identify occurrences of harmonic tremor and characterize their frequency content. The algorithm is applied to seismic data from Popocatepetl Volcano, Mexico, and benchmarked against a monthlong manually detected catalog of harmonic tremor events. During a period of heightened eruptive activity from December 2014 to May 2015, the algorithm detects 1465 min of harmonic tremor, which generally precede periods of heightened explosive activity. These results demonstrate the algorithm's ability to accurately characterize harmonic tremor while highlighting the need for additional work to understand its causes and implications at restless volcanoes.
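
    The sketch below shows a pitch-detection step of the kind such an algorithm adapts from speech and music processing: the fundamental frequency of a seismic window is estimated from the normalized autocorrelation, and the window is flagged as harmonic when the periodicity is strong. The window length, frequency band, and clarity threshold are illustrative assumptions, not the values used for the Popocatepetl data.

      import numpy as np

      def detect_pitch(window, fs, fmin=0.5, fmax=5.0, clarity_thr=0.6):
          """Return (fundamental frequency in Hz, flag for harmonic content)."""
          x = window - window.mean()
          ac = np.correlate(x, x, mode="full")[len(x) - 1:]
          ac /= ac[0]                                        # normalized autocorrelation
          lag_min, lag_max = int(fs / fmax), int(fs / fmin)
          lag = int(np.argmax(ac[lag_min:lag_max])) + lag_min
          return fs / lag, bool(ac[lag] > clarity_thr)

      if __name__ == "__main__":
          fs = 50.0                                          # Hz, illustrative sampling rate
          t = np.arange(0.0, 60.0, 1.0 / fs)
          rng = np.random.default_rng(7)
          tremor = sum(np.sin(2 * np.pi * k * 1.2 * t) / k for k in range(1, 4))
          trace = tremor + 0.3 * rng.normal(size=t.size)
          print(detect_pitch(trace, fs))                     # fundamental near 1.2 Hz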

  11. The effect of different distance measures in detecting outliers using clustering-based algorithm for circular regression model

    NASA Astrophysics Data System (ADS)

    Di, Nur Faraidah Muhammad; Satari, Siti Zanariah

    2017-05-01

    Outlier detection in linear data sets has been studied extensively, but only a small amount of work has addressed outlier detection in circular data. In this study, we propose multiple-outlier detection in circular regression models based on a clustering algorithm. Clustering techniques rely on a distance measure to define the distance between data points. Here, we introduce a similarity distance based on the Euclidean distance for the circular model and obtain a cluster tree using the single-linkage clustering algorithm. Then, a stopping rule for the cluster tree based on the mean direction and circular standard deviation of the tree heights is proposed. We classify the cluster groups that exceed the stopping rule as potential outliers. Our aim is to demonstrate the effectiveness of the proposed algorithms with the similarity distance in detecting outliers. The proposed methods are found to perform well and to be applicable to circular regression models.
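
    A minimal sketch of the clustering step, assuming SciPy: circular residuals are compared with an angular distance, a single-linkage tree is built, and points in small clusters that only merge high in the tree are flagged. The mean-plus-spread cut on merge heights stands in for the paper's stopping rule based on the mean direction and circular standard deviation.

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from scipy.spatial.distance import pdist

      def circular_distance(u, v):
          """Smaller arc between two angles in radians (at most pi)."""
          d = abs(u[0] - v[0]) % (2 * np.pi)
          return min(d, 2 * np.pi - d)

      def circular_outliers(angles):
          angles = np.asarray(angles, dtype=float).reshape(-1, 1)
          tree = linkage(pdist(angles, metric=circular_distance), method="single")
          heights = tree[:, 2]
          cut = heights.mean() + 2.0 * heights.std()          # stand-in stopping rule
          labels = fcluster(tree, t=cut, criterion="distance")
          counts = np.bincount(labels)
          return np.flatnonzero(counts[labels] < 0.5 * len(angles))

      if __name__ == "__main__":
          rng = np.random.default_rng(8)
          residuals = np.concatenate([rng.vonmises(0.0, 20.0, 60),    # well-fitted residuals
                                      [2.8, -2.9, 3.0]])              # circular outliers
          print("suspected outlier indices:", circular_outliers(residuals))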

  12. Improved Snow Mapping Accuracy with Revised MODIS Snow Algorithm

    NASA Technical Reports Server (NTRS)

    Riggs, George; Hall, Dorothy K.

    2012-01-01

    The MODIS snow cover products have been used in over 225 published studies. From those reports, and our ongoing analysis, we have learned about the accuracy and errors in the snow products. Revisions have been made in the algorithms to improve the accuracy of snow cover detection in Collection 6 (C6), the next processing/reprocessing of the MODIS data archive planned to start in September 2012. Our objective in the C6 revision of the MODIS snow-cover algorithms and products is to maximize the capability to detect snow cover while minimizing snow detection errors of commission and omission. While the basic snow detection algorithm will not change, new screens will be applied to alleviate snow detection commission and omission errors, and only the fractional snow cover (FSC) will be output (the binary snow cover area (SCA) map will no longer be included).

  13. A blind transform based approach for the detection of isolated astrophysical pulses

    NASA Astrophysics Data System (ADS)

    Alkhweldi, Marwan; Schmid, Natalia A.; Prestage, Richard M.

    2017-06-01

    This paper presents a blind algorithm for the automatic detection of isolated astrophysical pulses. The detection algorithm is applied to spectrograms (also known as "filter bank data" or "the (t,f) plane"). The detection algorithm comprises a sequence of three steps: (1) a Radon transform is applied to the spectrogram, (2) a Fourier transform is applied to each projection parametrized by an angle, and the total power in each projection is calculated, and (3) the total power of all projections above 90° is compared to the total power of all projections below 90°, and a decision is made in favor of an astrophysical pulse being present or absent. Once a pulse is detected, its Dispersion Measure (DM) is estimated by fitting an analytically developed expression for a transformed spectrogram containing a pulse, with varying values of DM, to the actual data. The performance of the proposed algorithm is numerically analyzed.
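
    A sketch of the three-step decision rule using scikit-image's Radon transform: the spectrogram is projected over all angles, the power of each projection is computed, and the power on either side of 90° is compared. A dispersed pulse shows up as a slanted streak that unbalances the two sides; the synthetic streak and the asymmetry measure below are illustrative, and the DM-fitting step is not shown.

      import numpy as np
      from skimage.transform import radon

      def projection_asymmetry(spectrogram):
          """Ratio of total projection power above vs. below 90 degrees (order-free)."""
          theta = np.arange(0.0, 180.0)
          sinogram = radon(spectrogram, theta=theta, circle=False)
          proj_power = np.sum(sinogram ** 2, axis=0)
          above = proj_power[theta > 90].sum()
          below = proj_power[theta < 90].sum()
          return max(above, below) / min(above, below)       # near 1 for noise-only data

      if __name__ == "__main__":
          rng = np.random.default_rng(9)
          spec = rng.normal(0.0, 1.0, (128, 128)) ** 2       # noise-only (t, f) plane
          cols = np.arange(128)
          rows = np.clip((0.6 * cols).astype(int), 0, 127)
          spec_pulse = spec.copy()
          spec_pulse[rows, cols] += 25.0                     # dispersed-pulse-like streak
          print("noise only asymmetry:", round(projection_asymmetry(spec), 3))
          print("with pulse asymmetry:", round(projection_asymmetry(spec_pulse), 3))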

  14. Landslide detection using LiDAR data and data mining technology: Ali Mountain Highway case study (Taiwan)

    NASA Astrophysics Data System (ADS)

    Cheng, Youg-Sin; Yu, Teng-To; Tarolli, Paolo

    2017-04-01

    Taiwan's mountains are severely affected each year by landslides, rock falls, and debris flows, and the road system suffers the most critical consequences. Among all mountain highways, Ali Highway, located at the main entrance of the Alishan Mountain region, is one of the most landslide-prone areas in southern Taiwan. During the typhoon season, between May and August, the probability of occurrence of mass movements is higher than usual, with high erosion rates. In fact, during Typhoon Morakot in 2009, the intense rainfall caused an abrupt interruption of traffic for three months and triggered several landslides (Liu et al. 2012). Topographic features such as slope, roughness, and curvature, among others, were extracted from a 1 m DTM derived from a LiDAR dataset (collected in 2015) to investigate the slope failures along the Ali Mountain Highway. The high-resolution DTM highlighted that hydrogeomorphological features (e.g., density of streams, distance from the ridge and terrain) are among the most influential factors affecting slope change and instability. To detect the landslide areas, the decision tree classifier and the random forest algorithm (RF) were adopted. The results provided a suitable analysis of the areas involved in the failures. This will be a useful step in understanding (and managing) landslide processes in the study area. References Liu CN, Dong JJ, Chen CJ, Lee WF (2012) Typical landslides and related mechanisms in Ali Mountain highway induced by typhoon Morakot: Perspectives from engineering geology. Landslides 9:239-254.
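
    A hedged sketch of the classification step with scikit-learn: terrain attributes of the kind listed above feed a random forest that labels cells as landslide or stable. The feature set, the synthetic labels, and the forest size below are placeholders for the study's LiDAR-derived training data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import classification_report
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(10)
      n = 2000
      X = np.column_stack([
          rng.uniform(0.0, 60.0, n),      # slope (degrees)
          rng.uniform(0.0, 1.0, n),       # roughness index
          rng.normal(0.0, 0.1, n),        # curvature
          rng.uniform(0.0, 500.0, n),     # distance from ridge (m)
      ])
      y = ((X[:, 0] > 35.0) & (X[:, 3] < 250.0)).astype(int)   # placeholder for mapped landslides

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print(classification_report(y_te, model.predict(X_te)))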

  15. Analysis of Community Detection Algorithms for Large Scale Cyber Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mane, Prachita; Shanbhag, Sunanda; Kamath, Tanmayee

    The aim of this project is to use existing community detection algorithms on an IP network dataset to create supernodes within the network. This study compares the performance of different algorithms on the network in terms of running time. The paper begins with an introduction to the concept of clustering and community detection followed by the research question that the team aimed to address. Further the paper describes the graph metrics that were considered in order to shortlist algorithms followed by a brief explanation of each algorithm with respect to the graph metric on which it is based. The next section in the paper describes the methodology used by the team in order to run the algorithms and determine which algorithm is most efficient with respect to running time. Finally, the last section of the paper includes the results obtained by the team and a conclusion based on those results as well as future work.

  16. Detection of the ice assertion on aircraft using empirical mode decomposition enhanced by multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Bagherzadeh, Seyed Amin; Asadi, Davood

    2017-05-01

    In search of a precise method for analyzing nonlinear and non-stationary flight data of an aircraft in the icing condition, an Empirical Mode Decomposition (EMD) algorithm enhanced by multi-objective optimization is introduced. In the proposed method, dissimilar IMF definitions are considered by the Genetic Algorithm (GA) in order to find the best decision parameters of the signal trend. To resolve disadvantages of the classical algorithm caused by the envelope concept, the signal trend is estimated directly in the proposed method. Furthermore, in order to simplify the performance and understanding of the EMD algorithm, the proposed method obviates the need for a repeated sifting process. The proposed enhanced EMD algorithm is verified by some benchmark signals. Afterwards, the enhanced algorithm is applied to simulated flight data in the icing condition in order to detect the ice assertion on the aircraft. The results demonstrate the effectiveness of the proposed EMD algorithm in aircraft ice detection by providing a figure of merit for the icing severity.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using Gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm and provides consistency guarantees even in very large and extreme-scale systems while at the same time being memory and bandwidth efficient.
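
    As a toy illustration of the gossip-style detection the algorithms build on (not the paper's three algorithms themselves), the simulation below lets each alive rank keep a heartbeat vector, merge it with a random peer's every cycle, and suspect any rank whose heartbeat has not advanced for a fixed number of cycles; reaching consensus on the failed set is the harder problem the paper addresses and is not modelled here.

      import random

      def simulate(n_procs=16, failed=(3, 7), cycles=30, t_fail=8, seed=0):
          """Return each alive rank's list of suspected-failed ranks after gossiping."""
          random.seed(seed)
          heartbeats = [[0] * n_procs for _ in range(n_procs)]      # one view per rank
          alive = [p for p in range(n_procs) if p not in failed]
          for cycle in range(1, cycles + 1):
              for p in alive:
                  heartbeats[p][p] = cycle                          # local heartbeat tick
                  q = random.choice([r for r in alive if r != p])   # gossip partner
                  merged = [max(a, b) for a, b in zip(heartbeats[p], heartbeats[q])]
                  heartbeats[p] = merged[:]                         # push-pull exchange
                  heartbeats[q] = merged[:]
          return {p: [r for r in range(n_procs) if cycles - heartbeats[p][r] > t_fail]
                  for p in alive}

      if __name__ == "__main__":
          views = simulate()
          print("rank 0 suspects:", views[0])                       # expect the injected failures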

  18. Detecting Anomalies in Process Control Networks

    NASA Astrophysics Data System (ADS)

    Rrushi, Julian; Kang, Kyoung-Don

    This paper presents the estimation-inspection algorithm, a statistical algorithm for anomaly detection in process control networks. The algorithm determines if the payload of a network packet that is about to be processed by a control system is normal or abnormal based on the effect that the packet will have on a variable stored in control system memory. The estimation part of the algorithm uses logistic regression integrated with maximum likelihood estimation in an inductive machine learning process to estimate a series of statistical parameters; these parameters are used in conjunction with logistic regression formulas to form a probability mass function for each variable stored in control system memory. The inspection part of the algorithm uses the probability mass functions to estimate the normalcy probability of a specific value that a network packet writes to a variable. Experimental results demonstrate that the algorithm is very effective at detecting anomalies in process control networks.
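
    The sketch below conveys the estimation/inspection split in a deliberately simplified form with scikit-learn: a logistic model fitted by maximum likelihood on labelled writes to a single control variable yields a normalcy probability for any new value, which the inspection step thresholds. The single-variable setting, the hand-made quadratic feature, and the cut-off are assumptions; the paper builds a probability mass function per variable in control system memory.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(11)
      normal_vals = rng.normal(50.0, 3.0, 500)       # writes seen during normal operation
      abnormal_vals = rng.uniform(0.0, 120.0, 80)    # writes injected during attacks
      values = np.concatenate([normal_vals, abnormal_vals])
      labels = np.concatenate([np.ones(500), np.zeros(80)])

      # a quadratic feature centred on the nominal setpoint lets the logistic model
      # carve out a "normal band" of values (the centre 50.0 is itself an assumption)
      features = np.column_stack([values, (values - 50.0) ** 2])
      model = LogisticRegression(max_iter=1000).fit(features, labels)

      def normalcy_probability(value):
          feat = np.array([[value, (value - 50.0) ** 2]])
          return float(model.predict_proba(feat)[0, 1])   # P(class "normal")

      for v in (49.0, 52.0, 110.0):
          print(f"write {v}: P(normal) = {normalcy_probability(v):.3f}")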

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
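
    The sketch below contrasts the two spatial searches in the simplest possible terms: a grid-point search visits every latitude-longitude cell, whereas a stride search places candidate centres roughly a fixed physical distance apart, so the longitude stride widens toward the poles instead of degenerating. The spacing, the clamping near the poles, and the reference grid are illustrative choices, not the paper's implementation.

      import numpy as np

      EARTH_RADIUS_KM = 6371.0

      def stride_search_centres(spacing_km):
          """Candidate search centres roughly spacing_km apart at every latitude."""
          lat_stride = np.degrees(spacing_km / EARTH_RADIUS_KM)
          centres = []
          for lat in np.arange(-90.0, 90.0 + 1e-6, lat_stride):
              # east-west distance per degree shrinks with cos(lat); clamp near the poles
              coslat = max(np.cos(np.radians(lat)), spacing_km / (np.pi * EARTH_RADIUS_KM))
              lon_stride = np.degrees(spacing_km / (EARTH_RADIUS_KM * coslat))
              centres.extend((lat, lon) for lon in np.arange(0.0, 360.0, lon_stride))
          return centres

      if __name__ == "__main__":
          centres = stride_search_centres(spacing_km=500.0)
          grid_points = (180 * 4) * (360 * 4)          # 0.25-degree lat-lon grid, for scale
          print(f"stride-search centres: {len(centres)}  vs  grid points: {grid_points}")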

  20. An Efficient Conflict Detection Algorithm for Packet Filters

    NASA Astrophysics Data System (ADS)

    Lee, Chun-Liang; Lin, Guan-Yu; Chen, Yaw-Chung

    Packet classification is essential for supporting advanced network services such as firewalls, quality-of-service (QoS), virtual private networks (VPN), and policy-based routing. The rules that routers use to classify packets are called packet filters. If two or more filters overlap, a conflict occurs and leads to ambiguity in packet classification. This study proposes an algorithm that can efficiently detect and resolve filter conflicts using tuple based search. The time complexity of the proposed algorithm is O(nW+s), and the space complexity is O(nW), where n is the number of filters, W is the number of bits in a header field, and s is the number of conflicts. This study uses the synthetic filter databases generated by ClassBench to evaluate the proposed algorithm. Simulation results show that the proposed algorithm can achieve better performance than existing conflict detection algorithms both in time and space, particularly for databases with large numbers of conflicts.
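
    For intuition only, the sketch below is a brute-force quadratic check that reports filter pairs whose source and destination prefixes all overlap, i.e. the ambiguity condition described above; the paper's contribution is detecting the same conflicts far more efficiently with a tuple-based search, which is not reproduced here.

      from itertools import combinations

      def prefixes_overlap(p1, l1, p2, l2, width=32):
          """Two prefixes overlap iff the shorter one is a prefix of the longer one."""
          l = min(l1, l2)
          return (p1 >> (width - l)) == (p2 >> (width - l)) if l else True

      def find_conflicts(filters):
          """filters: list of dicts mapping field name to (prefix value, prefix length)."""
          conflicts = []
          for (i, f), (j, g) in combinations(enumerate(filters), 2):
              if all(prefixes_overlap(f[k][0], f[k][1], g[k][0], g[k][1])
                     for k in ("src", "dst")):
                  conflicts.append((i, j))
          return conflicts

      if __name__ == "__main__":
          filters = [
              {"src": (0x0A000000, 8),  "dst": (0xC0A80000, 16)},   # 10.0.0.0/8  -> 192.168.0.0/16
              {"src": (0x0A010000, 16), "dst": (0xC0A80100, 24)},   # 10.1.0.0/16 -> 192.168.1.0/24
              {"src": (0x0B000000, 8),  "dst": (0xC0A80000, 16)},   # 11.0.0.0/8  -> 192.168.0.0/16
          ]
          print("conflicting filter pairs:", find_conflicts(filters))   # expect [(0, 1)]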
