Science.gov

Sample records for algorithm successfully detected

  1. An Ensemble Successive Project Algorithm for Liquor Detection Using Near Infrared Sensor

    PubMed Central

    Qu, Fangfang; Ren, Dong; Wang, Jihua; Zhang, Zhong; Lu, Na; Meng, Lei

    2016-01-01

    Spectral analysis based on near infrared (NIR) sensors is a powerful tool for complex information processing and high precision recognition, and it has been widely applied to quality analysis and online inspection of agricultural products. This paper proposes a new method to address the instability of the successive projections algorithm (SPA) with small sample sizes, as well as the lack of association between the selected variables and the analyte. The proposed method is an evaluated bootstrap ensemble SPA method (EBSPA) based on a variable evaluation index (EI) for variable selection, and is applied to the quantitative prediction of alcohol concentrations in liquor using an NIR sensor. In the experiment, the proposed EBSPA is combined with three kinds of modeling methods to test its performance. In addition, the proposed EBSPA combined with partial least squares is compared with other state-of-the-art variable selection methods. The results show that the proposed method overcomes the defects of SPA and has the best generalization performance and stability. Furthermore, the physical meaning of the variables selected from the near infrared sensor data is clear, which effectively reduces the number of variables and improves prediction accuracy. PMID:26761015

  2. License plate detection algorithm

    NASA Astrophysics Data System (ADS)

    Broitman, Michael; Klopovsky, Yuri; Silinskis, Normunds

    2013-12-01

    A novel algorithm for vehicle license plate localization is proposed. The algorithm is based on pixel intensity transition gradient analysis. Nearly 2500 natural-scene gray-level vehicle images with different backgrounds and ambient illumination were tested. The best set of algorithm parameters produces a detection rate of up to 0.94. Taking into account the abnormal camera location during our tests, and the resulting geometrical distortion and interference from trees, this result can be considered acceptable. Correlations between source data (such as license plate dimensions and texture, camera location, and others) and the algorithm's parameters were also defined.

  3. A universal symmetry detection algorithm.

    PubMed

    Maurer, Peter M

    2015-01-01

    Research on symmetry detection focuses on identifying and detecting new types of symmetry. The paper presents an algorithm that is capable of detecting any type of permutation-based symmetry, including many types for which there are no existing algorithms. General symmetry detection is library-based, but symmetries that can be parameterized (i.e., total, partial, rotational, and dihedral symmetry) can be detected without using libraries. In many cases it is faster than existing techniques. Furthermore, it is simpler than most existing techniques and can easily be incorporated into existing software. The algorithm can also be used with virtually any type of matrix-based symmetry, including conjugate symmetry.

  4. Innovative algorithm for cast detection

    NASA Astrophysics Data System (ADS)

    Gasparini, Francesca; Schettini, Raimondo; Gallina, Paolo

    2001-12-01

    The paper describes a method for detecting a color cast (i.e. a superimposed dominant color) in a digital image without any a priori knowledge of its semantic content. The color gamut of the image is first mapped into the CIELab color space. The color distribution of the whole image and of the so-called Near Neutral Objects (NNO) is then investigated using statistical tools to determine the presence of a cast. The boundaries of the near neutral objects in the color space are set adaptively by the algorithm on the basis of a preliminary analysis of the image color gamut. The method we propose has been tuned and successfully tested on a large data set of images, downloaded from personal web pages or acquired using various digital and traditional cameras.

  5. MUSIC algorithms for rebar detection

    NASA Astrophysics Data System (ADS)

    Solimene, Raffaele; Leone, Giovanni; Dell'Aversano, Angela

    2013-12-01

    The MUSIC (MUltiple SIgnal Classification) algorithm is employed to detect and localize an unknown number of scattering objects which are small in size as compared to the wavelength. The ensemble of objects to be detected consists of both strong and weak scatterers. This represents a scattering environment challenging for detection purposes as strong scatterers tend to mask the weak ones. Consequently, the detection of more weakly scattering objects is not always guaranteed and can be completely impaired when the noise corrupting data is of a relatively high level. To overcome this drawback, here a new technique is proposed, starting from the idea of applying a two-stage MUSIC algorithm. In the first stage strong scatterers are detected. Then, information concerning their number and location is employed in the second stage focusing only on the weak scatterers. The role of an adequate scattering model is emphasized to improve drastically detection performance in realistic scenarios.
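
    As an illustration of the core step behind this approach, the sketch below computes a standard narrowband MUSIC pseudospectrum with numpy. It is not the authors' two-stage variant or their scattering model: the uniform linear array geometry, source angles, relative strengths and noise level are invented for the example.

```python
import numpy as np

def steering(theta_deg, n_sensors, d=0.5):
    """Steering vector of a uniform linear array (element spacing d in wavelengths)."""
    k = np.arange(n_sensors)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

def music_spectrum(X, n_sources, scan_deg):
    """MUSIC pseudospectrum from a data matrix X (sensors x snapshots)."""
    R = X @ X.conj().T / X.shape[1]          # sample covariance
    _, V = np.linalg.eigh(R)                 # eigenvectors, ascending eigenvalues
    En = V[:, :-n_sources]                   # noise subspace
    spec = []
    for theta in scan_deg:
        a = steering(theta, X.shape[0])
        spec.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return np.array(spec)

# Toy data: a strong and a weak point source at -20 and 35 degrees, 8 sensors.
rng = np.random.default_rng(0)
A = np.column_stack([steering(-20, 8), 0.3 * steering(35, 8)])
S = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
X = A @ S + 0.05 * (rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
scan = np.arange(-90.0, 90.0, 0.5)
spec = music_spectrum(X, n_sources=2, scan_deg=scan)
print("peak angles:", scan[np.argsort(spec)[-2:]])
```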

  6. A fast meteor detection algorithm

    NASA Astrophysics Data System (ADS)

    Gural, P.

    2016-01-01

    A low latency meteor detection algorithm for use with fast steering mirrors had been previously developed to track and telescopically follow meteors in real-time (Gural, 2007). It has been rewritten as a generic clustering and tracking software module for meteor detection that meets both the demanding throughput requirements of a Raspberry Pi while also maintaining a high probability of detection. The software interface is generalized to work with various forms of front-end video pre-processing approaches and provides a rich product set of parameterized line detection metrics. Discussion will include the Maximum Temporal Pixel (MTP) compression technique as a fast thresholding option for feeding the detection module, the detection algorithm trade for maximum processing throughput, details on the clustering and tracking methodology, processing products, performance metrics, and a general interface description.
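
    A minimal numpy sketch of the Maximum Temporal Pixel (MTP) compression mentioned above: a block of video frames is reduced to a single maxpixel image plus the frame index at which each pixel peaked, which a thresholding front end can then process cheaply. The frame count, image size and threshold are arbitrary values for the example.

```python
import numpy as np

def mtp_compress(frames):
    """Collapse a frame block (T, H, W) to its per-pixel maximum and the
    frame index at which each pixel reached that maximum."""
    return frames.max(axis=0), frames.argmax(axis=0)

# A meteor streak shows up as a bright line in `maxpixel` whose pixels have
# steadily increasing values in `peak_frame` along the direction of motion.
block = np.random.randint(0, 256, size=(256, 480, 640), dtype=np.uint8)  # toy video block
maxpixel, peak_frame = mtp_compress(block)
candidates = maxpixel > maxpixel.mean() + 4 * maxpixel.std()             # crude threshold
```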

  7. Introductory Students, Conceptual Understanding, and Algorithmic Success.

    ERIC Educational Resources Information Center

    Pushkin, David B.

    1998-01-01

    Addresses the distinction between conceptual and algorithmic learning and the clarification of what is meant by a second-tier student. Explores why novice learners in chemistry and physics are able to apply algorithms without significant conceptual understanding. (DDR)

  8. The minimal time detection algorithm

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan

    1995-01-01

    An aerospace vehicle may operate throughout a wide range of flight environmental conditions that affect its dynamic characteristics. Even when the control design incorporates a degree of robustness, system parameters may drift enough to cause its performance to degrade below an acceptable level. The object of this paper is to develop a change detection algorithm so that we can build a highly adaptive control system applicable to aircraft systems. The idea is to detect system changes with minimal time delay. The algorithm developed is called the Minimal Time-Change Detection Algorithm (MT-CDA), which detects the instant of change as quickly as possible with false-alarm probability below a certain specified level. Simulation results for the aircraft lateral motion with a known or unknown change in control gain matrices, in the presence of doublet input, indicate that the algorithm works fairly well, as theory predicts, though there is a difficulty in deciding the exact amount of change in some situations. One of MT-CDA's distinguishing properties is that its detection delay is superior to that of the Whiteness Test.

  9. Wire Detection Algorithms for Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia I.

    2002-01-01

    In this research we addressed the problem of obstacle detection for low altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. Two approaches were explored for this purpose. The first approach involved a technique for sub-pixel edge detection and subsequent post processing, in order to reduce the false alarms. After reviewing the line detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer generated wire images. The performance of the algorithm was evaluated both at the pixel and the wire levels. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post processing is performed to remove false alarms due to clutter. The second approach involved the use of an example-based learning scheme, namely Support Vector Machines. The purpose of this approach was to explore the feasibility of an example-based learning approach for the task of detecting wires from their images. Support Vector Machines (SVMs) have emerged as a promising pattern classification tool and have been used in various applications. It was found that this approach is not suitable for very thin wires and, of course, not suitable at all for sub-pixel thick wires. High dimensionality of the data as such does not present a major problem for SVMs. However, it is desirable to have a large number of training examples, especially for high dimensional data. The main difficulty in using SVMs (or any other example-based learning

  10. A new real-time tsunami detection algorithm

    NASA Astrophysics Data System (ADS)

    Chierici, Francesco; Embriaco, Davide; Pignagnoli, Luca

    2017-01-01

    Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection based on the real-time tide removal and real-time band-pass filtering of seabed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability with respect to the most widely used tsunami detection algorithm, while containing the computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. Pressure data sets acquired by Bottom Pressure Recorders in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event which occurred at Haida Gwaii on 28 October 2012, using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm successfully ran for test purposes in year-long missions onboard abyssal observatories deployed in the Gulf of Cadiz and in the Western Ionian Sea.
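
    A hedged sketch of the filtering stage the abstract describes (tide removal plus band-pass filtering of bottom-pressure records, followed by a threshold). The sampling interval, pass band, filter order and threshold are illustrative values, not the ones used by the authors, and the input file name is a placeholder.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 1.0 / 15.0                      # assumed sampling: one pressure sample every 15 s
band = (1.0 / 7200.0, 1.0 / 120.0)   # assumed tsunami band: periods of 2 min to 2 h (Hz)
sos = butter(4, band, btype="bandpass", fs=fs, output="sos")

def tsunami_flags(pressure_pa, threshold_pa=30.0):
    """Band-pass the seabed pressure record (suppressing the tide and
    high-frequency noise) and flag samples exceeding an amplitude threshold."""
    filtered = sosfilt(sos, pressure_pa - np.mean(pressure_pa))
    return np.abs(filtered) > threshold_pa, filtered

# flags, band_signal = tsunami_flags(np.loadtxt("bottom_pressure.txt"))  # placeholder input
```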

  11. An efficient parallel termination detection algorithm

    SciTech Connect

    Baker, A. H.; Crivelli, S.; Jessup, E. R.

    2004-05-27

    Information local to any one processor is insufficient to monitor the overall progress of most distributed computations. Typically, a second distributed computation for detecting termination of the main computation is necessary. In order to be a useful computational tool, the termination detection routine must operate concurrently with the main computation, adding minimal overhead, and it must promptly and correctly detect termination when it occurs. In this paper, we present a new algorithm for detecting the termination of a parallel computation on distributed-memory MIMD computers that satisfies all of those criteria. A variety of termination detection algorithms have been devised. Of these, the algorithm presented by Sinha, Kale, and Ramkumar (henceforth, the SKR algorithm) is unique in its ability to adapt to the load conditions of the system on which it runs, thereby minimizing the impact of termination detection on performance. Because their algorithm also detects termination quickly, we consider it to be the most efficient practical algorithm presently available. The termination detection algorithm presented here was developed for use in the PMESC programming library for distributed-memory MIMD computers. Like the SKR algorithm, our algorithm adapts to system loads and imposes little overhead. Also like the SKR algorithm, ours is tree-based, and it does not depend on any assumptions about the physical interconnection topology of the processors or the specifics of the distributed computation. In addition, our algorithm is easier to implement and requires only half as many tree traverses as does the SKR algorithm. This paper is organized as follows. In section 2, we define our computational model. In section 3, we review the SKR algorithm. We introduce our new algorithm in section 4, and prove its correctness in section 5. We discuss its efficiency and present experimental results in section 6.

  12. Automatic ionospheric layers detection: Algorithms analysis

    NASA Astrophysics Data System (ADS)

    Molina, María G.; Zuccheretti, Enrico; Cabrera, Miguel A.; Bianchi, Cesidio; Sciacca, Umberto; Baskaradas, James

    2016-03-01

    Vertical sounding is a widely used technique to obtain ionosphere measurements, such as an estimation of virtual height versus frequency scanning. It is performed by a high frequency radar for geophysical applications called an "ionospheric sounder" (or "ionosonde"). Radar detection depends mainly on target characteristics. While the behavior of several targets and the corresponding echo detection algorithms have been studied, a survey to identify a suitable algorithm for the ionospheric sounder has to be carried out. This paper is focused on automatic echo detection algorithms implemented in particular for an ionospheric sounder; target-specific characteristics were studied as well. Adaptive threshold detection algorithms are proposed, compared to the currently implemented algorithm, and tested using actual data obtained from the Advanced Ionospheric Sounder (AIS-INGV) at the Rome Ionospheric Observatory. Different cases of study have been selected according to typical ionospheric and detection conditions.

  13. Aquarius RFI Detection and Mitigation Algorithm: Assessment and Examples

    NASA Technical Reports Server (NTRS)

    Le Vine, David M.; De Matthaeis, P.; Ruf, Christopher S.; Chen, D. D.

    2013-01-01

    Aquarius is an L-band radiometer system designed to map sea surface salinity from space. This is a sensitive measurement, and protection from radio frequency interference (RFI) is important for success. An initial look at the performance of the Aquarius RFI detection and mitigation algorithm is reported together with examples of the global distribution of RFI at the L-band. To protect against RFI, Aquarius employs rapid sampling (10 ms) and a "glitch" detection algorithm that looks for outliers among the samples. Samples identified as RFI are removed, and the remainder is averaged to produce an RFI-free signal for the salinity retrieval algorithm. The RFI detection algorithm appears to work well over the ocean with modest rates for false alarms (5%) and missed detection. The global distribution of RFI coincides well with population centers and is consistent with observations reported by the Soil Moisture and Ocean Salinity mission.
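
    A minimal sketch of the kind of "glitch" screening described above: within each block of rapid samples, flag values that sit far above a robust estimate of the block level, drop them, and average the rest. The threshold factor is illustrative, not the Aquarius setting.

```python
import numpy as np

def rfi_free_average(block, k=4.0):
    """Return the mean of the non-flagged samples in one block of rapid
    radiometer readings, plus the RFI flags themselves.

    RFI only adds power, so a one-sided test against the block median is used:
    a sample is flagged when it exceeds the median by more than k robust
    standard deviations (median absolute deviation scaled to sigma)."""
    med = np.median(block)
    sigma = 1.4826 * np.median(np.abs(block - med))
    flags = block > med + k * sigma
    return block[~flags].mean(), flags
```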

  14. Smell Detection Agent Based Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Vinod Chandra, S. S.

    2016-09-01

    In this paper, a novel nature-inspired optimization algorithm is employed: the trained behaviour of dogs in detecting smell trails is adapted into computational agents for problem solving. The algorithm involves creation of a surface with smell trails and subsequent iteration of the agents in resolving a path. The algorithm can be applied to computational problems that involve path finding, and its implementation can be treated as a shortest path problem for a variety of datasets. The simulated agents have been used to evolve the shortest path between two nodes in a graph. This algorithm is useful for solving NP-hard problems related to path discovery, as well as many practical optimization problems. With further derivation, the algorithm can be extended to other shortest-path problems.

  15. Threshold-Based OSIC Detection Algorithm for Per-Antenna-Coded TIMO-OFDM Systems

    NASA Astrophysics Data System (ADS)

    Wang, Xinzheng; Chen, Ming; Zhu, Pengcheng

    A threshold-based ordered successive interference cancellation (OSIC) detection algorithm is proposed for per-antenna-coded (PAC) two-input multiple-output (TIMO) orthogonal frequency division multiplexing (OFDM) systems. Successive interference cancellation (SIC) is performed selectively according to channel conditions. Compared with the conventional OSIC algorithm, the proposed algorithm reduces the complexity significantly with only a slight performance degradation.

  16. Negative Selection Algorithm for Aircraft Fault Detection

    NASA Technical Reports Server (NTRS)

    Dasgupta, D.; KrishnaKumar, K.; Wong, D.; Berry, M.

    2004-01-01

    We investigated a real-valued Negative Selection Algorithm (NSA) for fault detection in man-in-the-loop aircraft operation. The detection algorithm uses body-axes angular rate sensory data exhibiting normal flight behavior patterns to probabilistically generate a set of fault detectors that can detect any abnormalities (including faults and damage) in the behavior pattern of the aircraft flight. We performed experiments with datasets (collected under normal and various simulated failure conditions) using the NASA Ames man-in-the-loop high-fidelity C-17 flight simulator. The paper provides results of experiments with different datasets representing various failure conditions.
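
    A minimal real-valued negative-selection sketch, assuming unit-scaled data and simple Euclidean matching with fixed radii; the probabilistic detector generation used in the study is more elaborate, and the data below are synthetic stand-ins for body-axis rate measurements.

```python
import numpy as np

def generate_detectors(self_data, n_detectors, self_radius, rng):
    """Keep random candidate detectors that do not match any 'self' (normal) sample."""
    detectors = []
    dim = self_data.shape[1]
    while len(detectors) < n_detectors:
        cand = rng.uniform(0.0, 1.0, size=dim)       # data assumed scaled to [0, 1]
        if np.min(np.linalg.norm(self_data - cand, axis=1)) > self_radius:
            detectors.append(cand)
    return np.array(detectors)

def is_abnormal(sample, detectors, detect_radius):
    """A sample is flagged abnormal if it falls inside any detector's radius."""
    return bool(np.min(np.linalg.norm(detectors - sample, axis=1)) < detect_radius)

rng = np.random.default_rng(1)
normal_rates = rng.normal(0.5, 0.05, size=(500, 3))   # synthetic 'normal flight' data
detectors = generate_detectors(normal_rates, 200, self_radius=0.2, rng=rng)
print(is_abnormal(np.array([0.9, 0.1, 0.95]), detectors, detect_radius=0.2))
```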

  17. Stop and Look Detection Algorithm.

    DTIC Science & Technology

    1985-05-01

    The target leaves datum on a random fixed course at constant velocity; the searcher travels at constant velocity and stops and looks for the target at predetermined search points. Detection is deterministic, i.e., cookie-cutter, and the searcher begins searching at time late with constant velocity.

  18. Memetic algorithm for community detection in networks.

    PubMed

    Gong, Maoguo; Fu, Bao; Jiao, Licheng; Du, Haifeng

    2011-11-01

    Community structure is one of the most important properties in networks, and community detection has received an enormous amount of attention in recent years. Modularity is by far the most used and best known quality function for measuring the quality of a partition of a network, and many community detection algorithms are developed to optimize it. However, there is a resolution limit problem in modularity optimization methods. In this study, a memetic algorithm, named Meme-Net, is proposed to optimize another quality function, modularity density, which includes a tunable parameter that allows one to explore the network at different resolutions. Our proposed algorithm is a synergy of a genetic algorithm with a hill-climbing strategy as the local search procedure. Experiments on computer-generated and real-world networks show the effectiveness and the multiresolution ability of the proposed method.

  19. SIDRA: a blind algorithm for signal detection in photometric surveys

    NASA Astrophysics Data System (ADS)

    Mislis, D.; Bachelet, E.; Alsubai, K. A.; Bramich, D. M.; Parley, N.

    2016-01-01

    We present the Signal Detection using Random-Forest Algorithm (SIDRA). SIDRA is a detection and classification algorithm based on the Machine Learning technique (Random Forest). The goal of this paper is to show the power of SIDRA for quick and accurate signal detection and classification. We first diagnose the power of the method with simulated light curves and try it on a subset of the Kepler space mission catalogue. We use five classes of simulated light curves (CONSTANT, TRANSIT, VARIABLE, MLENS and EB for constant light curves, transiting exoplanets, variables, microlensing events and eclipsing binaries, respectively) to analyse the power of the method. The algorithm uses four features in order to classify the light curves. The training sample contains 5000 light curves (1000 from each class) and 50 000 random light curves for testing. The total SIDRA success ratio is ≥90 per cent. Furthermore, the success ratio reaches 95-100 per cent for the CONSTANT, VARIABLE, EB and MLENS classes and 92 per cent for the TRANSIT class with a decision probability of 60 per cent. Because the TRANSIT class is the one that fails most often, we run a simultaneous fit using SIDRA and a Box Least Squares (BLS)-based algorithm for searching for transiting exoplanets. As a result, our algorithm detects 7.5 per cent more planets than a classic BLS algorithm, with better results for lower signal-to-noise light curves. SIDRA succeeds in catching 98 per cent of the planet candidates in the Kepler sample and fails on 7 per cent of the false alarms subset. SIDRA promises to be useful as a detection algorithm and/or classifier for large photometric surveys such as the TESS and PLATO future exoplanet space missions.
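
    The classification core of such an approach is a random forest trained on a few per-light-curve features; the sketch below shows that step with scikit-learn. The feature set and the 60 per cent probability cut are placeholders patterned on the abstract, not SIDRA's actual features or implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def light_curve_features(flux):
    """Toy per-light-curve summary features (stand-ins for SIDRA's four features)."""
    return [np.std(flux),
            np.ptp(flux),
            np.mean(np.abs(np.diff(flux))),
            np.median(np.abs(flux - np.median(flux)))]

clf = RandomForestClassifier(n_estimators=500, random_state=0)
# X_train: (n_curves, n_features) from simulated CONSTANT/TRANSIT/VARIABLE/MLENS/EB curves
# clf.fit(X_train, y_train)
# proba = clf.predict_proba(X_test)      # per-class decision probabilities
# accept = proba.max(axis=1) >= 0.60     # e.g. a 60 per cent decision-probability cut
# labels = clf.classes_[proba.argmax(axis=1)]
```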

  20. Lightning detection and exposure algorithms for smartphones

    NASA Astrophysics Data System (ADS)

    Wang, Haixin; Shao, Xiaopeng; Wang, Lin; Su, Laili; Huang, Yining

    2015-05-01

    This study focuses on the key theory of lightning detection and exposure, and the corresponding experiments. First, an algorithm based on the differential operation between two adjacent frames is selected to remove the background information and extract the lightning signal, and a threshold detection algorithm is applied to achieve precise detection of lightning. Second, an algorithm is proposed to obtain the scene exposure value, which can automatically detect the external illumination status. Subsequently, a look-up table can be built on the basis of the relationship between the exposure value and average image brightness to achieve rapid automatic exposure. Finally, based on a USB 3.0 industrial camera with a CMOS imaging sensor, a hardware test platform is established and experiments are carried out on this platform to verify the performance of the proposed algorithms. The algorithms can effectively and quickly capture clear lightning pictures, such as special nighttime scenes, which will provide beneficial support to the smartphone industry, since current exposure methods in smartphones often fail to capture lightning or produce overexposed or underexposed pictures.
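
    A minimal sketch of the detection stage described above: difference two adjacent grayscale frames to suppress the static background, then threshold the result. The threshold and minimum-area values are illustrative.

```python
import numpy as np

def lightning_mask(prev_frame, curr_frame, diff_threshold=40):
    """Absolute frame difference followed by a threshold; a lightning stroke
    appears as a large bright region that is absent from the previous frame."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > diff_threshold

def lightning_detected(prev_frame, curr_frame, min_pixels=500):
    return lightning_mask(prev_frame, curr_frame).sum() >= min_pixels
```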

  1. Staff line detection and revision algorithm based on subsection projection and correlation algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Yin-xian; Yang, Ding-li

    2013-03-01

    Staff line detection plays a key role in OMR technology and is a precondition for the subsequent segmentation and recognition of music sheets. To handle the horizontal inclination and curvature of staff lines and the vertical inclination of the image, which often occur in music scores, an improved approach based on subsection projection is put forward to detect and revise the original staff lines, in an effort to implement staff line detection more successfully. Experimental results show the presented algorithm can detect and revise staff lines quickly and effectively.

  2. Fast Outlier Detection Using a Grid-Based Algorithm.

    PubMed

    Lee, Jihwan; Cho, Nam-Wook

    2016-01-01

    As one of the data mining techniques, outlier detection aims to discover outlying observations that deviate substantially from the remainder of the data. Recently, the Local Outlier Factor (LOF) algorithm has been successfully applied to outlier detection. However, due to the computational complexity of the LOF algorithm, its application to large data with high dimension has been limited. The aim of this paper is to propose a grid-based algorithm that reduces the computation time required by the LOF algorithm to determine the k-nearest neighbors. The algorithm divides the data space into a smaller number of regions, called "grids", and calculates the LOF value of each grid. To examine the effectiveness of the proposed method, several experiments incorporating different parameters were conducted. The proposed method demonstrated a significant computation time reduction with predictable and acceptable trade-off errors. The proposed methodology was then successfully applied to real database transaction logs of the Korea Atomic Energy Research Institute. As a result, we show that for a very large dataset, the grid-LOF can be considered an acceptable approximation of the original LOF. Moreover, it can also be effectively used for real-time outlier detection.
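
    A hedged sketch of the grid idea: bin the points, run LOF once on the (much smaller) set of occupied cells, and give every point the score of its cell. The grid resolution and neighbour count are illustrative, and the paper's actual grid construction may differ.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def grid_lof(X, bins=20, n_neighbors=10):
    """Approximate per-point LOF scores by scoring occupied grid cells instead."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    cells = np.floor((X - mins) / (maxs - mins + 1e-12) * bins).astype(int)
    keys, inverse = np.unique(np.clip(cells, 0, bins - 1), axis=0, return_inverse=True)
    centers = (keys + 0.5) / bins * (maxs - mins) + mins      # one representative per cell
    lof = LocalOutlierFactor(n_neighbors=min(n_neighbors, max(1, len(keys) - 1)))
    lof.fit(centers)
    cell_score = -lof.negative_outlier_factor_                # larger = more outlying
    return cell_score[inverse]                                # map back to the data points
```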

  3. Obstacle Detection Algorithms for Rotorcraft Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia I.; Huang, Ying; Narasimhamurthy, Anand; Pande, Nitin; Ahumada, Albert (Technical Monitor)

    2001-01-01

    In this research we addressed the problem of obstacle detection for low altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. After reviewing the line detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer generated wire images. The performance of the algorithm was evaluated both at the pixel and the wire levels. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post processing is performed to remove false alarms due to clutter.

  4. Improved imaging algorithm for bridge crack detection

    NASA Astrophysics Data System (ADS)

    Lu, Jingxiao; Song, Pingli; Han, Kaihong

    2012-04-01

    This paper presents an improved imaging algorithm for bridge crack detection that optimizes the eight-direction Sobel edge detection operator, making the positioning of edge points more accurate and effectively reducing false edge information, so as to facilitate follow-up processing. In calculating crack geometry characteristics, we use a skeleton extraction method to measure single crack length. In order to calculate crack area, we construct an area template by applying a logical bitwise AND operation to the crack image. Experimental results show that the differences between the crack detection method and actual manual measurement are within an acceptable range and meet the needs of engineering applications. This algorithm is fast and effective for automated crack measurement and can provide more valid data for proper planning and appropriate performance of bridge maintenance and rehabilitation processes.
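
    For reference, a plain (unoptimized) eight-direction Sobel response is sketched below: four directional kernels are applied and the maximum absolute response is kept (the other four of the eight directions are the negatives of these kernels, so their absolute responses are already covered). The threshold is illustrative, and the paper's specific optimization is not reproduced.

```python
import numpy as np
from scipy.ndimage import convolve

# Directional Sobel kernels for 0, 45, 90 and 135 degrees; the remaining four
# of the eight directions are their negatives and give the same absolute response.
K0 = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
K45 = np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]], dtype=float)
K90 = K0.T
K135 = np.array([[-2, -1, 0], [-1, 0, 1], [0, 1, 2]], dtype=float)

def sobel8(image):
    """Maximum absolute response over the directional Sobel kernels."""
    img = image.astype(float)
    return np.max([np.abs(convolve(img, k)) for k in (K0, K45, K90, K135)], axis=0)

def crack_mask(image, threshold=60.0):
    return sobel8(image) > threshold
```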

  5. Detection algorithm for multiple rice seeds images

    NASA Astrophysics Data System (ADS)

    Cheng, F.; Ying, Y. B.

    2006-10-01

    The objective of this research is to develop a digital image analysis algorithm for the detection of multiple rice seeds in images. The rice seeds used in this study were of a hybrid rice variety. Images of multiple rice seeds were acquired with a machine vision system for quality inspection of bulk rice seeds, which is designed to inspect rice seeds on a rotating disk with a CCD camera. Combining morphological operations and parallel processing gave improvements in accuracy and a reduction in computation time. Using image features selected on the basis of classification ability, a highly acceptable defect classification was achieved when the algorithm was applied to all the samples to test its adaptability.

  6. Algorithm development for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton S.

    2008-10-01

    This dissertation proposes and evaluates a novel anomaly detection algorithm suite for ground-to-ground, or air-to-ground, applications requiring automatic target detection using hyperspectral (HS) data. Targets are manmade objects in natural background clutter under unknown illumination and atmospheric conditions. The use of statistical models herein is purely for motivation of particular formulas for calculating anomaly output surfaces. In particular, formulas from semiparametrics are utilized to obtain novel forms for output surfaces, and alternative scoring algorithms are proposed to calculate output surfaces that are comparable to those of semiparametrics. Evaluation uses both simulated data and real HS data from a joint data collection effort between the Army Research Laboratory and the Army Armament Research Development & Engineering Center. A data transformation method is presented for use by the two-sample data structure univariate semiparametric and nonparametric scoring algorithms, such that the two-sample data are mapped from their original multivariate space to a univariate domain, where the statistical power of the univariate scoring algorithms is shown to be improved relative to existing multivariate scoring algorithms testing the same two-sample data. An exhaustive simulation experimental study is conducted to assess the performance of different HS anomaly detection techniques, where the null and alternative hypotheses are completely specified, including all parameters, using multivariate normal and mixtures of multivariate normal distributions. Finally, for ground-to-ground anomaly detection applications, where the unknown scales of targets add to the problem complexity, a novel global anomaly detection algorithm suite is introduced, featuring autonomous partial random sampling (PRS) of the data cube. The PRS method is proposed to automatically sample the unknown background clutter in the test HS imagery, and by repeating multiple times this

  7. Detection of Cheating by Decimation Algorithm

    NASA Astrophysics Data System (ADS)

    Yamanaka, Shogo; Ohzeki, Masayuki; Decelle, Aurélien

    2015-02-01

    We expand item response theory to study the case of "cheating students" in a set of exams, trying to detect them by applying a greedy inference algorithm. This extended model is closely related to Boltzmann machine learning. In this paper we aim to infer the correct biases and interactions of our model by considering a relatively small number of sets of training data. Nevertheless, the greedy algorithm that we employed in the present study exhibits good performance with a small number of training data sets. The key point is the sparseness of the interactions in our problem in the context of Boltzmann machine learning: the existence of cheating students is expected to be very rare (possibly even in the real world). We compare a standard approach for inferring sparse interactions in Boltzmann machine learning to our greedy algorithm and find the latter to be superior in several aspects.

  8. A novel algorithm for notch detection

    NASA Astrophysics Data System (ADS)

    Acosta, C.; Salazar, D.; Morales, D.

    2013-06-01

    It is common knowledge that DFM guidelines require revisions to design data. These guidelines impose the need for corrections inserted into areas within the design data flow. At times, this requires rather drastic modifications to the data, both during the layer derivation or DRC phase, and especially within the RET phase, for example during OPC. During such data transformations, several polygon geometry changes are introduced, which can substantially increase shot count and geometry complexity, and eventually complicate conversion to mask writer machine formats. In this resulting complex data, notches may be found that do not significantly contribute to the final manufacturing results, but do in fact contribute to the complexity of the surrounding geometry, and are therefore undesirable. Additionally, there are cases in which the overall figure count can be reduced with minimal impact on the quality of the corrected data if notches are detected and corrected, and other cases where data quality could be improved if specific valley notches are filled in or peak notches are cut out. Such cases generally satisfy specific geometrical restrictions in order to be valid candidates for notch correction. Traditional notch detection has been done for rectilinear (Manhattan-style) data and only in axis-parallel directions. The traditional approaches employ dimensional measurement algorithms that measure edge distances along the outside of polygons. These approaches are in general adaptations and are therefore ill-suited to generalized detection of notches with arbitrary shapes and rotations. This paper covers a novel algorithm developed for the CATS MRCC tool that finds both valley and/or peak notches that are candidates for removal. The algorithm is generalized and invariant to data rotation, so that it can find notches in data rotated at any angle. It includes parameters to control the dimensions of detected notches, as well as algorithm tolerances.

  9. A bioinspired collision detection algorithm for VLSI implementation

    NASA Astrophysics Data System (ADS)

    Cuadri, J.; Linan, G.; Stafford, R.; Keil, M. S.; Roca, E.

    2005-06-01

    In this paper a bioinspired algorithm for collision detection is proposed, based on previous models of the locust (Locusta migratoria) visual system reported by F.C. Rind and her group, in the University of Newcastle-upon-Tyne. The algorithm is suitable for VLSI implementation in standard CMOS technologies as a system-on-chip for automotive applications. The working principle of the algorithm is to process a video stream that represents the current scenario, and to fire an alarm whenever an object approaches on a collision course. Moreover, it establishes a scale of warning states, from no danger to collision alarm, depending on the activity detected in the current scenario. In the worst case, the minimum time before collision at which the model fires the collision alarm is 40 msec (1 frame before, at 25 frames per second). Since the average time to successfully fire an airbag system is 2 msec, even in the worst case, this algorithm would be very helpful to more efficiently arm the airbag system, or even take some kind of collision avoidance countermeasures. Furthermore, two additional modules have been included: a "Topological Feature Estimator" and an "Attention Focusing Algorithm". The former takes into account the shape of the approaching object to decide whether it is a person, a road line or a car. This helps to take more adequate countermeasures and to filter false alarms. The latter centres the processing power into the most active zones of the input frame, thus saving memory and processing time resources.

  10. Detecting Neonatal Seizures With Computer Algorithms.

    PubMed

    Temko, Andriy; Lightbody, Gordon

    2016-10-01

    It is now generally accepted that EEG is the only reliable way to accurately detect newborn seizures and, as such, prolonged EEG monitoring is increasingly being adopted in neonatal intensive care units. Long EEG recordings may last from several hours to a few days. With neurophysiologists not always available to review the EEG during unsociable hours, there is a pressing need to develop a reliable and robust automatic seizure detection method-a computer algorithm that can take the EEG signal, process it, and output information that supports clinical decision making. In this study, we review existing algorithms based on how the relevant seizure information is exploited. We start with commonly used methods to extract signatures from seizure signals that range from those that mimic the clinical neurophysiologist to those that exploit mathematical models of neonatal EEG generation. Commonly used classification methods are reviewed that are based on a set of rules and thresholds that are either heuristically tuned or automatically derived from the data. These are followed by techniques to use information about spatiotemporal seizure context. The usual errors in system design and validation are discussed. Current clinical decision support tools that have met regulatory requirements and are available to detect neonatal seizures are reviewed with progress and the outstanding challenges are outlined. This review discusses the current state of the art regarding automatic detection of neonatal seizures.

  11. Parallelization of Edge Detection Algorithm using MPI on Beowulf Cluster

    NASA Astrophysics Data System (ADS)

    Haron, Nazleeni; Amir, Ruzaini; Aziz, Izzatdin A.; Jung, Low Tan; Shukri, Siti Rohkmah

    In this paper, we present the design of parallel Sobel edge detection algorithm using Foster's methodology. The parallel algorithm is implemented using MPI message passing library and master/slave algorithm. Every processor performs the same sequential algorithm but on different part of the image. Experimental results conducted on Beowulf cluster are presented to demonstrate the performance of the parallel algorithm.
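
    A minimal mpi4py sketch of this decomposition: the master scatters horizontal strips of the image, every process runs the same sequential Sobel kernel on its strip, and the master gathers the edge map. Halo rows between strips are ignored for brevity, and the image itself is a random placeholder.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def sobel(block):
    """Sequential Sobel gradient magnitude on one horizontal strip."""
    gx = np.zeros_like(block, dtype=float)
    gy = np.zeros_like(block, dtype=float)
    gx[1:-1, 1:-1] = (block[:-2, 2:] + 2 * block[1:-1, 2:] + block[2:, 2:]
                      - block[:-2, :-2] - 2 * block[1:-1, :-2] - block[2:, :-2])
    gy[1:-1, 1:-1] = (block[2:, :-2] + 2 * block[2:, 1:-1] + block[2:, 2:]
                      - block[:-2, :-2] - 2 * block[:-2, 1:-1] - block[:-2, 2:])
    return np.hypot(gx, gy)

if rank == 0:                                    # master: load and split the image
    image = np.random.rand(1024, 1024)           # placeholder for a real image
    strips = np.array_split(image, size, axis=0)
else:
    strips = None

strip = comm.scatter(strips, root=0)             # each process receives one strip
edges = sobel(strip)                             # same sequential kernel everywhere
result = comm.gather(edges, root=0)              # master reassembles the edge map

if rank == 0:
    edge_map = np.vstack(result)
# run with e.g.: mpiexec -n 4 python parallel_sobel.py
```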

  12. Network Algorithms for Detection of Radiation Sources

    SciTech Connect

    Rao, Nageswara S; Brooks, Richard R; Wu, Qishi

    2014-01-01

    In support of national defense, the Domestic Nuclear Detection Office's (DNDO) Intelligent Radiation Sensor Systems (IRSS) program supported the development of networks of radiation counters for detecting, localizing and identifying low-level, hazardous radiation sources. Industry teams developed the first generation of such networks with tens of counters, and demonstrated several of their capabilities in indoor and outdoor characterization tests. Subsequently, these test measurements have been used in algorithm replays using various sub-networks of counters. Test measurements combined with algorithm outputs are used to extract Key Measurements and Benchmark (KMB) datasets. We present two selective analyses of these datasets: (a) a notional border monitoring scenario that highlights the benefits of a network of counters compared to individual detectors, and (b) new insights into the Sequential Probability Ratio Test (SPRT) detection method, which lead to its adaptations for improved detection. Using KMB datasets from an outdoor test, we construct a notional border monitoring scenario, wherein twelve 2×2 NaI detectors are deployed on the periphery of a 21×21 meter square region. A Cs-137 (175 µCi) source is moved across this region, starting several meters from outside and finally moving away. The measurements from individual counters and the network were processed using replays of a particle filter algorithm developed under the IRSS program. The algorithm outputs from KMB datasets clearly illustrate the benefits of combining measurements from all networked counters: the source was detected before it entered the region, during its trajectory inside, and until it moved several meters away. When individual counters are used for detection, the source was detected for much shorter durations, and sometimes was missed in the interior region. The application of SPRT for detecting radiation sources requires choosing the detection threshold, which in turn requires a source strength
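
    As background for the SPRT discussion, a textbook Wald SPRT on a stream of counter readings is sketched below, assuming Poisson counts with a known background rate b and a hypothesized source-plus-background rate b + s; the rates, error targets and the report's adaptations are illustrative or omitted.

```python
import math

def sprt_poisson(counts, b, s, alpha=1e-3, beta=0.1):
    """Sequential probability ratio test on a stream of Poisson counts.

    H0: rate = b (background only);  H1: rate = b + s (source present).
    Returns "source", "background", or "undecided"."""
    upper = math.log((1.0 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1.0 - alpha))   # accept H0 below this
    llr = 0.0
    for n in counts:
        # Per-sample Poisson log-likelihood ratio log[P(n | b+s) / P(n | b)].
        llr += n * math.log((b + s) / b) - s
        if llr >= upper:
            return "source"
        if llr <= lower:
            return "background"
    return "undecided"

print(sprt_poisson([3, 5, 6, 8, 7, 9], b=3.0, s=4.0))
```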

  13. Evaluation of hybrids algorithms for mass detection in digitalized mammograms

    NASA Astrophysics Data System (ADS)

    Cordero, José; Garzón Reyes, Johnson

    2011-01-01

    Breast cancer remains a significant public health problem; early detection of lesions can increase the chances of successful treatment. Mammography is an effective imaging modality for early diagnosis of abnormalities, in which a medical image of the mammary gland is obtained with low-dose X-rays. This allows detecting a tumor or circumscribed mass two to three years before it becomes clinically palpable, and it is the only method that has so far achieved a reduction in mortality from breast cancer. In this paper three hybrid algorithms for circumscribed mass detection on digitalized mammograms are evaluated. The first stage corresponds to a review of the enhancement and segmentation techniques used in processing the mammographic images. Afterwards, shape filtering was applied to the resulting regions. The surviving regions were then processed by means of a Bayesian filter, and the characteristic vector for the classifier was constructed from a few measurements. Later, the implemented algorithms were evaluated by ROC curves, using 40 test images: 20 normal images and 20 images with circumscribed lesions. Finally, the advantages and disadvantages of each algorithm in correctly detecting a lesion are discussed.

  14. CLADA: cortical longitudinal atrophy detection algorithm.

    PubMed

    Nakamura, Kunio; Fox, Robert; Fisher, Elizabeth

    2011-01-01

    Measurement of changes in brain cortical thickness is useful for the assessment of regional gray matter atrophy in neurodegenerative conditions. A new longitudinal method, called CLADA (cortical longitudinal atrophy detection algorithm), has been developed for the measurement of changes in cortical thickness in magnetic resonance images (MRI) acquired over time. CLADA creates a subject-specific cortical model which is longitudinally deformed to match images from individual time points. The algorithm was designed to work reliably for lower resolution images, such as the MRIs with 1×1×5 mm³ voxels previously acquired for many clinical trials in multiple sclerosis (MS). CLADA was evaluated to determine reproducibility, accuracy, and sensitivity. Scan-rescan variability was 0.45% for images with 1 mm³ isotropic voxels and 0.77% for images with 1×1×5 mm³ voxels. The mean absolute accuracy error was 0.43 mm, as determined by comparison of CLADA measurements to cortical thickness measured directly in post-mortem tissue. CLADA's sensitivity for correctly detecting at least 0.1 mm change was 86% in a simulation study. A comparison to FreeSurfer showed good agreement (Pearson correlation=0.73 for global mean thickness). CLADA was also applied to MRIs acquired over 18 months in secondary progressive MS patients who were imaged at two different resolutions. Cortical thinning was detected in this group in both the lower and higher resolution images. CLADA detected a higher rate of cortical thinning in MS patients compared to healthy controls over 2 years. These results show that CLADA can be used for reliable measurement of cortical atrophy in longitudinal studies, even in lower resolution images.

  15. CLADA: Cortical Longitudinal Atrophy Detection Algorithm

    PubMed Central

    Nakamura, Kunio; Fox, Robert; Fisher, Elizabeth

    2010-01-01

    Measurement of changes in brain cortical thickness is useful for assessment of regional gray matter atrophy in neurodegenerative conditions. A new longitudinal method, called CLADA (cortical longitudinal atrophy detection algorithm), has been developed for measurement of changes in cortical thickness in magnetic resonance images (MRI) acquired over time. CLADA creates a subject-specific cortical model which is longitudinally deformed to match images from individual time points. The algorithm was designed to work reliably for lower-resolution images, such as the MRIs with 1×1×5 mm³ voxels previously acquired for many clinical trials in multiple sclerosis (MS). CLADA was evaluated to determine reproducibility, accuracy, and sensitivity. Scan-rescan variability was 0.45% for images with 1 mm³ isotropic voxels and 0.77% for images with 1×1×5 mm³ voxels. The mean absolute accuracy error was 0.43 mm, as determined by comparison of CLADA measurements to cortical thickness measured directly in post-mortem tissue. CLADA's sensitivity for correctly detecting at least 0.1 mm change was 86% in a simulation study. A comparison to FreeSurfer showed good agreement (Pearson correlation = 0.73 for global mean thickness). CLADA was also applied to MRIs acquired over 18 months in secondary progressive MS patients who were imaged at two different resolutions. Cortical thinning was detected in this group in both the lower and higher resolution images. CLADA detected a higher rate of cortical thinning in MS patients compared to healthy controls over 2 years. These results show that CLADA can be used for reliable measurement of cortical atrophy in longitudinal studies, even in lower resolution images. PMID:20674750

  16. Photon Counting Using Edge-Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Gin, Jonathan W.; Nguyen, Danh H.; Farr, William H.

    2010-01-01

    New applications such as high-datarate, photon-starved, free-space optical communications require photon counting at flux rates into gigaphoton-per-second regimes coupled with subnanosecond timing accuracy. Current single-photon detectors that are capable of handling such operating conditions are designed in an array format and produce output pulses that span multiple sample times. In order to discern one pulse from another and not to overcount the number of incoming photons, a detection algorithm must be applied to the sampled detector output pulses. As flux rates increase, the ability to implement such a detection algorithm becomes difficult within a digital processor that may reside within a field-programmable gate array (FPGA). Systems have been developed and implemented to both characterize gigahertz bandwidth single-photon detectors, as well as process photon count signals at rates into gigaphotons per second in order to implement communications links at SCPPM (serial concatenated pulse position modulation) encoded data rates exceeding 100 megabits per second with efficiencies greater than two bits per detected photon. A hardware edge-detection algorithm and corresponding signal combining and deserialization hardware were developed to meet these requirements at sample rates up to 10 GHz. The photon discriminator deserializer hardware board accepts four inputs, which allows for the ability to take inputs from a quadphoton counting detector, to support requirements for optical tracking with a reduced number of hardware components. The four inputs are hardware leading-edge detected independently. After leading-edge detection, the resultant samples are ORed together prior to deserialization. The deserialization is performed to reduce the rate at which data is passed to a digital signal processor, perhaps residing within an FPGA. The hardware implements four separate analog inputs that are connected through RF connectors. Each analog input is fed to a high-speed 1
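
    A software analogue of the scheme described above, shown on sampled waveforms: each of the four inputs is leading-edge detected independently so that a multi-sample pulse is counted once, and the per-channel edge streams are then ORed together. The threshold is illustrative.

```python
import numpy as np

def leading_edges(samples, threshold):
    """True only at rising threshold crossings, so a detector pulse that spans
    several samples is counted exactly once."""
    above = samples > threshold
    edges = np.zeros_like(above)
    edges[1:] = above[1:] & ~above[:-1]      # low-to-high transitions only
    return edges

def combined_edges(channels, threshold=0.5):
    """OR the leading-edge streams of the (e.g. four) detector inputs before
    counting, mirroring the hardware combining stage."""
    return np.logical_or.reduce([leading_edges(ch, threshold) for ch in channels])

# photon_count = combined_edges(four_channel_samples).sum()   # placeholder input
```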

  17. New segmentation algorithm for detecting tiny objects

    NASA Astrophysics Data System (ADS)

    Sun, Han; Yang, Jingyu; Ren, Mingwu; Gao, Jian-zhen

    2001-09-01

    Road cracks in the highway surface are very dangerous to traffic and should be found and repaired as early as possible, so we designed a system for automatically detecting cracks in the highway surface. This system involves several key steps. For instance, the first step, image recording, must use a high-quality imaging device because of the high vehicle speed. In addition, the original data volume is very large, so it requires large storage media and effective compression. As the illumination is strongly affected by the environment, it is essential to do some preprocessing first, such as image reconstruction and enhancement. Because the cracks are very tiny, segmentation is rather difficult. This paper proposes a new segmentation method to detect such tiny cracks, even 2 mm-wide ones. In this algorithm, we first perform edge detection to obtain seeds for subsequent line growing, then delete the false seeds and extract the crack information. The method is sufficiently accurate and fast.

  18. Earthquake detection by new motion estimation algorithm in video processing

    NASA Astrophysics Data System (ADS)

    Hong, Chien-Shiang; Wang, Chuen-Ching; Tai, Shen-Chuan; Chen, Ji-Feng; Wang, Chung-Yao

    2011-01-01

    As increasing urbanization takes place worldwide, earthquake hazards pose serious threats to lives and property in urban areas. A practical earthquake prediction method appears to be far from realization. Traditional instruments for earthquake detection have the disadvantages of high cost and large size. To address these problems, this paper presents a new method which can detect earthquake intensity using a video capture device. The method is based on a newly proposed motion vector algorithm that uses simple but effective techniques to immediately calculate the acceleration of a predefined target object. By estimating the motion vector variation, the movement distance of the predefined target object can be computed, and therefore the earthquake amplitude can be determined. The effectiveness of the proposed scheme is demonstrated in a series of experimental simulations. It is shown that the scheme successfully detects the earthquake occurrence and identifies the earthquake amplitude from video streams.

  19. Boundary-detection algorithm for locating edges in digital imagery

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator); Russell, M. J.; Moore, D. G.; Nelson, G. D.

    1975-01-01

    The author has identified the following significant results. Initial development of a computer program which implements a boundary detection algorithm to detect edges in digital images is described. An evaluation of the boundary detection algorithm was conducted to locate boundaries of lakes from LANDSAT-1 imagery. The accuracy of the boundary detection algorithm was determined by comparing the area within boundaries of lakes located using digitized LANDSAT imagery with the area of the same lakes planimetered from imagery collected from an aircraft platform.

  20. Climate Data Homogenization Using Edge Detection Algorithms

    NASA Astrophysics Data System (ADS)

    Hammann, A. C.; Rennermalm, A. K.

    2015-12-01

    The problem of climate data homogenization has predominantly been addressed by testing the likelihood of one or more breaks inserted into a given time series and modeling the mean to be stationary in between the breaks. We recast the same problem in a slightly different form: that of detecting step-like changes in noisy data, and observe that this problem has spawned a large number of approaches to its solution as the "edge detection" problem in image processing. With respect to climate data, we ask the question: How can we optimally separate step-like from smoothly-varying low-frequency signals? We study the hypothesis that the edge-detection approach makes better use of all information contained in the time series than the "traditional" approach (e.g. Caussinus and Mestre, 2004), which we base on several observations. 1) The traditional formulation of the problem reduces the available information from the outset to that contained in the test statistic. 2) The criterion of local steepness of the low-frequency variability, while at least hypothetically useful, is ignored. 3) The practice of using monthly data corresponds, mathematically, to applying a moving average filter (to reduce noise) and subsequent subsampling of the result; this subsampling reduces the amount of available information beyond what is necessary for noise reduction. Most importantly, the tradeoff between noise reduction (better with filters with wide support in the time domain) and localization of detected changes (better with filters with narrow support) is expressed in the well-known uncertainty principle and can be addressed optimally within a time-frequency framework. Unsurprisingly, a large number of edge-detection algorithms have been proposed that make use of wavelet decompositions and similar techniques. We are developing this framework in part to be applied to a particular set of climate data from Greenland; we will present results from this application as well as from tests with
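
    As a toy illustration of the edge-detection view (not the wavelet machinery the abstract alludes to), the sketch below convolves a series with a first-derivative-of-Gaussian filter, whose response is large at step-like breaks but small for smooth low-frequency variability and noise; sigma, the threshold factor and the synthetic series are all invented for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def candidate_breaks(series, sigma=10.0, k=4.0):
    """Indices where the derivative-of-Gaussian response is anomalously large,
    i.e. candidate step changes (breaks) in the series."""
    response = gaussian_filter1d(series, sigma=sigma, order=1)
    robust_sd = 1.4826 * np.median(np.abs(response - np.median(response)))
    return np.flatnonzero(np.abs(response) > k * robust_sd)

# Toy daily series: smooth seasonal cycle + noise + an inserted break at index 600.
t = np.arange(1200)
x = np.sin(2 * np.pi * t / 365.25) + 0.3 * np.random.default_rng(2).standard_normal(1200)
x[600:] += 2.5
print(candidate_breaks(x))    # flagged indices cluster around the inserted break
```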

  1. Obstacle Detection Algorithms for Aircraft Navigation: Performance Characterization of Obstacle Detection Algorithms for Aircraft Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia; Coraor, Lee

    2000-01-01

    The research reported here is a part of NASA's Synthetic Vision System (SVS) project for the development of a High Speed Civil Transport Aircraft (HSCT). One of the components of the SVS is a module for detection of potential obstacles in the aircraft's flight path by analyzing the images captured by an on-board camera in real-time. Design of such a module includes the selection and characterization of robust, reliable, and fast techniques and their implementation for execution in real-time. This report describes the results of our research in realizing such a design. It is organized into three parts. Part I. Data modeling and camera characterization; Part II. Algorithms for detecting airborne obstacles; and Part III. Real time implementation of obstacle detection algorithms on the Datacube MaxPCI architecture. A list of publications resulting from this grant as well as a list of relevant publications resulting from prior NASA grants on this topic are presented.

  2. A Community Detection Algorithm Based on Topology Potential and Spectral Clustering

    PubMed Central

    Wang, Zhixiao; Chen, Zhaotong; Zhao, Ya; Chen, Shaoda

    2014-01-01

    Community detection is of great value for complex networks in understanding their inherent laws and predicting their behavior. Spectral clustering algorithms have been successfully applied in community detection. This kind of method has two inadequacies: one is that the input matrices they use cannot provide sufficient structural information for community detection, and the other is that they cannot necessarily derive the proper community number from the ladder distribution of eigenvector elements. In order to solve these problems, this paper puts forward a novel community detection algorithm based on topology potential and spectral clustering. The new algorithm constructs the normalized Laplacian matrix with nodes' topology potential, which contains rich structural information about the network. In addition, the new algorithm can automatically obtain the optimal community number from the local maximum potential nodes. Experimental results showed that the new algorithm gives excellent performance on artificial and real-world networks and outperforms other community detection methods. PMID:25147846

  3. Extensions to Real-time Hierarchical Mine Detection Algorithm

    DTIC Science & Technology

    2002-09-01

    Extensions to Real-Time Hierarchical Mine Detection Algorithm, Final Report, by Sinh Duong and Mabo R. Ito, The University of British Columbia, Vancouver; prepared for Defence R&D Canada (Recherche et développement pour la défense Canada).

  4. Multimode algorithm for detection and tracking of point targets

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, Ronda; Er, Meng H.; Deshpande, Suyog D.; Chan, Philip

    1999-07-01

    This paper deals with the problem of detecting and tracking point targets in a sequence of IR images against slowly moving clouds as well as structured background. Many algorithms are reported in the literature for tracking sizeable targets with good results. The difficulty in tracking point targets, however, is that they are not easily discernible from point-like clutter. Even though the point targets are moving, it is very difficult to detect and track them at reduced false alarm rates because of the non-stationarity of the IR clutter, changing target statistics, and sensor motion. The focus of research in this area is to reduce the false alarm rate to an acceptable level. In certain situations not detecting a true target is acceptable, but declaring a false target as a true one may not be. Although there are many approaches to this problem, no single method works well in all situations. In this paper, we present a multi-mode algorithm involving scene stabilization using image registration, 2D spatial filtering based on the continuous wavelet transform, adaptive thresholding, accumulation of the thresholded frames, and processing of the accumulated frame to obtain the final target trajectories. It is assumed that most targets occupy a couple of pixels. Head-on moving and maneuvering targets are not considered. The algorithm has been tested successfully on the available database and the results are presented.
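
    A minimal sketch of the threshold-and-accumulate stages is given below, assuming the frames are already registered; the continuous-wavelet spatial filter is replaced by a simple local-mean subtraction, and the trajectory extraction step is omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def accumulate_detections(frames, k=5.0):
    """Adaptive thresholding and accumulation over already-registered frames: persistent
    point targets build up along their trajectories, while random noise does not."""
    acc = np.zeros_like(frames[0], dtype=float)
    for frame in frames:
        residual = frame - uniform_filter(frame, size=9)     # crude spatial high-pass
        acc += residual > k * residual.std()                 # per-frame adaptive threshold
    return acc                                               # trajectory extraction not shown
```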

  5. An improved corner detection algorithm for image sequence

    NASA Astrophysics Data System (ADS)

    Yan, Minqi; Zhang, Bianlian; Guo, Min; Tian, Guangyuan; Liu, Feng; Huo, Zeng

    2014-11-01

    A SUSAN corner detection algorithm for image sequences is proposed in this paper. A correlation matching algorithm is used for coarse positioning of the detection area, after which SUSAN corner detection is applied to obtain interest points on the target. The SUSAN detector has been improved: because points in a small region are often incorrectly detected as corners, a neighbor-direction filter is applied to reduce the rate of such mistakes. Experimental results show that the algorithm enhances noise robustness and improves detection accuracy.

  6. A Successive Shortest Path Algorithm for the Assignment Problem.

    DTIC Science & Technology

    1980-08-01

    ... a refinement of the Dinic-Kronrod algorithm [7]. We have used SSP to develop a computer code which is very efficient for solving large, sparse assignment problems. [Fig. 1: a shortest path tree, tabulating each node with its predecessor and distance.] The modified assignment problem relative to (C, A) is defined as follows: minimize the total cost Σ c_ij x_ij over the admissible pairs (i, j).
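
    For illustration, a compact successive-shortest-path solver for a dense assignment problem is sketched below, using Bellman-Ford on the residual graph; the code described in the report is far more efficient and exploits sparsity.

```python
import numpy as np

def assignment_ssp(cost):
    """Minimum-cost assignment via successive shortest augmenting paths, using
    Bellman-Ford on the residual graph (dense cost matrix, unoptimized)."""
    cost = np.asarray(cost, dtype=float)
    n = cost.shape[0]
    INF = float("inf")
    match_row = [-1] * n                      # row    -> assigned column
    match_col = [-1] * n                      # column -> assigned row
    for _ in range(n):                        # one augmentation per iteration
        dist_row = [0.0 if match_row[i] == -1 else INF for i in range(n)]
        dist_col = [INF] * n
        parent_col = [-1] * n
        for _ in range(2 * n):                # Bellman-Ford relaxation rounds
            updated = False
            for i in range(n):
                if dist_row[i] == INF:
                    continue
                for j in range(n):
                    if match_row[i] != j and dist_row[i] + cost[i, j] < dist_col[j]:
                        dist_col[j] = dist_row[i] + cost[i, j]   # forward (non-matching) arc
                        parent_col[j] = i
                        updated = True
            for j in range(n):
                i = match_col[j]
                if i != -1 and dist_col[j] != INF and dist_col[j] - cost[i, j] < dist_row[i]:
                    dist_row[i] = dist_col[j] - cost[i, j]       # backward (matching) arc
                    updated = True
            if not updated:
                break
        # Augment along the path ending at the cheapest reachable unmatched column.
        j = min((c for c in range(n) if match_col[c] == -1), key=lambda c: dist_col[c])
        while j != -1:
            i = parent_col[j]
            next_j = match_row[i]             # column previously matched to row i, or -1
            match_row[i] = j
            match_col[j] = i
            j = next_j
    total = sum(cost[i, match_row[i]] for i in range(n))
    return match_row, total
```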

  7. Health Monitoring System for the SSME-fault detection algorithms

    NASA Technical Reports Server (NTRS)

    Tulpule, S.; Galinaitis, W. S.

    1990-01-01

    A Health Monitoring System (HMS) framework for the Space Shuttle Main Engine (SSME) has been developed by United Technologies Corporation (UTC) for the NASA Lewis Research Center. As part of this effort, fault detection algorithms have been developed to detect SSME faults in sufficient time to shut down the engine. These algorithms have been designed to provide monitoring coverage during the startup, mainstage, and shutdown phases of SSME operation. The algorithms have the capability to detect multiple SSME faults and are based on time series, regression, and clustering techniques. This paper presents a discussion of candidate algorithms suitable for fault detection, followed by a description of the algorithms selected for implementation in the HMS and the results of testing these algorithms with SSME test stand data.

  8. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
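
    A purely numerical sketch of the horizontal check is shown below (the paper's algorithm is formally verified and exact, which this is not); the function name and interface are hypothetical, with each coordinate supplied as a polynomial coefficient list.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def horizontal_conflict(px1, py1, px2, py2, D, T):
    """Return True if horizontal separation falls below D for some t in [0, T].
    Each p* is a coefficient list (lowest degree first) for one position coordinate."""
    dx, dy = P.polysub(px1, px2), P.polysub(py1, py2)
    # g(t) = dx(t)^2 + dy(t)^2 - D^2; a conflict exists iff g(t) < 0 somewhere in [0, T].
    g = P.polysub(P.polyadd(P.polymul(dx, dx), P.polymul(dy, dy)), [D ** 2])
    # The minimum of g over [0, T] occurs at an endpoint or at a real critical point inside.
    crit = P.polyroots(P.polyder(g))
    candidates = [0.0, T] + [r.real for r in crit
                             if abs(r.imag) < 1e-9 and 0.0 < r.real < T]
    return any(P.polyval(t, g) < 0.0 for t in candidates)
```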

  9. A new dwarf detection algorithm applied to M101

    NASA Astrophysics Data System (ADS)

    Bennet, Paul; Sand, David J.; Crnojevic, Denija

    2017-01-01

    The Lambda Cold Dark Matter model of structure formation has been very successful at reproducing observations of large-scale structure. However, challenges emerge at sub-galactic scales; for example, the number of dwarfs around the Milky Way differs from simulations by an order of magnitude (the 'missing satellites problem'). There are several theories to explain this apparent discrepancy, but further observations of Local Volume galaxies and their substructure are required to constrain these models by better sampling halo-to-halo scatter. Here we report on a survey of the M101 group using archival data and a novel dwarf detection algorithm. This survey has discovered 26 new dwarf candidates in the M101 system, extending the dwarf luminosity function by two magnitudes, to M=-7.5. These dwarf candidates also show a distinct spatial asymmetry suggestive of an infalling dwarf group.

  10. A new algorithmic approach for fingers detection and identification

    NASA Astrophysics Data System (ADS)

    Mubashar Khan, Arslan; Umar, Waqas; Choudhary, Taimoor; Hussain, Fawad; Haroon Yousaf, Muhammad

    2013-03-01

    Gesture recognition is concerned with the goal of interpreting human gestures through mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hands. Hand gesture detection in a real-time environment, where time and memory are important constraints, is a critical operation. Hand gesture recognition largely depends on accurate detection of the fingers. This paper presents a new algorithmic approach to detect and identify the fingers of the human hand. The proposed algorithm does not depend on prior knowledge of the scene. It detects the active fingers and the metacarpophalangeal (MCP) joints of the inactive fingers from an already detected hand. A dynamic thresholding technique and a connected-component labeling scheme are employed for background elimination and hand detection, respectively. The approach identifies fingers in a real-time environment while keeping the memory and time requirements as low as possible.
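
    The two pre-processing stages named in the abstract can be approximated in OpenCV as follows; the image file name is hypothetical, and the finger/MCP identification step itself is not shown.

```python
import cv2
import numpy as np

frame = cv2.imread("hand.png", cv2.IMREAD_GRAYSCALE)          # hypothetical input image
blur = cv2.GaussianBlur(frame, (5, 5), 0)
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # dynamic threshold
num, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
if num > 1:                                                   # keep the largest component as the hand
    hand_label = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    hand_mask = np.uint8(labels == hand_label) * 255
```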

  11. A hardware-algorithm co-design approach to optimize seizure detection algorithms for implantable applications.

    PubMed

    Raghunathan, Shriram; Gupta, Sumeet K; Markandeya, Himanshu S; Roy, Kaushik; Irazoqui, Pedro P

    2010-10-30

    Implantable neural prostheses that deliver focal electrical stimulation upon demand are rapidly emerging as an alternate therapy for roughly a third of the epileptic patient population that is medically refractory. Seizure detection algorithms enable feedback mechanisms to provide focally and temporally specific intervention. Real-time feasibility and computational complexity often limit most reported detection algorithms to implementations using computers for bedside monitoring or external devices communicating with the implanted electrodes. A comparison of algorithms based on detection efficacy does not present a complete picture of the feasibility of the algorithm with limited computational power, as is the case with most battery-powered applications. We present a two-dimensional design optimization approach that takes into account both detection efficacy and hardware cost in evaluating algorithms for their feasibility in an implantable application. Detection features are first compared for their ability to detect electrographic seizures from micro-electrode data recorded from kainate-treated rats. Circuit models are then used to estimate the dynamic and leakage power consumption of the compared features. A score is assigned based on detection efficacy and the hardware cost for each of the features, then plotted on a two-dimensional design space. An optimal combination of compared features is used to construct an algorithm that provides maximal detection efficacy per unit hardware cost. The methods presented in this paper would facilitate the development of a common platform to benchmark seizure detection algorithms for comparison and feasibility analysis in the next generation of implantable neuroprosthetic devices to treat epilepsy.

  12. Theoretical foundations of NRL spectral target detection algorithms.

    PubMed

    Schaum, Alan

    2015-11-01

    The principal spectral detection algorithms developed at the Naval Research Laboratory (NRL) over the past 20 years for use in operational systems are described. These include anomaly detectors, signature-based methods, and techniques for anomalous change detection. Newer derivations are provided that have motivated more recent work. Mathematical methods facilitating the use of forward models for the prediction of spectral signature statistics are described and a detection algorithm is derived for ocean surveillance that is based on principles of clairvoyant fusion.

  13. Computer algorithms to detect bloodstream infections.

    PubMed

    Trick, William E; Zagorski, Brandon M; Tokars, Jerome I; Vernon, Michael O; Welbel, Sharon F; Wisniewski, Mary F; Richards, Chesley; Weinstein, Robert A

    2004-09-01

    We compared manual and computer-assisted bloodstream infection surveillance for adult inpatients at two hospitals. We identified hospital-acquired, primary, central-venous catheter (CVC)-associated bloodstream infections by using five methods: retrospective, manual record review by investigators; prospective, manual review by infection control professionals; positive blood culture plus manual CVC determination; computer algorithms; and computer algorithms and manual CVC determination. We calculated sensitivity, specificity, predictive values, plus the kappa statistic (kappa) between investigator review and other methods, and we correlated infection rates for seven units. The kappa value was 0.37 for infection control review, 0.48 for positive blood culture plus manual CVC determination, 0.49 for computer algorithm, and 0.73 for computer algorithm plus manual CVC determination. Unit-specific infection rates, per 1,000 patient days, were 1.0-12.5 by investigator review and 1.4-10.2 by computer algorithm (correlation r = 0.91, p = 0.004). Automated bloodstream infection surveillance with electronic data is an accurate alternative to surveillance with manually collected data.

  14. Parallel Detection Algorithm for Fast Frequency Hopping OFDM

    NASA Astrophysics Data System (ADS)

    Kun, Xu; Xiao-xin, Yi

    2011-05-01

    Fast frequency hopping OFDM (FFH-OFDM) exploits frequency diversity within one OFDM symbol to enhance conventional OFDM performance without using channel coding. Zero-forcing (ZF) and minimum mean square error (MMSE) equalization were first used to detect the FFH-OFDM signal, with relatively poor bit error rate (BER) performance compared to the QR-based detection algorithm. This paper proposes a parallel detection algorithm (PDA) to further improve the BER performance using parallel interference cancellation (PIC) based on the MMSE criterion. The proposed PDA not only improves the BER performance in the high signal-to-noise ratio (SNR) regime but also has a lower decoding delay than the QR-based detection algorithm while maintaining comparable computational complexity. Simulation results indicate that at a BER of 10⁻³ the PDA achieves a 5 dB SNR gain over the QR-based detection algorithm, and more as the SNR increases.
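
    As a rough illustration of the ingredients (not the paper's PDA, which exploits the FFH-OFDM spreading structure), a generic MMSE detector followed by one parallel interference cancellation pass might look like this.

```python
import numpy as np

def mmse_detect(y, H, noise_var):
    """MMSE detection: x_hat = (H^H H + sigma^2 I)^-1 H^H y."""
    N = H.shape[1]
    return np.linalg.solve(H.conj().T @ H + noise_var * np.eye(N), H.conj().T @ y)

def pic_stage(y, H, x_hat, constellation):
    """One parallel interference cancellation pass: for every symbol, subtract the
    hard-decided contribution of all other symbols and re-estimate it."""
    hard = constellation[np.argmin(np.abs(x_hat[:, None] - constellation[None, :]), axis=1)]
    x_new = np.empty_like(x_hat)
    for k in range(H.shape[1]):
        r = y - (H @ hard - H[:, k] * hard[k])               # cancel everyone except symbol k
        h = H[:, k]
        x_new[k] = (h.conj() @ r) / (h.conj() @ h)           # matched filter on the cleaned signal
    return x_new
```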

  15. Analysis of Community Detection Algorithms for Large Scale Cyber Networks

    SciTech Connect

    Mane, Prachita; Shanbhag, Sunanda; Kamath, Tanmayee; Mackey, Patrick S.; Springer, John

    2016-09-30

    The aim of this project is to use existing community detection algorithms on an IP network dataset to create supernodes within the network. This study compares the performance of different algorithms on the network in terms of running time. The paper begins with an introduction to the concept of clustering and community detection followed by the research question that the team aimed to address. Further the paper describes the graph metrics that were considered in order to shortlist algorithms followed by a brief explanation of each algorithm with respect to the graph metric on which it is based. The next section in the paper describes the methodology used by the team in order to run the algorithms and determine which algorithm is most efficient with respect to running time. Finally, the last section of the paper includes the results obtained by the team and a conclusion based on those results as well as future work.

  16. Multi-object Detection and Discrimination Algorithms

    DTIC Science & Technology

    2015-03-26

    Keywords: machine learning, image processing, multi-sensor fusion, classifier development, ground-penetrating radar (GPR), ground boundary detection. Related work from 2011-2012 includes an efficient multiple-layer boundary detection method for ground-penetrating radar data using an extended Viterbi algorithm, and (with Joseph Wilson) support vector data description for detecting the air-ground interface in ground-penetrating radar signals.

  17. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision and a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently depending on the domain and task to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain or task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is critical, since the choice can introduce positive or negative bias into the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies, and we analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.

  18. Improvement and implementation for Canny edge detection algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Qiu, Yue-hong

    2015-07-01

    Edge detection is necessary for image segmentation and pattern recognition. In this paper, an improved Canny edge detection approach is proposed to address the defects of the traditional algorithm. A modified bilateral filter with a compensation function based on pixel-intensity similarity is used to smooth the image instead of a Gaussian filter, which preserves edge features while removing noise effectively. To reduce sensitivity to noise in the gradient calculation, the algorithm uses gradient templates in four directions. Finally, the Otsu method adaptively determines the dual thresholds. The algorithm was implemented with the OpenCV 2.4.0 library in Visual Studio 2010, and experimental analysis shows that the improved algorithm detects edge details more effectively and with greater adaptability.
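
    A hedged approximation of the described pipeline in OpenCV is sketched below: bilateral filtering in place of Gaussian smoothing and Otsu's threshold used to set Canny's dual thresholds; the compensation function and four-direction gradient templates are not reproduced, and the file name is hypothetical.

```python
import cv2

img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)                     # hypothetical input
smoothed = cv2.bilateralFilter(img, d=9, sigmaColor=50, sigmaSpace=50)  # edge-preserving smoothing
otsu_thresh, _ = cv2.threshold(smoothed, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
edges = cv2.Canny(smoothed, 0.5 * otsu_thresh, otsu_thresh)             # dual thresholds from Otsu
```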

  19. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA) Wormhole Detection Algorithm

    PubMed Central

    Karlsson, Jonny; Dooley, Laurence S.; Pulkkis, Göran

    2013-01-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ΔT Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ΔT Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm. PMID:23686143

  20. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    PubMed

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-05-17

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  1. Acoustic change detection algorithm using an FM radio

    NASA Astrophysics Data System (ADS)

    Goldman, Geoffrey H.; Wolfe, Owen

    2012-06-01

    The U.S. Army is interested in developing low-cost, low-power, non-line-of-sight sensors for monitoring human activity. One modality that is often overlooked is active acoustics using sources of opportunity such as speech or music. Active acoustics can be used to detect human activity by generating acoustic images of an area at different times and then testing for changes among the imagery. A change detection algorithm was developed to detect physical changes in a building, such as a door changing position or a large box being moved, using acoustic sources of opportunity. The algorithm is based on cross-correlating the acoustic signal measured from two microphones. The performance of the algorithm was demonstrated using data generated with a hand-held FM radio as a sound source and two microphones. The algorithm could detect a door being opened in a hallway.
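
    A minimal sketch of the underlying idea, assuming two equal-length microphone recordings, is to form a normalized cross-correlation "acoustic image" over a range of lags and compare images taken at different times; the threshold constant is illustrative.

```python
import numpy as np
from scipy.signal import correlate

def acoustic_image(mic_a, mic_b, max_lag):
    """Normalized cross-correlation of two microphone signals over a range of lags;
    each lag corresponds to a difference in acoustic path length within the room."""
    xc = correlate(mic_a - mic_a.mean(), mic_b - mic_b.mean(), mode="full")
    mid = len(mic_a) - 1                                     # zero-lag index for 'full' output
    xc = xc[mid - max_lag: mid + max_lag + 1]
    return xc / (np.linalg.norm(mic_a) * np.linalg.norm(mic_b) + 1e-12)

def scene_changed(image_before, image_after, threshold=0.2):
    """Declare a physical change when the acoustic images at two times differ enough."""
    return np.max(np.abs(image_before - image_after)) > threshold
```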

  2. A Color Image Edge Detection Algorithm Based on Color Difference

    NASA Astrophysics Data System (ADS)

    Zhuo, Li; Hu, Xiaochen; Jiang, Liying; Zhang, Jing

    2016-12-01

    Although image edge detection algorithms have been widely applied in image processing, the existing algorithms still face two important problems. On one hand, to restrain the interference of noise, smoothing filters are generally exploited in the existing algorithms, resulting in loss of significant edges. On the other hand, since the existing algorithms are sensitive to noise, many noisy edges are usually detected, which will disturb the subsequent processing. Therefore, a color image edge detection algorithm based on color difference is proposed in this paper. Firstly, a new operation called color separation is defined in this paper, which can reflect the information of color difference. Then, for the neighborhood of each pixel, color separations are calculated in four different directions to detect the edges. Experimental results on natural and synthetic images show that the proposed algorithm can remove a large number of noisy edges and be robust to the smoothing filters. Furthermore, the proposed edge detection algorithm is applied in road foreground segmentation and shadow removal, which achieves good performances.

  3. A Quantum Algorithm Detecting Concentrated Maps.

    PubMed

    Beichl, Isabel; Bullock, Stephen S; Song, Daegene

    2007-01-01

    We consider an arbitrary mapping f: {0, …, N - 1} → {0, …, N - 1} for N = 2^n, with n some number of quantum bits. Using N calls to a classical oracle evaluating f(x) and an N-bit memory, it is possible to determine whether f(x) is one-to-one. For some radian angle 0 ≤ θ ≤ π/2, we say f(x) is θ-concentrated if and only if [Formula: see text] for some given ψ0 and any 0 ≤ x ≤ N - 1. We present a quantum algorithm that distinguishes a θ-concentrated f(x) from a one-to-one f(x) in O(1) calls to a quantum oracle function Uf with high probability. For 0 < θ < 0.3301 rad, the quantum algorithm outperforms random (classical) evaluation of the function testing for dispersed values (on average). Maximal outperformance occurs at [Formula: see text] rad.

  4. Estimating the chance of success in IVF treatment using a ranking algorithm.

    PubMed

    Güvenir, H Altay; Misirli, Gizem; Dilbaz, Serdar; Ozdegirmenci, Ozlem; Demir, Berfu; Dilbaz, Berna

    2015-09-01

    In medicine, estimating the chance of success of a treatment is important in deciding whether to begin it. This paper focuses on the domain of in vitro fertilization (IVF), where estimating the outcome of treatment is crucial to the decision to proceed, for both the clinicians and the infertile couples. IVF treatment is a stressful and costly process, and it is especially stressful for couples who want to have a baby. If an initial evaluation indicates a low chance of pregnancy, the couple may decide not to start IVF treatment. The aim of this study is twofold: first, to develop a technique that can be used to estimate the chance of success for a couple who wants to have a baby, and second, to determine the attributes, and their particular values, that affect the outcome of IVF treatment. We propose a new technique, called success estimation using a ranking algorithm (SERA), for estimating the success of a treatment using a ranking-based algorithm; the particular ranking algorithm used here is RIMARC. The performance of the new algorithm is compared with two well-known algorithms that assign class probabilities to query instances: the Naïve Bayes classifier and Random Forest. The comparison is done in terms of area under the ROC curve, accuracy, and execution time, using tenfold stratified cross-validation. The results indicate that the proposed SERA algorithm has the potential to be used successfully to estimate the probability of success in medical treatment.

  5. A SAR ATR algorithm based on coherent change detection

    SciTech Connect

    Harmony, D.W.

    2000-12-01

    This report discusses an automatic target recognition (ATR) algorithm for synthetic aperture radar (SAR) imagery that is based on coherent change detection techniques. The algorithm relies on templates created from training data to identify targets. Objects are identified or rejected as targets by comparing their SAR signatures with templates using the same complex correlation scheme developed for coherent change detection. Preliminary results are presented in addition to future recommendations.
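
    The complex correlation (coherence) at the core of coherent change detection can be estimated over a local window as sketched below for two co-registered complex SAR images; this is generic CCD, not the report's template-based ATR scheme.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(img1, img2, window=5):
    """Sample complex correlation (coherence) between two co-registered complex SAR
    images, estimated over a local window. Values near 1 indicate an unchanged scene."""
    cross = img1 * np.conj(img2)
    num = uniform_filter(np.real(cross), window) + 1j * uniform_filter(np.imag(cross), window)
    den = np.sqrt(uniform_filter(np.abs(img1) ** 2, window) *
                  uniform_filter(np.abs(img2) ** 2, window))
    return np.abs(num) / (den + 1e-12)
```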

  6. An Anomaly Clock Detection Algorithm for a Robust Clock Ensemble

    DTIC Science & Technology

    2009-11-01

    From the 41st Annual Precise Time and Time Interval (PTTI) Meeting: An Anomaly Clock Detection Algorithm for a Robust Clock Ensemble. The ensemble keeps its clocks in phase and on frequency at all times, with the advantages of being relatively simple, robust, fully redundant, and of improved performance. Algorithm parameters, such as the sliding-window width as a function of the time constant and the minimum detectable levels, have been optimized.

  7. A fuzzy clustering algorithm to detect planar and quadric shapes

    NASA Technical Reports Server (NTRS)

    Krishnapuram, Raghu; Frigui, Hichem; Nasraoui, Olfa

    1992-01-01

    In this paper, we introduce a new fuzzy clustering algorithm to detect an unknown number of planar and quadric shapes in noisy data. The proposed algorithm is computationally and implementationally simple, and it overcomes many of the drawbacks of the existing algorithms that have been proposed for similar tasks. Since the clustering is performed in the original image space, and since no features need to be computed, this approach is particularly suited for sparse data. The algorithm may also be used in pattern recognition applications.

  8. AdaBoost-based algorithm for network intrusion detection.

    PubMed

    Hu, Weiming; Hu, Wei; Maybank, Steve

    2008-04-01

    Network intrusion detection aims at distinguishing the attacks on the Internet from normal use of the Internet. It is an indispensable part of the information security system. Due to the variety of network behaviors and the rapid development of attack fashions, it is necessary to develop fast machine-learning-based intrusion detection algorithms with high detection rates and low false-alarm rates. In this correspondence, we propose an intrusion detection algorithm based on the AdaBoost algorithm. In the algorithm, decision stumps are used as weak classifiers. The decision rules are provided for both categorical and continuous features. By combining the weak classifiers for continuous features and the weak classifiers for categorical features into a strong classifier, the relations between these two different types of features are handled naturally, without any forced conversions between continuous and categorical features. Adaptable initial weights and a simple strategy for avoiding overfitting are adopted to improve the performance of the algorithm. Experimental results show that our algorithm has low computational complexity and error rates, as compared with algorithms of higher computational complexity, as tested on the benchmark sample data.
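
    A generic version using scikit-learn's AdaBoost with decision stumps is shown below; the paper's decision rules for categorical features, adaptable initial weights, and overfitting strategy are not reproduced, and the data set names are hypothetical.

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

# Decision stumps (depth-1 trees) as weak classifiers, combined by AdaBoost.
clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1), n_estimators=200)
# clf.fit(X_train, y_train); y_pred = clf.predict(X_test)   # X_*, y_* are hypothetical data sets
```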

  9. Automatic DarkAdaptation Threshold Detection Algorithm.

    PubMed

    G de Azevedo, Dario; Helegda, Sergio; Glock, Flavio; Russomano, Thais

    2005-01-01

    This paper describes an algorithm used to automatically determine the threshold sensitivity in a new dark adaptometer. The new instrument is controlled by a personal computer and can be used in the investigation of several retinal diseases. The stimulus field is delivered to the eye through the modified optics of a fundus camera. An automated light stimulus source was developed to operate together with this fundus camera. New control parameters were developed in this instrument to improve the traditional Goldmann-Weekers dark adaptometer.

  10. Detection Algorithms for Hyperspectral Imaging Applications

    DTIC Science & Technology

    2010-08-26

    References cited in the available excerpt include A. Schaum and A. Stocker, "Spectrally-selective target detection," Proceedings of ISSSR, 1997; R. A. Schowengerdt, Remote Sensing: Models and Methods for Image Processing; D. Stein, S. Beaven, L. Hoff, E. Winter, A. Schaum, and A. Stocker, "Anomaly detection from hyperspectral imagery," IEEE Signal Processing Magazine, 2002; and A. D. Stocker and A. Schaum, IEEE Trans. AES, 36(4):1115-1125, Oct. 2000.

  11. Community detection algorithms: a comparative analysis.

    PubMed

    Lancichinetti, Andrea; Fortunato, Santo

    2009-11-01

    Uncovering the community structure exhibited by real networks is a crucial step toward an understanding of complex systems that goes beyond the local organization of their constituents. Many algorithms have been proposed so far, but none of them has been subjected to strict tests to evaluate their performance. Most of the sporadic tests performed so far involved small networks with known community structure and/or artificial graphs with a simplified structure, which is very uncommon in real systems. Here we test several methods against a recently introduced class of benchmark graphs, with heterogeneous distributions of degree and community size. The methods are also tested against the benchmark by Girvan and Newman [Proc. Natl. Acad. Sci. U.S.A. 99, 7821 (2002)] and on random graphs. As a result of our analysis, three recent algorithms introduced by Rosvall and Bergstrom [Proc. Natl. Acad. Sci. U.S.A. 104, 7327 (2007); Proc. Natl. Acad. Sci. U.S.A. 105, 1118 (2008)], Blondel [J. Stat. Mech.: Theory Exp. (2008), P10008], and Ronhovde and Nussinov [Phys. Rev. E 80, 016109 (2009)] have an excellent performance, with the additional advantage of low computational complexity, which enables one to analyze large systems.

  12. A Motion Detection Algorithm Using Local Phase Information

    PubMed Central

    Lazar, Aurel A.; Ukani, Nikul H.; Zhou, Yiyin

    2016-01-01

    Previous research demonstrated that global phase alone can be used to faithfully represent visual scenes. Here we provide a reconstruction algorithm by using only local phase information. We also demonstrate that local phase alone can be effectively used to detect local motion. The local phase-based motion detector is akin to models employed to detect motion in biological vision, for example, the Reichardt detector. The local phase-based motion detection algorithm introduced here consists of two building blocks. The first building block measures/evaluates the temporal change of the local phase. The temporal derivative of the local phase is shown to exhibit the structure of a second order Volterra kernel with two normalized inputs. We provide an efficient, FFT-based algorithm for implementing the change of the local phase. The second processing building block implements the detector; it compares the maximum of the Radon transform of the local phase derivative with a chosen threshold. We demonstrate examples of applying the local phase-based motion detection algorithm on several video sequences. We also show how the locally detected motion can be used for segmenting moving objects in video scenes and compare our local phase-based algorithm to segmentation achieved with a widely used optic flow algorithm. PMID:26880882
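
    A crude sketch of the two building blocks is given below, assuming grayscale frames: a windowed-FFT stand-in for the local phase, its wrapped temporal difference, and the maximum of a Radon transform compared with a threshold. It only illustrates the structure described above, not the paper's Volterra-kernel formulation.

```python
import numpy as np
from skimage.transform import radon

def local_phase(frame, block=16):
    """Local phase stand-in: phase of the lowest non-DC Fourier coefficients of each
    Hann-windowed block of the frame."""
    h, w = frame.shape
    win = np.outer(np.hanning(block), np.hanning(block))
    ph = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            F = np.fft.fft2(frame[i*block:(i+1)*block, j*block:(j+1)*block] * win)
            ph[i, j] = np.angle(F[0, 1] + F[1, 0])
    return ph

def motion_detected(prev_frame, frame, threshold):
    """Wrapped temporal change of local phase, then the maximum of its Radon transform."""
    dphi = np.angle(np.exp(1j * (local_phase(frame) - local_phase(prev_frame))))
    sinogram = radon(np.abs(dphi), circle=False)
    return sinogram.max() > threshold
```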

  13. A novel algorithm of maximin Latin hypercube design using successive local enumeration

    NASA Astrophysics Data System (ADS)

    Zhu, Huaguang; Liu, Li; Long, Teng; Peng, Lei

    2012-05-01

    The design of computer experiments (DoCE) is a key technique in the field of metamodel-based design optimization. Space-filling and projective properties are desired features in DoCE. In this article, a novel algorithm of maximin Latin hypercube design (LHD) using successive local enumeration (SLE) is proposed for generating arbitrary m points in n-dimensional space. Testing results compared with lhsdesign function, binary encoded genetic algorithm (BinGA), permutation encoded genetic algorithm (PermGA) and translational propagation algorithm (TPLHD) indicate that SLE is effective to generate sampling points with good space-filling and projective properties. The accuracies of metamodels built with the sampling points produced by lhsdesign function and SLE are compared to illustrate the preferable performance of SLE. Through the comparative study on efficiency with BinGA, PermGA, and TPLHD, as a novel algorithm of LHD sampling techniques, SLE has good space-filling property and acceptable efficiency.

  14. A two-level detection algorithm for optical fiber vibration

    NASA Astrophysics Data System (ADS)

    Bi, Fukun; Ren, Xuecong; Qu, Hongquan; Jiang, Ruiqing

    2015-09-01

    Optical fiber vibration is detected with the coherent optical time-domain reflectometry technique. In addition to the vibration signals, the reflected signals include clutter and noise, which lead to a high false alarm rate. The cell-averaging constant false alarm rate (CFAR) algorithm has a high computing speed, but its detection performance degrades in nonhomogeneous environments such as multiple-target scenarios. The order-statistics CFAR algorithm has a distinct advantage in multiple-target environments, but it is slower. An intelligent two-level detection algorithm is presented in which the cell-averaging and order-statistics CFAR detectors operate in series, so that the detection speed of the former and the performance of the latter are both preserved. Through adaptive selection, cell averaging is applied in homogeneous environments and the two-level detection algorithm is employed in nonhomogeneous environments. Monte Carlo simulation results demonstrate that, over a range of signal-to-noise ratios, the proposed algorithm gives a better detection probability than the order-statistics detector alone.
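
    Generic one-dimensional versions of the two detectors being combined are sketched below; the guard/training sizes and scale factors are illustrative, and the adaptive selection logic between the two levels is not shown.

```python
import numpy as np

def ca_cfar(x, guard=2, train=16, scale=4.0):
    """Cell-averaging CFAR: compare each cell with a scaled mean of its training cells."""
    n, det = len(x), np.zeros(len(x), dtype=bool)
    for i in range(train + guard, n - train - guard):
        window = np.concatenate([x[i - guard - train: i - guard],
                                 x[i + guard + 1: i + guard + train + 1]])
        det[i] = x[i] > scale * window.mean()
    return det

def os_cfar(x, guard=2, train=16, k=24, scale=4.0):
    """Order-statistics CFAR: use the k-th smallest training cell as the noise estimate,
    which is more robust when other targets contaminate the training window."""
    n, det = len(x), np.zeros(len(x), dtype=bool)
    for i in range(train + guard, n - train - guard):
        window = np.concatenate([x[i - guard - train: i - guard],
                                 x[i + guard + 1: i + guard + train + 1]])
        det[i] = x[i] > scale * np.sort(window)[k - 1]
    return det
```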

  15. Detecting Community Structure by Using a Constrained Label Propagation Algorithm

    PubMed Central

    Ratnavelu, Kuru

    2016-01-01

    Community structure is considered one of the most interesting features of complex networks. Many real-world complex systems exhibit community structure, where individuals with similar properties form a community. Identifying the communities in a network is important for understanding the structure of that network from a specific perspective; thus, community detection in complex networks has gained immense interest over the last decade. Many community detection methods have been proposed, one of which is the label propagation algorithm (LPA). The simplicity and time efficiency of the LPA make it a popular community detection method. However, the LPA suffers from unstable detection due to the randomness induced in the algorithm. The focus of this paper is to improve the stability and accuracy of the LPA while retaining its simplicity. Our proposed algorithm first detects the main communities in a network by using the number of mutual neighbouring nodes. Subsequently, nodes are added into communities by using a constrained LPA, and those constraints are gradually relaxed until all nodes are assigned to groups. To refine the quality of the detected communities, nodes can be switched to another community or removed from their current communities at various stages of the algorithm. We evaluated our algorithm on three types of benchmark networks, namely the Lancichinetti-Fortunato-Radicchi (LFR), Relaxed Caveman (RC) and Girvan-Newman (GN) benchmarks, and we also apply it to some real-world networks of various sizes. The results show that the proposed algorithm has promising potential for detecting communities accurately. Furthermore, our constrained LPA is significantly more robust and stable than the simple LPA, as it yields deterministic results. PMID:27176470
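
    For reference, plain (unconstrained) label propagation is available in NetworkX, as sketched below; the constrained seeding from mutual neighbours, the gradual relaxation, and the refinement stages of the proposed method are not shown.

```python
import networkx as nx
from networkx.algorithms.community import asyn_lpa_communities

G = nx.karate_club_graph()                                   # small example network
communities = [set(c) for c in asyn_lpa_communities(G, seed=42)]
print(len(communities), "communities found")
```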

  16. A Fusion Method of Gabor Wavelet Transform and Unsupervised Clustering Algorithms for Tissue Edge Detection

    PubMed Central

    Ergen, Burhan

    2014-01-01

    This paper proposes two edge detection methods for medical images by integrating the advantages of Gabor wavelet transform (GWT) and unsupervised clustering algorithms. The GWT is used to enhance the edge information in an image while suppressing noise. Following this, the k-means and Fuzzy c-means (FCM) clustering algorithms are used to convert a gray level image into a binary image. The proposed methods are tested using medical images obtained through Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) devices, and a phantom image. The results prove that the proposed methods are successful for edge detection, even in noisy cases. PMID:24790590
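
    A rough sketch of the fusion idea, using a stock test image and k-means only (the Fuzzy c-means variant is analogous), might look like this; the Gabor frequency is an assumed value, not one taken from the paper.

```python
import numpy as np
from skimage import data, filters
from sklearn.cluster import KMeans

image = data.camera().astype(float) / 255.0                  # stock grayscale test image
real, imag = filters.gabor(image, frequency=0.3)             # GWT-style band-pass response
magnitude = np.hypot(real, imag)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(magnitude.reshape(-1, 1))
binary = labels.reshape(magnitude.shape)                     # edge / non-edge partition
```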

  17. QuateXelero: An Accelerated Exact Network Motif Detection Algorithm

    PubMed Central

    Khakabimamaghani, Sahand; Sharafuddin, Iman; Dichter, Norbert; Koch, Ina; Masoudi-Nejad, Ali

    2013-01-01

    Finding motifs in biological, social, technological, and other types of networks has become a widespread method for gaining more knowledge about these networks' structure and function. However, this task is very computationally demanding, because it is closely associated with graph isomorphism, a problem not known to be in P nor to be NP-complete. Accordingly, this research endeavors to decrease the need to call the NAUTY isomorphism detection method, which is the most time-consuming step in many existing algorithms. The work provides an extremely fast motif detection algorithm called QuateXelero, which has a quaternary tree data structure at its heart. The proposed algorithm is based on the well-known ESU (FANMOD) motif detection algorithm. The results of experiments on some standard model networks confirm the overall superiority of the proposed algorithm, QuateXelero, compared with two of the fastest existing algorithms, G-Tries and Kavosh. QuateXelero is especially fast in constructing the central data structure of the algorithm from scratch based on the input network. PMID:23874498

  18. Novel automatic eye detection and tracking algorithm

    NASA Astrophysics Data System (ADS)

    Ghazali, Kamarul Hawari; Jadin, Mohd Shawal; Jie, Ma; Xiao, Rui

    2015-04-01

    The eye is not only one of the most complex but also one of the most important sensory organs of the human body. Eye detection and eye tracking are fundamental and active topics in image processing. Non-invasive eye location and tracking are promising for hands-off gaze-based human-computer interfaces, fatigue detection, instrument control by paraplegic patients, and so on. For this purpose, an innovative framework is proposed in this paper to detect and track the eye in video sequences. The contributions of this work are twofold. First, eye filters were trained that can detect the eye location efficiently and accurately without constraints on the background or skin colour. Second, a tracking framework based on sparse representation and the Lucas-Kanade optical flow tracker was built that can track the eye without constraints on the eye state. The experimental results demonstrate the accuracy and real-time applicability of the proposed approach.

  19. Advanced modularity-specialized label propagation algorithm for detecting communities in networks

    NASA Astrophysics Data System (ADS)

    Liu, X.; Murata, T.

    2010-04-01

    A modularity-specialized label propagation algorithm (LPAm) for detecting network communities was recently proposed. This promising algorithm offers some desirable qualities. However, LPAm favors community divisions where all communities are similar in total degree and thus it is prone to get stuck in poor local maxima in the modularity space. To escape local maxima, we employ a multistep greedy agglomerative algorithm (MSG) that can merge multiple pairs of communities at a time. Combining LPAm and MSG, we propose an advanced modularity-specialized label propagation algorithm (LPAm+). Experiments show that LPAm+ successfully detects communities with higher modularity values than ever reported in two commonly used real-world networks. Moreover, LPAm+ offers a fair compromise between accuracy and speed.

  20. Face detection based on multiple kernel learning algorithm

    NASA Astrophysics Data System (ADS)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun

    2016-09-01

    Face detection is important for face localization in face or facial expression recognition, among other applications. The basic task is to determine whether there is a face in an image and, if so, its location and size. It can be seen as a binary classification problem, which can be well solved by a support vector machine (SVM). Although SVM has strong generalization ability, it has some limitations, which are analyzed in depth in the paper. To address them, we study the principle and characteristics of Multiple Kernel Learning (MKL) and propose an MKL-based face detection algorithm. We describe the proposed algorithm from the interdisciplinary perspective of machine learning and image processing. After analyzing the limitations of describing a face with a single feature, we apply several features. To fuse them well, we try different kernel functions on different features; the MKL method then determines the weight of each kernel. Thus we obtain the face detection model, which is the core of the proposed method. Experiments on a public data set and real-life face images are performed, and we compare the performance of the proposed algorithm with a single kernel-single feature based algorithm and a multiple kernels-single feature based algorithm. The effectiveness of the proposed algorithm is illustrated. Keywords: face detection, feature fusion, SVM, MKL

  1. Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model

    PubMed Central

    Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai

    2017-01-01

    Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences. PMID:28208694
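
    A toy threshold-based rule over the three phases (free fall, impact, rest) on an acceleration-magnitude signal in g is sketched below; all thresholds and windows are illustrative, not the validated values from the study, and the knowledge-based stage is not shown.

```python
import numpy as np

def detect_fall(acc_magnitude, fs, free_fall_g=0.6, impact_g=2.5, rest_g=1.15, rest_window=1.0):
    """Multiphase rule on the acceleration magnitude (in g): a free-fall dip followed
    shortly by an impact peak and then a period of rest. Returns (detected, impact_index)."""
    for i in np.where(acc_magnitude < free_fall_g)[0]:            # candidate free-fall samples
        impact_zone = acc_magnitude[i: i + int(1.0 * fs)]
        hits = np.where(impact_zone > impact_g)[0]
        if hits.size == 0:
            continue
        j = i + hits[0]                                           # impact sample
        rest_zone = acc_magnitude[j + int(0.5 * fs): j + int((0.5 + rest_window) * fs)]
        if rest_zone.size and np.all(np.abs(rest_zone - 1.0) < (rest_g - 1.0)):
            return True, j                                        # rest phase confirmed
    return False, -1
```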

  2. The Successive Projection Algorithm (SPA), an Algorithm with a Spatial Constraint for the Automatic Search of Endmembers in Hyperspectral Data

    PubMed Central

    Zhang, Jinkai; Rivard, Benoit; Rogge, D.M.

    2008-01-01

    Spectral mixing is a problem inherent to remote sensing data and results in few image pixel spectra representing "pure" targets. Linear spectral mixture analysis is designed to address this problem, and it assumes that the pixel-to-pixel variability in a scene results from varying proportions of spectral endmembers. In this paper we present a different endmember-search algorithm called the Successive Projection Algorithm (SPA). SPA builds on the convex geometry and orthogonal projection common to other endmember-search algorithms by including a constraint on the spatial adjacency of endmember candidate pixels. Consequently it can reduce the susceptibility to outlier pixels and generates realistic endmembers. This is demonstrated using two case studies (the AVIRIS Cuprite cube and Probe-1 imagery for Baffin Island) where image endmembers can be validated with ground truth data. The SPA algorithm extracts endmembers from hyperspectral data without having to reduce the data dimensionality. It uses the spectral angle (as in IEA) and the spatial adjacency of pixels in the image to constrain the selection of candidate pixels representing an endmember. We designed SPA based on the observation that many targets have spatial continuity in imagery (e.g. bedrock lithologies), and thus a spatial constraint is beneficial in the endmember search. An additional product of the SPA is data describing the change of the simplex volume ratio between successive iterations during the endmember extraction. It illustrates the influence of a new endmember on the data structure, provides information on the convergence of the algorithm, and can give a general guideline for constraining the total number of endmembers in a search. PMID:27879768
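
    The orthogonal-projection core of such an endmember search can be sketched as below on a (pixels x bands) matrix; the spatial-adjacency constraint that distinguishes the paper's SPA is deliberately omitted, so this is only the generic projection step.

```python
import numpy as np

def projection_endmembers(X, n_endmembers):
    """Orthogonal-projection endmember search: repeatedly pick the pixel with the largest
    residual after projecting out the endmembers found so far (no spatial constraint)."""
    indices = [int(np.argmax(np.linalg.norm(X, axis=1)))]          # start from the brightest pixel
    for _ in range(n_endmembers - 1):
        E = X[indices].T                                           # bands x current endmembers
        P = np.eye(X.shape[1]) - E @ np.linalg.pinv(E)             # projector onto their orthogonal complement
        residual = np.linalg.norm(X @ P, axis=1)                   # P is symmetric
        indices.append(int(np.argmax(residual)))
    return indices
```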

  3. Lidar detection algorithm for time and range anomalies

    NASA Astrophysics Data System (ADS)

    Ben-David, Avishai; Davidson, Charles E.; Vanderbeek, Richard G.

    2007-10-01

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.

  4. A layer reduction based community detection algorithm on multiplex networks

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Liu, Jing

    2017-04-01

    Detecting hidden communities is important for the analysis of complex networks. However, many algorithms have been designed for single layer networks (SLNs), while just a few approaches have been designed for multiplex networks (MNs). In this paper, we propose an algorithm based on layer reduction for detecting communities on MNs, termed LRCD-MNs. First, we improve a layer reduction algorithm termed neighaggre to combine similar layers and keep others separated. Then, we use neighaggre to find the community structure hidden in MNs. Experiments on real-life networks show that neighaggre can obtain higher relative entropy than the other algorithm. Moreover, we apply LRCD-MNs to some real-life and synthetic multiplex networks, and the results demonstrate that, although LRCD-MNs does not have the advantage in terms of modularity, it obtains higher values of surprise, a measure used to evaluate the quality of network partitions.

  5. A Robust Mine Detection Algorithm for Acoustic and Radar Images

    DTIC Science & Technology

    2000-10-01

    The fundamental detection technique builds on Hough transforms, as demonstrated on an NVL mine-hunting SBIR and on SAR ground target detection (cf. Williams, "IA-CHAMELEON: A SAR Wide Area Image Analysis Aid," Proc. ATRWG Workshop, Baltimore, MD, July 1996). The adaptive detection algorithm is applied to Systems Incorporated (PSI) ground-penetrating radar (GPR) data and to synthetic aperture radar (SAR) images.

  6. Label propagation algorithm based on local cycles for community detection

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-Kun; Fei, Song; Song, Chen; Tian, Xue; Ao, Yang-Yue

    2015-12-01

    Label propagation algorithm (LPA) has been proven to be an extremely fast method for community detection in large complex networks. However, an important issue of the algorithm has not yet been properly addressed: random update orders in the label propagation process hamper the robustness of the algorithm. We note that when there are multiple maximal labels among a node's neighbors, choosing the label of a neighbor from which there is a local cycle back to the node, instead of a random neighbor's label, can prevent labels from propagating among communities at random. In this paper, an improved LPA based on local cycles is given. We have evaluated the proposed algorithm on computer-generated networks with planted partition and on some real-world networks whose community structure is already known. The results show that the performance of the proposed approach is significantly improved.

  7. A Survey of Successful Evaluations of Program Visualization and Algorithm Animation Systems

    ERIC Educational Resources Information Center

    Urquiza-Fuentes, Jaime; Velazquez-Iturbide, J. Angel

    2009-01-01

    This article reviews successful educational experiences in using program and algorithm visualizations (PAVs). First, we survey a total of 18 PAV systems that were subject to 33 evaluations. We found that half of the systems have only been tested for usability, and those were shallow inspections. The rest were evaluated with respect to their…

  8. An Optimal Algorithm towards Successive Location Privacy in Sensor Networks with Dynamic Programming

    NASA Astrophysics Data System (ADS)

    Zhao, Baokang; Wang, Dan; Shao, Zili; Cao, Jiannong; Chan, Keith C. C.; Su, Jinshu

    In wireless sensor networks, preserving location privacy under successive inference attacks is extremely critical. Although this problem is NP-complete in general cases, we propose a dynamic programming based algorithm and prove it is optimal in special cases where the correlation only exists between p immediate adjacent observations.

  9. Detection of combined occurrences. [computer algorithms

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Carlson, F. R., Jr.

    1977-01-01

    In this paper it is supposed that the variables x_1, ..., x_n each have finite range, with the variable x_i taking on p_i possible values, and that the values of the variables are changing with time. It is supposed further that it is desired to detect occurrences in which some subset of the variables achieve particular values. Finally, it is supposed that the problem involves the detection of a large number of combined occurrences for a large number of changes of values of variables. Two efficient solutions for this problem are described. Both methods have the unusual property of being faster for systems where the sum p_1 + ... + p_n is larger. The first solution is error-free and suitable for most cases. The second solution is slightly more elegant and allows negation as well as conjunction, but is subject to the possibility of errors. An error analysis is given for the second method and an empirical study is reported.

  10. Detecting compact galactic binaries using a hybrid swarm-based algorithm

    NASA Astrophysics Data System (ADS)

    Bouffanais, Yann; Porter, Edward K.

    2016-03-01

    Compact binaries in our galaxy are expected to be one of the main sources of gravitational waves for the future eLISA mission. During the mission lifetime, many thousands of galactic binaries should be individually resolved. However, the identification of the sources and the extraction of the signal parameters in a noisy environment are real challenges for data analysis. So far, stochastic searches have proven to be the most successful for this problem. In this work, we present the first application of a swarm-based algorithm combining Particle Swarm Optimization and Differential Evolution. These algorithms have been shown to converge faster to global solutions on complicated likelihood surfaces than other stochastic methods. We first demonstrate the effectiveness of the algorithm for the case of a single binary in a 1-mHz search bandwidth. This interesting problem gave the algorithm plenty of opportunity to fail, as it can be easier to find a strong noise peak rather than the signal itself. After a successful detection of a fictitious low-frequency source, as well as the verification binary RXJ 0806.3 +1527 , we then applied the algorithm to the detection of multiple binaries, over different search bandwidths, in the cases of low and mild source confusion. In all cases, we show that we can successfully identify the sources and recover the true parameters within a 99% credible interval.
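
    A toy hybrid of particle-swarm velocity updates with a differential-evolution mutation of the personal bests is sketched below; it illustrates the class of algorithm only, not the tuned sampler used for the eLISA analysis, and all control parameters are assumed defaults.

```python
import numpy as np

def hybrid_pso_de(objective, bounds, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5, F=0.8):
    """Minimize objective over a box given by bounds (dim x 2 array) with a PSO/DE hybrid."""
    bounds = np.asarray(bounds, float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = lo + (hi - lo) * np.random.rand(n_particles, dim)
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(n_particles, dim), np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)        # PSO velocity update
        x = np.clip(x + v, lo, hi)
        # DE/rand/1 mutation of the personal bests, keeping improvements only.
        a, b, c = (np.random.randint(n_particles, size=n_particles) for _ in range(3))
        trial = np.clip(pbest[a] + F * (pbest[b] - pbest[c]), lo, hi)
        for i in range(n_particles):
            for cand in (x[i], trial[i]):
                f = objective(cand)
                if f < pbest_f[i]:
                    pbest[i], pbest_f[i] = cand.copy(), f
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()
```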

  11. Performance characterization of the dynamic programming obstacle detection algorithm.

    PubMed

    Gandhi, Tarak; Yang, Mau-Tsuen; Kasturi, Rangachar; Camps, Octavia I; Coraor, Lee D; McCandless, Jeffrey

    2006-05-01

    A computer vision-based system using images from an airborne aircraft can increase flight safety by aiding the pilot in detecting obstacles in the flight path so as to avoid mid-air collisions. Such a system fits naturally with the development of an external vision system proposed by NASA for use in high-speed civil transport aircraft with limited cockpit visibility. The detection techniques should provide a high detection probability for obstacles that can vary from subpixels to a few pixels in size, while maintaining a low false alarm probability in the presence of noise and severe background clutter. Furthermore, the detection algorithms must be able to report such obstacles in a timely fashion, imposing severe constraints on their execution time. For this purpose, we have implemented a number of algorithms to detect airborne obstacles using image sequences obtained from a camera mounted on an aircraft. This paper describes the methodology used for characterizing the performance of the dynamic programming obstacle detection algorithm and its special cases. The experimental results were obtained using several types of image sequences, with simulated and real backgrounds. The approximate performance of the algorithm is also derived theoretically, using principles of statistical analysis, in terms of the signal-to-noise ratio (SNR) required for the probabilities of false alarms and misdetections to be lower than prespecified values. The theoretical and experimental performance are compared in terms of the required SNR.

  12. Road detection in SAR images using a tensor voting algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Dajiang; Hu, Chun; Yang, Bing; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper, the problem of the detection of road networks in Synthetic Aperture Radar (SAR) images is addressed. Most of the previous methods extract the road by detecting lines and reconstructing the network. Traditional algorithms used in the reconstruction step, such as MRFs, GA, and Level Set, are iterative. The tensor voting methodology we propose is non-iterative and insensitive to initialization. Furthermore, the only free parameter is the size of the neighborhood, which is related to the scale. The presented algorithm is verified to be effective when applied to road extraction from real Radarsat images.

  13. Space Object Maneuver Detection Algorithms Using TLE Data

    NASA Astrophysics Data System (ADS)

    Pittelkau, M.

    2016-09-01

    An important aspect of Space Situational Awareness (SSA) is detection of deliberate and accidental orbit changes of space objects. Although space surveillance systems detect orbit maneuvers within their tracking algorithms, maneuver data are not readily disseminated for general use. However, two-line element (TLE) data is available and can be used to detect maneuvers of space objects. This work is an attempt to improve upon existing TLE-based maneuver detection algorithms. Three adaptive maneuver detection algorithms are developed and evaluated: The first is a fading-memory Kalman filter, which is equivalent to the sliding-window least-squares polynomial fit, but computationally more efficient and adaptive to the noise in the TLE data. The second algorithm is based on a sample cumulative distribution function (CDF) computed from a histogram of the magnitude-squared |ΔV|^2 of the change-in-velocity vectors (ΔV), which are computed from the TLE data. A maneuver detection threshold is computed from the median estimated from the CDF, or from the CDF and a specified probability of false alarm. The third algorithm is a median filter. The median filter is the simplest of a class of nonlinear filters called order statistics filters, which falls within the theory of robust statistics. The output of the median filter is practically insensitive to outliers, or large maneuvers. The median of the |ΔV|^2 data is proportional to the variance of ΔV, so the variance is estimated from the output of the median filter. A maneuver is detected when the input data exceeds a constant times the estimated variance.
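
    A minimal sketch of the third (median-filter) detector described above, assuming a 1-D series of |ΔV|^2 values obtained by differencing consecutive TLEs; the window length, the use of the global median as the scale estimate, and the threshold factor are illustrative choices, not the paper's tuned values.

      import numpy as np
      from scipy.signal import medfilt

      def detect_maneuvers(dv_squared, window=11, k=5.0):
          """Flag samples whose |dV|^2 greatly exceeds a robust, median-based noise level.

          dv_squared : 1-D array of squared change-in-velocity magnitudes from TLE differencing.
          window     : odd length of the sliding median filter.
          k          : threshold factor applied to the median-based scale estimate.
          """
          dv_squared = np.asarray(dv_squared, dtype=float)
          baseline = medfilt(dv_squared, kernel_size=window)   # robust local noise level
          scale = np.median(dv_squared)                        # global robust scale (tracks the variance)
          threshold = baseline + k * scale
          return np.flatnonzero(dv_squared > threshold)

      # Toy usage: quiet noise with two injected maneuvers at indices 120 and 340.
      rng = np.random.default_rng(1)
      series = rng.chisquare(3, size=500) * 1e-8
      series[[120, 340]] += 5e-6
      print(detect_maneuvers(series))   # includes the two injected maneuver indices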

  14. Statistical iterative reconstruction using fast optimization transfer algorithm with successively increasing factor in Digital Breast Tomosynthesis

    NASA Astrophysics Data System (ADS)

    Xu, Shiyu; Zhang, Zhenxi; Chen, Ying

    2014-03-01

    Statistical iterative reconstruction is particularly promising because it provides the flexibility of accurate physical noise modeling and geometric system description in transmission tomography systems. However, solving the objective function is computationally intensive compared to analytical reconstruction methods, owing to the multiple iterations needed for convergence, with each iteration involving forward/back-projections through a complex geometric system model. Optimization transfer (OT) is a general algorithm converting a high-dimensional optimization into parallel 1-D updates. OT-based algorithms provide monotonic convergence and a parallel computing framework, but a slower convergence rate, especially near the global optimum. Based on an indirect estimate of the spectrum of the OT convergence rate matrix, we propose a successively increasing factor-scaled optimization transfer (OT) algorithm that seeks an optimal step size for a faster rate. Compared to a representative OT-based method such as separable parabolic surrogate with pre-computed curvature (PC-SPS), our algorithm provides comparable image quality (IQ) with fewer iterations. Each iteration retains a similar computational cost to PC-SPS. An initial experiment with a simulated Digital Breast Tomosynthesis (DBT) system shows that the proposed algorithm saves 40% of the total computing time. In general, the successively increasing factor-scaled OT shows tremendous potential as an iterative method with parallel computation and monotonic, global convergence at a fast rate.

  15. Convergence Rate of the Successive Zooming Genetic Algorithm for Band-Widths of Equality Constraint

    NASA Astrophysics Data System (ADS)

    Kwon, Y. D.; Han, S. W.; Do, J. W.

    Modern optimization techniques, such as the steepest descent method, Newton's method, Rosen's gradient projection method, genetic algorithms, etc., have been developed and quickly improved with the progress of digital computers. The steepest descent method and Newton's method are applied efficiently to unconstrained problems. For many engineering problems involving constraints, the genetic algorithm and SUMT [1] are applied with relative ease. Genetic algorithms [2] have global search characteristics and relatively good convergence rates. Recently, a Successive Zooming Genetic Algorithm (SZGA) [3,4] was introduced that can search for the precise optimal solution at any level of desired accuracy. In the case of engineering problems involving an equality constraint, even if good optimization techniques are applied to the constrained problem, a proper constraint range can lead to more rapid convergence and a more precise solution. This study investigated the proper band-width of an equality constraint using the Successive Zooming Genetic Algorithm (SZGA) technique both theoretically and numerically. For each problem we were able to find a band-width range that gives rapid convergence, as well as a broader but more general one.
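
    The successive zooming idea itself can be sketched compactly: an inner genetic search runs within the current bounds, then the bounds shrink around the best candidate by a zoom factor and the search repeats. The sketch below is schematic (deliberately minimal GA operators and an illustrative penalty-style objective), not the authors' SZGA code.

      import numpy as np

      def simple_ga(objective, lo, hi, pop_size=30, generations=40, rng=None):
          """Very small real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
          rng = rng or np.random.default_rng()
          dim = lo.size
          pop = rng.uniform(lo, hi, size=(pop_size, dim))
          for _ in range(generations):
              fitness = np.array([objective(p) for p in pop])
              new_pop = [pop[np.argmin(fitness)]]                      # keep the elite individual
              while len(new_pop) < pop_size:
                  i, j = rng.integers(pop_size, size=2)
                  a = pop[i] if fitness[i] < fitness[j] else pop[j]    # tournament parent 1
                  i, j = rng.integers(pop_size, size=2)
                  b = pop[i] if fitness[i] < fitness[j] else pop[j]    # tournament parent 2
                  alpha = rng.random(dim)
                  child = alpha * a + (1 - alpha) * b                  # blend crossover
                  child += rng.normal(0, 0.05 * (hi - lo), dim)        # Gaussian mutation
                  new_pop.append(np.clip(child, lo, hi))
              pop = np.array(new_pop)
          fitness = np.array([objective(p) for p in pop])
          return pop[np.argmin(fitness)]

      def successive_zooming_ga(objective, bounds, zoom=0.5, stages=6, seed=0):
          """Repeatedly run the GA and shrink the search box around the best point found."""
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(bounds, dtype=float).T
          for _ in range(stages):
              best = simple_ga(objective, lo, hi, rng=rng)
              half = zoom * (hi - lo) / 2.0
              lo, hi = np.maximum(lo, best - half), np.minimum(hi, best + half)
          return best

      # Toy usage: a quadratic with a penalty term standing in for an equality constraint x + y = -1.
      f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2 + 10.0 * abs(p[0] + p[1] + 1.0)
      print(successive_zooming_ga(f, bounds=[(-10, 10), (-10, 10)]))   # converges near (1.0, -2.0)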

  16. Road detection in spaceborne SAR images using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Jeon, Byoungki; Jang, JeongHun; Hong, KiSang

    2000-08-01

    This paper presents a technique for detection of roads in a spaceborne SAR image using a genetic algorithm. Roads in a spaceborne SAR image can be modelled as curvilinear structures with some thickness. Curve segments, which represent candidate positions of roads, are extracted from the image using a curvilinear structure detector, and roads are detected accurately by grouping those curve segments. For this purpose, we designed a grouping method based on a genetic algorithm (GA), one of the global optimization methods, combined it with perceptual grouping factors, and reduced its overall computational cost by introducing a thresholding operation and a region-growing concept. To detect roads more accurately, postprocessing, including noisy curve segment removal, is performed after grouping. We applied our method to ERS-1 SAR images with a resolution of about 30 meters, and the experimental results show that our method can detect roads accurately and is much faster than a globally applied GA approach.

  17. Stepping community detection algorithm based on label propagation and similarity

    NASA Astrophysics Data System (ADS)

    Li, Wei; Huang, Ce; Wang, Miao; Chen, Xi

    2017-04-01

    Community or module structure is one of the most common features in complex networks. The label propagation algorithm (LPA) is a near linear time algorithm that is able to detect community structure effectively. Nevertheless, when labeling a node, the LPA adopts the label belonging to the majority of its neighbors, which means that it treats all neighbors equally in spite of their different effects on the node. Another disadvantage of LPA is that the results it generates are not unique. In this paper, we propose a modified LPA called Stepping LPA-S, in which labels are propagated by similarity. Furthermore, our algorithm divides networks using a stepping framework, and uses an evaluation function proposed in this paper to select the final unique partition. We tested this algorithm on several artificial and real-world networks. The results show that Stepping LPA-S can obtain accurate and meaningful community structure without prior information.
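
    A compact sketch of similarity-weighted label propagation in the spirit described above (not the authors' Stepping LPA-S implementation): each node adopts the label whose neighboring carriers have the largest total similarity, here using a simple common-neighbor count as an illustrative stand-in for the paper's similarity measure.

      import networkx as nx
      from collections import defaultdict

      def similarity(G, u, v):
          """Illustrative similarity: number of common neighbors plus the edge itself."""
          return 1.0 + len(set(G[u]) & set(G[v]))

      def weighted_label_propagation(G, max_iter=50):
          labels = {v: v for v in G}                    # every node starts in its own community
          for _ in range(max_iter):
              changed = False
              for v in G:
                  score = defaultdict(float)
                  for u in G[v]:
                      score[labels[u]] += similarity(G, u, v)   # weight each vote by similarity
                  if score:
                      best = max(score, key=score.get)
                      if best != labels[v]:
                          labels[v] = best
                          changed = True
              if not changed:
                  break
          return labels

      # Toy usage on the karate club network.
      G = nx.karate_club_graph()
      communities = defaultdict(set)
      for node, lab in weighted_label_propagation(G).items():
          communities[lab].add(node)
      print(len(communities), "communities found")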

  18. An Adaptive Immune Genetic Algorithm for Edge Detection

    NASA Astrophysics Data System (ADS)

    Li, Ying; Bai, Bendu; Zhang, Yanning

    An adaptive immune genetic algorithm (AIGA) based on a cost minimization technique for edge detection is proposed. The proposed AIGA uses adaptive probabilities of crossover, mutation and immune operation, and a geometric annealing schedule in the immune operator, to realize the twin goals of maintaining diversity in the population and sustaining a fast convergence rate when solving complex problems such as edge detection. Furthermore, AIGA can effectively exploit prior knowledge and information about the local edge structure in the edge image to make vaccines, which results in much better local search ability than that of the canonical genetic algorithm. Experimental results on gray-scale images show that the proposed algorithm performs well in terms of quality of the final edge image, rate of convergence and robustness to noise.

  19. Detecting Outliers in Factor Analysis Using the Forward Search Algorithm

    ERIC Educational Resources Information Center

    Mavridis, Dimitris; Moustaki, Irini

    2008-01-01

    In this article we extend and implement the forward search algorithm for identifying atypical subjects/observations in factor analysis models. The forward search has been mainly developed for detecting aberrant observations in regression models (Atkinson, 1994) and in multivariate methods such as cluster and discriminant analysis (Atkinson, Riani,…

  20. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is becoming more necessary in program design courses in college education. However, the trick of plagiarizing plus a little modification exists in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…

  1. Hyperspectral Detection and Discrimination Using the ACE Algorithm

    DTIC Science & Technology

    2011-08-08

    Proceedings, AUG 2011 - SEPT 2011; contract FA8720-05-C-0002; M. L. Pieper et al. Only a fragment of the abstract survives: "...relative to the background. If an object spectrum has a close resemblance to its surroundings, it will..."

  2. Information dynamics algorithm for detecting communities in networks

    NASA Astrophysics Data System (ADS)

    Massaro, Emanuele; Bagnoli, Franco; Guazzini, Andrea; Lió, Pietro

    2012-11-01

    The problem of community detection is relevant in many scientific disciplines, from social science to statistical physics. Given the impact of community detection in many areas, such as psychology and the social sciences, we have addressed the issue of modifying existing well-performing algorithms by incorporating elements of the application domain, i.e., making them domain-inspired. We have focused on a psychology- and social-network-inspired approach which may be useful for further strengthening the link between social network studies and the mathematics of community detection. Here we introduce a community-detection algorithm derived from van Dongen's Markov Cluster algorithm (MCL) [4] by considering the network's nodes as agents capable of taking decisions. In this framework we have introduced a memory factor to mimic a typical human behavior, the oblivion effect. The method is based on information diffusion and includes a non-linear processing phase. We test our method on two classical community benchmarks and on computer-generated networks with known community structure. Our approach has three important features: the capacity to detect overlapping communities, the capability to identify communities from an individual point of view, and the fine tuning of community detectability with respect to prior knowledge of the data. Finally we discuss how to use a Shannon entropy measure for parameter estimation in complex networks.

  3. A Monte Carlo Evaluation of Weighted Community Detection Algorithms

    PubMed Central

    Gates, Kathleen M.; Henry, Teague; Steinley, Doug; Fair, Damien A.

    2016-01-01

    The past decade has been marked with a proliferation of community detection algorithms that aim to organize nodes (e.g., individuals, brain regions, variables) into modular structures that indicate subgroups, clusters, or communities. Motivated by the emergence of big data across many fields of inquiry, these methodological developments have primarily focused on the detection of communities of nodes from matrices that are very large. However, it remains unknown if the algorithms can reliably detect communities in smaller graph sizes (i.e., 1000 nodes and fewer), which are commonly used in brain research. More importantly, these algorithms have predominantly been tested only on binary or sparse count matrices, and it remains unclear to what degree the algorithms can recover community structure for different types of matrices, such as the often-used cross-correlation matrices representing functional connectivity across predefined brain regions. Of the publicly available approaches for weighted graphs that can detect communities in graph sizes of at least 1000, prior research has demonstrated that Newman's spectral approach (i.e., Leading Eigenvalue), Walktrap, Fast Modularity, the Louvain method (i.e., multilevel community method), Label Propagation, and Infomap all recover communities exceptionally well in certain circumstances. The purpose of the present Monte Carlo simulation study is to test these methods across a large number of conditions, including varied graph sizes and types of matrix (sparse count, correlation, and reflected Euclidean distance), to identify which algorithm is optimal for specific types of data matrices. The results indicate that when the data are in the form of sparse count networks (such as those seen in diffusion tensor imaging), Label Propagation and Walktrap surfaced as the most reliable methods for community detection. For dense, weighted networks such as correlation matrices capturing functional connectivity, Walktrap consistently

  4. An algorithm for seizure onset detection using intracranial EEG.

    PubMed

    Kharbouch, Alaa; Shoeb, Ali; Guttag, John; Cash, Sydney S

    2011-12-01

    This article addresses the problem of real-time seizure detection from intracranial EEG (IEEG). One difficulty in creating an approach that can be used for many patients is the heterogeneity of seizure IEEG patterns across different patients and even within a patient. In addition, simultaneously maximizing sensitivity and minimizing latency and false detection rates has been challenging, as these are competing objectives. Automated machine learning systems provide a mechanism for dealing with these hurdles. Here we present and evaluate an algorithm for real-time seizure onset detection from IEEG using a machine-learning approach that permits a patient-specific solution. We extract temporal and spectral features across all intracranial EEG channels. A pattern recognition component is trained using these feature vectors and tested against unseen continuous data from the same patient. When tested on more than 875 hours of IEEG data from 10 patients, the algorithm detected 97% of 67 test seizures of several types with a median detection delay of 5 seconds and a median false alarm rate of 0.6 false alarms per 24-hour period. The sensitivity was 100% for 8 of 10 patients. These results indicate that a sensitive, specific, and relatively short-latency detection system based on machine learning, using a full set of intracranial electrodes and tailored to individual patients, can be employed for seizure detection from EEG. This article is part of a Supplemental Special Issue entitled The Future of Automated Seizure Detection and Prediction.

  5. An Efficient Conflict Detection Algorithm for Packet Filters

    NASA Astrophysics Data System (ADS)

    Lee, Chun-Liang; Lin, Guan-Yu; Chen, Yaw-Chung

    Packet classification is essential for supporting advanced network services such as firewalls, quality-of-service (QoS), virtual private networks (VPN), and policy-based routing. The rules that routers use to classify packets are called packet filters. If two or more filters overlap, a conflict occurs and leads to ambiguity in packet classification. This study proposes an algorithm that can efficiently detect and resolve filter conflicts using tuple based search. The time complexity of the proposed algorithm is O(nW+s), and the space complexity is O(nW), where n is the number of filters, W is the number of bits in a header field, and s is the number of conflicts. This study uses the synthetic filter databases generated by ClassBench to evaluate the proposed algorithm. Simulation results show that the proposed algorithm can achieve better performance than existing conflict detection algorithms both in time and space, particularly for databases with large numbers of conflicts.
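
    For concreteness, a brute-force reference check of what a filter conflict means for range-based fields is sketched below: two filters conflict when their ranges overlap in every field yet neither filter contains the other, so the matching result depends on rule order. The field layout is an illustrative assumption, and this O(n^2) check is only a baseline, not the paper's O(nW+s) tuple-based algorithm.

      def overlaps(a, b):
          """Ranges a=(lo, hi) and b=(lo, hi) intersect."""
          return a[0] <= b[1] and b[0] <= a[1]

      def contains(a, b):
          """Range a fully contains range b."""
          return a[0] <= b[0] and b[1] <= a[1]

      def find_conflicts(filters):
          """Filters are dicts mapping field name -> (lo, hi) range; all filters share the same fields."""
          conflicts = []
          for i in range(len(filters)):
              for j in range(i + 1, len(filters)):
                  fi, fj = filters[i], filters[j]
                  if all(overlaps(fi[k], fj[k]) for k in fi):
                      i_in_j = all(contains(fj[k], fi[k]) for k in fi)
                      j_in_i = all(contains(fi[k], fj[k]) for k in fi)
                      if not (i_in_j or j_in_i):
                          conflicts.append((i, j))
          return conflicts

      # Toy usage with two fields per filter.
      rules = [
          {"src": (0, 255),   "dst": (10, 20)},    # rule 0
          {"src": (100, 300), "dst": (0, 15)},     # rule 1: overlaps rule 0, neither contains the other
      ]
      print(find_conflicts(rules))   # -> [(0, 1)]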

  6. Statistical algorithms for target detection in coherent active polarimetric images.

    PubMed

    Goudail, F; Réfrégier, P

    2001-12-01

    We address the problem of small-target detection with a polarimetric imager that provides orthogonal state contrast images. Such active systems allow one to measure the degree of polarization of the light backscattered by purely depolarizing isotropic materials. To be independent of the spatial nonuniformities of the illumination beam, small-target detection on the orthogonal state contrast image must be performed without using the image of backscattered intensity. We thus propose and develop a simple and efficient target detection algorithm based on a nonlinear pointwise transformation of the orthogonal state contrast image followed by a maximum-likelihood algorithm optimal for additive Gaussian perturbations. We demonstrate the efficiency of this suboptimal technique in comparison with the optimal one, which, however, assumes a priori knowledge about the scene that is not available in practice. We illustrate the performance of this approach on both simulated and real polarimetric images.

  7. Toward an Objective Enhanced-V Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Brunner, Jason; Feltz, Wayne; Moses, John; Rabin, Robert; Ackerman, Steven

    2007-01-01

    The area of coldest cloud tops above thunderstorms sometimes has a distinct V or U shape. This pattern, often referred to as an "enhanced-V" signature, has been observed to occur during and preceding severe weather in previous studies. This study describes an algorithmic approach to objectively detect enhanced-V features with observations from the Geostationary Operational Environmental Satellite and Low Earth Orbit data. The methodology consists of cross correlation statistics of pixels and thresholds of enhanced-V quantitative parameters. The effectiveness of the enhanced-V detection method will be examined using Geostationary Operational Environmental Satellite, MODerate-resolution Imaging Spectroradiometer, and Advanced Very High Resolution Radiometer image data from case studies in the 2003-2006 seasons. The main goal of this study is to develop an objective enhanced-V detection algorithm for future implementation into operations with future sensors, such as GOES-R.

  8. Advanced defect detection algorithm using clustering in ultrasonic NDE

    NASA Astrophysics Data System (ADS)

    Gongzhang, Rui; Gachagan, Anthony

    2016-02-01

    A range of materials used in industry exhibit scattering properties which limit ultrasonic NDE. Many algorithms have been proposed to enhance defect detection ability, such as the well-known Split Spectrum Processing (SSP) technique. Scattering noise usually cannot be fully removed, and the remaining noise can easily be confused with real feature signals, hence becoming artefacts during the image interpretation stage. This paper presents an advanced algorithm to further reduce the influence of artefacts remaining in A-scan data after processing with a conventional defect detection algorithm. The raw A-scan data can be acquired from either traditional single-transducer or phased array configurations. The proposed algorithm uses the concept of unsupervised machine learning to cluster segmental defect signals from pre-processed A-scans into different classes. The distinction and similarity between each class and an ensemble of randomly selected noise segments can be observed by applying a classification algorithm. Each class is then labelled as "legitimate reflector" or "artefact" based on this observation, and the expected probability of detection (PoD) and probability of false alarm (PFA) determined. To facilitate data collection and validate the proposed algorithm, a 5 MHz linear array transducer is used to collect A-scans from both austenitic steel and Inconel samples. Each pulse-echo A-scan is pre-processed using SSP, and the subsequent application of the proposed clustering algorithm provides an additional reduction in PFA while maintaining PoD for both samples compared with SSP results alone.

  9. Common pharmacophore identification using frequent clique detection algorithm.

    PubMed

    Podolyan, Yevgeniy; Karypis, George

    2009-01-01

    The knowledge of a pharmacophore, or the 3D arrangement of features in the biologically active molecule that is responsible for its pharmacological activity, can help in the search and design of a new or better drug acting upon the same or related target. In this paper, we describe two new algorithms based on the frequent clique detection in the molecular graphs. The first algorithm mines all frequent cliques that are present in at least one of the conformers of each (or a portion of all) molecules. The second algorithm exploits the similarities among the different conformers of the same molecule and achieves an order of magnitude performance speedup compared to the first algorithm. Both algorithms are guaranteed to find all common pharmacophores in the data set, which is confirmed by the validation on the set of molecules for which pharmacophores have been determined experimentally. In addition, these algorithms are able to scale to data sets with arbitrarily large number of conformers per molecule and identify multiple ligand binding modes or multiple binding sites of the target.

  10. Common Pharmacophore Identification Using Frequent Clique Detection Algorithm

    PubMed Central

    Podolyan, Yevgeniy; Karypis, George

    2008-01-01

    The knowledge of a pharmacophore, or the 3D arrangement of features in the biologically active molecule that is responsible for its pharmacological activity, can help in the search and design of a new or better drug acting upon the same or related target. In this paper we describe two new algorithms based on the frequent clique detection in the molecular graphs. The first algorithm mines all frequent cliques that are present in at least one of the conformers of each (or a portion of all) molecules. The second algorithm exploits the similarities among the different conformers of the same molecule and achieves an order of magnitude performance speedup compared to the first algorithm. Both algorithms are guaranteed to find all common pharmacophores in the dataset, which is confirmed by the validation on the set of molecules for which pharmacophores have been determined experimentally. In addition, these algorithms are able to scale to datasets with arbitrarily large number of conformers per molecule and identify multiple ligand binding modes or multiple binding sites of the target. PMID:19072298

  11. Algorithm for detecting human faces based on convex-hull.

    PubMed

    Park, Minsick; Park, Chang-Woo; Park, Mignon; Lee, Chang-Hoon

    2002-03-25

    In this paper, we propose a new method to detect faces in color images based on the convex hull. We detect two kinds of regions: skin-like and hair-like regions. After preprocessing, we apply the convex hull to these regions and find a face from their intersection relationship. The proposed algorithm can accomplish face detection in an image involving rotated and turned faces as well as several faces. To validate the effectiveness of the proposed method, we conducted experiments on various cases.

  12. Development of an IMU-based foot-ground contact detection (FGCD) algorithm.

    PubMed

    Kim, Myeongkyu; Lee, Donghun

    2017-03-01

    It is well known that, to locate humans in GPS-denied environments, a lower-limb kinematic solution based on an Inertial Measurement Unit (IMU), force plate, and pressure insoles is essential. The force plate and pressure insole are used to detect foot-ground contacts. However, the use of multiple sensors is not desirable in most cases. This paper documents the development of an IMU-based FGCD (foot-ground contact detection) algorithm considering variations of both walking terrain and speed. All IMU outputs showing significant changes at the moments of the foot-ground contact phases are fully identified through experiments on five walking terrains. For the experiment on each walking terrain, variations of walking speed are also examined to confirm the correlations between walking speed and the main parameters in the FGCD algorithm. The resulting FGCD algorithm successfully detects four contact phases, and its performance is validated experimentally. Practitioner Summary: In this research, it was demonstrated that the four contact phases of Heel strike (or Toe strike), Full contact, Heel off and Toe off can be independently detected, regardless of the walking speed and walking terrain, based on detection criteria composed of the ranges and the rates of change of the main parameters measured from the Inertial Measurement Unit sensors.

  13. Moment feature based fast feature extraction algorithm for moving object detection using aerial images.

    PubMed

    Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid

    2015-01-01

    Fast and computationally less complex feature extraction for moving object detection using aerial images from unmanned aerial vehicles (UAVs) remains an elusive goal in the field of computer vision research. The types of features used in current studies concerning moving object detection are typically chosen based on improving the detection rate rather than on providing fast and computationally less complex feature extraction methods. Because moving object detection using aerial images from UAVs involves motion as seen from a certain altitude, effective and fast feature extraction is a vital issue for optimum detection performance. This research proposes a two-layer bucket approach based on a new feature extraction algorithm referred to as the moment-based feature extraction algorithm (MFEA). Because a moment represents the coherent intensity of pixels and motion estimation is a measurement of motion pixel intensity, this research used this relation to develop the proposed algorithm. The experimental results reveal the successful performance of the proposed MFEA algorithm and the proposed methodology.
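
    As a generic illustration of the kind of moment feature the abstract refers to (not the authors' two-layer bucket method), the sketch below computes raw and second-order central moments of an intensity image with NumPy; changes in these features between consecutive aerial frames can serve as a cheap motion cue.

      import numpy as np

      def raw_moment(img, p, q):
          """Raw moment M_pq = sum over pixels of x^p * y^q * I(y, x)."""
          y, x = np.mgrid[:img.shape[0], :img.shape[1]]
          return np.sum((x ** p) * (y ** q) * img)

      def moment_features(img):
          """Total intensity, centroid, and second-order central moments of an intensity image."""
          img = img.astype(float)
          m00 = raw_moment(img, 0, 0)
          cx, cy = raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00
          y, x = np.mgrid[:img.shape[0], :img.shape[1]]
          mu20 = np.sum((x - cx) ** 2 * img)
          mu02 = np.sum((y - cy) ** 2 * img)
          mu11 = np.sum((x - cx) * (y - cy) * img)
          return np.array([m00, cx, cy, mu20, mu02, mu11])

      # A change in these features between two frames hints at pixel motion.
      frame_a = np.zeros((64, 64)); frame_a[10:20, 10:20] = 1.0
      frame_b = np.zeros((64, 64)); frame_b[12:22, 15:25] = 1.0
      print(moment_features(frame_b) - moment_features(frame_a))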

  14. Algorithms for the detection of chewing behavior in dietary monitoring applications

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Helal, Abdelsalam; Mendez-Vasquez, Andres

    2009-08-01

    The detection of food consumption is key to the implementation of successful behavior modification in support of dietary monitoring and therapy, for example, during the course of controlling obesity, diabetes, or cardiovascular disease. Since the vast majority of humans consume food via mastication (chewing), we have designed an algorithm that automatically detects chewing behaviors in surveillance video of a person eating. Our algorithm first detects the mouth region, then computes the spatiotemporal frequency spectrum of a small perioral region (including the mouth). Spectral data are analyzed to determine the presence of periodic motion that characterizes chewing. A classifier is then applied to discriminate different types of chewing behaviors. Our algorithm was tested on seven volunteers, whose behaviors included chewing with mouth open, chewing with mouth closed, talking, static face presentation (control case), and moving face presentation. Early test results show that the chewing behaviors induce a temporal frequency peak at 0.5 Hz to 2.5 Hz, which is readily detected using a distance-based classifier. Computational cost is analyzed for implementation on embedded processing nodes, for example, in a healthcare sensor network. Complexity analysis emphasizes the relationship between the work and space estimates of the algorithm, and its estimated error. It is shown that chewing detection is possible within a computationally efficient, accurate, and subject-independent framework.
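
    A minimal sketch of the spectral test described above: the dominant temporal frequency of a perioral motion signal is located, and chewing is flagged when the peak and most of the spectral energy fall in roughly the 0.5-2.5 Hz band. How the motion signal is obtained, the band edges, and the energy-ratio threshold are assumptions for illustration.

      import numpy as np

      def is_chewing(motion_signal, fs, band=(0.5, 2.5), min_ratio=0.4):
          """Classify a 1-D perioral motion signal as chewing if its spectral energy peaks in `band`.

          motion_signal : samples of mean perioral intensity change per frame.
          fs            : video frame rate in Hz.
          min_ratio     : minimum fraction of (non-DC) spectral energy required inside the band.
          """
          signal = np.asarray(motion_signal, dtype=float)
          signal = signal - signal.mean()                 # remove DC before the FFT
          spectrum = np.abs(np.fft.rfft(signal)) ** 2
          freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
          in_band = (freqs >= band[0]) & (freqs <= band[1])
          ratio = spectrum[in_band].sum() / spectrum[1:].sum()
          peak = freqs[np.argmax(spectrum[1:]) + 1]       # ignore the DC bin
          return ratio > min_ratio and band[0] <= peak <= band[1]

      # Toy usage: a 1.5 Hz oscillation sampled at 30 fps should be flagged as chewing.
      t = np.arange(0, 10, 1 / 30.0)
      print(is_chewing(np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size), fs=30))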

  15. Density shrinking algorithm for community detection with path based similarity

    NASA Astrophysics Data System (ADS)

    Wu, Jianshe; Hou, Yunting; Jiao, Yang; Li, Yong; Li, Xiaoxiao; Jiao, Licheng

    2015-09-01

    Community structure is ubiquitous in real-world complex networks. Finding the communities is the key to understanding the functions of those networks. Much work has been done on designing algorithms for community detection, but it remains a challenge in the field. Traditional modularity optimization suffers from the resolution limit problem. Recent research shows that combining the density-based technique with modularity optimization can overcome the resolution limit, and an efficient algorithm named DenShrink was provided. The main procedure of DenShrink is to repeatedly find and merge micro-communities (in a broad sense) into super nodes until they cannot be merged further. Analyses in this paper show that if this procedure is replaced by finding and merging only dense pairs, both the detection accuracy and the runtime can be clearly improved. Thus an improved density-based algorithm, ImDS, is provided. Owing to their time complexity, path-based similarity indexes are difficult to apply in community detection with high performance. In this paper, the path-based Katz index is simplified and used in the ImDS algorithm.
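
    A short sketch of a truncated (simplified) Katz similarity of the kind alluded to above: paths up to a small length L are summed with an attenuation factor beta, avoiding the matrix inversion required by the exact index. The truncation length and beta value are illustrative.

      import numpy as np
      import networkx as nx

      def truncated_katz(G, beta=0.1, max_len=3):
          """Pairwise similarity S = sum_{l=1..max_len} beta^l * A^l (truncated Katz index)."""
          A = nx.to_numpy_array(G)
          S = np.zeros_like(A)
          power = np.eye(A.shape[0])
          for l in range(1, max_len + 1):
              power = power @ A              # A^l counts paths of length l
              S += (beta ** l) * power
          return S

      # Node pairs inside the same dense region score higher, so S can replace the plain
      # adjacency weights when deciding which dense pairs to merge into super nodes.
      G = nx.karate_club_graph()
      S = truncated_katz(G)
      print(S[0, 1], S[0, 33])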

  16. Clever eye algorithm for target detection of remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Geng, Xiurui; Ji, Luyan; Sun, Kang

    2016-04-01

    Target detection algorithms for hyperspectral remote sensing imagery, such as the two most commonly used remote sensing detection algorithms, the constrained energy minimization (CEM) and matched filter (MF), can usually be attributed to the inner product between a weight filter (or detector) and a pixel vector. CEM and MF have the same expression except that MF requires data centralization first. However, this difference leads to a difference in the target detection results. That is to say, the selection of the data origin can directly affect the performance of the detector. Does there exist, then, a data origin other than the zero point and the mean vector that gives better target detection performance? This is a very meaningful issue in the field of target detection, but it has not yet received enough attention. In this study, we propose a novel objective function by introducing the data origin as another variable; the solution of the function corresponds to the data origin with the minimal output energy. The process of finding the optimal solution can be vividly regarded as a clever eye automatically searching for the best observing position and direction in the feature space, corresponding to the largest separation between the target and the background. Therefore, this new algorithm is referred to as the clever eye algorithm (CE). Based on the Sherman-Morrison formula and the gradient ascent method, CE derives the optimal target detection result in terms of energy. Experiments with both synthetic and real hyperspectral data have verified the effectiveness of our method.
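
    For reference, the fixed-origin baseline that CE generalizes can be written compactly: the classical CEM filter is w = R^{-1}d / (d^T R^{-1}d), with R the sample autocorrelation matrix and d the target spectrum, and centering the data first turns the same expression into the matched filter. The sketch below implements only this baseline, not the clever eye origin search, and the toy data are illustrative.

      import numpy as np

      def cem_detector(X, d, centered=False):
          """Constrained Energy Minimization detector.

          X : (n_pixels, n_bands) hyperspectral data matrix.
          d : (n_bands,) target spectrum.
          centered : subtract the data mean first, which yields the matched filter (MF).
          """
          X = np.asarray(X, dtype=float)
          d = np.asarray(d, dtype=float)
          if centered:
              mean = X.mean(axis=0)
              X, d = X - mean, d - mean
          R = X.T @ X / X.shape[0]                 # autocorrelation (covariance if centered)
          Rinv_d = np.linalg.solve(R, d)
          w = Rinv_d / (d @ Rinv_d)                # filter with unit response to the target
          return X @ w                             # detection scores, one per pixel

      # Toy usage: background pixels plus a few pixels containing the target spectrum.
      rng = np.random.default_rng(0)
      data = rng.normal(size=(1000, 20))
      target = rng.normal(size=20)
      data[:5] += 2.0 * target
      scores = cem_detector(data, target)
      print(np.argsort(scores)[-5:])               # the highest scores should be pixels 0..4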

  17. Detection Systems and Algorithms for Multiplexed Quantum Dots

    NASA Astrophysics Data System (ADS)

    Goss, Kelly Christine

    Quantum Dots (QDs) are semiconductor nanocrystals that absorb light and re-emit at a wavelength dependent on their size and shape. A group of quantum dots can be designed to have a unique spectral emission by varying the size of the quantum dots (wavelength) and the number of quantum dots (optical power) [1]. This technology is referred to as Multiplexed Quantum Dots (MxQD), and when it was first proposed, MxQD tags were created with 6 optical power levels and one QD colour, or 3 QD colours and 2 optical power levels. It was hypothesized that a realistic limit to the number of tags would be a system of 6 optical power levels and 6 QD colours, resulting in 46655 unique tags. In recent work, the fabrication and detection of 9 unique tags [2] was demonstrated, which is still far from the predicted capability of the technology. The limitations on the number of unique tags come from both the fabrication methods and the data detection algorithms used to read the spectral emissions. This thesis makes contributions toward improving the data detection algorithms for MxQD tags. To accomplish this, a communications system model is developed that includes the interference between QD colours, Inter-Symbol Interference (ISI), and additive noise. The model is developed for two optical detectors, namely a Charge-Coupled Device (CCD) spectrometer and photodiode detectors. The model also includes an analytical expression for the Signal-to-Noise Ratio (SNR) of the detectors. For the CCD spectrometer, this model is verified with an experimental prototype. With the models in place, communications systems tools are applied that overcome both ISI and noise. This is an improvement over previous work in the field that only considered algorithms to overcome ISI or noise separately. Specifically, this thesis outlines the proposal of a matched filter to improve SNR, a Minimum Mean Square Error (MMSE) equalizer that mitigates ISI in the presence of noise and a Maximum Likelihood Sequence

  18. Multi-Objective Community Detection Based on Memetic Algorithm

    PubMed Central

    2015-01-01

    Community detection has drawn a lot of attention as it can provide invaluable help in understanding the function and visualizing the structure of networks. Since single objective optimization methods have intrinsic drawbacks to identifying multiple significant community structures, some methods formulate the community detection as multi-objective problems and adopt population-based evolutionary algorithms to obtain multiple community structures. Evolutionary algorithms have strong global search ability, but have difficulty in locating local optima efficiently. In this study, in order to identify multiple significant community structures more effectively, a multi-objective memetic algorithm for community detection is proposed by combining multi-objective evolutionary algorithm with a local search procedure. The local search procedure is designed by addressing three issues. Firstly, nondominated solutions generated by evolutionary operations and solutions in dominant population are set as initial individuals for local search procedure. Then, a new direction vector named as pseudonormal vector is proposed to integrate two objective functions together to form a fitness function. Finally, a network specific local search strategy based on label propagation rule is expanded to search the local optimal solutions efficiently. The extensive experiments on both artificial and real-world networks evaluate the proposed method from three aspects. Firstly, experiments on influence of local search procedure demonstrate that the local search procedure can speed up the convergence to better partitions and make the algorithm more stable. Secondly, comparisons with a set of classic community detection methods illustrate the proposed method can find single partitions effectively. Finally, the method is applied to identify hierarchical structures of networks which are beneficial for analyzing networks in multi-resolution levels. PMID:25932646

  19. Detection Algorithms of the Seismic Alert System of Mexico (SASMEX)

    NASA Astrophysics Data System (ADS)

    Cuellar Martinez, A.; Espinosa Aranda, J.; Ramos Perez, S.; Ibarrola Alvarez, G.; Zavala Guerrero, M.; Sasmex

    2013-05-01

    Rapid and reliable detection of an earthquake allows warnings to reach the population with a longer opportunity time. Detection algorithms at the sensing field stations (FS) of an earthquake early warning system must therefore have a high rate of correct detection; this condition allows the numerical processes that derive the parameters needed for alert activation to be performed. During the evolution and continuous service of the Mexican Seismic Alert System (SASMEX) over more than 23 years of operation, various methodologies have been used in the detection process to obtain the largest opportunity time when an earthquake occurs and is alerted. In addition to the characteristics of the acceleration signal observed at the sensing field stations, site conditions that reduce urban noise are necessary; such noise may be absent during the first years of operation, but urban growth near an FS can later introduce it, and it must be tolerated while the station relocation process is carried out. The algorithm design should therefore include robustness to reduce possible errors and false detections. This work presents some results on detection algorithms used in Mexico for earthquake early warning, considering recent events and the different opportunity times obtained depending on the detection of the P and S phases at the station. Some methodologies are reviewed and described in detail, together with the main features implemented in the Seismic Alert System of Mexico City (SAS), in continuous operation since 1991, and the Seismic Alert System of Oaxaca City (SASO); today both comprise SASMEX.

  20. Jump point detection for real estate investment success

    NASA Astrophysics Data System (ADS)

    Hui, Eddie C. M.; Yu, Carisa K. W.; Ip, Wai-Cheung

    2010-03-01

    In the literature, studies of the real estate market have mainly concentrated on the relation between property prices and some key factors. The trend of the real estate market is a major concern. It is believed that changes in trend are signified by jump points in the property price series. Identifying such jump points reveals important findings that enable policy-makers to look forward. However, not all jump points are observable from a plot of the series. This paper looks into the trend and introduces a new approach to the framework for real estate investment success. The main purpose of this paper is to detect jump points in the time series of some housing price indices and a stock price index in Hong Kong by applying wavelet analysis. The detected jump points correspond to significant political events and economic collapses. Moreover, the relations among properties of different classes, and between stocks and properties, are examined. The empirical results show that a lead-lag effect occurred between the prices of large-size properties and those of small/medium-size properties. However, there is no apparent relation or consistent lead, in terms of the change point measure, between property prices and stock prices. This may be because globalization has more impact on stock prices than on property prices.

  1. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    PubMed

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been shown to be a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step of any UWB radar imaging system, and the artifact removal algorithms considered so far have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm. These modifications are shown to be effective in achieving good localization accuracy and fewer false positives. However, the main contribution is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.

  2. Performance of a community detection algorithm based on semidefinite programming

    NASA Astrophysics Data System (ADS)

    Ricci-Tersenghi, Federico; Javanmard, Adel; Montanari, Andrea

    2016-03-01

    The problem of detecting communities in a graph is perhaps one of the most studied inference problems, given its simplicity and its widespread diffusion among several disciplines. A very common benchmark for this problem is the stochastic block model, or planted partition problem, where a phase transition takes place in the detection of the planted partition as the signal-to-noise ratio changes. Optimal algorithms for the detection exist which are based on spectral methods, but we show these are extremely sensitive to slight modifications of the generative model. Recently Javanmard, Montanari and Ricci-Tersenghi [1] used statistical physics arguments and numerical simulations to show that finding communities in the stochastic block model via semidefinite programming is quasi-optimal. Further, the resulting semidefinite relaxation can be solved efficiently and is very robust with respect to changes in the generative model. In this paper we study in detail several practical aspects of this new algorithm based on semidefinite programming for the detection of the planted partition. The algorithm turns out to be very fast, allowing the solution of problems with O(10^5) variables in a few seconds on a laptop computer.

  3. Ship detection algorithm used in a navigation lock monitored system

    NASA Astrophysics Data System (ADS)

    Li, Jiuxian; Yuan, Xiao-hui; Xia, LiangZheng

    2001-09-01

    A ship detection algorithm used in a navigation lock monitoring and control system, which helps to improve safety and dependability, is presented. In this approach, the processed region of the river channel is determined through image edge detection. A peak-cutting histogram method is introduced to suppress spurious motion information resulting from the glinting of water in the navigation lock. The concept of difference compactness, obtained by computing the projected density function of the difference image, is put forward, which helps to detect moving ships more precisely. Using a statistical property of the histograms of several small regions of the river channel, the gray-scale variance, the detection of stationary ships can be accomplished, and the results can be judged by a confidence region. Computer simulation has shown that the method contributes substantially to the detection of ships in a navigation lock.

  4. Localization of tumors in various organs, using edge detection algorithms

    NASA Astrophysics Data System (ADS)

    López Vélez, Felipe

    2015-09-01

    The edge of an image is a set of points organized in a curved line, where at each of these points the brightness of the image changes abruptly or has discontinuities. In order to find these edges, five different mathematical methods are used and then compared with each other, with the aim of finding which of the methods best locates the edges of any given image. In this paper these five methods are used for medical purposes, in order to find which one is capable of finding the edges of a scanned image more accurately than the others. The problem consists of analyzing the following two biomedical images: one represents a brain tumor and the other a liver tumor. These images are analyzed with the help of the five methods described, and the results are compared in order to determine the best method to use. The following edge detection algorithms were chosen: the Bessel algorithm, Morse algorithm, Hermite algorithm, Weibull algorithm and Sobel algorithm. After analyzing the application of each method to both images, it is impossible to determine a single most accurate method for tumor detection, because the best method changed in each case; i.e., for the brain tumor image the Morse method was the best at finding the edges, but for the liver tumor image it was the Hermite method. Further observation shows that Hermite and Morse have, for these two cases, the lowest standard deviations, leading to the conclusion that these two are the most accurate methods for finding edges in the analysis of biomedical images.
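
    Of the operators listed, Sobel is the most standard; a minimal sketch of Sobel gradient-magnitude edge detection using SciPy follows. The relative threshold is an illustrative choice, not the rule used in the paper.

      import numpy as np
      from scipy import ndimage

      def sobel_edges(image, threshold_ratio=0.25):
          """Return a binary edge map from the Sobel gradient magnitude of a grayscale image."""
          image = np.asarray(image, dtype=float)
          gx = ndimage.sobel(image, axis=1)            # horizontal gradient
          gy = ndimage.sobel(image, axis=0)            # vertical gradient
          magnitude = np.hypot(gx, gy)
          return magnitude > threshold_ratio * magnitude.max()

      # Toy usage: a bright square on a dark background has edges along its border.
      img = np.zeros((64, 64))
      img[20:40, 20:40] = 1.0
      print(sobel_edges(img).sum(), "edge pixels detected")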

  5. The weirdest SDSS galaxies: results from an outlier detection algorithm

    NASA Astrophysics Data System (ADS)

    Baron, Dalya; Poznanski, Dovi

    2017-03-01

    How can we discover objects we did not know existed within the large data sets that now abound in astronomy? We present an outlier detection algorithm that we developed, based on an unsupervised Random Forest. We test the algorithm on more than two million galaxy spectra from the Sloan Digital Sky Survey and examine the 400 galaxies with the highest outlier score. We find objects which have extreme emission line ratios and abnormally strong absorption lines, objects with unusual continua, including extremely reddened galaxies. We find galaxy-galaxy gravitational lenses, double-peaked emission line galaxies and close galaxy pairs. We find galaxies with high ionization lines, galaxies that host supernovae and galaxies with unusual gas kinematics. Only a fraction of the outliers we find were reported by previous studies that used specific and tailored algorithms to find a single class of unusual objects. Our algorithm is general and detects all of these classes, and many more, regardless of what makes them peculiar. It can be executed on imaging, time series and other spectroscopic data, operates well with thousands of features, is not sensitive to missing values and is easily parallelizable.

  6. An Algorithm for Coastline Detection Using SAR Images

    NASA Astrophysics Data System (ADS)

    Acar, U.; Bayram, B.; Sanli, F. B.; Abdikan, S.; Sunar, F.; Cetin, H. I.

    2012-08-01

    Coastal management requires rapid, up-to-date, and correct information, so coastal movements are of primary importance to coastal managers. For monitoring shoreline change, remote sensing data are among the most important sources of information and are used to detect changes in shorelines. It is possible to monitor coastal changes by extracting the coastline from satellite images. In the literature, most of the algorithms developed for optical images have been discussed in detail. In this study, an algorithm which extracts coastlines efficiently and automatically by processing SAR (Synthetic Aperture Radar) satellite images has been developed. A data set consisting of an ALOS PALSAR Fine Beam Double (FBD) HH-HV polarized image has been used. The PALSAR image contains L-band data with a 14 MHz bandwidth and a 34.3 degree look angle. Data were acquired in ascending geometry. The PALSAR amplitude image was resampled to a ground resolution of 15 m. Zonguldak city, which lies on the northwest coast of Turkey, was selected as the test area. An algorithm was developed for automatic coastline extraction from SAR images and encoded in a C environment. To verify the results, the algorithm was applied to two PALSAR images acquired on two different dates, 2007 and 2010. The results of the automatic coastline extraction from the SAR images were compared to results derived from manual digitizing, using random control points visible in each image, and the average differences at the selected points were calculated.

  7. SEU-tolerant IQ detection algorithm for LLRF accelerator system

    NASA Astrophysics Data System (ADS)

    Grecki, M.

    2007-08-01

    High-energy accelerators use an RF field to accelerate charged particles. Measurements of the effective field parameters (amplitude and phase) are tasks of great importance in these facilities. The RF signal is downconverted in frequency, preserving the information about amplitude and phase, and then sampled by an ADC. One of the several tasks of the LLRF control system is to estimate the amplitude and phase (or I and Q components) of the RF signal. These parameters are further used in the control algorithm. The XFEL accelerator will be built using a single-tunnel concept. Therefore electronic devices (including the LLRF control system) will be exposed to ionizing radiation, particularly to a neutron flux generating SEUs in digital circuits. The algorithms implemented in FPGA/DSP should therefore be SEU-tolerant. This paper presents the application of the WCC method to make the IQ detection algorithm immune to SEUs. The VHDL implementation of this algorithm in a Xilinx Virtex II Pro FPGA is presented, together with simulation results proving the algorithm's suitability for systems operating in the presence of SEUs.

  8. IQ quadrature demodulation algorithm used in heterodyne detection

    NASA Astrophysics Data System (ADS)

    Wang, Chunhui; Qu, Yang; Tang, Yajun; Pang, Tiantian

    2015-09-01

    In order to obtain better heterodyne detection results, we used a phase IQ quadrature demodulation algorithm to process the data detected by laser heterodyne. Based on a laser heterodyne interferometer, the interferometer data were processed with the phase IQ quadrature demodulation algorithm while varying the signal-to-noise ratio, the sampling rate, the filter order and the cutoff frequency, in order to verify the effects of these system parameters on the phase precision and to choose the best parameters. The experiments gave better phase precision with: a signal-to-noise ratio of 25 dB; IF signal frequencies of 98.3 MHz, 98.5 MHz, 99.1 MHz, 99.5 MHz and 100 MHz; a sampling rate of 512-2048; and a filter cutoff frequency and order of 0.11 and 40, respectively.
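
    A small sketch of digital IQ quadrature demodulation of the kind described above: the sampled IF signal is multiplied by cosine and sine references at the nominal IF, low-pass filtered to obtain I and Q, and the amplitude and phase follow from the quadrature pair. The normalized cutoff of 0.11 echoes the value studied in the abstract, but the filter type, order and the toy signal are illustrative assumptions.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def iq_demodulate(signal, fs, f_if, cutoff=0.11, order=4):
          """Recover amplitude and phase of a signal near the intermediate frequency f_if.

          signal : sampled IF waveform.
          fs     : sampling rate in Hz.
          cutoff : low-pass cutoff as a fraction of the Nyquist frequency.
          order  : Butterworth filter order (kept small; filtfilt effectively doubles it).
          """
          t = np.arange(signal.size) / fs
          i_raw = signal * np.cos(2 * np.pi * f_if * t)
          q_raw = -signal * np.sin(2 * np.pi * f_if * t)
          b, a = butter(order, cutoff)                    # normalized cutoff in (0, 1)
          i, q = filtfilt(b, a, i_raw), filtfilt(b, a, q_raw)
          amplitude = 2.0 * np.hypot(i, q)                # factor 2 restores the carrier amplitude
          phase = np.arctan2(q, i)
          return amplitude, phase

      # Toy usage: a 98.3 MHz tone sampled at 400 MHz with a known 0.3 rad phase offset.
      fs, f_if = 400e6, 98.3e6
      t = np.arange(4096) / fs
      x = 1.5 * np.cos(2 * np.pi * f_if * t + 0.3)
      amp, ph = iq_demodulate(x, fs, f_if)
      print(amp[2048], ph[2048])    # close to 1.5 and 0.3 away from the filter edges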

  9. Algorithm for Automated Detection of Edges of Clouds

    NASA Technical Reports Server (NTRS)

    Ward, Jennifer G.; Merceret, Francis J.

    2006-01-01

    An algorithm processes cloud-physics data gathered in situ by an aircraft, along with reflectivity data gathered by ground-based radar, to determine whether the aircraft is inside or outside a cloud at a given time. A cloud edge is deemed to be detected when the in/out state changes, subject to a hysteresis constraint. Such determinations are important in continuing research on relationships among lightning, electric charges in clouds, and decay of electric fields with distance from cloud edges.

  10. Prefiltering Model for Homology Detection Algorithms on GPU

    PubMed Central

    Retamosa, Germán; de Pedro, Luis; González, Ivan; Tamames, Javier

    2016-01-01

    Homology detection has evolved over time from heavy algorithms based on dynamic programming approaches to lightweight alternatives based on different heuristic models. However, the main problem with these algorithms is that they use complex statistical models, which makes it difficult to achieve a relevant speedup and find exact matches with the original results. Thus, their acceleration is essential. The aim of this article was to prefilter a sequence database. To make this work, we have implemented a groundbreaking heuristic model based on NVIDIA’s graphics processing units (GPUs) and multicore processors. Depending on the sensitivity settings, this makes it possible to quickly reduce the sequence database by factors between 50% and 95%, while rejecting no significant sequences. Furthermore, this prefiltering application can be used together with multiple homology detection algorithms as a part of a next-generation sequencing system. Extensive performance and accuracy tests have been carried out in the Spanish National Centre for Biotechnology (NCB). The results show that GPU hardware can accelerate the execution times of former homology detection applications, such as National Centre for Biotechnology Information (NCBI), Basic Local Alignment Search Tool for Proteins (BLASTP), up to a factor of 4. KEY POINTS: Owing to the increasing size of current sequence datasets, a filtering approach and high-performance computing (HPC) techniques are the best solution for processing all this information in acceptable processing times. Graphics processing unit cards and their corresponding programming models are good options for carrying out these processing methods. The combination of filtration models with HPC techniques is able to offer new levels of performance and accuracy in homology detection algorithms such as the National Centre for Biotechnology Information Basic Local Alignment Search Tool. PMID:28008220

  11. Efficient color face detection algorithm under different lighting conditions

    NASA Astrophysics Data System (ADS)

    Chow, Tze-Yin; Lam, Kin-Man; Wong, Kwok-Wai

    2006-01-01

    We present an efficient and reliable algorithm to detect human faces in an image under different lighting conditions. In our algorithm, skin-colored pixels are identified using a region-based approach, which can provide more reliable skin color segmentation under various lighting conditions. In addition, to compensate for extreme lighting conditions, a color compensation scheme is proposed, and the distributions of the skin-color components under various illuminations are modeled by means of the maximum-likelihood method. With the skin-color regions detected, a ratio method is proposed to determine the possible positions of the eyes in the image. Two eye candidates form a possible face region, which is then verified as a face or not by means of a two-stage procedure with an eigenmask. Finally, the face boundary region of a face candidate is further verified by a probabilistic approach to reduce the chance of false alarms. Experimental results based on the HHI MPEG-7 face database, the AR face database, and the CMU pose, illumination, and expression (PIE) database show that this face detection algorithm is efficient and reliable under different lighting conditions and facial expressions.

  12. Prefiltering Model for Homology Detection Algorithms on GPU.

    PubMed

    Retamosa, Germán; de Pedro, Luis; González, Ivan; Tamames, Javier

    2016-01-01

    Homology detection has evolved over time from heavy algorithms based on dynamic programming approaches to lightweight alternatives based on different heuristic models. However, the main problem with these algorithms is that they use complex statistical models, which makes it difficult to achieve a relevant speedup and find exact matches with the original results. Thus, their acceleration is essential. The aim of this article was to prefilter a sequence database. To make this work, we have implemented a groundbreaking heuristic model based on NVIDIA's graphics processing units (GPUs) and multicore processors. Depending on the sensitivity settings, this makes it possible to quickly reduce the sequence database by factors between 50% and 95%, while rejecting no significant sequences. Furthermore, this prefiltering application can be used together with multiple homology detection algorithms as a part of a next-generation sequencing system. Extensive performance and accuracy tests have been carried out in the Spanish National Centre for Biotechnology (NCB). The results show that GPU hardware can accelerate the execution times of former homology detection applications, such as National Centre for Biotechnology Information (NCBI), Basic Local Alignment Search Tool for Proteins (BLASTP), up to a factor of 4.

  13. A Study of Lane Detection Algorithm for Personal Vehicle

    NASA Astrophysics Data System (ADS)

    Kobayashi, Kazuyuki; Watanabe, Kajiro; Ohkubo, Tomoyuki; Kurihara, Yosuke

    By the term “personal vehicle”, we mean a simple and lightweight vehicle expected to emerge as a personal ground transportation device. The motorcycle, electric wheelchair, and motor-powered bicycle are examples of personal vehicles and have been developed as useful means of personal transportation. Recently, a new type of intelligent personal vehicle called the Segway has been developed, which is controlled and stabilized using on-board intelligent multiple sensors. The demand for such personal vehicles is increasing, in order 1) to enhance human mobility, 2) to support mobility for elderly persons, and 3) to reduce environmental burdens. As the personal vehicle market grows rapidly, the number of accidents caused by human error is also increasing, and these accidents are closely related to driving ability. To enhance or support driving ability as well as to prevent accidents, intelligent assistance is necessary. One of the most important elemental functions for a personal vehicle is robust lane detection. In this paper, we develop a robust lane detection method for personal vehicles in outdoor environments. The proposed lane detection method employs a 360-degree omnidirectional camera and a robust image processing algorithm. In order to detect lanes, a combination of template matching and the Hough transform is employed. The validity of the proposed lane detection algorithm is confirmed with an actually developed vehicle under various outdoor sunlight conditions.
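
    The abstract above gives no implementation details. As a rough, generic illustration of the Hough-transform voting step it mentions (not the authors' omnidirectional-camera pipeline), the following numpy-only sketch accumulates votes for line parameters (rho, theta) over a binary edge map; the synthetic image, parameter resolution, and vote threshold are made-up placeholders.

```python
import numpy as np

def hough_lines(edges, n_theta=180, rho_res=1.0, vote_threshold=50):
    """Vote for straight lines (rho, theta) in a binary edge map.

    edges: 2-D boolean array, True where an edge pixel was detected.
    Returns (rho, theta) pairs whose accumulator cell exceeds the threshold.
    """
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.arange(-diag, diag + rho_res, rho_res)
    accumulator = np.zeros((len(rhos), n_theta), dtype=np.int64)

    ys, xs = np.nonzero(edges)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta) for every candidate theta
        r = x * cos_t + y * sin_t
        r_idx = np.round((r + diag) / rho_res).astype(int)
        accumulator[r_idx, np.arange(n_theta)] += 1

    peaks = np.argwhere(accumulator >= vote_threshold)
    return [(rhos[i], thetas[j]) for i, j in peaks]

if __name__ == "__main__":
    img = np.zeros((100, 100), dtype=bool)
    img[np.arange(100), np.arange(100)] = True   # a synthetic diagonal "lane marking"
    print(hough_lines(img, vote_threshold=80))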

  14. Sparsity-based algorithm for detecting faults in rotating machines

    NASA Astrophysics Data System (ADS)

    He, Wangpeng; Ding, Yin; Zi, Yanyang; Selesnick, Ivan W.

    2016-05-01

    This paper addresses the detection of periodic transients in vibration signals so as to detect faults in rotating machines. For this purpose, we present a method to estimate periodic-group-sparse signals in noise. The method is based on the formulation of a convex optimization problem. A fast iterative algorithm is given for its solution. A simulated signal is formulated to verify the performance of the proposed approach for periodic feature extraction. The detection performance of comparative methods is compared with that of the proposed approach via RMSE values and receiver operating characteristic (ROC) curves. Finally, the proposed approach is applied to single-fault diagnosis of a locomotive bearing and compound-fault diagnosis of motor bearings. The processed results show that the proposed approach can effectively detect and extract the useful features of bearing outer race and inner race defects.

  15. Clairvoyant fusion: a new methodology for designing robust detection algorithms

    NASA Astrophysics Data System (ADS)

    Schaum, Alan

    2016-10-01

    Many realistic detection problems cannot be solved with simple statistical tests for known alternative probability models. Uncontrollable environmental conditions, imperfect sensors, and other uncertainties transform simple detection problems with likelihood ratio solutions into composite hypothesis (CH) testing problems. Recently many multi- and hyperspectral sensing CH problems have been addressed with a new approach. Clairvoyant fusion (CF) integrates the optimal detectors ("clairvoyants") associated with every unspecified value of the parameters appearing in a detection model. For problems with discrete parameter values, logical rules emerge for combining the decisions of the associated clairvoyants. For many problems with continuous parameters, analytic methods of CF have been found that produce closed-form solutions, or approximations for intractable problems. Here the principles of CF are reviewed and mathematical insights are described that have proven useful in the derivation of solutions. It is also shown how a second-stage fusion procedure can be used to create theoretically superior detection algorithms for all discrete-parameter problems.

  16. GPU based cloud system for high-performance arrhythmia detection with parallel k-NN algorithm.

    PubMed

    Tae Joon Jun; Hyun Ji Park; Hyuk Yoo; Young-Hak Kim; Daeyoung Kim

    2016-08-01

    In this paper, we propose a GPU-based cloud system for high-performance arrhythmia detection. The Pan-Tompkins algorithm is used for QRS detection, and we optimized the beat classification algorithm with K-Nearest Neighbor (K-NN). To support high-performance beat classification on the system, we parallelized the beat classification algorithm with CUDA to execute it on virtualized GPU devices on the cloud system. The MIT-BIH Arrhythmia database is used for validation of the algorithm. The system achieved a detection rate of about 93.5%, which is comparable to previous studies, while our algorithm shows 2.5 times faster execution time compared to a CPU-only detection algorithm.
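
    The abstract describes K-NN beat classification accelerated with CUDA; the GPU kernel itself cannot be reproduced from the text. As a minimal CPU reference of the underlying classifier, assuming hypothetical beat feature vectors and labels, a brute-force k-nearest-neighbour vote might look like this:

```python
import numpy as np

def knn_classify(train_feats, train_labels, query_feats, k=3):
    """Brute-force k-nearest-neighbour classification.

    train_feats : (n_train, d) array of beat feature vectors
    train_labels: (n_train,) integer class labels (e.g. beat types)
    query_feats : (n_query, d) array of beats to classify
    Returns an (n_query,) array of predicted labels by majority vote.
    """
    preds = np.empty(len(query_feats), dtype=train_labels.dtype)
    for i, q in enumerate(query_feats):
        # Euclidean distance from the query beat to every training beat
        dists = np.linalg.norm(train_feats - q, axis=1)
        nearest = train_labels[np.argsort(dists)[:k]]
        # Majority vote among the k nearest training beats
        preds[i] = np.bincount(nearest).argmax()
    return preds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(50, 8))     # hypothetical "normal beat" features
    ectopic = rng.normal(3.0, 1.0, size=(50, 8))    # hypothetical "ectopic beat" features
    X = np.vstack([normal, ectopic])
    y = np.array([0] * 50 + [1] * 50)
    queries = np.vstack([rng.normal(0.0, 1.0, size=(5, 8)),
                         rng.normal(3.0, 1.0, size=(5, 8))])
    print(knn_classify(X, y, queries, k=5))
```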

  17. Algorithm for Detecting a Bright Spot in an Image

    NASA Technical Reports Server (NTRS)

    2009-01-01

    An algorithm processes the pixel intensities of a digitized image to detect and locate a circular bright spot, the approximate size of which is known in advance. The algorithm is used to find images of the Sun in cameras aboard the Mars Exploration Rovers. (The images are used in estimating orientations of the Rovers relative to the direction to the Sun.) The algorithm can also be adapted to tracking of circular-shaped bright targets in other diverse applications. The first step in the algorithm is to calculate a dark-current ramp, a correction necessitated by the scheme that governs the readout of pixel charges in the charge-coupled-device camera in the original Mars Exploration Rover application. In this scheme, the fraction of each frame period during which dark current is accumulated in a given pixel (and, hence, the dark-current contribution to the pixel image-intensity reading) is proportional to the pixel row number. For the purpose of the algorithm, the dark-current contribution to the intensity reading from each pixel is assumed to equal the average of intensity readings from all pixels in the same row, and the factor of proportionality is estimated on the basis of this assumption. Then the product of the row number and the factor of proportionality is subtracted from the reading from each pixel to obtain a dark-current-corrected intensity reading. The next step in the algorithm is to determine the best location, within the overall image, for a window of N x N pixels (where N is an odd number) large enough to contain the bright spot of interest plus a small margin. (In the original application, the overall image contains 1,024 by 1,024 pixels, the image of the Sun is about 22 pixels in diameter, and N is chosen to be 29.)
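
    The dark-current ramp correction and window search are described concretely enough to sketch. The following numpy sketch assumes, as the abstract states, that the per-pixel dark current is proportional to row number, estimates the proportionality factor from the row means, and then finds the N-by-N window with the largest summed intensity; the image sizes and the least-squares fit are illustrative choices, not the flight code.

```python
import numpy as np

def correct_dark_ramp(image):
    """Remove a dark-current ramp that grows linearly with row number.

    The per-row dark contribution is approximated by the row mean, and a
    single proportionality factor is fit by least squares against row index.
    """
    rows = np.arange(image.shape[0], dtype=float)
    row_means = image.mean(axis=1)
    slope = np.polyfit(rows, row_means, 1)[0]    # estimated dark-current factor
    return image - slope * rows[:, None]

def best_window(image, n=29):
    """Return the top-left corner of the n-by-n window with the largest sum."""
    # 2-D summed-area table for O(1) window sums
    sat = np.cumsum(np.cumsum(image, axis=0), axis=1)
    sat = np.pad(sat, ((1, 0), (1, 0)))
    sums = sat[n:, n:] - sat[:-n, n:] - sat[n:, :-n] + sat[:-n, :-n]
    return np.unravel_index(np.argmax(sums), sums.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.normal(10.0, 1.0, size=(256, 256))
    img += 0.05 * np.arange(256)[:, None]        # synthetic dark-current ramp
    yy, xx = np.mgrid[:256, :256]
    img += 50.0 * np.exp(-((yy - 140) ** 2 + (xx - 90) ** 2) / (2 * 6.0 ** 2))  # bright spot
    corrected = correct_dark_ramp(img)
    print(best_window(corrected, n=29))          # top-left corner of the window on the spot
```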

  18. Fast automatic algorithm for bifurcation detection in vascular CTA scans

    NASA Astrophysics Data System (ADS)

    Brozio, Matthias; Gorbunova, Vladlena; Godenschwager, Christian; Beck, Thomas; Bernhardt, Dominik

    2012-02-01

    Endovascular imaging aims at identifying vessels and their branches. Automatic vessel segmentation and bifurcation detection ease both clinical research and routine work. In this article a state-of-the-art bifurcation detection algorithm is developed and applied to vascular computed tomography angiography (CTA) scans to mark the common iliac artery and its branches, the internal and external iliacs. In contrast to other methods, our algorithm does not rely on a complete segmentation of a vessel in the 3D volume, but evaluates the cross-sections of the vessel slice by slice. Candidates for vessels are obtained by thresholding, followed by 2D connected component labeling and prefiltering by size and position. The remaining candidates are connected in a squared-distance-weighted graph. The graph is traversed with Dijkstra's algorithm to obtain candidates for the arteries. We use another set of features considering the length and shape of the paths to determine the best candidate and detect the bifurcation. The method was tested on 119 datasets acquired with different CT scanners and varying protocols. Both easy-to-evaluate datasets with high resolution and no apparent clinical diseases and difficult ones with low resolution, major calcifications, stents, or poor contrast between the vessel and surrounding tissue were included. The presented results are promising: in 75.7% of the cases the bifurcation was labeled correctly, and in 82.7% the common artery and one of its branches were assigned correctly. The computation time was on average 0.49 s +/- 0.28 s, close to human interaction time, which makes the algorithm applicable for time-critical applications.

  19. Track-Before-Detect Algorithm for Faint Moving Objects based on Random Sampling and Consensus

    NASA Astrophysics Data System (ADS)

    Dao, P.; Rast, R.; Schlaegel, W.; Schmidt, V.; Dentamaro, A.

    2014-09-01

    There are many algorithms developed for tracking and detecting faint moving objects in congested backgrounds. One obvious application is the detection of targets in images where each pixel corresponds to the received power in a particular location. In our application, a visible imager operated in stare mode observes geostationary objects as fixed, stars as moving, and non-geostationary objects as drifting in the field of view. We would like to achieve high-sensitivity detection of the drifters. The ability to improve SNR with track-before-detect (TBD) processing, where target information is collected and collated before the detection decision is made, allows respectable performance against dim moving objects. Generally, a TBD algorithm consists of a pre-processing stage that highlights potential targets and a temporal filtering stage. However, the algorithms that have been successfully demonstrated, e.g. Viterbi-based and Bayesian-based, demand formidable processing power and memory. We propose an algorithm that exploits the quasi-constant velocity of objects, the predictability of the stellar clutter, and the intrinsically low false alarm rate of detecting signature candidates in 3-D, based on an iterative method called "RANdom SAmple Consensus" (RANSAC), and one that can run in real time on a typical PC. The technique is tailored for searching for objects with small telescopes in stare mode. Our RANSAC-MT (Moving Target) algorithm estimates the parameters of a mathematical model (e.g., linear motion) from a set of observed data which contains a significant number of outliers, while identifying inliers. In the pre-processing phase, candidate blobs are selected based on morphology and an intensity threshold that would normally generate an unacceptable level of false alarms. The RANSAC sampling rejects candidates that conform to the predictable motion of the stars. Data collected with a 17 inch telescope by AFRL/RH and a COTS lens/EM-CCD sensor by the AFRL/RD Satellite Assessment Center is
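
    As a generic illustration of the RANSAC idea applied to a quasi-constant-velocity target (not the RANSAC-MT implementation itself), the sketch below repeatedly fits a linear track to two randomly chosen detections and keeps the model with the largest consensus set; the tolerances and the synthetic clutter are assumptions.

```python
import numpy as np

def ransac_track(times, xy, n_iter=500, tol=2.0, min_inliers=8, rng=None):
    """Fit a constant-velocity track x(t), y(t) to noisy candidate detections.

    times : (n,) observation times
    xy    : (n, 2) candidate positions (inliers plus clutter)
    Returns (position0, velocity, inlier_mask) of the best consensus model.
    """
    rng = rng or np.random.default_rng()
    best = (None, None, np.zeros(len(times), dtype=bool))
    for _ in range(n_iter):
        i, j = rng.choice(len(times), size=2, replace=False)
        dt = times[j] - times[i]
        if dt == 0:
            continue
        vel = (xy[j] - xy[i]) / dt                  # velocity from two samples
        pos0 = xy[i] - vel * times[i]               # extrapolated position at t = 0
        residuals = np.linalg.norm(xy - (pos0 + vel * times[:, None]), axis=1)
        inliers = residuals < tol
        if inliers.sum() >= min_inliers and inliers.sum() > best[2].sum():
            best = (pos0, vel, inliers)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    t = np.arange(30.0)
    track = np.stack([5.0 + 0.8 * t, 100.0 - 0.3 * t], axis=1)
    track += rng.normal(0, 0.3, track.shape)                    # measurement noise
    clutter = rng.uniform(0, 120, size=(60, 2))                 # star/false-alarm clutter
    times = np.concatenate([t, rng.uniform(0, 30, 60)])
    points = np.vstack([track, clutter])
    pos0, vel, mask = ransac_track(times, points)
    print("estimated velocity:", vel, "inliers:", int(mask.sum()))
```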

  20. Oscillation Detection Algorithm Development Summary Report and Test Plan

    SciTech Connect

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.; Jin, Shuangshuang

    2009-10-03

    -based modal analysis algorithms have been developed. They include Prony analysis, the Regularized Robust Recursive Least Square (R3LS) algorithm, the Yule-Walker algorithm, the Yule-Walker Spectrum algorithm, and the N4SID algorithm. Each has been shown to be effective for certain situations, but not as effective for some other situations. For example, traditional Prony analysis works well for disturbance data but not for ambient data, while Yule-Walker is designed for ambient data only. Even in an algorithm that works for both disturbance data and ambient data, such as R3LS, latency resulting from the time window used in the algorithm is an issue in the timely estimation of oscillation modes. For ambient data, the time window needs to be longer to accumulate information for a reasonably accurate estimation, while for disturbance data the time window can be significantly shorter, so the latency in estimation can be much less. In addition, adding a known input signal, such as a noise probing signal, can increase the knowledge of system oscillatory properties and thus improve the quality of mode estimation. System situations change over time. Disturbances can occur at any time, and probing signals can be added for a certain time period and then removed. All these observations point to the need to add intelligence to ModeMeter applications. That is, a ModeMeter needs to adaptively select different algorithms and adjust parameters for various situations. This project aims to develop systematic approaches for algorithm selection and parameter adjustment. The very first step is to detect the occurrence of oscillations so that the algorithm and parameters can be changed accordingly. The proposed oscillation detection approach is based on the signal-to-noise ratio of measurements.

  1. A hierarchic collision detection algorithm for simple Brownian dynamics.

    PubMed

    Katsimitsoulia, Zoe; Taylor, William R

    2010-04-01

    We describe an algorithm to avoid steric violation (bumps) between bodies arranged in a hierarchy. The algorithm recursively directs the focus of a bump-detector towards the interactions of children whose parents are in collision. This has the effect of concentrating available computer resources towards maintaining good steric interactions in the region where bodies are colliding. The algorithm was implemented and tested under two programming environments: a graphical environment, OpenGL under Java3D, and a non-graphical environment in "C". The former used a built-in collision detection system whereas the latter used a simple algorithm devised previously for the interaction of "soft" bodies. This simpler system was found to run much faster (by 50-fold) even after allowing for time spent on graphical activity and was also better at preventing steric violations. With a hierarchy of three levels of 100, the non-graphical implementation was able to simulate a million atomic bodies for 100,000 steps in 12h on a laptop computer.

  2. A Hybrid Swarm Intelligence Algorithm for Intrusion Detection Using Significant Features

    PubMed Central

    Amudha, P.; Karthik, S.; Sivakumari, S.

    2015-01-01

    Intrusion detection has become a main part of network security due to the huge number of attacks that affect computers. This is due to the extensive growth of internet connectivity and accessibility to information systems worldwide. To deal with this problem, in this paper a hybrid algorithm is proposed that integrates a Modified Artificial Bee Colony (MABC) with Enhanced Particle Swarm Optimization (EPSO) to solve the intrusion detection problem. The algorithms are combined to obtain better optimization results, and the classification accuracies are obtained by the 10-fold cross-validation method. The purpose of this paper is to select the most relevant features that can represent the pattern of the network traffic and to test their effect on the success of the proposed hybrid classification algorithm. To investigate the performance of the proposed method, the intrusion detection KDDCup'99 benchmark dataset from the UCI Machine Learning repository is used. The performance of the proposed method is compared with other machine learning algorithms and found to be significantly different. PMID:26221625

  3. Detection of cracks in shafts with the Approximated Entropy algorithm

    NASA Astrophysics Data System (ADS)

    Sampaio, Diego Luchesi; Nicoletti, Rodrigo

    2016-05-01

    The Approximate Entropy is a statistical measure used primarily in the fields of Medicine, Biology, and Telecommunication for classifying and identifying complex signal data. In this work, an Approximate Entropy algorithm is used to detect cracks in a rotating shaft. The signals of the cracked shaft are obtained from numerical simulations of a de Laval rotor with breathing cracks modelled by Fracture Mechanics. In this case, the vertical displacements of the rotor during run-up transients were analysed. The results show the feasibility of detecting cracks from 5% depth onward, irrespective of the unbalance of the rotating system and the crack orientation in the shaft. The results also show that the algorithm can differentiate the occurrence of crack only, misalignment only, and crack + misalignment in the system. However, the algorithm is sensitive to the intrinsic parameters p (number of data points in a sample vector) and f (fraction of the standard deviation that defines the minimum distance between two sample vectors), and good results are only obtained by appropriately choosing their values according to the sampling rate of the signal.
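
    For reference, the textbook Approximate Entropy ApEn(m, r) can be written in a few lines of numpy. The paper's parameters p and f appear to play the roles of the template length m and the tolerance expressed as a fraction of the signal's standard deviation, but the sketch below is the standard formulation, not the authors' implementation.

```python
import numpy as np

def approximate_entropy(x, m=2, r_frac=0.2):
    """Approximate Entropy ApEn(m, r) of a 1-D signal.

    m      : length of the compared template vectors
    r_frac : tolerance as a fraction of the signal standard deviation
    """
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(m):
        n = len(x) - m + 1
        # All length-m template vectors, stacked as rows
        templates = np.stack([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = np.mean(dist <= r, axis=1)   # includes self-match, as in the original ApEn
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    t = np.linspace(0, 20 * np.pi, 1000)
    print("sine :", approximate_entropy(np.sin(t)))              # regular -> low ApEn
    print("noise:", approximate_entropy(rng.normal(size=1000)))  # irregular -> high ApEn
```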

  4. Runway Safety Monitor Algorithm for Runway Incursion Detection and Alerting

    NASA Technical Reports Server (NTRS)

    Green, David F., Jr.; Jones, Denise R. (Technical Monitor)

    2002-01-01

    The Runway Safety Monitor (RSM) is an algorithm for runway incursion detection and alerting that was developed in support of NASA's Runway Incursion Prevention System (RIPS) research conducted under the NASA Aviation Safety Program's Synthetic Vision System element. The RSM algorithm provides pilots with enhanced situational awareness and warnings of runway incursions in sufficient time to take evasive action and avoid accidents during landings, takeoffs, or taxiing on the runway. The RSM currently runs as a component of the NASA Integrated Display System, an experimental avionics software system for terminal area and surface operations. However, the RSM algorithm can be implemented as a separate program to run on any aircraft with traffic data link capability. The report documents the RSM software and describes in detail how RSM performs runway incursion detection and alerting functions for NASA RIPS. The report also describes the RIPS flight tests conducted at the Dallas-Ft Worth International Airport (DFW) during September and October of 2000, and the RSM performance results and lessons learned from those flight tests.

  5. Efficient detection and recognition algorithm of reference points in photogrammetry

    NASA Astrophysics Data System (ADS)

    Li, Weimin; Liu, Gang; Zhu, Lichun; Li, Xiaofeng; Zhang, Yuhai; Shan, Siyu

    2016-04-01

    In photogrammetry, an approach for the automatic detection and recognition of reference points has been proposed to meet the requirements for detection and matching of reference points. The reference points used here are circular coded targets (CCTs), which are composed of two parts: a round target point in the central region and a circular encoding band in the surrounding region. Firstly, the contours of the image are extracted, after which noise and disturbances in the image are filtered out by means of a series of criteria, such as the area of the contours and the correlation coefficient between two contour regions. Secondly, cubic spline interpolation is adopted to process the central contour region of the CCT. The contours of the interpolated image are extracted again, and then least-squares ellipse fitting is performed to calculate the center coordinates of the CCT. Finally, the encoded value is obtained from the angle information of the circular encoding band of the CCT. The experimental results show that the presented algorithm locates the CCT with sub-pixel precision. The recognition accuracy is also high, even when the background of the image is complex and full of disturbances, and the algorithm is robust and fast.

  6. Fast pulse detection algorithms for digitized waveforms from scintillators

    NASA Astrophysics Data System (ADS)

    Krasilnikov, V.; Marocco, D.; Esposito, B.; Riva, M.; Kaschuck, Yu.

    2011-03-01

    Advanced C++ programming methods as well as fast Pulse Detection Algorithms (PDA) have been implemented in order to increase the computing speed of a LabVIEW™ data processing software developed for a Digital Pulse Shape Discrimination (DPSD) system for liquid scintillators. The newly implemented PDAs are described and compared: the most efficient method has been implemented in the data processing software, which has also been ported into C++. The comparison of the computing speeds of the new and old versions of the PDAs is presented. Program summary: Program title: DPDS - Digital Pulse Detection Software. Catalogue identifier: AEHQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHQ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 454 070. No. of bytes in distributed program, including test data, etc.: 20 987 104. Distribution format: tar.gz. Programming language: C++ (Borland Visual C++). Computer: IBM PC. Operating system: MS Windows 2000 and later… RAM: <50 Mbytes, highly dependent on settings. Classification: 4.12. External routines: Only standard Borland Visual C++ libraries. Nature of problem: A very slow pulse detection algorithm, used as standard in LabVIEW, prevents processing of the acquired data during the pause between plasma discharges in modern tokamaks. Solution method: Simple yet precise pulse detection algorithms were implemented and the whole data processing software was translated from LabVIEW into C++. This sped up the processing by up to 30 times. Restrictions: The Windows system decimal separator must be ".", not ",". Additional comments: Processing a 300 MB data file should not take longer than 10 minutes. Running time: From 1 minute to 1 hour.

  7. Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Kilgour, David P. A.; Hughes, Sam; Kilgour, Samantha L.; Mackay, C. Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K.; Clarke, David J.; Goodlett, David R.

    2017-02-01

    We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.
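
    The threshold-optimization step of Autopiquer is not specified in enough detail to reproduce here, but the core idea of testing a window for isotopic structure with autocorrelation can be sketched. In the toy example below, a window of regularly spaced peaks produces a strong autocorrelation peak at the spacing lag, while a noise-only window does not; the lag range and decision threshold are arbitrary assumptions, not the published algorithm's settings.

```python
import numpy as np

def autocorrelation(segment, max_lag):
    """Normalised autocorrelation of one spectral window for lags 1..max_lag."""
    s = np.asarray(segment, float) - np.mean(segment)
    denom = np.dot(s, s)
    if denom == 0:
        return np.zeros(max_lag)
    return np.array([np.dot(s[:-k], s[k:]) / denom for k in range(1, max_lag + 1)])

def has_structure(segment, max_lag=50, min_peak=0.3):
    """Flag a window as 'structured' if its autocorrelation shows a clear peak
    at some lag greater than 1, as expected for regularly spaced isotopic peaks."""
    ac = autocorrelation(segment, max_lag)
    return ac[1:].max() > min_peak        # ignore lag 1, which is high for any smooth data

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n = 1000
    noise = np.abs(rng.normal(0, 1, n))
    isotopic = noise.copy()
    isotopic[::20] += 15.0                 # peaks spaced every 20 samples
    print("noise window   :", has_structure(noise))     # expected False
    print("isotopic window:", has_structure(isotopic))  # expected True
```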

  8. A new real time tsunami detection algorithm for bottom pressure measurements in open ocean: characterization and benchmarks

    NASA Astrophysics Data System (ADS)

    Pignagnoli, Luca; Chierici, Francesco; Embriaco, Davide

    2010-05-01

    In the last decades the use of Bottom Pressure Recorders (BPRs) in the deep ocean environment for tsunami detection has seen relevant development. A key role in an early warning system based on BPRs is played by the tsunami detection algorithms running in real time on the BPR itself or on land. We present a new algorithm for tsunami detection based on real-time pressure data analysis by a filtering cascade. This procedure consists of tide removal, spike removal, low-pass filtering, and linear prediction or band-pass filtering; the filtered output is then matched against a given pressure threshold. Once the threshold is exceeded, a tsunami signal is detected. The main characteristics of the algorithm are its site-specific adaptability and its flexibility, which greatly enhance the detection reliability. In particular, it was shown that removing the predicted tide strongly reduces the dynamic range of the pressure time series, allowing the detection of small tsunami signals. The algorithm can also be applied to data acquired by a tide gauge. The algorithm is particularly designed and optimized to be used in an autonomous early warning system. A statistical method for algorithm evaluation has been developed in order to characterize the algorithm's features, with particular regard to false alarm probability, detection probability, and detection earliness. Different configurations of the algorithm are tested for comparison using both synthetic and real pressure data sets recorded in different environmental conditions and locations. The algorithm was installed on board the GEOSTAR abyssal station, deployed at 3264 m depth in the Gulf of Cadiz, and successfully operated for 1 year, from August 2007 to August 2008.
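
    A minimal sketch of the described filtering cascade (tide removal, spike removal, low-pass filtering, threshold test) is given below, using scipy filters; the window lengths, cutoff frequency, detection threshold, and the moving-average tide proxy are placeholders rather than the values used on the GEOSTAR station.

```python
import numpy as np
from scipy.signal import medfilt, butter, filtfilt

def detect_tsunami(pressure, fs, tide_window_s=3600, spike_kernel=5,
                   cutoff_hz=0.002, threshold=0.05):
    """Filtering-cascade detector for a bottom-pressure time series.

    pressure : pressure record (same units as threshold)
    fs       : sampling frequency in Hz
    Returns (detected, filtered): detected is True if the filtered signal
    exceeds the threshold anywhere away from the record edges.
    """
    # 1. Tide removal: subtract a long moving average as a crude tide proxy
    win = int(tide_window_s * fs) | 1                       # force odd length
    tide = np.convolve(pressure, np.ones(win) / win, mode="same")
    detided = pressure - tide
    # 2. Spike removal with a short median filter
    despiked = medfilt(detided, kernel_size=spike_kernel)
    # 3. Low-pass filter: keep tsunami-band signal, reject high-frequency noise
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    filtered = filtfilt(b, a, despiked)
    # 4. Threshold test, ignoring edges affected by the moving-average transient
    core = filtered[win:-win]
    return bool(np.any(np.abs(core) > threshold)), filtered

if __name__ == "__main__":
    fs = 1.0 / 15.0                                         # one sample every 15 s
    t = np.arange(0, 48 * 3600, 15.0)
    tide = 0.5 * np.sin(2 * np.pi * t / (12.42 * 3600))     # semidiurnal tide
    noise = np.random.default_rng(5).normal(0, 0.005, t.size)
    tsunami = 0.1 * np.exp(-((t - 24 * 3600) / 600.0) ** 2)  # small transient at 24 h
    detected, _ = detect_tsunami(tide + noise + tsunami, fs)
    print("tsunami detected:", detected)
```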

  9. A novel dynamical community detection algorithm based on weighting scheme

    NASA Astrophysics Data System (ADS)

    Li, Ju; Yu, Kai; Hu, Ke

    2015-12-01

    Network dynamics plays an important role in analyzing the correlation between function properties and topological structure. In this paper, we propose a novel dynamical iteration (DI) algorithm, which incorporates the iterative process of the membership vector with a weighting scheme, i.e. weighting W and tightness T. These new elements can be used to adjust the link strength and the node compactness for improving the speed and accuracy of community structure detection. To estimate the optimal stop time of the iteration, we utilize a new stability measure defined as the Markov random walk auto-covariance. We do not need to specify the number of communities in advance. The algorithm naturally supports overlapping communities by associating each node with a membership vector describing the node's involvement in each community. Theoretical analysis and experiments show that the algorithm can uncover communities effectively and efficiently.

  10. Improving space object detection using a Fourier likelihood ratio detection algorithm

    NASA Astrophysics Data System (ADS)

    Becker, David J.; Cain, Stephen C.

    2016-09-01

    In this paper a new detection algorithm is proposed and developed for detecting space objects in images obtained using a ground-based telescope, with the goal of improving space situational awareness. Most current space object detection algorithms rely on developing a likelihood ratio test (LRT) for the observed data based on a binary hypothesis test. These algorithms are based on the assumption that the observed data are Gaussian or Poisson distributed both under the hypothesis that a low signal-to-noise ratio (SNR) space object is present in the data and under the hypothesis that an object is absent from the data. The LRT algorithm in this paper was developed based on the assumption that the distribution of the Fourier transform of the observed data will be different when a low-SNR object is present in the data compared to when the data contain only background noise and known space objects. When an object is present, the probability distribution of the real component of the Fourier transform of the intensity was found to follow a Gaussian distribution with a mean significantly different from that of data that do not contain an object, even at low SNR levels. As the separation of these two probability distribution functions increases, it becomes more likely that an object can be detected. In this paper, simulated data are used to demonstrate the effectiveness and to highlight the benefits gained from this algorithm.
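
    As a toy illustration of the idea (a Gaussian likelihood ratio test on the real part of the Fourier transform), the sketch below estimates per-coefficient means for the 'object present' and 'object absent' hypotheses from synthetic training patches and then scores new patches; the patch size, object model, and shared-variance assumption are simplifications, not the paper's setup.

```python
import numpy as np

def fourier_llr(image_patch, mean_h0, mean_h1, var):
    """Gaussian log-likelihood ratio computed on the real part of the FFT.

    mean_h0, mean_h1 : per-coefficient means of Re{FFT} under 'no object'
                       and 'object present', estimated from training patches
    var              : shared per-coefficient variance
    Returns the summed LLR; large positive values favour 'object present'.
    """
    re = np.fft.fft2(image_patch).real.ravel()
    llr = ((re - mean_h0) ** 2 - (re - mean_h1) ** 2) / (2.0 * var)
    return llr.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    size, n_train = 16, 200

    def make_patch(with_object):
        patch = rng.normal(0.0, 1.0, (size, size))
        if with_object:
            patch[7:9, 7:9] += 2.0        # faint point-like object
        return patch

    # Estimate the two hypothesis means and a shared variance from training patches
    h0 = np.stack([np.fft.fft2(make_patch(False)).real.ravel() for _ in range(n_train)])
    h1 = np.stack([np.fft.fft2(make_patch(True)).real.ravel() for _ in range(n_train)])
    mean0, mean1 = h0.mean(0), h1.mean(0)
    var = np.concatenate([h0, h1]).var(0).mean() + 1e-9

    print("LLR without object:", fourier_llr(make_patch(False), mean0, mean1, var))
    print("LLR with object   :", fourier_llr(make_patch(True), mean0, mean1, var))
```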

  11. Airport Traffic Conflict Detection and Resolution Algorithm Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Chartrand, Ryan C.; Wilson, Sara R.; Commo, Sean A.; Ballard, Kathryn M.; Otero, Sharon D.; Barker, Glover D.

    2016-01-01

    Two conflict detection and resolution (CD&R) algorithms for the terminal maneuvering area (TMA) were evaluated in a fast-time batch simulation study at the National Aeronautics and Space Administration (NASA) Langley Research Center. One CD&R algorithm, developed at NASA, was designed to enhance surface situation awareness and provide cockpit alerts of potential conflicts during runway, taxi, and low altitude air-to-air operations. The second algorithm, Enhanced Traffic Situation Awareness on the Airport Surface with Indications and Alerts (SURF IA), was designed to increase flight crew awareness of the runway environment and facilitate an appropriate and timely response to potential conflict situations. The purpose of the study was to evaluate the performance of the aircraft-based CD&R algorithms during various runway, taxiway, and low altitude scenarios, multiple levels of CD&R system equipage, and various levels of horizontal position accuracy. Algorithm performance was assessed through various metrics including the collision rate, nuisance and missed alert rate, and alert toggling rate. The data suggests that, in general, alert toggling, nuisance and missed alerts, and unnecessary maneuvering occurred more frequently as the position accuracy was reduced. Collision avoidance was more effective when all of the aircraft were equipped with CD&R and maneuvered to avoid a collision after an alert was issued. In order to reduce the number of unwanted (nuisance) alerts when taxiing across a runway, a buffer is needed between the hold line and the alerting zone so alerts are not generated when an aircraft is behind the hold line. All of the results support RTCA horizontal position accuracy requirements for performing a CD&R function to reduce the likelihood and severity of runway incursions and collisions.

  12. Comparing Several Algorithms for Change Detection of Wetland

    NASA Astrophysics Data System (ADS)

    Yan, F.; Zhang, S.; Chang, L.

    2015-12-01

    As "the kidneys of the landscape" and "ecological supermarkets", wetland plays an important role in ecological equilibrium and environmental protection.Therefore, it is of great significance to understand the dynamic changes of the wetland. Nowadays, many index and many methods have been used in dynamic Monitoring of Wetland. However, there are no single method and no single index are adapted to detect dynamic change of wetland all over the world. In this paper, three digital change detection algorithms are applied to 2005 and 2010 Landsat Thematic Mapper (TM) images of a portion of the Northeast China to detect wetland dynamic between the two dates. The change vector analysis method (CVA) uses 6 bands of TM images to detect wetland dynamic. The tassled cap transformation is used to create three change images (change in brightness, greenness, and wetness). A new method--- Comprehensive Change Detection Method (CCDM) is introduced to detect forest dynamic change. The CCDM integrates spectral-based change detection algorithms including a Multi-Index Integrated Change Analysis (MIICA) model and a novel change model called Zone, which extracts change information from two Landsat image pairs. The MIICA model is the core module of the change detection strategy and uses four spectral indices (differenced Normalized Burn Ratio (dNBR), differenced Normalized Difference Vegetation Index (dNDVI), the Change Vector (CV) and a new index called the Relative Change Vector Maximum (RCVMAX)) to obtain the changes that occurred between two image dates. The CCDM also includes a knowledge-based system, which uses critical information on historical and current land cover conditions and trends and the likelihood of land cover change, to combine the changes from MIICA and Zone. Related test proved that CCDM method is simple, easy to operate, widely applicable, and capable of capturing a variety of natural and anthropogenic disturbances potentially associated with land cover changes on

  13. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    Thesis, March 2015. David E. Scanland, Captain, USAF. Report no. AFIT-ENS-MS-15-M-121, Department of the Air Force, Air Force Institute of Technology.

  14. EEG seizure detection and prediction algorithms: a survey

    NASA Astrophysics Data System (ADS)

    Alotaiby, Turkey N.; Alshebeili, Saleh A.; Alshawi, Tariq; Ahmad, Ishtiaq; Abd El-Samie, Fathi E.

    2014-12-01

    Epilepsy patients experience challenges in daily life due to precautions they have to take in order to cope with this condition. When a seizure occurs, it might cause injuries or endanger the life of the patients or others, especially when they are using heavy machinery, e.g., driving cars. Studies of epilepsy often rely on electroencephalogram (EEG) signals in order to analyze the behavior of the brain during seizures. Locating the seizure period in EEG recordings manually is difficult and time consuming; one often needs to skim through tens or even hundreds of hours of EEG recordings. Therefore, automatic detection of such activity is of great importance. Another potential use of EEG signal analysis is in the prediction of epileptic activities before they occur, as this will enable the patients (and caregivers) to take appropriate precautions. In this paper, we first present an overview of the seizure detection and prediction problem and provide insights on the challenges in this area. Second, we cover some of the state-of-the-art seizure detection and prediction algorithms and provide a comparison between these algorithms. Finally, we conclude with future research directions and open problems in this topic.

  15. The Successive Projections Algorithm for interval selection in trilinear partial least-squares with residual bilinearization.

    PubMed

    Gomes, Adriano de Araújo; Alcaraz, Mirta Raquel; Goicoechea, Hector C; Araújo, Mario Cesar U

    2014-02-06

    In this work the Successive Projections Algorithm is presented for interval selection in N-PLS for three-way data modeling. The proposed algorithm combines the noise-reduction properties of PLS with the possibility of discarding uninformative variables in SPA. In addition, the second-order advantage can be achieved by the residual bilinearization (RBL) procedure when an unexpected constituent is present in a test sample. For this purpose, SPA was modified in order to select intervals for use in trilinear PLS. The ability of the proposed algorithm, namely iSPA-N-PLS, was evaluated on one simulated and two experimental data sets, comparing the results to those obtained by N-PLS. In the simulated system, two analytes were quantitated in two test sets, with and without an unexpected constituent. In the first experimental system, the determination of four fluorophores (l-phenylalanine; l-3,4-dihydroxyphenylalanine; 1,4-dihydroxybenzene and l-tryptophan) was conducted with excitation-emission data matrices. In the second experimental system, quantitation of ofloxacin was performed in water samples containing two other uncalibrated quinolones (ciprofloxacin and danofloxacin) by high performance liquid chromatography with a UV-vis diode array detector. For comparison purposes, a genetic algorithm (GA) coupled with N-PLS/RBL was also used in this work. In most of the studied cases iSPA-N-PLS proved to be a promising tool for the selection of variables in second-order calibration, generating models with smaller RMSEP, when compared to both the global model using all of the sensors in two dimensions and GA-NPLS/RBL.

  16. NASA airborne radar wind shear detection algorithm and the detection of wet microbursts in the vicinity of Orlando, Florida

    NASA Technical Reports Server (NTRS)

    Britt, Charles L.; Bracalente, Emedio M.

    1992-01-01

    The algorithms used in the NASA experimental wind shear radar system for detection, characterization, and determination of windshear hazard are discussed. The performance of the algorithms in the detection of wet microbursts near Orlando is presented. Various suggested algorithms that are currently being evaluated using the flight test results from Denver and Orlando are reviewed.

  17. Optical Algorithm for Cloud Shadow Detection Over Water

    DTIC Science & Technology

    2013-02-01


  18. Efficient implementations of hyperspectral chemical-detection algorithms

    NASA Astrophysics Data System (ADS)

    Brett, Cory J. C.; DiPietro, Robert S.; Manolakis, Dimitris G.; Ingle, Vinay K.

    2013-10-01

    Many military and civilian applications depend on the ability to remotely sense chemical clouds using hyperspectral imagers, from detecting small but lethal concentrations of chemical warfare agents to mapping plumes in the aftermath of natural disasters. Real-time operation is critical in these applications but becomes difficult to achieve as the number of chemicals we search for increases. In this paper, we present efficient CPU and GPU implementations of matched-filter based algorithms so that real-time operation can be maintained with higher chemical-signature counts. The optimized C++ implementations show between 3x and 9x speedup over vectorized MATLAB implementations.
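
    The classical spectral matched filter that such implementations optimize can be written compactly in numpy, as below; this is the textbook score s' * inv(C) * (x - mu) / (s' * inv(C) * s) applied pixel by pixel, not the authors' optimized C++/GPU code, and the synthetic cube and signature are placeholders.

```python
import numpy as np

def matched_filter_scores(cube, signature):
    """Classical spectral matched filter applied to every pixel of a cube.

    cube      : (n_pixels, n_bands) radiance/reflectance matrix
    signature : (n_bands,) target (chemical) spectral signature
    Returns an (n_pixels,) array of matched-filter scores.
    """
    mu = cube.mean(axis=0)
    centered = cube - mu
    cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(cube.shape[1])  # regularized
    cov_inv = np.linalg.inv(cov)
    numerator = centered @ cov_inv @ signature
    denominator = signature @ cov_inv @ signature
    return numerator / denominator

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    n_pixels, n_bands = 5000, 60
    background = rng.normal(0, 1, (n_pixels, n_bands))
    target = rng.normal(0, 1, n_bands)
    cube = background.copy()
    cube[:50] += 0.5 * target                    # weak plume in the first 50 pixels
    scores = matched_filter_scores(cube, target)
    print("mean score in plume :", scores[:50].mean())
    print("mean score off plume:", scores[50:].mean())
```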

  19. An advanced algorithm for highway pavement fissure detection

    NASA Astrophysics Data System (ADS)

    Chen, Bei; Cao, Wenlun; He, Yuyao

    2012-04-01

    This paper presents an image detection method for pavement cracks based on fractal dimension features and designs a self-adapting algorithm for the fractal dimension interval of the pavement region. Through image pretreatment, calculation of the fractal dimension, and self-adapting calculation of the dimension interval, we obtain a location image of the damaged pavement. The experimental results for transverse cracks, longitudinal cracks, net-shaped cracks, and pit slots are contrasted with those of the Sobel operator. The results show that the two have similar capability in representing cracks, but the proposed method is more flexible in representing crack size and calculating the damage ratio.
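
    The fractal-dimension feature the paper relies on is typically estimated by box counting. The sketch below computes a box-counting dimension for a binary crack map; the paper's self-adapting dimension-interval selection is not reproduced, and the box sizes and test images are illustrative.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary image.

    mask : 2-D boolean array (True on crack pixels)
    Counts occupied boxes at several scales and fits log(count) vs log(1/size).
    """
    counts = []
    for size in box_sizes:
        h = (mask.shape[0] // size) * size       # crop so the image tiles exactly
        w = (mask.shape[1] // size) * size
        tiled = mask[:h, :w].reshape(h // size, size, w // size, size)
        occupied = tiled.any(axis=(1, 3)).sum()
        counts.append(max(occupied, 1))
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    img = np.zeros((256, 256), dtype=bool)
    img[128, :] = True                            # a straight transverse "crack"
    print("straight crack:", round(box_counting_dimension(img), 2))   # close to 1.0
    rng = np.random.default_rng(9)
    img2 = rng.random((256, 256)) < 0.5           # dense speckle fills the plane
    print("filled region :", round(box_counting_dimension(img2), 2))  # close to 2.0
```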

  20. Geolocation Assessment Algorithm for CALIPSO Using Coastline Detection

    NASA Technical Reports Server (NTRS)

    Currey, J. Chris

    2002-01-01

    Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) is a joint satellite mission between NASA and the French space agency CNES. The investigation will gather long-term, global cloud and aerosol optical and physical properties to improve climate models. The CALIPSO spacecraft is scheduled to launch in 2004 into a 98.2° inclination, 705 km circular orbit approximately 3 minutes behind the Aqua spacecraft. The payload consists of a two-wavelength polarization-sensitive lidar and two passive imagers operating in the visible (0.645 µm) and infrared (8.7-12.0 µm) spectral regions. The imagers are nadir viewing and co-aligned with the lidar. Earth-viewing measurements are geolocated to the Earth-fixed coordinate system using satellite ephemeris, Earth rotation and geoid, and instrument pointing data. The coastline detection algorithm will assess the accuracy of the CALIPSO geolocation process by analyzing Wide Field Camera (WFC) visible ocean-land boundaries. Processing space-time coincident MODIS and WFC scenes with the coastline algorithm will help verify the co-registration requirement with Moderate Resolution Imaging Spectroradiometer (MODIS) data. This paper quantifies the accuracy of the coastline geolocation assessment algorithm.

  1. Network intrusion detection by the coevolutionary immune algorithm of artificial immune systems with clonal selection

    NASA Astrophysics Data System (ADS)

    Salamatova, T.; Zhukov, V.

    2017-02-01

    The paper presents the application of the artificial immune systems apparatus as a heuristic method of network intrusion detection for the algorithmic provision of intrusion detection systems. A coevolutionary immune algorithm with clonal selection was elaborated. The algorithm's effectiveness was evaluated empirically on several different datasets. To assess its efficiency, the algorithm was compared with analogous methods. The fundamental rules underlying the solutions generated by this algorithm are described in the article.

  2. Para-GMRF: parallel algorithm for anomaly detection of hyperspectral image

    NASA Astrophysics Data System (ADS)

    Dong, Chao; Zhao, Huijie; Li, Na; Wang, Wei

    2007-12-01

    The hyperspectral imager is capable of collecting hundreds of images corresponding to different wavelength channels for the observed area simultaneously, which makes it possible to discriminate man-made objects from natural background. However, the price paid for this wealth of information is the enormous amount of data, usually hundreds of gigabytes per day. Turning the huge volume of data into useful information and knowledge in real time is critical for geoscientists. In this paper, the proposed parallel Gaussian-Markov random field (Para-GMRF) anomaly detection algorithm is an attempt to apply parallel computing technology to solve the problem. Based on the locality of the GMRF algorithm, we partition the 3-D hyperspectral image cube in the spatial domain and distribute data blocks to multiple computers for concurrent detection. Meanwhile, to achieve load balance, a work pool scheduler is designed for task assignment. The Para-GMRF algorithm is organized in a master-slave architecture, coded in the C programming language using the message passing interface (MPI) library, and tested on a Beowulf cluster. Experimental results show that the Para-GMRF algorithm successfully conquers the challenge and can be used in time-sensitive areas, such as environmental monitoring and battlefield reconnaissance.

  3. Analogue Simulation and Orbital Solving Algorithm of Astrometric Exoplanet Detection

    NASA Astrophysics Data System (ADS)

    Huang, P. H.; Ji, J. H.

    2016-09-01

    Astrometry is an effective method to detect exoplanets. It has many advantages that other detection methods do not offer, such as providing the three-dimensional planetary orbit and determining the planetary mass. Astrometry will enrich the sample of exoplanets. As the high-precision astrometric satellite Gaia (Global Astrometry Interferometer for Astrophysics) was launched in 2013, abundant long-period Jupiter-sized planets are expected to be discovered by Gaia. In this paper, we consider the α Centauri A, HD 62509, and GJ 876 systems, and generate synthetic astrometric data with the single-measurement astrometric precision of Gaia. Then we use the Lomb-Scargle periodogram to analyse the signature of planets and the Markov Chain Monte Carlo (MCMC) algorithm to fit the orbits of the planets. The simulation results coincide well with the initial solutions.

  4. Fast Parabola Detection Using Estimation of Distribution Algorithms

    PubMed Central

    Sierra-Hernandez, Juan Manuel; Avila-Garcia, Maria Susana; Rojas-Laguna, Roberto

    2017-01-01

    This paper presents a new method based on Estimation of Distribution Algorithms (EDAs) to detect parabolic shapes in synthetic and medical images. The method computes a virtual parabola using three random boundary pixels to calculate the constant values of the generic parabola equation. The resulting parabola is evaluated by matching it with the parabolic shape in the input image by using the Hadamard product as fitness function. This proposed method is evaluated in terms of computational time and compared with two implementations of the generalized Hough transform and RANSAC method for parabola detection. Experimental results show that the proposed method outperforms the comparative methods in terms of execution time about 93.61% on synthetic images and 89% on retinal fundus and human plantar arch images. In addition, experimental results have also shown that the proposed method can be highly suitable for different medical applications. PMID:28321264

  5. Fast Parabola Detection Using Estimation of Distribution Algorithms.

    PubMed

    Guerrero-Turrubiates, Jose de Jesus; Cruz-Aceves, Ivan; Ledesma, Sergio; Sierra-Hernandez, Juan Manuel; Velasco, Jonas; Avina-Cervantes, Juan Gabriel; Avila-Garcia, Maria Susana; Rostro-Gonzalez, Horacio; Rojas-Laguna, Roberto

    2017-01-01

    This paper presents a new method based on Estimation of Distribution Algorithms (EDAs) to detect parabolic shapes in synthetic and medical images. The method computes a virtual parabola using three random boundary pixels to calculate the constant values of the generic parabola equation. The resulting parabola is evaluated by matching it with the parabolic shape in the input image by using the Hadamard product as fitness function. This proposed method is evaluated in terms of computational time and compared with two implementations of the generalized Hough transform and RANSAC method for parabola detection. Experimental results show that the proposed method outperforms the comparative methods in terms of execution time about 93.61% on synthetic images and 89% on retinal fundus and human plantar arch images. In addition, experimental results have also shown that the proposed method can be highly suitable for different medical applications.
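
    The candidate-evaluation step shared by both records above (fit a parabola through three boundary pixels, then score it by the Hadamard product with the input image) can be sketched as follows; the EDA search over candidate point triplets is not reproduced, and the image and points are synthetic placeholders.

```python
import numpy as np

def parabola_from_points(p1, p2, p3):
    """Solve for (a, b, c) of y = a*x**2 + b*x + c through three (x, y) points."""
    xs = np.array([p1[0], p2[0], p3[0]], dtype=float)
    ys = np.array([p1[1], p2[1], p3[1]], dtype=float)
    A = np.stack([xs ** 2, xs, np.ones(3)], axis=1)
    return np.linalg.solve(A, ys)

def hadamard_fitness(coeffs, edge_map):
    """Rasterise the candidate parabola and score it by its elementwise
    (Hadamard) overlap with the binary edge map."""
    h, w = edge_map.shape
    a, b, c = coeffs
    x = np.arange(w)
    y = np.round(a * x ** 2 + b * x + c).astype(int)
    valid = (y >= 0) & (y < h)
    candidate = np.zeros_like(edge_map, dtype=float)
    candidate[y[valid], x[valid]] = 1.0
    return float((candidate * edge_map).sum())     # Hadamard product, then sum

if __name__ == "__main__":
    h, w = 120, 160
    edge_map = np.zeros((h, w))
    xs = np.arange(w)
    ys = np.round(0.01 * (xs - 80) ** 2 + 20).astype(int)   # true parabolic edge
    keep = (ys >= 0) & (ys < h)
    edge_map[ys[keep], xs[keep]] = 1.0

    coeffs = parabola_from_points((40, 36), (80, 20), (120, 36))  # three boundary pixels
    print("fitted a, b, c:", np.round(coeffs, 4))
    print("fitness       :", hadamard_fitness(coeffs, edge_map))
```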

  6. A new detection algorithm for microcalcification clusters in mammographic screening

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Ma, Yide; Li, Yunsong

    2015-05-01

    A novel approach for microcalcification cluster detection is proposed. First, we briefly analyze mammographic images with microcalcification lesions to confirm that these lesions have much greater gray values than normal regions. After summarizing the specific features of microcalcification clusters in mammographic screening, we focus on the preprocessing step, which includes eliminating the background, image enhancement, and eliminating the pectoral muscle. In detail, the Chan-Vese model is used for eliminating the background. Then, we apply a combination of a morphology method and an edge detection method. After an AND operation and Sobel filtering, we use the Hough transform; the result is effective for eliminating the pectoral muscle, whose gray level is approximately that of microcalcifications. Additionally, the enhancement step is achieved by morphology. We put effort into mammographic image preprocessing in order to achieve lower computational complexity. As is well known, it is difficult to robustly analyze mammograms due to the low contrast between normal and lesion tissues, and there is also much noise in such images. After this preprocessing algorithm, a method based on blob detection is performed to detect microcalcification clusters according to their specific features. The proposed algorithm employs the Laplace operator to improve the Difference of Gaussians (DoG) function for low-contrast images. A preliminary evaluation of the proposed method is performed on a known public database, namely MIAS, rather than on synthetic images. The comparison experiments and Cohen's kappa coefficients all demonstrate that our proposed approach can potentially obtain better microcalcification cluster detection results in terms of accuracy, sensitivity, and specificity.
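
    As a generic illustration of the blob-detection idea (not the paper's Laplacian-refined DoG, Chan-Vese, or pectoral-muscle steps), the sketch below applies a plain difference-of-Gaussians filter and keeps local maxima above a threshold; the sigmas, threshold, and synthetic image are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def dog_blob_detect(image, sigma1=1.0, sigma2=2.0, threshold=0.1):
    """Detect small bright blobs with a difference-of-Gaussians response.

    Returns (rows, cols) of local maxima of the DoG response above threshold.
    """
    dog = gaussian_filter(image, sigma1) - gaussian_filter(image, sigma2)
    local_max = (dog == maximum_filter(dog, size=5)) & (dog > threshold)
    return np.nonzero(local_max)

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    img = rng.normal(0.4, 0.02, (128, 128))              # smooth tissue-like background
    for r, c in [(30, 40), (31, 44), (80, 90)]:           # tiny bright spots
        img[r, c] += 1.0
    rows, cols = dog_blob_detect(img)
    print(list(zip(rows.tolist(), cols.tolist())))
```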

  7. Data detection algorithms for multiplexed quantum dot encoding.

    PubMed

    Goss, Kelly C; Messier, Geoff G; Potter, Mike E

    2012-02-27

    A group of quantum dots can be designed to have a unique spectral emission by varying the size of the quantum dots (wavelength) and number of quantum dots (intensity). This technique has been previously proposed for biological tags and object identification. The potential of this system lies in the ability to have a large number of distinguishable wavelengths and intensity levels. This paper presents a communications system model for MxQDs including the interference between neighbouring QD colours and detector noise. An analytical model of the signal-to-noise ratio of a Charge-Coupled Device (CCD) spectrometer is presented and confirmed with experimental results. We then apply a communications system perspective and propose data detection algorithms that increase the readability of the quantum dots tags. It is demonstrated that multiplexed quantum dot barcodes can be read with 99.7% accuracy using the proposed data detection algorithms in a system with 6 colours and 6 intensity values resulting in 46,655 unique spectral codes.

  8. Enhancing time-series detection algorithms for automated biosurveillance.

    PubMed

    Tokars, Jerome I; Burkom, Howard; Xing, Jian; English, Roseanne; Bloom, Steven; Cox, Kenneth; Pavlin, Julie A

    2009-04-01

    BioSense is a US national system that uses data from health information systems for automated disease surveillance. We studied 4 time-series algorithm modifications designed to improve sensitivity for detecting artificially added data. To test these modified algorithms, we used reports of daily syndrome visits from 308 Department of Defense (DoD) facilities and 340 hospital emergency departments (EDs). At a constant alert rate of 1%, sensitivity was improved for both datasets by using a minimum standard deviation (SD) of 1.0, a 14-28 day baseline duration for calculating mean and SD, and an adjustment for total clinic visits as a surrogate denominator. Stratifying baseline days into weekdays versus weekends to account for day-of-week effects increased sensitivity for the DoD data but not for the ED data. These enhanced methods may increase sensitivity without increasing the alert rate and may improve the ability to detect outbreaks by using automated surveillance system data.
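
    A rough sketch of the described modifications (a 14-28 day baseline, a standard-deviation floor of 1.0, and total visits as a surrogate denominator) is given below; the exact formulas are illustrative assumptions rather than the BioSense implementation.

```python
import numpy as np

def daily_alerts(syndrome_counts, total_visits, baseline_days=28, min_sd=1.0, z_alert=3.0):
    """Simple adaptive alerting in the spirit of the described modifications.

    For each day, the recent baseline of syndrome-to-total-visit ratios is
    rescaled by that day's total visits to give an expected count; the z-score
    uses a standard deviation floored at min_sd.
    """
    syndrome_counts = np.asarray(syndrome_counts, float)
    total_visits = np.asarray(total_visits, float)
    ratios = syndrome_counts / np.maximum(total_visits, 1.0)
    alerts = np.zeros(len(syndrome_counts), dtype=bool)
    for day in range(baseline_days, len(syndrome_counts)):
        base = ratios[day - baseline_days:day]
        expected = base.mean() * total_visits[day]
        sd = max(base.std(ddof=1) * total_visits[day], min_sd)
        z = (syndrome_counts[day] - expected) / sd
        alerts[day] = z > z_alert
    return alerts

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    days = 120
    visits = rng.poisson(500, days)
    counts = rng.binomial(visits, 0.02).astype(float)   # ~2% baseline syndrome rate
    counts[100] += 25                                   # injected outbreak signal
    flagged = np.nonzero(daily_alerts(counts, visits))[0]
    print("alert days:", flagged.tolist())
```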

  9. Efficient List Extension Algorithm Using Multiple Detection Orders for Soft-Output MIMO Detection

    NASA Astrophysics Data System (ADS)

    Kim, Kilhwan; Jung, Yunho; Lee, Seongjoo; Kim, Jaeseok

    This paper proposes an efficient list extension algorithm for soft-output multiple-input-multiple-output (soft-MIMO) detection. This algorithm extends the list of candidate vectors based on the vector selected by initial detection, in order to solve the empty-set problem, while reducing the number of additional vectors. The additional vectors are obtained from multiple detection orders, from which high-quality soft output can be generated. Furthermore, a method to reduce the complexity of the determination of the multiple detection orders is described. From simulation results for a 4×4 system with 16- and 64-quadrature amplitude modulation (QAM) and rate 1/2 and 5/6 duo-binary convolutional turbo code (CTC), the soft-MIMO detection to which the proposed list extension was applied showed a performance degradation of less than 0.5 dB at a bit error rate (BER) of 10^-5, compared to that of soft-output maximum-likelihood detection (soft-MLD), for all code rate and modulation pairs, while the complexity of the proposed list extension was approximately 38% and 17% of that of an existing algorithm with similar performance in a 4×4 system using 16- and 64-QAM, respectively.

  10. Dynamic multiple thresholding breast boundary detection algorithm for mammograms

    SciTech Connect

    Wu, Yi-Ta; Zhou Chuan; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Daly, Caroline Plowden; Douglas, Julie A.; Zhang Yiheng; Sahiner, Berkman; Shi Jiazheng; Wei Jun

    2010-01-15

    Purpose: Automated detection of breast boundary is one of the fundamental steps for computer-aided analysis of mammograms. In this study, the authors developed a new dynamic multiple thresholding based breast boundary (MTBB) detection method for digitized mammograms. Methods: A large data set of 716 screen-film mammograms (442 CC view and 274 MLO view) obtained from consecutive cases of an Institutional Review Board approved project were used. An experienced breast radiologist manually traced the breast boundary on each digitized image using a graphical interface to provide a reference standard. The initial breast boundary (MTBB-Initial) was obtained by dynamically adapting the threshold to the gray level range in local regions of the breast periphery. The initial breast boundary was then refined by using gradient information from horizontal and vertical Sobel filtering to obtain the final breast boundary (MTBB-Final). The accuracy of the breast boundary detection algorithm was evaluated by comparison with the reference standard using three performance metrics: The Hausdorff distance (HDist), the average minimum Euclidean distance (AMinDist), and the area overlap measure (AOM). Results: In comparison with the authors' previously developed gradient-based breast boundary (GBB) algorithm, it was found that 68%, 85%, and 94% of images had HDist errors less than 6 pixels (4.8 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 89%, 90%, and 96% of images had AMinDist errors less than 1.5 pixels (1.2 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 96%, 98%, and 99% of images had AOM values larger than 0.9 for GBB, MTBB-Initial, and MTBB-Final, respectively. The improvement by the MTBB-Final method was statistically significant for all the evaluation measures by the Wilcoxon signed rank test (p<0.0001). Conclusions: The MTBB approach that combined dynamic multiple thresholding and gradient information provided better performance than the breast boundary

  11. One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms.

    PubMed

    Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus

    2016-05-18

    Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings of two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations. The evaluation used both event duration parameters and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of which algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple events, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transactions on Biomedical Engineering, 60(9):2484-2493, 2013) outperforms all algorithms in data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.

  12. Algorithms for lineaments detection in processing of multispectral images

    NASA Astrophysics Data System (ADS)

    Borisova, D.; Jelev, G.; Atanassov, V.; Koprinkova-Hristova, Petia; Alexiev, K.

    2014-10-01

    Satellite remote sensing is a universal tool to investigate the different areas of Earth and environmental sciences. The advancement of the implementation capabilities of optoelectronic devices, which are long-term-tested in the laboratory and the field and are mounted on board remote sensing platforms, further improves the capability of instruments to acquire information about the Earth and its resources at global, regional, and local scales. With the advent of new high-spatial and spectral resolution satellite and aircraft imagery, new applications for large-scale mapping and monitoring become possible. Integration with Geographic Information Systems (GIS) allows synergistic processing of multi-source spatial and spectral data. Here we present the results of joint project DFNI I01/8, funded by the Bulgarian Science Fund, focused on algorithms for preprocessing and processing spectral data using correction methods and visual and automatic interpretation. The objects of this study are lineaments. Lineaments are essentially line features on the Earth's surface that are a sign of geological structures. Geological lineaments usually appear in multispectral images as lines, edges, or linear shapes resulting from color variations of the surface structures. The basic geometric attributes of a line are orientation, length, and curvature. The detection of geological lineaments is an important operation in the exploration for mineral deposits, in the investigation of active fault patterns, in the prospecting of water resources, in protecting people, etc. In this study an integrated approach to detecting lineaments is applied. It combines the visual interpretation of various geological and geographical indications in multispectral satellite images, the application of spatial analysis in GIS, and the automatic processing of the multispectral images by Canny

  13. A comparison of edge detecting algorithms in magnetic imaging

    NASA Astrophysics Data System (ADS)

    Ekinci, Yunus Levent

    2010-05-01

    Directional-derivative-based algorithms and special filters are widely used for enhancing the magnetic anomalies of causative sources. Edge-detecting algorithms effectively aid geologic interpretation and may also bring out subtle details in the data without requiring prior information about the nature of the sources; thus some model parameters of the source body may be estimated in this way, which may guide the inversion process. These techniques have the ability to exhibit maxima over lateral magnetization contrasts if the source magnetization and the ambient field are directed vertically, and hence the edges and lateral outlines of the causative sources may be determined. In the interpretation of magnetic data, the filters most frequently used to bring out fine details are vertical derivatives, downward continuation and other forms of high-pass filters. Because shallow bodies produce magnetic anomalies with maximum horizontal gradients located nearly over their edges if the source magnetization and ambient field are directed vertically, the most popular technique is the first-order total horizontal derivative. Abrupt changes in mass magnetization may be located in an anomaly map by using the total horizontal gradient technique. Many filters and algorithms based on the use of directional derivatives of the potential field data have been developed and suggested to determine source parameters such as the locations of lateral source boundaries. In this research, the efficiency of several edge detectors, such as the Sobel filter (SED), analytic signal (AS), horizontal derivatives of the theta map (THD), horizontal derivatives of the tilt angle (TAHD) and normalized standard deviations (NSD), was compared. Tests were performed on theoretically calculated magnetic anomalies resulting from 3D prismatic bodies for different cases. Before the application of the edge detector algorithms to the produced data, reduction to the pole
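
    For reference, a minimal sketch of the first-order total horizontal derivative, one of the enhancement techniques compared above, applied to a gridded anomaly; the synthetic prism anomaly and grid spacing are assumptions.

    ```python
    # Total horizontal derivative (THD) of a gridded magnetic anomaly; its maxima
    # tend to lie over lateral magnetization contrasts.
    import numpy as np

    def total_horizontal_derivative(field, dx=1.0, dy=1.0):
        """THD = sqrt((dF/dx)^2 + (dF/dy)^2) of a gridded anomaly."""
        dfdy, dfdx = np.gradient(field, dy, dx)
        return np.hypot(dfdx, dfdy)

    # Synthetic anomaly: a rectangular "prism" response (assumed, for illustration)
    y, x = np.mgrid[0:128, 0:128]
    anomaly = np.where((40 < x) & (x < 80) & (50 < y) & (y < 90), 100.0, 0.0)
    thd = total_horizontal_derivative(anomaly, dx=25.0, dy=25.0)  # 25 m grid spacing
    print("THD maximum located at", np.unravel_index(np.argmax(thd), thd.shape))
    ```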

  14. A Block Successive Lower-Bound Maximization Algorithm for the Maximum Pseudo-Likelihood Estimation of Fully Visible Boltzmann Machines.

    PubMed

    Nguyen, Hien D; Wood, Ian A

    2016-03-01

    Maximum pseudo-likelihood estimation (MPLE) is an attractive method for training fully visible Boltzmann machines (FVBMs) due to its computational scalability and the desirable statistical properties of the MPLE. No published algorithms for MPLE have been proven to be convergent or monotonic. In this note, we present an algorithm for the MPLE of FVBMs based on the block successive lower-bound maximization (BSLM) principle. We show that the BSLM algorithm monotonically increases the pseudo-likelihood values and that the sequence of BSLM estimates converges to the unique global maximizer of the pseudo-likelihood function. The relationship between the BSLM algorithm and the gradient ascent (GA) algorithm for MPLE of FVBMs is also discussed, and a convergence criterion for the GA algorithm is given.

  15. [The analysis and comparison of different edge detection algorithms in ultrasound B-scan images].

    PubMed

    Zhang, Luo-ping; Yang, Bo-yuan; Wang, Chun-hong

    2006-05-01

    In this paper, some familiar edge detection algorithms for ultrasound B-scan images are analyzed and studied. The results show that the Sobel, Prewitt and Laplacian operators are sensitive to noise, the Hough transform is suited to whole-image detection, while the LoG operator has zero mean and does not alter the overall dynamic range. Accordingly, the LoG algorithm is preferable.
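
    The following toy comparison (not from the paper) contrasts a Sobel gradient magnitude with a LoG response on a noisy synthetic image, using SciPy; the noise level and image are assumptions.

    ```python
    # Sobel vs. LoG on a noisy synthetic "lesion" image.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    img = np.zeros((128, 128))
    img[32:96, 32:96] = 1.0                       # bright square "lesion"
    img += 0.2 * rng.standard_normal(img.shape)   # speckle-like noise

    sobel_mag = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
    log_resp = ndimage.gaussian_laplace(img, sigma=2.0)  # LoG: smooth, then Laplacian

    # Horizontal zero crossings of the LoG response mark edge locations
    zero_cross = np.sign(log_resp[:, :-1]) != np.sign(log_resp[:, 1:])
    print("Sobel mean response:", sobel_mag.mean(), "LoG zero crossings:", zero_cross.sum())
    ```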

  16. Cable Damage Detection System and Algorithms Using Time Domain Reflectometry

    SciTech Connect

    Clark, G A; Robbins, C L; Wade, K A; Souza, P R

    2009-03-24

    This report describes the hardware system and the set of algorithms we have developed for detecting damage in cables for the Advanced Development and Process Technologies (ADAPT) Program. This program is part of the W80 Life Extension Program (LEP). The system could be generalized for application to other systems in the future. Critical cables can undergo various types of damage (e.g. short circuits, open circuits, punctures, compression) that manifest as changes in the dielectric/impedance properties of the cables. For our specific problem, only one end of the cable is accessible, and no exemplars of actual damage are available. This work addresses the detection of dielectric/impedance anomalies in transient time domain reflectometry (TDR) measurements on the cables. The approach is to interrogate the cable using time domain reflectometry (TDR) techniques, in which a known pulse is inserted into the cable, and reflections from the cable are measured. The key operating principle is that any important cable damage will manifest itself as an electrical impedance discontinuity that can be measured in the TDR response signal. Machine learning classification algorithms are effectively eliminated from consideration, because only a small number of cables is available for testing, so a sufficient sample size is not attainable. Nonetheless, a key requirement is to achieve very high probability of detection and very low probability of false alarm. The approach is to compare TDR signals from possibly damaged cables to signals or an empirical model derived from reference cables that are known to be undamaged. This requires that the TDR signals are reasonably repeatable from test to test on the same cable, and from cable to cable. Empirical studies show that the repeatability issue is the 'long pole in the tent' for damage detection, because it has been difficult to achieve reasonable repeatability. This one factor dominated the project. The two-step model-based approach is
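
    The comparison-to-reference idea can be sketched as a simple residual test against an undamaged-cable trace; the signals, their alignment, and the 5-sigma threshold below are illustrative assumptions rather than the report's actual two-step model-based detector.

    ```python
    # Flag impedance anomalies by comparing a TDR trace to a reference trace.
    import numpy as np

    def tdr_anomaly(test_trace, reference_trace, noise_sigma, k=5.0):
        """Return indices where the residual exceeds k * noise_sigma."""
        residual = test_trace - reference_trace
        return np.flatnonzero(np.abs(residual) > k * noise_sigma), residual

    t = np.linspace(0.0, 1.0, 2000)
    reference = np.exp(-5 * t) * np.sin(40 * t)   # stand-in reference TDR response
    test = reference.copy()
    test[1200:1230] += 0.05                       # small impedance discontinuity
    noise_sigma = 0.005                           # assumed, e.g. from repeat measurements

    hits, _ = tdr_anomaly(test, reference, noise_sigma)
    print("anomalous samples:", hits[:5], "... count =", len(hits))
    ```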

  17. Design of infrasound-detection system via adaptive LMSTDE algorithm

    NASA Technical Reports Server (NTRS)

    Khalaf, C. S.; Stoughton, J. W.

    1984-01-01

    A proposed solution to an aviation safety problem is based on passive detection of turbulent weather phenomena through their infrasonic emission. This thesis describes a system design that is adequate for detection and bearing evaluation of infrasounds. An array of four sensors, with the appropriate hardware, is used for the detection part. Bearing evaluation is based on estimates of time delays between sensor outputs. The generalized cross correlation (GCC), as the conventional time-delay estimation (TDE) method, is first reviewed. An adaptive TDE approach, using the least mean square (LMS) algorithm, is then discussed. A comparison between the two techniques is made and the advantages of the adaptive approach are listed. The behavior of the GCC, as a Roth processor, is examined for the anticipated signals. It is shown that the Roth processor has the desired effect of sharpening the peak of the correlation function. It is also shown that the LMSTDE technique is an equivalent implementation of the Roth processor in the time domain. A LMSTDE lead-lag model, with a variable stability coefficient and a convergence criterion, is designed.
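
    A minimal sketch of LMS-based time-delay estimation between two channels: an adaptive FIR filter maps one channel to the other, and the index of the dominant tap estimates the delay. The signals, filter length, and step size are assumptions.

    ```python
    # LMS adaptive filter used for time-delay estimation between two sensor channels.
    import numpy as np

    rng = np.random.default_rng(1)
    true_delay = 7
    x = rng.standard_normal(5000)                 # broadband "infrasound-like" signal
    y = np.roll(x, true_delay) + 0.1 * rng.standard_normal(x.size)

    n_taps, mu = 32, 0.01
    w = np.zeros(n_taps)
    for n in range(n_taps, x.size):
        u = x[n - n_taps + 1:n + 1][::-1]         # most recent samples first
        e = y[n] - w @ u                          # prediction error
        w += 2 * mu * e * u                       # LMS weight update

    print("estimated delay:", np.argmax(np.abs(w)), "samples (true:", true_delay, ")")
    ```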

  18. Generalized Hough transform: A useful algorithm for signal path detection

    NASA Astrophysics Data System (ADS)

    Monari, Jader; Montebugnoli, Stelio; Orlati, Andrea; Ferri, Massimo; Leone, Giorgio

    2006-02-01

    How is it possible to recognize ETI signals coming from exoplanets? This is one of the questions that SETI researchers must answer. In early 1998, the Italian SETI program [S. Montebugnoli, et al., SETItalia, A new era in bioastronomy, ASP Conference Series, vol. 213, 2000, pp. 501-504] started in Medicina with the installation of the Serendip IV 24-million-channel digital spectrometer. This system daily acquires a huge quantity of data to be processed off line in order to detect possible ETI signals. The programs devoted to this topic are collectively called SALVE 2. Here a natural evolution of a previous effort is presented, which was based on a simple Hough transform and was limited to the detection of short linear tracks in the joint time frequency matrix (JTFM) stored by SIV. The new generalized Hough algorithm allows us to detect sinusoidal tracks by transforming the two-dimensional Cartesian JTF space (x, y) into a four-dimensional generalized Hough space whose axes are the sine parameters: amplitude, frequency, phase and offset. At the end of the paper some results, obtained with the computation of real and simulated JTFMs, are shown.
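
    The parameter-space voting idea can be illustrated with a coarse accumulator over the four sine parameters; the grids, tolerance, and synthetic track below are assumptions and are far coarser than a real SETI pipeline.

    ```python
    # Toy parameter-space voting for a sinusoidal track in a time-frequency point set.
    import itertools
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(200)
    track = 50 + 10 * np.sin(2 * np.pi * 0.01 * t + 0.5)       # true sinusoidal track
    points = np.column_stack([t, track + rng.normal(0, 0.3, t.size)])

    amps = np.linspace(5, 15, 11)
    freqs = np.linspace(0.005, 0.02, 16)
    phases = np.linspace(0, 2 * np.pi, 12, endpoint=False)
    offsets = np.linspace(40, 60, 11)

    accumulator = np.zeros((amps.size, freqs.size, phases.size, offsets.size))
    for (ia, a), (jf, f), (kp, p), (lo, o) in itertools.product(
            enumerate(amps), enumerate(freqs), enumerate(phases), enumerate(offsets)):
        model = o + a * np.sin(2 * np.pi * f * points[:, 0] + p)
        accumulator[ia, jf, kp, lo] = np.sum(np.abs(points[:, 1] - model) < 1.0)

    ia, jf, kp, lo = np.unravel_index(np.argmax(accumulator), accumulator.shape)
    print("best (amp, freq, phase, offset):", amps[ia], freqs[jf], phases[kp], offsets[lo])
    ```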

  19. Automatic ultrasonic breast lesions detection using support vector machine based algorithm

    NASA Astrophysics Data System (ADS)

    Yeh, Chih-Kuang; Miao, Shan-Jung; Fan, Wei-Che; Chen, Yung-Sheng

    2007-03-01

    It is difficult to automatically detect tumors and extract lesion boundaries in ultrasound images due to the variance in shape, the interference from speckle noise, and the low contrast between objects and background. The enhancement of the ultrasonic image becomes a significant task before performing lesion classification, which was usually done with manual delineation of the tumor boundaries in previous works. In this study, a linear support vector machine (SVM) based algorithm is proposed for ultrasound breast image training and classification. A disk expansion algorithm is then applied for automatically detecting lesion boundaries. A set of sub-images including smooth and irregular boundaries in tumor objects and those in a speckle-noised background are trained by the SVM algorithm to produce an optimal classification function. Based on this classification model, each pixel within an ultrasound image is classified into either an object-oriented or a background-oriented pixel. This enhanced binary image can highlight the object and suppress the speckle noise, and it can be regarded as a degraded paint character (DPC) image containing closure noise, which is well known in the perceptual organization of psychology. An effective scheme of removing closure noise using an iterative disk expansion method has been successfully demonstrated in our previous works. The boundary detection of ultrasonic breast lesions can thus be made equivalent to the removal of speckle noise. By applying the disk expansion method to the binary image, we can obtain a significant radius-based image where the radius for each pixel represents the corresponding disk covering the specific object information. Finally, a signal transmission process is used for searching the complete breast lesion region and thus the desired lesion boundary can be effectively and automatically determined. Our algorithm can be performed iteratively until all desired objects are detected. Simulations and clinical images were introduced to

  20. Jitter Estimation Algorithms for Detection of Pathological Voices

    NASA Astrophysics Data System (ADS)

    Silva, Dárcio G.; Oliveira, Luís C.; Andrea, Mário

    2009-12-01

    This work is focused on the evaluation of different methods to estimate the amount of jitter present in speech signals. The jitter value is a measure of the irregularity of a quasiperiodic signal and is a good indicator of the presence of pathologies in the larynx such as vocal fold nodules or a vocal fold polyp. Given the irregular nature of the speech signal, each jitter estimation algorithm relies on its own model, making a direct comparison of the results very difficult. For this reason, the evaluation of the different jitter estimation methods was targeted at their ability to detect pathological voices. Two databases were used for this evaluation: a subset of the MEEI database and a smaller database acquired in the scope of this work. The results showed that there were significant differences in the performance of the algorithms being evaluated. Surprisingly, in the larger database the best results were not achieved with the commonly used relative jitter, measured as a percentage of the glottal cycle, but with absolute jitter values measured in microseconds. Also, the newly proposed measure for jitter, LocJitt, performed in general equal to or better than the commonly used tools MDVP and Praat.
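
    For reference, the two jitter measures contrasted above can be computed from a sequence of glottal cycle lengths as follows; the period values are invented for illustration.

    ```python
    # Absolute and relative jitter from a sequence of glottal cycle lengths.
    import numpy as np

    periods_s = np.array([0.00812, 0.00820, 0.00805, 0.00831, 0.00809, 0.00818])

    # Absolute jitter: mean absolute difference between consecutive periods (seconds)
    jitter_abs = np.mean(np.abs(np.diff(periods_s)))

    # Relative jitter: absolute jitter as a percentage of the mean period
    jitter_rel = 100.0 * jitter_abs / np.mean(periods_s)

    print(f"absolute jitter = {jitter_abs * 1e6:.1f} us, relative jitter = {jitter_rel:.2f} %")
    ```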

  1. PATHOME: an algorithm for accurately detecting differentially expressed subpathways

    PubMed Central

    Nam, S; Chang, H R; Kim, K-T; Kook, M-C; Hong, D; Kwon, C H; Jung, H R; Park, H S; Powis, G; Liang, H; Park, T; Kim, Y H

    2014-01-01

    The translation of high-throughput gene expression data into biologically meaningful information remains a bottleneck. We developed a novel computational algorithm, PATHOME, for detecting differentially expressed biological pathways. This algorithm employs straightforward statistical tests to evaluate the significance of differential expression patterns along subpathways. Applying it to gene expression data sets of gastric cancer (GC), we compared its performance with those of other leading programs. Based on a literature-driven reference set, PATHOME showed greater consistency in identifying known cancer-related pathways. For the WNT pathway uniquely identified by PATHOME, we validated its involvement in gastric carcinogenesis through experimental perturbation of both cell lines and animal models. We identified HNF4α-WNT5A regulation in the cross-talk between the AMPK metabolic pathway and the WNT signaling pathway, and further identified WNT5A as a potential therapeutic target for GC. We have demonstrated PATHOME to be a powerful tool, with improved sensitivity for identifying disease-related dysregulated pathways. PMID:24681952

  2. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the Model of trajectories, we extract, and formally prove, high level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.

  3. Ionospheric bubbles detection algorithms: Analysis in low latitudes

    NASA Astrophysics Data System (ADS)

    Magdaleno, S.; Cueto, M.; Herraiz, M.; Rodríguez-Caderot, G.; Sardón, E.; Rodríguez, I.

    2013-04-01

    Plasma depletions (or bubbles) are strong reductions in the ionospheric F-region plasma density due to the appearance of a Rayleigh-Taylor instability in the post-sunset period, producing severe radio signal disruptions for signals crossing them. Most of the plasma depletions are confined to the Appleton Anomaly region, which also shows strong scintillation activity. Therefore, stations located in the vicinity of the geomagnetic equator are expected to be frequently affected by the presence of plasma depletions. This paper provides a comparison between the plasma depletion detection results achieved using two algorithms: one developed by the National Institute for Aerospace Technology and the University Complutense of Madrid, and one developed by GMV. Six equatorial stations distributed all over the world and different solar activity and seasonal conditions have been selected to analyze the algorithms' response to different plasma depletion characteristics. A regional behavior analysis of the plasma depletion occurrence and characteristics is also provided.

  4. An algorithm for pavement crack detection based on multiscale space

    NASA Astrophysics Data System (ADS)

    Liu, Xiang-long; Li, Qing-quan

    2006-10-01

    Conventional human-visual and manual field pavement crack detection methods are costly, time-consuming, dangerous, labor-intensive and subjective. They have various drawbacks, such as a high degree of variability in the measurement results, an inability to provide meaningful quantitative information, frequent inconsistencies in crack details over space and across evaluations, and long measurement cycles. With the development of public transportation and the growth of material flow systems, conventional methods fall far short of meeting these demands; automatic pavement-state data gathering and analysis systems have therefore become the focus of attention in the field, and developments in computer technology, digital image acquisition, image processing and multi-sensor technology have made such systems possible. However, the complexity of the image processing has remained the bottleneck of the whole system. Accordingly, a robust and efficient parallel pavement crack detection algorithm based on Multi-Scale Space is proposed in this paper. The proposed method is based on the facts that: (1) the crack pixels in pavement images are darker than their surroundings and continuous; (2) the threshold values of gray-level pavement images are strongly related to the mean value and standard deviation of the pixel-gray intensities. The Multi-Scale Space method is used to improve the data processing speed and minimize the effects of image noise. Experimental results demonstrate remarkable advantages: (1) it can correctly discover tiny cracks, even in very noisy pavement images; (2) the efficiency and accuracy of the proposed algorithm are superior; (3) its application-dependent nature can simplify the design of the entire system.
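
    Fact (2) above suggests a threshold tied to the image mean and standard deviation; a minimal sketch, with an invented pavement image and an assumed sensitivity factor k, is given below.

    ```python
    # Mean/standard-deviation-based thresholding of dark crack pixels.
    import numpy as np

    rng = np.random.default_rng(3)
    pavement = rng.normal(150, 10, size=(256, 256))     # bright, textured background
    pavement[100:102, :] -= 60                          # thin, dark crack

    mean, std = pavement.mean(), pavement.std()
    k = 3.0                                             # assumed sensitivity factor
    crack_mask = pavement < (mean - k * std)            # darker-than-background pixels

    print("crack pixels found:", int(crack_mask.sum()))
    ```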

  5. Successful Mainstreaming: Proven Ways To Detect and Correct Special Needs.

    ERIC Educational Resources Information Center

    Choate, Joyce S., Ed.

    This book of 16 chapters by various contributors is intended to assist teachers to translate theory into practical classroom strategies for mainstreaming children with disabilities. It includes guidelines for the detection of 101 special needs and more than 1000 corrective strategies. Chapters have the following titles and authors: "Teaching All…

  6. Motion mode recognition and step detection algorithms for mobile phone users.

    PubMed

    Susi, Melania; Renaudin, Valérie; Lachapelle, Gérard

    2013-01-24

    Microelectromechanical Systems (MEMS) technology is playing a key role in the design of the new generation of smartphones. Thanks to their reduced size and power consumption, MEMS sensors can be embedded in such mobile devices to increase their functionality. However, MEMS sensors cannot provide accurate autonomous localization without external updates, e.g., from GPS signals, since their signals are degraded by various errors. When these sensors are fixed on the user's foot, the stance phases of the foot can easily be determined and periodic Zero velocity UPdaTes (ZUPTs) are performed to bound the position error. When the sensor is in the hand, the situation becomes much more complex. First of all, the hand motion can be decoupled from the general motion of the user. Second, the characteristics of the inertial signals can differ depending on the carrying mode. Therefore, algorithms for characterizing the gait cycle of a pedestrian using a handheld device have been developed. A classifier able to detect motion modes typical for mobile phone users has been designed and implemented. According to the detected motion mode, adaptive step detection algorithms are applied. The success rate of the step detection process is found to be higher than 97% in all motion modes.
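
    A step detector of the kind adapted per motion mode can be sketched as peak picking on a low-pass-filtered accelerometer norm; the synthetic signal, cut-off frequency, and peak spacing below are assumptions, not the authors' adaptive thresholds.

    ```python
    # Step counting via peaks in the low-pass-filtered accelerometer norm.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    fs = 100.0                                            # sampling rate (Hz), assumed
    t = np.arange(0, 10, 1 / fs)
    acc_norm = 9.81 + 2.0 * np.sin(2 * np.pi * 1.8 * t)   # ~1.8 steps per second
    acc_norm += 0.5 * np.random.default_rng(4).standard_normal(t.size)

    b, a = butter(4, 3.0 / (fs / 2), btype="low")         # 3 Hz low-pass filter
    smooth = filtfilt(b, a, acc_norm)

    peaks, _ = find_peaks(smooth, height=10.5, distance=int(0.3 * fs))
    print("steps detected:", len(peaks), "(expected ~18)")
    ```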

  7. Motion Mode Recognition and Step Detection Algorithms for Mobile Phone Users

    PubMed Central

    Susi, Melania; Renaudin, Valérie; Lachapelle, Gérard

    2013-01-01

    Microelectromechanical Systems (MEMS) technology is playing a key role in the design of the new generation of smartphones. Thanks to their reduced size and power consumption, MEMS sensors can be embedded in such mobile devices to increase their functionality. However, MEMS sensors cannot provide accurate autonomous localization without external updates, e.g., from GPS signals, since their signals are degraded by various errors. When these sensors are fixed on the user's foot, the stance phases of the foot can easily be determined and periodic Zero velocity UPdaTes (ZUPTs) are performed to bound the position error. When the sensor is in the hand, the situation becomes much more complex. First of all, the hand motion can be decoupled from the general motion of the user. Second, the characteristics of the inertial signals can differ depending on the carrying mode. Therefore, algorithms for characterizing the gait cycle of a pedestrian using a handheld device have been developed. A classifier able to detect motion modes typical for mobile phone users has been designed and implemented. According to the detected motion mode, adaptive step detection algorithms are applied. The success rate of the step detection process is found to be higher than 97% in all motion modes. PMID:23348038

  8. Accurate colon residue detection algorithm with partial volume segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Liang, Zhengrong; Zhang, PengPeng; Kutcher, Gerald J.

    2004-05-01

    Colon cancer is the second leading cause of cancer-related death in the United States. Earlier detection and removal of polyps can dramatically reduce the chance of developing a malignant tumor. Due to some limitations of optical colonoscopy used in the clinic, many researchers have developed virtual colonoscopy as an alternative technique, in which accurate colon segmentation is crucial. However, the partial volume effect and the existence of residue make it very challenging. The electronic colon cleaning technique proposed by Chen et al. is a very attractive method, which is also a kind of hard segmentation method. As mentioned in their paper, some artifacts were produced, which might affect accurate colon reconstruction. In our paper, instead of labeling each voxel with a unique label or tissue type, the percentage of different tissues within each voxel, which we call a mixture, was considered in establishing a maximum a posteriori probability (MAP) image-segmentation framework. A Markov random field (MRF) model was developed to reflect the spatial information for the tissue mixtures. The spatial information based on hard segmentation was used to determine which tissue types are in a specific voxel. Parameters of each tissue class were estimated by the expectation-maximization (EM) algorithm during the MAP tissue-mixture segmentation. Real CT experimental results demonstrated that the partial volume effects between four tissue types have been precisely detected. Meanwhile, the residue has been electronically removed and a very smooth and clean interface along the colon wall has been obtained.

  9. Comparison of different classification algorithms for landmine detection using GPR

    NASA Astrophysics Data System (ADS)

    Karem, Andrew; Fadeev, Aleksey; Frigui, Hichem; Gader, Paul

    2010-04-01

    The Edge Histogram Detector (EHD) is a landmine detection algorithm that has been developed for ground penetrating radar (GPR) sensor data. It has been tested extensively and has demonstrated excellent performance. The EHD consists of two main components. The first one maps the raw data to a lower dimension using edge histogram based feature descriptors. The second component uses a possibilistic K-Nearest Neighbors (pK-NN) classifier to assign a confidence value. In this paper we show that performance of the baseline EHD could be improved by replacing the pK-NN classifier with model based classifiers. In particular, we investigate two such classifiers: Support Vector Regression (SVR), and Relevance Vector Machines (RVM). We investigate the adaptation of these classifiers to the landmine detection problem with GPR, and we compare their performance to the baseline EHD with a pK-NN classifier. As in the baseline EHD, we treat the problem as a two class classification problem: mine vs. clutter. Model parameters for the SVR and the RVM classifiers are estimated from training data using logarithmic grid search. For testing, soft labels are assigned to the test alarms. A confidence of zero indicates the maximum probability of being a false alarm. Similarly, a confidence of one represents the maximum probability of being a mine. Results on large and diverse GPR data collections show that the proposed modification to the classifier component can improve the overall performance of the EHD significantly.
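
    The classifier swap can be illustrated (not with the authors' code or data) by regressing soft labels in [0, 1] from edge-histogram-like features with an SVR tuned on a logarithmic grid:

    ```python
    # SVR producing soft mine/clutter confidences; features and labels are synthetic.
    import numpy as np
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVR

    rng = np.random.default_rng(5)
    X = rng.standard_normal((200, 16))                     # stand-in edge-histogram features
    y = (X[:, 0] + 0.3 * rng.standard_normal(200) > 0).astype(float)  # 1 = mine, 0 = clutter

    grid = {"C": np.logspace(-2, 2, 5), "gamma": np.logspace(-3, 1, 5)}
    search = GridSearchCV(SVR(kernel="rbf"), grid, cv=5)   # logarithmic grid search
    search.fit(X, y)

    confidence = np.clip(search.predict(X[:5]), 0.0, 1.0)  # soft labels in [0, 1]
    print("best params:", search.best_params_, "confidences:", np.round(confidence, 2))
    ```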

  10. A frameshift error detection algorithm for DNA sequencing projects.

    PubMed Central

    Fichant, G A; Quentin, Y

    1995-01-01

    During the determination of DNA sequences, frameshift errors are not the most frequent but they are the most bothersome as they corrupt the amino acid sequence over several residues. Detection of such errors by sequence alignment is only possible when related sequences are found in the databases. To avoid this limitation, we have developed a new tool based on the distribution of non-overlapping 3-tuples or 6-tuples in the three frames of an ORF. The method relies upon the result of a correspondence analysis. It has been extensively tested on Bacillus subtilis and Saccharomyces cerevisiae sequences and has also been examined with human sequences. The results indicate that it can detect frameshift errors affecting as few as 20 bp with a low rate of false positives (no more than 1.0/1000 bp scanned). The proposed algorithm can be used to scan a large collection of data, but it is mainly intended for laboratory practice as a tool for checking the quality of the sequences produced during a sequencing project. PMID:7659513

  11. A successful anaemia management algorithm that achieves and maintains optimum haemoglobin status.

    PubMed

    Benton, Sharon

    2008-06-01

    The paper describes the need for the introduction of an anaemia management algorithm. It discusses the problems the unit had in constantly reviewing and re-prescribing ESAs to maintain optimum haemoglobin levels for its patients. The method used to create and use the algorithm is explained. The findings demonstrate the beneficial effects of using the algorithm. The paper concludes with the recommendation that algorithms should be more widely used for better treatment outcomes.

  12. Using the Gilbert-Johnson-Keerthi Algorithm for Collision Detection in System Effectiveness Modeling

    DTIC Science & Technology

    2015-09-01

    ARL-TR-7474, US Army Research Laboratory, September 2015. Benjamin A Breech. Approved for public release; distribution is unlimited. I present an overview of the Gilbert-Johnson-Keerthi (GJK) algorithm for collision detection using a geometrical approach that relies on

  13. SURF IA Conflict Detection and Resolution Algorithm Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Chartrand, Ryan C.; Wilson, Sara R.; Commo, Sean A.; Barker, Glover D.

    2012-01-01

    The Enhanced Traffic Situational Awareness on the Airport Surface with Indications and Alerts (SURF IA) algorithm was evaluated in a fast-time batch simulation study at the National Aeronautics and Space Administration (NASA) Langley Research Center. SURF IA is designed to increase flight crew situation awareness of the runway environment and facilitate an appropriate and timely response to potential conflict situations. The purpose of the study was to evaluate the performance of the SURF IA algorithm under various runway scenarios, multiple levels of conflict detection and resolution (CD&R) system equipage, and various levels of horizontal position accuracy. This paper gives an overview of the SURF IA concept, simulation study, and results. Runway incursions are a serious aviation safety hazard. As such, the FAA is committed to reducing the severity, number, and rate of runway incursions by implementing a combination of guidance, education, outreach, training, technology, infrastructure, and risk identification and mitigation initiatives [1]. Progress has been made in reducing the number of serious incursions - from a high of 67 in Fiscal Year (FY) 2000 to 6 in FY2010. However, the rate of all incursions has risen steadily over recent years - from a rate of 12.3 incursions per million operations in FY2005 to a rate of 18.9 incursions per million operations in FY2010 [1, 2]. The National Transportation Safety Board (NTSB) also considers runway incursions to be a serious aviation safety hazard, listing runway incursion prevention as one of their most wanted transportation safety improvements [3]. The NTSB recommends that immediate warning of probable collisions/incursions be given directly to flight crews in the cockpit [4].

  14. The feasibility test of state-of-the-art face detection algorithms for vehicle occupant detection

    NASA Astrophysics Data System (ADS)

    Makrushin, Andrey; Dittmann, Jana; Vielhauer, Claus; Langnickel, Mirko; Kraetzer, Christian

    2010-01-01

    Vehicle seat occupancy detection systems are designed to prevent the deployment of airbags at unoccupied seats, thus avoiding the considerable cost imposed by the replacement of airbags. Occupancy detection can also improve passenger comfort, e.g. by activating air-conditioning systems. The most promising development perspectives are seen in optical sensing systems which have become cheaper and smaller in recent years. The most plausible way to check the seat occupancy by occupants is the detection of presence and location of heads, or more precisely, faces. This paper compares the detection performances of the three most commonly used and widely available face detection algorithms: Viola-Jones, Kienzle et al. and Nilsson et al. The main objective of this work is to identify whether one of these systems is suitable for use in a vehicle environment with variable and mostly non-uniform illumination conditions, and whether any one face detection system can be sufficient for seat occupancy detection. The evaluation of detection performance is based on a large database comprising 53,928 video frames containing proprietary data collected from 39 persons of both sexes and different ages and body height as well as different objects such as bags and rearward/forward facing child restraint systems.
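
    Running the Viola-Jones detector, one of the three algorithms compared, is straightforward with OpenCV's stock frontal-face cascade; the frame source and detection parameters below are assumptions.

    ```python
    # Viola-Jones face detection with OpenCV's bundled Haar cascade.
    import cv2

    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    detector = cv2.CascadeClassifier(cascade_path)

    frame = cv2.imread("cabin_frame.png")                  # hypothetical in-vehicle frame
    assert frame is not None, "replace with a real frame"
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)                          # helps with uneven illumination

    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                      minSize=(40, 40))
    print("faces (occupied seats?) detected:", len(faces))
    ```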

  15. How Small Can Impact Craters Be Detected at Large Scale by Automated Algorithms?

    NASA Astrophysics Data System (ADS)

    Bandeira, L.; Machado, M.; Pina, P.; Marques, J. S.

    2013-12-01

    intended to be detected: the lower this limit is, the higher the false detection rates are. A detailed evaluation is performed, with results broken down by crater dimension and image or surface type, showing that automated detection in large crater datasets built from HiRISE imagery at 25 cm/pixel resolution can be done successfully (high correct and low false-positive detection rates) down to a crater dimension of about 8-10 m, or 32-40 pixels. [1] Martins L, Pina P, Marques JS, Silveira M, 2009, Crater detection by a boosting approach. IEEE Geoscience and Remote Sensing Letters 6: 127-131. [2] Salamuniccar G, Loncaric S, Pina P, Bandeira L, Saraiva J, 2011, MA130301GT catalogue of Martian impact craters and advanced evaluation of crater detection algorithms using diverse topography and image datasets. Planetary and Space Science 59: 111-131. [3] Bandeira L, Ding W, Stepinski T, 2012, Detection of sub-kilometer craters in high resolution planetary images using shape and texture features. Advances in Space Research 49: 64-74.

  16. A stereo-vision hazard-detection algorithm to increase planetary lander autonomy

    NASA Astrophysics Data System (ADS)

    Woicke, Svenja; Mooij, Erwin

    2016-05-01

    For future landings on any celestial body, increasing lander autonomy as well as decreasing risk are primary objectives. Both risk reduction and an increase in autonomy can be achieved by including hazard detection and avoidance in the guidance, navigation, and control loop. One of the main challenges in hazard detection and avoidance is the reconstruction of accurate elevation models, as well as slope and roughness maps. Multiple methods for acquiring the inputs for hazard maps are available. The main distinction can be made between active and passive methods. Passive methods (cameras) have budgetary advantages compared to active sensors (radar, light detection and ranging), but it is necessary to prove that these methods deliver sufficiently good maps. Therefore, this paper discusses hazard detection using stereo vision. To facilitate a successful landing, no more than 1% wrong detections (hazards that are not identified) are allowed. Based on a sensitivity analysis, it was found that using a stereo set-up with a baseline of ≤ 2 m is feasible at altitudes of ≤ 200 m, with false positives of less than 1%. It was thus shown that stereo-based hazard detection is an effective means to decrease the landing risk and increase lander autonomy. In conclusion, the proposed algorithm is a promising candidate for future landers.
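
    A hedged sketch of the front end of such a stereo-vision hazard detector: compute a disparity map, convert it to depth, and flag steep regions. The image files, camera geometry, slope limit, and the use of the per-pixel depth gradient as a crude slope proxy are all assumptions, not the paper's algorithm.

    ```python
    # Disparity-based depth map with a simple slope-proxy hazard mask.
    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical stereo pair
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    assert left is not None and right is not None, "replace with real images"

    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

    baseline_m, focal_px = 2.0, 800.0                      # assumed stereo geometry
    depth = np.where(disparity > 0, baseline_m * focal_px / disparity, np.nan)

    # Per-pixel depth gradient, used here only as a crude proxy for surface slope
    dzdy, dzdx = np.gradient(depth)
    slope_proxy_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    hazard_mask = slope_proxy_deg > 10.0                   # assumed slope limit
    print("hazardous pixels:", int(np.nansum(hazard_mask)))
    ```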

  17. Detection and clustering of features in aerial images by neuron network-based algorithm

    NASA Astrophysics Data System (ADS)

    Vozenilek, Vit

    2015-12-01

    The paper presents an algorithm for the detection and clustering of features in aerial photographs based on artificial neural networks. The presented approach is not focused on the detection of specific topographic features, but on the combination of general feature analysis and its use for clustering and backward projection of clusters onto the aerial image. The basis of the algorithm is the calculation of the total error of the network and the adjustment of the network weights to minimize that error. A classic bipolar sigmoid was used as the activation function of the neurons and the basic method of backpropagation was used for learning. To verify that a set of features is able to represent the image content from the user's perspective, a web application was compiled (ASP.NET on the Microsoft .NET platform). The main achievements include the finding that man-made objects in aerial images can be successfully identified by the detection of shapes and anomalies. It was also found that an appropriate combination of comprehensive features describing the colors and selected shapes of individual areas can be useful for image analysis.

  18. Fast Algorithm and Application of Wavelet Multiple-scale Edge Detection Filter

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Yang, Min; Tong, Qiang; Zhang, Yue

    This paper focuses on the algorithmic theory of the two-dimensional wavelet transform used for image edge detection. To simplify the algorithm, the authors propose to decompose the two-dimensional dyadic wavelet into a product of one-dimensional dyadic wavelets, so that the filter can be used to perform wavelet multiple-scale edge detection quickly. The process by which the wavelet transform is used for multiple-scale edge detection is also discussed in detail. Finally, the algorithm is applied to vehicle license image detection. Compared with the results of the Sobel, Canny and other operators, this algorithm shows great feasibility and effectiveness.

  19. The infrared moving object detection and security detection related algorithms based on W4 and frame difference

    NASA Astrophysics Data System (ADS)

    Yin, Jiale; Liu, Lei; Li, He; Liu, Qiankun

    2016-07-01

    This paper presents infrared moving object detection and related security detection algorithms for video surveillance based on the classical W4 and frame difference algorithms. The classical W4 algorithm is one of the more powerful background subtraction algorithms applied to infrared images, and it can detect moving objects accurately, completely and quickly. However, the classical W4 algorithm can only cope with slight movement of the background. The error becomes bigger and bigger in a long-term surveillance system, since the background model is unchanged once established. In this paper, we present a detection algorithm based on the classical W4 and frame difference. It not only overcomes false detections caused by abrupt changes in the background, but also eliminates the holes caused by frame differencing. Based on this we further design various security detection related algorithms, such as illegal intrusion alarms, illegal persistence alarms and illegal displacement alarms. We compare our method with the classical W4, frame difference, and other state-of-the-art methods. The experiments detailed in this paper show that the proposed method outperforms the classical W4 and frame difference and serves well for the security detection related algorithms.
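
    The combination of a maintained background model with frame differencing can be sketched as follows; this is in the spirit of the scheme described above, not the authors' exact W4 update rules, and the video source and thresholds are assumptions.

    ```python
    # Background subtraction combined with frame differencing for moving-object masks.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("infrared.avi")                 # hypothetical IR sequence
    ok, prev = cap.read()
    assert ok, "replace with a real video path"
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY).astype(np.float32)
    background = prev.copy()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

        diff_bg = np.abs(gray - background) > 25           # background subtraction mask
        diff_fr = np.abs(gray - prev) > 15                 # frame-difference mask
        moving = np.logical_and(diff_bg, diff_fr)          # suppress both kinds of false hits

        # Slowly update the background only where nothing is moving
        background[~moving] = 0.95 * background[~moving] + 0.05 * gray[~moving]
        prev = gray
        print("moving pixels:", int(moving.sum()))

    cap.release()
    ```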

  20. Successive smoothing algorithm for constructing the semiempirical model developed at ONERA to predict unsteady aerodynamic forces. [aeroelasticity in helicopters

    NASA Technical Reports Server (NTRS)

    Petot, D.; Loiseau, H.

    1982-01-01

    Unsteady aerodynamic methods adopted for the study of aeroelasticity in helicopters are considered with focus on the development of a semiempirical model of unsteady aerodynamic forces acting on an oscillating profile at high incidence. The successive smoothing algorithm described leads to the model's coefficients in a very satisfactory manner.

  1. Airport Traffic Conflict Detection and Resolution Algorithm Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Chartrand, Ryan C.; Wilson, Sara R.; Commo, Sean A.; Otero, Sharon D.; Barker, Glover D.

    2012-01-01

    A conflict detection and resolution (CD&R) concept for the terminal maneuvering area (TMA) was evaluated in a fast-time batch simulation study at the National Aeronautics and Space Administration (NASA) Langley Research Center. The CD&R concept is being designed to enhance surface situation awareness and provide cockpit alerts of potential conflicts during runway, taxi, and low altitude air-to-air operations. The purpose of the study was to evaluate the performance of aircraft-based CD&R algorithms in the TMA, as a function of surveillance accuracy. This paper gives an overview of the CD&R concept, simulation study, and results. The Next Generation Air Transportation System (NextGen) concept for the year 2025 and beyond envisions the movement of large numbers of people and goods in a safe, efficient, and reliable manner [1]. NextGen will remove many of the constraints in the current air transportation system, support a wider range of operations, and provide an overall system capacity up to three times that of current operating levels. Emerging NextGen operational concepts [2], such as four-dimensional trajectory based airborne and surface operations, equivalent visual operations, and super density arrival and departure operations, require a different approach to air traffic management and as a result, a dramatic shift in the tasks, roles, and responsibilities for the flight deck and air traffic control (ATC) to ensure a safe, sustainable air transportation system.

  2. Algorithms for Detecting Significantly Mutated Pathways in Cancer

    NASA Astrophysics Data System (ADS)

    Vandin, Fabio; Upfal, Eli; Raphael, Benjamin J.

    Recent genome sequencing studies have shown that the somatic mutations that drive cancer development are distributed across a large number of genes. This mutational heterogeneity complicates efforts to distinguish functional mutations from sporadic, passenger mutations. Since cancer mutations are hypothesized to target a relatively small number of cellular signaling and regulatory pathways, a common approach is to assess whether known pathways are enriched for mutated genes. However, restricting attention to known pathways will not reveal novel cancer genes or pathways. An alternative strategy is to examine mutated genes in the context of genome-scale interaction networks that include both well characterized pathways and additional gene interactions measured through various approaches. We introduce a computational framework for de novo identification of subnetworks in a large gene interaction network that are mutated in a significant number of patients. This framework includes two major features. First, we introduce a diffusion process on the interaction network to define a local neighborhood of "influence" for each mutated gene in the network. Second, we derive a two-stage multiple hypothesis test to bound the false discovery rate (FDR) associated with the identified subnetworks. We test these algorithms on a large human protein-protein interaction network using mutation data from two recent studies: glioblastoma samples from The Cancer Genome Atlas and lung adenocarcinoma samples from the Tumor Sequencing Project. We successfully recover pathways that are known to be important in these cancers, such as the p53 pathway. We also identify additional pathways, such as the Notch signaling pathway, that have been implicated in other cancers but not previously reported as mutated in these samples. Our approach is the first, to our knowledge, to demonstrate a computationally efficient strategy for de novo identification of statistically significant mutated subnetworks. We

  3. Evaluation of stereo vision obstacle detection algorithms for off-road autonomous navigation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry

    2005-01-01

    Reliable detection of non-traversable hazards is a key requirement for off-road autonomous navigation. A detailed description of each obstacle detection algorithm and their performance on the surveyed obstacle course is presented in this paper.

  4. Target detection using the background model from the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.

    2013-05-01

    The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is an algorithm based on graph theory that constructs a topological model of the background in a scene, and computes an anomalousness ranking for all of the pixels in the image with respect to the background in order to identify pixels with uncommon or strange spectral signatures. The pixels that are modeled as background are clustered into groups or connected components, which could be representative of spectral signatures of materials present in the background. Therefore, the idea of using the background components given by TAD in target detection is explored in this paper. In this way, these connected components are characterized in three different approaches, where the mean signature and endmembers for each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and Adaptive Subspace Detector (ASD). Likewise, the covariance matrix of those connected components is estimated and used in detectors: Constrained Energy Minimization (CEM) and Adaptive Coherence Estimator (ACE). The performance of these approaches and the different detectors is compared with a global approach, where the background characterization is derived directly from the image. Experiments and results using self-test data set provided as part of the RIT blind test target detection project are shown.
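
    One of the covariance-based detectors mentioned, the Adaptive Coherence Estimator (ACE), can be sketched as below; the data, the target signature, and the choice of background pixels are illustrative assumptions (here the background statistics come from a synthetic clutter set rather than from TAD components).

    ```python
    # Adaptive Coherence Estimator (ACE) scores for a set of pixel spectra.
    import numpy as np

    def ace_scores(pixels, target, background):
        """pixels: (N, B) spectra; target: (B,); background: (M, B) spectra."""
        mu = background.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(background, rowvar=False))
        s = target - mu
        x = pixels - mu
        num = (x @ cov_inv @ s) ** 2
        den = (s @ cov_inv @ s) * np.einsum("ij,jk,ik->i", x, cov_inv, x)
        return num / den

    rng = np.random.default_rng(6)
    bands = 30
    background = rng.standard_normal((500, bands))         # stand-in background spectra
    target_sig = np.linspace(0.0, 1.0, bands)              # stand-in target signature
    scene = np.vstack([background[:10], background[10:20] + 0.8 * target_sig])

    scores = ace_scores(scene, target_sig, background)
    print("mean ACE score, clutter vs target-like:", scores[:10].mean(), scores[10:].mean())
    ```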

  5. Partitioning of a Signal Detection Algorithm to a Heterogeneous Multicomputing Platform

    DTIC Science & Technology

    2003-09-23

    Partitioning of a Signal Detection Algorithm to a Heterogeneous Multicomputing Platform. Michael Vinskus, Principal Software Engineer, Mercury Computer Systems, Inc., 2003.

  6. Target Impact Detection Algorithm Using Computer-aided Design (CAD) Model Geometry

    DTIC Science & Technology

    2014-09-01

    Technical Report ARMET-TR-13024 (AD-E403 558), Target Impact Detection Algorithm Using Computer-Aided Design (CAD) Model Geometry. This report documents a method and algorithm to export geometry from a three-dimensional, computer-aided design (CAD) model in a format that can be

  7. A new algorithm for the detection of seismic quiescence: introduction of the RTM algorithm, a modified RTL algorithm

    NASA Astrophysics Data System (ADS)

    Nagao, Toshiyasu; Takeuchi, Akihiro; Nakamura, Kenji

    2011-03-01

    There are a number of reports on seismic quiescence phenomena before large earthquakes. The RTL algorithm is a weighted-coefficient statistical method that takes into account the magnitude, occurrence time, and place of earthquakes when seismicity pattern changes before large earthquakes are investigated. However, we consider the original RTL algorithm to be overweighted on distance. In this paper, we introduce a modified RTL algorithm, called the RTM algorithm, and apply it to three large earthquakes in Japan, namely, the Hyogo-ken Nanbu earthquake in 1995 (MJMA 7.3), the Noto Hanto earthquake in 2007 (MJMA 6.9), and the Iwate-Miyagi Nairiku earthquake in 2008 (MJMA 7.2), as test cases. Because this algorithm uses several parameters to characterize the weighted coefficients, multiparameter sets have to be prepared for the tests. The results show that the RTM algorithm is more sensitive than the RTL algorithm to seismic quiescence phenomena. This paper represents the first step in a series of future analyses of seismic quiescence phenomena using the RTM algorithm. At this moment, all surveyed parameters are empirically selected for use in the method. We have to consider the physical meaning of the "best fit" parameters, such as their relation to ΔCFS, among others, in future analyses.

  8. Low-Complexity Saliency Detection Algorithm for Fast Perceptual Video Coding

    PubMed Central

    Liu, Pengyu; Jia, Kebin

    2013-01-01

    A low-complexity saliency detection algorithm for perceptual video coding is proposed; low-level encoding information is adopted as the characteristics of visual perception analysis. Firstly, this algorithm employs motion vector (MV) to extract temporal saliency region through fast MV noise filtering and translational MV checking procedure. Secondly, spatial saliency region is detected based on optimal prediction mode distributions in I-frame and P-frame. Then, it combines the spatiotemporal saliency detection results to define the video region of interest (VROI). The simulation results validate that the proposed algorithm can avoid a large amount of computation work in the visual perception characteristics analysis processing compared with other existing algorithms; it also has better performance in saliency detection for videos and can realize fast saliency detection. It can be used as a part of the video standard codec at medium-to-low bit-rates or combined with other algorithms in fast video coding. PMID:24489495

  9. Distributed learning automata-based algorithm for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Khomami, Mohammad Mehdi Daliri; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-03-01

    Community structure is an important and universal topological property of many complex networks such as social and information networks. The detection of communities in a network is a significant technique for understanding the structure and function of networks. In this paper, we propose an algorithm based on distributed learning automata for community detection (DLACD) in complex networks. In the proposed algorithm, each vertex of the network is equipped with a learning automaton. Through cooperation among the network of learning automata and updating of the action probabilities of each automaton, the algorithm iteratively tries to identify high-density local communities. The performance of the proposed algorithm is investigated through a number of simulations on popular synthetic and real networks. Experimental results in comparison with popular community detection algorithms such as Walktrap, Danon greedy optimization, fuzzy community detection, multi-resolution community detection and label propagation demonstrate the superiority of DLACD in terms of modularity, NMI, performance, min-max cut and coverage.

  10. Contaminant detection on poultry carcasses using hyperspectral data: Part I. Algorithms for selection of individual wavebands

    NASA Astrophysics Data System (ADS)

    Nakariyakul, Songyot; Casasent, David P.

    2007-09-01

    Contaminant detection on chicken carcasses is an important product inspection application. The four contaminant types of interest contain three types of feces from different gastrointestinal regions (duodenum, ceca, and colon) and ingesta (undigested food) from the gizzard. Use of automated or semi-automated inspection systems for detecting fecal contaminant regions is of great interest. Hyperspectral data provided by ARS (Athens, GA) were used to examine detection of contaminants on carcasses. We address quasi-optimal algorithms for selecting a set of spectral bands (wavelengths) in hyperspectral data for on-line contaminant detection (feature selection). We introduce our new improved forward floating selection (IFFS) algorithm and compare its performance to that of other state-of-the-art feature selection algorithms. Our initial results indicate that our method gives an excellent detection rate and performs better than other feature selection algorithms. We also show that combination feature selection algorithms perform worse.
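
    For orientation, a sketch of plain sequential forward floating selection (SFFS), the family of wrapper methods that IFFS improves on; this is not the authors' IFFS, and the data, classifier, and band budget are assumptions.

    ```python
    # Simplified sequential forward floating selection over spectral bands.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def criterion(X, y, bands):
        """Cross-validated accuracy of a simple classifier on the chosen bands."""
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X[:, bands], y, cv=3).mean()

    def sffs(X, y, n_select, max_rounds=50):
        selected = []
        for _ in range(max_rounds):
            if len(selected) >= n_select:
                break
            # Forward step: add the band that improves the criterion the most
            candidates = [b for b in range(X.shape[1]) if b not in selected]
            best = max(candidates, key=lambda b: criterion(X, y, selected + [b]))
            selected.append(best)
            # Floating step: drop one earlier band if that actually helps
            if len(selected) > 2:
                base = criterion(X, y, selected)
                for b in [s for s in selected if s != best]:
                    reduced = [s for s in selected if s != b]
                    if criterion(X, y, reduced) > base:
                        selected = reduced
                        break
        return selected

    rng = np.random.default_rng(7)
    X = rng.standard_normal((150, 20))                   # stand-in spectral bands
    y = (X[:, 3] + X[:, 11] > 0).astype(int)             # contaminated vs. clean, synthetic
    print("selected bands:", sffs(X, y, n_select=4))
    ```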

  11. Fast algorithm for probabilistic bone edge detection (FAPBED)

    NASA Astrophysics Data System (ADS)

    Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.

    2005-04-01

    The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays and Ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies which create the need to develop a framework that is robust to various uncertainties, and can be useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieve accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel, and compounds the results of these methods to yield a final, low-noise, probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An un-optimized fully automatic Matlab code runs in five minutes per CT volume on average, and was validated by comparison against hand-segmented gold standards. The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean

  12. Detection of Local/Regional Events in Kuwait Using Next-Generation Detection Algorithms

    SciTech Connect

    Gok, M. Rengin; Al-Jerri, Farra; Dodge, Douglas; Al-Enezi, Abdullah; Hauk, Terri; Mellors, R.

    2014-12-10

    Seismic networks around the world use conventional triggering algorithms to detect seismic signals in order to locate local/regional seismic events. The Kuwait National Seismological Network (KNSN) of the Kuwait Institute of Scientific Research (KISR) operates seven broad-band and short-period three-component stations in Kuwait. The network is equipped with Nanometrics digitizers and uses Antelope and Guralp acquisition software for processing and archiving the data. In this study, we selected 10 days of archived, hourly segmented continuous data from five stations (Figure 1) and 250 days of continuous recording at MIB. For the temporary deployment, our selection criteria were based on KNSN catalog intensity for the period of time over which we tested the method. An autonomous event detection and clustering framework is employed to produce a more complete catalog for this short period of time. The goal is to illustrate the effectiveness of the technique and to pursue the framework for a longer period of time.

  13. Detection algorithm of infrared small target based on improved SUSAN operator

    NASA Astrophysics Data System (ADS)

    Liu, Xingmiao; Wang, Shicheng; Zhao, Jing

    2010-10-01

    Methods of detecting small moving targets in infrared image sequences that contain moving nuisance objects and background noise are analyzed in this paper. A novel infrared small target detection algorithm based on an improved SUSAN operator is put forward. The algorithm selects two templates for infrared small target detection: one larger than the small target size and one equal to the small target size. First, the algorithm uses the big template to calculate the USAN of each pixel in the image and detects the small targets, the edges of the image and isolated noise pixels. Then the algorithm uses the other template to calculate the USAN of the pixels detected in the first step and improves the principles of the SUSAN algorithm based on the characteristics of the small target, so that the algorithm detects only small targets and is not sensitive to the edge pixels of the image or isolated noise pixels. The interference of the image edges and isolated noise points is thereby removed and the candidate target points can be identified. At last, the target is detected by utilizing the continuity and consistency of target movement. The experimental results indicate that the improved SUSAN detection algorithm can quickly and effectively detect infrared small targets.
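
    The USAN count at the heart of a SUSAN-style detector can be sketched as follows; the template radius, brightness threshold, and synthetic image are assumptions, and only the single-template computation is shown, not the proposed two-template logic.

    ```python
    # USAN count: number of similar-brightness pixels inside a circular template.
    import numpy as np

    def usan_count(image, radius, t=27):
        h, w = image.shape
        yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        offsets = np.argwhere(yy ** 2 + xx ** 2 <= radius ** 2) - radius
        counts = np.zeros_like(image, dtype=int)
        for y in range(radius, h - radius):
            for x in range(radius, w - radius):
                nucleus = image[y, x]
                neigh = image[y + offsets[:, 0], x + offsets[:, 1]]
                counts[y, x] = np.sum(np.abs(neigh - nucleus) <= t)
        return counts

    img = np.full((40, 40), 60.0)
    img[20, 20] = 200.0                                    # a one-pixel "small target"
    usan_big = usan_count(img, radius=3)                   # template larger than the target
    print("USAN at target:", usan_big[20, 20], "USAN in background:", usan_big[10, 10])
    ```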

  14. AUTOMATIC DETECTION ALGORITHM OF DYNAMIC PRESSURE PULSES IN THE SOLAR WIND

    SciTech Connect

    Zuo, Pingbing; Feng, Xueshang; Wang, Yi; Xie, Yanqiong; Li, Huijun; Xu, Xiaojun E-mail: fengx@spaceweather.ac.cn

    2015-04-20

    Dynamic pressure pulses (DPPs) in the solar wind are a significant phenomenon closely related to the solar-terrestrial connection and physical processes of solar wind dynamics. In order to automatically identify DPPs from solar wind measurements, we develop a procedure with a three-step detection algorithm that is able to rapidly select DPPs from the plasma data stream and simultaneously define the transition region where large dynamic pressure variations occur and demarcate the upstream and downstream region by selecting the relatively quiet status before and after the abrupt change in dynamic pressure. To demonstrate the usefulness, efficiency, and accuracy of this procedure, we have applied it to the Wind observations from 1996 to 2008 by successfully obtaining the DPPs. The procedure can also be applied to other solar wind spacecraft observation data sets with different time resolutions.

  15. A Zero Velocity Detection Algorithm Using Inertial Sensors for Pedestrian Navigation Systems

    PubMed Central

    Park, Sang Kyeong; Suh, Young Soo

    2010-01-01

    In pedestrian navigation systems, the position of a pedestrian is computed using an inertial navigation algorithm. In the algorithm, the zero velocity updating plays an important role, where zero velocity intervals are detected and the velocity error is reset. To use the zero velocity updating, it is necessary to detect zero velocity intervals reliably. A new zero velocity detection algorithm is proposed in the paper, where only one gyroscope value is used. A Markov model is constructed using segmentation of gyroscope outputs instead of using gyroscope outputs directly, which makes the zero velocity detection more reliable. PMID:22163402
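
    As a rough illustration of the zero-velocity detection idea (not the authors' Markov-model-on-segments method), the sketch below flags still intervals from a single gyroscope channel using a magnitude threshold and a minimum-duration constraint; the threshold and duration values are assumptions.

```python
import numpy as np

def detect_zero_velocity(gyro, fs, threshold=0.3, min_duration=0.1):
    """Flag zero-velocity samples from a single gyroscope channel.

    gyro: 1-D array of angular-rate samples (rad/s)
    fs: sampling frequency (Hz)
    threshold: angular-rate magnitude below which the foot is assumed still
    min_duration: shortest accepted zero-velocity interval (s)
    """
    still = np.abs(np.asarray(gyro, float)) < threshold   # raw per-sample decision
    min_len = int(min_duration * fs)

    # Keep only runs of 'still' samples that are long enough.
    zv = np.zeros(len(still), dtype=bool)
    start = None
    for i, s in enumerate(np.append(still, False)):        # sentinel ends the last run
        if s and start is None:
            start = i
        elif not s and start is not None:
            if i - start >= min_len:
                zv[start:i] = True
            start = None
    return zv
```

    The detected intervals would then feed the zero-velocity update (velocity-error reset) of the inertial navigation filter.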

  16. A zero velocity detection algorithm using inertial sensors for pedestrian navigation systems.

    PubMed

    Park, Sang Kyeong; Suh, Young Soo

    2010-01-01

    In pedestrian navigation systems, the position of a pedestrian is computed using an inertial navigation algorithm. In the algorithm, the zero velocity updating plays an important role, where zero velocity intervals are detected and the velocity error is reset. To use the zero velocity updating, it is necessary to detect zero velocity intervals reliably. A new zero velocity detection algorithm is proposed in the paper, where only one gyroscope value is used. A Markov model is constructed using segmentation of gyroscope outputs instead of using gyroscope outputs directly, which makes the zero velocity detection more reliable.

  17. Rocketdyne safety algorithm: Space Shuttle main engine fault detection

    NASA Astrophysics Data System (ADS)

    Norman, Arnold M., Jr.

    1994-07-01

    The Rocketdyne Safety Algorithm (RSA) has been developed to the point of use on the TTBE at MSFC on Task 4 of LeRC contract NAS3-25884. This document contains a description of the work performed, the results of the nominal test of the major anomaly test cases and a table of the resulting cutoff times, a plot of the RSA value vs. time for each anomaly case, a logic flow description of the algorithm, the algorithm code, and a development plan for future efforts.

  18. Rocketdyne Safety Algorithm: Space Shuttle Main Engine Fault Detection

    NASA Technical Reports Server (NTRS)

    Norman, Arnold M., Jr.

    1994-01-01

    The Rocketdyne Safety Algorithm (RSA) has been developed to the point of use on the TTBE at MSFC on Task 4 of LeRC contract NAS3-25884. This document contains a description of the work performed, the results of the nominal test of the major anomaly test cases and a table of the resulting cutoff times, a plot of the RSA value vs. time for each anomaly case, a logic flow description of the algorithm, the algorithm code, and a development plan for future efforts.

  19. Parallelization of exoplanets detection algorithms based on field rotation; example of the MOODS algorithm for SPHERE

    NASA Astrophysics Data System (ADS)

    Mattei, D.; Smith, I.; Ferrari, A.; Carbillet, M.

    2010-10-01

    Post-processing for exoplanet detection using direct imaging requires large data cubes and/or sophisticated signal processing techniques. For alt-azimuthal mounts, a projection effect called field rotation makes a potential planet rotate in a known manner across the set of images. For ground-based telescopes that use extreme adaptive optics and advanced coronagraphy, techniques based on field rotation are already broadly used and still under development. In most such techniques, for a given initial position of the planet, the planet intensity estimate is a linear function of the set of images. However, due to field rotation, the modified instrumental response applied is not shift-invariant like usual linear filters. Testing all possible initial positions is therefore very time-consuming. To reduce the processing time, we propose to handle each subset of initial positions on a different machine using parallel programming. In particular, the MOODS algorithm dedicated to the VLT-SPHERE instrument, which jointly estimates the light contributions of the star and the potential exoplanet, is parallelized on the Observatoire de la Cote d'Azur cluster. Different parallelization methods (OpenMP, MPI, job arrays) have been developed for the initial MOODS code and compared to each other. The one finally chosen splits the initial positions over the available processors by accounting at best for the different constraints of the cluster structure: memory, job submission queues, number of available CPUs, and cluster average load. In the end, a standard set of images is satisfactorily processed in a few hours instead of a few days.
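
    A job-array-style split of the candidate initial positions can be sketched with Python's standard concurrent.futures; here estimate_intensity is a hypothetical stand-in for the per-position MOODS estimation, which is not reproduced.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def estimate_intensity(position, cube):
    """Hypothetical stand-in for the per-position MOODS estimator: returns an
    intensity estimate for one initial planet position. The real estimator is a
    linear function of the whole image cube that accounts for field rotation."""
    y, x = position
    return float(cube[:, y, x].mean())

def process_chunk(chunk, cube):
    """Evaluate one subset of initial positions (one 'job' of the array)."""
    return {tuple(pos): estimate_intensity(tuple(pos), cube) for pos in chunk}

def search_all_positions(cube, positions, n_workers=8):
    """Distribute the candidate initial positions over worker processes.
    (Run under an `if __name__ == "__main__":` guard on platforms that spawn.)"""
    chunks = np.array_split(np.asarray(positions), n_workers)
    results = {}
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        futures = [pool.submit(process_chunk, chunk, cube) for chunk in chunks]
        for fut in futures:
            results.update(fut.result())
    return results
```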

  20. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE PAGES

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; ...

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
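
    The following sketch illustrates the spatial-search idea under stated assumptions: search-sector centers are spaced by a fixed great-circle distance (so their number does not blow up near the poles), and each sector is flagged when an illustrative criterion, a sea-level-pressure minimum below a threshold, is met. It is not the published Stride Search implementation.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def stride_search_centers(stride_km, lat_min=-90.0, lat_max=90.0):
    """Generate search-sector centers spaced roughly stride_km apart; the
    longitude spacing widens toward the poles as the latitude circles shrink."""
    dlat = np.degrees(stride_km / EARTH_RADIUS_KM)
    centers = []
    for lat in np.arange(lat_min + dlat / 2, lat_max, dlat):
        circ_km = 2 * np.pi * EARTH_RADIUS_KM * np.cos(np.radians(lat))
        n_lon = max(1, int(circ_km // stride_km))
        for lon in np.linspace(0.0, 360.0, n_lon, endpoint=False):
            centers.append((lat, lon))
    return centers

def detect_storms(field, lats, lons, stride_km=500.0, threshold=99000.0):
    """Flag sector centers whose minimum field value (e.g., sea-level pressure
    in Pa) falls below 'threshold'. Criterion and values are illustrative."""
    hits = []
    for clat, clon in stride_search_centers(stride_km):
        # crude rectangular sector in index space around the center
        dlat = np.degrees(stride_km / EARTH_RADIUS_KM)
        dlon = np.degrees(stride_km / (EARTH_RADIUS_KM * max(np.cos(np.radians(clat)), 1e-6)))
        lat_mask = np.abs(lats - clat) <= dlat
        lon_mask = np.abs((lons - clon + 180.0) % 360.0 - 180.0) <= dlon
        sector = field[np.ix_(lat_mask, lon_mask)]
        if sector.size and sector.min() < threshold:
            hits.append((clat, clon, float(sector.min())))
    return hits
```

    A grid-point search would instead test every (lat, lon) grid point, which concentrates redundant work near the poles where grid points crowd together.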

  1. A general-purpose contact detection algorithm for nonlinear structural analysis codes

    SciTech Connect

    Heinstein, M.W.; Attaway, S.W.; Swegle, J.W.; Mello, F.J.

    1993-05-01

    A new contact detection algorithm has been developed to address difficulties associated with the numerical simulation of contact in nonlinear finite element structural analysis codes. Problems including accurate and efficient detection of contact for self-contacting surfaces, tearing and eroding surfaces, and multi-body impact are addressed. The proposed algorithm is portable between dynamic and quasi-static codes and can efficiently model contact between a variety of finite element types including shells, bricks, beams and particles. The algorithm is composed of (1) a location strategy that uses a global search to decide which slave nodes are in proximity to a master surface and (2) an accurate detailed contact check that uses the projected motions of both master surface and slave node. In this report, currently used contact detection algorithms and their associated difficulties are discussed. Then the proposed algorithm and how it addresses these problems is described. Finally, the capability of the new algorithm is illustrated with several example problems.

  2. Stride search: A general algorithm for storm detection in high resolution climate data

    SciTech Connect

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; Mundt, Miranda

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.

  3. A novel algorithm for real-time adaptive signal detection and identification

    SciTech Connect

    Sleefe, G.E.; Ladd, M.D.; Gallegos, D.E.; Sicking, C.W.; Erteza, I.A.

    1998-04-01

    This paper describes a novel digital signal processing algorithm for adaptively detecting and identifying signals buried in noise. The algorithm continually computes and updates the long-term statistics and spectral characteristics of the background noise. Using this noise model, a set of adaptive thresholds and matched digital filters are implemented to enhance and detect signals that are buried in the noise. The algorithm furthermore automatically suppresses coherent noise sources and adapts to time-varying signal conditions. Signal detection is performed in both the time-domain and the frequency-domain, thereby permitting the detection of both broad-band transients and narrow-band signals. The detection algorithm also provides for the computation of important signal features such as amplitude, timing, and phase information. Signal identification is achieved through a combination of frequency-domain template matching and spectral peak picking. The algorithm described herein is well suited for real-time implementation on digital signal processing hardware. This paper presents the theory of the adaptive algorithm, provides an algorithmic block diagram, and demonstrates its implementation and performance with real-world data. The computational efficiency of the algorithm is demonstrated through benchmarks on specific DSP hardware. The applications for this algorithm, which range from vibration analysis to real-time image processing, are also discussed.
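
    A minimal time-domain sketch of the adaptive-threshold idea (only one ingredient of the algorithm described; the matched filtering and frequency-domain stages are omitted) might maintain exponentially weighted noise statistics and flag excursions. The constants below are assumptions.

```python
import numpy as np

def adaptive_detect(x, k=5.0, alpha=0.01):
    """Flag samples exceeding an adaptive threshold derived from a running
    (exponentially weighted) estimate of the background-noise statistics.

    x: input samples
    k: threshold expressed in noise standard deviations
    alpha: forgetting factor for the noise model (smaller = longer memory)
    """
    x = np.asarray(x, float)
    warmup = min(100, len(x))
    mean = float(np.mean(x[:warmup]))
    var = float(np.var(x[:warmup])) + 1e-12

    detections = np.zeros(len(x), dtype=bool)
    for i, sample in enumerate(x):
        sigma = np.sqrt(var)
        if abs(sample - mean) > k * sigma:
            detections[i] = True          # signal present: freeze the noise model
        else:
            # update background-noise statistics only on noise-like samples
            mean = (1 - alpha) * mean + alpha * sample
            var = (1 - alpha) * var + alpha * (sample - mean) ** 2
    return detections
```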

  4. Evolutionary Algorithms Approach to the Solution of Damage Detection Problems

    NASA Astrophysics Data System (ADS)

    Salazar Pinto, Pedro Yoajim; Begambre, Oscar

    2010-09-01

    In this work, a new Self-Configured Hybrid Algorithm is proposed by combining Particle Swarm Optimization (PSO) and a Genetic Algorithm (GA). The aim of the proposed strategy is to increase the stability and accuracy of the search. The central idea is the concept of the Guide Particle: this particle (the best PSO global in each generation) transmits its information to a particle of the following PSO generation, which is controlled by the GA. Thus, the proposed hybrid has an elitism feature that improves its performance and guarantees the convergence of the procedure. In different tests carried out on benchmark functions reported in the international literature, better performance in stability and accuracy was observed; therefore, the new algorithm was used to identify damage in a simply supported beam using modal data. Finally, it is worth noting that the algorithm is independent of the initial definition of the heuristic parameters.

  5. GPU accelerated Foreign Object Debris Detection on Airfield Pavement with visual saliency algorithm

    NASA Astrophysics Data System (ADS)

    Qi, Jun; Gong, Guoping; Cao, Xiaoguang

    2017-01-01

    We present a GPU-based implementation of a visual saliency algorithm to detect foreign object debris (FOD) on airfield pavement effectively and efficiently. A visual saliency algorithm is introduced into FOD detection for the first time. We improve the image signature algorithm to target FOD detection against the complex background of the pavement. First, we apply pooling operations when obtaining the saliency map to improve the recall rate. Then, connected component analysis is applied to filter candidate regions in the saliency map to obtain the final targets in the original image. Besides, we map the algorithm to GPU-based kernels and data structures. The parallel version of the algorithm obtains the results with a 23.5-times speedup. Experimental results show that the proposed method is effective for detecting FOD in real time.

  6. Magnetic detection and localization using multichannel Levinson-Durbin algorithm

    NASA Astrophysics Data System (ADS)

    Murray, Ian B.; McAulay, Alastair D.

    2004-08-01

    The Levinson-Durbin (LD) algorithm has been used for decades as an alternative to Fast Fourier Transforms (FFTs) in cases where several cycles of a signal are not available or are too expensive to obtain. We describe a new application of the LD algorithm that uses spectral estimation to locate a magnetic dipole, such as a submarine or magnetic mine, relative to a high-sensitivity probe (i.e., a gradiometer/magnetometer sensor) moving through the magnetic field. The weakness of the FFT is its assumption of periodic inputs: when the sampled signal ends at a different level than it began, the FFT incorrectly inserts a step at the 'break' between cycles, whereas the LD algorithm benefits by assuming that nothing outside the sampling window will change the spectrum. The iterative LD algorithm is also well suited for real-time operation since it can be solved continuously while the probe moves toward the subject. By establishing spectral templates for different measurement paths relative to the source dipole, we use correlation in the spectral domain to estimate the distance of the dipole from our current path. Direction, and thus location, is obtained by simultaneously sending a second probe to complement the information gained by the first probe, together with a multidimensional LD algorithm.
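
    For reference, the single-channel Levinson-Durbin recursion that solves the Toeplitz (Yule-Walker) equations for an autoregressive spectral model is sketched below; the paper's multichannel generalization is not reproduced.

```python
import numpy as np

def levinson_durbin(r):
    """Solve the Yule-Walker equations for AR coefficients using the
    Levinson-Durbin recursion.

    r: autocorrelation sequence r[0..p] (with r[0] > 0)
    returns (a, e): AR coefficients a[1..p] (a[0] = 1 implied) and the
    final prediction-error power e.
    """
    r = np.asarray(r, float)
    p = len(r) - 1
    a = np.zeros(p)
    e = r[0]
    for k in range(p):
        # reflection coefficient for the order-(k+1) model
        acc = r[k + 1] + np.dot(a[:k], r[k:0:-1])
        kappa = -acc / e
        # order-update of the AR coefficients
        a_new = a.copy()
        a_new[k] = kappa
        a_new[:k] = a[:k] + kappa * a[:k][::-1]
        a = a_new
        e *= (1.0 - kappa ** 2)          # updated prediction-error power
    return a, e
```

    The AR coefficients then give a parametric spectral estimate that does not assume periodicity of the observed window.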

  7. A novel algorithm for the edge detection and edge enhancement of medical images.

    PubMed

    Crooks, I; Fallone, B G

    1993-01-01

    A novel algorithm, histogram shifting (HS), is presented, which performs edge detection or edge enhancement depending on the choice of two parameters. The histogram of a region surrounding each pixel is found and translated toward the origin, resulting in the new pixel value. Images from a variety of medical imaging modalities were processed with HS to perform detection and enhancement of edges. Comparison with results obtained from conventional edge detection (e.g., Sobel) and with conventional edge-enhancement algorithms is discussed. HS appears to perform the edge-detection operation without introducing "double-edge" effects often obtained with conventional edge-detection algorithms. HS also appears to perform edge enhancement without introducing extensive noise artifacts, which may be noticeable with many edge-enhancement algorithms.

  8. Reliability of old and new ventricular fibrillation detection algorithms for automated external defibrillators

    PubMed Central

    Amann, Anton; Tratnig, Robert; Unterkofler, Karl

    2005-01-01

    Background A pivotal component in automated external defibrillators (AEDs) is the detection of ventricular fibrillation by means of appropriate detection algorithms. In scientific literature there exists a wide variety of methods and ideas for handling this task. These algorithms should have a high detection quality, be easily implementable, and work in real time in an AED. Testing of these algorithms should be done by using a large amount of annotated data under equal conditions. Methods For our investigation we simulated a continuous analysis by selecting the data in steps of one second without any preselection. We used the complete BIH-MIT arrhythmia database, the CU database, and the files 7001 – 8210 of the AHA database. All algorithms were tested under equal conditions. Results For 5 well-known standard and 5 new ventricular fibrillation detection algorithms we calculated the sensitivity, specificity, and the area under their receiver operating characteristic. In addition, two QRS detection algorithms were included. These results are based on approximately 330 000 decisions (per algorithm). Conclusion Our values for sensitivity and specificity differ from earlier investigations since we used no preselection. The best algorithm is a new one, presented here for the first time. PMID:16253134
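
    The per-window evaluation described (sensitivity, specificity, and the area under the receiver operating characteristic) can be sketched as generic bookkeeping code; this is not the authors' implementation, and the function names are illustrative.

```python
import numpy as np

def evaluate_detector(scores, labels, thresholds=None):
    """Compute sensitivity, specificity, and ROC area for a VF detector that
    outputs one score per one-second analysis window.

    scores: detector output per window (higher = more likely VF)
    labels: annotation per window (1 = VF, 0 = non-VF)
    """
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    if thresholds is None:
        thresholds = np.unique(scores)

    sens, spec = [], []
    for t in thresholds:
        pred = scores >= t
        tp = np.sum(pred & labels)
        tn = np.sum(~pred & ~labels)
        fp = np.sum(pred & ~labels)
        fn = np.sum(~pred & labels)
        sens.append(tp / max(tp + fn, 1))
        spec.append(tn / max(tn + fp, 1))

    # area under the ROC curve via trapezoidal integration over (1 - spec, sens)
    fpr = 1.0 - np.asarray(spec)
    order = np.argsort(fpr)
    auc = np.trapz(np.asarray(sens)[order], fpr[order])
    return np.asarray(sens), np.asarray(spec), float(auc)
```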

  9. An infrared small target detection algorithm based on high-speed local contrast method

    NASA Astrophysics Data System (ADS)

    Cui, Zheng; Yang, Jingli; Jiang, Shouda; Li, Junbao

    2016-05-01

    Small-target detection in infrared imagery with a complex background is always an important task in remote sensing. It is important to improve detection capabilities such as detection rate, false alarm rate, and speed. However, current algorithms usually improve one or two of these capabilities while sacrificing the others. In this letter, an infrared (IR) small-target detection algorithm with two layers inspired by the Human Visual System (HVS) is proposed to balance these detection capabilities. The first layer uses a high-speed simplified local contrast method to select significant information, and the second layer uses a machine learning classifier to separate targets from background clutter. Experimental results show that the proposed algorithm achieves good performance in detection rate, false alarm rate, and speed simultaneously.
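
    A straightforward (and deliberately slow) reference sketch of a basic local contrast measure in the spirit of the first layer is given below; the authors' high-speed simplification and the second, classifier-based layer are not reproduced, and the patch size is an assumption.

```python
import numpy as np

def local_contrast_map(img, cell=3):
    """Basic local contrast measure for IR small-target enhancement.

    For each position, a 3x3 arrangement of cell x cell patches is considered;
    the contrast is max(center)^2 divided by the largest neighbor-patch mean,
    which boosts small bright targets and suppresses extended background.
    (Plain Python loops: a reference implementation, not an optimized one.)
    """
    img = img.astype(float)
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(cell, h - 2 * cell):
        for x in range(cell, w - 2 * cell):
            center = img[y:y + cell, x:x + cell]
            means = []
            for dy in (-cell, 0, cell):
                for dx in (-cell, 0, cell):
                    if dy == 0 and dx == 0:
                        continue
                    means.append(img[y + dy:y + dy + cell,
                                     x + dx:x + dx + cell].mean())
            out[y + cell // 2, x + cell // 2] = center.max() ** 2 / max(max(means), 1e-6)
    return out
```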

  10. Scale-space point spread function based framework to boost infrared target detection algorithms

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2016-07-01

    Small-target detection is one of the major concerns in the development of infrared surveillance systems. Detection algorithms based on Gaussian target modeling have attracted the most attention from researchers in this field. However, the lack of accurate target modeling limits the performance of this type of infrared small-target detection algorithm. In this paper, the signal-to-clutter ratio (SCR) improvement mechanism based on the matched filter is described in detail, and the effect of the Point Spread Function (PSF) on the intensity and spatial distribution of the target pixels is clarified comprehensively. A new parametric model for small infrared targets is then developed based on the PSF of the imaging system, which can be considered a matched filter. Based on this model, a new framework to boost model-based infrared target detection algorithms is presented. In order to show the performance of this new framework, the proposed model is adopted in the Laplacian scale-space algorithm, which is well known in the small infrared target detection field. Simulation results show that the proposed framework has better detection performance than the Gaussian one and improves the overall performance of the IRST system. A quantitative analysis of the proposed algorithm shows at least a 20% improvement in the output SCR values in comparison with the Laplacian of Gaussian (LoG) algorithm.

  11. Detecting activity locations from raw GPS data: a novel kernel-based algorithm

    PubMed Central

    2013-01-01

    Background Health studies and mHealth applications are increasingly resorting to tracking technologies such as Global Positioning Systems (GPS) to study the relation between mobility, exposures, and health. GPS tracking generates large sets of geographic data that need to be transformed to be useful for health research. This paper proposes a method to test the performance of activity place detection algorithms, and compares the performance of a novel kernel-based algorithm with a more traditional time-distance cluster detection method. Methods A set of 750 artificial GPS tracks containing three stops each were generated, with various levels of noise. A total of 9,000 tracks were processed to measure the algorithms’ capacity to detect stop locations and estimate stop durations, with varying GPS noise and algorithm parameters. Results The proposed kernel-based algorithm outperformed the traditional algorithm on most criteria associated with activity place detection, and offered a stronger resilience to GPS noise, managing to detect up to 92.3% of actual stops, and estimating stop duration within 5% error margins at all tested noise levels. Conclusions Capacity to detect activity locations is an important feature in a context of increasing use of GPS devices in health and place research. While further testing with real-life tracks is recommended, testing algorithms’ performance with artificial track sets for which characteristics are controlled is useful. The proposed novel algorithm outperformed the traditional algorithm under these conditions. PMID:23497213

  12. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is becoming more and more widely used in remote sensing image processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Besides, it does not effectively reduce the data dimension, which costs too much processing time and shows low validity on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) model in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids the heavy computation on hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral image data are simulated with the GMRF model, and the GMRF parameters are estimated with the Approximated Maximum Likelihood method. The detection operator is constructed from the estimated GMRF parameters. The pixel under test is taken as the centre of a local optimization window, called the GMRF detection window. The abnormality degree is calculated from the mean vector and the inverse covariance matrix, both computed within the window. The image is processed pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are evaluated with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method is able to improve the detection efficiency and reduce the false alarm rate. We obtained the operation time statistics of the three algorithms in the same computing environment; the results show that the proposed algorithm improves the operation time by 45.2%, which demonstrates good computational efficiency.
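
    For context, the baseline global RX detector that the GMRF-based method improves on reduces to a Mahalanobis distance against global background statistics; a sketch is shown below. The paper instead estimates the inverse covariance from GMRF parameters and works in a moving local window, which is not reproduced here.

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Global RX anomaly detector for a hyperspectral cube.

    cube: array of shape (rows, cols, bands)
    returns: Mahalanobis distance of every pixel spectrum from the global
             background mean and covariance.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse for numerical stability
    d = X - mu
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)   # squared Mahalanobis distance
    return scores.reshape(rows, cols)
```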

  13. Multi-pattern string matching algorithms comparison for intrusion detection system

    NASA Astrophysics Data System (ADS)

    Hasan, Awsan A.; Rashid, Nur'Aini Abdul; Abdulrazzaq, Atheer A.

    2014-12-01

    Computer networks are developing exponentially and running at high speeds. With the increasing number of Internet users, computers have become the preferred target for complex attacks that require complex analyses to be detected. The intrusion detection system (IDS) has become an important part of any modern network to protect the network from attacks. The IDS relies on string matching algorithms to identify network attacks, but these algorithms consume a considerable amount of IDS processing time, thereby slowing down the IDS performance. A new algorithm that can overcome this weakness of the IDS needs to be developed. Improving the multi-pattern matching algorithm ensures that an IDS can work properly and that its limitations can be overcome. In this paper, we perform a comparison between our three multi-pattern matching algorithms, MP-KR, MPH-QS and MPH-BMH, and their corresponding original algorithms KR, QS and BMH, respectively. The experiments show that MPH-QS performs best among the proposed algorithms, followed by MPH-BMH, while MP-KR is the slowest. MPH-QS detects a large number of signature patterns in a short time compared to the other two algorithms. This finding proves that the multi-pattern matching algorithms are more efficient in high-speed networks.
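
    As background, the single-pattern Boyer-Moore-Horspool (BMH) search that MPH-BMH builds on can be sketched as follows; the multi-pattern extensions themselves are not reproduced.

```python
def horspool_search(pattern: bytes, text: bytes):
    """Boyer-Moore-Horspool search: return the start offsets of all matches.

    The bad-character shift table lets the search skip ahead by up to
    len(pattern) bytes on a mismatch, which is what makes BMH attractive
    for signature matching in an IDS.
    """
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return []
    # default shift = m; characters occurring inside the pattern get smaller shifts
    shift = {b: m for b in range(256)}
    for i, b in enumerate(pattern[:-1]):
        shift[b] = m - 1 - i

    matches, i = [], 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            matches.append(i)
        i += shift[text[i + m - 1]]
    return matches
```

    For example, horspool_search(b"attack-sig", payload) returns every offset at which the signature occurs in a captured payload.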

  14. Enhanced Detection of Multivariate Outliers Using Algorithm-Based Visual Display Techniques.

    ERIC Educational Resources Information Center

    Dickinson, Wendy B.

    This study uses an algorithm-based visual display technique (FACES) to provide enhanced detection of multivariate outliers within large-scale data sets. The FACES computer graphing algorithm (H. Chernoff, 1973) constructs a cartoon-like face, using up to 18 variables for each case. A major advantage of FACES is the ability to store and show the…

  15. A real-time implementation of an advanced sensor failure detection, isolation, and accommodation algorithm

    NASA Technical Reports Server (NTRS)

    Delaat, J. C.; Merrill, W. C.

    1983-01-01

    A sensor failure detection, isolation, and accommodation algorithm was developed which incorporates analytic sensor redundancy through software. This algorithm was implemented in a high level language on a microprocessor based controls computer. Parallel processing and state-of-the-art 16-bit microprocessors are used along with efficient programming practices to achieve real-time operation.

  16. Multispectral fluorescence image algorithms for detection of frass on mature tomatoes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet LED excitation was developed for the detection of frass contamination on mature tomatoes. The algorithm utilized the fluorescence intensities at five wavebands, 515 nm, 640 nm, 664 nm, 690 nm, and 724 nm...

  17. Comparison of fractal dimension estimation algorithms for epileptic seizure onset detection

    NASA Astrophysics Data System (ADS)

    Polychronaki, G. E.; Ktonas, P. Y.; Gatzonis, S.; Siatouni, A.; Asvestas, P. A.; Tsekou, H.; Sakas, D.; Nikita, K. S.

    2010-08-01

    Fractal dimension (FD) is a natural measure of the irregularity of a curve. In this study the performances of three waveform FD estimation algorithms (i.e. Katz's, Higuchi's and the k-nearest neighbour (k-NN) algorithm) were compared in terms of their ability to detect the onset of epileptic seizures in scalp electroencephalogram (EEG). The selection of parameters involved in FD estimation, evaluation of the accuracy of the different algorithms and assessment of their robustness in the presence of noise were performed based on synthetic signals of known FD. When applied to scalp EEG data, Katz's and Higuchi's algorithms were found to be incapable of producing consistent changes of a single type (either a drop or an increase) during seizures. On the other hand, the k-NN algorithm produced a drop, starting close to the seizure onset, in most seizures of all patients. The k-NN algorithm outperformed both Katz's and Higuchi's algorithms in terms of robustness in the presence of noise and seizure onset detection ability. The seizure detection methodology, based on the k-NN algorithm, yielded in the training data set a sensitivity of 100% with 10.10 s mean detection delay and a false positive rate of 0.27 h-1, while the corresponding values in the testing data set were 100%, 8.82 s and 0.42 h-1, respectively. The above detection results compare favourably to those of other seizure onset detection methodologies applied to scalp EEG in the literature. The methodology described, based on the k-NN algorithm, appears to be promising for the detection of the onset of epileptic seizures based on scalp EEG.
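
    Of the three estimators compared, Higuchi's algorithm is the one most commonly re-implemented; a standard single-channel sketch is given below, with k_max left as a user-chosen parameter.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi's fractal dimension of a 1-D signal.

    For each lag k, build k down-sampled curves, average their normalized
    lengths L(k), and estimate the FD as the slope of log L(k) vs log(1/k).
    """
    x = np.asarray(x, float)
    N = len(x)
    L = []
    ks = range(1, k_max + 1)
    for k in ks:
        lengths = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            # curve length for this offset, with Higuchi's normalization
            dist = np.sum(np.abs(np.diff(x[idx])))
            norm = (N - 1) / ((len(idx) - 1) * k)
            lengths.append(dist * norm / k)
        L.append(np.mean(lengths))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(ks)), np.log(L), 1)
    return slope
```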

  18. Comparison of fractal dimension estimation algorithms for epileptic seizure onset detection.

    PubMed

    Polychronaki, G E; Ktonas, P Y; Gatzonis, S; Siatouni, A; Asvestas, P A; Tsekou, H; Sakas, D; Nikita, K S

    2010-08-01

    Fractal dimension (FD) is a natural measure of the irregularity of a curve. In this study the performances of three waveform FD estimation algorithms (i.e. Katz's, Higuchi's and the k-nearest neighbour (k-NN) algorithm) were compared in terms of their ability to detect the onset of epileptic seizures in scalp electroencephalogram (EEG). The selection of parameters involved in FD estimation, evaluation of the accuracy of the different algorithms and assessment of their robustness in the presence of noise were performed based on synthetic signals of known FD. When applied to scalp EEG data, Katz's and Higuchi's algorithms were found to be incapable of producing consistent changes of a single type (either a drop or an increase) during seizures. On the other hand, the k-NN algorithm produced a drop, starting close to the seizure onset, in most seizures of all patients. The k-NN algorithm outperformed both Katz's and Higuchi's algorithms in terms of robustness in the presence of noise and seizure onset detection ability. The seizure detection methodology, based on the k-NN algorithm, yielded in the training data set a sensitivity of 100% with 10.10 s mean detection delay and a false positive rate of 0.27 h(-1), while the corresponding values in the testing data set were 100%, 8.82 s and 0.42 h(-1), respectively. The above detection results compare favourably to those of other seizure onset detection methodologies applied to scalp EEG in the literature. The methodology described, based on the k-NN algorithm, appears to be promising for the detection of the onset of epileptic seizures based on scalp EEG.

  19. Infrared point target detection based on exponentially weighted RLS algorithm and dual solution improvement

    NASA Astrophysics Data System (ADS)

    Zhu, Bin; Fan, Xiang; Ma, Dong-hui; Cheng, Zheng-dong

    2009-07-01

    The desire to maximize target detection range focuses attention on algorithms for detecting and tracking point targets. However, point target detection and tracking is a challenging task for two reasons: first, targets occupy only a few pixels or less in complex noise and background clutter; second, real-time applications impose a heavy computational load. Temporal signal processing algorithms offer superior clutter rejection to that of the standard spatial processing approaches. In this paper, the traditional single-frame algorithm based on background prediction is improved to a consecutive multi-frame exponentially weighted recursive least squares (EWRLS) algorithm. Further, the dual solution of EWRLS (DEWLS) is derived to reduce the computational burden. The DEWLS algorithm only uses the inner products of point pairs in the training set; the prediction result is given directly without computing any intermediate variables. Experimental results show that the RLS filter can largely increase the signal-to-noise ratio (SNR) of the images; it has the best detection performance among the algorithms considered; and moving targets can be detected within 2 or 3 frames with a lower false alarm rate. Moreover, with the dual solution improvement, the computational efficiency is enhanced by over 41% relative to the EWRLS algorithm.
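
    A textbook exponentially weighted RLS background predictor for a single pixel's time series is sketched below; it illustrates the prediction-residual idea only and is not the authors' dual-solution (DEWLS) variant. The filter order and forgetting factor are assumptions.

```python
import numpy as np

class EWRLSPixel:
    """Exponentially weighted RLS predictor for one pixel's intensity series.

    The current intensity is predicted from the previous `order` frames;
    a large prediction residual marks the pixel as a target candidate.
    """
    def __init__(self, order=4, lam=0.95, delta=10.0):
        self.lam = lam
        self.w = np.zeros(order)            # filter weights
        self.P = np.eye(order) / delta      # inverse correlation estimate
        self.buf = np.zeros(order)          # last `order` intensities

    def update(self, value):
        u = self.buf.copy()                 # regression vector (past frames)
        pred = self.w @ u                   # background prediction
        err = value - pred                  # residual: large => candidate target
        k = self.P @ u / (self.lam + u @ self.P @ u)      # gain vector
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, u @ self.P)) / self.lam
        self.buf = np.roll(self.buf, 1)
        self.buf[0] = value
        return pred, err
```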

  20. Fusion Schemes for Ensembles of Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2011-03-01

    [9] Robert J. Johnson, "Improved Feature Extraction, Feature Selection, and Identification Techniques that Create a Fast Unsupervised Hyperspectral ..." Fusion Schemes for Ensembles of Hyperspectral Anomaly Detection Algorithms. Thesis presented to the Faculty, Department of Operational ...

  1. Lining seam elimination algorithm and surface crack detection in concrete tunnel lining

    NASA Astrophysics Data System (ADS)

    Qu, Zhong; Bai, Ling; An, Shi-Quan; Ju, Fang-Rong; Liu, Ling

    2016-11-01

    Due to the particularity of the surface of concrete tunnel lining and the diversity of detection environments, such as uneven illumination, smudges, localized rock falls, water leakage, and the inherent seams of the lining structure, existing crack detection algorithms cannot detect real cracks accurately. This paper proposes an algorithm that combines lining seam elimination with an improved percolation detection algorithm based on grid cell analysis for surface crack detection in concrete tunnel lining. First, the characteristics of the pixels within overlapping grid cells are checked to remove background noise and generate the percolation seed map (PSM). Second, cracks are detected from the PSM by the accelerated percolation algorithm, so that the fracture unit areas can be scanned and connected. Finally, the real surface cracks in the concrete tunnel lining are obtained by removing the lining seams and performing percolation denoising. Experimental results show that the proposed algorithm can detect the real surface cracks accurately, quickly, and effectively. Furthermore, by removing the lining seams it fills a gap in existing surface crack detection for concrete tunnel lining.

  2. Flight test results of failure detection and isolation algorithms for a redundant strapdown inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Morrell, F. R.; Motyka, P. R.; Bailey, M. L.

    1990-01-01

    Flight test results for two sensor fault-tolerant algorithms developed for a redundant strapdown inertial measurement unit are presented. The inertial measurement unit (IMU) consists of four two-degrees-of-freedom gyros and accelerometers mounted on the faces of a semi-octahedron. Fault tolerance is provided by edge vector test and generalized likelihood test algorithms, each of which can provide dual fail-operational capability for the IMU. To detect the wide range of failure magnitudes in inertial sensors, which provide flight-crucial information for flight control and navigation, failure detection and isolation are developed in terms of a multilevel structure. Threshold compensation techniques, developed to enhance the sensitivity of the failure detection process to navigation-level failures, are presented. Four flight tests were conducted in a commercial transport-type environment to compare and determine the performance of the failure detection and isolation methods. Dual flight processors enabled concurrent tests for the algorithms. Failure signals, such as hard-over, null, or bias shift, were added to the sensor outputs as simple or multiple failures during the flights. Both algorithms provided timely detection and isolation of flight control level failures. The generalized likelihood test algorithm provided more timely detection of low-level sensor failures, but it produced one false isolation. Both algorithms demonstrated the capability to provide dual fail-operational performance for the skewed array of inertial sensors.

  3. AsteroidZoo: A New Zooniverse project to detect asteroids and improve asteroid detection algorithms

    NASA Astrophysics Data System (ADS)

    Beasley, M.; Lewicki, C. A.; Smith, A.; Lintott, C.; Christensen, E.

    2013-12-01

    We present a new citizen science project: AsteroidZoo. A collaboration between Planetary Resources, Inc., the Zooniverse Team, and the Catalina Sky Survey, AsteroidZoo will bring the science of asteroid identification to the citizen scientist. Volunteer astronomers have proved to be a critical asset in the identification and characterization of asteroids, especially potentially hazardous objects. These contributions, to date, have required that the volunteer possess a moderate telescope and the ability and willingness to be responsive to observing requests. Our new project will make data collected by the Catalina Sky Survey (CSS), currently the most productive asteroid survey, available to anyone with sufficient interest and an internet connection. As previous work by the Zooniverse has demonstrated, citizen scientists are superb at classifying objects. Even the best automated searches require human intervention to identify new objects. These searches are optimized to reduce false positive rates and to prevent a single operator from being overloaded with requests. With access to the large number of people in the Zooniverse, we will be able to avoid that problem and instead work to produce a complete detection list. Each frame from CSS will be searched in detail, generating a large number of new detections. We will be able to evaluate the completeness of the CSS data set and potentially provide improvements to the automated pipeline. The data corpus produced by AsteroidZoo will be used as a training environment for machine learning challenges in the future. Our goals include a more complete asteroid detection algorithm and a minimum-computation program that skims the cream of the data, suitable for implementation on small spacecraft. Our goal is to have the site go live in fall 2013.

  4. The research of moving objects behavior detection and tracking algorithm in aerial video

    NASA Astrophysics Data System (ADS)

    Yang, Le-le; Li, Xin; Yang, Xiao-ping; Li, Dong-hui

    2015-12-01

    The article focuses on research into moving target detection and tracking algorithms in aerial monitoring. The study includes moving target detection, moving target behavior analysis, and automatic target tracking. For moving target detection, the paper considers the characteristics of background subtraction and the frame difference method, and uses a background reconstruction method to accurately locate moving targets. For the analysis of moving object behavior, the detection area is shown in the binary image using MATLAB, and the analysis determines whether a moving object has intruded and, if so, the direction of the intrusion. For automatic tracking of the moving target, a video tracking algorithm that predicts object centroids based on Kalman filtering is proposed.
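
    The centroid-prediction step can be illustrated with a standard constant-velocity Kalman filter; the matrices below are the textbook ones and the noise parameters are assumptions, not values from the paper.

```python
import numpy as np

class CentroidKalman:
    """Constant-velocity Kalman filter for tracking an object centroid.

    State: [x, y, vx, vy]; measurement: [x, y] from the detection step.
    """
    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], float)        # state transition
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)         # measurement model
        self.Q = q * np.eye(4)                           # process noise
        self.R = r * np.eye(2)                           # measurement noise
        self.x = np.zeros(4)
        self.P = np.eye(4) * 100.0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                                # predicted centroid

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x       # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

    In a tracking loop, predict() is called every frame and update() only when the detector supplies a centroid, so short detection gaps are bridged by the motion model.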

  5. Competitive evaluation of failure detection algorithms for strapdown redundant inertial instruments

    NASA Technical Reports Server (NTRS)

    Wilcox, J. C.

    1973-01-01

    Algorithms for failure detection, isolation, and correction of redundant inertial instruments in the strapdown dodecahedron configuration are competitively evaluated in a digital computer simulation that subjects them to identical environments. Their performance is compared in terms of orientation and inertial velocity errors and in terms of missed and false alarms. The algorithms appear in the simulation program in modular form, so that they may be readily extracted for use elsewhere. The simulation program and its inputs and outputs are described. The algorithms, along with an eighth algorithm that was not simulated, are also compared analytically to show the relationships among them.

  6. An Automated Cloud-edge Detection Algorithm Using Cloud Physics and Radar Data

    NASA Technical Reports Server (NTRS)

    Ward, Jennifer G.; Merceret, Francis J.; Grainger, Cedric A.

    2003-01-01

    An automated cloud edge detection algorithm was developed and extensively tested. The algorithm uses in-situ cloud physics data measured by a research aircraft coupled with ground-based weather radar measurements to determine whether the aircraft is in or out of cloud. Cloud edges are determined when the in/out state changes, subject to a hysteresis constraint. The hysteresis constraint prevents isolated transient cloud puffs or data dropouts from being identified as cloud boundaries. The algorithm was verified by detailed manual examination of the data set in comparison to the results from application of the automated algorithm.
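
    The in/out-of-cloud decision with a hysteresis constraint can be sketched as a small state machine; the per-sample cloud-physics criterion and the minimum-run length below are placeholders, not the thresholds actually used.

```python
def find_cloud_edges(in_cloud_flags, min_run=5):
    """Return sample indices where the in/out-of-cloud state changes, requiring
    the new state to persist for at least `min_run` consecutive samples
    (the hysteresis constraint) before an edge is declared.

    in_cloud_flags: per-sample booleans from the cloud-physics/radar criterion
    """
    edges = []
    state = in_cloud_flags[0]
    i = 1
    while i < len(in_cloud_flags):
        if in_cloud_flags[i] != state:
            run = in_cloud_flags[i:i + min_run]
            if len(run) == min_run and all(f != state for f in run):
                edges.append(i)          # confirmed transition -> cloud edge
                state = not state
                i += min_run
                continue
            # otherwise: transient puff or dropout, ignored
        i += 1
    return edges
```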

  7. The EUSTACE break-detection algorithm for a global air temperature dataset

    NASA Astrophysics Data System (ADS)

    Brugnara, Yuri; Auchmann, Renate; Brönnimann, Stefan

    2016-04-01

    EUSTACE (EU Surface Temperature for All Corners of Earth) is an EU-funded project that started in 2015; its goal is to produce daily estimates of surface air temperature since 1850 across the globe for the first time by combining surface and satellite data using novel statistical techniques. For land surface data (LSAT), we assembled a global dataset of ca. 35000 stations where daily maximum and minimum air temperature observations are available, taking advantage of the most recent data rescue initiatives. Besides quantity, data quality also plays an important role in the success of the project; in particular, the assessment of the homogeneity of the temperature series is crucial in order to obtain a product suitable for the study of climate change. This poster describes a fully automatic state-of-the-art break-detection algorithm that we developed for the global LSAT dataset. We evaluate the performance of the method using artificial benchmarks and present various statistics related to the frequency and amplitude of the inhomogeneities detected in the real data. We show in particular that long-term temperature trends calculated from raw data are more often underestimated than overestimated and that this behaviour is mostly related to inhomogeneities affecting maximum temperatures.

  8. Combining genetic algorithm and Levenberg-Marquardt algorithm in training neural network for hypoglycemia detection using EEG signals.

    PubMed

    Nguyen, Lien B; Nguyen, Anh V; Ling, Sai Ho; Nguyen, Hung T

    2013-01-01

    Hypoglycemia is the most common but highly feared complication induced by the intensive insulin therapy in patients with type 1 diabetes mellitus (T1DM). Nocturnal hypoglycemia is dangerous because sleep obscures early symptoms and potentially leads to severe episodes which can cause seizure, coma, or even death. It is shown that the hypoglycemia onset induces early changes in electroencephalography (EEG) signals which can be detected non-invasively. In our research, EEG signals from five T1DM patients during an overnight clamp study were measured and analyzed. By applying a method of feature extraction using Fast Fourier Transform (FFT) and classification using neural networks, we establish that hypoglycemia can be detected efficiently using EEG signals from only two channels. This paper demonstrates that by implementing a training process of combining genetic algorithm and Levenberg-Marquardt algorithm, the classification results are improved markedly up to 75% sensitivity and 60% specificity on a separate testing set.

  9. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.

  10. Enhanced codebook algorithm for fast moving object detection from dynamic background using scene visual perception

    NASA Astrophysics Data System (ADS)

    Mousse, Mikaël A.; Motamed, Cina; Ezin, Eugène C.

    2016-11-01

    The detection of moving objects in a video sequence is the first step in an automatic video surveillance system. This work proposes an enhancement of a codebook-based algorithm for moving object extraction. The proposed algorithm uses a perceptual approach to optimize the complexity of foreground information extraction by using a modified codebook algorithm. The purpose of the adaptive strategy is to reduce the computational complexity of the foreground detection algorithm while maintaining its global accuracy. In this algorithm, we use a superpixel segmentation approach to model the spatial dependencies between pixels. The processing of the superpixels is controlled so as to focus on the superpixels that are near the possible locations of foreground objects. The performance of the proposed algorithm is evaluated and compared to other state-of-the-art algorithms using a public dataset that provides sequences with a dynamic background. Experimental results show that the proposed algorithm obtained the best frame processing rate during foreground detection.

  11. RS slope detection algorithm for extraction of heart rate from noisy, multimodal recordings.

    PubMed

    Gierałtowski, Jan; Ciuchciński, Kamil; Grzegorczyk, Iga; Kośna, Katarzyna; Soliński, Mateusz; Podziemski, Piotr

    2015-08-01

    Current gold-standard algorithms for heart beat detection do not work properly in the case of high noise levels and do not make use of multichannel data collected by modern patient monitors. The main idea behind the method presented in this paper is to detect the most prominent part of the QRS complex, i.e. the RS slope. We localize the RS slope based on the consistency of its characteristics, i.e. adequate, automatically determined amplitude and duration. It is a very simple and non-standard, yet very effective, solution. Minor data pre-processing and parameter adaptations make our algorithm fast and noise-resistant. As one of a few algorithms in the PhysioNet/Computing in Cardiology Challenge 2014, our algorithm uses more than two channels (i.e. ECG, BP, EEG, EOG and EMG). Simple fundamental working rules make the algorithm universal: it is able to work on all of these channels with no or only little changes. The final result of our algorithm in phase III of the Challenge was 86.38 (88.07 for a 200 record test set), which gave us fourth place. Our algorithm shows that current standards for heart beat detection could be improved significantly by taking a multichannel approach. This is an open-source algorithm available through the PhysioNet library.

  12. An asymptotic analysis of a general class of signal detection algorithms

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Rodemich, E. R.

    1977-01-01

    For applications to the problem of radio frequency interference identification, or in the search for extraterrestrial intelligence, it is important to have a basic understanding of signal detection algorithms. A general technique for assessing the asymptotic sensitivity of a broad class of signal detection algorithms is given. In these algorithms, the decision is based on the value of X_1 + X_2 + ... + X_n, where the X_i are obtained by sampling and preliminary processing of a physical process.

  13. An infrared maritime target detection algorithm applicable to heavy sea fog

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Dong, Lili; Zhao, Ming; Wu, Houde; Ji, Yuanyuan; Xu, Wenhai

    2015-07-01

    Infrared maritime images taken in heavy sea fog (HSF) are usually nonuniform in brightness distribution, and targets in different regions show significant differences in local contrast, which causes great difficulty for ordinary target detection algorithms in removing background clutter and extracting targets. To address this problem, this paper proposes a new target detection algorithm based on image region division and wavelet inter-subband correlation. The algorithm first divides the original image into different regions using the adaptive thresholding method OTSU. Then, wavelet threshold denoising is adopted to suppress noise in the subbands. Finally, the real target is extracted according to its inter-subband correlation and local singularity in the original image. Experimental results show that this algorithm can overcome the brightness nonuniformity and background clutter to extract all targets accurately. Besides, the target's area is well preserved. Therefore the proposed algorithm has high practical value for maritime target search based on infrared imaging systems.
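
    The region-division step uses Otsu's adaptive threshold; a standard histogram-based sketch of Otsu's method is shown below (this is the generic algorithm, not the authors' code).

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Otsu's method: pick the gray level that maximizes the between-class
    variance of the foreground/background split of the image histogram."""
    hist, edges = np.histogram(img.ravel(), bins=bins)
    prob = hist.astype(float) / hist.sum()
    levels = 0.5 * (edges[:-1] + edges[1:])     # bin centers

    best_t, best_var = levels[0], -1.0
    sum_all = np.sum(prob * levels)
    w0, sum0 = 0.0, 0.0
    for i in range(bins - 1):
        w0 += prob[i]                           # weight of the background class
        sum0 += prob[i] * levels[i]
        w1 = 1.0 - w0
        if w0 < 1e-12 or w1 < 1e-12:
            continue
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2    # between-class variance
        if between > best_var:
            best_var, best_t = between, levels[i]
    return best_t
```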

  14. Modified motion-based detection algorithm for dim targets in IR image sequences

    NASA Astrophysics Data System (ADS)

    Srivastava, Hari Babu; Saran, Ram; Kumar, Ashok

    2006-04-01

    An integrated algorithm for the detection of dim point or extended-size, slow- or fast-moving targets is presented in this paper. The proposed algorithm, essentially an innovation over an existing algorithm reported by Nengli Dong et al. [7], carries out morphological operations on the incoming IR data to improve the signal-to-noise ratio (SNR). Entropy thresholding and conjunction-function methods are integrated together. The conjunction-function-based algorithm has been significantly modified to handle fast-moving targets, a limitation of the method proposed by Nengli Dong et al. Our proposed algorithm is able to detect point as well as extended-size targets with low contrast and with frame-to-frame movements varying from sub-pixel to tens of pixels.

  15. Unsupervised unstained cell detection by SIFT keypoint clustering and self-labeling algorithm.

    PubMed

    Muallal, Firas; Schöll, Simon; Sommerfeldt, Björn; Maier, Andreas; Steidl, Stefan; Buchholz, Rainer; Hornegger, Joachim

    2014-01-01

    We propose a novel unstained cell detection algorithm based on unsupervised learning. The algorithm utilizes the scale invariant feature transform (SIFT), a self-labeling algorithm, and two clustering steps in order to achieve high performance in terms of time and detection accuracy. Unstained cell imaging is dominated by phase contrast and bright field microscopy. Therefore, the algorithm was assessed on images acquired using these two modalities. Five cell lines, having in total 37 images and 7250 cells, were considered for the evaluation: CHO, L929, Sf21, HeLa, and Bovine cells. The obtained F-measures were between 85.1 and 89.5. Compared to the state of the art, the algorithm achieves an F-measure very close to that of supervised approaches in much less time.

  16. The development of a line-scan imaging algorithm for the detection of fecal contamination on leafy greens

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Chieh; Kim, Moon S.; Chuang, Yung-Kun; Lee, Hoyoung

    2013-05-01

    This paper reports the development of a multispectral algorithm, using the line-scan hyperspectral imaging system, to detect fecal contamination on leafy greens. Fresh bovine feces were applied to the surfaces of washed loose baby spinach leaves. A hyperspectral line-scan imaging system was used to acquire hyperspectral fluorescence images of the contaminated leaves. Hyperspectral image analysis resulted in the selection of the 666 nm and 688 nm wavebands for a multispectral algorithm to rapidly detect feces on leafy greens, by use of the ratio of fluorescence intensities measured at those two wavebands (666 nm over 688 nm). The algorithm successfully distinguished most of the lowly diluted fecal spots (0.05 g feces/ml water and 0.025 g feces/ml water) and some of the highly diluted spots (0.0125 g feces/ml water and 0.00625 g feces/ml water) from the clean spinach leaves. The results showed the potential of the multispectral algorithm with line-scan imaging system for application to automated food processing lines for food safety inspection of leafy green vegetables.

  17. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    NASA Technical Reports Server (NTRS)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). The Brightness Temperature Difference (BTD) between two wavelength channels has been widely used in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with a 30-day composite, the optical properties of the dust particles, the variability of surface properties, and cloud contamination. Therefore, the current algorithm shows improvements in detecting dust-loaded regions over land during daytime. Finally, the confidence index of the current dust algorithm is shown in 10 × 10 pixels of the MODIS observations. From January to June 2006, the results of the current algorithm are within 64 to 81% of those found using the fine mode fraction (FMF) and aerosol index (AI) from MODIS and the Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over non-polluted land also ranges from 60 to 67%, avoiding errors due to anthropogenic aerosol. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.

  18. A new algorithm for detecting central apnea in neonates

    PubMed Central

    Lee, Hoshik; Rusin, Craig G.; Lake, Douglas E.; Clark, Matthew T.; Guin, Lauren; Smoot, Terri J.; Paget-Brown, Alix O.; Vergales, Brooke D.; Kattwinkel, John; Moorman, J. Randall; Delos, John B.

    2017-01-01

    Apnea of prematurity (AOP) is an important and common clinical problem, and is often the rate-limiting process in NICU discharge. Accurate detection of episodes of clinically important neonatal apnea using existing chest impedance monitoring is a clinical imperative. The technique relies on changes in impedance as the lungs fill with air, a high impedance substance. A potential confounder, however, is blood coursing through the heart. Thus the cardiac signal during apnea might be mistaken for breathing. We report here a new filter to remove the cardiac signal from the chest impedance that employs a novel resampling technique optimally suited to remove the heart rate signal, allowing improved apnea detection. We also develop an apnea detection method that employs the chest impedance after cardiac filtering. The method has been applied to a large database of physiological signals, and we prove that, compared to the presently-used monitors, the new method gives substantial improvement in apnea detection. PMID:22156193

  19. Detection of convective initiation using Meteosat SEVIRI: implementation in and verification with the tracking and nowcasting algorithm Cb-TRAM

    NASA Astrophysics Data System (ADS)

    Merk, D.; Zinner, T.

    2013-08-01

    In this paper a new detection scheme for convective initiation (CI) under day and night conditions is presented. The new algorithm combines the strengths of two existing methods for detecting CI with geostationary satellite data. It uses the channels of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG). For the new algorithm five infrared (IR) criteria from the Satellite Convection Analysis and Tracking algorithm (SATCAST) and one high-resolution visible channel (HRV) criterion from Cb-TRAM were adapted. This set of criteria aims to identify the typical development of quickly developing convective cells in an early stage. The different criteria include time trends of the 10.8 IR channel, and IR channel differences, as well as their time trends. To provide the trend fields an optical-flow-based method is used: the pyramidal matching algorithm, which is part of Cb-TRAM. The new detection scheme is implemented in Cb-TRAM, and is verified for seven days which comprise different weather situations in central Europe. Contrasted with the original early-stage detection scheme of Cb-TRAM, skill scores are provided. From the comparison against detections of later thunderstorm stages, which are also provided by Cb-TRAM, a decrease in false prior warnings (false alarm ratio) from 91 to 81% is presented, an increase of the critical success index from 7.4 to 12.7%, and a decrease of the BIAS from 320 to 146% for normal scan mode. Similar trends are found for rapid scan mode. Most obvious is the decline of false alarms found for the synoptic class "cold air" masses.

  20. Detection of convective initiation using Meteosat SEVIRI: implementation in and verification with the tracking and nowcasting algorithm Cb-TRAM

    NASA Astrophysics Data System (ADS)

    Merk, D.; Zinner, T.

    2013-02-01

    In this paper a new detection scheme for convective initiation (CI) under day and night conditions is presented. The new algorithm combines the strengths of two existing methods for detecting convective initiation with geostationary satellite data and uses the channels of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG). For the new algorithm five infrared criteria from the Satellite Convection Analysis and Tracking algorithm (SATCAST) and one High Resolution Visible channel (HRV) criterion from Cb-TRAM were adapted. This set of criteria aims to identify the typical development of quickly developing convective cells in an early stage. The different criteria include time trends of the 10.8 IR channel and IR channel differences as well as their time trends. To provide the trend fields an optical-flow-based method is used: the pyramidal matching algorithm, which is part of Cb-TRAM. The new detection scheme is implemented in Cb-TRAM and is verified for seven days which comprise different weather situations in central Europe. Contrasted with the original early-stage detection scheme of Cb-TRAM, skill scores are provided. From the comparison against detections of later thunderstorm stages, which are also provided by Cb-TRAM, a decrease in false prior warnings (false alarm ratio) from 91 to 81% is presented, an increase of the critical success index from 7.4 to 12.7%, and a decrease of the BIAS from 320 to 146% for normal scan mode. Similar trends are found for rapid scan mode. Most obvious is the decline of false alarms found for synoptic conditions with upper cold air masses triggering convection.

  1. Design of asynchronous phase detection algorithms optimized for wide frequency response.

    PubMed

    Crespo, Daniel; Quiroga, Juan Antonio; Gomez-Pedrero, Jose Antonio

    2006-06-10

    In many fringe pattern processing applications the local phase has to be obtained from a sinusoidal irradiance signal with unknown local frequency. This process is called asynchronous phase demodulation. Existing algorithms for asynchronous phase detection, or asynchronous algorithms, have been designed to yield no algebraic error in the recovered value of the phase for any signal frequency. However, each asynchronous algorithm has a characteristic frequency response curve. Existing asynchronous algorithms present a range of frequencies with low response, reaching zero for particular values of the signal frequency. For real noisy signals, low response implies a low signal-to-noise ratio in the recovered phase and therefore unreliable results. We present a new Fourier-based methodology for designing asynchronous algorithms with any user-defined frequency response curve and known limit of algebraic error. We show how asynchronous algorithms designed with this method can have better properties for real conditions of noise and signal frequency variation.

  2. Planet Detection Algorithms for the Terrestrial Planet Finder-C

    NASA Astrophysics Data System (ADS)

    Kasdin, N. J.; Braems, I.

    2005-12-01

    Critical to mission planning for the terrestrial planet finder coronagraph (TPF-C) is the ability to estimate integration times for planet detection. This detection is complicated by the presence of background noise due to local and exo-zodiacal dust, by residual speckle due to optical errors, and by the dependence of the PSF shape on the specific coronagraph. In this paper we examine in detail the use of PSF fitting (matched filtering) for planet detection, derive probabilistic bounds for the signal-to-noise ratio by balancing missed detection and false alarm rates, and demonstrate that this is close to the optimal linear detection technique. We then compare to a Bayesian detection approach and show that for very low background the Bayesian method offers integration time improvements, but rapidly approaches the PSF fitting result for reasonable levels of background noise. We confirm these results via Monte Carlo simulations. This work was supported under a grant from the Jet Propulsion Laboratory and by a fellowship from the Institut National de Recherche en Informatique et Automatique (INRIA).
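
    In its simplest white-noise form, PSF fitting via matched filtering amounts to correlating the image with the known PSF shape and thresholding the result, with the threshold trading missed detections against false alarms. The following is a minimal sketch under those simplifying assumptions, not the TPF-C study's actual noise model; all array names and parameter values are invented for illustration.

```python
import numpy as np
from scipy import signal
from scipy.stats import norm

def matched_filter_snr(image, psf, noise_sigma):
    """Correlate the image with the (unit-energy) PSF and return an SNR map."""
    template = psf / np.sqrt(np.sum(psf**2))          # normalize template energy
    corr = signal.correlate2d(image, template, mode="same")
    return corr / noise_sigma                          # per-pixel detection SNR

# Threshold for a desired false-alarm probability (Gaussian noise assumption)
p_fa = 1e-4
threshold = norm.isf(p_fa)     # one-sided Gaussian tail

# Synthetic example: Gaussian PSF "planet" buried in white noise
x = np.arange(-5, 6)
psf = np.exp(-(x[:, None]**2 + x[None, :]**2) / 4.0)
rng = np.random.default_rng(1)
scene = rng.normal(0, 1.0, (64, 64))
scene[30:41, 30:41] += 4.0 * psf / np.sqrt(np.sum(psf**2))   # SNR ~ 4 source
snr = matched_filter_snr(scene, psf, noise_sigma=1.0)
print("peak SNR:", snr.max(), "pixels above threshold:", (snr > threshold).sum())
```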

  3. An algorithm for image clusters detection and identification based on color for an autonomous mobile robot

    SciTech Connect

    Uy, D.L.

    1996-02-01

    An algorithm for detection and identification of image clusters or "blobs" based on color information for an autonomous mobile robot is developed. The input image data are first processed using a crisp color fuzzifier, a binary smoothing filter, and a median filter. The processed image data are then input to the image cluster detection and identification program. The program employs the concept of an "elastic rectangle" that stretches in such a way that the whole blob is finally enclosed in a rectangle. A C program was developed to test the algorithm. The algorithm was tested only on image data of 8x8 size with different numbers of blobs in them. The algorithm works very well in detecting and identifying image clusters.

  4. An algorithm for on-line detection of high frequency oscillations related to epilepsy.

    PubMed

    López-Cuevas, Armando; Castillo-Toledo, Bernardino; Medina-Ceja, Laura; Ventura-Mejía, Consuelo; Pardo-Peña, Kenia

    2013-06-01

    Recent studies suggest that the appearance of signals with high frequency oscillation components in specific regions of the brain is related to the incidence of epilepsy. These oscillations are in general small in amplitude and short in duration, making them difficult to identify. The analysis of these oscillations is particularly important in epilepsy, and their study could lead to the development of better medical treatments. Therefore, the development of algorithms for the detection of these high frequency oscillations is of great importance. In this work, a new algorithm for automatic detection of high frequency oscillations is presented. This algorithm uses approximate entropy and artificial neural networks to extract features in order to detect and classify high frequency components in electrophysiological signals. In contrast to existing algorithms, the one proposed here is fast and accurate, and can be implemented on-line, thus reducing the time required to analyze the experimental electrophysiological signals.
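
    Approximate entropy (ApEn), one of the features mentioned, can be computed directly from a sliding-window embedding of the signal. Below is a compact generic sketch (parameters m and r are the usual embedding dimension and tolerance); it is not the authors' pipeline, and the defaults are assumptions.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy of a 1-D signal (Pincus-style definition)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)          # commonly used default tolerance

    def phi(m):
        # all length-m subsequences
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of subsequences
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # fraction of subsequences within tolerance r (self-matches included)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Regular oscillation vs. noise: higher ApEn indicates more irregularity
t = np.linspace(0, 1, 400)
print(approximate_entropy(np.sin(2 * np.pi * 10 * t)))                  # low
print(approximate_entropy(np.random.default_rng(0).normal(size=400)))   # higher
```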

  5. Creating a Successful Citizen Science Model to Detect and Report Invasive Species

    ERIC Educational Resources Information Center

    Gallo, Travis; Waitt, Damon

    2011-01-01

    The Invaders of Texas program is a successful citizen science program in which volunteers survey and monitor invasive plants throughout Texas. Invasive plants are being introduced at alarming rates, and our limited knowledge about their distribution is a major cause for concern. The Invaders of Texas program trains citizen scientists to detect the…

  6. Parameters for successful implant integration revisited part II: algorithm for immediate loading diagnostic factors.

    PubMed

    Bahat, Oded; Sullivan, Richard M

    2010-05-01

    Immediate loading of dental implants has become a widely reported practice with success rates ranging from 70.8% to 100%. Although most studies have considered implant survival to be the only measure of success, a better definition includes the long-term stability of the hard and soft tissues around the implant(s) and other adjacent structures, as well as the long-term stability of all the restorative components. The parameters identified in 1981 by Albrektsson and colleagues as influencing the establishment and maintenance of osseointegration have been reconsidered in relation to immediate loading to improve the chances of achieving such success. Two of the six parameters (status of the bone/implant site and implant loading conditions) have preoperative diagnostic implications, whereas three (implant design, surgical technique, and implant finish) may compensate for less-than-ideal site and loading conditions. Factors affecting the outcome of immediate loading are reviewed to assist clinicians attempting to assess its risks and benefits.

  7. Studies of Target Detection Algorithms Which Use Polarimetric Radar Data

    DTIC Science & Technology

    1987-10-28

    For the binary hypothesis problem (target-plus-clutter versus clutter only), the likelihood ratio f(X | w_t+c) / f(X | w_c) is compared with a detection threshold T_D, where w_t+c denotes the target-plus-clutter class and w_c the clutter-only class. This likelihood ratio test yields the best possible probability of detection for a given false alarm probability. An alternative approach is …

  8. Polarization Lidar Liquid Cloud Detection Algorithm for Winter Mountain Storms

    NASA Technical Reports Server (NTRS)

    Sassen, Kenneth; Zhao, Hongjie

    1992-01-01

    We have collected an extensive polarization lidar dataset from elevated sites in the Tushar Mountains of Utah in support of winter storm cloud seeding research and experiments. Our truck-mounted ruby lidar collected zenith, dual-polarization lidar data through a roof window equipped with a wiper system to prevent snowfall accumulation. Lidar returns were collected at a rate of one shot every 1 to 5 min during declared storm periods over the mid-January to mid-March field seasons of 1985 and 1987. The mid-barrier remote sensor field site was located at 2.57 km MSL. Of chief interest to weather modification efforts are the heights of supercooled liquid water (SLW) clouds, which must be known to assess their 'seedability' (i.e., temperature and height suitability for artificially increasing snowfall). We are currently re-examining our entire dataset to determine the climatological properties of SLW clouds in winter storms using an autonomous computer algorithm.

  9. Robust pupil center detection using a curvature algorithm

    NASA Technical Reports Server (NTRS)

    Zhu, D.; Moore, S. T.; Raphan, T.; Wall, C. C. (Principal Investigator)

    1999-01-01

    Determining the pupil center is fundamental for calculating eye orientation in video-based systems. Existing techniques are error prone and not robust because eyelids, eyelashes, corneal reflections or shadows in many instances occlude the pupil. We have developed a new algorithm which utilizes curvature characteristics of the pupil boundary to eliminate these artifacts. Pupil center is computed based solely on points related to the pupil boundary. For each boundary point, a curvature value is computed. Occlusion of the boundary induces characteristic peaks in the curvature function. Curvature values for normal pupil sizes were determined and a threshold was found which together with heuristics discriminated normal from abnormal curvature. Remaining boundary points were fit with an ellipse using a least squares error criterion. The center of the ellipse is an estimate of the pupil center. This technique is robust and accurately estimates pupil center with less than 40% of the pupil boundary points visible.

  10. Clique-detection algorithms for matching three-dimensional molecular structures.

    PubMed

    Gardiner, E J; Artymiuk, P J; Willett, P

    1997-08-01

    The representation of chemical and biological molecules by means of graphs permits the use of a maximum common subgraph (MCS) isomorphism algorithm to identify the structural relationships existing between pairs of such molecular graphs. Clique detection provides an efficient way of implementing MCS detection, and this article reports a comparison of several different clique-detection algorithms when used for this purpose. Experiments with both small molecules and proteins demonstrate that the most efficient algorithm for these particular applications, which typically involve correspondence graphs with low edge densities, is that described by Carraghan and Pardalos. This is shown to be two to three times faster than the Bron-Kerbosch algorithm that has been used previously for MCS applications in chemistry and biology. However, the latter algorithm enables all substructures common to a pair of molecules to be identified, and not just the largest ones, as with the other algorithms considered here. The two algorithms can usefully be combined to increase the efficiency of database-searching systems that use the MCS as a measure of structural similarity.
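
    The MCS-via-clique idea is: build a correspondence graph whose vertices are label-compatible atom pairs and whose edges link pairs with matching inter-atomic distances; a maximum clique in that graph then gives a distance-compatible common substructure. The sketch below uses networkx's Bron-Kerbosch style enumeration (find_cliques), not the Carraghan-Pardalos algorithm discussed in the article; the distance tolerance and input format are assumptions.

```python
import itertools
import networkx as nx
import numpy as np

def correspondence_graph(coords_a, labels_a, coords_b, labels_b, tol=0.5):
    """Vertices = label-compatible atom pairs (i, j); edges connect vertex
    pairs whose inter-atomic distance in A matches that in B within tol."""
    g = nx.Graph()
    pairs = [(i, j) for i in range(len(labels_a)) for j in range(len(labels_b))
             if labels_a[i] == labels_b[j]]
    g.add_nodes_from(pairs)
    for (i1, j1), (i2, j2) in itertools.combinations(pairs, 2):
        if i1 == i2 or j1 == j2:
            continue
        da = np.linalg.norm(coords_a[i1] - coords_a[i2])
        db = np.linalg.norm(coords_b[j1] - coords_b[j2])
        if abs(da - db) <= tol:
            g.add_edge((i1, j1), (i2, j2))
    return g

def mcs_mapping(coords_a, labels_a, coords_b, labels_b):
    """Return the atom-atom mapping given by a maximum clique."""
    g = correspondence_graph(np.asarray(coords_a), labels_a,
                             np.asarray(coords_b), labels_b)
    return max(nx.find_cliques(g), key=len, default=[])
```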

  11. Stride search: A general algorithm for storm detection in high-resolution climate data

    SciTech Connect

    Bosler, Peter A.; Roesler, Erika L.; Taylor, Mark A.; Mundt, Miranda R.

    2016-04-13

    This study discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared: the commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. The Stride Search algorithm is defined independently of the spatial discretization associated with a particular data set. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. Differences between the two algorithms arise for some storms due to their different definition of search regions in physical space. The physical space associated with each Stride Search region is constant, regardless of data resolution or latitude, and Stride Search is therefore capable of searching all regions of the globe in the same manner. Stride Search's ability to search high latitudes is demonstrated for the case of polar low detection. Wall clock time required for Stride Search is shown to be smaller than a grid point search of the same data, and the relative speed up associated with Stride Search increases as resolution increases.

  12. Stride search: A general algorithm for storm detection in high-resolution climate data

    DOE PAGES

    Bosler, Peter A.; Roesler, Erika L.; Taylor, Mark A.; ...

    2016-04-13

    This study discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared: the commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. The Stride Search algorithm is defined independently of the spatial discretization associated with a particular data set. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. Differences between the two algorithms arise for some storms due to their different definition of search regions in physical space. The physical space associated with each Stride Search region is constant, regardless of data resolution or latitude, and Stride Search is therefore capable of searching all regions of the globe in the same manner. Stride Search's ability to search high latitudes is demonstrated for the case of polar low detection. Wall clock time required for Stride Search is shown to be smaller than a grid point search of the same data, and the relative speed up associated with Stride Search increases as resolution increases.

  13. Assessment of Anovulation in Eumenorrheic Women: Comparison of Ovulation Detection Algorithms

    PubMed Central

    Lynch, Kristine E.; Mumford, Sunni L.; Schliep, Karen C.; Whitcomb, Brian W.; Zarek, Shvetha M.; Pollack, Anna Z; Bertone-Johnson, Elizabeth R.; Danaher, Michelle; Wactawski-Wende, Jean; Gaskins, Audrey J.; Schisterman, Enrique F.

    2014-01-01

    Objective: To compare previously used algorithms to identify anovulatory menstrual cycles in women self-reporting regular menses. Design: Prospective cohort study. Setting: Western New York. Study participants: 259 healthy, regularly menstruating women followed for one (n=9) or two (n=250) menstrual cycles (2005–2007). Intervention(s): None. Main Outcome Measure(s): Prevalence of sporadic anovulatory cycles identified using eleven previously defined algorithms that utilize estradiol, progesterone, and luteinizing hormone (LH) concentrations. Result(s): Algorithms based on serum LH, estradiol, and progesterone levels detected a prevalence of anovulation across the study period of 5.5% to 12.8% (concordant classification for 91.7% to 97.4% of cycles). The prevalence of anovulatory cycles varied from 3.4% to 18.6% using algorithms based on urinary LH alone or combined with levels of the primary estradiol metabolite, estrone-3-glucuronide (E3G). Conclusion(s): The prevalence of anovulatory cycles among healthy women varied by algorithm. Mid-cycle LH surge urine-based algorithms used in over-the-counter fertility monitors tended to classify a higher proportion of anovulatory cycles compared to luteal phase progesterone serum-based algorithms. Our study demonstrates that algorithms based on the LH surge, alone or in conjunction with E3G, potentially estimate a higher percentage of anovulatory episodes. Addition of measurements of post-ovulatory serum progesterone or urine pregnanediol may aid in detecting ovulation. PMID:24875398

  14. Microwave detection of breast tumors: comparison of skin subtraction algorithms

    NASA Astrophysics Data System (ADS)

    Fear, Elise C.; Stuchly, Maria A.

    2000-07-01

    Early detection of breast cancer is an important part of effective treatment. Microwave detection of breast cancer is of interest due to the contrast in dielectric properties of normal and malignant breast tissues. We are investigating a confocal microwave imaging system that adapts ideas from ground penetrating radar to breast cancer detection. In the proposed system, the patient lies prone with the breast extending through a hole in the examining table and encircled by an array of antennas. The breast is illuminated sequentially by each antenna with an ultrawideband signal, and the returns are recorded at the same antenna. Because the antennas are offset from the breast, the dominant component of the recorded returns is the reflection from the thin layer of breast skin. Two methods of reducing this reflection are compared, namely approximation of the signal with two time shifted, scaled and summed returns from a cylinder of skin, and subtraction of the mean of the set of aligned returns. Both approaches provide effective decrease of the skin signal, allowing for tumor detection.
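
    The second skin-subtraction method in this abstract, subtracting the mean of the set of time-aligned returns from each antenna's return, is a one-line array operation once the returns are aligned. A minimal sketch with assumed array shapes follows; it is not the authors' full calibration chain.

```python
import numpy as np

def subtract_mean_skin_response(aligned_returns):
    """aligned_returns: (n_antennas, n_samples) array of time-aligned signals.

    The skin reflection is nearly identical across antennas, so the
    per-sample mean approximates it; removing it leaves the weaker,
    antenna-dependent tumor response plus residual clutter.
    """
    skin_estimate = aligned_returns.mean(axis=0, keepdims=True)
    return aligned_returns - skin_estimate

# Usage: calibrated = subtract_mean_skin_response(returns)   # returns: (N, T)
```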

  15. Change Detection Algorithms for Information Assurance of Computer Networks

    DTIC Science & Technology

    2002-01-01

    [Report excerpt: the list of figures includes Code Red I infection counts (source: CAIDA) and the number of probes due to the w32.Leave worm; Section 2.3.2 addresses detection of an exponential signal in noise under an i.i.d. assumption on the observations.]

  16. An unsupervised learning algorithm for fatigue crack detection in waveguides

    NASA Astrophysics Data System (ADS)

    Rizzo, Piervincenzo; Cammarata, Marcello; Dutta, Debaditya; Sohn, Hoon; Harries, Kent

    2009-02-01

    Ultrasonic guided waves (UGWs) are a useful tool in structural health monitoring (SHM) applications that can benefit from built-in transduction, moderately large inspection ranges, and high sensitivity to small flaws. This paper describes an SHM method based on UGWs and outlier analysis devoted to the detection and quantification of fatigue cracks in structural waveguides. The method combines the advantages of UGWs with the outcomes of the discrete wavelet transform (DWT) to extract defect-sensitive features aimed at performing a multivariate diagnosis of damage. In particular, the DWT is exploited to generate a set of relevant wavelet coefficients to construct a uni-dimensional or multi-dimensional damage index vector. The vector is fed to an outlier analysis to detect anomalous structural states. The general framework presented in this paper is applied to the detection of fatigue cracks in a steel beam. The probing hardware consists of a National Instruments PXI platform that controls the generation and detection of the ultrasonic signals by means of piezoelectric transducers made of lead zirconate titanate. The effectiveness of the proposed approach to diagnose the presence of defects as small as a few per cent of the waveguide cross-sectional area is demonstrated.

  17. Experimental infrared point-source detection using an iterative generalized likelihood ratio test algorithm.

    PubMed

    Nichols, J M; Waterman, J R

    2017-03-01

    This work documents the performance of a recently proposed generalized likelihood ratio test (GLRT) algorithm in detecting thermal point-source targets against a sky background. A calibrated source is placed above the horizon at various ranges and then imaged using a mid-wave infrared camera. The proposed algorithm combines a so-called "shrinkage" estimator of the background covariance matrix and an iterative maximum likelihood estimator of the point-source parameters to produce the GLRT statistic. It is clearly shown that the proposed approach results in better detection performance than either standard energy detection or previous implementations of the GLRT detector.

  18. Edge detection based on genetic algorithm and sobel operator in image

    NASA Astrophysics Data System (ADS)

    Tong, Xin; Ren, Aifeng; Zhang, Haifeng; Ruan, Hang; Luo, Ming

    2011-10-01

    The genetic algorithm (GA) is widely used for optimization problems, using techniques inspired by natural evolution. In this paper we present a new edge detection technique based on the GA and the Sobel operator. The Sobel edge detector built in DSP Builder is first used to determine the boundaries of objects within an image. Then a genetic algorithm implemented with SOPC Builder is used to propose a new threshold algorithm for the image processing. Finally, the performance of the new best-threshold-based edge detection technique, implemented in DSP Builder and Quartus II software, is compared both qualitatively and quantitatively with the single Sobel operator. The new edge detection technique is shown to perform very well in terms of robustness to noise, edge search capability, and quality of the final edge image.

  19. Execution Time Optimization Analysis on Multiple Algorithms Performance of Moving Object Edge Detection

    NASA Astrophysics Data System (ADS)

    Islam, Syed Zahurul; Islam, Syed Zahidul; Jidin, Razali; Ali, Mohd. Alauddin Mohd.

    2010-06-01

    Computer vision and digital image processing comprise a variety of applications, including convolution, edge detection, and contrast enhancement. This paper analyzes execution time optimization between the Sobel and Canny image processing algorithms for moving-object edge detection. The Sobel and Canny edge detection algorithms are described with pseudo code and detailed flow charts and implemented in C and MATLAB, respectively, on different platforms to evaluate performance and execution time for moving cars. It is shown that the Sobel algorithm is very effective for multiple moving cars and blurred images, with efficient execution time. Moreover, the convolution operation of Canny takes 94-95% of the total execution time and produces thin and smooth but redundant edges. This also makes Sobel more robust for detecting the edges of moving cars.

  20. An algorithm to detect low incidence arrhythmic events in electrocardiographic records from ambulatory patients.

    PubMed

    Hungenahally, S K; Willis, R J

    1994-11-01

    An algorithm was devised to detect low incidence arrhythmic events in electrocardiograms obtained during ambulatory monitoring. The algorithm incorporated baseline correction and R wave detection. The RR interval was used to identify tachycardia, bradycardia, and premature ventricular beats. Only a few beats before and after the arrhythmic event were stored. The software was evaluated on a prototype hardware system which consisted of an Intel 86/30 single board computer with a suitable analog pre-processor and an analog to digital converter. The algorithm was used to determine the incidence and type of arrhythmia in records from an ambulatory electrocardiogram (ECG) database and from a cardiac exercise laboratory. These results were compared to annotations on the records which were assumed to be correct. Standard criteria used previously to evaluate algorithms designed for arrhythmia detection were sensitivity, specificity, and diagnostic accuracy. Sensitivities ranging from 77 to 100%, specificities from 94 to 100%, and diagnostic accuracies from 92 to 100% were obtained on the different data sets. These results compare favourably with published results based on more elaborate algorithms. By circumventing the need to make a continuous record of the ECG, the algorithm could form the basis for a compact monitoring device for the detection of arrhythmic events which are so infrequent that standard 24-h Holter monitoring is insufficient.
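
    The detection logic described here, R-wave detection followed by RR-interval rules for tachycardia, bradycardia, and premature beats, can be sketched with simple thresholding. The threshold values and the generic peak-detection call below are illustrations, not the paper's tuned criteria.

```python
import numpy as np
from scipy.signal import find_peaks

def classify_rr_events(ecg, fs, tachy_bpm=100, brady_bpm=60, pvc_ratio=0.8):
    """Return (sample index, label) pairs for beats flagged by RR-interval rules."""
    # crude R-wave detection: prominent peaks at least 0.25 s apart
    r_peaks, _ = find_peaks(ecg, distance=int(0.25 * fs),
                            prominence=0.5 * np.std(ecg))
    rr = np.diff(r_peaks) / fs                     # RR intervals in seconds
    events = []
    for k, interval in enumerate(rr):
        bpm = 60.0 / interval
        if bpm > tachy_bpm:
            events.append((r_peaks[k + 1], "tachycardia"))
        elif bpm < brady_bpm:
            events.append((r_peaks[k + 1], "bradycardia"))
        # premature beat: RR much shorter than the running average so far
        if k > 2 and interval < pvc_ratio * np.mean(rr[:k]):
            events.append((r_peaks[k + 1], "premature beat"))
    return events
```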

  1. Development of a fire detection algorithm for the COMS (Communication Ocean and Meteorological Satellite)

    NASA Astrophysics Data System (ADS)

    Kim, Goo; Kim, Dae Sun; Lee, Yang-Won

    2013-10-01

    Forest fires cause great ecological and economic damage. South Korea is particularly liable to suffer from forest fires because mountainous terrain occupies more than half of its land. South Korea recently launched the COMS (Communication Ocean and Meteorological Satellite), which is a geostationary satellite. In this paper, we developed a forest fire detection algorithm using COMS data. Generally, forest fire detection algorithms use the characteristics of the 4 and 11 micrometer brightness temperatures. Our algorithm additionally uses LST (Land Surface Temperature). We confirmed the results of our fire detection algorithm using statistical data of the Korea Forest Service and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) images. We used data for South Korea on April 1 and 2, 2011, when both small and large forest fires occurred. The detection rate was 80% in terms of the frequency of the forest fires and 99% in terms of the damaged area. Considering the number of COMS channels and its low resolution, this result is a remarkable outcome. To provide users with the result of our algorithm, we developed a smartphone application using JSP (Java Server Pages). This application works regardless of the smartphone's operating system. The algorithm may be unsuitable for other areas and dates because only two days of data were used; to improve its accuracy, analysis using long-term data is needed as future work.

  2. ALGORITHMS FOR OPTIMIZATION OF SYSTEM PERFORMANCE IN LAYERED DETECTION SYSTEMS UNDER DETECTOR CORRELATION

    SciTech Connect

    Wood, Thomas W.; Heasler, Patrick G.; Daly, Don S.

    2010-07-15

    Almost all of the "architectures" for radiation detection systems in Department of Energy (DOE) and other USG programs rely on some version of layered detector deployment. Efficacy analyses of layered (or more generally extended) detection systems in many contexts often assume statistical independence among detection events and thus predict monotonically increasing system performance with the addition of detection layers. We show this to be a false conclusion for the ROC curves typical of most current technology gamma detectors, and more generally show that statistical independence is often an unwarranted assumption for systems in which there is ambiguity about the objects to be detected. In such systems, a model of correlation among detection events allows optimization of system algorithms for interpretation of detector signals. These algorithms are framed as optimal discriminant functions in joint signal space, and may be applied to gross counting or spectroscopic detector systems. We have shown how system algorithms derived from this model dramatically improve detection probabilities compared to the standard serial detection operating paradigm for these systems. These results would not surprise anyone who has confronted the problem of correlated errors (or failure rates) in analogous contexts, but it seems to be largely underappreciated among those analyzing the radiation detection problem: independence is widely assumed, and experimental studies typically fail to measure correlation. This situation, if not rectified, will lead to several unfortunate results, including (1) overconfidence in system efficacy, (2) overinvestment in layers of similar technology, and (3) underinvestment in diversity among detection assets.

  3. Performance Assessment Method for a Forged Fingerprint Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Shin, Yong Nyuo; Jun, In-Kyung; Kim, Hyun; Shin, Woochang

    The threat of invasion of privacy and of the illegal appropriation of information both increase with the expansion of the biometrics service environment to open systems. However, while certificates or smart cards can easily be cancelled and reissued if found to be missing, there is no way to recover the unique biometric information of an individual following a security breach. With the recognition that this threat factor may disrupt the large-scale civil service operations approaching implementation, such as electronic ID cards and e-Government systems, many agencies and vendors around the world continue to develop forged fingerprint detection technology, but no objective performance assessment method has, to date, been reported. Therefore, in this paper, we propose a methodology designed to evaluate the objective performance of the forged fingerprint detection technology that is currently attracting a great deal of attention.

  4. Fall detection algorithm in energy efficient multistate sensor system.

    PubMed

    Korats, Gundars; Hofmanis, Janis; Skorodumovs, Aleksejs; Avots, Egils

    2015-01-01

    Health issues for elderly people may lead to different injuries obtained during simple activities of daily living (ADL). Potentially the most dangerous are unintentional falls, which may be critical or even lethal to some patients due to the heavy injury risk. Many fall detection systems have been proposed, but only recently have such health care systems become available. Nevertheless, sensor design, accuracy, and energy-consumption efficiency can be improved. In this paper we present a single 3-axial accelerometer energy-efficient sensor system. Power saving is achieved by selective event processing triggered by a fall detection procedure. Our simulations show 100% accuracy when the threshold parameters are chosen correctly. The estimated energy consumption indicates significantly extended battery life.

  5. Comparison of algorithms for automatic border detection of melanoma in dermoscopy images

    NASA Astrophysics Data System (ADS)

    Srinivasa Raghavan, Sowmya; Kaur, Ravneet; LeAnder, Robert

    2016-09-01

    Melanoma is one of the most rapidly accelerating cancers in the world [1]. Early diagnosis is critical to an effective cure. We propose a new algorithm for more accurately detecting melanoma borders in dermoscopy images. Proper border detection requires eliminating occlusions like hair and bubbles by processing the original image. The preprocessing step involves transforming the RGB image to the CIE L*u*v* color space, in order to decouple brightness from color information, then increasing contrast using contrast-limited adaptive histogram equalization (CLAHE), followed by artifact removal using a Gaussian filter. After preprocessing, the Chan-Vese technique segments the preprocessed images to create a lesion mask, which undergoes a morphological closing operation. Next, the largest central blob in the lesion is detected, after which the blob is dilated to generate an image output mask. Finally, the automatically-generated mask is compared to the manual mask by calculating the XOR error [3]. Our border detection algorithm was developed using training and test sets of 30 and 20 images, respectively. This detection method was compared to the SRM method [4] by calculating the average XOR error for each of the two algorithms. The average error for test images was 0.10 using the new algorithm and 0.99 using the SRM method. In comparing the average error values produced by the two algorithms, it is evident that the average XOR error for our technique is lower than that of the SRM method, implying that the new algorithm detects melanoma borders more accurately than the SRM algorithm.
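
    The preprocessing and segmentation steps described here (color-space conversion, CLAHE, Gaussian smoothing, Chan-Vese segmentation, morphological closing, XOR error) have scikit-image counterparts. The sketch below is a hedged outline of such a pipeline; the parameter values are assumptions, not the paper's settings, and the exact keyword names may vary slightly between scikit-image versions.

```python
import numpy as np
from skimage import color, exposure, filters, morphology, segmentation

def segment_lesion(rgb_image, manual_mask=None):
    # luminance channel of CIE L*u*v* decouples brightness from color
    luv = color.rgb2luv(rgb_image)
    l_channel = luv[..., 0] / 100.0
    # contrast-limited adaptive histogram equalization, then smoothing
    enhanced = exposure.equalize_adapthist(l_channel)
    smoothed = filters.gaussian(enhanced, sigma=2)
    # Chan-Vese active contour segmentation and morphological closing
    mask = segmentation.chan_vese(smoothed, mu=0.25, max_num_iter=200)
    mask = morphology.binary_closing(mask, morphology.disk(5))
    if manual_mask is not None:
        xor_error = np.logical_xor(mask, manual_mask).mean()
        return mask, xor_error
    return mask
```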

  6. Practical comparison of aberration detection algorithms for biosurveillance systems.

    PubMed

    Zhou, Hong; Burkom, Howard; Winston, Carla A; Dey, Achintya; Ajani, Umed

    2015-10-01

    National syndromic surveillance systems require optimal anomaly detection methods. For method performance comparison, we injected multi-day signals stochastically drawn from lognormal distributions into time series of aggregated daily visit counts from the U.S. Centers for Disease Control and Prevention's BioSense syndromic surveillance system. The time series corresponded to three different syndrome groups: rash, upper respiratory infection, and gastrointestinal illness. We included a sample of facilities with data reported every day and with median daily syndromic counts ⩾1 over the entire study period. We compared anomaly detection methods of five control chart adaptations, a linear regression model and a Poisson regression model. We assessed sensitivity and timeliness of these methods for detection of multi-day signals. At a daily background alert rate of 1% and 2%, the sensitivities and timeliness ranged from 24 to 77% and 3.3 to 6.1 days, respectively. The overall sensitivity and timeliness increased substantially after stratification by weekday versus weekend and holiday. Adjusting the baseline syndromic count by the total number of facility visits gave consistently improved sensitivity and timeliness without stratification, but it provided better performance when combined with stratification. The daily syndrome/total-visit proportion method did not improve the performance. In general, alerting based on linear regression outperformed control chart based methods. A Poisson regression model obtained the best sensitivity in the series with high-count data.
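
    As a concrete illustration of the control-chart style of aberration detection compared in this study, the sketch below implements a simple adaptive chart that alerts when a day's count exceeds the baseline mean plus a multiple of the baseline standard deviation (a generic rule in the spirit of the EARS-type charts; the specific adaptations evaluated in the paper differ, and the window lengths here are assumptions).

```python
import numpy as np

def control_chart_alerts(daily_counts, baseline_days=28, guard_days=2, k=3.0):
    """Return indices of days whose count exceeds mean + k*std of a
    trailing baseline window (a short guard band is excluded)."""
    counts = np.asarray(daily_counts, dtype=float)
    alerts = []
    for t in range(baseline_days + guard_days, len(counts)):
        baseline = counts[t - baseline_days - guard_days:t - guard_days]
        mu, sigma = baseline.mean(), baseline.std(ddof=1)
        if sigma > 0 and counts[t] > mu + k * sigma:
            alerts.append(t)
    return alerts

# Example: ~20 visits/day background with an injected 4-day signal
rng = np.random.default_rng(2)
series = rng.poisson(20, 120).astype(float)
series[80:84] += 15
print(control_chart_alerts(series))
```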

  7. Clinical implementation of a neonatal seizure detection algorithm.

    PubMed

    Temko, Andriy; Marnane, William; Boylan, Geraldine; Lightbody, Gordon

    2015-02-01

    Technologies for automated detection of neonatal seizures are gradually moving towards cot-side implementation. The aim of this paper is to present different ways to visualize the output of a neonatal seizure detection system and analyse their influence on performance in a clinical environment. Three different ways to visualize the detector output are considered: a binary output, a probabilistic trace, and a spatio-temporal colormap of seizure observability. As an alternative to visual aids, audified neonatal EEG is also considered. Additionally, a survey on the usefulness and accuracy of the presented methods has been performed among clinical personnel. The main advantages and disadvantages of the presented methods are discussed. The connection between information visualization and different methods to compute conventional metrics is established. The results of the visualization methods along with the system validation results indicate that the developed neonatal seizure detector with its current level of performance would unambiguously be of benefit to clinicians as a decision support system. The results of the survey suggest that a suitable way to visualize the output of neonatal seizure detection systems in a clinical environment is a combination of a binary output and a probabilistic trace. The main healthcare benefits of the tool are outlined. The decision support system with the chosen visualization interface is currently undergoing pre-market European multi-centre clinical investigation to support its regulatory approval and clinical adoption.

  8. An Optional Threshold with SVM Cloud Detection Algorithm and DSP Implementation

    NASA Astrophysics Data System (ADS)

    Zhou, Guoqing; Zhou, Xiang; Yue, Tao; Liu, Yilong

    2016-06-01

    This paper presents a method which combines the traditional threshold method and the SVM method to detect clouds in Landsat-8 images. The proposed method is implemented using a DSP for real-time cloud detection. The DSP platform connects with an emulator and a personal computer. The threshold method is first utilized to obtain a coarse cloud detection result, and then the SVM classifier is used to obtain high cloud detection accuracy. More than 200 cloudy Landsat-8 images were used to test the proposed method. Comparing the proposed method with the SVM method, it is demonstrated that the cloud detection accuracy of each image using the proposed algorithm is higher than that of the SVM algorithm. The results of the experiment demonstrate that the implementation of the proposed method on a DSP can effectively realize accurate real-time cloud detection.

  9. An algorithm for power line detection and warning based on a millimeter-wave radar video.

    PubMed

    Ma, Qirong; Goshi, Darren S; Shih, Yi-Chi; Sun, Ming-Ting

    2011-12-01

    Power-line-strike accidents are a major safety threat for low-flying aircraft such as helicopters, so an automatic warning system for power lines is highly desirable. In this paper we propose an algorithm for detecting power lines in radar videos from an active millimeter-wave sensor. The Hough Transform is employed to detect candidate lines. The major challenge is that the radar videos are very noisy due to ground return. Noise points can fall on the same line, producing peaks after the Hough Transform similar to those of actual cable lines. To differentiate the cable lines from the noise lines, we train a Support Vector Machine to perform the classification. We exploit the Bragg pattern, which is due to the diffraction of the electromagnetic wave on the periodic surface of power lines, and propose a set of features to represent the Bragg pattern for the classifier. We also propose a slice-processing algorithm which supports parallel processing and improves the detection of cables in a cluttered background. Lastly, an adaptive algorithm is proposed to integrate the detection results from individual frames into a reliable video detection decision, in which the temporal correlation of the cable pattern across frames is used to make the detection more robust. Extensive experiments with real-world data validated the effectiveness of our cable detection algorithm.
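
    The first stage described, Hough-transform candidate line detection on each radar frame, can be sketched with OpenCV. The edge-detection thresholds below are placeholders, and the downstream SVM classification of Bragg-pattern features is not shown.

```python
import cv2
import numpy as np

def candidate_lines(frame_8bit, canny_lo=50, canny_hi=150, hough_thresh=80):
    """Return (rho, theta) line candidates from a single-channel 8-bit radar frame.

    In the full system each candidate would then be classified as cable vs.
    clutter using Bragg-pattern features and an SVM; that step is omitted here.
    """
    edges = cv2.Canny(frame_8bit, canny_lo, canny_hi)
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=hough_thresh)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```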

  10. Lesion detection in magnetic resonance brain images by hyperspectral imaging algorithms

    NASA Astrophysics Data System (ADS)

    Xue, Bai; Wang, Lin; Li, Hsiao-Chi; Chen, Hsian Min; Chang, Chein-I.

    2016-05-01

    Magnetic Resonance (MR) images can be considered as multispectral images, so MR imaging can be processed by multispectral imaging techniques such as maximum likelihood classification. Unfortunately, most multispectral imaging techniques are not particularly designed for target detection. On the other hand, hyperspectral imaging is primarily developed to address subpixel detection and mixed pixel classification, for which multispectral imaging is generally not effective. This paper takes advantage of hyperspectral imaging techniques to develop target detection algorithms to find lesions in MR brain images. Since MR images are collected by only three image sequences, T1, T2 and PD, a hyperspectral imaging technique applied to MR images suffers from insufficient dimensionality. To address this issue, two approaches to nonlinear dimensionality expansion are proposed: nonlinear correlation expansion and nonlinear band ratio expansion. Once dimensionality is expanded, hyperspectral imaging algorithms are readily applied. The hyperspectral detection algorithm investigated for lesion detection in MR brain images is the well-known subpixel target detection algorithm called Constrained Energy Minimization (CEM). In order to demonstrate the effectiveness of the proposed CEM in lesion detection, synthetic images provided by BrainWeb are used for experiments.
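
    The CEM detector used here has a closed form: given the sample correlation matrix R of the data and a target signature d, the filter is w = R^{-1} d / (d^T R^{-1} d) and the detector output is w^T x. The numpy sketch below shows that formula together with a schematic band-ratio expansion; the specific expansion used in the paper may differ.

```python
import numpy as np

def cem_detector(pixels, target):
    """pixels: (n_pixels, n_bands) array; target: (n_bands,) signature.
    Returns the CEM detector output for every pixel."""
    X = np.asarray(pixels, dtype=float)
    d = np.asarray(target, dtype=float)
    R = X.T @ X / X.shape[0]                     # sample correlation matrix
    R_inv_d = np.linalg.solve(R, d)
    w = R_inv_d / (d @ R_inv_d)                  # CEM filter weights
    return X @ w

def expand_bands(t1, t2, pd_):
    """Schematic nonlinear band expansion (products and ratios) so that
    3-band MR data has enough dimensionality; an illustrative assumption."""
    eps = 1e-6
    return np.stack([t1, t2, pd_, t1 * t2, t1 * pd_, t2 * pd_,
                     t1 / (t2 + eps), t2 / (pd_ + eps)], axis=-1)
```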

  11. A semi-synchronous label propagation algorithm with constraints for community detection in complex networks

    PubMed Central

    Hou Chin, Jia; Ratnavelu, Kuru

    2017-01-01

    Community structure is an important feature of a complex network, where detection of the community structure can shed some light on the properties of such a complex network. Amongst the proposed community detection methods, the label propagation algorithm (LPA) emerges as an effective detection method due to its time efficiency. Despite this advantage in computational time, the performance of LPA is affected by randomness in the algorithm. A modified LPA, called CLPA-GNR, was proposed recently and it succeeded in handling the randomness issues in the LPA. However, it did not remove the tendency for trivial detection in networks with a weak community structure. In this paper, an improved CLPA-GNR is therefore proposed. In the new algorithm, the unassigned and assigned nodes are updated synchronously while the assigned nodes are updated asynchronously. A similarity score, based on the Sørensen-Dice index, is implemented to detect the initial communities and for breaking ties during the propagation process. Constraints are utilised during the label propagation and community merging processes. The performance of the proposed algorithm is evaluated on various benchmark and real-world networks. We find that it is able to avoid trivial detection while showing substantial improvement in the quality of detection. PMID:28374836

  12. Detection and Segmentation of Erythrocytes in Blood Smear Images Using a Line Operator and Watershed Algorithm

    PubMed Central

    Khajehpour, Hassan; Dehnavi, Alireza Mehri; Taghizad, Hossein; Khajehpour, Esmat; Naeemabadi, Mohammadreza

    2013-01-01

    Most erythrocyte-related diseases are detectable by analysis of hematology images. At the first step of this analysis, segmentation and detection of blood cells are inevitable. In this study, a novel method using a line operator and the watershed algorithm is presented for erythrocyte detection and segmentation in blood smear images, as well as for reducing the over-segmentation of the watershed algorithm, which is useful for segmentation of different types of blood cells having partial overlap. This method uses the gray-scale structure of blood cells, obtained by applying the Euclidean distance transform to the binary images. Applying this transform, the gray intensity of cell images gradually decreases from the center of the cells to their margins. To detect this intensity variation structure, a line operator measuring gray-level variations along several directional line segments is applied. The line segments with maximum and minimum gray-level variations form a special pattern that is applicable for detecting the central regions of cells. The intersection of these regions with the markers obtained by calculating local maxima in the watershed algorithm was used for cell-center detection, as well as for reducing over-segmentation of the watershed algorithm. The method created 1300 markers in the segmentation of 1274 erythrocytes available in 25 blood smear images. The accuracy and sensitivity of the proposed method are 95.9% and 97.99%, respectively. The results show the proposed method's capability in detecting erythrocytes in blood smear images. PMID:24672764
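
    The overall pipeline here (Euclidean distance transform of the binary cell mask, marker detection, marker-controlled watershed) maps onto standard scipy/scikit-image calls. The sketch below uses peak local maxima as markers rather than the paper's line-operator-derived cell centers, so it is only an approximation of the described method.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_cells(binary_mask, min_distance=10):
    """Split touching round cells in a binary mask using a marker-controlled
    watershed on the Euclidean distance transform."""
    distance = ndi.distance_transform_edt(binary_mask)
    peaks = peak_local_max(distance, min_distance=min_distance,
                           labels=binary_mask)
    markers = np.zeros_like(distance, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    labels = watershed(-distance, markers, mask=binary_mask)
    return labels
```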

  13. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
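
    A standard building block for the sequential change-point test described here is the CUSUM statistic, which accumulates log-likelihood ratios between the post- and pre-damage feature distributions and declares a change when the running sum exceeds a threshold chosen for a target false-alarm rate. The Gaussian-case sketch below assumes both distributions are known; the paper's contribution of estimating the unknown post-damage distribution online is not shown.

```python
import numpy as np
from scipy.stats import norm

def cusum_detect(features, mu0, sigma0, mu1, sigma1, threshold):
    """Return the first index at which the CUSUM of log-likelihood ratios
    exceeds `threshold`, or None if no change is declared."""
    s = 0.0
    for t, x in enumerate(features):
        llr = norm.logpdf(x, mu1, sigma1) - norm.logpdf(x, mu0, sigma0)
        s = max(0.0, s + llr)          # reset at zero (Page's CUSUM)
        if s > threshold:
            return t
    return None

# Example: a damage-sensitive feature whose mean shifts at t = 150
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 1.0, 150), rng.normal(1.0, 1.0, 100)])
print(cusum_detect(x, mu0=0.0, sigma0=1.0, mu1=1.0, sigma1=1.0, threshold=10.0))
```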

  14. Application of edge detection algorithm for vision guided robotics assembly system

    NASA Astrophysics Data System (ADS)

    Balabantaray, Bunil Kumar; Jha, Panchanand; Biswal, Bibhuti Bhusan

    2013-12-01

    Machine vision systems have a major role in making robotic assembly systems autonomous. Part detection and identification of the correct part are important tasks which need to be carefully done by a vision system to initiate the process. This process consists of many sub-processes wherein image capturing, digitizing, and enhancing account for reconstructing the part for subsequent operations. Edge detection of the grabbed image therefore plays an important role in the entire image processing activity, and one needs to choose the correct tool for the process with respect to the given environment. In this paper a comparative study of edge detection algorithms for grasping objects in a robotic assembly system is presented. The proposed work is performed in the Matlab R2010a Simulink environment. Four algorithms are compared: the Canny, Roberts, Prewitt, and Sobel edge detection algorithms. An attempt has been made to find the best algorithm for the problem. It is found that the Canny edge detection algorithm gives the best result and minimum error for the intended task.

  15. Fault Detection of Aircraft System with Random Forest Algorithm and Similarity Measure

    PubMed Central

    Park, Wookje; Jung, Sikhang

    2014-01-01

    A fault detection algorithm was developed using a similarity measure and the random forest algorithm. The algorithm was applied to an unmanned aircraft vehicle (UAV) that we prepared. The similarity measure was designed with the help of distance information, and its usefulness was also verified by proof. The fault decision was carried out by calculation of a weighted similarity measure. Twelve available coefficients from the healthy and faulty status data groups were used to determine the decision. The similarity measure weighting was obtained through the random forest algorithm (RFA), which provides data priority. In order to get a fast decision response, a limited number of coefficients was also considered. The relation between detection rate and the amount of feature data was analyzed and illustrated. Through repeated trials of the similarity calculation, a useful amount of data was obtained. PMID:25057508

  16. A Novel Automatic Detection System for ECG Arrhythmias Using Maximum Margin Clustering with Immune Evolutionary Algorithm

    PubMed Central

    Zhu, Bohui; Ding, Yongsheng; Hao, Kuangrong

    2013-01-01

    This paper presents a novel maximum margin clustering method with immune evolution (IEMMC) for automatic diagnosis of electrocardiogram (ECG) arrhythmias. This diagnostic system consists of signal processing, feature extraction, and the IEMMC algorithm for clustering of ECG arrhythmias. First, the raw ECG signal is processed by an adaptive ECG filter based on wavelet transforms, and the waveform of the ECG signal is detected; then, features are extracted from the ECG signal to cluster different types of arrhythmias by the IEMMC algorithm. Three types of performance evaluation indicators are used to assess the effect of the IEMMC method for ECG arrhythmias: sensitivity, specificity, and accuracy. Compared with the K-means and iterSVR algorithms, the IEMMC algorithm shows better performance not only in clustering results but also in terms of global search ability and convergence ability, which proves its effectiveness for the detection of ECG arrhythmias. PMID:23690875

  17. [Two Data Inversion Algorithms of Aerosol Horizontal Distribution Detected by MPL and Error Analysis].

    PubMed

    Lü, Li-hui; Liu, Wen-qing; Zhang, Tian-shu; Lu, Yi-huai; Dong, Yun-sheng; Chen, Zhen-yi; Fan, Guang-qiang; Qi, Shao-shuai

    2015-07-01

    Atmospheric aerosols have important impacts on human health, the environment, and the climate system. Micro Pulse Lidar (MPL) is a new effective tool for detecting the horizontal distribution of atmospheric aerosol, and the extinction coefficient inversion and error analysis are important aspects of the data processing. In order to detect the horizontal distribution of atmospheric aerosol near the ground, the slope and Fernald algorithms were both used to invert horizontal MPL data and the results were compared. The error analysis showed that the errors of the slope and Fernald algorithms arise mainly from the theoretical model and from certain assumptions, respectively. Though some problems still exist in these two horizontal extinction coefficient inversions, they can present the spatial and temporal distribution of aerosol particles accurately, and the correlations with a forward-scattering visibility sensor are both high, with a value of 95%. Furthermore, relatively speaking, the Fernald algorithm is more suitable for the inversion of the horizontal extinction coefficient.
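
    The slope method mentioned here assumes a horizontally homogeneous atmosphere, in which case the logarithm of the range-corrected signal decays linearly with range and the extinction coefficient is minus one-half of that slope. A minimal sketch under that assumption follows; the Fernald inversion, which needs a boundary value and a lidar ratio, is not shown.

```python
import numpy as np

def slope_method_extinction(ranges, signal):
    """Estimate a single (homogeneous-path) extinction coefficient [1/m]
    from MPL range bins `ranges` [m] and raw signal `signal`."""
    rcs = signal * ranges**2                    # range-corrected signal P(r) * r^2
    y = np.log(rcs)
    slope, _ = np.polyfit(ranges, y, 1)         # d ln(P r^2) / dr = -2 * alpha
    return -0.5 * slope
```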

  18. A Speech Endpoint Detection Algorithm Based on BP Neural Network and Multiple Features

    NASA Astrophysics Data System (ADS)

    Shi, Yong-Qiang; Li, Ru-Wei; Zhang, Shuang; Wang, Shuai; Yi, Xiao-Qun

    Focusing on the sharp decline in the performance of endpoint detection algorithms in complicated noise environments, a new speech endpoint detection method based on a BPNN (back propagation neural network) and multiple features is presented. Firstly, the maximum of the short-time autocorrelation function and the spectrum variance of the speech signal are extracted. Secondly, these feature vectors are used as the input to train and model a BP neural network, which is then optimized with a genetic algorithm. Finally, the signal's type is determined according to the output of the neural network. The experiments show that the correct rate of the proposed algorithm is improved, because this method has better robustness and adaptability than algorithms based on the maximum of the short-time autocorrelation function or on spectrum variance alone.

  19. Detection of Human Impacts by an Adaptive Energy-Based Anisotropic Algorithm

    PubMed Central

    Prado-Velasco, Manuel; Ortiz Marín, Rafael; del Rio Cidoncha, Gloria

    2013-01-01

    Boosted by health consequences and the cost of falls in the elderly, this work develops and tests a novel algorithm and methodology to detect human impacts that will act as triggers of a two-layer fall monitor. The two main requirements demanded by socio-healthcare providers—unobtrusiveness and reliability—defined the objectives of the research. We have demonstrated that a very agile, adaptive, and energy-based anisotropic algorithm can provide 100% sensitivity and 78% specificity, in the task of detecting impacts under demanding laboratory conditions. The algorithm works together with an unsupervised real-time learning technique that addresses the adaptive capability, and this is also presented. The work demonstrates the robustness and reliability of our new algorithm, which will be the basis of a smart falling monitor. This is shown in this work to underline the relevance of the results. PMID:24157505

  20. An Individual Tree Detection Algorithm for Dense Deciduous Forests with Spreading Branches

    NASA Astrophysics Data System (ADS)

    Shao, G.

    2015-12-01

    Individual tree information derived from LiDAR may have the potential to assist forest inventory and improve the assessment of forest structure and composition for sustainable forest management. The algorithms developed for individual tree detection commonly focus on finding tree tops to allocate tree positions. However, the spreading branches (cylindrical crowns) in deciduous forests cause such algorithms to work less effectively on dense canopy. This research applies a machine learning algorithm, mean shift, to position individual trees based on the density of the LiDAR point cloud instead of detecting tree tops. The study site is located in a dense oak forest in Indiana, US. The selection of mean shift kernels is discussed. Constant and dynamic bandwidths of the mean shift algorithm are applied and compared.
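
    Positioning trees by clustering point-cloud density with mean shift, rather than finding crown maxima, can be sketched with scikit-learn's MeanShift. The fixed bandwidth and height cutoff below are assumptions; the bandwidth is exactly the choice (constant versus dynamic) that the abstract says is compared.

```python
import numpy as np
from sklearn.cluster import MeanShift

def detect_trees(points_xyz, bandwidth=3.0, min_height=2.0):
    """Cluster above-ground LiDAR returns in x-y; each cluster center is
    taken as a candidate tree position."""
    pts = points_xyz[points_xyz[:, 2] > min_height]      # drop ground/understory
    ms = MeanShift(bandwidth=bandwidth, bin_seeding=True)
    labels = ms.fit_predict(pts[:, :2])
    return ms.cluster_centers_, labels
```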

  1. Wavelet based edge detection algorithm for web surface inspection of coated board web

    NASA Astrophysics Data System (ADS)

    Barjaktarovic, M.; Petricevic, S.

    2010-07-01

    This paper presents a significant improvement of an already installed vision system designed for real-time coated board inspection. The improvement is achieved with the development of a new algorithm for edge detection based on the redundant (undecimated) wavelet transform. Compared to the existing algorithm, better delineation of edges is achieved. This yields a better defect detection probability and more accurate geometrical classification, which will provide an additional reduction of waste. The algorithm will also provide more detailed classification and more reliable tracking of defects. This improvement requires minimal changes in the processing hardware; only a replacement of the graphics card would be needed, adding only negligibly to the system cost. Other changes are accomplished entirely in the image processing software.

  2. Label propagation algorithm based on edge clustering coefficient for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-Kun; Tian, Xue; Li, Ya-Nan; Song, Chen

    2014-08-01

    The label propagation algorithm (LPA) is a graph-based semi-supervised learning algorithm which can predict the labels of unlabeled nodes from a few labeled nodes. It is a community detection method in the field of complex networks. The algorithm is easy to implement, has low complexity, produces remarkable results, and is widely applied in various fields. However, the randomness of the label propagation leads to poor robustness of the algorithm, and the classification result is unstable. This paper proposes an LPA based on the edge clustering coefficient. Each node in the network updates its label from the neighbor node whose connecting edge has the highest edge clustering coefficient, rather than from a random neighbor, so that the random spread of labels is effectively restrained. The experimental results show that the LPA based on the edge clustering coefficient improves the stability and accuracy of the algorithm.

  3. The remarkable success of adaptive cosine estimator in hyperspectral target detection

    NASA Astrophysics Data System (ADS)

    Manolakis, D.; Pieper, M.; Truslow, E.; Cooley, T.; Brueggeman, M.; Lipson, S.

    2013-05-01

    A challenging problem of major importance in hyperspectral imaging applications is the detection of subpixel targets of military and civilian interest. The background clutter surrounding the target acts as an interference source that simultaneously distorts the target spectrum and reduces its strength. Two additional limiting factors are the spectral variability of the background clutter and the spectral variability of the target. Since a result in applied statistics is only as reliable as the assumptions from which it is derived, it is important to investigate whether the basic assumptions used to derive the matched filter and adaptive cosine estimator algorithms are a reasonable description of the physical situation. Careful examination of the linear signal model used to derive these algorithms and of the replacement signal model, which is a more realistic model for subpixel targets, reveals a serious discrepancy between the modeling assumptions and the physical world. Despite this discrepancy and additional mismatches between the assumed and actual signal and clutter models, the adaptive cosine estimator shows remarkable effectiveness in practical target detection applications. The objective of this paper is to explain this effectiveness using a combination of classical statistical detection theory, geometrical interpretations, and a novel realistic performance prediction model for the adaptive cosine estimator.
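
    For reference, the adaptive cosine estimator statistic itself is standard and can be sketched in a few lines of NumPy; the background pixels, target signature, and threshold below are placeholders, not values from the paper.

```python
# Minimal NumPy sketch of the adaptive cosine estimator (ACE) statistic,
# assuming a known target signature `target` and background pixel samples
# `background` (rows = pixels, columns = spectral bands).
import numpy as np

def ace_detector(pixels, target, background):
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    cov_inv = np.linalg.pinv(cov)                 # pseudo-inverse for numerical stability
    s = target - mu
    X = pixels - mu
    num = (X @ cov_inv @ s) ** 2
    den = (s @ cov_inv @ s) * np.einsum('ij,jk,ik->i', X, cov_inv, X) + 1e-12
    return num / den                              # scores in [0, 1]

# Pixels whose ACE score exceeds a chosen threshold are declared detections.
```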

  4. Utilization of advanced clutter suppression algorithms for improved standoff detection and identification of radionuclide threats

    NASA Astrophysics Data System (ADS)

    Cosofret, Bogdan R.; Shokhirev, Kirill; Mulhall, Phil; Payne, David; Harris, Bernard

    2014-05-01

    Technology development efforts seek to increase the capability of detection systems in low Signal-to-Noise regimes encountered in both portal and urban detection applications. We have recently demonstrated significant performance enhancement in existing Advanced Spectroscopic Portals (ASP), Standoff Radiation Detection Systems (SORDS) and handheld isotope identifiers through the use of new advanced detection and identification algorithms. The Poisson Clutter Split (PCS) algorithm is a novel approach for radiological background estimation that improves the detection and discrimination capability of medium resolution detectors. The algorithm processes energy spectra and performs clutter suppression, yielding de-noised gamma-ray spectra that enable significant enhancements in detection and identification of low activity threats with spectral target recognition algorithms. The performance is achievable at the short integration times (0.5 - 1 second) necessary for operation in a high throughput and dynamic environment. PCS has been integrated with ASP, SORDS and RIID units and evaluated in field trials. We present a quantitative analysis of algorithm performance against data collected by a range of systems in several cluttered environments (urban and containerized) with embedded check sources. We show that the algorithm achieves a high probability of detection/identification with low false alarm rates under low SNR regimes. For example, utilizing only 4 out of 12 NaI detectors currently available within an ASP unit, PCS processing demonstrated Pd,ID > 90% at a CFAR (Constant False Alarm Rate) of 1 in 1000 occupancies against weak activity (7 - 8μCi) and shielded sources traveling through the portal at 30 mph. This vehicle speed is a factor of 6 higher than was previously possible and results in significant increase in system throughput and overall performance.

  5. A new approach to optic disc detection in human retinal images using the firefly algorithm.

    PubMed

    Rahebi, Javad; Hardalaç, Fırat

    2016-03-01

    There are various methods and algorithms to detect the optic disc in retinal images. In recent years, much attention has been given to the use of intelligent algorithms. In this paper, we present a new automated method for optic disc detection in human retinal images using the firefly algorithm, an emerging intelligent algorithm inspired by the social behavior of fireflies. The population in this algorithm consists of fireflies, each of which has a specific rate of lighting, or fitness. The insects are compared two by two, and the less attractive insects move toward the more attractive ones. Finally, one insect is selected as the most attractive, and it represents the optimum response to the problem in question. Here, we used the light intensity of the retinal image pixels instead of firefly lightings. The movement of the insects due to local fluctuations produces different light intensity values in the images. Because the optic disc is the brightest area in a retinal image, all of the insects move toward the brightest area and thus locate the optic disc in the image. The results of implementation show that the proposed algorithm achieves an accuracy of 100% on the DRIVE dataset, 95% on the STARE dataset, and 94.38% on the DiaRetDB1 dataset, revealing the high capability and accuracy of the proposed algorithm in detecting the optic disc in retinal images. The average time required to detect the optic disc is 2.13 s for the DRIVE dataset, 2.81 s for the STARE dataset, and 3.52 s for the DiaRetDB1 dataset.
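
    The following is a simplified firefly-algorithm sketch for locating the brightest region of a grayscale retinal image, in the spirit of the description above; the attractiveness model is the standard Yang formulation and all parameter values are illustrative, not the authors' settings.

```python
# Illustrative firefly search for the brightest image region (assumed optic disc).
import numpy as np

def firefly_optic_disc(image, n_fireflies=30, n_iter=100,
                       beta0=1.0, gamma=0.01, alpha=2.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = image.shape
    pos = rng.uniform([0, 0], [h - 1, w - 1], size=(n_fireflies, 2))
    intensity = lambda p: image[int(p[0]), int(p[1])]       # "lighting" of a firefly
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if intensity(pos[j]) > intensity(pos[i]):    # j is more attractive
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)       # attractiveness decays with distance
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * rng.normal(size=2)
                    pos[i] = np.clip(pos[i], [0, 0], [h - 1, w - 1])
    best = max(range(n_fireflies), key=lambda i: intensity(pos[i]))
    return tuple(pos[best].astype(int))                      # estimated disc centre (row, col)
```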

  6. Performance Characterization of Obstacle Detection Algorithms for Aircraft Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia; Coraor, Lee; Gandhi, Tarak; Hartman, Kerry; Yang, Mau-Tsuen

    2000-01-01

    The research reported here is a part of NASA's Synthetic Vision System (SVS) project for the development of a High Speed Civil Transport Aircraft (HSCT). One of the components of the SVS is a module for detection of potential obstacles in the aircraft's flight path by analyzing the images captured by an on-board camera in real-time. Design of such a module includes the selection and characterization of robust, reliable, and fast techniques and their implementation for execution in real-time. This report describes the results of our research in realizing such a design.

  7. Comparison of the specificity of implantable dual chamber defibrillator detection algorithms.

    PubMed

    Hintringer, Florian; Deibl, Martina; Berger, Thomas; Pachinger, Otmar; Roithinger, Franz Xaver

    2004-07-01

    The aim of the study was to compare the specificity of dual chamber ICD detection algorithms for correct classification of supraventricular tachyarrhythmias, as derived from clinical studies, according to study size, in order to detect an impact of sample size on specificity. Furthermore, the study sought to compare the specificities of detection algorithms calculated from clinical data with the specificity calculated from simulations of tachyarrhythmias. A survey was conducted of all available sources providing data regarding the specificity of five dual chamber ICDs. The specificity was correlated with the number of patients included, the number of episodes, and the number of supraventricular tachyarrhythmias recorded. The simulation was performed using tachyarrhythmias recorded in the electrophysiology laboratory. The number of patients included in the studies ranged from 78 to 1,029, the total number of recorded episodes ranged from 362 to 5,788, and the number of supraventricular tachyarrhythmias used for calculation of the specificity for correct detection of these arrhythmias ranged from 100 (Biotronik) to 1,662 (Medtronic). The specificity for correct detection of supraventricular tachyarrhythmias was 90% (Biotronik), 89% (ELA Medical), 89% (Guidant), 68% (Medtronic), and 76% (St. Jude Medical). There was an inverse correlation (r = -0.9, P = 0.037) between the specificity for correct classification of supraventricular tachyarrhythmias and the number of patients. The specificity for correct detection of supraventricular tachyarrhythmias calculated from the simulation, after correction for the clinical prevalence of the simulated tachyarrhythmias, was 95% (Biotronik), 99% (ELA Medical), 94% (Guidant), 93% (Medtronic), and 92% (St. Jude Medical). In conclusion, the specificity of ICD detection algorithms calculated from clinical studies or registries may depend on the number of patients studied. Therefore, a direct comparison between different detection algorithms

  8. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new, well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601
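
    As a hedged illustration of what "unsupervised anomaly detection" looks like in practice (using scikit-learn rather than the paper's own implementations or datasets), the snippet below scores the same unlabeled data with one global method and one local method:

```python
# Global (isolation forest) and local (LOF) anomaly scores on unlabeled data.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(500, 4))        # bulk of the data
outliers = rng.uniform(-6, 6, size=(10, 4))     # a few injected anomalies
X = np.vstack([normal, outliers])

# Global method: isolation forest assigns an anomaly score to every sample.
iso_scores = -IsolationForest(random_state=0).fit(X).score_samples(X)

# Local method: LOF compares each sample's density with its neighbours'.
lof_scores = -LocalOutlierFactor(n_neighbors=20).fit(X).negative_outlier_factor_

top_anomalies = np.argsort(iso_scores)[-10:]    # indices flagged as most anomalous
```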

  9. Forced detection Monte Carlo algorithms for accelerated blood vessel image simulations.

    PubMed

    Fredriksson, Ingemar; Larsson, Marcus; Strömberg, Tomas

    2009-03-01

    Two forced detection (FD) variance reduction Monte Carlo algorithms for image simulations of tissue-embedded objects with matched refractive index are presented. The principle of the algorithms is to force a fraction of the photon weight to the detector at each and every scattering event. The fractional weight is given by the probability of the photon reaching the detector without further interactions. Two imaging setups are applied to a tissue model including blood vessels, where the FD algorithms produce results identical to those of traditional brute force simulations while being accelerated by two orders of magnitude. Extending the methods to include refraction mismatches is discussed.
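
    A toy sketch of the forced-detection principle, under strong simplifying assumptions (isotropic scattering, matched refractive index, a small detector of solid angle d_omega), might look as follows; the coefficients and geometry are illustrative only and do not come from the paper.

```python
# Toy forced-detection contribution scored at each scattering event.
import numpy as np

MU_T = 10.0          # total interaction coefficient [1/mm] (illustrative)
ALBEDO = 0.9         # scattering albedo (illustrative)

def forced_detection_score(event_pos, weight, detector_pos, d_omega):
    """Fraction of the photon weight forced to the detector at this event."""
    d = np.linalg.norm(detector_pos - event_pos)
    p_direction = d_omega / (4.0 * np.pi)      # isotropic phase function assumed
    p_escape = np.exp(-MU_T * d)               # probability of no interaction on the way out
    return weight * p_direction * p_escape

# Inside the photon loop, this contribution would be accumulated at every
# scattering event, after which the photon continues propagating with its
# weight reduced by the albedo, exactly as in a brute-force simulation.
```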

  10. A High-Order Statistical Tensor Based Algorithm for Anomaly Detection in Hyperspectral Imagery

    PubMed Central

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-01-01

    Recently, high-order statistics have received more and more interest in the field of hyperspectral anomaly detection. However, most of the existing high-order statistics based anomaly detection methods require stepwise iterations, since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and produces a single detection map. Experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm. PMID:25366706

  11. A review of algorithms for detecting volcanic hot spots in satellite infrared data

    NASA Astrophysics Data System (ADS)

    Steffke, Andrea M.; Harris, Andrew J. L.

    2011-11-01

    Since the 1980s, the application of thermal infrared satellite data to volcano monitoring has rapidly evolved into a proven operational tool. Due to the large quantities of data provided by sensors in polar and geostationary orbits, as well as the sheer number of active volcanoes on Earth, processing and managing such data sets manually requires an enormous amount of effort. A number of algorithms have therefore been developed to facilitate the detection, location, and tracking of hot spots at active volcanoes. A collation and review of hot spot detection algorithms developed and applied by the volcanological community reveals three main types applied to date: contextual, fixed threshold, and temporal. The founding algorithms for these three classes are VAST, MODVOLC, and RST, respectively. Through comparison with manually based detections, the performance of each algorithm was tested for sustained lava flows (Etna and Stromboli), strombolian activity (Stromboli), lava dome growth and collapse (Augustine), and fumarole fields (Vulcano). It is shown that, as the number of correctly identified anomalies increases, so too does the number of false positives. Although each of the algorithms operates well within the limits and criteria of its design requirements and application, under current data constraints no algorithm can be expected to perform perfectly.
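
    As an example of the fixed-threshold class, a MODVOLC-style test can be sketched with the normalized thermal index (NTI) computed from MODIS band 22 (~3.96 μm) and band 32 (~12 μm) radiances; the -0.8 threshold quoted below is the commonly cited night-time value and is given purely for illustration.

```python
# Minimal fixed-threshold hot-spot test in the spirit of MODVOLC's NTI.
import numpy as np

def nti_hotspots(radiance_b22, radiance_b32, threshold=-0.8):
    """Return a boolean mask of candidate hot-spot pixels."""
    nti = (radiance_b22 - radiance_b32) / (radiance_b22 + radiance_b32)
    return nti > threshold
```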

  12. Time quantified detection of fetal movements using a new fetal movement algorithm.

    PubMed

    Lowery, C L; Russell, W A; Baggot, P J; Wilson, J D; Walls, R C; Bentz, L S; Murphy, P

    1997-01-01

    The primary objective was to develop an automated ultrasound fetal movement detection system that better characterizes fetal movements; the secondary objective was to develop an improved method of quantifying the performance of fetal movement detectors. We recorded 20-minute segments of fetal movement in 101 patients using a UAMS-developed fetal movement detection algorithm (Russell algorithm) and compared it to a Hewlett-Packard (HP) M-1350-A. Movements were recorded on a second-by-second basis by an expert examiner reviewing videotaped real-time ultrasound images. Videotape (86,592 seconds) was scored and compared with the electronic movement-detection systems. The Russell algorithm detected 95.53% of the discrete movements greater than 5 seconds, while the HP system (M-1350-A) detected only 86.08% of the discrete movements (p = 0.012). Both devices were less efficient at detecting short discrete movements, with sensitivities of 57.39 and 35.22, respectively. Neither system fully identifies fetal movement on a second-by-second basis. Improved methods of quantifying performance indicated that the Russell algorithm performed better than the HP system on these patients.

  13. A PD control-based QRS detection algorithm for wearable ECG applications.

    PubMed

    Choi, Changmok; Kim, Younho; Shin, Kunsoo

    2012-01-01

    We present a QRS detection algorithm for wearable ECG applications using proportional-derivative (PD) control. ECG recordings of arrhythmia have irregular intervals and magnitudes of QRS waves that impede correct QRS detection. To resolve this problem, PD control is applied to avoid missing a small QRS wave that follows a large QRS wave and to avoid falsely detecting noise as QRS waves when the interval between two adjacent QRS waves is large (e.g. bradycardia, pause, and atrioventricular block). ECG data were obtained from 78 patients with various cardiovascular diseases and used to evaluate the performance of the proposed algorithm. The overall sensitivity and positive predictive value were 99.28% and 99.26%, respectively. The proposed algorithm has low computational complexity, so it is suitable for real-time mobile ECG monitoring systems.
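
    The abstract does not give the control law, so the following is only one plausible reading of how a PD controller could adapt a detection threshold: the threshold is driven toward a set-point defined as a fraction of the most recent QRS amplitude, falling quickly after large beats and relaxing slowly during long RR intervals. All gains and constants are hypothetical, not the authors' values.

```python
# Hypothetical PD-controlled adaptive threshold for QRS detection.
import numpy as np

def pd_adaptive_qrs(ecg, fs, kp=0.05, kd=0.01, frac=0.5,
                    decay=0.999, refractory=0.25):
    x = np.abs(ecg)
    threshold = frac * x[:int(2 * fs)].max()      # initial guess from the first 2 s
    prev_error, last_qrs, peaks = 0.0, -np.inf, []
    for n, sample in enumerate(x):
        if sample > threshold and (n - last_qrs) / fs > refractory:
            peaks.append(n)
            last_qrs = n
            setpoint = frac * sample              # set-point jumps to follow this beat
        else:
            setpoint = decay * threshold          # slowly relax while waiting for the next beat
        error = setpoint - threshold
        threshold += kp * error + kd * (error - prev_error)   # PD update
        prev_error = error
    return peaks                                  # sample indices of detected QRS complexes
```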

  14. Comparison of point target detection algorithms for space-based scanning infrared sensors

    NASA Astrophysics Data System (ADS)

    Namoos, Omar M.; Schulenburg, Nielson W.

    1995-09-01

    The tracking of resident space objects (RSO) by space-based sensors can lead to engagements that result in stressing backgrounds. These backgrounds, including hard earth, earth limb, and zodiacal, pose various difficulties for signal processing algorithms designed to detect and track the target with a minimum of false alarms. Simulated RSO engagements were generated using the Strategic Scene Generator Model and a sensor model to create focal plane scenes. Using these data, the performance of several detection algorithms has been quantified for space, earth limb, and cluttered hard earth backgrounds. The algorithms consist of an adaptive spatial filter, a transversal (matched) filter, and a median variance (nonlinear) filter. Signal-to-clutter statistics of the filtered scenes are compared to those of the unfiltered scene. False alarm and detection results are included. Based on these findings, a processing software architecture design is suggested.

  15. A novel hybrid motion detection algorithm based on 2D histogram

    NASA Astrophysics Data System (ADS)

    Su, Xiaomeng; Wang, Haiying

    2015-03-01

    This article proposes a novel hybrid motion detection algorithm based on a 2-D (two-dimensional) spatio-temporal state histogram. The new algorithm combines the ideas of image change detection based on a 2-D histogram and of spatio-temporal entropy image segmentation. It quantifies the continuity of pixel states in the time and space domains with filters called the TDF (Time Domain Filter) and SDF (Space Domain Filter), respectively. Both channels of output data from the TDF and SDF are then put into a 2-D histogram, in which a curve division method helps to separate the foreground state points from the background ones more accurately. Innovatively, the new algorithm converts the video sequence into its histogram sequence, transforming differences in pixel values in the video sequence into differences in pixel positions in the 2-D histogram. Experimental results on different types of scenes with added Gaussian noise show that the proposed technique has a strong ability to detect moving objects.

  16. Novel Ultrasound Sensor and Reconstruction Algorithm for Breast Cancer Detection

    SciTech Connect

    Kallman, J S; Ashby, A E; Ciarlo, D R; Thomas, G H

    2002-09-09

    Mammography is currently used for screening women over the age of 40 for breast cancer. It has not been used routinely on younger women because their breast composition is mostly glandular, or radiodense, meaning there is an increased radiation exposure risk as well as a high likelihood of poor image quality. For these younger women, it is calculated that the radiation exposure risk is higher than the potential benefit from the screening. It is anticipated that transmission ultrasound will enable screening of much younger women and complement mammographic screening in women over 40. Ultrasonic transmission tomography holds out the hope of being a discriminating tool for breast cancer screening that is safe, comfortable, and inexpensive. From its inception, however, this imaging modality has been plagued by the problem of how to quickly and inexpensively obtain the data necessary for the tomographic reconstruction. The objectives of this project were: to adapt a new kind of sensor to data acquisition for ultrasonic transmission tomography of the breast, to collect phantom data, to devise new reconstruction algorithms to use that data, and to recommend improved methods for displaying the reconstructions. The ultrasound sensor images an acoustic pressure wave over an entire surface by converting sound pressure into an optical modulation. At the beginning of this project the sensor imaged an area of approximately 7 mm by 7 mm and was very fragile. During the first year of this research we improved the production and assembly process of the sensors so they now last indefinitely. Our goal for the second year was to enlarge the sensor aperture. Due to unavailability of high quality materials, we were not able to enlarge our original design. We created a phantom of materials similar to those used in manufacturing breast phantoms. We used the sensors to collect data from this phantom. We used both established (diffraction tomography) and new (paraxial adjoint method tomography

  17. Detection of sclerotic bone metastases in the spine using watershed algorithm and graph cut

    NASA Astrophysics Data System (ADS)

    Wiese, Tatjana; Yao, Jianhua; Burns, Joseph E.; Summers, Ronald M.

    2012-03-01

    The early detection of bone metastases is important for determining the prognosis and treatment of a patient. We developed a CAD system that detects sclerotic bone metastases in the spine on CT images. After the spine is segmented from the image, a watershed algorithm detects lesion candidates. The over-segmentation problem of the watershed algorithm is addressed by the novel incorporation of a graph-cuts driven merger. Thirty quantitative features for each detection are computed to train a support vector machine (SVM) classifier. The classifier was trained on 12 clinical cases and tested on 10 independent clinical cases. Ground truth lesions were manually segmented by an expert. Prior to classification, the system detected 87% (72/83) of the manually segmented lesions with volumes greater than 300 mm³. On the independent test set, the sensitivity was 71.2% (95% confidence interval (63.1%, 77.3%)) with 8.8 false positives per case.
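
    A generic scikit-image sketch of watershed-based candidate generation on a (hypothetical) CT volume is shown below; the graph-cuts merging of over-segmented regions and the SVM classification stage described above are not reproduced, and the intensity threshold is illustrative only.

```python
# Watershed-based candidate detection on a CT array `ct` restricted to a spine mask.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def lesion_candidates(ct, spine_mask, intensity_threshold=300):
    """Return a labeled image of candidate sclerotic regions (illustrative threshold)."""
    dense = (ct > intensity_threshold) & spine_mask       # bright voxels inside the spine
    distance = ndi.distance_transform_edt(dense)
    regions, _ = ndi.label(dense)
    coords = peak_local_max(distance, labels=regions, min_distance=3)
    markers = np.zeros(ct.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=dense)      # one label per candidate
```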

  18. Integration of a Self-Coherence Algorithm into DISAT for Forced Oscillation Detection

    SciTech Connect

    Follum, James D.; Tuffner, Francis K.; Amidan, Brett G.

    2015-03-03

    With the increasing number of phasor measurement units (PMUs) on the power system, behaviors typically not observable on the power system are becoming more apparent. Oscillatory behavior, notably forced oscillations, is one such behavior. However, the large amounts of data coming from the PMUs make manually detecting and locating these oscillations difficult. To automate portions of the process, an oscillation detection routine was coded into the Data Integrity and Situational Awareness Tool (DISAT) framework. Integration into the DISAT framework allows forced oscillations to be detected and information about the event to be provided to operational engineers. The oscillation detection algorithm integrates with the data handling and atypical-data detection capabilities of DISAT, building off of a standard library of functions. This report details that integration, with information on the algorithm, some implementation issues, and some sample results from the western United States' power grid.

  19. A new pivoting and iterative text detection algorithm for biomedical images.

    PubMed

    Xu, Songhua; Krauthammer, Michael

    2010-12-01

    There is interest in expanding the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating its performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. We demonstrate that our projection histogram-based text detection approach is well suited for text detection in biomedical images, and that the iterative application of the algorithm boosts performance to an F score of 0.60. We provide a C++ implementation of our algorithm freely available for academic use.
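
    The core projection-histogram step can be illustrated as follows (a simplified sketch, not the authors' C++ implementation): after binarizing a figure, row and column sums locate bands that are dense in "ink" pixels, and applying the procedure recursively on each band yields candidate text boxes.

```python
# Projection-histogram band detection on a binarized image (values 0/1).
import numpy as np

def text_bands(binary_img, axis=1, min_fill=0.05):
    """Return (start, end) index pairs of dense bands along the other axis."""
    profile = binary_img.sum(axis=axis) / binary_img.shape[axis]   # fraction of ink per row/column
    dense = profile > min_fill
    bands, start = [], None
    for i, d in enumerate(dense):
        if d and start is None:
            start = i
        elif not d and start is not None:
            bands.append((start, i))
            start = None
    if start is not None:
        bands.append((start, len(dense)))
    return bands

# Applying text_bands twice (rows first, then columns inside each row band)
# yields candidate text boxes that can then be refined iteratively.
```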

  1. Peak detection in fiber Bragg grating using a fast phase correlation algorithm

    NASA Astrophysics Data System (ADS)

    Lamberti, A.; Vanlanduit, S.; De Pauw, B.; Berghmans, F.

    2014-05-01

    The fiber Bragg grating sensing principle is based on exact tracking of the peak wavelength location. Several peak detection techniques have already been proposed in the literature. Among these, conventional peak detection (CPD) methods such as the maximum detection algorithm (MDA) do not achieve very high precision and accuracy, especially when the signal-to-noise ratio (SNR) and the wavelength resolution are poor. On the other hand, recently proposed algorithms, like the cross-correlation demodulation algorithm (CCA), are more precise and accurate but require higher computational effort. To overcome these limitations, we developed a novel fast phase correlation (FPC) algorithm that performs as well as the CCA while being considerably faster. This paper presents the FPC technique and analyzes its performance for different SNRs and wavelength resolutions. Using simulations and experiments, we compared the FPC with the MDA and CCA algorithms. The FPC detection capabilities were as precise and accurate as those of the CCA and considerably better than those of the CPD methods. The FPC computational time was up to 50 times lower than that of the CCA, making the FPC a valid candidate for future implementation in real-time systems.
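
    A generic one-dimensional phase-correlation sketch for estimating the shift between a reference FBG spectrum and a measured one is given below; this is the textbook technique, not necessarily the exact FPC formulation of the paper (which, for instance, would also need sub-sample interpolation to reach high wavelength resolution).

```python
# Integer-bin spectral shift estimate via phase correlation.
import numpy as np

def phase_correlation_shift(reference, measured):
    F_ref = np.fft.rfft(reference)
    F_mea = np.fft.rfft(measured)
    cross_power = F_mea * np.conj(F_ref)
    cross_power /= np.abs(cross_power) + 1e-12            # keep only the phase
    correlation = np.fft.irfft(cross_power, n=len(reference))
    shift = int(np.argmax(correlation))
    if shift > len(reference) // 2:                        # wrap negative shifts
        shift -= len(reference)
    return shift                                           # shift in spectral bins
```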

  2. Edge detection algorithms implemented on Bi-i cellular vision system

    NASA Astrophysics Data System (ADS)

    Karabiber, Fethullah; Arik, Sabri

    2009-02-01

    The Bi-i (Bio-inspired) Cellular Vision system is built mainly on a Cellular Neural/Nonlinear Network (CNN) type microprocessor (ACE16k) and a Digital Signal Processing (DSP) type microprocessor. CNN theory, proposed by Chua, has advanced properties for image processing applications. In this study, edge detection algorithms are implemented on the Bi-i Cellular Vision System. Extracting the edges of an image correctly and quickly is of crucial importance for image processing applications. A threshold gradient based edge detection algorithm is implemented using the ACE16k microprocessor. In addition, a pre-processing operation is realized using an image enhancement technique based on the Laplacian operator, and morphologic operations are performed as post-processing. The Sobel edge detection algorithm is performed in the DSP by convolving Sobel operators with the image. The performance of the edge detection algorithms is compared using visual inspection and timing analysis. Experimental results show that the ACE16k has great computational power and that the Bi-i Cellular Vision System is well suited to applying image processing algorithms in real time.
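
    For reference, the Sobel step described above amounts to the following (shown here with SciPy on a host CPU, not on the ACE16k/DSP hardware):

```python
# Sobel gradient magnitude with optional thresholding.
import numpy as np
from scipy import ndimage

def sobel_edges(image, threshold=None):
    gx = ndimage.sobel(image.astype(float), axis=1)   # horizontal gradient
    gy = ndimage.sobel(image.astype(float), axis=0)   # vertical gradient
    magnitude = np.hypot(gx, gy)
    return magnitude if threshold is None else magnitude > threshold
```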

  3. Combining contour detection algorithms for the automatic extraction of the preparation line from a dental 3D measurement

    NASA Astrophysics Data System (ADS)

    Ahlers, Volker; Weigl, Paul; Schachtzabel, Hartmut

    2005-04-01

    Due to the increasing demand for high-quality ceramic crowns and bridges, the CAD/CAM-based production of dental restorations has been a subject of intensive research during the last fifteen years. A prerequisite for the efficient processing of the 3D measurement of prepared teeth with a minimal amount of user interaction is the automatic determination of the preparation line, which defines the sealing margin between the restoration and the prepared tooth. Current dental CAD/CAM systems mostly require the interactive definition of the preparation line by the user, at least by means of giving a number of start points. Previous approaches to the automatic extraction of the preparation line rely on single contour detection algorithms. In contrast, we use a combination of different contour detection algorithms to find several independent potential preparation lines from a height profile of the measured data. The different algorithms (gradient-based, contour-based, and region-based) show their strengths and weaknesses in different clinical situations. A classifier consisting of three stages (range check, decision tree, support vector machine), which is trained by human experts with real-world data, finally decides which is the correct preparation line. In a test with 101 clinical preparations, a success rate of 92.0% has been achieved. Thus the combination of different contour detection algorithms yields a reliable method for the automatic extraction of the preparation line, which enables the setup of a turn-key dental CAD/CAM process chain with a minimal amount of interactive screen work.

  4. Raw data based image processing algorithm for fast detection of surface breaking cracks

    NASA Astrophysics Data System (ADS)

    Sruthi Krishna K., P.; Puthiyaveetil, Nithin; Kidangan, Renil; Unnikrishnakurup, Sreedhar; Zeigler, Mathias; Myrach, Philipp; Balasubramaniam, Krishnan; Biju, P.

    2017-02-01

    The aim of this work is to illustrate the contribution of signal processing techniques to the field of non-destructive evaluation. A component's life evaluation is inevitably related to the presence of flaws in it. The detection and characterization of cracks prior to damage is a technologically and economically significant task and is of great importance for safety-relevant measures. Laser thermography is an effective and advanced thermography method for non-destructive evaluation: it is non-contacting, fast, and capable of real-time detection, and its capability for detecting surface cracks and characterizing the geometry of artificial surface flaws in metallic samples is particularly encouraging. The presence of a vertical surface-breaking crack disturbs the thermal footprint, so the data processing method plays a vital role in the fast detection of surface and sub-surface cracks. Current laser thermographic inspection, however, lacks a data processing algorithm suited to fast crack detection, and the analysis of data is done as part of post-processing. In this work we introduce a raw-data-based image processing algorithm that provides precise and fast crack detection and gives good results on both experimental and modeling data. Applying this algorithm, we carried out a detailed investigation of the variation of thermal contrast with crack parameters such as depth and width. The algorithm was applied to various surface temperature data from the 2D scanning model, and its credibility was also validated with experimental data.

  5. An effective hair detection algorithm for dermoscopic melanoma images of skin lesions

    NASA Astrophysics Data System (ADS)

    Chakraborti, Damayanti; Kaur, Ravneet; Umbaugh, Scott; LeAnder, Robert

    2016-09-01

    Dermoscopic images are obtained using the method of skin surface microscopy. Pigmented skin lesions are evaluated in terms of texture features such as color and structure. Artifacts such as hairs, bubbles, black frames, and ruler marks create obstacles that prevent accurate detection of skin lesions by both clinicians and computer-aided diagnosis. In this article, we propose a new algorithm for the automated detection of hairs, using an adaptive Canny edge-detection method followed by morphological filtering and an arithmetic addition operation. The algorithm was applied to 50 dermoscopic melanoma images. In order to ascertain this method's relative detection accuracy, it was compared to the Razmjooy hair-detection method [1], using segmentation error (SE), true detection rate (TDR), and false positioning rate (FPR). The new method produced 6.57% SE, 96.28% TDR, and 3.47% FPR, compared to 15.751% SE, 86.29% TDR, and 11.74% FPR produced by the Razmjooy method [1]. Because of the 7.27-9.99% improvement in those parameters, we conclude that the new algorithm produces much better results for detecting thick, thin, dark, and light hairs. The new method proposed here also shows an appreciable difference in the rate of detecting bubbles.

  6. A Genetic Algorithm and Fuzzy Logic Approach for Video Shot Boundary Detection.

    PubMed

    Thounaojam, Dalton Meitei; Khelchandra, Thongam; Manglem Singh, Kh; Roy, Sudipta

    2016-01-01

    This paper proposes a shot boundary detection approach using a genetic algorithm (GA) and fuzzy logic. The membership functions of the fuzzy system are calculated by the genetic algorithm using pre-observed actual values for shot boundaries, and the classification of the types of shot transitions is done by the fuzzy system. Experimental results show that the accuracy of the shot boundary detection increases with the number of iterations (generations) of the GA optimization process. The proposed system is compared to recent techniques and yields better results in terms of the F1-score.

  7. Aggressively Parallel Algorithms of Collision and Nearest Neighbor Detection for GPU Planetesimal Disk Simulation

    NASA Astrophysics Data System (ADS)

    Quillen, Alice C.; Moore, A.

    2008-09-01

    Planetesimal and dust dynamical simulations require collision and nearest neighbor detection. A brute force implementation for sorting interparticle distances requires O(N²) computations for N particles, limiting the numbers of particles that have been simulated. Parallel algorithms recently developed for the GPU (graphics processing unit), such as the radix sort, can run as fast as O(N) and sort distances between a million particles in a few hundred milliseconds. We introduce improvements in collision and nearest neighbor detection algorithms and describe how we have incorporated them into our efficient parallel second-order democratic heliocentric method symplectic integrator, written in NVIDIA's CUDA for the GPU.

  8. The algorithm of crack and crack tip coordinates detection in optical images during fatigue test

    NASA Astrophysics Data System (ADS)

    Panin, S. V.; Chemezov, V. O.; Lyubutin, P. S.; Titkov, V. V.

    2017-02-01

    An algorithm for crack detection during fatigue testing of materials, designed to automate the process of cyclic loading and tracking of the crack tip, is proposed and tested. The ultimate goal of the study is to control the displacements of the optical system with respect to the specimen under fatigue loading, so as to ensure observation of the ‘area of interest’. It is shown that the image region containing the crack can be detected and positioned with an average error of 1.93%. In terms of determining the crack tip position, the algorithm localizes the tip with an average error of 56 pixels.

  9. Detection of Carious Lesions and Restorations Using Particle Swarm Optimization Algorithm

    PubMed Central

    Naebi, Mohammad; Saberi, Eshaghali; Risbaf Fakour, Sirous; Naebi, Ahmad; Hosseini Tabatabaei, Somayeh; Ansari Moghadam, Somayeh; Bozorgmehr, Elham; Davtalab Behnam, Nasim; Azimi, Hamidreza

    2016-01-01

    Background/Purpose. Until now, tooth lesions have not been detected intelligently: dentists simply look at radiographic images and locate caries or restorations based on their experience. Using new technologies, the detection and characterization of caries and restorations can be implemented intelligently. In this paper, we introduce an intelligent detection method using particle swarm optimization (PSO) and our mathematical formulation. The method was applied to 2D images; with further development it could be extended to both 2D and 3D images. Materials and Methods. High-efficiency optimization algorithms now make it possible to process images intelligently in many applications, in particular for the detection of dental caries and restorations without human intervention. In the present work, we describe the PSO algorithm together with our detection formula for detecting dental caries and restorations, supported by image processing of pictures taken with digital radiography systems. Results and Conclusion. We implemented a mathematical formula as the fitness function of the PSO. Our results show that this method can detect dental caries and restorations in digital radiography pictures with good convergence. The error rate of the method was 8%, so it can be used for the detection of dental caries and restorations, and with suitable parameter choices the error rate could potentially be reduced below 0.5%. PMID:27212947
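
    A generic PSO sketch that searches a 2-D radiograph for the position maximizing a placeholder fitness function (here simply the local mean intensity) is shown below; the authors' actual fitness formulation for caries and restoration detection is not reproduced, and all parameters are illustrative.

```python
# Generic particle swarm search over image coordinates.
import numpy as np

def pso_search(image, fitness, n_particles=30, n_iter=100,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    bounds = np.array(image.shape, dtype=float) - 1
    pos = rng.uniform(0, bounds, size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(image, p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, bounds)
        vals = np.array([fitness(image, p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest                       # estimated (row, col) of the targeted region

def mean_intensity(image, p, win=5):
    """Placeholder fitness: mean intensity in a small window around position p."""
    r, c = int(p[0]), int(p[1])
    return image[max(r - win, 0):r + win, max(c - win, 0):c + win].mean()

# Usage sketch:  best_position = pso_search(radiograph, mean_intensity)
```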

  10. An Algorithm for 353 Odor Detection Thresholds in Humans

    PubMed Central

    Sánchez-Moreno, Ricardo; Cometto-Muñiz, J. Enrique; Cain, William S.

    2012-01-01

    One hundred and ninety three odor detection thresholds, ODTs, obtained by Nagata using the Japanese triangular bag method can be correlated as log (1/ODT) by a linear equation with R2 = 0.748 and a standard deviation, SD, of 0.830 log units; the latter may be compared with our estimate of 0.66 log units for the self-consistency of Nagata's data. Aldehydes, acids, unsaturated esters, and mercaptans were included in the equation through indicator variables that took into account the higher potency of these compounds. The ODTs obtained by Cometto-Muñiz and Cain, by Cometto-Muñiz and Abraham, and by Hellman and Small could be put on the same scale as those of Nagata to yield a linear equation for 353 ODTs with R2 = 0.759 and SD = 0.819 log units. The compound descriptors are available for several thousand compounds, and can be calculated from structure, so that further ODT values on the Nagata scale can be predicted for a host of volatile or semivolatile compounds. PMID:21976369

  11. Detection algorithm for glass bottle mouth defect by continuous wavelet transform based on machine vision

    NASA Astrophysics Data System (ADS)

    Qian, Jinfang; Zhang, Changjiang

    2014-11-01

    An efficient algorithm based on the continuous wavelet transform combined with prior knowledge, which can be used to detect defects of the glass bottle mouth, is proposed. First, under the illumination of a ball integral light source, an image of a perfect glass bottle mouth is obtained by a Japanese Computar camera through an IEEE-1394b interface. A single-threshold method based on the gray level histogram is used to obtain the binary image of the glass bottle mouth. In order to efficiently suppress noise, a moving average filter is employed to smooth the histogram of the original glass bottle mouth image, and the continuous wavelet transform is then applied to accurately determine the segmentation threshold. Mathematical morphology operations are used to obtain a normal binary bottle mouth mask. A glass bottle to be inspected is moved to the detection zone by a conveyor belt, and both its bottle mouth image and binary image are obtained by the above method. The binary image is multiplied with the normal bottle mask to obtain a region of interest, from which four parameters (number of connected regions, coordinates of the centroid position, diameter of the inner circle, and area of the annular region) are computed. Glass bottle mouth detection rules are designed from these four parameters so as to accurately detect and identify defect conditions of the glass bottle. Finally, glass bottles from the Coca-Cola Company are used to verify the proposed algorithm. The experimental results show that the proposed algorithm can accurately detect the defect conditions of the glass bottles, with a detection accuracy of 98%.

  12. An automatic multi-lead electrocardiogram segmentation algorithm based on abrupt change detection.

    PubMed

    Illanes-Manriquez, Alfredo

    2010-01-01

    Automatic detection of electrocardiogram (ECG) waves provides important information for cardiac disease diagnosis. In this paper a new algorithm is proposed for automatic ECG segmentation based on multi-lead ECG processing. Two auxiliary signals are computed from the first and second derivatives of several ECG lead signals. One auxiliary signal is used for R peak detection and the other for ECG wave delimitation. Statistical hypothesis testing is finally applied to one of the auxiliary signals in order to detect abrupt changes of the mean. Preliminary experimental results show that the detected mean change instants coincide with the boundaries of the ECG waves.

  13. Parallel algorithm for linear feature detection from airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Mareboyana, Manohar; Chi, Paul

    2006-05-01

    Linear features in airport images correspond to runways, taxiways, and roads. Detecting runways helps pilots focus on runway incursions in poor visibility conditions. In this work, we attempt to detect linear features from a LiDAR swath in near real time using a parallel implementation on a G5-based Apple cluster called Xseed. Data from the LiDAR swath are converted into a uniform grid with nearest-neighbor interpolation. The edges and gradient directions are computed using standard edge detection algorithms such as Canny's detector. Edge linking and the detection of straight-line features are described. Preliminary results on Reno, Nevada airport data are included.

  14. A computerized algorithm for arousal detection in healthy adults and patients with Parkinson disease.

    PubMed

    Sorensen, Gertrud L; Jennum, Poul; Kempfner, Jacob; Zoetmulder, Marielle; Sorensen, Helge B D

    2012-02-01

    Arousals occur from all sleep stages and can be identified as abrupt electroencephalogram (EEG) and electromyogram (EMG) changes. Manual scoring of arousals is time consuming with low interscore agreement. The aim of this study was to design an arousal detection algorithm capable of detecting arousals from non-rapid eye movement (REM) and REM sleep, independent of the subject's age and disease. The proposed algorithm uses features from EEG, EMG, and the manual sleep stage scoring as input to a feed-forward artificial neural network (ANN). The performance of the algorithm has been assessed using polysomnographic (PSG) recordings from a total of 24 subjects. Eight of the subjects were diagnosed with Parkinson disease (PD) and the rest (16) were healthy adults in various ages. The performance of the algorithm was validated in 3 settings: testing on the 8 patients with PD using the leave-one-out method, testing on the 16 healthy adults using the leave-one-out method, and finally testing on all 24 subjects using a 4-fold crossvalidation. For these 3 validations, the sensitivities were 89.8%, 90.3%, and 89.4%, and the positive predictive values (PPVs) were 88.8%, 89.4%, and 86.1%. These results are high compared with those of previously presented arousal detection algorithms and especially compared with the high interscore variability of manual scorings.

  15. Community detection in complex networks using density-based clustering algorithm and manifold learning

    NASA Astrophysics Data System (ADS)

    You, Tao; Cheng, Hui-Min; Ning, Yi-Zi; Shia, Ben-Chang; Zhang, Zhong-Yuan

    2016-12-01

    Like clustering analysis, community detection aims at assigning the nodes of a network to different communities. Fdp is a recently proposed density-based clustering algorithm that does not need the number of clusters as prior input, and its result is insensitive to its parameter. However, Fdp cannot be directly applied to community detection due to its inability to recognize community centers in the network. To solve this problem, a new community detection method (named IsoFdp) is proposed in this paper. First, we use the IsoMap technique to map the network data into a low-dimensional manifold that can reveal diverse pair-wise similarities. Then Fdp is applied to detect the communities in the network. An improved partition density function is proposed to select the proper number of communities automatically. We test our method on both synthetic and real-world networks, and the results demonstrate the effectiveness of our algorithm over state-of-the-art methods.

  16. Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection.

    PubMed

    Zijlstra, Wobbe P; van der Ark, L Andries; Sijtsma, Klaas

    2011-02-07

    Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to identify outliers in MSA. This adaptation involves choices with respect to the algorithm's objective function, selection of items from samples without outliers, and scalability criteria to be used in the forward search algorithm. The application of the adapted forward search algorithm for MSA is demonstrated using real data. Recommendations are given for its use in practical scale analysis.

  17. Automatic face detection and tracking based on Adaboost with camshift algorithm

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Long, JianFeng

    2011-10-01

    With the development of information technology, video surveillance is widely used in security monitoring and identity recognition. Because most pure face tracking algorithms cannot specify the initial location and scale of a face automatically, this paper proposes a fast and robust method to detect and track faces by combining AdaBoost with the CamShift algorithm. First, the location and scale of the face are determined by the AdaBoost algorithm based on Haar-like features and conveyed to the initial search window automatically. Then, the CamShift algorithm is applied to track the face. Experimental results based on OpenCV software show good performance, even in special circumstances such as lighting changes and rapid face movement. In addition, by drawing the tracking trajectory of the face movement, some abnormal behavior events can be analyzed.
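
    A hedged OpenCV sketch of the described pipeline is shown below: a Haar/AdaBoost cascade provides the initial face window, which then seeds CamShift tracking on the hue back-projection of each frame. The cascade file and parameters are the stock OpenCV ones, not the authors' configuration, and the sketch assumes a face is visible in the first frame.

```python
# Haar-cascade face detection seeding CamShift tracking on webcam frames.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

ok, frame = cap.read()
faces = cascade.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 1.1, 5)
x, y, w, h = faces[0]                              # first detected face seeds the tracker

roi_hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([roi_hsv], [0], None, [180], [0, 180])   # hue histogram of the face
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
window = (x, y, w, h)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_box, window = cv2.CamShift(backproj, window, criteria)
    cv2.polylines(frame, [np.intp(cv2.boxPoints(rot_box))], True, (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:               # Esc to quit
        break
```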

  18. CenLP: A centrality-based label propagation algorithm for community detection in networks

    NASA Astrophysics Data System (ADS)

    Sun, Heli; Liu, Jiao; Huang, Jianbin; Wang, Guangtao; Yang, Zhou; Song, Qinbao; Jia, Xiaolin

    2015-10-01

    Community detection is important for discovering the structure and features of complex networks. Many existing methods are sensitive to critical user-dependent parameters or are time-consuming in practice. In this paper, we propose a novel label propagation algorithm called CenLP (Centrality-based Label Propagation). The algorithm introduces a new function that measures the centrality of nodes quantitatively, without any user interaction, by calculating the local density and the similarity with higher-density neighbors for each node. Based on the centrality of nodes, we present a new label propagation algorithm with a specific update order and node preference to uncover communities in large-scale networks automatically, without imposing any prior restriction. Experiments on both real-world and synthetic networks show that our algorithm retains the simplicity, effectiveness, and scalability of the original label propagation algorithm while being more robust and accurate. Extensive experiments demonstrate the superior performance of our algorithm over the baseline methods. Moreover, our detailed experimental evaluation on real-world networks indicates that our algorithm can effectively measure the centrality of nodes in social networks.

  19. Screening analysis of biodiesel feedstock using UV-vis, NIR and synchronous fluorescence spectrometries and the successive projections algorithm.

    PubMed

    Insausti, Matías; Gomes, Adriano A; Cruz, Fernanda V; Pistonesi, Marcelo F; Araujo, Mario C U; Galvão, Roberto K H; Pereira, Claudete F; Band, Beatriz S F

    2012-08-15

    This paper investigates the use of UV-vis, near infrared (NIR) and synchronous fluorescence (SF) spectrometries coupled with multivariate classification methods to discriminate biodiesel samples with respect to the base oil employed in their production. More specifically, the present work extends previous studies by investigating the discrimination of corn-based biodiesel from two other biodiesel types (sunflower and soybean). Two classification methods are compared, namely full-spectrum SIMCA (soft independent modelling of class analogies) and SPA-LDA (linear discriminant analysis with variables selected by the successive projections algorithm). Regardless of the spectrometric technique employed, full-spectrum SIMCA did not provide an appropriate discrimination of the three biodiesel types. In contrast, all samples were correctly classified on the basis of a reduced number of wavelengths selected by SPA-LDA. It can be concluded that UV-vis, NIR and SF spectrometries can be successfully employed to discriminate corn-based biodiesel from the two other biodiesel types, but wavelength selection by SPA-LDA is key to the proper separation of the classes.

  20. Cloud detection algorithm comparison and validation for operational Landsat data products

    USGS Publications Warehouse

    Foga, Steven Curtis; Scaramuzza, Pat; Guo, Song; Zhu, Zhe; Dilley, Ronald; Beckmann, Tim; Schmidt, Gail L.; Dwyer, John L.; Hughes, MJ; Laue, Brady

    2017-01-01

    Clouds are a pervasive and unavoidable issue in satellite-borne optical imagery. Accurate, well-documented, and automated cloud detection algorithms are necessary to effectively leverage large collections of remotely sensed data. The Landsat project is uniquely suited for comparative validation of cloud assessment algorithms because the modular architecture of the Landsat ground system allows for quick evaluation of new code, and because Landsat has the most comprehensive manual truth masks of any current satellite data archive. Currently, the Landsat Level-1 Product Generation System (LPGS) uses separate algorithms for determining clouds, cirrus clouds, and snow and/or ice probability on a per-pixel basis. With more bands onboard the Landsat 8 Operational Land Imager (OLI)/Thermal Infrared Sensor (TIRS) satellite, and a greater number of cloud masking algorithms, the U.S. Geological Survey (USGS) is replacing the current cloud masking workflow with a more robust algorithm that is capable of working across multiple Landsat sensors with minimal modification. Because of the inherent error from stray light and intermittent data availability of TIRS, these algorithms need to operate both with and without thermal data. In this study, we created a workflow to evaluate cloud and cloud shadow masking algorithms using cloud validation masks manually derived from both Landsat 7 Enhanced Thematic Mapper Plus (ETM +) and Landsat 8 OLI/TIRS data. We created a new validation dataset consisting of 96 Landsat 8 scenes, representing different biomes and proportions of cloud cover. We evaluated algorithm performance by overall accuracy, omission error, and commission error for both cloud and cloud shadow. We found that CFMask, C code based on the Function of Mask (Fmask) algorithm, and its confidence bands have the best overall accuracy among the many algorithms tested using our validation data. The Artificial Thermal-Automated Cloud Cover Algorithm (AT-ACCA) is the most accurate

  1. Successful Detection of Floods in Real Time Onboard EO1 Through NASA's ST6 Autonomous Sciencecraft Experiment (ASE)

    NASA Astrophysics Data System (ADS)

    Ip, F.; Dohm, J. M.; Baker, V. R.; Castano, R.; Cichy, B.; Chien, S.; Davies, A.; Doggett, T.; Greeley, R.

    2004-12-01

    For the first time, a spacecraft has the ability to autonomously detect and react to flood events. Flood detection and the investigation of flooding dynamics in real time from space had never been done before. Part of the challenge for the hydrological community has been the difficulty of obtaining cloud-free scenes from orbit at sufficient temporal and spatial resolutions to accurately assess flooding. In addition, the large spatial extent of drainage networks, coupled with the size of the data sets that must be downlinked from satellites, adds to the difficulty of monitoring flooding from space. Technology developed as part of the Autonomous Sciencecraft Experiment (ASE) creates the new capability to autonomously detect, assess, and react to dynamic events, thereby enabling the monitoring of transient processes such as flooding in real time. In addition to being able to autonomously process the imaged data onboard the spacecraft for the first time and search the data for specific spectral features, the ASE Science Team has developed and tested change detection algorithms for the Hyperion spectrometer on EO-1. For flood events, if a change is detected in the onboard processed image (i.e. an increase in the number of "wet" pixels relative to a baseline image where the system is in normal flow condition or relatively dry), the spacecraft is autonomously retasked to obtain additional scenes. For instance, in February 2004 a rare flooding of the Australian Diamantina River was captured by EO-1. In addition, in August, during ASE onboard testing, a Zambezi River scene in central Africa successfully triggered the classifier to autonomously take another observation. Yet another successful trigger-response flooding test scenario, of the Yellow River in China, was captured by ASE on 8/18/04. These exciting results pave the way for future smart reconnaissance missions of transient processes on Earth and beyond. Acknowledgments: We are grateful

  2. Estimation of Distribution Algorithm with Local Sampling Strategy for Community Detection in Complex Networks

    NASA Astrophysics Data System (ADS)

    Yu, Fahong; Li, Wenping; He, Feng; Yu, Bolin; Xia, Xiaoyun; Ma, Longhua

    2016-12-01

    It is important to discover the potential community structure when analyzing complex networks. In this paper, an estimation of distribution algorithm with a local sampling strategy for community detection in complex networks is presented to optimize the modularity density function. In the proposed algorithm, the evolution probability model is built according to eminent individuals selected by a simulated annealing mechanism, and a local sampling strategy based on a local similarity model is adopted to improve both the speed and the accuracy of detecting community structure in complex networks. At the same time, a more general version of the criterion function with a tunable parameter λ is used to avoid the resolution limit. Experiments on synthetic and real-life networks, and comparison of the experimental results with those of several state-of-the-art methods, demonstrate that the proposed algorithm is considerably efficient and competitive.

  3. Two-stage neural algorithm for defect detection and characterization uses an active thermography

    NASA Astrophysics Data System (ADS)

    Dudzik, Sebastian

    2015-07-01

    In this paper a two-stage neural algorithm for defect detection and characterization is presented. To estimate the defect depth, two neural networks trained on data obtained using active thermography were employed. In the first stage of the algorithm, the defect is detected by a classification neural network; the defect depth is then estimated using a regression neural network. The results of experimental investigations and simulations are shown. Furthermore, a sensitivity analysis of the presented algorithm was conducted, and the impacts of emissivity error and ambient temperature error on the depth estimation errors were studied. The results were obtained using a test sample made of a material with low thermal diffusivity.

  4. A simple multispectral imaging algorithm for detection of defects on red delicious apples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Purpose: A multispectral algorithm for detection and differentiation of defective and normal Red Delicious apples was developed from analysis of a series of hyperspectral line-scan images. Methods: A fast line-scan hyperspectral imaging system mounted on a conventional apple sorting machine was used t...

  5. Credit card fraud detection: An application of the gene expression messy genetic algorithm

    SciTech Connect

    Kargupta, H.; Gattiker, J.R.; Buescher, K.

    1996-05-01

    This paper describes an application of the recently introduced gene expression messy genetic algorithm (GEMGA) (Kargupta, 1996) for detecting fraudulent transactions of credit cards. It also explains the fundamental concepts underlying the GEMGA in the light of the SEARCH (Search Envisioned As Relation and Class Hierarchizing) (Kargupta, 1995) framework.

  6. New Graph Models and Algorithms for Detecting Salient Structures from Cluttered Images

    DTIC Science & Technology

    2010-02-24

    Development of graph models and algorithms to detect boundaries that show certain levels of symmetry, an important geometric property of many...

  7. Sideband Algorithm for Automatic Wind Turbine Gearbox Fault Detection and Diagnosis: Preprint

    SciTech Connect

    Zappala, D.; Tavner, P.; Crabtree, C.; Sheng, S.

    2013-01-01

    Improving the availability of wind turbines (WT) is critical to minimize the cost of wind energy, especially for offshore installations. As gearbox downtime has a significant impact on WT availabilities, the development of reliable and cost-effective gearbox condition monitoring systems (CMS) is of great concern to the wind industry. Timely detection and diagnosis of developing gear defects within a gearbox is an essential part of minimizing unplanned downtime of wind turbines. Monitoring signals from WT gearboxes are highly non-stationary as turbine load and speed vary continuously with time. Time-consuming and costly manual handling of large amounts of monitoring data represent one of the main limitations of most current CMSs, so automated algorithms are required. This paper presents a fault detection algorithm for incorporation into a commercial CMS for automatic gear fault detection and diagnosis. The algorithm allowed the assessment of gear fault severity by tracking progressive tooth gear damage during variable speed and load operating conditions of the test rig. Results show that the proposed technique proves efficient and reliable for detecting gear damage. Once implemented into WT CMSs, this algorithm can automate data interpretation reducing the quantity of information that WT operators must handle.

  8. MEMS-based sensing and algorithm development for fall detection and gait analysis

    NASA Astrophysics Data System (ADS)

    Gupta, Piyush; Ramirez, Gabriel; Lie, Donald Y. C.; Dallas, Tim; Banister, Ron E.; Dentino, Andrew

    2010-02-01

    Falls by the elderly are highly detrimental to health, frequently resulting in injury, high medical costs, and even death. Using a MEMS-based sensing system, algorithms are being developed for detecting falls and monitoring the gait of elderly and disabled persons. In this study, wireless sensors utilizing Zigbee protocols were incorporated into planar shoe insoles and a waist-mounted device. The insole contains four sensors to measure the pressure applied by the foot. A MEMS-based tri-axial accelerometer is embedded in the insert and a second one is utilized by the waist-mounted device. The primary fall detection algorithm is derived from the waist accelerometer. The differential acceleration is calculated from samples received in 1.5 s time intervals. This differential acceleration provides the quantification via an energy index. From this index one may characterize different gaits and identify fall events. Once a pre-determined index threshold is exceeded, the algorithm classifies an event as a fall or a stumble. The secondary algorithm is derived from frequency analysis techniques. The analysis consists of wavelet transforms applied to the waist accelerometer data. The insole pressure data are then used to highlight discrepancies in the transforms, providing more accurate data for classifying gait and/or detecting falls. The range of the transform amplitude in the fourth iteration of a Daubechies-6 transform was found sufficient to detect and classify fall events.
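
    The primary algorithm lends itself to a compact sketch: compute a differential-acceleration energy index over 1.5 s windows and apply a threshold. The threshold value and window handling below are illustrative assumptions, not the authors' calibrated settings.

      import numpy as np

      def energy_index(accel, fs, window_s=1.5):
          """Differential-acceleration energy index per non-overlapping window.

          `accel` is an (N, 3) array from the waist tri-axial accelerometer sampled
          at `fs` Hz; the 1.5 s window follows the abstract, but the exact
          quantification used by the authors is not specified here.
          """
          mag = np.linalg.norm(accel, axis=1)
          diff = np.diff(mag)
          win = int(window_s * fs)
          n_win = len(diff) // win
          return np.array([np.sum(diff[i * win:(i + 1) * win] ** 2) for i in range(n_win)])

      def detect_falls(accel, fs, threshold=50.0):
          """Return indices of windows whose energy index exceeds an illustrative threshold."""
          idx = energy_index(accel, fs)
          return np.nonzero(idx > threshold)[0]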

  9. Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to…

  10. A coded aperture compressive imaging array and its visual detection and tracking algorithms for surveillance systems.

    PubMed

    Chen, Jing; Wang, Yongtian; Wu, Hanxiao

    2012-10-29

    In this paper, we propose an application of a compressive imaging system to the problem of wide-area video surveillance. A parallel coded aperture compressive imaging system is proposed to relax the high-resolution coded mask requirements and facilitate storage of the projection matrix. Random Gaussian, Toeplitz and binary phase coded masks are utilized to obtain the compressive sensing images. Corresponding motion-target detection and tracking algorithms that operate directly on the compressive sampling images are developed. A mixture-of-Gaussians distribution is applied in the compressive image space to model the background image and perform foreground detection. Each motion target in the compressive sampling domain is sparsely represented in a compressive feature dictionary spanned by target templates and noise templates. An l1 optimization algorithm is used to solve for the sparse template coefficients. Experimental results demonstrate that a low-dimensional compressed imaging representation is sufficient to determine spatial motion targets. Compared with the random Gaussian and Toeplitz phase masks, motion detection algorithms using a random binary phase mask yield better detection results. However, the random Gaussian and Toeplitz phase masks achieve higher-resolution reconstructed images. Our tracking algorithm can achieve a real-time speed that is up to 10 times faster than that of the l1 tracker without any optimization.

  11. An Algorithm to Improve Test Answer Copying Detection Using the Omega Statistic

    ERIC Educational Resources Information Center

    Maeda, Hotaka; Zhang, Bo

    2017-01-01

    The omega (ω) statistic is reputed to be one of the best indices for detecting answer copying on multiple choice tests, but its performance relies on the accurate estimation of copier ability, which is challenging because responses from the copiers may have been contaminated. We propose an algorithm that aims to identify and delete the suspected…

  12. Development of Outlier detection Algorithm Applicable to a Korean Surge-Gauge

    NASA Astrophysics Data System (ADS)

    Lee, Jun-Whan; Park, Sun-Cheon; Lee, Won-Jin; Lee, Duk Kee

    2016-04-01

    The Korea Meteorological Administration (KMA) operates a surge gauge (aerial ultrasonic type) at Ulleung-do to monitor tsunamis, and the National Institute of Meteorological Sciences (NIMS) of the KMA is developing a tsunami detection and observation system using this surge gauge. Outliers resulting from transmission problems and from extreme events that change the water level temporarily are among the most common problems in tsunami detection. Unlike a spike, multipoint outliers are difficult to detect clearly. Most previous studies used statistical values or signal processing methods such as wavelet transforms and filters to detect multipoint outliers, and used a continuous dataset. However, as the focus moved to near-real-time operation with a dataset that contains gaps, these methods are no longer tenable. In this study, we developed an outlier detection algorithm applicable to the Ulleung-do surge gauge, where both multipoint outliers and missing data exist. Although only 9-point data and two arithmetic operations (plus and minus) are used, because of the newly developed keeping method the algorithm is not only simple and fast but also effective on a non-continuous dataset. We calibrated 17 thresholds and conducted performance tests using three months of data from the Ulleung-do surge gauge. The results show that the newly developed despiking algorithm performs reliably on the outlier detection problem.
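
    The abstract gives only the outline of the method (9-point windows, additions and subtractions, a set of calibrated thresholds), so the sketch below is a loose illustration of difference-based flagging on gappy data rather than the KMA algorithm; the threshold and the handling of missing samples are assumptions.

      def flag_outlier(window, threshold):
          """Flag the centre sample of a 9-point window as a candidate outlier.

          A minimal difference-based check using only additions and subtractions,
          in the spirit of the abstract; the 'keeping method' and the 17 calibrated
          thresholds are not reproduced. `window` may contain None for missing
          samples, which are simply skipped.
          """
          centre = window[4]
          if centre is None:
              return False
          neighbours = [w for i, w in enumerate(window) if i != 4 and w is not None]
          if not neighbours:
              return False
          # The centre is suspicious if it departs from every available neighbour
          # by more than the threshold.
          return all(abs(centre - n) > threshold for n in neighbours)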

  13. Detectability Thresholds and Optimal Algorithms for Community Structure in Dynamic Networks

    NASA Astrophysics Data System (ADS)

    Ghasemian, Amir; Zhang, Pan; Clauset, Aaron; Moore, Cristopher; Peel, Leto

    2016-07-01

    The detection of communities within a dynamic network is a common means for obtaining a coarse-grained view of a complex system and for investigating its underlying processes. While a number of methods have been proposed in the machine learning and physics literature, we lack a theoretical analysis of their strengths and weaknesses, or of the ultimate limits on when communities can be detected. Here, we study the fundamental limits of detecting community structure in dynamic networks. Specifically, we analyze the limits of detectability for a dynamic stochastic block model where nodes change their community memberships over time, but where edges are generated independently at each time step. Using the cavity method, we derive a precise detectability threshold as a function of the rate of change and the strength of the communities. Below this sharp threshold, we claim that no efficient algorithm can identify the communities better than chance. We then give two algorithms that are optimal in the sense that they succeed all the way down to this threshold. The first uses belief propagation, which gives asymptotically optimal accuracy, and the second is a fast spectral clustering algorithm, based on linearizing the belief propagation equations. These results extend our understanding of the limits of community detection in an important direction, and introduce new mathematical tools for similar extensions to networks with other types of auxiliary information.

  14. A cloud detection algorithm using the downwelling infrared radiance measured by an infrared pyrometer of the ground-based microwave radiometer

    SciTech Connect

    Ahn, M. H.; Han, D.; Won, H. Y.; Morris, Victor R.

    2015-02-03

    For better utilization of the ground-based microwave radiometer, it is important to detect the presence of clouds in the measured data. Here, we introduce a simple and fast cloud detection algorithm that uses the optical characteristics of clouds in the infrared atmospheric window region. The new algorithm utilizes the brightness temperature (Tb) measured by an infrared radiometer installed on top of a microwave radiometer. The two-step algorithm consists of a spectral test followed by a temporal test. The measured Tb is first compared with a predicted clear-sky Tb obtained from an empirical formula as a function of surface air temperature and water vapor pressure. For the temporal test, the temporal variability of the measured Tb during one minute is compared with a dynamic threshold value representing the variability of clear-sky conditions. Data are designated as cloud-free only when both the spectral and temporal tests confirm cloud-free conditions. Overall, most of the thick and uniform clouds are successfully detected by the spectral test, while the broken and fast-varying clouds are detected by the temporal test. The algorithm is validated by comparison with collocated ceilometer data for six months, from January to June 2013. The overall proportion of correct classifications is about 88.3% and the probability of detection is 90.8%, which are comparable with or better than those of previous similar approaches. Two thirds of the discrepancies occur when the new algorithm detects clouds while the ceilometer does not, resulting in different values of the probability of detection with different cloud-base altitudes: 93.8, 90.3, and 82.8% for low, mid, and high clouds, respectively. Finally, due to the characteristics of the spectral range, the new algorithm is found to be insensitive to the presence of inversion layers.
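
    A minimal sketch of the two-step decision described above, assuming placeholder coefficients for the empirical clear-sky formula and illustrative margins; only data passing both the spectral and the temporal test are labelled cloud-free.

      import numpy as np

      def clear_sky_tb(t_surface_k, vapor_pressure_hpa, a=0.6, b=0.03):
          """Predicted clear-sky brightness temperature (K).

          The paper uses an empirical formula in surface air temperature and water
          vapour pressure; coefficients a and b here are placeholders, not the
          published regression.
          """
          return a * t_surface_k + b * vapor_pressure_hpa * t_surface_k / 300.0

      def is_cloud_free(tb_minute, t_surface_k, vapor_pressure_hpa,
                        spectral_margin_k=10.0, temporal_threshold_k=0.5):
          """Two-step test: spectral (measured Tb vs. predicted clear sky) and
          temporal (variability of Tb within one minute vs. a threshold)."""
          tb_minute = np.asarray(tb_minute, dtype=float)
          spectral_ok = tb_minute.mean() < clear_sky_tb(t_surface_k, vapor_pressure_hpa) + spectral_margin_k
          temporal_ok = tb_minute.std() < temporal_threshold_k
          return spectral_ok and temporal_ok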

  15. A cloud detection algorithm using the downwelling infrared radiance measured by an infrared pyrometer of the ground-based microwave radiometer

    DOE PAGES

    Ahn, M. H.; Han, D.; Won, H. Y.; ...

    2015-02-03

    For better utilization of the ground-based microwave radiometer, it is important to detect the presence of clouds in the measured data. Here, we introduce a simple and fast cloud detection algorithm that uses the optical characteristics of clouds in the infrared atmospheric window region. The new algorithm utilizes the brightness temperature (Tb) measured by an infrared radiometer installed on top of a microwave radiometer. The two-step algorithm consists of a spectral test followed by a temporal test. The measured Tb is first compared with a predicted clear-sky Tb obtained from an empirical formula as a function of surface air temperature and water vapor pressure. For the temporal test, the temporal variability of the measured Tb during one minute is compared with a dynamic threshold value representing the variability of clear-sky conditions. Data are designated as cloud-free only when both the spectral and temporal tests confirm cloud-free conditions. Overall, most of the thick and uniform clouds are successfully detected by the spectral test, while the broken and fast-varying clouds are detected by the temporal test. The algorithm is validated by comparison with collocated ceilometer data for six months, from January to June 2013. The overall proportion of correct classifications is about 88.3% and the probability of detection is 90.8%, which are comparable with or better than those of previous similar approaches. Two thirds of the discrepancies occur when the new algorithm detects clouds while the ceilometer does not, resulting in different values of the probability of detection with different cloud-base altitudes: 93.8, 90.3, and 82.8% for low, mid, and high clouds, respectively. Finally, due to the characteristics of the spectral range, the new algorithm is found to be insensitive to the presence of inversion layers.

  16. A cloud detection algorithm using the downwelling infrared radiance measured by an infrared pyrometer of the ground-based microwave radiometer

    NASA Astrophysics Data System (ADS)

    Ahn, M.-H.; Han, D.; Won, H. Y.; Morris, V.

    2015-02-01

    For better utilization of the ground-based microwave radiometer, it is important to detect the presence of clouds in the measured data. Here, we introduce a simple and fast cloud detection algorithm that uses the optical characteristics of clouds in the infrared atmospheric window region. The new algorithm utilizes the brightness temperature (Tb) measured by an infrared radiometer installed on top of a microwave radiometer. The two-step algorithm consists of a spectral test followed by a temporal test. The measured Tb is first compared with a predicted clear-sky Tb obtained from an empirical formula as a function of surface air temperature and water vapor pressure. For the temporal test, the temporal variability of the measured Tb during one minute is compared with a dynamic threshold value representing the variability of clear-sky conditions. Data are designated as cloud-free only when both the spectral and temporal tests confirm cloud-free conditions. Overall, most of the thick and uniform clouds are successfully detected by the spectral test, while the broken and fast-varying clouds are detected by the temporal test. The algorithm is validated by comparison with collocated ceilometer data for six months, from January to June 2013. The overall proportion of correct classifications is about 88.3% and the probability of detection is 90.8%, which are comparable with or better than those of previous similar approaches. Two thirds of the discrepancies occur when the new algorithm detects clouds while the ceilometer does not, resulting in different values of the probability of detection with different cloud-base altitudes: 93.8, 90.3, and 82.8% for low, mid, and high clouds, respectively. Finally, due to the characteristics of the spectral range, the new algorithm is found to be insensitive to the presence of inversion layers.

  17. Algorithm Optimization in Methylation Detection with Multiple RT-qPCR

    PubMed Central

    Jia, Jia; Zhou, Guangpeng; Wang, Jianming; Kang, Qian; Jin, Peng; Sheng, Jianqiu; Cai, Guoxiang; Cai, Sanjun; Han, Xiaoliang

    2016-01-01

    Epigenetic markers based on differential methylation of DNA sequences are used in cancer screening and diagnostics. Detection of abnormal methylation at specific loci by real-time quantitative polymerase chain reaction (RT-qPCR) has been developed to enable high-throughput cancer screening. For tests that combine the results of multiple PCR replicates into a single reportable result, both the cutoff applied to each individual PCR and the weighting of the individual PCR results are essential to the test outcome. In this opportunistic screening study, we tested samples from 1133 patients using the triplicate Epi proColon assay with various algorithms and compared it with the newly developed single-replicate SensiColon assay, which measures the methylation status of the same SEPT9 gene sequence. The Epi proColon test approved by the US FDA (1/3 algorithm) showed the highest sensitivity (82.4%) at a lower specificity (82.0%) compared with the Epi proColon 2.0 CE version with the 2/3 algorithm (75.1% sensitivity, 97.1% specificity) or the 1/1 algorithm (71.3% sensitivity, 92.7% specificity). No significant difference in performance was found between the Epi proColon 2.0 CE and the SensiColon assays. The choice of algorithm must depend on the specific test usage, including screening and early detection. These considerations allow one to choose the optimal algorithm to maximize test performance. We hope this study can help optimize methylation detection in cancer screening and early detection. PMID:27898666
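
    The 1/3, 2/3 and 1/1 algorithms amount to k-of-n voting over replicate calls; the sketch below shows that rule in isolation, with the per-replicate cutoff treated as already applied and the function name chosen here for illustration only.

      def combine_replicates(replicate_calls, k_required):
          """Combine per-replicate methylation calls into one reportable result.

          `replicate_calls` is a list of booleans (True = methylated SEPT9 detected
          in that PCR replicate); `k_required` encodes the voting rule, e.g. 1 for
          the 1/3 algorithm, 2 for the 2/3 algorithm. The Ct cutoffs that produce
          each boolean call are outside this sketch.
          """
          return sum(bool(c) for c in replicate_calls) >= k_required

      # Example: a triplicate assay evaluated under the 1/3 and 2/3 rules.
      calls = [True, False, False]
      print(combine_replicates(calls, 1))  # True  (1/3 rule: more sensitive)
      print(combine_replicates(calls, 2))  # False (2/3 rule: more specific)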

  18. Spatiotemporal representations of rapid visual target detection: a single-trial EEG classification algorithm.

    PubMed

    Fuhrmann Alpert, Galit; Manor, Ran; Spanier, Assaf B; Deouell, Leon Y; Geva, Amir B

    2014-08-01

    Brain-computer interface applications, developed for both healthy and clinical populations, critically depend on decoding brain activity in single trials. The goal of the present study was to detect distinctive spatiotemporal brain patterns within a set of event-related responses. We introduce a novel classification algorithm, the spatially weighted FLD-PCA (SWFP), which is based on a two-step linear classification of event-related responses using a Fisher linear discriminant (FLD) classifier and principal component analysis (PCA) for dimensionality reduction. As a benchmark algorithm, we consider the hierarchical discriminant component analysis (HDCA) introduced by Parra et al. (2007). We also consider a modified version of the HDCA, namely the hierarchical discriminant principal component analysis (HDPCA) algorithm. We compare the single-trial classification accuracies of all three algorithms, each applied to detect target images within a rapid serial visual presentation (RSVP, 10 Hz) of images from five different object categories, based on single-trial brain responses. We find a systematic superiority of our classification algorithm in the tested paradigm. Additionally, HDPCA significantly increases classification accuracies compared to the HDCA. Finally, we show that presenting several repetitions of the same image exemplars improves accuracy, and thus may be important in cases where high accuracy is crucial.
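
    A minimal sketch of the FLD-plus-PCA ingredients using scikit-learn on synthetic single-trial epochs; the spatial weighting step that distinguishes SWFP, and the HDCA/HDPCA benchmarks, are not reproduced here.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline

      # Synthetic stand-in for single-trial EEG epochs: (n_trials, n_channels * n_samples).
      rng = np.random.default_rng(1)
      X = rng.normal(size=(300, 64 * 50))
      y = rng.integers(0, 2, size=300)   # 1 = target image, 0 = non-target

      # Generic dimensionality reduction followed by a Fisher-type linear classifier.
      clf = make_pipeline(PCA(n_components=30), LinearDiscriminantAnalysis())
      clf.fit(X[:200], y[:200])
      accuracy = clf.score(X[200:], y[200:])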

  19. A Novel Zero Velocity Interval Detection Algorithm for Self-Contained Pedestrian Navigation System with Inertial Sensors

    PubMed Central

    Tian, Xiaochun; Chen, Jiabin; Han, Yongqiang; Shang, Jianyu; Li, Nan

    2016-01-01

    Zero velocity update (ZUPT) plays an important role in pedestrian navigation algorithms with the premise that the zero velocity interval (ZVI) should be detected accurately and effectively. A novel adaptive ZVI detection algorithm based on a smoothed pseudo Wigner–Ville distribution to remove multiple frequencies intelligently (SPWVD-RMFI) is proposed in this paper. The novel algorithm adopts the SPWVD-RMFI method to extract the pedestrian gait frequency and to calculate the optimal ZVI detection threshold in real time by establishing the function relationships between the thresholds and the gait frequency; then, the adaptive adjustment of thresholds with gait frequency is realized and improves the ZVI detection precision. To put it into practice, a ZVI detection experiment is carried out; the result shows that compared with the traditional fixed threshold ZVI detection method, the adaptive ZVI detection algorithm can effectively reduce the false and missed detection rate of ZVI; this indicates that the novel algorithm has high detection precision and good robustness. Furthermore, pedestrian trajectory positioning experiments at different walking speeds are carried out to evaluate the influence of the novel algorithm on positioning precision. The results show that the ZVI detected by the adaptive ZVI detection algorithm for pedestrian trajectory calculation can achieve better performance. PMID:27669266

  20. Parallel computation safety analysis irradiation targets fission product molybdenum in neutronic aspect using the successive over-relaxation algorithm

    NASA Astrophysics Data System (ADS)

    Susmikanti, Mike; Dewayatna, Winter; Sulistyo, Yos

    2014-09-01

    One of the research activities in support of the commercial radioisotope production program is safety research on the irradiation of FPM (Fission Product Molybdenum) targets. FPM targets take the form of a stainless steel tube containing nuclear-grade high-enrichment uranium, and the FPM irradiation tube is intended to yield fission products. Fission products such as Mo-99 are widely used in the form of kits in the medical world. The neutronics problem is solved using first-order perturbation theory derived from the four-group diffusion equation. The Mo-99 isotope has a relatively long half-life, about 3 days (66 hours), so delivery of the radioisotope to consumer centers and storage is possible, though still limited. Production of this isotope potentially has significant economic value. The criticality and flux in the multigroup diffusion model were calculated for various irradiation positions and uranium contents. This model involves complex computation with a large, sparse matrix system. Several parallel algorithms have been developed for solving large, sparse matrix systems. In this paper, a successive over-relaxation (SOR) algorithm was implemented so that the calculation of reactivity coefficients can be done in parallel. Previous work performed the reactivity calculations serially with Gauss-Seidel iterations. The parallel method can be used to solve the multigroup diffusion equation system and to calculate the criticality and reactivity coefficients. In this research a computer code was developed to exploit parallel processing to perform the reactivity calculations used in the safety analysis. Parallel processing on a multicore computer system allows the calculation to be performed more quickly. The code was applied to the calculation of safety limits for irradiated FPM targets containing highly enriched uranium. The results of the neutron calculations show that for uranium contents of 1.7676 g and 6.1866 g (× 106 cm-1) in a tube, the delta reactivities are still
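
    The reactivity calculation itself is not reproduced here, but the core numerical kernel is the SOR sweep; a generic serial sketch for a linear system Ax = b is shown below, with the relaxation factor and tolerance as illustrative choices rather than the settings used in the paper.

      import numpy as np

      def sor_solve(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
          """Successive over-relaxation for A x = b (A with non-zero diagonal).

          A generic serial SOR sketch; the paper parallelizes sweeps of this kind
          for the multigroup diffusion system, which is not reproduced here.
          """
          A = np.asarray(A, dtype=float)
          b = np.asarray(b, dtype=float)
          n = len(b)
          x = np.zeros(n)
          for _ in range(max_iter):
              x_old = x.copy()
              for i in range(n):
                  sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                  x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
              if np.linalg.norm(x - x_old, ord=np.inf) < tol:
                  break
          return x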

  1. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is becoming increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. First, jointly using both spectral and spatial information, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is given by combining the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of the spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an expansion of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have demonstrated the stability of the detection result.
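
    The basic projection underlying LOSP and 3D-LOSP can be sketched compactly; the code below scores a single pixel against one background subspace, whereas 3D-LOSP builds such subspaces along three directions of the local cube and combines the three scores.

      import numpy as np

      def osp_anomaly_score(pixel, background_vectors):
          """Orthogonal-subspace-projection score for one pixel spectrum.

          `background_vectors` is a (bands, k) matrix whose columns span the local
          background subspace (e.g., leading eigenvectors of neighbouring spectra).
          Only the basic single-direction projection is shown here.
          """
          B = np.asarray(background_vectors, dtype=float)
          x = np.asarray(pixel, dtype=float)
          # Projector onto the orthogonal complement of the background subspace.
          P_perp = np.eye(B.shape[0]) - B @ np.linalg.pinv(B)
          residual = P_perp @ x
          return float(residual @ residual)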

  2. A Linked List-Based Algorithm for Blob Detection on Embedded Vision-Based Sensors

    PubMed Central

    Acevedo-Avila, Ricardo; Gonzalez-Mendoza, Miguel; Garcia-Garcia, Andres

    2016-01-01

    Blob detection is a common task in vision-based applications. Most existing algorithms are aimed at execution on general purpose computers; while very few can be adapted to the computing restrictions present in embedded platforms. This paper focuses on the design of an algorithm capable of real-time blob detection that minimizes system memory consumption. The proposed algorithm detects objects in one image scan; it is based on a linked-list data structure tree used to label blobs depending on their shape and node information. An example application showing the results of a blob detection co-processor has been built on a low-powered field programmable gate array hardware as a step towards developing a smart video surveillance system. The detection method is intended for general purpose application. As such, several test cases focused on character recognition are also examined. The results obtained present a fair trade-off between accuracy and memory requirements; and prove the validity of the proposed approach for real-time implementation on resource-constrained computing platforms. PMID:27240382

  3. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

    We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model from a large set of domain names that have been whitelisted by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words that is typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, algorithmically generated domain names typically have distributions that are quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (whitelisted) domain names, which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511
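
    A greatly simplified sketch of the generative idea, assuming a character-bigram model with add-alpha smoothing in place of the paper's richer model over words, word lengths and word counts; domains whose per-character log-likelihood falls well below typical whitelist scores would be flagged as putative algorithmically generated names.

      from collections import Counter
      import math

      def train_bigram_model(whitelist, alpha=1.0):
          """Character-bigram model with add-alpha smoothing, trained on benign names."""
          counts, context = Counter(), Counter()
          for name in whitelist:
              padded = "^" + name.lower() + "$"
              for a, b in zip(padded, padded[1:]):
                  counts[(a, b)] += 1
                  context[a] += 1
          vocab = len({c for name in whitelist for c in name.lower()} | {"^", "$"})
          return counts, context, vocab, alpha

      def log_likelihood_per_char(name, model):
          counts, context, vocab, alpha = model
          padded = "^" + name.lower() + "$"
          ll = 0.0
          for a, b in zip(padded, padded[1:]):
              ll += math.log((counts[(a, b)] + alpha) / (context[a] + alpha * vocab))
          return ll / (len(padded) - 1)

      model = train_bigram_model(["google", "wikipedia", "weather", "university"])
      print(log_likelihood_per_char("weathers", model))     # relatively high (benign-like)
      print(log_likelihood_per_char("xq7zk9fja2", model))   # relatively low (DGA-like)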

  4. Community Detection Algorithm Combining Stochastic Block Model and Attribute Data Clustering

    NASA Astrophysics Data System (ADS)

    Kataoka, Shun; Kobayashi, Takuto; Yasuda, Muneki; Tanaka, Kazuyuki

    2016-11-01

    We propose a new algorithm to detect the community structure in a network by utilizing both the network structure and vertex attribute data. Suppose we have the network structure together with vertex attribute data, that is, information assigned to each vertex that is associated with the community to which it belongs. The problem addressed in this paper is the detection of the community structure from both the network structure and the vertex attribute data. Our method is based on a Bayesian approach that models the posterior probability distribution of the community labels. The detection of the community structure in our method is achieved by using belief propagation and an EM algorithm. We numerically verified the performance of our method using computer-generated networks and real-world networks.

  5. Application of Artificial Bee Colony algorithm in TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-09-01

    In this study, the efficiency of the Artificial Bee Colony (ABC) algorithm is investigated for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the times of several strong earthquakes, including Chile (27 February 2010; 1 April 2014), Varzeghan (11 August 2012), Saravan (16 April 2013) and Papua New Guinea (29 March 2015). In comparison with other anomaly detection algorithms, ABC has a number of advantages: (1) detection of discordant patterns in large nonlinear data sets within a short time, (2) simplicity, (3) fewer control parameters, and (4) efficiency in solving multimodal and multidimensional optimization problems. The results of this study also support the TEC time series as a robust earthquake precursor.

  6. A blind detection scheme based on modified wavelet denoising algorithm for wireless optical communications

    NASA Astrophysics Data System (ADS)

    Li, Ruijie; Dang, Anhong

    2015-10-01

    This paper investigates a detection scheme that requires no channel state information for wireless optical communication (WOC) systems in turbulence-induced fading channels. The proposed scheme can effectively diminish the additive noise caused by background radiation and the photodetector, as well as the intensity scintillation caused by turbulence. The additive noise is mitigated significantly using a modified wavelet threshold denoising algorithm, and the intensity scintillation is then attenuated by exploiting the temporal correlation of the WOC channel. Moreover, to improve performance beyond that of the maximum likelihood decision, the maximum a posteriori probability (MAP) criterion is considered. Compared with a conventional blind detection algorithm, simulation results show that the proposed detection scheme improves the signal-to-noise ratio (SNR) performance by about 4.38 dB when the bit error rate and scintillation index (SI) are 1×10-6 and 0.02, respectively.
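
    A minimal sketch of the wavelet-thresholding stage using PyWavelets, assuming the standard universal threshold rather than the authors' modified rule; the temporal-correlation step and the MAP decision are omitted.

      import numpy as np
      import pywt  # PyWavelets

      def wavelet_denoise(signal, wavelet="db4", level=4):
          """Soft-threshold wavelet denoising of a received intensity signal.

          Uses the universal threshold as a placeholder; the modified threshold
          rule described in the paper is not reproduced here.
          """
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          # Noise level estimated from the finest detail coefficients.
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          thresh = sigma * np.sqrt(2 * np.log(len(signal)))
          denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(denoised, wavelet)[: len(signal)]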

  7. Context exploitation in intelligence, surveillance, and reconnaissance for detection and tracking algorithms

    NASA Astrophysics Data System (ADS)

    Tucker, Jonathan D.; Stanfill, S. Robert

    2015-05-01

    Intelligence, Surveillance, and Reconnaissance (ISR) missions involve complex analysis of sensor data that can benefit from the exploitation of geographically aligned context. In this paper we discuss our approach to utilizing geo-registered imagery and context for the purpose of aiding ISR detection and tracking applications. Specifically this includes rendering context masks on imagery, increasing the speed at which detection algorithms process data, providing a way to intelligently control detection density for given ground areas, identifying difficult traffic terrain, refining peak suppression for congested areas, reducing target center of mass location errors, and increasing track coverage and duration through track prediction error robustness.

  8. A Digital Forgery Image Detection Algorithm Based on Wavelet Homomorphic Filtering

    NASA Astrophysics Data System (ADS)

    Zheng, Jiangbin; Liu, Miao

    A novel forgery image detection algorithm is proposed to recognize traces of artificial blurring, which is one of the common ways to forge a digital image. First, wavelet homomorphic filtering is applied to enhance the high-frequency edges left after the blurring process. Second, the natural edges are eroded by a mathematical morphology method, while the enhanced artificial blur edges are preserved. Finally, the forged image regions are localized by a region labeling method. Experimental results demonstrate that the proposed method can detect forged areas accurately and reduce detection errors when artificial blur operations are used to create a forged image.

  9. A cloud detection algorithm using the downwelling infrared radiance measured by an infrared pyrometer of the ground-based microwave radiometer

    NASA Astrophysics Data System (ADS)

    Ahn, M.-H.; Han, D.; Won, H.-Y.; Morris, V.

    2014-09-01

    For a better utilization of the ground-based microwave radiometer, it is important to detect the presence of clouds in the measured data. Here, we introduce a simple and fast cloud detection algorithm that uses the optical characteristics of clouds in the infrared atmospheric window region. The new algorithm utilizes the brightness temperature (Tb) measured by an infrared radiometer installed on top of a microwave radiometer. The two-step algorithm consists of a spectral test followed by a temporal test. The measured Tb is first compared with a predicted clear-sky Tb obtained from an empirical formula as a function of surface air temperature and water vapor pressure. For the temporal test, the temporal variability of the measured Tb during one minute is compared with a dynamic threshold value representing the variability of clear-sky conditions. Data are designated as cloud-free only when both the spectral and temporal tests confirm cloud-free conditions. Overall, most of the thick and uniform clouds are successfully screened out by the spectral test, while the broken and fast-varying clouds are screened out by the temporal test. The algorithm is validated by comparison with collocated ceilometer data for six months, from January 2013 to June 2013. The overall proportion correct is about 88.3% and the probability of detection is 90.8%, which are comparable with or better than those of previous similar approaches. Two thirds of the failures occur when the new algorithm detects clouds while the ceilometer does not, resulting in different values of the probability of detection with different cloud-base altitudes: 93.8, 90.3, and 82.8% for low, mid, and high clouds, respectively. Finally, due to the characteristics of the spectral range, the new algorithm is found to be insensitive to the presence of inversion layers.

  10. Algorithms for detecting and predicting influenza outbreaks: metanarrative review of prospective evaluations

    PubMed Central

    Spreco, A; Timpka, T

    2016-01-01

    Objectives: Reliable monitoring of influenza seasons and pandemic outbreaks is essential for response planning, but compilations of reports on detection and prediction algorithm performance in influenza control practice are largely missing. The aim of this study is to perform a metanarrative review of prospective evaluations of influenza outbreak detection and prediction algorithms, restricted to settings where authentic surveillance data have been used. Design: The study was performed as a metanarrative review. An electronic literature search was performed, papers were selected, and qualitative and semiquantitative content analyses were conducted. For data extraction and interpretation, researcher triangulation was used for quality assurance. Results: Eight prospective evaluations were found that used authentic surveillance data: three studies evaluating detection and five studies evaluating prediction. The methodological perspectives and experiences from the evaluations were reported in narrative formats representing biodefence informatics and health policy research, respectively. The biodefence informatics narrative, with its emphasis on verification of technically and mathematically sound algorithms, constituted a large part of the reporting. Four evaluations were reported as health policy research narratives, and were thus formulated in a manner that allows the results to qualify as policy evidence. Conclusions: Awareness of the narrative format in which results are reported is essential when interpreting algorithm evaluations from an infectious disease control practice perspective. PMID:27154479

  11. Sequential structural damage diagnosis algorithm using a change point detection method

    NASA Astrophysics Data System (ADS)

    Noh, H.; Rajagopal, R.; Kiremidjian, A. S.

    2013-11-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known, specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected, using maximum likelihood and Bayesian methods. We also applied an approximate method to reduce the computational load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and with a known post-damage feature distribution the algorithm was able to identify damage at low false alarm rates, particularly when it used multidimensional damage-sensitive features. For cases with an unknown feature distribution, the post-damage distribution was consistently estimated and the detection delays were only a few time steps longer than the delays from the general method that assumes the post-damage feature distribution is known. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirements, but the maximum likelihood method provides an insightful heuristic approach.
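
    For the case where both feature distributions are known, the sequential test can be sketched as a CUSUM-style cumulative log-likelihood ratio; the Gaussian assumption, threshold, and function below are illustrative, and the paper's estimation of an unknown post-damage distribution is not reproduced.

      import numpy as np

      def cusum_detect(samples, mu0, sigma0, mu1, sigma1, threshold=10.0):
          """Sequential change-point detection via a cumulative log-likelihood ratio.

          Assumes Gaussian pre-damage (mu0, sigma0) and post-damage (mu1, sigma1)
          feature distributions are known, as in the 'general' method described in
          the abstract. Returns the index at which damage is declared, or None.
          """
          def logpdf(x, mu, sigma):
              return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

          s = 0.0
          for k, x in enumerate(samples):
              s = max(0.0, s + logpdf(x, mu1, sigma1) - logpdf(x, mu0, sigma0))
              if s > threshold:
                  return k
          return None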

  12. Algorithms for detection of objects in image sequences captured from an airborne imaging system

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia; Tang, Yuan-Liang; Devadiga, Sadashiva; Gandhi, Tarak

    1995-01-01

    This research was initiated as a part of the effort at the NASA Ames Research Center to design a computer vision based system that can enhance the safety of navigation by aiding the pilots in detecting various obstacles on the runway during critical section of the flight such as a landing maneuver. The primary goal is the development of algorithms for detection of moving objects from a sequence of images obtained from an on-board video camera. Image regions corresponding to the independently moving objects are segmented from the background by applying constraint filtering on the optical flow computed from the initial few frames of the sequence. These detected regions are tracked over subsequent frames using a model based tracking algorithm. Position and velocity of the moving objects in the world coordinate is estimated using an extended Kalman filter. The algorithms are tested using the NASA line image sequence with six static trucks and a simulated moving truck and experimental results are described. Various limitations of the currently implemented version of the above algorithm are identified and possible solutions to build a practical working system are investigated.

  13. Evaluation of automatic feature detection algorithms in EEG: application to interburst intervals.

    PubMed

    Chauvet, Pierre E; Tich, Sylvie Nguyen The; Schang, Daniel; Clément, Alain

    2014-11-01

    In this paper, we present a new method to compare and improve algorithms for feature detection in neonatal EEG. The method is based on the algorithm's ability to compute accurate statistics to predict the results of EEG visual analysis. This method is implemented inside a Java software package called EEGDiag, as part of an e-health Web portal dedicated to neonatal EEG. EEGDiag encapsulates a component-based implementation of the detection algorithms, called analyzers. Each analyzer is defined by a list of modules executed sequentially. As the libraries of modules are intended to be enriched by their users, we developed a process to evaluate the performance of new modules and analyzers using a database of expert-reviewed and categorized EEGs. The evaluation is based on the Davies-Bouldin index (DBI), which measures the quality of cluster separation, so that it will ease the building of classifiers on risk categories. As a first application we tested this method on the detection of interburst intervals (IBI) using a database of 394 EEGs acquired from premature newborns. We defined a class of IBI detectors based on a threshold applied to the standard deviation of contiguous short time windows, inspired by previous work. We then determined which detector and which threshold values are best with respect to the DBI, as well as the robustness of this choice. This method allows us to make counter-intuitive choices, such as removing the 50 Hz (power supply) filter to save time.
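
    The detector class described above reduces to a windowed standard-deviation threshold; the sketch below uses illustrative window length and threshold values rather than the ones selected by the DBI-based evaluation.

      import numpy as np

      def detect_ibi(eeg, fs, window_s=1.0, std_threshold=15.0):
          """Mark contiguous short windows as interburst intervals (IBI).

          The standard deviation of each window is compared with a threshold in the
          EEG's amplitude units (e.g., microvolts); window length and threshold are
          illustrative assumptions. Returns a boolean flag per window.
          """
          win = int(window_s * fs)
          n_win = len(eeg) // win
          stds = np.array([np.std(eeg[i * win:(i + 1) * win]) for i in range(n_win)])
          return stds < std_threshold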

  14. The design and hardware implementation of a low-power real-time seizure detection algorithm.

    PubMed

    Raghunathan, Shriram; Gupta, Sumeet K; Ward, Matthew P; Worth, Robert M; Roy, Kaushik; Irazoqui, Pedro P

    2009-10-01

    Epilepsy affects more than 1% of the world's population. Responsive neurostimulation is emerging as an alternative therapy for the 30% of the epileptic patient population that does not benefit from pharmacological treatment. Efficient seizure detection algorithms will enable closed-loop epilepsy prostheses by stimulating the epileptogenic focus within an early onset window. Critically, this is expected to reduce neuronal desensitization over time and lead to longer-term device efficacy. This work presents a novel event-based seizure detection algorithm along with a low-power digital circuit implementation. Hippocampal depth-electrode recordings from six kainate-treated rats are used to validate the algorithm and hardware performance in this preliminary study. The design process illustrates crucial trade-offs in translating mathematical models into hardware implementations and validates statistical optimizations made with empirical data analyses on results obtained using a real-time functioning hardware prototype. Using quantitatively predicted thresholds from the depth-electrode recordings, the auto-updating algorithm performs with an average sensitivity and selectivity of 95.3 +/- 0.02% and 88.9 +/- 0.01% (mean +/- SE(alpha = 0.05)), respectively, on untrained data with a detection delay of 8.5 s [5.97, 11.04] from electrographic onset. The hardware implementation is shown feasible using CMOS circuits consuming under 350 nW of power from a 250 mV supply voltage from simulations on the MIT 180 nm SOI process.

  15. Competitive evaluation of failure detection algorithms for strapdown redundant inertial instruments.

    NASA Technical Reports Server (NTRS)

    Wilcox, J. C.

    1973-01-01

    Seven algorithms for failure detection, isolation, and correction of strapdown inertial instruments in the dodecahedron configuration are competitively evaluated by means of a digital computer simulation that provides them with identical inputs. Their performance is compared in terms of orientation errors and computer burden. The analytical foundations of the algorithms are presented. The features that are found to contribute to superior performance are use of a definite logical structure, elimination of interaction between failures, different thresholds for first and second failures, use of the 'parity' test signals, and avoidance of iteration loops.

  16. A New Algorithm for Detecting Cloud Height using OMPS/LP Measurements

    NASA Technical Reports Server (NTRS)

    Chen, Zhong; DeLand, Matthew; Bhartia, Pawan K.

    2016-01-01

    The Ozone Mapping and Profiler Suite Limb Profiler (OMPS/LP) ozone product requires the determination of cloud height for each event to establish the lower boundary of the profile for the retrieval algorithm. We have created a revised cloud detection algorithm for LP measurements that uses the spectral dependence of the vertical gradient in radiance between two wavelengths in the visible and near-IR spectral regions. This approach provides better discrimination between clouds and aerosols than results obtained using a single wavelength. Observed LP cloud height values show good agreement with coincident Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) measurements.

  17. Advanced Oil Spill Detection Algorithms For Satellite Based Maritime Environment Monitoring

    NASA Astrophysics Data System (ADS)

    Radius, Andrea; Azevedo, Rui; Sapage, Tania; Carmo, Paulo

    2013-12-01

    In recent years, the increasing occurrence of pollution and the alarming deterioration of the environmental health of the sea have led to the need for global monitoring capabilities, namely for marine environment management in terms of oil spill detection and indication of the suspected polluter. The sensitivity of Synthetic Aperture Radar (SAR) to different phenomena on the sea, especially for oil spill and vessel detection, makes it a key instrument for global pollution monitoring. The performance of SAR in maritime pollution monitoring is being operationally explored by a set of service providers on behalf of the European Maritime Safety Agency (EMSA), which launched the CleanSeaNet (CSN) project, a pan-European satellite-based oil monitoring service, in 2007. EDISOFT, which has been a service provider for CSN from the beginning, is continuously investing in R&D activities that will ultimately lead to better algorithms and better performance in oil spill detection from SAR imagery. This strategy is being pursued through EDISOFT's participation in the FP7 EC Sea-U project and in the Automatic Oil Spill Detection (AOSD) ESA project. The Sea-U project aims to improve the current state of oil spill detection algorithms through maximization of informative content obtained with data fusion, the exploitation of different types of data/sensors and the development of advanced image processing, segmentation and classification techniques. The AOSD project is closely related to the operational segment, because it is focused on the automation of the oil spill detection processing chain, integrating auxiliary data, such as wind information, together with image and geometry analysis techniques. The synergy between these different objectives (R&D versus operational) has allowed EDISOFT to develop oil spill detection software that combines the operational automatic aspect, obtained through dedicated integration of the processing chain in the existing open source NEST

  18. A multi-objective discrete cuckoo search algorithm with local search for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Zhou, Xu; Liu, Yanheng; Li, Bin

    2016-03-01

    Detecting communities is a challenging task in analyzing networks. Solving the community detection problem with evolutionary algorithms has been a hot topic in recent years. In this paper, a multi-objective discrete cuckoo search algorithm with local search (MDCL) for community detection is proposed. To the best of our knowledge, this is the first time the cuckoo search algorithm has been applied to community detection. Two objective functions, termed negative ratio association and ratio cut, are minimized; these two functions can overcome the modularity limitation. In the proposed algorithm, the nest location updating strategy and abandon operator of the cuckoo search are redefined in discrete form. A local search strategy and a clone operator are proposed to obtain an optimal initial population. The experimental results on synthetic and real-world networks show that the proposed algorithm has better performance than other algorithms and can discover higher-quality community structure without prior information.

  19. A new morphological anomaly detection algorithm for hyperspectral images and its GPU implementation

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2011-10-01

    Anomaly detection is considered a very important task for hyperspectral data exploitation. It is now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we develop a new morphological algorithm for anomaly detection in hyperspectral images along with an efficient GPU implementation of the algorithm. The algorithm is implemented on latest-generation GPU architectures, and evaluated with regards to other anomaly detection algorithms using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex. The proposed GPU implementation achieves real-time performance in the considered case study.

  20. Evaluation of accelerometer-based fall detection algorithms on real-world falls.

    PubMed

    Bagalà, Fabio; Becker, Clemens; Cappello, Angelo; Chiari, Lorenzo; Aminian, Kamiar; Hausdorff, Jeffrey M; Zijlstra, Wiebren; Klenk, Jochen

    2012-01-01

    Despite extensive preventive efforts, falls continue to be a major source of morbidity and mortality among the elderly. Real-time detection of falls and their urgent communication to a telecare center may enable rapid medical assistance, thus increasing the sense of security of the elderly and reducing some of the negative consequences of falls. Many different approaches have been explored to automatically detect a fall using inertial sensors. Although previously published algorithms report high sensitivity (SE) and high specificity (SP), they have usually been tested on simulated falls performed by healthy volunteers. We recently collected acceleration data during a number of real-world falls among a high-fall-risk patient population as part of the SensAction-AAL European project. The aim of the present study is to benchmark the performance of thirteen published fall-detection algorithms when they are applied to the database of 29 real-world falls. To the best of our knowledge, this is the first systematic comparison of fall-detection algorithms tested on real-world falls. We found that the average SP of the thirteen algorithms was (mean ± std) 83.0% ± 30.3% (maximum value = 98%). The SE was considerably lower (SE = 57.0% ± 27.3%, maximum value = 82.8%), much lower than the values obtained on simulated falls. The number of false alarms generated by the algorithms during 1-day monitoring of three representative fallers ranged from 3 to 85. The factors that affect the performance of the published algorithms when they are applied to real-world falls are also discussed. These findings indicate the importance of testing fall-detection algorithms in real-life conditions in order to produce more effective automated alarm systems with higher acceptance. Further, the present results support the idea that a large, shared real-world fall database could, potentially, provide an enhanced understanding of the fall process and the information needed to design and

  1. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control Jets (RCS), ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or an incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but they have potentially hazardous consequences if submerged while firing. The time available after impact to cut the risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g., descent under chutes) using high-fidelity rigid-body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
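
    Both candidate detectors reduce to simple threshold tests on IMU data; the sketch below shows an acceleration-magnitude spike test and an accumulated velocity-change test over a sliding window, with all threshold values as illustrative assumptions rather than Orion flight settings, and `accel` assumed to be gravity-compensated acceleration in m/s^2.

      import numpy as np

      def accel_spike_touchdown(accel, g_threshold=8.0):
          """Declare touchdown at the first sample whose acceleration magnitude
          exceeds a threshold (in g); the threshold value is illustrative."""
          mag_g = np.linalg.norm(accel, axis=1) / 9.81
          hits = np.nonzero(mag_g > g_threshold)[0]
          return int(hits[0]) if hits.size else None

      def delta_v_touchdown(accel, fs, window_s=0.2, dv_threshold=3.0):
          """Declare touchdown when the accumulated velocity change over a sliding
          window (in m/s) exceeds a threshold; values are illustrative."""
          mag = np.linalg.norm(accel, axis=1)
          dt = 1.0 / fs
          win = int(window_s * fs)
          dv = np.convolve(mag * dt, np.ones(win), mode="valid")
          hits = np.nonzero(dv > dv_threshold)[0]
          return int(hits[0]) if hits.size else None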

  2. Facilitation of Third-party Development of Advanced Algorithms for Explosive Detection Using Workshops and Grand Challenges

    SciTech Connect

    Martz, H E; Crawford, C R; Beaty, J S; Castanon, D

    2011-02-15

    The US Department of Homeland Security (DHS) has requirements for future explosive detection scanners that include dealing with a larger number of threats, higher probability of detection, lower false alarm rates and lower operating costs. One tactic that DHS is pursuing to achieve these requirements is to augment the capabilities of the established security vendors with third-party algorithm developers. The purposes of this presentation are to review DHS's objectives for involving third parties in the development of advanced algorithms and then to discuss how these objectives are achieved using workshops and grand challenges. Terrorist threats persist and are becoming more sophisticated, so there is a need to increase the number of smart people working on homeland security. Augmenting the capabilities and capacities of system vendors with third parties is one such tactic, and third parties can be reached via workshops and grand challenges. Successes have been achieved to date, but there are issues that need to be resolved to further increase third-party involvement.

  3. Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm

    NASA Astrophysics Data System (ADS)

    Neri, P.

    2017-05-01

    Recent papers introduced Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that certain hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at an early stage. The present paper proposes an improved algorithm that can detect a blade vibration frequency shift caused by a crack that is very small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing the use of non-contact methods for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving valuable information about wheel health. This configuration results in an acquisition time for each blade that becomes shorter as the machine rotational speed increases. In this situation, traditional Discrete Fourier Transform analysis yields poor frequency resolution and is not suitable for detecting small frequency shifts. Non-Harmonic Fourier Analysis, by contrast, showed high reliability in vibration frequency estimation even with data samples collected over a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
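
    The core idea of evaluating the Fourier transform at arbitrary (non-harmonic) frequencies, rather than only at the DFT bin frequencies, can be sketched as follows. This is a generic frequency-scan estimator under stated assumptions, not the specific improved algorithm of the paper.

```python
import numpy as np

def nonharmonic_frequency_estimate(x, fs, f_min, f_max, df=0.01):
    """Estimate the dominant frequency of a short record by evaluating the
    discrete-time Fourier transform on a fine frequency grid (not restricted
    to multiples of fs/N) and picking the magnitude peak."""
    x = np.asarray(x, float)
    n = np.arange(len(x))
    freqs = np.arange(f_min, f_max, df)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)  # DTFT at arbitrary frequencies
    spectrum = np.abs(kernel @ x)
    return freqs[np.argmax(spectrum)]
```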

  4. Improving lesion detectability in PET imaging with a penalized likelihood reconstruction algorithm

    NASA Astrophysics Data System (ADS)

    Wangerin, Kristen A.; Ahn, Sangtae; Ross, Steven G.; Kinahan, Paul E.; Manjeshwar, Ravindra M.

    2015-03-01

    Ordered Subset Expectation Maximization (OSEM) is currently the most widely used image reconstruction algorithm for clinical PET. However, OSEM does not necessarily provide optimal image quality, and a number of alternative algorithms have been explored. We have recently shown that a penalized likelihood image reconstruction algorithm using the relative difference penalty, block sequential regularized expectation maximization (BSREM), achieves more accurate lesion quantitation than OSEM, and importantly, maintains acceptable visual image quality in clinical whole-body PET. The goal of this work was to evaluate lesion detectability with BSREM versus OSEM. We performed a two-alternative forced choice study using 81 patient datasets with lesions of varying contrast inserted into the liver and lung. At matched imaging noise, BSREM and OSEM showed equivalent detectability in the lungs, and BSREM outperformed OSEM in the liver. These results suggest that BSREM provides not only improved quantitation and clinically acceptable visual image quality as previously shown but also improved lesion detectability compared to OSEM. We then modeled this detectability study, applying both non-prewhitening (NPW) and channelized Hotelling (CHO) model observers to the reconstructed images. The CHO model observer showed good agreement with the human observers, suggesting that we can apply this model to future studies with varying simulation and reconstruction parameters.

  5. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of a threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  6. Developing Fire Detection Algorithms by Geostationary Orbiting Platforms and Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Salvador, Pablo; Sanz, Julia; Garcia, Miguel; Casanova, Jose Luis

    2016-08-01

    Fires in general, and forest fires in particular, are a major concern in terms of economic and biological losses. Remote sensing research has focused on developing detection algorithms, adapted to a wide range of sensors, platforms and regions, in order to obtain hotspots as quickly as possible. The aim of this study is to establish an automatic methodology, based on machine learning techniques, for developing hotspot detection algorithms with the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor on board the Meteosat Second Generation (MSG) platform, a methodology that can be exported to other geostationary platforms and sensors and to any area of the Earth. The sensitivity (SE), specificity (SP) and accuracy (AC) parameters have been analyzed in order to develop the final machine learning algorithm, taking into account the preferences and final use of the predicted data.

  7. A consensus algorithm for approximate string matching and its application to QRS complex detection

    NASA Astrophysics Data System (ADS)

    Alba, Alfonso; Mendez, Martin O.; Rubio-Rincon, Miguel E.; Arce-Santana, Edgar R.

    2016-08-01

    In this paper, a novel algorithm for approximate string matching (ASM) is proposed. The novelty resides in the fact that, unlike most other methods, the proposed algorithm is not based on the Hamming or Levenshtein distances, but instead computes a score for each symbol in the search text based on a consensus measure. Those symbols with sufficiently high scores will likely correspond to approximate instances of the pattern string. To demonstrate the usefulness of the proposed method, it has been applied to the detection of QRS complexes in electrocardiographic signals with competitive results when compared against the classic Pan-Tompkins (PT) algorithm. The proposed method outperformed PT in 72% of the test cases, with no extra computational cost.

  8. [Detection of QRS complexes using wavelet transformation and golden section search algorithm].

    PubMed

    Chen, Wenli; Mo, Zhiwen; Guo, Wen

    2009-08-01

    The extraction and identification of ECG (electrocardiogram) signal characteristic parameters are the basis of ECG analysis and diagnosis. Fast and precise detection of QRS complexes is very important in ECG signal analysis, as it is a prerequisite for calculating the related parameters as well as for correct diagnosis. In our work, the modulus maxima of the wavelet transform are first applied to QRS complex detection in the ECG signal. When false detections or missed detections occur, we use the Golden Section Search algorithm to adjust the threshold used for maxima determination. The correct detection rate of QRS complexes reaches 99.6% on MIT-BIH ECG data.
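
    Golden-section search itself is standard; a sketch of how it could be used to tune the maxima-determination threshold is given below. The objective function (missed plus false detections at a given threshold) is an assumption about how the tuning might be set up, not the paper's exact formulation.

```python
import math

def golden_section_search(f, a, b, tol=1e-4):
    """Minimize a unimodal function f on [a, b] by golden-section search.
    For threshold tuning, f(threshold) could return the number of missed
    plus false QRS detections produced with that threshold."""
    inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi, about 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2
```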

  9. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    NASA Astrophysics Data System (ADS)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure in order to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are of limited use for deeper comparisons because they discard factors relevant to real-time applications, such as run times, misclassification costs and the ability to mark anomalies with high scores. This last point is fundamental in anomaly detection, since it allows anomalies to be distinguished easily from the background without any further processing. An extensive set of simulations has been carried out using different anomaly detection algorithms, comparing their performance and efficiency using several extra metrics in order to complement the ROC curve analysis. Results support our proposal and demonstrate that ROC curves by themselves do not provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper which combines all the measures yielded by the proposed additional metrics into a single global metric. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performance according to ROC curves do not have the highest DE values. Consequently, the recommendation to use extra measures to properly evaluate performance has been supported and justified by
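
    For reference, the ROC curve and its area can be computed from detector scores as sketched below; the paper's additional metrics and the Detection Efficiency figure of merit are not reproduced here.

```python
import numpy as np

def roc_curve(scores, labels):
    """ROC points (FPR, TPR) obtained by sweeping the decision threshold
    over the anomaly scores. labels: 1 = anomaly, 0 = background."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    order = np.argsort(-scores)                  # descending score
    labels = labels[order]
    tpr = np.cumsum(labels) / max(labels.sum(), 1)
    fpr = np.cumsum(1 - labels) / max((1 - labels).sum(), 1)
    return fpr, tpr

def auc(fpr, tpr):
    """Area under the ROC curve by trapezoidal integration."""
    return float(np.trapz(tpr, fpr))
```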

  10. A multiobjective evolutionary algorithm based on similarity for community detection from signed social networks.

    PubMed

    Liu, Chenlong; Liu, Jing; Jiang, Zhongzhou

    2014-12-01

    Various types of social relationships, such as friends and foes, can be represented as signed social networks (SNs) that contain both positive and negative links. Although many community detection (CD) algorithms have been proposed, most of them were designed primarily for networks containing only positive links. Thus, it is important to design CD algorithms that can handle large-scale SNs. To this end, we first extend the original similarity to a signed similarity based on social balance theory. Then, based on the signed similarity and the natural contradiction between positive and negative links, two objective functions are designed to model the problem of detecting communities in SNs as a multiobjective problem. Afterward, we propose a multiobjective evolutionary algorithm, called MEAs-SN. In MEAs-SN, to overcome the defects of direct and indirect representations for communities, a combined direct and indirect representation is designed. Thanks to this representation, MEAs-SN can switch between different representations during the evolutionary process and thus benefit from both. Moreover, owing to this representation, MEAs-SN can also detect overlapping communities directly. In the experiments, both benchmark problems and large-scale synthetic networks generated by various parameter settings are used to validate the performance of MEAs-SN. The experimental results show the effectiveness and efficacy of MEAs-SN on networks with 1000, 5000, and 10,000 nodes and in various noisy situations. A thorough comparison is also made between MEAs-SN and three existing algorithms, and the results show that MEAs-SN outperforms the other algorithms.

  11. Real-time implementation of a multispectral mine target detection algorithm

    NASA Astrophysics Data System (ADS)

    Samson, Joseph W.; Witter, Lester J.; Kenton, Arthur C.; Holloway, John H., Jr.

    2003-09-01

    Spatial-spectral anomaly detection (the "RX Algorithm") has been exploited on the USMC's Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) and several associated technology base studies, and has been found to be a useful method for the automated detection of surface-emplaced antitank land mines in airborne multispectral imagery. RX is a complex image processing algorithm that involves the direct spatial convolution of a target/background mask template over each multispectral image, coupled with a spatially variant background spectral covariance matrix estimation and inversion. The RX throughput on the ATD was about 38X real time using a single Sun UltraSparc system. A goal to demonstrate RX in real-time was begun in FY01. We now report the development and demonstration of a Field Programmable Gate Array (FPGA) solution that achieves a real-time implementation of the RX algorithm at video rates using COBRA ATD data. The approach uses an Annapolis Microsystems Firebird PMC card containing a Xilinx XCV2000E FPGA with over 2,500,000 logic gates and 18MBytes of memory. A prototype system was configured using a Tek Microsystems VME board with dual-PowerPC G4 processors and two PMC slots. The RX algorithm was translated from its C programming implementation into the VHDL language and synthesized into gates that were loaded into the FPGA. The VHDL/synthesizer approach allows key RX parameters to be quickly changed and a new implementation automatically generated. Reprogramming the FPGA is done rapidly and in-circuit. Implementation of the RX algorithm in a single FPGA is a major first step toward achieving real-time land mine detection.
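
    The RX statistic itself is the Mahalanobis distance of each pixel spectrum from the background. A global-background sketch is shown below; the COBRA implementation described above uses a spatially variant background covariance estimated under a sliding target/background mask, which is omitted here for brevity.

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly scores for a multispectral cube of shape
    (rows, cols, bands): the Mahalanobis distance of each pixel spectrum
    from the scene mean under the scene covariance."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))
    centered = pixels - mu
    scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
    return scores.reshape(rows, cols)
```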

  12. Automatic Detection and Extraction Algorithm of Inter-Granular Bright Points

    NASA Astrophysics Data System (ADS)

    Feng, Song; Ji, Kai-fan; Deng, Hui; Wang, Feng; Fu, Xiao-dong

    2012-12-01

    Inter-granular Bright Points (igBPs) are small-scale objects in the solar photosphere which can be seen within dark inter-granular lanes. We present a new algorithm to automatically detect and extract igBPs. A Laplacian and Morphological Dilation (LMD) technique is employed by the algorithm. It involves three basic processing steps: (1) obtaining candidate "seed" regions with the Laplacian; (2) determining the boundary and size of igBPs by morphological dilation; (3) discarding brighter granules by a probability criterion. For validating our algorithm, we used observations from the Dutch Open Telescope (DOT) collected on April 12, 2007. They contain 180 high-resolution images, each with an 85 × 68 arcsec² field of view (FOV). Two important results are obtained: first, the identification rate of igBPs reaches 95%, higher than previous results; second, the diameter distribution is 220 ± 25 km, which is fully consistent with previously published data. We conclude that the presented algorithm can detect and extract igBPs automatically and effectively.
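
    The three LMD steps can be sketched as below. The thresholds, the dilation depth and the simple brightness cut standing in for the paper's probability criterion are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_bright_points(image, k_sigma=3.0, dilation_iter=2, brightness_cut=1.1):
    """Sketch of the LMD pipeline: Laplacian seeds, dilation to recover the
    full igBP area, and rejection of granule-like bright regions."""
    lap = -ndimage.laplace(image.astype(float))          # bright points give a positive response
    seeds = lap > lap.mean() + k_sigma * lap.std()       # step 1: candidate "seed" pixels
    grown = ndimage.binary_dilation(seeds, iterations=dilation_iter)  # step 2: grow seeds
    labels, n = ndimage.label(grown)
    keep = np.zeros_like(grown)                          # step 3: discard brighter granules
    for idx in range(1, n + 1):
        region = labels == idx
        if image[region].mean() < brightness_cut * image.mean():
            keep |= region
    return keep
```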

  13. A positive detecting code and its decoding algorithm for DNA library screening.

    PubMed

    Uehara, Hiroaki; Jimbo, Masakazu

    2009-01-01

    The study of gene functions requires high-quality DNA libraries. However, a large number of tests and screenings are necessary for compiling such libraries. We describe an algorithm for extracting as much information as possible from pooling experiments for library screening. Collections of clones are called pools, and a pooling experiment is a group test for detecting all positive clones. The probability of positiveness for each clone is estimated according to the outcomes of the pooling experiments. Clones with high chance of positiveness are subjected to confirmatory testing. In this paper, we introduce a new positive clone detecting algorithm, called the Bayesian network pool result decoder (BNPD). The performance of BNPD is compared, by simulation, with that of the Markov chain pool result decoder (MCPD) proposed by Knill et al. in 1996. Moreover, the combinatorial properties of pooling designs suitable for the proposed algorithm are discussed in conjunction with combinatorial designs and d-disjunct matrices. We also show the advantage of utilizing packing designs or BIB designs for the BNPD algorithm.

  14. Detecting Blending End-Point Using Mean Squares Successive Difference Test and Near-Infrared Spectroscopy.

    PubMed

    Khorasani, Milad; Amigo, José M; Bertelsen, Poul; Van Den Berg, Frans; Rantanen, Jukka

    2015-08-01

    An algorithm based on the mean squares successive difference test, applied to near-infrared spectra and principal component analysis scores, was developed to monitor and determine the blending profile and to assess the end-point of the statistically stable phase. Model formulations consisting of an active compound (acetylsalicylic acid), together with microcrystalline cellulose and two grades of calcium carbonate with dramatically different particle shapes, were prepared. The formulation comprising angular-shaped calcium carbonate reached the blending end-point more slowly than the formulation comprising equant-shaped calcium carbonate. Using the ring shear test, this distinction in end-point could be related to the difference in flowability of the formulations. On the basis of the two model formulations, a design of experiments was conducted to characterize the blending process by studying the effect of CaCO3 grade and bin fill level on the blending end-point. Calcium carbonate grade, fill level, and their interaction were shown to have a significant impact on the blending process.
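
    The underlying statistic is the classical mean-square-successive-difference (von Neumann) ratio; a sketch of how it might flag the statistically stable phase of PC1 scores is given below. The window length and acceptance band are assumptions, not the paper's settings.

```python
import numpy as np

def von_neumann_ratio(x):
    """Mean square successive difference divided by the sample variance.
    For serially uncorrelated (stable) data the ratio is close to 2; a
    markedly smaller value indicates a trend, i.e. blending still ongoing."""
    x = np.asarray(x, float)
    return np.mean(np.diff(x) ** 2) / np.var(x, ddof=1)

def blend_endpoint(pc1_scores, window=15, lo=1.6, hi=2.4):
    """Return the index of the first window of PC1 scores whose ratio falls
    inside a 'random noise' band, taken here as the blending end-point."""
    for start in range(len(pc1_scores) - window + 1):
        if lo < von_neumann_ratio(pc1_scores[start:start + window]) < hi:
            return start + window - 1
    return None
```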

  15. Optimizing convergence rates of alternating minimization reconstruction algorithms for real-time explosive detection applications

    NASA Astrophysics Data System (ADS)

    Bosch, Carl; Degirmenci, Soysal; Barlow, Jason; Mesika, Assaf; Politte, David G.; O'Sullivan, Joseph A.

    2016-05-01

    X-ray computed tomography reconstruction for medical, security and industrial applications has evolved through 40 years of experience with rotating gantry scanners using analytic reconstruction techniques such as filtered back projection (FBP). In parallel, research into statistical iterative reconstruction algorithms has evolved to apply to sparse-view scanners in nuclear medicine, low data rate scanners in Positron Emission Tomography (PET) [5, 7, 10] and, more recently, to reduce exposure to ionizing radiation in conventional X-ray CT scanners. Multiple approaches to statistical iterative reconstruction have been developed, based primarily on variations of expectation maximization (EM) algorithms. The primary benefit of EM algorithms is the guarantee of convergence that is maintained when iterative corrections are made within the limits of convergent algorithms. The primary disadvantage, however, is that strict adherence to the correction limits of convergent algorithms extends the number of iterations and the ultimate timeline to complete a 3D volumetric reconstruction. Researchers have studied methods to accelerate convergence through more aggressive corrections [1], ordered subsets [1, 3, 4, 9] and spatially variant image updates. In this paper we describe the development of an AM reconstruction algorithm with accelerated convergence for use in a real-time explosive detection application for aviation security. By judiciously applying multiple acceleration techniques and advanced GPU processing architectures, we are able to perform 3D reconstruction of scanned passenger baggage at a rate of 75 slices per second. Analysis of the results on stream-of-commerce passenger bags demonstrates accelerated convergence by factors of 8 to 15 when comparing images from accelerated and strictly convergent algorithms.

  16. Quantitative evaluation of image processing algorithms for ill-structured road detection and tracking

    NASA Astrophysics Data System (ADS)

    Dufourd, Delphine; Dalgalarrondo, Andre

    2003-09-01

    In a previous presentation at AeroSense 2002, we described a methodology to assess the results of image processing algorithms for ill-structured road detection and tracking. In this paper, we present our first application of this methodology on six edge detectors and a database of about 20,000 images. Our evaluation approach is based on the use of video image sequences, ground truth - reference results established by human experts - and assessment metrics which measure the quality of the image processing results. We need a quantitative, comparative and repeatable evaluation of many algorithms in order to direct future developments. The main scope of this paper is to present the lessons learned from applying our methodology. More precisely, we describe the assessment metrics, the algorithms and the database. We then describe how we extract the strengths and weaknesses of each algorithm and establish a global scoring. The insight we gained for the definition of assessment metrics is also presented. Finally, we suggest some promising directions for the development of road tracking algorithms and complementarities that should be sought. To conclude, we describe future improvements for the database constitution, the assessment tools and the overall methodology.

  17. ASTErIsM: application of topometric clustering algorithms in automatic galaxy detection and classification

    NASA Astrophysics Data System (ADS)

    Tramacere, A.; Paraficz, D.; Dubath, P.; Kneib, J.-P.; Courbin, F.

    2016-12-01

    We present a study on galaxy detection and shape classification using topometric clustering algorithms. We first use the DBSCAN algorithm to extract, from CCD frames, groups of adjacent pixels with significant fluxes, and we then apply the DENCLUE algorithm to separate the contributions of overlapping sources. The DENCLUE separation is based on the localization of patterns of local maxima through an iterative algorithm which associates each pixel with the closest local maximum. Our main classification goal is to separate elliptical from spiral galaxies. We introduce new sets of features derived from the computation of geometrical invariant moments of the pixel group shape and from the statistics of the spatial distribution of the DENCLUE local maxima patterns. Ellipticals are characterized by a single group of local maxima, related to the galaxy core, while spiral galaxies have additional groups related to segments of spiral arms. We use two different supervised ensemble classification algorithms: Random Forest and Gradient Boosting. Using a sample of ≃24 000 galaxies taken from the Galaxy Zoo 2 main sample with spectroscopic redshifts, we test our classification against the Galaxy Zoo 2 catalogue. We find that features extracted from our pipeline give, on average, an accuracy of ≃93 per cent when testing on a test set with a size of 20 per cent of our full data set, with features derived from the angular distribution of density attractors ranking at the top of the discrimination power.
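
    The first (detection) stage can be sketched with scikit-learn's DBSCAN on the coordinates of significant pixels; the DENCLUE de-blending and the moment-based features are omitted, and the thresholds are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def extract_sources(image, sigma_cut=3.0, eps=1.5, min_samples=5):
    """Group pixels above a significance threshold into candidate sources."""
    background, noise = np.median(image), np.std(image)
    ys, xs = np.nonzero(image > background + sigma_cut * noise)
    coords = np.column_stack([xs, ys]).astype(float)
    if len(coords) == 0:
        return coords, np.array([], dtype=int)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(coords)
    return coords, labels            # label -1 marks unclustered noise pixels
```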

  18. Decomposition-Based Multiobjective Evolutionary Algorithm for Community Detection in Dynamic Social Networks

    PubMed Central

    Ma, Jingjing; Liu, Jie; Ma, Wenping; Gong, Maoguo; Jiao, Licheng

    2014-01-01

    Community structure is one of the most important properties of social networks. In dynamic networks, there are two conflicting criteria that need to be considered. One is the snapshot quality, which evaluates the quality of the community partitions at the current time step. The other is the temporal cost, which evaluates the difference between communities at different time steps. In this paper, we propose a decomposition-based multiobjective community detection algorithm to simultaneously optimize these two objectives to reveal community structure and its evolution in dynamic networks. It employs the framework of a multiobjective evolutionary algorithm based on decomposition to simultaneously optimize the modularity and normalized mutual information, which quantitatively measure the quality of the community partitions and the temporal cost, respectively. A local search strategy dealing with problem-specific knowledge is incorporated to improve the effectiveness of the new algorithm. Experiments on computer-generated and real-world networks demonstrate that the proposed algorithm can not only find community structure and capture community evolution more accurately, but is also more stable than the two compared algorithms. PMID:24723806

  19. Runway Safety Monitor Algorithm for Single and Crossing Runway Incursion Detection and Alerting

    NASA Technical Reports Server (NTRS)

    Green, David F., Jr.

    2006-01-01

    The Runway Safety Monitor (RSM) is an aircraft based algorithm for runway incursion detection and alerting that was developed in support of NASA's Runway Incursion Prevention System (RIPS) research conducted under the NASA Aviation Safety and Security Program's Synthetic Vision System project. The RSM algorithm provides warnings of runway incursions in sufficient time for pilots to take evasive action and avoid accidents during landings, takeoffs or when taxiing on the runway. The report documents the RSM software and describes in detail how RSM performs runway incursion detection and alerting functions for NASA RIPS. The report also describes the RIPS flight tests conducted at the Reno/Tahoe International Airport (RNO) and the Wallops Flight Facility (WAL) during July and August of 2004, and the RSM performance results and lessons learned from those flight tests.

  20. An Efficient Moving Target Detection Algorithm Based on Sparsity-Aware Spectrum Estimation

    PubMed Central

    Shen, Mingwei; Wang, Jie; Wu, Di; Zhu, Daiyin

    2014-01-01

    In this paper, an efficient direct data domain space-time adaptive processing (STAP) algorithm for moving targets detection is proposed, which is achieved based on the distinct spectrum features of clutter and target signals in the angle-Doppler domain. To reduce the computational complexity, the high-resolution angle-Doppler spectrum is obtained by finding the sparsest coefficients in the angle domain using the reduced-dimension data within each Doppler bin. Moreover, we will then present a knowledge-aided block-size detection algorithm that can discriminate between the moving targets and the clutter based on the extracted spectrum features. The feasibility and effectiveness of the proposed method are validated through both numerical simulations and raw data processing results. PMID:25222035

  1. Quantitative detection of defects based on Markov-PCA-BP algorithm using pulsed infrared thermography technology

    NASA Astrophysics Data System (ADS)

    Tang, Qingju; Dai, Jingmin; Liu, Junyan; Liu, Chunsheng; Liu, Yuanlin; Ren, Chunping

    2016-07-01

    Quantitative detection of the diameter and depth of debonding defects in TBCs has been carried out using pulsed infrared thermography. By combining principal component analysis with neural network theory, the Markov-PCA-BP algorithm is proposed, and its principle and realization process are described. In the prediction model, the principal components that reflect most characteristics of the thermal wave signal are set as the input, and the defect depth and diameter are set as the output. The experimental data from pulsed infrared thermography tests of TBCs with flat-bottom hole defects were selected as the training and testing samples. A Markov-PCA-BP predictive system was obtained, with which both the defect depth and diameter were identified accurately, demonstrating the effectiveness of the proposed method for quantitative detection of debonding defects in TBCs.
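
    The PCA-plus-back-propagation core of such a model can be sketched with scikit-learn as below; the Markov-chain stage of the published Markov-PCA-BP method is not reproduced, and the network size and component count are assumptions.

```python
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

def build_pca_bp_model(thermal_signals, defect_params, n_components=5):
    """Fit a PCA + back-propagation network that maps thermal-wave signals
    (samples x time_points) to defect depth and diameter (samples x 2)."""
    model = make_pipeline(
        PCA(n_components=n_components),          # keep the leading principal components
        MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
    )
    model.fit(thermal_signals, defect_params)
    return model
```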

  2. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
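
    The original swinging door segmentation can be sketched as below: a segment is extended while one line from its start point stays within ± epsilon of every sample, and the doors "close" otherwise. The merging of segments into significant ramps by dynamic programming, and the parameter optimization, are not shown.

```python
import numpy as np

def swinging_door_segments(t, y, epsilon):
    """Return (start, end) index pairs of swinging-door segments of the
    series y(t); ramp detection then inspects the slope of each segment."""
    segments, start = [], 0
    slope_up, slope_low = np.inf, -np.inf
    for i in range(1, len(y)):
        dt = t[i] - t[start]
        slope_up = min(slope_up, (y[i] + epsilon - y[start]) / dt)
        slope_low = max(slope_low, (y[i] - epsilon - y[start]) / dt)
        if slope_low > slope_up:                 # doors closed: end the segment
            segments.append((start, i - 1))
            start = i - 1                        # new segment starts at the previous point
            dt = t[i] - t[start]
            slope_up = (y[i] + epsilon - y[start]) / dt
            slope_low = (y[i] - epsilon - y[start]) / dt
    segments.append((start, len(y) - 1))
    return segments
```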

  3. A multi-split mapping algorithm for circular RNA, splicing, trans-splicing and fusion detection.

    PubMed

    Hoffmann, Steve; Otto, Christian; Doose, Gero; Tanzer, Andrea; Langenberger, David; Christ, Sabina; Kunz, Manfred; Holdt, Lesca M; Teupser, Daniel; Hackermüller, Jörg; Stadler, Peter F

    2014-02-10

    Numerous high-throughput sequencing studies have focused on detecting conventionally spliced mRNAs in RNA-seq data. However, non-standard RNAs arising through gene fusion, circularization or trans-splicing are often neglected. We introduce a novel, unbiased algorithm to detect splice junctions from single-end cDNA sequences. In contrast to other methods, our approach accommodates multi-junction structures. Our method compares favorably with competing tools for conventionally spliced mRNAs and, with a gain of up to 40% of recall, systematically outperforms them on reads with multiple splits, trans-splicing and circular products. The algorithm is integrated into our mapping tool segemehl (http://www.bioinf.uni-leipzig.de/Software/segemehl/).

  4. A new algorithm for epilepsy seizure onset detection and spread estimation from EEG signals

    NASA Astrophysics Data System (ADS)

    Quintero-Rincón, Antonio; Pereyra, Marcelo; D'Giano, Carlos; Batatia, Hadj; Risk, Marcelo

    2016-04-01

    Appropriate diagnosis and treatment of epilepsy is a major public health issue. Patients suffering from this disease often exhibit different physical manifestations, which result from the synchronous and excessive discharge of a group of neurons in the cerebral cortex. Extracting this information from EEG signals is an important problem in biomedical signal processing. In this work we propose a new algorithm for seizure onset detection and spread estimation in epilepsy patients. The algorithm is based on a multilevel 1-D wavelet decomposition that captures the physiological brain frequency bands, coupled with a generalized Gaussian model. Preliminary experiments with signals from 30 seizures in 11 subjects suggest that the proposed methodology is a powerful tool for detecting the onset of epileptic seizures and their spread across the brain.
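
    The feature-extraction step can be sketched as a wavelet decomposition followed by a generalized Gaussian fit per sub-band. The wavelet family, decomposition level and use of these features downstream are assumptions for illustration.

```python
import pywt
from scipy.stats import gennorm

def band_features(eeg_channel, wavelet="db4", level=5):
    """Decompose one EEG channel into wavelet sub-bands and fit a generalized
    Gaussian to each band's coefficients; the (shape, scale) pairs can then
    feed a seizure-onset detector."""
    coeffs = pywt.wavedec(eeg_channel, wavelet, level=level)
    features = []
    for band in coeffs:
        beta, loc, scale = gennorm.fit(band, floc=0.0)   # shape, fixed location, scale
        features.append((beta, scale))
    return features
```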

  5. An effective detection algorithm for region duplication forgery in digital images

    NASA Astrophysics Data System (ADS)

    Yavuz, Fatih; Bal, Abdullah; Cukur, Huseyin

    2016-04-01

    Powerful image editing tools are very common and easy to use these days. This situation facilitates forgeries in which information is added to or removed from digital images. In order to detect such forgeries, in particular region duplication, we present an effective algorithm based on fixed-size block computation and the discrete wavelet transform (DWT). In this approach, the original image is divided into fixed-size blocks, and the wavelet transform is applied for dimension reduction. Each block is then processed by the Fourier transform and represented by circular regions. Four features are extracted from each block. Finally, the feature vectors are lexicographically sorted, and duplicated image blocks are detected according to the comparison metric results. The experimental results show that the proposed algorithm is computationally efficient owing to its fixed-size circular block architecture.

  6. Parallel computation safety analysis irradiation targets fission product molybdenum in neutronic aspect using the successive over-relaxation algorithm

    SciTech Connect

    Susmikanti, Mike; Dewayatna, Winter; Sulistyo, Yos

    2014-09-30

    One of the research activities in support of the commercial radioisotope production program is safety research on FPM (Fission Product Molybdenum) target irradiation. FPM targets are tubes made of stainless steel which contain nuclear-grade high-enrichment uranium, and the irradiation is intended to produce fission products. Fission products such as Mo-99 are widely used in the form of kits in the medical world. The neutronics problem is solved using first-order perturbation theory derived from the four-group diffusion equation. Mo isotopes have relatively long half-lives, about 3 days (66 hours), so the delivery of radioisotopes to consumer centers and storage is possible though still limited, and the production of this isotope potentially gives significant economic value. The criticality and flux in the multigroup diffusion model were calculated for various irradiation positions and uranium contents. This model involves complex computation with a large, sparse matrix system, and several parallel algorithms have been developed for solving such sparse, large matrices. In this paper, a successive over-relaxation (SOR) algorithm was implemented for the calculation of reactivity coefficients, which can be done in parallel. Previous works performed reactivity calculations serially with Gauss-Seidel iterations. The parallel method can be used to solve the multigroup diffusion equation system and calculate the criticality and reactivity coefficients. In this research a computer code was developed to exploit parallel processing for reactivity calculations to be used in safety analysis. Parallel processing on a multicore computer system allows the calculation to be performed more quickly. The code was applied to the safety limit calculation of irradiated FPM targets containing highly enriched uranium. The neutronic calculation results show that for uranium contents of 1.7676 g and 6.1866 g (× 10^6 cm^-1) in a tube, their delta
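
    The serial form of the SOR iteration at the heart of the study is sketched below (the parallel, multicore decomposition and the neutronics model are not reproduced).

```python
import numpy as np

def sor_solve(A, b, omega=1.5, tol=1e-8, max_iter=10_000):
    """Successive over-relaxation for A x = b: each sweep blends the
    Gauss-Seidel update with the previous iterate through the relaxation
    factor omega (1 < omega < 2 over-relaxes)."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x
```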

  7. Flow-batch technique for the simultaneous enzymatic determination of levodopa and carbidopa in pharmaceuticals using PLS and successive projections algorithm.

    PubMed

    Grünhut, Marcos; Centurión, María E; Fragoso, Wallace D; Almeida, Luciano F; de Araújo, Mário C U; Fernández Band, Beatriz S

    2008-05-30

    An enzymatic flow-batch system with spectrophotometric detection was developed for the simultaneous determination of levodopa [(S)-2-amino-3-(3,4-dihydroxyphenyl)propionic acid] and carbidopa [(S)-3-(3,4-dihydroxyphenyl)-2-hydrazino-2-methylpropionic acid] in pharmaceutical preparations. The data were analysed by a univariate method, partial least squares (PLS) and a novel variable selection technique for multiple linear regression (MLR), the successive projections algorithm (SPA). The enzyme polyphenol oxidase (PPO; EC 1.14.18.1), obtained from Ipomoea batatas (L.) Lam., was used to oxidize both analytes to their respective dopaquinones, which show strong absorption between 295 and 540 nm. The statistical parameters (RMSE and correlation coefficient) calculated after applying PLS in the spectral region between 295 and 540 nm and MLR-SPA were appropriate for levodopa and carbidopa. A comparative study of the univariate, PLS (over different spectral ranges) and MLR-SPA chemometric models was carried out by applying the elliptical joint confidence region (EJCR) test. The results were satisfactory for PLS in the spectral region between 295 and 540 nm and for MLR-SPA. Tablets of commercial samples were analysed and the results obtained are in close agreement with both the spectrophotometric and HPLC pharmacopeia methods. The sample throughput was 18 h^-1.
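
    The variable-selection idea of SPA can be sketched as below: each step picks the wavelength whose projection onto the orthogonal complement of the already-selected set is largest, which minimizes collinearity. The coupling with the MLR calibration and the choice of the starting wavelength are not shown.

```python
import numpy as np

def successive_projections(X, n_vars, start=0):
    """Return the column indices of X (spectra x wavelengths) selected by a
    basic successive projections algorithm, beginning from column `start`."""
    X = np.asarray(X, float)
    selected = [start]
    residual = X.copy()
    for _ in range(n_vars - 1):
        v = residual[:, selected[-1]]
        # project every column onto the orthogonal complement of v
        residual = residual - np.outer(v, v @ residual) / (v @ v)
        norms = np.linalg.norm(residual, axis=0)
        norms[selected] = -1.0                   # never reselect a chosen wavelength
        selected.append(int(np.argmax(norms)))
    return selected
```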

  8. Dramatyping: a generic algorithm for detecting reasonable temporal correlations between drug administration and lab value alterations

    PubMed Central

    2016-01-01

    According to the World Health Organization, one of the criteria for the standardized assessment of case causality in adverse drug reactions is the temporal relationship between the intake of a drug and the occurrence of a reaction or a laboratory test abnormality. This article presents and describes an algorithm for the detection of a reasonable temporal correlation between the administration of a drug and the alteration of a laboratory value course. The algorithm is designed to process normalized lab values and is therefore universally applicable. It has a sensitivity of 0.932 for the detection of lab value courses that show changes in temporal correlation with the administration of a drug and it has a specificity of 0.967 for the detection of lab value courses that show no changes. Therefore, the algorithm is appropriate to screen the data of electronic health records and to support human experts in revealing adverse drug reactions. A reference implementation in Python programming language is available. PMID:27042396

  9. Combined evolutionary algorithm and minimum classification error training for DHMM-based land mine detection

    NASA Astrophysics Data System (ADS)

    Zhao, Yunxin; Chen, Ping; Gader, Paul D.; Zhang, Yue

    2002-08-01

    Minimum classification error (MCE) training is proposed to improve performance of a discrete hidden Markov model (DHMM) based landmine detection system. The system (baseline) was proposed previously for detection of both metal and nonmetal mines from ground penetrating radar signatures collected by moving vehicles. An initial DHMM model is trained by conventional methods of vector quantization and Baum-Welch algorithm. A sequential generalized probabilistic descent (GPD) algorithm that minimizes an empirical loss function is then used to estimate the landmine/background DHMM parameters, and an evolutionary algorithm based on fitness score of classification accuracy is used to generate and select codebooks. The landmine data of one geographical site was used for model training, and those of two different sites were used for evaluation of system performance. Three scenarios were studied: apply MCE/GPD alone to DHMM estimation, apply EA alone to codebook generation, first apply EA to codebook generation and then apply MCE/GPD to DHMM estimation. Overall, the combined EA and MCE/GPD training led to the best performance. At the same level of detection rate as the baseline DHMM system, the proposed training reduced false alarm rate by a factor of two, indicating significant performance improvement.

  10. Automatic Mexico Gulf Oil Spill Detection from Radarsat-2 SAR Satellite Data Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Marghany, Maged

    2016-10-01

    In this work, a genetic algorithm is exploited for the automatic detection of oil spills of both small and large size. The procedure is applied to RADARSAT-2 SAR ScanSAR Narrow single-beam data acquired over the Gulf of Mexico. The study shows that the genetic algorithm automatically segmented the dark spot patches related to small and large oil spill pixels. This conclusion is confirmed by the receiver operating characteristic (ROC) curve and documented ground data. The ROC curve indicates that oil slick footprints can be identified with an area of 90% between the ROC curve and the no-discrimination line, which is greater than that of other surrounding environmental features. Small oil spills represented 30% of the discriminated oil spill pixels in the ROC analysis. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills of either small or large size, and the ScanSAR Narrow single-beam mode serves as an excellent sensor for oil spill pattern detection and surveying in the Gulf of Mexico.

  11. Optimal Parameter Exploration for Online Change-Point Detection in Activity Monitoring Using Genetic Algorithms

    PubMed Central

    Khan, Naveed; McClean, Sally; Zhang, Shuai; Nugent, Chris

    2016-01-01

    In recent years, smart phones with inbuilt sensors have become popular devices to facilitate activity recognition. The sensors capture a large amount of data, containing meaningful events, in a short period of time. The change points in this data are used to specify transitions to distinct events and can be used in various scenarios such as identifying change in a patient’s vital signs in the medical domain or requesting activity labels for generating real-world labeled activity datasets. Our work focuses on change-point detection to identify a transition from one activity to another. Within this paper, we extend our previous work on multivariate exponentially weighted moving average (MEWMA) algorithm by using a genetic algorithm (GA) to identify the optimal set of parameters for online change-point detection. The proposed technique finds the maximum accuracy and F_measure by optimizing the different parameters of the MEWMA, which subsequently identifies the exact location of the change point from an existing activity to a new one. Optimal parameter selection facilitates an algorithm to detect accurate change points and minimize false alarms. Results have been evaluated based on two real datasets of accelerometer data collected from a set of different activities from two users, with a high degree of accuracy from 99.4% to 99.8% and F_measure of up to 66.7%. PMID:27792177
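
    The MEWMA statistic that the GA tunes can be sketched as below; here the in-control mean and covariance are simply estimated from the stream itself, and the smoothing constant is only an example value.

```python
import numpy as np

def mewma_statistic(X, lam=0.2):
    """Return the T^2_i monitoring statistic of a multivariate EWMA for each
    sample of X (samples x features); a change point is flagged where T^2
    exceeds a control limit."""
    X = np.asarray(X, float)
    mu, sigma = X.mean(axis=0), np.cov(X, rowvar=False)
    n, p = X.shape
    z, t2 = np.zeros(p), np.zeros(n)
    for i in range(n):
        z = lam * (X[i] - mu) + (1 - lam) * z
        # covariance of the EWMA vector after i+1 updates
        factor = lam * (1 - (1 - lam) ** (2 * (i + 1))) / (2 - lam)
        t2[i] = z @ np.linalg.pinv(factor * sigma) @ z
    return t2
```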

  12. A Universal Fast Algorithm for Sensitivity-Based Structural Damage Detection

    PubMed Central

    Yang, Q. W.; Liu, J. K.; Li, C. H.; Liang, C. F.

    2013-01-01

    Structural damage detection using measured response data has emerged as a new research area in civil, mechanical, and aerospace engineering communities in recent years. In this paper, a universal fast algorithm is presented for sensitivity-based structural damage detection, which can quickly improve the calculation accuracy of the existing sensitivity-based technique without any high-order sensitivity analysis or multi-iterations. The key formula of the universal fast algorithm is derived from the stiffness and flexibility matrix spectral decomposition theory. With the introduction of the key formula, the proposed method is able to quickly achieve more accurate results than that obtained by the original sensitivity-based methods, regardless of whether the damage is small or large. Three examples are used to demonstrate the feasibility and superiority of the proposed method. It has been shown that the universal fast algorithm is simple to implement and quickly gains higher accuracy over the existing sensitivity-based damage detection methods. PMID:24453815

  13. Envelope analysis with a genetic algorithm-based adaptive filter bank for bearing fault detection.

    PubMed

    Kang, Myeongsu; Kim, Jaeyoung; Choi, Byeong-Keun; Kim, Jong-Myon

    2015-07-01

    This paper proposes a fault detection methodology for bearings using envelope analysis with a genetic algorithm (GA)-based adaptive filter bank. Although a bandpass filter cooperates with envelope analysis for early identification of bearing defects, no general consensus has been reached as to which passband is optimal. This study explores the impact of various passbands specified by the GA in terms of a residual frequency components-to-defect frequency components ratio, which evaluates the degree of defectiveness in bearings and finally outputs an optimal passband for reliable bearing fault detection.
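
    Envelope analysis with a fixed pass band can be sketched as below; in the paper the pass band itself is chosen by the genetic algorithm, whereas here it is simply an input, and the filter order is an assumption.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope_spectrum(x, fs, band):
    """Band-pass the vibration signal, take the Hilbert envelope and return
    its spectrum; peaks at the bearing defect frequencies indicate a fault."""
    lo, hi = band
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, x)
    envelope = np.abs(hilbert(filtered))
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum
```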

  14. Study of Host-Based Cyber Attack Precursor Symptom Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Song, Jae-Gu; Kim, Jong Hyun; Seo, Dongil; Soh, Wooyoung; Kim, Seoksoo

    Botnet-based cyber attacks cause large-scale damage with increasingly intelligent tools, which has called for varied research on bot detection. In this study, we developed a method of monitoring the behavior of host-based processes from the point at which a bot herder attempts to create zombie PCs, thereby detecting cyber attack precursor symptoms. We designed an algorithm that identifies the characteristics of a botnet attempting to launch malicious behavior by means of signature registration, covering process/reputation/network traffic/packet/source analysis and a white list, as a measure to respond to bots at the end point.

  15. New detection algorithm for dim point moving target in IR-image sequence based on an image frames transformation

    NASA Astrophysics Data System (ADS)

    Mohamed, M. A.; Li, Hongzuo

    2013-09-01

    In this paper we follow the track-before-detect (TBD) concept to perform simple, fast and adaptive detection and tracking of dim, pixel-sized moving targets in an IR image sequence. We present two new algorithms based on an image-frame transformation: the first is a recursive algorithm that measures the image background baseline, which helps in assigning an adaptive threshold, while the second is an adaptive recursive statistical spatio-temporal algorithm for detecting and tracking the target. Applying the proposed algorithms to a set of frames containing a single-pixel target in linear motion shows high efficiency and validity in detecting the motion and in measuring the background baseline.

  16. A Fast Inspection of Tool Electrode and Drilling Depth in EDM Drilling by Detection Line Algorithm

    PubMed Central

    Huang, Kuo-Yi

    2008-01-01

    The purpose of this study was to develop a novel measurement method using a machine vision system. Besides using image processing techniques, the proposed system employs a detection line algorithm that detects the tool electrode length and drilling depth of a workpiece accurately and effectively. Different boundaries of areas on the tool electrode are defined: a baseline between base and normal areas, a ND-line between normal and drilling areas (accumulating carbon area), and a DD-line between drilling area and dielectric fluid droplet on the electrode tip. Accordingly, image processing techniques are employed to extract a tool electrode image, and the centroid, eigenvector, and principal axis of the tool electrode are determined. The developed detection line algorithm (DLA) is then used to detect the baseline, ND-line, and DD-line along the direction of the principal axis. Finally, the tool electrode length and drilling depth of the workpiece are estimated via the detected baseline, ND-line, and DD-line. Experimental results show good accuracy and efficiency in estimation of the tool electrode length and drilling depth under different conditions. Hence, this research may provide a reference for industrial application in EDM drilling measurement. PMID:27873790

  17. A Fast Inspection of Tool Electrode and Drilling Depth in EDM Drilling by Detection Line Algorithm.

    PubMed

    Huang, Kuo-Yi

    2008-08-21

    The purpose of this study was to develop a novel measurement method using a machine vision system. Besides using image processing techniques, the proposed system employs a detection line algorithm that detects the tool electrode length and drilling depth of a workpiece accurately and effectively. Different boundaries of areas on the tool electrode are defined: a baseline between base and normal areas, a ND-line between normal and drilling areas (accumulating carbon area), and a DD-line between drilling area and dielectric fluid droplet on the electrode tip. Accordingly, image processing techniques are employed to extract a tool electrode image, and the centroid, eigenvector, and principal axis of the tool electrode are determined. The developed detection line algorithm (DLA) is then used to detect the baseline, ND-line, and DD-line along the direction of the principal axis. Finally, the tool electrode length and drilling depth of the workpiece are estimated via the detected baseline, ND-line, and DD-line. Experimental results show good accuracy and efficiency in estimation of the tool electrode length and drilling depth under different conditions. Hence, this research may provide a reference for industrial application in EDM drilling measurement.

  18. Research on conflict detection algorithm in 3D visualization environment of urban rail transit line

    NASA Astrophysics Data System (ADS)

    Wang, Li; Xiong, Jing; You, Kuokuo

    2017-03-01

    In this paper, a collision detection method is introduced, and three-dimensional modeling of underground buildings and urban rail lines is used to rapidly extract the buildings that conflict with the track area in a 3D visualization environment. According to the characteristics of the buildings, they are modeled with a combination of CSG and B-rep. On the basis of these modeling characteristics, this paper proposes using an AABB bounding volume hierarchy for a first, coarse conflict test to improve detection efficiency, followed by a fast triangle intersection algorithm to confirm the conflict and finally determine whether a building collides with the track area. With the proposed algorithm, buildings colliding with the influence area of the track line can be extracted quickly, supporting line design, route selection and land acquisition cost estimation in the 3D visualization environment.
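
    The cheap broad-phase test at the core of the first stage is a per-axis interval overlap check between axis-aligned bounding boxes, sketched below; only candidates passing it go on to the exact triangle-triangle intersection stage.

```python
def aabb_overlap(box_a, box_b):
    """Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)); two AABBs
    overlap only if their extents overlap on every axis."""
    (axmin, aymin, azmin), (axmax, aymax, azmax) = box_a
    (bxmin, bymin, bzmin), (bxmax, bymax, bzmax) = box_b
    return (axmin <= bxmax and bxmin <= axmax and
            aymin <= bymax and bymin <= aymax and
            azmin <= bzmax and bzmin <= azmax)
```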

  19. A Hybrid Spectral Clustering and Deep Neural Network Ensemble Algorithm for Intrusion Detection in Sensor Networks

    PubMed Central

    Ma, Tao; Wang, Fen; Cheng, Jianjun; Yu, Yang; Chen, Xiaoyun

    2016-01-01

    The development of intrusion detection systems (IDS) that are adapted to allow routers and network defence systems to detect malicious network traffic disguised as network protocols or normal access is a critical challenge. This paper proposes a novel approach called SCDNN, which combines spectral clustering (SC) and deep neural network (DNN) algorithms. First, the dataset is divided into k subsets based on sample similarity using cluster centres, as in SC. Next, the distance between data points in a testing set and the training set is measured based on similarity features and is fed into the deep neural network algorithm for intrusion detection. Six KDD-Cup99 and NSL-KDD datasets and a sensor network dataset were employed to test the performance of the model. These experimental results indicate that the SCDNN classifier not only performs better than backpropagation neural network (BPNN), support vector machine (SVM), random forest (RF) and Bayes tree models in detection accuracy and in the types of abnormal attacks found, but also provides an effective tool for the study and analysis of intrusion detection in large networks. PMID:27754380

  20. An earthquake detection algorithm with pseudo-probabilities of multiple indicators

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.

    2014-04-01

    We develop an automatic earthquake detection algorithm combining information from numerous indicator variables in a non-parametric framework. The method is shown to perform well with multiple ratios of moving short- and long-time averages having ranges of time intervals and frequency bands. The results from each indicator are transformed to a pseudo-probability time-series (PPTS) in the range [0, 1]. The various PPTS of the different indicators are multiplied to form a single joint PPTS that is used for detections. Since all information is combined, redundancy among the different indicators produces robust peaks in the output. This allows the trigger threshold applied to the joint PPTS to be significantly lower than for any one detector, leading to substantially more detected earthquakes. Application of the algorithm to a small data set recorded during a 7-d window by 13 stations near the San Jacinto fault zone detects 3.13 times as many earthquakes as listed in the Southern California Seismic Network catalogue. The method provides a convenient statistical platform for including other indicators, and may utilize different sets of indicators to detect other information such as specific seismic phases or tremor.
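
    A sketch of the combination idea follows: each indicator (here a single STA/LTA ratio) is mapped to [0, 1] and the resulting pseudo-probability traces are multiplied. The min-max scaling used for the mapping is one simple choice and is not necessarily the transformation used in the paper.

```python
import numpy as np

def sta_lta(x, fs, sta_s=0.5, lta_s=10.0):
    """Ratio of short-term to long-term average of the squared signal."""
    sq = np.asarray(x, float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(sq, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(sq, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)

def joint_pseudo_probability(indicators):
    """Rescale each indicator trace to [0, 1] and multiply them, so that only
    features seen by several indicators survive in the joint trace."""
    joint = np.ones_like(indicators[0], dtype=float)
    for ind in indicators:
        lo, hi = np.min(ind), np.max(ind)
        joint *= (ind - lo) / (hi - lo + 1e-12)
    return joint
```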

  1. Target detection in diagnostic ultrasound: Evaluation of a method based on the CLEAN algorithm.

    PubMed

    Masoom, Hassan; Adve, Raviraj S; Cobbold, Richard S C

    2013-02-01

    A technique is proposed for the detection of abnormalities (targets) in ultrasound images using little or no a priori information and requiring little operator intervention. The scheme is a combination of the CLEAN algorithm, originally proposed for radio astronomy, and constant false alarm rate (CFAR) processing, as developed for use in radar systems. The CLEAN algorithm identifies areas in the ultrasound image that stand out above a threshold in relation to the background; CFAR techniques allow for an adaptive, semi-automated selection of the threshold. Neither appears to have been previously used for target detection in ultrasound images, and never together in any context. As a first step towards assessing the potential of this method we used a widely adopted B-mode image simulation package (Field II). We assumed the use of a 256-element linear array operating at 3.0 MHz into a water-like medium containing a density of point scatterers sufficient to simulate a background of fully developed speckle. Spherical targets with diameters ranging from 0.25 to 6.0 mm and contrasts ranging from 0 to 12 dB relative to the background were used as test objects. Using a contrast-detail analysis, the probability of detection curves indicate these targets can be consistently detected within a speckle background. Our results indicate that the method has considerable promise for the semi-automated detection of abnormalities with diameters greater than a few millimeters, depending on the contrast.
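
    The adaptive thresholding step can be sketched as a one-dimensional cell-averaging CFAR applied along an image line, as below; the guard/training sizes and the square-law noise model behind the scaling factor are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def ca_cfar_1d(x, guard=2, train=16, pfa=1e-3):
    """Cell-averaging CFAR: the noise level at each cell is the mean of the
    surrounding training cells (guard cells excluded), and the threshold
    scales that estimate to hold the chosen false alarm probability."""
    x = np.asarray(x, float)
    n_train = 2 * train
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)   # scaling for exponential noise
    detections = np.zeros_like(x, dtype=bool)
    for i in range(train + guard, len(x) - train - guard):
        leading = x[i - train - guard:i - guard]
        trailing = x[i + guard + 1:i + guard + 1 + train]
        noise = np.mean(np.concatenate([leading, trailing]))
        detections[i] = x[i] > alpha * noise
    return detections
```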

  2. A Hybrid Spectral Clustering and Deep Neural Network Ensemble Algorithm for Intrusion Detection in Sensor Networks.

    PubMed

    Ma, Tao; Wang, Fen; Cheng, Jianjun; Yu, Yang; Chen, Xiaoyun

    2016-10-13

    The development of intrusion detection systems (IDS) that are adapted to allow routers and network defence systems to detect malicious network traffic disguised as network protocols or normal access is a critical challenge. This paper proposes a novel approach called SCDNN, which combines spectral clustering (SC) and deep neural network (DNN) algorithms. First, the dataset is divided into k subsets based on sample similarity using cluster centres, as in SC. Next, the distance between data points in a testing set and the training set is measured based on similarity features and is fed into the deep neural network algorithm for intrusion detection. Six KDD-Cup99 and NSL-KDD datasets and a sensor network dataset were employed to test the performance of the model. These experimental results indicate that the SCDNN classifier not only performs better than backpropagation neural network (BPNN), support vector machine (SVM), random forest (RF) and Bayes tree models in detection accuracy and in the types of abnormal attacks found, but also provides an effective tool for the study and analysis of intrusion detection in large networks.

  3. Advances in multiplex PCR: balancing primer efficiencies and improving detection success

    PubMed Central

    Sint, Daniela; Raso, Lorna; Traugott, Michael

    2012-01-01

    1. Multiplex PCR is a valuable tool in many biological studies but it is a multifaceted procedure that has to be planned and optimised thoroughly to achieve robust and meaningful results. In particular, primer concentrations have to be adjusted to assure an even amplification of all targeted DNA fragments. Until now, total DNA extracts were used for balancing primer efficiencies; however, the applicability for comparisons between taxa or different multiple-copy genes was limited owing to the unknown number of template molecules present per total DNA. 2. Based on a multiplex system developed to track trophic interactions in high Alpine arthropods, we demonstrate a fast and easy way of generating standardised DNA templates. These were then used to balance the amplification success for the different targets and to subsequently determine the sensitivity of each primer pair in the multiplex PCR. 3. In the current multiplex assay, this approach led to an even amplification success for all seven targeted DNA fragments. Using this balanced multiplex PCR, methodological bias owing to variation in primer efficiency will be avoided when analysing field-derived samples. 4. The approach outlined here allows comparison of multiplex PCR sensitivity, independent of the investigated species, genome size or the targeted genes. The application of standardised DNA templates not only makes it possible to optimise primer efficiency within a given multiplex PCR, but it also makes it possible to adjust and/or compare the sensitivity of different assays. Along with other factors that influence the success of multiplex reactions, and which we discuss here in relation to the presented detection system, the adoption of this approach will allow for direct comparison of multiplex PCR data between systems and studies, enhancing the utility of this assay type. PMID:23549328

  4. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    PubMed Central

    Karlsson, Jonny; Dooley, Laurence S.; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide-scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which, in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability. PMID:22247657
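
    A minimal sketch of the underlying plausibility check follows: if the measured per-hop traversal time of a route implies a link far longer than the radio range, the route is flagged as a possible wormhole. The delay model, radio range and safety margin are illustrative assumptions, not the TTHCA parameters.

        # Hypothetical per-hop traversal-time check in the spirit of TTHCA.
        SPEED_OF_LIGHT_M_PER_S = 3.0e8
        MAX_RADIO_RANGE_M = 250.0          # assumed per-hop radio range
        PER_HOP_PROCESSING_S = 2.0e-3      # assumed queuing/processing delay per hop

        def suspicious_route(round_trip_time_s: float, hop_count: int) -> bool:
            """Return True if the per-hop traversal time exceeds what one hop of at
            most MAX_RADIO_RANGE_M plus processing delay could plausibly explain."""
            per_hop_time = round_trip_time_s / (2 * hop_count)
            max_expected = MAX_RADIO_RANGE_M / SPEED_OF_LIGHT_M_PER_S + PER_HOP_PROCESSING_S
            return per_hop_time > 1.5 * max_expected   # 1.5 = assumed safety margin

        # A 4-hop route with a 60 ms RTT: 7.5 ms per hop, far above the ~2 ms expected.
        print(suspicious_route(0.060, 4))   # True -> candidate wormhole route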

  5. A New MANET wormhole detection algorithm based on traversal time and hop count analysis.

    PubMed

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide-scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which, in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.

  6. Benchmark for Peak Detection Algorithms in Fiber Bragg Grating Interrogation and a New Neural Network for its Performance Improvement

    PubMed Central

    Negri, Lucas; Nied, Ademir; Kalinowski, Hypolito; Paterno, Aleksander

    2011-01-01

    This paper presents a benchmark for peak detection algorithms employed in fiber Bragg grating spectrometric interrogation systems. The accuracy, precision, and computational performance of currently used algorithms and those of a new proposed artificial neural network algorithm are compared. Centroid and Gaussian fitting algorithms are shown to have the highest precision but produce systematic errors that depend on the FBG refractive index modulation profile. The proposed neural network displays relatively good precision with reduced systematic errors and improved computational performance when compared to other networks. Additionally, suitable algorithms may be chosen with the general guidelines presented. PMID:22163806
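
    For reference, the centroid (centre-of-gravity) estimator mentioned above can be sketched in a few lines; the 50% threshold fraction used here is an illustrative assumption.

        # Sketch of a centroid Bragg-wavelength estimator for an FBG spectrum.
        import numpy as np

        def centroid_peak(wavelengths_nm, intensities, threshold_fraction=0.5):
            """Intensity-weighted centroid of the samples above a fraction of the maximum."""
            mask = intensities >= threshold_fraction * intensities.max()
            w, p = wavelengths_nm[mask], intensities[mask]
            return float(np.sum(w * p) / np.sum(p))

        # Synthetic Gaussian-shaped FBG reflection centred at 1550.10 nm.
        wl = np.linspace(1549.5, 1550.7, 601)
        spectrum = np.exp(-((wl - 1550.10) / 0.05) ** 2)
        print(centroid_peak(wl, spectrum))   # close to 1550.10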

  7. Comparison of algorithms for blood stain detection applied to forensic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Messinger, David W.; Mathew, Jobin J.; Dube, Roger R.

    2016-05-01

    Blood stains are among the most important types of evidence for forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Early detection of blood stains is particularly important since the blood reacts physically and chemically with air and materials over time. Accurate identification of blood remnants, including regions that might have been intentionally cleaned, is an important aspect of forensic investigation. Hyperspectral imaging might be a potential method to detect blood stains because it is non-contact and provides substantial spectral information that can be used to identify regions in a scene with trace amounts of blood. Crime scenes can be highly complex when the range of material types and conditions on which blood stains may appear is considered. Some stains are hard to detect by the unaided eye, especially if a conscious effort to clean the scene has occurred (we refer to these as "latent" blood stains). In this paper we present the initial results of a study of the use of hyperspectral imaging algorithms for blood detection in complex scenes. We describe a hyperspectral imaging system which generates images covering the 400-700 nm visible range with a spectral resolution of 10 nm. Three image sets of 31 wavelength bands were generated using this camera for a simulated indoor crime scene in which blood stains were placed on a T-shirt and walls. To detect blood stains in the scene, Principal Component Analysis (PCA), Subspace Reed Xiaoli Detection (SRXD), and Topological Anomaly Detection (TAD) algorithms were used. Comparison of the three hyperspectral image analysis techniques shows that TAD is most suitable for detecting blood stains and discovering latent blood stains.
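
    As a simplified stand-in for the detectors listed above, the sketch below implements the classic global Reed-Xiaoli (RX) anomaly detector, which scores each pixel by its Mahalanobis distance from the scene statistics; it is not the subspace (SRXD) or TAD variant used in the study.

        # Global RX anomaly scores for a (rows, cols, bands) hyperspectral cube.
        import numpy as np

        def rx_scores(cube):
            rows, cols, bands = cube.shape
            pixels = cube.reshape(-1, bands).astype(float)
            mean = pixels.mean(axis=0)
            cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))
            centered = pixels - mean
            scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
            return scores.reshape(rows, cols)

        # Pixels scoring above a high percentile are candidate anomalies for inspection.
        rng = np.random.default_rng(2)
        cube = rng.normal(size=(50, 50, 31))
        cube[25, 25] += 8.0                      # synthetic spectral anomaly
        scores = rx_scores(cube)
        print(scores[25, 25] > np.percentile(scores, 99.9))   # True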

  8. Breadth-First Search-Based Single-Phase Algorithms for Bridge Detection in Wireless Sensor Networks

    PubMed Central

    Akram, Vahid Khalilpour; Dagdeviren, Orhan

    2013-01-01

    Wireless sensor networks (WSNs) are promising technologies for exploring harsh environments, such as oceans, wild forests, volcanic regions and outer space. Since sensor nodes may have limited transmission range, application packets may be transmitted by multi-hop communication. Thus, connectivity is a very important issue. A bridge is a critical edge whose removal breaks the connectivity of the network. Hence, it is crucial to detect bridges and take preventions. Since sensor nodes are battery-powered, services running on nodes should consume low energy. In this paper, we propose energy-efficient and distributed bridge detection algorithms for WSNs. Our algorithms run in a single phase and are integrated with the Breadth-First Search (BFS) algorithm, which is a popular routing algorithm. Our first algorithm is an extended version of Milic's algorithm, which is designed to reduce the message length. Our second algorithm is novel and uses ancestral knowledge to detect bridges. We explain the operation of the algorithms, prove their correctness, and analyze their message, time, space and computational complexities. To evaluate practical importance, we provide testbed experiments and extensive simulations. We show that our proposed algorithms provide less resource consumption, and the energy savings of our algorithms reach up to 5.5 times. PMID:23845930

  9. A two-scale algorithm for detection of multiple flaws in structures modeled with XFEM

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Waisman, Haim; Betti, Raimondo

    2014-03-01

    This paper presents a novel algorithm for detection of multiple flaws in structures as an inverse process, where the forward problem is based on the eXtended Finite Element Method (XFEM). The proposed algorithm can be applied to quantify any flaw with arbitrary shape and size (e.g., cracks, voids, or their combination) whose number is unknown beforehand, and is shown to be significantly more efficient than other methods proposed in the literature. The basic concept is to employ a two-scale optimization framework, where first a coarse flaw region is detected and then fine scale convergence is used to zoom in on the flaw. Both optimization steps rely on a forward problem in which an XFEM model with both circular and elliptical enrichments is used. The advantage of XFEM is in the alleviation of costly remeshing techniques when candidate flaws keep updating with the optimization process. The proposed hierarchical optimizers include both heuristic and gradient-based algorithms, such as the discrete artificial bee colony (DABC) algorithm and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. In detail, the first step employs DABC optimization as a coarse-scale search where the optimizer is limited to specific solutions that correspond to locations and shapes of flaws, thus converting a continuous optimization problem into a discrete optimization with a small number of choices. The results of the first step provide local subdomains and rough identified flaw parameters, which can be considered as search space reduction and an initial guess for a fine tuning optimization step. This solution zooming is carried out by the BFGS method and leads to a fast converging method as illustrated on two benchmark detection examples.

  10. Algorithm for detecting defects in wooden logs using ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Devaru, Dayakar; Halabe, Udaya B.; Gopalakrishnan, B.; Agrawal, Sachin; Grushecky, Shawn

    2005-11-01

    Presently there are no suitable non-invasive methods for precisely detecting the subsurface defects in logs in real time. Internal defects such as knots, decays, and embedded metals are of greatest concern for lumber production. While defects such as knots and decays (rots) are of major concern related to productivity and yield of high value wood products, embedded metals can damage the saw blade and significantly increase the down time and maintenance costs of saw mills. Currently, a large number of logs end up being discarded by saw mills, or result in low value wood products since they include defects. Nondestructive scanning of logs using techniques such as Ground Penetrating Radar (GPR) prior to sawing can greatly increase the productivity and yield of high value lumber. In this research, the GPR scan data have been analyzed to differentiate the defective part of the wooden log from the good part. The location and size of each defect have been determined from the GPR scan data using a MATLAB algorithm. The output of this algorithm can be used as an input for generating operating instructions for a CNC sawing machine. This paper explains the advantages of the GPR technique, the experimental setup and parameters used, data processing using RADAN software for detection of subsurface defects in logs, GPR data processing and analysis using a MATLAB algorithm for automated defect detection, and a comparison of results between the two processing methods. The results show that GPR in conjunction with the proposed algorithm provides a very promising technique for future on-line implementation in saw mills.

  11. Flight test results of a vector-based failure detection and isolation algorithm for a redundant strapdown inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Morrell, F. R.; Bailey, M. L.; Motyka, P. R.

    1988-01-01

    Flight test results of a vector-based fault-tolerant algorithm for a redundant strapdown inertial measurement unit are presented. Because the inertial sensors provide flight-critical information for flight control and navigation, failure detection and isolation is developed in terms of a multi-level structure. Threshold compensation techniques for gyros and accelerometers, developed to enhance the sensitivity of the failure detection process to low-level failures, are presented. Four flight tests, conducted in a commercial transport type environment, were used to determine the ability of the failure detection and isolation algorithm to detect failure signals such as hard-over, null, or bias shifts. The algorithm provided timely detection and correct isolation of flight-control-level and low-level failures. The flight tests of the vector-based algorithm demonstrated its capability to provide false alarm free dual fail-operational performance for the skewed array of inertial sensors.

  12. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset (UEO)). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.

  13. Adaptive Fault Detection on Liquid Propulsion Systems with Virtual Sensors: Algorithms and Architectures

    NASA Technical Reports Server (NTRS)

    Matthews, Bryan L.; Srivastava, Ashok N.

    2010-01-01

    Prior to the launch of STS-119, NASA had completed a study of an issue in the flow control valve (FCV) in the Main Propulsion System of the Space Shuttle using an adaptive learning method known as Virtual Sensors. Virtual Sensors are a class of algorithms that estimate the value of a time series given other potentially nonlinearly correlated sensor readings. In the case presented here, the Virtual Sensors algorithm is based on an ensemble learning approach and takes sensor readings and control signals as input to estimate the pressure in a subsystem of the Main Propulsion System. Our results indicate that this method can detect faults in the FCV at the time when they occur. We use the standard deviation of the predictions of the ensemble as a measure of uncertainty in the estimate. This uncertainty estimate was crucial to understanding the nature and magnitude of transient characteristics during startup of the engine. This paper gives an overview of the Virtual Sensors algorithm, discusses results on a comprehensive set of Shuttle missions, and describes the architecture necessary for deploying such algorithms in a real-time, closed-loop system or a human-in-the-loop monitoring system. These results were presented at a Flight Readiness Review of the Space Shuttle in early 2009.
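
    A minimal sketch of the virtual-sensor idea, assuming a generic tree ensemble as the learner: the ensemble predicts one sensor from correlated inputs, the spread of the member predictions serves as the uncertainty estimate, and a residual far outside that spread flags a possible fault. The model choice, the synthetic data, and the 3-sigma rule are illustrative assumptions, not the NASA implementation.

        # Hypothetical virtual-sensor fault check using an ensemble's prediction spread.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(3)
        coeffs = np.array([1.0, -0.5, 0.3, 0.0, 0.2])
        X_train = rng.normal(size=(2000, 5))                 # correlated "other" sensors
        y_train = X_train @ coeffs + 0.05 * rng.normal(size=2000)

        virtual_sensor = RandomForestRegressor(n_estimators=30, random_state=0)
        virtual_sensor.fit(X_train, y_train)

        def possible_fault(x_row, measured, n_sigma=3.0):
            """Flag a measurement whose residual exceeds n_sigma ensemble spreads."""
            member_preds = np.array([tree.predict(x_row.reshape(1, -1))[0]
                                     for tree in virtual_sensor.estimators_])
            estimate, spread = member_preds.mean(), member_preds.std() + 1e-6
            return abs(measured - estimate) > n_sigma * spread

        x_new = rng.normal(size=5)
        print(possible_fault(x_new, measured=x_new @ coeffs))        # typically False
        print(possible_fault(x_new, measured=x_new @ coeffs + 5.0))  # True (faulty)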

  14. An Angular Momentum Eddy Detection Algorithm (AMEDA) applied to coastal eddies

    NASA Astrophysics Data System (ADS)

    Le Vu, Briac; Stegner, Alexandre; Arsouze, Thomas

    2016-04-01

    We present a new automated eddy detection and tracking algorithm based on the computation of the LNAM (Local and Normalized Angular Momentum). This method is an improvement of the previous method by Mkhinini et al. (2014), with the aim of being applicable to multiple datasets (satellite data, numerical models, laboratory experiments) using as few objective criteria as possible. First, we show the performance of the algorithm for three different sources of data: a Mediterranean 1/8° AVISO geostrophic velocity field based on the Absolute Dynamical Topography (ADT), a ROMS idealized simulation and a high resolution velocity field derived from PIV measurements in a rotating tank experiment. All the velocity fields describe the dynamical evolution of mesoscale eddies generated by the instability of coastal currents. Then, we compare the results of the AMEDA algorithm applied to the regional 1/8° AVISO Mediterranean data set with in situ measurements (drifter, ARGO, ADCP…). These quantitative comparisons with a few specific test cases enable us to estimate the accuracy of the method in quantifying eddy features: trajectory, size and intensity. We also use the AMEDA algorithm to identify the main formation areas of long-lived eddies in the Mediterranean Sea during the last 15 years.

  15. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  16. Algorithm for lung cancer detection based on PET/CT images

    NASA Astrophysics Data System (ADS)

    Saita, Shinsuke; Ishimatsu, Keita; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Ohtsuka, Hideki; Nishitani, Hiromu; Ohmatsu, Hironobu; Eguchi, Kenji; Kaneko, Masahiro; Moriyama, Noriyuki

    2009-02-01

    The five-year survival rate of lung cancer is low, at about twenty-five percent; three out of four patients die within five years. Early detection and treatment of lung cancer are therefore important. Recently, CT and PET images can be obtained at the same time because PET/CT devices have been developed. PET/CT enables highly accurate cancer diagnosis because it combines quantitative shape information from the CT image with the FDG distribution from the PET image. However, neither benign-malignant classification nor staging of lung cancer using PET/CT images has been sufficiently established. In this study, we detect lung nodules based on internal organs extracted from the CT image, and we also develop an algorithm that classifies lung cancer as benign or malignant and as metastatic or non-metastatic using lung structure and the FDG distribution (one and two hours after administering FDG). We apply the algorithm to 59 PET/CT images (43 malignant cases [Ad: 31, Sq: 9, sm: 3] and 16 benign cases) and show the effectiveness of this algorithm.

  17. Evaluation of a Pair-Wise Conflict Detection and Resolution Algorithm in a Multiple Aircraft Scenario

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.

    2002-01-01

    The KB3D algorithm is a pairwise conflict detection and resolution (CD&R) algorithm. It detects and generates trajectory vectoring for an aircraft which has been predicted to be in an airspace minima violation within a given look-ahead time. It has been proven, using mechanized theorem proving techniques, that for a pair of aircraft, KB3D produces at least one vectoring solution and that all solutions produced are correct. Although solutions produced by the algorithm are mathematically correct, they might not be physically executable by an aircraft or might not solve multiple aircraft conflicts. This paper describes a simple solution selection method which assesses all solutions generated by KB3D and determines the solution to be executed. The solution selection method and KB3D are evaluated using a simulation in which N aircraft fly in a free-flight environment and each aircraft in the simulation uses KB3D to maintain separation. Specifically, the solution selection method filters KB3D solutions which are procedurally undesirable or physically not executable and uses predetermined criteria for selection.

  18. An improved segmentation algorithm to detect moving object in video sequences

    NASA Astrophysics Data System (ADS)

    Li, Jinkui; Sang, Xinzhu; Wang, Yongqiang; Yan, Binbin; Yu, Chongxiu

    2010-11-01

    The segmentation of moving objects in video sequences is attracting more and more attention because of its important role in various camera video applications, such as video surveillance, traffic monitoring, people tracking, and so on. Conventional segmentation algorithms can be divided into two classes. One class is based on spatial homogeneity, which produces promising output; however, the computation is too complex and heavy to be suitable for real-time applications. The other class utilizes change detection as the segmentation standard to extract the moving object. Typical approaches include frame difference, background subtraction and optical flow. A novel algorithm based on adaptive symmetrical difference and background subtraction is proposed. Firstly, the moving object mask is detected through the adaptive symmetrical difference, and the contour of the mask is extracted. And then, the adaptive background subtraction is carried out in the acquired region to extract the accurate moving object. Morphological operation and shadow cancellation are adopted to refine the result. Experimental results show that the algorithm is robust and effective in improving the segmentation accuracy.
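
    A compact sketch of the combination described above, assuming OpenCV is available: a three-frame (symmetrical) difference localizes motion and background subtraction inside that region refines the mask. The fixed thresholds and the morphological kernel size are illustrative assumptions; the paper's adaptive variants and shadow cancellation are not reproduced.

        # Symmetrical frame difference combined with background subtraction.
        import cv2
        import numpy as np

        def moving_object_mask(prev_frame, curr_frame, next_frame, background,
                               diff_thresh=25, bg_thresh=30):
            gray = lambda f: cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)
            g_prev, g_curr, g_next, g_bg = map(gray, (prev_frame, curr_frame,
                                                      next_frame, background))
            d1 = cv2.absdiff(g_curr, g_prev)
            d2 = cv2.absdiff(g_next, g_curr)
            motion = cv2.bitwise_and((d1 > diff_thresh).astype(np.uint8),
                                     (d2 > diff_thresh).astype(np.uint8))
            bg_diff = (cv2.absdiff(g_curr, g_bg) > bg_thresh).astype(np.uint8)
            mask = cv2.bitwise_and(motion, bg_diff) * 255
            kernel = np.ones((5, 5), np.uint8)
            return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    In practice the three frames would come from consecutive reads of cv2.VideoCapture, and the background frame would be updated adaptively rather than held fixed.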

  19. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-03-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.

  20. An automatic detection algorithm for extracting the representative frequency of cetacean tonal sounds.

    PubMed

    Lin, Tzu-Hao; Chou, Lien-Siang; Akamatsu, Tomonari; Chan, Hsiang-Chih; Chen, Chi-Fang

    2013-09-01

    Most studies on tonal sounds extract contour parameters from fundamental frequencies. The presence of harmonics and the frequency distribution of multiple tonal sounds have not been well researched. To investigate the occurrence and frequency modulation of cetacean tonal sounds, the procedure of detecting the instantaneous frequency bandwidth of tonal spectral peaks was integrated within the local-max detector to extract adopted frequencies. The adopted frequencies, considered the representative frequencies of tonal sounds, are used to find the presence of harmonics and overlapping tonal sounds. The utility and detection performance are demonstrated on acoustic recordings of five species from two databases. The recordings of humpback dolphins showed a 75% detection rate with a 5% false detection rate, and recordings from the MobySound archive showed an 85% detection rate with a 5% false detection rate. These detections were achieved in signal-to-noise ratios of -12 to 21 dB. The parameters that measured the distribution of adopted frequency, as well as the prominence of harmonics and overlaps, indicate that the modulation of tonal sounds varied among different species and behaviors. This algorithm can be applied to studies on cetacean communication signals and long-term passive acoustic monitoring.

  1. A Universal Dynamic Threshold Cloud Detection Algorithm (UDTCDA) supported by a prior surface reflectance database

    NASA Astrophysics Data System (ADS)

    Sun, Lin; Wei, Jing; Wang, Jian; Mi, Xueting; Guo, Yamin; Lv, Yang; Yang, Yikun; Gan, Ping; Zhou, Xueying; Jia, Chen; Tian, Xinpeng

    2016-06-01

    Conventional cloud detection methods are easily affected by mixed pixels, complex surface structures, and atmospheric factors, resulting in poor cloud detection results. To minimize these problems, a new Universal Dynamic Threshold Cloud Detection Algorithm (UDTCDA) supported by an a priori surface reflectance database is proposed in this paper. A monthly surface reflectance database is constructed using the long-time-sequenced MODerate resolution Imaging Spectroradiometer surface reflectance product (MOD09A1) to provide the surface reflectance of the underlying surfaces. The relationships between the apparent reflectance changes and the surface reflectance are simulated under different observation and atmospheric conditions with the 6S (Second Simulation of the Satellite Signal in the Solar Spectrum) model, and the dynamic threshold cloud detection models are developed. Two typical remote sensing data sets with important application significance and different sensor parameters, MODIS and Landsat 8, are selected for cloud detection experiments. The results were validated against the visual interpretation of clouds and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation cloud measurements. The results showed that the UDTCDA can obtain a high precision in cloud detection, correctly identifying cloudy pixels and clear-sky pixels at rates greater than 80% with error rate and missing rate of less than 20%. The UDTCDA cloud product overall shows less estimation uncertainty than the current MODIS cloud mask products. Moreover, the UDTCDA can effectively reduce the effects of atmospheric factors and mixed pixels and can be applied to different satellite sensors to realize long-term, large-scale cloud detection operations.

  2. An efficient technique for nuclei segmentation based on ellipse descriptor analysis and improved seed detection algorithm.

    PubMed

    Xu, Hongming; Lu, Cheng; Mandal, Mrinal

    2014-09-01

    In this paper, we propose an efficient method for segmenting cell nuclei in skin histopathological images. The proposed technique consists of four modules. First, it separates the nuclei regions from the background with an adaptive threshold technique. Next, an elliptical descriptor is used to detect the isolated nuclei with elliptical shapes. This descriptor classifies the nuclei regions based on two ellipticity parameters. Nuclei clumps and nuclei with irregular shapes are then localized by an improved seed detection technique based on voting in the eroded nuclei regions. Finally, undivided nuclei regions are segmented by a marked watershed algorithm. Experimental results on 114 different image patches indicate that the proposed technique provides superior performance in nuclei detection and segmentation.
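
    The pipeline above can be approximated with standard scikit-image building blocks, as in the sketch below. Adaptive thresholding, distance-transform peaks as seeds, and a watershed stand in for the paper's ellipse-descriptor and voting-based seed detection steps; the block size and seed spacing are illustrative assumptions.

        # Simplified nuclei segmentation: adaptive threshold -> seeds -> watershed.
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_local
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def segment_nuclei(gray_image, block_size=51, min_seed_distance=10):
            # Nuclei are assumed darker than the background; flip the sign if not.
            binary = gray_image < threshold_local(gray_image, block_size)
            labeled_fg, _ = ndi.label(binary)
            distance = ndi.distance_transform_edt(binary)
            peaks = peak_local_max(distance, min_distance=min_seed_distance,
                                   labels=labeled_fg)
            seed_mask = np.zeros(binary.shape, dtype=bool)
            seed_mask[tuple(peaks.T)] = True
            markers, _ = ndi.label(seed_mask)
            return watershed(-distance, markers, mask=binary)   # labeled nuclei image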

  3. High effective algorithm of the detection and identification of substance using the noisy reflected THz pulse

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Varentsova, Svetlana A.; Trofimov, Vladislav V.; Tikhomirov, Vasily V.

    2015-08-01

    Principal limitations of the standard THz-TDS method for detection and identification are demonstrated under real conditions (at a long distance of about 3.5 m and at a high relative humidity of more than 50%) using neutral substances: a thick paper bag, paper napkins and chocolate. We also show that the THz-TDS method detects spectral features of dangerous substances even if the THz signals were measured under laboratory conditions (at a distance of 30-40 cm from the receiver and at a low relative humidity of less than 2%); silicon-based semiconductors were used as the samples. However, the integral correlation criteria, based on the SDA method, allow us to detect the absence of dangerous substances in the neutral substances. The discussed algorithm shows a high probability of substance identification and can be reliably realized in practice, especially for security applications and non-destructive testing.

  4. ECG Based Heart Arrhythmia Detection Using Wavelet Coherence and Bat Algorithm

    NASA Astrophysics Data System (ADS)

    Kora, Padmavathi; Sri Rama Krishna, K.

    2016-12-01

    Atrial fibrillation (AF) is a type of heart abnormality; during AF, electrical discharges in the atrium are rapid, resulting in an abnormal heart beat. The morphology of the ECG changes due to abnormalities in the heart. This paper consists of three major steps for the detection of heart diseases: signal pre-processing, feature extraction and classification. Feature extraction is the key process in detecting the heart abnormality. Most ECG detection systems depend on time domain features for cardiac signal classification. In this paper we propose a wavelet coherence (WTC) technique for ECG signal analysis. The WTC calculates the similarity between two waveforms in the frequency domain. Parameters extracted from the WTC function are used as the features of the ECG signal. These features are optimized using the Bat algorithm. The Levenberg-Marquardt neural network classifier is used to classify the optimized features. The performance of the classifier can be improved with the optimized features.

  5. Improving Estimation of Distribution Algorithm on Multimodal Problems by Detecting Promising Areas.

    PubMed

    Yang, Peng; Tang, Ke; Lu, Xiaofen

    2015-08-01

    In this paper, a novel multiple sub-models maintenance technique, named maintaining and processing sub-models (MAPS), is proposed. MAPS aims to enhance the ability of estimation of distribution algorithms (EDAs) on multimodal problems. The advantages of MAPS over the existing multiple sub-models based EDAs stem from the explicit detection of the promising areas, which can save many function evaluations for exploration and thus accelerate the optimization speed. MAPS can be combined with any EDA that adopts a single Gaussian model. The performance of MAPS has been assessed through empirical studies where MAPS is integrated with three different types of EDAs. The experimental results show that MAPS can lead to much faster convergence speed and obtain more stable solutions than the compared algorithms on 12 benchmark problems.

  6. An automatic algorithm for the detection of Trypanosoma cruzi parasites in blood sample images.

    PubMed

    Soberanis-Mukul, Roger; Uc-Cetina, Víctor; Brito-Loeza, Carlos; Ruiz-Piña, Hugo

    2013-12-01

    Chagas disease is a tropical parasitic disease caused by the flagellate protozoan Trypanosoma cruzi (T. cruzi) and currently affecting large portions of the Americas. One of the standard laboratory methods to determine the presence of the parasite is by direct visualization in blood smears stained with some colorant. This method is time-consuming, requires trained microscopists and is prone to human mistakes. In this article we propose a novel algorithm for the automatic detection of T. cruzi parasites, in microscope digital images obtained from peripheral blood smears treated with Wright's stain. Our algorithm achieved a sensitivity of 0.98 and specificity of 0.85 when evaluated against a dataset of 120 test images. Experimental results show the versatility of the method for parasitemia determination.

  7. An ensemble of k-nearest neighbours algorithm for detection of Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Gök, Murat

    2015-04-01

    Parkinson's disease is a disease of the central nervous system that leads to severe difficulties in motor functions. Developing computational tools for recognition of Parkinson's disease at the early stages is very desirable for alleviating the symptoms. In this paper, we developed a discriminative model based on a selected feature subset and applied several classifier algorithms in the context of disease detection. All classifiers were evaluated, both stand-alone and within a rotation-forest ensemble, on a Parkinson's disease data set according to a blind testing protocol. Compared with previous methods, the rotation-forest ensemble k-nearest neighbour classifier outperforms the state of the art in terms of both accuracy (98.46%) and area under the receiver operating characteristic curve (0.99).

  8. Distributed algorithms for small vehicle detection, classification, and velocity estimation using unattended ground sensors

    NASA Astrophysics Data System (ADS)

    Doser, Adele B.; Yee, Mark L.; O'Rourke, William T.; Slinkard, Megan E.; Craft, David C.; Nguyen, Hung D.

    2005-05-01

    This study developed a distributed vehicle target detection and estimation capability using two algorithmic approaches designed to take advantage of the capabilities of networked sensor systems. The primary interest was in small, quiet vehicles, such as personally owned SUVs and light trucks. The first algorithmic approach utilized arrayed sensor beamforming techniques. In addition, it demonstrated a capability to find locations of unknown roads by extending code developed by the Army Acoustic Center for Excellence at Picatinny Arsenal. The second approach utilized single (non-array) sensors and employed generalized correlation techniques. Modifications to both techniques were suggested that, if implemented, could yield robust methods for target classification and tracking using two different types of networked sensor systems.

  9. An efficient sampling algorithm for uncertain abnormal data detection in biomedical image processing and disease prediction.

    PubMed

    Liu, Fei; Zhang, Xi; Jia, Yan

    2015-01-01

    In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.

  10. Simultaneous spectral/spatial detection of edges for hyperspectral imagery: the HySPADE algorithm revisited

    NASA Astrophysics Data System (ADS)

    Resmini, Ronald G.

    2012-06-01

    The hyperspectral/spatial detection of edges (HySPADE) algorithm, originally published in 2004 [1], has been modified and applied to a wider diversity of hyperspectral imagery (HSI) data. As originally described in [1], HySPADE operates by converting the naturally two-dimensional edge detection process based on traditional image analysis methods into a series of one-dimensional edge detections based on spectral angle. The HySPADE algorithm: i) utilizes spectral signature information to identify edges; ii) requires only the spectral information of the HSI scene data and does not require a spectral library nor spectral matching against a library; iii) facilitates simultaneous use of all spectral information; iv) does not require endmember or training data selection; v) generates multiple, independent data points for statistical analysis of detected edges; vi) is robust in the presence of noise; and vii) may be applied to radiance, reflectance, and emissivity data--though it is applied to radiance and reflectance spectra (and their principal components transformation) in this report. HySPADE has recently been modified to use Euclidean distance values as an alternative to spectral angle. It has also been modified to use an N x N-pixel sliding window in contrast to the 2004 version which operated only on spatial subset image chips. HySPADE results are compared to those obtained using traditional (Roberts and Sobel) edge-detection methods. Spectral angle and Euclidean distance HySPADE results are superior to those obtained using the traditional edge detection methods; the best results are obtained by applying HySPADE to the first few, information-containing bands of principal components transformed data (both radiance and reflectance). However, in practice, both the Euclidean distance and spectral angle versions of HySPADE should be applied and their results compared. HySPADE results are shown; extensions of the HySPADE concept are discussed, as are applications for HySPADE.
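
    The core spectral-angle idea can be sketched as follows: large angles between neighbouring pixel spectra mark candidate material edges. This shows only the kernel of the approach; the per-pixel one-dimensional detections and sliding-window machinery of HySPADE are not reproduced.

        # Edge strength from spectral angles between adjacent pixels of an HSI cube.
        import numpy as np

        def spectral_angle_edges(cube):
            """cube: (rows, cols, bands) -> per-pixel edge strength in radians."""
            norm = np.linalg.norm(cube, axis=2) + 1e-12
            unit = cube / norm[..., None]
            def angle(a, b):
                return np.arccos(np.clip(np.sum(a * b, axis=2), -1.0, 1.0))
            edge = np.zeros(cube.shape[:2])
            edge[:, 1:] = np.maximum(edge[:, 1:], angle(unit[:, 1:], unit[:, :-1]))
            edge[1:, :] = np.maximum(edge[1:, :], angle(unit[1:, :], unit[:-1, :]))
            return edge   # threshold or inspect as an "edge strength" image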

  11. The development of line-scan image recognition algorithms for the detection of frass on mature tomatoes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this research, a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet LED excitation was developed for the detection of frass contamination on mature tomatoes. The algorithm utilized the fluorescence intensities at two wavebands, 664 nm and 690 nm, for co...

  12. Change Detection from differential airborne LiDAR using a weighted Anisotropic Iterative Closest Point Algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Kusari, A.; Glennie, C. L.; Oskin, M. E.; Hinojosa-Corona, A.; Borsa, A. A.; Arrowsmith, R.

    2013-12-01

    Differential LiDAR (Light Detection and Ranging) from repeated surveys has recently emerged as an effective tool to measure three-dimensional (3D) change for applications such as quantifying slip and spatially distributed warping associated with earthquake ruptures, and examining the spatial distribution of beach erosion after hurricane impact. Currently, the primary method for determining 3D change is through the use of the iterative closest point (ICP) algorithm and its variants. However, all current studies using ICP have assumed that all LiDAR points in the compared point clouds have uniform accuracy. This assumption is simplistic given that the error for each LiDAR point is variable, and dependent upon highly variable factors such as target range, angle of incidence, and aircraft trajectory accuracy. Therefore, to rigorously determine spatial change, it would be ideal to model the random error for every LiDAR observation in the differential point cloud, and use these error estimates as a priori weights in the ICP algorithm. To test this approach, we implemented a rigorous LiDAR observation error propagation method to generate estimated random error for each point in a LiDAR point cloud, and then determine 3D displacements between two point clouds using an anisotropic weighted ICP algorithm. The algorithm was evaluated by qualitatively and quantitatively comparing post-earthquake slip estimates from the 2010 El Mayor-Cucapah Earthquake between a uniform-weight and an anisotropically weighted ICP algorithm, using pre-event LiDAR collected in 2006 by Instituto Nacional de Estadística y Geografía (INEGI), and post-event LiDAR collected by The National Center for Airborne Laser Mapping (NCALM).
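
    A minimal sketch of one weighted ICP iteration follows, using scalar per-point weights (inverse error variances) as a stand-in for the full anisotropic error model: nearest-neighbour correspondences followed by a weighted Kabsch alignment.

        # One weighted ICP step: weighted rigid alignment of source onto target.
        import numpy as np
        from scipy.spatial import cKDTree

        def weighted_icp_step(source, target, source_var, target_var):
            """source, target: (N,3)/(M,3) points; *_var: per-point error variances."""
            tree = cKDTree(target)
            _, idx = tree.query(source)
            q = target[idx]
            w = 1.0 / (source_var + target_var[idx] + 1e-12)
            w = w / w.sum()
            p_bar = (w[:, None] * source).sum(axis=0)
            q_bar = (w[:, None] * q).sum(axis=0)
            H = (w[:, None] * (source - p_bar)).T @ (q - q_bar)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T
            t = q_bar - R @ p_bar
            return R, t   # apply as source @ R.T + t, then iterate to convergence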

  13. Detection and Analyses of Traveling Ionospheric Disturbances from two Successive Earthquakes and Tsunami Waves

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Garrison, J. L.; Komjathy, A.

    2012-12-01

    Traveling ionospheric disturbances (TIDs), induced by acoustic-gravity waves (AGWs) in the neutral atmosphere, are observable in trans-ionospheric Global Navigation Satellite System (GNSS) measurements. Previous studies on the GNSS-derived ionospheric disturbances have been presented for studying the interactions between ionospheric perturbations associated with the 2011 Japan Tohoku Earthquakes and Tsunamis. Three different types of TIDs were observed. Short-period disturbances (2 - 8 minutes) with speeds up to 2300 km/s were observed in the near-field, and the long-period (8-22 minutes) disturbances with speeds (195 - 354 m/s) were identified in both near- and far-fields. In this study, identification and classification of ionospheric disturbances was conducted using a wavelet detection method in combination with a cross-correlation technique estimating the propagation speeds and directions of atmosphere wave-induced disturbances in dual frequency IEC time series collected from GNSS networks near the epicenter of the March 11, 2011 Japan Tohoku earthquake and the subsequent tsunami induced by it. Through the use of the wavelet detection process, we are able to find major wave trains, present in the data collected from these networks, with two dominant frequency bands corresponding to the disturbances from two successive earthquakes and tsunami propagations. Additionally, the comparative observations and model predictions, including ground motions and tsunami propagations calculated by JPL using the MOST model, are used to understand and perceive the dominant properties (propagation speeds, directions, periods and occurrence times) of the GPS-derived ionospheric disturbances. This analysis is demonstrated on data from 1235 stations in the Japanese GEONET GPS network. A comparison between GNSS-derived disturbances, ground motions and tsunami propagations shows that the propagation directions and speeds of short-period disturbances are consistent with the acoustic

  14. Mapping of Planetary Surface Age Based on Crater Statistics Obtained by an Automatic Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Salih, A. L.; Mühlbauer, M.; Grumpe, A.; Pasckert, J. H.; Wöhler, C.; Hiesinger, H.

    2016-06-01

    The analysis of the impact crater size-frequency distribution (CSFD) is a well-established approach to the determination of the age of planetary surfaces. Classically, estimation of the CSFD is achieved by manual crater counting and size determination in spacecraft images, which, however, becomes very time-consuming for large surface areas and/or high image resolution. With increasing availability of high-resolution (nearly) global image mosaics of planetary surfaces, a variety of automated methods for the detection of craters based on image data and/or topographic data have been developed. In this contribution a template-based crater detection algorithm is used which analyses image data acquired under known illumination conditions. Its results are used to establish the CSFD for the examined area, which is then used to estimate the absolute model age of the surface. The detection threshold of the automatic crater detection algorithm is calibrated based on a region with available manually determined CSFD such that the age inferred from the manual crater counts corresponds to the age inferred from the automatic crater detection results. With this detection threshold, the automatic crater detection algorithm can be applied to a much larger surface region around the calibration area. The proposed age estimation method is demonstrated for a Kaguya Terrain Camera image mosaic of 7.4 m per pixel resolution of the floor region of the lunar crater Tsiolkovsky, which consists of dark and flat mare basalt and has an area of nearly 10,000 km2. The region used for calibration, for which manual crater counts are available, has an area of 100 km2. In order to obtain a spatially resolved age map, CSFDs and surface ages are computed for overlapping quadratic regions of about 4.4 x 4.4 km2 size offset by a step width of 74 m. Our constructed surface age map of the floor of Tsiolkovsky shows age values of typically 3.2-3.3 Ga, while for small regions lower (down to 2.9 Ga) and higher

  15. Refinements and practical implementation of a power based loss of grid detection algorithm for embedded generators

    NASA Astrophysics Data System (ADS)

    Barrett, James

    The incorporation of small, privately owned generation operating in parallel with, and supplying power to, the distribution network is becoming more widespread. This method of operation does, however, have problems associated with it. In particular, a loss of the connection to the main utility supply which leaves a portion of the utility load connected to the embedded generator will result in a power island. This situation presents possible dangers to utility personnel and the public, complications for smooth system operation and probable plant damage should the two systems be reconnected out-of-synchronism. Loss of Grid (or Islanding), as this situation is known, is the subject of this thesis. The work begins by detailing the requirements for operation of generation embedded in the utility supply, with particular attention drawn to the requirements for a loss of grid protection scheme. The mathematical basis for a new loss of grid protection algorithm is developed and the inclusion of the algorithm in an integrated generator protection scheme described. A detailed description is given of the implementation of the new algorithm in microprocessor-based relay hardware to allow practical tests on small embedded generation facilities, including an in-house multiple generator test facility. The results obtained from the practical tests are compared with those obtained from simulation studies carried out in previous work and the differences are discussed. The performance of the theoretical algorithm is then enhanced, using the simulation results, with simple filtering together with pattern recognition techniques. This provides stability during severe load fluctuations under parallel operation and system fault conditions and improved performance under normal operating conditions and for loss of grid detection. In addition to operating for a loss of grid connection, the algorithm will respond to load fluctuations which occur within a power island

  16. Detection of co-colonization with Streptococcus pneumoniae by algorithmic use of conventional and molecular methods.

    PubMed

    Saha, Sudipta; Modak, Joyanta K; Naziat, Hakka; Al-Emran, Hassan M; Chowdury, Mrittika; Islam, Maksuda; Hossain, Belal; Darmstadt, Gary L; Whitney, Cynthia G; Saha, Samir K

    2015-01-29

    Detection of pneumococcal carriage by multiple co-colonizing serotypes is important in assessing the benefits of pneumococcal conjugate vaccine (PCV). Various methods differing in sensitivity, cost and technical complexity have been employed to detect multiple serotypes of pneumococcus in respiratory specimens. We have developed an algorithmic method to detect all known serotypes that preserves the relative abundance of specific serotypes by using Quellung-guided molecular techniques. The method involves culturing respiratory swabs followed by serotyping of 100 colonies by either capsular (10 colonies) or PCR (90 colonies) reactions on 96-well plates. The method was evaluated using 102 nasal swabs from children carrying pneumococcus. Multiple serotypes were detected in 22% of carriers, compared to 3% by World Health Organization (WHO)-recommended morphology-based selection of 1 to 3 colonies. Our method, with a processing cost of $87, could detect subdominant strains making up as low as 1% of the population. The method is affordable, practical, and capable of detecting all known serotypes without false positive reactions or change in the native distribution of multiple serotypes.

  17. An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters.

    PubMed

    Behrens, F; Mackeben, M; Schröder-Preikschat, W

    2010-08-01

    This analysis of eye-movement time series uses a saccade-detection algorithm that is based on an earlier algorithm. It achieves substantial improvements by using an adaptive-threshold model instead of fixed thresholds and by using the eye-movement acceleration signal. This has four advantages: (1) Adaptive thresholds are calculated automatically from the preceding acceleration data for detecting the beginning of a saccade, and thresholds are modified during the saccade. (2) The monotonicity of the position signal during the saccade, together with the acceleration with respect to the thresholds, is used to reliably determine the end of the saccade. (3) This allows differentiation between saccades following the main sequence and non-main-sequence saccades. (4) Artifacts of various kinds can be detected and eliminated. The algorithm is demonstrated by applying it to human eye movement data (obtained by EOG) recorded while driving a car. A second demonstration of the algorithm detects microsleep episodes in eye movement data.
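
    A simplified sketch in the spirit of the adaptive-threshold idea: the onset threshold is derived from the recent acceleration noise level rather than being fixed. The window length and multiplier are illustrative assumptions, and the monotonicity and artifact checks of the full algorithm are omitted.

        # Adaptive acceleration-threshold saccade onset detection (simplified).
        import numpy as np

        def detect_saccade_onsets(position_deg, fs_hz, window_s=0.5, k=3.0):
            velocity = np.gradient(position_deg) * fs_hz
            acceleration = np.gradient(velocity) * fs_hz
            n = int(window_s * fs_hz)
            onsets = []
            for i in range(n, len(acceleration)):
                recent = acceleration[i - n:i]
                threshold = k * recent.std() + 1e-9     # adapts to local noise
                if abs(acceleration[i]) > threshold and (not onsets or i - onsets[-1] > n):
                    onsets.append(i)
            return onsets   # sample indices of candidate saccade onsets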

  18. [Multi-population elitists shared genetic algorithm for outlier detection of spectroscopy analysis].

    PubMed

    Cao, Hui; Zhou, Yan

    2011-07-01

    The present paper proposes an outlier detection method for spectral analysis based on a multi-population elitists shared genetic algorithm. The method was exploited in the NIR data set analysis to remove the outliers from the data set, and partial least squares (PLS) was combined with the proposed method to build a prediction model. In contrast with Monte Carlo cross validation, leave-one-out cross validation, Mahalanobis distance and a traditional genetic algorithm for outlier detection, the prediction residual error sum of squares (PRESS) of the moisture prediction model based on the proposed method decreases by 72.4%, 39.5%, 39.5% and 14.5%; the PRESS value of the fat prediction model decreases by 86.2%, 75.9%, 84.9% and 19.9%; and the PRESS value of the protein prediction model decreases by 56.5%, 35.7%, 35.7% and 18.2%, respectively. Results indicated that the method is applicable for spectral outlier detection for different species, and the model based on the data set without the removed outliers is more accurate and robust.

  19. Algorithm for automatic detection of the cardiovascular parameter PR-interval from LDV-velocity signals

    NASA Astrophysics Data System (ADS)

    Mignanelli, Laura; Rembe, Christian

    2016-06-01

    Laser Doppler vibrometry (LDV) is broadly employed in mechanical engineering, but several researchers have demonstrated that the technique also has large potential in biomedical applications. In particular, the detection of several vital parameters (heart rate, heart rate variability, respiration period) is known as optical vibrocardiography (VBCG). Recent studies have demonstrated the possibility of reliable detection of the PR-interval (the time between atrial and ventricular contractions) and classification of the different types of atrioventricular (AV) block from these velocity signals. In this work, an algorithm for the localization of the vibrations generated by atrial contraction for the detection of the PR-interval in VBCG acquired on the thorax is presented. The time point of a heart beat can be extracted easily because it generates an unambiguous maximal velocity peak in the time data. Extracting the contraction of the atrium is more challenging because its characteristic signature has an amplitude on the order of the signal disturbances. We compare different cost-function approaches for determining the time point of the atrial-contraction signature, as well as different optimization algorithms, to find the correct PR time.

  20. A novel electrocardiogram parameterization algorithm and its application in myocardial infarction detection.

    PubMed

    Liu, Bin; Liu, Jikui; Wang, Guoqing; Huang, Kun; Li, Fan; Zheng, Yang; Luo, Youxi; Zhou, Fengfeng

    2015-06-01

    The electrocardiogram (ECG) is a biophysical electric signal generated by the heart muscle, and is one of the major measurements of how well a heart functions. Automatic ECG analysis algorithms usually extract the geometric or frequency-domain features of the ECG signals and have already significantly facilitated automatic ECG-based cardiac disease diagnosis. We propose a novel ECG feature obtained by fitting a given ECG signal with a 20th order polynomial function, defined as PolyECG-S. The PolyECG-S feature is almost identical to the fitted ECG curve, as measured by the Akaike information criterion (AIC), and achieved 94.4% accuracy in detecting myocardial infarction (MI) on the test dataset. Currently, ST segment elevation is one of the major ways to detect MI (ST-elevation myocardial infarction, STEMI). However, many ECG signals have weak or even undetectable ST segments. Since PolyECG-S does not rely on the information of ST waves, it can be used as a complementary MI detection algorithm with the STEMI strategy. Overall, our results suggest that the PolyECG-S feature may satisfactorily reconstruct the fitted ECG curve, and is complementary to the existing ECG features for automatic cardiac function analysis.
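
    A PolyECG-S-style feature can be sketched directly with a polynomial fit of a single beat; normalizing the time axis to [-1, 1] is an illustrative choice for numerical stability rather than part of the published method.

        # 20th-order polynomial fit of one heartbeat as a feature vector.
        import numpy as np

        def poly_ecg_feature(beat_samples, order=20):
            t = np.linspace(-1.0, 1.0, len(beat_samples))
            coeffs = np.polynomial.polynomial.polyfit(t, beat_samples, deg=order)
            reconstruction = np.polynomial.polynomial.polyval(t, coeffs)
            return coeffs, reconstruction   # feature vector and fitted curve

    The coefficient vector can then be fed to any classifier (e.g. MI versus non-MI), with the goodness of fit checked against the original beat.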

  1. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-05

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to enhance the state of the art in SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.
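
    The first stage of OpSDA, the plain swinging door segmentation, admits a compact sketch: points stay in the current linear segment as long as they fit within a deviation band of width ε around the line from the segment's start, and the segment closes when the upper and lower "doors" cross. The tolerance, the toy power series and the omission of the dynamic-programming merge step are assumptions of this sketch, not the paper's implementation.

    ```python
    # Basic swinging door segmentation (first stage of OpSDA, sketch only):
    # segment a time series into piecewise-linear pieces within a deviation eps.
    # The dynamic-programming merge into significant ramps is not shown.
    import numpy as np

    def swinging_door_segments(t, y, eps):
        """Return (start, end) index pairs of piecewise-linear segments."""
        segments, start = [], 0
        s_max, s_min = -np.inf, np.inf     # slopes of the two "doors"
        for i in range(1, len(y)):
            dt = t[i] - t[start]
            s_max = max(s_max, (y[i] - (y[start] + eps)) / dt)
            s_min = min(s_min, (y[i] - (y[start] - eps)) / dt)
            if s_max > s_min:              # doors have crossed: close the segment
                segments.append((start, i - 1))
                start = i - 1
                dt = t[i] - t[start]
                s_max = (y[i] - (y[start] + eps)) / dt
                s_min = (y[i] - (y[start] - eps)) / dt
        segments.append((start, len(y) - 1))
        return segments

    t = np.arange(0.0, 100.0, 1.0)                          # time [min]
    power = np.clip(50 + 50 * np.sin(t / 15.0), 0, None)    # toy solar power [MW]
    segs = swinging_door_segments(t, power, eps=2.0)
    print("piecewise-linear segments (start, end, slope MW/min):")
    for a, b in segs:
        slope = (power[b] - power[a]) / (t[b] - t[a])
        print(f"  {t[a]:5.0f} -> {t[b]:5.0f}   slope = {slope:6.2f}")
    ```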

  2. A multi-agent genetic algorithm for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Li, Zhangtao; Liu, Jing

    2016-05-01

    Complex networks are widely used to represent many practical systems in domains such as biology and sociology, and community structure is one of the most important network attributes, having received an enormous amount of attention. Community detection is the process of discovering the community structure hidden in complex networks, and modularity Q is one of the best-known quality functions for measuring the quality of a network's communities. In this paper, a multi-agent genetic algorithm, named MAGA-Net, is proposed to optimize the modularity value for community detection. An agent, encoded as a division of a network, represents a candidate solution. All agents live in a lattice-like environment, with each agent fixed on a lattice point. A series of operators is designed to increase the modularity value, namely a split-and-merge-based neighborhood competition operator, hybrid neighborhood crossover, adaptive mutation and a self-learning operator. In the experiments, the performance of MAGA-Net is validated on both well-known real-world benchmark networks and large-scale synthetic LFR networks with 5000 nodes. Systematic comparisons with GA-Net and Meme-Net show that MAGA-Net outperforms these two algorithms and can detect communities with high speed, accuracy and stability.
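
    The objective being optimized, the modularity Q of a candidate division, can be evaluated directly with networkx; the snippet below scores an arbitrary split of the karate-club graph and a greedy baseline. Both the example graph and the baseline are illustrative stand-ins, not the paper's benchmarks or its genetic operators.

    ```python
    # Evaluate the modularity Q of a candidate community division, the objective
    # that MAGA-Net optimises. The karate-club graph and the greedy baseline are
    # illustrative stand-ins, not the paper's benchmarks.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    G = nx.karate_club_graph()

    # Any division of the nodes (an "agent" in MAGA-Net encodes such a division).
    candidate = [set(range(0, 17)), set(range(17, 34))]
    print("Q of arbitrary two-way split :", round(modularity(G, candidate), 4))

    # A standard greedy optimiser as a point of comparison.
    best = greedy_modularity_communities(G)
    print("Q of greedy baseline         :", round(modularity(G, best), 4))
    ```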

  3. A taxonomy of algorithms for chemical vapor detection with hyperspectral imaging spectroscopy

    NASA Astrophysics Data System (ADS)

    Manolakis, Dimitris G.; D'Amico, Francis M.

    2005-05-01

    Remote detection of chemical vapors in the atmosphere has a wide range of civilian and military applications. In the past few years there has been significant interest in the detection of effluent plumes using hyperspectral imaging spectroscopy in the 8–12 µm atmospheric window. A major obstacle in the full exploitation of this technology is the fact that everything in the infrared is a source of radiation. As a result, the emission from the gases of interest is always mixed with emission by the more abundant atmospheric constituents and by other objects in the sensor field of view. The radiance fluctuations in this background emission constitute an additional source of interference which is much stronger than the detector noise. The purpose of this paper is threefold. First, we review the thin plume approximation, the resulting additive signal model, and the key differences between reflective and emissive radiance signal models. Second, based on the additive signal model we derive two families of detection algorithms using the generalized likelihood ratio test. The first family models the background using a multivariate normal distribution whereas the second family models the background using a linear subspace. Finally, we present a taxonomy of the available algorithms and show that some other ad-hoc approaches, like orthogonal background suppression, are simplified special cases of optimally derived detectors.
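
    For the family that models the background as multivariate normal, the GLRT for the additive thin-plume model reduces to a matched-filter statistic of the form s^T Σ^{-1}(x − μ) / sqrt(s^T Σ^{-1} s). The sketch below applies that statistic per pixel to synthetic data; the dimensions, gas signature and background covariance are assumptions, and this is the textbook detector rather than the paper's full taxonomy.

    ```python
    # Matched-filter detector for the additive signal model x = b + a*s (thin plume),
    # with the background b modelled as multivariate normal. Synthetic dimensions,
    # gas signature and covariance are assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    bands, n_pix = 40, 5000
    mu = rng.normal(size=bands)
    A = rng.normal(size=(bands, bands))
    cov = A @ A.T / bands + 0.1 * np.eye(bands)          # background covariance
    bg = rng.multivariate_normal(mu, cov, size=n_pix)    # background pixels

    s = np.abs(rng.normal(size=bands))                   # gas absorption signature
    x = bg.copy()
    x[:50] += 0.8 * s                                    # implant a weak plume in 50 pixels

    cov_inv = np.linalg.inv(np.cov(bg, rowvar=False))    # estimated from background
    w = cov_inv @ s / np.sqrt(s @ cov_inv @ s)           # matched-filter weights
    scores = (x - x.mean(axis=0)) @ w                    # detector output per pixel

    thresh = np.quantile(scores[50:], 0.999)             # threshold from plume-free pixels
    print("plume pixels above threshold:", int(np.sum(scores[:50] > thresh)), "/ 50")
    ```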

  4. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters, including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded at one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain and the novel algorithms, and present results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.
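
    A toy version of the heterogeneous-kernel idea can be sketched by combining an RBF kernel on the continuous features with a simple matching kernel on the discrete features and feeding the precomputed Gram matrix to a one-class SVM. Everything here is an assumption for illustration: the data are synthetic, the kernel weight is fixed by hand rather than learned as in multiple kernel learning, and the one-class SVM stands in for the authors' anomaly detector.

    ```python
    # Toy heterogeneous-kernel anomaly detector: combine an RBF kernel on continuous
    # features with a matching kernel on discrete features and train a one-class SVM
    # on the precomputed Gram matrix. The kernel weight is fixed by hand here,
    # whereas multiple kernel learning would learn it; all data are synthetic.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    n = 300
    X_cont = rng.normal(size=(n, 4))                     # continuous sensor streams
    X_disc = rng.integers(0, 3, size=(n, 6))             # discrete/categorical events
    X_cont[-10:] += 4.0                                  # a few off-nominal records

    def matching_kernel(A, B):
        """Fraction of discrete features that agree between each pair of rows."""
        return (A[:, None, :] == B[None, :, :]).mean(axis=2)

    w = 0.6                                              # fixed kernel weight (not learned)
    K = w * rbf_kernel(X_cont, gamma=0.5) + (1 - w) * matching_kernel(X_disc, X_disc)

    ocsvm = OneClassSVM(kernel="precomputed", nu=0.05).fit(K)
    flags = ocsvm.predict(K)                             # -1 marks anomalies
    print("flagged as anomalous:", np.where(flags == -1)[0])
    ```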

  5. CK-LPA: Efficient community detection algorithm based on label propagation with community kernel

    NASA Astrophysics Data System (ADS)

    Lin, Zhen; Zheng, Xiaolin; Xin, Nan; Chen, Deren

    2014-12-01

    With the rapid development of Web 2.0 and the rise of online social networks, finding community structures from user data has become a hot topic in network analysis. Although research achievements are numerous, most of them cannot be applied to large-scale social networks because of their heavy computational cost. Previous studies have shown that label propagation is an efficient and easily implemented means of detecting communities in social networks; however, it has some drawbacks, such as low accuracy, high randomness, and the formation of a “monster” community. In this study, we propose an efficient community detection method based on the label propagation algorithm (LPA) with a community kernel (CK-LPA). We assign a weight to each node according to its importance in the whole network and update node labels in sequence based on weight. Then, we discuss the composition of weights, the label updating strategy, the label propagation strategy, and the convergence conditions. Compared with the primitive LPA, the existing drawbacks are resolved by CK-LPA. Experiments and benchmarks reveal that the proposed method retains nearly linear time complexity and significantly improves the quality of static community detection. Hence, the algorithm can be applied to large-scale social networks.
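
    The weight-ordered update rule can be illustrated with a much-simplified sketch: each node gets an importance weight (plain degree here, standing in for the paper's weight composition), nodes are updated in descending weight order, and each node adopts the label carrying the largest total neighbour weight. This toy version omits the community-kernel details and is not the published algorithm.

    ```python
    # Simplified CK-LPA-style label propagation: node importance = degree (an
    # assumption standing in for the paper's weight composition), updates in
    # descending weight order, labels chosen by largest total neighbour weight.
    import networkx as nx
    from collections import defaultdict

    def weighted_lpa(G, max_iter=20):
        weight = dict(G.degree())                 # node importance (assumption)
        label = {v: v for v in G}                 # start with unique labels
        order = sorted(G, key=lambda v: weight[v], reverse=True)
        for _ in range(max_iter):
            changed = False
            for v in order:
                score = defaultdict(float)
                for u in G.neighbors(v):
                    score[label[u]] += weight[u]
                if score:
                    best = max(score, key=score.get)
                    if best != label[v]:
                        label[v], changed = best, True
            if not changed:
                break
        return label

    G = nx.karate_club_graph()
    labels = weighted_lpa(G)
    print("communities found:", len(set(labels.values())))
    ```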

  6. Defect-detection algorithm for noncontact acoustic inspection using spectrum entropy

    NASA Astrophysics Data System (ADS)

    Sugimoto, Kazuko; Akamatsu, Ryo; Sugimoto, Tsuneyoshi; Utagawa, Noriyuki; Kuroda, Chitose; Katakura, Kageyoshi

    2015-07-01

    In recent years, the detachment of concrete from bridges and tunnels and the degradation of concrete structures have become serious social problems. The importance of inspection, repair, and updating is recognized in measures against degradation. We have been studying a noncontact acoustic inspection method using airborne sound and a laser Doppler vibrometer. In this method, depending on the surface state (reflectance, dirt, etc.), the amount of returning laser light decreases and optical noise arises from leakage in the light reception. Influencing factors include the output stability of the laser Doppler vibrometer, the low reflectance and diffuse-reflection characteristics of the measurement surface, the measurement distance, and the laser irradiation angle. Because the frequency characteristic of the optical noise resembles white noise, defect detection that relies only on the vibration energy ratio may mistake optical noise caused by leakage in the light reception for a defective part. Therefore, in this work, the combination of the vibration energy ratio and spectrum entropy is used to judge whether a measured point is healthy, defective, or an abnormal measurement point, and an algorithm that enables clearer detection of defective parts is proposed. When the technique was applied in an experiment on real concrete structures, defective parts could be extracted more clearly and the validity of the proposed algorithm was confirmed.
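
    The spectrum (spectral) entropy combined here with the vibration energy ratio is simply the Shannon entropy of the normalised power spectrum: a flat, white-noise-like spectrum (typical of optical noise) scores high, while a resonant defect response scores low. The Welch parameters, synthetic signals and sampling rate in the sketch below are assumptions.

    ```python
    # Spectrum entropy of a vibration record: a near-flat (white-noise-like) spectrum,
    # typical of optical noise, gives high entropy, while a resonant defect response
    # gives low entropy. Welch parameters and signals are illustrative assumptions.
    import numpy as np
    from scipy.signal import welch

    def spectral_entropy(x, fs):
        f, pxx = welch(x, fs=fs, nperseg=1024)
        p = pxx / pxx.sum()                       # normalise PSD to a distribution
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum() / np.log2(len(pxx)))   # scaled to [0, 1]

    fs = 20_000.0
    t = np.arange(0, 1, 1 / fs)
    rng = np.random.default_rng(0)
    optical_noise = rng.normal(size=t.size)                        # white-noise-like
    defect_response = np.sin(2 * np.pi * 1500 * t) + 0.1 * rng.normal(size=t.size)

    print("entropy, optical noise   :", round(spectral_entropy(optical_noise, fs), 3))
    print("entropy, defect resonance:", round(spectral_entropy(defect_response, fs), 3))
    ```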

  7. Further development of image processing algorithms to improve detectability of defects in Sonic IR NDE

    NASA Astrophysics Data System (ADS)

    Obeidat, Omar; Yu, Qiuye; Han, Xiaoyan

    2017-02-01

    Sonic Infrared imaging (SIR) technology is a relatively new NDE technique that has gained significant acceptance in the NDE community. SIR NDE is a very fast, wide-area NDE method. The technology uses short pulses of ultrasonic excitation together with infrared imaging to detect defects in the structures under inspection. Defects become visible to the IR camera when the temperature in the crack vicinity increases due to various heating mechanisms in the specimen. Defect detection is strongly affected by noise levels as well as by mode patterns in the image. Mode patterns result from the superposition of sonic waves interfering within the specimen during the application of the sound pulse. They can be a serious concern, especially in composite structures: they can either mimic real defects in the specimen or, if they overlap defects, hide them. At last year's QNDE we presented algorithms to improve defect detectability in severe noise. In this paper, we present our development of defect-extraction algorithms that specifically target mode patterns in SIR images.

  8. A novel through-wall respiration detection algorithm using UWB radar.

    PubMed

    Li, Xin; Qiao, Dengyu; Li, Ye; Dai, Huhe

    2013-01-01

    Through-wall respiration detection using ultra-wideband (UWB) impulse radar can be applied to post-disaster rescue, e.g., searching for living persons trapped in ruined buildings after an earthquake. Since strong interference such as static clutter and noise is always present in real-life scenarios while the respiratory signal is very weak, the signal-to-noise-and-clutter ratio (SNCR) is quite low. Therefore, through-wall respiration detection using UWB impulse radar under low SNCR is a challenging task in the field of searching for survivors after a disaster. In this paper, an improved UWB respiratory signal model is built, for the first time, on an even power of a cosine function. This model is used to reveal the harmonic structure of the respiratory signal, based on which a novel high-performance respiration detection algorithm is proposed. The algorithm is assessed by experimental verification and simulation and shows an improvement of about 1.5 dB in SNR and SNCR.
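
    The harmonic structure that the even-power-of-cosine model exposes can be checked numerically: cos^(2n)(πt/T) contains spectral lines at integer multiples of the respiration rate 1/T. The breathing period, exponent and sampling settings in the sketch are assumptions, and the detection algorithm itself is not reproduced.

    ```python
    # Illustrate the harmonic structure of an even-power-of-cosine respiration model:
    # cos^(2n)(pi*t/T) has spectral lines at multiples of the breathing rate 1/T.
    # Breathing period, exponent and sampling settings are illustrative assumptions.
    import numpy as np

    fs, T_breath, n = 100.0, 4.0, 3                  # sample rate [Hz], period [s], exponent
    t = np.arange(0, 60, 1 / fs)
    model = np.cos(np.pi * t / T_breath) ** (2 * n)

    spec = 2 * np.abs(np.fft.rfft(model - model.mean())) / len(t)
    freqs = np.fft.rfftfreq(len(t), 1 / fs)

    f0 = 1.0 / T_breath                              # fundamental respiration rate
    for k in range(1, n + 1):
        idx = int(np.argmin(np.abs(freqs - k * f0)))
        print(f"harmonic {k}: {freqs[idx]:.3f} Hz, amplitude {spec[idx]:.4f}")
    ```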

  9. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    SciTech Connect

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) are those that significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetrations in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to improve SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.

  10. Design of measuring system for wire diameter based on sub-pixel edge detection algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yudong; Zhou, Wang

    2016-09-01

    The light projection method is often used in wire-diameter measuring systems; it has a relatively simple structure and low cost, but its measuring accuracy is limited by the pixel size of the CCD. Using a CCD with a smaller pixel size can improve the measuring accuracy, but increases the cost and the manufacturing difficulty. In this paper, through a comparative analysis of a variety of sub-pixel edge detection algorithms, a polynomial fitting method is applied to the data processing of the wire-diameter measuring system to improve the measuring accuracy and enhance its noise immunity. In the design of the system structure, a light projection method with an orthogonal structure is used for the optical detection part, which effectively reduces the error caused by line jitter during the measuring process. For the electrical part, an ARM Cortex-M4 microprocessor is used as the core of the circuit module; it can drive a dual-channel linear CCD and also complete the sampling, processing and storage of the CCD video signal. In addition, the ARM microprocessor can carry out the high-speed operation of the whole wire-diameter measuring system without any additional chip. The experimental results show that the sub-pixel edge detection algorithm based on polynomial fitting can compensate for the limitation of the single pixel size and significantly improve the precision of the wire-diameter measuring system without increasing the hardware complexity of the system.
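
    The sub-pixel refinement by polynomial fitting can be shown on a one-dimensional CCD line: locate the coarse edge at the gradient maximum, fit a low-order polynomial to the gradient around it, and take the vertex as the sub-pixel edge position. The profile shape, blur width and fit window in this sketch are assumptions, not the paper's calibration.

    ```python
    # Sub-pixel edge location on a 1-D CCD profile by polynomial fitting: find the
    # pixel of maximum gradient, fit a parabola to the gradient around it, and take
    # the parabola's vertex as the sub-pixel edge. Profile and blur are assumptions.
    import numpy as np

    true_edge = 123.37                                   # ground-truth edge position [px]
    x = np.arange(256, dtype=float)
    profile = 1.0 / (1.0 + np.exp(-(x - true_edge) / 1.5))   # blurred step (shadow edge)
    profile += 0.01 * np.random.default_rng(0).normal(size=x.size)

    grad = np.gradient(profile)
    k = int(np.argmax(np.abs(grad)))                     # coarse, pixel-level edge

    # Fit a parabola to |gradient| over a small window and take its vertex.
    win = np.arange(k - 3, k + 4)
    c2, c1, c0 = np.polyfit(win, np.abs(grad[win]), deg=2)
    subpixel_edge = -c1 / (2.0 * c2)

    print(f"pixel-level edge: {k}, sub-pixel edge: {subpixel_edge:.3f} (true {true_edge})")
    ```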

  11. Mobile detection assessment and response systems (MDARS): a force protection physical security operational success

    NASA Astrophysics Data System (ADS)

    Shoop, Brian; Johnston, Michael; Goehring, Richard; Moneyhun, Jon; Skibba, Brian

    2006-05-01

    MDARS is a semi-autonomous unmanned ground vehicle with intrusion detection and assessment and product and barrier assessment payloads. Its functions include surveillance, security, early warning, incident first response, and product and barrier status, primarily focused on a depot/munitions security mission at structured/semi-structured facilities. MDARS is in Systems Development and Demonstration (SDD) under the Product Manager for Force Protection Systems (PM-FPS). MDARS capabilities include semi-autonomous navigation, obstacle avoidance, motion detection, day and night imagers, radio-frequency tag inventory/barrier assessment, and audio challenge and response. Four SDD MDARS patrol vehicles have been undergoing operational evaluation at Hawthorne Army Depot, NV (HWAD) since October 2004. Hawthorne personnel were trained to administer, operate and maintain the system in accordance with the US Army Military Police School (USAMPS) Concept of Employment and the PM-FPS MDARS Integrated Logistic Support Plan. The system was subjected to intensive periods of evaluation under the guidance and control of the Army Test and Evaluation Center (ATEC) and PM-FPS. Significantly, in terms of user acceptance, the system has been under the "operational control" of the installation, performing security and force protection missions in support of daily operations. This evaluation is intended to assess MDARS operational effectiveness in an operational environment. Initial observations show that MDARS provides enhanced force protection, can potentially reduce manpower requirements by conducting routine tasks within its design capabilities, and reduces Soldier exposure in the initial response to emerging incidents and situations. The success of the MDARS program has been instrumental in the design and development of two additional robotic force protection programs, the first being the USAF Force Protection Battle Lab sponsored Remote Detection Challenge & Response (REDCAR) concept demonstration.

  12. Algorithms Performance Investigation of a Generalized Spreader-Bar Detection System

    SciTech Connect

    Robinson, Sean M.; Ashbaker, Eric D.; Hensley, Walter K.; Schweppe, John E.; Sandness, Gerald A.; Erikson, Luke E.; Ely, James H.

    2010-10-01

    A “generic” gantry-crane-mounted spreader bar detector has been simulated in the Monte Carlo radiation transport code MCNP [1]. This model is intended to represent the largest realistically feasible number of detector crystals in a single gantry-crane model intended to sit atop an InterModal Cargo Container (IMCC). Detectors were chosen from among large, commonly available sodium iodide (NaI) crystal scintillators and spaced as evenly as is thought possible for a detector apparatus attached to a gantry crane. Several scenarios were simulated with this model, based on a single IMCC being moved between a ship’s deck or cargo hold and the dock. During measurement, the gantry crane will carry that IMCC through the air and lower it onto a receiving vehicle (e.g. a chassis or a bomb cart). The case of an IMCC being moved through the air from an unknown radiological environment to the ground is somewhat complex; for this initial study a single location was picked at which to simulate background. An HEU source based on earlier validated models was used and placed at varying depths in a wood cargo. Many statistical realizations of these scenarios were constructed from simulations of the component spectra, simulated to have high statistics. The resultant data were analyzed with several different algorithms: the simulated data were evaluated by each algorithm, with a threshold set to a statistical-only false alarm probability of 0.001, and the resultant minimum detectable amounts were generated for each cargo depth possible within the IMCC. Using GADRAS as an anomaly detector provided the greatest detection sensitivity, and it is expected that an algorithm similar to this will be of great use for the detection of highly shielded sources.

  13. An algorithm to detect and communicate the differences in computational models describing biological systems

    PubMed Central

    Scharm, Martin; Wolkenhauer, Olaf; Waltemath, Dagmar

    2016-01-01

    Motivation: Repositories support the reuse of models and ensure transparency about results in publications linked to those models. With thousands of models available in repositories, such as the BioModels database or the Physiome Model Repository, a framework to track the differences between models and their versions is essential to compare and combine models. Difference detection not only allows users to study the history of models but also helps in the detection of errors and inconsistencies. Existing repositories lack algorithms to track a model’s development over time. Results: Focusing on SBML and CellML, we present an algorithm to accurately detect and describe differences between coexisting versions of a model with respect to (i) the models’ encoding, (ii) the structure of biological networks and (iii) mathematical expressions. This algorithm is implemented in a comprehensive and open source library called BiVeS. BiVeS helps to identify and characterize changes in computational models and thereby contributes to the documentation of a model’s history. Our work facilitates the reuse and extension of existing models and supports collaborative modelling. Finally, it contributes to better reproducibility of modelling results and to the challenge of model provenance. Availability and implementation: The workflow described in this article is implemented in BiVeS. BiVeS is freely available as source code and binary from sems.uni-rostock.de. The web interface BudHat demonstrates the capabilities of BiVeS at budhat.sems.uni-rostock.de. Contact: martin.scharm@uni-rostock.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26490504

  14. GPU implementation of target and anomaly detection algorithms for remotely sensed hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-08-01

    Automatic target and anomaly detection are considered very important tasks for hyperspectral data exploitation. These techniques are now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly and target detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we describe several new GPU-based implementations of target and anomaly detection algorithms for hyperspectral data exploitation. The parallel algorithms are implemented on latest-generation Tesla C1060 GPU architectures, and quantitatively evaluated using hyperspectral data collected by NASA's AVIRIS system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.
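
    A classic example of the anomaly detectors such GPU work targets is the global RX detector, which scores each pixel by its Mahalanobis distance from the scene's mean spectrum and covariance. The CPU sketch below uses a synthetic cube with two implanted anomalies; the cube size and anomaly strength are assumptions, and the per-pixel linear algebra is exactly the kind of workload that parallelises well on a GPU.

    ```python
    # Global RX anomaly detector: Mahalanobis distance of each pixel spectrum from
    # the scene mean and covariance. Cube dimensions and implanted anomalies are
    # synthetic assumptions; the per-pixel linear algebra is what a GPU parallelises.
    import numpy as np

    rng = np.random.default_rng(0)
    rows, cols, bands = 100, 100, 50
    cube = rng.normal(size=(rows, cols, bands))
    cube[10, 10] += 3.0                                   # implant two anomalous pixels
    cube[60, 42] += 3.0

    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

    diff = X - mu
    rx = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)    # squared Mahalanobis distance
    rx_map = rx.reshape(rows, cols)

    top2 = np.argsort(rx_map, axis=None)[-2:]
    print("two highest-scoring pixels (row, col):",
          [tuple(int(v) for v in np.unravel_index(i, rx_map.shape)) for i in top2])
    ```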

  15. Development of an apnea detection algorithm based on temporal analysis of thoracic respiratory effort signal

    NASA Astrophysics Data System (ADS)

    Dell'Aquila, C. R.; Cañadas, G. E.; Correa, L. S.; Laciar, E.

    2016-04-01

    This work describes the design of an algorithm for detecting apnea episodes based on analysis of the thoracic respiratory effort signal. Inspiration and expiration times and the amplitude range of the respiratory cycle were evaluated. For the range analysis, the standard deviation was computed over temporal windows of the respiratory signal. Performance was validated on 8 records of the Apnea-ECG database, which has annotations of apnea episodes. The results are: sensitivity (Se) 73%, specificity (Sp) 83%. These values could be improved by eliminating artifacts from the signal records.
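
    The windowed range analysis can be sketched directly: compute the standard deviation of the effort signal over fixed temporal windows and flag windows whose amplitude collapses. The synthetic signal, window length and relative threshold below are assumptions rather than the values used in the study.

    ```python
    # Sketch of apnea-episode flagging from a thoracic respiratory effort signal:
    # compute the standard deviation over temporal windows and flag windows whose
    # amplitude range collapses. Window length, threshold and data are assumed.
    import numpy as np

    fs = 10.0                                            # Hz
    t = np.arange(0, 300, 1 / fs)                        # 5 minutes
    effort = np.sin(2 * np.pi * 0.25 * t)                # normal breathing, 15 breaths/min
    effort[(t > 120) & (t < 150)] *= 0.05                # simulated 30 s apnea episode
    effort += 0.05 * np.random.default_rng(0).normal(size=t.size)

    win = int(10 * fs)                                   # 10 s windows
    n_win = len(effort) // win
    stds = effort[: n_win * win].reshape(n_win, win).std(axis=1)

    threshold = 0.2 * np.median(stds)                    # assumed relative threshold
    apnea_windows = np.where(stds < threshold)[0]
    print("apnea suspected in windows starting at [s]:", (apnea_windows * win / fs).tolist())
    ```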

  16. Error analysis of coefficient-based regularized algorithm for density-level detection.

    PubMed

    Chen, Hong; Pan, Zhibin; Li, Luoqing; Tang, Yuanyan

    2013-04-01

    In this letter, we consider a density-level detection (DLD) problem by a coefficient-based classification framework with [Formula: see text]-regularizer and data-dependent hypothesis spaces. Although the data-dependent characteristic of the algorithm provides flexibility and adaptivity for DLD, it leads to difficulty in generalization error analysis. To overcome this difficulty, an error decomposition is introduced from an established classification framework. On the basis of this decomposition, the estimate of the learning rate is obtained by using Rademacher average and stepping-stone techniques. In particular, the estimate is independent of the capacity assumption used in the previous literature.

  17. Laser spot detection-based computer interface system using autoassociative multilayer perceptron with input-to-output mapping-sensitive error back propagation learning algorithm

    NASA Astrophysics Data System (ADS)

    Jeong, Sungmoon; Jung, Chanwoong; Kim, Cheol-Su; Shim, Jae Hoon; Lee, Minho

    2011-08-01

    This paper presents a new computer interface system based on laser spot detection and moving-pattern analysis of the detected laser spots with real-time processing. We propose a systematic method that uses either the frame difference of successive input images or an autoassociative multilayer perceptron (AAMLP) to detect laser spots. The AAMLP is applied only to areas of the input images where the frame difference of successive images is not effective for detecting laser spots. To enhance detection performance, the AAMLP is trained by a new training algorithm that increases the sensitivity of its input-to-output mapping, so that even a small variation in the input features of the laser spot image is successfully indicated. The proposed interface system is also able to track the laser spot and recognize gesture commands. The moving pattern of the laser spot is recognized using a multilayer perceptron. It is experimentally shown that the proposed computer interface system is fast enough for real-time operation with reliable accuracy.
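
    The simpler of the two detection branches, frame differencing, can be sketched in a few lines: threshold the brightness increase between successive frames and take the centroid of the changed pixels as the new spot position. The synthetic frames and the threshold are assumptions, and the AAMLP branch used where frame differencing fails is not shown.

    ```python
    # Frame-difference laser spot detection (the simpler branch of the pipeline):
    # threshold the brightness increase between successive frames and take the
    # centroid of the changed pixels. Synthetic frames and threshold are assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    h, w = 240, 320
    background = rng.uniform(0, 0.2, size=(h, w))

    def frame_with_spot(cy, cx, radius=3, intensity=1.0):
        yy, xx = np.mgrid[0:h, 0:w]
        spot = intensity * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * radius ** 2))
        return np.clip(background + spot + 0.02 * rng.normal(size=(h, w)), 0, 1)

    prev = frame_with_spot(100, 100)
    curr = frame_with_spot(104, 117)                     # the spot has moved

    diff = curr - prev
    mask = diff > 0.3                                    # pixels that got brighter
    ys, xs = np.nonzero(mask)
    if ys.size:
        print("detected spot near (row, col):", round(ys.mean(), 1), round(xs.mean(), 1))
    ```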

  18. Effects of prey quality and predator body size on prey DNA detection success in a centipede predator.

    PubMed

    Eitzinger, B; Unger, E M; Traugott, M; Scheu, S

    2014-08-01

    Predator body size and prey quality are important factors driving prey choice and consumption rates. Both factors might affect prey detection success in PCR-based gut content analysis, potentially resulting in over- or underestimation of feeding rates. Experimental evidence, however, is scarce. We examined how body size and prey quality affect prey DNA detection success in centipede predators. Because metabolic rates increase with body size, we hypothesized that prey DNA detection intervals will be shorter in large predators than in smaller ones. Moreover, we hypothesized that prey detection intervals for high-quality prey, defined by a low carbon-to-nitrogen ratio, will be shorter than for low-quality prey due to faster assimilation. Small, medium and large individuals of the centipedes Lithobius spp. (Lithobiidae, Chilopoda) were fed Collembola and allowed to digest prey for up to 168 h post-feeding. To test our second hypothesis, medium-sized lithobiids were fed either Diptera or Lumbricidae. No significant differences in the 50% prey DNA detection success time intervals for a 272-bp prey DNA fragment were found between the predator size groups, indicating that body size does not affect prey DNA detection success. Post-feeding detection intervals were significantly shorter for Lumbricidae and Diptera than for Collembola prey, apparently supporting the second hypothesis. However, the sensitivity of the diagnostic PCR differed between prey types, and quantitative PCR revealed that the concentration of targeted DNA varied significantly between prey types. This suggests that both DNA concentration and assay sensitivity need to be considered when assessing prey quality effects on prey DNA detection success.

  19. Detection of nasopharyngeal cancer using confocal Raman spectroscopy and genetic algorithm technique

    NASA Astrophysics Data System (ADS)

    Li, Shao-Xin; Chen, Qiu-Yan; Zhang, Yan-Jiao; Liu, Zhi-Ming; Xiong, Hong-Lian; Guo, Zhou-Yi; Mai, Hai-Qiang; Liu, Song-Hao

    2012-12-01

    Raman spectroscopy (RS) and a genetic algorithm (GA) were applied to distinguish nasopharyngeal cancer (NPC) from normal nasopharyngeal tissue. A total of 225 Raman spectra are acquired from 120 tissue sites of 63 nasopharyngeal patients, 56 Raman spectra from normal tissue and 169 Raman spectra from NPC tissue. The GA integrated with linear discriminant analysis (LDA) is developed to differentiate NPC and normal tissue according to spectral variables in the selected regions of 792-805, 867-880, 996-1009, 1086-1099, 1288-1304, 1663-1670, and 1742-1752 cm-1 related to proteins, nucleic acids and lipids of tissue. The GA-LDA algorithms with the leave-one-out cross-validation method provide a sensitivity of 69.2% and specificity of 100%. The results are better than that of principal component analysis which is applied to the same Raman dataset of nasopharyngeal tissue with a sensitivity of 63.3% and specificity of 94.6%. This demonstrates that Raman spectroscopy associated with GA-LDA diagnostic algorithm has enormous potential to detect and diagnose nasopharyngeal cancer.

  20. Detection and mapping vegetation cover based on the Spectral Angle Mapper algorithm using NOAA AVHRR data

    NASA Astrophysics Data System (ADS)

    Yagoub, Houria; Belbachir, Ahmed Hafid; Benabadji, Noureddine

    2014-06-01

    Satellite data from the National Oceanic and Atmospheric Administration (NOAA) have been used for the detection and mapping of vegetation cover in North Africa. The data were acquired at the Analysis and Application of Radiation Laboratory (LAAR) from the Advanced Very High Resolution Radiometer (AVHRR) sensor, with 1 km spatial resolution. The Spectral Angle Mapper (SAM) algorithm has been used for classification in many studies based on high-resolution satellite data. In the present paper, we propose to apply the SAM algorithm to the moderate-resolution NOAA AVHRR sensor data in order to classify the vegetation cover; the study also makes it possible to exploit other classification methods at low resolution. First, the normalized difference vegetation index (NDVI) is extracted from channels 1 and 2 of the AVHRR sensor. To obtain an initial representation of the density and distribution of vegetal formations, a methodology based on the combination of a threshold method and a decision tree is used; this combination is adopted because accurate data on the thresholds that delimit each class are lacking. Second, based on spectral behavior, a vegetation cover map is developed using the SAM algorithm. Finally, with low-resolution satellite images (NOAA AVHRR) and only two channels, it is possible to identify the most dominant cover types in North Africa, such as cork oak forests, other forests, cereal cultivation, steppes and bare soil.
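
    Both computations named in the record have simple closed forms: NDVI = (NIR − RED)/(NIR + RED) from AVHRR channels 1 and 2, and the SAM angle θ = arccos(x·r / (‖x‖‖r‖)) between a pixel spectrum x and a reference r. The toy reflectance values and reference spectra in the sketch below are assumptions, not calibrated AVHRR data.

    ```python
    # NDVI from AVHRR channels 1 (RED) and 2 (NIR), and Spectral Angle Mapper (SAM)
    # classification against reference spectra. Reflectance values and reference
    # spectra below are toy assumptions, not calibrated AVHRR data.
    import numpy as np

    red = np.array([[0.08, 0.25], [0.10, 0.30]])          # channel 1 reflectance
    nir = np.array([[0.40, 0.28], [0.45, 0.32]])          # channel 2 reflectance
    ndvi = (nir - red) / (nir + red)
    print("NDVI:\n", np.round(ndvi, 2))

    def sam_angle(x, r):
        """Spectral angle (radians) between a pixel spectrum x and a reference r."""
        c = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
        return float(np.arccos(np.clip(c, -1.0, 1.0)))

    references = {"forest": np.array([0.06, 0.50]),       # low RED, high NIR
                  "bare soil": np.array([0.30, 0.35])}    # similar RED and NIR
    pixel = np.array([red[0, 0], nir[0, 0]])
    angles = {name: sam_angle(pixel, ref) for name, ref in references.items()}
    print("pixel assigned to:", min(angles, key=angles.get))
    ```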

  1. [A quick algorithm of dynamic spectrum photoelectric pulse wave detection based on LabVIEW].

    PubMed

    Lin, Ling; Li, Na; Li, Gang

    2010-02-01

    Dynamic spectrum (DS) detection is attractive among the numerous noninvasive blood component detection methods because it eliminates the main interference from individual differences and measurement conditions. The DS is a spectrum extracted from the photoelectric pulse wave that is closely related to arterial blood, and it can be used for noninvasive examination of blood component concentrations. The key issues in DS detection are high detection precision and high operation speed. Measurement precision can be improved by applying over-sampling and lock-in amplification to the pick-up of the photoelectric pulse wave in DS detection. In the present paper, the theoretical expression for the over-sampling and lock-in amplification method is deduced first. Then, in order to overcome the large data volume and heavy computation brought by this technique, a quick algorithm based on LabVIEW and a method of using external C code in the pick-up of the photoelectric pulse wave are presented. Experimental verification was conducted in the LabVIEW environment. The results show that the presented method greatly increases the operation speed and largely reduces the required data memory.
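
    The lock-in step has a simple digital form: multiply the oversampled signal by in-phase and quadrature references at the modulation frequency and low-pass (here, average) the products to recover the amplitude and phase of a component buried in noise. The modulation frequency, noise level and record length below are assumptions, and the sketch is in Python rather than the paper's LabVIEW/external-C implementation.

    ```python
    # Digital lock-in amplification of a weak modulated photoelectric signal:
    # multiply the oversampled input by in-phase and quadrature references and
    # average (low-pass) the products to recover amplitude and phase. Frequencies,
    # noise level and record length are assumptions.
    import numpy as np

    fs, f_mod = 100_000.0, 1_000.0                 # oversampling rate, modulation freq [Hz]
    t = np.arange(0, 1.0, 1 / fs)
    amplitude, phase = 0.01, 0.6                   # weak component buried in noise
    signal = amplitude * np.cos(2 * np.pi * f_mod * t + phase)
    noisy = signal + 0.05 * np.random.default_rng(0).normal(size=t.size)

    ref_i = np.cos(2 * np.pi * f_mod * t)          # in-phase reference
    ref_q = np.sin(2 * np.pi * f_mod * t)          # quadrature reference
    I = np.mean(noisy * ref_i)                     # low-pass by averaging
    Q = np.mean(noisy * ref_q)

    print("recovered amplitude:", round(2 * float(np.hypot(I, Q)), 5))
    print("recovered phase    :", round(float(np.arctan2(-Q, I)), 3))
    ```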

  2. A robust active contour edge detection algorithm based on local Gaussian statistical model for oil slick remote sensing image

    NASA Astrophysics Data System (ADS)

    Jing, Yu; Wang, Yaxuan; Liu, Jianxin; Liu, Zhaoxia

    2015-08-01

    Edge detection is a crucial method for locating oil slicks and estimating their quantity when oil spills occur at sea. In this paper, we present a robust active contour edge detection algorithm for oil spill remote sensing images. In the proposed algorithm, we define a local Gaussian data fitting energy term with spatially varying means and variances, and this data fitting energy term is introduced into a global minimization active contour (GMAC) framework. The energy function is minimized quickly by a dual formulation of the weighted total variation norm. The proposed algorithm avoids the existence of local minima, does not require the definition of an initial contour, and is robust to the weak boundaries, high noise and severe intensity inhomogeneity existing in oil slick remote sensing images. Furthermore, the edge detection of the oil slick and the correction of intensity inhomogeneity are achieved simultaneously by the proposed algorithm. The experimental results show the superior performance of the proposed algorithm over state-of-the-art edge detection algorithms. In addition, the proposed algorithm can also deal with the special case of images in which the object and background have the same intensity means but different variances.

  3. A level-crossing based QRS-detection algorithm for wearable ECG sensors.

    PubMed

    Ravanshad, Nassim; Rezaee-Dehsorkh, Hamidreza; Lotfi, Reza; Lian, Yong

    2014-01-01

    In this paper, an asynchronous analog-to-information conversion system is introduced for measuring the RR intervals of electrocardiogram (ECG) signals. The system contains a modified level-crossing analog-to-digital converter and a novel algorithm for detecting the R-peaks from the level-crossing sampled data in a compressed volume of data. Simulated with the MIT-BIH Arrhythmia Database, the proposed system delivers an average detection accuracy of 98.3%, a sensitivity of 98.89%, and a positive predictivity of 99.4%. Synthesized in 0.13 μm CMOS technology with a 1.2 V supply voltage, the overall system consumes 622 nW with a core area of 0.136 mm², which makes it suitable for wearable wireless ECG sensors in body-sensor networks.

  4. Develop algorithms to improve detectability of defects in Sonic IR imaging NDE

    NASA Astrophysics Data System (ADS)

    Obeidat, Omar; Yu, Qiuye; Han, Xiaoyan

    2016-02-01

    Sonic Infrared (IR) technology is relatively new in the NDE family. It is a fast, wide-area imaging method that combines ultrasonic excitation with infrared imaging: the former applies ultrasound energy to induce frictional heating at defects, while the latter captures the IR emission from the target. This technology can detect both surface and subsurface defects, such as cracks and disbonds/delaminations, in various materials, including metals/metal alloys and composites. However, certain defects may produce only a very small IR signature that is buried in noise or heating patterns. In such cases, effectively extracting the defect signals becomes critical for identifying the defects. In this paper, we present algorithms developed to improve the detectability of defects in Sonic IR.

  5. Computer simulation and evaluation of edge detection algorithms and their application to automatic path selection

    NASA Technical Reports Server (NTRS)

    Longendorfer, B. A.

    1976-01-01

    The construction of an autonomous roving vehicle requires the development of complex data-acquisition and processing systems, which determine the path along which the vehicle travels. Thus, a vehicle must possess algorithms which can (1) reliably detect obstacles by processing sensor data, (2) maintain a constantly updated model of its surroundings, and (3) direct its immediate actions to further a long range plan. The first function consisted of obstacle recognition. Obstacles may be identified by the use of edge detection techniques. Therefore, the Kalman Filter was implemented as part of a large scale computer simulation of the Mars Rover. The second function consisted of modeling the environment. The obstacle must be reconstructed from its edges, and the vast amount of data must be organized in a readily retrievable form. Therefore, a Terrain Modeller was developed which assembled and maintained a rectangular grid map of the planet. The third function consisted of directing the vehicle's actions.

  6. Research on the Filtering Algorithm in Speed and Position Detection of Maglev Trains

    PubMed Central

    Dai, Chunhui; Long, Zhiqiang; Xie, Yunde; Xue, Song

    2011-01-01

    This paper briefly introduces the traction system of a permanent magnet electrodynamic suspension (EDS) train. The synchronous traction mode based on long stators and a track cable is described, and a speed and position detection system is recommended. The detection system is installed on board and is used as the feedback end. Restricted by the maglev train’s structure, the permanent magnet EDS train uses a non-contact method to detect its position. Because of vibration and the track joints, the position signal sent by the position sensor is often aberrant and noisy. To solve this problem, a linear discrete track-differentiator filtering algorithm is proposed. The filtering characteristics of the track-differentiator (TD) and of a group of track-differentiators are analyzed, and four TDs in series are used in the signal processing unit. The results show that the track-differentiator achieves a good filtering effect and allows the traction system to run normally. PMID:22164012

  7. Advanced Algorithms and High-Performance Testbed for Large-Scale Site Characterization and Subsurface Target Detecting Using Airborne Ground Penetrating SAR

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Collier, James B.; Citak, Ari

    1997-01-01

    A team comprising the US Army Corps of Engineers (Omaha District and Engineering and Support Center, Huntsville), the Jet Propulsion Laboratory (JPL), Stanford Research Institute (SRI), and Montgomery Watson is currently planning and conducting the largest-ever survey at the former Buckley Field (60,000 acres) in Colorado, using SRI's airborne, ground-penetrating Synthetic Aperture Radar (SAR). The purpose of this survey is the detection of surface and subsurface unexploded ordnance (UXO) and, in a broader sense, site characterization to identify contaminated as well as clear areas. In preparation for such a large-scale survey, JPL has been developing advanced algorithms and a high-performance testbed for processing the massive amount of SAR data expected from this site. Two key requirements of this project are the accuracy (in terms of UXO detection) and the speed of SAR data processing. The first key feature of this testbed is a large degree of automation and a minimal need for human perception in the processing, in order to achieve an acceptable processing rate of several hundred acres per day. For accurate UXO detection, novel algorithms have been developed and implemented. These algorithms analyze dual-polarized (HH and VV) SAR data; they are based on the correlation of HH and VV SAR data and involve a rather large set of parameters for accurate detection of UXO. For each specific site, this set of parameters can be optimized by using ground truth data (i.e., known surface and subsurface UXOs). In this paper, we discuss these algorithms and their successful application to the detection of surface and subsurface anti-tank mines using a data set from Yuma Proving Ground, AZ, acquired by the SRI SAR.
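
    The central operation described, correlating the HH and VV channels to separate compact man-made scatterers from natural clutter, can be sketched as a moving-window correlation coefficient between the two magnitude images. The window size, synthetic imagery and detection threshold below are assumptions; the rather large parameter set of the actual algorithms is not reproduced.

    ```python
    # Local correlation of dual-polarised SAR channels: compute a moving-window
    # correlation coefficient between HH and VV magnitude images, on the premise
    # that compact man-made scatterers correlate across polarisations more strongly
    # than natural clutter. Window size, imagery and threshold are assumptions.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(0)
    size, win = 200, 9
    hh = rng.rayleigh(1.0, size=(size, size))            # speckle-like clutter
    vv = rng.rayleigh(1.0, size=(size, size))
    for r, c in [(50, 60), (140, 120)]:                  # two point-like targets
        hh[r - 2:r + 3, c - 2:c + 3] += 6.0              # same response in HH and VV
        vv[r - 2:r + 3, c - 2:c + 3] += 6.0

    def local_corr(a, b, win):
        ma, mb = uniform_filter(a, win), uniform_filter(b, win)
        cov = uniform_filter(a * b, win) - ma * mb
        va = uniform_filter(a * a, win) - ma ** 2
        vb = uniform_filter(b * b, win) - mb ** 2
        return cov / np.sqrt(np.clip(va * vb, 1e-12, None))

    rho = local_corr(hh, vv, win)
    peak = np.unravel_index(np.argmax(rho), rho.shape)
    print("peak correlation", round(float(rho.max()), 2), "at", peak,
          "| pixels with rho > 0.8:", int((rho > 0.8).sum()))
    ```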

  8. A prototype hail detection algorithm and hail climatology developed with the advanced microwave sounding unit (AMSU)

    NASA Astrophysics Data System (ADS)

    Ferraro, Ralph; Beauchamp, James; Cecil, Daniel; Heymsfield, Gerald

    2015-09-01

    In previous studies published in the open literature, a strong relationship between the occurrence of hail and microwave brightness temperatures (primarily at 37 and 85 GHz) was documented. These studies were performed with the Nimbus-7 Scanning Multichannel Microwave Radiometer (SMMR), the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) and, most recently, the Aqua Advanced Microwave Scanning Radiometer (AMSR-E) sensor. This led to climatologies of hail frequency from TMI and AMSR-E; however, limitations included the geographical domain of the TMI sensor (35°S to 35°N) and the overpass time of the Aqua satellite (1:30 am/pm local time), both of which reduce the accuracy of mapping hail events over the global domain and the full diurnal cycle. Nonetheless, these studies presented exciting new applications for passive microwave sensors. NOAA and EUMETSAT have been operating the Advanced Microwave Sounding Unit (AMSU-A and -B) and the Microwave Humidity Sounder (MHS) on several operational satellites since 1998: NOAA-15 through NOAA-19, and MetOp-A and -B. With multiple satellites in operation since 2000, the AMSU/MHS sensors provide near-global coverage every 4 h, thus offering much greater spatial and temporal sampling than TRMM or AMSR-E. With similar observation frequencies near 30 and 85 GHz, one at 157 GHz, and an additional three in the 183 GHz water vapor band, the potential exists to detect strong convection associated with severe storms on a more comprehensive time and space scale. In this study, we develop a prototype AMSU-based hail detection algorithm through the use of collocated satellite and surface hail reports over the continental US for a 10-year period (2000-2009). Compared with the surface observations, the algorithm detects approximately 40% of hail occurrences. The simple threshold algorithm is then used to generate a hail climatology, stratified in several ways, based on all available AMSU observations during 2000-2011.

  9. A Prototype Hail Detection Algorithm and Hail Climatology Developed with the Advanced Microwave Sounding Unit (AMSU)

    NASA Technical Reports Server (NTRS)

    Ferraro, Ralph; Beauchamp, James; Cecil, Dan; Heymsfeld, Gerald

    2015-01-01

    In previous studies published in the open literature, a strong relationship between the occurrence of hail and microwave brightness temperatures (primarily at 37 and 85 GHz) was documented. These studies were performed with the Nimbus-7 SMMR, the TRMM Microwave Imager (TMI) and, most recently, the Aqua AMSR-E sensor. This led to climatologies of hail frequency from TMI and AMSR-E; however, limitations include the geographical domain of the TMI sensor (35°S to 35°N) and the overpass time of the Aqua satellite (1:30 am/pm local time), both of which reduce the accuracy of mapping hail events over the global domain and the full diurnal cycle. Nonetheless, these studies presented exciting new applications for passive microwave sensors. Since 1998, NOAA and EUMETSAT have been operating the AMSU-A/B and the MHS on several operational satellites: NOAA-15