A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.
Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang
2017-01-01
Experimentally and clinically collected time series are often contaminated with significant confounding noise, yielding short, noisy sequences. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel, robust statistical method for change point detection in noisy biological time sequences. Our method is a significant improvement over traditional change point detection methods, which examine a potential anomaly only at a single time point. In contrast, our method considers all suspected anomaly points jointly, through the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method against three existing methods using three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements from a fetal sheep model of human labor. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
Reference point detection for camera-based fingerprint image based on wavelet transformation.
Khalil, Mohammed S
2015-04-30
Fingerprint recognition systems essentially require core-point detection prior to fingerprint matching. The core-point is used as a reference point to align the fingerprint with a template database. When processing a larger fingerprint database, it is necessary to consider the core-point during feature extraction. Numerous core-point detection methods are available and have been reported in the literature. However, these methods are generally applied to scanner-based images. Hence, this paper explores the feasibility of applying a core-point detection method to a fingerprint image obtained with a camera phone. The proposed method utilizes a discrete wavelet transform to extract the ridge information from a color image. Its performance is evaluated in terms of accuracy and consistency, both calculated automatically by comparing the method's output with the defined core points. The method is tested on two data sets, collected in controlled and uncontrolled environments from 13 subjects. In the controlled environment, it achieved a detection rate of 82.98%; in the uncontrolled environment, it yielded a detection rate of 78.21%. The proposed method gives promising results on the collected image database and outperforms an existing method.
A multi points ultrasonic detection method for material flow of belt conveyor
NASA Astrophysics Data System (ADS)
Zhang, Li; He, Rongjun
2018-03-01
Single-point ultrasonic ranging produces large detection errors when used for material flow detection on belt conveyors carrying large or unevenly distributed coal. To address this, a material flow detection method for belt conveyors is designed based on multi-point ultrasonic ranging. The method estimates the approximate cross-sectional area of the material by locating multiple points on the surfaces of the material and the belt, and then obtains the material flow from the running speed of the belt conveyor. Test results show that the method has a smaller detection error than single-point ultrasonic ranging under conditions of large, unevenly distributed coal.
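The flow computation described above can be sketched in a few lines: integrate the height profile across the belt to get a cross-sectional area, then multiply by belt speed. The sensor height, spacing, and belt speed below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: estimate material flow on a belt conveyor from
# multi-point ultrasonic ranging.

def cross_section_area(heights, spacing):
    """Trapezoidal approximation of the material cross-section (m^2)
    from material heights (m) sampled at evenly spaced points across
    the belt (spacing in m)."""
    area = 0.0
    for h0, h1 in zip(heights, heights[1:]):
        area += 0.5 * (h0 + h1) * spacing
    return area

def material_flow(heights, spacing, belt_speed):
    """Volumetric flow (m^3/s) = cross-sectional area x belt speed."""
    return cross_section_area(heights, spacing) * belt_speed

# Heights derived from ultrasonic ranges: h = sensor_height - range.
sensor_height = 1.0
ranges = [1.0, 0.8, 0.6, 0.8, 1.0]   # five ranging points across the belt
heights = [sensor_height - r for r in ranges]
flow = material_flow(heights, spacing=0.25, belt_speed=2.0)
```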
NASA Astrophysics Data System (ADS)
Dong, Xiabin; Huang, Xinsheng; Zheng, Yongbin; Bai, Shengjian; Xu, Wanying
2014-07-01
Infrared moving target detection is an important part of infrared technology. We introduce a novel method for detecting small moving infrared targets against complicated backgrounds, based on tracking interest points. First, Difference of Gaussians (DoG) filters are used to detect a group of interest points (including the moving targets). Second, a small-target tracking method inspired by the Human Visual System (HVS) tracks these interest points over several frames, yielding the correlations between interest points in the first and last frames. Last, a new clustering method, named R-means, is proposed to divide the interest points into two groups according to these correlations: target points and background points. The target-to-clutter ratio (TCR) and receiver operating characteristic (ROC) curves are computed experimentally to compare the proposed method with five other sophisticated methods. The results show that the proposed method discriminates targets from clutter better, and has a lower false alarm rate, than the existing moving target detection methods.
A Robust False Matching Points Detection Method for Remote Sensing Image Registration
NASA Astrophysics Data System (ADS)
Shan, X. J.; Tang, P.
2015-04-01
Given the influences of illumination, imaging angle, and geometric distortion, among others, false matching points still occur in all image registration algorithms. False matching point detection is therefore an important step in remote sensing image registration. Random Sample Consensus (RANSAC) is typically used to detect false matching points, but it cannot detect all of them in some remote sensing images. This paper therefore proposes a robust false matching point detection method based on the K-nearest-neighbour (K-NN) graph (KGD) to obtain robust, high-accuracy results. The KGD method starts by constructing a K-NN graph in one image, linking each matching point to its K nearest matching points. A local transformation model for each matching point is then estimated from its K nearest matching points, and the error of each matching point is computed from its transformation model. Last, the L matching points with the largest errors are identified as false matching points and removed. This process iterates until all errors are smaller than a given threshold. In addition, the KGD method can be combined with other methods, such as RANSAC. Several remote sensing images with different resolutions and terrains are used in the experiments to evaluate the KGD method, RANSAC + KGD, RANSAC, and Graph Transformation Matching (GTM). The experimental results demonstrate the superior performance of the KGD and RANSAC + KGD methods.
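A minimal sketch of the KGD iteration, assuming a pure translation as the local transformation model (the paper fits a richer local transform from the K nearest matches; the data layout and removal count are assumptions for illustration):

```python
import math

def kgd_filter(src, dst, k=3, n_remove=1, tol=0.5):
    """Iteratively remove matches whose displacement disagrees with a
    local model fitted from their k nearest neighbours. Here the local
    model is simply the average displacement of the neighbours (a pure
    translation); src and dst are lists of corresponding (x, y) points.
    Returns the indices of the matches judged correct."""
    keep = list(range(len(src)))
    while len(keep) > k + 1:
        errs = []
        for i in keep:
            # k nearest surviving matches to src[i], excluding i itself
            nb = sorted((j for j in keep if j != i),
                        key=lambda j: math.dist(src[i], src[j]))[:k]
            mx = sum(dst[j][0] - src[j][0] for j in nb) / k
            my = sum(dst[j][1] - src[j][1] for j in nb) / k
            pred = (src[i][0] + mx, src[i][1] + my)
            errs.append((math.dist(pred, dst[i]), i))
        errs.sort(reverse=True)
        if errs[0][0] <= tol:       # all residuals below threshold: done
            break
        worst = {j for _, j in errs[:n_remove]}
        keep = [i for i in keep if i not in worst]
    return keep
```

For a set of matches that all share one translation except a single gross outlier, the outlier's displacement disagrees with its neighbours' average and is removed in the first iteration.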
Eubanks-Carter, Catherine; Gorman, Bernard S; Muran, J Christopher
2012-01-01
Analysis of change points in psychotherapy process could increase our understanding of mechanisms of change. In particular, naturalistic change point detection methods that identify turning points or breakpoints in time series data could enhance our ability to identify and study alliance ruptures and resolutions. This paper presents four categories of statistical methods for detecting change points in psychotherapy process: criterion-based methods, control chart methods, partitioning methods, and regression methods. Each method's utility for identifying shifts in the alliance is illustrated using a case example from the Beth Israel Psychotherapy Research program. Advantages and disadvantages of the various methods are discussed.
Applications of 3D-EDGE Detection for ALS Point Cloud
NASA Astrophysics Data System (ADS)
Ni, H.; Lin, X. G.; Zhang, J. X.
2017-09-01
Edge detection has been one of the major issues in the field of remote sensing and photogrammetry. With the fast development of laser scanning sensor technology, dense point clouds have become increasingly common. Precise 3D edges can be detected from these point clouds, and many edge and feature-line extraction methods have been proposed. Among them is an easy-to-use 3D edge detection method, AGPN (Analyzing Geometric Properties of Neighborhoods), which detects edges by analyzing the geometric properties of a query point's neighbourhood. AGPN detects two kinds of 3D edges, boundary elements and fold edges, and has many applications. This paper presents three applications of AGPN: 3D line segment extraction, ground point filtering, and ground breakline extraction. Experiments show that the AGPN method gives a straightforward solution to these applications.
Change Point Detection in Correlation Networks
NASA Astrophysics Data System (ADS)
Barnett, Ian; Onnela, Jukka-Pekka
2016-01-01
Many systems of interacting elements can be conceptualized as networks, where network nodes represent the elements and network ties represent interactions between the elements. In systems where the underlying network evolves, it is useful to determine the points in time where the network structure changes significantly as these may correspond to functional change points. We propose a method for detecting change points in correlation networks that, unlike previous change point detection methods designed for time series data, requires minimal distributional assumptions. We investigate the difficulty of change point detection near the boundaries of the time series in correlation networks and study the power of our method and competing methods through simulation. We also show the generalizable nature of the method by applying it to stock price data as well as fMRI data.
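The core idea, scoring each candidate split by how much the two segments' correlation matrices differ, can be sketched as follows. The Frobenius distance and the fixed minimum segment length are simplifying assumptions; the paper's actual statistic and its significance test differ.

```python
import math
from statistics import mean

def corr_matrix(X):
    """Pearson correlation matrix of the columns of X (rows = time points).
    Assumes every column varies within the segment."""
    cols = list(zip(*X))
    def r(a, b):
        ma, mb = mean(a), mean(b)
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                        sum((y - mb) ** 2 for y in b))
        return num / den
    return [[r(a, b) for b in cols] for a in cols]

def best_change_point(X, min_seg=4):
    """Candidate change point maximising the Frobenius distance between
    the correlation matrices of the two segments it separates."""
    best_t, best_d = None, -1.0
    for t in range(min_seg, len(X) - min_seg + 1):
        C1, C2 = corr_matrix(X[:t]), corr_matrix(X[t:])
        d = math.sqrt(sum((a - b) ** 2
                          for r1, r2 in zip(C1, C2)
                          for a, b in zip(r1, r2)))
        if d > best_d:
            best_t, best_d = t, d
    return best_t
```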
Optimizing Probability of Detection Point Estimate Demonstration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are used to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. This paper discusses optimizing POD demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization provides an acceptable probability of passing the demonstration (PPD) and an acceptable probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
Probabilistic model for quick detection of dissimilar binary images
NASA Astrophysics Data System (ADS)
Mustafa, Adnan A. Y.
2015-09-01
We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching, which predicts the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected with high confidence by matching only a few points between two images: 11 points for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a given successful detection rate or confidence depends on the amount of similarity between the compared images: as similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image-size invariant, so images of any size produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image-size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
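The 11-point figure is consistent with a simple binomial argument: for two unrelated random binary images, each sampled pixel agrees with probability about 0.5, so eleven agreeing samples occur by chance with probability about 0.5**11 < 0.001. A hypothetical sketch of such a sampling test (not the paper's exact model):

```python
import random

def likely_distinct(img_a, img_b, n_points=11, rng=None):
    """Declare two equal-sized binary images distinct-dissimilar if any
    of n randomly sampled pixel positions disagree. A single mismatch
    proves the images differ; n matching samples leave only a ~0.5**n
    chance that the images were unrelated. Images are lists of rows."""
    rng = rng or random.Random(0)          # deterministic for the demo
    h, w = len(img_a), len(img_a[0])
    for _ in range(n_points):
        y, x = rng.randrange(h), rng.randrange(w)
        if img_a[y][x] != img_b[y][x]:
            return True                    # mismatch found: dissimilar
    return False                           # all sampled points agree
```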
Simultaneous Detection and Tracking of Pedestrian from Panoramic Laser Scanning Data
NASA Astrophysics Data System (ADS)
Xiao, Wen; Vallet, Bruno; Schindler, Konrad; Paparoditis, Nicolas
2016-06-01
Pedestrian traffic flow estimation is essential for public place design and construction planning. Traditional data collection by human investigation is tedious, inefficient and expensive. Panoramic laser scanners, e.g. the Velodyne HDL-64E, which scan their surroundings repetitively at high frequency, have been increasingly used for 3D object tracking. In this paper, a simultaneous detection and tracking (SDAT) method is proposed for precise and automatic pedestrian trajectory recovery. First, the dynamic environment is detected using two different methods, nearest-point and max-distance. Then, all points on moving objects are transferred into a space-time (x, y, t) coordinate system, where pedestrian detection and tracking amounts to assigning the points belonging to pedestrians to continuous trajectories in space-time. We formulate the point assignment task as an energy function incorporating point evidence, trajectory number, pedestrian shape and motion: a low-energy trajectory explains the point observations well and has a plausible trend and length. The method inherently filters out points from other moving objects and false detections. The energy function is solved by a two-step optimization: tracklet detection within a short temporal window, and global tracklet association over the whole time span. Results demonstrate that the proposed method automatically recovers pedestrian trajectories with accurate positions and few false detections and mismatches.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-01-01
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors; registration also requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan; to analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to some reconstructable feature in the scene are used as feature points. The new method is applied to two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the building's bricks are automatically extracted to serve as virtual points, and baselines connecting virtual and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
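Because baselines are intra-scan distances, they can be compared across epochs with no registration at all. A minimal sketch, assuming labelled feature points (the labels, tolerance, and data layout are illustrative assumptions):

```python
import math
from itertools import combinations

def baseline_changes(scan1_pts, scan2_pts, tol=0.01):
    """Compare baselines (distances between pairs of feature points
    within each scan) across two epochs. Points are {label: (x, y, z)}
    dicts; only labels present in both scans are used. Returns the
    baselines whose length changed by more than tol (same units as
    the coordinates), with the signed length change."""
    common = sorted(set(scan1_pts) & set(scan2_pts))
    changed = []
    for a, b in combinations(common, 2):
        d1 = math.dist(scan1_pts[a], scan1_pts[b])
        d2 = math.dist(scan2_pts[a], scan2_pts[b])
        if abs(d2 - d1) > tol:
            changed.append((a, b, d2 - d1))
    return changed
```

Note that a rigid motion of the whole scene between scans (here, a translation) leaves all baselines unchanged; only genuine deformation shows up.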
THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES
Song, Chi; Min, Xiaoyi; Zhang, Heping
2016-01-01
Chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, and may be associated with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and disease. CNVs can be called by detecting change-points in the mean of sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of available CNV calling methods are single-sample based; only a few multiple-sample methods have been proposed, using scan statistics that are computationally intensive and designed for detecting either common or rare change-points. In this paper, we propose a novel multiple-sample method that adaptively combines the scan statistics of the screening and ranking algorithm (SaRa); it is computationally efficient and detects both common and rare change-points. We prove that asymptotically this method finds the true change-points with almost certainty, and show in theory that multiple-sample methods are superior to single-sample methods when shared change-points are of interest. We also report extensive simulation studies examining the performance of the proposed method. Finally, using our method and two competing approaches, we detect CNVs in data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information, while its ability to detect CNVs is comparable or better.
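The SaRa local diagnostic underlying the method is simple to state: compare the means of two flanking windows at every position and keep prominent local maxima. A single-sample sketch is below (the paper's contribution is the adaptive combination of these statistics across many samples, which this sketch omits; the bandwidth and threshold are assumptions):

```python
def sara_screen(y, h, threshold):
    """Screening step of SaRa for one sequence: the local diagnostic
    D(t) = |mean(y[t:t+h]) - mean(y[t-h:t])| is computed in O(n) with
    running window sums, and local maxima of D above `threshold` are
    reported as candidate change-points."""
    n = len(y)
    D = [0.0] * n
    left, right = sum(y[:h]), sum(y[h:2 * h])
    for t in range(h, n - h):
        D[t] = abs(right - left) / h
        left += y[t] - y[t - h]        # slide left window to y[t-h+1:t+1]
        right += y[t + h] - y[t]       # slide right window to y[t+1:t+h+1]
    # ranking: keep only prominent local maxima of the diagnostic
    return [t for t in range(h, n - h)
            if D[t] > threshold and D[t] == max(D[max(0, t - h):t + h + 1])]
```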
LiDAR Change Detection Using Building Models
NASA Astrophysics Data System (ADS)
Kim, Angela M.; Runyon, Scott C.; Jalobeanu, Andre; Esterline, Chelsea H.; Kruse, Fred A.
2014-06-01
Terrestrial LiDAR scans of building models collected with a FARO Focus3D and a RIEGL VZ-400 were used to investigate point-to-point and model-to-model LiDAR change detection. The LiDAR data were scaled, decimated, and georegistered to mimic real-world airborne collects. Two physical building models were used to explore various aspects of the change detection process. The first was a 1:250-scale representation of the Naval Postgraduate School campus in Monterey, CA, constructed from Lego blocks and scanned in a laboratory setting with both the FARO and RIEGL. The second, at 1:8 scale, consisted of large cardboard boxes placed outdoors and scanned from the rooftops of adjacent buildings with the RIEGL. A point-to-point change detection scheme was applied directly to the point-cloud datasets; in the model-to-model scheme, changes were detected by comparing Digital Surface Models (DSMs). The use of physical models allowed analysis of the effects of changes in scanner and scanning geometry, and of the performance of the change detection methods on different types of changes, including building collapse or subsidence, construction, and shifts in location. Results indicate that at low false-alarm rates, the point-to-point method slightly outperforms the model-to-model method and is less sensitive to misregistration errors in the data. Best results are obtained when the baseline and change datasets are collected using the same LiDAR system and collection geometry.
Comparison of methods for accurate end-point detection of potentiometric titrations
NASA Astrophysics Data System (ADS)
Villela, R. L. A.; Borges, P. P.; Vyskočil, L.
2015-01-01
Detection of the end point in potentiometric titrations has wide application in experiments that demand very low measurement uncertainties, mainly for certifying reference materials. Simulations of experimental coulometric titration data and the consequent error analysis of the end-point values were conducted using a programming code. These simulations revealed that the Levenberg-Marquardt method is in general more accurate than the traditional second-derivative technique currently used for end-point detection in potentiometric titrations. The performance of the methods is compared and presented in this paper.
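For reference, the traditional second-derivative technique that the paper benchmarks against can be sketched as below. The simulated sigmoidal titration curve is an illustrative assumption; the Levenberg-Marquardt model fit found to be more accurate is not shown.

```python
import math

def second_derivative_endpoint(volumes, potentials):
    """Classical end-point estimate: the end point is taken where the
    second derivative of potential vs. titrant volume crosses zero,
    i.e. at the inflection of the titration curve. Assumes evenly
    spaced volumes; returns None if no crossing is found."""
    d2 = [potentials[i - 1] - 2 * potentials[i] + potentials[i + 1]
          for i in range(1, len(potentials) - 1)]
    for i in range(len(d2) - 1):
        if d2[i] > 0 >= d2[i + 1]:
            # linear interpolation of the zero crossing between grid points
            frac = d2[i] / (d2[i] - d2[i + 1])
            return volumes[i + 1] + frac * (volumes[i + 2] - volumes[i + 1])
    return None

# simulated titration curve: sigmoidal potential with end point at 10.0 mL
volumes = [7.05 + 0.3 * i for i in range(21)]
potentials = [math.tanh((v - 10.0) / 0.5) for v in volumes]
endpoint = second_derivative_endpoint(volumes, potentials)
```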
Point counts are a common method for sampling avian distribution and abundance. Though methods for estimating detection probabilities are available, many analyses use raw counts and do not correct for detectability. We use a removal model of detection within an N-mixture approa...
Vision Based Obstacle Detection in Uav Imaging
NASA Astrophysics Data System (ADS)
Badrloo, S.; Varshosaz, M.
2017-08-01
Detecting and preventing collisions with obstacles is crucial in UAV navigation and control. Most common obstacle detection techniques are currently sensor-based, but small UAVs cannot carry obstacle detection sensors such as radar; therefore, vision-based methods are considered, which can be divided into stereo-based and mono-based techniques. Mono-based methods fall into two groups: foreground-background separation, and brain-inspired methods. Brain-inspired methods are highly efficient in obstacle detection; hence, this research aims to detect obstacles using brain-inspired techniques, which exploit the apparent enlargement of an obstacle as it is approached. Recent research in this field has concentrated on matching SIFT points, along with the SIFT size-ratio factor and the area ratio of convex hulls in two consecutive frames, to detect obstacles. That method cannot distinguish between near and far obstacles or handle obstacles in complex environments, and it is sensitive to wrongly matched points. To solve these problems, this research calculates the dist-ratio of matched points, and each point is then examined to distinguish between far and close obstacles. The results demonstrate the high efficiency of the proposed method in complex environments.
Assessing the accuracy of TDR-based water leak detection system
NASA Astrophysics Data System (ADS)
Fatemi Aghda, S. M.; GanjaliPour, K.; Nabiollahi, K.
2018-03-01
The use of TDR systems to detect leakage locations in underground pipes has developed in recent years. In such a system, a bi-wire installed in parallel with the underground pipe serves as a TDR sensor. This approach largely overcomes the limitations of the traditional acoustic leak positioning method. TDR-based leak detection is relatively accurate when the TDR sensor is in contact with water at just one point, and researchers have been working to improve its accuracy in recent years. In this study, the ability of the TDR method was evaluated when multiple leakage points appear simultaneously. For this purpose, several laboratory tests were conducted in which leakage points were simulated by putting the TDR sensor in contact with water at several points, gradually increasing the number and size of the simulated leakage points. The results showed that as the number and size of leakage points increase, so does the error of the TDR-based leak detection system. Based on these results, the authors developed a method to improve the accuracy of TDR-based leak detection systems: they defined a few reference points on the TDR sensor, created by increasing the distance between the sensor's two conductors, which makes them easily identifiable in the TDR waveform. The tests were then repeated using the TDR sensor with reference points, and the authors developed an equation that uses the reference points to calculate the exact distance of a leakage point. A comparison between the results of both tests (with and without reference points) showed that the developed method and equation significantly improve the accuracy of positioning the leakage points.
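The reference-point correction can be thought of as piecewise-linear interpolation between waveform positions of known physical distance. A hypothetical sketch of that idea, not the authors' exact equation:

```python
from bisect import bisect_right

def leak_distance(apparent, ref_apparent, ref_physical):
    """Convert the apparent position of a leak reflection on the TDR
    waveform into physical distance along the sensor by piecewise-linear
    interpolation between reference points of known physical position.
    `ref_apparent` must be sorted; positions outside the reference range
    are linearly extrapolated from the nearest segment."""
    i = bisect_right(ref_apparent, apparent) - 1
    i = max(0, min(i, len(ref_apparent) - 2))   # clamp to a valid segment
    a0, a1 = ref_apparent[i], ref_apparent[i + 1]
    p0, p1 = ref_physical[i], ref_physical[i + 1]
    return p0 + (apparent - a0) * (p1 - p0) / (a1 - a0)
```

With reference points every few metres, waveform distortions between two references can only displace the estimate within that segment, which is the accuracy gain described above.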
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.
2012-04-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
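As context for the conventional baseline the abstract mentions, a classical CUSUM detector with known pre- and post-change Gaussian means can be sketched as follows. The paper's contribution is estimating the post-change distribution online, which this sketch deliberately does not do; the parameters below are illustrative.

```python
def cusum_detect(xs, mu0, mu1, sigma, h):
    """Page's CUSUM sequential change detection for a shift in mean of
    i.i.d. Gaussian data from mu0 to mu1 (both known). Accumulates the
    log-likelihood ratio, clipped at zero, and alarms when it exceeds
    the threshold h. Returns the index of the first alarm, or None."""
    s = 0.0
    for i, x in enumerate(xs):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        s = max(0.0, s + llr)
        if s > h:
            return i
    return None
```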
NASA Astrophysics Data System (ADS)
Lee, Daeho; Lee, Seohyung
2017-11-01
We propose an image stitching method that can remove ghost effects and realign the structure misalignments that occur in common image stitching methods. To reduce the artifacts caused by different parallaxes, an optimal seam pair is selected by comparing the cross correlations from multiple seams detected by variable cost weights. Along the optimal seam pair, a histogram of oriented gradients is calculated, and feature points for matching are detected. The homography is refined using the matching points, and the remaining misalignment is eliminated using the propagation of deformation vectors calculated from matching points. In multiband blending, the overlapping regions are determined from a distance between the matching points to remove overlapping artifacts. The experimental results show that the proposed method more robustly eliminates misalignments and overlapping artifacts than the existing method that uses single seam detection and gradient features.
Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
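The removal (time-depletion) method mentioned above has a particularly simple two-interval form; the sketch below is the textbook estimator, not the specific models in the paper.

```python
def removal_estimate(n1, n2):
    """Two-interval removal estimator: during a point count, n1 birds
    are first detected in the first time interval and n2 new birds in
    the second. Assuming a constant per-interval detection probability
    p, E[n2] = E[n1] * (1 - p), giving p_hat = 1 - n2/n1 and an
    abundance estimate N_hat = n1 / p_hat. Requires n1 > n2."""
    if n2 >= n1:
        raise ValueError("removal estimator needs n1 > n2")
    p_hat = 1 - n2 / n1
    n_hat = n1 / p_hat
    return p_hat, n_hat
```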
Automated exploitation of sky polarization imagery.
Sadjadi, Firooz A; Chun, Cornell S L
2018-03-10
We propose an automated method for detecting neutral points in the sunlit sky. Until now, detecting these singularities has been done manually. Results are presented that document the application of this method on a limited number of polarimetric images of the sky captured with a camera and rotating polarizer. The results are significant because a method for automatically detecting the neutral points may aid in the determination of the solar position when the sun is obscured and may have applications in meteorology and pollution detection and characterization.
Hongyi Xu; Barbic, Jernej
2017-01-01
We present an algorithm for fast continuous collision detection between points and signed distance fields, and demonstrate how to robustly use it for 6-DoF haptic rendering of contact between objects with complex geometry. Continuous collision detection is often needed in computer animation, haptics, and virtual reality applications, but has so far only been investigated for polygon (triangular) geometry representations. We demonstrate how to robustly and continuously detect intersections between points and level sets of the signed distance field. We suggest using an octree subdivision of the distance field for fast traversal of distance field cells. We also give a method to resolve continuous collisions between point clouds organized into a tree hierarchy and a signed distance field, enabling rendering of contact between rigid objects with complex geometry. We investigate and compare two 6-DoF haptic rendering methods now applicable to point-versus-distance field contact for the first time: continuous integration of penalty forces, and a constraint-based method. An experimental comparison to discrete collision detection demonstrates that the continuous method is more robust and can correctly resolve collisions even under high velocities and during complex contact.
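One standard way to continuously detect the first contact between a moving point and a signed distance field is conservative advancement, which exploits the fact that |grad phi| <= 1 for a true SDF: the point can always safely travel a distance phi(x) before any sign change. The sketch below shows that query only, without the paper's octree traversal or 6-DoF haptic force computation, and is an illustration of the general technique rather than the paper's algorithm.

```python
import math

def first_contact(phi, a, b, eps=1e-6):
    """Conservative advancement of a point moving linearly from a to b
    through a signed distance field phi (negative inside the object).
    Repeatedly steps the segment parameter t by phi(x)/|b - a|, which
    can never skip over the surface. Returns the first t in [0, 1]
    with contact, or None if the sweep stays outside."""
    seg_len = math.dist(a, b)
    if seg_len == 0.0:
        return 0.0 if phi(a) <= 0.0 else None
    t = 0.0
    while t <= 1.0:
        x = tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        d = phi(x)
        if d <= eps:
            return t
        t += d / seg_len        # safe step: surface is at least d away
    return None

# unit sphere at the origin as a simple signed distance field
sphere = lambda p: math.dist(p, (0.0, 0.0, 0.0)) - 1.0
t_hit = first_contact(sphere, (3.0, 0.0, 0.0), (-3.0, 0.0, 0.0))
```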
Huang, Chenxi; Huang, Hongxin; Toyoda, Haruyoshi; Inoue, Takashi; Liu, Huafeng
2012-11-19
We propose a new method for realizing high-spatial-resolution detection of singularity points in optical vortex beams. The method uses a Shack-Hartmann wavefront sensor (SHWS) to record a Hartmanngram. A map of evaluation values related to phase slope is then calculated from the Hartmanngram. The position of an optical vortex is determined by comparing the map with reference maps that are calculated from numerically created spiral phases having various positions. Optical experiments were carried out to verify the method. We displayed various spiral phase distribution patterns on a phase-only spatial light modulator and measured the resulting singularity point using the proposed method. The results showed good linearity in detecting the position of singularity points. The RMS error of the measured position of the singularity point was approximately 0.056, in units normalized to the lens size of the lenslet array used in the SHWS.
A source-attractor approach to network detection of radiation sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Qishi; Barry, M. L..; Grieme, M.
Radiation source detection using a network of detectors is an active field of research for homeland security and defense applications. We propose the Source-attractor Radiation Detection (SRD) method to aggregate measurements from a network of detectors for radiation source detection. SRD models a potential radiation source as a magnet-like attractor that pulls in pre-computed virtual points from the detector locations. A detection decision is made if a sufficient level of attraction, quantified by the increase in the clustering of the shifted virtual points, is observed. Compared with traditional methods, SRD has the following advantages: i) it does not require an accurate estimate of the source location from limited and noise-corrupted sensor readings, unlike localization-based methods, and ii) its virtual point shifting and clustering calculation involve simple arithmetic operations scaling with the number of detectors, avoiding the high computational complexity of grid-based likelihood estimation methods. We evaluate its detection performance using canonical datasets from the Domestic Nuclear Detection Office's (DNDO) Intelligence Radiation Sensors Systems (IRSS) tests. SRD achieves both a lower false alarm rate and a lower false negative rate compared to three existing algorithms for network source detection.
NASA Astrophysics Data System (ADS)
De Ridder, Simon; Vandermarliere, Benjamin; Ryckebusch, Jan
2016-11-01
A framework based on generalized hierarchical random graphs (GHRGs) for the detection of change points in the structure of temporal networks has recently been developed by Peel and Clauset (2015 Proc. 29th AAAI Conf. on Artificial Intelligence). We build on this methodology and extend it to also include the versatile stochastic block models (SBMs) as a parametric family for reconstructing the empirical networks. We apply five different techniques for change point detection to prototypical temporal networks, both empirical and synthetic. We find that none of the considered methods can consistently outperform the others when it comes to detecting and locating the expected change points in empirical temporal networks. With respect to the precision and recall of the detected change points, we find that the method based on a degree-corrected SBM has better recall properties than the other dedicated methods, especially for sparse networks and smaller sliding time window widths.
Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations
NASA Astrophysics Data System (ADS)
Mirloo, Mahsa; Ebrahimnezhad, Hosein
2018-03-01
In this paper, a novel method is proposed to detect 3D object salient points that is robust to isometric variations and stable against scaling and noise. Salient points can be used as representative points of object protrusion parts to improve object matching and retrieval algorithms. The proposed algorithm starts by determining the first salient point of the model based on the average geodesic distance of several random points. Then, according to the previously chosen salient points, a new point is added to the set in each iteration. With every added salient point, the decision function is updated; this ensures that the next point is not extracted from the same protrusion part, so that a representative point is drawn from every protrusion part. The method is stable against model variations under isometric transformations, scaling, and noise of varying strength, owing to the use of a feature robust to isometric variations and to the relations between the salient points being taken into account. In addition, the number of points used in the averaging process is reduced, which leads to lower computational complexity in comparison with other salient point detection algorithms.
Distribution majorization of corner points by reinforcement learning for moving object detection
NASA Astrophysics Data System (ADS)
Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang
2018-04-01
Corner points play an important role in moving object detection, especially in the case of a freely moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, information provided by preceding and subsequent frames can also be exploited. We utilize this information to focus on more valuable areas and ignore less valuable ones. The proposed algorithm is based on reinforcement learning, which regards the detection of corner points as a Markov process. In the Markov model, the video to be processed is regarded as the environment, the selections of blocks for each corner point are regarded as actions, and the detection performance is regarded as the state. Corner points are assigned to blocks separated from the original whole image. Experimentally, we select a conventional method that uses matching and the Random Sample Consensus algorithm to obtain objects as the main framework, and apply our algorithm to improve its results. A comparison between the conventional method and the same method augmented with our algorithm shows that our algorithm reduces false detections by 70%.
Road traffic sign detection and classification from mobile LiDAR point clouds
NASA Astrophysics Data System (ADS)
Weng, Shengxia; Li, Jonathan; Chen, Yiping; Wang, Cheng
2016-03-01
Traffic signs are important roadway assets that provide valuable information about the road, helping drivers behave more safely and easily. With the development of mobile mapping systems that can efficiently acquire dense point clouds along the road, automated detection and recognition of road assets has become an important research issue. This paper deals with the detection and classification of traffic signs in outdoor environments using mobile light detection and ranging (LiDAR) and inertial navigation technologies. The proposed method contains two main steps. It starts with an initial detection of traffic signs based on the intensity attributes of the point clouds, as traffic signs are always painted with highly reflective materials. Then, traffic signs are classified based on their geometric shape and the pairwise 3D shape context. Results and performance analyses are provided to show the effectiveness and limits of the proposed method. The experimental results demonstrate the feasibility and effectiveness of the proposed method in detecting and classifying traffic signs from mobile LiDAR point clouds.
NASA Astrophysics Data System (ADS)
Yang, C. H.; Kenduiywo, B. K.; Soergel, U.
2016-06-01
Persistent Scatterer Interferometry (PSI) is a technique to detect a network of extracted persistent scatterer (PS) points which feature temporal phase stability and strong radar signal throughout a time series of SAR images. The small surface deformations at such PS points are then estimated. PSI works particularly well in monitoring human settlements because the regular substructures of man-made objects give rise to a large number of PS points. If such structures and/or substructures substantially alter or even vanish due to a big change such as construction, their PS points are discarded without further exploration during the standard PSI procedure. Such rejected points are called big change (BC) points. On the other hand, incoherent change detection (ICD) relies on local comparison of multi-temporal images (e.g., image difference, image ratio) to highlight scene modifications at a coarse rather than detailed level. However, image noise inevitably degrades ICD accuracy. We propose a change detection approach based on PSI that synergizes the benefits of PSI and ICD. PS points are extracted by the PSI procedure. A local change index is introduced to quantify the probability of a big change for each point. We propose an automatic thresholding method based on this change index to extract BC points along with a clue to the period in which they emerge. In the end, PS and BC points are integrated into a change detection image. Our method is tested at a site around the north of Berlin main station, where steady, demolished, and erected building substructures are successfully detected. The results are consistent with ground truth derived from a time series of aerial images provided by Google Earth. In addition, we apply our technique to traffic infrastructure, business district, and sports playground monitoring.
A NEW METHOD FOR FINDING POINT SOURCES IN HIGH-ENERGY NEUTRINO DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Ke; Miller, M. Coleman
The IceCube collaboration has reported the first detection of high-energy astrophysical neutrinos, including ∼50 high-energy starting events, but no individual sources have been identified. It is therefore important to develop the most sensitive and efficient possible algorithms to identify the point sources of these neutrinos. The most popular current method works by exploring a dense grid of possible directions to individual sources, and identifying the single direction with the maximum probability of having produced multiple detected neutrinos. This method has numerous strengths, but it is computationally intensive, and because it focuses on the single best location for a point source, additional point sources are not included in the evidence. We propose a new maximum likelihood method that uses the angular separations between all pairs of neutrinos in the data. Unlike existing autocorrelation methods for this type of analysis, which also use angular separations between neutrino pairs, our method incorporates information about the point-spread function and can identify individual point sources. We find that if the angular resolution is a few degrees or better, then this approach reduces both false positive and false negative errors compared to the current method, and is also more computationally efficient up to, potentially, hundreds of thousands of detected neutrinos.
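The basic quantity in such a pair-based statistic, the angular separation between every pair of detected neutrinos, is straightforward to compute. A minimal sketch (the full method additionally weights pairs by the point-spread function, which is omitted here):

```python
import math
from itertools import combinations

def ang_sep(a, b):
    """Great-circle separation (radians) between two sky directions
    given as (ra, dec) in radians."""
    ra1, dec1 = a
    ra2, dec2 = b
    # spherical law of cosines, clamped for numerical safety
    c = (math.sin(dec1) * math.sin(dec2)
         + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.acos(max(-1.0, min(1.0, c)))

def close_pairs(events, psf_rad):
    """Count event pairs closer than the PSF scale: an excess over the
    isotropic expectation hints at the presence of point sources."""
    return sum(1 for a, b in combinations(events, 2)
               if ang_sep(a, b) < psf_rad)
```

With three tightly clustered events and one distant one, only the three intra-cluster pairs fall within a 0.1 rad PSF scale.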
Modeling seasonal detection patterns for burrowing owl surveys
Quresh S. Latif; Kathleen D. Fleming; Cameron Barrows; John T. Rotenberry
2012-01-01
To guide monitoring of burrowing owls (Athene cunicularia) in the Coachella Valley, California, USA, we analyzed survey-method-specific seasonal variation in detectability. Point-based call-broadcast surveys yielded high early season detectability that then declined through time, whereas detectability on driving surveys increased through the season. Point surveys...
Vision System for Coarsely Estimating Motion Parameters for Unknown Fast Moving Objects in Space
Chen, Min; Hashimoto, Koichi
2017-01-01
Motivated by biological interest in analyzing the navigation behaviors of flying animals, we attempt to build a system for measuring their motion states. To this end, we build a vision system that detects unknown fast-moving objects within a given space and calculates their motion parameters, represented by positions and poses. We propose a novel method to detect reliable interest points in images of moving objects, which can hardly be detected by general-purpose interest point detectors. 3D points reconstructed from these interest points are then grouped and maintained for each detected object according to a careful schedule that accounts for appearance and perspective changes. In the estimation step, a method is introduced to adapt the robust estimation procedure used for dense point sets to sparse sets, reducing the potential risk of greatly biased estimation. Experiments are conducted on real scenes, showing the capability of the system to detect multiple unknown moving objects and estimate their positions and poses. PMID:29206189
Method and apparatus for automatically detecting patterns in digital point-ordered signals
Brudnoy, David M.
1998-01-01
The present invention is a method and system for detecting a physical feature of a test piece by detecting a pattern in a signal representing data from inspection of the test piece. The pattern is detected by automated additive decomposition of a digital point-ordered signal which represents the data. The present invention can properly handle a non-periodic signal. A physical parameter of the test piece is measured. A digital point-ordered signal representative of the measured physical parameter is generated. The digital point-ordered signal is decomposed into a baseline signal, a background noise signal, and a peaks/troughs signal. The peaks/troughs from the peaks/troughs signal are located and peaks/troughs information indicating the physical feature of the test piece is output.
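A minimal sketch of such an additive decomposition: a moving-median baseline, a noise level from the median absolute deviation of the residual, and peaks/troughs as residual samples beyond a threshold. The window size and threshold factor are illustrative assumptions, not the patent's actual decomposition rules:

```python
import statistics

def decompose(signal, window=11, k=3.0):
    """Split a point-ordered signal into baseline, noise, and peaks/troughs.
    Baseline: moving median (robust to non-periodic drift); noise level:
    median absolute deviation of the residual; peaks/troughs: residual
    samples whose magnitude exceeds k times the noise level."""
    h = window // 2
    baseline = [statistics.median(signal[max(0, i - h): i + h + 1])
                for i in range(len(signal))]
    residual = [s - b for s, b in zip(signal, baseline)]
    mad = statistics.median(abs(r) for r in residual) or 1e-12
    peaks = [(i, r) for i, r in enumerate(residual) if abs(r) > k * mad]
    return baseline, residual, peaks
```

On a flat signal with a single spike, only the spike's index is reported as a peak.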
Staircase-scene-based nonuniformity correction in aerial point target detection systems.
Huo, Lijun; Zhou, Dabiao; Wang, Dejiang; Liu, Rang; He, Bin
2016-09-01
Focal-plane arrays (FPAs) often suffer from heavy fixed-pattern noise, which severely degrades the detection rate and increases false alarms in airborne point target detection systems. Thus, high-precision nonuniformity correction is an essential preprocessing step. In this paper, a new nonuniformity correction method based on a staircase scene is proposed. This correction method can compensate for the nonlinear response of the detector and calibrate the entire optical system with computational efficiency and implementation simplicity. A proof-of-concept point target detection system is then established with a long-wave Sofradir FPA. Finally, the local standard deviation of the corrected image and the signal-to-clutter ratio of the Airy disk of a Boeing B738 are measured to evaluate the performance of the proposed nonuniformity correction method. Our experimental results demonstrate that the proposed correction method achieves high-quality corrections.
A cascade method for TFT-LCD defect detection
NASA Astrophysics Data System (ADS)
Yi, Songsong; Wu, Xiaojun; Yu, Zhiyang; Mo, Zhuoya
2017-07-01
In this paper, we propose a novel cascade detection algorithm that focuses on point and line defects on TFT-LCD panels. In the first step of the algorithm, we use the gray-level difference of sub-images to segment the abnormal area. The second step is based on the phase-only transform (POT), which corresponds to the Discrete Fourier Transform (DFT) normalized by its magnitude; it can remove regularities such as texture and noise. After that, we improve the method of setting regions of interest (ROI) using edge segmentation and polar transformation. The algorithm has outstanding performance in both computation speed and accuracy, and can handle most defect types, including dark points, light points, and dark lines.
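The phase-only transform at the heart of the second step can be sketched in one dimension: take the DFT, normalize every bin to unit magnitude, and invert. Regular (periodic) content is flattened while isolated point defects survive. A toy illustration with an O(N²) DFT (the real method operates on 2-D sub-images):

```python
import cmath

def phase_only_transform(x, eps=1e-9):
    """1-D phase-only transform: DFT, normalize each bin to unit
    magnitude, inverse DFT, return magnitudes. Regular texture is
    flattened; isolated irregularities such as point defects stand out."""
    n = len(x)
    F = [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
         for k in range(n)]
    P = [f / (abs(f) + eps) for f in F]   # keep phase, discard magnitude
    return [abs(sum(P[k] * cmath.exp(2j * cmath.pi * k * t / n)
                    for k in range(n))) / n
            for t in range(n)]
```

Applied to a pure sinusoid (regular texture) with one added spike, the response peaks sharply at the spike's position.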
Can Detectability Analysis Improve the Utility of Point Counts for Temperate Forest Raptors?
Temperate forest breeding raptors are poorly represented in typical point count surveys because these birds are cryptic and typically breed at low densities. In recent years, many new methods for estimating detectability during point counts have been developed, including distanc...
NASA Astrophysics Data System (ADS)
Nemoto, Mitsutaka; Nomura, Yukihiro; Hanaoka, Shohei; Masutani, Yoshitaka; Yoshikawa, Takeharu; Hayashi, Naoto; Yoshioka, Naoki; Ohtomo, Kuni
Anatomical point landmarks, among the most primitive forms of anatomical knowledge, are useful for medical image understanding. In this study, we propose a detection method for anatomical point landmarks based on appearance models, which capture gray-level statistical variations at the landmarks and their surrounding areas. The models are built from the results of Principal Component Analysis (PCA) of sample data sets. In addition, we employ a generative learning method that transforms the ROIs of the sample data. In this study, we evaluated our method on 24 data sets of body trunk CT images and obtained an average sensitivity of 95.8 ± 7.3% across 28 landmarks.
Dew inspired breathing-based detection of genetic point mutation visualized by naked eye
Xie, Liping; Wang, Tongzhou; Huang, Tianqi; Hou, Wei; Huang, Guoliang; Du, Yanan
2014-01-01
A novel label-free method based on breathing-induced vapor condensation was developed for detection of genetic point mutation. The dew-inspired detection was realized by integration of target-induced DNA ligation with rolling circle amplification (RCA). The vapor condensation induced by breathing transduced the RCA-amplified variances in DNA contents into visible contrast. The image could be recorded by a cell phone for further or even remote analysis. This green assay offers a naked-eye-reading method potentially applied for point-of-care liver cancer diagnosis in resource-limited regions. PMID:25199907
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2002-01-01
Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
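A minimal sketch of the removal estimator for the 3/2/5-minute design described above, using a grid-search maximum likelihood over a per-minute detection probability p rather than the authors' exact numerical procedure. The counts in the usage line are illustrative, not the park data:

```python
import math

def removal_mle(counts, intervals, grid=2000):
    """Removal-model estimate of the per-minute detection probability p.
    counts[i] = birds first detected in interval i; intervals[i] = that
    interval's length in minutes. Maximizes the conditional multinomial
    likelihood of the counts over a grid of candidate p values."""
    total = sum(intervals)
    def loglik(p):
        q = 1.0 - p
        ll, elapsed = 0.0, 0.0
        P = 1.0 - q ** total                     # overall detectability
        for c, t in zip(counts, intervals):
            pi = q ** elapsed * (1.0 - q ** t)   # first detected in this interval
            ll += c * math.log(pi / P)
            elapsed += t
        return ll
    ps = [(i + 0.5) / grid for i in range(grid)]
    return max(ps, key=loglik)

# e.g. a 10-min count split into 3/2/5-min intervals (illustrative counts)
p_hat = removal_mle([676, 180, 144], [3, 2, 5])
```

These example counts match the expected proportions for p = 0.3, so the estimator recovers approximately that value.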
Sensitive detection of point mutation by electrochemiluminescence and DNA ligase-based assay
NASA Astrophysics Data System (ADS)
Zhou, Huijuan; Wu, Baoyan
2008-12-01
The technology of single-base mutation detection plays an increasingly important role in the diagnosis and prognosis of genetic diseases. Here we report a new method for the analysis of point mutations in genomic DNA that integrates an allele-specific oligonucleotide ligation assay (OLA) with a magnetic-bead-based electrochemiluminescence (ECL) detection scheme. In this assay, the tris(bipyridine) ruthenium (TBR) labeled probe and the biotinylated probe are designed to be perfectly complementary to the mutant target, so that a ligation can be generated between the two probes by Taq DNA ligase in the presence of the mutant target. If there is an allele mismatch, the ligation does not take place. The ligation products are then captured onto streptavidin-coated paramagnetic beads and detected by measuring the ECL signal of the TBR label. Results showed that the new method achieved a detection limit as low as 10 fmol and was successfully applied to the identification of point mutations in codon 273 of the TP53 oncogene from ASTC-α-1, PANC-1, and normal cell lines. In summary, this method provides a sensitive, cost-effective, and easy-to-operate approach for point mutation detection.
Contour-Based Corner Detection and Classification by Using Mean Projection Transform
Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein
2014-01-01
Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images. PMID:24590354
Robust curb detection with fusion of 3D-Lidar and camera data.
Tan, Jun; Li, Jian; An, Xiangjing; He, Hangen
2014-05-21
Curb detection is an essential component of Autonomous Land Vehicles (ALV), especially important for safe driving in urban environments. In this paper, we propose a fusion-based curb detection method through exploiting 3D-Lidar and camera data. More specifically, we first fuse the sparse 3D-Lidar points and high-resolution camera images together to recover a dense depth image of the captured scene. Based on the recovered dense depth image, we propose a filter-based method to estimate the normal direction within the image. Then, by using the multi-scale normal patterns based on the curb's geometric property, curb point features fitting the patterns are detected in the normal image row by row. After that, we construct a Markov Chain to model the consistency of curb points which utilizes the continuous property of the curb, and thus the optimal curb path which links the curb points together can be efficiently estimated by dynamic programming. Finally, we perform post-processing operations to filter the outliers, parameterize the curbs and give the confidence scores on the detected curbs. Extensive evaluations clearly show that our proposed method can detect curbs with strong robustness at real-time speed for both static and dynamic scenes.
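The consistency step described above, linking per-row curb candidates through a Markov model and solving for the optimal path by dynamic programming, can be sketched as follows. The candidate scores and the jump penalty are illustrative assumptions:

```python
def best_curb_path(rows, jump_penalty=1.0):
    """Dynamic-programming (Viterbi-style) search for the most consistent
    curb path. rows[i] is a list of (column, score) curb-point candidates
    in image row i; adjacent rows are linked with a penalty on horizontal
    jumps, encoding the continuity of the curb."""
    # best[i][j] = (accumulated score, backpointer) for candidate j in row i
    best = [[(s, -1) for (_, s) in rows[0]]]
    for i in range(1, len(rows)):
        cur = []
        for col, score in rows[i]:
            choices = [(best[i - 1][k][0] - jump_penalty * abs(col - pcol), k)
                       for k, (pcol, _) in enumerate(rows[i - 1])]
            val, back = max(choices)
            cur.append((val + score, back))
        best.append(cur)
    # trace back from the best final candidate
    j = max(range(len(best[-1])), key=lambda k: best[-1][k][0])
    path = []
    for i in range(len(rows) - 1, -1, -1):
        path.append(rows[i][j][0])
        j = best[i][j][1]
    return list(reversed(path))
```

With two competing candidate chains per row, the smoother, higher-scoring chain wins even when an individual outlier candidate scores well.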
Pendleton, G.W.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation in detection probabilities and lack of independence among sample points can bias estimates and measures of precision. All of these factors should be con-sidered when using point count methods.
Vanishing points detection using combination of fast Hough transform and deep learning
NASA Astrophysics Data System (ADS)
Sheshkus, Alexander; Ingacheva, Anastasia; Nikolaev, Dmitry
2018-04-01
In this paper we propose a novel method for vanishing point detection based on a convolutional neural network (CNN) approach and the fast Hough transform algorithm. We show how to define a fast Hough transform neural network layer and how to use it to increase the usability of the neural network approach to the vanishing point detection task. Our algorithm consists of a CNN with a sequence of convolutional and fast Hough transform layers. We build an estimator for the distribution of possible vanishing points in the image; this distribution can be used to find candidate vanishing points. We provide experimental results from tests of the suggested method on images collected from videos of road trips. Our approach shows stable results on test images with different projective distortions and noise. The described approach can be efficiently implemented for mobile GPUs and CPUs.
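As a purely geometric baseline for what such an accumulator estimates (not the CNN/fast-Hough layers themselves), a vanishing point can be approximated from pairwise line intersections. A minimal sketch, averaging intersections rather than finding a Hough-space peak:

```python
from itertools import combinations

def intersect(l1, l2):
    """Intersection of two infinite lines, each given as ((x1,y1), (x2,y2))."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-12:
        return None  # parallel lines
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def vanishing_point(lines):
    """Estimate a vanishing point as the mean of pairwise line
    intersections -- a crude stand-in for peak-finding in a Hough-style
    accumulator over candidate vanishing points."""
    pts = [p for l1, l2 in combinations(lines, 2)
           if (p := intersect(l1, l2)) is not None]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

Three line segments that all pass through (5, 3) yield exactly that point.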
Research on infrared dim-point target detection and tracking under sea-sky-line complex background
NASA Astrophysics Data System (ADS)
Dong, Yu-xing; Li, Yan; Zhang, Hai-bo
2011-08-01
Target detection and tracking technology in infrared imagery is an important part of modern military defense systems. Detection and recognition of infrared dim-point targets under complex backgrounds is a difficult and challenging research topic of significant strategic value. The main objects detected by carrier-borne infrared vigilance systems are sea-skimming aircraft and missiles. Because such a vigilance system has a wide field of view, the target usually lies within sea clutter, which greatly complicates detection and recognition. Traditional point target detection algorithms, such as adaptive background prediction, work well when the background has a dispersion-decreasing structure. But when the background contains large gray-level gradients, such as the sea-sky line or sea waves, these local areas produce a higher false-alarm rate and the results are unsatisfactory. Because a dim-point target has no obvious geometric or texture features, in our view its detection is, mathematically, a problem of singular function analysis, and from an image-processing perspective the key problem is identifying isolated singularities in the image. In essence, dim-point target detection is the separation of target and background according to their different singularity characteristics. Images from infrared sensors are usually contaminated by various kinds of noise, caused by the complicated background or by the sensor itself, and this noise can affect target detection and tracking. The purpose of preprocessing is therefore to reduce the effect of noise, raise the SNR of the image, and increase the contrast between target and background.
According to the characteristics of low sea-skimming infrared small targets, a median filter is used to eliminate noise and improve the signal-to-noise ratio; then a multi-point, multi-layer vertical Sobel algorithm is used to detect the sea-sky line, so that sea and sky can be segmented in the image. Finally, a centroid tracking method is used to capture and trace the target. This method has been successfully used to track targets under complex sea-sky backgrounds.
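The preprocessing and horizon-detection pipeline above can be sketched on a toy image: a horizontal median filter suppresses point noise, the row of maximum vertical gradient locates the sea-sky line, and an intensity-weighted centroid tracks the target. The 3-tap median and per-row gradient here are simplifications of the multi-point vertical Sobel step:

```python
import statistics

def detect_horizon_row(img):
    """Find the sea-sky line as the row boundary with the largest mean
    vertical gradient, after a 3-tap horizontal median filter that
    suppresses isolated point targets and noise."""
    h, w = len(img), len(img[0])
    filt = [[statistics.median(row[max(0, j - 1): j + 2]) for j in range(w)]
            for row in img]
    grad = [sum(abs(filt[i + 1][j] - filt[i][j]) for j in range(w)) / w
            for i in range(h - 1)]
    return max(range(h - 1), key=lambda i: grad[i])

def centroid(img, thresh):
    """Intensity-weighted centroid of above-threshold pixels -- the
    simple tracker used to follow the detected point target."""
    pts = [(i, j, v) for i, row in enumerate(img)
           for j, v in enumerate(row) if v > thresh]
    m = sum(v for _, _, v in pts)
    return (sum(i * v for i, _, v in pts) / m,
            sum(j * v for _, j, v in pts) / m)
```

On a 10×10 image with dark sky over bright sea and one bright target pixel, the horizon is found at the sky/sea boundary and the centroid lands on the target.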
NASA Astrophysics Data System (ADS)
Ge, Xuming
2017-08-01
The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.
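Once semantic keypoints from the two clouds are matched, the rigid alignment itself has a closed form. A minimal 2-D sketch (the paper works in 3-D via the modified 4PCS, so this illustrates only the final alignment step; in 2-D the least-squares rotation angle comes directly from the cross and dot sums of the centered point sets):

```python
import math

def rigid_align_2d(src, dst):
    """Least-squares rotation + translation mapping matched 2-D keypoints
    src[i] -> dst[i]. Center both sets, recover the rotation angle from
    accumulated cross/dot terms, then solve for the translation."""
    n = len(src)
    csx = sum(x for x, _ in src) / n; csy = sum(y for _, y in src) / n
    cdx = sum(x for x, _ in dst) / n; cdy = sum(y for _, y in dst) / n
    num = den = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        num += ax * by - ay * bx   # cross terms
        den += ax * bx + ay * by   # dot terms
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)
```

For keypoints rotated by 90 degrees and shifted by (2, 3), the function recovers exactly that transform.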
Micro Ring Grating Spectrometer with Adjustable Aperture
NASA Technical Reports Server (NTRS)
Park, Yeonjoon (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor); Choi, Sang H. (Inventor)
2012-01-01
A spectrometer includes a micro-ring grating device having coaxially-aligned ring gratings for diffracting incident light onto a target focal point, a detection device for detecting light intensity, one or more actuators, and an adjustable aperture device defining a circular aperture. The aperture circumscribes a target focal point, and directs a light to the detection device. The aperture device is selectively adjustable using the actuators to select a portion of a frequency band for transmission to the detection device. A method of detecting intensity of a selected band of incident light includes directing incident light onto coaxially-aligned ring gratings of a micro-ring grating device, and diffracting the selected band onto a target focal point using the ring gratings. The method includes using an actuator to adjust an aperture device and pass a selected portion of the frequency band to a detection device for measuring the intensity of the selected portion.
Small target detection using objectness and saliency
NASA Astrophysics Data System (ADS)
Zhang, Naiwen; Xiao, Yang; Fang, Zhiwen; Yang, Jian; Wang, Li; Li, Tao
2017-10-01
We are motivated by the need for a generic object detection algorithm that achieves high recall for small targets in complex scenes with acceptable computational efficiency. We propose a novel object detection algorithm with high localization quality at acceptable computational cost. First, we obtain the objectness map as in BING [1] and use non-maximum suppression (NMS) to keep the top N points. Then, the k-means algorithm is used to cluster them into K classes according to their locations, and the center points of the K classes are taken as seed points. For each seed point, an object potential region is extracted. Finally, a fast salient object detection algorithm [2] is applied to the object potential regions to highlight object-like pixels, and a series of efficient post-processing operations are proposed to locate the targets. Our method runs at 5 FPS on 1000×1000 images and significantly outperforms previous methods on small targets in cluttered backgrounds.
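The seed-point step can be sketched with a minimal Lloyd's k-means over candidate point locations (an illustration only; the function name and all values are assumptions, not the paper's code):

```python
import numpy as np

def kmeans_seed_points(points, k, iters=20, seed=0):
    """Cluster 2-D candidate locations into k classes and return the
    class centers, which serve as seed points for region proposals."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers

# Top-N objectness peaks concentrated around two image regions.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal((100, 100), 5, (50, 2)),
                 rng.normal((400, 300), 5, (50, 2))])
seeds = kmeans_seed_points(pts, k=2)
print(np.round(seeds))  # two seed points, near (100,100) and (400,300)
```

Each returned center would then anchor an object potential region for the saliency stage.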
Text vectorization based on character recognition and character stroke modeling
NASA Astrophysics Data System (ADS)
Fan, Zhigang; Zhou, Bingfeng; Tse, Francis; Mu, Yadong; He, Tao
2014-03-01
In this paper, a text vectorization method is proposed using OCR (Optical Character Recognition) and character stroke modeling. This is based on the observation that for a particular character, its font glyphs may have different shapes but often share the same stroke structures. Like many other methods, the proposed algorithm contains two procedures, dominant point determination and data fitting: the first partitions the outlines into segments, and the second fits a curve to each segment. In the proposed method, the dominant points are classified as "major" (specifying stroke structures) and "minor" (specifying serif shapes). A set of rules (parameters) is determined offline, specifying for each character the number of major and minor dominant points, and for each dominant point the detection and fitting parameters (projection directions, boundary conditions and smoothness). For minor points, multiple sets of parameters can be used for different fonts. During operation, OCR is performed and the parameters associated with the recognized character are selected. Both major and minor dominant points are detected via a maximization process as specified by the parameter set. For minor points, an additional step can be performed to test competing hypotheses and detect degenerate cases.
Isothermal amplification detection of nucleic acids by a double-nicked beacon.
Shi, Chao; Zhou, Meiling; Pan, Mei; Zhong, Guilin; Ma, Cuiping
2016-03-01
Isothermal and rapid amplification detection of nucleic acids is an important technology in environmental monitoring, foodborne pathogen detection, and point-of-care clinical diagnostics. Here we have developed a novel method of isothermal signal amplification for single-stranded DNA (ssDNA) detection. The ssDNA target could be used as an initiator, coupled with a double-nicked molecular beacon, to originate amplification cycles, achieving cascade signal amplification. In addition, the method showed good specificity and strong anti-jamming capability. Overall, it is a one-pot and isothermal strand displacement amplification method without the requirement of a stepwise procedure, which greatly simplifies the experimental procedure and decreases the probability of contamination of samples. With its advantages, the method would be very useful to detect nucleic acids in point-of-care or field use.
Robust Curb Detection with Fusion of 3D-Lidar and Camera Data
Tan, Jun; Li, Jian; An, Xiangjing; He, Hangen
2014-01-01
Curb detection is an essential component of Autonomous Land Vehicles (ALV), especially important for safe driving in urban environments. In this paper, we propose a fusion-based curb detection method through exploiting 3D-Lidar and camera data. More specifically, we first fuse the sparse 3D-Lidar points and high-resolution camera images together to recover a dense depth image of the captured scene. Based on the recovered dense depth image, we propose a filter-based method to estimate the normal direction within the image. Then, by using the multi-scale normal patterns based on the curb's geometric property, curb point features fitting the patterns are detected in the normal image row by row. After that, we construct a Markov Chain to model the consistency of curb points which utilizes the continuous property of the curb, and thus the optimal curb path which links the curb points together can be efficiently estimated by dynamic programming. Finally, we perform post-processing operations to filter the outliers, parameterize the curbs and give the confidence scores on the detected curbs. Extensive evaluations clearly show that our proposed method can detect curbs with strong robustness at real-time speed for both static and dynamic scenes. PMID:24854364
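The curb-path step, which models the consistency of curb points with a Markov chain and solves for the optimal path by dynamic programming, can be sketched as a Viterbi-style recursion (a hedged illustration; the score matrix and jump penalty are invented for the example):

```python
import numpy as np

def link_curb_path(scores, jump_penalty=1.0):
    """Viterbi-style DP: pick one column per row, maximizing the sum
    of per-row curb scores minus |column jump| penalties between
    consecutive rows (the continuity prior of the Markov chain)."""
    n_rows, n_cols = scores.shape
    dp = scores[0].copy()
    back = np.zeros((n_rows, n_cols), dtype=int)
    cols = np.arange(n_cols)
    for r in range(1, n_rows):
        # trans[j, i] = score of arriving at column j from previous column i
        trans = dp[None, :] - jump_penalty * np.abs(cols[:, None] - cols[None, :])
        back[r] = trans.argmax(axis=1)
        dp = scores[r] + trans.max(axis=1)
    # Backtrack the optimal path from the best final column.
    path = [int(dp.argmax())]
    for r in range(n_rows - 1, 0, -1):
        path.append(int(back[r][path[-1]]))
    return path[::-1]

# A curb drifting slowly to the right, plus one spurious strong response.
S = np.zeros((5, 8))
for r, c in enumerate([2, 2, 3, 3, 4]):
    S[r, c] = 5.0
S[2, 7] = 6.0  # outlier far from the curb
print(link_curb_path(S))  # continuity suppresses the outlier: [2, 2, 3, 3, 4]
```

The continuity penalty is what lets a weaker but consistent chain of curb points beat an isolated strong false detection.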
High-order optical vortex position detection using a Shack-Hartmann wavefront sensor.
Luo, Jia; Huang, Hongxin; Matsui, Yoshinori; Toyoda, Haruyoshi; Inoue, Takashi; Bai, Jian
2015-04-06
Optical vortex (OV) beams have null-intensity singular points, and the intensities in the region surrounding the singular point are quite low. This low intensity region influences the position detection accuracy of phase singular point, especially for high-order OV beam. In this paper, we propose a new method for solving this problem, called the phase-slope-combining correlation matching method. A Shack-Hartmann wavefront sensor (SH-WFS) is used to measure phase slope vectors at lenslet positions of the SH-WFS. Several phase slope vectors are combined into one to reduce the influence of low-intensity regions around the singular point, and the combined phase slope vectors are used to determine the OV position with the aid of correlation matching with a pre-calculated database. Experimental results showed that the proposed method works with high accuracy, even when detecting an OV beam with a topological charge larger than six. The estimated precision was about 0.15 in units of lenslet size when detecting an OV beam with a topological charge of up to 20.
3D change detection at street level using mobile laser scanning point clouds and terrestrial images
NASA Astrophysics Data System (ADS)
Qin, Rongjun; Gruen, Armin
2014-04-01
Automatic change detection and geo-database updating in the urban environment are difficult tasks. There has been much research on detecting changes with satellite and aerial images, but studies have rarely been performed at the street level, which is complex in its 3D geometry. Contemporary geo-databases include 3D street-level objects, which demand frequent data updating. Terrestrial images provide rich texture information for change detection, but change detection with terrestrial images from different epochs sometimes faces problems with illumination changes, perspective distortions and unreliable 3D geometry caused by the limited performance of automatic image matchers, while mobile laser scanning (MLS) data acquired from different epochs provide accurate 3D geometry for change detection, but are very expensive to acquire periodically. This paper proposes a new method for change detection at street level using a combination of MLS point clouds and terrestrial images: the accurate but expensive MLS data acquired at an early epoch serve as the reference, and terrestrial images or photogrammetric images captured by an image-based mobile mapping system (MMS) at a later epoch are used to detect the geometric changes between epochs. The method automatically marks the possible changes in each view, which provides a cost-efficient means of frequent data updating. The methodology is divided into several steps. In the first step, the point clouds are recorded by the MLS system and processed, with the data cleaned and classified by semi-automatic means. In the second step, terrestrial or mobile mapping images are taken at a later epoch and registered to the point cloud, and the point clouds are then projected onto each image by a weighted-window-based z-buffering method for view-dependent 2D triangulation.
In the next step, stereo pairs of the terrestrial images are rectified and re-projected between each other to check the geometrical consistency between point clouds and stereo images. Finally, an over-segmentation based graph cut optimization is carried out, taking into account the color, depth and class information to compute the changed area in the image space. The proposed method is invariant to light changes, robust to small co-registration errors between images and point clouds, and can be applied straightforwardly to 3D polyhedral models. This method can be used for 3D street data updating, city infrastructure management and damage monitoring in complex urban scenes.
Exact extraction method for road rutting laser lines
NASA Astrophysics Data System (ADS)
Hong, Zhiming
2018-02-01
This paper discusses the importance of asphalt pavement rutting detection for pavement maintenance and administration, presents the shortcomings of existing rutting detection methods, and proposes a new rutting line-laser extraction method based on peak intensity characteristics and peak continuity. The peak intensity characteristic is enhanced by a designed transverse mean filter, and an intensity map of the peak characteristic, computed over the whole road image, is used to determine the seed point of the rutting laser line. Taking the seed point as the starting point, the light points of the rutting line-laser are extracted based on peak continuity, providing exact basic data for the subsequent calculation of pavement rutting depths.
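The seed-point determination can be sketched as follows (an illustrative guess at the details; the filter width and the synthetic image are assumptions, not the paper's parameters):

```python
import numpy as np

def laser_seed_point(image, width=5):
    """Apply a transverse (row-wise) mean filter to enhance the laser
    line's intensity ridge, then return the (row, col) of the
    strongest filtered response as the laser-line seed point."""
    kernel = np.ones(width) / width
    filtered = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1,
        image.astype(float))
    return np.unravel_index(np.argmax(filtered), filtered.shape)

# Synthetic road image with a bright laser stripe around column 30.
img = np.random.default_rng(2).normal(10, 2, (60, 80))
img[:, 29:32] += 100.0
r, c = laser_seed_point(img)
print(c)  # a column inside the stripe (29-31)
```

From this seed, the extraction would walk along the line, accepting neighboring peaks that satisfy the continuity constraint.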
Note: A dual-channel sensor for dew point measurement based on quartz crystal microbalance.
Li, Ning; Meng, Xiaofeng; Nie, Jing
2017-05-01
A new dual-channel sensor was designed to eliminate the temperature effect on the frequency measurement of the quartz crystal microbalance (QCM) in dew point detection. The sensor uses active temperature control to produce condensation on the surface of the QCM and then detects the dew point. Both the single-channel and dual-channel methods were tested on the device. Over the dew point range of -2 °C to 10 °C, the measurement error of the single-channel method was less than 0.5 °C, while that of the dual-channel method was less than 0.3 °C. The results showed that the dual-channel method was able to eliminate the temperature effect and yield better measurement accuracy.
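The dual-channel principle can be illustrated with a simple differential-frequency model (a hypothetical sketch; the abstract does not describe the actual signal chain): both crystals drift with temperature, but only the sensing channel loads with condensate, so subtracting the reference channel cancels the common temperature term.

```python
def compensated_shift(f_sense, f_ref, f0_sense, f0_ref):
    """Differential QCM readout: the frequency shift of the sensing
    crystal minus that of the reference crystal cancels the
    temperature-induced drift common to both channels."""
    return (f_sense - f0_sense) - (f_ref - f0_ref)

# Both channels drift -12 Hz with temperature; condensation adds a
# further -80 Hz on the sensing channel only.
f0_s, f0_r = 5_000_000.0, 5_000_000.0
drift, mass_load = -12.0, -80.0
shift = compensated_shift(f0_s + drift + mass_load, f0_r + drift, f0_s, f0_r)
print(shift)  # -80.0: the temperature drift cancels
```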
Distance-based microfluidic quantitative detection methods for point-of-care testing.
Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James
2016-04-07
Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
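Quantitation in such distance-based assays typically maps the measured signal length to concentration through a calibration curve; a minimal linear-calibration sketch with invented values:

```python
import numpy as np

def fit_calibration(lengths_mm, concentrations):
    """Least-squares linear calibration, length = a*conc + b.
    Returns a function mapping a measured length to concentration."""
    a, b = np.polyfit(concentrations, lengths_mm, 1)
    return lambda length_mm: (length_mm - b) / a

# Hypothetical calibration: 2, 4, 8 mM standards give 10, 21, 39 mm bars.
to_conc = fit_calibration([10.0, 21.0, 39.0], [2.0, 4.0, 8.0])
print(round(to_conc(25.0), 2))  # ~5.01 mM for a 25 mm bar
```

Real devices may need a nonlinear calibration, but the inversion step (length back to concentration) is the same.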
NASA Astrophysics Data System (ADS)
Hadi Sutrisno, Himawan; Kiswanto, Gandjar; Istiyanto, Jos
2017-06-01
Rough machining is aimed at shaping a workpiece toward its final form. This process takes up a large proportion of the total machining time due to the removal of bulk material. For certain models, rough machining has limitations, especially on surfaces such as turbine blades and impellers. CBV evaluation is one concept used to detect areas admissible to machining. Whereas previous research detected the CBV area using a pair of normal vectors, in this research we simplify the process by detecting the CBV area with a slicing line for each point cloud formed. The simulation comprises three steps for this method: 1. triangulation from the CAD design model; 2. generation of CC points from the point cloud; 3. the slicing-line method used to evaluate each point-cloud position (inside or outside the CBV). The result of this evaluation method can be used as a tool for orientation setup on each CC-point position of feasible areas in rough machining.
Newell, Felicity L.; Sheehan, James; Wood, Petra Bohall; Rodewald, Amanda D.; Buehler, David A.; Keyser, Patrick D.; Larkin, Jeffrey L.; Beachy, Tiffany A.; Bakermans, Marja H.; Boves, Than J.; Evans, Andrea; George, Gregory A.; McDermott, Molly E.; Perkins, Kelly A.; White, Matthew; Wigley, T. Bently
2013-01-01
Point counts are commonly used to assess changes in bird abundance, including analytical approaches such as distance sampling that estimate density. Point-count methods have come under increasing scrutiny because effects of detection probability and field error are difficult to quantify. For seven forest songbirds, we compared fixed-radii counts (50 m and 100 m) and density estimates obtained from distance sampling to known numbers of birds determined by territory mapping. We applied point-count analytic approaches to a typical forest management question and compared results to those obtained by territory mapping. We used a before–after control impact (BACI) analysis with a data set collected across seven study areas in the central Appalachians from 2006 to 2010. Using a 50-m fixed radius, variance in error was at least 1.5 times that of the other methods, whereas a 100-m fixed radius underestimated actual density by >3 territories per 10 ha for the most abundant species. Distance sampling improved accuracy and precision compared to fixed-radius counts, although estimates were affected by birds counted outside 10-ha units. In the BACI analysis, territory mapping detected an overall treatment effect for five of the seven species, and effects were generally consistent each year. In contrast, all point-count methods failed to detect two treatment effects due to variance and error in annual estimates. Overall, our results highlight the need for adequate sample sizes to reduce variance, and skilled observers to reduce the level of error in point-count data. Ultimately, the advantages and disadvantages of different survey methods should be considered in the context of overall study design and objectives, allowing for trade-offs among effort, accuracy, and power to detect treatment effects.
Research on facial expression simulation based on depth image
NASA Astrophysics Data System (ADS)
Ding, Sha-sha; Duan, Jin; Zhao, Yi-wu; Xiao, Bo; Wang, Hao
2017-11-01
Nowadays, facial expression simulation is widely used in film and television special effects, human-computer interaction, and many other fields. Facial expression is captured by a Kinect camera. The AAM algorithm, based on statistical information, is employed to detect and track faces, and a 2D regression algorithm is applied to align the feature points. Facial feature points are detected automatically, while the feature points of the 3D cartoon model are marked manually. The aligned feature points are mapped using keyframe techniques. To improve the animation effect, non-feature points are interpolated based on empirical models, and the mapping and interpolation are completed under the constraint of Bézier curves. Thus, the feature points on the cartoon face model can be driven as the facial expression varies, achieving real-time simulation of cartoon facial expressions. The experimental results show that the proposed method can accurately simulate facial expressions. Finally, our method is compared with a previous method; actual data show that our method greatly improves implementation efficiency.
NASA Astrophysics Data System (ADS)
Luo, Yiping; Jiang, Ting; Gao, Shengli; Wang, Xin
2010-10-01
This paper presents a new approach for detecting building footprints by combining a registered aerial image with multispectral bands and airborne laser scanning data obtained synchronously by a Leica Geosystems ALS40 and an Applanix DACS-301 on the same platform. A two-step building detection method is presented, consisting of selecting 'building' candidate points and then classifying the candidate points. A digital surface model (DSM) derived from last-pulse laser scanning data is first filtered, and the laser points are classified into 'ground' and 'building or tree' classes using a mathematical morphological filter. The 'ground' points are then resampled into a digital elevation model (DEM), and a normalized DSM (nDSM) is generated from the DEM and DSM. Candidate points are selected from the 'building or tree' points by height and area thresholds in the nDSM, and are further classified into building points and tree points using the support vector machine (SVM) classification method. Two classification tests were carried out, using features from the laser scanning data alone and using associated features from both input data sources. The features included height, height finite differences, RGB band values, and so on; the RGB values of the points were acquired by matching the laser scanning data to the image via the collinearity equations. The features of the training points served as input to the SVM, and cross-validation was used to select the best classification parameters, from which the decision function was constructed to determine the class of each candidate point. The results showed that the associated features from both input sources were superior to features from the laser scanning data alone. An accuracy of more than 90% was achieved for buildings with the first kind of features.
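The nDSM step can be illustrated as a simple grid difference (the 2.5 m height threshold is an assumed illustrative value, not the paper's):

```python
import numpy as np

def normalized_dsm(dsm, dem, height_thresh=2.5):
    """nDSM = DSM - DEM gives per-cell object height above ground;
    cells above the threshold are 'building or tree' candidates."""
    ndsm = dsm - dem
    return ndsm, ndsm > height_thresh

# 1 m grid: flat ground at 100 m elevation with a 6 m-high structure.
dem = np.full((4, 4), 100.0)
dsm = dem.copy()
dsm[1:3, 1:3] += 6.0
ndsm, candidates = normalized_dsm(dsm, dem)
print(int(candidates.sum()))  # 4 candidate cells
```

In the paper's pipeline the surviving candidates would then be passed to the SVM with height, height-difference, and RGB features.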
Interest point detection for hyperspectral imagery
NASA Astrophysics Data System (ADS)
Dorado-Muñoz, Leidy P.; Vélez-Reyes, Miguel; Roysam, Badrinath; Mukherjee, Amit
2009-05-01
This paper presents an algorithm for the automated extraction of interest points (IPs) in multispectral and hyperspectral images. Interest points are features of the image that capture information from their neighbourhoods and are distinctive and stable under transformations such as translation and rotation. Interest-point operators for monochromatic images were proposed more than a decade ago and have since been studied extensively. IPs have been applied to diverse problems in computer vision, including image matching, recognition, registration, 3D reconstruction, change detection, and content-based image retrieval. Interest points help with data reduction, and reduce the computational burden of various algorithms (such as registration, object detection, and 3D reconstruction) by replacing an exhaustive search over the entire image domain with a probe into a concise set of highly informative points. An interest operator seeks out points in an image that are structurally distinct, invariant to imaging conditions, stable under geometric transformation, and interpretable, which makes them good candidates for interest points. Our approach extends ideas from Lowe's keypoint operator, which uses local extrema of the Difference of Gaussians (DoG) operator at multiple scales to detect interest points in gray-level images. The proposed approach extends Lowe's method by directly converting scalar operations, such as scale-space generation and extreme-point detection, into operations that take the vector nature of the image into consideration. Experimental results with RGB and hyperspectral images demonstrate the potential of the method for this application and the potential improvement of a fully vectorial approach over the band-by-band approaches described in the literature.
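A scalar (band-by-band) version of the DoG extremum search can be sketched as follows (a simplified illustration of Lowe-style detection on a single gray-level band, not the proposed vectorial method; the sigmas and threshold are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def dog_keypoints(image, sigmas=(1.0, 1.6, 2.56, 4.1), thresh=0.05):
    """Build a Gaussian scale space, difference adjacent levels (DoG),
    and keep points that are local extrema over space and scale with
    sufficient contrast. Returns rows of (scale_index, y, x)."""
    img = image.astype(float)
    blurred = np.stack([gaussian_filter(img, s) for s in sigmas])
    dog = blurred[1:] - blurred[:-1]  # DoG stack
    # Extremum test over the 3x3x3 (scale, y, x) neighborhood.
    extrema = (dog == maximum_filter(dog, size=3)) | \
              (dog == minimum_filter(dog, size=3))
    return np.argwhere(extrema & (np.abs(dog) > thresh))

# A single bright blob should yield an extremum near its center.
img = np.zeros((64, 64))
img[30:34, 30:34] = 1.0
kps = dog_keypoints(img)
print(any(29 <= y <= 34 and 29 <= x <= 34 for _, y, x in kps))  # True
```

The paper's contribution is to replace these per-band scalar operations with ones acting on the full spectral vector at each pixel.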
Detection limit for rate fluctuations in inhomogeneous Poisson processes
NASA Astrophysics Data System (ADS)
Shintani, Toshiaki; Shinomoto, Shigeru
2012-04-01
Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.
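The "optimized histogram" can be illustrated with the Shimazaki-Shinomoto bin-width rule (an assumption about which optimization is meant), which minimizes the cost C(w) = (2k̄ − v)/w² over the mean k̄ and variance v of per-bin event counts:

```python
import numpy as np

def optimal_bin_width(event_times, widths):
    """Return the candidate bin width minimizing the histogram cost
    C(w) = (2*mean - var) / w**2 over per-bin event counts."""
    t0, t1 = min(event_times), max(event_times)
    costs = []
    for w in widths:
        edges = np.arange(t0, t1 + w, w)
        counts, _ = np.histogram(event_times, bins=edges)
        costs.append((2 * counts.mean() - counts.var()) / w**2)
    return widths[int(np.argmin(costs))]

# Events from a sinusoidally modulated rate with a 1 s period,
# generated by thinning a uniform candidate stream.
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 4000))
keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(2 * np.pi * t))
spikes = t[keep]
widths = [0.01, 0.05, 0.1, 0.25, 0.5, 1.0]
w_opt = optimal_bin_width(spikes, widths)
print(w_opt)  # a width below the 1 s modulation period
```

When the data become too sparse relative to the modulation, the cost is minimized by ever-wider bins, which is the undetectable regime the paper analyzes.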
Testing the importance of auditory detections in avian point counts
Brewster, J.P.; Simons, T.R.
2009-01-01
Recent advances in the methods used to estimate detection probability during point counts suggest that the detection process is shaped by the types of cues available to observers. For example, models of the detection process based on distance-sampling or time-of-detection methods may yield different results for auditory versus visual cues because of differences in the factors that affect the transmission of these cues from a bird to an observer or differences in an observer's ability to localize cues. Previous studies suggest that auditory detections predominate in forested habitats, but it is not clear how often observers hear birds prior to detecting them visually. We hypothesized that auditory cues might be even more important than previously reported, so we conducted an experiment in a forested habitat in North Carolina that allowed us to better separate auditory and visual detections. Three teams of three observers each performed simultaneous 3-min unlimited-radius point counts at 30 points in a mixed-hardwood forest. One team member could see, but not hear birds, one could hear, but not see, and the third was nonhandicapped. Of the total number of birds detected, 2.9% were detected by deafened observers, 75.1% by blinded observers, and 78.2% by nonhandicapped observers. Detections by blinded and nonhandicapped observers were the same only 54% of the time. Our results suggest that the detection of birds in forest habitats is almost entirely by auditory cues. Because many factors affect the probability that observers will detect auditory cues, the accuracy and precision of avian point count estimates are likely lower than assumed by most field ornithologists. ?? 2009 Association of Field Ornithologists.
A new method of real-time detection of changes in periodic data stream
NASA Astrophysics Data System (ADS)
Lyu, Chen; Lu, Guoliang; Cheng, Bin; Zheng, Xiangwei
2017-07-01
Change point detection in periodic time series is highly desirable in many practical settings. We present a novel algorithm for this task that comprises two phases: 1) anomaly measurement: on the basis of a typical regression model, we propose a new computational method to measure anomalies in a time series that does not require any reference data from other measurements; 2) change detection: we introduce a new martingale test for detection that can be operated in an unsupervised and nonparametric way. We have conducted extensive experiments to systematically test our algorithm. The results suggest that our algorithm is directly applicable in many real-world change point detection applications.
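The martingale test can be illustrated with a Vovk-style power martingale over p-values, a common construction for this kind of test (a sketch under assumed details, not the authors' exact test): under exchangeability the martingale stays small, so a large value signals a change.

```python
import random

def power_martingale_test(pvalues, epsilon=0.92, threshold=20.0):
    """Power-martingale change test: update M <- M * eps * p**(eps-1)
    for each incoming p-value and raise an alarm when M exceeds the
    threshold. Returns the alarm index, or None if no alarm fires."""
    m = 1.0
    for i, p in enumerate(pvalues):
        m *= epsilon * max(p, 1e-12) ** (epsilon - 1)
        if m > threshold:
            return i  # index at which the change is declared
    return None

# Uniform p-values while the stream is unchanged, then persistently
# small p-values (high anomaly scores) after the change at index 100.
random.seed(4)
stream = ([random.uniform(0, 1) for _ in range(100)]
          + [random.uniform(0, 0.02) for _ in range(50)])
alarm = power_martingale_test(stream)
print(alarm)  # an alarm shortly after index 100
```

In a full pipeline, the p-values would come from the anomaly scores of phase 1 rather than being simulated.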
Mass Spectrometry for Paper-Based Immunoassays: Toward On-Demand Diagnosis.
Chen, Suming; Wan, Qiongqiong; Badu-Tawiah, Abraham K
2016-05-25
Current analytical methods, either point-of-care or centralized detection, are not able to meet recent demands of patient-friendly testing and increased reliability of results. Here, we describe a two-point separation on-demand diagnostic strategy based on a paper-based mass spectrometry immunoassay platform that adopts stable and cleavable ionic probes as mass reporter; these probes make possible sensitive, interruptible, storable, and restorable on-demand detection. In addition, a new touch paper spray method was developed for on-chip, sensitive, and cost-effective analyte detection. This concept is successfully demonstrated via (i) the detection of Plasmodium falciparum histidine-rich protein 2 antigen and (ii) multiplexed and simultaneous detection of cancer antigen 125 and carcinoembryonic antigen.
NASA Astrophysics Data System (ADS)
Su, Qiang; Zhou, Xiaoming
2008-12-01
Many pathogenic and genetic diseases are associated with changes in the sequence of particular genes. We describe here a rapid and highly efficient assay for the detection of point mutations. This method combines isothermal rolling circle amplification (RCA) with highly sensitive electrochemiluminescence (ECL) detection. In the design, a circular template, generated by ligation upon the recognition of a point mutation on DNA targets, was amplified isothermally by the Phi29 polymerase using a biotinylated primer. The elongation products were hybridized with tris(bipyridine)ruthenium (TBR)-tagged probes and detected on a magnetic-bead-based ECL platform, indicating the occurrence of the mutation. P53 was chosen as a model for the validation of this method. The method allowed sensitive determination of P53 mutations from wild-type and mutant samples. The main advantage of RCA-ECL is that it can be performed under isothermal conditions and avoids the generation of false-positive results. Furthermore, ECL provides a faster, more sensitive, and more economical option than currently available electrophoresis-based methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yokoyama, Yoko; Shimizu, Akira; Okada, Etsuko
Highlights: • We developed a new method to rapidly identify the COL1A1-PDGFB fusion in DFSP. • A new PCR method using a single primer pair detected the COL1A1-PDGFB fusion in DFSP. • This is the first report of DFSP with a novel COL1A1 breakpoint in exon 5. -- Abstract: The detection of fusion transcripts of the collagen type 1α1 (COL1A1) and platelet-derived growth factor-BB (PDGFB) genes by genetic analysis has been recognized as a reliable and valuable molecular tool for the diagnosis of dermatofibrosarcoma protuberans (DFSP). To detect the COL1A1-PDGFB fusion, almost all previous reports performed reverse transcription polymerase chain reaction (RT-PCR) using multiplex forward primers from COL1A1. However, this entails possible technical difficulties with respect to the handling of multiple primers and reagents. The objective of this study was to establish a rapid, easy, and efficient one-step PCR method using only a single primer pair to detect the fusion transcripts of COL1A1 and PDGFB in DFSP. To validate the new method, we compared the RT-PCR results in five DFSP patients between the previous method using multiplex primers and our established one-step RT-PCR using a single primer pair. In all DFSP cases, the COL1A1-PDGFB fusion was detected by both the previous method and the newly established one-step PCR. Importantly, we detected a novel COL1A1 breakpoint in exon 5. The newly developed method is valuable for rapidly identifying COL1A1-PDGFB fusion transcripts in DFSP.
Automated feature detection and identification in digital point-ordered signals
Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.
1998-01-01
A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, the features are verified using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and that detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically, without initial operator set-up and without subjective operator judgement of features.
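The morphology-filter step above can be illustrated with a minimal pure-Python sketch. This is not the patented implementation; the window width and the use of a grayscale opening (erosion followed by dilation) are illustrative assumptions about how narrow noise spikes are separated from wider standard features.

```python
def erode(signal, k):
    """Grayscale erosion: minimum over a sliding window of width k."""
    h = k // 2
    return [min(signal[max(0, i - h):i + h + 1]) for i in range(len(signal))]

def dilate(signal, k):
    """Grayscale dilation: maximum over a sliding window of width k."""
    h = k // 2
    return [max(signal[max(0, i - h):i + h + 1]) for i in range(len(signal))]

def opening(signal, k):
    """Opening (erosion then dilation) removes peaks narrower than k points."""
    return dilate(erode(signal, k), k)

# A flat baseline with one narrow noise spike and one wide feature.
sig = [0] * 20
sig[5] = 9            # narrow spike (noise)
for i in range(10, 16):
    sig[i] = 5        # wide indication (real feature)

opened = opening(sig, 5)
# The opening suppresses the 1-point spike but keeps the wide feature,
# so features can be detected as large values of the opened signal.
```

The same idea extends to baseline estimation: an opening with a window wider than any feature yields the baseline, which is then subtracted from the signal.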
A Support System for Mouse Operations Using Eye-Gaze Input
NASA Astrophysics Data System (ADS)
Abe, Kiyohiko; Nakayama, Yasuhiro; Ohi, Shoichi; Ohyama, Minoru
We have developed an eye-gaze input system for people with severe physical disabilities, such as amyotrophic lateral sclerosis (ALS) patients. This system utilizes a personal computer and a home video camera to detect eye-gaze under natural light. The system detects both vertical and horizontal eye-gaze by simple image analysis, and does not require special image processing units or sensors. Our conventional eye-gaze input system can detect horizontal eye-gaze with a high degree of accuracy. However, it can only classify vertical eye-gaze into 3 directions (up, middle and down). In this paper, we propose a new method for vertical eye-gaze detection, based on the limbus tracking method. Our new eye-gaze input system can therefore detect the two-dimensional coordinates of the user's gaze point. Using this method, we developed a new support system for mouse operation, which moves the mouse cursor to the user's gaze point.
Ballari, Rajashekhar V; Martin, Asha; Gowda, Lalitha R
2013-01-01
Brinjal is an important vegetable crop. Major crop loss of brinjal is due to insect attack. Insect-resistant EE-1 brinjal has been developed and is awaiting approval for commercial release. Consumer health concerns and implementation of international labelling legislation demand reliable analytical detection methods for genetically modified (GM) varieties. End-point and real-time polymerase chain reaction (PCR) methods were used to detect EE-1 brinjal. In end-point PCR, primer pairs specific to 35S CaMV promoter, NOS terminator and nptII gene common to other GM crops were used. Based on the revealed 3' transgene integration sequence, primers specific for the event EE-1 brinjal were designed. These primers were used for end-point single, multiplex and SYBR-based real-time PCR. End-point single PCR showed that the designed primers were highly specific to event EE-1 with a sensitivity of 20 pg of genomic DNA, corresponding to 20 copies of haploid EE-1 brinjal genomic DNA. The limits of detection and quantification for SYBR-based real-time PCR assay were 10 and 100 copies respectively. The prior development of detection methods for this important vegetable crop will facilitate compliance with any forthcoming labelling regulations. Copyright © 2012 Society of Chemical Industry.
Method of noncontacting ultrasonic process monitoring
Garcia, Gabriel V.; Walter, John B.; Telschow, Kenneth L.
1992-01-01
A method of monitoring a material during processing comprising the steps of (a) shining a detection light on the surface of a material; (b) generating ultrasonic waves at the surface of the material to cause a change in frequency of the detection light; (c) detecting a change in the frequency of the detection light at the surface of the material; (d) detecting said ultrasonic waves at the surface point of detection of the material; (e) measuring the time elapsed between generation of the ultrasonic waves at the surface of the material and their return to the surface point of detection, to determine the transit time; and (f) comparing the transit time to predetermined values to determine properties such as density and elasticity of the material.
Experimental detection of optical vortices with a Shack-Hartmann wavefront sensor.
Murphy, Kevin; Burke, Daniel; Devaney, Nicholas; Dainty, Chris
2010-07-19
Laboratory experiments are carried out to detect optical vortices in conditions typical of those experienced when a laser beam is propagated through the atmosphere. A Spatial Light Modulator (SLM) is used to mimic atmospheric turbulence and a Shack-Hartmann wavefront sensor is utilised to measure the slopes of the wavefront surface. A matched filter algorithm determines the positions of the Shack-Hartmann spot centroids more robustly than a centroiding algorithm. The slope discrepancy is then obtained by subtracting the slopes calculated from a least-squares reconstruction of the phase from the slopes measured by the wavefront sensor. The slope discrepancy field is used as an input to the branch point potential method to find whether a vortex is present, and if so to give its position and sign. The use of the slope discrepancy technique greatly improves the detection rate of the branch point potential method. This work is the first demonstration of the branch point potential method detecting optical vortices in an experimental setup.
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva
2017-06-01
Change point detection in multivariate time series is a complex task since next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014) implying changes in mean and in correlation structure and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014) implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
Dynamic path planning for mobile robot based on particle swarm optimization
NASA Astrophysics Data System (ADS)
Wang, Yong; Cai, Feng; Wang, Ying
2017-08-01
Robots are now used in many fields, such as cleaning, medical treatment, space exploration, and disaster relief. Collision-free dynamic path planning for robots is attracting increasing attention. A new path planning method is proposed in this paper. First, the motion space model of the robot is established using the MAKLINK graph method, and the A* algorithm is used to obtain the shortest path from the start point to the end point. Second, this paper proposes an effective method to detect and avoid obstacles: when an obstacle is detected on the shortest path, the robot moves to the nearest safe point, and then selects the next point nearest to the target. Finally, the particle swarm optimization algorithm is used to optimize the path. The experimental results show that the proposed method is effective.
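The A* shortest-path step can be sketched as follows. This is a generic illustration, not the paper's implementation: the MAKLINK free-space graph is replaced by a simple 4-connected occupancy grid, and a Manhattan-distance heuristic is assumed.

```python
import heapq

def astar(grid, start, goal):
    """Return the shortest path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start)]
    came_from, g_best, closed = {}, {start: 0}, set()
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur in closed:
            continue
        closed.add(cur)
        if cur == goal:                     # reconstruct path backwards
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    return None

# 0 = free cell, 1 = obstacle; the path must route around the blocked row.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))   # 7 cells, 6 moves
```

In the paper's setting the grid would be replaced by MAKLINK free-space nodes, and the resulting path would then be refined by the PSO optimizer.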
Adaptive 4d Psi-Based Change Detection
NASA Astrophysics Data System (ADS)
Yang, Chia-Hsiang; Soergel, Uwe
2018-04-01
In a previous work, we proposed a PSI-based 4D change detection to detect disappearing and emerging PS points (3D) along with their occurrence dates (1D). Such change points are usually caused by anthropic events, e.g., building constructions in cities. This method first divides an entire SAR image stack into several subsets by a set of break dates. The PS points, which are selected based on their temporal coherences before or after a break date, are regarded as change candidates. Change points are then extracted from these candidates according to their change indices, which are modelled from the temporal coherences of the divided image subsets. Finally, we check the evolution of the change indices for each change point to detect the break date at which the change occurred. The experiment validated both the feasibility and applicability of our method. However, two questions still remain. First, the selection of the temporal coherence threshold involves a trade-off between the quality and quantity of PS points; this selection also affects the number of change points in a more complex way. Second, the heuristic selection of change index thresholds is fragile and causes change points to be missed. In this study, we adapt our approach to identify change points based on the statistical characteristics of the change indices rather than thresholding. The experiment validates this adaptive approach and shows an increase in detected change points compared with the previous version. In addition, we also explore and discuss the optimal selection of the temporal coherence threshold.
Micro-vibration detection with heterodyne holography based on time-averaged method
NASA Astrophysics Data System (ADS)
Qin, XiaoDong; Pan, Feng; Chen, ZongHui; Hou, XueQin; Xiao, Wen
2017-02-01
We propose a micro-vibration detection method by introducing heterodyne interferometry to time-averaged holography. This method compensates for the deficiency of time-averaged holography in quantitative measurement and effectively widens its range of application. Acousto-optic modulators are used to modulate the frequencies of the reference beam and the object beam. Accurate detection of the maximum amplitude of each point in the vibration plane is performed by altering the frequency difference of the two beams, which extends the range of amplitude detection of plane vibration. In the stable vibration mode, the distribution of the maximum amplitude of each point is measured and the fitted curves are plotted; hence the plane vibration mode of the object is demonstrated intuitively and detected quantitatively. We analyzed the method in theory and built an experimental system with a sine signal as the excitation source and a typical piezoelectric ceramic plate as the target. The experimental results indicate that, within a certain error range, the detected vibration mode agrees with the intrinsic vibration characteristics of the object, thus proving the validity of this method.
NASA Technical Reports Server (NTRS)
Zhou, Wei
1993-01-01
In high-accuracy measurement of periodic signals, the greatest common factor frequency and its characteristics have special functions. A method of time difference measurement, based on dual 'phase coincidence point' detection, is described. This method utilizes the characteristics of the greatest common factor frequency to measure the time or phase difference between periodic signals, and is suitable for a very wide frequency range. Measurement precision and potential accuracy of several picoseconds were demonstrated with this new method. The instrument based on this method is very simple, and the demands on the common oscillator are low. This method and instrument can be used widely.
Error Mitigation of Point-to-Point Communication for Fault-Tolerant Computing
NASA Technical Reports Server (NTRS)
Akamine, Robert L.; Hodson, Robert F.; LaMeres, Brock J.; Ray, Robert E.
2011-01-01
Fault tolerant systems require the ability to detect and recover from physical damage caused by the hardware's environment, faulty connectors, and system degradation over time. This ability applies to military, space, and industrial computing applications. The integrity of Point-to-Point (P2P) communication, between two microcontrollers for example, is an essential part of fault tolerant computing systems. In this paper, different methods of fault detection and recovery are presented and analyzed.
Advanced DNA-Based Point-of-Care Diagnostic Methods for Plant Diseases Detection.
Lau, Han Yih; Botella, Jose R
2017-01-01
Diagnostic technologies for the detection of plant pathogens with point-of-care capability and high multiplexing ability are an essential tool in the fight to reduce the large agricultural production losses caused by plant diseases. The main desirable characteristics for such diagnostic assays are high specificity, sensitivity, reproducibility, speed, cost efficiency and high-throughput multiplex detection capability. This article describes and discusses various DNA-based point-of-care diagnostic methods for applications in plant disease detection. Polymerase chain reaction (PCR) is the most common DNA amplification technology used for detecting various plant and animal pathogens. However, subsequent to PCR-based assays, several other nucleic acid amplification technologies have been developed to achieve higher sensitivity, more rapid detection, and suitability for field applications, such as loop-mediated isothermal amplification, helicase-dependent amplification, rolling circle amplification, recombinase polymerase amplification, and molecular inversion probes. The principles behind these technologies have been thoroughly discussed in several review papers; herein we emphasize the application of these technologies to detect plant pathogens by outlining the advantages and disadvantages of each technology in detail.
NASA Astrophysics Data System (ADS)
Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei
2018-02-01
For ship target detection in cluttered infrared image sequences, a robust detection method based on a probabilistic single Gaussian model of the sea background in the Fourier domain is put forward. The amplitude spectrum sequence at each frequency point of pure-seawater images in the Fourier domain, being more stable than the gray-value sequence of each background pixel in the spatial domain, is modelled as a Gaussian. Next, a probability weighting matrix is built based on the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, false-alarm points are removed using the ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with other methods.
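The core of the per-frequency-point Gaussian background model can be sketched as follows. This is a simplified illustration under stated assumptions: the 3-sigma decision rule and the toy amplitude sequences are ours, and the probability weighting matrix and shape-based false-alarm removal are omitted.

```python
from statistics import mean, stdev

def fit_background(amplitude_history):
    """amplitude_history[k] = amplitudes of frequency point k over background frames.
    Returns a (mean, std) Gaussian per frequency point."""
    return [(mean(seq), stdev(seq)) for seq in amplitude_history]

def foreground_points(frame, model, k_sigma=3.0):
    """Indices of frequency points whose amplitude departs from the background."""
    return [i for i, (a, (mu, sd)) in enumerate(zip(frame, model))
            if abs(a - mu) > k_sigma * max(sd, 1e-9)]

# Background amplitude sequences for 3 frequency points over 5 pure-seawater frames.
history = [[1.0, 1.1, 0.9, 1.0, 1.0],
           [2.0, 2.1, 1.9, 2.0, 2.0],
           [0.5, 0.5, 0.5, 0.6, 0.4]]
model = fit_background(history)

# A new frame in which the second frequency point jumps (e.g., a ship echo).
flags = foreground_points([1.0, 5.0, 0.5], model)   # flags the second point
```

The stability the abstract refers to is exactly what makes this thresholding viable: a small background standard deviation makes even modest target energy stand out.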
A travel time forecasting model based on change-point detection method
NASA Astrophysics Data System (ADS)
LI, Shupeng; GUANG, Xiaoping; QIAN, Yongsheng; ZENG, Junwei
2017-06-01
Travel time parameters obtained from road traffic sensor data play an important role in traffic management practice. In this paper, a travel time forecasting model is proposed for urban road traffic sensor data based on the method of change-point detection. A first-order differencing operation is used to preprocess the actual loop data; a change-point detection algorithm is designed to classify the long sequence of travel time data items into several patterns; then a travel time forecasting model is established based on the autoregressive integrated moving average (ARIMA) model. By computer simulation, different control parameters are chosen for the adaptive change-point search in the travel time series, which is divided into several sections of similar state. A linear weight function is then used to fit the travel time sequence and forecast travel time. The results show that the model has high accuracy in travel time forecasting.
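The preprocessing and change-point steps above can be sketched in a few lines: difference the series, then cut it wherever the absolute first-order difference exceeds a control parameter. The threshold value and toy data are illustrative assumptions, and the ARIMA forecasting stage is omitted.

```python
def segment_by_change_points(series, threshold):
    """Split the series into sections of similar state at large jumps."""
    diffs = [b - a for a, b in zip(series, series[1:])]          # first-order differences
    cut_points = [i + 1 for i, d in enumerate(diffs) if abs(d) > threshold]
    segments, start = [], 0
    for cp in cut_points + [len(series)]:
        segments.append(series[start:cp])
        start = cp
    return segments

# Travel times (s): a stable free-flow regime followed by a congested regime.
travel_time = [60, 62, 61, 63, 95, 97, 96, 94]
segments = segment_by_change_points(travel_time, threshold=15)
# Two sections of similar state: [60, 62, 61, 63] and [95, 97, 96, 94]
```

Each detected section is then homogeneous enough for a separate forecasting model (ARIMA in the paper) to be fitted to it.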
Sequential structural damage diagnosis algorithm using a change point detection method
NASA Astrophysics Data System (ADS)
Noh, H.; Rajagopal, R.; Kiremidjian, A. S.
2013-11-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected, using maximum likelihood and Bayesian methods. We also applied an approximate method to reduce the computational load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and the algorithm was able to identify damage with low false alarm rates, particularly when using multidimensional damage-sensitive features with a known post-damage feature distribution. For cases with unknown feature distributions, the post-damage distribution was consistently estimated, and the detection delays were only a few time steps longer than the delays of the general method that assumes a known post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, while the maximum likelihood method provides an insightful heuristic approach.
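The sequential test with known pre- and post-damage distributions can be illustrated with a standard CUSUM of log-likelihood ratios for a Gaussian mean shift. This is a generic sketch of the idea, not the paper's algorithm; the distribution parameters, threshold, and feature stream below are illustrative assumptions.

```python
import math

def gaussian_loglik(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def cusum_detect(samples, mu0, mu1, sigma, threshold):
    """Return the first index at which damage is declared, or None.
    mu0/mu1 are the pre-/post-damage feature means (assumed known here)."""
    s = 0.0
    for t, x in enumerate(samples):
        llr = gaussian_loglik(x, mu1, sigma) - gaussian_loglik(x, mu0, sigma)
        s = max(0.0, s + llr)      # resetting at zero keeps the test sequential
        if s > threshold:
            return t
    return None

# Damage-sensitive feature stream: mean 0 before damage, mean 2 from index 6 on.
stream = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 2.1, 1.9, 2.2, 2.0]
alarm = cusum_detect(stream, mu0=0.0, mu1=2.0, sigma=1.0, threshold=4.0)
# Damage is declared two samples after the change, at index 8.
```

When the post-damage mean is unknown, mu1 would itself be estimated and updated online, which is the role of the maximum likelihood and Bayesian estimators in the paper.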
A new methodology for automatic detection of reference points in 3D cephalometry: A pilot study.
Ed-Dhahraouy, Mohammed; Riri, Hicham; Ezzahmouly, Manal; Bourzgui, Farid; El Moutaoukkil, Abdelmajid
2018-04-05
The aim of this study was to develop a new method for the automatic detection of reference points in 3D cephalometry, to overcome the limits of 2D cephalometric analyses. A specific application was designed using the C++ language for automatic and manual identification of 21 reference points on the craniofacial structures. Our algorithm is based on the implementation of an anatomical and geometrical network adapted to the craniofacial structure. This network was constructed based on the anatomical knowledge of the 3D cephalometric reference points. The proposed algorithm was tested on five CBCT images. The proposed approach for automatic 3D cephalometric identification was able to detect the 21 points with a mean error of 2.32 mm. In this pilot study, we propose an automated methodology for the identification of the 3D cephalometric reference points. A larger sample will be studied in the future to assess the method's validity and reliability. Copyright © 2018 CEO. Published by Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Amiri, N.; Polewski, P.; Yao, W.; Krzystek, P.; Skidmore, A. K.
2017-09-01
Airborne Laser Scanning (ALS) is a widespread method for forest mapping and management purposes. While common ALS techniques provide valuable information about the forest canopy and intermediate layers, the point density near the ground may be poor due to dense overstory conditions. The current study highlights a new method for detecting stems of single trees in 3D point clouds obtained from high-density ALS with a density of 300 points/m². Compared to standard ALS data, the lower flight height (150-200 m) behind this elevated point density leads to more laser reflections from tree stems. In this work, we propose a three-tiered method which works on the point, segment and object levels. First, for each point we calculate the likelihood that it belongs to a tree stem, derived from the radiometric and geometric features of its neighboring points. In the next step, we construct short stem segments based on high-probability stem points, and classify the segments by considering the distribution of points around them as well as their spatial orientation, which encodes the prior knowledge that trees are mainly vertically aligned due to gravity. Finally, we apply hierarchical clustering on the positively classified segments to obtain point sets corresponding to single stems, and perform ℓ1-based orthogonal distance regression to robustly fit lines through each stem point set. The ℓ1-based method is less sensitive to outliers than least-squares approaches. From the fitted lines, the planimetric tree positions can then be derived. Experiments were performed on two plots from the Hochficht forest in the Oberösterreich region of Austria. We marked a total of 196 reference stems in the point clouds of both plots by visual interpretation. The evaluation of the automatically detected stems showed a classification precision of 0.86 and 0.85 for Plots 1 and 2, respectively, with recall values of 0.7 and 0.67.
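The robustness of ℓ1 (least-absolute-deviation) line fitting can be sketched with a 2D simplification. The paper fits 3D lines with orthogonal distances; here, as an illustrative assumption, a 2D line y = a·x + b is fitted with vertical residuals via iteratively reweighted least squares, which is one common way to approximate the ℓ1 solution.

```python
def l1_line_fit(xs, ys, iters=50, eps=1e-8):
    """Fit y = a*x + b minimising the sum of absolute residuals (IRLS sketch)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        # Reweight each point by 1/|residual|: outliers get small weights.
        w = [1.0 / max(abs(y - (a * x + b)), eps) for x, y in zip(xs, ys)]
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, xs))
        swy = sum(wi * y for wi, y in zip(w, ys))
        swxx = sum(wi * x * x for wi, x in zip(w, xs))
        swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        det = sw * swxx - swx * swx
        if abs(det) < eps:
            break
        # Weighted least-squares normal equations for the 2-parameter line.
        a = (sw * swxy - swx * swy) / det
        b = (swxx * swy - swx * swxy) / det
    return a, b

# Points on y = 2x with one gross outlier; the l1 fit stays near slope 2,
# whereas an ordinary least-squares fit would be pulled toward the outlier.
xs = [0, 1, 2, 3, 4, 5]
ys = [0, 2, 4, 6, 8, 30]   # last point is an outlier
a, b = l1_line_fit(xs, ys)
```

For stem fitting, the same downweighting keeps stray returns from branches or understory from tilting the fitted stem axis.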
Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis
NASA Astrophysics Data System (ADS)
Che, E.; Olsen, M. J.
2017-09-01
Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common post-processing procedure that groups the point cloud into a number of clusters to simplify the data for the subsequent modelling and analysis needed for most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data by most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which prevents errors in normal estimation from propagating into the segmentation. Both an indoor and an outdoor scene are used in experiments to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
Zhu, Hai-Zhen; Liu, Wei; Mao, Jian-Wei; Yang, Ming-Min
2008-04-28
4-Amino-4'-nitrobiphenyl, which is formed by the catalytic effect of trichlorfon on the oxidation of benzidine by sodium perborate, is extracted with a cloud point extraction method and then detected using high performance liquid chromatography with ultraviolet detection (HPLC-UV). Under the optimum experimental conditions, there was a linear relationship between trichlorfon concentration in the range 0.01-0.2 mg L⁻¹ and the peak area of 4-amino-4'-nitrobiphenyl (r=0.996). The limit of detection was 2.0 μg L⁻¹, and recoveries from spiked water and cabbage samples ranged between 95.4-103% and 85.2-91.2%, respectively. The cloud point extraction (CPE) method proved simpler, cheaper, and more environmentally friendly than extraction with organic solvents, and gave a more effective extraction yield.
NASA Astrophysics Data System (ADS)
Salançon, Evelyne; Degiovanni, Alain; Lapena, Laurent; Morin, Roger
2018-04-01
An event-counting method using a two-microchannel plate stack in a low-energy electron point projection microscope is implemented. A detector spatial resolution of 15 μm, i.e., the distance between first-neighbor microchannels, is demonstrated, leading to a 7-times better microscope resolution. Compared to previous work with neutrons [Tremsin et al., Nucl. Instrum. Methods Phys. Res., Sect. A 592, 374 (2008)], the large number of detection events achieved with electrons shows that the local response of the detector is mainly governed by the angle between the hexagonal structures of the two microchannel plates. Using this method in point projection microscopy offers the prospect of working with a greater source-object distance (350 nm instead of 50 nm), advancing toward atomic resolution.
Shi, Chao; Ge, Yujie; Gu, Hongxi; Ma, Cuiping
2011-08-15
Single nucleotide polymorphism (SNP) genotyping is attracting extensive attention owing to its direct connection with human diseases, including cancers. Here, we have developed a highly sensitive chemiluminescence biosensor for point mutation detection at room temperature, based on circular strand-displacement amplification and magnetic-bead separation, which reduces the background signal. This method takes advantage of both the high selectivity of T4 DNA ligase in recognizing single-base mismatches and the strand-displacement reaction of the polymerase for signal amplification. The detection limit of this method was 1.3 × 10⁻¹⁶ M, a better sensitivity than most reported SNP detection methods. Additionally, the magnetic beads used as an immobilization carrier not only reduce the background signal but may also have potential applications in high-throughput screening for SNP detection in the human genome. Copyright © 2011 Elsevier B.V. All rights reserved.
Song, Yunke; Zhang, Yi; Wang, Tza-Huei
2013-04-08
Gene point mutations present important biomarkers for genetic diseases. However, existing point mutation detection methods suffer from low sensitivity, low specificity, and tedious assay processes. In this report, an assay technology is proposed which combines the outstanding specificity of gap ligase chain reaction (Gap-LCR), the high sensitivity of single-molecule coincidence detection, and the superior optical properties of quantum dots (QDs) for multiplexed detection of point mutations in genomic DNA. Mutant-specific ligation products are generated by Gap-LCR and subsequently captured by QDs to form DNA-QD nanocomplexes that are detected by single-molecule spectroscopy (SMS) through multi-color fluorescence burst coincidence analysis, allowing for multiplexed mutation detection in a separation-free format. The proposed assay is capable of detecting zeptomoles of KRAS codon 12 mutation variants with near 100% specificity. Its high sensitivity allows direct detection of KRAS mutations in crude genomic DNA without PCR pre-amplification. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Shinchi, Hiroyuki; Yuki, Nobuhiro; Ishida, Hideharu; Hirata, Koichi; Wakao, Masahiro; Suda, Yasuo
2015-01-01
Sugar chain-binding antibodies have gained substantial attention as biomarkers due to their crucial roles in various disorders. In this study, we developed a simple and quick method for detecting anti-sugar chain antibodies in sera, using our previously developed sugar chain-immobilized fluorescent nanoparticles (SFNPs), for point-of-care diagnostics. The sugar chain structure on the SFNPs was modified with the sugar moieties of the GM1 ganglioside via our original linker molecule to detect anti-GM1 antibodies. The structures and densities of the sugar moieties immobilized on the nanoparticles were evaluated in detail using lectins and sera containing anti-GM1 antibodies from patients with Guillain-Barré syndrome, a neurological disorder, as an example of a disease involving anti-sugar chain antibodies. When the optimized SFNPs were added to sera from patients with Guillain-Barré syndrome, fluorescent aggregates could be detected visually under UV light within three hours. The sensitivity of the detection method was equivalent to that of the current ELISA method used for the diagnosis of Guillain-Barré syndrome. These results suggest that our method using SFNPs is suitable for point-of-care diagnosis of diseases involving anti-sugar chain antibodies.
Methods for point-of-care detection of nucleic acid in a sample
Bearinger, Jane P.; Dugan, Lawrence C.
2015-12-29
Provided herein are methods and apparatus for detecting a target nucleic acid in a sample and related methods and apparatus for diagnosing a condition in an individual. The condition is associated with presence of nucleic acid produced by certain pathogens in the individual.
Statistical methods for change-point detection in surface temperature records
NASA Astrophysics Data System (ADS)
Pintar, A. L.; Possolo, A.; Zhang, N. F.
2013-09-01
We describe several statistical methods to detect possible change-points in a time series of values of surface temperature measured at a meteorological station, and to assess the statistical significance of such changes, taking into account the natural variability of the measured values, and the autocorrelations between them. These methods serve to determine whether the record may suffer from biases unrelated to the climate signal, hence whether there may be a need for adjustments as considered by M. J. Menne and C. N. Williams (2009) "Homogenization of Temperature Series via Pairwise Comparisons", Journal of Climate 22 (7), 1700-1717. We also review methods to characterize patterns of seasonality (seasonal decomposition using monthly medians or robust local regression), and explain the role they play in the imputation of missing values, and in enabling robust decompositions of the measured values into a seasonal component, a possible climate signal, and a station-specific remainder. The methods for change-point detection that we describe include statistical process control, wavelet multi-resolution analysis, adaptive weights smoothing, and a Bayesian procedure, all of which are applicable to single station records.
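The seasonal decomposition by monthly medians mentioned above can be sketched directly: the seasonal component of each calendar month is its median across years, and the remainder is what the change-point procedures then inspect. The synthetic three-year series below is an illustrative assumption.

```python
from statistics import median

def seasonal_decompose_monthly(values):
    """values: monthly temperatures, index 0 = January of the first year.
    Returns (seasonal component per month, deseasonalized remainder)."""
    seasonal = [median(values[m::12]) for m in range(12)]   # median across years
    remainder = [v - seasonal[i % 12] for i, v in enumerate(values)]
    return seasonal, remainder

# Three years of a pure 12-month cycle, plus a +1.0 step bias in the last year
# (e.g., a station move or instrument change unrelated to the climate signal).
cycle = [0, 1, 3, 8, 13, 17, 19, 18, 14, 9, 4, 1]
values = cycle + cycle + [c + 1.0 for c in cycle]
seasonal, remainder = seasonal_decompose_monthly(values)
# The remainder isolates the step: ~0 in years 1-2, ~+1 in year 3,
# which a change-point test on the remainder can then flag.
```

Because the median is robust, a bias confined to a minority of the years does not leak into the seasonal component, which is exactly the property the abstract highlights.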
Islanding detection technique using wavelet energy in grid-connected PV system
NASA Astrophysics Data System (ADS)
Kim, Il Song
2016-08-01
This paper proposes a new islanding detection method using wavelet energy in a grid-connected photovoltaic system. The method detects spectral changes in the higher-frequency components of the point-of-common-coupling voltage and obtains wavelet coefficients by multilevel wavelet analysis. The autocorrelation of the wavelet coefficients can clearly identify islanding, even in the presence of grid voltage harmonic variations during normal operating conditions. The advantage of the proposed method is that it can detect islanding conditions that the conventional under-voltage/over-voltage/under-frequency/over-frequency methods fail to detect. The theoretical method for obtaining the wavelet energies is developed and verified by experimental results.
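The wavelet-energy indicator can be sketched with a multilevel Haar decomposition, summing the energy of the detail coefficients at each level; a spectral change then shows up as a jump in high-frequency detail energy. The Haar wavelet and the synthetic voltage signals are illustrative assumptions (the paper does not commit to this mother wavelet here).

```python
import math

def haar_step(signal):
    """One Haar analysis step: approximation and detail coefficients."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2)
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2)
              for i in range(len(signal) // 2)]
    return approx, detail

def wavelet_energies(signal, levels):
    """Energy of the detail coefficients at each decomposition level."""
    energies, approx = [], list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(d * d for d in detail))
    return energies

# A smooth mains-like voltage vs. the same wave with a high-frequency transient,
# standing in for the spectral change at the point of common coupling.
n = 64
clean = [math.sin(2 * math.pi * i / 32) for i in range(n)]
transient = [c + (0.5 if 30 <= i < 38 and i % 2 else 0.0)
             for i, c in enumerate(clean)]
e_clean = wavelet_energies(clean, 3)
e_trans = wavelet_energies(transient, 3)
# The level-1 (highest-frequency) detail energy rises when the transient appears.
```

Monitoring such level-wise energies over sliding windows is one plausible way to turn the decomposition into the detection statistic the abstract describes.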
Fast and Robust Segmentation and Classification for Change Detection in Urban Point Clouds
NASA Astrophysics Data System (ADS)
Roynard, X.; Deschaud, J.-E.; Goulette, F.
2016-06-01
Change detection is an important issue in city monitoring to analyse street furniture, road works, car parking, etc. For example, parking surveys are needed but are currently a laborious task involving sending operators into the streets to identify changes in car locations. In this paper, we propose a method that performs fast and robust segmentation and classification of urban point clouds and that can be used for change detection. We apply this method to detect cars, as a particular object class, in order to perform parking surveys automatically. A recently proposed method already addresses the need for fast segmentation and classification of urban point clouds, using elevation images. The advantage of working on images is that processing is much faster, proven, and robust. However, there may be a loss of information in complex 3D cases: for example, when objects are one above the other, typically a car under a tree or a pedestrian under a balcony. In this paper we propose a method that retains the three-dimensional information while preserving fast computation times and improving segmentation and classification accuracy. It is based on fast region growing using an octree for the segmentation, and on specific descriptors with a Random Forest for the classification. Experiments have been performed on large urban point clouds acquired by Mobile Laser Scanning. They show that the method is as fast as the state of the art and gives more robust results in complex 3D cases.
Two-stage Keypoint Detection Scheme for Region Duplication Forgery Detection in Digital Images.
Emam, Mahmoud; Han, Qi; Zhang, Hongli
2018-01-01
In digital image forensics, copy-move (region duplication) forgery detection has recently become a vital research topic. Most existing keypoint-based forgery detection methods fail to detect forgery in smooth regions and remain sensitive to geometric changes. To solve these problems and detect keypoints that cover all regions, we propose a two-step keypoint detection scheme. First, we employ the scale-invariant feature operator to detect spatially distributed keypoints in textured regions. Second, keypoints in the remaining regions are detected using the Harris corner detector with non-maximal suppression to distribute the detected keypoints evenly. To improve matching performance, local feature points are described using the Multi-support Region Order-based Gradient Histogram descriptor. A comprehensive performance evaluation is carried out on a commonly used data set in terms of precision and recall. The results demonstrate that the proposed scheme has better detection performance and robustness against some geometric transformation attacks than state-of-the-art methods. © 2017 American Academy of Forensic Sciences.
Animals as Mobile Biological Sensors for Forest Fire Detection
2007-01-01
This paper proposes a mobile biological sensor system that can assist in the early detection of forest fires, one of the most dreaded natural disasters on Earth. The main idea presented in this paper is to utilize animals fitted with sensors as Mobile Biological Sensors (MBS). The components of the system are native animals living in forests; sensors (thermal and radiation sensors with GPS features) that measure the temperature and transmit the location of the MBS; access points for wireless communication; and a central computer system that classifies animal actions. The system offers two different methods. First, access points continuously receive data about the animals' locations via GPS at certain time intervals, and the gathered data are then classified and checked for sudden movement (panic) of the animal groups; this method is called animal behavior classification (ABC). The second method can be defined as thermal detection (TD): the access points get the temperature values from the MBS devices and send the data to a central computer to check for instant changes in temperature. This system may be used for many purposes other than fire detection, namely animal tracking, poaching prevention, and detecting instantaneous animal deaths. PMID:28903281
Incorporating availability for detection in estimates of bird abundance
Diefenbach, D.R.; Marshall, M.R.; Mattice, J.A.; Brauning, D.W.
2007-01-01
Several bird-survey methods have been proposed that provide an estimated detection probability so that bird-count statistics can be used to estimate bird abundance. However, some of these estimators adjust counts of birds observed by the probability that a bird is detected and assume that all birds are available to be detected at the time of the survey. We marked male Henslow's Sparrows (Ammodramus henslowii) and Grasshopper Sparrows (A. savannarum) and monitored their behavior during May-July 2002 and 2003 to estimate the proportion of time they were available for detection. We found that the availability of Henslow's Sparrows declined in late June to <10% for 5- or 10-min point counts when a male had to sing and be visible to the observer; but during 20 May-19 June, males were available for detection 39.1% (SD = 27.3) of the time for 5-min point counts and 43.9% (SD = 28.9) of the time for 10-min point counts (n = 54). We detected no temporal changes in availability for Grasshopper Sparrows, but estimated availability to be much lower for 5-min point counts (10.3%, SD = 12.2) than for 10-min point counts (19.2%, SD = 22.3) when males had to be visible and sing during the sampling period (n = 80). For distance sampling, we estimated the availability of Henslow's Sparrows to be 44.2% (SD = 29.0) and the availability of Grasshopper Sparrows to be 20.6% (SD = 23.5). We show how our estimates of availability can be incorporated in the abundance and variance estimators for distance sampling and modify the abundance and variance estimators for the double-observer method. Methods that directly estimate availability from bird counts but also incorporate detection probabilities need further development and will be important for obtaining unbiased estimates of abundance for these species.
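The adjustment the abstract describes can be expressed as a one-line estimator: the raw count is divided by both the probability of detecting an available bird and the probability that the bird is available at all. The example below reuses the reported 39.1% availability figure; the count and detection probability are made-up illustrations.

```python
# Back-of-the-envelope sketch of availability-adjusted abundance:
# N_hat = count / (P(detected | available) * P(available)).
# The count (12) and detection probability (0.8) are hypothetical.
def adjusted_abundance(count, p_detection, p_availability):
    """Estimate abundance by correcting the count for detection and availability."""
    return count / (p_detection * p_availability)

# Hypothetical survey: 12 males counted, detection probability 0.8 for
# available birds, availability 0.391 (5-min counts, 20 May-19 June figure).
estimate = adjusted_abundance(12, 0.8, 0.391)
print(round(estimate, 1))  # 38.4
```

Ignoring availability here (i.e. dividing by 0.8 alone) would give 15 birds, illustrating how strongly the availability term matters for species that sing infrequently.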
A fast image matching algorithm based on key points
NASA Astrophysics Data System (ADS)
Wang, Huilin; Wang, Ying; An, Ru; Yan, Peng
2014-05-01
Image matching is a very important technique in image processing. It has been widely used for object recognition and tracking, image retrieval, three-dimensional vision, change detection, aircraft position estimation, and multi-image registration. Based on the requirements of a matching algorithm for craft navigation, such as speed, accuracy, and adaptability, a fast key point image matching method is investigated and developed. The main research tasks include: (1) Developing an improved fast key point detection approach using a self-adapting threshold for Features from Accelerated Segment Test (FAST). A method of calculating a self-adapting threshold was introduced for images with different contrast. The Hessian matrix was adopted to eliminate unstable edge points in order to obtain key points with higher stability. This approach to detecting key points has the characteristics of a small amount of computation, high positioning accuracy, and strong anti-noise ability; (2) PCA-SIFT is utilized to describe key points. 128-dimensional vectors are formed based on the SIFT method for the extracted key points. A low-dimensional feature space was established from the eigenvectors of all the key points, and each eigenvector was projected onto the feature space to form a low-dimensional eigenvector. These key points were re-described by the dimension-reduced eigenvectors. After reducing the dimension by PCA, the descriptor was reduced from the original 128 dimensions to 20. This method reduces the dimensionality of approximate nearest-neighbour searching, thereby increasing overall speed; (3) The distance ratio between the nearest neighbour and the second nearest neighbour is regarded as the measurement criterion for initial matching points, from which the original matched point pairs are obtained. Based on the analysis of the common methods (e.g.
RANSAC (random sample consensus) and Hough transform clustering) used for eliminating false matching point pairs, a heuristic local geometric restriction strategy is adopted to further discard false matched point pairs; and (4) An affine transformation model is introduced to correct the coordinate difference between the real-time image and the reference image. This results in the matching of the two images. SPOT5 remote sensing images captured on different dates and airborne images captured with different flight attitudes were used to test the performance of the method in terms of matching accuracy, operation time, and ability to handle rotation. Results show the effectiveness of the approach.
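The distance-ratio criterion in step (3) can be sketched directly: a candidate match is kept only when the nearest descriptor is clearly closer than the second nearest (Lowe-style ratio test). The toy 2-D descriptors and the 0.8 threshold below are illustrative.

```python
# Sketch of the nearest/second-nearest distance-ratio matching criterion.
# Descriptors here are 2-D toys; real SIFT/PCA-SIFT vectors are 128/20-D.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_test_match(query, candidates, ratio=0.8):
    """Return the index of the best candidate, or None if the match is ambiguous."""
    order = sorted(range(len(candidates)), key=lambda i: euclidean(query, candidates[i]))
    d1 = euclidean(query, candidates[order[0]])
    d2 = euclidean(query, candidates[order[1]])
    return order[0] if d1 < ratio * d2 else None

descriptors = [(0.0, 0.1), (5.0, 5.0), (0.9, 0.9)]
print(ratio_test_match((0.0, 0.0), descriptors))  # 0 (unambiguous match)
print(ratio_test_match((0.5, 0.5), descriptors))  # None (ambiguous)
```

The surviving pairs would then be passed to an outlier-rejection stage (RANSAC or, as in the paper, a local geometric restriction) before fitting the affine model of step (4).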
A QRS Detection and R Point Recognition Method for Wearable Single-Lead ECG Devices.
Chen, Chieh-Li; Chuang, Chun-Te
2017-08-26
In the new-generation wearable electrocardiogram (ECG) system, signal processing with low power consumption is required to transmit data when dangerous rhythms are detected and to record signals when abnormal rhythms are detected. The QRS complex is a combination of three of the graphical deflections seen on a typical ECG. This study proposes a real-time QRS detection and R point recognition method with low computational complexity while maintaining high accuracy. The enhancement of QRS segments and suppression of P and T waves are carried out by the proposed ECG signal transformation, which also eliminates baseline wandering. In this study, the QRS fiducial point is determined based on the detected crests and troughs of the transformed signal. Subsequently, the R point can be recognized based on four QRS waveform templates, and preliminary heart rhythm classification can also be achieved at the same time. The performance of the proposed approach is demonstrated on the benchmark MIT-BIH Arrhythmia Database, where the QRS detection sensitivity (Se) and positive prediction (+P) are 99.82% and 99.81%, respectively. The results reveal the approach's advantage of low computational complexity, as well as the feasibility of real-time application on a mobile phone and an embedded system.
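A minimal sketch of this style of low-complexity QRS detection is shown below. It is not the authors' transform: it uses the classic differentiate-and-square enhancement followed by thresholded peak picking with a refractory period, on a synthetic "ECG" with made-up spike positions and threshold.

```python
# Minimal QRS-style peak detector (illustrative, not the paper's method):
# differentiate, square, then pick local maxima above a threshold while
# enforcing a refractory period between detections.
def detect_r_peaks(signal, threshold, refractory=5):
    """Indices of local maxima of the squared first difference above threshold."""
    enhanced = [(signal[i + 1] - signal[i]) ** 2 for i in range(len(signal) - 1)]
    peaks, last = [], -refractory
    for i in range(1, len(enhanced) - 1):
        if (enhanced[i] > threshold and enhanced[i] >= enhanced[i - 1]
                and enhanced[i] >= enhanced[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

# Flat baseline with two sharp spikes standing in for QRS complexes.
ecg = [0.0] * 30
ecg[10] = 1.0
ecg[20] = 1.0
print(detect_r_peaks(ecg, threshold=0.5))  # [9, 19]
```

The differencing step also suppresses slow baseline wander and the low-slope P and T waves, which is the same effect the abstract attributes to its transformation.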
Wildfire Detection Using a Multi-Dimensional Histogram in Boreal Forest
NASA Astrophysics Data System (ADS)
Honda, K.; Kimura, K.; Honma, T.
2008-12-01
Early detection of wildfires is important for reducing damage to the environment and humans. There have been attempts to detect wildfires using satellite imagery, mainly classified into three methods: the Dozier method (1981-), the threshold method (1986-), and the contextual method (1994-). However, the accuracy of these methods is insufficient: the detected results include commission and omission errors. In addition, it is not easy to analyze satellite imagery with high accuracy because of insufficient ground truth data. Kudoh and Hosoi (2003) developed a detection method using a three-dimensional (3D) histogram built from past fire data with NOAA-AVHRR imagery, but their method is impractical because it depends on manual work to pick out past fire data from a huge data set. Therefore, the purpose of this study is to collect fire points as hot spots efficiently from satellite imagery and to improve the method for detecting wildfires with the collected data. In our method, we collect past fire data using the Alaska Fire History data obtained from the Alaska Fire Service (AFS). We select points that are expected to be wildfires and pick up the points inside the fire areas of the AFS data. Next, we build a 3D histogram from the past fire data. In this study, we use Bands 1, 21, and 32 of MODIS. We calculate the likelihood of wildfire with the three-dimensional histogram. As a result, we select wildfires with the 3D histogram effectively, and we can detect a toroidally spreading wildfire; this result is evidence of good wildfire detection. However, areas surrounding glaciers tend to show elevated brightness temperatures, producing false alarms. Burnt areas and bare ground are also sometimes flagged as false alarms, so the method needs further improvement. Additionally, we are trying various combinations of MODIS bands to detect wildfires more effectively.
To adapt our method to other areas, we are applying it to tropical forest in Kalimantan, Indonesia and around Chiang Mai, Thailand. However, the ground truth data in these areas are sparser than in Alaska, and our method needs a large amount of accurately observed data to build a multi-dimensional histogram for the same area. In this study, we present a system to select wildfire data efficiently from satellite imagery. Furthermore, the development of a multi-dimensional histogram from past fire data makes it possible to detect wildfires accurately.
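The histogram-based scoring idea can be sketched as follows: bin past fire pixels by their values in three bands, then score a new pixel by the relative frequency of its bin. The band values and bin size below are hypothetical toys, not calibrated MODIS radiances.

```python
# Sketch of multi-dimensional-histogram fire scoring (illustrative only):
# a 3D histogram over three band values acts as a crude likelihood model.
from collections import Counter

def build_histogram(fire_pixels, bin_size=10.0):
    """3D histogram: (band1, band21, band32) triples -> bin counts."""
    hist = Counter()
    for b1, b21, b32 in fire_pixels:
        hist[(int(b1 // bin_size), int(b21 // bin_size), int(b32 // bin_size))] += 1
    return hist

def fire_score(hist, pixel, bin_size=10.0):
    """Relative likelihood: count in this pixel's bin / total past fire pixels."""
    key = tuple(int(v // bin_size) for v in pixel)
    return hist[key] / sum(hist.values())

# Hypothetical past fire pixels (three-band values).
past_fires = [(305.0, 330.0, 310.0), (307.0, 332.0, 311.0), (290.0, 300.0, 295.0)]
hist = build_histogram(past_fires)
print(fire_score(hist, (306.0, 331.0, 312.0)))  # high: resembles past fires
print(fire_score(hist, (250.0, 260.0, 255.0)))  # 0.0: no past fires in this bin
```

This also makes the abstract's limitation concrete: a glacier-adjacent pixel whose band values happen to fall into a populated bin would receive a nonzero score, i.e. a false alarm.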
Glue detection based on teaching points constraint and tracking model of pixel convolution
NASA Astrophysics Data System (ADS)
Geng, Lei; Ma, Xiao; Xiao, Zhitao; Wang, Wen
2018-01-01
On-line glue detection based on machine vision is significant for rust protection and strengthening in car production. Shadow stripes caused by reflected light and the unevenness of the inside front cover of the car reduce the accuracy of glue detection. In this paper, we propose an effective algorithm to distinguish the edges of the glue from shadow stripes. Teaching points are utilized to calculate the slope between two adjacent points. Then a tracking model based on pixel convolution along the motion direction is designed to segment several local rectangular regions using a distance that defines the height of each rectangular region. Pixel convolution along the motion direction is proposed to extract the edges of the glue in each local rectangular region. A data set with different illuminations and stripes of varying shape complexity, comprising 500,000 images captured from the camera of the glue gun machine, is used to evaluate the proposed method. Experimental results demonstrate that the proposed method can detect the edges of the glue accurately; the shadow stripes are distinguished and removed effectively. Our method achieves 99.9% accuracy on the image data set.
Characterisation of Feature Points in Eye Fundus Images
NASA Astrophysics Data System (ADS)
Calvo, D.; Ortega, M.; Penedo, M. G.; Rouco, J.
The retinal vessel tree adds decisive knowledge in the diagnosis of numerous ophthalmologic pathologies such as hypertension or diabetes. One of the problems in the analysis of the retinal vessel tree is the lack of information on vessel depth, as image acquisition usually yields a 2D image. This creates a scenario where two different vessels coinciding at a point could be interpreted as a vessel forking into a bifurcation. That is why, for tracking and labelling the retinal vascular tree, bifurcations and crossovers of vessels are considered feature points. In this work a novel method for detecting and classifying these retinal vessel tree feature points is introduced. The method applies image-processing techniques such as filtering and thinning to obtain a structure adequate for detecting the points, and classifies these points by studying their environment. The methodology is tested on a standard database and the results show high classification capability.
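One common way to implement the "study the environment" step on a thinned skeleton is to count the branches leaving a pixel by the number of 0-to-1 transitions around its 8-neighbourhood: three branches suggest a bifurcation, four a crossover. The sketch below uses this crossing-number idea on a tiny synthetic skeleton; it is an assumption about the kind of neighbourhood analysis meant, not the paper's exact rule.

```python
# Hedged sketch: classify skeleton feature points by counting 0->1
# transitions around the 8-neighbourhood (crossing-number heuristic).
def branch_count(skeleton, r, c):
    """Number of 0->1 transitions walking the 8-neighbourhood clockwise."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    ring = [skeleton[r + dr][c + dc] for dr, dc in offsets]
    return sum(1 for i in range(8) if ring[i - 1] == 0 and ring[i] == 1)

def classify(skeleton, r, c):
    n = branch_count(skeleton, r, c)
    return {3: "bifurcation", 4: "crossover"}.get(n, "vessel/endpoint")

# A '+'-shaped crossing of two one-pixel-wide vessels, centred at (2, 2).
cross = [[0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0],
         [1, 1, 1, 1, 1],
         [0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0]]
print(classify(cross, 2, 2))  # crossover
```

On real skeletons this simple count is noisy near the junction itself, which is why methods like the one above analyse a wider environment around each candidate point.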
Automatic identification of vessel crossovers in retinal images
NASA Astrophysics Data System (ADS)
Sánchez, L.; Barreira, N.; Penedo, M. G.; Cancela, B.
2015-02-01
Crossovers and bifurcations are interest points of the retinal vascular tree useful for diagnosing diseases. Specifically, detecting these interest points and identifying which of them are crossings gives us the opportunity to search for arteriovenous nicking, that is, an alteration of the vessel tree where an artery crosses a vein and the former compresses the latter. These formations are a clear indicator of hypertension, among other medical problems. Several studies have attempted to define an accurate and reliable method to detect and classify these relevant points. In this article, we propose a new method to identify crossovers. Our approach is based on segmenting the vascular tree and analyzing the surrounding area of each interest point. The minimal path between vessel points in this area is computed in order to identify the connected vessel segments and, as a result, to distinguish between bifurcations and crossovers. Our method was tested on retinographies from the public databases DRIVE and VICAVR, obtaining an accuracy of 90%.
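The minimal-path computation at the core of this approach can be sketched with a breadth-first search over the binary vessel mask: the shortest in-vessel path between two points tells whether (and how directly) two segments entering the neighbourhood are connected. The grid and points below are synthetic.

```python
# Sketch of a minimal in-vessel path via BFS on a binary vessel mask.
from collections import deque

def min_path_length(mask, start, goal):
    """Shortest 4-connected path through vessel pixels, or None if unreachable."""
    rows, cols = len(mask), len(mask[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), d = queue.popleft()
        if (r, c) == goal:
            return d
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and mask[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), d + 1))
    return None

vessel = [[1, 1, 1, 0],
          [0, 0, 1, 0],
          [0, 0, 1, 1]]
print(min_path_length(vessel, (0, 0), (2, 3)))  # 5
```

In the actual method the paths are computed inside the circular area around each interest point, so that segments connecting only through the centre can be paired and the point labelled as a crossing or a bifurcation accordingly.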
Detection and Classification of Pole-Like Objects from Mobile Mapping Data
NASA Astrophysics Data System (ADS)
Fukano, K.; Masuda, H.
2015-08-01
Laser scanners on a vehicle-based mobile mapping system can capture 3D point-clouds of roads and roadside objects. Since roadside objects have to be maintained periodically, their 3D models are useful for planning maintenance tasks. In our previous work, we proposed a method for detecting cylindrical poles and planar plates in a point-cloud. However, it is often required to further classify pole-like objects into utility poles, streetlights, traffic signals and signs, which are managed by different organizations. In addition, our previous method may fail to extract low pole-like objects, which are often observed in urban residential areas. In this paper, we propose new methods for extracting and classifying pole-like objects. In our method, we robustly extract a wide variety of poles by converting point-clouds into wireframe models and calculating cross-sections between wireframe models and horizontal cutting planes. For classifying pole-like objects, we subdivide a pole-like object into five subsets by extracting poles and planes, and calculate feature values of each subset. Then we apply a supervised machine learning method using feature variables of subsets. In our experiments, our method could achieve excellent results for detection and classification of pole-like objects.
NASA Astrophysics Data System (ADS)
Fu, Rongxin; Li, Qi; Zhang, Junqi; Wang, Ruliang; Lin, Xue; Xue, Ning; Su, Ya; Jiang, Kai; Huang, Guoliang
2016-10-01
Label-free point mutation detection is particularly important in biomedical research and clinical diagnosis, since gene mutations occur naturally and bring about highly fatal diseases. In this paper, a label-free and highly sensitive approach is proposed for point mutation detection based on hyperspectral interferometry. A hybridization strategy is designed to discriminate a single-base substitution with sequence-specific DNA ligase. Double-stranded structures form only if the added oligonucleotides are perfectly paired to the probe sequence. The proposed approach makes full use of the inherent conformation of double-stranded DNA molecules on the substrate, and a spectrum analysis method is established to resolve the sub-nanoscale thickness variation, which enables highly sensitive mutation detection. The limit of detection reaches 4 pg/mm² according to the experimental results. Detection of a lung cancer gene point mutation was demonstrated, proving the high selectivity and multiplex analysis capability of the proposed biosensor.
Image Mosaic Method Based on SIFT Features of Line Segment
Zhu, Jun; Ren, Mingwu
2014-01-01
This paper proposes a novel image mosaic method based on the SIFT (Scale Invariant Feature Transform) features of line segments, aiming to handle scaling, rotation, changes in lighting conditions, and similar differences between two images in the panoramic image mosaic process. This method first uses the Harris corner detection operator to detect key points. Second, it constructs directed line segments, describes them with SIFT features, and matches those directed segments to acquire rough point matching. Finally, the RANSAC method is used to eliminate wrong pairs in order to accomplish the image mosaic. The results from experiments based on four pairs of images show that our method has strong robustness to resolution, lighting, rotation, and scaling. PMID:24511326
Welding deviation detection algorithm based on extremum of molten pool image contour
NASA Astrophysics Data System (ADS)
Zou, Yong; Jiang, Lipei; Li, Yunhua; Xue, Long; Huang, Junfen; Huang, Jiqiang
2016-01-01
Welding deviation detection is the basis of robotic seam-tracking welding, but on-line real-time measurement of welding deviation is still not well solved by existing methods. There is plenty of information in gas metal arc welding (GMAW) molten pool images that is very important for the control of welding seam tracking. The physical meaning of the curvature extrema of the molten pool contour is revealed by studying the molten pool images: the deviation information points of the welding wire center and the molten tip center are the maximum and the local maximum of the contour curvature, and the horizontal welding deviation is the position difference between these two extremum points. A new method of welding deviation detection is presented, including preprocessing the molten pool images, extracting and segmenting the contours, obtaining the contour extremum points, and calculating the welding deviation. Extracting the contours is the premise, segmenting the contour lines is the foundation, and obtaining the contour extremum points is the key. The contour images can be extracted with the discrete dyadic wavelet transform and divided into two sub-contours covering the welding wire and the molten tip separately. The curvature of each point on the two sub-contour lines is calculated based on a multi-point approximate curvature formula for plane curves, and the two curvature extremum points are the characteristics needed for the welding deviation calculation. The results of tests and analyses show that the maximum error of the obtained on-line welding deviation is 2 pixels (0.16 mm), and the algorithm is stable enough to meet the real-time control requirements of the pipeline at speeds below 500 mm/min. The method can be applied to on-line automatic welding deviation detection.
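The contour-extremum step can be sketched with a simple discrete curvature proxy: approximate the curvature at each contour point by the turning angle between its two adjacent segments and take the global maximum. The V-shaped toy contour below stands in for a molten-pool edge; the paper's multi-point curvature formula is more elaborate.

```python
# Illustrative curvature-extremum finder: the turning angle between
# successive contour segments serves as a discrete curvature proxy.
import math

def turning_angle(p_prev, p, p_next):
    """Absolute exterior angle between the two segments meeting at p."""
    a1 = math.atan2(p[1] - p_prev[1], p[0] - p_prev[0])
    a2 = math.atan2(p_next[1] - p[1], p_next[0] - p[0])
    return abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1)))

def curvature_extremum(contour):
    """Index of the contour point with maximal turning angle."""
    best_i, best = None, -1.0
    for i in range(1, len(contour) - 1):
        k = turning_angle(contour[i - 1], contour[i], contour[i + 1])
        if k > best:
            best_i, best = i, k
    return best_i

# A shallow contour with a sharp corner at index 3 (the "tip").
contour = [(0, 0), (1, 0.1), (2, 0.2), (3, 2.0), (4, 0.2), (5, 0.1)]
print(curvature_extremum(contour))  # 3
```

Applying this once to the wire sub-contour and once to the molten-tip sub-contour would yield the two extremum points whose horizontal offset is the welding deviation.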
Reviving common standards in point-count surveys for broad inference across studies
Matsuoka, Steven M.; Mahon, C. Lisa; Handel, Colleen M.; Solymos, Peter; Bayne, Erin M.; Fontaine, Patricia C.; Ralph, C.J.
2014-01-01
We revisit the common standards recommended by Ralph et al. (1993, 1995a) for conducting point-count surveys to assess the relative abundance of landbirds breeding in North America. The standards originated from discussions among ornithologists in 1991 and were developed so that point-count survey data could be broadly compared and jointly analyzed by national data centers with the goals of monitoring populations and managing habitat. Twenty years later, we revisit these standards because (1) they have not been universally followed and (2) new methods allow estimation of absolute abundance from point counts, but these methods generally require data beyond the original standards to account for imperfect detection. Lack of standardization and the complications it introduces for analysis become apparent from aggregated data. For example, only 3% of 196,000 point counts conducted during the period 1992-2011 across Alaska and Canada followed the standards recommended for the count period and count radius. Ten-minute, unlimited-count-radius surveys increased the number of birds detected by >300% over 3-minute, 50-m-radius surveys. This effect size, which could be eliminated by standardized sampling, was ≥10 times the published effect sizes of observers, time of day, and date of the surveys. We suggest that the recommendations by Ralph et al. (1995a) continue to form the common standards when conducting point counts. This protocol is inexpensive and easy to follow but still allows the surveys to be adjusted for detection probabilities. Investigators might optionally collect additional information so that they can analyze their data with more flexible forms of removal and time-of-detection models, distance sampling, multiple-observer methods, repeated counts, or combinations of these methods. 
Maintaining the common standards as a base protocol, even as these study-specific modifications are added, will maximize the value of point-count data, allowing compilation and analysis by regional and national data centers.
A Study of Dim Object Detection for the Space Surveillance Telescope
2013-03-21
ENG-13-M-32 Abstract: Current methods of dim object detection for space surveillance make use of a Gaussian log-likelihood-ratio-test-based... quantitatively comparing the efficacy of two methods for dim object detection, termed in this paper the point detector and the correlator, both of which rely... applications. It is used in national defense for detecting satellites. It is used to detect space debris, which threatens both civilian and
NASA Astrophysics Data System (ADS)
Tupas, M. E. A.; Dasallas, J. A.; Jiao, B. J. D.; Magallon, B. J. P.; Sempio, J. N. H.; Ramos, M. K. F.; Aranas, R. K. D.; Tamondong, A. M.
2017-10-01
The FAST-SIFT corner detector and descriptor extractor combination was used to automatically georeference DIWATA-1 Spaceborne Multispectral Imager (SMI) images. The Features from Accelerated Segment Test (FAST) algorithm detects corners or keypoints in an image, and these robustly detected keypoints have well-defined positions. Descriptors were computed using the Scale-Invariant Feature Transform (SIFT) extractor. The FAST-SIFT method effectively matched SMI same-subscene images detected by the NIR sensor. The method was also tested in stitching NIR images with varying subscenes swept by the camera. The slave images were matched to the master image, and the keypoints served as the ground control points. Keypoints are matched based on their descriptor vectors: nearest-neighbor matching is employed based on a metric distance between the descriptors, such as the Euclidean or city-block distance. Rough matching outputs not only correct matches but also faulty ones. A previous work in automatic georeferencing incorporates a geometric restriction; in this work, we applied a simplified version of that method. Random sample consensus (RANSAC) was used to eliminate fall-out matches and ensure the accuracy of the feature points from which the transformation parameters were derived: it identifies whether a point fits the transformation function and returns the inlier matches. The transformation matrix was solved using affine, projective, and polynomial models. The accuracy of the automatic georeferencing method was determined by calculating the RMSE of randomly selected interest points between the master image and the transformed slave image.
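The final model-fitting step can be sketched for the affine case: with three non-collinear control-point pairs, the six affine parameters are obtained by solving two small linear systems. The correspondences below are made up (a pure translation), and real pipelines would solve a least-squares system over many RANSAC inliers instead.

```python
# Sketch: estimate a 6-parameter affine transform x' = a*x + b*y + c,
# y' = d*x + e*y + f from three non-collinear control-point pairs.
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine(src, dst):
    """Affine parameters (a, b, c, d, e, f) mapping src points onto dst points."""
    A = [[x, y, 1.0] for x, y in src]
    abc = solve3(A, [x for x, _ in dst])
    def_ = solve3(A, [y for _, y in dst])
    return abc + def_

# Hypothetical slave-image points and their master-image locations (+2, +1 shift).
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(2.0, 1.0), (3.0, 1.0), (2.0, 2.0)]
a, b, c, d, e, f = fit_affine(src, dst)
print(round(c, 6), round(f, 6))  # 2.0 1.0
```

The RMSE check described in the abstract would then compare held-out interest points in the master image against the same points pushed through this fitted transform.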
[Automated analyzer of enzyme immunoassay].
Osawa, S
1995-09-01
Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.
Automatic Classification of Trees from Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Sirmacek, B.; Lindenbergh, R.
2015-08-01
Development of laser scanning technologies has promoted tree monitoring studies to a new level, as laser scanning point clouds enable accurate 3D measurements in a fast and environmentally friendly manner. In this paper, we introduce a probability matrix computation based algorithm for automatically classifying laser scanning point clouds into 'tree' and 'non-tree' classes. Our method uses the 3D coordinates of the laser scanning points as input and generates a new point cloud which holds a label for each point indicating whether it belongs to the 'tree' or 'non-tree' class. To do so, a grid surface is assigned to the lowest height level of the point cloud. The grid cells are filled with probability values calculated by checking the point density above each cell. Since tree trunk locations appear with very high values in the probability matrix, selecting the local maxima of the grid surface helps to detect the tree trunks. Further points are assigned to tree trunks if they appear in close proximity to trunks. Since heavy mathematical computations (such as point cloud organization, detailed 3D shape detection methods, or graph network generation) are not required, the proposed algorithm works very fast compared to existing methods. The tree classification results are found to be reliable even on point clouds of cities containing many different objects. The most significant weakness is that false detection of light poles, traffic signs, and other objects close to trees cannot be prevented. Nevertheless, the experimental results on mobile and airborne laser scanning point clouds indicate the possible usage of the algorithm as an important step for tree growth observation, tree counting, and similar applications. While the laser scanning point cloud gives the opportunity to classify even very small trees, the accuracy of the results is reduced in low-point-density areas farther away from the scanning location.
The advantages and disadvantages of the two laser scanning point cloud sources (mobile and airborne) are discussed in detail.
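The probability-matrix idea above can be sketched in a few lines: project the points onto a ground grid, treat the per-cell point count (density above the cell) as the probability score, and take local maxima of the grid as candidate trunk locations. The toy point cloud below is synthetic.

```python
# Rough sketch of the density-grid trunk detector (illustrative toy).
from collections import Counter

def density_grid(points, cell=1.0):
    """Count of points falling above each (x, y) ground-grid cell."""
    grid = Counter()
    for x, y, _z in points:
        grid[(int(x // cell), int(y // cell))] += 1
    return grid

def local_maxima(grid):
    """Cells whose count strictly exceeds all 8 neighbouring cells."""
    maxima = []
    for (i, j), n in grid.items():
        neighbours = [grid[(i + di, j + dj)]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if (di, dj) != (0, 0)]
        if n > max(neighbours):
            maxima.append((i, j))
    return maxima

# Dense vertical stack of points at (2, 3) - a trunk - plus sparse ground points.
cloud = [(2.1, 3.2, z * 0.5) for z in range(10)] + [(0.4, 0.5, 0.0), (5.5, 1.2, 0.1)]
print(sorted(local_maxima(density_grid(cloud))))
```

Note that the isolated single ground points also come out as (weak) local maxima here, which mirrors the weakness the abstract reports: vertically dense non-tree objects such as poles cannot be separated by density alone.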
De Francesco, Vincenzo; Zullo, Angelo; Giorgio, Floriana; Saracino, Ilaria; Zaccaro, Cristina; Hassan, Cesare; Ierardi, Enzo; Di Leo, Alfredo; Fiorini, Giulia; Castelli, Valentina; Lo Re, Giovanna; Vaira, Dino
2014-03-01
Primary clarithromycin resistance is the main factor affecting the efficacy of Helicobacter pylori therapy. This study aimed: (i) to assess the concordance between phenotypic (culture) and genotypic (real-time PCR) tests in resistant strains; (ii) to search, in cases of disagreement between the methods, for point mutations other than those reported as the most frequent in Europe; and (iii) to compare the MICs associated with the single point mutations. In order to perform real-time PCR, we retrieved biopsies from patients in whom H. pylori infection was successfully diagnosed by bacterial culture and clarithromycin resistance was assessed using the Etest. Only patients who had never been previously treated, and with H. pylori strains that were either resistant exclusively to clarithromycin or without any resistance, were included. Biopsies from 82 infected patients were analysed, including 42 strains that were clarithromycin resistant and 40 that were clarithromycin susceptible on culture. On genotypic analysis, at least one of the three most frequently reported point mutations (A2142C, A2142G and A2143G) was detected in only 23 cases (54.8%), with a concordance between the two methods of 0.67. Novel point mutations (A2115G, G2141A and A2144T) were detected in a further 14 out of 19 discordant cases, increasing the resistance detection rate of PCR to 88% (P<0.001; odds ratio 6.1, 95% confidence interval 2-18.6) and the concordance to 0.81. No significant differences in MIC values among the different point mutations were observed. This study suggests that: (i) the prevalence of the usually reported point mutations may be decreasing, with a concomitant emergence of new mutations; (ii) PCR-based methods should search for at least six point mutations to achieve good accuracy in detecting clarithromycin resistance; and (iii) none of the tested point mutations is associated with significantly higher MIC values than the others.
NASA Astrophysics Data System (ADS)
Vu, Tinh Thi; Kiesel, Jens; Guse, Bjoern; Fohrer, Nicola
2017-04-01
The damming of rivers causes one of the most considerable impacts of our society on the riverine environment. More than 50% of the world's streams and rivers are currently impounded by dams before reaching the oceans. The construction of dams is of high importance in developing and emerging countries, i.e. for power generation and water storage. In the Vietnamese Vu Gia - Thu Bon Catchment (10,350 km2), about 23 dams were built during the last decades and store approximately 2,156 billion m3 of water. The water impoundment in 10 dams in upstream regions amounts to 17 % of the annual discharge volume. It is expected that impacts from these dams have altered the natural flow regime. However, up to now it is unclear how the flow regime was altered. To this end, it needs to be investigated at what point in time these changes became significant and detectable. Many approaches exist to detect changes in stationarity or consistency of hydrological records using statistical analysis of time series for the pre- and post-dam period. The objective of this study is to reliably detect and assess hydrologic shifts occurring in the discharge regime of an anthropogenically influenced river basin, mainly affected by the construction of dams. To achieve this, we applied nine available change-point tests to detect changes in mean, variance and median on the daily and annual discharge records at two main gauges of the basin. The tests yield conflicting results: The majority of tests found abrupt changes that coincide with the damming period, while others did not. To interpret how significant the changes in discharge regime are, and to which different properties of the time series each test responded, we calculated Indicators of Hydrologic Alteration (IHAs) for the time period before and after the detected change points.
From the results, we can deduce that the change-point tests are influenced to different degrees by different indicator groups (magnitude, duration, frequency, etc.) and that, within the indicator groups, some indicators are more sensitive than others. For instance, extreme low flow, especially the 7- and 30-day minima and mean minimum low flow, as well as the variability of monthly flow, are highly sensitive to most detected change points. Our study clearly shows that the detected change points depend on which test is chosen. For an objective assessment of change points, it is therefore necessary to explain the change points by calculating differences in IHAs. This analysis can be used to assess which change-point method reacts to which type of hydrological change and, more importantly, it can be used to rank the change points according to their overall impact on the discharge regime. This leads to an improved evaluation of hydrologic change points caused by anthropogenic impacts.
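One widely used change-point test of the kind surveyed above is the Pettitt test for a shift in the median; the pure-Python sketch below applies it to a hypothetical pre-/post-dam discharge series (the p-value uses the standard large-sample approximation).

```python
import math

def pettitt(x):
    """Pettitt change-point test: returns (change-point index, approx. p-value)."""
    n = len(x)
    sign = lambda v: (v > 0) - (v < 0)
    # U_t = sum over i <= t, j > t of sign(x_i - x_j)
    u = [sum(sign(x[i] - x[j]) for i in range(t + 1) for j in range(t + 1, n))
         for t in range(n - 1)]
    k = max(abs(v) for v in u)
    t_hat = max(range(n - 1), key=lambda t: abs(u[t]))
    p = min(1.0, 2.0 * math.exp(-6.0 * k * k / (n ** 3 + n ** 2)))
    return t_hat, p

# hypothetical pre-/post-dam daily discharge: the mean drops after index 5
flow = [50, 52, 49, 51, 53, 50, 30, 31, 29, 32, 30, 31]
t, p = pettitt(flow)
print(t, p < 0.05)  # 5 True
```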
A Voxel-Based Metadata Structure for Change Detection in Point Clouds of Large-Scale Urban Areas
NASA Astrophysics Data System (ADS)
Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.
2018-05-01
Mobile laser scanning has the potential not only to create detailed representations of urban environments, but also to determine changes at a very detailed level. An environment representation for change detection in large-scale urban environments based on point clouds has drawbacks in terms of memory scalability. Volumes, however, are a promising building block for memory-efficient change detection methods. The challenge of working with 3D occupancy grids is that the usual raycasting-based methods applied for their generation lead to artifacts caused by the traversal of unfavorably discretized space. These artifacts have the potential to distort the state of voxels in close proximity to planar structures. In this work we propose a raycasting approach that utilizes knowledge about planar surfaces to prevent this kind of artifact entirely. To demonstrate the capabilities of our approach, a method for the iterative volumetric approximation of point clouds that speeds up the raycasting by 36 percent is also proposed.
Solar cell anomaly detection method and apparatus
NASA Technical Reports Server (NTRS)
Miller, Emmett L. (Inventor); Shumka, Alex (Inventor); Gauthier, Michael K. (Inventor)
1981-01-01
A method is provided for detecting cracks and other imperfections in a solar cell, which includes scanning a narrow light beam back and forth across the cell in a raster pattern, while monitoring the electrical output of the cell to find locations where the electrical output varies significantly. The electrical output can be monitored on a television type screen containing a raster pattern with each point on the screen corresponding to a point on the solar cell surface, and with the brightness of each point on the screen corresponding to the electrical output from the cell which was produced when the light beam was at the corresponding point on the cell. The technique can be utilized to scan a large array of interconnected solar cells, to determine which ones are defective.
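The screening logic described above can be sketched as a simple threshold on the raster of output readings; the grid and the 50% drop threshold are illustrative assumptions, not the patent's specification.

```python
from statistics import median

# Sketch: flagging anomalies in a raster scan of a solar cell by marking
# points whose electrical output falls well below the cell-wide median.
def find_anomalies(scan, rel_drop=0.5):
    """scan: 2-D list of output readings; returns (row, col) of weak spots."""
    flat = [v for row in scan for v in row]
    m = median(flat)
    return [(i, j) for i, row in enumerate(scan)
            for j, v in enumerate(row) if v < rel_drop * m]

scan = [[1.0, 1.0, 1.0],
        [1.0, 0.2, 1.0],   # a crack under the beam at (1, 1)
        [1.0, 1.0, 0.9]]
print(find_anomalies(scan))  # [(1, 1)]
```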
Layer stacking: A novel algorithm for individual forest tree segmentation from LiDAR point clouds
Elias Ayrey; Shawn Fraver; John A. Kershaw; Laura S. Kenefic; Daniel Hayes; Aaron R. Weiskittel; Brian E. Roth
2017-01-01
As light detection and ranging (LiDAR) technology advances, it has become common for datasets to be acquired at a point density high enough to capture structural information from individual trees. To process these data, an automatic method of isolating individual trees from a LiDAR point cloud is required. Traditional methods for segmenting trees attempt to isolate...
Field Demonstration of a Multiplexed Point-of-Care Diagnostic Platform for Plant Pathogens.
Lau, Han Yih; Wang, Yuling; Wee, Eugene J H; Botella, Jose R; Trau, Matt
2016-08-16
Effective disease management strategies to prevent catastrophic crop losses require rapid, sensitive, and multiplexed detection methods for timely decision making. To address this need, a rapid, highly specific and sensitive point-of-care method for multiplex detection of plant pathogens was developed by taking advantage of surface-enhanced Raman scattering (SERS) labeled nanotags and recombinase polymerase amplification (RPA), which is a rapid isothermal amplification method with high specificity. In this study, three agriculturally important plant pathogens (Botrytis cinerea, Pseudomonas syringae, and Fusarium oxysporum) were used to demonstrate potential translation into the field. The RPA-SERS method was faster and more sensitive than the polymerase chain reaction, and could detect as few as 2 copies of B. cinerea DNA. Furthermore, multiplex detection of the three pathogens was demonstrated for complex systems such as the Arabidopsis thaliana plant and commercial tomato crops. To demonstrate the potential for on-site field applications, a rapid single-tube RPA/SERS assay was further developed and successfully performed for a specific target outside of a laboratory setting.
Bacteriophage Amplification-Coupled Detection and Identification of Bacterial Pathogens
NASA Astrophysics Data System (ADS)
Cox, Christopher R.; Voorhees, Kent J.
Current methods of species-specific bacterial detection and identification are complex, time-consuming, and often require expensive specialized equipment and highly trained personnel. Numerous biochemical and genotypic identification methods have been applied to bacterial characterization, but all rely on tedious microbiological culturing practices and/or costly sequencing protocols which render them impractical for deployment as rapid, cost-effective point-of-care or field detection and identification methods. With a view towards addressing these shortcomings, we have exploited the evolutionarily conserved interactions between a bacteriophage (phage) and its bacterial host to develop species-specific detection methods. Phage amplification-coupled matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was utilized to rapidly detect phage propagation resulting from species-specific in vitro bacterial infection. This novel signal amplification method allowed for bacterial detection and identification in as little as 2 h, and when combined with disulfide bond reduction methods developed in our laboratory to enhance MALDI-TOF-MS resolution, was observed to lower the limit of detection by several orders of magnitude over conventional spectroscopy and phage typing methods. Phage amplification has been combined with lateral flow immunochromatography (LFI) to develop rapid, easy-to-operate, portable, species-specific point-of-care (POC) detection devices. Prototype LFI detectors have been developed and characterized for Yersinia pestis and Bacillus anthracis, the etiologic agents of plague and anthrax, respectively. Comparable sensitivity and rapidity were observed when phage amplification was adapted to a species-specific handheld LFI detector, thus allowing for rapid, simple, POC bacterial detection and identification while eliminating the need for bacterial culturing or DNA isolation and amplification techniques.
Point target detection utilizing super-resolution strategy for infrared scanning oversampling system
NASA Astrophysics Data System (ADS)
Wang, Longguang; Lin, Zaiping; Deng, Xinpu; An, Wei
2017-11-01
To improve the resolution of remote sensing infrared images, infrared scanning oversampling systems are employed, quadrupling the amount of information, which contributes to target detection. Generally, the image data from the double-line detector of an infrared scanning oversampling system are interleaved into a whole oversampled image for post-processing, whereas the aliasing between neighboring pixels leads to image degradation with a great impact on target detection. This paper formulates a point target detection method utilizing a super-resolution (SR) strategy for infrared scanning oversampling systems, with an accelerated SR strategy proposed to realize fast de-aliasing of the oversampled image and an adaptive MRF-based regularization designed to achieve the preservation and aggregation of target energy. Extensive experiments demonstrate the superior detection performance, robustness and efficiency of the proposed method compared with other state-of-the-art approaches.
Detection of small surface defects using DCT based enhancement approach in machine vision systems
NASA Astrophysics Data System (ADS)
He, Fuqiang; Wang, Wen; Chen, Zichen
2005-12-01
Utilizing a DCT based enhancement approach, an improved small defect detection algorithm for real-time leather surface inspection was developed. A two-stage decomposition procedure was proposed to extract an odd-odd frequency matrix after a digital image has been transformed to the DCT domain. Then, the reverse cumulative sum algorithm was proposed to detect the transition points of the gentle curves plotted from the odd-odd frequency matrix. The best radius of the cutting sector was computed in terms of the transition points and the high-pass filtering operation was implemented. The filtered image was then inverse-transformed back to the spatial domain. Finally, the restored image was segmented by an entropy method and some defect features were calculated. Experimental results show the proposed method achieves a small-defect detection rate of 94%.
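The transition-point step can be illustrated with a plain cumulative-sum statistic, a simplified stand-in for the paper's reverse cumulative sum algorithm (the curve below is hypothetical).

```python
# Sketch: locating a transition point on a gently varying 1-D curve with a
# cumulative-sum statistic (simplified stand-in for the reverse cumulative sum).
def transition_point(curve):
    """Index where the cumulative deviation from the mean is most extreme."""
    mean = sum(curve) / len(curve)
    cusum, s = [], 0.0
    for v in curve:
        s += v - mean
        cusum.append(s)
    return max(range(len(cusum)), key=lambda i: abs(cusum[i]))

curve = [1, 1, 1, 1, 1, 5, 5, 5]  # level shift after index 4
print(transition_point(curve))  # 4
```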
Sawamura, Kensuke; Hashimoto, Masahiko
2017-01-01
A fluorescence quenching assay based on a ligase detection reaction was developed for facile and rapid detection of point mutations present in a mixed population of non-variant DNA. If the test DNA carried a targeted mutation, then the two allele-specific primers were ligated to form a molecular beacon resulting in the expected fluorescence quenching signatures. Using this method, we successfully detected as low as 5% mutant DNA in a mixture of wild-type DNA (t test at 99% confidence level).
Semi-Tomographic Gamma Scanning Technique for Non-Destructive Assay of Radioactive Waste Drums
NASA Astrophysics Data System (ADS)
Gu, Weiguo; Rao, Kaiyuan; Wang, Dezhong; Xiong, Jiemei
2016-12-01
Segmented gamma scanning (SGS) and tomographic gamma scanning (TGS) are two traditional detection techniques for low- and intermediate-level radioactive waste drums. This paper proposes a detection method named semi-tomographic gamma scanning (STGS) to avoid the poor detection accuracy of SGS and shorten the detection time of TGS. The method and its algorithm synthesize the principles of SGS and TGS: each segment is divided into annular voxels and tomography is used in the radiation reconstruction. The accuracy of STGS is verified by experiments and simulations simultaneously for 208-liter standard waste drums containing three types of nuclides. Cases with a single point source or multiple point sources, and with uniform or nonuniform materials, are employed for comparison. The results show that STGS exhibits a large improvement in detection performance, and the reconstruction error and statistical bias are reduced by one quarter to one third or less for most cases compared with SGS.
Building Facade Modeling Under Line Feature Constraint Based on Close-Range Images
NASA Astrophysics Data System (ADS)
Liang, Y.; Sheng, Y. H.
2018-04-01
To solve existing problems in modeling building facades merely with point features from close-range images, a new method for modeling building facades under a line feature constraint is proposed in this paper. Firstly, camera parameters and sparse spatial point clouds were recovered using SfM, and 3D dense point clouds were generated with MVS. Secondly, line features were detected based on the gradient direction, fitted considering directions and lengths, then matched under multiple types of constraints and extracted from the multi-image sequence. Finally, the facade mesh of a building was triangulated from the point cloud and line features. The experiment shows that this method can effectively reconstruct the geometric facade of buildings by combining the point and line features of the close-range image sequence, and is especially effective in restoring the contour information of building facades.
NASA Astrophysics Data System (ADS)
Petrovic, Goran; Kilic, Tomislav; Terzic, Bozo
2009-04-01
In this paper a sensorless speed detection method for squirrel-cage induction machines is presented. The method is based on determining the frequency of the primary slot harmonic of the stator neutral-point voltage, which depends on the rotor speed. To validate the method under steady-state and dynamic conditions, simulation and experimental studies were carried out. For the theoretical investigation, a mathematical model of squirrel-cage induction machines that takes into consideration the actual geometry and winding layout is used. Speed-related harmonics that arise from rotor slotting are analyzed using digital signal processing, specifically a DFT algorithm with a Hanning window. The performance of the method is demonstrated over a wide range of load conditions.
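The frequency-determination step can be sketched as a Hann-windowed DFT peak search; the sampling rate and the 50 Hz test tone below are illustrative assumptions, not the paper's measurements.

```python
import cmath, math

def dominant_freq(x, fs):
    """Frequency (Hz) of the strongest non-DC bin of a Hann-windowed DFT."""
    n = len(x)
    w = [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]  # Hann
    xw = [xi * wi for xi, wi in zip(x, w)]
    mags = [abs(sum(xw[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2)]
    k_peak = max(range(1, n // 2), key=mags.__getitem__)   # skip the DC bin
    return k_peak * fs / n

fs = 1000.0                                               # sampling rate, Hz
x = [math.sin(2 * math.pi * 50.0 * t / fs) for t in range(100)]
print(dominant_freq(x, fs))  # 50.0
```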
Low-complexity object detection with deep convolutional neural network for embedded systems
NASA Astrophysics Data System (ADS)
Tripathi, Subarna; Kang, Byeongkeun; Dane, Gokce; Nguyen, Truong
2017-09-01
We investigate low-complexity convolutional neural networks (CNNs) for object detection for embedded vision applications. It is well known that deploying CNN-based object detection on an embedded system is more challenging than problems like image classification because of its computation and memory requirements. To meet these requirements, we design and develop an end-to-end TensorFlow (TF)-based fully-convolutional deep neural network for the generic object detection task, inspired by one of the fastest frameworks, YOLO. The proposed network predicts the localization of every object by regressing the coordinates of the corresponding bounding box, as in YOLO. Hence, the network is able to detect objects without any limitations on their size. However, unlike YOLO, all the layers in the proposed network are fully convolutional, so it can take input images of any size. We pick face detection as a use case. We evaluate the proposed model for face detection on the FDDB and Widerface datasets. As another use case of generic object detection, we evaluate its performance on the PASCAL VOC dataset. The experimental results demonstrate that the proposed network can predict object instances of different sizes and poses in a single frame. Moreover, the results show that the proposed method achieves accuracy comparable to state-of-the-art CNN-based object detection methods while reducing the model size by 3× and memory bandwidth by 3 - 4× compared with one of the best real-time CNN-based object detectors, YOLO. Our 8-bit fixed-point TF model provides an additional 4× memory reduction while keeping the accuracy nearly as good as that of the floating-point model. Moreover, the fixed-point model achieves 20× faster inference than the floating-point model. Thus, the proposed method is promising for embedded implementations.
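A generic symmetric 8-bit quantization of the kind that underlies such fixed-point models can be sketched as follows; this is an illustration of the idea, not the paper's exact scheme.

```python
# Sketch: symmetric 8-bit fixed-point quantization of weights, the kind of
# step behind the 4x memory reduction above (illustrative, not the paper's scheme).
def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127.0          # one scale per tensor
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.5, -0.25, 0.127, -0.9]
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# round-to-nearest keeps the per-weight error within one quantization step
print(max(abs(a - b) for a, b in zip(w, w_hat)) < s)  # True
```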
Detection of longitudinal visual field progression in glaucoma using machine learning.
Yousefi, Siamak; Kiwaki, Taichi; Zheng, Yuhui; Suigara, Hiroki; Asaoka, Ryo; Murata, Hiroshi; Lemij, Hans; Yamanishi, Kenji
2018-06-16
Global indices of standard automated perimetry are insensitive to localized losses, while point-wise indices are sensitive but highly variable. Region-wise indices sit in between. This study introduces a machine-learning-based index for glaucoma progression detection that outperforms global, region-wise, and point-wise indices. Development and comparison of a prognostic index. Visual fields from 2085 eyes of 1214 subjects were used to identify glaucoma progression patterns using machine learning. Visual fields from 133 eyes of 71 glaucoma patients were collected 10 times over 10 weeks to provide a no-change, test-retest dataset. The parameters of all methods were identified using visual field sequences in the test-retest dataset to meet fixed 95% specificity. An independent dataset of 270 eyes of 136 glaucoma patients and survival analysis were utilized to compare methods. The time to detect progression in 25% of the eyes in the longitudinal dataset using global mean deviation (MD) was 5.2 years (95% confidence interval, 4.1 - 6.5 years); 4.5 years (4.0 - 5.5) using region-wise, 3.9 years (3.5 - 4.6) using point-wise, and 3.5 years (3.1 - 4.0) using machine learning analysis. The time until 25% of eyes showed subsequently confirmed progression, after two additional visits were included, was 6.6 years (5.6 - 7.4 years), 5.7 years (4.8 - 6.7), 5.6 years (4.7 - 6.5), and 5.1 years (4.5 - 6.0) for global, region-wise, point-wise, and machine learning analyses, respectively. Machine learning analysis consistently detects progressing eyes earlier than the other methods, with or without confirmation visits. In particular, machine learning detects more of the slowly progressing eyes than the other methods.
Apparatus for point-of-care detection of nucleic acid in a sample
Bearinger, Jane P.; Dugan, Lawrence C.
2016-04-19
Provided herein are methods and apparatus for detecting a target nucleic acid in a sample and related methods and apparatus for diagnosing a condition in an individual. The condition is associated with presence of nucleic acid produced by certain pathogens in the individual.
Infrared dim and small target detecting and tracking method inspired by Human Visual System
NASA Astrophysics Data System (ADS)
Dong, Xiabin; Huang, Xinsheng; Zheng, Yongbin; Shen, Lurong; Bai, Shengjian
2014-01-01
Detecting and tracking dim and small targets in infrared images and videos is one of the most important techniques in many computer vision applications, such as video surveillance and infrared imaging precise guidance. Recently, more and more algorithms based on the Human Visual System (HVS) have been proposed to detect and track infrared dim and small targets. In general, the HVS involves at least three mechanisms: contrast, visual attention and eye movement. However, most existing algorithms simulate only one of these mechanisms, resulting in many drawbacks. A novel method which combines the three mechanisms of the HVS is proposed in this paper. First, a group of Difference of Gaussians (DOG) filters, which simulate the contrast mechanism, are used to filter the input image. Second, visual attention, simulated by a Gaussian window, is added at a point near the target, named the attention point, in order to further enhance the dim small target. Finally, the Proportional-Integral-Derivative (PID) algorithm is introduced to predict the attention point of the next frame, simulating human eye movement. Experimental results on infrared images with different types of backgrounds demonstrate the high efficiency and accuracy of the proposed method in detecting and tracking dim and small targets.
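The contrast mechanism can be illustrated in one dimension: a Difference of Gaussians filter suppresses a smooth background while enhancing a dim, one-sample target. The signal and the filter scales below are illustrative assumptions.

```python
import math

# Sketch: a 1-D Difference of Gaussians (DOG) filter enhancing a dim, small
# target on a smooth ramp background (illustrative scales and data).
def gauss_kernel(sigma, radius):
    k = [math.exp(-i * i / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(x, k):
    r = len(k) // 2
    return [sum(k[j + r] * x[min(max(i + j, 0), len(x) - 1)]
                for j in range(-r, r + 1)) for i in range(len(x))]

def dog(x, s1=1.0, s2=3.0, radius=9):
    g1 = convolve(x, gauss_kernel(s1, radius))
    g2 = convolve(x, gauss_kernel(s2, radius))
    return [a - b for a, b in zip(g1, g2)]

# smooth background ramp + a one-sample "target" at index 30
signal = [0.1 * i for i in range(60)]
signal[30] += 5.0
response = dog(signal)
print(response.index(max(response)))  # 30
```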
Nondestructive detection of decay in living trees
Bertil Larsson; Bengt Bengtsson; Mats Gustaffson
2004-01-01
We used a four-point resistivity method to detect wood decay in living trees. A low-frequency alternating current was applied to the stem and the induced voltage was measured between two points along the stem. The effective resistivity of the stem was estimated based on the stem cross-sectional area. A comparison within a group of trees showed that trees with butt rot had an...
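The resistivity estimate described above reduces to rho = (V / I) * A / L; a sketch with hypothetical measurement values follows.

```python
import math

# Sketch: effective stem resistivity from a four-point measurement,
# rho = (V / I) * A / L, assuming a circular stem cross-section.
def effective_resistivity(v_volts, i_amps, diameter_m, electrode_gap_m):
    area = math.pi * (diameter_m / 2) ** 2          # stem cross-sectional area
    resistance = v_volts / i_amps                   # R = V / I
    return resistance * area / electrode_gap_m      # ohm-metres

# hypothetical readings: 0.5 V induced at 10 mA on a 0.3 m stem, 1 m electrode gap
rho = effective_resistivity(v_volts=0.5, i_amps=0.01, diameter_m=0.3, electrode_gap_m=1.0)
print(round(rho, 2))  # 3.53
```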
Animals as Mobile Biological Sensors for Forest Fire Detection.
Sahin, Yasar Guneri
2007-12-04
This paper proposes a mobile biological sensor system that can assist in early detection of forest fires, one of the most dreaded natural disasters on earth. The main idea presented in this paper is to utilize animals with sensors as Mobile Biological Sensors (MBS). The devices used in this system are animals, which are native animals living in forests; sensors (thermo and radiation sensors with GPS features) that measure the temperature and transmit the location of the MBS; access points for wireless communication; and a central computer system which classifies animal actions. The system offers two different methods. Firstly, access points continuously receive data about animals' locations using GPS at certain time intervals, and the gathered data is then classified and checked to see if there is a sudden movement (panic) of the animal groups; this method is called animal behavior classification (ABC). The second method can be defined as thermal detection (TD): the access points get the temperature values from the MBS devices and send the data to a central computer to check for instant changes in the temperatures. This system may be used for many purposes other than fire detection, namely animal tracking, poaching prevention and detecting instantaneous animal death.
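The thermal detection (TD) rule can be sketched as a simple step-change check on successive readings from one MBS device; the 5-degree jump threshold is an illustrative assumption.

```python
# Sketch: flag a sensor stream when temperature jumps sharply between
# successive readings (the jump threshold is a hypothetical parameter).
def sudden_rise(temps, jump=5.0):
    """temps: successive readings from one MBS; True if any step exceeds `jump`."""
    return any(b - a > jump for a, b in zip(temps, temps[1:]))

print(sudden_rise([21.0, 21.5, 22.0, 21.8]))   # False: normal drift
print(sudden_rise([21.0, 21.5, 30.2, 45.0]))   # True: possible fire nearby
```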
Motion estimation accuracy for visible-light/gamma-ray imaging fusion for portable portal monitoring
NASA Astrophysics Data System (ADS)
Karnowski, Thomas P.; Cunningham, Mark F.; Goddard, James S.; Cheriyadat, Anil M.; Hornback, Donald E.; Fabris, Lorenzo; Kerekes, Ryan A.; Ziock, Klaus-Peter; Gee, Timothy F.
2010-01-01
The use of radiation sensors as portal monitors is increasing due to heightened concerns over the smuggling of fissile material. Portable systems that can detect significant quantities of fissile material that might be present in vehicular traffic are of particular interest. We have constructed a prototype rapid-deployment gamma-ray imaging portal monitor that uses machine vision and gamma-ray imaging to monitor multiple lanes of traffic. Vehicles are detected and tracked by using point detection and optical flow methods as implemented in the OpenCV software library. Points are clustered together, but imperfections in the detected points and tracks cause errors in the accuracy of the vehicle position estimates. The resulting errors cause a "blurring" effect in the gamma image of the vehicle. To minimize these errors, we have compared a variety of motion estimation techniques, including an estimate using the median of the clustered points, a "best-track" filtering algorithm, and a constant-velocity motion estimation model. The accuracy of these methods is contrasted and compared to a manually verified ground-truth measurement by quantifying the root-mean-square differences in the times the vehicles cross the gamma-ray image pixel boundaries.
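The median-of-cluster estimate mentioned above can be sketched directly; the tracked feature points and the outlier below are hypothetical.

```python
from statistics import median

# Sketch: robust vehicle-position estimate from noisy tracked feature points
# using the per-axis median of the cluster (one of the variants compared above).
def cluster_position(points):
    """points: list of (x, y) feature locations on one vehicle."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return median(xs), median(ys)

# tracked points with one gross outlier (e.g. a point that slid to another car)
pts = [(100, 50), (102, 51), (101, 49), (103, 50), (400, 52)]
print(cluster_position(pts))  # (102, 50): the outlier barely moves the estimate
```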
NASA Astrophysics Data System (ADS)
Li, Yanran; Chen, Duo; Li, Li; Zhang, Jiwei; Li, Guang; Liu, Hongxia
2017-11-01
GIS (gas insulated switchgear) is an important piece of equipment in power systems. Partial discharge (PD) detection plays an important role in assessing the insulation performance of GIS, and the UHF method and the ultrasonic method are frequently used for this purpose. However, few studies have compared the two methods. From the viewpoint of safety, it is necessary to investigate both the UHF and ultrasonic methods for partial discharge in GIS. This paper presents a study aimed at clarifying the effectiveness of the UHF and ultrasonic methods for partial discharge caused by free metal particles in GIS. Partial discharge tests were performed in a laboratory-simulated environment. The obtained results show the anti-interference capability of signal detection and the accuracy of fault localization for the UHF and ultrasonic methods. A new PD detection method for GIS combining the UHF and ultrasonic methods is proposed in order to greatly enhance both the anti-interference capability of signal detection and the accuracy of localization.
Automatic comic page image understanding based on edge segment analysis
NASA Astrophysics Data System (ADS)
Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai
2013-12-01
Comic page image understanding aims to analyse the layout of comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is evaluated on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
NASA Astrophysics Data System (ADS)
Aijazi, A. K.; Malaterre, L.; Tazir, M. L.; Trassoudaine, L.; Checchin, P.
2016-06-01
This work presents a new method that automatically detects and analyzes surface defects such as corrosion spots of different shapes and sizes, on large ship hulls. In the proposed method several scans from different positions and viewing angles around the ship are registered together to form a complete 3D point cloud. The R, G, B values associated with each scan, obtained with the help of an integrated camera are converted into HSV space to separate out the illumination invariant color component from the intensity. Using this color component, different surface defects such as corrosion spots of different shapes and sizes are automatically detected, within a selected zone, using two different methods depending upon the level of corrosion/defects. The first method relies on a histogram based distribution whereas the second on adaptive thresholds. The detected corrosion spots are then analyzed and quantified to help better plan and estimate the cost of repair and maintenance. Results are evaluated on real data using different standard evaluation metrics to demonstrate the efficacy as well as the technical strength of the proposed method.
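The colour-space step can be sketched with the standard library's colorsys module: converting R, G, B to HSV and keeping only the illumination-invariant hue and saturation components. The two colours below are illustrative, not scan data.

```python
import colorsys

# Sketch: separate the illumination-invariant colour component from intensity
# by converting per-point R, G, B values to HSV and dropping the V channel.
def hue_saturation(r, g, b):
    """r, g, b in 0..255 -> (hue, saturation), both in 0..1."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h, s

# a rusty-orange point vs. the same colour under darker illumination
h1, s1 = hue_saturation(200, 90, 40)
h2, s2 = hue_saturation(100, 45, 20)
print(abs(h1 - h2) < 0.01)  # True: hue is nearly unchanged by the brightness drop
```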
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2000-01-01
We adapted a removal model to estimate detection probability during point count surveys. The model assumes one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently such as Winter Wren and Acadian Flycatcher had high detection probabilities (about 90%) and species that call infrequently such as Pileated Woodpecker had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
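A simplified version of the removal estimator, assuming equal interval lengths and a constant per-interval detection probability, can be fit by grid-search maximum likelihood; the counts below are hypothetical, not the Park data.

```python
import math

def removal_mle(counts):
    """Grid-search MLE of the per-interval detection probability p.
    Interval j's cell probability is p*(1-p)**j, renormalised over the k intervals
    (i.e. a truncated-geometric removal model with equal interval lengths)."""
    k = len(counts)
    def loglik(p):
        cell = [p * (1 - p) ** j for j in range(k)]
        tot = sum(cell)
        return sum(c * math.log(cell[j] / tot) for j, c in enumerate(counts))
    return max((i / 1000 for i in range(1, 1000)), key=loglik)

# hypothetical first-detection counts in three consecutive count intervals
print(round(removal_mle([50, 25, 12]), 2))  # ~0.51: roughly half detected per interval
```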
BPP: a sequence-based algorithm for branch point prediction.
Zhang, Qing; Fan, Xiaodan; Wang, Yejun; Sun, Ming-An; Shao, Jianlin; Guo, Dianjing
2017-10-15
Although high-throughput sequencing methods have been proposed to identify splicing branch points in the human genome, these methods can only detect a small fraction of the branch points, subject to the sequencing depth, experimental cost and the expression level of the mRNA. An accurate computational model for branch point prediction is therefore an ongoing objective in human genome research. We here propose a novel branch point prediction algorithm that utilizes information on the branch point sequence and the polypyrimidine tract. Using experimentally validated data, we demonstrate that our proposed method outperforms existing methods. Availability and implementation: https://github.com/zhqingit/BPP. Contact: djguo@cuhk.edu.hk. Supplementary data are available at Bioinformatics online.
Detecting Abrupt Changes in a Piecewise Locally Stationary Time Series
Last, Michael; Shumway, Robert
2007-01-01
Non-stationary time series arise in many settings, such as seismology, speech-processing, and finance. In many of these settings we are interested in points where a model of local stationarity is violated. We consider the problem of how to detect these change-points, which we identify by finding sharp changes in the time-varying power spectrum. Several different methods are considered, and we find that the symmetrized Kullback-Leibler information discrimination performs best in simulation studies. We derive asymptotic normality of our test statistic, and consistency of estimated change-point locations. We then demonstrate the technique on the problem of detecting arrival phases in earthquakes. PMID:19190715
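The discrimination statistic the authors found best can be sketched directly: the symmetrized Kullback-Leibler divergence between two normalised local power spectra. The spectra below are hypothetical.

```python
import math

def _norm(p):
    s = sum(p)
    return [v / s for v in p]

# Sketch: symmetrized Kullback-Leibler discrimination between two normalised
# power spectra; it is zero iff the spectra match and positive otherwise.
def sym_kl(f, g):
    f, g = _norm(f), _norm(g)
    return sum((fi - gi) * math.log(fi / gi) for fi, gi in zip(f, g))

# two hypothetical local power spectra (e.g. before/after a seismic arrival)
spec_a = [8.0, 4.0, 2.0, 1.0]
spec_b = [1.0, 2.0, 4.0, 8.0]
print(sym_kl(spec_a, spec_a) == 0.0, sym_kl(spec_a, spec_b) > 0.0)  # True True
```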
Rossi, Patrizia; Pozio, Edoardo
2008-01-01
The European Community Regulation (EC) No. 2075/2005 lays down specific rules on official controls for the detection of Trichinella in fresh meat for human consumption, recommending the pooled-sample digestion method as the reference method. The aim of this document is to provide specific guidance on implementing an appropriate Trichinella digestion method in a laboratory accredited according to the ISO/IEC 17025:2005 international standard and performing microbiological testing following the EA-04/10:2002 international guideline. Technical requirements for the correct implementation of the method, such as personnel competence, specific equipment and reagents, validation of the method, reference materials, sampling, quality assurance of results, and quality control of performance are provided, pointing out the critical control points for the correct implementation of the digestion method.
Estimating occupancy and abundance using aerial images with imperfect detection
Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Bower, Michael R.
2017-01-01
Species distribution and abundance are critical population characteristics for efficient management, conservation, and ecological insight. Point process models are a powerful tool for modelling distribution and abundance, and can incorporate many data types, including count data, presence-absence data, and presence-only data. Aerial photographic images are a natural tool for collecting data to fit point process models, but aerial images do not always capture all animals that are present at a site. Methods for estimating detection probability for aerial surveys usually include collecting auxiliary data to estimate the proportion of time animals are available to be detected. We developed an approach for fitting point process models using an N-mixture model framework to estimate detection probability for aerial occupancy and abundance surveys. Our method uses multiple aerial images taken of animals at the same spatial location to provide temporal replication of sample sites. The intersection of the images provides multiple counts of individuals at different times. We examined this approach using both simulated and real data of sea otters (Enhydra lutris kenyoni) in Glacier Bay National Park, southeastern Alaska. Using our proposed methods, we estimated detection probability of sea otters to be 0.76, the same as visual aerial surveys that have been used in the past. Further, simulations demonstrated that our approach is a promising tool for estimating occupancy, abundance, and detection probability from aerial photographic surveys. Our methods can be readily extended to data collected using unmanned aerial vehicles, as technology and regulations permit. The generality of our methods for other aerial surveys depends on how well surveys can be designed to meet the assumptions of N-mixture models.
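The N-mixture likelihood behind this kind of analysis can be sketched directly: latent abundance N at each site is Poisson(λ), and each repeated image yields a Binomial(N, p) count. The sketch below fits λ and p by a coarse grid search on made-up counts; the paper's spatial point process structure is omitted:

```python
import math

def nmix_loglik(lam, p, site_counts, n_max=40):
    """N-mixture log-likelihood: latent abundance N ~ Poisson(lam) per site,
    repeated counts y_t | N ~ Binomial(N, p) from temporally replicated images."""
    ll = 0.0
    for ys in site_counts:
        site = 0.0
        for n in range(max(ys), n_max + 1):
            lik = math.exp(-lam) * lam ** n / math.factorial(n)
            for y in ys:
                lik *= math.comb(n, y) * p ** y * (1.0 - p) ** (n - y)
            site += lik
        ll += math.log(site)
    return ll

# made-up counts: four sites, each photographed three times
site_counts = [[3, 4, 3], [2, 2, 1], [5, 4, 5], [1, 0, 1]]
lam_hat, p_hat = max(
    ((l / 4.0, q / 20.0) for l in range(2, 41) for q in range(1, 20)),
    key=lambda lp: nmix_loglik(lp[0], lp[1], site_counts),
)
```

Because E[y] = λp, the fitted product λ̂p̂ tracks the observed mean count while the repeat structure separates abundance from detection.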
Multi-window detection for P-wave in electrocardiograms based on bilateral accumulative area.
Chen, Riqing; Huang, Yingsong; Wu, Jian
2016-11-01
P-wave detection is one of the most challenging aspects in electrocardiograms (ECGs) due to its low amplitude, low frequency, and variable waveforms. This work introduces a novel multi-window detection method for P-wave delineation based on the bilateral accumulative area. The bilateral accumulative area is calculated by summing the areas covered by the P-wave curve with left and right sliding windows. The onset and offset of a positive P-wave correspond to the local maxima of the area detector. The position drift and difference in area variation of local extreme points with different windows are used to systematically combine multi-window and 12-lead synchronous detection methods, which are used to screen the optimization boundary points from all extreme points of different window widths and adaptively match the P-wave location. The proposed method was validated with ECG signals from various databases, including the Standard CSE Database, T-Wave Alternans Challenge Database, PTB Diagnostic ECG Database, and the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database. The average sensitivity Se was 99.44% with a positive predictivity P+ of 99.37% for P-wave detection. Standard deviations of 3.7 and 4.3 ms were achieved for the onset and offset of P-waves, respectively, which is in agreement with the accepted tolerances required by the CSE committee. Compared with well-known delineation methods, this method can achieve high sensitivity and positive predictability using a simple calculation process. The experiment results suggest that the bilateral accumulative area could be an effective detection tool for ECG signal analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Detection of kinetic change points in piece-wise linear single molecule motion
NASA Astrophysics Data System (ADS)
Hill, Flynn R.; van Oijen, Antoine M.; Duderstadt, Karl E.
2018-03-01
Single-molecule approaches present a powerful way to obtain detailed kinetic information at the molecular level. However, the identification of small rate changes is often hindered by the considerable noise present in such single-molecule kinetic data. We present a general method to detect such kinetic change points in trajectories of motion of processive single molecules having Gaussian noise, with a minimum number of parameters and without the need of an assumed kinetic model beyond piece-wise linearity of motion. Kinetic change points are detected using a likelihood ratio test in which the probability of no change is compared to the probability of a change occurring, given the experimental noise. A predetermined confidence interval minimizes the occurrence of false detections. Applying the method recursively to all sub-regions of a single molecule trajectory ensures that all kinetic change points are located. The algorithm presented allows rigorous and quantitative determination of kinetic change points in noisy single molecule observations without the need for filtering or binning, which reduce temporal resolution and obscure dynamics. The statistical framework for the approach and implementation details are discussed. The detection power of the algorithm is assessed using simulations with both single kinetic changes and multiple kinetic changes that typically arise in observations of single-molecule DNA-replication reactions. Implementations of the algorithm are provided in ImageJ plugin format written in Java and in the Julia language for numeric computing, with accompanying Jupyter Notebooks to allow reproduction of the analysis presented here.
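The likelihood-ratio test for a single candidate change point can be sketched as follows: compare the residual sum of squares of one straight-line fit against the best two-line fit, under the Gaussian-noise assumption. This is a simplified illustration on simulated data, not the paper's recursive implementation, and the confidence-interval threshold on the ratio is omitted:

```python
import numpy as np

def sse_line(t, x):
    # residual sum of squares of a least-squares straight-line fit
    coef = np.polyfit(t, x, 1)
    r = x - np.polyval(coef, t)
    return float(r @ r)

def change_point_llr(t, x, min_seg=5):
    """Best single change point in a piecewise-linear trajectory with Gaussian
    noise: compare one-line vs. two-line fits via a log-likelihood ratio."""
    n = len(x)
    sse1 = sse_line(t, x)
    best_k, best_llr = None, -np.inf
    for k in range(min_seg, n - min_seg):
        sse2 = sse_line(t[:k], x[:k]) + sse_line(t[k:], x[k:])
        llr = 0.5 * n * np.log(sse1 / sse2)
        if llr > best_llr:
            best_llr, best_k = llr, k
    return best_k, best_llr

# simulated trajectory: rate changes from 1.0 to 3.0 at t = 60, noise sd = 2
rng = np.random.default_rng(1)
t = np.arange(100.0)
x = np.where(t < 60, t, 60 + 3.0 * (t - 60)) + rng.normal(0.0, 2.0, 100)
k, llr = change_point_llr(t, x)
```

Applying such a test recursively to the sub-trajectories on either side of each accepted change point, as the paper describes, locates multiple kinetic changes.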
A Unimodal Model for Double Observer Distance Sampling Surveys.
Becker, Earl F; Christ, Aaron M
2015-01-01
Distance sampling is a widely used method to estimate animal population size. Most distance sampling models utilize a monotonically decreasing detection function such as a half-normal. Recent advances in distance sampling modeling allow for the incorporation of covariates into the distance model, and the elimination of the assumption of perfect detection at some fixed distance (usually the transect line) with the use of double-observer models. The assumption of full observer independence in the double-observer model is problematic, but can be addressed by using the point independence assumption, which assumes there is one distance, the apex of the detection function, at which the two observers are independent. Aerially collected distance sampling data can have a unimodal shape and have been successfully modeled with a gamma detection function. Covariates in gamma detection models cause the apex of detection to shift depending upon covariate levels, making this model incompatible with the point independence assumption when using double-observer data. This paper reports a unimodal detection model based on a two-piece normal distribution that allows covariates, has only one apex, and is consistent with the point independence assumption when double-observer data are utilized. An aerial line-transect survey of black bears in Alaska illustrates how this method can be applied.
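The key property of a two-piece normal detection function, namely that covariates may change the spread on either side of the apex without moving the apex itself, can be illustrated with a minimal sketch; the mode and scale values below are arbitrary:

```python
import math

def two_piece_normal(x, mode, s_left, s_right):
    """Unimodal detection curve with a single apex at `mode` and different
    spreads on each side of it."""
    s = s_left if x < mode else s_right
    return math.exp(-0.5 * ((x - mode) / s) ** 2)

xs = [i / 10 for i in range(301)]                             # distances 0-30
narrow = [two_piece_normal(x, 10.0, 4.0, 8.0) for x in xs]
wide = [two_piece_normal(x, 10.0, 6.0, 12.0) for x in xs]     # covariate widens both sides
apex_narrow = xs[narrow.index(max(narrow))]
apex_wide = xs[wide.index(max(wide))]
```

Both curves peak at the same distance, so a point-independence assumption anchored at the apex remains coherent across covariate levels, unlike a gamma detection function whose apex shifts with its parameters.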
For the past nine years the Ecological Exposure Research Division (EERD) has been developing methods for the assessment of EDCs and other contaminants of emerging concern (CECs). These methods include genomic techniques for detecting the presence and potential exposure to human p...
A double-observer approach for estimating detection probability and abundance from point counts
Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.
2000-01-01
Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.
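The arithmetic behind double-observer detection estimates can be sketched with the classical independent-observer (Lincoln-Petersen) special case; the paper's dependent primary/secondary protocol and the models fit in program SURVIV are more elaborate, and the counts below are invented:

```python
def double_observer_estimates(n_a, n_b, n_both):
    """Simplified independent double-observer estimates:
    n_a birds detected by observer A, n_b by observer B, n_both by both."""
    p_a = n_both / n_b                       # A's detection prob., from B's birds
    p_b = n_both / n_a                       # B's detection prob., from A's birds
    p_any = 1.0 - (1.0 - p_a) * (1.0 - p_b)  # detected by at least one observer
    n_hat = n_a * n_b / n_both               # Lincoln-Petersen abundance estimate
    return p_a, p_b, p_any, n_hat

# made-up trial: A detects 40 birds, B detects 35, 30 are detected by both
p_a, p_b, p_any, n_hat = double_observer_estimates(40, 35, 30)
```

Even with individual detection probabilities well below 1, the combined probability `p_any` exceeds 0.95, matching the abstract's observation that two observers together miss very few birds.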
Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.
Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan
2018-06-05
Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene understanding algorithms have been based on whole point clouds, which has led to complicated operations, high computational loads and low processing speed. This paper presents novel methods to efficiently extract the location of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted opening. Moreover, a space subdivision method based on the extracted openings and the scanning system trajectory is described. Finally, the opening detection and space subdivision results are saved as point cloud labels which will be used for further investigations. The method has been tested on a real dataset collected by ZEB-REVO. The experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.
NASA Astrophysics Data System (ADS)
Li, Yanran; Chen, Duo; Zhang, Jiwei; Chen, Ning; Li, Xiaoqi; Gong, Xiaojing
2017-09-01
GIS (gas-insulated switchgear) is an important piece of equipment in power systems. Partial discharge (PD) detection plays an important role in assessing the insulation performance of GIS, and the UHF and ultrasonic methods are frequently used for this purpose, so both merit investigation. However, very few studies have examined combining the two. From the viewpoint of safety, a new PD detection method for GIS based on the combined UHF and ultrasonic methods is proposed in order to greatly enhance the anti-interference capability of signal detection and the accuracy of fault localization. This paper presents a study aimed at clarifying the effectiveness of this combined method. Partial discharge tests were performed in a laboratory-simulated environment. The results demonstrate the improved anti-interference capability of signal detection and the improved accuracy of fault localization achieved by the combined UHF-ultrasonic method.
Improving signal-to-noise in the direct imaging of exoplanets and circumstellar disks with MLOCI
NASA Astrophysics Data System (ADS)
Wahhaj, Zahed; Cieza, Lucas A.; Mawet, Dimitri; Yang, Bin; Canovas, Hector; de Boer, Jozua; Casassus, Simon; Ménard, François; Schreiber, Matthias R.; Liu, Michael C.; Biller, Beth A.; Nielsen, Eric L.; Hayward, Thomas L.
2015-09-01
We present a new algorithm designed to improve the signal-to-noise ratio (S/N) of point and extended source detections around bright stars in direct imaging data. One of our innovations is that we insert simulated point sources into the science images, which we then try to recover with maximum S/N. This improves the S/N of real point sources elsewhere in the field. The algorithm, based on the locally optimized combination of images (LOCI) method, is called Matched LOCI or MLOCI. We show with Gemini Planet Imager (GPI) data on HD 135344 B and Near-Infrared Coronagraphic Imager (NICI) data on several stars that the new algorithm can improve the S/N of point source detections by 30-400% over past methods. We also find no increase in false detection rates. No prior knowledge of candidate companion locations is required to use MLOCI. On the other hand, while non-blind applications may yield linear combinations of science images that seem to increase the S/N of true sources by a factor >2, they can also yield false detections at high rates. This is a potential pitfall when trying to confirm marginal detections or to redetect point sources found in previous epochs. These findings are relevant to any method where the coefficients of the linear combination are considered tunable, e.g., LOCI and principal component analysis (PCA). Thus we recommend that false detection rates be analyzed when using these techniques. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the National Science Foundation (USA), the Science and Technology Facilities Council (UK), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência e Tecnologia (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).
Detection method of flexion relaxation phenomenon based on wavelets for patients with low back pain
NASA Astrophysics Data System (ADS)
Nougarou, François; Massicotte, Daniel; Descarreaux, Martin
2012-12-01
The flexion relaxation phenomenon (FRP) can be defined as a reduction or silence of myoelectric activity of the lumbar erector spinae muscle during full trunk flexion. It is typically absent in patients with chronic low back pain (LBP). Before any broad clinical utilization of this neuromuscular response can be made, effective, standardized, and accurate methods of identifying FRP limits are needed. However, this phenomenon is clearly more difficult to detect in LBP patients than in healthy subjects. The main goal of this study is to develop an automated method based on the wavelet transform to improve the detection of the time-point limits of the FRP in surface electromyography signals of LBP patients. Conventional visual identification and the proposed automated methods for detecting the time-point limits of the relaxation phase were compared on experimental data using criteria of accuracy and repeatability based on physiological properties. The evaluation demonstrates that the use of the wavelet transform (WT) yields better results than methods without wavelet decomposition. Furthermore, methods based on the wavelet packet transform are more effective than algorithms employing the discrete WT. Compared to visual detection, in addition to an obvious saving of time, the use of the wavelet packet transform improves the accuracy and repeatability of the detection of the FRP limits. These results clearly highlight the value of the proposed technique in identifying the onset and offset of the flexion relaxation response in LBP subjects.
Research on Horizontal Accuracy Method of High Spatial Resolution Remotely Sensed Orthophoto Image
NASA Astrophysics Data System (ADS)
Xu, Y. M.; Zhang, J. X.; Yu, F.; Dong, S.
2018-04-01
At present, in the inspection and acceptance of high-spatial-resolution remotely sensed orthophoto images, horizontal accuracy is mostly tested and evaluated using a set of testing points with uniform accuracy and reliability. However, such a set of points is difficult to obtain in areas where field measurement is difficult and high-accuracy reference data are scarce, so it is hard to test and evaluate the horizontal accuracy of the orthophoto image there. This uncertainty in horizontal accuracy has become a bottleneck for the application of satellite-borne high-resolution remote sensing imagery and for expanding its scope of service. This paper therefore proposes a new method for testing the horizontal accuracy of orthophoto images that uses testing points of differing accuracy and reliability, sourced from both high-accuracy reference data and field measurement. The new method solves the problem of horizontal accuracy detection of orthophoto images in difficult areas and provides a basis for delivering reliable orthophoto images to users.
Fault Detection and Diagnosis of Railway Point Machines by Sound Analysis
Lee, Jonguk; Choi, Heesu; Park, Daihee; Chung, Yongwha; Kim, Hee-Young; Yoon, Sukhan
2016-01-01
Railway point devices act as actuators that provide different routes to trains by driving switchblades from the current position to the opposite one. Point failure can significantly affect railway operations, with potentially disastrous consequences. Therefore, early detection of anomalies is critical for monitoring and managing the condition of rail infrastructure. We present a data mining solution that utilizes audio data to efficiently detect and diagnose faults in railway condition monitoring systems. The system enables extracting mel-frequency cepstrum coefficients (MFCCs) from audio data with reduced feature dimensions using attribute subset selection, and employs support vector machines (SVMs) for early detection and classification of anomalies. Experimental results show that the system enables cost-effective detection and diagnosis of faults using a cheap microphone, with accuracy exceeding 94.1% whether used alone or in combination with other known methods. PMID:27092509
Anomaly detection in forward looking infrared imaging using one-class classifiers
NASA Astrophysics Data System (ADS)
Popescu, Mihail; Stone, Kevin; Havens, Timothy; Ho, Dominic; Keller, James
2010-04-01
In this paper we describe a method for generating cues of possible abnormal objects present in the field of view of an infrared (IR) camera installed on a moving vehicle. The proposed method has two steps. In the first step, for each frame, we generate a set of possible points of interest using a corner detection algorithm. In the second step, the points related to the background are discarded from the point set using a one-class classifier (OCC) trained on features extracted from a local neighborhood of each point. The advantage of using an OCC is that we do not need examples from the "abnormal object" class to train the classifier. Instead, the OCC is trained using corner points from images known to be free of abnormal objects, i.e., images that contain only background scenes. To further reduce the number of false alarms we use a temporal fusion procedure: a region has to be detected as "interesting" in m out of n frames.
Point cloud registration from local feature correspondences-Evaluation on challenging datasets.
Petricek, Tomas; Svoboda, Tomas
2017-01-01
Registration of laser scans, or point clouds in general, is a crucial step of localization and mapping with mobile robots or in object modeling pipelines. A coarse alignment of the point clouds is generally needed before applying local methods such as the Iterative Closest Point (ICP) algorithm. We propose a feature-based approach to point cloud registration and evaluate the proposed method and its individual components on challenging real-world datasets. For a moderate overlap between the laser scans, the method provides a superior registration accuracy compared to state-of-the-art methods including Generalized ICP, 3D Normal-Distribution Transform, Fast Point-Feature Histograms, and 4-Points Congruent Sets. Compared to the surface normals, the points as the underlying features yield higher performance in both keypoint detection and establishing local reference frames. Moreover, sign disambiguation of the basis vectors proves to be an important aspect in creating repeatable local reference frames. A novel method for sign disambiguation is proposed which yields highly repeatable reference frames.
Effect of distance-related heterogeneity on population size estimates from point counts
Efford, Murray G.; Dawson, Deanna K.
2009-01-01
Point counts are used widely to index bird populations. Variation in the proportion of birds counted is a known source of error, and for robust inference it has been advocated that counts be converted to estimates of absolute population size. We used simulation to assess nine methods for the conduct and analysis of point counts when the data included distance-related heterogeneity of individual detection probability. Distance from the observer is a ubiquitous source of heterogeneity, because nearby birds are more easily detected than distant ones. Several recent methods (dependent double-observer, time of first detection, time of detection, independent multiple-observer, and repeated counts) do not account for distance-related heterogeneity, at least in their simpler forms. We assessed bias in estimates of population size by simulating counts with fixed radius w over four time intervals (occasions). Detection probability per occasion was modeled as a half-normal function of distance with scale parameter sigma and intercept g(0) = 1.0. Bias varied with sigma/w; for values of sigma inferred from published studies, bias was often 50% for a 100-m fixed-radius count. More critically, the bias of adjusted counts sometimes varied more than that of unadjusted counts, and inference from adjusted counts would be less robust. The problem was not solved by using mixture models or including distance as a covariate. Conventional distance sampling performed well in simulations, but its assumptions are difficult to meet in the field. We conclude that no existing method allows effective estimation of population size from point counts.
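The scale of the distance-related undercount is easy to verify under the half-normal model described above (g(0) = 1, scale sigma, fixed radius w); the particular values sigma = 50 m and w = 100 m below are illustrative only. For birds placed uniformly over the disc, the expected counted fraction has a closed form, checked here against Monte Carlo:

```python
import math, random

def expected_count_fraction(sigma, w):
    # closed-form mean of the half-normal detection function
    # g(r) = exp(-r^2 / (2 sigma^2)) over a uniform disc of radius w
    return 2.0 * sigma ** 2 / w ** 2 * (1.0 - math.exp(-w ** 2 / (2.0 * sigma ** 2)))

def simulate_fraction(sigma, w, n=200_000, seed=42):
    rng = random.Random(seed)
    detected = 0
    for _ in range(n):
        r = w * math.sqrt(rng.random())          # uniform position in the disc
        if rng.random() < math.exp(-r * r / (2.0 * sigma * sigma)):
            detected += 1
    return detected / n

frac = expected_count_fraction(50.0, 100.0)      # sigma = 50 m, w = 100 m
sim = simulate_fraction(50.0, 100.0)
```

With sigma/w = 0.5 the expected fraction counted per occasion is about 0.43, i.e., a single unadjusted count misses well over half the birds present.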
LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings
NASA Astrophysics Data System (ADS)
Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan
2018-01-01
This paper applies Shepard's method to the original LIDAR point cloud data to generate regular-grid DSM data, filters the ground and non-ground point clouds with a double least-squares method, and obtains a regularized DSM. The regularized DSM is then segmented using a region-growing method and the non-building point clouds are removed, yielding the building point cloud information. The Canny operator is used to extract the building edges from the segmented image, and Hough-transform line detection is used to regularize the extracted building edges so that they are smooth and uniform. Finally, the E3De3 software is used to establish the 3D models of the buildings.
A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera.
Ci, Wenyan; Huang, Yingping
2016-10-17
Visual odometry estimates the ego-motion of an agent (e.g., a vehicle or robot) using image information and is a key component for autonomous vehicles and robotics. This paper proposes a robust and precise method for estimating the 6-DoF ego-motion using a stereo rig with optical flow analysis. An objective function fitted with a set of feature points is created by establishing the mathematical relationship between optical flow, depth, and camera ego-motion parameters through the camera's 3-dimensional motion and planar imaging model. Accordingly, the six motion parameters are computed by minimizing the objective function using the iterative Levenberg-Marquardt method. A key point for visual odometry is that the feature points selected for the computation should contain as many inliers as possible. In this work, the feature points and their optical flows are initially detected using the Kanade-Lucas-Tomasi (KLT) algorithm. Circle matching is then applied to remove the outliers caused by mismatches of the KLT algorithm. A space position constraint is imposed to filter out moving points from the point set detected by the KLT algorithm. The Random Sample Consensus (RANSAC) algorithm is employed to further refine the feature point set, i.e., to eliminate the effects of outliers. The remaining points are tracked to estimate the ego-motion parameters in the subsequent frames. The approach presented here is tested on real traffic videos and the results prove the robustness and precision of the method.
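The RANSAC refinement step can be sketched in isolation. The toy below fits a dominant 2-D translation to flow vectors, which is a deliberate simplification of the paper's full 6-DoF model; the flow values are invented, with two planted outliers standing in for a moving object and a mismatch:

```python
import random

def ransac_translation(flows, n_iter=200, tol=1.0, seed=0):
    """Estimate the dominant 2-D flow under a pure-translation assumption and
    return the inlier flow vectors; outliers correspond to mismatches or
    independently moving objects."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        cx, cy = rng.choice(flows)                 # minimal sample: one vector
        inliers = [(x, y) for x, y in flows
                   if (x - cx) ** 2 + (y - cy) ** 2 <= tol ** 2]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    mx = sum(x for x, _ in best_inliers) / len(best_inliers)
    my = sum(y for _, y in best_inliers) / len(best_inliers)
    return (mx, my), best_inliers

# 40 consistent background flows plus two outliers
flows = [(2.0 + 0.1 * (i % 5), 1.0) for i in range(40)] + [(8.0, -5.0), (-6.0, 3.0)]
(mx, my), inliers = ransac_translation(flows)
```

In the actual pipeline the surviving inliers, rather than the raw KLT matches, would feed the Levenberg-Marquardt minimization of the objective function.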
NASA Astrophysics Data System (ADS)
Xiong, J. P.; Zhang, A. L.; Ji, K. F.; Feng, S.; Deng, H.; Yang, Y. F.
2016-01-01
Photospheric bright points (PBPs) are tiny, short-lived phenomena that can be seen within dark intergranular lanes. In this paper, we develop a new method to identify and track PBPs in a three-dimensional data cube. Unlike previous approaches such as Detection-Before-Tracking, this method is based on Tracking-While-Detection. Using this method, the whole lifetime of a PBP can be accurately measured even when, owing to its occasionally weak intensity, the PBP is split into several detections by the Laplacian and morphological dilation (LMD) method. Applying the method to more than two hours of G-band PBP observations from Hinode/SOT (Solar Optical Telescope), we find that isolated PBPs have an average lifetime of 3 minutes, with the longest reaching 27 minutes; both values are greater than those obtained with the previous LMD method. Furthermore, we find that the mean intensity of PBPs is 1.02 times the mean photospheric intensity, which is less than the value obtained with the LMD method, and that the intensity of a PBP oscillates with a period of 2-3 minutes over its whole lifetime.
Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.
Pang, Xufang; Song, Zhan; Xie, Wuyuan
2013-01-01
3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More important, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
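The local paraboloid-fitting step can be sketched as follows, with a toy neighborhood sampled from a saddle surface z = x² - y² plus small noise; in the actual pipeline, valley and ridge candidates would be flagged from the signs and magnitudes of the recovered principal curvatures, and the moving least-squares weighting is omitted here:

```python
import numpy as np

def fit_paraboloid(pts):
    # least-squares fit of z = a x^2 + b xy + c y^2 + d x + e y + f
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

def principal_curvatures(coef):
    # at a (near-)critical point of the fitted height function, the principal
    # curvatures are the eigenvalues of its Hessian
    a, b, c = coef[0], coef[1], coef[2]
    hess = np.array([[2.0 * a, b], [b, 2.0 * c]])
    return np.sort(np.linalg.eigvalsh(hess))

rng = np.random.default_rng(3)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
z = xy[:, 0] ** 2 - xy[:, 1] ** 2 + rng.normal(0.0, 0.01, 200)
k_min, k_max = principal_curvatures(fit_paraboloid(np.column_stack([xy, z])))
```

For the saddle, the fit recovers one strongly positive and one strongly negative curvature; a ridge point would instead show one strongly negative curvature with the other near zero.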
Probabilistic peak detection in CE-LIF for STR DNA typing.
Woldegebriel, Michael; van Asten, Arian; Kloosterman, Ate; Vivó-Truyols, Gabriel
2017-07-01
In this work, we present a novel probabilistic peak detection algorithm based on a Bayesian framework for forensic DNA analysis. The proposed method aims at an exhaustive use of the raw electropherogram data from a laser-induced fluorescence multi-CE system. Because the raw data are informative down to the level of a single data point, the conventional threshold-based approaches discard relevant forensic information early in the data analysis pipeline. Our proposed method assigns each data point a posterior probability reflecting its relevance with respect to the peak detection criteria. Peaks of low intensity generated by a truly existing allele can thus contribute evidential value instead of being fully discarded and treated as a potential allele drop-out. This way of working utilizes the information available within each individual data point and avoids making early (binary) decisions in the data analysis that can lead to error propagation. The proposed method was tested and compared to the application of a set threshold, as is current practice in forensic STR DNA profiling. The new method was found to yield a significant improvement in the number of alleles identified, regardless of the peak heights and deviation from Gaussian shape. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
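The contrast with hard thresholding can be sketched with a minimal two-hypothesis posterior per data point. This is a schematic stand-in, not the paper's model: intensities under "baseline" and "peak" are taken as Gaussian with made-up parameters, and the prior peak probability is arbitrary:

```python
import math

def point_posterior(y, noise_mean=0.0, sigma=1.0, peak_height=5.0, prior_peak=0.01):
    """Posterior probability that a single data point belongs to a peak rather
    than baseline noise (shared sigma, so normalization constants cancel)."""
    def gauss(x, mu):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2)
    num = prior_peak * gauss(y, peak_height)
    return num / (num + (1.0 - prior_peak) * gauss(y, noise_mean))

low = point_posterior(0.5)    # baseline-level intensity: negligible evidence
mid = point_posterior(3.0)    # weak intensity: retains some evidential value
high = point_posterior(5.0)   # clear peak-level intensity
```

A fixed threshold would map `mid` to zero evidence; the posterior instead carries its intermediate weight forward, which is the behavior the abstract argues for.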
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, E.; Alleman, T. L.; McCormick, R. L.
Total acid value titration has long been used to estimate corrosive potential of petroleum crude oil and fuel oil products. The method commonly used for this measurement, ASTM D664, utilizes KOH in isopropanol as the titrant with potentiometric end point determination by pH sensing electrode and Ag/AgCl reference electrode with LiCl electrolyte. A natural application of the D664 method is titration of pyrolysis-derived bio-oil, which is a candidate for refinery upgrading to produce drop-in fuels. Determining the total acid value of pyrolysis-derived bio-oil has proven challenging and not necessarily amenable to the methodology employed for petroleum products due to the different nature of the acids present. We presented an acid value titration for bio-oil products in our previous publication, which also utilizes potentiometry, using tetrabutylammonium hydroxide in place of KOH as the titrant and tetraethylammonium bromide in place of LiCl as the reference electrolyte to improve the detection of these types of acids. This method was shown to detect numerous end points in samples of bio-oil that were not detected by D664. These end points were attributed to carboxylic acids and phenolics based on the results of HPLC and GC-MS studies. Additional work has led to refinement of the method, and it has been established that both carboxylic acids and phenolics can be determined accurately. Use of pH buffer calibration to determine half-neutralization potentials of acids, in conjunction with the analysis of model compounds, has allowed us to conclude that this titration method is suitable for the determination of total acid value of pyrolysis oil and can be used to differentiate and quantify weak acid species. The measurement of phenolics in bio-oil is subject to a relatively high limit of detection, which may limit the utility of titrimetric methodology for characterizing the acidic potential of pyrolysis oil and products.
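Potentiometric end point detection of the kind described above can be illustrated by locating local maxima of the first derivative of the titration curve; the function name, units, and slope threshold below are hypothetical:

```python
def find_end_points(volume, potential, min_slope=50.0):
    """Flag titration end points as local maxima of the first derivative
    dE/dV that exceed min_slope (threshold and units are illustrative)."""
    # central-difference slopes at interior points
    slopes = [(potential[i + 1] - potential[i - 1]) / (volume[i + 1] - volume[i - 1])
              for i in range(1, len(volume) - 1)]
    ends = []
    for k in range(1, len(slopes) - 1):
        if (slopes[k] >= min_slope
                and slopes[k] >= slopes[k - 1]
                and slopes[k] >= slopes[k + 1]):
            ends.append(volume[k + 1])   # slopes[k] is the derivative at index k+1
    return ends
```

Multiple inflections in the curve (e.g. carboxylic acids then phenolics) would show up as multiple returned titrant volumes.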
3D change detection in staggered voxels model for robotic sensing and navigation
NASA Astrophysics Data System (ADS)
Liu, Ruixu; Hampshire, Brandon; Asari, Vijayan K.
2016-05-01
3D scene change detection is a challenging problem in robotic sensing and navigation, with several unpredictable aspects. A change detection method that can support various applications under varying environmental conditions is proposed. Point cloud models are acquired from an RGB-D sensor, which provides the required color and depth information, and change detection is performed on the robot-view point cloud model. A bilateral filter smooths the surface and fills holes in the depth image while preserving edge details. Registration of the point cloud model is implemented using the Random Sample Consensus (RANSAC) algorithm, with surface normals used in a preceding stage to estimate the ground and walls. After preprocessing the data, we create a point voxel model that labels each voxel as surface or free space, and a color model that assigns each occupied voxel the mean color of all points it contains. A preliminary change map is obtained by an XOR subtraction on the point voxel model. Next, the eight neighbors of each center voxel are examined; if they are neither all `changed' voxels nor all `unchanged' voxels, a histogram of location and hue-channel color is estimated. The experimental evaluations performed to assess the capability of our algorithm show promising results, indicating all changing objects with a very limited false alarm rate.
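The XOR subtraction on the voxel model can be sketched with occupancy sets; voxel size and function names are illustrative:

```python
def voxelize(points, size=0.1):
    """Map 3D points to the set of occupied voxel indices (sketch)."""
    return {(int(x // size), int(y // size), int(z // size)) for x, y, z in points}

def changed_voxels(reference, current, size=0.1):
    """Symmetric difference (XOR) of the two occupancy sets marks voxels
    occupied in exactly one scan, i.e. candidate changes."""
    return voxelize(reference, size) ^ voxelize(current, size)
```

The neighbourhood vote described in the abstract would then run over this candidate set to suppress isolated false alarms.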
The Detection of Transport Land-Use Data Using Crowdsourcing Taxi Trajectory
NASA Astrophysics Data System (ADS)
Ai, T.; Yang, W.
2016-06-01
This study explores the question of transport land-use change detection from a large volume of vehicle trajectory data, presenting a method based on Delaunay triangulation. The method comprises three steps. The first is to pre-process the vehicle trajectory data, including removal of anomalous points and conversion of trajectory points to track lines. Second, a Delaunay triangulation is constructed within the vehicle trajectory lines to detect neighborhood relations. Considering that some of the trajectory segments are too long, we use an interpolation measure to add more points for an improved triangulation. Third, the transport road is extracted by cutting short triangle edges and organizing the polygon topology. We have conducted an experiment on transport land-use change discovery using taxi track data in Beijing City. We extract not only the transport land-use area but also semantic information such as the transformation speed, the traffic jam distribution, the main vehicle movement direction and others. Compared with existing transport network data, such as OpenStreetMap, our method is shown to be quick and accurate.
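One plausible reading of the edge-cutting step is pruning triangulation edges above a length threshold, so that only the densely sampled transport areas remain connected. The sketch below assumes a precomputed triangulation and an illustrative threshold:

```python
def prune_long_edges(points, triangles, max_len=5.0):
    """Keep only triangle edges shorter than max_len (sketch).
    `triangles` is a list of (i, j, k) vertex-index triples."""
    def dist(a, b):
        (x1, y1), (x2, y2) = points[a], points[b]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    edges = set()
    for i, j, k in triangles:
        for a, b in ((i, j), (j, k), (i, k)):
            if dist(a, b) <= max_len:
                edges.add((min(a, b), max(a, b)))
    return edges
```

In practice the triangulation itself would come from a library routine such as `scipy.spatial.Delaunay`; the polygon topology is then rebuilt from the surviving edges.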
Traffic sign detection in MLS acquired point clouds for geometric and image-based semantic inventory
NASA Astrophysics Data System (ADS)
Soilán, Mario; Riveiro, Belén; Martínez-Sánchez, Joaquín; Arias, Pedro
2016-04-01
Nowadays, mobile laser scanning has become a valid technology for infrastructure inspection. This technology permits collecting accurate 3D point clouds of urban and road environments, and the geometric and semantic analysis of these data has become an active research topic in recent years. This paper focuses on the detection of vertical traffic signs in 3D point clouds acquired by a LYNX Mobile Mapper system, comprised of laser scanning and RGB cameras. Each traffic sign is automatically detected in the LiDAR point cloud, and its main geometric parameters can be automatically extracted, therefore aiding the inventory process. Furthermore, the 3D positions of traffic signs are reprojected onto the 2D images, which are spatially and temporally synced with the point cloud. Image analysis allows for recognizing the traffic sign semantics using machine learning approaches. The presented method was tested in road and urban scenarios in Galicia (Spain). The recall results for traffic sign detection are close to 98%, and existing false positives can be easily filtered after point cloud projection. Finally, the lack of a large, publicly available Spanish traffic sign database is pointed out.
A Method of Time-Series Change Detection Using Full PolSAR Images from Different Sensors
NASA Astrophysics Data System (ADS)
Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.
2018-04-01
Most existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. First, the overall difference image of a time-series PolSAR stack was calculated by an omnibus statistic test. Second, difference images between any two images at different times were acquired by the Rj statistic test. A generalized Gaussian mixture model (GGMM) was then used to obtain time-series change detection maps in the last step of the proposed method. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect time-series change from different sensors.
Chest wall segmentation in automated 3D breast ultrasound scans.
Tan, Tao; Platel, Bram; Mann, Ritse M; Huisman, Henkjan; Karssemeijer, Nico
2013-12-01
In this paper, we present an automatic method to segment the chest wall in automated 3D breast ultrasound images. Determining the location of the chest wall in automated 3D breast ultrasound images is necessary in computer-aided detection systems to remove automatically detected cancer candidates beyond the chest wall and it can be of great help for inter- and intra-modal image registration. We show that the visible part of the chest wall in an automated 3D breast ultrasound image can be accurately modeled by a cylinder. We fit the surface of our cylinder model to a set of automatically detected rib-surface points. The detection of the rib-surface points is done by a classifier using features representing local image intensity patterns and presence of rib shadows. Due to attenuation of the ultrasound signal, a clear shadow is visible behind the ribs. Evaluation of our segmentation method is done by computing the distance of manually annotated rib points to the surface of the automatically detected chest wall. We examined the performance on images obtained with the two most common 3D breast ultrasound devices in the market. In a dataset of 142 images, the average mean distance of the annotated points to the segmented chest wall was 5.59 ± 3.08 mm. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Y. H.; Shinohara, T.; Satoh, T.; Tachibana, K.
2016-06-01
High-definition and highly accurate road maps are necessary for the realization of automated driving, and road signs are among the most important elements in the road map. Therefore, a technique is necessary which can acquire information about all kinds of road signs automatically and efficiently. Due to the continuous technical advancement of Mobile Mapping Systems (MMS), it has become possible to acquire large numbers of images and 3D point clouds efficiently, with highly precise position information. In this paper, we present an automatic road sign detection and recognition approach utilizing both images and 3D point clouds acquired by MMS. The proposed approach consists of three stages: 1) detection of road signs from images based on their color and shape features using an object-based image analysis method, 2) filtering out of over-detected candidates utilizing size and position information estimated from the 3D point cloud, the candidate regions and camera information, and 3) road sign recognition using a template matching method after shape normalization. The effectiveness of the proposed approach was evaluated on a test dataset acquired from more than 180 km of different types of roads in Japan. The results show very high success in detection and recognition of road signs, even under challenging conditions such as discoloration, deformation and partial occlusion.
NASA Technical Reports Server (NTRS)
McGill, Matthew J. (Inventor); Scott, Vibart S. (Inventor); Marzouk, Marzouk (Inventor)
2001-01-01
A holographic optical element transforms a spectral distribution of light to image points. The element comprises areas, each of which acts as a separate lens to image the light incident in its area to an image point. Each area contains the recorded hologram of a point source object. The image points can be made to lie in a line in the same focal plane so as to align with a linear array detector. A version of the element has been developed that has concentric equal areas to match the circular fringe pattern of a Fabry-Perot interferometer. The element has high transmission efficiency, and when coupled with high quantum efficiency solid state detectors, provides an efficient photon-collecting detection system. The element may be used as part of the detection system in a direct detection Doppler lidar system or multiple field of view lidar system.
3D Scanning of Live Pigs: System and its Application in Body Measurements
NASA Astrophysics Data System (ADS)
Guo, H.; Wang, K.; Su, W.; Zhu, D. H.; Liu, W. L.; Xing, Ch.; Chen, Z. R.
2017-09-01
The shape of a live pig is an important indicator of its health and value, whether for breeding or for carcass quality. This paper implements a prototype system for 3D surface scanning of a single live pig based on two consumer depth cameras, utilizing 3D point cloud data. The cameras are calibrated in advance to share a common coordinate system. The live 3D point cloud stream of a moving pig is obtained by two Xtion Pro Live sensors from different viewpoints simultaneously. A novel detection method is proposed and applied to automatically detect the frames containing pigs with the correct posture from the point cloud stream, according to the geometric characteristics of the pig's shape. The proposed method is incorporated in a hybrid scheme that serves as the preprocessing step in a body measurements framework for pigs. Experimental results show the portability of our scanning system and the effectiveness of our detection method. Furthermore, the point cloud preprocessing software for livestock body measurements can be downloaded freely by the livestock industry and research community from https://github.com/LiveStockShapeAnalysis and can be used for monitoring livestock growth status.
Face pose tracking using the four-point algorithm
NASA Astrophysics Data System (ADS)
Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen
2017-06-01
In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.
Quantification of Forecasting and Change-Point Detection Methods for Predictive Maintenance
2015-08-19
This report evaluates forecasting and change-point detection methods for predictive maintenance, used across industries to manage the service life of equipment and to detect precursors to the failure of components found in nuclear power plants and wind turbines. The methods studied are sensitive to changes related to abnormality. (Contract FA2386-14-1-4096; Grant 14IOA015, AOARD-144096.) Subject terms: predictive maintenance, forecasting.
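A minimal change-point detector in the family such a report would survey is the one-sided CUSUM; the target, drift and threshold parameters below are illustrative:

```python
def cusum(series, target, drift=0.0, threshold=5.0):
    """One-sided CUSUM change-point detector (sketch).

    Accumulates positive deviations from `target` (minus an allowance
    `drift`); returns the first index where the cumulative sum exceeds
    `threshold`, or None if no change is detected.
    """
    s = 0.0
    for i, x in enumerate(series):
        s = max(0.0, s + (x - target - drift))
        if s > threshold:
            return i
    return None
```

The drift term sets how small a sustained shift is ignored, trading detection delay against false alarms — the kind of sensitivity trade-off quantified in predictive-maintenance studies.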
Multigrid methods for bifurcation problems: The self adjoint case
NASA Technical Reports Server (NTRS)
Taasan, Shlomo
1987-01-01
This paper deals with multigrid methods for computational problems that arise in the theory of bifurcation and is restricted to the self adjoint case. The basic problem is to solve for arcs of solutions, a task that is done successfully with an arc length continuation method. Other important issues are, for example, detecting and locating singular points as part of the continuation process, switching branches at bifurcation points, etc. Multigrid methods have been applied to continuation problems. These methods work well at regular points and at limit points, while they may encounter difficulties in the vicinity of bifurcation points. A new continuation method that is very efficient also near bifurcation points is presented here. The other issues mentioned above are also treated very efficiently with appropriate multigrid algorithms. For example, it is shown that limit points and bifurcation points can be solved for directly by a multigrid algorithm. Moreover, the algorithms presented here solve the corresponding problems in just a few work units (about 10 or less), where a work unit is the work involved in one local relaxation on the finest grid.
Assessment of circulating copy number variant detection for cancer screening.
Molparia, Bhuvan; Nichani, Eshaan; Torkamani, Ali
2017-01-01
Current high-sensitivity cancer screening methods, largely utilizing correlative biomarkers, suffer from false positive rates that lead to unnecessary medical procedures and debatable public health benefit overall. Detection of circulating tumor DNA (ctDNA), a causal biomarker, has the potential to revolutionize cancer screening. Thus far, the majority of ctDNA studies have focused on detection of tumor-specific point mutations after cancer diagnosis for the purpose of post-treatment surveillance. However, ctDNA point mutation detection methods developed to date likely lack either the scope or analytical sensitivity necessary to be useful for cancer screening, due to the low (<1%) ctDNA fraction derived from early stage tumors. On the other hand, tumor-derived copy number variant (CNV) detection is hypothetically a superior means of ctDNA-based cancer screening for many tumor types, given that, relative to point mutations, each individual tumor CNV contributes a much larger number of ctDNA fragments to the overall pool of circulating free DNA (cfDNA). A small number of studies have demonstrated the potential of ctDNA CNV-based screening in select cancer types. Here we perform an in silico assessment of the potential for ctDNA CNV-based cancer screening across many common cancers, and suggest ctDNA CNV detection shows promise as a broad cancer screening methodology.
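A toy version of read-depth-based CNV flagging might look as follows; the bin depths, baseline cohort and z-score cutoff are illustrative, not the authors' pipeline:

```python
import statistics

def cnv_bins(depths, baseline, z_cut=3.0):
    """Flag genomic bins whose normalized read depth deviates from a
    healthy-cohort baseline by more than z_cut standard deviations (sketch)."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return [i for i, d in enumerate(depths) if abs(d - mu) / sd > z_cut]
```

Because a single tumor CNV spans many cfDNA fragments, the depth signal in an affected bin can rise above noise even at low ctDNA fractions, which is the intuition the abstract relies on.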
Graphical methods for the sensitivity analysis in discriminant analysis
Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang
2015-09-30
Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow principles similar to the diagnostic measures used in linear regression, applied in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretable compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probabilities of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
Street curb recognition in 3d point cloud data using morphological operations
NASA Astrophysics Data System (ADS)
Rodríguez-Cuenca, Borja; Concepción Alonso-Rodríguez, María; García-Cortés, Silverio; Ordóñez, Celestino
2015-04-01
Accurate and automatic detection of cartographic entities saves a great deal of time and money when creating and updating cartographic databases. The current trend in remote sensing feature extraction is to develop methods that are as automatic as possible. The aim is to develop algorithms that can obtain accurate results with the least possible human intervention in the process. Non-manual curb detection is an important issue in the road maintenance, 3D urban modeling, and autonomous navigation fields. This paper is focused on the semi-automatic recognition of curbs and street boundaries using a 3D point cloud registered by a mobile laser scanner (MLS) system. This work is divided into four steps. First, a coordinate system transformation is carried out, moving from a global coordinate system to a local one. After that, and in order to simplify the calculations involved in the procedure, a rasterization based on the projection of the measured point cloud on the XY plane was carried out, passing from the original 3D data to a 2D image. To determine the location of curbs in the image, different image processing techniques such as thresholding and morphological operations were applied. Finally, the upper and lower edges of curbs are detected by an unsupervised classification algorithm on the curvature and roughness of the points that represent curbs. The proposed method is valid in both straight and curved road sections and applicable both to laser scanner and stereo vision 3D data due to the independence of its scanning geometry. This method has been successfully tested with two datasets measured by different sensors. The first dataset corresponds to a point cloud measured by a TOPCON sensor in the Spanish town of Cudillero. That point cloud comprises more than 6,000,000 points and covers a 400-meter street. The second dataset corresponds to a point cloud measured by a RIEGL sensor in the Austrian town of Horn.
That point cloud comprises 8,000,000 points and represents a 160-meter street. The proposed method provides success rates in curb recognition of over 85% in both datasets.
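The rasterization-plus-thresholding idea can be sketched as a height-jump test on a 2D grid; the cell size and jump threshold are illustrative, and the paper's morphological operations are not reproduced:

```python
def curb_cells(points, cell=0.5, jump=0.08):
    """Rasterize (x, y, z) points into a max-height grid on the XY plane,
    then flag cells whose height differs from a 4-neighbour by more than
    `jump` metres -- a curb-like step. Thresholds are illustrative."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        grid[key] = max(grid.get(key, z), z)
    flagged = set()
    for (i, j), h in grid.items():
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = grid.get((i + di, j + dj))
            if nb is not None and abs(h - nb) > jump:
                flagged.add((i, j))
                break
    return flagged
```

On a real MLS cloud one would then clean this mask with morphological opening/closing and classify the upper and lower curb edges, as the abstract describes.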
Evaluation of null-point detection methods on simulation data
NASA Astrophysics Data System (ADS)
Olshevsky, Vyacheslav; Fu, Huishan; Vaivads, Andris; Khotyaintsev, Yuri; Lapenta, Giovanni; Markidis, Stefano
2014-05-01
We model the measurements of artificial spacecraft that resemble the configuration of CLUSTER propagating in a particle-in-cell simulation of turbulent magnetic reconnection. The simulation domain contains multiple isolated X-type null-points, but the majority are O-type null-points. Simulations show that current pinches surrounded by twisted fields, analogous to laboratory pinches, are formed along the sequences of O-type nulls. In the simulation, the magnetic reconnection is mainly driven by the kinking of the pinches, at spatial scales of several ion inertial lengths. We compute the locations of magnetic null-points and detect their type. When the satellites are separated by fractions of an ion inertial length, as they are for CLUSTER, they are able to locate both the isolated null-points and the pinches. We apply the method to real CLUSTER data and speculate on how common pinches are in the magnetosphere, and whether they play a dominant role in the dissipation of magnetic energy.
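A necessary-condition test for locating null-points in gridded field data can be sketched as a per-cell sign-change check; this is a simplification of full trilinear null detection, and the data layout is an assumption:

```python
def candidate_null_cells(bx, by, bz):
    """Flag grid cells whose 8 corners show a sign change in every component
    of B -- a necessary condition for a magnetic null inside the cell.
    bx, by, bz are nested lists indexed [i][j][k] (sketch)."""
    ni, nj, nk = len(bx) - 1, len(bx[0]) - 1, len(bx[0][0]) - 1
    cells = []
    for i in range(ni):
        for j in range(nj):
            for k in range(nk):
                corners = [(bx[a][b][c], by[a][b][c], bz[a][b][c])
                           for a in (i, i + 1) for b in (j, j + 1) for c in (k, k + 1)]
                if all(min(v[d] for v in corners) <= 0.0 <= max(v[d] for v in corners)
                       for d in range(3)):
                    cells.append((i, j, k))
    return cells
```

Flagged cells would then be refined (e.g. by trilinear interpolation and Jacobian analysis) to pin down the null position and classify it as X- or O-type.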
Nanosatellite Maneuver Planning for Point Cloud Generation With a Rangefinder
2015-06-05
aided active vision systems [11], dense stereo [12], and TriDAR [13]. However, these systems are unsuitable for a nanosatellite system from power, size...command profiles as well as improving the fidelity of gap detection with better filtering methods for background objects. For example, attitude...application of a single beam laser rangefinder (LRF) to point cloud generation, shape detection, and shape reconstruction for a space-based space
Ryvolová, Markéta; Preisler, Jan; Foret, Frantisek; Hauser, Peter C; Krásenský, Pavel; Paull, Brett; Macka, Mirek
2010-01-01
This work for the first time combines three on-capillary detection methods, namely, capacitively coupled contactless conductometric (C(4)D), photometric (PD), and fluorimetric (FD), in a single (identical) point of detection cell, allowing concurrent measurements at a single point of detection for use in capillary electrophoresis, capillary electrochromatography, and capillary/nanoliquid chromatography. The novel design is based on a standard 6.3 mm i.d. fiber-optic SMA adapter with a drilled opening for the separation capillary to go through, to which two concentrically positioned C(4)D detection electrodes with a detection gap of 7 mm were added on each side, acting simultaneously as capillary guides. The optical fibers in the SMA adapter were used for the photometric signal (absorbance), and another optical fiber at a 45 degrees angle to the capillary was applied to collect the emitted light for FD. Light emitting diodes (255 and 470 nm) were used as light sources for the PD and FD detection modes. LOD values were determined under flow-injection conditions to exclude any stacking effects: for the 470 nm LED, the limits of detection (LODs) were 1 x 10(-8) mol/L for fluorescein (FD) and 6 x 10(-6) mol/L for tartrazine (PD), and the LOD for C(4)D was 5 x 10(-7) mol/L for magnesium chloride. The advantage of the three different detection signals at a single point is demonstrated in capillary electrophoresis using model mixtures and samples, including a mixture of fluorescent and nonfluorescent dyes and common ions, underivatized amino acids, and a fluorescently labeled digest of bovine serum albumin.
Comparison of methods for estimating density of forest songbirds from point counts
Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey
2011-01-01
New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...
Video shot boundary detection using region-growing-based watershed method
NASA Astrophysics Data System (ADS)
Wang, Jinsong; Patel, Nilesh; Grosky, William
2004-10-01
In this paper, a novel shot boundary detection approach is presented, based on the popular region-growing segmentation method - watershed segmentation. In image processing, gray-scale pictures can be considered as topographic reliefs, in which the numerical value of each pixel of a given image represents the elevation at that point. The watershed method segments images by filling up basins with water starting at local minima; at points where water coming from different basins meets, dams are built. In our method, each frame in the video sequence is first transformed from the feature space into the topographic space based on a density function. Low-level features are extracted from frame to frame, and each frame is then treated as a point in the feature space. The density of each point is defined as the sum of the influence functions of all neighboring data points. The height function originally used in watershed segmentation is then replaced by inverting the density at the point, so that all the highest density values are transformed into local minima. Subsequently, watershed segmentation is performed in the topographic space. The intuition underlying our method is that frames within a shot are highly agglomerative in the feature space and more likely to be merged together, while the frames between shots that represent shot changes are not; hence they have lower density values and are less likely to be clustered by carefully extracting the markers and choosing the stopping criterion.
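The density-to-height inversion can be sketched in one dimension with Gaussian influence functions; the bandwidth is an illustrative assumption:

```python
import math

def inverted_density(frames, sigma=1.0):
    """Density at each frame-feature point as the sum of Gaussian influence
    functions of all points, negated so dense clusters become basins (local
    minima) for watershed flooding. 1D sketch of the paper's idea."""
    dens = []
    for x in frames:
        d = sum(math.exp(-((x - y) ** 2) / (2 * sigma ** 2)) for y in frames)
        dens.append(-d)   # invert: high density -> deep basin
    return dens
```

Frames clustered within a shot sit in a deep basin, while an isolated transition frame sits on a ridge between basins, which is exactly where watershed builds its dam.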
On the detection of other planetary systems by astrometric techniques
NASA Technical Reports Server (NTRS)
Black, D. C.; Scargle, J. D.
1982-01-01
A quantitative method for astrometrically detecting perturbations induced in a star's motion by the presence of a planetary object is described. A periodogram is defined, wherein signals observed from a star show exactly periodic variations, which can be extracted from observational data using purely statistical methods. A detection threshold is defined for the frequency of occurrence of some detectable signal, e.g., the Nyquist frequency. Possible effects of a stellar orbital eccentricity and multiple companions are discussed, noting that assumption of a circular orbit assures the spectral purity of the signal described. The periodogram technique was applied to 12 yr of astrometric data from the U.S. Naval Observatory for three stars with low mass stellar companions. Periodic perturbations were confirmed. A comparison of the accuracy of different astrometric systems shows that the detection accuracy of a system is determined by the measurement accuracy and the number of observations, although the detection efficiency can be maximized by minimizing the number of data points for the case when observational errors are proportional to the square root of the number of data points. It is suggested that a space-based astrometric telescope is best suited to take advantage of the method.
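A classical periodogram of the kind used for detecting periodic astrometric perturbations can be sketched as follows; Scargle's exact normalization for uneven sampling is omitted for brevity:

```python
import math

def periodogram_power(t, x, freq):
    """Classical periodogram power of a (possibly unevenly sampled) series
    at trial frequency `freq` (cycles per unit of t). Illustrative only."""
    n = len(t)
    mean = sum(x) / n
    c = sum((xi - mean) * math.cos(2 * math.pi * freq * ti) for ti, xi in zip(t, x))
    s = sum((xi - mean) * math.sin(2 * math.pi * freq * ti) for ti, xi in zip(t, x))
    return (c * c + s * s) / n
```

A companion-induced wobble shows up as a power spike at the orbital frequency; the detection threshold in the abstract corresponds to the power level exceeded by noise only rarely.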
Al-Kaff, Abdulla; García, Fernando; Martín, David; De La Escalera, Arturo; Armingol, José María
2017-01-01
One of the most challenging problems in the domain of autonomous aerial vehicles is the design of a robust real-time obstacle detection and avoidance system. This problem is complex, especially for micro and small aerial vehicles, due to their Size, Weight and Power (SWaP) constraints. Therefore, using lightweight sensors (i.e., a digital camera) can be the best choice compared with other sensors such as laser or radar. For real-time applications, different works are based on stereo cameras in order to obtain a 3D model of the obstacles, or to estimate their depth. Instead, in this paper, a method that mimics the human behavior of detecting the collision state of approaching obstacles using a monocular camera is proposed. The key of the proposed algorithm is to analyze the size changes of the detected feature points, combined with the expansion ratios of the convex hull constructed around the detected feature points from consecutive frames. During the Unmanned Aerial Vehicle (UAV) motion, the detection algorithm estimates the changes in the size of the area of the approaching obstacles. First, the method detects the feature points of the obstacles, then extracts the obstacles that have a probability of getting close to the UAV. Second, by comparing the area ratio of the obstacle and the position of the UAV, the method decides if the detected obstacle may cause a collision. Finally, by estimating the obstacle's 2D position in the image and combining it with the tracked waypoints, the UAV performs the avoidance maneuver. The proposed algorithm was evaluated by performing real indoor and outdoor flights, and the obtained results show the accuracy of the proposed algorithm compared with other related works. PMID:28481277
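The convex-hull expansion-ratio cue can be sketched with a small hull-area routine; the growth threshold and function names are illustrative:

```python
def hull_area(pts):
    """Area of the convex hull of 2D points (monotone chain + shoelace)."""
    pts = sorted(set(pts))
    if len(pts) < 3:
        return 0.0
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    hull = half(pts)[:-1] + half(list(reversed(pts)))[:-1]
    n = len(hull)
    return 0.5 * abs(sum(hull[i][0] * hull[(i + 1) % n][1] -
                         hull[(i + 1) % n][0] * hull[i][1] for i in range(n)))

def approaching(prev_pts, curr_pts, ratio=1.2):
    """Flag an obstacle as approaching when the hull area of its tracked
    feature points grows by more than `ratio` between consecutive frames
    (threshold is illustrative)."""
    a0, a1 = hull_area(prev_pts), hull_area(curr_pts)
    return a0 > 0 and a1 / a0 > ratio
```

An object on a collision course looms: its projected hull area grows frame over frame, which a single camera can measure without any depth estimate.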
NASA Astrophysics Data System (ADS)
Wang, Weixing; Wang, Zhiwei; Han, Ya; Li, Shuang; Zhang, Xin
2015-03-01
In order to ensure safety, long-term stability and quality control in modern tunneling operations, the acquisition of geotechnical information about encountered rock conditions and detailed installed-support information is required. The limited space and time in an operational tunnel environment make acquiring data challenging. Laser scanning in a tunneling environment, however, shows great potential. The surveying and mapping of tunnels are crucial for optimal use after construction and in routine inspections. Most of these applications focus on the geometric information of the tunnels extracted from the laser scanning data. Two kinds of applications are widely discussed: deformation measurement and feature extraction. The traditional deformation measurement in an underground environment is performed with a series of permanent control points installed around the profile of an excavation, which is unsuitable for a global consideration of the investigated area. Using laser scanning for deformation analysis provides many benefits as compared to traditional monitoring techniques. The change in profile can be fully characterized, and areas of anomalous movement can easily be separated from overall trends due to the high density of the point cloud data. Furthermore, monitoring with a laser scanner does not require the permanent installation of control points, therefore the monitoring can be completed more quickly after excavation, and the scanning is non-contact, hence no damage is done during the installation of temporary control points. The main drawback of using laser scanning for deformation monitoring is that the point accuracy of the original data is generally of the same magnitude as the smallest level of deformations that are to be measured. To overcome this, statistical techniques and three-dimensional image processing techniques for the point clouds must be developed.
To safely, effectively and easily detect over- and underbreak of roadways and to overcome the difficulties of roadway data collection, this paper presents a new method of continuous cross-section extraction and over/underbreak detection based on 3D laser scanning technology and image processing. The method is divided into three steps: Canny edge detection, local axis fitting, and continuous section extraction with over/underbreak detection. First, after Canny edge detection, a least-squares curve fitting method is used to achieve a partial fit of the axis. Then the attitude of the local roadway is adjusted so that the axis of the roadway is consistent with the extraction reference direction, and sections are extracted along that direction. Finally, the actual cross-section is compared with the design cross-section to complete the overbreak detection. Experimental results show that the proposed method has a great advantage in computing cost and guarantees orthogonal cross-section intercepts compared with traditional detection methods.
Railway clearance intrusion detection method with binocular stereo vision
NASA Astrophysics Data System (ADS)
Zhou, Xingfang; Guo, Baoqing; Wei, Wei
2018-03-01
During railway construction and operation, objects intruding into the railway clearance seriously threaten the safety of railway operations, so real-time intrusion detection is of great importance. To overcome the depth insensitivity and shadow interference of single-image methods, an intrusion detection method based on binocular stereo vision is proposed, which reconstructs the 3D scene to locate objects and judge clearance intrusion. The binocular cameras are calibrated with Zhang Zhengyou's method. To speed up 3D reconstruction, a suspicious region is first determined by a background-difference method applied to a single camera's image sequence; image rectification, stereo matching and 3D reconstruction are executed only when a suspicious region exists. A transformation matrix from the Camera Coordinate System (CCS) to the Track Coordinate System (TCS), computed using the gauge constant, transfers the 3D point clouds into the TCS, where they are used to calculate object position and intrusion. Experiments in a railway scene show that the position precision is better than 10 mm. The method is an effective way to detect clearance intrusion and can satisfy the requirements of railway applications.
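The geometric core of such a pipeline can be sketched in a few lines: triangulate depth from disparity, then move the reconstructed point into the Track Coordinate System with a rigid transform. The focal length, baseline and CCS-to-TCS matrix below are toy values, not the calibrated quantities from the paper.

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def camera_to_track(p_ccs, R, t):
    """Rigid CCS -> TCS transform: X_tcs = R @ X_ccs + t."""
    return tuple(sum(R[i][j] * p_ccs[j] for j in range(3)) + t[i]
                 for i in range(3))

# Toy calibration: 1000 px focal length, 0.5 m baseline, 20 px disparity.
Z = depth_from_disparity(1000.0, 0.5, 20.0)          # 25 m to the object
# Identity rotation and a 1.5 m offset stand in for the gauge-derived matrix.
p_tcs = camera_to_track((0.0, 0.0, Z),
                        [[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                        (0.0, -1.5, 0.0))
```

With the point expressed in the TCS, the intrusion test reduces to comparing its coordinates against the clearance envelope.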
NASA Astrophysics Data System (ADS)
Zang, Lixin; Zhao, Huimin; Zhang, Zhiguo; Cao, Wenwu
2017-02-01
Photodynamic therapy (PDT) is currently an advanced optical technology in medical applications. However, the application of PDT is limited by the detection of photosensitizers. This work focuses on the application of fluorescence spectroscopy and imaging to the detection of an effective photosensitizer, hematoporphyrin monomethyl ether (HMME). The optical properties of HMME were measured and analyzed based on its absorption and fluorescence spectra, and the production mechanism of its fluorescence emission was analyzed. A detection device for HMME based on fluorescence spectroscopy was designed. A ratiometric method was applied to eliminate the influence of intensity changes of the excitation sources, fluctuations of the excitation sources and photodetectors, and background emissions. The detection limit of this device is 6 μg/L, and it was successfully applied to monitoring the metabolism of HMME in esophageal cancer cells. To overcome the limitation of point measurement with fluorescence spectroscopy, a two-dimensional (2D) fluorescence imaging system was established. The algorithm of the 2D fluorescence imaging system is derived from the fluorescence ratiometric method using bandpass filters. The method of multiple pixel point addition (MPPA) was used to suppress signal fluctuations; using MPPA, the SNR was improved by about 30 times. The detection limit of this imaging system is 1.9 μg/L. Our systems can be used for the detection of porphyrins to improve the PDT effect.
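The two noise-suppression ideas are easy to sketch: a two-band ratio cancels common-mode excitation drift, and averaging N pixels (MPPA) reduces uncorrelated noise by roughly sqrt(N). The numbers below are illustrative, not the device's calibration.

```python
import random

def ratiometric(signal_band, reference_band):
    """Two-band ratio: a common gain g on both bands cancels out, which is
    how the ratiometric method suppresses excitation-source fluctuations."""
    return signal_band / reference_band

def mppa(pixel_values):
    """Multiple pixel point addition: averaging N pixels reduces
    uncorrelated noise by a factor of about sqrt(N)."""
    return sum(pixel_values) / len(pixel_values)

# Lamp drift g scales both bands equally; the ratio is unchanged.
for g in (0.8, 1.0, 1.3):
    assert abs(ratiometric(g * 200.0, g * 50.0) - 4.0) < 1e-12

# Averaging ~900 pixels gives roughly a sqrt(900) = 30x noise reduction,
# the order of the ~30x SNR gain reported above.
random.seed(0)
noisy_pixels = [100.0 + random.gauss(0.0, 5.0) for _ in range(900)]
estimate = mppa(noisy_pixels)
```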
Classification of Aerial Photogrammetric 3d Point Clouds
NASA Astrophysics Data System (ADS)
Becker, C.; Häni, N.; Rosinskaya, E.; d'Angelo, E.; Strecha, C.
2017-05-01
We present a powerful method to extract per-point semantic class labels from aerial photogrammetry data. Labelling this kind of data is important for tasks such as environmental modelling, object classification and scene understanding. Unlike previous point cloud classification methods that rely exclusively on geometric features, we show that incorporating color information yields a significant increase in accuracy in detecting semantic classes. We test our classification method on three real-world photogrammetry datasets that were generated with Pix4Dmapper Pro, and with varying point densities. We show that off-the-shelf machine learning techniques coupled with our new features allow us to train highly accurate classifiers that generalize well to unseen data, processing point clouds containing 10 million points in less than 3 minutes on a desktop computer.
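As a toy illustration of why colour helps, the sketch below assembles a per-point feature vector from simple neighbourhood geometry plus RGB. The specific features (height spread, a "greenness" index) are our own stand-ins, not the descriptors used by the authors or by Pix4Dmapper.

```python
def point_features(neighbors, rgb):
    """Toy per-point feature vector mixing local geometry with colour,
    in the spirit of (not identical to) geometry+colour classification."""
    zs = [p[2] for p in neighbors]
    z_range = max(zs) - min(zs)               # local height spread
    z_mean = sum(zs) / len(zs)                # local mean height
    r, g, b = rgb
    greenness = g - (r + b) / 2.0             # crude vegetation cue
    return [z_range, z_mean, float(r), float(g), float(b), greenness]

# A low, green neighbourhood: geometry alone is ambiguous (grass vs.
# gravel are both flat), but the colour channels separate the classes.
neigh = [(0.0, 0.0, 0.00), (0.1, 0.0, 0.02), (0.0, 0.1, 0.01)]
feats = point_features(neigh, rgb=(60, 180, 70))
```

Vectors like this would then be fed to an off-the-shelf classifier (e.g. a random forest) as described above.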
Optimizing probability of detection point estimate demonstration
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2017-04-01
The paper discusses how to optimize probability of detection (POD) demonstration experiments that use the point estimate method. The optimization seeks an acceptable probability of passing the demonstration (PPD) and an acceptable probability of false calls (POF) while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures; it uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. Traditionally, the largest flaw size in the set is taken as a conservative estimate of the flaw size with minimum 90% probability of detection at 95% confidence, denoted α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, the 90%-probability flaw size, needed to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution, as is the difference between the median or average of the 29 flaws and α90. In general, it is concluded that if the probability of detection increases with flaw size, the average of the 29 flaw sizes is always larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, the 29-flaw set can be optimized to meet the requirements of minimum PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing flaw sizes in the point estimate demonstration flaw set.
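The zero-failure binomial arithmetic behind the 29-flaw set is compact enough to show directly. This sketch reproduces the standard 90/95 calculation, not the paper's optimization procedure.

```python
def lower_confidence_pod(n, confidence=0.95):
    """Zero-failure binomial point estimate: the POD p for which passing
    an n-out-of-n demonstration is just barely consistent, p**n = 1 - C."""
    return (1.0 - confidence) ** (1.0 / n)

def prob_passing_demo(pod, n):
    """Probability of passing an n-out-of-n demonstration (PPD) when the
    true probability of detection at that flaw size is `pod`."""
    return pod ** n

# The classical 29-of-29 set: a flaw with exactly 90% POD passes with only
# ~4.7% probability, so a pass supports 90% POD at 95% confidence.
p90 = lower_confidence_pod(29)
ppd_at_90 = prob_passing_demo(0.90, 29)
```

This also makes the optimization tension concrete: raising the PPD for good procedures means choosing flaw sizes whose true POD is well above 0.90, since a 0.90-POD set passes less than 5% of the time.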
Cryo-balloon catheter localization in fluoroscopic images
NASA Astrophysics Data System (ADS)
Kurzendorfer, Tanja; Brost, Alexander; Jakob, Carolin; Mewes, Philip W.; Bourier, Felix; Koch, Martin; Kurzidim, Klaus; Hornegger, Joachim; Strobel, Norbert
2013-03-01
Minimally invasive catheter ablation has become the preferred treatment option for atrial fibrillation. Although the standard ablation procedure involves ablation points set by radio-frequency catheters, cryo-balloon catheters have been reported to be more advantageous in certain cases. As electro-anatomical mapping systems do not support cryo-balloon ablation procedures, X-ray guidance is needed. However, current methods to support cryo-balloon catheters in fluoroscopically guided ablation procedures rely heavily on manual user interaction. To improve this, we propose a first method for automatic cryo-balloon catheter localization in fluoroscopic images based on a blob detection algorithm. Our method is evaluated on 24 clinical images from 17 patients. It successfully detected the cryo-balloon in 22 out of 24 images, a success rate of 91.6%, and the successful localizations achieved an accuracy of 1.00 mm +/- 0.44 mm. Even though our method currently fails in 8.4% of the available images, it still offers a significant improvement over manual methods. Furthermore, detecting a landmark point along the cryo-balloon catheter can be an important step for additional post-processing operations.
Advances in Candida detection platforms for clinical and point-of-care applications
Safavieh, Mohammadali; Coarsey, Chad; Esiobu, Nwadiuto; Memic, Adnan; Vyas, Jatin Mahesh; Shafiee, Hadi; Asghar, Waseem
2016-01-01
Invasive candidiasis remains one of the most serious community- and healthcare-acquired infections worldwide. Conventional Candida detection methods based on blood and plate culture are time-consuming and require at least 2–4 days to identify the various Candida species. Despite considerable advances in candidiasis detection, developing simple, compact and portable point-of-care diagnostics for rapid and precise testing that automatically perform cell lysis, nucleic acid extraction, purification and detection remains a challenge. Here, we systematically review the most prominent conventional and nonconventional techniques for the detection of various Candida species, including Candida staining, blood culture, serological testing and nucleic acid-based analysis. We also discuss the most advanced lab-on-a-chip devices for Candida detection. PMID:27093473
Appraisal of an Array TEM Method in Detecting a Mined-Out Area Beneath a Conductive Layer
NASA Astrophysics Data System (ADS)
Li, Hai; Xue, Guo-qiang; Zhou, Nan-nan; Chen, Wei-ying
2015-10-01
The transient electromagnetic method (TEM) has been extensively used for the detection of mined-out areas in China over the past few years. When the mined-out area is overlain by a conductive layer, detecting the target layer with a traditional loop-source TEM method is difficult. To detect the target layer under this condition, this paper presents a newly developed array TEM method that uses a grounded wire source. The underground current density distribution and the responses of the grounded-wire-source TEM configuration are modeled to demonstrate that the target layer is detectable under this condition. The 1D OCCAM inversion routine is applied to the synthetic single-station data and to a common-midpoint gather. The results reveal that the electric-source TEM method is capable of recovering a resistive target layer beneath a conductive overburden. By contrast, a conductive target layer cannot be recovered unless the distance between the target layer and the conductive overburden is large. Compared with inversion of the single-station data, inversion of the common-midpoint gather better recovers the resistivity of the target layer. Finally, a case study illustrates that the array TEM method was successfully applied to recover a water-filled mined-out area beneath a conductive overburden.
2011-01-01
Background The Prospective Space-Time scan statistic (PST) is widely used for the evaluation of space-time clusters of point event data. Usually a window of cylindrical shape is employed, with a circular or elliptical base in the space domain. Recently, the concept of Minimum Spanning Tree (MST) was applied to specify the set of potential clusters, through the Density-Equalizing Euclidean MST (DEEMST) method, for the detection of arbitrarily shaped clusters. The original map is cartogram transformed, such that the control points are spread uniformly. That method is quite effective, but the cartogram construction is computationally expensive and complicated. Results A fast method for the detection and inference of point data set space-time disease clusters is presented, the Voronoi Based Scan (VBScan). A Voronoi diagram is built for points representing population individuals (cases and controls). The number of Voronoi cells boundaries intercepted by the line segment joining two cases points defines the Voronoi distance between those points. That distance is used to approximate the density of the heterogeneous population and build the Voronoi distance MST linking the cases. The successive removal of edges from the Voronoi distance MST generates sub-trees which are the potential space-time clusters. Finally, those clusters are evaluated through the scan statistic. Monte Carlo replications of the original data are used to evaluate the significance of the clusters. An application for dengue fever in a small Brazilian city is presented. Conclusions The ability to promptly detect space-time clusters of disease outbreaks, when the number of individuals is large, was shown to be feasible, due to the reduced computational load of VBScan. Instead of changing the map, VBScan modifies the metric used to define the distance between cases, without requiring the cartogram construction. 
Numerical simulations showed that VBScan has higher power of detection, sensitivity and positive predictive value than the elliptic PST. Furthermore, as VBScan also incorporates topological information from the point neighborhood structure, in addition to the usual geometric information, it is more robust than purely geometric methods such as the elliptic scan. Those advantages were illustrated in a real setting for dengue fever space-time clusters. PMID:21513556
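The MST edge-removal idea behind the candidate clusters can be sketched as follows, using plain Euclidean distance in place of the Voronoi-based distance, so this shows the generic mechanism rather than VBScan itself.

```python
import math

def mst_edges(points):
    """Kruskal's MST over the complete Euclidean graph (VBScan would use
    the Voronoi-based distance instead of math.dist)."""
    edges = sorted((math.dist(points[i], points[j]), i, j)
                   for i in range(len(points))
                   for j in range(i + 1, len(points)))
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((w, i, j))
    return tree

def candidate_clusters(points):
    """Remove the longest MST edge; the resulting sub-trees are the first
    candidate clusters to be scored by the scan statistic."""
    tree = sorted(mst_edges(points))
    kept = tree[:-1]                        # drop the longest edge
    comp = {i: {i} for i in range(len(points))}
    for _, i, j in kept:                    # merge components over kept edges
        merged = comp[i] | comp[j]
        for k in merged:
            comp[k] = merged
    return {frozenset(c) for c in comp.values()}

# Two spatial groups: cutting the single long bridge edge separates them.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
clusters = candidate_clusters(pts)
```

Repeating the removal on each sub-tree generates the full family of candidates, which are then ranked by the scan statistic with Monte Carlo significance testing.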
Wheat Ear Detection in Plots by Segmenting Mobile Laser Scanner Data
NASA Astrophysics Data System (ADS)
Velumani, K.; Oude Elberink, S.; Yang, M. Y.; Baret, F.
2017-09-01
The use of Light Detection and Ranging (LiDAR) to study agricultural crop traits is becoming popular. Wheat plant traits such as crop height, biomass fractions and plant population are of interest to agronomists and biologists for assessing a genotype's performance in the environment. Among these performance indicators, plant population in the field is still widely estimated through manual counting, a tedious and labour-intensive task. The goal of this study is to explore the suitability of LiDAR observations to automate the counting process through the individual detection of wheat ears in the agricultural field. This is a challenging task owing to the random cropping pattern and the noisy returns present in the point cloud. The goal is achieved by first segmenting the 3D point cloud and then classifying the segments into ears and non-ears. In this study, two segmentation techniques, (a) voxel-based segmentation and (b) mean shift segmentation, were adapted to suit the segmentation of plant point clouds. An ear classification strategy was developed to distinguish the ear segments from leaves and stems. Finally, the ears extracted by the automatic methods were compared with reference ear segments prepared by manual segmentation. Both methods had an average detection rate of 85%, aggregated over different flowering stages. The voxel-based approach performed well for late flowering stages (wheat crops aged 210 days or more), with a mean accuracy of 94%, and takes less than 20 seconds to process 50,000 points at an average point density of 16 points/cm2. Meanwhile, the mean shift approach showed a comparatively better counting accuracy of 95% for the early flowering stage (crops aged below 225 days) and takes approximately 4 minutes to process 50,000 points.
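The voxel-based segmentation step can be sketched as grouping points into occupied voxels and flood-filling 6-connected voxels into segments. The 5 cm grid below is our illustrative choice, not the study's tuning.

```python
from collections import defaultdict

def voxelize(points, voxel):
    """Hash each point into an integer voxel index."""
    grid = defaultdict(list)
    for p in points:
        grid[tuple(int(c // voxel) for c in p)].append(p)
    return grid

def voxel_segments(points, voxel=0.05):
    """Voxel-based segmentation: flood-fill 6-connected occupied voxels;
    each connected component of voxels becomes one segment."""
    grid = voxelize(points, voxel)
    seen, segments = set(), []
    for start in grid:
        if start in seen:
            continue
        seen.add(start)
        stack, segment = [start], []
        while stack:
            v = stack.pop()
            segment.extend(grid[v])
            for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                nb = (v[0] + dx, v[1] + dy, v[2] + dz)
                if nb in grid and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        segments.append(segment)
    return segments

# Two point groups about 1 m apart fall into non-adjacent voxels,
# so two segments come out.
pts = [(0.0, 0.0, 0.0), (0.02, 0.0, 0.0), (0.04, 0.04, 0.04),
       (1.0, 1.0, 1.0), (1.02, 1.0, 1.0)]
segments = voxel_segments(pts, voxel=0.05)
```

The ear/non-ear classification described above would then operate on each segment's shape statistics.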
Modified screening and ranking algorithm for copy number variation detection.
Xiao, Feifei; Min, Xiaoyi; Zhang, Heping
2015-05-01
Copy number variation (CNV) is a type of structural variation, usually defined as genomic segments of 1 kb or larger that present variable copy numbers when compared with a reference genome. The screening and ranking algorithm (SaRa) was recently proposed as an efficient approach for multiple change-point detection that can be applied to CNV detection. However, some practical issues arise when applying SaRa to single nucleotide polymorphism data. In this study, we propose a modified SaRa for CNV detection to address these issues. First, we use quantile normalization on the original intensities to guarantee that the normal-mean-model-based SaRa is a robust method. Second, a novel normal mixture model coupled with a modified Bayesian information criterion is proposed for candidate change-point selection and for clustering the potential CNV segments into copy number states. Simulations revealed that the modified SaRa is a robust method for identifying change-points and achieves better performance than the circular binary segmentation (CBS) method. By applying the modified SaRa to real data from the HapMap project, we illustrate its performance in detecting CNV segments. In conclusion, our modified SaRa improves SaRa both theoretically and numerically for identifying CNVs from high-throughput genotyping data. The modSaRa package is implemented in R and freely available at http://c2s2.yale.edu/software/modSaRa. Supplementary data are available at Bioinformatics online.
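The screening step of SaRa rests on a simple local diagnostic; a minimal sketch (our own simplification, omitting the ranking, mBIC selection and mixture-model clustering steps) is:

```python
def local_diagnostic(y, h):
    """SaRa-style screening statistic: the absolute difference between the
    means of the h points to the right and to the left of each position;
    change-points appear as local maxima of this diagnostic."""
    d = [0.0] * len(y)
    for t in range(h, len(y) - h):
        left = sum(y[t - h:t]) / h
        right = sum(y[t:t + h]) / h
        d[t] = abs(right - left)
    return d

# A copy-number-like intensity signal: baseline 0 with a gained segment
# of mean 1 between positions 20 and 40.
y = [0.0] * 20 + [1.0] * 20 + [0.0] * 20
d = local_diagnostic(y, h=5)
first_change = max(range(len(d)), key=d.__getitem__)
```

On real SNP-array intensities the diagnostic is computed on noisy data, so local maxima are screened against a threshold and then ranked, which is where the modified BIC of the paper enters.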
Data approximation using a blending type spline construction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dalmo, Rune; Bratlie, Jostein
2014-11-18
Generalized expo-rational B-splines (GERBS) are a blending-type spline construction in which local functions at each knot are blended together by C^k-smooth basis functions. One way of approximating discrete regular data using GERBS is to partition the data set into subsets and fit a local function to each subset. Partitioning and fitting strategies can be devised such that important or interesting data points are interpolated in order to preserve certain features. We present a method for fitting discrete data using a tensor product GERBS construction. The method is based on the detection of feature points using differential geometry. Derivatives, which are necessary for feature point detection and are used to construct local surface patches, are approximated from the discrete data using finite differences.
A Method for Automatic Surface Inspection Using a Model-Based 3D Descriptor.
Madrigal, Carlos A; Branch, John W; Restrepo, Alejandro; Mery, Domingo
2017-10-02
Automatic visual inspection allows for the identification of surface defects in manufactured parts. Nevertheless, when defects are on a sub-millimeter scale, detection and recognition are a challenge. This is particularly true when the defect generates topological deformations that are not shown with strong contrast in the 2D image. In this paper, we present a method for recognizing surface defects in 3D point clouds. Firstly, we propose a novel 3D local descriptor called the Model Point Feature Histogram (MPFH) for defect detection. Our descriptor is inspired from earlier descriptors such as the Point Feature Histogram (PFH). To construct the MPFH descriptor, the models that best fit the local surface and their normal vectors are estimated. For each surface model, its contribution weight to the formation of the surface region is calculated and from the relative difference between models of the same region a histogram is generated representing the underlying surface changes. Secondly, through a classification stage, the points on the surface are labeled according to five types of primitives and the defect is detected. Thirdly, the connected components of primitives are projected to a plane, forming a 2D image. Finally, 2D geometrical features are extracted and by a support vector machine, the defects are recognized. The database used is composed of 3D simulated surfaces and 3D reconstructions of defects in welding, artificial teeth, indentations in materials, ceramics and 3D models of defects. The quantitative and qualitative results showed that the proposed method of description is robust to noise and the scale factor, and it is sufficiently discriminative for detecting some surface defects. The performance evaluation of the proposed method was performed for a classification task of the 3D point cloud in primitives, reporting an accuracy of 95%, which is higher than for other state-of-art descriptors. The rate of recognition of defects was close to 94%.
A Method for Automatic Surface Inspection Using a Model-Based 3D Descriptor
Branch, John W.
2017-01-01
PMID:28974037
NASA Astrophysics Data System (ADS)
Micheletti, Natan; Tonini, Marj; Lane, Stuart N.
2017-02-01
Acquisition of high-density point clouds using terrestrial laser scanners (TLSs) has become commonplace in geomorphic science. The derived point clouds are often interpolated onto regular grids and the grids compared to detect change (i.e. erosion and deposition/advancement movements). This procedure is necessary for some applications (e.g. digital terrain analysis), but it inevitably leads to a certain loss of potentially valuable information contained within the point clouds. In the present study, an alternative methodology for geomorphological analysis and feature detection from point clouds is proposed. It rests on the use of Density-Based Spatial Clustering of Applications with Noise (DBSCAN), applied to TLS data for a rock glacier front slope in the Swiss Alps. The proposed method allowed movements to be detected and isolated directly from the point clouds, yielding volume computations whose accuracy depends only on the actual registered distance between points. We demonstrate that these values are more conservative than volumes computed with the traditional DEM comparison. The results are illustrated for the summer of 2015, a season of enhanced geomorphic activity associated with exceptionally high temperatures.
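DBSCAN itself is easy to state; a minimal brute-force version (illustrative parameters, not those tuned for the rock glacier data) is:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN with brute-force neighbourhoods: returns one label
    per point, -1 for noise."""
    n = len(points)
    labels = [None] * n
    def neighbors(i):
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]
    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nb = neighbors(i)
        if len(nb) < min_pts:
            labels[i] = -1            # noise (may later become a border point)
            continue
        labels[i] = cluster
        seeds = list(nb)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster   # claim border point, do not expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb_j = neighbors(j)
            if len(nb_j) >= min_pts:  # core point: expand the cluster
                seeds.extend(nb_j)
        cluster += 1
    return labels

# Two dense groups plus one isolated (noise) point.
pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (10, 10), (10, 11), (11, 10), (11, 11), (5, 50)]
labels = dbscan(pts, eps=1.5, min_pts=3)
```

For point clouds of the size discussed above, the neighbourhood queries would of course go through a spatial index rather than the O(n^2) scan used here.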
Detection of image structures using the Fisher information and the Rao metric.
Maybank, Stephen J
2004-12-01
In many detection problems, the structures to be detected are parameterized by the points of a parameter space. If the conditional probability density function for the measurements is known, then detection can be achieved by sampling the parameter space at a finite number of points and checking each point to see if the corresponding structure is supported by the data. The number of samples and the distances between neighboring samples are calculated using the Rao metric on the parameter space. The Rao metric is obtained from the Fisher information which is, in turn, obtained from the conditional probability density function. An upper bound is obtained for the probability of a false detection. The calculations are simplified in the low noise case by making an asymptotic approximation to the Fisher information. An application to line detection is described. Expressions are obtained for the asymptotic approximation to the Fisher information, the volume of the parameter space, and the number of samples. The time complexity for line detection is estimated. An experimental comparison is made with a Hough transform-based method for detecting lines.
NASA Astrophysics Data System (ADS)
Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.
1997-07-01
We apply to the specific case of images taken with the ROSAT PSPC detector our wavelet-based X-ray source detection algorithm presented in a companion paper. Such images are characterized by the presence of detector ``ribs,'' strongly varying point-spread function, and vignetting, so that their analysis provides a challenge for any detection algorithm. First, we apply the algorithm to simulated images of a flat background, as seen with the PSPC, in order to calibrate the number of spurious detections as a function of significance threshold and to ascertain that the spatial distribution of spurious detections is uniform, i.e., unaffected by the ribs; this goal was achieved using the exposure map in the detection procedure. Then, we analyze simulations of PSPC images with a realistic number of point sources; the results are used to determine the efficiency of source detection and the accuracy of output quantities such as source count rate, size, and position, upon a comparison with input source data. It turns out that sources with 10 photons or less may be confidently detected near the image center in medium-length (~10^4 s), background-limited PSPC exposures. The positions of sources detected near the image center (off-axis angles < 15') are accurate to within a few arcseconds. Output count rates and sizes are in agreement with the input quantities, within a factor of 2 in 90% of the cases. The errors on position, count rate, and size increase with off-axis angle and for detections of lower significance. We have also checked that the upper limits computed with our method are consistent with the count rates of undetected input sources. Finally, we have tested the algorithm by applying it on various actual PSPC images, among the most challenging for automated detection procedures (crowded fields, extended sources, and nonuniform diffuse emission).
The performance of our method in these images is satisfactory and outperforms those of other current X-ray detection techniques, such as those employed to produce the MPE and WGA catalogs of PSPC sources, in terms of both detection reliability and efficiency. We have also investigated the theoretical limit for point-source detection, with the result that even sources with only 2-3 photons may be reliably detected using an efficient method in images with sufficiently high resolution and low background.
Replacement Condition Detection of Railway Point Machines Using an Electric Current Sensor.
Sa, Jaewon; Choi, Younchang; Chung, Yongwha; Kim, Hee-Young; Park, Daihee; Yoon, Sukhan
2017-01-29
Detecting replacement conditions of railway point machines is important to simultaneously satisfy the budget-limit and train-safety requirements. In this study, we consider classification of the subtle differences in the aging effect-using electric current shape analysis-for the purpose of replacement condition detection of railway point machines. After analyzing the shapes of after-replacement data and then labeling the shapes of each before-replacement data, we can derive the criteria that can handle the subtle differences between "does-not-need-to-be-replaced" and "needs-to-be-replaced" shapes. On the basis of the experimental results with in-field replacement data, we confirmed that the proposed method could detect the replacement conditions with acceptable accuracy, as well as provide visual interpretability of the criteria used for the time-series classification.
Replacement Condition Detection of Railway Point Machines Using an Electric Current Sensor
Sa, Jaewon; Choi, Younchang; Chung, Yongwha; Kim, Hee-Young; Park, Daihee; Yoon, Sukhan
2017-01-01
PMID:28146057
The Segmentation of Point Clouds with K-Means and ANN (Artificial Neural Network)
NASA Astrophysics Data System (ADS)
Kuçak, R. A.; Özdemir, E.; Erol, S.
2017-05-01
Segmentation of point clouds is now used in many Geomatics Engineering applications, such as building extraction in urban areas, Digital Terrain Model (DTM) generation, and road or urban furniture extraction. Segmentation is the process of dividing a point cloud according to its special characteristic layers. The present paper discusses segmentation of point clouds with K-means and the self-organizing map (SOM), a segmentation algorithm based on ANNs (Artificial Neural Networks). Point clouds generated with a photogrammetric method and with a Terrestrial Lidar System (TLS) were segmented according to surface normal, intensity and curvature, and the results were evaluated. LIDAR (Light Detection and Ranging) and photogrammetry are commonly used to obtain point clouds in many remote sensing and geodesy applications; with either method, point clouds can be obtained from terrestrial or airborne systems. In this study, the LIDAR measurements were made with a Leica C10 laser scanner, while the photogrammetric point cloud was obtained from photographs taken from the ground with a 13 MP non-metric camera.
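Plain K-means over per-point feature vectors, as used for one of the two algorithms above, can be sketched as follows; the feature values and seeding below are our own toy choices, not the study's data.

```python
import math

def kmeans(features, k, iters=20, seed_idx=None):
    """Plain k-means on per-point feature vectors (e.g. normal component,
    intensity, curvature); initial centres are the first k points unless
    seed indices are given."""
    centres = [list(features[i]) for i in (seed_idx or range(k))]
    assign = [0] * len(features)
    for _ in range(iters):
        # assignment step: nearest centre in feature space
        for i, f in enumerate(features):
            assign[i] = min(range(k), key=lambda c: math.dist(f, centres[c]))
        # update step: move each centre to the mean of its members
        for c in range(k):
            members = [features[i] for i in range(len(features))
                       if assign[i] == c]
            if members:
                centres[c] = [sum(col) / len(members)
                              for col in zip(*members)]
    return assign

# Toy features (normal z-component, intensity): a vertical dark wall
# versus a horizontal bright road surface.
feats = [(0.0, 0.2), (0.1, 0.25), (0.05, 0.22),
         (1.0, 0.8), (0.95, 0.85), (0.9, 0.9)]
labels = kmeans(feats, k=2, seed_idx=[0, 3])
```

The SOM variant replaces the hard assignment with neighbourhood-weighted updates of a map of prototype vectors, but consumes the same per-point features.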
Apparatus and method for creating a photonic densely-accumulated ray-point
NASA Technical Reports Server (NTRS)
Park, Yeonjoon (Inventor); Choi, Sang H. (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor)
2012-01-01
An optical apparatus includes an optical diffraction device configured for diffracting a predetermined wavelength of incident light onto adjacent optical focal points, and a photon detector for detecting a spectral characteristic of the predetermined wavelength. One of the optical focal points is a constructive interference point and the other optical focal point is a destructive interference point. The diffraction device, which may be a micro-zone plate (MZP) of micro-ring gratings or an optical lens, generates a constructive ray point using phase-contrasting of the destructive interference point. The ray point is located between adjacent optical focal points. A method of generating a densely-accumulated ray point includes directing incident light onto the optical diffraction device, diffracting the selected wavelength onto the constructive interference focal point and the destructive interference focal point, and generating the densely-accumulated ray point in a narrow region.
George L. Farnsworth; James D. Nichols; John R. Sauer; Steven G. Fancy; Kenneth H. Pollock; Susan A. Shriner; Theodore R. Simons
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point...
Self-Similar Spin Images for Point Cloud Matching
NASA Astrophysics Data System (ADS)
Pulido, Daniel
The rapid growth of Light Detection And Ranging (Lidar) technologies that collect, process, and disseminate 3D point clouds has allowed for increasingly accurate spatial modeling and analysis of the real world. Lidar sensors can generate massive 3D point clouds of a collection area that provide highly detailed spatial and radiometric information. However, a Lidar collection can be expensive and time consuming. Simultaneously, the growth of crowdsourced Web 2.0 data (e.g., Flickr, OpenStreetMap) has provided researchers with a wealth of freely available data sources that cover a variety of geographic areas. Crowdsourced data can be of varying quality and density. In addition, since it is typically not collected as part of a dedicated experiment but rather volunteered, when and where the data is collected is arbitrary. The integration of these two sources of geoinformation can provide researchers the ability to generate products and derive intelligence that mitigate their respective disadvantages and combine their advantages. Therefore, this research will address the problem of fusing two point clouds from potentially different sources. Specifically, we will consider two problems: scale matching and feature matching. Scale matching consists of computing feature metrics of each point cloud and analyzing their distributions to determine scale differences. Feature matching consists of defining local descriptors that are invariant to common dataset distortions (e.g., rotation and translation). Additionally, after matching the point clouds they can be registered and processed further (e.g., change detection). The objective of this research is to develop novel methods to fuse and enhance two point clouds from potentially disparate sources (e.g., Lidar and crowdsourced Web 2.0 datasets). The scope of this research is to investigate both scale and feature matching between two point clouds.
The specific focus of this research will be in developing a novel local descriptor based on the concept of self-similarity to aid in the scale and feature matching steps. An open problem in fusion is how best to extract features from two point clouds and then perform feature-based matching. The proposed approach for this matching step is the use of local self-similarity as an invariant measure to match features. In particular, the proposed approach is to combine the concept of local self-similarity with a well-known feature descriptor, Spin Images, and thereby define "Self-Similar Spin Images". This approach is then extended to the case of matching two point clouds in very different coordinate systems (e.g., a geo-referenced Lidar point cloud and a stereo-image derived point cloud without geo-referencing). The use of Self-Similar Spin Images is again applied to address this problem by introducing a "Self-Similar Keyscale" that matches the spatial scales of two point clouds. Another open problem is how best to detect changes in content between two point clouds. A method is proposed to find changes between two point clouds by analyzing the order statistics of the nearest neighbors between the two clouds, and thereby define the "Nearest Neighbor Order Statistic" method. Note that the well-known Hausdorff distance is a special case, being simply the maximum order statistic. Therefore, by studying the entire histogram of these nearest neighbors, a more robust method is expected for detecting points that are present in one cloud but not the other. This approach is applied at multiple resolutions: changes detected at the coarsest level will yield large missing targets, and finer levels will yield smaller targets.
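The "Nearest Neighbor Order Statistic" idea, studying the whole sorted histogram of nearest-neighbour distances rather than only the maximum (the Hausdorff distance), can be illustrated with a brute-force toy sketch; this is our own illustration of the concept, not the dissertation's code:

```python
import numpy as np

def nn_order_statistics(cloud_a, cloud_b):
    """Sorted nearest-neighbour distances from every point of cloud_a to
    cloud_b (brute force).  The largest entry is the one-sided Hausdorff
    distance, i.e. the maximum order statistic; the rest of the sorted
    histogram carries the robustness the method exploits."""
    d = np.linalg.norm(cloud_a[:, None, :] - cloud_b[None, :, :], axis=2)
    return np.sort(d.min(axis=1))

a = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0]])
b = np.array([[0.0, 0, 0], [1, 0, 0]])   # the point (2,0,0) has no counterpart
stats = nn_order_statistics(a, b)
```

Here `stats` ends in 1.0, flagging the unmatched point, while points present in both clouds contribute zeros at the front of the histogram; thresholding the tail rather than taking only the maximum makes the change detector less sensitive to a single outlier.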
A method of camera calibration with adaptive thresholding
NASA Astrophysics Data System (ADS)
Gao, Lei; Yan, Shu-hua; Wang, Guo-chao; Zhou, Chun-lei
2009-07-01
In order to calculate the camera parameters correctly, we must determine the accurate coordinates of certain points in the image plane. Corners are important features in 2D images. Generally speaking, they are points of high curvature that lie at the junction of image regions of different brightness, so corner detection is already widely used in many fields. In this paper we use the pinhole camera model and the SUSAN corner detection algorithm to calibrate the camera. When using the SUSAN corner detection algorithm, we propose an approach to set the gray-difference threshold adaptively, which makes it possible to pick out the correct chessboard inner corners under all kinds of gray contrast. Experimental results show the method to be feasible.
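As a rough illustration of the SUSAN idea the abstract builds on (count the neighbours whose gray difference from the centre stays within a threshold t, the "USAN" area, and flag pixels with a small USAN as corners), the following toy sketch derives t adaptively from the image's gray contrast. The 0.5·std rule and the box window are our assumptions, not the paper's exact scheme:

```python
import numpy as np

def susan_response(img, t, r=1, g_ratio=0.5):
    """Toy SUSAN corner response: the USAN is the number of neighbours in a
    (2r+1)^2 window whose gray difference from the centre is within t;
    pixels whose USAN falls below the geometric threshold g respond."""
    h, w = img.shape
    resp = np.zeros((h, w))
    g = g_ratio * ((2 * r + 1) ** 2 - 1)
    for y in range(r, h - r):
        for x in range(r, w - r):
            win = img[y - r:y + r + 1, x - r:x + r + 1]
            usan = np.sum(np.abs(win - img[y, x]) <= t) - 1   # exclude centre
            resp[y, x] = g - usan if usan < g else 0.0
    return resp

img = np.zeros((9, 9))
img[4:, 4:] = 100.0                  # one chessboard-style inner corner at (4,4)
t = 0.5 * img.std()                  # contrast-scaled threshold (our assumption)
resp = susan_response(img, t)
```

Scaling t with the image's contrast is what lets the same detector pick out chessboard corners whether the print is high- or low-contrast, which is the problem the adaptive threshold addresses.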
Algorithms used in the Airborne Lidar Processing System (ALPS)
Nagle, David B.; Wright, C. Wayne
2016-05-23
The Airborne Lidar Processing System (ALPS) analyzes Experimental Advanced Airborne Research Lidar (EAARL) data—digitized laser-return waveforms, position, and attitude data—to derive point clouds of target surfaces. A full-waveform airborne lidar system, the EAARL seamlessly and simultaneously collects mixed environment data, including submerged, sub-aerial bare earth, and vegetation-covered topographies. ALPS uses three waveform target-detection algorithms to determine target positions within a given waveform: centroid analysis, leading edge detection, and bottom detection using water-column backscatter modeling. The centroid analysis algorithm detects opaque hard surfaces. The leading edge algorithm detects topography beneath vegetation and shallow, submerged topography. The bottom detection algorithm uses water-column backscatter modeling for deeper submerged topography in turbid water. The report describes slant range calculations and explains how ALPS uses laser range and orientation measurements to project measurement points into the Universal Transverse Mercator coordinate system. Parameters used for coordinate transformations in ALPS are described, as are Interactive Data Language-based methods for gridding EAARL point cloud data to derive digital elevation models. Noise reduction in point clouds through use of a random consensus filter is explained, and detailed pseudocode, mathematical equations, and Yorick source code accompany the report.
Detection of Operator Performance Breakdown as an Automation Triggering Mechanism
NASA Technical Reports Server (NTRS)
Yoo, Hyo-Sang; Lee, Paul U.; Landry, Steven J.
2015-01-01
Performance breakdown (PB) has been anecdotally described as a state where the human operator "loses control of context" and "cannot maintain required task performance." Preventing such a decline in performance is critical to assure the safety and reliability of human-integrated systems, and therefore PB could be useful as a point at which automation can be applied to support human performance. However, PB has never been scientifically defined or empirically demonstrated. Moreover, there is no validated objective way of detecting such a state or the transition to that state. The purpose of this work is: 1) to empirically demonstrate a PB state, and 2) to develop an objective way of detecting such a state. This paper defines PB and proposes an objective method for its detection. A human-in-the-loop study was conducted: 1) to demonstrate PB by increasing workload until the subject reported being in a state of PB, 2) to identify possible parameters of a detection method for objectively identifying the subjectively-reported PB point, and 3) to determine if the parameters are idiosyncratic to an individual/context or are more generally applicable. In the experiment, fifteen participants were asked to manage three concurrent tasks (one primary and two secondary) for 18 minutes. The difficulty of the primary task was manipulated over time to induce PB while the difficulty of the secondary tasks remained static. The participants' task performance data were collected. Three hypotheses were constructed: 1) increasing workload will induce subjectively-identified PB, 2) there exist criteria that identify the threshold parameters that best match the subjectively-identified PB point, and 3) the criteria for choosing the threshold parameters are consistent across individuals. The results show that increasing workload can induce subjectively-identified PB, although this might not be generalizable: only 12 out of 15 participants declared PB.
The PB detection method based on signal detection analysis was applied to the performance data and the results showed that PB can be identified using the method, particularly when the values of the parameters for the detection method were calibrated individually.
Edge detection based on adaptive threshold b-spline wavelet for optical sub-aperture measuring
NASA Astrophysics Data System (ADS)
Zhang, Shiqi; Hui, Mei; Liu, Ming; Zhao, Zhu; Dong, Liquan; Liu, Xiaohua; Zhao, Yuejin
2015-08-01
In research on optical synthetic aperture imaging systems, phase congruency is the main problem, and it is necessary to detect the sub-aperture phase. The edge of the sub-aperture system is more complex than in a traditional optical imaging system, and because large-aperture optical components have steep slopes, the interference fringes may be quite dense during interference imaging. A deep phase gradient may cause a loss of phase information, so an efficient edge detection method is urgently needed. Wavelet analysis is a powerful tool widely used in image processing. Owing to its multi-scale transform properties, edge regions are detected with high precision at small scales, while noise is reduced as the scale increases, giving a certain suppression effect on noise. In addition, an adaptive threshold method, which sets different thresholds in various regions, can separate edge points from noise. First, the fringe pattern is obtained and a cubic b-spline wavelet is adopted as the smoothing function. After multi-scale wavelet decomposition of the whole image, we compute the local modulus maxima along the gradient directions. Because these still contain noise, the adaptive threshold method is used to select among the modulus maxima: a point whose modulus exceeds the threshold value is a boundary point. Finally, erosion and dilation are applied to the resulting image to obtain the consecutive boundary of the image.
New method for analyzing dark matter direct detection data
NASA Astrophysics Data System (ADS)
Davis, Jonathan H.; Enßlin, Torsten; Bœhm, Céline
2014-02-01
The experimental situation of dark matter direct detection has reached an exciting crossroads, with potential hints of a discovery of dark matter (DM) from the CDMS, CoGeNT, CRESST-II and DAMA experiments in tension with null results from xenon-based experiments such as XENON100 and LUX. Given the present controversial experimental status, it is important that the analytical method used to search for DM in direct detection experiments is both robust and flexible enough to deal with data for which the distinction between signal and background points is difficult, and hence where the choice between setting a limit or defining a discovery region is debatable. In this article we propose a novel (Bayesian) analytical method, which can be applied to all direct detection experiments and which extracts the maximum amount of information from the data. We apply our method to the XENON100 experiment data as a worked example, and show that firstly our exclusion limit at 90% confidence is in agreement with their own for the 225 live days data, but is several times stronger for the 100 live days data. Secondly we find that, due to the two points at low values of S1 and S2 in the 225 days data set, our analysis points to either weak consistency with low-mass dark matter or the possible presence of an unknown background. Given the null result from LUX, the latter scenario seems the more plausible.
FPFH-based graph matching for 3D point cloud registration
NASA Astrophysics Data System (ADS)
Zhao, Jiapeng; Li, Chen; Tian, Lihua; Zhu, Jihua
2018-04-01
Correspondence detection is a vital step in point cloud registration and it can help in obtaining a reliable initial alignment. In this paper, we put forward an advanced point-feature-based graph matching algorithm to solve the initial alignment problem of rigid 3D point cloud registration with partial overlap. Specifically, Fast Point Feature Histograms are first used to determine the initial possible correspondences. Next, a new objective function is provided to make the graph matching more suitable for partially overlapping point clouds. The objective function is optimized by the simulated annealing algorithm to obtain the final group of correct correspondences. Finally, we present a novel set partitioning method which can transform the NP-hard optimization problem into an O(n³)-solvable one. Experiments on the Stanford and UWA public data sets indicate that our method obtains better results in terms of both accuracy and time cost compared with other point cloud registration methods.
Automated location detection of injection site for preclinical stereotactic neurosurgery procedure
NASA Astrophysics Data System (ADS)
Abbaszadeh, Shiva; Wu, Hemmings C. H.
2017-03-01
Currently, during stereotactic neurosurgery procedures, the manual task of locating the proper area for needle insertion or implantation of an electrode, cannula, or optic fiber can be time consuming. The task requires quickly and accurately finding the location for insertion. In this study we investigate an automated method to locate the entry point of the region of interest. This method leverages a digital image capture system, pattern recognition, and motorized stages. Template matching of known identifiable anatomical regions is used to find regions of interest (e.g., Bregma) in rodents. For our initial study, we tackle the problem of automatically detecting the entry point.
A microwave resonance dew-point hygrometer
NASA Astrophysics Data System (ADS)
Underwood, R. J.; Cuccaro, R.; Bell, S.; Gavioso, R. M.; Madonna Ripa, D.; Stevens, M.; de Podesta, M.
2012-08-01
We report the first measurements of a quasi-spherical microwave resonator used as a dew-point hygrometer. In conventional dew-point hygrometers, the condensation of water from humid gas flowing over a mirror is detected optically, and the mirror surface is then temperature-controlled to yield a stable condensed layer. In our experiments we flowed moist air from a humidity generator through a quasi-spherical resonator and detected the onset of condensation by measuring the frequency ratio of selected microwave modes. We verified the basic operation of the device over the dew-point range 9.5-13.5 °C by comparison with calibrated chilled-mirror hygrometers. These tests indicate that the microwave method may allow a quantitative estimation of the volume and thickness of the water layer which is condensed on the inner surface of the resonator. The experiments reported here are preliminary due to the limited time available for the work, but show the potential of the method for detecting not only water but a variety of other liquid or solid condensates. The robust all-metal construction should make the device appropriate for use in industrial applications over a wide range of temperatures and pressures.
Building Change Detection from Bi-Temporal Dense-Matching Point Clouds and Aerial Images.
Pang, Shiyan; Hu, Xiangyun; Cai, Zhongliang; Gong, Jinqi; Zhang, Mi
2018-03-24
In this work, a novel building change detection method from bi-temporal dense-matching point clouds and aerial images is proposed to address two major problems, namely, the robust acquisition of the changed objects above ground and the automatic classification of changed objects into buildings or non-buildings. For the acquisition of changed objects above ground, the change detection problem is converted into a binary classification, in which the changed area above ground is regarded as the foreground and the other area as the background. For the gridded points of each period, the graph cuts algorithm is adopted to classify the points into foreground and background, followed by the region-growing algorithm to form candidate changed building objects. A novel structural feature that was extracted from aerial images is constructed to classify the candidate changed building objects into buildings and non-buildings. The changed building objects are further classified as "newly built", "taller", "demolished", and "lower" by combining the classification and the digital surface models of two periods. Finally, three typical areas from a large dataset are used to validate the proposed method. Numerous experiments demonstrate the effectiveness of the proposed algorithm.
Robust method to detect and locate local earthquakes by means of amplitude measurements.
NASA Astrophysics Data System (ADS)
del Puy Papí Isaba, María; Brückl, Ewald
2016-04-01
In this study we present a robust new method to detect and locate medium and low magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude - distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid-point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid-point for further analysis instead of searching for a minimum of the L2 or L1 norm. In case no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid-point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that there are no dead traces involved in the processing. Compared to methods based on L2 and even L1 norms, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within a seismic network. 
This is possible due to the method of obtaining and storing a back-projected matrix, independent of the registered amplitude, for each seismic station. As a direct consequence, we are able to save computing time in the calculation of the final back-projected maximum resultant amplitude at every grid-point. The capability of the method was demonstrated first using synthetic data. The method was then applied to data from 43 local earthquakes of low and medium magnitude (magnitudes between 1.7 and 4.3). These earthquakes were recorded and detected by the seismic network ALPAACT (seismological and geodetic monitoring of Alpine PAnnonian ACtive Tectonics) in the period 2010/06/11 to 2013/09/20. Data provided by the ALPAACT network are used to understand seismic activity in the Mürz Valley - Semmering - Vienna Basin transfer fault system in Austria, and what makes it an area of relatively high earthquake hazard and risk. The method will substantially support our efforts to involve scholars from polytechnic schools in seismological work within the Sparkling Science project Schools & Quakes.
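A minimal sketch of the back-projection idea described above, assuming a simple amplitude-distance law (pseudo-magnitude = amplitude + log10(distance)) as a stand-in for the paper's empirical relation: each station's amplitude is back-projected to every grid point, the minimum pseudo-magnitude over stations is kept at each grid point (which is what makes the method robust against a single disturbed station), and the grid point where that minimum peaks is reported as the epicenter:

```python
import numpy as np

def locate_event(stations, amplitudes, grid):
    """Back-project each station's amplitude to every grid point with the
    assumed law pseudo-magnitude = amplitude + log10(distance), keep the
    MINIMUM over stations at each grid point (robust against outliers),
    and return the grid point where that minimum peaks."""
    pm = np.empty((len(stations), len(grid)))
    for i, s in enumerate(stations):
        dist = np.linalg.norm(grid - s, axis=1) + 1e-3   # avoid log10(0)
        pm[i] = amplitudes[i] + np.log10(dist)
    m = pm.min(axis=0)
    return grid[m.argmax()], m.max()

# synthetic event of magnitude 3.0 inside a square four-station network
stations = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
event = np.array([0.25, 0.6])
amps = 3.0 - np.log10(np.linalg.norm(stations - event, axis=1))
xs = np.arange(0.0, 1.01, 0.05)
grid = np.array([(x, y) for x in xs for y in xs])
best, mag = locate_event(stations, amps, grid)
```

With noise-free synthetic amplitudes the minimum pseudo-magnitude equals the true magnitude only at the true epicenter and is smaller everywhere else, so the peak recovers both location and magnitude.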
NASA Astrophysics Data System (ADS)
Abate, D.; Avgousti, A.; Faka, M.; Hermon, S.; Bakirtzis, N.; Christofi, P.
2017-10-01
This study compares the performance of aerial-image-based point clouds (IPCs) and light detection and ranging (LiDAR) point clouds in detecting thinnings and clear cuts in forests. IPCs are an appealing way to update forest resource data because of their accuracy in forest height estimation and the cost-efficiency of aerial image acquisition. We predicted forest changes over a period of three years by creating difference layers that display the difference in height or volume between the initial and subsequent time points; both IPCs and LiDAR data were used in this process. The IPCs were constructed with the Semi-Global Matching (SGM) algorithm. Difference layers were constructed by calculating differences in fitted height or volume models, or in canopy height models (CHMs), from both time points. The LiDAR-derived digital terrain model (DTM) was used to scale heights to above ground level. The study area was classified by logistic regression into the categories ClearCut, Thinning or NoChange using the values from the difference layers. We compared the predicted changes with the true changes verified in the field, and obtained at best a classification accuracy for clear cuts of 93.1% with IPCs and 91.7% with LiDAR data. However, the classification accuracy for thinnings was only 8.0% with IPCs, whereas 41.4% of thinnings were detected with LiDAR data. In conclusion, LiDAR data proved to be the more accurate method for predicting minor changes in forests, but both methods are useful for detecting major changes.
Detection of antisalivary duct antibody from Sjögren's syndrome by an autoradiographic method.
Cummings, N A; Tarpley, T M
1978-01-01
A new technique to detect anti-salivary duct antibody (ASDA) has been developed by using autoradiographic, rather than immunofluorescent methods. The antibody activity detected by autoradiography is probably classic ASDA. Both techniques may be consecutively performed on the same tissue section without attenuation of either. Some of the potential advantages of the radiolabelling of ASDA are pointed out, and a few preliminary experiments using the labelled antibody as a marker are presented.
A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.
Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan
2016-03-01
This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.
Binocular stereo matching method based on structure tensor
NASA Astrophysics Data System (ADS)
Song, Xiaowei; Yang, Manyi; Fan, Yubo; Yang, Lei
2016-10-01
In a binocular visual system, to recover the three-dimensional information of the object, the most important step is to acquire matching points. Structure tensor is the vector representation of each point in its local neighborhood. Therefore, structure tensor performs well in region detection of local structure, and it is very suitable for detecting specific graphics such as pedestrians, cars and road signs in the image. In this paper, the structure tensor is combined with the luminance information to form the extended structure tensor. The directional derivatives of luminance in x and y directions are calculated, so that the local structure of the image is more prominent. Meanwhile, the Euclidean distance between the eigenvectors of key points is used as the similarity determination metric of key points in the two images. By matching, the coordinates of the matching points in the detected target are precisely acquired. In this paper, experiments were performed on the captured left and right images. After the binocular calibration, image matching was done to acquire the matching points, and then the target depth was calculated according to these matching points. By comparison, it is proved that the structure tensor can accurately acquire the matching points in binocular stereo matching.
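A small sketch of the 2D structure tensor the abstract relies on: gradient outer products averaged over a window. The box window and `np.gradient` derivatives are simplifications of ours, not the paper's scheme. At a step edge one eigenvalue of the tensor is large and the other near zero, which is why the eigenstructure distinguishes edges, corners, and flat regions:

```python
import numpy as np

def structure_tensor(img, win=1):
    """Per-pixel 2x2 structure tensor: outer products of the x/y luminance
    derivatives summed over a (2*win+1)^2 box window (a simplification of
    the usual Gaussian window)."""
    Iy, Ix = np.gradient(img.astype(float))   # derivatives along rows, cols
    J = np.stack([Ix * Ix, Ix * Iy, Iy * Iy])
    h, w = img.shape
    T = np.zeros((h, w, 2, 2))
    for y in range(win, h - win):
        for x in range(win, w - win):
            jxx, jxy, jyy = J[:, y - win:y + win + 1, x - win:x + win + 1].sum(axis=(1, 2))
            T[y, x] = [[jxx, jxy], [jxy, jyy]]
    return T

img = np.zeros((9, 9))
img[:, 4:] = 1.0                                       # vertical step edge
lam = np.linalg.eigvalsh(structure_tensor(img)[4, 4])  # eigenvalues on the edge
```

For matching, the paper's "extended structure tensor" additionally appends luminance information to this per-point descriptor before comparing key points by Euclidean distance.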
Iris recognition using possibilistic fuzzy matching on local features.
Tsai, Chung-Chih; Lin, Heng-Yi; Taur, Jinshiuh; Tao, Chin-Wang
2012-02-01
In this paper, we propose a novel possibilistic fuzzy matching strategy with invariant properties, which can provide a robust and effective matching scheme for two sets of iris feature points. In addition, the nonlinear normalization model is adopted to provide more accurate position before matching. Moreover, an effective iris segmentation method is proposed to refine the detected inner and outer boundaries to smooth curves. For feature extraction, the Gabor filters are adopted to detect the local feature points from the segmented iris image in the Cartesian coordinate system and to generate a rotation-invariant descriptor for each detected point. After that, the proposed matching algorithm is used to compute a similarity score for two sets of feature points from a pair of iris images. The experimental results show that the performance of our system is better than those of the systems based on the local features and is comparable to those of the typical systems.
Fast Vessel Detection in Gaofen-3 SAR Images with Ultrafine Strip-Map Mode
Liu, Lei; Qiu, Xiaolan; Lei, Bin
2017-01-01
This study aims to detect vessels with lengths ranging from about 70 to 300 m in Gaofen-3 (GF-3) SAR images with ultrafine strip-map (UFS) mode as fast as possible. Based on an analysis of the characteristics of vessels in GF-3 SAR imagery, an effective vessel detection method is proposed in this paper. Firstly, the iterative constant false alarm rate (CFAR) method is employed to detect potential ship pixels. Secondly, the mean-shift operation is applied to each potential ship pixel to identify the candidate target region. During the mean-shift process, we maintain a selection matrix recording which pixels can be taken; these pixels are called the valid points of the candidate target. l1-norm regression is used to extract the principal axis and detect the valid points. Finally, two kinds of false alarms, bright lines and azimuth ambiguities, are removed by comparing the valid area of the candidate target with a pre-defined value and by computing the displacement between the true target and the corresponding replicas, respectively. Experimental results on three GF-3 SAR images with UFS mode demonstrate the effectiveness and efficiency of the proposed method. PMID:28678197
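As a hedged illustration of the CFAR stage, here is a plain 1-D cell-averaging variant (not the paper's iterative scheme): a cell under test is declared a detection when it exceeds a scaled estimate of the local clutter level, estimated from training cells around it while guard cells next to the test cell are excluded so a bright target does not inflate its own noise estimate:

```python
import numpy as np

def ca_cfar(x, guard=2, train=8, scale=8.0):
    """1-D cell-averaging CFAR: declare a detection where a cell exceeds
    scale times the mean of the surrounding training cells (guard cells
    next to the cell under test are excluded from the noise estimate)."""
    hits = []
    half = guard + train
    for i in range(half, len(x) - half):
        noise = np.concatenate([x[i - half:i - guard],
                                x[i + guard + 1:i + half + 1]]).mean()
        if x[i] > scale * noise:
            hits.append(i)
    return hits

rng = np.random.default_rng(0)
clutter = rng.exponential(1.0, 200)    # sea-clutter-like background
clutter[100] = 60.0                    # one bright ship pixel
hits = ca_cfar(clutter)
```

In SAR imagery the same idea runs in 2-D with a ring of training cells around each pixel; the `scale` factor fixes the false alarm rate for the assumed clutter distribution.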
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at 30 fps or more. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
Performance Analysis of a Pole and Tree Trunk Detection Method for Mobile Laser Scanning Data
NASA Astrophysics Data System (ADS)
Lehtomäki, M.; Jaakkola, A.; Hyyppä, J.; Kukko, A.; Kaartinen, H.
2011-09-01
Dense point clouds can be collected efficiently from large areas using mobile laser scanning (MLS) technology. Accurate MLS data can be used for detailed 3D modelling of the road surface and objects around it. The 3D models can be utilised, for example, in street planning and maintenance and noise modelling. Utility poles, traffic signs, and lamp posts can be considered an important part of road infrastructure. Poles and trees stand out from the environment and should be included in realistic 3D models. Detection of narrow vertical objects, such as poles and tree trunks, from MLS data was studied. MLS produces huge amounts of data and, therefore, processing methods should be as automatic as possible and for the methods to be practical, the algorithms should run in an acceptable time. The automatic pole detection method tested in this study is based on first finding point clusters that are good candidates for poles and then separating poles and tree trunks from other clusters using features calculated from the clusters and by applying a mask that acts as a model of a pole. The method achieved detection rates of 77.7% and 69.7% in the field tests while 81.0% and 86.5% of the detected targets were correct. Pole-like targets that were surrounded by other objects, such as tree trunks that were inside branches, were the most difficult to detect. Most of the false detections came from wall structures, which could be corrected in further processing.
Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)
NASA Technical Reports Server (NTRS)
Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)
2015-01-01
Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.
Method for detecting point mutations in DNA utilizing fluorescence energy transfer
Parkhurst, Lawrence J.; Parkhurst, Kay M.; Middendorf, Lyle
2001-01-01
A method for detecting point mutations in DNA using a fluorescently labeled oligomeric probe and Förster resonance energy transfer (FRET) is disclosed. The selected probe is initially labeled at each end with a fluorescent dye, the two dyes acting together as a donor/acceptor pair for FRET. The fluorescence emission from the dyes changes dramatically from the duplex stage, wherein the probe is hybridized to the complementary strand of DNA, to the single-strand stage, when the probe is melted off and detached from the DNA. The change in fluorescence is caused by the dyes coming into closer proximity after melting occurs and the probe becomes detached from the DNA strand. The change in fluorescence emission as a function of temperature is used to calculate the melting temperature of the complex, or Tm. In the case where there is a base mismatch between the probe and the DNA strand, indicating a point mutation, the Tm has been found to be significantly lower than the Tm for a perfectly matched probe/strand duplex. The present invention allows the existence and magnitude of this Tm depression to be detected, which allows for the quick and accurate detection of a point mutation in the DNA strand and, in some applications, the determination of the approximate location of the mutation within the sequence.
Lost in Virtual Reality: Pathfinding Algorithms Detect Rock Fractures and Contacts in Point Clouds
NASA Astrophysics Data System (ADS)
Thiele, S.; Grose, L.; Micklethwaite, S.
2016-12-01
UAV-based photogrammetric and LiDAR techniques provide high resolution 3D point clouds and ortho-rectified photomontages that can capture surface geology in outstanding detail over wide areas. Automated and semi-automated methods are vital to extract full value from these data in practical time periods, though the nuances of geological structures and materials (natural variability in colour and geometry, soft and hard linkage, shadows and multiscale properties) make this a challenging task. We present a novel method for computer assisted trace detection in dense point clouds, using a lowest cost path solver to "follow" fracture traces and lithological contacts between user defined end points. This is achieved by defining a local neighbourhood network where each point in the cloud is linked to its neighbours, and then using a least-cost path algorithm to search this network and estimate the trace of the fracture or contact. A variety of different algorithms can then be applied to calculate the best fit plane, produce a fracture network, or map properties such as roughness, curvature and fracture intensity. Our prototype of this method (Fig. 1) suggests the technique is feasible and remarkably good at following traces under non-optimal conditions such as variable shadow, partial occlusion and complex fracturing. Furthermore, if a fracture is initially mapped incorrectly, the user can easily provide further guidance by defining intermediate waypoints. Future development will include optimization of the algorithm to perform well on large point clouds and modifications that permit the detection of features such as step-overs. We also plan on implementing this approach in an interactive graphical user environment.
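The neighbourhood-network-plus-least-cost-path idea above can be sketched as a Dijkstra search over a k-nearest-neighbour graph. This is a toy reconstruction, not the authors' code: the brute-force kNN, the per-point cost weighting, and all names are assumptions.

```python
import heapq
import numpy as np

def trace_least_cost_path(points, point_cost, start, end, k=6):
    """Dijkstra between two user-picked end points over a kNN graph of
    the cloud. Edge cost = Euclidean step length scaled by a per-point
    cost (e.g. low cost on dark, fracture-like points, so the cheapest
    path "follows" the trace)."""
    n = len(points)
    dist = np.full(n, np.inf)
    prev = np.full(n, -1, dtype=int)
    dist[start] = 0.0
    heap = [(0.0, start)]
    done = np.zeros(n, dtype=bool)
    while heap:
        d, u = heapq.heappop(heap)
        if done[u]:
            continue
        done[u] = True
        if u == end:
            break
        # brute-force kNN; a k-d tree would replace this on real clouds
        steps = np.linalg.norm(points - points[u], axis=1)
        for v in np.argsort(steps)[1:k + 1]:
            nd = d + steps[v] * point_cost[v]
            if nd < dist[v]:
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path = [end]
    while path[-1] != start:
        path.append(int(prev[path[-1]]))
    return path[::-1]
```

Intermediate waypoints, as described in the abstract, amount to running this solver piecewise between consecutive user-defined points.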
Method for oil pipeline leak detection based on distributed fiber optic technology
NASA Astrophysics Data System (ADS)
Chen, Huabo; Tu, Yaqing; Luo, Ting
1998-08-01
Pipeline leak detection remains a difficult problem. Traditional leak detection methods suffer from high false alarm and missed-detection rates and poor location accuracy. To address these problems, a method for oil pipeline leak detection based on a distributed optical fiber sensor with a special coating is presented. The fiber's coating interacts with hydrocarbon molecules in oil, which alters the refractive index of the coating and thereby modifies the light-guiding properties of the fiber. Thus the pipeline leak location can be determined by OTDR. An oil pipeline leak detection system is designed based on this principle. The system offers real-time operation, simultaneous multi-point detection, and high location accuracy. Finally, some factors that may influence detection are analyzed and preliminary improvements are proposed.
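OTDR locates a loss event from the round-trip travel time of light along the fibre. A minimal sketch of that conversion (function name and the typical silica group index of ~1.468 are assumptions, not values from the paper):

```python
def otdr_event_distance(round_trip_time_s, group_index=1.468,
                        c=2.99792458e8):
    """Distance along the fibre to a detected loss event from the OTDR
    round-trip time: light covers the distance twice, at c/group_index."""
    return c * round_trip_time_s / (2.0 * group_index)
```

In the leak-detection setting, the coating-induced loss shows up as a step in the OTDR backscatter trace, and this conversion maps its time position to a position along the pipeline.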
Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning
2012-01-01
In light of the low recognition efficiency, high false alarm rates and poor localization accuracy of traditional pipeline security detection technology, this paper proposes a hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, the original monitoring signals are processed with wavelet transforms to extract single-mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and the characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate the initial recognition results into final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point's position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method effectively improves the accuracy of leak point localization and reduces both the missed-detection and false alarm rates.
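The weighted-average time-difference-of-arrival step can be sketched for the simplest geometry, sensors at the two ends of a pipe section. The weighting scheme and names here are assumptions; the paper's actual weights are not specified in the abstract:

```python
def leak_position(L, v, dt_pairs):
    """Weighted-average TDOA estimate for sensors at the ends of a pipe
    of length L with acoustic propagation speed v. Each pair is
    (dt, weight) with dt = tA - tB; a leak at distance x from sensor A
    gives dt = (2x - L) / v, hence x = (L + v*dt) / 2."""
    estimates = [((L + v * dt) / 2.0, w) for dt, w in dt_pairs]
    total_w = sum(w for _, w in estimates)
    return sum(x * w for x, w in estimates) / total_w
```

Averaging several dt measurements with confidence weights damps the effect of noisy arrival-time picks.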
Jang, Hyunjung; Kim, Jihyun; Choi, Jae-jin; Son, Yeojin; Park, Heekyung
2010-01-01
The detection of antiviral-resistant hepatitis B virus (HBV) mutations is important for monitoring the response to treatment and for effective treatment decisions. We have developed an array using peptide nucleic acid (PNA) probes to detect point mutations in HBV associated with antiviral resistance. PNA probes were designed to detect mutations associated with resistance to lamivudine, adefovir, and entecavir. The PNA array assay was sensitive enough to detect 10^2 copies/ml. The PNA array assay was able to detect mutants present in more than 5% of the virus population when the total HBV DNA concentration was greater than 10^4 copies/ml. We analyzed a total of 68 clinical samples by this assay and validated its usefulness by comparing results to those of the sequencing method. The PNA array correctly identified viral mutants and has high concordance (98.3%) with direct sequencing in detecting antiviral-resistant mutations. Our results showed that the PNA array is a rapid, sensitive, and easily applicable assay for the detection of antiviral-resistant mutation in HBV. Thus, the PNA array is a useful and powerful diagnostic tool for the detection of point mutations or polymorphisms. PMID:20573874
Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds.
Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun
2016-06-17
Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, the isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between trajectory data and road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and the Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment and dimensionality feature-based refinement. The performance of the proposed method is evaluated on three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method.
A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road markings extraction method proposed in this paper provides a promising alternative for offline road markings extraction from MLS data.
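The intensity-smoothing and edge-pairing steps described above can be sketched on a 1-D scan-line intensity profile. This is a simplified stand-in for the paper's dynamic-window median filter and EDEC method (fixed window width, fixed rise threshold, and all names are assumptions):

```python
import numpy as np

def median_smooth(intensity, w=5):
    """Median-filter a per-scan-line intensity profile to suppress
    intensity noise (the paper uses a dynamic window; a fixed width
    is used here for illustration)."""
    pad = w // 2
    padded = np.pad(np.asarray(intensity, float), pad, mode="edge")
    return np.array([np.median(padded[i:i + w])
                     for i in range(len(intensity))])

def marking_segments(intensity, rise=30.0):
    """Pair rising and falling intensity edges into candidate marking
    runs: markings are retro-reflective, so they appear as high-intensity
    plateaus bracketed by a sharp rise and a sharp fall."""
    s = median_smooth(intensity)
    d = np.diff(s)
    rises = np.where(d > rise)[0]
    falls = np.where(d < -rise)[0]
    segments = []
    for r in rises:
        later = falls[falls > r]
        if len(later):
            segments.append((int(r) + 1, int(later[0])))
    return segments
```

Running this per scan line and then grouping segments across lines approximates the segment-based refinement stage.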
Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds†
Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun
2016-01-01
Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, the isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between trajectory data and road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and the Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment and dimensionality feature-based refinement. The performance of the proposed method is evaluated on three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method.
A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road markings extraction method proposed in this paper provides a promising alternative for offline road markings extraction from MLS data. PMID:27322279
Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri
2018-04-01
Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing, leads to inaccurate diagnoses, and delays or misdirects medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers translate what type of enzymatic assay testing can be done in real time to determine if there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-STAT® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-STAT® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-STAT®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.
Klingbeil, Brian T; Willig, Michael R
2015-01-01
Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers, in particular, were less frequently identified from ARUs than from point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons.
Importantly, both methods provide similar estimates of species richness and composition for the region. Consequently, if single visits to sites or short-term monitoring are the goal, point counts will likely perform better than ARUs, especially if species are rare or vocalize infrequently. However, if seasonal or annual monitoring of sites is the goal, ARUs offer a viable alternative to standard point-count methods, especially in the context of large-scale or long-term monitoring of temperate forest birds.
NASA Astrophysics Data System (ADS)
Nieuwoudt, Michel K.; Holroyd, Steve E.; McGoverin, Cushla M.; Simpson, M. Cather; Williams, David E.
2017-02-01
Point-of-care diagnostics are of interest in the medical, security and food industries, the latter particularly for screening food adulterated for economic gain. Milk adulteration continues to be a major problem worldwide and different methods to detect fraudulent additives have been investigated for over a century. Laboratory based methods are limited in their application to point-of-collection diagnosis and also require expensive instrumentation, chemicals and skilled technicians. This has encouraged exploration of spectroscopic methods as more rapid and inexpensive alternatives. Raman spectroscopy has excellent potential for screening of milk because of the rich complexity inherent in its signals. The rapid advances in photonic technologies and fabrication methods are enabling increasingly sensitive portable mini-Raman systems to be placed on the market that are both affordable and feasible for both point-of-care and point-of-collection applications. We have developed a powerful spectroscopic method for rapidly screening liquid milk for sucrose and four nitrogen-rich adulterants (dicyandiamide (DCD), ammonium sulphate, melamine, urea), using a combined system: a small, portable Raman spectrometer with focusing fibre optic probe and optimized reflective focusing wells, simply fabricated in aluminium. The reliable sample presentation of this system enabled high reproducibility of 8% RSD (relative standard deviation) within four minutes. Limits of detection for the PLS calibrations ranged from 140 to 520 ppm for the four N-rich compounds and from 0.7 to 3.6% for sucrose. The portability of the system and the reliability and reproducibility of this technique open opportunities for general, reagentless adulteration screening of biological fluids as well as milk, at point-of-collection.
Barroso, Teresa G; Martins, Rui C; Fernandes, Elisabete; Cardoso, Susana; Rivas, José; Freitas, Paulo P
2018-02-15
Tuberculosis is one of the major public health concerns. This highly contagious disease affects more than 10.4 million people, being a leading cause of morbidity by infection. Tuberculosis is diagnosed at the point-of-care by the Ziehl-Neelsen sputum smear microscopy test. Ziehl-Neelsen is laborious, prone to human error and infection risk, with a limit of detection of 10^4 cells/mL. In resource-poor nations, a more practical test, with a lower detection limit, is paramount. This work uses a magnetoresistive biosensor to detect BCG bacteria for tuberculosis diagnosis. Herein we report: i) nanoparticle assembly method and specificity for tuberculosis detection; ii) demonstration of proportionality between BCG cell concentration and magnetoresistive voltage signal; iii) application of multiplicative signal correction for systematic effects removal; iv) investigation of calibration effectiveness using chemometrics methods; and v) comparison with state-of-the-art point-of-care tuberculosis biosensors. Results present a clear correspondence between voltage signal and cell concentration. Multiplicative signal correction removes baseline shifts within and between biochip sensors, allowing accurate and precise voltage signals between different biochips. The corrected signal was used for multivariate regression models, which significantly decreased the calibration standard error from 0.50 to 0.03 log10(cells/mL). Results show that Ziehl-Neelsen detection limits and below are achievable with the magnetoresistive biochip when pre-processing and chemometrics are used.
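Multiplicative signal correction, the pre-processing step named above, has a standard form: regress each measured signal on a reference and strip the fitted offset and scale. A minimal sketch (the mean-signal reference and function name are conventional choices, not details from the paper):

```python
import numpy as np

def multiplicative_signal_correction(spectra, reference=None):
    """Classic MSC: fit x ≈ intercept + slope*ref for each signal row,
    then remove the offset and scale. This is one common way to strip
    baseline shifts within and between sensors."""
    X = np.asarray(spectra, float)
    ref = X.mean(axis=0) if reference is None else np.asarray(reference, float)
    corrected = np.empty_like(X)
    for i, x in enumerate(X):
        slope, intercept = np.polyfit(ref, x, 1)
        corrected[i] = (x - intercept) / slope
    return corrected
```

After correction, signals that differ only by a multiplicative gain and an additive baseline collapse onto the reference, which is what makes the downstream regression calibration transfer between biochips.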
Privacy Protection Versus Cluster Detection in Spatial Epidemiology
Olson, Karen L.; Grannis, Shaun J.; Mandl, Kenneth D.
2006-01-01
Objectives. Patient data that include precise locations can reveal patients’ identities, whereas data aggregated into administrative regions may preserve privacy and confidentiality. We investigated the effect of varying degrees of address precision (exact latitude and longitude vs the center points of zip codes or census tracts) on the detection of spatial clusters of cases. Methods. We simulated disease outbreaks by adding supplementary spatially clustered emergency department visits to authentic hospital emergency department syndromic surveillance data. We identified clusters with a spatial scan statistic and evaluated detection rate and accuracy. Results. More clusters were identified, and clusters were more accurately detected, when exact locations were used: detected clusters contained at least half of the simulated points and involved few additional emergency department visits. These results were especially apparent when the synthetic clustered points crossed administrative boundaries and fell into multiple zip codes or census tracts. Conclusions. The spatial cluster detection algorithm performed better when addresses were analyzed as exact locations than when they were analyzed as center points of zip codes or census tracts, particularly when the clustered points crossed administrative boundaries. Use of precise addresses offers improved performance, but this practice must be weighed against privacy concerns in the establishment of public health data exchange policies. PMID:17018828
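The spatial scan statistic evaluates many candidate circles and scores each by a likelihood ratio. A sketch of the per-circle score under the usual Poisson model (this is the textbook Kulldorff form, assumed here; the study's exact configuration is not given in the abstract):

```python
import math

def scan_llr(c, n, C, N):
    """Poisson log-likelihood ratio for one candidate circle: c of the
    C total cases fall inside a window containing n of the N at-risk
    points, so the expected in-window count is e = C*n/N. Returns 0
    when the window is not elevated above expectation."""
    e = C * n / N
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))
```

The detected cluster is the circle maximizing this score, with significance assessed by Monte Carlo replication; coarsening addresses to zip-code or tract centroids moves points between candidate circles, which is exactly how aggregation degrades detection.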
A fast and automatic mosaic method for high-resolution satellite images
NASA Astrophysics Data System (ADS)
Chen, Hongshun; He, Hui; Xiao, Hongyu; Huang, Jing
2015-12-01
We propose a fast and fully automatic mosaic method for high-resolution satellite images. First, the overlapped rectangle is computed from the geographical locations of the reference and mosaic images, and feature points are extracted from both images by the scale-invariant feature transform (SIFT) algorithm only within the overlapped region. Then, the RANSAC method is used to match the feature points of both images. Finally, the two images are fused into a seamless panoramic image by simple linear weighted fusion or another method. The proposed method is implemented in C++ based on OpenCV and GDAL, and tested on WorldView-2 multispectral images with a spatial resolution of 2 meters. Results show that the proposed method can detect feature points efficiently and mosaic images automatically.
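The first step, restricting feature extraction to the overlapped rectangle, is a plain extent intersection once both images are georeferenced. A minimal sketch (the (minx, miny, maxx, maxy) tuple convention is an assumption; the paper works from GDAL geotransforms):

```python
def overlap_rectangle(extent_a, extent_b):
    """Intersection of two georeferenced extents (minx, miny, maxx, maxy),
    i.e. the only region worth feeding to SIFT; returns None when the
    images do not overlap."""
    minx = max(extent_a[0], extent_b[0])
    miny = max(extent_a[1], extent_b[1])
    maxx = min(extent_a[2], extent_b[2])
    maxy = min(extent_a[3], extent_b[3])
    if minx >= maxx or miny >= maxy:
        return None
    return (minx, miny, maxx, maxy)
```

Limiting SIFT to this window is what makes the method fast: feature detection cost scales with the overlap area rather than the full scene.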
Molecular analyses of two bacterial sampling methods in ligature-induced periodontitis in rats.
Fontana, Carla Raquel; Grecco, Clovis; Bagnato, Vanderlei Salvador; de Freitas, Laura Marise; Boussios, Constantinos I; Soukos, Nikolaos S
2018-02-01
The prevalence profile of periodontal pathogens in dental plaque can vary as a function of the detection method; however, the sampling technique may also play a role in determining dental plaque microbial profiles. We sought to determine the bacterial composition obtained with two sampling methods, one well established and one newly proposed here. In this study, a ligature-induced periodontitis model was used in 30 rats. Twenty-seven days later, the ligatures were removed and microbiological samples were obtained directly from the ligatures as well as from the periodontal pockets using absorbent paper points. Microbial analysis was performed using DNA probes to a panel of 40 periodontal species in the checkerboard assay. The bacterial composition patterns were similar for both sampling methods. However, detection levels for all species were markedly higher for ligatures than for paper points. Ligature samples yielded higher bacterial counts than paper points, suggesting that the technique used to induce periodontitis could also be applied for sampling in rats. Our findings may be helpful in designing studies of induced periodontal disease-associated microbiota.
Liu, Zhenbang; Ng, Junxiang; Yuwono, Arianto; Lu, Yadong; Tan, Yung Khan
2017-01-01
ABSTRACT Purpose: To compare the staining intensity of the upper urinary tract (UUT) urothelium among three UUT delivery methods in an in vivo porcine model. Materials and methods: A fluorescent dye solution (indigo carmine) was delivered to the UUT via three different methods: antegrade perfusion, vesico-ureteral reflux via in-dwelling ureteric stent and retrograde perfusion via a 5F open-ended ureteral catheter. Twelve renal units were tested with 4 in each method. After a 2-hour delivery time, the renal-ureter units were harvested en bloc. Time from harvesting to analysis was also standardised to be 2 hours in each arm. Three urothelium samples of the same weight and size were taken from each of the 6 pre-defined points (upper pole, mid pole, lower pole, renal pelvis, mid ureter and distal ureter) and the amount of fluorescence was measured with a spectrometer. Results: The mean fluorescence detected at all 6 predefined points of the UUT urothelium was the highest for the retrograde method. This was statistically significant (p < 0.05) at all 6 points. Conclusions: Retrograde infusion of the UUT by an open-ended ureteral catheter resulted in the highest mean fluorescence detected at all 6 pre-defined points of the UUT urothelium compared to antegrade infusion and vesico-ureteral reflux via indwelling ureteric stents, indicating that the retrograde method is ideal for topical therapy throughout the UUT urothelium. More clinical studies are needed to demonstrate whether the retrograde method leads to better clinical outcomes than the other two methods. PMID:29039888
Congruence analysis of point clouds from unstable stereo image sequences
NASA Astrophysics Data System (ADS)
Jepping, C.; Bethmann, F.; Luhmann, T.
2014-06-01
This paper deals with the correction of exterior orientation parameters of stereo image sequences over deformed free-form surfaces without control points. Such an imaging situation can occur, for example, during photogrammetric car crash test recordings where onboard high-speed stereo cameras are used to measure 3D surfaces. As a result of such measurements, 3D point clouds of deformed surfaces are generated for a complete stereo sequence. The first objective of this research focuses on the development and investigation of methods for the detection of corresponding spatial and temporal tie points within the stereo image sequences (by stereo image matching and 3D point tracking) that are robust enough for a reliable handling of occlusions and other disturbances that may occur. The second objective of this research is the analysis of object deformations in order to detect stable areas (congruence analysis). For this purpose a RANSAC-based method for congruence analysis has been developed. This process is based on the sequential transformation of randomly selected point groups from one epoch to another by using a 3D similarity transformation. The paper gives a detailed description of the congruence analysis. The approach has been tested successfully on synthetic and real image data.
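The inner step of the RANSAC congruence analysis, fitting a 3D similarity transformation to a randomly selected point group, has a well-known closed form (Umeyama's method). A sketch under that assumption; the paper does not state which estimator it uses:

```python
import numpy as np

def similarity_transform(A, B):
    """Closed-form least-squares 3D similarity (scale s, rotation R,
    translation t) mapping the point rows of A onto B, per Umeyama:
    the kind of fit applied to each randomly drawn point group when
    transforming it from one epoch to another."""
    muA, muB = A.mean(axis=0), B.mean(axis=0)
    Ac, Bc = A - muA, B - muB
    U, S, Vt = np.linalg.svd(Bc.T @ Ac / len(A))
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1.0  # guard against reflections
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / Ac.var(axis=0).sum()
    t = muB - s * R @ muA
    return s, R, t
```

Points whose residuals under the fitted transform stay small across epochs belong to a congruent (stable) area; large residuals indicate deformation.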
Section-Based Tree Species Identification Using Airborne LIDAR Point Cloud
NASA Astrophysics Data System (ADS)
Yao, C.; Zhang, X.; Liu, H.
2017-09-01
The application of LiDAR data in forestry initially focused on mapping forest communities, primarily for large-scale forest management and planning. With smaller-footprint, higher-sampling-density LiDAR data available, detecting individual overstory trees, estimating crown parameters and identifying tree species have been demonstrated to be practicable. This paper proposes a section-based protocol for tree species identification, taking the palm tree as an example. The section-based method detects objects through profiles along different directions, basically along the X-axis or Y-axis, and improves the utilization of spatial information to generate accurate results. Firstly, tree points are separated from man-made-object points by decision-tree-based rules, and a Crown Height Model (CHM) is created by subtracting the Digital Terrain Model (DTM) from the Digital Surface Model (DSM). Then key points are calculated and extracted to locate individual trees, and specific tree parameters related to species information are estimated, such as crown height, crown radius, and cross point. Finally, with these parameters certain tree species can be identified. Compared to species information measured on the ground, the proportion of correctly identified trees across all plots reached up to 90.65%. The identification results in this research demonstrate the ability to distinguish palm trees using LiDAR point clouds. Furthermore, with more prior knowledge, the section-based method enables the process to classify trees into different classes.
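The CHM construction and key-point extraction steps above can be sketched on rasterized surfaces. The strict-local-maximum seeding is a common convention assumed here, not necessarily the paper's exact key-point rule:

```python
import numpy as np

def crown_height_model(dsm, dtm):
    """CHM = DSM - DTM, clipped at zero so there are no negative
    canopy heights."""
    return np.clip(np.asarray(dsm, float) - np.asarray(dtm, float),
                   0.0, None)

def tree_top_key_points(chm, radius=1):
    """Candidate tree tops as local maxima of the CHM, a common way to
    seed individual-tree detection before profile analysis."""
    peaks = []
    h, w = chm.shape
    for i in range(h):
        for j in range(w):
            win = chm[max(0, i - radius):i + radius + 1,
                      max(0, j - radius):j + radius + 1]
            if chm[i, j] > 0 and chm[i, j] == win.max():
                peaks.append((i, j))
    return peaks
```

Section profiles along X or Y can then be cut through each key point to measure crown height and radius for the species rules.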
NASA Technical Reports Server (NTRS)
Lo, C. F.; Wu, K.; Whitehead, B. A.
1993-01-01
Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in SSME turbopump vibration. Anomalies are detected based on the amplitudes of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods proved feasible for detecting vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank, and is applicable for on-line operation. The neural network method likewise needs enough data to train the networks. The testing procedure can be utilized at any time, so long as the characteristics of the components remain unchanged.
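The statistical-bank idea can be sketched as a per-peak amplitude distribution built from nominal runs, with new runs flagged by deviation. The three-sigma rule and all names are illustrative assumptions, not the report's actual thresholds:

```python
import numpy as np

def flag_peak_anomalies(peak_amps, nominal_runs, k=3.0):
    """Build a per-peak amplitude distribution from nominal runs (rows =
    runs, columns = tracked PSD peaks) and flag any new peak amplitude
    further than k standard deviations from its historical mean."""
    hist = np.asarray(nominal_runs, float)
    mu = hist.mean(axis=0)
    sd = hist.std(axis=0, ddof=1)
    return np.abs(np.asarray(peak_amps, float) - mu) > k * sd
```

Because the test is per run, it fits the on-line use the report describes: each newly measured spectrum is checked against the bank as it arrives.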
Time Series UAV Image-Based Point Clouds for Landslide Progression Evaluation Applications
Moussa, Adel; El-Sheimy, Naser; Habib, Ayman
2017-01-01
Landslides are major and constantly changing threats to urban landscapes and infrastructure. It is essential to detect and capture landslide changes regularly. Traditional methods for monitoring landslides are time-consuming, costly, dangerous, and the quality and quantity of the data is sometimes unable to meet the necessary requirements of geotechnical projects. This motivates the development of more automatic and efficient remote sensing approaches for landslide progression evaluation. Automatic change detection involving low-altitude unmanned aerial vehicle image-based point clouds, although proven, is relatively unexplored, and little research has been done in terms of accounting for volumetric changes. In this study, a methodology for automatically deriving change displacement rates, in a horizontal direction based on comparisons between extracted landslide scarps from multiple time periods, has been developed. Compared with the iterative closest projected point (ICPP) registration method, the developed method takes full advantage of automated geometric measuring, leading to fast processing. The proposed approach easily processes a large number of images from different epochs and enables the creation of registered image-based point clouds without the use of extensive ground control point information or further processing such as interpretation and image correlation. The produced results are promising for use in the field of landslide research. PMID:29057847
Time Series UAV Image-Based Point Clouds for Landslide Progression Evaluation Applications.
Al-Rawabdeh, Abdulla; Moussa, Adel; Foroutan, Marzieh; El-Sheimy, Naser; Habib, Ayman
2017-10-18
Landslides are major and constantly changing threats to urban landscapes and infrastructure. It is essential to detect and capture landslide changes regularly. Traditional methods for monitoring landslides are time-consuming, costly, dangerous, and the quality and quantity of the data is sometimes unable to meet the necessary requirements of geotechnical projects. This motivates the development of more automatic and efficient remote sensing approaches for landslide progression evaluation. Automatic change detection involving low-altitude unmanned aerial vehicle image-based point clouds, although proven, is relatively unexplored, and little research has been done in terms of accounting for volumetric changes. In this study, a methodology for automatically deriving change displacement rates, in a horizontal direction based on comparisons between extracted landslide scarps from multiple time periods, has been developed. Compared with the iterative closest projected point (ICPP) registration method, the developed method takes full advantage of automated geometric measuring, leading to fast processing. The proposed approach easily processes a large number of images from different epochs and enables the creation of registered image-based point clouds without the use of extensive ground control point information or further processing such as interpretation and image correlation. The produced results are promising for use in the field of landslide research.
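The scarp-comparison step, deriving horizontal displacement rates from scarps extracted at two epochs, can be sketched with a nearest-neighbour distance. This is a crude stand-in for the paper's geometric measuring, under the assumption that scarps are available as 2-D vertex arrays:

```python
import numpy as np

def scarp_displacement_rates(scarp_t0, scarp_t1, dt_years):
    """Horizontal displacement rate per epoch-0 scarp vertex: the
    nearest-neighbour distance to the epoch-1 scarp divided by the
    elapsed time between the two surveys."""
    t0 = np.asarray(scarp_t0, float)
    t1 = np.asarray(scarp_t1, float)
    d = np.array([np.min(np.linalg.norm(t1 - p, axis=1)) for p in t0])
    return d / dt_years
```

Nearest-neighbour distance underestimates motion that is oblique to the scarp; the paper's comparison against ICPP registration addresses exactly this kind of correspondence problem.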
a Weighted Closed-Form Solution for Rgb-D Data Registration
NASA Astrophysics Data System (ADS)
Vestena, K. M.; Dos Santos, D. R.; Oilveira, E. M., Jr.; Pavan, N. L.; Khoshelham, K.
2016-06-01
Existing 3D indoor mapping approaches for RGB-D data are predominantly point-based and feature-based methods. In most cases the iterative closest point (ICP) algorithm and its variants are used for the pairwise registration process. Considering that the ICP algorithm requires a relatively accurate initial transformation and high overlap, a weighted closed-form solution for RGB-D data registration is proposed. In this solution, we weight and normalize the 3D points based on the theoretical random errors, and dual-number quaternions are used to represent the 3D rigid body motion. Dual-number quaternions provide a closed-form solution by minimizing a cost function. The most important advantage of the closed-form solution is that it provides the optimal transformation in one step: it does not need good initial estimates and markedly decreases the demand for computing resources in contrast to iterative methods. Our method first exploits RGB information. We employ the scale-invariant feature transform (SIFT) for extracting, detecting, and matching features; it detects and describes local features that are invariant to scaling and rotation. To detect and filter outliers, we use the random sample consensus (RANSAC) algorithm jointly with a statistical dispersion measure, the interquartile range (IQR). Afterwards, a new RGB-D loop-closure solution is implemented based on the volumetric information between pairs of point clouds and the dispersion of the random errors. Loop closure consists of recognizing when the sensor revisits some region. Finally, a globally consistent map is created to minimize the registration errors via graph-based optimization. The effectiveness of the proposed method is demonstrated with a Kinect dataset. The experimental results show that the proposed method can properly map an indoor environment with an absolute accuracy of around 1.5% of the trajectory length.
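The IQR-based outlier filter mentioned above is the standard Tukey-fence rule applied to match residuals. A minimal sketch (the k = 1.5 fence and the function name are conventional assumptions, not values stated in the abstract):

```python
import numpy as np

def iqr_inliers(residuals, k=1.5):
    """Tukey-fence inlier mask: keep matches whose residual lies within
    k interquartile ranges of the first/third quartiles. Used alongside
    RANSAC to discard gross SIFT mismatches before the closed-form fit."""
    r = np.asarray(residuals, float)
    q1, q3 = np.percentile(r, [25, 75])
    iqr = q3 - q1
    return (r >= q1 - k * iqr) & (r <= q3 + k * iqr)
```

Unlike a fixed threshold, the fence adapts to the spread of the residuals, which suits the error-weighted registration the paper describes.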
DOT National Transportation Integrated Search
2017-01-01
The traditional vehicle detection method that has been used by the Texas Department of Transportation (TxDOT) on high-speed signalized intersection approaches for many years involved multiple detection points, with inductive loops being the early fav...
Potential mapping with charged-particle beams
NASA Technical Reports Server (NTRS)
Robinson, J. W.; Tillery, D. G.
1979-01-01
Experimental methods of mapping the equipotential surfaces near some structure of interest rely on the detection of charged particles which have traversed the regions of interest and are detected remotely. One method is the measurement of ion energies for ions created at a point of interest and expelled from the region by the fields. The ion energy at the detector in eV corresponds to the potential where the ion was created. An ionizing beam forms the ions from background neutrals. The other method is to inject charged particles into the region of interest and to locate their exit points. A set of several trajectories becomes a data base for a systematic mapping technique. An iterative solution of a boundary value problem establishes concepts and limitations pertaining to the mapping problem.
2012-01-01
Background: Myocardial ischemia can develop into more serious disease. Detecting the ischemic syndrome in the electrocardiogram (ECG) early, accurately, and automatically can prevent it from developing into a catastrophic disease. To this end, we propose a new method, which employs wavelets and simple feature selection. Methods: For training and testing, the European ST-T database is used, which comprises 367 ischemic ST episodes in 90 records. We first remove baseline wander and detect the time positions of QRS complexes by a method based on the discrete wavelet transform. Next, for each heartbeat, we extract three features that can differentiate ST episodes from normal beats: 1) the area between the QRS offset and T-peak points, 2) the normalized and signed sum from the QRS offset to the effective zero-voltage point, and 3) the slope from QRS onset to offset. We average the feature values over five successive beats to reduce the effect of outliers. Finally, we apply classifiers to those features. Results: We evaluated the algorithm with kernel density estimation (KDE) and support vector machine (SVM) classifiers. Sensitivity and specificity for KDE were 0.939 and 0.912, respectively; the KDE classifier detects 349 of the 367 ischemic ST episodes. Sensitivity and specificity for SVM were 0.941 and 0.923, respectively; the SVM classifier detects 355 ischemic ST episodes. Conclusions: We proposed a new method for detecting ischemia in ECG. It combines signal-processing techniques, removing baseline wander and detecting the time positions of QRS complexes by the discrete wavelet transform, with explicit feature extraction from the morphology of ECG waveforms. The selected features were shown to be sufficient to discriminate ischemic ST episodes from normal ones.
We also showed how the proposed KDE classifier can automatically select kernel bandwidths, meaning that the algorithm does not require any numerical values of the parameters to be supplied in advance. In the case of the SVM classifier, one has to select a single parameter. PMID:22703641
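The five-beat averaging step can be sketched as below; the single-feature toy input and the sliding-window handling are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def smooth_beat_features(features, window=5):
    """Average per-beat feature vectors over `window` successive beats
    to damp outliers, as described for the three ST-episode features.
    `features` is an (n_beats, n_features) array."""
    f = np.asarray(features, dtype=float)
    n_out = f.shape[0] - window + 1
    return np.array([f[i:i + window].mean(axis=0) for i in range(n_out)])

# six beats of a single feature, with one outlier beat in the middle
beats = np.array([[1.0], [1.0], [1.0], [1.0], [6.0], [1.0]])
smoothed = smooth_beat_features(beats)
```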
Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud
NASA Astrophysics Data System (ADS)
Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.
2018-04-01
To address the lack of an applicable analysis method when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on point-cloud normal vectors is proposed. First, a kd-tree is used to establish the topological relations. Datum points are detected by tracking the point-cloud normal vectors, which are determined from the normal vectors of local planes. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of each radial point are calculated from the fitted curve, and the deformation information is analysed. The proposed approach was verified on a real large-scale tank dataset captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain complete information about the monitored object quickly and comprehensively, and accurately reflects the deformation of the datum features.
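Estimating normals from local planes over kd-tree neighbourhoods, as in the first two steps, is commonly done with a PCA of each point's neighbours; this sketch assumes `k = 8` neighbours and SciPy's `cKDTree`, neither of which is specified in the abstract:

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=8):
    """Estimate a unit normal per point as the smallest-eigenvalue
    eigenvector of the covariance of its k nearest neighbours, i.e.
    the normal of the best-fitting local plane."""
    pts = np.asarray(points, dtype=float)
    tree = cKDTree(pts)
    normals = np.empty_like(pts)
    for i, p in enumerate(pts):
        _, idx = tree.query(p, k=k)
        nbrs = pts[idx] - pts[idx].mean(axis=0)
        # last right-singular vector of the centred neighbours = plane normal
        _, _, vt = np.linalg.svd(nbrs)
        normals[i] = vt[-1]
    return normals

# points on the z = 0 plane: every normal should be ±(0, 0, 1)
rng = np.random.default_rng(0)
plane = np.c_[rng.random((20, 2)), np.zeros(20)]
n = estimate_normals(plane)
```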
SU-E-J-237: Image Feature Based DRR and Portal Image Registration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, X; Chang, J
Purpose: Two-dimensional (2D) matching of kV X-ray and digitally reconstructed radiography (DRR) images is an important setup technique for image-guided radiotherapy (IGRT). In our clinics, mutual-information-based methods are used for this purpose on commercial linear accelerators, but they often need manual corrections. This work demonstrates the feasibility of using feature-based image transforms to register kV and DRR images. Methods: The scale-invariant feature transform (SIFT) method was implemented to detect matching image details (key points) between the kV and DRR images. These key points represent high image-intensity gradients and thus scale-invariant features. Because of the poor contrast of our kV images, direct application of the SIFT method yielded many detection errors. To assist the finding of key points, the center coordinates of the kV and DRR images were read from the DICOM headers, and the two groups of key points with similar positions relative to their corresponding centers were paired up. Using these points, a rigid transform (with scaling and horizontal and vertical shifts) was estimated. We also artificially introduced vertical and horizontal shifts to test the accuracy of our registration method on anterior-posterior (AP) and lateral pelvic images. Results: The method provided a satisfactory overlay of the transformed kV image onto the DRR image. The introduced versus detected shifts were fit with linear regression. In the AP image experiments, the regression showed slopes of 1.15 and 0.98 with R2 of 0.89 and 0.99 for the horizontal and vertical shifts, respectively; for the lateral images, the slopes were 1.2 and 1.3 with R2 of 0.72 and 0.82. Conclusion: This work provides an alternative technique for kV-to-DRR alignment. Further improvements in estimation accuracy and tolerance of low image contrast are underway.
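The rigid transform with scaling plus horizontal and vertical shifts can be estimated from paired key points by least squares; this sketch uses hypothetical point pairs and assumes a single shared scale, details the abstract does not spell out:

```python
import numpy as np

def fit_scale_shift(kv_pts, drr_pts):
    """Least-squares fit of the scaling-plus-shift model used for
    kV-to-DRR alignment: drr ≈ s * kv + (tx, ty).  Returns (s, tx, ty)."""
    kv = np.asarray(kv_pts, float)
    drr = np.asarray(drr_pts, float)
    # one shared scale, independent x/y shifts -> linear system A [s,tx,ty] = b
    n = len(kv)
    A = np.zeros((2 * n, 3))
    A[0::2, 0] = kv[:, 0]; A[0::2, 1] = 1.0
    A[1::2, 0] = kv[:, 1]; A[1::2, 2] = 1.0
    b = drr.ravel()
    (s, tx, ty), *_ = np.linalg.lstsq(A, b, rcond=None)
    return s, tx, ty

# hypothetical matched key points: scale 2, shift (5, 7)
kv = [(0, 0), (10, 0), (0, 10), (10, 10)]
drr = [(5, 7), (25, 7), (5, 27), (25, 27)]
s, tx, ty = fit_scale_shift(kv, drr)
```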
A Novel Field Deployable Point-of-Care Diagnostic Test for Cutaneous Leishmaniasis
2015-10-01
include localized cutaneous leishmaniasis (LCL), and destructive nasal and oropharyngeal lesions of mucosal leishmaniasis (ML). LCL in the New World ... the high costs, personnel training and need of sophisticated equipment. Therefore, novel methods to detect leishmaniasis at the POC are urgently needed ... To date, there is no field-standardized molecular method based on DNA amplification coupled with Lateral Flow reading to detect leishmaniasis
Wu, Jun; Yu, Zhijing; Wang, Tao; Zhuge, Jingchang; Ji, Yue; Xue, Bin
2017-06-01
Airplane wing deformation is an important element of aerodynamic characteristics, structure design, and fatigue analysis for aircraft manufacturing, as well as a main test content of certification regarding flutter for airplanes. This paper presents a novel real-time detection method for wing deformation and flight flutter detection by using three-dimensional speckle image correlation technology. Speckle patterns whose positions are determined through the vibration characteristic of the aircraft are coated on the wing; then the speckle patterns are imaged by CCD cameras which are mounted inside the aircraft cabin. In order to reduce the computation, a matching technique based on Geodetic Systems Incorporated coded points combined with the classical epipolar constraint is proposed, and a displacement vector map for the aircraft wing can be obtained through comparing the coordinates of speckle points before and after deformation. Finally, verification experiments containing static and dynamic tests by using an aircraft wing model demonstrate the accuracy and effectiveness of the proposed method.
Electronic method for autofluorography of macromolecules on two-D matrices
Davidson, Jackson B.; Case, Arthur L.
1983-01-01
A method for detecting, localizing, and quantifying macromolecules contained in a two-dimensional matrix is provided which employs a television-based position sensitive detection system. A molecule-containing matrix may be produced by conventional means to produce spots of light at the molecule locations which are detected by the television system. The matrix, such as a gel matrix, is exposed to an electronic camera system including an image-intensifier and secondary electron conduction camera capable of light integrating times of many minutes. A light image stored in the form of a charge image on the camera tube target is scanned by conventional television techniques, digitized, and stored in a digital memory. Intensity of any point on the image may be determined from the number at the memory address of the point. The entire image may be displayed on a television monitor for inspection and photographing or individual spots may be analyzed through selected readout of the memory locations. Compared to conventional film exposure methods, the exposure time may be reduced 100-1000 times.
3-Dimensional Reconstruction of the ROSETTA Targets - Application to Asteroid 2867 Steins
NASA Astrophysics Data System (ADS)
Besse, Sebastien; Groussin, O.; Jorda, L.; Lamy, P.; OSIRIS Team
2008-09-01
The OSIRIS imaging experiment aboard the Rosetta spacecraft will image asteroid Steins in September 2008, asteroid Lutetia in 2010, and comet 67P/Churyumov-Gerasimenko in 2014. An accurate determination of the shape is a key point for the success of the mission operations and scientific objectives. Based on the experience of previous space missions (Deep Impact, NEAR, Galileo, Hayabusa), we are developing our own procedure for the shape reconstruction of small bodies. We use two different techniques: i) limb and terminator constraints and ii) ground control point (GCP) constraints. The first method allows the determination of a rough shape of the body when it is poorly resolved and no features are visible on the surface, while the second method provides an accurate shape model using high-resolution images. We are currently testing both methods on simulated data, using and developing different algorithms for limb and terminator extraction (e.g., wavelets), detection of points of interest (Harris, SUSAN, FAST corner detection), point pairing using correlation techniques (geometric model) and 3-dimensional reconstruction using line-of-sight information (photogrammetry). Both methods will be fully automated. We will hopefully present the 3D reconstruction of the Steins asteroid from images obtained during its flyby. Acknowledgment: Sébastien Besse acknowledges CNES and Thales for funding.
Infrared small target detection based on directional zero-crossing measure
NASA Astrophysics Data System (ADS)
Zhang, Xiangyue; Ding, Qinghai; Luo, Haibo; Hui, Bin; Chang, Zheng; Zhang, Junchao
2017-12-01
Infrared small target detection under complex background and low signal-to-clutter ratio (SCR) conditions is of great significance to the development of precision guidance and infrared surveillance. In order to detect targets precisely and extract them from intricate clutter effectively, a detection method based on a zero-crossing saliency (ZCS) map is proposed. The original map is first decomposed into different first-order directional derivative (FODD) maps by using FODD filters. Then the ZCS map is obtained by fusing all directional zero-crossing points. At last, an adaptive threshold is adopted to segment targets from the ZCS map. Experimental results on a series of images show that our method is effective and robust for detection under complex backgrounds. Moreover, compared with five other state-of-the-art methods, our method achieves better performance in terms of detection rate, SCR gain and background suppression factor.
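A toy version of fusing directional zero-crossings into a saliency map is sketched below; the paper's FODD filters are not reproduced here, so plain finite differences along four directions stand in for them, and the adaptive thresholding step is omitted:

```python
import numpy as np

def zero_crossing_saliency(img):
    """Toy ZCS map: take first-order differences along four directions,
    mark pixels where each directional derivative changes sign, and
    fuse by summing the per-direction zero-crossing indicators."""
    img = np.asarray(img, float)
    shifts = [(0, 1), (1, 0), (1, 1), (1, -1)]   # E, S, SE, SW directions
    zcs = np.zeros(img.shape)
    for dy, dx in shifts:
        d = np.roll(img, (-dy, -dx), axis=(0, 1)) - img   # directional difference
        # a zero crossing is where consecutive derivatives have opposite signs
        sign_change = (d * np.roll(d, (dy, dx), axis=(0, 1))) < 0
        zcs += sign_change
    return zcs

# a single bright point target on a dark background
img = np.zeros((7, 7))
img[3, 3] = 1.0
zcs = zero_crossing_saliency(img)
```

The point target produces a sign change in every direction, so it stands out in the fused map.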
Perception-based road hazard identification with Internet support.
Tarko, Andrew P; DeSalle, Brian R
2003-01-01
One of the most important tasks faced by highway agencies is identifying road hazards. Agencies use crash statistics to detect road intersections and segments where the frequency of crashes is excessive. With the crash-based method, a dangerous intersection or segment can be pointed out only after a sufficient number of crashes occur. A more proactive method is needed, and motorist complaints may be able to assist agencies in detecting road hazards before crashes occur. This paper investigates the quality of safety information reported by motorists and the effectiveness of hazard identification based on motorist reports, which were collected with an experimental Internet website. It demonstrates that the intersections pointed out by motorists tended to have more crashes than other intersections. The safety information collected through the website was comparable to 2-3 months of crash data. It was concluded that although the Internet-based method could not substitute for the traditional crash-based methods, its joint use with crash statistics might be useful in detecting new hazards where crash data had been collected for a short time.
Threshold-adaptive canny operator based on cross-zero points
NASA Astrophysics Data System (ADS)
Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu
2018-03-01
Canny edge detection [1] is a technique to extract useful structural information from vision data while dramatically reducing the amount of data to be processed, and it has been widely applied in computer vision systems. Two thresholds have to be set before edges are separated from the background; usually, two static values chosen from developer experience are used as the thresholds [2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analysed, and an interpolation function is derived to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.
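The paper's interpolation function over cross-zero points is not reproduced in the abstract, so the sketch below shows the widely used median-based heuristic instead, purely as an illustration of replacing static Canny thresholds with image-adaptive ones:

```python
import numpy as np

def auto_canny_thresholds(img, sigma=0.33):
    """Adaptive low/high Canny thresholds from the image median
    (a common stand-in heuristic, not the paper's cross-zero method).
    Assumes 8-bit intensity values in [0, 255]."""
    m = float(np.median(np.asarray(img, dtype=float)))
    low = max(0.0, (1.0 - sigma) * m)
    high = min(255.0, (1.0 + sigma) * m)
    return low, high

# a uniform mid-grey image gives thresholds symmetric about its median
low, high = auto_canny_thresholds(np.full((4, 4), 100.0))
```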
Liu, Zhenbang; Ng, Junxiang; Yuwono, Arianto; Lu, Yadong; Tan, Yung Khan
2017-01-01
To compare the staining intensity of the upper urinary tract (UUT) urothelium among three UUT delivery methods in an in vivo porcine model, a fluorescent dye solution (indigo carmine) was delivered to the UUT via three different methods: antegrade perfusion, vesico-ureteral reflux via an indwelling ureteric stent, and retrograde perfusion via a 5F open-ended ureteral catheter. Twelve renal units were tested, with four in each method. After a 2-hour delivery time, the renal-ureter units were harvested en bloc; the time from harvesting to analysis was also standardised to 2 hours in each arm. Three urothelium samples of the same weight and size were taken from each of six pre-defined points (upper pole, mid pole, lower pole, renal pelvis, mid ureter and distal ureter), and the amount of fluorescence was measured with a spectrometer. The mean fluorescence detected at all six pre-defined points of the UUT urothelium was highest for the retrograde method; this was statistically significant, with p-values less than 0.05 at all six points. Retrograde infusion of the UUT via an open-ended ureteral catheter resulted in the highest mean fluorescence at all six pre-defined points compared with antegrade infusion and vesico-ureteral reflux via indwelling ureteric stents, indicating that the retrograde method is ideal for topical therapy throughout the UUT urothelium. More clinical studies are needed to demonstrate whether the retrograde method leads to better clinical outcomes than the other two methods. Copyright by the International Brazilian Journal of Urology.
Characterizing Sorghum Panicles using 3D Point Clouds
NASA Astrophysics Data System (ADS)
Lonesome, M.; Popescu, S. C.; Horne, D. W.; Pugh, N. A.; Rooney, W.
2017-12-01
To address the demands of population growth and the impacts of global climate change, plant breeders must increase crop yield through genetic improvement. However, plant phenotyping, the characterization of a plant's physical attributes, remains a primary bottleneck in modern crop improvement programs. 3D point clouds generated from terrestrial laser scanning (TLS) and unmanned aerial system (UAS) based structure from motion (SfM) are a promising data source to increase the efficiency of screening plant material in breeding programs. This study develops and evaluates methods for characterizing sorghum (Sorghum bicolor) panicles (heads) in field plots from both TLS and UAS-based SfM point clouds. The TLS point cloud over an experimental sorghum field at the Texas A&M farm in Burleson County, TX, was collected using a FARO Focus X330 3D laser scanner. The SfM point cloud was generated from UAS imagery captured with a Phantom 3 Professional UAS at 10 m altitude and 85% image overlap. The panicle detection method applies point-cloud reflectance, height, and point-density attributes characteristic of sorghum panicles to detect them and estimate their dimensions (panicle length and width) through image classification and clustering procedures. We compare the derived panicle counts and sizes with field-based and manually digitized measurements in selected plots and study the strengths and limitations of each data source for sorghum panicle characterization.
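The classification-and-clustering step can be caricatured on a rasterised height grid; the height threshold and toy grid below are invented for illustration, and the real method also uses reflectance and point-density attributes:

```python
import numpy as np
from scipy import ndimage

def panicle_stats(height_grid, min_height=1.0):
    """Toy panicle detection: threshold a rasterised canopy-height grid
    at panicle height, label connected clusters, and return the cluster
    count plus each cluster's bounding-box length/width in cells."""
    mask = np.asarray(height_grid, float) >= min_height
    labels, count = ndimage.label(mask)
    dims = [((s[0].stop - s[0].start), (s[1].stop - s[1].start))
            for s in ndimage.find_objects(labels)]
    return count, dims

# synthetic 10 x 10 height grid with two "panicles" above 1 m
grid = np.zeros((10, 10))
grid[1:3, 1:4] = 1.5   # panicle A: 2 x 3 cells
grid[6:9, 6:8] = 1.2   # panicle B: 3 x 2 cells
count, dims = panicle_stats(grid)
```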
NASA Astrophysics Data System (ADS)
Zhang, Bin; Qian, Yao; Wu, Yuntian; Yang, Y. B.
2018-04-01
To advance the technique of indirect measurement, the contact-point response of a moving test vehicle is adopted for the damage detection of bridges. First, the contact-point response of the vehicle moving over the bridge is derived both analytically and in central-difference form (for field use). Then, the instantaneous amplitude squared (IAS) of the driving component of the contact-point response is calculated by the Hilbert transform, making use of its narrow-band feature. The IAS peaks serve as the key parameter for damage detection. In the numerical simulation, a damage (crack) is modeled by a hinge-spring unit. The feasibility of the proposed method for detecting the location and severity of single or multiple damages of the bridge is verified. Also, the effects of surface roughness, vehicle speed, measurement noise and random traffic are studied. In the presence of ongoing traffic, the damages of the bridge are identified from the repeated or invariant IAS peaks generated for different traffic flows by the same test vehicle over the bridge.
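Computing the IAS of a narrow-band component with the Hilbert transform can be sketched as follows; the unit test tone is illustrative only, not a vehicle response:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_amplitude_squared(x):
    """IAS of a (narrow-band) signal: the squared magnitude of the
    analytic signal obtained via the Hilbert transform.  Peaks in the
    IAS are what the method uses to localise damage."""
    return np.abs(hilbert(np.asarray(x, dtype=float))) ** 2

# a unit-amplitude tone with a whole number of cycles gives IAS ~= 1
t = np.linspace(0, 1, 1000, endpoint=False)
ias = instantaneous_amplitude_squared(np.sin(2 * np.pi * 50 * t))
```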
Systems and Methods for Automated Water Detection Using Visible Sensors
NASA Technical Reports Server (NTRS)
Rankin, Arturo L. (Inventor); Matthies, Larry H. (Inventor); Bellutta, Paolo (Inventor)
2016-01-01
Systems and methods are disclosed that include automated machine vision that can utilize images of scenes captured by a 3D imaging system configured to image light within the visible light spectrum to detect water. One embodiment includes autonomously detecting water bodies within a scene including capturing at least one 3D image of a scene using a sensor system configured to detect visible light and to measure distance from points within the scene to the sensor system, and detecting water within the scene using a processor configured to detect regions within each of the at least one 3D images that possess at least one characteristic indicative of the presence of water.
Fast intersection detection algorithm for PC-based robot off-line programming
NASA Astrophysics Data System (ADS)
Fedrowitz, Christian H.
1994-11-01
This paper presents a method for fast and reliable collision detection in complex production cells. The algorithm is part of the PC-based robot off-line programming system of the University of Siegen (Ropsus). The method is based on a solid model which is managed by a simplified constructive solid geometry model (CSG model). The collision detection problem is divided into two steps. In the first step, the complexity of the problem is reduced in linear time. In the second step, the remaining solids are tested for intersection. For this, the Simplex algorithm, known from linear optimization, is used: it computes a point common to two convex polyhedra, and the polyhedra intersect if and only if such a point exists. For the simplified geometrical model of Ropsus, this step also runs in linear time, so the resulting collision detection algorithm requires linear time overall. Moreover, it computes the resulting intersection polyhedron using the dual transformation.
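The Simplex-based intersection test can be sketched as a linear-programming feasibility problem over the stacked half-space constraints of two convex polyhedra; SciPy's `linprog` stands in here for a hand-rolled Simplex implementation:

```python
import numpy as np
from scipy.optimize import linprog

def polyhedra_intersect(A1, b1, A2, b2):
    """Test whether two convex polyhedra {x : A x <= b} share a point by
    solving one LP over the stacked constraints.  Returns
    (intersects, witness_point_or_None)."""
    A = np.vstack([A1, A2])
    b = np.concatenate([b1, b2])
    n = A.shape[1]
    # any feasible point will do, so minimise the zero objective
    res = linprog(c=np.zeros(n), A_ub=A, b_ub=b, bounds=[(None, None)] * n)
    return res.success, (res.x if res.success else None)

# axis-aligned square [lo, hi]^2 as four half-space constraints
box = lambda lo, hi: (np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], float),
                      np.array([hi, -lo, hi, -lo], float))
A1, b1 = box(0.0, 1.0)
A2, b2 = box(0.5, 1.5)
hit, pt = polyhedra_intersect(A1, b1, A2, b2)     # overlapping boxes
A3, b3 = box(2.0, 3.0)
miss, _ = polyhedra_intersect(A1, b1, A3, b3)     # disjoint boxes
```

Any witness point returned for the overlapping pair must lie in the shared region [0.5, 1]^2.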
NASA Technical Reports Server (NTRS)
Cramer, Alexander Krishnan
2014-01-01
This work covers the design and test of a machine vision algorithm for generating high- accuracy pitch and yaw pointing solutions relative to the sun on a high altitude balloon. It describes how images were constructed by focusing an image of the sun onto a plate printed with a pattern of small cross-shaped fiducial markers. Images of this plate taken with an off-the-shelf camera were processed to determine relative position of the balloon payload to the sun. The algorithm is broken into four problems: circle detection, fiducial detection, fiducial identification, and image registration. Circle detection is handled by an "Average Intersection" method, fiducial detection by a matched filter approach, and identification with an ad-hoc method based on the spacing between fiducials. Performance is verified on real test data where possible, but otherwise uses artificially generated data. Pointing knowledge is ultimately verified to meet the 20 arcsecond requirement.
Detection of genetically modified organisms in foods by DNA amplification techniques.
García-Cañas, Virginia; Cifuentes, Alejandro; González, Ramón
2004-01-01
In this article, the different DNA amplification techniques being used to detect genetically modified organisms (GMOs) in foods are examined. This study provides an updated overview (including works published until June 2002) of the principal applications of such techniques, together with their main advantages and drawbacks for GMO detection in foods. Some relevant facts on sampling, DNA isolation, and DNA amplification methods are discussed. Moreover, these analytical protocols are discussed from a quantitative point of view, including the newest investigations on multiplex detection of GMOs in foods and validation of methods.
Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.
Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2016-01-01
This paper proposes a pseudo-haptic feedback method conveying simulated soft-surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft-surface deformation and control of the indenter avatar speed, to convey stiffness information of a simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. Hard-nodule detection experiments compared this system with a tablet computer incorporating vibration feedback, using (a) nodule detection sensitivity and (b) elapsed time as performance indicators. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction, and multi-point pseudo-haptic feedback performs similarly well to the vibration-based feedback method on both measures. This shows that the proposed method can be used to convey detailed haptic information, even subtle information, for virtual environment tasks using either a computer mouse or a pressure-sensitive device as the input device. The pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as occur, for example, in ever more realistic video games with increasing emphasis on interaction with the physical environment, and in minimally invasive surgery in the form of soft-tissue organs with embedded cancer nodules.
Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, either using desktop computers or portable devices, showing reasonably high fidelity in conveying stiffness perception to the user.
Power-limited low-thrust trajectory optimization with operation point detection
NASA Astrophysics Data System (ADS)
Chi, Zhemin; Li, Haiyang; Jiang, Fanghua; Li, Junfeng
2018-06-01
The power-limited solar electric propulsion system is considered more practical in mission design. An accurate mathematical model of the propulsion system, based on experimental data of the power generation system, is used in this paper. An indirect method is used to deal with the time-optimal and fuel-optimal control problems, in which the solar electric propulsion system is described using a finite number of operation points, which are characterized by different pairs of thruster input power. In order to guarantee the integral accuracy for the discrete power-limited problem, a power operation detection technique is embedded in the fourth-order Runge-Kutta algorithm with fixed step. Moreover, the logarithmic homotopy method and normalization technique are employed to overcome the difficulties caused by using indirect methods. Three numerical simulations with actual propulsion systems are given to substantiate the feasibility and efficiency of the proposed method.
NASA Astrophysics Data System (ADS)
Zhang, Y. M.; Evans, J. R. G.; Yang, S. F.
2010-11-01
The authors have discovered a systematic, intelligent and potentially automatic method to detect errors in handbooks and stop their transmission using unrecognised relationships between materials properties. The scientific community relies on the veracity of scientific data in handbooks and databases, some of which have a long pedigree covering several decades. Although various outlier-detection procedures are employed to detect and, where appropriate, remove contaminated data, errors, which had not been discovered by established methods, were easily detected by our artificial neural network in tables of properties of the elements. We started using neural networks to discover unrecognised relationships between materials properties and quickly found that they were very good at finding inconsistencies in groups of data. They reveal variations from 10 to 900% in tables of property data for the elements and point out those that are most probably correct. Compared with the statistical method adopted by Ashby and co-workers [Proc. R. Soc. Lond. Ser. A 454 (1998) p. 1301, 1323], this method locates more inconsistencies and could be embedded in database software for automatic self-checking. We anticipate that our suggestion will be a starting point to deal with this basic problem that affects researchers in every field. The authors believe it may eventually moderate the current expectation that data field error rates will persist at between 1 and 5%.
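The consistency-checking idea can be sketched with a plain least-squares model in place of the paper's neural network: predict one property from another and flag entries whose residuals are outsized on a robust scale. The synthetic "handbook column" below is invented for illustration:

```python
import numpy as np

def flag_inconsistent(x, y, k=3.0):
    """Fit y from x with a least-squares line (a stand-in for the
    paper's neural network) and flag entries whose residual deviates
    from the median residual by more than k robust sigmas (MAD-based)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    coef = np.polyfit(x, y, 1)
    resid = y - np.polyval(coef, x)
    dev = np.abs(resid - np.median(resid))
    mad = np.median(dev)
    return dev > k * (1.4826 * mad + 1e-12)

# synthetic handbook column: y = 2x + 1 with one corrupted entry
x = np.arange(10, dtype=float)
y = 2 * x + 1
y[4] = 40.0   # transcription error (true value would be 9)
flags = flag_inconsistent(x, y)
```

The corrupted entry is the only one flagged, even though it also drags the fitted line toward itself.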
A Novel Molecular Test to Diagnose Canine Visceral Leishmaniasis at the Point of Care
Castellanos-Gonzalez, Alejandro; Saldarriaga, Omar A.; Tartaglino, Lilian; Gacek, Rosana; Temple, Elissa; Sparks, Hayley; Melby, Peter C.; Travi, Bruno L.
2015-01-01
Dogs are the principal reservoir hosts of zoonotic visceral leishmaniasis (VL) but current serological methods are not sensitive enough to detect all subclinically infected animals, which is crucial to VL control programs. Polymerase chain reaction (PCR) methods have greater sensitivity but require expensive equipment and trained personnel, impairing its implementation in endemic areas. We developed a diagnostic test that uses isothermal recombinase polymerase amplification (RPA) to detect Leishmania infantum. This method was coupled with lateral flow (LF) reading with the naked eye to be adapted as a point-of-care test. The L. infantum RPA-LF had an analytical sensitivity similar to real time-PCR, detecting DNA of 0.1 parasites spiked in dog blood, which was equivalent to 40 parasites/mL. There was no cross amplification with dog or human DNA or with Leishmania braziliensis, Leishmania amazonensis, or Trypanosoma cruzi. The test also amplified Leishmania donovani strains (N = 7). In a group of clinically normal dogs (N = 30), RPA-LF detected more subclinical infections than rK39 strip test, a standard serological method (50% versus 13.3% positivity, respectively; P = 0.005). Also, RPA-LF detected L. infantum in noninvasive mucosal samples of dogs with a sensitivity comparable to blood samples. This novel molecular test may have a positive impact in leishmaniasis control programs. PMID:26240156
Development of new structural health monitoring techniques
NASA Astrophysics Data System (ADS)
Fekrmandi, Hadi
During the past two decades, many researchers have developed methods for detecting structural defects at an early stage, to operate aerospace vehicles safely and to reduce operating costs. The Surface Response to Excitation (SuRE) method is one of these approaches, developed at FIU to reduce the cost and size of the equipment. The SuRE method excites the surface at a series of frequencies and monitors the propagation characteristics of the generated waves. The amplitude of the waves reaching any point on the surface varies with frequency, but remains consistent as long as the integrity and strain distribution of the part are consistent; these spectral characteristics change when cracks develop or the strain distribution changes. SHM methods may be used for many applications, from the detection of loose screws to the monitoring of manufacturing operations. A scanning laser vibrometer was used in this study to investigate the characteristics of the spectral changes at different points on the parts. The study started with detecting a load on a plate and estimating its location. Modifications to the part from manufacturing operations were then detected, and the Part-Based Manufacturing Process Performance Monitoring (PbPPM) method was developed. Hardware was prepared to demonstrate the feasibility of the proposed methods in real time. Data for the SuRE and PbPPM methods were collected successfully using low-cost piezoelectric elements and the non-contact scanning laser vibrometer. Locational force, loose bolts and material loss could easily be detected by comparing the spectral characteristics of the arriving waves. The on-line implementations used fast computational methods for estimating the spectrum and detected changing operational conditions from the sum of the squares of the variations. Neural networks classified the spectra when the desktop-DSP combination was used. The results demonstrated the feasibility of the SuRE and PbPPM methods.
Protocol for monitoring forest-nesting birds in National Park Service parks
Dawson, Deanna K.; Efford, Murray G.
2013-01-01
These documents detail the protocol for monitoring forest-nesting birds in National Park Service parks in the National Capital Region Network (NCRN). In the first year of sampling, counts of birds should be made at 384 points on the NCRN spatially randomized grid, developed to sample terrestrial resources. Sampling should begin on or about May 20 and continue into early July; on each day the sampling period begins at sunrise and ends five hours later. Each point should be counted twice, once in the first half of the field season and once in the second half, with visits made by different observers, balancing the within-season coverage of points and their spatial coverage by observers, and allowing observer differences to be tested. Three observers, skilled in identifying birds of the region by sight and sound and with previous experience in conducting timed counts of birds, will be needed for this effort. Observers should be randomly assigned to ‘routes’ consisting of eight points, in close proximity and, ideally, in similar habitat, that can be covered in one morning. Counts are 10 minutes in length, subdivided into four 2.5-min intervals. Within each time interval, new birds (i.e., those not already detected) are recorded as within or beyond 50 m of the point, based on where first detected. Binomial distance methods are used to calculate annual estimates of density for species. The data are also amenable to estimation of abundance and detection probability via the removal method. Generalized linear models can be used to assess between-year changes in density estimates or unadjusted count data. This level of sampling is expected to be sufficient to detect a 50% decline in 10 years for approximately 50 bird species, including 14 of 19 species that are priorities for conservation efforts, if analyses are based on unadjusted count data, and for 30 species (6 priority species) if analyses are based on density estimates. 
The estimates of required sample sizes are based on the mean number of individuals detected per 10 minutes in available data from surveys in three NCRN parks. Once network-wide data from the first year of sampling are available, this and other aspects of the protocol should be re-assessed, and changes made as desired or necessary before the start of the second field season. Thereafter, changes should not be made to the field methods, and sampling should be conducted annually for at least ten years. NCRN staff should keep apprised of new analytical methods developed for analysis of point-count data.
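The removal-method estimation mentioned above can be illustrated with a minimal two-occasion (Zippin-type) removal estimator, pooling the four 2.5-min intervals into two occasions. This is a simplified sketch, not the protocol's prescribed analysis, and the counts in the example are invented:

```python
def removal_estimate(c1, c2):
    """Two-occasion removal estimator: c1 and c2 are the pooled counts of
    new birds recorded in the first and second halves of a 10-min count.
    Returns (N_hat, p_hat), or None when c1 <= c2 (no decline in new
    detections, so the removal model's assumptions are violated)."""
    if c1 <= c2:
        return None
    p = (c1 - c2) / c1            # per-occasion detection probability
    n_hat = c1 * c1 / (c1 - c2)   # estimated birds present at the point
    return n_hat, p

# Hypothetical count: 8 new birds in the first two intervals, 2 in the last two.
est = removal_estimate(8, 2)
```

With more occasions a maximum-likelihood fit would normally be used; the closed form above is only the simplest instance of the idea.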
Sang-Mook Lee; A. Lynn Abbott; Neil A. Clark; Philip A. Araman
2003-01-01
Splines can be used to approximate noisy data with a few control points. This paper presents a new curve matching method for deformable shapes using two-dimensional splines. In contrast to the residual error criterion, which is based on the relative locations of corresponding knot points and is therefore reliable primarily for dense point sets, we use the deformation energy of...
Detecting and connecting agricultural ditches using LiDAR data
NASA Astrophysics Data System (ADS)
Roelens, Jennifer; Dondeyne, Stefaan; Van Orshoven, Jos; Diels, Jan
2017-04-01
High-resolution hydrological data are essential for spatially-targeted water resource management decisions and future modelling efforts. For Flanders, small water courses like agricultural ditches and their connection to the river network are incomplete in the official digital atlas. High-resolution LiDAR data offer the prospect of automated detection of ditches, but there is no established method or software to do so, nor to predict how these are connected to each other and to the wider hydrographic network. An aerial LiDAR database of at least 16 points per square meter, linked with simultaneously collected digital RGB aerial images, is available for Flanders. The potential of detecting agricultural ditches and their connectivity based on point LiDAR data was investigated in a 1.9 km2 study area located in the alluvial valley of the river Demer. The area consists of agricultural parcels and woodland with a ditch network of approximately 17 km. The entire network of open ditches, and the location of culverts, were mapped during a field survey to test the effectiveness of the proposed method. In the first step of the proposed method, the LiDAR point data were transformed into a raster DEM with a 1-m resolution to reduce the amount of data to be analyzed. This was done by interpolating the bare earth points using the nearest-neighbor method. In the next step, a morphological approach was used to detect a preliminary network, as traditional flow algorithms are not suitable for detecting small water courses in low-lying areas. This resulted in a preliminary classified raster image with ditch and non-ditch cells. After eliminating small details that are the result of background noise, the resulting classified raster image was vectorized to match the format of the digital watercourse network. As the vectorisation does not always adequately represent the shape of linear features, the results did not meet the high-quality cartographic needs.
The spatial accuracy of the derived ditches was improved by referring back to the original LiDAR point cloud data. Within a 1-m buffer of each vertex of the preliminarily detected network, the lowest LiDAR point was taken as the vertex of an improved network, shifting the preliminary network into the lowest 'depressions' of the study area. A drawback of a morphological approach is that the connectivity of the network is usually poor, since pixels are processed separately. Therefore, the field observations on connectivity (including culverts) were used to develop an empirical model that estimates the probability of connectivity between LiDAR-derived ditch segments from auxiliary datasets and from hydrological and morphological properties of the preliminary network. This allowed a connectivity probability map to be derived. The underlying model was tested by cross-validating the field observations on connectivity.
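The morphological detection step is not spelled out in the abstract; one common way to flag narrow depressions in a DEM is a grayscale "black-hat" transform (morphological closing minus the original surface), which the sketch below applies to a toy 5x5 DEM. The window size and depth threshold are assumptions for illustration, not the paper's parameters:

```python
def _filt(dem, op, size=3):
    """Apply a min/max filter with a size x size window (edge-clamped)."""
    r = size // 2
    h, w = len(dem), len(dem[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [dem[ii][jj]
                    for ii in range(max(0, i - r), min(h, i + r + 1))
                    for jj in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = op(vals)
    return out

def detect_ditches(dem, depth=0.3):
    """Black-hat transform: closing(DEM) - DEM. Cells where the closing
    fills in more than `depth` metres are classified as ditch cells."""
    closing = _filt(_filt(dem, max), min)   # dilation followed by erosion
    return [[closing[i][j] - dem[i][j] > depth for j in range(len(dem[0]))]
            for i in range(len(dem))]

# Toy DEM (metres): a flat field at 10 m crossed by a 0.5 m-deep ditch column.
dem = [[10.0, 10.0, 9.5, 10.0, 10.0] for _ in range(5)]
mask = detect_ditches(dem)
```

The closing fills the one-cell-wide ditch back up to the field level, so the difference image isolates exactly the ditch cells.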
Tenorio, Bruno Mendes; da Silva Filho, Eurípedes Alves; Neiva, Gentileza Santos Martins; da Silva, Valdemiro Amaro; Tenorio, Fernanda das Chagas Angelo Mendes; da Silva, Themis de Jesus; Silva, Emerson Carlos Soares E; Nogueira, Romildo de Albuquerque
2017-08-01
Shrimps can accumulate environmental toxicants and suffer behavioral changes. However, methods to quantitatively detect changes in the behavior of these shrimps are still needed. The present study aims to verify whether mathematical and fractal methods applied to video tracking can adequately describe changes in the locomotion behavior of shrimps exposed to low concentrations of toxic chemicals, such as 0.15 µg/L of the pesticide deltamethrin or 10 µg/L mercuric chloride. Results showed no change after 1 min, 4, 24, and 48 h of treatment. However, after 72 and 96 h of treatment, the linear methods describing the track length, mean speed, and mean distance from the current to the previous track point, as well as the non-linear methods of fractal dimension (box counting or information entropy) and multifractal analysis, were able to detect changes in the locomotion behavior of shrimps exposed to deltamethrin. Analysis of the angular parameters of the track-point vectors and lacunarity were not sensitive to those changes. None of the methods detected adverse effects of mercury exposure. These mathematical and fractal methods, implementable in software, represent low-cost, useful tools for the toxicological analysis of shrimps for food and water quality and the biomonitoring of ecosystems. Copyright © 2017 Elsevier Inc. All rights reserved.
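The box-counting dimension used in the study can be sketched as follows: cover the track with grids of increasing resolution, count occupied boxes at each scale, and take the slope of log N(s) against log s. The track below is synthetic; a straight-line path should come out close to dimension 1:

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a 2-D track.
    points: (x, y) tuples in [0, 1) x [0, 1); scales: boxes per axis."""
    logs, counts = [], []
    for s in scales:
        occupied = {(int(x * s), int(y * s)) for x, y in points}
        logs.append(math.log(s))
        counts.append(math.log(len(occupied)))
    # least-squares slope of log N(s) versus log s
    n = len(scales)
    mx, my = sum(logs) / n, sum(counts) / n
    return sum((lx - mx) * (ly - my) for lx, ly in zip(logs, counts)) / \
           sum((lx - mx) ** 2 for lx in logs)

# A straight diagonal track: its box-counting dimension should be ~1.
track = [(i / 1000.0, i / 1000.0) for i in range(1000)]
dim = box_counting_dimension(track)
```

A space-filling, erratic track would instead push the estimate toward 2, which is the kind of shift the study uses to quantify behavioral change.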
A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots
Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”
2016-01-01
This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540
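The distance-estimation step can be illustrated with a minimal pinhole ground-plane sketch: under the assumption behind inverse perspective mapping, namely that a pixel images a point on the floor, the floor distance follows from the pixel row alone. The camera height, pitch, and intrinsics below are invented values, not the paper's calibration:

```python
import math

def floor_distance(v, cam_height=0.3, pitch_deg=15.0, fy=500.0, cy=240.0):
    """Distance along the floor to the point imaged at pixel row v,
    assuming the pixel lies on the ground plane (the IPM assumption).
    The camera looks down at pitch_deg below horizontal; height in metres."""
    ray = math.radians(pitch_deg) + math.atan((v - cy) / fy)
    if ray <= 0:
        return float('inf')  # a ray at or above the horizon never hits the floor
    return cam_height / math.tan(ray)

d_near = floor_distance(470)  # bottom rows image floor close to the robot
d_far = floor_distance(260)   # rows near the principal point image distant floor
```

Pixels whose appearance is inconsistent with this ground-plane mapping between frames are the ones the paper's segmentation labels as obstacle.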
Gorresen, P. Marcos; Camp, Richard J.; Brinck, Kevin W.; Farmer, Chris
2012-01-01
Point-transect surveys indicated that millerbirds were more abundant than shown by the strip-transect method, and were estimated at 802 birds in 2010 (95% CI = 652-964) and 704 birds in 2011 (95% CI = 579-837). Point-transect surveys yielded population estimates with improved precision, which will permit trends to be detected in shorter time periods and with greater statistical power than is available from strip-transect survey methods. Mean finch population estimates and associated uncertainty were not markedly different among the three survey methods, but the performance of the models used to estimate density and population size is expected to improve as data from additional surveys are incorporated. Using the point-transect survey, the mean finch population size was estimated at 2,917 birds in 2010 (95% CI = 2,037-3,965) and 2,461 birds in 2011 (95% CI = 1,682-3,348). Preliminary testing of the line-transect method in 2011 showed that it would not generate sufficient detections to effectively model bird density and, consequently, relatively precise population size estimates. Both species were fairly evenly distributed across Nihoa and appear to occur in all or nearly all available habitat. The time expended and area traversed by observers were similar among survey methods; however, point-transect surveys do not require that observers walk a straight transect line, thereby allowing them to avoid culturally or biologically sensitive areas and minimize the adverse effects of recurrent travel to any particular area. In general, point-transect surveys detect more birds than strip-survey methods, improving precision in the resulting population size and trend estimates. The method is also better suited to the steep and uneven terrain of Nihoa.
Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill
2012-01-01
In video analytics, robust observation detection is very important as the content of the videos varies a lot, especially for tracking implementations. In contrast to the image processing field, the problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are commonly encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing both feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two variants of PBOD, deterministic and probabilistic, have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a two-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the probabilistic approach, patch matching is done by modelling the histograms of the patches with Poisson distributions for both the RGB and HSV colour models; maximum likelihood is then applied for position smoothing while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. Due to its heavy processing requirements, this algorithm is best implemented as a complement to other, simpler detection methods. PMID:23202226
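The Poisson histogram-matching idea behind the probabilistic variant can be sketched as scoring candidate patch histograms by their log-likelihood under independent Poisson bins whose rates come from the template histogram, then keeping the maximum-likelihood candidate. The bin counts below are invented, and the paper's full pipeline (position and size smoothing) is omitted:

```python
import math

def poisson_log_likelihood(observed, expected):
    """Log-likelihood of observed histogram bin counts under independent
    Poisson bins whose rates are the template's bin counts."""
    ll = 0.0
    for k, lam in zip(observed, expected):
        lam = max(lam, 1e-9)  # guard against empty template bins
        ll += k * math.log(lam) - lam - math.lgamma(k + 1)
    return ll

def best_patch(candidates, template):
    """Index of the candidate histogram most likely under the Poisson
    model fitted to the template histogram (maximum-likelihood match)."""
    return max(range(len(candidates)),
               key=lambda i: poisson_log_likelihood(candidates[i], template))

template = [40, 10, 5, 45]                   # reference patch histogram
cands = [[5, 40, 45, 10], [38, 12, 6, 44]]   # shuffled vs. near-identical patch
idx = best_patch(cands, template)
```

Because the likelihood uses only counts, it is relatively tolerant of the blur and illumination changes the paper targets, compared with pixel-wise correlation.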
TaqMan based real time PCR assay targeting EML4-ALK fusion transcripts in NSCLC.
Robesova, Blanka; Bajerova, Monika; Liskova, Kvetoslava; Skrickova, Jana; Tomiskova, Marcela; Pospisilova, Sarka; Mayer, Jiri; Dvorakova, Dana
2014-07-01
Lung cancer with the ALK rearrangement constitutes only a small fraction of patients with non-small cell lung cancer (NSCLC). However, in the era of molecular-targeted therapy, efficient patient selection is crucial for successful treatment. In this context, an effective method for EML4-ALK detection is necessary. We developed a new highly sensitive variant-specific TaqMan based real time PCR assay applicable to RNA from formalin-fixed paraffin-embedded tissue (FFPE). This assay was used to analyze the EML4-ALK gene in 96 non-selected NSCLC specimens and compared with two other methods (end-point PCR and break-apart FISH). EML4-ALK was detected in 33/96 (34%) specimens using variant-specific real time PCR, but in only 23/96 (24%) using end-point PCR. All real time PCR positive samples were confirmed with direct sequencing. A total of 46 specimens were subsequently analyzed by all three detection methods. Using variant-specific real time PCR we identified the EML4-ALK transcript in 17/46 (37%) specimens, using end-point PCR in 13/46 (28%) specimens, and positive ALK rearrangement by FISH was detected in 8/46 (17.4%) specimens. Moreover, using variant-specific real time PCR, 5 specimens showed more than one EML4-ALK variant simultaneously (in 2 cases the variants 1+3a+3b, in 2 specimens the variants 1+3a and in 1 specimen the variant 1+3b). In one of the 96 cases, both the EML4-ALK fusion gene and an EGFR mutation were detected. All simultaneous genetic variants were confirmed using end-point PCR and direct sequencing. Our variant-specific real time PCR assay is highly sensitive, fast, financially acceptable, applicable to FFPE and seems to be a valuable tool for the rapid prescreening of NSCLC patients in clinical practice, so that most patients who could benefit from targeted therapy can be identified. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Minimizing Higgs potentials via numerical polynomial homotopy continuation
NASA Astrophysics Data System (ADS)
Maniatis, M.; Mehta, D.
2012-08-01
The study of models with extended Higgs sectors requires minimizing the corresponding Higgs potentials, which is in general very difficult. Here, we apply a recently developed method, called numerical polynomial homotopy continuation (NPHC), which is guaranteed to find all the stationary points of Higgs potentials with polynomial-like non-linearity. The detection of all stationary points reveals the structure of the potential, with maxima, metastable minima, and saddle points besides the global minimum. We apply the NPHC method to the most general Higgs potential having two complex Higgs-boson doublets and up to five real Higgs-boson singlets. Moreover, the method is applicable to even more involved potentials. Hence the NPHC method allows one to go far beyond the limits of the Gröbner basis approach.
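The continuation idea can be sketched in one variable: deform a start system with known roots into the target polynomial, correcting each tracked root with Newton's method along the way. The real NPHC method solves multivariate stationarity systems; the quadratic target below is a toy stand-in, and the complex constant is an arbitrary choice of the standard "gamma trick" that keeps solution paths from colliding:

```python
def track_root(x, p, dp, gamma=0.6 + 0.8j, steps=100, newton_iters=6):
    """Track one root of H(x,t) = (1-t)*gamma*(x**2 - 1) + t*p(x) from the
    start system x**2 - 1 (t=0) to the target polynomial p (t=1)."""
    for s in range(1, steps + 1):
        t = s / steps
        for _ in range(newton_iters):   # Newton corrector at this t
            h = (1 - t) * gamma * (x * x - 1) + t * p(x)
            dh = (1 - t) * gamma * 2 * x + t * dp(x)
            x -= h / dh
    return x

# Toy target standing in for a stationarity condition: roots at 1 and 2.
p = lambda x: x * x - 3 * x + 2
dp = lambda x: 2 * x - 3
roots = sorted(track_root(x0, p, dp).real for x0 in (-1 + 0j, 1 + 0j))
```

Tracking one path per start root is what yields *all* stationary points, which is the property the abstract emphasizes over local minimizers.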
Experimental results for the rapid determination of the freezing point of fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1984-01-01
Two methods for the rapid determination of the freezing point of fuels were investigated: an optical method, which detected the change in light transmission from the disappearance of solid particles in the melted fuel; and a differential thermal analysis (DTA) method, which sensed the latent heat of fusion. A laboratory apparatus was fabricated to test the two methods. Cooling was done by thermoelectric modules using an ice-water bath as a heat sink. The DTA method was later modified to eliminate the reference fuel. The data from the sample were digitized and a point of inflection, which corresponds to the ASTM D-2386 freezing point (final melting point), was identified from the derivative. The apparatus was modified to cool the fuel to -60 C and controls were added for maintaining constant cooling rate, rewarming rate, and hold time at minimum temperature. A parametric series of tests was run for twelve fuels with freezing points from -10 C to -50 C, varying cooling rate, rewarming rate, and hold time. Based on the results, an optimum test procedure was established. The results showed good agreement with ASTM D-2386 freezing point and differential scanning calorimetry results.
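The derivative-based identification of the inflection point can be sketched numerically: on a digitized rewarming trace, the sample where the slope changes most (largest discrete second derivative) marks the end of melting. The trace below is synthetic with an exaggerated slope change; the real apparatus works on noisy digitized data and would smooth first:

```python
def freezing_point(times, temps):
    """Locate the inflection in a rewarming curve as the sample with the
    largest change in slope (discrete second derivative), corresponding
    to the final melting point in the ASTM D-2386 sense."""
    d2 = [temps[i + 1] - 2 * temps[i] + temps[i - 1]
          for i in range(1, len(temps) - 1)]
    i = max(range(len(d2)), key=lambda k: abs(d2[k])) + 1
    return times[i], temps[i]

# Synthetic trace: slow warming while solids absorb latent heat, then faster.
times = list(range(11))
temps = [-40.0 + 0.1 * t if t <= 5 else -39.5 + 0.5 * (t - 5) for t in times]
t_fp, temp_fp = freezing_point(times, temps)
```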
[Development of residual voltage testing equipment].
Zeng, Xiaohui; Wu, Mingjun; Cao, Li; He, Jinyi; Deng, Zhensheng
2014-07-01
Existing residual voltage measurement methods cannot switch the power off exactly at the voltage peak while simultaneously displaying waveforms, so a new residual voltage detection method is put forward in this paper. First, the zero point of the power supply is detected with a zero-cross detection circuit and input to a single-chip microcomputer in the form of a pulse signal. Second, after a delay from the zero point to the voltage peak, the single-chip microcomputer sends a control signal to switch off the relay. Finally, the waveform of the residual voltage is displayed on a host computer or oscilloscope. The experimental results show that the device designed in this paper can switch the power off at the voltage peak, accurately displays the voltage waveform immediately after power-off, and achieves a standard deviation of the residual voltage of less than 0.2 V from one second after power-off onward.
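The zero-cross-then-delay logic can be sketched in software: find an upward zero crossing, estimate the period from two successive crossings, and schedule the switch-off a quarter period later, at the next positive peak. The actual device does this in a single-chip microcomputer; the waveform below is a synthetic 50 Hz mains signal:

```python
import math

def peak_cutoff_time(samples, dt):
    """Locate the first upward zero crossing of a sampled mains waveform
    and return the switch-off instant: the crossing time plus a quarter
    period, i.e. the next positive voltage peak."""
    ups = [i for i in range(1, len(samples))
           if samples[i - 1] < 0 <= samples[i]]
    period = (ups[1] - ups[0]) * dt   # from two successive upward crossings
    return ups[0] * dt + period / 4.0

# 50 Hz sine sampled at 10 kHz, starting mid-cycle at phase -0.3 rad.
dt = 1e-4
wave = [math.sin(2 * math.pi * 50 * i * dt - 0.3) for i in range(400)]
t_off = peak_cutoff_time(wave, dt)
```

Estimating the period from the signal itself, rather than assuming a nominal 50/60 Hz, keeps the peak timing correct under small frequency drift.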
A High Speed Finger-Print Optical Scanning Method
2000-01-01
Among biometrics technologies for authentication, from the viewpoint of convenience and higher security, dactyloscopy is by far the best, much better than the... Among sensing technologies using static capacitance, thermal or optical detection, optical detection has by far the most potential to meet the... at the present time, due to the low resolution inherent in the thermal imaging technique. Besides, this method is easily influenced by environmental
Woldegebriel, Michael; Derks, Eduard
2017-01-17
In this work, a novel probabilistic untargeted feature detection algorithm for liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS) using an artificial neural network (ANN) is presented. The feature detection process is approached as a pattern recognition problem, and thus an ANN was utilized as an efficient feature recognition tool. Unlike most existing feature detection algorithms, with this approach any suspected chromatographic profile (i.e., peak shape) can easily be incorporated by training the network, avoiding the need to perform computationally expensive regression with specific mathematical models. In addition, we have shown that with this method the high-resolution raw data can be fully utilized without applying any arbitrary thresholds or data reduction, thereby improving the sensitivity of the method for compound identification purposes. Furthermore, as opposed to existing deterministic (binary) approaches, this method estimates the probability of a feature being present or absent at a given point of interest, giving all data points a chance to be propagated down the data analysis pipeline, weighted by their probability. The algorithm was tested on data sets generated from spiked samples in forensic and food safety contexts and showed promising results, detecting features for all compounds in a computationally reasonable time.
NASA Astrophysics Data System (ADS)
Liu, Hongxing; Xing, Da; Zhou, Xiaoming
2014-09-01
Food-borne pathogens such as Listeria monocytogenes have been recognized as a major cause of human infections worldwide, leading to substantial health problems. Food-borne pathogen identification needs to be simpler, cheaper and more reliable than the current traditional methods. Here, we have constructed a low-cost paper biosensor for the detection of viable pathogenic bacteria with the naked eye. In this study, an effective isothermal amplification method was used to amplify the hlyA mRNA gene, a specific RNA marker in Listeria monocytogenes. The amplification products were applied to the paper biosensor to perform a visual test, in which endpoint detection was performed using sandwich hybridization assays. When the RNA products migrated along the paper biosensor by capillary action, the gold nanoparticles accumulated at the designated Test line and Control line. Under optimized experimental conditions, as little as 0.5 pg/μL genomic RNA from Listeria monocytogenes could be detected. The whole assay process, including RNA extraction, amplification, and visualization, can be completed within several hours. The developed method is suitable for point-of-care applications to detect food-borne pathogens, as it can effectively overcome the false-positive results caused by amplifying nonviable Listeria monocytogenes.
NASA Astrophysics Data System (ADS)
Zhou, Anran; Xie, Weixin; Pei, Jihong
2018-06-01
Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders and carries richer spatial-frequency information. By combining it with high-order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined so that the ship components and the background are easier to separate. Second, a new high-order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while the points outside the interval reflect the background, and that the HOSC value for the ship is much larger than that for the sea clutter. The curve's maximal target peak interval is then located and extracted by bandpass filtering in the fractional Fourier domain. The HOSC value outside the peak interval decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target-region selection method. The results show that the proposed method is excellent for maritime target detection under heavy clutter.
Vehicle Localization by LIDAR Point Correlation Improved by Change Detection
NASA Astrophysics Data System (ADS)
Schlichting, A.; Brenner, C.
2016-06-01
LiDAR sensors are proven sensors for accurate vehicle localization. Instead of detecting and matching features in the LiDAR data, we want to use the entire information provided by the scanners. As dynamic objects, such as cars, pedestrians or even construction sites, could lead to wrong localization results, we use a change detection algorithm to detect these objects in the reference data. If an object occurs in a certain number of measurements at the same position, we mark it and every point it contains as static. In the next step, we merge the data of the single measurement epochs into one reference dataset, using only static points. Further, we also use a classification algorithm to detect trees. For the online localization of the vehicle, we use simulated data of a vertically aligned automotive LiDAR sensor. As we only want to use static objects in this case as well, we use a random forest classifier to detect dynamic scan points online. Since the automotive data are derived from the LiDAR Mobile Mapping System, we are able to use the labelled objects from the reference data generation step to create the training data and, further, to detect dynamic objects online. The localization can then be done by a point-to-image correlation method using only static objects. We achieved a localization standard deviation of about 5 cm (position) and 0.06° (heading), and were able to successfully localize the vehicle in about 93% of the cases along a trajectory of 13 km in Hannover, Germany.
Automated Point Cloud Correspondence Detection for Underwater Mapping Using AUVs
NASA Technical Reports Server (NTRS)
Hammond, Marcus; Clark, Ashley; Mahajan, Aditya; Sharma, Sumant; Rock, Stephen
2015-01-01
An algorithm for automating correspondence detection between point clouds composed of multibeam sonar data is presented. This allows accurate initialization for point cloud alignment techniques even in cases where accurate inertial navigation is not available, such as iceberg profiling or vehicles with low-grade inertial navigation systems. Techniques from computer vision literature are used to extract, label, and match keypoints between "pseudo-images" generated from these point clouds. Image matches are refined using RANSAC and information about the vehicle trajectory. The resulting correspondences can be used to initialize an iterative closest point (ICP) registration algorithm to estimate accumulated navigation error and aid in the creation of accurate, self-consistent maps. The results presented use multibeam sonar data obtained from multiple overlapping passes of an underwater canyon in Monterey Bay, California. Using strict matching criteria, the method detects 23 between-swath correspondence events in a set of 155 pseudo-images with zero false positives. Using less conservative matching criteria doubles the number of matches but introduces several false positive matches as well. Heuristics based on known vehicle trajectory information are used to eliminate these.
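The correspondences feed an ICP registration; the core of each ICP iteration, once correspondences are fixed, is a closed-form rigid alignment. A minimal 2-D version (the sonar clouds are of course 3-D, where an SVD is used instead) can be sketched with matched point pairs:

```python
import math

def rigid_align_2d(src, dst):
    """Closed-form least-squares 2-D rotation + translation mapping matched
    source points onto destination points: the inner step of one ICP
    iteration once correspondences are fixed."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    sdot = scross = 0.0   # cross-covariance of the centered point sets
    for (x, y), (u, v) in zip(src, dst):
        ax, ay, bx, by = x - csx, y - csy, u - cdx, v - cdy
        sdot += ax * bx + ay * by
        scross += ax * by - ay * bx
    theta = math.atan2(scross, sdot)
    c, s = math.cos(theta), math.sin(theta)
    return theta, cdx - (c * csx - s * csy), cdy - (s * csx + c * csy)

# Recover a known 30 degree rotation and (2, -1) translation from 3 matches.
ang = math.radians(30.0)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
dst = [(x * math.cos(ang) - y * math.sin(ang) + 2.0,
        x * math.sin(ang) + y * math.cos(ang) - 1.0) for x, y in src]
theta, tx, ty = rigid_align_2d(src, dst)
```

A good initial correspondence set, which is what the paper's pseudo-image matching provides, is exactly what lets this least-squares step converge instead of falling into a local minimum.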
A Study of Impact Point Detecting Method Based on Seismic Signal
NASA Astrophysics Data System (ADS)
Huo, Pengju; Zhang, Yu; Xu, Lina; Huang, Yong
The projectile landing position has to be determined for projectile recovery and range measurement in targeting tests. In this paper, a global search method based on the velocity variance is proposed. To verify the applicability of this method, simulation analysis over an area of four million square meters was conducted with the same array structure as the commonly used linear positioning method, and MATLAB was used to compare and analyze the two methods. The compared simulation results show that the global search method based on the velocity variance has high positioning accuracy and stability, and can meet the needs of impact point location.
Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation
NASA Astrophysics Data System (ADS)
Li, C.
2012-07-01
POS, integrating GPS/INS (Inertial Navigation Systems), allows rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, INS not only has systematic error but is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades, where usually only two vanishing points can be detected; thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of the two vanishing points (VX, VY), and how to set initial weights for the adjustment solution of single-image vanishing points is presented. Vanishing points are solved and their error distributions estimated by an iterative method with variable weights, the co-factor matrix, and error ellipse theory. Thirdly, given the error ellipses of the two vanishing points (VX, VY) and the triangle geometric relationship of the three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion; the Monte Carlo methods used for this random statistical estimation are presented. Finally, experimental results for vanishing point coordinates and their error distributions are shown and analyzed.
Zhang, Bin Bin; Shi, Yi; Chen, Hui; Zhu, Qing Xia; Lu, Feng; Li, Ying Wei
2018-01-02
By coupling surface-enhanced Raman spectroscopy (SERS) with thin-layer chromatography (TLC), a powerful method for detecting complex samples was successfully developed. However, in the TLC-SERS method, metal nanoparticles serving as the SERS-active substrate are likely to disturb the detection of target compounds, particularly in overlapping compounds after TLC development. In addition, the SERS detection of compounds that are invisible under both visible light and UV 254/365 after TLC development is still a significant challenge. In this study, we demonstrated a facile strategy to fabricate a TLC plate with metal-organic framework-modified gold nanoparticles as a separable SERS substrate, on which all separated components, including overlapping and invisible compounds, could be detected by a point-by-point SERS scan along the developing direction. Rhodamine 6G (R6G) was used as a probe to evaluate the performance of the substrate. The results indicated that the substrate provided good sensitivity and reproducibility, and optimal SERS signals could be collected in 5 s. Furthermore, this new substrate exhibited a long shelf life. Thus, our method has great potential for the sensitive and rapid detection of overlapping and invisible compounds in complex samples after TLC development. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Development of a novel constellation based landmark detection algorithm
NASA Astrophysics Data System (ADS)
Ghayoor, Ali; Vaidya, Jatin G.; Johnson, Hans J.
2013-03-01
Anatomical landmarks such as the anterior commissure (AC) and posterior commissure (PC) are commonly used by researchers for co-registration of images. In this paper, we present a novel, automated approach for landmark detection that combines morphometric constraining and statistical shape models to provide accurate estimation of landmark points. This method is made robust to large rotations in initial head orientation by extracting extra information of the eye centers using a radial Hough transform and exploiting the centroid of head mass (CM) using a novel estimation approach. To evaluate the effectiveness of this method, the algorithm is trained on a set of 20 images with manually selected landmarks, and a test dataset is used to compare the automatically detected against the manually detected landmark locations of the AC, PC, midbrain-pons junction (MPJ), and fourth ventricle notch (VN4). The results show that the proposed method is accurate as the average error between the automatically and manually labeled landmark points is less than 1 mm. Also, the algorithm is highly robust as it was successfully run on a large dataset that included different kinds of images with various orientation, spacing, and origin.
Capillary electrophoresis of conidia from cultivated microscopic filamentous fungi.
Horká, Marie; Růzicka, Filip; Kubesová, Anna; Holá, Veronika; Slais, Karel
2009-05-15
In immunocompromised people, fungal agents can cause serious infections with a high mortality rate. An early diagnosis can increase the chances of survival of the affected patients. Simultaneously, the fungi produce toxins and are a frequent cause of allergy. Currently, various methods are used for the detection and identification of these pathogens. They use microscopic examination and the growth characteristics of the fungi. Newer methods are based on the analysis of structural elements of the target microorganisms, such as proteins, polysaccharides, glycoproteins, and nucleic acids, for the construction of antibodies, probes, and primers for detection. The above-mentioned methods are time-consuming and elaborate. Here, hydrophobic conidia from cultures of different strains of filamentous fungi were focused and separated by capillary zone electrophoresis and capillary isoelectric focusing. The detection was optimized by dynamically modifying the conidia with a nonionogenic tenside based on pyrenebutanoate. As few as 10 labeled conidia of the fungal strains were fluorometrically detected, and the isoelectric points of the conidia were determined. The observed isoelectric points were compared with those obtained from the separation of cultured clinical samples and were found not to be host-specific.
Automatic arteriovenous crossing phenomenon detection on retinal fundus images
NASA Astrophysics Data System (ADS)
Hatanaka, Yuji; Muramatsu, Chisako; Hara, Takeshi; Fujita, Hiroshi
2011-03-01
Arteriolosclerosis is one cause of acquired blindness. Retinal fundus image examination is useful for the early detection of arteriolosclerosis. To diagnose the presence of arteriolosclerosis, physicians look for silver-wire arteries, copper-wire arteries and the arteriovenous crossing phenomenon on retinal fundus images. The focus of this study was to develop an automated detection method for the arteriovenous crossing phenomenon on retinal images. The blood vessel regions were detected by using a double-ring filter, and the crossing sections of artery and vein were detected by using a ring filter. The center of that ring was an interest point, and that point was determined to be a crossing section when there were more than four blood vessel segments on the ring. The two blood vessels passing through the ring were then classified into artery and vein by using the pixel values of the red and blue component images. Finally, the V2-to-V1 ratio was measured for the recognition of abnormalities, where V1 is the venous diameter far from the crossing section and V2 is the venous diameter near the crossing section. A crossing section with a V2-to-V1 ratio over 0.8 was experimentally determined to be an abnormality. Twenty-four images, including 27 abnormalities and 54 normal crossing sections, were used for a preliminary evaluation of the proposed method. The proposed method detected 73% of crossing sections at 2.8 false detections per image, and 59% of abnormalities were detected by measurement of the V2-to-V1 ratio at 1.7 false detections per image.
Automated analysis of plethysmograms for functional studies of hemodynamics
NASA Astrophysics Data System (ADS)
Zatrudina, R. Sh.; Isupov, I. B.; Gribkov, V. Yu.
2018-04-01
The most promising method for the quantitative determination of cardiovascular tone indicators and of cerebral hemodynamics indicators is impedance plethysmography. Accurate determination of these indicators requires correct identification of the characteristic points in the thoracic and cranial impedance plethysmograms, respectively. An algorithm for the automatic analysis of these plethysmograms is presented. The algorithm is based on the strict temporal relationships between the phases of the cardiac cycle and the characteristic points of the plethysmogram. The proposed algorithm does not require estimation of the initial data or selection of processing parameters. Use of the method on healthy subjects showed a very low detection error for the characteristic points.
Moire technique utilization for detection and measurement of scoliosis
NASA Astrophysics Data System (ADS)
Zawieska, Dorota; Podlasiak, Piotr
1993-02-01
The moire projection method enables non-contact measurement of the shape or deformation of different surfaces and constructions by fringe pattern analysis. Acquisition of a fringe map of the whole surface of the object under test is one of the main advantages over point-by-point methods. The computer analyzes the shape of the whole surface, and the user can then select different points or cross-sections of the object map. In this paper, a few typical examples of the application of the moire technique to different medical problems are presented. We also present the equipment: the moire pattern analysis is done in real time using the phase-stepping method with a CCD camera.
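The phase-stepping calculation mentioned above has a standard four-step form; the sketch below is a generic illustration (the paper's exact implementation is not given) that recovers the fringe phase from four intensity samples taken at 90-degree phase increments:

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting: recover the fringe phase phi from four
    intensity samples I_k = A + B*cos(phi + (k-1)*pi/2).
    Then I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi)."""
    return math.atan2(i4 - i2, i1 - i3)
```

The background level A and modulation B cancel out, which is why the method is robust to uneven illumination of the surface.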
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic information (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is detecting seismic events in a long-term, continuously recorded time series. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied in this work. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) on the seismic record. In this framework, an anomalous signal (a seismic event) is modeled as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event's occurrence. That is, SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of older data on the current estimates, which makes SDAR a robust algorithm for non-stationary signal processing.
With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
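The discounting idea behind SDAR can be illustrated with a deliberately simplified sketch: a zeroth-order version that replaces the full AR model with a discounted Gaussian, scoring each new sample by its negative log-likelihood under exponentially forgotten statistics. The discounting rate `r` and this simplification are illustrative assumptions, not the authors' full algorithm.

```python
import math

def sdar_scores(x, r=0.05, eps=1e-8):
    """Change-point scores from a discounted Gaussian model (a zeroth-order
    simplification of SDAR: older samples are down-weighted at rate r)."""
    mu, var = x[0], 1.0
    scores = []
    for xt in x[1:]:
        # Negative log-likelihood of the new sample under the current model;
        # a statistical irregularity (change point) yields a large score.
        scores.append(0.5 * math.log(2 * math.pi * var)
                      + (xt - mu) ** 2 / (2 * var))
        # Discounted (exponentially forgetting) parameter updates.
        mu = (1 - r) * mu + r * xt
        var = (1 - r) * var + r * (xt - mu) ** 2 + eps
    return scores
```

Because the score depends only on the running statistics, no waveform attributes (amplitude, energy, polarization) are needed, which mirrors the anti-noise argument above.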
Barbés, Benigno; Azcona, Juan Diego; Prieto, Elena; de Foronda, José Manuel; García, Marina; Burguete, Javier
2015-09-08
A simple and independent system to detect and measure the position of a number of points in space was devised and implemented. Its application aimed to detect patient motion during radiotherapy treatments, alert of out-of-tolerance motion, and record the trajectories for subsequent studies. The system obtains the 3D position of points in space through their projections in 2D images recorded by two cameras. It tracks black dots on a white sticker placed on the surface of the moving object. The system was tested with linear displacements of a phantom, circular trajectories of a rotating disk, oscillations of an in-house phantom, and oscillations of a 4D phantom. It was also used to track 461 trajectories of points on the surface of patients during their radiotherapy treatments. Trajectories of several points were reproduced with accuracy better than 0.3 mm in the three spatial directions. The system was able to follow periodic motion with amplitudes lower than 0.5 mm, to follow trajectories of rotating points at speeds up to 11.5 cm/s, and to accurately track the motion of a respiratory phantom. The technique has been used to track and analyze the motion of patients during radiotherapy. The method is flexible; its installation and calibration are simple and quick, it is easy to use, and it can be implemented at a very affordable price. Data collection does not involve any discomfort to the patient and does not delay the treatment, so the system can be used routinely in all treatments. It has an accuracy similar to that of other, more sophisticated, commercially available systems. It is suitable for implementing a gating system or any other application requiring motion detection, such as 4D CT, MRI or PET.
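Recovering a 3D point from its projections in two cameras is classically done by linear (DLT) triangulation; the sketch below assumes calibrated 3x4 projection matrices (the paper's calibration details are not given, so this is the textbook approach, not necessarily the authors' exact one):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover a 3D point from its pixel
    projections uv1, uv2 in two cameras with 3x4 projection matrices P1, P2."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right singular vector of A with the
    # smallest singular value (least-squares solution of A X = 0).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With noise-free projections the residual is zero and the point is recovered exactly; with noisy detections the SVD gives the algebraic least-squares estimate.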
Methods for threshold determination in multiplexed assays
Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J
2014-06-24
Methods for determining threshold values for the signatures included in an assay are described, where each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false-positive rate curve. A false-positive criterion is established, and a threshold for each signature is determined as the point at which the false-positive rate curve intersects the false-positive criterion. A method for quantitative analysis and interpretation of assay results, together with a method for determining a desired limit of detection of a signature in an assay, are also described.
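The thresholding principle can be sketched briefly. The patent does not prescribe a particular density model for the negative samples; a Gaussian fit is assumed here purely for illustration, with the threshold placed where the modelled false-positive rate curve meets the criterion:

```python
from statistics import NormalDist

def signature_threshold(negatives, fpr=0.01):
    """Threshold at which the (Gaussian-modelled) false-positive rate of the
    negative samples equals the false-positive criterion `fpr`."""
    mu = sum(negatives) / len(negatives)
    var = sum((x - mu) ** 2 for x in negatives) / (len(negatives) - 1)
    # Signals above this value are called positive with false-positive
    # rate approximately `fpr` under the fitted model.
    return NormalDist(mu, var ** 0.5).inv_cdf(1 - fpr)
```

Tightening the criterion (smaller `fpr`) moves the threshold further into the upper tail of the negative distribution, trading sensitivity for specificity.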
Dual Low-Rank Pursuit: Learning Salient Features for Saliency Detection.
Lang, Congyan; Feng, Jiashi; Feng, Songhe; Wang, Jingdong; Yan, Shuicheng
2016-06-01
Saliency detection is an important procedure for machines to understand the visual world as humans do. In this paper, we consider the specific saliency detection problem of predicting human eye fixations during free viewing of natural images, and propose a novel dual low-rank pursuit (DLRP) method. DLRP learns saliency-aware feature transformations by utilizing available supervision information and constructs discriminative bases for effectively detecting human fixation points under the popular low-rank and sparsity-pursuit framework. Benefiting from the high-level information embedded in the supervised learning process, DLRP is able to predict fixations accurately without performing the expensive object segmentation used in previous works. Comprehensive experiments clearly show the superiority of the proposed DLRP method over established state-of-the-art methods. We also empirically demonstrate that DLRP provides stronger generalization performance across different data sets and inherits the advantages of both bottom-up and top-down saliency detection methods.
A Robust Shape Reconstruction Method for Facial Feature Point Detection.
Tan, Shuqiu; Chen, Dongyi; Guo, Chenggang; Huang, Zhiqi
2017-01-01
Facial feature point detection has seen great research advances in recent years, and numerous methods have been developed and applied in practical face analysis systems. However, it remains a quite challenging task because of the large variability in expressions and gestures and the existence of occlusions in real-world photographs. In this paper, we present a robust sparse reconstruction method for face alignment problems. Instead of a direct regression between the feature space and the shape space, the concept of shape increment reconstruction is introduced. Moreover, a set of coupled overcomplete dictionaries, termed the shape increment dictionary and the local appearance dictionary, is learned in a regressive manner to select robust features and fit shape increments. Additionally, to make the learned model more generalized, we select the best-matched parameter set through extensive validation tests. Experimental results on three public datasets demonstrate that the proposed method achieves better robustness than state-of-the-art methods.
[Pharmacovigilance of major pharmaceutical innovation].
Xiang, Yongyang; Xie, Yanming; Yi, Danhui
2011-10-01
With the continuous improvement of international pharmacovigilance techniques and methods, pharmacovigilance has become a key part of post-marketing evaluation. This work builds on that background and aims to find a Chinese medicine safety monitoring approach consistent with actual practice. A common observation among those who choose a career in pharmacovigilance is that the complex data presented to us are a source of both fascination and frustration. Data mining technology first appeared in international pharmacovigilance in the 1970s; here we try to establish a new signal detection method to contribute to the post-marketing evaluation of Chinese medicine and the establishment of registration. Building a national adverse reaction reporting database is widely practiced in Western countries. The nature of the problem is that pharmacovigilance questions can, under various assumptions, be turned into statistical problems, with different assumptions leading to different statistical tests. Beyond the traditional disproportionality analysis based on the fourfold table, few of the other formulations are used in practice; this issue does not evaluate the evidence but introduces the underlying principles. Methods include the reporting odds ratio (ROR) used in the Netherlands, the proportional reporting ratio (PRR) used in the UK, the WHO's information component (IC), and the U.S. Food and Drug Administration's empirical Bayes method (EBS), among others. Because there is no international gold standard for signal detection, we first use simulation to compare these four data mining methods. From the points of view of specificity and sample size requirements, this work reviews the advantages, disadvantages, and application conditions of the four methods, and from a technical point of view tries to propose a new signal detection method, for example hierarchical Bayes.
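Two of the disproportionality measures named above have simple closed forms over the fourfold (2 x 2) report table; the sketch below uses the standard textbook definitions (not code from this issue):

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from the 2x2 report table:
    a = target drug & target event, b = target drug & other events,
    c = other drugs & target event, d = other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

def ror(a, b, c, d):
    """Reporting odds ratio from the same 2x2 table."""
    return (a * d) / (b * c)
```

Values well above 1 suggest the event is reported disproportionately often with the target drug; in practice both measures are screened together with confidence intervals and minimum report counts.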
Unsupervised Detection of Planetary Craters by a Marked Point Process
NASA Technical Reports Server (NTRS)
Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.
2011-01-01
With the launch of several planetary missions in the last decade, a large number of planetary images are being acquired. Because of the huge amount of acquired data, automatic and robust processing techniques are preferable for data analysis. Here, the aim is to achieve a robust and general methodology for crater detection, and a novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy term and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications; one such application area is image registration by matching the extracted features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gwyther, Ceri L.; Jones, David L.; Golyshin, Peter N.
Highlights: Bioreduction is a novel on-farm storage option for livestock carcasses. Legislation demands that pathogens are contained and do not proliferate during carcass storage. We examined the survival of key pathogens in lab-scale bioreduction vessels. Pathogen numbers reduced in the resulting liquor waste and bioaerosols. The results indicate that bioreduction should be validated for industry use. - Abstract: The EU Animal By-Products Regulations generated the need for novel methods of storage and disposal of dead livestock. Bioreduction prior to rendering or incineration has been proposed as a practical and potentially cost-effective method; however, its biosecurity characteristics need to be elucidated. To address this, Salmonella enterica (serovars Senftenberg and Poona), Enterococcus faecalis, Campylobacter jejuni, Campylobacter coli and a lux-marked strain of Escherichia coli O157 were inoculated into laboratory-scale bioreduction vessels containing sheep carcass constituents. Numbers of all pathogens and the metabolic activity of E. coli O157 decreased significantly within the liquor waste over time, and only E. faecalis remained detectable after 3 months. Only very low numbers of Salmonella spp. and E. faecalis were detected in bioaerosols, and only at initial stages of the trial. These results further indicate that bioreduction represents a suitable method of storing and reducing the volume of livestock carcasses prior to ultimate disposal.
Face liveness detection for face recognition based on cardiac features of skin color image
NASA Astrophysics Data System (ADS)
Suh, Kun Ha; Lee, Eui Chul
2016-07-01
With the growth of biometric technology, spoofing attacks have emerged as a threat to the security of such systems. The main spoofing scenarios in face recognition systems include the printing attack, the replay attack, and the 3D mask attack. To prevent such attacks, techniques that evaluate the liveness of the biometric data can be considered a solution. In this paper, a novel face liveness detection method based on a cardiac signal extracted from the face is presented. The key point of the proposed method is that the cardiac characteristic is detected in live faces but not in non-live faces. Experimental results showed that the proposed method can be an effective way to determine a printing attack or a 3D mask attack.
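A minimal sketch of the underlying idea (the authors' exact pipeline is not described here, so the function and band limits below are illustrative assumptions): track the mean green-channel value of a face region over frames and look for a spectral peak in the plausible cardiac band, which a printed photo or rigid mask should not show.

```python
import numpy as np

def pulse_frequency(green_means, fps=30.0):
    """Dominant frequency (Hz) of the mean-green-channel trace of a face ROI.
    A live face shows a clear peak in the cardiac band (~0.7-4 Hz)."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                         # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)   # plausible heart-rate band
    return freqs[band][np.argmax(spectrum[band])]
```

A liveness decision would then compare the in-band peak strength against the rest of the spectrum; here only the peak-frequency step is shown.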
NASA Astrophysics Data System (ADS)
Sayegh, Samir I.; Taghian, Alphonse
2013-03-01
Breast cancer-related lymphedema (BCRL) can be irreversible, with a profound negative impact on patients' quality of life. Programs that provide screening and active surveillance for BCRL are essential to determine whether early detection and intervention influence the course of lymphedema development. Established methods of quantitatively assessing lymphedema at early stages include "volume" methods such as perometry and bioimpedance spectroscopy. Here we demonstrate: 1) the use of topographical techniques analogous to those used in corneal topography; 2) the development of point-of-care lymphedema detection and characterization based on off-the-shelf hardware; 3) the role of subsurface imaging; and 4) multimodal diagnostics and integration yielding higher sensitivity/specificity.
Theory and Application of Magnetic Flux Leakage Pipeline Detection.
Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei
2015-12-10
Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted.
Method and apparatus for determining the coordinates of an object
Pedersen, Paul S.
2002-01-01
A simplified method and related apparatus are described for determining the location of points on the surface of an object by varying, in accordance with a unique sequence, the intensity of each illuminated pixel directed to the object surface, and detecting at known detector pixel locations the intensity sequence of reflected illumination from the surface of the object whereby the identity and location of the originating illuminated pixel can be determined. The coordinates of points on the surface of the object are then determined by conventional triangulation methods.
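The unique-intensity-sequence idea can be sketched with a simple binary coding. The patent does not specify the actual sequences used; plain binary codes are assumed here for illustration only:

```python
def encode_sequences(n_pixels):
    """Give each projector pixel a unique on/off intensity sequence (here:
    its binary code), so a detector observing the reflected sequence can
    identify which illuminated pixel produced it."""
    n_bits = max(1, (n_pixels - 1).bit_length())
    return [[(p >> b) & 1 for b in range(n_bits)] for p in range(n_pixels)]

def decode_sequence(seq):
    """Recover the originating pixel index from a detected on/off sequence."""
    return sum(bit << b for b, bit in enumerate(seq))
```

Once the originating pixel of each detected reflection is known, the surface coordinates follow by conventional triangulation between the projector ray and the detector pixel, as the abstract states.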
A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery.
Siddiqui, Fasahat Ullah; Teng, Shyh Wei; Awrangjeb, Mohammad; Lu, Guojun
2016-07-19
Existing automatic building extraction methods are not effective in extracting buildings that are small in size or have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation, and the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane, whereas trees have a random height change. With such an analysis, buildings over a greater range of sizes, with transparent or opaque roofs, can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of the proposed method does not require any manual setting, and all parameters are set automatically from the data. The other post-processing stages, including variance, point density and shadow elimination, are also applied to verify the extracted buildings, using comparatively few empirically set parameters. The performance of the proposed GBE method is evaluated on two benchmark data sets by using object- and pixel-based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs.
When compared with current state-of-the-art building extraction methods, the proposed method outperforms the existing methods in various evaluation metrics.
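The gradient argument above (constant height change along a roof plane, random change for trees) can be sketched as a simple regularity score. This toy measure is ours, for illustration, and is not the authors' full GBE pipeline:

```python
import numpy as np

def gradient_regularity(height_patch):
    """Variance of the height gradient over a patch: roof planes have a
    near-constant gradient (value near zero), trees a random one (large)."""
    gy, gx = np.gradient(np.asarray(height_patch, dtype=float))
    return float(gx.var() + gy.var())
```

Thresholding such a score separates planar roof candidates from vegetation without interpolating the point heights onto a smoothed surface.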
Hoang, Phuong Le; Ahn, Sanghoon; Kim, Jeng-o; Kang, Heeshin; Noh, Jiwhan
2017-01-01
In modern high-intensity ultrafast laser processing, detecting the focal position of the working laser beam, at which the intensity is the highest and the beam diameter is the lowest, and immediately locating the target sample at that point are challenging tasks. A system that allows in-situ real-time focus determination and fabrication using a high-power laser has been in high demand among both engineers and scientists. Conventional techniques require the complicated mathematical theory of wave optics, employing interference as well as diffraction phenomena to detect the focal position; however, these methods are ineffective and expensive for industrial application. Moreover, these techniques could not perform detection and fabrication simultaneously. In this paper, we propose an optical design capable of detecting the focal point and fabricating complex patterns on a planar sample surface simultaneously. In-situ real-time focus detection is performed using a bandpass filter, which only allows for the detection of laser transmission. The technique enables rapid, non-destructive, and precise detection of the focal point. Furthermore, it is sufficiently simple for application in both science and industry for mass production, and it is expected to contribute to the next generation of laser equipment, which can be used to fabricate micro-patterns with high complexity. PMID:28671566
A method of 3D object recognition and localization in a cloud of points
NASA Astrophysics Data System (ADS)
Bielicki, Jerzy; Sitnik, Robert
2013-12-01
The method proposed in this article is designed for the analysis of data in the form of a cloud of points taken directly from 3D measurements. It is intended for use in end-user applications that can be integrated directly with 3D scanning software. The method utilizes locally calculated feature vectors (FVs) in point cloud data. Recognition is based on comparison of the analyzed scene with a reference object library. A global descriptor in the form of a set of spatially distributed FVs is created for each reference model. During the detection process, the correlation of subsets of reference FVs with FVs calculated in the scene is computed. The features utilized in the algorithm are based on parameters that qualitatively estimate mean and Gaussian curvatures. Replacing differentiation with averaging in the curvature estimation makes the algorithm more resistant to discontinuities and poor quality of the input data. Utilization of the FV subsets allows detection of partially occluded and cluttered objects in the scene, while the additional spatial information keeps the false positive rate at a reasonably low level.
Feature-based registration of historical aerial images by Area Minimization
NASA Astrophysics Data System (ADS)
Nagarajan, Sudhagar; Schenk, Toni
2016-06-01
The registration of historical images plays a significant role in assessing changes in land topography over time. By comparing historical aerial images with recent data, geometric changes that have taken place over the years can be quantified. However, the lack of ground control information and precise camera parameters has limited scientists' ability to reliably incorporate historical images into change detection studies. Other limitations include the methods of determining identical points between recent and historical images, which has proven to be a cumbersome task due to continuous land cover changes. Our research demonstrates a method of registering historical images using Time Invariant Line (TIL) features. TIL features are different representations of the same line features in multi-temporal data without explicit point-to-point or straight line-to-straight line correspondence. We successfully determined the exterior orientation of historical images by minimizing the area formed between corresponding TIL features in recent and historical images. We then tested the feasibility of the approach with synthetic and real data and analyzed the results. Based on our analysis, this method shows promise for long-term 3D change detection studies.
Direct writing electrodes using a ball pen for paper-based point-of-care testing.
Li, Zedong; Li, Fei; Hu, Jie; Wee, Wei Hong; Han, Yu Long; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng
2015-08-21
The integration of paper with an electrochemical device has attracted growing attention for point-of-care testing, where it is of great importance to fabricate electrodes on paper in a low-cost, easy and versatile way. In this work, we report a simple strategy for directly writing electrodes on paper using a pressure-assisted ball pen to form a paper-based electrochemical device (PED). This method is demonstrated to be capable of fabricating electrodes on paper with good electrical conductivity and electrochemical performance, holding great potential to be employed in point-of-care applications, such as in human health diagnostics and food safety detection. As examples, the PEDs fabricated using the developed method are applied for detection of glucose in artificial urine and melamine in sample solutions. Furthermore, our developed strategy is also extended to fabricate PEDs with multi-electrode arrays and write electrodes on non-planar surfaces (e.g., paper cup, human skin), indicating the potential application of our method in other fields, such as fabricating biosensors, paper electronics etc.
Image registration with uncertainty analysis
Simonson, Katherine M [Cedar Crest, NM
2011-03-22
In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
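The matching criterion can be sketched as follows. This is a simplified illustration: integer shifts with wraparound via np.roll, and without the patent's statistical uncertainty analysis over near-best registration points:

```python
import numpy as np

def best_registration(edges1, edges2, max_shift=3):
    """Score integer translations by the percentage of edge pixels of edges2
    that coincide with edge pixels of edges1 shifted by (dy, dx); return the
    best shift and its match fraction."""
    best, best_pct = (0, 0), -1.0
    n2 = edges2.sum()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(edges1, (dy, dx), axis=(0, 1))
            pct = np.logical_and(shifted, edges2).sum() / n2
            if pct > best_pct:
                best_pct, best = pct, (dy, dx)
    return best, float(best_pct)
```

The uncertainty analysis of the patent would then keep, within the search region, every shift whose match percentage is not significantly worse than the best one under the chosen statistical criterion.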
Sensitive and reliable detection of Kit point mutation Asp 816 to Val in pathological material
Kähler, Christian; Didlaukat, Sabine; Feller, Alfred C; Merz, Hartmut
2007-01-01
Background Human mastocytosis is a heterogeneous disorder linked to a gain-of-function mutation in the kinase domain of the receptor tyrosine kinase Kit. This D816V mutation leads to constitutive activation and phosphorylation of Kit, with proliferative disorders of mast cells in the peripheral blood, skin, and spleen. Most PCR applications used so far are labour-intensive and not suited to the daily routine of pathological laboratories. The method has to be robust and must work on materials as different as archival formalin-fixed, paraffin-embedded (FFPE) tissue and blood samples. Such a method is introduced in this publication. Methods The Kit point mutation Asp 816 to Val is heterozygous, which poses a problem for detection by PCR, because the wild-type allele is also amplified and the number of cells bearing the point mutation is low in most cases. Most PCR protocols use probes to block the wild-type allele during amplification, with more or less satisfying results. For this reason, point-mutated forward primers were designed and tested for their efficiency in amplifying the mutated allele. Results One primer combination (A) fit the introduced PCR assay best. It amplified only the mutated allele, with high specificity, from different patient materials (FFPE or blood) of varying quality and quantity. Moreover, the sensitivity of the assay was convincing: 10 ng of DNA bearing the point mutation could be detected in a total of 200 ng of DNA. Conclusion The PCR assay can deal with different materials (blood and FFPE), and thus with varying DNA quality and quantity, and can be used for high-throughput screening because of its robustness. Moreover, the method is easy to use, not labour-intensive, and easy to set up in a standard laboratory. PMID:17900365
Wilkes, Rebecca Penrose; Kania, Stephen A; Tsai, Yun-Long; Lee, Pei-Yu Alison; Chang, Hsiu-Hui; Ma, Li-Juan; Chang, Hsiao-Fen Grace; Wang, Hwa-Tang Thomas
2015-07-01
Feline immunodeficiency virus (FIV) is an important infectious agent of cats. Clinical syndromes resulting from FIV infection include immunodeficiency, opportunistic infections, and neoplasia. In our study, a 5' long terminal repeat/gag region-based reverse transcription insulated isothermal polymerase chain reaction (RT-iiPCR) was developed to amplify all known FIV strains to facilitate point-of-need FIV diagnosis. The RT-iiPCR method was applied in a point-of-need PCR detection platform, a field-deployable device capable of generating automatically interpreted RT-iiPCR results from nucleic acids within 1 hr. The 95% limit of detection of the FIV RT-iiPCR was calculated to be 95 copies of standard in vitro-transcribed RNA per reaction. Endpoint dilution studies with serial dilutions of an ATCC FIV type strain showed that the sensitivity of the lyophilized FIV RT-iiPCR reagent was comparable to that of a reference nested PCR. The established reaction did not amplify any nontargeted feline pathogens, including Felid herpesvirus 1, feline coronavirus, Feline calicivirus, Feline leukemia virus, Mycoplasma haemofelis, and Chlamydophila felis. Based on analysis of 76 clinical samples (including blood and bone marrow) with the FIV RT-iiPCR, test sensitivity was 97.78% (44/45), specificity was 100.00% (31/31), and agreement was 98.65% (75/76), determined against a reference nested-PCR assay. A kappa value of 0.97 indicated excellent correlation between these 2 methods. The lyophilized FIV RT-iiPCR reagent, deployed on a user-friendly portable device, has potential utility for rapid and easy point-of-need detection of FIV in cats. © 2015 The Author(s).
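The reported performance figures follow from the 2x2 comparison against the reference nested PCR (true positives = 44, false positives = 0, false negatives = 1, true negatives = 31); a sketch of the standard calculation:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, overall agreement and Cohen's kappa from a
    2x2 comparison of an index test against a reference test."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    po = (tp + tn) / n                                   # observed agreement
    pe = ((tp + fn) * (tp + fp)
          + (tn + fp) * (tn + fn)) / n ** 2              # chance agreement
    return sens, spec, po, (po - pe) / (1 - pe)          # kappa corrects po by pe
```

With the study's counts this reproduces sensitivity 97.78%, specificity 100% and a kappa of about 0.97, matching the values quoted in the abstract.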
Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.
Chagren, S; Tekaya, M Ben; Reguigui, N; Gharbi, F
2016-01-01
In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in high-purity germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at constant emission energy, assuming a reference point detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of ignorance of their real magnitude on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. This agreement proves that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second investigated efficiency transfer procedure is useful for calibrating an HPGe gamma detector at any emission energy for a voluminous source, using the detection efficiency of a point source emitting at a different energy as the reference efficiency. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full-energy peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several types of source configuration: point sources located at various distances from the detector and a cylindrical box containing three matrices. Copyright © 2015 Elsevier Ltd. All rights reserved.
Giovannetti, Rita; Alibabaei, Leila; Zannotti, Marco; Ferraro, Stefano; Petetta, Laura
2013-01-01
The composition of sedimentary pigments in the Antarctic lake at Edmonson Point has been investigated with the aim of providing a useful analytical method for pigment separation and identification, and reference data for future assessment of possible changes in environmental conditions. Reversed-phase high-performance liquid chromatography (HPLC) with electrospray mass spectrometry (ESI-MS) and diode array detection (DAD) was used to identify light-screening and light-harvesting pigments. The results are discussed in terms of local environmental conditions.
Hyperspectral microscopic imaging by multiplex coherent anti-Stokes Raman scattering (CARS)
NASA Astrophysics Data System (ADS)
Khmaladze, Alexander; Jasensky, Joshua; Zhang, Chi; Han, Xiaofeng; Ding, Jun; Seeley, Emily; Liu, Xinran; Smith, Gary D.; Chen, Zhan
2011-10-01
Coherent anti-Stokes Raman scattering (CARS) microscopy is a powerful technique to image the chemical composition of complex samples in biophysics, biology and materials science. CARS is a four-wave mixing process. The application of a spectrally narrow pump beam and a spectrally wide Stokes beam excites multiple Raman transitions, which are probed by a probe beam. This generates a coherent, directional CARS signal several orders of magnitude more intense than spontaneous Raman scattering. Recent advances in the development of ultrafast lasers, as well as photonic crystal fibers (PCF), enable multiplex CARS. In this study, we employed two scanning imaging methods. In the first, detection is performed by a photomultiplier tube (PMT) attached to the spectrometer; acquiring a series of images while tuning the wavelengths between images allows subsequent reconstruction of spectra at each image point. The second method detects the CARS spectrum at each point with a cooled charge-coupled device (CCD) camera; coupled with point-by-point scanning, it allows for hyperspectral microscopic imaging. We applied this CARS imaging system to study biological samples such as oocytes.
Lardeux, Frédéric; Torrico, Gino; Aliaga, Claudia
2016-07-04
In ELISAs, sera of individuals infected by Trypanosoma cruzi show absorbance values above a cut-off value. The cut-off is generally computed by means of formulas that need absorbance readings of negative (and sometimes positive) controls, which are included in the titer plates amongst the unknown samples. When no controls are available, other techniques should be employed such as change-point analysis. The method was applied to Bolivian dog sera processed by ELISA to diagnose T. cruzi infection. In each titer plate, the change-point analysis estimated a step point which correctly discriminated among known positive and known negative sera, unlike some of the six usual cut-off formulas tested. To analyse the ELISAs results, the change-point method was as good as the usual cut-off formula of the form "mean + 3 standard deviation of negative controls". Change-point analysis is therefore an efficient alternative method to analyse ELISA absorbance values when no controls are available.
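One simple change-point formulation consistent with the approach described, a least-squares step-point search over sorted absorbances, can be sketched as follows; the absorbance values are invented for illustration, and this is not necessarily the authors' exact estimator.

```python
def step_point(values):
    """Find the split of the sorted values into two segments that
    minimizes the total within-segment sum of squares (a simple
    least-squares change-point estimate). Returns the two values
    straddling the step; a cut-off can be placed between them."""
    xs = sorted(values)
    best_k, best_cost = 1, float("inf")
    for k in range(1, len(xs)):
        left, right = xs[:k], xs[k:]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        cost = sum((v - ml) ** 2 for v in left) + sum((v - mr) ** 2 for v in right)
        if cost < best_cost:
            best_k, best_cost = k, cost
    return xs[best_k - 1], xs[best_k]

# Hypothetical plate: negative sera cluster near OD 0.1, positives near 0.9.
od = [0.08, 0.12, 0.10, 0.09, 0.11, 0.85, 0.95, 0.90, 0.88]
lo, hi = step_point(od)
print(lo, hi)  # → 0.12 0.85
```

The appeal of this kind of estimator, as the abstract notes, is that it needs no control wells: the step is located from the unknown samples alone.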
Test sensitivity is important for detecting variability in pointing comprehension in canines.
Pongrácz, Péter; Gácsi, Márta; Hegedüs, Dorottya; Péter, András; Miklósi, Adám
2013-09-01
Several articles have been recently published on dogs' (Canis familiaris) performance in two-way object choice experiments in which subjects had to find hidden food by utilizing human pointing. The interpretation of results has led to a vivid theoretical debate about the cognitive background of human gestural signal understanding in dogs, despite the fact that many important details of the testing method have not yet been standardized. We report three experiments that aim to reveal how some procedural differences influence adult companion dogs' performance in these tests. Utilizing a large sample in Experiment 1, we provide evidence that neither the keeping conditions (garden/house) nor the location of the testing (outdoor/indoor) affect a dog's performance. In Experiment 2, we compare dogs' performance using three different types of pointing gestures. Dogs' performance varied between momentary distal and momentary cross-pointing, but "low" and "high" performer dogs chose uniformly better than chance level if they responded to sustained pointing gestures with reinforcement (food reward and a clicking sound; "clicker pointing"). In Experiment 3, we show that single features of the aforementioned "clicker pointing" method can slightly improve dogs' success rate if they were added one by one to the momentary distal pointing method. These results provide evidence that although companion dogs show a robust performance at different testing locations regardless of their keeping conditions, the exact execution of the human gesture and additional reinforcement techniques have a substantial effect on the outcomes. Consequently, researchers should standardize their methodology before engaging in debates on the comparative aspects of socio-cognitive skills because the procedures they utilize may differ in sensitivity for detecting differences.
Advanced signal processing methods applied to guided waves for wire rope defect detection
NASA Astrophysics Data System (ADS)
Tse, Peter W.; Rostami, Javad
2016-02-01
Steel wire ropes, which are usually composed of a polymer core enclosed by twisted wires, are used to hoist heavy loads. These loads are different structures such as clamshells, draglines, elevators, etc. Since the loading of these structures is dynamic, the ropes work under fluctuating forces in a corrosive environment. This consequently leads to progressive loss of the metallic cross-section due to abrasion and corrosion. These defects can be seen in the form of roughened and pitted rope surfaces, reduction in diameter, and broken wires. Therefore, rope deterioration must be monitored so that any unexpected damage or corrosion can be detected before it causes a fatal accident. This is of vital importance in the case of passenger transportation, particularly in elevators, in which any failure may cause a catastrophic disaster. At present, the widely used methods for thorough inspection of wire ropes are visual inspection and magnetic flux leakage (MFL). The reliability of the first method is questionable, since it depends only on the operator's eyes, which cannot determine the integrity of internal wires. The latter method has the drawback of being a point-by-point, time-consuming inspection method. Ultrasonic guided wave (UGW) based inspection, which has proved its capability in inspecting plate-like structures as well as tubes and pipes, can monitor the cross-section of wire ropes along their entire length from a single point. However, UGW has drawn less attention for defect detection in wire ropes. This paper reports the condition monitoring of a steel wire rope from a hoisting elevator with broken wires resulting from a corrosive environment and fatigue. Experiments were conducted to investigate the efficiency of using magnetostrictive-based UGW for rope defect detection. The obtained signals were analyzed by two time-frequency representation (TFR) methods, namely the short-time Fourier transform (STFT) and wavelet analysis.
The location of the defect and its severity were successfully identified and characterized.
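The time-frequency localization step can be sketched with a toy windowed-FFT spectrogram; the sample rate, window length, hop size, burst frequency and noise level below are illustrative assumptions, not the authors' experimental values.

```python
import numpy as np

fs = 1000.0                                  # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
sig = 0.3 * rng.standard_normal(t.size)      # background noise
burst = (t > 0.60) & (t < 0.65)
sig[burst] += np.sin(2 * np.pi * 200 * t[burst])   # 200 Hz "defect echo" near 0.6 s

win, hop = 64, 16                            # window and hop length, in samples
frames = [sig[i:i + win] * np.hanning(win)   # Hann-windowed overlapping frames
          for i in range(0, sig.size - win, hop)]
spec = np.abs(np.fft.rfft(frames, axis=1))   # magnitude spectrogram (STFT)

f_bin = int(round(200 * win / fs))           # frequency bin nearest 200 Hz
peak_frame = int(np.argmax(spec[:, f_bin]))  # frame with most energy at that bin
t_peak = peak_frame * hop / fs
print(round(t_peak, 2))                      # ≈ 0.6, where the echo occurs
```

The same idea, energy at the defect-related frequency concentrated at a particular time (hence a particular position along the rope), underlies STFT-based localization of reflected guided-wave signals.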
Video Analytics Evaluation: Survey of Datasets, Performance Metrics and Approaches
2014-09-01
[Only fragments of the report body survived text extraction: a training phase with fusion of detector outputs, in which bagging is used to train multiple classifiers; person detection and background subtraction used to create "hot regions" and suppress noisy interest points; and detection algorithms incorporated with MHT to construct one integrated detector/tracker (IRDS-CASIA team).]
Cedergren, A
1974-06-01
A rapid and sensitive method using true potentiometric end-point detection has been developed and compared with the conventional amperometric method for Karl Fischer determination of water. The effect of the sulphur dioxide concentration on the shape of the titration curve is shown. By using kinetic data it was possible to calculate the course of titrations and make comparisons with those found experimentally. The results prove that the main reaction is the slow step, both in the amperometric and the potentiometric method. Results obtained in the standardization of the Karl Fischer reagent showed that the potentiometric method, including titration to a preselected potential, gave a standard deviation of 0.001(1) mg of water per ml, the amperometric method using extrapolation 0.002(4) mg of water per ml and the amperometric titration to a pre-selected diffusion current 0.004(7) mg of water per ml. Theories and results dealing with dilution effects are presented. The time of analysis was 1-1.5 min for the potentiometric and 4-5 min for the amperometric method using extrapolation.
The small low SNR target tracking using sparse representation information
NASA Astrophysics Data System (ADS)
Yin, Lifan; Zhang, Yiqun; Wang, Shuo; Sun, Chenggang
2017-11-01
Tracking small targets, such as missile warheads, from a remote distance is a difficult task, since the targets are "points" similar to the sensor's noise points. As a result, traditional tracking algorithms use only the information contained in a point measurement, such as position and intensity, as characteristics to identify targets among noise points. In fact, owing to photon diffusion, a small target is not a point in the focal plane array: it occupies an area larger than one sensor cell. So if this geometry characteristic is taken into account as a new dimension of information, it will be helpful in distinguishing targets from noise points. In this paper, we use sparse representation (SR) to describe the geometry of the target intensity and define it as the SR information of the target. Modeling the intensity spread and solving for its SR coefficients, the SR information is represented by establishing its likelihood function. Further, the SR information likelihood is incorporated into the conventional probability hypothesis density (PHD) filter algorithm with point measurements. To illustrate the performance of the algorithm with and without the SR information, the detection capability and estimation error have been compared through simulation. Results demonstrate that the proposed method has higher estimation accuracy and a higher probability of detecting the target than the conventional algorithm without the SR information.
NASA Astrophysics Data System (ADS)
Yu, Yongtao; Li, Jonathan; Wen, Chenglu; Guan, Haiyan; Luo, Huan; Wang, Cheng
2016-03-01
This paper presents a novel algorithm for detection and recognition of traffic signs in mobile laser scanning (MLS) data for intelligent transportation-related applications. The traffic sign detection task is accomplished based on 3-D point clouds by using bag-of-visual-phrases representations; whereas the recognition task is achieved based on 2-D images by using a Gaussian-Bernoulli deep Boltzmann machine-based hierarchical classifier. To exploit high-order feature encodings of feature regions, a deep Boltzmann machine-based feature encoder is constructed. For detecting traffic signs in 3-D point clouds, the proposed algorithm achieves an average recall, precision, quality, and F-score of 0.956, 0.946, 0.907, and 0.951, respectively, on the four selected MLS datasets. For on-image traffic sign recognition, a recognition accuracy of 97.54% is achieved by using the proposed hierarchical classifier. Comparative studies with the existing traffic sign detection and recognition methods demonstrate that our algorithm obtains promising, reliable, and high performance in both detecting traffic signs in 3-D point clouds and recognizing traffic signs on 2-D images.
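As a quick sanity check, the reported F-score is consistent with the quoted precision and recall, since it is their harmonic mean:

```python
# F-score as the harmonic mean of the precision and recall reported above.
recall, precision = 0.956, 0.946
f_score = 2 * precision * recall / (precision + recall)
print(round(f_score, 3))  # → 0.951, matching the reported value
```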
A computational method for detecting copy number variations using scale-space filtering
2013-01-01
Background As next-generation sequencing technology made rapid and cost-effective sequencing available, the importance of computational approaches in finding and analyzing copy number variations (CNVs) has been amplified. Furthermore, most genome projects need to accurately analyze sequences with fairly low-coverage read data. A method to detect the exact types and locations of CNVs from low-coverage read data is urgently needed. Results Here, we propose a new CNV detection method, CNV_SS, which uses scale-space filtering. The scale-space filtering is evaluated by applying Gaussian convolution to the read coverage data at various scales, according to a given scaling parameter. Next, by differentiating twice and finding zero-crossing points, inflection points of the scale-space filtered read coverage data are calculated per scale. Then the types and exact locations of CNVs are obtained by analyzing the fingerprint map, the contours of zero-crossing points across scales. Conclusions The performance of CNV_SS showed that FNR and FPR stay in the ranges of 1.27% to 2.43% and 1.14% to 2.44%, respectively, even at a relatively low coverage (0.5× ≤ C ≤ 2×). CNV_SS also gave much more effective results than the conventional methods in the evaluation of FNR (3.82% at least and 76.97% at most), even when the coverage level of the read data is low. CNV_SS source code is freely available from http://dblab.hallym.ac.kr/CNV SS/. PMID:23418726
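The core of the scale-space step, Gaussian smoothing followed by zero crossings of the second derivative, can be sketched on a toy coverage profile; this is an illustration of the technique, not the CNV_SS implementation, and the coverage values are invented.

```python
import numpy as np

def gaussian_kernel(sigma):
    # Discrete, normalized Gaussian truncated at 4 sigma.
    r = int(4 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum(), r

# Toy coverage profile: a duplicated segment (indices 50-79) on a flat background.
coverage = np.r_[np.full(50, 10.0), np.full(30, 20.0), np.full(50, 10.0)]
kernel, r = gaussian_kernel(3.0)
padded = np.pad(coverage, r, mode="edge")
smooth = np.convolve(padded, kernel, mode="valid")   # Gaussian-filtered, same length
d1 = np.gradient(smooth)
d2 = np.gradient(d1)
zc = [i for i in range(1, len(d2)) if d2[i - 1] * d2[i] < 0]  # zero crossings of d2
left = max(zc, key=lambda i: d1[i])    # steepest rising inflection point
right = min(zc, key=lambda i: d1[i])   # steepest falling inflection point
print(left, right)  # → 50 80, the boundaries of the duplicated segment
```

Repeating this over a range of sigma values and tracing how the zero crossings move with scale yields the "fingerprint map" the abstract refers to.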
Multi-scale silica structures for improved point of care detection
NASA Astrophysics Data System (ADS)
Lin, Sophia; Lin, Lancy; Cho, Eunbyul; Pezzani, Gaston A. O.; Khine, Michelle
2017-03-01
The need for sensitive, portable diagnostic tests at the point of care persists. We report on a simple method to obtain improved detection of biomolecules by a two-fold mechanism. Silica (SiO2) is coated on pre-stressed thermoplastic shrink-wrap film. When the film retracts, the resulting micro- and nanostructures yield far-field fluorescence signal enhancements over their planar or wrinkled counterparts. Because the film shrinks by 95% in surface area, there is also a 20x concentration effect. The SiO2 structured substrate is therefore used for improved detection of labeled proteins and DNA hybridization in both fluorescence and bright-field modes. Through optical characterization studies, we attribute the fluorescence signal enhancements of 100x to increased surface density and light scattering from the rough SiO2 structures. Combined with our open-channel self-wicking microfluidics, this can achieve extremely low-cost yet sensitive point-of-care diagnostics.
Detection method of visible and invisible nipples on digital breast tomosynthesis
NASA Astrophysics Data System (ADS)
Chae, Seung-Hoon; Jeong, Ji-Wook; Lee, Sooyeul; Chae, Eun Young; Kim, Hak Hee; Choi, Young-Wook
2015-03-01
Digital breast tomosynthesis (DBT), which provides a 3D breast image, can improve the detection sensitivity of breast cancer over a 2D mammogram in dense breasts. Nipple location information is needed to analyze DBT: the nipple location is invaluable in registration and as a reference point for classifying masses or micro-calcification clusters. Since a nipple can be either visible or invisible in a 2D mammogram or DBT, nipple detection must handle both cases. Detecting a visible nipple using its shape information is simple and highly efficient. However, it is difficult to detect an invisible nipple because it lacks a prominent shape. Anatomically, the mammary glands in the breast converge at the nipple, so the nipple location can be detected by analyzing the location of the mammary glands. In this paper, therefore, we propose a method to detect the nipple on a breast with a visible or an invisible nipple using changes in the breast area and the mammary glands, respectively. The result shows that our proposed method has an average error of 2.54 ± 1.47 mm.
An improved algorithm of laser spot center detection in strong noise background
NASA Astrophysics Data System (ADS)
Zhang, Le; Wang, Qianqian; Cui, Xutai; Zhao, Yu; Peng, Zhong
2018-01-01
Laser spot center detection is demanded in many applications. Common algorithms for laser spot center detection, such as the centroid and Hough transform methods, have poor anti-interference ability and low detection accuracy under strong background noise. In this paper, median filtering was first used to remove noise while preserving the edge details of the image. Second, the laser spot image was binarized to extract the target from the background. Then morphological filtering was performed to eliminate noise points inside and outside the spot. Finally, the edge of the pretreated spot image was extracted and the laser spot center was obtained using the circle-fitting method. Building on the circle-fitting algorithm, the improved algorithm adds median filtering, morphological filtering and other processing steps. Theoretical analysis and experimental verification show that this method effectively filters background noise, which enhances the anti-interference ability of laser spot center detection and also improves the detection accuracy.
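The final circle-fitting step can be sketched with an algebraic (Kåsa) least-squares fit; the synthetic edge points and true spot parameters below are invented for illustration, and the paper does not specify which circle-fitting variant was used.

```python
import numpy as np

def fit_circle(x, y):
    # Kåsa fit: solve the linear least-squares system for (cx, cy, c) in
    #   x^2 + y^2 = 2*cx*x + 2*cy*y + c,  where c = r^2 - cx^2 - cy^2.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

# Noisy synthetic edge points on a spot with center (64, 48), radius 20.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 200)
x = 64 + 20 * np.cos(theta) + rng.normal(0, 0.5, theta.size)
y = 48 + 20 * np.sin(theta) + rng.normal(0, 0.5, theta.size)
cx, cy, r = fit_circle(x, y)
print(int(round(cx)), int(round(cy)), int(round(r)))  # → 64 48 20
```

In the pipeline described above, the (x, y) inputs would come from the edge of the filtered, binarized spot image rather than from synthetic points.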
Yu, Yifei; Luo, Linqing; Li, Bo; Guo, Linfeng; Yan, Jize; Soga, Kenichi
2015-10-01
The measured distance error caused by double peaks in the BOTDRs (Brillouin optical time domain reflectometers) system is a kind of Brillouin scattering spectrum (BSS) deformation, discussed and simulated for the first time in the paper, to the best of the authors' knowledge. Double peak, as a kind of Brillouin spectrum deformation, is important in the enhancement of spatial resolution, measurement accuracy, and crack detection. Due to the variances of the peak powers of the BSS along the fiber, the measured starting point of a step-shape frequency transition region is shifted and results in distance errors. Zero-padded short-time-Fourier-transform (STFT) can restore the transition-induced double peaks in the asymmetric and deformed BSS, thus offering more accurate and quicker measurements than the conventional Lorentz-fitting method. The recovering method based on the double-peak detection and corresponding BSS deformation can be applied to calculate the real starting point, which can improve the distance accuracy of the STFT-based BOTDR system.
Detection of ferromagnetic target based on mobile magnetic gradient tensor system
NASA Astrophysics Data System (ADS)
Gang, Y. I. N.; Yingtang, Zhang; Zhining, Li; Hongbo, Fan; Guoquan, Ren
2016-03-01
Attitude changes of a mobile magnetic gradient tensor system critically affect the precision of gradient measurements, thereby increasing ambiguity in target detection. This paper presents a rotational-invariant-based method for locating and identifying ferromagnetic targets. First, the unit magnetic moment vector was derived from the geometrical invariant that the intermediate eigenvector of the magnetic gradient tensor is perpendicular to both the magnetic moment vector and the source-sensor displacement vector. Second, the unit source-sensor displacement vector was derived from the fact that the angle between the magnetic moment vector and the source-sensor displacement is a rotational invariant. By introducing a displacement vector between two measurement points, the magnetic moment vector and the source-sensor displacement vector were theoretically derived. To cope with the measurement noise present in realistic detection applications, linear equations were formulated using invariants corresponding to several distinct measurement points, and least-squares solutions for the magnetic moment vector and the source-sensor displacement vector were obtained. Results of a simulation and a principle-verification experiment showed the correctness of the analytical method, along with the practicability of the least-squares method.
Electronic method for autofluorography of macromolecules on two-D matrices. [Patent application
Davidson, J.B.; Case, A.L.
1981-12-30
A method for detecting, localizing, and quantifying macromolecules contained in a two-dimensional matrix is provided which employs a television-based position sensitive detection system. A molecule-containing matrix may be produced by conventional means to produce spots of light at the molecule locations which are detected by the television system. The matrix, such as a gel matrix, is exposed to an electronic camera system including an image-intensifier and secondary electron conduction camera capable of light integrating times of many minutes. A light image stored in the form of a charge image on the camera tube target is scanned by conventional television techniques, digitized, and stored in a digital memory. Intensity of any point on the image may be determined from the number at the memory address of the point. The entire image may be displayed on a television monitor for inspection and photographing or individual spots may be analyzed through selected readout of the memory locations. Compared to conventional film exposure methods, the exposure time may be reduced 100 to 1000 times.
The algorithm of fast image stitching based on multi-feature extraction
NASA Astrophysics Data System (ADS)
Yang, Chunde; Wu, Ge; Shi, Jing
2018-05-01
This paper proposes an improved image registration method combining Hu invariant moment contour information with feature point detection, aiming to solve problems of the traditional image stitching algorithm such as a time-consuming feature point extraction process, redundant invalid information, and inefficiency. First, the neighborhood of pixels is used to extract contour information, employing the Hu invariant moment as a similarity measure to extract SIFT feature points in similar regions. Then the Euclidean distance is replaced with the Hellinger kernel function to improve the initial matching efficiency and obtain fewer mismatched points, and the affine transformation matrix between the images is estimated. Finally, a local color mapping method is adopted to correct uneven exposure, and the improved multiresolution fusion algorithm is used to fuse the mosaic images and realize seamless stitching. Experimental results confirm the high accuracy and efficiency of the method proposed in this paper.
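The Hellinger-kernel substitution can be illustrated on histogram-like descriptors; the descriptors below are random stand-ins for real SIFT output, and the normalization details are an assumption (this is the idea behind the "RootSIFT" trick, where comparing element-wise square roots reduces the dominance of large bins).

```python
import numpy as np

def hellinger_similarity(d1, d2):
    # L1-normalize so entries behave like probabilities, then take the
    # Bhattacharyya coefficient of the square roots; 1 for identical inputs.
    d1 = d1 / d1.sum()
    d2 = d2 / d2.sum()
    return float(np.sqrt(d1).dot(np.sqrt(d2)))

rng = np.random.default_rng(0)
a = rng.random(128)                                     # 128-dim, like SIFT
b = np.clip(a + rng.normal(0, 0.01, 128), 1e-9, None)   # near-duplicate view
c = rng.random(128)                                     # unrelated descriptor
print(hellinger_similarity(a, b) > hellinger_similarity(a, c))  # → True
```

A matcher would accept the pair with the highest (or sufficiently dominant) similarity, analogously to nearest-neighbor matching under Euclidean distance.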
The conservation value of degraded forests for agile gibbons Hylobates agilis.
Lee, David C; Powell, Victoria J; Lindsell, Jeremy A
2015-01-01
All gibbon species are globally threatened with extinction yet conservation efforts are undermined by a lack of population and ecological data. Agile gibbons (Hylobates agilis) occur in Sumatra, Indonesia and adjacent mainland Southeast Asia. Population densities are known from four sites (three in Sumatra) while little is known about their ability to tolerate habitat degradation. We conducted a survey of agile gibbons in Harapan Rainforest, a lowland forest site in Sumatra. The area has been severely degraded by selective logging and encroachment but is now managed for ecosystem restoration. We used two survey methods: an established point count method for gibbons with some modifications, and straight line transects using auditory detections. Surveys were conducted in the three main forest types prevalent at the site: high, medium, and low canopy cover secondary forests. Mean group density estimates were higher from point counts than from line transects, and tended to be higher in less degraded forests within the study site. We consider points more time efficient and reliable than transects since detectability of gibbons was higher from points per unit effort. We recommend the additional use of Distance sampling methods to account for imperfect detection and provide other recommendations to improve surveys of gibbons. We estimate that the site holds at least 6,070 and as many as 11,360 gibbons. Our results demonstrate that degraded forests can be extremely important for the conservation of agile gibbons and that efforts to protect and restore such sites could contribute significantly to the conservation of the species. © 2014 Wiley Periodicals, Inc.
Wilkes, Rebecca P; Lee, Pei-Yu A; Tsai, Yun-Long; Tsai, Chuan-Fu; Chang, Hsiu-Hui; Chang, Hsiao-Fen G; Wang, Hwa-Tang T
2015-08-01
Canine parvovirus type 2 (CPV-2), including subtypes 2a, 2b and 2c, causes an acute enteric disease in both domestic and wild animals. Rapid and sensitive diagnosis aids effective disease management at points of need (PON). A commercially available, field-deployable and user-friendly system, designed with insulated isothermal PCR (iiPCR) technology, displays excellent sensitivity and specificity for nucleic acid detection. An iiPCR method was developed for on-site detection of all circulating CPV-2 strains. Limit of detection was determined using plasmid DNA. CPV-2a, 2b and 2c strains, a feline panleukopenia virus (FPV) strain, and nine canine pathogens were tested to evaluate assay specificity. Reaction sensitivity and performance were compared with an in-house real-time PCR using serial dilutions of a CPV-2b strain and 100 canine fecal clinical samples collected from 2010 to 2014, respectively. The 95% limit of detection of the iiPCR method was 13 copies of standard DNA and detection limits for CPV-2b DNA were equivalent for iiPCR and real-time PCR. The iiPCR reaction detected CPV-2a, 2b and 2c and FPV. Non-targeted pathogens were not detected. Test results of real-time PCR and iiPCR from 99 fecal samples agreed with each other, while one real-time PCR-positive sample tested negative by iiPCR. Therefore, excellent agreement (k = 0.98) with sensitivity of 98.41% and specificity of 100% in detecting CPV-2 in feces was found between the two methods. In conclusion, the iiPCR system has potential to serve as a useful tool for rapid and accurate PON, molecular detection of CPV-2. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Montagne, Louise; Derhourhi, Mehdi; Piton, Amélie; Toussaint, Bénédicte; Durand, Emmanuelle; Vaillant, Emmanuel; Thuillier, Dorothée; Gaget, Stefan; De Graeve, Franck; Rabearivelo, Iandry; Lansiaux, Amélie; Lenne, Bruno; Sukno, Sylvie; Desailloud, Rachel; Cnop, Miriam; Nicolescu, Ramona; Cohen, Lior; Zagury, Jean-François; Amouyal, Mélanie; Weill, Jacques; Muller, Jean; Sand, Olivier; Delobel, Bruno; Froguel, Philippe; Bonnefond, Amélie
2018-05-16
Accurate detection of both copy number variations (CNVs) and point mutations is crucial for the molecular diagnosis of extreme forms of obesity, for optimal care of the patients and genetic counseling for their families. Whole-exome sequencing (WES) has benefited this molecular diagnosis considerably, but its poor ability to detect CNVs remains a major limitation. We aimed to develop a method (CoDE-seq) enabling the accurate detection of both CNVs and point mutations in one step. CoDE-seq is based on an augmented WES method, using probes distributed uniformly throughout the genome. CoDE-seq was validated in 40 patients for whom chromosomal DNA microarray data were available. CNVs and mutations were assessed in 82 children/young adults with suspected Mendelian obesity and/or intellectual disability and in their parents when available (n total = 145). CoDE-seq not only detected all of the 97 CNVs identified by chromosomal DNA microarrays but also found 84 additional CNVs, due to its better resolution. When compared to CoDE-seq and chromosomal DNA microarrays, WES failed to detect 37% and 14% of CNVs, respectively. In the 82 patients, a likely molecular diagnosis was achieved in >30% of the patients. Half of the genetic diagnoses were explained by CNVs and the other half by mutations. CoDE-seq has proven cost-efficient and highly effective, as it avoids the sequential genetic screening approaches currently used in clinical practice for the accurate detection of CNVs and point mutations. Copyright © 2018 The Authors. Published by Elsevier GmbH. All rights reserved.
Turner, Andrew; Sasse, Jurgen; Varadi, Aniko
2016-10-19
Inherited disorders of haemoglobin are the world's most common genetic diseases, resulting in significant morbidity and mortality. The large number of mutations associated with the haemoglobin beta gene (HBB) makes gene scanning by High Resolution Melting (HRM) PCR an attractive diagnostic approach. However, existing HRM-PCR assays are not able to detect all common point mutations and have only a very limited ability to detect larger gene rearrangements. The aim of the current study was to develop an HBB assay, which can be used as a screening test in highly heterogeneous populations, for detection of both point mutations and larger gene rearrangements. The assay is based on a combination of conventional HRM-PCR and a novel Gene Ratio Analysis Copy Enumeration (GRACE) PCR method. HRM-PCR was extensively optimised, which included the use of an unlabelled probe and incorporation of universal bases into primers to prevent interference from common non-pathological polymorphisms. GRACE-PCR was employed to determine HBB gene copy numbers relative to a reference gene using melt curve analysis to detect rearrangements in the HBB gene. The performance of the assay was evaluated by analysing 410 samples. A total of 44 distinct pathological genotypes were detected. In comparison with reference methods, the assay has a sensitivity of 100% and a specificity of 98%. We have developed an assay that detects both point mutations and larger rearrangements of the HBB gene. This assay is quick, sensitive, specific and cost-effective, making it suitable as an initial screening test for highly heterogeneous cohorts.
Kokoris, M; Nabavi, M; Lancaster, C; Clemmens, J; Maloney, P; Capadanno, J; Gerdes, J; Battrell, C F
2005-09-01
One current challenge facing point-of-care cancer detection is that existing methods make it difficult, time consuming and too costly to (1) collect relevant cell types directly from a patient sample, such as blood and (2) rapidly assay those cell types to determine the presence or absence of a particular type of cancer. We present a proof of principle method for an integrated, sample-to-result, point-of-care detection device that employs microfluidics technology, accepted assays, and a silica membrane for total RNA purification on a disposable, credit card sized laboratory-on-card ('lab card') device in which results are obtained in minutes. Both yield and quality of on-card purified total RNA, as determined by both LightCycler and standard reverse transcriptase amplification of G6PDH and BCR-ABL transcripts, were found to be better than or equal to accepted standard purification methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kercel, S.W.
1999-11-07
For several reasons, Bayesian parameter estimation is superior to other methods for inductively learning a model for an anticipatory system. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be removed from the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, Bayesian methods approach this ideal limit of performance more closely than other methods. These capabilities provide a strategy for addressing a major unsolved problem in pump operation: the identification of precursors of cavitation. Cavitation causes immediate degradation of pump performance and ultimate destruction of the pump. However, the most efficient point to operate a pump is just below the threshold of cavitation. It might be hoped that a straightforward method to minimize pump cavitation damage would be to simply adjust the operating point until the inception of cavitation is detected and then to slightly readjust the operating point to let the cavitation vanish. However, due to the continuously evolving state of the fluid moving through the pump, the threshold of cavitation tends to wander. What is needed is to anticipate cavitation, and this requires the detection and identification of precursor features that occur just before cavitation starts.
Segmentation of suspicious objects in an x-ray image using automated region filling approach
NASA Astrophysics Data System (ADS)
Fu, Kenneth; Guest, Clark; Das, Pankaj
2009-08-01
To accommodate the flow of commerce, cargo inspection systems require a high probability of detection and a low false alarm rate while still maintaining a minimum scan speed. Since objects of interest (high atomic-number metals) will often be heavily shielded to avoid detection, any detection algorithm must be able to identify such objects despite the shielding. Since pixels of a shielded object have a greater opacity than the shielding, we use a clustering method to classify objects in the image by pixel intensity level. We then look within each intensity-level region for sub-clusters of pixels with greater opacity than the surrounding region. A region containing an object has an enclosed-contour region (a hole) inside of it. We apply a region filling technique to fill in the hole, which represents a shielded object of potential interest. One method for region filling is seed-growing, which puts a "seed" starting point in the hole area and uses a selected structuring element to fill out that region. However, automatic seed point selection is a hard problem; it requires additional information to decide whether a pixel is within an enclosed region. Here, we propose a simple, robust method for region filling that avoids the problem of seed point selection. In our approach, we calculate the gradients Gx and Gy at each pixel of a binary image, and along each row fill in 1s between each pair of positions x1 and x2 where Gx(x1, y) = -1 and Gx(x2, y) = 1; we do the same in the y-direction. The intersection of the two results is the filled region. We give a detailed discussion of our algorithm, discuss the strengths this method has over other methods, and show results of using our method.
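The row/column fill rule described in this abstract is concrete enough to sketch. The following is a minimal illustration (not the authors' code) of gradient-pair region filling in Python with NumPy; it assumes the object mask is a 0/1 array:

```python
import numpy as np

def _span_fill(b):
    """Mark pixels lying between a 1->0 transition and the next 0->1
    transition in each row (i.e., between G = -1 and the next G = +1)."""
    out = np.zeros_like(b)
    g = np.diff(b, axis=1)                  # g[y, x] = b[y, x+1] - b[y, x]
    for y in range(b.shape[0]):
        starts = np.where(g[y] == -1)[0]    # object -> background
        ends = np.where(g[y] == 1)[0]       # background -> object
        for x1 in starts:
            later = ends[ends > x1]
            if later.size:                  # a pair exists: fill the gap
                out[y, x1 + 1 : later[0] + 1] = 1
    return out

def fill_holes(binary):
    """Gradient-pair hole filling: run the span rule along rows and
    along columns, then intersect the two results."""
    b = binary.astype(int)
    fx = _span_fill(b)        # x-direction pass
    fy = _span_fill(b.T).T    # y-direction pass (same rule on columns)
    return b | (fx & fy)      # holes are where both passes agree
```

On a 3x3 square with its center pixel removed, the intersection of the two passes recovers exactly the missing center.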
Practical Method to Identify Orbital Anomaly as Breakup Event in the Geostationary Region
2015-01-14
Table 4 summarizes the results of the origin identification of the seven detected objects, listing for each the object name, parent object, inclination vector, pinch point, and geocentric distance at the pinch point. One object labeled x15300 was... X-Y, X'-Y', and R.A.-Dec. represent the image coordinates before rotating the CCD sensor, after rotation, and the Geocentric Inertial coordinates of the object.
NASA Astrophysics Data System (ADS)
Chen, Shuwang; Sha, Zhanyou; Wang, Shuhai; Wen, Huanming
2007-12-01
Current research on brain cognition mainly seeks the activation position in the brain corresponding to a given stimulus. This study uses animals as experimental subjects and explores the stimulation response of acupuncture on the cerebral cortex. It provides a new method that can detect the activation position on the animal cerebral cortex directly by mid-to-far infrared imaging. According to the theory of local temperature states, differences in cortical temperature may be associated with the excitation of cortical nerve cells, the metabolism of local tissue, and local blood circulation. Direct detection of temperature changes on the exposed cerebral cortex is performed with mid- and far-infrared imaging, so the activation position can be ascertained; the effect of this approach is superior to that of indirect methods. After removal of the skull, the cerebral cortex of a cat was fully exposed. By observing the infrared images and measuring the temperatures of the visual cerebral cortex during acupuncture, the activation position was judged. The changes in the cortical functional sections corresponded to the acupuncture points in terms of both infrared images and temperatures. The experimental results show that the change in a cortical functional section corresponds exactly to a specific acupuncture point.
Indicator saturation: a novel approach to detect multiple breaks in geodetic time series.
NASA Astrophysics Data System (ADS)
Jackson, L. P.; Pretis, F.; Williams, S. D. P.
2016-12-01
Geodetic time series can record long term trends, quasi-periodic signals at a variety of time scales from days to decades, and sudden breaks due to natural or anthropogenic causes. The causes of breaks range from instrument replacement to earthquakes to unknown (i.e. no attributable cause). Furthermore, breaks can be permanent or short-lived and range at least two orders of magnitude in size (mm to 100's mm). To account for this range of possible signal-characteristics requires a flexible time series method that can distinguish between true and false breaks, outliers and time-varying trends. One such method, Indicator Saturation (IS) comes from the field of econometrics where analysing stochastic signals in these terms is a common problem. The IS approach differs from alternative break detection methods by considering every point in the time series as a break until it is demonstrated statistically that it is not. A linear model is constructed with a break function at every point in time, and all but statistically significant breaks are removed through a general-to-specific model selection algorithm for more variables than observations. The IS method is flexible because it allows multiple breaks of different forms (e.g. impulses, shifts in the mean, and changing trends) to be detected, while simultaneously modelling any underlying variation driven by additional covariates. We apply the IS method to identify breaks in a suite of synthetic GPS time series used for the Detection of Offsets in GPS Experiments (DOGEX). We optimise the method to maximise the ratio of true-positive to false-positive detections, which improves estimates of errors in the long term rates of land motion currently required by the GPS community.
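As a rough illustration of the "every point is a break until rejected" idea, the sketch below scores every interior point of a series with a two-sample t-statistic for a mean shift and keeps significant local maxima. This is a much-simplified stand-in for the full IS general-to-specific selection, with an assumed critical value:

```python
import numpy as np

def mean_shift_breaks(y, tcrit=4.0):
    """Score every interior point as a candidate mean-shift break and
    keep significant local maxima of the two-sample t-statistic; a
    much-simplified stand-in for Indicator Saturation's principle of
    treating every point as a break until statistically rejected."""
    y = np.asarray(y, float)
    n = len(y)
    tstat = np.zeros(n)
    for tau in range(2, n - 2):
        a, b = y[:tau], y[tau:]
        # pooled standard error of the difference in means
        s2 = (a.var(ddof=1) * (len(a) - 1) + b.var(ddof=1) * (len(b) - 1)) / (n - 2)
        se = np.sqrt(s2 * (1 / len(a) + 1 / len(b)))
        tstat[tau] = abs(a.mean() - b.mean()) / se if se > 0 else np.inf
    return [tau for tau in range(2, n - 2)
            if tstat[tau] > tcrit
            and tstat[tau] >= tstat[tau - 1] and tstat[tau] >= tstat[tau + 1]]
```

On a series with a single clean jump this recovers the break index; on pure noise it returns nothing, which is the true-positive/false-positive trade-off the abstract optimises.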
SCOUT: simultaneous time segmentation and community detection in dynamic networks
Hulovatyy, Yuriy; Milenković, Tijana
2016-01-01
Many evolving complex real-world systems can be modeled via dynamic networks. An important problem in dynamic network research is community detection, which finds groups of topologically related nodes. Typically, this problem is approached by assuming either that each time point has a distinct community organization or that all time points share a single community organization. The reality likely lies between these two extremes. To find the compromise, we consider community detection in the context of the problem of segment detection, which identifies contiguous time periods with consistent network structure. Consequently, we formulate a combined problem of segment community detection (SCD), which simultaneously partitions the network into contiguous time segments with consistent community organization and finds this community organization for each segment. To solve SCD, we introduce SCOUT, an optimization framework that explicitly considers both segmentation quality and partition quality. SCOUT addresses limitations of existing methods that can be adapted to solve SCD, which consider only one of segmentation quality or partition quality. In a thorough evaluation, SCOUT outperforms the existing methods in terms of both accuracy and computational complexity. We apply SCOUT to biological network data to study human aging. PMID:27881879
Different strategies for detection of HbA1c emphasizing on biosensors and point-of-care analyzers.
Kaur, Jagjit; Jiang, Cheng; Liu, Guozhen
2018-06-07
Measurement of glycosylated hemoglobin (HbA1c) is a gold standard procedure for assessing long-term glycemic control in individuals with diabetes mellitus, as it gives a stable and reliable measure of blood glucose levels over a period of 90-120 days. HbA1c is formed by the non-enzymatic glycation of the terminal valine of hemoglobin. The analysis of HbA1c tends to be complicated because there are more than 300 different assay methods for measuring it, which leads to variations in the values reported from the same samples. Therefore, standardization of detection methods is recommended. This review outlines current research on developing assays, including biosensors, for the detection of HbA1c. The pros and cons of different techniques for measuring HbA1c are outlined. The performance of current point-of-care HbA1c analyzers available on the market is also compared and discussed. Future perspectives for HbA1c detection and diabetes management are proposed. Copyright © 2018 Elsevier B.V. All rights reserved.
Accurately estimating PSF with straight lines detected by Hough transform
NASA Astrophysics Data System (ADS)
Wang, Ruichen; Xu, Liangpeng; Fan, Chunxiao; Li, Yong
2018-04-01
This paper presents an approach to estimating the point spread function (PSF) from low resolution (LR) images. Existing techniques usually rely on accurate detection of the ending points of the profile normal to edges. In practice, however, it is often a great challenge to accurately localize edge profiles in a LR image, which leads to a poor estimate of the PSF of the lens that took the LR image. For precise estimation, this paper proposes first estimating a 1-D PSF kernel from straight lines, and then robustly obtaining the 2-D PSF from the 1-D kernel by least squares techniques and random sample consensus. The Canny operator is applied to the LR image to obtain edges, and the Hough transform is then utilized to extract straight lines of all orientations. Estimating the 1-D PSF kernel from straight lines effectively alleviates the influence of inaccurate edge detection on PSF estimation. The proposed method is investigated on both natural and synthetic images for estimating the PSF. Experimental results show that the proposed method outperforms the state-of-the-art and does not rely on accurate edge detection.
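The Hough voting step used here can be sketched compactly. The following is a minimal pure-NumPy Hough transform, a stand-in for the Canny + Hough stage of the paper; the angular discretization and peak extraction are illustrative choices, not the authors' parameters:

```python
import numpy as np

def hough_lines(edges, n_theta=180, top_k=1):
    """Minimal Hough transform: accumulate votes for (rho, theta) line
    parameters from a binary edge image and return the strongest lines.
    rho is in pixels, theta in whole degrees (0..n_theta-1)."""
    h, w = edges.shape
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(h, w)))            # max possible |rho|
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    ys, xs = np.nonzero(edges)
    cols = np.arange(n_theta)
    for x, y in zip(xs, ys):
        # each edge pixel votes for every line passing through it
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, cols] += 1
    lines = []
    for _ in range(top_k):
        r, t = np.unravel_index(acc.argmax(), acc.shape)
        lines.append((int(r) - diag, int(t)))      # (rho, theta in degrees)
        acc[r, t] = 0                              # suppress the found peak
    return lines
```

A horizontal edge row at y = 5 should yield the line rho = 5 at theta near 90 degrees.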
Photogrammetry research for FAST eleven-meter reflector panel surface shape measurement
NASA Astrophysics Data System (ADS)
Zhou, Rongwei; Zhu, Lichun; Li, Weimin; Hu, Jingwen; Zhai, Xuebing
2010-10-01
In order to design and manufacture the measuring equipment for the Five-hundred-meter Aperture Spherical Radio Telescope (FAST) active reflector, three measurement tasks were undertaken: surface shape measurement of each reflector panel, static measurement of the whole neutral spherical network of nodes, and real-time dynamic measurement of cable network deformation. In the implementation process of the FAST, reflector panel surface shape inspection was completed before installation of the eleven-meter reflector panels. A binocular vision system was constructed based on binocular stereo vision methods from machine vision, and the eleven-meter reflector panel surface shape was measured photogrammetrically. The cameras were calibrated with feature points; under a linear camera model, a lighted spot array was used as the standard calibration pattern, and the intrinsic and extrinsic parameters were acquired. Images from the two cameras were collected for digital image processing and analysis; feature points were extracted with a characteristic point detection algorithm and matched based on the epipolar constraint. Three-dimensional coordinates of the feature points were reconstructed, and the reflector panel surface shape was established by curve and surface fitting. The surface shape error of the reflector panel was calculated to realize automatic measurement. The results show that the unit reflector panel surface inspection accuracy was 2.30 mm, within the required standard deviation of 5.00 mm. Compared with the requirement of reflector panel machining precision, photogrammetry has fine precision and operational feasibility for eleven-meter reflector panel surface shape measurement for FAST.
Aerial Images and Convolutional Neural Network for Cotton Bloom Detection.
Xu, Rui; Li, Changying; Paterson, Andrew H; Jiang, Yu; Sun, Shangpeng; Robertson, Jon S
2017-01-01
Monitoring flower development can provide useful information for production management, estimating yield and selecting specific genotypes of crops. The main goal of this study was to develop a methodology to detect and count cotton flowers, or blooms, using color images acquired by an unmanned aerial system. The aerial images were collected from two test fields in 4 days. A convolutional neural network (CNN) was designed and trained to detect cotton blooms in raw images, and their 3D locations were calculated using the dense point cloud constructed from the aerial images with the structure from motion method. The quality of the dense point cloud was analyzed and plots with poor quality were excluded from data analysis. A constrained clustering algorithm was developed to register the same bloom detected from different images based on the 3D location of the bloom. The accuracy and incompleteness of the dense point cloud were analyzed because they affected the accuracy of the 3D location of the blooms and thus the accuracy of the bloom registration result. The constrained clustering algorithm was validated using simulated data, showing good efficiency and accuracy. The bloom count from the proposed method was comparable with the number counted manually with an error of -4 to 3 blooms for the field with a single plant per plot. However, more plots were underestimated in the field with multiple plants per plot due to hidden blooms that were not captured by the aerial images. The proposed methodology provides a high-throughput method to continuously monitor the flowering progress of cotton.
An Improved Image Matching Method Based on Surf Algorithm
NASA Astrophysics Data System (ADS)
Chen, S. J.; Zheng, S. Z.; Xu, Z. G.; Guo, C. C.; Ma, X. L.
2018-04-01
Many state-of-the-art image matching methods based on feature matching have been widely studied in the remote sensing field. These feature matching methods achieve high operating efficiency but suffer from low accuracy and robustness. This paper proposes an improved image matching method based on the SURF algorithm. The proposed method introduces a color invariant transformation, information entropy theory, and a series of constraint conditions to improve feature point detection and matching accuracy. First, the color invariant transformation is applied to the two matching images to retain more color information during the matching process, and information entropy theory is used to extract the most informative content of the two images. Then the SURF algorithm is applied to detect and describe points in the images. Finally, constraint conditions including Delaunay triangulation construction, a similarity function, and a projective invariant are employed to eliminate mismatches and improve matching precision. The proposed method has been validated on remote sensing images, and the results demonstrate its high precision and robustness.
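One standard constraint check used to cut mismatches in SURF-style pipelines is the nearest-neighbour ratio test; the paper's specific constraints (Delaunay triangulation, projective invariants) are more involved and are not reproduced here. A minimal sketch over precomputed descriptor arrays:

```python
import numpy as np

def ratio_test_matches(desc1, desc2, ratio=0.8):
    """Match feature descriptors by nearest neighbour in Euclidean space,
    keeping a match only when the best distance is clearly smaller than
    the second best (Lowe's ratio test). desc1 and desc2 are arrays of
    shape (n_points, descriptor_length)."""
    # full pairwise distance matrix (fine for small descriptor sets)
    d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
    matches = []
    for i, row in enumerate(d):
        order = np.argsort(row)
        best, second = order[0], order[1]
        if row[best] < ratio * row[second]:   # unambiguous nearest neighbour
            matches.append((i, int(best)))
    return matches
```

When a descriptor is equally close to two candidates, the test rejects the match rather than guess, which is exactly the mismatch-suppression behaviour such constraints aim for.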
A comparison of five approaches to measurement of anatomic knee alignment from radiographs.
McDaniel, G; Mitchell, K L; Charles, C; Kraus, V B
2010-02-01
The recent recognition of the correlation of the hip-knee-ankle angle (HKA) with femur-tibia angle (FTA) on a standard knee radiograph has led to the increasing inclusion of FTA assessments in OA studies due to its clinical relevance, cost effectiveness and minimal radiation exposure. Our goal was to investigate the performance metrics of currently used methods of FTA measurement to determine whether a specific protocol could be recommended based on these results. Inter- and intra-rater reliability of FTA measurements were determined by intraclass correlation coefficient (ICC) of two independent analysts. Minimal detectable differences were determined and the correlation of FTA and HKA was analyzed by linear regression. Differences among methods of measuring HKA were assessed by ANOVA. All five methods of FTA measurement demonstrated high precision by inter- and intra-rater reproducibility (ICCs ≥ 0.93). All five methods displayed good accuracy, but after correction for the offset of FTA from HKA, the femoral notch landmark method was the least accurate. However, the methods differed according to their minimal detectable differences; the FTA methods utilizing the center of the base of the tibial spines or the center of the tibial plateau as knee center landmarks yielded the smallest minimal detectable differences (1.25 degrees and 1.72 degrees, respectively). All methods of FTA were highly reproducible, but varied in their accuracy and sensitivity to detect meaningful differences. Based on these parameters we recommend standardizing measurement angles with vertices at the base of the tibial spines or the center of the tibia and comparing single-point and two-point methods in larger studies. Copyright 2009 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Change-point detection of induced and natural seismicity
NASA Astrophysics Data System (ADS)
Fiedler, B.; Holschneider, M.; Zoeller, G.; Hainzl, S.
2016-12-01
Earthquake rates are influenced by tectonic stress buildup, earthquake-induced stress changes, and transient aseismic sources. While the first two sources can be modeled well because the source is known, transient aseismic processes are more difficult to detect. However, detection of the associated changes in earthquake activity is of great interest, because it might help to identify natural aseismic deformation patterns (such as slow slip events) and the occurrence of induced seismicity related to human activities. We develop a Bayesian approach to detect change points in seismicity data that are modeled by Poisson processes. By means of a likelihood-ratio test, we assess the significance of the change in intensity. The model is also extended to spatiotemporal data to detect the area of the transient changes. The method is first tested on synthetic data and then applied to observational data from the central US and the Bardarbunga volcano in Iceland.
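The Poisson change-point test can be illustrated with a simple likelihood-ratio scan. The sketch below is an illustration, not the authors' Bayesian implementation: it compares a single-rate Poisson model against a two-rate model split at each interior point; the log(x!) terms cancel in the ratio and are dropped:

```python
import math

def poisson_lrt(counts, split):
    """Log-likelihood-ratio statistic for a single rate change at `split`
    in a sequence of per-interval event counts under a Poisson model."""
    def loglik(xs):
        lam = sum(xs) / len(xs)          # maximum-likelihood rate
        if lam == 0:
            return 0.0
        # sum of x*log(lam) - lam; the log(x!) terms cancel in the ratio
        return sum(x * math.log(lam) - lam for x in xs)
    full = loglik(counts)
    split_ll = loglik(counts[:split]) + loglik(counts[split:])
    return 2 * (split_ll - full)

def best_change_point(counts):
    """Scan all interior splits, return (best split index, LRT statistic)."""
    stats = {s: poisson_lrt(counts, s) for s in range(1, len(counts))}
    s = max(stats, key=stats.get)
    return s, stats[s]
```

Under the no-change null, the statistic is roughly chi-squared with one degree of freedom, so values well above 3.84 flag a significant rate change at the 5% level (before accounting for the scan over splits).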
Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing
2014-06-16
Most railway accidents happen at railway crossings. Therefore, how to detect humans or objects present in the risk area of a railway crossing and thus prevent accidents are important tasks. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) we use a terrain drop compensation (TDC) technique to solve the problem of the concavity of railway crossings; (2) we use a linear regression technique to predict the position and length of an object from image processing; (3) we have developed a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) to obtain the ground points of the object. In addition, image preprocessing is also applied to filter out the noise and successfully improve the object detection. From the experimental results, it is demonstrated that our scheme is an effective and corrective method for the detection of railway crossing risk areas.
A multiscale curvature algorithm for classifying discrete return LiDAR in forested environments
Jeffrey S. Evans; Andrew T. Hudak
2007-01-01
One prerequisite to the use of light detection and ranging (LiDAR) across disciplines is differentiating ground from nonground returns. The objective was to automatically and objectively classify points within unclassified LiDAR point clouds, with few model parameters and minimal postprocessing. Presented is an automated method for classifying LiDAR returns as ground...
Dziedzinska, Radka
2017-01-01
The main reasons to improve the detection of Mycobacterium avium subsp. paratuberculosis (MAP) are animal health and monitoring of MAP entering the food chain via meat, milk, and/or dairy products. Different approaches can be used for the detection of MAP, but the use of magnetic separation especially in conjunction with PCR as an end-point detection method has risen in past years. However, the extraction of DNA which is a crucial step prior to PCR detection can be complicated due to the presence of inhibitory substances. Magnetic separation methods involving either antibodies or peptides represent a powerful tool for selective separation of target bacteria from other nontarget microorganisms and inhibitory sample components. These methods enable the concentration of pathogens present in the initial matrix into smaller volume and facilitate the isolation of sufficient quantities of pure DNA. The purpose of this review was to summarize the methods based on the magnetic separation approach that are currently available for the detection of MAP in a broad range of matrices. PMID:28642876
Operational analysis for the drug detection problem
NASA Astrophysics Data System (ADS)
Hoopengardner, Roger L.; Smith, Michael C.
1994-10-01
New techniques and sensors to identify the molecular, chemical, or elemental structures unique to drugs are being developed under several national programs. However, the challenge faced by U.S. drug enforcement and Customs officials goes far beyond the simple technical capability to detect an illegal drug. Entry points into the U.S. include ports, border crossings, and airports where cargo ships, vehicles, and aircraft move huge volumes of freight. Current technology and personnel are able to physically inspect only a small fraction of the entering cargo containers. The complexity of how to best utilize new technology to aid the detection process, and yet not adversely affect the processing of vehicles and time-sensitive cargo, is the challenge faced by these officials. This paper describes an ARPA sponsored initiative to develop a simple, yet useful, method for examining the operational consequences of utilizing various procedures and technologies in combination to achieve an 'acceptable' level of detection probability. Since Customs entry points into the U.S. vary from huge seaports to a one-lane highway checkpoint on the Canadian or Mexican border, no one system can possibly be right for all points. This approach can examine alternative concepts for using different techniques/systems for different types of entry points. Operational measures reported include the average time to process vehicles and containers, the average and maximum numbers in the system at any time, and the utilization of inspection teams. The method is implemented via a PC-based simulation written in the GPSS-PC language. Inputs to the simulation model are (1) the individual detection probabilities and false positive rates for each detection technology or procedure, (2) the inspection time for each procedure, (3) the system configuration, and (4) the physical distance between inspection stations. The model offers on-line graphics to examine effects as the model runs.
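Before any discrete-event simulation, the way individual detection probabilities, false-alarm rates, and inspection times combine in a serial inspection chain can be sketched analytically. In the sketch below, a vehicle is referred to the next (slower, more accurate) stage only if the current stage alarms; all numbers are illustrative assumptions, not values from the paper:

```python
def serial_inspection(stages, prevalence):
    """Performance of a serial inspection chain. Each stage is a tuple
    (p_detect, p_false_alarm, minutes). Returns the overall detection
    probability, overall false-alarm probability, and expected
    inspection minutes per vehicle."""
    exp_time = 0.0
    reach_bad, reach_clean = 1.0, 1.0    # P(each vehicle type reaches this stage)
    for p_detect, p_false_alarm, minutes in stages:
        share = prevalence * reach_bad + (1 - prevalence) * reach_clean
        exp_time += share * minutes      # expected minutes spent at this stage
        reach_bad *= p_detect            # contraband continues only if alarmed on
        reach_clean *= p_false_alarm     # clean vehicle continues only on false alarm
    return reach_bad, reach_clean, exp_time
```

With a fast, coarse primary screen and a slow, accurate secondary one, most vehicles incur only the fast stage, which is the operational trade-off the paper's simulation explores in more detail.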
Background suppression of infrared small target image based on inter-frame registration
NASA Astrophysics Data System (ADS)
Ye, Xiubo; Xue, Bindang
2018-04-01
We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show our method serves as an effective preliminary step for target detection.
2013-09-01
of sperm whales. Although the methods developed in those papers demonstrate feasibility, they are not applicable to... The approach counts echolocation clicks (Marques et al., 2009) instead of detecting individual animals or groups of animals; these cue-counting methods will not be specifically...
Duplicate document detection in DocBrowse
NASA Astrophysics Data System (ADS)
Chalana, Vikram; Bruce, Andrew G.; Nguyen, Thien
1998-04-01
Duplicate documents are frequently found in large databases of digital documents, such as those found in digital libraries or in the government declassification effort. Efficient duplicate document detection is important not only to allow querying for similar documents, but also to filter out redundant information in large document databases. We have designed three different algorithms to identify duplicate documents. The first algorithm is based on features extracted from the textual content of a document, the second is based on wavelet features extracted from the document image itself, and the third is a combination of the first two. These algorithms are integrated within the DocBrowse system for information retrieval from document images, which is currently under development at MathSoft. DocBrowse supports duplicate document detection by allowing (1) automatic filtering to hide duplicate documents, and (2) ad hoc querying for similar or duplicate documents. We have tested the duplicate document detection algorithms on 171 documents and found that the text-based method has an average 11-point precision of 97.7 percent while the image-based method has an average 11-point precision of 98.9 percent. However, in general, the text-based method performs better when the document contains enough high-quality machine-printed text while the image-based method performs better when the document contains little or no quality machine-readable text.
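As an illustration of the text-feature route (the actual DocBrowse features are not specified in this abstract), duplicate candidates can be found by thresholding the cosine similarity between term-frequency vectors:

```python
from collections import Counter
import math
import re

def cosine_sim(doc_a, doc_b):
    """Cosine similarity between term-frequency vectors of two texts;
    a minimal sketch of text-based near-duplicate scoring."""
    va = Counter(re.findall(r"\w+", doc_a.lower()))
    vb = Counter(re.findall(r"\w+", doc_b.lower()))
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def find_duplicates(docs, threshold=0.9):
    """Return index pairs of documents whose similarity exceeds threshold."""
    return [(i, j) for i in range(len(docs)) for j in range(i + 1, len(docs))
            if cosine_sim(docs[i], docs[j]) > threshold]
```

The threshold trades off the same precision/recall behaviour the abstract reports as 11-point precision; 0.9 is an illustrative value.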
NASA Astrophysics Data System (ADS)
Doko, Tomoko; Chen, Wenbo; Higuchi, Hiroyoshi
2016-06-01
Satellite tracking technology has been used to reveal the migration patterns and flyways of migratory birds. In general, bird migration can be classified by migration status: the wintering period, spring migration, breeding period, and autumn migration. To determine the migration status, the periods of these statuses must be determined individually, but there has been no objective method to define a 'threshold date' at which an individual bird changes its status. The research objective is to develop an effective and objective method to determine threshold dates of migration status from satellite-tracked data. The developed method was named the MATCHED (Migratory Analytical Time Change Easy Detection) method. To demonstrate the method, data acquired from satellite-tracked Tundra Swans were used. The MATCHED method comprises six steps: 1) dataset preparation, 2) time frame creation, 3) automatic identification, 4) visualization of change points, 5) interpretation, and 6) manual correction. Accuracy was tested. In general, the MATCHED method proved powerful for identifying the change points between migration statuses as well as stopovers. Nevertheless, identifying "exact" threshold dates remains challenging. Limitations and applications of this method are discussed.
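A much-simplified sketch of the automatic-identification step might smooth daily displacement and report the days on which the bird switches between stationary (wintering/breeding) and migrating. The window and movement threshold below are illustrative assumptions, not values from the paper:

```python
def migration_status_changes(daily_km, window=3, moving_km=50.0):
    """Return indices (days) where the smoothed daily displacement
    crosses a movement threshold, i.e. candidate 'threshold dates'
    between stationary and migrating status."""
    n = len(daily_km)
    # trailing moving average of the last `window` days
    smoothed = []
    for i in range(n):
        chunk = daily_km[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    moving = [s >= moving_km for s in smoothed]
    # a status change is any day whose moving/stationary flag flips
    return [i for i in range(1, n) if moving[i] != moving[i - 1]]
```

With no smoothing (window=1) the detected dates bracket the high-displacement run exactly; smoothing shifts the trailing edge later, which is one reason the full method includes a manual-correction step.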
Point of Care- A Novel Approach to Periodontal Diagnosis-A Review
Nayak, Prathibha Anand; Rana, Shivendra
2017-01-01
Periodontal disease, one of the most prevalent oral diseases, is characterized by gingival inflammation and periodontal tissue destruction. Diagnosing this disease is challenging for clinicians as the disease process is discontinuous and shows periods of exacerbation and remission. Traditional diagnostic methods basically tell about past tissue destruction, so new diagnostic methods are required that can detect the active state of the disease, determine future progression, and estimate the response to therapy, thereby helping in the better clinical management of the patient. Both saliva and gingival crevicular fluid (GCF) are believed to be reliable media in which to detect the biomarkers that play a pivotal role in measuring disease activity. With these observations in mind, rapid chairside tests, called Point of Care (POC) diagnostics, have been developed to diagnose periodontal disease; they simplify diagnosis and help improve prognosis. This review article highlights the biomarkers used in diagnosis and throws light on the various available point of care diagnostic devices. PMID:28969294
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proudnikov, D.; Kirillov, E.; Chumakov, K.
2000-01-01
This paper describes use of a new technology of hybridization with a micro-array of immobilized oligonucleotides for detection and quantification of neurovirulent mutants in Oral Poliovirus Vaccine (OPV). We used a micro-array consisting of three-dimensional gel-elements containing all possible hexamers (total of 4096 probes). Hybridization of fluorescently labelled viral cDNA samples with such microchips resulted in a pattern of spots that was registered and quantified by a computer-linked CCD camera, so that the sequence of the original cDNA could be deduced. The method could reliably identify single point mutations, since each of them affected fluorescence intensity of 12 micro-array elements. Micro-array hybridization of DNA mixtures with varying contents of point mutants demonstrated that the method can detect as little as 10% of revertants in a population of vaccine virus. This new technology should be useful for quality control of live viral vaccines, as well as for other applications requiring identification and quantification of point mutations.
Fuzzy pulmonary vessel segmentation in contrast enhanced CT data
NASA Astrophysics Data System (ADS)
Kaftan, Jens N.; Kiraly, Atilla P.; Bakai, Annemarie; Das, Marco; Novak, Carol L.; Aach, Til
2008-03-01
Pulmonary vascular tree segmentation has numerous applications in medical imaging and computer-aided diagnosis (CAD), including detection and visualization of pulmonary emboli (PE), improved lung nodule detection, and quantitative vessel analysis. We present a novel approach to pulmonary vessel segmentation based on a fuzzy segmentation concept, combining the strengths of both threshold and seed point based methods. The lungs of the original image are first segmented and a threshold-based approach identifies core vessel components with a high specificity. These components are then used to automatically identify reliable seed points for a fuzzy seed point based segmentation method, namely fuzzy connectedness. The output of the method consists of the probability of each voxel belonging to the vascular tree. Hence, our method provides the possibility to adjust the sensitivity/specificity of the segmentation result a posteriori according to application-specific requirements, through definition of a minimum vessel-probability required to classify a voxel as belonging to the vascular tree. The method has been evaluated on contrast-enhanced thoracic CT scans from clinical PE cases and demonstrates overall promising results. For quantitative validation we compare the segmentation results to randomly selected, semi-automatically segmented sub-volumes and present the resulting receiver operating characteristic (ROC) curves. Although we focus on contrast enhanced chest CT data, the method can be generalized to other regions of the body as well as to different imaging modalities.
Arata, Hideyuki; Komatsu, Hiroshi; Hosokawa, Kazuo; Maeda, Mizuo
2012-01-01
Detection of microRNAs, small noncoding single-stranded RNAs, is one of the key topics in the new generation of cancer research because cancer in the human body can be detected or even classified by microRNA detection. This report shows rapid and sensitive microRNA detection using a power-free microfluidic device, which is driven by degassed poly(dimethylsiloxane), thus eliminating the need for an external power supply. MicroRNA is detected by sandwich hybridization, and the signal is amplified by laminar flow-assisted dendritic amplification. This method allows us to detect microRNA of specific sequences at a limit of detection of 0.5 pM from a 0.5 µL sample solution with a detection time of 20 min. Together with the advantages of self-reliance of this device, this method might contribute substantially to future point-of-care early-stage cancer diagnosis.
Method for improving the limit of detection in a data signal
Synovec, Robert E.; Yueng, Edward S.
1989-10-17
A method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal.
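A minimal numerical sketch of the idea, using synthetic data rather than the patent's signals, and assuming a signal-free region is available for baseline estimation: centering the noise about zero and integrating lets the correlated analytical signal accumulate while the uncorrelated noise partially cancels.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
noise = rng.normal(0.0, 1.0, n)       # uncorrelated experimental noise
signal = np.zeros(n)
signal[400:600] = 0.5                 # weak analytical peak, raw S/N = 0.5

data = signal + noise
data = data - data[:300].mean()       # adjust baseline: center noise about zero
integ = np.cumsum(data)               # integrate, preserving every point

# Peak height relative to noise, before and after integration
snr_raw = data[400:600].mean() / data[:300].std()
snr_int = (integ[599] - integ[399]) / integ[:300].std()
print(snr_raw < snr_int)              # integration improves detectability: True
```

The integrated signal grows linearly across the peak while the integrated noise grows only like the square root of the number of points, which is the source of the improved limit of detection.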
Puelacher, Christian; Wagener, Max; Abächerli, Roger; Honegger, Ursina; Lhasam, Nundsin; Schaerli, Nicolas; Prêtre, Gil; Strebel, Ivo; Twerenbold, Raphael; Boeddinghaus, Jasper; Nestelberger, Thomas; Rubini Giménez, Maria; Hillinger, Petra; Wildi, Karin; Sabti, Zaid; Badertscher, Patrick; Cupa, Janosch; Kozhuharov, Nikola; du Fay de Lavallaz, Jeanne; Freese, Michael; Roux, Isabelle; Lohrmann, Jens; Leber, Remo; Osswald, Stefan; Wild, Damian; Zellweger, Michael J; Mueller, Christian; Reichlin, Tobias
2017-07-01
Exercise ECG stress testing is the most widely available method for evaluation of patients with suspected myocardial ischemia. Its major limitation is the relatively poor accuracy of ST-segment changes regarding ischemia detection. Little is known about the optimal method to assess ST-deviations. A total of 1558 consecutive patients undergoing bicycle exercise stress myocardial perfusion imaging (MPI) were enrolled. Presence of inducible myocardial ischemia was adjudicated using MPI results. The diagnostic value of ST-deviations for detection of exercise-induced myocardial ischemia was systematically analyzed 1) for each individual lead, 2) at three different intervals after the J-point (J+40ms, J+60ms, J+80ms), and 3) at different time points during the test (baseline, maximal workload, 2min into recovery). Exercise-induced ischemia was detected in 481 (31%) patients. The diagnostic accuracy of ST-deviations was highest at +80ms after the J-point, and at 2min into recovery. At this point, ST-amplitude showed an AUC of 0.63 (95% CI 0.59-0.66) for the best-performing lead I. The combination of ST-amplitude and ST-slope in lead I did not increase the AUC. Lead I reached a sensitivity of 37% and a specificity of 83%, with similar sensitivity to manual ECG analysis (34%, p=0.31) but lower specificity (90%, p<0.001). When using ECG stress testing for evaluation of patients with suspected myocardial ischemia, the diagnostic accuracy of ST-deviations is highest when evaluated at +80ms after the J-point, and at 2min into recovery. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ofek, Eran O.; Zackay, Barak
2018-04-01
Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.
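Following the abstract's description, the Neyman-Pearson statistic for a known template s on a Poisson background b amounts to correlating the counts with ln(1 + αs/b). A toy sketch (the template, counts, and background values below are invented for illustration):

```python
import numpy as np

def poisson_detection_score(counts, psf, bkg, alpha=1.0):
    """Log-likelihood-ratio score for a template `psf` with flux `alpha`
    on a constant Poisson background `bkg`:
    score = sum_i n_i * ln(1 + alpha*s_i/b) - alpha * sum_i s_i."""
    kernel = np.log1p(alpha * psf / bkg)
    return float(np.sum(counts * kernel) - alpha * psf.sum())

# 3x3 Gaussian-like template on a background of 0.1 counts/pixel
psf = np.array([[0.05, 0.1, 0.05],
                [0.10, 0.4, 0.10],
                [0.05, 0.1, 0.05]])
src = np.array([[0, 1, 0],
                [1, 5, 1],
                [0, 1, 0]])          # counts with a source present
empty = np.zeros((3, 3))             # counts with no source
s_src = poisson_detection_score(src, psf, bkg=0.1, alpha=10)
s_bkg = poisson_detection_score(empty, psf, bkg=0.1, alpha=10)
print(s_src > s_bkg)                 # source region scores higher: True
```

Note how the kernel ln(1 + αs/b) differs from the PSF itself: in the low-background limit it up-weights the template wings less aggressively, which is why plain PSF filtering is sub-optimal for low-count data.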
Reconstruction of Building Outlines in Dense Urban Areas Based on LIDAR Data and Address Points
NASA Astrophysics Data System (ADS)
Jarzabek-Rychard, M.
2012-07-01
The paper presents a comprehensive method for automated extraction and delineation of building outlines in densely built-up areas. A novel aspect of the outline reconstruction is the use of geocoded building address points, which provide information about building location and thus greatly reduce task complexity. The reconstruction process is executed on 3D point clouds acquired by an airborne laser scanner. The method consists of three steps: building detection, delineation, and contour refinement. The algorithm is tested against a data set covering an old market town and its surroundings. The results are discussed and evaluated by comparison with reference cadastral data.
Acoustic method respiratory rate monitoring is useful in patients under intravenous anesthesia.
Ouchi, Kentaro; Fujiwara, Shigeki; Sugiyama, Kazuna
2017-02-01
Respiratory depression can occur during intravenous general anesthesia without tracheal intubation. A new acoustic method for respiratory rate monitoring, RRa ® (Masimo Corp., Tokyo, Japan), has been reported to show good reliability in post-anesthesia care and emergency units. The purpose of this study was to investigate the reliability of the acoustic method for measurement of respiratory rate during intravenous general anesthesia, as compared with capnography. Patients with dental anxiety undergoing dental treatment under intravenous anesthesia without tracheal intubation were enrolled in this study. Respiratory rate was recorded every 30 s using the acoustic method and capnography, and the detectability of respiratory rate was investigated for both methods. This study used a cohort design. Of 1953 recorded respiratory rate data points, the number detected by the acoustic method (1884, 96.5 %) was significantly higher than that by capnography (1682, 86.1 %) (P < 0.0001). In the intraoperative period, there was a significant difference in the LOA (95 % limits of agreement between the difference and average of the two methods)/ULLOA (under the lower limit of agreement) with respect to use or non-use of a dental air turbine (P < 0.0001). Compared with capnography, the acoustic method is useful for continuous monitoring of respiratory rate in spontaneously breathing subjects undergoing dental procedures under intravenous general anesthesia. However, the acoustic method might not detect respiration accurately when a dental air turbine is in use.
Rigid shape matching by segmentation averaging.
Wang, Hongzhi; Oliensis, John
2010-04-01
We use segmentations to match images by shape. The new matching technique does not require point-to-point edge correspondence and is robust to small shape variations and spatial shifts. To address the unreliability of segmentations computed bottom-up, we give a closed form approximation to an average over all segmentations. Our method has many extensions, yielding new algorithms for tracking, object detection, segmentation, and edge-preserving smoothing. For segmentation, instead of a maximum a posteriori approach, we compute the "central" segmentation minimizing the average distance to all segmentations of an image. For smoothing, instead of smoothing images based on local structures, we smooth based on the global optimal image structures. Our methods for segmentation, smoothing, and object detection perform competitively, and we also show promising results in shape-based tracking.
Products recognition on shop-racks from local scale-invariant features
NASA Astrophysics Data System (ADS)
Zawistowski, Jacek; Kurzejamski, Grzegorz; Garbat, Piotr; Naruniec, Jacek
2016-04-01
This paper presents a system designed for multi-object detection purposes and adjusted for the application of product search on market shelves. The system uses well-known binary keypoint detection algorithms for finding characteristic points in the image. One of the main ideas is object recognition based on the Implicit Shape Model method. The authors propose many improvements to the algorithm. Originally, fiducial points are matched with a very simple function, which limits the number of object parts that can be successfully separated; various methods of classification may be validated in order to achieve higher performance. Such an extension implies research on a training procedure able to deal with many object categories. The proposed solution opens new possibilities for many algorithms demanding fast and robust multi-object recognition.
NASA Astrophysics Data System (ADS)
Wang, Jinxia; Dou, Aixia; Wang, Xiaoqing; Huang, Shusong; Yuan, Xiaoxiang
2016-11-01
Compared to remote sensing imagery, post-earthquake airborne Light Detection And Ranging (LiDAR) point cloud data contain high-precision three-dimensional information on earthquake damage, which can improve the accuracy of identifying destroyed buildings. After an earthquake, however, damaged buildings show so many different characteristics that the most commonly used pre-processing methods cannot distinguish between tree points and damaged-building points. In this study, we analyse the number of returns per pulse for tree and damaged-building point clouds and explore methods to distinguish between them. We propose a new method that searches a neighbourhood of a certain size and calculates the ratio (R) of points whose number of returns per pulse is greater than 1 to the total number of neighbourhood points, in order to separate trees from buildings. We select point clouds of typical undamaged buildings, collapsed buildings, and trees as samples, by human-computer interaction, from airborne LiDAR data acquired after the 2010 MW 7.0 Haiti earthquake. Testing yields the R-value used to distinguish between trees and buildings, and the R-value is then applied to the test areas. The experimental results show that the proposed method can effectively distinguish building points (undamaged and damaged) from tree points, but it is limited in areas where buildings vary widely, damage is complex, and trees are dense, so the method will require further improvement.
NASA Astrophysics Data System (ADS)
Kwon, Su-Yong; Kim, Jong-Chul; Choi, Byung-Il
2007-10-01
Distinguishing between a supercooled dew and frost below 0 °C in dew/frost-point measurements is an important and challenging problem that has not yet been completely solved. This study presents a new method for the recognition of a supercooled dew in a dew/frost-point sensor. A quartz crystal microbalance (QCM) sensor was used as a dew/frost-point sensor to detect a dew and a supercooled dew as well as frost. The slip phenomenon occurring at an interface between the water droplet and the surface of the quartz crystal resonator of the QCM sensor gives a simple and accurate way of distinguishing between a supercooled dew and frost below 0 °C. This method can give a highly accurate measurement of the dew or the frost point without misreading in the dew-point sensor at temperatures below 0 °C.
An adhered-particle analysis system based on concave points
NASA Astrophysics Data System (ADS)
Wang, Wencheng; Guan, Fengnian; Feng, Lin
2018-04-01
Particles that adhere together will influence image analysis in a computer vision system. In this paper, a method based on concave points is designed. First, a corner detection algorithm is adopted to obtain a rough estimate of potential concave points after image segmentation. Then, the area ratio of the candidates is computed to accurately localize the final separation points. Finally, the separation points of each particle and the neighboring pixels are used to estimate the original particles before adhesion and to provide estimated profile images. The experimental results show that this approach provides good results that match the human visual cognitive mechanism.
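The area-ratio test of the second step can be illustrated on a synthetic binary mask (the window size and shapes below are invented): around a concave point of the contour the object fills more than half of a small window, while around a convex corner it fills less than half.

```python
import numpy as np

def area_ratio(mask, y, x, r=3):
    """Fraction of object pixels inside a (2r+1)x(2r+1) window centred
    on (y, x); windows are clipped at the image border."""
    win = mask[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
    return win.mean()

# Two overlapping rectangles form a re-entrant (concave) corner at the neck
mask = np.zeros((20, 30), dtype=float)
mask[2:18, 2:14] = 1     # left particle
mask[6:14, 13:28] = 1    # right particle, touching the left one
concave = area_ratio(mask, 6, 14)   # near the re-entrant corner
convex = area_ratio(mask, 2, 2)     # at an ordinary convex corner
print(concave > 0.5, convex < 0.5)  # -> True True
```

A full separation step would scan the corner candidates found in step one, keep those whose ratio exceeds the concavity threshold, and pair them to draw the split lines.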
CNV-TV: a robust method to discover copy number variation from short sequencing reads.
Duan, Junbo; Zhang, Ji-Gang; Deng, Hong-Wen; Wang, Yu-Ping
2013-05-02
Copy number variation (CNV) is an important structural variation (SV) in the human genome. Various studies have shown that CNVs are associated with complex diseases. Traditional CNV detection methods such as fluorescence in situ hybridization (FISH) and array comparative genomic hybridization (aCGH) suffer from low resolution. The next generation sequencing (NGS) technique promises a higher resolution detection of CNVs, and several methods were recently proposed for realizing such a promise. However, the performance of these methods is not robust under some conditions; e.g., some of them may fail to detect CNVs of short sizes. There has been a strong demand for reliable detection of CNVs from high resolution NGS data. A novel and robust method to detect CNV from short sequencing reads is proposed in this study. The detection of CNV is modeled as a change-point detection from the read depth (RD) signal derived from the NGS, which is fitted with a total variation (TV) penalized least squares model. The performance (e.g., sensitivity and specificity) of the proposed approach is evaluated by comparison with several recently published methods on both simulated and real data from the 1000 Genomes Project. The experimental results showed that both the true positive rate and false positive rate of the proposed detection method do not change significantly for CNVs with different copy numbers and lengths, when compared with several existing methods. Therefore, our proposed approach results in a more reliable detection of CNVs than the existing methods.
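The TV-penalized solver itself is beyond a short sketch, but the final step, reading change points off a (nearly) piecewise-constant read-depth signal, can be illustrated with a median-smoothed difference detector on synthetic data (the window and threshold are arbitrary illustrative choices, not the paper's):

```python
import numpy as np

def breakpoints(rd, win=5, thresh=0.3):
    """Median-filter the read-depth signal, then flag indices where the
    smoothed level jumps by more than `thresh` between neighbours."""
    pad = np.pad(rd, win // 2, mode='edge')
    smooth = np.array([np.median(pad[i:i + win]) for i in range(len(rd))])
    jumps = np.abs(np.diff(smooth))
    return np.where(jumps > thresh)[0] + 1

# Simulated RD: diploid level 1.0 with a heterozygous deletion (0.5) at 40..59
rd = np.ones(100)
rd[40:60] = 0.5
rd += 0.05 * np.random.default_rng(1).normal(size=100)
print(breakpoints(rd))   # -> [40 60]
```

The TV fit in the paper plays the role of the median filter here: it produces a piecewise-constant estimate whose jump locations are the candidate CNV boundaries.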
Field evaluation of distance-estimation error during wetland-dependent bird surveys
Nadeau, Christopher P.; Conway, Courtney J.
2012-01-01
Context: The most common methods to estimate detection probability during avian point-count surveys involve recording a distance between the survey point and individual birds detected during the survey period. Accurately measuring or estimating distance is an important assumption of these methods; however, this assumption is rarely tested in the context of aural avian point-count surveys. Aims: We expand on recent bird-simulation studies to document the error associated with estimating distance to calling birds in a wetland ecosystem. Methods: We used two approaches to estimate the error associated with five surveyor's distance estimates between the survey point and calling birds, and to determine the factors that affect a surveyor's ability to estimate distance. Key results: We observed biased and imprecise distance estimates when estimating distance to simulated birds in a point-count scenario (x̄error = -9 m, s.d.error = 47 m) and when estimating distances to real birds during field trials (x̄error = 39 m, s.d.error = 79 m). The amount of bias and precision in distance estimates differed among surveyors; surveyors with more training and experience were less biased and more precise when estimating distance to both real and simulated birds. Three environmental factors were important in explaining the error associated with distance estimates, including the measured distance from the bird to the surveyor, the volume of the call and the species of bird. Surveyors tended to make large overestimations to birds close to the survey point, which is an especially serious error in distance sampling. Conclusions: Our results suggest that distance-estimation error is prevalent, but surveyor training may be the easiest way to reduce distance-estimation error. 
Implications: The present study has demonstrated how relatively simple field trials can be used to estimate the error associated with distance estimates used to estimate detection probability during avian point-count surveys. Evaluating distance-estimation errors will allow investigators to better evaluate the accuracy of avian density and trend estimates. Moreover, investigators who evaluate distance-estimation errors could employ recently developed models to incorporate distance-estimation error into analyses. We encourage further development of such models, including the inclusion of such models into distance-analysis software.
High accuracy position method based on computer vision and error analysis
NASA Astrophysics Data System (ADS)
Chen, Shihao; Shi, Zhongke
2003-09-01
The study of high accuracy positioning systems is becoming a hotspot in the field of automatic control, and positioning is one of the most researched tasks in vision systems, so we address object locating with image processing. This paper describes a new high accuracy positioning method based on a vision system. In the proposed method, an edge-detection filter is designed for a certain running condition. The filter contains two main parts: an image-processing module, which implements edge detection and consists of multi-level threshold self-adapting segmentation, edge detection, and edge filtering; and an object-locating module, which determines the location of each object with high accuracy and is made up of median filtering and curve fitting. The paper gives an error analysis for the method to prove the feasibility of vision-based position detection. Finally, to verify the availability of the method, an example of a positioning worktable using the proposed method is given at the end of the paper. Results show that the method can accurately detect the position of the measured object and identify object attitude.
The ship edge feature detection based on high and low threshold for remote sensing image
NASA Astrophysics Data System (ADS)
Li, Xuan; Li, Shengyang
2018-05-01
In this paper, a method based on high and low thresholds is proposed to detect ship edge features in remote sensing images, addressing the low accuracy caused by noise. We analyze the relationship between the human vision system and target features, and determine the ship target by detecting edge features. First, a second-order differential method is used to enhance image quality. Second, to improve the edge operator, we introduce high and low threshold contrast to enhance edge and non-edge points, treat the edges as the foreground image and non-edges as the background image, apply image segmentation to achieve edge detection, and remove false edges. Finally, the edge features are described based on the edge detection result, and the ship target is determined. The experimental results show that the proposed method can effectively reduce the number of false edges in edge detection and achieves high accuracy in remote sensing ship edge detection.
Poisson denoising on the sphere: application to the Fermi gamma ray space telescope
NASA Astrophysics Data System (ADS)
Schmitt, J.; Starck, J. L.; Casandjian, J. M.; Fadili, J.; Grenier, I.
2010-07-01
The Large Area Telescope (LAT), the main instrument of the Fermi gamma-ray Space telescope, detects high energy gamma rays with energies from 20 MeV to more than 300 GeV. The two main scientific objectives, the study of the Milky Way diffuse background and the detection of point sources, are complicated by the lack of photons. That is why we need a powerful Poisson noise removal method on the sphere which is efficient on low count Poisson data. This paper presents a new multiscale decomposition on the sphere for data with Poisson noise, called multi-scale variance stabilizing transform on the sphere (MS-VSTS). This method is based on a variance stabilizing transform (VST), a transform which aims to stabilize a Poisson data set such that each stabilized sample has a quasi constant variance. In addition, for the VST used in the method, the transformed data are asymptotically Gaussian. MS-VSTS consists of decomposing the data into a sparse multi-scale dictionary like wavelets or curvelets, and then applying a VST on the coefficients in order to get almost Gaussian stabilized coefficients. In this work, we use the isotropic undecimated wavelet transform (IUWT) and the curvelet transform as spherical multi-scale transforms. Then, binary hypothesis testing is carried out to detect significant coefficients, and the denoised image is reconstructed with an iterative algorithm based on hybrid steepest descent (HSD). To detect point sources, we have to extract the Galactic diffuse background: an extension of the method to background separation is then proposed. By contrast, to study the Milky Way diffuse background, we remove point sources with a binary mask. The gaps have to be interpolated: an extension to inpainting is then proposed. The method, applied to simulated Fermi LAT data, proves to be adaptive, fast and easy to implement.
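The VST idea can be illustrated with the classical Anscombe transform for Poisson data, a simpler relative of the transform used in MS-VSTS: after applying it, the standard deviation is approximately 1 regardless of the Poisson mean.

```python
import numpy as np

def anscombe(x):
    """Classical variance-stabilizing transform for Poisson data:
    Var[anscombe(X)] ~ 1 for any (not too small) Poisson mean."""
    return 2.0 * np.sqrt(np.asarray(x) + 3.0 / 8.0)

rng = np.random.default_rng(0)
stds = []
for lam in (5.0, 20.0, 100.0):
    sample = rng.poisson(lam, 200_000)
    stds.append(anscombe(sample).std())
print([round(s, 2) for s in stds])   # each value is ~1.0
```

Once the coefficients are approximately Gaussian with unit variance, ordinary Gaussian hypothesis testing can be used to keep significant coefficients, which is the detection step the abstract describes.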
A Hyperspherical Adaptive Sparse-Grid Method for High-Dimensional Discontinuity Detection
Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max D.; ...
2015-06-24
This study proposes and analyzes a hyperspherical adaptive hierarchical sparse-grid method for detecting jump discontinuities of functions in high-dimensional spaces. The method is motivated by the theoretical and computational inefficiencies of well-known adaptive sparse-grid methods for discontinuity detection. Our novel approach constructs a function representation of the discontinuity hypersurface of an N-dimensional discontinuous quantity of interest, by virtue of a hyperspherical transformation. Then, a sparse-grid approximation of the transformed function is built in the hyperspherical coordinate system, whose value at each point is estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hypersurface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. In addition, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. Rigorous complexity analyses of the new method are provided as are several numerical examples that illustrate the effectiveness of the approach.
Tracking of Ball and Players in Beach Volleyball Videos
Gomez, Gabriel; Herrera López, Patricia; Link, Daniel; Eskofier, Bjoern
2014-01-01
This paper presents methods for determining players' positions and contact time points by tracking the players and the ball in beach volleyball videos. Two player tracking methods are compared: a classical particle filter and a rigid grid integral histogram tracker. Due to mutual occlusion of the players and the camera perspective, results are best for the front players, with 74.6% and 82.6% of frames correctly tracked for the particle method and the integral histogram method, respectively. Results suggest an improved robustness against player confusion between different particle sets when tracking with a rigid grid approach. Faster processing and fewer player confusions make this method superior to the classical particle filter. Two different ball tracking methods are used that detect ball candidates from movement difference images using a background subtraction algorithm. Ball trajectories are estimated and interpolated from parabolic flight equations. The tracking accuracy of the ball is 54.2% for the trajectory growth method and 42.1% for the Hough line detection method. Tracking results of over 90% from the literature could not be confirmed. Ball contact frames were estimated from parabolic trajectory intersection, resulting in 48.9% of ball contact points correctly estimated. PMID:25426936
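The parabolic flight model used for trajectory estimation and interpolation can be sketched as a least-squares fit (the time stamps and coordinates below are invented, not taken from the videos):

```python
import numpy as np

def fit_parabola(t, y):
    """Least-squares fit of the parabolic flight equation y = a*t^2 + b*t + c."""
    return np.polyfit(t, y, 2)

# Hypothetical ball-height samples from consecutive frames
t = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
y = -4.9 * t**2 + 6.0 * t + 1.0        # ideal trajectory, g/2 = 4.9 m/s^2
a, b, c = fit_parabola(t, y)
print(round(a, 1), round(b, 1), round(c, 1))   # -> -4.9 6.0 1.0
```

Intersecting two such fitted parabolas (the incoming and outgoing trajectories) gives an estimate of the contact time point, which is how the contact frames in the abstract are obtained.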
Compensatable muon collider calorimeter with manageable backgrounds
Raja, Rajendran
2015-02-17
A method and system for reducing background noise in a particle collider, comprises identifying an interaction point among a plurality of particles within a particle collider associated with a detector element, defining a trigger start time for each of the pixels as the time taken for light to travel from the interaction point to the pixel and a trigger stop time as a selected time after the trigger start time, and collecting only detections that occur between the start trigger time and the stop trigger time in order to thereafter compensate the result from the particle collider to reduce unwanted background detection.
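A minimal sketch of the claimed gating logic, with invented units and a hypothetical gate length: a hit is kept only if it arrives within a short window after the light travel time from the interaction point to the pixel.

```python
C = 0.299792458   # speed of light in m/ns

def trigger_window(pixel_pos, interaction_point, gate_ns=2.0):
    """Start time = light travel time from the interaction point to the
    pixel; stop time = start plus a fixed gate length."""
    d = sum((p - q) ** 2 for p, q in zip(pixel_pos, interaction_point)) ** 0.5
    start = d / C
    return start, start + gate_ns

def accept(hit_time_ns, pixel_pos, interaction_point, gate_ns=2.0):
    """Collect only detections between the start and stop trigger times."""
    start, stop = trigger_window(pixel_pos, interaction_point, gate_ns)
    return start <= hit_time_ns <= stop

ip = (0.0, 0.0, 0.0)
pixel = (3.0, 0.0, 4.0)               # 5 m from the interaction point
start, stop = trigger_window(pixel, ip)
print(accept(start + 1.0, pixel, ip),   # prompt hit inside the gate: True
      accept(start + 10.0, pixel, ip))  # late background hit: False
```

Because the window opens per pixel at exactly the earliest physically possible arrival time, out-of-time background (e.g., from decays far upstream) is rejected before it reaches the calorimeter readout.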
Apparatus and method for detecting flaws in conductive material
Hockey, Ronald L.; Riechers, Douglas M.
1999-01-01
The present invention is an improved sensing unit for detecting flaws in conductive material, wherein the sensing coil is positioned away from a datum: the datum point, the datum orientation, or a combination thereof. Positioning the sensing coil away from a datum increases sensitivity for detecting flaws having a characteristic volume less than about 1 mm³, and further permits detection of subsurface flaws. Use of multiple sensing coils permits quantification of flaw area or volume.
Non-magnetic photospheric bright points in 3D simulations of the solar atmosphere
NASA Astrophysics Data System (ADS)
Calvo, F.; Steiner, O.; Freytag, B.
2016-11-01
Context. Small-scale bright features in the photosphere of the Sun, such as faculae or G-band bright points, appear in connection with small-scale magnetic flux concentrations. Aims: Here we report on a new class of photospheric bright points that are free of magnetic fields. So far, these are visible in numerical simulations only. We explore conditions required for their observational detection. Methods: Numerical radiation (magneto-)hydrodynamic simulations of the near-surface layers of the Sun were carried out. The magnetic field-free simulations show tiny bright points, reminiscent of magnetic bright points, only smaller. A simple toy model for these non-magnetic bright points (nMBPs) was established that serves as a base for the development of an algorithm for their automatic detection. Basic physical properties of 357 detected nMBPs were extracted and statistically evaluated. We produced synthetic intensity maps that mimic observations with various solar telescopes to obtain hints on their detectability. Results: The nMBPs of the simulations show a mean bolometric intensity contrast with respect to their intergranular surroundings of approximately 20%, a size of 60-80 km, and the isosurface of optical depth unity is at their location depressed by 80-100 km. They are caused by swirling downdrafts that provide, by means of the centripetal force, the necessary pressure gradient for the formation of a funnel of reduced mass density that reaches from the subsurface layers into the photosphere. Similar, frequently occurring funnels that do not reach into the photosphere do not produce bright points. Conclusions: Non-magnetic bright points are the observable manifestation of vertically extending vortices (vortex tubes) in the photosphere. The resolving power of 4-m-class telescopes, such as the DKIST, is needed for their unambiguous detection. The movie associated with Fig. 1 is available at http://www.aanda.org
Comparison of two stand-alone CADe systems at multiple operating points
NASA Astrophysics Data System (ADS)
Sahiner, Berkman; Chen, Weijie; Pezeshk, Aria; Petrick, Nicholas
2015-03-01
Computer-aided detection (CADe) systems are typically designed to work at a given operating point: The device displays a mark if and only if the level of suspiciousness of a region of interest is above a fixed threshold. To compare the standalone performances of two systems, one approach is to select the parameters of the systems to yield a target false-positive rate that defines the operating point, and to compare the sensitivities at that operating point. Increasingly, CADe developers offer multiple operating points, which makes the comparison of two CADe systems a multiple-comparison problem. To control the Type I error, multiple-comparison correction is needed to keep the family-wise error rate (FWER) below a given alpha level. The sensitivities of a single modality at different operating points are correlated. In addition, the sensitivities of the two modalities at the same or different operating points are also likely to be correlated. It has been shown in the literature that when test statistics are correlated, well-known methods for controlling the FWER are conservative. In this study, we compared the FWER and power of three methods, namely the Bonferroni, step-up, and adjusted step-up methods, in comparing the sensitivities of two CADe systems at multiple operating points, where the adjusted step-up method uses the estimated correlations. Our results indicate that the adjusted step-up method has a substantial advantage over the other two methods both in terms of the FWER and power.
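A minimal sketch of the two classical procedures named above, Bonferroni and a Hochberg-style step-up, assuming only a plain list of p-values; the authors' adjusted step-up method additionally plugs in estimated correlations between sensitivities, which is not reproduced here.

```python
def bonferroni(pvals, alpha=0.05):
    """Reject H_i iff p_i <= alpha / m; controls FWER under any dependence."""
    m = len(pvals)
    return [p <= alpha / m for p in pvals]

def hochberg_step_up(pvals, alpha=0.05):
    """Hochberg's step-up procedure: with p-values sorted in ascending
    order, find the largest rank k with p_(k) <= alpha / (m - k + 1)
    and reject the hypotheses with the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    k_star = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha / (m - rank + 1):
            k_star = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_star:
            reject[i] = True
    return reject
```

The step-up procedure is uniformly at least as powerful as Bonferroni, which is the motivation for the comparison in the abstract.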
Multispectral processing based on groups of resolution elements
NASA Technical Reports Server (NTRS)
Richardson, W.; Gleason, J. M.
1975-01-01
Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.
A review of automatic mass detection and segmentation in mammographic images.
Oliver, Arnau; Freixenet, Jordi; Martí, Joan; Pérez, Elsa; Pont, Josep; Denton, Erika R E; Zwiggelaar, Reyer
2010-04-01
The aim of this paper is to review existing approaches to the automatic detection and segmentation of masses in mammographic images, highlighting the key points and main differences between the strategies used. The key objective is to point out the advantages and disadvantages of the various approaches. In contrast with other reviews, which only describe and compare different approaches qualitatively, this review also provides a quantitative comparison. The performance of seven mass detection methods is compared using two different mammographic databases: a public digitised database and a local full-field digital database. The results are given in terms of Receiver Operating Characteristic (ROC) and Free-response Receiver Operating Characteristic (FROC) analysis. Copyright 2009 Elsevier B.V. All rights reserved.
Invasive pulmonary aspergillosis: current diagnostic methodologies and a new molecular approach.
Moura, S; Cerqueira, L; Almeida, A
2018-05-13
The fungus Aspergillus fumigatus is the main pathogenic agent responsible for invasive pulmonary aspergillosis. Immunocompromised patients are more likely to develop this pathology due to a decrease in the immune system's defense capacity. Despite the low occurrence of invasive pulmonary aspergillosis, this pathology presents high rates of mortality, mostly due to late and unspecific diagnosis. Currently, the diagnostic methods used to detect this fungal infection are conventional mycological examination (direct microscopic examination, histological examination, and culture), imaging, non-culture-based tests for the detection of galactomannan, β(1,3)-glucan and an extracellular glycoprotein, and molecular tests based on PCR. However, most of these methods do not detect the species A. fumigatus; they only allow identification of the genus Aspergillus. The development of more specific detection methods is therefore of great importance. This review points out that most of the methods used for the diagnosis of invasive pulmonary aspergillosis do not detect the fungus at the species level, and argues that fluorescence in situ hybridization-based molecular methods are a promising approach for A. fumigatus detection.
TREFEX: Trend Estimation and Change Detection in the Response of MOX Gas Sensors
Pashami, Sepideh; Lilienthal, Achim J.; Schaffernicht, Erik; Trincavelli, Marco
2013-01-01
Many applications of metal oxide gas sensors can benefit from reliable algorithms to detect significant changes in the sensor response. Significant changes indicate a change in the emission modality of a distant gas source and occur due to a sudden change of concentration or exposure to a different compound. As a consequence of turbulent gas transport and the relatively slow response and recovery times of metal oxide sensors, their response in open sampling configuration exhibits strong fluctuations that interfere with the changes of interest. In this paper we introduce TREFEX, a novel change point detection algorithm, especially designed for metal oxide gas sensors in an open sampling system. TREFEX models the response of MOX sensors as a piecewise exponential signal and considers the junctions between consecutive exponentials as change points. We formulate non-linear trend filtering and change point detection as a parameter-free convex optimization problem for single sensors and sensor arrays. We evaluate the performance of the TREFEX algorithm experimentally for different metal oxide sensors and several gas emission profiles. A comparison with the previously proposed GLR method shows a clearly superior performance of the TREFEX algorithm both in detection performance and in estimating the change time. PMID:23736853
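The piecewise-exponential idea above can be illustrated with a toy detector: in the log domain an exponential segment is a straight line, so a single change point can be located by the split that minimizes the two-segment linear-fit error. This brute-force sketch illustrates only the modeling assumption, not the parameter-free convex trend-filtering formulation that TREFEX actually solves.

```python
import numpy as np

def fit_line_sse(t, y):
    """Sum of squared residuals of the least-squares line through (t, y)."""
    A = np.vstack([t, np.ones_like(t)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ coef) ** 2))

def single_change_point(t, y, min_seg=3):
    """Index k minimizing the two-segment linear fit of log(y);
    an exponential y = a * exp(b * t) is linear in the log domain,
    so the junction between two exponentials is the change point."""
    ly = np.log(y)
    best_k, best_cost = None, np.inf
    for k in range(min_seg, len(t) - min_seg):
        cost = fit_line_sse(t[:k], ly[:k]) + fit_line_sse(t[k:], ly[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k
```

A full multi-change-point detector would replace the exhaustive single split with the convex optimization described in the paper.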
NASA Astrophysics Data System (ADS)
Shi, Aiye; Wang, Chao; Shen, Shaohong; Huang, Fengchen; Ma, Zhenli
2016-10-01
Chi-squared transform (CST), as a statistical method, can describe the difference degree between vectors. The CST-based methods operate directly on information stored in the difference image and are simple and effective methods for detecting changes in remotely sensed images that have been registered and aligned. However, the technique does not take spatial information into consideration, which leads to much noise in the result of change detection. An improved unsupervised change detection method is proposed based on spatial constraint CST (SCCST) in combination with a Markov random field (MRF) model. First, the mean and variance matrix of the difference image of bitemporal images are estimated by an iterative trimming method. In each iteration, spatial information is injected to reduce scattered changed points (also known as "salt and pepper" noise). To determine the key parameter confidence level in the SCCST method, a pseudotraining dataset is constructed to estimate the optimal value. Then, the result of SCCST, as an initial solution of change detection, is further improved by the MRF model. The experiments on simulated and real multitemporal and multispectral images indicate that the proposed method performs well in comprehensive indices compared with other methods.
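The core of a CST-style detector can be sketched as a per-pixel Mahalanobis distance on the band-difference vector, thresholded at a chi-squared quantile (7.815 is the 0.95 quantile for 3 degrees of freedom); the spatial-constraint iteration and MRF refinement of the proposed SCCST method are not included in this sketch.

```python
import numpy as np

def cst_change_map(diff, mean, cov, threshold):
    """Flag changed pixels in a difference image of shape (H, W, B).

    Under the no-change hypothesis the squared Mahalanobis distance of
    the B-band difference vector is approximately chi-squared with B
    degrees of freedom, so pixels above the chosen quantile are flagged."""
    h, w, b = diff.shape
    x = diff.reshape(-1, b) - mean
    icov = np.linalg.inv(cov)
    d2 = np.einsum('ij,jk,ik->i', x, icov, x)  # per-pixel x^T C^-1 x
    return (d2 > threshold).reshape(h, w)
```

In practice `mean` and `cov` would come from the iterative trimming step the abstract describes, so that changed pixels do not contaminate the no-change statistics.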
Laser desorption mass spectrometry for molecular diagnosis
NASA Astrophysics Data System (ADS)
Chen, C. H. Winston; Taranenko, N. I.; Zhu, Y. F.; Allman, S. L.; Tang, K.; Matteson, K. J.; Chang, L. Y.; Chung, C. N.; Martin, Steve; Haff, Lawrence
1996-04-01
Laser desorption mass spectrometry has been used for molecular diagnosis of cystic fibrosis. Both 3-base deletion and single-base point mutation have been successfully detected by clinical samples. This new detection method can possibly speed up the diagnosis by one order of magnitude in the future. It may become a new biotechnology technique for population screening of genetic disease.
Real-time EEG-based detection of fatigue driving danger for accident prediction.
Wang, Hong; Zhang, Chi; Shi, Tianwei; Wang, Fuwang; Ma, Shujun
2015-03-01
This paper proposes a real-time electroencephalogram (EEG)-based method for detecting potential danger during fatigue driving. To determine driver fatigue in real time, wavelet entropy with a sliding window and a pulse coupled neural network (PCNN) were used to process the EEG signals from the visual area (the main information input route). To detect the fatigue danger, the neural mechanism of driver fatigue was analyzed, and functional brain networks were employed to track the impact of fatigue on the brain's processing capacity. The results show that the overall functional connectivity of the subjects is weakened after long driving tasks; this regularity is summarized as the fatigue convergence phenomenon. Based on this phenomenon, we combined the input and global synchronizations of the brain to calculate the residual information processing capacity of the brain and thus obtain the dangerous points in real time. Finally, the driver fatigue danger detection system based on this neural mechanism was validated using accident EEG. The time distributions of the danger points output by the system agree well with those of the real accident points.
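Wavelet entropy over a sliding window, one ingredient of the pipeline described above, might be sketched as follows, here with a single-level Haar transform computed by hand; the authors' exact wavelet choice and the PCNN stage are not specified in the abstract and are omitted.

```python
import numpy as np

def haar_detail(x):
    """One-level Haar detail coefficients (pairwise differences / sqrt(2))."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def sliding_wavelet_entropy(signal, win, step):
    """Shannon entropy of the normalized energy distribution of Haar
    detail coefficients within each sliding window."""
    signal = np.asarray(signal, dtype=float)
    out = []
    for start in range(0, len(signal) - win + 1, step):
        d = haar_detail(signal[start:start + win])
        e = d ** 2
        p = e / e.sum() if e.sum() > 0 else np.full_like(e, 1.0 / len(e))
        p = np.where(p > 0, p, 1.0)          # zero-probability bins contribute 0
        out.append(float(-(p * np.log(p)).sum()))
    return out
```

Low entropy indicates energy concentrated in few scales (a regular signal); a drift in windowed entropy is a commonly used fatigue indicator.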
A scalable self-priming fractal branching microchannel net chip for digital PCR.
Zhu, Qiangyuan; Xu, Yanan; Qiu, Lin; Ma, Congcong; Yu, Bingwen; Song, Qi; Jin, Wei; Jin, Qinhan; Liu, Jinyu; Mu, Ying
2017-05-02
As an absolute quantification method at the single-molecule level, digital PCR has been widely used in many bioresearch fields, such as next generation sequencing, single cell analysis, and gene editing detection. However, existing digital PCR methods still have some disadvantages, including high cost, sample loss, and complicated operation. In this work, we develop a scalable self-priming fractal branching microchannel net digital PCR chip. This chip, with a special design inspired by natural fractal-tree systems, achieves even distribution and 100% compartmentalization of the sample without any sample loss, which is not available in existing chip-based digital PCR methods. A special 10 nm nano-waterproof layer was created to prevent the solution from evaporating. A vacuum pre-packaging method called self-priming reagent introduction passively drives the reagent into the microchannel nets, so that the chip can complete sequential reagent loading and isolation within a couple of minutes, which is very suitable for point-of-care detection. When the number of positive microwells stays in the range of 100 to 4000, the relative uncertainty is below 5%, which means that one panel can detect an average of 101 to 15,374 molecules by the Poisson distribution. The chip is shown to have an excellent ability for single-molecule detection and quantification of low expression of hHF-MSC stem cell markers. Owing to its potential for high throughput and high density, low cost, absence of sample and reagent loss, self-priming even compartmentalization, and simple operation, we envision that this device will significantly expand and extend the application range of digital PCR involving rare samples, liquid biopsy detection, and point-of-care detection with higher sensitivity and accuracy.
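The Poisson correction mentioned above converts the count of positive partitions into a molecule estimate; a minimal sketch, assuming only the standard digital-PCR relation lambda = -ln(1 - p) for the mean number of molecules per partition.

```python
import math

def dpcr_copies(n_positive, n_total):
    """Estimated total number of target molecules from digital PCR counts.

    With p = n_positive / n_total positive partitions, the per-partition
    occupancy under Poisson statistics is lambda = -ln(1 - p), so the
    total molecule count is lambda * n_total."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: count cannot be estimated")
    lam = -math.log(1.0 - p)
    return lam * n_total
```

This is why the estimated copy number slightly exceeds the positive-well count: some positive partitions contain more than one molecule.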
Scene-based nonuniformity correction for airborne point target detection systems.
Zhou, Dabiao; Wang, Dejiang; Huo, Lijun; Liu, Rang; Jia, Ping
2017-06-26
Images acquired by airborne infrared search and track (IRST) systems are often characterized by nonuniform noise. In this paper, a scene-based nonuniformity correction method for infrared focal-plane arrays (FPAs) is proposed based on the constant statistics of the received radiation ratios of adjacent pixels. The gain of each pixel is computed recursively based on the ratios between adjacent pixels, which are estimated through a median operation. Then, an elaborate mathematical model describing the error propagation, derived from random noise and the recursive calculation procedure, is established. The proposed method maintains the characteristics of traditional methods in calibrating the whole electro-optics chain, in compensating for temporal drifts, and in not preserving the radiometric accuracy of the system. Moreover, the proposed method is robust since the frame number is the only variant, and is suitable for real-time applications owing to its low computational complexity and simplicity of implementation. The experimental results, on different scenes from a proof-of-concept point target detection system with a long-wave Sofradir FPA, demonstrate the compelling performance of the proposed method.
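The ratio-based gain estimation can be illustrated in one dimension: assuming adjacent pixels see statistically similar radiance over many frames, the median over frames of their value ratios estimates the gain ratio, and chaining the ratios yields per-pixel gains. This is a simplified, non-recursive sketch of the idea, not the authors' full recursive scheme with its error-propagation model.

```python
import numpy as np

def estimate_gains_1d(frames):
    """Per-pixel gain estimate for a 1-D detector line.

    frames: array of shape (T, N), T frames over N pixels. The gain
    ratio of adjacent pixels is estimated as the median over frames of
    their value ratios; gains are then chained cumulatively from the
    first pixel and normalized to a mean gain of 1."""
    frames = np.asarray(frames, dtype=float)
    ratios = np.median(frames[:, 1:] / frames[:, :-1], axis=0)
    gains = np.concatenate([[1.0], np.cumprod(ratios)])
    return gains / gains.mean()

def correct(frame, gains):
    """Apply the gain correction to one frame."""
    return frame / gains
```

With enough frames the median makes the estimate robust to scene structure, which is the constant-statistics assumption the abstract relies on.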
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufmann, Ralph M., E-mail: rkaufman@math.purdue.edu; Khlebnikov, Sergei, E-mail: skhleb@physics.purdue.edu; Wehefritz-Kaufmann, Birgit, E-mail: ebkaufma@math.purdue.edu
2012-11-15
Motivated by the Double Gyroid nanowire network we develop methods to detect Dirac points and classify level crossings, i.e., singularities in the spectrum of a family of Hamiltonians. The approach we use is singularity theory. Using this language, we obtain a characterization of Dirac points and also show that the branching behavior of the level crossings is given by an unfolding of A_n type singularities. Which type of singularity occurs can be read off a characteristic region inside the miniversal unfolding of an A_k singularity. We then apply these methods in the setting of families of graph Hamiltonians, such as those for wire networks. In the particular case of the Double Gyroid we analytically classify its singularities and show that it has Dirac points. This indicates that nanowire systems of this type should have very special physical properties. Highlights: a new method for analytically finding Dirac points; a novel relation of level crossings to singularity theory; a more precise version of the von Neumann-Wigner theorem for arbitrary smooth families of Hamiltonians of fixed size; an analytical proof of the existence of Dirac points for the Gyroid wire network.
Yu, Kate; Di, Li; Kerns, Edward; Li, Susan Q; Alden, Peter; Plumb, Robert S
2007-01-01
We report in this paper an ultra-performance liquid chromatography/tandem mass spectrometric (UPLC(R)/MS/MS) method utilizing an ESI-APCI multimode ionization source to quantify structurally diverse analytes. Eight commercial drugs were used as test compounds. Each LC injection was completed in 1 min using a UPLC system coupled with MS/MS multiple reaction monitoring (MRM) detection. Results from three separate sets of experiments are reported. In the first set of experiments, the eight test compounds were analyzed as a single mixture. The mass spectrometer was switching rapidly among four ionization modes (ESI+, ESI-, APCI-, and APCI+) during an LC run. Approximately 8-10 data points were collected across each LC peak. This was insufficient for a quantitative analysis. In the second set of experiments, four compounds were analyzed as a single mixture. The mass spectrometer was switching rapidly among four ionization modes during an LC run. Approximately 15 data points were obtained for each LC peak. Quantification results were obtained with a limit of detection (LOD) as low as 0.01 ng/mL. For the third set of experiments, the eight test compounds were analyzed as a batch. During each LC injection, a single compound was analyzed. The mass spectrometer was detecting at a particular ionization mode during each LC injection. More than 20 data points were obtained for each LC peak. Quantification results were also obtained. This single-compound analytical method was applied to a microsomal stability test. Compared with a typical HPLC method currently used for the microsomal stability test, the injection-to-injection cycle time was reduced to 1.5 min (UPLC method) from 3.5 min (HPLC method). The microsome stability results were comparable with those obtained by traditional HPLC/MS/MS.
NASA Astrophysics Data System (ADS)
Naseralavi, S. S.; Salajegheh, E.; Fadaee, M. J.; Salajegheh, J.
2014-06-01
This paper presents a technique for damage detection in structures under unknown periodic excitations using the transient displacement response. The method is capable of identifying the damage parameters without finding the input excitations. We first define the concept of displacement space as a linear space in which each point represents displacements of structure under an excitation and initial condition. Roughly speaking, the method is based on the fact that structural displacements under free and forced vibrations are associated with two parallel subspaces in the displacement space. Considering this novel geometrical viewpoint, an equation called kernel parallelization equation (KPE) is derived for damage detection under unknown periodic excitations and a sensitivity-based algorithm for solving KPE is proposed accordingly. The method is evaluated via three case studies under periodic excitations, which confirm the efficiency of the proposed method.
The Bone in the Throat: Some Uncertain Thoughts on Baroque Method
ERIC Educational Resources Information Center
MacLure, Maggie
2006-01-01
The paper conjures some possibilities for a baroque method in qualitative educational research. It draws on work across a range of disciplines that has detected a recurrence of the baroque in the philosophical and literary texts of modernity. A baroque method would resist clarity, mastery and the single point of view, be radically uncertain about…
Haghshenas, Maryam; Akbari, Mohammad Taghi; Karizi, Shohreh Zare; Deilamani, Faravareh Khordadpoor; Nafissi, Shahriar; Salehi, Zivar
2016-06-01
Duchenne and Becker muscular dystrophies (DMD and BMD) are X-linked neuromuscular diseases characterized by progressive muscular weakness and degeneration of skeletal muscles. Approximately two-thirds of the patients have large deletions or duplications in the dystrophin gene and the remaining one-third have point mutations. This study was performed to evaluate point mutations in Iranian DMD/BMD male patients. A total of 29 DNA samples from patients who did not show any large deletion/duplication mutations following multiplex polymerase chain reaction (PCR) and multiplex ligation-dependent probe amplification (MLPA) screening were sequenced for detection of point mutations in exons 50-79. Also exon 44 was sequenced in one sample in which a false positive deletion was detected by MLPA method. Cycle sequencing revealed four nonsense, one frameshift and two splice site mutations as well as two missense variants.
Registration methods for nonblind watermark detection in digital cinema applications
NASA Astrophysics Data System (ADS)
Nguyen, Philippe; Balter, Raphaele; Montfort, Nicolas; Baudry, Severine
2003-06-01
Digital watermarking may be used to enforce copyright protection of digital cinema by embedding in each projected movie a unique identifier (fingerprint). By identifying the source of illegal copies, watermarking will thus encourage movie theatre managers to enforce copyright protection, in particular by preventing people from coming in with a handheld camera. We propose here a non-blind watermarking method to improve watermark detection on very impaired sequences. We first present a study of the picture impairments caused by projection on a screen and then acquisition with a handheld camera. We show that the images undergo geometric deformations, which are fully described by a projective geometry model. The sequence also undergoes spatial and temporal luminance variation. Based on this study and on the impairment models which follow, we propose a method to match the retrieved sequence to the original one. First, temporal registration is performed by comparing the average luminance variation of both sequences. To compensate for geometric transformations, we use paired points from both sequences, obtained by applying a feature point detector. The matching of the feature points then enables retrieval of the geometric transform parameters. Tests show that watermark retrieval on rectified sequences is greatly improved.
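Retrieving projective transform parameters from matched point pairs, as described above, is commonly done with the direct linear transform (DLT); a minimal least-squares sketch, with the feature detection and temporal registration steps omitted.

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: 3x3 projective matrix H with dst ~ H @ src
    in homogeneous coordinates, estimated from >= 4 matched point pairs
    as the null vector of the stacked constraint matrix (via SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the scale ambiguity

def apply_homography(H, pts):
    """Map 2-D points through H (homogeneous multiply, then dehomogenize)."""
    pts = np.asarray(pts, dtype=float)
    ph = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]
```

With noisy matches one would wrap this in a robust estimator such as RANSAC; the sketch shows only the geometric core.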
Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo
2016-03-12
Facial palsy or paralysis (FP) is a symptom in which voluntary muscle movement is lost on one side of the face, which can be devastating for patients. Traditional assessment methods depend solely on the clinician's judgment and are therefore time-consuming and subjective. Hence, a quantitative assessment system is invaluable for physicians beginning the rehabilitation process, yet producing a reliable and robust method is challenging and still underway. We introduce a novel approach for quantitative assessment of facial paralysis that tackles the classification problem for FP type and degree of severity. Specifically, a novel method of quantitative assessment is presented: an algorithm that extracts the human iris and detects facial landmarks, and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining the optimized Daugman's algorithm and the Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmarks or key points. To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial feature segmentation are automatically selected. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e., rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, FP type classification, and facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach. Experiments show that the proposed method demonstrates its efficiency.
Facial movement feature extraction on facial images based on iris segmentation and LAC-based key point detection along with a hybrid classifier provides a more efficient way of addressing classification problem on facial palsy type and degree of severity. Combining iris segmentation and key point-based method has several merits that are essential for our real application. Aside from the facial key points, iris segmentation provides significant contribution as it describes the changes of the iris exposure while performing some facial expressions. It reveals the significant difference between the healthy side and the severe palsy side when raising eyebrows with both eyes directed upward, and can model the typical changes in the iris region.
Portable point-of-care blood analysis system for global health (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dou, James J.; Aitchison, James Stewart; Chen, Lu; Nayyar, Rakesh
2016-03-01
In this paper we present a portable blood analysis system based on a disposable cartridge and hand-held reader. The platform can perform all the sample preparation, detection and waste collection required to complete a clinical test. To demonstrate the utility of this approach, CD4 T cell enumeration was carried out, and a handheld, point-of-care CD4 T cell system was developed on this platform. In particular we describe a pneumatic, active pumping method to control the on-chip fluidic actuation. Reagents for the CD4 T cell counting assay were dried on a reagent plug to eliminate the need for cold chain storage when used in the field. A micromixer based on the active fluidic actuation was designed to complete sample staining with the fluorescent dyes dried on the reagent plugs. A novel image detection and analysis algorithm was developed to detect and track the flight of target particles and cells during each analysis. The handheld, point-of-care CD4 testing system was benchmarked against a clinical flow cytometer, and the experimental results closely matched those of flow cytometry. The same platform can be further expanded into a bead-array detection system where other types of biomolecules, such as proteins, can be detected using the same detection system.
Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo
2011-01-01
A new method for the detection of point mutation in DNA based on monobase-coded cadmium tellurium nanoprobes and the quartz crystal microbalance (QCM) technique is reported. A point-mutation (single-base adenine, thymine, cytosine, or guanine, namely A, T, C or G, mutation in a DNA strand) QCM sensor was fabricated by immobilizing single-base mutation DNA-modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe3O4/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was distinguished by detecting the decrease in the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. This detection strategy for point mutation in DNA is shown to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry
Line segment confidence region-based string matching method for map conflation
NASA Astrophysics Data System (ADS)
Huh, Yong; Yang, Sungchul; Ga, Chillo; Yu, Kiyun; Shi, Wenzhong
2013-04-01
In this paper, a method to detect corresponding point pairs between polygon object pairs with a string matching method based on a confidence region model of a line segment is proposed. The optimal point edit sequence to convert the contour of a target object into that of a reference object is found by the string matching method, which minimizes the total error cost, and the corresponding point pairs are derived from the edit sequence. Because a significant amount of the apparent positional discrepancy between corresponding objects is caused by spatial uncertainty, confidence region models of line segments are used in the matching process, and the proposed method therefore obtains a high F-measure for finding matching pairs. We applied this method to built-up area polygon objects in a cadastral map and a topographical map. Despite their different mapping and representation rules and spatial uncertainties, the proposed method with a confidence level of 0.95 showed a matching result with an F-measure of 0.894.
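The string-matching step can be illustrated with a generic minimum-cost edit alignment by dynamic programming; the element cost function here is a placeholder, whereas the paper derives costs from line-segment confidence regions.

```python
def min_cost_alignment(a, b, cost):
    """Edit alignment of sequences a and b by dynamic programming.

    cost(p, q) is the substitution cost of elements p and q;
    cost(p, None) / cost(None, q) are deletion / insertion costs.
    Returns (total cost, list of matched index pairs)."""
    n, m = len(a), len(b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = D[i - 1][0] + cost(a[i - 1], None)
    for j in range(1, m + 1):
        D[0][j] = D[0][j - 1] + cost(None, b[j - 1])
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i - 1][j - 1] + cost(a[i - 1], b[j - 1]),
                          D[i - 1][j] + cost(a[i - 1], None),
                          D[i][j - 1] + cost(None, b[j - 1]))
    # Backtrack to recover the matched (i, j) pairs.
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        if D[i][j] == D[i - 1][j - 1] + cost(a[i - 1], b[j - 1]):
            pairs.append((i - 1, j - 1)); i -= 1; j -= 1
        elif D[i][j] == D[i - 1][j] + cost(a[i - 1], None):
            i -= 1
        else:
            j -= 1
    return D[n][m], pairs[::-1]
```

For contour matching, the sequence elements would be contour points and the cost would encode the spatial-uncertainty model.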
Eliminating ambiguity in digital signals
NASA Technical Reports Server (NTRS)
Weber, W. J., III
1979-01-01
In a multiamplitude minimum shift keying (MAMSK) transmission system, a method of differential encoding overcomes the ambiguity problem associated with advanced digital-transmission techniques, with little or no penalty in transmission rate, error rate, or system complexity. The principle of the method is that, if signal points are properly encoded and decoded, bits are detected correctly regardless of phase ambiguities.
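The principle can be demonstrated in the simplest binary case: differential encoding puts the information in transitions between symbols, so a constant inversion (a phase ambiguity) corrupts at most the first decoded bit. This is a generic sketch, not the MAMSK-specific encoder.

```python
def diff_encode(bits):
    """Differential encoding: each transmitted symbol is the XOR of the
    data bit with the previous transmitted symbol, so the information is
    carried by transitions rather than absolute levels."""
    out, prev = [], 0
    for b in bits:
        prev ^= b
        out.append(prev)
    return out

def diff_decode(symbols):
    """Decoding XORs consecutive received symbols; inverting every
    symbol (a 180-degree phase ambiguity) leaves the data unchanged
    except possibly the first bit (the reference)."""
    out, prev = [], 0
    for s in symbols:
        out.append(s ^ prev)
        prev = s
    return out
```
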
USDA-ARS?s Scientific Manuscript database
Skin sensitization is an important toxicological end-point in the risk assessment of chemical allergens. Because of the complexity of the biological mechanisms associated with skin sensitization integrated approaches combining different chemical, biological and in silico methods are recommended to r...
R Peak Detection Method Using Wavelet Transform and Modified Shannon Energy Envelope.
Park, Jeong-Seon; Lee, Sang-Woong; Park, Unsang
2017-01-01
Rapid automatic detection of the fiducial points, namely the P wave, QRS complex, and T wave, is necessary for early detection of cardiovascular diseases (CVDs). In this paper, we present an R peak detection method using the wavelet transform (WT) and a modified Shannon energy envelope (SEE) for rapid ECG analysis. The proposed WTSEE algorithm performs a wavelet transform to reduce the size and noise of ECG signals and creates the SEE after first-order differentiation and amplitude normalization. Subsequently, the peak energy envelope (PEE) is extracted from the SEE. R peaks are then estimated from the PEE, and the estimated peaks are adjusted against the input ECG. Finally, the algorithm generates the final R features by validating R-R intervals and updating the extracted R peaks. The proposed R peak detection method was validated using 48 first-channel ECG records of the MIT-BIH arrhythmia database, with a sensitivity of 99.93%, positive predictivity of 99.91%, detection error rate of 0.16%, and accuracy of 99.84%. Considering the high detection accuracy and fast processing speed due to the wavelet transform applied before calculating the SEE, the proposed method is highly effective for real-time applications in the early detection of CVDs.
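The Shannon energy envelope step might be sketched as follows, assuming the first-order differentiation, amplitude normalization, and a simple moving-average smoothing; the wavelet-transform preprocessing and the peak-validation stages of the full WTSEE algorithm are omitted.

```python
import numpy as np

def shannon_energy_envelope(x, smooth=5):
    """Shannon energy -e * ln(e), with e the squared, amplitude-normalized
    first difference of the signal, followed by moving-average smoothing.
    The -e*ln(e) transform emphasises medium-amplitude activity (such as
    the QRS complex) over both low-level noise and extreme spikes."""
    d = np.diff(x)                          # first-order differentiation
    d = d / (np.max(np.abs(d)) + 1e-12)     # amplitude normalization
    e = d ** 2
    se = -e * np.log(e + 1e-12)             # small offset avoids log(0)
    kernel = np.ones(smooth) / smooth
    return np.convolve(se, kernel, mode='same')
```

R-peak candidates would then be taken as local maxima of this envelope and adjusted against the raw ECG, as the abstract describes.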
Helicase dependent OnChip-amplification and its use in multiplex pathogen detection.
Andresen, Dennie; von Nickisch-Rosenegk, Markus; Bier, Frank F
2009-05-01
The need for fast, specific and sensitive multiparametric detection methods is an ever-growing demand in molecular diagnostics. Here we report on a newly developed method, helicase dependent OnChip amplification (OnChip-HDA). This approach integrates amplification and detection in one single reaction, leading to time and cost savings in multiparametric analysis. HDA is an isothermal amplification method that does not depend on the thermocycling known from PCR, owing to the helicase's ability to unwind DNA double strands. We have combined HDA with microarray-based detection, making it suitable for multiplex detection. As an example, we used OnChip-HDA in single and multiplex amplifications for the detection of the two pathogens N. gonorrhoeae and S. aureus directly on surface-bound primers, and successfully demonstrated single- and duplex detection of both pathogens. The simplicity of the reaction setup and the potential for miniaturization and multiparametric analysis are advantageous for integration into miniaturized lab-on-chip systems, e.g., as needed in point-of-care diagnostics.
D'Autry, Ward; Zheng, Chao; Bugalama, John; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Wang, Bochu; Van Schepdael, Ann
2011-07-15
Residual solvents are volatile organic compounds that can be present in pharmaceutical substances. A generic static headspace-gas chromatography analysis method for the identification and control of residual solvents is described in the European Pharmacopoeia. Although this method is suitable for the majority of samples and residual solvents, it may lack sensitivity for high boiling point residual solvents such as N,N-dimethylformamide, N,N-dimethylacetamide, dimethyl sulfoxide and benzyl alcohol. In this study, liquid paraffin was investigated as a new dilution medium for the analysis of these residual solvents. The headspace-gas chromatography method was developed and optimized taking the official Pharmacopoeia method as a starting point. The optimized method was validated according to ICH criteria. The detection limits were below 1 μg/vial for each compound, indicating a drastically increased sensitivity compared to the Pharmacopoeia method, which failed to detect the compounds at their respective limit concentrations. Linearity was evaluated based on the R² values, which were above 0.997 for all compounds, and inspection of residual plots. Instrument and method precision were examined by calculating the relative standard deviations (RSD) of repeated analyses within the linearity and accuracy experiments, respectively; all RSD values were below 10%. Accuracy was checked by a recovery experiment at three different levels, and mean recovery values were all in the range 95-105%. Finally, the optimized method was applied to residual DMSO analysis in four different Kollicoat® sample batches. Copyright © 2011 Elsevier B.V. All rights reserved.
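The precision and accuracy figures quoted above (%RSD and recovery) are standard validation quantities; a minimal sketch of how they are computed.

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation in percent: 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def recovery_percent(measured, spiked):
    """Recovery in percent: 100 * measured amount / amount spiked."""
    return 100.0 * measured / spiked
```

A validated method typically requires the %RSD of replicate injections and the recovery at each spike level to fall inside predefined acceptance ranges (here, RSD < 10% and recovery 95-105%).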
Etesami, M; Hoi, Y; Steinman, D A; Gujar, S K; Nidecker, A E; Astor, B C; Portanova, A; Qiao, Y; Abdalla, W M A; Wasserman, B A
2013-01-01
Ulceration in carotid plaque is a risk indicator for ischemic stroke. Our aim was to compare plaque ulcer detection by standard TOF and CE-MRA techniques and to identify factors that influence its detection. Carotid MR imaging scans were acquired on 2066 participants in the ARIC study. We studied the 600 thickest plaques. TOF-MRA, CE-MRA, and black-blood MR images were analyzed together to define ulcer presence (plaque surface niche ≥2 mm in depth). Sixty ulcerated arteries were detected. These arteries were randomly assigned, along with 40 nonulcerated plaques from the remaining 540, for evaluation of ulcer presence by 2 neuroradiologists. Associations between ulcer detection and ulcer characteristics, including orientation, location, and size, were determined and explored by CFD modeling. One CE-MRA and 3 TOF-MRAs were noninterpretable and excluded. Of 71 ulcers in 56 arteries, readers detected an average of 39 (55%) on both TOF-MRA and CE-MRA, 26.5 (37.5%) only on CE-MRA, and 1 (1.5%) only on TOF-MRA, missing 4.5 (6%) ulcers by both methods. Ulcer detection by TOF-MRA was associated with its orientation (distally pointing versus perpendicular: OR = 5.57 [95% CI, 1.08-28.65]; proximally pointing versus perpendicular: OR = 0.21 [95% CI, 0.14-0.29]); location relative to point of maximum stenosis (distal versus isolevel: OR = 5.17 [95% CI, 2.10-12.70]); and neck-to-depth ratio (OR = 1.96 [95% CI, 1.11-3.45]) after controlling for stenosis and ulcer volume. CE-MRA detects more ulcers than TOF-MRA in carotid plaques. Missed ulcers on TOF-MRA are influenced by ulcer orientation, location relative to point of maximum stenosis, and neck-to-depth ratio.
Detecting signals of drug–drug interactions in a spontaneous reports database
Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette
2007-01-01
Aims The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs associated with drug–drug interactions (DDIs). In particular, we have investigated two different statistical assumptions for detecting signals of DDIs. Methods Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDIs. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes), and estimated the measure of interaction on the two scales. Results The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. Conclusions The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model. PMID:17506784
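The two interaction scales can be illustrated on reporting proportions. This is a minimal sketch with hypothetical rates; the paper's actual procedure also assesses statistical significance, which is omitted here:

```python
def additive_interaction(r11, r10, r01, r00):
    """Excess event-reporting rate beyond additivity of the single-drug effects.
    r11: rate with both drugs; r10, r01: single drugs; r00: neither drug."""
    return (r11 - r00) - ((r10 - r00) + (r01 - r00))

def multiplicative_interaction(r11, r10, r01, r00):
    """Ratio of the observed combined rate to that expected under a
    multiplicative model; values above 1 suggest an interaction signal."""
    return r11 / (r10 * r01 / r00)

# Hypothetical reporting proportions of an event with drug A and/or drug B:
r00, r10, r01, r11 = 0.01, 0.03, 0.02, 0.09
assert additive_interaction(r11, r10, r01, r00) > 0        # signal on additive scale
assert multiplicative_interaction(r11, r10, r01, r00) > 1  # signal on multiplicative scale
```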
Feng, Kejun; Zhao, Jingjin; Wu, Zai-Sheng; Jiang, Jianhui; Shen, Guoli; Yu, Ruqin
2011-03-15
Here a highly sensitive electrochemical method is described for the detection of point mutations in DNA. A polymerization extension reaction is applied to specifically initiate enzymatic electrochemical amplification, improving the sensitivity and performance of point mutation detection. In this work, 5'-thiolated DNA probe sequences complementary to the wild-type target DNA are assembled on a gold electrode. In the presence of wild-type target DNA, the probe is extended by DNA polymerase over the free segment of the target as the template. After washing with NaOH solution, the target DNA is removed while the elongated probe sequence remains on the sensing surface. By hybridizing to the designed biotin-labeled detection probe, the extended sequence is capable of capturing the detection probe. After introducing streptavidin-conjugated alkaline phosphatase (SA-ALP), the specific binding between streptavidin and biotin mediates a catalytic reaction of the ascorbic acid 2-phosphate (AA-P) substrate to produce the reducing agent ascorbic acid (AA). The silver ions in solution are then reduced by AA, leading to the deposition of silver metal onto the electrode surface. The amount of deposited silver, which is determined by the amount of wild-type target, can be quantified by linear sweep voltammetry (LSV). The present approach proved capable of detecting the wild-type target DNA down to a detection limit of 1.0×10⁻¹⁴ M over a wide target concentration range and of identifying the −28 site (A to G) of the β-thalassemia gene, demonstrating that this scheme offers a highly sensitive and specific approach for point mutation detection. Copyright © 2010 Elsevier B.V. All rights reserved.
Fully 3D printed integrated reactor array for point-of-care molecular diagnostics.
Kadimisetty, Karteek; Song, Jinzhao; Doto, Aoife M; Hwang, Young; Peng, Jing; Mauk, Michael G; Bushman, Frederic D; Gross, Robert; Jarvis, Joseph N; Liu, Changchun
2018-06-30
Molecular diagnostics that involve nucleic acid amplification tests (NAATs) are crucial for the prevention and treatment of infectious diseases. In this study, we developed a simple, inexpensive, disposable, fully 3D printed microfluidic reactor array that is capable of carrying out extraction, concentration and isothermal amplification of nucleic acids in a variety of body fluids. The method allows rapid molecular diagnostic tests for infectious diseases at the point of care. A simple leak-proof polymerization strategy was developed to integrate flow-through nucleic acid isolation membranes into microfluidic devices, yielding a multifunctional diagnostic platform. Static coating technology was adopted to improve the biocompatibility of our 3D printed device. We demonstrated the suitability of our device for both end-point colorimetric qualitative detection and real-time fluorescence quantitative detection. We applied our diagnostic device to the detection of Plasmodium falciparum in plasma samples and Neisseria meningitidis in cerebrospinal fluid (CSF) samples by loop-mediated isothermal amplification (LAMP) within 50 min. The detection limits were 100 fg for P. falciparum and 50 colony-forming units (CFU) for N. meningitidis per reaction, which are comparable to those of benchtop instruments. This rapid and inexpensive 3D printed device has great potential for point-of-care molecular diagnosis of infectious disease in resource-limited settings. Copyright © 2018 Elsevier B.V. All rights reserved.
Magnetically-focusing biochip structures for high-speed active biosensing with improved selectivity.
Yoo, Haneul; Lee, Dong Jun; Kim, Daesan; Park, Juhun; Chen, Xing; Hong, Seunghun
2018-06-29
We report a magnetically-focusing biochip structure enabling a single-layered magnetic trap-and-release cycle for biosensors with improved detection speed and selectivity. Magnetic beads functionalized with specific receptor molecules were used to trap target molecules in solution and to transport them actively to and away from the sensor surface, enhancing detection speed and reducing non-specific binding, respectively. Using our method, we demonstrated high-speed detection of IL-13 antigens, improving detection speed by more than an order of magnitude. Furthermore, the release step was found to reduce non-specific binding and improve the selectivity and sensitivity of the biosensor. This simple but powerful strategy should open up various applications such as ultra-fast biosensors for point-of-care services.
Apparatus and method for detecting leaks in piping
Trapp, Donald J.
1994-01-01
A method and device for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe at one end and extending out of the piping at the other end, a source of pressurized air and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipe with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of the tube at the point where a leak is detected determines the location of the leak in the pipe.
A 3D Laser Profiling System for Rail Surface Defect Detection
Li, Qingquan; Mao, Qingzhou; Zou, Qin
2017-01-01
Rail surface defects such as abrasion, scratches and peeling often cause damage to train wheels and rail bearings, so efficient and accurate detection of rail defects is of vital importance for the safety of railway transportation. Automatic rail defect detection has been studied for decades; however, most existing methods use optical-imaging techniques to collect the rail surface data and still suffer from a high false recognition rate. In this paper, a novel 3D laser profiling system (3D-LPS) is proposed, which integrates a laser scanner, odometer, inertial measurement unit (IMU) and global positioning system (GPS) to capture the rail surface profile data. For automatic defect detection, first, the deviation between the measured profile and a standard rail model profile is computed for each laser-imaging profile, and points with large deviations are marked as candidate defect points. Specifically, an adaptive iterative closest point (AICP) algorithm is proposed to register the point sets of the measured profile with the standard rail model profile, improving registration precision to the sub-millimeter level. Second, all of the measured profiles are combined to form the rail surface through a high-precision positioning process with the IMU, odometer and GPS data. Third, the candidate defect points are merged into candidate defect regions using K-means clustering. Finally, the candidate defect regions are classified by a decision tree classifier. Experimental results demonstrate the effectiveness of the proposed laser-profiling system in rail surface defect detection and classification. PMID:28777323
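The third step above, merging candidate defect points into regions with K-means, can be sketched in plain Python. The coordinates below are hypothetical; a real system would cluster many thousands of profile deviations:

```python
def kmeans(points, k, iters=50):
    """Plain k-means: merge candidate defect points into k regions."""
    centers = list(points[:k])  # deterministic initialization for this sketch
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0])**2 + (p[1] - centers[c][1])**2)
            clusters[i].append(p)
        # Update step: each center moves to its cluster's centroid.
        centers = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                   if cl else centers[i] for i, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated groups of large-deviation points on the rail surface:
pts = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers, clusters = kmeans(pts, 2)
assert sorted(len(c) for c in clusters) == [3, 3]  # two defect regions of 3 points each
```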
Brain's tumor image processing using shearlet transform
NASA Astrophysics Data System (ADS)
Cadena, Luis; Espinosa, Nikolai; Cadena, Franklin; Korneeva, Anna; Kruglyakov, Alexey; Legalov, Alexander; Romanenko, Alexey; Zotin, Alexander
2017-09-01
Brain tumor detection is a well-known research area for medical and computer scientists, and in recent decades much research has been done on tumor detection, segmentation, and classification. Medical imaging plays a central role in the diagnosis of brain tumors and nowadays relies on non-invasive, high-resolution techniques, especially magnetic resonance imaging and computed tomography scans. Edge detection is a fundamental tool in image processing, particularly in feature detection and feature extraction, which aim at identifying points in a digital image at which the image has discontinuities. Shearlets are among the most successful frameworks for the efficient representation of multidimensional data, capturing edges and other anisotropic features that frequently dominate multidimensional phenomena. The paper proposes an improved brain tumor detection method that automatically detects tumor location in MR images and extracts its features with the new shearlet transform.
Applying the Multiple Signal Classification Method to Silent Object Detection Using Ambient Noise
NASA Astrophysics Data System (ADS)
Mori, Kazuyoshi; Yokoyama, Tomoki; Hasegawa, Akio; Matsuda, Minoru
2004-05-01
The revolutionary concept of using ocean ambient noise positively to detect objects, called acoustic daylight imaging, has attracted much attention. The authors attempted the detection of a silent target object using ambient noise and a wide-band beam former consisting of an array of receivers. In experimental results obtained in air, using the wide-band beam former, we successfully applied the delay-sum array (DSA) method to detect a silent target object in an acoustic noise field generated by a large number of transducers. This paper reports some experimental results obtained by applying the multiple signal classification (MUSIC) method to a wide-band beam former to detect silent targets. The ocean ambient noise was simulated by transducers decentralized to many points in air. Both MUSIC and DSA detected a spherical target object in the noise field. The relative power levels near the target obtained with MUSIC were compared with those obtained by DSA. Then the effectiveness of the MUSIC method was evaluated according to the rate of increase in the maximum and minimum relative power levels.
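The delay-sum array (DSA) processing referred to above can be sketched as follows. The array geometry, frequency, sound speed and sampling rate are illustrative assumptions, not the experimental values:

```python
import math

def delay_and_sum(signals, delays, fs):
    """Delay-and-sum beamformer: advance each channel by its steering delay
    (rounded to whole samples) and average across channels."""
    n = len(signals[0])
    out = [0.0] * n
    for sig, d in zip(signals, delays):
        shift = int(round(d * fs))
        for i in range(n):
            if 0 <= i + shift < n:
                out[i] += sig[i + shift]
    return [v / len(signals) for v in out]

def power(x):
    return sum(v * v for v in x) / len(x)

# 4-element line array; a 2 kHz plane wave from 30 degrees arrives with
# per-channel delays (assumed spacing 5 cm, sound speed 343 m/s, fs 48 kHz):
fs, f, c, spacing = 48000.0, 2000.0, 343.0, 0.05
true_delays = [m * spacing * math.sin(math.radians(30)) / c for m in range(4)]
signals = [[math.sin(2 * math.pi * f * (i / fs - d)) for i in range(256)]
           for d in true_delays]

steered = delay_and_sum(signals, true_delays, fs)  # steer toward the true direction
missed = delay_and_sum(signals, [0.0] * 4, fs)     # steer at broadside instead
assert power(steered) > power(missed)              # coherent gain only when on target
```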
On-Site Detection as a Countermeasure to Chemical Warfare/Terrorism.
Seto, Y
2014-01-01
On-site monitoring and detection are necessary in the crisis and consequence management of wars and terrorism involving chemical warfare agents (CWAs) such as sarin. The analytical performance required for on-site detection is mainly determined by the fatal vapor concentration and volatility of the CWAs involved. This review interprets and compares the analytical performance of presently available on-site technologies and commercially available on-site equipment for detecting CWAs, including classical manual methods, photometric methods, ion mobility spectrometry, vibrational spectrometry, gas chromatography, mass spectrometry, sensors, and other methods. Some of the data evaluated were obtained from our experiments using authentic CWAs. We concluded that (a) no technology perfectly fulfills all of the on-site detection requirements and (b) adequate on-site detection requires (i) a combination of the monitoring-tape method and ion-mobility spectrometry for point detection and (ii) a combination of the monitoring-tape method, atmospheric pressure chemical ionization mass spectrometry with counterflow introduction, and gas chromatography with a trap and special detectors for continuous monitoring. The basic properties of CWAs, the concept of on-site detection, and the sarin gas attacks in Japan, as well as the forensic investigations thereof, are also explicated in this article. Copyright © 2014 Central Police University.
Castor, José Martín Rosas; Portugal, Lindomar; Ferrer, Laura; Hinojosa-Reyes, Laura; Guzmán-Mar, Jorge Luis; Hernández-Ramírez, Aracely; Cerdà, Víctor
2016-08-01
A simple, inexpensive and rapid method was proposed for the determination of bioaccessible arsenic in corn and rice samples using an in vitro bioaccessibility assay. The method was based on the preconcentration of arsenic by cloud point extraction (CPE) using the o,o-diethyldithiophosphate (DDTP) complex, which was generated from an in vitro extract using polyethylene glycol tert-octylphenyl ether (Triton X-114) as a surfactant, prior to detection by atomic fluorescence spectrometry with a hydride generation system (HG-AFS). The CPE method was optimized by a multivariate approach (two-level full factorial and Doehlert designs). A photo-oxidation step of the organic species prior to HG-AFS detection was included for the accurate quantification of total As. The limits of detection were 1.34 μg kg⁻¹ and 1.90 μg kg⁻¹ for rice and corn samples, respectively. The accuracy of the method was confirmed by analyzing the certified reference material ERM BC-211 (rice powder). The corn and rice samples that were analyzed showed a high bioaccessible arsenic content (72-88% and 54-96%, respectively), indicating a potential human health risk. Copyright © 2016 Elsevier Ltd. All rights reserved.
Supervised segmentation of microelectrode recording artifacts using power spectral density.
Bakstein, Eduard; Schneider, Jakub; Sieger, Tomas; Novak, Daniel; Wild, Jiri; Jech, Robert
2015-08-01
Appropriate detection of clean signal segments in extracellular microelectrode recordings (MER) is vital for maintaining high signal-to-noise ratio in MER studies. Existing alternatives to manual signal inspection are based on unsupervised change-point detection. We present a method of supervised MER artifact classification, based on power spectral density (PSD) and evaluate its performance on a database of 95 labelled MER signals. The proposed method yielded test-set accuracy of 90%, which was close to the accuracy of annotation (94%). The unsupervised methods achieved accuracy of about 77% on both training and testing data.
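A PSD-based artifact feature of the kind described can be sketched as follows. The two signals and the 50 Hz mains-hum artifact are synthetic assumptions; the paper's actual classifier and labelled database are not reproduced here:

```python
import math
import random

def band_power(x, fs, f_lo, f_hi):
    """Power of x in [f_lo, f_hi) Hz from a plain DFT periodogram."""
    n = len(x)
    p = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n < f_hi:
            re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            p += (re * re + im * im) / (n * n)
    return p

rng = random.Random(1)
fs, n = 1000.0, 256
clean = [rng.gauss(0, 1) for _ in range(n)]  # broadband neuronal background
hum = [2.0 * math.sin(2 * math.pi * 50 * i / fs) + rng.gauss(0, 0.2)
       for i in range(n)]                    # segment dominated by mains interference

def hum_fraction(x):
    """Fraction of total power concentrated around 50 Hz: a simple PSD feature."""
    return band_power(x, fs, 45, 55) / (band_power(x, fs, 1, 500) + 1e-12)

assert hum_fraction(hum) > hum_fraction(clean)  # feature separates the two classes
```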
Wei, Wei; Chang, Jun; Wang, Qiang; Qin, Zengguang
2017-01-15
A new technique of modulation index adjustment for recovering pure wavelength modulation spectroscopy second-harmonic signal waveforms is presented. As the modulation index is a key parameter in determining the exact form of the signals generated by wavelength modulation spectroscopy, the method of modulation index adjustment is applied to recover the second-harmonic signal. By comparing the measured profile with the theoretically calculated profile, the relationship between the modulation index and average quantities of the scanning wavelength can be obtained. When this relationship is applied in the experimental setup through point-by-point modulation index modification for gas detection, the results show good agreement with the theoretical profile, and signal waveform distortion (such as the amplitude modulation effect caused by the diode laser) can be suppressed. Moreover, the method of modulation index adjustment can be used in many other applications that involve profile improvement. In practice, when the amplitude modulation effect can be neglected and the stability of the detection system is limited by the sampling rate of the analog-to-digital converter, modulation index adjustment can be used to improve detection of softer inflection points and solve the insufficient-sampling problem. As a result, measurement stability is improved by 40%.
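The dependence of the second-harmonic signal on the modulation index, which the adjustment method exploits, can be illustrated with a simple lock-in simulation for a Lorentzian line. This is an illustrative sketch, not the authors' recovery procedure:

```python
import math

def wms_2f_magnitude(m, points=4000):
    """Magnitude of the lock-in second-harmonic (2f) component at line center
    for a Lorentzian line under sinusoidal wavelength modulation with index m."""
    total = 0.0
    for i in range(points):
        t = 2 * math.pi * i / points
        x = m * math.cos(t)               # instantaneous detuning in half-widths
        absorption = 1.0 / (1.0 + x * x)  # Lorentzian line shape
        total += absorption * math.cos(2 * t)
    return abs(2.0 * total / points)      # |second Fourier coefficient|

# The 2f amplitude at line center peaks near the classic modulation index m ≈ 2.2,
# which is why the exact value of m shapes the recovered waveform:
assert wms_2f_magnitude(2.2) > wms_2f_magnitude(1.0)
assert wms_2f_magnitude(2.2) > wms_2f_magnitude(4.0)
```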
High Precision Edge Detection Algorithm for Mechanical Parts
NASA Astrophysics Data System (ADS)
Duan, Zhenyun; Wang, Ning; Fu, Jingshun; Zhao, Wenhui; Duan, Boqiang; Zhao, Jungui
2018-04-01
High-precision and high-efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on the Gaussian integral model is proposed. For this purpose, the step-edge normal-section-line Gaussian integral model of the backlight image is constructed, combining the point spread function and the single-step model. The gray value of discrete points on the normal section line of the pixel edge is then calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted in accordance with the Gaussian integral model. A precise subpixel edge location is thereby determined by searching for the mean point. Finally, a gear tooth was measured with an M&M3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and both subpixel edge location accuracy and computation speed are improved. The maximum error of gear tooth profile total deviation is 1.9 μm compared with the measurement result of the gear measurement center, indicating that the method is reliable enough to meet the requirement of high-precision measurement.
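The Gaussian integral edge model amounts to fitting an error-function step to the gray values along the normal section line. A brute-force least-squares sketch on a synthetic profile (not the paper's interpolation or fitting scheme) locates the edge to well below a pixel:

```python
import math

def erf_step(x, x0, sigma, lo, hi):
    """Step edge blurred by a Gaussian point spread function:
    the Gaussian-integral edge model."""
    return lo + (hi - lo) * 0.5 * (1.0 + math.erf((x - x0) / (sigma * math.sqrt(2))))

def subpixel_edge(profile, sigma=1.0, lo=0.0, hi=255.0):
    """Locate the edge by scanning candidate positions x0 on a fine grid
    and minimizing the sum of squared residuals."""
    best_x0, best_err = None, float("inf")
    n = len(profile)
    x0 = 0.0
    while x0 <= n - 1:
        err = sum((profile[i] - erf_step(i, x0, sigma, lo, hi)) ** 2 for i in range(n))
        if err < best_err:
            best_x0, best_err = x0, err
        x0 += 0.01
    return best_x0

# Synthetic normal-section gray profile with a true edge at x = 7.3 pixels:
true_x0 = 7.3
profile = [erf_step(i, true_x0, 1.0, 0.0, 255.0) for i in range(15)]
assert abs(subpixel_edge(profile) - true_x0) < 0.02  # recovered to subpixel accuracy
```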
Feature-Based Retinal Image Registration Using D-Saddle Feature
Hasikin, Khairunnisa; A. Karim, Noor Khairiah; Ahmedy, Fatimah
2017-01-01
Retinal image registration is important to assist diagnosis and monitoring of retinal diseases, such as diabetic retinopathy and glaucoma. However, registering retinal images for various registration applications requires the detection and distribution of feature points on the low-quality region that consists of vessels of varying contrast and sizes. A recent feature detector known as Saddle detects feature points on vessels that are poorly distributed and densely positioned on strong-contrast vessels. Therefore, we propose a multiresolution difference-of-Gaussian pyramid with the Saddle detector (D-Saddle) to detect feature points on the low-quality region that consists of vessels with varying contrast and sizes. D-Saddle is tested on the Fundus Image Registration (FIRE) Dataset, which consists of 134 retinal image pairs. Experimental results show that D-Saddle successfully registered 43% of retinal image pairs with an average registration accuracy of 2.329 pixels, while lower success rates are observed for four other state-of-the-art retinal image registration methods: GDB-ICP (28%), Harris-PIIFD (4%), H-M (16%), and Saddle (16%). Furthermore, the registration accuracy of D-Saddle has the weakest correlation (Spearman) with the intensity uniformity metric among all methods. Finally, the paired t-test shows that D-Saddle significantly improved the overall registration accuracy of the original Saddle. PMID:29204257
Sampayan, Stephen E.
2016-11-22
Apparatus, systems, and methods that provide an X-ray interrogation system having a plurality of stationary X-ray point sources arranged to substantially encircle an area or space to be interrogated. A plurality of stationary detectors are arranged to substantially encircle the area or space to be interrogated. A controller is adapted to control the stationary X-ray point sources to emit X-rays one at a time, and to control the stationary detectors to detect the X-rays emitted by the stationary X-ray point sources.
Semantic Information Extraction of Lanes Based on Onboard Camera Videos
NASA Astrophysics Data System (ADS)
Tang, L.; Deng, T.; Ren, C.
2018-04-01
In the field of autonomous driving, semantic information about lanes is very important. This paper proposes a method for automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method first detects the edges of lanes from the grayscale gradient direction and fits them with an improved probabilistic Hough transform; it then uses the vanishing-point principle to calculate the lane's geometric position, and uses lane characteristics to extract lane semantic information by decision-tree classification. In the experiment, 216 road video images captured by a camera mounted on a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
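The vanishing-point step can be sketched as the intersection of two fitted lane border lines; the endpoints below are hypothetical image coordinates, not data from the paper:

```python
def line_through(p1, p2):
    """Line a*x + b*y = c through two points."""
    a = p2[1] - p1[1]
    b = p1[0] - p2[0]
    return a, b, a * p1[0] + b * p1[1]

def vanishing_point(seg1, seg2):
    """Intersection of two lane border lines in the image plane."""
    a1, b1, c1 = line_through(*seg1)
    a2, b2, c2 = line_through(*seg2)
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None  # parallel in the image: no finite vanishing point
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Two lane edges converging toward the middle of a 640x480 frame
# (y grows downward; both segments run from the bottom of the image upward):
left = ((100.0, 480.0), (300.0, 240.0))
right = ((540.0, 480.0), (340.0, 240.0))
vp = vanishing_point(left, right)
assert abs(vp[0] - 320.0) < 1e-6 and abs(vp[1] - 216.0) < 1e-6
```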
A Lateral Flow Biosensor for the Detection of Single Nucleotide Polymorphisms.
Zeng, Lingwen; Xiao, Zhuo
2017-01-01
A lateral flow biosensor (LFB) is introduced for the detection of single nucleotide polymorphisms (SNPs). The assay is composed of two steps: a circular strand-displacement reaction and lateral flow biosensor detection. In step 1, the nucleotide at the SNP site is recognized by T4 DNA ligase and the signal is amplified by a strand-displacement DNA polymerase, which can be accomplished at a constant temperature. In step 2, the reaction product of step 1 is detected by a lateral flow biosensor, a rapid and cost-effective tool for nucleic acid detection. Compared with conventional methods, it requires no complicated instruments and is suitable for point-of-care diagnostics. Therefore, this simple, cost-effective, robust, and promising LFB detection method for SNPs has great potential for the detection of genetic diseases, personalized medicine, cancer-related mutations, and drug-resistance mutations of infectious agents.
Study of driving fatigue alleviation by transcutaneous acupoints electrical stimulations.
Wang, Fuwang; Wang, Hong
2014-01-01
Driver fatigue poses a serious threat to traffic safety, so accurately and rapidly detecting the fatigue state and alleviating fatigue are particularly important. In the present work, an electrical stimulation method targeting the Láogóng point (PC8) of the human body is proposed to alleviate the mental fatigue of drivers. Wavelet packet decomposition (WPD) is used to extract the θ, α, and β subbands of drivers' electroencephalogram (EEG) signals, and the performance of two indices, (θ + α)/(α + β) and θ/β, is assessed for fatigue detection. Finally, the differences between drivers receiving electrical stimulation and those driving normally are discussed. It is shown that electrically stimulating the Láogóng point (PC8) can effectively alleviate driver fatigue during long-duration driving.
NASA Astrophysics Data System (ADS)
Givianrad, M. H.; Saber-Tehrani, M.; Aberoomand-Azar, P.; Mohagheghian, M.
2011-03-01
The applicability of the H-point standard additions method (HPSAM) to resolving the overlapping spectra of sulfamethoxazole and trimethoprim is verified by UV-vis spectrophotometry. The results show that HPSAM with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media, with concentration ratios of sulfamethoxazole to trimethoprim varying from 1:18 to 16:1 in the mixed samples. The limits of detection were 0.58 and 0.37 μmol L⁻¹ for sulfamethoxazole and trimethoprim, respectively, and the mean calculated RSDs (%) were 1.63 and 2.01 for SMX and TMP, respectively, in synthetic mixtures. The proposed method has been successfully applied to the simultaneous determination of sulfamethoxazole and trimethoprim in synthetic, pharmaceutical formulation and biological fluid samples.
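The H-point construction finds the intersection of the two standard-addition calibration lines recorded at two wavelengths. A sketch with hypothetical absorbances follows, under the usual HPSAM convention that the analyte concentration is read off as −x at the intersection:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def h_point(adds, abs1, abs2):
    """Intersection (H-point) of the two standard-addition lines;
    -x at the intersection estimates the analyte concentration."""
    m1, b1 = fit_line(adds, abs1)
    m2, b2 = fit_line(adds, abs2)
    x_h = (b2 - b1) / (m1 - m2)
    return x_h, m1 * x_h + b1

# Hypothetical absorbances at two wavelengths vs added analyte concentration:
adds = [0.0, 2.0, 4.0, 6.0, 8.0]
a1 = [0.30 + 0.050 * c for c in adds]  # wavelength 1
a2 = [0.25 + 0.025 * c for c in adds]  # wavelength 2
x_h, y_h = h_point(adds, a1, a2)
assert abs(x_h - (-2.0)) < 1e-6  # analyte concentration = -x_h = 2.0
```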
Detection of Golden apples' climacteric peak by laser biospeckle measurements.
Nassif, Rana; Nader, Christelle Abou; Afif, Charbel; Pellen, Fabrice; Le Brun, Guy; Le Jeune, Bernard; Abboud, Marie
2014-12-10
In this paper, we report a study in which a laser biospeckle technique is used to detect the climacteric peak indicating the optimal ripeness of fruit. We monitor two batches of harvested Golden apples going through the ripening phase in low- and room-temperature environments, determine speckle parameters, and measure the emitted ethylene concentration by gas chromatography as the reference method. Speckle results are then correlated with the emitted ethylene concentration by principal component analysis. From a practical point of view, this approach validates biospeckle as a noninvasive alternative to respiration-rate and ethylene-production measurements for climacteric peak detection as a ripening index.
Radiation detection method and system using the sequential probability ratio test
Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA
2007-07-17
A method and system using the Sequential Probability Ratio Test (SPRT) to enhance the detection of an elevated level of radiation by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
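The SPRT decision rule can be sketched for Poisson count data. This is a minimal illustration with assumed background and signal rates; it omits the patented system's dynamic background-estimation machinery:

```python
import math

def sprt_poisson(counts, bg_rate, sig_rate, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on Poisson counts:
    H0: mean = bg_rate vs H1: mean = bg_rate + sig_rate per interval.
    alpha/beta are the target false-alarm and missed-detection rates."""
    lo = math.log(beta / (1 - alpha))   # accept H0 (background) below this
    hi = math.log((1 - beta) / alpha)   # accept H1 (alarm) above this
    mu0, mu1 = bg_rate, bg_rate + sig_rate
    llr = 0.0
    for t, n in enumerate(counts, 1):
        # Poisson log-likelihood ratio increment for count n:
        llr += n * math.log(mu1 / mu0) - (mu1 - mu0)
        if llr >= hi:
            return "alarm", t
        if llr <= lo:
            return "background", t
    return "undecided", len(counts)

# Counts consistent with the background rate terminate at "background";
# elevated counts cross the upper boundary and raise an alarm:
assert sprt_poisson([5, 4, 6, 5, 4, 5, 6, 5], bg_rate=5.0, sig_rate=5.0)[0] == "background"
assert sprt_poisson([12, 11, 13, 12], bg_rate=5.0, sig_rate=5.0)[0] == "alarm"
```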
Jarmusch, Alan K; Pirro, Valentina; Kerian, Kevin S; Cooks, R Graham
2014-10-07
Streptococcus pyogenes, the cause of strep throat, was detected in vitro and in simulated clinical samples by touch spray ionization-mass spectrometry. MS analysis took only seconds to reveal characteristic bacterial and human lipids. Medical swabs were used as the substrate for ambient ionization. This work constitutes the initial step in developing a non-invasive MS-based test for clinical diagnosis of strep throat. It is limited to the single species, S. pyogenes, which is responsible for the vast majority of cases. The method is complementary to and, with further testing, a potential alternative to current methods of point-of-care detection of S. pyogenes.
A Nanocoaxial-Based Electrochemical Sensor for the Detection of Cholera Toxin
NASA Astrophysics Data System (ADS)
Archibald, Michelle M.; Rizal, Binod; Connolly, Timothy; Burns, Michael J.; Naughton, Michael J.; Chiles, Thomas C.
2015-03-01
Sensitive, real-time detection of biomarkers is of critical importance for rapid and accurate diagnosis of disease for point of care (POC) technologies. Current methods do not allow for POC applications due to several limitations, including sophisticated instrumentation, high reagent consumption, limited multiplexing capability, and cost. Here, we report a nanocoaxial-based electrochemical sensor for the detection of bacterial toxins using an electrochemical enzyme-linked immunosorbent assay (ELISA) and differential pulse voltammetry (DPV). Proof-of-concept was demonstrated for the detection of cholera toxin (CT). The linear dynamic range of detection was 10 ng/ml - 1 μg/ml, and the limit of detection (LOD) was found to be 2 ng/ml. This level of sensitivity is comparable to the standard optical ELISA used widely in clinical applications. In addition to matching the detection profile of the standard ELISA, the nanocoaxial array provides a simple electrochemical readout and a miniaturized platform with multiplexing capabilities for the simultaneous detection of multiple biomarkers, giving the nanocoax a desirable advantage over the standard method towards POC applications. This work was supported by the National Institutes of Health (National Cancer Institute award No. CA137681 and National Institute of Allergy and Infectious Diseases Award No. AI100216).
Yoshida, Yukinaga; Matsuda, Koji; Tamai, Naoto; Yoshizawa, Kai; Nikami, Toshiki; Ishiguro, Haruya; Tajiri, Hisao
2014-01-01
Endoscopic submucosal dissection (ESD) for superficial gastric neoplasm is a curative method. The aim of this study was to detect potential nonbleeding visible vessels (NBVVs) by using an infrared imaging (IRI) system. A total of 24 patients (25 lesions) were consecutively enrolled between March 2010 and December 2010. The day after ESD, endoscopist A (K.M.), who was blinded to the actual procedure of ESD, performed esophagogastroduodenoscopy (EGD) of the post-ESD ulcer base using the IRI system. Endoscopist A marked gray/blue points in the hard-copy images with the IRI system. After the first procedure, endoscopist B (Y.Y.), who was blinded to the results recorded by endoscopist A, performed a second EGD with white light endoscopy and administered water-jet pressure at the maximum level of an Olympus flushing pump onto the post-ESD ulcer base. This test can cause iatrogenic bleeding via application of pressure to NBVVs in the post-ESD ulcer. The IRI system detected 58 gray points and 71 blue points. The post-ESD ulcer was divided into the central area and the peripheral area. There were 14 gray points (24%) in the central area and 44 gray points (76%) in the peripheral area. There were 19 blue points (27%) in the central area and 52 blue points (73%) in the peripheral area. There was no significant difference when comparing the distribution of gray points and blue points. Bleeding occurred under water-jet pressure in 11 of 58 gray points and in none of the blue points (P = 0.000478). Among the gray points, bleeding in response to water-jet pressure occurred in 2 points in the central area and in 9 points in the peripheral area. The IRI system thus detects visible vessels (VVs) that need no coagulation as blue points and VVs that carry a potential risk of bleeding as gray points.
Automatic correspondence detection in mammogram and breast tomosynthesis images
NASA Astrophysics Data System (ADS)
Ehrhardt, Jan; Krüger, Julia; Bischof, Arpad; Barkhausen, Jörg; Handels, Heinz
2012-02-01
Two-dimensional mammography is the major imaging modality in breast cancer detection. A disadvantage of mammography is the projective nature of this imaging technique. Tomosynthesis is an attractive modality with the potential to combine the high contrast and high resolution of digital mammography with the advantages of 3D imaging. In order to facilitate diagnostics and treatment in the current clinical work-flow, correspondences between tomosynthesis images and previous mammographic exams of the same women have to be determined. In this paper, we propose a method to detect correspondences in 2D mammograms and 3D tomosynthesis images automatically. In general, this 2D/3D correspondence problem is ill-posed, because a point in the 2D mammogram corresponds to a line in the 3D tomosynthesis image. The goal of our method is to detect the "most probable" 3D position in the tomosynthesis images corresponding to a selected point in the 2D mammogram. We present two alternative approaches to solve this 2D/3D correspondence problem: a 2D/3D registration method and a 2D/2D mapping between mammogram and tomosynthesis projection images with a following back projection. The advantages and limitations of both approaches are discussed and the performance of the methods is evaluated qualitatively and quantitatively using a software phantom and clinical breast image data. Although the proposed 2D/3D registration method can compensate for moderate breast deformations caused by different breast compressions, this approach is not suitable for clinical tomosynthesis data due to the limited resolution and blurring effects perpendicular to the direction of projection. The quantitative results show that the proposed 2D/2D mapping method is capable of detecting corresponding positions in mammograms and tomosynthesis images automatically for 61 out of 65 landmarks. 
The proposed method can facilitate diagnosis, visual inspection and comparison of 2D mammograms and 3D tomosynthesis images for the physician.
NMR Detection Using Laser-Polarized Xenon as a DipolarSensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granwehr, Josef; Urban, Jeffry T.; Trabesinger, Andreas H.
2005-02-28
Hyperpolarized Xe-129 can be used as a sensor to indirectly detect NMR spectra of heteronuclei that are neither covalently bound nor necessarily in direct contact with the Xe atoms, but coupled through long-range intermolecular dipolar couplings. In order to reintroduce long-range dipolar couplings the sample symmetry has to be broken. This can be done either by an asymmetric sample arrangement, or by breaking the symmetry of the spin magnetization with field gradient pulses. Experiments are performed where only a small fraction of the available Xe-129 magnetization is used for each point, so that a single batch of xenon suffices for the point-by-point acquisition of a heteronuclear NMR spectrum. Examples with H-1 as analyte nucleus show that these methods have the potential to obtain spectra with a resolution that is high enough to determine homonuclear J couplings. The applicability of this technique with remote detection is discussed.
Navarro, Pedro J.; Fernández, Carlos; Borraz, Raúl; Alonso, Diego
2016-01-01
This article describes an automated sensor-based system to detect pedestrians in an autonomous vehicle application. Although the vehicle is equipped with a broad set of sensors, the article focuses on the processing of the information generated by a Velodyne HDL-64E LIDAR sensor. The cloud of points generated by the sensor (more than 1 million points per revolution) is processed to detect pedestrians by selecting cubic shapes and applying machine vision and machine learning algorithms to the XY, XZ, and YZ projections of the points contained in the cube. The work presents an exhaustive analysis of the performance of three different machine learning algorithms: k-Nearest Neighbours (kNN), Naïve Bayes classifier (NBC), and Support Vector Machine (SVM). These algorithms have been trained with 1931 samples. The final performance of the method, measured in a real traffic scenario containing 16 pedestrians and 469 samples of non-pedestrians, shows sensitivity (81.2%), accuracy (96.2%) and specificity (96.8%). PMID:28025565
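The cube-projection and classification idea above can be sketched in a few lines. This is a minimal illustration, not the authors' trained pipeline: the crude bounding-box features and the tiny hand-made training set are assumptions standing in for their machine-vision features and 1931 real samples.

```python
import math
from collections import Counter

def project(points):
    """Collapse a 3D point cube onto its XY, XZ and YZ planes and return
    a crude feature vector: the extent along each projected axis plus
    the point count (a stand-in for richer machine-vision features)."""
    feats = []
    for i, j in ((0, 1), (0, 2), (1, 2)):       # XY, XZ, YZ projections
        us = [p[i] for p in points]
        vs = [p[j] for p in points]
        feats += [max(us) - min(us), max(vs) - min(vs)]
    feats.append(len(points))
    return feats

def knn_classify(query, train, k=3):
    """k-Nearest Neighbours vote over (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda t: math.dist(query, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

In the paper's setting, `train` would hold feature vectors extracted from labeled LIDAR cubes; the same features could equally be fed to an NBC or SVM for the comparison the abstract describes.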
A mathematical programming approach for sequential clustering of dynamic networks
NASA Astrophysics Data System (ADS)
Silva, Jonathan C.; Bennett, Laura; Papageorgiou, Lazaros G.; Tsoka, Sophia
2016-02-01
A common analysis performed on dynamic networks is community structure detection, a challenging problem that aims to track the temporal evolution of network modules. An emerging area in this field is evolutionary clustering, where the community structure of a network snapshot is identified by taking into account both its current state as well as previous time points. Based on this concept, we have developed a mixed integer non-linear programming (MINLP) model, SeqMod, that sequentially clusters each snapshot of a dynamic network. The modularity metric is used to determine the quality of community structure of the current snapshot, and the historical cost is accounted for by optimising the number of node pairs co-clustered at the previous time point that remain so in the current snapshot partition. Our method is tested on social networks of interactions among high school students, college students and members of the Brazilian Congress. We show that, for an adequate parameter setting, our algorithm detects the classes to which these students belong more accurately than partitioning each time step individually or partitioning the aggregated snapshots. Our method also detects drastic discontinuities in interaction patterns across network snapshots. Finally, we present comparative results with similar community detection methods for time-dependent networks from the literature. Overall, we illustrate the applicability of mathematical programming as a flexible, adaptable and systematic approach for these community detection problems. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
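The two objectives described above, snapshot modularity and the historical cost of co-clustered pairs, can be sketched as plain functions. This is an illustrative reading of the abstract, not the SeqMod MINLP formulation itself.

```python
def modularity(adj, labels):
    """Newman modularity Q of an undirected, unweighted graph.

    adj: dict node -> set of neighbours; labels: dict node -> community.
    Q = (1/2m) * sum_ij (A_ij - k_i*k_j / 2m) * delta(c_i, c_j)
    """
    two_m = sum(len(nbrs) for nbrs in adj.values())  # 2m = degree sum
    q = 0.0
    for i in adj:
        for j in adj:
            if labels[i] != labels[j]:
                continue
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - len(adj[i]) * len(adj[j]) / two_m
    return q / two_m

def history_cost(prev_labels, cur_labels):
    """Number of node pairs co-clustered at the previous time point that
    remain co-clustered in the current partition (to be maximised)."""
    nodes = [n for n in prev_labels if n in cur_labels]
    return sum(1 for a, u in enumerate(nodes) for v in nodes[a + 1:]
               if prev_labels[u] == prev_labels[v]
               and cur_labels[u] == cur_labels[v])
```

A sequential clusterer in this spirit would, at each snapshot, search for the partition trading off high `modularity` against high `history_cost` with the previous snapshot's partition.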
Matsumoto, S; Kobayashi, H
1979-10-15
It is necessary to distinguish between the dew point and the frost point below 0 degrees C. The freezing of the dew and the melting of the frost are detected, respectively, by a rapid decrease and increase of the conduction current on the narrow surface of an insulating epoxy layer, 0.5 mm in width and 10 mm in length, on which the dew deposits. A dew point of -9 degrees C and a frost point of -8 degrees C, at 21% humidity and a temperature of 13 degrees C, are clearly distinguished by this method.
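The detection principle, a rapid drop of the conduction current at dew freezing and a rapid rise at frost melting, reduces to flagging a large jump between successive current readings. The sketch below is illustrative; the threshold value and sampling scheme are assumptions, not taken from the paper.

```python
def detect_phase_change(currents, threshold):
    """Flag the first sample where the conduction current jumps by more
    than `threshold` between consecutive readings.

    A rise suggests frost melting; a drop suggests dew freezing.
    Returns (index, "rise" | "drop") or None if no jump is found.
    """
    for i in range(1, len(currents)):
        delta = currents[i] - currents[i - 1]
        if abs(delta) > threshold:
            return i, ("rise" if delta > 0 else "drop")
    return None
```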
The use of biochemical methods in extraterrestrial life detection
NASA Astrophysics Data System (ADS)
McDonald, Gene
2006-08-01
Instrument development for in situ extraterrestrial life detection focuses primarily on the ability to distinguish between biological and non-biological material, mostly through chemical analysis for potential biosignatures (e.g., biogenic minerals, enantiomeric excesses). In contrast, biochemical analysis techniques commonly applied to Earth life focus primarily on the exploration of cellular and molecular processes, not on the classification of a given system as biological or non-biological. This focus has developed because of the relatively large functional gap between life and non-life on Earth today. Life on Earth is very diverse from an environmental and physiological point of view, but is highly conserved from a molecular point of view. Biochemical analysis techniques take advantage of this similarity of all terrestrial life at the molecular level, particularly through the use of biologically-derived reagents (e.g., DNA polymerases, antibodies), to enable analytical methods with enormous sensitivity and selectivity. These capabilities encourage consideration of such reagents and methods for use in extraterrestrial life detection instruments. The utility of this approach depends in large part on the (unknown at this time) degree of molecular compositional differences between extraterrestrial and terrestrial life. The greater these differences, the less useful laboratory biochemical techniques will be without significant modification. Biochemistry and molecular biology methods may need to be "de-focused" in order to produce instruments capable of unambiguously detecting a sufficiently wide range of extraterrestrial biochemical systems. Modern biotechnology tools may make that possible in some cases.
NASA Astrophysics Data System (ADS)
Yang, Hongxin; Su, Fulin
2018-01-01
We propose a moving target analysis algorithm using speeded-up robust features (SURF) and regular moment in inverse synthetic aperture radar (ISAR) image sequences. In our study, we first extract interest points from ISAR image sequences by SURF. Different from traditional feature point extraction methods, SURF-based feature points are invariant to scattering intensity, target rotation, and image size. Then, we employ a bilateral feature registering model to match these feature points. The feature registering scheme can not only search the isotropic feature points to link the image sequences but also reduce the error matching pairs. After that, the target centroid is detected by regular moment. Consequently, a cost function based on correlation coefficient is adopted to analyze the motion information. Experimental results based on simulated and real data validate the effectiveness and practicability of the proposed method.
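The regular-moment step used above for centroid detection is standard: the intensity-weighted centroid is the ratio of first-order to zeroth-order raw image moments. A minimal sketch (the SURF extraction and registration stages are not reproduced here):

```python
def raw_moment(img, p, q):
    """Raw image moment m_pq = sum_x sum_y x^p * y^q * I(x, y),
    for a 2D intensity grid indexed as img[y][x]."""
    return sum((x ** p) * (y ** q) * v
               for y, row in enumerate(img)
               for x, v in enumerate(row))

def centroid(img):
    """Intensity-weighted target centroid (x_bar, y_bar) = (m10/m00, m01/m00)."""
    m00 = raw_moment(img, 0, 0)
    return raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00
```

In an ISAR image sequence, tracking this centroid frame to frame gives the translational component of the target motion that the correlation-based cost function then refines.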
M, Soorya; Issac, Ashish; Dutta, Malay Kishore
2018-02-01
Glaucoma is an ocular disease which can cause irreversible blindness. The disease is currently identified manually using specialized equipment operated by optometrists. The proposed work aims to provide an efficient imaging solution which can help in automating the process of Glaucoma diagnosis using computer vision techniques on digital fundus images. The proposed method segments the optic disc using a geometrical feature based strategic framework which improves the detection accuracy and makes the algorithm invariant to illumination and noise. Novel methods based on corner thresholding and point contour joining are proposed to construct smooth contours of the Optic Disc. Based on the clinical approach used by ophthalmologists, the proposed algorithm tracks blood vessels inside the disc region, identifies the points at which vessels first bend from the optic disc boundary, and connects them to obtain the contours of the Optic Cup. The proposed method has been compared with the ground truth marked by medical experts, and the similarity parameters used to determine its performance have yielded a high similarity of segmentation. The proposed method has achieved a macro-averaged f-score of 0.9485 and accuracy of 97.01% in correctly classifying fundus images. The proposed method is clinically significant and can be used for Glaucoma screening over a large population in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
A Label-Free, Quantitative Fecal Hemoglobin Detection Platform for Colorectal Cancer Screening
Soraya, Gita V.; Nguyen, Thanh C.; Abeyrathne, Chathurika D.; Huynh, Duc H.; Chan, Jianxiong; Nguyen, Phuong D.; Nasr, Babak; Chana, Gursharan; Kwan, Patrick; Skafidas, Efstratios
2017-01-01
The early detection of colorectal cancer is vital for disease management and patient survival. Fecal hemoglobin detection is a widely-adopted method for screening and early diagnosis. Fecal Immunochemical Test (FIT) is favored over the older generation chemical based Fecal Occult Blood Test (FOBT) as it does not require dietary or drug restrictions, and is specific to human blood from the lower digestive tract. To date, no quantitative FIT platforms are available for use in the point-of-care setting. Here, we report proof of principle data of a novel low cost quantitative fecal immunochemical-based biosensor platform that may be further developed into a point-of-care test in low-resource settings. The label-free prototype has a lower limit of detection (LOD) of 10 µg hemoglobin per gram (Hb/g) of feces, comparable to that of conventional laboratory based quantitative FIT diagnostic systems. PMID:28475117
The 3D Hough Transform for plane detection in point clouds: A review and a new accumulator design
NASA Astrophysics Data System (ADS)
Borrmann, Dorit; Elseberg, Jan; Lingemann, Kai; Nüchter, Andreas
2011-03-01
The Hough Transform is a well-known method for detecting parameterized objects. It is the de facto standard for detecting lines and circles in 2-dimensional data sets. For 3D data it has attracted little attention so far. Even for the 2D case, high computational costs have led to the development of numerous variations of the Hough Transform. In this article we evaluate different variants of the Hough Transform with respect to their applicability to detecting planes in 3D point clouds reliably. Apart from computational costs, the main problem is the representation of the accumulator. Usual implementations favor geometrical objects with certain parameters due to uneven sampling of the parameter space. We present a novel accumulator design that aims at the same size for each cell and compare it to existing designs.
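A brute-force version of 3D Hough plane voting illustrates the idea the article starts from: each point votes for every sampled plane normal (θ, φ), with the offset ρ = p·n quantized into accumulator cells. Note this naive uniform (θ, φ, ρ) grid is exactly the unevenly sampled accumulator the article improves upon; resolutions and cell size below are arbitrary assumptions.

```python
import math
from collections import defaultdict

def hough_planes(points, n_theta=18, n_phi=18, rho_step=0.1):
    """Naive 3D Hough voting for plane detection.

    Planes are parameterized by a unit normal n(theta, phi) and offset
    rho = p . n.  Every point votes in every sampled normal direction;
    the fullest cell indicates the dominant plane.
    Returns ((theta_index, phi_index, rho_cell), votes).
    """
    acc = defaultdict(int)
    for ti in range(n_theta):
        theta = math.pi * ti / n_theta
        for pi_ in range(n_phi):
            phi = 2 * math.pi * pi_ / n_phi
            n = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            for p in points:
                rho = sum(a * b for a, b in zip(p, n))
                acc[(ti, pi_, round(rho / rho_step))] += 1
    return max(acc.items(), key=lambda kv: kv[1])
```

For a point cloud lying on the plane z = 1, all points fall into a single cell at θ = 0, so the winning cell collects one vote per point; off-plane normals scatter the votes across many ρ cells.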
Infrared Skin Thermometry: Validating and Comparing Techniques to Detect Periwound Skin Infection.
Mufti, Asfandyar; Somayaji, Ranjani; Coutts, Patricia; Sibbald, R Gary
2018-01-01
Diagnosis of wound infection can be challenging because it relies on a combination of clinical signs and symptoms that are often nonspecific. Increased periwound cutaneous temperature is a classic sign of deep and surrounding wound infection, and its quantitative measurement is one of the most reliable and valid clinical signs of deep and surrounding skin infection at the bedside. Skin surface temperature differences may be detected using commercially available noncontact infrared thermometers. However, techniques to detect temperature using noncontact infrared thermometers vary, and no studies have evaluated these methods. Two such measurement techniques include the "4-point" and "whole-wound" scanning methods. This study assessed the ability of different infrared thermometers using the aforementioned techniques to detect clinically meaningful differences in periwound cutaneous temperatures used in the diagnosis of wound infection. A prospective cohort study was conducted from 2015 to 2016 of consenting adult patients 18 years or older with an open wound attending a regional wound care clinic. One hundred patients with wounds underwent surface temperature measurement. Infection was not a specific inclusion criterion as the primary objective was to conduct a comparative assessment of infrared thermometry devices. Demographic data (age, height, weight, gender, and ethnicity) were also collected. Each wound was measured using 4 different noncontact infrared thermometers: Exergen DermaTemp 1001 (Exergen Corporation, Watertown, Massachusetts), Mastercraft Digital Temperature Reader (Mastercraft, Toronto, Ontario, Canada), Mastercool MSC52224-A (Mastercool Inc, Randolph, New Jersey), and Etekcity ETC-8250 Temperature Heat Pen (Etekcity, Anaheim, California). Data analysis was based on a comparison of 4 periwound skin surface temperature measurement differences (ΔT in degrees Fahrenheit) between the wound site and an equivalent contralateral control site. 
The primary outcome of the ability of each thermometer to detect a clinically significant difference in temperature was assessed with χ² analysis. Paired t tests were conducted to compare the ΔT measurements of each specific thermometry device between the 2 measurement techniques. Pearson product moment correlation coefficients were calculated for the temperature ΔT for both measuring techniques for all devices to determine level of agreement. A 1-way analysis of variance was conducted to compare temperature measurements among the infrared thermometry devices. There was no significant difference in the ability of each thermometer to detect a clinically meaningful difference of 3° F by either the 4-point (P = .10) or whole-wound techniques (P = .67). When a definition of 4° F was used, results were similar (4-point, P = .15; whole wound, P = .20). Comparisons among devices and techniques showed strong correlations (>0.80). Etekcity ETC-8250 and the 4-point measurement compared with the Exergen device using the whole-wound technique had a correlation coefficient of 0.72. Spearman ρ demonstrated a similarly high degree of correlation between techniques and devices, and only the Etekcity ETC-8250 device had a coefficient of 0.71 to 0.90 when compared with others. Paired t testing for each thermometry device comparing measurement techniques for raw temperatures ΔT demonstrated no significant difference (P > .05). No statistical differences among the ΔT values for the 3 different thermometers were noted when using the whole-wound method (P = .095). Similarly, no significant differences among the ΔT values were noted for the 4 different thermometers when using the 4-point method (P = .10). The results of this study demonstrate that both the 4-point and whole-wound methods of temperature acquisition using cost-efficient infrared thermometers provide accurate and similar results in clinical wound care settings.
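The paired t statistic used above to compare the two measurement techniques on the same wounds is a textbook computation; the sketch below is illustrative, not the study's analysis code, and the sample numbers in the test are invented.

```python
import math

def paired_t(a, b):
    """Paired t statistic for two matched samples, e.g. the same wounds
    measured with the 4-point and whole-wound techniques.

    t = mean(d) / sqrt(var(d) / n), where d are the pairwise differences
    and var uses the n-1 (sample) denominator.
    """
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)
```

The resulting t would then be compared against the t distribution with n-1 degrees of freedom to obtain the P values reported in the abstract.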
Recognition and defect detection of dot-matrix text via variation-model based learning
NASA Astrophysics Data System (ADS)
Ohyama, Wataru; Suzuki, Koushi; Wakabayashi, Tetsushi
2017-03-01
An algorithm for recognition and defect detection of dot-matrix text printed on products is proposed. Extraction and recognition of dot-matrix text involves several difficulties not present in standard camera-based OCR: the appearance of dot-matrix characters is corrupted and broken by illumination, complex background texture, and other standard characters printed on product packages. We propose a dot-matrix text extraction and recognition method that does not require any user interaction. The method employs the detected locations of corner points and a classification score. The results of an evaluation experiment using 250 images show that recall and precision of extraction are 78.60% and 76.03%, respectively. Recognition accuracy for correctly extracted characters is 94.43%. Detecting printing defects in dot-matrix text is also important in production settings to avoid defective products. We also propose a detection method for printing defects of dot-matrix characters. The method constructs a feature vector whose elements are the classification scores of each character class and employs a support vector machine to classify four types of printing defect. The detection accuracy of the proposed method is 96.68%.
De Los Ríos, F. A.; Paluszny, M.
2015-01-01
We consider some methods to extract information about the rotator cuff based on magnetic resonance images; the study aims to define an alternative method of display that might facilitate the detection of partial tears in the supraspinatus tendon. Specifically, we are going to use families of ellipsoidal triangular patches to cover the humerus head near the affected area. These patches are going to be textured and displayed with the information of the magnetic resonance images using the trilinear interpolation technique. For the generation of points to texture each patch, we propose a new method that guarantees the uniform distribution of its points using a random statistical method. Its computational cost, defined as the average computing time to generate a fixed number of points, is significantly lower as compared with deterministic and other standard statistical techniques. PMID:25650281
Yeh, Chia-Hsien; Zhao, Zi-Qi; Shen, Pi-Lan; Lin, Yu-Cheng
2014-01-01
This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model covering from qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors by the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (increase the signal to noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control line were caused by the gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line was darkened by increasing the hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the hCG concentration detection increased from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters to decrease the detection limit and to increase the linearity determined by the Taguchi method, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP). PMID:25256108
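The Taguchi objective described above, raising S/N while reducing the standard deviation, matches the nominal-the-best signal-to-noise ratio; whether the authors used this exact form is an assumption, and the readings in the test are invented grayscale values.

```python
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best signal-to-noise ratio in dB:

        S/N = 10 * log10(mean^2 / variance)

    Higher is better: repeated readings clustered tightly around their
    mean (low scatter relative to signal) score a larger S/N.
    """
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)
```

In a Taguchi design, this S/N would be computed for repeated grayscale readings at each parameter combination of the optical system, and the combination maximizing S/N selected.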
Luo, Xiao-Feng; Jiao, Jian-Hua; Zhang, Wen-Yue; Pu, Han-Ming; Qu, Bao-Jin; Yang, Bing-Ya; Hou, Min; Ji, Min-Jun
2016-01-01
AIM: To investigate clarithromycin resistance positions 2142, 2143 and 2144 of the 23SrRNA gene in Helicobacter pylori (H. pylori) by nested-allele specific primer-polymerase chain reaction (nested-ASP-PCR). METHODS: Gastric tissue and saliva samples from 99 patients with positive results of the rapid urease test (RUT) were collected. The nested-ASP-PCR method was carried out with external primers and inner allele-specific primers corresponding to the reference strain and clinical strains. Thirty gastric tissue and saliva samples were tested to determine the sensitivity of the nested-ASP-PCR and ASP-PCR methods. Then, clarithromycin resistance was detected for the 99 clinical samples by using different methods, including nested-ASP-PCR, bacterial culture and disk diffusion. RESULTS: The nested-ASP-PCR method was successfully established to test the resistance mutation points 2142, 2143 and 2144 of the 23SrRNA gene of H. pylori. Among 30 samples of gastric tissue and saliva, the H. pylori detection rate of nested-ASP-PCR was 90% and 83.33%, while the detection rate of ASP-PCR was just 63% and 56.67%. Especially in the saliva samples, nested-ASP-PCR showed much higher sensitivity in H. pylori detection and resistance mutation rates than ASP-PCR. In the 99 RUT-positive gastric tissue and saliva samples, the H. pylori-positive detection rate by nested-ASP-PCR was 87 (87.88%) and 67 (67.68%), comprising 30 wild-type and 57 mutated strains in gastric tissue and 22 wild-type and 45 mutated strains in saliva. Genotype analysis showed that three-point mixed mutations were quite common, but different resistant strains were present in gastric mucosa and saliva. Compared to the high sensitivity shown by nested-ASP-PCR, bacterial culture of gastric tissue samples was positive in only 50 cases, in which only 26 drug-resistant strains were found by analyzing the minimum inhibitory zone of clarithromycin. 
CONCLUSION: The nested-ASP-PCR assay showed higher detection sensitivity than ASP-PCR and drug sensitivity testing, which could be performed to evaluate clarithromycin resistance of H. pylori. PMID:27433095
Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System
NASA Astrophysics Data System (ADS)
Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.
2012-12-01
The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison materials from satellites and small bodies into space. The most notable examples of such events are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect those events in visible images. This technique will allow future outer planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. The interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface due to the higher magnitude gradient of the plume or jet when compared with the small, randomly oriented gradients of the textured surface. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that our method was able to distinguish from the background surface and surface features of a body. 
Dozens of distinct features are identifiable under a variety of viewing conditions and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of images, the ability to differentiate an event from artifacts, and the variety of event appearances in user-classified images. Other important factors include the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets which feed into the overall plume at the south pole of Enceladus). A notable strength of this method is the ability to detect events that do not extend beyond the limb of a planetary body or are adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
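The radius-based majority-vote classification of interest points described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the background fallback for isolated points, and the 2D coordinates are assumptions.

```python
import numpy as np

def classify_interest_points(points, ref_points, ref_labels, radius):
    """Label each interest point by majority vote over reference
    (user-classified) interest points within a Euclidean radius;
    isolated points fall back to the background class 0."""
    out = []
    for p in np.atleast_2d(points):
        d = np.linalg.norm(ref_points - p, axis=1)
        near = ref_labels[d <= radius]
        if near.size == 0:
            out.append(0)  # no neighbours within radius: background
        else:
            vals, counts = np.unique(near, return_counts=True)
            out.append(int(vals[np.argmax(counts)]))
    return np.array(out)
```

Because features marked by multiple nearby interest points reinforce each other under this vote, an isolated plume cluster is classified positively while scattered single detections over a textured surface are suppressed.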
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaefferkoetter, Joshua, E-mail: dnrjds@nus.edu.sg; Ouyang, Jinsong; Rakvongthai, Yothin
2014-06-15
Purpose: A study was designed to investigate the impact of time-of-flight (TOF) and point spread function (PSF) modeling on the detectability of myocardial defects. Methods: Clinical FDG-PET data were used to generate populations of defect-present and defect-absent images. Defects were incorporated at three contrast levels, and images were reconstructed by ordered subset expectation maximization (OSEM) iterative methods including ordinary Poisson, alone and with PSF, TOF, and PSF+TOF. Channelized Hotelling observer signal-to-noise ratio (SNR) was the surrogate for human observer performance. Results: For three iterations, 12 subsets, and no postreconstruction smoothing, TOF improved overall defect detection SNR by 8.6% as compared to its non-TOF counterpart for all defect contrasts. Due to the slow convergence of PSF reconstruction, PSF yielded 4.4% less SNR than non-PSF. For reconstruction parameters (iteration number and postreconstruction smoothing kernel size) optimizing observer SNR, PSF showed larger improvement for faint defects. The combination of TOF and PSF improved mean detection SNR as compared to the non-TOF and non-PSF counterparts by 3.0% and 3.2%, respectively. Conclusions: For the typical reconstruction protocols used in clinical practice, i.e., fewer than five iterations, TOF improved defect detectability. In contrast, PSF generally yielded less detectability. For a large number of iterations, TOF+PSF yields the best observer performance.
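The channelized Hotelling observer SNR used above as the performance surrogate can be sketched in its generic textbook form. This is not the paper's exact pipeline; the average-class-covariance convention and the synthetic channel features are assumptions.

```python
import numpy as np

def hotelling_snr(feat_present, feat_absent):
    """Hotelling observer SNR from channelized feature vectors:
    SNR^2 = dv' S^-1 dv, where dv is the mean feature difference
    between classes and S is the average within-class covariance."""
    dv = feat_present.mean(axis=0) - feat_absent.mean(axis=0)
    S = 0.5 * (np.cov(feat_present.T) + np.cov(feat_absent.T))
    # solve S w = dv rather than inverting S explicitly
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))
```

In practice the feature vectors would be the outputs of a small bank of channels (e.g. difference-of-Gaussian filters) applied to defect-present and defect-absent reconstructions.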
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A
2018-02-01
Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
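The SPC side of this comparison can be illustrated with a minimal Shewhart-style c-chart for infection counts. The Poisson variance assumption and the baseline-window convention below are illustrative choices, not the authors' exact chart rules.

```python
import numpy as np

def spc_out_of_control(counts, baseline_n):
    """Flag points beyond 3 sigma of a baseline mean (Shewhart-style
    c-chart for count data, where the variance is taken equal to the
    mean under a Poisson assumption)."""
    base = np.asarray(counts[:baseline_n], dtype=float)
    center = base.mean()
    sigma = np.sqrt(center)                 # Poisson: var == mean
    ucl = center + 3.0 * sigma              # upper control limit
    lcl = max(center - 3.0 * sigma, 0.0)    # counts cannot go below 0
    x = np.asarray(counts, dtype=float)
    return (x > ucl) | (x < lcl)
```

A chart like this has no model of seasonality or autocorrelation, which is exactly the limitation the abstract notes when comparing SPC charts to the anomaly/breakout detection approach.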
NASA Astrophysics Data System (ADS)
Montazeri, Sina; Gisinger, Christoph; Eineder, Michael; Zhu, Xiao xiang
2018-05-01
Geodetic stereo Synthetic Aperture Radar (SAR) is capable of absolute three-dimensional localization of natural Persistent Scatterers (PSs), which allows for Ground Control Point (GCP) generation using only SAR data. The prerequisite for the method to achieve high precision results is the correct detection of common scatterers in SAR images acquired from different viewing geometries. In this contribution, we describe three strategies for automatic detection of identical targets in SAR images of urban areas taken from different orbit tracks. Moreover, a complete work-flow for automatic generation of a large number of GCPs using SAR data is presented, and its applicability is shown by exploiting TerraSAR-X (TS-X) high resolution spotlight images over the city of Oulu, Finland, and a test site in Berlin, Germany.
Thermodynamic framework to assess low abundance DNA mutation detection by hybridization.
Willems, Hanny; Jacobs, An; Hadiwikarta, Wahyu Wijaya; Venken, Tom; Valkenborg, Dirk; Van Roy, Nadine; Vandesompele, Jo; Hooyberghs, Jef
2017-01-01
The knowledge of genomic DNA variations in patient samples has a high and increasing value for human diagnostics in its broadest sense. Although many methods and sensors to detect or quantify these variations are available or under development, the number of underlying physico-chemical detection principles is limited. One of these principles is the hybridization of sample target DNA versus nucleic acid probes. We introduce a novel thermodynamics approach and develop a framework to exploit the specific detection capabilities of nucleic acid hybridization, using generic principles applicable to any platform. As a case study, we detect point mutations in the KRAS oncogene on a microarray platform. For the given platform and hybridization conditions, we demonstrate the multiplex detection capability of hybridization and assess the detection limit using thermodynamic considerations; DNA containing point mutations in a background of wild type sequences can be identified down to at least 1% relative concentration. In order to show the clinical relevance, the detection capabilities are confirmed on challenging formalin-fixed paraffin-embedded clinical tumor samples. This enzyme-free detection framework contains the accuracy and efficiency to screen for hundreds of mutations in a single run with many potential applications in molecular diagnostics and the field of personalised medicine.
Kodogiannis, Vassilis S; Lygouras, John N; Tarczynski, Andrzej; Chowdrey, Hardial S
2008-11-01
Current clinical diagnostics are based on biochemical, immunological, or microbiological methods. However, these methods are operator dependent, time-consuming, expensive, and require special skills, and are therefore not suitable for point-of-care testing. Recent developments in gas-sensing technology and pattern recognition methods make electronic nose technology an interesting alternative for medical point-of-care devices. An electronic nose has been used to detect urinary tract infection from 45 suspected cases that were sent for analysis in a U.K. Public Health Registry. These samples were analyzed by incubation in a volatile generation test tube system for 4-5 h. Two issues are addressed: the implementation of an advanced neural network, based on a modified expectation maximization scheme that incorporates a dynamic structure methodology, and the concept of a fusion of multiple classifiers dedicated to specific feature parameters. This study has shown the potential for early detection of microbial contaminants in urine samples using electronic nose technology.
Eye gaze tracking using correlation filters
NASA Astrophysics Data System (ADS)
Karakaya, Mahmut; Bolme, David; Boehnen, Chris
2014-03-01
In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds to a circle approximately 2 inches in diameter on a screen at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of determining which word the subject has looked at.
Quantitative ultrasonic evaluation of concrete structures using one-sided access
NASA Astrophysics Data System (ADS)
Khazanovich, Lev; Hoegh, Kyle
2016-02-01
Nondestructive diagnostics of concrete structures is an important and challenging problem. The recent introduction of array ultrasonic dry point contact transducer systems offers opportunities for quantitative assessment of the subsurface condition of concrete structures, including detection of defects and inclusions. The methods described in this paper are developed for signal interpretation of shear wave impulse response time histories from multiple fixed distance transducer pairs in a self-contained ultrasonic linear array. This included generalizing Kirchhoff migration-based synthetic aperture focusing technique (SAFT) reconstruction methods to handle the spatially diverse transducer pair locations, creating expanded virtual arrays with associated reconstruction methods, and creating automated reconstruction interpretation methods for reinforcement detection and stochastic flaw detection. The reconstruction and interpretation techniques developed in this study were validated using the results of laboratory and field forensic studies. Applicability of the developed methods for solving practical engineering problems was demonstrated.
Zheng, Xuhui; Liu, Lei; Li, Gao; Zhou, Fubiao; Xu, Jiemin
2018-01-01
Geological and hydrogeological conditions in karst areas are complicated from the viewpoint of engineering. The construction of underground structures in these areas is often disturbed by the gushing of karst water, which may delay the construction schedule, result in economic losses, and even cause heavy casualties. In this paper, an innovative method of multichannel transient Rayleigh wave detecting is proposed by introducing the concept of arrival time difference phase between channels (TDP). Overcoming the restriction of the space-sampling law, the proposed method can extract the phase velocities of different frequency components from only two channels of transient Rayleigh wave recorded on two adjacent detecting points. This feature greatly improves the work efficiency and lateral resolution of transient Rayleigh wave detecting. The improved multichannel transient Rayleigh wave detecting method is applied to the detection of karst caves and fractures in rock mass of the foundation pit of Yan’an Road Station of Guiyang Metro. The imaging of the detecting results clearly reveals the distribution of karst water inflow channels, which provided significant guidance for water plugging and enabled good control over karst water gushing in the foundation pit. PMID:29883492
Experimental and environmental factors affect spurious detection of ecological thresholds
Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.
2012-01-01
Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.
Miles, Robin R [Danville, CA; Belgrader, Phillip [Severna Park, MD; Fuller, Christopher D [Oakland, CA
2007-01-02
Impedance measurements are used to detect the end-point for PCR DNA amplification. A pair of spaced electrodes are located on a surface of a microfluidic channel and an AC or DC voltage is applied across the electrodes to produce an electric field. An ionically labeled probe will attach to a complementary DNA segment, and a polymerase enzyme will release the ionic label. This causes the conductivity of the solution in the area of the electrode to change. This change in conductivity is measured as a change in the impedance between the two electrodes.
Shuryak, Igor; Loucas, Bradford D; Cornforth, Michael N
2017-01-01
The concept of curvature in dose-response relationships figures prominently in radiation biology, encompassing a wide range of interests including radiation protection, radiotherapy and fundamental models of radiation action. In this context, the ability to detect even small amounts of curvature becomes important. Standard (ST) statistical approaches used for this purpose typically involve least-squares regression, followed by a test on sums of squares. Because we have found that these methods are not particularly robust, we investigated an alternative information theoretic (IT) approach, which involves Poisson regression followed by information-theoretic model selection. Our first objective was to compare the performances of the ST and IT methods by using them to analyze mFISH data on gamma-ray-induced simple interchanges in human lymphocytes, and on Monte Carlo simulated data. Real and simulated data sets that contained small-to-moderate curvature were deliberately selected for this exercise. The IT method tended to detect curvature with higher confidence than the ST method. The finding of curvature in the dose response for true simple interchanges is discussed in the context of fundamental models of radiation action. Our second objective was to optimize the design of experiments aimed specifically at detecting curvature. We used Monte Carlo simulation to investigate the following parameters, constrained by available resources (i.e., the total number of cells to be scored): the optimal number of dose points to use; the best way to apportion the total number of cells among these dose points; and the spacing of dose intervals. Counterintuitively, our simulation results suggest that 4-5 radiation doses were typically optimal, whereas adding more dose points may actually prove detrimental. Superior results were also obtained by implementing unequal dose spacing and unequal distributions in the number of cells scored at each dose.
Revealing turning points in ecosystem functioning over the Northern Eurasian agricultural frontier.
Horion, Stéphanie; Prishchepov, Alexander V; Verbesselt, Jan; de Beurs, Kirsten; Tagesson, Torbern; Fensholt, Rasmus
2016-08-01
The collapse of the Soviet Union in 1991 has been a turning point in world history that left a unique footprint on the Northern Eurasian ecosystems. Conducting large scale mapping of environmental change and separating between naturogenic and anthropogenic drivers is a difficult endeavor in such highly complex systems. In this research a piece-wise linear regression method was used for breakpoint detection in Rain-Use Efficiency (RUE) time series and a classification of ecosystem response types was produced. Supported by earth observation data, field data, and expert knowledge, this study provides empirical evidence regarding the occurrence of drastic changes in RUE (assessment of the timing, the direction and the significance of these changes) in Northern Eurasian ecosystems between 1982 and 2011. About 36% of the study area (3.4 million km²) showed significant (P < 0.05) trends and/or turning points in RUE during the observation period. A large proportion of detected turning points in RUE occurred around the fall of the Soviet Union in 1991 and in the following years, which were attributed to widespread agricultural land abandonment. Our study also showed that recurrent droughts deeply affected vegetation productivity throughout the observation period, with a general worsening of the drought conditions in recent years. Moreover, recent human-induced turning points in ecosystem functioning were detected and attributed to ongoing recultivation and change in irrigation practices in the Volgograd region, and to increased salinization and increased grazing intensity around Lake Balkhash. The ecosystem-state assessment method introduced here proved to be a valuable support that highlighted hotspots of potentially altered ecosystems and allowed for disentangling human from climatic disturbances. © 2016 John Wiley & Sons Ltd.
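Piecewise-linear turning-point detection of the kind used on the RUE series can be sketched as a grid search over a single break, fitting independent line segments on either side of each candidate and keeping the split with the lowest total squared error. This is a minimal 1D sketch, not the authors' implementation (which also handles trend significance and multiple breaks).

```python
import numpy as np

def best_breakpoint(t, y):
    """Grid-search a single turning point: fit two independent line
    segments on either side of each candidate break index and return
    the index with the lowest total sum of squared residuals."""
    best_k, best_sse = None, np.inf
    for k in range(2, len(t) - 2):          # need >= 2 points per segment
        sse = 0.0
        for seg in (slice(None, k), slice(k, None)):
            coef = np.polyfit(t[seg], y[seg], 1)
            resid = y[seg] - np.polyval(coef, t[seg])
            sse += float(resid @ resid)
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k
```

For real series one would compare the two-segment fit against a single-line fit (e.g. via an F-test or an information criterion) before declaring a turning point significant.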
NASA Astrophysics Data System (ADS)
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2017-07-01
This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
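The continuous voting idea above, a kernel density estimate over parameter votes instead of a discretized Hough accumulator, can be illustrated in 1D. The vote values and SciPy's default Scott's-rule bandwidth stand in for the paper's data-driven bandwidth selection; this is a sketch, not the authors' cylinder parameterization.

```python
import numpy as np
from scipy.stats import gaussian_kde

def vote_mode(votes, grid):
    """Continuous Hough-style voting: build a kernel density estimate
    over the 1D parameter votes and return the grid location where
    the density peaks (the most-voted parameter value)."""
    kde = gaussian_kde(votes)   # bandwidth via Scott's rule by default
    density = kde(grid)
    return grid[np.argmax(density)]
```

In the full method the votes live in the cylinder parameter space and local maxima of the multivariate density yield candidate cylinders, which avoids the binning artifacts of a discretized accumulator.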
A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front
NASA Astrophysics Data System (ADS)
Micheletti, Natan; Tonini, Marj; Lane, Stuart N.
2016-04-01
Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely-sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working on the resulting dataset is often troublesome in terms of computational efforts. Indeed, it is not unusual that raw point clouds are filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, and even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably causes the transition from 3D information to only 2.5D data where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be set and a considerable burden in terms of calculation and storage. Because of the reasons above, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps to monitor sediment dynamics. The general aim is to set up a semiautomatic method to isolate mass movements using 3D-feature identification directly from LiDAR data. An ultra-long range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied Density-Based Spatial Clustering of Applications with Noise (DBSCAN), previously successfully employed by Tonini and Abellan (2014) in a similar case for rockfall detection.
DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of points (i) at a maximum distance (ii) around each core-point. Under this condition, seed points are said to be density-reachable by a core point delimiting a cluster around it. A chain of intermediate seed-points can connect contiguous clusters, allowing clusters of arbitrary shape to be defined. The novelty of the proposed approach consists in the implementation of the DBSCAN 3D-module, where the xyz-coordinates identify each point and the density of points within a sphere is considered. This allows detecting volumetric features with a higher accuracy, depending only on the actual sampling resolution. The approach is truly 3D and exploits all TLS measurements without the need for interpolation or data reduction. Using this method, enhanced geomorphological activity during the summer of 2015 with respect to the previous two years was observed. We attribute this result to the exceptionally high temperatures of that summer, which we deem responsible for accelerating the melting process at the rock glacier front and probably also increasing creep velocities. References: - Tonini, M. and Abellan, A. (2014). Rockfall detection from terrestrial LiDAR point clouds: A clustering approach using R. Journal of Spatial Information Science, No. 8, pp. 95-110. - Hennig, C. Package fpc: Flexible procedures for clustering. https://cran.r-project.org/web/packages/fpc/index.html, 2015. Accessed 2016-01-12.
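The 3D density-reachability described above can be sketched as a minimal, brute-force DBSCAN on xyz coordinates. The full pairwise distance matrix makes this suitable only for small point sets, unlike the TLS clouds in the study, and the spatial-index optimizations of production implementations are omitted.

```python
import numpy as np

def dbscan3d(points, eps, min_pts):
    """Minimal 3D DBSCAN sketch: grow clusters by density-reachable
    expansion from core points, using full xyz Euclidean distances."""
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)            # -1 marks noise / unvisited
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                   # already assigned, or not a core point
        labels[i] = cluster            # start a new cluster at this core point
        stack = list(neighbors[i])
        while stack:
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:
                    stack.extend(neighbors[j])   # core point: keep expanding
        cluster += 1
    return labels
```

Here eps is the maximum distance (ii) and min_pts the minimum number of points (i) of the abstract; points never density-reachable from any core point keep the noise label -1.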
Structure Line Detection from LIDAR Point Clouds Using Topological Elevation Analysis
NASA Astrophysics Data System (ADS)
Lo, C. Y.; Chen, L. C.
2012-07-01
Airborne LIDAR point clouds, which provide a considerable number of points on object surfaces, are essential to building modeling. In the last two decades, studies have developed different approaches to identify structure lines using two main approaches: data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. Following the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. This analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced during the first part. The highest point in each grid is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines possess certain geometric properties, their locations have small reliefs in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of the region growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.
Advanced yellow fever virus genome detection in point-of-care facilities and reference laboratories.
Domingo, Cristina; Patel, Pranav; Yillah, Jasmin; Weidmann, Manfred; Méndez, Jairo A; Nakouné, Emmanuel Rivalyn; Niedrig, Matthias
2012-12-01
Reported methods for the detection of the yellow fever viral genome are beset by limitations in sensitivity, specificity, strain detection spectra, and suitability to laboratories with simple infrastructure in areas of endemicity. We describe the development of two different approaches affording sensitive and specific detection of the yellow fever genome: a real-time reverse transcription-quantitative PCR (RT-qPCR) and an isothermal protocol employing the same primer-probe set but based on helicase-dependent amplification technology (RT-tHDA). Both assays were evaluated using yellow fever cell culture supernatants as well as spiked and clinical samples. We demonstrate reliable detection by both assays of different strains of yellow fever virus with improved sensitivity and specificity. The RT-qPCR assay is a powerful tool for reference or diagnostic laboratories with real-time PCR capability, while the isothermal RT-tHDA assay represents a useful alternative to earlier amplification techniques for the molecular diagnosis of yellow fever by field or point-of-care laboratories.
Digital holographic microscopy for detection of Trypanosoma cruzi parasites in fresh blood mounts
NASA Astrophysics Data System (ADS)
Romero, G. G.; Monaldi, A. C.; Alanís, E. E.
2012-03-01
An off-axis holographic microscope, in a transmission mode, calibrated to automatically detect the presence of Trypanosoma cruzi in blood is developed as an alternative diagnosis tool for Chagas disease. Movements of the microorganisms are detected by measuring the phase shift they produce on the transmitted wave front. A thin layer of blood infected by Trypanosoma cruzi parasites is examined in the holographic microscope, the images of the visual field being registered with a CCD camera. Two consecutive holograms of the same visual field are subtracted point by point and a phase contrast image of the resulting hologram is reconstructed by means of the angular spectrum propagation algorithm. This method enables the measurement of phase distributions corresponding to temporal differences between digital holograms in order to detect whether parasites are present or not. Experimental results obtained using this technique show that it is an efficient alternative that can be incorporated successfully as a part of a fully automatic system for detection and counting of this type of microorganisms.
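The angular spectrum propagation step used to reconstruct the phase-contrast image can be sketched as follows. The grid size, sampling, and the hard evanescent-wave cutoff are illustrative assumptions; a real reconstruction would also handle the off-axis carrier and reference wave.

```python
import numpy as np

def angular_spectrum(u0, wavelength, dx, z):
    """Angular-spectrum propagation sketch: propagate a complex field
    u0 (sampled at pitch dx) a distance z by applying the free-space
    transfer function to its spatial-frequency spectrum."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(u0) * H)
```

Subtracting two consecutive holograms point by point before propagation, as in the abstract, leaves only the time-varying (parasite-induced) phase contributions in the reconstructed field.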
NASA Astrophysics Data System (ADS)
Ninsalam, Y.; Qin, R.; Rekittke, J.
2016-06-01
In our study we use 3D scene understanding to detect the discharge of domestic solid waste along an urban river. Solid waste found along the Ciliwung River in the neighbourhoods of Bukit Duri and Kampung Melayu may be attributed to households. This is in part due to inadequate municipal waste infrastructure and services, which has caused those living along the river to rely upon it for waste disposal. However, there has been little research to understand the prevalence of household waste along the river. Our aim is to develop a methodology that deploys a low-cost sensor to identify point source discharge of solid waste using image classification methods. To demonstrate this we describe the following five-step method: 1) a strip of GoPro images is captured photogrammetrically and processed for dense point cloud generation; 2) depth for each image is generated through a backward projection of the point clouds; 3) a supervised image classification method based on a Random Forest classifier is applied on the view-dependent red, green, blue and depth (RGB-D) data; 4) point discharge locations of solid waste are mapped by projecting the classified images to the 3D point clouds; 5) the landscape elements are classified into five types: vegetation, human settlement, soil, water and solid waste. While this work is still ongoing, the initial results have demonstrated that it is possible to perform quantitative studies that may help reveal and estimate the amount of waste present along the river bank.
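The Random Forest classification of RGB-D features (step 3 above) can be sketched with scikit-learn on synthetic per-pixel features. The class means below are hypothetical placeholders, not values from the study, and scikit-learn availability is assumed.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for per-pixel (R, G, B, depth) features for the
# five classes: 0 vegetation, 1 settlement, 2 soil, 3 water, 4 waste.
rng = np.random.default_rng(0)
means = np.array([[60, 120, 60, 5], [150, 140, 130, 8], [120, 90, 60, 4],
                  [40, 60, 90, 12], [200, 60, 60, 6]], dtype=float)
X = np.vstack([m + rng.normal(0.0, 5.0, (200, 4)) for m in means])
y = np.repeat(np.arange(5), 200)

# Train the per-pixel classifier; predictions on new RGB-D vectors
# would then be projected back onto the 3D point cloud (step 4).
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

With real imagery the training labels would come from manually annotated image regions, and the depth channel helps separate classes with similar color, such as soil and settlement.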
Structural damage detection-oriented multi-type sensor placement with multi-objective optimization
NASA Astrophysics Data System (ADS)
Lin, Jian-Fu; Xu, You-Lin; Law, Siu-Seong
2018-05-01
A structural damage detection-oriented multi-type sensor placement method with multi-objective optimization is developed in this study. The multi-type response covariance sensitivity-based damage detection method is first introduced. Two objective functions for optimal sensor placement are then introduced in terms of the response covariance sensitivity and the response independence. The multi-objective optimization problem is formed by using the two objective functions, and the non-dominated sorting genetic algorithm (NSGA)-II is adopted to find the solution for the optimal multi-type sensor placement to achieve the best structural damage detection. The proposed method is finally applied to a nine-bay three-dimensional frame structure. Numerical results show that the optimal multi-type sensor placement determined by the proposed method can avoid redundant sensors and provide satisfactory results for structural damage detection. The restriction on the number of each type of sensors in the optimization can reduce the searching space in the optimization to make the proposed method more effective. Moreover, how to select a most optimal sensor placement from the Pareto solutions via the utility function and the knee point method is demonstrated in the case study.
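The Pareto (non-dominated) solutions that NSGA-II searches for can be illustrated with a brute-force filter for two minimized objectives; knee-point or utility-function selection, as discussed in the case study, would then operate on the returned front. This is a generic sketch, not the paper's NSGA-II implementation.

```python
import numpy as np

def pareto_front(costs):
    """Return indices of non-dominated points when all objectives are
    minimized: a point is dominated if some other point is <= in every
    objective and strictly < in at least one."""
    costs = np.asarray(costs, dtype=float)
    keep = []
    for i in range(len(costs)):
        dominated = any(
            j != i and np.all(costs[j] <= costs[i]) and np.any(costs[j] < costs[i])
            for j in range(len(costs))
        )
        if not dominated:
            keep.append(i)
    return keep
```

NSGA-II uses this non-dominated sorting repeatedly (plus crowding distance) inside a genetic algorithm; the O(n^2) filter above only extracts the first front of a given population.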
40 CFR 63.7521 - What fuel analyses and procedures must I use?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., at a point prior to mixing with other dissimilar fuel types. (iv) For each fuel type, the analytical methods, with the expected minimum detection levels, to be used for the measurement of selected total metals, chlorine, or mercury. (v) If you request to use an alternative analytical method other than those...
Here we report results from a multi-laboratory (n=11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium used to detect fecal contamination from birds in coastal environments. The methods included conventional end-point PCR, a SYBR...
Improved spatial resolution of luminescence images acquired with a silicon line scanning camera
NASA Astrophysics Data System (ADS)
Teal, Anthony; Mitchell, Bernhard; Juhl, Mattias K.
2018-04-01
Luminescence imaging is currently used to provide spatially resolved defect information in high-volume silicon solar cell production. One option to obtain the high throughput required for on-the-fly detection is the use of silicon line scan cameras. However, when using a silicon-based camera, the spatial resolution is reduced as a result of weakly absorbed light scattering within the camera's chip. This paper addresses the issue by applying deconvolution with a measured point spread function. It extends methods for determining the point spread function of a silicon area camera to a line scan camera with charge transfer. The improvement in resolution is quantified in the Fourier domain and in the spatial domain on an image of a multicrystalline silicon brick. It is found that light spreading beyond the active sensor area is significant in line scan sensors, but can be corrected for through normalization of the point spread function. The application of this method improves the raw data, enabling effective detection of spatially resolved defects in manufacturing.
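The deconvolution step can be illustrated in one dimension, matching a line-scan geometry. This is a generic Wiener-filter sketch, not the authors' implementation; the Gaussian PSF and the regularization constant are assumptions.

```python
import numpy as np

def wiener_deconvolve_1d(signal, psf, balance=1e-3):
    """Sharpen a line-scan signal using a measured PSF (Wiener filter)."""
    n = len(signal)
    psf = psf / psf.sum()              # normalize the PSF, as in the paper
    kernel = np.zeros(n)               # embed PSF with its peak at index 0
    half = len(psf) // 2
    for i, v in enumerate(psf):
        kernel[(i - half) % n] = v
    H = np.fft.fft(kernel)
    G = np.fft.fft(signal)
    W = np.conj(H) / (np.abs(H) ** 2 + balance)   # regularized inverse
    return np.real(np.fft.ifft(W * G))

# Demo: a sharp defect edge blurred by an assumed Gaussian PSF, then restored.
truth = (np.arange(64) >= 32).astype(float)
psf = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
restored = wiener_deconvolve_1d(blurred, psf)
# The restored edge is steeper (sharper) than the blurred one.
```

The `balance` term trades resolution against noise amplification; in practice it would be tuned against the camera's noise floor.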
Recent developments in optical detection methods for microchip separations.
Götz, Sebastian; Karst, Uwe
2007-01-01
This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.
Urban Growth Detection Using Filtered Landsat Dense Time Trajectory in an Arid City
NASA Astrophysics Data System (ADS)
Ye, Z.; Schneider, A.
2014-12-01
Among remote sensing techniques for environmental monitoring, time series analysis of biophysical indices is drawing increasing attention. Although many studies have examined forest disturbance and land cover change detection, few have focused on urban growth mapping at medium spatial resolution. As the Landsat archive has become openly accessible, detecting urban growth with Landsat time-series imagery has become possible. A time trajectory from a newly developed urban area shows a dramatic drop in vegetation index. This enables time trajectory analysis to distinguish impervious surface from crop land, which has a different temporal biophysical pattern. The time of change can also be estimated, yet many challenges remain. Landsat data have low temporal resolution, which is further degraded where cloud-contaminated pixels and the SLC-off effect exist. It is also difficult to tease apart intra-annual variation, inter-annual variation, and land cover differences in a time series. Here, several methods of time trajectory analysis are applied and compared to find a computationally efficient and accurate way to detect urban growth. A case study city, Ankara, Turkey, is chosen for its arid climate and varied landscape. For this preliminary research, Landsat TM and ETM+ scenes from 1998 to 2002 are used, with NDVI, EVI, and SAVI as the biophysical indices. The procedure starts with seasonality filtering: only areas with seasonality need to be filtered, so as to decompose the seasonal component and extract the overall trend. Harmonic transform, wavelet transform, and a pre-defined bell-shaped filter are used to estimate the overall trend in the time trajectory for each pixel. A point with a significant drop in the trajectory is tagged as a change point. After an urban change is detected, forward and backward checking is undertaken to make sure it is genuinely new urban expansion rather than short-term crop fallow or forest disturbance.
The method proposed here can capture most of the urban growth during the study period, although the accuracy of the estimated change time is somewhat lower. Results from the several biophysical indices and filtering methods are similar. Some fallows and bare lands in arid areas are easily confused with urban impervious surface.
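The drop-and-persistence logic described above (flag a sharp fall in the deseasonalized index, then confirm it with a forward check) can be sketched as follows; the NDVI values and thresholds here are invented for illustration.

```python
import numpy as np

def detect_drop(ndvi, threshold=0.3):
    """Return the index of the first sharp, persistent drop, or None.

    A forward check requires all later values to stay well below the
    pre-change mean, distinguishing new urban land from a short crop
    fallow whose index would recover.
    """
    diffs = np.diff(ndvi)
    for t, d in enumerate(diffs):
        persistent = np.all(ndvi[t + 1:] < ndvi[:t + 1].mean() - threshold / 2)
        if d < -threshold and persistent:
            return t + 1
    return None

# Hypothetical deseasonalized NDVI trajectory: vegetated, then built over.
traj = np.array([0.62, 0.65, 0.60, 0.63, 0.18, 0.15, 0.17, 0.16])
print(detect_drop(traj))  # → 4
```

A real implementation would run this per pixel on the trend component extracted by the harmonic, wavelet, or bell-shaped filter.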
Cocci, Andrea; Zuppi, Cecilia; Persichilli, Silvia
2013-01-01
Objective. 25-hydroxyvitamin D2/D3 (25-OHD2/D3) is a reliable biomarker of vitamin D status. Liquid chromatography-tandem mass spectrometry was recently proposed as a reference method for vitamin D status evaluation. The aim of this work is to compare two commercial kits (Chromsystems and PerkinElmer) for 25-OHD2/D3 determination on our entry-level LC-MS/MS. Design and Methods. The Chromsystems kit adds an online trap column to an HPLC column and provides atmospheric pressure chemical ionization, an isotopically labeled internal standard, and 4 calibrator points. The PerkinElmer kit uses a solvent extraction and protein precipitation method. This kit can be used with or without derivatization, with electrospray or atmospheric pressure chemical ionization, respectively. For each analyte, there are isotopically labeled internal standards and 7 deuterated calibrator points. Results. Performance characteristics are acceptable for both methods. The mean bias between methods, calculated on 70 samples, was 1.9 ng/mL. Linear regression analysis gave an R² of 0.94. 25-OHD2 is detectable only with the PerkinElmer kit in the derivatized assay option. Conclusion. Both methods are suitable for routine use. The Chromsystems kit minimizes manual sample preparation, requiring only protein precipitation, but, with our system, 25-OHD2 is not detectable. The PerkinElmer kit without derivatization does not guarantee acceptable performance with our LC-MS/MS system, as the sample is not purified online. Derivatization provides sufficient sensitivity for 25-OHD2 detection. PMID:23555079
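The comparison statistics reported above (mean bias and R² between paired kit results) can be computed as in this sketch. The concentrations below are invented, not the study's 70 patient samples; the second kit is simulated as the first plus a fixed offset and random scatter.

```python
import numpy as np

# Hypothetical paired 25-OHD3 results (ng/mL) from two kits on the
# same samples; kit B reads ~1.9 ng/mL higher with small random error.
a = np.array([12.1, 18.4, 25.0, 31.2, 40.5, 22.8, 15.9, 28.3])
b = a + 1.9 + np.random.default_rng(1).normal(0, 0.8, a.size)

bias = np.mean(b - a)            # mean bias between the two methods
r = np.corrcoef(a, b)[0, 1]      # Pearson correlation of paired results
r2 = r ** 2                      # coefficient of determination
print(round(bias, 1), round(r2, 2))
```

A fuller method-comparison study would add a Bland-Altman plot and regression limits of agreement rather than bias and R² alone.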
Ngo, Hoan T; Gandra, Naveen; Fales, Andrew M; Taylor, Steve M; Vo-Dinh, Tuan
2016-07-15
One of the major obstacles to implementing nucleic acid-based molecular diagnostics at the point-of-care (POC) and in resource-limited settings is the lack of sensitive and practical DNA detection methods that can be seamlessly integrated into portable platforms. Herein we present a sensitive yet simple DNA detection method using a surface-enhanced Raman scattering (SERS) nanoplatform: the ultrabright SERS nanorattle. The method, referred to as the nanorattle-based method, involves sandwich hybridization of magnetic beads that are loaded with capture probes, target sequences, and ultrabright SERS nanorattles that are loaded with reporter probes. Upon hybridization, a magnet was applied to concentrate the hybridization sandwiches at a detection spot for SERS measurements. The ultrabright SERS nanorattles, composed of a core and a shell with resonance Raman reporters loaded in the gap space between the core and the shell, serve as SERS tags for signal detection. Using this method, a specific DNA sequence of the malaria parasite Plasmodium falciparum could be detected with a detection limit of approximately 100 attomoles. Single nucleotide polymorphism (SNP) discrimination of wild type malaria DNA and mutant malaria DNA, which confers resistance to artemisinin drugs, was also demonstrated. These test models demonstrate the molecular diagnostic potential of the nanorattle-based method to both detect and genotype infectious pathogens. Furthermore, the method's simplicity makes it a suitable candidate for integration into portable platforms for POC applications and resource-limited settings.
Lab-on-a-chip nucleic-acid analysis towards point-of-care applications
NASA Astrophysics Data System (ADS)
Kopparthy, Varun Lingaiah
Recent infectious disease outbreaks, such as Ebola in 2013, highlight the need for fast and accurate diagnostic tools to combat the global spread of disease. Detection and identification of disease-causing viruses and bacteria at the genetic level is required for accurate diagnosis. Nucleic acid analysis systems have shown promise in identifying diseases such as HIV, anthrax, and Ebola in the past. Conventional nucleic acid analysis systems are still time consuming and are not suitable for point-of-care applications. Miniaturized nucleic acid systems have shown great promise for rapid analysis, but they have not been commercialized due to several factors such as footprint, complexity, portability, and power consumption. This dissertation presents the development of technologies and methods for lab-on-a-chip nucleic acid analysis towards point-of-care applications. An oscillatory-flow PCR methodology in a thermal gradient is developed which provides real-time analysis of nucleic acid samples. Oscillating-flow PCR was performed in the microfluidic device under a thermal gradient in 40 minutes. Reverse transcription PCR (RT-PCR) was achieved in the system without an additional heating element for the reverse transcription incubation step. A novel method is developed for the simultaneous patterning and bonding of all-glass microfluidic devices in a microwave oven; glass microfluidic devices were fabricated in less than 4 minutes. Towards an integrated system for the detection of amplified products, a thermal sensing method is studied to optimize the sensor output. The calorimetric sensing method is characterized to identify design considerations and optimal parameters, such as sensor placement, steady-state response, and flow velocity, for improved performance. An understanding of these technologies and methods will facilitate the development of lab-on-a-chip systems for point-of-care analysis.
Eboigbodin, Kevin; Filén, Sanna; Ojalehto, Tuomas; Brummer, Mirko; Elf, Sonja; Pousi, Kirsi; Hoser, Mark
2016-06-01
Rapid and accurate diagnosis of influenza viruses plays an important role in infection control, as well as in preventing the misuse of antibiotics. Isothermal nucleic acid amplification methods offer significant advantages over the polymerase chain reaction (PCR), since they are more rapid and do not require the sophisticated instruments needed for thermal cycling. We previously described a novel isothermal nucleic acid amplification method, 'Strand Invasion Based Amplification' (SIBA®), with high analytical sensitivity and specificity, for the detection of DNA. In this study, we describe the development of a variant of the SIBA method, namely, reverse transcription SIBA (RT-SIBA), for the rapid detection of viral RNA targets. The RT-SIBA method includes a reverse transcriptase enzyme that allows one-step reverse transcription of RNA to complementary DNA (cDNA) and simultaneous amplification and detection of the cDNA by SIBA under isothermal reaction conditions. The RT-SIBA method was found to be more sensitive than PCR for the detection of influenza A and B and could detect 100 copies of influenza RNA within 15 min. The development of RT-SIBA will enable rapid and accurate diagnosis of viral RNA targets within point-of-care or central laboratory settings.
Localization of tumors in various organs, using edge detection algorithms
NASA Astrophysics Data System (ADS)
López Vélez, Felipe
2015-09-01
An edge in an image is a set of points, organized along a curve, at which the image brightness changes abruptly or has discontinuities. To find these edges, five different mathematical methods are applied and then compared with one another, with the aim of determining which method best finds the edges of a given image. In this paper, these five methods are used for medical purposes, in order to determine which one can find the edges of a scanned image more accurately than the others. The problem consists in analyzing two biomedical images: one represents a brain tumor and the other a liver tumor. These images are analyzed with each of the five methods and the results are compared to determine the best method. Five edge detection algorithms were selected: the Bessel, Morse, Hermite, Weibull and Sobel algorithms. After applying each method to both images, it is impossible to single out one most accurate method for tumor detection, because the best method differed between the two cases: for the brain tumor image the Morse method was best at finding the edges, whereas for the liver tumor image it was the Hermite method. Further observation shows that Hermite and Morse have the lowest standard deviations for these two cases, leading to the conclusion that these two are the most accurate methods for finding edges in the analysis of biomedical images.
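Of the five algorithms, the Sobel operator is the most standard. A minimal sketch on a synthetic image, with a bright disc standing in for a lesion (not the paper's clinical images):

```python
import numpy as np
from scipy import ndimage

# Synthetic "scan": a bright disc of radius 15 centered at (32, 32).
yy, xx = np.mgrid[0:64, 0:64]
img = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2).astype(float)

gx = ndimage.sobel(img, axis=1)   # horizontal gradient
gy = ndimage.sobel(img, axis=0)   # vertical gradient
mag = np.hypot(gx, gy)            # edge strength
edges = mag > 0.5 * mag.max()     # keep strong edges only

# The detected edge pixels should lie near the disc boundary (radius 15).
r = np.hypot(yy[edges] - 32, xx[edges] - 32)
print(f"edge radii {r.min():.1f} to {r.max():.1f}")
```

On real tumor images the gradient magnitude would be thresholded adaptively, since lesion boundaries are far less crisp than a binary disc.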
Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha
2015-09-01
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is because supervised methods cannot accommodate, within a limited amount of training data, all the appearance variability across faces with respect to race, pose, lighting, facial biases, and so on. Moreover, processing every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification saves computational power. In this paper, we propose a light-weight neutral-versus-emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the statistical texture model. Robustness to dynamic shift of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
NASA Astrophysics Data System (ADS)
Udpa, Nitin; Sampat, Mehul P.; Kim, Min Soon; Reece, Gregory P.; Markey, Mia K.
2007-03-01
The contemporary goals of breast cancer treatment are not limited to cure but include maximizing quality of life. All breast cancer treatment can adversely affect breast appearance. Developing objective, quantifiable methods to assess breast appearance is important to understand the impact of deformity on patient quality of life, guide selection of current treatments, and make rational treatment advances. A few measures of aesthetic properties such as symmetry have been developed. They are computed from the distances between manually identified fiducial points on digital photographs. However, this is time-consuming and subject to intra- and inter-observer variability. The purpose of this study is to investigate methods for automatic localization of fiducial points on anterior-posterior digital photographs taken to document the outcomes of breast reconstruction. Particular emphasis is placed on automatic localization of the nipple complex since the most widely used aesthetic measure, the Breast Retraction Assessment, quantifies the symmetry of nipple locations. The nipple complexes are automatically localized using normalized cross-correlation with a template bank of variants of Gaussian and Laplacian of Gaussian filters. A probability map of likely nipple locations determined from the image database is used to reduce the number of false positive detections from the matched filter operation. The accuracy of the nipple detection was evaluated relative to markings made by three human observers. The impact of using the fiducial point locations as identified by the automatic method, as opposed to the manual method, on the calculation of the Breast Retraction Assessment was also evaluated.
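Matched filtering via normalized cross-correlation, as used here for nipple localization, can be sketched as below. The Gaussian template and synthetic image are stand-ins for the paper's Gaussian/Laplacian-of-Gaussian filter bank and clinical photographs.

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation score map (valid positions only)."""
    th, tw = template.shape
    t = (template - template.mean()) / template.std()
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[i, j] = np.mean(p * t)      # Pearson correlation with template
    return out

# Hypothetical template: a 7x7 Gaussian blob.
g = np.exp(-((np.arange(7) - 3) ** 2) / 4.0)
template = np.outer(g, g)

# Synthetic "photograph" with the blob embedded at (20, 30) plus noise.
rng = np.random.default_rng(0)
image = rng.normal(0, 0.05, (48, 64))
image[20:27, 30:37] += template

score = ncc(image, template)
i, j = np.unravel_index(score.argmax(), score.shape)
print(i, j)  # → 20 30
```

In the paper, the score map would additionally be weighted by the probability map of likely nipple locations to suppress false positives.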
Apparatus and method for detecting leaks in piping
Trapp, D.J.
1994-12-27
A method and device are disclosed for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe at one end and extending out of the piping at the other, a source of pressurized air, and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipe with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of tube paid out at the point where a leak is detected determines the location of the leak in the pipe. 2 figures.
Multisignal detecting system of pile integrity testing
NASA Astrophysics Data System (ADS)
Liu, Zuting; Luo, Ying; Yu, Shihai
2002-05-01
The low-strain reflection wave method plays a principal role in the integrity testing of foundation piles. However, the method has some deficiencies: there is a blind area of detection at the top of the tested pile; it is difficult to recognize defects in the deep-seated parts of the pile; and planar-wave assumptions cannot account for three-dimensional effects. It is very difficult to solve these problems with a single-transducer pile integrity testing system. A new multi-signal pile integrity testing system is proposed in this paper, which can apply impacts and collect signals at multiple points on the pile top. By using a multiple-superposition data processing method, the detecting system can effectively suppress interference and improve the precision and SNR of pile integrity testing. The system can also be applied to the evaluation of engineering structure health.
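The multiple-superposition idea — stacking records from several impact/receiver points so that the common reflection adds coherently while uncorrelated noise averages out — can be sketched with simulated traces. The pulse shape, trace count, and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated reflection-wave records from 8 points on the pile head:
# the same defect echo (at t = 0.5) buried in independent noise.
t = np.linspace(0, 1, 200)
echo = np.exp(-((t - 0.5) ** 2) / 0.001)          # common reflection event
records = echo + rng.normal(0, 0.8, (8, t.size))  # 8 noisy single traces

stacked = records.mean(axis=0)   # multiple superposition (stacking)

# Noise in the stacked trace shrinks roughly as 1/sqrt(8),
# while the echo amplitude is preserved.
noise_single = records[0, :60].std()   # pre-echo noise, one trace
noise_stacked = stacked[:60].std()     # pre-echo noise, stacked trace
print(noise_stacked < noise_single)
```

The same principle underlies stacking in seismic processing: coherent events survive averaging, incoherent interference does not.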